The detention facility in Guantanamo Bay, Cuba hangs over the United States and now the Obama administration like a cloud of acid rain. In recent months hunger strikes once again have brought the injustice of the camp, the inhumane treatment of its inhabitants, and the indefinite detention of its inmates to the attention of the world. The camp is now an indelible blot on the United States, both on our reputation abroad and on our self-image as a land of constitutional republicanism. Above all it is a meaningful challenge to our self-respect.
Most of the 779 people that Wikipedia says were brought to Guantanamo were never charged with a crime. Of the fewer than 200 who remain, some no doubt are terrorists and criminals; others, just as clearly, were unjustly captured, imprisoned, tortured. They are now being held outside the rule of law and in violation of our legal and constitutional traditions of freedom. No doubt there are inconvenient questions about what to do with these men. But they are men under our collective care and they are owed more than being kept like animals in pens in purgatory.
President Obama has announced once again his decision to close the camp. We wish him the courage to do what is right. At this moment, it is worth recalling the case of Mohammed Jawad, the first Guantanamo detainee to testify under oath before a military commission about being tortured by his American captors. Last month there was a dramatic reading of statements made by Jawad's lawyer, David Frakt, juxtaposed with statements made by the case's lead prosecutor, Darrel Vandeveld, who left the military in order to help free Jawad. The reading was held at the PEN World Voices Festival of International Literature. In their statements, both men use the language of constitutionality to suggest that, by torturing detainees such as Jawad, "America," as Frakt puts it, "lost a little of its greatness."
Here is what Vandeveld, a lifelong military man, writes of his choice to testify in favor of Jawad:
In 2007, I volunteered to prosecute detainees at Guantanamo in the U.S. military commissions. I was assigned as the lead prosecutor in several cases, including the case of Mohammed Jawad, a young man from Afghanistan. While I was a prosecutor, David Frakt helped me to find and expose gross human rights abuses of Mohammed and other detainees by the U.S. government. In September 2008, I became convinced that the prosecution of Mohammed was unjust and that the military commissions were grossly flawed. I requested to be relieved and reassigned to other duties. After stepping down from the prosecution, I worked with David Frakt to expose detainee abuse, to secure Mohammed’s release and bring about much-needed reforms to the U.S. military commissions.
Vandeveld served 24 years in the Army, winning a Bronze Star for valor in Iraq. After his service he went to law school and became a military lawyer. His decision to ask to be relieved from his prosecution duties was, he writes, simply doing his duty: “I did it because I believe in truth, justice, the rule of law, and our common humanity. I did it for Mohammed Jawad, I did it because it was my duty, and I did it for us all.”
As the debate about closing Guantanamo heats up, this is a good time to acquaint oneself with the case of Mohammed Jawad. The transcript from the staged discussion between David Frakt and Darrel Vandeveld is a good place to begin. We are all indebted to The Mantle for publishing it. It is your weekend read.
Monday night marked the Hudson Valley premiere of Margarethe von Trotta's highly anticipated new feature film, Hannah Arendt, starring Barbara Sukowa as Hannah Arendt. The film opens officially in New York City on May 29th.
Roger Berkowitz introduced the film to a packed house at Bard College.
After the screening, Roger Berkowitz moderated a discussion with screenwriter Pam Katz and actor Barbara Sukowa. Sukowa, who on Saturday won the LOLA award for Best Actress for her portrayal of Hannah Arendt, spoke about the challenge of making a movie that will be seen so differently by those who know Hannah Arendt's work and those who don't. Katz, who co-wrote the script with von Trotta, talked about how important it was to include archival footage from the Eichmann trial rather than having an actor play Eichmann. We will be posting video of the entire discussion soon.
Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.
Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.
Russianist Eric Naiman considers the career of the British historian A.D. Harvey, who he believes is responsible for propagating the claim that Dostoevsky met Dickens during an 1862 visit to London. Naiman believes that, under various pseudonyms and over the course of several decades, Harvey has written a number of articles that occasionally criticize, but usually praise, his own work. Those of you thinking about grad school, beware; Naiman suggests that Harvey, who despite having written more than a dozen books of various kinds has no academic affiliation, may have been driven to this by the scholarly life: "Even for holders of tenured university positions, scholarship can make for a lonely life. One spends years on a monograph and then waits a few more years for someone to write about it. How much lonelier the life of an independent scholar, who does not have regular contact, aggravating as that can sometimes be, with colleagues. Attacking one's own book can be seen as an understandable response to an at times intolerable isolation. How comforting to construct a community of scholars who can analyse, supplement and occasionally even ruthlessly criticize each other's work. I've traced the connections between A. D. Harvey, Stephanie Harvey, Graham Headley, Trevor McGovern, John Schellenberger, Leo Bellingham, Michael Lindsay and Ludovico Parra, but they may be part of a much wider circle of friends."
Ria Chhabra decided to check out the hype around the health benefits of organic food. She tracked the health and vitality of two groups of fruit flies, one swarming around conventional bananas and potatoes and the other given pricier organic fare. There has been great skepticism recently about the benefits of organic food. But Chhabra's results, recently published in PLOS ONE (an international, peer-reviewed, open-access, online publication), show increased fertility, lower stress, and longer lives for the flies fed organic produce. What makes this study especially fascinating is that Chhabra is only 16. Read the story of how her high school science project is making waves throughout the world of science.
Christina Davis ponders the meaning of the space in the title of T.S. Eliot's "The Waste Land." She suggests that his use of "waste" as an adjective gives it a temporal quality, one that suggests an impermanent state: "In this phrase, he was likely echoing St. Augustine's concern about the ossification of certain written words into an orthodoxy: 'I should write so that my words echo rather than to set down one true opinion that should exclude all other possibilities.'"
Terrence Malick offers a cinema inspired by grand conceptual oppositions and profound experience. In The Tree of Life, Malick meditates on the tension between grace and will. In his new film, To the Wonder, Malick offers archetypes of the artist, the rationalist, personal experience, and passion. In The New Yorker, Richard Brody rightly revels in the magic of the film: "What Malick is after, by way of his archetypes and through his images, is religious experience as such, and he defines it in a scene set in the priest's church. There, an elderly, gray-bearded black man who is cleaning the stained glass speaks and tells the priest what he's missing ('You've got to have a little more excitement') and, a moment later, shows him what he means, exclaiming, 'The power hits you!' and speaking, excitedly, in tongues, then putting his hand on the stained glass and saying that he feels the warmth of the light."
Nick Murray interviews landscape architect Diana Balmori about the changing role of her profession. Balmori, for her part, emphasizes that it is not enough to simply return a landscape to nature, nor to conquer it somehow. Instead, she says that she tries to build in a way that strengthens relationships between an environment and its inhabitants.
Music in the Holocaust: Jewish Identity and Cosmopolitanism
Part Three: Kurt Weill and the Modernist Migration: Music of Weill and Other Emigres
Learn more here.
The word designating military drones comes from the word for bee. This is true all over the world, in countless languages. Partly because of this linguistic consistency, it is a common misperception that drones take their name from the buzzing sound unmanned aircraft make as they fill the air. More accurately, however, drones trace their etymological lineage to the male honey-bee, which is called a drone. The male drone-bee is distinguished from the female worker-bees: it does no useful work and has a single function, to impregnate the queen-bee. What unites military drones with their apiary namesakes is not sound, but thoughtless purposefulness.
The beauty of the drone-bee—like the dark beauty of the military drone—is its single-minded purpose. It is a miracle of efficiency, designed to do one thing. The drone-bee is not distracted by the perfume of flowers or the contentment of labor. It is born, lives, and dies with only one task in mind. Similarly, the military drone suffers neither from hunger nor from distraction. It does what it is told. If necessary, it will sacrifice itself for its mission. It is a model of thoughtless efficiency.
A few weeks ago I wrote about Ernst Jünger’s novel The Glass Bees, in which a brilliant inventor produces tiny flying glass bees that offer limitless potential for surveillance and war. Today I turn to Jake Kosek’s recent paper “Ecologies of Empire: On The New Uses of the Honeybee.” Kosek does not cite Jünger’s novel, and yet his article is in many ways its non-fiction sequel. What Kosek sees is that the rise of drones in military strategy is tied deeply to their ability to mimic the activity and demeanor of male honey-bees. It is because bees can fly, swarm, change direction, alter their course, and yet achieve their single purpose absent any intentionality or thinking that bees are so useful in modern warfare.
Bees have long been associated with military endeavors, both metaphorically and literally. Kosek tells us that our word bomb comes from the Greek bombos, which means bee. The first bombs were, it seems, beehives dropped or catapulted into the heart of the enemy camp. Bees are today trained to sniff out toxic chemicals; and beeswax was for generations an essential ingredient in munitions.
In the war on terror, bees have taken on a special significance. The “enemy’s lack of coherence—institutionally, ideologically, and territorially— makes the search for the enemy central to the politics of the war on terror.” War in the war on terror is ever less a contest of armies on the battlefield and is increasingly a war of knowledge. This means that surveillance—for centuries an important complement to battlefield tactics—comes to occupy the core of the modern war on terror. In this regard, drones are essential, as drones can hover in the air unseen for days, gathering essential intelligence on persons, groups, or even whole cities. All the more powerful would be miniature drones that fly through the air unseen and at ground level. That is why Kosek writes that “Intelligence gathering [is] not just limited to psychologists, sociologists, lawyers, and military planners, but [has come] to include biologists, anthropologists, epidemiologists, and even entomologists.” What the military use of bees promises is access to information and worlds not previously open to human knowledge. Bees, Kosek writes, are increasingly the model for the modern military.
The advantage of bees is not simply their thoughtlessness, but is found also in their ability to operate as part of a swarm. Current drone technology requires that each drone be controlled by a single pilot. What happens when hundreds of drones must share the airspace around a target? How can drones coordinate their activity? Kosek quotes a private contractor, John Sauter, who says:
“A central aspect of the future of warfare technology is to get networks of machines to operate as self-synchronized war fighting units that can act as complex adaptive systems. . . We want these machines to be fighting units that can operate as reconfigurable swarms that are less mechanical and more organic, less engineered and more grown.”
The point is that drones, be they large or small, must increasingly work in conjunction with each other at a speed and level of nuance that is impossible for human controllers to manage. The result is that we must model the drones of the future on bees.
The scientists working with the Pentagon to create drones that can fly and function like bees are not entomologists, but mathematicians. The DNA of the glass or silicone bees of the future will be complex algorithms inspired by but actually surpassing the ability of swarms “to coordinate and collect small bits of information that can be synchronized to make collective action by drones possible.” Once this is possible, one controller will be able to manage a single drone “and the others adapt, react, and coordinate with that drone.”
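To give a concrete sense of what such self-synchronizing coordination looks like, here is a minimal sketch, written in Python, of the classic decentralized flocking rules (cohesion, alignment, and separation) on which swarm behavior is often modeled. It is only an illustration: the class names, radii, and weights are hypothetical, and it is not Kosek's, Sauter's, or any actual military system's code.

```python
# A minimal, illustrative flocking sketch: each drone adjusts its velocity using only
# information from nearby neighbors, so the group coordinates without a central
# controller. All names and parameters here are hypothetical.

import math
import random

NEIGHBOR_RADIUS = 10.0   # how far a drone can "see" its neighbors
COHESION = 0.01          # pull toward the local center of the group
ALIGNMENT = 0.05         # match the neighbors' average heading
SEPARATION = 0.1         # push away from drones that are too close

class Drone:
    def __init__(self):
        self.x, self.y = random.uniform(0, 100), random.uniform(0, 100)
        self.vx, self.vy = random.uniform(-1, 1), random.uniform(-1, 1)

def step(swarm):
    for d in swarm:
        neighbors = [o for o in swarm
                     if o is not d and math.hypot(o.x - d.x, o.y - d.y) < NEIGHBOR_RADIUS]
        if not neighbors:
            continue
        # Cohesion: steer toward the average position of neighbors.
        cx = sum(o.x for o in neighbors) / len(neighbors)
        cy = sum(o.y for o in neighbors) / len(neighbors)
        d.vx += (cx - d.x) * COHESION
        d.vy += (cy - d.y) * COHESION
        # Alignment: steer toward the average velocity of neighbors.
        avx = sum(o.vx for o in neighbors) / len(neighbors)
        avy = sum(o.vy for o in neighbors) / len(neighbors)
        d.vx += (avx - d.vx) * ALIGNMENT
        d.vy += (avy - d.vy) * ALIGNMENT
        # Separation: steer away from neighbors that crowd too closely.
        for o in neighbors:
            if math.hypot(o.x - d.x, o.y - d.y) < NEIGHBOR_RADIUS / 3:
                d.vx += (d.x - o.x) * SEPARATION
                d.vy += (d.y - o.y) * SEPARATION
    for d in swarm:
        d.x += d.vx
        d.y += d.vy

swarm = [Drone() for _ in range(50)]
for _ in range(100):
    step(swarm)
```

The point of the sketch is simply that no drone is told where the swarm is going; coordination emerges from local rules, which is precisely why a single controller could steer one drone and let "the others adapt, react, and coordinate with that drone."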
Kosek’s article is provocative and fascinating. His ruminations on empire strike me as overdone; but his insights about the way our training and use of bees have transformed the bee, and the ways that bees are serving as models and inspiration for our own development of new ways to fight wars and solve problems, are important. So too is his imagination of the bee as the six-legged soldier of the future. Whether the drones of the future are cyborg bees (as some in Kosek’s article suggest) or mechanical bees as Jünger imagined half a century ago, it is nevertheless the case that thinking about the impact of drones on warfare and human life is enriched by the meditation on the male honeybee. For your weekend read, I offer you Jake Kosek’s “Ecologies of Empire: On The New Uses of the Honeybee.”
When Gershom Scholem once wrote to Arendt that her phrase the “banality of evil” was a cliché, her response was swift: as far as she knew, nobody had ever used it before. The banality of evil was no common formulation worn meaningless by overuse. When she coined the phrase, it was a searing and dangerous provocation to thought, a warning to all those who, in the face of horrific crimes carried out by bureaucrats, would seek to transform those bureaucrats into monsters. To make people like Eichmann into radically evil monsters is, Arendt argued, to miss an even greater and more insidious fact about evil: that in the modern context of bureaucratic governance, evil depends upon banal people who allow themselves to participate in evil because they are thoughtless and lack the clarity of mind or the courage of conviction to stand up to the mechanized and bureaucratized doing of evil.
One can disagree with Arendt’s thesis, but it was hardly a cliché. Unfortunately, too often today it is used as the cliché Scholem feared it had already become. A case in point is an opinion piece in Wednesday’s Wall Street Journal by James Taranto.
Taranto is discussing a current case in which Dr. Kermit Gosnell is on trial for murdering seven viable fetuses.
Three associates have pled guilty to third-degree murder and five others have pled guilty to other crimes. Gosnell faces the death penalty. According to the New York Times, whose account Taranto refers to,
Reporters heard testimony from the Philadelphia medical examiner about unsanitary, even filthy conditions at Dr. Gosnell’s clinic, from which the remains of 47 fetuses were removed, some in a water jug, a juice carton and a pet-food container.
In earlier testimony, according to several news reports, an unlicensed doctor said that Dr. Gosnell, 72, showed him how to cut the necks of babies born alive to make sure they died, and a young woman who worked at the clinic as a teenager said she assisted in abortions in which she saw at least five babies moving and breathing.
The details are grisly. The main thrust of Taranto’s article is that the liberal media is ignoring the case because it upsets their narrative that abortions are clean and easy. According to experts cited in the Times article, it seems that conservative media outlets have ignored the case as well, and that the Times actually had given it more coverage than more conservative papers, but I will leave that argument to others.
What interests me more is Taranto’s sudden invocation of Hannah Arendt and her thesis of the banality of evil. The context is the guilty pleas of the eight employees of Gosnell’s clinic. They included an unlicensed doctor and untrained aides who worked under difficult and unsanitary conditions and were trained to break the necks of living fetuses. An Associated Press wire story described the fate of these workers and concluded: “But for most, it was the best job they could find.” This is what leads Taranto (through the route of a reader’s comment and a 1999 essay in the New York Observer) to compare the AP’s account of eight medical technicians with Hannah Arendt’s account of Adolf Eichmann.
It is not at all clear whether Taranto has ever set eyes upon Arendt’s book, for he cites only an essay on the book. It is, of course, the height of cliché to speak about books and ideas from second or third hand sources. But that is what Taranto does. He repeats the following claims from the 1999 article, all false: first, that Arendt believed that Eichmann wasn’t anti-Semitic (she reports his claim, but dismisses it as unbelievable, a fact all-too-often forgotten); that she offered the banality of evil as an “overarching theory”; that she “took him at his word” that he was just following orders; that she was a philosopher; and that she was the “world’s worst court reporter”—as if that is what she were.
But what is truly mind-boggling is that after dismissing Arendt’s thesis based on second-hand accounts, Taranto then comes to agree with her. He writes:
And while Rosenbaum [the author of the 1999 article] seems correct in rejecting "the banality of evil" as an overarching theory, surely it has some explanatory or descriptive power. "Faceless little men following evil orders" surely is a fitting characterization of the Pennsylvania bureaucrats who, because of a mix of indifference, incompetence and politics, failed in their oversight of Gosnell's clinic and allowed it to keep operating for decades.
It's also true that banality is a tactic of evil, a method it employs to make orders easier to follow. One of Gosnell's employees might have blown the whistle on him had he expressly commanded them to slash babies to death after they were born, rather than to "snip" them after they "precipitated" to "ensure fetal demise."
All too often we see this approach to Arendt’s book and thesis. She is excoriated for getting Eichmann wrong and for having the temerity to suggest he wasn’t a monster. And then we are told that actually, she was largely right, and that there is something fundamentally true about the idea that evil is done and made possible as much by thoughtlessness as by fanaticism. In other words, she was right in general but not about Eichmann.
Such an argument has become popular in the wake of David Cesarani’s book on Eichmann, which argues that Arendt underemphasized Eichmann's anti-Semitism while simultaneously accepting her argument about the banality of evil. There is a legitimate debate about how Arendt perceived Eichmann. It is wrong to say that she accepted his claims of being a friend of Jews, and it is simply inaccurate to think she thought he was not an anti-Semite. That said, there is evidence of his later anti-Semitism expressed in Argentina that Arendt had not seen. Does that evidence affect her thesis? I don't believe so, but had she had access to it and included it, her appraisal of Eichmann would have been fuller. In any case, few who repeat Cesarani's argument have read him, or for that matter Arendt herself.
To reject and embrace the banality of evil in the same essay is too simple. It is easy to repeat Arendt’s insight but then protect oneself from the unsettling implications the weight of her thought must bear. To do so, sadly, is to treat the banality of evil as a cliché. She and her work deserve better.
No government exclusively based on the means of violence has ever existed. Even the totalitarian ruler, whose chief instrument of rule is torture, needs a power basis—the secret police and its net of informers. Only the development of robot soldiers, which, as previously mentioned, would eliminate the human factor completely and, conceivably, permit one man with a push button to destroy whomever he pleased, could change this fundamental ascendancy of power over violence.
—Hannah Arendt, “On Violence.”
Hannah Arendt wrote these lines in the midst of the United States’ defeat in Vietnam. Her argument was that as long as robot soldiers were a thing of the future, brute violence and force like that unleashed by the United States would always succumb to collective power, of the kind exhibited by the Vietcong. Hers was, at least in part, a hopeful voice, praising the impotence of violence in the face of power.
To read Arendt’s lines today, amidst the rise of drone warfare, alters the valence of her remarks. Drones are increasingly prototypes and even embodiments of the “robot soldiers” that Arendt worried would dehumanize war and elevate violence over power. If we draw out the consequences from Arendt’s logic, then drone soldiers might displace the traditional limits that politics places on violence; drones, in other words, make possible unprecedented levels of unlimited violence.
The rise of drones matters, Arendt suggests, in ways that are not currently being seen. Her worry has little to do with assassination, the concern of most opponents of drones today. Nor is she specifically concerned with surveillance. Instead, against those, like General Stanley McChrystal, who argue that drones are simply new tools in an old activity of war, Arendt’s warning is that drones and robot soldiers may change the very dynamic of war and politics.
To see how drones change the calculus of violence in politics, we need to understand Arendt’s thesis about the traditional political superiority of power over violence. The priority of power over violence is based on the idea that power is “inherent in the very existence of political communities.” Power, Arendt writes, “corresponds to the human ability not just to act, but to act in concert.” It “springs up whenever people get together and act in concert.” All government, and this is central to Arendt’s thesis, needs power in order to act.
This need for popular support is true even for totalitarian governments, which also depend on the power of people—at least a select group of them like the secret police and their informers—continuing to act together. It is thus a myth that totalitarian rule can exist without the support of the people. Whether in Nazi Germany or contemporary Syria, totalitarian or tyrannical governments still are predicated on power that comes from support of key segments of the population.
Even if all government is predicated on some power, governments also employ violence—but that violence is held in check by political limits. As a government loses its popular support, it finds itself tempted to “substitute violence for power.” The problem is that when governments give in to the temptation to use violence to shore up a slackening of popular power, their use of violence further diminishes their power and results in impotence. The more violence a government needs to rely upon, the less power it has at its disposal. There is thus a political limit on how much violence any government can employ before it brings about the loss of its own power.
As much as she respects the claims for power over violence, Arendt is clear-eyed about the damage violence can wield. In a direct confrontation between power and violence, violence will win—at least in the short term. Arendt writes that if Gandhi’s “enormously powerful and successful strategy of nonviolent resistance” had met a different enemy—a Stalin or a Bashar al-Assad instead of a Churchill or a Mubarak—“the outcome would not have been decolonization, but massacre and submission.” Sheer violence can bring victory. But the price for such a triumph is high, not only for the losers, but also for the victors.
We see this exemplified in the Middle East over the last few years. In countries like Bahrain and Syria, where governments did not shy from unlimited violence to repress popular revolts, the governments have maintained themselves and the Arab Spring has turned into a long and frigid winter. Assad has been able to maintain power, but his power is irreparably diminished. In the end, there is a limit to the viability and effectiveness of relying on mere violence at the expense of power. This is even more true in a constitutional democracy, where support of the people is a political necessity.
As confident as Arendt is that violence is limited in politics by the need for power, she worries that the coming age of “robot soldiers” might bring about the end of the political advantage power has over violence. Robot soldiers can be controlled absent consent or political support. With the push of a button or a simple command, a tyrant or totalitarian ruler can exert nearly unlimited violence and destruction, even without the support of a massive secret police or a network of informers. Drones threaten the time-immemorial dependence of even the most lonely tyrant on others who will support him and do his bidding.
Of course drones must be built, programmed, and maintained. No tyrant is fully autonomous. Yet building, programming, and maintaining machinery are fundamentally different jobs than arresting and killing dissenters. It is far easier for programmers and electricians to justify doing their jobs in a powerless yet violent state than for soldiers and secret agents to justify theirs.
In a drone-led war, men will rarely need to go into action as soldiers. That is of course one reputed advantage of drones, that they make war less dangerous and more technically predictable. But it also means that as modern warfare becomes safer and more humane, it dispenses with human soldiers and risks stripping war of its human and active character. This helps to explain an enigmatic passage of Arendt’s in The Human Condition, where she offers modern war as an example of when action “loses its specific character” as human action and “becomes one form of achievement among others.” The degradation of human action in modern war, she writes,
happens whenever human togetherness is lost, that is, when people are only for or against other people, as for instance in modern warfare, where men go into action and use means of violence in order to achieve certain objectives for their own side against the enemy. In these instances, which of course have always existed, speech becomes indeed ‘mere talk,’ simply one more means toward the end….
Arendt is here thinking of the anonymity of the modern soldier epitomized by the monuments to the unknown soldiers—the mute mass of humanity who fight and die without the “still existing need for glorification” that makes war a human instead of a merely mechanical activity.
Modern warfare, in its inhumanity and technological capacity, abandons the togetherness that has traditionally made war a prime example of human political action.
In the technological advances of modern warfare that made war so awful and so mechanical, Arendt actually found a glimmer of hope: that war’s rabid violence was compensated by neither political advantage nor personal glory. In On Revolution, she dared hope that the fact that technology had reached the stage “where the means of destruction were such as to exclude their rational use” might lead to a “disappearance of war from the scene of politics….” It was possible, she thought, that the threat of total war and total destruction that accompanies war in the modern era might actually lead to the disappearance of war.
Clearly such a hope has not come to pass. One reason for the continuation of war, however, is that the horrors of war are made ever more palatable and silent—at least to the victors—by the use of technology that exerts violence without the need for political power and participation. The drone wars of the early 21st century are in this respect notable for the unprecedented silence that accompanies violence. Since U.S. soldiers are rarely injured or killed and since the strikes are classified and the damage remote, we have indeed entered an era where we can fight wars absent the speech, glory, and “human togetherness” that have traditionally marked both the comradeship of soldiers and the patriotic sacrifice of a nation at war. It is in this extraordinary capacity of mute violence to substitute for power that we can glimpse both the promise and the peril of drones.
Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.
Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.
Hannah Arendt Center Senior Fellow Wyatt Mason explores the wild and wonderful world of super-artist Kehinde Wiley. "Wiley, as some of you may know, is an American artist, an unusually successful one. In the decade of his career to date, he's become one of the most sought-after painters in America. Holland Cotter, of The New York Times, called Wiley 'a history painter, one of the best we have.... He creates history as much as he tells it.' Even if you don't know him by name, you've likely glimpsed his grand portraits of hip-hop artists (LL, Ice-T, Biggie). Maybe you've even seen his massive portrait of the King of Pop: the one of MJ in full armor, astride a prancing warhorse. If all this suggests that Wiley, a 36-year-old gay African-American man, is court painter to the black celebretariat, that misconception has been useful to promoting his brand, up to a point."
Mason is skeptical, but if you don't know the Wiley brand, the route through Wiley's world of surfaces is about as fine a reflection as you'll find of the challenges facing the artist in a consumer society.
Zainab Al-Khawaja is sitting in a Bahraini prison reading Martin Luther King Jr. Al-Khawaja is a political prisoner. She is in a cell with 14 others, some of them murderers. To maintain her dignity and to announce her difference from common criminals, she has refused to wear an orange prison jumpsuit. As a punishment, she is denied family visits, including visits from her baby. She is now on hunger strike. "Prison administrators ask me why I am on hunger strike. I reply, 'Because I want to see my baby.' They respond, nonchalantly, 'Obey and you will see her.' But if I obey, my little Jude will not in fact be seeing her mother, but rather a broken version of her. I wrote to the prison administration that I refuse to wear the convicts' uniform because 'no moral man can patiently adjust to injustice' (Thoreau)." Al-Khawaja's thoughts on dignity and non-violence are more than worthy testaments to her mentor.
Sara Horowitz takes on the "micro-gig," a new kind of freelancing that allows people to employ others for small tasks, like delivering or assembling IKEA furniture. Horowitz, however, worries about what "micro-gigging" might mean for workers: "It's as if we're eliminating the 'extraneous' parts of a worker's day (like lunch or bathroom breaks) and paying only for the minutes someone is actually in front of the computer or engaged in a task." Welcome to our piece-work future.
Chloe Pantazi considers the work of the photographer Chim, also known as David Seymour, on the occasion of a showing of his work at the International Center of Photography. Pantazi focuses in particular on Chim's photos of children, saying that as he "offers up the everyday lives of such adults working within the industry of war (as soldiers, munitions workers) we trust that Chim's postwar photographs of children yield something close to their everyday, as vulnerable innocents who, like the newborn seen suckling at its mother's breast in a photograph taken of the crowd at a land reform meeting on the brink of the Civil War, in Spain, 1936, were virtually reared on the conflicts of their time."
Lucy McKeon explores the Russian poet Kirill Medvedev, who has renounced the copyright to all of his works. McKeon recounts Medvedev's rebellion against the bourgeois idea of the artist as private citizen, a type idealized by Joseph Brodsky in his 1987 Nobel Prize address. Medvedev is searching for a post-individualized and post-socialist culture, what he calls a new humanism. "Logically, Medvedev's answer to individualized disconnectedness calls for a synthesis of twentieth-century leftist political and intellectual thought, a situation where several senses of the word 'humanism' begin to collide." Where something from poetry meets something from philosophy; where postmodernism, logocentrism, psychology, culture and counterculture, "and probably something else, too, that we haven't thought of yet," writes Medvedev, join to form "a new shared understanding of humanity." Only in this utopian future society could the artist as private citizen responsibly exist and create.
Music in the Holocaust: Jewish Identity and Cosmopolitanism
Part II: Music of Warsaw, Lodz and other Eastern Ghettoes
Learn more here.
Roger Berkowitz lauds the idea of early college. Jeffrey Jurgens considers Jeremy Walton's recent article "Confessional Pluralism and the Civil Society Effect." Cristiana Grigore responds to the recent New York Times article, "The Kings of Roma" by describing her own Roma upbringing in Romania. Kathleen B. Jones takes on New Materialism from an Arendtian point of view.
For too long now high school has been a waste of time for too many people. I always remind my students that Georg Wilhelm Friedrich Hegel developed his lectures on the Philosophy of Right as a course for a German Gymnasium, the equivalent of high school in the United States. Most American high schools have long abandoned the idea of offering challenging courses that demand students think and engage with the world and the history of ideas. Our brightest students are too often bored, confirmed in their intelligence, but rarely pushed. This is especially true of our public high schools in our poorest neighborhoods.
One of the most heartening trends in response to this tragedy is the idea of early college. Bard College has been a leader in the early college movement, now embraced by the Bill and Melinda Gates Foundation and others.
The New York Times has an excellent article on Bard’s newest Early College in Newark:
Across the country in communities like Newark, the early college high school model is being lauded as a way to provide low-income students with a road map to and through college. According to the most recent figures from the National Center for Education Statistics, 68 percent of all high school graduates make it to a two- or four-year institution, but only 52 percent of low-income students do the same. Of poor students in four-year institutions, only 47 percent graduate within six years, compared with 58 percent of the general population.
Not surprisingly, the challenges are greatest for students whose parents did not attend any college: their graduation rate hovers around 40 percent. Early college high schools seek to rectify that, by merging high school and some college. Students can earn both a high school diploma and an associate degree, and some are set on the path to a four-year degree.
Educators and big-ticket donors have praised the schools for saving students money and time — most schools compress the academic experience into four years. Since 2002, the Bill and Melinda Gates Foundation has provided more than $40 million toward initiatives. The Ford Foundation and the Carnegie Corporation of New York have also chipped in. President Obama is a proponent, giving a shout-out in his State of the Union address to P-Tech, a public-private partnership that pairs the New York City public school system and the City University of New York with I.B.M., which promises graduates a shot at a well-paying job.
There are now more than 400 early college high schools across the country — North Carolina has 76 of them — educating an estimated 100,000 students.
Bard, a liberal arts college in Annandale-on-Hudson, N.Y., is at the vanguard of the movement, with a president, Leon Botstein, who has long chastised the American high school system for its inefficiencies. More than 30 years ago, Bard took over Simon’s Rock, a private college for 11th graders and up in Great Barrington, Mass. In 2001, it opened an early college high school in Lower Manhattan, enormously popular with hyper-motivated New Yorkers, and in 2008 it started one in Queens that has become a magnet for the high-achieving offspring of Chinese, Polish and Bengali immigrants. Until now, Bard’s model has largely focused on elite students.
In Newark, Bard moved into a school building across from a tire shop and a bail bond business. Hanging outside is a cheerful red banner with the Bard name etched in white, as if to signal that new life is being breathed into the neighborhood.
I was at dinner with a colleague this week—midterm week. Predictably, talk turned to the scourge of all professors: grading essays. There are few tasks in the life of a college professor less fulfilling than grading student essays. Every once in a while a really good essay jolts me to consciousness. I am elated by such encounters. To be honest, however, reading essays is for the most part stultifying. This is not the fault of the students, many of whom are brilliant and exuberant writers. I find it trying to wade through 25 essays discussing the same book, offering varying opinions and theories, while keeping my attention and interest. How many different ways can one ask for a thesis, talk about the importance of transition sentences, and correct grammar? For some time it is fun, in a way. One learns new things and is captivated by comparing how bright young minds see things. But after years, grading essays becomes simply the worst part of a great job.
So how might my colleagues and I react to news that EdX—the influential Harvard-MIT led consortium offering online courses—has developed software that will grade college student essays? I imagine it is sort of like how people felt when the dishwasher was invented. You mean we can cook and feast and don’t have to scrub pots and wash dishes? It promises to allow us to focus on teaching well without having to do that part of our job that we truly dread.
The appeal of computer grading is obvious and broad. Not only will many professors and teachers be freed from unwanted tedium, but also it may help our students. One advantage of computer grading is that it is nearly instantaneous. Students can hand in their work and get a grade and feedback seconds later. Too often essays are handed back days or even weeks after they are submitted. By then the students have lost interest in their paper and forgotten the inspiration that breathed life into their writing. To receive immediate feedback will allow students to see what they did wrong and how they could improve while the generative impulse underlying the paper is still fresh. Computer grading might encourage students to turn in numerous drafts of a paper; it may very well help teach students to write better, something that professorial comments delivered after a week rarely accomplish.
Another putative advantage of computer grading is its objectivity and consistency. Every professor knows that it matters when we read essays and in what order. Some essays find us awake and attentive. Others meet my eyes as they struggle to remain open. As much as I try to ignore the names on the top of the page, I can’t deny that my reading and grading is personalized to the students. I teach at a small liberal arts college where I know the students. If I read a particularly difficult sentence by a student I have come to trust, I often make a second effort. My personal attention has advantages but it is of course discriminatory. The computer will not do that, which may be seen by some as more fair. What is more, the computer doesn’t get tired or need caffeine.
Perhaps the most important advantage for administrators considering these programs is the cost savings. If computers relieve professors from the burden of grading, that means professors can teach more. It may also mean that fewer TAs are necessary in large lecture courses, thus saving money for strapped universities. There may even be a further side benefit to these programs. If universities need fewer TAs to grade papers, they may admit fewer graduate students to their programs, thus going some way towards alleviating the extraordinary and irresponsible overproduction of young professors that is swelling the ranks of unemployable Ph.D.s.
There are, of course, real worries about computer grading of essays. My concern is not that the computers will make mistakes (so do I); or that we lack studies that show that computers can grade as well as human professors—for I doubt professors are on the whole excellent graders. The real issue is elsewhere.
According to the group “Professionals Against Machine Scoring of Student Essays in High-Stakes Assessment,” the problem with computer grading of essays is simple: Machines cannot read. Here is what the group says in a statement:
Let’s face the realities of automatic essay scoring. Computers cannot ‘read.’ They cannot measure the essentials of effective written communication: accuracy, reasoning, adequacy of evidence, good sense, ethical stance, convincing argument, meaningful organization, clarity, and veracity, among others.
What needs to be taken seriously is not that computers can’t grade as well as humans. In many ways they grade better. More consistently. More honestly. With less grade inflation. And more quickly. But computer grading will be different than human grading. It will be less nuanced and aspire to clearly defined criteria. Are sentences grammatical? Is there a clear statement of the thesis? Are there examples given? Is there a transition between sentences? All of these are important parts of good writing and the computer can be trained to look for these characteristics in an essay. What this means, however, is that computers will demand the kind of clear, precise, and logical writing that computers can understand and that many professors and administrators demand from students. What this also means, however, is that writing will become more mechanical.
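To make that concrete, here is a toy sketch, in Python, of the kind of surface-feature scoring such software might perform. It is not EdX's actual program, whose internals are not described here; every feature name, pattern, and weight below is a hypothetical illustration of how a machine can assign a score without ever "reading" the essay.

```python
# A toy sketch of rubric-style, surface-feature essay scoring. The features and
# weights are invented for illustration; the point is that the score rewards
# mechanical properties (length, transition words, a thesis-like first sentence),
# not meaning.

import re

TRANSITIONS = {"however", "therefore", "moreover", "consequently", "furthermore"}

def score_essay(text: str) -> float:
    sentences = [s.strip() for s in re.split(r"[.!?]+", text) if s.strip()]
    words = text.split()
    features = {
        # Longer essays tend to score higher with feature-based graders.
        "length": min(len(words) / 500.0, 1.0),
        # Crude proxy for a thesis: a first sentence that takes a stance.
        "has_thesis_marker": 1.0 if sentences and re.search(
            r"\b(argue|claim|contend|show)\b", sentences[0], re.I) else 0.0,
        # Reward transition words anywhere in the essay.
        "transitions": min(sum(w.lower().strip(",") in TRANSITIONS for w in words) / 5.0, 1.0),
        # Penalize very long, possibly run-on sentences.
        "avg_sentence_len_ok": 1.0 if sentences and len(words) / len(sentences) < 30 else 0.0,
    }
    weights = {"length": 0.3, "has_thesis_marker": 0.3,
               "transitions": 0.2, "avg_sentence_len_ok": 0.2}
    return sum(weights[k] * v for k, v in features.items())

print(round(score_essay("I argue that drones change war. However, power still matters."), 2))
```

A student who learns what such a grader rewards will, quite sensibly, write to the rubric: clear markers, short sentences, stock transitions. That is exactly the mechanization of writing worried about above.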
There is much to be learned here from an analogy with the rise of computer chess. The great grandmaster Garry Kasparov—who famously lost to Deep Blue—has perceptively argued that machines have changed the way chess is played and redefined what a good chess move and a well-played chess game look like. As I have written before:
The heavy use of computer analysis has pushed the game itself in new directions. The machine doesn’t care about style or patterns or hundreds of years of established theory. It counts up the values of the chess pieces, analyzes a few billion moves, and counts them up again. (A computer translates each piece and each positional factor into a value in order to reduce the game to numbers it can crunch.) It is entirely free of prejudice and doctrine and this has contributed to the development of players who are almost as free of dogma as the machines with which they train. Increasingly, a move isn’t good or bad because it looks that way or because it hasn’t been done that way before. It’s simply good if it works and bad if it doesn’t. Although we still require a strong measure of intuition and logic to play well, humans today are starting to play more like computers. One way to put this is that as we rely on computers and begin to value what computers value and think like computers think, our world becomes more rational, more efficient, and more powerful, but also less beautiful, less unique, and less exotic.
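For readers who want to see what "counting up the values of the chess pieces" amounts to, here is a minimal sketch, in Python, of a material-only evaluation function using the conventional textbook piece values. Real engines layer positional factors and deep search on top of something like this; the function below is only an illustration of reducing the game to numbers.

```python
# Conventional textbook piece values, used here only for illustration.
PIECE_VALUES = {"P": 1, "N": 3, "B": 3, "R": 5, "Q": 9, "K": 0}

def evaluate(board: str) -> int:
    """Score a position from White's point of view.

    `board` is a string of piece letters, uppercase for White and lowercase for
    Black (empty squares omitted). Positive means White is ahead in material.
    """
    score = 0
    for ch in board:
        if ch.upper() in PIECE_VALUES:
            value = PIECE_VALUES[ch.upper()]
            score += value if ch.isupper() else -value
    return score

# White has two rooks and one bishop; Black has one rook and two bishops:
# White is "up the exchange" by 5 - 3 = +2.
print(evaluate("KQRRBNNPPPPPPPP" + "kqrbbnnpppppppp"))
```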
Much the same might be expected from the increasing use of computers to grade (and eventually to write) essays. Students will learn to write in ways expected by computers, just as they today try to learn to write in ways desired by their professors. The difference is that different professors demand and respond to varying styles. Computers will consistently and logically drive writing towards a more mechanical and logical style. Writing, like chess playing, will likely become more rational, more efficient, and more effective, but also less beautiful, less unique, and less eccentric. In other words, writing will become less human.
It turns out that many secondary school districts already use computers to grade essays. But according to John Markoff in The New York Times, the EdX software promises to bring the technology into college classrooms as well as online courses.
It is quite possible that in the near future, my colleagues and I will no longer have to complain about grading essays. But that is unlikely at Bard. More likely is that such software will be used in large university lecture courses. In such courses with hundreds of students, professors already shorten questions or replace essays with multiple-choice tests. Or they use armies of underpaid graduate students to grade these essays. It is quite likely that software will actually augment the educational value of writing assignments at college in these large lecture halls.
In seminars, however, and in classes at small liberal arts colleges like Bard where I teach, such software will not likely free my colleagues and me from reading essays. The essays I assign are not simple responses to questions in which there are clear criteria for grading. I look for elegance, brevity, insight, and the human spark (please no comments on my writing). Whether or not I am good at evaluating writing or at teaching writing, that is my aspiration. I seek to encourage writing that is thoughtful rather than writing that is simply accurate. When I have time to make meaningful comments on papers, they concern structure, elegance, and depth. It is not only a way to grade an essay, but also a way to connect with my students and help them to see what it means to write and think well.
And yet, I can easily imagine making use of such a computer-grading program. I rarely have time to grade essays as well or as quickly as I would like. I would love to have my students submit drafts of their essays to the EdX computer program.
If they could repeatedly submit their essays and receive such feedback and use the computer to catch not only grammatical errors but also poor sentences, redundancies, repetitions, and whatever other mistakes the computer can be trained to recognize, that would allow them to respond and rework their essays many times before I see them. Used well, I hope, such grading programs might really augment my capacities as a professor and their experiences as students.
I have real fears that grading technology will rarely be used well. Rather, it will too often replace human grading altogether, and in large lectures, high schools, and standardized tests it will impose a new and inhuman standard on the way we write and thus the way we think. We should greet such new technologies both enthusiastically and skeptically. But first, we should try to understand them. Towards that end, it is well worth reading John Markoff’s excellent account of the new EdX computer grading software in The New York Times. It is your weekend read.
In an essay in the Wall Street Journal, Frans de Waal—C. H. Candler Professor of Primate Behavior at Emory University—offers a fascinating review of recent scientific studies that upend long-held expectations about the intelligence of animals. De Waal rehearses a catalogue of fantastic studies in which animals do things that scientists have long thought they could not do. Here are a few examples:
Ayumu, a male chimpanzee, excels at memory; just as the IBM computer Watson can beat human champions at Jeopardy, Ayumu can easily best the human memory champion in games of memory.
Similarly, Kandula, a young elephant bull, was able to reach some fragrant fruit hung out of reach by moving a stool over to the tree, standing on it, and reaching for the fruit with his trunk. I’ll admit this doesn’t seem like much of a feat to me, but for the researchers de Waal talks with, it is surprising proof that elephants can use tools.
Scientists may be surprised that animals can remember things or use tools to accomplish tasks, but anyone raised on children’s tales of Lassie or Black Beauty knows this well, as does anyone whose pet dog has opened a doorknob, brought them a newspaper, or barked at intruders. The problem these studies address is less our societal view of animals than the overly reductive view of animals that de Waal attributes to his fellow scientists. It’s hard to take these studies seriously as evidence that animals think in the way that humans do.
Seemingly more interesting are experiments with self-recognition and facial recognition. De Waal describes one Asian elephant who stood in front of a mirror and “repeatedly rubbed a white cross on her forehead.” Apparently the elephant recognized the image in the mirror as herself. In another experiment, chimpanzees were able to recognize which pictures of chimpanzees were from their own species. Like my childhood Labrador who used to stare knowingly into the mirror, these studies confirm that animals are able to recognize themselves. This means that animals likely do understand that they are selves.
For de Waal, these studies have started to upend a view of humankind's unique place in the universe that dates back at least to ancient Greece. “Science,” he writes, “keeps chipping away at the wall that separates us from the other animals. We have moved from viewing animals as instinct-driven stimulus-response machines to seeing them as sophisticated decision makers.”
The flattening of the distinction between animals and humans is to be celebrated, de Waal argues, and not feared. He writes:
Aristotle's ladder of nature is not just being flattened; it is being transformed into a bush with many branches. This is no insult to human superiority. It is long-overdue recognition that intelligent life is not something for us to seek in the outer reaches of space but is abundant right here on earth, under our noses.
De Waal has long championed the intelligence of animals, and now his vision is gaining momentum. This week, in a long essay called “One of Us” in the new Lapham’s Quarterly on animals, the glorious essayist John Jeremiah Sullivan begins with this description of studies similar to the ones de Waal writes about:
These are stimulating times for anyone interested in questions of animal consciousness. On what seems like a monthly basis, scientific teams announce the results of new experiments, adding to a preponderance of evidence that we’ve been underestimating animal minds, even those of us who have rated them fairly highly. New animal behaviors and capacities are observed in the wild, often involving tool use—or at least object manipulation—the very kinds of activity that led the distinguished zoologist Donald R. Griffin to found the field of cognitive ethology (animal thinking) in 1978: octopuses piling stones in front of their hideyholes, to name one recent example; or dolphins fitting marine sponges to their beaks in order to dig for food on the seabed; or wasps using small stones to smooth the sand around their egg chambers, concealing them from predators. At the same time neurobiologists have been finding that the physical structures in our own brains most commonly held responsible for consciousness are not as rare in the animal kingdom as had been assumed. Indeed they are common. All of this work and discovery appeared to reach a kind of crescendo last summer, when an international group of prominent neuroscientists meeting at the University of Cambridge issued “The Cambridge Declaration on Consciousness in Non-Human Animals,” a document stating that “humans are not unique in possessing the neurological substrates that generate consciousness.” It goes further to conclude that numerous documented animal behaviors must be considered “consistent with experienced feeling states.”
With nuance and subtlety, Sullivan understands that our tradition has not drawn the boundary between human and animal nearly as securely as de Waal portrays it. Throughout human existence, humans and animals have been conjoined in the human imagination. Sullivan writes that the most consistent “motif in the artwork made between four thousand and forty thousand years ago,” is the focus on “animal-human hybrids, drawings and carvings and statuettes showing part man or woman and part something else—lion or bird or bear.” In these paintings and sculptures, our ancestors gave form to a basic intuition: “Animals knew things, possessed their forms of wisdom.”
Religious history too is replete with evidence of the human recognition of the dignity of animals. God says in Isaiah that the beasts will honor him and St. Francis, the namesake of the new Pope, is famous for preaching to birds. What is more, we are told that God cares about the deaths of animals.
In the Gospel According to Matthew we’re told, “Not one of them will fall to the ground apart from your Father.” Think about that. If the bird dies on the branch, and the bird has no immortal soul, and is from that moment only inanimate matter, already basically dust, how can it be “with” God as it’s falling? And not in some abstract all-of-creation sense but in the very way that we are with Him, the explicit point of the verse: the line right before it is “fear not them which kill the body, but are not able to kill the soul.” If sparrows lack souls, if the logos liveth not in them, Jesus isn’t making any sense in Matthew 10:28-29.
What changed and interrupted the ancient and deeply human appreciation of our kinship with besouled animals? Sullivan’s answer is René Descartes. The modern depreciation of animals, Sullivan writes,
proceeds, with the rest of the Enlightenment, from the mind of René Descartes, whose take on animals was vividly (and approvingly) paraphrased by the French philosopher Nicolas Malebranche: they “eat without pleasure, cry without pain, grow without knowing it; they desire nothing, fear nothing, know nothing.” Descartes’ term for them was automata—windup toys, like the Renaissance protorobots he’d seen as a boy in the gardens at Saint-Germain-en-Laye, “hydraulic statues” that moved and made music and even appeared to speak as they sprinkled the plants.
Too easy, however, is the move to say that the modern comprehension of the difference between animal and human proceeds from a mechanistic view of animals. We live in the age of the animal rights movement. Around the world, societies exist and thrive whose mission is to prevent cruelty toward animals and to protect them. Yes, factory farms treat chickens and pigs as organic mechanisms for the production of meat, but these farms co-exist with active and quite successful movements calling for humane standards in food production. Whatever the power of Cartesian mechanics, its success is at odds with the persistence of an ancient and religious solidarity, and an equally deep modern sympathy, between human and animal.
A more meaningful account of the modern attitude towards animals might be found in Spinoza. Spinoza, as Sullivan quotes him, recognizes that animals feel in ways that Descartes did not. As do animal rights activists, Spinoza admits what is obvious: that animals feel pain, show emotion, and have desires. And yet, Spinoza maintains a distinction between human and animal—one grounded not in emotion or feeling, but in human nature. In his Ethics, he writes:
Hence it follows that the emotions of the animals which are called irrational…only differ from man’s emotions to the extent that brute nature differs from human nature. Horse and man are alike carried away by the desire of procreation, but the desire of the former is equine, the desire of the latter is human…Thus, although each individual lives content and rejoices in that nature belonging to him wherein he has his being, yet the life, wherein each is content and rejoices, is nothing else but the idea, or soul, of the said individual…It follows from the foregoing proposition that there is no small difference between the joy which actuates, say, a drunkard, and the joy possessed by a philosopher.
Spinoza argues against the law prohibiting slaughter of animals—it is “founded rather on vain superstition and womanish pity than on sound reason”—because humans are more powerful than animals. Here is how he defends the slaughter of animals:
The rational quest of what is useful to us further teaches us the necessity of associating ourselves with our fellow men, but not with beasts, or things, whose nature is different from our own; we have the same rights in respect to them as they have in respect to us. Nay, as everyone’s right is defined by his virtue, or power, men have far greater rights over beasts than beasts have over men. Still I do not deny that beasts feel: what I deny is that we may not consult our own advantage and use them as we please, treating them in the way which best suits us; for their nature is not like ours.
Spinoza’s point is quite simple: Of course animals feel and of course they are intelligent. Who could doubt such a thing? But they are not human. That is clear too. While we humans may care for and even love our pets, we recognize the difference between a dog and a human. And we will, in the end, associate more with our fellow humans than with dogs and porpoises. Finally, we humans will use animals when they serve our purposes. And this is okay, because we have the power to do so.
Is Spinoza arguing that might makes right? Surely not in the realm of law amongst fellow humans. But he is insisting that we recognize that for us humans, there is something about being human that is different and even higher and more important. Spinoza couches his argument in the language of natural right, but what he is saying is that we must recognize that there are important differences between animals and humans.
At a time that values equality over what Friedrich Nietzsche called the “pathos of difference,” the valuation of human beings over animals is ever more in doubt. This comes home clearly in a story told recently by General Stanley McChrystal about a soldier who expressed sympathy for some dogs killed in a raid in Iraq. McChrystal responded, severely: “‘Seven enemy were killed on that target last night. Seven humans. Are you telling me you're more concerned about the dog than the people that died?’ The car fell silent again. ‘Hey listen,’ I said. ‘Don't lose your humanity in this thing.’” Many, no doubt, are more concerned, or at least equally concerned, about the deaths of animals as they are about the deaths of humans. There is ever-increasing discomfort with McChrystal’s common-sense affirmation of Spinoza’s claim that human beings simply are of more worth than animals.
The distinctions upon which the moral sense of human distinctiveness is based are foundering. For de Waal and Sullivan, the danger today is that we continue to insist on differences between animals and humans—differences that we don’t fully understand. The consequence of their openness to the humanization of animals, however, is undoubtedly the animalization of humans. The danger that we humans lose sight of what distinguishes us from animals is much more significant than the possibility that we underestimate animal intelligence.
I fully agree with de Waal and Sullivan that there is a symphony of intelligence in the world, much of it not human. And yes, we should have proper respect for our ignorance. But all the experiments in the world do little to alter the basic fact that, no matter how intelligent and feeling and even conscious animals may be, humans and animals are different.
What is the quality of that difference? It is difficult to say and may never be fully articulated in propositional form. On one level it is this: Simply to live, as do plants or animals, does not constitute a human life. In other words, human life is not simply about living. Nor is it about doing tasks or even being conscious of ourselves as humans. It is about living meaningfully. There may, of course, be some animals that can create worlds of meaning—worlds that we have not yet discovered. But their worlds are not the worlds to which we humans aspire.
Over two millennia ago, Sophocles, in his “Ode to Man,” named man Deinon, a Greek word that connotes both greatness and horror, that which is so extraordinary as to be at once terrifying and beautiful. Man, Sophocles tells us, can travel over water and tame animals, using them to plough fields. He can invent speech, and institute governments that bring humans together to form lasting institutions. As an inventor and maker of his world, this wonder that is man terrifyingly carries the seeds of his destruction. As he invents and comes to control his world, he threatens to extinguish the mystery of his existence, that part of man that man himself does not control. As the chorus sings: “Always overcoming all ways, man loses his way and comes to nothing.” If man so tames the earth as to free himself from uncertainty, what then is left of human being?
Sophocles knew that man could be a terror; but he also glorified the wonder that man is. He knew that what separates us humans from animals is our capacity to alter the earth and our natural environment. “The human artifice of the world,” Arendt writes, “separates human existence from all mere animal environment…” Not only by building houses and erecting dams—animals can do those things and more—but also by telling stories and building political communities that give to man a humanly created world in which he lives. If all we did as humans was live or build things on earth, we would not be human.
To be human means that we can destroy all living matter on the earth. We can even, today, destroy the earth itself. Whether we do so or not, to live on earth today is a choice that we make, not a matter of fate or chance. Our earth, although we did not create it, is now something we humans can decide to sustain or destroy. In this sense, it is a human creation. No other animal has such a potential or such a responsibility.
There is a deep desire today to flee from that awesome and increasingly unbearable human responsibility. We flee, therefore, our humanity and take solace in the view that we are just one amongst the many animals in the world. We see this reductionism above all in human rights discourse. One core demand of human rights—that men and women have a right to live and not be killed—brought about a shift in the idea of humanity from logos to life. The rise of a politics of life—the political demand that governments limit freedoms and regulate populations in order to protect and facilitate their citizens’ ability to live in comfort—has pushed the animality, the “life,” of human beings to the center of political and ethical activity. In embracing a politics of life over a politics of the meaningful life, human rights rejects the distinctive dignity of human rationality and works to reduce humanity to its animality.
Hannah Arendt saw human rights as dangerous precisely because they risked confusing the meaning of human worldliness with the existence of mere animal life. For Arendt, human beings are the beings who build and live in a political world, by which she means the stories, institutions, and achievements that mark the glory and agony of humanity. To be human, she insists, is more than simply living, laboring, working, acting, and thinking. It is to do all of these activities in such a way as to create, together, a common life amongst a plurality of persons.
I fear that the interest in animal consciousness today is less a result of scientific proof that animals are like us than it is a symptom of an increasing discomfort with the world we humans have built. A first step in responding to such discomfort, however, is a reaffirmation of our humanity and our human responsibility. There is no better way to begin that process than in engaging with a very human response to the question of our animality. Towards that end, I commend to you “One of Us,” by John Jeremiah Sullivan.
Thomas Levin of Princeton came to Bard Tuesday to give a lecture to the Drones Seminar, a weekly class I am participating in, led by my colleague Thomas Keenan and conceived by two of our students, Arthur Holland and Dan Gettinger. Levin has studied surveillance techniques for years, and he came to think with us about how the present obsession with drones will transform our landscape and our imaginations. At a time when the obsession with drones in the media is focused on their offensive capacities, it is important to recall that drones were originally developed as a surveillance technology. If drones are to become omnipresent in our lives, what will that mean?
Levin began by reminding us of the embrace of other surveillance devices in mass culture, like recording devices at the turn of the 20th century. He showed old postcards and cartoons in which unsuspecting servants or children were caught goofing off or insulting their superiors by newfangled recording devices like the cylinder phonograph and, later, by hidden cameras and spy satellites. The realization that we are being watched emerges and pervades the popular consciousness. In looking to these representations from mass culture of the fear, awareness, and even expectation that we will be watched and listened to, Levin finds the emergence of what he calls a “rhetoric of surveillance.”
In short, we talk and think constantly about the fact that we are, or may be, being watched. This cannot but change the way we behave and act. Levin thus poses the question: What is the emerging drone imaginary?
To answer that question it is helpful to revisit an uncannily prescient imagination of the rise of drones in a text written over half a century ago, Ernst Jünger’s The Glass Bees. Originally published in 1957 and recently reissued in translation with an introduction by science fiction novelist Bruce Sterling, Jünger’s text centers on a job interview between an unnamed former light cavalry officer and Giacomo Zapparoni, the secretive, filthy rich, and powerful proprietor of The Zapparoni Works, which “manufactured robots for every imaginable purpose.” Zapparoni’s secret, however, is that instead of big and hulking robots, he specialized in Lilliputian robots that gave “the impression of intelligent ants.”
The robots were not powerful in themselves, but they worked together. Like drone bees and drone ants—that exist only for procreation and then die—the small robots, or drones, serve specific purposes in industry or business. Zapparoni’s tiny robots “could count, weigh, sort gems or paper money….” Their power came from their coordination.
The robots “worked in dangerous locations, handling explosives, dangerous viruses, and even radioactive materials. Swarms of selectors could not only detect the faintest smell of smoke but could also extinguish a fire at an early stage; others repaired defective wiring, and still others fed upon filth and became indispensable in all jobs where cleanliness was essential.” Dispensable and efficient, Zapparoni’s little robots could do the most dangerous and least desirable tasks.
In The Glass Bees, we are introduced to Zapparoni’s latest invention: flying glass bees that can pollinate flowers much more efficiently and quickly than natural bees. The bees “were about the size of a walnut still encased in its green shell.” They were completely transparent and they were an improvement upon nature, at least insofar as the pollination of flowers was concerned. If a true or natural bee “sucked first on the calyx, at least a dessert remained.” But Zapparoni’s glass bees “proceeded more economically; that is, they drained the flower more thoroughly.” What is more, the bees were a marvel of agility and skill: “Given the flying speed, the fact that no collisions occurred during these flights back and forth was a masterly feat.” According to the cavalry officer, “It was evident that the natural procedure had been simplified, cut short, and standardized.”
Before our hero is introduced to Zapparoni’s bees, he is given a warning: “Beware of the bees!” And yet he forgets this warning. Watching the glass bees, the cavalry officer is fascinated. He felt himself “come under the spell of the deeper domain of techniques,” which like a spectacle “both enthralled and mesmerized.” His mind, he writes, went to sleep and he “forgot time” and “also entirely forgot the possibility of danger.”
Jünger’s book tells, in part, the story of our fascination with, and subjection to, technologies of surveillance. On Facebook or Words with Friends, or even using our smartphones or GPS systems, we allow our fascination with technology to dull our sense of its danger. As Jünger writes: “Technical perfection strives toward the calculable, human perfection toward the incalculable. Perfect mechanisms—around which, therefore, stands an uncanny but fascinating halo of brilliance—evoke both fear and a titanic pride which will be humbled not by insight but only by catastrophe.”
The protagonist of The Glass Bees, a former member of the Light Cavalry and later a tank inspector, had once been fascinated by the “succession of ever new models becoming obsolete at an ever increasing speed, this cunning question-and-answer game between overbred brains.” What he came to see is that “the struggle for power had reached a new stage; it was fought with scientific formulas. The weapons vanished in the abyss like fleeting images, like pictures one throws into the fire. New ones were produced in protean succession.” Victory ceased to be about physical battle; it became, instead, a contest of technical mastery and knowledge.
The danger drones pose is not necessarily military. As General Stanley McChrystal rightly said when I asked him about this last week at the New York Historical Society, drones are simply another military tool that can be used for good or ill. Many fret today about collateral damage by drones and forget that if we had to send in armies to do these tasks the collateral damage would be much greater. Others worry about assassination, but drones are simply the tool, not the person pulling the trigger. It may be true that having drones when others don’t offers an enormous military advantage and makes the decision to kill easier, but when both sides have drones, we will all think harder before beginning a cycle of illegal assassinations.
Rather, the danger of drones is how they change us as humans. As we humans interact more regularly with drones and machines and computers, we will inevitably come to expect ourselves and our friends and our colleagues and our lovers to act with the efficiency and selflessness of drones. Sherry Turkle worries that mechanical companions offer such fascination and unquestioning love that humans are beginning to prefer spending time with their machines rather than with other humans—who make demands, get tired, act cranky, and disappoint us. Ron Arkin has argued that robot soldiers will be more humane at war than human soldiers, who often act rashly out of exhaustion, anger, or revenge. Doctors are learning to rely on Watson and artificially intelligent medical machines, which can bring databases of knowledge to bear on diagnoses with a speed and objectivity that humans can only dream of. In every area of human life where humans once were thought to be necessary, drones and machines are proving more reliable, more capable, and more desirable.
The danger drones represent is not what they do better than humans, but that they do it better than humans. They are a further step in the human dream of self-improvement—the desire to overcome our shame at our all-too-human limitations.
The incredible popularity of drones today is partly a result of their freeing us to fight wars with ever-reduced human and economic costs. But drones are popular also because they appeal to the human desire for perfection. The question is, however, how perfect we humans can be before we begin to lose our humanity. That is, of course, the force of Jünger’s warning: Beware of the bees!
As drones appear everywhere around us, you would do well to put down the newspaper and turn off YouTube and, instead, revisit Ernst Jünger’s classic tale of drones. The Glass Bees is your weekend read. You can read Bruce Sterling’s introduction to The Glass Bees here.
The white smoke ushered in a Pope from the New World, but one firmly planted in the old one. Pope Francis I is from Argentina but descended from Italy. According to the Archbishop of Paris, quoted in The New York Times, the Pope was not of the Curia and not part of the Italian system. At the same time, because of his “culture and background, he was Italo-compatible.” Straddling the new and the old, there is some glimmer of hope that Francis I will be able to reform the machinery of the ecclesiastical administration from the inside.
Amidst this tension, the new Pope signaled his desire to be seen as an outsider by choosing the name Francis I, aligning himself with St. Francis as protector of the poor and the downtrodden. At a time of near universal distrust in the ecclesiastical order, the Pope and his supporters present the choice of Cardinal Jorge Mario Bergoglio as an affirmation of simplicity and humility.
And in some respects the new Pope does appear to be a Pope for whom the life of Jesus and the life of St. Francis serve as an example of humility and service. At least, that is, if stories like this one, told by Emily Schmall and Larry Rohter, are to be credited:
In 2001 he surprised the staff of Muñiz Hospital in Buenos Aires, asking for a jar of water, which he used to wash the feet of 12 patients hospitalized with complications from the virus that causes AIDS. He then kissed their feet, telling reporters that “society forgets the sick and the poor.” More recently, in September 2012, he scolded priests in Buenos Aires who refused to baptize the children of unwed mothers. “No to hypocrisy,” he said of the priests at the time. “They are the ones who separate the people of God from salvation.”
Some complain that the Pope abjures liberation theology for its connection to Marxism and rejects the use of the Gospel for political and economic transformation. Nevertheless, stories like the one above are important and show an exemplary character in Pope Francis I.
Bigger questions arise about the new Pope’s past connection to what is called the Dirty War in Argentina, the period from 1976-1983 in which a brutal dictatorship stole children from their communist parents and gave them to military families while also disappearing political and ideological opponents. As one of my colleagues wrote to me, “Almost alone among major Latin American Churches, the Argentine Church officially allied itself with the military in a campaign to eradicate political dissidents (mostly left-wingers).” Bergoglio was a Catholic Church official during this period and he has been accused by many in Argentina of either not doing enough to oppose the regime or, more scandalously, actively collaborating with the dirty war. In 2005, a formal lawsuit claimed that Bergoglio had been complicit in the kidnapping and torture of two Jesuit priests, Orlando Yorio and Francisco Jalics. The priests were working in a poor barrio advocating against the dictatorship. Bergoglio insisted they stop, and they were expelled from the Jesuit Order. They disappeared and months later were found drugged and partially undressed, according to the reporting of Emily Schmall and Larry Rohter.
Margaret Hebblethwaite, in the Guardian, defends Bergoglio, whom she knows and respects. “It was the kind of complex situation that is capable of multiple interpretations, but it is far more likely Bergoglio was trying to save their lives.” And this is the account Bergoglio himself gives, as Schmall and Rohter report:
In a long interview published by an Argentine newspaper in 2010, he defended his behavior during the dictatorship. He said that he had helped hide people being sought for arrest or disappearance by the military because of their political views, had helped others leave Argentina and had lobbied the country’s military rulers directly for the release and protection of others.
I of course have no idea whether Bergoglio is the victim of baseless calumny, as he claims, or whether he actively or meekly collaborated with a ruthless dictatorship. What is clear, however, is that at the very least, Bergoglio and his colleagues in the Argentine Catholic Church over many years looked the other way and allowed a brutal government to terrorize its population without a word of opposition.
With that history in mind, it is worthwhile to consider Hannah Arendt’s essay “The Christian Pope,” published in the New York Review of Books in 1965. Arendt was reviewing Journal of a Soul, the spiritual diaries of Pope John XXIII, the former Angelo Giuseppe Roncalli. The Jewish thinker had little patience for the “endlessly repetitive devout outpourings and self-exhortation” that go on for “pages and pages” and read like “an elementary textbook on how to be good and avoid evil,” and she had little hope that such clichés, no matter how well meaning, would have much impact on the moral state of our time.
What did fascinate Arendt, however, were the anecdotes Pope John XXIII tells and the stories about him that she heard while traveling in Rome. She tells of a “Roman chambermaid” in her hotel who asked her, in all innocence:
“Madam,” she said, “this Pope was a real Christian. How could that be? And how could it happen that a true Christian would sit on St. Peter’s chair? Didn’t he first have to be appointed Bishop, and Archbishop, and Cardinal, until he finally was elected to be Pope? Had nobody been aware of who he was?”
Arendt had a simple answer for the maid. “No.” She writes that Roncalli was largely unknown upon his selection and arrived as an outsider. He was, in the words of her title, a true Christian living in the spirit of Jesus Christ. In a sense, this was so surprising in the midst of the 20th century that no one had imagined it to be possible, and Roncalli was selected without anyone knowing who he was.
Who he was Arendt found not in his book, but in the stories told about him. Whether the stories are authentic, she writes, is not so important, because “even if their authenticity were denied, their very invention would be characteristic enough for the man and for what people thought of him to make them worth telling.” One of these stories shows Roncalli’s common touch, something now being praised widely in Bergoglio.
The story goes that plumbers had arrived for repairs in the Vatican. The Pope heard one of them begin swearing in the name of the whole Holy Family. He came out and asked politely: “Must you do this? Can’t you say merde as we do too?”
My favorite story tells of Roncalli’s meeting with Pope Pius XII in 1944 in Paris. Apparently Pius told Roncalli that he was busy and had only seven minutes to spare for their conversation. Roncalli then “took his leave with the words: ‘In that case, the remaining six minutes are superfluous.’”
And then there is the story of Roncalli’s reaction when he was given Rolf Hochhuth’s play, The Deputy, which portrayed Pope Pius XII as silent and indifferent to the persecution and extermination of European Jews. When Roncalli was asked what one could do against Hochhuth’s play, he responded: “Do against it? What can you do against the truth?”
These stories are essential, Arendt writes, because they
show the complete independence which comes from a true detachment from the things of this world, the splendid freedom from prejudice and convention which quite frequently could result in an almost Voltairean wit, an astounding quickness in turning the tables.
Arendt found in Roncalli the kind of independence and “self-thinking” she valued so highly and that unites all the persons she profiled in her book Men in Dark Times. For Roncalli, his “complete freedom from cares and worries was his form of humility; what set him free was that he could say without any reservation, mental or emotional: ‘Thy will be done.’” It was this humility that girded Roncalli’s faith and led to his being content to live from day to day and even hour to hour “like the lilies in the field” with “no concern for the future.” It was, in other words, his faith—and not any theory or philosophy—that “guarded him against ‘in any way conniving with evil in the hope that by so doing [he] may be useful to someone.’” A true Christian in imitation of Jesus, Roncalli was one who “welcomed his painful and premature death as confirmation of his vocation: the ‘sacrifice’ that was needed for the great enterprise he had to leave undone.”
There was one exception, however, to Roncalli’s sureness of his innocence, and that was his action and service during World War II. Here is Arendt’s account:
It is with respect to his work in Turkey, where, during the war, he came into contact with Jewish organizations (and, in one instance, prevented the Turkish government from shipping back to Germany some hundred Jewish children who had escaped from Nazi-occupied Europe) that he later raised one of the very rare serious reproaches against himself—for all “examinations of conscience” notwithstanding, he was not at all given to self-criticism. “Could I not,” he wrote, “should I not, have done more, have made a more decided effort and gone against the inclinations of my nature? Did the search for calm and peace, which I considered to be more in harmony with the Lord’s spirit, not perhaps mask a certain unwillingness to take up the sword?” At this time, however, he had permitted himself but one outburst. Upon the outbreak of the war with Russia, he was approached by the German Ambassador, Franz von Papen, who asked him to use his influence in Rome for outspoken support of Germany by the Pope. “And what shall I say about the millions of Jews your countrymen are murdering in Poland and in Germany?” This was in 1941, when the great massacre had just begun.
Even in questioning his own actions during the war, Roncalli shows himself to be a man of independence and faith. Yes, he might have done more. But unlike so many who did nothing, he made his dissent known, worked to do good where he could, and yet still fell short. And then he struggled with his shortcomings.
These stories of the self-thinking independence of Pope John XXIII offer a revealing and humbling reflection in relation to the new Pope Francis I. Like Roncalli, Bergoglio is praised for his humility and his simple faith. And like Roncalli, Bergoglio served the Church through dark times, when secular authorities were engaging in untold evils and the Church remained silent if not complicit. But Roncalli not only spoke up and acted to protect the persecuted and the hopeless; he also worried that he had not done enough. He was right.
Many are accusing Pope Francis I of war crimes and complicity. I worry about jumping to conclusions when we do not know what happened. But the new Pope carries baggage Roncalli did not—formal accusations of complicity with terror and torture. It is human to respond with denials and anger. It would be fitting, however, if Pope Francis I would throw aside such defenses and let the truth come out. That would be an instance of leadership by example that might actually begin to clean up the dirty laundry of the Catholic Church.
On this first weekend of Pope Francis I’s new reign, it is well worth revisiting Hannah Arendt’s “The Christian Pope.” It is your weekend read.
The NY Times editorial page takes aim at online education on Monday. It turns out that studies show that more students in online classes drop out, more fail, and fewer graduate. This is not surprising. But one might ask: so what? Online courses are proliferating and will continue to do so because they are less expensive. For some students, they may even be better. But for high-risk students, the track record is poor. Here is the Times editorial board’s conclusion:
A five-year study, issued in 2011, tracked 51,000 students enrolled in Washington State community and technical colleges. It found that those who took higher proportions of online courses were less likely to earn degrees or transfer to four-year colleges. The reasons for such failures are well known. Many students, for example, show up at college (or junior college) unprepared to learn, unable to manage time and having failed to master basics like math and English.
Lacking confidence as well as competence, these students need engagement with their teachers to feel comfortable and to succeed. What they often get online is estrangement from the instructor who rarely can get to know them directly. Colleges need to improve online courses before they deploy them widely. Moreover, schools with high numbers of students needing remedial education should consider requiring at least some students to demonstrate success in traditional classes before allowing them to take online courses.
The Times’ solution is based on a common lament, that young people are caught in a double bind, what Joseph Stiglitz recently described as a Catch-22:
Without a college education, they are condemned to a life of poor prospects; with a college education, they may be condemned to a lifetime of living at the brink. And increasingly even a college degree isn’t enough; one needs either a graduate degree or a series of (often unpaid) internships. Those at the top have the connections and social capital to get those opportunities. Those in the middle and bottom don’t. The point is that no one makes it on his or her own. And those at the top get more help from their families than do those lower down on the ladder. Government should help to level the playing field.
Stiglitz, like the NY Times editorial board, worries that the current higher educational system is poorly suited to addressing questions of class. Both are right. College education is too expensive for most poor and even many middle class Americans. This is especially true since many people spend much of their time (and money) in college taking remedial courses where they learn little of extra value. And when these at-risk students do attend college, they too often emerge with life-altering debt rather than a transformative education.
What both the Times and Stiglitz want is to change the system of college and how we subsidize it. I leave aside the argument over whether government subsidies for higher education are the right answer. That becomes a question of how much money we want to pay as a percentage of our GDP.
But what does seem strange is that we continue to see our colleges as the problem here. As the Times rightly sees, the problem is that students arrive at college unprepared.
Our overburdened public colleges must spend a fortune on remedial education for students. And then we charge students for this remedial education, which frequently fails, leaving them with debt and nothing else.
Whereas colleges cost students money, high school education is typically free. The first line of attack on inequality through education should be reforming and improving high schools. Yet no one speaks about that. President Obama’s education initiatives focus on early pre-school education and community college. High schools are left out. But if we could divert the huge resources currently spent on remedial college education to high schools, maybe college wouldn’t be so necessary. And maybe those who attended college might then be ready to work at a college level.
In this week's Chronicle of Higher Education, Richard D. Kahlenberg lifts (or rips) the band-aid off a wound that has been festering for decades. For much of the 20th century, class animated campus Marxists. Since the 1970s, race and gender have largely supplanted class as the source of youthful protest. But the pendulum is swinging back. Studies find that "being an underrepresented minority increased one's chances of admissions at selective colleges by almost 28 percentage points, but that being low-income provided no boost whatsoever." Will racial and gender politics give way to a renewed interest in class? Will there be a divide on the left between class and identity politics? In either case, the debate is beginning.
Here is Kahlenberg:
Long hidden from view, economic status is emerging from the shadows, as once-taboo discussions are taking shape. The growing economic divide in America, and on American campuses, has given rise to new student organizations, and new dialogues, focused on raising awareness of class issues—and proposing solutions. With the U.S. Supreme Court likely to curtail the consideration of race in college admissions this year, the role of economic disadvantage as a basis for preferences could further raise the salience of class.
This interest represents a return to an earlier era. Throughout the first half of the 20th century, class concerns animated Marxists on campus and New Deal politicians in the public sphere. Both groups papered over important dimensions of race and gender to focus on the nation's economic divide. Programs like Federal Housing Administration-guaranteed loans and the GI Bill provided crucial opportunities for upward mobility to some working-class families and students.
Colleges, meanwhile, began using the SAT to identify talented working-class candidates for admission. But FHA loans, the GI Bill, and the SAT still left many African-Americans, Latinos, and women out in the cold.
In the 1960s and 70s, that narrow class focus was rightly challenged by civil-rights activists, feminists, and advocates of gay rights, who shined new light on racism, sexism and homophobia. Black studies, women's studies, and later gay studies took root on college campuses, along with affirmative-action programs in student admissions and faculty employment to correct for the lack of attention paid to marginalized groups by politicians and academics alike.
Somewhere along the way, however, the pendulum swung to the point that issues of class were submerged. Admissions officers, for example, paid close attention to racial and ethnic diversity, but little to economic diversity. William Bowen, a former president of Princeton University, and his colleagues reported in 2005 that being an underrepresented minority increased one's chances of admissions at selective colleges by almost 28 percentage points, but that being low-income provided no boost whatsoever. Campuses became more racially and ethnically diverse—and all-male colleges began admitting women—but students from the most advantaged socioeconomic quartile of the population came to outnumber students from the least advantaged quartile at selective colleges by 25 to 1, according to a 2004 study by the Century Foundation.
Read the whole article here.
Kahlenberg’s inquiry into the return of class to debates on campus cannot be seen outside the context of rising inequality in the U.S. Just this week Annie Lowrey reports in the New York Times that incomes are rising briskly for the top 1% but are actually stagnant or falling for everyone else:
Incomes rose more than 11 percent for the top 1 percent of earners during the economic recovery, but not at all for everybody else, according to new data.
It may be true that prices are declining and the middle class, despite its wage stagnation, is still living well. But we cannot ignore the increasing divide between the rich and the middle class. Not to mention the poor.
This was the topic of an op-ed essay in Monday’s New York Times by Nobel laureate Joseph Stiglitz, who writes, “The gap between aspiration and reality could hardly be wider.” Stiglitz, like Kahlenberg, sets the question of class inequality against increasing racial equality:
While racial segregation decreased, economic segregation increased. After 1980, the poor grew poorer, the middle stagnated, and the top did better and better. Disparities widened between those living in poor localities and those living in rich suburbs — or rich enough to send their kids to private schools. A result was a widening gap in educational performance — the achievement gap between rich and poor kids born in 2001 was 30 to 40 percent larger than it was for those born 25 years earlier, the Stanford sociologist Sean F. Reardon found.
Many on the left will respond that race and class are linked: poor minorities, they say, suffer worst of all. That may be true. But race, gender, and identity have dominated the conversation about equality and oppression in this country for 50 years. That is changing. This will be hard for some to accept, and yet it makes sense. Poverty, more than race or gender, is increasingly the true mark of disadvantage in 21st century America.
Stephanie A. Miner, the Mayor of Syracuse, NY, has an important op-ed essay in The NY Times Thursday. Syracuse is one of hundreds of cities around the state and tens of thousands around the country that are struggling with the potentially disastrous effects of out-of-control pension costs. Where this crisis is heading can be seen in California, where San Bernardino has become the third California city to declare bankruptcy. These cities are dying. They are caught in a bind: either they decide not to pay their promised debts to pensioners, or, in honoring those debts, they raise taxes and cut services so drastically as to ruin the lives of their citizens.
In Syracuse, Mayor Miner understands well the depth of the problem. First, public employee labor costs are too high not because salaries are high, but because pension costs and medical benefits are rising without limit. Second, revenues are being slashed, both by the recession and by cutbacks from the state and federal governments. Finally, the middle- and upper-class flight from cities to suburbs has left the tax base in cities low at the very moment when poorer city dwellers are disproportionately in need of public services.
The result is that cities are faced with a stark choice: Do they pay older citizens what has been promised to them? Or do they cut those promised pensions in order to provide services for the young? This is a generational conflict that is playing out across the country.
Miner is worried that the response by NY State is making the problem worse. In short, Governor Cuomo and the legislature have decided to let cities that cannot afford to fund their burgeoning pension obligations borrow money to pay those pensions. The kicker is that the cities are being told to borrow money from the very same pension plan to which they owe money.
If this sounds suspicious, it is. As Danny Hakim—one of the best financial reporters around—wrote almost exactly one year ago in the NY Times, this is a desperate and dangerous move:
When New York State officials agreed to allow local governments to use an unusual borrowing plan to put off a portion of their pension obligations, fiscal watchdogs scoffed at the arrangement, calling it irresponsible and unwise.
And now, their fears are being realized: cities throughout the state, wealthy towns such as Southampton and East Hampton, counties like Nassau and Suffolk, and other public employers like the Westchester Medical Center and the New York Public Library are all managing their rising pension bills by borrowing from the very same $140 billion pension fund to which they owe money.
The state’s borrowing plan allows public employers to reduce their pension contributions in the short term in exchange for higher payments over the long term. Public pension funds around the country assume a certain rate of return every year and, despite the market gains over the last few years, are still straining to make up for steep investment losses incurred in the 2008 financial crisis, requiring governments to contribute more to keep pension systems afloat.
Supporters argue that the borrowing plan makes it possible for governments in New York to “smooth” their annual pension contributions to get through this prolonged period of market volatility.
Critics say it is a budgetary sleight-of-hand that simply kicks pension costs down the road.
Borrowing from the state pension plan to pay municipal pension costs is simply failing to pay the pensions this year and thus having to pay more next year.
Hakim, as good as he is, allows Thomas P. DiNapoli—the state’s comptroller—to get away with calling the scheme “amortization.”
The state’s comptroller, Thomas P. DiNapoli, said in a statement, “While the state’s pension fund is one of the strongest performers in the country, costs have increased due to the Wall Street meltdown.” He added that “amortizing pension costs is an option for some local governments to manage cash flow and to budget for long-term pension costs in good and bad times.”
But how is this amortization? The assumption or hope is that the market will rise, the pension fund will go up, and then the municipalities will owe less. That is hardly amortization. No, it is desperate speculation with public monies.
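To see why this is deferral rather than amortization, a back-of-the-envelope sketch helps. The figures below are hypothetical, not Syracuse's actual numbers, and the 7.5 percent rate is only a stand-in for the kind of assumed return pension funds use; the point is simply that a contribution skipped today compounds into a larger bill tomorrow.

```python
# A minimal sketch with hypothetical numbers (not any city's actual figures):
# a deferred pension contribution is not forgiven; it compounds at the fund's
# assumed rate of return until it is repaid, so the future bill is larger.

def deferred_bill(amount: float, assumed_return: float, years_deferred: int) -> float:
    """What a skipped contribution grows into by the time it must be repaid."""
    return amount * (1 + assumed_return) ** years_deferred

if __name__ == "__main__":
    skipped = 10_000_000          # hypothetical contribution deferred this year
    assumed_return = 0.075        # stand-in for a typical assumed rate of return
    for years in (1, 5, 10):
        owed = deferred_bill(skipped, assumed_return, years)
        print(f"Deferred {years:2d} year(s): owe ${owed:,.0f} instead of $10,000,000")
```

If markets return more than the assumed rate, the gap narrows; if they return less, the deferred bill grows even faster. Either way, nothing is paid down; the obligation is pushed forward and enlarged.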
The crisis in our cities afflicts the whole country, according to a study by the Pew Center on the States.
Cities employing nearly half of U.S. municipal workers saw their pension and retiree health-care funding levels fall from 79% in fiscal year 2007 to 74% in fiscal year 2009, using the latest available data, according to the Pew Center on the States. Pension systems are considered healthy if they are 80% funded.
The reason to pay attention to the problems in cities is that cities have even less ability to solve their pension shortfalls than states. The smaller the population, the more a city would have to tax each citizen in order to help pay for the pensions of its retired public workers. The result is that either cities get bailed out by states and lose their independence (as is happening in Michigan) or the cities file for bankruptcy (as is happening in California).
Mayor Miner, a Democrat, takes a huge risk in standing up to the Governor and the legislature. She is rightly insisting that they stop hiding from our national addiction to the crack-cocaine of unaffordable guaranteed lifetime pensions. Piling unpayable debts upon our cities will, in the end, bankrupt these cities. And it will continue the flight to the suburbs and the hollowing out of the urban core of America. Above all, it will sacrifice our future in order to allow the baby boomers to retire in luxury. Let’s hope Miner’s call doesn’t go unheeded.
You know elite universities are in trouble when their professors talk the way Edward Rock does. Rock, Distinguished Professor at the University of Pennsylvania Law School and coordinator of Penn’s online education program, has this to say about the impending revolution in online education:
We’re in the business of creating and disseminating knowledge. And in 2012, the internet is an incredibly important place to be present if you’re in the knowledge dissemination business.
If elite colleges are in the knowledge dissemination business, then they will, over time, be increasingly devalued and made less relevant. What colleges and universities need to offer is not simply knowledge, but education.
In 1947, at the age of 18, Martin Luther King Jr. wrote a short essay in The Maroon Tiger, the Morehouse College campus newspaper. The article was titled “The Purpose of Education.” In short, it argued that we must not confuse education with knowledge.
King began with the personal. Too often, he wrote, “most college men have a misconception of the purpose of education. Most of the "brethren" think that education should equip them with the proper instruments of exploitation so that they can forever trample over the masses. Still others think that education should furnish them with noble ends rather than means to an end.” In other words, too many think that college is designed to teach either means or ends, offering the secrets that unlock the mysteries of our futures.
King takes aim at both these purposes. Beyond the need for education to make us more efficient, education also has a cultural function. In this sense, King writes, education must inculcate the habit of thinking for oneself, what Hannah Arendt called Selbstdenken, or self-thinking.
“Education,” King writes, “must also train one for quick, resolute and effective thinking.” Quick and resolute thinking requires that one “think incisively” and “think for one's self.” This “is very difficult.” The difficulty comes from the seduction of conformity and the power of prejudice. “We are prone to let our mental life become invaded by legions of half truths, prejudices, and propaganda.” We are all educated into prejudgments. They are human, and it is inhuman to live free from prejudicial opinions and thoughts. On the one hand, education is the way we are led into and brought into a world as it exists, with its prejudices and values. And yet, education must also produce self-thinking persons, people who, once they are educated and enter the world as adults, are capable of judging the world into which they have been born.
For King, one of the “chief aims of education” is to “save man from the morass of propaganda.” “Education must enable one to sift and weigh evidence, to discern the true from the false, the real from the unreal, and the facts from the fiction.”
To think for oneself is not the same as critical thinking. Against the common assumption that college should teach “critical reasoning,” King argues that critical thinking alone is insufficient and even dangerous: “Education which stops with efficiency may prove the greatest menace to society. The most dangerous criminal may be the man gifted with reason, but with no morals.” The example King offers is that of Eugene Talmadge, who had been governor of Georgia. Talmadge “possessed one of the better minds of Georgia, or even America.” He was Phi Beta Kappa. He excelled at critical thinking. And yet, Talmadge believed that King and all black people were inferior beings. For King, we cannot call such men well educated.
The lesson the young Martin Luther King Jr. draws is that intelligence and critical reasoning are not enough to make us educated. What is needed, also, is an educational development of character:
We must remember that intelligence is not enough. Intelligence plus character—that is the goal of true education. The complete education gives one not only power of concentration, but worthy objectives upon which to concentrate. The broad education will, therefore, transmit to one not only the accumulated knowledge of the race but also the accumulated experience of social living.
Present debates about higher education focus on two concerns. The first is cost. The second is assessment. While the cost is high for many people, it is also the case that most students and their families understand that what colleges offer is priceless. But that is only true insofar as colleges understand their purpose, which is not simply to disseminate knowledge or teach critical thinking, but is, rather, to nurture character. How are we to assess such education? The demand for assessment, as well-meaning as it is, drives education to focus on measurable skills and thus moves us away from the purposes of education as King rightly understands them.
The emerging debate about civic education is many things. Too often it is a tired argument over the “core” or the “canon.” And increasingly it is derailed by arguments about service learning or internships. What really is at issue, however, is a long-overdue response to the misguided dominance of the research-university model of education.
Colleges in the United States were, up through the middle of the 20th century, not research-driven institutions. They were above all religiously affiliated institutions and they offered general education in the classics and the liberal arts. Professors taught the classics outside of their specific disciplines. And students wrestled with timeless questions. This has largely changed: today professors are trained to specialize and to think within their disciplinary prejudices. Even distribution requirements fail to make a difference insofar as students forced to take a course outside their discipline learn simply another disciplinary approach. They learn useful knowledge and critical thinking. But what is missing is the kind of general education in the “accumulated experience of social living” that King championed.
I am not suggesting that all specialization is bad or that we should return to religiously affiliated schools. Not in the least. But many of us know that we are failing in our responsibilities to think about what is important and to teach students a curriculum designed to nurture self-thinking and citizenship. We avoid this conversation because it is hard, because people disagree today on whether we should read Plato or Confucius or study Einstein or immunology. Everyone has a discipline to defend, and few faculty are willing or able to think about an education that is designed for students and citizens.
Let’s stop bad-mouthing all colleges. Much good happens there. Yet let’s also recall King’s parting words in his essay:
If we are not careful, our colleges will produce a group of close-minded, unscientific, illogical propagandists, consumed with immoral acts. Be careful, "brethren!" Be careful, teachers!
King’s “The Purpose of Education” is your weekend read.
Walter Russell Mead is getting it right about the utter selfishness of the boomer generation and how it is bankrupting our governments, thus leaving government unable to provide public services for the next generation.
This story is about more than just high gas prices or taxes. It’s yet another case of the boomer generation stealing from younger generations. Besides promising themselves fat pensions that they refused to save money or tax themselves to pay for, the boomers let the country’s infrastructure run down. The next generation is already staggering under a rising tax burden, student loan debt, and retirees’ massive health care bills. On top of all this, they now have to pay through the nose just to keep the roads, bridges, and tunnels in good repair after years of neglect and deferred maintenance.
One of the great documents of American history is the Constitution of the Commonwealth of Massachusetts, written in 1779 by John Adams.
In Section Two of Chapter Five, Adams offers one of the most eloquent testaments to the political virtues of education. He writes:
Wisdom and knowledge, as well as virtue, diffused generally among the body of the people, being necessary for the preservation of their rights and liberties; and as these depend on spreading the opportunities and advantages of education in the various parts of the country, and among the different orders of the people, it shall be the duty of legislatures and magistrates, in all future periods of this commonwealth, to cherish the interests of literature and the sciences, and all seminaries of them; especially the university at Cambridge, public schools, and grammar-schools in the towns; to encourage private societies and public institutions, rewards and immunities, for the promotion of agriculture, arts, sciences, commerce, trades, manufactures, and a natural history of the country; to countenance and inculcate the principles of humanity and general benevolence, public and private charity, industry and frugality, honesty and punctuality in their dealings; sincerity, and good humor, and all social affections and generous sentiments, among the people.
Adams felt deeply the connection between virtue and republican government. Like Montesquieu, whose writings are the foundation on which Adams’ constitutionalism is built, Adams knew that a democratic republic could only survive amidst people of virtue. That is why his Constitution also held that the “happiness of a people and the good order and preservation of civil government essentially depend upon piety, religion, and morality.”
For Adams, piety and morality depend upon religion. The Constitution he wrote thus holds that a democratic government must promote the “public worship of God and the public instructions in piety, religion, and morality.” One of the great questions of our time is whether a democratic community can promote and nourish the virtue necessary for civil government in an irreligious age. Is it possible, in other words, to maintain a citizenry oriented to the common sense and common good of the nation absent the religious bonds and beliefs that have traditionally taught awe and respect for those higher goods beyond the interests of individuals?
Hannah Arendt saw the ferocity of this question with clear eyes. Totalitarianism was, for her, the proof of the political victory of nihilism, the devaluation of the highest values, the proof that we now live in a world in which anything is possible and where human beings no longer could claim to be meaningfully different from ants or bees. Absent the religious grounding for human dignity, and in the wake of the loss of the Kantian faith in the dignity of human reason, what was left, Arendt asked, upon which to build the world of common meaning that would elevate human groups from their bestial impulses to the human pursuit of good and glory?
The question of civic education is paramount today, and especially for those of us charged with educating our youth. We need to ask, as Lee Shulman recently has: “What are the essential elements of moral and civic character for Americans? How can higher education contribute to developing these qualities in sustained and effective ways?” In short, we need to insist that our institutions aim to live up to the task Adams claimed for them: “to countenance and inculcate the principles of humanity and general benevolence, public and private charity, industry and frugality, honesty and punctuality in their dealings; sincerity, and good humor, and all social affections and generous sentiments, among the people.”
Everywhere we look, higher education is being dismissed as overly costly and irrelevant. In many, many cases, this is wrong and irresponsible. There is a reason that applications continue to increase at the best colleges around the country, and it is not simply because these colleges guarantee economic success. What distinguishes the elite educational institutions in the U.S. is not their ability to prepare students for technical careers. On the contrary, a liberal arts tradition offers useless education. But parents and students understand—explicitly or implicitly—that such useless education is powerfully useful. The great discoveries in physics come from useless basic research that then power satellites and computers. New brands emerge from late night reveries over the human psyche. And those who learn to conduct an orchestra or direct a play will years on have little difficulty managing a company. What students learn may be presently useless; but it builds the character and forms the intellect in ways that will have unintended and unimaginable consequences over lives and generations.
The theoretical justifications for the liberal arts are easy to mouth but difficult to put into practice. Especially today, defenses of higher education ignore the fact that colleges are not doing a great job of preparing students for democratic citizenship. Large lectures produce the mechanical digestion of information. Hyper-specialized seminars forget that our charge is to teach a liberal tradition. The fetishizing of research that no one reads exemplifies the rewarding of personal advancement at the expense of a common project. And, above all, the loss of any meaningful sense of a core curriculum reflects the abandonment of our responsibility to instruct students about making judgments about what is important. At faculties around the country, the desire to teach what one wants is seen as “liberal” and progressive, but it means in practice that students are advised that any knowledge is as good as any other knowledge.
To call for collective judgment about what students should learn is not to insist on a return to a Western canon. It is to say that if we as faculties cannot agree on what is important, then we abdicate our responsibility as educators: to lead students into a common world as independent and engaged citizens who can, and will, then act to remake and re-imagine that world.
John Adams was one of Hannah Arendt’s favorite thinkers, and he was so because he understood the deep connection between virtue and republicanism. Few documents are more worth revisiting today than the 1780 Constitution of the Commonwealth of Massachusetts. It is your weekend read.