Hannah Arendt Center for Politics and Humanities
29Mar/13

Are We One of Them?

Arendt Weekend Reading

In an essay in the Wall Street Journal, Frans de Waal—C. H. Candler Professor of Primate Behavior at Emory University—offers a fascinating review of recent scientific studies that upend long-held expectations about the intelligence of animals. De Waal rehearses a catalogue of fantastic studies in which animals do things that scientists have long thought they could not do. Here are a few examples:

Ayumu, a male chimpanzee, excels at memory tasks; just as the IBM computer Watson can beat human champions at Jeopardy, Ayumu can easily best human memory champions at their own game.

Similarly, Kandula, a young elephant bull, was able to reach some fragrant fruit hung out of reach by moving a stool over to the tree, standing on it, and reaching for the fruit with his trunk. I’ll admit this doesn’t seem like much of a feat to me, but for the researchers de Waal talks with, it is surprising proof that elephants can use tools.


Scientists may be surprised that animals can remember things or use tools to accomplish tasks, but anyone raised on children’s tales of Lassie or Black Beauty knows this well, as does anyone whose pet dog has opened a doorknob, fetched a newspaper, or barked at intruders. The problem these studies address is less our societal view of animals than the overly reductive view of animals that de Waal attributes to his fellow scientists. It is hard to take these studies seriously as evidence that animals think in the way that humans do.

Seemingly more interesting are experiments with self-recognition and facial recognition. De Waal describes one Asian elephant who stood in front of a mirror and “repeatedly rubbed a white cross on her forehead.” Apparently the elephant recognized the image in the mirror as herself. In another experiment, chimpanzees were able to recognize which pictures of chimpanzees were from their own species. These studies confirm what anyone who watched my childhood Labrador stare knowingly into the mirror already suspected: animals are able to recognize themselves. This likely means that animals understand that they are selves.

For de Waal, these studies have started to upend a view of humankind's unique place in the universe that dates back at least to ancient Greece. “Science,” he writes, “keeps chipping away at the wall that separates us from the other animals. We have moved from viewing animals as instinct-driven stimulus-response machines to seeing them as sophisticated decision makers.”

The flattening of the distinction between animals and humans is to be celebrated, de Waal argues, and not feared. He writes:

Aristotle's ladder of nature is not just being flattened; it is being transformed into a bush with many branches. This is no insult to human superiority. It is long-overdue recognition that intelligent life is not something for us to seek in the outer reaches of space but is abundant right here on earth, under our noses.

De Waal has long championed the intelligence of animals, and now his vision is gaining momentum. This week, in a long essay called “One of Us” in the new Lapham’s Quarterly on animals, the glorious essayist John Jeremiah Sullivan begins with this description of studies similar to the ones de Waal writes about:

These are stimulating times for anyone interested in questions of animal consciousness. On what seems like a monthly basis, scientific teams announce the results of new experiments, adding to a preponderance of evidence that we’ve been underestimating animal minds, even those of us who have rated them fairly highly. New animal behaviors and capacities are observed in the wild, often involving tool use—or at least object manipulation—the very kinds of activity that led the distinguished zoologist Donald R. Griffin to found the field of cognitive ethology (animal thinking) in 1978: octopuses piling stones in front of their hideyholes, to name one recent example; or dolphins fitting marine sponges to their beaks in order to dig for food on the seabed; or wasps using small stones to smooth the sand around their egg chambers, concealing them from predators. At the same time neurobiologists have been finding that the physical structures in our own brains most commonly held responsible for consciousness are not as rare in the animal kingdom as had been assumed. Indeed they are common. All of this work and discovery appeared to reach a kind of crescendo last summer, when an international group of prominent neuroscientists meeting at the University of Cambridge issued “The Cambridge Declaration on Consciousness in Non-Human Animals,” a document stating that “humans are not unique in possessing the neurological substrates that generate consciousness.” It goes further to conclude that numerous documented animal behaviors must be considered “consistent with experienced feeling states.”

With nuance and subtlety, Sullivan understands that our tradition has not drawn the boundary between human and animal nearly as securely as de Waal portrays it. Throughout human existence, humans and animals have been conjoined in the human imagination. Sullivan writes that the most consistent “motif in the artwork made between four thousand and forty thousand years ago” is the focus on “animal-human hybrids, drawings and carvings and statuettes showing part man or woman and part something else—lion or bird or bear.” In these paintings and sculptures, our ancestors gave form to a basic intuition: “Animals knew things, possessed their forms of wisdom.”


Religious history too is replete with evidence of the human recognition of the dignity of animals. God says in Isaiah that the beasts will honor him, and St. Francis, the namesake of the new Pope, is famous for preaching to birds. What is more, we are told that God cares about the deaths of animals. As Sullivan writes:

In the Gospel According to Matthew we’re told, “Not one of them will fall to the ground apart from your Father.” Think about that. If the bird dies on the branch, and the bird has no immortal soul, and is from that moment only inanimate matter, already basically dust, how can it be “with” God as it’s falling? And not in some abstract all-of-creation sense but in the very way that we are with Him, the explicit point of the verse: the line right before it is “fear not them which kill the body, but are not able to kill the soul.” If sparrows lack souls, if the logos liveth not in them, Jesus isn’t making any sense in Matthew 10:28-29.

What changed and interrupted the ancient and deeply human appreciation of our kinship with besouled animals? Sullivan’s answer is René Descartes. The modern depreciation of animals, Sullivan writes,

proceeds, with the rest of the Enlightenment, from the mind of René Descartes, whose take on animals was vividly (and approvingly) paraphrased by the French philosopher Nicolas Malebranche: they “eat without pleasure, cry without pain, grow without knowing it; they desire nothing, fear nothing, know nothing.” Descartes’ term for them was automata—windup toys, like the Renaissance protorobots he’d seen as a boy in the gardens at Saint-Germain-en-Laye, “hydraulic statues” that moved and made music and even appeared to speak as they sprinkled the plants.

Too easy, however, is the move to say that the modern comprehension of the difference between animal and human proceeds from a mechanistic view of animals. We live in the age of the animal rights movement. Around the world, societies whose mission is to prevent cruelty to animals and to protect them exist and thrive. Yes, factory farms treat chickens and pigs as organic mechanisms for the production of meat, but these farms co-exist with active and quite successful movements calling for humane standards in food production. Whatever the power of Cartesian mechanics, its success is at odds with the persistence of an ancient and religious solidarity, and also a deeply modern sympathy, between human and animal.

A more meaningful account of the modern attitude towards animals might be found in Spinoza. Spinoza, as Sullivan quotes him, recognizes, as Descartes did not, that animals feel. Like animal rights activists, Spinoza admits what is obvious: that animals feel pain, show emotion, and have desires. And yet Spinoza maintains a distinction between human and animal—one grounded not in emotion or feeling, but in human nature. In his Ethics, he writes:

Hence it follows that the emotions of the animals which are called irrational…only differ from man’s emotions to the extent that brute nature differs from human nature. Horse and man are alike carried away by the desire of procreation, but the desire of the former is equine, the desire of the latter is human…Thus, although each individual lives content and rejoices in that nature belonging to him wherein he has his being, yet the life, wherein each is content and rejoices, is nothing else but the idea, or soul, of the said individual…It follows from the foregoing proposition that there is no small difference between the joy which actuates, say, a drunkard, and the joy possessed by a philosopher.

Spinoza argues against the law prohibiting the slaughter of animals—it is “founded rather on vain superstition and womanish pity than on sound reason”—because humans are more powerful than animals. Here is how he defends the slaughter of animals:

The rational quest of what is useful to us further teaches us the necessity of associating ourselves with our fellow men, but not with beasts, or things, whose nature is different from our own; we have the same rights in respect to them as they have in respect to us. Nay, as everyone’s right is defined by his virtue, or power, men have far greater rights over beasts than beasts have over men. Still I do not deny that beasts feel: what I deny is that we may not consult our own advantage and use them as we please, treating them in the way which best suits us; for their nature is not like ours.

Spinoza’s point is quite simple: Of course animals feel and of course they are intelligent. Who could doubt such a thing? But they are not human. That is clear too. While we humans may care for and even love our pets, we recognize the difference between a dog and a human. And we will, in the end, associate more with our fellow humans than with dogs and porpoises. Finally, we humans will use animals when they serve our purposes. And this is OK, because we have the power to do so.

Is Spinoza arguing that might makes right? Surely not in the realm of law amongst fellow humans. But he is insisting that we recognize that, for us humans, there is something about being human that is different and even higher and more important. Spinoza couches his argument in the language of natural right, but what he is saying is that we must recognize that there are important differences between animals and humans.

At a time that values equality over what Friedrich Nietzsche called the “pathos of distance,” the valuation of human beings over animals is ever more in doubt. This comes home clearly in a story told recently by General Stanley McChrystal about a soldier who expressed sympathy for some dogs killed in a raid in Iraq. McChrystal responded, severely: “Seven enemy were killed on that target last night. Seven humans. Are you telling me you’re more concerned about the dog than the people that died?” The car fell silent again. “Hey listen,” he recalls saying, “don’t lose your humanity in this thing.” Many, no doubt, are more concerned about the deaths of animals than about the deaths of humans, or at least equally concerned. There is ever-increasing discomfort with McChrystal’s common-sense affirmation of Spinoza’s claim that human beings simply are of more worth than animals.


The distinctions upon which the moral sense of human uniqueness is based are foundering. For de Waal and Sullivan, the danger today is that we continue to insist on differences between animals and humans—differences that we don’t fully understand. The consequence of their openness to the humanization of animals, however, is undoubtedly the animalization of humans. The danger that we humans lose sight of what distinguishes us from animals is much more significant than the possibility that we underestimate animal intelligence.

I fully agree with de Waal and Sullivan that there is a symphony of intelligence in the world, much of it not human. And yes, we should have proper respect for our ignorance. But all the experiments in the world do little to alter the basic fact: no matter how intelligent, feeling, and even conscious animals may be, humans and animals are different.

What is the quality of that difference? It is difficult to say and may never be fully articulated in propositional form. On one level it is this: Simply to live, as do plants or animals, does not constitute a human life. In other words, human life is not simply about living. Nor is it about doing tasks or even being conscious of ourselves as humans. It is about living meaningfully. There may, of course, be some animals that can create worlds of meaning—worlds that we have not yet discovered. But their worlds are not the worlds to which we humans aspire.

Over two millennia ago, Sophocles, in his “Ode to Man,” named man Deinon, a Greek word that connotes both greatness and horror, that which is so extraordinary as to be at once terrifying and beautiful. Man, Sophocles tells us, can travel over water and tame animals, using them to plough fields. He can invent speech, and institute governments that bring humans together to form lasting institutions. As an inventor and maker of his world, this wonder that is man terrifyingly carries the seeds of his destruction. As he invents and comes to control his world, he threatens to extinguish the mystery of his existence, that part of man that man himself does not control. As the chorus sings: “Always overcoming all ways, man loses his way and comes to nothing.” If man so tames the earth as to free himself from uncertainty, what then is left of human being?

Sophocles knew that man could be a terror, but he also glorified the wonder that man is. He knew that what separates us humans from animals is our capacity to alter the earth and our natural environment. “The human artifice of the world,” Arendt writes, “separates human existence from all mere animal environment…” Not only by building houses and erecting dams—animals can do those things and more—but also by telling stories and building political communities that give to man a humanly created world in which he lives. If all we did as humans was live or build things on earth, we would not be human.

To be human means that we can destroy all living matter on the earth. We can even today destroy the earth itself. Whether we do so or not, to live on earth today is now a choice that we make, not a matter of fate or chance. Our earth, although we did not create it, is now something we humans can decide to sustain or destroy. In this sense, it is a human creation. No other animal has such a potential or such a responsibility.

There is a deep desire today to flee from that awesome and increasingly unbearable human responsibility. We flee, therefore, our humanity and take solace in the view that we are just one amongst the many animals in the world. We see this reductionism above all in human rights discourse. One core demand of human rights—that men and women have a right to live and not be killed—brought about a shift in the idea of humanity from logos to life. The rise of a politics of life—the political demand that governments limit freedoms and regulate populations in order to protect and facilitate their citizens’ ability to live in comfort—has pushed the animality, the “life,” of human beings to the center of political and ethical activity. In embracing a politics of life over a politics of the meaningful life, human rights rejects the distinctive dignity of human rationality and works to reduce humanity to its animality.

Hannah Arendt saw human rights as dangerous precisely because they risked confusing the meaning of human worldliness with the existence of mere animal life. For Arendt, human beings are the beings who build and live in a political world, by which she means the stories, institutions, and achievements that mark the glory and agony of humanity. To be human, she insists, is more than simply living, laboring, working, acting, and thinking. It is to do all of these activities in such a way as to create, together, a common life amongst a plurality of persons.

I fear that the interest in animal consciousness today is less a result of scientific proof that animals are human than of an increasing discomfort with the world we humans have built. A first step in responding to such discomfort, however, is a reaffirmation of our humanity and our human responsibility. There is no better way to begin that process than by engaging with a very human response to the question of our animality. Towards that end, I commend to you “One of Us,” by John Jeremiah Sullivan.

-RB

20Nov/12

Arendt & Antigone

In a short entry in her Denktagebuch from 1956, Arendt offers a gnomic reflection on Antigone:

Ad Orff, Antigone: Als sei alles darauf angelegt, uns zum Ertönen zu bringen. Wir aber verschliessen uns, verstummen und klagen nicht. Antigone- die klagende, tönende menschliche Stimme, in der alles offenbar wird.

Ad Orff, Antigone: As if everything were set out to bring us to sound. But we close ourselves off, fall silent, and do not lament. Antigone – the lamenting, sounding human voice, in which all becomes revealed. (Notebook XXII, February 1956, Denktagebuch)

The entry first caught my attention because while Arendt often refers to literature (favorite authors include Kafka and Rilke), she rarely refers to specific musical pieces in her published work. Here she reacts to the opera Antigonae by Carl Orff.

Orff had composed for the Nazis, who received his Carmina Burana with incredible adulation, and he underwent denazification after the war. Antigonae of 1949 is a minimalist work, first in the everyday sense that it sets Hölderlin's translation of the drama to song with little instrumental accompaniment. In this regard it highlights the translation's inherent musicality on the level of both form (rhythms and rhymes in the text) and content (we see how at a number of moments the drama turns on references to singing, crying, tone, and lament). Orff's opera can also be described as minimalist in the more precise sense that when the orchestra does emerge, it often plays looping interludes that remind one of the repetitive avant-garde phrasings that Steve Reich would popularize in the 1960s.

Arendt often turns to art as a free space in which to voice philosophical and political questions in the modern age. Readers compelled by her approach might be inspired by the entry on Orff to look for other passages addressing music that would complement her better-known aesthetic analyses.

At a local level, the entry also raises a question: how would Arendt read Sophocles's Antigone? Patchen Markell offers one suggestion when he links Sophocles and Arendt in a “countertradition of thought about recognition” in his book Bound by Recognition. Markell casts a skeptical eye on the equation of identity and justice and offers an alternative mapping which is open to asymmetry and values finitude. In doing so he suggests a possible approach to this entry that notices the uncanny relation of the “we” and Antigone through the instrument of the voice.

The first line of the entry starts with the “we” – presumably the spectators of the opera and perhaps humanity more broadly – and centers on the German term “Ertönen,” which could be translated as “to ring out,” “to sound,” “resound,” or “chime.” It indicates expression, and even a move to freedom. In the next sentence, though, this potential for liberation evaporates and “we” fall silent. We ultimately fail at the possibility, even the apparent necessity, of “klagen,” a term which contains the powerful double meaning of 1) “moan,” “lament,” “wail,” and 2) “litigate,” “file a suit,” “go to law.” Unlike us, Antigone's voice does ring out, she does lament, and in her lament she takes on the law.

Arendt describes Antigone's voice as the “human voice,” but her description leads us toward the questioning of the essence of the human in the first stasimon (often referred to as the “ode to man”). In his article in The Fortnightly Review, Roger Berkowitz connects the deinon (wondrous/terrible) in this ode to Arendt's concern over the “danger that we might so fully create and make our artificial world that we endanger that quality of human life which is subject to fate, nature, and chance.”

In terms of the question of recognition, Arendt's note on Orff draws our attention to those sections of the drama where Antigone pushes against the inhuman, such as when the guard describes her shriek at the sight of her brother's unburied body as “a distressing painful cry, just like a bird/ who’s seen an empty nest, its fledglings gone.” Later, she sings a long lament to her tomb and dead family, as if those who remain alive are nothing to her. The minimalist loops of Orff's music might indicate something of the energy that insists on living when one has nothing to live for or is even condemned to death. These sections are strikingly different from the over-the-top triumphalism of Carmina Burana, which haunts popular culture in movies and commercials to this day. They suggest persistence rather than victory, or perhaps even a paradoxical continuation in an explicit condition of defeat.

Antigone is the voice, Arendt tells us. We seem to recognize it as our own, even if the total meaning of the “all” that would be the content of our realization remains out of reach.

Give a listen to a recording of Orff's Antigonae over the Thanksgiving holiday.

-Jeff Champlin

10Sep/12

Vain, Like a Butterfly

“Everything that is appears; everything that appears disappears; everything that is alive has an urge to appear; this urge is called vanity; since there is no urge to disappear and disappearance is the law of appearance, the urge, called vanity, is in vain. ‘Vanitas vanitatum vanitas’—all is vanity, all is in vain.”

-Hannah Arendt, Denktagebuch, 796

Arendt writes this entry in her Denktagebuch in September 1970. She is 63 years old and long familiar with the law of disappearance. For years the record of her thoughts has been interrupted by mention of the death of friends and mentors: May 1951: “[Hermann] Broch died on 30 May and was buried on 2 June 1951”; February 1969: “Jaspers dies”; November 1968: “Tonight I dreamed of Kurt Blumenfeld… in the dream I didn’t know that he was dead.” The month after this entry, the law would bear down again and she would write an entry beginning: “On 31 October Heinrich died…”. Within a little over five years of her husband’s death she would herself be gone.

Harmen Steenwijck, “Vanitas”

“Vanitas vanitatum vanitas.” This could be despair. It could be that dreadful thought that forces itself on us in moments of grief and anxiety, the thought that a life’s endeavor has been for naught, that all our achievements have turned out to be worthless. It could be the distress at the Nietzschean reflection that not only must we each die, but this human race and this earth will eventually disappear without trace. Perhaps it is the same as the horror Sophocles savors when he warns us: “Not to be born is, past all prizing, best; but, when man has seen the light, this is next best by far, that with all speed he should go thither, whence he hath come.”

It could also be frustration at the sheer urgency of the desire to rush into full view when thinking is always conducted in darkness and quiet, at a remove from the world. It might be a distaste, for instance, for glib self-promotion that stands in for political action on the part of candidates for public office, or for everything about the modern university that insists that “research” be published prematurely, rendering it hypocritical, superficial and irrelevant (Denktagebuch, 703).

Yet, though her frustration is real, and though she grieves, Arendt uses the word vanity without judgment. A few weeks ago Ian Storey introduced a “Quote of the Week” that came from the same late period of the Denktagebuch, and wrote movingly of the sense of an end that suffuses these last entries. (It’s beautiful and touching and well worth your while.) He also writes of the shades of Arendt’s response to our endedness, from bitter sadness to old contentment. In the same way, she reacts to the vanity of our beginnings both with an austere refusal of even the fantasy of immortality and with wonder that any of it came to be at all.

After all, no one asks to be born. No one demands to come into the world as if birth were a special favor, a privilege granted to some but not to others. We’re propelled into the light of day before we know it, by an urge that has nothing to do with ego and does not belong to us any more than it belongs to our parents or our species. We share it with everything alive. However, if we think of it as a great surging drive towards life or survival, it threatens to diminish thinking and overwhelm the senses as a great unfathomable force; if we think of it as a drive to appear, it produces instead the refinement of difference and the delight of variegation.

In these same years Arendt reads about biology and studies up on the science of genetics. She reads the work of the philosophical zoologist Adolph Portmann, whose most remarkable studies concern the vast variety in the size, shape, and color of butterflies (The Beauty of Butterflies, 1951). Instead of submitting the phenomenon of this variety—and butterflies make up just one terrifically flamboyant example—to the demands of natural and sexual selection as in the mainstream of evolutionary theory, Portmann identifies an Aristotelian desire to appear. Arendt adds to this an existential claim for recognition and even praise. “All that appears wants to be seen and recognized and praised. The highest form of recognition is love: volo ut sis.—The wonder implies affirmation” (Denktagebuch, 701). The moment our surprise at the color of a butterfly turns into wonder that it should have somehow come to be and come to be precisely this color, we affirm its existence. We could never have called up in imagination all the colors of butterflies’ wings, and no one could have planned the immense series of mutations and other tiny contingencies that brought them all into existence but, exposed to a small section of their uncalled-for variety, astonished by it, wondering at it, affirming it, we will that it be. This is what it means to love the world.

This love comes as a sort of gratitude, even if we’re not sure whom we should be grateful to. Believers thank the creator god. Arendt may not believe—at least not like that—but she reaches for the word blasphemy and so also for a sense of something sacred that needs protection from profanity. In October 1969 she writes: “The desire for earthly immortality is blasphemous, not because it wants to overcome death, but because it negates birth” (744). The problem is not that we want to play God by refusing to die, but that we balk at making way for a new, different world. From her reading in genetics she knows the role of genetic mutation in the generation of natural variety and the many millions of mistakes that had to happen to produce the living world we see. She has noted Portmann’s bon mot: “One of the surest methods for the regular occurrence of new [genetic] combinations is that peculiar game that biologists call sexuality.” What is sacred, then, is the fact of all those butterfly wings, all the fish scales, animal ears, nose shapes, eye colors, skin tones, smiles that could easily have happened in some other way but that appear to us now, just as they are, the needlessly glamorous and constantly renewed results of contingency.

All vanity, yes, and all in vain, certainly. But praise be.

-Anne O’Byrne

17May/12

Is College Worth It?

Student debt is suddenly spurring the once unthinkable debate: Is college necessary? Of course the answer is no. But who needs it and who should pay for it are complicated questions.

Arendt herself had an ambivalent relationship to academic culture. She never held a tenure-track job in the academy and she remained suspicious of intellectuals and academics. She never forgot how easily professors in Germany embraced the rationality of the Nazi program or the conformity with which Marxist and leftist intellectuals excused Stalinism. In the U.S., Arendt was disappointed with the "cliques and factions" as well as the overwhelming "gentility" of academics, which dulled their insights. It was for that reason that she generally shunned the company of academics, with, of course, notable exceptions. A free thinker—she valued thinking for oneself above all—she was part of and apart from the university world.

We plan to keep the discussion about college and debt going on the Arendt Center blog. Here are a few thoughts to get the debate going.

First, college is not magic. It will neither make you smart nor make you rich. Some of our best writers and thinkers somehow avoided writing five-page papers on the meaning of Sophocles. (That of course does not mean that they didn't read Sophocles, even in the ancient Greek.) And many of the most successful Americans never attended or graduated from college. On the other hand, many college grads and Ph.D.s are surviving on food stamps today. Some who attend the University of Phoenix will benefit greatly from it. Many who attend Harvard squander their money and time. Especially today, college is as much a safe path for risk-averse youth as it is a haven for the life of the mind or a tasseled path to the upper classes.

Second, college can be a transformative experience. As I prepare to say goodbye to another cohort of graduates at Bard, I am reminded again how amazing these students are and how much I learn from them every year. I wrote recently about one student who wrote a simply stunning meditation on education. Today I will be meeting with two students about their senior projects. One is a profound, often personal, and yet also deeply mature exploration of loneliness in David Foster Wallace, Hannah Arendt, and Martin Heidegger. The other is a genealogy of whistleblowing from T.E. Lawrence to Bradley Manning, arguing that the rise of whistleblowing in the 20th century is both a symptom of and a contributor to the loss of facts in public life. Both are testaments to the fact that college can inspire young adults to wrestle meaningfully and intelligently with the world they must confront.

Third, most students do not attend college because they want to. Of course some do, and I have enormous respect for those who embrace the life of the mind that college can nurture. I also respect those who decide that college is not for them. But the simple fact is that too many college students are here thoughtlessly, going through the motions because they are on a track. College has become a stepping stone to a good job, which is a stand-in for a good life. Nothing wrong with that, but is it really worth hundreds of thousands of dollars and four years of your time simply to get a credential? College students are young and full of energy. Too often they spend four of their most energetic years studying things they don't care about while they sleep late, drink a lot, and generally have a good time. This cannot be the best use of most young people's time.

Fourth, it is not at all clear that college is a good investment. There is no end of students who tell me that taking out debt for an education is always a good investment. This is usually around the time they want to apply to law school or graduate school. And I can only repeat to them so many times that they are simply wrong. Finally, the press is catching up to this fact, and we are treated to a daily drumbeat of stories about the dangers of student debt. College debt in the U.S. now exceeds $1 trillion, more than credit card debt (although far smaller than mortgage debt). The problem is widespread: 94 percent of those who earn a bachelor’s degree take on debt to pay for higher education, up from 45 percent in 1993. And the problem is deep: the average debt in 2011 was $23,300. For 10 percent of college graduates the debt is crippling, as they owe more than $54,000. Three percent owe more than $100,000.

The most egregious debt traps are still the for-profit colleges, which serve the working classes who cannot afford more expensive non-profit colleges. These schools prey on the perception, partly true, that career advancement requires a college degree. But now even public universities and private elite colleges are increasingly graduating students with high debt loads. And then there are law schools and culinary schools, which increasingly graduate indebted and trained professionals into a world that does not need them.

The result is as sad as it is predictable. Nearly 1 in 9 young graduate borrowers who started repayment in 2009 defaulted within two years, about double the rate in 2005. The numbers vary: 15 percent of recent graduates from for-profit schools are in default, compared with 7.2 percent of public university graduates and 4.6 percent of private university graduates. Each of these groups requires a separate analysis and discussion. And yet overall, we are burdening far too many young people with debts that will plague them their entire lives.

Fifth, to defend college education as a good investment is not simply questionable economically. It is also to devalue the idea of education for its own sake and to insist that college is an economic rather than an intellectual experience. One unintended consequence of the expansion of college to a wider audience of strivers is that a college education has become decidedly an economic and bourgeois experience, and less and less an intellectual adventure. Was college ever Arcadia? Surely not. For much of American history college has been a benefit reserved for the upper classes. And yet to turn education into a commodity, to make it part of the life process of making a living, further delimits the available spaces for the life of the mind in our society.

Sixth, college is not necessary to make us either moral or enlightened citizens. College education does not make us better people. There are plenty of amazing people in the world who have not studied Aristotle or learned genetics in college. The United States was built on the tradition of the yeoman farmer, that partly mythical but also real person who worked long days, saved, and treated people honorably.

Morality, as Hannah Arendt never tired of pointing out, is not gained by education. Or as Kant once pointed out to a certain Professor Sulzer in a footnote to his Groundwork of the Metaphysics of Morals, morality can only be taught by example, not through study. Arendt agreed. She saw that many of those who acted most honorably during WWII were not the intellectuals, but common people who simply understood that killing neighbors or shooting Jews was inhuman. What is more, it was often the intellectuals who provided themselves and others with the complex and quasi-scientific rationalizations for genocide. To think rationally, and even (to use a current buzzword) to think critically, is no barrier to doing evil. Critical thinking—the art of making distinctions—is no guarantee of goodness.

Seventh, college cannot and should not replace a failed primary and high school system. Our primary schools are a disgrace, and we then spend a fortune on remedial education in community colleges and even in four-year colleges, trying to educate people who have been failed by their public schools. We would do much better to take a large part of the billions and billions of public dollars we spend on higher education and put them towards a radical restoration of our public grammar and high schools. If we actually taught people in grammar schools and pushed them to excel in high schools, they would graduate prepared to hold meaningful jobs and also to be thoughtful citizens. Maybe then a college education could be both less necessary and more valuable.

Bard College, which houses the Hannah Arendt Center, has been engaged for years in creating public high schools that are also early colleges. The premise is that high school students are ready for college-level work, and there is nothing to prevent them from doing it. These Bard High School Early Colleges are public high schools staffed by professors with Ph.D.s who teach the same courses we teach at Bard College. In four years, students must complete an entire four-year high school curriculum and a two-year college curriculum. They then receive a Bard associate's degree at graduation, in addition to their high school diploma. This associate's degree—which is free—can either reduce the cost of a four-year college degree or replace it altogether.

Early colleges are not the single answer to our crisis of education. But they do point in one direction. Money spent on really reforming high schools and even primary schools will do much more to educate a broad, racially diverse, and economically underprivileged cohort of young people than any effort to reform or subsidize colleges and universities. The primary beneficiaries of directing public money to colleges rather than high schools are professors and administrators. I benefit from such subsidies and appreciate them. But that does not mean I think them right or sensible.

We would be much better off if we redirected our resources and attention to primary and secondary education, which are failing miserably, and stopped obsessing so much about college. Most college graduates, wherever they go, will learn something from their four or more years of classes. But the mantra that one only becomes a full human being by going to college is not only false. It is also dangerous.

-RB

11Aug/10

From the comments section

In response to my essay on simulation, Ben Stevens writes that simulations are fictions that have been around a long time:

So, too, is Sophocles' Antigone. Are these fictions not simulations? For my money, then, what remains to be seen is whether increased pervasion of simulation is qualitatively different from traditional or non-technoscientific modes of mediation including products of verbal art like drama, poetry, and 20th-century philosophy. Are these last of such a different quality or order, of such a factual humanity, as still to make technoscientific modes of mediation seem, by contrast, the more (dis)simulative?

In the comments, I responded: Is a book (a technology) the same as a story (also a technology)? Is a film the same as a book? Is Facebook the same as a movie?

My point is that Turkle argues that simulation wants something different from stories or books or movies. Those are media to entertain. Simulation wants a total immersion that becomes a proxy for the real. Contextualizing is important, but you have to take seriously the claims of the new technology. It may turn out that the claims are inflated and all will revert to a mere tool for human entertainment. But that is not necessarily true. Sometimes there are new things in the world.

Professor Thomas asks:

I truly don’t understand the question you are posing, and I hope you will clarify it. “Simulation” isn’t the type of thing that can “want,” right? So are you asking what the many developers and users of simulation want? Or are you asking toward what ends the possibility of simulation drives its users?

The question "what does simulation want?" is, as you say, a question of what does simulation--insofar as we use it--reveal about our wants and drives. Your formulation, to "what ends the possibility of simulation drives its users" is perfectly fine in my view, although I would replace "possibility" with "activity." Insofar as we develop and use simulations, what does that reveal about our wants? And in what ways will simulation transform our wants and desires--thus, what does Simulation want?

This is the question Sherry Turkle asks, and her answer is: simulation wants immersion in a virtual world that is so profound that it replaces the real. Or blurs with the real. Or is a proxy for the real. These aren't the same, and the differences need to be fleshed out.

Ben says: haven't we always been living in fictions, and thus simulations? I agree. All common life together depends on fictions of unity and common ideas and customs that form our sense of identity and comprise our world. Plato understood that politics is about the unification of a multitude, and this unity is always based in a fiction (see Nietzsche too, and Arendt). The question we are debating, as I understand it, is a version of "is this time different?" Always a difficult question in medias res. I don't know the answer. But I do think that simulations, as I am coming to understand them, pose the possibility of a radical fictionalizing of the world in ways that will further attenuate our belief in a shared, commonly accessible world. If different people "see" and "feel" the world differently because of neural enhancements and ocular implants and artificial skin grafts, then the very idea of a common world of sense perception falls away and a new idea of reality--one suffused with simulation--takes its place. This is fundamentally different from the fantasy of a book or a movie. Even a religion, which offers a complete worldview, can be confronted with reality, as Galileo confronted it. But in a world of simulation, that reality threatens to disappear.

I say all of this not entirely sure of how it works. But the confidence with which such researchers now embrace simulation is a shock to my system.

rb

9Aug/10

The Wonders of Man in an Age of Simulations

Here is my latest essay, “The Wonders of Man in an Age of Simulations,” which just appeared in The Fortnightly Review.

It is a review of books by Ray Kurzweil, Jaron Lanier, and Sherry Turkle and sets up the question of Human Being in an Inhuman Age, the topic of the Arendt Center's upcoming conference.

Read the interesting history of The Fortnightly Review (founded by Anthony Trollope, Frederic Chapman, and George Henry Lewes, with Lewes as its first editor).

A foretaste:

In “The Ode to Man” from Antigone, Sophocles conjures “Man” as the wondrous being who wears out the “imperishable earth” with his ploughs. This man “overpowers the rough-maned horses with his devices” and tames the “unbending mountain bull.” He flees the “stormy darts” of winter’s frost and he escapes “needful illness.” Such a man who tames nature is a wonder, according to the Ode’s opening lines:

Manifold the wonders
And nothing towers more wondrous than man.

The Greek word for “wonder” is Deinon, which connotes both greatness and horror, that which is so extraordinary as to be at once terrifying and beautiful. This is how Sophocles understands man. As an inventor and maker of his world, man can remake and master the earth. This wonder terrifyingly carries the seeds of his destruction. Man, Sophocles imagines, threatens to so fully control his own way of life that he might no longer be man. As the chorus sings: “Always overcoming all ways, man loses his way and comes to nothing.” If man so tames the earth as to free himself from uncertainty, what then is left of human being?

A new urgency has energized those who welcome and those who fear the power of man to transform his nature. While hopes of technological utopias and fears of technological dystopias may be part and parcel of the human condition itself, we are living through a moment when extraordinary technological advances are once again raising the question of what it means to be human. The problem that confronts man in the 20th and now 21st centuries, as Hannah Arendt writes, is that we face the danger that we might so fully create and make our artificial world that we endanger that quality of human life which is subject to fate, nature, and chance. To bring oneself up to date on this current version of the debate over our human, superhuman, and inhuman futures, three recent books serve as excellent guides.

Read the whole essay.

RB