“Reading is equivalent to thinking with someone else’s head instead of with one’s own.”
— Arthur Schopenhauer
“[Augustine] distinguishes between the questions of "Who am I?" and "What am I?" the first being directed by man at himself […] For in the "great mystery," the grande profundum, which man is (iv. 14), there is "something of man [aliquid hominis] which the spirit of man which is in him itself knoweth not. But Thou, Lord, who hast made him [fecisti eum] knowest everything of him [eius omnia]" (x. 5).”
— Hannah Arendt, The Human Condition
In The Human Condition Arendt raises major concerns about the place of man, but she does not intend to respond to the loss of the earth as a unique human condition with a restoration of solid ground. To the question “What am I?” the only answer is: “You are a man—whatever that may be.” In lieu of an answer that would give man a new foundation, Arendt offers a description of man’s ever-changing territory.
Following Augustine, Arendt claims that only God could have the distance to answer the question of "who" man is with anything resembling a concrete statement of human nature. She respects the unknown “spirit of man,” even beyond the knowledge provided by religion.
When philosophy attempts to answer this question, it ends up creating its own image of a higher power, which remains linked through projection to man. Importantly though, philosophy should still ask the question.
Some context can help to open Arendt’s question here for readers in English-speaking countries, where philosophical anthropology never gained the same traction as in Germany. Her challenge picks up on the heated debates of the 1920s and ’30s, debates that culminated in the work of Husserl and Heidegger, over how to take the collapse of universal values seriously without falling back into simple subjectivism.
In the space of four pages of Being and Time (46-49), Martin Heidegger specifies his criticism with reference to Dilthey, Bergson, Scheler, and Husserl, as well as views from ancient Greek philosophy and Genesis. Heidegger says he has focused his analytic of Dasein on the question of Being and that it cannot therefore provide the fully ontological basis of Dasein needed for “philosophical” anthropology, but states that part of his goal is to “make such an anthropology possible.” Later though, in section 10, Heidegger provides a further explanation of his criticism of anthropology: in “the attempt to determine the essence of ‘man,’ as an entity, the question of Being has been forgotten.”
In its turn to experience and consciousness, philosophical anthropology forgets to ask the question of the ontological definition of perceptual experience (cogitationes). Heidegger thus suggests that his investigation might provide the basis for an anthropology but does not claim to actually deliver this basis. He opens the question of the definition of man, but does so to orient man (recast as Dasein) toward his relation to Being. In a parallel manner, we can understand Arendt’s reading of Augustine as opening the question of the relation between the “who” and “what” man is, but not closing it. Her work here is provocative because it cannot be said to be in the service of a simple secularization that swaps a higher power for human measure. Nor does she wish to save or restore divine guarantee. Perhaps Augustine allows her to pose questions of philosophical anthropology similar to those raised by Heidegger, while winning some distance from her teacher, so that she can open a new space of freedom of action rather than freedom of thought.
The word designating military drones comes from the word for bee, a consistency that holds across many languages. Partly because of this, it is a common misperception that drones take their name from the buzzing sound unmanned aircraft make as they fill the air. More accurately, however, drones trace their etymological lineage to the male honey-bee, which is called a drone. The male drone-bee is distinguished from the female worker-bees. It does no useful work and has a single function: to impregnate the queen-bee. What unites military drones with their apiary namesakes is not sound, but thoughtless purposefulness.
The beauty of the drone-bee—like the dark beauty of the military drone—is its single-minded purpose. It is a miracle of efficiency, designed to do one thing. The drone-bee is not distracted by the perfume of flowers or the contentment of labor. It is born, lives, and dies with only one task in mind. Similarly, the military drone suffers neither from hunger nor from distraction. It does what it is told. If necessary, it will sacrifice itself for its mission. It is a model of thoughtless efficiency.
A few weeks ago I wrote about Ernst Jünger’s novel The Glass Bees, in which a brilliant inventor produces tiny flying glass bees that offer limitless potential for surveillance and war. Today I turn to Jake Kosek’s recent paper “Ecologies of Empire: On The New Uses of the Honeybee.” Kosek does not cite Jünger’s novel, and yet his article is in many ways its non-fiction sequel. What Kosek sees is that the rise of drones in military strategy is tied deeply to their ability to mimic the activity and demeanor of male honey-bees. It is because bees can fly, swarm, change direction, alter their course, and yet achieve their single purpose absent any intentionality or thinking that bees are so useful in modern warfare.
Bees have long been associated with military endeavors, both metaphorically and literally. Kosek tells us that our word bomb comes from the Greek bombos, which means bee. The first bombs were, it seems, beehives dropped or catapulted into the heart of the enemy camp. Bees are today trained to sniff out toxic chemicals; and beeswax was for generations an essential ingredient in munitions.
In the war on terror, bees have taken on a special significance. The “enemy’s lack of coherence—institutionally, ideologically, and territorially—makes the search for the enemy central to the politics of the war on terror.” War in the war on terror is ever less a contest of armies on the battlefield and is increasingly a war of knowledge. This means that surveillance—for centuries an important complement to battlefield tactics—comes to occupy the core of the modern war on terror. In this regard, drones are essential, as drones can hover in the air unseen for days, gathering essential intelligence on persons, groups, or even whole cities. All the more powerful would be miniature drones that fly through the air unseen and at ground level. That is why Kosek writes that “Intelligence gathering [is] not just limited to psychologists, sociologists, lawyers, and military planners, but [has come] to include biologists, anthropologists, epidemiologists, and even entomologists.” What the military use of bees promises is access to information and worlds not previously open to human knowledge. Bees, Kosek writes, are increasingly the model for the modern military.
The advantage of bees is not simply their thoughtlessness, but is found also in their ability to operate as part of a swarm. Current drone technology requires that each drone be controlled by a single pilot. What happens when hundreds of drones must share the airspace around a target? How can drones coordinate their activity? Kosek quotes a private contractor, John Sauter, who says:
“A central aspect of the future of warfare technology is to get networks of machines to operate as self-synchronized war fighting units that can act as complex adaptive systems. . . We want these machines to be fighting units that can operate as reconfigurable swarms that are less mechanical and more organic, less engineered and more grown.”
The point is that drones, be they large or small, must increasingly work in conjunction with each other at a speed and level of nuance that is impossible for human controllers to manage. The result is that we must model the drones of the future on bees.
The scientists working with the Pentagon to create drones that can fly and function like bees are not entomologists, but mathematicians. The DNA of the glass or silicon bees of the future will be complex algorithms inspired by but actually surpassing the ability of swarms “to coordinate and collect small bits of information that can be synchronized to make collective action by drones possible.” Once this is possible, one controller will be able to manage a single drone “and the others adapt, react, and coordinate with that drone.”
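The coordination scheme described above, in which a controller steers one lead drone and the rest of the swarm adapts to it, can be sketched in a few lines. This is a toy illustration of the leader-follower idea only, not any real military system; the function name, the `follow_rate` parameter, and the update rule are all invented assumptions.

```python
# Toy sketch of leader-follower swarm coordination: one controller
# redirects a single lead drone, and each follower adapts locally
# by moving a fraction of the way toward the leader each step.
# Everything here is an illustrative assumption, not a real API.

def step_swarm(leader, followers, follow_rate=0.5):
    """Return new follower positions, each pulled toward the leader.

    Positions are (x, y) tuples; follow_rate controls how quickly
    the swarm re-forms around the leader's new position.
    """
    new_followers = []
    for (x, y) in followers:
        nx = x + follow_rate * (leader[0] - x)
        ny = y + follow_rate * (leader[1] - y)
        new_followers.append((nx, ny))
    return new_followers

# The controller moves only the leader; the swarm reconverges on it.
leader = (10.0, 0.0)
followers = [(0.0, 0.0), (2.0, 4.0), (-3.0, 1.0)]
for _ in range(20):
    followers = step_swarm(leader, followers)
```

After a few iterations the followers cluster around the leader's position, which is the point of the contractor's remark: the controller manages one drone, and the rest "adapt, react, and coordinate with that drone."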
Kosek’s article is provocative and fascinating. His ruminations on empire strike me as overdone, but his insights are important: our training and use of bees has transformed the bee, and bees are serving as models and inspiration for new ways of fighting wars and solving problems. So too is his imagination of the bee as the six-legged soldier of the future. Whether the drones of the future are cyborg bees (as some in Kosek’s article suggest) or mechanical bees as Jünger imagined half a century ago, thinking about the impact of drones on warfare and human life is enriched by this meditation on the male honeybee. For your weekend read, I offer you Jake Kosek’s “Ecologies of Empire: On The New Uses of the Honeybee.”
“The most common way people give up their power is by thinking they don’t have any.”
— Alice Walker
I was at dinner with a colleague this week—midterm week. Predictably, talk turned to the scourge of all professors: grading essays. There are few tasks in the life of a college professor less fulfilling than grading student essays. Every once in a while a really good essay jolts me to consciousness. I am elated by such encounters. To be honest, however, reading essays is for the most part stultifying. This is not the fault of the students, many of whom are brilliant and exuberant writers. I find it trying to wade through 25 essays discussing the same book, offering varying opinions and theories, while keeping my attention and interest. How many different ways can one ask for a thesis, talk about the importance of transition sentences, and correct grammar? For some time it is fun, in a way. One learns new things and is captivated by comparing how bright young minds see things. But after years, grading essays becomes just part of the worst part of a great job.
So how might my colleagues and I react to news that EdX—the influential Harvard-MIT led consortium offering online courses—has developed software that will grade college student essays? I imagine it is sort of like how people felt when the dishwasher was invented. You mean we can cook and feast and don’t have to scrub pots and wash dishes? It promises to allow us to focus on teaching well without having to do that part of our job that we truly dread.
The appeal of computer grading is obvious and broad. Not only will many professors and teachers be freed from unwanted tedium, but it may also help our students. One advantage of computer grading is that it is nearly instantaneous. Students can hand in their work and get a grade and feedback seconds later. Too often essays are handed back days or even weeks after they are submitted. By then the students have lost interest in their paper and forgotten the inspiration that breathed life into their writing. To receive immediate feedback will allow students to see what they did wrong and how they could improve while the generative impulse underlying the paper is still fresh. Computer grading might encourage students to turn in numerous drafts of a paper; it may very well help teach students to write better, something that professorial comments delivered after a week rarely accomplish.
Another putative advantage of computer grading is its objectivity and consistency. Every professor knows that it matters when we read essays and in what order. Some essays find us awake and attentive. Others meet my eyes as they struggle to remain open. As much as I try to ignore the names on the top of the page, I can’t deny that my reading and grading is personalized to the students. I teach at a small liberal arts college where I know the students. If I read a particularly difficult sentence by a student I have come to trust, I often make a second effort. My personal attention has advantages but it is of course discriminatory. The computer will not do that, which may be seen by some as more fair. What is more, the computer doesn’t get tired or need caffeine.
Perhaps the most important advantage for administrators considering these programs is the cost savings. If computers relieve professors from the burden of grading, that means professors can teach more. It may also mean that fewer TA’s are necessary in large lecture courses, thus saving money for strapped universities. There may even be a further side benefit to these programs. If universities need fewer TA’s to grade papers, they may admit fewer graduate students to their programs, thus going some way towards alleviating the extraordinary and irresponsible over-production of young professors that is swelling the ranks of unemployable Ph.D.s.
There are, of course, real worries about computer grading of essays. My concern is not that the computers will make mistakes (so do I); or that we lack studies that show that computers can grade as well as human professors—for I doubt professors are on the whole excellent graders. The real issue is elsewhere.
According to the group “Professionals Against Machine Scoring of Student Essays in High-Stakes Assessment,” the problem with computer grading of essays is simple: Machines cannot read. Here is what the group says in a statement:
Let’s face the realities of automatic essay scoring. Computers cannot ‘read.’ They cannot measure the essentials of effective written communication: accuracy, reasoning, adequacy of evidence, good sense, ethical stance, convincing argument, meaningful organization, clarity, and veracity, among others.
What needs to be taken seriously is not that computers can’t grade as well as humans. In many ways they grade better. More consistently. More honestly. With less grade inflation. And more quickly. But computer grading will be different from human grading. It will be less nuanced, aspiring instead to clearly defined criteria. Are sentences grammatical? Is there a clear statement of the thesis? Are there examples given? Is there a transition between sentences? All of these are important parts of good writing, and the computer can be trained to look for these characteristics in an essay. What this means, however, is that computers will demand the kind of clear, precise, and logical writing that computers can understand and that many professors and administrators demand from students. What this also means, however, is that writing will become more mechanical.
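To make concrete how shallow such criteria-checking can be, here is a deliberately crude sketch of the kind of surface checks just described: thesis markers, transition words, examples. This is an illustration of the idea only, not EdX’s actual software; every word list and rule below is an invented assumption.

```python
# Toy sketch of criteria-based essay checking (NOT EdX's software).
# It flags surface features a computer can detect: a thesis marker,
# transition words, and explicit examples. All rules are assumptions.

TRANSITIONS = {"however", "moreover", "therefore", "furthermore"}

def surface_score(essay: str) -> dict:
    """Return a dict of crude surface-level checks on an essay."""
    text = essay.lower()
    paragraphs = [p for p in essay.split("\n\n") if p.strip()]
    words = [w.strip(",.;:") for w in text.split()]
    return {
        "has_thesis_marker": "argue" in text or "claim" in text,
        "uses_transitions": any(w in TRANSITIONS for w in words),
        "gives_examples": "for example" in text,
        "paragraph_count": len(paragraphs),
    }

report = surface_score(
    "In this essay I argue that bees matter.\n\n"
    "However, for example, consider the drone."
)
```

Note what the checker cannot see: accuracy, reasoning, good sense, veracity. That gap is exactly the point the critics above are making.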
There is much to be learned here from an analogy with the rise of computer chess. The great grandmaster Garry Kasparov—who famously lost to Deep Blue—has perceptively argued that machines have changed the way chess is played and redefined what a good chess move and a well-played chess game look like. As I have written before:
The heavy use of computer analysis has pushed the game itself in new directions. The machine doesn’t care about style or patterns or hundreds of years of established theory. It counts up the values of the chess pieces, analyzes a few billion moves, and counts them up again. (A computer translates each piece and each positional factor into a value in order to reduce the game to numbers it can crunch.) It is entirely free of prejudice and doctrine and this has contributed to the development of players who are almost as free of dogma as the machines with which they train. Increasingly, a move isn’t good or bad because it looks that way or because it hasn’t been done that way before. It’s simply good if it works and bad if it doesn’t. Although we still require a strong measure of intuition and logic to play well, humans today are starting to play more like computers. One way to put this is that as we rely on computers and begin to value what computers value and think like computers think, our world becomes more rational, more efficient, and more powerful, but also less beautiful, less unique, and less exotic.
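The quoted passage says the machine “translates each piece and each positional factor into a value in order to reduce the game to numbers it can crunch.” A minimal sketch of that idea is a bare material count using conventional piece values; real engines layer many positional terms on top of this, and the representation below is an assumption for illustration.

```python
# Minimal sketch of reducing chess to numbers: a material count
# with conventional piece values (pawn=1, knight/bishop=3, rook=5,
# queen=9). Uppercase letters are White's pieces, lowercase Black's;
# kings and anything else are ignored. An illustration, not an engine.

PIECE_VALUES = {"p": 1, "n": 3, "b": 3, "r": 5, "q": 9}

def material_balance(board: str) -> int:
    """Sum piece values from White's point of view."""
    score = 0
    for ch in board:
        if ch.lower() in PIECE_VALUES:
            value = PIECE_VALUES[ch.lower()]
            score += value if ch.isupper() else -value
    return score

# White has both rooks; Black is missing a rook but has an extra
# pawn, so the balance is 5 - 1 = +4 in White's favor.
balance = material_balance("RNBQKBNR" + "PPPPPPPP" + "pppppppp" + "p" + "nbqkbnr")
```

The point of the analogy survives the simplification: once everything is a number, the machine has no use for style, pattern, or tradition; a position is simply better or worse.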
Much the same might be expected from the increasing use of computers to grade (and eventually to write) essays. Students will learn to write in ways expected by computers, just as they today try to learn to write in ways desired by their professors. The difference is that different professors demand and respond to varying styles. Computers will consistently and logically drive writing towards a more mechanical and logical style. Writing, like chess playing, will likely become more rational, more efficient, and more effective, but also less beautiful, less unique, and less eccentric. In other words, writing will become less human.
It turns out that many secondary school districts already use computers to grade essays. But according to John Markoff in The New York Times, the EdX software promises to bring the technology into college classrooms as well as online courses.
It is quite possible that in the near future, my colleagues and I will no longer have to complain about grading essays. But that is unlikely at Bard. More likely is that such software will be used in large university lecture courses. In such courses with hundreds of students, professors already shorten questions or replace essays with multiple-choice tests. Or they use armies of underpaid graduate students to grade these essays. It is quite likely that software will actually augment the educational value of writing assignments at college in these large lecture halls.
In seminars, however, and in classes at small liberal arts colleges like Bard where I teach, such software will not likely free my colleagues and me from reading essays. The essays I assign are not simple responses to questions in which there are clear criteria for grading. I look for elegance, brevity, insight, and the human spark (please no comments on my writing). Whether or not I am good at evaluating writing or at teaching writing, that is my aspiration. I seek to encourage writing that is thoughtful rather than writing that is simply accurate. When I have time to make meaningful comments on papers, they concern structure, elegance, and depth. It is not only a way to grade an essay, but also a way to connect with my students and help them to see what it means to write and think well.
And yet, I can easily imagine making use of such a computer-grading program. I rarely have time to grade essays as well or as quickly as I would like. I would love to have my students submit drafts of their essays to the EdX computer program.
If they could repeatedly submit their essays and receive such feedback, using the computer to catch not only grammatical errors but also poor sentences, redundancies, and whatever other mistakes the computer can be trained to recognize, they could respond and rework their essays many times before I see them. Used well, I hope, such grading programs might really augment my capacities as a professor and their experiences as students.
I have real fears that grading technology will rarely be used well. Rather, it will too often replace human grading altogether, and in large lectures, high schools, and standardized tests it will impose a new and inhuman standard on the way we write and thus the way we think. We should greet such new technologies enthusiastically and skeptically. But first, we should try to understand them. Towards that end, it is well worth reading John Markoff’s excellent account of the new EdX computer grading software in The New York Times. It is your weekend read.
In an essay in the Wall Street Journal, Frans de Waal—C. H. Candler Professor of Primate Behavior at Emory University—offers a fascinating review of recent scientific studies that upend long-held expectations about the intelligence of animals. De Waal rehearses a catalogue of fantastic studies in which animals do things that scientists have long thought they could not do. Here are a few examples:
Ayumu, a male chimpanzee, excels at memory; just as the IBM computer Watson can beat human champions at Jeopardy, Ayumu can easily best the human champion at memory games.
Similarly, Kandula, a young elephant bull, was able to reach some fragrant fruit hung out of reach by moving a stool over to the tree, standing on it, and reaching for the fruit with his trunk. I’ll admit this doesn’t seem like much of a feat to me, but for the researchers de Waal talks with, it is surprising proof that elephants can use tools.
Scientists may be surprised that animals can remember things or use tools to accomplish tasks, but anyone raised on children’s tales of Lassie or Black Beauty knows this well, as does anyone whose pet dog has opened a door, brought them a newspaper, or barked at intruders. The problem these studies address is less our societal view of animals than the overly reductive view of animals that de Waal attributes to his fellow scientists. It is hard to take these studies seriously as evidence that animals think in the way that humans do.
Seemingly more interesting are experiments with self-recognition and also facial recognition. De Waal describes one Asian elephant who stood in front of a mirror and “repeatedly rubbed a white cross on her forehead.” Apparently the elephant recognized the image in the mirror as herself. In another experiment, chimpanzees were able to recognize which pictures of chimpanzees were from their own species. Like my childhood Labrador who used to stare knowingly into the mirror, these studies confirm that animals are able to recognize themselves. This suggests that animals do, in some sense, understand that they are selves.
For de Waal, these studies have started to upend a view of humankind's unique place in the universe that dates back at least to ancient Greece. “Science,” he writes, “keeps chipping away at the wall that separates us from the other animals. We have moved from viewing animals as instinct-driven stimulus-response machines to seeing them as sophisticated decision makers.”
The flattening of the distinction between animals and humans is to be celebrated, de Waal argues, and not feared. He writes:
Aristotle's ladder of nature is not just being flattened; it is being transformed into a bush with many branches. This is no insult to human superiority. It is long-overdue recognition that intelligent life is not something for us to seek in the outer reaches of space but is abundant right here on earth, under our noses.
De Waal has long championed the intelligence of animals, and now his vision is gaining momentum. This week, in a long essay called “One of Us” in the new Lapham’s Quarterly on animals, the glorious essayist John Jeremiah Sullivan begins with this description of studies similar to those de Waal writes about:
These are stimulating times for anyone interested in questions of animal consciousness. On what seems like a monthly basis, scientific teams announce the results of new experiments, adding to a preponderance of evidence that we’ve been underestimating animal minds, even those of us who have rated them fairly highly. New animal behaviors and capacities are observed in the wild, often involving tool use—or at least object manipulation—the very kinds of activity that led the distinguished zoologist Donald R. Griffin to found the field of cognitive ethology (animal thinking) in 1978: octopuses piling stones in front of their hideyholes, to name one recent example; or dolphins fitting marine sponges to their beaks in order to dig for food on the seabed; or wasps using small stones to smooth the sand around their egg chambers, concealing them from predators. At the same time neurobiologists have been finding that the physical structures in our own brains most commonly held responsible for consciousness are not as rare in the animal kingdom as had been assumed. Indeed they are common. All of this work and discovery appeared to reach a kind of crescendo last summer, when an international group of prominent neuroscientists meeting at the University of Cambridge issued “The Cambridge Declaration on Consciousness in Non-Human Animals,” a document stating that “humans are not unique in possessing the neurological substrates that generate consciousness.” It goes further to conclude that numerous documented animal behaviors must be considered “consistent with experienced feeling states.”
With nuance and subtlety, Sullivan understands that our tradition has not drawn the boundary between human and animal nearly as securely as de Waal portrays it. Throughout human existence, humans and animals have been conjoined in the human imagination. Sullivan writes that the most consistent “motif in the artwork made between four thousand and forty thousand years ago” is the focus on “animal-human hybrids, drawings and carvings and statuettes showing part man or woman and part something else—lion or bird or bear.” In these paintings and sculptures, our ancestors gave form to a basic intuition: “Animals knew things, possessed their forms of wisdom.”
Religious history too is replete with evidence of the human recognition of the dignity of animals. God says in Isaiah that the beasts will honor him and St. Francis, the namesake of the new Pope, is famous for preaching to birds. What is more, we are told that God cares about the deaths of animals.
In the Gospel According to Matthew we’re told, “Not one of them will fall to the ground apart from your Father.” Think about that. If the bird dies on the branch, and the bird has no immortal soul, and is from that moment only inanimate matter, already basically dust, how can it be “with” God as it’s falling? And not in some abstract all-of-creation sense but in the very way that we are with Him, the explicit point of the verse: the line right before it is “fear not them which kill the body, but are not able to kill the soul.” If sparrows lack souls, if the logos liveth not in them, Jesus isn’t making any sense in Matthew 10:28-29.
What changed and interrupted the ancient and deeply human appreciation of our kinship with besouled animals? Sullivan’s answer is René Descartes. The modern depreciation of animals, Sullivan writes,
proceeds, with the rest of the Enlightenment, from the mind of René Descartes, whose take on animals was vividly (and approvingly) paraphrased by the French philosopher Nicolas Malebranche: they “eat without pleasure, cry without pain, grow without knowing it; they desire nothing, fear nothing, know nothing.” Descartes’ term for them was automata—windup toys, like the Renaissance protorobots he’d seen as a boy in the gardens at Saint-Germain-en-Laye, “hydraulic statues” that moved and made music and even appeared to speak as they sprinkled the plants.
Too easy, however, is the move to say that the modern comprehension of the difference between animal and human proceeds from a mechanistic view of animals. We live in the age of the animal rights movement. Around the world, societies exist and thrive whose mission is to prevent cruelty toward animals and to protect them. Yes, factory farms treat chickens and pigs as organic mechanisms for the production of meat, but these farms co-exist with active and quite successful movements calling for humane standards in food production. Whatever the power of Cartesian mechanics, its success is at odds with the persistence of an ancient and religious solidarity, and also a deeply modern sympathy, between human and animal.
A more meaningful account of the modern attitude towards animals might be found in Spinoza. Spinoza, as Sullivan quotes him, recognizes that animals feel in ways that Descartes did not. Like animal rights activists, Spinoza admits what is obvious: that animals feel pain, show emotion, and have desires. And yet, Spinoza maintains a distinction between human and animal—one grounded not in emotion or feeling, but in human nature. In his Ethics, he writes:
Hence it follows that the emotions of the animals which are called irrational…only differ from man’s emotions to the extent that brute nature differs from human nature. Horse and man are alike carried away by the desire of procreation, but the desire of the former is equine, the desire of the latter is human…Thus, although each individual lives content and rejoices in that nature belonging to him wherein he has his being, yet the life, wherein each is content and rejoices, is nothing else but the idea, or soul, of the said individual…It follows from the foregoing proposition that there is no small difference between the joy which actuates, say, a drunkard, and the joy possessed by a philosopher.
Spinoza argues against the law prohibiting slaughter of animals—it is “founded rather on vain superstition and womanish pity than on sound reason”—because humans are more powerful than animals. Here is how he defends the slaughter of animals:
The rational quest of what is useful to us further teaches us the necessity of associating ourselves with our fellow men, but not with beasts, or things, whose nature is different from our own; we have the same rights in respect to them as they have in respect to us. Nay, as everyone’s right is defined by his virtue, or power, men have far greater rights over beasts than beasts have over men. Still I do not deny that beasts feel: what I deny is that we may not consult our own advantage and use them as we please, treating them in the way which best suits us; for their nature is not like ours.
Spinoza’s point is quite simple: Of course animals feel, and of course they are intelligent. Who could doubt such a thing? But they are not human. That is clear too. While we humans may care for and even love our pets, we recognize the difference between a dog and a human. And we will, in the end, associate more with our fellow humans than with dogs and porpoises. Finally, we humans will use animals when they serve our purposes. And this is acceptable, because we have the power to do so.
Is Spinoza arguing that might makes right? Surely not in the realm of law amongst fellow humans. But he is insisting that we recognize that for us humans, there is something about being human that is different and, even, higher and more important. Spinoza couches his argument in the language of natural right, but what he is saying is that we must recognize that there are important differences between animals and humans.
At a time that values equality over what Friedrich Nietzsche called the “pathos of difference,” the valuation of human beings over animals is ever more in doubt. This comes home clearly in a story told recently by General Stanley McChrystal about a soldier who expressed sympathy for some dogs killed in a raid in Iraq. McChrystal responded severely: “Seven enemy were killed on that target last night. Seven humans. Are you telling me you’re more concerned about the dog than the people that died? The car fell silent again. ‘Hey, listen,’ I said. ‘Don’t lose your humanity in this thing.’” Many, no doubt, are more concerned, or at least equally concerned, about the deaths of animals as they are about the deaths of humans. There is ever-increasing discomfort with McChrystal’s common-sense affirmation of Spinoza’s claim that human beings simply are of more worth than animals.
The distinctions upon which the moral sense of human uniqueness is based are foundering. For de Waal and Sullivan, the danger today is that we continue to insist on differences between animals and humans, differences that we do not fully understand. The consequence of their openness to the humanization of animals, however, is undoubtedly the animalization of humans. The danger that we humans lose sight of what distinguishes us from animals is much more significant than the possibility that we underestimate animal intelligence.
I fully agree with de Waal and Sullivan that there is a symphony of intelligence in the world, much of it not human. And yes, we should have proper respect for our ignorance. But all the experiments in the world do little to alter the basic fact that, no matter how intelligent and feeling and even conscious animals may be, humans and animals are different.
What is the quality of that difference? It is difficult to say and may never be fully articulated in propositional form. On one level it is this: Simply to live, as do plants or animals, does not constitute a human life. In other words, human life is not simply about living. Nor is it about doing tasks or even being conscious of ourselves as humans. It is about living meaningfully. There may, of course, be some animals that can create worlds of meaning—worlds that we have not yet discovered. But their worlds are not the worlds to which we humans aspire.
Over two millennia ago, Sophocles, in his “Ode to Man,” named man Deinon, a Greek word that connotes both greatness and horror, that which is so extraordinary as to be at once terrifying and beautiful. Man, Sophocles tells us, can travel over water and tame animals, using them to plough fields. He can invent speech, and institute governments that bring humans together to form lasting institutions. As an inventor and maker of his world, this wonder that is man terrifyingly carries the seeds of his destruction. As he invents and comes to control his world, he threatens to extinguish the mystery of his existence, that part of man that man himself does not control. As the chorus sings: “Always overcoming all ways, man loses his way and comes to nothing.” If man so tames the earth as to free himself from uncertainty, what then is left of human being?
Sophocles knew that man could be a terror; but he also glorified the wonder that man is. He knew that what separates us humans from animals is our capacity to alter the earth and our natural environment. “The human artifice of the world,” Arendt writes, “separates human existence from all mere animal environment…” Not only by building houses and erecting dams—animals can do those things and more—but also by telling stories and building political communities that give to man a humanly created world in which he lives. If all we did as humans was live or build things on earth, we would not be human.
To be human means that we can destroy all living matter on the earth. We can even today destroy the earth itself. Whether we do so or not, to live on earth today is now a choice that we make, not a matter of fate or chance. Our earth, although we did not create it, is now something we humans can decide to sustain or destroy. In this sense, it is a human creation. No other animal has such a potential or such a responsibility.
There is a deep desire today to flee from that awesome and increasingly unbearable human responsibility. We flee, therefore, our humanity and take solace in the view that we are just one amongst the many animals in the world. We see this reductionism above all in human rights discourse. One core demand of human rights—that men and women have a right to live and not be killed—brought about a shift in the idea of humanity from logos to life. The rise of a politics of life—the political demand that governments limit freedoms and regulate populations in order to protect and facilitate their citizens’ ability to live in comfort—has pushed the animality, the “life,” of human beings to the center of political and ethical activity. In embracing a politics of life over a politics of the meaningful life, human rights rejects the distinctive dignity of human rationality and works to reduce humanity to its animality.
Hannah Arendt saw human rights as dangerous precisely because they risked confusing the meaning of human worldliness with the existence of mere animal life. For Arendt, human beings are the beings who build and live in a political world, by which she means the stories, institutions, and achievements that mark the glory and agony of humanity. To be human, she insists, is more than simply living, laboring, working, acting, and thinking. It is to do all of these activities in such a way as to create, together, a common life amongst a plurality of persons.
I fear that the interest in animal consciousness today is less a result of scientific proof that animals are human than of an increasing discomfort with the world we humans have built. A first step in responding to such discomfort, however, is a reaffirmation of our humanity and our human responsibility. There is no better way to begin that process than by engaging with a very human response to the question of our animality. Towards that end, I commend to you “One of Us,” by John Jeremiah Sullivan.
For two years I taught literature, reading, and writing at a public university in one of New York City’s outer boroughs. Of course, having come out of a liberal arts “thinking” institution, what I really thought (maybe hoped) I was teaching was new perspectives. Ironically, the challenge that most struck me was not administrative: not class size, terrible grammar, or endless hours of grading. The most pressing obstacle lay in making a case for the value of “thinking.”
I say “case” because I regularly felt that my passions and beliefs, as well as my liberal arts education, went on daily trial. I had originally come from a hard-scrabble immigrant reality, but my perception of reality had been altered by my educational experience, and as an educator I felt the need to authenticate my progressive (core text) education with my students.
I was regularly reminded that the immediate world of the “average” student (citizen), with all its pressing, “real” concerns, does not immediately open itself to “thought” in the liberal arts sense. We are a specializing, automating, struggling, and hyper-competitive society. The “learning time” of a student citizen is spent acquiring “marketable,” differentiating skills, while “free time” is the opportunity to decompress from, or completely escape, the pressures of competitive skill acquisition. The whole cycle is guided by an air of anxiety fostered by our national education philosophy, as well as by the troubled economy and scattered society at large. I don’t think one can teach the humanities without listening to one’s students, and listening to the students calls for a deep inventory of the value of “thought” in the humanities sense, and then, ultimately, of how to most truthfully communicate this value to the students.
I need to add here that my students were quite smart and insightful. This made the challenge even greater. Their intelligence was one of realism. I needed both to acknowledge and to sway their perspective, as well as my own.
Each semester I began with a close reading of David Foster Wallace's commencement speech at Kenyon College, “This Is Water.” He begins his speech with the parable of two young fish who swim past an older fish, who asks as he passes, “How's the water?” The little ones swim on and only later ask each other, “What is water?” Didactic parable, cliche -- yes -- but Wallace goes on to deconstruct the artifice of commencement speeches, parables, and cliches, and then rebuilds them. Having so skillfully deconstructed them, he has invited his listeners into the form making, and as he communicates the truth beneath what had earlier seemed lofty or cliche, the listeners follow him towards meaning making. Ultimately Wallace states that education is “less about teaching you how to think, and more about teaching you of the choice in what to think about.” To have agency is to be a meaning maker. And as more and more cultural institutions artfully vie for the citizen's devotion and loyalty -- politics and religion, but even more so corporate houses and pop culture designs -- the call to choose seems ever more muted in the ever-growing noise of institutional marketing.
The choice, for so many students today, is simply how to most skillfully compartmentalize themselves and their lives in the face of the anxieties of their immediate world. The choice for many young teachers, facing their own set of related anxieties, is how far they are willing to step away from the ideal of an integrated learning-living-teaching model -- so easy is it today as an educator to simply become disenchanted, frustrated, and aloof. Sometimes, “thinking” is the process of choosing what to keep and what to give away.
Wallace's insightful, no-b.s., humorous, and sincere tone resonated with my students -- that is, of course, until they found out that Wallace killed himself. Then that’s what everyone wanted to focus on. I cannot blame them. There is a ‘text’ to ‘personal’ mystery, a ‘content’ to ‘context’ disjunction that opens itself at such a revelation, a mystery that the “thinking” mind wants to explore. The modern “thinking” mind draws little separation between the lofty and the sublime, the public and the personal. Such is a byproduct of a generation raised on reality television and celebrity stories. I, in all sincerity, cannot judge this. My generation, the X’s who came of age on the cusp of the Millennials, were culturally educated by MTV, The Real World, and Road Rules, and thus we crave hip, colorful, appropriately gentrified spaces to occupy -- think of artist collectives, or Facebook and Google working environments (bean bags, chill and chic prescription sunglasses, lounge happy hours with juice bars, untraditional working hours, colorful earth tones). But I digress; I meant to make some observation about “thinking.”
I was excited to teach what excited me: I began with Wallace, then Kafka, O’Connor (Flannery or Frank), Platonov, Carver, Babel, Achebe, Kundera, Eliot, etc. It is, essentially, the Seven Sisters freshman reading list, a popular catalogue of classic stories peppered with some international obscurity. It is the “cool” thing in liberal arts. But over and over my students came to me complaining that they could not find it relevant to their lives. After such reports I would tweak my lesson plans to give a greater introduction to the works, going deeper into the philosophical tenets of the stories and into the universal reward of being able to use the tools of the thinking, writing mind. Induce, deduce, compare, contrast, relate. “Give it greater shape,” I would say. “Breathe life into it.”
To have the skills to decipher plot, to record the echo of a narrative, to infer characterization from setting, to understand the complex structure of a character, to be invited to participate in the co-creation of a narrative that gently guides you through action but leaves the moral implications up to the reader -- these are “indispensable,” I would advise my students. “Indispensable for human agency.” Some would slowly gravitate to my vision as I prodded further and further into their motivations for being in school, their careers, and other ‘relevant’ choices. Yet they often felt only like visitors in my library, preparing to check out and return to the “default” education thinking mode as soon as the quarter, mid, or end semester exam periods began. The pressures of what they call “the real world” are much stronger than the ghosts of books and introspective thought -- vague, powerless, intangible.
“The real world”: here I am reminded of the scene from The Matrix when Morpheus unveils to Neo “the desert of the real,” a barren wasteland where human energy is merely a power source nourished for consumption. The Matrix, I will add here, draws on the work of Jean Baudrillard, a French philosopher who warns of a modern society existing in consumption and entertainment, devoid of meaning making -- the urge towards agency in hibernation, the map towards meaning defunct. In describing this new world he coined the phrase “the desert of the real.” Again, I fall into tangential thought.
I needed to find a way to invite, seduce, capture my students. I tried using myself as a conduit.
I pride myself on the fact that I am an immigrant, a former “at risk” student; that my tattoos all have mythological meaning and thought behind them; that I am a high-school dropout with credentials to my name, a top-tier education, a master’s degree, etc. I felt that these could help me bridge for my students the platforms of reality-setting discourse and humanistic thought. I believed, and still do, that real “thinking” is indispensable to being human, to being free, and to the ability to have fun and play with the world.
Again, my students would, at times, meet me in the middle space I wanted to create, though rarely did this space become living for them; instead they laid their heads to the sound of another’s palpitation and breath, and then moved on. Maybe I planted a seed, I like to think. But then, maybe, they were bringing me somewhere as well.
They could not recklessly follow me, or I them. It was an issue of pragmatic bonds. For a moment, my class, or an individual student I was reading with, would delve into the power of words with me, and the ending of Andrei Platonov’s “The River Potudan” would finally break through the events of the page: “Not every grief can be comforted; there is a grief that ends only after the heart has been worn away in long oblivion, or in distraction amidst life’s everyday concerns.” And my students would draw new understanding from the passage, enter it through a word or phrase that could unlock that middle space between their worlds and the world of literature, philosophy, metaphor. “Grief,” “long oblivion,” “life’s everyday concerns” -- all of a sudden my students would give these new meaning, now only slightly guided by the story, letting their lives find a grip on the reins. They would find new connections, and again they would return to the “real” world.
More and more I struggled to make thinking relevant. “Will this help me get a better job?” I was asked.
Thinking about it, I had to confront my own struggles with this question. I know the answers. I know the programmed liberal arts answer, and the “real” answer. I know that the liberal arts answer exposes the “real” as something at best lacking, at worst empty. I also know that the real is real; it happens in real time, removed from the concerns of literature, poetry, and philosophy, which concern themselves with the work of man’s eternity.
“Unlikely,” I would answer. For God’s sake, though I was teaching all these things I cared so deeply about, I also worked nights as a bartender to satisfy the demands of the real. I had to produce something consumable, and all of my learning and thoughts on thinking are not that.
Here I acknowledge that this answer is not entirely true. We can find jobs that call for liberal arts skills, but these are few and far between and rarely afford a comfortable standard of living. We may also posit the argument that liberal arts skills contribute to one’s ability to perform better and to have a greater understanding of one’s job, but this argument does not lend itself to substantial evidence, no matter how much I may actually believe it. This was the litmus test of my “thinking,” and it survives only in my embracing the privacies of my world -- in my choosing my private world despite, and above, the “real.”
“Unlikely.” And where does that leave us?
Ultimately, all I have as a conscious being is the ability to tell stories, to choose and create my narrative from the scattered world I am provided. Ultimately, after deconstructing both the “real” and the “lofty” I could only encourage my students to choose their own themes. To the question of “what is water?” I could only answer, “the desert.”
Oddly enough, and as “unlikely” as it may seem, when I answered with honesty, to them as well as to myself, they followed -- we could talk.
"Some of my most cherished books." Submitted by Professor Jorge Giannaeas.
Reading furnishes the mind only with materials of knowledge; it is thinking that makes what we read ours.
- John Locke
The office library of Mery del Rocio Castillo Cisneros, a Professor of Philosophy
and Humanities at Universidad de La Salle in Bogotá.
Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.
Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.
Clocking in as the longest article ever in Time (h/t Dylan Byers), Steven Brill’s cover story is the single best account of the insanity and corruption of our current medical system. Why do we accept the skyrocketing costs of medical care? “Those who work in the health care industry and those who argue over health care policy seem inured to the shock.” Brill shows us why the bills really are way too high. Hint: it is not because the care is so good. There are so many excess costs in the system that reforming it should be easy, if it weren’t so corrupt.
David Goldhill wants to give all working Americans $1,800,000, the amount he calculates a 23-year-old beginning work today at $35,000/year will pay, directly or indirectly, in health care insurance benefits. Goldhill argues that our health care system wastes most of that money because people have no incentive to attend to costs. He suggests a dual system. Give every American health insurance for truly rare and unpredictable illnesses. But for regular costs and smaller emergencies, he would refund workers the money they are losing and let them pay for health care themselves.
Oliver Sacks walks through his past and, with the help of his brother, discovers that a memory he had believed his own had actually been that of another. Starting from there, he gives a short account of the weakness of individual remembering, which allows us to take in something we've heard or seen and make it our own. He concludes, finally, that "memory is dialogic and arises not only from direct experience but from the intercourse of many minds."
Michael Lewis writes of the rise of an unapologetic business class in the 1990s and early 2000s: they enjoyed the “upside to big risk-taking, the costs of which would be socialized, if they ever went wrong. For a long time they looked simply like fair compensation for being clever and working hard. But that’s not what they really were; and the net effect was… to get rid of the dole for the poor and replace it with a far more generous, and far more subtle, dole for the rich.”
Five women. “Two are wives and daughters in ordinary families unable to comprehend why such misfortune has overtaken them. A third is a young bride living in the household of a high party official. The last two are wives of the Master’s executioners. These stories are based on their memoirs—some written by themselves, others by close friends or by their children. These five women put a human face on the terror of Stalin’s purges and the Gulag in the Soviet Union of the 1930s.”
“Debt doesn’t look like much. It has no shape or smell. But, over time, it leaves a mark. In Spain, it manifested itself, first, as empty buildings, stillborn projects, and idled machines.” So writes Nick Paumgarten. To see how debt looks and smells, look at Simon Norfolk's surreal photographs of Residencial Francisco Hernando, an unfinished development near Seseña, Spain. Working his way through a half-finished city with few people in it, Norfolk's photography suggests that even beginning construction was an act of hubris; "everyone," he says, "wanted to get rich doing nothing."
The Arendt Center’s 2012 conference “Does the President Matter?” asked whether political leadership is still possible today. Gautam Mukunda believes that we can measure the value of a particular leader by that leader’s behavior at the margins—what did that person accomplish over and above what another would have been able to do? In the accompanying video, Mukunda argues that leaders can only be great or terrible when those selected for such roles are relatively unknown to those making the selection. In an age of information, the chances are slim.
This week on the blog
This week on the blog, we argued that American reformers should shift their efforts at reforming education towards high school and pointed towards Richard Kahlenberg's recent piece in The Chronicle of Higher Education, adding that "poverty, more than race or gender, is increasingly the true mark of disadvantage in 21st century America." We also continued the inquiry into the growing threat that entitlements pose to the next generation, highlighting Geoffrey Canada and Stanley Druckenmiller's argument that entitlements are a generational theft that must be arrested. Elsewhere, Na'ama Rokem quotes from Arendt's only Yiddish-language article to explore the philosopher's language politics and her Jewish identity. Jeff Champlin looked at some similarities between Habermas and Arendt in their understandings of power. In the Weekend Read, Roger Berkowitz argues that we need to free federalism from its present partisanship and recall the important connection between federalism and freedom. Finally, if you didn't get around to our remembrance of Ronald Dworkin, you should take some time and give it a read.
Until next week,
The Hannah Arendt Center
You know elite universities are in trouble when their professors say things like this. Edward Rock, Distinguished Professor at the University of Pennsylvania Law School and coordinator of Penn’s online education program, has the following to say about the impending revolution in online education:
We’re in the business of creating and disseminating knowledge. And in 2012, the internet is an incredibly important place to be present if you’re in the knowledge dissemination business.
If elite colleges are in the knowledge dissemination business, then they will over time be increasingly devalued and made less relevant. What colleges and universities need to offer is not simply knowledge, but education.
In 1947, at the age of 18, Martin Luther King Jr. wrote a short essay in The Maroon Tiger, the Morehouse College campus newspaper. The article was titled “The Purpose of Education.” In short, it argued that we must not confuse education with knowledge.
King began with the personal. Too often, he wrote, “most college men have a misconception of the purpose of education. Most of the "brethren" think that education should equip them with the proper instruments of exploitation so that they can forever trample over the masses. Still others think that education should furnish them with noble ends rather than means to an end.” In other words, too many think that college is designed to teach either means or ends, offering the secrets that unlock the mysteries of our futures.
King takes aim at both of these purposes. Beyond the need for education to make us more efficient, education also has a cultural function. In this sense, King writes, education must inculcate the habit of thinking for oneself, what Hannah Arendt called Selbstdenken, or self-thinking.
“Education,” King writes, “must also train one for quick, resolute and effective thinking.” Quick and resolute thinking requires that one “think incisively” and “think for one's self.” This “is very difficult.” The difficulty comes from the seduction of conformity and the power of prejudice. “We are prone to let our mental life become invaded by legions of half truths, prejudices, and propaganda.” We are all educated into prejudgments. They are human, and it is inhuman to live free from prejudicial opinions and thoughts. On the one hand, education is the way we are led into and brought into a world as it exists, with its prejudices and values. And yet education must also produce self-thinking persons, people who, once they are educated and enter the world as adults, are capable of judging the world into which they have been born.
For King, one of the “chief aims of education” is to “save man from the morass of propaganda.” “Education must enable one to sift and weigh evidence, to discern the true from the false, the real from the unreal, and the facts from the fiction.”
To think for oneself is not the same as critical thinking. Against the common assumption that college should teach “critical reasoning,” King argues that critical thinking alone is insufficient and even dangerous: “Education which stops with efficiency may prove the greatest menace to society. The most dangerous criminal may be the man gifted with reason, but with no morals.” The example King offers is that of Eugene Talmadge, who had been governor of Georgia. Talmadge “possessed one of the better minds of Georgia, or even America.” He was Phi Beta Kappa. He excelled at critical thinking. And yet, Talmadge believed that King and all black people were inferior beings. For King, we cannot call such men well educated.
The lesson the young Martin Luther King Jr. draws is that intelligence and critical reasoning are not enough to make us educated. What is needed, also, is an educational development of character:
We must remember that intelligence is not enough. Intelligence plus character—that is the goal of true education. The complete education gives one not only power of concentration, but worthy objectives upon which to concentrate. The broad education will, therefore, transmit to one not only the accumulated knowledge of the race but also the accumulated experience of social living.
Present debates about higher education focus on two concerns. The first is cost. The second is assessment. While the cost is high for many people, it is also the case that most students and their families understand that what colleges offer is priceless. But that is only true insofar as colleges understand their purpose, which is not simply to disseminate knowledge or teach critical thinking, but is, rather, to nurture character. How are we to assess such education? The demand for assessment, well meaning as it is, drives education to focus on measurable skills and thus moves us away from the purposes of education as King rightly understands them.
The emerging debate about civic education is many things. Too often it is a tired argument over the “core” or the “canon.” And increasingly it is derailed by arguments about service learning or internships. What really is at issue, however, is a long-overdue response to the misguided dominance of the research-university model of education.
Colleges in the United States were, up through the middle of the 20th century, not research-driven institutions. They were above all religiously affiliated institutions, and they offered general education in the classics and the liberal arts. Professors taught the classics outside of their specific disciplines. And students wrestled with timeless questions. This has largely changed: today professors are taught to specialize and to think within their disciplinary prejudices. Even distribution requirements fail to make a difference, insofar as students forced to take a course outside their discipline simply learn another disciplinary approach. They learn useful knowledge and critical thinking. But what is missing is the kind of general education in the “accumulated experience of social living” that King championed.
I am not suggesting that all specialization is bad or that we should return to religious-affiliated schools. Not in the least. But many of us know that we are failing in our responsibilities to think about what is important and to teach students a curriculum designed to nurture self-thinking and citizenship. We avoid this conversation because it is hard, because people disagree today on whether we should read Plato or Confucius or study Einstein or immunology. Everyone has their discipline to defend and few faculty are willing or able to think about an education that is designed for students and citizens.
Let’s stop bad-mouthing all colleges. Much good happens there. Yet let’s also recall King’s parting words in his essay:
If we are not careful, our colleges will produce a group of close-minded, unscientific, illogical propagandists, consumed with immoral acts. Be careful, "brethren!" Be careful, teachers!
King’s “The Purpose of Education” is your weekend read.
Law school applications have gone off a cliff. Just look at this statistic from today’s NY Times.
As of this month, there were 30,000 applicants to law schools for the fall, a 20 percent decrease from the same time last year and a 38 percent decline from 2010, according to the Law School Admission Council. Of some 200 law schools nationwide, only 4 have seen increases in applications this year. In 2004 there were 100,000 applicants to law schools; this year there are likely to be 54,000.
This radical drop in law school applications is not because people are suddenly reading Shakespeare. The reason is clear. Lawyers aren’t getting jobs. Of law school grads in 2011, only 55% got full-time jobs working as lawyers. That means 45% did not get the jobs they were trained to do. No wonder students and their parents aren’t lining up to take out debt to get a legal education.
Just as journalism has been upended by the Internet revolution, so too is law changing. The changes are different in kind. Lawyers are still needed, and law firms will exist. But more of the work can be done more cheaply, off-location, and by fewer people. Quite simply, we need fewer lawyers. And those we do need don’t command the salaries they once did.
Finally, law school was for years the refuge of the uncommitted. For the liberal arts grad unsure of what to do next, the answer was law school. But now, with tuitions skyrocketing, debt ballooning, and job prospects dimming, law schools are out of favor.
What is more, the changes coming to law schools will come to other professional and graduate schools as well. All those Ph.D.s in hyper-specialized disciplines ranging from Italian studies to political theory are in for a tragically rude awakening. There are no jobs. And those jobs are not coming back. For academics to keep bringing young scholars into Ph.D. programs now is deeply wrong.
This retreat from law school is a good thing. My J.D. was hardly an educational experience worth three years of my time. Law schools are caught between being professional schools training practicing lawyers and the desire also to be something more. The result: they largely do neither well. They don’t produce lawyers ready to practice. Nor do they produce deep legal minds. Little would be lost if law school were reduced to two years (or even less), which is why legal academics are pushing an experiment to offer two-year J.D.s.
Education does matter and will continue to distinguish people who pursue it and excel at it. Liberal arts majors who combine a love for the Renaissance with an interest in dance will succeed; whether they create new works of art or found a business curating Italian wines, these students learn to pursue their dreams. Education will survive because it raises people from their daily lives to the life of the mind. Education, as opposed to factory schools and large lectures, fosters creativity and daring, leading people to invent lives for themselves in pursuit of their passions.
While education will survive, schools and universities that have become credentialing factories will be increasingly challenged. When what matters is measurable performance, credentials will become ever less important. Law schools—at least the many that do not confer elite status—are credentialing institutions. So too are many of the colleges and universities around the country, where students sit in large lectures for four years so that they can get a degree that stamps them employable. Such credentials are ever less valuable in an age of cheap, Internet-driven education. That is why these institutions are under pressure.