Hannah Arendt Center for Politics and Humanities

Google Books and the Problem of Tradition


“Kierkegaard, Marx, and Nietzsche are for us like guideposts to a past which has lost its significance.”

--Hannah Arendt, “Tradition and the Modern Age”

The general outlines of the Google Books project are simple in principle and stunning in size. Collaborating with major libraries around the globe, Google has undertaken to scan all known existing books and to make them accessible to the electronically connected public. Since starting a decade ago, in 2004, Google has digitized roughly a quarter of the estimated 130 million books in existence worldwide. The completion of the collection is scheduled for 2020.


Amor Mundi 2/23/14


Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.

Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.

The Public Voice of Women

In the London Review of Books’ winter lecture, classicist Mary Beard discusses how the silencing of women was a common dramatic trope throughout Greek and Roman antiquity. From Telemachus’ admonition to Penelope in the Odyssey (“take up your own work, the loom and the distaff…speech will be the business of men”) to the silencing of the princess Philomela by cutting out her tongue in Ovid’s Metamorphoses, female oratory was treated as inappropriate or even dangerous in the public sphere. In the classical tradition, “public speaking and oratory were not merely things that ancient women didn’t do: they were exclusive practices and skills that defined masculinity as a gender. As we saw with Telemachus, to become a man – and we’re talking elite man – was to claim the right to speak. Public speech was a – if not the – defining attribute of male-ness.” The derision of female speech, argues Beard, was not only embedded in our modern traditions of speechmaking but remains an alarmingly widespread issue today, as women speaking in public face far more death threats, Internet trolling, and verbal abuse than men. “The more I have looked at the threats and insults that women have received, the more I have found that they fit into the old patterns I’ve been talking about,” writes Beard. “For a start it doesn’t much matter what line you take as a woman, if you venture into traditional male territory, the abuse comes anyway. It’s not what you say that prompts it, it’s the fact you’re saying it.”

The Irony of the Elite

Peggy Noonan is worried about the decadence of elite American culture in response to a video compilation of real congressmen quoting their favorite lines from the Netflix series “House of Cards,” and the recent publication of an excerpt from Kevin Roose’s new book Young Money. While the folks over at DailyKos are foaming about the irony of Ronald Reagan’s speechwriter complaining about the excesses of the power elites, Noonan makes an important point about the corrosive effects that irony has on elites and on culture more generally. “‘House of Cards’ very famously does nothing to enhance Washington’s reputation. It reinforces the idea that the Capital has no room for clean people. The earnest, the diligent, the idealistic, they have no place there. Why would powerful members of Congress align themselves with this message? Why do they become part of it? I guess they think they’re showing they’re in on the joke and hip to the culture. I guess they think they’re impressing people with their surprising groovelocity…. All of this is supposed to be merry, high-jinksy, unpretentious, wickedly self-spoofing. But it seems more self-exposing, doesn’t it? And all of it feels so decadent.” Read more about the decadence and irony of elites on the blog in Roger Berkowitz’s Weekend Read.

On the Glory of Being Wrong

In a review of Mario Livio's new book Brilliant Blunders, Freeman Dyson praises theories, particularly incorrect ones, as the engine of science: "They are free creations of the human mind, intended to describe our understanding of nature. Since our understanding is incomplete, theories are provisional. Theories are tools of understanding, and a tool does not need to be precisely true in order to be useful. Theories are supposed to be more-or-less true, with plenty of room for disagreement. A scientist who invents a theory that turns out to be wrong is judged leniently. Mistakes are tolerated, so long as the culprit is willing to correct them when nature proves them wrong."

The Singularity is Near Enough to Date

Ray Kurzweil reviews Spike Jonze's Her, which features a romance between a man and his computer's sentient operating system, and takes issue with the ending: “In my view, biological humans will not be outpaced by the AIs because they (we) will enhance themselves (ourselves) with AI. It will not be us versus the machines (whether the machines are enemies or lovers), but rather, we will enhance our own capacity by merging with our intelligent creations. We are doing this already. Even though most of our computers — although not all — are not yet physically inside us, I consider that to be an arbitrary distinction.”

To Hear the Truth, to Hear a True Fiction

In a review of Claude Lanzmann's long-percolating The Last of the Unjust, about Benjamin Murmelstein, the last surviving Jewish elder of the Nazis' show ghetto at Theresienstadt, Leah Falk wonders whether reportage or art will ultimately prove more effective at preserving the terror of the Holocaust: "Is there a kind of truth that can’t be adequately served by even the toughest oral testimony, but only by art? The film’s investigation is not: Was Murmelstein a collaborator? But rather, did Lanzmann’s interview with Murmelstein tell his story? Or were we too late? Has everyone, with regard to the Holocaust, always been too late? About Shoah, Lanzmann admitted that he had made a film about the kinds of stories the human brain was not made to handle. Our handling of them as they grow more distant, as the emotional current underneath the facts becomes even less immediately accessible, is something fragile, a skill that must be not only taught, but also constantly reinvented."

From the Hannah Arendt Center Blog

This week on the blog, Jennifer Hudson considers Arendt's understanding of knowledge as tyrannical, and Roger Berkowitz asks two journalists what they understand as their role. And Berkowitz also turns to Nietzsche and Arendt in the Weekend Read to make sense of our elite culture of decadence and irony.

Upcoming Events

Blogging and the New Public Intellectual - A Discussion with Tom Goldstein

Sunday, March 9, 2014, 5:00 pm - 7:00 pm
Bard Graduate Center, NYC
Learn more here.

R.S.V.P. to arendt@bard.edu


Too Busy to Think


“One feels very lonely in this country; this has to do in particular with the fact that everyone is very busy and that for most people the need for leisure simply ceases to exist after a certain amount of time.”

- Hannah Arendt to Gershom Scholem, November 4, 1943

Hannah Arendt had lived for a year and a half in the United States when she noted in a letter to her friend Gershom Scholem: “One feels very lonely in this country; this has to do in particular with the fact that everyone is very busy and that for most people the need for leisure simply ceases to exist after a certain amount of time.”


This entails, Arendt continues, a certain attitude of “permanent absence (by which I mean ‘absent-mindedness’), rendering human contact between people to be very difficult.” Scholem, who received Arendt’s letter from New York in Jerusalem, was familiar with this phenomenon. “All my friends in the U.S. are muted by this ‘public isolation’,” hence communicating with them became very difficult, he writes in December 1943, “unfortunately you are not an exception in that regard.”

Scholem’s response is noteworthy, for he addresses the political implication of Arendt’s (self-) observation. In general, being busy and leading a public life is not a contradiction. “One can be occupied by his daily work, and when this period of work in the private realm of a factory or an office space has ended, one can enter the public sphere by being a citizen – or a friend” (Jerome Kohn). Arendt had a political understanding of friendship; for her, friendship consists of the world that appears between friends who are diverse and embody plurality rather than an imagined or imposed ‘unity’. In a state of “absent-mindedness” though, one cannot be in public, nor political, nor with friends in a meaningful way.

The problem starts with the absent need for “leisure,” Arendt states. In her letter to Scholem she uses a particular (untranslatable) German term for leisure: “Musse,” which is the German version of the Latin concept of otium. It denotes the free time I have for contemplation when I’m not busy (as opposed to neg-otium, the time when I’m not free for contemplation, i.e. when I’m busy).

The term “Musse” that Arendt uses also appears in the title “Musse und Müssiggang” (Leisure and Idleness) of section no. 329 in Nietzsche’s Gay Science. Nietzsche, who is not known for having great interest in the New World, in this very passage talks explicitly about America, and in particular about the Americans’ “distinctive vice”: “the breathless haste with which they work,” so that “one no longer has time or energy […] for otium at all.” Arendt read this passage thoroughly: in her private (German) copy of Nietzsche’s Gay Science, not only is this sentence marked up, but underlinings and marginalia run throughout the entire entry on “Leisure and Idleness.”

One thinks with a watch in one’s hand, Nietzsche continues in his depiction of America’s oblivious take on “Musse,” and the common principle "Rather do anything than nothing" throttles all culture and good taste. In effect, all forms and “the feeling for form itself, the ear and eye for the melody of movements” were visibly perishing because of the haste of the busy people. Before the takeover of the Protestant work ethic, it was actually ‘busy action’ that suffered from a bad conscience, Nietzsche recalls, and Arendt underlined the related sentence: “the desire for enjoyment already calls itself ‘need of recreation,’ and even begins to be ashamed of itself.”

Arendt’s underlining, with regard to her letter to Scholem, outlines – at a very early stage – her larger political and theoretical project: the modern problem of world-alienation and its threat to the human faculty of judgment.

Thinking needs solitude, according to Arendt, not loneliness or isolation (another distinction inspired by Nietzsche).


World-alienated loneliness or isolation excludes the thinker from the common world; yet, out of the state of solitude he can reenter it once he has ended his act of thinking. Judging relates abstract thoughts back to the world by giving them a concrete form perceivable and disputable in public, in company with others. Absent-mindedness is oblivious of this company. That’s why the perished “feeling for form itself,” deriving from a common lack of “Musse,” may entail a crisis of political judgment: in other words, a disconnection between the vita contemplativa and the public sphere. Nietzsche, in the passage intensely marked by Arendt, offers a way of counteracting this disconnect: “to take a stroll with thoughts and friends.”

-Thomas Wild




Trespassing is an everyday occurrence which is in the very nature of action’s constant establishment of new relationships within a web of relations, and it needs forgiving, dismissing in order to go on by constantly releasing men from what they have done unknowingly.  Only through this constant mutual release from what they do can men remain free agents, only by constant willingness to change their minds and start again can they be trusted with so great a power as that to begin something new.

—Hannah Arendt, The Human Condition

In The Human Condition, Hannah Arendt relates Augustine’s Christian concept of forgiveness to human action and agency. Forgiveness solves an important problem inherent to the activity of action. Since “men never have been and never will be able to undo or even control reliably any of the processes they start through action,” human beings are met with the disabling reality of processes whose outcomes are both unpredictable and irreversible. Knowing that our actions may lead to evil or unhappiness, why would anyone take the risk of action at all?  Remarkably, Arendt finds the remedy to this predicament within the faculty of action itself. The antidote for irreversibility is forgiveness, which "serves to undo the deeds of the past" by releasing actors from the consequences of their actions.


The beauty of forgiveness is that it interrupts otherwise automatic processes. For example, forgiveness enables actors to become freed from vengeance, “which encloses both doer and sufferer in the relentless automatism of the action process, which by itself need never come to an end.” Within the space created by the interruption, forgiveness creates a new relationship that is radically different from what existed before.

As something startlingly new, forgiveness is not conditioned by the wrong that provokes it and it can therefore never be predicted. Arendt admits as much. She explains, “forgiving, in other words, is the only reaction which does not merely re-act but acts anew and unexpectedly.”  Released from vengeful reactions, I can act in ways that are not predetermined or compelled by another's trespasses against me. In this sense, forgiveness is an unanticipated, uncaused and undetermined act; it is truly spontaneous. Arendtian forgiveness seems to take on a metaphysical stature; it appears to be able to change the nature of reality, undoing the irreversible. It acts against necessity, undoing what was done by releasing the doer from the deed.

In the last 60 years, notably in tribunals and reconciliation commissions characteristic of transitional justice, forgiveness has become a political and legal ideal in cases where massive moral injury threatens to extinguish human plurality and dignity. Seen as a willingness to continually participate in an imperfect world with civility, those willing to forgive demonstrate the ability to begin again not only despite the social facts of moral injury and misrecognition, but as Arendt teaches, also despite the ontological facts of irreversibility, contingency, and unpredictability. Victims who forgive, responding creatively rather than vindictively, are said to escape the vicious cycle of violence and to exemplify their moral agency.

What does forgiveness really do as a political tool? Arendt's forgiveness responds creatively to the fact of injury. What I’d like to suggest is that Arendt understands forgiveness as a cure for the irreversibility of action, not of violence. Unlike many contemporary (theological and secular) political views that see forgiveness as an act of compassion in response to atrocity, Arendt insists that forgiveness is an activity of politics.

Understood politically, forgiveness is about surviving the effects of irreversibility. Because linear time shapes human experience, irreversibility is unavoidable. Taking aim at what cannot be undone, forgiveness releases actors from what would otherwise become a mechanistic or routinized cycle of retaliation.


Arendt describes forgiveness as the act of constantly releasing the wrongdoer. Quoting Luke 17:3-4, she says “And if he trespass against thee…and…turn again to thee, saying, I repent; thou shalt release him.” If the wrongdoer shows signs of contrition or transformation, he should be released from the trespass.

In his essay about Arendt’s judgment of Eichmann, Roger Berkowitz argues that Arendt adopts the language of release or dismissal (which I find very similar to Nietzsche's understanding of forgetting) in order to characterize the action of forgiveness, a move that greatly limits the scope or reach of forgiveness. Berkowitz explains,

Arendt critically limits the province of forgiveness to minor trespasses… As she notes, the Greek word in the Gospels traditionally translated as “forgiveness” is aphienai, which Arendt suggests means to “dismiss” and “release” rather than “forgive.” As a release, Arendt’s defense of forgiveness does not reach the forgiving of crimes and sins. Instead, forgiveness is limited to the “constant mutual release” that allows men to continue to act in the world.

People can release each other, but the capacity as denoted by the original Greek amounts to dismissal rather than pardon or exoneration.

Whereas forgiveness releases, its opposite, vengeance, binds people to the past crime and to the process of reaction. Vengeance, unlike forgiveness, is not creative of new possibilities for action. Instead, it “acts in the form of re-acting against an original trespassing, whereby far from putting an end to the consequences of the first misdeed, everybody remains bound to the process.” But note that it is this deterministic character that threatens the sphere of action and that morphs a trespass into an unforgivable crime. The magnitude of the crime is a necessary, but not sufficient, condition for crimes against plurality.

Arendt explains in Eichmann in Jerusalem that, unlike the common imperialist tactic of legalized discrimination, the war crimes committed by totalitarianism gave rise to the unprecedented:

It was when the Nazi regime declared that the German people not only were unwilling to have any Jews in Germany but wished to make the entire Jewish people disappear from the face of the earth that the new crime, the crime against humanity—in the sense of a crime “against human status,” or against the very nature of mankind—appeared.

She continues,

Expulsion and genocide must remain distinct; the former is an offense against fellow-nations, whereas the latter is an attack upon human diversity as such, that is, upon a characteristic of the ‘human status’ without which the very words ‘mankind’ or ‘humanity’ would be devoid of meaning.

Arendt described such actions as those which “transcend the realm of human affairs and the potentialities of human power, both of which they radically destroy wherever they make their appearance.” Eichmann’s actions destroyed human potentiality. Arendt cannot forgive such crimes.


This is our first clue that the offences to which forgiveness responds are within the reach of dismissal, whereas crimes against the human status are not. Moreover, forgiveness releases those who "unknowingly" transgressed. The predicament of action is that people cannot know the consequences of their actions (action is unpredictable). When the act is intended to harm, the law calls for punishment. It would be a mistake therefore to think that Arendtian forgiveness is intended to cure anything outside the realm of action.

It is striking that, in her judgment in Eichmann in Jerusalem, Arendt did not refer to the concept of forgiveness as it is developed in The Human Condition. And yet Arendt wasn't attempting to create a complete system of concepts across her work. As her views changed, her concepts also shifted. But having the limits of Arendt's forgiveness in mind can, I think, nonetheless help us understand her judgment against Eichmann. Because Eichmann’s decisions and rule following annihilated spontaneity and plurality, he cannot be released from his deeds.

-Grace Hunt


Reading Like a Writer

How does one read closely on the internet? I ask this question as I prepare to co-host a series of conversations on “Blogging and the New Public Intellectual” with my Bard colleague and blogger Walter Russell Mead. What we hope to explore in these talks with bloggers and writers-who-blog is the impact that blogging, tweeting, and online writing are having and will have on our public culture of thinking.

Our first guest in the series is Francine Prose, author of 16 novels and numerous essays and non-fiction books, not to mention a children’s book. Prose also teaches as a Visiting Professor of Literature at Bard, and she blogs for the New York Review of Books.


I first sought out Francine Prose years ago because I kept hearing amazing things from students about her class, “Literature, Language and Lies.” I was captivated by the course description:

Throughout history, written language has been used to create masterpieces and to pump out propaganda, to delight and delude, to reveal and obscure the truth. But unless we read closely--word by word, line by line, sentence by sentence--it can sometimes be hard to tell the difference. In this class, we will close-read the short stories of great writers (James and Joyce, Cheever and Chekhov, Mansfield and O'Connor, Beckett and Bowles, etc.) as well as this week's issue of The New Yorker and today's copy of The New York Times as we look at the ways in which words are used to convey information and insight, to transmit truth and beauty, and to form and transform our vision of the world.

My own courses focus on close readings of books and often I teach an entire course on one book that we read slowly and carefully. I teach a course on Plato’s Republic, another on Kant’s Groundwork of the Metaphysics of Morals, one on Nietzsche’s Birth of Tragedy, and then courses on Arendt’s The Origins of Totalitarianism and her The Human Condition. In these classes, students meditate on single words for an entire period, sometimes for a week. We pay attention to metaphors and allusions, deepening our understanding of the full work by tarrying with individual parts. There is a tradition of teaching this way in philosophy and also in political theory, but one rarely reads the New York Times that way, and Prose’s course struck me as deeply provocative.

I recently picked up and re-read parts of Francine Prose’s Reading Like a Writer, her book full of examples of the kind of slow and painstaking reading I imagine she teaches in her course. It is full of careful and powerful sentences that remind me of what writing can and should be:

And as I wrote, I discovered that writing, like reading, was done one word at a time, one punctuation mark at a time. It required what a friend calls “putting every word on trial for its life”: changing an adjective, cutting a phrase, removing a comma, and putting the comma back in.

It is a book composed of readings of excerpts from texts; there are beautiful meditations on the richness of certain words and examples of the power of sentences as well as the expressiveness of gestures. Prose celebrates revision, editing, and craftsmanship. She points out how to read and shows that reading is training for thinking and writing. Of course, she can make one feel guilty for not reading with care and for writing too quickly. She admonishes at one point:

With so much reading ahead of you, the temptation might be to speed up. But in fact it’s essential to slow down and read every word. Because one important thing that can be learned by reading slowly is the seemingly obvious but oddly underappreciated fact that language is the medium we use in much the same way a composer uses notes, the way a painter uses paint.

As Walter Russell Mead and I conceived our series of discussions on the impact of blogging, inviting Francine Prose made great sense. Blogging offers many things, but one thing it does not promote is the kind of slow, word by word, sentence by sentence reading that Prose defends. Not only does it not promote such reading on behalf of readers, but also for bloggers themselves, who are under incredible pressure to post frequently and quickly. There are different kinds of blogs, of course, but the most popular blogs all post multiple items every day and compete to break new stories quickly. Speed is part of the blogger’s world. And yet, even Prose is blogging today.


The need for speed in blogs is less true for cultural blogs, like the NYRB blog (or even the Hannah Arendt Center blog, where we don’t usually rush posts out to beat a news cycle). And yet even here one of the advantages of blogs is their informality. Blog posts do not typically go through the process of editing and revision of essays in a conventional journal. While we do edit some of our blog posts especially for first-time or new writers, the editing process is quick and informal. There is not the usual relationship between a writer and editor that can seek to hone an essay over weeks or months. Blogs are fun, often short, and easy to read. Perhaps they can attract wider audiences and rile the waters more than crafted essays, which are often toned down by editors who lop off the ragged edges. While blogs offer much, they are not honed with the precision of a full-blown essay to be published in a popular magazine or an academic journal. In short, the increasing prevalence and influence of blogs suggests a threat to both the reading and writing for which Prose is such an advocate.

For this weekend, put down the computer and pick up Francine Prose’s Reading Like a Writer. And then come join Francine Prose, Walter Russell Mead, and myself for a discussion of “Blogging and the New Public Intellectual” on Tuesday, March 5, at 6:30 pm at the Bard Graduate Center (38 West 86th St) in NYC. You can RSVP here.



Infinitely Intoxicating

Louis Pasteur once wrote:

I see everywhere in the world, the inevitable expression of the concept of infinity…. The idea of God is nothing more than one form of the idea of infinity. So long as the mystery of the infinite weighs on the human mind, so long will temples be raised to the cult of the infinite, whether it be called Brahma, Allah, Jehovah, or Jesus…. The Greeks understood the mysterious power of the hidden side of things. They bequeathed to us one of the most beautiful words in our language—the word ‘enthusiasm’—En Theos—“A God Within.” The grandeur of human actions is measured by the inspiration from which they spring. Happy is he who hears a god within, and who obeys it. The ideals of art, of science, are lighted by reflection from the infinite.

To bear a god within is not an easy task for us mortals. The god within—even more so than the god without—demands to be obeyed. Having a god inside us—or, like Socrates, a daimon on our shoulder—is no recipe for happiness.

It can lead to unbearable obligation and even to martyrdom. And, if the god is a muse, it can lead to the travails of the artist.

All great art and all great artists are consumed by the infinite. As Oscar Wilde once wrote, “We are all in the gutter, but some of us are looking up at the stars.” Those are the artists, the ones who amidst the muck feel part of something higher, something everlasting, the infinite.

The great enemy of the infinite is reason. Reason is calculating. It is rational. It is logical. It insists that everything is knowable and comprehensible. Ends justify means. And means can achieve ends. Reason insists on explanation. The self—the mystery—must be made knowable.

David Brooks in the NY Times today lauds the entry of behavioral psychology into politics and policy. We want to know, he writes, how to get people to vote and how to get Congress to cut the deficit. If science can tell us what to put in their drinking water, how to frame the question, what books to read to them in utero, or how to rewire their brains to be rational, wouldn’t that make policy all the more reasonable? Wouldn’t that be a good thing?

Science can make us more rational. That of course is the dream of people like Ray Kurzweil as well as the social scientists who insist that humans can be studied like rats. Let’s not object to the fact. We can be studied like rats, and that is what university social science departments around the country and the world are doing every day. This research is eminently useful, as Brooks rightly remarks. If we employ it, we can be made to be more reasonable.

What the rationalization of humanity means, however, is not a question science can answer. Max Weber began the study of the rationalization of mankind when he proposed that the rise of the enlightenment and the age of reason was bringing about an “Entzauberung” or a “de-magicification” of the world. Capitalism emerged at this time for a number of reasons, but one main reason, Weber understood, was that capitalism provided in the profit motive rational and objective criteria for measuring human endeavors. The problem, as Weber so well understood, is that the elevation of reason and rationality brought about the devaluation of all highest values—what Nietzsche would call nihilism. This is because reason, derived from ratio, is always a relation. All values are relative. In such a world, nothing is infinite. Stuck amidst the relations of means and ends, everything is a calculation. All is a game. There is no purpose or meaning to the game of life. As we become more rational, we also become less consumed by the infinite. That is the true danger of the rise of the social sciences and our rationality-consumed culture that insists that all human behavior be made understandable so that it can be made better.

In The Human Condition, Hannah Arendt is concerned with the way that the rise of reason and rationality is challenging the quintessence of the human condition—at least as that human condition has been experienced and known since the dawn of humanity. The rise of the social sciences, she writes over and over, is subjecting the mystery and fecundity of human action to the law of large numbers. While each and every human action may in itself be surprising and mysterious, it is nevertheless true that studied in groups and analyzed over time, human action does fall into comprehensible patterns. The more we study and know these patterns, the more we come to think of humans as predictable animals rather than surprising and spontaneous selves. This sociological and psychological reduction of man to animal is very much at the heart of what Arendt is opposing in her book.

Nowhere is the rationality of our times more visible than in the victory of labor and the marginalization of art. We are, all of us, laborers today. That is why the first question we ask others we meet is: What do you do?  Our labor defines us. It gives our lives meaning in that it assigns us a use and a value. Even professors, judges, and presidents now say regularly: this is my job. By which we mean, don’t blame us for what we do. Don’t hold me to some higher standard. Don’t expect miracles. It is our job to do this. We do this to make a living.

The one group in society who is at times excepted from this reduction to labor is artists. But even the artist today is taken less and less seriously. Insofar as artists are enthusiasts consumed with the infinite, they are ignored or viewed as marginal. Art is reduced to playfulness. A hobby. “From the standpoint of ‘making a living,’ every activity unconnected with labor becomes a ‘hobby.’” And those artists who are taken seriously, whose work is bought and sold on the art market, turn artistic work into the job of making a living.

Art, Arendt writes, is a process of magic. Citing a poem by Rainer Maria Rilke, she insists that the magic of art is the artist’s transfiguration of something ordinary—the canvas, clay or word—into something extraordinary, an expression of the infinite in the finite world of things.

Because art figures the infinite, poetry is the “most human” of the arts and the art that “remains closest to the thought that inspired it.” The poem, of all artworks, is the most lasting because its medium is the least subject to decay. It is the closest expression of the infinite we humans possess.

Ralph Waldo Emerson, whose resonance with Arendt in so many things has been too infrequently remarked, agrees that poetry is the art form in which the individual artist can access and figure in the world a public and common truth. In “The Poet,” Emerson writes:

It is a secret which every intellectual man quickly learns, that beyond the energy of his possessed and conscious intellect, he is capable of a new energy (as of an intellect doubled on itself ), by abandonment to the nature of things; that, beside his privacy of power as an individual man, there is a great public power on which he can draw by unlocking, at all risks, his human doors and suffering the ethereal tides to roll and circulate through him: then he is caught up into the life of the universe; his speech is thunder; his thought is law, and his words are universally intelligible as the plants and animals. The poet knows that he speaks adequately, then, only when he speaks somewhat wildly, or, “with the flower of the mind”; not with the intellect used as an organ but with the intellect released from all service…inebriated by nectar. As the traveler who has lost his way throws his reins on his horse’s neck and trusts to the instinct of the animal to find his road, so must we do with the divine animal who carries us through this world. For if in any manner we can stimulate this instinct, new passages are opened for us into nature, the mind flows into and through things hardest and highest, and the metamorphosis is possible. This is the reason why bards love wine, mead, narcotics, coffee, tea, opium, the fumes of sandalwood and tobacco, or whatever other species of animal exhilaration. All men avail themselves of such means as they can to add this extraordinary power to their normal powers, and to this end they prize conversation, music, pictures, sculpture, dancing, theaters, traveling, wars, mobs, fires, gaming, politics, or love, or science, or animal intoxication, which are several coarser or finer quasi-mechanical substitutes for the true nectar, which is the ravishment of the intellect by coming nearer to the fact.

I take this quotation from Emerson’s “The Poet” from an exceptional recent essay by Sven Birkerts. The essay appears in the latest issue of Lapham’s Quarterly, an entire issue devoted to the merits of and need for inebriation.

As Birkerts writes:

For Emerson, the intoxication is not escape but access, a means of getting closer to “the fact,” which might, with heartfelt imprecision, be called life itself. What he means by “public power,” I think, is something like what Carl Jung and others later meant by the phrase collective unconscious, the emphasis falling on the unconscious, that posited reservoir of our shared archetypes and primordial associations—that which reason by itself cannot fathom, for it is, in essence, antithetical to reason.

Birkerts reflects not only on the need for inebriation in the pursuit of artistic infinity, but also on the decreasing potency of intoxicants today. For him, the rise of the mass market in art, the globalization of experience, and the accessibility of all information have made the world smaller, knowable, and accountable. What is lost in such access is precisely the portal to the infinite.

Artistically and in almost every other way ours has become a culture of proliferation. Information, perspectives, as well as the hypercharged clips and images of our global experience are within the radius of the keystroke. Nothing is unspoken, nothing is unaccounted. Every taste is given a niche and every niche is catered to. Here, one might argue, is more material than ever; here are opportunities for even greater acts of synthesis. But I am skeptical. Nietzsche wrote in Thus Spoke Zarathustra, “Nothing is true, everything is permitted.” The temptation is to invert the phrases and ascribe causality: where everything is permitted, nothing is true. Where nothing is true, where is the Emersonian fact to be found? This bears directly on the artist’s task. The idea that writers can keep producing grandly synthesizing or totalizing work—that has the ring of truth, of mattering—is debatable.

Birkerts’s essay may not be the intoxicant of your choice this weekend, but it should be. It is your weekend read. And you might check out the surprising selection at the bar at Lapham’s Quarterly as well.

And for those with time to spare: Arthur Koestler, from whom I first learned of the Louis Pasteur quote at the top of this essay, was consumed with the connection between intoxication and the infinite. I have discussed Koestler’s pursuit of the infinite at length. You can read that discussion here.



The “E” Word, Part Two

This Weekend Read is Part Two of “The ‘E’ Word,” a continuing series on “elitism” in the United States educational system. Read Part One here.

Peter Thiel has made headlines offering fellowships to college students who drop out to start a business. One of those Thiel fellows is Dale Stephens, founder of Uncollege. Uncollege advertises itself as radical. At the top of its website, Uncollege cites a line from the movie "Good Will Hunting":

You wasted $150,000 on an education you coulda got for a buck fifty in late charges at the public library.

The Uncollege website is filled with one-liners extolling life without college. It can be and often is sophomoric. And yet, there is something deeply important about what Uncollege is saying. And its message is resonating. Uncollege has been getting quite a bit of attention lately, part of a cultural obsession with college dropouts and a growing skepticism about the value of college.

At its best, Uncollege does not simply dismiss college as an overpriced institution seeking to preserve worthless knowledge. Rather, Uncollege claims that college has become too anti-intellectual. College, as Uncollege sees it, has become conventional, bureaucratic, and not really dedicated to learning. In short, Uncollege criticizes college for not being enough like college should be. Hardly radical, Uncollege trades rather in revolutionary rhetoric in the sense that Hannah Arendt means the word revolution: a return to basic values. In this case, Uncollege is of course right that colleges have lost their way.

At least, that is what I find interesting about Uncollege.

To actually read its website and the recent Uncollege Manifesto by Dale Stephens is to encounter something different. The first proposition Uncollege highlights has little to do with education and everything to do with economics. It is the decreasing value of a college education.

The argument that college has ever less value will seem counterintuitive to those captivated by all the paeans to the value of college and the increased earning potential of college graduates. But Uncollege certainly has a point. Currently about 30% of the U.S. adult population has a degree. But among 20-24 year olds, nearly 40% have a college degree. And the Obama administration aims to raise that number to 60% by 2020. Uncollege calls this Academic Inflation. As more and more people have a college degree, the value of that degree will decrease. It is already the case that many good jobs require a Masters or a Ph.D. In short, the monetary value of the college degree is diminished and diminishing. This gives us a hint of where Uncollege is coming from.

The Uncollege response to the mainstreaming of college goes by a number of names. At times it is called unschooling. Unschooling is actually a movement begun by the legendary educator John Holt. I recall reading John Holt’s How Children Learn while I was in high school—a teacher gave it to me. I was captivated by Holt’s claim that school can destroy the innate curiosity of children. I actually wrote my college application essay on Holt’s educational philosophy and announced to my future college that my motto was Mark Twain’s quip, “I never let school interfere with my education”—which is also a quotation prominently featured in the Uncollege Manifesto.

Unschooling—as opposed to Uncollege—calls for students to make the most of their courses, coupling those courses with independent studies, reading groups, and internships. I regularly advise my students to take fewer not more courses. I tell them to pick one course each semester that most interests them and pursue it intently. Ask the professor for extra reading. Do extra writing. Organize discussion groups about the class with other students. Go to the professor’s office hours weekly and talk about the ideas of the course. Learners must become drivers of their education, not passive consumers. Students should take their pursuit of knowledge out of the classroom, into the dining halls, and into their dorms.

Uncollege adds that unschooling or “hacking your education” can be done outside of schools and universities. With Google, public libraries, and free courses from Stanford, MIT, and Harvard professors proliferating on the web, an enterprising student of any age can compose an educational path today that is more rigorous than anything offered “off-the-shelf” at a college or university. I have no problem with online courses. I hope to take a few. But it is a mistake to think that systems of massive information delivery are the same thing as education.

What Uncollege offers is something more and something less wholesome than simply a call for educational seriousness. It packages that call with the message that college has become boring, conventional, expensive, and unnecessary. In the Uncollege world, only suckers pay for college. The Uncollege Manifesto promotes “Standing out from the other 6.7 billion”; it derides traditional paths, pointing out that “5,000 janitors in the United States have Ph.Ds.”; and it cautions, “If you are content with life and education you should probably stop reading… You shall fit in just fine with society and no one will ever require you to be different. Conforming to societal standards is the easy and expected path. You are not alone!”

At the core of the Uncollege message is that dirty and yet all-so-powerful little word again: “elitism.” Later in the Uncollege Manifesto we are told that young people have a choice between “real accomplishments” and the “easy path to mediocrity”:

To succeed without a college degree you will have to build your competency and reputation through real world accomplishments. I am warning now: this is not going to be easy. If you want to take the easy path to mediocrity, I encourage you to go to college and join the masses. If you want to stand out from the crowd and change the world, Uncollege is for you!

At one point, the Uncollege Manifesto lauds NPR’s “This I Believe” series and commends these short 500-word essays on personal credos. But Uncollege adds a twist: instead of writing what one believes, it advises its devotees to write an essay answering the question: “What do you believe about the world that most others reject?” It is not enough simply to believe in something. You must believe in something that sets you apart and makes you different.

Uncollege is at least suggesting that it might be cool, as it has not been for 50 years, to aim for excellence and to yearn to be different. In short, Uncollege is calling upon students at elite institutions to boldly grab the ring of elitism and actively seek to stand outside and above the norm. And it is saying that education is no longer elite, but conventional.

It is hard not to see this embrace of elitism as refreshing, although no doubt many will scream the “e” word. I have often lectured to students at elite institutions and confronted them with their fear of elitism. They, or someone, spend upwards of $200,000 on an education, not to mention four years of their lives, and then they reject the entire premise of elitism: that they are different or special. By refusing to see themselves as members of an elite, these students too often refuse to accept the responsibility of elites: to mold and preserve societal values and to assume leadership roles in society.

Leading takes courage. In Arendtian terms, it requires living a public life where one takes risks, acts in surprising ways, and subjects oneself to public judgment. Leading can be uncomfortable and dangerous, and it is often more comfortable and fun to pursue one’s private economic, familial, and personal dreams. Our elite colleges have become too much about preparing students for private success rather than launching young people into lives of public engagement. And part of that failure is a result of a retreat from elitism and a false humility that includes an easy embrace of equality.

That Uncollege is selling its message of excellence and elitism to students at elite institutions of higher learning is simply one sign of how mainstream and conformist many of these elite institutions have become. But what does Uncollege offer the elite students who drop out and join it?

According to its website, Uncollege is selling “hackademic camps” and a “gap year program” designed to teach young people how to create their own learning plans. The programs come with living-abroad programs and internships. Interestingly, these are all programs offered by most major universities and colleges. The difference is money and time. For $10,000 in just one year, you get access to mentors, get pushed to write op-eds, and receive the “opportunity to work at hot Silicon Valley startups, some of them paid positions.” In the gap year program, participants will also “build your personal brand. Speak at a conference. Write an op-ed for a major news outlet. Build a personal website.”

None of this sounds radical, intellectual, or all that elitist. On the contrary, it claims that young people have little to learn from educators. Teachers are unimportant, to be replaced by worldly mentors. The claim is that young people lack nothing but information and access in order to compete in the world.

What Uncollege preaches often has little to do with elitism or intellectual growth. It is a deeply practical product being sold as an alternative to the cost of college. In one year and for one-twentieth of what a four-year elite college education costs, a young person can get launched into the practical world of knowledge workers, hooked up with mentors, and set into the world of business, technology, and media. It is a vocational training program for wannabe elites, training people to leap into the creative and technology fields and compete with recent college graduates but without the four years of studying the classics, the debt, and the degree. The elitism that Uncollege is selling is an entrepreneurial elitism measurable by money. By appealing to young students’ sense of superiority, ambition, and risk-taking, Uncollege stands a real chance of attracting ambitious young people more interested in a good job and a hot career than in reading the classics or studying abstract math.

Let’s stipulate this is a good thing. Not everybody should be going to liberal arts colleges. People unmoved by Nietzsche, Einstein, or Titian who are then forced to sit through lectures, cram for exams, and pull all-nighters writing papers cribbed from the internet are wasting their time and money on an elite liberal arts education. What is more, they bring cynicism into an environment that should be fired by idealism and electrified by passion. For those who truly believe that it is important in the world to have people who are enraptured by Sebald and transformed by Arendt, it is deeply important that the liberal arts college remain a bastion apart, a place where youthful exuberance for the beautiful and the true can shine clearly.

We should remember, as well, that reading great books and studying Stravinsky is not an activity limited to the academy. We should welcome a movement like Uncollege that frees people from unwanted courses but nevertheless encourages them to pursue their education on their own. Yes, many of these self-educated strivers will acquire idiosyncratic readings of Heidegger or strange views about patriotism. But opinions, even divergent ones, are the essence of a human political system.

One question we desperately need to ask is whether having a self-chosen minority of people trained in the liberal arts is important in modern society. I teach in an avowedly liberal arts institution precisely because I fervently believe that such ideas matter and that having a class of intellectuals whose minds are fired by ideas is essential to any society, especially a democracy.

I sincerely hope that the liberal arts and the humanities persist. As I have written,

The humanities are that space in the university system where power does not have the last word, where truth and beauty as well as insight and eccentricity reign supreme and where young people come into contact with the great traditions, writing, and thinking that have made us who we are today. The humanities introduce us to our ancestors and our forebears and acculturate students into their common heritage. It is in the humanities that we learn to judge the good from the bad and thus where we first encounter the basic moral faculty for making judgments. It is because the humanities teach taste and judgment that they are absolutely essential to politics. It is even likely that the decline of politics today is profoundly connected to the corruption of the humanities.

Our problem today is that college is caught between incompatible demands: to spark imaginations and idealism and to prepare young people for employment and success. For a long while now colleges have been doing neither of these things well. Currently, the political pressure on colleges is to cut costs and become more efficient. The unspoken assumption is that colleges must more cheaply and more quickly prepare students for employment. Those of us who care about college as an intellectual endeavor should welcome new alternatives to college like internet courses, vocational education, and Uncollege that will pull away young people for whom college would have been the wrong choice. Maybe, under the pressure of Uncollege, colleges will return to their core mission of passionately educating young people and preparing them for lives of civic engagement.

I encourage you this weekend to read the Uncollege Manifesto. Let me know what you think.



What is a Fact?

What is a fact? Few more thorny questions exist. Consider this, from Hannah Arendt’s essay “Truth and Politics”:

But do facts, independent of opinion and interpretation, exist at all? Have not generations of historians and philosophers of history demonstrated the impossibility of ascertaining facts without interpretation, since they must first be picked out of a chaos of sheer happenings (and the principles of choice are surely not factual data) and then be fitted into a story that can be told only in certain perspective, which has nothing to do with the original occurrence?

Facts are constructed. They are not objective. And there is no clear test for what is a fact. Thus, when Albert Einstein was asked how science can separate fact from fiction, brilliant hypotheses from nutty quackery, he answered: “There is no objective test.” Unlike rational truths that are true outside of experience and absolute, all factual truths are contingent. They might have been otherwise. That is one reason it is so hard to pin them down.

Steven Shapin reminds us of these puzzles in an excellent essay in this week’s London Review of Books. Shapin is reviewing a new book on Immanuel Velikovsky by Michael Gordin. Velikovsky, for those born since the 1960s, caused an uproar in the 1960s and 70s with his scientific claims that Venus was a piece dislodged from Jupiter; that a comet led to the parting of the Red Sea, dislodged the orbit of Mars (threatening Earth), and caused the relocation of the North Pole; not to mention that it showered onto the earth the plagues of vermin and the manna that nourished the Israelites in the desert.

Gordin’s book is about how American scientists went ballistic over Velikovsky. They sought to censor his work and schemed to prevent the publication of his book, Worlds in Collision, at the prestigious Macmillan press. At the center of the controversy was Harvard, where establishment scientists worked assiduously to discredit Velikovsky and stop the circulation of his ideas. [I am sensitive to such issues because I was also the target of such a suppression campaign. When my book The Gift of Science was about to be published by Harvard University Press, I received a call from the editor. It turns out an established scholar had demanded that HUP not publish my book, threatening to no longer review books for the press let alone publish with them. Thankfully, HUP resisted that pressure, for which I will always be grateful.]

For these Harvard scientists, Velikovsky was a charlatan peddling a dangerous pseudo science. The danger in Velikovsky’s claims was more than simple misinformation. His work amounted, above all, to an attack on the very essence of scientific authority. What Velikovsky claimed as science flew in the face of what the scientific community knew to be true. He set himself up as an outsider, a dissident. Which he was. In the wake of totalitarianism, he argued that democratic society must allow for alternative and heretical views. The establishment, Velikovsky insisted, had no monopoly on truth. Let all views out, and let the best one win.

Shapin beautifully sums up the real seduction and danger lurking in Velikovsky’s work.

The Velikovsky affair made clear that there were radically differing conceptions of the political and intellectual constitution of a legitimate scientific community, of what it was to make and evaluate scientific knowledge. One appealing notion was that science is and ought to be a democracy, willing to consider all factual and theoretical claims, regardless of who makes them and of how they stand with respect to canons of existing belief. Challenges to orthodoxy ought to be welcomed: after all, hadn’t science been born historically through such challenges and hadn’t it progressed by means of the continual creative destruction of dogma? This, of course, was Velikovsky’s view, and it was not an easy matter for scientists in the liberal West to deny the legitimacy of that picture of scientific life. (Wasn’t this the lesson that ought to be learned from the experience of science in Nazi Germany and Stalinist Russia?) Yet living according to such ideals was impossible – nothing could be accomplished if every apparently crazy idea were to be given careful consideration – and in 1962 Thomas Kuhn’s immensely influential Structure of Scientific Revolutions commended a general picture of science in which ‘dogma’ (daringly given that name) had an essential role in science and in which ‘normal science’ rightly proceeded not through its permeability to all sorts of ideas but through a socially enforced ‘narrowing of perception’. Scientists judged new ideas to be beyond the pale not because they didn’t conform to abstract ideas about scientific values or formal notions of scientific method, but because such claims, given what scientists securely knew about the world, were implausible. Planets just didn’t behave the way Velikovsky said they did; his celestial mechanics required electromagnetic forces which just didn’t exist; the tails of comets were just not the sorts of body that could dump oil and manna on Middle Eastern deserts. 
A Harvard astronomer blandly noted that ‘if Dr Velikovsky is right, the rest of us are crazy.'


It is hard to read this account and not think about contemporary debates over global warming, Darwinism, and the fall of the World Trade Center. In all three cases, outsiders and even some dissident scientists have made arguments that have been loudly disavowed by mainstream scientists.

No one has done more to explore the claims of modern pseudo science than Naomi Oreskes. In her book Merchants of Doubt, written with Erik Conway, Oreskes shows how “a small handful of men” could, for purely ideological reasons, sow doubt about the ‘facts’ regarding global warming and the health effects of cigarettes. In a similar vein, Jonathan Kay has chronicled the efforts of pseudo scientists to argue that there was no possible way the World Trade Towers could have been brought down by jet fuel fires, thus suggesting and seeking to “prove” that the U.S. government was behind the destruction of 9/11.

Oreskes wants to show, at once, that it is too easy for politically motivated scientists to sow doubt about scientific fact, and also that there is a workable and effective way for the scientific community to patrol the border between science and pseudo science. What governs that boundary is, in Oreskes’s words, “the scientific consensus.” The argument that global warming is a fact rests on claims about the scientific method: value-free studies, evaluated by a system of peer review, moving towards consensus. Peer review, for Oreskes, “is a crucial part of science.” And yet, as those who engage in it know full well, peer review is also deeply political, subject to petty and also not so petty disputes, jealousies, and vendettas. For this and other reasons, consensus is, as Oreskes herself admits, not always accurate: “The scientific consensus might, of course, be wrong. If the history of science teaches anything, it is humility, and no one can be faulted for failing to act on what is not known.”

Just as Einstein said 50 years ago, in the matter of establishing scientific fact there is no objective test. This is frustrating. Indeed, it can be dangerous, not only when pseudo scientists sow doubt about global warming, thus preventing meaningful and necessary action, but also because the pervasive and persuasive claims of pseudo science sow a cynicism that undermines the factual and truthful foundations of human life.

Arendt reminds us, with a clarity rarely equaled, that factual truth is always contingent. “Facts are beyond agreement and consent, and all talk about them—all exchanges of opinion based on correct information—will contribute nothing to their establishment.” Against the pseudo scientific claims of many, science is always a contingent and hypothetical endeavor, one that deals in hypotheses, agreement, and factual proof. Scientific truth is always empirical truth and the truths of science are, in the end, grounded in consensus.

The trouble here is that scientific truths must—as scientific—claim to be true and not simply an opinion. Science makes a claim to authority that is predicated not upon proof but on the value and meaningfulness of impartial inquiry. It is a value that is increasingly in question.

What the challenge of pseudo science shows is how tenuous scientific authority and the value placed on disinterested research really are. Such inquiry has not always been valued, and there is no reason to expect it to be valued above partial inquiry in the future. Arendt suggests that the origin of the value of disinterested inquiry was Homer’s decision to praise the Trojans as highly as he lauded the Achaeans. Never before, she writes, had one people been able to look “with equal eyes upon friend and foe.” It was this revolutionary Greek objectivity that became the source for modern science. For those who do value science and understand the incredible advantages it has bestowed upon modern civilization, it is important to recall that Homeric disinterestedness is neither natural nor necessary. In the effort to fight pseudo science, we must be willing and able to defend just such a position, and thus what Nietzsche calls the “pathos of distance” must be central to any defense of the modern scientific world.

When science loses its authority, pseudo science thrives. That is the situation we are increasingly in today. There are no objective tests and no clear lines demarcating good and bad science. And that leaves us with the challenge of the modern age: to pursue truth and establish facts without secure or stable foundations. For that, we need reliable guides whom we can trust. And for that reason, you should read Steven Shapin’s latest essay. It is your weekend read.



A Sorry Bunch of Dwarfs

Freeman Dyson, the eclectic physicist, took good aim at philosophy last week in a review of the silly book by Jim Holt, Why Does the World Exist? An Existential Detective Story. Holt went around to "a portrait gallery of leading modern philosophers" and asked them the Leibnizian question: "Why is there something rather than nothing?" The book offers their answers, along with biographical descriptions.

For Dyson, Holt's book "compels us to ask" these "ugly questions." First, "When and why did philosophy lose its bite?" Philosophers were once important. In China, Confucius and his followers made a civilization. So too in Greece did Socrates and then the schools of Plato and Aristotle give birth to the western world. In the Christian era Jesus and Paul, then Augustine and Aquinas, granted depth to dominant worldviews. Philosophers like Descartes, Hobbes, and Leibniz were central figures in the scientific revolution, and philosophical minds like Nietzsche, Heidegger, and Arendt (even if one was a philologist and the other two refused the name philosopher) have become central figures in the experience of nihilism. Against these towering figures, the "leading philosophers" in Holt's book cut a paltry figure. Here is Dyson:

Holt's philosophers belong to the twentieth and twenty-first centuries. Compared with the giants of the past, they are a sorry bunch of dwarfs. They are thinking deep thoughts and giving scholarly lectures to academic audiences, but hardly anybody in the world outside is listening. They are historically insignificant. At some time toward the end of the nineteenth century, philosophers faded from public life. Like the snark in Lewis Carroll's poem, they suddenly and silently vanished. So far as the general public was concerned, philosophers became invisible.

There are many reasons for the death of philosophy, some of which were behind Hannah Arendt's refusal to call herself a philosopher. Philosophy was born, at least in its Platonic variety, from out of the thinker's reaction to the death of Socrates. Confronted with the polis that put the thinker to death, Plato and Aristotle responded by retreating from the world into the world of ideas. Philosophical truth separated itself from worldly truths, and idealism was born. Realism was less a return to the world than a reactive fantasy against idealism. In both, the truths that were sought were otherworldly truths, disconnected from the world.

Christianity furthered the divorce of philosophy from the world by imagining two distinct realms, the higher realm existing beyond the world. Science, too, taught that truth could only be found in a world of abstract reason, divorced from real things. Christianity and science together gave substance to the philosophical rebellion against the world. The result, as Dyson rightly notes, is that philosophy today is as unworldly and irrelevant as it is abstract and profound.

What Dyson doesn't explore is why philosophers of the past had such importance, even as they also thought about worlds of ideas. The answer cannot be that ideas had more import in the past than now. On the contrary, we live in an age more saturated in ideas than any other. More people today are college educated, literate, and knowledgeable of philosophy than at any period in the history of the world. Books like Holt's are proof positive of the profitable industry of philosophical trinkets. That is the paradox: at a time when philosophy is read by more people than ever, it matters less than ever before.

One explanation for this paradox is nihilism: the devaluing or revaluing of the highest values. The truth about truth turned out to be neither so simple nor singular as the philosophers had hoped. An attentive inquiry into the true and the good led not to certainty, but to ideology critique. For Nietzsche, truth, like the Christian God, was a human creation, and the first truth of our age is that we recognized it as such. That is the precondition for the death of God and the death of truth. Nihilism has not expunged ideas from our world, but multiplied them. When speaking about the "true" or the "good" or the "just," Christians, Platonists, and moralists no longer have the stage to themselves. They must now shout to be heard amongst the public relations managers, advertisers, immoralists, epicureans, anarchists, and born-again Christians.

Dyson ignores this strain of philosophy. He does point out that Nietzsche was the last great philosopher, but then dismisses Heidegger, who "lost his credibility in 1933," and even Wittgenstein, who, if a woman attended his lectures, would remain silent until she left. And yet it is Heidegger who has given us the great literary masterpieces of twentieth-century philosophy.

His work on technology ("The Question Concerning Technology") and art ("The Origin of the Work of Art") has been widely read in artistic, literary, and lay circles. It is hard to imagine a philosopher more engaged with science and literature than Heidegger was. He read physics widely, co-taught courses at the house of the Swiss psychiatrist Medard Boss, and taught seminars with the German novelist Ernst Jünger.

It seems worthwhile to end with a poem of Heidegger's from his little book, Aus der Erfahrung des Denkens/From Out of the Experience of Thinking:

Drei Gefahren drohen dem Denken
Die gute und darum heilsame Gefahr ist die Nachbarschaft des singenden Dichters.
Die böse und darum schärfste Gefahr ist das Denken selber. Es muß gegen sich selbst denken, was es nur selten vermag.
Die schlechte und darum wirre Gefahr ist das Philosophieren.

Three dangers threaten thinking.
The good and thus healthy danger is the nearness of the singing poet.
The evil and thus sharpest danger is thinking itself. It must think against itself, something it can do only rarely.
The bad and thus confusing danger is philosophizing.



Does the President Matter?

“Hence it is not in the least superstitious, it is even a counsel of realism, to look for the unforeseeable and unpredictable, to be prepared for and to expect “miracles” in the political realm. And the more heavily the scales are weighted in favor of disaster, the more miraculous will the deed done in freedom appear.”

—Hannah Arendt, "What is Freedom?"

This week at Bard College, in preparation for the Hannah Arendt Center Conference "Does the President Matter?", we put up two writing blocks around campus, multi-paneled chalkboards that invite students to respond to the question: Does the President Matter? The blocks generated quite a few interesting comments. Many mentioned the Supreme Court. Quite a few invoked the previous president, war, and torture. And, since we are at Bard, others responded: it depends on what you mean by matters.

This last comment struck me as perceptive. It does depend on what you mean by matters.

If what we mean is, say, a concentration of power in a democratic leader unseen since the era of enlightened monarchy, then the president does matter. We live in an age of an imperial presidency. The president can, and does, send our troops into battle without the approval of Congress. The president can, and does, harness the power of TV, the Internet, and Twitter to bypass his critics and reach the masses more directly than ever before. The president can, and does, appoint Supreme Court justices with barely a whimper from the Senate; and the president's appointments can, and do, swing the balance on a prisoner's right to habeas corpus, a woman's right to choose, or a couple's right to marry.

And yet, what if by matter, we mean something else? What if we mean, having the power to change who we are in meaningful ways? What if by matter we mean: to confront honestly the enormous challenges of the present? What if by matter we mean: to make unpredictable and visionary choices, to invite and inspire a better future?

On the really big questions, our politics is silent: the thoughtless consumerism that degrades our environment and our souls; the millions of people who have no jobs and increasingly little prospect of productive employment; the threat of devastating terrorism; the astronomical national debt, $16 trillion and counting for the US, or $140,000 for each taxpayer; a deficiency in public pension obligations estimated at anywhere from $1 to $5 trillion; and the $1 trillion of inextinguishable student debt that is creating a lost generation of young people whose lives are stifled by unwise decisions made before they were old enough to buy a beer.

This election should be about a frank acknowledgement of the unsustainability of our economic, social, and environmental practices and expectations. We should be talking together about how to remake our future in ways that are both just and inspiring. This election should be scary and exciting. But so far it's small-minded and ugly.

Around the world, we witness distrust and disdain for government. In Greece there is a clear choice between austerity and devaluation; but Greek leaders have saddled their people with half-hearted austerity that causes pain without prospect of relief. In Italy, the paralysis of political leaders has led to resignation and the appointment of an interim technocratic government. In Germany, the most powerful European leader delays and denies, trusting that others will blink every time they are brought to the brink of the abyss.

No wonder that the Tea Party and Occupy Wall Street in the US, and the Pirate Parties in Europe, share a common sense that liberal democratic government is broken. A substantial—and highly educated—portion of the electorate has concluded that our government is so inept and so compromised that it needs to be abandoned or radically constrained. No president, it seems, is up to the challenge of fixing our broken political system.

Every president comes to Washington promising reform. And they all fail. According to Jonathan Rauch, a leading journalist for The Atlantic and the National Journal, this is inevitable. He has this to say in his book Government's End:

If the business of America is business, the business of government programs and their clients is to stay in business. And after a while, as the programs and the clients and their political protectors adapt to nourish and protect each other, government and its universe of groups reach a turning point—or, perhaps more accurately, a point from which there is no turning back. That point has arrived. Government has become what it is and will remain: a large, incoherent, often incomprehensible mass that is solicitous of its clients but impervious to any broad, coherent program of reform. And this evolution cannot be reversed.

On the really big questions of transforming politics, the President is, Rauch argues, simply powerless. President Obama apparently agrees. Just last week he said, in Florida: "The most important lesson I've learned is that you can't change Washington from the inside. You can only change it from the outside."

A similar sentiment is offered by Lawrence Lessig, a founding member of Creative Commons. In his recent book Republic, Lost, Lessig writes:

The great threat today is in plain sight. It is the economy of influence now transparent to all, which has normalized a process that draws our democracy away from the will of the people. A process that distorts our democracy from ends sought by both the Left and the Right: For the single most salient feature of the government that we have evolved is not that it discriminates in favor of one side and against the other. The single most salient feature is that it discriminates against all sides to favor itself. We have created an engine of influence that seeks not some particular strand of political or economic ideology, whether Marx or Hayek. We have created instead an engine of influence that seeks simply to make those most connected rich.

The system of influence and corruption through PACs, SuperPACs, and lobbyists is so entrenched, Lessig writes, that no reform seems plausible. All that is left is the Hail Mary idea of a new constitutional convention—an idea Lessig promotes widely, as with his Conference on the Constitutional Convention last year at Harvard.

For Rauch on the Right and Lessig on the Left, government is so concerned with its parochial interests and its need to stay in business that we have forfeited control over it. We have, in other words, lost the freedom to govern ourselves.

The question "Does the President Matter?" is asked, in the context of the Arendt Center conference, from out of Hannah Arendt's maxim that freedom is the fundamental raison d'être of politics. In "What is Freedom?", Arendt writes:

“Freedom is actually the reason that men live together in political organization at all. Without it, political life as such would be meaningless. The raison d’être of politics is freedom.”

So what is freedom? To be free, Arendt says, is to act: "Men are free as long as they act, neither before nor after; for to be free and to act are the same."

What is action? Action is something done spontaneously. It brings something new into the world. Man is the being capable of starting something new. Political action, and action in general, must happen in public. Like the performing arts—dance, theatre, and music—politics and political action require an audience. Political actors act in front of other people; they need spectators, so that the spectators can be drawn to the action. When the spectators find the doings of politicians right, or true, or beautiful, they gather around and form themselves into a polity. The political act, the free act, must be surprising if it is to draw people to itself. Only an act that is surprising and bold is a political act, because only such an act will strike others and make them pay attention.

The very word politics derives from the Greek polis, which is itself rooted in the Greek pelein, a verb used to describe the circular motion of smoke rings rising up out of a pipe. The point is that politics is the gathering of a plurality around a common center. The plurality does not become a singularity in circling around a polestar, but it does acknowledge something common, something that unites the members of a polity in spite of their uniqueness and difference.

When President Washington stepped down after his second term; when President Lincoln emancipated the slaves; when FDR created the New Deal; when President Eisenhower called the Arkansas National Guard into Federal Service in order to integrate schools in Little Rock; these presidents acted in ways that helped refine, redefine, and re-imagine what it means to be an American.

Arendt makes one further point about action and freedom that is important as it relates to the question: Does the President Matter? Courage, she writes, is "the political virtue par excellence." To act in public is to leave the security of one's home and enter the world of the public. Such action is dangerous, for the political actor might be jailed for his crime or even killed. Arendt's favorite example of political courage is Socrates, who was killed for his courageous engagement of his fellow Athenians. We must always recall that Socrates was sentenced to death for violating Athenian law.

Political action also requires courage because the actor can suffer a fate even worse than death. He may be ignored. At least to be killed for one's ideas means that one is recognized as capable of action, of saying and doing something that matters. To be ignored, however, denies the actor the basic human capacity for action and freedom.

One fascinating corollary of Arendt's understanding of the identity of action and freedom is that action, any action—any original deed, any political act that is new and shows leadership—is, of necessity, something that was not done before. It is, therefore, always against the law.

This is an insight familiar to readers of Fyodor Dostoevsky. In Crime and Punishment Raskolnikov says:

Let's say, the lawgivers and founders of mankind, starting from the most ancient and going on to the Lycurguses, the Solons, the Muhammads, the Napoleons, and so forth, that all of them to a man were criminals, from the fact alone that in giving a new law they thereby violated the old one.

All leaders are, in important ways, related to criminals. This is an insight Arendt shares with Nietzsche.

Shortly after we began to plan this conference, I heard an interview with John Ashcroft speaking on the Freakonomics Radio Show. He said:

"Leadership in a moral and cultural sense may be even more important than what a person does in a governmental sense. A leader calls people to their highest and best. ... No one ever achieves greatness merely by obeying the law. People who do above what the law requires become really valuable to a culture. And a President can set a tone that inspires people to do that."

My first reaction was: This is a surprising thing for the Attorney General of the United States to say. My second reaction was: I want him to speak at the conference. Sadly, Mr. Ashcroft could not be with us here today. But this does not change the fact that, in an important way, Ashcroft is right. Great leaders will rise above the laws in crisis. They will call us to our highest and best.

What Ashcroft doesn't quite say, and yet Arendt and Dostoevsky make clear, is that there is a thin and yet all-so-important line separating great leaders from criminals. Both act in ways unexpected and novel. In a sense, both break the law.

But only the leader's act shows itself to be right and thus re-makes the law. Hitler may have acted and shown a capacity for freedom; his action, however, was rejected. He was a criminal, not a legislator. Martin Luther King Jr. and Gandhi also broke laws in acts of civil disobedience. Great leaders show in their lawbreaking that the earlier law had been wrong; they forge a new moral and also written law through the force and power of moral example.

In what is perhaps the latest example of a presidential act of lawbreaking in the United States, President George W. Bush clearly broke both U.S. and international law in his prosecution of the war on terror. At least at this time, it seems painfully clear that his decision to systematize torture stands closer to a criminal act than to an act of great legislation.

In many ways presidential politics in the 21st century takes place in the shadow of George W. Bush's overreach. One result is that we have reacted against great and daring leadership. In line with the spirit of equality that drives our age, we ruthlessly expose the foibles, missteps, scandals, and failures of anyone who rises to prominence. Bold leaders are risk takers. They fail and embarrass themselves. They have unruly skeletons in their closets. They will hesitate to endure, and will rarely prevail in, the public inquisition that the presidential selection process has become.

The candidates who are inoffensive enough to prevail are branded by their consultants as pragmatists. Our current pragmatists are products of Harvard Business School and Harvard Law School. Mr. Romney loves data. President Obama worships experts. They are both nothing if not faithful to the doctrine of technocratic optimism: that with the right people in charge we can do anything. The only problem is that they refuse to tell us what it is they want to do. They have forgotten that politics is a matter of thinking, not a pragmatic exercise in technical efficiency.

Look at the Mall in Washington: the Washington Monument honors our first president, and there are memorials to Jefferson, Lincoln, and Franklin Delano Roosevelt. There is no monument to any president since FDR. And yet, just two years ago, we dedicated the Martin Luther King Jr. Memorial. It does not seem an accident that the leaders of the Civil Rights Movement were not politicians. The presidency no longer attracts our boldest leaders; they are not the people running for office.

Yet, people crave what used to be called a statesman. To ask: "Does the President Matter?" is to ask:  might a president, might a political leader, be able to transform our nation, to restore the dignity and meaning of politics? It is to ask, in other words, for a miracle.

At the end of her essay, "What is Freedom?", Hannah Arendt said this about the importance of miracles in politics.

Hence it is not in the least superstitious, it is even a counsel of realism, to look for the unforeseeable and unpredictable, to be prepared for and to expect “miracles” in the political realm. And the more heavily the scales are weighted in favor of disaster, the more miraculous will the deed done in freedom appear.

She continued:

It is men who perform miracles—men who because they have received the twofold gift of freedom and action can establish a reality of their own.

I don't know if the president matters.

But I know that he or she must. Which is why we must believe that miracles are possible. And that means we, ourselves, must act in freedom to make the miraculous happen.

In the service of the not-yet-imagined possibilities of our time, our goal over the two days of the conference was to engage in the difficult, surprising, and never-to-be-understood work of thinking, and of thinking together, in public, amongst others. We heard from philosophers and businessmen, artists and academics. The speakers came from across the political spectrum, but they shared a commitment to thinking beyond ideology. Such thinking is itself a form of action, especially so in a time of such ideological rigidity. Whether our meeting here at Bard gives birth to the miracle of political action—that is up to you. If we succeeded in thinking together, in provoking, and in unsettling, we have perhaps sown the seeds that will one day blossom into the miracle of freedom.


Watch Roger's opening talk from the conference, "Does the President Matter?", here.


Vain, Like a Butterfly

“Everything that is appears; everything that appears disappears; everything that is alive has an urge to appear; this urge is called vanity; since there is no urge to disappear and disappearance is the law of appearance, the urge, called vanity, is in vain. ‘Vanitas vanitatum vanitas’—all is vanity, all is in vain.”

-Hannah Arendt, Denktagebuch, 796

Arendt writes this entry in her Denktagebuch in September 1970. She is 63 years old and long familiar with the law of disappearance. For years the record of her thoughts has been interrupted by mention of the death of friends and mentors: May 1951 “[Hermann] Broch died on 30 May and was buried on 2 June 1951”; February 1969 “Jaspers dies”; November 1968: “Tonight I dreamed of Kurt Blumenfeld… in the dream I didn’t know that he was dead.” The following month the law would bear down again and she would write an entry beginning: “On 31 October Heinrich died…”. Within a little over four years of her husband’s death she would herself be gone.

Harmen Steenwijck, "Vanitas"

“Vanitas vanitatum vanitas.” This could be despair. It could be that dreadful thought that forces itself on us in moments of grief and anxiety, the thought that a life’s endeavor has been for naught, that all our achievements have turned out to be worthless. It could be the distress at the Nietzschean reflection that not only must we each die, but this human race and this earth will eventually disappear without trace. Perhaps it is the same as the horror Sophocles savors when he warns us: “Not to be born is, past all prizing, best; but, when man has seen the light, this is next best by far, that with all speed he should go thither, whence he hath come.”

It could also be frustration at the sheer urgency of the desire to rush into full view when thinking is always conducted in darkness and quiet, at a remove from the world. It might be a distaste, for instance, for glib self-promotion that stands in for political action on the part of candidates for public office, or for everything about the modern university that insists that “research” be published prematurely, rendering it hypocritical, superficial and irrelevant (Denktagebuch, 703).

Yet, though her frustration is real, and though she grieves, Arendt uses the word vanity without judgment. A few weeks ago Ian Storey introduced a “Quote of the Week” that came from the same late period of the Denktagebuch, and wrote movingly of the sense of ending that suffuses these last entries. (It’s beautiful and touching and well worth your while.) He writes also of the shades of Arendt’s response to our endedness, from bitter sadness to old contentment. In the same way, she reacts to the vanity of our beginnings both with an austere refusal of even the fantasy of immortality and with wonder that any of it came to be at all.

After all, no one asks to be born. No one demands to come into the world as if birth were a special favor, a privilege granted to some but not to others. We’re propelled into the light of day before we know it, by an urge that has nothing to do with ego and does not belong to us any more than it belongs to our parents or our species. We share it with everything alive. However, if we think of it as a great surging drive towards life or survival, it threatens to diminish thinking and overwhelm the senses as a great unfathomable force; if we think of it as a drive to appear it produces instead the refinement of difference and the delight of variegation.

In these same years Arendt reads about biology and studies up on the science of genetics. She reads the work of the philosophical zoologist Adolf Portmann, whose most remarkable studies concern the vast variety in the size, shape, and color of butterflies (The Beauty of Butterflies, 1951). Instead of submitting the phenomenon of this variety—and butterflies make up just one terrifically flamboyant example—to the demands of natural and sexual selection, as in the mainstream of evolutionary theory, Portmann identifies an Aristotelian desire to appear. Arendt adds to this an existential claim for recognition and even praise. “All that appears wants to be seen and recognized and praised. The highest form of recognition is love: volo ut sis.—The wonder implies affirmation” (Denktagebuch, 701). The moment our surprise at the color of a butterfly turns into wonder that it should have somehow come to be, and come to be precisely this color, we affirm its existence. We could never have called up in imagination all the colors of butterflies’ wings, and no one could have planned the immense series of mutations and other tiny contingencies that brought them all into existence. But, exposed to a small section of their uncalled-for variety, astonished by it, wondering at it, affirming it, we will that it be. This is what it means to love the world.

This love comes as a sort of gratitude, even if we’re not sure whom we should be grateful to. Believers thank the creator god. Arendt may not believe—at least not like that—but she reaches for the word blasphemy and so also for a sense of something sacred that needs protection from profanity. In October 1969 she writes: “The desire for earthly immortality is blasphemous, not because it wants to overcome death, but because it negates birth” (744). The problem is not that we want to play God by refusing to die, but that we balk at making way for a new, different world. From her reading in genetics she knows the role of genetic mutation in the generation of natural variety and the many millions of mistakes that had to happen to produce the living world we see. She has noted Portmann’s bon mot: “One of the surest methods for the regular occurrence of new [genetic] combinations is that peculiar game that biologists call sexuality.” What is sacred, then, is the fact of all those butterfly wings, all the fish scales, animal ears, nose shapes, eye colors, skin tones, smiles that could easily have happened in some other way but that appear to us now, just as they are, the needlessly glamorous and constantly renewed results of contingency.

All vanity, yes, and all in vain, certainly. But praise be.

-Anne O’Byrne


The Humanities and Common Sense

In this post, academics and university faculty will be criticized. Railing against college professors has become a common pastime, one practiced almost exclusively by those who were taught and mentored by the very people now being criticized. It is thus only fair to say upfront that college education in the United States is, in spite of its myriad flaws, still of incredible value and meaning to tens if not hundreds of thousands of students every year.

That said, too much of what our faculties teach is neither interesting nor wanted by our students.

This is a point that Jacques Berlinerblau makes in a recent essay in the Chronicle of Higher Education.

Observers of gentrification like to draw a distinction between needs and wants. Residents in an emerging neighborhood need dry cleaners, but it's wine bars they really want. The application of that insight to the humanities leads me to an unhappy conclusion: Our students, and the educated public at large, neither want us nor need us.

What is amazing is that not only do our students not want what we offer, but neither do our colleagues. It is a staggering truth that much of what academics write and publish is rarely, if ever, read. And if you want to really experience the problem, attend an academic conference someday, where you will see panels of scholars presenting their work, sometimes to one or two audience members. According to Berlinerblau, the average audience at academic conference panels is fourteen people.

The standard response to such realizations is that scholarship is timeless. Its value may not be discovered for decades or even centuries, until someone, somewhere, pulls down a dusty volume and reads something that changes the world. There is truth in such claims. When one goes digging in archives, there are pearls of wisdom to be found. What is more, the scholarly process consists of the accumulation of information and insight over generations. In other words, academic research is like basic scientific research: without immediate use, yet valuable in itself.

The problem with this argument is that such truly original scholarship is rare and getting rarer. While there are exceptions, little original research is left to do in most fields of the humanities. Few important books are published each year; the vast majority are as derivative as they are unnecessary. We would all do better to read and think about the few important books (obviously there will be some disagreement and divergent schools) than to spend our time trying to establish our expertise by commenting on some small part of those books.

The result of the academic imperative to publish or perish is an increasing specialization that leads to knowing more and more about less and less. This is the source of the irrelevance of much humanities scholarship today.

As Hannah Arendt wrote 50 years ago in her essay On Violence, humanities scholars today are better served by being learned and erudite than by seeking originality through the uncovering of some new or forgotten scrap. While such finds can be interesting, they are exceedingly rare and largely insignificant.

As a result—and this is hard for many in the scholarly community to hear—we simply don't need 200 medieval scholars in the United States, or 300 Rawlsians, or 400 Biblical scholars. It is important that Chaucer and Nietzsche are taught to university students; but the idea that every college and university needs a Chaucer scholar and a Nietzsche scholar to teach them is simply wrong. We should, of course, continue to support scholars whose work is genuinely innovative. But more needed are well-read and thoughtful teachers who can teach widely and write for a general audience.

To say that excessively specialized humanities scholarship today is irrelevant is not to say that the humanities are irrelevant. The humanities are that space in the university where power does not have the last word, where truth and beauty as well as insight and eccentricity reign supreme, and where young people come into contact with the great traditions, writing, and thinking that have made us who we are today. The humanities introduce us to our ancestors and forebears and acculturate students into their common heritage. It is in the humanities that we learn to judge the good from the bad, and thus where we first encounter the basic moral faculty for making judgments. It is because the humanities teach taste and judgment that they are absolutely essential to politics. It is even likely that the decline of politics today is profoundly connected to the corruption of the humanities.

Hannah Arendt argues precisely for this connection between the humanities and politics in her essay "The Crisis in Culture." Part two of the essay addresses the political significance of culture, which she relates to humanism—both of which, she says, are of Roman origin. The Romans, she writes, knew how to care for and cultivate the grandiose political and artistic creations of the Greeks. And it is a line from Pericles that forms the center of Arendt's reflections.

The Periclean citation is translated (in part) by Arendt to say: "We love beauty within the limits of political judgment." The judgment of beauty, of culture, and of art is, Pericles says, limited by the political judgment of the people. There is, in other words, an intimate connection between culture and politics. In culture, we make judgments of taste and thus learn the faculty of judgment so necessary for politics. And political judgment, in turn, limits and guides our cultural judgments.

What unites culture and politics is that they are "both phenomena of the public world." Judgment, the primary faculty of politics, is discovered, nurtured, and practiced in the world of culture and the judgment of taste. What the study of culture through the humanities offers, therefore, is an orientation towards a common world that is known and understood through a common sense.  The humanities, Arendt argues, are crucial for the development and preservation of common sense—something that is unfortunately all-too-lacking in much humanities scholarship today.

What this means is that teaching the humanities is absolutely essential for politics—and as long as that is the case, there will be a rationale for residential colleges and universities. The mania for distance learning today is understandable. Education is, in many cases, too expensive, and much of what colleges do could be done more cheaply and efficiently online. And this will happen: colleges will increasingly bring computers and the Internet into their curricula. But as powerful as the Internet is, and as useful as it is as a replacement for passive learning in large lectures, it is not yet a substitute for the face-to-face learning that takes place at a college or university. The learning that happens in the hallways, offices, and dining halls, when students live, eat, and breathe their coursework over four years, is fundamentally different from taking a course online in one's free time. As exciting as technology is, it is important to remember that education is, at its best, not about transmitting information but about inspiring thinking.

Berlinerblau thinks that what will save the humanities is better training in pedagogy. He writes:

As for the tools, let's look at it this way. Much as we try to foist "critical thinking skills" on undergraduates, I suggest we impart critical communication skills to our master's and doctoral students. That means teaching them how to teach, how to write, how to speak in public. It also means equipping them with an understanding that scholarly knowledge is no longer locked up in journals and class lectures. Spry and free, it now travels digitally, where it may intersect with an infinitely larger and more diverse audience.  The communicative competences I extoll are only infrequently part of our genetic endowment. They don't come naturally to many people—which is precisely what sets the true humanist apart from the many. She or he is someone you always want to speak with, listen to, and read, someone who always teaches you something, blows your mind, singes your feathers. To render complexity with clarity and style—that is our heroism.

The focus on pedagogy is a mistake; it rests on the flawed assumption that the problem with the humanities is that professors aren't good communicators. It may be true that professors communicate poorly, but the real problem is deeper. If generations of secondary school teachers trained in pedagogy have taught us anything, it is that pedagogical technique by itself does not make good teaching. Authority in the classroom comes from knowledge and insight, not from pedagogical techniques or theories.

The pressing issue is less pedagogy than the fact that what most professors know is so specialized as to be irrelevant. What is needed is not better pedagogical training, but a broader and more erudite training, one that focuses less on original research and academic publishing and instead demands reading widely and writing aimed at an educated yet popular audience. What we need, in other words, are academics who read widely with excitement and inspiration and speak to the interested public.

More professors should be blogging and writing in public-interest journals. They should be reviewing literature rather than each other's books and, shockingly, they should be writing fewer academic monographs.

To say that the humanities should engage the world does not mean that the humanities should be politicized. The politicization of the humanities has shorn them of their authority and their claim to being true or beautiful. Humanities scholarship can only serve as an incubator for judgment when it is independent from social and political interests. But political independence is not the same as political sterility. Humanities scholarship can, and must, teach us to see and know our world as it is.

There are few essays that better express the worldly importance of the humanities than Hannah Arendt's "The Crisis in Culture." It is worth reading and re-reading. On this hot summer weekend, do yourself that favor.

You can order Arendt's Between Past and Future here. Or you can read it here.

The Highest Law of the Land

“The highest laws of the land (America) are not only the constitution and constitutional laws, but also contracts.”

-Hannah Arendt, Denktagebuch, p. 131

Having published The Origins of Totalitarianism, Arendt turned her attention to the country around her. In a sequence of entries in her Denktagebuch for September 1951, she starts by referring to America as “the politically new” – these are thoughts that will eventually result in her argument in On Revolution. Her analysis has often been criticized from an historical point of view, especially as she refers to the Constitution as being the first to be established “without force, without ruling (archein) and being ruled (archesthai).” Whatever the validity of these criticisms, they strike me as missing an essential point of her concerns. Arendt is trying to work out what she a few pages later calls “the central question of the coming (künftigen) politics,” a problem she sees as lodged in “the problem of the giving of laws.” (ibid, 141). Her aim is to describe a political (i.e. humanly appropriate) system that would not rest upon will, and in particular upon the will of the sovereign. “That I must have power (Macht) to be able to will, makes the problem of power into the central political fact of all politics that are grounded on sovereignty – all, that is, with the exception of the American.” (idem)

Her concern in these pages (130-143) centers on what a truly political human society would be. Her version of America is her entry into this question. What is striking about her discussion in the intervening (and other) pages is that she approaches this question explicitly through the lens of European philosophy. Thus she is attempting an answer to the question: can we determine the particular excellence of the American polity by viewing it through the lenses of European thought? The point is not to Europeanize America; it is to see whether America does not in some manner constitute a potential instantiation of what has been thought in Europe over the nineteenth and twentieth centuries.

The sequence of European thinkers she invokes is important. She first mentions Marx and then Nietzsche, each of whom she sees as part of, and as makers of, the “end of Western philosophy.” Marx is held to have inverted Hegel, Nietzsche to have done the same for Plato. The point of her analysis of Marx and Nietzsche is to assert that they released thought from its bond to the “Absolute.” Indeed, to hold to the idea of an Absolute is to “make possible in the present unjust and bestial behavior.” (ibid, 133). As we know, this will be an ever-returning theme in her work. She expects to find in America the elements of a politics that does not rest on an “absolute.”

Where might one look to find this vision of a non-absolute politics? Nietzsche provides the opening to an answer. We are to look not to his doctrine of the revaluation of values but to his discussion of promising in the second essay of the Genealogy of Morals. She quotes: “To breed an animal with the right to make promises – is that not … the real problem of humans?” For Arendt, the foundation of a new “morality” lies in the right to make a promise; the promise makes possible human relations based on contract. And the grounding on contract, as she writes in the Denktagebuch, was for her the particular excellence of the American polity.

What is the implication of Arendt's claim that contract is the “highest law” and particular excellence of America? One answer is revealed at the end of the extended quotation from Nietzsche’s Genealogy of Morals, where he indicates that the person who has the right to make promises can “für sich als Zukunft gut sagen zu können,” a phrase that might be rendered as “able to give himself as answer for the future.” In Arendt’s gloss, this means that if in making a contract (which is what a promise is) one pledges that each will remain true to him- or herself as the person making the contract, then each has made his or her own being the foundation for a political space.

Such a grounding or foundation is based neither on will nor on any external absolute. It is a matter, as the signers of the Declaration made clear, that we “mutually pledge to each other our Lives, our Fortunes and our sacred Honor.” Temporally speaking, this means that what one did in the past remains alive as the present. Our political present will thereby be tied to the historical, although not, she notes, in a “weltgeschichtliche” [world-historical, i.e. transcendental] manner.

To make the implications of this clearer, she immediately turns to a consideration of Max Weber’s distinction between the “ethic of responsibility” (which she holds to be the foundation of the pragmatism and genius of American politics) and the “ethic of conviction,” which, she says, allows for anything, as we cannot know “until the day of the Last Judgment” whether our conviction is correct. The implication here is that if we base our polity on the supposed correctness of our moral convictions (as opposed to our ability to be responsible to ourselves), we will be able to justify anything, since the validation of our claim can be infinitely postponed. (One has but to look at the claims made about bringing democracy to Iraq.) Indeed, Arendt sees the “central question of our time” as a change in our ability to make valid moral judgments, that is, judgments whose correctness is not postponed indefinitely. (ibid, 138). She then turns to an examination of how various thinkers have dealt with the problem of moral judgment. After working her way through a partial rejection of the ways in which Hegel, Nietzsche, and the Kant of the Critique of Practical Reason respond to this question, she turns to the Critique of the Power of Judgment. Those thoughts are not developed at this point in the Denktagebuch, but they will concern her for the rest of her life.

What is striking here is how the approach from European philosophy brings out the importance of what is new in the American experiment.  As Hamilton wrote in the first Federalist:

It has been frequently remarked that it seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force. If there be any truth in the remark, the crisis at which we are arrived may with propriety be regarded as the era in which that decision is to be made; and a wrong election of the part we shall act may, in this view, deserve to be considered as the general misfortune of mankind.

To which, in our present day, one may only wonder whether at some point a “wrong election” has not been made.

-Tracy B. Strong (UCSD)


The Voice of Right and Wrong

Whatever the source of moral knowledge might be—divine commandments or moral reason—every sane man, it was assumed, carried within himself a voice that tells him what is right and what is wrong, and this regardless of the law of the land and regardless of the voices of his fellowmen.

-Hannah Arendt, Some Questions of Moral Philosophy, in Responsibility and Judgment, p. 61.

In a series of lectures she wrote for two courses she taught, one in 1965 at the New School and the second in 1966 at the University of Chicago, Arendt mapped out some of her complicated thinking about moral philosophy and the “perplexities inherent in the human faculty of willing.” In these lectures, she drew heavily on Kant and Nietzsche, but began her reflections by calling attention to the historical motivation for her concerns: “We—at least the older ones among us—have witnessed the total collapse of all established moral standards in public and private life during the nineteen-thirties and -forties, not only...in Hitler’s Germany but also in Stalin’s Russia.” (54). The distinction between right and wrong that it was assumed “every sane man” heard like a voice within him had not stood the test of time.

How easily, Arendt observed, ordinary people had changed their habits of mind, exchanging one set of values for another “with hardly more trouble than it [took] to change the table manners of an individual or a people.” (50). How had this happened? If acting morally, and not just legally, depended on the “thinking” conversation one had with oneself about what one should or shouldn’t do, then it was as if large sections of the population at every stratum had simply stopped thinking, did what they were told to do, and then proceeded to forget.

Two weeks ago today, Anders Behring Breivik, the 33-year-old Norwegian man who admitted to killing 77 people last July in two separate attacks, entered a specially outfitted courtroom in Oslo to stand trial for criminal acts of terrorism and mass murder. After the charges against him were read, Mr. Breivik pleaded not guilty. "I acknowledge the acts, but not criminal guilt - I claim I was doing it in self-defense." He would have preferred, he added, to appear before a military tribunal; he was, he contended, a political activist involved in a war in Europe.

Since he admitted his acts, the trial now turns on the question of Breivik’s sanity. Two psychiatric reports have produced contradictory conclusions: the first found him insane at the time of the killings, suffering from paranoid schizophrenic delusions, while the second declared him sane. “[E]very sane man, it was assumed, carried within himself a voice that tells him what is right and what is wrong.” In his own words, Breivik was no exception. Before he started shooting, Breivik explained at his trial last week, he heard “‘100 voices’ in his head telling him not to do it.” (http://www.bbc.co.uk/news/world-europe-17789206) But that moment of hesitation passed; he had prepared himself for years through a process he described as a deliberate program of dehumanization. Steeling himself against the comprehension of what he had done was important, he added, because “he would break down mentally” if he allowed himself to empathize with his victims.

“The criterion of right and wrong, the answer to the question, what ought I to do? depends in the last analysis neither on habits and customs, which I share with those around me, nor on a command of either divine or human origin, but what I decide with regard to myself,” Arendt observed in the same essay on moral philosophy. (97) What keeps a person from committing atrocities, or “evil” acts, is, for Arendt, the capacity to be a “thinking being, rooted in his thoughts and remembrances, and hence knowing that he has to live with himself.” This same capacity produces “limits to what he can permit himself to do, and these limits will not be imposed on him from the outside, but will be self-set.” These same limits, she continued, “are absent when men skid only over the surface of events, where they permit themselves to be carried away without ever penetrating into whatever depth they may be capable of.”

Breivik’s description of his yearlong “sabbatical” playing a video game, World of Warcraft, for up to 16 hours per day serves as an indication of the program of dehumanization to which he subjected himself. And his years-long immersion in the ideology and methods of radical terrorism, with, ironically, his endorsement of Al Qaeda as “the most successful revolutionary movement in the world,” serves as an example of the kind of “thoughtlessness” that can become a willed experience, in individuals and in groups, and is a necessary prelude to despicable acts. But then, Breivik never imagined he would survive July 22; he envisioned his action as a suicide mission, perhaps the ultimate act of forgetfulness, the annihilation of the possibility of thought and judgment themselves.

-Kathleen B. Jones