“Collisions of values are of the essence of what they are and what we are…the world in which what we see as incompatible values are not in conflict is a world altogether beyond our ken; …it is on earth that we live, and it is here that we must believe and act.”
—Isaiah Berlin, The Crooked Timber of Humanity
“It is in the very nature of things human that every act that has once made its appearance and has been recorded in the history of mankind stays with mankind as a potentiality long after its actuality has become a thing of the past. No punishment has ever possessed enough power of deterrence to prevent the commission of crimes. On the contrary, whatever the punishment, once a specific crime has appeared for the first time, its reappearance is more likely than its initial emergence could ever have been."
—Hannah Arendt, Eichmann in Jerusalem
On the left, it is obvious: Zionism must be overthrown and Gazans freed. On the right, the answer is clear: Hamas is a terrorist organization that must be obliterated. And amongst humanitarians, it is an article of unquestioned faith: women and children must be protected, ceasefires upheld, and medicine, water, and food permitted to enter the territory. To talk with representatives of any of these three camps is to be confronted with a tsunami of facts delivered in airtight, logically cohesive diatribes. Each one has a set of facts that is unimpeachable so long as it is recited without interruption. But what these radical proponents do not seem to see is that their blinkered radicalism serves nothing so much as the status quo, deepening the deadlock and making meaningful compromise ever less likely. As my friend Uday Mehta so aptly formulated it, these radicals are the vanguard of the status quo.
Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.
Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.
Sari Nusseibeh, recently retired President of Al Quds University in East Jerusalem, thoughtfully writes of the end of his lifelong dream that Israel and Palestine might be able to live together in a peaceful and vibrant future. All that is left, he writes, is the promise of hell. "I can, of course, see and admire beautiful individuals. Israel boasts so many of them - poets, writers, journalists, scholars, artists - and just ordinary people in ordinary jobs, trying to live their harmless lives. But that special luster of an idealistic nation to be admired has vanished. I can no longer see it anywhere. It has become replaced, in my mind - sorry to say - by what appears to have become a scientifically skilled colonialist group of self-serving thugs, bent on self-aggrandizement, capitalizing on world-guilt for past pains and horrors suffered, and now hiding behind a religious fiction to justify all the pain and suffering it does to my own people, our heritage and culture.... I cannot see an Israeli government now offering what a Palestinian government can now accept. I can therefore only foresee a worsening climate - not a one-time disaster (say, an avalanche following the killing of a Jew while performing a prayer in the Noble Sanctuary, on what Israelis call the Temple Mount in Jerusalem) that can once and for all be put behind, by whichever side, but an increasingly ugly living climate in which only those who can acclimatize and be ugly themselves can survive. In simple words, even if called 'holy,' I can foresee this place turning into a hell for all those who live in it. It will not be a place for normal human beings who want to pursue normal lives, let alone a place where anyone can hope to fulfill a sublime life." Read Roger Berkowitz's response on the Arendt Center blog.
In the New York Review of Books, Robert Pogue Harrison notes that changing the world through work has become a Silicon Valley cliché: "When Steve Jobs sought to persuade John Sculley, the chief executive of Pepsi, to join Apple in 1983, he succeeded with an irresistible pitch: 'Do you want to spend the rest of your life selling sugared water, or do you want a chance to change the world?' The day I sat down to write this article, a full-page ad for Blackberry in The New York Times featured a smiling Arianna Huffington with an oversize caption in quotes: 'Don't just take your place at the top of the world. Change the world.' A day earlier, I heard Bill Gates urge the Stanford graduating class to 'change the world' through optimism and empathy. The mantra is so hackneyed by now that it's hard to believe it still gets chanted regularly. Our silicon age, which sees no glory in maintenance, but only in transformation and disruption, makes it extremely difficult for us to imagine how, in past eras, those who would change the world were viewed with suspicion and dread. If you loved the world; if you considered it your mortal home; if you were aware of how much effort and foresight it had cost your forebears to secure its foundations, build its institutions, and shape its culture; if you saw the world as the place of your secular afterlife, then you had good reasons to impute sinister tendencies to those who would tamper with its configuration or render it alien to you.
"Referring to all that happened during the 'dark times' of the first half of the twentieth century, 'with its political catastrophes, its moral disasters, and its astonishing development of the arts and sciences,' Hannah Arendt summarized the human cost of endless disruption: 'The world becomes inhuman, inhospitable to human needs-which are the needs of mortals-when it is violently wrenched into a movement in which there is no longer any sort of permanence.'" You can also watch Harrison's talk on Thinking and Friendship given at the Arendt Center.
In a piece on the place of theory and dangerous thinking in contemporary intellectual discourse, Henry Giroux describes why such practices appear to be in decline, citing their unintelligibility, attacks on them from particular political interests, and the corporatization of the university, among other things. It doesn't help that good critical thinking is hard to do, and that thinking and action aren't the same: "One important function of dangerous thinking is that it foregrounds the responsibility of artists, intellectuals, academics and others who use it. Mapping the full range of how power is used and how it can be made accountable represents a productive pedagogical and political use of theory. Theorizing the political, economic and cultural landscapes is central to any form of political activism and suggests that theory is like oxygen. That is, a valuable resource, which one has to become conscious of in order to realize how necessary it is to have it. Where we should take pause is when academic culture uses critical thought in the service of ideological purity and in doing so transforms pedagogy into forms of poisonous indoctrination for students. Critical thought in this case ossifies from a practice to a form of political dogmatism. The cheerleaders for casino capitalism hate critical theory and thought because they contain the possibility of politicizing everyday life and exposing those savage market-driven ideologies, practices and social relations that hide behind an appeal to commonsense. Both the fetishism of thinking and its dismissal are part of the same coin, the overall refusal to link conception and practice, agency and intervention, all aggravated by neoliberalism's hatred of all things social and public."
Stephen Mucher makes the case that liberal arts faculty should be more involved in teacher education, suggesting that teachers who are well versed in the humanities, in addition to teaching practice, prepare more curious, more creative students with better critical thinking skills: "Without a professional core of teachers who are versed in the humanities and steeped in the great questions of science, schools are especially vulnerable to forces that reduce teaching to a series of discrete measurable acts. Yet the more teaching is dissected, the less attractive the profession becomes for graduates who might otherwise consider it a viable and meaningful career option. More directly, these reductionist policy trends obscure something that humanists care deeply about -- the enduring beauty of teaching and learning. As one outgoing pedagogy chair lamented in 1900, 'the attempt to mechanize instruction is part of the monstrous error that free minds can be coerced; it has really the same root as religious persecution.' By remaining largely silent for so long, colleges of liberal arts and sciences have contributed to these developments. By pushing big questions about K-12 teaching to the margins and assigning them solely to education specialists, institutions of higher education became complicit in trends that continue to make public education more separate and more unequal."
In an interview, poet Carol Muske-Dukes takes on the notion of "unoriginal genius," which she thinks is alienating contemporary poetry from the public, and emphasizes instead an older way of thinking about verse. Let's bring back readable poetry we can recite: "Proponents of unoriginal genius would say that they are putting forward a version of interpretation and illumination of a technological age. But the fact is, this mirroring of disjunction represents no real speaking or reading or thinking population.... The struggle here, as it is with overly accessible, catchy poetry, is a struggle to be both popular and enlightening. We live in a time when language matters. Not only because of the constant threat of misunderstanding in translation - in diplomacy, in wartime, in the university and literary life - but, as always, in individual human relations. So the abdication of accessible rhetoric and a turn toward so-called scholarship is an abdication of the human. The academy has opted for pointless experimentation in language compared to my mother's generation - she's ninety-eight - of well schooled, publicly educated students of poetry who know pages and pages of poetry by heart. Should anyone who believes in sense be ostracized from the ongoing conversation of literature?"
Kate Losse suggests that there's something sinister behind the connection of work and leisure on the campuses of innovative tech companies: "Of course, the remaking of the contemporary tech office into a mixed work-cum-leisure space is not actually meant to promote leisure. Instead, the work/leisure mixing that takes place in the office mirrors what happens across digital, social and professional spaces. Work has seeped into our leisure hours, making the two tough to distinguish. And so, the white-collar work-life blend reaches its logical conclusion with the transformation of modern luxury spaces such as airport lounges into spaces that look much like the offices from which the technocrat has arrived. Perhaps to secure the business of the new moneyed tech class, the design of the new Centurion Lounge for American Express card members draws from the same design palette as today's tech office: reclaimed-wood panels, tree-stump stools, copious couches and a cafeteria serving kale salad on bespoke ceramic plates. In these lounges, the blurring of recreation and work becomes doubly disconcerting for the tech employee. Is one headed out on vacation or still at the office - and is there a difference? If the reward for participation in the highly lucrative tech economy is not increasing leisure but a kind of highly decorated, almost Disneyland vision of perpetual labour, what will be its endgame? As work continues to consume workers' lives, tech offices might compete for increasingly unique and obscure toys and luxury perks to inhibit their employees' awareness that they are always working." Maybe Silicon Valley's idea of changing the world is simply the collapse of the labor vs. leisure distinction.
The Hannah Arendt Center's annual fall conference, The Unmaking of Americans: Are There Still American Values Worth Fighting For?, will be held this year on October 9-10!
Registration is now OPEN! You can register here!
Learn more about the conference here.
This week on the Blog, Michael Weinman discusses Arendt's use of the term "irony" in her report on the "banality of evil" in his Quote of the Week. American modernist poet Wallace Stevens provides this week's Thought on Thinking. We look back on a free speech lecture Zephyr Teachout delivered at Bard in 2012 in our Video Archives. And Roger Berkowitz discusses the hell that the Middle East is fast becoming in the Weekend Read.
“Between Sovereign states there can be no last resort except war; if war no longer serves that purpose, that fact alone proves that we must have a new concept of the state.”
—Hannah Arendt, in an interview with Adelbert Reif, 1970.
"If people think that one can only write about these things in a solemn tone of voice...Look, there are people who take it amiss—and I can understand that in a sense—that, for instance, I can still laugh. But I was really of the opinion that Eichmann was a buffoon..."
Holocaust Remembrance Day, or Yom Hashoah, fell on the 27th day of the Hebrew month of Nisan, which this year came in April. It begins at sundown and continues into the next day. A memorial to the six million Jewish people who were slaughtered by the Nazis between 1933 and 1945, it is a time to call these events to mind and consider their continued resonance and relevance in our own dark times. How shall we, in the words of Hannah Arendt, bear the burden of such a past? With what attitude should such events be commemorated?
Fifty years ago, on October 28, 1964, a televised conversation between the German-Jewish political theorist, Hannah Arendt, and the well-known German journalist, Günter Gaus, was broadcast in West Germany. Arendt’s Eichmann in Jerusalem: A Report on the Banality of Evil, her controversial analysis of the Jerusalem trial of Adolf Eichmann, had just been published in German in the Federal Republic and Gaus used the occasion to generate a “portrait of Hannah Arendt.” The interview ranged across a wide field of topics, including the difference between philosophy and politics, the situation in Germany before and after the war, the state of Israel, and even Arendt’s personal experiences as a detainee in Germany and France during the Second World War.
Already a cause célèbre in the United States, the book had brought Arendt lavish praise and no small amount of damnation. What Gaus especially wanted to know was what Arendt thought about criticism levied against her by Jews angered by her portrait of Eichmann and her comments about Jewish leaders and other Jewish victims of the Holocaust. “Above all,” said Gaus, “people were offended by the question you raised of the extent to which Jews are to blame for their passive acceptance of the German mass murders, or to what extent the collaboration of certain Jewish councils almost constitutes a kind of guilt of their own.”
Gaus acknowledged that Arendt had already addressed these critics, by saying that such comments were, in some cases, based on a misunderstanding and, in others, part of a political campaign against her, but he had already crossed a contested border. Without hesitation, she corrected Gaus:
First of all, I must, in all friendliness, state that you yourself have become a victim of this campaign. Nowhere in my book did I reproach the Jewish people with nonresistance. Someone else did that in the Eichmann trial, namely Mr. Hausner of the Israeli public prosecutor’s office. I called such questions directed to the witnesses in Jerusalem both foolish and cruel.
True, Gaus admitted. He had read the book and agreed that Arendt had not made that point exactly. But, he continued, some criticism had been levied against her because of “the tone in which many passages are written.”
“Well,” Arendt replied, “that is another matter...That the tone of voice is predominantly ironic is completely true.”
What did she mean by ironic? “If people think that one can only write about these things in a solemn tone of voice.... Look, there are people who take it amiss—and I can understand that in a sense—that, for instance, I can still laugh. But I was really of the opinion that Eichmann was a buffoon...” To convey the shock she experienced when, contrary to her own expectations, Eichmann “in the flesh” appeared to be more a clown than a monster, Arendt countered with a reverse shock, adopting a sardonic, unsentimental voice to unmask what she later termed “the banality of evil.” It could be read as her way to diminish the self-aggrandizement of the architects of the Final Solution to middling size. The trouble was she used this voice rather undiplomatically to describe not only Eichmann’s actions but also the complicity of others, including some members of the Jewish community she judged harshly for cooperating with Nazis. “When people reproach me with accusing the Jewish people, that is a malignant lie and propaganda and nothing else. The tone of voice is, however, an objection against me personally. And I cannot do anything about that.”
“You are prepared to bear that?” asked Gaus. “Yes, willingly,” Arendt claimed. What she had not anticipated was how unprepared many who read her were to take on this new shock of the “banality of evil” on top of the horrifying accounts of Jewish suffering conveyed at the trial.
In fact, “bearing the burden of the past,” thinking about the past in its morally perplexing and disconcerting entirety, was the focus of Arendt’s writing, from her earliest essays to her last. And in no case did this burden bearing affect her more personally than when she published Eichmann in Jerusalem. When she returned from a European trip taken for a needed rest soon after the book’s release, she found stacks of letters waiting for her. Some correspondents praised the bravery of her truth-telling, but the lion’s share found her book detestable. A few included death threats.
Was her refusal to concede that her “tone” had anything to do with the hostility the book generated merely a matter of sheer stubbornness? Or was the ironic tone itself emblematic of Arendt’s ideas about the danger implicit in thinking and the burden of responsibility that lay at the heart of judgment?
In the introduction to The Life of the Mind, Arendt offered this account of the generation of her controversial and still frequently misunderstood concept of “the banality of evil”:
In my report of [the Eichmann trial] I spoke of ‘the banality of evil.’ Behind that phrase I was dimly aware of the fact that it went counter to our tradition of thought—literary, theological, or philosophic—about the phenomenon of evil...However, what I was confronted with was utterly different and still undeniably factual. I was struck by the manifest shallowness in the doer that made it impossible to trace the uncontestable evil of his deeds to any deeper level of roots or motives. The deeds were monstrous, but the doer—at least the very effective one now on trial—was quite ordinary, commonplace, and neither demonic nor monstrous...Might the problem of good and evil, our faculty of telling right from wrong, be connected with our faculty of thought?...Could the activity of thinking as such, the habit of examining whatever happens to come to pass or to attract attention, regardless of results and specific content, could this activity be among the conditions that make men abstain from evil-doing or even actually ‘condition’ them against it?
But, Arendt insisted, thinking’s ability to condition people against evil-doing did not mean “that thinking would ever be able to produce the good deed as its result, as though ‘virtue could be taught and learned’—only habits and customs can be taught, and we know only too well the alarming speed with which they are unlearned and forgotten when new circumstances demand a change in manners and patterns of behavior.” What cold comfort, then, this thinking business seemed to be, offering no guarantee that evil will be avoided and good prevail.
Arendt had removed the guarantee of absolute innocence and automatic guilt from the question of moral responsibility. What did she put in its place? The capacity to exercise an “independent human faculty, unsupported by law and public opinion, that judges in full spontaneity every deed and intent anew whenever the occasion arises.” And who evidenced this capacity? Those who did were not distinguished by any superior intelligence or sophistication in moral matters but “dared to judge for themselves.” Deciding that conformity would leave them unable to “live with themselves,” sometimes they even chose to die rather than become complicit. “The dividing line between those who think and therefore have to judge for themselves, and those who do not, strikes across all social and cultural or educational differences.”
Nonetheless, Arendt’s tone made it seem as if she knew she would have acted more valiantly than those who cooperated with the Nazis. Outraged by her moral judgment of Jewish leaders, many asked: Who is she to judge those who were forced to make difficult decisions and, in the interests of saving the many, sacrificed the few? Arendt answered this question in a 1964 essay entitled “Personal Responsibility Under Dictatorship”: “Since this question of judging without being present is usually coupled by the accusation of arrogance, who has ever maintained that by judging a wrong I presuppose that I myself would be incapable of committing it?”
—Kathleen B. Jones
This Quote of the Week is adapted from an essay originally appearing in Humanities Magazine, March/April 2014.
"The end of the old is not necessarily the beginning of the new."
Hannah Arendt, The Life of the Mind
This is a simple enough statement, and yet it masks a profound truth, one that we often overlook out of the very human tendency to seek consistency and connection, to make order out of the chaos of reality, and to ignore the anomalous nature of that which lies in between whatever phenomena we are attending to.
Perhaps the clearest example of this has been what proved to be the unfounded optimism that greeted the overthrow of autocratic regimes, both through American intervention in Afghanistan and Iraq and through the homegrown movements known collectively as the Arab Spring. It is one thing to disrupt the status quo, to overthrow an unpopular and undemocratic regime. But that end does not necessarily lead to the establishment of a new, beneficent and participatory political structure. We see this time and time again: now in Putin's Russia, a century ago with the Russian Revolution, and over two centuries ago with the French Revolution.
Of course, it has long been understood that oftentimes, to begin something new, we first have to put an end to something old. The popular saying that you can't make an omelet without breaking a few eggs reflects this understanding, although it is certainly not the case that breaking eggs will inevitably and automatically lead to the creation of an omelet. Breaking eggs is a necessary but not sufficient cause of omelets, and while this is not an example of the classic chicken-and-egg problem, I think we can imagine that the chicken might have something to say on the matter of breaking eggs. Certainly, the chicken would have a different view of what is signified, or ought to be signified, by the end of the old, meaning the end of the eggshell, insofar as a chicken cannot come into being without first breaking out of the egg in which it took form.
So, whether you take the chicken's point of view or adopt the perspective of the omelet, looking backwards and reverse-engineering the current situation, it is only natural to view the beginning of the new as an effect brought into being by the end of the old: to make an inference based on sequencing in time, to posit a causal relationship, and thereby to commit the logical fallacy of post hoc ergo propter hoc, if for no other reason than the force of narrative logic that compels us to create a coherent storyline. In this respect, Arendt points to the foundation tales of ancient Israel and Rome:
We have the Biblical story of the exodus of Israeli tribes from Egypt, which preceded the Mosaic legislation constituting the Hebrew people, and Virgil's story of the wanderings of Aeneas, which led to the foundation of Rome—"dum conderet urbem," as Virgil defines the content of his great poem even in its first lines. Both legends begin with an act of liberation, the flight from oppression and slavery in Egypt and the flight from burning Troy (that is, from annihilation); and in both instances this act is told from the perspective of a new freedom, the conquest of a new "promised land" that offers more than Egypt's fleshpots and the foundation of a new City that is prepared for by a war destined to undo the Trojan war, so that the order of events as laid down by Homer could be reversed.
Fast forward to the American Revolution, and we find that the founders of the republic, mindful of the uniqueness of their undertaking, searched for archetypes in the ancient world. What they found in the narratives of Exodus and the Aeneid was that the act of liberation and the establishment of a new freedom are two events, not one, in effect subject to Alfred Korzybski's non-Aristotelian Principle of Non-Identity. The success of the formation of the American republic can be attributed to the founders' awareness of the chasm that exists between the closing of one era and the opening of a new age, and of their separation in time and space:
No doubt if we read these legends as tales, there is a world of difference between the aimless desperate wanderings of the Israeli tribes in the desert after the Exodus and the marvelously colorful tales of the adventures of Aeneas and his fellow Trojans; but to the men of action of later generations who ransacked the archives of antiquity for paradigms to guide their own intentions, this was not decisive. What was decisive was that there was a hiatus between disaster and salvation, between liberation from the old order and the new freedom, embodied in a novus ordo saeclorum, a "new world order of the ages" with whose rise the world had structurally changed.
I find Arendt's use of the term hiatus interesting, given that in contemporary American culture it has largely been appropriated by the television industry to refer to a series that has been taken off the air for a period of time, but not cancelled. The typical phrase is on hiatus, meaning on a break or on vacation. But Arendt reminds us that such connotations only scratch the surface of the word's broader meanings. The Latin word hiatus refers to an opening or rupture, a physical break or missing part or link in a concrete material object. As such, it becomes a spatial metaphor when applied to an interruption or break in time, a usage introduced in the 17th century. Interestingly, this coincides with the period in English history known as the Interregnum, which began in 1649 with the execution of King Charles I, led to Oliver Cromwell's installation as Lord Protector, and ended after Cromwell's death with the Restoration of the monarchy under Charles II, son of Charles I. While in some ways anticipating the American Revolution, the English Civil War followed an older pattern, one that Mircea Eliade referred to as the myth of eternal return, a circular movement rather than the linear progression of history and cause-effect relations.
The idea of moving forward, of progress, requires a future-orientation that only comes into being in the modern age, by which I mean the era that followed the printing revolution associated with Johannes Gutenberg (I discuss this in my book, On the Binding Biases of Time and Other Essays on General Semantics and Media Ecology). But that same print culture also gave rise to modern science, and with it the monopoly granted to efficient causality, cause-effect relations, to the exclusion in particular of final and formal cause (see Marshall and Eric McLuhan's Media and Formal Cause). This is the basis of the Newtonian universe in which every action has an equal and opposite reaction, and every effect can be linked back in a causal chain to another event that preceded it and brought it into being. The view of time as continuous and connected can be traced back to the introduction of the mechanical clock in the 13th century, but was solidified through the printing of calendars and time lines, and the same effect was created in spatial terms by the reproduction of maps, and the use of spatial grids, e.g., the Mercator projection.
And while the invention of history, as a written narrative of linear progression over time, can be traced back to the ancient Israelites and the story of the exodus, that story itself incorporates the idea of a hiatus in overlapping structures:
A1. Joseph is the golden boy, the son favored by his father Jacob, earning him the enmity of his brothers
A2. he is sold into slavery by them, winds up in Egypt as a slave and then is falsely accused and imprisoned
A3. by virtue of his ability to interpret dreams he gains his freedom and rises to the position of Pharaoh's prime minister
B1. Joseph welcomes his brothers and father, and the House of Israel goes down to Egypt to sojourn due to famine in the land of Canaan
B2. their descendants are enslaved, oppressed, and persecuted
B3. Moses is chosen to confront Pharaoh, liberate the Israelites, and lead them on their journey through the desert
C1. the Israelites are freed from bondage and escape from Egypt
C2. the revelation at Sinai fully establishes their covenant with God
C3. after many trials, they return to the Promised Land
It can be clearly seen in these narrative structures that the role of the hiatus, in ritual terms, is that of the rite of passage, the initiation period that marks, in symbolic fashion, the change in status, the transformation from one social role or state of being to another (e.g., child to adult, outsider to member of the group). This is not to discount the role that actual trials, tests, and other hardships may play in the transition, as they serve to establish or reinforce, psychologically and sometimes physically, the value and reality of the transformation.
In mythic terms, this structure has become known as the hero's journey or hero's adventure, made famous by Joseph Campbell in The Hero with a Thousand Faces, and also known as the monomyth, because he claimed that the same basic structure is universal to all cultures. The basic structure he identified consists of three main elements: separation (e.g., the hero leaves home), initiation (e.g., the hero enters another realm, experiences tests and trials, leading to the bestowing of gifts, abilities, and/or a new status), and return (the hero returns to utilize what he has gained from the initiation and save the day, restoring the status quo or establishing a new status quo).
Understanding the mythic, non-rational element of initiation is the key to recognizing the role of the hiatus, and in the modern era this meant using rationality to realize the limits of rationality. With this in mind, let me return to the quote I began this essay with, but now provide the larger context of the entire paragraph:
The legendary hiatus between a no-more and a not-yet clearly indicated that freedom would not be the automatic result of liberation, that the end of the old is not necessarily the beginning of the new, that the notion of an all-powerful time continuum is an illusion. Tales of a transitory period—from bondage to freedom, from disaster to salvation—were all the more appealing because the legends chiefly concerned the deeds of great leaders, persons of world-historic significance who appeared on the stage of history precisely during such gaps of historical time. All those who, pressed by exterior circumstances or motivated by radical utopian thought-trains, were not satisfied to change the world by the gradual reform of an old order (and this rejection of the gradual was precisely what transformed the men of action of the eighteenth century, the first century of a fully secularized intellectual elite, into the men of the revolutions) were almost logically forced to accept the possibility of a hiatus in the continuous flow of temporal sequence.
Note that concept of gaps in historical time, which brings to mind Eliade's distinction between the sacred and the profane. Historical time is a form of profane time, and sacred time represents a gap or break in that linear progression, one that takes us outside of history, connecting us instead in an eternal return to the time associated with a moment of creation or foundation. The revelation at Sinai is an example of such a time, and accordingly Deuteronomy states that all of the members of the House of Israel were present at that event, not only those alive at that time but also those not present, the generations of the future. This statement is included in the liturgy of the Passover Seder, which is a ritual reenactment of the exodus and revelation, which in turn becomes part of the reenactment of the Passion in Christianity, one of the primary examples of Campbell's monomyth.
Arendt's hiatus, then, represents a rupture between two different states or stages, an interruption, a disruption linked to an eruption. In the parlance of chaos and complexity theory, it is a bifurcation point. Arendt's contemporary, Peter Drucker, a philosopher who pioneered the scholarly study of business and management, characterized the contemporary zeitgeist in the title of his 1969 book: The Age of Discontinuity. It is an age in which Newtonian physics was replaced by Einstein's relativity and Heisenberg's uncertainty, the phrase quantum leap becoming a metaphor drawn from subatomic physics for all forms of discontinuity. It is an age in which the fixed point of view that yielded perspective in art and the essay and novel in literature yielded to Cubism and subsequent forms of modern art, and stream of consciousness in writing.
Beginning in the 19th century, photography gave us the frozen, discontinuous moment, and the technique of montage in the motion picture gave us a series of shots and scenes whose connections have to be filled in by the audience. Telegraphy gave us the instantaneous transmission of messages that took them out of their natural context, the subject of the famous comment by Henry David Thoreau that connecting Maine and Texas to one another will not guarantee that they have anything sensible to share with each other. The wire services gave us the nonlinear, inverted pyramid style of newspaper reporting, which also was associated with the nonlinear look of the newspaper front page, a form that Marshall McLuhan referred to as a mosaic. Neil Postman criticized television's role in decontextualizing public discourse in Amusing Ourselves to Death, where he used the phrase, "in the context of no context," and I discuss this as well in my recently published follow-up to his work, Amazing Ourselves to Death.
The concept of the hiatus comes naturally to the premodern mind, schooled by myth and ritual within the context of oral culture. That same concept is repressed, in turn, by the modern mind, shaped by the linearity and rationality of literacy and typography. As the modern mind yields to a new, postmodern alternative, one that emerges out of the electronic media environment, we see the return of the repressed in the idea of the jump cut writ large.
There is psychological satisfaction in the deterministic view of history as the inevitable result of cause-effect relations in the Newtonian sense, as this provides a sense of closure and coherence consistent with the typographic mindset. And there is similar satisfaction in the view of history as entirely consisting of human decisions that are the product of free will, of human agency unfettered by outside constraints, which is also consistent with the individualism that emerges out of the literate mindset and print culture, and with a social rather than physical version of efficient causality. What we are only beginning to come to terms with is the understanding of formal causality, as discussed by Marshall and Eric McLuhan in Media and Formal Cause. What formal causality suggests is that history has a tendency to follow certain patterns, patterns that connect one state or stage to another, patterns that repeat again and again over time. This is the notion that history repeats itself, meaning that historical events tend to fall into certain patterns (repetition being the precondition for the existence of patterns), and that the goal, as McLuhan articulated in Understanding Media, is pattern recognition. This helps to clarify the famous remark by George Santayana, "those who cannot remember the past are condemned to repeat it." In other words, those who are blind to patterns will find it difficult to break out of them.
Campbell engages in pattern recognition in his identification of the heroic monomyth, as Arendt does in her discussion of the historical hiatus. Recognizing the patterns is the first step in escaping them, and may even allow for the possibility of taking control and influencing them. This also means understanding that the tendency for phenomena to fall into patterns is a powerful one. It is a force akin to entropy, and perhaps a result of that very statistical tendency that is expressed by the Second Law of Thermodynamics, as Terrence Deacon argues in Incomplete Nature. It follows that there are only certain points in history, certain moments, certain bifurcation points, when it is possible to make a difference, or to make a difference that makes a difference, to use Gregory Bateson's formulation, and change the course of history. The moment of transition, of initiation, the hiatus, represents such a moment.
McLuhan's concept of medium goes far beyond the ordinary sense of the word, as he relates it to the idea of gaps and intervals, the ground that surrounds the figure, and explains that his philosophy of media is not about transportation (of information), but transformation. The medium is the hiatus.
The particular pattern that has come to the fore in our time is that of the network, whether it's the decentralized computer network and the internet as the network of networks, or the highly centralized and hierarchical broadcast network, or the interpersonal network associated with Stanley Milgram's research (popularly known as six degrees of separation), or the neural networks that define brain structure and function, or social networking sites such as Facebook and Twitter, etc. And it is not the nodes, which may be considered the content of the network, that define the network, but the links that connect them, which function as the network medium, and which, in the systems view favored by Bateson, provide the structure for the network system, the interaction or relationship between the nodes. What matters is not the nodes, it's the modes.
Hiatus and link may seem like polar opposites, the break and the bridge, but they are two sides of the same coin, the medium that goes between, simultaneously separating and connecting. The boundary divides the system from its environment, allowing the system to maintain its identity as separate and distinct from the environment, keeping it from being absorbed by the environment. But the membrane also serves as a filter, engaged in the process of abstracting, to use Korzybski's favored term, letting through or bringing material, energy, and information from the environment into the system so that the system can maintain itself and survive. The boundary keeps the system in touch with its situation, keeps it contextualized within its environment.
The systems view emphasizes space over time, as does ecology, but the concept of the hiatus as a temporal interruption suggests an association with evolution as well. Darwin's view of evolution as continuous was consistent with Newtonian physics. The more recent modification of evolutionary theory put forth by Stephen Jay Gould, known as punctuated equilibrium, suggests that evolution occurs in fits and starts, in relatively rare and isolated periods of major change, surrounded by long periods of relative stability and stasis. Not surprisingly, this particular conception of discontinuity was introduced during the television era, in the early 1970s, just a few years after the publication of Peter Drucker's The Age of Discontinuity.
When you consider the extraordinary changes that we are experiencing in our time, technologically and ecologically (the latter underlined by the recent news concerning the United Nations' latest report on global warming), what we need is an understanding of the concept of change, a way to study the patterns of change: patterns that exist and persist across different levels, the micro and the macro, the physical, chemical, biological, psychological, and social. These are what Bateson referred to as metapatterns, the subject of further elaboration by biologist Tyler Volk in his book on the subject. Paul Watzlawick argued for the need to study change in and of itself in a little book co-authored with John H. Weakland and Richard Fisch, entitled Change: Principles of Problem Formation and Problem Resolution, which considers the problem from the point of view of psychotherapy. Arendt gives us a philosophical entrée into the problem by introducing the pattern of the hiatus, the moment of discontinuity that leads to change, and possibly a moment in which we, as human agents, can have an influence on the direction of that change.
To have such an influence, we do need to have that break, to find a space and more importantly a time to pause and reflect, to evaluate and formulate. Arendt famously emphasizes the importance of thinking in and of itself, the importance not of the content of thought alone, but of the act of thinking, the medium of thinking, which requires an opening, a time out, a respite from the onslaught of 24/7/365. This underscores the value of sacred time, and it follows that it is no accident that during that period of initiation in the story of the exodus, there is the revelation at Sinai and the gift of divine law, the Torah or Law, chief among them the Ten Commandments, which include the fourth and most detailed of the commandments: to observe the Sabbath day. This premodern ritual requires us to make the hiatus a regular part of our lives, to break the continuity of profane time on a weekly basis. From that foundation, other commandments establish the idea of the sabbatical year, and the sabbatical of sabbaticals, or jubilee year. Whether it's a Sabbath mandated by religious observance, or a new movement to engage in a Technology Sabbath, the hiatus functions as the response to the homogenization of time that was associated with efficient causality and literate linearity, and that continues to intensify in conjunction with the technological imperative of efficiency über alles.
To return one last time to the quote that I began with, the end of the old is not necessarily the beginning of the new because there may not be a new beginning at all, there may not be anything new to take the place of the old. The end of the old may be just that, the end, period, the end of it all. The presence of a hiatus to follow the end of the old serves as a promise that something new will begin to take its place after the hiatus is over. And the presence of a hiatus in our lives, individually and collectively, may also serve as a promise that we will not inevitably rush towards an end of the old that will also be an end of it all, that we will be able to find the opening to begin something new, that we will be able to make the transition to something better, that both survival and progress are possible, through an understanding of the processes of continuity and change.
Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.
On the Guernica blog, David Bromwich examines “how Obama became a publicist for his presidency (rather than the president).” In his first term Obama delivered 1,852 separate speeches, comments, or scheduled public remarks and granted 591 interviews. These exceptional numbers, Bromwich writes, were the result of “magical thinking” on the part of the Obama White House: if the American public heard the president often enough, they would see how sincere and bipartisan he was and accept his policies. An endless string of speeches, road trips, and town hall meetings thus came to serve as a stand-in for the decision-making and confrontation that true leadership requires, and genuine conviction demands. Argues Bromwich: “…The truth is that Obama’s convictions were never strong. He did not find this out until his convictions were tested, and they were not tested until he became president. Perhaps the thin connection between Obama’s words and his actions does not support the use of the word “conviction” at all. Let us say instead that he mistook his preferences for convictions—and he can still be trusted to tell us what he would prefer to do. Review the record and it will show that his first statement on a given issue generally lays out what he would prefer. Later on, he resigns himself to supporting a lesser evil, which he tells us is temporary and necessary. The creation of a category of permanent prisoners in “this war we’re in” (which he declines to call “the war on terror”) was an early and characteristic instance. Such is Obama’s belief in the power and significance of his own words that, as he judges his own case, saying the right thing is a decent second-best to doing the right thing.” For more see a commentary on the Arendt Center blog.
Phillip Durkin, author of the forthcoming book Borrowed Words, uses an interactive tool to show how English has changed over the last thousand years. Although still mostly dominated by Latin and French, English has also begun to borrow from languages with more distant origins, like Japanese, Russian, and Greek. Durkin's tool, and presumably his book, is a reminder of the fact that both words and their speakers exist in history, something all too easily lost in the hegemony of any present context.
Leonard Pierce takes aim at the aspirationism of the creative class, who, he says, are selling us their luck as our own failure. He concludes from the long view, “It is hard enough just being alive, just living and trying to be a decent person without being overwhelmed by shame and guilt and the demands of the world; the last thing we need is someone who got a few extra pulls of the handle at the cosmic slot machine telling us we’re doing it all wrong. If there is something we should aspire to, it certainly cannot be a position from which we look upon ordinary people, people no less miraculous but perhaps just a little less lucky than ourselves, as a lesser form of life."
In a speech to the German Parliament, Angela Merkel, that country's chancellor, explains her position on privacy and surveillance. The question is about more than what happens within any one country's borders, she says, and "millions of people who live in undemocratic states are watching very closely how the world’s democracies react to threats to their security: whether they act circumspectly, in sovereign self-assurance, or undermine precisely what in the eyes of these millions of people makes them so attractive—freedom and the dignity of the individual."
Considering the Philippine writer and hero Jose Rizal in the wake of reading Benedict Anderson's short book Why Counting Counts, Gina Apostol notes his two legacies: “For a Filipino novelist like myself, Rizal is a troubling emblem. Many writers like to dwell on the burden of his monumental legacy. But my problem is that Rizal is forgotten as an artist. Remembered (or dismembered) as a patriot, a martyr, a nationalist, a savior, a saint, Rizal is not discussed much as a writer — he is not read as an artist. Our national hero now shares the fate of all of us who attempt to write about our country in fiction. No one really reads his novels."
Audra Wolfe, taking note of Neil deGrasse Tyson's resurrection of Carl Sagan's TV science epic Cosmos, suggests that any hope that the series may bring increased attention, and therefore increased funding, to scientific pursuits may be misguided: "As is so often the case with science communication, the assumption seems to be that public understanding of science—sprinkled with a hearty dose of wonder and awe—will produce respect for scientific authority, support for science funding, and a new generation of would-be scientists. If only Americans loved science a little more, the thinking goes, we could end our squabbling about climate change, clean energy, evolution, and funding NASA and the National Science Foundation. These are high hopes to pin on a television show, even one as glorious as Cosmos." Although Wolfe makes a good argument about how Sagan's world is different from the world we now inhabit with Tyson, there's something more basic at work here: the pernicious notion that, if we educate people who don't agree with us just a little bit more, they'll come around to our way of thinking. This, obviously, is a deeply dismissive point of view, one that suggests that everyone should think as we do, and that their failure to do so is a question of status rather than viewpoint. If Cosmos gets people interested in science, it will be the possibility, the things that we are yet to understand, that get them excited, rather than what has already been settled. Speak to that sense of wonder and people very well may listen; speak to what you think people don't know and should, and they'll tune you out.
This week on the blog, read a recap and watch the video of Roger Berkowitz and Walter Russell Mead speaking with SCOTUSblog founder Tom Goldstein as part of our “Blogging and the New Public Intellectual” series. Jason Adams relates Arendt’s belief that the act of thinking slips humanity out of historical and biographical time and into a non-time that reconstitutes the world. Roger Berkowitz ponders whether President Obama lacks conviction, and in the Weekend Read, Roger Berkowitz examines the current antisemitic controversies surrounding both Martin Heidegger and Paul de Man.
The first of the three volumes of the Gesamtausgabe of Martin Heidegger’s work, titled Überlegungen, or Reflections, arrived in the mail. Somehow I’ll read the over 1,000 pages in these three volumes. And on April 8 I’ll be moderating a discussion on these volumes at the Goethe Institute in New York City, with Peter Trawny, the editor, as well as Babette Babich and Andrew Mitchell. But these volumes, even before they are published, have preemptively elicited dozens upon dozens of reviews and scandalized yelps of outrage, nearly all by people who haven’t read them. What is more, most of these commentators also have never seriously read Martin Heidegger’s philosophy. The occasion for the outrage is that these so-called Schwarzen Hefte (The Black Notebooks) include statements that clearly trade in Jewish stereotypes and anti-Semitic tropes.
No one should be surprised that Heidegger had certain opinions about Jews that are anti-Semitic. Heidegger may be the most important philosopher of the 20th century. Be wary of anyone who denies his importance. But that does not mean he was a good person or without prejudices. The fact that his published work had never previously included anti-Semitic remarks is hardly evidence of his tolerance.
Amongst the most salacious of the literati pronouncing “Heidegger’s Hitler Problem is Worse Than We Thought” is Rebecca Schuman at Slate. Slightly better is the horrifically titled “Heidegger's 'black notebooks' reveal antisemitism at core of his philosophy,” by Philip Oltermann in The Guardian. On the other side, Jonathan Rée writes in defense of Heidegger. Rée makes an excellent point about the confusion of the charge of antisemitism with the philosophy itself:
I think that those who say that because he was anti-Semitic we should not read his philosophy show a deep ignorance about the whole tradition of writing and reading philosophy. The point about philosophy is not that it offers an anthology of opinions congenial to us, which we can dip into to find illustrations of what you might call greeting card sentiments. Philosophy is about learning to be aware of problems in your own thinking where you might not have suspected them. It offers its readers an intellectual boot camp, where every sentence is a challenge, to be negotiated with care. The greatest philosophers may well be wrong: the point of recognising them as great is not to subordinate yourself to them, but to challenge yourself to work out exactly where they go wrong.
But the charge of many of Heidegger’s critics is not simply that he is an antisemite, but that his philosophy is founded upon antisemitism. As someone who has read Heidegger closely for decades, I can say confidently that such an opinion is based on fundamental misunderstandings. There is no need to deny Heidegger’s antisemitism. And yet, that is not at all an indictment of his philosophy. But Rée goes further, and concludes:
As for the hullaballoo over the Schwarzen Hefte. In the first place it seems to me a remarkable piece of publicity-seeking on the part of the publisher, who hints that we may at last find the black heart of anti-Semitism that beats in every sentence Heidegger wrote. That would of course be very gratifying to people who want an excuse for not taking Heidegger seriously, but it seems to me—from the few leaked passages I have seen, dating from 1938-9—that if Heidegger is on trial for vicious anti-Semitism, then the newly published notebooks make a case for the defence rather than the prosecution.
While I agree with Rée that this is largely a case of insane overreaction, one cannot say that the notebooks offer a defense of Heidegger, certainly not before reading them. What is more, only three of the planned four volumes of these notebooks are being published. The final notebook, covering the years 1941-1945, is apparently being held back, and not even Peter Trawny, the editor of the first three volumes, is permitted to read the final one. We are left to imagine how much more damaging that final volume may be. What is undeniable, it seems, is that Heidegger certainly adopted and reflected upon some vulgar examples of antisemitism.
It is no small irony that the Schwarzen Hefte are being published in Germany at the same moment as a new biography of Paul de Man (The Double Life of Paul de Man by Evelyn Barish) is being released and reviewed in the U.S. De Man, like Heidegger, stands accused of Nazi writing and opinions during the war. Peter Brooks has an excellent essay on the controversy in the New York Review of Books. He writes:
Judging the extent and the gravity of de Man’s collaboration is difficult. At the war’s end, he was summoned for questioning in Brussels by the auditeur-général in charge of denazification, who decided not to bring any charges against him (whereas the editors of Le Soir were condemned to severe punishments). One could leave it at that: if not guiltless, not sufficiently guilty to merit sanction. Yet both those to whom de Man was an intellectual hero and those to whom he was akin to an academic Satan have wanted to know more.
Brooks is at his best when he takes seriously the charges against de Man but also reminds us of the context as well as the lost nuance in our backward-looking judgments:
The most useful pieces in Responses come from the Belgians Ortwin de Graef, who as a young scholar discovered the wartime pieces, and Els de Bens. They help us to understand the nuances of collaboration in the occupied country, the different degrees of complicity with an enemy whom some saw as a liberator, and the evolution of a situation in which an apparent grant of at least limited freedom of speech and opinion gradually revealed itself to be an illusion. They do not conduce to excusing de Man—he clearly made wrong choices at a time when some others made right, and heroic, choices. They give us rather grounds for thought about life under occupation (which most Americans have not known) and the daily compromises of survival. They suggest that in our hindsight we need to be careful of unnuanced judgment. To try to understand is not in this case to excuse, but rather to hold ourselves, as judges, to an ethical standard.
On that ethical standard, Brooks finds Barish lacking. Her assertions are unsupported. And footnotes lead nowhere, as, for example, “I shared this information, and it has since been previously published in Belgian sources not now available to me.” And also, “This writer understands that an essay (citation unavailable) was produced by a student in Belgium.” As Brooks comments, “That does not pass any sort of muster. One could do a review of Barish’s footnotes that would cast many doubts on her scholarship.”
Brooks’ review is an important reminder of the way that charges of antisemitism are crude weapons. Barish, he writes, “goes on to conclude that de Man was not a pronounced anti-Semite but rather ‘one of the lukewarm, whom Dante condemned to sit eternally at the gates of Hell, men without principles or convictions who compromised with evil.’” I am left to wonder what it means to condemn lukewarm antisemites or racists to purgatory.
As the Director of the Hannah Arendt Center, I confront all kinds of misinformation spread by those who insist that Hannah Arendt defended Adolf Eichmann (on the contrary, she called for him to be killed and erased from the face of the earth), that she blamed the Jews for the Holocaust (she never equates Jewish cooperation with the crimes of the Nazis), and that she opposed the state of Israel (she thought the existence of Israel important and necessary). No matter how often it is corrected, such misinformation has the tendency to spread and choke off meaningful thought and consideration.
The propagandists and vultures are circling the new Heidegger affair with open mouths. It is important at such moments to recall how easily such feeding frenzies can devour the good and the middling along with the bad and horrifically evil. It is helpful, therefore, to read a few sober cautions about the current Paul de Man controversy. Susan Rubin Suleiman has an excellent account in the NY Times Book Review. And then there is Brooks' essay in the NYRB. They are your weekend reads.
“What I propose, therefore, is very simple: it is nothing more than to think what we are doing”
—Hannah Arendt, “Prologue”, The Human Condition
The final scene of Alfonso Cuarón’s new film, Gravity, shows us Sandra Bullock trapped underwater in a satellite escape pod that she has just crashed into earth. Breaking loose from the straps and the heavy door of the pod, her body shoots up, slender and nymph-like, to the surface of the unnamed body of water in which she almost drowned. She crawls out to the sand, in the footsteps of some primordial amphibian, and within a few seconds she has struggled her way to uprightness, readjusting to gravity and completing the entire process of evolution. With Bullock, we feel relief and gratitude for the force that pulls us all down and makes us earth-bound creatures. In the 90 minutes leading up to this moment, we have seen her float in space, escaping one disaster or explosion after another and keeping herself precariously tethered to a bunch of satellite debris, until she finally manages to launch herself back to earth and to gravity.
I thought of this last scene – that final bit of action and irony thrown in before we are allowed to leave the movie theater: “You think she has made it back to earth? Oh no! She is about to drown!” – as I watched Margarethe von Trotta’s Hannah Arendt. The earth, and the fact that we are earth-bound creatures, our life with gravity, was a matter of great interest to Arendt. She discusses the launch of the Sputnik, that forefather of the satellites that crowd the sky in Gravity, in the prologue of her book, The Human Condition, and worries that we might all find ourselves in the intellectual corollary of Sandra Bullock’s hovering in space, losing our earthly orientation. The earth, Arendt writes, “is the very quintessence of the human condition.” (You can read an essay and watch a talk on Arendt’s discussion of earth alienation).
Unlike Cuarón, von Trotta has not produced an action movie in the conventional sense of the term, a fact that she seems to mark explicitly in the first scene of her film, which depicts the abduction of Adolf Eichmann by Mossad agents in Argentina. That moment could be the focal point of an action movie, but von Trotta wants to show us not action, but thinking, a contrast that she draws from Arendt’s writings, of course.
The movie is rich with details of Arendt’s life in the world: her love relationships and friendships, her body and the domestic setting that housed it, her public life. But what it attempts to capture are the moments in which Arendt withdraws from all of that to do what she suggests in the prologue to The Human Condition: “to think what we are doing.” Barbara Sukowa depicts the thinking Arendt as she lies down on her recliner, eyes closed, slowly sucking on her cigarette. In fact, what she does is not thinking, but – as we are made to notice by Mary McCarthy’s chiding imitation of her friend’s heavy German accent in one of the party scenes that take place in the Arendt-Blücher home on the Upper West Side – she is “sinking.” This is not a minor detail. Arendt’s political thought and her controversial analysis of the Eichmann trial, which is at the center of the movie, were formed by her own experience of statelessness and exile; the book about Eichmann, which she wrote in English, speaks with that German accent.
From the moment McCarthy imitates it, whenever Arendt speaks passionately about “the responsibility to sink” and “Eichmann’s inability to sink,” the viewer cannot help but take note with amusement. A second immigrant’s slip of the tongue, caught by McCarthy and highlighted by its significant recurrence in the movie, also belongs to the same underwater sphere where Bullock spends the final dramatic moments of Gravity. In a discussion of the upcoming American elections, Arendt predicts that what will matter “when the ships are down” is Kennedy’s youth and charisma. When McCarthy corrects her, Arendt waves her hands impatiently. But as von Trotta’s film winds its way toward its ending, in the dramatic scene in which Arendt finally decides to lecture in public and provide a passionate defense of her book, she corrects herself and states that radical evil occurs when people fail to act “when the chips are down,” emphasizing the affricate sound of her acquired American idiom.
Though it could not be more different from Cuarón’s last bid to pump his viewers’ adrenaline by throwing Bullock into the sea, this too is an action scene. Arendt is performing precisely the type of action championed in her Human Condition, stepping out to the Agora, engaging in debate and defending her position. What von Trotta has shown is that Arendt’s terms are useful also for thinking about current cinema and the ways in which it shows us what it means to be human, what it means to act and to think about what we are doing.
University of Chicago