Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.
Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.
Amy Ireland is thinking about a genocide at the level of "genus-cide," the eradication of humanity itself. The threat is not weaponry but technology. And the exemplary precursor is the horse: "In the United States--where competition with the automobile was at its most intense--there were about 26 million horses in 1915. By the 1950s only 2 million remained." The question Ireland asks is whether humans are going the way of horses, to be replaced by more efficient machines. Will artificially intelligent machines consume humans as fuel? "Far from being actively malevolent, an artificially intelligent agent endowed with enough power only needs to be indifferent to become a murderer. What are we, after all, but fuel? Atoms that can be freely disassembled and reassembled into something else--a thousand paperclip factories, for instance, or a massive supercomputer, capable of mathematical calculations we can't even begin to imagine in our current state of technological paucity. Even the clearly delimited goal of creating exactly one million paperclips can warrant the wasting of an entire planet, for a fully rational AI would never assign zero probability to the hypothesis that it has not yet achieved its goal.... There is something satisfying about imagining a malevolent artificial intelligence that actively wants to destroy us because it fears us, loathes us, or at least finds our existence frustrating and inconvenient. But the notion that something will destroy us out of sheer indifference is much harder to swallow because it forces us to consider the possibility of our utter insignificance. Bostrom surmises with all the level-headedness of a pure statistician that the odds against humanity's survival are overwhelmingly high. The default outcome of our construction of a single strong artificial intelligence is, quite plainly, extinction.
His intention, naturally, is to raise awareness of the risks that lie behind this seemingly anodyne technological innovation and encourage governments, corporations or other entities that may one day attempt to build strong AI to implement rigorously tested control measures before letting the thing out of the box. All this is well and good, but it rests upon a deeper anthropomorphic supposition. What if the most radical gesture a flailing humanity can make at this juncture is not to increase its investment in security and control, but to pass it on? What if we are entangled in a larger evolutionary process that we never had control over in the first place? The real question then, might not be how to survive the construction of strong artificial intelligence but whether or not the survival of the human race is a good thing after all." Ireland is right to pose the question of "genus-cide," although her tone is a bit blithe. The threat is not the eradication of human beings but, as Arendt writes in The Human Condition, the loss of the human condition, those characteristics of being human like labor, work, action, and (sometimes) thinking. As Arendt writes, "This future man, whom the scientists tell us they will produce in no more than a hundred years, seems to be possessed by a rebellion against human existence as it has been given, a free gift from nowhere (secularly speaking), which he wishes to exchange, as it were, for something he has made himself. There is no reason to doubt our abilities to accomplish such an exchange."
Karl Ove Knausgaard was commissioned to travel from Sweden to the Vikings' first settlement in Newfoundland and then drive across the United States in order to reflect on the state of America. In part one of his two-part "Saga," Knausgaard offers this insight into a specifically American form of poverty, the poverty of imagination and the abandonment of distinction: "I'd seen poverty before, of course, even incomprehensible poverty, as in the slums outside Maputo, in Mozambique. But I'd never seen anything like this. If what I had seen tonight--house after house after house abandoned, deserted, decaying as if there had been a disaster--if this was poverty, then it must be a new kind of poverty, maybe in the same way that the wealth that had amassed here in the 20th century had been a new kind of wealth. I had never really understood how a nation that so celebrated the individual could obliterate all differences the way this country did. In a system of mass production, the individual workers are replaceable and the products are identical. The identical cars are followed by identical gas stations, identical restaurants, identical motels and, as an extension of these, by identical TV screens, which hang everywhere in this country, broadcasting identical entertainment and identical dreams. Not even the Soviet Union at the height of its power had succeeded in creating such a unified, collective identity as the one Americans lived their lives within. When times got rough, a person could abandon one town in favor of another, and that new town would still represent the same thing. Was that what home was here? Not the place, not the local, but the culture, the general?"
Peter Railton gave the John Dewey Lecture at the American Philosophical Association Meeting this year, where amidst reflections on philosophical thinking, personal courage, and political activism, he offered a guileless and moving account of his personal struggle with depression. "And what of depression? Perhaps we all know the mask of depression, that frozen, affectless face we catch glimpses of on our students, colleagues, and friends. I can't do anything about that. But perhaps I can do something about the face of depression--its visible image in the minds of our children and parents, teachers and students. Because in truth, we are still, to a considerable degree, in a world of 'Don't ask, don't tell' with regard to depression and associated mental disorders, such as anxiety, even though these will severely affect one in ten of us over the course of a lifetime, and often at more than one point in a lifetime. So there's nothing for it. Those who have dwelt in the depths of depression need to come out as well. Some already have, but far too few adult men (big surprise!), and especially far too few of the adult men who somehow have come to bear the stamp of respectability and recognition, and thus are visible to hundreds of students and colleagues. It's no big deal, right? We're all enlightened about this. Then why do the words stick in my throat when I tell you that another theme uniting the three episodes I have recounted from my life, and that has played an equally important role in shaping my philosophy, is that they were all accompanied by my depression. This moody high school student, this struggling protester, this anxious young faculty member--they were all me and they were all living through major depressive episodes at the time. And there have been other such episodes, some more recent. Thankfully, for me and especially for my family who have been through so much already, not right now. Did others know? I don't know.
Some must have guessed--perhaps those who themselves had known depression in their lives could see the mask of depression upon my face. But the thing is: I couldn't say it. I couldn't say, 'Look, I'm dying inside. I need help.' Because that's what depression is--it isn't sadness or moodiness, it is above all a logic that undermines from within, that brings to bear all the mind's mighty resources in convincing you that you're worthless, incapable, unloveable, and everyone would be better off without you. Not a steely-eyed, careful critique from which one might learn, but an incessant bludgeoning that exaggerates past errors while ignoring new information, eroding even the ability to form memories. A young man once had the courage to tell me, 'My brain is telling me to kill myself, but my body is saying "no."' Happily, his body won. But it doesn't always. Every year, thousands of young men don't win the battle. We are captive audiences to our own minds, and it can become intolerable." Depression, Railton suggests, is still in the closet, and this causes untold pain at colleges, where, as a recent study shows, the mental health of college freshmen is at an all-time low--something that will not surprise any of us who teach in this nation's colleges and universities.
In Railton's speech on depression discussed above, he also has this tidbit on meetings: "Oscar Wilde is still right--because the cost of building a society where the people have more say in how their lives are run is still many, many meetings. What is a meeting, after all, but people deliberating together with a capacity to act as a group that is more than just a sum of individual actions, and this sort of informed joint action is a precondition for significant social change. Come together, decide together, act together, and bear the consequences together. We must own our institutions or they will surely own us. As Aristotle told us, one becomes a citizen not by belonging to a polity or having a vote, but by shouldering the tasks of joint deliberation and civic governance. And there is no civic or faculty governance, no oversight of discrimination in hiring and promotion, no regulation of pollutants, no organization of faculty or students to initiate curricular reform, no mobilization by professional associations to protect their most vulnerable members or to promote greater diversity, no increased humaneness in the treatment of animals and human subjects, no chance to offset arbitrariness and bullying within offices and departments, no oversight of progress and revision of plans in response to changing circumstances, without actual people who care spending long hours in the work of planning, meeting, and making things happen. The alternative is for all these decisions to be made at the discretion of those on high--or not at all." At a moment when faith and participation in institutions are rare and the pursuit of individual interests is comparatively common, Railton's reminder of what Arendt calls the power of talking and acting together is worth heeding.
David Cole writes that the Senate Torture Report, when read in full, leads to fundamentally different conclusions than most of the headlines and early accounts suggest. Above all, the report blaming the CIA for lying may have missed the real story: "The full story is more complicated, and ultimately much more disturbing, than the initial responses--mine included--suggested. And because these documents may be the closest we come to some form of accountability, it is essential that we get the lessons right.... So why did the committee focus on efficacy and misrepresentation, rather than on the program's fundamental illegality? Possibly because that meant it could cast the C.I.A. as solely responsible, a rogue agency. A focus on legality would have rightly held C.I.A. officials responsible for failing to say no--but it also would have implicated many more officials who were just as guilty, if not more so. Lawyers at the Justice Department wrote a series of highly implausible legal memos from 2002 to 2007, opining that waterboarding, sleep deprivation, confinement in coffinlike boxes, painful stress positions and slamming people into walls were not torture; were not cruel, inhuman or degrading; and did not violate the Geneva Conventions. The same can be said for President George W. Bush, Vice President Dick Cheney and all the cabinet-level officials responsible for national security, each of whom signed off on a program that was patently illegal. The reality is, no one in a position of authority said no. This may well explain the committee's focus on the C.I.A. and its alleged misrepresentations. The inquiry began as a bipartisan effort, and there is no way that the Republican members would have agreed to an investigation that might have found fault with the entire leadership of the Bush administration. But while the committee's framing may be understandable as a political matter, it was a mistake as a matter of historical accuracy and of moral principle. 
The report is, to date, the closest thing to official accountability that we have. But by focusing on whether the program worked and whether the C.I.A. lied, the report was critically misleading. Responsibility for the program lies not with the C.I.A. alone, but also with everyone else, up to the highest levels of the White House, who said yes when law and morality plainly required them to say no."
Adam Phillips worries about what's inside us: "We are never as good as we should be; and neither, it seems, are other people. A life without a so-called critical faculty would seem an idiocy: what are we, after all, but our powers of discrimination, our taste, the violence of our preferences? Self-criticism, and the self as critical, are essential to our sense, our picture, of our so-called selves. Nothing makes us more critical--more suspicious or appalled or even mildly amused--than the suggestion that we should drop all this relentless criticism, that we should be less impressed by it and start really loving ourselves. But the self-critical part of ourselves, the part that Freud calls the super-ego, has some striking deficiencies: it is remarkably narrow-minded; it has an unusually impoverished vocabulary; and it is, like all propagandists, relentlessly repetitive. It is cruelly intimidating--Lacan writes of 'the obscene super-ego'--and it never brings us any news about ourselves. There are only ever two or three things we endlessly accuse ourselves of, and they are all too familiar; a stuck record, as we say, but in both senses--the super-ego is reiterative. It is the stuck record of the past ('something there badly not wrong', Beckett's line from Worstward Ho, is exactly what it must not say) and it insists on diminishing us. It is, in short, unimaginative; both about morality, and about ourselves. Were we to meet this figure socially, this accusatory character, this internal critic, this unrelenting fault-finder, we would think there was something wrong with him. He would just be boring and cruel. We might think that something terrible had happened to him, that he was living in the aftermath, in the fallout, of some catastrophe. And we would be right." In other words, critical thinking is essential, but let's also recall that it is dangerous. All thinking is an attack on the status quo and the common world in which we live. 
That is what Arendt means when she writes, "There are no dangerous thoughts. Thinking itself is dangerous." That doesn't mean we should stop thinking critically, but it does mean that thinking requires knowing when thinking is, and when it is not, needed. That is the moment of judgment.
Novelist Gary Shteyngart spent a week watching Russian television and living like a Russian oligarch: "Here is the question I'm trying to answer: What will happen to me--an Americanized Russian-speaking novelist who emigrated from the Soviet Union as a child--if I let myself float into the television-filtered head space of my former countrymen? Will I learn to love Putin as 85 percent of Russians profess to do? Will I dash to the Russian consulate on East 91st Street and ask for my citizenship back? Will I leave New York behind and move to Crimea, which, as of this year, Putin's troops have reoccupied, claiming it has belonged to Russia practically since the days of the Old Testament? Or will I simply go insane? A friend of mine in St. Petersburg, a man in his 30s who, like many his age, avoids state-controlled TV and goes straight to alternative news sources on the Internet, warns me in an email: 'Your task may prove harmful to your psyche and your health in general. Russian TV, especially the news, is a biohazard.' I'll be fine, I think. Russians have survived far worse than this. But, just in case, I have packed a full complement of anti-anxiety, sleep and pain medication."
Andy Greenwald considers what made the recently concluded sitcom Parks and Recreation successful and what its legacy might be: "Art doesn't always have to be a dark mirror reflecting reality. It can and should also be a window, thrown open to let in every last bit of possible light. Parks and Recreation never quite resembled the real America. But every episode was imbued with the idea that maybe it could, if only we, the people, cared a little more and tried a little harder. The Wire, the greatest drama of the young 21st century, left us with a tough legacy to reckon with. Parks and Rec, the best comedy of that same century, gifted us with a beautiful model to which we can collectively aspire. I doubt the future will be as bleak as David Simon's vision for it or as rosy as Mike Schur's. The joy of being a TV fan is that we get to consider both. That's not a cop-out, by the way. That's a compromise, and one that even President Leslie Knope could accept. After all, Parks was built on the bedrock belief that opposing ideas could not only have merit, they could coexist. Like the show itself, it's an idea that sounds simple but in practice is anything but."
"Arendt's Critique of Modern Society as an Analysis of Process Imaginary"
Tuesday, March 3, 2015
The Hannah Arendt Center, 1:00 pm
The Hannah Arendt Center announces three post-doctoral fellowships for the 2015-2016 academic year.
To learn more about the fellowships, including how to apply, click here.
Application Deadline: Thursday, March 5, 2015
HAC members at all levels are eligible to participate in a monthly reading group, conducted online via video conference and led by Roger Berkowitz, Director of the Hannah Arendt Center.
For questions and to enroll in our virtual reading group, please email David Bisson, our Media Coordinator, at firstname.lastname@example.org.
Friday, March 6, 2015
Bluejeans.com, 11:00 am - 12:00 pm
"Figuring Rights: Wollstonecraft and the Right to Political Community"
Tuesday, March 10, 2015
The Hannah Arendt Center, 6:00 - 7:00 pm
Synopsis: A diverse group of South African actors tours the war-torn regions of Northern Ireland, Rwanda, and the former Yugoslavia to share their country's experiment with reconciliation. As they ignite a dialogue among people with raw memories of atrocity, the actors find they must once again confront their homeland's violent past, and question their own capacity for healing and forgiveness.
Tuesday, March 24, 2015
Weis Cinema, Campus Center, 6:30 pm
Putting Courage at the Centre: Gandhi on Civility, Society and Self-Knowledge
Invite Only. RSVP Required.
Property and Freedom: Are Access to Legal Title and Assets the Path to Overcoming Poverty in South Africa?
A one-day conference sponsored by the Hannah Arendt Center for Politics and Humanities at Bard College, the Human Rights Project, and the Center for Civic Engagement, with support from the Ford Foundation, The Brenthurst Foundation, and The University of The Western Cape
Monday, April 6, 2015
Bard College Campus Center, Weis Cinema, 10:00 am - 7:00 pm
Invite Only. RSVP Required.
Thursday and Friday, October 15 and 16, 2015
The Hannah Arendt Center's eighth annual fall conference, "Privacy: Why Does It Matter?," will be held on Thursday and Friday, October 15-16, 2015. We'll see you there!
This week on the Blog, Johannes Lang explores the moral and political consequences of emotion entering into the public sphere in the Quote of the Week. American moral and social philosopher Eric Hoffer provides this week's Thoughts on Thinking. In a special feature, we recognize Aliza Becker, one of our Associate Fellows, and her creation of the American Jewish Peace Archive: An Oral History of Israeli-Palestinian Peace Activists (AJPA). And we appreciate Arendt's engagement with Saint Augustine's "Confessions" in our Library feature.
This coming Friday, March 6th, the Hannah Arendt Center will host the fifth session of its Virtual Reading Group. We will be discussing Chapters 10-13 of The Human Condition.
The reading group is available to all members and always welcomes new participants! Please click here to learn more!
By Johannes Lang
“Whatever the passions and the emotions may be, and whatever their true connection with thought and reason, they certainly are located in the human heart. And not only is the human heart a place of darkness which, with certainty, no human eye can penetrate; the qualities of the heart need darkness and protection against the light of the public to grow and to remain what they are meant to be, innermost motives which are not for public display.”
–Hannah Arendt, On Revolution (1963)
Since September 11, 2001, historians and social scientists have rediscovered the political relevance of emotion. In the current climate of war and terror, public discussion is suffused with references to fear, hatred, and patriotism. But what are the moral and political consequences when such passions enter the public sphere? One of the most famous political thinkers of the twentieth century, Hannah Arendt, worried about the entry of emotion into politics. She scolded the French revolutionaries for having been carried away by their compassion for the poor and praised the American Founding Fathers for their aloof commitment to universal ideals and for their detached attitude to the suffering masses. Emotions may be important as subjective motives for individual action, Arendt granted, but they should neither be aired in public nor be made the basis for collective action. Emotions disfigure politics; political movements should be based on rational argument, not passion. Yet, as Volker Heins has pointed out, there was one thing Arendt feared more than the intrusion of emotions into politics: a politics completely devoid of emotion. The “ice-cold reasoning” and bureaucratic rationality she discerned behind the Holocaust was infinitely more terrifying than any other political pathology known to man. Arendt’s deep ambivalence toward emotions confronts us with a fundamental question: What is the proper place of emotion in politics?
By Hans Teerds
“The French have become masters in the art of being happy among ‘small things,’ within the space of their own four walls, between chest and bed, table and chair, dog and cat and flowerpot, extending to these things a care and tenderness which, in a world where rapid industrialization constantly kills off the things of yesterday to produce today’s objects, may even appear to be the world’s last, purely humane corner.”
-- Hannah Arendt, The Human Condition
During my first reading of Arendt’s The Human Condition, this particular quote attracted my attention, probably because I’m trained as an architect and sensitive to these kinds of imaginable, tangible examples. (I must also mention the very nice and almost poetic rhythm in the ‘construction’ of the sentences quoted above.) The passage immediately reminded me of the famous text “Paris: Capital of the Nineteenth Century,” in which Walter Benjamin, among other things, links the importance of the domestic interior to the emerging impact of industrialization on the working people. Through the mutability of modernity, as symbolised by the arcades with their cast-iron constructions, Benjamin argues that the interior comes into conscious being to the extent that our life, work, and surroundings change. The interior of domestic life originates in the need for a place of one’s own: a small but personal haven in a turbulent world that is subject to constant change.
"[T]hese are exercises in political thought as it arises out of the actuality of political incidents (though such incidents are mentioned only occasionally), and my assumption is that thought itself arises out of incidents of living experience and must remain bound to them as the only guidepost by which to take its bearings."
-- Hannah Arendt, “Preface,” in Between Past and Future
One of the enduring sources of inspiration in Hannah Arendt's political thought is her exceptional capability of tying together reflections on concrete worldly events with in-depth philosophical, historical, and cultural insights. Her thinking never prioritizes abstract theorizations and never uses the incidents of the political world only as “examples.” Instead, for her the activity of thinking is about making sense of the events of the time. Whenever Arendt – against her habit – assumes a self-reflective position with regard to her own way of doing political theory, her emphasis is on the experiential nature of thought. In The Origins of Totalitarianism, she insists on facing the “impact of reality and the shock of experience” in all their force without succumbing to either reckless pessimism or optimism. The call to “think what we are doing,” as she puts it in The Human Condition, is indeed the primus motor underlying all her works.
"The habit of thinking prevents us at times from experiencing reality, immunises us against it, makes it seem no more than any other thought."
-- Marcel Proust
"The end of the old is not necessarily the beginning of the new."
Hannah Arendt, The Life of the Mind
This is a simple enough statement, and yet it masks a profound truth, one that we often overlook out of the very human tendency to seek consistency and connection, to make order out of the chaos of reality, and to ignore the anomalous nature of that which lies in between whatever phenomena we are attending to.
Perhaps the clearest example of this has been what proved to be the unfounded optimism that greeted the overthrow of autocratic regimes through American intervention in Afghanistan and Iraq, and the native-born movements known collectively as the Arab Spring. It is one thing to disrupt the status quo, to overthrow an unpopular and undemocratic regime. But that end does not necessarily lead to the establishment of a new, beneficent and participatory political structure. We see this time and time again, now in Putin's Russia, a century ago with the Russian Revolution, and over two centuries ago with the French Revolution.
Of course, it has long been understood that oftentimes, to begin something new, we first have to put an end to something old. The popular saying that you can't make an omelet without breaking a few eggs reflects this understanding, although it is certainly not the case that breaking eggs will inevitably and automatically lead to the creation of an omelet. Breaking eggs is a necessary but not sufficient cause of omelets, and while this is not an example of the classic chicken and egg problem, I think we can imagine that the chicken might have something to say on the matter of breaking eggs. Certainly, the chicken would have a different view on what is signified or ought to be signified by the end of the old, meaning the end of the egg shell, insofar as you can't make a chicken without it first breaking out of the egg that it took form within.
So, whether you take the chicken's point of view, or adopt the perspective of the omelet, looking backwards, reverse engineering the current situation, it is only natural to view the beginning of the new as an effect brought into being by the end of the old, to assume or make an inference based on sequencing in time, to posit a causal relationship and commit the logical fallacy of post hoc ergo propter hoc, if for no other reason than the force of narrative logic that compels us to create a coherent storyline. In this respect, Arendt points to the foundation tales of ancient Israel and Rome:
We have the Biblical story of the exodus of Israeli tribes from Egypt, which preceded the Mosaic legislation constituting the Hebrew people, and Virgil's story of the wanderings of Aeneas, which led to the foundation of Rome—"dum conderet urbem," as Virgil defines the content of his great poem even in its first lines. Both legends begin with an act of liberation, the flight from oppression and slavery in Egypt and the flight from burning Troy (that is, from annihilation); and in both instances this act is told from the perspective of a new freedom, the conquest of a new "promised land" that offers more than Egypt's fleshpots and the foundation of a new City that is prepared for by a war destined to undo the Trojan war, so that the order of events as laid down by Homer could be reversed.
Fast forward to the American Revolution, and we find that the founders of the republic, mindful of the uniqueness of their undertaking, searched for archetypes in the ancient world. And what they found in the narratives of Exodus and the Aeneid was that the act of liberation and the establishment of a new freedom are two events, not one, and in effect subject to Alfred Korzybski's non-Aristotelian Principle of Non-Identity. The success of the formation of the American republic can be attributed to the founders' awareness of the chasm that exists between the closing of one era and the opening of a new age, of their separation in time and space:
No doubt if we read these legends as tales, there is a world of difference between the aimless desperate wanderings of the Israeli tribes in the desert after the Exodus and the marvelously colorful tales of the adventures of Aeneas and his fellow Trojans; but to the men of action of later generations who ransacked the archives of antiquity for paradigms to guide their own intentions, this was not decisive. What was decisive was that there was a hiatus between disaster and salvation, between liberation from the old order and the new freedom, embodied in a novus ordo saeclorum, a "new world order of the ages" with whose rise the world had structurally changed.
I find Arendt's use of the term hiatus interesting, given that in contemporary American culture it has largely been appropriated by the television industry to refer to a series that has been taken off the air for a period of time, but not cancelled. The typical phrase is on hiatus, meaning on a break or on vacation. But Arendt reminds us that such connotations only scratch the surface of the word's broader meanings. The Latin word hiatus refers to an opening or rupture, a physical break or missing part or link in a concrete material object. As such, it becomes a spatial metaphor when applied to an interruption or break in time, a usage introduced in the 17th century. Interestingly, this coincides with the period in English history known as the Interregnum, which began in 1649 with the execution of King Charles I, led to Oliver Cromwell's installation as Lord Protector, and ended after Cromwell's death with the Restoration of the monarchy under Charles II, son of Charles I. While in some ways anticipating the American Revolution, the English Civil War followed an older pattern, one that Mircea Eliade referred to as the myth of eternal return, a circular movement rather than the linear progression of history and cause-effect relations.
The idea of moving forward, of progress, requires a future-orientation that only comes into being in the modern age, by which I mean the era that followed the printing revolution associated with Johannes Gutenberg (I discuss this in my book, On the Binding Biases of Time and Other Essays on General Semantics and Media Ecology). But that same print culture also gave rise to modern science, and with it the monopoly granted to efficient causality, cause-effect relations, to the exclusion in particular of final and formal cause (see Marshall and Eric McLuhan's Media and Formal Cause). This is the basis of the Newtonian universe in which every action has an equal and opposite reaction, and every effect can be linked back in a causal chain to another event that preceded it and brought it into being. The view of time as continuous and connected can be traced back to the introduction of the mechanical clock in the 13th century, but was solidified through the printing of calendars and time lines, and the same effect was created in spatial terms by the reproduction of maps, and the use of spatial grids, e.g., the Mercator projection.
And while the invention of history, as a written narrative concerning linear progression over time, can be traced back to the ancient Israelites and the story of the exodus, that story incorporates the idea of a hiatus in overlapping structures:
A1. Joseph is the golden boy, the son favored by his father Jacob, earning him the enmity of his brothers
A2. he is sold into slavery by them, winds up in Egypt as a slave and then is falsely accused and imprisoned
A3. by virtue of his ability to interpret dreams he gains his freedom and rises to the position of Pharaoh's prime minister
B1. Joseph welcomes his brothers and father, and the House of Israel goes down to Egypt to sojourn due to famine in the land of Canaan
B2. their descendants are enslaved, oppressed, and persecuted
B3. Moses is chosen to confront Pharaoh, liberate the Israelites, and lead them on their journey through the desert
C1. the Israelites are freed from bondage and escape from Egypt
C2. the revelation at Sinai fully establishes their covenant with God
C3. after many trials, they return to the Promised Land
It can be clearly seen in these narrative structures that the role of the hiatus, in ritual terms, is that of the rite of passage, the initiation period that marks, in symbolic fashion, the change in status, the transformation from one social role or state of being to another (e.g., child to adult, outsider to member of the group). This is not to discount the role that actual trials, tests, and other hardships may play in the transition, as they serve to establish or reinforce, psychologically and sometimes physically, the value and reality of the transformation.
In mythic terms, this structure has become known as the hero's journey or hero's adventure, made famous by Joseph Campbell in The Hero with a Thousand Faces, and also known as the monomyth, because he claimed that the same basic structure is universal to all cultures. The basic structure he identified consists of three main elements: separation (e.g., the hero leaves home), initiation (e.g., the hero enters another realm, experiences tests and trials, leading to the bestowing of gifts, abilities, and/or a new status), and return (the hero returns to utilize what he has gained from the initiation and save the day, restoring the status quo or establishing a new status quo).
Understanding the mythic, non-rational element of initiation is the key to recognizing the role of the hiatus, and in the modern era this meant using rationality to realize the limits of rationality. With this in mind, let me return to the quote I began this essay with, but now provide the larger context of the entire paragraph:
The legendary hiatus between a no-more and a not-yet clearly indicated that freedom would not be the automatic result of liberation, that the end of the old is not necessarily the beginning of the new, that the notion of an all-powerful time continuum is an illusion. Tales of a transitory period—from bondage to freedom, from disaster to salvation—were all the more appealing because the legends chiefly concerned the deeds of great leaders, persons of world-historic significance who appeared on the stage of history precisely during such gaps of historical time. All those who, pressed by exterior circumstances or motivated by radical utopian thought-trains, were not satisfied to change the world by the gradual reform of an old order (and this rejection of the gradual was precisely what transformed the men of action of the eighteenth century, the first century of a fully secularized intellectual elite, into the men of the revolutions) were almost logically forced to accept the possibility of a hiatus in the continuous flow of temporal sequence.
Note the concept of gaps in historical time, which brings to mind Eliade's distinction between the sacred and the profane. Historical time is a form of profane time, and sacred time represents a gap or break in that linear progression, one that takes us outside of history, connecting us instead in an eternal return to the time associated with a moment of creation or foundation. The revelation at Sinai is an example of such a time, and accordingly Deuteronomy states that all of the members of the House of Israel were present at that event, not just those alive at that time, but also those not present, the generations of the future. This statement is included in the liturgy of the Passover Seder, which is a ritual reenactment of the exodus and revelation, which in turn becomes part of the reenactment of the Passion in Christianity, one of the primary examples of Campbell's monomyth.
Arendt's hiatus, then, represents a rupture between two different states or stages, an interruption, a disruption linked to an eruption. In the parlance of chaos and complexity theory, it is a bifurcation point. Arendt's contemporary, Peter Drucker, a philosopher who pioneered the scholarly study of business and management, characterized the contemporary zeitgeist in the title of his 1969 book: The Age of Discontinuity. It is an age in which Newtonian physics was replaced by Einstein's relativity and Heisenberg's uncertainty, the phrase quantum leap becoming a metaphor drawn from subatomic physics for all forms of discontinuity. It is an age in which the fixed point of view that yielded perspective in art and the essay and novel in literature gave way to Cubism and subsequent forms of modern art, and to stream of consciousness in writing.
Beginning in the 19th century, photography gave us the frozen, discontinuous moment, and the technique of montage in the motion picture gave us a series of shots and scenes whose connections have to be filled in by the audience. Telegraphy gave us the instantaneous transmission of messages that took them out of their natural context, the subject of the famous comment by Henry David Thoreau that connecting Maine and Texas to one another will not guarantee that they have anything sensible to share with each other. The wire services gave us the nonlinear, inverted pyramid style of newspaper reporting, which also was associated with the nonlinear look of the newspaper front page, a form that Marshall McLuhan referred to as a mosaic. Neil Postman criticized television's role in decontextualizing public discourse in Amusing Ourselves to Death, where he used the phrase, "in the context of no context," and I discuss this as well in my recently published follow-up to his work, Amazing Ourselves to Death.
The concept of the hiatus comes naturally to the premodern mind, schooled by myth and ritual within the context of oral culture. That same concept is repressed, in turn, by the modern mind, shaped by the linearity and rationality of literacy and typography. As the modern mind yields to a new, postmodern alternative, one that emerges out of the electronic media environment, we see the return of the repressed in the idea of the jump cut writ large.
There is psychological satisfaction in the deterministic view of history as the inevitable result of cause-effect relations in the Newtonian sense, as this provides a sense of closure and coherence consistent with the typographic mindset. And there is similar satisfaction in the view of history as entirely consisting of human decisions that are the product of free will, of human agency unfettered by outside constraints, which is also consistent with the individualism that emerges out of the literate mindset and print culture, and with a social rather than physical version of efficient causality. What we are only beginning to come to terms with is the understanding of formal causality, as discussed by Marshall and Eric McLuhan in Media and Formal Cause. What formal causality suggests is that history has a tendency to follow certain patterns, patterns that connect one state or stage to another, patterns that repeat again and again over time. This is the notion that history repeats itself, meaning that historical events tend to fall into certain patterns (repetition being the precondition for the existence of patterns), and that the goal, as McLuhan articulated in Understanding Media, is pattern recognition. This helps to clarify the famous remark by George Santayana, "those who cannot remember the past are condemned to repeat it." In other words, those who are blind to patterns will find it difficult to break out of them.
Campbell engages in pattern recognition in his identification of the heroic monomyth, as Arendt does in her discussion of the historical hiatus. Recognizing the patterns is the first step in escaping them, and may even allow for the possibility of taking control and influencing them. This also means understanding that the tendency for phenomena to fall into patterns is a powerful one. It is a force akin to entropy, and perhaps a result of that very statistical tendency that is expressed by the Second Law of Thermodynamics, as Terrence Deacon argues in Incomplete Nature. It follows that there are only certain points in history, certain moments, certain bifurcation points, when it is possible to make a difference, or to make a difference that makes a difference, to use Gregory Bateson's formulation, and change the course of history. The moment of transition, of initiation, the hiatus, represents such a moment.
McLuhan's concept of medium goes far beyond the ordinary sense of the word, as he relates it to the idea of gaps and intervals, the ground that surrounds the figure, and explains that his philosophy of media is not about transportation (of information), but transformation. The medium is the hiatus.
The particular pattern that has come to the fore in our time is that of the network, whether it's the decentralized computer network and the internet as the network of networks, or the highly centralized and hierarchical broadcast network, or the interpersonal network associated with Stanley Milgram's research (popularly known as six degrees of separation), or the neural networks that define brain structure and function, or social networking sites such as Facebook and Twitter, etc. And it is not the nodes, which may be considered the content of the network, that define the network, but the links that connect them, which function as the network medium, and which, in the systems view favored by Bateson, provide the structure for the network system, the interaction or relationship between the nodes. What matters is not the nodes, it's the modes.
Hiatus and link may seem like polar opposites, the break and the bridge, but they are two sides of the same coin, the medium that goes between, simultaneously separating and connecting. The boundary divides the system from its environment, allowing the system to maintain its identity as separate and distinct from the environment, keeping it from being absorbed by the environment. But the membrane also serves as a filter, engaged in the process of abstracting, to use Korzybski's favored term, letting through or bringing material, energy, and information from the environment into the system so that the system can maintain itself and survive. The boundary keeps the system in touch with its situation, keeps it contextualized within its environment.
The systems view emphasizes space over time, as does ecology, but the concept of the hiatus as a temporal interruption suggests an association with evolution as well. Darwin's view of evolution as continuous was consistent with Newtonian physics. The more recent modification of evolutionary theory put forth by Niles Eldredge and Stephen Jay Gould, known as punctuated equilibrium, suggests that evolution occurs in fits and starts, in relatively rare and isolated periods of major change, surrounded by long periods of relative stability and stasis. Not surprisingly, this particular conception of discontinuity was introduced during the television era, in the early 1970s, just a few years after the publication of Peter Drucker's The Age of Discontinuity.
When you consider the extraordinary changes that we are experiencing in our time, technologically and ecologically, the latter underlined by the recent news concerning the United Nations' latest report on global warming, what we need is an understanding of the concept of change, a way to study the patterns of change, patterns that exist and persist across different levels, the micro and the macro, the physical, chemical, biological, psychological, and social, what Bateson referred to as metapatterns, the subject of further elaboration by biologist Tyler Volk in his book on the subject. Paul Watzlawick argued for the need to study change in and of itself in a little book co-authored with John H. Weakland and Richard Fisch, entitled Change: Principles of Problem Formation and Problem Resolution, which considers the problem from the point of view of psychotherapy. Arendt gives us a philosophical entrée into the problem by introducing the pattern of the hiatus, the moment of discontinuity that leads to change, and possibly a moment in which we, as human agents, can have an influence on the direction of that change.
To have such an influence, we do need to have that break, to find a space and more importantly a time to pause and reflect, to evaluate and formulate. Arendt famously emphasizes the importance of thinking in and of itself, the importance not of the content of thought alone, but of the act of thinking, the medium of thinking, which requires an opening, a time out, a respite from the onslaught of 24/7/365. This underscores the value of sacred time, and it follows that it is no accident that during that period of initiation in the story of the exodus, there is the revelation at Sinai and the gift of divine law, the Torah, chief among its commandments the Ten Commandments, which include the fourth commandment, the one presented in greatest detail: to observe the Sabbath day. This premodern ritual requires us to make the hiatus a regular part of our lives, to break the continuity of profane time on a weekly basis. From that foundation, other commandments establish the idea of the sabbatical year, and the sabbatical of sabbaticals, or jubilee year. Whether it's a Sabbath mandated by religious observance, or a new movement to engage in a Technology Sabbath, the hiatus functions as the response to the homogenization of time that was associated with efficient causality and literate linearity, and that continues to intensify in conjunction with the technological imperative of efficiency über alles.
To return one last time to the quote that I began with, the end of the old is not necessarily the beginning of the new because there may not be a new beginning at all, there may not be anything new to take the place of the old. The end of the old may be just that, the end, period, the end of it all. The presence of a hiatus to follow the end of the old serves as a promise that something new will begin to take its place after the hiatus is over. And the presence of a hiatus in our lives, individually and collectively, may also serve as a promise that we will not inevitably rush towards an end of the old that will also be an end of it all, that we will be able to find the opening to begin something new, that we will be able to make the transition to something better, that both survival and progress are possible, through an understanding of the processes of continuity and change.
Indeed my opinion now is that evil is never “radical,” that it is only extreme, and that it possesses neither depth nor any demonic dimension. It can overgrow and lay waste the whole world precisely because it spreads like a fungus over the surface. It is ‘thought-defying,’ as I said, because thought tries to reach some depth, to go to the roots, and the moment it concerns itself with evil, it is frustrated because there is nothing.
-Hannah Arendt, letter to Gershom Scholem
Recent commentators have marked the 50th anniversary of Stanley Kubrick’s bleak nuclear satire, Dr. Strangelove, by noting that the film contained quite a bit more reality than we had thought. While national security and military officials at the time scoffed at the film’s farfetched depictions of a nuclear holocaust set off by a crazed general, we now know that such an unthinkable event would have been, at least theoretically, entirely possible. Yet there is another, deeper sense in which Kubrick’s satire puts us in touch with a reality that could not be readily depicted through other means.
The film tells the story of a rogue general who, at the height of the Cold War arms race, launches a nuclear attack that cannot be recalled, which leads to the destruction of most of humanity in a nuclear holocaust. These are events that we would conventionally describe as “tragic,” but the film is no tragedy. Why not? One answer, of course, is the comic, satirical touch with which Kubrick treated the material, his use of Peter Sellers to play three different characters, and his method of actually tricking his actors into playing their roles more ridiculously than they would have otherwise. But in a deeper sense, Strangelove is about the loss of a capacity for the tragic. The characters, absorbed in utter banalities as they hurtle toward collective catastrophe, display no real grasp of the moral reality of their actions, because they’ve lost contact with the moral reality of the world they share. Dr. Strangelove, then, is a satire about the impossibility of tragedy.
In order to think about what this might mean, it’s helpful to turn to the idea, famously invoked by Hannah Arendt at the end of Eichmann in Jerusalem, of the banality of evil. As Arendt stressed in a later essay, the banality of evil is not a theory or a doctrine “but something quite factual, the phenomenon of evil deeds, committed on a gigantic scale, which could not be traced to any particularity of wickedness, pathology, or ideological conviction in the doer, whose only personal distinction was perhaps extraordinary shallowness.” Eichmann was no villainous monster or demon; rather, he was “terrifyingly normal,” and his chief characteristic was “not stupidity but a curious, quite authentic inability to think.” The inability to think has nothing to do with the capacity of strategizing, performing instrumental calculations, or “reckoning with consequences,” as Hobbes put it. Rather, thinking has to do with awakening the inner dialogue involved in all consciousness, the questioning of the self by the self, which Arendt says dissolves all certainties and examines anew all accepted dogmas and values.
According to Arendt, the socially recognized function of “clichés, stock phrases, adherence to conventional, standardized codes of expression and conduct” is to “protect us against reality”; their function is to protect us against the claim that reality makes on our thinking. This claim, which awakens the dissolving powers of thought, can be so destabilizing that we all must inure ourselves to some degree against it, so that ordinary life can go on at all. What characterized Eichmann is that “he clearly knew of no such claim at all.” Eichmann’s absorption in instrumental and strategic problem solving, on the one hand, and clichés and empty platitudes on the other, was total. The absence of thought, and with it the absence of judgment, ensured a total lack of contact with the moral reality of his actions. Hence the “banality” of his evil resides not in the enormity of the consequences of his actions, but in the depthless opacity of the perpetrator.
The characters in Dr. Strangelove are banal in precisely this sense. All of them—from the affable, hapless president and the red-blooded general to the vodka-swilling diplomat, the self-interested advisors, and Dr. Strangelove himself—are silly cardboard cutouts, superficial stereotypes of characters that lack any depth, self-reflection, or capacity for communicating anything other than empty clichés. They are missing what Arendt called “the activity of thinking as such, the habit of examining and reflecting upon whatever happens to come to pass, regardless of specific content and quite independent of results…” They also lack any contact with the moral reality of their activity. All of their action takes place in an increasingly claustrophobic series of confined spaces carefully sealed off by design: the war room, the military base, the bomber cockpit. The world—Arendt’s common world of appearances that constitutes the possibility of narrative and storytelling—never appears at all; reality cannot break through.
The presence of some of Arendt’s core themes in Kubrick’s film should not come as a surprise. Although she dedicated very little attention in her published works to the problem of nuclear war, in an early draft of a text that would later become The Human Condition, Arendt claimed that two experiences of the 20th century, “totalitarianism and the atomic bomb – ignite the question about the meaning of politics in our time. They are fundamental experiences of our age, and if we ignore them it is as if we never lived in the world that is our world.” Moreover, the culmination of strategic statecraft in social scientific doctrines mandating the nuclear arms race reflects some of the core themes Arendt identified with political modernity: the emergence of a conception of politics as a strategic use of violence for the purposes of protecting society.
Niccolò Machiavelli, a thinker for whom Arendt had great admiration, helped inaugurate this modern adventure of strategic statecraft by reframing politics as l’arte dello stato – the art of the state, which unlike the internal civic space of the republic, always finds itself intervening within an instrumental economy of violence. For Machiavelli the prince, shedding the persona of Ciceronian humanism, must be willing to become beastly, animal-like, to discover the virtues of the vir virtutis in the animal nature of the lion and the fox. If political modernity is inaugurated by Machiavelli’s image of the centaur, the Prince-becoming-beastly, Strangelove closes with a suitable 20th century corollary to the career of modern statecraft. It is the image of the amiable, good-natured “pilot” who never steers the machines he occupies but is himself steered by them, finally straddling and literally transforming himself into the Bomb. It is an image that, in our own age of remote drone warfare and the possible dawning of a new, not yet fully conceivable epoch of post-human violence, has not lost its power to provoke reflection.