**This article was originally published on April 9, 2012. You can access the original article here.**
"It is true that storytelling reveals meaning without committing the error of defining it, that it brings about consent and reconciliation with things as they really are, and that we may even trust it to contain eventually by implication that last word which we expect from the Day of Judgment."
--Hannah Arendt, “Isak Dinesen: 1885 – 1963” in Men in Dark Times
"The end of the old is not necessarily the beginning of the new."
--Hannah Arendt, The Life of the Mind
This is a simple enough statement, and yet it masks a profound truth, one that we often overlook out of the very human tendency to seek consistency and connection, to make order out of the chaos of reality, and to ignore the anomalous nature of that which lies in between whatever phenomena we are attending to.
Perhaps the clearest example of this has been what proved to be the unfounded optimism that greeted the overthrow of autocratic regimes through American intervention in Afghanistan and Iraq, and the native-born movements known collectively as the Arab Spring. It is one thing to disrupt the status quo, to overthrow an unpopular and undemocratic regime. But that end does not necessarily lead to the establishment of a new, beneficent and participatory political structure. We see this time and time again, now in Putin's Russia, a century ago with the Russian Revolution, and over two centuries ago with the French Revolution.
Of course, it has long been understood that oftentimes, to begin something new, we first have to put an end to something old. The popular saying that you can't make an omelet without breaking a few eggs reflects this understanding, although it is certainly not the case that breaking eggs will inevitably and automatically lead to the creation of an omelet. Breaking eggs is a necessary but not sufficient cause of omelets, and while this is not an example of the classic chicken and egg problem, I think we can imagine that the chicken might have something to say on the matter of breaking eggs. Certainly, the chicken would have a different view on what is signified or ought to be signified by the end of the old, meaning the end of the egg shell, insofar as you can't make a chicken without it first breaking out of the egg that it took form within.
So, whether you take the chicken's point of view, or adopt the perspective of the omelet, looking backwards, reverse engineering the current situation, it is only natural to view the beginning of the new as an effect brought into being by the end of the old, to assume or make an inference based on sequencing in time, to posit a causal relationship and commit the logical fallacy of post hoc ergo propter hoc, if for no other reason than the force of narrative logic that compels us to create a coherent storyline. In this respect, Arendt points to the foundation tales of ancient Israel and Rome:
We have the Biblical story of the exodus of Israeli tribes from Egypt, which preceded the Mosaic legislation constituting the Hebrew people, and Virgil's story of the wanderings of Aeneas, which led to the foundation of Rome—"dum conderet urbem," as Virgil defines the content of his great poem even in its first lines. Both legends begin with an act of liberation, the flight from oppression and slavery in Egypt and the flight from burning Troy (that is, from annihilation); and in both instances this act is told from the perspective of a new freedom, the conquest of a new "promised land" that offers more than Egypt's fleshpots and the foundation of a new City that is prepared for by a war destined to undo the Trojan war, so that the order of events as laid down by Homer could be reversed.
Fast forward to the American Revolution, and we find that the founders of the republic, mindful of the uniqueness of their undertaking, searched for archetypes in the ancient world. And what they found in the narratives of Exodus and the Aeneid was that the act of liberation and the establishment of a new freedom are two events, not one, and in effect subject to Alfred Korzybski's non-Aristotelian Principle of Non-Identity. The success of the formation of the American republic can be attributed to the awareness on their part of the chasm that exists between the closing of one era and the opening of a new age, of their separation in time and space:
No doubt if we read these legends as tales, there is a world of difference between the aimless desperate wanderings of the Israeli tribes in the desert after the Exodus and the marvelously colorful tales of the adventures of Aeneas and his fellow Trojans; but to the men of action of later generations who ransacked the archives of antiquity for paradigms to guide their own intentions, this was not decisive. What was decisive was that there was a hiatus between disaster and salvation, between liberation from the old order and the new freedom, embodied in a novus ordo saeclorum, a "new world order of the ages" with whose rise the world had structurally changed.
I find Arendt's use of the term hiatus interesting, given that in contemporary American culture it has largely been appropriated by the television industry to refer to a series that has been taken off the air for a period of time, but not cancelled. The typical phrase is on hiatus, meaning on a break or on vacation. But Arendt reminds us that such connotations only scratch the surface of the word's broader meanings. The Latin word hiatus refers to an opening or rupture, a physical break or missing part or link in a concrete material object. As such, it becomes a spatial metaphor when applied to an interruption or break in time, a usage introduced in the 17th century. Interestingly, this coincides with the period in English history known as the Interregnum, which began in 1649 with the execution of King Charles I, led to Oliver Cromwell's installation as Lord Protector, and ended after Cromwell's death with the Restoration of the monarchy under Charles II, son of Charles I. While in some ways anticipating the American Revolution, the English Civil War followed an older pattern, one that Mircea Eliade referred to as the myth of eternal return, a circular movement rather than the linear progression of history and cause-effect relations.
The idea of moving forward, of progress, requires a future-orientation that only comes into being in the modern age, by which I mean the era that followed the printing revolution associated with Johannes Gutenberg (I discuss this in my book, On the Binding Biases of Time and Other Essays on General Semantics and Media Ecology). But that same print culture also gave rise to modern science, and with it the monopoly granted to efficient causality, cause-effect relations, to the exclusion in particular of final and formal cause (see Marshall and Eric McLuhan's Media and Formal Cause). This is the basis of the Newtonian universe in which every action has an equal and opposite reaction, and every effect can be linked back in a causal chain to another event that preceded it and brought it into being. The view of time as continuous and connected can be traced back to the introduction of the mechanical clock in the 13th century, but was solidified through the printing of calendars and time lines, and the same effect was created in spatial terms by the reproduction of maps, and the use of spatial grids, e.g., the Mercator projection.
And while the invention of history, as a written narrative concerning linear progression over time, can be traced back to the ancient Israelites and the story of the exodus, that story incorporates the idea of a hiatus in overlapping structures:
A1. Joseph is the golden boy, the son favored by his father Jacob, earning him the enmity of his brothers
A2. he is sold into slavery by them, winds up in Egypt as a slave and then is falsely accused and imprisoned
A3. by virtue of his ability to interpret dreams he gains his freedom and rises to the position of Pharaoh's prime minister
B1. Joseph welcomes his brothers and father, and the House of Israel goes down to Egypt to sojourn due to famine in the land of Canaan
B2. their descendants are enslaved, oppressed, and persecuted
B3. Moses is chosen to confront Pharaoh, liberate the Israelites, and lead them on their journey through the desert
C1. the Israelites are freed from bondage and escape from Egypt
C2. the revelation at Sinai fully establishes their covenant with God
C3. after many trials, they return to the Promised Land
It can be clearly seen in these narrative structures that the role of the hiatus, in ritual terms, is that of the rite of passage, the initiation period that marks, in symbolic fashion, the change in status, the transformation from one social role or state of being to another (e.g., child to adult, outsider to member of the group). This is not to discount the role that actual trials, tests, and other hardships may play in the transition, as they serve to establish or reinforce, psychologically and sometimes physically, the value and reality of the transformation.
In mythic terms, this structure has become known as the hero's journey or hero's adventure, made famous by Joseph Campbell in The Hero with a Thousand Faces, and also known as the monomyth, because he claimed that the same basic structure is universal to all cultures. The basic structure he identified consists of three main elements: separation (e.g., the hero leaves home), initiation (e.g., the hero enters another realm, experiences tests and trials, leading to the bestowing of gifts, abilities, and/or a new status), and return (the hero returns to utilize what he has gained from the initiation and save the day, restoring the status quo or establishing a new status quo).
Understanding the mythic, non-rational element of initiation is the key to recognizing the role of the hiatus, and in the modern era this meant using rationality to realize the limits of rationality. With this in mind, let me return to the quote I began this essay with, but now provide the larger context of the entire paragraph:
The legendary hiatus between a no-more and a not-yet clearly indicated that freedom would not be the automatic result of liberation, that the end of the old is not necessarily the beginning of the new, that the notion of an all-powerful time continuum is an illusion. Tales of a transitory period—from bondage to freedom, from disaster to salvation—were all the more appealing because the legends chiefly concerned the deeds of great leaders, persons of world-historic significance who appeared on the stage of history precisely during such gaps of historical time. All those who, pressed by exterior circumstances or motivated by radical utopian thought-trains, were not satisfied to change the world by the gradual reform of an old order (and this rejection of the gradual was precisely what transformed the men of action of the eighteenth century, the first century of a fully secularized intellectual elite, into the men of the revolutions) were almost logically forced to accept the possibility of a hiatus in the continuous flow of temporal sequence.
Note that concept of gaps in historical time, which brings to mind Eliade's distinction between the sacred and the profane. Historical time is a form of profane time, and sacred time represents a gap or break in that linear progression, one that takes us outside of history, connecting us instead in an eternal return to the time associated with a moment of creation or foundation. The revelation at Sinai is an example of such a time, and accordingly Deuteronomy states that all of the members of the House of Israel were present at that event, not just those alive at the time, but also those not yet born, the generations of the future. This statement is included in the liturgy of the Passover Seder, which is a ritual reenactment of the exodus and revelation, which in turn becomes part of the reenactment of the Passion in Christianity, one of the primary examples of Campbell's monomyth.
Arendt's hiatus, then, represents a rupture between two different states or stages, an interruption, a disruption linked to an eruption. In the parlance of chaos and complexity theory, it is a bifurcation point. Arendt's contemporary, Peter Drucker, a philosopher who pioneered the scholarly study of business and management, characterized the contemporary zeitgeist in the title of his 1969 book: The Age of Discontinuity. It is an age in which Newtonian physics was replaced by Einstein's relativity and Heisenberg's uncertainty, the phrase quantum leap becoming a metaphor drawn from subatomic physics for all forms of discontinuity. It is an age in which the fixed point of view that yielded perspective in art and the essay and novel in literature yielded to Cubism and subsequent forms of modern art, and stream of consciousness in writing.
Beginning in the 19th century, photography gave us the frozen, discontinuous moment, and the technique of montage in the motion picture gave us a series of shots and scenes whose connections have to be filled in by the audience. Telegraphy gave us the instantaneous transmission of messages that took them out of their natural context, the subject of the famous comment by Henry David Thoreau that connecting Maine and Texas to one another will not guarantee that they have anything sensible to share with each other. The wire services gave us the nonlinear, inverted pyramid style of newspaper reporting, which also was associated with the nonlinear look of the newspaper front page, a form that Marshall McLuhan referred to as a mosaic. Neil Postman criticized television's role in decontextualizing public discourse in Amusing Ourselves to Death, where he used the phrase, "in the context of no context," and I discuss this as well in my recently published follow-up to his work, Amazing Ourselves to Death.
The concept of the hiatus comes naturally to the premodern mind, schooled by myth and ritual within the context of oral culture. That same concept is repressed, in turn, by the modern mind, shaped by the linearity and rationality of literacy and typography. As the modern mind yields to a new, postmodern alternative, one that emerges out of the electronic media environment, we see the return of the repressed in the idea of the jump cut writ large.
There is psychological satisfaction in the deterministic view of history as the inevitable result of cause-effect relations in the Newtonian sense, as this provides a sense of closure and coherence consistent with the typographic mindset. And there is similar satisfaction in the view of history as entirely consisting of human decisions that are the product of free will, of human agency unfettered by outside constraints, which is also consistent with the individualism that emerges out of the literate mindset and print culture, and with a social rather than physical version of efficient causality. What we are only beginning to come to terms with is the understanding of formal causality, as discussed by Marshall and Eric McLuhan in Media and Formal Cause. What formal causality suggests is that history has a tendency to follow certain patterns, patterns that connect one state or stage to another, patterns that repeat again and again over time. This is the notion that history repeats itself, meaning that historical events tend to fall into certain patterns (repetition being the precondition for the existence of patterns), and that the goal, as McLuhan articulated in Understanding Media, is pattern recognition. This helps to clarify the famous remark by George Santayana, "those who cannot remember the past are condemned to repeat it." In other words, those who are blind to patterns will find it difficult to break out of them.
Campbell engages in pattern recognition in his identification of the heroic monomyth, as Arendt does in her discussion of the historical hiatus. Recognizing the patterns is the first step in escaping them, and may even allow for the possibility of taking control and influencing them. This also means understanding that the tendency for phenomena to fall into patterns is a powerful one. It is a force akin to entropy, and perhaps a result of that very statistical tendency that is expressed by the Second Law of Thermodynamics, as Terrence Deacon argues in Incomplete Nature. It follows that there are only certain points in history, certain moments, certain bifurcation points, when it is possible to make a difference, or to make a difference that makes a difference, to use Gregory Bateson's formulation, and change the course of history. The moment of transition, of initiation, the hiatus, represents such a moment.
McLuhan's concept of medium goes far beyond the ordinary sense of the word, as he relates it to the idea of gaps and intervals, the ground that surrounds the figure, and explains that his philosophy of media is not about transportation (of information), but transformation. The medium is the hiatus.
The particular pattern that has come to the fore in our time is that of the network, whether it's the decentralized computer network and the internet as the network of networks, or the highly centralized and hierarchical broadcast network, or the interpersonal network associated with Stanley Milgram's research (popularly known as six degrees of separation), or the neural networks that define brain structure and function, or social networking sites such as Facebook and Twitter, etc. And it is not the nodes, which may be considered the content of the network, that define the network, but the links that connect them, which function as the network medium, and which, in the systems view favored by Bateson, provide the structure for the network system, the interaction or relationship between the nodes. What matters is not the nodes, it's the modes.
Hiatus and link may seem like polar opposites, the break and the bridge, but they are two sides of the same coin, the medium that goes between, simultaneously separating and connecting. The boundary divides the system from its environment, allowing the system to maintain its identity as separate and distinct from the environment, keeping it from being absorbed by the environment. But the membrane also serves as a filter, engaged in the process of abstracting, to use Korzybski's favored term, letting through or bringing material, energy, and information from the environment into the system so that the system can maintain itself and survive. The boundary keeps the system in touch with its situation, keeps it contextualized within its environment.
The systems view emphasizes space over time, as does ecology, but the concept of the hiatus as a temporal interruption suggests an association with evolution as well. Darwin's view of evolution as continuous was consistent with Newtonian physics. The more recent modification of evolutionary theory put forth by Stephen Jay Gould, known as punctuated equilibrium, suggests that evolution occurs in fits and starts, in relatively rare and isolated periods of major change, surrounded by long periods of relative stability and stasis. Not surprisingly, this particular conception of discontinuity was introduced during the television era, in the early 1970s, just a few years after the publication of Peter Drucker's The Age of Discontinuity.
When you consider the extraordinary changes that we are experiencing in our time, technologically and ecologically (the latter underlined by the recent news concerning the United Nations' latest report on global warming), what we need is an understanding of the concept of change: a way to study the patterns of change, patterns that exist and persist across different levels, the micro and the macro, the physical, chemical, biological, psychological, and social, what Bateson referred to as metapatterns, the subject of further elaboration by biologist Tyler Volk in his book on the subject. Paul Watzlawick argued for the need to study change in and of itself in a little book co-authored by John H. Weakland and Richard Fisch, entitled Change: Principles of Problem Formation and Problem Resolution, which considers the problem from the point of view of psychotherapy. Arendt gives us a philosophical entrée into the problem by introducing the pattern of the hiatus, the moment of discontinuity that leads to change, and possibly a moment in which we, as human agents, can have an influence on the direction of that change.
To have such an influence, we do need to have that break, to find a space and more importantly a time to pause and reflect, to evaluate and formulate. Arendt famously emphasizes the importance of thinking in and of itself, the importance not of the content of thought alone, but of the act of thinking, the medium of thinking, which requires an opening, a time out, a respite from the onslaught of 24/7/365. This underscores the value of sacred time, and it follows that it is no accident that during the period of initiation in the story of the exodus, there is the revelation at Sinai and the gift of divine law, the Torah, and chief among its precepts the Ten Commandments, which include the fourth commandment, the one presented in greatest detail: to observe the Sabbath day. This premodern ritual requires us to make the hiatus a regular part of our lives, to break the continuity of profane time on a weekly basis. From that foundation, other commandments establish the idea of the sabbatical year, and the sabbatical of sabbaticals, or jubilee year. Whether it's a Sabbath mandated by religious observance, or a new movement to engage in a Technology Sabbath, the hiatus functions as the response to the homogenization of time that was associated with efficient causality and literate linearity, and that continues to intensify in conjunction with the technological imperative of efficiency über alles.
To return one last time to the quote that I began with, the end of the old is not necessarily the beginning of the new because there may not be a new beginning at all, there may not be anything new to take the place of the old. The end of the old may be just that, the end, period, the end of it all. The presence of a hiatus to follow the end of the old serves as a promise that something new will begin to take its place after the hiatus is over. And the presence of a hiatus in our lives, individually and collectively, may also serve as a promise that we will not inevitably rush towards an end of the old that will also be an end of it all, that we will be able to find the opening to begin something new, that we will be able to make the transition to something better, that both survival and progress are possible, through an understanding of the processes of continuity and change.
The response has been swift and negative to the Rolling Stone magazine cover—a picture of Dzhokhar Tsarnaev, who, with his now-dead brother, planted deadly homemade bombs near the finish line of the Boston Marathon. The cover features a picture Tsarnaev himself posted on his Facebook page before the bombing. It shows him as he wanted himself to be seen—and that itself has offended many, who ask why he is not pictured as a suspect or convict. In the photo he is young, hip, handsome, and cool. He could be a rock star, and given the context of the Rolling Stone cover, that is how he appears.
The cover is jarring, and that is intended. It is controversial, and that was probably also intended. Hundreds of thousands of comments on Facebook and around the web are critical and angry, asking how Rolling Stone could portray the bomber as a rock star. They overlook or ignore the text accompanying the photo on the cover, which reads: "The Bomber. How a Popular, Promising Student Was Failed by His Family, Fell Into Radical Islam, and Became a Monster." CVS and other retailers have announced they will not sell the magazine in their stores.
That is unfortunate, for the story written by Janet Reitman is exceptionally good and deserves to be read.
Controversies like this have a perverse effect. Just as the furor over Hannah Arendt’s Eichmann in Jerusalem resulted in the viral dissemination of her claims about the Jewish leaders, so too will this Rolling Stone cover be seen by millions of people who otherwise would never have heard of Rolling Stone. What is more, such publicity makes it ever less likely that the story itself will be read seriously, just as Arendt’s book was criticized by everyone, but read by few.
Reitman’s narrative itself is unexceptional. It is a common story line: young, normal kid becomes radicalized and does something none of his old friends can believe he could do. This is a now familiar narrative that we hear in the wake of the tragedies in Newtown (Adam Lanza was described as a nice quiet kid) and Columbine (Time’s cover announced “The Monsters Next Door.”)
This is also the narrative that Rolling Stone managing editor Will Dana embraced to defend the cover on NPR, arguing it was an "apt image because part of what the story is about is what an incredibly normal kid [Tsarnaev] seemed like to those who knew him best back in Cambridge." It was echoed too by Erin Burnett on CNN, who recently invoked Hannah Arendt's idea of the "banality of evil." In the easy frame the story offers, Tsarnaev was a good kid, part of a striving immigrant family, someone who loved multi-racial America. And then something went wrong. He found Islam; his family fell apart; and he became a monster.
This story is too simple. And yet within the Rolling Stone story, there is a wealth of information and reporting that does give a nuanced and thoughtful portrayal of Tsarnaev’s journey into the heart of evil.
One fact that is important to note is that Tsarnaev is not Eichmann. Eichmann was a member of the SS, the Nazi security organization engaged in world war and dedicated to wiping certain races of peoples off the face of the earth. He committed genocide as part of a system of extermination, something both worse than and yet less messy than murder itself. It is Tsarnaev, who had no state apparatus behind him, who became a cold-blooded murderer. The problems that Hannah Arendt thought the court in Jerusalem faced with Eichmann—that he was a new type of criminal—do not apply in Tsarnaev's case. He is a murderer. To understand him is not to understand a new type of criminal. And yet it is a worthy endeavor to try to understand why more and more young men like Tsarnaev are so easily radicalized and drawn to murdering innocent people in the name of a cause.
Both Eichmann and Tsarnaev were from upwardly striving bourgeois families that struggled with economic setbacks. Eichmann was white and Austrian, Tsarnaev an immigrant in Cambridge, but both were economically disaffected. Tsarnaev wanted to make money and, like his parents, dreamed of a better life.
Tsarnaev’s family had difficulty fitting in with U.S. culture. His father was ill and could not work. His mother sought to earn money. And his older brother, whom he idolized, saw his dreams of Olympic boxing dashed partly because he was not a citizen. He increasingly turned to a radical version of Islam. When Tsarnaev’s parents both returned to Dagestan, he fell increasingly under the influence of his older brother.
Like Eichmann, Tsarnaev appears to have adopted an ideology that provided a coherent and meaningful narrative that gave his life significance. One can see this in a number of tweets and statements that are quoted in the article. For example, just before the bombing, he tweeted:
"Evil triumphs when good men do nothing."
"If you have the knowledge and the inspiration all that's left is to take action."
"Most of you are conditioned by the media."
Like Eichmann, Tsarnaev came to see himself as a hero, someone willing to suffer and even die for a noble cause. His cause was different—anti-American jihad instead of anti-Semitic Nazism—but he was an ideological idealist, a joiner, someone who found meaning and importance in belonging to a movement. A smart and talented and by most accounts good young man, he was lost and adrift, searching for someone and something to give his life purpose. He found that someone in his brother and that something in jihad against America, the land that previously he had so embraced. And he became someone who believed that what he was doing was right and necessary, even if he understood also that it was wrong.
We see clearly this ambivalent understanding of right and wrong in the note Tsarnaev apparently scrawled while he was hiding in a boat before he was captured. Here is how Reitman’s article describes what he wrote:
When investigators finally gained access to the boat, they discovered a jihadist screed scrawled on its walls. In it, according to a 30-count indictment handed down in late June, Jihad [Tsarnaev's nickname] appeared to take responsibility for the bombing, though he admitted he did not like killing innocent people. But "the U.S. government is killing our innocent civilians," he wrote, presumably referring to Muslims in Iraq and Afghanistan. "I can't stand to see such evil go unpunished. . . . We Muslims are one body, you hurt one, you hurt us all," he continued, echoing a sentiment that is cited so frequently by Islamic militants that it has become almost cliché. Then he veered slightly from the standard script, writing a statement that left no doubt as to his loyalties: "Fuck America."
Eichmann too spoke of his shock and disapproval of killing innocent Jews, but he justified doing so for the higher Nazi cause. He also said that when he found out about the sufferings of Germans at the hands of the allies, it made it easier for him to justify what he had done, because he saw it as equivalent. The fact that the Germans were aggressors, that they had started the war, and that they were killing and torturing innocent people simply did not register for Eichmann, just as it did not register for Tsarnaev that the people in the Boston marathon were innocent. There are, of course, innocent people in Iraq and Afghanistan who have died at the hands of U.S. bombs. Even for those of us who were against the wars and question their sense and justification, however, there is a difference between death in a war zone and terrorism.
The Rolling Stone article does a good job of chronicling Tsarnaev's slide into a radical jihadist ideology, one mixed with conspiracy theories.
The Prophet Muhammad, he noted on Twitter, was now his role model. "For me to know that I am FREE from HYPOCRISY is more dear to me than the weight of the ENTIRE world in GOLD," he posted, quoting an early Islamic scholar. He began following Islamic Twitter accounts. "Never underestimate the rebel with a cause," he declared.
His rebellious cause was to awaken Americans both to their complicity in the bombing of innocent Muslims and to what he took to be the truth of the common conspiracy theory that America was behind the 9/11 attacks. In one tweet he wrote: "Idk [I don't know] why it's hard for many of you to accept that 9/11 was an inside job, I mean I guess fuck the facts y'all are some real #patriots #gethip."
Besides these tweets that offer a provocative insight into Tsarnaev's emergent ideological convictions, the real virtue of the article is its focus on Tsarnaev's friends, his school, and his place in American youth culture. While his friends certainly do not support or condone what Tsarnaev did, many share some of his conspiratorial and anti-American beliefs. Here are two descriptions of the mainstream nature of many of his beliefs:
To be fair, Will and others note, Jahar's perspective on U.S. foreign policy wasn't all that dissimilar from a lot of other people they knew. "In terms of politics, I'd say he's just as anti-American as the next guy in Cambridge," says Theo.
This is not an uncommon belief. Payack, who [was Tsarnaev's wrestling coach and mentor and] also teaches writing at the Berklee College of Music, says that a fair amount of his students, notably those born in other countries, believe 9/11 was an "inside job." Aaronson tells me he's shocked by the number of kids he knows who believe the Jews were behind 9/11. "The problem with this demographic is that they do not know the basic narratives of their histories – or really any narratives," he says. "They're blazed on pot and searching the Internet for any 'factoids' that they believe fit their highly de-historicized and decontextualized ideologies. And the adult world totally misunderstands them and dismisses them – and does so at our collective peril," he adds.
The article presents a sad portrait of youth culture, and not just because all these “normal” kids are smoking “a copious amount of weed.” The jarring realization is that these talented and intelligent young people at a good school in a storied neighborhood come off so disaffected. What is more, their beliefs in conspiracies are accepted by the adults in their lives as commonplaces; their anti-Americanism is simply a noted fact; and their idolization of slacking (Tsarnaev's favorite word, his friends say, was “sherm,” Cambridge slang for “slacker”) is seen as cute. The adults show painfully little concern to insist that the young people face facts and confront unserious opinions.
In short, the young people in Tsarnaev's story appear to be abandoned by adults to their own youthful and quite fanciful views of reality. Youth culture dominates, and adult supervision seems absent. There is seemingly no one who, in Arendt’s language from “The Crisis in Education”, takes responsibility for teaching them to love the world as it is.
The Rolling Stone article and cover do not glorify a monster; but they do play on two dangerous trends in modern culture that Hannah Arendt worried about in her writing: First, the rise of youth culture and the abandonment of adult authority in education; and second, the fascination bourgeois culture has for vice and the short distance that separates an acceptance of vice from an acceptance of monstrosity. If only all the people who are so concerned about a magazine cover today were more concerned about the delusions and fantasies of Tsarnaev, his friends, and others like them.
Taking responsibility for teaching young people to love the world is the very essence of what Arendt understands education to be. It will be the topic of the Hannah Arendt Center's upcoming conference “Failing Fast: The Educated Citizen in Crisis.” Registration for the conference opened this week. For now, ignore the controversy and read Reitman's article “Jahar's World.” It is your weekend read. It is as good an argument for thinking seriously about the failure of our approach to education as one can find.
Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.
Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.
In the New York Times, Roger Berkowitz takes on what he calls the new consensus emerging in responses to the new "Hannah Arendt" movie, one that seems to be resolving the vitriolic debates of the last 50 years over Arendt's characterization of Adolf Eichmann. This new consensus holds that Arendt was right in her general claim that many evildoers are normal people, but wrong about Eichmann in particular. As Christopher Browning summed it up recently in the New York Review of Books, "Arendt grasped an important concept but not the right example." As Berkowitz writes, this new consensus is founded upon "new scholarship on Eichmann's writings and reflections from the 1950s, when he was living amongst a fraternity of former Nazis in Argentina, before Israeli agents captured him and spirited him out of the country and to Israel. Eichmann's writings include an unpublished memoir, 'The Others Spoke, Now Will I Speak,' and an interview conducted over many months with a Nazi journalist and war criminal, Willem Sassen, which were not released until long after the trial. Eichmann's justification of his actions to Sassen is considered more genuine than his testimony before judges in Jerusalem. In recent decades, scholars have argued that the Sassen interviews show that Arendt was simply wrong in her judgment of Eichmann because she did not have all the facts." As tempting as this new consensus is, it is wrong, Berkowitz argues. Read his full argument here.
Geoff Dyer, flipping through the catalogue of a recent Garry Winogrand retrospective at SFMOMA, considers the way that the street photographer presented what he saw: "the pictures didn't look right, they were all skewed and lurchy, random-seeming and wrong. They were, it was felt, an unprovoked assault on the eye... We were accustomed to viewing the world through a set of conventional lenses that Winogrand wrenched from our face, making us conscious of how short-sighted we had been." Winogrand's still pictures, in other words, act on their viewers, betraying our sense of the world, shifting it out of focus, and thereby revealing it for what it is.
Tony Horwitz uses the upcoming 150th anniversary of Gettysburg to zoom out and consider the changing historical narrative about the American Civil War, in the process offering up an important reminder that history is a living, changing thing: "the 150th anniversary of the Civil War is too narrow a lens through which to view the conflict. We are commemorating the four years of combat that began in 1861 and ended with Union victory in 1865. But Iraq and Afghanistan remind us, yet again, that the aftermath of war matters as much as its initial outcome. Though Confederate armies surrendered in 1865, white Southerners fought on by other means, wearing down a war-weary North that was ambivalent about if not hostile to black equality. Looking backwards, and hitting the pause button at the Gettysburg Address or the passage of the 13th amendment, we see a "good" and successful war for freedom. If we focus instead on the run-up to war, when Lincoln pledged to not interfere with slavery in the South, or pan out to include the 1870s, when the nation abandoned Reconstruction, the story of the Civil War isn't quite so uplifting."
Computer scientist and writer Jaron Lanier critiques the present digital economy with a close look at the evolving relationship between technology and power. To make his argument for change, he insightfully reinterprets what many consider to be a paradox: that the pairing of technology and power at once enriches and erodes the agency of individual actors. Companies like Google are so valuable, he argues, because they control enormously powerful and expensive servers (he calls them Siren Servers to emphasize their irresistible allure) that allow them to manipulate aggregate activity over time. "While people are rarely forced to accept the influence of Siren Servers in any particular case, on a broad statistical basis it becomes impossible for a population to do anything but acquiesce over time....While no particular Google ad is guaranteed to work, the overall Google ad scheme by definition must work, because of the laws of statistics. Superior computation lets a Siren Server enjoy the magical benefits of reliably manipulating others even though no hand is forced ... We need to experiment; to learn how to nurture a middle class that can thrive even in a highly automated society."
Discussing her recent essay in Harper's, writer Rebecca Makkai talks about her experience of her grandfather, whom she knew as a yoga instructor who lived in Hawaii, who was also the principal author of Hungary's Second Jewish Law, which passed in 1939. At one point, she strikes a particularly Arendtian note: "There's also the fact that it's just very difficult, psychologically, to reconcile the face of a real person with one of the darkest moments of the twentieth century. It's not the same as looking at someone who's personally violent, likely to reach out and hit you. This guy is chopping up papaya on his balcony, telling jokes, and I think we have an instinct to forgive, to see just the best in that person, to see him at just that moment. (The irony being that this is what he and his colleagues failed to do - to see humans in front of them.)"
Roger Berkowitz will be in attendance at the Moviehouse in Millerton for a discussion after the 4:00 pm screening of "Hannah Arendt" and before the 7:00 pm screening.
July 16, 2013
Following the 7:40 pm showing of "Hannah Arendt" at the Quad Cinema on 13th St. in N.Y.C., there will be a Q&A with Roger Berkowitz about the film.
July 21, 2013
Following the 6:00 pm showing of "Hannah Arendt" at Symphony Space on Broadway and 95th St. in N.Y.C., there will be a Q&A with Roger Berkowitz about the film.
The sixth annual fall conference, "Failing Fast: The Educated Citizen in Crisis"
Olin Hall, Bard College
Learn more here.
From the Hannah Arendt Center Blog
This week on the blog, Ian Storey in the Quote of the Week looks at the implications of the recent Supreme Court same sex marriage rulings. Jeff Champlin considers Arendt's reading of Kant, offering a new way to think about judgment. Hannah Arendt's thinking is brought to bear on the Paula Deen scandal. And, for your weekend read, Roger Berkowitz looks at the moral implications of financial inequality.
In an essay in the Wall Street Journal, Frans de Waal—C. H. Candler Professor of Primate Behavior at Emory University—offers a fascinating review of recent scientific studies that upend long-held expectations about the intelligence of animals. De Waal rehearses a catalogue of fantastic studies in which animals do things that scientists have long thought they could not do. Here are a few examples:
Ayumu, a male chimpanzee, excels at memory; just as the IBM computer Watson can beat human champions at Jeopardy, Ayumu can easily best the human memory champion in games of memory.
Similarly, Kandula, a young elephant bull, was able to reach some fragrant fruit hung out of reach by moving a stool over to the tree, standing on it, and reaching for the fruit with his trunk. I’ll admit this doesn’t seem like much of a feat to me, but for the researchers de Waal talks with, it is surprising proof that elephants can use tools.
Scientists may be surprised that animals can remember things or use tools to accomplish tasks, but anyone raised on children’s tales of Lassie or Black Beauty knows this well, as does anyone whose pet dog has opened a door, brought them a newspaper, or barked at intruders. The problem these studies address is less our societal view of animals than the overly reductive view of animals that de Waal attributes to his fellow scientists. It’s hard to take these studies seriously as evidence that animals think in the way that humans do.
Seemingly more interesting are experiments with self-recognition and facial recognition. De Waal describes one Asian elephant who stood in front of a mirror and “repeatedly rubbed a white cross on her forehead.” Apparently the elephant recognized the image in the mirror as herself. In another experiment, chimpanzees were able to recognize which pictures of chimpanzees were from their own species. Like my childhood Labrador who used to stare knowingly into the mirror, these studies confirm that animals are able to recognize themselves. This suggests that animals likely understand that they are selves.
For de Waal, these studies have started to upend a view of humankind's unique place in the universe that dates back at least to ancient Greece. “Science,” he writes, “keeps chipping away at the wall that separates us from the other animals. We have moved from viewing animals as instinct-driven stimulus-response machines to seeing them as sophisticated decision makers.”
The flattening of the distinction between animals and humans is to be celebrated, De Waal argues, and not feared. He writes:
Aristotle's ladder of nature is not just being flattened; it is being transformed into a bush with many branches. This is no insult to human superiority. It is long-overdue recognition that intelligent life is not something for us to seek in the outer reaches of space but is abundant right here on earth, under our noses.
De Waal has long championed the intelligence of animals, and now his vision is gaining momentum. This week, in a long essay called “One of Us” in the new Lapham’s Quarterly on animals, the glorious essayist John Jeremiah Sullivan begins with this description of studies similar to the ones de Waal writes about:
These are stimulating times for anyone interested in questions of animal consciousness. On what seems like a monthly basis, scientific teams announce the results of new experiments, adding to a preponderance of evidence that we’ve been underestimating animal minds, even those of us who have rated them fairly highly. New animal behaviors and capacities are observed in the wild, often involving tool use—or at least object manipulation—the very kinds of activity that led the distinguished zoologist Donald R. Griffin to found the field of cognitive ethology (animal thinking) in 1978: octopuses piling stones in front of their hideyholes, to name one recent example; or dolphins fitting marine sponges to their beaks in order to dig for food on the seabed; or wasps using small stones to smooth the sand around their egg chambers, concealing them from predators. At the same time neurobiologists have been finding that the physical structures in our own brains most commonly held responsible for consciousness are not as rare in the animal kingdom as had been assumed. Indeed they are common. All of this work and discovery appeared to reach a kind of crescendo last summer, when an international group of prominent neuroscientists meeting at the University of Cambridge issued “The Cambridge Declaration on Consciousness in Non-Human Animals,” a document stating that “humans are not unique in possessing the neurological substrates that generate consciousness.” It goes further to conclude that numerous documented animal behaviors must be considered “consistent with experienced feeling states.”
With nuance and subtlety, Sullivan understands that our tradition has not drawn the boundary between human and animal nearly as securely as de Waal portrays it. Throughout human existence, humans and animals have been conjoined in the human imagination. Sullivan writes that the most consistent “motif in the artwork made between four thousand and forty thousand years ago,” is the focus on “animal-human hybrids, drawings and carvings and statuettes showing part man or woman and part something else—lion or bird or bear.” In these paintings and sculptures, our ancestors gave form to a basic intuition: “Animals knew things, possessed their forms of wisdom.”
Religious history too is replete with evidence of the human recognition of the dignity of animals. God says in Isaiah that the beasts will honor him and St. Francis, the namesake of the new Pope, is famous for preaching to birds. What is more, we are told that God cares about the deaths of animals.
In the Gospel According to Matthew we’re told, “Not one of them will fall to the ground apart from your Father.” Think about that. If the bird dies on the branch, and the bird has no immortal soul, and is from that moment only inanimate matter, already basically dust, how can it be “with” God as it’s falling? And not in some abstract all-of-creation sense but in the very way that we are with Him, the explicit point of the verse: the line right before it is “fear not them which kill the body, but are not able to kill the soul.” If sparrows lack souls, if the logos liveth not in them, Jesus isn’t making any sense in Matthew 10:28-29.
What changed and interrupted the ancient and deeply human appreciation of our kinship with besouled animals? Sullivan’s answer is René Descartes. The modern depreciation of animals, Sullivan writes,
proceeds, with the rest of the Enlightenment, from the mind of René Descartes, whose take on animals was vividly (and approvingly) paraphrased by the French philosopher Nicolas Malebranche: they “eat without pleasure, cry without pain, grow without knowing it; they desire nothing, fear nothing, know nothing.” Descartes’ term for them was automata—windup toys, like the Renaissance protorobots he’d seen as a boy in the gardens at Saint-Germain-en-Laye, “hydraulic statues” that moved and made music and even appeared to speak as they sprinkled the plants.
Too easy, however, is the move to say that the modern comprehension of the difference between animal and human proceeds from a mechanistic view of animals. We live at a time of the animal rights movement. Around the world, societies exist and thrive whose mission is to prevent cruelty toward and to protect animals. Yes, factory farms treat chickens and pigs as organic mechanisms for the production of meat, but these farms co-exist with active and quite successful movements calling for humane standards in food production. Whatever the power of Cartesian mechanics, its success is at odds with the persistence of an ancient, religious solidarity, and also a deeply modern sympathy, between human and animal.
A more meaningful account of the modern attitude towards animals might be found in Spinoza. Spinoza, as Sullivan quotes him, recognizes that animals feel in ways that Descartes did not. As do animal rights activists, Spinoza admits what is obvious: that animals feel pain, show emotion, and have desires. And yet, Spinoza maintains a distinction between human and animal—one grounded not in emotion or feeling, but in human nature. In his Ethics, he writes:
Hence it follows that the emotions of the animals which are called irrational…only differ from man’s emotions to the extent that brute nature differs from human nature. Horse and man are alike carried away by the desire of procreation, but the desire of the former is equine, the desire of the latter is human…Thus, although each individual lives content and rejoices in that nature belonging to him wherein he has his being, yet the life, wherein each is content and rejoices, is nothing else but the idea, or soul, of the said individual…It follows from the foregoing proposition that there is no small difference between the joy which actuates, say, a drunkard, and the joy possessed by a philosopher.
Spinoza argues against the law prohibiting slaughter of animals—it is “founded rather on vain superstition and womanish pity than on sound reason”—because humans are more powerful than animals. Here is how he defends the slaughter of animals:
The rational quest of what is useful to us further teaches us the necessity of associating ourselves with our fellow men, but not with beasts, or things, whose nature is different from our own; we have the same rights in respect to them as they have in respect to us. Nay, as everyone’s right is defined by his virtue, or power, men have far greater rights over beasts than beasts have over men. Still I do not deny that beasts feel: what I deny is that we may not consult our own advantage and use them as we please, treating them in the way which best suits us; for their nature is not like ours.
Spinoza’s point is quite simple: Of course animals feel and of course they are intelligent. Who could doubt such a thing? But they are not human. That is clear too. While we humans may care for and even love our pets, we recognize the difference between a dog and a human. And we will, in the end, associate more with our fellow humans than with dogs and porpoises. Finally, we humans will use animals when they serve our purposes. And this is OK, because we have the power to do so.
Is Spinoza arguing that might makes right? Surely not in the realm of law amongst fellow humans. But he is insisting that we recognize that for us humans there is something about being human that is different, even higher and more important. Spinoza couches his argument in the language of natural right, but what he is saying is that we must recognize that there are important differences between animals and humans.
At a time that values equality over what Friedrich Nietzsche called the “pathos of difference,” the valuation of human beings over animals is ever more in doubt. This comes home clearly in a story told recently by General Stanley McChrystal about a soldier who expressed sympathy for some dogs killed in a raid in Iraq. McChrystal responded severely: “Seven enemy were killed on that target last night. Seven humans. Are you telling me you're more concerned about the dog than the people that died?” The car fell silent again. “Hey, listen,” McChrystal went on. “Don't lose your humanity in this thing.” Many, no doubt, are more concerned, or at least equally concerned, about the deaths of animals as about the deaths of humans. There is ever-increasing discomfort with McChrystal’s common-sense affirmation of Spinoza’s claim that human beings simply are of more worth than animals.
The distinctions upon which the moral sense of human distinctiveness is based are foundering. For de Waal and Sullivan, the danger today is that we continue to insist on differences between animals and humans—differences that we don’t fully understand. The consequence of their openness to the humanization of animals, however, is undoubtedly the animalization of humans. The danger that we humans lose sight of what distinguishes us from animals is much more significant than the possibility that we underestimate animal intelligence.
I fully agree with de Waal and Sullivan that there is a symphony of intelligence in the world, much of it not human. And yes, we should have proper respect for our ignorance. But all the experiments in the world do little to alter the basic fact that, no matter how intelligent and feeling and even conscious animals may be, humans and animals are different.
What is the quality of that difference? It is difficult to say and may never be fully articulated in propositional form. On one level it is this: Simply to live, as do plants or animals, does not constitute a human life. In other words, human life is not simply about living. Nor is it about doing tasks or even being conscious of ourselves as humans. It is about living meaningfully. There may, of course, be some animals that can create worlds of meaning—worlds that we have not yet discovered. But their worlds are not the worlds to which we humans aspire.
Over two millennia ago, Sophocles, in his “Ode to Man,” named man Deinon, a Greek word that connotes both greatness and horror, that which is so extraordinary as to be at once terrifying and beautiful. Man, Sophocles tells us, can travel over water and tame animals, using them to plough fields. He can invent speech, and institute governments that bring humans together to form lasting institutions. As an inventor and maker of his world, this wonder that is man terrifyingly carries the seeds of his destruction. As he invents and comes to control his world, he threatens to extinguish the mystery of his existence, that part of man that man himself does not control. As the chorus sings: “Always overcoming all ways, man loses his way and comes to nothing.” If man so tames the earth as to free himself from uncertainty, what then is left of human being?
Sophocles knew that man could be a terror; but he also glorified the wonder that man is. He knew that what separates us humans from animals is our capacity to alter the earth and our natural environment. “The human artifice of the world,” Arendt writes, “separates human existence from all mere animal environment…” Not only by building houses and erecting dams—animals can do those things and more—but also by telling stories and building political communities that give to man a humanly created world in which he lives. If all we did as humans was live or build things on earth, we would not be human.
To be human means that we can destroy all living matter on the Earth. We can even, today, destroy the Earth itself. Whether we do so or not, to live on Earth today is now a choice that we make, not a matter of fate or chance. Our Earth, although we did not create it, is now something we humans can decide to sustain or destroy. In this sense, it is a human creation. No other animal has such a potential or such a responsibility.
There is a deep desire today to flee from that awesome and increasingly unbearable human responsibility. We flee, therefore, our humanity and take solace in the view that we are just one amongst the many animals in the world. We see this reductionism above all in human rights discourse. One core demand of human rights—that men and women have a right to live and not be killed—brought about a shift in the idea of humanity from logos to life. The rise of a politics of life—the political demand that governments limit freedoms and regulate populations in order to protect and facilitate their citizens’ ability to live in comfort—has pushed the animality, the “life,” of human beings to the center of political and ethical activity. In embracing a politics of life over a politics of the meaningful life, human rights rejects the distinctive dignity of human rationality and works to reduce humanity to its animality.
Hannah Arendt saw human rights as dangerous precisely because they risked confusing the meaning of human worldliness with the existence of mere animal life. For Arendt, human beings are the beings who build and live in a political world, by which she means the stories, institutions, and achievements that mark the glory and agony of humanity. To be human, she insists, is more than simply living, laboring, working, acting, and thinking. It is to do all of these activities in such a way as to create, together, a common life amongst a plurality of persons.
I fear that the interest in animal consciousness today is less a result of scientific proof that animals are human than it is an increasing discomfort with the world we humans have built. A first step in responding to such discomfort, however, is a reaffirmation of our humanity and our human responsibility. There is no better way to begin that process than in engaging with a very human response to the question of our animality. Towards that end, I commend to you “One of Us,” by John Jeremiah Sullivan.
Of late there has been no shortage of commentary on the ten years that have passed since the U.S. invasion of Iraq in 2003. Much of it has focused on the justifications for the war provided by members of the Bush administration, the lingering consequences of the invasion for President Obama and other policymakers, and the often harrowing experiences of American soldiers. These are certainly matters that should be discussed at length.
But U.S. public discourse continues to say little about the impact of the war on Iraqis themselves or about their efforts to survive and interpret it.
Much of it also remains tightly focused on the era after 9/11, as if that day’s events rendered the longer arc of Iraqi history—including the part that the U.S. has played in it—more or less irrelevant. To the extent that the country’s past is addressed at all, it is commonly reduced to “sectarianism,” “tribalism,” and other shibboleths, treated as intrinsic and timeless features of Iraqi (and wider Arab and Islamic) life.
Two recent contributions on Jadaliyya (www.jadaliyya.com), a blog and e-zine published by the Arab Studies Institute, offer a counterpoint to these prevailing trends. The first is an interview with historian Dina Rizk Khoury related to the publication of her recent book, Iraq in Wartime: Soldiering, Martyrdom, and Resistance (Cambridge, 2013). As Khoury rightly notes, most of the discussion in the U.S. has failed to recognize the fact that Iraqis spent the last twenty-three years of Baathist rule in a state of nearly continuous military conflict. First there was the Iran-Iraq War, then the Iraqi seizure of Kuwait, then the 1991 Gulf War and the ensuing embargo, and finally the most recent American invasion and occupation.
Under such conditions, Khoury argues, war became a matter of normalcy and bureaucratic governance that insinuated violence into the fabric of everyday life in Iraq. At the same time, it created recurring crises and ruptures that reshaped the structures of state authority and citizenship. And it enabled the Iraqi state to fabricate a myth of soldiering and martyrdom that, in the long run, helped to recalibrate Iraqis’ notions of national belonging along ethnic and sectarian lines. Wittingly or unwittingly, the actions of U.S. policymakers after the Gulf War and the 2003 invasion have reinforced Iraq’s societal divisions and the prevalence of violence as a mode of political action.
The second contribution is a commentary from Orit Bashkin, “The Forgotten Protagonists: The Invasion and the Historian.” Bashkin has written extensively on the politics of pluralism (The Other Iraq, Stanford, 2010) and Jewish displacement (New Babylonians, Stanford, 2012) in twentieth-century Iraq, but here she focuses on the present and future conditions of historical scholarship. She contends that our knowledge of the Iraqi past has grown in significant ways over the past decade. (If we take Melani McAlister’s book Epic Encounters seriously, this outcome should hardly surprise us: American cultural, scholarly, and geopolitical interests in the Middle East have long been tightly intertwined.) Such expansion has been facilitated in no small part by the relocation of the Baath Party archives to the U.S. in 2008. This move has allowed professional historians ready access to a crucial corpus of texts on Saddam Hussein’s regime.
Yet Bashkin also worries that the prospects for historical knowledge production will be decidedly less rosy in the years to come. In particular, many of the other materials on which historians of Iraq rely—Ottoman records, collections of poetic and theological writings, museums, archaeological sites, and so on—have been or are being destroyed in the wake of the U.S. invasion.
As a result, it will be considerably more difficult for scholars not simply to reconstruct the Iraqi past, but also to comprehend how Iraqi citizens relate to it. In particular, we will be less able to grasp the imperial and colonial practices, post-independence state policies, and other forces that have forged the country’s current ethnic and religious cleavages. And we will be less able to understand the multiple and competing nostalgias that now proliferate among Iraqi citizens. Such nostalgias include the ambivalent and paradoxical longing for the days of Saddam Hussein, when (in Bashkin’s words) “at least there was some sense of law and order.”
American public discourse is in desperate need of commentary that positions present-day Iraqis as complex actors who both shape and are shaped by the flow of local, regional, and global histories. As Khoury and Bashkin suggest, the current focus on the past ten years is both literally and metaphorically short-sighted. And yet, for a variety of reasons, lengthening our gaze will be easier said than done.
During a conference organized in her honor in Toronto, Hannah Arendt was asked by Hans Morgenthau to categorize herself: “What are you? Are you a conservative? Are you a liberal? Where is your position in the contemporary possibilities?”
Arendt replied: “I don’t know and I’ve never known. And I suppose I never had any such position. You know the left think that I am conservative, and the conservatives think that I am a maverick or God knows what. And I must say I couldn’t care less. I don’t think that the real questions of this century will get any kind of illumination by this kind of thing.”
It is precisely in this spirit that one should read Jens Hanssen’s recent paper “Reading Hannah Arendt in the Middle East: Preliminary Observations on Totalitarianism, Revolution and Dissent”.
Hanssen offers in his paper a rather detailed survey of how Arendt has been read – and misread – in the Middle East, beginning with Kanan Makiya’s World Policy Journal article (2006) “An Iraqi Discovers Arendt”, all the way to Israeli revisionist (and evidently critical of Israel) scholars such as Idith Zertal and Amnon Raz-Krakotzkin.
The particular examples he brings up are paradigmatic of an established tradition of appropriations of Hannah Arendt that, though emerging from her political thought, have much to do with politics and little with thinking.
For example, the case of Kanan Makiya is interesting if only because of his controversial – and rather maverick – position in the landscape of Iraqi politics. This Marxist engineer-turned-neo-conservative political advisor (in Hanssen's telling) is apparently credited with being the first Arab author to apply Arendt’s phenomenology of totalitarianism to Baathist Iraq.
Makiya makes a case for Iraq as a totalitarian regime in Arendt’s terms, drawing a straight line from anti-Semitism and intellectual support for Saddam Hussein to comparisons with Nazi Germany. Though his book The Republic of Fear stands for many Iraqis as the greatest testimony to the sad state of affairs under Hussein, the analysis is at best a misappropriation in many respects, and it seems to fall within the line of warmongering that Arendt so vehemently criticized in McCarthyism: using totalitarian means to fight totalitarian enemies, real or imagined.
The most interesting reading he brings up, however, is Vince Dolan’s course at the American University in Beirut, “Contemporary Philosophical Reflections on the Use of Political Violence”, taught in the spring of 1983. Dolan tailored the course to polemicize Arendt’s distinction between power and violence – perhaps the most difficult in all of her thought – by first exposing students to Habermas’ evaluation of Arendt’s project and then bringing her into conversation with Popper, Adorno and Horkheimer.
While this practice is common among liberal academics, the integration of Arendt into the corpus of critical theory has been debunked time and again by serious Arendt scholars, of whom I will cite only two salient examples:
First, Dana Villa (Arendt and Heidegger, 1996, pp. 3-4) argues that although Habermas called Arendt’s theory of political action “the systematic renewal of the Aristotelian concept of praxis”, there is no one who would argue more vehemently against Aristotle (and the whole project of critical theory) than Arendt.
According to Villa, critical theory has profited immensely from Arendt’s renewal of Aristotelian praxis – as opposed to the instrumentalization of action – in order to highlight the intersubjective nature of political action, when in fact this renewal is a radical reconceptualization, undertaken to overcome rather than to restore the tradition of political thought of and since Aristotle.
Second, Fina Birulés insisted in an interview from 2001 that there is a wide gap between Arendt’s radical theory of democracy and Habermas’s thought. According to Birulés, though Habermas is deeply indebted to Arendt, his theory of communicative action is hardly political at all, and he reduces the concept of plurality to some sort of ideal community of dialogue.
Doubtless Hanssen is correct in pointing out that Arendt did not provide a concise definition of totalitarianism. Definition is a privilege of theory that Arendt’s storytelling didn’t embrace; she “merely” listed phenomenological elements. However, he also notes that Arendt insisted only two forms of totalitarianism ever existed: Nazi Germany and the Soviet Union. This distinction is crucial to understanding the rest of his paper.
Nowadays totalitarianism – as much as the banality of evil – is a slogan in newspapers and politics, often lacking in meaning and intention. This brings to mind the whole post-9/11 discourse in philosophy and politics, in which Islam and Islamism – among other things – take the place of the “old” totalitarian movements.
While it is true that in phenomenological and structural terms nothing since the collapse of the Soviet Union can be called strictly totalitarian, there is no doubt that there are totalitarian elements in many movements and policies not only in the Middle East today, but also in the democratic West.
Among other – far less influential – readings of Arendt, Hanssen lists the translations into Arabic and Persian, providing crucial information about how and why Arendt informed certain, mostly Arab, authors.
Lastly, there is an elaborate discussion of the use – and again, abuse – of Arendt by Israeli scholars since her “rehabilitation” in Israel, which coincided with the rise to prominence of certain revisionist scholars.
Though Hannah Arendt wasn’t exclusively concerned with Zionism or the Jewish question, it is undeniable that her entire work was informed by her status and experience as a Jew in the Europe of the early 20th century.
There are many Hannah Arendts, and to this effect Jerome Kohn writes in the introduction to her “Jewish Writings”: “In 1975, the year she died, she spoke of a voice that comes from behind the masks she wears to suit the occasions and the various roles that the world offers her. That voice is identical to none of the masks, but she hopes it is identifiable, sounding through all of them”.
Something that is identifiable in her entire work – but not identical anywhere – is her concern with the young State of Israel, in spite of the controversies in which she later became entangled.
While it is true that Arendt was very critical of the Zionist establishment and of the course that Israel had taken, it is also important to remember that her writings (“The Crisis of Zionism” and “Peace or Armistice in the Middle East”) were anchored in an intense anxiety over the Jewish people regaining control of their own destinies and entering the realm of politics.
Julia Kristeva expressed this best in her speech upon receiving the Hannah Arendt Prize in 2006, making it clear how for Arendt the survival of Israel and the refoundation of politics in the West was part of one and the same task:
Thirty years after her death, added to the dangers she tried to confront through a refoundation of political authority – dangers which, as they worsen, make this refoundation increasingly improbable – is the new threat that weighs on Israel and the world. Arendt had a premonition about it as she warned against underestimating the Arab world and, while giving the State of Israel her unconditional support as the only remedy to the acosmism of the Jewish people, and as a way to return to the “world” and “politics” of which history had deprived them, she also voiced criticism.
But Jerome Kohn also writes in the introduction to the Jewish Writings: “Already in 1948 Arendt foresaw what now perhaps has come to pass, that Israel would become a militaristic state behind closed but threatened borders, a ‘semi-sovereign’ state from which Jewish culture would gradually vanish” (paraphrasing her “To Save the Jewish Homeland”).
In her piece “Peace or Armistice in the Middle East,” Arendt laid out what is in my opinion a foundation for an ideal of Arab-Jewish cooperation in the Middle East – including even surprisingly rare background on Arab personalities in Lebanon and Egypt who had lent support to the possibility of a Jewish settlement. But the religious fundamentalism and anti-Semitism that have now crystallized in the Middle East could not have been foreseen by Arendt, or at least not to the extent that they were articulated by Kristeva:
Although many of her analyses and advances seem to us more prophetic than ever, Arendt could not foresee the rise of Islamic fundamentalism, nor the havoc it is wreaking in a world faced with the powerlessness of politics to respond, and the apolitia, the indifference created by the omnipresent society of the spectacle.
Hanssen concludes from reading Arendt on totalitarianism, revolution and dissent in the Middle East that “one of the most powerful (in Arendt’s sense of power as consent-based), non-violent movements coming out of the Arab World today is the Boycott, Sanctions and Divestments campaign that Palestinian civil society groups have called for in 2005 and has now become a global counter-hegemonic phenomenon”, and he raises the question of whether Hannah Arendt would have supported the Palestinian BDS movement as a means to bring about the end of the Israeli occupation.
On the one hand, he argues that “the intellectual merit of BDS campaign from an Arendtian standpoint is that it is not based on old and invalid hyperbolic equation of Israel with Nazi Germany.” On the other hand, he also says:
There is certainly ample room for this kind of non-violent action in her writings. For one, she supported the economic boycott of German businesses in the 1930’s and was furious when Zionist Organization in Palestine broke it.
Leaving the associations with Nazi Germany aside, it is vital to recall that it was Arendt who said that not even on the moon is one safe from anti-Semitism, and that the State of Israel alone would not solve the Jewish question.
It is clear by now that the BDS campaign has blended no doubt altruistic elements of non-violent struggle with elements of the old anti-Semitism, in which little distinction is made between Israelis and Jews.
BDS has come to include not only a boycott of the settlements (as articulated with great intelligence by Peter Beinart in his book “The Crisis of Zionism”) but also academic and cultural boycotts. In extreme cases, there have been boycotts of products not for being Israeli or produced in the settlements, but merely for being kosher products produced in Britain and the United States.
While it is more than clear that Arendt saw and foresaw the risks and dangers to which the Israeli polity was exposed by its leaders, she also articulated with clarity that it was not the Jews alone who were responsible for this sad state of affairs. Whether or not Hannah Arendt’s ideal of a binational state is at all realizable at this point – bearing in mind the complexities of the Arab Spring – what is clear is that an ideology fed on old anti-Semitism and prejudice, as much as on uncritical views of Arab and Palestinian history, is very unlikely to produce the Arab-Jewish councils (at the heart of her theorizing on revolutions) upon whose basis a secular and democratic state might be founded.