"The end of the old is not necessarily the beginning of the new."
Hannah Arendt, The Life of the Mind
This is a simple enough statement, and yet it masks a profound truth, one that we often overlook out of the very human tendency to seek consistency and connection, to make order out of the chaos of reality, and to ignore the anomalous nature of that which lies in between whatever phenomena we are attending to.
Perhaps the clearest example of this has been the optimism, which proved to be unfounded, that greeted the overthrow of autocratic regimes through American intervention in Afghanistan and Iraq, and through the native-born movements known collectively as the Arab Spring. It is one thing to disrupt the status quo, to overthrow an unpopular and undemocratic regime. But that end does not necessarily lead to the establishment of a new, beneficent and participatory political structure. We see this time and time again, now in Putin's Russia, a century ago with the Russian Revolution, and over two centuries ago with the French Revolution.
Of course, it has long been understood that oftentimes, to begin something new, we first have to put an end to something old. The popular saying that you can't make an omelet without breaking a few eggs reflects this understanding, although it is certainly not the case that breaking eggs will inevitably and automatically lead to the creation of an omelet. Breaking eggs is a necessary but not sufficient cause of omelets, and while this is not an example of the classic chicken and egg problem, I think we can imagine that the chicken might have something to say on the matter of breaking eggs. Certainly, the chicken would have a different view on what is signified or ought to be signified by the end of the old, meaning the end of the egg shell, insofar as you can't make a chicken without it first breaking out of the egg that it took form within.
So, whether you take the chicken's point of view, or adopt the perspective of the omelet, looking backwards, reverse engineering the current situation, it is only natural to view the beginning of the new as an effect brought into being by the end of the old, to assume or make an inference based on sequencing in time, to posit a causal relationship and commit the logical fallacy of post hoc ergo propter hoc, if for no other reason than the force of narrative logic that compels us to create a coherent storyline. In this respect, Arendt points to the foundation tales of ancient Israel and Rome:
We have the Biblical story of the exodus of Israeli tribes from Egypt, which preceded the Mosaic legislation constituting the Hebrew people, and Virgil's story of the wanderings of Aeneas, which led to the foundation of Rome—"dum conderet urbem," as Virgil defines the content of his great poem even in its first lines. Both legends begin with an act of liberation, the flight from oppression and slavery in Egypt and the flight from burning Troy (that is, from annihilation); and in both instances this act is told from the perspective of a new freedom, the conquest of a new "promised land" that offers more than Egypt's fleshpots and the foundation of a new City that is prepared for by a war destined to undo the Trojan war, so that the order of events as laid down by Homer could be reversed.
Fast forward to the American Revolution, and we find that the founders of the republic, mindful of the uniqueness of their undertaking, searched for archetypes in the ancient world. And what they found in the narratives of Exodus and the Aeneid was that the act of liberation and the establishment of a new freedom are two events, not one, in effect subject to Alfred Korzybski's non-Aristotelian Principle of Non-Identity. The success of the formation of the American republic can be attributed to the founders' awareness of the chasm that exists between the closing of one era and the opening of a new age, of their separation in time and space:
No doubt if we read these legends as tales, there is a world of difference between the aimless desperate wanderings of the Israeli tribes in the desert after the Exodus and the marvelously colorful tales of the adventures of Aeneas and his fellow Trojans; but to the men of action of later generations who ransacked the archives of antiquity for paradigms to guide their own intentions, this was not decisive. What was decisive was that there was a hiatus between disaster and salvation, between liberation from the old order and the new freedom, embodied in a novus ordo saeclorum, a "new world order of the ages" with whose rise the world had structurally changed.
I find Arendt's use of the term hiatus interesting, given that in contemporary American culture it has largely been appropriated by the television industry to refer to a series that has been taken off the air for a period of time, but not cancelled. The typical phrase is on hiatus, meaning on a break or on vacation. But Arendt reminds us that such connotations only scratch the surface of the word's broader meanings. The Latin word hiatus refers to an opening or rupture, a physical break or missing part or link in a concrete material object. As such, it becomes a spatial metaphor when applied to an interruption or break in time, a usage introduced in the 17th century. Interestingly, this coincides with the period in English history known as the Interregnum, which began in 1649 with the execution of King Charles I, led to Oliver Cromwell's installation as Lord Protector, and ended after Cromwell's death with the Restoration of the monarchy under Charles II, son of Charles I. While in some ways anticipating the American Revolution, the English Civil War followed an older pattern, one that Mircea Eliade referred to as the myth of eternal return, a circular movement rather than the linear progression of history and cause-effect relations.
The idea of moving forward, of progress, requires a future-orientation that only comes into being in the modern age, by which I mean the era that followed the printing revolution associated with Johannes Gutenberg (I discuss this in my book, On the Binding Biases of Time and Other Essays on General Semantics and Media Ecology). But that same print culture also gave rise to modern science, and with it the monopoly granted to efficient causality, cause-effect relations, to the exclusion in particular of final and formal cause (see Marshall and Eric McLuhan's Media and Formal Cause). This is the basis of the Newtonian universe in which every action has an equal and opposite reaction, and every effect can be linked back in a causal chain to another event that preceded it and brought it into being. The view of time as continuous and connected can be traced back to the introduction of the mechanical clock in the 13th century, but was solidified through the printing of calendars and time lines, and the same effect was created in spatial terms by the reproduction of maps, and the use of spatial grids, e.g., the Mercator projection.
And while the invention of history, as a written narrative of linear progression over time, can be traced back to the ancient Israelites and the story of the exodus, that story incorporates the idea of a hiatus in overlapping structures:
A1. Joseph is the golden boy, the son favored by his father Jacob, earning him the enmity of his brothers
A2. he is sold into slavery by them, winds up in Egypt as a slave and then is falsely accused and imprisoned
A3. by virtue of his ability to interpret dreams he gains his freedom and rises to the position of Pharaoh's prime minister
B1. Joseph welcomes his brothers and father, and the House of Israel goes down to Egypt to sojourn due to famine in the land of Canaan
B2. their descendants are enslaved, oppressed, and persecuted
B3. Moses is chosen to confront Pharaoh, liberate the Israelites, and lead them on their journey through the desert
C1. the Israelites are freed from bondage and escape from Egypt
C2. the revelation at Sinai fully establishes their covenant with God
C3. after many trials, they return to the Promised Land
It can be clearly seen in these narrative structures that the role of the hiatus, in ritual terms, is that of the rite of passage, the initiation period that marks, in symbolic fashion, the change in status, the transformation from one social role or state of being to another (e.g., child to adult, outsider to member of the group). This is not to discount the role that actual trials, tests, and other hardships may play in the transition, as they serve to establish or reinforce, psychologically and sometimes physically, the value and reality of the transformation.
In mythic terms, this structure has become known as the hero's journey or hero's adventure, made famous by Joseph Campbell in The Hero with a Thousand Faces, and also known as the monomyth, because he claimed that the same basic structure is universal to all cultures. The basic structure he identified consists of three main elements: separation (e.g., the hero leaves home), initiation (e.g., the hero enters another realm, experiences tests and trials, leading to the bestowing of gifts, abilities, and/or a new status), and return (the hero returns to utilize what he has gained from the initiation and save the day, restoring the status quo or establishing a new status quo).
Understanding the mythic, non-rational element of initiation is the key to recognizing the role of the hiatus, and in the modern era this meant using rationality to realize the limits of rationality. With this in mind, let me return to the quote I began this essay with, but now provide the larger context of the entire paragraph:
The legendary hiatus between a no-more and a not-yet clearly indicated that freedom would not be the automatic result of liberation, that the end of the old is not necessarily the beginning of the new, that the notion of an all-powerful time continuum is an illusion. Tales of a transitory period—from bondage to freedom, from disaster to salvation—were all the more appealing because the legends chiefly concerned the deeds of great leaders, persons of world-historic significance who appeared on the stage of history precisely during such gaps of historical time. All those who, pressed by exterior circumstances or motivated by radical utopian thought-trains, were not satisfied to change the world by the gradual reform of an old order (and this rejection of the gradual was precisely what transformed the men of action of the eighteenth century, the first century of a fully secularized intellectual elite, into the men of the revolutions) were almost logically forced to accept the possibility of a hiatus in the continuous flow of temporal sequence.
Note that concept of gaps in historical time, which brings to mind Eliade's distinction between the sacred and the profane. Historical time is a form of profane time, and sacred time represents a gap or break in that linear progression, one that takes us outside of history, connecting us instead in an eternal return to the time associated with a moment of creation or foundation. The revelation at Sinai is an example of such a time, and accordingly Deuteronomy states that all of the members of the House of Israel were present at that event, not just those alive at the time but also those not present, the generations of the future. This statement is included in the liturgy of the Passover Seder, which is a ritual reenactment of the exodus and revelation, which in turn becomes part of the reenactment of the Passion in Christianity, one of the primary examples of Campbell's monomyth.
Arendt's hiatus, then, represents a rupture between two different states or stages, an interruption, a disruption linked to an eruption. In the parlance of chaos and complexity theory, it is a bifurcation point. Arendt's contemporary, Peter Drucker, a philosopher who pioneered the scholarly study of business and management, characterized the contemporary zeitgeist in the title of his 1969 book: The Age of Discontinuity. It is an age in which Newtonian physics was replaced by Einstein's relativity and Heisenberg's uncertainty, the phrase quantum leap becoming a metaphor drawn from subatomic physics for all forms of discontinuity. It is an age in which the fixed point of view that gave us perspective in art, and the essay and novel in literature, yielded to Cubism and subsequent forms of modern art, and to stream of consciousness in writing.
Beginning in the 19th century, photography gave us the frozen, discontinuous moment, and the technique of montage in the motion picture gave us a series of shots and scenes whose connections have to be filled in by the audience. Telegraphy gave us the instantaneous transmission of messages that took them out of their natural context, the subject of the famous comment by Henry David Thoreau that connecting Maine and Texas to one another will not guarantee that they have anything sensible to share with each other. The wire services gave us the nonlinear, inverted pyramid style of newspaper reporting, which also was associated with the nonlinear look of the newspaper front page, a form that Marshall McLuhan referred to as a mosaic. Neil Postman criticized television's role in decontextualizing public discourse in Amusing Ourselves to Death, where he used the phrase, "in the context of no context," and I discuss this as well in my recently published follow-up to his work, Amazing Ourselves to Death.
The concept of the hiatus comes naturally to the premodern mind, schooled by myth and ritual within the context of oral culture. That same concept is repressed, in turn, by the modern mind, shaped by the linearity and rationality of literacy and typography. As the modern mind yields to a new, postmodern alternative, one that emerges out of the electronic media environment, we see the return of the repressed in the idea of the jump cut writ large.
There is psychological satisfaction in the deterministic view of history as the inevitable result of cause-effect relations in the Newtonian sense, as this provides a sense of closure and coherence consistent with the typographic mindset. And there is similar satisfaction in the view of history as entirely consisting of human decisions that are the product of free will, of human agency unfettered by outside constraints, which is also consistent with the individualism that emerges out of the literate mindset and print culture, and with a social rather than physical version of efficient causality. What we are only beginning to come to terms with is the understanding of formal causality, as discussed by Marshall and Eric McLuhan in Media and Formal Cause. What formal causality suggests is that history has a tendency to follow certain patterns, patterns that connect one state or stage to another, patterns that repeat again and again over time. This is the notion that history repeats itself, meaning that historical events tend to fall into certain patterns (repetition being the precondition for the existence of patterns), and that the goal, as McLuhan articulated in Understanding Media, is pattern recognition. This helps to clarify the famous remark by George Santayana, "those who cannot remember the past are condemned to repeat it." In other words, those who are blind to patterns will find it difficult to break out of them.
Campbell engages in pattern recognition in his identification of the heroic monomyth, as Arendt does in her discussion of the historical hiatus. Recognizing the patterns is the first step in escaping them, and may even allow for the possibility of taking control and influencing them. This also means understanding that the tendency for phenomena to fall into patterns is a powerful one. It is a force akin to entropy, and perhaps a result of that very statistical tendency that is expressed by the Second Law of Thermodynamics, as Terrence Deacon argues in Incomplete Nature. It follows that there are only certain points in history, certain moments, certain bifurcation points, when it is possible to make a difference, or to make a difference that makes a difference, to use Gregory Bateson's formulation, and change the course of history. The moment of transition, of initiation, the hiatus, represents such a moment.
McLuhan's concept of medium goes far beyond the ordinary sense of the word, as he relates it to the idea of gaps and intervals, the ground that surrounds the figure, and explains that his philosophy of media is not about transportation (of information), but transformation. The medium is the hiatus.
The particular pattern that has come to the fore in our time is that of the network, whether it's the decentralized computer network and the internet as the network of networks, or the highly centralized and hierarchical broadcast network, or the interpersonal network associated with Stanley Milgram's research (popularly known as six degrees of separation), or the neural networks that define brain structure and function, or social networking sites such as Facebook and Twitter, etc. And it is not the nodes, which may be considered the content of the network, that define the network, but the links that connect them, which function as the network medium, and which, in the systems view favored by Bateson, provide the structure for the network system, the interaction or relationship between the nodes. What matters is not the nodes, it's the modes.
Hiatus and link may seem like polar opposites, the break and the bridge, but they are two sides of the same coin, the medium that goes between, simultaneously separating and connecting. The boundary divides the system from its environment, allowing the system to maintain its identity as separate and distinct from the environment, keeping it from being absorbed by the environment. But the membrane also serves as a filter, engaged in the process of abstracting, to use Korzybski's favored term, letting through or bringing material, energy, and information from the environment into the system so that the system can maintain itself and survive. The boundary keeps the system in touch with its situation, keeps it contextualized within its environment.
The systems view emphasizes space over time, as does ecology, but the concept of the hiatus as a temporal interruption suggests an association with evolution as well. Darwin's view of evolution as continuous was consistent with Newtonian physics. The more recent modification of evolutionary theory put forth by Stephen Jay Gould, known as punctuated equilibrium, suggests that evolution occurs in fits and starts, in relatively rare and isolated periods of major change, surrounded by long periods of relative stability and stasis. Not surprisingly, this particular conception of discontinuity was introduced during the television era, in the early 1970s, just a few years after the publication of Peter Drucker's The Age of Discontinuity.
When you consider the extraordinary changes that we are experiencing in our time, technologically and ecologically, the latter underlined by the recent news concerning the United Nations' latest report on global warming, what we need is an understanding of the concept of change, a way to study the patterns of change, patterns that exist and persist across different levels, the micro and the macro, the physical, chemical, biological, psychological, and social, what Bateson referred to as metapatterns, the subject of further elaboration by biologist Tyler Volk in his book on the subject. Paul Watzlawick argued for the need to study change in and of itself in a little book co-authored with John H. Weakland and Richard Fisch, entitled Change: Principles of Problem Formation and Problem Resolution, which considers the problem from the point of view of psychotherapy. Arendt gives us a philosophical entrée into the problem by introducing the pattern of the hiatus, the moment of discontinuity that leads to change, and possibly a moment in which we, as human agents, can have an influence on the direction of that change.
To have such an influence, we do need to have that break, to find a space and more importantly a time to pause and reflect, to evaluate and formulate. Arendt famously emphasizes the importance of thinking in and of itself, the importance not of the content of thought alone, but of the act of thinking, the medium of thinking, which requires an opening, a time out, a respite from the onslaught of 24/7/365. This underscores the value of sacred time, and it follows that it is no accident that during that period of initiation in the story of the exodus, there is the revelation at Sinai and the gift of divine law, the Torah or Law, chief among its commandments the Ten Commandments, which include the fourth commandment, the one presented in greatest detail: to observe the Sabbath day. This premodern ritual requires us to make the hiatus a regular part of our lives, to break the continuity of profane time on a weekly basis. From that foundation, other commandments establish the idea of the sabbatical year, and the sabbatical of sabbaticals, or jubilee year. Whether it's a Sabbath mandated by religious observance, or a new movement to engage in a Technology Sabbath, the hiatus functions as the response to the homogenization of time that was associated with efficient causality and literate linearity, and that continues to intensify in conjunction with the technological imperative of efficiency über alles.
To return one last time to the quote that I began with, the end of the old is not necessarily the beginning of the new because there may not be a new beginning at all, there may not be anything new to take the place of the old. The end of the old may be just that, the end, period, the end of it all. The presence of a hiatus to follow the end of the old serves as a promise that something new will begin to take its place after the hiatus is over. And the presence of a hiatus in our lives, individually and collectively, may also serve as a promise that we will not inevitably rush towards an end of the old that will also be an end of it all, that we will be able to find the opening to begin something new, that we will be able to make the transition to something better, that both survival and progress are possible, through an understanding of the processes of continuity and change.
Anthony Grafton calls David Nirenberg’s Anti-Judaism “one of the saddest stories, and one of the most learned, I have ever read.” Grafton knows that Anti-Judaism “is certainly not the first effort to survey the long grim history of the charges that have been brought against the Jews by their long gray line of self-appointed prosecutors.” What makes this account of the long history of Jewish hatred so compelling is that Nirenberg asks the big question: Why the Jews?
[Nirenberg] wants to know why: why have so many cultures and so many intellectuals had so much to say about the Jews? More particularly, he wants to know why so many of them generated their descriptions and explanations of Jewishness not out of personal knowledge or scholarly research, but out of thin air—and from assumptions, some inherited and others newly minted, that the Jews could be wholly known even to those who knew no Jews.
The question recalls the famous joke told during the Holocaust, especially amongst Jews in concentration camps. Here is one formulation of the joke from Antisemitism, the first book in the trilogy that comprises Hannah Arendt’s magnum opus, The Origins of Totalitarianism: “An antisemite claimed that the Jews had caused the war; the reply was: Yes, the Jews and the bicyclists. Why the bicyclists? asks the one. Why the Jews? asks the other.”
The point of the joke is clear: Anti-Judaism is as senseless and irrational as hatred of bicyclists would be. “The theory that the Jews are always the scapegoat,” Arendt writes, “implies that the scapegoat might have been anyone else as well”—even bicyclists. The question, then, is why the Jews? Grafton gives a clue to Nirenberg’s subtle answer:
Nirenberg’s answer—and to summarize it, as to summarize so much of this impassioned book, is to flatten it—is that ideas about the Jews can do, and have done, many different and important jobs. True, they are anything but stable: this is not a paper chase after some original idea of the Jew that crops up everywhere from early Christianity to early Nazism. Visions of the Jews change emphasis and content as the larger societies that entertain them change shape and texture. Ideas have multiple contexts, and Nirenberg shows dazzling skill and a daunting command of the sources as he observes the changes and draws connections between them and his authors’ larger worlds.
Nirenberg’s point is that anti-Judaism has nothing to do with Jews themselves. The negative ideas about Jews are held throughout history by a motley group of Christians, philosophers, tyrants, and martyrs. Shakespeare’s account of Shylock is only one of many examples in which an intellectual employs anti-Jewish stereotypes—the Jew as greedy moneylender—to make a wider social critique, this time of the dangers of capitalism. London is becoming a city of commerce. There are no Jews in London. Yet Shakespeare turns to Jews in order to find a way to criticize the emergent commercial culture.
The use of negative sentiments about Jews to bash capitalism was common, Nirenberg writes, and carries through history from Jerome to Marx. Marx couches his critique of capitalism through the lens of a critique of Jews. Shakespeare does the same with commercial society. Jews stand in for the oppressed in the world, so that oppressing Christians could be seen as making them Jewish. Jews at the same time were seen as powerful bankers and powerful agents of world domination, so that any group of conspirators, from Bolsheviks in Russia to media moguls in Hollywood, could be tarred with the pungent scent of Judaism.
Jews have been characterized by non-Jews for their obstinacy—their refusal, for example, to recognize the known truth that the Messiah had come, which enabled them to become the villains of both early Christian and early Muslim narratives. They have been characterized by non-Jews for their viciousness—their desire to desecrate the sacrament and murder Christian children, which allowed them to be used both by rebels against royal authority, and by kings, in the Middle Ages, as each side could claim, when the wind blew from the right quarter, that Jews were polluting society through their materialism and greed. . . . Nirenberg’s parade of imagined and imaginary Jews—the most hideous procession since that of the flagellants in The Seventh Seal—stretches from the Arabian peninsula to London, and from the seventh century BCE to the twentieth CE. Working always from the original sources in their original languages, he observes the multiple ways in which imaginary Jews served the purposes of real writers and thinkers—everyone from Muhammad, founding a new religion, to Shakespeare, observing a new commercial society. God, here, is partly in the details: in the careful, tenderly observant way in which Nirenberg dissects everything from fierce political rhetoric to resonant Shakespearean drama. In works of the imagination, profound treatises, and acts of political radicalism, as he analyses them, imaginary Jews are wielded to powerful effect. He shows us the philosophes of the Enlightenment, those friends of humanity and enemies of tyrannical “infamy,” as they develop a viciously negative vision of Jewish sterility and error to attack Christianity at its origins or to characterize the authorities whom they defied.
The only reservation Grafton voices concerns the univocality of Nirenberg’s account. As exceptional as the account of anti-Jewish opinion is, Nirenberg largely ignores other perspectives and examples where real and imaginary Jews were accepted, embraced, and even praised.
As a social historian of conflict and an intellectual historian of the uncanny imagination, Nirenberg is unbeatable. But Jews and non-Jews lived other histories together as well. As Josephus recalled, when the thousands of diaspora Jews settled in the cities of the Roman world, across Asia Minor and Italy as well as Egypt, many of their pagan neighbors found their ways attractive. Pagans admired the Jews’ pursuit of a coherent code for living and their worship of a single, unseen god. Some became “god-fearers,” who accepted the Jewish god but did not hold full membership in the Jewish community. Some converted. Jews, meanwhile, pursued their own visions of high culture—whether these involved learning to write Greek tragedies about the Jewish past or rebuilding one’s foreskin to make possible appearances at the gymnasium.
Grafton largely stops there and minimizes his “very small complaints…. Anti-Judaism is that rare thing, a great book, as much in its ability to provoke disagreement as in its power to shape future writing on the vast territory that its author has so brilliantly mapped.” But Grafton’s small complaints deserve a wider hearing, especially as concerns the leading question he and Nirenberg pose, “Why the Jews?”
The overarching argument of Anti-Judaism is one of eternal antisemitism: Anti-Judaism had nothing to do with the Jews themselves. It is an attitude that holds the Jews to blame and is concerned with imaginary Jews as opposed to real Jews. Anti-Judaism is powerful and impactful, but it has no rational connection to reality. Here is how Michael Walzer aptly sums up Nirenberg’s argument:
His argument is that a certain view of Judaism lies deep in the structure of Western civilization and has helped its intellectuals and polemicists explain Christian heresies, political tyrannies, medieval plagues, capitalist crises, and revolutionary movements. Anti-Judaism is and has long been one of the most powerful theoretical systems “for making sense of the world.” No doubt, Jews sometimes act out the roles that anti-Judaism assigns them—but so do the members of all the other national and religious groups, and in much greater numbers. The theory does not depend on the behavior of “real” Jews.
As Walzer notes in his own review of Anti-Judaism in the most recent issue of the New York Review of Books, Nirenberg includes an epilogue that takes on the most famous opponent of his view of eternal antisemitism, Hannah Arendt. As Arendt characterizes such a view, “Jew-hatred is a normal and natural reaction to which history gives only more or less opportunity. Outbursts need no special explanation because they are natural consequences of an eternal problem.” Since anti-Judaism is eternal and unending, it has been normalized: “If it is true that mankind has insisted on murdering Jews for more than two thousand years, then Jew-killing is a normal, and even human, occupation and Jew-hatred is justified beyond the need of argument.”
The point is that Grafton’s minor complaint—that Nirenberg offers a magisterial account of Jew-hatred and ignores philo-semitism—is not so minor after all. By claiming that anti-Judaism is omnipresent and omnipotent—by focusing only on anti-Judaism and leaving aside those who embrace or praise Jews—Nirenberg risks normalizing antisemitism. Everyone traffics in Jew-hatred, even Jews. Such a move means, however, that we lose the ability to distinguish those who are antisemites from those who are not. Which is why Arendt argues that the eternal antisemitism thesis is one way to “escape the seriousness of antisemitism and the significance of the fact that the Jews were driven into the storm center of events.”
Walzer and Nirenberg condemn Arendt for seriously asking the question “Why the Jews?” She insists that there are reasons for antisemitism, reasons that the Nazis sought to exterminate the Jews and not the bicyclists. There are such reasons, and anti-Judaism is not simply a mysterious and irrational accident. She does not think those are good reasons. She of course never says that the Jews are to blame or that the Jews were responsible for the Holocaust, as Nirenberg and Walzer wrongly argue. But she does insist we confront the fact that Jews have proven such convenient targets for anti-Judaism, that we seek to understand why it is that over and over it is the Jews who are targeted. There is not one simple answer to that question, Why the Jews? But Arendt asks it seriously and courageously and seeks to come up with a series of potential answers, none of which have to do with her claiming that the Jews are to blame.
If you have The Origins of Totalitarianism on your shelf, take it out and read Chapter One on “Antisemitism as an Outrage to Common Sense.” Then read Grafton and Walzer on Nirenberg’s Anti-Judaism. It will be a sad but thrilling weekend.
Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.
Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.
Drones are simply one weapon in a large arsenal with which we fight the war on terror. Even targeted killings, the signature drone capability, are nothing new. The U.S. and other countries have targeted and killed individual leaders for decades if not centuries, using snipers, poisons, bombs, and many other technologies. To take a historical perspective, drones don’t change much. Nor is the airborne capacity of drones to deliver devastation from afar anything new, having as its predecessors the catapult, the long bow, the bomber, and the cruise missile. And yet, there is seemingly something new about the way drones change the feel and reality of warfare. On one side, drones sanitize the battlefield, transforming it from a space of blood, fear, and heroic fortitude into a video game played on consoles. On the other side, drones dominate life, creating a low-pitched humming sound that reminds inhabitants that at any moment a missile might pierce their daily routines. These two sides of the phenomenology of drones are the topic of an essay by Nasser Hussain in The Boston Review: “In order to widen our vision, I provide a phenomenology of drone strikes, examining both how the world appears through the lens of a drone camera and the experience of the people on the ground. What is it like to watch a drone’s footage, or to wait below for it to strike? What does the drone’s camera capture, and what does it occlude?” You can also read Roger Berkowitz’s weekend read on seeing through drones.
Marilynne Robinson, speaking to the American Conservative about her faith, elaborates on what she sees as the central flaws in contemporary American Christianity: "Something I find regrettable in contemporary Christianity is the degree to which it has abandoned its own heritage, in thought and art and literature. It was at the center of learning in the West for centuries—because it deserved to be. Now there seems to be actual hostility on the part of many Christians to what, historically, was called Christian thought, as if the whole point were to get a few things right and then stand pat. I believe very strongly that this world, these billions of companions on earth that we know are God’s images, are to be loved, not only in their sins, but especially in all that is wonderful about them. And as God is God of the living, that means we ought to be open to the wonderful in all generations. These are my reasons for writing about Christian figures of the past. At present there is much praying on street corners. There are many loud declarations of personal piety, which my reading of the Gospels forbids me to take at face value. The media are drawn by noise, so it is difficult to get a sense of the actual state of things in American religious culture."
Is poetry going the way of the Dodo bird? Vanessa Place makes this argument in a recent essay “Poetry is Dead. I Killed It,” on the Poetry Foundation website. And Kenneth Goldsmith, in the New Yorker, asks whether Place is right. The internet, he suggests, has killed or at least so rethought poetry that it may be unrecognizable. "Quality is beside the point—this type of content is about the quantity of language that surrounds us, and about how difficult it is to render meaning from such excesses. In the past decade, writers have been culling the Internet for material, making books that are more focussed [sic] on collecting than on reading. These ways of writing—word processing, databasing, recycling, appropriating, intentionally plagiarizing, identity ciphering, and intensive programming, to name just a few—have traditionally been considered outside the scope of literary practice."
In a rare interview, famously reclusive Calvin and Hobbes cartoonist Bill Watterson prognosticates on the future of the comics: "Personally, I like paper and ink better than glowing pixels, but to each his own. Obviously the role of comics is changing very fast. On the one hand, I don’t think comics have ever been more widely accepted or taken as seriously as they are now. On the other hand, the mass media is disintegrating, and audiences are atomizing. I suspect comics will have less widespread cultural impact and make a lot less money. I’m old enough to find all this unsettling, but the world moves on. All the new media will inevitably change the look, function, and maybe even the purpose of comics, but comics are vibrant and versatile, so I think they’ll continue to find relevance one way or another. But they definitely won’t be the same as what I grew up with."
Cambodian director Rithy Panh's new movie, The Missing Picture is about the rule of the Khmer Rouge in Cambodia. In making the film, he had to confront the challenge of making a movie about atrocities that are famously without explicit visual records, and he hit upon a unique solution: clay dolls. Although these figures "are necessarily silent, immobile, and therefore devoid of the intensity of those moments in other Panh films where his camera bores in on the face of a witness and lingers there as he remembers what happened, or what he did," Richard Bernstein suggests that they give the movie a unique power.
This week on the blog, Ian Storey revisits George Orwell's prescient essay, "Politics and the English Language." Jeffrey Champlin looks at James Muldoon's essay about Arendt's writings on the advocacy of council systems in On Revolution. And your weekend read looks at the cultural impact of drones on the nations and groups that are employing them.
It requires courage even to leave the protective security of our four walls and enter the public realm, not because of particular dangers which may lie in wait for us, but because we have arrived in a realm where the concern for life has lost its validity. Courage liberates men from their worry about life for the freedom of the world. Courage is indispensable because in politics not life but the world is at stake.
-Hannah Arendt, Between Past and Future
This quote is a favorite among political theorists who study Arendt. Understandably, for it seems perfectly to capture Arendt as the figure whose principal concern is the public sphere and the politics that can occur only in this sphere. The private realm is characterized by protective walls that allow us to blind ourselves to everything but our individual needs, while the public opens us up to the grander concerns of the world.
Courage, in this reading, is largely a rhetorical flourish that affirms the grandness of the public realm and the smallness of private, bourgeois concerns with comfort and self-interest. But in reading the concept of courage solely through what has become the “characteristically” Arendtian opposition between the public and private spheres, one overlooks the profound significance of courage for understanding the character of the public realm as Arendt uniquely conceived of it. Arendt acknowledges that courage is necessary for individuals to leave the private sphere and its particular concerns: it takes courage to leave the protective security of private life. But she does not stop there; she asserts that courage reflects a key feature of the public realm itself, beyond and independent of the individual’s move out of the private. According to Arendt, we need courage not only to leave the private sphere, but also to confront the fact that in the public realm, the world itself is at stake in our own activity of politics.
What Arendt means by this statement that the world is at stake in politics is not clear without an understanding of the plurality that is for her constitutive of the public realm. For Arendt, plurality is not a statement of difference; it does not summarize the fact that each occupies his or her own standpoint in the world. Rather, plurality reflects the fact that all individuals must show themselves and appear to other human beings. She writes in The Life of the Mind, “everything that is is meant to be perceived by somebody. Not Man but men inhabit this planet. Plurality is the law of the earth.” In other words, plurality reflects the fact that the human world is a function of relations of spectatorship. Our world is built upon individuals showing themselves to and being seen by others.
Politics for Arendt is that activity by which individuals reveal or disclose themselves to one another; it describes the activity by which we appear. But when we understand with Arendt that the world itself is constituted in and by these relations of spectatorship, we are forced to confront the fact that the stakes of choosing to appear in public cannot be limited to individual life and the question of whether or not we choose to live this life courageously. In choosing to appear, in having the courage to appear, we accept the task of creating the world itself and become constitutive members of what is an objective home for all human beings.
This relationship to the world and the burdens and responsibility it imposes on individuals in the very basic task of appearing is for Arendt a necessary, inescapable feature of public life. And this fact that individual appearance is constitutive of the world is what ultimately makes the decision to enter the public realm a matter of courage. To show oneself to others—to say, as Cicero did, “[b]y God I’d much rather go astray with Plato than hold true views with these people”—is not just to reveal, however courageously and however contrary to established codes of behavior, oneself as an individual. It is to affirm and reconcile oneself to one’s responsibilities in a world that is created and sustained by nothing other than individuals showing themselves in their thoughts and judgments to one another. The courage that politics demands is the courage to take on the responsibility to make the world.
Courage might be one motivation behind the decision to leave the protective walls of the private. Others might be recklessness, pride, ambition, or, as Arendt said of the Nazis, merely the ruthless desire to conform to what others are doing. But the choice to engage in politics and appear in the world implicates not just questions about the individual’s character, good or bad, but the ground of the world itself and whether it is strong enough to sustain a world for all men. And one cannot take up this task of creating and sustaining the world with nothing more than one’s own human capacity to appear to others without courage.
Science fiction, Hannah Arendt tells us, has too long been undervalued by those who would seek to comprehend the human condition. It is in the human fantasies of our future that mankind reveals its desires, both possible and not yet possible. For Arendt, some of those deepest and longest-held desires included the desire to flee the earth, to play God and to make human beings, and to make labor unnecessary. Her book, The Human Condition, is in part an effort to think through the fact that many of these human desires were, for the first time in millennia, threatening to become possible.
We make a mistake to ignore science fiction, especially in an era where the unprecedented advance of technological ability makes it possible that today’s dreams will soon be realized. With that in mind, it is worth looking at Alex Mar’s profile of the life, death, and cryogenic preservation of FM-2030, otherwise known as Fereidoun M. Esfandiary.
Writing in The Believer, Mar introduces FM-2030, one of the founders of the transhumanism movement. FM-2030 has a single defining dream for the future of man: that we overcome our given and earthly and biological limits. If man, as Arendt writes, is both someone who lives in a given and fated world and someone who can change and re-make that world, transhumanists like FM-2030 imagine a time in the near future in which all biological, temporal, and physical limits will be overcome. Including death.
The ultimate goal for transhumanists has never been merely to improve mankind, but to defeat our greatest opponent: death. Of course, not all champions of Progress make the titanic leap to Immortality—the jump is so vast, so wildly immodest and presumptuous as to cross over into the realm of the kind of uncomfortably eccentric. But as FM would put it, “No one today can be too optimistic.” Transhumanists, in their crusade against time, have begun to buy themselves some of it, at the cost of a pricey life-insurance policy. With some cryoprotectants and a lot of liquid nitrogen, humanity—or at least the one-thousand-ish people affiliated with Alcor, currently the largest cryonics group in the country—has been gifted with the semi-scientific semi-possibility of radically extended life. Die a clinical “death,” go to sleep, wake up eons later, when existence is a whole new ball game. So when will immortality come?
If you want to understand the human condition, you need to know our most human dreams as well. Today, technological optimism is at the center of those dreams. Fereidoun M. Esfandiary was for many the first great transhumanist of the late 20th century, the precursor to Ray Kurzweil, who also dreams of his own immortality. This story of his untimely death, and of the efforts to preserve him, reveals much about the movement he helped to found.
Read the article here.
Read related essays on the human dream of a non-human future here.
You can also purchase the inaugural issue of HA, the Hannah Arendt Center Journal, which features a selection of articles by Nicholson Baker, Babette Babich, Rob Riemen, Marianne Constable, and Roger Berkowitz from our 2010 conference, “Human Being in an Inhuman Age.”
“[Augustine] distinguishes between the questions of "Who am I?" and "What am I?" the first being directed by man at himself […] For in the "great mystery," the grande profundum, which man is (iv. 14), there is "something of man [aliquid hominis] which the spirit of man which is in him itself knoweth not. But Thou, Lord, who has made him [fecisti eum] knowest everything of him [eius omnia]" (x. 5).”
-Hannah Arendt, The Human Condition
In The Human Condition, Arendt raises major concerns about the place of man, but she does not intend to respond to the loss of the earth as a unique human condition with a restoration of solid ground. To the question “Who am I?” the only answer is: “You are a man—whatever that may be.” In lieu of an answer that would give man a new foundation, Arendt offers a description of man's ever changing territory.
Following Augustine, Arendt claims that only God could have the distance to answer the question of "what" man is with anything resembling a concrete statement of human nature. She respects the unknown “spirit of man,” even beyond the knowledge provided by religion.
When philosophy attempts to answer this question, it ends up creating its own image of a higher power, which remains linked through projection to man. Importantly though, philosophy should still ask the question.
Some context can help to open Arendt's question here for readers in English-speaking countries, where philosophical anthropology never gained the same traction as in Germany. Her challenge picks up on the heated debates of the 1920s and 30s, culminating in the work of Husserl and Heidegger, over how to take the collapse of universal values seriously without falling back into simple subjectivism.
In the space of four pages of Being and Time (46-49), Martin Heidegger specifies his criticism with reference to Dilthey, Bergson, Scheler, and Husserl, as well as views from ancient Greek philosophy and Genesis. Heidegger says he has focused his analytic of Dasein on the question of Being and that it cannot therefore provide the fully ontological basis of Dasein needed for a "'philosophical' anthropology," but states that part of his goal is to "make such an anthropology possible." Later, though, in section 10, Heidegger provides a further explanation of his criticism of anthropology: in "the attempt to determine the essence of 'man,' as an entity, the question of Being has been forgotten."
In its turn to experience and consciousness, philosophical anthropology forgets to ask the question of the ontological definition of perceptual experience (cogitationes). Heidegger thus suggests that his investigation might provide the basis for an anthropology but does not claim to actually deliver this basis. He opens the question of the definition of man, but does so to orient man (recast as Dasein) toward his relation to Being. In a parallel manner, we can understand Arendt's reading of Augustine as opening the question of the relation between the "who" and “what” man is, but not closing it. Her work here is provocative because it cannot be said to be in the service of a simple secularization that replaces a higher power with human measure. Nor does she wish to save or restore a divine guarantee. Perhaps Augustine allows her to pose similar questions of philosophical anthropology to those raised by Heidegger, but to win some distance from her teacher so that she can open a new space of freedom of action rather than freedom of thought.
In an essay in the Wall Street Journal, Frans de Waal—C. H. Candler Professor of Primate Behavior at Emory University—offers a fascinating review of recent scientific studies that upend long-held expectations about the intelligence of animals. De Waal rehearses a catalogue of fantastic studies in which animals do things that scientists have long thought they could not do. Here are a few examples:
Ayumu, a male chimpanzee, excels at memory; just as the IBM computer Watson can beat human champions at Jeopardy, Ayumu can easily best the human memory champion in games of memory.
Similarly, Kandula, a young elephant bull, was able to reach some fragrant fruit hung out of reach by moving a stool over to the tree, standing on it, and reaching for the fruit with his trunk. I’ll admit this doesn’t seem like much of a feat to me, but for the researchers de Waal talks with, it is surprising proof that elephants can use tools.
Scientists may be surprised that animals can remember things or use tools to accomplish tasks, but anyone raised on children’s tales of Lassie or Black Beauty knows this well, as does anyone whose pet dog has opened a door, brought them a newspaper, or barked at intruders. The problem these studies address is less our societal view of animals than the overly reductive view of animals that de Waal attributes to his fellow scientists. It’s hard to take these studies seriously as evidence that animals think in the way that humans do.
Seemingly more interesting are experiments with self-recognition and also facial recognition. De Waal describes one Asian elephant who stood in front of a mirror and “repeatedly rubbed a white cross on her forehead.” Apparently the elephant recognized the image in the mirror as herself. In another experiment, chimpanzees were able to recognize which pictures of chimpanzees were from their own species. Like my childhood Labrador, who used to stare knowingly into the mirror, the animals in these studies are able to recognize themselves. This means that animals do, likely, understand that they are selves.
For de Waal, these studies have started to upend a view of humankind's unique place in the universe that dates back at least to ancient Greece. “Science,” he writes, “keeps chipping away at the wall that separates us from the other animals. We have moved from viewing animals as instinct-driven stimulus-response machines to seeing them as sophisticated decision makers.”
The flattening of the distinction between animals and humans is to be celebrated, de Waal argues, and not feared. He writes:
Aristotle's ladder of nature is not just being flattened; it is being transformed into a bush with many branches. This is no insult to human superiority. It is long-overdue recognition that intelligent life is not something for us to seek in the outer reaches of space but is abundant right here on earth, under our noses.
De Waal has long championed the intelligence of animals, and now his vision is gaining momentum. This week, in a long essay called “One of Us” in the new Lapham’s Quarterly on animals, the glorious essayist John Jeremiah Sullivan begins with this description of studies similar to the ones de Waal writes about:
These are stimulating times for anyone interested in questions of animal consciousness. On what seems like a monthly basis, scientific teams announce the results of new experiments, adding to a preponderance of evidence that we’ve been underestimating animal minds, even those of us who have rated them fairly highly. New animal behaviors and capacities are observed in the wild, often involving tool use—or at least object manipulation—the very kinds of activity that led the distinguished zoologist Donald R. Griffin to found the field of cognitive ethology (animal thinking) in 1978: octopuses piling stones in front of their hideyholes, to name one recent example; or dolphins fitting marine sponges to their beaks in order to dig for food on the seabed; or wasps using small stones to smooth the sand around their egg chambers, concealing them from predators. At the same time neurobiologists have been finding that the physical structures in our own brains most commonly held responsible for consciousness are not as rare in the animal kingdom as had been assumed. Indeed they are common. All of this work and discovery appeared to reach a kind of crescendo last summer, when an international group of prominent neuroscientists meeting at the University of Cambridge issued “The Cambridge Declaration on Consciousness in Non-Human Animals,” a document stating that “humans are not unique in possessing the neurological substrates that generate consciousness.” It goes further to conclude that numerous documented animal behaviors must be considered “consistent with experienced feeling states.”
With nuance and subtlety, Sullivan understands that our tradition has not drawn the boundary between human and animal nearly as securely as de Waal portrays it. Throughout human existence, humans and animals have been conjoined in the human imagination. Sullivan writes that the most consistent “motif in the artwork made between four thousand and forty thousand years ago,” is the focus on “animal-human hybrids, drawings and carvings and statuettes showing part man or woman and part something else—lion or bird or bear.” In these paintings and sculptures, our ancestors gave form to a basic intuition: “Animals knew things, possessed their forms of wisdom.”
Religious history too is replete with evidence of the human recognition of the dignity of animals. God says in Isaiah that the beasts will honor him, and St. Francis, the namesake of the new Pope, is famous for preaching to birds. What is more, we are told that God cares about the deaths of animals.
In the Gospel According to Matthew we’re told, “Not one of them will fall to the ground apart from your Father.” Think about that. If the bird dies on the branch, and the bird has no immortal soul, and is from that moment only inanimate matter, already basically dust, how can it be “with” God as it’s falling? And not in some abstract all-of-creation sense but in the very way that we are with Him, the explicit point of the verse: the line right before it is “fear not them which kill the body, but are not able to kill the soul.” If sparrows lack souls, if the logos liveth not in them, Jesus isn’t making any sense in Matthew 10:28-29.
What changed and interrupted the ancient and deeply human appreciation of our kinship with besouled animals? Sullivan’s answer is René Descartes. The modern depreciation of animals, Sullivan writes,
proceeds, with the rest of the Enlightenment, from the mind of René Descartes, whose take on animals was vividly (and approvingly) paraphrased by the French philosopher Nicolas Malebranche: they “eat without pleasure, cry without pain, grow without knowing it; they desire nothing, fear nothing, know nothing.” Descartes’ term for them was automata—windup toys, like the Renaissance protorobots he’d seen as a boy in the gardens at Saint-Germain-en-Laye, “hydraulic statues” that moved and made music and even appeared to speak as they sprinkled the plants.
It is too easy, however, to say that the modern comprehension of the difference between animal and human proceeds from a mechanistic view of animals. We live in the age of the animal rights movement. Around the world, societies exist and thrive whose mission is to prevent cruelty to animals and to protect them. Yes, factory farms treat chickens and pigs as organic mechanisms for the production of meat, but these farms coexist with active and quite successful movements calling for humane standards in food production. Whatever the power of Cartesian mechanics, its success is at odds with the persistence of an ancient, religious solidarity, and a deeply modern sympathy, between human and animal.
A more meaningful account of the modern attitude towards animals might be found in Spinoza. Spinoza, as Sullivan quotes him, recognizes that animals feel in ways that Descartes did not. As do animal rights activists, Spinoza admits what is obvious: that animals feel pain, show emotion, and have desires. And yet, Spinoza maintains a distinction between human and animal—one grounded not in emotion or feeling, but in human nature. In his Ethics, he writes:
Hence it follows that the emotions of the animals which are called irrational…only differ from man’s emotions to the extent that brute nature differs from human nature. Horse and man are alike carried away by the desire of procreation, but the desire of the former is equine, the desire of the latter is human…Thus, although each individual lives content and rejoices in that nature belonging to him wherein he has his being, yet the life, wherein each is content and rejoices, is nothing else but the idea, or soul, of the said individual…It follows from the foregoing proposition that there is no small difference between the joy which actuates, say, a drunkard, and the joy possessed by a philosopher.
Spinoza argues against the law prohibiting the slaughter of animals—it is “founded rather on vain superstition and womanish pity than on sound reason”—because humans are more powerful than animals. Here is how he defends that position:
The rational quest of what is useful to us further teaches us the necessity of associating ourselves with our fellow men, but not with beasts, or things, whose nature is different from our own; we have the same rights in respect to them as they have in respect to us. Nay, as everyone’s right is defined by his virtue, or power, men have far greater rights over beasts than beasts have over men. Still I do not deny that beasts feel: what I deny is that we may not consult our own advantage and use them as we please, treating them in the way which best suits us; for their nature is not like ours.
Spinoza’s point is quite simple: Of course animals feel and of course they are intelligent. Who could doubt such a thing? But they are not human. That is clear too. While we humans may care for and even love our pets, we recognize the difference between a dog and a human. And we will, in the end, associate more with our fellow humans than with dogs and porpoises. Finally, we humans will use animals when they serve our purposes. And this is acceptable, because we have the power to do so.
Is Spinoza arguing that might makes right? Surely not in the realm of law amongst fellow humans. But he is insisting that we recognize that for us humans, there is something about being human that is different and, even, higher and more important. Spinoza couches his argument in the language of natural right, but what he is saying is that we must recognize that there are important differences between animals and humans.
At a time that values equality over what Friedrich Nietzsche called the “pathos of difference,” the valuation of human beings over animals is ever more in doubt. This comes home clearly in a story told recently by General Stanley McChrystal about a soldier who expressed sympathy for some dogs killed in a raid in Iraq. McChrystal responded severely: “‘Seven enemy were killed on that target last night. Seven humans. Are you telling me you’re more concerned about the dog than the people that died?’ The car fell silent again. ‘Hey listen,’ I said. ‘Don’t lose your humanity in this thing.’” Many, no doubt, are more concerned about the deaths of animals than about the deaths of humans, or at least equally concerned. There is ever-increasing discomfort with McChrystal’s common-sense affirmation of Spinoza’s claim that human beings simply are of more worth than animals.
The distinctions upon which the moral sense of human distinctiveness is based are foundering. For de Waal and Sullivan, the danger today is that we continue to insist on differences between animals and humans—differences that we don’t fully understand. The consequence of their openness to the humanization of animals, however, is undoubtedly the animalization of humans. The danger that we humans lose sight of what distinguishes us from animals is much more significant than the possibility that we underestimate animal intelligence.
I fully agree with de Waal and Sullivan that there is a symphony of intelligence in the world, much of it not human. And yes, we should have proper respect for our ignorance. But all the experiments in the world do little to alter the basic fact that, no matter how intelligent and feeling and even conscious animals may be, humans and animals are different.
What is the quality of that difference? It is difficult to say and may never be fully articulated in propositional form. On one level it is this: Simply to live, as do plants or animals, does not constitute a human life. In other words, human life is not simply about living. Nor is it about doing tasks or even being conscious of ourselves as humans. It is about living meaningfully. There may, of course, be some animals that can create worlds of meaning—worlds that we have not yet discovered. But their worlds are not the worlds to which we humans aspire.
Over two millennia ago, Sophocles, in his “Ode to Man,” named man Deinon, a Greek word that connotes both greatness and horror, that which is so extraordinary as to be at once terrifying and beautiful. Man, Sophocles tells us, can travel over water and tame animals, using them to plough fields. He can invent speech, and institute governments that bring humans together to form lasting institutions. As an inventor and maker of his world, this wonder that is man terrifyingly carries the seeds of his destruction. As he invents and comes to control his world, he threatens to extinguish the mystery of his existence, that part of man that man himself does not control. As the chorus sings: “Always overcoming all ways, man loses his way and comes to nothing.” If man so tames the earth as to free himself from uncertainty, what then is left of human being?
Sophocles knew that man could be a terror; but he also glorified the wonder that man is. He knew that what separates us humans from animals is our capacity to alter the earth and our natural environment. “The human artifice of the world,” Arendt writes, “separates human existence from all mere animal environment…” Not only by building houses and erecting dams—animals can do those things and more—but also by telling stories and building political communities that give to man a humanly created world in which he lives. If all we did as humans was live or build things on earth, we would not be human.
To be human means that we can destroy all living matter on the earth. We can even, today, destroy the earth itself. Whether we do so or not, to live on the earth today is a choice that we make, not a matter of fate or chance. Our earth, although we did not create it, is now something we humans can decide to sustain or destroy. In this sense, it is a human creation. No other animal has such a potential or such a responsibility.
There is a deep desire today to flee from that awesome and increasingly unbearable human responsibility. We flee, therefore, our humanity and take solace in the view that we are just one amongst the many animals in the world. We see this reductionism above all in human rights discourse. One core demand of human rights—that men and women have a right to live and not be killed—brought about a shift in the idea of humanity from logos to life. The rise of a politics of life—the political demand that governments limit freedoms and regulate populations in order to protect and facilitate their citizens’ ability to live in comfort—has pushed the animality, the “life,” of human beings to the center of political and ethical activity. In embracing a politics of life over a politics of the meaningful life, human rights rejects the distinctive dignity of human rationality and works to reduce humanity to its animality.
Hannah Arendt saw human rights as dangerous precisely because they risked confusing the meaning of human worldliness with the existence of mere animal life. For Arendt, human beings are the beings who build and live in a political world, by which she means the stories, institutions, and achievements that mark the glory and agony of humanity. To be human, she insists, is more than simply living, laboring, working, acting, and thinking. It is to do all of these activities in such a way as to create, together, a common life amongst a plurality of persons.
I fear that the interest in animal consciousness today is less a result of scientific proof that animals are human than a reflection of an increasing discomfort with the world we humans have built. A first step in responding to such discomfort, however, is a reaffirmation of our humanity and our human responsibility. There is no better way to begin that process than by engaging with a very human response to the question of our animality. Towards that end, I commend to you “One of Us,” by John Jeremiah Sullivan.