Hannah Arendt Center for Politics and Humanities
28 Jul 2014

Death and the Public Realm


**This article was originally published on May 13, 2013**

"There is perhaps no clearer testimony to the loss of the public realm in the modern age than the almost complete loss of authentic concern with immortality, a loss somewhat overshadowed by the simultaneous loss of the metaphysical concern with eternity."

--Hannah Arendt, The Human Condition

12 Jul 2014

The Unknown Within Ourselves


Privacy is sacrificed unthinkingly to government and corporations; transparency and sharing trump depth and inscrutability; and we justifiably bemoan the death of privacy. Technology is blamed, but the truth is that privacy is being lost not because it can be, but because we have forgotten why it is important.

30 Jun 2014

Amor Mundi 6/29/14


Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.

Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.


Lila

"It seemed to me to be half-sadness and half-fury, and I wondered what in her life could have put that expression in her eyes." This is how Reverend John Ames, the voice of Marilynne Robinson's Pulitzer Prize-winning novel Gilead, describes his younger wife Lila, whose former life is largely a mystery even to Ames himself. Now, Robinson's much-anticipated fourth novel will tell Lila's story: how, after she is rescued as a child by a drifter named Doll, the two craft a life together on the run and on the fringes of society. Though the novel does not come out until October, its description recalls another remarkable female pair in Robinson's work--the young Ruth and her aunt Sylvie from Robinson's first novel Housekeeping, which came out thirty-five years ago--and Housekeeping's theme of Christian homelessness. For now, FSG offers a sneak peek of Lila: "The child was four or five, long-legged, and Doll couldn't keep her covered up, but she chafed at her calves with her big, rough hand and brushed the damp from her cheek and her hair. She whispered, 'Don't know what I think I'm doing. Never figured on it. Well, maybe I did. I don't know. I guess I probly did. This sure ain't the night for it.' She hitched up her apron to cover the child's legs and carried her out past the clearing. The door might have opened, and a woman might have called after them, Where you going with that child? and then, after a minute, closed the door again, as if she had done all decency required. 'Well,' Doll whispered, 'we'll just have to see.'"

Tactics Beat Genius

Simon Critchley draws the "bleeding obvious" philosophical lesson from soccer: the priority of the coach over individual players. "Allow me to state the bleeding obvious: this is a tactical game. It is not about passion and individual genius, notwithstanding the relentless commodification of stars like Messi, Ronaldo, and Neymar. No, soccer is about the use of reason and intelligence in order to construct a collective team formation that will contain and defeat the opposition. It requires discipline and relentless training, particularly in order to maintain the shape of the team and the way it occupies and controls space. This is the job of the coach, who tends to get reduced to some kind of either bizarrely animated comic character or casually disaffected bystander when games are televised. But he is the one who sets the team up to play a certain, clearly determined way, the prime mover although sometimes moved rather than unmoved. Otherwise said, soccer is not about individual players." Soccer may then be the perfect game in our world of quantitative analysis and big data, one in which exceptionally talented individuals matter less than well-managed, data-driven, carefully crafted strategic analysis. Which would maybe explain why the Oakland A's are presently the best team in baseball.

When the People are the Fourth Estate

A protester uses a mobile phone as he passes next to a burning vehicle during a protest at Taksim Square in Istanbul.

In an article for the most recent issue of Nieman Reports, Engin Önder, the founder of the Turkish citizen journalism aggregator 140journos, describes the founding of the project and its growth following last year's protests in Istanbul. Although he and his partners have increasingly relied on citizen-editors as informants, he finds that oversight remains important, and that he can use Twitter apps to help: "Turkey has about 12 million active Twitter users, roughly a third of the online population. We have more than 300 volunteer content producers all across the country, including a survivor of the Uludere attack. As the number of Turkish citizens feeding information to 140journos grew, we shifted gears. Instead of doing all the reporting ourselves, we focused on collecting, categorizing, validating and Storifying the news content sent to us. To verify news reports, we use free tools like Yandex Panorama (Russia's version of Google StreetView) and TinEye, a search service to help determine if images are new or pulled from websites. To monitor the flow of news tips, 140journos uses TweetDeck. We keep lists of 140journos contributors who tweet news from more than 50 cities, universities and other political hotspots in Turkey. We also keep lists organized by individual events, such as protests against executions in Egypt, and lists organized by factions, such as ultra-nationalists and conservatives." Although he doesn't quite come out and say it, this kind of work is important in any place where freedom of the press is limited, or where the press is merely focused on other things.


The Literature of the New Wealth

Pivoting off of Thomas Piketty's attention to classic literary fiction in Capital in the Twenty-First Century, Stephen Marche points out that we have already seen the literary proof of the second gilded age, and that it is by and large Franzenite: "Future economic historians won't have to look very far to find fictional descriptions of our current financial realities. The social realist novel of the moment can be identified by the preeminent, almost exclusive, emphasis it places on social expressions of the changing economic reality. Currently, the large-scale realism of Jonathan Franzen, articulated in his famous article for Harper's in 1996 and achieved most fully in The Corrections and Freedom, stands utterly triumphant. The narrative forms that thrived in the mid-nineties - minimalism, with its descriptions of poor and rural men; magical realism which incorporated non-Western elements into the traditional English novel; the exotic lyricism of John Berger or Michael Ondaatje - have been pushed to the side."

Digital Likenesses

Reporting from the trial set to determine whether or not the NCAA can continue to exclusively profit from the likenesses of its players, Charlie Pierce frames the debate in the language of personhood, and whether or not a digital representation of a person is the same thing as the person himself: "As near as I can tell, the video games in question were created by taking game films from various NCAA football and basketball games and then transferring them technologically until actual players found themselves with NCAA-licensed avatars that live forever. It was seeing his avatar that prompted Ed O'Bannon to launch his lawsuit in the first place and, having done so, he opened up a number of interesting questions about who he is, both in real life and in virtual reality. Is Ed O'Bannon's avatar really Ed O'Bannon, or is it an Ed O'Bannon made by someone else so that a lot of someone elses could make a whole lot of money? Isn't that a fundamental looting of one's fundamental identity? Doesn't the real Ed O'Bannon have a say in the use of his name, his image, and his likeness? After all, that's him in that game. The avatar runs the court like he did. It shoots the way he did. It passes the ball the way he did. There doesn't seem to be any moral basis for an argument that Ed O'Bannon doesn't have the right to control - let alone profit from - all the Ed O'Bannons that have been created out of the work that the real Ed O'Bannon did as an athlete. How can an actual person find himself an indentured servant in virtual reality?"

From the Hannah Arendt Center Blog

This week on the Blog, Faisal Baluch explores Arendt's distinction between politics and violence as a way to understand her support of a Jewish army in the Quote of the Week. American philosopher Eric Hoffer provides this week's Thought on Thinking. And Roger Berkowitz distinguishes Arendt's banality of evil from Shirley Jackson's "The Lottery" in the Weekend Read.


19 May 2014

Amor Mundi 5/18/14



Is Democracy Over?

Thomas Meaney and Yascha Mounk argue in a long essay in The Nation that the democratic moment is passing if it has not yet passed. Meaney and Mounk build their argument on a simple critical insight, a kind of "unmasking" of what might be called the hypocrisy of modern democracy. Democracy is supposed to be the will of the people. It is a long time since the small group of Athenian citizens governed themselves. Modern democrats have defended representative democracy as a pragmatic alternative because gathering all the citizens of modern states together for democratic debate is simply impossible. But technology has changed that. "As long as direct democracy was impracticable within the confines of the modern territorial state, the claim that representative institutions constituted the truest form of self-government was just about plausible. But now, in the early twenty-first century, the claim about direct democracy being impossible at the national level and beyond is no longer credible. As the constraints of time and space have eroded, the ubiquitous assumption that we live in a democracy seems very far from reality. The American people may not all fit into Madison Square Garden, but they can assemble on virtual platforms and legislate remotely, if that is what they want. Yet almost no one desires to be that actively political, or to replace representation with more direct political responsibility. Asked to inform themselves about the important political issues of the day, most citizens politely decline. If forced to hold an informed opinion on every law and regulation, many would gladly mount the barricades to defend their right not to rule themselves in such a burdensome manner. The challenge posed by information technology lies not in the possibility that we might adopt more direct forms of democracy but in the disquieting recognition that we no longer dream of ruling ourselves."
In short, democracy understood as self-government is now once again possible in the technical age. Such techno-democratic possibility is not, however, leading to more democracy. Thus, Meaney and Mounk conclude, technology allows us to see through the illusions of democracy as hypocritical and hollow. While it is true that people are not flocking to technical versions of mass democracies, they are taking to the streets, organizing protests, and involving themselves in the activities of citizenship. Meaney and Mounk are right: democracy is not assured, and we should never simply assume its continued vitality. But neither should we write it off entirely. Read more in the Weekend Read by Roger Berkowitz.

Who is Modi?

Narendra Modi is a corruption-fighting son of a tea merchant who has risen from one of India's lowest castes to be its new Prime Minister. He is also a member of an ultra-nationalist organization, is alleged to have enabled anti-Muslim pogroms, and until now has been banned from traveling to the United States. An unsigned editorial in the Wall Street Journal gushes: "Mr. Modi's record offers reason for optimism. As governor for 13 years of Gujarat state, he was the archetypal energetic executive, forcing through approvals of new projects and welcoming foreign investment. Gujarat now accounts for 25% of India's exports, and the poverty rate has plunged. As the son of a tea-seller, Mr. Modi also has a gut sense of the economic aspirations of ordinary Indians." In a longer essay in the same paper, Geeta Anand and Gordon Fairclough speak of India's "post-ideological moment": "Voters from different castes and regions, rural and urban areas, the middle class and those who want to be middle class-all turned out to vote for Mr. Modi. 'This is a big shift. It is the beginning of a post-ideological generation, not left-centered,' says Shekhar Gupta, editor in chief of the Indian Express newspaper. 'This is the rise of Indians more interested in themselves. They are aspirational, and they are united in their impatience.'" And yet, in the Guardian, Pankaj Mishra warns: "Back then, it would have been inconceivable that a figure such as Narendra Modi, the Hindu nationalist chief minister of Gujarat accused, along with his closest aides, of complicity in crimes ranging from an anti-Muslim pogrom in his state in 2002 to extrajudicial killings, and barred from entering the US, may occupy India's highest political office.
Modi is a lifelong member of the Rashtriya Swayamsevak Sangh (RSS), a paramilitary Hindu nationalist organization inspired by the fascist movements of Europe, whose founder's belief that Nazi Germany had manifested 'race pride at its highest' by purging the Jews is by no means unexceptional among the votaries of Hindutva, or 'Hinduness'. In 1948, a former member of the RSS murdered Gandhi for being too soft on Muslims. The outfit, traditionally dominated by upper-caste Hindus, has led many vicious assaults on minorities. A notorious executioner of dozens of Muslims in Gujarat in 2002 crowed that he had slashed open with his sword the womb of a heavily pregnant woman and extracted her foetus. Modi himself described the relief camps housing tens of thousands of displaced Muslims as 'child-breeding centres'. Such rhetoric has helped Modi sweep one election after another in Gujarat."

A Penny for Your Thoughts

Subscriptions to academic journals can run into the thousands of dollars. What is more, after a publication and review process that takes years, the articles are frequently barricaded behind paywalls for years more. Robert Darnton, despairing over the inaccessibility of academic journals and what that means for both research and the public good, notes that there is, in fact, some hope in any number of organizations looking to align the interests of authors and readers both: "the desire to reach readers may be one of the most underestimated forces in the world of knowledge. Aside from journal articles, academics produce a large number of books, yet they rarely make much money from them. Authors in general derive little income from a book a year or two after its publication. Once its commercial life has ended, it dies a slow death, lying unread, except for rare occasions, on the shelves of libraries, inaccessible to the vast majority of readers. At that stage, authors generally have one dominant desire-for their work to circulate freely through the public; and their interest coincides with the goals of the open-access movement." The new model of open-source academic publishing seeks to subsidize peer review by charging a fee for submission. Good idea.

Against Critical Thinking

Hardly any idea is more in vogue these days than 'critical thinking.' There is even a National Council for Excellence in Critical Thinking that defines critical thinking as the intellectually disciplined process of skillfully conceptualizing, applying, analyzing, and evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action. Isn't that what we are supposed to be teaching our children and our students? Not according to Michael S. Roth, President of Wesleyan University. In "The Stone" in the New York Times, Roth argues that students-and not only students-are too critical in their approach to texts and ideas. "Our best college students are very good at being critical. In fact being smart, for many, means being critical. Having strong critical skills shows that you will not be easily fooled. It is a sign of sophistication, especially when coupled with an acknowledgment of one's own 'privilege.' The combination of resistance to influence and deflection of responsibility by confessing to one's advantages is a sure sign of one's ability to negotiate the politics of learning on campus. But this ability will not take you very far beyond the university. Taking things apart, or taking people down, can provide the satisfactions of cynicism. But this is thin gruel." Critical thinking is important. First, however, learning requires submission to the text, the facts, or the thinker. Too often, students and even professors skip the hard work of learning and proceed directly to criticism. As I am constantly telling my students, first try to understand Nietzsche before you decide if he is right or wrong.

The Death Penalty in Context

In an essay on racial bias in the death penalty, Ta-Nehisi Coates writes: "When [Ramesh] Ponnuru suggests that the way to correct for the death penalty's disproportionate use is to execute more white people, he is presenting a world in which the death penalty has neither history nor context. One merely flips the 'Hey Guys, Let's Not Be Racist' switch and then the magic happens. Those of us who cite the disproportionate application of the death penalty as a reason for outlawing it do so because we believe that a criminal-justice system is not an abstraction but a real thing, existing in a real context, with a real history. In America, the history of criminal justice - and the death penalty - is utterly inseparable from white supremacy. During the Civil War, black soldiers were significantly more likely to be court-martialed and executed than their white counterparts. This practice continued into World War II. 'African-Americans comprised 10 percent of the armed forces but accounted for almost 80 percent of the soldiers executed during the war,' writes law professor Elizabeth Lutes Hillman."

The Rainbow Pope

Omar Encarnación argues in Foreign Affairs that we should pay attention to Pope Francis not only because of his much-remarked attention to economic inequality. "More surprising than Francis' endorsement of economic populism and even liberation theology are his views on social issues, homosexuality in particular, which suggest an even deeper Latin American influence on Francis' papacy. On a flight back from Brazil last July, he told reporters: 'If someone is gay and seeks the Lord with good will, who am I to judge?' Then, in an interview in September, he called on Catholics to 'get over their obsession with abortion, contraceptives, and homosexuality.' Most recently, in an interview in March, Francis insinuated that he supported same-sex civil unions and that the church would tolerate them -- for economic reasons. 'Matrimony is between a man and a woman,' he said. But moves to 'regulate diverse situations of cohabitation [are] driven by the need to regulate economic aspects among persons, as for instance to assure medical care.'"

What They Show

Dahlia Schweitzer praises the work of photographer Cindy Sherman for daring to reveal what's beneath: "After all, Sherman's photographs are an encyclopedia of body language, identities performed with carefully arranged figures. The body is a collection of limbs used to convey roles, personalities, and situations. Each gesture, each object, is loaded with meaning. Her photographs are never casual snapshots or self-portraits. Rather, they are explorations of arrangement and archetype. She questions stereotype and learned behavior through her compositions and subjects, and through the diorama-like environments she creates for each scenario. She exposes the ruptures under the surface by taking everyday life and shifting it off-kilter, examining society's expectations for appearance and behavior. Her photographs work for the attention they bring to that which does not fit, to the exact point of the tear."

Heidegger, Arendt, and the Political

Babette Babich speaks with Roger Berkowitz and Tracy Strong in a long conversation touching upon Hannah Arendt, the Margarethe von Trotta film, managerial governance, totalitarianism, the Eichmann case, Stanley Milgram, evil, democracy, Martin Heidegger, and politics in the 21st century.

From the Hannah Arendt Center Blog

This week on the Blog, Jennifer M. Hudson in the Quote of the Week compares Thomas Piketty's approach to populism and technocratic rule with Arendt's. And in the Weekend Read, Roger Berkowitz argues that claims portending the end of democracy are overstated.

17 May 2014

Is Democracy Over?


Thomas Meaney and Yascha Mounk argue in a long essay in The Nation that the democratic moment is passing, if it has not already passed. The sweep of their essay is broad. Alexis de Tocqueville saw American democracy replacing the age of European aristocracy. He worried that democratic equality would be unable to preserve the freedoms associated with aristocratic individualism, but he knew that the move from aristocracy to democracy was unstoppable. So today, Meaney and Mounk write, we are witnessing the end of the age of democracy and equality. This is so, they suggest, even if we do not yet know what will replace it.


Meaney and Mounk build their argument on a simple critical insight, a kind of “unmasking” of what might be called the hypocrisy of modern democracy. Democracy is supposed to be the will of the people. It is a long time since the small group of Athenian citizens governed themselves. Modern democrats have defended representative democracy as a pragmatic alternative because gathering all the citizens of modern states together for democratic debate is simply impossible. But technology has changed that.

As long as direct democracy was impracticable within the confines of the modern territorial state, the claim that representative institutions constituted the truest form of self-government was just about plausible. But now, in the early twenty-first century, the claim about direct democracy being impossible at the national level and beyond is no longer credible. As the constraints of time and space have eroded, the ubiquitous assumption that we live in a democracy seems very far from reality. The American people may not all fit into Madison Square Garden, but they can assemble on virtual platforms and legislate remotely, if that is what they want. Yet almost no one desires to be that actively political, or to replace representation with more direct political responsibility. Asked to inform themselves about the important political issues of the day, most citizens politely decline. If forced to hold an informed opinion on every law and regulation, many would gladly mount the barricades to defend their right not to rule themselves in such a burdensome manner. The challenge posed by information technology lies not in the possibility that we might adopt more direct forms of democracy but in the disquieting recognition that we no longer dream of ruling ourselves.

In short, democracy understood as self-government is now once again possible in the technical age. Such techno-democratic possibility is not, however, leading to more democracy. Thus, Meaney and Mounk conclude, technology allows us to see through the illusions of democracy as hypocritical and hollow.

The very word “democracy” indicts the political reality of most modern states. It takes a considerable degree of delusion to believe that any modern government has been “by” the people in anything but the most incidental way. In the digital age, the claim that the political participation of the people in decision-making makes democracy a legitimate form of government is only that much hollower. Its sole lingering claim to legitimacy—that it allows the people the regular chance to remove leaders who displease them—is distinctly less inspiring. Democracy was once a comforting fiction. Has it become an uninhabitable one?

Such arguments by “unmasking” are attractive and popular today. They work, as Peter Baehr argued recently in a talk at the Arendt Center, through the logic of exposure, by accusing “a person, argument or way of life of being fundamentally defective.” It may be that there are populist democratic revolts happening in Turkey and Thailand, revolts that are unsettling to elites. Similarly, the democratic energies of the Tea Party and Occupy Wall Street are seen by many as evidence of the crisis of democracy. Democracy, it is said, is defective, based on a deception and buttressed by illusion. But it hardly does a service to truth to see democratic ferment as proof of the end of democracy.

Meaney and Mounk argue that three main factors have brought democracy to the brink of crisis. First, the entanglement of democracies in a global financial system means that democratic leaders are more beholden to banks and financiers than to their citizens.

[W]ith world trade more pervasive, and with the domestic economies of even the most affluent nations deeply dependent on foreign investments, the ideological predilections of a few governments have become the preoccupation of all. There is a reason why all mainstream politicians now make decisions based on variables such as the risk of capital flight and the reactions of bond rating agencies, rather than on traditional calculations about the will of their electorates. As the German economist Wolfgang Streeck has argued, this shift in political calculus occurred because the most significant constituency of democracies is no longer voters but the creditors of public debt.


Second, democracies have come to be associated not just with self-government, but with good government leading to peace and plenty. But this is a fallacy. There is no reason that democracies will be better governed than autocracies or that economic growth in democracies will outperform that of autocracies. This creates an “expectations gap” in which people demand of democracies a level of success they cannot deliver.

Third, democracy has largely been sold around the world as "synonymous with modernization, economic uplift and individual self-realization." Democratic politicians, often an elite, wrapped their power in largesse and growth that papered over important religious and moral differences. Today populism in Thailand, Egypt, and Turkey clashes with the clientelism of democratic rulers and threatens the quasi-democratic alliance of the elites and the masses.

Meaney and Mounk are no doubt correct in perceiving challenges to democracy today. And they are right that democratic citizens consistently prefer technocratic competence over democratic dissent and debate. As they write,

…we live in highly bureaucratic states that require ever-increasing degrees of technical competence. We expect our governments to do more and to do it better. The more our expectations are addressed, the more bureaucratic and opaque government becomes and the less democratic control is possible.

The danger of representative democracy is that it imagines government as something we outsource to a professional class so that we can get on with what is most important in our lives. There is a decided similarity between representative democracy and technocracy, in that both presume that political administration is a necessary but uninspiring activity to be avoided and relegated to a class of bureaucrats and technocrats. The threat of representative democracy is that it is founded upon and regenerates an anti-political and apolitical culture, one that imagines politics as menial work to be done by others.

What Meaney and Mounk overlook, however, is that at least in the United States, we have never simply been a representative democracy. The United States is a complicated political system that cannot justly or rightly be called either a democracy or a representative democracy. Rightly understood, the USA is a federal, democratic, constitutional republic. Its democratic elements are both limited and augmented by its constitutional and federalist character as well as by its republican tradition. At least until recently, it combined a strong national government with equally strong traditions of state and local power. If citizens could not be involved in national politics, they could be and often were highly involved in local governance. And local institutions, empowered by the participation of energized citizens, were frequently as powerful as, or more powerful than, national institutions.

Of course, the late 20th and early 21st centuries have witnessed a tectonic constitutional shift in America away from local institutions and toward a highly powerful, centralized, and bureaucratized national government. But this shift is neither inevitable nor irreversible. Indeed, largely driven by the right, the new federalism has returned to states some traditional powers. These powers can be used, however, by the left and the right. As Ben Barber has been arguing from the left, there is an opportunity in the dysfunctional national government to return power and vitality to our cities and our towns. Both Occupy Wall Street and the Tea Party show that there are large numbers of people who are dissatisfied with our political centralization and feel disenfranchised and distant from the ideals of democratic self-government. The Tea Party, more than Occupy, has channeled that disenchantment into local political organizations and institutions. But the opportunity to do so is present on the left as well as on the right.


There is a deeply religious element to American democracy that is bound up with the idea and reality of American exceptionalism, a reservoir of democratic potency that is not yet tapped out. Meaney and Mounk see this, albeit in a throwaway line that is buried in their essay:

Outside of a few outliers such as India and the United States, where deep in the provinces one still encounters something like religious zeal for democracy, many people in nominal democracies around the world do not believe they are inheritors of a sacral dispensation. Nor should they.

We are witnessing a crisis of democracy around the world, in the sense that both established and newer democracies are finding their populations dissatisfied. While it is true that people are not flocking to technical versions of mass democracies, they are taking to the streets, organizing protests, and involving themselves in the activities of citizenship. Meaney and Mounk are right: democracy is not assured, and we should never simply assume its continued vitality. But neither should we write it off entirely. Their essay should be read less as an obituary than as a provocation. But it should be read. It is your Weekend Read.

14Apr/142

Hiatus, Discontinuity, and Change

Arendtquote

"The end of the old is not necessarily the beginning of the new."

Hannah Arendt, The Life of the Mind

This is a simple enough statement, and yet it masks a profound truth, one that we often overlook out of the very human tendency to seek consistency and connection, to make order out of the chaos of reality, and to ignore the anomalous nature of that which lies in between whatever phenomena we are attending to.

Perhaps the clearest example of this has been what proved to be the unfounded optimism that greeted the overthrow of autocratic regimes through American intervention in Afghanistan and Iraq, and the native-born movements known collectively as the Arab Spring. It is one thing to disrupt the status quo, to overthrow an unpopular and undemocratic regime. But that end does not necessarily lead to the establishment of a new, beneficent and participatory political structure. We see this time and time again, now in Putin's Russia, a century ago with the Russian Revolution, and over two centuries ago with the French Revolution.

Of course, it has long been understood that oftentimes, to begin something new, we first have to put an end to something old. The popular saying that you can't make an omelet without breaking a few eggs reflects this understanding, although it is certainly not the case that breaking eggs will inevitably and automatically lead to the creation of an omelet. Breaking eggs is a necessary but not sufficient cause of omelets, and while this is not an example of the classic chicken and egg problem, I think we can imagine that the chicken might have something to say on the matter of breaking eggs. Certainly, the chicken would have a different view on what is signified or ought to be signified by the end of the old, meaning the end of the egg shell, insofar as you can't make a chicken without it first breaking out of the egg that it took form within.

eggs

So, whether you take the chicken's point of view, or adopt the perspective of the omelet, looking backwards, reverse engineering the current situation, it is only natural to view the beginning of the new as an effect brought into being by the end of the old, to assume or make an inference based on sequencing in time, to posit a causal relationship and commit the logical fallacy of post hoc ergo propter hoc, if for no other reason than the force of narrative logic that compels us to create a coherent storyline. In this respect, Arendt points to the foundation tales of ancient Israel and Rome:

We have the Biblical story of the exodus of Israeli tribes from Egypt, which preceded the Mosaic legislation constituting the Hebrew people, and Virgil's story of the wanderings of Aeneas, which led to the foundation of Rome—"dum conderet urbem," as Virgil defines the content of his great poem even in its first lines. Both legends begin with an act of liberation, the flight from oppression and slavery in Egypt and the flight from burning Troy (that is, from annihilation); and in both instances this act is told from the perspective of a new freedom, the conquest of a new "promised land" that offers more than Egypt's fleshpots and the foundation of a new City that is prepared for by a war destined to undo the Trojan war, so that the order of events as laid down by Homer could be reversed.

Fast forward to the American Revolution, and we find that the founders of the republic, mindful of the uniqueness of their undertaking, searched for archetypes in the ancient world. And what they found in the narratives of Exodus and the Aeneid was that the act of liberation and the establishment of a new freedom are two events, not one, and in effect subject to Alfred Korzybski's non-Aristotelian Principle of Non-Identity. The success of the formation of the American republic can be attributed to the awareness on their part of the chasm that exists between the closing of one era and the opening of a new age, of their separation in time and space:

No doubt if we read these legends as tales, there is a world of difference between the aimless desperate wanderings of the Israeli tribes in the desert after the Exodus and the marvelously colorful tales of the adventures of Aeneas and his fellow Trojans; but to the men of action of later generations who ransacked the archives of antiquity for paradigms to guide their own intentions, this was not decisive. What was decisive was that there was a hiatus between disaster and salvation, between liberation from the old order and the new freedom, embodied in a novus ordo saeclorum, a "new world order of the ages" with whose rise the world had structurally changed.

I find Arendt's use of the term hiatus interesting, given that in contemporary American culture it has largely been appropriated by the television industry to refer to a series that has been taken off the air for a period of time, but not cancelled. The typical phrase is on hiatus, meaning on a break or on vacation. But Arendt reminds us that such connotations only scratch the surface of the word's broader meanings. The Latin word hiatus refers to an opening or rupture, a physical break or missing part or link in a concrete material object. As such, it becomes a spatial metaphor when applied to an interruption or break in time, a usage introduced in the 17th century. Interestingly, this coincides with the period in English history known as the Interregnum, which began in 1649 with the execution of King Charles I, led to Oliver Cromwell's installation as Lord Protector, and ended after Cromwell's death with the Restoration of the monarchy under Charles II, son of Charles I. While in some ways anticipating the American Revolution, the English Civil War followed an older pattern, one that Mircea Eliade referred to as the myth of eternal return, a circular movement rather than the linear progression of history and cause-effect relations.

The idea of moving forward, of progress, requires a future-orientation that only comes into being in the modern age, by which I mean the era that followed the printing revolution associated with Johannes Gutenberg (I discuss this in my book, On the Binding Biases of Time and Other Essays on General Semantics and Media Ecology). But that same print culture also gave rise to modern science, and with it the monopoly granted to efficient causality, cause-effect relations, to the exclusion in particular of final and formal cause (see Marshall and Eric McLuhan's Media and Formal Cause). This is the basis of the Newtonian universe in which every action has an equal and opposite reaction, and every effect can be linked back in a causal chain to another event that preceded it and brought it into being. The view of time as continuous and connected can be traced back to the introduction of the mechanical clock in the 13th century, but was solidified through the printing of calendars and time lines, and the same effect was created in spatial terms by the reproduction of maps, and the use of spatial grids, e.g., the Mercator projection.

And while the invention of history, as a written narrative concerning linear progression over time, can be traced back to the ancient Israelites and the story of the exodus, the story incorporates the idea of a hiatus in overlapping structures:

A1.  Joseph is the golden boy, the son favored by his father Jacob, earning him the enmity of his brothers

A2.  he is sold into slavery by them, winds up in Egypt as a slave and then is falsely accused and imprisoned

A3.  by virtue of his ability to interpret dreams he gains his freedom and rises to the position of Pharaoh's prime minister

 

B1.  Joseph welcomes his brothers and father, and the House of Israel goes down to Egypt to sojourn due to famine in the land of Canaan

B2.  their descendants are enslaved, oppressed, and persecuted

B3.  Moses is chosen to confront Pharaoh, liberate the Israelites, and lead them on their journey through the desert

 

C1.  the Israelites are freed from bondage and escape from Egypt

C2.  the revelation at Sinai fully establishes their covenant with God

C3.  after many trials, they return to the Promised Land

It can be clearly seen in these narrative structures that the role of the hiatus, in ritual terms, is that of the rite of passage, the initiation period that marks, in symbolic fashion, the change in status, the transformation from one social role or state of being to another (e.g., child to adult, outsider to member of the group). This is not to discount the role that actual trials, tests, and other hardships may play in the transition, as they serve to establish or reinforce, psychologically and sometimes physically, the value and reality of the transformation.

In mythic terms, this structure has become known as the hero's journey or hero's adventure, made famous by Joseph Campbell in The Hero with a Thousand Faces, and also known as the monomyth, because he claimed that the same basic structure is universal to all cultures. The basic structure he identified consists of three main elements: separation (e.g., the hero leaves home), initiation (e.g., the hero enters another realm, experiences tests and trials, leading to the bestowing of gifts, abilities, and/or a new status), and return (the hero returns to utilize what he has gained from the initiation and save the day, restoring the status quo or establishing a new status quo).

Understanding the mythic, non-rational element of initiation is the key to recognizing the role of the hiatus, and in the modern era this meant using rationality to realize the limits of rationality. With this in mind, let me return to the quote I began this essay with, but now provide the larger context of the entire paragraph:

The legendary hiatus between a no-more and a not-yet clearly indicated that freedom would not be the automatic result of liberation, that the end of the old is not necessarily the beginning of the new, that the notion of an all-powerful time continuum is an illusion. Tales of a transitory period—from bondage to freedom, from disaster to salvation—were all the more appealing because the legends chiefly concerned the deeds of great leaders, persons of world-historic significance who appeared on the stage of history precisely during such gaps of historical time. All those who, pressed by exterior circumstances or motivated by radical utopian thought-trains, were not satisfied to change the world by the gradual reform of an old order (and this rejection of the gradual was precisely what transformed the men of action of the eighteenth century, the first century of a fully secularized intellectual elite, into the men of the revolutions) were almost logically forced to accept the possibility of a hiatus in the continuous flow of temporal sequence.

Note the concept of gaps in historical time, which brings to mind Eliade's distinction between the sacred and the profane. Historical time is a form of profane time, and sacred time represents a gap or break in that linear progression, one that takes us outside of history, connecting us instead in an eternal return to the time associated with a moment of creation or foundation. The revelation at Sinai is an example of such a time, and accordingly Deuteronomy states that all of the members of the House of Israel were present at that event, not just those alive at the time but also the generations of the future. This statement is included in the liturgy of the Passover Seder, which is a ritual reenactment of the exodus and revelation, which in turn becomes part of the reenactment of the Passion in Christianity, one of the primary examples of Campbell's monomyth.

Arendt's hiatus, then, represents a rupture between two different states or stages, an interruption, a disruption linked to an eruption. In the parlance of chaos and complexity theory, it is a bifurcation point. Arendt's contemporary, Peter Drucker, a philosopher who pioneered the scholarly study of business and management, characterized the contemporary zeitgeist in the title of his 1969 book: The Age of Discontinuity. It is an age in which Newtonian physics was replaced by Einstein's relativity and Heisenberg's uncertainty, the phrase quantum leap becoming a metaphor drawn from subatomic physics for all forms of discontinuity. It is an age in which the fixed point of view that yielded perspective in art and the essay and novel in literature yielded to Cubism and subsequent forms of modern art, and stream of consciousness in writing.

cubism

Beginning in the 19th century, photography gave us the frozen, discontinuous moment, and the technique of montage in the motion picture gave us a series of shots and scenes whose connections have to be filled in by the audience. Telegraphy gave us the instantaneous transmission of messages that took them out of their natural context, the subject of the famous comment by Henry David Thoreau that connecting Maine and Texas to one another will not guarantee that they have anything sensible to share with each other. The wire services gave us the nonlinear, inverted pyramid style of newspaper reporting, which also was associated with the nonlinear look of the newspaper front page, a form that Marshall McLuhan referred to as a mosaic. Neil Postman criticized television's role in decontextualizing public discourse in Amusing Ourselves to Death, where he used the phrase, "in the context of no context," and I discuss this as well in my recently published follow-up to his work, Amazing Ourselves to Death.

The concept of the hiatus comes naturally to the premodern mind, schooled by myth and ritual within the context of oral culture. That same concept is repressed, in turn, by the modern mind, shaped by the linearity and rationality of literacy and typography. As the modern mind yields to a new, postmodern alternative, one that emerges out of the electronic media environment, we see the return of the repressed in the idea of the jump cut writ large.

There is psychological satisfaction in the deterministic view of history as the inevitable result of cause-effect relations in the Newtonian sense, as this provides a sense of closure and coherence consistent with the typographic mindset. And there is similar satisfaction in the view of history as entirely consisting of human decisions that are the product of free will, of human agency unfettered by outside constraints, which is also consistent with the individualism that emerges out of the literate mindset and print culture, and with a social rather than physical version of efficient causality. What we are only beginning to come to terms with is the understanding of formal causality, as discussed by Marshall and Eric McLuhan in Media and Formal Cause. What formal causality suggests is that history has a tendency to follow certain patterns, patterns that connect one state or stage to another, patterns that repeat again and again over time. This is the notion that history repeats itself, meaning that historical events tend to fall into certain patterns (repetition being the precondition for the existence of patterns), and that the goal, as McLuhan articulated in Understanding Media, is pattern recognition. This helps to clarify the famous remark by George Santayana, "those who cannot remember the past are condemned to repeat it." In other words, those who are blind to patterns will find it difficult to break out of them.

Campbell engages in pattern recognition in his identification of the heroic monomyth, as Arendt does in her discussion of the historical hiatus. Recognizing the patterns is the first step in escaping them, and may even allow for the possibility of taking control and influencing them. This also means understanding that the tendency for phenomena to fall into patterns is a powerful one. It is a force akin to entropy, and perhaps a result of that very statistical tendency that is expressed by the Second Law of Thermodynamics, as Terrence Deacon argues in Incomplete Nature. It follows that there are only certain points in history, certain moments, certain bifurcation points, when it is possible to make a difference, or to make a difference that makes a difference, to use Gregory Bateson's formulation, and change the course of history. The moment of transition, of initiation, the hiatus, represents such a moment.

McLuhan's concept of medium goes far beyond the ordinary sense of the word, as he relates it to the idea of gaps and intervals, the ground that surrounds the figure, and explains that his philosophy of media is not about transportation (of information), but transformation. The medium is the hiatus.

The particular pattern that has come to the fore in our time is that of the network, whether it's the decentralized computer network and the internet as the network of networks, or the highly centralized and hierarchical broadcast network, or the interpersonal network associated with Stanley Milgram's research (popularly known as six degrees of separation), or the neural networks that define brain structure and function, or social networking sites such as Facebook and Twitter, etc. And it is not the nodes, which may be considered the content of the network, that defines the network, but the links that connect them, which function as the network medium, and which, in the systems view favored by Bateson, provide the structure for the network system, the interaction or relationship between the nodes. What matters is not the nodes, it's the modes.

Hiatus and link may seem like polar opposites, the break and the bridge, but they are two sides of the same coin, the medium that goes between, simultaneously separating and connecting. The boundary divides the system from its environment, allowing the system to maintain its identity as separate and distinct from the environment, keeping it from being absorbed by the environment. But the membrane also serves as a filter, engaged in the process of abstracting, to use Korzybski's favored term, letting through or bringing material, energy, and information from the environment into the system so that the system can maintain itself and survive. The boundary keeps the system in touch with its situation, keeps it contextualized within its environment.

The systems view emphasizes space over time, as does ecology, but the concept of the hiatus as a temporal interruption suggests an association with evolution as well. Darwin's view of evolution as continuous was consistent with Newtonian physics. The more recent modification of evolutionary theory put forth by Stephen Jay Gould, known as punctuated equilibrium, suggests that evolution occurs in fits and starts, in relatively rare and isolated periods of major change, surrounded by long periods of relative stability and stasis. Not surprisingly, this particular conception of discontinuity was introduced during the television era, in the early 1970s, just a few years after the publication of Peter Drucker's The Age of Discontinuity.

When you consider the extraordinary changes that we are experiencing in our time, technologically and ecologically, the latter underlined by the recent news concerning the United Nations' latest report on global warming, what we need is an understanding of the concept of change, a way to study the patterns of change, patterns that exist and persist across different levels, the micro and the macro, the physical, chemical, biological, psychological, and social, what Bateson referred to as metapatterns, the subject of further elaboration by biologist Tyler Volk in his book on the subject. Paul Watzlawick argued for the need to study change in and of itself in a little book co-authored with John H. Weakland and Richard Fisch, entitled Change: Principles of Problem Formation and Problem Resolution, which considers the problem from the point of view of psychotherapy. Arendt gives us a philosophical entrée into the problem by introducing the pattern of the hiatus, the moment of discontinuity that leads to change, and possibly a moment in which we, as human agents, can have an influence on the direction of that change.

To have such an influence, we do need to have that break, to find a space and more importantly a time to pause and reflect, to evaluate and formulate. Arendt famously emphasizes the importance of thinking in and of itself, the importance not of the content of thought alone, but of the act of thinking, the medium of thinking, which requires an opening, a time out, a respite from the onslaught of 24/7/365. This underscores the value of sacred time, and it follows that it is no accident that during that period of initiation in the story of the exodus, there is the revelation at Sinai and the gift of divine law, the Torah, chief among its laws the Ten Commandments, which include the fourth commandment, the one presented in greatest detail: to observe the Sabbath day. This premodern ritual requires us to make the hiatus a regular part of our lives, to break the continuity of profane time on a weekly basis. From that foundation, other commandments establish the idea of the sabbatical year, and the sabbatical of sabbaticals, or jubilee year. Whether it's a Sabbath mandated by religious observance, or a new movement to engage in a Technology Sabbath, the hiatus functions as the response to the homogenization of time that was associated with efficient causality and literate linearity, and that continues to intensify in conjunction with the technological imperative of efficiency über alles.

hiatus

To return one last time to the quote that I began with, the end of the old is not necessarily the beginning of the new because there may not be a new beginning at all; there may not be anything new to take the place of the old. The end of the old may be just that, the end, period, the end of it all. The presence of a hiatus to follow the end of the old serves as a promise that something new will begin to take its place after the hiatus is over. And the presence of a hiatus in our lives, individually and collectively, may also serve as a promise that we will not inevitably rush towards an end of the old that will also be an end of it all, that we will be able to find the opening to begin something new, that we will be able to make the transition to something better, that both survival and progress are possible, through an understanding of the processes of continuity and change.

-Lance Strate

31Mar/140

Amor Mundi 3/30/14

Arendtamormundi

Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.

Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.

Jonathan Schell

schell

Jonathan Schell has died. I first read "The Fate of the Earth" as a college freshman in Introduction to Political Theory, and it was and is one of those books that forever impacts the young mind. Jim Sleeper, writing in the Yale Daily News, gets to the heart of Schell's power: “From his work as a correspondent for The New Yorker in the Vietnam War through his rigorous manifesto for nuclear disarmament in "The Fate of the Earth", his magisterial re-thinking of state power and people’s power in “The Unconquerable World: Power, Nonviolence, and the Will of the People,” and his wry, rigorous assessments of politics for The Nation, Jonathan showed how varied peoples’ democratic aspirations might lead them to address shared global challenges.” The obituary in the New York Times adds: “With “The Fate of the Earth” Mr. Schell was widely credited with helping rally ordinary citizens around the world to the cause of nuclear disarmament. The book, based on his extensive interviews with members of the scientific community, outlines the likely aftermath of a nuclear war and deconstructs the United States’ long-held rationale for nuclear buildup as a deterrent. “Usually, people wait for things to occur before trying to describe them,” Mr. Schell wrote in the book’s opening section. “But since we cannot afford under any circumstances to let a holocaust occur, we are forced in this one case to become the historians of the future — to chronicle and commit to memory an event that we have never experienced and must never experience.””

Standing on Someone Else's Soil

suare

In an interview, Simon Schama, author of the forthcoming book and public television miniseries "The Story of the Jews," uses early Jewish settlement in America as a way into why he thinks that Jews have often been cast as outsiders: "You know, Jews come to Newport, they come to New Amsterdam, where they run into Dutch anti-Semites immediately. One of them, at least — Peter Stuyvesant, the governor. But they also come to Newport in the middle of the 17th century. And Newport is significant in Rhode Island because Providence colony is founded by Roger Williams. And Roger Williams is a kind of fierce Christian of the kind of radical — in 17th-century terms — left. But his view is that there is no church that is not corrupt and imperfect. Therefore, no good Christian is ever entitled to form a government [or] entitled to bar anybody else’s worship. That includes American Indians, and it certainly includes the Jews. And there’s an incredible spark of fire of toleration that begins in New England. And Roger Williams is himself a refugee from persecution, from Puritan Massachusetts. But the crucial big point to make is that Jews have had a hard time when nations and nation-states have founded themselves on myths about soil, blood and tribe."

Don't Get Older: or Don't Show It

tech

Noam Scheiber describes the “wakeful nightmare for the lower-middle-aged” that has taken over the world of technology. The desire for the new, new thing has led to disdain for age; famed V.C. Vinod Khosla told a conference that “people over forty-five basically die in terms of new ideas.” The value of experience and the wisdom of age, or even of middle age, are scorned when everyone walks around with encyclopedias and instruction manuals in our pockets. The result: “Silicon Valley has become one of the most ageist places in America. Tech luminaries who otherwise pride themselves on their dedication to meritocracy don’t think twice about deriding the not-actually-old. “Young people are just smarter,” Facebook CEO Mark Zuckerberg told an audience at Stanford back in 2007. As I write, the website of ServiceNow, a large Santa Clara–based I.T. services company, features the following advisory in large letters atop its “careers” page: “We Want People Who Have Their Best Work Ahead of Them, Not Behind Them.””

You and I Will Die Unbelievers, Tied to the Tracks of the Train

art

Kenan Malik wonders how non-believers can appreciate sacred art. Perhaps, he says, the godless can understand it as "an exploration of what it means to be human; what it is to be human not in the here and now, not in our immediacy, nor merely in our physicality, but in a more transcendental sense. It is a sense that is often difficult to capture in a purely propositional form, but one that we seek to grasp through art or music or poetry. Transcendence does not, however, necessarily have to be understood in a religious fashion, solely in relation to some concept of the divine. It is rather a recognition that our humanness is invested not simply in our existence as individuals or as physical beings but also in our collective existence as social beings and in our ability, as social beings, to rise above our individual physical selves and to see ourselves as part of a larger project, to project onto the world, and onto human life, a meaning or purpose that exists only because we as human beings create it."

The Singularity is the News

algo

The Nieman Journalism Lab has the straight scoop about the algorithm, written by Ken Schwenke, that wrote the first story about last week's West Coast earthquake. Although computer programs like Schwenke's may be able to take over journalism's function as a source of initial news (that is, a notice that something is happening), it seems unlikely that they will be able to take over one of its more sophisticated functions, which is to help people situate themselves in the world rather than merely know what's going on in it.

Laughing at the Past

comic

In an interview, Kate Beaton, the cartoonist responsible for the history and literature web comic Hark! A Vagrant, talks about how her comics, perhaps best described as academic parody, can be useful for teachers and students: "Oh yes, all the time! That’s the best! It’s so flattering—but I get it, the comics are a good icebreaker. If you are laughing at something, you already like it, and want to know more. If they’re laughing, they’re learning, who doesn’t want to be in on the joke? You can’t take my comics at face value, but you can ask, ‘What’s going on here? What’s this all about?’ Then your teacher gets down to brass tacks."

From the Hannah Arendt Center Blog

This week on the blog, our Quote of the Week comes from Arendt Center Research Associate, Thomas Wild, who looks at the close friendship between Hannah Arendt and Alfred Kazin who bonded over literature, writers, and the power of the written word.

28Feb/141

Privacy and Politics

ArendtWeekendReading

In the most recent NY Review of Books, David Cole wonders if we've reached the point of no return on the issue of privacy:

“Reviewing seven years of the NSA amassing comprehensive records on every American’s every phone call, the board identified only one case in which the program actually identified an unknown terrorist suspect. And that case involved not an act or even an attempted act of terrorism, but merely a young man who was trying to send money to Al-Shabaab, an organization in Somalia. If that’s all the NSA can show for a program that requires all of us to turn over to the government the records of our every phone call, is it really worth it?”

Cole is beyond convincing in listing the dangers to privacy in the new national security state. Like many others in the media, he speaks the language of necessary trade-offs involved in living in a dangerous world, but suggests we are trading away too much and getting back too little in return. He warns that if we are not careful, privacy will disappear. He is right.

gorey

Edward Gorey Charitable Trust

What is often forgotten and is absent in Cole’s narrative is that most people—at least in practice—simply don’t care that much about privacy. Whether snoopers promise security or better-targeted advertisements, we are willing to open up our inner worlds for the price of convenience. If we are to save privacy, the first step is articulating what it is about privacy that makes it worth saving.

Cole simply assumes the value of privacy and doesn’t address the benefits of privacy until his final paragraph. When he does come to explaining why privacy is important, he invokes popular culture dystopias to suggest the horror of a world without privacy:

More broadly, all three branches of government—and the American public—need to take up the challenge of how to preserve privacy in the information age. George Orwell’s 1984, Ray Bradbury’s Fahrenheit 451, and Philip K. Dick’s The Minority Report all vividly portrayed worlds without privacy. They are not worlds in which any of us would want to live. The threat is no longer a matter of science fiction. It’s here. And as both reports eloquently attest, unless we adapt our laws to address the ever-advancing technology that increasingly consumes us, it will consume our privacy, too.

There are two problems with such fear-mongering in defense of privacy. The first is that these dystopias seem too distant: most of us simply don’t experience the violations of our privacy by the government or by Facebook as intrusions. The second is that, on a daily basis, it is pretty convenient that my phone knows where I am and that in a pinch the government could locate me. These dystopian visions can appear not so dystopian.

Most writing about privacy simply assumes that privacy is important. We are treated to myriad descriptions of the ways privacy is violated. The intent is to shock us. But rarely are people shocked enough to actually respond in ways that protect the privacy they say they cherish. We have collectively come to see privacy as a romantic notion, a long-forgotten idyll, exotic and even titillating in its possibilities, but ultimately irrelevant in our lives.

There is, of course, a reason why so many advocates of privacy don’t articulate a meaningful defense of privacy: It is because to defend privacy means to defend a rich and varied sphere of difference and plurality, the right and importance of people actually holding opinions divergent from one’s own. In an age of political correctness and ideological conformism, privacy sounds good in principle but is less welcome in practice when those we disagree with assert privacy rights.  Thus many who defend privacy do so only in the abstract.

keyhole

When it comes to actually allowing individuals to raise their children according to their religious or racial beliefs or when the question is whether people can marry whomever they want, defenders of privacy often turn tail and insist that some opinions and some practices must be prohibited. Over and over today, advocates of privacy show that they value an orderly, safe, and respectful public realm and that they are willing to abandon privacy in the name of security and a broad conception of civility according to which no one should have to encounter opinions and acts that give them offense.

The only major thinker of the last 100 years who insisted fully and consistently on the crucial importance of a rich and vibrant private realm is Hannah Arendt. Privacy, Arendt argues, is essential because it is what allows individuals to emerge as unique persons in the world. The private realm is the realm of “exclusiveness”; it is that realm in which we “choose those with whom we wish to spend our lives, personal friends and those we love.” The private choices we make are guided by nothing objective or knowable; what guides them “strikes, inexplicably and unerringly, at one person in his uniqueness, his unlikeness to all other people we know.” Privacy is controversial because the “rules of uniqueness and exclusiveness are, and always will be, in conflict with the standards of society.” Arendt’s defense of mixed marriages (and by extension gay marriages) proceeds—no less than her defense of the right of parents to educate their children in single-sex or segregated schools—from her conviction that the uniqueness and distinction of private lives need to be respected and protected.

Privacy, for Arendt, is connected to the “sanctity of the hearth” and thus to the idea of private property. Indeed, property itself is respected not on economic grounds, but because “without owning a house a man could not participate in the affairs of the world because he had no location in it which was properly his own.” Property guarantees privacy because it enforces a boundary line, “a kind of no man’s land between the private and the public, sheltering and protecting both.” In private, behind the four walls of house and hearth, the “sacredness of the hidden” protects men from the conformist expectations of the social and political worlds.

In private, shaded from the conformity of societal opinions as well as from the demands of the public world, we can grow in our own way and develop our own idiosyncratic character. Because we are hidden, “man does not know where he comes from when he is born and where he goes when he dies.” This essential darkness of privacy gives flight to our uniqueness, our freedom to be different. It is in privacy, in other words, that we become who we are. What this means is that without privacy there can be no meaningful difference. The political importance of privacy is that privacy is what guarantees difference and thus plurality in the public world.

Arendt develops her thinking on privacy most explicitly in her essays on education. Education must perform two seemingly contradictory functions. First, education leads a young person into the public world, introducing and acclimating him to the traditions, public language, and common sense that precede him. Second, education must also guard the child against the world, care for the child so that “nothing destructive may happen to him from the world.” The child, to be protected against the destructive onslaught of the world, needs the privacy that has its “traditional place” in the family.

Because the child must be protected against the world, his traditional place is in the family, whose adult members return back from the outside world and withdraw into the security of private life within four walls. These four walls, within which people’s private family life is lived, constitute a shield against the world and specifically against the public aspect of the world. This holds good not only for the life of childhood but for human life in general…Everything that lives, not vegetative life alone, emerges from darkness and, however strong its natural tendency to thrust itself into the light, it nevertheless needs the security of darkness to grow at all.

The public world is unforgiving. It can be cold and hard. All persons count equally in public, and little if any allowance is made for individual hardships or the bonds of friendship and love. Only in privacy, Arendt argues, can individuals emerge as unique individuals who can then leave the private realm to engage the political sphere as confident, self-thinking, and independent citizens.

public

The political import of Arendt’s defense of privacy is that privacy is what allows for meaningful plurality and differences that prevent one mass movement, one idea, or one opinion from imposing itself throughout society. Just as Arendt valued the federalism of the American Constitution because it multiplied power sources through the many state and local governments in the United States, so too did she value privacy because it nurtures meaningfully different and even opposed opinions, customs, and faiths. She defends the regional differences in the United States as important and even necessary to preserve the constitutional structure of dispersed power that she saw as the great bulwark of freedom against the tyranny of the majority. In other words, Arendt saw privacy as the foundation not only of private eccentricity, but also of political freedom.

Cole offers a clear-sighted account of the ways that government is impinging on privacy. It is essential reading and it is your weekend read.

-RB

17Feb/141

Amor Mundi 2/16/14

Arendtamormundi

Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.

Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.

The Young and Unexceptional

According to Rich Lowry and Ramesh Ponnuru, “The survival of American exceptionalism as we have known it is at the heart of the debate over Obama’s program. It is why that debate is so charged.” Mitt Romney repeated this same line during his failed bid to unseat the President, arguing that President Obama “doesn't have the same feelings about American exceptionalism that we do.” American exceptionalism—long a sociological concept used to describe qualities that distinguished American cultural and political institutions—has become a political truncheon. Now comes Peter Beinart writing in the National Journal that the conservatives are half correct. It is true that American exceptionalism is threatened and in decline. But the cause is not President Obama. Beinart argues that the real cause of the decline of exceptionalist feeling in the United States is conservatism itself. Here is Beinart on one way the current younger generation is an exception to the tradition of American exceptionalism: “For centuries, observers have seen America as an exception to the European assumption that modernity brings secularism. ‘There is no country in the world where the Christian religion retains a greater influence over the souls of men than in America,’ de Tocqueville wrote. In his 1996 book, American Exceptionalism: A Double-Edged Sword, Seymour Martin Lipset quoted Karl Marx as calling America ‘preeminently the country of religiosity,’ and then argued that Marx was still correct. America, wrote Lipset, remained ‘the most religious country in Christendom.’ But in important ways, the exceptional American religiosity that Gingrich wants to defend is an artifact of the past. The share of Americans who refuse any religious affiliation has risen from one in 20 in 1972 to one in five today. Among Americans under 30, it's one in three.
According to the Pew Research Center, millennials—Americans born after 1980—are more than 30 percentage points less likely than seniors to say that "religious faith and values are very important to America's success." And young Americans don't merely attend church far less frequently than their elders. They also attend far less than young people did in the past. "Americans," Pew notes, "do not generally become more [religiously] affiliated as they move through the life cycle"—which means it's unlikely that America's decline in religious affiliation will reverse itself simply as millennials age.  In 1970, according to the World Religion Database, Europeans were over 16 percentage points more likely than Americans to eschew any religious identification. By 2010, the gap was less than half of 1 percentage point. According to Pew, while Americans are today more likely to affirm a religious affiliation than people in Germany or France, they are actually less likely to do so than Italians and Danes.” Read more on Beinart and American exceptionalism in the Weekend Read.

Humans and the Technium

In this interview, Kevin Kelly, one of the founders of Wired magazine, explains his concept of the “technium,” or the whole system of technology that has developed over time and which, he argues, has its own biases and tendencies “inherently outside of what humans like us want.” One thing technology wants is to watch us and to track us. Kelly writes: “How can we have a world in which we are all watching each other, and everybody feels happy? I don't see any counter force to the forces of surveillance and self-tracking, so I'm trying to listen to what the technology wants, and the technology is suggesting that it wants to be watched. What the Internet does is track, just like what the Internet does is to copy, and you can't stop copying. You have to go with the copies flowing, and I think the same thing about this technology. It's suggesting that it wants to monitor, it wants to track, and that you really can't stop the tracking. So maybe what we have to do is work with this tracking—try to bring symmetry or have areas where there's no tracking in a temporary basis. I don't know, but this is the question I'm asking myself: how are we going to live in a world of ubiquitous tracking?” Asking such questions is where humans fit into the technium world. “In a certain sense,” he says, “what becomes really valuable in a world running under Google's reign are great questions, and that’s something that for a long time humans will be better at than machines. Machines are for answers; humans are for questions.”

Literature Against Consumer Culture 

Taking issue with a commentator's claim that The Paris Review's use of the word "crepuscular" (adj., resembling twilight) was elitist, Eleanor Catton suggests that the anti-critical attitude of contemporary readers arises out of consumer culture: "The reader who is outraged by being “forced” to look up an unfamiliar word — characterising the writer as a tyrant, a torturer — is a consumer outraged by inconvenience and false advertising. Advertising relies on the fiction that the personal happiness of the consumer is valued above all other things; we are reassured in every way imaginable that we, the customers, are always right." Literature, she says, resists this attitude, and, in fact, cannot be elitist at all: "A book cannot be selective of its readership; nor can it insist upon the conditions under which it is read or received. The degree to which a book is successful depends only on the degree to which it is loved. All a starred review amounts to is an expression of brand loyalty, an assertion of personal preference for one brand of literature above another. It is as hopelessly beside the point as giving four stars to your mother, three stars to your childhood, or two stars to your cat."

Global Corruption

Vladislav Inozemtsev reviews Laurence Cockcroft’s book Global Corruption. “The book’s central argument is that corruption has political roots, which Cockcroft identifies as the ‘merging of elites.’ Surveying the mechanisms of top-level decision-making from Russia to Brazil, to Peru and India, as well as in many other countries, he discerns a pattern: Politicians today often act as entrepreneurs, surround themselves with sycophants and deputies, and so navigate the entire political process as they would any commercial business. The hallmarks of a corrupt society are the widespread leveraging of wealth to secure public office; the leveraging of such authority to secure various kinds of privileges; and the interplay of both to make even bigger money. Simply put, corruption is a transformation of public service into a specific kind of entrepreneurship.”

Amazon's Bait and Switch

George Packer takes a look at Amazon's role in the book business, noting that its founder, Jeff Bezos, knew from the start that book sales were only the lure; Amazon's real business was Big Data—a big deal in an industry that speaks to people's hearts and minds as well as their wallets. Still, "Amazon remains intimately tangled up in books. Few notice if Amazon prices an electronics store out of business (except its staff); but, in the influential, self-conscious world of people who care about reading, Amazon’s unparalleled power generates endless discussion, along with paranoia, resentment, confusion, and yearning. For its part, Amazon continues to expend considerable effort both to dominate this small, fragile market and to win the hearts and minds of readers. To many book professionals, Amazon is a ruthless predator. The company claims to want a more literate world—and it came along when the book world was in distress, offering a vital new source of sales. But then it started asking a lot of personal questions, and it created dependency and harshly exploited its leverage; eventually, the book world realized that Amazon had its house keys and its bank-account number, and wondered if that had been the intention all along."

Ready or Not

Ta-Nehisi Coates, in the wake of NFL prospect Michael Sam's announcement that he is gay, considers how the concept of readiness is backwards: "The question which we so often have been offered—is the NFL ready for a gay player?—is backwards. Powerful interests are rarely “ready” for change, so much as they are assaulted by it. We refer to barriers being "broken" for a reason. The reason is not because great powers generally like to unbar the gates and hold a picnic in the honor of the previously excluded. The NFL has no moral right to be "ready" for a gay player, which is to say it has no right to discriminate against gay men at its leisure which anyone is bound to respect.”

Counter Reformation

This week, the magazine Jacobin released Class Action, a handbook for activist teachers, set against school reform and financed using the Kickstarter crowdfunding platform. One of the many essays contained within is Dean Baker's "Unremedial Education," which contains one of the handbook's major theses, an important reminder for those who are interested in education as a route to both the life of the mind and the success of the person: "Education is tremendously valuable for reasons unrelated to work and income. Literacy, basic numeracy skills, and critical thinking are an essential part of a fulfilling life. Insofar as we have children going through school without developing these skills, it is an enormous failing of society. Any just society would place a top priority on ensuring that all children learn such basic skills before leaving school. However, it clearly is not the case that plausible increases in education quality and attainment will have a substantial impact on inequality."

From the Hannah Arendt Center Blog

This week on the blog, Roger Berkowitz asks "Why Think?". And in the Weekend Read, Berkowitz reflects on the loss of American exceptionalism.

17Feb/140

The Dystopia of Knowledge

Arendtquote

“This future man, whom the scientists tell us they will produce in no more than a hundred years, seems to be possessed by a rebellion against human existence as it has been given, a free gift from nowhere (secularly speaking), which he wishes to exchange, as it were, for something he has made himself.”

Hannah Arendt, The Human Condition

The future man of whom Arendt writes is one who has been released from earthly ties, from nature.  He has been released from earth as a physical space but also as “the quintessence of the human condition.”  He will have been able to “create life in a test tube” and “extend man’s life-span far beyond the hundred-year limit.”  The idea that this man would wish to exchange his given existence for something artificial is part of a rather intricate intellectual historical argument about the development of modern science.

This is the essential idea: the more man has sought perfect knowledge of nature, the more he has found only himself in nature’s stead; and the more uncertain he has felt, the more he has continued to seek, with dire consequences. The negative consequences are bundled together within Arendt’s term “world alienation,” and signify, ultimately, the endangerment of possibilities for human freedom. Evocative of dystopian fiction from the first half of the twentieth century, this theme has enjoyed renewed popularity in our current world of never-ending war and ubiquitous surveillance facilitated by technical innovation.

surv

Arendt’s narrative centers on Galileo’s consummation of the Copernican revolution, which marks the birth of “the modern astrophysical world view.”  The significance of Galileo, Arendt writes, is that with him we managed to find “the Archimedean point” or the universal point of view.  This is an imagined point outside the earth from which it should be possible to make objective observations and formulate universal natural laws.  Our reaching of the Archimedean point, without leaving the earth, was responsible for natural science’s greatest triumphs and the extreme pace of discovery and technical innovation.

This was also a profoundly destabilizing achievement, and Arendt’s chronicle of its cultural effects takes on an almost psychological resonance.  While we had known since Plato that the senses were unreliable for the discovery of truth, she says, Galileo’s telescope told us that we could not trust our capacity for reason, either.  Instead, a manmade instrument had shown us the truth, undermining both reason and faith in reason.

In grappling with the resulting radical uncertainty, we arrived at Descartes’ solution of universal doubt.  Arendt describes this as a turn towards introspection, which provides a solution insofar as it takes place within the confines of one’s mind.  External forces cannot intrude here, at least upon the certainty that mental processes are true in the sense that they are real.  Man’s turn within himself afforded him some control.  This is because it corresponded with “the most obvious conclusion to be drawn from the new physical science: though one cannot know truth as something given and disclosed, man can at least know what he makes himself.” According to Arendt, this is the fundamental reasoning that has driven science and discovery at an ever-quickening pace.  It is at the source of man’s desire to exchange his given existence “for something he has made himself.”

The discovery of the Archimedean point with Galileo led us to confront our basic condition of uncertainty, and the Cartesian solution was to move the Archimedean point inside man.  The human mind became the ultimate point of reference, supported by a mathematical framework that it produces itself.  Mathematics, as a formal structure produced by the mind, became the highest expression of knowledge.  As a consequence, “common sense” was internalized and lost its worldly, relational aspect.  If common sense only means that all of us will arrive at the same answer to a mathematical question, then it refers to a faculty held internally by individuals rather than one that fits each of us into the common world we share with others, which is Arendt’s ideal.  She points to the loss of common sense as a crucial aspect of “world alienation.”

This loss is closely related to Arendt’s concerns about threats to human political communication. She worries that we have reached the point at which the discoveries of science are no longer comprehensible.  They cannot be translated from the language of mathematics into speech, which is at the core of Arendt’s notion of political action and freedom.

The threat to freedom is compounded when we apply our vision from the Archimedean point to ourselves.  Arendt cautions, “If we look down from this point upon what is going on on earth and upon the various activities of men, … then these activities will indeed appear to ourselves as no more than ‘overt behavior,’ which we can study with the same methods we use to study the behavior of rats.” (“The Conquest of Space and the Stature of Man” in Between Past and Future)

She argues against the behaviorist perspective on human affairs as a false one, but more frightening for her is the fact that it could become reality.  We may be seeking this transformation through our desire to control and know and thus to live in a world that we have ourselves created.  When we look at human affairs from the Archimedean, objective scientific point of view, our behavior appears to be analyzable, predictable, and uniform, like the activity of subatomic particles or the movement of celestial bodies.  We are choosing to look at things from so far a remove that, like these other activities and movements, they are beyond the grasp of experience.  “World alienation” refers to this taking of distance, which collapses human action into behavior.  The purpose would be to remedy the unbearable condition of contingency, but in erasing contingency, by definition, we erase the unexpected events that are the worldly manifestations of human freedom.

To restate the argument in rather familiar terms: Our quest for control, to put an end to the unbearable human condition of uncertainty and contingency, leads to a loss of both control and freedom.  This sentiment should be recognizable as a hallmark of the immediate post-war period, represented in works of fiction like Kubrick’s Dr. Strangelove, Beckett’s Endgame, and Orwell’s 1984.  We can also find it even earlier in Koestler’s Darkness at Noon and Huxley’s Brave New World.  There has been a recent recovery and reemergence of the dystopian genre, at least in one notable case, and with it renewed interest in Arendt’s themes as they are explored here.

Dave Eggers’ The Circle, released in 2013, revolves around an imagined cultish Bay Area tech company that is a combination of Google, Facebook, Twitter, and PayPal.  In its apparent quest for progress, convenience, and utility, it creates an all-encompassing universe in which all of existence is interpreted in terms of data points and everything is recorded. The protagonist, an employee of the Circle, is eventually convinced to “go transparent,” meaning that her every moment is live-streamed and recorded, with very few exceptions.  Reviews of the book have emphasized our culture of over-sharing and the risks to privacy that this entails.  They have also drawn parallels between this allegorical warning and the Snowden revelations.  Few, if any, however, have discussed the book in terms of the human quest for absolute knowledge in order to eliminate uncertainty and contingency, with privacy as collateral damage.

dave

In The Circle, the firm promotes transparency and surveillance as solutions to crime and corruption.  Executives claim that through acquired knowledge and technology, anything is possible, including social harmony and world peace.  The goal is to organize human affairs in a harmonious way using technical innovation and objective knowledge.  This new world is to be man made so that it can be manipulated for progressive ends.  In one key conversation, Mae, the main character, confronts one of the three firm leaders, saying, “… you can’t be saying that everyone should know everything,” to which he replies, “… I’m saying that everyone should have a right to know everything and should have the tools to know anything.  There’s not enough time to know everything, though I certainly wish there was.”

In this world, there are several senses in which man has chosen to replace existence as given with something he has made himself.  First and most obviously, new gadgets dazzle him at every turn, and he is dependent on them.  Second, he reduces all information “to the measure of the human mind.”  The technical innovations and continuing scientific discoveries are made with the help of manmade instruments, such that:  “Instead of objective qualities … we find instruments, and instead of nature or the universe—in the words of Heisenberg—man encounters only himself.” (The Human Condition, p. 261) Everything is reduced to a mathematical calculation.  An employee’s (somewhat forced) contributions to the social network are tabulated and converted into “retail raw,” the dollar measure of consumption they have inspired (through product placement, etc.).  All circlers are ranked, in a competitive manner, according to their presence on social media.  The effects in terms of Arendt’s notion of common sense are obvious.  Communication takes place in flat, dead prose.  Some reviewers have criticized Eggers for the writing style, but what appears to be bad writing actually matches the form to the content in this case.

Finally, it is not enough to experience reality here; all experience must be recorded, stored, and made searchable by the Circle.  Experience is thus replaced with a man-made replica.  Again, the logic is that we can only know what we produce ourselves.  As all knowledge is organized according to human artifice, the human mind, observing from a sufficient distance, can find the patterns within it.  These forms, pleasing to the mind, are justifiable because they work.  They produce practical successes.

blue

Here, harmony is discovered because it is created.  Arendt writes:

“If it should be true that a whole universe, or rather any number of utterly different universes will spring into existence and ‘prove’ whatever over-all pattern the human mind has constructed, then man may indeed, for a moment, rejoice in a reassertion of the ‘pre-established harmony between pure mathematics and physics,’ between mind and matter, between man and the universe.  But it will be difficult to ward off the suspicion that this mathematically preconceived world may be a dream world where every dreamed vision man himself produces has the character of reality only as long as the dream lasts.”

If harmony is artificially created, then it can only last so long as it is enforced.  Indeed, at the end of the novel, when the “dream” is revealed as a nightmare, Mae is faced with the choice of prolonging it.  We can find a similar final moment of hope in The Human Condition.  As she often does, Arendt has set up a crushing course of events, a seeming onslaught of catastrophe, but she leaves us with at least one ambiguous ray of light: “The idea that only what I am going to make will be real—perfectly true and legitimate in the realm of fabrication—is forever defeated by the actual course of events, where nothing happens more frequently than the totally unexpected.”

-Jennifer M. Hudson

2Feb/140

Amor Mundi 2/2/14

Arendtamormundi

Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.

Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.

The Right to Not Care

Evincing a particular kind of anti-political judgment, the editors at N+1 are trying to wiggle their way out of the internet's world of opinion: "We assert our right to not care about stuff, to not say anything, to opt out of debate over things that are silly and also things that are serious—because why pretend to have a strong opinion when we do not? Why are we being asked to participate in some imaginary game of Risk where we have to take a side? We welcome the re-emergence of politics in the wake of the financial crash, the restoration of sincerity as a legitimate adult posture. But already we see this new political sincerity morphing into a set of consumer values, up for easy exploitation. We are all cosmopolitans online, attentive to everything; but the internet is not one big General Assembly, and the controversies planted in establishment newspapers aren’t always the sort of problems that require the patient attention of a working group. Some opinions deserve radical stack (like #solidarityisforwhitewomen), but the glorified publicity stunts that dress up in opinion’s clothes to get viral distribution in the form of “debate” (Open Letters to Miley Cyrus) do not. We ought to be selective about who deserves our good faith. Some people duke it out to solve problems. Others pick fights for the spectacle, knowing we’ll stick around to watch. In the meantime they’ll sell us refreshments, as we loiter on the sideline, waiting to see which troll will out-troll his troll." Read Roger Berkowitz’s response on the Arendt Center blog.

Ignorance Praised in Art and Education

Barry Schwabsky wonders what the proliferation of MFAs and not Ph.D.’s in art means for artists. Could it be dangerous and lead to intellectually gifted but sterile artists? Don’t worry, Schwabsky writes, since art schools have adopted ignorance as their motto: "Just as no one family of techniques can be prescribed as the right content of art education, neither can any one set of ideas. The instructor’s knowledge and experience are always in principle too limited for the job they’ve taken on. They’re supposed to help usher their students into the not-yet-known, toward what, in Draw It With Your Eyes Closed, the Canadian artist Jon Pylypchuk calls "another place where there was no grade and just a friend telling you that what you did was good."  Sooner or later teaching art, and making art, is about coming to terms with one’s own ignorance.  Maybe that’s why the art world’s favorite philosopher these days is Jacques Rancière, whose best-known book—published in France in 1987 and translated into English four years later—is called The Ignorant Schoolmaster. Its subject is Joseph Jacotot, a forgotten French educator of the early nineteenth century whose “intellectual adventure” was founded on a paradoxical—one might be tempted to say nonsensical—principle: “He proclaimed that one could teach what one didn’t know.” The educator’s job, since teacher and student are assumed to be equal in intelligence, is nothing more than to “use all possible means of convincing the ignorant one of his power” of understanding. The teacher is there simply to remind the learner to pay attention, to keep working.” It might be helpful to recall Arendt’s argument in “The Crisis in Education,” that teaching must teach something if it is to give students the possibility of rebuilding the world anew.

Not Dead Yet

Digital journalism professor Meredith Broussard explains why she's banned e-readers from her classroom, and gives a short history of the book while she's at it: "The user interface for a book has been refined for centuries. What we call a ‘printed book’ today is a codex, a set of uniformly sized pages bound between covers. It was adopted around the 3rd or 4th century. A book’s interface is nearly perfect. It is portable, it never runs out of power, and you can write notes in it if you forget your notebook. The physical book is seamlessly integrated into the educational experience: It fits on any desk, even those cramped little writing surfaces that flip up from the side of a seat. You can sit around a table with 15 other people, each of whom has a book, and you can all see each other to have a conversation about what is on the page."

Hopelessly American

Carol Becker confronts “the first time I was aware that the world had changed and that "we" (my age group) were no longer the "younger generation." Another group was ascending, and its members appeared confoundedly different from us.” Becker reflects on what it is that identifies her generation and suggests that their idealism was hopelessly American: “I was asked if I still believed in making a “better world.” I was taken aback. I could not imagine a life where that was not a goal, nor a world incapable of movement forward. Having grown up believing in progress—not the progress of technology or material wealth but that of personal and social transformation—it probably is the concept of “hope” that most separates my generation from those that immediately followed. Perhaps I am delusional and, like all who suffer from delusions, unable to function without them. Or it could be that I am “hopelessly American,” as my students in Greece used to say, because of my conviction that the world can be changed for the better and that I, or we, must have a hand in that process.”

The Last of the Unjust

Claude Lanzmann, maker of the magisterial Shoah, has been deeply critical of Hannah Arendt’s appraisal of Jewish leaders. Now Lanzmann has a new film out that is proving almost as controversial as Eichmann in Jerusalem. I wrote about it earlier, here. This weekend, Jeremy Gerard has a short profile of the movie in the New York Times. “Life and death in Theresienstadt were overseen by successive heads of the Judenrat, the Jewish council set up by the Nazis in ghettos and camps to enforce Nazi orders and to oversee labor and the transfer of people to Auschwitz-Birkenau, Dachau and other camps. The first two were executed when their usefulness ended. The final elder, serving from December 1944 to May 1945, was a brilliant Viennese rabbi, Benjamin Murmelstein, who called himself “the last of the unjust,” a phrase that Mr. Lanzmann appropriated for the title of his 3-hour-40-minute look at this divisive figure. In the documentary, opening on Feb. 7, he revisits an intense week he spent filming Rabbi Murmelstein nearly four decades ago. Some critics and Holocaust survivors have found the new documentary overly sympathetic to the rabbi; Mr. Lanzmann himself has therefore become an unlikely player in the continuing debate over how we are to remember Jews who worked in any way with the Nazis.”

From the Hannah Arendt Center Blog

This week on the blog, Ian Storey writes about Arendt, Steve McQueen, and Kanye West. And in the Weekend Read, Roger Berkowitz takes on the editors at N+1 who berate the internet for inciting too much free speech.

28Jan/14

Amor Mundi 1/26/14


Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.

Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.

Expansive Writing

Flickr - Manky M.

In The Origins of Totalitarianism, Hannah Arendt asks after the “elements” of totalitarianism, those fundamental building blocks that made possible an altogether new and horrific form of government. The two structural elements she locates are the emergence of a new ideological form of Antisemitism and the rise of transnational imperialist movements, which give the book its structure: Part One (Antisemitism) and Part Two (Imperialism). Underlying both Antisemitism and Imperialism, however, is what Arendt calls “metaphysical loneliness.” Totalitarian government, Arendt writes, “bases itself on loneliness, on the experience of not belonging to the world at all, which is among the most radical and desperate experiences of man.” In a world of individualism, the human bonds of religion, family, clan, and nation are increasingly seen as arbitrary, tenuous, and weak, and individuals find themselves uprooted, redundant, and superfluous. “Metaphysical loneliness” is, for Arendt, the “basic experience” of modern society; loneliness, she writes, is “the common ground for terror, the essence of totalitarian government, and for ideology or logicality, the preparation of its executioners and victims,” and it “is closely connected with uprootedness and superfluousness which have been the curse of modern masses since the beginning of the industrial revolution and have become acute with the rise of imperialism at the end of the last century and the breakdown of political institutions and social traditions in our own time.” The question underlying so much of Arendt’s work is how to respond to what she calls “the break in tradition,” the fact that the political, social, and intellectual traditions that bound people together in publicly meaningful institutions and networks have frayed beyond repair. The customs and traditions that for millennia were the unspoken common sense of peoples can no longer be presumed.
How can life be made meaningful? How can individuals be inoculated against the seduction of ideological movements that lend weight to their meaningless lives? If metaphysical loneliness is the basic experience of modern life, then it is not surprising that great modern literature would struggle with the agony of such disconnection and seek to articulate paths of reconnection. That, indeed, is the thesis of Wyatt Mason’s essay “Make This Not True,” in this week’s New York Review of Books. Modern fiction, Mason argues, struggles to answer the question: How can we live and die and not be alone? There are, he writes, at least three paradigmatic answers, represented alternatively by three of the greatest contemporary writers: David Foster Wallace, Jonathan Franzen, and George Saunders. Reviewing Saunders’s Tenth of December (a 2013 National Book Award finalist), Mason suggests an important link between Saunders’s Buddhism and his writing: “In Buddhist practice, through sitting meditation, the mind may be schooled in the way of softness, openness, expansiveness. This imaginative feat—of being able to live these ideas—is one of enormous subtlety. What makes Saunders’s work unique is not its satirical verve or its fierce humor but its unfathomable capacity to dramatize, in story form, the life-altering teachings of such a practice. … [I]f fiction is to continue to exert an influence over a culture that finds it ever easier to connect, however frailly, to the world around them through technology, Saunders’s stories suggest that the ambition to connect outwardly isn’t the only path we can choose. Rather, his fiction shows us that the path to reconciliation with our condition is inward, a journey we must make alone.”

Second Life

Ai Weiwei describes what he thinks Internet access has done for his home country: “the Internet is the best thing that ever happened to China.” If Mason and Saunders (see above) worry that technology magnifies the loneliness of modern mass society, Ai Weiwei argues that the World Wide Web “turns us into individuals and also enables us to share our perceptions and feelings. It creates a culture of individualism and exchange even though the real society doesn't promote it. There isn't a single Chinese university that can invite me to give a talk. Even though I know there are many students who would like to hear what I have to say.”

Bringing Power to the People

In an interview about art, politics, and the intersection between the two, Sudanese poet Mamoun Eltlib describes a revolution for those who have rejected the political: "When you come to politicians now, people don’t really care about them, because they find out it’s just a chair or election problem between them. It’s not about them as Sudanese. So when you do something for the people without asking them to vote for you or elect you or to do anything, just to make a very beautiful, attractive program, they respond. I was in Doha for a conference for three days, to solve the problem in Sudan. They brought all the intellectuals and the writers and the thinkers from the political parties and from the rebel groups and from the government itself, as well as independent writers like me and Faisal, and they made this paper called, ‘Loving Your Enemy Through Culture,’ because I was saying that we don’t just need to change the people, we need to change the politicians. If we really want to fight now, we have just one way, the cultural way."

Losing Our Religion

In Democracy in America, Alexis de Tocqueville argues that the American brand of religion—strong on morality while permissive on rituals and dogma—is deeply important to liberal democracy. While democracy secures political and civil liberties, religion sustains a “civic religion” that privileges moral consensus over dogmatism and provides a common core of moral belief even amongst a plurality of faiths and sects. On this view, the continued religiosity of Americans, especially in comparison with the irreligiosity of Europeans, is an important ingredient in the American experience of democracy. With this in mind, consider this snippet from Megan Hustad’s memoir More Than Conquerors. Hustad talks about growing up in a missionary household, and how her father is coping with changes he sees happening around him: "Thanks be to God, my parents would say. Thanks to my ability to take care of myself, I would say. My father knows I choose to fill my time with people for whom Christianity is an outmoded concept, a vestigial cultural tail humanity would be better off losing. He knows most of my friends are of the opinion that the country would be better off without people who think like he does. His new status as cultural relic bothers him. He finds it ironic that moral relativists temporarily misplace their relativism when talk turns to Jesus. He doesn’t like how “evangelical” and “fundamentalist” are so often conflated in news reports and in opinion pieces, as if there were no shadows between them. It seems to him more evidence that the United States is becoming a post-Christian society like England and much of Europe before it. Used to be, he remembers, one didn’t have to explain the contours of faith. Billy Graham appeared on prime-time television. Everyone in this country, he remembers, knew what faith was for."

From the Hannah Arendt Center Blog

This week on the blog, Roger Berkowitz explores the literary responses to loneliness in the writing of George Saunders via Wyatt Mason. Jeffrey Champlin discusses how Arendt read Adam Smith.

20Jan/14

Amor Mundi 1/19/14


On Muckraking and Political Change

Jim Sleeper turned me on to Dean Starkman’s excerpt from his new book chronicling the failure of the press to expose wrongdoing in the lead-up to the financial crisis, The Watchdog That Didn’t Bark: The Financial Crisis and the Disappearance of Investigative Journalism. Starkman writes: “Now is a good time to consider what journalism the public needs. What actually works? Who are journalism’s true forefathers and foremothers? Is there a line of authority in journalism’s collective past that can help us navigate its future? What creates value, both in a material sense and in terms of what is good and valuable in American journalism? Accountability reporting comes in many forms—a series of revelations in a newspaper or online, a book, a TV magazine segment—but its most common manifestation has been the long-form newspaper or magazine story, the focus of this book. Call it the Great Story. The form was pioneered by the muckrakers’ quasi-literary work in the early 20th century, with Tarbell’s exposé on the Standard Oil monopoly in McClure’s magazine a brilliant example. As we’ll see, the Great Story has demonstrated its subversive power countless times and has exposed and clarified complex problems for mass audiences across a nearly limitless range of subjects: graft in American cities, modern slave labor in the US, the human costs of leveraged buyouts, police brutality and corruption, the secret recipients on Wall Street of government bailouts, the crimes and cover-ups of media and political elites, and on and on, year in and year out. The greatest of muckraking editors, Samuel S. McClure, would say to his staff, over and over, almost as a mantra, “The story is the thing!” And he was right.” Starkman’s incredible optimism about the power of the press is infectious. But in the weekend read, Roger Berkowitz turns to Walter Lippmann to raise questions about Starkman’s basic assumptions.

Our Unconstitutional Standing Army

Kathleen Frydl has an excellent essay in The American Interest arguing against our professionalized military and for the return of a citizens’ army. “Without much reflection or argument, the United States now supports the professional “large standing army” feared by the Founding Fathers, and the specter of praetorianism they invoked casts an ever more menacing shadow as the nation drifts toward an almost mercenary force, which pays in citizenship, opportunity structures (such as on-the-job technical training and educational benefits), a privileged world of social policy (think Tricare), and, in the case of private contractors, lots of money. Strict constructionists of the Constitution frequently ignore one of its most important principles—that the military should be large and powerful only when it includes the service of citizen-soldiers. This oversight clearly relates to the modern American tendency to define freedom using the neo-liberal language of liberty, shorn of any of the classical republican terminology of service. We would do well to remember Cicero’s most concise summary of a constitutional state: “Freedom is the participation in power.”” I don’t know what Hannah Arendt would have thought about the draft. But I do know she’d sympathize with Frydl’s worries about a professionalized army.

What Has It Done To Us?

Tim Wu marvels at the human augmented by technology. Consider what an intelligent time traveler would think when talking to a reasonably educated woman today: "The woman behind the curtain is, of course, just one of us. That is to say, she is a regular human who has augmented her brain using two tools: her mobile phone and a connection to the Internet and, thus, to Web sites like Wikipedia, Google Maps, and Quora. To us, she is unremarkable, but to the man she is astonishing. With our machines, we are augmented humans and prosthetic gods, though we’re remarkably blasé about that fact, like anything we’re used to. Take away our tools, the argument goes, and we’re likely stupider than our friend from the early twentieth century, who has a longer attention span, may read and write Latin, and does arithmetic faster. The time-traveler scenario demonstrates that how you answer the question of whether we are getting smarter depends on how you classify “we.”" We, the underlying humans, may know less and less. But “we,” the digitally enabled cyborgs that we’ve become, are geniuses. Much of the focus and commentary about artificial intelligence asks the wrong question, about whether machines will become human. The better question is what will become of humans as we integrate more fully with our machines. That was the topic of Human Being in an Inhuman Age, the 2010 Arendt Center Conference. A selection of essays from that conference is published in the inaugural edition of HA: The Journal of the Hannah Arendt Center.

Thinking History

In an interview with high school teacher David Cutler, history professor Eric Foner explains how we could make history education more effective: "Knowledge of the events of history is important, obviously, but also I think what I see in college students, that seems to be lacking at least when they come into college, is writing experience. In other words, being able to write that little essay with an argument. I see that they think, 'OK, there are the facts of history and that's it—what more is there to be said?' But of course, the very selection of what is a fact, or what is important as a fact, is itself based on an interpretation. You can't just separate fact and interpretation quite as simply as many people seem to think. I would love to see students get a little more experience in trying to write history, and trying to understand why historical interpretation changes over time." Foner wants students to think history, not simply to know it.

Reading Croatian Fiction

Gary Shteyngart, Google Glass wearer and author of the recently published memoir Little Failure, explains the arc of his reading habits: "When I was growing up, I was reading a lot of male fiction, if you can call it that. I was up to my neck in Saul Bellow, which was wonderful and very instrumental, but I think, like most people, I’ve expanded my range quite a bit. When you’re young you focus on things that are incredibly important to you and read, God knows, every Nabokov that’s ever been written. But then it is time to move beyond that little place where you live, and I’ve been doing that; I’m so curious to see so many people send me books now it’s exciting to go to the mailbox and see a work of Croatian fiction."

This Week on the Blog

This week on the blog, Sandipto Dasgupta discusses Arendt and B.R. Ambedkar, one of the authors of the Indian constitution. In the weekend read, Roger Berkowitz examines the merit of muckraking journalism and its role as watchdog of corruption.

6Jan/14

Amor Mundi 1/5/14


The Missing NSA Debate About Capitalism

Hero or traitor? That is the debate The New York Times wants about Edward Snowden. But the deeper question is what, if anything, will change? Evgeny Morozov has a strong essay in The Financial Times: "Mr. Snowden created an opening for a much-needed global debate that could have highlighted many of these issues. Alas, it has never arrived. The revelations of the US's surveillance addiction were met with a rather lacklustre, one-dimensional response. Much of this overheated rhetoric - tinged with anti-Americanism and channelled into unproductive forms of reform - has been useless." The basic truth is that "No laws and tools will protect citizens who, inspired by the empowerment fairy tales of Silicon Valley, are rushing to become data entrepreneurs, always on the lookout for new, quicker, more profitable ways to monetise their own data - be it information about their shopping or copies of their genome. These citizens want tools for disclosing their data, not guarding it.... What eludes Mr. Snowden - along with most of his detractors and supporters - is that we might be living through a transformation in how capitalism works, with personal data emerging as an alternative payment regime. The benefits to consumers are already obvious; the potential costs to citizens are not. As markets in personal information proliferate, so do the externalities - with democracy the main victim. This ongoing transition from money to data is unlikely to weaken the clout of the NSA; on the contrary, it might create more and stronger intermediaries that can indulge its data obsession. So to remain relevant and have some political teeth, the surveillance debate must be linked to debates about capitalism - or risk obscurity in the highly legalistic ghetto of the privacy debate."

The Non-Private World Today

Considering the Fourth Amendment implications of the recent federal injunction against the NSA's domestic spying program, David Cole notes something important about the world we're living in: "The reality of life in the digital age is that virtually everything you do leaves a trace that is shared with a third party - your Internet service provider, phone company, credit card company, or bank. Short of living off the grid, you don't have a choice in the matter. If you use a smartphone, you are signaling your whereabouts at all times, and sharing with your phone provider a track record of your thoughts, interests, and desires. Technological innovations have made it possible for all of this information to be collected, stored, and analyzed by computers in ways that were impossible even a decade ago. Should the mere existence of this information make it freely searchable by the NSA, without any basis for suspicion?"

The End of the Blog

Jason Kottke thinks that the blog is no longer the most important new media form: "The primary mode for the distribution of links has moved from the loosely connected network of blogs to tightly integrated services like Facebook and Twitter. If you look at the incoming referers to a site like BuzzFeed, you'll see tons of traffic from Facebook, Twitter, Reddit, Stumbleupon, and Pinterest but not a whole lot from blogs, even in the aggregate. For the past month at kottke.org, 14 percent of the traffic came from referrals compared to 30 percent from social, and I don't even work that hard on optimizing for social media. Sites like BuzzFeed and Upworthy aren't seeking traffic from blogs anymore. Even the publicists clogging my inbox with promotional material urge me to 'share this on my social media channels' rather than post it to my blog." Of course, it may be the case that the blog form remains deeply important, but only for those blogs that people visit regularly and then distribute through social media. The major blogs are more powerful and popular than ever. What we are learning is that not everyone is a blogger.

Against Daddy Days

Ta-Nehisi Coates explains why he's frustrated about the way we're having the conversation about paternity leave: "So rather than hear about the stigma men feel in terms of taking care of kids, I'd like for men to think more about the stigma that women feel when they're trying to build a career and a family. And then measure whatever angst they're feeling against the real systemic forces that devalue the labor of women. I think that's what's at the root of much of this: When some people do certain work we cheer. When others do it we yawn. I appreciated the hosannas when I was strolling down Flatbush, but I doubt the female electrician walking down the same street got the same treatment."

The Professional Palate Unmasked

Breaking with a tradition of his profession, New York magazine restaurant critic Adam Platt has decided to reveal his face. In explaining why, he stakes a claim for the continued importance of the critic in the digital age: "So is there still room for the steady (and, yes, sometimes weary) voice of the professional in a world where everyone's a critic? Of course there is. This is especially true in the theatrical realm of restaurants, where the quality and enjoyment of your dinner can vary dramatically depending on where you sit, what time of day you eat, how long the restaurant has been open, and what you happened to order. Anonymity would be nice, but it's always been less important than a sturdy gut and a settled palate. Most important of all, however, is a healthy expense account, because if a critic's employer allows for enough paid visits to a particular restaurant, even the most elaborately simpering treatment won't change his or her point of view."