In the wake of Mozilla C.E.O. Brendan Eich's resignation over his support for California's 2008 Proposition 8, which banned gay marriage and has since been overturned in court, Andrew Sullivan laments the process by which Eich was compelled to step down.
In his post, Sullivan, a gay man who has been making the conservative case for gay marriage for nearly two decades, suggests that to simply label Eich a bigot and move forward under that presumption is too easy. Indeed, he says, "the ability to work alongside or for people with whom we have a deep political disagreement is not a minor issue in a liberal society. It is a core foundation of toleration. We either develop the ability to tolerate those with whom we deeply disagree, or liberal society is basically impossible. Civil conversation becomes culture war; arguments and reason cede to emotion and anger." In this context, what is a crusade for tolerance also becomes a front for intolerance, something that deeply troubles Sullivan. The propagation of such sure belief means the end of civil society and, in its face, he proposes we embrace uncertainty, concluding, finally, that "a moral movement without mercy is not moral; it is, when push comes to shove, cruel."
Sullivan makes a passionate and necessary plea for both moral uncertainty and, equally important, a willingness to live with and amongst those whose opinions we find both wrong and hurtful. What makes American democracy special is not that we have the right answers, but that we are committed to the conversation; not that we employ mandarins blessed with the right answers, but that we trust everyday citizens to figure it out as we go along. Sullivan makes his case that Eich was honorable, open, and willing to engage in meaningful dialogue with those he disagreed with. Let's leave aside accusations of political correctness and such. The important point is that we are living in a country increasingly at odds with its democratic tradition of debate and disagreement. We bemoan the fact that Republicans and Democrats can't talk across the aisle; how is it that we now won't even work with someone who respectfully disagrees with us politically?
—RB h/t Josh Kopin
Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.
Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.
Over at SCOTUSblog, Burt Neuborne writes that “American democracy is now a wholly owned subsidiary of Oligarchs, Inc.” The good news, Neuborne reminds, is that “this too shall pass.” After a fluid and trenchant review of the case and the recent decision declaring limits on aggregate giving to political campaigns to be unconstitutional, Neuborne writes: “Perhaps most importantly, McCutcheon illustrates two competing visions of the First Amendment in action. Chief Justice Roberts’s opinion turning American democracy over to the tender mercies of the very rich insists that whether aggregate contribution limits are good or bad for American democracy is not the Supreme Court’s problem. He tears seven words out of the forty-five words that constitute Madison’s First Amendment – “Congress shall make no law abridging . . . speech”; ignores the crucial limiting phrase “the freedom of,” and reads the artificially isolated text fragment as an iron deregulatory command that disables government from regulating campaign financing, even when deregulation results in an appalling vision of government of the oligarchs, by the oligarchs, and for the oligarchs that would make Madison (and Lincoln) weep. Justice Breyer’s dissent, seeking to retain some limit on the power of the very rich to exercise undue influence over American democracy, views the First Amendment, not as a simplistic deregulatory command, but as an aspirational ideal seeking to advance the Founders’ effort to establish a government of the people, by the people, and for the people for the first time in human history. For Justice Breyer, therefore, the question of what kind of democracy the Supreme Court’s decision will produce is at the center of the First Amendment analysis. For Chief Justice Roberts, it is completely beside the point. I wonder which approach Madison would have chosen. As a nation, we’ve weathered bad constitutional law before. Once upon a time, the Supreme Court protected slavery. 
Once upon a time the Supreme Court blocked minimum-wage and maximum-hour legislation. Once upon a time, the Supreme Court endorsed racial segregation, denied equality to women, and jailed people for their thoughts and associations. This, too, shall pass. The real tragedy would be for people to give up on taking our democracy back from the oligarchs. Fixing the loopholes in disclosure laws, and public financing of elections are now more important than ever. Moreover, the legal walls of the airless room are paper-thin. Money isn’t speech at obscenely high levels. Protecting political equality is a compelling interest justifying limits on uncontrolled spending by the very rich. And preventing corruption means far more than stopping quid pro quo bribery. It means the preservation of a democracy where the governed can expect their representatives to decide issues independently, free from economic serfdom to their paymasters. The road to 2016 starts here. The stakes are the preservation of democracy itself.” It is important to remember that the issue is not really partisan, but that both parties are corrupted by the influx of huge amounts of money. Democracy is in danger not because one party will win the election, but because the oligarchs on both sides are crowding out grassroots participation. This is an essay you should read in full. For a plain English review of the decision, read this from SCOTUSblog. And for a Brief History of Campaign Finance, check out this from the Arendt Center Archives.
Zephyr Teachout, the most original and important thinker about the constitutional response to political corruption, has an op-ed in the Washington Post: “We should take this McCutcheon moment to build a better democracy. The plans are there. Rep. John Sarbanes (D-Md.) has proposed something that would do more than fix flaws. H.R. 20, which he introduced in February, is designed around a belief that federal political campaigns should be directly funded by millions of passionate, but not wealthy, supporters. A proposal in New York would do a similar thing at the state level.” Teachout spoke at the Arendt Center two years ago after the Citizens United case. Afterwards, Roger Berkowitz wrote: “It is important to see that Teachout is really pointing out a shift between two alternate political theories. First, she argues that for the founders and for the United States up until the mid-20th century, the foundational value that legitimates our democracy is the confidence that our political system is free from corruption. Laws that restrict lobbying or penalize bribery are uncontroversial and constitutional, because they recognize core—if not the core—constitutional values. Second, Teachout sees that increasingly free speech has replaced anti-corruption as the foundational constitutional value in the United States. Beginning in the 20th century and culminating in the Court's decision in Citizens United, the Court gradually accepted the argument that the only way to guarantee a legitimate democracy is to give unlimited protection to the marketplace of ideas. Put simply, truth is nothing else but the product of free debate and any limits on debate, especially political debate, will delegitimize our politics.” Read the entirety of his commentary here. Watch a recording of Teachout’s speech here.
A new exhibition opened two weeks ago at the Haus der Kulturen der Welt in Berlin that examines the changing ways in which states police and govern their subjects through forensics, and how certain aesthetic-political practices have also been used to challenge or expose states. Curated by Anselm Franke and Eyal Weizman, Forensis “raises fundamental questions about the conditions under which spatial and material evidence is recorded and presented, and tests the potential of new types of evidence to expand our juridical imagination, open up forums for political dispute and practice, and articulate new claims for justice.” Harry Burke and Lucy Chien review the exhibition on Rhizome: “The exhibition argues that forensics is a political practice primarily at the point of interpretation. Yet if the exhibition is its own kind of forensic practice, then it is the point of the viewer's engagement where the exhibition becomes significant. The underlying argument in Forensis is that the object of forensics should be as much the looker and the act of looking as the looked-upon.” If you want to read more, we suggest Mengele’s Skull: The Advent of a Forensic Aesthetics.
In an interview, Leslie Jamison, author of the very recently published The Empathy Exams, offers up a counterintuitive defense of empathy: “I’m interested in everything that might be flawed or messy about empathy — how imagining other lives can constitute a kind of tyranny, or artificially absolve our sense of guilt or responsibility; how feeling empathy can make us feel we’ve done something good when we actually haven’t. Zizek talks about how 'feeling good' has become a kind of commodity we purchase for ourselves when we buy socially responsible products; there’s some version of this inoculation logic — or danger — that’s possible with empathy as well: we start to like the feeling of feeling bad for others; it can make us feel good about ourselves. So there’s a lot of danger attached to empathy: it might be self-serving or self-absorbed; it might lead our moral reasoning astray, or supplant moral reasoning entirely. But do I want to defend it, despite acknowledging this mess? More like: I want to defend it by acknowledging this mess. Saying: Yes. Of course. But yet. Anyway.”
In a review of Romanian writer Herta Müller's recently translated collection Christina and Her Double, Costica Bradatan points to what changing language can do, what it can't do, and how those who attempt to manipulate it may also underestimate its power: “Behind all these efforts was the belief that language can change the real world. If religious terms are removed from language, people will stop having religious feelings; if the vocabulary of death is properly engineered, people will stop being afraid of dying. We may smile today, but in the long run such policies did produce a change, if not the intended one. The change was not in people’s attitudes toward death or the afterworld, but in their ability to make sense of what was going on. Since language plays such an important part in the construction of the self, when the state subjects you to constant acts of linguistic aggression, whether you realize it or not, your sense of who you are and of your place in the world are seriously affected. Your language is not just something you use, but an essential part of what you are. For this reason any political disruption of the way language is normally used can in the long run cripple you mentally, socially, and existentially. When you are unable to think clearly you cannot act coherently. Such an outcome is precisely what a totalitarian system wants: a population perpetually caught in a state of civic paralysis.”
Scott Samuelson, author of "The Deepest Human Life: An Introduction to Philosophy for Everyone," has this paean to the humanities in the Wall Street Journal: “I once had a student, a factory worker, who read all of Schopenhauer just to find a few lines that I quoted in class. An ex-con wrote a searing essay for me about the injustice of mandatory minimum sentencing, arguing that it fails miserably to live up to either the retributive or utilitarian standards that he had studied in Introduction to Ethics. I watched a preschool music teacher light up at Plato's "Republic," a recovering alcoholic become obsessed by Stoicism, and a wayward vet fall in love with logic (he's now finishing law school at Berkeley). A Sudanese refugee asked me, trembling, if we could study arguments concerning religious freedom. Never more has John Locke —or, for that matter, the liberal arts—seemed so vital to me.”
Arthur C. Brooks makes the case that charitable giving makes us happier and even more successful: “In 2003, while working on a book about charitable giving, I stumbled across a strange pattern in my data. Paradoxically, I was finding that donors ended up with more income after making their gifts. This was more than correlation; I found solid evidence that giving stimulated prosperity…. Why? Charitable giving improves what psychologists call “self-efficacy,” one’s belief that one is capable of handling a situation and bringing about a desired outcome. When people give their time or money to a cause they believe in, they become problem solvers. Problem solvers are happier than bystanders and victims of circumstance.” Do yourself a favor, then, and become a member of the Arendt Center.
What Heidegger's Denktagebuch reveals about his thinking during the Nazi regime.
April 8, 2014
Goethe Institut, NYC
Learn more here.
"My Name is Ruth."
An Evening with Bard Big Read and Marilynne Robinson's Housekeeping
Excerpts will be read by Neil Gaiman, Nicole Quinn, & Mary Caponegro
April 23, 2014
Richard B. Fisher Center, Bard College
Learn more here.
This week on the blog, our Quote of the Week comes from Martin Wager, who views Arendt's idea of world alienation through the lens of modern day travel. Josh Kopin looks at Stanford Literary Lab's idea of using computers and data as a tool for literary criticism. In the Weekend Read, Roger Berkowitz ponders the slippery slope of using the First Amendment as the basis for campaign finance reform.
Franco Moretti, a literature professor and founder of the Stanford Literary Lab, believes in something called "computational criticism," that is, the ability of computers to aid in the understanding of literature. Joshua Rothman's recent profile of Moretti has provoked a lot of response, most of it defending traditional literary criticism from the digital barbarians at the gates. Moretti's defenders argue, however, that his critics have failed to understand a crucial difference between his work and what they're worried it might supplant: "The basic idea in Moretti’s work is that, if you really want to understand literature, you can’t just read a few books or poems over and over (“Hamlet,” “Anna Karenina,” “The Waste Land”). Instead, you have to work with hundreds or even thousands of texts at a time. By turning those books into data, and analyzing that data, you can discover facts about literature in general—facts that are true not just about a small number of canonized works but about what the critic Margaret Cohen has called the 'Great Unread.'"
The truth Moretti is after, however, has nothing to do with literature, with the blood-curdling insights of tragedy or the personal insights of the novel's hero. What Moretti seeks is a better understanding of all the other texts, of the entirety of texts and the overarching literariness of a period or of history as a whole. One could say that rather than supplant the traditional literary critic, Moretti's work will aid the literary historian, if only by giving a potentially comprehensive idea of any given zeitgeist. That is true, so far as it goes. But as the already shrinking ranks of literature students are siphoned off into alternative studies of literature that ignore and even disdain the surprising and irreducible shock of momentary insight, the decline of the literary sensibility will only accelerate. This is hardly to condemn Moretti and his data-oriented approach to literature as a reservoir of information about mass society; we ought, nevertheless, to find in the popularity of such trends the provocation to remind ourselves why literature is meant to be read by humans instead of machines.
RB h/t Josh Kopin
A few weeks ago, Christy Wampole, a professor of French at Princeton, took to the New York Times to point to what she sees as a pandemic of irony, the symptom of a malignant hipster culture which has metastasized, spreading out from college campuses and hip neighborhoods and into the population at large. Last week, author R. Jay Magill responded to Wampole, noting that the professor was a very late entry into an analysis of irony that stretches back to the last gasps of the 20th century, and that even that discourse fits into a much longer conversation about sincerity and irony that has been going on at least since Diogenes.
Of course, this wasn’t Magill’s first visit to this particular arena; his own entry, entitled Sincerity: How a Moral Ideal Born Five Hundred Years Ago Inspired Religious Wars, Modern Art, Hipster Chic, and the Curious Notion That We All Have Something to Say (No Matter How Dull), came out in July. Magill very effectively recapitulates the main point from his book in his article for the Atlantic, but, if you were to read this new summary alone, you would deny yourself some of the pleasures of Magill’s research and prose, while also sparing yourself some of his less convincing arguments, arguments which, incidentally, supply the thrust of his recent article.
The most interesting chapters of Magill’s book deal with the early history of the rise of sincerity, which he traces back to the Reformation. In Magill’s telling, the word “sincere” enters the record of English in 1533, when an English reformer named John Frith writes, to Sir Thomas More, that John Wycliffe “had lived ‘a very sincere life.’” Before that use, in its origin in Latin and French, the word “sincere” had only been used to describe objects and, now, Frith was using it not only for the first time in English but also to describe a particular individual as unusually true and pure to his self, set in opposition to the various hypocrisies that had taken root within the Catholic Church. Magill sums this up quite elegantly: “to be sincere,” he writes, “was to be reformed.”
Now, this would have been revolutionary enough, since it suggested that a relationship with God required internal confirmation rather than external acclamation—in the words of St. Paul, a fidelity to the spirit of the law and not just the letter. And yet reformed sincerity was not simply a return to the Gospel. In order to be true to one’s self, there must be a self to accord with, an internal to look towards. Indeed, Magill’s history of the idea of sincerity succeeds when it describes the development of the self, and, in particular, that development as variably determined by the internal or the external.
It gets more complicated, however, or perhaps more interesting, when Magill turns towards deceptive presentations of the self, that is, when he begins to talk about insincerity. He begins this conversation with Montaigne, who “comes to sense a definite split between his public and private selves and is the first author obsessed with portraying himself as he really is.” The most interesting appearance of this conversation is an excellent chapter on Jean-Jacques Rousseau, who suggested that people should aspire to self-sameness, should do their best to “reconcile” one’s self to one’s self, a demand for authenticity that would come to be fully expressed in Immanuel Kant’s moral law, the command that I must set myself as a law for myself.
Sincerity, the moral ideal first put forth by John Frith, started as the Reformation’s response to the inability of the Catholic Church to enact that particular principle, in other words, its hypocrisy. This follows for each of the movements that Magill writes about, each responding to the hypocrisy of their own moment in a specific way. On this matter he has a very good teacher, Hannah Arendt, an inheritor of Kant, who was himself a reader of Rousseau. Arendt writes, in Crises of the Republic, what might serve as a good summation of one of Magill’s more convincing arguments: “if we inquire historically into the causes likely to transform engagés into enragés, it is not injustice that ranks first, but hypocrisy.”
Still, while what makes the sincerity of Frith (who was burned at the stake) or Wycliffe (whose body was exhumed a half century after his death so that it, too, could be burned) compelling is the turn inwards, it is Rousseau’s substitution of the turn back for that turn inward that appears to interest Magill, who decries “the Enlightenment understanding of the world” that “would entirely dominate the West, relegating Rousseau to that breed of reactionary artistic and political minds who stood against the progress of technology, commerce, and modernization and pined for utopia.”
The whole point is moot; Rousseau was himself a hypocrite, often either unable or unwilling to enact the principles he set out in his writings. As Magill moves forward, though, it becomes clear that he values the turn back as a manifestation of sincerity, as a sort of expressing oneself honestly. The last few hundred years in the development of sincerity, it seems, are finding new iterations of the past in the self. He writes that the Romantics, a group he seems to favor as more sincere than most, “harbored a desire to escape forward-moving, rational civilization by worshipping nature, emotion, love, the nostalgic past, the bucolic idyll, violence, the grotesque, the mystical, the outcast and, failing these, suicide.” In turn, in his last chapter, Magill writes that hipster culture serves a vital cultural purpose: its “sincere remembrance of things past, however commodified or cheesy or kitschy or campy or embarrassing, remains real and small and beautiful because otherwise these old things are about to be discarded by a culture that bulldozes content once it has its economic utility.”
The hipster, for Magill, is not the cold affectation of an unculture, as Wampole wants to claim, but is instead the inheritor “of the entire history of the Protestant-Romantic-rebellious ethos that has aimed for five hundred years to jam a stick into the endlessly turning spokes of time, culture and consumption and yell, ‘Stop! I want to get off!’”
There’s the rub. What Magill offers doesn’t necessarily strike me as a move towards sincerity, but it is definitely a nod to nostalgia. Consider how he recapitulates his argument in the article:
One need really only look at what counts as inventive new music, film, or art. Much of it is stripped down, bare, devoid of over-production, or aware of its production—that is, an irony that produces sincerity. Sure, pop music and Jeff Koons alike retain huge pull (read: $$$), but lately there has been a return to artistic and musical genres that existed prior to the irony-debunking of 9/11: early punk, disco, rap, New Wave—with a winking nod to sparse Casio keyboard sounds, drum machines, naïve drawing, fake digital-look drawings, and jangly, Clash-like guitars. Bands like Arcade Fire, Metric, Scissor Sisters, CSS, Chairlift, and the Temper Trap all go in for heavy nostalgia and an acknowledgement of a less self-conscious, more D.I.Y. time in music.
Here, Magill is very selectively parsing the recent history of “indie music,” ignoring a particularly striking embrace of artificial pop music that happened alongside the rise of the “sincere” genres, like new folk, that he favors. There’s no reason to assume that Jeff Koons’s blown up balloon animals or Andy Warhol’s Brillo Boxes are any less sincere than the Scissor Sisters’ camp disco, just as there is no reason to assume that a desire to return to nature is any less sincere than the move into the city. Although Magill makes a good argument for the hipster’s cultural purpose, that purpose is not itself evidence that the hipster is expressing what’s truly inside himself, just as there’s no way for you to be sure that I am sincerely expressing my feelings about Sincerity. Magill, ultimately, makes the same mistake as Wampole, in that he judges with no evidence; the only person you can accurately identify as sincere is yourself.
On election night, as he was speaking to the crowd assembled at McCormack Place in Chicago, President Barack Obama took a moment to thank “all the people who voted in this election,” and, in particular, those “who voted for the first time or waited in line for a very long time. By the way” he added, “we have to fix that.”
Although there have been questions over the last few election cycles about attempts to restrict the franchise, the inefficiency of the process for many of those who do vote is, arguably, a much more important issue; many voters in Ohio, Michigan and Florida waited in line for an hour or more yesterday before actually being able to cast their ballots.
This wait, which is simply annoying for some, makes voting onerous for others—while few people left voting lines once they entered them, it is certainly a possibility that some saw the lines outside the polling place and chose not to queue up at all.
Many states, as diverse as Illinois and Texas, have chosen to combat this electoral gridlock by offering some form of early voting, and both Washington and Oregon conduct their elections exclusively by mail. But because these methods spread thin both the place and the time of the election, they dissolve even the illusion that the act of voting is a public one and that the election itself is the decision-making process of a community. Samuel Goldman offers a slightly different solution: “make the first Tuesday after the first Monday in November a federal holiday.”
Most of the practical obstacles to voting are rooted in the fact that Tuesdays are workdays. If more citizens had the day off, they’d have less need of absentee ballots, early voting, extended poll hours, and the rest of the mess.
Declaring Election Day a federal holiday wouldn’t force private employers to close for the day. I suspect that many would, however, particularly if Election Day replaced one of the holidays already on the calendar.
The benefits of making Election Day a holiday go beyond access. Doing so would also provide an opportunity for demonstrations, celebrations, protests, and encounters with our neighbors. In the 18th century, elections were the occasion for speeches, feasts, games, and, occasionally, drunken riots. We wouldn’t want to bring back the riots. Yet there’s no reason that the rest shouldn’t become part of our public culture again. Independence Day is wonderful. But I’d rather see marching bands leading the way to the polls than to the fireworks.
As it is, voting tends to be limited to the hours before and after the working day, and any celebration of the electoral process is limited to the supporters of successful candidates. Turning the first Tuesday after the first Monday in November into a national holiday, and using that day to come together not only to vote but also to publicly encourage the act of voting and to praise the voter, can only serve to involve more people in the American political process.