Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.
Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.
It is a new year, not only for Jews celebrating Rosh Hashanah but also for hundreds of thousands of college and university students around the world. Over at Harvard, they invited Nannerl O. Keohane—past President of Wellesley College—to give the new students some advice on how to reflect upon and imagine the years of education that lie before them. Above all, Keohane urges students to take time to think about what they want from their education: “You now have this incredible opportunity to shape who you are as a person, what you are like, and what you seek for the future. You have both the time and the materials to do this. You may think you’ve never been busier in your life, and that’s probably true; but most of you have ‘time’ in the sense of no other duties that require your attention and energy. Shaping your character is what you are supposed to do with your education; it’s not competing with something else. You won’t have many other periods in your life that will be this way until you retire when, if you are fortunate, you’ll have another chance; but then you will be more set in your ways, and may find it harder to change.”
Robin Kelly, writing on the 1963 March on Washington and the March's recent fiftieth-anniversary celebrations, takes a wider view of the original event. It has, he says, taken on the character of a big, feel-good event focused on civil rights and credited with directly producing the Civil Rights Act, when, in fact, many of those people also came to Washington in support of economic equality, and the gritty work of passing laws was accomplished later, amid additional momentum and constraints. It's important to remember, he says, that "big glitzy marches do not make a movement; the organizations and activists who came to Washington, D. C., will continue to do their work, fight their fights, and make connections between disparate struggles, no matter what happens in the limelight."
Robinson Meyer investigates what, exactly, poet Seamus Heaney's last words were. Just before he passed away last week at 74, Heaney, an Irish Nobel Laureate, texted the Latin phrase noli timere, don't be afraid, to his wife. Heaney's son Michael mentioned this in his eulogy for his father, and it was written down and reported as, variously, the correct phrase or the incorrect nolle timore. For Meyer, this mis-recording of the poet's last words is emblematic of the transcriptions and translations Heaney undertook in his own work, and of the further translations and transcriptions we will now engage in because he is gone. "We die," Meyer writes, "and the language gets away from us, in little ways, like a dropped vowel sound, a change in prepositions, a mistaken transcription. Errors in transfer make a literature."
Jay Rosen, who will be speaking at the Hannah Arendt Center’s NYC Lecture Series on Sunday, Oct. 27th at 5pm, has recently suggested that journalism solves the problem of awayness - “Journalism enters the picture when human settlement, daily economy, and political organization grow beyond the scale of the self-informing populace.” C.W. Anderson adds that "awayness" should include alienation from a moment in time as well as from a particular place: "Think about how we get our news today: We dive in and out of Twitter, with its short bursts of immediate information. We click over to a rapidly updating New York Times Lede blog post, with its rolling updates and on-the-ground reports, complete with YouTube videos and embedded tweets. Eventually, that blog post becomes a full-fledged article, usually written by someone else. And finally, at another end of the spectrum, we peruse infographics that can sum up decades of data into a single image. All of these are journalism, in some fashion. But the kind of journalisms they are - what they are for - is arguably very different. They each deal with the problem of context in different ways."
Adam Gopnik makes a case for the study of English, and of the humanities more broadly. His defense is striking because it rejects a recent turn towards their supposed use value, instead emphasizing such study for its own sake: "No sane person proposes or has ever proposed an entirely utilitarian, production-oriented view of human purpose. We cannot merely produce goods and services as efficiently as we can, sell them to each other as cheaply as possible, and die. Some idea of symbolic purpose, of pleasure seeking rather than rent seeking, of Doing Something Else, is essential to human existence. That’s why we pass out tax breaks to churches, zoning remissions to parks, subsidize new ballparks and point to the density of theatres and galleries as signs of urban life, to be encouraged if at all possible. When a man makes a few billion dollars, he still starts looking around for a museum to build a gallery for or a newspaper to buy. No civilization we think worth studying, or whose relics we think worth visiting, existed without what amounts to an English department—texts that mattered, people who argued about them as if they mattered, and a sense of shame among the wealthy if they couldn’t talk about them, at least a little, too. It’s what we call civilization."
The sixth annual fall conference, "Failing Fast: The Crisis of the Educated Citizen"
Olin Hall, Bard College
"Any period to which its own past has become as questionable as it has to us must eventually come up against the phenomenon of language, for in it the past is contained ineradicably, thwarting all attempts to get rid of it once and for all. The Greek polis will continue to exist at the bottom of our political existence...for as long as we use the word 'politics.'"
-Hannah Arendt, "Walter Benjamin: 1892-1940"
Some years ago a mentor told me a story from his days as a graduate student at a prestigious political science department. There was a professor there specializing in Russian politics and Sovietology, an older professor who loved teaching and taught well past the standard age of retirement. His enthusiasm was palpable, and he was well-liked by his students. His most popular course was on Russian politics, and towards the end of one semester, a precocious undergraduate visited during office hours: “How hard is it to learn Russian,” the student asked, “because I’d really like to start.” “Pretty hard,” he said, “but that’s great to hear. What has you so excited about it?” “Well,” said the student, “after taking your course, I’m very inspired to read Marx in the original.” At the next class the professor told this story to all of his students, and none of them laughed. He paused for a moment, then somewhat despondently said: “It has only now become clear to me….that none of you know the first thing about Karl Marx.”
The story has several morals. As a professor, it reminds me to be careful about assuming what students know. As a student, it reminds me of an undergraduate paper I wrote which spelled Marx’s first name with a “C.” My professor kindly marked the mistake, but today I can better imagine her frustration. And if the story works as a joke, it is because we accept its basic premise, that knowledge of foreign languages is important, not only for our engagement with texts but with the world at large. After all, the course in question was not about Marx.
The fast approach of the Hannah Arendt Center’s 2013 Conference on “The Educated Citizen in Crisis” offers a fitting backdrop to consider the place of language education in the education of the citizen. The problem has long been salient in America, a land of immigrants and a country of rich cultural diversity; and debates about the relation between the embrace of English and American assimilation continue to draw attention. Samuel Huntington, for example, recently interpreted challenges to English preeminence as a threat to American political culture: “There is no Americano dream,” he writes in “The Hispanic Challenge,” “There is only the American dream created by an Anglo-Protestant society. Mexican Americans will share in that dream and in that society only if they dream in English.” For Huntington English is an element of national citizenship, not only as a language learned, but as an essential component of American identity.
This might be juxtaposed with Tracy Strong’s support of learning (at least a) second language, including Latin, as an element of democratic citizenship. A second language, writes Strong (see his “Language Learning and the Social Sciences”) helps one acquire “what I might call an anthropological perspective on one’s own society,” for “An important achievement of learning a foreign language is learning a perspective on one’s world that is not one’s own. In turn, the acquisition of another perspective or even the recognition of the legitimacy of another perspective is, to my understanding, a very important component of a democratic political understanding.” Strong illustrates his point with a passage from Hannah Arendt’s “Truth and Politics”: “I form an opinion,” says Arendt, “by considering a given issue from different viewpoints, by making present to my mind the standpoints of those who are absent: that is, I represent them.”
Hannah Arendt’s deep respect for the American Constitution and American political culture, manifest no less (perhaps even more!) in her criticism than her praise, is well known. After fleeing Nazi Germany and German-occupied France, Arendt moved to the United States where she became a naturalized citizen in 1951. And her views on the relation between the English language and American citizenship are rich and complex.
In “The Crisis in Education” Arendt highlights how education plays a unique political role in America, where “it is obvious that the enormously difficult melting together of the most diverse ethnic groups…can only be accomplished through the schooling, education, and Americanization of the immigrants’ children.” Education prepares citizens to enter a common world, of which English in America is a key component: “Since for most of these children English is not their mother tongue but has to be learned in school, schools must obviously assume functions which in a nation-state would be performed as a matter of course in the home.”
At the same time, Arendt’s own embrace of English is hardly straightforward. In a famous 1964 interview she says: “The Europe of the pre-Hitler period? I do not long for that, I can tell you. What remains? The language remains. […] I have always consciously refused to lose my mother tongue. I have always maintained a certain distance from French, which I then spoke very well, as well as from English, which I write today […] I write in English, but I have never lost a feeling of distance from it. There is a tremendous difference between your mother tongue and another language…The German language is the essential thing that has remained and that I have always consciously preserved.”
Here Arendt seems both with and against Huntington. On one hand, learning and embracing English—the public language of the country—is what enables diverse Americans to share a common political world. And in this respect, her decision to write and publish in English represents one of her most important acts of American democratic citizenship. By writing in English, Arendt “assumes responsibility for the world,” the same responsibility that education requires from its educators if they are to give the younger generation a common world, but which she finds sorely lacking in “The Crisis in Education.”
At the same time, though, Arendt rejects the idea that American citizenship requires treating English as if it were a mother tongue. Arendt consciously preserves her German mother tongue as both an element of her identity and a grounding of her understanding of the world, and in 1967 she even accepted the Sigmund Freud Award of the German Academy of Language and Poetry that “lauded her efforts to keep the German language alive although she had been living and writing in the United States for more than three decades” (I quote from Frank Mehring’s 2011 article “‘All for the Sake of Freedom’: Hannah Arendt’s Democratic Dissent, Trauma, and American Citizenship”). For Arendt, it seems, it is precisely this potentiality in America—for citizens to share and assume responsibility for a common world approached in its own terms, while also bringing to bear a separate understanding grounded by very different terms—that offers America’s greatest democratic possibilities. One might suggest that Arendt’s engagement with language, in her combination of English responsibility and German self-understanding, offers a powerful and thought-provoking model of American democratic citizenship.
What about the teaching of language? In “The Crisis in Education” Arendt is critical of the way language, especially foreign language, is taught in American schools. In a passage worth quoting at length she says:
“The close connection between these two things—the substitution of doing for learning and of playing for working—is directly illustrated by the teaching of languages; the child is to learn by speaking, that is by doing, not by studying grammar and syntax; in other words he is to learn a foreign language in the same way that as an infant he learned his own language: as though at play and in the uninterrupted continuity of simple existence. Quite apart from the question of whether this is possible or not…it is perfectly clear that this procedure consciously attempts to keep the older child as far as possible at the infant level.”
Arendt writes that such “pragmatist” methods intend “not to teach knowledge but to inculcate a skill.” Pragmatic instruction helps one to get by in the real world; but it does not allow one to love or understand the world. It renders language useful, but reduces language to an instrument, something easily discarded when no longer needed. It precludes philosophical engagement and representative thinking. The latest smartphone translation apps render such a skill superfluous.
But how would one approach language differently? And what does this have to do with grammar and syntax? Perhaps there are clues in the passage selected as our quote of the week, culled from Arendt’s 1968 biographical essay about her friend Walter Benjamin. There, Arendt appreciates that Benjamin's study of language abandons any “utilitarian” or “communicative” goals, but approaches language as a “poetic phenomenon.” The focused study of grammar develops different habits than pragmatist pedagogy. In the process of translation, for example, it facilitates an engagement with language that is divorced from practical use and focused squarely on meaning. To wrestle with grammar means to wrestle with language in the pursuit of truth, in a manner that inspires love for language—that it exists—and cross-cultural understanding. Arendt was famous for flexing her Greek and Latin muscles—in part, I think, as a reflection of her love for the world. The study of Greek and Latin is especially amenable to a relationship of love, because these languages are hardly “practical.” One studies them principally to understand, to shed light on the obscure; and through their investigation one discovers the sunken meanings that remain hidden and embedded in our modern languages, in words we speak regularly without realizing all that is contained within them. By engaging these “dead” languages, we more richly and seriously understand ourselves. And these same disinterested habits, when applied to the study of modern foreign languages, can enrich not only our understanding of different worldviews, but our participation in the world as democratic citizens.
San Jose State University is experimenting with a program where students pay a reduced fee for online courses run by the private firm Udacity. Teachers and their unions are in retreat across the nation. And groups like Uncollege insist that schools and universities are unnecessary. At a time when teachers are everywhere on the defensive, it is great to read this opening salvo from Leon Wieseltier:
When I look back at my education, I am struck not by how much I learned but by how much I was taught. I am the progeny of teachers; I swoon over teachers. Even what I learned on my own I owed to them, because they guided me in my sense of what is significant.
I share Wieseltier’s reverence for educators. Eric Rothschild and Werner Feig lit fires in my brain while I was in high school. Austin Sarat taught me to teach myself in college. Laurent Mayali introduced me to the wonders of history. Marianne Constable pushed me to be a rigorous reader. Drucilla Cornell fired my idealism for justice. And Philippe Nonet showed me how much I still had to know and inspired me to read and think ruthlessly in graduate school. Like Wieseltier, I can trace my life’s path through the lens of my teachers.
The occasion for such a welcome love letter to teachers is Wieseltier’s scathing rejection of homeschooling and unschooling, two movements that he argues denigrate teachers. As sympathetic as I am to his paean to pedagogues, Wieseltier’s rejection of all alternatives to conventional education today is overly defensive.
For all their many ills, homeschooling and unschooling are two movements that seek to personalize and intensify the often conventional and factory-like educational experience of our nation’s high schools and colleges. According to Wieseltier, these alternatives are possessed of the “demented idea that children can be competently taught by people whose only qualifications for teaching them are love and a desire to keep them from the world.” These movements believe that young people can “reject college and become ‘self-directed learners.’” For Wieseltier, the claim that people can teach themselves is both an “insult to the great profession of pedagogy” and a romantic over-estimation of the untutored “self.”
The romance of the untutored self is strong, but hardly dangerous. While today educators like Will Richardson and entrepreneurs like Dale Stephens celebrate the abundance of the internet and argue that anyone can teach themselves with nothing more than an internet connection, that dream has a history. Consider this endorsement of autodidactic learning by Ray Bradbury, from long before the internet:
Yes, I am. I’m completely library educated. I’ve never been to college. I went down to the library when I was in grade school in Waukegan, and in high school in Los Angeles, and spent long days every summer in the library. I used to steal magazines from a store on Genesee Street, in Waukegan, and read them and then steal them back on the racks again. That way I took the print off with my eyeballs and stayed honest. I didn’t want to be a permanent thief, and I was very careful to wash my hands before I read them. But with the library, it’s like catnip, I suppose: you begin to run in circles because there’s so much to look at and read. And it’s far more fun than going to school, simply because you make up your own list and you don’t have to listen to anyone. When I would see some of the books my kids were forced to bring home and read by some of their teachers, and were graded on—well, what if you don’t like those books?
In this interview in the Paris Review, Bradbury not only celebrates the freedom of the untutored self, but also dismisses college along much the same lines as Dale Stephens of Uncollege does. Here is Bradbury again:
You can’t learn to write in college. It’s a very bad place for writers because the teachers always think they know more than you do—and they don’t. They have prejudices. They may like Henry James, but what if you don’t want to write like Henry James? They may like John Irving, for instance, who’s the bore of all time. A lot of the people whose work they’ve taught in the schools for the last thirty years, I can’t understand why people read them and why they are taught. The library, on the other hand, has no biases. The information is all there for you to interpret. You don’t have someone telling you what to think. You discover it for yourself.
What the library and the internet offer is unfiltered information. For the autodidact, that is all that is needed. Education is a self-driven exploration of the database of the world.
Of course such arguments are elitist. Not everyone is a Ray Bradbury or a Gottfried Wilhelm Leibniz, who taught himself Latin in a few days. Hannah Arendt refused to go to her high school Greek class because it was offered at 8 am—too early an hour for her mind to wake up, she claimed. She learned Greek on her own. For such people self-learning is an option. But even Arendt needed teachers, which is why she went to Marburg to study with Martin Heidegger. She had heard, she later wrote, that thinking was happening there. And she wanted to learn to think.
What is it that teachers teach when they are teaching? To answer “thinking” or “critical reasoning” or “self-reflection” is simply to open more questions. And yet these are the crucial questions we need to ask. At a time when education is increasingly confused with information delivery, we need to articulate and promote the dignity of teaching.
What is most provocative in Wieseltier’s essay is his civic argument for a liberal arts education. Education, he writes, is the salvation of both the person and the citizen. Indeed it is the bulwark of a democratic politics:
Surely the primary objectives of education are the formation of the self and the formation of the citizen. A political order based on the expression of opinion imposes an intellectual obligation upon the individual, who cannot acquit himself of his democratic duty without an ability to reason, a familiarity with argument, a historical memory. An ignorant citizen is a traitor to an open society. The demagoguery of the media, which is covertly structural when it is not overtly ideological, demands a countervailing force of knowledgeable reflection.
That education is the answer to our political ills is an argument heard widely. During the recent presidential election, the candidates frequently appealed to education as the panacea for everything from our flagging economy to our sclerotic political system. Wieseltier trades in a similar argument: A good liberal arts education will yield critical thinkers who will thus be able to parse the obfuscation inherent in the media and vote for responsible and excellent candidates.
I am skeptical of arguments that imagine education as a panacea for politics. Behind such arguments is usually the unspoken assumption: “If X were educated and knew what they were talking about, they would see the truth and agree with me.” There is a confidence here in a kind of rational speech situation (of the kind imagined by Jürgen Habermas) that holds that when the conditions are propitious, everyone will come to agree on a rational solution. But that is not the way human nature or politics works. Politics involves plurality and the amazing thing about human beings is that educated or not, we embrace an extraordinary variety of strongly held, intelligent, and conscientious opinions. I am a firm believer in education. But I hold out little hope that education will make people see eye to eye, end our political paralysis, or usher in a more rational polity.
What then is the value of education? And why is it that we so deeply need great teachers? Hannah Arendt saw education as “the point at which we decide whether we love the world enough to assume responsibility for it.” The educator must love the world and believe in it if he or she is to introduce young people to that world as something noble and worthy of respect. In this sense education is conservative, insofar as it conserves the world as it has been given. But education is also revolutionary, insofar as the teacher must realize that it is part of the world as it is that young people will change it. Teachers simply teach what is, Arendt argued; they leave to the students the chance to transform it.
To teach the world as it is, one must love the world—what Arendt comes to call amor mundi. A teacher must not despise the world or see it as oppressive, evil, and deceitful. Yes, the teacher can recognize the limitations of the world and see its faults. But he or she must nevertheless love the world with its faults and thus lead the student into the world as something inspiring and beautiful. To teach Plato, you must love Plato. To teach geology, you must love rocks. While critical thinking is an important skill, what teachers impart is rather enthusiasm and a love of learning. The great teachers are the lovers of learning. What they teach, above all, is the experience of discovery. And they do so by learning themselves.
Education is to be distinguished from knowledge transmission. It must also be distinguished from credentialing. And finally, education is not the same as indoctrinating students with values or beliefs. Education is about opening students to the fact of what is. Teaching them about the world as it is. It is then up to the student, the young, to judge whether the world that they have inherited is loveable and worthy of retention, or whether it must be changed. The teacher is not responsible for changing the world; rather the teacher nurtures new citizens who are capable of judging the world on their own.
Arendt thus affirms Ralph Waldo Emerson's view that “He only who is able to stand alone is qualified for society.” Emerson’s imperative, to take up the divine idea allotted to each one of us, resonates with Arendt’s Socratic imperative, to be true to oneself. Education, Arendt insists, must risk allowing people their unique and personal viewpoints, eschewing political education and seeking, simply, to nurture independent minds. Education prepares the youth for politics by bringing them into a common world as independent and unique individuals. From this perspective, the progeny of teachers is the educated citizen, someone who is both self-reliant in an Emersonian sense and also part of a common world.
A few weeks ago, Christy Wampole, a professor of French at Princeton, took to the New York Times to point to what she sees as a pandemic of irony, the symptom of a malignant hipster culture which has metastasized, spreading out from college campuses and hip neighborhoods and into the population at large. Last week, author R. Jay Magill responded to Wampole, noting that the professor was a very late entry into an analysis of irony that stretches back to the last gasps of the 20th century, and that even that discourse fits into a much longer conversation about sincerity and irony that has been going on at least since Diogenes.
Of course, this wasn’t Magill’s first visit to this particular arena; his own entry, entitled Sincerity: How a Moral Ideal Born Five Hundred Years Ago Inspired Religious Wars, Modern Art, Hipster Chic, and the Curious Notion That We All Have Something to Say (No Matter How Dull), came out in July. Magill very effectively recapitulates the main point from his book in his article for the Atlantic, but, if you were to read this new summary alone, you would deny yourself some of the pleasures of Magill’s research and prose, even as you spared yourself some of his less convincing arguments, arguments which, incidentally, supply the thrust of his recent article.
The most interesting chapters of Magill’s book deal with the early history of the rise of sincerity, which he traces back to the Reformation. In Magill’s telling, the word “sincere” enters the record of English in 1533, when an English reformer named John Frith writes, to Sir Thomas More, that John Wycliffe “had lived ‘a very sincere life.’” Before that, in its Latin and French origins, the word “sincere” had been used only to describe objects; Frith was now using it not only for the first time in English but also to describe a particular individual as unusually true and pure to his self, set in opposition to the various hypocrisies that had taken root within the Catholic Church. Magill sums this up quite elegantly: “to be sincere,” he writes, “was to be reformed.”
Now, this would have been revolutionary enough, since it suggested that a relationship with God required internal confirmation rather than external acclamation—in the words of St. Paul, a fidelity to the spirit of the law and not just the letter. And yet reformed sincerity was not simply a return to the Gospel. In order to be true to one’s self, there must be a self to accord with, an internal to look towards. Indeed, Magill’s history of the idea of sincerity succeeds when it describes the development of the self, and, in particular, that development as variably determined by the internal or the external.
It gets more complicated, however, or perhaps more interesting, when Magill turns towards deceptive presentations of the self, that is, when he begins to talk about insincerity. He begins this conversation with Montaigne, who “comes to sense a definite split between his public and private selves and is the first author obsessed with portraying himself as he really is.” The most interesting appearance of this conversation is an excellent chapter on Jean-Jacques Rousseau, who suggested that people should aspire to self-sameness, should do their best to “reconcile” one’s self to one’s self, a demand for authenticity that would come to be fully expressed in Immanuel Kant’s moral law, the command that I must set myself as a law for myself.
Sincerity, the moral ideal first put forth by John Frith, started as the Reformation’s response to the inability of the Catholic Church to enact that particular principle, in other words, its hypocrisy. This follows for each of the movements that Magill writes about, each responding to the hypocrisy of its own moment in a specific way. On this matter he has a very good teacher, Hannah Arendt, an inheritor of Kant, who was himself a reader of Rousseau. Arendt writes, in Crises of the Republic, what might serve as a good summation of one of Magill’s more convincing arguments: “if we inquire historically into the causes likely to transform engagés into enragés, it is not injustice that ranks first, but hypocrisy.”
Still, while what makes the sincerity of Frith (who was burned at the stake) or Wycliffe (whose body was exhumed a half century after his death so that it, too, could be burned) compelling is the turn inwards, it is Rousseau’s substitution of the turn back for that turn inward that appears to interest Magill, who decries “the Enlightenment understanding of the world” that “would entirely dominate the West, relegating Rousseau to that breed of reactionary artistic and political minds who stood against the progress of technology, commerce, and modernization and pined for utopia.”
The whole point is moot; Rousseau was himself a hypocrite, often either unable or unwilling to enact the principles he set out in his writings. As Magill moves forward, though, it becomes clear that he values the turn back as a manifestation of sincerity, as a sort of honest self-expression. The last few hundred years of sincerity's development, it seems, have been spent finding new iterations of the past in the self. He writes that the Romantics, a group he seems to favor as more sincere than most, “harbored a desire to escape forward-moving, rational civilization by worshipping nature, emotion, love, the nostalgic past, the bucolic idyll, violence, the grotesque, the mystical, the outcast and, failing these, suicide.” In turn, in his last chapter, Magill writes that hipster culture serves a vital cultural purpose: its “sincere remembrance of things past, however commodified or cheesy or kitschy or campy or embarrassing, remains real and small and beautiful because otherwise these old things are about to be discarded by a culture that bulldozes content once it has exhausted its economic utility.”
The hipster, for Magill, is not the cold affectation of an unculture, as Wampole wants to claim, but is instead the inheritor “of the entire history of the Protestant-Romantic-rebellious ethos that has aimed for five hundred years to jam a stick into the endlessly turning spokes of time, culture and consumption and yell, ‘Stop! I want to get off!’”
There’s the rub. What Magill offers doesn’t necessarily strike me as a move towards sincerity, but it is definitely a nod to nostalgia. Consider how he recapitulates his argument in the article:
One need really only look at what counts as inventive new music, film, or art. Much of it is stripped down, bare, devoid of over-production, or aware of its production—that is, an irony that produces sincerity. Sure, pop music and Jeff Koons alike retain huge pull (read: $$$), but lately there has been a return to artistic and musical genres that existed prior to the irony-debunking of 9/11: early punk, disco, rap, New Wave—with a winking nod to sparse Casio keyboard sounds, drum machines, naïve drawing, fake digital-look drawings, and jangly, Clash-like guitars. Bands like Arcade Fire, Metric, Scissor Sisters, CSS, Chairlift, and the Temper Trap all go in for heavy nostalgia and an acknowledgement of a less self-conscious, more D.I.Y. time in music.
Here, Magill is very selectively parsing the recent history of “indie music,” ignoring a particularly striking embrace of artificial pop music that happened alongside the rise of the “sincere” genres, like new folk, that he favors. There’s no reason to assume that Jeff Koons’s blown-up balloon animals or Andy Warhol’s Brillo Boxes are any less sincere than the Scissor Sisters’ camp disco, just as there is no reason to assume that a desire to return to nature is any less sincere than the move into the city. Although Magill makes a good argument for the hipster’s cultural purpose, that purpose is not itself evidence that the hipster is expressing what’s truly inside himself, just as there’s no way for you to be sure that I am sincerely expressing my feelings about Sincerity. Magill, ultimately, makes the same mistake as Wampole, in that he judges with no evidence; the only person you can accurately identify as sincere is yourself.
What precisely do we mean when we use the term “genocide”? Has the word always been associated with the mass killing of individuals on the basis of their group affiliation? Or have there been alternative conceptions of genocide of which we should be aware?
These questions were at the heart of the Hannah Arendt Center’s latest Lunchtime Talk, which occurred amid picturesque snowfall on Wednesday, February 29th. The presenter was Douglas Irvin, a Ph.D. candidate at Rutgers’ Center for the Study of Genocide and Human Rights. Irvin’s talk revolved around the work of the Polish-Jewish lawyer Raphael Lemkin (1900-1959).
After escaping from Nazi-occupied Poland and lecturing at the University of Stockholm, Lemkin emigrated to the U.S., served as an advisor at the Nuremberg Trials, and played a central role in the passage of the 1948 U.N. Genocide Convention. Indeed, Lemkin was the first public figure to use the term “genocide,” which he derived from the Greek root genos (family, race, or tribe) and the Latin root cide (killing).
Lemkin and Arendt were contemporaries with overlapping experiences and interests, but they engaged very little with one another in print (aside, perhaps, from a few allusions and anonymous criticisms). Irvin contends that there are good reasons for this lack of dialogue, since the two differed significantly in their views of genocide and humanity more broadly.
On the one hand, Arendt regarded genocide as a historically recent outgrowth of modern totalitarianism. According to Irvin, this understanding was in keeping with her more general conception of the human cosmos, which ultimately emerged through, and was grounded in, individual interactions within the arena of the polis.
Lemkin, by contrast, regarded genocide as a much older phenomenon, one that was premised not on the destruction of individuals on the basis of their group affiliation, but rather on the annihilation of entire cultural traditions and collective identities. Drawing eclectically on the work of seventeenth-century Spanish theologians, romantic thinkers like Johann Gottfried von Herder, and anthropological understandings of cultures as integrated wholes, Lemkin ultimately defined genocide as a coordinated attack on the conditions that make the lives of nations and other collectivities possible.
In this conception, genocide does not necessarily or inevitably entail the mass killing of a group’s members, but rather turns on concerted efforts to obliterate that group’s institutions, language, religious observance, and economic livelihood. In Irvin’s argument, this approach resonated with the broadly communitarian nature of Lemkin’s thought: human existence was in his estimation defined by interactions between culture-bearing groups, and human freedom could ultimately be secured through the benevolent recognition and protection of cultural pluralism.
Significantly, the U.N. Genocide Convention that Lemkin championed did not incorporate many aspects of his thinking. His ideas encountered strong resistance from the U.S., U.K., and other imperial powers, many of which feared that their treatment of indigenous and colonial populations would qualify as genocide under the standards that Lemkin (and his collaborators) proposed. As a result, our current understanding of genocide is in no small part a byproduct of a diplomatic battle to redefine this legal category in a fashion that would encompass the Nazi Holocaust but not implicate other states (including several of the Allied powers that fought against Germany in World War II). This wrangling has also contributed to the minimal attention that has since been paid to Lemkin’s ideas, which were only rediscovered in a significant way in the early 1990s.
Douglas Irvin’s stimulating talk suggested that such inattention is unfortunate. Whatever one thinks of Lemkin’s effort to inscribe a form of cultural relativity into liberal international law, a more thoughtful understanding of his life and thought can only enrich our understanding of genocide’s career as a concept.
Click here to watch the Douglas Irvin lunchtime talk.