San Jose State University is experimenting with a program where students pay a reduced fee for online courses run by the private firm Udacity. Teachers and their unions are in retreat across the nation. And groups like Uncollege insist that schools and universities are unnecessary. At a time when teachers are everywhere on the defensive, it is great to read this opening salvo from Leon Wieseltier:
When I look back at my education, I am struck not by how much I learned but by how much I was taught. I am the progeny of teachers; I swoon over teachers. Even what I learned on my own I owed to them, because they guided me in my sense of what is significant.
I share Wieseltier’s reverence for educators. Eric Rothschild and Werner Feig lit fires in my brain while I was in high school. Austin Sarat taught me to teach myself in college. Laurent Mayali introduced me to the wonders of history. Marianne Constable pushed me to be a rigorous reader. Drucilla Cornell fired my idealism for justice. And Philippe Nonet showed me how much I still had to know and inspired me to read and think ruthlessly in graduate school. Like Wieseltier, I can trace my life’s path through the lens of my teachers.
The occasion for such a welcome love letter to teachers is Wieseltier’s vehement rejection of homeschooling and unschooling, two movements that he argues denigrate teachers. As sympathetic as I am to his paean to pedagogues, Wieseltier’s rejection of all alternatives to conventional education today is overly defensive.
For all their many ills, homeschooling and unschooling are two movements that seek to personalize and intensify the often conventional and factory-like educational experience of our nation’s high schools and colleges. According to Wieseltier, these alternatives are possessed of the “demented idea that children can be competently taught by people whose only qualifications for teaching them are love and a desire to keep them from the world.” These movements believe that young people can “reject college and become ‘self-directed learners.’” For Wieseltier, the claim that people can teach themselves is both an “insult to the great profession of pedagogy” and a romantic over-estimation of the “untutored ‘self.’”
The romance of the untutored self is strong, but hardly dangerous. While today educators like Will Richardson and entrepreneurs like Dale Stephens celebrate the abundance of the internet and argue that anyone can teach themselves with nothing more than an internet connection, that dream has a history. Consider this endorsement of autodidactic learning from Ray Bradbury, from long before the internet:
Yes, I am. I’m completely library educated. I’ve never been to college. I went down to the library when I was in grade school in Waukegan, and in high school in Los Angeles, and spent long days every summer in the library. I used to steal magazines from a store on Genesee Street, in Waukegan, and read them and then steal them back on the racks again. That way I took the print off with my eyeballs and stayed honest. I didn’t want to be a permanent thief, and I was very careful to wash my hands before I read them. But with the library, it’s like catnip, I suppose: you begin to run in circles because there’s so much to look at and read. And it’s far more fun than going to school, simply because you make up your own list and you don’t have to listen to anyone. When I would see some of the books my kids were forced to bring home and read by some of their teachers, and were graded on—well, what if you don’t like those books?
In this interview in the Paris Review, Bradbury not only celebrates the freedom of the untutored self, but also dismisses college along much the same lines as Dale Stephens of Uncollege does. Here is Bradbury again:
You can’t learn to write in college. It’s a very bad place for writers because the teachers always think they know more than you do—and they don’t. They have prejudices. They may like Henry James, but what if you don’t want to write like Henry James? They may like John Irving, for instance, who’s the bore of all time. A lot of the people whose work they’ve taught in the schools for the last thirty years, I can’t understand why people read them and why they are taught. The library, on the other hand, has no biases. The information is all there for you to interpret. You don’t have someone telling you what to think. You discover it for yourself.
What the library and the internet offer is unfiltered information. For the autodidact, that is all that is needed. Education is a self-driven exploration of the database of the world.
Of course such arguments are elitist. Not everyone is a Ray Bradbury or a Gottfried Wilhelm Leibniz, who taught himself Latin in a few days. Hannah Arendt refused to go to her high school Greek class because it was offered at 8 am—too early an hour for her mind to wake up, she claimed. She learned Greek on her own. For such people self-learning is an option. But even Arendt needed teachers, which is why she went to Freiburg to study with Martin Heidegger. She had heard, she later wrote, that thinking was happening there. And she wanted to learn to think.
What is it that teachers teach when they are teaching? To answer “thinking” or “critical reasoning” or “self-reflection” is simply to open more questions. And yet these are the crucial questions we need to ask. At a time when education is increasingly confused with information delivery, we need to articulate and promote the dignity of teaching.
What is most provocative in Wieseltier’s essay is his civic argument for a liberal arts education. Education, he writes, is the salvation of both the person and the citizen. Indeed it is the bulwark of a democratic politics:
Surely the primary objectives of education are the formation of the self and the formation of the citizen. A political order based on the expression of opinion imposes an intellectual obligation upon the individual, who cannot acquit himself of his democratic duty without an ability to reason, a familiarity with argument, a historical memory. An ignorant citizen is a traitor to an open society. The demagoguery of the media, which is covertly structural when it is not overtly ideological, demands a countervailing force of knowledgeable reflection.
That education is the answer to our political ills is an argument heard widely. During the recent presidential election, the candidates frequently appealed to education as the panacea for everything from our flagging economy to our sclerotic political system. Wieseltier trades in a similar argument: A good liberal arts education will yield critical thinkers who will thus be able to parse the obfuscation inherent in the media and vote for responsible and excellent candidates.
I am skeptical of arguments that imagine education as a panacea for politics. Behind such arguments is usually the unspoken assumption: “If X were educated and knew what they were talking about, they would see the truth and agree with me.” There is a confidence here in a kind of rational speech situation (of the kind imagined by Jürgen Habermas) that holds that when the conditions are propitious, everyone will come to agree on a rational solution. But that is not the way human nature or politics works. Politics involves plurality and the amazing thing about human beings is that educated or not, we embrace an extraordinary variety of strongly held, intelligent, and conscientious opinions. I am a firm believer in education. But I hold out little hope that education will make people see eye to eye, end our political paralysis, or usher in a more rational polity.
What then is the value of education? And why is it that we so deeply need great teachers? Hannah Arendt saw education as “the point at which we decide whether we love the world enough to assume responsibility for it.” The educator must love the world and believe in it if he or she is to introduce young people to that world as something noble and worthy of respect. In this sense education is conservative, insofar as it conserves the world as it has been given. But education is also revolutionary, insofar as the teacher must accept that it is the young who will change the world as it is. Teachers simply teach what is, Arendt argued; they leave to the students the chance to transform it.
To teach the world as it is, one must love the world—what Arendt comes to call amor mundi. A teacher must not despise the world or see it as oppressive, evil, and deceitful. Yes, the teacher can recognize the limitations of the world and see its faults. But he or she must nevertheless love the world with its faults and thus lead the student into the world as something inspired and beautiful. To teach Plato, you must love Plato. To teach geology, you must love rocks. While critical thinking is an important skill, what teachers teach is, rather, enthusiasm and a love of learning. The great teachers are the lovers of learning. What they teach, above all, is the experience of discovery. And they do so by learning themselves.
Education is to be distinguished from knowledge transmission. It must also be distinguished from credentialing. And finally, education is not the same as indoctrinating students with values or beliefs. Education is about opening students to the fact of what is, teaching them about the world as it is. It is then up to the student, the young, to judge whether the world that they have inherited is loveable and worthy of retention, or whether it must be changed. The teacher is not responsible for changing the world; rather the teacher nurtures new citizens who are capable of judging the world on their own.
Arendt thus affirms Ralph Waldo Emerson's view that “He only who is able to stand alone is qualified for society.” Emerson’s imperative, to take up the divine idea allotted to each one of us, resonates with Arendt’s Socratic imperative, to be true to oneself. Education, Arendt insists, must risk allowing people their unique and personal viewpoints, eschewing political education and seeking, simply, to nurture independent minds. Education prepares the youth for politics by bringing them into a common world as independent and unique individuals. From this perspective, the progeny of teachers is the educated citizen, someone who is both self-reliant in an Emersonian sense and also part of a common world.
A few weeks ago, Christy Wampole, a professor of French at Princeton, took to the New York Times to point to what she sees as a pandemic of irony, the symptom of a malignant hipster culture which has metastasized, spreading out from college campuses and hip neighborhoods and into the population at large. Last week, author R. Jay Magill responded to Wampole, noting that the professor was a very late entry into an analysis of irony that stretches back to the last gasps of the 20th century, and that even that discourse fits into a much longer conversation about sincerity and irony that has been going on at least since Diogenes.
Of course, this wasn’t Magill’s first visit to this particular arena; his own entry, entitled Sincerity: How a Moral Ideal Born Five Hundred Years Ago Inspired Religious Wars, Modern Art, Hipster Chic, and the Curious Notion That We All Have Something to Say (No Matter How Dull), came out in July. Magill very effectively recapitulates the main point of his book in his article for the Atlantic, but, if you were to read this new summary alone, you would both deny yourself some of the pleasures of Magill’s research and prose and spare yourself some of his less convincing arguments, arguments that, incidentally, supply the thrust of his recent article.
The most interesting chapters of Magill’s book deal with the early history of the rise of sincerity, which he traces back to the Reformation. In Magill’s telling, the word “sincere” enters the record of English in 1533, when an English reformer named John Frith writes, to Sir Thomas More, that John Wycliffe “had lived ‘a very sincere life.’” Before that use, in its origin in Latin and French, the word “sincere” had only been used to describe objects and, now, Frith was using it not only for the first time in English but also to describe a particular individual as unusually true and pure to his self, set in opposition to the various hypocrisies that had taken root within the Catholic Church. Magill sums this up quite elegantly: “to be sincere,” he writes, “was to be reformed.”
Now, this would have been revolutionary enough, since it suggested that a relationship with God required internal confirmation rather than external acclamation—in the words of St. Paul, a fidelity to the spirit of the law and not just the letter. And yet reformed sincerity was not simply a return to the Gospel. In order to be true to one’s self, there must be a self to accord with, an internal to look towards. Indeed, Magill’s history of the idea of sincerity succeeds when it describes the development of the self, and, in particular, that development as variably determined by the internal or the external.
It gets more complicated, however, or perhaps more interesting, when Magill turns towards deceptive presentations of the self, that is, when he begins to talk about insincerity. He begins this conversation with Montaigne, who “comes to sense a definite split between his public and private selves and is the first author obsessed with portraying himself as he really is.” The most interesting appearance of this conversation is an excellent chapter on Jean-Jacques Rousseau, who suggested that people should aspire to self-sameness, should do their best to “reconcile” the self to itself, a demand for authenticity that would come to be fully expressed in Immanuel Kant’s moral law, the command that I must set myself as a law for myself.
Sincerity, the moral ideal first put forth by John Frith, started as the Reformation’s response to the inability of the Catholic Church to enact that particular principle, in other words, its hypocrisy. This follows for each of the movements that Magill writes about, each responding to the hypocrisy of its own moment in a specific way. On this matter he has a very good teacher, Hannah Arendt, an inheritor of Kant, who was himself a reader of Rousseau. Arendt writes, in Crises of the Republic, what might serve as a good summation of one of Magill’s more convincing arguments: “if we inquire historically into the causes likely to transform engagés into enragés, it is not injustice that ranks first, but hypocrisy.”
Still, while what makes the sincerity of Frith (who was burned at the stake) or Wycliffe (whose body was exhumed a half century after his death so that it, too, could be burned) compelling is the turn inwards, it is Rousseau’s substitution of the turn back for that turn inward that appears to interest Magill, who decries “the Enlightenment understanding of the world” that “would entirely dominate the West, relegating Rousseau to that breed of reactionary artistic and political minds who stood against the progress of technology, commerce, and modernization and pined for utopia.”
The whole point is moot; Rousseau was himself a hypocrite, often either unable or unwilling to enact the principles he set out in his writings. As Magill moves forward, though, it becomes clear that he values the turn back as a manifestation of sincerity, as a way of expressing oneself honestly. The last few hundred years in the development of sincerity, it seems, have been about finding new iterations of the past in the self. He writes that the Romantics, a group he seems to favor as more sincere than most, “harbored a desire to escape forward-moving, rational civilization by worshipping nature, emotion, love, the nostalgic past, the bucolic idyll, violence, the grotesque, the mystical, the outcast and, failing these, suicide.” In turn, in his last chapter, Magill writes that hipster culture serves a vital cultural purpose: its “sincere remembrance of things past, however commodified or cheesy or kitschy or campy or embarrassing, remains real and small and beautiful because otherwise these old things are about to be discarded by a culture that bulldozes content once it has its economic utility.”
The hipster, for Magill, is not the cold affectation of an unculture, as Wampole wants to claim, but is instead the inheritor “of the entire history of the Protestant-Romantic-rebellious ethos that has aimed for five hundred years to jam a stick into the endlessly turning spokes of time, culture and consumption and yell, ‘Stop! I want to get off!’”
There’s the rub. What Magill offers doesn’t necessarily strike me as a move towards sincerity, but it is definitely a nod to nostalgia. Consider how he recapitulates his argument in the article:
One need really only look at what counts as inventive new music, film, or art. Much of it is stripped down, bare, devoid of over-production, or aware of its production—that is, an irony that produces sincerity. Sure, pop music and Jeff Koons alike retain huge pull (read: $$$), but lately there has been a return to artistic and musical genres that existed prior to the irony-debunking of 9/11: early punk, disco, rap, New Wave—with a winking nod to sparse Casio keyboard sounds, drum machines, naïve drawing, fake digital-look drawings, and jangly, Clash-like guitars. Bands like Arcade Fire, Metric, Scissor Sisters, CSS, Chairlift, and the Temper Trap all go in for heavy nostalgia and an acknowledgement of a less self-conscious, more D.I.Y. time in music.
Here, Magill is very selectively parsing the recent history of “indie music,” ignoring a particularly striking embrace of artificial pop music that happened alongside the rise of the “sincere” genres, like new folk, that he favors. There’s no reason to assume that Jeff Koons’s blown-up balloon animals or Andy Warhol’s Brillo Boxes are any less sincere than the Scissor Sisters’ camp disco, just as there is no reason to assume that a desire to return to nature is any less sincere than the move into the city. Although Magill makes a good argument for the hipster’s cultural purpose, that purpose is not itself evidence that the hipster is expressing what’s truly inside himself, just as there’s no way for you to be sure that I am sincerely expressing my feelings about Sincerity. Magill, ultimately, makes the same mistake as Wampole, in that he judges with no evidence; the only person you can accurately identify as sincere is yourself.
What precisely do we mean when we use the term “genocide”? Has the word always been associated with the mass killing of individuals on the basis of their group affiliation? Or have there been alternative conceptions of genocide of which we should be aware?
These questions were at the heart of the Hannah Arendt Center’s latest Lunchtime Talk, which occurred amid picturesque snowfall on Wednesday, February 29th. The presenter was Douglas Irvin, a Ph.D. candidate at Rutgers’ Center for the Study of Genocide and Human Rights. Irvin’s talk revolved around the work of the Polish-Jewish lawyer Raphael Lemkin (1900-1959).
After escaping from Nazi-occupied Poland and lecturing at the University of Stockholm, Lemkin emigrated to the U.S., served as an advisor at the Nuremberg Trials, and played a central role in the passage of the 1948 U.N. Genocide Convention. Indeed, Lemkin was the first public figure to use the term “genocide,” which he derived from the Greek root genos (family, race, or tribe) and the Latin root cide (killing).
Lemkin and Arendt were contemporaries with overlapping experiences and interests, but they engaged very little with one another in print (aside, perhaps, from a few allusions and anonymous criticisms). Irvin contends that there are good reasons for this lack of dialogue, since the two differed significantly in their views of genocide and humanity more broadly.
On the one hand, Arendt regarded genocide as a historically recent outgrowth of modern totalitarianism. According to Irvin, this understanding was in keeping with her more general conception of the human cosmos, which ultimately emerged through, and was grounded in, individual interactions within the arena of the polis.
Lemkin, by contrast, regarded genocide as a much older phenomenon, one that was premised not on the destruction of individuals on the basis of their group affiliation, but rather on the annihilation of entire cultural traditions and collective identities. Drawing eclectically on the work of seventeenth-century Spanish theologians, romantic thinkers like Johann Gottfried von Herder, and anthropological understandings of cultures as integrated wholes, Lemkin ultimately defined genocide as a coordinated attack on the conditions that make the lives of nations and other collectivities possible.
In this conception, genocide does not necessarily or inevitably entail the mass killing of a group’s members, but rather turns on concerted efforts to obliterate that group’s institutions, language, religious observance, and economic livelihood. In Irvin’s argument, this approach resonated with the broadly communitarian nature of Lemkin’s thought: human existence was in his estimation defined by interactions between culture-bearing groups, and human freedom could ultimately be secured through the benevolent recognition and protection of cultural pluralism.
Significantly, the U.N. Genocide Convention that Lemkin championed did not incorporate many aspects of his thinking. His ideas encountered strong resistance from the U.S., U.K., and other imperial powers, many of which feared that their treatment of indigenous and colonial populations would qualify as genocide under the standards that Lemkin (and his collaborators) proposed. As a result, our current understanding of genocide is in no small part a byproduct of a diplomatic battle to redefine this legal category in a fashion that would encompass the Nazi Holocaust but not implicate other states (including several of the Allied powers that fought against Germany in World War II). This wrangling has also contributed to the minimal attention that has since been paid to Lemkin’s ideas, which were only rediscovered in a significant way in the early 1990s.
Douglas Irvin’s stimulating talk suggested that such inattention is unfortunate. Whatever one thinks of Lemkin’s effort to inscribe a form of cultural relativity into liberal international law, a more thoughtful understanding of his life and thought can only enrich our understanding of genocide’s career as a concept.
Click here to watch the Douglas Irvin lunchtime talk.