The word designating military drones comes from the word for bee, and this is true in language after language around the world. Partly because of this linguistic consistency, it is a common misperception that drones take their name from the buzzing sound unmanned aircraft make as they fill the air. In fact, drones trace their etymological lineage to the male honeybee, which is called a drone. The drone is distinguished from the female worker bees: it does no useful work and has a single function, to impregnate the queen. What unites military drones with their apiary namesakes is not sound, but thoughtless purposefulness.
The beauty of the drone-bee—like the dark beauty of the military drone—is its single-minded purpose. It is a miracle of efficiency, designed to do one thing. The drone-bee is not distracted by the perfume of flowers or the contentment of labor. It is born, lives, and dies with only one task in mind. Similarly, the military drone suffers neither from hunger nor from distraction. It does what it is told. If necessary, it will sacrifice itself for its mission. It is a model of thoughtless efficiency.
A few weeks ago I wrote about Ernst Jünger’s novel The Glass Bees, in which a brilliant inventor produces tiny flying glass bees that offer limitless potential for surveillance and war. Today I turn to Jake Kosek’s recent paper “Ecologies of Empire: On the New Uses of the Honeybee.” Kosek does not cite Jünger’s novel, and yet his article is in many ways its non-fiction sequel. What Kosek sees is that the rise of drones in military strategy is tied deeply to their ability to mimic the activity and demeanor of male honeybees. It is because bees can fly, swarm, and change course, and yet achieve their single purpose absent any intentionality or thinking, that they are so useful in modern warfare.
Bees have long been associated with military endeavors, both metaphorically and literally. Kosek tells us that our word bomb comes from the Greek bombos, which means bee. The first bombs were, it seems, beehives dropped or catapulted into the heart of the enemy camp. Bees today are trained to sniff out toxic chemicals, and beeswax was for generations an essential ingredient in munitions.
In the war on terror, bees have taken on a special significance. The “enemy’s lack of coherence—institutionally, ideologically, and territorially—makes the search for the enemy central to the politics of the war on terror.” War in the war on terror is ever less a contest of armies on the battlefield and ever more a war of knowledge. This means that surveillance—for centuries an important complement to battlefield tactics—comes to occupy the core of the modern war on terror. In this regard, drones are essential, as they can hover in the air unseen for days, gathering essential intelligence on persons, groups, or even whole cities. All the more powerful would be miniature drones that fly unseen at ground level. That is why Kosek writes that “Intelligence gathering [is] not just limited to psychologists, sociologists, lawyers, and military planners, but [has come] to include biologists, anthropologists, epidemiologists, and even entomologists.” What the military use of bees promises is access to information and worlds not previously open to human knowledge. Bees, Kosek writes, are increasingly the model for the modern military.
The advantage of bees lies not simply in their thoughtlessness, but also in their ability to operate as part of a swarm. Current drone technology requires that each drone be controlled by a single pilot. What happens when hundreds of drones must share the airspace around a target? How can drones coordinate their activity? Kosek quotes a private contractor, John Sauter, who says:
“A central aspect of the future of warfare technology is to get networks of machines to operate as self-synchronized war fighting units that can act as complex adaptive systems. . . We want these machines to be fighting units that can operate as reconfigurable swarms that are less mechanical and more organic, less engineered and more grown.”
The point is that drones, be they large or small, must increasingly work in conjunction with each other at a speed and level of nuance that is impossible for human controllers to manage. The result is that we must model the drones of the future on bees.
The scientists working with the Pentagon to create drones that can fly and function like bees are not entomologists, but mathematicians. The DNA of the glass or silicone bees of the future will be complex algorithms inspired by, but ultimately surpassing, the ability of swarms “to coordinate and collect small bits of information that can be synchronized to make collective action by drones possible.” Once this is possible, one controller will be able to manage a single drone “and the others adapt, react, and coordinate with that drone.”
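To make the kind of algorithm Sauter gestures toward concrete, here is a minimal sketch of leader-follower swarm coordination in the boids tradition: one piloted drone is steered toward a target, and the followers adapt, react, and coordinate with it using only local rules. Everything here (the class names, the rule weights) is an illustrative assumption of mine, not any actual military system:

```python
# A minimal, hypothetical sketch of leader-follower swarm coordination.
# All names and parameters are illustrative assumptions, not a real system.
import random


class Drone:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.vx, self.vy = 0.0, 0.0


def step(leader, followers, target, cohesion=0.01, separation=0.05, follow=0.02):
    # Only the leader is "piloted": it steers toward its commanded target.
    leader.vx = 0.1 * (target[0] - leader.x)
    leader.vy = 0.1 * (target[1] - leader.y)
    leader.x += leader.vx
    leader.y += leader.vy

    # Followers obey purely local, bee-like rules; no one pilots them.
    cx = sum(d.x for d in followers) / len(followers)
    cy = sum(d.y for d in followers) / len(followers)
    for d in followers:
        # Cohesion: drift toward the swarm's center of mass.
        d.vx += cohesion * (cx - d.x)
        d.vy += cohesion * (cy - d.y)
        # Separation: back away from any neighbor that is too close.
        for o in followers:
            if o is not d and abs(o.x - d.x) + abs(o.y - d.y) < 1.0:
                d.vx -= separation * (o.x - d.x)
                d.vy -= separation * (o.y - d.y)
        # Adaptation: track the one piloted drone.
        d.vx += follow * (leader.x - d.x)
        d.vy += follow * (leader.y - d.y)
        # Damping keeps the discrete update stable.
        d.vx *= 0.9
        d.vy *= 0.9
        d.x += d.vx
        d.y += d.vy


if __name__ == "__main__":
    random.seed(0)
    leader = Drone(0, 0)
    flock = [Drone(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(20)]
    for _ in range(200):
        step(leader, flock, target=(50.0, 30.0))
    cx = sum(d.x for d in flock) / len(flock)
    cy = sum(d.y for d in flock) / len(flock)
    print(f"leader ({leader.x:.1f}, {leader.y:.1f}); flock center ({cx:.1f}, {cy:.1f})")
```

The point of the sketch is that the "intelligence" lives in a handful of local rules rather than in any individual follower, which is precisely the thoughtless, bee-like purposefulness described above.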
Kosek’s article is provocative and fascinating. His ruminations on empire strike me as overdone. But his insights into how our training and use of bees has transformed the bee, and how bees are serving as models and inspiration for new ways of fighting wars and solving problems, are important. So too is his imagination of the bee as the six-legged soldier of the future. Whether the drones of the future are cyborg bees (as some in Kosek’s article suggest) or mechanical bees as Jünger imagined half a century ago, thinking about the impact of drones on warfare and human life is enriched by this meditation on the male honeybee. For your weekend read, I offer you Jake Kosek’s “Ecologies of Empire: On the New Uses of the Honeybee.”
My girlfriend and I walked by a clothing storefront, noticed the print on some of the t-shirts in the lower right corner of the window, and went in. She had mentioned this Imaginary Foundation (IF) before. They make print t-shirts.
I went to school at an expensive liberal arts college in the Hudson Valley—everyone there makes print t-shirts. It is the business you start as a college sophomore to convince yourself that you are a ‘creative entrepreneur’ before you enter the corporate world or (alternatively, as a penance for inherited culture and comfort) the not-for-profit world.
Often, I cannot stand them—the print t-shirts. There is something out of shape about them, as if the juxtaposition of body/shirt/image sets askew some intrinsic agreement in the marriage of fashion and identity. And yet, the IF designs spoke to me. There is something dreamy and yet sincere about these prints. If le petit prince were looking for a print t-shirt, he would buy one of these.
It just so happened that the owner of the company was visiting this Seattle distributor and was in the store. He was awkward, skittish and European. I liked him, and before we left I told him that I blog for a thinking and humanities institute out east and may want to write about his brand. That’s how I got into the Imaginary Foundation.
The shirts are not exactly ‘pretty’ or ‘fashionable’; rather, their attraction is a gesture beyond themselves—a rare feat in a culture that positions branding as the apex of success. I’ll describe one shirt, and if you are interested you can invest your own time in the Imaginary Foundation.
The “Being There” shirt has three anonymous human heads (one in a cloud suit, one in a water suit, and one in a fire suit). The heads are seen in profile and aligned with a slight skew (allowing us a view of all three faces) as they break through a wall, the veil of the universe.
Other shirts handle concepts of psychosis and love (“Love Science”), science and discovery in a reach towards heaven (“Reach”), and other such concepts widely considered esoteric or cliché within the lens of our popular culture. But we no longer understand what a ‘cliché’ is. I have long held the view that a cliché is a truth, or a point of interest and perceptive insight, that has simply been worn out by overexposure. But who has worn it out? How have we taken the liberty and quiet pleasure of the private sphere (the realms of reflection, contemplation, and meditation as the Greeks thought of them) out of our living cycle, our consciousness, our daily existence?

Why is the call for private contemplation no longer a necessity of existence? It seems we should have more time than ever for such practices. So many of our daily chores, our basic needs, are met through the economic matrix. I no longer have to chop wood for warmth, hunt a boar for food, trek down to the river simply for water, etc. Why shouldn’t I spend more time in private contemplation, or even public conversation, on these more subtle topics of human necessity? Why shouldn’t I be making something in an effort to communicate those private necessities? The actualization of the humanist requires space for such a practice. And yet, anything that requires a slowing down, a calling for the work of the mind and private reasoning, is now, quite often immediately, labeled a cliché.
In The Human Condition Arendt writes: “The emancipation of labor and the concomitant emancipation of the laboring classes from oppression and exploitation certainly means progress in the direction of non-violence. It is much less certain that it was also progress in the direction of freedom.” She is not saying that the laboring classes should not have been emancipated. Rather, she is saying that the humanist goal has been blurred by some glitch. Instead of moving towards freedom from wasteful labor (a waste of human power—physical, mental, spiritual), we have instead emancipated labor. Most of us have become imprisoned in a non-sustainable cycle whose forward motion requires ever-increasing consumption and waste. This waste can be seen in terms of power. The core power of the human psyche originates in the liberty of free private thoughts—a psychological space for contemplation, a mapping of one’s stillness that is possible only in the acquisition of free time. Free time is a result of freedom from labor’s necessity. What Arendt’s thoughts gesture towards is that the set of basic necessities from which we have been freed has been replaced by another, far more complicated and disguised set—the necessity to perpetuate a system that is moving much faster than us, a necessity to consume and continue consuming. To be ‘a part of’ is, today, to be a consumer—to take one’s place in the labor of waste.
Oh right, I wanted to tell you about a product...
“IF” is a creative project. It gains the viewer’s attention and borrows the imagination. This is a beginning. It does not steal; it borrows. It suggests the prospect of resonance rather than ownership.
I checked out the company website. The “about” page describes the development of the Imaginary Foundation: “a think tank from Switzerland that does experimental research on new ways of thinking and the power of the imagination. They hold dear a belief in human potential and seek progress in all directions.” The page is dotted with black-and-white images from the sixties, shaggy-haired men and turtleneck-clad women engaged in contemplative, laissez-faire, light-spirited dialogue. The imaginary director of the foundation is described as a “70-something uber-intellectual whose father founded the Dadaist movement.” The foundation is imaginary. It is a base, a canvas, for the products (the t-shirts) and the ideas behind them.
The blog section of the site imagines a list of contributors: Isadore Muggll, Kamilla Rousseau, etc. These architects, as the backstory goes, are also imaginary. “IF” is a fictional foundation for the product. But the product is real and engaging.
What is captured here goes beyond the tangible properties of the product (t-shirts). It is about what the product delivers—the wonder of creativity and science, the archetypes of the IF. Imagination IS the foundation of this product.
The blog itself is a venue for artists who marry technology and art, as well as for other thought-provoking materials. The image at the head of this article is taken from the blog. Cloud, idea, light, community, play—IF: all these are represented in the Cloud installation, a discovery the Imaginary Foundation brought me to.
I once taught a course on the development of contemporary advertising, heavily focused on Edward Bernays and the peripheral route of persuasion. Bernays was Sigmund Freud’s nephew, Woodrow Wilson’s image advisor, the father of the term “Public Relations,” and the architect of the Torches of Freedom (Lucky Strike) campaign, among many others. His theory, though terribly simplified here, was that the modern consumer does not purchase with his mind; rather, he defers to his emotions in most choices. The rational actor is a fiction. If consumerism became god, branding became its religion.
Ad campaigns have become remarkably creative, and even, at times, beautiful. Have you ever felt the urge to cry during a Jeep commercial? Many have. I think I have. The central conceptual premise of the AMC show Mad Men depends upon this tension: between art and consumption; the rendering from black and white to color; the effective marketing and selling off of the human experience. In question is the art aspect of advertising. It is at the core of Don Draper’s motivations, and it is what, despite his many character failings, keeps endearing him to us. Ultimately we are asking: will he reconcile his artistic urge (his private motivation) with his office at the homunculus of the consumerism model (his role in the corporate arena)? Exposed is a manipulation, an incongruence, an infidelity in the marriage of advertising and art. Whereas art points towards something beyond itself, beyond even the image and the medium, the ad campaign points to only one purpose—back into itself. No idea behind it. Nothing living. It consumes.
Advertising is like the Ouroboros, the dragon that swallows its own tail; having entirely swallowed itself, the modern advertising campaign defies the laws of balance. It is only the relentless, hungry serpent head of consumption, devoid of the body of life. The only urge driving it is to possess.
It is the difference between the work of Egon Schiele and Penthouse, the writings of Georges Bataille and a GoDaddy.com Super Bowl campaign.
Seduce -> consume. This is the current mandate of the ad campaign. But this relationship is sustainable only through incompletion. It requires continual doses. Seduce -> consume -> feel a lack even in the possession of the product (contract unfulfilled) -> be seduced again -> consume. Ad infinitum. A terrible loop.
How can consumerism and individual consciousness (the most private sector) be made sustainable? Is it possible for a product to speak beyond itself? To fulfill the promise of its persuasion? And if it could, what would that mean for us?
Here I position the word sustainability to face two directions. In part it refers to what Arendt terms “worldly”: the creation produced through work and not labor, something that has the potential to last beyond the productions of time, something that maneuvers into the arena of the eternal. I also want to posit the word in terms of its evolving contemporary potential. The one sector of the public and political sphere that provides a platform for this conversation is the environmental movement. It is where we have begun to contemplate the world beyond the shortsighted view of individual lifetimes. We speak of the sustainability of our planet; we are considering new ways to move our habits from wasteful and consumptive towards lasting and sustainable power. It is a fairly new conversation, and the word “sustainability” is evolving with each new perspective we bring to it.
Sustainability goes beyond consumer awareness. It is about the awareness of the product, how a brand gains consciousness. I need to explore here a definition of “consciousness.”
I have come to understand definitions as ever evolving in accordance with society and the pressures put upon it by the conditions of the time, the fractals of our world (more simply put, the culture stew).
Consciousness is the expanding of space into which one can resonate. To learn of the world around us, to acknowledge it, to consider its multiple dimensions, is to become more conscious -- to create space into which we can move by the will of our imagination and invention.
The Imaginary Foundation is an example of this bridge. It acknowledges itself and its fiction. It allows for play. It is a small company that uses the fabrication of its narrative to bring the consumer’s attention to the mimetic principles behind its product. Revealing the architects’ conceit brings me (the consumer) into co-authorship of the story. It endears itself to me. We do not only consume the product; we consume the narrative of the product. Even if I do not purchase, if I am thinking about it and talking about it, I have bought in. If it generates new ideas and deeper thoughts, then I have begun to take ownership of the product. I consume the myth, I begin to co-author it—I don it in the neural network of culture. And thus the product has gained consciousness, has begun to be carried beyond the object—it resonates.
My study of this product is limited. I am not encouraging anyone here to purchase a shirt. I have not purchased a shirt. What I think this opens up is a table for negotiations between the current consumerism model and individual consciousness—an opportunity to examine sustainable consumerism in all its implications.
San Jose State University is experimenting with a program where students pay a reduced fee for online courses run by the private firm Udacity. Teachers and their unions are in retreat across the nation. And groups like Uncollege insist that schools and universities are unnecessary. At a time when teachers are everywhere on the defensive, it is great to read this opening salvo from Leon Wieseltier:
When I look back at my education, I am struck not by how much I learned but by how much I was taught. I am the progeny of teachers; I swoon over teachers. Even what I learned on my own I owed to them, because they guided me in my sense of what is significant.
I share Wieseltier’s reverence for educators. Eric Rothschild and Werner Feig lit fires in my brain while I was in high school. Austin Sarat taught me to teach myself in college. Laurent Mayali introduced me to the wonders of history. Marianne Constable pushed me to be a rigorous reader. Drucilla Cornell fired my idealism for justice. And Philippe Nonet showed me how much I still had to know and inspired me to read and think ruthlessly in graduate school. Like Wieseltier, I can trace my life’s path through the lens of my teachers.
The occasion for such a welcome love letter to teachers is Wieseltier’s ferocious rejection of homeschooling and unschooling, two movements that he argues denigrate teachers. As sympathetic as I am to his paean to pedagogues, Wieseltier’s rejection of all alternatives to conventional education today is overly defensive.
For all their many ills, homeschooling and unschooling are two movements that seek to personalize and intensify the often conventional and factory-like educational experience of our nation’s high schools and colleges. According to Wieseltier, these alternatives are possessed of the “demented idea that children can be competently taught by people whose only qualifications for teaching them are love and a desire to keep them from the world.” These movements believe that young people can reject college and become “self-directed learners.” For Wieseltier, the claim that people can teach themselves is both an “insult to the great profession of pedagogy” and a romantic over-estimation of the untutored “self.”
The romance of the untutored self is strong, but hardly dangerous. While today educators like Will Richardson and entrepreneurs like Dale Stephens celebrate the abundance of the internet and argue that anyone can teach themselves with nothing more than an internet connection, that dream has a history. Consider this endorsement of autodidactic learning by Ray Bradbury, from long before the internet:
Yes, I am. I’m completely library educated. I’ve never been to college. I went down to the library when I was in grade school in Waukegan, and in high school in Los Angeles, and spent long days every summer in the library. I used to steal magazines from a store on Genesee Street, in Waukegan, and read them and then steal them back on the racks again. That way I took the print off with my eyeballs and stayed honest. I didn’t want to be a permanent thief, and I was very careful to wash my hands before I read them. But with the library, it’s like catnip, I suppose: you begin to run in circles because there’s so much to look at and read. And it’s far more fun than going to school, simply because you make up your own list and you don’t have to listen to anyone. When I would see some of the books my kids were forced to bring home and read by some of their teachers, and were graded on—well, what if you don’t like those books?
In this interview in the Paris Review, Bradbury not only celebrates the freedom of the untutored self, but also dismisses college along much the same lines as Dale Stephens of Uncollege does. Here is Bradbury again:
You can’t learn to write in college. It’s a very bad place for writers because the teachers always think they know more than you do—and they don’t. They have prejudices. They may like Henry James, but what if you don’t want to write like Henry James? They may like John Irving, for instance, who’s the bore of all time. A lot of the people whose work they’ve taught in the schools for the last thirty years, I can’t understand why people read them and why they are taught. The library, on the other hand, has no biases. The information is all there for you to interpret. You don’t have someone telling you what to think. You discover it for yourself.
What the library and the internet offer is unfiltered information. For the autodidact, that is all that is needed. Education is a self-driven exploration of the database of the world.
Of course such arguments are elitist. Not everyone is a Ray Bradbury or a Gottfried Wilhelm Leibniz, who taught himself Latin in a few days. Hannah Arendt refused to go to her high school Greek class because it was offered at 8 am—too early an hour for her mind to wake up, she claimed. She learned Greek on her own. For such people self-learning is an option. But even Arendt needed teachers, which is why she went to Marburg to study with Martin Heidegger. She had heard, she later wrote, that thinking was happening there. And she wanted to learn to think.
What is it that teachers teach when they are teaching? To answer “thinking” or “critical reasoning” or “self-reflection” is simply to open more questions. And yet these are the crucial questions we need to ask. At a time when education is increasingly confused with information delivery, we need to articulate and promote the dignity of teaching.
What is most provocative in Wieseltier’s essay is his civic argument for a liberal arts education. Education, he writes, is the salvation of both the person and the citizen. Indeed it is the bulwark of a democratic politics:
Surely the primary objectives of education are the formation of the self and the formation of the citizen. A political order based on the expression of opinion imposes an intellectual obligation upon the individual, who cannot acquit himself of his democratic duty without an ability to reason, a familiarity with argument, a historical memory. An ignorant citizen is a traitor to an open society. The demagoguery of the media, which is covertly structural when it is not overtly ideological, demands a countervailing force of knowledgeable reflection.
That education is the answer to our political ills is an argument heard widely. During the recent presidential election, the candidates frequently appealed to education as the panacea for everything from our flagging economy to our sclerotic political system. Wieseltier trades in a similar argument: A good liberal arts education will yield critical thinkers who will thus be able to parse the obfuscation inherent in the media and vote for responsible and excellent candidates.
I am skeptical of arguments that imagine education as a panacea for politics. Behind such arguments is usually the unspoken assumption: “If X were educated and knew what they were talking about, they would see the truth and agree with me.” There is a confidence here in a kind of rational speech situation (of the kind imagined by Jürgen Habermas) that holds that when the conditions are propitious, everyone will come to agree on a rational solution. But that is not the way human nature or politics works. Politics involves plurality and the amazing thing about human beings is that educated or not, we embrace an extraordinary variety of strongly held, intelligent, and conscientious opinions. I am a firm believer in education. But I hold out little hope that education will make people see eye to eye, end our political paralysis, or usher in a more rational polity.
What then is the value of education? And why is it that we so deeply need great teachers? Hannah Arendt saw education as “the point at which we decide whether we love the world enough to assume responsibility for it.” The educator must love the world and believe in it if he or she is to introduce young people to that world as something noble and worthy of respect. In this sense education is conservative, insofar as it conserves the world as it has been given. But education is also revolutionary, insofar as the teacher must recognize that it is the young who will change that world. Teachers simply teach what is, Arendt argued; they leave to the students the chance to transform it.
To teach the world as it is, one must love the world—what Arendt comes to call amor mundi. A teacher must not despise the world or see it as oppressive, evil, and deceitful. Yes, the teacher can recognize the limitations of the world and see its faults. But he or she must nevertheless love the world with its faults and thus lead the student into the world as something inspired and beautiful. To teach Plato, you must love Plato. To teach geology, you must love rocks. Critical thinking is an important skill, but what teachers really teach is enthusiasm and the love of learning. The great teachers are lovers of learning. What they teach, above all, is the experience of discovery. And they do so by learning themselves.
Education is to be distinguished from knowledge transmission. It must also be distinguished from credentialing. And finally, education is not the same as indoctrinating students with values or beliefs. Education is about opening students to the fact of what is, teaching them the world as it is. It is then up to the students, the young, to judge whether the world they have inherited is lovable and worthy of retention, or whether it must be changed. The teacher is not responsible for changing the world; rather, the teacher nurtures new citizens who are capable of judging the world on their own.
Arendt thus affirms Ralph Waldo Emerson's view that “He only who is able to stand alone is qualified for society.” Emerson’s imperative, to take up the divine idea allotted to each one of us, resonates with Arendt’s Socratic imperative, to be true to oneself. Education, Arendt insists, must risk allowing people their unique and personal viewpoints, eschewing political education and seeking, simply, to nurture independent minds. Education prepares the youth for politics by bringing them into a common world as independent and unique individuals. From this perspective, the progeny of teachers is the educated citizen, someone who is both self-reliant in an Emersonian sense and also part of a common world.
What is a fact? Few more thorny questions exist. Consider this, from Hannah Arendt’s essay “Truth and Politics”:
But do facts, independent of opinion and interpretation, exist at all? Have not generations of historians and philosophers of history demonstrated the impossibility of ascertaining facts without interpretation, since they must first be picked out of a chaos of sheer happenings (and the principles of choice are surely not factual data) and then be fitted into a story that can be told only in a certain perspective, which has nothing to do with the original occurrence?
Facts are constructed. They are not objective. And there is no clear test for what is a fact. Thus, when Albert Einstein was asked how science can separate fact from fiction, brilliant hypotheses from nutty quackery, he answered: “There is no objective test.” Unlike rational truths, which are true outside of experience and absolute, all factual truths are contingent. They might have been otherwise. That is one reason it is so hard to pin them down.
Steven Shapin reminds us of these puzzles in an excellent essay in this week’s London Review of Books. Shapin is reviewing a new book on Immanuel Velikovsky by Michael Gordin. Velikovsky, for those born since the 1960s, caused an uproar in the 1960s and 70s with his scientific claims that Venus was a piece dislodged from Jupiter, and that, passing near Earth as a comet, it led to the parting of the Red Sea, dislodged the orbit of Mars (threatening Earth), and caused the relocation of the North Pole, not to mention showering onto the earth the plagues of vermin that nourished the Israelites in the desert.
Gordin’s book is about how American scientists went ballistic over Velikovsky. They sought to censor his work and schemed to prevent the publication of his book, Worlds in Collision, at the prestigious Macmillan press. At the center of the controversy was Harvard, where establishment scientists worked assiduously to discredit Velikovsky and stop the circulation of his ideas. [I am sensitive to such issues because I was once the target of a similar suppression campaign. When my book The Gift of Science was about to be published by Harvard University Press, I received a call from the editor. It turned out an established scholar had demanded that HUP not publish my book, threatening to no longer review books for the press, let alone publish with them. Thankfully, HUP resisted that pressure, for which I will always be grateful.]
For these Harvard scientists, Velikovsky was a charlatan peddling a dangerous pseudoscience. The danger in Velikovsky’s claims was more than simple misinformation. His work amounted, above all, to an attack on the very essence of scientific authority. What Velikovsky claimed as science flew in the face of what the scientific community knew to be true. He set himself up as an outsider, a dissident. Which he was. In the wake of totalitarianism, he argued that democratic society must allow for alternative and heretical views. The establishment, Velikovsky insisted, had no monopoly on truth. Let all views out, and let the best one win.
Shapin beautifully sums up the real seduction and danger lurking in Velikovsky’s work.
The Velikovsky affair made clear that there were radically differing conceptions of the political and intellectual constitution of a legitimate scientific community, of what it was to make and evaluate scientific knowledge. One appealing notion was that science is and ought to be a democracy, willing to consider all factual and theoretical claims, regardless of who makes them and of how they stand with respect to canons of existing belief. Challenges to orthodoxy ought to be welcomed: after all, hadn’t science been born historically through such challenges and hadn’t it progressed by means of the continual creative destruction of dogma? This, of course, was Velikovsky’s view, and it was not an easy matter for scientists in the liberal West to deny the legitimacy of that picture of scientific life. (Wasn’t this the lesson that ought to be learned from the experience of science in Nazi Germany and Stalinist Russia?) Yet living according to such ideals was impossible – nothing could be accomplished if every apparently crazy idea were to be given careful consideration – and in 1962 Thomas Kuhn’s immensely influential Structure of Scientific Revolutions commended a general picture of science in which ‘dogma’ (daringly given that name) had an essential role in science and in which ‘normal science’ rightly proceeded not through its permeability to all sorts of ideas but through a socially enforced ‘narrowing of perception’. Scientists judged new ideas to be beyond the pale not because they didn’t conform to abstract ideas about scientific values or formal notions of scientific method, but because such claims, given what scientists securely knew about the world, were implausible. Planets just didn’t behave the way Velikovsky said they did; his celestial mechanics required electromagnetic forces which just didn’t exist; the tails of comets were just not the sorts of body that could dump oil and manna on Middle Eastern deserts. A Harvard astronomer blandly noted that ‘if Dr Velikovsky is right, the rest of us are crazy.'
It is hard to read this account and not think about contemporary debates over global warming, Darwinism, and the fall of the World Trade Center. In all three cases, outsiders and even some dissident scientists have made arguments that have been loudly disavowed by mainstream scientists.
No one has done more to explore the claims of modern pseudoscience than Naomi Oreskes. In her book Merchants of Doubt, written with Erik Conway, Oreskes shows how “a small handful of men” could, for purely ideological reasons, sow doubt about the ‘facts’ regarding global warming and the health effects of cigarettes. In a similar vein, Jonathan Kay has chronicled the efforts of pseudoscientists to argue that there was no possible way the World Trade Towers could have been brought down by jet-fuel fires, thus suggesting and seeking to “prove” that the U.S. government was behind the destruction of 9/11.
Oreskes wants to show, at once, that it is too easy for politically motivated scientists to sow doubt about scientific fact, and also that there is a workable and effective way for the scientific community to patrol the border between science and pseudoscience. What governs that boundary is, in Oreskes’ words, “the scientific consensus.” The argument that global warming is a fact rests on claims about the scientific method: value-free studies, evaluated by a system of peer review, moving towards consensus. Peer review, for Oreskes, “is a crucial part of science.” And yet, as those who engage in it know full well, peer review is also deeply political, subject to petty and not-so-petty disputes, jealousies, and vendettas. For this and other reasons, consensus is, as Oreskes herself admits, not always accurate: “The scientific consensus might, of course, be wrong. If the history of science teaches anything, it is humility, and no one can be faulted for failing to act on what is not known.”
Just as Einstein said 50 years ago, in the matter of establishing scientific fact there is no objective test. This is frustrating. Indeed, it can be dangerous, not only when pseudoscientists sow doubt about global warming, thus preventing meaningful and necessary action, but also when the pervasive and persuasive claims of pseudoscience sow a cynicism that undermines the factual and truthful foundations of human life.
Arendt reminds us, with a clarity rarely equaled, that factual truth is always contingent. “Facts are beyond agreement and consent, and all talk about them—all exchanges of opinion based on correct information—will contribute nothing to their establishment.” Against the pseudoscientific claims of many, science is always a contingent and hypothetical endeavor, one that deals in hypotheses, agreement, and factual proof. Scientific truth is always empirical truth, and the truths of science are, in the end, grounded in consensus.
The trouble here is that scientific truths must—as scientific—claim to be true and not simply an opinion. Science makes a claim to authority that is predicated not upon proof but on the value and meaningfulness of impartial inquiry. It is a value that is increasingly in question.
What the challenge of pseudoscience shows is how tenuous scientific authority and the value placed on disinterested research really are. Such inquiry has not always been valued, and there is no reason to expect it to be valued above partial inquiry in the future. Arendt suggests that the origin of the value of disinterested inquiry was Homer’s decision to praise the Trojans as highly as he lauded the Achaeans. Never before, she writes, had one people been able to look “with equal eyes upon friend and foe.” It was this revolutionary Greek objectivity that became the source for modern science. For those who do value science and understand the incredible advantages it has bestowed upon modern civilization, it is important to recall that this Homeric disinterestedness is neither natural nor necessary. In the effort to fight pseudoscience, we must be willing and able to defend just such a position, and thus what Nietzsche calls the “pathos of distance” must be central to any defense of the modern scientific world.
When science loses its authority, pseudoscience thrives. That is the situation we are increasingly in today. There are no objective tests and no clear lines demarcating good and bad science. And that leaves us with the challenge of the modern age: to pursue truth and establish facts without secure or stable foundations. For that, we need reliable guides whom we can trust. And for that reason, you should read Steven Shapin’s latest essay. It is your weekend read.
Hannah Arendt spoke of having acquired, through her life, a "love of the world." When writing about education she argues that "education is the point at which we decide whether we love the world enough to assume responsibility for it." And in politics, she insists, we must care for and love the world more than ourselves. What then is the world?
The world is related to human making and to the things and artifacts that human beings make. What defines the things of a world is that those things gather individuals together.
In the public realm, a politician is that person who speaks and acts in such a way that those around him come to see those institutions and values that they share and treasure. The common world is the world that emerges when a plurality of people bind themselves to stories, traditions, institutions, rituals, and practices that they share and that they love. Like a table that unites those who sit around it in a common conversation or feast, the common world brings different people together. It stands between them, both joining and separating them.
In the private realm, a world is founded in property, and property has an essential role in the public realm too. For property is what one owns, what is proper to one, and thus defines one over against others in the common world. Property provides the boundaries between people and also serves as the boundary between the commonality of the public realm and the uniqueness of the private realm. It is no accident that the Greek word for law, nomos, derives from nemein, which means to distribute, to possess, and to dwell. Property, in English, also names the laws of propriety, what is right and given to each.
In both the public and the private realms the world consists of things that endure. Worldly things must not only be common. They must also last. Since we must love the world more than our own lives—since we must be willing to pursue the world as an ideal and sacrifice ourselves to the glory and good of the world we share with others—the world must offer us the promise of permanence and thus immortality.
How are we to understand the worldly conditions of permanence and immortality? We might ask: What is a house?
This is one of the many questions at issue in Jonathan Franzen's essay "House For Sale," about his return to his mother's house in Webster Groves, Missouri, to sell it after her death. Here is how Franzen describes his mother's house.
This was the house where, five days a month for ten months, while my brothers and I were going about our coastal lives, she had come home alone from chemotherapy and crawled into bed. The house from which, a year after that, in early June, she had called me in New York and said she was returning to the hospital for more exploratory surgery, and then had broken down in tears and apologized for being such a disappointment to everyone and giving us more bad news. The house where, a week after her surgeon had shaken his head bitterly and sewn her abdomen back up, she'd grilled her most trusted daughter-in-law on the idea of the afterlife, and my sister-in-law had confessed that, in point of sheer logistics, the idea seemed to her pretty far-fetched, and my mother, agreeing with her, had then, as it were, put a check beside the item "Decide about the afterlife" and continued down her to-do list in her usual pragmatic way, addressing other tasks that her decision had rendered more urgent than ever, such as "Invite best friends over one by one and say goodbye to them forever." This was the house from which, on a Saturday morning in July, my brother Bob had driven her to her hairdresser, who was Vietnamese and affordable and who greeted her with the words "Oh, Mrs. Fran, Mrs. Fran, you look terrible," and to which she'd returned, an hour later, to complete her makeover, because she was spending long-hoarded frequent-flyer miles on two first-class tickets, and first-class travel was an occasion for looking her best, which also translated into feeling her best; she came down from her bedroom dressed for first class, said goodbye to her sister, who had traveled from New York to ensure that the house would not be empty when my mother walked away from it—that someone would be left behind—and then went to the airport with my brother and flew to the Pacific Northwest for the rest of her life. Her house, being a house, was enough slower in its dying to be a zone of comfort to my mother, who needed something larger than herself to hold on to but didn't believe in supernatural beings. Her home was the heavy (but not infinitely heavy) and sturdy (but not everlasting) God that she'd loved and served and been sustained by, and my aunt had done a very smart thing by coming when she did.
Franzen offers us a house in many valences.
It was where his mother lived. Where she was sick. Where she thought about dying and God. Where she recovered from surgery and made herself up. Above all, it was his mother's house. Later he writes that the house was "my mother's novel, the concrete story she told about herself." In this house she "pondered the arrangement of paintings on a wall like a writer pondering commas." It was a house in which she showed herself. It was thus an invitation. And "she wanted you to want to stay."
The problem is that Franzen does not want to stay in his mother's house. He grew up in the house, but he resents it. The house his mother made was filled with "sturdy and well made" furniture that "my brothers and I couldn't make ourselves want." He has fled the house and returns only to remove the photos that for his mother made the house hers, to act like a conqueror, he admits, and repossess the house from his mother. But only to then sell it.
If Mrs. Franzen's house is her novel, a house in which she both concealed and showed herself, her son's house in NYC is something else entirely. Here is how Franzen describes his own dwelling place:
I now owned a nice apartment on East Eighty-first Street. Walking in the door, after two months in California, I had the sensation of walking into somebody else's apartment. The guy who lived here was apparently a prosperous middle-aged Manhattanite with the sort of life I'd spent my thirties envying from afar, vaguely disdaining, and finally being defeated in my attempts to imagine my way into. How odd that I now had the keys to this guy's apartment.
"House For Sale" is, amongst other themes like the loss of religion, the loss of family, and the loss of the American middle class, about the loss of the American house. It is also, therefore, in an Arendtian vein, a story about the loss of our world, the property that both hides and nurtures our souls and separates and distinguishes us from our fellow citizens. Denuded of our habitus and property, we are defenseless against the conformity of society. Without desks and bookshelves passed down over generations that fit us, over and against our choices, into a private world, we are consumers who build a temporary bulwark, whether styled by Ikea or the local antique store. Such a house is not meant to last and to be passed down across the generations. It will be used and, eventually, sold or walked away from. With nothing that defines us in a lasting and immortal vein, our lives have no depth or meaning beyond our accomplishments. There is no weight or law that claims us and obligates us. We are free, but adrift, unsure why we are here or what it all means.
I recently encountered Jonathan Franzen's essay within an extraordinary theatrical experience: the play "House For Sale," based on his essay of the same name.
It has been adapted for the stage by Daniel Fish. I have now been to see it twice. The play is hilarious, brutal, and shattering. It makes Franzen's essay come alive in ways miraculous and uplifting. The final scene itself is worth dropping every plan you have, flying to NYC, and rushing to the Duke Theatre on 42nd St. to catch it. I can't recommend this highly enough. But hurry, it is playing for only a few more performances. You can buy tickets here.
Or, if you simply can't get to NYC, buy The Discomfort Zone, Franzen's book of essays in which "House For Sale" originally appeared. It is your weekend read.
In this post, academics and university faculty will be criticized. Railing against college professors has become a common pastime, one practiced almost exclusively by those who have been taught and mentored by the very people now being criticized. It is thus only fair to say upfront that a college education in the United States is, in spite of its myriad flaws, still of incredible value and meaning to tens if not hundreds of thousands of students every year.
That said, too much of what our faculties teach is neither interesting nor wanted by our students.
This is a point that Jacques Berlinerblau makes in a recent essay in the Chronicle of Higher Education.
Observers of gentrification like to draw a distinction between needs and wants. Residents in an emerging neighborhood need dry cleaners, but it's wine bars they really want. The application of that insight to the humanities leads me to an unhappy conclusion: Our students, and the educated public at large, neither want us nor need us.
What is amazing is that not only do our students not want what we offer, but neither do our colleagues. It is a staggering truth that much of what academics write and publish is rarely, if ever, read. And if you want to really experience the problem, attend an academic conference someday, where you will see panels of scholars presenting their work, sometimes to one or two audience members. According to Berlinerblau, the average audience at academic conference panels is fourteen persons.
The standard response to such realizations is that scholarship is timeless. Its value may not be discovered for decades or even centuries, until someone, somewhere, pulls down a dusty volume and reads something that changes the world. There is truth in such claims. When one goes digging in archives, there are pearls of wisdom to be found. What is more, the scholarly process consists of the accumulation of information and insight over generations. In other words, academic research is like basic scientific research: of no immediate use, but valuable in itself.
The problem with this argument is that such genuinely original scholarship is rare and getting rarer. While there are exceptions, little original research is left to do in most fields of the humanities. Few important books are published each year. The vast majority are as derivative as they are unnecessary. We would all do better to read and think about the few important books (obviously there will be some disagreement and divergent schools) than to spend our time trying to establish our expertise by commenting on some small part of those books.
The result of the academic imperative to publish or perish is an increasing specialization that leads to knowing more and more about less and less. This is the source of the irrelevance of much humanities scholarship today.
As Hannah Arendt wrote 50 years ago in her essay On Violence, humanities scholars today are better served by being learned and erudite than by seeking to do original research by uncovering some new or forgotten scrap. While such finds can be interesting, they are exceedingly rare and largely insignificant.
As a result—and this is hard to hear for many in the scholarly community—we simply don't need 200 medieval scholars in the United States, or 300 Rawlsians, or 400 Biblical scholars. It is important that Chaucer and Nietzsche are taught to university students; but the idea that every college and university needs a Chaucer and a Nietzsche scholar to teach Chaucer and Nietzsche is simply wrong. We should, of course, continue to support scholars whose work is genuinely innovative. But more needed are well-read and thoughtful teachers who can teach widely and write for a general audience.
To say that excessively specialized humanities scholarship today is irrelevant is not to say that the humanities are irrelevant. The humanities are that space in the university system where power does not have the last word, where truth and beauty as well as insight and eccentricity reign supreme, and where young people come into contact with the great traditions, writing, and thinking that have made us who we are today. The humanities introduce us to our ancestors and our forebears and acculturate students into their common heritage. It is in the humanities that we learn to judge the good from the bad, and thus where we first encounter the basic moral faculty for making judgments. It is because the humanities teach taste and judgment that they are absolutely essential to politics. It is even likely that the decline of politics today is profoundly connected to the corruption of the humanities.
Hannah Arendt argues precisely for this connection between the humanities and politics in her essay The Crisis in Culture. Part Two of the essay addresses the political significance of culture, which she relates to humanism—both of which are said to be of Roman origin. The Romans, she writes, knew how to care for and cultivate the grandiose political and artistic creations of the Greeks. And it is a line from Pericles that forms the center of Arendt's reflections.
The Periclean citation is translated (in part) by Arendt to say: "We love beauty within the limits of political judgment." The judgment of beauty, of culture, and of art is, Pericles says, limited by the political judgment of the people. There is, in other words, an intimate connection between culture and politics. In culture, we make judgments of taste and thus learn the faculty of judgment so necessary for politics. And political judgment, in turn, limits and guides our cultural judgments.
What unites culture and politics is that they are "both phenomena of the public world." Judgment, the primary faculty of politics, is discovered, nurtured, and practiced in the world of culture and the judgment of taste. What the study of culture through the humanities offers, therefore, is an orientation towards a common world that is known and understood through a common sense. The humanities, Arendt argues, are crucial for the development and preservation of common sense—something that is unfortunately all-too-lacking in much humanities scholarship today.
What this means is that teaching the humanities is absolutely essential for politics—and as long as that is the case, there will be a rationale for residential colleges and universities. The mania for distance learning today is understandable. Education is, in many cases, too expensive. Much could be done more cheaply and efficiently at colleges. And this will happen. Colleges will, increasingly, bring computers and the Internet into their curricula. But as powerful as the Internet is, and as useful as it is as a replacement for passive learning in large lectures, it is not yet a substitute for face-to-face learning that takes place at a college or university. The learning that takes place in the hallways, offices, and dining halls when students live, eat, and breathe their coursework over four years is simply fundamentally different from taking a course online in one's free time. As exciting as technology is, it is important to remember that education is, at its best, not about transmitting information but about inspiring thinking.
Berlinerblau thinks that what will save the humanities is better training in pedagogy. He writes:
As for the tools, let's look at it this way. Much as we try to foist "critical thinking skills" on undergraduates, I suggest we impart critical communication skills to our master's and doctoral students. That means teaching them how to teach, how to write, how to speak in public. It also means equipping them with an understanding that scholarly knowledge is no longer locked up in journals and class lectures. Spry and free, it now travels digitally, where it may intersect with an infinitely larger and more diverse audience. The communicative competences I extoll are only infrequently part of our genetic endowment. They don't come naturally to many people—which is precisely what sets the true humanist apart from the many. She or he is someone you always want to speak with, listen to, and read, someone who always teaches you something, blows your mind, singes your feathers. To render complexity with clarity and style—that is our heroism.
The focus on pedagogy is a mistake; it stems from the flawed assumption that the problem with the humanities is that professors aren't good communicators. It may be true that professors communicate poorly, but the real problem is deeper. If generations of secondary school teachers trained in pedagogy have taught us anything, it is that pedagogical technique alone does not make for good teaching. Authority in the classroom comes from knowledge and insight, not from pedagogical techniques or theories.
The pressing issue is less pedagogy than the fact that what most professors know is so specialized as to be irrelevant. What is needed is not better pedagogical training, but a broader and more erudite training, one that focuses less on original research and academic publishing and instead demands reading widely and writing aimed at an educated yet popular audience. What we need, in other words, are academics who read widely with excitement and inspiration and who speak to the interested public.
More professors should be blogging and writing in public-interest journals. They should be reviewing literature rather than each other's books and, shockingly, they should be writing fewer academic monographs.
To say that the humanities should engage the world does not mean that the humanities should be politicized. The politicization of the humanities has shorn them of their authority and their claim to being true or beautiful. Humanities scholarship can only serve as an incubator for judgment when it is independent from social and political interests. But political independence is not the same as political sterility. Humanities scholarship can, and must, teach us to see and know our world as it is.
There are few essays that better express the worldly importance of the humanities than Hannah Arendt's The Crisis in Culture. It is worth reading and re-reading. On this hot summer weekend, do yourself that favor.
"It is true that storytelling reveals meaning without committing the error of defining it, that it brings about consent and reconciliation with things as they really are, and that we may even trust it to contain eventually by implication that last word which we expect from the Day of Judgment”.
- Hannah Arendt, “Isak Dinesen: 1885–1963,” in Men in Dark Times
According to Arendt, it is through action – and all action is but an act of speech – that human beings disclose themselves in their whoness rather than merely on the basis of their whatness. Her indebtedness to storytelling comes from a twofold source: the Greek world, on the one hand – the poets and the historians – and, on the other, the writings of Isak Dinesen.
Arendt made no sustained theoretical effort to pass Dinesen under the lens of theory, apart from occasional mentions and a literary profile in Men in Dark Times – the book Auden called her most German, because of the form of the epic legend in which its stories of anti-heroes, under the shadow of dark times, are told.
A talented storyteller herself, Arendt wrote books that are better read against this background of storytelling than as exercises in theory. This is not because she was no vehement defender of the life of the mind, but because of her insight into the inability of intellectual traditions and of history to comprehend the events of her century.
Her reading of Dinesen speaks to the difficulties of understanding totalitarianism. The Spanish philosopher Fina Birulés puts it in the following words: "While storytelling does not solve any problem and does not master anything forever, it adds yet another element to the repertory of the world; it is a way for human beings to leave a lasting presence in the world, not as a species, but as a plurality of who's."
The relationship between storytelling and reconciliation is laid out by Arendt through Dinesen: "The reward of storytelling is to be able to let go: 'When the storyteller is loyal to the story, there, in the end, silence will speak. Where the story has been betrayed, silence is but emptiness. But we, the faithful, when we have spoken our last word, will hear the voice of silence.'" To let go is an act of reconciliation.
Arendt writes the story of her own anxiety and melancholy through Dinesen: "That grief of having lost her life and lover in Africa should have made her a writer and given her a sort of second life was best understood as a joke, and 'God loves a joke' became her maxim in the latter part of her life."
Agnes Heller writes that Arendt knows in advance what she wants to find in her storytelling, even if she often finds something unexpected.
Dinesen becomes a play of mirrors for Arendt, who, in writing about Dinesen's own storytelling – which seems artificial and blurs the distinction between truth and fiction – finds, temporarily, the detachment necessary to comprehend the world: "To become an artist also needs time and a certain detachment from the heavy, intoxicating business of sheer living that, perhaps, only the born artist can manage in the midst of living."
The flight into imaginary worlds under Dinesen's pen is not simply a performance and re-enactment of the Gothic – as is, for example, William Beckford's "Vathek" – but rather a coming to terms with the present by telling a story about its burdens.
It is nothing but an anchoring in the present at a time when the foundation of the present itself – the past – seems irrevocably lost. A similar instance of storytelling through mirrors is Susan Sontag's review of Anna Banti's "Artemisia" for the London Review of Books in 2003.
"Artemisia" is a novel, written late in the Second World War, about the life of Artemisia Gentileschi, a 17th-century Italian painter. Banti, trained as an art historian, is meticulously careful in her treatment of the sources on Gentileschi's life and writes in what Sontag calls "a double destiny"; according to Sontag, Banti does not find herself in Artemisia and is careful enough to write with the detachment of the third person, available only to the truly committed storyteller in a game of hide and seek: "We are playing a chasing game, Artemisia and I."
More than a biography or a historical novel, "Artemisia" is a deeply emotional yet sober and detached portrait of a woman in the early 17th century, tainted by the scandal of a rape that disgraced her family, haunted no less by her total commitment to art than by the immense loneliness of living as an artist in a male-dominated world – and yet told with more grace than resentment.
The story about Banti and Artemisia that Sontag tells is one of permanent displacement and loss, not only because of the woman's story being told but because the original manuscript of the novel was lost under the ruins of Banti's house in Borgo San Jacopo when the mines detonated by the Germans wrecked the houses near the river, including hers.
Without knowing as much, Sontag writes about Banti in the same way that Arendt writes about Dinesen: behind a story of loss and womanhood there is an affirmative and rather reckless anchoring in the present – in Sontag's case, the world after totalitarianism: the Cold War, Iraq, Afghanistan, 9/11, and Abu Ghraib. It is against this background that she writes about a "phoenix of a novel," a phrase that is in itself a testimony to Sontag's own work.
What both writers learned from their subjects is a bitter lesson in contemporary history, eloquently put by Arendt writing about Dinesen:
Thus, the earlier part of her life had taught her that, while you can tell stories or write poems about life, you cannot make life poetic, live it as though it were a work of art (as Goethe had done) or use it for the realization of an “idea”. Life might contain the “essence” (what else could?); recollection, the repetition in imagination, may decipher the essence and deliver to you the “elixir”; and eventually you may even be privileged to “make” something out of it, “to compound the story”. But life itself is neither essence nor elixir, and if you treat it as such it will only play its tricks on you.
When the Lebanese writer Mira Baz left Yemen in 2011 – in the course of the revolution, just before the deadly "Friday of Dignity" massacre, and after nearly a decade of teaching and writing in that mysterious land – she became displaced and would turn her poetic travelogue of Yemen into a vast vault of memory. Her Yemen is akin to Dinesen's Africa seen through Arendt, or Banti's Florence seen through Sontag: a sort of paradise lost, and not without the heavy taxes levied by the status of paradise.
In March 2012 – exactly a year after the massacre – she wrote about the experience of displacement, invoking the following lines from Dinesen:
“If I know a song of Africa,
Of the giraffe and the African new moon lying on her back,
Of the plows in the field and the sweaty faces of the coffee pickers,
Does Africa know a song of me?”
After which she writes:
The house and the garden had quickly become my home, where in the mornings I fed my regular guests Bulbuls and Serins, and found serenity when, through watching them, I meditated on existence, on cycles, on life, on everything and nothingness. Out there was Yemen. Within the garden walls, and all the walls, was me, inside my head.
Through reading and writing, life cannot be changed, but it can be made understandable and livable – much as John Updike said of the prose of Bruno Schulz: "The harrowing effect of Schulz' prose is to construct the world anew, as from fragments that exist after some unnamable disaster." The disaster is always the turbulence of history, and the unnamable is the loss; but here storytelling becomes a privilege, a sign of truth, and the burden of a presence – a way of entering the world once again, even if it has been lost once.
Fina Birulés concludes her timely meditation on Arendt and Dinesen: "The political function of the narrator – historian or novelist – is to teach acceptance of things as they are. From this acceptance, which might also be called veracity, is born the faculty of judgment, by means of which, in the words of Isak Dinesen, in the end we will have the privilege 'to see and to see again' – and that is what is called the Day of Judgment."
What precisely do we mean when we use the term “genocide”? Has the word always been associated with the mass killing of individuals on the basis of their group affiliation? Or have there been alternative conceptions of genocide of which we should be aware?
These questions were at the heart of the Hannah Arendt Center's latest Lunchtime Talk, which occurred amid picturesque snowfall on Wednesday, February 29th. The presenter was Douglas Irvin, a Ph.D. candidate at Rutgers' Center for the Study of Genocide and Human Rights. Irvin's talk revolved around the work of the Polish-Jewish lawyer Raphael Lemkin (1900-1959).
After escaping from Nazi-occupied Poland and lecturing at the University of Stockholm, Lemkin emigrated to the U.S., served as an advisor at the Nuremberg Trials, and played a central role in the passage of the 1948 U.N. Genocide Convention. Indeed, it was Lemkin who coined the term "genocide," deriving it from the Greek root genos (family, tribe, or race) and the Latin cide (killing).
Lemkin and Arendt were contemporaries with overlapping experiences and interests, but they engaged very little with one another in print (aside, perhaps, from a few allusions and anonymous criticisms). Irvin contends that there are good reasons for this lack of dialogue, since the two differed significantly in their views of genocide and humanity more broadly.
On the one hand, Arendt regarded genocide as a historically recent outgrowth of modern totalitarianism. According to Irvin, this understanding was in keeping with her more general conception of the human cosmos, which ultimately emerged through, and was grounded in, individual interactions within the arena of the polis.
Lemkin, by contrast, regarded genocide as a much older phenomenon, one that was premised not on the destruction of individuals on the basis of their group affiliation, but rather on the annihilation of entire cultural traditions and collective identities. Drawing eclectically on the work of seventeenth-century Spanish theologians, romantic thinkers like Johann Gottfried von Herder, and anthropological understandings of cultures as integrated wholes, Lemkin ultimately defined genocide as a coordinated attack on the conditions that make the lives of nations and other collectivities possible.
In this conception, genocide does not necessarily or inevitably entail the mass killing of a group’s members, but rather turns on concerted efforts to obliterate that group’s institutions, language, religious observance, and economic livelihood. In Irvin’s argument, this approach resonated with the broadly communitarian nature of Lemkin’s thought: human existence was in his estimation defined by interactions between culture-bearing groups, and human freedom could ultimately be secured through the benevolent recognition and protection of cultural pluralism.
Significantly, the U.N. Genocide Convention that Lemkin championed did not incorporate many aspects of his thinking. His ideas encountered strong resistance from the U.S., U.K., and other imperial powers, many of which feared that their treatment of indigenous and colonial populations would qualify as genocide under the standards that Lemkin (and his collaborators) proposed. As a result, our current understanding of genocide is in no small part a byproduct of a diplomatic battle to redefine this legal category in a fashion that would encompass the Nazi Holocaust but not implicate other states (including several of the Allied powers that fought against Germany in World War II). This wrangling has also contributed to the minimal attention that has since been paid to Lemkin’s ideas, which were only rediscovered in a significant way in the early 1990s.
Douglas Irvin’s stimulating talk suggested that such inattention is unfortunate. Whatever one thinks of Lemkin’s effort to inscribe a form of cultural relativity into liberal international law, a more thoughtful understanding of his life and thought can only enrich our understanding of genocide’s career as a concept.
Click here to watch the Douglas Irvin lunchtime talk.