Hannah Arendt Center for Politics and Humanities
17Mar/14

Amor Mundi Newsletter 3/16/14


Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.

Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.

The Preferential President

On the Guernica blog, David Bromwich examines “how Obama became a publicist for his presidency (rather than the president).” In his first term Obama delivered 1,852 separate speeches, comments, or scheduled public remarks and granted 591 interviews. These exceptional numbers, Bromwich writes, were the result of “magical thinking” on the part of the Obama White House: if the American public heard the president often enough, they would see how sincere and bipartisan he was and accept his policies. An endless string of speeches, road trips, and town hall meetings thus came to serve as a stand-in for the decision-making and confrontation that true leadership requires, and genuine conviction demands. Argues Bromwich: “…The truth is that Obama’s convictions were never strong. He did not find this out until his convictions were tested, and they were not tested until he became president. Perhaps the thin connection between Obama’s words and his actions does not support the use of the word “conviction” at all. Let us say instead that he mistook his preferences for convictions—and he can still be trusted to tell us what he would prefer to do. Review the record and it will show that his first statement on a given issue generally lays out what he would prefer. Later on, he resigns himself to supporting a lesser evil, which he tells us is temporary and necessary. The creation of a category of permanent prisoners in “this war we’re in” (which he declines to call “the war on terror”) was an early and characteristic instance. Such is Obama’s belief in the power and significance of his own words that, as he judges his own case, saying the right thing is a decent second-best to doing the right thing.” For more see a commentary on the Arendt Center blog.

Borrowing More than Just Vowels

Phillip Durkin, author of the forthcoming book Borrowed Words, uses an interactive tool to show how English has changed over the last thousand years. Although its borrowings are still mostly dominated by Latin and French, English has also begun to borrow from languages with more distant origins, like Japanese, Russian, and Greek. Durkin's tool, and presumably his book, is a reminder that both words and their speakers exist in history, something all too easily lost in the hegemony of any present context.

The Aspirationism of the Creative Class

Leonard Pierce takes aim at the aspirationism of the creative class, who, he says, are selling us their luck as our own failure. He concludes from the long view, “It is hard enough just being alive, just living and trying to be a decent person without being overwhelmed by shame and guilt and the demands of the world; the last thing we need is someone who got a few extra pulls of the handle at the cosmic slot machine telling us we’re doing it all wrong. If there is something we should aspire to, it certainly cannot be a position from which we look upon ordinary people, people no less miraculous but perhaps just a little less lucky than ourselves, as a lesser form of life."

Freedom and Dignity

In a speech to the German Parliament, Angela Merkel, that country's chancellor, explains her position on privacy and surveillance. The question is about more than what happens within any one country's borders, she says, and "millions of people who live in undemocratic states are watching very closely how the world’s democracies react to threats to their security: whether they act circumspectly, in sovereign self-assurance, or undermine precisely what in the eyes of these millions of people makes them so attractive—freedom and the dignity of the individual."

The Hero and the Artist

Considering the Filipino writer and national hero Jose Rizal in the wake of reading Benedict Anderson's short book Why Counting Counts, Gina Apostol notes his two legacies: “For a Filipino novelist like myself, Rizal is a troubling emblem. Many writers like to dwell on the burden of his monumental legacy. But my problem is that Rizal is forgotten as an artist. Remembered (or dismembered) as a patriot, a martyr, a nationalist, a savior, a saint, Rizal is not discussed much as a writer — he is not read as an artist. Our national hero now shares the fate of all of us who attempt to write about our country in fiction. No one really reads his novels."

If Only They Knew...

Audra Wolfe, taking note of Neil deGrasse Tyson's resurrection of Carl Sagan's TV science epic Cosmos, suggests that any hope that the series may bring increased attention, and therefore increased funding, to scientific pursuits may be misguided: "As is so often the case with science communication, the assumption seems to be that public understanding of science—sprinkled with a hearty dose of wonder and awe—will produce respect for scientific authority, support for science funding, and a new generation of would-be scientists. If only Americans loved science a little more, the thinking goes, we could end our squabbling about climate change, clean energy, evolution, and funding NASA and the National Science Foundation. These are high hopes to pin on a television show, even one as glorious as Cosmos." Although Wolfe makes a good argument about how Sagan's world is different from the world we now inhabit with Tyson, there's something more basic at work here: the pernicious notion that, if we educate people who don't agree with us just a little bit more, they'll come around to our way of thinking. This, obviously, is a deeply dismissive point of view, one that suggests that everyone should think as we do, and that the fact that they don't is a question of status rather than viewpoint. If Cosmos gets people interested in science, it will be the possibilities, the things we have yet to understand, that get them excited, rather than what has already been settled. Speak to that sense of wonder and people very well may listen; speak to what you think people don't know and should, and they'll tune you out.

From the Hannah Arendt Center Blog

This week on the blog, read a recap and watch the video of Roger Berkowitz and Walter Russell Mead speaking with SCOTUSblog founder Tom Goldstein as part of our “Blogging and the New Public Intellectual” series. Jason Adams relates Arendt’s belief that the act of thinking slips humanity out of historical and biographical time and into a non-time that reconstitutes the world. Roger Berkowitz ponders whether President Obama lacks conviction, and in the Weekend Read, Roger Berkowitz examines the current controversies over antisemitism surrounding both Martin Heidegger and Paul de Man.

20Jan/14

Amor Mundi 1/19/14


Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.

Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.

On Muckraking and Political Change

Jim Sleeper turned me on to Dean Starkman’s excerpt from his new book chronicling the failure of the press to expose wrongdoing in the lead-up to the financial crisis, The Watchdog That Didn’t Bark: The Financial Crisis and the Disappearance of Investigative Journalism. Starkman writes: “Now is a good time to consider what journalism the public needs. What actually works? Who are journalism’s true forefathers and foremothers? Is there a line of authority in journalism’s collective past that can help us navigate its future? What creates value, both in a material sense and in terms of what is good and valuable in American journalism? Accountability reporting comes in many forms—a series of revelations in a newspaper or online, a book, a TV magazine segment—but its most common manifestation has been the long-form newspaper or magazine story, the focus of this book. Call it the Great Story. The form was pioneered by the muckrakers’ quasi-literary work in the early 20th century, with Tarbell’s exposé on the Standard Oil monopoly in McClure’s magazine a brilliant example. As we’ll see, the Great Story has demonstrated its subversive power countless times and has exposed and clarified complex problems for mass audiences across a nearly limitless range of subjects: graft in American cities, modern slave labor in the US, the human costs of leveraged buyouts, police brutality and corruption, the secret recipients on Wall Street of government bailouts, the crimes and cover-ups of media and political elites, and on and on, year in and year out. The greatest of muckraking editors, Samuel S. McClure, would say to his staff, over and over, almost as a mantra, “The story is the thing!” And he was right.” Starkman’s incredible optimism in the power of the press is infectious. But in the weekend read, Roger Berkowitz turns to Walter Lippmann to raise questions about Starkman’s basic assumptions.

Our Unconstitutional Standing Army

Kathleen Frydl has an excellent essay in The American Interest arguing against our professionalized military and for the return of a citizen’s army. “Without much reflection or argument, the United States now supports the professional “large standing army” feared by the Founding Fathers, and the specter of praetorianism they invoked casts an ever more menacing shadow as the nation drifts toward an almost mercenary force, which pays in citizenship, opportunity structures (such as on-the-job technical training and educational benefits), a privileged world of social policy (think Tricare), and, in the case of private contractors, lots of money. Strict constructionists of the Constitution frequently ignore one of its most important principles—that the military should be large and powerful only when it includes the service of citizen-soldiers. This oversight clearly relates to the modern American tendency to define freedom using the neo-liberal language of liberty, shorn of any of the classical republican terminology of service. We would do well to remember Cicero’s most concise summary of a constitutional state: “Freedom is the participation in power.”” I don’t know what Hannah Arendt would have thought about the draft. But I do know she’d sympathize with Frydl’s worries about a professionalized army.

What Has It Done To Us?

Tim Wu marvels at the human augmented by technology. Consider what an intelligent time traveler from the early twentieth century would think after talking to a reasonably educated woman of today: "The woman behind the curtain is, of course, just one of us. That is to say, she is a regular human who has augmented her brain using two tools: her mobile phone and a connection to the Internet and, thus, to Web sites like Wikipedia, Google Maps, and Quora. To us, she is unremarkable, but to the man she is astonishing. With our machines, we are augmented humans and prosthetic gods, though we’re remarkably blasé about that fact, like anything we’re used to. Take away our tools, the argument goes, and we’re likely stupider than our friend from the early twentieth century, who has a longer attention span, may read and write Latin, and does arithmetic faster. The time-traveler scenario demonstrates that how you answer the question of whether we are getting smarter depends on how you classify “we.”” We, the underlying humans, may know less and less. But “we,” the digitally enabled cyborgs that we’ve become, are geniuses. Much of the focus and commentary about artificial intelligence asks the wrong question, about whether machines will become human. The better question is what will become of humans as we integrate more fully with our machines. That was the topic of Human Being in an Inhuman Age, the 2010 Arendt Center Conference. A selection of essays from that conference is published in the inaugural edition of HA: The Journal of the Hannah Arendt Center.

Thinking History

In an interview with high school teacher David Cutler, history professor Eric Foner explains how we could make history education more effective: "Knowledge of the events of history is important, obviously, but also I think what I see in college students, that seems to be lacking at least when they come into college, is writing experience. In other words, being able to write that little essay with an argument. I see that they think, 'OK, there are the facts of history and that's it—what more is there to be said?' But of course, the very selection of what is a fact, or what is important as a fact, is itself based on an interpretation. You can't just separate fact and interpretation quite as simply as many people seem to think. I would love to see students get a little more experience in trying to write history, and trying to understand why historical interpretation changes over time." Foner wants students to think history, not simply to know it.

Reading Croatian Fiction

Gary Shteyngart, Google Glass wearer and author of the recently published memoir Little Failure, explains the arc of his reading habits: "When I was growing up, I was reading a lot of male fiction, if you can call it that. I was up to my neck in Saul Bellow, which was wonderful and was very instrumental but I think I’ve gone, like most people I think I’ve expanded my range quite a bit. When you’re young you focus on things that are incredibly important to you and read, God knows, every Nabokov that’s ever been written. But then, it is time to move beyond that little place where you live and I’ve been doing that; I’m so curious to see so many people send me books now it’s exciting to go to the mailbox and see a work of Croatian fiction."

This Week on the Blog

This week on the blog, Sandipto Dasgupta discusses Arendt and B.R. Ambedkar, one of the authors of the Indian constitution. In the weekend read, Roger Berkowitz examines the merit of muckraking journalism and its role as watchdog of corruption.

9Sep/13

Amor Mundi 9/8/13


Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.

Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.

Balancing Solitude and Society

Illustration by Dan Williams

It is a new year, not only for Jews celebrating Rosh Hashanah but also for hundreds of thousands of college and university students around the world. Over at Harvard, they invited Nannerl O. Keohane—past President of Wellesley College—to give the new students some advice on how to reflect upon and imagine the years of education that lay before them. Above all, Keohane urges students to take time to think about what they want from their education: “You now have this incredible opportunity to shape who you are as a person, what you are like, and what you seek for the future. You have both the time and the materials to do this. You may think you’ve never been busier in your life, and that’s probably true; but most of you have “time” in the sense of no other duties that require your attention and energy. Shaping your character is what you are supposed to do with your education; it’s not competing with something else. You won’t have many other periods in your life that will be this way until you retire when, if you are fortunate, you’ll have another chance; but then you will be more set in your ways, and may find it harder to change.”

The March, Fifty Years On

Robin Kelly, writing on the 1963 March on Washington and the March's recent fiftieth anniversary celebrations, zooms out a little bit on the original event. It has, he says, taken on the character of a big, feel-good event focused on civil rights and directly responsible for the passage of the Civil Rights Act, when, in fact, the marchers also came to Washington in support of economic equality, and the gritty work of passing laws was accomplished later, with additional momentum and constraints. It's important to remember, he says, that "big glitzy marches do not make a movement; the organizations and activists who came to Washington, D. C., will continue to do their work, fight their fights, and make connections between disparate struggles, no matter what happens in the limelight."

Famous Last Words

Robinson Meyer investigates what, exactly, poet Seamus Heaney's last words were. Just before he passed away last week at 74, Heaney, an Irish Nobel Laureate, texted the Latin phrase noli timere, "don't be afraid," to his wife. Heaney's son Michael mentioned this in his eulogy for his father, and it was written down and reported as, variously, the correct phrase or the incorrect nolle timore. For Meyer, this mis-recording of the poet's last words is emblematic of some of the transcriptions and translations he did in his work, and the further translations and transcriptions we will now engage in because he is gone. "We die," Meyer writes, "and the language gets away from us, in little ways, like a dropped vowel sound, a change in prepositions, a mistaken transcription. Errors in transfer make a literature."

We're All Billy Pilgrim Now

Jay Rosen, who will be speaking at the Hannah Arendt Center’s NYC Lecture Series on Sunday, Oct. 27th at 5pm, has recently suggested that journalism solves the problem of awayness: “Journalism enters the picture when human settlement, daily economy, and political organization grow beyond the scale of the self-informing populace.” C.W. Anderson adds that "awayness" should include alienation from a moment in time as well as from a particular place: "Think about how we get our news today: We dive in and out of Twitter, with its short bursts of immediate information. We click over to a rapidly updating New York Times Lede blog post, with its rolling updates and on-the-ground reports, complete with YouTube videos and embedded tweets. Eventually, that blog post becomes a full-fledged article, usually written by someone else. And finally, at another end of the spectrum, we peruse infographics that can sum up decades of data into a single image. All of these are journalism, in some fashion. But the kind of journalisms they are - what they are for - is arguably very different. They each deal with the problem of context in different ways."

...Because I Like it

Adam Gopnik makes a case for the study of English, and of the humanities more broadly. His defense is striking because it rejects a recent turn towards their supposed use value, instead emphasizing such study for its own sake: "No sane person proposes or has ever proposed an entirely utilitarian, production-oriented view of human purpose. We cannot merely produce goods and services as efficiently as we can, sell them to each other as cheaply as possible, and die. Some idea of symbolic purpose, of pleasure seeking rather than rent seeking, of Doing Something Else, is essential to human existence. That’s why we pass out tax breaks to churches, zoning remissions to parks, subsidize new ballparks and point to the density of theatres and galleries as signs of urban life, to be encouraged if at all possible. When a man makes a few billion dollars, he still starts looking around for a museum to build a gallery for or a newspaper to buy. No civilization we think worth studying, or whose relics we think worth visiting, existed without what amounts to an English department—texts that mattered, people who argued about them as if they mattered, and a sense of shame among the wealthy if they couldn’t talk about them, at least a little, too. It’s what we call civilization."

Featured Events

October 3-4, 2013

The sixth annual fall conference, "Failing Fast: The Crisis of the Educated Citizen"

Olin Hall, Bard College

Learn more here.
9Sep/13

A Common Language


"Any period to which its own past has become as questionable as it has to us must eventually come up against the phenomenon of language, for in it the past is contained ineradicably, thwarting all attempts to get rid of it once and for all. The Greek polis will continue to exist at the bottom of our political existence...for as long as we use the word 'politics.'"

-Hannah Arendt, "Walter Benjamin: 1892-1940"

Some years ago a mentor told me a story from his days as a graduate student at a prestigious political science department. There was a professor there specializing in Russian politics and Sovietology, an older professor who loved teaching and taught well past the standard age of retirement. His enthusiasm was palpable, and he was well-liked by his students. His most popular course was on Russian politics, and towards the end of one semester, a precocious undergraduate visited during office hours: “How hard is it to learn Russian,” the student asked, “because I’d really like to start.” “Pretty hard,” he said, “but that’s great to hear. What has you so excited about it?” “Well,” said the student, “after taking your course, I’m very inspired to read Marx in the original.” At the next class the professor told this story to all of his students, and none of them laughed. He paused for a moment, then somewhat despondently said: “It has only now become clear to me….that none of you know the first thing about Karl Marx.”

The story has several morals. As a professor, it reminds me to be careful about assuming what students know. As a student, it reminds me of an undergraduate paper I wrote which spelled Marx’s first name with a “C.” My professor kindly marked the mistake, but today I can better imagine her frustration. And if the story works as a joke, it is because we accept its basic premise, that knowledge of foreign languages is important, not only for our engagement with texts but with the world at large. After all, the course in question was not about Marx.

The fast approach of the Hannah Arendt Center’s 2013 Conference on “The Educated Citizen in Crisis” offers a fitting backdrop to consider the place of language education in the education of the citizen. The problem has long been salient in America, a land of immigrants and a country of rich cultural diversity; and debates about the relation between the embrace of English and American assimilation continue to draw attention. Samuel Huntington, for example, recently interpreted challenges to English preeminence as a threat to American political culture: “There is no Americano dream,” he writes in “The Hispanic Challenge,” “There is only the American dream created by an Anglo-Protestant society. Mexican Americans will share in that dream and in that society only if they dream in English.”  For Huntington English is an element of national citizenship, not only as a language learned, but as an essential component of American identity.

This might be juxtaposed with Tracy Strong’s support of learning (at least a) second language, including Latin, as an element of democratic citizenship. A second language, writes Strong (see his “Language Learning and the Social Sciences”) helps one acquire “what I might call an anthropological perspective on one’s own society,” for “An important achievement of learning a foreign language is learning a perspective on one’s world that is not one’s own. In turn, the acquisition of another perspective or even the recognition of the legitimacy of another perspective is, to my understanding, a very important component of a democratic political understanding.” Strong illustrates his point with a passage from Hannah Arendt’s “Truth and Politics”: “I form an opinion,” says Arendt, “by considering a given issue from different viewpoints, by making present to my mind the standpoints of those who are absent: that is, I represent them.”

Hannah Arendt’s deep respect for the American Constitution and American political culture, manifest no less (perhaps even more!) in her criticism than her praise, is well known. After fleeing Nazi Germany and German-occupied France, Arendt moved to the United States where she became a naturalized citizen in 1951. And her views on the relation between the English language and American citizenship are rich and complex.

In “The Crisis in Education” Arendt highlights how education plays a unique political role in America, where “it is obvious that the enormously difficult melting together of the most diverse ethnic groups…can only be accomplished through the schooling, education, and Americanization of the immigrants’ children.” Education prepares citizens to enter a common world, of which English in America is a key component: “Since for most of these children English is not their mother tongue but has to be learned in school, schools must obviously assume functions which in a nation-state would be performed as a matter of course in the home.”

At the same time, Arendt’s own embrace of English is hardly straightforward. In a famous 1964 interview with Günter Gaus she says: “The Europe of the pre-Hitler period? I do not long for that, I can tell you. What remains? The language remains. […] I have always consciously refused to lose my mother tongue. I have always maintained a certain distance from French, which I then spoke very well, as well as from English, which I write today […] I write in English, but I have never lost a feeling of distance from it. There is a tremendous difference between your mother tongue and another language…The German language is the essential thing that has remained and that I have always consciously preserved.”

Here Arendt seems both with and against Huntington. On one hand, learning and embracing English—the public language of the country—is what enables diverse Americans to share a common political world. And in this respect, her decision to write and publish in English represents one of her most important acts of American democratic citizenship. By writing in English, Arendt “assumes responsibility for the world,” the same responsibility that education requires from its educators if they are to give the younger generation a common world, but which she finds sorely lacking in “The Crisis in Education.”

At the same time, though, Arendt rejects the idea that American citizenship requires treating English as if it were a mother tongue. Arendt consciously preserves her German mother tongue as both an element of her identity and a grounding of her understanding of the world, and in 1967 she even accepted the Sigmund Freud Award of the German Academy of Language and Poetry that “lauded her efforts to keep the German language alive although she had been living and writing in the United States for more than three decades” (I quote from Frank Mehring’s 2011 article “‘All for the Sake of Freedom’: Hannah Arendt’s Democratic Dissent, Trauma, and American Citizenship”).  For Arendt, it seems, it is precisely this potentiality in America—for citizens to share and assume responsibility for a common world approached in its own terms, while also bringing to bear a separate understanding grounded by very different terms—that offers America’s greatest democratic possibilities. One might suggest that Arendt’s engagement with language, in her combination of English responsibility and German self-understanding, offers a powerful and thought-provoking model of American democratic citizenship.

What about the teaching of language? In “The Crisis in Education” Arendt is critical of the way language, especially foreign language, is taught in American schools. In a passage worth quoting at length she says:

“The close connection between these two things—the substitution of doing for learning and of playing for working—is directly illustrated by the teaching of languages; the child is to learn by speaking, that is by doing, not by studying grammar and syntax; in other words he is to learn a foreign language in the same way that as an infant he learned his own language: as though at play and in the uninterrupted continuity of simple existence. Quite apart from the question of whether this is possible or not…it is perfectly clear that this procedure consciously attempts to keep the older child as far as possible at the infant level.”

Arendt writes that such “pragmatist” methods intend “not to teach knowledge but to inculcate a skill.” Pragmatic instruction helps one to get by in the real world; but it does not allow one to love or understand the world. It renders language useful, but reduces language to an instrument, something easily discarded when no longer needed. It precludes philosophical engagement and representative thinking. The latest smartphone translation apps render such a skill superfluous.


But how would one approach language differently? And what does this have to do with grammar and syntax? Perhaps there are clues in the passage selected as our quote of the week, culled from Arendt’s 1968 biographical essay about her friend Walter Benjamin. There, Arendt appreciates that Benjamin's study of language abandons any “utilitarian” or “communicative” goals, but approaches language as a “poetic phenomenon.” The focused study of grammar develops different habits than pragmatist pedagogy. In the process of translation, for example, it facilitates an engagement with language that is divorced from practical use and focused squarely on meaning. To wrestle with grammar means to wrestle with language in the pursuit of truth, in a manner that inspires love for language—that it exists—and cross-cultural understanding. Arendt was famous for flexing her Greek and Latin muscles—in part, I think, as a reflection of her love for the world. The study of Greek and Latin is especially amenable to a relationship of love, because these languages are hardly “practical.” One studies them principally to understand, to shed light on the obscure; and through their investigation one discovers the sunken meanings that remain hidden and embedded in our modern languages, in words we speak regularly without realizing all that is contained within them. By engaging these “dead” languages, we more richly and seriously understand ourselves. And these same disinterested habits, when applied to the study of modern foreign languages, can enrich not only our understanding of different worldviews, but our participation in the world as democratic citizens.

-John LeJeune

5Aug/13

We Create the Conditions that Condition Us


This "Quote" of the week originally ran on May 28, 2012

"The human condition comprehends more than the condition under which life has been given to man. Men are conditioned beings because everything they come in contact with turns immediately into a condition of their existence.  The world in which the vita activa spends itself consists of things produced by human activities; but the things that owe their existence exclusively to men nevertheless constantly condition their human makers."

-Hannah Arendt, The Human Condition

The human condition is the context or situation we, as human beings, find ourselves in, the implication being that human life cannot be fully understood by considering humanity in isolation from its environment.  We are, to a large degree, shaped by our environment, which is why Arendt refers to us as conditioned beings.

We are conditioned by phenomena external to us, and this may be considered learning in its broadest sense, that is, in the sense that the Skinnerian conditioned response is a learned reaction to external stimuli.  It follows that any form of life that is capable of modifying its behavior in response to external stimuli is, to some extent, a conditioned being.


On a grander scale, natural selection, as it is popularly understood, can be seen as a conditioning force.  Survival of the fittest is survival of those best able to adapt to existing external conditions, survival of those best able to meet the conditions of their environment.  The fittest are, quite naturally, those in the best condition, that is, the best condition to survive.  Whether we are considering the effects of natural selection upon an entire species, or individual members of a species, or what Richard Dawkins refers to as the selfish gene, the environment sets the conditions that various forms of life must meet to survive and reproduce.

Such views are inherently incorrect insofar as they posit an artificial separation between the conditions of life and the form of life that is conditioned.  An ecological or systems view would instead emphasize the interdependent and interactive relationships that exist, as all forms of life alter their conditions simply by their very presence, by their metabolism, for example, and through their reproduction.  Darwin understood this, I hasten to add, and the seeds of ecology can be found in his work, although they did not fully germinate until the turn of the 20th century.  And Skinner certainly was aware of the individual's capacity for self-stimulation, and self-modification, but a truly relational approach in psychology did not coalesce until Gregory Bateson introduced a cybernetic perspective during the 1950s.

In the passage quoted above, it is readily apparent that Arendt is an ecological thinker.  In saying that, "the things that owe their existence exclusively to men nevertheless constantly condition their human makers," she is saying that we create the conditions that in turn condition us.  We exist within a reciprocal relationship, a dialogue if you like, between the conditioned and the conditions, the internal and the external, the organism and its environment.  The changes that we introduce into our environment, that alter the environment, feedback into ourselves as we are influenced, affected, and shaped by our environment.

The contrast between using tools and techniques in the most basic way to adapt to the conditions of the environment, and the creation of an entirely new technological environment of great complexity that requires us to perform highly convoluted acts of adaptation, was portrayed with brilliant sensitivity and humor in the 1980 South African film, directed by Jamie Uys, entitled The Gods Must Be Crazy. A good part of the documentary-style opening can be found in a clip on YouTube.

The story of the Coke bottle, although fictional, follows the pattern of many documented cases in which the introduction of new technologies to traditional societies has had disruptive, and often enough, disastrous effects (the film itself, I hasten to add, is marvelously comedic, and quite often slapstick following the introductory quarter hour).

The understanding that we are conditioned by the conditions we ourselves introduce was not unknown in the ancient world. The 115th Psalm of David, in its polemic against idolatry and the idols that are "the work of men's hands," cautions that "they who make them shall be like unto them; yea every one that trusts in them." Along the same lines, the Gospel of Matthew includes the famous quote, "all those who take up the sword shall perish by the sword," while the Epistle to the Galatians advises, "whatsoever a man sows, that shall he also reap." A more contemporary variation of that maxim is, "as you make your bed, so you shall lie on it," although in the United States it is often rendered in the imperative and punitive form of, "you made your bed, go lie in it!" During the 19th century, Henry David Thoreau notified us that "we do not ride on the railroad; it rides upon us," while Mark Twain humorously observed that, "if all you have is a hammer, everything looks like a nail." More recently, we have been told, "ask a silly question, get a silly answer," to which computer scientists have responded with the acronym GIGO, which stands for "garbage in, garbage out." Winston Churchill said, "we shape our buildings, and thereafter they shape us," and former Fordham professor John Culkin, in turn, offered, "we shape our tools, and thereafter they shape us," as a corollary to Marshall McLuhan's media ecology aphorism, "the medium is the message."

All of these voices, in their varying ways, are pointing to the same essential truth about the human condition that Arendt is relating in the quote that begins this post.  And to pick up where that quote leaves off, Arendt goes on to argue,

In addition to the conditions under which life is given to man on earth, and partly out of them, men constantly create their own, self-made conditions, which, their human origin and their variability notwithstanding, possess the same conditioning power as natural things.

The "conditions" that we make are used to create a buffer or shield against the conditions that we inherit, so that our self-made conditions are meant to stand between us and what we would consider to be the natural environment.  In this sense, our self-made conditions mediatebetween ourselves and the pre-existing conditions that we operate under, which is to say that our conditions are media of human life.  And in mediating, in going between our prior conditions and ourselves, the new conditions that we create become our new environment.  And as we become conditioned to our new conditions, they fade from view, being routinized they melt into the background and become essentially invisible to us.Let us return now for the conclusion of the passage from The Human Condition:

Whatever touches or enters into a sustained relationship with human life immediately assumes the character of a condition of human existence.  This is why men, no matter what they do, are always conditioned beings.  Whatever enters the world of its own accord or is drawn into it by human effort becomes part of the human condition.  The impact of the world's reality upon human existence is felt and received as a conditioning force.  The objectivity of the world—its object- or thing-character—and the human condition supplement each other; because human existence is conditioned existence, it would be impossible without things, and things would be a heap of unrelated articles, a non-world, if they were not the conditioners of human existence.


This last point is quite striking.  It is we, as human beings, who create worlds, which brings to mind the moving commentary from the Talmud:  "whoever saves a life, it is considered as if he saved an entire world."  We create worlds, in the sense that we give meaning to existence, we attribute meaning to phenomena, we construct symbolic as well as material environments.  Each one of us, in our singular subjectivity, creates a world of our own, and therefore each one of us represents a world unto ourselves.

But these individual worlds are links, nodes in a social network, interdependent and interactive parts of an ecological whole. The term condition, in its root meaning, is derived from the Latin prefix com, which means together, and dicere, which means to speak. And our ability to speak together, to engage in discussion and deliberation, to enter into symbolic interaction, constitutes the means by which we collectively construct our intersubjective, social realities, our worlds.

As human beings, we are conditioned not only by our labor, the ways in which we obtain the necessities of life, i.e., air, water, food, shelter, to which Marx sought to reduce all aspects of society, a position that Arendt severely criticized.  We are conditioned not only by our work, which Arendt associated with artifacts, with instrumentality and technology, with arts and crafts.  We are conditioned most importantly by action, which in Arendt's view is intimately tied to speech and the symbolic, and to processes rather than things, to relations rather than objects.

In the end, Arendt reminds us that the human condition is itself conditional, and to be fully human requires not only that we take care of biological necessity, nor that we make life easier through technological innovation, but that we cooperate through speech and action in collectively constructing a world that is truly blessed with freedom and with justice.

-Lance Strate

25Feb/13

Learning From Crisis

"[T]here is another even more cogent reason for [the layman] concerning himself with a critical situation in which he is not immediately involved. And that is the opportunity, provided by the very fact of crisis—which tears away facades and obliterates prejudices—to explore and inquire into whatever has been laid bare of the essence of the matter…"

-Hannah Arendt, "The Crisis in Education"

I

It is often said that the Chinese word for “crisis,” or weiji, means a combination of “danger” and “opportunity,” and every so often the trope appears in the highest echelons of American politics. Linguist Benjamin Zimmer cites its frequent use by John F. Kennedy in speeches leading into the 1960 presidential election; and more recently, Al Gore in 2006-7 used weiji to anchor both his Congressional testimony on the problem of climate change, and his Vanity Fair article (“The Moment of Truth”) concerning the same. During her January 2007 trip to the Middle East, then-Secretary of State Condoleezza Rice told reporters of conditions in the region, "I don't read Chinese but I am told that the Chinese character for crisis is wei-ji, which means both danger and opportunity…And I think that states it very well. We'll try to maximize the opportunity."

This use of weiji has irked some linguists. Zimmer calls Gore’s Chinese riff a “linguistic canard” and writes that in all these cases, “[T]he trope was deployed for similar effect: as a framing technique for describing current perils posed by a particular world crisis and future possibilities for resolving that crisis. Thus it allows the speaker to shift rhetorical footing from pessimism to optimism, ending with an upbeat tone and a call to action.” Victor H. Mair, a professor of Chinese language and literature at UPenn, identifies a “fatal” error of interpretation that centers on the second character, ji, which rather than “opportunity,” here means something like “incipient moment; crucial point (when something begins or changes).” Thus, “A weiji indicates a perilous situation when one should be especially wary. It is not a juncture when one goes looking for advantages and benefits.”

To those still seeking New Age wisdom in the danger/opportunity coupling, Mair points to the old Greek usage. Modern “crisis” stems from the Greek krinein, meaning to separate, decide, or judge. The word reached Middle English in the 15th century via Latin, and the Oxford English Dictionary says that by the mid-16th century it meant judgment related specifically to sickness and the sudden change of disease (the Online Etymology Dictionary cites Hippocrates using krinein in the same way). Soon thereafter it referred more generally to “A vitally important or decisive stage in the progress of anything; a turning-point,” as well as judgment or decision simply, and “A point by which to judge; a criterion; token; sign.”

In moments of crisis the important connection between “danger” and “opportunity” centers on their common source in a disruption of normal order, a disruption that entails instability and volatility, but also openings to previously precluded or unimagined possibilities for action. The moment of crisis is transient, and in political matters the statesman’s virtue is two-fold—not only to manage (or “seize”) a crisis situation, but also to recognize the situation when it arises (See Lenin, “The Crisis Has Matured,” September 29, 1917) or foresee its coming. By recognizing a crisis for what it is—a moment of decision—we can wrest the decision to ourselves.

II

Hannah Arendt’s essay “The Crisis in Education” seems to offer a different understanding of social and political crisis—one less concerned with critical moments and more concerned with the “elemental structures” of modernity that “crystallize” over time and manifest today in a variety of ways. The essay starts by observing that “The general crisis that has overtaken the modern world everywhere and in almost every sphere of life manifests itself differently in each country, involving different areas and taking on different forms.” In America the general crisis has assumed the form of “the recurring crisis in education that, during the last decade at least, has become a political problem of the first magnitude[.]” This introduces a recurring theme in the essay, that while examining a particular political crisis in America, the essay is also—and perhaps more fundamentally—about “a more general crisis and instability in modern society.”

This more general crisis is the modern crisis of authority that is “closely connected with the crisis of tradition…the crisis in our attitude towards the realm of the past.” Seeing how this bears on the crisis of education requires examining “whatever has been laid bare of the essence of the matter, and the essence of education is natality, the fact that human beings are born into the world.” At the same time, Arendt writes, “Basically we are always educating for a world that is or is becoming out of joint,” a world that, because it is made by mortals, “runs the risk of becoming as mortal as they.” And thus—because the essence of education is natality, and the “newcomers” need a world in which to live and act, but the world in which we live and act constantly “is or is becoming out of joint”—the problem of education concerns how to stabilize this world for the “newcomers” without also stifling their capacity to renew or even drastically alter it: “Exactly for the sake of what is new and revolutionary in every child,” Arendt writes, “education must be conservative; it must preserve this newness and introduce it as a new thing into an old world[.]”

Here the crisis of modernity and education converge—for the process of giving students a world has historically relied on the authority of tradition and the past. But if these authorities can no longer be relied upon, then what remains? Stunningly, Arendt locates a new authority for modern conditions in the teacher’s “assumption of responsibility for that world.”

III

Arendt’s account of the American crisis of education illustrates the connection between local political crises around the world and a larger civilizational crisis. Indeed, a central goal of “The Crisis in Education” is to highlight the blind spots in understanding that result when one regards “a local phenomenon” like the crisis of education as “unconnected with the larger issues of the century, to be blamed on certain peculiarities of life in the United States” (as for example its history of “continuous immigration”). To localize such problems is tempting because “However clearly a general problem may present itself in a crisis, it is nevertheless impossible ever to isolate completely the universal element from the concrete and specific circumstances in which it makes its appearance.” But while “There is always a temptation to believe that we are dealing with specific problems confined within historical and national boundaries and of importance only to those immediately affected”— “It is precisely this belief that in our time has consistently proved false” (emphasis added).

This false belief prevents us from, among other things, ascertaining “which aspects of the modern world and its crisis have actually revealed themselves” (in a local crisis)—that is, “the true reasons that for decades things could be said and done in such glaring contradiction to common sense.” And events continue in this manner due in part to the illusion that situation-specific and/or scientific solutions, which may (or may not) satisfactorily solve local problems in the short term, actually touch upon the heart of the matter. The illusion manifests in “repeat performance” of the crisis, “though perhaps different in form, since there are no limits to the possibilities of nonsense and capricious notions that can be decked out as the last word in science.”  Arendt’s criticism of the futility of pragmatist pedagogy in addressing the crisis of authority in the classroom represents a case in point.

IV

In recent months and years, few words have achieved more prominence in Washington politics than crisis. As recently as February 3, President Obama said in a CBS interview that “Washington cannot continually operate under a cloud of crisis.” And following the latest inconclusive negotiations over the country’s fiscal situation and looming (depending on who you ask) “debt crisis,” a recent article in the Huffington Post bemoans the “pattern of a Congress that governs from crisis to crisis” that has become “all too familiar—and predictable. The trend goes something like this: As a deadline approaches, Republicans repeat their calls for spending cuts. Democrats accuse Republicans of hostage-taking. A short-term agreement is then reached that averts economic calamity, but ultimately kicks the can down the road for yet another fight.”

What does it mean for a Congress to routinely “govern from crisis to crisis”? Does “governing by crisis” constitute functioning politics, or a political crisis of the first order? In “The Crisis in Education” Arendt writes that “the very fact of crisis…tears away facades and obliterates prejudices,” and allows one “to explore and inquire into whatever has been laid bare of the essence of the matter.” But to state the obvious, if “the very fact of crisis…tears away facades and obliterates prejudices,” then such tearing and obliteration requires that “the very fact of crisis” be recognized and acknowledged. In the current governing crisis in Washington, what fundamentally new, to say nothing of unprejudiced, questions—other than how Washington’s two parties will “compromise” and avoid self-destruction—have been asked? Who has spoken seriously, truthfully, and critically, in an effort to lay bare the essence of the matter?

At a time when happenings in Washington “could be said and done in such glaring contradiction to common sense” (How else are we to understand “governing by crisis”?), Hannah Arendt reminds us to seek out and overcome those “prejudices” and “preformed judgments”—including the obligatory moves to technocratic and ideological narratives—that preclude the introduction of new questions and corresponding answers that require direct and original judgments and, perhaps most importantly, thinking and responsibility. Counterintuitively, in such situations Arendt highlights the importance of questions rather than solutions in confronting political crisis—that the proper response to crisis requires thinking rather than knowledge. To narrowly search for efficient policy “solutions” or ideological “compromises” based on prior prejudices simply misses the point.

If crisis does not seem especially urgent to Arendt in “The Crisis in Education,” she does warn that, in the end, “unreflective perseverance…can only…lead to ruin.” Ironically, one of the prejudiced assumptions that seems most prevalent in Congress today—that abandoning one’s prejudices and preformed judgments spells political death—may be most indicative of our current political crisis. And yet if, as Arendt suggests on more than one occasion, one answer to the modern crisis of authority lies in the “assumption of responsibility”—be it responsibility for the world in the classroom, responsibility for extraordinary action in politics (Arendt once attributed Lenin’s revolutionary authority to his singular willingness to “assume responsibility for the revolution after it happened”), or even responsibility for truthful speech (as opposed to “mere talk”) and action in normal, everyday politics—then notwithstanding whatever the American crisis is, whoever has the courage to speak truthfully and accept political responsibility may wake up to find real power and opportunity suddenly within his grasp.

-John LeJeune

23Jan/13

The Higher Education Bubble? Not So Fast.

We have a higher education bubble. The combination of unsustainable debt loads on young people and the advent of technological alternatives is clearly set to upend the staid and often sclerotic world of higher education.

In this month’s The American Interest, Nathan Harden—the author of Sex & God at Yale: Porn, Political Correctness, and a Good Education Gone Bad (St. Martin’s, 2012) and editor of The College Fix—tries to quantify the destructive changes coming to higher education. Here is his opening paragraph:

In fifty years, if not much sooner, half of the roughly 4,500 colleges and universities now operating in the United States will have ceased to exist. The technology driving this change is already at work, and nothing can stop it. The future looks like this: Access to college-level education will be free for everyone; the residential college campus will become largely obsolete; tens of thousands of professors will lose their jobs; the bachelor’s degree will become increasingly irrelevant; and ten years from now Harvard will enroll ten million students.

Step back a second. Beware of all prognostications of this sort. Nobody knows what will happen tomorrow, let alone 50 years from now. Even today the NY Times reports that the University of Cincinnati and the University of Arizona are turning to online courses as a way of increasing enrollment at their residential campuses. Whether this will work and how it will transform the very idea of a residential college are not yet clear. The kinds of predictions Harden makes can be provocative, and thus spur thought. But they are rarely accurate and too often simply irresponsible.

Beyond the hyperbole, here is something true. Colleges will exist so long as they can convince students and their parents that the value of education is worth the cost. One reason some colleges are suffering today is clearly the cost. But another reason is the declining perception of value.  We should also remember that many colleges—especially the best and most expensive ones—are seeing record demand. If and when the college bubble bursts, not all colleges will be hit equally. Some will thrive and others will likely disappear. Still others will adapt. We should be wary of collapsing all colleges into a single narrative or thinking we can see the future.

Part of the problem is that colleges offer education, something inherently difficult to put a value on. For a long time, the “value” of higher education was intangible. It was the marker of elite status to be a Harvard man or some such thing. One learned Latin and Greek and studied poetry and genetics. But what really was being offered was sophistication, character, erudition, culture, and status, not to mention connections and access.

More recently, college is “sold” in a very different way. It promises earning power. This has brought a whole new generation and many new classes into university education as they seek the magic ticket granting access to an upper middle class lifestyle. As the percentage of college graduates increases, the distinction and thus market value of college education decreases. The problem colleges have is that in their rush to open the doors to all paying customers, they have devalued the product they are offering. The real reason colleges are threatened now—if they indeed are threatened—is less financial than it is intellectual and moral. Quite simply, many of our colleges have progressively abandoned their intangible mission to educate students and embraced the market-driven task of credentialing students for employment. When for-profit or internet-based colleges can do this more cheaply and more efficiently, it is only logical that they will succeed.

For many professors and graduate students, the predicted demise of the residential college will be a hard shock. Professors who thought they had earned lifetime security with tenure will be fired as their departments are shuttered or their entire universities closed down. Just as reporters, booksellers, and now lawyers around the country have seen their jobs evaporate under the disruption of the internet, so too will professors be replaced by technological efficiencies. And this may well happen fast.

Gregory Ferenstein, who describes himself as a writer and educator and writes for TechCrunch and the Huffington Post, has gone so far as to offer a proposed timeline of the disappearance of most colleges as we know them. Here is his outline, which begins with the recently announced pilot program that will see basic courses at San Jose State University replaced by online courses administered by the private company Udacity:

  1. [The] Pilot [program in which Udacity is offering online courses for the largest university system in the world, the California State University System] succeeds, expands to more universities and classes
  2. Part-time faculty get laid off, more community colleges are shuttered, extracurricular college services are closed, and humanities and arts departments are dissolved for lack of enrollment (science enrollment increases–yay!?)
  3. Graduate programs dry up, once master’s and PhD students realize there are no teaching jobs. Fewer graduate students means fewer teaching assistants and, therefore, fewer classes
  4. Competency-based measures begin to find the online students perform on par with, if not better than, campus-based students. Major accredited state college systems offer fully online university degrees, then shutter more and more college campuses
  5. A few Ivy League universities begin to control most of the online content, as universities all over the world converge toward the classes that produce the highest success rates
  6. In the near future, learning on a college campus returns to its elite roots, where a much smaller percentage of students are personally mentored by research and expert faculty

I put little faith in things working out exactly as Ferenstein predicts, and yet I can’t imagine he is that far off the mark. As long as colleges see themselves in the knowledge-production business and the earnings-power business, they will be vulnerable to cheaper alternatives. Such quantifiable ends can be achieved more cheaply, and sometimes better, using technology and distance learning. Only education—the leading of students into a common world of tradition, values, and common sense—depends on the residential model of one-to-one in-person learning associated with the liberal arts college. The large university lecture course is clearly an endangered species.

Which is why it is so surprising to read a nearly diametrically opposed position suggesting that we are about to enter a golden age for untenured and adjunct faculty. This is the opinion of Michael Bérubé, the President of the Modern Language Association. Bérubé gave the Presidential Address at the 2013 MLA meetings in Boston earlier this month.

It is helpful and instructive to compare Hardin’s technophilic optimism with Bérubé’s recent remarks. He dedicated much of his speech to a very different optimism, namely that contingent and adjunct faculty would finally get the increased salaries and respect that they deserved. According to Bérubé:

[F]or the first time, MLA recommendations for faculty working conditions [are] being aggressively promoted by way of social media…. After this, I think, it really will be impossible for people to treat contingent faculty as an invisible labor force. What will come of this development I do not know, but I can say that I am ending the year with more optimism for contingent faculty members than I had when I began the year, and that’s certainly not something I thought I would be able to say tonight.

Bérubé’s talk is above all a defense of professionalization in the humanities. He defends graduate training in theory as a way to approach literary texts. He extols the virtues of specialized academic research over and above teaching. He embraces and justifies “careers of study in the humanities” over and against the humanities themselves. Above all, he argues that there are good reasons to “bother with advanced study in the humanities.” In short, Bérubé defends not the humanities, but the specialized study of the humanities by a small group of graduate students and professors.

I understand what Bérubé means. There is a joy in the pursuit of esoteric knowledge, even if he eschews the idea of joy, preferring instead to describe his pursuit as work and professionalized labor. But to think that there is an optimistic future for the thousands of young graduate students and contingent faculty who are currently hoping to make professional careers in the advanced study of the humanities is lunacy. Yes, the advanced study of the humanities is joyful for some. But why should it be a paying job? There is a real blindness in Bérubé’s speech not only to the technological and economic imperatives of the moment, but also to the idea of the humanities.

As Hannah Arendt wrote 50 years ago in her essay On Violence, humanities scholars today are better served by being learned and erudite than by doing original research that uncovers some new or forgotten scrap. What we need is not professional humanities scholars so much as educated and curious thinkers and readers.

As I have written before:

To say that excessively specialized humanities scholarship today is irrelevant is not to say that the humanities are irrelevant. The humanities are that space in the university system where power does not have the last word, where truth and beauty as well as insight and eccentricity reign supreme, and where young people come into contact with the great traditions, writing, and thinking that have made us who we are today. The humanities introduce us to our ancestors and our forebears and acculturate students into their common heritage. It is in the humanities that we learn to judge the good from the bad and thus where we first encounter the basic moral facility for making judgments. It is because the humanities teach taste and judgment that they are absolutely essential to politics. It is even likely that the decline of politics today is profoundly connected to the corruption of the humanities.

If humanities programs and liberal arts colleges go the way of the duck-billed platypus, it will only partly be because of new technologies and rising debt. It will also be because the over-professionalization of the humanities has led—in some but not all colleges—to a course of study that simply is not seen as valuable by many young people. The changes that Hardin and Ferenstein see coming will certainly shake up the all-too-comfortable world of professional higher education. That is not bad at all. The question is whether educators can adapt and begin to offer courses and learning that is valuable. But that will only happen if we abandon the hyper-professionalized self-image defended by scholars like Michael Bérubé. One model for such a change is, of course, the public intellectual writing and thinking of Hannah Arendt.

-RB

16Jan/130

The Progeny of Teachers

San Jose State University is experimenting with a program where students pay a reduced fee for online courses run by the private firm Udacity. Teachers and their unions are in retreat across the nation. And groups like Uncollege insist that schools and universities are unnecessary. At a time when teachers are everywhere on the defensive, it is great to read this opening salvo from Leon Wieseltier:

When I look back at my education, I am struck not by how much I learned but by how much I was taught. I am the progeny of teachers; I swoon over teachers. Even what I learned on my own I owed to them, because they guided me in my sense of what is significant.

I share Wieseltier’s reverence for educators. Eric Rothschild and Werner Feig lit fires in my brain while I was in high school. Austin Sarat taught me to teach myself in college. Laurent Mayali introduced me to the wonders of history. Marianne Constable pushed me to be a rigorous reader. Drucilla Cornell fired my idealism for justice. And Philippe Nonet showed me how much I still had to know and inspired me to read and think ruthlessly in graduate school. Like Wieseltier, I can trace my life’s path through the lens of my teachers. 

The occasion for such a welcome love letter to teachers is Wieseltier’s rapacious rejection of homeschooling and unschooling, two movements that he argues denigrate teachers. As sympathetic as I am to his paean to pedagogues, Wieseltier’s rejection of all alternatives to conventional education today is overly defensive.

For all their many ills, homeschooling and unschooling are two movements that seek to personalize and intensify the often conventional and factory-like educational experience of our nation’s high schools and colleges. According to Wieseltier, these alternatives are possessed of the “demented idea that children can be competently taught by people whose only qualifications for teaching them are love and a desire to keep them from the world.” These movements believe that young people can “reject college and become ‘self-directed learners.’” For Wieseltier, the claim that people can teach themselves is both an “insult to the great profession of pedagogy” and a romantic overestimation of the “untutored ‘self.’”

The romance of the untutored self is strong, but hardly dangerous. While today educators like Will Richardson and entrepreneurs like Dale Stephens celebrate the abundance of the internet and argue that anyone can teach themselves with nothing more than an internet connection, that dream has a history. Consider this endorsement of autodidactic learning, offered by Ray Bradbury long before the internet:

Yes, I am. I’m completely library educated. I’ve never been to college. I went down to the library when I was in grade school in Waukegan, and in high school in Los Angeles, and spent long days every summer in the library. I used to steal magazines from a store on Genesee Street, in Waukegan, and read them and then steal them back on the racks again. That way I took the print off with my eyeballs and stayed honest. I didn’t want to be a permanent thief, and I was very careful to wash my hands before I read them. But with the library, it’s like catnip, I suppose: you begin to run in circles because there’s so much to look at and read. And it’s far more fun than going to school, simply because you make up your own list and you don’t have to listen to anyone. When I would see some of the books my kids were forced to bring home and read by some of their teachers, and were graded on—well, what if you don’t like those books?

In this interview in the Paris Review, Bradbury not only celebrates the freedom of the untutored self, but also dismisses college along much the same lines as Dale Stephens of Uncollege does. Here is Bradbury again:

You can’t learn to write in college. It’s a very bad place for writers because the teachers always think they know more than you do—and they don’t. They have prejudices. They may like Henry James, but what if you don’t want to write like Henry James? They may like John Irving, for instance, who’s the bore of all time. A lot of the people whose work they’ve taught in the schools for the last thirty years, I can’t understand why people read them and why they are taught. The library, on the other hand, has no biases. The information is all there for you to interpret. You don’t have someone telling you what to think. You discover it for yourself. 

What the library and the internet offer is unfiltered information. For the autodidact, that is all that is needed. Education is a self-driven exploration of the database of the world.

Of course, such arguments are elitist. Not everyone is a Ray Bradbury or a Gottfried Wilhelm Leibniz, who taught himself Latin in a few days. Hannah Arendt refused to go to her high school Greek class because it was offered at 8 am—too early an hour for her mind to wake up, she claimed. She learned Greek on her own. For such people self-learning is an option. But even Arendt needed teachers, which is why she went to Marburg to study with Martin Heidegger. She had heard, she later wrote, that thinking was happening there. And she wanted to learn to think.

What is it that teachers teach when they are teaching? To answer “thinking” or “critical reasoning” or “self-reflection” is simply to open more questions. And yet these are the crucial questions we need to ask. At a time when education is increasingly confused with information delivery, we need to articulate and promote the dignity of teaching.

What is most provocative in Wieseltier’s essay is his civic argument for a liberal arts education.  Education, he writes, is the salvation of both the person and the citizen. Indeed it is the bulwark of a democratic politics:

Surely the primary objectives of education are the formation of the self and the formation of the citizen. A political order based on the expression of opinion imposes an intellectual obligation upon the individual, who cannot acquit himself of his democratic duty without an ability to reason, a familiarity with argument, a historical memory. An ignorant citizen is a traitor to an open society. The demagoguery of the media, which is covertly structural when it is not overtly ideological, demands a countervailing force of knowledgeable reflection.

That education is the answer to our political ills is an argument heard widely. During the recent presidential election, the candidates frequently appealed to education as the panacea for everything from our flagging economy to our sclerotic political system. Wieseltier trades in a similar argument: A good liberal arts education will yield critical thinkers who will thus be able to parse the obfuscation inherent in the media and vote for responsible and excellent candidates.

I am skeptical of arguments that imagine education as a panacea for politics. Behind such arguments is usually the unspoken assumption: “If X were educated and knew what they were talking about, they would see the truth and agree with me.” There is a confidence here in a kind of rational speech situation (of the kind imagined by Jürgen Habermas) that holds that when the conditions are propitious, everyone will come to agree on a rational solution. But that is not the way human nature or politics works. Politics involves plurality, and the amazing thing about human beings is that, educated or not, we embrace an extraordinary variety of strongly held, intelligent, and conscientious opinions. I am a firm believer in education. But I hold out little hope that education will make people see eye to eye, end our political paralysis, or usher in a more rational polity.

What then is the value of education? And why is it that we so deeply need great teachers? Hannah Arendt saw education as “the point at which we decide whether we love the world enough to assume responsibility for it.” The educator must love the world and believe in it if he or she is to introduce young people to that world as something noble and worthy of respect. In this sense education is conservative, insofar as it conserves the world as it has been given. But education is also revolutionary, insofar as the teacher must realize that young people, as part of the world as it is, will be the ones to change it. Teachers simply teach what is, Arendt argued; they leave to the students the chance to transform it.

To teach the world as it is, one must love the world—what Arendt comes to call amor mundi. A teacher must not despise the world or see it as oppressive, evil, and deceitful. Yes, the teacher can recognize the limitations of the world and see its faults. But he or she must nevertheless love the world with its faults and thus lead the student into the world as something inspired and beautiful. To teach Plato, you must love Plato. To teach geology, you must love rocks. While critical thinking is an important skill, what teachers teach is rather enthusiasm and love of learning. The great teachers are the lovers of learning. What they teach, above all, is the experience of discovery. And they do so by learning themselves.

Education is to be distinguished from knowledge transmission. It must also be distinguished from credentialing. And finally, education is not the same as indoctrinating students with values or beliefs. Education is about opening students to the fact of what is, teaching them about the world as it is. It is then up to the student, the young, to judge whether the world that they have inherited is loveable and worthy of retention, or whether it must be changed. The teacher is not responsible for changing the world; rather the teacher nurtures new citizens who are capable of judging the world on their own.

Arendt thus affirms Ralph Waldo Emerson's view that “He only who is able to stand alone is qualified for society.” Emerson’s imperative, to take up the divine idea allotted to each one of us, resonates with Arendt’s Socratic imperative, to be true to oneself. Education, Arendt insists, must risk allowing people their unique and personal viewpoints, eschewing political education and seeking, simply, to nurture independent minds. Education prepares the youth for politics by bringing them into a common world as independent and unique individuals. From this perspective, the progeny of teachers is the educated citizen, someone who is both self-reliant in an Emersonian sense and part of a common world.

-RB

4Dec/122

The Irony of Sincerity

A few weeks ago, Christy Wampole, a professor of French at Princeton, took to the New York Times to point to what she sees as a pandemic of irony, the symptom of a malignant hipster culture which has metastasized, spreading out from college campuses and hip neighborhoods and into the population at large. Last week, author R. Jay Magill responded to Wampole, noting that the professor was a very late entry into an analysis of irony that stretches back to the last gasps of the 20th century, and that even that discourse fits into a much longer conversation about sincerity and irony that has been going on at least since Diogenes.

Of course, this wasn’t Magill’s first visit to this particular arena; his own entry, entitled Sincerity: How a Moral Ideal Born Five Hundred Years Ago Inspired Religious Wars, Modern Art, Hipster Chic, and the Curious Notion That We All Have Something to Say (No Matter How Dull), came out in July. Magill very effectively recapitulates the main point from his book in his article for the Atlantic, but, if you were to read this new summary alone, you would deny yourself some of the pleasures of Magill’s research and prose, even as you spared yourself some of his less convincing arguments, arguments which, incidentally, carry the thrust of his recent article.

The most interesting chapters of Magill’s book deal with the early history of the rise of sincerity, which he traces back to the Reformation. In Magill’s telling, the word “sincere” enters the record of English in 1533, when an English reformer named John Frith writes, to Sir Thomas More, that John Wycliffe “had lived ‘a very sincere life.’” Before that, in its Latin and French origins, the word “sincere” had been used only to describe objects; now Frith was using it not only for the first time in English but also to describe a particular individual as unusually true and pure to his self, set in opposition to the various hypocrisies that had taken root within the Catholic Church. Magill sums this up quite elegantly: “to be sincere,” he writes, “was to be reformed.”

Now, this would have been revolutionary enough, since it suggested that a relationship with God required internal confirmation rather than external acclamation—in the words of St. Paul, a fidelity to the spirit of the law and not just the letter. And yet reformed sincerity was not simply a return to the Gospel. In order to be true to one’s self, there must be a self to accord with, an inner self to look towards. Indeed, Magill’s history of the idea of sincerity succeeds when it describes the development of the self, and, in particular, the ways that development has been variably determined by the internal or the external.


It gets more complicated, however, or perhaps more interesting, when Magill turns towards deceptive presentations of the self, that is, when he begins to talk about insincerity. He begins this conversation with Montaigne, who “comes to sense a definite split between his public and private selves and is the first author obsessed with portraying himself as he really is.” The most interesting installment of this conversation is an excellent chapter on Jean-Jacques Rousseau, who suggested that people should aspire to self-sameness, should do their best to “reconcile” themselves to themselves, a demand for authenticity that would come to be fully expressed in Immanuel Kant’s moral law, the command that I must set myself as a law for myself.

Sincerity, the moral ideal first put forth by John Frith, started as the Reformation’s response to the Catholic Church’s inability to enact that very principle, in other words, to its hypocrisy. This holds for each of the movements that Magill writes about, each responding to the hypocrisy of its own moment in a specific way. On this matter he has a very good teacher in Hannah Arendt, an inheritor of Kant, who was himself a reader of Rousseau. Arendt writes, in Crises of the Republic, what might serve as a good summation of one of Magill’s more convincing arguments: “if we inquire historically into the causes likely to transform engagés into enragés, it is not injustice that ranks first, but hypocrisy.”

Still, while what makes the sincerity of Frith (who was burned at the stake) or Wycliffe (whose body was exhumed a half century after his death so that it, too, could be burned) compelling is the turn inwards, it is Rousseau’s substitution of the turn back for that turn inward that appears to interest Magill, who decries “the Enlightenment understanding of the world” that “would entirely dominate the West, relegating Rousseau to that breed of reactionary artistic and political minds who stood against the progress of technology, commerce, and modernization and pined for utopia.”

The whole point is moot; Rousseau was himself a hypocrite, often either unable or unwilling to enact the principles he set out in his writings. As Magill moves forward, though, it becomes clear that he values the turn back as a manifestation of sincerity, as a sort of honest self-expression. The last few hundred years in the development of sincerity, it seems, have consisted in finding new iterations of the past in the self. He writes that the Romantics, a group he seems to favor as more sincere than most, “harbored a desire to escape forward-moving, rational civilization by worshipping nature, emotion, love, the nostalgic past, the bucolic idyll, violence, the grotesque, the mystical, the outcast and, failing these, suicide.” In turn, in his last chapter, Magill writes that hipster culture serves a vital cultural purpose: its “sincere remembrance of things past, however commodified or cheesy or kitschy or campy or embarrassing, remains real and small and beautiful because otherwise these old things are about to be discarded by a culture that bulldozes content once it has its economic utility.”

The hipster, for Magill, is not the cold affectation of an unculture, as Wampole wants to claim, but is instead the inheritor “of the entire history of the Protestant-Romantic-rebellious ethos that has aimed for five hundred years to jam a stick into the endlessly turning spokes of time, culture and consumption and yell, ‘Stop! I want to get off!’”

There’s the rub. What Magill offers doesn’t necessarily strike me as a move towards sincerity, but it is definitely a nod to nostalgia. Consider how he recapitulates his argument in the article:

One need really only look at what counts as inventive new music, film, or art. Much of it is stripped down, bare, devoid of over-production, or aware of its production—that is, an irony that produces sincerity. Sure, pop music and Jeff Koons alike retain huge pull (read: $$$), but lately there has been a return to artistic and musical genres that existed prior to the irony-debunking of 9/11: early punk, disco, rap, New Wave—with a winking nod to sparse Casio keyboard sounds, drum machines, naïve drawing, fake digital-look drawings, and jangly, Clash-like guitars. Bands like Arcade Fire, Metric, Scissor Sisters, CSS, Chairlift, and the Temper Trap all go in for heavy nostalgia and an acknowledgement of a less self-conscious, more D.I.Y. time in music.

Here, Magill is very selectively parsing the recent history of “indie music,” ignoring a particularly striking embrace of artificial pop music that happened alongside the rise of the “sincere” genres, like new folk, that he favors. There’s no reason to assume that Jeff Koons’s blown-up balloon animals or Andy Warhol’s Brillo Boxes are any less sincere than the Scissor Sisters’ camp disco, just as there is no reason to assume that a desire to return to nature is any less sincere than the move into the city. Although Magill makes a good argument for the hipster’s cultural purpose, that purpose is not itself evidence that the hipster is expressing what’s truly inside himself, just as there’s no way for you to be sure that I am sincerely expressing my feelings about Sincerity. Magill, ultimately, makes the same mistake as Wampole, in that he judges with no evidence; the only person you can accurately identify as sincere is yourself.

-Josh Kopin

28May/120

We Create the Conditions that Condition Us

"The human condition comprehends more than the condition under which life has been given to man. Men are conditioned beings because everything they come in contact with turns immediately into a condition of their existence.  The world in which the vita activa spends itself consists of things produced by human activities; but the things that owe their existence exclusively to men nevertheless constantly condition their human makers."

-Hannah Arendt, The Human Condition, 1958, p. 9

The human condition is the context or situation we, as human beings, find ourselves in, the implication being that human life cannot be fully understood by considering humanity in isolation from its environment.  We are, to a large degree, shaped by our environment, which is why Arendt refers to us as conditioned beings.

We are conditioned by phenomena external to us, and this may be considered learning in its broadest sense, that is, in the sense that the Skinnerian conditioned response is a learned reaction to external stimuli.  It follows that any form of life that is capable of modifying its behavior in response to external stimuli is, to some extent, a conditioned being.

On a grander scale, natural selection, as it is popularly understood, can be seen as a conditioning force.  Survival of the fittest is survival of those best able to adapt to existing external conditions, survival of those best able to meet the conditions of their environment.  The fittest are, quite naturally, those in the best condition, that is, the best condition to survive.  Whether we are considering the effects of natural selection upon an entire species, or individual members of a species, or what Richard Dawkins refers to as the selfish gene, the environment sets the conditions that various forms of life must meet to survive and reproduce.

Such views are inherently incorrect insofar as they posit an artificial separation between the conditions of life and the form of life that is conditioned.  An ecological or systems view would instead emphasize the interdependent and interactive relationships that exist, as all forms of life alter their conditions simply by their very presence, by their metabolism, for example, and through their reproduction.  Darwin understood this, I hasten to add, and the seeds of ecology can be found in his work, although they did not fully germinate until the turn of the 20th century.  And Skinner certainly was aware of the individual's capacity for self-stimulation, and self-modification, but a truly relational approach in psychology did not coalesce until Gregory Bateson introduced a cybernetic perspective during the 1950s.

In the passage quoted above, it is readily apparent that Arendt is an ecological thinker.  In saying that, "the things that owe their existence exclusively to men nevertheless constantly condition their human makers," she is saying that we create the conditions that in turn condition us.  We exist within a reciprocal relationship, a dialogue if you like, between the conditioned and the conditions, the internal and the external, the organism and its environment.  The changes that we introduce into our environment, and that alter the environment, feed back into ourselves as we are influenced, affected, and shaped by our environment.

The contrast between using tools and techniques in the most basic way to adapt to the conditions of the environment, and creating an entirely new technological environment of great complexity that requires us to perform highly convoluted acts of adaptation, was portrayed with brilliant sensitivity and humor in The Gods Must Be Crazy, the 1980 South African film directed by Jamie Uys.  A good part of the documentary-style opening can be seen on this YouTube clip:

The story of the Coke bottle, although fictional, follows the pattern of many documented cases in which the introduction of new technologies to traditional societies has had disruptive, and often enough disastrous, effects (the film itself, I hasten to add, is marvelously comedic, and quite often slapstick, following the introductory quarter hour).

The understanding that we are conditioned by the conditions we ourselves introduce was not unknown in the ancient world.  The 115th Psalm of David, in its polemic against idolatry and the idols that are "the work of men's hands," cautions that "they who make them shall be like unto them; yea every one that trusts in them."  Along the same lines, the Gospel of Matthew includes the famous quote, "all those who take up the sword shall perish by the sword," while the Epistle to the Galatians advises, "whatsoever a man sows, that shall he also reap." A more contemporary variation of that maxim is, "as you make your bed, so you shall lie on it," although in the United States it is often rendered in the imperative and punitive form of, "you made your bed, go lie in it!"  During the 19th century, Henry David Thoreau warned us that "we do not ride on the railroad; it rides upon us," while Mark Twain humorously observed that, "if all you have is a hammer, everything looks like a nail."  More recently, we have been told, "ask a silly question, get a silly answer," to which computer scientists have responded with the acronym GIGO, which stands for "garbage in, garbage out."  Winston Churchill said, "we shape our buildings, and thereafter they shape us," and former Fordham professor John Culkin, in turn, offered, "we shape our tools, and thereafter they shape us," as a corollary to Marshall McLuhan's media ecology aphorism, "the medium is the message."

All of these voices, in their varying ways, are pointing to the same essential truth about the human condition that Arendt is relating in the quote that begins this post.  And to pick up where that quote leaves off, Arendt goes on to argue,

In addition to the conditions under which life is given to man on earth, and partly out of them, men constantly create their own, self-made conditions, which, their human origin and their variability notwithstanding, possess the same conditioning power as natural things.

The "conditions" that we make are used to create a buffer or shield against the conditions that we inherit, so that our self-made conditions are meant to stand between us and what we would consider to be the natural environment.  In this sense, our self-made conditions mediate between ourselves and the pre-existing conditions that we operate under, which is to say that our conditions are media of human life.  And in mediating, in going between our prior conditions and ourselves, the new conditions that we create become our new environment.  And as we become conditioned to our new conditions, they fade from view, being routinized they melt into the background and become essentially invisible to us.

Let us return now to the conclusion of the passage from The Human Condition:

Whatever touches or enters into a sustained relationship with human life immediately assumes the character of a condition of human existence.  This is why men, no matter what they do, are always conditioned beings.  Whatever enters the world of its own accord or is drawn into it by human effort becomes part of the human condition.  The impact of the world's reality upon human existence is felt and received as a conditioning force.  The objectivity of the world—its object- or thing-character—and the human condition supplement each other; because human existence is conditioned existence, it would be impossible without things, and things would be a heap of unrelated articles, a non-world, if they were not the conditioners of human existence.

This last point is quite striking.  It is we, as human beings, who create worlds, which brings to mind the moving commentary from the Talmud:  "whoever saves a life, it is considered as if he saved an entire world."  We create worlds, in the sense that we give meaning to existence, we attribute meaning to phenomena, we construct symbolic as well as material environments.  Each one of us, in our singular subjectivity, creates a world of our own, and therefore each one of us represents a world unto ourselves.

But these individual worlds are links, nodes in a social network, interdependent and interactive parts of an ecological whole.  The term condition, in its root meaning, is derived from the Latin prefix com, which means together, and dicere, which means to speak.  And our ability to speak together, to engage in discussion and deliberation, to enter into symbolic interaction, constitutes the means by which we collectively construct our intersubjective, social realities, our worlds.

As human beings, we are conditioned not only by our labor, the ways in which we obtain the necessities of life, i.e., air, water, food, shelter, to which Marx sought to reduce all aspects of society, a position that Arendt severely criticized.  We are conditioned not only by our work, which Arendt associated with artifacts, with instrumentality and technology, with arts and crafts.  We are conditioned most importantly by action, which in Arendt's view is intimately tied to speech and the symbolic, and to processes rather than things, to relations rather than objects.

In the end, Arendt reminds us that the human condition is itself conditional, and to be fully human requires not only that we take care of biological necessity, nor that we make life easier through technological innovation, but that we cooperate through speech and action in collectively constructing a world that is truly blessed with freedom and with justice.

-Lance Strate

 

5Mar/120

On the History of “Genocide”

What precisely do we mean when we use the term “genocide”? Has the word always been associated with the mass killing of individuals on the basis of their group affiliation? Or have there been alternative conceptions of genocide of which we should be aware?

These questions were at the heart of the Hannah Arendt Center’s latest Lunchtime Talk, which occurred amid picturesque snowfall on Wednesday, February 29th. The presenter was Douglas Irvin, a Ph.D. candidate at Rutgers’ Center for the Study of Genocide and Human Rights. Irvin’s talk revolved around the work of the Polish-Jewish lawyer Raphael Lemkin (1900-1959).

After escaping from Nazi-occupied Poland and lecturing at the University of Stockholm, Lemkin emigrated to the U.S., served as an advisor at the Nuremberg Trials, and played a central role in the passage of the 1948 U.N. Genocide Convention. Indeed, Lemkin was the first public figure to use the term “genocide,” which he derived from the Greek root genos (family, race, or tribe) and the Latin root cide (killing).

Lemkin and Arendt were contemporaries with overlapping experiences and interests, but they engaged very little with one another in print (aside, perhaps, from a few allusions and anonymous criticisms). Irvin contends that there are good reasons for this lack of dialogue, since the two differed significantly in their views of genocide and humanity more broadly.

On the one hand, Arendt regarded genocide as a historically recent outgrowth of modern totalitarianism. According to Irvin, this understanding was in keeping with her more general conception of the human cosmos, which ultimately emerged through, and was grounded in, individual interactions within the arena of the polis.

Lemkin, by contrast, regarded genocide as a much older phenomenon, one that was premised not on the destruction of individuals on the basis of their group affiliation, but rather on the annihilation of entire cultural traditions and collective identities. Drawing eclectically on the work of seventeenth-century Spanish theologians, romantic thinkers like Johann Gottfried von Herder, and anthropological understandings of cultures as integrated wholes, Lemkin ultimately defined genocide as a coordinated attack on the conditions that make the lives of nations and other collectivities possible.

In this conception, genocide does not necessarily or inevitably entail the mass killing of a group’s members, but rather turns on concerted efforts to obliterate that group’s institutions, language, religious observance, and economic livelihood. In Irvin’s argument, this approach resonated with the broadly communitarian nature of Lemkin’s thought: human existence was in his estimation defined by interactions between culture-bearing groups, and human freedom could ultimately be secured through the benevolent recognition and protection of cultural pluralism.

Significantly, the U.N. Genocide Convention that Lemkin championed did not incorporate many aspects of his thinking. His ideas encountered strong resistance from the U.S., U.K., and other imperial powers, many of which feared that their treatment of indigenous and colonial populations would qualify as genocide under the standards that Lemkin (and his collaborators) proposed. As a result, our current understanding of genocide is in no small part a byproduct of a diplomatic battle to redefine this legal category in a fashion that would encompass the Nazi Holocaust but not implicate other states (including several of the Allied powers that fought against Germany in World War II). This wrangling has also contributed to the minimal attention that has since been paid to Lemkin’s ideas, which were only rediscovered in a significant way in the early 1990s.

Douglas Irvin’s stimulating talk suggested that such inattention is unfortunate. Whatever one thinks of Lemkin’s effort to inscribe a form of cultural relativity into liberal international law, a more thoughtful engagement with his life and thought can only enrich our understanding of genocide’s career as a concept.

Click here to watch the Douglas Irvin lunchtime talk.

-Jeff Jurgens