“Scientific and philosophic truth have parted company.”
—Hannah Arendt, The Human Condition, 41.290
What can it mean that there are two different types of truth—scientific and philosophic? And how could they not be connected?
Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.
Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.
Anthony Grafton calls David Nirenberg’s Anti-Judaism “one of the saddest stories, and one of the most learned, I have ever read.” Grafton knows that Anti-Judaism “is certainly not the first effort to survey the long grim history of the charges that have been brought against the Jews by their long gray line of self-appointed prosecutors.” What makes this account of the long history of hatred of Jews so compelling is that Nirenberg asks the big question: Why the Jews? “[Nirenberg] wants to know why: why have so many cultures and so many intellectuals had so much to say about the Jews? More particularly, he wants to know why so many of them generated their descriptions and explanations of Jewishness not out of personal knowledge or scholarly research, but out of thin air—and from assumptions, some inherited and others newly minted, that the Jews could be wholly known even to those who knew no Jews.” The question recalls the famous joke told during the Holocaust, especially amongst Jews in concentration camps. Here is one formulation of the joke from Antisemitism, the first book in the trilogy that comprises Hannah Arendt’s magnum opus, The Origins of Totalitarianism: “An antisemite claimed that the Jews had caused the war; the reply was: Yes, the Jews and the bicyclists. Why the bicyclists? asks the one. Why the Jews? asks the other.” Read more on the Arendt Center blog.
News that the SAT is about to undergo a makeover leaves Bard College President Leon Botstein unimpressed: “The changes recently announced by the College Board to its SAT college entrance exam bring to mind the familiar phrase ‘too little, too late.’ The alleged improvements are motivated not by any serious soul searching about the SAT but by the competition the College Board has experienced from its archrival, the ACT, the other major purveyor of standardized college entrance exams. But the problems that plague the SAT also plague the ACT. The SAT needs to be abandoned and replaced. The SAT has a status as a reliable measure of college readiness it does not deserve. The College Board has successfully marketed its exams to parents, students, colleges and universities as arbiters of educational standards. The nation actually needs fewer such exam schemes; they damage the high school curriculum and terrify both students and parents. The blunt fact is that the SAT has never been a good predictor of academic achievement in college. High school grades adjusted to account for the curriculum and academic programs in the high school from which a student graduates are. The essential mechanism of the SAT, the multiple choice test question, is a bizarre relic of long outdated twentieth century social scientific assumptions and strategies. As every adult recognizes, knowing something or how to do something in real life is never defined by being able to choose a ‘right’ answer from a set of possible answers (some of them intentionally misleading) put forward by faceless test designers who are rarely eminent experts. No scientist, engineer, writer, psychologist, artist, or physician— and certainly no scholar, and therefore no serious university faculty member—pursues his or her vocation by getting right answers from a set of prescribed alternatives that trivialize complexity and ambiguity.”
Foreign policy types are up in arms—not over Russia’s pending annexation of Crimea, but over the response in the West. By yelling loudly but doing nothing in Syria and now in Ukraine, America and Europe are losing all credibility. The insinuation is clear. If we don’t draw the line at Crimea, we will embolden Putin in Poland. Much as in the 1930s, the current NATO alliance seems unwilling to stand up for anything on principle if the costs are more than a few photo opportunities and some angry tweets. According to The American Interest, “Putin believes the West is decadent, weak, and divided. The West needs to prove him wrong.” And in Politico, Ben Judah writes: “Russia’s rulers have been buying up Europe for years. They have mansions and luxury flats from London’s West End to France’s Côte d’Azur. Their children are safe at British boarding and Swiss finishing schools. And their money is squirrelled away in Austrian banks and British tax havens. Putin’s inner circle no longer fear the European establishment. They once imagined them all in MI6. Now they know better. They have seen firsthand how obsequious Western aristocrats and corporate tycoons suddenly turn when their billions come into play. They now view them as hypocrites—the same European elites who help them hide their fortunes.”
In The New York Times Magazine, Siddhartha Deb profiles Arundhati Roy, the Indian writer best known in the West for her 1997 novel The God of Small Things. Though the book made Roy into a national icon, her political essays – in which she has addressed, among other issues, India’s occupation of Kashmir, the “lunacy” of India’s nuclear programme, and the paramilitary operations in central India against the ultraleft guerrillas and indigenous populations – have angered many nationalist and upper-class Indians with their fierce critiques. Roy’s most recent work, The Doctor and the Saint, is an introduction to Dr. B.R. Ambedkar’s famous 1936 essay “The Annihilation of Caste” that is likely to spark controversy over her rebuke of Gandhi, who wanted to abolish untouchability but not caste. How does Roy see her fiction in relation to her politics? “I’m not a person who likes to use fiction as a means,” she says. “I think it’s an irreducible thing, fiction. It’s itself. It’s not a movie, it’s not a political tract, it’s not a slogan. The ways in which I have thought politically, the proteins of that have to be broken down and forgotten about, until it comes out as the sweat on your skin.” You can read Deb’s profile of Roy here, and an excerpt from The Doctor and the Saint here.
Comparing the MOOC and the GED, Michael Guerreiro wonders whether participants approach both programs with the same sense of purpose. The answer, he suspects, is no: “The data tells us that very few of the students who enroll in a MOOC will ever reach its end. In the ivy, brick, and mortar world from which MOOCs were spun, that would be damning enough. Sticking around is important there; credentials and connections reign, starting with the high-school transcript and continuing through graduate degrees. But students may go into an online course knowing that a completion certificate, even offered under the imprimatur of Harvard or UPenn, doesn’t have the same worth. A recent study by a team of researchers from Coursera found that, for many MOOC students, the credential isn’t the goal at all. Students may treat the MOOC as a resource or a text rather than as a course, jumping in to learn new code or view an enticing lecture and back out whenever they want, just as they would while skimming the wider Web. For many, MOOCs may be just one more Internet tool or diversion; in the Coursera study, the retention rate among committed students for a typical class was shown to be roughly on par with that of a mobile app. And the London Times reported last week that, when given the option to get course credit for their MOOC (for a fee), none of the thousand or so students who enrolled in a British online class did.” A potent reminder that while MOOCs may indeed succeed and may even replace university education for many people, they are not so much about education as a combination of entertainment, credential, and manual. Each of these is an important activity, but they are not what liberal arts colleges should be about. The hope in the rise of MOOCs, as we’ve written before, is that they help return college to its mission: to teach critical thinking and expose students to the life of the mind.
Noam Chomsky, speaking to the Adjunct Faculty Association of the United Steelworkers, takes issue with the idea that the American university was once living and is now undead, and seeks a way forward: “First of all, we should put aside any idea that there was once a ‘golden age.’ Things were different and in some ways better in the past, but far from perfect. The traditional universities were, for example, extremely hierarchical, with very little democratic participation in decision-making. One part of the activism of the 1960s was to try to democratize the universities, to bring in, say, student representatives to faculty committees, to bring in staff to participate. These efforts were carried forward under student initiatives, with some degree of success. Most universities now have some degree of student participation in faculty decisions. And I think those are the kinds of things we should be moving towards: a democratic institution, in which the people involved in the institution, whoever they may be (faculty, students, staff), participate in determining the nature of the institution and how it runs; and the same should go for a factory. These are not radical ideas.”
This week on the blog Anna Metcalfe examines the multi-dimensional idea of action which Arendt discusses in The Human Condition. And in the Weekend Read, entitled “Why the Jews?”, Roger Berkowitz delves into anti-Judaism and its deep-seated roots in Western civilization.
Featuring Housekeeping by Marilynne Robinson.
Bard College partners with five local libraries for six weeks of activities, performances, and discussions scheduled throughout the Hudson Valley.
Learn more here.
'What Europe? Ideals to Fight for Today'
The HAC co-sponsors the second annual conference with Bard College in Berlin
March 27-28, 2014
Learn more here.
At Duke University and the University of North Carolina, two highly popular professors have transformed their course Think Again: How to Reason and Argue into a Massive Open Online Course (MOOC) that is taken by 170,000 people from all over the world at a time. This is old news. There is nothing to worry about when hundreds of thousands of people around the world watch flashy lectures by top professors on how to think and argue. Better such diversions than playing Temple Run. There are advantages and benefits from MOOCs and other forms of computer learning. And we should not run scared from MOOCs.
But the alacrity with which universities are adopting MOOCs as a way of cutting costs and marketing themselves as international brands harbors a danger too. The danger is not that more people will watch MOOCs or that MOOCs might be used to convey basic knowledge inside or outside of universities. No, the real danger in MOOCs is that watching a professor on your iPad becomes confused with education.
You know elite universities are in trouble when their professors say things like Edward Rock does. Rock, Distinguished Professor at the University of Pennsylvania Law School and coordinator of Penn’s online education program, has this to say about the impending revolution in online education:
We’re in the business of creating and disseminating knowledge. And in 2012, the internet is an incredibly important place to be present if you’re in the knowledge dissemination business.
If elite colleges are in the knowledge dissemination business, then they will over time be increasingly devalued and made less relevant. There is no reason that computers or televisions can’t convey knowledge as well or even better than humans. Insofar as professors and colleges imagine themselves to be in the “business of creating and disseminating knowledge,” they will be replaced by computers. And it will be their own fault.
The rising popularity of MOOCs must be understood not as a product of new technology, but as a response to the failure of our universities. As Scott Newstock has argued, the basic principle behind MOOCs is hardly new. Newstock quotes one prominent expert who argues that the average distance learner “knows more of the subject, and knows it better, than the student who has covered the same ground in the classroom.” Indeed, “the day is coming when the work done [via distance learning] will be greater in amount than that done in the class-rooms of our colleges.” What you might not expect is that this prediction was made in 1885. “The commentator quoted above was Yale classicist (and future University of Chicago President) William Rainey Harper, evaluating correspondence courses.” What Newstock’s provocation shows is that efforts to replace education with knowledge dissemination have been around for centuries. But they have failed, at least until now.
MOOCs are so popular today because of the sadly poor quality of much—but certainly not all—college and university education. Around the country there are cavernous lecture halls filled with many hundreds of students. A lone professor stands up front, often with a PowerPoint presentation in a darkened room. Students have their computers open. Some are taking notes, but many are checking Facebook or surfing the Internet. Some are asleep. And others did not bother to show up, since the professor has posted his or her lecture notes online so that students can just read them instead of making the effort to make it to class. Such lectures may be half-decent ways to disseminate knowledge. Some lectures are better than others. But not much learning goes on in such lectures that can’t be simply replicated more efficiently and maybe even better on a computer. It is in this context that advocates of MOOCs are correct. When one compares a large lecture course with a well-designed online course, it may very well be that the online course is a superior educational venture. That it is cheaper too makes the advance of MOOCs seemingly inevitable.
As I have written here before, the best argument for MOOCs is that they may finally put the large and impersonal college lecture course out of its misery. There is no reason to be nostalgic for the lecture course. It was never a very good idea. Aside from a few exceptional lecturers—in my world I can think of the reputations of Hegel, his student Eduard Gans, Martin Heidegger, and, of course, Hannah Arendt—college lectures are largely an economical way to allow masses of students to acquire basic introductory knowledge in a field. If the masses are now more massive and the lectures more accessible, I’ll accept that as progress.
What this means is that there is an opportunity, at this moment, to embrace MOOCs as a disruptive force that will allow us to re-dedicate our universities and colleges to the practice of education as opposed to the business of knowledge dissemination. What colleges and universities need to offer is not simply knowledge, but education.
“Education,” as Martin Luther King wrote, “must also train one for quick, resolute and effective thinking.” Quick and resolute thinking requires that one “think incisively” and “think for one's self.” This “is very difficult.” The difficulty comes from the seduction of conformity and the power of prejudice. “We are prone to let our mental life become invaded by legions of half truths, prejudices, and propaganda.” We are all educated into prejudgments. They are human, and it is inhuman to live free from prejudicial opinions and thoughts. On the one hand, education is the way we are led into and brought into a world as it exists, with its prejudices and values. And yet, education must also produce self-thinking persons, people who, once they are educated and enter the world as adults, are capable of judging the world into which they have been born. (I have written more about King’s thoughts on education here.)
In her essay “The Crisis in Education,” Hannah Arendt writes that education must have a double aspect. First, education leads a new young person into an already existing world. The world is that which is there before the child was born and will continue to exist after the child dies. It is the common world of things, stories, and experiences in which all of us spend our lives. All children, as newcomers who are born into a world that is at first strange to them, must be led into the already existing world. They must be taught to speak a common language, respect common values, see the same facts, and hear the same stories. This common world is what Arendt calls the “truth… we cannot change; metaphorically, it is the ground on which we stand and the sky that stretches above us.” In its first aspect, then, education must protect the world from “the onslaught of the new that bursts upon it with each new generation.” This is the conservationist function of education: to conserve the common world against the rebelliousness of the new. And this is why Arendt writes, “Education is the point at which we decide whether we love the world enough to assume responsibility for it.”
At the same time, however, there is a second aspect of education that seeks to afford the child “special protection and care so that nothing destructive may happen to him from the world.” The teacher must nurture the independence and newness of each child, what “we generally call the free development of characteristic qualities and talents… the uniqueness that distinguishes every human being from every other.” The teacher must not simply love the world, but as part of the world in which we live, the teacher must also love the fact—and it is a fact—that the world will change and be transformed by new ideas and new people. Education must love this transformative nature of children, and we must “love our children enough” so that we do not “strike from their hands their chance of undertaking something new, something unforeseen by us, but to prepare them in advance for the task of renewing a common world.” Alongside its conservationist role, education also must be revolutionary in the sense that it prepares students to strike out and create something altogether new.
Now is the time to use the disruption around MOOCs to rethink and re-invigorate our commitment to education and not simply to the dissemination of knowledge. This will not be easy.
A case in point is the same Duke University course mentioned above, “Think Again: How to Reason and Argue.” In a recent article by Michael Fitzgerald, the professors—Walter Sinnott-Armstrong of Duke and Ram Neta of the University of North Carolina at Chapel Hill—describe how teaching their MOOC led them to radically re-conceive how they teach in physical university classrooms. Here is Fitzgerald:
“The big shift: far fewer in-class lectures. Students will watch the lectures on Coursera beginning Monday. ‘Class will become a time for activities and also teamwork,’ said Sinnott-Armstrong. He's devised exercises to help on-campus students engage with the concepts in the class, including a college bowl-like competition, a murder mystery night and a scavenger hunt, all to help students develop a deeper understanding of the material presented in the lectures. ‘You can have these fun activities in the classroom when you're not wasting the classroom time with the lectures,’ he said.”
What we see here is that the mass appeal of MOOCs and their use as a way of replacing lectures is not being seized as an opportunity to make education more serious, but as an excuse to make college more fun. That professors at two of this country’s elite universities see it as progress that classes are replaced by murder mystery games and scavenger hunts is evidence of a profound confusion between education and infotainment. I have no doubt that much can be learned through fun and games. Children learn through games and it makes all the sense in the world that Finland allows children in schools to play until they are seven or eight years old. Even in primary or at times in secondary school, simulations and games may be useful. But there is a limit. Education, at least higher education, is not simply fun and games in the pursuit of knowledge.
As Arendt understood, education requires that students be nurtured and allowed to grow into adults who think for themselves in a serious and engaged way about the world. This is one reason Arendt is so critical of reformist pedagogy that seeks to stimulate children—especially older children in secondary schools and even college—to learn through play. When we teach children a foreign language through games instead of through grammar or when we make them learn history by playing computer games instead of by reading and studying, we “keep the older child as far as possible at the infant level. The very thing that should prepare the child for the world of adults, the gradually acquired habit of work and of not-playing, is done away with in favor of the autonomy of the world of childhood.” The same can be said of university courses that adopt the juvenile means of primary and secondary education.
The reasons for such a move to games in the classroom are many. Games are easy, students love them, and thus they fill massive classes, leading to superstar professors who can command supersized salaries. What is more, games work. You can learn a language through games. But games rarely teach seriousness and independence of thought.
The rise of MOOCs and the rise of fun in the college classroom are part of the trend to reduce education to a juvenile pursuit. One hardly needs an advanced degree to oversee a scavenger hunt or prepare students to take a test. And scavenger hunts, as useful as they may be in making learning fun, will hardly inculcate the independence of mind and strength of character that will produce self-thinking citizens capable of renewing the common world.
The question of how to address the crisis in education today—the fact that an ever more knowledgeable population with greater access to information than at any time in the history of the world is perhaps the most politically illiterate citizenry in centuries—is the theme of the upcoming Hannah Arendt Center Conference, “Failing Fast: The Educated Citizen in Crisis.” In preparation for the conference, you can do nothing better than to re-read Hannah Arendt’s essay, “The Crisis in Education.” You can also buy Between Past and Future, the book of essays in which it appears. However you read it, “The Crisis in Education” is your weekend read.
“[Augustine] distinguishes between the questions of "Who am I?" and "What am I?" the first being directed by man at himself […] For in the "great mystery," the grande profundum, which man is (iv. 14), there is "something of man [aliquid hominis] which the spirit of man which is in him itself knoweth not. But Thou, Lord, who has made him [fecisti eum] knowest everything of him [eius omnia]" (x. 5).”
—Hannah Arendt, The Human Condition
In The Human Condition, Arendt raises major concerns about the place of man, but she does not intend to respond to the loss of the earth as a unique human condition with a restoration of solid ground. To the question “What am I?” the only answer is: “You are a man—whatever that may be.” In lieu of an answer that would give man a new foundation, Arendt offers a description of man's ever-changing territory.
Following Augustine, Arendt claims that only God could have the distance to answer the question of "who" man is with anything resembling a concrete statement of human nature. She respects the unknown “spirit of man,” even beyond the knowledge provided by religion.
When philosophy attempts to answer this question, it ends up creating its own image of a higher power, which remains linked through projection to man. Importantly though, philosophy should still ask the question.
Some context can help open Arendt's question here for readers in English-speaking countries, where philosophical anthropology never gained the traction it did in Germany. Her challenge picks up on the heated debates of the 1920s and '30s, which culminated in the work of Husserl and Heidegger, over how to take the collapse of universal values seriously without falling back into simple subjectivism.
In the space of four pages of Being and Time (46-49), Martin Heidegger specifies his criticism with reference to Dilthey, Bergson, Scheler, and Husserl, as well as views from ancient Greek philosophy and Genesis. Heidegger says he has focused his analytic of Dasein on the question of Being and that it cannot therefore provide the fully ontological basis of Dasein needed for a "philosophical anthropology," but states that part of his goal is to "make such an anthropology possible." Later, though, in section 10, Heidegger provides a further explanation of his criticism of anthropology: in "the attempt to determine the essence of 'man,' as an entity, the question of Being has been forgotten."
In its turn to experience and consciousness, philosophical anthropology forgets to ask the question of the ontological definition of perceptual experience (cogitationes). Heidegger thus suggests that his investigation might provide the basis for an anthropology but does not claim to actually deliver this basis. He opens the question of the definition of man, but does so to orient man (recast as Dasein) toward his relation to Being. In a parallel manner, we can understand Arendt's reading of Augustine as opening the question of the relation between the "who" and "what" man is, but not closing it. Her work here is provocative because it cannot be said to be in the service of a simple secularization that replaces a higher power with human measure. Nor does she wish to save or restore divine guarantee. Perhaps Augustine allows her to pose questions of philosophical anthropology similar to those raised by Heidegger, but to win some distance from her teacher so that she can open a new space of freedom of action rather than freedom of thought.