Hannah Arendt Center for Politics and Humanities
24 Mar 2014

Amor Mundi 3/23/14

Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.

Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.

What Silver Knows

Data journalist Nate Silver reopened his FiveThirtyEight blog this past week, after leaving the New York Times last year. Although the website launched with a full slate of articles, the opening salvo is a manifesto he calls “What The Fox Knows,” referencing the maxim from the poet Archilochus: “The fox knows many things, but the hedgehog knows one big thing.” For Silver, this means, “We take a pluralistic approach and we hope to contribute to your understanding of the news in a variety of ways.” What separates FiveThirtyEight is its focus on big data, the long trail of information left by everything we do in a digital world. From big data, Silver believes he can predict outcomes more accurately than traditional journalists can, and that he will also be better able to explain and predict human behavior. “Indeed, as more human behaviors are being measured, the line between the quantitative and the qualitative has blurred. I admire Brian Burke, who led the U.S. men’s hockey team on an Olympic run in 2010 and who has been an outspoken advocate for gay-rights causes in sports. But Burke said something on the hockey analytics panel at the MIT Sloan Sports Analytics Conference last month that I took issue with. He expressed concern that statistics couldn’t measure a hockey player’s perseverance. For instance, he asked, would one of his forwards retain control of the puck when Zdeno Chara, the Boston Bruins’ intimidating 6’9″ defenseman, was bearing down on him? The thing is, this is something you could measure. You could watch video of all Bruins games and record how often different forwards kept control of the puck. Soon, the NHL may install motion-tracking cameras in its arenas, as other sports leagues have done, creating a record of each player’s x- and y-coordinates throughout the game and making this data collection process much easier.” As the availability of data increases beyond comprehension, humans will necessarily turn the effort of analysis over to machines running algorithms. Predictions and simulations will abound and human actions—whether voting for a president or holding on to a hockey puck—will increasingly appear to be predictable behavior. The fact that actions are never fully predictable is already fading from view; we have become accustomed to knowing how things will end before they begin. At the very least, Nate Silver and his team at FiveThirtyEight will try to “critique incautious uses of statistics when they arise elsewhere in news coverage.”
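Silver’s point that perseverance “is something you could measure” is easy to make concrete. The sketch below is a purely illustrative Python example, not anything FiveThirtyEight publishes: given hypothetical event records of the kind motion-tracking data could yield, it tallies how often each forward keeps the puck under pressure. The field names and data are invented for the example.

```python
from collections import defaultdict

# Hypothetical event records: one entry each time a forward is pressured
# by a defenseman closing in (an event that x/y tracking data could flag).
events = [
    {"forward": "Forward A", "retained_puck": True},
    {"forward": "Forward A", "retained_puck": False},
    {"forward": "Forward B", "retained_puck": True},
    {"forward": "Forward B", "retained_puck": True},
]

def retention_rates(events):
    """Return each forward's puck-retention rate when pressured."""
    kept, total = defaultdict(int), defaultdict(int)
    for e in events:
        total[e["forward"]] += 1
        kept[e["forward"]] += e["retained_puck"]  # True counts as 1
    return {name: kept[name] / total[name] for name in total}

print(retention_rates(events))  # {'Forward A': 0.5, 'Forward B': 1.0}
```

The measurement itself is trivial; the hard and contested part, as Silver acknowledges, is gathering the underlying events at scale.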

All in All, Another Tweet in the Wall

Author Teju Cole recently composed and released an essay called “A Piece of The Wall” exclusively on Twitter. In an interview, along with details about the technical aspects of putting together what's more like a piece of radio journalism than a piece of print journalism, Cole notes that there may be a connection between readership and change: "I’m not getting my hopes up, but the point of writing about these things, and hoping they reach a big audience, has nothing to do with “innovation” or with “writing.” It’s about the hope that more and more people will have their conscience moved about the plight of other human beings. In the case of drones, for example, I think that all the writing and sorrow about it has led to a scaling back of operations: It continues, it’s still awful, but the rate has been scaled back, and this has been in specific response to public criticism. I continue to believe the emperor has a soul."

A Religious Age?

Peter Berger has a thoughtful critique of Charles Taylor’s A Secular Age, one that accepts Taylor’s philosophical premise but denies its sociological reality. “I think that Taylor’s magnum opus makes a very significant contribution, though I disagree with its central proposition: We don’t live in a “secular age”; rather in most of the world we live in a turbulently religious age (with the exception of a few places, like university philosophy departments in Canada and football clubs in Britain). (Has Taylor been recently in Nepal? Or for that matter in central Texas?) Taylor is a very sophisticated philosopher, not an empirically oriented sociologist of religion. It so happens that we now have a sizable body of empirical data from much of the world (including America and Europe) on what ordinary religious people actually believe and how they relate their faith to various secular definitions of reality. Let me just mention the rich work of Robert Wuthnow, Nancy Ammerman and Tanya Luhrmann in the US, and Grace Davie, Linda Woodhead and Daniele Hervieu-Leger in Europe. There is a phrase that sociology students learn in the first year of graduate study—frequency distribution: It is important for me to understand just what X is; it is even more important for me to know how much X there is at a given time in a given place. The phrase is to be recommended to all inclined to make a priori statements about anything. In this case, I think that Taylor has made a very useful contribution in his careful description of what he calls “the immanent frame” (he also calls it “exclusive humanism”)—a sense of reality that excludes all references to transcendence or anything beyond mundane human experience. Taylor also traced the historical development of this definition of reality.” Maybe the disagreement is more subtle: Religion continues in the secular age, but it is more personal. Quite simply, churches were once the tallest and most central buildings, representing the center of public and civic life. That is no longer the case in Europe; nor in Nepal.

Looking Under the Skin

Anthony Lane in The New Yorker asks the question, “Why should we watch Scarlett Johansson with any more attention than we pay to other actors?” His answer concerns Johansson’s role and performance in her new movie “Under the Skin.” Lane is nearly obsessed with Johansson’s ability to reveal nothing and everything with a look—what he calls the “Johansson look, already potent and unnerving. She was starting to poke under the skin.” He continues, describing Johansson in a photo shoot: ““Give me nothing,” Dukovic said, and Johansson wiped the expression from her face, saying, “I’ll just pretend to be a model.” Pause. “I rarely have anything inside me.” Then came the laugh: dry and dirty, as if this were a drama class and her task was to play a Martini. Invited to simulate a Renaissance picture, she immediately slipped into a sixteenth-century persona, pretending to hold a pose for a painter and kvetching about it: “How long do I have to sit here for? My sciatica is killing me.” You could not wish for a more plausible insight into the mind-set of the Mona Lisa. A small table and a stool were provided, and Johansson sat down with her arms folded in front of her. “I want to look Presidential,” she declared. “I want this to be my Mt. Rushmore portrait.” Once more, Dukovic told her what to show: “Absolutely nothing.” Not long after, he and his team began to pack up. The whole shoot had taken seventeen minutes. She had given him absolutely everything. We should not be surprised by this. After all, film stars are those unlikely beings who seem more alive, not less, when images are made of them; who unfurl and reach toward the light, instead of seizing up, when confronted by a camera; and who, by some miracle or trick, become enriched versions of themselves, even as they ramify into other selves on cue. Clarence Sinclair Bull, the great stills photographer at M-G-M, said of Greta Garbo that “she seems to feel the emotion for each pose as part of her personality.” From the late nineteen-twenties, he held a near-monopoly on pictures of Garbo, so uncanny was their rapport. “All I did was to light the face and wait. And watch,” he said. Why should we watch Johansson with any more attention than we pay to other actors?”

Fantasizing About Being Lost

Geoffrey Gray suggests a reason why we've become obsessed with the missing plane: "Wherever the Malaysia Airlines plane is, it found a hiding place. And the longer it takes investigators to discover where it is and what went wrong, the longer we have to indulge in the fantasy that we too might be able to elude the computers tracking our clicks, text messages, and even our movements. Hidden from the rest of the world, if only for an imagined moment, we feel what the passengers of Flight 370 most likely don't: safe."

 

This Week on the Hannah Arendt Center Blog

This week on the blog, learn more about the Program Associate position now available at the Arendt Center. In the Quote of the Week, Ian Zuckerman looks at the role some of Arendt's core themes play in Kubrick's famed nuclear satire, "Dr. Strangelove." And, HannahArendt.net issues a call for papers for its upcoming 'Justice and Law' edition being released in August of this year.

The Hannah Arendt Center
The Hannah Arendt Center at Bard is a unique institution, offering a marriage of non-partisan politics and the humanities. It serves as an intellectual incubator for engaged thinking and public discussion of the nation's most pressing political and ethical challenges.
13 Jan 2014

Amor Mundi 1/12/14


Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.

Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.

False Analogies: Stalin and Cromwell

Peter Singer writes of the suddenly divergent attitudes toward the two greatest mass murderers of the 20th century, Hitler and Stalin: “Hitler and Stalin were ruthless dictators who committed murder on a vast scale. But, while it is impossible to imagine a Hitler statue in Berlin, or anywhere else in Germany, statues of Stalin have been restored in towns across Georgia (his birthplace), and another is to be erected in Moscow as part of a commemoration of all Soviet leaders.” When Putin was asked recently about his plan to erect statues of Stalin, he justified it by comparing Stalin to Oliver Cromwell: “Asked about Moscow’s plans for a statue of Stalin, he pointed to Oliver Cromwell, the leader of the Parliamentarian side in the seventeenth-century English Civil War, and asked: “What’s the real difference between Cromwell and Stalin?” He then answered his own question: “None whatsoever,” and went on to describe Cromwell as a “cunning fellow” who “played a very ambiguous role in Britain’s history.” (A statue of Cromwell stands outside the House of Commons in London.)” For a lesson in false analogies, read more here.

After All the People We Killed

Some stories are so morally complicated and politically convoluted that they tug us this way and that as we read about them. That is how I felt reading Bethany Horne’s account of the genocidal, environmental, political, criminal, and corporate tragedy that is unfolding in Ecuador. Horne’s title, “After All the People We Killed, We Felt Dizzy” is a quotation from a member of the Huaorani tribe describing their massacre of an entire family group from the Taromenane people. A 6-year-old girl who survived the massacre has since been kidnapped twice and has now been elevated into a symbol in a political war between environmentalists and human rights activists on one side and the Ecuadoran government on the other. “Conta [the kidnapped girl] can't know that the jungle she was snatched from by those armed men in helicopters is a rallying cry for 15 million people in Ecuador. She can't know that the land rights and human rights of her people are the cause of a massive movement to force the president of Ecuador to do something he does not want to do. And last of all, Conta can't possibly comprehend the full impact of what Correa wants so badly from the Taromenane: the crude oil underneath their homes, a commodity that powers a world she does not understand that threatens to swallow her.”

Talking to Each Other

Illustration by John Cuneo

In a short profile of author and MIT professor Sherry Turkle, Megan Garber elucidates the distinction Turkle draws between the way we talk at each other, with our machines, and the way we talk to each other, in person-to-person conversations: “Conversations, as they tend to play out in person, are messy—full of pauses and interruptions and topic changes and assorted awkwardness. But the messiness is what allows for true exchange. It gives participants the time—and, just as important, the permission—to think and react and glean insights. ‘You can’t always tell, in a conversation, when the interesting bit is going to come,’ Turkle says. ‘It’s like dancing: slow, slow, quick-quick, slow. You know? It seems boring, but all of a sudden there’s something, and whoa.’”

Incomplete Tellings are all that Remain

Mark Slouka remembers his father, who recently passed away, and elaborates on one of the particular things he lost: "With him gone, there’s no one to reminisce with, no one to corroborate my memories (or correct them), no one to identify the little girl smiling up from the curling photograph at the bottom of the shoebox. In 1942, in Brno, my father’s family hid a man in the rabbit hutch for a week, until he could be moved. That’s all I know of the story, and now it’s all I’ll ever know. With no one to check me, error will spread like weeds. Which is how the past is transmuted into fiction, and then the fool’s gold of history."

Banking and the English Language

Thomas Streithorst, before attempting to untangle the language of finance, explains why he thinks the task is necessary: "Sometimes I think bankers earn all that money because they make what they do seem both tedious and unintelligible. Banking may be the only business where boredom is something to strive for, so its jargon both obfuscates and sends you to sleep. But six years of pain forces us to realize that economics is too important to be left to the bankers. If the rest of us keep bailing them out, we might as well know what they do. Fortunately, finance isn’t as complicated as its practitioners pretend. It does, however, have its own language, and if you don’t understand it, it sounds like gobbledygook."

From the Hannah Arendt Center Blog

This week on the Blog, Steven Tatum considers what it means to teach Arendtian thinking. In the Weekend Read, Roger Berkowitz reflects on President Vladimir Putin's recent attempt to justify statues memorializing Josef Stalin by comparing him to Oliver Cromwell.

 

The Hannah Arendt Center
The Hannah Arendt Center at Bard is a unique institution, offering a marriage of non-partisan politics and the humanities. It serves as an intellectual incubator for engaged thinking and public discussion of the nation's most pressing political and ethical challenges.
18 Nov 2013

One Against All


This Quote of the Week was originally published on September 3, 2012.

It can be dangerous to tell the truth: “There will always be One against All, one person against all others. [This is so] not because One is terribly wise and All are terribly foolish, but because the process of thinking and researching, which finally yields truth, can only be accomplished by an individual person. In its singularity or duality, one human being seeks and finds – not the truth (Lessing) – but some truth.”

-Hannah Arendt, Denktagebuch, Book XXIV, No. 21

Hannah Arendt wrote these lines when she was confronted with the severe and often unfair, even slanderous, public criticism launched against her and her book Eichmann in Jerusalem after its publication in 1963. The quote points to her understanding of the thinking I (as opposed to the acting We) on which she bases her moral and, partly, her political philosophy.


It is the thinking I, defined with Kant as selbstdenkend (self-thinking [“singularity”]) and an-der-Stelle-jedes-andern-denkend (i.e., in Arendt’s terms, thinking representatively or practicing the two-in-one [“duality”]). Her words also hint at an essay she published in 1967 titled “Truth and Politics,” wherein she takes up the idea that it is dangerous to tell the truth, factual truth in particular, and considers the teller of factual truth to be powerless. Logically, the All are the powerful, because they may determine what at a specific place and time is considered to be factual truth; their lies, in the guise of truth, constitute reality. Thus, it is extremely hard to fight them.

In answer to questions posed in 1963 by the journalist Samuel Grafton regarding her report on Eichmann and published only recently, Arendt states: “Once I wrote, I was bound to tell the truth as I see it.” The statement reveals that she was quite well aware of the fact that her story, i.e., the result of her own thinking and researching, was only one among others. She also realized the lack of understanding and, in many cases, of thinking and researching, on the part of her critics.


Thus, she lost any hope of being able to publicly debate her position in a “real controversy,” as she wrote to Rabbi Hertzberg (April 8, 1966). By the same token, she determined that she would not entertain her critics, as Socrates did the Athenians: “Don’t be offended at my telling you the truth.” Reminded of this quote from Plato’s Apology (31e) in a supportive letter from her friend Helen Wolff, she acknowledged the reference, but acted differently. After having made up her mind, she wrote to Mary McCarthy: “I am convinced that I should not answer individual critics. I probably shall finally make, not an answer, but a kind of evaluation of this whole strange business.” In other words, she did not defend herself in following the motto “One against All,” which she had perceived and noted in her Denktagebuch. Rather, as announced to McCarthy, she provided an “evaluation” in the 1964 preface to the German edition of Eichmann in Jerusalem and later when revising that preface for the postscript of the second English edition.

Arendt also refused to act in accordance with the old saying: Fiat iustitia, et pereat mundus (let there be justice, though the world perish). She writes – in the note of the Denktagebuch from which today’s quote is taken – that such acting would reveal the courage of the teller of truth “or, perhaps, his stubbornness, but neither the truth of what he had to say nor even his own truthfulness.” Thus, she rejected an attitude known in German cultural tradition under the name of Michael Kohlhaas. A horse trader living in the 16th century, Kohlhaas became known for endlessly and in vain fighting injustice done to him (two of his horses were stolen on the order of a nobleman) and finally taking the law into his own hands by setting fire to houses in Wittenberg.


Even so, Arendt has been praised as a woman of “intellectual courage” with regard to her book on Eichmann (see Richard Bernstein’s contribution to Thinking in Dark Times).

Intellectual courage based on thinking and researching was rare in Arendt’s time and has become even rarer since then. But should Arendt therefore matter only nostalgically? Certainly not. Her emphasis on the benefits of thinking as a solitary business remains current. Consider, for example, the following reference to Sherry Turkle, a sociologist at MIT and author of the recent book Alone Together. In an interview with Peter Haffner (published on July 27, 2012, in SZ Magazin), she argues that individuals who become absorbed in digital communication lose crucial components of their faculty of thinking. Turkle says (my translation): Students who spend all their time and energy on communication via SMS, Facebook, etc. “can hardly concentrate on a particular subject. They have difficulty thinking a complex idea through to its end.” No doubt, this sounds familiar to all of us who know about Hannah Arendt’s effort to promote thinking (and judging) in order to make our world more human.

To return to today’s quote: It can be dangerous to tell the truth, but thinking is dangerous too. Once in a while, not only the teller of truth but the thinking 'I' as well may find himself or herself in the position of One against All.

-Ursula Ludz

The Hannah Arendt Center
The Hannah Arendt Center at Bard is a unique institution, offering a marriage of non-partisan politics and the humanities. It serves as an intellectual incubator for engaged thinking and public discussion of the nation's most pressing political and ethical challenges.
7 Jun 2013

In the Age of Big Data, Should We Live in Awe of Machines?


In 1949, The New York Times asked Norbert Wiener, author of Cybernetics, to write an essay for the paper that expressed his ideas in simple form. For editorial and other reasons, Wiener’s essay never appeared and was lost. Recently, a draft of the never-published essay was found in the MIT archives. Written now 64 years ago, the essay remains deeply topical. The Times recently printed excerpts. Here is the first paragraph:

By this time the public is well aware that a new age of machines is upon us based on the computing machine, and not on the power machine. The tendency of these new machines is to replace human judgment on all levels but a fairly high one, rather than to replace human energy and power by machine energy and power. It is already clear that this new replacement will have a profound influence upon our lives, but it is not clear to the man of the street what this influence will be.

Wiener draws a core distinction between machines and computing machines, a distinction that is founded upon the ability of machines to mimic and replace not only human labor, but also human judgment. In the 1950s, when Wiener wrote, most Americans worried about automation replacing factory workers. What Wiener saw was a different danger: that intelligent machines could be created that would “replace human judgment on all levels but a fairly high one.”  

Today, of course, Wiener’s prophecy is finally coming true. The IBM supercomputer Watson is being trained to make diagnoses with such accuracy, speed, and efficiency that it will largely replace the need for doctors to be trained in diagnostics.


Google is developing a self-driving car that will obviate the need for humans to judge how fast and near to others they will drive, just as GPS systems already render moot the human sense of direction. MOOCs are automating the process of education and grading so that fewer human decisions need to be made at every level. Facebook is automating the acquisition of friends, lawyers are employing computers to read and analyze documents, and on Wall Street computer trading is automating the buying and selling of stocks. Surveillance drones, of course, are being given increasing autonomy to sift through data and decide which persons to follow or investigate. Finally, in the scandal of the day, the National Security Agency is using computer algorithms to mine data about our phone calls looking for abnormalities and suspicious patterns that would suggest potential dangers. In all these cases, the turn to machines to supplement or even replace human judgment has a simple reason: Even if machines cannot think, they can be programmed to do traditionally human tasks in ways that are faster, more reliable, and less expensive than can be done by human beings. In ways big and small, human judgment is being replaced by computers and machines.

It is important to recognize that Wiener is not arguing that we will create artificial human beings. The claim is not that humans are simply fancy machines or that machines can become human. Rather, the point is that machines can be made to mimic human judgment with such precision and subtlety so that their judgments, while not human, are considered either equal to human judgment or even better. The result, Wiener writes, is that “Machines much more closely analogous to the human organism are well understood, and are now on the verge of being built. They will control entire industrial processes and will even make possible the factory substantially without employees.”

Wiener saw this new machine age as dangerous on at least two grounds. First, economically, the rise of machines carries the potential to upend basic structures of civilization. He writes:

These new machines have a great capacity for upsetting the present basis of industry, and of reducing the economic value of the routine factory employee to a point at which he is not worth hiring at any price. If we combine our machine-potentials of a factory with the valuation of human beings on which our present factory system is based, we are in for an industrial revolution of unmitigated cruelty.

The dangers Wiener sees from our increased reliance on computing machines are not limited to economic dislocation. The real threat that computing machines pose is that as we cede more and more power to machines in our daily lives, we will, he writes, gradually forfeit our freedom and independence:

[I]f we move in the direction of making machines which learn and whose behavior is modified by experience, we must face the fact that every degree of independence we give the machine is a degree of possible defiance of our wishes. The genie in the bottle will not willingly go back in the bottle, nor have we any reason to expect them to be well disposed to us.

In short, it is only a humanity which is capable of awe, which will also be capable of controlling the new potentials which we are opening for ourselves. We can be humble and live a good life with the aid of the machines, or we can be arrogant and die.

For Wiener, our eventual servitude to machines is both an acceptable result and a fait accompli, one we must learn to accept. If we insist on arrogantly maintaining our independence and freedom, we will die. I gather the point is not that machines will rise up and kill their creators, but rather that we ourselves will program our machines to eliminate, imprison, immobilize, or re-program those humans who refuse to comply with paternalistic and well-meaning directives of the machines systems we create in order to provide ourselves with security and plenty.

Wiener counsels that instead of self-important resistance, we must learn to be in awe of our machines. Our machines will improve our lives. They will ensure better medical care, safer streets, more efficient production, better education, more reliable childcare, and more humane warfare. Machines offer the promise of a cybernetic civilization in which an entire human and natural world is regulated and driven towards a common good with super-human intelligence and calculative power. In the face of such utopian possibility, we must accept our new status as the lucky beneficiaries of the regulatory systems we have created and humble ourselves as beings meant to live well rather than to live free.


Recent revelations about the U.S. government’s use of powerful computers to mine and analyze enormous amounts of data collected via subpoenas from U.S. telecom companies are simply one example of the kind of tradeoff Wiener suggests we will and we should make. If I understand the conclusions of Glenn Greenwald’s typically excellent investigative reporting, the NSA uses computer algorithms to scan the totality of phone calls and internet traffic in and out of the United States. The NSA needs all of this data—all of our private data—in order to understand the normal patterns of telephony and web traffic and thus to notice, as well, those exceptional patterns of calling, chatting, and surfing. The civil libertarian challenges of such a program are clear: the construction of a database of normal behavior allows the government to attend to those whose activities are outside the norm. Those outliers can be terrorists or pedophiles; they may be Branch Davidians or members of Occupy Wall Street; they may be Heideggerians or Arendtians. Whoever they are, once those who exist and act in patterns outside the norm are identified, it is up to the government whether to act on that information and what to do with it. We are put in the position of having to trust our government to use that information wisely, with pitifully little oversight. Yet the temptation will always be there for the government to make use of private information once it has it.
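The mechanism described here—learn what normal looks like, then flag departures from it—is the core of any pattern-of-life anomaly detection. The Python sketch below is purely illustrative of that generic technique, not a description of the NSA's actual systems: it builds a per-number baseline of calling frequency from invented metadata and flags numbers whose activity today deviates sharply from their own history. Every number, value, and threshold is made up.

```python
import statistics

# Invented call metadata: calls per day for each phone number over a baseline period.
baseline = {
    "555-0101": [3, 4, 2, 5, 3, 4],
    "555-0102": [20, 22, 19, 21, 20, 23],
}

# Calls made by each number today.
today = {"555-0101": 4, "555-0102": 75}

def flag_outliers(baseline, today, threshold=3.0):
    """Flag numbers whose activity today is far outside their own normal pattern."""
    flagged = []
    for number, history in baseline.items():
        mean = statistics.mean(history)
        spread = statistics.stdev(history) or 1.0  # guard against zero variance
        z = (today[number] - mean) / spread
        if abs(z) > threshold:
            flagged.append(number)
    return flagged

print(flag_outliers(baseline, today))  # ['555-0102']
```

The political point of the paragraph above survives the simplicity of the code: the interesting questions are not in the arithmetic but in who is flagged, who decides what counts as "normal," and what is done with the result.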

In the face of the rise of machines and the present NSA action, we have, Wiener writes, a choice. We can arrogantly thump our chests and insist that our privacy be protected from snooping machines and governmental bureaucracies, or we can sit back and stare in awe of the power of these machines to keep us safe from terrorists and criminals at such a slight cost to our happiness and quality of life. We already allow the healthcare bureaucracy to know the most intimate details of our lives, the banking system to penetrate into the most minute details of our finances, and the advertising system to know the most embarrassing details of our surfing and purchasing histories; why, Wiener pushes us to ask, should we shy away from allowing the security apparatus to make use of our communications?

If there is a convincing answer to this hypothetical question and if we are to decide to resist the humbling loss of human freedom and human dignity that Wiener welcomes, we need to articulate the dangers Wiener recognizes and then rationalizes in a much more provocative and profound way. Towards that end, there are few books more worth reading than Hannah Arendt’s The Human Condition. Wiener is not mentioned in Hannah Arendt’s 1958 book; and yet, her concern and her theme, if not her response, are very much in line with the threat that cybernetic scientific and computational thinking pose for the future of human beings.

In her prologue to The Human Condition, Arendt writes that two threatening events define the modern age. The first was the launch of Sputnik. The threat of Sputnik had nothing to do with the cold war or the Russian lead in the race for space. Rather, Sputnik signifies for Arendt the fact that we humans are finally capable of realizing the age-old dream of altering the basic conditions of human life, above all that we are earth-bound creatures subject to fate. What Sputnik meant is that we were then in the 1950s, for the first time, in a position to humanly control and transform our human condition and that we are doing so, thoughtlessly, without politically and thoughtfully considering what that would mean. I have written much about this elsewhere and given a TEDx talk about it here.

The second “equally decisive” and “no less threatening event” is “the advent of automation.”  In the 1950s, automation of factories threatened to “liberate mankind from its oldest and most natural burden, the burden of laboring and the bondage to necessity.” Laboring, Arendt writes, has for thousands of years been one essential part of what it means to be a human being. Along with work and action, labor comprises those activities engaged in by all humans. To be human has meant to labor and support oneself; to be human has for thousands of years meant that we produce things—houses, tables, stories, and artworks—that provide a common humanly built world in which we live together; and to be human has meant to have the ability to act and speak in such a way as to surprise others so that your action will be seen and talked about and reacted to with a force that will alter the course and direction of the human world. Together these activities comprise the dignity of man, our freedom to build, influence, and change our given world—within limits.

But all three of these activities of what Arendt calls the vita activa are now threatened, if not with extinction, then at least with increasing rarity and public irrelevance. As automation replaces human laborers, the human condition of laboring for our necessary preservation is diminished, and we come to rely more and more on the altruism of a state enriched by the productivity of machine labor. Laboring, part of what it has meant to be human for thousands of years, threatens to become ever less necessary and to make an ever smaller claim on our existence. As the things we make, the houses we live in, and the art we produce become ever more consumable, fleeting, and temporary, the common world in which we live comes to seem ever more fluid; we move houses and abandon friends with greater ease than previous ages would dispose of a pair of pants. Our collective focus turns toward our present material needs rather than towards the building of common spiritual and ethical worlds. Finally, as human action is seen as the statistically predictable and understandable outcome of human behavior rather than the surprising and free action of human beings, our human dignity is sacrificed to our rational control and steering of life to secure safety and plenty. The threat to labor, work, and action that Arendt engages emerges from the rise of science—what she calls earth and world alienation—and the insistence that all things, including human beings, are comprehensible and predictable by scientific laws.

Arendt’s response to these collective threats to the human condition is that we must “think what we are doing.” She writes at the end of her prologue:

What I propose in the following is a reconsideration of the human condition from the vantage point of our newest experiences and our most recent fears. This, obviously, is a matter of thought, and thoughtlessness—the heedless recklessness or hopeless confusion or complacent repetition of “truths” which have become trivial and empty—seems to me among the outstanding characteristics of our time. What I propose, therefore, is very simple: it is nothing more than to think what we are doing.

Years before Arendt traveled to Jerusalem and witnessed what she saw as the thoughtlessness of Adolf Eichmann, she saw the impending thoughtlessness of our age as the great danger of our time. Only by thinking what we are doing—and in thinking also resisting the behaviorism and materialism of our calculating time—can we humans hope to resist the impulse to be in awe of our machines and, instead, retain our reverence for human being that is the foundation of our humanity. Thinking—that dark, irrational, and deeply human activity—is the one meaningful response Arendt finds to both the thoughtlessness of scientific behaviorism and the thoughtlessness of the bureaucratic administration of mass murder.


There will be great examples of chest thumping about the loss of privacy and the violation of constitutional liberties over the next few days. This is as it should be. There will also be sober warnings about the need to secure ourselves from terrorists and enemies. This is also necessary. What is needed beyond both these predictable postures, however, is serious thinking about the tradeoffs that our need for reliable and affordable security entails, along with honest discussion of what we today mean by human freedom. To begin such a discussion, it is well worth revisiting Norbert Wiener’s essay. It is your weekend read.

If you are interested in pursuing Arendt’s own response to the crisis of humanism, you can find a series of essays and public lectures on that theme here.

-RB

Roger Berkowitz
Roger Berkowitz is Associate Professor of Political Studies and Human Rights at Bard College, and Academic Director of the Hannah Arendt Center for Politics and the Humanities. He is also the author of "Gift of Science: Leibniz and the Modern Legal Tradition", as well as co-editor of "Thinking in Dark Times: Hannah Arendt on Ethics and Politics".
4 Jun 2013

On MOOCs; and Some Possible Futures for Higher Ed


Barely more than a year old, MITx and edX now dominate discussion about the future of higher education like nothing else I have seen in my time in Cambridge, MA. I have been teaching at MIT for more than 10 years now, and can’t remember any subject touching directly on university life that came even remotely close to absorbing the attention of higher ed professionals in the region the way that edX has. With initial investments of $30 million each by the founding institutions, Harvard and MIT, and with each month, it seems, bringing the announcement of new partnerships with the world’s colleges & universities (27 institutions currently belong to the “X” consortium), the levels of hype and institutional buy-in have been nothing short of extraordinary.

Because of their ubiquity in the popular press, higher ed industry periodicals, and the blogosphere, Massive Open Online Courses or MOOCs have become that most dangerous topic of discussion: a subject about which everybody needs to have an opinion. Such topics can unfortunately generate more heat than light, as the requirement to have and to express a point of view often means that the strongest and most extravagant opinions will claim attention and command the terms of debate. This is unfortunate if you favor the nuanced opinion or (as I do) feel genuinely ambivalent about MOOCs and the role(s) that they might play in shaping the future of higher education.


So far much of the discourse about MOOCs has tended to settle around two competing claims -- one for, one against -- that I articulated in a tweet a few months ago. Either MOOC providers are described as delivering free or low-cost quality higher education to those hard-pressed to afford it (and so performing a valuable public service); or MOOCs are understood to be selling a "lite" version of higher education to the poor while consolidating power and prestige with a few wealthy elite schools.  In this dystopian view, the democratizing claims made by Udacity, Coursera and edX (the last formed of these outfits, and the only non-profit among them) are revealed instead to be essentially colonialist ones -- the colonialists, ed-tech profiteers hell-bent on thoroughly remaking the university as a crypto-corporate enterprise.  MOOCs are understood to be an engine in this transformation, and an integral part of an overall design for reshaping higher education as a neoliberal market pursuit.

I can’t doubt that there is truth in both of these sets of claims. It is difficult at the same time to ignore that arguments for and against MOOCs look past each other in crucial respects, and leave precious little ground between them. What the accounts do share is an assumption that MOOCs will transform or “revolutionize” the landscape of higher education (for good or ill). Either MOOCs will be agents for elevating some in the less advantaged and underserved corners of the world; or MOOCs are instruments for extracting bodies from classrooms and tenure-track lines from university departments. The somewhat high-flown claims to educate and elevate underserved populations of the globe, often based on stray anecdote, are offered independently of any more substantive claim about the specific learning communities who benefit (or stand to benefit) from MOOCs. Similarly, claims about the profit motives animating the companies offering MOOCs subordinate all discussion of MOOCs to the ideological positions that they supposedly exist to promote. The designs attributed to MOOCs, and to the instructors who offer MOOCs, are such as to foreclose discussion rather than promote it.

While both accounts of MOOCs envision significant future consequences from their implementation, moreover, neither says very much about actually-existing MOOCs. The MOOC has become a repository for utopian and dystopian narratives about the present and future directions of higher ed. As a result, this or that fact about MOOCs is often considered (or not) insofar as it confirms the prevailing theory about them. 150,000 people signing up for a class demonstrates a clear hunger on the part of many across the globe for access to a quality education; this fact authorizes enlarged claims for the ability to transform higher education by bringing MOOCs to the masses. Similarly, the replicability of the digital medium -- and the fact that course content such as video lectures, once made, does not necessarily need to be re-made each year -- is conceived as a key to how MOOCs will force everyone in higher ed to make do (not do more) with less: less student-faculty interaction, fewer tenure-track professors, down the road the prospect of fewer instructors (the majority of them adjuncts already) paid to teach in college classrooms.

In addition to fears that MOOCs will reinforce ongoing trends of budget cuts, adjunctification and layoffs of college teaching staff, another legitimate concern is that MOOCs will—by helping some schools with their branding strategies—have the effect of consolidating elite privilege with a few schools and the “superprofessors” (themselves overwhelmingly white and male) who teach MOOCs, leaving other lesser-ranked schools struggling to compete against a lower-priced virtual curriculum. The fear is that MOOCs will facilitate the emergence of two tiers in higher ed offerings: the “real” version, available only to the students whose families can afford the exorbitant tuition or who survive by taking out massive student loan debt; and the second-rate online version. With proposals on the table such as California’s Senate Bill 520, which would grant college credit for certain approved online courses, and Coursera’s recent announcement that they will sell their MOOCs to 10 public universities in the US, these fears are unfortunately very real. I hope to see more MOOCs spring up to contest that sense of inevitable recentering of authority from within the elite universities that host them. However difficult the task may prove to be, we need to disentangle the genuinely democratizing outreach work done by online education from its re-inscription of elite privilege.


These are important and pressing concerns. By the same token, they hardly exhaust all that can be said about MOOCs today. A host of important questions about the creation and implementation of MOOCs -- about course content, mode of learning, assessment, and so on -- should not be lost amidst conversations about the larger tendency (whether benevolent and democratizing, or insidious and corporatizing) to which MOOCs properly belong. The movement of classroom tasks and functions to online learning presents opportunities as well as risks; we should understand both. In an essay written late last year I tried to look without blinders at MOOCs, and to reflect both on the risks associated with their format and implementation as well as on their potential as instruments of learning and encounter. I wrote at the time that it wasn’t my intention "to defend the MOOC so much as...to hold open some alternative futures for it." For these alternative futures to emerge there needs to be vision, will, and coordinated effort on the part of many in higher ed. I am still willing at least to entertain the possibility that MOOCs may turn out to be an enabling, positive invention, while I acknowledge indicators that point in the direction of their being a lamentably misguided one. But the rush to condemn and dismiss online courses may be as fundamentally mistaken as the rush to anoint them the future of higher education.

Blended learning modes present opportunities for both pedagogical experimentation and outreach; neither opportunity should I think be dismissed lightly. I have heard many instructors of MOOCs (in both STEM and humanities subjects) remark that the experience of teaching online has transformed their thinking and approach to teaching familiar material in the traditional classroom -- whether in pace and timing, course content, evaluation and assessment, etc. My interest in MOOCs extends to how the format can be imagined to provide access to a university curriculum to populations that may not have had this kind of access, as this is the population that stands to gain most from them. But in addition to the flat, global learning community ritually invoked as the audience for MOOCs, we could benefit from thinking locally too. How can the online course format make possible new relationships not only with the most far-flung remote corners of the earth but with the neighborhoods and communities nearest to campus? Can we make MOOCs that foster meaningful links with the community or create learning communities that cut across both the university and the online platform?

Among other alternative futures for MOOCs, I imagine more opportunities to collaborate with colleagues at other institutions. The single-delivery, “sage on stage” MOOC is no more the only online model available than is the large lecture class at a brick-and-mortar school. While MOOCs are still for the most part free and non credit-bearing, we should try out (and generate metrics to assess) as many different teaching arrangements as possible. I hasten to add that this exploration should include the intellectual freedom along with the technological affordances to create a MOOC of any kind, at any time, with anybody. With instructors and modules selected in advance, some infrastructural support in each site, and a set of shared principles for continuity of curriculum and presentation, anybody could create a MOOC. Universities like Penn have already begun asking faculty to sign non-compete agreements, presumably to curb these kinds of collaborations. For as long as such arrangements are permissible, however, I would urge researchers to collaborate on MOOCs themselves. This may be a tall order, but not, I think, an impossible one.


From various quarters we have heard recent calls for a slow-down of the MOOC bandwagon. An open letter from Harvard faculty to the Dean of Faculty of Arts & Sciences calls for more oversight and reflective engagement with the question of how MOOCs offered through edX will affect “the higher education system as a whole.” I support these calls as consistent with the seriousness of the proposals to transform higher ed that are currently before us. From my modest position within the ranks of MIT administration I have been glad to see great care on the part of faculty to ensure that a spirit of experimentation and exploration with regard to MOOCs remains compatible with the core principles of the university and with a residential education. Cathy Davidson at Duke will in January 2014 teach a MOOC with Coursera simultaneously combined with a brick-and-mortar course on “The History and Future of Higher Ed,” with participation from classes at other schools and universities as well. These and other movements are to me reassuring signs, indicators of collaborative engagement around a topic of great importance. They indicate a willingness too to eschew rehearsing polarized opinions for or against MOOCs in order to attend at once to their innovative construction and to their effective and responsible implementation. The challenge is to remind ourselves periodically to think small (locally, incrementally) at the same time that we heed calls to think big.

-Noel Jackson

The Hannah Arendt Center
The Hannah Arendt Center at Bard is a unique institution, offering a marriage of non-partisan politics and the humanities. It serves as an intellectual incubator for engaged thinking and public discussion of the nation's most pressing political and ethical challenges.
5 Apr 2013

Making the Grade


I was at dinner with a colleague this week—midterm week. Predictably, talk turned to the scourge of all professors: grading essays. There are few tasks in the life of a college professor less fulfilling than grading student essays. Every once in a while a really good essay jolts me to consciousness. I am elated by such encounters. To be honest, however, reading essays is for the most part stultifying. This is not the fault of the students, many of whom are brilliant and exuberant writers. I find it trying to wade through 25 essays discussing the same book, offering varying opinions and theories, while keeping my attention and interest. How many different ways can one ask for a thesis, talk about the importance of transition sentences, and correct grammar? For some time it is fun, in a way. One learns new things and is captivated by comparing how bright young minds see things. But after years, grading essays becomes simply the worst part of a great job.

Image: Flickr, MacVicar

So how might my colleagues and I react to news that EdX—the influential Harvard-MIT led consortium offering online courses—has developed software that will grade college student essays? I imagine it is sort of like how people felt when the dishwasher was invented. You mean we can cook and feast and don’t have to scrub pots and wash dishes? It promises to allow us to focus on teaching well without having to do that part of our job that we truly dread.

The appeal of computer grading is obvious and broad. Not only will many professors and teachers be freed from unwanted tedium, but also it may help our students. One advantage of computer grading is that it is nearly instantaneous. Students can hand in their work and get a grade and feedback seconds later. Too often essays are handed back days or even weeks after they are submitted. By then the students have lost interest in their paper and forgotten the inspiration that breathed life into their writing. To receive immediate feedback will allow students to see what they did wrong and how they could improve while the generative impulse underlying the paper is still fresh. Computer grading might encourage students to turn in numerous drafts of a paper; it may very well help teach students to write better, something that professorial comments delivered after a week rarely accomplish.

Another putative advantage of computer grading is its objectivity and consistency. Every professor knows that it matters when we read essays and in what order. Some essays find us awake and attentive. Others meet my eyes as they struggle to remain open. As much as I try to ignore the names on the top of the page, I can’t deny that my reading and grading is personalized to the students. I teach at a small liberal arts college where I know the students. If I read a particularly difficult sentence by a student I have come to trust, I often make a second effort. My personal attention has advantages but it is of course discriminatory. The computer will not do that, which may be seen by some as more fair. What is more, the computer doesn’t get tired or need caffeine.

Perhaps the most important advantage for administrators considering these programs is the cost savings. If computers relieve professors from the burden of grading, that means professors can teach more. It may also mean that fewer TA’s are necessary in large lecture courses, thus saving money for strapped universities. There may even be a further side benefit to these programs. If universities need fewer TA’s to grade papers, they may admit fewer graduate students to their programs, thus going some way towards alleviating the extraordinary and irresponsible over-production of young professors that is swelling the ranks of unemployable Ph.D.s.


There are, of course, real worries about computer grading of essays. My concern is not that the computers will make mistakes (so do I); or that we lack studies that show that computers can grade as well as human professors—for I doubt professors are on the whole excellent graders. The real issue is elsewhere.

According to the group “Professionals Against Machine Scoring of Student Essays in High-Stakes Assessment,” the problem with computer grading of essays is simple: Machines cannot read.  Here is what the group says in a statement:

Let’s face the realities of automatic essay scoring. Computers cannot ‘read.’ They cannot measure the essentials of effective written communication: accuracy, reasoning, adequacy of evidence, good sense, ethical stance, convincing argument, meaningful organization, clarity, and veracity, among others.

What needs to be taken seriously is not that computers can’t grade as well as humans. In many ways they grade better. More consistently. More honestly. With less grade inflation. And more quickly. But computer grading will be different than human grading. It will be less nuanced and aspire to clearly defined criteria. Are sentences grammatical? Is there a clear statement of the thesis? Are there examples given? Is there a transition between sentences? All of these are important parts of good writing and the computer can be trained to look for these characteristics in an essay. What this means, however, is that computers will demand the kind of clear, precise, and logical writing that computers can understand and that many professors and administrators demand from students. What this also means, however, is that writing will become more mechanical.
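The criteria listed above are, at bottom, things a program can count. As a deliberately crude illustration of the idea (this is not the EdX software, whose internals are not described here; the word lists and thresholds are invented), a machine scorer might start from surface features like these:

```python
import re

# Invented word lists: crude proxies for "transitions" and "thesis statements".
TRANSITIONS = {"however", "moreover", "therefore", "furthermore", "consequently"}
THESIS_CUES = {"argue", "argues", "claim", "claims", "thesis", "contend"}

def surface_features(essay: str) -> dict:
    """Count crude surface features a machine scorer might use as proxies for quality."""
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    words = essay.lower().split()
    return {
        "sentence_count": len(sentences),
        "avg_sentence_length": len(words) / max(len(sentences), 1),
        "has_thesis_cue": any(w.strip(".,;:") in THESIS_CUES for w in words),
        "transition_count": sum(w.strip(".,;:") in TRANSITIONS for w in words),
    }

print(surface_features("I argue that grading is hard. However, machines may help."))
```

Anything a student learns to do in order to satisfy such checks—sprinkle in transition words, announce a thesis with a cue phrase—is precisely the mechanical adaptation the paragraph above worries about.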

There is much to be learned here from an analogy with the rise of computer chess. The great grandmaster Garry Kasparov—who famously lost to Deep Blue—has perceptively argued that machines have changed the way chess is played and redefined what a good chess move and a well-played chess game look like. As I have written before:

The heavy use of computer analysis has pushed the game itself in new directions. The machine doesn’t care about style or patterns or hundreds of years of established theory. It counts up the values of the chess pieces, analyzes a few billion moves, and counts them up again. (A computer translates each piece and each positional factor into a value in order to reduce the game to numbers it can crunch.) It is entirely free of prejudice and doctrine and this has contributed to the development of players who are almost as free of dogma as the machines with which they train. Increasingly, a move isn’t good or bad because it looks that way or because it hasn’t been done that way before. It’s simply good if it works and bad if it doesn’t. Although we still require a strong measure of intuition and logic to play well, humans today are starting to play more like computers. One way to put this is that as we rely on computers and begin to value what computers value and think like computers think, our world becomes more rational, more efficient, and more powerful, but also less beautiful, less unique, and less exotic.
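The quoted description of the machine's method—translate each piece and positional factor into a number, then count—can be shown in miniature. The sketch below is a toy illustration using the conventional textbook piece values; a real engine adds positional terms and searches billions of positions, which this deliberately does not.

```python
# Conventional textbook piece values (pawn=1, knight/bishop=3, rook=5, queen=9).
PIECE_VALUES = {"P": 1, "N": 3, "B": 3, "R": 5, "Q": 9}

def material_score(white_pieces, black_pieces):
    """Bare-bones evaluation: positive favors White, negative favors Black."""
    white = sum(PIECE_VALUES.get(p, 0) for p in white_pieces)
    black = sum(PIECE_VALUES.get(p, 0) for p in black_pieces)
    return white - black

# Example: White has a queen and two pawns; Black has a rook and a bishop.
print(material_score(["Q", "P", "P"], ["R", "B"]))  # 11 - 8 = 3, White is ahead
```

Reducing a position to a single number is exactly the move that makes the game "crunchable"—and exactly what the passage warns happens to writing when essays are scored the same way.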

Much the same might be expected from the increasing use of computers to grade (and eventually to write) essays. Students will learn to write in ways expected from computers, just as they today try to learn to write in ways desired by their professors. The difference is that different professors demand and respond to varying styles. Computers will consistently and logically drive writing towards a more mechanical and logical style. Writing, like Chess playing, will likely become more rational, more efficient, and more effective, but also less beautiful, less unique, and less eccentric. In other words, writing will become less human.

It turns out that many secondary school districts already use computers to grade essays. But according to John Markoff in The New York Times, the EdX software promises to bring the technology into college classrooms as well as online courses.


It is quite possible that in the near future, my colleagues and I will no longer have to complain about grading essays. But that is unlikely at Bard. More likely is that such software will be used in large university lecture courses. In such courses with hundreds of students, professors already shorten questions or replace essays with multiple-choice tests. Or they use armies of underpaid graduate students to grade these essays. It is quite likely that software will actually augment the educational value of writing assignments at college in these large lecture halls.

In seminars, however, and in classes at small liberal arts colleges like Bard where I teach, such software will not likely free my colleagues and me from reading essays. The essays I assign are not simple responses to questions for which there are clear criteria for grading. I look for elegance, brevity, insight, and the human spark (please no comments on my writing). Whether or not I am good at evaluating writing or at teaching writing, that is my aspiration. I seek to encourage writing that is thoughtful rather than writing that is simply accurate. When I have time to make meaningful comments on papers, they concern structure, elegance, and depth. Such commenting is not only a way to grade an essay, but also a way to connect with my students and to help them see what it means to write and think well.

And yet, I can easily imagine making use of such a computer-grading program. I rarely have time to grade essays as well or as quickly as I would like. I would love to have my students submit drafts of their essays to the EdX computer program.

If they could repeatedly submit their essays, receive such feedback, and use the computer to catch not only grammatical errors but also poor sentences, redundancies, repetitions, and whatever other mistakes the computer can be trained to recognize, they could respond and rework their essays many times before I see them. Used well, I hope, such grading programs might really augment my capacities as a professor and my students' experiences.

I have real fears that grading technology will rarely be used well. Rather, it will too often replace human grading altogether, and in large lectures, high schools, and standardized tests it will impose a new and inhuman standard on the way we write and thus the way we think. We should greet such new technologies both enthusiastically and skeptically. But first, we should try to understand them. Towards that end, it is well worth reading John Markoff’s excellent account of the new EdX computer grading software in The New York Times. It is your weekend read.

-RB

Roger Berkowitz
Roger Berkowitz is Associate Professor of Political Studies and Human Rights at Bard College, and Academic Director of the Hannah Arendt Center for Politics and the Humanities. He is also the author of "The Gift of Science: Leibniz and the Modern Legal Tradition", as well as co-editor of "Thinking in Dark Times: Hannah Arendt on Ethics and Politics".
4Jan/130

The “E” Word, Part Two

This Weekend Read is Part Two of “The ‘E’ Word,” a continuing series on “elitism” in the United States educational system. Read Part One here.

Peter Thiel has made headlines offering fellowships to college students who drop out to start a business. One of those Thiel fellows is Dale Stephens, founder of Uncollege. Uncollege advertises itself as radical. At the top of its website, Uncollege cites a line from the movie "Good Will Hunting":

You wasted $150,000 on an education you coulda got for a buck fifty in late charges at the public library.

The Uncollege website is filled with one-liners extolling life without college. It can be and often is sophomoric. And yet, there is something deeply important about what Uncollege is saying. And its message is resonating. Uncollege has been getting quite a bit of attention lately, part of a growing obsession with college dropouts and an increasing skepticism about the value of college.

At its best, Uncollege does not simply dismiss college as an overpriced institution seeking to preserve worthless knowledge. Rather, Uncollege claims that college has become too anti-intellectual. College, as Uncollege sees it, has become conventional, bureaucratic, and not really dedicated to learning. In short, Uncollege criticizes college for not being enough like college should be. Hardly radical, Uncollege trades rather in revolutionary rhetoric in the sense that Hannah Arendt means the word revolution: a return to basic values. In this case, Uncollege is of course right that colleges have lost their way.

Or rather, that is what I find interesting about Uncollege.

To actually read the Uncollege website and the recent Uncollege Manifesto by Dale Stephens is to encounter something different. The first proposition Uncollege highlights has little to do with education and everything to do with economics: the decreasing value of a college education.

The argument that college has ever less value will seem counterintuitive to those captivated by all the paeans to the value of college and the increased earning potential of college graduates. But Uncollege certainly has a point. Currently about 30% of the U.S. adult population has a degree, but among 20- to 24-year-olds nearly 40% have one, and the Obama administration aims to raise that number to 60% by 2020. Uncollege calls this Academic Inflation: as more and more people have a college degree, the value of that degree will decrease. It is already the case that many good jobs require a master's or a Ph.D. In short, the monetary value of the college degree is diminished and diminishing. This gives us a hint of where Uncollege is coming from.

The Uncollege response to the mainstreaming of college goes by a number of names. At times it is called unschooling. Unschooling is actually a movement begun by the legendary educator John Holt. I recall reading John Holt’s How Children Learn while I was in high school—a teacher gave it to me. I was captivated by Holt’s claim that school can destroy the innate curiosity of children. I actually wrote my college application essay on Holt’s educational philosophy and announced to my future college that my motto was Mark Twain’s quip, “I never let school interfere with my education”—which is also a quotation prominently featured in the Uncollege Manifesto.

Unschooling—as opposed to Uncollege—calls for students to make the most of their courses, coupling those courses with independent studies, reading groups, and internships. I regularly advise my students to take fewer not more courses. I tell them to pick one course each semester that most interests them and pursue it intently. Ask the professor for extra reading. Do extra writing. Organize discussion groups about the class with other students. Go to the professor’s office hours weekly and talk about the ideas of the course. Learners must become drivers of their education, not passive consumers. Students should take their pursuit of knowledge out of the classroom, into the dining halls, and into their dorms.

Uncollege adds that unschooling or “hacking your education” can be done outside of schools and universities. With Google, public libraries, and free courses from Stanford, MIT, and Harvard professors proliferating on the web, an enterprising student of any age can compose an educational path today that is more rigorous than anything offered “off-the-shelf” at a college or university. I have no problem with online courses. I hope to take a few. But it is a mistake to think that systems of massive information delivery are the same thing as education.

What Uncollege offers is something more and something less wholesome than simply a call for educational seriousness. It packages that call with the message that college has become boring, conventional, expensive, and unnecessary. In the Uncollege world, only suckers pay for college. The Uncollege Manifesto promotes “Standing out from the other 6.7 billion”; it derides traditional paths, pointing out that “5,000 janitors in the United States have Ph.Ds.”; and it cautions, “If you are content with life and education you should probably stop reading… You shall fit in just fine with society and no one will ever require you to be different. Conforming to societal standards is the easy and expected path. You are not alone!”

At the core of the Uncollege message is that dirty and yet all-too-powerful little word again: “elitism.” Later in the Uncollege Manifesto we are told that young people have a choice between “real accomplishments” and the “easy path to mediocrity”:

To succeed without a college degree you will have to build your competency and reputation through real world accomplishments. I am warning now: this is not going to be easy. If you want to take the easy path to mediocrity, I encourage you to go to college and join the masses. If you want to stand out from the crowd and change the world, Uncollege is for you!

At one point, the Uncollege Manifesto lauds NPR’s “This I Believe” series and commends these short 500-word essays on personal credos. But Uncollege adds a twist: instead of writing what one believes, it advises its devotees to write an essay answering the question: “What do you believe about the world that most others reject?” It is not enough simply to believe in something. You must believe in something that sets you apart and makes you different.

Uncollege is at least suggesting that it might be cool, as it has not been for 50 years, to aim for excellence and to yearn to be different. In short, Uncollege is calling on students at elite institutions to boldly grab the ring of elitism and actively seek to stand outside and above the norm. And it is saying that education is no longer elite, but conventional.

It is hard not to see this embrace of elitism as refreshing, although no doubt many will scream the “e” word. I have often lectured to students at elite institutions and confronted them with their fear of elitism. They, or someone, spends upwards of $200,000 on an education, not to mention four years of their lives, and then they reject the entire premise of elitism: that they are different or special. By refusing to see themselves as members of an elite, these students too often refuse to accept the responsibility of elites: to mold and preserve societal values and to assume leadership roles in society.

Leading takes courage. In Arendtian terms, it requires living a public life where one takes risks, acts in surprising ways, and subjects oneself to public judgment. Leading can be uncomfortable and dangerous, and it is often more comfortable and fun to pursue one’s private economic, familial, and personal dreams. Our elite colleges have become too much about preparing students for private success rather than launching young people into lives of public engagement. And part of that failure is a result of a retreat from elitism and a false humility that includes an easy embrace of equality.

That Uncollege is selling its message of excellence and elitism to students at elite institutions of higher learning is simply one sign of how mainstream and conformist many of these elite institutions have become. But what is it that Uncollege offers these elite students who drop out and join Uncollege?

According to its website, Uncollege is selling “hackademic camps” and a “gap year program” that are designed to teach young people how to create their own learning plans. The programs come with living-abroad programs and internships. Interestingly, these are all programs offered by most major universities and colleges. The difference is money and time. For $10,000 and in just one year, you get access to mentors, get pushed to write op-eds, and get the “opportunity to work at hot Silicon Valley startups, some of them paid positions.” In the gap year program, participants will also “build your personal brand. Speak at a conference, Write an op-ed for a major news outlet. Build a personal website.”

None of this sounds radical, intellectual, or all that elitist. On the contrary, it assumes that young people have little to learn from educators. Teachers are unimportant, to be replaced by mentors in the world. The claim is that young people lack nothing but information and access in order to compete.

What Uncollege preaches often has little to do with elitism or intellectual growth. It is a deeply practical product being sold as an alternative to the cost of college. In one year and for one-twentieth of what a four-year elite college education costs, a young person can get launched into the practical world of knowledge workers, hooked up with mentors, and set into the world of business, technology, and media. It is a vocational training program for wannabe elites, training people to leap into the creative and technology fields and compete with recent college graduates but without the four years of studying the classics, the debt, and the degree. The elitism that Uncollege is selling is an entrepreneurial elitism measurable by money. By appealing to young students’ sense of superiority, ambition, and risk-taking, Uncollege stands a real chance of attracting ambitious young people more interested in a good job and a hot career than in reading the classics or studying abstract math.

Let’s stipulate this is a good thing. Not everybody should be going to liberal arts colleges. People unmoved by Nietzsche, Einstein, or Titian who are then forced to sit through lectures, cram for exams, and pull all-nighters writing papers cribbed from the internet are wasting their time and money on an elite liberal arts education. What is more, they bring cynicism into an environment that should be fired by idealism and electrified by passion. For those who truly believe that it is important in the world to have people who are enraptured by Sebald and transformed by Arendt, it is deeply important that the liberal arts college remain a bastion apart, a place where youthful exuberance for the beautiful and the true can shine clearly.

We should remember, as well, that reading great books and studying Stravinsky is not an activity limited to the academy. We should welcome a movement like Uncollege that frees people from unwanted courses but nevertheless encourages them to pursue their education on their own. Yes, many of these self-educated strivers will acquire idiosyncratic readings of Heidegger or strange views about patriotism. But even when different, opinions are the essence of a human political system.

One question we desperately need to ask is whether having a self-chosen minority of people trained in the liberal arts is important in modern society. I teach in an avowedly liberal arts institution precisely because I fervently believe that such ideas matter and that having a class of intellectuals whose minds are fired by ideas is essential to any society, especially a democracy.

I sincerely hope that the liberal arts and the humanities persist. As I have written,

The humanities are that space in the university system where power does not have the last word, where truth and beauty as well as insight and eccentricity reign supreme and where young people come into contact with the great traditions, writing, and thinking that have made us whom we are today. The humanities introduce us to our ancestors and our forebears and acculturate students into their common heritage. It is in the humanities that we learn to judge the good from the bad and thus where we first encounter the basic moral facility for making judgments. It is because the humanities teach taste and judgment that they are absolutely essential to politics. It is even likely that the decline of politics today is profoundly connected to the corruption of the humanities.

Our problem today is that college is caught between incompatible demands: to spark imagination and idealism, and to prepare young people for employment and success. For a long while now colleges have been doing neither of these things well. Currently, the political pressure on colleges is to cut costs and become more efficient. The unspoken assumption is that colleges must more cheaply and more quickly prepare students for employment. Those of us who care about college as an intellectual endeavor should welcome new alternatives to college like internet courses, vocational education, and Uncollege that will pull away young people for whom college would have been the wrong choice. Maybe, under the pressure of Uncollege, colleges will return to their core mission of passionately educating young people and preparing them for lives of civic engagement.

I encourage you this weekend to read the Uncollege Manifesto. Let me know what you think.

-RB

Roger Berkowitz
Roger Berkowitz is Associate Professor of Political Studies and Human Rights at Bard College, and Academic Director of the Hannah Arendt Center for Politics and the Humanities. He is also the author of "The Gift of Science: Leibniz and the Modern Legal Tradition", as well as co-editor of "Thinking in Dark Times: Hannah Arendt on Ethics and Politics".
5Oct/1215

The Flipped Classroom

For those of us who care about education, at either the college or high school level, there is nothing more exciting and terrifying today than the promise of technology in teaching. At this moment, numerous companies around the country are working with high schools and colleges to create online courses, tutorials, and webinars that will be able to provide training and information to millions of people around the world. In fact, I took a webinar just today: a mandatory course, required by the New York Council for the Humanities, meant to train me to facilitate a Community Conversation on democracy that will be held next week at the Arendt Center.

Many of these web-based courses are offered free. They are taught by leading experts at the best universities in the world, and they are available to anyone, in any country, at any income level, who has a computer. The possibilities and potential benefits of such courses are extraordinary. And yet, as with any great new technology, these courses are also dangerous.

A recent article in the Chronicle of Higher Education describes Tony Hyun Kim, an MIT graduate who moved to Mongolia and spent three months tutoring and teaching local high school students as they took a circuits-and-electronics course usually taken by MIT sophomores. The class, offered free online by edX, a consortium of MIT and Harvard, uses video and interactive exercises and is available to anyone who signs up. What Mr. Kim did was use this advanced course taught by MIT professors as a basic resource for his high school students in Mongolia. He then helped the young students take the course. Twelve of his students passed the course and earned a certificate of completion. "One 15-year-old, Battushig, aced the course, one of 320 students worldwide to do so." According to the Chronicle:

The adventure made the young MIT graduate one of the first to blend edX's content with face-to-face teaching. His hybrid model is one that many American students may experience as edX presses one of its toughest goals: to reimagine campus learning. EdX ended up hiring Mr. Kim, who hopes to start a related project at the university level in Mongolia.

What is now being called the "flipped classroom"—authoritative professors lecture to thousands or hundreds of thousands of students in their dorm rooms, while young facilitators then meet with those students physically in classrooms—has enormous consequences for education around the world and also in the United States.

Currently, every university hires Ph.D.s as professors to teach courses, and high schools hire teachers. These professors and teachers teach their own courses, set their curricula, and are responsible for creating an educational environment. Often these courses are large lectures or poor classes in which students learn very little. Sometimes at research universities the professors have graduate students who spend time with the undergrads while the professors do their own work. Often these graduate students in turn care less about teaching than about their own research, leaving poor undergraduates to fend for themselves. In most instances, large lecture courses provide students with painfully little personal attention, the kind of one-on-one or small-group interaction in which real education happens. What is more, these courses are expensive, since the universities subsidize the research and training of the professors.

Now imagine that community colleges and even large universities embrace the flipped classroom. Why not have students take a course from edX or from Coursera, a similar service? The course is free. The college or university could then hire facilitators like Mr. Kim to work one-on-one with students. These facilitators can be cheap. They may even be free. As the Chronicle reports, Harvard professors E. Francis Cook Jr. and Marcello Pagano are working to mobilize a crowd of volunteers to help teach their courses.

The veteran professors will teach a class on epidemiology and biostatistics this fall, one of Harvard's first on edX. Details are still being worked out, but they hope to entice alumni to participate, possibly by moderating online forums or, for those based abroad, leading discussions for local students. Mr. Cook sees those graduates as an "untapped resource. We draw people into this program who want to improve the health of the world," he says. "I'm hoping we'll get a huge buy-in from our alums."

There will be many young people who will volunteer to facilitate such courses. In return they will learn something. They will meet smart young potential employees and recruit them to work in their business ventures. And they will do a service to their alma maters. This enlistment of free labor to help with online learning is already happening. And it will upend the teaching profession at all levels, just as star doctors at major hospitals will increasingly diagnose hundreds of patients a day from their offices while assistants around the world simply follow their instructions.

Will the new educational regime offer a better education for the students? In some cases yes. There are unmistakable advantages both in cost and maybe even in quality that such flipped courses offer. But there is also a profound loss of what might be called educational space and, more importantly, educational authority.  

If such facilitators are recent college graduates, like Mr. Kim, or if they are Ph.D.s but hired not as professors and thus without the authority of present professors, there is a loss of the very sense of what a university or college is—a space for the transmission of knowledge from scholars and scientists to young citizens. What does it mean to lose the community of professors who currently populate these educational institutions?

And what about when this hollowing out of the professoriate infects elite universities like MIT and Harvard themselves? The Chronicle asks:

One question is how edX might improve elite universities, which are late to the e-learning game. In the spring, MIT tested the edX circuits class with about 20 on-campus students. It was a hit: A majority said they would take another Web class....Another benefit: Students could rewind or fast-forward their professor. Data showed MIT students tended to watch the videos at 1.5 speed, which makes voices sound almost like chipmunks but delivers information more rapidly. "I do want MIT to offer more online education," Ms. LaPenta says.

A hit with students it may be. And they may indeed learn the material and pass the course. But listening to their professor's lectures at 1.5 speed—that is fascinating and frightening. We all are aware of the ways that technology divorces us from the traditional pace of human life. We drive or fly and travel distances in hours that used to take years. We send mail at the speed of the internet. But what will it mean when we speak at 1.5 speed? And speaking is one thing. But teaching and learning?

I have no doubt that studies are being done right now to measure the optimal speeds at which students can listen to lectures and still process the information. Pretty soon students will watch lectures the way many of us now watch TV: on delay, so that they can be fast-forwarded, rewound, and sped up. It is one thing to imagine this as useful for individuals who want to learn how to program a computer or fix an engine or publish a book. But to think that our most illustrious liberal arts institutions will adopt the motto of education at the personal speed of the internet is more than simply strange.

Education, writes Hannah Arendt in The Crisis in Education, is predicated on the basic fact that human beings are born into the world. Young people come into the world and, because they are newcomers and uninitiated, need to be educated, which means they must be introduced to the world. Parents do this to some degree in the home, bringing the child from the home into the wider world. But the primary institutions in which children are educated, in which they are led into the world, are schools.

Normally the child is first introduced to the world in school. Now school is by no means the world and must not pretend to be; it is rather the institution we interpose between the private domain of the home and the world in order to make the transition from the family to the world possible at all.

For Arendt, the key element of education is the authority of the parent, teacher or professor. The teacher takes responsibility for bringing the child into the world, which requires authority:

The teacher's qualification consists in knowing the world and being able to instruct others about it, but his authority rests on his assumption of responsibility for that world. Vis-à-vis the child it is as though he were a representative of all adult inhabitants, pointing out the details and saying to the child: This is our world.

The authority of the teacher is, at bottom, a matter of his or her willingness to take responsibility for the world. In other words, the teacher must be conservative in the sense that his or her role is to "cherish and protect something—the child against the world, the world against the child, the new against the old, the old against the new." The teacher conserves both the world as it is—insofar as he teaches the child what is rather than what should be or what will be—and the child in her newness—by refusing to tell the child what will be or should be, and thus allowing the child the experience of freedom to rebel against the world when and if the time is right.

Arendt's point is that education requires that a child be confronted with the world as it is, not as the student wants it to be. This will often be painful and uncomfortable. It requires authority, and it requires that the student learn to conform to the world. An essential part of education, therefore, is that the student not be in control and that students be led by an external, adult, and respected authority. Which is why, for Arendt, education depends upon the authority of teachers and professors. The idea that our best institutions are imagining an educational present in which students spend more and more of their time online, where they, and not the professors, control and determine their way of learning, does present a threat to education.

Of course, the goal of education is to create independent thinkers. The capstone experience at Bard College, where I teach, and at Amherst College, where I studied, is a senior thesis (at Bard this is mandatory, at Amherst only for honors students). The senior thesis is the transition from education to adulthood and it can be an extraordinary and moving experience. But it is a mistake when students insist—as they often do—on doing too many tutorials or seminars too early in their careers. Students must first learn and such learning requires being led by an authority. Too many students and professors today ignore the importance of authority in education. Technology threatens to feed that already present cultural tendency to free students from their tutelage to professors.

Among the myriad benefits promised by distance learning and the flipped classroom, it is imperative to see where the real dangers and pitfalls lie. The grave danger of the flipped classroom is precisely the perpetuation of the dominant trend of progressive education that has infiltrated teaching at all levels since Piaget and Dewey. It is the claim that students can and ought to be in charge of their own education.

In freeing students from the classroom, in distancing them further from the authority figure of a professor, in replacing Ph.D.s and professors with lesser-trained facilitators, in giving students the power to speed up or slow down the professor's lecture, we are empowering and liberating students and giving them ever more control over their education. This may allow them to learn better or graduate more quickly. It may reduce the cost of college and high school, and it may train people better for certain jobs. They may enjoy their education more. But such an education does not teach students what the world is like. It does not insist that they first learn what is before they begin to fashion the world as they want it to be. It comes from a loss of faith in and love for the world as it is, a loss that pervades a society that no longer believes in itself. Such an attitude does not assume responsibility for the world and does not insist that young people must first learn about the world, at least as the world is now. And it is just such a responsibility that educators must adopt.

The real problem with the rush towards technological education is that it is focused interminably on the future: on qualifications for jobs and preparation for what is to come. Education, at least education that might succeed in introducing young people into a common world which they love and treasure, requires a turn towards the past. Just such a turn from the backwards-glancing education of the liberal arts to the forward-thrusting education that prepares students for jobs and careers is the real threat inherent in the present mania for technologically enhanced pedagogy. Technology is not evil; it can be greatly helpful. But we must first understand why it is we are so desperate for it if we are to integrate it into our world. Otherwise, it will break the world.

On this weekend, I encourage you to take up Hannah Arendt's essay, The Crisis in Education. You can order it here from Amazon. Or listen to Hannah Arendt read from her essay in animated form here.

-RB

The Hannah Arendt Center
The Hannah Arendt Center at Bard is a unique institution, offering a marriage of non-partisan politics and the humanities. It serves as an intellectual incubator for engaged thinking and public discussion of the nation's most pressing political and ethical challenges.
3Sep/121

One Against All

It can be dangerous to tell the truth: “There will always be One against All, one person against all others. [This is so] not because One is terribly wise and All are terribly foolish, but because the process of thinking and researching, which finally yields truth, can only be accomplished by an individual person. In its singularity or duality, one human being seeks and finds – not the truth (Lessing) –, but some truth.”

-Hannah Arendt, Denktagebuch, Book XXIV, No. 21

Hannah Arendt wrote these lines when she was confronted with the severe and often unfair, even slanderous, public criticism launched against her and her book Eichmann in Jerusalem after its publication in 1963. The quote points to her understanding of the thinking I (as opposed to the acting We) on which she bases her moral and, partly, her political philosophy.

It is the thinking I, defined with Kant as selbstdenkend (self-thinking [“singularity”]) and an-der-Stelle-jedes-andern-denkend (i.e., in Arendt’s terms, thinking representatively or practicing the two-in-one [“duality”]). Her words also hint at an essay she published in 1967 titled “Truth and Politics,” wherein she takes up the idea that it is dangerous to tell the truth, factual truth in particular, and considers the teller of factual truth to be powerless. Logically, the All are the powerful, because they may determine what at a specific place and time is considered to be factual truth; their lies, in the guise of truth, constitute reality. Thus, it is extremely hard to fight them.

In answer to questions posed in 1963 by the journalist Samuel Grafton regarding her report on Eichmann and published only recently, Arendt states: “Once I wrote, I was bound to tell the truth as I see it.” The statement reveals that she was quite well aware of the fact that her story, i.e., the result of her own thinking and researching, was only one among others. She also realized the lack of understanding and, in many cases, of thinking and researching, on the part of her critics.

"Iustitia" - Martin van Heemskerck, 1478-1578

Thus, she lost any hope of being able to publicly debate her position in a “real controversy,” as she wrote to Rabbi Hertzberg (April 8, 1966). By the same token, she determined that she would not entertain her critics, as Socrates did the Athenians: “Don’t be offended at my telling you the truth.” Reminded of this quote from Plato’s Apology (31e) in a supportive letter from her friend Helen Wolff, she acknowledged the reference, but acted differently. After having made up her mind, she wrote to Mary McCarthy: “I am convinced that I should not answer individual critics. I probably shall finally make, not an answer, but a kind of evaluation of this whole strange business.” In other words, she did not defend herself in following the motto “One against All,” which she had perceived and noted in her Denktagebuch. Rather, as announced to McCarthy, she provided an “evaluation” in the 1964 preface to the German edition of Eichmann in Jerusalem and later when revising that preface for the postscript of the second English edition.

Arendt also refused to act in accordance with the old saying: Fiat iustitia, et pereat mundus (let there be justice, though the world perish). She writes – in the note of the Denktagebuch from which today’s quote is taken – that such acting would reveal the courage of the teller of truth “or, perhaps, his stubbornness, but neither the truth of what he had to say nor even his own truthfulness.” Thus, she rejected an attitude known in German cultural tradition under the name of Michael Kohlhaas.  A horse trader living in the 16th century, Kohlhaas became known for endlessly and in vain fighting injustice done to him (two of his horses were stolen on the order of a nobleman) and finally taking the law into his own hands by setting fire to houses in Wittenberg.

Even so, Arendt has been praised as a woman of “intellectual courage” with regard to her book on Eichmann (see Richard Bernstein’s contribution to Thinking in Dark Times).

Intellectual courage based on thinking and researching was rare in Arendt’s time and has become even rarer since then. But should Arendt therefore only matter nostalgically? Certainly not. Her emphasis on the benefits of thinking as a solitary business remains current. Consider, for example, the following reference to Sherry Turkle, a sociologist at MIT and author of the recent book Alone Together. In an interview with Peter Haffner (published on July 27, 2012, in SZ Magazin), she argues that individuals who become absorbed in digital communication lose crucial components of their faculty of thinking. Turkle says (my translation): Students who spend all their time and energy on communication via SMS, Facebook, etc. “can hardly concentrate on a particular subject. They have difficulty thinking a complex idea through to its end.” No doubt, this sounds familiar to all of us who know about Hannah Arendt’s effort to promote thinking (and judging) in order to make our world more human.

To return to today’s quote: It can be dangerous to tell the truth, but thinking is dangerous too. Once in a while, not only the teller of truth but the thinking 'I' as well may find himself or herself in the position of One against All.

-Ursula Ludz

The Hannah Arendt Center
The Hannah Arendt Center at Bard is a unique institution, offering a marriage of non-partisan politics and the humanities. It serves as an intellectual incubator for engaged thinking and public discussion of the nation's most pressing political and ethical challenges.
4May/120

Leading a Student Into the World

As long as our world changes so rapidly that children can expect to live very differently than their parents, it is likely that education and child rearing will always be in crisis.

This is the first sentence of a senior project I am reading today, the first of many I will read over the next two weeks. If the others are as fascinating as this one, it will be a happy two weeks.

The Bard senior project is the culmination of a Bard student's year-long inquiry into a topic of their choosing. In this case, my student Steven Tatum—an aspiring teacher who will attend Bard's Master of Arts in Teaching Program next year—set out to explore the sense and import of our crisis in education.

In its most basic sense, education is how we lead new human beings into the world and introduce them to it. The Latin root of our word “educate” is educo, which means to rear or bring up a child, but it also means to lead forth and draw out. For most of Western history, education in this sense was a relatively simple matter of leading children into the lifestyles that their families had maintained for generations. But with the modern emphasis on equality, self-determination, and social mobility, the task of leading children into the world became much more difficult since educators could never know how a given student would choose to live in the world. Schools were given the task of leading students into a world of freedom and possibilities.

While these benefits for human freedom certainly make the increased burden on education worth bearing, this difficulty becomes a crisis when parents and teachers cannot be sure what the world will be like when their children and students reach adulthood. How can parents and teachers lead the next generation into a world that neither generation knows?

Tatum's Senior Project asks how to lead a student into the world, and seeks guidance from Hannah Arendt's essay, The Crisis in Education.

In this project I follow Arendt through the crisis in education as a way of learning with her about the essence of education and the educational challenges we face in our uncertain time. I begin at the beginning of education: the birth of a child. For Arendt, the fact that new people are continuously born into the world is the essence of education. In addition to marking the beginning of a living growing being, Arendt focuses on birth as the origin of our capacity to make new beginnings of our own throughout our lives by acting in the world. She believes the task of education is to preserve and foster this capacity for action so that the members of each new generation can participate in building and rebuilding a common world.

The tension in education today is between the need to lead people into an already existing world and the equally pressing imperative to prepare them for a new world that certainly is approaching, faster and more unpredictably than any of us imagine. The news this week is filled with articles about new initiatives at Stanford, the University of Pennsylvania, Harvard, and MIT to create new corporations that will offer courses on the internet. This is part of the trend to orient education toward the future, in the hope that we can teach students more quickly and more efficiently what they will need to know in the new economy.

Underlying much contemporary educational thinking is the assumption that our present world will not last long. More important than leading students into the world, is the need to give them the tools of the future. And this is not wrong. We do live in a world in which the constancy of tradition has been disrupted. Ours is a world in which the foundations are fluid and we cannot rely on past verities, be they moral, political, or scientific. Everything is changeable, and we must prepare our children for such a world.

And yet, even in a world in which we must "think without banisters," there is still a world, a common sense and a common space where people congregate. As Arendt writes,

The loss of worldly permanence and reliability ... does not entail, at least not necessarily, the loss of the human capacity for building, preserving, and caring for a world that can survive us and remain a place fit to live in for those who come after us.

It may be that we live in a time of flux and change, one where permanence and structure are necessarily fleeting. At the same time, it is human to build structures that last, to tell stories that are meaningful, and to build works that memorialize. As much as education is about preparing students for the new, it is also about teaching them the stories, showing them the works, and introducing them to the heroes that together compose the world into which they have been born. Education is importantly a collective effort at remembering and thus calling to mind the world in which we live.

 

With that in mind, it is helpful to consider these lines from Steven's Thesis.

While I focus on the arguments she makes in her published work, studying Arendt has also allowed me to reflect on how my own education has taken place. As a student at Bard College, I found Hannah Arendt’s grave in the college cemetery well before I read any of her work. In writing this project, I have found more and more ways in which I share a common world with her. I did research in her personal library, read her letters, spoke with people who knew her, and sat by her grave. I also learned recently that one of the desks in the classroom at Bard’s Hannah Arendt Center where I took a class on her book Between Past and Future is the desk from her apartment in New York City. These experiences have done more than add personal touches to my research; they resonate with the content of this project in the sense that they have led me to a deeper awareness of and appreciation for the world that I am entering.

For your weekend read, I commend to you Hannah Arendt's essay, The Crisis in Education.

-RB

Roger Berkowitz
Roger Berkowitz is Associate Professor of Political Studies and Human Rights at Bard College, and Academic Director of the Hannah Arendt Center for Politics and the Humanities. He is also the author of "The Gift of Science: Leibniz and the Modern Legal Tradition", as well as co-editor of "Thinking in Dark Times: Hannah Arendt on Ethics and Politics".
28Mar/121

Pensions: The Unraveling Fiction

How big is the pension crisis in the United States? As I wrote last week, The Pew Charitable Trusts has issued a report showing a whopping $1 trillion gap between the pensions promised to state public employees and the money that has been set aside to pay those pensions. But I also said that many people think that gap is actually much bigger.

The states' calculations assume a rosy 8% or even 10% return on their investments. The Pew report shows that even with those unrealistic assumptions there will be a $1 trillion gap, since the states are underfunding their pension funds even relative to those optimistic returns.

Recently, Gillian Tett of the Financial Times talked to a few academics about the question and learned why the gap is actually $3-5 trillion, and not simply $1 trillion. The basic problem is that low interest rates (now around 2%) mean that the investments in pension funds are not returning anywhere close to the hoped-for amount. As Tett reports:

Thus academics, such as Joshua Rauh of Northwestern University, think that if a more realistic rate of return were used, this would reveal that state pension funds are now underfunded to the tune of $3tn-$4tn. Other observers are even gloomier. “This $4tn figure is a lower bound,” argues Robert Merton, economics professor at MIT. “Liabilities as reported by state and local governments seem to creep steadily up with each report due to ‘actuarial losses’ or overly generous assumptions about mortality and worker behaviour. In recent years, these have added growth of about 4-5 per cent per year to total liabilities.” And, of course, the longer that US interest rates – and bond yields – remain ultra low, the worse this underfunding gap becomes.
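The arithmetic behind these larger estimates is simple discounting. A pension is a promise to pay money decades from now, and its reported size today depends on the rate assumed for translating that future payment into present dollars. The sketch below uses invented figures purely to show the mechanism, not the states' actual books: the same promise grows several times larger on paper as the assumed return falls from 8% toward today's 2%.

```python
# Illustrative arithmetic only: invented payouts, not actual state liabilities.
def present_value(payment: float, rate: float, years: int) -> float:
    """Today's reported cost of a promise to pay `payment` dollars `years` from now."""
    return payment / (1 + rate) ** years

promised = 1_000_000  # say, $1 million owed to retirees 30 years from now
for rate in (0.08, 0.04, 0.02):
    print(f"assumed return {rate:.0%}: liability today = ${present_value(promised, rate, 30):,.0f}")

# Roughly $99,000 at 8%, $308,000 at 4%, and $552,000 at 2%: the same promise,
# several times larger on paper once the rosy return assumption is dropped.
```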

Tett's essay makes for a sobering read. As she rightly points out, this problem cannot be ducked forever. Remember, the 2009 bailout that President Obama pushed through was $900 billion, slightly under $1 trillion. We are talking about a shortfall in state budgets of $3-5 trillion in coming years. This is enormous, and the effect on state governments and public services will be disastrous. But the very worst effect will be on all of those public employees who have been counting on contractually guaranteed pensions and who will, I fear, learn what workers in Rhode Island and Alabama recently learned: such contractual guarantees don't mean much.

What does it mean to have a fact-based politics? This is a question that Hannah Arendt struggled with. First, in her writings on totalitarianism, she saw that at the core of totalitarian regimes was the need to keep alive a coherent fantasy that motivated the mass movements supporting the regimes. When inconvenient facts appeared, they simply had to be eradicated.

Later, writing during the Vietnam war and in response to her book Eichmann in Jerusalem, Arendt argued that lies came to serve not totalitarian movements, but well-meaning idealists and technocrats who convinced not only others but even themselves that their lies were in the service of a winnable and noble cause.

Today we face the unraveling of a huge fiction. While the United States is still a wealthy country, we are not as wealthy as we have pretended to be over the last 15 years. But instead of addressing this self-deception, we are continuing to demand higher pensions and better medical care without actually asking who is going to pay for such services. It is a nice slogan to say that pensions and healthcare are human rights. But the current way we are achieving such human rights is by lying to ourselves, and, most pointedly, to the public employees who will see their promised pensions and healthcare evaporate during their retirement.

It would be nice if one of the Presidential candidates in either party would actually discuss the crisis in state pensions. But that would require courage and leadership, not to mention a willingness to have an honest conversation about the fact that this country continues to live beyond its means and promise benefits it cannot afford.

-RB

The Hannah Arendt Center
The Hannah Arendt Center at Bard is a unique institution, offering a marriage of non-partisan politics and the humanities. It serves as an intellectual incubator for engaged thinking and public discussion of the nation's most pressing political and ethical challenges.