“Scientific and philosophic truth have parted company.”
—Hannah Arendt, The Human Condition, 41.290
What can it mean that there are two different types of truth—scientific and philosophic? And how could they not be connected?
Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.
Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.
Anthony Grafton calls David Nirenberg’s Anti-Judaism “one of the saddest stories, and one of the most learned, I have ever read.” Grafton knows that Anti-Judaism “is certainly not the first effort to survey the long grim history of the charges that have been brought against the Jews by their long gray line of self-appointed prosecutors.” What makes this account of the long history of hatred of the Jews so compelling is that Nirenberg asks the big question: Why the Jews? “[Nirenberg] wants to know why: why have so many cultures and so many intellectuals had so much to say about the Jews? More particularly, he wants to know why so many of them generated their descriptions and explanations of Jewishness not out of personal knowledge or scholarly research, but out of thin air—and from assumptions, some inherited and others newly minted, that the Jews could be wholly known even to those who knew no Jews.” The question recalls the famous joke told during the Holocaust, especially amongst Jews in concentration camps. Here is one formulation of the joke from Antisemitism, the first book in the trilogy that comprises Hannah Arendt’s magnum opus, The Origins of Totalitarianism: “An antisemite claimed that the Jews had caused the war; the reply was: Yes, the Jews and the bicyclists. Why the bicyclists? asks the one. Why the Jews? asks the other.” Read more on the Arendt Center blog.
News that the SAT is about to undergo a makeover leaves Bard College President Leon Botstein unimpressed: “The changes recently announced by the College Board to its SAT college entrance exam bring to mind the familiar phrase ‘too little, too late.’ The alleged improvements are motivated not by any serious soul-searching about the SAT but by the competition the College Board has experienced from its arch-rival, the ACT, the other major purveyor of standardized college entrance exams. But the problems that plague the SAT also plague the ACT. The SAT needs to be abandoned and replaced. The SAT has a status as a reliable measure of college readiness it does not deserve. The College Board has successfully marketed its exams to parents, students, colleges and universities as arbiters of educational standards. The nation actually needs fewer such exam schemes; they damage the high school curriculum and terrify both students and parents. The blunt fact is that the SAT has never been a good predictor of academic achievement in college. High school grades adjusted to account for the curriculum and academic programs in the high school from which a student graduates are. The essential mechanism of the SAT, the multiple-choice test question, is a bizarre relic of long-outdated twentieth-century social scientific assumptions and strategies. As every adult recognizes, knowing something or how to do something in real life is never defined by being able to choose a ‘right’ answer from a set of possible answers (some of them intentionally misleading) put forward by faceless test designers who are rarely eminent experts. No scientist, engineer, writer, psychologist, artist, or physician—and certainly no scholar, and therefore no serious university faculty member—pursues his or her vocation by getting right answers from a set of prescribed alternatives that trivialize complexity and ambiguity.”
Foreign policy types are up in arms—not over Russia’s pending annexation of Crimea, but over the response in the West. By yelling loudly but doing nothing in Syria and now in Ukraine, America and Europe are losing all credibility. The insinuation is clear. If we don’t draw the line at Crimea, we will embolden Putin in Poland. Much as in the 1930s, the current NATO alliance seems unwilling to stand up for anything on principle if the costs are more than a few photo opportunities and some angry tweets. According to The American Interest, “Putin believes the West is decadent, weak, and divided. The West needs to prove him wrong.” And in Politico, Ben Judah writes: “Russia’s rulers have been buying up Europe for years. They have mansions and luxury flats from London’s West End to France’s Côte d’Azur. Their children are safe at British boarding and Swiss finishing schools. And their money is squirrelled away in Austrian banks and British tax havens. Putin’s inner circle no longer fear the European establishment. They once imagined them all in MI6. Now they know better. They have seen firsthand how obsequious Western aristocrats and corporate tycoons suddenly turn when their billions come into play. They now view them as hypocrites—the same European elites who help them hide their fortunes.”
In The New York Times Magazine, Siddhartha Deb profiles Arundhati Roy, the Indian writer best known in the West for her 1997 novel The God of Small Things. Though the book made Roy into a national icon, her political essays – in which she has addressed, among other issues, India’s occupation of Kashmir, the “lunacy” of India’s nuclear program, and the paramilitary operations in central India against the ultraleft guerrillas and indigenous populations – have angered many nationalist and upper-class Indians with their fierce critiques. Roy’s most recent work, The Doctor and the Saint, is an introduction to Dr. B.R. Ambedkar’s famous 1936 essay “Annihilation of Caste” that is likely to spark controversy over her rebuke of Gandhi, who wanted to abolish untouchability but not caste. How does Roy see her fiction in relation to her politics? “I’m not a person who likes to use fiction as a means,” she says. “I think it’s an irreducible thing, fiction. It’s itself. It’s not a movie, it’s not a political tract, it’s not a slogan. The ways in which I have thought politically, the proteins of that have to be broken down and forgotten about, until it comes out as the sweat on your skin.” You can read Deb’s profile of Roy here, and an excerpt from The Doctor and the Saint here.
Comparing the MOOC and the GED, Michael Guerreiro wonders whether participants approach both programs with the same sense of purpose. The answer, he suspects, is no: “The data tells us that very few of the students who enroll in a MOOC will ever reach its end. In the ivy, brick, and mortar world from which MOOCs were spun, that would be damning enough. Sticking around is important there; credentials and connections reign, starting with the high-school transcript and continuing through graduate degrees. But students may go into an online course knowing that a completion certificate, even offered under the imprimatur of Harvard or UPenn, doesn’t have the same worth. A recent study by a team of researchers from Coursera found that, for many MOOC students, the credential isn’t the goal at all. Students may treat the MOOC as a resource or a text rather than as a course, jumping in to learn new code or view an enticing lecture and back out whenever they want, just as they would while skimming the wider Web. For many, MOOCs may be just one more Internet tool or diversion; in the Coursera study, the retention rate among committed students for a typical class was shown to be roughly on par with that of a mobile app. And the London Times reported last week that, when given the option to get course credit for their MOOC (for a fee), none of the thousand or so students who enrolled in a British online class did.” A potent reminder that while MOOCs may indeed succeed and may even replace university education for many people, they are not so much about education as a combination of entertainment, credential, and manual. These are each important activities, but they are not what liberal arts colleges should be about. The hope in the rise of MOOCs, as we’ve written before, is that they help return college to its mission: to teach critical thinking and expose students to the life of the mind.
Noam Chomsky, speaking to the Adjunct Faculty Association of the United Steelworkers, takes issue with the idea that the American university was once living and is now undead, and seeks a way forward: "First of all, we should put aside any idea that there was once a “golden age.” Things were different and in some ways better in the past, but far from perfect. The traditional universities were, for example, extremely hierarchical, with very little democratic participation in decision-making. One part of the activism of the 1960s was to try to democratize the universities, to bring in, say, student representatives to faculty committees, to bring in staff to participate. These efforts were carried forward under student initiatives, with some degree of success. Most universities now have some degree of student participation in faculty decisions. And I think those are the kinds of things we should be moving towards: a democratic institution, in which the people involved in the institution, whoever they may be (faculty, students, staff), participate in determining the nature of the institution and how it runs; and the same should go for a factory. These are not radical ideas."
This week on the blog, Anna Metcalfe examines the multi-dimensional idea of action that Arendt discusses in The Human Condition. And in the Weekend Read, entitled 'Why the Jews?', Roger Berkowitz delves into anti-Judaism and its deep-seated roots in Western civilization.
Featuring Housekeeping by Marilynne Robinson.
Bard College partners with five local libraries for six weeks of activities, performances, and discussions scheduled throughout the Hudson Valley.
Learn more here.
'What Europe? Ideals to Fight for Today'
The HAC co-sponsors the second annual conference with Bard College in Berlin
March 27-28, 2014
Learn more here.
At Duke University and the University of North Carolina, two highly popular professors have transformed their course Think Again: How to Reason and Argue into a Massive Open Online Course (MOOC) that is taken by 170,000 people from all over the world at a time. This is old news. There is nothing to worry about when hundreds of thousands of people around the world watch flashy lectures by top professors on how to think and argue. Better such diversions than playing Temple Run. There are advantages and benefits from MOOCs and other forms of computer learning. And we should not run scared from MOOCs.
But the alacrity with which universities are adopting MOOCs as a way of cutting costs and marketing themselves as international brands harbors a danger too. The danger is not that more people will watch MOOCs or that MOOCs might be used to convey basic knowledge inside or outside of universities. No, the real danger in MOOCs is that watching a professor on your iPad becomes confused with education.
You know elite universities are in trouble when their professors talk the way Edward Rock does. Rock, Distinguished Professor at the University of Pennsylvania Law School and coordinator of Penn’s online education program, has this to say about the impending revolution in online education:
We’re in the business of creating and disseminating knowledge. And in 2012, the internet is an incredibly important place to be present if you’re in the knowledge dissemination business.
If elite colleges are in the knowledge dissemination business, then they will over time be increasingly devalued and made less relevant. There is no reason that computers or televisions can’t convey knowledge as well or even better than humans. Insofar as professors and colleges imagine themselves to be in the “business of creating and disseminating knowledge,” they will be replaced by computers. And it will be their own fault.
The rising popularity of MOOCs must be understood not as a product of new technology, but as a response to the failure of our universities. As Scott Newstock has argued, the basic principle behind MOOCs is hardly new. Newstock quotes one prominent expert who argues that the average distance learner "knows more of the subject, and knows it better, than the student who has covered the same ground in the classroom." Indeed, "the day is coming when the work done [via distance learning] will be greater in amount than that done in the class-rooms of our colleges." What you might not expect is that this prediction was made in 1885. "The commentator quoted above was Yale classicist (and future University of Chicago President) William Rainey Harper, evaluating correspondence courses." What Newstock’s provocation shows is that efforts to replace education with knowledge dissemination have been around for well over a century. But they have failed, at least until now.
MOOCs are so popular today because of the sadly poor quality of much—but certainly not all—college and university education. Around the country there are cavernous lecture halls filled with many hundreds of students. A lone professor stands up front, often with a PowerPoint presentation in a darkened room. Students have their computers open. Some are taking notes, but many are checking Facebook or surfing the Internet. Some are asleep. And others did not bother to show up, since the professor has posted his or her lecture notes online so that students can just read them instead of making the trip to class. Such lectures may be half-decent ways to disseminate knowledge. Some lectures are better than others. But not much learning goes on in such lectures that can’t be simply replicated more efficiently and maybe even better on a computer. It is in this context that advocates of MOOCs are correct. When one compares a large lecture course with a well-designed online course, it may very well be that the online course is a superior educational venture. That it is cheaper too makes the advance of MOOCs seemingly inevitable.
As I have written here before, the best argument for MOOCs is that they may finally put the large and impersonal college lecture course out of its misery. There is no reason to be nostalgic for the lecture course. It was never a very good idea. Aside from a few exceptional lecturers—in my world I can think of the reputations of Hegel, his student Eduard Gans, Martin Heidegger, and, of course, Hannah Arendt—college lectures are largely an economical way to allow masses of students to acquire basic introductory knowledge in a field. If the masses are now more massive and the lectures more accessible, I’ll accept that as progress.
What this means is that there is an opportunity, at this moment, to embrace MOOCs as a disruptive force that will allow us to re-dedicate our universities and colleges to the practice of education as opposed to the business of knowledge dissemination. What colleges and universities need to offer is not simply knowledge, but education.
“Education,” as Martin Luther King wrote, “must also train one for quick, resolute and effective thinking.” Quick and resolute thinking requires that one “think incisively” and “think for one's self.” This “is very difficult.” The difficulty comes from the seduction of conformity and the power of prejudice. “We are prone to let our mental life become invaded by legions of half truths, prejudices, and propaganda.” We are all educated into prejudgments. They are human and it is inhuman to live free from prejudicial opinions and thoughts. On the one hand, education is the way we are led into and brought into a world as it exists, with its prejudices and values. And yet, education must also produce self-thinking persons, people who, once they are educated and enter the world as adults, are capable of judging the world into which they have been born. (I have written more about King’s thoughts on education here).
In her essay “The Crisis in Education,” Hannah Arendt writes that education must have a double aspect. First, education leads a new young person into an already existing world. The world is that which is there before the child was born and will continue to exist after the child dies. It is the common world of things, stories, and experiences in which all of us spend our lives. All children, as newcomers who are born into a world that is at first strange to them, must be led into the already existing world. They must be taught to speak a common language, respect common values, see the same facts, and hear the same stories. This common world is what Arendt calls the “truth… we cannot change; metaphorically, it is the ground on which we stand and the sky that stretches above us.” In its first aspect, then, education must protect the world from “the onslaught of the new that bursts upon it with each new generation.” This is the conservationist function of education: to conserve the common world against the rebelliousness of the new. And this is why Arendt writes, “Education is the point at which we decide whether we love the world enough to assume responsibility for it.”
At the same time, however, there is a second aspect of education that seeks to afford the child “special protection and care so that nothing destructive may happen to him from the world.” The teacher must nurture the independence and newness of each child, what “we generally call the free development of characteristic qualities and talents… the uniqueness that distinguishes every human being from every other.” The teacher must not simply love the world, but as part of the world in which we live, the teacher must also love the fact—and it is a fact—that the world will change and be transformed by new ideas and new people. Education must love this transformative nature of children, and we must “love our children enough” so that we do not “strike from their hands their chance of undertaking something new, something unforeseen by us, but to prepare them in advance for the task of renewing a common world.” Alongside its conservationist role, education also must be revolutionary in the sense that it prepares students to strike out and create something altogether new.
Now is the time to use the disruption around MOOCs to rethink and re-invigorate our commitment to education and not simply to the dissemination of knowledge. This will not be easy.
A case in point is the same Duke University course mentioned above, “Think Again: How to Reason and Argue.” In a recent article by Michael Fitzgerald, the professors—Walter Sinnott-Armstrong of Duke and Ram Neta of the University of North Carolina at Chapel Hill—describe how teaching their MOOC led them to radically re-conceive how they teach in physical university classrooms. Here is Fitzgerald:
“The big shift: far fewer in-class lectures. Students will watch the lectures on Coursera beginning Monday. "Class will become a time for activities and also teamwork," said Sinnott-Armstrong. He's devised exercises to help on-campus students engage with the concepts in the class, including a college bowl-like competition, a murder mystery night and a scavenger hunt, all to help students develop a deeper understanding of the material presented in the lectures. "You can have these fun activities in the classroom when you're not wasting the classroom time with the lectures," he said.”
What we see here is that the mass appeal of MOOCs and their use as a way of replacing lectures is not being seized as an opportunity to make education more serious, but as an excuse to make college more fun. That professors at two of this country’s elite universities see it as progress that classes are replaced by murder mystery games and scavenger hunts is evidence of a profound confusion between education and infotainment. I have no doubt that much can be learned through fun and games. Children learn through games and it makes all the sense in the world that Finland allows children in schools to play until they are seven or eight years old. Even in primary or at times in secondary school, simulations and games may be useful. But there is a limit. Education, at least higher education, is not simply fun and games in the pursuit of knowledge.
As Arendt understood, education requires that students be nurtured and allowed to grow into adults who think for themselves in a serious and engaged way about the world. This is one reason Arendt is so critical of reformist pedagogy that seeks to stimulate children—especially older children in secondary schools and even college—to learn through play. When we teach children a foreign language through games instead of through grammar or when we make them learn history by playing computer games instead of by reading and studying, we “keep the older child as far as possible at the infant level. The very thing that should prepare the child for the world of adults, the gradually acquired habit of work and of not-playing, is done away with in favor of the autonomy of the world of childhood.” The same can be said of university courses that adopt the juvenile means of primary and secondary education.
The reasons for such a move to games in the classroom are many. Games are easy, students love them, and thus they fill massive classes, leading to superstar professors who can command supersized salaries. What is more, games work. You can learn a language through games. But games rarely teach seriousness and independence of thought.
The rise of MOOCs and the rise of fun in the college classroom are part of the trend to reduce education to a juvenile pursuit. One hardly needs an advanced degree to oversee a scavenger hunt or prepare students to take a test. And scavenger hunts, as useful as they may be in making learning fun, will hardly inculcate the independence of mind and strength of character that will produce self-thinking citizens capable of renewing the common world.
The question of how to address the crisis in education today—the fact that an ever more knowledgeable population with greater access to information than at any time in the history of the world is perhaps the most politically illiterate citizenry in centuries—is the theme of the upcoming Hannah Arendt Center Conference, “Failing Fast: The Educated Citizen in Crisis.” In preparation for the conference, you can do nothing better than to re-read Hannah Arendt’s essay, "The Crisis in Education." You can also buy Between Past and Future, the book of essays in which it appears. However you read it, "The Crisis in Education" is your weekend read.
“[Augustine] distinguishes between the questions of "Who am I?" and "What am I?" the first being directed by man at himself […] For in the "great mystery," the grande profundum, which man is (iv. 14), there is "something of man [aliquid hominis] which the spirit of man which is in him itself knoweth not. But Thou, Lord, who has made him [fecisti eum] knowest everything of him [eius omnia]" (x. 5).”
—Hannah Arendt, The Human Condition
In The Human Condition, Arendt raises major concerns about the place of man, but she does not intend to respond to the loss of the earth as a unique human condition with a restoration of solid ground. To the question “What am I?” the only answer is: “You are a man—whatever that may be.” In lieu of an answer that would give man a new foundation, Arendt offers a description of man's ever-changing territory.
Following Augustine, Arendt claims that only God could have the distance to answer the question of "who" man is with anything resembling a concrete statement of human nature. She respects the unknown “spirit of man,” even beyond the knowledge provided by religion.
When philosophy attempts to answer this question, it ends up creating its own image of a higher power, which remains linked through projection to man. Importantly though, philosophy should still ask the question.
Some context can help to open Arendt's question here for readers in English-speaking countries, where philosophical anthropology never gained the same traction as in Germany. Her challenge picks up on the heated debates of the 1920s and '30s, culminating in the work of Husserl and Heidegger, over how to take the collapse of universal values seriously without falling back on simple subjectivism.
In the space of four pages of Being and Time (46-49), Martin Heidegger specifies his criticism with reference to Dilthey, Bergson, Scheler, and Husserl, as well as views from ancient Greek philosophy and Genesis. Heidegger says he has focused his analytic of Dasein on the question of Being and that it cannot therefore provide the fully ontological basis of Dasein needed for "'philosophical anthropology'" but states that part of his goal is to "make such an anthropology possible." Later though, in section 10, Heidegger provides a further explanation of his criticism of anthropology: in "the attempt to determine the essence of 'man,' as an entity, the question of Being has been forgotten."
In its turn to experience and consciousness, philosophical anthropology forgets to ask the question of the ontological definition of perceptual experience (cogitationes). Heidegger thus suggests that his investigation might provide the basis for an anthropology but does not claim to actually deliver this basis. He opens the question of the definition of man, but does so to orient man (recast as Dasein) toward his relation to Being. In a parallel manner, we can understand Arendt's reading of Augustine as opening the question of the relation between the "who" and “what” man is, but not closing it. Her work here is provocative because it cannot be said to be in the service of a simple secularization that replaces a higher power with human measure. Nor does she wish to save or restore divine guarantee. Perhaps Augustine allows her to pose similar questions of philosophical anthropology to those raised by Heidegger, but to win some distance from her teacher so that she can open a new space of freedom of action rather than freedom of thought.
The word designating military drones comes from the word for bee. This is true in many languages around the world. Partly because of this linguistic consistency, it is a common misperception that drones take their name from the buzzing sound unmanned aircraft make as they fill the air. More accurately, however, drones trace their etymological lineage to the male honey-bee, which is called a drone. The male drone-bee is distinguished from the female worker-bees. It does no useful work and has one single function: to impregnate the queen-bee. What unites military drones with their apiary namesakes is not sound, but thoughtless purposefulness.
The beauty of the drone-bee—like the dark beauty of the military drone—is its single-minded purpose. It is a miracle of efficiency, designed to do one thing. The drone-bee is not distracted by the perfume of flowers or the contentment of labor. It is born, lives, and dies with only one task in mind. Similarly, the military drone suffers neither from hunger nor from distraction. It does what it is told. If necessary, it will sacrifice itself for its mission. It is a model of thoughtless efficiency.
A few weeks ago I wrote about Ernst Jünger’s novel The Glass Bees, in which a brilliant inventor produces tiny flying glass bees that offer limitless potential for surveillance and war. Today I turn to Jake Kosek’s recent paper “Ecologies of Empire: On The New Uses of the Honeybee.” Kosek does not cite Jünger’s novel, and yet his article is in many ways its non-fiction sequel. What Kosek sees is that the rise of drones in military strategy is tied deeply to their ability to mimic the activity and demeanor of male honey-bees. It is because bees can fly, swarm, change direction, alter their course, and yet achieve their single purpose absent any intentionality or thinking that bees are so useful in modern warfare.
Bees have long been associated with military endeavors, both metaphorically and literally. Kosek tells us that our word bomb comes from the Greek bombos, which means bee. The first bombs were, it seems, beehives dropped or catapulted into the heart of the enemy camp. Bees are today trained to sniff out toxic chemicals; and beeswax was for generations an essential ingredient in munitions.
In the war on terror, bees have taken on a special significance. The “enemy’s lack of coherence—institutionally, ideologically, and territorially— makes the search for the enemy central to the politics of the war on terror.” War in the war on terror is ever less a contest of armies on the battlefield and is increasingly a war of knowledge. This means that surveillance—for centuries an important complement to battlefield tactics—comes to occupy the core of the modern war on terror. In this regard, drones are essential, as drones can hover in the air unseen for days, gathering essential intelligence on persons, groups, or even whole cities. All the more powerful would be miniature drones that fly through the air unseen and at ground level. That is why Kosek writes that “Intelligence gathering [is] not just limited to psychologists, sociologists, lawyers, and military planners, but [has come] to include biologists, anthropologists, epidemiologists, and even entomologists.” What the military use of bees promises is access to information and worlds not previously open to human knowledge. Bees, Kosek writes, are increasingly the model for the modern military.
The advantage of bees is not simply their thoughtlessness, but is found also in their ability to operate as part of a swarm. Current drone technology requires that each drone be controlled by a single pilot. What happens when hundreds of drones must share the airspace around a target? How can drones coordinate their activity? Kosek quotes a private contractor, John Sauter, who says:
“A central aspect of the future of warfare technology is to get networks of machines to operate as self-synchronized war fighting units that can act as complex adaptive systems. . . We want these machines to be fighting units that can operate as reconfigurable swarms that are less mechanical and more organic, less engineered and more grown.”
The point is that drones, be they large or small, must increasingly work in conjunction with each other at a speed and level of nuance that is impossible for human controllers to manage. The result is that we must model the drones of the future on bees.
The scientists working with the Pentagon to create drones that can fly and function like bees are not entomologists, but mathematicians. The DNA of the glass or silicon bees of the future will be complex algorithms inspired by but actually surpassing the ability of swarms “to coordinate and collect small bits of information that can be synchronized to make collective action by drones possible.” Once this is possible, one controller will be able to manage a single drone “and the others adapt, react, and coordinate with that drone.”
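For readers curious what such a coordination algorithm might look like in miniature, here is a deliberately toy sketch of a leader-follower rule, where one "piloted" drone is tracked by autonomous followers that also keep their distance from one another. This is purely illustrative: the class, function names, and parameters below are hypothetical and are not drawn from Kosek's paper or any military system.

```python
# Toy leader-follower "swarm" update: followers drift toward the leader
# while nudging apart any pair that has crowded too close together.
# All names and parameters are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Drone:
    x: float
    y: float

def step(leader: Drone, followers: list[Drone],
         pull: float = 0.1, spacing: float = 1.0) -> None:
    """Advance the swarm one tick."""
    # Cohesion: each follower moves a fraction of the way toward the leader.
    for d in followers:
        d.x += pull * (leader.x - d.x)
        d.y += pull * (leader.y - d.y)
    # Separation: push apart any pair closer than the desired spacing.
    for i, a in enumerate(followers):
        for b in followers[i + 1:]:
            dx, dy = a.x - b.x, a.y - b.y
            dist = (dx * dx + dy * dy) ** 0.5
            if 0 < dist < spacing:
                push = 0.5 * (spacing - dist) / dist
                a.x += dx * push; a.y += dy * push
                b.x -= dx * push; b.y -= dy * push

leader = Drone(10.0, 0.0)
swarm = [Drone(0.0, 0.0), Drone(0.0, 2.0)]
for _ in range(20):
    step(leader, swarm)
```

The point of the sketch is the division of labor: only the leader is steered, while the followers' behavior emerges from two local rules (follow, don't collide) with no central choreography, which is roughly the "self-synchronized" quality Sauter describes.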
Kosek’s article is provocative and fascinating. His ruminations on empire strike me as overdone; his insights about how our training and use of bees has transformed the bee, and how bees are serving as models and inspiration for our own development of new ways to fight wars and solve problems, are important. So too is his imagination of the bee as the six-legged soldier of the future. Whether the drones of the future are cyborg bees (as some in Kosek’s article suggest) or mechanical bees as Jünger imagined half a century ago, it is nevertheless the case that thinking about the impact of drones on warfare and human life is enriched by the meditation on the male honeybee. For your weekend read, I offer you Jake Kosek’s “Ecologies of Empire: On The New Uses of the Honeybee.”
“The shift from the ‘why’ and ‘what’ to the ‘how’ implies that the actual objects of knowledge can no longer be things or eternal motions but must be processes, and that the object of science is no longer nature or the universe but the history, the story of the coming into being, of nature or life or the universe....Nature, because it could be known only in processes which human ingenuity, the ingeniousness of homo faber, could repeat and remake in the experiment, became a process, and all particular natural things derived their significance and meaning solely from their function in the over-all process. In the place of the concept of Being we now find the concept of Process. And whereas it is in the nature of Being to appear and thus disclose itself, it is in the nature of Process to remain invisible, to be something whose existence can only be inferred from the presence of certain phenomena.”
—Hannah Arendt, The Human Condition
Bookending Arendt’s consideration of the human condition “from the vantage point of our newest experiences and our most recent fears” is her invocation of several “events,” which she took to be emblematic of the modern world launched by the atomic explosions of the 1940s and of the threshold of the modern age that preceded it by several centuries. The event she invokes in the opening pages is the launch of Sputnik in 1957; its companion events are named in the last chapter of the book—the discovery of America, the Reformation, and the invention of the telescope and the development of a new science.
Not once mentioned in The Human Condition, but, as Mary Dietz argued so persuasively in her Turning Operations, palpably present as a “felt absence,” is the event of the Shoah, the “hellish experiment” of the SS concentration camps, which is memorialized today, Yom HaShoah. Reading Arendt’s commentaries on the discovery of the Archimedean point and its application in modern science with the palpably present but textually absent event of the Holocaust in mind sheds new light on the significance of her cautionary tale about the worrying implications of the new techno-science of algorithms and quantum physics and its understanding of nature produced through the experiment.
What happens, she seems to be asking, when the meaning of all “particular things” derives solely from “their function in the over-all process”? If nature in all of its aspects is understood as the inter- (or intra-) related aspects of the overall life process of the universe, does then human existence, as part of nature, become merely one part of that larger process, differing perhaps in degree, but not kind, from any other part?
Recently, “new materialist” philosophers have lauded this so-called “posthumanist” conceptualization of existence, arguing that the anthropocentrism anchoring earlier modern philosophies—Arendt implicitly placed among them?—arbitrarily separates humans from the rest of nature and positions them as masters in charge of the world (universe). By contrast, a diverse range of thinkers such as Jane Bennett, Rosi Braidotti, William Connolly, Diana Coole, and Cary Wolfe have drawn on a variety of philosophical and scientific traditions to re-appropriate and “post-modernize” some form of vitalism. The result is a reformulation of an ontology of process—what Connolly calls “a world of becoming”—as the most accurate way to understand matter’s dynamic and eternal self-unfolding. And, consequently, it entails transforming agency from a human capacity of “the will,” with its related intentions, into an agency of “multiple degrees and sites...flowing from simple natural processes, to human beings and collective social assemblages,” with each level and site containing “traces and remnants from the levels from which it evolved,” which “affect [agency’s] operation.” (Connolly, A World of Becoming, p. 22, emphasis added). The advantage of a “philosophy/faith of radical immanence or immanent realism,” Connolly argues, is its ability to engage the “human predicament”: “how to negotiate life, without hubris or existential resentment, in a world that is neither providential nor susceptible to consummate mastery. We must explore how to invest existential affirmation in such a world, even as we strive to fend off its worst dangers.”
This philosophy/faith harbors an implicit ethic of taking better care of the world, “to fold a spirit of presumptive generosity for the diversity of life into your conduct,” by not becoming too enamored with human agency. Similar ethical concerns can be discerned in Jane Bennett’s Vibrant Matter, in the entanglements she explores between human and non-human materiality—a “heterogeneous monism of vibrant bodies.” “It seems necessary and impossible to rewrite the default grammar of agency, a grammar that assigns activity to people and passivity to things.” Conceptualizing nature as “an active becoming, a creative not-quite-human force capable of producing the new,” Bennett affirms a “vital materiality [that] congeals into bodies, bodies that seek to persevere or prolong their run” (p. 118, emphasis in the original), where “bodies” connotes all forms of matter. And she contends that this vital materialism can “enhance the prospects for a more sustainability-oriented public.” Yet, without some normative criteria for discerning how this new materialism can work toward “sustainability,” it is by no means obvious how either a declaration of faith in the “radical character of the (fractious) kinship between the human and the non-human” or a greater “attentiveness to the indispensable foreignness that we are” would redirect politics toward more gratitude and away from destructive patterns of production and consumption. The recognition of our vulnerability could just as easily lead to renewed efforts to truncate or even eradicate the “foreignness” within.
Nonetheless, although these and other accounts call for a reconceptualization of concepts of agency and of causality, none pushes as far toward a productivist/performative account of matter and meaning as does Karen Barad’s theory of “agential realism.” Drawing out the implications of Niels Bohr’s quantum mechanics, Barad develops a theory of how “subjects” and “objects” are produced as apparently separable entities by “specific material configurings of the world” which enact “boundaries, properties, and meanings.” And, in her conceptualization, “meaning is not a human-based notion; rather meaning is an ongoing performance of the world in its differential intelligibility...Intelligibility is not an inherent characteristic of humans but a feature of the world in its differential becoming. The world articulates itself differently...[H]uman concepts or experimental practices are not foundational to the nature of phenomena.” The world is immanently real and matter immanently materializes.
At first glance, this posthumanist understanding of reality seems consistent with Arendt’s own critique of Cartesian dualism and Newtonian physics and her understanding of the implicitly conditioned nature of human existence. “Men are conditioned beings because everything they come into contact with turns immediately into a condition of their existence. The world in which the vita activa spends itself consists of things produced by human activities; but the things that owe their existence exclusively to men nevertheless constantly condition their human makers.” Nonetheless, there is a profound difference between them. For Barad, “world” is not Arendt’s humanly built habitat, the domain of homo faber (which does not necessarily entail mastery of nature, but always involves a certain amount of violence done to nature, even to the point of “degrading nature and the world into mere means, robbing both of their independent dignity,” H.C., p. 156, emphasis added). “World” is matter, the physical, ever-changing reality of an inherently active, “larger material configuration of the world and its ongoing open-ended articulation.” Or is it?
Since this world is made demonstrably real or determinate only through the design of the right experiment, one that measures the effects of, or marks on, bodies or “measuring agencies” (such as a photographic plate) produced by “measured objects” (such as electrons), the physical nature of this reality becomes an effect of the experiment itself. Although Barad insists that “phenomena do not require cognizing minds for their existence” and that technoscientific practices merely manifest “an expression of the objective existence of particular material phenomena” (p. 361), the importance of the well-crafted scientific experiment to establishing the fact of matter looms large.
Why worry about the experiment as the basis for determining the nature of nature, including so-called “human nature”? For Arendt, the answer was clear: “The world of the experiment seems always capable of becoming a man-made reality, and this, while it may increase man’s power of making and acting, even of creating a world, far beyond what any previous age dared imagine...unfortunately puts man back once more—and now even more forcefully—into the prison of his own mind, into the limitations of patterns he himself has created...[A] universe construed according to the behavior of nature in the experiment and in accordance with the very principles which man can translate technically into a working reality lacks all possible representation...With the disappearance of the sensually given world, the transcendent world disappears as well, and with it the possibility of transcending the material world in concept and thought.”
The transcendence of representationalism does not trouble Barad, who sees “representation” as a process of reflection or mirroring hopelessly entangled with an outmoded “geometrical optics of externality.” But for Arendt, appearance matters, and not in the sense that a subject discloses some inner core of being through her speaking and doing, but in the sense that what is given to the senses of perception—and not just to the sense of vision—is the basis for constructing a world in common. The loss of this “sensually given world” found its monstrous enactment in the world of the extermination camps, which Arendt saw as “special laboratories to carry through its experiment in total domination.”
If there is a residual humanism in Arendt’s theorizing it is not the simplistic anthropocentrism, which takes “man as the measure of all things,” a position she implicitly rejects, especially in her critique of instrumentalism. Rather, she insists that “the modes of human cognition [science among them] applicable to things with ‘natural’ qualities, including ourselves to the limited extent that we are specimens of the most highly developed species of organic life, fail us when we raise the question: And who are we?” (H.C., p. 11, emphasis in the original) And then there is the question of responsibility.
We may be unable to control the effects of the actions we set in motion, or, in Barad’s words, “the various ontological entanglements that materiality entails.”
But no undifferentiated assignation of agency to matter, or material sedimentations of the past “ingrained in the body’s becoming,” can release us humans from the differential burden of consciousness and memory that is attached to something we call the practice of judgment. And no appeal to an “ethical call...written into the very matter of all being and becoming” will settle the question of judgment, of what is to be done. There may be no place to detach ourselves from responsibility, but how to act in the face of it is by no means given by the fact of entanglement itself. What if “everything is possible”?
—Kathleen B. Jones