“Scientific and philosophic truth have parted company.”
—Hannah Arendt, The Human Condition, 41.290
What can it mean that there are two different types of truth—scientific and philosophic? And how could they not be connected?
In the New York Review of Books, Sue Halpern argues that we should pay less attention to the character of actors like Edward Snowden and Glenn Greenwald and focus more on the governmental actions they have revealed. Yet much if not most of Halpern’s essay focuses on Snowden and Greenwald themselves, and the paragraph that stands out in Halpern’s essay goes directly to Snowden’s decision to leave the country and evade confronting the U.S. Government in court:
It is here that Edward Snowden’s story begins to sound much like those of Thomas Drake, William Binney, Kirk Wiebe, and Edward Loomis, longtime NSA employees who, a few years earlier than Snowden, attempted to raise concerns with their superiors—only to find themselves rebuffed—about what they perceived to be NSA overreach and illegality when they learned that the agency was indiscriminately monitoring the communications of American citizens without warrants. Binney, Wiebe, and Loomis resigned—and later found themselves the subjects of FBI interrogations. Drake, however, stayed on and brought his suspicions to the office of general counsel for the NSA, where he was told: “Don’t ask any more questions, Mr. Drake.” Frustrated, Drake eventually leaked what he knew to a reporter for The Baltimore Sun. The upshot: a home invasion by the FBI, a federal indictment, and the threat of thirty-five years in prison for being in possession of classified documents that, when he obtained them, had not been classified. After years of harassment by the government and Drake’s financial ruin, the case was dropped the night before trial. It was against this backdrop that Snowden found himself contemplating what to do with what he knew. Stymied by an unresponsive bureaucracy, seeing the fate of earlier NSA whistleblowers, and finding no adequate provisions within the system to challenge the legality of government activity if that activity was considered by the government to touch on national security, he nonetheless set about gathering the evidence to make his case.
For those who would defend Snowden, this narrative is essential. The claim is that the United States now is simply not like the United States of the 1960s and 1970s when Daniel Ellsberg gave himself up after releasing the Pentagon Papers. Ellsberg himself has made this argument while defending Snowden, arguing that Snowden and whistleblowers like him simply cannot and should not trust the U.S. government to treat them legally and humanely.
"Thinking in its non-cognitive, non-specialized sense as a natural need of human life, the actualization of the difference given in consciousness, is not a prerogative of the few but an everpresent faculty of everybody; by the same token, inability to think is not the “prerogative” of those many who lack brain power but the everpresent possibility for everybody—scientists, scholars, and other specialists in mental enterprises not excluded—to shun that intercourse with oneself whose possibility and importance Socrates first discovered."
--Hannah Arendt, “Thinking and Moral Considerations: A Lecture” (1971)
Published eight years after Eichmann in Jerusalem, “Thinking and Moral Considerations” is Arendt’s elaboration of her argument in that book that Adolf Eichmann’s criminal role in the Holocaust did not originate from any “base motives” or even from any motives at all, but from his “thoughtlessness” or “inability to think.” If, she asks, Eichmann’s crimes, which he committed over the course of years, resulted from the fact that he never paused to think, what exactly does it mean to think, and what is the relation between thinking and morality?
In the above quote, which appears on the penultimate page of the lecture, Arendt defines thinking—or the kind of thinking that she argues is necessary for morality—as “the actualization of the difference given in consciousness,” as “that intercourse with oneself whose possibility and importance Socrates first discovered.” She describes this “non-cognitive, non-specialized” kind of thinking both as “a natural need of human life” and as “an everpresent faculty of everybody.” By contrast, she defines “inability to think” as the everpresent possibility for everybody to shun thinking.
We might wonder at this point why Arendt does not simply speak of an “ability not to think,” an ability to (actively) shun thinking, rather than an “inability to think.” Is this because she wants to maintain a hierarchy between something that is natural and human (thinking) and something that is unnatural and inhuman (not thinking)? What would be the justification for such a hierarchy? Or does she want to suggest that Eichmann has become unable to think (through barbarous “nurture”), losing touch with his (nevertheless everpresent) faculty of thinking, which everybody has from birth (“nature”) or from the moment they learn to speak? Thinking and language are intrinsically connected from the first page of Arendt’s lecture, where the primary evidence of Eichmann’s inability to think is that he speaks in clichés. (Also, the lecture is dedicated to a poet, W.H. Auden.) Finally, how does Arendt’s description of thinking as a “natural need of human life” relate to her suggestion that Socrates did not merely discover the importance but the very possibility of thinking?
Arendt casts Socrates as “a model, (…) an example that, unlike the ‘professional’ thinkers, could be representative for our ‘everybody,’ (…) a man who counted himself neither among the many nor among the few (…).” She takes Socrates not as “a personified abstraction with some allegorical meaning ascribed to it,” but as an “ideal type” who “was chosen out of the crowd of living beings, in the past or the present, because he possessed a representative significance in reality which only needed some purification in order to reveal its full meaning.” What, then, is this representative significance?
Arendt bases her conception of thinking and its relation to morality primarily on two famous propositions that Socrates puts forward in the Gorgias: “It is better to be wronged than to do wrong,” and “It would be better for me that my lyre or a chorus I directed should be out of tune and loud with discord, and that multitudes of men should disagree with me rather than that I, being one, should be out of harmony with myself and contradict me” (Arendt’s emphases). According to Arendt, these propositions are not primarily “cogitations about morality” but “insights of experience,” of the experience of the process of thinking. Arendt claims that Socrates means by the first proposition that it is better for him to be wronged than to do wrong if he is thinking, because in thinking you are carrying on a dialogue with yourself, which presupposes some friendship between the partners in the thinking dialogue. You would not want to be friends and enter into a dialogue with someone who does wrong, and since Socrates presupposes that the unexamined life is not worth living, doing wrong leads to a life that is not worth living because examining it in thinking is no longer possible.
Arendt argues that conscience is a “by-product” of consciousness, of the actualization of the difference of me and myself in thinking, because: “What makes a man fear his conscience is the anticipation of the presence of a witness who awaits him only if and when he goes home” (Arendt’s emphasis). However, this formulation suggests that there is no reason to fear your conscience if you never go “home,” that is, if you never engage in the activity of thinking, which, according to Arendt, was precisely Eichmann’s problem. What, then, determines whether someone uses her faculty of thinking or realizes the everpresent possibility of not thinking?
Arendt’s lecture does not contain a strong answer to this question. But although the relation between phenomenological description and normative argument in this lecture remains somewhat unclear, the lecture seems to contain a defense of thinking and a “demand” that everybody think, that everybody aspire to some extent to the ideal-type represented by Socrates, because only thinking can provide an antidote to the “banality of evil.” Arendt acknowledges that thinking can lead to license, cynicism, and nihilism through the relativizing of existing values, because “all critical examinations must go through a stage of at least hypothetically negating accepted opinions and ‘values’ by finding out their implications and tacit assumptions.” However, Arendt’s anti-elitist suggestion is that the problem of nihilism is never that too many people think or that people think too much, but rather that people do not think enough.
Yet Arendt does not tell us what would promote thinking. She does not propose, for instance, to generalize the teaching of thinking through educational institutions, the way that Adorno proposed to create “mobile educational groups” of volunteers to teach “critical (…) self-reflection” to everybody in his 1966 radio talk, “Education After Auschwitz.” A Habermasian model, in which people become critical through participation in democratic politics, is unavailable to Arendt, given the strong opposition she draws between thinking and politics, which belongs to the realm of action. What Arendt does tell us is what is conducive to actualizing the everpresent possibility of not thinking: “(…) general rules which can be taught and learned until they grow into habits that can be replaced by other habits and rules,” the way that Eichmann, as Arendt argues in Eichmann in Jerusalem, simply substituted the duty to do the Führer’s will for Kant’s categorical imperative.
“To find yourself, think for yourself.”
"Seen from the perspective of the "real" world, the laboratory is the anticipation of a changed environment."
-Hannah Arendt, The Life of the Mind
I find this quote intriguing in that its reference to environments and environmental change speaks to the fact that Arendt's philosophy was essentially an ecological one, indeed one that is profoundly media ecological. The quote appears in a section of The Life of the Mind entitled "Science and Common Sense," in which Arendt argues that the practice of science is quite distinct from thinking as a philosophical activity.
As she explains:
Thinking, no doubt, plays an enormous role in every scientific enterprise, but it is a role of a means to an end; the end is determined by a decision about what is worthwhile knowing, and this decision cannot be scientific.
Here Arendt invokes a variation on Gödel's incompleteness theorem in mathematics, noting that science cannot justify itself on scientific grounds, but rather must somehow depend on something outside of and beyond itself. Perhaps more to the point, science, especially as associated with empiricism, cannot be divorced from concrete reality, and does not function only in the abstract realm of ideas that Plato insisted was the only true reality.
The transformation of truth into mere verity results primarily from the fact that the scientist remains bound to the common sense by which we find our bearings in a world of appearances. Thinking withdraws radically and for its own sake from this world and its evidential nature, whereas science profits from a possible withdrawal for the sake of specific results.
It is certainly the case that scientific truth is always contingent, tentative, open to refutation, as Karl Popper explained. Scientific truth is never absolute, never anything more than a map of some other territory, a map that needs to be continually tested and reviewed, updated and revised, as Alfred Korzybski explained by way of establishing his discipline of general semantics. Even the so-called laws of nature and physics need not be considered immutable, but may be subject to change and evolution, as Lee Smolin argues in his insightful book, Time Reborn.
Scientists are engaged in the process of abstracting, insofar as they take the data gained by empirical investigation and make generalizations in the form of theories and hypotheses, but this process of induction cannot be divorced from concrete reality, from the world of appearances. Science may be used to test, challenge, and displace common sense, but it operates on the same level, as a distilled form of common sense, rather than something qualitatively different, a status Arendt reserves for the special activity of thinking associated with philosophy.
Arendt goes on to argue that both common sense and scientific speculation lack "the safeguards inherent in sheer thinking, namely thinking's critical capacity." These safeguards include the capacity for moral judgment, whose absence became horrifically evident in the ways that Nazi Germany used science to justify its genocidal policies and actions. Auschwitz did not represent a retrieval of tribal violence, but one of the ultimate expressions of the scientific enterprise in action. And the same might be said of Hiroshima and Nagasaki, holding aside whatever might be said to justify the use of the atomic bomb to bring the Second World War to a speedy conclusion. In remaining close to the human lifeworld, science abandons the very capacity that makes us human, that makes human life and human consciousness unique.
The story of modern science is in fact a story of shifting alliances. Science begins as a branch of philosophy, as natural philosophy. Indeed, philosophy itself is generally understood to begin with the pre-Socratics sometimes referred to as the Ionian physicists, such as Thales, Anaximander, and Heraclitus, who first posited the concept of elements and atoms. Both science and philosophy therefore coalesce during the first century that followed the introduction of the Greek alphabet and the emergence of a literate culture in the ancient Greek colonies in Asia Minor.
And just as ancient science is alphabetic in its origins, modern science begins with typography, as the historian Elizabeth Eisenstein explains in her exhaustive study, The Printing Press as an Agent of Change in Early Modern Europe. Because printed books made the writings of natural philosophers easily available, scholars were able to compare and contrast what different philosophers had to say about the natural world, and to uncover their differences of opinion and contradictions. This in turn spurred them on to find out for themselves which of the various competing explanations was correct, where the truth lay, so that more reading led to even more empirical research, which in turn would have to be published, that is, made public, via printing, for the purposes of testing and confirmation. And publication encouraged the formation of a scientific republic of letters, a typographically mediated virtual community.
Eisenstein notes that during the first century following Gutenberg, printed books gave Copernicus access to centuries of recorded observations of the movements of celestial objects, access not easily available to his predecessors. What is remarkable to consider is that the telescope was not invented in his lifetime, that the Polish astronomer arrived at his heliocentric view based only on what could be observed by the naked eye, by gazing up at the heavens, and down at the printed page. The typographic revolution that began in the 15th century was the necessary technological precondition for the Copernican revolution of the 16th century. The telescope, a tool to extend vision beyond its natural capabilities, was not required, although soon after its introduction Galileo was able to confirm the theory that Copernicus had put forth decades earlier.
In the restricted literate culture of medieval Europe, the idea took hold that there are two books to be studied in an effort to discern the divine will and mind: the book of scripture and the book of nature. Both books were seen as sources of knowledge that could be unlocked by a process of reading and interpretation. It was grammar, the ancient study of language and one third of the trivium that formed the foundational curriculum of the medieval university, that became the basis of modern science, and not dialectic or logic, that is, pure thinking, which is the source of the philosophic tradition, as Marshall McLuhan noted in The Classical Trivium. The medieval schoolmen of course placed scripture in the primary position, whereas modern science situates truth in the book of nature alone.
The publication of Francis Bacon's Novum Organum in 1620 first formalized the separation of science from philosophy within print culture, but the divorce was finalized during the 19th century, coinciding with the industrial revolution, as researchers became known as scientists rather than natural philosophers. In place of the alliance with philosophy, science came to be associated with technology; before this time, technology and engineering, often referred to as mechanics, represented an entirely different line of inquiry, utterly practical and often intuitive rather than systematic. Mechanics was part of the world of work rather than that of action, to use the terms Arendt introduced in The Human Condition, which is to say that it was seen as the work of the hand rather than the mind. By the end of the 19th century, scientific discovery emerged as the main source of major technological breakthroughs, rather than innovation springing fully formed from the tinkering of inventors, and it became necessary to distinguish between applied science and theoretical science, the latter nonetheless still tied to the world of appearances.
Today, the acronym STEM, which stands for science, technology, engineering, and mathematics, has become a major buzzword in education, a major emphasis in particular for higher education, and a major concern with regard to economic competitiveness. We might well take note of how recent this combination of fields and disciplines really is, insofar as mathematics represents pure logic and highly abstract forms of thought, and science was once a purely philosophical enterprise, both aspects of the life of the mind. Technology and engineering, on the other hand, for most of our history took the form of arts and crafts, part of the world of appearances.
The convergence of science and technology also had much to do with scientists' increasing reliance on scientific instruments for their investigations, a trend increasingly prevalent following the introduction of both the telescope and the microscope in the early 17th century, a trend even more apparent from the 19th century on. The laboratory is in fact another such instrument, a technology whose function is to provide precisely controlled conditions, beyond its role as a facility for the storage and use of other scientific instruments. Scientific instruments are media that extend our senses and allow us to see the world in new ways, therefore altering our experience of our environment, while the discoveries they lead to provide us with the means of altering our environments physically. And the laboratory is an instrument that provides us with a total environment, enclosed, controlled, isolated from the world to become in effect the world. It is a micro-environment where experimental changes can be made that anticipate changes that can be made to the macro-environment we regularly inhabit.
The split between science and philosophy can also be characterized as a division between the eye and the ear. Modern science, as intimately bound up in typography, is associated with visualism, the idea that seeing is believing, that truth is based on vision, that knowledge can be displayed visually as an organized set of facts, rather than the product of ongoing dialogue and debate. McLuhan noted the importance of the fixed point of view as a by-product of training the eye to read, and Walter Ong studied the paradigm shift in education attributed to Peter Ramus, who introduced pedagogical methods we would today associate with textbooks, outlining, and the visual display of information. Philosophy has not been immune to this influence, but retains a connection to the oral-aural mode through the method of Socratic dialogue, and by way of an understanding of the history of ideas as an ongoing conversation. Arendt, in The Human Condition, explained action, the realm of words, as a social phenomenon, one based on dialogic exchanges of ideas and opinions, not a solitary matter of looking things up. And thinking, which she elevates above the scientific enterprise in The Life of the Mind, is mostly a matter of an inner dialogue, or monologue if you prefer, of hearing oneself think, of silent speech, and not of a mental form of writing out words or imaginary reading. We talk things out, to others and/or to ourselves.
Science, on the other hand, is all about visible representations, as words, numbers, illustrations, tables, graphs, charts, diagrams, etc. And it is the investigation of visible phenomena, or otherwise of phenomena that can be rendered visible through scientific instruments. Acoustic phenomena can only be dealt with scientifically by being turned into a visual measurement, either of numbers or of lines going up and down to depict sound waves. The same is true for the other senses; smell, taste, and touch can only be dealt with scientifically through visual representation. Science cannot deal with any sense other than sight on its own terms, but always requires an act of translation into visual form. Thus, Arendt notes that modern science, being so intimately bound up in the world of appearances, is often concerned with making the invisible visible:
That modern science, always hunting for manifestations of the invisible—atoms, molecules, particles, cells, genes—should have added to the world a spectacular, unprecedented quantity of new perceptible things is only seemingly paradoxical.
Arendt might well have noted the continuity between the modern activity of making the invisible visible as an act of translation and the medieval alchemist's search for methods of achieving material transformation, the translation of one substance into another. She does note that the use of scientific instruments is a means of extending natural functions, paralleling McLuhan's characterization of media as extensions of body and biology:
In order to prove or disprove its hypotheses… and to discover what makes things work, it [modern science] began to imitate the working processes of nature. For that purpose it produced the countless and enormously complex implements with which to force the non-appearing to appear (if only as an instrument-reading in the laboratory), as that was the sole means the scientist had to persuade himself of its reality. Modern technology was born in the laboratory, but this was not because scientists wanted to produce appliances or change the world. No matter how far their theories leave common-sense experience and common-sense reasoning behind, they must finally come back to some form of it or lose all sense of realness in the object of their investigation.
Note here that our conception of reality, that is, what lends something the aura of authenticity, as Walter Benjamin might put it, is dependent on the visual sense, on the phenomenon being translated into the world of appearances (the aura as opposed to the aural). It is no accident, then, that there is a close connection in biblical literature and the Hebrew language between the words for spirit and soul and the words for invisible but audible phenomena such as wind and breath, breath in turn being the basis of speech (and this is not unique to Hebraic culture or vocabulary). It is at this point that Arendt resumes her commentary on the function of the controlled environment:
And this return is possible only via the man-made, artificial world of the laboratory, where that which does not appear of its own accord is forced to appear and to disclose itself. Technology, the "plumber's" work held in some contempt by the scientist, who sees practical applicability as a mere by-product of his own efforts, introduces scientific findings, made in "unparalleled insulation… from the demands of the laity and of everyday life," into the everyday world of appearances and renders them accessible to common-sense experience; but this is possible only because the scientists themselves are ultimately dependent on that experience.
We now reach the point in the text where the quote I began this essay with appears, as Arendt writes:
Seen from the perspective of the "real" world, the laboratory is the anticipation of a changed environment; and the cognitive processes using the human abilities of thinking and fabricating as means to their end are indeed the most refined modes of common-sense reasoning. The activity of knowing is no less related to our sense of reality and no less a world-building activity than the building of houses.
Again, for Arendt, both science and common sense are distinct in this way from the activity of pure thinking, which can provide a sorely needed critical function. But her insight into the function of the laboratory as an environment in which the invisible is made visible is important, in that it helps us to understand that the laboratory is, in fact, what McLuhan referred to as a counter-environment or anti-environment.
In our everyday environment, the environment itself tends to be invisible, if not literally so, then functionally, insofar as whatever fades into the background tends to fall out of our perceptual awareness or is otherwise ignored. Anything that becomes part of our routine falls into this category, becoming environmental, and therefore subliminal. And this includes our media, technology, and symbol systems, insofar as they are part of our everyday world. We do pay attention to them when they are brand new and unfamiliar, but once their novelty wears off they become part of the background, unless they malfunction or break down. In the absence of such conditions, we need an anti-environment to provide a contrast through which we can recognize the things we take for granted in our world, to provide a place to stand from which we can observe our situation from the outside in, from a relatively objective stance. We are, in effect, sleepwalkers in our everyday environment, and entering into an anti-environment is a way to wake us up, to enhance awareness and consciousness of our surroundings. This occurs, in a haphazard way, when we return home after spending time experiencing another culture, as for a brief time much of what was once routinized about our own culture suddenly seems strange and arbitrary to us. The effect wears off relatively quickly, although the after-effects of broadening our minds in this way can be significant.
The controlled environment of the laboratory helps to focus our attention on phenomena that are otherwise invisible to us, either because they are taken for granted, or because they require specialized instrumentation to be rendered visible. It is not just that such phenomena are brought into the world of appearances, however, but also that they are made into objects of concerted study, to be recorded, described, measured, experimented upon, etc.
McLuhan emphasized the role of art as an anti-environment. The art museum, for example, is a controlled environment, and the painting that we encounter there has the potential to make us see things we had never seen before, by which I mean not just objects depicted that are unfamiliar to us, but familiar objects depicted in unfamiliar ways. In this way, works of art are instruments that can help us to see the world anew, to use our senses and perceive in new and different ways. McLuhan believed that artists served as a kind of distant early warning system, borrowing Cold War terminology to refer to their ability to anticipate changes occurring in the present that most others are not aware of. He was fond of the Ezra Pound quote that the artist is the antenna of the race, and Kurt Vonnegut expressed a similar sentiment in describing the writer as a canary in a coal mine. We may further consider the art museum or gallery or library as a controlled environment, a laboratory of sorts, and note the parallel in the idea of art as the anticipation of a changed environment.
There are other anti-environments as well. Houses of worship function in this way, often because they are based on earlier eras and different cultures, and otherwise because they are constructed to remove us from our everyday environment and help us to see the world in a different light. They are in some way dedicated to making the invisible world of the spirit visible to us through the use of sacred symbols and objects, even for religions whose concept of God is one that is entirely outside of the world of appearances. Sanctuaries might therefore be considered laboratories used for moral, ethical, and sacred discovery, experimentation, and development, and places where changed environments are also anticipated, in the form of spiritual enlightenment and the pursuit of social justice. This also suggests that the scientific laboratory might be viewed, in a certain sense, as a sacred space, along the lines that Mircea Eliade discusses in The Sacred and the Profane.
The school and the classroom are also anti-environments, or at least ought to be, as Neil Postman argued in Teaching as a Conserving Activity. Students are sequestered away from the everyday environment, into a controlled situation where the world they live in can be studied and understood, and phenomena that are taken for granted can be brought into conscious awareness. It is indeed a place where the invisible can be made visible. In this sense, the school and the classroom are laboratories for learning, although the metaphor can be problematic when it is used to imply that the school is only about the world of appearances, and that all that is needed is to let students discover that world for themselves. Exploration is indeed essential, and discovery is an important component of learning. But the school is also a place where we may engage in the critical activity of pure thinking, of critical reasoning, of dialogue and disputation.
The classroom is more than a laboratory, or at least it must become more than a laboratory, or the educational enterprise will be incomplete. The school ought to be an anti-environment, not only in regard to the everyday world of appearances and common sense, but also to that special world dominated by STEM, by science, technology, engineering and math. We need the classroom to be an anti-environment for a world subject to a flood of entertainment and information, we need it to be a language-based anti-environment for a world increasingly overwhelmed by images and numbers. We need an anti-environment where words can take precedence, where reading and writing can be balanced by speech and conversation, where reason, thinking, and thinking about thinking can allow for critical evaluation of common sense and common science alike. Only then can schools be engaged in something more than just adjusting students to take their place in a changed and changing environment, integrating them within the technological system, as components of that system, as Jacques Ellul observed in The Technological Society. Only then can schools help students to change the environment itself, not just through scientific and technological innovation, but through the exercise of values other than the technological imperative of efficiency, to make things better, more human, more life-affirming.
The anti-environment that we so desperately need is what Hannah Arendt might well have called a laboratory of the mind.
This Quote of the Week was originally published on September 3, 2012.
It can be dangerous to tell the truth: “There will always be One against All, one person against all others. [This is so] not because One is terribly wise and All are terribly foolish, but because the process of thinking and researching, which finally yields truth, can only be accomplished by an individual person. In its singularity or duality, one human being seeks and finds – not the truth (Lessing) – but some truth.”
-Hannah Arendt, Denktagebuch, Book XXIV, No. 21
Hannah Arendt wrote these lines when she was confronted with the severe and often unfair, even slanderous, public criticism launched against her and her book Eichmann in Jerusalem after its publication in 1963. The quote points to her understanding of the thinking I (as opposed to the acting We) on which she bases her moral and, partly, her political philosophy.
This is the thinking I, defined with Kant as selbstdenkend (self-thinking [“singularity”]) and an-der-Stelle-jedes-andern-denkend (i.e., in Arendt’s terms, thinking representatively or practicing the two-in-one [“duality”]). Her words also hint at an essay she published in 1967 titled “Truth and Politics,” wherein she takes up the idea that it is dangerous to tell the truth, factual truth in particular, and considers the teller of factual truth to be powerless. Logically, the All are the powerful, because they may determine what at a specific place and time is considered to be factual truth; their lies, in the guise of truth, constitute reality. Thus, it is extremely hard to fight them.
In answer to questions posed in 1963 by the journalist Samuel Grafton regarding her report on Eichmann and published only recently, Arendt states: “Once I wrote, I was bound to tell the truth as I see it.” The statement reveals that she was quite well aware of the fact that her story, i.e., the result of her own thinking and researching, was only one among others. She also realized the lack of understanding and, in many cases, of thinking and researching, on the part of her critics.
Thus, she lost any hope of being able to publicly debate her position in a “real controversy,” as she wrote to Rabbi Hertzberg (April 8, 1966). By the same token, she determined that she would not entertain her critics, as Socrates did the Athenians: “Don’t be offended at my telling you the truth.” Reminded of this quote from Plato’s Apology (31e) in a supportive letter from her friend Helen Wolff, she acknowledged the reference, but acted differently. After having made up her mind, she wrote to Mary McCarthy: “I am convinced that I should not answer individual critics. I probably shall finally make, not an answer, but a kind of evaluation of this whole strange business.” In other words, she did not defend herself in following the motto “One against All,” which she had perceived and noted in her Denktagebuch. Rather, as announced to McCarthy, she provided an “evaluation” in the 1964 preface to the German edition of Eichmann in Jerusalem and later when revising that preface for the postscript of the second English edition.
Arendt also refused to act in accordance with the old saying: Fiat iustitia, et pereat mundus (let there be justice, though the world perish). She writes – in the note of the Denktagebuch from which today’s quote is taken – that such acting would reveal the courage of the teller of truth “or, perhaps, his stubbornness, but neither the truth of what he had to say nor even his own truthfulness.” Thus, she rejected an attitude known in German cultural tradition under the name of Michael Kohlhaas. A horse trader living in the 16th century, Kohlhaas became known for endlessly and in vain fighting injustice done to him (two of his horses were stolen on the order of a nobleman) and finally taking the law into his own hands by setting fire to houses in Wittenberg.
Even so, Arendt has been praised as a woman of “intellectual courage” with regard to her book on Eichmann (see Richard Bernstein’s contribution to Thinking in Dark Times).
Intellectual courage based on thinking and researching was rare in Arendt’s time and has become even rarer since then. But should Arendt therefore only matter nostalgically? Certainly not. Her emphasis on the benefits of thinking as a solitary business still remains current. Consider, for example, the following reference to Sherry Turkle, a sociologist at MIT and author of the recent book Alone Together. In an interview with Peter Haffner (published on July 27, 2012, in SZ Magazin), she argues that individuals who become absorbed in digital communication lose crucial components of their faculty of thinking. Turkle says (my translation): Students who spend all their time and energy on communication via SMS, Facebook, etc. “can hardly concentrate on a particular subject. They have difficulty thinking a complex idea through to its end.” No doubt, this sounds familiar to all of us who know about Hannah Arendt’s effort to promote thinking (and judging) in order to make our world more human.
To return to today’s quote: It can be dangerous to tell the truth, but thinking is dangerous too. Once in a while, not only the teller of truth but the thinking 'I' as well may find himself or herself in the position of One against All.
Peter Ludlow in the Stone remarks on the generational divide in attitudes towards whistleblowers, leakers, and hackers. According to Time Magazine, “70 percent of those age 18 to 34 sampled in a poll said they believed that Snowden ‘did a good thing’ in leaking the news of the National Security Agency’s surveillance program.” This fits a general trend, one heralded by Rick Falkvinge—founder of the European Pirate Parties—at the Hannah Arendt Center Conference last year, that young people value transparency above institutional democratic procedures. As trust in government and institutions declines, there is a decided shift towards a faith in transparency and unfettered disclosure. Those who expose such information are lauded for their courage in the name of the freedom of information.
Ludlow agrees and cites Hannah Arendt’s portrait of Adolf Eichmann for support of his contention that leakers like Edward Snowden and Chelsea Manning acted justly and courageously:
“In “Eichmann in Jerusalem,” one of the most poignant and important works of 20th-century philosophy, Hannah Arendt made an observation about what she called “the banality of evil.” One interpretation of this holds that it was not an observation about what a regular guy Adolf Eichmann seemed to be, but rather a statement about what happens when people play their “proper” roles within a system, following prescribed conduct with respect to that system, while remaining blind to the moral consequences of what the system was doing — or at least compartmentalizing and ignoring those consequences.”
Ludlow insists: “For the leaker and whistleblower the answer to [those who argue it is hubris for leakers to make the moral decision to expose wrongdoing], is that there can be no expectation that the system will act morally of its own accord. Systems are optimized for their own survival and preventing the system from doing evil may well require breaking with organizational niceties, protocols or laws. It requires stepping outside of one’s assigned organizational role.” In other words, bureaucratic systems have every incentive to protect themselves, thus leading to both dysfunction and injustice. We depend upon the actions of individuals who say simply: “No, I can’t continue to allow such injustice to go on.” Whistleblowers and leakers are essential parts of any just bureaucratic organization.
Ludlow’s insight is an important one: It is that the person who thinks for himself and stands alone from the crowd can—in times of crisis when the mass of people are thoughtlessly carried away by herd instincts and crowd mentality—act morally simply by refusing to go along with the collective performance of injustice. The problem is that if Snowden and Manning had simply resigned, their acts of resistance would have had minimal impact. To make a difference and to act in the name of justice, they had to release classified material. In effect, they had to break the law. Ludlow’s claim is that they did so morally and in the name of justice.
But is Ludlow correct to enlist Arendt in support of leakers such as Snowden and Manning? It is true that Arendt deeply understands the importance of individuals who resist the easy path of conformity in the name of doing right. Perhaps nowhere is the importance of such action made more manifest than in her telling of the moment when the name of Anton Schmidt appeared in the testimony at the Eichmann trial:
At this slightly tense moment, the witness happened to mention the name of Anton Schmidt, a Feldwebel, or sergeant, in the German Army - a name that was not entirely unknown to this audience, for Yad Vashem had published Schmidt's story some years before in its Hebrew Bulletin, and a number of Yiddish papers in America had picked it up. Anton Schmidt was in charge of a patrol in Poland that collected stray German soldiers who were cut off from their units. In the course of doing this, he had run into members of the Jewish underground, including Mr. Kovner, a prominent member, and he had helped the Jewish partisans by supplying them with forged papers and military trucks. Most important of all: "He did not do it for money." This had gone on for five months, from October, 1941, to March, 1942, when Anton Schmidt was arrested and executed. (The prosecution had elicited the story because Kovner declared that he had first heard the name of Eichmann from Schmidt, who had told him about rumors in the Army that it was Eichmann who "arranges everything.") ….
During the few minutes it took Kovner to tell of the help that had come from a German sergeant, a hush settled over the courtroom; it was as though the crowd had spontaneously decided to observe the usual two minutes of silence in honor of the man named Anton Schmidt. And in those two minutes, which were like a sudden burst of light in the midst of impenetrable, unfathomable darkness, a single thought stood out clearly, irrefutably, beyond question - how utterly different everything would be today in this courtroom, in Israel, in Germany, in all of Europe, and perhaps in all countries of the world, if only more such stories could have been told.
For Arendt, great civil disobedients from Socrates to Thoreau play important and essential roles in the political realm. What is more, Arendt fully defends Daniel Ellsberg’s release of the Pentagon Papers. It seems, therefore, that it is appropriate to enlist her in support of the modern day whistleblowers.
There is, however, a problem with this reading. Socrates, Thoreau, and Ellsberg all gave themselves up to the law and allowed themselves to be judged by and within the legal system. In this regard, they differ markedly from Snowden, Manning and others who have sought to remain anonymous or to flee legal judgment. For Arendt, this difference is meaningful.
Consider the case of Shalom Schwartzbard, which Arendt addresses in Eichmann in Jerusalem. Schwartzbard was a Jew who assassinated the leader of Ukrainian pogroms in the streets of Paris. Schwartzbard stood where he took his revenge, waited for the police, admitted his act of revenge, and put himself on trial. He claimed to have acted justly at a time when the legal system was refusing to do justice. And a French jury acquitted him.
For Arendt, the Schwartzbard case stands for an essential principle of justice: that to break the law and act justly, one must then bring oneself back into the law. She writes:
He who takes the law into his own hands will render a service to justice only if he is willing to transform the situation in such a way that the law can again operate and his act can, at least posthumously, be validated.
What allows Schwartzbard to serve the end of justice is that he took the risk of putting himself on trial and asked a court of law and a jury to determine whether what he did was just, even if it were also illegal. By doing so, Schwartzbard not only claimed that his act was a matter of personal conscience; he insisted as well that it was legal if one understood the laws rightly. He asked the representatives of the law—the French jury—to publicly agree with his claim and to vindicate him. He had no guarantee they would do so. When they did, their judgment brought the justice of Schwartzbard’s act to the bright light of the public and also cast the legal system’s inaction—its refusal to arrest war criminals living openly in Paris—in the shadow of darkness.
When I have suggested to colleagues and friends that Snowden’s flight to Moscow and his refusal to stand trial makes it impossible to see his release of the NSA documents as an act of justice, their response mirrors the argument made by Daniel Ellsberg. Ellsberg—who turned himself over to the police after releasing the Pentagon Papers—has defended Snowden’s decision to flee. The United States of 2013, he argues, is simply no longer the United States of the 1960s. When Ellsberg turned himself in, he was released on bail and given legal protections. He has no faith that the legal system today would treat Snowden with such respect. More likely Snowden would be imprisoned, possibly in solitary confinement. Potentially he would be tortured. There is every reason to believe, Ellsberg and others argue, that Snowden would not receive a fair trial. Under such circumstances, Snowden’s flight is, these supporters argue, justifiable.
I fully admit that it is likely that Snowden would have been treated much less generously than was Ellsberg. But aside from the fact that Snowden never gave the courts the chance to treat him justly, his refusal to submit to the law makes it impossible for his act of disobedience to shine forth as a claim of doing justice. He may claim that he acted in the public interest. He may argue that he acted out of conscience. And he may say he wants a public debate about the rightness of U.S. policy. He may be earnest in all these claims. But the fact that he fled and did not “transform the situation in such a way that the law can again operate and his act can be validated” means that he does not, in the end, “render a service to justice.” On the contrary, by fleeing, Snowden gives solace to those who portray him as a criminal and makes it easier for those who would discredit him.
All of this is not to say that Snowden was wrong to release the NSA documents. It is clearly the case that the security state has gone off the rails and become encased in a bubble of fearful conformity that justifies nearly any act in the name of security. We do need such a public conversation about these policies and to the extent that Snowden and Manning have helped to encourage one, I am thankful to them. That said, Manning’s anonymity and Snowden’s flight have actually distracted attention from the question of the justice of their acts and focused attention instead on their motives and personal characters. They have, by resisting the return to law, diluted their claims to act justly.
It is a lot to ask that someone risk their life to act justly. But the fact that justice asks much of us is fundamental to the nature of justice itself: That justice, as opposed to legality, is always extreme, exceptional, and dangerous. Arendt knew well that those who act justly may lose their life, as did Socrates and Anton Schmidt. She knew well that those who act justly may lose their freedom, like Nelson Mandela. But she also knew that even those who die or are isolated will, by their courage in the service of justice, shine light into a world of shadows.
Peter Ludlow’s essay on the Banality of Systematic Evil is well worth reading. He is right that it is important for individuals to think for themselves and be willing to risk civil disobedience when they are convinced that bureaucracies have lost their moral bearings. It is your weekend read. And if you want to read more about Arendt and the demands of justice, take a look at this essay on Arendt’s discussion of the Shalom Schwartzbard case.
"It is better for you to suffer than to do wrong because you can remain the friend of the sufferer; who would want to be the friend of and have to live together with a murderer? Not even a murderer. What kind of dialogue could you lead with him? Precisely the dialogue which Shakespeare let Richard III lead with himself after a great number of crimes had been committed:
What do I fear? Myself? There’s none else by.
Richard loves Richard: that is, I am I.
Is there a murderer here? No. Yes, I am:
Then fly. What from myself?"
-Hannah Arendt, ‘Thinking and Moral Considerations’
‘Thinking and Moral Considerations’ is one of the most perfect examples of Arendt’s late writing. A distillation of her career-long thinking on thinking, the essay performs what it so elegantly urges: it is an essay on thinking that thinks.
For Arendt, the moral considerations that follow from thinking and, more grievously, from not thinking are profound. Adolf Eichmann’s “quite authentic inability to think” demonstrated to Arendt the arrival of a new kind of evil in the world when she attended his trial in 1961. The airy emptiness of his speech was not the stupidity of a loathsome toad: his jabbering of cliché falling upon cliché sounded totalitarianism’s evil in a chorus of thoughtlessness. Shallowness as exemplified by Eichmann cannot be fixed or given depth by reason; no doctrine will argue the thoughtless into righteousness. Only through the experience of thinking, Arendt insisted, of being in dialogue with oneself, can conscience again be breathed into life. Thinking may be useless in itself; it may be a solitary activity that can often feel a little bit mad. Yet thinking is the precondition for the return of judgment, of knowing and saying: “this is not right.” By 1971, Arendt saw no evidence of a resurgence of thinking in the wake of atrocity.
Writing an essay on thinking that thinks and thus performing the experience of thinking is itself an act of defiance. Performing is the right verb here: Arendt knows she is staging her argument as a public spectacle. Her hero is Socrates: gadfly, midwife, stingray, provoker, deliverer and galvaniser of thinking in others. Socrates democratises perplexity. And when he has finished chatting with others, he carries on talking at home, with his quizzical, critical companion, that ‘obnoxious fellow’ with whom we are forever in dialogue -- the two with whom we make a thinking one. Arendt is fully aware that she is making a character out of Socrates. His inveterate dialogism is a model. Just as Dante’s characters conserve as much historical reality as the poet needs to make them representative, so too, she says, with her Socrates. Against the vacant image of Eichmann inanely mouthing his own eulogy in front of the hangman’s noose which opens the essay, we have Socrates: thoughtlessness versus thoughtfulness.
But what of the third character in Arendt’s essay—Shakespeare’s Richard III? The murderer who nobody wants to befriend? The villain who despite his best efforts cannot stop talking to himself?
Richard plays an odd, yet pivotal, role in Arendt’s performance of thinking. On the one hand, he is Socrates’ evil twin. Richard rejects conscience. ‘Every man that means to live well endeavours … to live without it’, he says. This is easy enough to do, says Arendt, because ‘all he has to do is never go home and examine things.’ Except, in Richard’s case, this proves difficult. He may try to avoid going home, but eventually he runs into himself at midnight; and in solitude, like Socrates, Richard cannot help but have intercourse with himself. Alone he speaks with himself in soliloquies (from the Latin solus, alone, and loqui, to speak; Arendt’s beloved Augustine is believed to have first conceived the compound). And this is what makes this villain—one who many have wanted to claim for the calculating murderousness of the twentieth century—much more like Socrates than Eichmann.
Both Socrates and Richard have the capacity to think. True, Richard thinks himself into villainy—he ‘proves himself a villain’—but this is precisely his pathos in Arendt’s drama. If it is better to suffer than to do harm, it is also better to have suffered at the hands of Richard who at least thought about what he was doing, than suffered as a number in one of Eichmann’s filing cards, the pathetic loner who joins a murderous movement not because he’s frightened of who might await him at home, but because he doesn’t even suspect anyone might be there in the first place. For all the ham-fisted productions that want him to be, Richard is not a Nazi villain in early modern disguise. Better that he could have been, of course, because then we wouldn’t have to contemplate the particular thoughtlessness of contemporary evil.
Richard is no Osama Bin Laden, Colonel Gaddafi or Saddam Hussein either, despite comparable violent last stands (and the corpse lust that attended them). This is well understood by Mark Rylance’s recent performance of Richard in the Globe Theater production that played in London last year and that is rumoured to open on Broadway soon. Rylance’s performance of Richard is like no other. It is also a performance that makes Arendt’s thinking more relevant than ever.
Rylance understands that since the War on Terror, post 9/11, Iraq, Afghanistan, after Guantanamo, rendition and drone wars, it would be a travesty to play Richard’s villainy as safely and exotically other (by contrast, in 1995 it was entirely possible to set the play in a 1930s Nazi context, and have Ian McKellen play the role for its cruel humour with a knowing nod to Brecht). Rylance’s Richard is plausible, pathetic even; he is compelling not in his all-too-evident evil but in his clumsy vulnerability. His creepy teeth sucking and ever-twisting body mark a silent but persistent cogitation; he is a restless, needy villain. Like a child, Rylance’s Richard grabs at his conscience—he thinks—and then chucks it away as one more ‘obstacle’, just as he spits in his mother’s face at the very moment he most desires she recognise him. In a neat echo of Arendt’s analysis of how the loneliness of totalitarianism feeds thoughtless evil, the loveless hunchback fights solitude in an effort to avoid the midnight hour; orchestrating collective murder is his defence against being alone with his thoughts. (This was observed by my theater companion who, being ten years old—and a British schoolboy—understands the connection between feeling left out and group violence well). Richard’s tragedy is that circumstances turned him into a serial killer; to this extent he is a conventional villain. His pathos, however, as this production shows, is to be poised between thinking and thoughtlessness, between Socrates and Eichmann.
‘No. Yes, I am/Then fly. What from myself?’ When Rylance speaks this soliloquy he stutters slightly, giggles and looks—as Arendt might have anticipated—a little perplexed. This is not a knowing perplexity; Richard does not master his conscience, nothing is done with the solitary dialogue, but the thinking is there even if Richard himself seems unsettled by its presence. In refusing to play Richard simply as one of the ‘negative heroes in literature’ who, Arendt argues, are often played as such ‘out of envy and resentment’, Rylance brilliantly captures the last moment before evil becomes banal.
To play Richard’s cruelty alongside his vulnerability is not to fail to recognise his villainy, as some have complained; rather, it is to dramatize the experience of thinking in the process of being painfully and violently lost. With pathos, we might think, is the only way to play Richard III today. The Globe’s production is a late, but utterly timely, companion to Arendt’s essay.
The New York Times tells the story of Benjamin Goering. Goering is 22. Until recently he studied computer science and philosophy at the University of Kansas. He felt “frustrated in crowded lecture halls where the professors did not even know his name.” So Goering dropped out of college and went to San Francisco, where he got a job as a software engineer.
I applaud Goering for making a risky decision. College was not for him. This does not mean he wasn’t smart or couldn’t cut it. He clearly has talent, and it was being wasted in courses that did not interest him and that were costing him and his family many tens of thousands of dollars every year. In leaving, Goering made the right decision for him. Indeed, many more college students should make the same decision he did. There are huge numbers of talented people who are simply not intellectuals and don’t enjoy or get much out of college. This is not destiny. A great or good teacher might perk them up. But largely it is a waste of their time and money for them to struggle through (or sleep through) classes that bore them. If anything, the forced march through Shakespeare and Plato makes these students less engaged and more cynical and self-centered as they turn from common sense to the internal pursuit of self-interest in partying and life in private.
The story should raise the big question that everyone tiptoes around in this current debate about college: Who should go to college?
The obvious answer is those who want to and those who care about ideas. Those who see that in thinking and reading and talking about justice, democracy, the scientific method, and perspective, we are talking about what it means to live in a large, democratic, bureaucratic country at a time of transition from an industrial to an information-age economy. College, in other words, is for those people who want to think about their world. It is for people who are willing and eager to turn to the great thinkers who came before them and, also, the innovative scientists and artists who have revealed hidden secrets about the natural and the human worlds. It is, in other words, for intellectuals. And this of course raises the “E” question: the question of elitism.
It is folly to think that everyone is or should be interested in such an endeavor. In no society in history have intellectuals been anything but a small minority of the population. This is not a question of privilege. There is no reason to think that those who love ideas are better or more qualified than those who work the earth, build machines, or engineer websites. It may very well be otherwise.
Hannah Arendt was clear that intellectuals had no privileged position in politics. On the contrary, she worried that the rise of intellectuals in politics was specifically dangerous. Intellectuals, insofar as they could get lost in and captivated by ideas, are prone to lose sight of reality in the pursuit of grand schemes. And intellectuals, captivated by the power of reason, are susceptible to rationalizations that excuse wrongs like torture or suicide bombing as means necessary for greater goods. The increasing dominance of intellectuals in politics, Arendt argued, is one of the great dangers facing modern society. She thus welcomed the grand tradition of the American yeoman farmer and affirmed that there is no need to go to college to be an engaged citizen or a profound thinker. The last of our Presidents who did not attend college was Abraham Lincoln. He did just fine. It is simply ridiculous to argue that college is a necessary credential for statesmanship.
While intellectuals have no special claim to leadership or prominence, they are nevertheless important. Intellectuals—those who think—are those people in society who stand apart from the mainstream pressures of economy and influence and outside the political movements of advocacy and propaganda. In the Arendtian tradition, intellectuals are or can be conscious pariahs, those who look at their societies from the outside and thus gain a perspective from distance that allows them to understand and comprehend the society in ways that people deeply embedded within it cannot. Those who stand apart from society and think are important, first because they preserve and deepen the stories and tales we as a society tell about ourselves. In writing poetry, making art, building monuments, writing books, and giving speeches, intellectuals help lend meaning and gravity to the common sense we have of ourselves as a people.
One problem we have in the current debate is that College has morphed into an institution designed to do many (too many) things. On the one hand, college has historically been the place for the education and formation of intellectuals. But for many decades if not many centuries, that focus has shifted. Today College is still a place for the life of the mind. But it is also a ticket into the middle or upper-middle classes, and it is equally a job-training and job-certification program. Of course, it is also a consumer good that brands young people with a certain mystique and identity. For many localities, colleges are themselves job creation machines, bringing with them all sorts of new businesses, throwing off patents, and graduating students who reinvigorate local communities. The university is now a multiversity, to invoke Clark Kerr’s famous term. When we talk about college today, the debate is complicated by these multiple roles.
It is difficult to raise such issues today because they smack of elitism. Because college-educated people think they are superior to those without a fancy diploma, their egalitarianism insists that everyone should have the same experience. We are not supposed to entertain the idea that some people may not want to go to college. Instead, we are told that if they had a better education, if they knew better, if they just were taught to understand, they would all want to sit in classrooms and read great books or do exciting experiments.
We are stuck today with what Hannah Arendt called, in a related context, the “democratic mentality of an egalitarian society that tends to deny the obvious inability and conspicuous lack of interest of large parts of the population in political matters as such.” In politics, Arendt argued that what was needed were public spaces from which a self-chosen “élite could be selected, or rather, where it could select itself.” Similarly, in education today, colleges should be the spaces where those who want to select themselves as an educated élite might lose themselves in books and experiments and amongst paintings and symphonies. There is simply no reason to assume that most people in society need to or should be interested in such an endeavor.
One reason the question of elitism is so present in debates about college is the disgusting and degenerate state of American public high schools. If high schools provided a serious and meaningful civic education, if they taught not simply reading and writing and arithmetic, but history and art—and taught these well—we would not need to send students to remedial education in college where they could be taught these subjects a second time. While many academics wring their hands about making college available to all, they might do much better if they focused on high schools and grammar schools around the country. If we were to redistribute the billions of dollars we spend on remedial college education to serious reform efforts in high schools, that money would be very well spent.
To raise the question of elitism means neither that college should be open only to the rich and connected (on the contrary, it should be open to all who want it), nor that the educated elite is to be segregated from society and kept apart in an ivory tower. When one reads Shakespeare, studies DNA, or dances with Bill T. Jones, one is not simply learning for learning's sake. Few understood this better than John Finley, Greek Professor at Harvard, who wrote General Education in a Free Society in 1945. Finley had this to say about the purposes of a college education:
The heart of the problem of a general education is the continuance of the liberal and humane tradition. Neither the mere acquisition of information nor the development of special skills and talents can give the broad basis of understanding which is essential if our civilization is to be preserved…. Unless the educational process includes at each level of maturity some continuing contact with those fields in which value judgments are of prime importance, it must fall short of the ideal.
What college should offer—as should all education at every level except for the most specialized graduate schools—is the experience of thinking and coming to engage with the world in which one lives. College is, at its best, an eye-opening experience, an opportunity for young people to learn the foundational texts and also be exposed to new cultures, new ideas, and new ways of thinking. The ideas of justice, truth, and beauty one learns are not valuable in themselves; they are meaningful only insofar as they impact and inform our daily lives. To read Plato’s Republic is to ask: what is the value of the ideas of the good and the just? It is also to meditate on the role of music and art in society. And at the same time, it is to familiarize oneself with characters like Socrates and Plato who, in the world we share, epitomize the qualities of morality, heroism, and the pursuit of the truth wherever it might lead. This can also be done in high schools. And it should be.
It is simply wrong to think such inquiries are unworldly or overly intellectual. Good teachers teach great texts not simply because the books are old, but because they are meaningful. And young students return to these books generation after generation because they find in them stories, examples, and ideas that inspire them to live their lives better and more fully.
As Leon Botstein, President of Bard College where the Hannah Arendt Center is located, writes in his book Jefferson’s Children,
No matter how rigorous the curriculum, no matter how stringent the requirements, if what goes on in the classroom does not leave its mark in the way young adults voluntarily act in private and in public while they are in college, much less in the years after, then the college is not doing what it is supposed to do.
The basic question being asked today is: Is college worthwhile? It is a good question. Too many colleges have lost their way. They no longer even understand what they are here to offer. Faculty frequently put research above teaching. Administration is the fastest growing segment of university education, which is evidence, if anything is, that universities simply do not know what their mission is anymore. It is no wonder, then, that many of our brightest young people are beginning to shy away from the thoughtless expectation that one must attend college.
All around us, people are opting out of college. The mania for online education is at least in part fueled by the hunger for knowledge from students and others who do not want or need to attend college. The Times highlights UnCollege and other organizations that advocate “hacking” your education. Recall that Lincoln was better schooled in the classics of poetry and politics than almost every college-educated president who followed him. Many colleges today are so confused, trying to do so many things at once, that they often do none of them well. It may be the case today that we need to evolve new networks and new organizations where intellectualism can flourish. And it may be small liberal arts colleges that are more flexible and more able to make that transition than large, bureaucratic research institutions.
The real question this debate needs to raise, but avoids, is: Who should get a college education? The answer, “not everyone,” is one few want to hear. And yet it might be the beginning of a real conversation about what a college education is for and why we are today so often failing to provide it to our students.
Freeman Dyson, the eclectic physicist, took good aim at philosophy last week in a review of Jim Holt's silly book, Why Does the World Exist?: An Existential Detective Story. Holt went around to "a portrait gallery of leading modern philosophers" and asked them the Leibnizian question: "Why is there something rather than nothing?" The book offers their answers, along with biographical descriptions.
For Dyson, Holt's book "compels us to ask" these "ugly questions." First: "When and why did philosophy lose its bite?" Philosophers were once important. In China, Confucius and his followers made a civilization. So too in Greece did Socrates and then the schools of Plato and Aristotle give birth to the Western world. In the Christian era, Jesus and Paul, then Augustine and Aquinas, granted depth to dominant worldviews. Philosophers like Descartes, Hobbes, and Leibniz were central figures in the scientific revolution, and philosophical minds like Nietzsche, Heidegger, and Arendt (even if one was a philologist and the other two refused the name philosopher) have become central figures in the experience of nihilism. Against these towering figures, the "leading philosophers" in Holt's book cut a paltry figure. Here is Dyson:
Holt's philosophers belong to the twentieth and twenty-first centuries. Compared with the giants of the past, they are a sorry bunch of dwarfs. They are thinking deep thoughts and giving scholarly lectures to academic audiences, but hardly anybody in the world outside is listening. They are historically insignificant. At some time toward the end of the nineteenth century, philosophers faded from public life. Like the snark in Lewis Carroll's poem, they suddenly and silently vanished. So far as the general public was concerned, philosophers became invisible.
There are many reasons for the death of philosophy, some of which were behind Hannah Arendt's refusal to call herself a philosopher. Philosophy was born, at least in its Platonic variety, out of the thinker's reaction to the death of Socrates. Confronted with the polis that put the thinker to death, Plato and Aristotle responded by retreating from the world into the world of ideas. Philosophical truth separated itself from worldly truths, and idealism was born. Realism was less a return to the world than a reactive fantasy against idealism. In both, the truths that were sought were otherworldly truths, disconnected from the world.
Christianity furthered the divorce of philosophy from the world by imagining two distinct realms, the higher realm existing beyond the world. Science, too, taught that truth could only be found in a world of abstract reason, divorced from real things. Christianity and science together gave substance to the philosophical rebellion against the world. The result, as Dyson rightly notes, is that philosophy today is as abstract, unworldly, and irrelevant as it is profound.
What Dyson doesn't explore is why philosophers of the past had such importance, even as they also thought about worlds of ideas. The answer cannot be that ideas had more import in the past than now. On the contrary, we live in an age more saturated in ideas than any other. More people today are college educated, literate, and knowledgeable about philosophy than at any period in the history of the world. Books like Holt's are proof positive of a profitable industry in philosophical trinkets. That is the paradox: at a time when philosophy is read by more people than ever, it matters less than it ever has.
One explanation for this paradox is nihilism—the devaluing or re-valuing of the highest values. The truth about truth turned out to be neither so simple nor so singular as the philosophers had hoped. An attentive inquiry into the true and the good led not to certainty, but to ideology critique. For Nietzsche, truth, like the Christian God, was a human creation, and the first truth of our age is that we have recognized it as such. That is the precondition for the death of God and the death of truth. Nihilism has not expunged ideas from our world, but multiplied them. When speaking about the "true" or the "good" or the "just," Christians, Platonists, and moralists no longer have the stage to themselves. They must now shout to be heard amongst the public relations managers, advertisers, immoralists, epicureans, anarchists, and born-again Christians.
Dyson ignores this strain of philosophy. He does point out that Nietzsche was the last great philosopher, but he then dismisses Heidegger, who "lost his credibility in 1933," and even Wittgenstein, who would fall silent if a woman attended his lectures and remain so until she left. And yet it is Heidegger who has given us the great literary masterpieces of twentieth-century philosophy.
His work on technology ("The Question Concerning Technology") and art ("The Origin of the Work of Art") has been widely read in artistic, literary, and lay circles. It is hard to imagine a philosopher more engaged with science and literature than Heidegger was. He read physics widely, co-taught seminars at the house of the Swiss psychiatrist Medard Boss, and also taught seminars with the German novelist Ernst Jünger.
It seems worthwhile to end with a poem of Heidegger's from his little book, Aus der Erfahrung des Denkens/From Out of the Experience of Thinking:
Drei Gefahren drohen dem Denken
Die gute und darum heilsame Gefahr ist die Nachbarschaft des singenden Dichters.
Die böse und darum schärfste Gefahr ist das Denken selber. Es muß gegen sich selbst denken, was es nur selten vermag.
Die schlechte und darum wirre Gefahr ist das Philosophieren.
Three dangers threaten thinking.
The good and thus wholesome danger is the nearness of the singing poet.
The evil and thus sharpest danger is thinking itself. It must think against itself, something it can do only rarely.
The bad and thus confusing danger is philosophizing.
“Hence it is not in the least superstitious, it is even a counsel of realism, to look for the unforeseeable and unpredictable, to be prepared for and to expect “miracles” in the political realm. And the more heavily the scales are weighted in favor of disaster, the more miraculous will the deed done in freedom appear.”
—Hannah Arendt, What is Freedom?
This week at Bard College, in preparation for the Hannah Arendt Center Conference "Does the President Matter?", we put up two writing blocks around campus, multi-paneled chalkboards that invite students to respond to the question: Does the President Matter? The blocks generated quite a few interesting comments. Many mentioned the Supreme Court. Quite a few invoked the previous president, war, and torture. And, since we are at Bard, others responded: it depends on what you mean by matters.
This last comment struck me as perceptive. It does depend on what you mean by matters.
If what we mean is, say, an increasing and unprecedented power in a democratic leader not seen since the time of enlightened monarchy, the president does matter. We live in an age of an imperial presidency. The president can, and does, send our troops into battle without the approval of Congress. The president can, and does, harness the power of TV, the Internet, and Twitter to bypass his critics and reach the masses more directly than ever before. The president can, and does, appoint Supreme Court justices with barely a whimper from the Senate; and the president's appointments can, and do, swing the balance on a prisoner's right to habeas corpus, a woman's right to choose, or a couple's right to marry.
And yet, what if by matter, we mean something else? What if we mean, having the power to change who we are in meaningful ways? What if by matter we mean: to confront honestly the enormous challenges of the present? What if by matter we mean: to make unpredictable and visionary choices, to invite and inspire a better future?
On the really big questions, consider: the thoughtless consumerism that degrades our environment and our souls; the millions of people who have no jobs and increasingly little prospect of productive employment; the threat of devastating terrorism; and the astronomical national debt, $16 trillion and counting for the US, about $140,000 for each taxpayer. Add to that the deficiency in public pension obligations, estimated at anywhere from $1 to $5 trillion, not to mention the $1 trillion of inextinguishable student debt that is creating a lost generation of young people whose lives are stifled by unwise decisions made before they were allowed to buy a beer.
This election should be about a frank acknowledgement of the unsustainability of our economic, social, and environmental practices and expectations. We should be talking together about how to remake our future in ways that are both just and exciting. This election should be scary and exhilarating. But so far it is small-minded and ugly.
Around the world, we witness distrust and disdain for government. In Greece, there is a clear choice between austerity and devaluation, but Greek leaders have saddled their people with half-hearted austerity that causes pain without the prospect of relief. In Italy, the paralysis of political leaders has led to resignation and the appointment of an interim technocratic government. In Germany, the most powerful European leader delays and denies, trusting that others will blink every time they are brought to the brink of the abyss.
No wonder that the Tea Party and Occupy Wall Street in the US, and the Pirate Parties in Europe share a common sense that liberal democratic government is broken. A substantial—and highly educated—portion of the electorate has concluded that our government is so inept and so compromised that it needs to be abandoned or radically constrained. No president, it seems, is up to the challenge of fixing our broken political system.
Every president comes to Washington promising reform, and every one fails. According to Jonathan Rauch, a leading journalist for The Atlantic and the National Journal, this is inevitable. He has this to say in his book Government's End:
If the business of America is business, the business of government programs and their clients is to stay in business. And after a while, as the programs and the clients and their political protectors adapt to nourish and protect each other, government and its universe of groups reach a turning point—or, perhaps more accurately, a point from which there is no turning back. That point has arrived. Government has become what it is and will remain: a large, incoherent, often incomprehensible mass that is solicitous of its clients but impervious to any broad, coherent program of reform. And this evolution cannot be reversed.
On the really big questions of transforming politics, the President is, Rauch argues, simply powerless. President Obama apparently agrees. Just last week he said, in Florida: "The most important lesson I've learned is that you can't change Washington from the inside. You can only change it from the outside."
A similar sentiment is offered by Lawrence Lessig, a founding member of Creative Commons. In his recent book Republic, Lost, Lessig writes:
The great threat today is in plain sight. It is the economy of influence now transparent to all, which has normalized a process that draws our democracy away from the will of the people. A process that distorts our democracy from ends sought by both the Left and the Right: For the single most salient feature of the government that we have evolved is not that it discriminates in favor of one side and against the other. The single most salient feature is that it discriminates against all sides to favor itself. We have created an engine of influence that seeks not some particular strand of political or economic ideology, whether Marx or Hayek. We have created instead an engine of influence that seeks simply to make those most connected rich.
The system of influence and corruption through PACs, SuperPacs, and lobbyists is so entrenched, Lessig writes, that no reform seems plausible. All that is left is the Hail Mary idea of a new constitutional convention—an idea Lessig promotes widely, as with his Conference On the Constitutional Convention last year at Harvard.
For Rauch on the Right and Lessig on the Left, government is so concerned with its parochial interests and its need to stay in business that we have forfeited control over it. We have, in other words, lost the freedom to govern ourselves.
The question "Does the President Matter?" is asked, in the context of the Arendt Center conference, from out of Hannah Arendt's maxim that Freedom is the fundamental raison d'etre of politics. In "What is Freedom?", Arendt writes:
“Freedom is actually the reason that men live together in political organization at all. Without it, political life as such would be meaningless. The raison d’être of politics is freedom.”
So what is freedom? To be free, Arendt says, is to act. Arendt writes: "Men are free as long as they act, neither before nor after; for to be free and to act are the same.”
What is action? Action is something done spontaneously; it brings something new into the world. Man is the being capable of beginning something new. Political action, and action in general, must happen in public. Like the performing arts—dance, theater, and music—politics and political action require an audience. Political actors act in front of other people; they need spectators. And when the spectators find the doings of political actors right, or true, or beautiful, they gather around and form themselves into a polity. The free political act must be surprising if it is to draw people to itself. Only an act that is surprising and bold is a political act, because only such an act will strike others and make them pay attention.
The very word politics derives from the Greek polis, which itself is rooted in the Greek pelein, a verb used to describe the circular motion of smoke rings rising up out of a pipe. The point is that politics is the gathering of a plurality around a common center. The plurality does not become a singularity in circling around a polestar, but it does acknowledge something common, something that unites the members of a polity in spite of their uniqueness and difference.
When President Washington stepped down after his second term; when President Lincoln emancipated the slaves; when FDR created the New Deal; when President Eisenhower called the Arkansas National Guard into Federal Service in order to integrate schools in Little Rock; these presidents acted in ways that helped refine, redefine, and re-imagine what it means to be an American.
Arendt makes one further point about action and freedom that bears on the question: Does the President Matter? Courage, she writes, is "the political virtue par excellence." To act in public is to leave the security of one's home and enter the world of the public. Such action is dangerous, for the political actor might be jailed for his crime or even killed. Arendt's favorite example of political courage is Socrates, who was killed for his courageous engagement with his fellow Athenians. We must always recall that Socrates was sentenced to death for violating Athenian law.
Political action also requires courage because the actor can suffer a fate even worse than death. He may be ignored. At least to be killed for one's ideas means that one is recognized as capable of action, of saying and doing something that matters. To be ignored, however, denies the actor the basic human capacity for action and freedom.
One fascinating corollary of Arendt's understanding of the identity of action and freedom is that action, any action—any original deed, any political act that is new and shows leadership—is, of necessity, something that was not done before. It is, therefore, always against the law.
This is an insight familiar to readers of Fyodor Dostoevsky. In Crime and Punishment Raskolnikov says:
Let's say, the lawgivers and founders of mankind, starting from the most ancient and going on to the Lycurguses, the Solons, the Muhammads, the Napoleons, and so forth, that all of them to a man were criminals, from the fact alone that in giving a new law they thereby violated the old one.
All leaders are, in important ways, akin to criminals. This is an insight Arendt and Nietzsche share.
Shortly after we began to plan this conference, I heard an interview with John Ashcroft speaking on the Freakonomics Radio Show. He said:
"Leadership in a moral and cultural sense may be even more important than what a person does in a governmental sense. A leader calls people to their highest and best. ... No one ever achieves greatness merely by obeying the law. People who do above what the law requires become really valuable to a culture. And a President can set a tone that inspires people to do that."
My first reaction was: This is a surprising thing for the Attorney General of the United States to say. My second reaction was: I want him to speak at the conference. Sadly, Mr. Ashcroft could not be with us here today. But this does not change the fact that, in an important way, Ashcroft is right. Great leaders will rise above the laws in crisis. They will call us to our highest and best.
What Ashcroft doesn't quite say, and yet Arendt and Dostoevsky make clear, is that there is a thin yet all-important line separating great leaders from criminals. Both act in ways unexpected and novel. In a sense, both break the law.
But only the leader's act shows itself to be right and thus re-makes the law. Hitler may have acted and shown a capacity for freedom; his action, however, was rejected. He was a criminal, not a legislator. Martin Luther King Jr. and Gandhi also broke laws in acts of civil disobedience. Great leaders show in their lawbreaking that the earlier law was wrong; they forge a new moral, and eventually written, law through the force and power of moral example.
In what is perhaps the latest American example of presidential lawbreaking, President George W. Bush clearly broke both U.S. and international law in his prosecution of the war on terror. At least at this time, it seems painfully clear that his decision to systematize torture stands closer to a criminal act than to an act of great legislation.
In many ways, presidential politics in the 21st century takes place in the shadow of George W. Bush's overreach. One result is that we have reacted against great and daring leadership. In line with the spirit of equality that drives our age, we ruthlessly expose the foibles, missteps, scandals, and failures of anyone who rises to prominence. Bold leaders are risk takers: they fail and embarrass themselves; they have unruly skeletons in their closets. They will hesitate to endure, and rarely prevail in, the public inquisition that the presidential selection process has become.
The candidates who are inoffensive enough to prevail are branded by their consultants as pragmatists. Our current pragmatists are products of Harvard Business School and Harvard Law School. Mr. Romney loves data. President Obama worships experts. Both are nothing if not faithful to the doctrine of technocratic optimism: that with the right people in charge, we can do anything. The only problem is that they refuse to tell us what it is they want to do. They have forgotten that politics is a matter of thinking, not a pragmatic exercise in technical efficiency.
Look at the Mall in Washington: the Washington Monument honors our first president; alongside it stand the Jefferson Memorial, the Lincoln Memorial, and the Memorial to Franklin Delano Roosevelt. There is no monument to any president since FDR. And yet just two years ago we dedicated the Martin Luther King Jr. Memorial. It does not seem an accident that the leaders of the civil rights movement were not politicians. The presidency no longer attracts our boldest leaders, and bold leaders today are not the people running for office.
Yet, people crave what used to be called a statesman. To ask: "Does the President Matter?" is to ask: might a president, might a political leader, be able to transform our nation, to restore the dignity and meaning of politics? It is to ask, in other words, for a miracle.
At the end of her essay, "What is Freedom?", Hannah Arendt said this about the importance of miracles in politics.
Hence it is not in the least superstitious, it is even a counsel of realism, to look for the unforeseeable and unpredictable, to be prepared for and to expect “miracles” in the political realm. And the more heavily the scales are weighted in favor of disaster, the more miraculous will the deed done in freedom appear.
It is men who perform miracles—men who because they have received the twofold gift of freedom and action can establish a reality of their own.
I don't know if the president matters.
But I know that he or she must. Which is why we must believe that miracles are possible. And that means we, ourselves, must act in freedom to make the miraculous happen.
In the service of the not-yet-imagined possibilities of our time, our goal over the two days of the conference was to engage in the difficult, surprising, and never-to-be-understood work of thinking, and of thinking together, in public, amongst others. We heard from philosophers and businessmen, artists and academics. The speakers came from across the political spectrum, but they shared a commitment to thinking beyond ideology. Such thinking is itself a form of action, especially so in a time of such ideological rigidity. Whether our meeting here at Bard gives birth to the miracle of political action—that is up to you. If we succeeded in thinking together, in provoking, and in unsettling, we perhaps sowed the seeds that will one day blossom into the miracle of freedom.
Watch Roger's opening talk from the conference, "Does the President Matter?" here.