Hannah Arendt Center for Politics and Humanities
6 Mar 2014

Socrates on Thinking


“To find yourself, think for yourself.”

-Socrates


16 Dec 2013

The Laboratory as Anti-Environment


"Seen from the perspective of the 'real' world, the laboratory is the anticipation of a changed environment."

-Hannah Arendt, The Life of the Mind

I find this quote intriguing in that its reference to environments and environmental change speaks to the fact that Arendt's philosophy was essentially an ecological one, indeed one that is profoundly media ecological. The quote appears in a section of The Life of the Mind entitled "Science and Common Sense," in which Arendt argues that the practice of science is quite distinct from thinking as a philosophical activity.


As she explains:

Thinking, no doubt, plays an enormous role in every scientific enterprise, but it is a role of a means to an end; the end is determined by a decision about what is worthwhile knowing, and this decision cannot be scientific.

Here Arendt invokes a variation on Gödel's incompleteness theorem in mathematics, noting that science cannot justify itself on scientific grounds, but rather must somehow depend on something outside of and beyond itself. Perhaps more to the point, science, especially as associated with empiricism, cannot be divorced from concrete reality, and does not function only in the abstract realm of ideas that Plato insisted was the only true reality.

The transformation of truth into mere verity results primarily from the fact that the scientist remains bound to the common sense by which we find our bearings in a world of appearances. Thinking withdraws radically and for its own sake from this world and its evidential nature, whereas science profits from a possible withdrawal for the sake of specific results.

It is certainly the case that scientific truth is always contingent, tentative, open to refutation, as Karl Popper explained.  Scientific truth is never absolute, never anything more than a map of some other territory, a map that needs to be continually tested and reviewed, updated and revised, as Alfred Korzybski explained by way of establishing his discipline of general semantics. Even the so-called laws of nature and physics need not be considered immutable, but may be subject to change and evolution, as Lee Smolin argues in his insightful book, Time Reborn.

Scientists are engaged in the process of abstracting, insofar as they take the data gained by empirical investigation and make generalizations in the form of theories and hypotheses, but this process of induction cannot be divorced from concrete reality, from the world of appearances. Science may be used to test, challenge, and displace common sense, but it operates on the same level, as a distilled form of common sense, rather than something qualitatively different, a status Arendt reserves for the special activity of thinking associated with philosophy.

Arendt goes on to argue that both common sense and scientific speculation lack "the safeguards inherent in sheer thinking, namely thinking's critical capacity."  This includes the capacity for moral judgment, which became horrifically evident by the ways in which Nazi Germany used science to justify its genocidal policies and actions. Auschwitz did not represent a retrieval of tribal violence, but one of the ultimate expressions of the scientific enterprise in action. And the same might be said of Hiroshima and Nagasaki, holding aside whatever might be said to justify the use of the atomic bomb to bring the Second World War to a speedy conclusion. In remaining close to the human lifeworld, science abandons the very capacity that makes us human, that makes human life and human consciousness unique.

The story of modern science is in fact a story of shifting alliances. Science begins as a branch of philosophy, as natural philosophy. Indeed, philosophy itself is generally understood to begin with the pre-Socratics sometimes referred to as Ionian physicists, e.g., Thales, Anaximander, Heraclitus, who first posited the concept of elements and atoms. Both science and philosophy therefore coalesce during the first century that followed the introduction of the Greek alphabet and the emergence of a literate culture in the ancient Greek colonies in Asia Minor.

And just as ancient science is alphabetic in its origins, modern science begins with typography, as the historian Elizabeth Eisenstein explains in her exhaustive study, The Printing Press as an Agent of Change in Early Modern Europe. Simply by making the writings of natural philosophers easily available through the distribution of printed books, scholars were able to compare and contrast what different philosophers had to say about the natural world, and uncover their differences of opinion and contradictions. And this in turn spurred them on to find out for themselves which of various competing explanations are correct, where the truth lies, so that more reading led to even more empirical research, which in turn would have to be published, that is made public, via printing, for the purposes of testing and confirmation. And publication encouraged the formation of a scientific republic of letters, a typographically mediated virtual community.


Eisenstein notes that during the first century following Gutenberg, printed books gave Copernicus access to centuries of recorded observations of the movements of celestial objects, access not easily available to his predecessors. What is remarkable to consider is that the telescope was not invented in his lifetime, that the Polish astronomer arrived at his heliocentric view based only on what could be observed by the naked eye, by gazing up at the heavens, and down at the printed page. The typographic revolution that began in the 15th century was thus the necessary technological precondition for the Copernican revolution of the 16th century. The telescope, a tool that extends vision beyond its natural capabilities, was not required, although soon after its introduction Galileo was able to confirm the theory that Copernicus had put forth a century earlier.

In the restricted literate culture of medieval Europe, the idea took hold that there are two books to be studied in an effort to discern the divine will, and mind: the book of scripture and the book of nature. Both books were seen as sources of knowledge that can be unlocked by a process of reading and interpretation. It was grammar, the ancient study of language that formed one third of the trivium, the foundational curriculum of the medieval university, that became the basis of modern science, and not dialectic or logic, that is, pure thinking, which is the source of the philosophic tradition, as Marshall McLuhan noted in The Classical Trivium. The medieval schoolmen of course placed scripture in the primary position, whereas modern science situates truth in the book of nature alone.

The publication of Francis Bacon's Novum Organum in 1620 first formalized the separation of science from philosophy within print culture, but the divorce was finalized during the 19th century, coinciding with the industrial revolution, as researchers became known as scientists rather than natural philosophers. In place of the alliance with philosophy, science came to be associated with technology; before this time, technology and engineering, often referred to as mechanics, represented entirely different lines of inquiry, utterly practical, often intuitive rather than systematic. Mechanics was part of the world of work rather than that of action, to use the terms Arendt introduced in The Human Condition, which is to say that it was seen as the work of the hand rather than the mind. By the end of the 19th century, scientific discovery emerged as the main source of major technological breakthroughs, rather than innovation springing fully formed from the tinkering of inventors, and it became necessary to distinguish between applied science and theoretical science, the latter nonetheless still tied to the world of appearances.

Today, the acronym STEM, which stands for science, technology, engineering, and mathematics, has become a major buzzword in education, a major emphasis in particular for higher education, and a major concern in regards to economic competitiveness. We might well take note of how recent this combination of fields and disciplines really is, insofar as mathematics represents pure logic and highly abstract forms of thought, and science once was a purely philosophical enterprise, both aspects of the life of the mind. Technology and engineering, on the other hand, for most of our history took the form of arts and crafts, part of the world of appearances.

The convergence of science and technology also had much to do with scientists' increasing reliance on scientific instruments for their investigations, a trend increasingly prevalent following the introduction of both the telescope and the microscope in the early 17th century, a trend even more apparent from the 19th century on. The laboratory is in fact another such instrument, a technology whose function is to provide precisely controlled conditions, beyond its role as a facility for the storage and use of other scientific instruments. Scientific instruments are media that extend our senses and allow us to see the world in new ways, therefore altering our experience of our environment, while the discoveries they lead to provide us with the means of altering our environments physically. And the laboratory is an instrument that provides us with a total environment, enclosed, controlled, isolated from the world to become in effect the world. It is a micro-environment where experimental changes can be made that anticipate changes that can be made to the macro-environment we regularly inhabit.

The split between science and philosophy can also be characterized as a division between the eye and the ear. Modern science, as intimately bound up in typography, is associated with visualism, the idea that seeing is believing, that truth is based on vision, that knowledge can be displayed visually as an organized set of facts, rather than the product of ongoing dialogue and debate. McLuhan noted the importance of the fixed point of view as a by-product of training the eye to read, and Walter Ong studied the paradigm-shift in education attributed to Peter Ramus, who introduced pedagogical methods we would today associate with textbooks, outlining, and the visual display of information. Philosophy has not been immune to this influence, but retains a connection to the oral-aural mode through the method of Socratic dialogue, and by way of an understanding of the history of ideas as an ongoing conversation. Arendt, in The Human Condition, explained action, the realm of words, as a social phenomenon, one based on dialogic exchanges of ideas and opinions, not a solitary matter of looking things up. And thinking, which she elevates above the scientific enterprise in The Life of the Mind, is mostly a matter of an inner dialogue, or monologue if you prefer, of hearing oneself think, of silent speech, and not of a mental form of writing out words or imaginary reading. We talk things out, to others and/or to ourselves.

Science, on the other hand, is all about visible representations, as words, numbers, illustrations, tables, graphs, charts, diagrams, etc. And it is the investigation of visible phenomena, or otherwise of phenomena that can be rendered visible through scientific instruments. Acoustic phenomena can only be dealt with scientifically by being turned into a visual measurement, either of numbers or of lines going up and down to depict sound waves. The same is true for the other senses; smell, taste, and touch can only be dealt with scientifically through visual representation. Science cannot deal with any sense other than sight on its own terms, but always requires an act of translation into visual form. Thus, Arendt notes that modern science, being so intimately bound up in the world of appearances, is often concerned with making the invisible visible:

That modern science, always hunting for manifestations of the invisible—atoms, molecules, particles, cells, genes—should have added to the world a spectacular, unprecedented quantity of new perceptible things is only seemingly paradoxical.

Arendt might well have noted the continuity between the modern activity of making the invisible visible as an act of translation, and the medieval alchemist's search for methods of achieving material transformation, the translation of one substance into another. She does note that the use of scientific instruments are a means of extending natural functions, paralleling McLuhan's characterization of media as extensions of body and biology:

In order to prove or disprove its hypotheses… and to discover what makes things work, it [modern science] began to imitate the working processes of nature. For that purpose it produced the countless and enormously complex implements with which to force the non-appearing to appear (if only as an instrument-reading in the laboratory), as that was the sole means the scientist had to persuade himself of its reality. Modern technology was born in the laboratory, but this was not because scientists wanted to produce appliances or change the world. No matter how far their theories leave common-sense experience and common-sense reasoning behind, they must finally come back to some form of it or lose all sense of realness in the object of their investigation.

Note here that the close connection between reality, that is, our conception of reality, and what lends something the aura of authenticity, as Walter Benjamin would put it, is dependent on the visual sense, on the phenomenon being translated into the world of appearances (the aura as opposed to the aural). It is no accident then that there is a close connection in biblical literature and the Hebrew language between the words for spirit and soul, and the words for invisible but audible phenomena such as wind and breath, breath in turn being the basis of speech (and this is not unique to Hebraic culture or vocabulary). It is at this point that Arendt resumes her commentary on the function of the controlled environment:

And this return is possible only via the man-made, artificial world of the laboratory, where that which does not appear of its own accord is forced to appear and to disclose itself. Technology, the "plumber's" work held in some contempt by the scientist, who sees practical applicability as a mere by-product of his own efforts, introduces scientific findings, made in "unparalleled insulation… from the demands of the laity and of everyday life," into the everyday world of appearances and renders them accessible to common-sense experience; but this is possible only because the scientists themselves are ultimately dependent on that experience.

We now reach the point in the text where the quote I began this essay with appears, as Arendt writes:

Seen from the perspective of the "real" world, the laboratory is the anticipation of a changed environment; and the cognitive processes using the human abilities of thinking and fabricating as means to their end are indeed the most refined modes of common-sense reasoning. The activity of knowing is no less related to our sense of reality and no less a world-building activity than the building of houses.

Again, for Arendt, science and common sense both are distinct in this way from the activity of pure thinking, which can provide a sorely needed critical function. But her insight as to the function of the laboratory as an environment in which the invisible is made visible is important in that this helps us to understand that the laboratory is, in fact, what McLuhan referred to as a counter-environment or anti-environment.

In our everyday environment, the environment itself tends to be invisible, if not literally so, then functionally insofar as whatever fades into the background tends to fall out of our perceptual awareness or is otherwise ignored. Anything that becomes part of our routine falls into this category, becoming environmental, and therefore subliminal. And this includes our media, technology, and symbol systems, insofar as they are part of our everyday world. We do pay attention to them when they are brand new and unfamiliar, but once their novelty wears off they become part of the background, unless they malfunction or break down. In the absence of such conditions, we need an anti-environment to provide a contrast through which we can recognize the things we take for granted in our world, to provide a place to stand from which we can observe our situation from the outside in, from a relatively objective stance. We are, in effect, sleepwalkers in our everyday environment, and entering into an anti-environment is a way to wake us up, to enhance awareness and consciousness of our surroundings. This occurs, in a haphazard way, when we return home after spending time experiencing another culture, as for a brief time much of what was once routinized about our own culture suddenly seems strange and arbitrary to us. The effect wears off relatively quickly, however, although the after-effects of broadening our minds in this way can be significant.


The controlled environment of the laboratory helps to focus our attention on phenomena that are otherwise invisible to us, either because they are taken for granted, or because they require specialized instrumentation to be rendered visible. It is not just that such phenomena are brought into the world of appearances, however, but also that they are made into objects of concerted study, to be recorded, described, measured, experimented upon, etc.

McLuhan emphasized the role of art as an anti-environment. The art museum, for example, is a controlled environment, and the painting that we encounter there has the potential to make us see things we had never seen before, by which I mean not just objects depicted that are unfamiliar to us, but familiar objects depicted in unfamiliar ways. In this way, works of art are instruments that can help us to see the world in new and different ways, help us to see, to use our senses and perceive in new and different ways. McLuhan believed that artists served as a kind of distant early warning system, borrowing cold war terminology to refer to their ability to anticipate changes occurring in the present that most others are not aware of. He was fond of the Ezra Pound quote that the artist is the antenna of the race, and Kurt Vonnegut expressed a similar sentiment in describing the writer as a canary in a coal mine. We may further consider the art museum or gallery or library as a controlled environment, a laboratory of sorts, and note the parallel in the idea of art as the anticipation of a changed environment.

There are other anti-environments as well. Houses of worship function in this way, often because they are based on earlier eras and different cultures, and otherwise are constructed to remove us from our everyday environment, and help us to see the world in a different light. They are in some way dedicated to making the invisible world of the spirit visible to us through the use of sacred symbols and objects, even for religions whose concept of God is one that is entirely outside of the world of appearances. Sanctuaries might therefore be considered laboratories used for moral, ethical, and sacred discovery, experimentation, and development, and places where changed environments are also anticipated, in the form of spiritual enlightenment and the pursuit of social justice. This also suggests that the scientific laboratory might be viewed, in a certain sense, as a sacred space, along the lines that Mircea Eliade discusses in The Sacred and the Profane.

The school and the classroom are also anti-environments, or at least ought to be, as Neil Postman argued in Teaching as a Conserving Activity. Students are sequestered away from the everyday environment, into a controlled situation where the world they live in can be studied and understood, and phenomena that are taken for granted can be brought into conscious awareness. It is indeed a place where the invisible can be made visible. In this sense, the school and the classroom are laboratories for learning, although the metaphor can be problematic when it is used to imply that the school is only about the world of appearances, and all that is needed is to let students discover that world for themselves. Exploration is indeed essential, and discovery is an important component of learning. But the school is also a place where we may engage in the critical activity of pure thinking, of critical reasoning, of dialogue and disputation.

The classroom is more than a laboratory, or at least it must become more than a laboratory, or the educational enterprise will be incomplete. The school ought to be an anti-environment, not only in regard to the everyday world of appearances and common sense, but also to that special world dominated by STEM, by science, technology, engineering and math.  We need the classroom to be an anti-environment for a world subject to a flood of entertainment and information, we need it to be a language-based anti-environment for a world increasingly overwhelmed by images and numbers. We need an anti-environment where words can take precedence, where reading and writing can be balanced by speech and conversation, where reason, thinking, and thinking about thinking can allow for critical evaluation of common sense and common science alike. Only then can schools be engaged in something more than just adjusting students to take their place in a changed and changing environment, integrating them within the technological system, as components of that system, as Jacques Ellul observed in The Technological Society. Only then can schools help students to change the environment itself, not just through scientific and technological innovation, but through the exercise of values other than the technological imperative of efficiency, to make things better, more human, more life-affirming.

The anti-environment that we so desperately need is what Hannah Arendt might well have called a laboratory of the mind.

-Lance Strate

18 Nov 2013

One Against All


This Quote of the Week was originally published on September 3, 2012.

It can be dangerous to tell the truth: “There will always be One against All, one person against all others. [This is so] not because One is terribly wise and All are terribly foolish, but because the process of thinking and researching, which finally yields truth, can only be accomplished by an individual person. In its singularity or duality, one human being seeks and finds – not the truth (Lessing) – but some truth.”

-Hannah Arendt, Denktagebuch, Book XXIV, No. 21

Hannah Arendt wrote these lines when she was confronted with the severe and often unfair, even slanderous, public criticism launched against her and her book Eichmann in Jerusalem after its publication in 1963. The quote points to her understanding of the thinking I (as opposed to the acting We) on which she bases her moral and, partly, her political philosophy.


The thinking I is defined with Kant as selbstdenkend (self-thinking [“singularity”]) and an-der-Stelle-jedes-andern-denkend (i.e., in Arendt’s terms, thinking representatively or practicing the two-in-one [“duality”]). Her words also hint at an essay she published in 1967 titled “Truth and Politics,” wherein she takes up the idea that it is dangerous to tell the truth, factual truth in particular, and considers the teller of factual truth to be powerless. Logically, the All are the powerful, because they may determine what at a specific place and time is considered to be factual truth; their lies, in the guise of truth, constitute reality. Thus, it is extremely hard to fight them.

In answer to questions posed in 1963 by the journalist Samuel Grafton regarding her report on Eichmann and published only recently, Arendt states: “Once I wrote, I was bound to tell the truth as I see it.” The statement reveals that she was quite well aware of the fact that her story, i.e., the result of her own thinking and researching, was only one among others. She also realized the lack of understanding and, in many cases, of thinking and researching, on the part of her critics.


Thus, she lost any hope of being able to publicly debate her position in a “real controversy,” as she wrote to Rabbi Hertzberg (April 8, 1966). By the same token, she determined that she would not entertain her critics, as Socrates did the Athenians: “Don’t be offended at my telling you the truth.” Reminded of this quote from Plato’s Apology (31e) in a supportive letter from her friend Helen Wolff, she acknowledged the reference, but acted differently. After having made up her mind, she wrote to Mary McCarthy: “I am convinced that I should not answer individual critics. I probably shall finally make, not an answer, but a kind of evaluation of this whole strange business.” In other words, she did not defend herself in following the motto “One against All,” which she had perceived and noted in her Denktagebuch. Rather, as announced to McCarthy, she provided an “evaluation” in the 1964 preface to the German edition of Eichmann in Jerusalem and later when revising that preface for the postscript of the second English edition.

Arendt also refused to act in accordance with the old saying: Fiat iustitia, et pereat mundus (let there be justice, though the world perish). She writes – in the note of the Denktagebuch from which today’s quote is taken – that such acting would reveal the courage of the teller of truth “or, perhaps, his stubbornness, but neither the truth of what he had to say nor even his own truthfulness.” Thus, she rejected an attitude known in German cultural tradition under the name of Michael Kohlhaas. A horse trader living in the 16th century, Kohlhaas became known for endlessly and in vain fighting injustice done to him (two of his horses were stolen on the order of a nobleman) and finally taking the law into his own hands by setting fire to houses in Wittenberg.


Even so, Arendt has been praised as a woman of “intellectual courage” with regard to her book on Eichmann (see Richard Bernstein’s contribution to Thinking in Dark Times).

Intellectual courage based on thinking and researching was rare in Arendt’s time and has become even rarer since then. But should Arendt therefore only matter nostalgically? Certainly not. Her emphasis on the benefits of thinking as a solitary business still remains current. Consider, for example, the following reference to Sherry Turkle, a sociologist at MIT and author of the recent book Alone Together. In an interview with Peter Haffner (published on July 27, 2012, in SZ Magazin), she argues that individuals who become absorbed in digital communication lose crucial components of their faculty of thinking. Turkle says (my translation): Students who spend all their time and energy on communication via SMS, Facebook, etc. “can hardly concentrate on a particular subject. They have difficulty thinking a complex idea through to its end.” No doubt, this sounds familiar to all of us who know about Hannah Arendt’s effort to promote thinking (and judging) in order to make our world more human.

To return to today’s quote: It can be dangerous to tell the truth, but thinking is dangerous too. Once in a while, not only the teller of truth but the thinking 'I' as well may find himself or herself in the position of One against All.

-Ursula Ludz

20 Sep 2013

The Banality of Systems and the Justice of Resistance


Peter Ludlow in The Stone remarks on the generational divide in attitudes toward whistleblowers, leakers, and hackers. According to Time Magazine, “70 percent of those age 18 to 34 sampled in a poll said they believed that Snowden ‘did a good thing’ in leaking the news of the National Security Agency’s surveillance program.” This fits a general trend, one heralded by Rick Falkvinge, founder of the European Pirate Parties, at the Hannah Arendt Center Conference last year: that young people value transparency above institutional democratic procedures. Distrusting government and institutions, there is a decided shift towards a faith in transparency and unfettered disclosure. Those who expose such information are lauded for their courage in the name of the freedom of information.

Ludlow agrees and cites Hannah Arendt’s portrait of Adolf Eichmann for support of his contention that leakers like Edward Snowden and Chelsea Manning acted justly and courageously:

“In “Eichmann in Jerusalem,” one of the most poignant and important works of 20th-century philosophy, Hannah Arendt made an observation about what she called “the banality of evil.” One interpretation of this holds that it was not an observation about what a regular guy Adolf Eichmann seemed to be, but rather a statement about what happens when people play their “proper” roles within a system, following prescribed conduct with respect to that system, while remaining blind to the moral consequences of what the system was doing — or at least compartmentalizing and ignoring those consequences.”

Ludlow insists: “For the leaker and whistleblower the answer to [those who argue it is hubris for leakers to make the moral decision to expose wrongdoing] is that there can be no expectation that the system will act morally of its own accord. Systems are optimized for their own survival and preventing the system from doing evil may well require breaking with organizational niceties, protocols or laws. It requires stepping outside of one’s assigned organizational role.” In other words, bureaucratic systems have every incentive to protect themselves, thus leading to both dysfunction and injustice. We depend upon the actions of individuals who say simply: “No, I can’t continue to allow such injustice to go on.” Whistleblowers and leakers are essential parts of any just bureaucratic organization.

Ludlow’s insight is an important one: the person who thinks for himself and stands apart from the crowd can—in times of crisis when the mass of people are thoughtlessly carried away by herd instincts and crowd mentality—act morally simply by refusing to go along with the collective performance of injustice. The problem is that if Snowden and Manning had simply resigned, their acts of resistance would have had minimal impact. To make a difference and to act in the name of justice, they had to release classified material. In effect, they had to break the law. Ludlow’s claim is that they did so morally and in the name of justice.


But is Ludlow correct to enlist Arendt in support of leakers such as Snowden and Manning? It is true that Arendt deeply understands the importance of individuals who resist the easy path of conformity in the name of doing right. Perhaps nowhere is the importance of such action made more markedly manifest than in her account of the moment when the name of Anton Schmidt appeared in testimony at the Eichmann trial:

At this slightly tense moment, the witness happened to mention the name of Anton Schmidt, a Feldwebel, or sergeant, in the German Army - a name that was not entirely unknown to this audience, for Yad Vashem had published Schmidt's story some years before in its Hebrew Bulletin, and a number of Yiddish papers in America had picked it up. Anton Schmidt was in charge of a patrol in Poland that collected stray German soldiers who were cut off from their units. In the course of doing this, he had run into members of the Jewish underground, including Mr. Kovner, a prominent member, and he had helped the Jewish partisans by supplying them with forged papers and military trucks. Most important of all: "He did not do it for money." This had gone on for five months, from October, 1941, to March, 1942, when Anton Schmidt was arrested and executed. (The prosecution had elicited the story because Kovner declared that he had first heard the name of Eichmann from Schmidt, who had told him about rumors in the Army that it was Eichmann who "arranges everything.") ….

During the few minutes it took Kovner to tell of the help that had come from a German sergeant, a hush settled over the courtroom; it was as though the crowd had spontaneously decided to observe the usual two minutes of silence in honor of the man named Anton Schmidt. And in those two minutes, which were like a sudden burst of light in the midst of impenetrable, unfathomable darkness, a single thought stood out clearly, irrefutably, beyond question - how utterly different everything would be today in this courtroom, in Israel, in Germany, in all of Europe, and perhaps in all countries of the world, if only more such stories could have been told. 

For Arendt, great civil disobedients from Socrates to Thoreau play essential roles in the political realm. What is more, Arendt fully defends Daniel Ellsberg’s release of the Pentagon Papers. It seems, therefore, appropriate to enlist her in support of modern-day whistleblowers.

There is, however, a problem with this reading. Socrates, Thoreau, and Ellsberg all gave themselves up to the law and allowed themselves to be judged by and within the legal system. In this regard, they differ markedly from Snowden, Manning and others who have sought to remain anonymous or to flee legal judgment. For Arendt, this difference is meaningful.

Consider the case of Shalom Schwartzbard, which Arendt addresses in Eichmann in Jerusalem. Schwartzbard was a Jew who assassinated the leader of the Ukrainian pogroms in the streets of Paris. Schwartzbard remained where he took his revenge, waited for the police, admitted his act, and put himself on trial. He claimed to have acted justly at a time when the legal system was refusing to do justice. And a French jury acquitted him.

For Arendt, the Schwartzbard case stands for an essential principle of justice: that to break the law and act justly, one must then bring oneself back into the law. She writes:

He who takes the law into his own hands will render a service to justice only if he is willing to transform the situation in such a way that the law can again operate and his act can, at least posthumously, be validated.

What allows Schwartzbard to serve the end of justice is that he took the risk of putting himself on trial and asked a court of law and a jury to determine whether what he did was just, even if it was also illegal. By doing so, Schwartzbard not only claimed that his act was a matter of personal conscience; he insisted as well that it was legal if one understood the laws rightly. He asked the representatives of the law—the French jury—to publicly agree with his claim and to vindicate him. He had no guarantee they would do so. When they did, their judgment brought the justice of Schwartzbard’s act to the bright light of the public and also cast the legal system’s inaction—its refusal to arrest war criminals living openly in Paris—in the shadow of darkness.

When I have suggested to colleagues and friends that Snowden’s flight to Moscow and his refusal to stand trial make it impossible to see his release of the NSA documents as an act of justice, their response mirrors the argument made by Daniel Ellsberg. Ellsberg—who turned himself over to the police after releasing the Pentagon Papers—has defended Snowden’s decision to flee. The United States of 2013, he argues, is simply no longer the United States of the 1960s. When Ellsberg turned himself in, he was released on bail and given legal protections. He has no faith that the legal system today would treat Snowden with such respect. More likely Snowden would be imprisoned, possibly in solitary confinement. Potentially he would be tortured. There is every reason to believe, Ellsberg and others argue, that Snowden would not receive a fair trial. Under such circumstances, these supporters argue, Snowden’s flight is justifiable.

I fully admit that Snowden would likely have been treated much less generously than was Ellsberg. But aside from the fact that Snowden never gave the courts the chance to treat him justly, his refusal to submit to the law makes it impossible for his act of disobedience to shine forth as a claim of doing justice. He may claim that he acted in the public interest. He may argue that he acted out of conscience. And he may say he wants a public debate about the rightness of U.S. policy. He may be earnest in all these claims. But the fact that he fled and did not “transform the situation in such a way that the law can again operate and his act can be validated” means that he does not, in the end, “render a service to justice.” On the contrary, by fleeing, Snowden gives solace to those who portray him as a criminal and makes it easier for those who would discredit him to do so.


All of this is not to say that Snowden was wrong to release the NSA documents. It is clearly the case that the security state has gone off the rails and become encased in a bubble of fearful conformity that justifies nearly any act in the name of security. We do need a public conversation about these policies, and to the extent that Snowden and Manning have helped to encourage one, I am thankful to them. That said, Manning’s anonymity and Snowden’s flight have actually distracted attention from the question of the justice of their acts and focused attention instead on their motives and personal characters. They have, by resisting the return to law, diluted their claims to act justly.

It is a lot to ask that someone risk their life to act justly. But the fact that justice asks much of us is fundamental to the nature of justice itself: justice, as opposed to legality, is always extreme, exceptional, and dangerous. Arendt knew well that those who act justly may lose their lives, as did Socrates and Anton Schmidt. She knew well that those who act justly may lose their freedom, like Nelson Mandela. But she also knew that even those who die or are isolated will, by their courage in the service of justice, shine light into a world of shadows.

Peter Ludlow’s essay on the Banality of Systematic Evil is well worth reading. He is right that it is important for individuals to think for themselves and be willing to risk civil disobedience when they are convinced that bureaucracies have lost their moral bearings.  It is your weekend read. And if you want to read more about Arendt and the demands of justice, take a look at this essay on Arendt’s discussion of the Shalom Schwartzbard case.

-RB

17May/130

The MOOCs Debate Continues

ArendtWeekendReading

Thinking stops us. To think is to slow down, even stop, turn around, and reflect. There is that famous scene in the Symposium where Socrates simply stands there in the street for hours, thinking. Barbara Sukowa, in the new film Hannah Arendt, smokes in silence for minutes on end to offer an exemplary sense of what it means to stop and think. One might even subtitle the film “Smoking and Thinking,” a reminder of one loss—amidst many benefits—that health concerns and the end of smoking mean for our thinking lives.

Thinking is especially important at a time of excitement and speed, when everybody around you is rushing headlong into the newest 'new thing'. The new thing in the world of teaching is, of course, online education and particularly the MOOC, the massive open online courses that seemingly everyone now wants to offer. There is a steamroller effect in the air, the fear that if we don’t get on board we will be left behind, standing alone in front of our blackboards lecturing to empty seats.


Or worse, that we will become an army of low-paid assistants to superstar professors. Outside of these professional and personal concerns, there is the worry that the rush to online courses and online education will cheapen education.

Aaron Bady seeks to slow us down and think about MOOCs in his recent essay in The New Inquiry. Here is how he describes our current moment:

In the MOOC moment, it seems to me, it’s already too late, always already too late. The world not only will change, but it has changed. In this sense, it isn’t simply that “MOOCs are the future, or online education is changing how we teach,” in the present tense. Those kinds of platitudes are chokingly omnipresent, but the interesting thing is the fact that the future is already now, that it has already changed how we teach. If you don’t get on the MOOC bandwagon, yesterday, you’ll have already been left behind. The world has already changed. To stop and question that fact is to be already belated, behind the times.

The first thing I want to do, then, is slow us down a bit, and go through the last year with a bit more care than we’re usually able to do, to do a “close reading” of the year of the MOOC, as it were. Not only because I have the time, but because, to be blunt, MOOC’s only make sense if you don’t think about it too much, if you’re in too much of a hurry to go deeply into the subject.

Bady is right to ask that we slow down, and of course, this is happening. Amherst College and Duke University recently voted to pull out of EdX and rethink their online strategies. The philosophy department at San Jose State, a university that is embracing MOOCs, issued a thoughtful open letter questioning the implementation and use of MOOCs. At Bard, where the Hannah Arendt Center is located, serious discussions and experiments are under way on how to use MOOCs and online education in pedagogically sound and innovative ways. Many schools that don’t get the press and attention associated with speedily adopting the MOOC model are thinking seriously about using MOOCs well, and more generally, about how to employ technology in ways that will enrich or expand the classroom educational experience. In this way, MOOCs are actually spurring reform and innovation in ways Bady does not consider.

Nevertheless, in asking that we breathe, stop and think, Bady does a great service. He clearly has worries about MOOCs. And the concerns are meaningful.

MOOC’s are literally built to cater to the attention span of a distracted and multi-tasking teenager, who pays attention in cycles of 10-15 minutes. This is not a shot at teenagers, however, but an observation about what the form anticipates (and therefore rewards and reproduces) as a normal teenager’s attention span. In place of the 50 minute lectures that are the norm at my university, for example, MOOCs will break a unit of pedagogy down into YouTube-length clips that can be more easily digested, whenever and wherever. Much longer than that, and it falls apart; the TED talk is essentially the gold standard.

MOOCs as they are today do break the large lecture into smaller bits. They require students to answer questions after a few minutes of the lesson to make sure they are following it. Before one can continue, one must in essence take a quiz to see if one is getting it. Let’s stipulate: this is juvenile. It treats the college student like a grammar school student, one who knows little and cannot be trusted to be attentive on his own, who needs big brother watching and making sure he is paying attention and learning at every minute.

In short, MOOCs threaten to change education to be about shorter, less demanding, more corporate lessons. The focus will be on skills and measurable learning. What will be sacrificed is the more difficult-to-measure experience of struggling with difficult ideas and the activity of thinking in public with others. Bady’s point, and he is right, is that a fully online education is hardly an education. It is a credential.

That may be true. But the sad fact is that for many if not most of our college students, college is more of a credential than an intellectual feast. Most students simply get very little out of large lectures.


If they are not sleeping or on Facebook, they are too often focused simply on learning what is necessary to pass the exam. This is a reality that many who criticize MOOCs are not facing up to—that our current educational system is, for large numbers of students, a sham; it is too often a waste of time and money.

Bady focuses on the last of these concerns and believes that the driving force of the arguments for MOOCs is economic. He writes:

But the pro-MOOC argument is always that it’s cheaper and almost never that it’s better; the most utopian MOOC-boosters will rarely claim that MOOCs are of equivalent educational value, and the most they’ll say is that someday it might be.

On this reasoning, MOOCs will soon take over the entirety of higher education, devaluing personal instruction. Bady is partly right. MOOCs will devalue a college degree, as ever more people can cheaply acquire one. But they will likely increase the value of a college degree from a physical university where students learn with real professors who care for and nurture them. In short, MOOCs will likely increase the attraction of and resources for those institutions that provide personal educations. There will always be some people who desire a meaningful education—although the number of people who do so is likely smaller than academics would like to admit. What MOOCs allow is for us to provide cheaper and more effective credentialing educations for those who don’t actually want to invest the time, effort, and money in such an intellectual endeavor.

And this is where MOOCs have a real potential to provide a service, in separating out two now-confused aims of higher education. On the one hand, education is an intellectual pursuit, an opening of the mind to an historical, moral, beautiful, and previously hidden world. On the other, it is a credential for economic and social advancement. Of course these distinctions can be blurred, and too often they are blurred completely, so that education as an intellectual activity is cynically reduced to a credential. I think MOOCs can change this. By making the choice starker, we can let students choose which kind of education they want. And for those who simply want a credential, the MOOC option is probably better, cheaper, and more convenient.


Bady doesn’t take this seriously because he worries that MOOCs are being offered as a replacement for education at all levels. The confusion here, however, is a difficult one to speak about because the issue is one of elitism. We need to recognize that some colleges and some students aspire to offer and receive an education. Others are providing instead a certification. But since we call all of these different endeavors a “college education,” we confuse the question. One great side effect of the MOOC phenomenon is that we may once again be able to recall that not everyone in a society wants or needs a college education. The best answer is then to spend more resources on our abysmal system of high school teaching. But that is another story.

Bady’s essay is one of the best around on the MOOC phenomenon. It is well worth your time and is your weekend read.

-RB

To read more Arendt Center posts about education, teaching and MOOCs click here, here, here, and here.

29Apr/130

Performing thinking: Arendt’s Richard III

Arendtquote

"It is better for you to suffer than to do wrong because you can remain the friend of the sufferer; who would want to be the friend of and have to live together with a murderer? Not even a murderer.  What kind of dialogue could you lead with him? Precisely the dialogue which Shakespeare let Richard III lead with himself after a great number of crimes had been committed:

What do I fear? Myself? There’s none else by.
Richard loves Richard: that is, I am I.
Is there a murderer here? No. Yes, I am:
Then fly. What from myself?"
-Hannah Arendt, ‘Thinking and Moral Considerations’

‘Thinking and Moral Considerations’ is one of the most perfect examples of Arendt’s late writing. A distillation of her career-long thinking on thinking, the essay performs what it so elegantly urges: it is an essay on thinking that thinks.

For Arendt, the moral considerations that follow from thinking and, more grievously, from not thinking are profound. Adolf Eichmann’s “quite authentic inability to think” demonstrated to Arendt, when she attended his trial in 1961, the arrival of a new kind of evil in the world. The airy emptiness of his speech was not the stupidity of a loathsome toad: his jabbering of cliché falling upon cliché sounded totalitarianism’s evil in a chorus of thoughtlessness. Shallowness as exemplified by Eichmann cannot be fixed or given depth by reason; no doctrine will argue the thoughtless into righteousness. Only through the experience of thinking, Arendt insisted, of being in dialogue with oneself, can conscience again be breathed into life. Thinking may be useless in itself; it may be a solitary activity that can often feel a little bit mad. Yet thinking is the precondition for the return of judgment, of knowing and saying: “this is not right.” By 1971, Arendt saw no evidence of a resurgence of thinking in the wake of atrocity.


Writing an essay on thinking that thinks and thus performing the experience of thinking is itself an act of defiance. Performing is the right verb here: Arendt knows she is staging her argument as a public spectacle. Her hero is Socrates: gadfly, midwife, stingray, provoker, deliverer and galvaniser of thinking in others. Socrates democratises perplexity. And when he has finished chatting with others, he carries on talking at home, with his quizzical, critical companion, that ‘obnoxious fellow’ with whom we are forever in dialogue -- the two with whom we make a thinking one.  Arendt is fully aware that she is making a character out of Socrates. His inveterate dialogism is a model. Just as Dante’s characters conserve as much historical reality as the poet needs to make them representative, so too, she says, with her Socrates. Against the vacant image of Eichmann inanely mouthing his own eulogy in front of the hangman’s noose which opens the essay, we have Socrates: thoughtlessness versus thoughtfulness.

But what of the third character in Arendt’s essay—Shakespeare’s Richard III? The murderer who nobody wants to befriend? The villain who despite his best efforts cannot stop talking to himself?

Richard plays an odd, yet pivotal, role in Arendt’s performance of thinking. On the one hand, he is Socrates’ evil twin. Richard rejects conscience. ‘Every man that means to live well endeavours … to live without it’, he says. This is easy enough to do, says Arendt, because ‘all he has to do is never go home and examine things.’ Except, in Richard’s case, this proves difficult. He may try to avoid going home, but eventually he runs into himself at midnight; and in solitude, like Socrates, Richard cannot help but have intercourse with himself. Alone he speaks with himself in soliloquies (from the Latin solus, alone, and loqui, to speak; Arendt’s beloved Augustine is believed to have first conceived the compound). And this is what makes this villain—one whom many have wanted to claim for the calculating murderousness of the twentieth century—much more like Socrates than Eichmann.

Both Socrates and Richard have the capacity to think. True, Richard thinks himself into villainy—he ‘proves himself a villain’—but this is precisely his pathos in Arendt’s drama. If it is better to suffer than to do harm, it is also better to have suffered at the hands of Richard, who at least thought about what he was doing, than to have suffered as a number in one of Eichmann’s filing cards, the pathetic loner who joins a murderous movement not because he’s frightened of who might await him at home, but because he doesn’t even suspect anyone might be there in the first place. For all the ham-fisted productions that want him to be, Richard is not a Nazi villain in early modern disguise. Better that he could have been, of course, because then we wouldn’t have to contemplate the particular thoughtlessness of contemporary evil.

Richard is no Osama Bin Laden, Colonel Gaddafi or Saddam Hussein either, despite comparable violent last stands (and the corpse lust that attended them). This is well understood by Mark Rylance in his recent performance of Richard in the Globe Theatre production that played in London last year and is rumoured to open on Broadway soon. Rylance’s performance of Richard is like no other. It is also a performance that makes Arendt’s thinking more relevant than ever.


Mark Rylance in the title role of Richard III at Shakespeare’s Globe,
London, 2012, directed by Tim Carroll. Photographer: Simon Annand.

Rylance understands that since the War on Terror, post 9/11, Iraq, Afghanistan, after Guantanamo, rendition and drone wars, it would be a travesty to play Richard’s villainy as safely and exotically other (by contrast, in 1995 it was entirely possible to set the play in a 1930s Nazi context, and have Ian McKellen play the role for its cruel humour with a knowing nod to Brecht). Rylance’s Richard is plausible, pathetic even; he is compelling not in his all-too-evident evil but in his clumsy vulnerability. His creepy teeth sucking and ever-twisting body mark a silent but persistent cogitation; he is a restless, needy villain. Like a child, Rylance’s Richard grabs at his conscience—he thinks—and then chucks it away as one more ‘obstacle’, just as he spits in his mother’s face at the very moment he most desires she recognise him. In a neat echo of Arendt’s analysis of how the loneliness of totalitarianism feeds thoughtless evil, the loveless hunchback fights solitude in an effort to avoid the midnight hour; orchestrating collective murder is his defence against being alone with his thoughts. (This was observed by my theatre companion who, being ten years old—and a British schoolboy—understands the connection between feeling left out and group violence well.) Richard’s tragedy is that circumstances turned him into a serial killer; to this extent he is a conventional villain. His pathos, however, as this production shows, is to be poised between thinking and thoughtlessness, between Socrates and Eichmann.

‘No. Yes, I am/Then fly. What from myself?’ When Rylance speaks this soliloquy he stutters slightly, giggles and looks—as Arendt might have anticipated—a little perplexed. This is not a knowing perplexity; Richard does not master his conscience, nothing is done with the solitary dialogue, but the thinking is there even if Richard himself seems unsettled by its presence. In refusing to play Richard simply as one of the ‘negative heroes in literature’ who, Arendt argues, are often played as such ‘out of envy and resentment’, Rylance brilliantly captures the last moment before evil becomes banal.

To play Richard’s cruelty alongside his vulnerability is not to fail to recognise his villainy, as some have complained; rather, it is to dramatize the experience of thinking in the process of being painfully and violently lost. Pathos, we might think, is the only way to play Richard III today. The Globe’s production is a late, but utterly timely, companion to Arendt’s essay.

-Lyndsey Stonebridge

11Jan/130

Infinitely Intoxicating

Louis Pasteur once wrote:

I see everywhere in the world, the inevitable expression of the concept of infinity…. The idea of God is nothing more than one form of the idea of infinity. So long as the mystery of the infinite weighs on the human mind, so long will temples be raised to the cult of the infinite, whether it be called Bramah, Allah, Jehovah, or Jesus…. The Greeks understood the mysterious power of the hidden side of things. They bequeathed to us one of the most beautiful words in our language—the word ‘enthusiasm’—En Theos—“A God Within.” The grandeur of human actions is measured by the inspiration from which they spring. Happy is he who hears a god within, and who obeys it. The ideals of art, of science, are lighted by reflection from the infinite.

To bear a god within is not an easy task for us mortals. The god within—even more so than the god without—demands to be obeyed. Having a god inside us—or, like Socrates, a daimon on our shoulder—is no recipe for happiness.

It can lead to unbearable obligation and even to martyrdom. And, if the god is a muse, it can lead to the travails of the artist.

All great art and all great artists are consumed by the infinite. As Oscar Wilde once wrote, “We are all in the gutter, but some of us are looking up at the stars.” Those are the artists, the ones who amidst the muck feel part of something higher, something everlasting, the infinite.

The great enemy of the infinite is reason. Reason is calculating. It is rational. It is logical. It insists that everything is knowable and comprehensible. Ends justify means. And means can achieve ends. Reason insists on explanation. The self—the mystery—must be made knowable.

David Brooks in the NY Times today lauds the entry of behavioral psychology into politics and policy. We want to know, he writes, how to get people to vote and how to get Congress to cut the deficit. If science can tell us what to put in their drinking water, how to frame the question, what books to read to them in utero, or how to rewire their brains to be rational, wouldn’t that make policy all the more reasonable? Wouldn’t that be a good thing?

Science can make us more rational. That of course is the dream of people like Ray Kurzweil as well as the social scientists who insist that humans can be studied like rats. Let’s not object to the fact. We can be studied like rats, and that is what university social science departments around the country and the world are doing every day. This research is eminently useful, as Brooks rightly remarks. If we employ it, we can be made to be more reasonable.

What the rationalization of humanity means, however, is not a question science can answer. Max Weber began the study of the rationalization of mankind when he proposed that the rise of the enlightenment and the age of reason was bringing about an “Entzauberung” or a “de-magicification” of the world. Capitalism emerged at this time for a number of reasons, but one main reason, Weber understood, was that capitalism provided in the profit motive rational and objective criteria for measuring human endeavors. The problem, as Weber so well understood, is that the elevation of reason and rationality brought about the devaluation of all highest values—what Nietzsche would call nihilism. This is because reason, derived from ratio, is always a relation. All values are relative. In such a world, nothing is infinite. Stuck amidst the relations of means and ends, everything is a calculation. All is a game. There is no purpose or meaning to the game of life. As we become more rational, we also become less consumed by the infinite. That is the true danger of the rise of the social sciences and our rationality-consumed culture that insists that all human behavior be made understandable so that it can be made better.

In The Human Condition, Hannah Arendt is concerned with the way that the rise of reason and rationality is challenging the quintessence of the human condition—at least as that human condition has been experienced and known since the dawn of humanity. The rise of the social sciences, she writes over and over, is subjecting the mystery and fecundity of human action to the law of large numbers. While each and every human action may in itself be surprising and mysterious, it is nevertheless true that studied in groups and analyzed over time, human action does fall into comprehensible patterns. The more we study and know these patterns, the more we come to think of humans as predictable animals rather than surprising and spontaneous selves. This sociological and psychological reduction of man to animal is very much at the heart of what Arendt is opposing in her book.

Nowhere is the rationality of our times more visible than in the victory of labor and the marginalization of art. We are, all of us, laborers today. That is why the first question we ask others we meet is: What do you do?  Our labor defines us. It gives our lives meaning in that it assigns us a use and a value. Even professors, judges, and presidents now say regularly: this is my job. By which we mean, don’t blame us for what we do. Don’t hold me to some higher standard. Don’t expect miracles. It is our job to do this. We do this to make a living.

The one group in society that is at times excepted from this reduction to labor is artists. But even the artist today is taken less and less seriously. Insofar as artists are enthusiasts consumed with the infinite, they are ignored or viewed as marginal. Art is reduced to playfulness. A hobby. “From the standpoint of ‘making a living,’ every activity unconnected with labor becomes a ‘hobby.’” And those artists who are taken seriously, whose work is bought and sold on the art market, turn artistic work into the job of making a living.

Art, Arendt writes, is a process of magic. Citing a poem by Rainer Maria Rilke, she insists that the magic of art is the artist’s transfiguration of something ordinary—the canvas, clay or word—into something extraordinary, an expression of the infinite in the finite world of things.

Because art figures the infinite, poetry is the “most human” of the arts and the art that “remains closest to the thought that inspired it.” The poem, of all artworks, is the most lasting because its medium is the least subject to decay. It is the closest expression of the infinite we humans possess.

Ralph Waldo Emerson, whose resonance with Arendt in so many things has been too infrequently remarked, agrees that poetry is the art form in which the individual artist can access and figure in the world a public and common truth. In “The Poet,” Emerson writes:

It is a secret which every intellectual man quickly learns, that beyond the energy of his possessed and conscious intellect, he is capable of a new energy (as of an intellect doubled on itself ), by abandonment to the nature of things; that, beside his privacy of power as an individual man, there is a great public power on which he can draw by unlocking, at all risks, his human doors and suffering the ethereal tides to roll and circulate through him: then he is caught up into the life of the universe; his speech is thunder; his thought is law, and his words are universally intelligible as the plants and animals. The poet knows that he speaks adequately, then, only when he speaks somewhat wildly, or, “with the flower of the mind”; not with the intellect used as an organ but with the intellect released from all service…inebriated by nectar. As the traveler who has lost his way throws his reins on his horse’s neck and trusts to the instinct of the animal to find his road, so must we do with the divine animal who carries us through this world. For if in any manner we can stimulate this instinct, new passages are opened for us into nature, the mind flows into and through things hardest and highest, and the metamorphosis is possible. This is the reason why bards love wine, mead, narcotics, coffee, tea, opium, the fumes of sandalwood and tobacco, or whatever other species of animal exhilaration. All men avail themselves of such means as they can to add this extraordinary power to their normal powers, and to this end they prize conversation, music, pictures, sculpture, dancing, theaters, traveling, wars, mobs, fires, gaming, politics, or love, or science, or animal intoxication, which are several coarser or finer quasi-mechanical substitutes for the true nectar, which is the ravishment of the intellect by coming nearer to the fact.

I take this quotation from Emerson’s “The Poet” from an exceptional recent essay by Sven Birkerts. The essay appears in the latest edition of Lapham’s Quarterly, an entire issue focusing on the merits of and need for inebriation.

As Birkerts writes:

For Emerson, the intoxication is not escape but access, a means of getting closer to “the fact,” which might, with heartfelt imprecision, be called life itself. What he means by “public power,” I think, is something like what Carl Jung and others later meant by the phrase collective unconscious, the emphasis falling on the unconscious, that posited reservoir of our shared archetypes and primordial associations—that which reason by itself cannot fathom, for it is, in essence, antithetical to reason.

Birkerts reflects not only on the need for inebriation in the pursuit of artistic infinity, but also on the decreasing potency of intoxicants today. For him, the rise of the mass market in art, the globalization of experience, and the accessibility of all information have made the world smaller, knowable, and accountable. What is lost in such access is precisely the portal to the infinite.

Artistically and in almost every other way ours has become a culture of proliferation. Information, perspectives, as well as the hypercharged clips and images of our global experience are within the radius of the keystroke. Nothing is unspoken, nothing is unaccounted. Every taste is given a niche and every niche is catered to. Here, one might argue, is more material than ever; here are opportunities for even greater acts of synthesis. But I am skeptical. Nietzsche wrote in Thus Spoke Zarathustra, “Nothing is true, everything is permitted.” The temptation is to invert the phrases and ascribe causality: where everything is permitted, nothing is true. Where nothing is true, where is the Emersonian fact to be found? This bears directly on the artist’s task. The idea that writers can keep producing grandly synthesizing or totalizing work—that has the ring of truth, of mattering—is debatable.

Birkerts’s essay may not be the intoxicant of your choice this weekend, but it should be. It is your weekend read. And you might check out the surprising selection at the bar at Lapham’s Quarterly as well.

And for those with time to spare: Arthur Koestler, from whom I first learned of the Louis Pasteur quote at the top of this essay, was consumed with the connection between intoxication and the infinite. I have discussed Koestler’s pursuit of the infinite at length. You can read that discussion here.

-RB

5Dec/120

The “E” Word

The New York Times tells the story of Benjamin Goering. Goering is 22. Until recently he studied computer science and philosophy at the University of Kansas. He felt “frustrated in crowded lecture halls where the professors did not even know his name.” So Goering dropped out of college and went to San Francisco, where he got a job as a software engineer.

I applaud Goering for making a risky decision. College was not for him. This does not mean he wasn’t smart or couldn’t cut it. He clearly has talent, and it was being wasted in courses that did not interest him and that were costing him and his family many tens of thousands of dollars every year. In leaving, Goering made the right decision for him. Indeed, many more college students should make the same decision he did. There are huge numbers of talented people who are simply not intellectuals and don’t enjoy or get much out of college. This is not destiny. A great or good teacher might perk them up. But largely it is a waste of their time and money to struggle through (or sleep through) classes that bore them. If anything, the forced march through Shakespeare and Plato makes these students less engaged, more cynical, and more self-centered as they turn from common sense to the pursuit of self-interest in partying and private life.

The story should raise the big question that everyone tiptoes around in this current debate about college: Who should go to college?

The obvious answer is: those who want to and those who care about ideas. Those who see that in thinking, reading, and talking about justice, democracy, the scientific method, and perspective, we are talking about what it means to live in a large, democratic, bureaucratic country at a time of transition from an industrial to an information-age economy. College, in other words, is for those people who want to think about their world. It is for people who are willing and eager to turn to the great thinkers who came before them and, also, to the innovative scientists and artists who have revealed hidden secrets about the natural and human worlds. It is, in other words, for intellectuals. And this of course raises the “E” question: the question of elitism.

It is folly to think that everyone is or should be interested in such an endeavor. In no society in history have intellectuals been anything but a small minority of the population. This is not a question of privilege. There is no reason to think that those who love ideas are better or more qualified than those who work the earth, build machines, or engineer websites. It may very well be otherwise.

Hannah Arendt was clear that intellectuals had no privileged position in politics. On the contrary, she worried that the rise of intellectuals in politics was especially dangerous. Intellectuals, insofar as they can get lost in and be captivated by ideas, are prone to lose sight of reality in the pursuit of grand schemes. And intellectuals, captivated by the power of reason, are susceptible to rationalizations that excuse wrongs like torture or suicide bombing as means necessary for greater goods. The increasing dominance of intellectuals in politics, Arendt argued, is one of the great dangers facing modern society. She thus welcomed the grand tradition of the American yeoman farmer and affirmed that there is no need to go to college to be an engaged citizen or a profound thinker. Abraham Lincoln never attended college, and he did just fine. It is simply ridiculous to argue that college is a necessary credential for statesmanship.

While intellectuals have no special claim to leadership or prominence, they are nevertheless important. Intellectuals—those who think—are those people in society who stand apart from the mainstream pressures of economy and influence and outside the political movements of advocacy and propaganda. In the Arendtian tradition, intellectuals are or can be conscious pariahs, those who look at their societies from the outside and thus gain a perspective from distance that allows them to understand and comprehend the society in ways that people deeply embedded within it cannot. Those who stand apart from society and think are important, first because they preserve and deepen the stories and tales we as a society tell about ourselves. In writing poetry, making art, building monuments, writing books, and giving speeches, intellectuals help lend meaning and gravity to the common sense we have of ourselves as a people.

One problem we have in the current debate is that college has morphed into an institution designed to do many (too many) things. On the one hand, college has historically been the place for the education and formation of intellectuals. But for many decades, if not centuries, that focus has been shifting. Today college is still a place for the life of the mind. But it is also a ticket into the middle or upper-middle classes, and it is equally a job-training and job-certification program. It is, moreover, a consumer good that brands young people with a certain mystique and identity. For many localities, colleges are themselves job-creation machines, bringing with them all sorts of new businesses and throwing off patents and graduates who reinvigorate local communities. The university is now a multiversity, to invoke Clark Kerr’s famous term. When we talk about college today, the debate is complicated by these multiple roles.

It is difficult to raise such issues today because they smack of elitism. Because college-educated people think themselves superior to those without a fancy diploma, their egalitarianism insists that everyone should have the same experience. We are not supposed to entertain the idea that some people may not want to go to college. Instead, we are told that if they had a better education, if they knew better, if they were just taught to understand, they would all want to sit in classrooms and read great books or do exciting experiments.

We are stuck today with what Hannah Arendt called, in a related context, the “democratic mentality of an egalitarian society that tends to deny the obvious inability and conspicuous lack of interest of large parts of the population in political matters as such.” In politics, Arendt argued that what was needed were public spaces from which a self-chosen “élite could be selected, or rather, where it could select itself.” Similarly, in education today, colleges should be the spaces where those who want to select themselves as an educated élite might lose themselves in books and experiments and amongst paintings and symphonies. There is simply no reason to assume that most people in society need to be or should be interested in such an endeavor.

One reason the question of elitism is so present in debates about college is the disgusting and degenerate state of American public high schools. If high schools provided a serious and meaningful civic education, if they taught not simply reading and writing and arithmetic, but history and art—and taught these well—we would not need to send students to remedial education in college where they could be taught these subjects a second time. While many academics wring their hands about making college available to all, they might do much better if they focused on high schools and grammar schools around the country. If we were to redistribute the billions of dollars we spend on remedial college education to serious reform efforts in high schools, that money would be very well spent.

To raise the question of elitism means neither that college should be open only to the rich and connected (on the contrary, it should be open to all who want it), nor that the educated elite is to be segregated from society and kept apart in an ivory tower. When one reads Shakespeare, studies DNA, or dances with Bill T. Jones, one is not simply learning for learning's sake. Few understood this better than John Finley, Greek Professor at Harvard, who wrote General Education in a Free Society in 1945. Finley had this to say about the purposes of a college education:

The heart of the problem of a general education is the continuance of the liberal and humane tradition. Neither the mere acquisition of information nor the development of special skills and talents can give the broad basis of understanding which is essential if our civilization is to be preserved…. Unless the educational process includes at each level of maturity some continuing contact with those fields in which value judgments are of prime importance, it must fall short of the ideal.

What college should offer—as should all education at every level except for the most specialized graduate schools—is the experience of thinking and coming to engage with the world in which one lives. College is, at its best, an eye-opening experience, an opportunity for young people to learn the foundational texts and also be exposed to new cultures, new ideas, and new ways of thinking. The ideas of justice, truth, and beauty one learns are not valuable in themselves; they are meaningful only insofar as they impact and inform our daily lives. To read Plato’s Republic is to ask: what is the value of the ideas of the good and the just? It is also to meditate on the role of music and art in society. And at the same time, it is to familiarize oneself with characters like Socrates and Plato who, in the world we share, epitomize the qualities of morality, heroism, and the pursuit of truth wherever it might lead. This can also be done in high schools. And it should be.

It is simply wrong to think such inquiries are unworldly or overly intellectual. Good teachers teach great texts not simply because the books are old, but because they are meaningful. And young students return to these books generation after generation because they find in them stories, examples, and ideas that inspire them to live their lives better and more fully.

As Leon Botstein, President of Bard College where the Hannah Arendt Center is located, writes in his book Jefferson’s Children,

No matter how rigorous the curriculum, no matter how stringent the requirements, if what goes on in the classroom does not leave its mark in the way young adults voluntarily act in private and in public while they are in college, much less in the years after, then the college is not doing what it is supposed to do.

The basic question being asked today is: Is college worthwhile? It is a good question. Too many colleges have lost their way. They no longer understand what they are there to offer. Faculty frequently put research above teaching. Administration is the fastest-growing segment of university education, which is evidence, if anything is, that universities simply do not know what their mission is anymore. It is no wonder, then, that many of our brightest young people are beginning to shy away from the thoughtless expectation that one must attend college.

All around us, people are opting out of college. The mania for online education is at least in part fueled by the hunger for knowledge among students and others who do not want or need to attend college. The Times highlights UnCollege and other organizations that advocate “hacking” your education. Recall that Lincoln was better schooled in the classics of poetry and politics than almost every college-educated President who followed him. At a time when many colleges are confused and trying to do so many things, they often do none of them well. It may be that today we need to evolve new networks and new organizations where intellectualism can flourish. And it may be that small liberal arts colleges are more flexible and more able to make that transition than large, bureaucratic research institutions.

The real question this debate needs to raise, but avoids, is: Who should get a college education? The answer, “not everyone,” is one few want to hear. And yet it might be the beginning of a real conversation about what a college education is for and why we are today so often failing to provide it to our students.

-RB

 

5Nov/120

A Sorry Bunch of Dwarfs

Freeman Dyson, the eclectic physicist, took good aim at philosophy last week in a review of Jim Holt's silly book Why Does the World Exist?: An Existential Detective Story. Holt went around to "a portrait gallery of leading modern philosophers" and asked them the Leibnizian question: Why is there something rather than nothing? The book offers their answers, along with biographical descriptions.

For Dyson, Holt's book "compels us to ask" these "ugly questions." First, "When and why did philosophy lose its bite?" Philosophers were once important. In China, Confucius and his followers made a civilization. So too in Greece did Socrates and then the schools of Plato and Aristotle give birth to the Western world. In the Christian era Jesus and Paul, then Augustine and Aquinas, granted depth to dominant worldviews. Philosophers like Descartes, Hobbes, and Leibniz were central figures in the scientific revolution, and philosophical minds like Nietzsche, Heidegger, and Arendt (even if one was a philologist and the other two refused the name philosopher) have become central figures in the experience of nihilism. Against these towering figures, the "leading philosophers" in Holt's book cut a paltry figure. Here is Dyson:

Holt's philosophers belong to the twentieth and twenty-first centuries. Compared with the giants of the past, they are a sorry bunch of dwarfs. They are thinking deep thoughts and giving scholarly lectures to academic audiences, but hardly anybody in the world outside is listening. They are historically insignificant. At some time toward the end of the nineteenth century, philosophers faded from public life. Like the snark in Lewis Carroll's poem, they suddenly and silently vanished. So far as the general public was concerned, philosophers became invisible.

There are many reasons for the death of philosophy, some of which were behind Hannah Arendt's refusal to call herself a philosopher. Philosophy was born, at least in its Platonic variety, out of the thinker's reaction to the death of Socrates. Confronted with the polis that put the thinker to death, Plato and Aristotle responded by retreating from the world into the world of ideas. Philosophical truth separated itself from worldly truths, and idealism was born. Realism was less a return to the world than a fantasy in reaction to idealism. In both, the truths that were sought were otherworldly truths, disconnected from the world.

Christianity furthered the divorce of philosophy from the world by imagining two distinct realms, the higher realm existing beyond the world. Science, too, taught that truth could be found only in a world of abstract reason, divorced from real things. Christianity and science together gave substance to the philosophical rebellion against the world. The result, as Dyson rightly notes, is that philosophy today is as abstract, unworldly, and irrelevant as it is profound.

What Dyson doesn't explore is why the philosophers of the past had such importance, even as they too thought about worlds of ideas. The answer cannot be that ideas had more import in the past than now. On the contrary, we live in an age more saturated in ideas than any other. More people today are college educated, literate, and knowledgeable about philosophy than at any period in the history of the world. Books like Holt's are proof positive of the profitable industry in philosophical trinkets. That is the paradox: at a time when philosophy is read by more people than ever, it is less influential than ever.

One explanation for this paradox is nihilism—the devaluing or re-valuing of the highest values. The truth about truth turned out to be neither so simple nor so singular as the philosophers had hoped. An attentive inquiry into the true and the good led not to certainty, but to ideology critique. For Nietzsche, truth, like the Christian God, was a human creation, and the first truth of our age is that we have recognized it as such. That is the precondition for the death of God and the death of truth. Nihilism has not expunged ideas from our world, but multiplied them. When speaking about the "true" or the "good" or the "just," Christians, Platonists, and moralists no longer have the stage to themselves. They must now shout to be heard amongst the public relations managers, advertisers, immoralists, epicureans, anarchists, and born-again Christians.

Dyson ignores this strain of philosophy. He does point out that Nietzsche was the last great philosopher, but then dismisses Heidegger, who "lost his credibility in 1933," and even Wittgenstein, who would fall silent if a woman attended his lectures and remain so until she left. And yet it is Heidegger who has given us the great literary masterpieces of twentieth-century philosophy.

His work on technology ("The Question Concerning Technology") and art ("The Origin of the Work of Art") has been widely read in artistic, literary, and lay circles. It is hard to imagine a philosopher more engaged with science and literature than Heidegger was. He read physics widely, co-taught courses at the house of the Swiss psychiatrist Medard Boss, and taught seminars with the German novelist Ernst Jünger.

It seems worthwhile to end with a poem of Heidegger's from his little book, Aus der Erfahrung des Denkens/From Out of the Experience of Thinking:

Drei Gefahren drohen dem Denken
Die gute und darum heilsame Gefahr ist die Nachbarschaft des singenden Dichters.
Die böse und darum schärfste Gefahr ist das Denken selber. Es muß gegen sich selbst denken, was es nur selten vermag.
Die schlechte und darum wirre Gefahr ist das Philosophieren.

Three dangers threaten thinking.
The good and thus healthy danger is the nearness of singing poetry.
The evil and thus sharpest danger is thinking itself. It must think against itself, something it can do only rarely.
The bad and thus confusing danger is philosophizing.

-RB

25Sep/120

Does the President Matter?

“Hence it is not in the least superstitious, it is even a counsel of realism, to look for the unforeseeable and unpredictable, to be prepared for and to expect “miracles” in the political realm. And the more heavily the scales are weighted in favor of disaster, the more miraculous will the deed done in freedom appear.”

                        —Hannah Arendt, What is Freedom?

This week at Bard College, in preparation for the Hannah Arendt Center conference "Does the President Matter?", we put up two writing blocks around campus, multi-paneled chalkboards that invite students to respond to the question: Does the President Matter? The blocks generated quite a few interesting comments. Many mentioned the Supreme Court. Quite a few invoked the previous president, war, and torture. And, since we are at Bard, others responded: it depends what you mean by matters.

This last comment struck me as prescient. It does depend on what you mean by matters.

If what we mean is, say, an increasing and unprecedented power in a democratic leader not seen since the time of enlightened monarchy, the president does matter. We live in an age of an imperial presidency. The President can, and does, send our troops into battle without the approval of Congress. The President can, and does, harness the power of TV, the Internet, and Twitter to bypass his critics and reach the masses more directly than ever before. The President can, and does, appoint Supreme Court Justices with barely a whimper from the Senate; and the President’s appointments can, and do, swing the balance on a prisoner’s right to habeas corpus, a woman’s right to choose, or a couple’s right to marry.

And yet, what if by matter, we mean something else? What if we mean, having the power to change who we are in meaningful ways? What if by matter we mean: to confront honestly the enormous challenges of the present? What if by matter we mean: to make unpredictable and visionary choices, to invite and inspire a better future?

On the really big questions—the thoughtless consumerism that degrades our environment and our souls; the millions of people who have no jobs and increasingly little prospect of productive employment; the threat of devastating terrorism; and the astronomical national debt, $16 trillion and counting for the US, or $140,000 for each taxpayer. Add to that the deficiency in public pension obligations (estimated at anywhere from $1 trillion to $5 trillion), not to mention the $1 trillion of inextinguishable student debt that is creating a lost generation of young people whose lives are stifled by unwise decisions made before they were allowed to buy a beer.

This election should be about a frank acknowledgement of the unsustainability of our economic, social, and environmental practices and expectations. We should be talking together about how to remake our future in ways that are both just and exciting. This election should be scary and exhilarating. But so far it is small-minded and ugly.

Around the world, we witness distrust and disdain for government. In Greece there is a clear choice between austerity and devaluation, but Greek leaders have saddled their people with half-hearted austerity that causes pain without prospect of relief. In Italy, the paralysis of political leaders has led to resignation and the appointment of an interim technocratic government. In Germany, the most powerful European leader delays and denies, trusting that others will blink each time they are brought to the mouth of the abyss.

No wonder that the Tea Party and Occupy Wall Street in the US, and the Pirate Parties in Europe share a common sense that liberal democratic government is broken. A substantial—and highly educated—portion of the electorate has concluded that our government is so inept and so compromised that it needs to be abandoned or radically constrained. No president, it seems, is up to the challenge of fixing our broken political system.

Every President comes to Washington promising reform! And they all fail. According to Jonathan Rauch, a leading journalist for The Atlantic and the National Journal, this is inevitable. He has this to say in his book Government's End:

If the business of America is business, the business of government programs and their clients is to stay in business. And after a while, as the programs and the clients and their political protectors adapt to nourish and protect each other, government and its universe of groups reach a turning point—or, perhaps more accurately, a point from which there is no turning back. That point has arrived. Government has become what it is and will remain: a large, incoherent, often incomprehensible mass that is solicitous of its clients but impervious to any broad, coherent program of reform. And this evolution cannot be reversed.

On the really big questions of transforming politics, the President is, Rauch argues, simply powerless. President Obama apparently agrees. Just last week he said, in Florida: "The most important lesson I've learned is that you can't change Washington from the inside. You can only change it from the outside."

A similar sentiment is offered by Lawrence Lessig, a founding member of Creative Commons. In his recent book, Republic, Lost, Lessig writes:

The great threat today is in plain sight. It is the economy of influence now transparent to all, which has normalized a process that draws our democracy away from the will of the people. A process that distorts our democracy from ends sought by both the Left and the Right: For the single most salient feature of the government that we have evolved is not that it discriminates in favor of one side and against the other. The single most salient feature is that it discriminates against all sides to favor itself. We have created an engine of influence that seeks not some particular strand of political or economic ideology, whether Marx or Hayek. We have created instead an engine of influence that seeks simply to make those most connected rich.

The system of influence and corruption through PACs, SuperPACs, and lobbyists is so entrenched, Lessig writes, that no reform seems plausible. All that is left is the Hail Mary idea of a new constitutional convention—an idea Lessig promotes widely, as with his Conference on the Constitutional Convention last year at Harvard.

For Rauch on the Right and Lessig on the Left, government is so concerned with its parochial interests and its need to stay in business that we have forfeited control over it. We have, in other words, lost the freedom to govern ourselves.

The question "Does the President Matter?" is asked, in the context of the Arendt Center conference, from out of Hannah Arendt's maxim that freedom is the fundamental raison d'être of politics. In "What is Freedom?", Arendt writes:

“Freedom is actually the reason that men live together in political organization at all. Without it, political life as such would be meaningless. The raison d’être of politics is freedom.”

So what is freedom? To be free, Arendt says, is to act. Arendt writes: "Men are free as long as they act, neither before nor after; for to be free and to act are the same.”

What is action? Action is something done spontaneously. It brings something new into the world. Man is the being capable of starting something new. Political action, and action in general, must happen in public. Like the performing arts—dance, theatre, and music—politics and political action require an audience. Political actors act in front of other people. They need spectators, so that the spectators can be drawn to the action; and when the spectators find the doings of politicians right, or true, or beautiful, they gather around and form themselves into a polity. The political act, the free act, must be surprising if it is to draw people to itself. Only an act that is surprising and bold is a political act, because only such an act will strike others and make them pay attention.

The very word politics derives from the Greek polis, which is itself rooted in the Greek pelein, a verb used to describe the circular motion of smoke rings rising up from out of a pipe. The point is that politics is the gathering of a plurality around a common center. The plurality does not become a singularity in circling around a polestar, but it does acknowledge something common, something that unites the members of a polity in spite of their uniqueness and difference.

When President Washington stepped down after his second term; when President Lincoln emancipated the slaves; when FDR created the New Deal; when President Eisenhower called the Arkansas National Guard into Federal Service in order to integrate schools in Little Rock; these presidents acted in ways that helped refine, redefine, and re-imagine what it means to be an American.

Arendt makes one further point about action and freedom that is important as it relates to the question: Does the President Matter? Courage, she writes, is "the political virtue par excellence." To act in public is to leave the security of one's home and enter the world of the public. Such action is dangerous, for the political actor might be jailed for his crime or even killed. Arendt's favorite example of political courage is Socrates, who was killed for his courageous engagement with his fellow Athenians. We must always recall that Socrates was sentenced to death for violating Athenian law.

Political action also requires courage because the actor can suffer a fate even worse than death. He may be ignored. At least to be killed for one's ideas means that one is recognized as capable of action, of saying and doing something that matters. To be ignored, however, denies the actor the basic human capacity for action and freedom.

One fascinating corollary of Arendt's understanding of the identity of action and freedom is that action, any action—any original deed, any political act that is new and shows leadership—is, of necessity, something that was not done before. It is, therefore, always against the law.

This is an insight familiar to readers of Fyodor Dostoevsky. In Crime and Punishment Raskolnikov says:

Let's say, the lawgivers and founders of mankind, starting from the most ancient and going on to the Lycurguses, the Solons, the Muhammads, the Napoleons, and so forth, that all of them to a man were criminals, from the fact alone that in giving a new law they thereby violated the old one.

All leaders are, in important ways, related to criminals. This is an insight that Arendt and Nietzsche share.

Shortly after we began to plan this conference, I heard an interview with John Ashcroft speaking on the Freakonomics Radio Show. He said:

"Leadership in a moral and cultural sense may be even more important than what a person does in a governmental sense. A leader calls people to their highest and best. ... No one ever achieves greatness merely by obeying the law. People who do above what the law requires become really valuable to a culture. And a President can set a tone that inspires people to do that."

My first reaction was: This is a surprising thing for the Attorney General of the United States to say. My second reaction was: I want him to speak at the conference. Sadly, Mr. Ashcroft could not be with us here today. But this does not change the fact that, in an important way, Ashcroft is right. Great leaders will rise above the laws in crisis. They will call us to our highest and best.

What Ashcroft doesn't quite say, and yet Arendt and Dostoevsky make clear, is that there is a thin yet all-important line separating great leaders from criminals. Both act in ways unexpected and novel. In a sense, both break the law.

But only the leader's act shows itself to be right and thus re-makes the law. Hitler may have acted and shown a capacity for freedom; his action, however, was rejected. He was a criminal, not a legislator. Martin Luther King Jr. and Gandhi also broke laws in actions of civil disobedience. Great leaders show in their lawbreaking that the earlier law had been wrong; they forge a new moral and written law through the force and power of moral example.

In what is perhaps the latest example in the United States of a Presidential act of lawbreaking, President George W. Bush clearly broke both U.S. and international law in his prosecution of the war on terror. At least at this time it seems painfully clear that President George W. Bush's decision to systematize torture stands closer to a criminal act than an act of great legislation.

In many ways Presidential politics in the 21st century takes place in the shadow of George W. Bush's overreach. One result is that we have reacted against great and daring leadership. In line with the spirit of equality that drives our age, we ruthlessly expose the foibles, missteps, scandals and failures of anyone who rises to prominence. Bold leaders are risk takers. They fail and embarrass themselves. They have unruly skeletons in their closets. They will hesitate to endure, and rarely prevail in, the public inquisition that the presidential selection process has become.

These candidates, who are inoffensive enough to prevail, are branded by their consultants as pragmatists. Our current pragmatists are products of Harvard Business School and Harvard Law School. Mr. Romney loves data. President Obama worships experts. They are both nothing if not faithful to the doctrine of technocratic optimism: that with the right people in charge we can do anything. The only problem is they refuse to tell us what it is they want to do. They have forgotten that politics is a matter of thinking, not a pragmatic exercise in technical efficiency.

Look at the Mall in Washington: the Washington Monument honors our first president, and alongside it stand the Jefferson Memorial, the Lincoln Memorial, and the Memorial to Franklin Delano Roosevelt. There is no monument to any president since FDR. And yet, just two years ago we dedicated the Martin Luther King Memorial. It doesn't seem like an accident that the leaders of the Civil Rights Movement were not politicians. The presidency today does not attract leaders; the bold among us are not the people running for office.

Yet, people crave what used to be called a statesman. To ask: "Does the President Matter?" is to ask:  might a president, might a political leader, be able to transform our nation, to restore the dignity and meaning of politics? It is to ask, in other words, for a miracle.

At the end of her essay, "What is Freedom?", Hannah Arendt said this about the importance of miracles in politics.

Hence it is not in the least superstitious, it is even a counsel of realism, to look for the unforeseeable and unpredictable, to be prepared for and to expect “miracles” in the political realm. And the more heavily the scales are weighted in favor of disaster, the more miraculous will the deed done in freedom appear.

She continued:

It is men who perform miracles—men who because they have received the twofold gift of freedom and action can establish a reality of their own.

I don't know if the president matters.

But I know that he or she must. Which is why we must believe that miracles are possible. And that means we, ourselves, must act in freedom to make the miraculous happen.

In the service of the not-yet-imagined possibilities of our time, our goal over the two days of the conference was to engage in the difficult, surprising, and never-to-be-understood work of thinking, and of thinking together, in public, amongst others. We heard from philosophers and businessmen, artists and academics. The speakers came from across the political spectrum, but they shared a commitment to thinking beyond ideology. Such thinking is itself a form of action, especially so in a time of such ideological rigidity. Whether our meeting here at Bard gives birth to the miracle of political action--that is up to you. If we succeeded in thinking together, in provoking, and in unsettling, we perhaps sowed the seeds that will one day blossom into the miracle of freedom.

-RB

Watch Roger's  opening talk from the conference, "Does the President Matter?" here.

3Sep/121

One Against All

It can be dangerous to tell the truth: “There will always be One against All, one person against all others. [This is so] not because One is terribly wise and All are terribly foolish, but because the process of thinking and researching, which finally yields truth, can only be accomplished by an individual person. In its singularity or duality, one human being seeks and finds – not the truth (Lessing) – but some truth.”

-Hannah Arendt, Denktagebuch, Book XXIV, No. 21

Hannah Arendt wrote these lines when she was confronted with the severe and often unfair, even slanderous, public criticism launched against her and her book Eichmann in Jerusalem after its publication in 1963. The quote points to her understanding of the thinking I (as opposed to the acting We) on which she bases her moral and, partly, her political philosophy.

This is the thinking I, defined with Kant as selbstdenkend (self-thinking [“singularity”]) and as an-der-Stelle-jedes-andern-denkend (i.e., in Arendt’s terms, thinking representatively or practicing the two-in-one [“duality”]). Her words also hint at an essay she published in 1967 titled “Truth and Politics,” wherein she takes up the idea that it is dangerous to tell the truth, factual truth in particular, and considers the teller of factual truth to be powerless. Logically, the All are the powerful, because they may determine what at a specific place and time is considered to be factual truth; their lies, in the guise of truth, constitute reality. Thus, it is extremely hard to fight them.

In her answers, published only recently, to questions posed in 1963 by the journalist Samuel Grafton regarding her report on Eichmann, Arendt states: “Once I wrote, I was bound to tell the truth as I see it.” The statement reveals that she was quite well aware of the fact that her story, i.e., the result of her own thinking and researching, was only one among others. She also realized the lack of understanding and, in many cases, of thinking and researching, on the part of her critics.

"Iustitia" - Martin van Heemskerck, 1478-1578

Thus, she lost any hope of being able to publicly debate her position in a “real controversy,” as she wrote to Rabbi Hertzberg (April 8, 1966). By the same token, she determined that she would not entertain her critics, as Socrates did the Athenians: “Don’t be offended at my telling you the truth.” Reminded of this quote from Plato’s Apology (31e) in a supportive letter from her friend Helen Wolff, she acknowledged the reference, but acted differently. After having made up her mind, she wrote to Mary McCarthy: “I am convinced that I should not answer individual critics. I probably shall finally make, not an answer, but a kind of evaluation of this whole strange business.” In other words, she did not defend herself in following the motto “One against All,” which she had perceived and noted in her Denktagebuch. Rather, as announced to McCarthy, she provided an “evaluation” in the 1964 preface to the German edition of Eichmann in Jerusalem and later when revising that preface for the postscript of the second English edition.

Arendt also refused to act in accordance with the old saying: Fiat iustitia, et pereat mundus (let there be justice, though the world perish). She writes – in the note of the Denktagebuch from which today’s quote is taken – that such acting would reveal the courage of the teller of truth “or, perhaps, his stubbornness, but neither the truth of what he had to say nor even his own truthfulness.” Thus, she rejected an attitude known in German cultural tradition under the name of Michael Kohlhaas.  A horse trader living in the 16th century, Kohlhaas became known for endlessly and in vain fighting injustice done to him (two of his horses were stolen on the order of a nobleman) and finally taking the law into his own hands by setting fire to houses in Wittenberg.

Even so, Arendt has been praised as a woman of “intellectual courage” with regard to her book on Eichmann (see Richard Bernstein’s contribution to Thinking in Dark Times).

Intellectual courage based on thinking and researching was rare in Arendt’s time and has become even rarer since then. But should Arendt therefore matter only nostalgically? Certainly not. Her emphasis on the benefits of thinking as a solitary business still remains current. Consider, for example, the following reference to Sherry Turkle, a sociologist at MIT and author of the recent book Alone Together. In an interview with Peter Haffner (published on July 27, 2012, in SZ Magazin), she argues that individuals who become absorbed in digital communication lose crucial components of their faculty of thinking. Turkle says (my translation): Students who spend all their time and energy on communication via SMS, Facebook, etc. “can hardly concentrate on a particular subject. They have difficulty thinking a complex idea through to its end.” No doubt, this sounds familiar to all of us who know about Hannah Arendt’s effort to promote thinking (and judging) in order to make our world more human.

To return to today’s quote: It can be dangerous to tell the truth, but thinking is dangerous too. Once in a while, not only the teller of truth but the thinking 'I' as well may find himself or herself in the position of One against All.

-Ursula Ludz

2Jan/120

Not Thinking-Tracy Strong

“[O]ur newest experiences and our most recent fears…[are] a matter of thought and thoughtlessness – the heedless recklessness or hopeless confusion or complacent repetition of ‘truths’ which have become trivial and empty – [This] seems to me among the outstanding characteristics of our time.”

-Hannah Arendt, The Human Condition

Not thinking was, for Arendt, the increasingly dominant quality of the world in which we live. Thoughtlessness is the negative mirror image of what she called for as the only form of thinking appropriate to a period of crisis (indeed, in a strict sense, perhaps to any time) – thinking “without a banister.”

Inherent in this conception is that in ages and at times like our own, when one must think without support, many, perhaps most, will not think, or rather will avoid thinking. They will thus be left without that voice of conscience – like Socrates' daimon, who appears at moments of judgment and keeps Socrates from justifying, or even engaging in, acts that are evil. Importantly, that something is “true” means nothing by itself unless it is the subject of thinking.

One might consider here the thoughtlessness that reigned in the general reaction in the United States to the attacks of 9/11, 2001. The analogy was immediately drawn to Pearl Harbor. From this analogy it followed that our response should be analogous to that after Pearl Harbor, despite the fact that Al-Qaeda, unlike Japan, was not a nation-state. Furthermore, this enemy was linked to an Axis of Evil against which one was to fight a “war on terror.” Osama bin Laden was Hitler or at least Tojo; Saddam Hussein was another totalitarian, linked by an Axis of Evil to the other totalitarians. Yet one cannot fight against terror, only against an enemy – Carl Schmitt had warned of forgetting this.

The result of not thinking about what one has done – whether as a policy maker or a member of the population -- has been a war that has now gone on for ten years with neither goal nor end in sight.  Thoughtlessness has consequences: people die as a result of thoughtlessness.

(I discovered similar thoughts in Elizabeth Young-Bruehl, Why Arendt Matters, pages 12-13, after writing this passage and modified my words, as hers are much better. I join others in mourning her passing.)

-Tracy Strong

21Dec/110

The Occupy Movement – Visualizing Change

Occupy Wall Street is, on one important level, a movement of signs. I mean this quite literally. Handmade signs with witty epigrams, pithy epithets, and heartfelt emotions took root in Zuccotti Park and blossomed on the web. The signs are not simply the old-fashioned placards of protests past. Rather, the signs proliferated in large measure specifically so they could be photographed, uploaded, and disseminated on the World Wide Web. In many ways, Occupy Wall Street communicated its message through photographs of signs.

Pictures of signs, like the one below, tell human stories of average, hard-working Americans who have been upended by the Great Recession.

In the war of signs, pictures of military veterans occupy a privileged role. The military protester shows, in an image, that the anger, despair, and hope that the Occupy Movement represents are not limited to entitled young hipsters. The signs were, quite often, expressions of the average American, the soldier and the homeowner, who had been devastated by economic hardship. The implication is that these individuals lived honorably, played by the rules, and are suddenly in dire straits as a result of a financial crisis.

I first encountered one such iconic picture on Facebook. It shows an older man telling a sad story. This cheerful, gray-haired, bespectacled Navy Veteran and schoolteacher clad in his oxford shirt neatly pressed under a burgundy sweater is undoubtedly one of the poster-children of Occupy Wall Street. His story is common and sad. He has served his country and taught our children. And now his pension doesn't allow him the means to live with dignity.

Older individuals, like soldiers and children, hold a special place in the iconography of the Occupy Movement. They bespeak a kind of innocence and vulnerability. They are hard working and have paid their dues. All they want is what is fair and right. As a Navy veteran and a teacher, this man's simple sign expresses American ideals, and their betrayal. He did the right thing and hoped for a comfortable retirement in his own home, with annual vacations and visits to the grandchildren. Is this too much to hope for? The claim here is that he followed the rules and got steamrolled.

Not long after this sign and thousands of others like it zipped around the web on Tumblr and Facebook, another sign appeared, as if to answer this veteran's lament and other sad stories of foreclosed homeowners and indebted students. This sign claims to be from a student (not pictured and thus questionable), but one who played by the rules in another sense.

I wrote more about these signs here and here. Both signs appeal to a basic ideal of fairness. But fairness means different things to each. The first sign sees fairness as a kind of social contract: if I work hard and play by the rules, I should be guaranteed a certain standard of living and insured against catastrophe, especially when the well off in society, those whose freedoms I fought for and whose children I taught, were bailed out by my tax dollars.

The second announces a different view of fairness as individual responsibility. Life is not fair and no one should expect a handout. Playing by the rules means living within your means, not taking out mortgages you can't afford or student loans that will saddle you with debt. Working hard is not enough, but you must also be thrifty and responsible. If you do decide to take risks or live beyond your means, that is your choice, but don't expect me to feel sorry for you if you fail.

The argument between two notions of responsibility that these competing signs take up is an important one. It goes to the heart of our ideas of personal responsibility, individualism, community, entitlement, and empathy. I have written at length about Occupy Wall Street here and here. But what does it mean that this conversation about who we are and what our country should be is happening through pictures of signs on the Internet?

Occupy Wall Street began with an image, created and disseminated by Adbusters, a Canadian media and anti-advertising group. A charging bull, iconic to the world of finance, gracefully ridden by a female dancer, in front of a surging crowd wearing gas masks and brandishing batons. Smoke fills the air. It is an image of revolution; but what does the revolution call for? Dance? The power of grace and beauty over brawn? Escape from unrestrained capitalism and a return to more spiritual values?

Undoubtedly the victory of the gracefulness of spirit over the aggression of calculation is one metaphorical text of the image. So too is the power of the people; the mob, which rages behind both the ballerina and the bull. Unresolved is whether the mob stands with the ballerina or the bull, or whether its fury threatens both.

The image of the ballerina and the bull is a political call, but one issued through images and metaphors. Our economy and our politics are like the bull—uncontrolled, wild, and in need of a spiritual master. Such metaphorical thinking is at the very root of both political and metaphysical thinking, for it carries over the thinking of everyday reality into a higher and more truthful state. A metaphor—literally a carrying over, as its Greek etymology suggests—elevates thinking from the mundane to the speculative, and thus energizes everyday thinking through the power of ideas.

Immanuel Kant once described a despotic state as a "mere machine"—a hand grinder—because both are governed by an absolute individual will that can make mincemeat of the individuals in its grip. Kant offered the hand grinder as an example of a successful metaphor—an image that shows a "perfect resemblance of two relations between two totally dissimilar things."

Hannah Arendt discusses Kant's use of the metaphor in her book The Life of the Mind. She quotes there as well from Ernest Fenollosa, in an essay originally published by Ezra Pound:

"Metaphor is ... the very substance of poetry"; without it, "there would have been no bridge whereby to cross from the minor truth of the seen to the major truth of the unseen."

For Arendt, thought images are unavoidable in thinking and speaking, for we cannot approach any concept or idea without in some way employing an analogy or metaphor from our lived and daily experience. We have no entry into the temple of truth except through the passageways of metaphor and symbolic thought. We cannot even recognize a dog as a dog or God as God without an idea or concept of "dog" or of "God" that themselves are metaphorical or analogical ideas taken from our experience of the world. Friendship, too, Arendt writes, must originally be thought in images and metaphors, as the Chinese do, for whom the character for friendship shows an image of two united hands.

As Arendt writes:

[The Chinese] think in images and not in words. And this thinking in images always remains "concrete" and cannot be discursive, traveling through an ordered train of thought, nor can it give account of itself (logon didonai); the answer to the typically Socratic question ‘What is friendship?’ is visibly present and evident in the emblem of two united hands, and "the emblem liberates a whole stream of pictorial representations" through plausible associations by which images are joined together.

Arendt's point is that Chinese and other pictorial languages offer a direct version of the kinds of metaphorical thinking that must attend all languages, even purely alphabetical languages like those in the West. Even our language depends upon the images and analogies of metaphors to carry our thought beyond the everyday to the deeper level of significance and meaning, on which both philosophy and politics might build a publicly accessible and shared common world.

That thinking happens in images is, Arendt writes, "fascinating and disquieting." It is disquieting because it puts into question the priority of language and reason that so defines the tradition of Western thought—the demand for rational justification in philosophy and politics that is so central to the rationalist foundations of modern society in a scientific age. For rational justification can happen only in words whereas higher truths are accessible only through metaphors and images.

The priority of images over words is the reason that Arendt remains one of the most poetic thinkers in the modern canon.  She is uniquely aware throughout all her writing that

"poetry," when read aloud, "will affect the hearer optically; he will not stick to the word he hears but to the sign he remembers and with it to the sights to which the sign clearly points."

I spoke about this coincidence of thinking, seeing, and acting with the great dancer and choreographer Bill T. Jones in 2010. For Bill T., the effort in his dance "Floating the Tongue" is to enact the process of taking something invisible and internal and bringing it to appear on the stage and in the world.  In Arendt's words, the effort of poetic language must be to bridge "the gulf between the realm of the invisible and the world of appearances."

Political thinking, too, has much to learn from poetry and metaphor. "Politics," writes Hannah Arendt, "deals with the coexistence and association of different men." As we live with others, we human beings aim at freedom—the freedom to be an individual and also the freedom to build a common world together. For Arendt, politics is the activity through which a plurality of human beings constitute themselves as a people, a unity of differences. The political actor is he or she who acts and speaks in such a way as to show the different people around him the common truths that bind them together as a people. It is because politics must employ metaphors and images that build a foundation for a new and public space for freedom to flourish that politics also demands a public space where citizens can meet, speak, and act in public.

A great virtue of the Occupy Wall Street and Tea Party movements has been the return of signs, images, and symbols to political discourse. Even the written text on the signs that now carom around the web can only be read within the images that provide their poetry: images of the rich and poor, elderly and young, military and civilian. Politics, it seems, is leaving behind the rationalist fantasy that if we all just talk about the issues, we will come to some kind of sensible agreement.

For this reason, the Hannah Arendt Center has partnered with Visualize Conversation in an experiment: to ask how and in what ways political images can spur a public discussion. We have created a new kind of website, Visualize Conversation, dedicated to the visual images that are defining the political world. The site is being launched around the images that have come to characterize the Occupy Movement. Soon, we will begin to focus on imagery that relates to the 2012 Presidential election as well as other national issues.

On this website you are invited to respond to these images with both words and other images, to share the images, and to debate about them with others. It may be fun, but it is also, in part, an opportunity to think about and create the images and metaphors that very well might engage and re-enliven our politics.

-Roger Berkowitz

30Nov/110

“Ojos Sin Luz” (Eyes Without Light)-Dan Gettinger

Dan Gettinger is a student at Bard College.

Lately I've been reflecting on my activity surrounding Occupy Wall St. Remembering the minutes before I was arrested on the Brooklyn Bridge, I wonder what I was thinking in those moments.  The truth is that I was there largely by accident. I read about the Occupy movement and a friend of mine who had gone down encouraged me to go that weekend. One thing led to another and I was spending eight hours at One Police Plaza, NYC. What led me there? Why did the NYPD decide to arrest 749 people? Why are people pitted against each other in anger?

These questions flew through my mind in a nervous rush in those interminable minutes. As my friend in front of me got hauled away, he told me to call his mom. A girl next to me scribbled a phone number on my arm but, sadly, it was that of the National Lawyers Guild and not hers. I looked up at another Bard student who was safe on the pedestrian walkway and smiled. Chaos and distress and sadness were etched across the faces of those around me. As I came to the realization that I would be arrested I felt more at ease and relaxed. And alone.

All my life I've been for or against something. Growing up overseas I was for America; representing a homeland that I barely knew but swelled with pride over. In the past decade it has become starker. I despised Bush and loved Obama, protesting one and campaigning for the other. My generation is one of extremes and totalities. We grew up defined by the trespasses of the last President, and now we watch as our confidence in this one seeps away. With a crushingly uncertain future we grasp at hope, looking to fill this void with promises.

Why is this? How is that we are so empty that we must be filled with language that is distilled into slogans and ideologically transparent? Why do we allow ourselves to be categorized and set into camps against each other? I think it is because we are lonely. A generation of drifters set loose by the misdeeds of those who came before. Around us we see everything being commodified and isolated. We value the world in terms of totalities, the cold language of polls.  Discussion becomes debate. Politics becomes personal.  Language gives leeway to the violence of our time. Philip Cushman writes, “We are told by self psychology and object relations theory that the empty self is the natural configuration of human being... that the essence of psychological growth is consumption”.  Ideas become values, a list of priorities rather than inquisitions.  Instead of questioning the origin of a problem, we invest in the answer.  The world becomes a sheet of cookie-cutter shapes and we, the unseeing eyes of selfish sentimentality.

Occupy Wall St. has exposed us as a generation of reactionaries. This era is one of immediate responses instigated by the ceaseless swirl of the cyber world. The Internet, modern telecommunications and globalization outline our existence. The information age confines our imagination, creating shapes into which we can mindlessly ease. It conditions our thoughts. “The greatest poverty is not to live/ In a physical world, to feel that one’s desire/ Is too difficult to tell from despair,” says the poet Wallace Stevens. The compression of information and language forces immediate reactions, instinctual expressions of sentiment. Instead of taking the time to think, our feelings gush into the abyss that is the Internet. And are lost. ‘Once more into the breach!’ shouts the exhausted soldier and student alike.

The power of online reaction in the cyber world has prompted the opposite in the physical. I see it in the ease with which students are called ‘apathetic’. Apathy is the absence of pathos, the detriment of passion. Students, the supposed vanguard for intellectual pursuit, are considered to be endowed with such an extreme indifference that we are devoid of concern, excitement or motivation. This word shows the extent to which isolation has infested our campuses and social activity. It reveals how difficult it has become to really engage with politics and to create community. When the ancient Greeks entered into the public realm of life, they expected to enter into discussion with each other. We’ve seen the opposite occur. As a result of the outpouring of ourselves in the cyber world we withdraw from the physical, preferring to slide into a virtual abstraction of reality and of ourselves. Our passion is put towards filling that inner void, and in doing so we exhaust ourselves in chasing our own superficial creations. We live in a TV democracy, secure in our insecurity.

Hannah Arendt writes that loneliness leads to complacency, an unwillingness to judge truthfully and think. We fill ourselves with the tenets of ideology and in doing so we build walls around each other. This isolation prevents communication. It destroys dialogue and leaves us more susceptible to the shallow language of ideologues.

I'm far from regretting my experience on the bridge. It brought so much that I was feeling to the fore and was an illustration of the frustrations of a generation. But I do not revel in that act nor do I celebrate the movement as the answer anymore. The minute we begin to consider Occupy Wall St. the answer to our problems is the time to stop and think. It is the time to re-evaluate the reasons why it's happening and why we should support it. It's when we've commodified Occupy, making the movement more about ourselves than the problems it confronts. That's when our loneliness is exposed.

The greatness of Occupy Wall St is that it gives people the opportunity to think. The absence of demands or a structured hierarchy allows the true problems that plague this nation to come first. It begins to cleanse the mind of all these barricades we've erected around ourselves by providing a space to talk about issues like class and privilege that we haven't confronted in decades. We've come to the threshold where unless we get a hard punch to the gut we'll continue to resort to phrases and slogans, packaging up our thoughts into sound bites and deluding ourselves with the belief that this is thinking.

David Graeber writes that the word revolution does not, and cannot, mean “a single, cataclysmic break with past structures of oppression,” a storming of the Winter Palace or Bastille.  It is rather exposing and de-legitimizing the origins of an oppressive system, striking down the pillar of injustice that fuels our plight.  Some of those in Occupy Wall St may say that pillar is the bankers that control our democracy.  I say the roots of these dark times are within us.  They’re the fictitious frames, the keyholes and the kissing booths that we use to define our world.  A society predicated on constant caffeinated consumption, seeking desperate deliverance in passing fashions, is a violent one.  One that seduces our imagination, leaving it languishing in infomercials and Italian leather.  We may not be the cause of this crisis, but our complacency leaves us complicit.

Do not expect the revolution to be televised nor even talked about immediately.  Hannah Arendt says that true thought occurs in solitude, in those quiet moments of intense reflection.  This follows from the Socratic notion that thinking in solitude is the “conversation one has with oneself,” a particularly active questioning and critical self-examination.

I would add that the validation of these thoughts occurs in dialogue with others, in the inter-personal connections that we form through experience. Thinking is the relentless investigation of an idea; it’s an exploration, but it’s also engaging with others in this way on a non-emotional level, allowing for a substantive discourse. To separate oneself from an idea and be open to the thoughts of others is an extremely difficult process that requires patience and critical listening. But it’s here where we must begin. The lack of curiosity is the greatest symptom of loneliness and the surest way to complacency. Questioning and imagining are activities essential to our freedom.

The raids with batons and bulldozers continue to intrude on unstructured spaces across the nation.  The future of Occupy Wall St is impossible to predict and the consequences even more difficult to anticipate.  However, we may be certain that Liberty Square has reminded us of a far darker occupation that exists within each of us.  An oppressive installment in our hearts that leaves us yearning and fighting for the illusive insoluble ‘I’.  But, “sudden as a shaft of sunlight,” we are experiencing ways of thinking and acting that free us from the past and future, placing this movement in our moment.

-Dan Gettinger