“It is this duality of myself with myself that makes thinking a true activity, in which I am both the one who asks and the one who answers.”
-- Hannah Arendt, The Life of the Mind
How can teachers encourage thinking in school?
Arendt’s The Life of the Mind influences my answer. As an educator, my job is to prompt students to think—to have them become two-in-one (in Socratic terms) or to have soundless dialogues within themselves (in the Platonic sense). One way to accomplish that is to structure courses as a conversation between philosophers. In my American political thought course, for instance, I teach lessons on the liberal John Rawls and the conservative Leo Strauss. An integral part of that particular unit is for students to enact a conversation between those two figures in their own minds.
**This post was originally published August 10th, 2012**
In this post, academics and university faculty will be criticized. Railing against college professors has become a common pastime, one practiced almost exclusively by those who have been taught and mentored by the very people now being criticized. It is thus only fair to say upfront that college education in the United States is, in spite of its myriad flaws, still of incredible value and meaning to tens if not hundreds of thousands of students every year.
That said, too much of what our faculties teach is neither interesting nor wanted by our students.
"Seen from the perspective of the 'real' world, the laboratory is the anticipation of a changed environment."
-- Hannah Arendt, The Life of the Mind
I find this quote intriguing in that its reference to environments and environmental change speaks to the fact that Arendt's philosophy was essentially an ecological one, indeed one that is profoundly media ecological. The quote appears in a section of The Life of the Mind entitled "Science and Common Sense," in which Arendt argues that the practice of science is quite distinct from thinking as a philosophical activity.
As she explains:
Thinking, no doubt, plays an enormous role in every scientific enterprise, but it is a role of a means to an end; the end is determined by a decision about what is worthwhile knowing, and this decision cannot be scientific.
Here Arendt invokes a variation on Gödel's incompleteness theorem in mathematics, noting that science cannot justify itself on scientific grounds, but rather must somehow depend on something outside of and beyond itself. Perhaps more to the point, science, especially as associated with empiricism, cannot be divorced from concrete reality, and cannot function solely in the abstract realm of ideas that Plato insisted was the only true reality.
The transformation of truth into mere verity results primarily from the fact that the scientist remains bound to the common sense by which we find our bearings in a world of appearances. Thinking withdraws radically and for its own sake from this world and its evidential nature, whereas science profits from a possible withdrawal for the sake of specific results.
It is certainly the case that scientific truth is always contingent, tentative, open to refutation, as Karl Popper explained. Scientific truth is never absolute, never anything more than a map of some other territory, a map that needs to be continually tested and reviewed, updated and revised, as Alfred Korzybski explained by way of establishing his discipline of general semantics. Even the so-called laws of nature and physics need not be considered immutable, but may be subject to change and evolution, as Lee Smolin argues in his insightful book, Time Reborn.
Scientists are engaged in the process of abstracting, insofar as they take the data gained by empirical investigation and make generalizations in the form of theories and hypotheses, but this process of induction cannot be divorced from concrete reality, from the world of appearances. Science may be used to test, challenge, and displace common sense, but it operates on the same level, as a distilled form of common sense, rather than something qualitatively different, a status Arendt reserves for the special activity of thinking associated with philosophy.
Arendt goes on to argue that both common sense and scientific speculation lack "the safeguards inherent in sheer thinking, namely thinking's critical capacity." This includes the capacity for moral judgment, which became horrifically evident in the ways Nazi Germany used science to justify its genocidal policies and actions. Auschwitz did not represent a retrieval of tribal violence, but one of the ultimate expressions of the scientific enterprise in action. And the same might be said of Hiroshima and Nagasaki, holding aside whatever might be said to justify the use of the atomic bomb to bring the Second World War to a speedy conclusion. In remaining close to the world of appearances and common sense, science forgoes the very capacity that makes us human, that makes human life and human consciousness unique.
The story of modern science is in fact a story of shifting alliances. Science begins as a branch of philosophy, as natural philosophy. Indeed, philosophy itself is generally understood to begin with the pre-Socratics sometimes referred to as the Ionian physicists, i.e., Thales, Anaximander, and Heraclitus, who first posited the concept of elements and atoms. Both science and philosophy therefore coalesce during the first century that followed the introduction of the Greek alphabet and the emergence of a literate culture in the ancient Greek colonies of Asia Minor.
And just as ancient science is alphabetic in its origins, modern science begins with typography, as the historian Elizabeth Eisenstein explains in her exhaustive study, The Printing Press as an Agent of Change in Early Modern Europe. Simply by making the writings of natural philosophers easily available through the distribution of printed books, scholars were able to compare and contrast what different philosophers had to say about the natural world, and to uncover their differences of opinion and contradictions. This in turn spurred them on to find out for themselves which of the various competing explanations was correct, where the truth lies, so that more reading led to even more empirical research, which in turn would have to be published, that is, made public, via printing, for the purposes of testing and confirmation. And publication encouraged the formation of a scientific republic of letters, a typographically mediated virtual community.
Eisenstein notes that during the first century following Gutenberg, printed books gave Copernicus access to centuries of recorded observations of the movements of celestial objects, access not easily available to his predecessors. What is remarkable to consider is that the telescope had not yet been invented in his lifetime, and was not required: the Polish astronomer arrived at his heliocentric view based only on what could be observed by the naked eye, by gazing up at the heavens, and down at the printed page. The typographic revolution that began in the 15th century was the necessary technological precondition for the Copernican revolution of the 16th century. And soon after the introduction of the telescope as a tool to extend vision beyond its natural capabilities, Galileo was able to confirm the theory that Copernicus had put forth nearly a century earlier.
In the restricted literate culture of medieval Europe, the idea took hold that there are two books to be studied in an effort to discern the divine will, and mind: the book of scripture and the book of nature. Both books were seen as sources of knowledge that could be unlocked by a process of reading and interpretation. As Marshall McLuhan noted in The Classical Trivium, it was grammar, the ancient study of language and one third of the trivium, the foundational curriculum of the medieval university, that became the basis of modern science, and not dialectic or logic, that is, pure thinking, which is the source of the philosophic tradition. The medieval schoolmen of course placed scripture in the primary position, whereas modern science situates truth in the book of nature alone.
The publication of Francis Bacon's Novum Organum in 1620 first formalized the separation of science from philosophy within print culture, but the divorce was finalized during the 19th century, coinciding with the industrial revolution, as researchers became known as scientists rather than natural philosophers. In place of the alliance with philosophy, science came to be associated with technology. Before this time, technology and engineering, often referred to as mechanics, represented an entirely different line of inquiry, utterly practical, often intuitive rather than systematic. Mechanics was part of the world of work rather than that of action, to use the terms Arendt introduced in The Human Condition, which is to say that it was seen as the work of the hand rather than the mind. By the end of the 19th century, scientific discovery emerged as the main source of major technological breakthroughs, rather than innovation springing fully formed from the tinkering of inventors, and it became necessary to distinguish between applied science and theoretical science, the latter nonetheless still tied to the world of appearances.
Today, the acronym STEM, which stands for science, technology, engineering, and mathematics, has become a major buzzword in education, a major emphasis in particular for higher education, and a major concern in regards to economic competitiveness. We might well take note of how recent this combination of fields and disciplines really is, insofar as mathematics represents pure logic and highly abstract forms of thought, and science once was a purely philosophical enterprise, both aspects of the life of the mind. Technology and engineering, on the other hand, for most of our history took the form of arts and crafts, part of the world of appearances.
The convergence of science and technology also had much to do with scientists' increasing reliance on scientific instruments for their investigations, a trend increasingly prevalent following the introduction of both the telescope and the microscope in the early 17th century, a trend even more apparent from the 19th century on. The laboratory is in fact another such instrument, a technology whose function is to provide precisely controlled conditions, beyond its role as a facility for the storage and use of other scientific instruments. Scientific instruments are media that extend our senses and allow us to see the world in new ways, therefore altering our experience of our environment, while the discoveries they lead to provide us with the means of altering our environments physically. And the laboratory is an instrument that provides us with a total environment, enclosed, controlled, isolated from the world to become in effect the world. It is a micro-environment where experimental changes can be made that anticipate changes that can be made to the macro-environment we regularly inhabit.
The split between science and philosophy can also be characterized as a division between the eye and the ear. Modern science, as intimately bound up in typography, is associated with visualism, the idea that seeing is believing, that truth is based on vision, that knowledge can be displayed visually as an organized set of facts, rather than the product of ongoing dialogue and debate. McLuhan noted the importance of the fixed point of view as a by-product of training the eye to read, and Walter Ong studied the paradigm-shift in education attributed to Peter Ramus, who introduced pedagogical methods we would today associate with textbooks, outlining, and the visual display of information. Philosophy has not been immune to this influence, but retains a connection to the oral-aural mode through the method of Socratic dialogue, and by way of an understanding of the history of ideas as an ongoing conversation. Arendt, in The Human Condition, explained action, the realm of words, as a social phenomenon, one based on dialogic exchanges of ideas and opinions, not a solitary matter of looking things up. And thinking, which she elevates above the scientific enterprise in The Life of the Mind, is mostly a matter of an inner dialogue, or monologue if you prefer, of hearing oneself think, of silent speech, and not of a mental form of writing out words or imaginary reading. We talk things out, to others and/or to ourselves.
Science, on the other hand, is all about visible representations, as words, numbers, illustrations, tables, graphs, charts, diagrams, etc. And it is the investigation of visible phenomena, or otherwise of phenomena that can be rendered visible through scientific instruments. Acoustic phenomena can only be dealt with scientifically by being turned into a visual measurement, either of numbers or of lines going up and down to depict sound waves. The same is true for the other senses; smell, taste, and touch can only be dealt with scientifically through visual representation. Science cannot deal with any sense other than sight on its own terms, but always requires an act of translation into visual form. Thus, Arendt notes that modern science, being so intimately bound up in the world of appearances, is often concerned with making the invisible visible:
That modern science, always hunting for manifestations of the invisible—atoms, molecules, particles, cells, genes—should have added to the world a spectacular, unprecedented quantity of new perceptible things is only seemingly paradoxical.
Arendt might well have noted the continuity between the modern activity of making the invisible visible as an act of translation, and the medieval alchemist's search for methods of achieving material transformation, the translation of one substance into another. She does note that the use of scientific instruments are a means of extending natural functions, paralleling McLuhan's characterization of media as extensions of body and biology:
In order to prove or disprove its hypotheses… and to discover what makes things work, it [modern science] began to imitate the working processes of nature. For that purpose it produced the countless and enormously complex implements with which to force the non-appearing to appear (if only as an instrument-reading in the laboratory), as that was the sole means the scientist had to persuade himself of its reality. Modern technology was born in the laboratory, but this was not because scientists wanted to produce appliances or change the world. No matter how far their theories leave common-sense experience and common-sense reasoning behind, they must finally come back to some form of it or lose all sense of realness in the object of their investigation.
Note here that our conception of reality, and what lends something the aura of authenticity, as Walter Benjamin would put it, is dependent on the visual sense, on the phenomenon being translated into the world of appearances (the aura as opposed to the aural). It is no accident, then, that there is a close connection in biblical literature and the Hebrew language between the words for spirit and soul, and the words for invisible but audible phenomena such as wind and breath, breath in turn being the basis of speech (and this is not unique to Hebraic culture or vocabulary). It is at this point that Arendt resumes her commentary on the function of the controlled environment:
And this return is possible only via the man-made, artificial world of the laboratory, where that which does not appear of its own accord is forced to appear and to disclose itself. Technology, the "plumber's" work held in some contempt by the scientist, who sees practical applicability as a mere by-product of his own efforts, introduces scientific findings, made in "unparalleled insulation… from the demands of the laity and of everyday life," into the everyday world of appearances and renders them accessible to common-sense experience; but this is possible only because the scientists themselves are ultimately dependent on that experience.
We now reach the point in the text where the quote I began this essay with appears, as Arendt writes:
Seen from the perspective of the "real" world, the laboratory is the anticipation of a changed environment; and the cognitive processes using the human abilities of thinking and fabricating as means to their end are indeed the most refined modes of common-sense reasoning. The activity of knowing is no less related to our sense of reality and no less a world-building activity than the building of houses.
Again, for Arendt, science and common sense both are distinct in this way from the activity of pure thinking, which can provide a sorely needed critical function. But her insight as to the function of the laboratory as an environment in which the invisible is made visible is important in that this helps us to understand that the laboratory is, in fact, what McLuhan referred to as a counter-environment or anti-environment.
In our everyday environment, the environment itself tends to be invisible, if not literally so, then functionally insofar as whatever fades into the background tends to fall out of our perceptual awareness or is otherwise ignored. Anything that becomes part of our routine falls into this category, becoming environmental, and therefore subliminal. And this includes our media, technology, and symbol systems, insofar as they are part of our everyday world. We do pay attention to them when they are brand new and unfamiliar, but once their novelty wears off they become part of the background, unless they malfunction or break down. In the absence of such conditions, we need an anti-environment to provide a contrast through which we can recognize the things we take for granted in our world, to provide a place to stand from which we can observe our situation from the outside in, from a relatively objective stance. We are, in effect, sleepwalkers in our everyday environment, and entering into an anti-environment is a way to wake us up, to enhance awareness and consciousness of our surroundings. This occurs, in a haphazard way, when we return home after spending time experiencing another culture, as for a brief time much of what was once routinized about our own culture suddenly seems strange and arbitrary to us. The effect wears off relatively quickly, however, although the after-effects of broadening our minds in this way can be significant.
The controlled environment of the laboratory helps to focus our attention on phenomena that are otherwise invisible to us, either because they are taken for granted, or because they require specialized instrumentation to be rendered visible. It is not just that such phenomena are brought into the world of appearances, however, but also that they are made into objects of concerted study, to be recorded, described, measured, experimented upon, etc.
McLuhan emphasized the role of art as an anti-environment. The art museum, for example, is a controlled environment, and the painting that we encounter there has the potential to make us see things we had never seen before, by which I mean not just objects depicted that are unfamiliar to us, but familiar objects depicted in unfamiliar ways. In this way, works of art are instruments that can help us to see the world, to use our senses and perceive, in new and different ways. McLuhan believed that artists served as a kind of distant early warning system, borrowing cold war terminology to refer to their ability to anticipate changes occurring in the present that most others are not aware of. He was fond of the Ezra Pound quote that the artist is the antenna of the race, and Kurt Vonnegut expressed a similar sentiment in describing the writer as a canary in a coal mine. We may further consider the art museum or gallery or library as a controlled environment, a laboratory of sorts, and note the parallel in the idea of art as the anticipation of a changed environment.
There are other anti-environments as well. Houses of worship function in this way, often because they are based on earlier eras and different cultures, and otherwise are constructed to remove us from our everyday environment, and help us to see the world in a different light. They are in some way dedicated to making the invisible world of the spirit visible to us through the use of sacred symbols and objects, even for religions whose concept of God is one that is entirely outside of the world of appearances. Sanctuaries might therefore be considered laboratories used for moral, ethical, and sacred discovery, experimentation, and development, and places where changed environments are also anticipated, in the form of spiritual enlightenment and the pursuit of social justice. This also suggests that the scientific laboratory might be viewed, in a certain sense, as a sacred space, along the lines that Mircea Eliade discusses in The Sacred and the Profane.
The school and the classroom are also anti-environments, or at least ought to be, as Neil Postman argued in Teaching as a Conserving Activity. Students are sequestered away from the everyday environment, into a controlled situation where the world they live in can be studied and understood, and phenomena that are taken for granted can be brought into conscious awareness. It is indeed a place where the invisible can be made visible. In this sense, the school and the classroom are laboratories for learning, although the metaphor can be problematic when it is used to imply that the school is only about the world of appearances, and that all that is needed is to let students discover that world for themselves. Exploration is indeed essential, and discovery is an important component of learning. But the school is also a place where we may engage in the critical activity of pure thinking, of critical reasoning, of dialogue and disputation.
The classroom is more than a laboratory, or at least it must become more than a laboratory, or the educational enterprise will be incomplete. The school ought to be an anti-environment, not only in regard to the everyday world of appearances and common sense, but also to that special world dominated by STEM, by science, technology, engineering and math. We need the classroom to be an anti-environment for a world subject to a flood of entertainment and information; we need it to be a language-based anti-environment for a world increasingly overwhelmed by images and numbers. We need an anti-environment where words can take precedence, where reading and writing can be balanced by speech and conversation, where reason, thinking, and thinking about thinking can allow for critical evaluation of common sense and common science alike. Only then can schools be engaged in something more than just adjusting students to take their place in a changed and changing environment, integrating them within the technological system, as components of that system, as Jacques Ellul observed in The Technological Society. Only then can schools help students to change the environment itself, not just through scientific and technological innovation, but through the exercise of values other than the technological imperative of efficiency, to make things better, more human, more life-affirming.
The anti-environment that we so desperately need is what Hannah Arendt might well have called a laboratory of the mind.
At Duke University and the University of North Carolina, two highly popular professors have transformed their course Think Again: How to Reason and Argue into a Massive Open Online Course (MOOC) taken by 170,000 people from all over the world at one time. This is old news. There is nothing to worry about when hundreds of thousands of people around the world watch flashy lectures by top professors on how to think and argue. Better such diversions than playing Temple Run. There are advantages and benefits to MOOCs and other forms of computer learning. And we should not run scared from MOOCs.
But the alacrity with which universities are adopting MOOCs as a way of cutting costs and marketing themselves as international brands harbors a danger too. The danger is not that more people will watch MOOCs or that MOOCs might be used to convey basic knowledge inside or outside of universities. No, the real danger in MOOCs is that watching a professor on your iPad becomes confused with education.
You know elite universities are in trouble when their professors talk the way Edward Rock does. Rock, Distinguished Professor at the University of Pennsylvania Law School and coordinator of Penn’s online education program, has this to say about the impending revolution in online education:
We’re in the business of creating and disseminating knowledge. And in 2012, the internet is an incredibly important place to be present if you’re in the knowledge dissemination business.
If elite colleges are in the knowledge dissemination business, then they will over time be increasingly devalued and made less relevant. There is no reason that computers or televisions can’t convey knowledge as well as or even better than humans can. Insofar as professors and colleges imagine themselves to be in the “business of creating and disseminating knowledge,” they will be replaced by computers. And it will be their own fault.
The rising popularity of MOOCs must be understood not as a product of new technology, but as a response to the failure of our universities. As Scott Newstock has argued, the basic principle behind MOOCs is hardly new. Newstock quotes one prominent expert who argues that the average distance learner "knows more of the subject, and knows it better, than the student who has covered the same ground in the classroom." Indeed, "the day is coming when the work done [via distance learning] will be greater in amount than that done in the class-rooms of our colleges." What you might not expect is that this prediction was made in 1885. "The commentator quoted above was Yale classicist (and future University of Chicago President) William Rainey Harper, evaluating correspondence courses." What Newstock’s provocation shows is that efforts to replace education with knowledge dissemination have been around for well over a century. But they have failed, at least until now.
MOOCs are so popular today because of the sadly poor quality of much—but certainly not all—college and university education. Around the country there are cavernous lecture halls filled with many hundreds of students. A lone professor stands up front, often with a PowerPoint presentation in a darkened room. Students have their computers open. Some are taking notes, but many are checking Facebook or surfing the Internet. Some are asleep. And others did not bother to show up, since the professor has posted his or her lecture notes online so that students can simply read them instead of making the effort to come to class. Such lectures may be half-decent ways to disseminate knowledge. Some lectures are better than others. But not much learning goes on in such lectures that can’t be replicated more efficiently, and maybe even better, on a computer. It is in this context that advocates of MOOCs are correct. When one compares a large lecture course with a well-designed online course, it may very well be that the online course is a superior educational venture. That it is cheaper too makes the advance of MOOCs seemingly inevitable.
As I have written here before, the best argument for MOOCs is that they may finally put the large and impersonal college lecture course out of its misery. There is no reason to be nostalgic for the lecture course. It was never a very good idea. Aside from a few exceptional lecturers—in my world I can think of the reputations of Hegel, his student Eduard Gans, Martin Heidegger, and, of course, Hannah Arendt—college lectures are largely an economical way to allow masses of students to acquire basic introductory knowledge in a field. If the masses are now more massive and the lectures more accessible, I’ll accept that as progress.
What this means is that there is an opportunity, at this moment, to embrace MOOCs as a disruptive force that will allow us to re-dedicate our universities and colleges to the practice of education as opposed to the business of knowledge dissemination. What colleges and universities need to offer is not simply knowledge, but education.
“Education,” as Martin Luther King wrote, “must also train one for quick, resolute and effective thinking.” Quick and resolute thinking requires that one “think incisively” and “think for one's self.” This “is very difficult.” The difficulty comes from the seduction of conformity and the power of prejudice. “We are prone to let our mental life become invaded by legions of half truths, prejudices, and propaganda.” We are all educated into prejudgments. They are human and it is inhuman to live free from prejudicial opinions and thoughts. On the one hand, education is the way we are led into and brought into a world as it exists, with its prejudices and values. And yet, education must also produce self-thinking persons, people who, once they are educated and enter the world as adults, are capable of judging the world into which they have been born. (I have written more about King’s thoughts on education here).
In her essay “The Crisis in Education,” Hannah Arendt writes that education must have a double aspect. First, education leads a new young person into an already existing world. The world is that which is there before the child was born and will continue to exist after the child dies. It is the common world of things, stories, and experiences in which all of us spend our lives. All children, as newcomers who are born into a world that is at first strange to them, must be led into the already existing world. They must be taught to speak a common language, respect common values, see the same facts, and hear the same stories. This common world is what Arendt calls the “truth… we cannot change; metaphorically, it is the ground on which we stand and the sky that stretches above us.” In its first aspect, then, education must protect the world from “the onslaught of the new that bursts upon it with each new generation.” This is the conservationist function of education: to conserve the common world against the rebelliousness of the new. And this is why Arendt writes, “Education is the point at which we decide whether we love the world enough to assume responsibility for it.”
At the same time, however, there is a second aspect of education that seeks to afford the child “special protection and care so that nothing destructive may happen to him from the world.” The teacher must nurture the independence and newness of each child, what “we generally call the free development of characteristic qualities and talents… the uniqueness that distinguishes every human being from every other.” The teacher must not simply love the world, but as part of the world in which we live, the teacher must also love the fact—and it is a fact—that the world will change and be transformed by new ideas and new people. Education must love this transformative nature of children, and we must “love our children enough” so that we do not “strike from their hands their chance of undertaking something new, something unforeseen by us, but to prepare them in advance for the task of renewing a common world.” Alongside its conservationist role, education also must be revolutionary in the sense that it prepares students to strike out and create something altogether new.
Now is the time to use the disruption around MOOCs to rethink and re-invigorate our commitment to education and not simply to the dissemination of knowledge. This will not be easy.
A case in point is the same Duke University course mentioned above, “Think Again: How to Reason and Argue.” In a recent article by Michael Fitzgerald, the professors—Walter Sinnott-Armstrong from Duke and Ram Neta of the University of North Carolina at Chapel Hill—describe how teaching their MOOC led them to radically re-conceive how they teach in physical university classrooms. Here is Fitzgerald:
“The big shift: far fewer in-class lectures. Students will watch the lectures on Coursera beginning Monday. "Class will become a time for activities and also teamwork," said Sinnott-Armstrong. He's devised exercises to help on-campus students engage with the concepts in the class, including a college bowl-like competition, a murder mystery night and a scavenger hunt, all to help students develop a deeper understanding of the material presented in the lectures. "You can have these fun activities in the classroom when you're not wasting the classroom time with the lectures," he said.”
What we see here is that the mass appeal of MOOCs and their use as a way of replacing lectures is not being seized as an opportunity to make education more serious, but as an excuse to make college more fun. That professors at two of this country’s elite universities see it as progress that classes are replaced by murder mystery games and scavenger hunts is evidence of a profound confusion between education and infotainment. I have no doubt that much can be learned through fun and games. Children learn through games and it makes all the sense in the world that Finland allows children in schools to play until they are seven or eight years old. Even in primary or at times in secondary school, simulations and games may be useful. But there is a limit. Education, at least higher education, is not simply fun and games in the pursuit of knowledge.
As Arendt understood, education requires that students be nurtured and allowed to grow into adults who think for themselves in a serious and engaged way about the world. This is one reason Arendt is so critical of reformist pedagogy that seeks to stimulate children—especially older children in secondary schools and even college—to learn through play. When we teach children a foreign language through games instead of through grammar or when we make them learn history by playing computer games instead of by reading and studying, we “keep the older child as far as possible at the infant level. The very thing that should prepare the child for the world of adults, the gradually acquired habit of work and of not-playing, is done away with in favor of the autonomy of the world of childhood.” The same can be said of university courses that adopt the juvenile means of primary and secondary education.
The reasons for such a move to games in the classroom are many. Games are easy, students love them, and thus they fill massive classes, leading to superstar professors who can command supersized salaries. What is more, games work. You can learn a language through games. But games rarely teach seriousness and independence of thought.
The rise of MOOCs and the rise of fun in the college classroom are part of the trend to reduce education to a juvenile pursuit. One hardly needs an advanced degree to oversee a scavenger hunt or prepare students to take a test. And scavenger hunts, as useful as they may be in making learning fun, will hardly inculcate the independence of mind and strength of character that will produce self-thinking citizens capable of renewing the common world.
The question of how to address the crisis in education today—the fact that an ever more knowledgeable population, with greater access to information than at any time in the history of the world, is perhaps the most politically illiterate citizenry in centuries—is the theme of the upcoming Hannah Arendt Center Conference, “Failing Fast: The Educated Citizen in Crisis.” In preparation for the conference, you can do nothing better than to re-read Hannah Arendt’s essay, "The Crisis in Education." You can also buy Between Past and Future, the book of essays in which it appears. However you read it, "The Crisis in Education" is your weekend read.
Barely more than a year old, MITx and edX now dominate discussion about the future of higher education like nothing else I have seen in my time in Cambridge, MA. I have been teaching at MIT for more than 10 years now, and can’t remember any subject touching directly on university life that came even remotely close to absorbing the attention of higher ed professionals in the region the way that edX has. From initial investments of $30 million each by the founding institutions, Harvard and MIT, to what seem like monthly announcements of new partnerships with the world’s colleges and universities (27 institutions currently belong to the “X” consortium), the levels of hype and institutional buy-in have been nothing short of extraordinary.
Because of their ubiquity in the popular press, higher ed industry periodicals, and the blogosphere, Massive Open Online Courses, or MOOCs, have become that most dangerous topic of discussion: a subject about which everybody needs to have an opinion. Such topics can unfortunately generate more heat than light, as the requirement to have and to express a point of view often means that the strongest and most extravagant opinions will claim attention and command the terms of debate. This is unfortunate if you favor the nuanced opinion or (as I do) feel genuinely ambivalent about MOOCs and the role(s) that they might play in shaping the future of higher education.
So far much of the discourse about MOOCs has tended to settle around two competing claims -- one for, one against -- that I articulated in a tweet a few months ago. Either MOOC providers are described as delivering free or low-cost quality higher education to those hard-pressed to afford it (and so performing a valuable public service); or MOOCs are understood to be selling a "lite" version of higher education to the poor while consolidating power and prestige with a few wealthy elite schools. In this dystopian view, the democratizing claims made by Udacity, Coursera and edX (the last formed of these outfits, and the only non-profit among them) are revealed instead to be essentially colonialist ones -- the colonialists, ed-tech profiteers hell-bent on thoroughly remaking the university as a crypto-corporate enterprise. MOOCs are understood to be an engine in this transformation, and an integral part of an overall design for reshaping higher education as a neoliberal market pursuit.
I can’t doubt that there is truth in both of these sets of claims. It is difficult at the same time to ignore that arguments for and against MOOCs look past each other in crucial respects, and leave precious little ground between them. What the accounts do share is an assumption that MOOCs will transform or “revolutionize” the landscape of higher education (for good or ill). Either MOOCs will be agents for elevating some in the less advantaged and underserved corners of the world, or MOOCs are instruments for extracting bodies from classrooms and tenure-track lines from university departments. The somewhat high-flown claims to educate and elevate underserved populations of the globe, often based on stray anecdote, are offered independently of any more substantive claim about the specific learning communities who benefit (or stand to benefit) from MOOCs. Similarly, claims about the profit motives animating the companies offering MOOCs subordinate all discussion of MOOCs to the ideological positions that they supposedly exist to promote. The designs attributed to MOOCs, and to the instructors who offer them, are such as to foreclose discussion rather than promote it.
While both accounts of MOOCs envision significant future consequences from their implementation, moreover, neither says very much about actually-existing MOOCs. The MOOC has become a repository for utopian and dystopian narratives about the present and future directions of higher ed. As a result, this or that fact about MOOCs is often considered (or not) insofar as it confirms the prevailing theory about them. 150,000 people signing up for a class demonstrates a clear hunger on the part of many across the globe for access to a quality education; this fact authorizes enlarged claims for the ability to transform higher education by bringing MOOCs to the masses. Similarly, the replicability of the digital medium -- and the fact that course content such as video lectures, once made, does not necessarily need to be re-made each year -- is conceived as a key to how MOOCs will force everyone in higher ed to make do (not do more) with less: less student-faculty interaction, fewer tenure-track professors, and down the road the prospect of fewer instructors (the majority of them adjuncts already) paid to teach in college classrooms.
In addition to fears that MOOCs will reinforce ongoing trends of budget cuts, adjunctification and layoffs of college teaching staff, another legitimate concern is that MOOCs will—by helping some schools with their branding strategies—have the effect of consolidating elite privilege with a few schools and the “superprofessors” (themselves overwhelmingly white and male) who teach MOOCs, leaving other lesser-ranked schools struggling to compete against a lower-priced virtual curriculum. The fear is that MOOCs will facilitate the emergence of two tiers in higher ed offerings: the “real” version, available only to students whose families can afford the exorbitant tuition (or who survive by taking out massive student loan debt); and the second-rate online version. With proposals on the table such as California’s Senate Bill 520, which would grant college credit for certain approved online courses, and Coursera’s recent announcement that it will sell its MOOCs to 10 public universities in the US, these fears are unfortunately very real. I hope to see more MOOCs spring up to contest that sense of an inevitable recentering of authority within the elite universities that host them. However difficult the task may prove to be, we need to disentangle the genuinely democratizing outreach work done by online education from its re-inscription of elite privilege.
These are important and pressing concerns. By the same token, they hardly exhaust all that can be said about MOOCs today. A host of important questions about the creation and implementation of MOOCs -- about course content, mode of learning, assessment, and so on -- should not be lost amidst conversations about the larger tendency (whether benevolent and democratizing, or insidious and corporatizing) to which MOOCs properly belong. The movement of classroom tasks and functions to online learning presents opportunities as well as risks; we should understand both. In an essay written late last year I tried to look without blinders at MOOCs, and to reflect both on the risks associated with their format and implementation and on their potential as instruments of learning and encounter. I wrote at the time that it wasn’t my intention "to defend the MOOC so much as...to hold open some alternative futures for it." For these alternative futures to emerge there needs to be vision, will, and coordinated effort on the part of many in higher ed. I am still willing at least to entertain the possibility that MOOCs may turn out to be an enabling, positive invention, even as I acknowledge indicators that point in the direction of their being a lamentably misguided one. But the rush to condemn and dismiss online courses may be as fundamentally mistaken as the rush to anoint them the future of higher education.
Blended learning modes present opportunities for both pedagogical experimentation and outreach; neither opportunity should I think be dismissed lightly. I have heard many instructors of MOOCs (in both STEM and humanities subjects) remark that the experience of teaching online has transformed their thinking and approach to teaching familiar material in the traditional classroom -- whether in pace and timing, course content, evaluation and assessment, etc. My interest in MOOCs extends to how the format can be imagined to provide access to a university curriculum to populations that may not have had this kind of access, as this is the population that stands to gain most from them. But in addition to the flat, global learning community ritually invoked as the audience for MOOCs, we could benefit from thinking locally too. How can the online course format make possible new relationships not only with the most far-flung remote corners of the earth but with the neighborhoods and communities nearest to campus? Can we make MOOCs that foster meaningful links with the community or create learning communities that cut across both the university and the online platform?
Among other alternative futures for MOOCs, I imagine more opportunities to collaborate with colleagues at other institutions. The single-delivery, “sage on stage” MOOC is no more the only online model available than is the large lecture class at a brick-and-mortar school. While MOOCs are still for the most part free and non-credit-bearing, we should try out (and generate metrics to assess) as many different teaching arrangements as possible. I hasten to add that this exploration should include the intellectual freedom, along with the technological affordances, to create a MOOC of any kind, at any time, with anybody. With instructors and modules selected in advance, some infrastructural support at each site, and a set of shared principles for continuity of curriculum and presentation, anybody could create a MOOC. Universities like Penn have already begun asking faculty to sign non-compete agreements, presumably to curb these kinds of collaborations. For as long as such arrangements are permissible, however, I would urge researchers to collaborate on MOOCs themselves. This may be a tall order, but not, I think, an impossible one.
From various quarters we have heard recent calls for a slow-down of the MOOC bandwagon. An open letter from Harvard faculty to the Dean of Faculty of Arts & Sciences calls for more oversight and reflective engagement with the question of how MOOCs offered through edX will affect “the higher education system as a whole.” I support these calls as consistent with the seriousness of the proposals to transform higher ed that are currently before us. From my modest position within the ranks of MIT administration I have been glad to see great care on the part of faculty to ensure that a spirit of experimentation and exploration with regard to MOOCs remains compatible with the core principles of the university and with a residential education. Cathy Davidson at Duke will in January 2014 teach a MOOC with Coursera simultaneously combined with a brick-and-mortar course on “The History and Future of Higher Ed,” with participation from classes at other schools and universities as well. These and other movements are to me reassuring signs, indicators of collaborative engagement around a topic of great importance. They indicate a willingness too to eschew rehearsing polarized opinions for or against MOOCs in order to attend at once to their innovative construction and to their effective and responsible implementation. The challenge is to remind ourselves periodically to think small (locally, incrementally) at the same time that we heed calls to think big.
After months in which university after university signed on to the bandwagon for Massive Open Online Courses (MOOCs), the battle over the future of education has finally begun. This week Duke University pulled out of edX, the Harvard/MIT-led MOOC consortium.
The reason: Its faculty rebelled. According to The New York Times,
While [Duke provost Peter] Lange saw the consortium as expanding the courses available to Duke students, some faculty members worried that the long-term effect might be for the university to offer fewer courses — and hire fewer professors. Others said there had been inadequate consultation with the faculty.
The Times also reports that faculty at Amherst College, my alma mater and former employer, voted against joining edX. Again, the faculty saw danger. My former colleagues worried that the introduction of online courses would detrimentally impact the quality and spirit of education at the small liberal arts college. They also, as our friends over at ViaMeadia report, worried that MOOCs would “take student tuition dollars away from so-called middle-tier and lower-tier” schools, pushing their colleagues at these institutions out of their jobs.
And that brings us to ground zero of the battle between the faculty and the MOOCs: San Jose State University. San Jose State has jumped out as a leader in the use of blended online and offline courses. Mohammad H. Qayoumi, the university's president, has defended his embrace of online curricula on both educational and financial grounds. He points to one course, "Circuits & Electronics," offered by edX. In a pilot program, students in that course did better than students in similar real-world courses taught by San Jose State professors. Where nearly 40% of San Jose students taking the traditional course received a C or lower, only 9% of students taking the edX course did. For Qayoumi and others, such studies offer compelling grounds for integrating MOOCs into the curriculum. The buzzword is “blended courses,” in which the MOOCs are used in conjunction with faculty tutors. In this “flipped classroom,” the old model, in which students listen to lectures in lecture halls and then do assignments at home, is replaced by online lectures supplemented by discussions and exercises done in class with professors. As I have written, such a model can be pedagogically powerful, if done right.
But as attractive as MOOCs may be, they carry with them real dangers. And these dangers emerge front and center in the hard-hitting Open Letter that the philosophy department at San Jose State University has published, addressed to Michael Sandel. Sandel is the Harvard professor famous for his popular and excellent course “Justice,” which has been wowing and provoking Harvard undergraduates for decades. Sandel not only teaches his course, he has branded it. He sells videos of the course; he published a book called Justice based on the course; and, most recently, he created an online video version of the course for edX. San Jose State recently became one of the first public universities in the country to sign a contract paying for the use of edX courses. This is what led to the letter from the philosophers.
The letter begins by laying out the clear issue. The San Jose Philosophy department has professors who can teach courses in justice and ethics of the kind Sandel teaches. From their point of view, “There is no pedagogical problem in our department that JusticeX solves, nor do we have a shortage of faculty capable of teaching our equivalent course.” In short, while some students may prefer a course with a famous Harvard professor, the faculty at San Jose State believe that they are qualified to teach about Justice.
Given their qualifications, the philosophy professors conclude that the real reason for the contract with edX is not increased educational value, but simply cost. As they write: "We believe that long-term financial considerations motivate the call for massively open online courses (MOOCs) at public universities such as ours."
In short, the faculty sees the writing on the wall. Whatever boilerplate rhetoric about blended courses and educational benefit may be fashionable and necessary, the real issue is simple. Public universities (and many private ones as well) will not keep paying the salaries of professors when those professors are not needed.
While for now professors are kept on to teach courses in a blended classroom, there will soon be need for many fewer professors. As students take Professor Sandel’s class at universities around the country, they will eventually work with teaching assistants—just as students do at Harvard, where Professor Sandel has pitifully little interaction with his hundreds of students in every class. These teaching assistants make little money, significantly less than a tenured or even a non-tenured professor. It is only a matter of time before many university classes are taught virtually by superstar professors assisted by armies of low-paid onsite assistants. State universities will then be able to educate significantly more students at a fraction of the current cost. For many students this will be a great boon—a certified and possibly quality education at a cheap price. For most California voters, this is a good deal. But it is precisely what the faculty at San Jose State fear. As they write:
We believe the purchasing of online and blended courses is not driven by concerns about pedagogy, but by an effort to restructure the U.S. university system in general, and our own California State University system in particular. If the concern were pedagogically motivated, we would expect faculty to be consulted and to monitor quality control. On the other hand, when change is financially driven and involves a compromise of quality it is done quickly, without consulting faculty or curriculum committees, and behind closed doors. This is essentially what happened with SJSU's contract with edX. At a press conference (April 10, 2013 at SJSU) announcing the signing of the contract with edX, California Lieutenant Governor Gavin Newsom acknowledged as much: "The old education financing model, frankly, is no longer sustainable." This is the crux of the problem. It is time to stop masking the real issue of MOOCs and blended courses behind empty rhetoric about a new generation and a new world. The purchasing of MOOCs and blended courses from outside vendors is the first step toward restructuring the CSU.
The San Jose State philosophy professors are undoubtedly correct. We are facing a systematic transformation in higher education in this country, and in secondary education as well. Just as the Internet has revolutionized journalism, and just as it is now shaking the foundations of medicine and law, the Internet will not leave education alone. Change seems nigh. Part of this change is being driven by cost. Some of it is also being driven by the failures and perceived failures of our current system. The question for those of us in the world of higher education is whether we can respond intelligently to save the good and change out the bad. It is time that faculties around the country focus on this question, and for that we should all be thankful to the philosophy professors at San Jose State.
The Open Letter offers three main points to argue that it is bad pedagogy to replace professors with the blended course model of MOOCs and teaching assistants.
First, they argue that good teaching requires professors engaged in research. When professors are engaged in active research programs, they are interested in and motivated by their fields. Students can perceive if a professor is bored with a class and students will always learn more and be driven to study and excel by professors who feel that their work matters. Some may wonder what the use of research is that is read by only a few colleagues around the world, but one answer is that such research is necessary to keep professors fresh and sharp. We all know the sad fate of professors who have disengaged from research.
Second, the philosophy professors accept the argument of many, including myself, that large lectures are not the best way to teach. They teach by the Socratic method, interacting with students. Such classes, they write, are much better than having students watch Professor Sandel engage Socratically with students at Harvard. Of course, the MOOC model would still allow for Socratic and personal engagement, just by much lower-paid purveyors of the craft. The unanswered question is whether low-paid assistants can be trained to teach well. The answer may well be yes.
Third, the philosophy faculty worry about the exact same moral justice course being taught across the country. We can already see the disciplinary barricades being drawn. It may be one thing to teach Math to the whole country from one or two MOOCs, but philosophy needs multiple perspectives. But how many? The philosophy professors suggest that their highly diverse and often lower-middle-class students have different experiences and references than do Professor Sandel’s Harvard students. They can, in the classroom, better connect with these students than Professor Sandel via online lectures.
The points the San Jose State philosophy professors raise are important. In many ways, however, their letter misses the point. Our educational system is now structured on a few questionable premises. First, that everyone who attends college wants a liberal arts education. That is simply not true. Many students simply want a credential to get a job. If these students can be taught well and more cheaply, we should help them. There is a question of whether we need to offer everyone the same kind of highly personalized and expensive education. While such arguments will be lambasted as elitist, it is nevertheless true that not everyone wants or needs to read Kant closely. We should seek to protect the ability of those who do—no matter their economic class—and also allow those who don’t a more efficient path through school.
A second questionable premise is that specialization is necessary to be a good teacher. This also is false. Too much specialization removes one from the world of common sense. As I have argued before, we need professors who are educated more generally. It is important to learn about Shakespeare and Aristotle, but you don’t need to be a specialist in Shakespeare or Aristotle to teach them well and thoughtfully to undergraduates. This is not an argument against the Ph.D. It is important to study and learn an intellectual tradition if you are going to teach. But it is an argument against the professionalization of the Ph.D. and of graduate education in general. It is also an argument against the dominance of undergraduate curriculum by professionalized scholars.
Third, and perhaps most importantly, is the premise that everyone needs to go to college. If we put a fraction of the resources we currently spend on remedial education for college students back into public high schools in this country, we could begin the process of transforming high school into a serious and meaningful activity. For one thing, we could begin employing Ph.D.s as high school teachers, as many of the emerging early colleges opening around the country are already doing.
I am sympathetic to the philosophy professors at San Jose State. I too teach a course on Justice called “The Foundation of Law: The Quest for Justice.” It is a course quite similar and yet meaningfully different from Michael Sandel’s course on Justice. I believe it is better, no offense meant. And I would be upset if I were told next year that instead of teaching my course I would be in effect a glorified TA for Professor Sandel. I hope it doesn’t come to that, but I know it might.
The only response for those whose jobs are being replaced by computers or the Internet is to go out and figure out how to do it better. That is what happened to journalists who were fired in droves. Many quit voluntarily and began developing new models of journalism, including blogs that have enriched our public discourse and largely rejuvenated public journalism in this country. Blogs, of course, are not perfect, and there is the question of how to make a living writing one. But enterprising bloggers like Andrew Sullivan and Walter Russell Mead are figuring that out. So too are professors like Michael Sandel and Andrew Ng.
We need educators to become experimental these days, to create small schools and intensive curricula within larger institutions that make the most of the personal interaction that is the core of true pedagogy. If that happens, and if teachers offer meaningful education for which students or our taxpayers will pay, then our jobs will be safe. And our students will be better for it. For this reason, we should welcome the technology as a push to make ourselves better teachers.
The Open Letter to Michael Sandel deserves a response. I hope Professor Sandel offers one. Until then, I recommend that this beautiful Spring weekend you read the letter from the San Jose State Philosophy Department. It is your weekend read.
San Jose State University is experimenting with a program where students pay a reduced fee for online courses run by the private firm Udacity. Teachers and their unions are in retreat across the nation. And groups like Uncollege insist that schools and universities are unnecessary. At a time when teachers are everywhere on the defensive, it is great to read this opening salvo from Leon Wieseltier:
When I look back at my education, I am struck not by how much I learned but by how much I was taught. I am the progeny of teachers; I swoon over teachers. Even what I learned on my own I owed to them, because they guided me in my sense of what is significant.
I share Wieseltier’s reverence for educators. Eric Rothschild and Werner Feig lit fires in my brain while I was in high school. Austin Sarat taught me to teach myself in college. Laurent Mayali introduced me to the wonders of history. Marianne Constable pushed me to be a rigorous reader. Drucilla Cornell fired my idealism for justice. And Philippe Nonet showed me how much I still had to know and inspired me to read and think ruthlessly in graduate school. Like Wieseltier, I can trace my life’s path through the lens of my teachers.
The occasion for such a welcome love letter to teachers is Wieseltier’s scathing rejection of homeschooling and unschooling, two movements that he argues denigrate teachers. As sympathetic as I am to his paean to pedagogues, Wieseltier’s rejection of all alternatives to conventional education today is overly defensive.
For all their many ills, homeschooling and unschooling are two movements that seek to personalize and intensify the often conventional and factory-like educational experience of our nation’s high schools and colleges. According to Wieseltier, these alternatives are possessed of the “demented idea that children can be competently taught by people whose only qualifications for teaching them are love and a desire to keep them from the world.” These movements believe that young people can “reject college and become ‘self-directed learners.’” For Wieseltier, the claim that people can teach themselves is both an “insult to the great profession of pedagogy” and a romantic over-estimation of the “untutored self.”
The romance of the untutored self is strong, but hardly new. While today educators like Will Richardson and entrepreneurs like Dale Stephens celebrate the abundance of the internet and argue that anyone can teach themselves with nothing more than an internet connection, that dream has a history. Consider this endorsement of autodidactic learning by Ray Bradbury, from long before the internet:
Yes, I am. I’m completely library educated. I’ve never been to college. I went down to the library when I was in grade school in Waukegan, and in high school in Los Angeles, and spent long days every summer in the library. I used to steal magazines from a store on Genesee Street, in Waukegan, and read them and then steal them back on the racks again. That way I took the print off with my eyeballs and stayed honest. I didn’t want to be a permanent thief, and I was very careful to wash my hands before I read them. But with the library, it’s like catnip, I suppose: you begin to run in circles because there’s so much to look at and read. And it’s far more fun than going to school, simply because you make up your own list and you don’t have to listen to anyone. When I would see some of the books my kids were forced to bring home and read by some of their teachers, and were graded on—well, what if you don’t like those books?
In this interview in the Paris Review, Bradbury not only celebrates the freedom of the untutored self, but also dismisses college along much the same lines as Dale Stephens of Uncollege does. Here is Bradbury again:
You can’t learn to write in college. It’s a very bad place for writers because the teachers always think they know more than you do—and they don’t. They have prejudices. They may like Henry James, but what if you don’t want to write like Henry James? They may like John Irving, for instance, who’s the bore of all time. A lot of the people whose work they’ve taught in the schools for the last thirty years, I can’t understand why people read them and why they are taught. The library, on the other hand, has no biases. The information is all there for you to interpret. You don’t have someone telling you what to think. You discover it for yourself.
What the library and the internet offer is unfiltered information. For the autodidact, that is all that is needed. Education is a self-driven exploration of the database of the world.
Of course such arguments are elitist. Not everyone is a Ray Bradbury or a Gottfried Wilhelm Leibniz, who taught himself Latin in a few days. Hannah Arendt refused to go to her high school Greek class because it was offered at 8 am—too early an hour for her mind to wake up, she claimed. She learned Greek on her own. For such people self-learning is an option. But even Arendt needed teachers, which is why she went to Freiburg to study with Martin Heidegger. She had heard, she later wrote, that thinking was happening there. And she wanted to learn to think.
What is it that teachers teach when they are teaching? To answer “thinking” or “critical reasoning” or “self-reflection” is simply to open more questions. And yet these are the crucial questions we need to ask. At a time when education is increasingly confused with information delivery, we need to articulate and promote the dignity of teaching.
What is most provocative in Wieseltier’s essay is his civic argument for a liberal arts education. Education, he writes, is the salvation of both the person and the citizen. Indeed it is the bulwark of a democratic politics:
Surely the primary objectives of education are the formation of the self and the formation of the citizen. A political order based on the expression of opinion imposes an intellectual obligation upon the individual, who cannot acquit himself of his democratic duty without an ability to reason, a familiarity with argument, a historical memory. An ignorant citizen is a traitor to an open society. The demagoguery of the media, which is covertly structural when it is not overtly ideological, demands a countervailing force of knowledgeable reflection.
That education is the answer to our political ills is an argument heard widely. During the recent presidential election, the candidates frequently appealed to education as the panacea for everything from our flagging economy to our sclerotic political system. Wieseltier trades in a similar argument: A good liberal arts education will yield critical thinkers who will thus be able to parse the obfuscation inherent in the media and vote for responsible and excellent candidates.
I am skeptical of arguments that imagine education as a panacea for politics. Behind such arguments is usually the unspoken assumption: “If X were educated and knew what they were talking about, they would see the truth and agree with me.” There is a confidence here in a kind of rational speech situation (of the kind imagined by Jürgen Habermas) that holds that when the conditions are propitious, everyone will come to agree on a rational solution. But that is not the way human nature or politics works. Politics involves plurality and the amazing thing about human beings is that educated or not, we embrace an extraordinary variety of strongly held, intelligent, and conscientious opinions. I am a firm believer in education. But I hold out little hope that education will make people see eye to eye, end our political paralysis, or usher in a more rational polity.
What then is the value of education? And why is it that we so deeply need great teachers? Hannah Arendt saw education as “the point at which we decide whether we love the world enough to assume responsibility for it.” The educator must love the world and believe in it if he or she is to introduce young people to that world as something noble and worthy of respect. In this sense education is conservative, insofar as it conserves the world as it has been given. But education is also revolutionary, insofar as the teacher must recognize that young people will go on to change the world they inherit. Teachers simply teach what is, Arendt argued; they leave to the students the chance to transform it.
To teach the world as it is, one must love the world—what Arendt comes to call amor mundi. A teacher must not despise the world or see it as oppressive, evil, and deceitful. Yes, the teacher can recognize the limitations of the world and see its faults. But he or she must nevertheless love the world with its faults and thus lead the student into the world as something inspired and beautiful. To teach Plato, you must love Plato. To teach geology, you must love rocks. While critical thinking is an important skill, what teachers really teach is enthusiasm and a love of learning. The great teachers are the lovers of learning. What they teach, above all, is the experience of discovery. And they do so by learning themselves.
Education is to be distinguished from knowledge transmission. It must also be distinguished from credentialing. And finally, education is not the same as indoctrinating students with values or beliefs. Education is about opening students to the fact of what is: teaching them about the world as it is. It is then up to the student, the young, to judge whether the world that they have inherited is loveable and worthy of retention, or whether it must be changed. The teacher is not responsible for changing the world; rather the teacher nurtures new citizens who are capable of judging the world on their own.
Arendt thus affirms Ralph Waldo Emerson's view that “He only who is able to stand alone is qualified for society.” Emerson’s imperative, to take up the divine idea allotted to each one of us, resonates with Arendt’s Socratic imperative, to be true to oneself. Education, Arendt insists, must risk allowing people their unique and personal viewpoints, eschewing political education and seeking, simply, to nurture independent minds. Education prepares the youth for politics by bringing them into a common world as independent and unique individuals. From this perspective, the progeny of teachers is the educated citizen, someone who is both self-reliant in an Emersonian sense and also part of a common world.
The gap between our citizens and our Government has never been so wide. The people are looking for honest answers, not easy answers; clear leadership, not false claims and evasiveness and politics as usual.
-Jimmy Carter, July 15, 1979
Contemporary observers of secondary education have appropriately decried the startling lack of understanding most students possess of the American presidency. This critique should not be surprising. In textbooks and classrooms across the country, curriculum writers and teachers offer an abundance of disconnected facts about the nation’s distinct presidencies—the personalities, idiosyncrasies, and unique time-bound crises that give character and a simple narrative arc to each individual president. Some of these descriptions contain vital historical knowledge. Students should learn, for example, how a conflicted Lyndon Johnson pushed Congress for sweeping domestic programs against the backdrop of Vietnam or how a charismatic and effective communicator like Ronald Reagan found Cold War collaboration with Margaret Thatcher and Mikhail Gorbachev.
But what might it mean to ask high school students to look across these and other presidencies to encourage more sophisticated forms of historical thinking? More specifically, what might teachers begin to do to promote thoughtful writing and reflection that goes beyond the respective presidencies and questions the nature of the executive office itself? And how might one teach the presidency, in Arendtian fashion, encouraging open dialogue around common texts, acknowledging the necessary uncertainty in any evolving classroom interpretation of the past, and encouraging flexibility of thought for an unpredictable future? By provocatively asking whether the president “matters,” the 2012 Hannah Arendt Conference provided an ideal setting for New York secondary teachers to explore this central pedagogical challenge in teaching the presidency.
Participants in this special writing workshop, scheduled concurrently with the conference, attended conference panels and also retreated to consider innovative and focused approaches to teaching the presidency.
Conference panels promoted a broader examination of the presidency than typically found in secondary curricula. A diverse and notable group of scholars urged us to consider the events and historical trends, across multiple presidencies, constraining or empowering any particular chief executive. These ideas, explored more thoroughly in the intervening writing workshops, provoked productive argument on what characteristics might define the modern American presidency. In ways both explicit and implicit, sessions pointed participants to numerous and complicated ways Congress, the judiciary, mass media, U.S. citizens, and the president relate to one another.
This sweeping view of the presidency contains pedagogical potency and has a place in secondary classrooms. Thoughtful history educators should ask big questions, encourage open student inquiry, and promote civic discourse around the nature of power and the purposes of human institutions. But as educators, we also know that the aim and value of our discipline resides in place- and time-bound particulars that beg for our interpretation and ultimately build an evolving understanding of the past. Good history teaching combines big ambitious questions with careful attention to events, people, and specific contingencies. Such specifics are the building blocks of storytelling and shape the analogies students need to think through an uncertain future.
Jimmy Carter’s Oval Office speech on July 15, 1979, describing a national “crisis of confidence,” presented a unique case study for thinking about the interaction between American presidents and the population the office is constitutionally obliged to serve. Workshop participants prepared for the conference by watching the video footage from this address and reading parts of Kevin Mattson’s history of the speech. In what quickly became known as the “Malaise Speech,” Carter attempted a more direct and personal appeal to the American people, calling for personal sacrifice and soul searching, while warning of dire consequences if the nation did not own up to its energy dependencies. After Vietnam and Watergate, Carter believed, America needed a revival that went beyond policy recommendations. His television address, after a mysterious 10-day sequestration at Camp David, took viewers through Carter’s own spiritual journey and promoted the conclusions he drew from it.
Today, the Malaise Speech has come to symbolize a failed Carter presidency. He has been lampooned, for example, on The Simpsons as our most sympathetically honest and humorously ineffectual former president. In one episode, residents of Springfield cheer the unveiling of his presidential statue, emblazoned with “Malaise Forever” on the pedestal. Schools give the historical Carter even less respect. Standardized tests such as the NY Regents exam ask little if anything about his presidency. The Malaise Speech is rarely mentioned in classrooms—at either the secondary or post-secondary levels. Similarly, few historians identify Carter as particularly influential, especially when compared to the leaders elected before and after him. Observers who mention his 1979 speeches are most likely footnoting a transitional narrative for an America still recovering from a turbulent Sixties and heading into a decisive conservative reaction.
Indeed, workshop participants used writing to question and debate Carter’s place in history and the limited impact of the speech. But we also identified, through primary sources on the 1976 election and documents around the speech, ways for students to think expansively about the evolving relationship between a president and the people. A quick analysis of the electoral map that brought Carter into office reminded us that Carter was attempting to convince a nation that looked and behaved quite differently than the one we know today. The vast swaths of blue throughout the South and red coastal counties in New York and California are striking. Carter’s victory map can resemble an electoral photo negative of what has now become a familiar and predictable image of specific regional alignments in the Bush/Obama era. The president who was elected in 1976, thanks in large part to an electorate still largely undefined by the later rise of the Christian Right, remains an historical enigma. As an Evangelical Democrat from Georgia, with roots in both farming and nuclear physics, comfortable admitting his sins in both Sunday School and Playboy, and neither energized by nor defensive about abortion or school prayer, Carter is as difficult to imagine today as the audience he addressed in 1979.
It is similarly difficult for us to imagine the Malaise Speech ever finding a positive reception. However, this is precisely what Mattson argues. Post-speech weekend polls gave Carter’s modest popularity rating a surprisingly respectable 11-point bump. Similarly, in a year when most of the president’s earlier speeches were ignored, the White House found itself flooded with phone calls and letters, almost universally positive. The national press was mixed and several prominent columnists praised the speech. This reaction to such an unconventional address, Mattson goes on to argue, suggests that the presidency can matter.
Workshop participants who attended later sessions heard Walter Russell Mead reference the ways presidents can be seen as either transformative or transactional. In many ways, the “malaise moment” could be viewed as a late-term attempt by a transactional president to forge a transformational presidency. In the days leading up to the speech, Carter went into self-imposed exile, summoning spiritual advisors to his side and encouraging administration-wide soul searching. Such an approach to leadership, admirable to some and an act of desperation to others, defies conventions and presents an odd image of presidential behavior (an idea elaborated on by conference presenter Wyatt Mason). “Malaise” was never mentioned in Carter’s speech. But his transformational aspirations are hard to miss.
In a nation that was proud of hard work, strong families, close-knit communities, and our faith in God, too many of us now tend to worship self-indulgence and consumption. Human identity is no longer defined by what one does, but by what one owns. But we've discovered that owning things and consuming things does not satisfy our longing for meaning. We've learned that piling up material goods cannot fill the emptiness of lives which have no confidence or purpose.
It is this process—the intellectual act of interpreting Carter and his [in]famous speech as aberrant presidential behavior—that allows teachers and their students to explore together the larger question of defining the modern presidency. And it is precisely this purposeful use of a small number of primary sources that forces students to rethink, through writing and reflection, the parameters that shape how presidents relate to their electorate. In our workshop we saw how case studies, in-depth explorations of the particulars of history, precede productive debate on whether the presidency matters.
The forgotten Carter presidency can play a disproportionately impactful pedagogical role for teachers interested in exploring the modern presidency. As any high school teacher knows, students rarely bring an open interpretive lens to Clinton, Bush, or Obama. Ronald Reagan, as the first political memory for many of their parents, remains a polarizing figure. However, few students or their parents hold strong politically consequential opinions about Carter. Most Americans, at best, continue to view him as a likable, honest, ethical man who is much more effective as an ex-president than he was as president.
Workshop participants learned that the initial support Carter received after the Malaise Speech faded quickly. Mattson and some members of the administration now argue that the President lacked a plan to follow up on the goodwill he received from a nation desiring leadership. Reading Ezra Klein, we also considered the possibility that, despite all the attention educators give to presidential speeches (as primary sources that quickly encapsulate presidential visions), there is little empirical evidence that any public address really makes much of a difference. In either case, Carter’s loss 16 months later suggests that his failures of leadership were both transformational and transactional.
Did Carter’s speech matter? The teachers in the workshop concluded their participation by attempting to answer this question, working collaboratively to draft a brief historical account contextualizing the 1979 malaise moment. In doing so, we engaged in precisely the type of activity missing in too many secondary school classrooms today: interrogating sources, corroborating evidence, debating conflicting interpretations, paying close attention to language, and doing our best to examine our underlying assumptions about the human condition. These efforts produced some clarity, but also added complexity to our understanding of the past and led to many additional questions, both pedagogical and historical. In short, our writing and thinking during the Arendt Conference produced greater uncertainty. And that reality alone suggests that study of the presidency does indeed matter.
Stephen Mucher is assistant professor of history education in the Master of Arts in Teaching Program at Bard College.
The workshop, Teaching the American Presidency, facilitated by Teresa Vilardi and Stephen Mucher, sponsored by the Institute for Writing and Thinking and Master of Arts in Teaching Program in collaboration with the Hannah Arendt Center at Bard College was offered as part of the Center’s 2012 conference, “Does the President Matter? American Politics in an Age of Disrepair.”
In this post, academics and university faculty will be criticized. Railing against college professors has become a common pastime, one practiced almost exclusively by those who have been taught and mentored by those who are now being criticized. It is thus only fair to say upfront that the college education in the United States is, in spite of its myriad flaws, still of incredible value and meaning to tens if not hundreds of thousands of students every year.
That said, too much of what our faculties teach is neither interesting nor wanted by our students.
This is a point that Jacques Berlinerblau makes in a recent essay in the Chronicle of Higher Education.
Observers of gentrification like to draw a distinction between needs and wants. Residents in an emerging neighborhood need dry cleaners, but it's wine bars they really want. The application of that insight to the humanities leads me to an unhappy conclusion: Our students, and the educated public at large, neither want us nor need us.
What is amazing is that not only do our students not want what we offer, but neither do our colleagues. It is a staggering truth that much of what academics write and publish is rarely, if ever, read. And if you want to really experience the problem, attend an academic conference some day, where you will see panels of scholars presenting their work, sometimes to one or two audience members. According to Berlinerblau, the average audience at academic conference panels is fourteen persons.
The standard response to such realizations is that scholarship is timeless. Its value may not be discovered for decades or even centuries until someone, somewhere, pulls down a dusty volume and reads something that changes the world. There is truth in such claims. When one goes digging in archives, there are pearls of wisdom to be found. What is more, the scholarly process consists of the accumulation of information and insight over generations. In other words, academic research is like basic scientific research: apparently useless, yet valuable in itself.
The problem with this argument is that such truly original scholarship is rare and becoming rarer. While there are exceptions, little original research is left to do in most fields of the humanities. Few important books are published each year. The vast majority are as derivative as they are unnecessary. We would all do better to read and think about the few important books (obviously there will be some disagreement and divergent schools) than to spend our time trying to establish our expertise by commenting on some small part of those books.
The result of the academic imperative of publish or perish is the increasing specialization that leads to knowing more and more about less and less. This is the source of the irrelevance of much of humanities scholarship today.
As Hannah Arendt wrote 50 years ago in her essay On Violence, humanities scholars today are better served by being learned and erudite than by seeking to do original research by uncovering some new or forgotten scrap. While such finds can be interesting, they are exceedingly rare and largely insignificant.
As a result—and it is hard to hear for many in the scholarly community—we simply don't need 200 medieval scholars in the United States or 300 Rawlsians or 400 Biblical scholars. It is important that Chaucer and Nietzsche are taught to university students; but the idea that every college and university needs a Chaucer and a Nietzsche scholar to teach Chaucer and Nietzsche is simply wrong. We should, of course, continue to support scholars, those whose work is to some extent scholarly innovative. But more needed are well-read and thoughtful teachers who can teach widely and write for a general audience.
To say that excessively specialized humanities scholarship today is irrelevant is not to say that the humanities are irrelevant. The humanities are that space in the university system where power does not have the last word, where truth and beauty as well as insight and eccentricity reign supreme, and where young people come into contact with the great traditions, writing, and thinking that have made us who we are today. The humanities introduce us to our ancestors and our forebears and acculturate students into their common heritage. It is in the humanities that we learn to judge the good from the bad and thus where we first encounter the basic moral faculty for making judgments. It is because the humanities teach taste and judgment that they are absolutely essential to politics. It is even likely that the decline of politics today is profoundly connected to the corruption of the humanities.
Hannah Arendt argues precisely for this connection between the humanities and politics in her essay The Crisis in Culture. Part Two of the essay addresses the political significance of culture, which she relates to humanism—both of which are said to be of Roman origin. The Romans, she writes, knew how to care for and cultivate the grandiose political and artistic creations of the Greeks. And it is a line from Pericles that forms the center of Arendt's reflections.
The Periclean citation is translated (in part) by Arendt to say: "We love beauty within the limits of political judgment." The judgment of beauty, of culture, and of art is, Pericles says, limited by the political judgment of the people. There is, in other words, an intimate connection between culture and politics. In culture, we make judgments of taste and thus learn the faculty of judgment so necessary for politics. And political judgment, in turn, limits and guides our cultural judgments.
What unites culture and politics is that they are "both phenomena of the public world." Judgment, the primary faculty of politics, is discovered, nurtured, and practiced in the world of culture and the judgment of taste. What the study of culture through the humanities offers, therefore, is an orientation towards a common world that is known and understood through a common sense. The humanities, Arendt argues, are crucial for the development and preservation of common sense—something that is unfortunately all-too-lacking in much humanities scholarship today.
What this means is that teaching the humanities is absolutely essential for politics—and as long as that is the case, there will be a rationale for residential colleges and universities. The mania for distance learning today is understandable. Education is, in many cases, too expensive. Much could be done more cheaply and efficiently at colleges. And this will happen. Colleges will, increasingly, bring computers and the Internet into their curricula. But as powerful as the Internet is, and as useful as it is as a replacement for passive learning in large lectures, it is not yet a substitute for face-to-face learning that takes place at a college or university. The learning that takes place in the hallways, offices, and dining halls when students live, eat, and breathe their coursework over four years is simply fundamentally different from taking a course online in one's free time. As exciting as technology is, it is important to remember that education is, at its best, not about transmitting information but about inspiring thinking.
Berlinerblau thinks that what will save the humanities is better training in pedagogy. He writes:
As for the tools, let's look at it this way. Much as we try to foist "critical thinking skills" on undergraduates, I suggest we impart critical communication skills to our master's and doctoral students. That means teaching them how to teach, how to write, how to speak in public. It also means equipping them with an understanding that scholarly knowledge is no longer locked up in journals and class lectures. Spry and free, it now travels digitally, where it may intersect with an infinitely larger and more diverse audience. The communicative competences I extoll are only infrequently part of our genetic endowment. They don't come naturally to many people—which is precisely what sets the true humanist apart from the many. She or he is someone you always want to speak with, listen to, and read, someone who always teaches you something, blows your mind, singes your feathers. To render complexity with clarity and style—that is our heroism.
The focus on pedagogy is a mistake; it rests on the flawed assumption that the problem with the humanities is that professors aren't good communicators. It may be true that professors communicate poorly, but the real problem lies deeper. If generations of secondary school teachers trained in pedagogy have taught us anything, it is that pedagogical technique alone does not make good teaching. Authority in the classroom comes from knowledge and insight, not from pedagogical techniques or theories.
The pressing issue is less pedagogy than the fact that what most professors know is so specialized as to be irrelevant. What is needed is not better pedagogical training, but a broader and more erudite training, one that focuses less on original research and academic publishing and instead demands wide reading and writing aimed at an educated yet popular audience. What we need, in other words, are academics who read widely with excitement and inspiration and speak to the interested public.
More professors should be blogging and writing in public-interest journals. They should be reviewing literature rather than each other's books and, shockingly, they should be writing fewer academic monographs.
To say that the humanities should engage the world does not mean that the humanities should be politicized. The politicization of the humanities has shorn them of their authority and their claim to being true or beautiful. Humanities scholarship can only serve as an incubator for judgment when it is independent from social and political interests. But political independence is not the same as political sterility. Humanities scholarship can, and must, teach us to see and know our world as it is.
There are few essays that better express the worldly importance of the humanities than Hannah Arendt's The Crisis in Culture. It is worth reading and re-reading it. On this hot summer weekend, do yourself that favor.