**This post was originally published August 10th, 2012**
In this post, academics and university faculty will be criticized. Railing against college professors has become a common pastime, one practiced almost exclusively by those who have been taught and mentored by the very people now being criticized. It is thus only fair to say upfront that a college education in the United States is, in spite of its myriad flaws, still of incredible value and meaning to tens if not hundreds of thousands of students every year.
That said, too much of what our faculties teach is neither interesting nor wanted by our students.
"Seen from the perspective of the 'real' world, the laboratory is the anticipation of a changed environment."
-Hannah Arendt, The Life of the Mind
I find this quote intriguing in that its reference to environments and environmental change speaks to the fact that Arendt's philosophy was essentially an ecological one, indeed one that is profoundly media ecological. The quote appears in a section of The Life of the Mind entitled "Science and Common Sense," in which Arendt argues that the practice of science is quite distinct from thinking as a philosophical activity.
As she explains:
Thinking, no doubt, plays an enormous role in every scientific enterprise, but it is a role of a means to an end; the end is determined by a decision about what is worthwhile knowing, and this decision cannot be scientific.
Here Arendt invokes a variation on Gödel's incompleteness theorem in mathematics, noting that science cannot justify itself on scientific grounds, but rather must somehow depend on something outside of and beyond itself. Perhaps more to the point, science, especially as associated with empiricism, cannot be divorced from concrete reality, and does not function only in the abstract realm of ideas that Plato insisted was the only true reality.
The transformation of truth into mere verity results primarily from the fact that the scientist remains bound to the common sense by which we find our bearings in a world of appearances. Thinking withdraws radically and for its own sake from this world and its evidential nature, whereas science profits from a possible withdrawal for the sake of specific results.
It is certainly the case that scientific truth is always contingent, tentative, open to refutation, as Karl Popper explained. Scientific truth is never absolute, never anything more than a map of some other territory, a map that needs to be continually tested and reviewed, updated and revised, as Alfred Korzybski explained by way of establishing his discipline of general semantics. Even the so-called laws of nature and physics need not be considered immutable, but may be subject to change and evolution, as Lee Smolin argues in his insightful book, Time Reborn.
Scientists are engaged in the process of abstracting, insofar as they take the data gained by empirical investigation and make generalizations in the form of theories and hypotheses, but this process of induction cannot be divorced from concrete reality, from the world of appearances. Science may be used to test, challenge, and displace common sense, but it operates on the same level, as a distilled form of common sense, rather than something qualitatively different, a status Arendt reserves for the special activity of thinking associated with philosophy.
Arendt goes on to argue that both common sense and scientific speculation lack "the safeguards inherent in sheer thinking, namely thinking's critical capacity." This includes the capacity for moral judgment, which became horrifically evident by the ways in which Nazi Germany used science to justify its genocidal policies and actions. Auschwitz did not represent a retrieval of tribal violence, but one of the ultimate expressions of the scientific enterprise in action. And the same might be said of Hiroshima and Nagasaki, holding aside whatever might be said to justify the use of the atomic bomb to bring the Second World War to a speedy conclusion. In remaining close to the human lifeworld, science abandons the very capacity that makes us human, that makes human life and human consciousness unique.
The story of modern science is in fact a story of shifting alliances. Science begins as a branch of philosophy, as natural philosophy. Indeed, philosophy itself is generally understood to begin with the pre-Socratics sometimes referred to as Ionian physicists, e.g., Thales, Anaximander, and Heraclitus, who first posited the concept of elements, with atomism soon to follow. Both science and philosophy therefore coalesce during the first century that followed the introduction of the Greek alphabet and the emergence of a literate culture in the ancient Greek colonies in Asia Minor.
And just as ancient science is alphabetic in its origins, modern science begins with typography, as the historian Elizabeth Eisenstein explains in her exhaustive study, The Printing Press as an Agent of Change in Early Modern Europe. Simply by making the writings of natural philosophers easily available through the distribution of printed books, scholars were able to compare and contrast what different philosophers had to say about the natural world, and uncover their differences of opinion and contradictions. And this in turn spurred them on to find out for themselves which of the various competing explanations was correct, where the truth lay, so that more reading led to even more empirical research, which in turn would have to be published, that is, made public, via printing, for the purposes of testing and confirmation. And publication encouraged the formation of a scientific republic of letters, a typographically mediated virtual community.
Eisenstein notes that during the first century following Gutenberg, printed books gave Copernicus access to centuries of recorded observations of the movements of celestial objects, access not easily available to his predecessors. What is remarkable to consider is that the telescope was not invented in his lifetime: the Polish astronomer arrived at his heliocentric view based only on what could be observed by the naked eye, by gazing up at the heavens, and down at the printed page. The typographic revolution that began in the 15th century was the necessary technological precondition for the Copernican revolution of the 16th century. A tool to extend vision beyond its natural capabilities was not required, although soon after the telescope's introduction Galileo was able to confirm the theory that Copernicus had put forth a century earlier.
In the restricted literate culture of medieval Europe, the idea took hold that there are two books to be studied in an effort to discern the divine will and mind: the book of scripture and the book of nature. Both books were seen as sources of knowledge that could be unlocked by a process of reading and interpretation. It was grammar, the ancient study of language and one third of the trivium, the foundational curriculum of the medieval university, that became the basis of modern science, and not dialectic or logic, that is, pure thinking, the source of the philosophic tradition, as Marshall McLuhan noted in The Classical Trivium. The medieval schoolmen of course placed scripture in the primary position, whereas modern science situates truth in the book of nature alone.
The publication of Francis Bacon's Novum Organum in 1620 first formalized the separation of science from philosophy within print culture, but the divorce was finalized during the 19th century, coinciding with the industrial revolution, as researchers became known as scientists rather than natural philosophers. In place of the alliance with philosophy, science came to be associated with technology; before this time, technology and engineering, often referred to as mechanics, represented an entirely different line of inquiry, utterly practical, often intuitive rather than systematic. Mechanics was part of the world of work rather than that of action, to use the terms Arendt introduced in The Human Condition, which is to say that it was seen as the work of the hand rather than the mind. By the end of the 19th century, scientific discovery emerged as the main source of major technological breakthroughs, rather than innovation springing fully formed from the tinkering of inventors, and it became necessary to distinguish between applied science and theoretical science, the latter nonetheless still tied to the world of appearances.
Today, the acronym STEM, which stands for science, technology, engineering, and mathematics, has become a major buzzword in education, a major emphasis in particular for higher education, and a major concern with regard to economic competitiveness. We might well take note of how recent this combination of fields and disciplines really is, insofar as mathematics represents pure logic and highly abstract forms of thought, and science once was a purely philosophical enterprise, both aspects of the life of the mind. Technology and engineering, on the other hand, for most of our history took the form of arts and crafts, part of the world of appearances.
The convergence of science and technology also had much to do with scientists' increasing reliance on scientific instruments for their investigations, a trend increasingly prevalent following the introduction of both the telescope and the microscope in the early 17th century, a trend even more apparent from the 19th century on. The laboratory is in fact another such instrument, a technology whose function is to provide precisely controlled conditions, beyond its role as a facility for the storage and use of other scientific instruments. Scientific instruments are media that extend our senses and allow us to see the world in new ways, therefore altering our experience of our environment, while the discoveries they lead to provide us with the means of altering our environments physically. And the laboratory is an instrument that provides us with a total environment, enclosed, controlled, isolated from the world to become in effect the world. It is a micro-environment where experimental changes can be made that anticipate changes that can be made to the macro-environment we regularly inhabit.
The split between science and philosophy can also be characterized as a division between the eye and the ear. Modern science, as intimately bound up in typography, is associated with visualism, the idea that seeing is believing, that truth is based on vision, that knowledge can be displayed visually as an organized set of facts, rather than the product of ongoing dialogue, and debate. McLuhan noted the importance of the fixed point of view as a by-product of training the eye to read, and Walter Ong studied the paradigm-shift in education attributed to Peter Ramus, who introduced pedagogical methods we would today associate with textbooks, outlining, and the visual display of information. Philosophy has not been immune to this influence, but retains a connection to the oral-aural mode through the method of Socratic dialogue, and by way of an understanding of the history of ideas as an ongoing conversation. Arendt, in The Human Condition, explained action, the realm of words, as a social phenomenon, one based on dialogic exchanges of ideas and opinions, not a solitary matter of looking things up. And thinking, which she elevates above the scientific enterprise in The Life of the Mind, is mostly a matter of an inner dialogue, or monologue if you prefer, of hearing oneself think, of silent speech, and not of a mental form of writing out words or imaginary reading. We talk things out, to others and/or to ourselves.
Science, on the other hand, is all about visible representations, as words, numbers, illustrations, tables, graphs, charts, diagrams, etc. And it is the investigation of visible phenomena, or otherwise of phenomena that can be rendered visible through scientific instruments. Acoustic phenomena can only be dealt with scientifically by being turned into a visual measurement, either of numbers or of lines going up and down to depict sound waves. The same is true for the other senses; smell, taste, and touch can only be dealt with scientifically through visual representation. Science cannot deal with any sense other than sight on its own terms, but always requires an act of translation into visual form. Thus, Arendt notes that modern science, being so intimately bound up in the world of appearances, is often concerned with making the invisible visible:
That modern science, always hunting for manifestations of the invisible—atoms, molecules, particles, cells, genes—should have added to the world a spectacular, unprecedented quantity of new perceptible things is only seemingly paradoxical.
Arendt might well have noted the continuity between the modern activity of making the invisible visible as an act of translation, and the medieval alchemist's search for methods of achieving material transformation, the translation of one substance into another. She does note that the use of scientific instruments are a means of extending natural functions, paralleling McLuhan's characterization of media as extensions of body and biology:
In order to prove or disprove its hypotheses… and to discover what makes things work, it [modern science] began to imitate the working processes of nature. For that purpose it produced the countless and enormously complex implements with which to force the non-appearing to appear (if only as an instrument-reading in the laboratory), as that was the sole means the scientist had to persuade himself of its reality. Modern technology was born in the laboratory, but this was not because scientists wanted to produce appliances or change the world. No matter how far their theories leave common-sense experience and common-sense reasoning behind, they must finally come back to some form of it or lose all sense of realness in the object of their investigation.
Note here that our conception of reality, of what lends something the aura of authenticity, as Walter Benjamin would put it, is dependent on the visual sense, on the phenomenon being translated into the world of appearances (the aura as opposed to the aural). It is no accident then that there is a close connection in biblical literature and the Hebrew language between the words for spirit and soul, and the words for invisible but audible phenomena such as wind and breath, breath in turn being the basis of speech (and this is not unique to Hebraic culture or vocabulary). It is at this point that Arendt resumes her commentary on the function of the controlled environment:
And this return is possible only via the man-made, artificial world of the laboratory, where that which does not appear of its own accord is forced to appear and to disclose itself. Technology, the "plumber's" work held in some contempt by the scientist, who sees practical applicability as a mere by-product of his own efforts, introduces scientific findings, made in "unparalleled insulation… from the demands of the laity and of everyday life," into the everyday world of appearances and renders them accessible to common-sense experience; but this is possible only because the scientists themselves are ultimately dependent on that experience.
We now reach the point in the text where the quote I began this essay with appears, as Arendt writes:
Seen from the perspective of the "real" world, the laboratory is the anticipation of a changed environment; and the cognitive processes using the human abilities of thinking and fabricating as means to their end are indeed the most refined modes of common-sense reasoning. The activity of knowing is no less related to our sense of reality and no less a world-building activity than the building of houses.
Again, for Arendt, science and common sense both are distinct in this way from the activity of pure thinking, which can provide a sorely needed critical function. But her insight as to the function of the laboratory as an environment in which the invisible is made visible is important in that this helps us to understand that the laboratory is, in fact, what McLuhan referred to as a counter-environment or anti-environment.
In our everyday environment, the environment itself tends to be invisible, if not literally so, then functionally insofar as whatever fades into the background tends to fall out of our perceptual awareness or is otherwise ignored. Anything that becomes part of our routine falls into this category, becoming environmental, and therefore subliminal. And this includes our media, technology, and symbol systems, insofar as they are part of our everyday world. We do pay attention to them when they are brand new and unfamiliar, but once their novelty wears off they become part of the background, unless they malfunction or break down. In the absence of such conditions, we need an anti-environment to provide a contrast through which we can recognize the things we take for granted in our world, to provide a place to stand from which we can observe our situation from the outside in, from a relatively objective stance. We are, in effect, sleepwalkers in our everyday environment, and entering into an anti-environment is a way to wake us up, to enhance awareness and consciousness of our surroundings. This occurs, in a haphazard way, when we return home after spending time experiencing another culture, as for a brief time much of what was once routinized about our own culture suddenly seems strange and arbitrary to us. The effect wears off relatively quickly, however, although the after-effects of broadening our minds in this way can be significant.
The controlled environment of the laboratory helps to focus our attention on phenomena that are otherwise invisible to us, either because they are taken for granted, or because they require specialized instrumentation to be rendered visible. It is not just that such phenomena are brought into the world of appearances, however, but also that they are made into objects of concerted study, to be recorded, described, measured, experimented upon, etc.
McLuhan emphasized the role of art as an anti-environment. The art museum, for example, is a controlled environment, and the painting that we encounter there has the potential to make us see things we had never seen before, by which I mean not just objects depicted that are unfamiliar to us, but familiar objects depicted in unfamiliar ways. In this way, works of art are instruments that can help us to see the world, to use our senses and perceive, in new and different ways. McLuhan believed that artists served as a kind of distant early warning system, borrowing cold war terminology to refer to their ability to anticipate changes occurring in the present that most others are not aware of. He was fond of the Ezra Pound quote that the artist is the antenna of the race, and Kurt Vonnegut expressed a similar sentiment in describing the writer as a canary in a coal mine. We may further consider the art museum or gallery or library as a controlled environment, a laboratory of sorts, and note the parallel in the idea of art as the anticipation of a changed environment.
There are other anti-environments as well. Houses of worship function in this way, often because they are based on earlier eras and different cultures, and otherwise are constructed to remove us from our everyday environment, and help us to see the world in a different light. They are in some way dedicated to making the invisible world of the spirit visible to us through the use of sacred symbols and objects, even for religions whose concept of God is one that is entirely outside of the world of appearances. Sanctuaries might therefore be considered laboratories used for moral, ethical, and sacred discovery, experimentation, and development, and places where changed environments are also anticipated, in the form of spiritual enlightenment and the pursuit of social justice. This also suggests that the scientific laboratory might be viewed, in a certain sense, as a sacred space, along the lines that Mircea Eliade discusses in The Sacred and the Profane.
The school and the classroom are also anti-environments, or at least ought to be, as Neil Postman argued in Teaching as a Conserving Activity. Students are sequestered away from the everyday environment, into a controlled situation where the world they live in can be studied and understood, and phenomena that are taken for granted can be brought into conscious awareness. It is indeed a place where the invisible can be made visible. In this sense, the school and the classroom are laboratories for learning, although the metaphor can be problematic when it is used to imply that the school is only about the world of appearances, and all that is needed is to let students discover that world for themselves. Exploration is indeed essential, and discovery is an important component of learning. But the school is also a place where we may engage in the critical activity of pure thinking, of critical reasoning, of dialogue and disputation.
The classroom is more than a laboratory, or at least it must become more than a laboratory, or the educational enterprise will be incomplete. The school ought to be an anti-environment, not only in regard to the everyday world of appearances and common sense, but also to that special world dominated by STEM, by science, technology, engineering and math. We need the classroom to be an anti-environment for a world subject to a flood of entertainment and information, we need it to be a language-based anti-environment for a world increasingly overwhelmed by images and numbers. We need an anti-environment where words can take precedence, where reading and writing can be balanced by speech and conversation, where reason, thinking, and thinking about thinking can allow for critical evaluation of common sense and common science alike. Only then can schools be engaged in something more than just adjusting students to take their place in a changed and changing environment, integrating them within the technological system, as components of that system, as Jacques Ellul observed in The Technological Society. Only then can schools help students to change the environment itself, not just through scientific and technological innovation, but through the exercise of values other than the technological imperative of efficiency, to make things better, more human, more life-affirming.
The anti-environment that we so desperately need is what Hannah Arendt might well have called a laboratory of the mind.
The jury trial is, as Alexis de Tocqueville understood, one essential incubator of American democracy. The jury trial is the only space in which most people will ever be forced to sit in judgment of their fellow citizens and declare them innocent or guilty; or, in a civil trial, to judge whether one party’s wrong requires compensation. The experience of being a juror, Tocqueville saw, inculcates in all citizens the habits of mind of the judge; it “spreads to all classes respect for the thing judged and the idea of right.” Juries, he wrote, are “one of the most efficacious means society can make use of for the education of the people.”
If the experience of sitting in judgment as a juror is a bulwark of our democratic freedoms, we should be worried. As Albert W. Dzur writes, the jury trial, once the “standard way Americans handled criminal cases,” is now largely absent from the legal system. The jury trial “has been supplanted by plea agreements, settlements, summary judgments, and other non-trial forums that are usually more efficient and cost-effective in the short term. In addition to cost and efficiency, justice officials worry about juror competence in the face of scientific and technical evidence and expert testimony, further diminishing the opportunity for everyday people to serve.”
Dzur offers a clear case for the disappearance of the jury trial:
[J]uries in the United States today hear a small fraction of cases. In 2005 the Bureau of Justice Statistics reported that juries heard 4 percent of all alleged criminal offenses brought before federal courts. State courts match this trend. Legal scholars Brian J. Ostrom, Shauna M. Strickland, and Paula L. Hannaford-Agor discovered a 15 percent decline in total criminal jury trials in state courts over the last 30 years, compared with a 10 percent decline in criminal bench trials, in which a judge issues the verdict. They also found a 44 percent decline in civil jury trials compared with a 21 percent decline in civil bench trials.
So what does the retreat of jury trials signify? For Dzur, the answer is that the jury system is an important part of our justice system because it performs a “constructive moral function,” by which he means that juries “force widespread sobriety about the real world of law and order.” Juries can challenge “official and lay attitudes regarding the law. This sobering quality of juries is particularly needed now.” Here is how Dzur characterizes more fully the “sobering quality of juries”:
A juror treats human beings attentively even while embedded within an institution that privileges rationalized procedures. Not advocates, prosecutors, or judges, jurors are independent of court processes and organizational norms while also being charged with judicial responsibility of the highest order. Their presence helps close the social distance between the parties and the court. The juror, who contributes to what is a political, juridical, and moral decision, becomes attuned to others in a way that triggers responsibility for them. Burns notes how jurors’ “intense encounter with the evidence” helps them engage in self-criticism of the “overgeneralized scripts” about crime and criminal offenders they may have brought with them into the courtroom.
In other words, juries are institutional spaces where citizens have the time to attentively consider fundamental moral and legal questions outside of the limelight and sequestered from public opinion, government pressure, and the media circus. Since juries are the institutions where we practice moral judgment, Dzur argues that the loss of juries means that “we are out of practice. Lay citizens no longer have opportunities to play decisive roles in our justice system.”
The recent jury decision in the George Zimmerman case is an example of a jury resisting popular calls for guilt and making a sober judgment that the facts of the case were simply not proven beyond a reasonable doubt. Juries can also resist the government, as might happen were Edward Snowden to return to the United States and stand trial before a jury. Such a jury could, and very well might, exonerate Snowden, exercising its fundamental right of jury nullification in the interest of justice. Snowden’s refusal to return is, in some part, a result of the diminished practice of moral judgment reflected in the diminishment of the jury.
Jury judgments are at times surprising and can, in extraordinary cases, go against the letter of the law. But the unpredictability of jury verdicts makes them neither irrational nor thoughtless. They are often intolerant and unfair, but this makes them neither racist nor unjust. Amidst the unquestioned hatred of all discrimination, we have forgotten that discrimination, the art of making relevant distinctions, is actually the root of judging. In our passion for rationality and fairness, we sacrifice judgment, and with judgment, we abandon our sense of justice.
What the acts of judgment exemplified by juries offer is an ideal of justice beyond the law. Plato called it the idea of the good. Kant named it the categorical imperative. Arendt thought that judgment appealed to common sense, “that sense which fits us into a community with others.” What all three understood is that if morality and a life lived together with others is to persist, we need judgments that would invoke and actualize that common moral sense, that would keep alive the sense of justice.
For your weekend read, take a look at Dzur’s report on the loss of the juries. Also, you might revisit my own essay on this theme, “Why We Must Judge,” originally published in Democracy: A Journal of Ideas.
Sensus communis as a foundation for men as political beings: Arendt’s reading of Kant’s Critique of Judgment
Annelies Degryse Katholieke Universiteit Leuven, Belgium
Philosophy & Social Criticism, 2011, 37(3): 345
Arendt's late reading of Kant proposes a connection between aesthetics and politics that, among other innovations, offers a new way to think about judgment through a connection between individual and group reflection. Annelies Degryse of Leuven University breaks down this conception of judgment into two constituent parts and connects it to Kant's "community sense."
Picking up on the argument by Ronald Beiner that Arendt "detranscendentalizes" Kant, Degryse describes how this move to a plurality of spectators can be understood as an "empiricalizing" of Kant. She helpfully highlights two moments of judgment in Arendt. First, a person perceives through imagination, a specific faculty that moves from a physical to a mental instance. Second, in reflection, one achieves a further distance from the original representation. Indeed, here Arendt speaks of the "proper distance, the remoteness or uninvolvedness or disinterestedness, that is requisite for approbation and disapprobation, for evaluating something at its proper worth" (Arendt, Lectures on Kant's Political Philosophy, 1992: 67). Judgment proper occurs in this second step, where one takes a stand on one's first impression in terms of a value assertion.
The first moment of judgment occurs within the mind of the individual. It does not even necessarily need to take the form of words but could occur entirely at the private level. In the second moment though, one needs recourse to language as an instrument of communication. Arendt says that Kant's reference to sensus communis should thus best be translated as "community sense" rather than "common sense." Degryse emphasizes the "common" here as the key to moving to judgment through language. It allows us to go beyond our own limited mode of thinking. In other words, language knows more than any individual person, and in framing a judgment one takes this greater knowledge into account. This is one way to understand what Arendt means by thinking with "an enlarged mentality." Degryse links the use of language in judgments to Arendt's "detranscendentalization" of Kant: "Arendt stresses, with Kant, that we can lose our faculty of enlarged thinking without communication and interaction with one another. (353)" Judgment for Kant is only a faculty of the mind but for Arendt it depends on actual interaction with others.
Degryse sees Arendt's Lectures on Kant's Political Philosophy as explicitly developing the role of spectators that was already implicit in The Human Condition. After all, speech and action need to be received by someone. Drawing on another aspect of Kant's terminology to make this connection, Arendt emphasizes that taste, not genius, constitutes the public realm. The genius can start something new, but in order to communicate it, this novelty must be described in terms that others can perceive. Interestingly, for Arendt, even the genius must himself have at least some access to taste to get his point across. Shifting to the political realm, Degryse notes that Arendt provides the example of the French Revolution: she sees its true impact in the many public responses to the event rather than the acts of the event itself. (One thinks here of the publications of Burke in England, Paine in the U.S., and Schiller and Hegel in Germany, among many others.)
As a contrast, Degryse says that the philosopher risks losing touch and supporting tyranny because, as per Plato's famous parable of the cave, he does not want to return to the realm of shadows and captivity with others after having ascended alone to the realm of truth. Spectators, always plural, can never lose touch in this way.
In Germany, the Romantics and Idealists worshiped the genius. Even today, taste is often considered a relic of subjectivism. Even though Arendt returns to Kant's aesthetics in a manner reminiscent of the great Idealists Fichte, Schelling, and Hegel, one key contribution of Degryse's article is that it shows how Arendt moves in the direction of plurality rather than the self-positing subject.
One of the great documents of American history is the Constitution of the Commonwealth of Massachusetts, drafted in 1779 by John Adams and ratified in 1780.
In Section Two of Chapter Five, Adams offers one of the most eloquent testaments to the political virtues of education. He writes:
Wisdom and knowledge, as well as virtue, diffused generally among the body of the people, being necessary for the preservation of their rights and liberties; and as these depend on spreading the opportunities and advantages of education in the various parts of the country, and among the different orders of the people, it shall be the duty of legislatures and magistrates, in all future periods of this commonwealth, to cherish the interests of literature and the sciences, and all seminaries of them; especially the university at Cambridge, public schools, and grammar-schools in the towns; to encourage private societies and public institutions, rewards and immunities, for the promotion of agriculture, arts, sciences, commerce, trades, manufactures, and a natural history of the country; to countenance and inculcate the principles of humanity and general benevolence, public and private charity, industry and frugality, honesty and punctuality in their dealings; sincerity, and good humor, and all social affections and generous sentiments, among the people.
Adams felt deeply the connection between virtue and republican government. Like Montesquieu, whose writings are the foundation on which Adams’ constitutionalism is built, Adams knew that a democratic republic could only survive amidst people of virtue. That is why his Constitution also held that the “happiness of a people and the good order and preservation of civil government essentially depend upon piety, religion, and morality.”
For Adams, piety and morality depend upon religion. The Constitution he wrote thus holds that a democratic government must promote the “public worship of God and the public instructions in piety, religion, and morality.” One of the great questions of our time is whether a democratic community can promote and nourish the virtue necessary for civil government in an irreligious age. Is it possible, in other words, to maintain a citizenry oriented to the common sense and common good of the nation absent the religious bonds and beliefs that have traditionally taught awe and respect for those higher goods beyond the interests of individuals?
Hannah Arendt saw the ferocity of this question with clear eyes. Totalitarianism was, for her, the proof of the political victory of nihilism, the devaluation of the highest values, the proof that we now live in a world in which anything is possible and where human beings could no longer claim to be meaningfully different from ants or bees. Absent the religious grounding for human dignity, and in the wake of the loss of the Kantian faith in the dignity of human reason, what was left, Arendt asked, upon which to build the world of common meaning that would elevate human groups from their bestial impulses to the human pursuit of good and glory?
The question of civic education is paramount today, and especially for those of us charged with educating our youth. We need to ask, as Lee Shulman recently has: “What are the essential elements of moral and civic character for Americans? How can higher education contribute to developing these qualities in sustained and effective ways?” In short, we need to insist that our institutions aim to live up to the task Adams claimed for them: “to countenance and inculcate the principles of humanity and general benevolence, public and private charity, industry and frugality, honesty and punctuality in their dealings; sincerity, and good humor, and all social affections and generous sentiments, among the people.”
Everywhere we look, higher education is being dismissed as overly costly and irrelevant. In many, many cases, this is wrong and irresponsible. There is a reason that applications continue to increase at the best colleges around the country, and it is not simply because these colleges guarantee economic success. What distinguishes the elite educational institutions in the U.S. is not their ability to prepare students for technical careers. On the contrary, the liberal arts tradition offers a useless education. But parents and students understand—explicitly or implicitly—that such useless education is powerfully useful. The great discoveries in physics come from useless basic research that then powers satellites and computers. New brands emerge from late-night reveries over the human psyche. And those who learn to conduct an orchestra or direct a play will, years on, have little difficulty managing a company. What students learn may be presently useless; but it builds the character and forms the intellect in ways that will have unintended and unimaginable consequences over lives and generations.
The theoretical justifications for the liberal arts are easy to mouth but difficult to put into practice. Especially today, defenses of higher education ignore the fact that colleges are not doing a great job of preparing students for democratic citizenship. Large lectures produce the mechanical digestion of information. Hyper-specialized seminars forget that our charge is to teach a liberal tradition. The fetishizing of research that no one reads exemplifies the rewarding of personal advancement at the expense of a common project. And, above all, the loss of any meaningful sense of a core curriculum reflects the abandonment of our responsibility to instruct students in making judgments about what is important. At faculties around the country, the desire to teach what one wants is seen as “liberal” and progressive, but it means in practice that students are advised that any knowledge is as good as any other knowledge.
To call for collective judgment about what students should learn is not to insist on a return to a Western canon. It is to say that if we as faculties cannot agree on what is important, then we abdicate our responsibility as educators: to lead students into a common world as independent and engaged citizens who can, and will, then act to remake and re-imagine that world.
John Adams was one of Hannah Arendt’s favorite thinkers precisely because he understood the deep connection between virtue and republicanism. Few documents are more worth revisiting today than the 1780 Constitution of the Commonwealth of Massachusetts. It is your weekend read.
When people talk about the cost of entitlements or pensions, there is often a whiff of condescension, as if government employees don’t deserve their benefits. Often forgotten is the fact that private pensions are underfunded as well, and they are insured by the federal government. And now we are told that the military may have the biggest pension problem of all. Here is what the Financial Times reports:
Of all the politically difficult budget issues that Mr Hagel will face, few are more charged than the question of military entitlements which have risen sharply over the past decade. A report last year by the Center for Strategic and Budgetary Assessments concluded that at current rates, “military personnel costs will consume the entire defence budget by 2039”. Robert Gates, Mr Obama’s first defence secretary, once warned that these expenses were “eating us alive”.
Just as pensions and entitlements will soon crowd out all other government spending, so too will military pensions crowd out all other military spending.
No one today can responsibly argue against pensions and health care. And no one can call the soldiers lazy burdens on the public weal. But neither can we fail to recognize that our addiction to entitlements is destroying our politics and our public spirit. We are sacrificing public action—be it the pursuit of scientific knowledge, the erecting of monuments, the education of our young, the building of infrastructure, or even a well-outfitted military—for the private comfort of individuals. It is no wonder that our political system is broken at a time when all incentives in the country lead interest groups to focus on parochial interests above the common good. It is inconceivable that this situation is not in some way related to the emergence of entitlements as the central function of government.
The question is one of principle. We have gone from a common sense that people are responsible for themselves and the government provides a safety net to a common sense that everyone should receive an education, everyone should receive healthcare, and everyone should receive pension benefits for as long as they live. It is possible to embrace the latter common sense, but with it comes a significantly higher tax burden and a much more communal ethic than has typically reigned in America. This is not a problem that hits only public employees. It is endemic throughout society. And our military.