There is a fascinating essay over on the Guernica blog, where David Bromwich examines “how Obama became a publicist for his presidency (rather than the president).” In his first term Obama delivered 1,852 separate speeches, comments, or scheduled public remarks and granted 591 interviews. These exceptional numbers, explains Bromwich, were the result of “magical thinking” on the part of the Obama White House: if the American public heard the president often enough, they would see how sincere and bipartisan he was and accept his policies. An endless string of speeches, road trips, and town hall meetings thus came to serve as a stand-in for the decision-making and confrontation that true leadership requires, and genuine conviction demands. Argues Bromwich: “…The truth is that Obama’s convictions were never strong. He did not find this out until his convictions were tested, and they were not tested until he became president.
“Perhaps the thin connection between Obama’s words and his actions does not support the use of the word “conviction” at all. Let us say instead that he mistook his preferences for convictions—and he can still be trusted to tell us what he would prefer to do. Review the record and it will show that his first statement on a given issue generally lays out what he would prefer. Later on, he resigns himself to supporting a lesser evil, which he tells us is temporary and necessary. The creation of a category of permanent prisoners in “this war we’re in” (which he declines to call “the war on terror”) was an early and characteristic instance. Such is Obama’s belief in the power and significance of his own words that, as he judges his own case, saying the right thing is a decent second-best to doing the right thing.”
Bromwich’s reflections call to mind two classic statements of what might be called the nihilism of the modern age—the psychological state in which all values are relative and none may rise from preference to conviction. The first is a fragment from Friedrich Nietzsche’s notebooks composed in 1881-1882. It reads:
…we call good someone who does his heart’s bidding, but also the one who only tends to his duty;
we call good the meek and the reconciled, but also the courageous, unbending, severe;
we call good someone who employs no force against himself, but also the heroes of self-overcoming;
we call good the utterly loyal friend of the true, but also the man of piety, one who transfigures things;
we call good those who are obedient to themselves, but also the pious;
we call good those who are noble and exalted, but also those who do not despise and condescend;
we call good those of joyful spirit, the peaceable, but also those desirous of battle and victory;
we call good those who always want to be first, but also those who do not want to take precedence over anyone in any respect.
As Nietzsche writes elsewhere, “The most extreme form of nihilism would be: that every belief, every holding-as-true, is necessarily false: because there is no true world at all.” To call the President a nihilist is nothing extreme; it is simply to say that he well represents the age in which he lives, an age that is extraordinarily uncomfortable with convictions of any kind. Some believe in God, but too strong a belief in God is unseemly, even fanatic. It is good to believe in democracy, but we recognize the need for stable tyrannies as well. The free market is the best system of economics, but only if it is not too free. We live in a pragmatic age and Obama is our pragmatic President. That is precisely what many like in him. And yet we also want him to lead. In other words, we want the strong leadership of a convinced leader and at the same time the pragmatic and technocratic malleability of someone with preferences absent convictions.
There is no better expression of this basic psychological state of modernity than William Butler Yeats’s poem “The Second Coming”:
Turning and turning in the widening gyre
The falcon cannot hear the falconer;
Things fall apart; the centre cannot hold;
Mere anarchy is loosed upon the world,
The blood-dimmed tide is loosed, and everywhere
The ceremony of innocence is drowned;
The best lack all conviction, while the worst
Are full of passionate intensity.
The problem with President Obama is not that he lacks convictions. It is that he doesn’t know that he lacks convictions. And despite what Bromwich writes, the President hasn’t learned this. He still believes he holds the strong conviction that Syria must not use chemical weapons in a civil war against its own people. He still believes it is intolerable to allow Russia to annex part of a sovereign country. He stands up and makes his strong convictions clear. But then he sits down and refuses to fight for those convictions, proving them to be mere preferences. The point is not that he should fight in Syria or in Ukraine. The point is that he should not speak loudly and issue ultimatums when he lacks the conviction to back them up.
-RB (hat tip Anna Hadfield)
Amidst charges of implanted memories and celebrity arrogance, I have no insight into what occurred between Mr. Allen and the younger Ms. Farrow. One side seems to think that our “rape culture” induces people to disbelieve victims. The other side believes that accusations in the court of public opinion open the door to character assassination. Both are right, which does little to satisfy those seeking a clear path to certainty and moral outrage.
The truth the avalanche of accusations on all sides has brought to light is that factual truth is always contingent and never certain. The drive for certainty leads quickly to ideological simplifications that deny inconvenient facts in the name of coherent narratives. Over and over the facts are being made to fit the theory; such ideological certainty at the expense of reality is the root of fascism and all totalitarian impulses.
This indeed is the point of a wonderful essay mercifully unrelated to the Allen affair and written by Simon Critchley in The New York Times. Critchley is ostensibly writing about the 1973 BBC series “The Ascent of Man,” hosted by Dr. Jacob Bronowski. Specifically, Critchley focuses on one episode titled “Knowledge or Certainty.” Touching on physics and Werner Heisenberg’s uncertainty principle, the show begins with the premise that while science aims to provide an objectively accurate picture of the world, modern science has proven that such objectivity is impossible. In Critchley’s summation, there is “no absolute knowledge and anyone who claims it—whether a scientist, a politician or a religious believer—opens the door to tragedy. All scientific information is imperfect and we have to treat it with humility.”
The lesson of modern science is that “There is no God’s eye view.” Anyone who claims to know the truth about the material world (and even more so about the moral or spiritual worlds) is “not just wrong, they are morally pernicious.” The point is that “[e]rrors are inextricably bound up with pursuit of human knowledge, which requires not just mathematical calculation but insight, interpretation and a personal act of judgment for which we are responsible.” And the result of this is that,
For Dr. Bronowski, the moral consequence of knowledge is that we must never judge others on the basis of some absolute, God-like conception of certainty. All knowledge, all information that passes between human beings, can be exchanged only within what we might call “a play of tolerance,” whether in science, literature, politics or religion. As he eloquently put it, “Human knowledge is personal and responsible, an unending adventure at the edge of uncertainty.”
At this point in his essay Critchley inserts a video clip of the end of the episode “Knowledge or Certainty,” a clip that suddenly shifts the scene “to Auschwitz, where many members of Bronowski’s family were murdered.” We see Dr. Bronowski walking in Auschwitz. He says:
There are two parts to the human dilemma. One is the belief that the end justifies the means. That push button philosophy, that deliberate deafness to suffering has become the monster in the war machine. The other is the betrayal of the human spirit. The assertion of dogma closes the mind and turns a nation, a civilization into a regiment of ghosts. Obedient ghosts, or tortured ghosts. It’s said that science will dehumanize people and turn them into numbers. That’s false, tragically false. Look for yourself. This is the concentration camp and crematorium at Auschwitz. This is where people were turned into numbers. Into this pond were flushed the ashes of some 4 million people. And that was not done by gas. It was done by arrogance. It was done by dogma. It was done by ignorance. When people believe that they have absolute knowledge with no test in reality, this is how men behave. This is what men do when they aspire to the knowledge of gods. Science is a very human form of knowledge. We are always at the brink of the known. We always feel forward for what is to be hoped. Every judgment in science stands on the edge of error and is personal. Science is a tribute to what we can know although we are fallible. In the end the words were said by Oliver Cromwell, ‘I beseech you in the bowels of Christ, think it possible that you may be mistaken.’
In other words, fascism, fundamentalism, and the holocaust are caused by a poor understanding of science, a confusion of necessarily hypothetical scientific knowledge with a push-button philosophy of certainty.
Even as Critchley and Bronowski beseech us to think we may be mistaken, I detect no suggestion that we should harbor doubts about the fact or the evil of the holocaust. Born in arrogance and certainty, the evil of the holocaust stands as an exception, something we all understand and know to be horrifically wrong. It may be true that “[w]e always have to acknowledge that we might be mistaken. When we forget that, then we forget ourselves and the worst can happen.” At the same time, there are times when to equivocate in our judgment is to refuse to do justice and even to be complicit in the justifications of wrong.
It may be true that science counsels humility and tolerance, but justice in the end requires making judgments. One of the great challenges of our time is the need to judge absent the solace of absolute knowledge or the illusions of certitude. We must at once admit to the uncertainty of the scientific age and insist that certain truths and certain judgments are beyond meaningful dispute. There are facts—that the holocaust happened and that it was a tragedy—that simply must not be denied. In other words, it matters that the Nazis did not win.
It is, of course, possible that had the Nazis won, our view of antisemitism today would be horrifically different. There is no cosmological truth to Nazi evil; but we humans don’t live disembodied in the cosmos. We live here, on this earth. And on this earth in this world that we have made, certain facts like the evil of the Nazi Final Solution are true. This does not mean that they are written in the stars or revealed by a God or carried by the wind. It does mean that such facts are the foundation of the common world in which we live as humans amidst plurality. In Hannah Arendt’s poetic language, these basic truths are told in stories and they are the basic building blocks of the world we share. This common world is what Arendt calls the “truth… we cannot change; metaphorically, it is the ground on which we stand and the sky that stretches above us.” In the common world, certain facts matter; the victory of common truths is not trivial if we are to live together in a shared world.
That it matters who wins, which facts we embrace, and which stories we tell to our children brings to mind an interview given by Woody Allen and discussed in a thoughtful essay by Damon Linker. Discussing Allen’s film Crimes and Misdemeanors, Allen explicitly defends the main character who kills his lover when she threatens to expose the affair. As Linker elaborates on Allen’s point: “The viewer [of Crimes and Misdemeanors] is left to conclude that Judah got away with his crime scot-free—and that such an outcome is possible for anyone courageous enough to violate accepted moral customs and lucky or clever enough to avoid getting caught by the legal authorities.” Linker then cites this quote from Allen’s interview:
On a lesser level you see it in sports. They create a world of football, for example. You get lost in that world and you care about meaningless things.... People by the thousands watch it, thinking it's very important who wins. But, in fact, if you step back for a second, it's utterly unimportant who wins. It means nothing. In the same way we create for ourselves a world that, in fact, means nothing at all, when you step back. It's meaningless.
The meaninglessness of the world, Allen suggests, means that it doesn’t matter who wins. But that is sports, right? On the cosmic level and as a question of justice, it doesn’t matter if the Seahawks or the Broncos win the Super Bowl. It matters for the players and fans and corporate sponsors, but not in the grand scheme of things.
The problem is that Allen sees the triviality of sports as a metaphor for the meaninglessness of the human world. Here is Allen speaking in another interview in Commonweal Magazine, also cited by Linker:
Human existence is a brutal experience to me…it’s a brutal, meaningless experience—an agonizing, meaningless experience with some oases, delight, some charm and peace, but these are just small oases. Overall, it is a brutal, brutal, terrible experience, and so it’s what can you do to alleviate the agony of the human condition, the human predicament? That is what interests me the most. I continue to make the films because the problem obsesses me all the time and it’s consistently on my mind and I’m consistently trying to alleviate the problem, and I think by making films as frequently as I do I get a chance to vent the problems.
It is worth asking how sincere we should take Allen to be here. If he really thought the world were meaningless, why would he write about its meaninglessness? Is it simply that he writes for the relief of unbearable urges? Couldn’t he then write about pretty much anything? It does seem that Allen writes not simply to relieve himself but also because he has something to say. That he thinks what he writes matters.
What matters, Allen’s films suggest, is truth. Here is what Allen says later in the same Commonweal interview when asked about the apparent amorality of Crimes and Misdemeanors.
I feel that is true—that one can commit a crime, do unspeakable things, and get away with it. There are people who commit all sorts of crimes and get away with it, and some of them are plagued with all sorts of guilt for the rest of their lives and others aren’t. They commit terrible crimes and they have wonderful lives, wonderful, happy lives, with families and children, and they have done unspeakably terrible things. There is no justice, there is no rational structure to it. That is just the way it is, and each person figures out some way to cope…. Some people cope better than others.
For Allen, his film is about truth, namely the truth that the world has no meaning and that evil can prevail and often does.
Linker labels Allen a nihilist, by which he means someone convinced that “There is no justice.” And that seems right insofar as Allen does reject any moral or spiritual meaning.
What is missing in Linker’s analysis is an appreciation of the moral significance of nihilism. Part of the problem is that nihilism signifies two related but different ideas. First, nihilism is a rejection of all certainty; nihilism comes from nihil, meaning nothing. It is the philosophy of nothing. Second, nihilism as it is associated with thinkers like Friedrich Nietzsche also takes on the sense of re-valuation of all values. It is thus not merely a negative philosophy but also a call for the creation of new and immoral values. In Linker’s essay, nihilism is understood in the first and strict sense to be a negation of what is. In this sense, however, nihilism is inherent in all critical thinking and in thinking itself.
All thinking must negate what is to free itself for the new. Negation, or nihilism, “is but the other side of conventionalism.” That is why Hannah Arendt saw that “what we commonly call ‘nihilism’ is actually a danger in the thinking activity itself.” What nihilism agitates against is certainty, the “desire to find results that would make further thinking unnecessary.” Thinking, Arendt argues, is “equally dangerous to all creeds and, by itself, does not bring forth any new creed.” It is opposed to common sense and all ideologies. Because thinking sets up obstacles to truths and opposes settled certainties, thinking is dangerous: “There are no dangerous thoughts,” Arendt writes, “Thinking itself is dangerous.”
In a scientific age that, as Critchley reminds us, is allergic to certainty, nihilism understood as a rejection of dogmas and certainties is not an immoral doctrine so much as it is the truthful insistence that we oppose what Critchley calls the “monstrous certainty that is endemic to fascism and, sadly, not just fascism but all the various faces of fundamentalism.” Critchley’s essay touches on the human need for tolerance of meaningful plurality and difference. It is your weekend read.
One of the great documents of American history is the Constitution of the Commonwealth of Massachusetts, written in 1779 by John Adams.
In Section Two of Chapter Six, Adams offers one of the most eloquent testaments to the political virtues of education. He writes:
Wisdom and knowledge, as well as virtue, diffused generally among the body of the people, being necessary for the preservation of their rights and liberties; and as these depend on spreading the opportunities and advantages of education in the various parts of the country, and among the different orders of the people, it shall be the duty of legislatures and magistrates, in all future periods of this commonwealth, to cherish the interests of literature and the sciences, and all seminaries of them; especially the university at Cambridge, public schools, and grammar-schools in the towns; to encourage private societies and public institutions, rewards and immunities, for the promotion of agriculture, arts, sciences, commerce, trades, manufactures, and a natural history of the country; to countenance and inculcate the principles of humanity and general benevolence, public and private charity, industry and frugality, honesty and punctuality in their dealings; sincerity, and good humor, and all social affections and generous sentiments, among the people.
Adams felt deeply the connection between virtue and republican government. Like Montesquieu, whose writings are the foundation on which Adams’ constitutionalism is built, Adams knew that a democratic republic could only survive amidst people of virtue. That is why his Constitution also held that the “happiness of a people and the good order and preservation of civil government essentially depend upon piety, religion, and morality.”
For Adams, piety and morality depend upon religion. The Constitution he wrote thus holds that a democratic government must promote the “public worship of God and the public instructions in piety, religion, and morality.” One of the great questions of our time is whether a democratic community can promote and nourish the virtue necessary for civil government in an irreligious age. Is it possible, in other words, to maintain a citizenry oriented to the common sense and common good of the nation absent the religious bonds and beliefs that have traditionally taught awe and respect for those higher goods beyond the interests of individuals?
Hannah Arendt saw the ferocity of this question with clear eyes. Totalitarianism was, for her, the proof of the political victory of nihilism, the devaluation of the highest values, the proof that we now live in a world in which anything is possible and in which human beings can no longer claim to be meaningfully different from ants or bees. Absent the religious grounding for human dignity, and in the wake of the loss of the Kantian faith in the dignity of human reason, what was left, Arendt asked, upon which to build the world of common meaning that would elevate human groups from their bestial impulses to the human pursuit of good and glory?
The question of civic education is paramount today, and especially for those of us charged with educating our youth. We need to ask, as Lee Schulman recently has: “What are the essential elements of moral and civic character for Americans? How can higher education contribute to developing these qualities in sustained and effective ways?” In short, we need to insist that our institutions aim to live up to the task Adams claimed for them: “to countenance and inculcate the principles of humanity and general benevolence, public and private charity, industry and frugality, honesty and punctuality in their dealings; sincerity, and good humor, and all social affections and generous sentiments, among the people.”
Everywhere we look, higher education is being dismissed as overly costly and irrelevant. In many, many cases, this is wrong and irresponsible. There is a reason that applications continue to increase at the best colleges around the country, and it is not simply because these colleges guarantee economic success. What distinguishes the elite educational institutions in the U.S. is not their ability to prepare students for technical careers. On the contrary, a liberal arts tradition offers useless education. But parents and students understand—explicitly or implicitly—that such useless education is powerfully useful. The great discoveries in physics come from useless basic research that then power satellites and computers. New brands emerge from late night reveries over the human psyche. And those who learn to conduct an orchestra or direct a play will years on have little difficulty managing a company. What students learn may be presently useless; but it builds the character and forms the intellect in ways that will have unintended and unimaginable consequences over lives and generations.
The theoretical justifications for the liberal arts are easy to mouth but difficult to put into practice. Especially today, defenses of higher education ignore the fact that colleges are not doing a great job of preparing students for democratic citizenship. Large lectures produce the mechanical digestion of information. Hyper-specialized seminars forget that our charge is to teach a liberal tradition. The fetishizing of research that no one reads exemplifies the rewarding of personal advancement at the expense of a common project. And, above all, the loss of any meaningful sense of a core curriculum reflects the abandonment of our responsibility to instruct students about making judgments about what is important. At faculties around the country, the desire to teach what one wants is seen as “liberal” and progressive, but in practice it means that students are advised that any knowledge is as good as any other.
To call for collective judgment about what students should learn is not to insist on a return to a Western canon. It is to say that if we as faculties cannot agree on what is important, then we abdicate our responsibility as educators: to lead students into a common world as independent and engaged citizens who can, and will, then act to remake and re-imagine that world.
John Adams was one of Hannah Arendt’s favorite thinkers, largely because he understood the deep connection between virtue and republicanism. Few documents are more worth revisiting today than the 1780 Constitution of the Commonwealth of Massachusetts. It is your weekend read.
Freeman Dyson, the eclectic physicist, took good aim at philosophy last week in a review of the silly book by Jim Holt, Why Does the World Exist?: An Existential Detective Story. Holt went around to "a portrait gallery of leading modern philosophers" and asked them the Leibnizian question: "Why is there something rather than nothing?" The book offers their answers, along with biographical descriptions.
For Dyson, Holt's book "compels us to ask" these "ugly questions." First, "When and why did philosophy lose its bite?" Philosophers were once important. In China, Confucius and his followers made a civilization. So too in Greece did Socrates and then the schools of Plato and Aristotle give birth to the western world. In the Christian era Jesus and Paul, then Augustine and Aquinas, granted depth to dominant worldviews. Philosophers like Descartes, Hobbes, and Leibniz were central figures in the scientific revolution, and philosophical minds like Nietzsche, Heidegger, and Arendt (even if one was a philologist and the other two refused the name philosopher) have become central figures in the experience of nihilism. Against these towering figures, the "leading philosophers" in Holt's book cut a paltry figure. Here is Dyson:
Holt's philosophers belong to the twentieth and twenty-first centuries. Compared with the giants of the past, they are a sorry bunch of dwarfs. They are thinking deep thoughts and giving scholarly lectures to academic audiences, but hardly anybody in the world outside is listening. They are historically insignificant. At some time toward the end of the nineteenth century, philosophers faded from public life. Like the snark in Lewis Carroll's poem, they suddenly and silently vanished. So far as the general public was concerned, philosophers became invisible.
There are many reasons for the death of philosophy, some of which were behind Hannah Arendt's refusal to call herself a philosopher. Philosophy was born, at least in its Platonic variety, from out of the thinker's reaction to the death of Socrates. Confronted with the polis that put the thinker to death, Plato and Aristotle responded by retreating from the world into the world of ideas. Philosophical truth separated itself from worldly truths, and idealism was born. Realism was less a return to the world than a reactive fantasy to idealism. In both, the truths that were sought were otherworldly truths, disconnected to the world.
Christianity furthered the divorce of philosophy from the world by imagining two distinct realms, the higher realm existing beyond the world. Science, too, taught that truth could only be found in a world of abstract reason, divorced from real things. Christianity and science together gave substance to the philosophical rebellion against the world. The result, as Dyson rightly notes, is that philosophy today is as abstract, otherworldly, and irrelevant as it is profound.
What Dyson doesn't explore is why philosophers of the past had such importance, even as they also thought about worlds of ideas. The answer cannot be that ideas had more import in the past than now. On the contrary, we live in an age more saturated in ideas than any other. More people today are college educated, literate, and knowledgeable about philosophy than at any period in the history of the world. Books like Holt's are proof positive of the profitable industry of philosophical trinkets. That is the paradox—at a time when philosophy is read by more people than ever, it has never been less influential.
One explanation for this paradox is nihilism—the devaluing or re-valuing of the highest values. The truth about truth turned out to be neither so simple nor singular as the philosophers had hoped. An attentive inquiry into the true and the good led not to certainty, but to ideology critique. For Nietzsche, truth, like the Christian God, was a human creation, and the first truth of our age is that we recognized it as such. That is the precondition for the death of God and the death of truth. Nihilism has not expunged ideas from our world, but multiplied them. When speaking about the "true" or the "good" or the "just," Christians, Platonists, and moralists no longer have the stage to themselves. They must now shout to be heard amongst the public relations managers, advertisers, immoralists, epicureans, anarchists, and born again Christians.
Dyson ignores this strain of philosophy. He does point out that Nietzsche was the last great philosopher, but then dismisses Heidegger, who "lost his credibility in 1933," and even Wittgenstein, who would remain silent until any woman attending his lectures left. And yet it is Heidegger who has given us the great literary masterpieces of twentieth-century philosophy.
His work on technology ("The Question Concerning Technology") and art ("The Origin of the Work of Art") has been widely read in artistic, literary, and lay circles. It is hard to imagine a philosopher more engaged with science and literature than Heidegger was. He read physics widely, co-taught courses at the house of the Swiss psychiatrist Medard Boss, and also taught seminars with the German novelist Ernst Jünger.
It seems worthwhile to end with a poem of Heidegger's from his little book, Aus der Erfahrung des Denkens/From Out of the Experience of Thinking:
Drei Gefahren drohen dem Denken
Die gute und darum heilsame Gefahr ist die Nachbarschaft des singenden Dichters.
Die böse und darum schärfste Gefahr ist das Denken selber. Es muß gegen sich selbst denken, was es nur selten vermag.
Die schlechte und darum wirre Gefahr ist das Philosophieren.
Three dangers threaten thinking.
The good and thus healthy danger is the nearness of the singing poet.
The evil and thus sharpest danger is thinking itself. It must think against itself, something it can do only rarely.
The bad and thus confusing danger is philosophizing.
One of my favorite images in Arendt's writings comes not from Arendt herself, but from her citation of the poem "Magic" by Rainer Maria Rilke. Rilke's poem reads (in an approximate translation):
From indescribable transformation originate
Amazing shapes. Feel! Trust!
We suffer often: To ashes turn our flames;
Yet art can set on fire the dust.
Magic is here. In the realm of enchantment
The ordinary word appears elevated
But sounds as real as if the dove called
To seek its invisible mate.
Arendt cites Rilke's poem in the final section of the chapter on work in The Human Condition. It is part of her discussion of art and her claim that "the immediate source of the art work is the human capacity for thought."
Art, Arendt writes, has its foundation in thinking. Works of art, she writes, are "thought things." They are thingifications of thoughts, or, to use a word that is so often abused, reifications of thoughts—the making of thoughts into things. It is this process of transformation and transfiguration that Rilke captures in "Magic": to "set on fire the dust" and bring beauty and truth to the real world. That is what art does.
My mind turned to Rilke's poem as I watched the great South African artist William Kentridge deliver the first of his 2012 Norton Lectures at Harvard University.
Kentridge spoke in praise of shadows, and situated his talk within a reading of Plato's allegory of the cave in Book VII of the Republic. The story of the cave begins with prisoners, shackled and immovable, who see shadows projected on a wall by a fire. One prisoner sets himself free, climbs out into the light of the sun, and slowly, painfully comes to recognize that the shadows were indeed shadows, untrue. The parable illustrates the deceptiveness of sensible things and is one part of Plato's illustration of his theory of ideas. The ideas, supersensible truths of reason and logic, do not deceive and change like the shadowy things of the world. Only what lasts eternally is true; all that is sensible and fleeting is false.
Kentridge tells the story of Plato's cave to explain why he sees art, and especially his art, in opposition to the Platonic idea of truth. If Plato celebrates the primacy of the eternally true over the shadows, Kentridge argues that art elevates the image above the truth. For this reason, at least in part, Kentridge's art works with shadows. Shadow figures and shadow puppets.
Kentridge lauds shadows. It is in the very limitations of shadows, in the gaps that inspire us to leap to complete an image, that we think and learn. The leanness of the illusion pushes us to complete the recognition. It is in shadows that we find our agency in apprehending the world.
Shadow art is, for Kentridge, political. Plato's politics depends on a truth known and understood by the few and then imposed on the many. In this sense philosophy is, in Arendt's words, opposed to politics, and the philosopher must either seek merely to be left alone by the people (which is difficult, because philosophers are dangerous) or seek to dominate and tyrannize the polity with his reason. Arendt's lifelong battle is to free politics from the certainty of rational and philosophical truth, to open us to a politics of opinion and openness.
Knowledge is power, and there is, in Kentridge's words, a relation between knowledge and violence. Kentridge embraces shadows and silhouettes to oppose the philosophical and Platonic tyranny of reason. He writes elsewhere:
I am interested in a political art, that is to say an art of ambiguity, contradiction, uncompleted gestures and uncertain ending - an art (and a politics) in which optimism is kept in check, and nihilism at bay.
Optimism must be kept in check since any certainty about the destination can underwrite the need for violence to bring others to that end. For Kentridge, "There is no destination. all destinations, all bright lights, arouse our mistrust."
Kentridge offers us an image of the artist. He speaks from the studio and from his notebook to emphasize the source of artistic truth in the thought image rather than the logical word. An artist thinks. He sees. He makes art. He makes things that reflect not truth and certainty but gaps, misgivings, and questions. Kentridge gives reality to the questionability of the world in his shadow art. In this way his art reminds us of the magic of Rilke's fire that transfigures dust into flame.
Few modern artists work magic like William Kentridge. His Norton Lectures are a great introduction to his art and the thinking behind his art. If you are not graduating this weekend, take the time to hear and look at what Kentridge says and makes.
You can view Kentridge's First Norton Lecture here. Consider it your visual weekend read.
David Brooks is giving advice to radicals today on how to be radical. It's a strange spot for the left's favorite conservative to be in, although it is a role he has now taken up in a few of his columns. And he's only partly wrong.
The occasion for Brooks' advice to radicals is the latest viral YouTube sensation, "Why I Hate Religion, but Love Jesus," a video by Jefferson Bethke. The genre is not new. Mr. Bethke's namesake, Thomas Jefferson, wrote a famous and wonderful little book, The Life and Morals of Jesus of Nazareth, in which he separates out the purely ethical kernel of Jesus' teachings from the religious chaff. Alexis de Tocqueville saw that religion in a democratic time would increasingly take the form of moral aphorisms without the strict commandments that democratic citizens would rebel against. And modern evangelicalism as practiced in megachurches like Rick Warren's Saddleback Church shuns discussion of sin and burdensome religious rituals or commands. The attraction of Jefferson Bethke's video poem is precisely how well it fits the anti-authoritarian spirit of our age, which craves meaning and justice yet disdains the authority and tradition that give life meaning and embody the ideals of justice.
Brooks' column is less about Bethke's rebellion than about his re-conversion. For Bethke has, since his video, publicly admitted the error of his ways and returned to the bosom of the church. All of which leads Brooks to wonder: Why is rebellion so ineffective today?
This is a question many rebels ask today as well. One common complaint, on the left, is that the rise of humanitarianism has replaced political movements like Marxism, utopianism, and social democracy with the idea that it is simply enough to send food and clothes to those who are suffering. Today's radicals don't want to change the system; they simply perpetuate it by preventing the extreme suffering that might nurture real radicals. The dreamers of a better future seem to be busy with other pursuits. As Brooks rightly notes:
This seems to be a moment when many people — in religion, economics and politics — are disgusted by current institutions, but then they are vague about what sorts of institutions should replace them. This seems to be a moment of fervent protest movements that are ultimately vague and ineffectual.
Brooks has a theory for why this is so, one he has offered before.
My own theory revolves around a single bad idea. For generations people have been told: Think for yourself; come up with your own independent worldview. Unless your name is Nietzsche, that’s probably a bad idea. Very few people have the genius or time to come up with a comprehensive and rigorous worldview.
Aside from the silliness about Nietzsche—who certainly thought with and against a tradition—Brooks makes a valid point. Rebellion cannot be simply a negation of what is, an overturning.
Indeed, that was Nietzsche's point. Nihilism, as Nietzsche diagnosed it, is the saying of no to what is, an overturning of all values, and thus a rejection of any value that is more than merely subjective. This nihilism is precisely what Nietzsche saw and feared, which is why he struggled to think about how new higher values, new idols, new laws, and new myths might emerge.
Brooks wants young rebels to seek out the classics and find their mentors in rebellion. He is right that "Effective rebellion isn’t just expressing your personal feelings. It means replacing one set of authorities and institutions with a better set of authorities and institutions." He is also right that "Authorities and institutions don’t repress the passions of the heart, the way some young people now suppose. They give them focus and a means to turn passion into change."
But the point isn't simply to rediscover Marx so that we can, once again, fight for the proletarian revolution. That movie has run. The point of education is not to provide us with failed dreams of the past so that we can try again. As Arendt wrote in The Crisis in Education, the effort of education is to teach students about the world as it is. Only when they confront the world as it is, honestly, can they begin to resist it.
Instead of blaming our children, we need to look at ourselves. For Arendt, education requires teachers who "love our children enough not to expel them from our world and leave them to their own devices." We must teach them about our world as it is. And that means we must teach them what it means to live in a world in which there are no higher values that can sustain meaningful protests and rebellions. This is the world of nihilism Nietzsche saw emerging 130 years ago. And it is the world David Brooks is, in some ways, trying to articulate, even as he also refuses and condemns it in his columns.
I don't disagree with Brooks' judgment of nihilism. But it is time for us, finally, to confront the reality of the world we live in. It is our world. If our leaders and public intellectuals won't be honest, how can we expect such courage from our youth?
We need to reconcile ourselves to the corrupting and debilitating nihilism of our world if we have any hope of educating young people who might be able to resist it. Arendt writes:
Education is the point at which we decide whether we love the world enough to assume responsibility for it and by the same token save it from that ruin which, except for renewal, except for the coming of the new and young, would be inevitable.
It is the job of education to "teach children what the world is like" so that they can begin the task of reconciling themselves with it; only then can they have a chance of truly resisting it. First come to see the ruin. Then learn to rebel against it. That is the promise of revolution: a circular process that offers a future that mere rebellion does not.
You'd do well to read Brooks' column.
Better yet, for your weekend read, pull out your edition of Between Past and Future and re-read The Crisis in Education.