"There is no lasting happiness outside the prescribed cycle of painful exhaustion and pleasurable regeneration, and whatever throws this cycle out of balance – poverty and misery where exhaustion is followed by wretchedness instead of regeneration, or great riches and an entirely effortless life where boredom takes the place of exhaustion and where the mills of necessity, of consumption and digestion, grind an impotent human body mercilessly and barrenly to death – ruins the elemental happiness that comes from being alive."
-Hannah Arendt, The Human Condition
A great deal has been written about Hannah Arendt’s philosophical and political thinking, but as the academic year draws to a close, it is important to remember that she urges her readers to think about and appreciate all aspects of human existence, including the life of the body. The passage quoted above comes from the Labor chapter of The Human Condition, in which Arendt traces the worrisome trend in the modern world whereby human activity is more and more dominated by a concern for the cyclical process of production and consumption. It is safe to say that ours is the kind of “waste economy” she speaks of, in which all objects are consumed and used up rather than used and re-used over time. Even highly technologically advanced devices such as our mobile phones are manufactured and treated as more or less disposable, made to last for a few years before they become obsolete and need to be replaced. The threat that a laboring and consuming society poses to a stable and durable human world has potentially disastrous consequences not only for political life, but also more generally for our ability to feel at home in our condition as earthly beings. In light of Arendt’s critique of labor as a human activity, it is remarkable that she pauses to acknowledge that this essentially worldless cycle of production and consumption with the aim of merely preserving our biological existence is the only activity that holds the key to “lasting” and “elemental” happiness in our lives.
The need to labor is “prescribed” by our condition as living beings most obviously in the case of needing to eat. In one way or another, all of us must continually expend energy in order to have food on the table. Happiness is found in this cycle of exhaustion and regeneration when each side balances the other, when pain and pleasure each contribute to feeling fully alive.
For most Americans this cycle is somewhat indirect, since only a minority of people work on farms or grow their own food. As the expenditure of energy through labor is abstracted (usually through the medium of money) from the regenerative act of consumption, it becomes more difficult to find happiness in the endless cycle of necessity. Furthermore, Arendt points out that the balance of exhaustion and regeneration can only be found in a middle-class life that is harder to come by today given the ever-widening gap in income distribution. As the rich get richer and the poor get poorer, life itself becomes a burden for both extremes, a source of misery on one hand and a sign of impotence on the other, rather than a source of sustaining fulfillment.
How might we seek to reclaim this balance?
While many students and teachers (myself included) may be feeling the need for pleasurable regeneration in the form of a vacation after a long season of schoolwork, Arendt is clear that “intellectual labor” shares few characteristics with the manual labor that maintains our biological existence. There is, however, a pervasive notion that summer vacation from school was designed not to give students a break from thinking, but rather out of the necessity for young people to work on their families’ farms. Summer vacation is often thought of as a remnant of America’s agrarian past. Although this interpretation of summer vacation is in fact historically erroneous, its persistence in the American mind suggests a collective nostalgia for a time when work, labor, and leisure were balanced in our lives.
Many educators and politicians today are questioning the wisdom of taking two or more consecutive months off from school, citing the educational demands that the 21st century economy places on individuals trying to earn a living. Summer vacation has been shown to negatively impact those students who are most in need of academic support since they are the least likely to have the privilege of enriching summer experiences at home or in summer programs. Many charter schools have turned to extended school days and extended school years to improve test scores of historically failing (usually urban) populations. It would be wrong to oppose eliminating summer vacation on the grounds that it takes away regenerative time for students, because summer is only regenerative for a privileged segment of the population. But perhaps a case can be made for the present relevance of the historical misconception that summer vacation is a time for young people to learn by laboring for food.
Although the local food movement has largely been the preoccupation of the upper-middle class, it has the potential to change how people in communities across the country participate in cycles of production and consumption. Community-based agricultural opportunities are popping up in urban and rural areas, many of which seek to involve as many young people as possible through schools and other community organizations. These farming programs have the potential to teach young people that happiness comes through painful laboring while reaping the direct benefits for oneself and one’s own community. These kinds of work opportunities could begin to shift the imbalance of human activity in our society and reclaim a more direct and fulfilling form of laboring than that of the mere “jobholder.”
Insofar as education aspires to be more than training in how to make a living in the modern economy – a task made nearly impossible given the rapid technological and societal changes that make it very difficult for teachers to predict what the world may be like when their students are adults – it can open opportunities for young people to reflect on and make meaning of the various aspects of human living on earth. Schools must stand apart from the economic life process long enough to foster a free appreciation for, rather than enslavement to, the cycles of being alive. Participating in the growing of one’s own food during the summer months – whether at home, in a community garden, or on an urban farm – is a good way to learn gratitude for the bodily pain and pleasure that define the life that we have been given.
Science fiction, Hannah Arendt tells us, has too long been undervalued by those who would seek to comprehend the human condition. It is in the human fantasies of our future that mankind reveals its desires, both possible and not yet possible. For Arendt, some of those deepest and longest-held desires included the desire to flee the earth, to play God and make human beings, and to make labor unnecessary. Her book, The Human Condition, is in part an effort to think through the fact that many of these human desires were, for the first time in millennia, threatening to become possible.
It is a mistake to ignore science fiction, especially in an era when the unprecedented advance of technological ability makes it possible that today’s dreams will soon be realized. With that in mind, it is worth looking at Alex Mar’s profile of the life, death, and cryogenic preservation of FM-2030, otherwise known as Fereidoun M. Esfandiary.
Writing in The Believer, Mar introduces FM-2030, one of the founders of the transhumanism movement. FM-2030 has a single defining dream for the future of man: that we overcome our given, earthly, biological limits. If man, as Arendt writes, is both someone who lives in a given and fated world and someone who can change and re-make that world, then transhumanists like FM-2030 imagine a time in the near future in which all biological, temporal, and physical limits will be overcome. Including death.
The ultimate goal for transhumanists has never been merely to improve mankind, but to defeat our greatest opponent: death. Of course, not all champions of Progress make the titanic leap to Immortality—the jump is so vast, so wildly immodest and presumptuous as to cross over into the realm of the kind of uncomfortably eccentric. But as FM would put it, “No one today can be too optimistic.” Transhumanists, in their crusade against time, have begun to buy themselves some of it, at the cost of a pricey life-insurance policy. With some cryoprotectants and a lot of liquid nitrogen, humanity—or at least the one-thousand-ish people affiliated with Alcor, currently the largest cryonics group in the country—has been gifted with the semi-scientific semi-possibility of radically extended life. Die a clinical “death,” go to sleep, wake up eons later, when existence is a whole new ball game. So when will immortality come?
To understand the human condition, one must also know our most human dreams. Today, technological optimism is at the center of those dreams. Fereidoun M. Esfandiary was for many the first great transhumanist of the late 20th century, the precursor to Ray Kurzweil, who also dreams of his own immortality. The story of Esfandiary’s untimely death, and of the efforts to preserve him, reveals much about the movement he helped to found.
Read the article here.
Read related essays on the human dream of a non-human future here.
You can also purchase the inaugural issue of HA, the Hannah Arendt Center Journal, which features a selection of articles by Nicholson Baker, Babette Babich, Rob Riemen, Marianne Constable, and Roger Berkowitz from our 2010 conference, “Human Being in an Inhuman Age.”
No government exclusively based on the means of violence has ever existed. Even the totalitarian ruler, whose chief instrument of rule is torture, needs a power basis—the secret police and its net of informers. Only the development of robot soldiers, which, as previously mentioned, would eliminate the human factor completely and, conceivably, permit one man with a push button to destroy whomever he pleased, could change this fundamental ascendancy of power over violence.
—Hannah Arendt, “On Violence.”
Hannah Arendt wrote these lines in the midst of the United States’ defeat in Vietnam. Her argument was that as long as robot soldiers were a thing of the future, brute violence and force like that unleashed by the United States would always succumb to collective power, of the kind exhibited by the Vietcong. Hers was, at least in part, a hopeful voice, praising the impotence of violence in the face of power.
To read Arendt’s lines today, amidst the rise of drone warfare, alters the valence of her remarks. Drones are increasingly prototypes and even embodiments of the “robot soldiers” that Arendt worried would dehumanize war and elevate violence over power. If we draw out the consequences from Arendt’s logic, then drone soldiers might displace the traditional limits that politics places on violence; drones, in other words, make possible unprecedented levels of unlimited violence.
The rise of drones matters, Arendt suggests, in ways that are not currently being seen. Her worry has little to do with assassination, the concern of most opponents of drones today. Nor is she specifically concerned with surveillance. Instead, against those, like General Stanley McChrystal, who argue that drones are simply new tools in an old activity of war, Arendt’s warning is that drones and robot soldiers may change the very dynamic of war and politics.
To see how drones change the calculus of violence in politics, we need to understand Arendt’s thesis about the traditional political superiority of power over violence. The priority of power over violence is based on the idea that power is “inherent in the very existence of political communities.” Power, Arendt writes, “corresponds to the human ability not just to act, but to act in concert.” It “springs up whenever people get together and act in concert.” All government, and this is central to Arendt’s thesis, needs power in order to act.
This need for popular support is true even for totalitarian governments, which also depend on the power of people—at least a select group of them like the secret police and their informers—continuing to act together. It is thus a myth that totalitarian rule can exist without the support of the people. Whether in Nazi Germany or contemporary Syria, totalitarian or tyrannical governments still are predicated on power that comes from support of key segments of the population.
Even if all government is predicated on some power, governments also employ violence—but that violence is held in check by political limits. As a government loses its popular support, it finds itself tempted to “substitute violence for power.” The problem is that when governments give in to the temptation to use violence to shore up slackening popular power, their use of violence further diminishes their power and results in impotence. The more violence a government needs to rely upon, the less power it has at its disposal. There is thus a political limit on how much violence any government can employ before it brings about the loss of its own power.
As much as she respects the claims for power over violence, Arendt is clear-eyed about the damage violence can wield. In a direct confrontation between power and violence, violence will win—at least in the short term. Arendt writes that if Gandhi’s “enormously powerful and successful strategy of nonviolent resistance” had met a different enemy—a Stalin or Bashar al-Assad instead of a Churchill or Mubarak—“the outcome would not have been decolonization, but massacre and submission.” Sheer violence can bring victory. But the price for such a triumph is high, not only for the losers, but also for the victors.
We see this exemplified in the Middle East over the last few years. In those countries, like Bahrain and Syria, where governments did not shy from unlimited violence to repress popular revolts, the governments have maintained themselves and the Arab Spring has turned into a long and frigid winter. Assad has been able to maintain power, but his power is irreparably diminished. In the end, there is a limit to the viability and effectiveness of relying on mere violence at the expense of power. This is even more true in a constitutional democracy, where the support of the people is a political necessity.
As confident as Arendt is that violence is limited in politics by the need for power, she worries that the coming age of “robot soldiers” might bring about the end of the political advantage power has over violence. Robot soldiers can be controlled absent of consent or political support. With the push of a button or a simple command, a tyrant or totalitarian ruler can exert nearly unlimited violence and destruction, even without the support of a massive secret police or a network of informers. Drones threaten the time-immemorial dependence of even the most lonely tyrant on others who will support him and do his bidding.
Of course drones must be built, programmed, and maintained. No tyrant is fully autonomous. Yet building, programming, and maintaining machinery are fundamentally different jobs than arresting and killing dissenters. It is far easier for programmers and electricians to justify doing their jobs in a powerless yet violent state than for soldiers and secret agents to justify theirs.
In a drone-led war, men will rarely need to go into action as soldiers. That is of course one reputed advantage of drones, that they make war less dangerous and more technically predictable. But it also means that as modern warfare becomes safer and more humane, it excludes human soldiers and risks stripping war of its human and active character. This helps to explain an enigmatic passage of Arendt’s in The Human Condition, where she offers modern war as an example of when action “loses its specific character” as human action and “becomes one form of achievement among others.” The degradation of human action in modern war, she writes,
happens whenever human togetherness is lost, that is, when people are only for or against other people, as for instance in modern warfare, where men go into action and use means of violence in order to achieve certain objectives for their own side against the enemy. In these instances, which of course have always existed, speech becomes indeed ‘mere talk,’ simply one more means toward the end….
Arendt is here thinking of the anonymity of the modern soldier epitomized by the monuments to the unknown soldiers—the mute mass of humanity who fight and die without the “still existing need for glorification” that makes war a human instead of a merely mechanical activity.
For Arendt, modern warfare in its inhumanity and technological capacity abandons the togetherness that has traditionally made war a prime example of human political togetherness.
In the technological advances of modern warfare that made war so awful and so mechanical, Arendt actually found a glimmer of hope: that war’s rabid violence was compensated by neither political advantage nor personal glory. In On Revolution, she dared hope that the fact that technology had reached the stage “where the means of destruction were such as to exclude their rational use” might lead to a “disappearance of war from the scene of politics….” It was possible, she thought, that the threat of total war and total destruction that accompanies war in the modern era might actually lead to the disappearance of war.
Clearly such a hope has not come to pass. One reason for the continuation of war, however, is that the horrors of war are made ever more palatable and silent—at least to the victors—by the use of technology that exerts violence without the need for political power and participation. The drone wars of the early 21st century are in this respect notable for the unprecedented silence that accompanies violence. Since U.S. soldiers are rarely injured or killed and since the strikes are classified and the damage remote, we have indeed entered an era where we can fight wars absent the speech, glory, and “human togetherness” that have traditionally marked both the comradeship of soldiers and the patriotic sacrifice of a nation at war. It is in this extraordinary capacity of mute violence to substitute for power that we can glimpse both the promise and the peril of drones.
Thomas Levin of Princeton came to Bard on Tuesday to give a lecture to the Drones Seminar, a weekly class I am participating in, led by my colleague Thomas Keenan and conceived by two of our students, Arthur Holland and Dan Gettinger. Levin has studied surveillance techniques for years, and he came to think with us about how the present obsession with drones will transform our landscape and our imaginations. At a time when the obsession with drones in the media is focused on their offensive capacities, it is important to recall that drones were originally developed as a surveillance technology. If drones are to become omnipresent in our lives, what will that mean?
Levin began by reminding us of the embrace of other surveillance devices in mass culture, like recording devices at the turn of the 20th century. He offered old postcards and cartoons in which unsuspecting servants or children were caught goofing off or insulting their superiors with newfangled recording devices like the cylinder phonograph and, later, hidden cameras and spy satellites. The realization emerges that we are being watched, and this sense pervades the popular consciousness. In looking to these representations from mass culture of the fear, awareness, and even expectation that we will be watched and listened to, Levin finds the emergence of what he calls “rhetoric of surveillance.”
In short, we talk and think constantly about the fact that we are, or may be, being watched. This cannot but change the way we behave and act. What, Levin asks, is the emerging drone imaginary?
To answer that question it is helpful to revisit an uncannily prescient imagination of the rise of drones in a text written over half a century ago, Ernst Jünger’s The Glass Bees. Originally published in 1957 and recently reissued in translation with an introduction by science fiction novelist Bruce Sterling, Jünger’s text centers on a job interview between an unnamed former light cavalry officer and Giacomo Zapparoni, the secretive, filthy rich, and powerful proprietor of The Zapparoni Works, which “manufactured robots for every imaginable purpose.” Zapparoni’s secret, however, is that instead of big, hulking robots, he specialized in Lilliputian robots that gave “the impression of intelligent ants.”
The robots were not powerful in themselves, but they worked together. Like drone bees and drone ants—that exist only for procreation and then die—the small robots, or drones, serve specific purposes in industry or business. Zapparoni’s tiny robots “could count, weigh, sort gems or paper money….” Their power came from their coordination.
The robots “worked in dangerous locations, handling explosives, dangerous viruses, and even radioactive materials. Swarms of selectors could not only detect the faintest smell of smoke but could also extinguish a fire at an early stage; others repaired defective wiring, and still others fed upon filth and became indispensable in all jobs where cleanliness was essential.” Dispensable and efficient, Zapparoni’s little robots could do the most dangerous and least desirable tasks.
In The Glass Bees, we are introduced to Zapparoni’s latest invention: flying glass bees that can pollinate flowers much more efficiently and quickly than natural bees. The bees “were about the size of a walnut still encased in its green shell.” They were completely transparent and they were an improvement upon nature, at least insofar as the pollination of flowers was concerned. If a true or natural bee “sucked first on the calyx, at least a dessert remained.” But Zapparoni’s glass bees “proceeded more economically; that is, they drained the flower more thoroughly.” What is more, the bees were a marvel of agility and skill: “Given the flying speed, the fact that no collisions occurred during these flights back and forth was a masterly feat.” According to the cavalry officer, “It was evident that the natural procedure had been simplified, cut short, and standardized.”
Before our hero is introduced to Zapparoni’s bees, he is given a warning: “Beware of the bees!” And yet he forgets this warning. Watching the glass bees, the cavalry officer is fascinated. He felt himself “come under the spell of the deeper domain of techniques,” which like a spectacle “both enthralled and mesmerized.” His mind, he writes, went to sleep and he “forgot time” and “also entirely forgot the possibility of danger.”
Jünger’s book tells, in part, the story of our fascination and subjection to technologies of surveillance. On Facebook or Words with Friends, or even using our smart phones or GPS systems, we allow our fascination with technology to dull our sense of its danger. As Jünger writes: “Technical perfection strives toward the calculable, human perfection toward the incalculable. Perfect mechanisms—around which, therefore, stands an uncanny but fascinating halo of brilliance—evoke both fear and a titanic pride which will be humbled not by insight but only by catastrophe.”
The protagonist of The Glass Bees, a former member of the Light Cavalry and later a tank inspector, had once been fascinated by the “succession of ever new models becoming obsolete at an ever increasing speed, this cunning question-and-answer game between overbred brains.” What he came to see is that “the struggle for power had reached a new stage; it was fought with scientific formulas. The weapons vanished in the abyss like fleeting images, like pictures one throws into the fire. New ones were produced in protean succession.” Victory ceased to be about physical battle; it became, instead, a contest of technical mastery and knowledge.
The danger drones pose is not necessarily military. As General Stanley McChrystal rightly said when I asked him about this last week at the New York Historical Society, drones are simply another military tool that can be used for good or ill. Many fret today about collateral damage by drones and forget that if we had to send in armies to do these tasks the collateral damage would be much greater. Others worry about assassination, but drones are simply the tool, not the person pulling the trigger. It may be true that having drones when others don’t offers an enormous military advantage and makes the decision to kill easier, but when both sides have drones, we will all think harder before beginning a cycle of illegal assassinations.
Rather, the danger of drones is how they change us as humans. As we humans interact more regularly with drones and machines and computers, we will inevitably come to expect ourselves and our friends and our colleagues and our lovers to act with the efficiency and selflessness of drones. Sherry Turkle worries that mechanical companions offer such fascination and unquestionable love that humans are beginning to prefer spending time with their machines rather than with other humans—who make demands, get tired, act cranky, and disappoint us. Ron Arkin has argued that robot soldiers will be more humane at war than human soldiers, who often act rashly out of exhaustion, anger, or revenge. Doctors are learning to rely on Watson and artificially intelligent medical machines, which can bring databases of knowledge to bear on diagnoses with a speed and objectivity that humans can only dream of. In every area of human life where humans once were thought to be necessary, drones and machines are proving more reliable, more capable, and more desirable.
The danger drones represent is not what they do better than humans, but that they do it better than humans. They are a further step in the human dream of self-improvement—the desire to overcome our shame at our all-too-human limitations.
The incredible popularity of drones today is partly a result of their freeing us to fight wars with ever-reduced human and economic costs. But drones are popular also because they appeal to the human desire for perfection. The question is, however, how perfect we humans can be before we begin to lose our humanity. That is, of course, the force of Jünger’s warning: Beware of the bees!
As drones appear everywhere around us, you would do well to put down the newspaper and turn off YouTube and, instead, revisit Ernst Jünger’s classic tale of drones. The Glass Bees is your weekend read. You can read Bruce Sterling’s introduction to The Glass Bees here.
My girlfriend and I walked by a clothing storefront and noticed the print on some of the t-shirts at the lower right corner of the window and went in. She had mentioned this Imaginary Foundation (IF) before. They make print t-shirts.
I went to school at an expensive liberal arts college in the Hudson Valley—everyone there makes print t-shirts. It is the kind of business you start as a college sophomore to convince yourself that you are a ‘creative entrepreneur’ before you enter the corporate world (or, alternatively, as penance for inherited culture and comfort, the not-for-profit world).
Often I cannot stand them—the print t-shirts. There is something out of shape about them, as if the juxtaposition of body/shirt/image sets askew some intrinsic agreement in the marriage of fashion and identity. And yet the IF designs spoke to me. There is something dreamy and yet sincere about these prints. If le petit prince were looking for a print t-shirt, he would buy one of these.
It just so happened that the owner of the company was visiting this Seattle distributor and was in the store. He was awkward, skittish and European. I liked him, and before we left I told him that I blog for a thinking and humanities institute out east and may want to write about his brand. That’s how I got into the Imaginary Foundation.
The shirts are not exactly ‘pretty’ or ‘fashionable’; rather, their attraction is a gesture beyond themselves—a rare feat in a culture that positions branding as the apex of success. I’ll describe one shirt; if you are interested, you can invest your own time in the Imaginary Foundation.
The “Being There” shirt has three anonymous human heads (one in the cloud suit, one in the water suit, and one in the fire suit). The heads are shown in profile and are aligned, with a slight skew (allowing us a view of all three faces), as they break through a wall, the veil of the universe.
Other shirts handle concepts of psychosis and love (“Love Science”), of science and discovery in a reach towards heaven (“Reach”), and other such concepts widely considered esoteric or cliché within the lens of our popular culture. But we no longer understand what a ‘cliché’ is. I have long held the view that a cliché is a truth, or a point of interest and insight, that has simply been worn out by overexposure. But who has worn it out? How have we taken the liberty and quiet pleasure of the private sphere (the realms of reflection, contemplation, and meditation as the Greeks thought of them) out of our living cycle, our consciousness, our daily existence? Why is the call for private contemplation no longer a necessity of existence? It seems we should have more time than ever for such practices. So many of our daily chores, our basic needs, are met through the economic matrix. I no longer have to chop wood for warmth, hunt a boar for food, or trek down to the river simply for water. Why shouldn’t I spend more time in private contemplation, or even in public conversation on these more subtle topics of human necessity? Why shouldn’t I be making something in an effort to communicate those private necessities? The actualization of the humanist requires space for such a practice. And yet anything that requires a slowing down, a calling for the work of the mind and private reasoning, is now, quite often immediately, labeled a cliché.
In The Human Condition Arendt writes: “The emancipation of labor and the concomitant emancipation of the laboring classes from oppression and exploitation certainly means progress in the direction of non-violence. It is much less certain that it was also progress in the direction of freedom.” She is not saying that the laboring classes should not have been emancipated. Rather, she is saying that the humanist goal has been blurred by some glitch. Instead of moving towards freedom from wasteful labor (a waste of human power—physical, mental, spiritual), we have instead emancipated labor. Most of us have become imprisoned in a non-sustainable cycle that, for the continuation of its forward motion, requires ever-increasing consumption and waste. This waste can be seen in terms of power. The core power of the human psyche originates in the liberty of free private thought—a psychological space for contemplation, a mapping of one’s stillness that is only possible in the acquisition of free time. Free time is the result of freedom from labor’s necessity. What Arendt’s thoughts gesture towards is that the set of basic necessities we have been freed from has been replaced by another, far more complicated and disguised set: the necessity to perpetuate a system that is moving much faster than us, a necessity to consume and continue consuming. To be ‘a part of’ is, today, to be a consumer—to take one’s place in the labor of waste.
Oh right, I wanted to tell you about a product...
“IF” is a creative project. It gains the viewer’s attention and borrows the imagination. This is a beginning. It does not steal; it borrows. It suggests the prospect of resonance rather than ownership.
I checked out the company website. The “about” page describes the development of the Imaginary Foundation: “a think tank from Switzerland that does experimental research on new ways of thinking and the power of the imagination. They hold dear a belief in human potential and seek progress in all directions.” The page is dotted with black and white images from the sixties, shaggy haired men and turtle-neck clad women engaged in contemplative, laissez-faire, light spirited dialogue. The imaginary director of the foundation is described as a “70-something uber-intellectual whose father founded the Dadaist movement.” The foundation is imaginary. It is a base, a canvas, for the products (the t-shirts) and the ideas behind them.
The blog section of the site imagines a list of contributors: Isadore Muggll, Kamilla Rousseau, etc. These architects, as the backstory goes, are also imaginary. “IF” is a fictional foundation for the product. But the product is real and engaging.
What is captured here goes beyond the tangible properties of the product (t-shirts). It is about what the product delivers—the wonder of creativity and science, the archetypes of the IF. Imagination IS the foundation of this product.
The blog itself is a venue for artists who marry technology and art, as well as for other thought-provoking materials. The image I use at the head of this article is taken from the blog. Cloud, idea, light, community, play—IF: all these are represented in the Cloud installation. This art installation is a discovery I was brought to by the Imaginary Foundation.
I once taught a course on the development of contemporary advertising, heavily focused on Edward Bernays and the peripheral route of persuasion. Bernays was Sigmund Freud’s nephew, Woodrow Wilson’s image advisor, the father of the term "Public Relations," and the architect of the torches of freedom (Lucky Strikes) campaign, among many others. His theory, though terribly simplified here, was that the modern consumer does not purchase with his mind; rather, he defers to his emotions in most choices. The rational-actor is a fiction. If consumerism became god, branding became its religion.
Ad campaigns have become remarkably creative, and even, at times, beautiful. Have you ever felt the urge to cry during a Jeep commercial? Many have. I think I have. The central conceptual premise of the AMC show Mad Men depends upon this tension: between art and consumption; the rendering from black and white to color; the effective marketing and selling off of the human experience. In question is the art aspect of advertising. It is at the core of Don Draper’s motivations, and the one that, despite his many character failings, keeps endearing him to us. Ultimately we are asking: will he reconcile his artistic urge (his private motivation) with his office at the homunculus of the consumerism model (his role in the corporate arena)? Exposed is a manipulation, an incongruence, an infidelity in the marriage of advertising and art. Whereas art points towards something beyond itself, beyond even the image and the medium, the ad campaign points only to one purpose—back into itself. No idea behind it. Nothing living. It consumes.
Advertising is like the Ouroboros, the dragon that swallows its own tail; having entirely swallowed itself, the modern advertising campaign defies the laws of balance; it is only the unrelenting, hungry serpent head of consumption—devoid of the body of life. The only urge driving it is to possess.
It is the difference between the work of Egon Schiele and Penthouse, the writings of Georges Bataille and a godaddy.com super bowl campaign.
Seduce ->consume. This is the current mandate of the ad campaign. But this relationship is only sustainable through incompletion. It requires continual doses. Seduce -> consume -> feel a lack even in the possession of product (contract unfulfilled) -> be seduced again -> consume. Ad infinitum. A terrible loop.
How can consumerism and individual consciousness (the most private sector) be made sustainable? Is it possible for a product to speak beyond itself? To fulfill the promise of its persuasion? And if it could, what would that mean for us?
Here I position the word sustainability to face two directions. In part it refers to what Arendt terms “worldly”: the creation produced through work and not labor, something that has the potential to last beyond the productions of time, something that maneuvers into the arena of the eternal. I also want to posit the word in terms of its evolving contemporary potential. The one sector of the public and political sphere that allows for the platform of this conversation is the environmental movement. It is where we have begun to contemplate the world beyond the shortsighted view of individual lifetimes. We speak of the sustainability of our planet; we are considering new ways to move our habits from wasteful and consumptive towards lasting and sustainable power. It is a fairly new conversation, and the word “sustainability” is evolving with each new perspective we bring to it.
Sustainability goes beyond consumer awareness. It is about the awareness of the product, how a brand gains consciousness. I need to explore here a definition of “consciousness.”
I have come to understand definitions as ever evolving in accordance with society and the pressures put upon it by the conditions of the time, the fractals of our world (more simply put, the culture stew).
Consciousness is the expanding of space into which one can resonate. To learn of the world around us, to acknowledge it, to consider its multiple dimensions, is to become more conscious -- to create space into which we can move by the will of our imagination and invention.
The Imaginary Foundation is an example of this bridge. It acknowledges itself and its fiction. It allows for play. It is a small company that uses the fabrication of its narrative to bring the consumer’s attention to the mimetic principles behind its product. Revealing the architect’s conceit brings me (the consumer) into co-authorship of the story. It endears itself to me. We do not only consume the product. We consume the narrative of the product. Even if I do not purchase, if I am thinking about it and talking about it, I have bought in. If it generates new ideas and deeper-order thoughts, then I have begun to take ownership of the product. I consume the myth, I begin to co-author it—I don it in the neural network of culture. And thus the product has gained consciousness, has begun to be carried beyond the object—it resonates.
My study of this product is limited. I am not encouraging anyone here to purchase a shirt. I have not purchased a shirt. What I think this opens up is a table for negotiations between the current consumerism model and individual consciousness—an opportunity to examine sustainable consumerism in all its implications.
There is a petition circulating asking Bowling Green State University to rescind its decision to cut 11% of its faculty—nearly 100 positions—while simultaneously planning to increase enrollment. The petition reads:
Slashing faculty numbers while planning to increase enrollment by 6,000 students (as you publicly announced in 2012) will greatly diminish BGSU’s position as one of Ohio’s top-rated public universities. Your plans would compromise the education of current students, and it would reduce the prestige of degrees that have already been granted by BGSU.
The decision is designed to save $5.2 million, just over the $5 million that the university is set to lose as a result of Ohio’s recent budget cuts to public university education.
On the one hand, this is a story that will be repeated over and over in the coming years. On the other hand, why is it that the university chooses to fill the entirety of its budget gap by letting faculty go? There was no announcement about cutting administrators, paring back expensive sports programs, or halting an expensive building plan. Here is what the Bowling Green State University Faculty Association said:
“[T]he $5.2-million savings is suspiciously close to the $5 million number that BGSU officials have floated as the loss from state share of instruction under Ohio’s new funding plan,” the statement indicates. “In other words, Mazey may have decided that faculty alone should absorb any budgetary challenges. It’s certainly easier than cutting six-figure administrators, in-the-red athletics, expensive residence halls, luxurious renovations to the rec center, high-priced outside consultants, failed football bowl games, or Mazey’s team of spin doctors which, as Mazey administration spending indicates, are her true priorities.”
It does seem that the University is cutting the faculty in a disproportionate and severe manner, especially given the announced intent to increase enrollment. It would be much better to cut administration and sports teams. But the sad fact remains, colleges like Bowling Green are going to suffer as public funding is cut back, student debt levels depress enrollments, and alternatives to college emerge. At the same time, technology will begin to displace many faculty members and allow colleges to educate more students with fewer professors.
Given the changes coming to higher education, it is important that colleges and universities adapt intelligently. We might start by cutting back on administration and luxury dorms. One big question is whether tenured faculty positions will continue to make sense at a time that demands flexibility and innovation. It is worth noting that Bowling Green cut exclusively amongst adjuncts and part-time faculty, leaving its tenured faculty untouched. How much longer that will continue to happen is a real question.
Controversy is raging around Thomas Friedman’s column today advising the presumptive Secretary of State John Kerry to “break all the rules.”
In short, Friedman—known for his faithful belief that technology is making the world flat and changing things for the better—counsels that the U.S. ignore hostile governments and appeal directly to the people. Here’s the key paragraph:
Let’s break all the rules. Rather than negotiating with Iran’s leaders in secret — which, so far, has produced nothing and allows the Iranian leaders to control the narrative and tell their people that they’re suffering sanctions because of U.S. intransigence — why not negotiate with the Iranian people? President Obama should put a simple offer on the table, in Farsi, for all Iranians to see: The U.S. and its allies will permit Iran to maintain a civil nuclear enrichment capability — which it claims is all it wants to meet power needs — provided it agrees to U.N. observers and restrictions that would prevent Tehran from ever assembling a nuclear bomb. We should not only make this offer public, but also say to the Iranian people over and over: “The only reason your currency is being crushed, your savings rapidly eroded by inflation, many of your college graduates unemployed and your global trade impeded and the risk of war hanging overhead, is because your leaders won’t accept a deal that would allow Iran to develop civil nuclear power but not a bomb.” Iran wants its people to think it has no partner for a civil nuclear deal. The U.S. can prove otherwise.
Foreign policy types like Dan Drezner respond with derision.
Friedman's "break all the rules" strategy is as transgressive as those dumb-ass Dr. Pepper commercials. Worse, he's recommending a policy that would actually be counter-productive to any hope of reaching a deal with Iran. This is the worst kind of "World is Flat" pablum, applied to nuclear diplomacy. God forbid John Kerry were to read it and follow Friedman's advice.
I’ll leave the debate to others. But look at the central assumption in Friedman’s logic. If the leaders of a country don’t agree with us, go to the people. Tell them our plan. They’ll love it. But why is that so? For Friedman and so many of his brothers and sisters on the left and the right in the commentariat, the answer is: because our proposals are rational. Whether it is Friedman on Iran or Brooks on the economy or liberals on gun control or conservatives on the budget, there is an assumption that if everyone would just get together and talk this through like rational individuals, we would agree on a workable and rational solution. This is of course the basic view of President Obama. He sees himself as the most rational person in the room and wonders why people don’t agree with him.
This rationalist fallacy is wrong. Neuroscientists tell us that people respond to emotional and non-rational inputs. But long ago Hannah Arendt understood and argued that the essence of politics is neither truth nor reason. It is plurality and opinion. The basic condition of politics is plurality, which means people need to come together and pursue a common good in spite of their disagreements and differences.
For Arendt, Western history has seen politics come under the sway of philosophy and thus the pursuit of rational truth, instead of being what it was: a space for the public engagement of different opinions. The tragedy of the last 50 years is that philosophical rationality has now been supplanted by technocratic rationality, so that politics is increasingly about neither opinion nor common truths, but technocracy.
One lesson Arendt took from her fundamental distrust of unity and rationality was the importance of the diffusion of powers and her distrust of centralized power. Her embrace of American Constitutional Federalism was neither conservative nor liberal; it was born from her insistence that politics cannot and should not seek to replace opinions with truths.
Friedman wants rational truth to win out and believes that if we just talk to the people, the veils will fall from their eyes. Well, it doesn’t work here at home, because people really do disagree and see the world differently. There is no reason to think it will work around the world either. A thoughtful foreign policy, as opposed to a rational one, would begin with the fact of true plurality. The question is not how to make others agree with us, but rather how we who disagree can still live together meaningfully in a common world.
This Weekend Read is Part Two in “The ‘E’ Word,” a continuing series on “elitism” in the United States educational system. Read Part One here.
Peter Thiel has made headlines offering fellowships to college students who drop out to start a business. One of those Thiel fellows is Dale Stephens, founder of Uncollege. Uncollege advertises itself as radical. At the top of their website, Uncollege cites a line from the movie "Good Will Hunting":
You wasted $150,000 on an education you coulda got for a buck fifty in late charges at the public library.
The Uncollege website is filled with one-liners extolling life without college. It can be and often is sophomoric. And yet, there is something deeply important about what Uncollege is saying. And its message is resonating. Uncollege has been getting quite a bit of attention lately, part of a culture of obsession with college dropouts that is increasingly skeptical of the value of college.
At its best, Uncollege does not simply dismiss college as an overpriced institution seeking to preserve worthless knowledge. Rather, Uncollege claims that college has become too anti-intellectual. College, as Uncollege sees it, has become conventional, bureaucratic, and not really dedicated to learning. In short, Uncollege criticizes college for not being enough like college should be. Hardly radical, Uncollege trades rather in revolutionary rhetoric in the sense that Hannah Arendt means the word revolution: a return to basic values. In this case, Uncollege is of course right that colleges have lost their way.
Or rather, that is what I find interesting about Uncollege.
To actually read their website and the recent Uncollege Manifesto by Dale Stephens, is to encounter something different. The first proposition Uncollege highlights has little to do with education and everything to do with economics. It is the decreasing value of a college education.
The argument that college has ever less value will seem counterintuitive to those captivated by all the paeans to the value of college and the increased earning potential of college graduates. But Uncollege certainly has a point. Currently about 30% of the U.S. adult population has a degree. But among 20-24 year-olds, nearly 40% have a college degree. And the Obama administration aims to raise that number to 60% by 2020. Uncollege calls this Academic Inflation. As more and more people have a college degree, the value of that degree will decrease. It is already the case that many good jobs require a Masters or a Ph.D. In short, the monetary value of the college degree is diminished and diminishing. This gives us a hint of where Uncollege is coming from.
The Uncollege response to the mainstreaming of college goes by a number of names. At times it is called unschooling. Unschooling is actually a movement begun by the legendary educator John Holt. I recall reading John Holt’s How Children Learn while I was in high school—a teacher gave it to me. I was captivated by Holt’s claim that school can destroy the innate curiosity of children. I actually wrote my college application essay on Holt’s educational philosophy and announced to my future college that my motto was Mark Twain’s quip, “I never let school interfere with my education”—which is also a quotation prominently featured in the Uncollege Manifesto.
Unschooling—as opposed to Uncollege—calls for students to make the most of their courses, coupling those courses with independent studies, reading groups, and internships. I regularly advise my students to take fewer not more courses. I tell them to pick one course each semester that most interests them and pursue it intently. Ask the professor for extra reading. Do extra writing. Organize discussion groups about the class with other students. Go to the professor’s office hours weekly and talk about the ideas of the course. Learners must become drivers of their education, not passive consumers. Students should take their pursuit of knowledge out of the classroom, into the dining halls, and into their dorms.
Uncollege adds that unschooling or “hacking your education” can be done outside of schools and universities. With Google, public libraries, and free courses from Stanford, MIT, and Harvard professors proliferating on the web, an enterprising student of any age can compose an educational path today that is more rigorous than anything offered “off-the-shelf” at a college or university. I have no problem with online courses. I hope to take a few. But it is a mistake to think that systems of massive information delivery are the same thing as education.
What Uncollege offers is something more and something less wholesome than simply a call for educational seriousness. It packages that call with the message that college has become boring, conventional, expensive, and unnecessary. In the Uncollege world, only suckers pay for college. The Uncollege Manifesto promotes “Standing out from the other 6.7 billion”; it derides traditional paths, pointing out that “5,000 janitors in the United States have Ph.Ds.”; and it cautions, “If you are content with life and education you should probably stop reading… You shall fit in just fine with society and no one will ever require you to be different. Conforming to societal standards is the easy and expected path. You are not alone!”
At the core of the Uncollege message is that dirty and yet all-so-powerful little word again: “elitism.” Later in the Uncollege Manifesto we are told that young people have a choice between “real accomplishments” and the “easy path to mediocrity”:
To succeed without a college degree you will have to build your competency and reputation through real world accomplishments. I am warning now: this is not going to be easy. If you want to take the easy path to mediocrity, I encourage you to go to college and join the masses. If you want to stand out from the crowd and change the world, Uncollege is for you!
At one point, the Uncollege Manifesto lauds NPR’s “This I Believe” series and commends these short 500-word essays on personal credos. But Uncollege adds a twist: instead of writing what one believes, it advises its devotees to write an essay answering the question: “What do you believe about the world that most others reject?” It is not enough simply to believe in something. You must believe in something that sets you apart and makes you different.
Uncollege is at least suggesting that it might be cool to want, as it has not been for 50 years, to aim for excellence and to yearn to be different. In short, Uncollege is calling on students at elite institutions to boldly grab the ring of elitism and actively seek to stand outside and above the norm. And it is saying that education is no longer elite, but conventional.
It is hard not to see this embrace of elitism as refreshing, although no doubt many will scream the “e” word. I have often lectured to students at elite institutions and confronted them with their fear of elitism. They or someone spends upwards of $200,000 on an education, not to mention four years of their lives, and then they reject the entire premise of elitism: that they are different or special. By refusing to see themselves as members of an elite, these students too often refuse to accept the responsibility of elites: to mold and preserve societal values and to assume leadership roles in society.
Leading takes courage. In Arendtian terms, it requires living a public life where one takes risks, acts in surprising ways, and subjects oneself to public judgment. Leading can be uncomfortable and dangerous, and it is often more comfortable and fun to pursue one’s private economic, familial, and personal dreams. Our elite colleges have become too much about preparing students for private success rather than launching young people into lives of public engagement. And part of that failure is a result of a retreat from elitism and a false humility that includes an easy embrace of equality.
That Uncollege is selling its message of excellence and elitism to students at elite institutions of higher learning is simply one sign of how mainstream and conformist many of these elite institutions have become. But what is it that Uncollege offers these elite students who drop out and join Uncollege?
According to its website, Uncollege is selling “hackademic camps” and a “gap year program” that are designed to teach young people how to create their own learning plans. The programs come with living-abroad options and internships. Interestingly, these are all programs offered by most major universities and colleges. The difference is money and time. For $10,000 in just one year, you get access to mentors, get pushed to write op-eds, and get the “opportunity to work at hot Silicon Valley startups, some of them paid positions.” In the gap year program, participants will also “build your personal brand. Speak at a conference, Write an op-ed for a major news outlet. Build a personal website.”
None of this sounds radical, intellectual, or all that elitist. On the contrary, it claims that young people have little to learn from educators. Teachers are unimportant, to be replaced by mentors in the world. The claim is that young people lack nothing but information and access in order to compete in the world.
What Uncollege preaches often has little to do with elitism or intellectual growth. It is a deeply practical product being sold as an alternative to the cost of college. In one year and for one-twentieth of what a four-year elite college education costs, a young person can get launched into the practical world of knowledge workers, hooked up with mentors, and set into the world of business, technology, and media. It is a vocational training program for wannabe elites, training people to leap into the creative and technology fields and compete with recent college graduates but without the four years of studying the classics, the debt, and the degree. The elitism that Uncollege is selling is an entrepreneurial elitism measurable by money. By appealing to young students’ sense of superiority, ambition, and risk-taking, Uncollege stands a real chance of attracting ambitious young people more interested in a good job and a hot career than in reading the classics or studying abstract math.
Let’s stipulate this is a good thing. Not everybody should be going to liberal arts colleges. People unmoved by Nietzsche, Einstein, or Titian who are then forced to sit through lectures, cram for exams, and pull all-nighters writing papers cribbed from the internet are wasting their time and money on an elite liberal arts education. What is more, they bring cynicism into an environment that should be fired by idealism and electrified by passion. For those who truly believe that it is important in the world to have people who are enraptured by Sebald and transformed by Arendt, it is deeply important that the liberal arts college remain a bastion apart, a place where youthful exuberance for the beautiful and the true can shine clearly.
We should remember, as well, that reading great books and studying Stravinsky is not an activity limited to the academy. We should welcome a movement like Uncollege that frees people from unwanted courses but nevertheless encourages them to pursue their education on their own. Yes, many of these self-educated strivers will acquire idiosyncratic readings of Heidegger or strange views about patriotism. But even when different, opinions are the essence of a human political system.
One question we desperately need to ask is whether having a self-chosen minority of people trained in the liberal arts is important in modern society. I teach in an avowedly liberal arts institution precisely because I fervently believe that such ideas matter and that having a class of intellectuals whose minds are fired by ideas is essential to any society, especially a democracy.
I sincerely hope that the liberal arts and the humanities persist. As I have written,
The humanities are that space in the university system where power does not have the last word, where truth and beauty as well as insight and eccentricity reign supreme, and where young people come into contact with the great traditions, writing, and thinking that have made us who we are today. The humanities introduce us to our ancestors and our forebears and acculturate students into their common heritage. It is in the humanities that we learn to judge the good from the bad and thus where we first encounter the basic moral facility for making judgments. It is because the humanities teach taste and judgment that they are absolutely essential to politics. It is even likely that the decline of politics today is profoundly connected to the corruption of the humanities.
Our problem, today, is that college is caught between incompatible demands, to spark imaginations and idealism and to prepare young people for employment and success. For a long while now colleges have been doing neither of these things well. Currently, the political pressure on colleges is to cut costs and become more efficient. The unspoken assumption is that colleges must more cheaply and more quickly prepare students for employment. For those of us who care about college as an intellectual endeavor, we should welcome new alternatives to college like internet courses, vocational education, and Uncollege that will pull away young people for whom college would have been the wrong choice. Maybe, under the pressure of Uncollege, colleges will return to their core mission of passionately educating young people and preparing them for lives of civic engagement.
I encourage you this weekend to read the Uncollege Manifesto. Let me know what you think.
The Hannah Arendt Center has followed the shadow dance of the fiscal cliff less for its fiscal than for its political lessons. While a deal was struck, it is hard not to be impressed by the breakdown of our political class. Like the Europeans, we are now officially kicking the can down the road, refusing to address our meaningful problems. There is, in short, no political will and no political leadership with the courage and willingness to act in ways that might help us imagine a new way out of our predicament.
One could say it is the fault of voters. But there is a funny thing happening in politics. The House of Representatives, which is supposed to be the most populist of the major branches of government, is the one branch of government that is calling loudly for painful spending cuts and resisting the rise of our out-of-control debt. True, the House is calling for tax cuts, but so too did the Senate and the President. What distinguishes the House now is its insistence on cutting spending. The Senate and President—imagined to be more protected from popular will—are instead combining now to cut taxes, increase spending, and keep the gravy train of government-subsidized stimulus flowing. In a strange way, it is the political body most responsive to voters that is at least calling for change—even if the House Republicans refuse to be honest about what those changes would be or what they would mean. Why or how has this political inversion happened?
One of the few Senators who voted against the compromise is Michael Bennett, the Democratic Senator from Colorado who was supposed to be cliff jumping in Vail (it’s nice here!) but stayed in Washington to vote “No.” Interviewed by Maureen Dowd in The New York Times, Bennett says: “Going over the cliff is a lousy choice and continuing to ignore the fiscal realities that we face is a lousy choice.” Bennett, a free thinking Democrat, knows that things have to change.
"The burden of proof has to shift from the people who want to change the system to the people who want to keep it the same,” he said. “I think if we can get people focused to do what we need to do to keep our kids from being stuck with this debt that they didn’t accrue, you might be surprised at how far we can move this conversation.”
But what is it about the system that needs to change? Some see this as simply a matter of policy. Nouriel Roubini, writing today in the Financial Times, thinks taxes need to go up for all Americans to help support a welfare state that is drastically underfunded and yet ever-so necessary:
Neither Democrats nor Republicans recognise that maintaining a basic welfare state, which is right and necessary in our age of globalisation, rapid technological change and demographic pressure, implies higher taxes for the middle class as well as for the rich. A deal that extends unsustainable tax cuts for 98 per cent of Americans is therefore a pyrrhic victory for Mr. Obama.
Roubini may very well be right. But as he himself recognizes, the political will to exercise this transformation is simply not there. What that means policy wise, I do not know.
The aftereffects of Super-storm Sandy are felt from the beaches to the statehouses. First of all, let’s realize it was not a hurricane, but a freakish combination of storm systems. Super-storm is more truthful than hurricane. Whatever it was, it has upended lives, and politics.
The Financial Times reports today that Governor Chris Christie of New Jersey has now joined NY Governor Andrew Cuomo in requesting not only emergency aid to repair the damage caused by the storm, but also preventative money to build dunes, use eminent domain to purchase property, and generally re-engineer the New Jersey coastline.
The political transformation here is lost on few. As the FT writes:
Mr. Christie, a Republican, has previously sounded more skeptical than Mr. Cuomo, a Democrat, about using state powers to dictate how the state was rebuilt. But he said on Wednesday he might take away local towns’ power to grant “easements” to homeowners objecting to new dunes blocking their sea views and would not rule out using government powers to purchase properties it believed were in the wrong place.
“I have to protect the Jersey shore, both as an economic engine and as a cultural engine,” Mr. Christie said.
The desire to take away local powers and give them to states, and to take away state powers and give them to the federal government, is neither a Democratic nor a Republican idea anymore. While the party of the elephant may pay lip service to local governance, it has rarely, if ever, backed that up with action. As is now well known, the federal government has grown as fast if not faster under Republican presidents than it has under Democratic ones.
Hannah Arendt argued that the greatest danger to freedom in the United States was the rise of a large and bureaucratic government. She worried, as she once wrote, that the true threat to freedom was the sheer size of America alongside the rise of a technocracy. The sheer size of the country combined with the rising bureaucracy threatened to swallow the love for freedom she saw as the potent core of American civic life.
Chris Christie and Andrew Cuomo may well be their respective parties’ nominees for President in 2016. They are both deeply popular and have taken a pragmatic and largely centrist approach to governing at a time of financial crisis and natural disaster. And yet, from an Arendtian angle, it is striking that both governors have so internalized the view that problems are to be solved by bureaucrats and technocrats rather than on a local level.
That the bureaucratic approach is so entrenched should not be a surprise. It is both a consequence of and a further spur to the retreat from politics that Hannah Arendt describes. Even Christie's insistence that he must save the Jersey shore as an economic engine shows the near-complete victory of economic thinking over politics.
Freeman Dyson, the eclectic physicist, took good aim at philosophy last week in a review of Jim Holt's silly book, Why Does the World Exist? An Existential Detective Story. Holt went around to "a portrait gallery of leading modern philosophers" and asked them the Leibnizian question: "Why is there something rather than nothing?" The book offers their answers, along with biographical descriptions.
For Dyson, Holt's book "compels us to ask" these "ugly questions." First, "When and why did philosophy lose its bite?" Philosophers were once important. In China, Confucius and his followers made a civilization. So too in Greece did Socrates and then the schools of Plato and Aristotle give birth to the Western world. In the Christian era Jesus and Paul, then Augustine and Aquinas, granted depth to dominant worldviews. Philosophers like Descartes, Hobbes, and Leibniz were central figures in the scientific revolution, and philosophical minds like Nietzsche, Heidegger, and Arendt (even if one was a philologist and the other two refused the name philosopher) have become central figures in the experience of nihilism. Against these towering figures, the "leading philosophers" in Holt's book cut a paltry figure. Here is Dyson:
Holt's philosophers belong to the twentieth and twenty-first centuries. Compared with the giants of the past, they are a sorry bunch of dwarfs. They are thinking deep thoughts and giving scholarly lectures to academic audiences, but hardly anybody in the world outside is listening. They are historically insignificant. At some time toward the end of the nineteenth century, philosophers faded from public life. Like the snark in Lewis Carroll's poem, they suddenly and silently vanished. So far as the general public was concerned, philosophers became invisible.
There are many reasons for the death of philosophy, some of which were behind Hannah Arendt's refusal to call herself a philosopher. Philosophy was born, at least in its Platonic variety, out of the thinker's reaction to the death of Socrates. Confronted with the polis that put the thinker to death, Plato and Aristotle responded by retreating from the world into the world of ideas. Philosophical truth separated itself from worldly truths, and idealism was born. Realism was less a return to the world than a reactive fantasy against idealism. In both, the truths that were sought were otherworldly truths, disconnected from the world.
Christianity furthered the divorce of philosophy from the world by imagining two distinct realms, the higher realm existing beyond the world. Science, too, taught that truth could only be found in a world of abstract reason, divorced from real things. Christianity and science together gave substance to the philosophical rebellion against the world. The result, as Dyson rightly notes, is that philosophy today is as abstract, unworldly, and irrelevant as it is profound.
What Dyson doesn't explore is why philosophers of the past had such importance, even as they also thought about worlds of ideas. The answer cannot be that ideas had more import in the past than now. On the contrary, we live in an age more saturated in ideas than any other. More people today are college educated, literate, and knowledgeable of philosophy than at any period in the history of the world. Books like Holt's are proof positive of the profitable industry of philosophical trinkets. That is the paradox: at a time when philosophy is read by more people than ever, it is less influential than ever.
One explanation for this paradox is nihilism—the devaluing or re-valuing of the highest values. The truth about truth turned out to be neither so simple nor so singular as the philosophers had hoped. An attentive inquiry into the true and the good led not to certainty, but to ideology critique. For Nietzsche, truth, like the Christian God, was a human creation, and the first truth of our age is that we recognized it as such. That is the precondition for the death of God and the death of truth. Nihilism has not expunged ideas from our world, but multiplied them. When speaking about the "true" or the "good" or the "just," Christians, Platonists, and moralists no longer have the stage to themselves. They must now shout to be heard amongst the public relations managers, advertisers, immoralists, epicureans, anarchists, and born-again Christians.
Dyson ignores this strain of philosophy. He does point out that Nietzsche was the last great philosopher, but then dismisses Heidegger, who "lost his credibility in 1933," and even Wittgenstein, who would fall silent if a woman attended his lectures and remain so until she left. And yet it is Heidegger who has given us the great literary masterpieces of twentieth-century philosophy.
His work on technology ("The Question Concerning Technology") and art ("The Origin of the Work of Art") has been widely read in artistic, literary, and lay circles. It is hard to imagine a philosopher more engaged with science and literature than Heidegger was. He read physics widely, co-taught courses at the house of the Swiss psychiatrist Medard Boss, and also taught seminars with the German novelist Ernst Jünger.
It seems worthwhile to end with a poem of Heidegger's from his little book, Aus der Erfahrung des Denkens/From Out of the Experience of Thinking:
Drei Gefahren drohen dem Denken
Die gute und darum heilsame Gefahr ist die Nachbarschaft des singenden Dichters.
Die böse und darum schärfste Gefahr ist das Denken selber. Es muß gegen sich selbst denken, was es nur selten vermag.
Die schlechte und darum wirre Gefahr ist das Philosophieren.
Three dangers threaten thinking.
The good and thus wholesome danger is the nearness of the singing poet.
The evil and thus sharpest danger is thinking itself. It must think against itself, something it can do only rarely.
The bad and thus confusing danger is philosophizing.
“…the enormous pathos which we find in both the American and French Revolutions, this ever-repeated insistence that nothing comparable in grandeur and significance had ever happened in the whole recorded history of mankind…”
-Hannah Arendt, On Revolution
Although my political memory is admittedly brief, I cannot remember an American presidential election day that was anticipated with less enthusiasm than the one that looms this week, particularly among the generation who are now my students. It is an unfortunate sign when you overhear conversations in the lounge expressing a wistfulness for the halcyon days of Clinton v. Dole. This is not to say that there are no strong emotions about the election – lots of umbrage weekly-renewed, considerable dread and anxiety, even a dash of hope and an occasional twist of satisfaction – but enthusiasm does not seem to be among them. Though this blog tends to dwell more on the political world with a touch of remove from its everyday hurly-burly, as Arendt did, given the proximity of the election it seems worth it to linger for a moment on the particular phenomenon those in this country face tomorrow: a moment of decision that no one seems particularly eager to reach.
Plenty has already been said about why this might be, and there is much more to be said than can be said in this space. I want to dwell on one particularly Arendtian concern that I have heard expressed and worried over more and more during the last few months, the simple question that a student last week pithily expressed as “what’s freedom got to do with it?” Asking the question in that way may nudge us in advance into hand-wringing and gnashing of teeth. But I want to argue something that may seem counter-intuitive, at least to the sensibilities that I hear advanced daily: that in fact the certain grimness or reticence with which many face the impending election is not a sign of the decay of the fabric of American polity, or the slow collapse of the meaningfulness of citizenship, but a sign that the events of the opening years of this millennium have brought us into a new kind of health. That health is precisely in the realm of freedom, a health increasingly robust even as we face terrible sickness and disrepair in other aspects of our political, economic, and cultural systems.
The great diagnostic temptation at this political moment, which one hears espoused often enough, is to say that Americans have forgotten how to experience freedom in our political process at all (if indeed it was ever there), and so we trudge towards Nov. 6 having thoroughly accepted that, whatever particular material interests we might have at stake, “freedom is not even the nonpolitical aim of politics, but a marginal phenomenon." And there is, of course, something to this worry, as there was when Arendt wrote it; a potent part of the dissatisfaction that so many feel and express is the sense that whoever is elected, it will make little difference in the end. There are lots of extremely portentous “minutiae” of politics to counterpose to this sense – the composition of the courts, the reversal of pre-existing condition restrictions, reproductive and marriage rights, the bearing of the federal tax structure on the nation’s titanic income inequality – but if these kinds of issues could disrupt the sense that they’re taken to address, the national media would have dispelled any concern of the sort long ago. What is at stake cannot be the literal question of whether or not there is anything at stake – otherwise the answer would be trivially obvious, that there is – but that there is a rich sense that, as Arendt puts it repeatedly, the experience of an inexorability to our political economy (to call a spade a spade) has thoroughly overwhelmed our hope for novelty, our belief in the possibility of new beginnings, of revolutionary change.
But there is something to this peculiar kind of despair, itself so different from the form of despair that dominated the part of our society in which I grew up – the sense of not only a crushing personal irrelevance but the fundamental impossibility of escaping a desperate struggle for livelihood – that actually bespeaks something promising for our political culture. The despair of change, which in fact now unites the two ideological poles of American politics, bespeaks a renewed sensitivity to freedom, freedom in the specifically Arendtian sense that space remains in which what is might be radically replaced with what might begin tomorrow. It is a sensitivity to freedom that can only exist in a polity that remembers what it is to feel and desire it.
On the contrary, anyone who doubts that Americans yet feel a sense of Arendtian freedom need only take a glance at the documentary currently making the rounds, 'PressPausePlay', to see that we have in fact again become so suffused with what Arendt called “the specific revolutionary pathos of the absolutely new, of a beginning which would justify starting to count time in the year of the revolutionary event,” that it has leapt out of the political realm and now structures our relationship to technology, to culture, and to education as well. Where once we had to worry that the political had become inextricably reduced to the social, now it seems that we may rather be faced with the universalization of the specifically political, with the preeminence of action and spectatorship in every sphere of the human condition. It’s not at all clear to me that that would be a terrible thing, but the point remains.
One might be inclined to blame the candidates themselves for the lack of enthusiasm, and again it would be hard to deny that there’s something to that. But what is it, exactly, that we find worth blaming? Certainly, those who supported him might have a number of particular political gripes with the way that President Obama executed his term in office, but I also think that in practice most understand that Obama could never, in the American political system, have lived up to the messianic fervor surrounding him, and that this is not the true source of disconsolation. I will confess to a certain lack of sympathy for feeling betrayed by Obama’s positions. Likewise, it is hard to fault those who oppose President Obama for being unenthusiastic, to put it mildly, with having Mitt Romney as their only meaningfully available avatar, given that that concept itself entails the expectation that something is being represented. But here, too, I think there is a perfectly resilient awareness among those who will vote for Romney that the man is in fact quite good for the role for which he has been groomed and in which he has placed himself: the consummate manager, the guarantor of the kind of freedom-as-security Arendt worried might wholly replace our sense of freedom-as-possibility, “not the security against ‘violent death,’ as in Hobbes…but a security which should permit an undisturbed development of the life process of the society as a whole.”
No, the difficulty that Americans face is neither that we have lost our “revolutionary pathos” that makes us believe in the promise of something truly new, nor that we have candidates who cannot fulfill our rather extraordinary expectations, but that we have once again come into our desire for both of the senses of freedom that Arendt diagnoses, and they are senses of freedom that do not sit easily together. The ambivalence and strain that comes with holding desires for competing freedoms is not something to be bemoaned, but celebrated, and converted into cause for engaging the immense barriers the current configuration of our political system has thrown up against those desires. We desire both the promise of change that holds fast our belief, and the promise of a managerial excellence in navigating the quotidian ho-hummery of administration. And this is simply the reality of, not the American political system, but the political system as such: these two forms of promise are inextricably bound to each other, and though it is a tense and at times openly antagonistic partnership, it is nevertheless one that the polity, at least in its modern sense, can’t do without. Political ambivalence, and even pessimism, is not a sign of the decay of our political capacities, but of their renewal by a decade of protest and struggle and failure on both sides of the political spectrum. Our senses of freedom are in rude health…whether our politics can bear it is another question.
The Wall Street Journal ran an interview this week with Luke Muehlhauser, the Executive Director of the Singularity Institute. The Journal asked: Will Artificial Intelligence Make Us Obsolete? Muehlhauser's answer was, well, yes. In his words:
Cognitive science has discovered that everything the human mind does is done by information processing and machines can do information processing too.
The first statement is clearly false, or at least depends on a strangely mixed up idea of "information processing." The old determinist canard that humans are simply complex machines has not been proven or discovered by cognitive science. And even if humans do process billions upon billions of bits of information it is not at all clear that such a humanly fallible process is reproducible. That is not the claim that cognitive science can make.
But cognitive science can claim that machines can be built that act in ways that are so like humans as to be almost nearly indistinguishable from them. Or, they can even be better than humans in doing many quintessentially human tasks. So machines can not only beat humans at chess, they can make moves that seem like moves only a human could have made, as Gary Kasparov learned to his dismay in the second game of his rematch with Deep Blue. Machines can create paintings that appear to be fully creative, as does Aaron, the painting machine created by artist and computer scientist Harold Cohen. And machines can increasingly make ethical decisions in warfare, as the robo-ethicist Ron Arkin has argued—decisions that are more humane than those made by human warriors.
Too much of the debate over artificial intelligence is caught up in the technical and really irrelevant question of whether machines can fully replicate human beings. The point is that if machines act "as if" they are human, or if they are capable of doing what humans do better than humans, we will gradually and continually allow machines to take over more and more of the basic human activities that make up our world. Already computers make most of the trades on Wall Street and computers are increasingly used in making medical diagnoses. Computers are being used to educate our children and write news stories. Caregivers for the elderly are being replaced by robotic companions. And David Levy, artificial intelligence researcher at the University of Maastricht in the Netherlands, argues that we will be marrying robots in the near future. It is not that these robotic lovers or artificial artists are human, but that they love and paint in ways that do or will soon pass the Turing test: they will be impossible to distinguish from human works.
Undoubtedly one reason machines are acting more human is that humans themselves are acting less so. As we interact more and more with machines, we begin to act predictably, repetitively, and less surprisingly. There is a convergence afoot, and it is the dehumanization of human beings as much as the humanization of robots that should be worrying us.
The crisis must matter.
The most important divide in political and intellectual life today is between those who see society undergoing a transformative crisis and others who believe that the basic structures of the 20th-century industrial welfare state will persist.
The divide over how to understand the crisis of our times was front and center at the recent Hannah Arendt Center conference "Does the President Matter? A Conference on the American Age of Political Disrepair."
A number of speakers worried about the language of crisis. They rightly see talk about a "crisis" as code for an attack on the institutions of the welfare state. It can be an excuse to not only scale back the unsustainable aspects of our entitlement programs, but also to lower taxes on the wealthiest Americans while doing so.
It is true that many want to misuse the crisis as an attack on the poor and the middle class; that potential abuse, however, is not an excuse to deny the fact of the crisis itself. It is simply no longer possible to responsibly deny that we are living through a transformative crisis that will change the character of America and much of the world. The drivers of that crisis are many and include technology and globalization. The effects are profound and won't be fully understood for decades. At present, the first consequence is a crisis of institutional authority.
We in the US have indeed lost faith in our basic institutions. We don't trust scientists who warn us about global warming; we doubt economists who warn us about debt; we deny doctors who tell us that vaccines are safe. Very few people trust politicians or Ph.D.s anymore. In fact, according to a 2009 General Social Survey, there are only two institutions in the United States that are said to have "a great deal" of confidence from the American people: the military and the police. This faith in the men with guns is, as Christopher Hayes writes in Twilight of the Elites, deeply disturbing. But it is not an illusion.
According to John Zogby, who spoke at the Hannah Arendt Center Conference last weekend, the crisis of faith in institutions is widespread and profound. Zogby said:
We call this the greatest economic crisis since the Great Depression and it is. But this is much more than that. This is a transformational crisis. Much more than simply the Great Depression, this is equivalent on the global stage to the fall of the Roman Empire. To the demise of feudalism. What we have at this moment in time is a myriad—if not almost all—of our familiar institutions unprepared to deal with multiple crises all at once. Whether it is the federal government or the near-bankrupt states or the Democratic Party or the Republican Party or the banking institutions or the brick-and-mortar halls of higher education. Whether it is the Boy Scouts of America or the Roman Catholic Church, a number of our institutions that make up the superstructure of our society are simply unprepared to deal with the force of change in which we find ourselves.
Zogby was not the only speaker at our conference who noted that "our minds as well as our institutions have not caught up with the failure that they represent." Tracy Strong pointed to the outdated capacity of political primaries, and Jeffrey Tulis spoke of the ways that Congress has, over the last century, increasingly abdicated its governmental and constitutional responsibilities. Institutions today spend more resources on self-sustenance (like fundraising) than on problem solving. Today our most important institutions are not only unable to solve the problems we face; the institutions have themselves become the problem.
Walter Russell Mead compared our current period to that era of American politics between 1865 and 1905. Mead noted that few people can name the presidents in that period not because of a failure of leadership but, rather, because in that period the U.S. was going through a cultural and societal transformation from, on one level, an agrarian to an urban-industrial society. We today are experiencing something equally if not more disruptive with globalization, technology, and the Internet. It is a mistake, Mead argued, to think that government or any group can understand and plan for such profound changes. There will be dislocations and opportunities, most of which are invisible today. While Mead offered optimism, he made clear that the years before the new institutions of the future emerge will be difficult and at times dark. There is little a president or a leader can do to change that.
Todd Gitlin and Anne Norton spoke of Occupy Wall Street and also the Tea Party as U.S. movements founded upon the loss of political and institutional power. Gitlin began with the widely quoted quip that the system is not broken, it's fixed—an expression that feeds upon the disaffection with mainstream institutions. Norton especially noted the difficulties of a movement that at once decries and yet needs governmental power. The one constant, she rightly noted, is that in a time of institutional decay, those with the least to lose will lose the most.
Rick Falkvinge, founder of the Swedish Pirate Party, situated his party precisely in the space of institutional distrust that Mead and Zogby described. Falkvinge noted that the primary value held by 17-year-olds today is openness and transparency, which he distinguished from free speech. While free speech respects the rights of government and the media to regulate and curate speech, the radical openness embodied by the new generation is something new. The Pirate parties, for example, follow the rule of three: if three members of the party agree on a policy, then that policy can become a platform of the party. There is no hierarchy; instead the party members are empowered to act. Like Wikileaks, with which it has strong affinities, the Pirate Party is built upon a profound distrust of all institutional power structures that might claim the authority to edit, curate, or distill what ought to be published or how we should govern ourselves.
Hannah Arendt wrote frequently about crises. "A crisis," she saw, "becomes a disaster only when we respond to it with preformed judgments, that is, with prejudices." The recent Arendt Center Conference sought to think about one particular crisis, namely the crisis of leadership in responding to the various crises that beset our age. It was born from the sense that we are increasingly confronting problems before which we cower helpless.
There are, of course, dangers and pitfalls in leadership. I too worry about calls for a leader to redeem us. That said, the coming seismic shifts in our world will bring great pain amidst what may be even greater opportunity. Without a workable political system that can recognize and respond to the coming changes with honesty and inspiration, chances are that our crises will morph into a disaster. Our president must matter, since we rarely accomplish anything meaningful without leadership. How a president might matter was the theme of the two-day conference.
If you missed the conference, or if you just want to review a few of your favorite talks, now is your chance. The Conference proceedings are online and can be found here. They are your weekend "read".
In this post, academics and university faculty will be criticized. Railing against college professors has become a common pastime, one practiced almost exclusively by those who have been taught and mentored by the very people now being criticized. It is thus only fair to say upfront that college education in the United States is, in spite of its myriad flaws, still of incredible value and meaning to tens if not hundreds of thousands of students every year.
That said, too much of what our faculties teach is neither interesting nor wanted by our students.
This is a point that Jacques Berlinerblau makes in a recent essay in the Chronicle of Higher Education.
Observers of gentrification like to draw a distinction between needs and wants. Residents in an emerging neighborhood need dry cleaners, but it's wine bars they really want. The application of that insight to the humanities leads me to an unhappy conclusion: Our students, and the educated public at large, neither want us nor need us.
What is amazing is that not only do our students not want what we offer, but neither do our colleagues. It is a staggering truth that much of what academics write and publish is rarely, if ever, read. And if you want to really experience the problem, attend an academic conference someday, where you will see panels of scholars presenting their work, sometimes to one or two audience members. According to Berlinerblau, the average audience at academic conference panels is fourteen people.
The standard response to such realizations is that scholarship is timeless. Its value may not be discovered for decades or even centuries, until someone, somewhere, pulls down a dusty volume and reads something that changes the world. There is truth in such claims. When one goes digging in archives, there are pearls of wisdom to be found. What is more, the scholarly process consists of the accumulation of information and insight over generations. In other words, academic research is like basic scientific research: of no immediate use, yet valuable in itself.
The problem with this argument is that such really original scholarship is rare and getting rarer. While there are exceptions, little original research is left to do in most fields of the humanities. Few important books are published each year. The vast majority are as derivative as they are unnecessary. We would all do better to read and think about the few important books (obviously there will be some disagreement and divergent schools) than to spend our time trying to establish our expertise by commenting on some small part of those books.
The result of the academic imperative of publish or perish is the increasing specialization that leads to knowing more and more about less and less. This is the source of the irrelevance of much of humanities scholarship today.
As Hannah Arendt wrote decades ago in her essay On Violence, humanities scholars today are better served by being learned and erudite than by seeking to do original research by uncovering some new or forgotten scrap. While such finds can be interesting, they are exceedingly rare and largely insignificant.
As a result—and this is hard to hear for many in the scholarly community—we simply don't need 200 medieval scholars in the United States, or 300 Rawlsians, or 400 Biblical scholars. It is important that Chaucer and Nietzsche are taught to university students; but the idea that every college and university needs a Chaucer scholar and a Nietzsche scholar to teach Chaucer and Nietzsche is simply wrong. We should, of course, continue to support scholars whose work is genuinely innovative. But more needed are well-read and thoughtful teachers who can teach widely and write for a general audience.
To say that excessively specialized humanities scholarship today is irrelevant is not to say that the humanities are irrelevant. The humanities are that space in the university system where power does not have the last word, where truth and beauty as well as insight and eccentricity reign supreme, and where young people come into contact with the great traditions, writing, and thinking that have made us who we are today. The humanities introduce us to our ancestors and our forebears and acculturate students into their common heritage. It is in the humanities that we learn to judge the good from the bad and thus where we first encounter the basic moral faculty for making judgments. It is because the humanities teach taste and judgment that they are absolutely essential to politics. It is even likely that the decline of politics today is profoundly connected to the corruption of the humanities.
Hannah Arendt argues precisely for this connection between the humanities and politics in her essay The Crisis in Culture. Part Two of the essay addresses the political significance of culture, which she relates to humanism—both of which are said to be of Roman origin. The Romans, she writes, knew how to care for and cultivate the grandiose political and artistic creations of the Greeks. And it is a line from Pericles that forms the center of Arendt's reflections.
The Periclean citation is translated (in part) by Arendt to say: "We love beauty within the limits of political judgment." The judgment of beauty, of culture, and of art is, Pericles says, limited by the political judgment of the people. There is, in other words, an intimate connection between culture and politics. In culture, we make judgments of taste and thus learn the faculty of judgment so necessary for politics. And political judgment, in turn, limits and guides our cultural judgments.
What unites culture and politics is that they are "both phenomena of the public world." Judgment, the primary faculty of politics, is discovered, nurtured, and practiced in the world of culture and the judgment of taste. What the study of culture through the humanities offers, therefore, is an orientation towards a common world that is known and understood through a common sense. The humanities, Arendt argues, are crucial for the development and preservation of common sense—something that is unfortunately all-too-lacking in much humanities scholarship today.
What this means is that teaching the humanities is absolutely essential for politics—and as long as that is the case, there will be a rationale for residential colleges and universities. The mania for distance learning today is understandable. Education is, in many cases, too expensive. Much could be done more cheaply and efficiently at colleges. And this will happen. Colleges will, increasingly, bring computers and the Internet into their curricula. But as powerful as the Internet is, and as useful as it is as a replacement for passive learning in large lectures, it is not yet a substitute for face-to-face learning that takes place at a college or university. The learning that takes place in the hallways, offices, and dining halls when students live, eat, and breathe their coursework over four years is simply fundamentally different from taking a course online in one's free time. As exciting as technology is, it is important to remember that education is, at its best, not about transmitting information but about inspiring thinking.
Berlinerblau thinks that what will save the humanities is better training in pedagogy. He writes:
As for the tools, let's look at it this way. Much as we try to foist "critical thinking skills" on undergraduates, I suggest we impart critical communication skills to our master's and doctoral students. That means teaching them how to teach, how to write, how to speak in public. It also means equipping them with an understanding that scholarly knowledge is no longer locked up in journals and class lectures. Spry and free, it now travels digitally, where it may intersect with an infinitely larger and more diverse audience. The communicative competences I extoll are only infrequently part of our genetic endowment. They don't come naturally to many people—which is precisely what sets the true humanist apart from the many. She or he is someone you always want to speak with, listen to, and read, someone who always teaches you something, blows your mind, singes your feathers. To render complexity with clarity and style—that is our heroism.
The focus on pedagogy is a mistake; it stems from the flawed assumption that the problem with the humanities is that professors aren't good communicators. It may be true that professors often communicate poorly, but the real problem is deeper. If generations of secondary school teachers trained in pedagogy have taught us anything, it is that pedagogical training by itself does not make good teachers. Authority in the classroom comes from knowledge and insight, not from pedagogical techniques or theories.
The pressing issue is less pedagogy than the fact that what most professors know is so specialized as to be irrelevant. What is needed is not better pedagogical training, but a broader and more erudite training, one that focuses less on original research and academic publishing and instead demands reading widely and writing aimed at an educated yet popular audience. What we need, in other words, are academics who read widely with excitement and inspiration and speak to the interested public.
More professors should be blogging and writing in public-interest journals. They should be reviewing literature rather than each other's books and, shockingly, they should be writing fewer academic monographs.
To say that the humanities should engage the world does not mean that the humanities should be politicized. The politicization of the humanities has shorn them of their authority and their claim to being true or beautiful. Humanities scholarship can only serve as an incubator for judgment when it is independent from social and political interests. But political independence is not the same as political sterility. Humanities scholarship can, and must, teach us to see and know our world as it is.
There are few essays that better express the worldly importance of the humanities than Hannah Arendt's The Crisis in Culture. It is worth reading and re-reading. On this hot summer weekend, do yourself that favor.
Architecture is at the center of politics. We can see the truth of this statement amidst the controversy over the post-war reconstruction of Beirut and the establishment of Solidere, the company created to redevelop the city. Reconstruction in Beirut does not mean simply the physical re-making and structuring of certain “sites of memory” scattered throughout the city. Rather, reconstruction is a political process parallel to the constant making and re-making of internal contestations of power and identity inside Lebanon since at least 1860.
The most important and widely studied case of reconstruction in Beirut is the famous Centre Ville or Beirut Central District undertaken by Solidere (discussed at length in “Beirut: Reinventing or Destroying the Public Space?”). Höckel points as well to the case of the southern suburbs and the Elyssar project, and to the role played by Hezbollah in different stages of reconstruction, namely 1983, 1996 and 2006. In this post, I look at the Elyssar project to develop Beirut's southern suburbs and coast. The project has been mired in delays for decades and exemplifies the blurry line between political projects, architecture, and private interests in postwar Lebanon.
The designation “southern suburb” has a negative connotation in Beirut, and is often used interchangeably with Shi’a Muslims, anarchy, squatters, illegality and poverty. The “suburbs”, formed by a permanent flow of rural migrants and later by both urban and rural refugees from the war, are homogenous and impoverished quarters of Beirut, consisting mostly of members of the Shi’a community, and comprise one third of the population of the greater Beirut area. At first the project was to be undertaken by Solidere, but after political contestation on the part of the residents and the Amal/Hezbollah parties, it was implemented by a public agency created after much negotiation, as per Decree No 9043 of August 1996.
The project was criticized for being based solely on economic considerations and for being too ambitious (the area is five times bigger than the central district), even though similar plans had already been tested and had failed in the Arab world. Yet it remained largely unmodified. Other issues arose, such as difficulties in land expropriation due to the illegality of building and dwelling in the area, and speculation over land value, in which all parties – Solidere, the Prime Minister’s Office and the local Amal/Hezbollah – withheld and manipulated information, leading to a political stalemate that permanently halted the project.
The project area extends over 586 hectares from the Summerland Resort and Sports City to the boundary of Beirut International Airport in the South. From East to West it extends from the Airport road to the Mediterranean Sea and includes a large portion of coastline – another contentious point for development and speculation.
Elyssar’s plan included the execution of all primary and secondary roads, necessary infrastructure and public services; the construction of over 10,000 units of affordable housing over a 14-year period; and manufacturing parks, warehouses and workshop centers. At the heart of the plan was also the familiar scenario of urban violence and displacement, in which residents of illegal settlements were to be transferred elsewhere.
The question of illegality and ownership in the area (and, to a certain degree, everywhere else in Lebanon) is complex and nowhere near resolution. In a 2007 case study, Nadine Khayat writes:
The Lebanese state has mostly continued to adopt a non-interventionist strategy toward these areas in Beirut; in fact, many describe the southern suburbs of Beirut as a state within the state, having its own conservative jurisdictions that may arguably be excluding factions and other communal groups present in Lebanon.
The state faced the question of illegal settlements in an area almost entirely controlled politically by Amal/Hezbollah, with the exception of a Maronite minority at the fringes. The hostilities between the state and the militias go back to tensions between 1983 and 1984, when President Gemayel ordered the demolition of illegal neighborhoods in the suburb.
The demolition met resistance from the residents, who had the support of Amal, and it remains a reminder of the state’s ill will toward the area; it in turn led to yet another extension of the war.
The suburbs fall under the definition of ‘slums’ and ‘illegal settlements’. They have been a recurrent nightmare in Beirut’s reconstruction plans because of the absence of planning bodies, uncontrolled migration and growth, and, lastly, the lack of appropriate mapping of the slums, given the political control of para-state bodies in the area.
Mona Fawaz and Isabelle Peillen’s 2003 study, “The Case of Beirut, Lebanon”, part of “Understanding Slums: Case Studies for the Global Report on Human Settlements”, lays out the problem: “Given its complex history, the limited legalities in property rights, and the widespread violation in building and construction codes, it is difficult to adopt legality as a criterion for slum identification in Beirut.”
They further add: “To date, Lebanese public policies have never concretely addressed slums and their dwellers, despite a reasonable number of studies dedicated to the issue. Laissez-faire has been the rule, although punctuated by violent incidents of eviction.” The sole exception to this had been, of course, the Elyssar project (Public Agency for the Planning and Development of the South Western Suburbs of Beirut). However, that failed time and again, not only because of inadequate funding but also because of the status quo of postwar reconstruction, in which confessional factions battle each other for power.
All the information relevant to the negotiations and contestations in the early phase of the Elyssar project is found in detail in Mona Harb’s “Urban Governance in Post-War Beirut: Resources, Negotiations and Contestations in the Elyssar Project.”
Here it is important to highlight the role that Hezbollah/Amal have played in the contestations and negotiations between the Lebanese state and the suburbs. While they have significantly added to the political stalemate of the project, they have also transformed the public space of the suburbs through an intricate network of surveillance, social services, political participation and cultural activities, in a way that the Lebanese state has been incapable of offering, particularly in this disadvantaged area.
The characterization of Hezbollah in the Lebanese context is very difficult, and while it is not the topic of this essay, the work of Mona Harb and Reinoud Leenders (“Know thy enemy: Hezbollah, terrorism and the politics of perception”) provides a framework for understanding the role of the group inside the urban configuration of the suburbs as a distinct territory of identity. It is important to note that understanding the group merely as a terrorist organization, or merely as a part of the Lebanese institutions, are both flawed perspectives that blur the heterogeneous nature of para-state actors in Lebanon.
The animosity between investors and government institutions on the one hand, and Hezbollah on the other hand, confirms Harb’s observation:
“While urban politics present themselves as a means for development, they are actually strategies for territorial domination” (see Harb’s “La Dahiye de Beyrouth: Parcours d’une stigmatisation urbaine, consolidation d’un territoire politique”).
The conquest of the public space, and the eventual colonization and closure of its history – of what is supposed to be found and remembered in it – is, in Lebanon, the equivalent of political hegemony.
Architectural interventions and urban planning play a pivotal role in the configuration of the public space as the stage where politics appears, which brings to mind Daniel Libeskind’s observation that “the public and political realm… is synonymous with architecture.”
The general lines for a discussion of the role of politics in architecture, and of architecture in politics, have yet to be drawn, with the exception of economic considerations and the problem of technology – as a counterpart of history – in weakening effective participation in democracy through the excessive technification and functionalism of labor. Nevertheless, the necessity of an architectural configuration of the public space, in which the world emerges between people, calls for a review of what Hannah Arendt conceived as the “space of appearances” in radically architectural terms.
Ronald Beiner writes in “Our Relationship to Architecture as a Mode of Shared Citizenship: Some Arendtian Thoughts”:
The fundamental categories of Arendt’s political philosophy, such as worldliness and public space or “space of appearances”, are architectural ones (one can see this in how certain architectural theorists and even practitioners respond to her work). Hence, precisely where one encounters limits in trying to apply her political philosophy to politics, one can perhaps redeem her political philosophy by applying it to architecture.
For Hannah Arendt, the world – the space of politics – is the only place where we can appear to others in order to act, and it is this action that constitutes the basic units of power – which is always political – and that redeems the world from both the biological – and mortal – cycle of life.
Beiner makes an interesting argument in this regard: the now popular notion of public reason, found in Rawls and Habermas, operates on considerations of constitutional structure and political order that are relevant only to political elites. Public space, by contrast, is relevant to all citizens; accordingly, public reason is less important than public space.
Arendt was increasingly concerned with the durability of the world as a stable artifice, where human action gains some sort of immortality. As Beiner noted, “That this sort of immortalizing function is implicit in architecture as the creation of a lasting habitat and a more durable context for human activities is not a surprise.”
World-oriented experiences were at the core of Arendt’s thinking about the nature and possibilities of the political. Here we encounter an obvious tension between hegemony and worldliness, in that spatiality or space is not the determining factor in the existence of a public world, but the guarantee that it can appear. Beiner shifts the emphasis from public space as a setting for episodic freedom to public space as a public good in which civic experience can take place, resting on a notion of citizenship in which "public things matter."
The notion of cynicism that is so widely discussed in politics can also be found in architecture. Beirut is a first-hand example of what Andrew Benjamin calls “architecture as annihilation,” in the context of museumification rather than reconstruction for the sake of reconciliation.
As long as the public space in Lebanon continues to be the battleground of hegemony and power, urban architecture and planning will reflect that. Beiner explains, “If the effect of an ensemble of architectural creation is not the constitution of some kind of polis, at least ideally, then the idea of architecture as a source of citizenship is a hollow one.”
Cynicism is here embodied in the notion that in reconstruction it is only economic growth and prosperity that will bring peace to a country devastated by war. However, Hannah Arendt warns, “Economic growth may one day turn out to be a curse rather than a good, and under no conditions can it either lead into freedom or constitute a proof of its existence.”
There is probably no presidential speech more quoted in academic circles than Dwight D. Eisenhower's 1961 farewell address, delivered in the final days of his presidency. It was in that speech that Eisenhower warned of the danger of a military-industrial complex.
The need for a permanent army and a permanent arms industry creates, he writes, a gargantuan defense establishment that would wield an irresistible economic, political, and spiritual influence. In the face of this military-industrial complex, we as a nation must remain vigilant.
In the councils of government, we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex. The potential for the disastrous rise of misplaced power exists and will persist.
Eisenhower's speech was prescient. Academics in particular love to cite it to criticize bloated defense spending and to point to the need for critical resistance to military demands for more weapons and more soldiers. They are undoubtedly right to do so.
This is true even as, today, the military may be the one significant institution in American life where top leaders are arguing that America's world preeminence is not sustainable. In Edward Luce's excellent new book Time to Start Thinking, he describes how military leaders are convinced that the U.S. "should sharply reduce its 'global footprint' by winding up all wars, notably in Afghanistan, and by closing peacetime military bases in Germany, South Korea, the UK, and elsewhere." The military leaders Luce spoke to also said that the US must learn to live with a nuclear Iran and "stop spending so much time and resources on the war against Al-Qaeda." Military leaders, Luce reports, are upset that "In this country 'shared sacrifice' means putting a yellow ribbon around the oak tree and then going shopping." Many military people seem to share Admiral Michael Mullen's view that the US national debt is the "country's number one threat"—greater than that posed by terrorism, by weapons of mass destruction, and by global warming. One must think hard about the fact that military leaders see the need for "shared sacrifice" that will shrink the military-industrial complex while Americans and their elected leaders still speak about tax cuts and stimulus.
Too frequently forgotten, or simply overlooked, is the fact that Eisenhower follows his discussion of the military-industrial complex with a similar warning about the dangers of a "revolution in the conduct of research." Parallel to the military-industrial complex is the danger of a university-government complex. (Hat tip: Tom Billings; see comments.) Eisenhower writes:
Akin to, and largely responsible for the sweeping changes in our industrial-military posture, has been the technological revolution during recent decades. In this revolution, research has become central; it also becomes more formalized, complex, and costly. A steadily increasing share is conducted for, by, or at the direction of, the Federal government.
Today, the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists in laboratories and testing fields. In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. For every old blackboard there are now hundreds of new electronic computers.
Just as modern warfare demands a huge and constant arms industry, so too does the technological revolution demand a huge and constant army of researchers and scientists. This army can only be organized and funded by government largesse. There is a danger, Eisenhower warns, that the university-government complex will take on a life of its own, manufacturing unreal needs (e.g. a Bachelor of Arts degree in order to manage an assembly line) and liberally funding research with little regard to quality, meaning, or need. While the university-government complex is not nearly as expensive or dangerous as the military-industrial complex, there is little doubt that it exists.
Eisenhower warns of a double threat of this university-government complex. First, the nation's scholars could be dominated by Federal employment, and gear their research to fit with governmental mandates. And second, the opposite danger, that "public policy could itself become the captive of a scientific-technological elite."
The existence and power of just such a scientific-technological elite is undeniable today. On the one side are the free-market ideologues, those acolytes of Friedman, Hayek, and Coase, who insist that policy be geared towards rational, self-regulating, economic actors. That real people do not conform to theories of rational behavior is a problem with the people, not the theories.
On the other side are the welfare-state adherents, who insist on governmental support not only for the poor, but also for the working classes, the bankers, and corporations. The sad fact that 50 years of anti-poverty programs have not alleviated poverty, or that record amounts of money spent on education have coincided with declining rather than rising educational attainment, is seen as no argument for the failure of technocratic-governmental solutions. It just means more money and more technical know-how are needed.
It is simply amazing that people in academia can actually defend the current system that we are part of. Of course there are good schools and fine teachers and serious students. But we all know the system is a failure. Graduate students are without prospects; faculty spend so much time publishing articles and books that no one reads; administrators earn ever more (sometimes twelve times as much as full professors) and come more and more to serve as the lifeblood of universities; and it is the rare student who, amidst the large classes, absent faculty, and social and financial pressures, somehow makes college an intellectual experience.
The idea and practice of college needs to be re-imagined and re-thought. Entrenched interests will oppose this. But at this point the system is so broken that it simply cannot survive. On a financial level, large numbers of universities are being kept afloat by the largesse of federal student loans. If those loans were to disappear or dry up, many colleges would close or at the least shrink greatly. This should not happen. And yet, putting our young people $1 trillion in debt is not an answer. For too long we have been paying for our lifestyles with borrowed money. We are now used to our inflated lifestyles and unwilling to give them up. Something will have to give.
The current cost of a college education is unsustainable except for the very top schools that attract the very richest students who then fund endowments that allow those schools to subsidize economic, national, and racial diversity. For schools that cannot attract the wealthiest or do not have endowments that protect them from market forces, change will have to come. This will mean, in many instances, faculty salaries will decrease and costs will have to come down. In other colleges, costs will rise and university education will be ever less accessible. Either way, the conviction that everyone needs a liberal arts degree will probably be revised.
I have no crystal ball showing where this will all lead. But there are better and worse ways that the change will come, and I for one hope that if we turn to honestly thinking about it in the present, the future will be more palatable. This is the debate we need to have.