Where to start?
There is probably no question more debated in the course of the Middle Eastern uprisings than that of the status of human rights. Anyone familiar with the region knows that the status of human rights in the Middle East is at best obscure. The question of why there was no “revolution” in Lebanon is a very complex one, tied to the fate of Syria and to the turbulence of Lebanese politics since the end of the civil war, and hence cannot be fully answered here. In a vague sense it can of course be said that Lebanon is the freest Arab country and that as such it bears a distinctively different character.
While at face value the statement is true, being “freer than” one’s neighbors in the Middle East simply understates the problem. Just to outline the basic issues, Lebanon’s record on human rights has been a matter of concern for international watchdogs on the following counts:
Security forces arbitrarily detain and torture political opponents and dissidents without charge; different groups (political, criminal, terrorist, and often a combination of the three) intimidate civilians in areas where the presence of the state is at best weak; freedom of speech and of the press is severely limited by the government; Palestinian refugees are systematically discriminated against; and homosexual intercourse is still considered a crime.
While these issues remain at the level of the state, in society a number of other issues are prominent: abuse of domestic workers; racism (for example, excluding people of color and maids from the beaches); violence against women; and homophobia, which recently extended to a homophobic rant in a newspaper of the prestigious American University of Beirut. The list could go on forever.
The question of gay rights in Lebanon remains somewhat paradoxical. On the one hand, article 534 of the Lebanese Penal Code explicitly prohibits homosexual intercourse on the grounds that it “contradicts the laws of nature,” and makes it punishable with prison. On the other hand, Beirut – and Lebanon – has remained, against all odds and for centuries, a safe haven for many people in the Middle East fleeing persecution or looking for a more tolerant lifestyle.
That of course includes gays and lesbians, and it is not uncommon to hear of gay parties held from time to time in Beirut’s celebrated clubs. At the same time, enforcement of the law is sporadic and, like everything in Lebanon, it might happen and it might not; it is best to read the horoscope in the morning and pray for good luck. A few pro-LGBT NGOs have been created in the country since the inception of “Hurriyyat Khassa” (Private Liberties) in 2002.
In 2009 the Lebanese LGBT organization Helem launched a ground-breaking report on the legal status of homosexuals in the entire region, which documented a ruling in which a Lebanese judge rejected the use of article 534 to prosecute homosexuals.
It is against the background of this turbulent scenario that Samer Daboul’s film “Out Loud” (2011) came to life, putting together an unusual tale of friendship and love set in postwar Lebanon, in which five friends and a girl set out on a perilous journey to find their place in the world.
Though the plot of the film seems simple, underneath the surface lurks a challenge to the traditional morals and taboos of Lebanese society – homosexuality, the role of women, the troubled past of the war, delinquency, crime, honor – a challenge that marks a turning point for Lebanese cinema.
This wouldn’t be so important in addressing the question of rights and freedoms in Lebanon were it not for a documentary, “Out Loud – The Documentary,” released together with the film, which records in detail the ordeal the director, actors and crew had to go through in order to complete this film.
Shot in Zahlé, in the mountainous heartland of Lebanon – what the director called “a city and a nation of conservatism and intolerance” – the production, as the documentary reports, was met from the very beginning with the same angry mobs, insults, and physical injuries that the film itself so vehemently tried to overcome: a commercial film about family violence, gay lovers, and the boundaries of relationships between men and women. A film not about the Lebanon of fifteen or twenty years ago, but about the Lebanon of here and now.
Daboul writes: “Although I grew up in the city in which “Out Loud” was filmed, even I had no idea how difficult it would be to make a movie in a nation plagued by violence, racism, sexism, corruption and a lack of respect for art and human rights.” The purpose of “Out Loud,” of course, wasn’t only to make a movie but to create a school of life, in which the maker, the actors and the audience could all have a peaceful chance to re-examine their own history and future.
Until very recently, in lieu of a public space in Lebanon, any conflict was settled by shooting, kidnapping and blackmail at the hands of armed militias spread throughout the country and acting in the name of the nation.
The wounds have been very slow to heal as is no doubt visible from the contemporary political panorama. Recently, a conversation with an addiction counselor in Beirut revealed the alarming statistics of youth mental illness, alcoholism and drug addiction across all social classes in Lebanon, to which I will devote a different article.
Making films in Lebanon is an arduous process: filmmakers not only receive no support from the state but are also subject to an enormous censorship bureaucracy intent on making sure that the content of films does not run counter to the religious and political sensibilities of the state. In the absence of strong state powers, the regulations are often malleable, looking after the sensibilities of political blocs and religious leaders rather than state security, if any such thing exists.
The whole idea of censorship of ideas is intimately intertwined with the reality of freedom and rights and with the severe limitations – both physical and intellectual – placed upon the public space.
In the Middle East, censorship of a gay relationship is an established practice meant to protect public morality; yet the daily news – running from theft to murder to kidnapping to abuse to rape to racism – does not require much censorship and is usually consumed by the very same public.
If there is one thing one can learn from Hannah Arendt about freedom of speech, it is, as Roger Berkowitz writes in “Hannah Arendt and Human Rights”:
The only truly human rights, for Arendt, are the rights to act and speak in public. The roots for this Arendtian claim are only fully developed five years later with the publication of The Human Condition. Acting and speaking, she argues, are essential attributes of being human. The human right to speak has, since Aristotle defined man as a being with the capacity to speak and think, been seen to be a “general characteristic of the human condition which no tyrant could take away.”
Similarly, the human right to act in public has been at the essence of human being since Aristotle defined man as a political animal who lives, by definition, in a community with others. It is these rights to speak and act –to be effectual and meaningful in a public world – that, when taken away, threaten the humanity of persons.
While these ideas might seem oversimplified and rather vague in a region “thirsty” for politics, they establish a number of crucial distinctions that must be taken into account in any discussion about human rights. Namely:
1) The failure of human rights is a fundamental fact of the modern age
2) There is a distinction between civil rights and human rights, the latter being what people resort to when the former have failed them
3) It is the fact that we appear in public and speak our minds to our fellow men that ensures that we live our lives in a plurality of opinions and perspectives, and this is the ultimate indicator of a life lived with dignity.
Even if we have a “right” to a house, to an education and to a citizenship (that is, to belonging to a community), if we do not have the right to speak and act in public and express ourselves (as homosexuals, women, dissidents and what not) we are not being permitted to become fully human. Regardless of the stability of political institutions and the provision of basic needs and security, there is no such thing as a human world – a human community – in the absence of the possibility of appearing in the world as what we truly are.
“Out Loud” – both the film and the documentary – is a testimony to the degree to which the many elements composing the multi-layered landscape of Lebanese society are at tremendous risk of worldlessness, being subject to an authority that relies on violence in lieu of power. Power and violence couldn’t be more opposite.
Hannah Arendt writes in her journals:
Violence is measurable and calculable and, on the other hand, power is imponderable and incalculable. This is what makes power such a terrible force, but it is there precisely that its eminently human character lies. Power always grows in between men, whereas violence can be possessed by one man alone. If power is seized, power itself is destroyed and only violence is left.
It is always the case in dark times that peoples – and also the intellectuals among them – put their entire faith in politics to solve the conflicts that emerge in the absence of plurality and of the right to have rights, but nothing could be more mistaken. Politics cannot save, cannot redeem, cannot change the world. Just like the human community, it is something entirely contingent, fragile and temporary.
That is why no decisions made at the level of government and policy can replace the spontaneity of human action and appearance. It is here that the immense worth of “Out Loud” lies: in enabling a generation that is no longer afraid of hell – for whatever reason – to have a conversation. It is there that the rehabilitation of the public space is at stake, not in building empty parks that museumify a troubled past, as has so often been the case in Beirut. In an open conversation, people will continue contesting the legacy and appropriating the memory, not as a distant past, but as their own.
The case of Lebanon remains precarious. Lebanon’s clergy has recently united in a call for more censorship, and today it was revealed that the security services summon people for interrogation over what they have posted on their Facebook accounts. HRW has condemned the performance of homosexuality tests on detainees in Lebanon, although this at least sparked a debate: a discussion on the topic ensued at the seminar “Test of Shame” held at Université Saint-Joseph in Beirut, and the Lebanese Medical Society concluded in its own discussion that those tests are of no scientific value.
In a country like Lebanon, plagued by decades of war and violence, as Samer Daboul has said of his film, people are more often than not engaged in survival and just that – surviving from one war to another, from one ruler to another, from one abuse to another – and as such, the responses of society to the challenges of the times are of an entirely secondary order. But what he has done in his films is what we, those who still have a little faith in Lebanon, should take as a principle: “It’s time to live. Not to survive.”
“The human condition comprehends more than the condition under which life has been given to man. Men are conditioned beings because everything they come in contact with turns immediately into a condition of their existence. The world in which the vita activa spends itself consists of things produced by human activities; but the things that owe their existence exclusively to men nevertheless constantly condition their human makers.”
-Hannah Arendt, The Human Condition, 1958, p. 9
The human condition is the context or situation we, as human beings, find ourselves in, the implication being that human life cannot be fully understood by considering humanity in isolation from its environment. We are, to a large degree, shaped by our environment, which is why Arendt refers to us as conditioned beings.
We are conditioned by phenomena external to us, and this may be considered learning in its broadest sense, that is, in the sense that the Skinnerian conditioned response is a learned reaction to external stimuli. It follows that any form of life that is capable of modifying its behavior in response to external stimuli is, to some extent, a conditioned being.
On a grander scale, natural selection, as it is popularly understood, can be seen as a conditioning force. Survival of the fittest is survival of those best able to adapt to existing external conditions, survival of those best able to meet the conditions of their environment. The fittest are, quite naturally, those in the best condition, that is, the best condition to survive. Whether we are considering the effects of natural selection upon an entire species, or individual members of a species, or what Richard Dawkins refers to as the selfish gene, the environment sets the conditions that various forms of life must meet to survive and reproduce.
Such views are inherently incorrect insofar as they posit an artificial separation between the conditions of life and the form of life that is conditioned. An ecological or systems view would instead emphasize the interdependent and interactive relationships that exist, as all forms of life alter their conditions simply by their very presence, by their metabolism, for example, and through their reproduction. Darwin understood this, I hasten to add, and the seeds of ecology can be found in his work, although they did not fully germinate until the turn of the 20th century. And Skinner certainly was aware of the individual’s capacity for self-stimulation, and self-modification, but a truly relational approach in psychology did not coalesce until Gregory Bateson introduced a cybernetic perspective during the 1950s.
In the passage quoted above, it is readily apparent that Arendt is an ecological thinker. In saying that, “the things that owe their existence exclusively to men nevertheless constantly condition their human makers,” she is saying that we create the conditions that in turn condition us. We exist within a reciprocal relationship, a dialogue if you like, between the conditioned and the conditions, the internal and the external, the organism and its environment. The changes that we introduce into our environment, that alter the environment, feedback into ourselves as we are influenced, affected, and shaped by our environment.
The contrast between using tools and techniques in the most basic way to adapt to the conditions of the environment, and creating an entirely new technological environment of great complexity that requires us to perform highly convoluted acts of adaptation, was portrayed with brilliant sensitivity and humor in The Gods Must Be Crazy, the 1980 South African film directed by Jamie Uys. A good part of the documentary-style opening can be seen in this YouTube clip:
The story of the Coke bottle, although fictional, follows the pattern of many documented cases in which the introduction of new technologies to traditional societies has had disruptive, and often enough, disastrous effects (the film itself, I hasten to add, is marvelously comedic, and quite often slapstick following the introductory quarter hour.)
The understanding that we are conditioned by the conditions we ourselves introduce was not unknown in the ancient world. The 115th Psalm of David, in its polemic against idolatry and the idols that are “the work of men’s hands,” cautions that “they who make them shall be like unto them; yea every one that trusts in them.” Along the same lines, the Gospel of Matthew includes the famous quote, “all those who take up the sword shall perish by the sword,” while the Epistle to the Galatians advises, “whatsoever a man sows, that shall he also reap.” A more contemporary variation of that maxim is, “as you make your bed, so you shall lie on it,” although in the United States it is often rendered in the imperative and punitive form of, “you made your bed, go lie in it!” During the 19th century, Henry David Thoreau notified us that “we do not ride on the railroad; it rides upon us,” while Mark Twain humorously observed that, “if all you have is a hammer, everything looks like a nail.” More recently, we have been told, “ask a silly question, get a silly answer,” to which computer scientists have responded with the acronym GIGO, which stands for, “garbage in, garbage out.” Winston Churchill said, “we shape our buildings, and thereafter they shape us,” and former Fordham professor John Culkin, in turn, offered, “we shape our tools, and thereafter they shape us,” as a corollary to Marshall McLuhan’s media ecology aphorism, “the medium is the message.”
All of these voices, in their varying ways, are pointing to the same essential truth about the human condition that Arendt is relating in the quote that begins this post. And to pick up where that quote leaves off, Arendt goes on to argue,
In addition to the conditions under which life is given to man on earth, and partly out of them, men constantly create their own, self-made conditions, which, their human origin and their variability notwithstanding, possess the same conditioning power as natural things.
The “conditions” that we make are used to create a buffer or shield against the conditions that we inherit, so that our self-made conditions are meant to stand between us and what we would consider to be the natural environment. In this sense, our self-made conditions mediate between ourselves and the pre-existing conditions that we operate under, which is to say that our conditions are media of human life. And in mediating, in going between our prior conditions and ourselves, the new conditions that we create become our new environment. And as we become conditioned to our new conditions, they fade from view, being routinized they melt into the background and become essentially invisible to us.
Let us return now for the conclusion of the passage from The Human Condition:
Whatever touches or enters into a sustained relationship with human life immediately assumes the character of a condition of human existence. This is why men, no matter what they do, are always conditioned beings. Whatever enters the world of its own accord or is drawn into it by human effort becomes part of the human condition. The impact of the world’s reality upon human existence is felt and received as a conditioning force. The objectivity of the world—its object- or thing-character—and the human condition supplement each other; because human existence is conditioned existence, it would be impossible without things, and things would be a heap of unrelated articles, a non-world, if they were not the conditioners of human existence.
This last point is quite striking. It is we, as human beings, who create worlds, which brings to mind the moving commentary from the Talmud: “whoever saves a life, it is considered as if he saved an entire world.” We create worlds, in the sense that we give meaning to existence, we attribute meaning to phenomena, we construct symbolic as well as material environments. Each one of us, in our singular subjectivity, creates a world of our own, and therefore each one of us represents a world unto ourselves.
But these individual worlds are links, nodes in a social network, interdependent and interactive parts of an ecological whole. The term condition, in its root meaning is derived from the Latin prefix com, which means together, and dicere, which means to speak. And our ability to speak together, to engage in discussion and deliberation, to enter into symbolic interaction, constitutes the means by which we collectively construct our intersubjective, social realities, our worlds.
As human beings, we are conditioned not only by our labor, the ways in which we obtain the necessities of life, i.e., air, water, food, shelter, to which Marx sought to reduce all aspects of society, a position that Arendt severely criticized. We are conditioned not only by our work, which Arendt associated with artifacts, with instrumentality and technology, with arts and crafts. We are conditioned most importantly by action, which in Arendt’s view is intimately tied to speech and the symbolic, and to processes rather than things, to relations rather than objects.
In the end, Arendt reminds us that the human condition is itself conditional, and to be fully human requires not only that we take care of biological necessity, nor that we make life easier through technological innovation, but that we cooperate through speech and action in collectively constructing a world that is truly blessed with freedom and with justice.
One of my favorite images in Arendt’s writings comes not from Arendt herself, but from her citation of the poem “Magic” by Rainer Maria Rilke. Rilke’s poem reads (in an approximate translation):
From indescribable transformation originate
Amazing shapes. Feel! Trust!
We suffer often: To ashes turn our flames;
Yet art can set on fire the dust.
Magic is here. In the realm of enchantment
The ordinary word appears elevated
But sounds as real as if the dove called
To seek its invisible mate.
Arendt cites Rilke’s poem in the final section of the chapter on Work in The Human Condition. It is part of her discussion of art and her claim that “the immediate source of the art work is the human capacity for thought.”
Art, Arendt writes, has its foundation in thinking. Works of art, she argues, are “thought things.” They are thingifications of thoughts or, to use a word that is so often abused, reifications of thoughts – the making of thoughts into things. It is this process of transformation and transfiguration that Rilke captures in “Magic”: to “set fire to the dust” and bring beauty and truth to the real world. That is what art does.
My mind turned to Rilke’s poem as I watched the great South African artist William Kentridge deliver the first of his 2012 Norton Lectures at Harvard University.
Kentridge spoke in praise of shadows, and situated his talk within a reading of Plato’s allegory of the cave in Book VII of the Republic. The story of the cave begins with prisoners, shackled and immovable, who see shadows projected by a fire onto a wall. Then one prisoner sets himself free, climbs out into the light of the sun and, slowly, painfully, comes to recognize that the shadows were indeed shadows, untrue. The parable illustrates the error of sensible things and is one part of Plato’s illustration of his theory of ideas. The ideas, supersensible truths of reason and logic, do not deceive and change like the shadowy things of the world. Only what lasts eternally is true; all that is sensible and fleeting is false.
Kentridge tells the story of Plato’s cave to explain why he sees art, and especially his art, in opposition to the Platonic idea of truth. If Plato celebrates the primacy of the eternally true over the shadows, Kentridge argues that art elevates the image above the truth. For this reason, at least in part, Kentridge’s art works with shadows. Shadow figures and shadow puppets.
Kentridge lauds shadows. In the very limitations of the shadows, in the gaps, in the gaps that inspire in us leaps to complete an image, that is where we think and learn. The leanness of the illusion pushes us to complete the recognition. It is in shadows that we find our agency in apprehending the world.
Shadow art is, for Kentridge, political. Plato’s politics depends on a truth known and understood by the few and then imposed on the many. In this sense philosophy is, in Arendt’s words, opposed to politics: philosophers must either seek merely to be left alone by the people (which is difficult because philosophers are dangerous), or they will always seek to dominate and tyrannize the polity with their reason. Arendt’s lifelong battle was to free politics from the certainty of rational and philosophical truth, to open us to a politics of opinion and openness.
Knowledge is power and there is, in Kentridge’s words, a relation between knowledge and violence. Kentridge embraces shadows and silhouettes to oppose the philosophical and Platonic tyranny of reason. He writes elsewhere:
I am interested in a political art, that is to say an art of ambiguity, contradiction, uncompleted gestures and uncertain ending – an art (and a politics) in which optimism is kept in check, and nihilism at bay.
Optimism must be kept in check since any certainty about the destination can underwrite the need for violence to bring others to that end. For Kentridge, “There is no destination. All destinations, all bright lights, arouse our mistrust.”
Kentridge offers us an image of the artist. He speaks from the studio and from his notebook to emphasize the source of artistic truth in the thought image rather than the logical word. An artist thinks. He sees. He makes art. He makes things that reflect not truth and certainty but gaps, misgivings, and questions. Kentridge gives reality to the questionability of the world in his shadow art. In this way his art reminds us of the magic of Rilke’s fire that transfigures dust into flame.
Few modern artists work magic like William Kentridge. His Norton Lectures are a great introduction to his art and the thinking behind his art. If you are not graduating this weekend, take the time to hear and look at what Kentridge says and makes.
You can view Kentridge’s First Norton Lecture here. Consider it your visual weekend read.
How many times can we watch the latest European movie? Once again Europe is buckling under the weight of debt and austerity. And once again, Greece, the birthplace of democracy, has led the democratic leaders of Europe to shun their responsibilities and beg for technocratic saviors.
As the Financial Times reports, European leaders are as bankrupt as their economies and they are seeking to be bailed out politically and economically by Mario Draghi, the unelected President of the European Central Bank.
To the frustration of Mario Draghi, its president, the European Central Bank is once again being eyed as a possible saviour of Europe’s monetary union. Since he became president last November, Mr Draghi has urged bolder action by politicians to strengthen public finances and build effective “firewalls” against spreading crises. Earlier this month he scolded governments for creating a European Financial Stability Facility that “could hardly be made to work”. He saw the unelected ECB’s role as strictly limited. Instead, eurozone politicians, led by François Hollande, France’s new president, have sought to turn the tables, demanding action from Frankfurt.
As one person the FT quotes says, “There is a constant frustration at the ECB with politicians.” Sounds familiar. It is not only in Europe that politicians have refused to lead and take responsibility for solving our growing and increasingly insoluble problems.
It is easy to blame politicians. But keep in mind, they are elected. And that may be the problem. For it is we, those self-interested and apparently spoiled folks who elect them, who refuse to consider tax increases or austerity, or both, which would actually be necessary to bring our financial houses into order. This is especially true in Greece, where voters have repeatedly refused to honestly and pragmatically accept the reforms needed to right the ship of the Greek state.
Which is why Amartya Sen’s Op-Ed in the NY Times Tuesday sounds so shrill. Sen rightly sees that democracy in Europe is being replaced by technocratic fiat, and this understandably bothers him. He writes:
Perhaps the most troubling aspect of Europe’s current malaise is the replacement of democratic commitments by financial dictates — from leaders of the European Union and the European Central Bank, and indirectly from credit-rating agencies, whose judgments have been notoriously unsound.
But Sen’s response is out of touch. On his view, if the Greeks were just given an opportunity to publicly discuss the matter and engage in rational public discourse, they would be able to take appropriate steps. In his own words:
Participatory public discussion — the “government by discussion” expounded by democratic theorists like John Stuart Mill and Walter Bagehot — could have identified appropriate reforms over a reasonable span of time, without threatening the foundations of Europe’s system of social justice. In contrast, drastic cuts in public services with very little general discussion of their necessity, efficacy or balance have been revolting to a large section of the European population and have played into the hands of extremists on both ends of the political spectrum.
This is of course a good point. It would be best if the Greeks were to engage in participatory public discussion leading toward appropriate reforms. But this doesn’t seem to be happening, resulting in the draconian cuts which, yes, are revolting to a large section of the European population. Unmentioned is the fact that other Europeans are revolted by the fact that Greeks have for years worked pitifully few hours in comparison to other Europeans, have paid significantly lower taxes, and have supported a political patronage system that creates an untouchable class of political bureaucrats who live well for doing very little. The New York Times reports today on the class of Greek plutocrats who for years have avoided taxes and now, when times are tough, are abandoning their philanthropies in Greece and secreting their money away to tax havens. If the Greeks won’t help Greece, why should the Germans?
The point is not simply to punish the Greeks for their past transgressions (although that too is not out of place), but that Germans no longer trust the Greeks and refuse to go on paying for their profligate ways. And since the Greeks have not and seemingly will not democratically make the changes to their lifestyles that their economic position requires, they are putting their hopes in unelected technocrats at the European Central Bank to save them.
The Greeks are not alone in seeking to trade democracy for technocracy. As I have written here and here and here over the past months, the trend toward technocratic governance is growing as people around the world lose faith not simply in democratic leaders, but in democracy itself. Around the world, democracies are electing politicians who hand off power to unelected technocrats. This is happening in Europe and also here in the U.S. I am sure some of “the people” disagree with this. But more and more seem fine with it.
It is easy to blame the politicians, just as it has been easy for years to blame the press. But as Edward Luce writes in his recent book Time to Start Thinking, the real problem with democracy is us. He focuses on America:
Americans reflexively single out Washington, D.C., as the cause of their ills. As this book will explore, however, Washington’s habits are rooted in American society. Blaming politicians has turned into a lazy perennial of modern American life.
The problem as Luce sees it is that left and right are caught in a thoughtless nostalgia for a golden age that no longer exists. The left, he writes,
yearns for the golden age of the 1950s and ’60s when the middle class was swelling and the federal government sent people to the moon. Breadwinners worked eight hours a day in the factory and could bank on “Cadillac” health care coverage, a solid urban or suburban lifestyle, and five weeks’ vacation a year.
On the other side, the right is nostalgic for
the godly virtues of the Founding Fathers from whom their country has gravely strayed. People stood on their own two feet and upheld core American values. It was a mostly small town place of strong families, where people respected the military and were involved in their community churches.
Luce understands not only that both these visions are nostalgic, but that they are preventing us from thinking honestly and seriously about our present. The problem is that thinking honestly today requires accepting sacrifice.
The wealthy and the upper middle classes (not just the 1% but the top 20%) will have to pay more in taxes. The poor and the middle classes will have to receive smaller pensions, work longer, and get fewer governmental services in return. Maybe citizens will have to do public service. Standards of living across the board will be hit. This is the payback for decades of debt-infused living that we all need to confront. Luce is right, it is time to start thinking.
“The Garden of the Prophet”, Lebanese poet Khalil Gibran’s posthumously published book, includes “Pity the Nation”, his most famous poem, which ends with the following stanza: “Pity the nation divided into fragments, each fragment deeming itself a nation.”
“Pity the Nation” might well be an eight-stanza history of Lebanon: fullness of beliefs and emptiness of religion, acclaiming the bully as hero, raising its voice only at funerals, boasting only among ruins, welcoming rulers with trumpeting only to see them off with hooting and welcome the next with more trumpeting; more than anything, what stands out is the division into fragments, each one acting as a nation or in the name of the nation.
Already in the 1860s, geopolitical conflicts in the region were translated into bitter sectarian conflicts that continued through independence, only to be further aggravated by the creation of the neighboring State of Israel. The weak political leaderships of the different sects looked beyond Lebanon for larger alliances that could further consolidate their power; soon enough the central government began to lose control, and the sectarian violence deteriorated into a civil war lasting nearly twenty years.
The history of the Lebanese civil war is rather well known, and remarkable though it was in terms of the actors involved, what is even more remarkable is the way the Lebanese found to negotiate their former conflicts and rehabilitate the public sphere in order to move on from a turbulent past into a future plagued by open wounds and uncertainties.
Nowhere is the legacy of the war more visible than in the city of Beirut, whose status as a cosmopolitan regional hub was born not out of planning but rather as the accidental consequence of a very troubled past.
Craig Larkin outlined in his paper “Reconstructing and Deconstructing Beirut: Space, Memory and Lebanese Youth” some of the reasons behind Lebanon’s dynamism: A mountain refuge for religious minorities; a forged compromise of colonial powers and indigenous elites; a republic of tribes and villages; a cosmopolitan mercantile power-sharing enclave; a playground for the rich; a battleground for religious and political ideologies; a fusion and combustion of the Arab East and the Christian West; an improbable, precarious, fragmented, shattered, torn nation.
All of these elements convened at once in pre-war Beirut: The city grew along the lines of quarters – usually belonging to different religious communities – that developed an inclusive space for all after 1879, when a public garden was opened in the “bourj” (Martyrs’ Square) and the area evolved into an urban hub for all types of public activities.
During the civil war it was precisely this area that split the city in two, along which the lines of militia fighting were drawn, separating the city into East and West Beirut and displacing the once mixed population. The end of the war, with its permanent calls for dialogue and reconciliation, surprisingly did nothing to change the demographic status quo of the war.
The reconstruction of Beirut, and particularly of its historical downtown, was taken up in 1994 by the private venture Solidere (Société libanaise pour le développement et la reconstruction de Beyrouth), established by then prime minister Rafik Hariri – later assassinated – at a time when the Lebanese state was still too weak to pass the strong judgments needed to punish war criminals and effect a true social reconciliation in Lebanese society.
The solution then – as aptly described by Sune Haugbolle in his book “War and Memory in Lebanon”– was a vision of national unity, imagined or imaginary, through which Hariri’s capitalism seized the day with a state-sponsored amnesia in which reconciliation was limited to the private sphere and a vision reigned in which the most important thing was to leave the past behind.
The price that Beirut had to pay for this nominal unity was the actual destruction of what had formerly been the sole equivalent of a physical public realm. The obvious lack of interest in social reconciliation eliminated the possibility of true interaction between the different communities, and this was further consolidated by the total absence of shared public areas. The forces and powers of the state were incorporated into Hariri’s capital and became identical with it.
The reconstruction of Beirut wasn’t so much an exercise in reconstruction as it was the total remaking of a symbolic part of the city that closed off the vaults of the past to interpretation in order to replace the immediate past with two equally disturbing symptoms of amnesia: The absolute past and the absolute future. The motto “Beirut: Ancient City of the Future” was coined and before the reconstruction even began, a large part of the area was demolished; in fact, much more than had been destroyed during the entire war.
The futuristic landscape, entirely devoid of public spaces – consisting mostly of prohibitively expensive residential towers and an exclusive shopping district – was coupled with an interest in preserving Beirut’s ancient heritage – ruins from Roman and Phoenician times – in order to create a model city that was entirely disconnected, even physically, from the vast majority of Beirut, creating yet new sources of segregation and division.
Solidere’s concept envisioned a “Beirut reborn” in which the past informs the future, doing precisely what prominent Lebanese architect Bernard Khoury expressed: “It completely bypasses the present. It evokes and links the past and the future, but shrugs off any notion of the present.”
But Beirut shows a different picture, in which the present rises even as it self-destructs: The ambitiously wealthy downtown contrasts with a city where poverty looms close to 35% and where news of buildings collapsing because of inadequate infrastructure is not uncommon.
At the same time the ghost of sectarianism is a living reality: What had been checkpoints and militia roadblocks during the civil war have now been replaced by subtle division lines that can be experienced by anyone who travels through the city: Posters of different sect leaders, graffiti and other religious and political icons serve the exact same function and give the unavoidable impression of a city deeply divided that echoes Lebanon’s political landscape.
Acts of memory have become commonplace in response not only to Hariri’s capitalism but to the entire political establishment; however, they remain at the level of demanding what no Lebanese movement or faction has ever done: to step up to the challenge of opening public spaces in which social reconciliation can take place; namely, the acceptance that a court of justice cannot punish an entire country in which all the groups involved bear responsibility.
Artists on the other hand have remained trapped in two narratives that equally defy the gist of the present: Either the total view of Lebanon through the eyes of the war or the Oriental Romanticism of the pre-republican Lebanon that is identical with the Western fantasies about the Middle East. Khoury says elsewhere: “Beirut has a false relationship with its past, characterized by a superficially Arabocentric kind of nostalgia.” What is remarkable here is the absence of the present.
Recently, in “War and Memory in Lebanon”, I elaborated on the challenges posed by Hannah Arendt’s ideas on forgiveness and reconciliation in postwar Lebanon in the context of Tajaddod’s interactive exhibit “Another Memory”; here, however, I want to turn my attention to Beirut’s relationship to the public space.
Arendt conceived of the public realm as a space produced by particular forms of citizen interaction, where citizens engage in the unpredictable self-disclosure typical of political action, properly conceived, and strengthen the bonds between them in order to sustain this selfsame space.
She writes in The Human Condition:
The term public signifies the world itself, in so far as it is common to all of us and distinguished from our privately owned place in it. This world, however, is not identical with the earth or with nature, and the limited space for the movement of men and the general condition of organic life. It is related, rather, to the human artifact, the fabrication of human hands, as well as to affairs which go on among those who inhabit the man-made world together. To live together in the world means essentially that a world of things is between those who have it in common, as a table is located between those who sit around it; the world, like every in-between, relates and separates men at the same time.
Under the conditions of a common world, reality is not guaranteed primarily by the “common nature” of all men who constitute it, but rather by the fact that, differences of position and the resulting variety of perspectives notwithstanding, everybody is always concerned with the same object. The end of the common world has come when it is seen only under one aspect and is permitted to present itself in only one perspective.
This common world which Arendt discusses is a man-made phenomenon that arises naturally between men rather than being dictated by one man alone. Worlds “crafted” by a single will, by contrast, are typical not only of totalitarian regimes but of any situation – political or otherwise – in which the spontaneity of human action is taken away and replaced with an ideal situation in which the unpredictability of action is traded for calculation.
One of those situations in which human action is calculated is the privatization of the public realm, as elaborated by Mark Willson in his paper “Enacting public space: Arendt, citizenship and the city”, where he makes the case for the importance of citizenship practices within the shared space of the city and shows how the privatization of public space always results in the weakening of participatory democracy.
Willson brings up recent work of Margaret Kohn (2004), which is immediately relevant to the case of Beirut: “Even when members of different groups do not engage in formal political discussion, exposure to others may help offset the mutual fear and suspicion fostered by segregation. It is difficult to feel solidarity with strangers if we never inhabit places that are shared with people who are different.”
The privatization of downtown Beirut and the area surrounding Martyrs’ Square isn’t simply a question of neo-liberal economics but an attempt to dovetail and manipulate the public space into an artificial arena of consumption.
On the other hand, alternative public spaces have existed in Beirut through the war years and not limited to downtown; Larkin for example brings up the case of Hamra, home to the prestigious American University in Beirut and where the lack of urban planning and official governance enabled the development of a creative environment, allowing greater room for contested post-war visions and plural identities.
Cross-sectarian platforms do exist in Lebanese society (among them, Tajaddod is but one example) and there has been something of a resurrection of a secular movement; however, at the level of the state, representation remains largely sectarian, as it has been since the French edict of 1936, after which people had to declare membership in one of the religious communities in order to receive the right to citizenship. Many aspects of life are still largely determined by sect.
But the consequence of this is that the fragile balance persists in spite of the official narrative of a reconciliation between past and future, without a present; proof of this is that recent clashes in the north of the country quickly spread to Beirut and revived the anxiety of the civil war years, in an environment in which people are acutely aware that the balance may break at the slightest disturbance.
It is highly unlikely that the current political leadership will be able to resolve the sectarian conflict at the heart of Lebanon’s turbulent history since they rose – against all odds – out of the sectarian conflicts and are indebted to the status quo for their power and authority in representing large sections of the Lebanese population.
A public space reinvented on a policy of amnesia isn’t only a limited public realm but also the gentrification of an entire location of memory into an elitist museum, closing not only the past but also the future. A student interviewed by Larkin expressed it best: “The redevelopment involved a covering or hiding of the memory of the war, and in this sense it’s unreal. You can’t talk just of Romans and Phoenicians and our great heritage, without mentioning militias, kidnapping and bombs.”
Even though the historical downtown isn’t Solidere’s only venture (others include the failed Elyssar plan in southern Beirut), it would of course be an unfair assessment to say that Solidere alone is responsible for the gap in Lebanese memory. Bernard Khoury comes to mind again when he says the obvious: “Could anything more be demanded of a private company when the country as a whole is incapable of writing its own history? It’s very sad now that in school books history stops in 1975.”
Lourdes Martinez-Garrido articulated it very well in her “Beirut Reconstruction: A Missed Opportunity for Conflict Resolution” (Al Nakhlah, Fall 2008): The Lebanese civil war resolved none of the conditions that generated the initial confrontation. Like any other type of violence, it generated fear, suffering and destruction. In the process of recovery, there was no political plan for social reconstruction.
Finally, the attempted reconstruction of Beirut – though an apparent success – has decidedly turned the city’s own heritage and culture into a “product”: a touristic souvenir, a source of entertainment for everyone but those who suffered the war. This is what Hannah Arendt warned about in “The Crisis in Culture”:
Mass culture comes into being when mass society seizes upon cultural objects, and its danger is that the life process of society (which like all biological processes insatiably draws everything available into the cycle of its metabolism) will literally consume cultural objects, eat them up, and destroy them.
The Lebanese heritage that has survived millennia of wars might yet not survive a couple of decades of amnesia and disappear altogether with the public realm. As these risks loom close, the proponents of doom will seek shelter in the past and the proponents of progress will seek shelter in the future, all while the present will continue, unfortunately, to pity the nation.
Elisabeth Young-Bruehl’s final work, Childism, was published soon after her untimely passing in December of 2011. In the book, Young-Bruehl, a longtime psychoanalyst and child advocate, focuses on the pervasive prejudice she feels overshadows many children in our society. Be it abuse, or the modern-day phenomenon of helicopter-parenting, she felt these injustices served to demarcate children, marking them as less worthy than adults. These injustices result in unhealthy and damaging parent-child relationships.
Arendt Center intern Anastasia Blank has been reading Childism and providing us with a chapter-by-chapter review, highlighting some of the most interesting and compelling insights and arguments. Her previous posts about the book can be read here. Today, she shares her final thoughts and impressions about the book. We hope you have been inspired to read along. You can purchase the book here.
My past four posts on Elisabeth Young-Bruehl’s Childism have emphasized the role of prejudice in the mistreatment of children. Young-Bruehl has laid a foundation for her reader both to see how childism manifests itself through abuse, prejudice, and neglect and to question where the motivations for such actions come from. In the fifth chapter of her book, Young-Bruehl turns our attention elsewhere, to the researchers, investigators, and theorists who work within the fields of Child Abuse and Neglect (CAN) and Child Protective Services (CPS). Her claim is that progress in helping abused children has been stunted by the disjointed views of those working to help them.
One example of the challenges facing those who would protect children is the widespread panic that occurred between the 1980s and the early twenty-first century surrounding satanic ritual abuse (SRA). In 1983, reports began to spring up around the country about how young children were being forced by workers at their daycare centers or preschools into sexual acts and disturbing sacrificial ceremonies.
Workers responsible for the protection of children proved ill-equipped to handle this new phenomenon of abuse. Social workers had professional commitments that rendered them unable to acknowledge the occurrence as a conspiracy theory. Prejudiced by suggestive interviews and Recovered Memory Therapy (RMT), many social workers insisted on finding guilty parties. Others pushed for more family involvement in childcare; and a select few were trying to use the responses to this mass hysteria as an occasion for self-reflection on the flaws plaguing the field.
Out of the satanic ritual abuse phenomenon rose another issue, False Accusation Syndrome, or FAS. Suddenly, the very field that was in place to protect children was wielding them as weapons against their alleged abusers. Worse, the children being used were being victimized in a whole new way:
The problem of false accusations was not a syndrome and was not a condition of child victims….FAS was misnamed; it was made into a child’s problem when it was in fact an adult’s problem: convinced they were helping children, adults projected their images of children as liars [onto them]… FAS was yet another manifestation of childism.
In FAS, the child is doubted solely because of their age. Even the workers charged with protecting children are susceptible to what Young-Bruehl calls the childism prejudice.
Young-Bruehl writes that, in seeking answers and solutions for the abuse and harm being inflicted on children, those within the field began to add to the damage by blaming children. Childism, she writes, occurs when an adult sees problems with a child that actually originate from the adult’s own projections. A person is prejudiced towards a child or children when they place blame on, feel resentful towards, or doubt the capabilities of a child.
A progressive shift was made in the early 2000s, when Child Abuse and Neglect (CAN) practitioners began to acknowledge the flaws the field had demonstrated over the past two decades: “Personnel in social work, child services agencies, and Child Protective Services departments… acknowledged that their own field, CAN, was a contributor to [the] crisis”. The major issue within the CAN field was that practitioners and researchers alike were often classifying children into one category of maltreatment. A child was either a victim of physical abuse, sexual abuse, emotional abuse, or neglect. In reality, however, only 5 percent of abused children suffer only one type of abuse.
The problem is that children are sorted and said to suffer one particular type of abuse, while the entirety of their abuse and its effects goes unrecognized. When a child is taken from their home because someone in the home was sexually abusing them, this does not address the other factors that were likely involved. The child may also have been neglected, which is why the abuse was allowed to go on. The child may have been verbally abused, which is why they were afraid to speak out about the sexual misconduct. When only one factor in the abuse is given focus, all of the other issues are pushed to the back burner: they are still percolating and affecting the child, but are not being addressed.
Young-Bruehl sees the CAN field’s tendency to consider the four types of abuse separately as a form of childism, ignoring the children for the adults’ “ease of discussion.” Sadly, this leads to misleading conclusions about what type of abuse is taking place and how to treat affected children. Worse, studying abuse in this way will not produce accurate conclusions, because traumatized children will be classified and treated as victims of a single specific type of abuse.
What arose in the CAN field around the satanic ritual abuse uproar was a turn away from hearing the actual experience of a victim toward a classification of their abuse. By sectioning off victims under the awning of a certain type of abuse, the field has turned a blind eye to the needs of the victim. The issue within the CAN field surrounding the cases of SRA was that practitioners were scrambling to understand what this new type of abuse could be. It was something they had never encountered, and so they needed to make up for their lack of knowledge by herding the children under a new title. The children were victims of multiple abusers, but what does this title actually tell us about the abuse and its effects?
CAN needs to be asking children and adult survivors of abuse about their own experiences. By considering specific cases of victims, CAN will be forced to shed its restrictive abuse-act typology, because most children fall under an umbrella of multiple abuses. Each type of abuse harms the child in different ways, and each needs to be addressed (as well as how the abuses acted together). People who are prejudiced towards children, those who find them burdensome and bad and want to ‘eliminate’ them (both theoretically, by destroying their sense of self, and actually, through means of starvation and physical abuse), can use any one or all of the different types of abuse as a way to harm the body and psyche of a child. As Young-Bruehl puts it, “The acts are weapons in a war between the generations.” However, what we see is that a “silencing” of children has been occurring within the very field that is supposed to advocate for the voice of the child.
Children who attempt to speak out about their abuse are viewed as incapable of doing so. If CAN workers believed in children’s ability to identify their trauma, they would let the victim’s experience determine the help they need. Instead, they tack a title of abuse onto a child, one which often does not address the experience(s) of trauma as a whole.
These harmful acts of abuse and neglect go on to shape how the child sees themselves and the world. This view permeates their psyche through adolescence into adulthood. In order to prevent and treat the traumatic events children experience and the prejudices against them, the focus needs to be turned to why adults can view children so negatively that their thoughts evolve into harm, and also how this harm manifests itself in the mind of a victim. In order to understand the mind of the victim, the field needs to start listening better, even if the story being told does not fit perfectly into a box with a specific title.
There is probably no presidential speech more quoted in academic circles than Dwight D. Eisenhower’s 1961 farewell address, delivered in the final days of his presidency. It was in that speech that Eisenhower warned of the danger of a military-industrial complex.
The need for a permanent army and a permanent arms industry creates, he writes, a gargantuan defense establishment that would wield an irresistible economic, political, and spiritual influence. In the face of this military-industrial complex, we as a nation must remain vigilant.
In the councils of government, we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex. The potential for the disastrous rise of misplaced power exists and will persist.
Eisenhower’s speech was prescient. Academics in particular love to point to his speech to criticize bloated defense spending and to insist on the need to critically resist the military’s demands for more weapons and more soldiers. They are undoubtedly right to do so.
This is true even as today the military may be the one significant institution in American life where top leaders are arguing that America’s world preeminence is not sustainable. In Edward Luce’s excellent new book Time to Start Thinking, he describes how military leaders are convinced that the U.S. “should sharply reduce its ‘global footprint’ by winding up all wars, notably in Afghanistan, and by closing peacetime military bases in Germany, South Korea, the UK, and elsewhere.” The military leaders Luce spoke to also said that the US must learn to live with a nuclear Iran and “stop spending so much time and resources on the war against Al-Qaeda.” Military leaders, Luce reports, are upset that “In this country ‘shared sacrifice’ means putting a yellow ribbon around the oak tree and then going shopping.” Many military people seem to share Admiral Michael Mullen’s view that the US national debt is the “country’s number one threat—greater than that posed by terrorism, by weapons of mass destruction, and by global warming.” One must think hard about the fact that military leaders see the need for “shared sacrifice” that will shrink the military-industrial complex while Americans and their elected leaders still speak about tax cuts and stimulus.
Too frequently forgotten, or simply overlooked, in Eisenhower’s speech is the fact that he follows his discussion of the military-industrial complex with a similar warning about the dangers of a “revolution in the conduct of research.” Parallel to the military-industrial complex is the danger of a university-government complex. (Hat Tip, Tom Billings (see comments)). Eisenhower writes:
Akin to, and largely responsible for the sweeping changes in our industrial-military posture, has been the technological revolution during recent decades. In this revolution, research has become central; it also becomes more formalized, complex, and costly. A steadily increasing share is conducted for, by, or at the direction of, the Federal government.
Today, the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists in laboratories and testing fields. In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. For every old blackboard there are now hundreds of new electronic computers.
Just as modern warfare demands a huge and constant arms industry, so too does the technological revolution demand a huge and constant army of researchers and scientists. This army can only be organized and funded by government largesse. There is a danger, Eisenhower warns, that the university-government complex will take on a life of its own, manufacturing unreal needs (e.g. a Bachelor of Arts degree in order to manage an assembly line) and liberally funding research with little regard to quality, meaning, or need. While the university-government complex is not nearly as expensive or dangerous as the military-industrial complex, there is little doubt that it exists.
Eisenhower warns of a double threat of this university-government complex. First, the nation’s scholars could be dominated by Federal employment, and gear their research to fit with governmental mandates. And second, the opposite danger, that “public policy could itself become the captive of a scientific-technological elite.”
The existence and power of just such a scientific-technological elite is undeniable today. On the one side are the free-market ideologues, those acolytes of Friedman, Hayek, and Coase, who insist that policy be geared towards rational, self-regulating economic actors. That real people do not conform to theories of rational behavior is a problem with the people, not the theories.
On the other side are the welfare-state adherents, who insist on governmental support for not only the poor, but also the working classes, the bankers, and corporations. The sad fact that 50 years of anti-poverty programs have not alleviated poverty or that record amounts of money spent on education has seen educational attainment decrease rather than increase is seen to be no argument for the failure of technocratic-governmental solutions. It just means more money and more technical know-how are needed.
It is simply amazing that people in academia can actually defend the current system that we are part of. Of course there are good schools and fine teachers and serious students. But we all know the system is a failure. Graduate students are without prospects; faculty spend so much time publishing articles and books that no one reads; administrators make ever more – sometimes twelve times as much as full professors – and come more and more to serve as the lifeblood of universities; and it is the rare student who, amidst the large classes, absent faculty, and social and financial pressures, somehow makes college an intellectual experience.
The idea and practice of college needs to be re-imagined and re-thought. Entrenched interests will oppose this. But at this point the system is so broken that it simply cannot survive. On a financial level, large numbers of universities are being kept afloat on the largesse of federal student loans. If those loans were to disappear or dry up, many colleges would disappear or at the least shrink greatly. This should not happen. And yet, putting our young people $1 trillion in debt is not an answer. For too long we have been paying for our lifestyles with borrowed money. We are now used to our inflated lifestyles and unwilling to give them up. Something will have to give.
The current cost of a college education is unsustainable except for the very top schools that attract the very richest students who then fund endowments that allow those schools to subsidize economic, national, and racial diversity. For schools that cannot attract the wealthiest or do not have endowments that protect them from market forces, change will have to come. This will mean, in many instances, faculty salaries will decrease and costs will have to come down. In other colleges, costs will rise and university education will be ever less accessible. Either way, the conviction that everyone needs a liberal arts degree will probably be revised.
I have no crystal ball showing where this will all lead. But there are better and worse ways that the change will come, and I for one hope that if we turn to honestly thinking about it in the present, the future will be more palatable. This is the debate we need to have.
Acting and Thinking: Thinking is rather complete concentration or absolute waking, that through which and in which all other “faculties” concentrate themselves.
—Arendt, Denktagebuch, vol. 1, 12
In The Human Condition, Hannah Arendt treats action as one of the three “most elementary articulations of the human condition”—those activities that are “within the range of every human being.” But Arendt leaves out other—less elementary—articulations of human being. Most notably, she specifically says that the book will not address thinking, “the highest and perhaps purest activity of which men are capable.” If acting is the highest of the elementary ways of being human, thinking is a specific kind of action that is, by its rarity, reserved for the few. Written by one of those few, The Human Condition is, above all, an attempt to “think what we are doing.”
The Human Condition traces the relation between thinking and acting that cuts through all of Arendt’s writing. Her account of Adolf Eichmann emphasizes his thoughtlessness. She comes to believe that it is thoughtlessness that makes possible evil actions and that thinking is the only possible way to stop or at least dis-empower the human tendency to do evil.
Similarly, thinking what we do is the path toward a reinvigoration of politics.
But what, exactly, is the relation between thinking and acting? Near the beginning of Hannah Arendt’s Denktagebuch, in July 1950, Arendt sets down the first of what will become numerous entries under the title: “Acting and Thinking.” While many themes run through the Denktagebuch (literally, a book-of-thoughts), no other theme is so prevalent as “Acting and Thinking.” In this early line of thought, we see Arendt’s attempt to establish the relation between the two activities that would come to dominate her own thinking for the next 25 years.
The full entry, which references Martin Heidegger and William Faulkner, is worth citing in its entirety:
Acting and Thinking: Heidegger can only mean that it rests upon the sameness of being and thinking, and surely then, when thinking is understood as the being of man in the sense of the being of being. Thinking would then be the being that in man is freed to be action. Thinking is here neither speculation nor contemplation nor “cogitation.” It is rather the complete concentration or the absolute waking, that through which and in which all other “faculties” concentrate themselves.
“Why did I wake since waking I never shall sleep again.”
The quoted line at the bottom is a slight misquotation of William Faulkner’s famous line from Absalom, Absalom! (Arendt transposes “never” and “shall”). Thinking, Arendt writes, is an “absolute waking.” It can be a rude awakening, insofar as it tears one from the dream world of easy living and requires concentrated attention to difficulty. In such wakefulness, there is the ecstasy of absolutely wakeful concentration.
The word Arendt uses to describe the fullness of wakeful thinking is the German vollbringen, to complete, or to bring to fullness. This is, not coincidentally, the same word Martin Heidegger uses to describe both thinking and acting in his 1946 Letter on Humanism. Heidegger begins his Letter on Humanism with a discussion of the relation of action and thinking. The first sentence introduces the relationship: “We are still far from thinking the essence of action decisively enough.”
If usually we think of action as simply something that causes or brings about effects, Heidegger writes that this is not decisive enough. Instead, “The essence of action is the bringing of something to completion, or the bringing of something to fulfillment.” To act is to unfold something in the fullness of its essence, to bring it to be what it most is. It is for this reason that human action is thinking, since “Thinking brings to fullness the relation of being to the essence of man.”
Arendt follows Heidegger in seeing thinking as the same as acting. What Arendt’s account of thinking as fulfilling and completing wakefulness adds to Heidegger’s conjunction of action and thinking is her insistence on human freedom. In the relation of action and thinking Arendt rejects all determinism and all understandings of action and thinking based in speculation, contemplation, or cognition, all of which subordinate human action to rules or reasons. Arendt’s acting and thinking human being is not a shepherd of being, but a beginner.
Thinking, Arendt writes, is freed to act and to bring new things into the world. That is what Arendt means by a thinking that is absolutely awake. Thinking what we are doing must, therefore, be itself an active beginning, a surprising and spontaneous action that inserts itself into the world in act and deed. If such thinking is surprising and new, it will draw others to it who will tell stories about it. Only then, if and when thinking inspires others to act in its wake, does thinking act.
My post on the proposed cuts to political science funding has drawn many comments. The political science community has mobilized strongly, sending out emails emphasizing the fact that Congressman Flake’s cuts do not actually cut any money from the NSF budget, but just from political science, thus in effect redirecting it to other disciplines. Steven Mazie also makes this worthy point. As questionable as political science research is, I have no doubt that political scientists have not cornered the market on irrelevant research.
But such arguments sidestep the real question: whether we need federal funding of social science research as it is currently practiced. The social scientists—fearful of being cut off from the sustaining stream of federal funds—are rallying their troops. I have in the last two days received numerous appeals from the American Political Science Association and related groups asking me to write my senators trying to kill these proposed cuts. In the appeals, I am directed to a new virtual edition of the American Journal of Political Science, which features a selection of supposedly exemplary articles produced with NSF funds. I did visit the virtual journal and there found the following:
Self-Organizing Policy Networks: Risk, Partner Selection, and Cooperation in Estuaries. This study looks explicitly at networks involving policy makers dealing with coastal estuaries. [It finds] that in riskier settings (where the resource is the most fragile) highly connected networks spring up, and these are important for preventing further resource decline.
Not by Twins Alone: Using the Extended Family Design to Investigate Genetic Influence on Political Beliefs. This is one of an increasing number of studies providing evidence for a strong genetic component to political attitudes. The point of the research is not that politics is purely genetic, but that individuals are born with personality traits that they carry with them through their lives. These traits are related to political attitudes.
Inequality and the Dynamics of Public Opinion: The Self-Reinforcing Link Between Economic Inequality and Mass Preferences. This research looks at the threat that rising income inequality poses for democracy. The findings call into question the idea that changes in inequality result in a shift in mass opinion toward more liberal ideas. Indeed, the research indicates that increases in inequality shift mass public opinion in a more conservative direction.
My colleague and friend of the Arendt Center, Walter Russell Mead, had these wise words to say on his excellent blog:
There is a real baby and bathwater problem here. While much academic research is so worthless that not even other academics in the same field bother to read it, some of this research represents high triumphs of the human spirit, opens the door to new medical treatments, or otherwise deepens our understanding of the world around us and increases our ability to live richer, better lives.
The reconstruction of the American university is going to take some time, and nobody knows now exactly how the new system should look. In general, Via Meadia thinks that the “research model” works less well in the humanities and in most social sciences than it does in the natural sciences. In many cases, undergraduate teaching could be separated from scholarly research with no loss to the quality of undergraduate education — and perhaps a substantial gain.
In any case, we think Congressman Flake’s proposals deserve a fair and careful hearing. The policy usefulness of most political science research is at best questionable; at a time of tight government budgets it makes sense to look hard at non-essentials.
There is a real need to rethink the point of academic research in the university system. Every academic knows that the vast majority of published material is not worth publication. We also know that so much is published and almost none of it is read by more than a very few friends and colleagues. Whether that research is nonetheless valuable as a contribution to the storehouse of knowledge and the slowly evolving advance of science is a good question. But the short answer is that most of it is not.
Mead raises an important question about whether humanities and social science professors need to be part of the research model of modern academic institutions. On the one hand, it does seem strange to think of humanities professors as “researchers.” It fits us into the scientific model and suggests that thinking is somehow the product of research, which is a deeply questionable presumption. More likely, research deadens thinking, as it normalizes and limits it.
What thinking does need is time, and that is the challenge that humanities scholars are confronted with today. The demands of teaching and researching and publishing, let alone administering, are such that few academics today have time to read and think. We must insist on a distinction between the time to think and the need to publish.
Of course, one might argue that reading and thinking are what happen in teaching. If we simply teach great books we can read and re-read them, allowing us time to think, inspired by the masters of the past and also the present. That is certainly my approach to teaching, which is why I have always found teaching to be an integral part of my intellectual and writing life. My best papers and articles are the products of classroom insights. Might it be, then, that the research model is the enemy of thinking in the humanities?
That is, of course, too simple a conclusion. Thinking and teaching go together, although teaching hundreds of students and grading thousands of papers every semester is not really teaching, just as writing paper after paper is not really thinking. Teaching requires time, as does thinking. Both time to think and time to talk with students, to engage with them, and inspire them. And to be inspired by them. There is less and less time to do that in our research universities, and even in some of our liberal arts colleges that insist on mimicking the research university model. The model needs to be rethought. We should not run away from that opportunity.
Martin Heidegger’s Letter on Humanism is one of the great works of the 20th century. It was written in 1946, after his experience of the war, after he had been stripped of his teaching duties at Freiburg University, where he had served as Rector, and after he had lost his membership in the Nazi party. The Letter is an attempt to re-cast his past work on a current and future path, seeking to save humanity from inhumanity.
The Letter remains controversial for many reasons, not least because Heidegger refuses to see Nazism as the name for the inhumanity threatening our world. Instead, he attributes the dehumanization of mankind, including Nazism, to a general homelessness that has its roots in what he calls the age of Technik.
I am just finishing up a semester-long seminar on Heidegger’s Letter on Humanism, a course I try to teach every other year. I always end the course by reading the one text on Heidegger’s Letter that I find both intelligent and provocative.
In Rules for the Human Zoo: A Response to the Letter on Humanism, Peter Sloterdijk sets Heidegger’s text in the context of humanism. While this may seem obvious, it is not. Heidegger goes through a history of humanism in two short pages of his text, and never addresses it again. But for Sloterdijk, the text is to be read as a last effort to save a dying humanist tradition.
The core of the humanist tradition, in Sloterdijk’s provocative telling, is the book as love letter. Books, he writes, citing the poet Jean Paul, “are thick letters to friends.” Humanist books, as love letters, are messages sent out in printed form looking for friends. A humanist writes a book to move others to love what he himself loves. Humanism is thus a “communitarian fantasy” in which “participation through reading the canon reveals a common love of inspiring messages.”
The power of humanist writing is the power to communicate love of humanity to others whom one does not know. It is to awake in others the love for being human, for living as human, in the way and manner of a human being. And for most of Western humanism, the essence of that human being that inspires such love and devotion has been the human capacity to think, to reason, and to create. It is because humans can create beautiful works of art, found great empires, and devote themselves to truth and to God that humans are different from animals and worthy of our love.
Humanists must, Sloterdijk knows, distinguish themselves from animals. Thus:
Anyone who is asking today about the future of humanity and about the methods of humanization wants to know if there is any hope of mastering the contemporary tendency towards the bestialization of humanity.
Sloterdijk sees humanism as the effort to tame the human beast—the beast in the human. It is the desire to influence for the good the constant tension in human beings between bestialization and humanization.
It is because humanism is always one side of a struggle against a perceived threat of the bestialization of human beings that humanists must, of necessity, stand apart not only from animals, but also from mass culture. Sloterdijk presents this point in the context of Roman humanism with clarity:
Ancient humanism can be understood only when it is grasped as one opponent in a media contest: that is, as the resistance of the books against the amphitheater, and the opposition of the humanizing, patient-making, sensitizing philosophical reading against the dehumanizing, impatient, unrestrained, sensation-mongering and excitement-mongering of the stadium. What the educated Romans called humanitas would have been unthinkable without the need to abstain from the mass culture of the theaters of cruelty.
From these premises, Sloterdijk makes the surprising claim that with humanism, “the question of how a person can become a true or real human being becomes unavoidably a media question.” The great event of our time, in Sloterdijk’s telling, and that which ends the humanist endeavor, is the telecommunications revolution. The end of the book, the loss of the medium of high culture that distinguished its readers from the masses, and thus the massification and bestialization of man, is, he writes, a death knell for the very idea of a humanity that is to be held separate from and higher than animals.
Hannah Arendt fought throughout her life against efforts of human rights activists to reduce man to a living being and against the dreams of social scientists to make of man a predictable member of a mass.
Her fight was, on her own terms, the fight to preserve an idea of the human distinct from animals that also powers Heidegger’s exploration of humanity in the Letter on Humanism. Sloterdijk’s account of Heidegger’s effort, and his judgment of its unavoidable failure, is well worth your time this weekend. It is your weekend read.
Student debt is suddenly spurring the once unthinkable debate: Is college necessary? Of course the answer is no. But who needs it and who should pay for it are complicated questions.
Arendt herself had an ambivalent relationship to academic culture. She never held a tenure-track job in the academy and she remained suspicious of intellectuals and academics. She never forgot how easily professors in Germany embraced the rationality of the Nazi program or the conformity with which Marxist and leftist intellectuals excused Stalinism. In the U.S., Arendt was disappointed with the “cliques and factions” as well as the overwhelming “gentility” of academics, which dulled their insights. It was for that reason that she generally shunned the company of academics, with, of course, notable exceptions. A free thinker—she valued thinking for oneself above all—she was part of and apart from the university world.
We plan to keep the discussion about college and debt going on the Arendt Center blog. Here are a few thoughts to get the debate going.
First, college is not magic. It will neither make you smart nor make you rich. Some of our best writers and thinkers somehow avoided writing five-page papers on the meaning of Sophocles. (That of course does not mean that they didn’t read Sophocles, even in the Ancient Greek.) And many of the most successful Americans never graduated or attended college. On the other hand, many college grads and Ph.D.’s are surviving on food stamps today. Some who attend the University of Phoenix will benefit greatly from it. Many who attend Harvard squander their money and time. Especially today, college is as much a safe path for risk-averse youth as it is a haven for the life of the mind or a tasseled path to the upper classes.
Second, college can be a transformative experience. As I prepare to say goodbye to another cohort of graduates at Bard, I am reminded again how amazing these students are and how much I learn from them every year. I wrote recently about one student who wrote a simply stunning meditation on education. Today I will be meeting with two students about their senior projects. One is a profound, often personal, and yet also deeply mature exploration of loneliness in David Foster Wallace, Hannah Arendt, and Martin Heidegger. The other is a genealogy of whistleblowing from T.E. Lawrence to Bradley Manning, arguing that the rise of whistleblowing in the 20th century is both a symptom of and a contributor to the lost facts in public life. Both are testaments to the fact that college can inspire young adults to wrestle meaningfully and intelligently with the world they must confront.
Third, most students do not attend college because they want to. Of course some do, and I have enormous respect for those who embrace the life of the mind that college can nurture. I also respect those who decide that college is not for them. But the simple fact is that too many college students are here thoughtlessly, going through the motions because they are on a track. College has become a stepping stone to a good job, which is a stand-in for a good life. Nothing wrong with that, but is it really worth hundreds of thousands of dollars and four years of your time simply to get a credential? College students are young and full of energy. Too often they spend four of their most energetic years studying things they don’t care about while they sleep late, drink a lot, and generally have a good time. This cannot be the best use of most young people’s time.
Fourth, it is not at all clear that college is a good investment. There is no shortage of students who tell me that taking out debt for an education is always a good investment. This is usually around the time they want to apply to law school or graduate school. And I can only repeat to them so many times that they are simply wrong. Finally, the press is catching up to this fact, and we are treated to a daily drumbeat of stories about the dangers of student debt. College debt in the U.S. now exceeds $1 trillion, more than credit card debt (although far smaller than mortgage debt). The problem is widespread, as 94 percent of those who earn a bachelor’s degree take on debt to pay for higher education — up from 45 percent in 1993. And the problem is deep: The average debt in 2011 was $23,300. For 10 percent of college graduates, the debt is crippling, as they owe more than $54,000. Three percent owe more than $100,000.
The most egregious debt traps are still the for-profit colleges, which serve the working classes who cannot afford more expensive non-profit colleges. These schools prey on the perception, partly true, that career advancement requires a college degree. But now even public universities and private elite colleges are increasingly graduating students with high debt loads. And then there are law schools and culinary schools, which increasingly graduate indebted and trained professionals into a world that does not need them.
The result is as sad as it is predictable. Nearly 1 in 9 young graduate borrowers who started repayment in 2009 defaulted within two years, about double the rate in 2005. The numbers vary: 15 percent of recent graduates from for-profit schools are in default, as are 7.2 percent of public university graduates and 4.6 percent of private university graduates. Each of these groups requires a separate analysis and discussion. And yet overall, we are burdening way too many young people with debts that will plague them their entire lives.
Fifth, to defend college education as a good investment is not simply questionable economically. It also is to devalue the idea of education for its own sake and insist that college is an economic rather than an intellectual experience. One unintended consequence of the expansion of college to a wider audience of strivers is that a college education is decidedly an economic and bourgeois experience, less and less an intellectual adventure. Was college ever Arcadia? Surely not. For much of American history college has been a benefit reserved for the upper classes. And yet to turn education into a commodity, to make it part of the life process of making a living, does further delimit the available spaces for the life of the mind in our society.
Sixth, college is not necessary to make us either moral or enlightened citizens. College education does not make us better people. There are plenty of amazing people in the world who have not studied Aristotle or learned genetics in college. The United States was built on the tradition of the yeoman farmer, that partly mythical but also real person who worked long days, saved, and treated people honorably.
Morality, as Hannah Arendt never tired of pointing out, is not gained by education. Or as Kant once pointed out to a certain Professor Sulzer in a footnote to his Groundwork of the Metaphysics of Morals, morality can only be taught by example, not through study. Arendt agreed. She saw that many of those who acted most honorably during WWII were not the intellectuals, but common people who simply understood that killing neighbors or shooting Jews was inhuman. What is more, it was often the intellectuals who provided themselves and others with the complex and quasi-scientific rationalizations for genocide. To think rationally, and even to use a current buzzword, to think critically, is no barrier to doing evil. Critical thinking—the art of making distinctions—is no guarantee of goodness.
Seventh, college cannot and should not replace a failed primary and high school system. Our primary schools are a disgrace, and then we spend a fortune on remedial education in community colleges and even in four-year colleges, trying to educate people who have been failed by their public schools. We would do much better to take a large part of the billions and billions of public dollars we spend on higher education and put them towards a radical restoration of our public grammar and high schools. If we actually taught people in grammar schools and pushed them to excel in high schools, they would graduate prepared to hold meaningful jobs and also to be thoughtful citizens. Maybe then a college education could be both less necessary and more valuable.
Bard College, which houses the Hannah Arendt Center, has been engaged for years in creating public high schools that are also early colleges. The premise is that high school students are ready for college-level work, and there is nothing to prevent them from doing that work. These Bard High School Early Colleges are public high schools staffed by professors with Ph.D.s who teach the same courses we teach at Bard College. In four years, students must complete an entire four-year high school curriculum and a two-year college curriculum. They then receive a Bard associate’s degree at graduation, in addition to their high school diploma. This associate’s degree, which is free, can either reduce the cost of graduation from a four-year college or replace it altogether.
Early colleges are not the single answer for our crisis of education. But they do point in one direction. Money spent on really reforming high schools and even primary schools will do far more to educate a broad, racially diverse, and economically underprivileged cohort of young people than any effort to reform or subsidize colleges and universities. The primary beneficiaries of directing public money to colleges rather than high schools are professors and administrators. I benefit from such subsidies and appreciate them. But that does not mean I think them right or sensible.
We would be much better off if we redirected our resources and attention to primary and secondary education, which are failing miserably, and stopped obsessing so about college. Most college graduates, wherever they go, will learn something from their four or more years of classes. But the mantra that one only becomes a full human being by going to college is not only false. It also is dangerous.