Critical thinking is possible only where the standpoints of all others are open to inspection. Hence, critical thinking, while still a solitary business, does not cut itself off from ‘all others.’ To be sure, it still goes on in isolation, but by the force of imagination it makes the others present and thus moves in a space that is potentially public, open to all sides; in other words, it adopts the position of Kant’s world citizen. To think with an enlarged mentality means that one trains one’s imagination to go visiting.
-Hannah Arendt, Lectures on Kant's Political Philosophy, 43
Arendt’s appeal to the “enlargement of the mind” of Kantian judgment is well known and is often discussed in relation to Eichmann’s failure to think and recognize the world’s plurality. To the extent that we find lessons in these discussions, a prominent one is that we might all be vulnerable to such failures of judgment.
While recognizing how easy it is for us to not think, especially in the bureaucratic structures of the contemporary world, I want to focus here on the moments of thinking and judgment that do occur but fail to garner recognition.
I was recently involved in a discussion about educational and other support programs in prisons around the country. During the conversation, someone observed that these programs seem to appeal especially to women. As it happened, each of the women in the conversation had been involved in some prison program, either as an attorney or an educator. But the observation was intended, of course, to reach beyond this relatively small group.
I don't know whether it's true that many more women than men are involved in programs like Bard's Prison Initiative or the Innocence Project or any number of such programs. But what struck me about this conversation was that despite no one claiming to possess any knowledge beyond his or her personal observations, many seemed relatively certain of a possible explanation for this phenomenon (or non-phenomenon): that women might have a greater capacity to empathize with others, not because we are innately sensitive beings, but because we can more easily recognize the suffering of others and respond to it.
Many readers of Arendt will immediately react to this description with Arendt's critique of empathy in mind. For Arendt, empathy destroys critical thinking to the extent that it tries to "know what actually goes on in the mind of all others," as opposed to comparing our judgment with the possible judgments of others (Lectures on Kant's Political Philosophy, 43). In trying to feel like someone else, empathy makes it impossible to respond politically, as it destroys the distance between individuals that makes a response to another as other possible.
But if not empathy, what might better describe those, whether they are women or men, who are open to the sufferings and injustices of others? The answer, I submit, is critical thinking.
For Arendt, critical thinking is necessarily imaginative, as it requires that the thinker make “the others present.” The presence of others is not achieved by imagining what goes on in each of the minds of these imagined others. Rather, this presence is what allows one imaginatively to construct a public space in which one’s actions are visible to other people.
Critical thinking thus most importantly lies not in the ability to compare our judgment with the possible judgments of all others, which is what is often stressed in discussions of Arendtian judgment, but rather in the adoption of the position of Kant's "world citizen." Adopting such a position is less about imagining others as such and more about recognizing that one is always putting oneself out there for others to judge. Insofar as it is necessary to construct the audience to which the thinker presents herself, the imagination of others is the first step to critical thinking, but only the first step. Critical thinking is, as Kant writes in "What is Enlightenment?," "addressing the entire reading public" such that one presents oneself for judgment by this learned group of which one purports to be a member. Like a politician or a writer or an actor, the critical thinker acts with the understanding that she will be judged not just by friends, lovers, or like-minded compatriots, but by an entire learned public whose judgments are tempered neither by love nor even self-serving support.
The space in which women moved has always been "public" to the extent that women who acted did so with the knowledge that they were opening themselves up to the judgment of others. Acting thus takes courage and a true living of the Enlightenment motto: "Sapere aude! Have the courage to use your own understanding!" (Kant, "What is Enlightenment?").
But acting also necessarily engages critical thinking in another sense: one's actions are always public to the extent that in acting one presents oneself for judgment to the world and discloses oneself. The thinking of women might, in this way, have been "forced" into the realm of the critical, for as solitary as the activity of thinking necessarily is, it occurs in a space in which the others are made present not only by the "force of imagination" but also by the force of history. Thus, if certain professions, causes, or activities do draw relatively more women than men, part of the explanation might be that women think more critically. The world that one sees, with all its injustices and its suffering, does not by itself move one to action or service. For this world is not the world in which one thinks or acts. Rather, one moves in and responds to the imagined one, in which what one does is meaningful because one's actions are being judged and because, as vulnerable as one might feel in being judged, judgment brings with it the implicit recognition that what one does is visible to others and, quite simply, that it might matter.
Arendt’s understanding of judgment is closely tied to Kant’s Critique of Judgment for a good reason: she herself builds her ideas directly on Kantian judgment. But reading Arendtian judgment through Kant’s shorter piece, “What is Enlightenment?” opens up to us aspects of the former that have previously been obscured. And it opens us up to acts of thinking, judgment, and courage to which we are often blind. Again, I don’t know that more women than men engage in work that supports prisoners and advances the cause of prisoners’ rights. But I don’t think it is controversial to say that the perception that they do exists and that women’s ability to empathize with others, whether because of their backgrounds or simply because they are women, is frequently an accompanying discourse. This could be the right explanation. But it could also be an expression not only of prejudices of what women are, but also of an insufficiency of our conceptual vocabulary to capture what it is that is going on in a way that does not simply reassert these prejudices.
Of late there has been no shortage of commentary on the ten years that have passed since the U.S. invasion of Iraq in 2003. Much of it has focused on the justifications for the war provided by members of the Bush administration, the lingering consequences of the invasion for President Obama and other policymakers, and the often harrowing experiences of American soldiers. These are certainly matters that should be discussed at length.
But U.S. public discourse continues to say little about the impact of the war on Iraqis themselves or about their efforts to survive and interpret it.
Much of it also remains tightly focused on the era after 9/11, as if that day's events rendered the longer arc of Iraqi history—including the part that the U.S. has played in it—more or less irrelevant. To the extent that the country's past is addressed at all, the discussion commonly reduces "sectarianism," "tribalism," and other shibboleths to intrinsic and timeless features of Iraqi (and wider Arab and Islamic) life.
Two recent contributions on Jadaliyya (www.jadaliyya.com), a blog and e-zine published by the Arab Studies Institute, offer a counterpoint to these prevailing trends. The first is an interview with historian Dina Rizk Khoury related to the publication of her recent book, Iraq in Wartime: Soldiering, Martyrdom, and Resistance (Cambridge, 2013). As Khoury rightly notes, most of the discussion in the U.S. has failed to recognize the fact that Iraqis spent the last twenty-three years of Baathist rule in a state of nearly continuous military conflict. First there was the Iran-Iraq War, then the Iraqi seizure of Kuwait, then the 1991 Gulf War and the ensuing embargo, and finally the most recent American invasion and occupation.
Under such conditions, Khoury argues, war became a matter of normalcy and bureaucratic governance that insinuated violence into the fabric of everyday life in Iraq. At the same time, it created recurring crises and ruptures that reshaped the structures of state authority and citizenship. And it enabled the Iraqi state to fabricate a myth of soldiering and martyrdom that, in the long run, helped to recalibrate Iraqis’ notions of national belonging along ethnic and sectarian lines. Wittingly or unwittingly, the actions of U.S. policymakers after the Gulf War and the 2003 invasion have reinforced Iraq’s societal divisions and the prevalence of violence as a mode of political action.
The second contribution is a commentary from Orit Bashkin, “The Forgotten Protagonists: The Invasion and the Historian.” Bashkin has written extensively on the politics of pluralism (The Other Iraq, Stanford, 2010) and Jewish displacement (New Babylonians, Stanford, 2012) in twentieth-century Iraq, but here she focuses on the present and future conditions of historical scholarship. She contends that our knowledge of the Iraqi past has grown in significant ways over the past decade. (If we take Melani McAlister’s book Epic Encounters seriously, this outcome should hardly surprise us: American cultural, scholarly, and geopolitical interests in the Middle East have long been tightly intertwined.) Such expansion has been facilitated in no small part by the relocation of the Baath Party archives to the U.S. in 2008. This move has allowed professional historians ready access to a crucial corpus of texts on Saddam Hussein’s regime.
Yet Bashkin also worries that the prospects for historical knowledge production will be decidedly less rosy in the years to come. In particular, many of the other materials on which historians of Iraq rely—Ottoman records, collections of poetic and theological writings, museums, archaeological sites, and so on—have been or are being destroyed in the wake of the U.S. invasion.
As a result, it will be considerably more difficult for scholars not simply to reconstruct the Iraqi past, but also to comprehend how Iraqi citizens relate to it. In particular, we will be less able to grasp the imperial and colonial practices, post-independence state policies, and other forces that have forged the country’s current ethnic and religious cleavages. And we will be less able to understand the multiple and competing nostalgias that now proliferate among Iraqi citizens. Such nostalgias include the ambivalent and paradoxical longing for the days of Saddam Hussein, when (in Bashkin’s words) “at least there was some sense of law and order.”
American public discourse is in desperate need of commentary that positions present-day Iraqis as complex actors who both shape and are shaped by the flow of local, regional, and global histories. As Khoury and Bashkin suggest, the current focus on the past ten years is both literally and metaphorically short-sighted. And yet, for a variety of reasons, lengthening our gaze will be easier said than done.
This Weekend Read is Part Two in "The 'E' Word," a continuing series on "elitism" in the United States educational system. Read Part One here.
Peter Thiel has made headlines by offering fellowships to college students who drop out to start a business. One of those Thiel fellows is Dale Stephens, founder of Uncollege. Uncollege advertises itself as radical. At the top of its website, Uncollege cites a line from the movie "Good Will Hunting":
You wasted $150,000 on an education you coulda got for a buck fifty in late charges at the public library.
The Uncollege website is filled with one-liners extolling life without college. It can be and often is sophomoric. And yet, there is something deeply important about what Uncollege is saying. And its message is resonating. Uncollege has been getting quite a bit of attention lately, part of a culture obsessed with college dropouts and increasingly skeptical of the value of college.
At its best, Uncollege does not simply dismiss college as an overpriced institution seeking to preserve worthless knowledge. Rather, Uncollege claims that college has become too anti-intellectual. College, as Uncollege sees it, has become conventional, bureaucratic, and not really dedicated to learning. In short, Uncollege criticizes college for not being enough like what college should be. Hardly radical, Uncollege trades instead in revolutionary rhetoric in the sense that Hannah Arendt means the word revolution: a return to basic values. And in this, Uncollege is of course right that colleges have lost their way.
That, at any rate, is what I find interesting about Uncollege.
To actually read their website and the recent Uncollege Manifesto by Dale Stephens is to encounter something different. The first proposition Uncollege highlights has little to do with education and everything to do with economics: the decreasing value of a college education.
The argument that college has ever less value will seem counterintuitive to those captivated by all the paeans to the value of college and the increased earning potential of college graduates. But Uncollege certainly has a point. Currently about 30% of the U.S. adult population has a degree. But among 20-24 year olds, nearly 40% have a college degree. And the Obama administration aims to raise that number to 60% by 2020. Uncollege calls this Academic Inflation. As more and more people have a college degree, the value of that degree will decrease. It is already the case that many good jobs require a master's degree or a Ph.D. In short, the monetary value of the college degree is diminished and diminishing. This gives us a hint of where Uncollege is coming from.
The Uncollege response to the mainstreaming of college goes by a number of names. At times it is called unschooling. Unschooling is actually a movement begun by the legendary educator John Holt. I recall reading John Holt's How Children Learn while I was in high school—a teacher gave it to me. I was captivated by Holt's claim that school can destroy the innate curiosity of children. I actually wrote my college application essay on Holt's educational philosophy and announced to my future college that my motto was Mark Twain's quip, "I never let school interfere with my education"—which is also a quotation prominently featured in the Uncollege Manifesto.
Unschooling—as opposed to Uncollege—calls for students to make the most of their courses, coupling those courses with independent studies, reading groups, and internships. I regularly advise my students to take fewer not more courses. I tell them to pick one course each semester that most interests them and pursue it intently. Ask the professor for extra reading. Do extra writing. Organize discussion groups about the class with other students. Go to the professor’s office hours weekly and talk about the ideas of the course. Learners must become drivers of their education, not passive consumers. Students should take their pursuit of knowledge out of the classroom, into the dining halls, and into their dorms.
Uncollege adds that unschooling or "hacking your education" can be done outside of schools and universities. With Google, public libraries, and free courses from Stanford, MIT, and Harvard professors proliferating on the web, an enterprising student of any age can compose an educational path today that is more rigorous than anything offered "off-the-shelf" at a college or university. I have no problem with online courses. I hope to take a few. But it is a mistake to think that systems of massive information delivery are the same thing as education.
What Uncollege offers is something more and something less wholesome than simply a call for educational seriousness. It packages that call with the message that college has become boring, conventional, expensive, and unnecessary. In the Uncollege world, only suckers pay for college. The Uncollege Manifesto promotes "Standing out from the other 6.7 billion"; it derides traditional paths, pointing out that "5,000 janitors in the United States have Ph.Ds."; and it cautions, "If you are content with life and education you should probably stop reading… You shall fit in just fine with society and no one will ever require you to be different. Conforming to societal standards is the easy and expected path. You are not alone!"
At the core of the Uncollege message is that dirty and yet oh-so-powerful little word again: "elitism." Later in the Uncollege Manifesto we are told that young people have a choice between "real accomplishments" and the "easy path to mediocrity":
To succeed without a college degree you will have to build your competency and reputation through real world accomplishments. I am warning now: this is not going to be easy. If you want to take the easy path to mediocrity, I encourage you to go to college and join the masses. If you want to stand out from the crowd and change the world, Uncollege is for you!
At one point, the Uncollege Manifesto lauds NPR's "This I Believe" series and commends its short 500-word essays on personal credos. But Uncollege adds a twist: instead of writing what one believes, it advises its devotees to write an essay answering the question "What do you believe about the world that most others reject?" It is not enough simply to believe in something. You must believe in something that sets you apart and makes you different.
Uncollege is at least suggesting that it might be cool, as it has not been for 50 years, to aim for excellence and to yearn to be different. In short, Uncollege is calling on students at elite institutions to boldly grab the ring of elitism and actively seek to stand outside and above the norm. And it is saying that education is no longer elite, but conventional.
It is hard not to see this embrace of elitism as refreshing, although no doubt many will scream the "e" word. I have often lectured to students at elite institutions and confronted them with their fear of elitism. They, or someone, spend upwards of $200,000 on an education, not to mention four years of their lives, and then they reject the entire premise of elitism: that they are different or special. By refusing to see themselves as members of an elite, these students too often refuse to accept the responsibility of elites: to mold and preserve societal values and to assume leadership roles in society.
Leading takes courage. In Arendtian terms, it requires living a public life where one takes risks, acts in surprising ways, and subjects oneself to public judgment. Leading can be uncomfortable and dangerous, and it is often more comfortable and fun to pursue one’s private economic, familial, and personal dreams. Our elite colleges have become too much about preparing students for private success rather than launching young people into lives of public engagement. And part of that failure is a result of a retreat from elitism and a false humility that includes an easy embrace of equality.
That Uncollege is selling its message of excellence and elitism to students at elite institutions of higher learning is simply one sign of how mainstream and conformist many of these elite institutions have become. But what is it that Uncollege offers these elite students who drop out and join Uncollege?
According to its website, Uncollege sells "hackademic camps" and a "gap year program" designed to teach young people how to create their own learning plans. The programs come with opportunities to live abroad and internships. Interestingly, these are all programs offered by most major universities and colleges. The difference is money and time. For $10,000 and in just one year, you get access to mentors, get pushed to write op-eds, and get the "opportunity to work at hot Silicon Valley startups, some of them paid positions." In the gap year program, participants will also "build your personal brand. Speak at a conference, Write an op-ed for a major news outlet. Build a personal website."
None of this sounds radical, intellectual, or all that elitist. On the contrary, it claims that young people have little to learn from educators. Teachers are unimportant, to be replaced by mentors in the world. The claim is that young people lack nothing but information and access in order to compete in the world.
What Uncollege preaches often has little to do with elitism or intellectual growth. It is a deeply practical product being sold as an alternative to the cost of college. In one year and for one-twentieth of what a four-year elite college education costs, a young person can get launched into the practical world of knowledge workers, hooked up with mentors, and set into the world of business, technology, and media. It is a vocational training program for wannabe elites, training people to leap into the creative and technology fields and compete with recent college graduates but without the four years of studying the classics, the debt, and the degree. The elitism that Uncollege is selling is an entrepreneurial elitism measurable by money. By appealing to young students’ sense of superiority, ambition, and risk-taking, Uncollege stands a real chance of attracting ambitious young people more interested in a good job and a hot career than in reading the classics or studying abstract math.
Let’s stipulate this is a good thing. Not everybody should be going to liberal arts colleges. People unmoved by Nietzsche, Einstein, or Titian who are then forced to sit through lectures, cram for exams, and pull all-nighters writing papers cribbed from the internet are wasting their time and money on an elite liberal arts education. What is more, they bring cynicism into an environment that should be fired by idealism and electrified by passion. For those who truly believe that it is important in the world to have people who are enraptured by Sebald and transformed by Arendt, it is deeply important that the liberal arts college remain a bastion apart, a place where youthful exuberance for the beautiful and the true can shine clearly.
We should remember, as well, that reading great books and studying Stravinsky is not an activity limited to the academy. We should welcome a movement like Uncollege that frees people from unwanted courses but nevertheless encourages them to pursue their education on their own. Yes, many of these self-educated strivers will acquire idiosyncratic readings of Heidegger or strange views about patriotism. But opinions, even when they differ, are the essence of a human political system.
One question we desperately need to ask is whether having a self-chosen minority of people trained in the liberal arts is important in modern society. I teach in an avowedly liberal arts institution precisely because I fervently believe that such ideas matter and that having a class of intellectuals whose minds are fired by ideas is essential to any society, especially a democracy.
I sincerely hope that the liberal arts and the humanities persist. As I have written,
The humanities are that space in the university system where power does not have the last word, where truth and beauty as well as insight and eccentricity reign supreme and where young people come into contact with the great traditions, writing, and thinking that have made us who we are today. The humanities introduce us to our ancestors and our forebears and acculturate students into their common heritage. It is in the humanities that we learn to judge the good from the bad and thus where we first encounter the basic moral faculty for making judgments. It is because the humanities teach taste and judgment that they are absolutely essential to politics. It is even likely that the decline of politics today is profoundly connected to the corruption of the humanities.
Our problem today is that college is caught between incompatible demands: to spark imaginations and idealism, and to prepare young people for employment and success. For a long while now, colleges have been doing neither of these things well. Currently, the political pressure on colleges is to cut costs and become more efficient. The unspoken assumption is that colleges must more cheaply and more quickly prepare students for employment. Those of us who care about college as an intellectual endeavor should welcome new alternatives like internet courses, vocational education, and Uncollege that will pull away young people for whom college would have been the wrong choice. Maybe, under the pressure of Uncollege, colleges will return to their core mission of passionately educating young people and preparing them for lives of civic engagement.
I encourage you this weekend to read the Uncollege Manifesto. Let me know what you think.
The aftereffects of Super-storm Sandy are being felt from the beaches to the statehouses. First of all, let's recognize that it was not a hurricane but a freakish combination of storm systems; "super-storm" is more accurate than "hurricane." Whatever it was, it has upended lives, and politics.
The Financial Times reports today that Governor Chris Christie of New Jersey has now joined NY Governor Andrew Cuomo in requesting not only emergency aid to repair the damage caused by the storm, but also preventative money to build dunes, use eminent domain to purchase property, and generally re-engineer the New Jersey coastline.
The political transformation here is lost on few. As the FT writes:
Mr. Christie, a Republican, has previously sounded more skeptical than Mr. Cuomo, a Democrat, about using state powers to dictate how the state was rebuilt. But he said on Wednesday he might take away local towns’ power to grant “easements” to homeowners objecting to new dunes blocking their sea views and would not rule out using government powers to purchase properties it believed were in the wrong place.
“I have to protect the Jersey shore, both as an economic engine and as a cultural engine,” Mr. Christie said.
The desire to take away local powers and give them to states, and to take away state powers and give them to the federal government, is neither a Democratic nor a Republican idea anymore. While the party of the elephant may pay lip service to local governance, it has rarely, if ever, backed that up with action. As is now well known, the federal government has grown as fast if not faster under Republican presidents than it has under Democratic ones.
Hannah Arendt argued that the greatest danger to freedom in the United States was the rise of a large and bureaucratic government. She worried that the sheer size of the country, combined with a rising bureaucracy and technocracy, threatened to swallow the love of freedom she saw as the potent core of American civic life.
Chris Christie and Andrew Cuomo may well be their respective parties’ nominees for President in 2016. They are both deeply popular and have taken a pragmatic and largely centrist approach to governing at a time of financial crisis and natural disaster. And yet, from an Arendtian angle, it is striking that both governors have so internalized the view that problems are to be solved by bureaucrats and technocrats rather than on a local level.
That the bureaucratic approach is so entrenched should not be a surprise. It is both a consequence of and a further spur to the retreat from politics that Hannah Arendt describes. Even Christie's insistence that he must save the Jersey shore as an economic engine shows the near complete victory of economic thinking over politics.
I spoke with my daughter this morning. She is seven. I asked her what she thought of Mitt Romney's speech. She answered: "Both he and President Obama tell lies simply to get elected." Now I know she is to some extent parroting what she hears around our dinner table and the playground. But there is something deeply disheartening in her seven-year-old cynicism. There is a deep sense not only that our politicians lie, but also that the Presidency is a broken institution. That the President is captive of interests special and not-so-special. That the President is trapped in a bureaucracy impervious to change and that the President, whoever he or she may be, cannot really change the perilous course on which our nation is headed. This indeed is the topic of an upcoming conference, "Does the President Matter? A Conference on the American Age of Political Disrepair."
There are myriad sources for this pessimism that one hears from seven-year-olds, college students, and adults. It is markedly different from the idealism that swept the country four years ago personified in Barack Obama. More so than any time I know of, there is a sense of total hopelessness; a feeling that neither party and no potential president can possibly change our course for the better.
To understand this ennui, one must take President Obama's failure seriously. That failure is simple. He became President amidst the perceived failure of the presidency of George W. Bush. The country desperately wanted a change. At the same time, the financial crisis threatened to overwhelm the nation. The President offered hope. He embodied all of our dreams, offering a way forward, out of the excesses of the Bush era and towards a re-enlivening of basic American values of freedom and fairness. There was, in the President's own words, a demand for a "new era of responsibility."
The force of Mitt Romney's Convention speech on Thursday was his expression of disappointment in the President. This strikes me as a non-partisan statement and that is its strength. It is hard to find even the most stalwart of President Obama's supporters who will disagree with this assessment. Where does it come from? Why has Obama disappointed us?
One answer comes from Kathleen Hall Jamieson, one of the leading scholars of presidential rhetoric of our time. Jamieson has analyzed many of President Obama's speeches and has found them deeply wanting. In her 2010 address to the American Political Science Association, she says:
In other words, Barack Obama was never as eloquent as we thought he was. A person matched a moment with rhetoric in a context in which the audience created something heard as eloquence. Widely labeled as eloquent, he creates expectations for his presidency that he cannot satisfy in the presidency barring that he is Abraham Lincoln with the Gettysburg Address or a Second Inaugural in his pocket.
So on the one hand, Obama set expectations for himself too high. That may be, but it is also the case that he became President at a time of great crisis. Maybe it wasn't a Civil War, but the financial crisis does threaten the future of the United States. One fault of the President is that he has continued to describe the financial crisis as a temporary setback, one that will cause some pain but will pass. He has not taken the financial crisis seriously enough, nor named it for what it is: a crisis. By refusing to do so, he has lost the opportunity to become a crisis President.
In a recent post, I discussed Roberto Mangabeira Unger's insistence that we need a wartime President now, without a war: one who rallies the nation to change and sacrifice towards a future goal. What Obama has refused to do is present his vision of where we should go. He speaks about change, but doesn't offer a sense of what that change might be. In Jamieson's analysis, he has failed to deliver a speech that offers us "a digestive sense of what this presidency is going to do."
A digestive statement for Jamieson is something like John F. Kennedy's "Ask not what your country can do for you..." As Jamieson writes, such statements "sound as if they're sound bites until you realize that there's a definition underlying a presidency in those kinds of statements." Kennedy meant something with his question, something he backed up with the idea of the Peace Corps and public service.
The problem with President Obama's rhetoric, and thus his presidency, is that he has yet to find such a digestive statement that defines what he cares about and what he believes this country is about. As Jamieson writes, there is nothing like Kennedy's invocation of the Peace Corps or communal sacrifice that defines or articulates Obama's vision for America. There is no theme of "transformation of generational identity." She writes: "Indeed, I would challenge you to give me a phrase that is memorable at all, that defines who we are and where we're going under this presidency."
Jamieson's critique of the President is harsh. But I think it is accurate. That is the reason why Romney's claim of disappointment strikes me as powerful. Whether Romney offers an alternative is hard to know, since he himself seems to change his opinions and views weekly. That said, President Obama has his work cut out for him. He must show us that he can articulate a response to the disappointment people feel and provide the hope that he can still get the country back on track, even after three years of failing to do so.
The crises the President inherited are not his fault. It is disgusting to hear Paul Ryan and others blame the President for every problem in the United States. And despite Mitt Romney's impressive record, his willingness to change his positions regularly and disavow past achievements raises serious questions about his own ability to lead. And yet, it is undeniable that after three years, the financial crisis is still with us and the political crisis is worse than ever. At some point, the President must take responsibility for his failure to address these crises and offer hope that he has a plan to address them in the future. That is the President's challenge during his convention speech next week: to somehow answer the criticism that after three years, we still don't know what President Obama believes in or how he wants to respond to the financial and political crises he inherited.
In thinking about what the President will say on Thursday, I encourage everyone to read Jamieson's analysis of the past failure of Obama's rhetoric. It is your weekend read. And if you want to think further about the challenge of the president to lead in times of crisis, think about attending the Hannah Arendt Center's upcoming conference, "Does the President Matter?"
Greece voted on Sunday, and the headline account is that the right-of-center moderates won. This was presented as good news, for it means a continued embrace of the Euro and years more of austerity. But there are other lessons to glean from the Greek election.
1. Extremism is rising quickly in Greece. As the Financial Times reports,
The parliament, for the first time in Greek history, will be full of extremists. Besides the neo-nazis and a Stalinist communist party there is Syriza, whose leader is a fan of Mao Zedong, Fidel Castro and Hugo Chávez. How did Greece, the birthplace of democracy, come to have a parliament full of hammers, sickles and swastikas?
2. The Greeks are being asked to suffer for years more, but with little or no hope in sight. Here is what the NY Times reports today, an opinion from one of the most knowledgeable commentators on the Greek crisis:
“Greece will be forced to return to the drachma and devalue, and the default will cause bank runs and money flowing into Germany and the United States as the only viable safe haven bets,” he declared the day before Sunday’s Greek elections, irrespective of which party would win. “Greece will default because there is no other choice regardless of anyone’s politics.”
Almost all of the loans that Greece receives from Europe go directly to pay off the interest on loans to banks in Germany and elsewhere. Greece is neither paying down its debt nor investing in its future. The result is that the Greeks will suffer through years more of austerity and will likely be in no better position in a few years than they are now.
3. The combination of 1 and 2 above does not bode well for European politics in the coming years.
When Hannah Arendt examined the origins of totalitarianism in the 20th century, she began her analysis with the financial speculation and subsequent crash of the 1870s. The ensuing crisis led to a weakening of nation-states and the rise of imperialism, all of which dissolved the traditional political and moral limits that had for centuries formed the structural foundation of European civilization.
As Europe struggles now to overcome national political limits as a response to the financial and banking crisis, it faces once again a political crisis mixed with an economic crisis. Europe is in trouble, and it is not alone. But in Europe, unlike in the U.S. or in Japan, the financial crisis is inextricable from a crisis of nationalism and sovereignty. The potential for nationalist extremism on the one hand is real. On the other hand, there is also the potential for a weakening of national political traditions and the rise of technocratic and bureaucratic rule that, for all its rationalism, weakens moral and ethical restraints.
There is probably no question more debated in the course of the Middle Eastern uprisings than the status of human rights. Anyone familiar with the region knows that the status of human rights in the Middle East is at best obscure. The question of why there was not a “revolution” in Lebanon is a very complex one, tied to the fate of Syria and to the turbulent Lebanese politics since the end of the civil war, and hence cannot be fully answered. In a vague sense it can of course be said that Lebanon is the freest Arab country and that as such it bears a distinctively different character.
While at face value the statement is true, being “more free than” other countries in the Middle East simply understates the problem. Just to outline the basic issues, Lebanon’s record on human rights has been a matter of concern for international watchdogs on the following counts:
Security forces arbitrarily detain and torture political opponents and dissidents without charge; different groups (political, criminal, terrorist, and often a combination of the three) intimidate civilians throughout a country in which the presence of the state is at best weak; freedom of speech and of the press is severely limited by the government; Palestinian refugees are systematically discriminated against; and homosexual intercourse is still considered a crime.
While these issues remain at the level of the state, a number of other issues are prominent in society: abuse of domestic workers; racism (for example, excluding people of color and maids from the beaches); violence against women; and homophobia, which recently extended to a homophobic rant in a newspaper of the prestigious American University of Beirut. The list could go on forever.
The question of gay rights in Lebanon remains somewhat paradoxical. On the one hand, article 534 of the Lebanese Penal Code explicitly prohibits homosexual intercourse, since it “contradicts the laws of nature”, and makes it punishable with prison. On the other hand, Beirut – and Lebanon – has remained, against all odds and for centuries, a safe haven for many people in the Middle East fleeing persecution or looking for a more tolerant lifestyle.
That of course includes gays and lesbians, and it is not uncommon to hear of gay parties held from time to time in Beirut’s celebrated clubs. At the same time, enforcement of the law is sporadic and, like everything in Lebanon, it might happen and it might not; the best one can do is read the horoscope in the morning and pray for good luck. A few pro-LGBT NGOs have been created in the country since the inception of “Hurriyyat Khassa” (Private Liberties) in 2002.
In 2009 the Lebanese LGBT organization Helem launched a groundbreaking report on the legal status of homosexuals across the region, which documented a case in which a Lebanese judge ruled against the use of article 534 to prosecute homosexuals.
It is against the background of this turbulent scenario that Samer Daboul’s film “Out Loud” (2011) came to life, putting together an unusual tale about friendship and love set in postwar Lebanon, in which five friends and a girl set out on a perilous journey to find their place in the world.
Though the plot of the film seems simple, underneath the surface lurks a challenge to the traditional morals and taboos of Lebanese society – homosexuality, the role of women, the troubled past of the war, delinquency, crime, honor – which marks a turning point for Lebanese cinema.
This wouldn’t be so important in addressing the question of rights and freedoms in Lebanon were it not for a documentary, “Out Loud – The Documentary”, released together with the film, that documents in detail the ordeal the director, actors, and crew had to go through in order to complete this film.
Shot in Zahlé, in the mountainous heartland of Lebanon and what the director called “a city and a nation of conservatism and intolerance”, the documentary shows that from the very beginning the cast and crew were met with the angry mobs, insults, and physical injuries that their film itself so vehemently tried to overcome: a commercial film about family violence, gay lovers, and the boundaries of relationships between men and women. A film not about the Lebanon of fifteen or twenty years ago, but about the Lebanon of here and today.
Daboul writes: “Although I grew up in the city in which “Out Loud” was filmed, even I had no idea how difficult it would be to make a movie in a nation plagued by violence, racism, sexism, corruption and a lack of respect for art and human rights.” The purpose of “Out Loud”, of course, wasn’t only to make a movie but to create a school of life, in which the maker, the actors, and the audience could all have a peaceful chance to re-examine their own history and future.
Until very recently, in lieu of a public space, any conflict in Lebanon was solved by means of shooting, kidnapping, and blackmail by armed militias spread throughout the country and acting in the name of the nation.
The wounds have been very slow to heal as is no doubt visible from the contemporary political panorama. Recently, a conversation with an addiction counselor in Beirut revealed the alarming statistics of youth mental illness, alcoholism and drug addiction across all social classes in Lebanon, to which I will devote a different article.
Making films in Lebanon is an arduous process that not only receives no support from the state but is also subject to an enormous censorship bureaucracy intent on making sure that the content of films does not run counter to the religious and political sensibilities of the state. In the absence of strong state powers, the regulations are often malleable and tend to protect the sensibilities of political blocs and religious leaders rather than state security, if any such exists.
The whole idea of censorship of ideas is intimately intertwined with the reality of freedom and rights and with the severe limitations – both physical and intellectual – placed upon the public space.
In the Middle East, censoring the depiction of a gay relationship is an established practice meant to protect public morality; yet the daily news of theft, murder, kidnapping, abuse, rape, and racism requires no such censorship and is consumed by the very same public.
If there is one thing one can learn from Hannah Arendt about freedom of speech, it is, as Roger Berkowitz writes in “Hannah Arendt and Human Rights”:
The only truly human rights, for Arendt, are the rights to act and speak in public. The roots for this Arendtian claim are only fully developed five years later with the publication of The Human Condition. Acting and speaking, she argues, are essential attributes of being human. The human right to speak has, since Aristotle defined man as a being with the capacity to speak and think, been seen to be a “general characteristic of the human condition which no tyrant could take away.”
Similarly, the human right to act in public has been at the essence of human being since Aristotle defined man as a political animal who lives, by definition, in a community with others. It is these rights to speak and act –to be effectual and meaningful in a public world – that, when taken away, threaten the humanity of persons.
While these ideas might seem oversimplified and rather vague in a region “thirsty” for politics, they establish a number of crucial distinctions that must be taken into account in any discussion about human rights. Namely:
1) The failure of human rights is a fundamental fact of the modern age
2) There is a distinction between civil rights and human rights, the latter being what people resort to when the former have failed them
3) The fact that we appear in public and speak our minds to our fellow men ensures that we live our lives amid a plurality of opinions and perspectives, and it is the ultimate indicator of a life lived with dignity.
Even if we have a “right” to a house, to an education, and to citizenship (that is, belonging to a community), if we do not have the right to speak and act in public and to express ourselves (as homosexual, woman, dissident, and what not), we are not being permitted to become fully human. Regardless of the stability of political institutions and the provision of basic needs and security, there is no such thing as a human world – a human community – in the absence of the possibility of appearing in the world as what we truly are.
“Out Loud” – both the film and the documentary – is a testimony to the degree to which the many elements composing the multi-layered landscape of Lebanese society are at tremendous risk of worldlessness, being subject to an authority that relies on violence in lieu of power. Power and violence could not be more opposite.
Hannah Arendt writes in her journals:
Violence is measurable and calculable and, on the other hand, power is imponderable and incalculable. This is what makes power such a terrible force, but it is there precisely that its eminently human character lies. Power always grows in between men, whereas violence can be possessed by one man alone. If power is seized, power itself is destroyed and only violence is left.
It is always the case in dark times that peoples – and also the intellectuals among them – put their entire faith in politics to solve the conflicts that emerge in the absence of plurality and of the right to have rights, but nothing could be more mistaken. Politics cannot save, cannot redeem, cannot change the world. Just like the human community, it is something entirely contingent, fragile and temporary.
That is why no decisions made at the level of government and policy can replace the spontaneity of human action and appearance. It is here that the immense worth of “Out Loud” lies: in enabling a generation that is no longer afraid of hell – for whatever reason – to have a conversation. It is there that the rehabilitation of the public space is at stake, not in building empty parks to museumify a troubled past, as has often been the case in Beirut. In an open conversation, people will continue contesting the legacy and appropriating the memory not as a distant past, but as their own.
The case of Lebanon remains precarious. Lebanon’s clergy has recently united in a call for more censorship; today it was revealed that the security services summon people for interrogation over what they have posted on their Facebook accounts; and HRW has condemned the performance of homosexuality tests on detainees in Lebanon. The last of these at least sparked a debate: a discussion on the topic ensued at the seminar “Test of Shame” held at Université Saint-Joseph in Beirut, and the Lebanese Medical Society held a discussion in which it concluded that those tests are of no scientific value.
In a country like Lebanon, plagued by decades of war and violence, people, as Samer Daboul has said in his film, are more often than not engaged in survival and just that: surviving from one war to another, from one ruler to another, from one abuse to another. As such, society’s responses to the challenges of the times are of an entirely secondary order. But what he has done in his films is what we, those who still have a little faith in Lebanon, should take as a principle: “It’s time to live. Not to survive”.
Political scientists around the country are in a huff here, and here, and here. The reason has little to do with the upcoming election, the vacuum in political leadership, or the state of the world. No, they are upset because Arizona Congressman Jeff Flake has proposed cutting the National Science Foundation's Political Science Program that awards about $11 Million a year to support political science research.
The anger and posturing are extraordinary. And political scientists are rushing to defend the relevance and necessity of their research. Special anger is directed at Congressman Flake's blindness to the import of a proposed $700,000 NSF study to develop "A multi-level, agent-based model for identifying the factors that enable or constrain international climate change negotiations." I have no doubt such a study has uses. But I do wonder if those writing the study could make those uses more accessible. They write:
The goal of our research is to develop a new tool for international climate policy analysis based on the concept of agent-based modeling (ABM). ABM facilitates a more realistic and simultaneous treatment of the diverse forces which influence multi-party decisions. Our model will represent both the international climate negotiation process, as well as the key dynamics of domestic economies relevant to energy and climate change. Some key questions to be explored with our model include: Are there patterns of innovation, adaptation, or climate damages that emerge from an ABM representation of an economy that are obscured by conventional assessments? ...
The authors then provide this graphic to illustrate what they mean:
I don't want to disparage the research, which I am sure will be of interest to a subset of academic political scientists. This research may even, over years, produce insights that gradually merge with the fruits of other research to change and even improve our understanding of how multiparty negotiations impact complicated international topics. And, yes, $700,000 is less than a drop in the bucket in the federal budget. But when looking at the Federal Budget, at a time when students are being forced into bankruptcy because they can't repay student debt, is this where the government should be spending its money?
Congressman Flake, whom I had never heard of before, happens to have a Master's degree in Political Science; he understands that these grants have multiple uses. First, they advance the general knowledge of the social sciences. They also advance the careers of the political scientists who win them. What is more, the vast majority of the funds disbursed go to subsidize the administrative costs at our nation's colleges and universities. And here is where the proposed funding looks mighty suspect.
The researchers proposing this study are from Dartmouth. Dartmouth is a fine school, and also a small one that happens to have an endowment of over $3 billion. As Congressman Flake notes,
According to the NSF Web site, to date, more than $80 million has been awarded to the program’s nearly 200 active projects. Three-quarters of these awards, totaling over $46 million, were directed to universities with endowments greater than $1 billion.
The outrage of the political science community at these cuts is more than misplaced.
We may wonder why political science and not anthropology. I guess the first answer is that Congressman Flake trained as a political scientist and thus is beginning to cut in the areas he knows best. But the bigger issue is that these cuts are just the beginning of a desperately needed rethinking of what the federal government should be spending money on at a time of coming austerity.
The beauty of the American system is the dispersion of power. The federal government does not control all the levers of power or all the money in the USA. If the NSF cannot or does not fund a study, those who feel the need for that study have plenty of other pots to dip their hands into. There are a myriad of foundations and universities that support an enormous amount of social science research. The issue is not that necessary research may not get done, but that there will now be one fewer pot. That is sad for political scientists, but not a tragedy. Indeed, political scientists might ask: How has bureaucratic federal grant-making changed and influenced the nature of political science research?
The governors of two of our largest states gave "State of the State" messages this week. Both were controversial. Governor Andrew Cuomo in New York took on the teachers' union and demanded that teachers be subjected to measures of accountability. Governor Jerry Brown in California dared California to dream big and challenged the state to move forward with the high-speed train connecting Los Angeles and San Francisco. The Arendt Center is focusing its attention on the desperate need to rethink leadership in our time and wondering how we might encourage bold and courageous leadership. Cuomo's speech does just that. Brown's falls short.
Both Brown and Cuomo embraced the mantle of bold leadership. Brown styled himself the daring doer with his call to build a much-debated high-speed train connecting Los Angeles and San Francisco:
Critics of the high-speed rail project abound, as they often do when something of this magnitude is proposed. The Panama Canal was for years thought to be impractical, and Benjamin Disraeli himself said of the Suez Canal, ‘Totally impossible to be carried out.’ The critics were wrong then, and they’re wrong now.
Cuomo, for his part, imagined himself the rampaging reformer taking on the entrenched interests of the unions. He challenged the teachers' union to accept teacher evaluations that would carry meaningful consequences for ineffective teachers, and promised to withhold funding from districts that do not comply. "No evaluation, no money. Period," the Governor said.
I learned my most important lesson in my first year as Governor in the area of public education. I learned that everyone in public education has his or her own lobbyist. Superintendents have lobbyists. Principals have lobbyists. Teachers have lobbyists. School boards have lobbyists. Maintenance personnel have lobbyists. Bus drivers have lobbyists. The only group without a lobbyist? The students.
Well, I learned my lesson. This year, I will take a second job — consider me the lobbyist for the students. I will wage a campaign to put students first, and to remind us that the purpose of public education is to help children grow, not to grow the public education bureaucracy.
I am no fan of union bashing. As an educator myself, I have enormous respect for those who teach. Teachers should be paid more, not less, and good teachers should receive performance bonuses, as is currently happening in Washington, DC. Study after study shows that the biggest factor in whether a child learns is the teacher, not how much money is spent. I think anyone who teaches knows this is true.
Cuomo's decision to take on the education establishment on teacher evaluation is a small step. But it does show a Democratic Governor exerting leadership by opposing a union that is part of his traditional constituency.
He is insisting that the services government provide be better. And he reminds us that government is first and foremost about providing services for citizens, not about providing jobs. If we are going to preserve faith in government, we need to make government work. Cuomo seems intent on doing just that.
Brown, on the other hand, seems entrenched in the failed policies of government. I love fast trains (so does my 2-year-old son). I suffer every week on the slow train between New York City and the Hudson Valley where I teach. As someone who has lived in Europe and marveled at the Chinese, I desperately wish that the United States could build a transportation infrastructure that would work.
Thus I am open to Brown's risk-taking insistence that we build fast trains. That said, he is committing to a project for which most of the funds have not yet been raised and that won't be completed until 2033, and that only under optimistic forecasts.
Who knows if a fast train designed in 2010 will even be useful in 2033? The Erie Canal took 8 years to build. The U.S. built the Panama Canal in less than 10 years. It is one thing for medieval towns to dream big and build a gothic cathedral over decades and centuries, for one has faith that God will still have need of a place of worship. But with technology changing so fast, the $100 billion train could be obsolete before it is completed.
Real leadership requires not simply dreaming big, but acting big. Leadership entails cutting through the bureaucratic red tape that makes it so expensive and time-consuming to take on major public-works projects in this country. Courage would be for a democratic governor to pursue his dream for major infrastructure while at the same time insisting on regulatory and labor reform that would allow the train to be completed in less time than the Erie Canal.
“For the idea of humanity, when purged of all sentimentality, has the very serious consequence that in one form or another men must assume responsibility for all crimes committed by men and that all nations share the onus of evil committed by all others. Shame at being a human being is the purely individual and still non-political expression of this insight”
-- Hannah Arendt, “Organized Guilt and Universal Responsibility”
The twin themes of guilt and responsibility, and the differentiation between them, were key issues for Hannah Arendt in this essay. As Arendt notes, the leader of the SS, Heinrich Himmler, was “neither a Bohemian like Goebbels, nor a sex criminal like Streicher, nor a perverted fanatic like Hitler, nor an adventurer like Goering.” Himmler was an outwardly “respectable” bourgeois who implemented a policy that compelled ordinary bourgeois paterfamilias to act as cogs in the infernal machinery of the “final solution.” While many Germans had strong ideological reasons to participate in the final solution, many a German husband did so without thinking, simply “for the sake of his pension, his life insurance, the security of his wife and children [and thus] was ready to sacrifice his beliefs, his honor, and his human dignity.” By involving millions of ordinary Germans in the Holocaust, and by giving the fullest expression to the human capacity for barbarity, the German bureaucratic machine rendered questionable the traditional juridical differentiations between guilt and responsibility.
What struck Arendt in this early text was that the fact of bureaucratic involvement in the Holocaust did not automatically generate a feeling of guilt, or of responsibility in the participants. Arendt provides a snippet of an interview with an ordinary German, in which, after listing the types of activities he undertook and things he saw in his role as a paymaster at an extermination camp, the officer expresses shock when he learns that the Russians might put him to death. All he can do is break down and ask, “What have I done?”
What is it that allowed this officer to participate in and witness the most horrific crimes and yet remain free of any sense of guilt or responsibility? Arendt’s answer is that he lacked a feeling for the idea of humanity. Arendt is clear about what this idea of humanity is. “Purged of all sentimentality”, the idea of humanity is an awareness of the human capacity for evil. Without an awareness of one’s capacity for evil, a human being is incapable of experiencing shame. Shame, for Arendt, is an important indicator of human ethical awareness.
Rather than corrode the experience of politics, shame provides a model for post-Holocaust politics. Interestingly, Arendt invokes the Jewish prayer of atonement (“Our Father and King, we have sinned before you”) as an example of the kind of response that the Holocaust demands from us. Shame itself is not political – like religious sentiment, shame is non-political because it is the concern of an individual. A future politics, however, must be able to address at a political level – that is, in the space of plurality – what shame accomplishes in the individual. We need to develop a politics that fosters the collective awareness of our capacity for evil, just as shame inspires individuals to face their own responsibility. Politically, the development of shame can allow us to recognize that even if the guilt of a criminal act is limited to a particular individual, we must all be vigilant against our human propensity to participate in evil.
Arendt's comments have resonance for the recently concluded war in Iraq. A New York Times correspondent recently discovered classified testimonies of U.S. soldiers under investigation for committing war crimes in Iraq. These testimonies confirm what critics of the wars in Iraq (and Afghanistan) have long alleged: that the real number of abuses committed by U.S. troops far exceeds those that were eventually disclosed to the public. Some of the (officially classified) images depicting these crimes have been leaked into the media – although they have not found a venue in the U.S. mainstream press. Many of these images are pornographic mementos celebrating the wanton destruction of human lives, much like the assortment of fingers and skulls of Iraqi and Afghani civilians discovered to have been collected as war trophies.
Given the existence of photographic records of these crimes, it seems that the guilty might be clearly identifiable. However, the guilt of those who actively participated in these crimes shades into the responsibility of those who abetted them or, at the very least, turned a blind eye to them. The willingness to equate the presence of these criminals to the statistically unavoidable appearance of “a few bad apples” was perhaps all too readily accepted by those unwilling to undermine their own economic security. Which is another way of saying that the responsibility for the crimes committed has been left unaddressed, even as the guilt of the criminal acts is still being established. “Shame at being a human being” might be for us today, at the conclusion of the war in Iraq, as it seemed to Arendt upon the conclusion of the Second World War, both the affective record of an attempt to face the crimes of the war and the realization of “how great a burden mankind is for man.”

-Manu Samnotra
“The essence of totalitarian government, and perhaps the nature of every bureaucracy, is to make functionaries and mere cogs in the administrative machinery out of men, and thus to dehumanize them. And one can debate long and profitably on the rule of Nobody, which is what the political form known as bureaucracy truly is….we have become very much accustomed by modern psychology and sociology, not to speak of modern bureaucracy, to explaining away the responsibility of the doer for his deed in terms of this or that kind of determinism” (Hannah Arendt, Eichmann in Jerusalem: A Report on the Banality of Evil).
Hannah Arendt here expresses a fundamental problem of the ethics of administration. Can we say that a public administrator or business manager is responsible for actions they have done when they were following the orders given by their superiors? Arendt emphasizes that there is a dangerous inclination to regard such bureaucrats as insignificant ordinary people who are nothing but elements of a system or a larger whole. In this way bureaucratic evildoers may argue—as Eichmann did—that they were only “following orders” and therefore cannot really be held responsible in a legal sense, because as officials they are only administering the rules and regulations of the legal or bureaucratic system. Or, in the case of a corporation, the rules and values that are implemented as strategy from the top.
Yet this passage expresses our fundamental unease with such an understanding of bureaucratic or administrative responsibility. As Arendt says further in her work, we should never submit ourselves to this kind of determinism, in which the doer is not really responsible for his deeds. Eichmann—like every bureaucrat, administrator, or middle manager—is always fundamentally responsible for his actions, even if he is not thinking about what he is doing and even if he is not responsible in a legal or institutional sense. Instead, Arendt suggests, we must think through the fundamental ethical and moral sense of bureaucratic responsibility. This is the existential condition of the bureaucrat, who never becomes totally a part of the system but always must try to consider his or her actions from outside the organization, institution, or bureaucracy. It is from this perspective that he or she is able to make the critical judgment and evaluation of whether such actions would be justifiable from the point of view of universal morality and principles of justice.
-Jacob Dahl Rendtorff