**This post was originally published on June 18, 2012**
By Roger Berkowitz
"It is true that totalitarian domination tried to establish these holes of oblivion into which all deeds, good and evil, would disappear; but just as the Nazis' feverish attempts, from June, 1942, on, to erase all traces of the massacres - through cremation, through burning in open pits, through the use of explosives and flame-throwers and bone-crushing machinery - were doomed to failure, so all efforts to let their opponents "disappear in silent anonymity" were in vain. The holes of oblivion do not exist. Nothing human is that perfect, and there are simply too many people in the world to make oblivion possible. One man will always be left alive to tell the story."
—Hannah Arendt, Eichmann in Jerusalem
Aung San Suu Kyi accepted her Nobel Peace Prize in the early summer of 2012, 21 years after it was awarded. For over two decades since her landslide victory in what was then Burma and is now Myanmar, Suu Kyi has stood fast in her opposition to the military junta ruling her country. The junta has sought to make her disappear, suppress any mention of her, and violently repress all protest and dissent.
“It is obvious: if you do not accept something that assumes the form of ‘destiny,’ you not only change its ‘natural laws’ but also the laws of the enemy playing the role of fate.”
—Hannah Arendt, The Jewish Writings (223)
In 1944, as the Allied armies liberated areas under Nazi control, news about the horrors of the extermination camps inevitably wound its way to the United States. In her interview with Günter Gaus many years later, Hannah Arendt would recount these months as full of devastating shocks that unveiled the fullest extent of what was transpiring in Europe. It was in the midst of the delivery of the news of this carnage, this knowledge of the “fabrication of corpses,” that Arendt continued to perform her role as “something between a historian and political journalist.” This delicate terrain – somewhere “between silence and speechlessness” – is what Arendt had to traverse as she informed and provoked her audience into action.
"The end of the old is not necessarily the beginning of the new."
—Hannah Arendt, The Life of the Mind
This is a simple enough statement, and yet it masks a profound truth, one that we often overlook out of the very human tendency to seek consistency and connection, to make order out of the chaos of reality, and to ignore the anomalous nature of that which lies in between whatever phenomena we are attending to.
Perhaps the clearest example of this has been what proved to be the unfounded optimism that greeted the overthrow of autocratic regimes through American intervention in Afghanistan and Iraq, and the native-born movements known collectively as the Arab Spring. It is one thing to disrupt the status quo, to overthrow an unpopular and undemocratic regime. But that end does not necessarily lead to the establishment of a new, beneficent and participatory political structure. We see this time and time again, now in Putin's Russia, a century ago with the Russian Revolution, and over two centuries ago with the French Revolution.
Of course, it has long been understood that oftentimes, to begin something new, we first have to put an end to something old. The popular saying that you can't make an omelet without breaking a few eggs reflects this understanding, although it is certainly not the case that breaking eggs will inevitably and automatically lead to the creation of an omelet. Breaking eggs is a necessary but not sufficient cause of omelets, and while this is not an example of the classic chicken and egg problem, I think we can imagine that the chicken might have something to say on the matter of breaking eggs. Certainly, the chicken would have a different view on what is signified or ought to be signified by the end of the old, meaning the end of the egg shell, insofar as you can't make a chicken without it first breaking out of the egg that it took form within.
So, whether you take the chicken's point of view, or adopt the perspective of the omelet, looking backwards and reverse engineering the current situation, it is only natural to view the beginning of the new as an effect brought into being by the end of the old, to assume or make an inference based on sequencing in time, to posit a causal relationship and commit the logical fallacy of post hoc ergo propter hoc, if for no other reason than the force of narrative logic that compels us to create a coherent storyline. In this respect, Arendt points to the foundation tales of ancient Israel and Rome:
We have the Biblical story of the exodus of Israeli tribes from Egypt, which preceded the Mosaic legislation constituting the Hebrew people, and Virgil's story of the wanderings of Aeneas, which led to the foundation of Rome—"dum conderet urbem," as Virgil defines the content of his great poem even in its first lines. Both legends begin with an act of liberation, the flight from oppression and slavery in Egypt and the flight from burning Troy (that is, from annihilation); and in both instances this act is told from the perspective of a new freedom, the conquest of a new "promised land" that offers more than Egypt's fleshpots and the foundation of a new City that is prepared for by a war destined to undo the Trojan war, so that the order of events as laid down by Homer could be reversed.
Fast forward to the American Revolution, and we find that the founders of the republic, mindful of the uniqueness of their undertaking, searched for archetypes in the ancient world. And what they found in the narratives of Exodus and the Aeneid was that the act of liberation and the establishment of a new freedom are two events, not one, and in effect subject to Alfred Korzybski's non-Aristotelian Principle of Non-Identity. The success of the formation of the American republic can be attributed to the founders' awareness of the chasm that exists between the closing of one era and the opening of a new age, of their separation in time and space:
No doubt if we read these legends as tales, there is a world of difference between the aimless desperate wanderings of the Israeli tribes in the desert after the Exodus and the marvelously colorful tales of the adventures of Aeneas and his fellow Trojans; but to the men of action of later generations who ransacked the archives of antiquity for paradigms to guide their own intentions, this was not decisive. What was decisive was that there was a hiatus between disaster and salvation, between liberation from the old order and the new freedom, embodied in a novus ordo saeclorum, a "new world order of the ages" with whose rise the world had structurally changed.
I find Arendt's use of the term hiatus interesting, given that in contemporary American culture it has largely been appropriated by the television industry to refer to a series that has been taken off the air for a period of time, but not cancelled. The typical phrase is on hiatus, meaning on a break or on vacation. But Arendt reminds us that such connotations only scratch the surface of the word's broader meanings. The Latin word hiatus refers to an opening or rupture, a physical break or missing part or link in a concrete material object. As such, it becomes a spatial metaphor when applied to an interruption or break in time, a usage introduced in the 17th century. Interestingly, this coincides with the period in English history known as the Interregnum, which began in 1649 with the execution of King Charles I, led to Oliver Cromwell's installation as Lord Protector, and ended after Cromwell's death with the Restoration of the monarchy under Charles II, son of Charles I. While in some ways anticipating the American Revolution, the English Civil War followed an older pattern, one that Mircea Eliade referred to as the myth of eternal return, a circular movement rather than the linear progression of history and cause-effect relations.
The idea of moving forward, of progress, requires a future-orientation that only comes into being in the modern age, by which I mean the era that followed the printing revolution associated with Johannes Gutenberg (I discuss this in my book, On the Binding Biases of Time and Other Essays on General Semantics and Media Ecology). But that same print culture also gave rise to modern science, and with it the monopoly granted to efficient causality, cause-effect relations, to the exclusion in particular of final and formal cause (see Marshall and Eric McLuhan's Media and Formal Cause). This is the basis of the Newtonian universe in which every action has an equal and opposite reaction, and every effect can be linked back in a causal chain to another event that preceded it and brought it into being. The view of time as continuous and connected can be traced back to the introduction of the mechanical clock in the 13th century, but was solidified through the printing of calendars and time lines, and the same effect was created in spatial terms by the reproduction of maps, and the use of spatial grids, e.g., the Mercator projection.
And while the invention of history, as a written narrative concerning linear progression over time, can be traced back to the ancient Israelites and the story of the exodus, that story incorporates the idea of a hiatus in overlapping structures:
A1. Joseph is the golden boy, the son favored by his father Jacob, earning him the enmity of his brothers
A2. he is sold into slavery by them, winds up in Egypt as a slave and then is falsely accused and imprisoned
A3. by virtue of his ability to interpret dreams he gains his freedom and rises to the position of Pharaoh's prime minister
B1. Joseph welcomes his brothers and father, and the House of Israel goes down to Egypt to sojourn due to famine in the land of Canaan
B2. their descendants are enslaved, oppressed, and persecuted
B3. Moses is chosen to confront Pharaoh, liberate the Israelites, and lead them on their journey through the desert
C1. the Israelites are freed from bondage and escape from Egypt
C2. the revelation at Sinai fully establishes their covenant with God
C3. after many trials, they return to the Promised Land
It can be clearly seen in these narrative structures that the role of the hiatus, in ritual terms, is that of the rite of passage, the initiation period that marks, in symbolic fashion, the change in status, the transformation from one social role or state of being to another (e.g., child to adult, outsider to member of the group). This is not to discount the role that actual trials, tests, and other hardships may play in the transition, as they serve to establish or reinforce, psychologically and sometimes physically, the value and reality of the transformation.
In mythic terms, this structure has become known as the hero's journey or hero's adventure, made famous by Joseph Campbell in The Hero with a Thousand Faces, and also known as the monomyth, because he claimed that the same basic structure is universal to all cultures. The basic structure he identified consists of three main elements: separation (e.g., the hero leaves home), initiation (e.g., the hero enters another realm, experiences tests and trials, leading to the bestowing of gifts, abilities, and/or a new status), and return (the hero returns to utilize what he has gained from the initiation and save the day, restoring the status quo or establishing a new status quo).
Understanding the mythic, non-rational element of initiation is the key to recognizing the role of the hiatus, and in the modern era this meant using rationality to realize the limits of rationality. With this in mind, let me return to the quote I began this essay with, but now provide the larger context of the entire paragraph:
The legendary hiatus between a no-more and a not-yet clearly indicated that freedom would not be the automatic result of liberation, that the end of the old is not necessarily the beginning of the new, that the notion of an all-powerful time continuum is an illusion. Tales of a transitory period—from bondage to freedom, from disaster to salvation—were all the more appealing because the legends chiefly concerned the deeds of great leaders, persons of world-historic significance who appeared on the stage of history precisely during such gaps of historical time. All those who, pressed by exterior circumstances or motivated by radical utopian thought-trains, were not satisfied to change the world by the gradual reform of an old order (and this rejection of the gradual was precisely what transformed the men of action of the eighteenth century, the first century of a fully secularized intellectual elite, into the men of the revolutions) were almost logically forced to accept the possibility of a hiatus in the continuous flow of temporal sequence.
Note the concept of gaps in historical time, which brings to mind Eliade's distinction between the sacred and the profane. Historical time is a form of profane time, and sacred time represents a gap or break in that linear progression, one that takes us outside of history, connecting us instead in an eternal return to the time associated with a moment of creation or foundation. The revelation at Sinai is an example of such a time, and accordingly Deuteronomy states that all of the members of the House of Israel were present at that event, not just those alive at that time, but also those not present, the generations of the future. This statement is included in the liturgy of the Passover Seder, which is a ritual reenactment of the exodus and revelation, which in turn becomes part of the reenactment of the Passion in Christianity, one of the primary examples of Campbell's monomyth.
Arendt's hiatus, then, represents a rupture between two different states or stages, an interruption, a disruption linked to an eruption. In the parlance of chaos and complexity theory, it is a bifurcation point. Arendt's contemporary Peter Drucker, a philosopher who pioneered the scholarly study of business and management, characterized the contemporary zeitgeist in the title of his 1969 book: The Age of Discontinuity. It is an age in which Newtonian physics was replaced by Einstein's relativity and Heisenberg's uncertainty, the phrase quantum leap becoming a metaphor drawn from subatomic physics for all forms of discontinuity. It is an age in which the fixed point of view that gave us perspective in art and the essay and novel in literature yielded to Cubism and subsequent forms of modern art, and to stream of consciousness in writing.
Beginning in the 19th century, photography gave us the frozen, discontinuous moment, and the technique of montage in the motion picture gave us a series of shots and scenes whose connections have to be filled in by the audience. Telegraphy gave us the instantaneous transmission of messages that took them out of their natural context, the subject of the famous comment by Henry David Thoreau that connecting Maine and Texas to one another will not guarantee that they have anything sensible to share with each other. The wire services gave us the nonlinear, inverted pyramid style of newspaper reporting, which also was associated with the nonlinear look of the newspaper front page, a form that Marshall McLuhan referred to as a mosaic. Neil Postman criticized television's role in decontextualizing public discourse in Amusing Ourselves to Death, where he used the phrase, "in the context of no context," and I discuss this as well in my recently published follow-up to his work, Amazing Ourselves to Death.
The concept of the hiatus comes naturally to the premodern mind, schooled by myth and ritual within the context of oral culture. That same concept is repressed, in turn, by the modern mind, shaped by the linearity and rationality of literacy and typography. As the modern mind yields to a new, postmodern alternative, one that emerges out of the electronic media environment, we see the return of the repressed in the idea of the jump cut writ large.
There is psychological satisfaction in the deterministic view of history as the inevitable result of cause-effect relations in the Newtonian sense, as this provides a sense of closure and coherence consistent with the typographic mindset. And there is similar satisfaction in the view of history as entirely consisting of human decisions that are the product of free will, of human agency unfettered by outside constraints, which is also consistent with the individualism that emerges out of the literate mindset and print culture, and with a social rather than physical version of efficient causality. What we are only beginning to come to terms with is the understanding of formal causality, as discussed by Marshall and Eric McLuhan in Media and Formal Cause. What formal causality suggests is that history has a tendency to follow certain patterns, patterns that connect one state or stage to another, patterns that repeat again and again over time. This is the notion that history repeats itself, meaning that historical events tend to fall into certain patterns (repetition being the precondition for the existence of patterns), and that the goal, as McLuhan articulated in Understanding Media, is pattern recognition. This helps to clarify the famous remark by George Santayana, "those who cannot remember the past are condemned to repeat it." In other words, those who are blind to patterns will find it difficult to break out of them.
Campbell engages in pattern recognition in his identification of the heroic monomyth, as Arendt does in her discussion of the historical hiatus. Recognizing these patterns is the first step in escaping them, and may even allow for the possibility of taking control and influencing them. This also means understanding that the tendency for phenomena to fall into patterns is a powerful one. It is a force akin to entropy, and perhaps a result of that very statistical tendency that is expressed by the Second Law of Thermodynamics, as Terrence Deacon argues in Incomplete Nature. It follows that there are only certain points in history, certain moments, certain bifurcation points, when it is possible to make a difference, or to make a difference that makes a difference, to use Gregory Bateson's formulation, and change the course of history. The moment of transition, of initiation, the hiatus, represents such a moment.
McLuhan's concept of medium goes far beyond the ordinary sense of the word, as he relates it to the idea of gaps and intervals, the ground that surrounds the figure, and explains that his philosophy of media is not about transportation (of information), but transformation. The medium is the hiatus.
The particular pattern that has come to the fore in our time is that of the network, whether it's the decentralized computer network and the internet as the network of networks, or the highly centralized and hierarchical broadcast network, or the interpersonal network associated with Stanley Milgram's research (popularly known as six degrees of separation), or the neural networks that define brain structure and function, or social networking sites such as Facebook and Twitter, etc. And it is not the nodes, which may be considered the content of the network, that defines the network, but the links that connect them, which function as the network medium, and which, in the systems view favored by Bateson, provide the structure for the network system, the interaction or relationship between the nodes. What matters is not the nodes, it's the modes.
Hiatus and link may seem like polar opposites, the break and the bridge, but they are two sides of the same coin, the medium that goes between, simultaneously separating and connecting. The boundary divides the system from its environment, allowing the system to maintain its identity as separate and distinct from the environment, keeping it from being absorbed by the environment. But the membrane also serves as a filter, engaged in the process of abstracting, to use Korzybski's favored term, letting through or bringing material, energy, and information from the environment into the system so that the system can maintain itself and survive. The boundary keeps the system in touch with its situation, keeps it contextualized within its environment.
The systems view emphasizes space over time, as does ecology, but the concept of the hiatus as a temporal interruption suggests an association with evolution as well. Darwin's view of evolution as continuous was consistent with Newtonian physics. The more recent modification of evolutionary theory put forth by Stephen Jay Gould, known as punctuated equilibrium, suggests that evolution occurs in fits and starts, in relatively rare and isolated periods of major change, surrounded by long periods of relative stability and stasis. Not surprisingly, this particular conception of discontinuity was introduced during the television era, in the early 1970s, just a few years after the publication of Peter Drucker's The Age of Discontinuity.
When you consider the extraordinary changes that we are experiencing in our time, technologically and ecologically (the latter underlined by the recent news concerning the United Nations' latest report on global warming), what we need is an understanding of the concept of change, a way to study the patterns of change: patterns that exist and persist across different levels, the micro and the macro, the physical, chemical, biological, psychological, and social, what Bateson referred to as metapatterns, the subject of further elaboration by biologist Tyler Volk in his book on the subject. Paul Watzlawick argued for the need to study change in and of itself in a little book co-authored with John H. Weakland and Richard Fisch, entitled Change: Principles of Problem Formation and Problem Resolution, which considers the problem from the point of view of psychotherapy. Arendt gives us a philosophical entrée into the problem by introducing the pattern of the hiatus, the moment of discontinuity that leads to change, and possibly a moment in which we, as human agents, can have an influence on the direction of that change.
To have such an influence, we do need to have that break, to find a space and more importantly a time to pause and reflect, to evaluate and formulate. Arendt famously emphasizes the importance of thinking in and of itself, the importance not of the content of thought alone, but of the act of thinking, the medium of thinking, which requires an opening, a time out, a respite from the onslaught of 24/7/365. This underscores the value of sacred time, and it follows that it is no accident that during the period of initiation in the story of the exodus there is the revelation at Sinai and the gift of divine law, the Torah or Law, chief among whose commandments are the Ten Commandments, which include the fourth commandment, the one presented in greatest detail: to observe the Sabbath day. This premodern ritual requires us to make the hiatus a regular part of our lives, to break the continuity of profane time on a weekly basis. From that foundation, other commandments establish the idea of the sabbatical year, and the sabbatical of sabbaticals, or jubilee year. Whether it's a Sabbath mandated by religious observance, or a new movement to engage in a Technology Sabbath, the hiatus functions as the response to the homogenization of time that was associated with efficient causality and literate linearity, and that continues to intensify in conjunction with the technological imperative of efficiency über alles.
To return one last time to the quote that I began with, the end of the old is not necessarily the beginning of the new because there may not be a new beginning at all, there may not be anything new to take the place of the old. The end of the old may be just that, the end, period, the end of it all. The presence of a hiatus to follow the end of the old serves as a promise that something new will begin to take its place after the hiatus is over. And the presence of a hiatus in our lives, individually and collectively, may also serve as a promise that we will not inevitably rush towards an end of the old that will also be an end of it all, that we will be able to find the opening to begin something new, that we will be able to make the transition to something better, that both survival and progress are possible, through an understanding of the processes of continuity and change.
“Having said this, I must deal immediately and at some length with the question of violence.”
“Sometimes ‘violence is the only way of ensuring a hearing for moderation.’”
—Hannah Arendt citing Conor Cruise O’Brien, On Violence
Nelson Mandela gave one of the great speeches of the 20th century at his trial before the South African Supreme Court in Pretoria in 1964. Mandela’s speech is best remembered for the ringing conclusion in which he articulates the ideals of free and democratic life as that “ideal for which I am prepared to die.” Eight months after Martin Luther King Jr. delivered his “I Have a Dream” speech on the Mall in Washington, DC, Mandela ended his own speech before being sentenced to life imprisonment with these words:
During my lifetime I have dedicated myself to this struggle of the African people. I have fought against white domination, and I have fought against black domination. I have cherished the ideal of a democratic and free society in which all persons live together in harmony and with equal opportunities. It is an ideal which I hope to live for and to achieve. But if needs be, it is an ideal for which I am prepared to die.
Mandela died yesterday and he will be rightly remembered for both his vision and his courage.
I want to focus on another aspect of his legacy, however, the question of violence. Often forgotten by those who quote only the final paragraph of Mandela’s speech, much of his speech is an exploration of the need for and proper revolutionary use of violence. Indeed, after a brief introduction in which Mandela reminds the Court that he holds a bachelor’s degree, that he is a lawyer, and that he was raised to revere his tribal forebears who fought in defense of their fatherland, he comes to the question of violence. “Having said this,” he says, “I must deal immediately and at some length with the question of violence.”
What follows is one of the most thoughtful and subtle reflections on the strategic and moral complications of violence we have. It is worth citing at length, and even this summary barely does Mandela justice. But here is Mandela’s argument for a limited campaign of violence in response to the violence of the South African state:
I do not, however, deny that I planned sabotage. I did not plan it in a spirit of recklessness, nor because I have any love of violence. I planned it as a result of a calm and sober assessment of the political situation that had arisen after many years of tyranny, exploitation, and oppression of my people by the whites.
I admit immediately that I was one of the persons who helped to form Umkhonto we Sizwe, and that I played a prominent role in its affairs until I was arrested in August 1962….
In order to explain these matters properly, I will have to explain what Umkhonto set out to achieve; what methods it prescribed for the achievement of these objects, and why these methods were chosen. I will also have to explain how I became involved in the activities of these organisations.
I deny that Umkhonto was responsible for a number of acts which clearly fell outside the policy of the organisation, and which have been charged in the indictment against us. I do not know what justification there was for these acts, but to demonstrate that they could not have been authorised by Umkhonto, I want to refer briefly to the roots and policy of the organisation.
I have already mentioned that I was one of the persons who helped to form Umkhonto. I, and the others who started the organisation, did so for two reasons. Firstly, we believed that as a result of Government policy, violence by the African people had become inevitable, and that unless responsible leadership was given to canalise and control the feelings of our people, there would be outbreaks of terrorism which would produce an intensity of bitterness and hostility between the various races of this country which is not produced even by war. Secondly, we felt that without violence there would be no way open to the African people to succeed in their struggle against the principle of white supremacy. All lawful modes of expressing opposition to this principle had been closed by legislation, and we were placed in a position in which we had either to accept a permanent state of inferiority, or to defy the government. We chose to defy the law. We first broke the law in a way which avoided any recourse to violence; when this form was legislated against, and then the government resorted to a show of force to crush opposition to its policies, only then did we decide to answer violence with violence.
But the violence which we chose to adopt was not terrorism….
I must return to June 1961. What were we, the leaders of our people, to do? Were we to give in to the show of force and the implied threat against future action, or were we to fight it and, if so, how?
We had no doubt that we had to continue the fight. Anything else would have been abject surrender. Our problem was not whether to fight, but was how to continue the fight. We of the ANC had always stood for a non-racial democracy, and we shrank from any action which might drive the races further apart than they already were. But the hard facts were that fifty years of non-violence had brought the African people nothing but more and more repressive legislation, and fewer and fewer rights. It may not be easy for this court to understand, but it is a fact that for a long time the people had been talking of violence - of the day when they would fight the white man and win back their country - and we, the leaders of the ANC, had nevertheless always prevailed upon them to avoid violence and to pursue peaceful methods. When some of us discussed this in May and June of 1961, it could not be denied that our policy to achieve a non-racial state by non-violence had achieved nothing, and that our followers were beginning to lose confidence in this policy and were developing disturbing ideas of terrorism.
It must not be forgotten that by this time violence had, in fact, become a feature of the South African political scene. There had been violence in 1957 when the women of Zeerust were ordered to carry passes; there was violence in 1958 with the enforcement of cattle culling in Sekhukhuniland; there was violence in 1959 when the people of Cato Manor protested against pass raids; there was violence in 1960 when the government attempted to impose Bantu authorities in Pondoland. Thirty-nine Africans died in these disturbances. In 1961 there had been riots in Warmbaths, and all this time the Transkei had been a seething mass of unrest. Each disturbance pointed clearly to the inevitable growth among Africans of the belief that violence was the only way out - it showed that a government which uses force to maintain its rule teaches the oppressed to use force to oppose it. Already small groups had arisen in the urban areas and were spontaneously making plans for violent forms of political struggle. There now arose a danger that these groups would adopt terrorism against Africans, as well as whites, if not properly directed. Particularly disturbing was the type of violence engendered in places such as Zeerust, Sekhukhuniland, and Pondoland amongst Africans. It was increasingly taking the form, not of struggle against the government - though this is what prompted it - but of civil strife amongst themselves, conducted in such a way that it could not hope to achieve anything other than a loss of life and bitterness.
At the beginning of June 1961, after a long and anxious assessment of the South African situation, I, and some colleagues, came to the conclusion that as violence in this country was inevitable, it would be unrealistic and wrong for African leaders to continue preaching peace and non-violence at a time when the government met our peaceful demands with force.
This conclusion was not easily arrived at. It was only when all else had failed, when all channels of peaceful protest had been barred to us, that the decision was made to embark on violent forms of political struggle, and to form Umkhonto we Sizwe. We did so not because we desired such a course, but solely because the government had left us with no other choice. In the Manifesto of Umkhonto published on 16 December 1961, which is exhibit AD, we said:
"The time comes in the life of any nation when there remain only two choices - submit or fight. That time has now come to South Africa. We shall not submit and we have no choice but to hit back by all means in our power in defence of our people, our future, and our freedom."
This was our feeling in June of 1961 when we decided to press for a change in the policy of the National Liberation Movement. I can only say that I felt morally obliged to do what I did….
Four forms of violence were possible. There is sabotage, there is guerrilla warfare, there is terrorism, and there is open revolution. We chose to adopt the first method and to exhaust it before taking any other decision.
In the light of our political background the choice was a logical one. Sabotage did not involve loss of life, and it offered the best hope for future race relations. Bitterness would be kept to a minimum and, if the policy bore fruit, democratic government could become a reality. This is what we felt at the time, and this is what we said in our manifesto (exhibit AD):
"We of Umkhonto we Sizwe have always sought to achieve liberation without bloodshed and civil clash. We hope, even at this late hour, that our first actions will awaken everyone to a realisation of the disastrous situation to which the nationalist policy is leading. We hope that we will bring the government and its supporters to their senses before it is too late, so that both the government and its policies can be changed before matters reach the desperate state of civil war."
The initial plan was based on a careful analysis of the political and economic situation of our country. We believed that South Africa depended to a large extent on foreign capital and foreign trade. We felt that planned destruction of power plants, and interference with rail and telephone communications, would tend to scare away capital from the country, make it more difficult for goods from the industrial areas to reach the seaports on schedule, and would in the long run be a heavy drain on the economic life of the country, thus compelling the voters of the country to reconsider their position.
Attacks on the economic life-lines of the country were to be linked with sabotage on government buildings and other symbols of apartheid. These attacks would serve as a source of inspiration to our people. In addition, they would provide an outlet for those people who were urging the adoption of violent methods and would enable us to give concrete proof to our followers that we had adopted a stronger line and were fighting back against government violence.
In addition, if mass action were successfully organised, and mass reprisals taken, we felt that sympathy for our cause would be roused in other countries, and that greater pressure would be brought to bear on the South African government.
This then was the plan. Umkhonto was to perform sabotage, and strict instructions were given to its members right from the start, that on no account were they to injure or kill people in planning or carrying out operations.
It is strange today to hear politicians of all stripes praising Mandela for his statesmanship when they, for years, condemned his embrace of violence and arrested those in the U.S. who—following Mandela’s own tactics—chained themselves to fences to oppose the U.S. government’s support of the apartheid regime in South Africa. It is true that Mandela lived numerous lives. As a young man, he was part of a royal tribal household. As a young adult, he was a lawyer. Later he was a non-violent leader. Still later, he turned to a limited and rationalized use of violence. For 27 years he paid for his crimes in prison and then emerged a statesman, one committed to reconciliation, freedom, and multicultural democracy. Finally, when he stepped down from the Presidency after one term, he helped assure South Africa’s democratic future and became an elder statesman in the truest sense of the word.
To understand the complexities of Mandela’s limited turn to sabotage (as opposed to terrorism in his words), it is helpful to consider Hannah Arendt’s essay On Violence, originally published in the New York Review of Books in 1969. Violence, writes Arendt, is at root instrumental. It is a means to an end. And sometimes, violence can yield positive and even moderate results, Arendt claims, citing Conor Cruise O’Brien: “Sometimes ‘violence is the only way of ensuring a hearing for moderation.’”
As did Mandela, Arendt well understood that violence can be a useful and important means in struggles for justice. She points to numerous examples where violence has worked to promote justice: “France would not have received the most radical bill since Napoleon to change its antiquated education system if the French students had not rioted; if it had not been for the riots of the spring term, no one at Columbia University would have dreamed of accepting reforms; and it is probably quite true that in West Germany the existence of ‘dissenting minorities is not even noticed unless they engage in provocation.’” Violence can, and often does, make injustice visible to a citizenry that is blind to it. Because violence can “serve to dramatize grievances and bring them to public attention,” violence can serve the cause of reform and also of justice.
We must take Arendt and Mandela’s point seriously. Violence is a means to an end. Violence can work. “No doubt, ‘violence pays.’” Violence can yield results.
But Arendt is not an advocate for violence. Violence can pay, she writes, but “the trouble is that it pays indiscriminately.” And this is where the use of violence becomes dangerous.
The danger in using violence as a means is that when “applied to human affairs,” violence as a means has a tendency to overwhelm whatever good ends towards which it aims. Too often, violence will lead those in power to respond with sham reforms designed to end violence. They will seek the path of least resistance, instituting reforms that are often the wrong reforms. Arendt offers the example of the way that the student university protests of the 60s led to new courses in Swahili and “admitting students without the necessary qualifications” instead of real reform of the entire educational system.
What is more, violence—precisely because it is effective—has a tendency to promote more violence in response. If violence in the name of justice doesn’t achieve its ends quickly, the likely result is not justice, but more violence: “The practice of violence, like all action, changes the world, but the most probable change is to a more violent world.”
To read Mandela’s speech from 1964 is to encounter someone who thought through the promise and danger of violence in precisely the rational way that Arendt calls for. The question we should ask is whether the turn to violence by the ANC in South Africa—even the limited, rational, and property-oriented violence Mandela embraced—promoted or retarded the cause of reform. Was it the ANC’s violence that led, 30 years later, to the reform of South Africa? Or was it Mandela’s dignity in prison and his emergence as a force for peace and reconciliation? Let’s celebrate Mandela as a hero this week. But let’s also ask: Was he right about violence?
My girlfriend and I walked by a clothing storefront and noticed the print on some of the t-shirts at the lower right corner of the window and went in. She had mentioned this Imaginary Foundation (IF) before. They make print t-shirts.
I went to school at an expensive liberal arts college in the Hudson Valley—everyone there makes print t-shirts. It is like a business you start as a college sophomore as a way to convince yourself that you are a ‘creative entrepreneur’ before you enter the corporate world or, alternatively, the not-for-profit world (as a penance for inherited culture and comfort).
Often, I cannot stand them—the print t-shirts. There is something out of shape about them, as if the juxtaposition of body/shirt/image sets askew some intrinsic agreement in the marriage of fashion and identity. And yet, the IF designs spoke to me. There is something dreamy and yet sincere about these prints. If le petit prince were looking for a print t-shirt, he would buy one of these.
It just so happened that the owner of the company was visiting this Seattle distributor and was in the store. He was awkward, skittish and European. I liked him, and before we left I told him that I blog for a thinking and humanities institute out east and may want to write about his brand. That’s how I got into the Imaginary Foundation.
The shirts are not exactly ‘pretty’ or ‘fashionable’; rather, their attraction is a gesture beyond themselves -- a rare feat in a culture that positions branding as the apex of success. I’ll describe one shirt, and if interested you can invest your own time in the Imaginary Foundation.
The “Being There” shirt has three anonymous human heads (one in a cloud suit, one in a water suit, and one in a fire suit). The heads are in peripheral view and are aligned, with a slight skew (allowing us a view of all three faces), as they break through a wall, the veil of the universe.
Other shirts handle concepts of psychosis and love (“Love Science”), science and discovery in a reach towards heaven (“Reach”), and other such concepts widely considered esoteric or cliché within the lens of our popular culture. But we no longer understand what a ‘cliché’ is. I have long held the view that a cliché is a truth, or a point of interest and perspective insight, that has simply been worn out by overexposure. But who has worn it out? How have we taken the liberty and quiet pleasure of the private sphere (the realms of reflection, contemplation, and meditation as they were thought of in Greek terms) out of our living cycle, our consciousness, our daily existence? Why is the call for private contemplation no longer a necessity of existence? It seems we should have more time than ever for such practices. So many of our daily chores, our basic needs, are met through the economic matrix. I no longer have to chop wood for warmth, hunt a boar for food, trek down to the river simply for water, etc. Why shouldn’t I spend more time in private contemplation, or even public conversation, on these more subtle topics of human necessity? Why shouldn’t I be making something in an effort to communicate those private necessities? The actualization of the humanist requires space for such a practice. And yet, anything that requires a slowing down, a calling for the work of the mind and private reasoning, is now, quite often immediately, labeled a cliché.
In The Human Condition, Arendt writes: “The emancipation of labor and the concomitant emancipation of the laboring classes from oppression and exploitation certainly means progress in the direction of non-violence. It is much less certain that it was also progress in the direction of freedom.” She is not saying that the laboring classes should not have been emancipated. Rather, she is saying that the humanist goal has been blurred by some glitch. Instead of moving towards freedom from wasteful labor (a waste of human power -- physical, mental, spiritual), we have emancipated labor. Most of us have become imprisoned in a non-sustainable cycle that, for the continuation of its forward motion, requires ever-increasing consumption and waste. This waste can be seen in terms of power. The core power of the human psyche originates in the liberty of free private thoughts—a psychological space for contemplation, a mapping of one’s stillness that is only possible in the acquisition of free time. Free time is a result of freedom from labor’s necessity. What Arendt’s thoughts gesture towards is that the set of basic necessities we have been freed from has been replaced by another, far more complicated and disguised set—the necessity to perpetuate a system that is moving much faster than us, a necessity to consume and continue consuming. To be ‘a part of’ is, today, to be a consumer—to take one’s place in the labor of waste.
Oh right, I wanted to tell you about a product...
“IF” is a creative project. It gains the viewer’s attention and borrows the imagination. This is a beginning. It does not steal, it borrows. It suggests the prospect of resonance rather than ownership.
I checked out the company website. The “about” page describes the development of the Imaginary Foundation: “a think tank from Switzerland that does experimental research on new ways of thinking and the power of the imagination. They hold dear a belief in human potential and seek progress in all directions.” The page is dotted with black and white images from the sixties, shaggy haired men and turtle-neck clad women engaged in contemplative, laissez-faire, light spirited dialogue. The imaginary director of the foundation is described as a “70-something uber-intellectual whose father founded the Dadaist movement.” The foundation is imaginary. It is a base, a canvas, for the products (the t-shirts) and the ideas behind them.
The blog section of the site imagines a list of contributors: Isadore Muggll, Kamilla Rousseau, etc. These architects, like the back story itself, are also imaginary. “IF” is a fictional foundation for the product. But the product is real and engaging.
What is captured here goes beyond the tangible properties of the product (t-shirts). It is about what the product delivers—the wonder of creativity and science, the archetypes of the IF. Imagination IS the foundation of this product.
The blog itself is a venue for artists who marry technology and art, as well as other thought-provoking materials. The image I use at the head of this article is taken from the blog. Cloud, idea, light, community, play—IF: all these are represented in the Cloud installation. This art installation is a discovery I am brought to by the Imaginary Foundation.
I once taught a course on the development of contemporary advertising, heavily focused on Edward Bernays and the peripheral route of persuasion. Bernays was Sigmund Freud’s nephew, Woodrow Wilson’s image advisor, the father of the term "Public Relations," and the architect of the torches of freedom (Lucky Strikes) campaign, among many others. His theory, though terribly simplified here, was that the modern consumer does not purchase with his mind; rather, he defers to his emotions in most choices. The rational-actor is a fiction. If consumerism became god, branding became its religion.
Ad campaigns have become remarkably creative, and even, at times, beautiful. Have you ever felt the urge to cry during a Jeep commercial? Many have. I think I have. The central conceptual premise of the AMC show Mad Men depends upon this tension: between art and consumption; the rendering from black and white to color; the effective marketing and selling off of the human experience. In question is the art aspect of advertising. It is at the core of Don Draper’s motivations, and the one that, despite his many character failings, keeps endearing him to us. Ultimately we are asking whether he will reconcile his artistic urge (his private motivation) with his office at the homunculus of the consumerism model (his role in the corporate arena). Exposed is a manipulation, an incongruence, an infidelity in the marriage of advertising and art. Whereas art points towards something beyond itself, beyond even the image and the medium, the ad campaign points only to one purpose—back into itself. No idea behind it. Nothing living. It consumes.
Advertising is like the Ouroboros, the dragon that swallows its own tail; having entirely swallowed itself, the modern advertising campaign defies the laws of balance. It is only the relentless, hungry serpent head of consumption -- devoid of the body of life. The only urge driving it is to possess.
It is the difference between the work of Egon Schiele and Penthouse, the writings of Georges Bataille and a godaddy.com super bowl campaign.
Seduce -> consume. This is the current mandate of the ad campaign. But this relationship is only sustainable through incompletion. It requires continual doses. Seduce -> consume -> feel a lack even in the possession of product (contract unfulfilled) -> be seduced again -> consume. Ad infinitum. A terrible loop.
How can consumerism and individual consciousness (the most private sector) be made sustainable? Is it possible for a product to speak beyond itself? To fulfill the promise of its persuasion? And if it could, what would that mean for us?
Here I position the word sustainability to face two directions. In part it refers to what Arendt terms “worldly”: the creation produced through work and not labor, something that has the potential to last beyond the productions of time, something that maneuvers into the arena of the eternal. I also want to posit the word in terms of its evolving contemporary potential. The one sector of the public and political sphere that allows for the platform of this conversation is the environmental movement. It is where we have begun to contemplate the world beyond the shortsighted view of individual lifetimes. We speak of the sustainability of our planet; we are considering new ways to move our habits from wasteful and consumptive towards lasting and sustainable power. It is a fairly new conversation, and the word “sustainability” is evolving with each new perspective we bring to it.
Sustainability goes beyond consumer awareness. It is about the awareness of the product, how a brand gains consciousness. I need to explore here a definition of “consciousness.”
I have come to understand definitions as ever evolving in accordance with society and the pressures put upon it by the conditions of the time, the fractals of our world (more simply put, the culture stew).
Consciousness is the expanding of space into which one can resonate. To learn of the world around us, to acknowledge it, to consider its multiple dimensions, is to become more conscious -- to create space into which we can move by the will of our imagination and invention.
The Imaginary Foundation is an example of this bridge. It acknowledges itself and its fiction. It allows for play. It is a small company that uses the fabrication of its narrative to bring the consumer’s attention to the mimetic principles behind its product. Revealing the architects’ conceit brings me (the consumer) into co-authorship of the story. It endears itself to me. We do not only consume the product. We consume the narrative of the product. Even if I do not purchase, if I am thinking about it and talking about it, I have bought in. If it generates new ideas and deeper-order thoughts, then I have begun to take ownership of the product. I consume the myth, I begin to co-author it -- I don it in the neural network of culture. And thus the product has gained consciousness, has begun to be carried beyond the object -- it resonates.
My study of this product is limited. I am not encouraging anyone here to purchase a shirt. I have not purchased a shirt. What I think this opens up is a table for negotiations between the current consumerism model and individual consciousness—an opportunity to examine sustainable consumerism in all its implications.
In the two years since its inception, the Arab Spring remains an extraordinarily difficult phenomenon to define and assess. Its local, national, and regional consequences have been varied and contradictory, and many of them are not obviously or immediately heartening. These observations certainly apply to Syria: although growing numbers of the country’s military personnel are abandoning their posts, the Assad regime’s war with the Sunni insurgency still threatens to draw Turkey, Lebanon, Iran, and Jordan into an intractable sectarian conflict. But they are, if anything, even more relevant to Egypt. There the overthrow of the Mubarak regime occurred with less brutality, all things considered, than we might have reasonably feared. But the nature of the country’s social and political reconstruction nevertheless remains extremely uncertain, given the delicate balance of forces between the Muslim Brotherhood, the Salafist Nour Party, and the country’s diverse liberal and activist camps.
The effects of Egypt’s revolution have been particularly ambiguous for the country’s women. To be sure, women played a noteworthy role in the Tahrir Square protests of January and February 2011, and many local and foreign observers commented on the lack of intimidation and harassment they faced in the days leading to Mubarak’s fall. But as Wendell Steavenson details in the most recent New Yorker, the protests were by no means free of gendered violence, and the revolution has yet to create a more comfortable or equitable place for women in Egyptian public life.
Let me touch on one example from Steavenson’s article. Hend Badawi, a twenty-three-year-old graduate student, was protesting against the interim military government in Tahrir Square in December 2011 when she was confronted by a group of soldiers. In the course of her arrest, the soldiers tore off Badawi’s headscarf, dragged her several hundred meters by the hair, cursed at her, struck her, and groped her breasts and behind. One of the soldiers also apparently told her that “if my sister went to Tahrir, I would shoot her.” After being taken to a parliament building, Badawi was beaten again and interrogated for several hours before landing in a military hospital, where she was treated for severe lacerations on her feet, a broken wrist, and multiple broken fingers.
The next day, Field Marshal Mohamed Tantawi, at that time Egypt’s effective ruler, paid a visit to the hospital for a photo op with a state-TV camera crew. Despite her injuries, Badawi confronted him: “We don’t want your visit!” she reportedly screamed. “We are not the ones who are thugs! You’ve beaten us and ruined us! Shame on you! Get out!” News of the tongue-lashing quickly made the rounds on Twitter and Facebook, and when Badawi was moved to a civilian hospital, she used a video camera smuggled in by friends to issue a lengthier statement about her ordeal. The resulting video went viral, and independent TV stations used it to challenge government claims that the Army had not used violence against civilians.
One might expect that Badawi would be honored for her courage and conviction, and I can only imagine that she is, at least among pro-democracy activists. But her family, which happened to sympathize with the Mubarak regime, was appalled. Badawi had gone to Tahrir Square without informing them, and they blamed her not only for the violent treatment she had received, but also for the damage they believed she had done to the family’s reputation. Badawi’s relatives locked her in her room; her elderly aunt yelled at her frequently; and her brother Ahmed hit her. Later, when Badawi’s family did not allow her to return to Tahrir for the first anniversary of the revolution, she basically reenacted the protests of the previous year—only this time on a more intimate scale. As she related to Steavenson, she launched a hunger strike to protest her treatment at her family’s hands and made placards that read, “Hend wants to topple the siege! Down with Ahmed!”
Badawi’s experience is particular and inevitably her own, but it nevertheless exemplifies the conundrums that many women face in contemporary Egypt. As the daughter of a pious rural family, she has benefitted from the increasing levels of affluence, education, and occupational opportunity that at least some young people, both women and men, have enjoyed over the past several decades. But she has also come face to face with the possibilities and the limits created by Egypt’s Islamic Revival, which has established new expectations for women’s comportment on the street and in other public institutions. (If many women in Cairo went bareheaded and wore skirts and blouses at the beginning of Mubarak’s reign, almost all now wear headscarves, and the niqab is not an uncommon sight.) Finally, Badawi’s life has been shaped not simply by her family’s notions of appropriate womanly behavior, but by a wider climate of pervasive sexual harassment. According to one 2008 survey, sixty percent of Egyptian men admit to having harassed a woman, and the country’s police and security forces either openly condone such treatment or engage in even more serious assaults themselves.
Badawi chafes at the “customs and traditions”—a common Arabic phrase, which she employs sardonically—that mold and circumscribe her life. And, like at least some other women, she regards Egypt’s recent upheaval as a potential opening, an “opportunity to mix my inner revolution with the revolution of my country.” But it is significant, I think, that Badawi does not seek a “Western” form of women’s equality and emancipation. Although she appreciates “the space and freedom” that appear to be available to women on American TV shows, she nevertheless intends to pursue them “in the context of my religion.” At the same time, many of the reforms that she and other women’s advocates might champion are now thoroughly tainted by their association with the autocratic Mubarak regime. For example, many Egyptians dismiss recent amendments to the country’s “personal-status laws”—which allowed women to initiate no-fault divorces and enhanced their child-custody rights—as cosmetic changes that only aimed to improve the government’s international image. Many other citizens, meanwhile, view Mubarak’s 2010 effort to mandate a quota for female members of parliament as a patent violation of democratic procedure.
These developments offer no clear path forward for Badawi and other Egyptian women, whether or not they regard themselves as activists. But they also pose a distinct challenge to outside observers—like me—who sympathize with their efforts to transform Egyptian society. Ten years ago, the Columbia anthropologist Lila Abu-Lughod drew on the impending American invasion of Afghanistan to question the notion that the U.S. should “save” Muslim women from oppression. Instead of adopting a position of patronizing superiority, Abu-Lughod urged concerned Americans to ally themselves with local activists in the Middle East and to work with them on the issues that they deemed most important. In the context of the Arab Spring, however, even this advice appears to have its shortcomings. I worry that American (or wider “Western”) support for women like Hend Badawi, however well-meaning, will unintentionally undermine the very reforms that the activists themselves favor. I also suspect that a considerable number of Egyptians will resent even the most “enlightened” coalitions as yet another instance of anti-democratic meddling if not neo-colonial imposition. After all, the U.S. did much to keep Mubarak in power for thirty years. Why now should Americans, whether they are affiliated with the U.S. government or not, attempt to intervene even indirectly in Egypt’s transformation?
I certainly believe, from a political and scholarly perspective, that Americans should care a great deal about the consequences of the revolutions in Egypt and other North African and Middle Eastern states. In the end, however, I wonder if the most advisable practical course may be to adopt an attitude of principled non-interference in those cases where mass violence is not imminent. In short, we should allow Egyptians (and other Middle Easterners) room to work out the consequences and implications of the Arab Spring on their own, even if we are not entirely comfortable with the results.
Note: Lila Abu-Lughod’s argument, which I reference near the end of this post, appears in “Do Muslim Women Really Need Saving? Anthropological Reflections on Cultural Relativism and its Others.” American Anthropologist 104.3 (2002): 783-790.
It is true that totalitarian domination tried to establish these holes of oblivion into which all deeds, good and evil, would disappear; but just as the Nazis' feverish attempts, from June, 1942, on, to erase all traces of the massacres - through cremation, through burning in open pits, through the use of explosives and flame-throwers and bone-crushing machinery - were doomed to failure, so all efforts to let their opponents "disappear in silent anonymity" were in vain. The holes of oblivion do not exist. Nothing human is that perfect, and there are simply too many people in the world to make oblivion possible. One man will always be left alive to tell the story.
—Hannah Arendt, Eichmann in Jerusalem
Aung San Suu Kyi accepted her Nobel Peace Prize this weekend, 21 years after it was awarded. For over two decades since her landslide victory in what was then Burma and is now Myanmar, Suu Kyi has stood fast in her opposition to the military junta ruling her country. The junta has sought to make her disappear, suppress any mention of her, and violently repress all protest and dissent.
Until 2010, that is, when the regime suddenly allowed Suu Kyi to stand for election as the leader of the opposition. She is now a member of parliament.
In her acceptance speech, Suu Kyi said of the Nobel Peace Prize she was awarded in 1991:
What the Nobel Peace Prize did was to draw me once again into the world of other human beings outside the isolated area in which I lived, to restore a sense of reality to me.
To be part of the human community is to be seen and remembered. It is to affirm that one has meaning and significance in the world. At a time when she had been hidden, silenced, and deprived of the right to speak and act in a way that matters in the world, Suu Kyi was in danger of disappearing. Hanging tenuously over the pit of oblivion, she felt her bond with the human community slipping away. “To be forgotten,” Suu Kyi said in Oslo, “is to die a little. It is to lose some of the links that anchor us to the rest of humanity.”
Suu Kyi was near to falling through the cracks of the world into a black hole of forgetting. It is such oblivion that Hannah Arendt saw as the grave threat totalitarian domination posed to human beings. Totalitarianism threatens not simply to oppress a people, but to do so in such a way that even their death and their oppression are senseless and powerless in the world. To deprive a person of even the right to die like a human being and to be remembered is, Arendt saw, the greatest imaginable attack on human dignity.
But such holes of oblivion do not exist. That is the optimistic conclusion Arendt brings to bear against the argument of a German Army physician, Peter Bamm. In his book The Invisible Flags (1952), Bamm distinguishes the SS mobile killing units from ordinary German soldiers. Arendt quotes his account of the murder of the Jews at length:
We knew this. We did nothing. Anyone who had seriously protested or done anything against the killing unit would have been arrested within twenty-four hours and would have disappeared. It belongs among the refinements of totalitarian governments in our century that they don't permit their opponents to die a great, dramatic martyr's death for their convictions. A good many of us might have accepted such a death. The totalitarian state lets its opponents disappear in silent anonymity. It is certain that anyone who had dared to suffer death rather than silently tolerate the crime would have sacrificed his life in vain. This is not to say that such a sacrifice would have been morally meaningless. It would only have been practically useless. None of us had a conviction so deeply rooted that we could have taken upon ourselves a practically useless sacrifice for the sake of a higher moral meaning.
If Bamm's argument at first sounds "hopelessly plausible," it nevertheless trades in platitudes. Its power rests upon the assumption that deaths of resistance would have been in vain, that resisters would have disappeared in "silent anonymity." Practical uselessness thus becomes an excuse for abstaining from courageous moral action.
Arendt's faith in the symbolic power of moral action, and in the necessary failure of totalitarian attempts to suppress that power, underlies her stunning formulation of the Right to Have Rights in The Origins of Totalitarianism. Whereas much of human rights discourse in 1950, and still today, imagines that there is a human right to life or to food or to security, Arendt rejects those claims. Humans will die and some will starve. This is not hard-hearted so much as it is a fact. Death and starvation can be unjust and tragic, but they are not inhuman and thus not a violation of fundamental human rights. What is more, there are times when the most human thing we can do is to die or to starve in ways that exemplify our humanity.
The most basic human right is the right to know that whether we decide to live or to die, our choice will matter. For Arendt, the truly human rights are the rights to be heard, to be seen, and to be meaningful. As humans, we have the right to belong to an organized community, where we can speak and act in ways that matter in the world. In other words, we have the human right not to be consigned to oblivion.
We have such a human right both in theory and in practice. Arendt is convinced that even at a time when technology allows totalitarian regimes to rewrite and even to rewire reality, facts have a stubbornness that allows them to surface. And moral action, even more than mere fact, has a power that is impossible to suppress. As long as the story of resistance can be told, totalitarian oblivion is simply a myth that excuses inaction.
The myth of oblivion is shattered by action in spite of totalitarian domination. One of Arendt's favorite examples of such moral action is the German Sergeant Anton Schmidt. During the war, Schmidt helped numerous Jews escape by giving them passports, money, and papers. He never took money in return. He was eventually captured and executed. But his action was not in vain. Not only did he save individual Jews; he also inspired them and others to continue their resistance. And his story remains today a powerful reminder of the practical and moral importance of courageous self-sacrifice in the name of the good.
In her speech on Saturday, Suu Kyi said that the Nobel Peace Prize "opened up a door in my heart." The Nobel Peace Prize is often derided as political, and often that charge is fair. And yet there are times when the prize not only rewards sacrifice but salvages a world in danger of being lost. The prize can help illuminate those holes of oblivion that continue to exist, however temporary that existence might be. At its best, it celebrates those like Suu Kyi who choose to dedicate their lives to the conviction that the truth will win out and that the holes of oblivion cannot last.