Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.
Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.
According to Rich Lowry and Ramesh Ponnuru, “The survival of American exceptionalism as we have known it is at the heart of the debate over Obama’s program. It is why that debate is so charged.” Mitt Romney repeated this same line during his failed bid to unseat the President, arguing that President Obama “doesn't have the same feelings about American exceptionalism that we do.” American exceptionalism—long a sociological concept used to describe qualities that distinguished American cultural and political institutions—has become a political truncheon. Now comes Peter Beinart, writing in the National Journal, to argue that the conservatives are half correct. It is true that American exceptionalism is threatened and in decline. But the cause is not President Obama. Beinart argues that the real cause of the decline of exceptionalist feeling in the United States is conservatism itself. Here is Beinart on one way the current younger generation is an exception to the tradition of American exceptionalism: “For centuries, observers have seen America as an exception to the European assumption that modernity brings secularism. ‘There is no country in the world where the Christian religion retains a greater influence over the souls of men than in America,’ de Tocqueville wrote. In his 1996 book, American Exceptionalism: A Double-Edged Sword, Seymour Martin Lipset quoted Karl Marx as calling America ‘preeminently the country of religiosity,’ and then argued that Marx was still correct. America, wrote Lipset, remained ‘the most religious country in Christendom.’ But in important ways, the exceptional American religiosity that Gingrich wants to defend is an artifact of the past. The share of Americans who refuse any religious affiliation has risen from one in 20 in 1972 to one in five today. Among Americans under 30, it's one in three. According to the Pew Research Center, millennials—Americans born after 1980—are more than 30 percentage points less likely than seniors to say that ‘religious faith and values are very important to America's success.’ And young Americans don't merely attend church far less frequently than their elders. They also attend far less than young people did in the past. ‘Americans,’ Pew notes, ‘do not generally become more [religiously] affiliated as they move through the life cycle’—which means it's unlikely that America's decline in religious affiliation will reverse itself simply as millennials age. In 1970, according to the World Religion Database, Europeans were over 16 percentage points more likely than Americans to eschew any religious identification. By 2010, the gap was less than half of 1 percentage point. According to Pew, while Americans are today more likely to affirm a religious affiliation than people in Germany or France, they are actually less likely to do so than Italians and Danes.” Read more on Beinart and American exceptionalism in the Weekend Read.
In this interview, Kevin Kelly, one of the founders of Wired magazine, explains his concept of the “technium,” or the whole system of technology that has developed over time and which, he argues, has its own biases and tendencies “inherently outside of what humans like us want.” One thing technology wants is to watch us and to track us. Kelly writes: “How can we have a world in which we are all watching each other, and everybody feels happy? I don't see any counter force to the forces of surveillance and self-tracking, so I'm trying to listen to what the technology wants, and the technology is suggesting that it wants to be watched. What the Internet does is track, just like what the Internet does is to copy, and you can't stop copying. You have to go with the copies flowing, and I think the same thing about this technology. It's suggesting that it wants to monitor, it wants to track, and that you really can't stop the tracking. So maybe what we have to do is work with this tracking—try to bring symmetry or have areas where there's no tracking in a temporary basis. I don't know, but this is the question I'm asking myself: how are we going to live in a world of ubiquitous tracking?” Asking such questions is where humans fit into the technium world. “In a certain sense,” he says, “what becomes really valuable in a world running under Google's reign are great questions, and that’s something that for a long time humans will be better at than machines. Machines are for answers; humans are for questions.”
Taking issue with a commentator's claim that The Paris Review's use of the word "crepuscular" (adj., resembling twilight) was elitist, Eleanor Catton suggests that the anti-critical attitude of contemporary readers arises out of consumer culture: "The reader who is outraged by being ‘forced’ to look up an unfamiliar word — characterising the writer as a tyrant, a torturer — is a consumer outraged by inconvenience and false advertising. Advertising relies on the fiction that the personal happiness of the consumer is valued above all other things; we are reassured in every way imaginable that we, the customers, are always right." Literature, she says, resists this attitude, and, in fact, cannot be elitist at all: "A book cannot be selective of its readership; nor can it insist upon the conditions under which it is read or received. The degree to which a book is successful depends only on the degree to which it is loved. All a starred review amounts to is an expression of brand loyalty, an assertion of personal preference for one brand of literature above another. It is as hopelessly beside the point as giving four stars to your mother, three stars to your childhood, or two stars to your cat."
Vladislav Inozemtsev reviews Laurence Cockcroft’s book Global Corruption: “The book’s central argument is that corruption has political roots, which Cockcroft identifies as the ‘merging of elites.’ Surveying the mechanisms of top-level decision-making from Russia to Brazil to Peru and India, as well as in many other countries, he discerns a pattern: Politicians today often act as entrepreneurs, surround themselves with sycophants and deputies, and so navigate the entire political process as they would any commercial business. The hallmarks of a corrupt society are the widespread leveraging of wealth to secure public office; the leveraging of such authority to secure various kinds of privileges; and the interplay of both to make even bigger money. Simply put, corruption is a transformation of public service into a specific kind of entrepreneurship.”
George Packer takes a look at Amazon's role in the book business, noting that its founder, Jeff Bezos, knew from the start that book sales were only the lure; Amazon's real business was Big Data, a big deal in an industry that speaks to people's hearts and minds as well as their wallets. Still, "Amazon remains intimately tangled up in books. Few notice if Amazon prices an electronics store out of business (except its staff); but, in the influential, self-conscious world of people who care about reading, Amazon’s unparalleled power generates endless discussion, along with paranoia, resentment, confusion, and yearning. For its part, Amazon continues to expend considerable effort both to dominate this small, fragile market and to win the hearts and minds of readers. To many book professionals, Amazon is a ruthless predator. The company claims to want a more literate world—and it came along when the book world was in distress, offering a vital new source of sales. But then it started asking a lot of personal questions, and it created dependency and harshly exploited its leverage; eventually, the book world realized that Amazon had its house keys and its bank-account number, and wondered if that had been the intention all along."
Ta-Nehisi Coates, in the wake of NFL prospect Michael Sam's announcement that he is gay, considers how the concept of readiness is backwards: "The question which we so often have been offered—is the NFL ready for a gay player?—is backwards. Powerful interests are rarely ‘ready’ for change, so much as they are assaulted by it. We refer to barriers being ‘broken’ for a reason. The reason is not because great powers generally like to unbar the gates and hold a picnic in the honor of the previously excluded. The NFL has no moral right to be ‘ready’ for a gay player, which is to say it has no right to discriminate against gay men at its leisure which anyone is bound to respect."
This week, the magazine Jacobin released Class Action, a handbook for activist teachers, set against school reform and financed using the Kickstarter crowdfunding platform. One of the many essays contained within is Dean Baker's "Unremedial Education," which contains one of the handbook's major theses, an important reminder for those who are interested in education as a route to both the life of the mind and the success of the person: "Education is tremendously valuable for reasons unrelated to work and income. Literacy, basic numeracy skills, and critical thinking are an essential part of a fulfilling life. Insofar as we have children going through school without developing these skills, it is an enormous failing of society. Any just society would place a top priority on ensuring that all children learn such basic skills before leaving school. However, it clearly is not the case that plausible increases in education quality and attainment will have a substantial impact on inequality."
“Culture is being threatened when all worldly objects and things, produced by the present or the past, are treated as mere functions for the life process of society, as though they are there only to fulfill some need, and for this functionalization it is almost irrelevant whether the needs in question are of a high or a low order.”
--Hannah Arendt, “The Crisis in Culture”
Hannah Arendt defines the cultural as that which gives testimony to the past and in preserving the past helps constitute our common world. A cultural object embodies the human goal of achieving “immortality,” which as Arendt explains in The Human Condition is not the same as eternal life or the biological propagation of the species. Immortality concerns the life of a people and is ultimately political. It refers to the particular type of transcendence afforded by political action. In “The Crisis in Culture,” Arendt shows how culture has a political role insofar as it creates durable and lasting objects that contribute to the immortality of a people.
The danger Arendt confronts in “The Crisis in Culture” is that mass culture makes art disposable and thus threatens the political ability of cultural life to produce lasting and immortal objects. The source of her worry is not an invasion of culture by the low and the base, but a sort of cannibalization of culture by itself. The problem is that mass culture swallows culture and subsumes it under the rubric of need. The immortal is degraded to a biological necessity, to be endlessly consumed and reproduced. Durable cultural objects that constitute a meaningful political world are thereby consumed, eroding the common world that is the place of politics.
Arendt’s point is first that mass culture—like all culture under the sway of society—is too often confused with status, self-fulfillment, or entertainment. In the name of status or entertainment, cultural achievements are stripped down and repackaged as something to be consumed in the life process. She would argue that this happens every time Hamlet is made into a movie or the Iliad is condensed into a children’s edition. By making culture accessible for those who would use it to improve themselves, the mass-culture industry makes it less and less likely that we will ever confront the great works of our past in their most challenging form. Eventually, the watering down of once immortal works can make it difficult or impossible to perceive the importance of culture and cultural education for humanity and our common world.
However, Arendt does not offer simply a banal critique of reality television as fast food. We might recognize a more insidious form of the risks she describes in the new intellectualism that marks the politics, or anti-politics, of the tech milieu. What has been termed Silicon Valley’s anti-intellectualism should instead be understood as a forced colonization of the space potentially inhabited by the public intellectual.
The prophets of the tech world see themselves as fulfilling a social and political duty through enterprise. They unselfconsciously describe their creations as sources of liberation, democracy, and revolution. And yet they eschew politics. Their abnegation of overt political activity is comprehensible in that, for them, ‘politics’ is always already contained in the project of saving the world through technological progress.
We see such exemplars of technological cultural salvation all around us. Scholars and cultural figures are invited to lecture at the “campuses” of Apple and Google, and their ideas get digested into the business model or spit back out in the form of TED talks. Even Burning Man, originally a ‘counter-cultural’ annual desert festival with utopian pretensions, has been sucked into the vortex, such that Stanford Professor Fred Turner could give a PowerPoint lecture titled “Burning Man at Google: A cultural infrastructure for new media production.” The abstract for his article in New Media & Society is even more suggestive: “…this article explores the ways in which Burning Man’s bohemian ethos supports new forms of production emerging in Silicon Valley and especially at Google. It shows how elements of the Burning Man world – including the building of a sociotechnical commons, participation in project-based artistic labor and the fusion of social and professional interaction – help to shape and legitimate the collaborative manufacturing processes driving the growth of Google and other firms.” Turner’s conclusion virtually replicates Arendt’s differentiation between nineteenth-century philistinism and the omniphagic nature of mass culture:
In the 19th century, at the height of the industrial era, the celebration of art provided an occasion for the display of wealth. In the 21st century, under conditions of commons-based peer production, it has become an occasion for its [i.e., wealth’s] creation.
The instrumentalization of culture within polite society has given way to the digestion and reconstitution of culture in the form of gadgets meant to increase convenience. Would-be cultural objects become rungs on the hamster wheel of life’s progress. Progress, the ultimate goal of technological cultural innovation, remains a vague concept because the industry’s self-contained and self-enclosed nature allows it to be taken for granted. Where it is defined, it is defined by example: the implementation of the smart parking meter, or the use of cloud networking to better administer services to San Francisco’s homeless population.
In a recent New Yorker article on the tech revolutionaries, George Packer writes, “A favorite word in tech circles is ‘frictionless.’ It captures the pleasures of an app so beautifully designed that using it is intuitive, and it evokes a fantasy in which all inefficiencies, annoyances, and grievances have been smoothed out of existence—that is, an apolitical world.” Progress here is the increasingly efficient administration of life.
When tech does leave its insular environment and direct its energies outward, its engagements reflect both its solipsism and its focus on utility, which for Arendt go together. The Gates Foundation’s substantial investments in higher education impose the quantitatively verifiable standard of degree completion as the sole or main objective, which seems odd in itself, given Gates’s notoriety as a Harvard dropout. The efforts of the Foundation aim less at placing Shakespeare in the hands of every fast-food worker and more at redirecting all of cultural education toward the development of a cheap version of utilitarian aptitude. Such tech intellectualism will ask, “What is the point of slaving over the so-called classics?” The claim is that the liberal arts vision of university education is inseparable from elitist designs, based on an exclusive definition of what ‘culture’ should be.
“What is the use?” is the wrong question, though, and it is tinged by the solipsistic mentality of a tech elite that dare not speak its name. The tech intellectual presents the culture of Silicon Valley as inherently egalitarian, despite the fact that capital gains in the sector bear a large burden of the blame for this country’s soaring rate of inequality. This false sense of equality fosters a naïve view of political and social issues. It also fuels tech’s hubristic desire to remake the world in its own image: Life is about frictionless success and efficient progress, and these can be realized via the technological fix. “It worked for us; what’s the matter with you?”
For Arendt, culture is not meant to be useful for employment or even the lofty purpose of self-cultivation; our relationship to culture nurtures our ability to make judgments. Kant’s discussion of taste and “common sense” informs her notion of the faculty of judgment in art and politics. In matters of taste, judging rests on the human ability to enlarge one’s mind and think with reference to an “anticipated communication with others” and “potential agreement.” Common sense, as she uses it, “discloses to us the nature of the world insofar as it is a common world.” Culture and politics are linked in that both can only exist in a world that is shared. She writes:
Culture and politics, then, belong together because it is not knowledge or truth which is at stake, but rather judgment and decision, the judicious exchange of opinion about the sphere of public life and the common world, and the decision what manner of action is to be taken, as well as to how it is to look henceforth, what kind of things are to appear in it.
That culture and politics are about enacting judgments, rather than truth or technique for the advancement of biological life, is a point that is clearly missed by the tech intellectuals. The establishment of utility as the sole goal of higher education is only one facet of a general lens through which the world appears as nothing but a series of practical problems to be figured out. In this paradoxical utopia of mass accessibility, insulation, and narrow-mindedness, applied knowledge threatens to occupy and pervert culture at the expense of political action and care for our common world.
In the New York Times, Roger Berkowitz takes on what he calls the new consensus emerging in responses to the new "Hannah Arendt" movie, a consensus that seems to be resolving the vitriolic 50-year debate over Arendt's characterization of Adolf Eichmann. This new consensus holds that Arendt was right in her general claim that many evildoers are normal people, but wrong about Eichmann in particular. As Christopher Browning summed it up recently in the New York Review of Books, "Arendt grasped an important concept but not the right example." As Berkowitz writes, this new consensus is founded upon "new scholarship on Eichmann's writings and reflections from the 1950s, when he was living amongst a fraternity of former Nazis in Argentina, before Israeli agents captured him and spirited him out of the country and to Israel. Eichmann's writings include an unpublished memoir, ‘The Others Spoke, Now Will I Speak,’ and an interview conducted over many months with a Nazi journalist and war criminal, Willem Sassen, which were not released until long after the trial. Eichmann's justification of his actions to Sassen is considered more genuine than his testimony before judges in Jerusalem. In recent decades, scholars have argued that the Sassen interviews show that Arendt was simply wrong in her judgment of Eichmann because she did not have all the facts." As tempting as this new consensus is, it is wrong, Berkowitz argues. Read his full argument here.
Geoff Dyer, flipping through the catalogue of a recent Garry Winogrand retrospective at SFMoMA, considers the way that the street photographer presented what he saw: "the pictures didn't look right, they were all skewed and lurchy, random-seeming and wrong. They were, it was felt, an unprovoked assault on the eye... We were accustomed to viewing the world through a set of conventional lenses that Winogrand wrenched from our face, making us conscious of how short-sighted we had been." Winogrand's still pictures, in other words, act on their viewers, betraying our sense of the world, shifting it out of focus, and thereby revealing it for what it is.
Tony Horwitz uses the upcoming 150th anniversary of Gettysburg to zoom out and consider the changing historical narrative about the American Civil War, in the process offering up an important reminder that history is a living, changing thing: "the 150th anniversary of the Civil War is too narrow a lens through which to view the conflict. We are commemorating the four years of combat that began in 1861 and ended with Union victory in 1865. But Iraq and Afghanistan remind us, yet again, that the aftermath of war matters as much as its initial outcome. Though Confederate armies surrendered in 1865, white Southerners fought on by other means, wearing down a war-weary North that was ambivalent about if not hostile to black equality. Looking backwards, and hitting the pause button at the Gettysburg Address or the passage of the 13th amendment, we see a ‘good’ and successful war for freedom. If we focus instead on the run-up to war, when Lincoln pledged to not interfere with slavery in the South, or pan out to include the 1870s, when the nation abandoned Reconstruction, the story of the Civil War isn't quite so uplifting."
Computer scientist and writer Jaron Lanier critiques the present digital economy with a close look at the evolving relationship between technology and power. To make his argument for change, he insightfully reinterprets what many consider to be a paradox: that the pairing of technology and power at once enriches and erodes the agency of individual actors. Companies like Google are so valuable, he argues, because they control enormously powerful and expensive servers (he calls them Siren Servers to emphasize their irresistible allure) that allow them to manipulate aggregate activity over time. "While people are rarely forced to accept the influence of Siren Servers in any particular case, on a broad statistical basis it becomes impossible for a population to do anything but acquiesce over time... While no particular Google ad is guaranteed to work, the overall Google ad scheme by definition must work, because of the laws of statistics. Superior computation lets a Siren Server enjoy the magical benefits of reliably manipulating others even though no hand is forced... We need to experiment; to learn how to nurture a middle class that can thrive even in a highly automated society."
Discussing her recent essay in Harper's, writer Rebecca Makkai talks about her experience of her grandfather, whom she knew as a yoga instructor living in Hawaii but who was also the principal author of Hungary's Second Jewish Law, passed in 1939. At one point, she strikes a particularly Arendtian note: "There's also the fact that it's just very difficult, psychologically, to reconcile the face of a real person with one of the darkest moments of the twentieth century. It's not the same as looking at someone who's personally violent, likely to reach out and hit you. This guy is chopping up papaya on his balcony, telling jokes, and I think we have an instinct to forgive, to see just the best in that person, to see him at just that moment. (The irony being that this is what he and his colleagues failed to do - to see humans in front of them.)"
Roger Berkowitz will be in attendance at the Moviehouse in Millerton for a discussion after the 4:00 pm screening of "Hannah Arendt" and before the 7:00 pm screening.
July 16, 2013
Following the 7:40 pm showing of "Hannah Arendt" at the Quad Cinema on 13th St. in N.Y.C., there will be a Q&A with Roger Berkowitz about the film.
July 21, 2013
Following the 6:00 pm showing of "Hannah Arendt" at Symphony Space on Broadway and 95th St. in N.Y.C., there will be a Q&A with Roger Berkowitz about the film.
The sixth annual fall conference, "Failing Fast: The Educated Citizen in Crisis"
Olin Hall, Bard College
Learn more here.
From the Hannah Arendt Center Blog
This week on the blog, Ian Storey in the Quote of the Week looks at the implications of the recent Supreme Court same-sex marriage rulings. Jeff Champlin considers Arendt's reading of Kant, offering a new way to think about judgment. Hannah Arendt's thinking is brought to bear on the Paula Deen scandal. And, for your weekend read, Roger Berkowitz looks at the moral implications of financial inequality.
For two years I taught literature, reading, and writing at a public university in one of New York City’s outer boroughs. Of course, having come out of a liberal arts “thinking” institution, what I really thought (maybe hoped) I was teaching was new perspectives. Ironically, the challenge that most struck me was not administrative, nor was it class size, terrible grammar, or endless hours of grading; the most pressing obstacle lay in creating a case for the value of “thinking.”
I say “case” because I regularly felt that my passions and beliefs, as well as my liberal arts education, went on daily trial. I had originally come from a hardscrabble immigrant reality, but my perception of reality had been altered by my educational experience, and as an educator I felt the need to authenticate my progressive (core text) education with my students.
I was regularly reminded that the immediate world of the “average” student (citizen), with all its pressing, “real” concerns, does not immediately open itself to “thought” in the liberal arts sense. We are a specialized, automated, struggling, and hypercompetitive society. The “learning time” of a student citizen is spent in the acquisition of “marketable” and differentiating skills, while their “free time” is the opportunity to decompress from, or completely escape, the pressures of competitive skill acquisition. The whole cycle is guided by an air of anxiety fostered by our national education philosophy, as well as by the troubled economy and scattered society at large. I don’t think one can teach the humanities without listening to one’s students, and listening to the students calls for a deep inventory of the value of “thought” in the humanities sense, and then, ultimately, of how to most truthfully communicate this value to the students.
I need to add here that my students were quite smart and insightful. This made the challenge even greater. Theirs was an intelligence of realism. I needed to both acknowledge and sway their perspective, as well as my own.
Each semester I began with a close reading of David Foster Wallace's commencement speech at Kenyon College, “This Is Water.” He begins his speech with the parable of two young fish who swim past an older fish; the older fish asks, “How's the water?” The little ones swim on and only later ask each other, “What is water?” Didactic parable, cliche -- yes -- but Wallace goes on to deconstruct the artifice of commencement speeches, parables, and cliches, and then rebuilds them. Having so skillfully deconstructed them, he invites his listeners into the form making, and as he communicates the truth beneath what had earlier seemed lofty or cliche, the listeners follow him toward meaning making. Ultimately Wallace states that education is “less about teaching you how to think, and more about teaching you of the choice in what to think about.” To have agency is to be a meaning maker. And as more and more cultural institutions -- politics and religion, but even more so corporate houses and pop culture designs -- artfully vie for the citizen's devotion and loyalty, the call to choose seems ever more muted in the ever-growing noise of institutional marketing.
The choice, for so many students today, is simply in how to most skillfully compartmentalize themselves and their lives in the face of the anxieties of their immediate world. The choice for many young teachers, facing their own set of related anxieties, is how far they are willing to step away from the ideal of a learning-living-teaching integration model -- so easy is it today for an educator to simply become disenchanted, frustrated, and aloof. Sometimes, “thinking” is the process of choosing what to keep and what to give away.
Wallace's insightful, no-b.s., humorous, and sincere tone resonated with my students -- that is, of course, until they found out that Wallace killed himself. Then that's what everyone wanted to focus on. I cannot blame them. There is a ‘text’ to ‘personal’ mystery, a ‘content’ to ‘context’ disjunction that opens itself at such a revelation, a mystery that the “thinking” mind wants to explore. The modern “thinking” mind draws little separation between the lofty and the sublime, the public and the personal. Such is a byproduct of a generation raised on reality television and celebrity stories. I, in all sincerity, cannot judge this. My generation, the Xers who came of age on the cusp of the Millennials, were culturally educated by MTV, The Real World, and Road Rules, and thus we crave hip, colorful, appropriately gentrified spaces to occupy -- think of artist collectives, or Facebook and Google working environments (bean bags, chill and chic prescription sunglasses, lounge happy hours with juice bars, untraditional working hours, colorful earth tones). But I digress; I meant to make some observation about “thinking.”
I was excited to teach what excited me: I began with Wallace, then Kafka, O’Connor (Flannery or Frank), Platonov, Carver, Babel, Achebe, Kundera, Eliot, etc. It is, essentially, the Seven Sisters freshman reading list, a popular catalogue of classic stories peppered with some international obscurity. It is the “cool” thing in liberal arts. But over and over my students came to me complaining that they could not find it relevant to their lives. After such reports I would tweak my lesson plans to give a greater introduction to the works, going deeper into the philosophical tenets of the stories and into the universal reward of being able to utilize the tools of the thinking, writing mind. Induct, deduct, compare, contrast, relate, “give it greater shape,” I would say. “Breathe life into it.”
To have the skills to decipher plot, to record the echo of a narrative, to infer characterization from setting, to understand the complex structure of a character, to be invited to participate in the co-creation of a narrative that gently guides you through action but leaves the moral implications up to the reader -- these are “indispensable,” I would advise my students. “Indispensable for human agency.” Some would slowly gravitate to my vision as I prodded further and further into their motivations for being in school, their careers, and other ‘relevant’ choices. Yet they often felt only like visitors in my library, preparing to check out and return to the “default” education thinking mode as soon as the quarter-, mid-, or end-of-semester exam periods began. The pressures of what they call “the real world” are much stronger than the ghosts of books and introspective thought -- vague, powerless, intangible.
“The real world”: here I am reminded of the scene from The Matrix in which Morpheus unveils to Neo “the desert of the real,” a barren wasteland in which human energy is merely a power source nourished for consumption. The Matrix, I will add here, draws on the work of Jean Baudrillard, a French philosopher who warns of modern society as a place existing in consumption and entertainment, devoid of meaning making -- the urge toward agency in hibernation, the map toward meaning defunct. In describing this new world he coined the phrase “the desert of the real.” Again, I fall into tangential thought.
I needed to find a way to invite, seduce, capture my students. I tried using myself as a conduit.
I pride myself on the fact that I am an immigrant, a former “at risk” student; that my tattoos all have mythological meaning and thought behind them; that I am a high-school dropout with credentials to my name, a top-tier education, a master's degree, etc. I felt like these could help me bridge, for my students, the platforms of reality-setting discourse and humanistic thought. I believed, and still do, that real “thinking” is indispensable to being human, to being free, and to the ability to have fun and play with the world.
Again, my students would at times meet me in the middle space I wanted to create, though rarely did this space become a living one for them; instead, they lay their heads to the sound of another's palpitation and breath, and then moved on. Maybe I planted a seed, I like to think. But then, maybe, they were bringing me somewhere as well.
They could not recklessly follow me, or I them. It was an issue of pragmatic bonds. For a moment, my class, or an individual student I was reading with, would delve into the power of words with me, and the ending of Andrei Platonov’s “The Potudan River” would finally break through the events of the page: “Not every grief can be comforted; there is a grief that ends only after the heart has been worn away in long oblivion, or in distraction amidst life’s everyday concerns.” And my students would draw new understanding from the passage, entering it through a word or phrase that could unlock that middle space between their worlds and the world of literature, philosophy, metaphor. “Grief,” “long oblivion,” “life’s everyday concerns” -- all of a sudden my students would give these new meaning, now only slightly guided by the story and letting their lives find a grip on the reins. They would find new connections, and again they would return to the “real” world.
More and more I struggled to make thinking relevant. “Will this help me get a better job?” I was asked.
Thinking about it, I had to encounter my own struggles with this question. I know the answers. I know the programmed liberal arts answer, and the “real” answer. I know that the liberal arts answer exposes the “real” as something at best lacking, at worst empty. I also know that the real is real; it happens in real time, removed from the concerns of literature, poetry, and philosophy, which concern themselves with the work of man's eternity.
“Unlikely,” I would answer. For God's sake, though I was teaching all these things I cared so deeply about, I also worked nights as a bartender to satisfy the demands of the real. I had to produce something consumable, and all of my learning and thoughts on thinking are not that.
Here I acknowledge that this answer is not entirely true. We can find jobs that call for liberal arts skills, but these are few and far between and rarely afford a comfortable standard of living. We may also posit the argument that liberal arts skills contribute to one's ability to perform better at and have a greater understanding of one's job, but this argument does not lend itself to substantial evidence, no matter how much I may actually believe it. This was the litmus test of my “thinking,” and it survives only in my embracing the privacies of my world -- in the fact that I chose my private world despite and above the “real.”
“Unlikely.” And where does that leave us?
Ultimately, all I have as a conscious being is the ability to tell stories, to choose and create my narrative from the scattered world I am provided. Ultimately, after deconstructing both the “real” and the “lofty” I could only encourage my students to choose their own themes. To the question of “what is water?” I could only answer, “the desert.”
Oddly enough, and as “unlikely” as it may seem, when I answered with honesty, to them as well as to myself, they followed -- we could talk.
I am a neural matrix of roughly 80 billion cells, each charged with the potential for action, firing out in multiple patterns of synchronicity towards a seemingly inexhaustible order of calculations -- I am the system that emerges, I am its apex, I am sentience -- therefore I am.
This, I imagine, is what Descartes would have to say today of what remains of the self under the scope of examination, though I will admit it sounds less poetic than his original statement.
Galileo’s telescope, the atom, the space age, the tech age, the Human Genome Project, and now the BAM project can all be seen as a succession of strivings toward a new perspective through which we could glean a greater understanding and synthesis of Man. The BAM project is the newest manifestation of this urge. It is an exciting endeavor, and yet, as with any new attempt of science to probe ourselves, it is a frightening one too.
Recently I learned about the “Brain Activity Map” (BAM) initiative sponsored by the Obama administration. I have a baseline knowledge of neuroscience and have long been fascinated by its hoped-for implications and speculative repercussions. I wanted more detail. I found what I understand to be the source paper for this project, “The Brain Activity Map Project and the Challenge of Functional Connectomics,” by Paul Alivisatos et al. This is hot stuff, and I am not being glib. Obama thinks so too; that's why $3 billion in government money is slated to go into the project. Microsoft and Google are throwing in real money too. So what is really going on?
BAM follows the model of the Human Genome Project. In the proposal paper, as well as in Obama's State of the Union address, reference is made to the fact that each $1 put into the Human Genome Project brought back $140 to the economy. I will leave alone the implications of this being economy-driven. Should science be economically driven? This question, in our society, is mostly moot. Everything must now at least appear to be economy-driven. Knowledge, transcendence, and self-discovery can only resonate in conversation with the economy.
But what are the human, as opposed to the economic, implications of the Brain Activity Map? BAM is a 15-year plan to create a non-topographical map of the brain, the repercussions of which reach into the medical, commercial, educational, and technological fields. Until now our neuro-understanding of the brain has been limited to compartmentalized thinking, to the study of individual ingredients. The brain simply cannot be understood this way, and thus Alivisatos's paper argues that “no general theory of brain function is universally accepted.” BAM seeks to create an “emergent systems” model, something akin to the rules of complex systems. This stems from the knowledge that brain function arises from the interplay of the electrical impulse grid (the action potentials of all the neurons). The best way I can state this is that brain activity is a symphony rather than a carpenter's graph: it is the interplay of notes, tones, pacing, and sound, rather than a mere combination of these individual elements. The point is not to isolate and combine but to mimic the complex yet structured electrical impulses of the brain in a way that allows higher-order brain function to emerge in an artificially intelligent being. To quote Alivisatos: “An emergent level of analysis appears to be critical for understanding the most compelling questions of how brain functions create sentience.” The most exciting effort, in other words, is to create a sentient, thinking, and autonomous entity.
The project calls for investment in new technologies that could make recording the action potentials, and the coordination of their impulses, more feasible. This can be accomplished by investing in nanotechnology: nanotubes and wires, quantum dots, nanoparticles, neural probes, shanks containing optical waveguides, and tiny microchips that can pass into the brain.
The brain-mapping project could well entail human testing, which the authors “do not exclude,” though it would not take place until the last phase of the project.
Microsoft and Google have signed on as partners and possibly fiscal contributors, because clearly the repercussions of such a project could be groundbreaking for the tech industry: computer chips that replicate the emergent systems model; search engines that could graph society by treating each user as a neuron and their Googling activity as action potentials. The source paper acknowledges some possible paranoia at such an endeavor and thus states that it is essential that this project be a public one, allowing for transparency in all findings. It also encourages a public relations campaign to reassure any party that may be susceptible to conspiracy theory making. That's me!
I hold both a fear of repercussions and a sense of excitement for this project. I tend to think that conspiracy theories are healthy. All great science fiction is fed by the conspiracy model, but it also tends to foretell future technological and social revelations. And there exactly is my point, or fear, or observation -- the irrelevance of social relevance. We don't really care, unless it scares us.
I found myself facing this in writing this post. I am excited to tell people about this project, but as a writer I have a constant mechanism at play in my head as I write, one that pushes me to present a story or topic in a light that will make people interested. As much as this mechanism comes from within me, it is also a product of cultural observation, a consistent tracking of what stimulates popular dialogue. What stimulates popular dialogue is conspiracy, not excitement or optimism. This itself is worthy of examination.
Ultimately the fear is of what we are losing in the race to understand ourselves through science and technology, of what we leave behind. I do not mean to gesture towards a conservative approach to science. Rather, I am fascinated by the anxiety that accompanies the prospect, and I propose that our fear is that of isolated parties traveling at quite different speeds. We can investigate the self intrusively and/or reflectively. Reflectively, we evaluate and discuss our culture, ethics, and the relationship of groups and individuals to one another; we pause and contemplate the grace of being. Intrusively, we probe into the elemental makeup of ourselves and the world we inhabit. As one practice outpaces the other, something feels askew, as if a key organ in the symphony of being human were going mute in the distance.
This Weekend Read is Part Two in “The ‘E’ Word,” a continuing series on “elitism” in the United States educational system. Read Part One here.
Peter Thiel has made headlines offering fellowships to college students who drop out to start a business. One of those Thiel fellows is Dale Stephens, founder of Uncollege. Uncollege advertises itself as radical. At the top of its website, Uncollege cites a line from the movie "Good Will Hunting":
You wasted $150,000 on an education you coulda got for a buck fifty in late charges at the public library.
The Uncollege website is filled with one-liners extolling life without college. It can be and often is sophomoric. And yet, there is something deeply important about what Uncollege is saying. And its message is resonating. Uncollege has been getting quite a bit of attention lately, part of a culture of obsession with college dropouts that is increasingly skeptical of the value of college.
At its best, Uncollege does not simply dismiss college as an overpriced institution seeking to preserve worthless knowledge. Rather, Uncollege claims that college has become too anti-intellectual. College, as Uncollege sees it, has become conventional, bureaucratic, and not really dedicated to learning. In short, Uncollege criticizes college for not being enough like college should be. Hardly radical, Uncollege trades instead in revolutionary rhetoric in the sense that Hannah Arendt means the word revolution: a return to basic values. And in this, Uncollege is of course right: colleges have lost their way.
Or at least that is what I find interesting about Uncollege.
To actually read its website and the recent Uncollege Manifesto by Dale Stephens is to encounter something different. The first proposition Uncollege highlights has little to do with education and everything to do with economics: the decreasing value of a college education.
The argument that college has ever less value will seem counterintuitive to those captivated by all the paeans to the value of college and the increased earning potential of college graduates. But Uncollege certainly has a point. Currently about 30% of the U.S. adult population has a degree. But among 20-24 year olds, nearly 40% have a college degree. And the Obama administration aims to raise that number to 60% by 2020. Uncollege calls this Academic Inflation. As more and more people have a college degree, the value of that degree will decrease. It is already the case that many good jobs require a master's or a Ph.D. In short, the monetary value of the college degree is diminished and diminishing. This gives us a hint of where Uncollege is coming from.
The Uncollege response to the mainstreaming of college goes by a number of names. At times it is called unschooling. Unschooling is actually a movement begun by the legendary educator John Holt. I recall reading John Holt’s How Children Learn while I was in high school—a teacher gave it to me. I was captivated by Holt’s claim that school can destroy the innate curiosity of children. I actually wrote my college application essay on Holt’s educational philosophy and announced to my future college that my motto was Mark Twain’s quip, “I never let school interfere with my education”—which is also a quotation prominently featured in the Uncollege Manifesto.
Unschooling—as opposed to Uncollege—calls for students to make the most of their courses, coupling those courses with independent studies, reading groups, and internships. I regularly advise my students to take fewer not more courses. I tell them to pick one course each semester that most interests them and pursue it intently. Ask the professor for extra reading. Do extra writing. Organize discussion groups about the class with other students. Go to the professor’s office hours weekly and talk about the ideas of the course. Learners must become drivers of their education, not passive consumers. Students should take their pursuit of knowledge out of the classroom, into the dining halls, and into their dorms.
Uncollege adds that unschooling, or “hacking your education,” can be done outside of schools and universities. With Google, public libraries, and free courses from Stanford, MIT, and Harvard professors proliferating on the web, an enterprising student of any age can compose an educational path today that is more rigorous than anything offered “off-the-shelf” at a college or university. I have no problem with online courses. I hope to take a few. But it is a mistake to think that systems of massive information delivery are the same thing as education.
What Uncollege offers is something more and something less wholesome than simply a call for educational seriousness. It packages that call with the message that college has become boring, conventional, expensive, and unnecessary. In the Uncollege world, only suckers pay for college. The Uncollege Manifesto promotes “Standing out from the other 6.7 billion”; it derides traditional paths, pointing out that “5,000 janitors in the United States have Ph.D.s”; and it cautions, “If you are content with life and education you should probably stop reading… You shall fit in just fine with society and no one will ever require you to be different. Conforming to societal standards is the easy and expected path. You are not alone!”
At the core of the Uncollege message is that dirty and yet oh-so-powerful little word again: “elitism.” Later in the Uncollege Manifesto we are told that young people have a choice between “real accomplishments” and the “easy path to mediocrity”:
To succeed without a college degree you will have to build your competency and reputation through real world accomplishments. I am warning now: this is not going to be easy. If you want to take the easy path to mediocrity, I encourage you to go to college and join the masses. If you want to stand out from the crowd and change the world, Uncollege is for you!
At one point, the Uncollege Manifesto lauds NPR’s “This I Believe” series and commends these short 500-word essays on personal credos. But Uncollege adds a twist: instead of writing what one believes, it advises its devotees to write an essay answering the question, “What do you believe about the world that most others reject?” It is not enough simply to believe in something. You must believe in something that sets you apart and makes you different.
Uncollege is at least suggesting that it might be cool to want, as it has not been for 50 years, to aim for excellence and to yearn to be different. In short, Uncollege is calling on students at elite institutions to boldly grab the ring of elitism and actively seek to stand outside and above the norm. And it is saying that education is no longer elite, but conventional.
It is hard not to see this embrace of elitism as refreshing, although no doubt many will scream the “e” word. I have often lectured to students at elite institutions and confronted them with their fear of elitism. They (or someone) spend upwards of $200,000 on an education, not to mention four years of their lives, and then they reject the entire premise of elitism: that they are different or special. By refusing to see themselves as members of an elite, these students too often refuse to accept the responsibility of elites: to mold and preserve societal values and to assume leadership roles in society.
Leading takes courage. In Arendtian terms, it requires living a public life where one takes risks, acts in surprising ways, and subjects oneself to public judgment. Leading can be uncomfortable and dangerous, and it is often more comfortable and fun to pursue one’s private economic, familial, and personal dreams. Our elite colleges have become too much about preparing students for private success rather than launching young people into lives of public engagement. And part of that failure is a result of a retreat from elitism and a false humility that includes an easy embrace of equality.
That Uncollege is selling its message of excellence and elitism to students at elite institutions of higher learning is simply one sign of how mainstream and conformist many of these elite institutions have become. But what is it that Uncollege offers these elite students who drop out and join Uncollege?
According to its website, Uncollege is selling “hackademic camps” and a “gap year program” designed to teach young people how to create their own learning plans. The programs come with living-abroad programs and internships. Interestingly, these are all programs offered by most major universities and colleges. The difference is money and time. For $10,000 and in just one year, you get access to mentors, get pushed to write op-eds, and get the “opportunity to work at hot Silicon Valley startups, some of them paid positions.” In the gap year program, participants will also “build your personal brand. Speak at a conference. Write an op-ed for a major news outlet. Build a personal website.”
None of this sounds radical, intellectual, or all that elitist. On the contrary, it claims that young people have little to learn from educators. Teachers are unimportant, to be replaced by mentors in the world. The claim is that young people lack nothing but information and access in order to compete in the world.
What Uncollege preaches often has little to do with elitism or intellectual growth. It is a deeply practical product being sold as an alternative to the cost of college. In one year and for one-twentieth of what a four-year elite college education costs, a young person can get launched into the practical world of knowledge workers, hooked up with mentors, and set into the world of business, technology, and media. It is a vocational training program for wannabe elites, training people to leap into the creative and technology fields and compete with recent college graduates but without the four years of studying the classics, the debt, and the degree. The elitism that Uncollege is selling is an entrepreneurial elitism measurable by money. By appealing to young students’ sense of superiority, ambition, and risk-taking, Uncollege stands a real chance of attracting ambitious young people more interested in a good job and a hot career than in reading the classics or studying abstract math.
Let’s stipulate this is a good thing. Not everybody should be going to liberal arts colleges. People unmoved by Nietzsche, Einstein, or Titian who are then forced to sit through lectures, cram for exams, and pull all-nighters writing papers cribbed from the internet are wasting their time and money on an elite liberal arts education. What is more, they bring cynicism into an environment that should be fired by idealism and electrified by passion. For those who truly believe that it is important in the world to have people who are enraptured by Sebald and transformed by Arendt, it is deeply important that the liberal arts college remain a bastion apart, a place where youthful exuberance for the beautiful and the true can shine clearly.
We should remember, as well, that reading great books and studying Stravinsky is not an activity limited to the academy. We should welcome a movement like Uncollege that frees people from unwanted courses but nevertheless encourages them to pursue their education on their own. Yes, many of these self-educated strivers will acquire idiosyncratic readings of Heidegger or strange views about patriotism. But opinions, even idiosyncratic ones, are the essence of a human political system.
One question we desperately need to ask is whether having a self-chosen minority of people trained in the liberal arts is important in modern society. I teach in an avowedly liberal arts institution precisely because I fervently believe that such ideas matter and that having a class of intellectuals whose minds are fired by ideas is essential to any society, especially a democracy.
I sincerely hope that the liberal arts and the humanities persist. As I have written,
The humanities are that space in the university system where power does not have the last word, where truth and beauty as well as insight and eccentricity reign supreme and where young people come into contact with the great traditions, writing, and thinking that have made us whom we are today. The humanities introduce us to our ancestors and our forebears and acculturate students into their common heritage. It is in the humanities that we learn to judge the good from the bad and thus where we first encounter the basic moral facility for making judgments. It is because the humanities teach taste and judgment that they are absolutely essential to politics. It is even likely that the decline of politics today is profoundly connected to the corruption of the humanities.
Our problem today is that college is caught between incompatible demands: to spark imagination and idealism, and to prepare young people for employment and success. For a long while now colleges have been doing neither of these things well. Currently, the political pressure on colleges is to cut costs and become more efficient. The unspoken assumption is that colleges must more cheaply and more quickly prepare students for employment. For those of us who care about college as an intellectual endeavor, we should welcome new alternatives to college like internet courses, vocational education, and Uncollege that will pull away young people for whom college would have been the wrong choice. Maybe, under the pressure of Uncollege, colleges will return to their core mission of passionately educating young people and preparing them for lives of civic engagement.
I encourage you this weekend to read the Uncollege Manifesto. Let me know what you think.
The copyright conflict between the internet community and the entertainment industry escalated recently when some of the most visited sites on the web flexed their muscle by spearheading a campaign to kill the two bills that started the trouble, SOPA and PIPA. The bills have been shelved, thanks to a twenty-four-hour blackout joined by most of the major social media websites and search engines, including Wikipedia, Google, Reddit, Tumblr, and Mozilla, among many others. But what does such a “victory” mean?
Just days after most support had been pulled from the bills in both houses, the founder of the file-sharing site Megaupload, Kim Dotcom (born Kim Schmitz; he legally changed his name around 2005), was arrested in New Zealand, along with at least three of his closest associates, and is facing extradition to the US on piracy charges. This may come as a surprise to those who argued that the bills were necessary to stop intellectual property theft. As Bill Keller explains in a recent Op-Ed piece in the Times, “The central purpose of the legislation — rather lost in the rhetorical cross fire and press coverage — was to extend the copyright laws that already protect content creators in the U.S. to offshore havens where the most egregious pirates have set up shop.” And yet, even without the new laws, Dotcom and his cohorts were arrested on US government orders.
It is helpful to go back to basics and try to understand the thinking behind the protection of intellectual property. Why, in other words, is it necessary to arrest someone like Dotcom, who merely makes content available to a wide and interested audience?
One attempt to answer that question is Mark Helprin's Digital Barbarism, an impassioned, literary, and philosophical defense of copyright on the internet. Best known for his novels, most memorably Winter's Tale, Helprin puts forth a humanist argument for copyright: at root, copyright is necessary as the “guarantor” or “coefficient” of liberty itself.
That property is essential to liberty is an idea with roots deep in liberal thinking. Property, from the root proper or propriety, is what is right and most my own. Who I am includes the character I possess, what defines me; it includes as well the way I live and the things I choose to own. Ownership, in other words, concerns what is my own, and who I am.
Our love for and defense of our property is not simply economic. It is a matter of identity and existence. As Helprin writes:
Property is to be defended proudly rather than disavowed with shame. Even if for some it is only a matter of luck or birth, for the vast majority it is the store of sacrifice, time, effort, and even, sometimes, love. It is, despite the privileged inexperience of some who do not understand, an all-too-accurate index of liberty and life. To trifle with it is to trifle with someone's existence, and as anyone who tries will find out, this is not so easy. Nor has it ever been. Nor should it ever be.
The copyright battle is less about economics, in Helprin's telling, than about freedom. Unlike some proponents of free market ideology, he does not advocate the absence of limits on freedom. In his words (which remind us of Helprin's artistry):
Nothing is entirely free, not even an electron (hardly an electron) or an atom floating in the inaccurately named vacuum of space. Everything that exists is subject to the pull or constraint of something else.
The point is not to reject all limits on property, but to insist upon a balance—one that Helprin thinks today is too far weighted toward disrespect for property.
He makes his argument in the context of taxation. Opposing the extreme positions of both liberals (who find it “cruel and inexplicable that someone would want to set limits before every mouth is fed and every cry comforted”) and conservatives (who “find it deeply alarming that anyone can fail to recognize the danger of pressing ahead in the absence of limits”), Helprin insists that we at least honestly recognize that taxation has a non-material cost: taxation, to some extent, “extinguishes liberty.”
In other words, taking someone's property is, in itself, wrong. There may be reasons to do so, and there is no absolute right to one's property; society demands limits and some takings. But such decisions should be made with an appreciation that these takings are meaningful intrusions on individual liberty. This is Helprin's core point, and it is one that I believe is rarely made and even more rarely considered.
To illustrate his claim about the imposition involved in all takings, Helprin calls on the common (and these days volatile) theme of the income tax. Taxes, while necessary, are infringements on freedom, not simply on income. If the state compels Cyril “to surrender half his income” in an effort to provide for those who cannot provide for themselves, then Cyril is “laboring for the state during half his working life,” and not for himself. Helprin likens such disenfranchisement to slavery. This seems excessive. As far as I can tell, Helprin employs the analogy because he wants to shock us into seeing just how naturally we have come to accept that the majority may take property from the minority. In his account, just as the slave owner “presumes that the labor of his slaves belongs to him…that whatever they make is rightfully his,” so does the state, when it requires its citizens to pay a tax on the income generated by their own labor, operate under the assumption that it is entitled to decide the ultimate use of that labor.
The comparison of taxation to slavery is over the top, sure. But there is a point Helprin makes that is important:
Anyone who blithely recommends expropriation as a means of "economic justice" should first divest himself of most of what he has and give it to those who have less — and there are certain to be those who have less and are greatly afflicted for it. We tend to look up rather than at ourselves when surrendering to such passions of righteousness. The assault on copyright is a species of this, based on the infantile presumption that a feeling of justice and indignation gives one a right to the work, property, and time (those are very often significantly equivalent) of others, and that this, whether harbored at the ready or expressed in action, is noble and fair.
Which is why the question of Kim Dotcom’s arrest is central. On Helprin’s account, Dotcom's websites and others like them not only exploit writers and artists economically; they do so without seriously considering the injustice of depriving others of their sense of ownership in what they create. One can disagree. But to do so, you must think that our societal right to read your essay or hear your song trumps your right to sell that song (or not) to whomever you wish.
For your weekend read, buy a copy of Helprin's Digital Barbarism and give it a read. Or read a chapter that Helprin has freely made available on the web.
My talk from last August is finally available in full, courtesy of the very interesting new website for Bard's Program in Language and Thinking.
I thank Thomas Bartscherer, Director of the Language and Thinking Program, for inviting me to give the talk and making the video available.
Roger Berkowitz, Director of the Arendt Center, delivered a lecture this week titled "Earth Alienation from Galileo to Google" as part of the Rostrum Lecture Series sponsored by Bard's Language & Thinking Program.
You can read the text of his lecture here: EarthAlienationgtogbardtext
In his talk, Berkowitz writes:
My thesis today is this: the scientific way of thinking inaugurated by Galileo in the 17th century is, in the first decades of the 21st century, forcing us to ask the question that the scientific approach to the world has harbored all along: Is humanity important?
How we humans answer this question will have a greater impact on our world than any scientific, technological, economic or artistic innovation that we may witness. For one thing, in an age of nuclear and biological weapons, we—or some few of us—may well choose to extinguish humanity. Or, in an age of automation where robots and machines are able to perform most economically necessary tasks, those in power may decide that it is better to euthanize the masses of superfluous persons for either economic or environmental reasons, or both.
Although nuclear Armageddon is one button away, and although Sun Microsystems co-founder Bill Joy has publicly raised the possibility of culling the superfluous, it is far more likely that we as a species will simply ignore the question.
I fear, however, that the refusal to confront the question of humanity’s worth will lead to very nearly the same effect as an affirmative decision for humanicide: we are now threatened with the possibility that the kindling of the human spark will be dampened, so that the darkness of the world is interrupted only by the most fleeting fires of the human spirit.