Three Nobel Prizes have already been announced this week, and Thomson Reuters, which has "accurately forecast 35 winners since 2002," failed to predict any of the new laureates. The well-respected firm "mines scientific research citations to identify the most influential researchers in the fields of chemistry, physics, medicine, and economics." Next week's announcement of the prize in economics by the Royal Swedish Academy of Sciences offers Thomson Reuters a chance to save face. But even so prominent a handicapper, with a significant financial stake in the accuracy of its forecasts, dares not predict the winners of the prizes in literature and peace, which have been awarded since 1901. Why?
The Nobel Prize in Literature has a deservedly sketchy reputation among journalists, professors of literature, and professional critics. For every Samuel Beckett, Aleksandr Solzhenitsyn, and Gabriel García Márquez, whose Nobel triumph has thrilled book lovers and theater-goers, there is a Bjørnstjerne Bjørnson, Karl Adolph Gjellerup, and Frans Eemil Sillanpää, whose shocking literary deification has left readers puzzling over an unfamiliar name and wondering whether it's time to change the prescription of their reading glasses once again. No doubt the regional and linguistic biases of the Swedish Academy committee that selects the annual winner explain a parochial impatience among Anglophone readers with obscure Nordic winners, instant literary celebrities with far too many syllables and diacritical marks in their names to generate veneration for their as-yet-untranslated works.
If it's quixotic for Thomson Reuters to predict with confidence the winners of the Nobel prizes in the sciences, dismal or otherwise, then it seems utterly mad for anyone who reads novels, poems, plays, and memoirs, even professionally, to forecast the winner of the Nobel Prize in Literature. (Even predicting the winner of the Peace Prize seems by comparison a rational enterprise: simply identify a former or potential war criminal, terrorist, or would-be authoritarian leader who has lately made nice with the world press.) To be sure, we are sometimes pleasantly surprised by the Swedish Academy's press release. The stunning announcement last year that Mario Vargas Llosa had won the Nobel Prize in Literature is a case in point. A Peruvian novelist possessed of precocious and extraordinary talents, Vargas Llosa was not considered a likely winner by journalists and literature professors. His controversial reputation as one of Latin America's leading champions of "neo-liberal" economic and political reform, his very public criticism of fundamentalist Islam, and his unstinting denunciation of the authoritarian government of Fidel Castro and his brother appeared to disqualify him as a serious candidate for the prize.
And yet, is it so terrible that the committee members who select the winner of the Nobel Prize in Literature annually exercise such predictably fickle, even perverse judgment? After all, it's been some time since any serious mind publicly claimed that literary criticism was a science, much less a predictive one. Since the rise of "aesthetic philosophy" in late 18th- and early 19th-century Europe, formidable attempts have been made by the likes of Immanuel Kant, Friedrich Schiller, and G.W.F. Hegel, as well as their Scottish and English counterparts—Adam Smith, Edmund Burke, and Samuel Taylor Coleridge—to articulate an objective theory of literary value and artistic taste. But their theories fell under the skeptical gaze of later philosophers such as Friedrich Nietzsche, who dismissed the notion of scientifically and objectively verifiable literary value, indeed of all objective values. Rendered silent in 1889 by a debilitating stroke brought on by self-prescribed medicines to treat his stomach ailments, Nietzsche never saw the immense influence of his critique of objective ethical and aesthetic values. He never learned before his death in 1900 of the "marginal revolution" in economics begun in England, Austria, and Switzerland in the 1860s and '70s by William Stanley Jevons, Carl Menger, and Léon Walras. We can only speculate on what he might have thought of this epochal shift in our understanding of economics, a seismic event that the Austrian economist Ludwig von Mises recast as a more general "subjective theory of value" in the early 20th century.
Oddly enough, mainstream literary criticism (leaving aside the vexed matter of whether the reading of novels and poetry these days can in any way be called "mainstream") and at least one increasingly influential, but still immensely controversial, strand of economic science agree that our tastes are a matter of individual and constantly changing judgment. As F.A. Hayek, a co-winner of the 1974 Nobel Prize in Economics, pointed out, the power of a government to centrally plan an economy is fatally compromised by its inability to know, anticipate, and account for the innumerable personal tastes of hundreds of millions of people, whose ever-changing needs signal to producers what they must provide if those desires are to be met most efficiently. The subjective character of literary taste, which is deeply personal and subject to sudden, rapid, and unpredictable shifts in cultural fashion, regional prejudice, and individual caprice, is a conspicuous and exemplary instance of the subjective theory of value and the glorious marginal revolution. We have returned to an insight of the ancients, who did not presume to offer a universal and objective theory of taste, culinary, literary, or otherwise: De gustibus non est disputandum. "In matters of taste there can be no disputing."
So who will be this year's Nobel laureate in literature? I have no idea. Maybe the data freaks and computer modelers at Thomson Reuters will hazard a prediction based on their much-prized and highly advanced scientific methodologies and protocols. But it seems likely they won't, especially given their track record this week. Surely that's a good thing for literature, not to mention for the economic and political health of world culture.
When reason was founded, historians and political scientists routinely assumed that bigger government was better government. Constitutional law consisted largely of excuses for expanding the state. Keynesians dominated the economics profession. Technology meant ever-more-intrusive methods of surveillance and bureaucratic management. Even the fiction shelves were filled with authoritarian ideas. Clearly, not much has changed.
Except it has. Countertrends were already emerging in 1968, and in the 45 years since then they've flowered. Statism has hardly gone away, but the movement to roll it back is stronger than ever. We asked seven libertarians to recommend some of the books in different fields that made this cultural and intellectual revolution possible. A list this short is bound to be incomplete. But each of these volumes is an excellent place for a reader to start.
Fiction
Michael Valdez Moses on Mario Vargas Llosa's The War of the End of the World
Over the past five decades, Mario Vargas Llosa has won nearly every major international literary award, culminating in 2010 with the Nobel Prize for literature. Belonging to a prodigiously talented generation of writers that included Gabriel García Márquez (another Nobel laureate), Carlos Fuentes, Julio Cortázar, and José Donoso, the Peruvian novelist, dramatist, journalist, and literary critic has set himself apart from his compatriots by virtue of his repudiation of Marxism and embrace of classical liberalism, his direct participation in Latin American electoral politics (he ran unsuccessfully for the Peruvian presidency in 1990), and his adherence to a form of realistic fiction at odds with the "magical realism" of his Latin American literary cohort.
Vargas Llosa's personal favorite among his novels is The War of the End of the World (published in 1981 as La guerra del fin del mundo), a vast historical epic that may be described as the Latin American War and Peace. It is one of the greatest novels written in any language in the last 45 years.
Based on Euclides da Cunha's Os Sertões, a first-hand account of a violent rebellion in the backlands of late-19th-century Brazil against the newly established national government, The War of the End of the World has proven to be an unusually prescient work of political fiction. The book recounts the efforts of Antonio Conselheiro, a charismatic lay preacher, to found Canudos, an independent community of religious followers—bandits, freed slaves, impoverished peasants, farmers, merchants, nomadic Indians—in the remote drought-ridden wastes of Bahia. When the federal republic attempts to impose a modern "rational" order inspired in part by the positivist philosophy of Auguste Comte, the counselor and his followers resist, rejecting the new taxes, census, civil marriage, municipal burial, and metric system being coercively imposed by the centralizing Brazilian authorities.
Increasingly violent clashes between the counselor's followers and government officials escalate into an apocalyptic civil war that ends with the annihilation of Canudos, the indiscriminate massacre of its tens of thousands of inhabitants, and the violent deaths of hundreds more soldiers (many of them conscripts).
Written in the wake of an Iranian Islamic revolution led by a charismatic Imam who challenged the power of the American empire and the legitimacy of modern "western" culture, and in the midst of a struggle between Soviet-backed Marxist revolutionaries and U.S.-supported authoritarian dictatorships, Vargas Llosa's novel skillfully evokes many of the fiercest ideological and political conflicts to have engulfed the globe in the last half-century.
But perhaps what most recommends The War of the End of the World, apart from its extraordinarily compelling array of memorable characters—ex-slaves, pious bandits, hubristic generals, utopian anarchists, degenerate aristocrats, religious zealots, scheming politicians, wandering circus freaks, militant peasants, skeptical journalists, indefatigable merchants—is its deeply skeptical view of modern politics. During his campaign for the Peruvian presidency, Vargas Llosa championed the traditional goals of a classical liberal statesman: liberalization of the economy, the wide distribution of private property, an end to the collusive relationship between government and privileged economic interests, the protection of civil liberties, the preservation of the rule of law. But as a novelist, Vargas Llosa has been more daring, more imaginative, and more disturbing.
Fans of James C. Scott's 2009 text The Art of Not Being Governed will recognize in The War of the End of the World one of the enduring tales of modern life: the utopian dream of walking away from the modern state, of the spontaneous organization of a radical, communal alternative to what Max Weber called "the iron cage" of bureaucratic rationalization. But neither can such readers fail to discern in Vargas Llosa's great tragic novel the apocalyptic consequences that seem inevitably to follow when the modern Leviathan takes seriously the challenge that anarchistic dreaming poses to its unlimited claim of sovereignty.
Economics
Steven Horwitz on F.A. Hayek's Law, Legislation, and Liberty
reason's 45-year history has overlapped with a renaissance in pro-market economics, which makes it a difficult task to choose one book as the "best" of that period. Thomas Sowell's Knowledge and Decisions and Israel Kirzner's Competition and Entrepreneurship are obvious candidates. Milton Friedman's Free to Choose might have been the most influential beyond the discipline. Julian Simon's The Ultimate Resource II is brilliant, if underappreciated by economists.
But the book that stands out for its combination of deep insights about economics proper, its understanding of the broader context in which economics sits, and its influence on subsequent thought is F.A. Hayek's three-volume Law, Legislation, and Liberty.
The trilogy's three installments appeared in 1973, 1976, and 1979. The first volume, Rules and Order, sets out Hayek's broad vision of the roles of reason and evolution, and formulates his distinction between "cosmos" and "taxis"—spontaneous order and planned order, respectively. Later chapters explore the distinction between law, by which Hayek means general rules of just conduct, and legislation, the rules that govern the designed order of the administrative state.
Hayek's work here has been central to later thinking by Austrian, public choice, and new-institutionalist economists about the importance of rules and of institutions such as the law, money, and property rights in framing our ability to generate beneficial social orders without a designer. Rules and institutions matter because the incentives and knowledge signals they generate are what determine whether broadly self-interested behavior will produce socially beneficial or harmful results. In a culture of bailouts, for example, a capitalist's profits do not necessarily mean he has pleased consumers; he may have just pleased politicians. If misguided housing policy and an unconstrained central bank distort the incentives and knowledge facing mortgage lenders and potential homeowners, we should not be surprised when a boom turns to bust.
The second volume, The Mirage of Social Justice, contains Hayek's famous critique of fairness. He argues that the pattern of incomes that emerges in a market economy is an unintended consequence of numerous independent actions. If justice refers to whether a given action obeys what Hayek calls "rules of just conduct," then an outcome that is the product of no one's intention can be neither just nor unjust. The concept of "social justice" as a claim about the pattern of incomes that emerges in a truly free market is therefore nonsensical.
This argument, and specifically the chapter "The Market Order or Catallaxy," has been incredibly influential not just on economists working in the traditions noted above, but also on political and legal theorists and philosophers. The Catallaxy chapter might be one of the best pieces ever written on the nature of the market as an undesigned order.
The third volume, The Political Order of a Free People, offers a positive proposal for political reform. Despite its emphasis on the importance of constitutional rules, this segment has been rightly criticized for being too much of an attempt at design in a work that is highly critical of central planning. But the volume's epilogue, "The Three Sources of Human Values," is one of Hayek's best essays on the relationship between a free society and human social evolution.
Law, Legislation, and Liberty opened up new ways of thinking about reason, social evolution, social justice, the nature of social rules, and the market as a spontaneous order. The book is indispensable for understanding modern political economy.
Constitutional Law
Chip Mellor on Richard Epstein's Takings and Randy Barnett's Restoring the Lost Constitution
Richard Epstein's Takings: Private Property and Eminent Domain resounded like a thunderclap in constitutional theory when it arrived in 1985.
The word "takings" refers to the government's power under the Fifth Amendment to take private property for public use upon payment of just compensation. Epstein states in his opening sentence, "The book is an extended essay about the proper relationship between the individual and the state." Within a few pages, one can see just how profound, indeed radical, this extended essay would become. Takings is no routine academic exercise. Epstein boldly challenges the very premise of the New Deal jurisprudence that unleashed the modern regulatory state.
The New Deal rested upon government's power to impose vast regulations on property and economic affairs. But this power could flourish only if government didn't need to compensate people for the negative impact that regulations had on property rights. A series of U.S. Supreme Court decisions removed any such constraint by relegating property rights to second-class constitutional status. Those decisions ushered in a presumption in favor of regulation, and the regulatory state was unleashed. By 1985, it had grown to enormous proportions, and the constitutional theory upon which it was based held hegemonic status in law schools and courts.
Then Epstein asserted that the presumption should be that "all regulation, all taxes, and all modifications of liability are takings of private property prima facie compensable by the state." His book builds a rigorous and compelling case for this presumption. Critics immediately pounced and sought to dismiss Takings as being beyond the pale. But Epstein never flinched, courageously defending his position in countless debates. The force of his genius won respect from adversaries and inspired a property-rights movement that continues to grow to this day. Takings served notice that the statist status quo rests on dubious moral and constitutional grounds.
Nearly 20 years later, Randy Barnett's Restoring the Lost Constitution: The Presumption of Liberty (2004) thrust other key provisions of the Constitution into the national spotlight and transformed the legal debate over constitutional interpretation. Legal interpretation during the preceding 70 years had increasingly read key provisions out of the Constitution, effectively amending it even though the words remained the same. As a result, the original meanings of the Commerce Clause, Necessary and Proper Clause, Ninth Amendment, 10th Amendment, and Privileges or Immunities Clause, among other provisions, disappeared, and with them the Constitution's core purpose of protecting individual liberty. Barnett sets out to restore this "lost Constitution."
Restoring the Lost Constitution has many virtues, but two merit particular attention, because they go to the heart of how we get from where we are today to a rule of law that respects individual liberty. The first is Barnett's recognition that the courts have a vital role in constitutional interpretation and that this interpretation should be guided by the original meaning of the Constitution. He rebuts advocates of judicial restraint while calling for an engaged judiciary to review both federal and state laws.
The second is the presumption of liberty. Sadly, the Supreme Court has created a presumption of constitutionality under which most laws—even those that flagrantly flout clear constitutional constraints—survive constitutional challenge. Barnett rebuts this misguided precedent. Through penetrating analysis and historical research, he explains that the burden should fall upon the government to justify a law through reference to either an enumerated power (in the case of the federal government) or a properly limited police power (in the case of the states). In each instance, the result would be a dramatic reduction in government latitude.
The strength of Barnett's book lies in the brilliant way he dissects prevailing constitutional theory and offers a principled alternative. It was just such integrity of intellect that enabled him to play a lead role in the constitutional challenge to ObamaCare.
Both Takings and Restoring the Lost Constitution present a constitutional philosophy in which individual rights are protected from arbitrary and excessive government. Both books are based on impeccable scholarship and thought-provoking advocacy. And both were written by individuals whose successful academic careers show that acting with the courage of your convictions can inspire people inside as well as outside the ivory tower.
Political Theory
Jesse Walker on James C. Scott's Seeing Like a State
The map, as the saying goes, is not the territory: The symbols we use to describe the world are not the same as the world itself. James C. Scott's 1998 book Seeing Like a State examines that gap between maps and territories through a political lens, exploring the conflicts between officials who want clear maps of the areas they rule and subjects who do not always want to be charted.
One insight of the book is the sheer amount of local knowledge that maps do not and often cannot capture. Another is how that missing information drives powerful people not just to try to make better maps, but to remake the territory itself—the real places where real people live—in order to make it more mappable.
The result is a wide-ranging, interdisciplinary, and brilliant book. It covers everything from Stalin's war on the Russian peasantry to the destruction wrought by urban renewal programs, from the rise of permanent surnames to the enclosures that expropriated the commons, from the evolution of the Vietnam Veterans Memorial in Washington, D.C., to the transformation of forestry in 18th-century Germany.
Scott's themes include several subjects familiar to libertarians, such as the failures of central planning and the value of inarticulate knowledge, but he tackles them with a fresh eye, a wealth of research, and a talent for introducing useful concepts. High modernism, for example, is "a particularly sweeping vision of how the benefits of technical and scientific progress might be applied—usually through the state—in every field of human activity." Metis is practical, vernacular knowledge, which Scott contrasts with the abstract knowledge of the high modernists and their maps. Non-state spaces are the poorly mapped zones where mutuality and metis flourish.
Impressive as Seeing Like a State is, it is only one part of a sequence of studies in which Scott offers a bracing alternative to conventional political assumptions. Two earlier books, Weapons of the Weak (1985) and Domination and the Arts of Resistance (1990), investigate the ways apparently powerless people quietly resist their rulers. And 2009's remarkable The Art of Not Being Governed describes the history of a particularly long-lived non-state space—the vast Asian territory sometimes known as Zomia—and its relationship with the governments and quasi-governments it has bordered.
All of these books are well worth reading. Scott does not consider himself a libertarian, a point he occasionally pauses to stress, but the concepts he has introduced or developed belong in every libertarian's toolkit.
Technology
Adam Thierer on Ithiel de Sola Pool's Technologies of Freedom and Virginia Postrel's The Future and Its Enemies
The past 45 years have seen remarkable advances in information technology: the Internet, mobile communications, ubiquitous news and entertainment options, and much more. What made these and other innovations possible was a general openness to the unplanned, the unpredictable, and even the uncontrollable. In our willingness to embrace a world of uncertainty and incessant change, we found unparalleled technological abundance.
No two books more eloquently captured and celebrated the information age than Ithiel de Sola Pool's Technologies of Freedom (1983) and Virginia Postrel's The Future and Its Enemies (1998).
Penned long before most of us had heard of "the Internet" or "cyberspace," Pool's prescient text predicted a world in which "networked computers will be the printing presses of the twenty-first century" and where "there will be networks on networks on networks." In the future, he forecasted, "A panoply of electronic devices puts at everyone's hands capacities far beyond anything that the printing press could offer."
Beyond his remarkable prophecies, Pool's book set forth a passionate defense of technological freedom and unrestricted freedom of speech. "Technology will not be to blame if Americans fail to encompass this system within the political tradition of free speech," he wrote. "On the contrary, electronic technology is conducive to freedom."
While Pool was open to some minimal interconnection rules for communications networks, he called for tight constraints on regulators to ensure that new electronic networks could develop free of the innovation-crushing burdens of the past. "In a free society," he insisted, "the burden of proof is for the least possible regulation of communication." His book set forth 10 "guidelines for freedom" to facilitate that objective. Among them: "the First Amendment applies fully to all media," "anyone may publish at will," "enforcement must be after the fact, not by prior restraint," and "regulation is a last recourse."
Pool's paradigm and prescriptions were rooted in what Postrel, reason's editor from 1989 to 2000, defined as a "dynamist" worldview. In The Future and Its Enemies, Postrel made the case for embracing "a world of constant creation, discovery, and competition" over the "regulated, engineered world," or what she labeled the "stasis mentality." We should "see technology as an expression of human creativity and the future as inviting" while also rejecting the idea "that progress requires a central blueprint," she argued.
Postrel's dynamist vision explained that progress is "a decentralized, evolutionary process" in which mistakes aren't viewed as permanent disasters but as "the correctable by-products of experimentation." Of course, "a future that is dynamic and inherently unstable" and that is full of "complex messiness" is one that many will fear and resist. But that's also what makes it so exciting and full of potential, she argued.
Today's digital economy is the manifestation of Postrel's dynamist ethos: innovation and abundance born of complex messiness. Yet as she predicted, the detractors and worrywarts persist, mostly for elitist reasons. "Stasist social criticism," she noted, "brings up the specifics of life only to sneer at or bash them. Critics assume that readers will share their attitudes and will see contemporary life as a problem demanding immediate action by the powerful and wise."
To succumb to such fears, Postrel explained, is to surrender on human advancement: "This relentlessly hostile view of how we live, and how we may come to live, is distorted and dangerous. It overvalues the tastes of an articulate elite, compares the real world of trade-offs to fantasies of utopia, omits important details and connections, and confuses temporary growing pains with permanent catastrophes. It demoralizes and devalues the creative minds on whom our future depends. And it encourages the coercive use of political power to wipe out choice, forbid experimentation, short-circuit feedback, and trammel progress."
While plenty of tech pundits and academics cling to such stasist thinking today, Pool and Postrel's books continue to provide beacons for a better world, free from the top-down, technocratic mentality and prescriptions of the past. At least thus far, permissionless innovation has largely trumped the precautionary principle in tech policy. Let's hope the dynamist vision can hold the line for another 45 years.
History
Paul Moreno on Mark Warren Bailey's Guardians of the Moral Order
It remains a common view that in the late 19th century the American judiciary adhered to a survival-of-the-fittest laissez-faire ideology summed up in the phrase "Social Darwinism." Judges, the story goes, used this ideology to strike down all manner of laws meant to ameliorate the harsh consequences of the urban and industrial revolutions, including minimum-wage laws, maximum-hours laws, and laws protecting helpless workers from employer fraud. Supreme Court Justice Oliver Wendell Holmes Jr. summed up the idea in his dissent in the 1905 case of Lochner v. New York, when he accused the majority of reading "Mr. Herbert Spencer's Social Statics" into the Constitution.
There are now numerous works showing that Darwin and Spencer had virtually no impact on the late 19th-century American courts. Though evolutionary ideas infiltrated American law schools that were modeled on German universities, this process had barely begun by the 1870s. (Ironically, Holmes, who accused his brethren of being Social Darwinists, was the only Darwinist on the Court.) The great contribution of Mark Warren Bailey's Guardians of the Moral Order: The Legal Philosophy of the Supreme Court, 1860–1910, to this literature is its description of the real philosophical background of turn-of-the-century judges. The result is a remarkable work of intellectual history, recapturing a lost world of ideas that has been distorted by generations of propagandists and scholars alike.
Most judges of that era had attended small, denominational colleges untouched by or hostile to Darwinism. They adhered to a traditional liberal arts curriculum, whose goal was not to produce cutting-edge research but to transmit what Bailey calls "a classical and Christian heritage deemed to be true and useful." Philosophy generally reflected "moral realism": a belief that moral truths existed and could be known, and that happiness consisted in living in conformity with them.
This worldview arose in the 18th century, and John Witherspoon established it firmly at Princeton; Yale became a stronghold as well. Fifty-eight of 75 college presidents in 1840 had graduated from either Princeton or Yale, so a liberal education informed mostly by the moderate British Enlightenment became the dominant model at all major schools in the United States.
The resulting political economy did tend to endorse what might be called "laissez-faire," but the movement was more concerned with religious and ethical principles than with any particular economic and social order. Indeed, the basic academic worldview condemned both Darwin and the leading American Social Darwinist, William Graham Sumner.
By the late 19th century this academic model had often become formulaic and doctrinaire—traditionalism (the dead faith of the living) rather than tradition (the living faith of the dead), as the Yale religious historian Jaroslav Pelikan put it. It was often used to defend what had become the status quo. But we often forget that classical liberalism was a dynamic, disruptive, and progressive force in the 19th century. What progressives criticized as "laissez-faire constitutionalism" derived from this older tradition of moral philosophy rather than from any 19th-century economic theory, and certainly not from Darwinism. That tradition was confidently though not excessively rationalistic, and friendly to religion in a way that could accommodate everything from moderate evangelicalism to deism.
David Josiah Brewer, usually counted among the "laissez-fairest of them all," provides an example of the persistence of traditional pedagogy. Brewer's principal teacher was Theodore Dwight Woolsey, an ordained minister and classicist at Yale. Woolsey's political views presupposed "a divinely authored moral order," and he rejected utilitarianism and modern natural-rights theory, preferring older natural-law philosophy. Brewer's constitutional theory followed along these lines. He regarded the Declaration of Independence, with its view of government as the guarantor of preexisting natural rights, as the key to interpreting the Constitution, and the 14th Amendment in particular as a clearer articulation of these founding principles.
Though late 19th-century political science had repudiated these ideas in favor of legal positivism—the idea that law was simply the will of the sovereign—classical liberalism retained vitality on the bench. It also resonated with the public at large: Brewer was the most popular justice of his day, and another popular justice, John Marshall Harlan, held a worldview and a jurisprudence quite similar to Brewer's.
Bailey's account of the educational background of 19th-century judges reinforces the writing on 19th-century American law by legal scholars such as Peter Karsten. Nineteenth-century judges changed legal rules only very reluctantly; their intellectual tradition made them deferential to precedent. Far from being instrumentalists or pragmatists, they held a traditional view of the common law as the instantiation of the natural law. They did not think that they were "making" law, but that they were "discovering" it. And when they did change the law, they usually did so in order to aid the weak and the poor, rather than to subsidize entrepreneurs. Their values still derived from the Christian and classical sources of the Founding generation.
Late-19th-century judges were engaged in an earnest quest for justice. Bailey's book provides an excellent way for us to recover the principles behind their decisions.
Young Adult Fiction
Nick Gillespie on Frank Portman's King Dork
It's a coincidence (one hopes, anyway) that the self-conscious category of "young adult" (YA) fiction and reason got started around the same point in the late 1960s. The year 1967 saw the publication of Robert Lipsyte's The Contender, a novel about a black kid torn between the lure of ghetto fleshpots and the discipline of a boxing gym, and S.E. Hinton's The Outsiders, a slim but powerful account of class resentment set in Tulsa, Oklahoma. Both were literary works published by top New York houses (Harper & Row, Viking) but specifically pushed at readers in their teen years. By the time reason hit the mag scene a year later, the YA genre was already firmly established in the minds of publishers, librarians, parents, and kids themselves.
Even more than their adult counterparts, YA novels are all about the perils and possibilities of creating meaning, identity, and community out of endless desires and limited options. How do you negotiate the past, brute reality, and your own dreams? These questions neatly parallel the chief concerns of a magazine devoted to free minds and free markets. Over the past 45 years, the YA genre has offered up an embarrassment of riches covering topics and themes ranging from puberty to parenting to politics.
The Harry Potter series alone not only covers virtually every aspect of adolescence in spades but also helped spark a reading renaissance among kids lucky enough to encounter the novels as they were being published in real time (from 1997 to 2007). Authors such as Judy Blume, Paul Zindel, and Ann Head have taught the past several generations more about lust, longing, love, and honest-to-God coitus than all the sex-ed teachers who will ever live or die. Louis Sachar's Holes (1998), set mostly in a boys' reform-school prison camp in Texas, shifts among historical settings more fluidly than anything written by Kurt Vonnegut and is as deep a meditation on race, gender, and national guilt as anything Faulkner ever coughed up.
If one YA book stands out for special notice, though, it's Frank Portman's comic-epic King Dork. Published in 2006, King Dork is an extended tour through that particular ring of hell known as high school as only the creative force of the punk band the Mr. T Experience could render it. (Portman has long fronted the group as Dr. Frank and warbles wry, funny songs with titles such as "Even Hitler Had a Girlfriend" and "The Weather is Here, Wish You Were Beautiful.")
Tom Henderson is a sophomore loser whom anyone with a soul will recognize and empathize with. "No one ever actually calls me King Dork," he announces. "It's how I refer to myself in my head." His life is an endless series of forced, unfair negotiations with the past, present, and future. He is undersized in a world of bullying jocks and cool kids, and he fits "the traditional mold of the brainy, freaky, oddball kid who reads too much, so bright that his genius is sometimes mistaken for just being retarded." At the start of the novel, his policeman father has died under ambiguous circumstances and has been replaced by "Big Tom," a well-meaning but clueless "full-on hippie" who married Tom's mother at a Neil Young concert.
At the heart of King Dork is a generational divide symbolized by the force-feeding of The Catcher in the Rye in virtually every class. "They live for making you read it," Tom says of his baby boomer teachers. As Tom investigates his father's death and tries to avoid trouble at home and school, Portman renders a rich fictional universe in which all things seem possible but also unlikely to pan out.
The power of the novel, which is steeped in high- and low-culture asides that will make you a smarter and funnier person, lies in following Tom to the point where he finally realizes that, "compared to Hillmont High School, Holden Caulfield's prep school troubles seem like a sort of heaven on earth. But honestly, I've got my mind on other things." That is, by the close of King Dork, he has made peace with the past while learning from it and is moving toward his own future.
Once an ardent supporter of Fidel Castro and the Cuban Revolution, an avid student of Marxist theory, and, briefly, a member of the Peruvian Communist Party, Vargas Llosa became nearly as famous in the 1970s and 1980s for his controversial political conversion to classical liberalism as for his precocious literary accomplishments. In 1990, having already garnered nearly every major international literary award for such novels as The Green House (1968), Conversation in the Cathedral (1975), Aunt Julia and the Scriptwriter (1982), The War of the End of the World (1984), and The Storyteller (1989), Vargas Llosa ran a nearly successful campaign for the Peruvian presidency (losing narrowly in the second round of balloting to a then little-known agronomist, Alberto Fujimori). Brilliantly recounted in Vargas Llosa's 1993 political memoir, A Fish in the Water, his campaign for the presidency was a revolutionary call for the radical liberalization of the economy, the privatization of the public sector, the wide distribution of private property, the end to collusion between a corrupt national government and elite economic interests, a comprehensive expansion and protection of human and civil rights, a rigorous respect for the rule of law, and an end to the authoritarian and paternalistic rule of both the left and right that had plagued Latin America for nearly two centuries.
Early in his career, Vargas Llosa, who embraced his role as a firebrand of the literary and cultural left, was an intimate friend and colleague of "the Boom" generation of Latin American writers that included Gabriel García Márquez, Carlos Fuentes, José Donoso, and Julio Cortázar, all early supporters of Castro and celebrants of the socialist future of Latin America. But in the early '70s, during the so-called "Padilla affair," Vargas Llosa became increasingly disenchanted with Fidel's political persecution of homosexuals and his growing censorship of Cuban writers. In the years that followed, as his interest in classical liberal thought grew, Vargas Llosa became more intimately acquainted with the writings of Friedrich Hayek, Ludwig von Mises, and Isaiah Berlin.
Vargas Llosa's critics in the press and the academy have often characterized his political transformation as reactionary. His more severe critics insisted that as he moved ever farther to "the right" his literary work suffered. They have been wrong on both counts. Among the Nobel laureate's finest and most powerful novels are his 1984 historical epic, The War of the End of the World, a Latin American War and Peace that depicts a bloody civil war in late 19th-century Brazil; The Storyteller (1989), Vargas Llosa's brilliant reworking of Kafka's Metamorphosis set among the isolated indigenous peoples of the Peruvian jungle; and The Feast of the Goat (2002), his wrenching novel about the sanguinary rule of Dominican dictator Rafael Trujillo. Not only are these later works among the author's best, they are also perfectly consonant with Vargas Llosa's early and long-standing commitment to personal liberty and political freedom, and with his lifelong animus toward abuses of power, most particularly those of an oppressive government. What has changed over the decades since Vargas Llosa first achieved literary celebrity in the mid-1960s is not his passionate eloquence on behalf of those who seek freedom or his willingness to say "no" to power, but rather his reasoned assessment of how liberty is best sought and preserved.
Viva Mario!
Michael Valdez Moses is Associate Professor of English at Duke University, editor of the journal Modernist Cultures, and author of The Novel and the Globalization of Culture (which contains a chapter devoted to Vargas Llosa). He is a contributing editor for Reason.
Yet despite their best intentions, the boys sometimes just can't help themselves. Granted, I Want to Believe does not present the FBI as fully complicit in the axis of evil; the bureau even seems willing to "forgive" Mulder his past misdeeds and welcome him and Scully back into the fold. (At the conclusion of the TV series, Mulder was secretly detained by the government in a Guantanamo-like facility on trumped-up murder charges. Tortured and denied all legal rights, he was convicted by a kangaroo tribunal and sentenced to death before making his escape and going underground. Little wonder that Mulder insists in I Want to Believe that it is not he but the government that needs to be forgiven.)
But as it turns out, the FBI acts like a magnanimous big brother only because it unexpectedly needs Mulder's expertise concerning paranormal phenomena. When flown to Washington in one of the infamous black helicopters that were symbols of government oppression in the old show, Mulder and Scully learn that the FBI is as incompetent, inflexible, cynical, and self-serving as ever.
After an FBI agent is abducted, the bureau opportunistically makes use of Mulder just as it does Father Joseph Crissman (Billy Connolly), a psychic ex-priest and convicted pedophile, whose visions may provide clues to the agent's whereabouts. Unsurprisingly, the bureau fails to earn the trust of either the citizens who assist its efforts or those whom it purportedly serves. In fact, the bureau considers dropping the case when its missing agent turns up dead (its bureaucratic imperative is at an end), even though a civilian is still missing and presumed to have been abducted by the perpetrators. Only the persistence of Mulder (who refers to himself in one scene as a "non-cop") and Scully, acting on their own without government authorization, eventually leads to the apprehension of the two suspects responsible for a series of murders, and, more importantly, to saving the life of a victim. In the end, the FBI predictably discounts Mulder's interest in the paranormal, covers up his contribution to the investigation (along with that of Father Joe, whom the bureau publicly defames as an accomplice to the crimes), and takes full credit for stopping a dangerous foreign conspiracy involving illegal traffic in human organs. The FBI not only fails to save the life of one of its own agents but manages to get a second one killed. With the singular exception of its maverick Assistant Director Walter Skinner (Mitch Pileggi), who acts out of personal loyalty to Mulder and Scully, the FBI does nothing to solve the case; its methods and protocols only retard its satisfactory resolution. Another job well done at J. Edgar's old haunts.
If I Want to Believe portrays the government as incompetent and self-serving rather than, as the TV series had it, malign and conspiratorial, the film nonetheless remains focused on the difficulty of distinguishing the good guys from the bad guys in the contemporary political scene. Far from being haphazard and meandering (as a few critics have suggested), the movie is carefully structured around two parallel stories. The main plot involves two perpetrators, Janke Dacyshyn (Callum Keith Rennie) and Franz Tomczeszyn (Fagin Woodcock), who have serially abducted, murdered, and dismembered several victims. The same-sex pair, who are legally married in the state of Massachusetts, rely on a team of Russian scientists to perform a series of "full-body" transplants that prolong the life (or, more accurately, the head) of Tomczeszyn. The latter, who suffers from cancer as a result of radiation poisoning, manages to extend his life by having his head serially grafted onto the bodies of his victims (mainly young women), who share his AB negative blood type. Tomczeszyn heads a firm that legally deals in the transportation of human organs, but he and his partner also appear to be engaged in the illegal trade of organs harvested from their murder victims. Father Joe, the ex-priest, believes his visions provide him with a psychic link to one or more of the victims abducted by the murderers; but as it turns out, his real spiritual connection is with the perpetrator, Tomczeszyn, one of 37 altar boys whom the convicted pedophile sexually abused in the past.
A second seemingly minor subplot involves Scully's efforts at Our Lady of Sorrows Hospital to save the life of Christian Fearon (Marco Niccoli), a young boy suffering from Sandhoff disease, a fatal degenerative neurological disorder with no known cure. Over the objections of Father Ybarra (Adam Godley), the hospital administrator, and Christian's parents, Scully attempts a radically new "and extremely painful" therapy that draws on the newest breakthroughs in "stem cell research."
Though they may seem unrelated, the two plots tell the same story: a tale of the Promethean effort to transform, adapt, remake, and preserve human life outside traditionally defined "natural" limits. At its best, the film imaginatively explores the great promise and the tremendous dangers of the brave new world of biotechnology. Dacyshyn and Tomczeszyn are meant to embody practices that violate the "natural order" of things: gay marriage, gender reassignment, trade and transplantation in human organs, stem cell research, vivisection, hybrid speciation, the indefinite prolongation of life. By contrast, Mulder and Scully seem to represent the traditional heterosexual couple who respect the limits of nature and who heroically save an innocent soul from the clutches of what the papers call a "modern day Doctor Frankenstein" at the risk of their own lives. Unlike their dark counterparts, Mulder and Scully do not resort to kidnapping and murdering young women in order to maintain their relationship or prolong their own lives. Nonetheless, Mulder and Scully are more representative of the brave new world than they or we might like to believe. Scully confesses to Father Joe that she and Mulder are not married, though they have lived together for several years. And although they have had a "son" (William), whom they've given up for adoption, they have never been sure if he was conceived through "normal" sexual relations. Fans of the show will recall that William may be a human-alien hybrid created by a conspiratorial syndicate via a sinister combination of genetic engineering and in vitro fertilization.
Moreover, Scully, despite her professed faith in the Catholic Church, eagerly embraces the cutting edge of (presumably embryonic) stem cell research and the most radically experimental forms of medical technology in her attempts to save the life of a dying boy. Were she to take on faith the Church's view that life begins at conception, she would have to concede that her attempts to save the life of her patient depend upon sacrificing the "lives" of those embryos from whom the crucial stem cells have been gathered. In any case, she refuses to allow the hospital to release Christian Fearon to a hospice and insists upon an untried medical procedure with little chance of success—but with the certainty that her patient will suffer agonizing pain.
Suggestively, the advanced stem cell research she relies on for her new surgical procedure appears to have been pioneered by the very same Russian medical team that carries out full-body transplants on Tomczeszyn. Mulder and Scully can, of course, fall back on their long-standing emotional and spiritual connection with each other to win our sympathies, but there's no reason to believe that their homosexual counterparts are any less fanatically devoted to each other—indeed, we witness a tender bedside scene between the two in which Dacyshyn assures his failing partner that he will live, that he's "going to have a fine strong body again." The two couples thus represent two images of the very same phenomenon: the human endeavor to master nature and prolong human life. The parallels between the two couples work to erode the questionable distinction between what is or is not natural, and between those who live according to a natural order and those who challenge its authority.
The pivotal figure that conjoins the two plots is Father Joe. When we first meet the ex-priest, he significantly lights up a cigarette, a gesture that visually connects him with the arch-villain of The X-Files, the Cigarette Smoking Man. (Father Joe's long grey hair and terminal lung cancer are additional links to the Cigarette Smoking Man of the series finale.) The ex-priest is tormented by what he calls "monstrous appetites," desires he never asked for, and which he believes must come from God. In fact, he claims that he only ever wanted to "serve God," and he now seeks divine forgiveness for his sins and reengagement with the Church. In a remarkable scene, he spontaneously bleeds from his eyes after he envisions the body of the young woman tormented by the very person Father Joe abused years before. His painful visions turn out to be the penance he pays for his own "unnatural" desires. Pedophile and priest, psychic and saint, Father Joseph is that quintessentially ambivalent figure in whom the flesh and the spirit, good and evil, are thoroughly mixed. And it is he who prophetically tells Scully that she "must not give up," which, she will learn, means she should pursue her Frankensteinian efforts to save the life of her dying patient, Christian Fearon.
At the close of the romantic era in early 19th-century Europe, a number of writers and artists who had once been enthusiastic supporters of the French Revolution lost faith in the capacity of mankind to remake the world according to a new ideal of human freedom. The terror of the Revolution, the rise of Napoleon, the endless wars that marked his reign, and the repressive counter-revolutionary period that followed the Congress of Vienna made it difficult for these late Romantics to sustain a faith in liberal ideals or in the possibility of meaningful political reform. Lord Byron, Georg Büchner, and Mary Shelley did not so much turn away from politics as apply the bitter political lessons of their age to the cosmos as a whole. Rather than understand the corrupt ancient regimes of Europe as the source of human unhappiness and servitude, they projected a Gnostic vision of the universe as malignantly organized. In post-9/11 America, Chris Carter's conspiratorial view of government as a plot against the people has, like the visions of the late Romantics, become something of an indictment of the cosmos. The defects of a corrupt and incompetent government would seem to inhere in the nature of things. As Father Joe would have it, it is God (or if you prefer, nature's God) who authorizes those "monstrous" or unnatural desires that set loose evil in the world. And yet, for all the metaphysical darkness that pervades I Want to Believe, the film ends on a hopeful note: Scully takes heed of Father Joe's cryptic message, "don't give up," a message also sent by the divine powers. However harrowing the possibilities and however monstrous the risks, she's willing to accept the Promethean challenge and ventures forth once more to try to save the life of young Christian.
Michael Valdez Moses is Associate Professor of English at Duke University, author of The Novel and the Globalization of Culture, and co-editor of Modernism and Colonialism: British and Irish Literature, 1899-1939.
Coetzee's work mixes formal elements in ways that are often unsettling. He blends memoir with fiction, academic criticism with novelistic narration. When he received the Nobel Prize for Literature in 2003, Coetzee delivered not a traditional lecture but a meditation ostensibly written by Robinson Crusoe about "his man," the novelist Daniel Defoe. In his 2003 novel Elizabeth Costello, Coetzee transformed a series of academic lectures he gave over several years into a full-fledged fiction about an aging female novelist who hails from Australia, whose career at times eerily resembles Coetzee's own, and whose life intersects with that of Coetzee's contemporaries. Were that not strange enough, this "novel" does not end with Elizabeth Costello's death, but follows her misadventures into an afterlife that she herself recognizes as a kind of cut-rate parody of Kafka's parable, "Before the Law."
Similarly, Diary of a Bad Year refuses to recognize the border that has traditionally separated political theory from fictional narrative. Indeed, Coetzee suggests that the politics of an oppressive state are only one dimension of a broader web of contention that encompasses the private struggles of his characters. In this complex work, his characters' lives are marred by the very conflicts and limitations out of which the state itself originates. The search for an apolitical existence free from the evils of state power inevitably comes face to face with the dangers and discomforts of a "natural" world where the clash of personal desires is unregulated by any independent governmental authority.
The book's narrative structure is formally innovative and technically ambitious. It is partly narrated by "Señor C" (a.k.a. "John," "Juan," and "J C"), a world-famous writer in his seventies suffering from Parkinson's disease, who has recently left his native South Africa to take up residence in Sydney, Australia. (Coetzee currently resides in Adelaide, Australia.) Señor C counts among his many internationally known works of fiction and nonfiction a novel, Waiting for the Barbarians (a book actually published under Coetzee's name in 1982), and a study of literary censorship that sounds suspiciously like Coetzee's own 1996 book Giving Offense.
Diary of a Bad Year features two diaries, each written by Señor C. The first and longer diary, commissioned by a German publisher, consists of a set of "Strong Opinions" on timely political and social subjects: "On the origin of the state," "On anarchism," "On Machiavelli," "On Al Qaida," "On Guantanamo Bay," and so on. The second diary consists of "gentler" opinions, not intended for publication: "On the erotic life," "On aging," "On compassion," "On the writing life," "On J. S. Bach." These are less obviously political and more personal in tone and subject.
To make matters more complicated, both diaries usually share the book's pages with other strands of narrative. Solid horizontal lines divide the pages into sections. A top section consists of Señor C's "strong" and "soft" opinions. A middle section includes his more intimate record of an ongoing (and "Platonic") relationship with Anya, a sexually alluring half-Filipina twenty-something whom C meets in the laundry room of his apartment building. Finally, a bottom section offers Anya's own narrative, in which she reflects on Señor C, who hires her to be the typist of his "Strong Opinions" manuscript (though he seems far more interested in the scantiness of her clothing than in her lamentable typing skills). She also chronicles her turbulent ongoing relationship with her boyfriend, Alan, an unsavory 42-year-old investment counselor.
Coetzee adds yet one more narrative twist: C's voice often gives way to Anya's in the "middle" sections, while Anya's "bottom" sections are often colonized by Alan's voice. The result is an intricate interplay between Señor C's pronouncements on a wide range of political, cultural, and highly personal subjects and the emotionally resonant story of the deeply fraught romantic triangle involving Señor C, Anya, and Alan.
This complex, fugal narrative is more than a mere exercise in technical virtuosity. Diary of a Bad Year poses serious and deeply troubling questions: "Why is it so hard to say anything about politics from outside politics? Why can there be no discourse about politics that is not itself political?" Señor C's struggle to describe a world free of state power is burdened by his realization that he lacks an adequate literary form to represent such a perfectly free existence. What, C might wonder, would the language of pure freedom, of undiluted individual autonomy sound like? What hitherto unknown literary genre or artistic form, uninflected by the sorry history of human government, might body forth such a world? It is as if we were to ask what tongue Adam spoke before the fall.
Although Alan (a not entirely convincing representative of amoral capitalism and belligerent "neo-liberalism") describes Señor C as a sentimental socialist, C characterizes his own brand of political thought as "pessimistic anarchistic quietism." He explains: "anarchism because experience tells me that what is wrong with politics is power itself; quietism because I have my doubts about the will to set about changing the world, a will infected with the drive to power; and pessimism because I am skeptical that, in a fundamental way, things can be changed."
In wide-ranging remarks that wrestle with the political thought of Aristotle, Machiavelli, Hobbes, and Etienne de la Boétie, the 16th-century philosopher of civil disobedience, Señor C finds it especially troubling that "the only 'we' we know—ourselves and the people close to us—are born into the state; and our forebears too were born into the state as far back as we can trace. The state is always there before we are." For Señor C, the state originates in a criminal conspiracy: gangs of armed men employ force to extort money and obedience from their "subjects."
C insists that the criminal activities of the state do not end with the act of its founding, but are always and everywhere present in the contemporary world. The aerial bombing of civilian populations, the detentions at Guantanamo Bay, the suspension of civil liberties and widespread increase of surveillance in the war on terror—all speak to the fact that the state establishes its absolute sovereignty through violence.
By such means the state continually demonstrates that there is no authority higher than itself, that it is the ultimate source of all law and justice, that it possesses the "right" to treat "outlaws"—that is, individuals who reject its legitimacy—with impunity. What particularly depresses Señor C is the universal powerlessness and (more worrisome still) unwillingness of individuals to throw off this criminal conspiracy that goes by the name of "the state."
For all the passion of his opinions, Señor C senses their ineffectuality. Merely to express his outrage in print will not bring about a fundamental change in contemporary political life. In fact, it might paradoxically suggest a willingness to play by the rules set down by the political status quo. Insofar as the system tolerates C's "strong opinions" (and even indirectly rewards him financially for giving them vent), it demonstrates that it can quite easily withstand the most vehement and radical jeremiads of its critics. C's meeting with Anya, and his willingness to let her read and comment on his "strong opinions," alerts him to the need to revise his opinions or offer an alternative set of reflections. The increasingly personal and intimate nature of his second, "gentle" diary, which Anya prefers to the first, marks C's decisive turn from public to private affairs, from an aggressively anti-political to a more evasive apolitical mode of being in the world.
One might understand C's second diary as an attempt to resist the power of the state by burrowing ever more deeply into the (ever shrinking and always imperiled) sphere of his private life. And just as C's second diary corrects his first, and thereby ideally serves to delimit the sphere of politics, the more intimate narrative streams carry Coetzee's reader that much farther away from the public realm the state claims as its own.
But if the novelistic world inhabited by C, Anya, and Alan provides a minimal and precarious refuge from the omnipotence of the state, it is not, at least as Coetzee portrays it, a utopian or prelapsarian realm. Indeed, they discover that their personal lives are blighted by the very ethical disagreements, primal struggles, and potentially dangerous forms of sexual and material competition that historically gave rise to the state itself (or, at any rate, provided a pretext for its establishment).
In an effort to steal C's financial assets, Alan has used the unwitting Anya to plant spyware on C's computer, and he reads both of C's diaries in manuscript without the author's prior consent. Though Anya objects to the scheme and Alan ultimately abandons it, he nonetheless humiliates the septuagenarian author, informing him in savage fashion that he is not only a hopeless political relic but also an over-the-hill Don Juan whose sexual interest in Anya will never be reciprocated. By the end of the novel, C is left to confront his loneliness, his inexorable physical decline, and his inevitable death with only his stories, memories, and fantasies of Anya as potential sources of solace.
In Coetzee's fiction, the story of domestic life can be nearly as cruel and merciless as the political world. But at least the misdeeds and missteps of private existence have the virtue of being freely chosen. For Coetzee's characters, the difference between involuntary subjection to the state and a freely chosen individual path, however harsh and barren, may be the only difference that matters.
Contributing Editor Michael Valdez Moses is associate professor of English at Duke, author of The Novel and the Globalization of Culture (Oxford University Press), and co-editor of Modernism and Colonialism: British and Irish Literature, 1899-1939 (Duke University Press).
Given that two of the films were directed by a pair of cinema's most bankable directors, George Lucas and Steven Spielberg, and that two were "installments" in hugely popular movie franchises, the Star Wars and Batman series, the financial success of the three films was not entirely unexpected. Then again, not every Spielberg film has proven a sure-fire hit (remember 1941 and The Terminal?), and more than one Batman and Star Wars film has disappointed critics and fans alike. But given the number of big-budget dogs that limped out of the theaters this summer (Kingdom of Heaven, The Island, The Adventures of Sharkboy and Lavagirl in 3-D), I suspect that the success of the top-three summer releases cannot be wholly explained by the cinematic equivalent of Viagra: super-sized FX and A-list stars.
According to the controversial thesis of the revered film scholar and theorist, Siegfried Kracauer, film, as a uniquely popular medium, makes visible images of the deep undercurrents of the national political psyche. A member of the Frankfurt school of social criticism, which combined the thinking of Marx, Freud, and to a lesser degree Nietzsche, Kracauer insisted that the movies reveal the fears and desires of "the masses," even as they embody the conscious design of artists and craftsmen who seek the critical (and commercial) approval of the theater-going audience. While I am suspicious of Kracauer's claim that the movies foretell the future destiny of a people (he saw in the German Expressionist films of 1920s Weimar a prophetic anticipation of the rise of Hitler and Nazism), I would nevertheless suggest that each of this summer's three blockbusters, whatever its flaws, registers an ongoing shift in the popular psyche, a change in worldview that post-dates the events of 9/11 and responds to the continuing political reverberations of that fateful day.
Each of the three films portrays a violent, even cataclysmic attack on the heart of modern "civilized" society that is calculated to resonate with a (mainly American) audience saturated by media reports and images of the 9/11 attacks on the World Trade Center and the Pentagon.
In Star Wars, ground zero of the catastrophe is the administrative center of Coruscant, capital planet of the old Republic (and the emergent Galactic Empire). Its most conspicuous public buildings—the Senate, the Jedi academy and temple, the administrative center of the Republic's military elite—are either seriously damaged or left a smoking ruin as a result of a violent coup, plotted in secrecy by the "hidden" Sith Lord.
When aliens launch their surprise attack on Bayonne, New Jersey, in War of the Worlds, crushing cars, incinerating buildings, and reducing its human inhabitants to ashes, Spielberg clearly expects his audience to recall the ghastly images of dust-covered New Yorkers fleeing billowing clouds of material and human debris as the World Trade Center collapsed. As the hero, Ray (Tom Cruise), attempts to flee the carnage with his family, his daughter Rachel (Dakota Fanning) screams: "Is it the terrorists?" Still later, the family emerges from the ruins of Ray's ex-wife and her husband's home, where they falsely believed themselves safe: Slowly they pick their way through the wreckage of a commercial airliner (presumably brought down by the alien invaders) that has crashed into this upscale neighborhood. Later still, amidst crowds of refugees attempting a river crossing to evade the alien attackers, the beleaguered family sees hundreds of photos of the lost or missing that have been fixed to walls, hand-held signs, and notice boards by anxious friends and relatives.
In the climax of Batman Begins, Henri Ducard (Liam Neeson), the nominal head of The League of Shadows, a secret world-wide network of assassins and terrorists, commandeers an elevated passenger train and attempts to crash it into the tallest building in Gotham City, the financial headquarters of Wayne Enterprises (in actual fact, the Chicago Board of Trade). The ultimate nightmare that Batman (Christian Bale) just barely heads off is Ducard's deployment of weapons of mass destruction—biological or chemical agents—against Gotham City. If in the immediate aftermath of 9/11 the studios and networks scrambled to censor or delay the release of movies and TV episodes that evoked the specter of the terrorist attacks, this past summer marked a moment in which such images have come to saturate our popular films, have in fact become the iconographic currency of contemporary cinema.
It is tempting to suggest that 9/11 has come to define a new epoch of American popular culture in much the same way that the Great Depression, World War II, the Cold War, and the Vietnam War might be said to have defined earlier periods. To be sure, these few crucial events in American history are not the sole touchstones of American cultural sensibility in any given epoch, but I think 9/11 might now be said to belong to those relatively few momentous events in American political history that have fundamentally shaped the popular culture and to which our collective psyche repeatedly and even obsessively responds. Just as one can find traces of a "Cold War sensibility" in films from the 1950s and 1960s as generically diverse as Dr. Strangelove, The Manchurian Candidate, Invasion of the Body Snatchers, From Russia with Love, The Ten Commandments, and The Searchers, so too can a post-9/11 worldview be said to animate not only the three above-mentioned films, but also (to name only a few) The Incredibles, Kingdom of Heaven, Team America: World Police, and Fantastic Four. But what is perhaps most interesting about this recent shift in popular sensibility is what it suggests about our altered understanding of America's world status, national identity, moral authority, internal cohesiveness, and conception of heroism.
The three summer blockbusters all feature ethically suspect and morally ambivalent heroes. Given that Anakin Skywalker (Hayden Christensen) becomes, in the course of Revenge of the Sith, Darth Vader, it would be belaboring the obvious to insist that the protagonist of Episode III is not an unsullied paragon of republican virtue. It's still worth considering just exactly what Anakin does in the course of becoming the Dark Lord. In the post-9/11 world of Gitmo and Abu Ghraib, of military tribunals, indefinite detention, and suspension of habeas corpus, Anakin's decision to execute Count Dooku (Christopher Lee) without trial or due process, and merely on the "orders" of the Supreme Chancellor Palpatine (Ian McDiarmid), takes on a sinister contemporary significance. Anakin is ultimately responsible for the wholesale massacre of the "younglings," the innocent child cadets of the Jedi temple (an act anticipated by his earlier massacre of a tribe of Tusken Raiders who held captive his mother in Star Wars: Episode II—Attack of the Clones). The fact that Anakin later improvidently hinders the Jedi master Mace Windu (Samuel L. Jackson) from summarily executing the Chancellor (who is in fact the Sith Lord), and thereby succumbs to the power of the dark side, merely deepens the moral morass in which he flounders.
Is it only a coincidence that Bruce Wayne (Christian Bale) faces precisely the same moral dilemma in early scenes in Batman Begins? He makes plans to assassinate the murderer of his parents after the killer is paroled from prison because, in Wayne's world, the legal process fails to uphold true justice. Bruce then becomes a super-ninja in the remote mountainous redoubt of Ra's Al Ghul (Ken Watanabe), an Eastern religious mystic cum leader of an international terrorist network (an allegorical stand-in for Osama bin Laden in the mountains of Afghanistan). Wayne must prove his loyalty to Ra's Al Ghul and his chief lieutenant, Henri Ducard, by summarily executing an alleged "criminal."
Unlike Anakin, Wayne cavils at the demand and ends up fighting rather than supporting his mentor and former ally. But if Wayne, like another "W," is the privileged son of a former leading public figure of Gotham, his reintegration into polite society proves highly problematic. As Batman, he becomes the thing he fears (a bat), and more to the point, adopts the tactics and methods of those fanatics who threaten to destroy Gotham, which they pointedly characterize as a sink of moral corruption. (Given Wayne's affected playboy escapades in the fashionable restaurants of Gotham, which provide a welcome comic note in the film, the moral critique, however myopic, would seem to be a response to the social realities of big city life.) The consummate vigilante who refuses to be bound by the rule of law, Batman spends as much time eluding the police as he does breaking and entering, crashing into a great many occupied vehicles on the roadways, recklessly endangering the lives of civilians, cavalierly employing sophisticated forms of surveillance on the Gotham populace, wielding the most sophisticated battlefield weaponry within the "peaceful" confines of Gotham, and necessarily deceiving all but his closest confidante Alfred (Michael Caine) about his true intentions and actions. Given that Christopher Nolan, the British director and co-writer of the Batman screenplay, attends so meticulously to Wayne's discovery and transformation of the bat cave into Batman's command and control center, I might be forgiven for recalling President Bush's now infamous characterization of Al-Qaeda as those "fellows" who "burrow into caves."
To be sure, Jedi knights and comic book super-heroes are expected to commit plenty of mayhem if they're to pack the seats and sell popcorn at the cineplex, but even so ordinary a hero as longshoreman Ray Ferrier resorts to some pretty unsavory acts in the course of protecting his family: stealing a car, running over pedestrians desperate for a ride, threatening unarmed refugees with a handgun, defying police and military authorities who try to prevent the overcrowding of a ferryboat, and most problematically, killing in cold blood Harlan Ogilvy (Tim Robbins), a survivalist who insists that Ray join him in violently resisting the aliens.
Ultimately, the moral ambiguity of these summer movie heroes reveals an unsettling dimension to contemporary American self-identity. It turns out to be increasingly difficult, even in the realm of comic book narratives, to distinguish the good guys from the bad guys. In War of the Worlds, the aliens are the faceless terrorists of 9/11 and, if the film's screenwriter, David Koepp, is to be trusted, the "Martians" also "represent American military forces invading the Iraqis."
No doubt Koepp and Spielberg drew their inspiration from H. G. Wells' novel, which was only one among many wildly popular versions of the "reverse invasion scare," tales of the British Empire suddenly overrun by an alien military force. In Wells' 1898 novel the Martians appear to the hapless English citizens as the invading British imperial troops might have to the "fuzzies" of the Sudan or the Zulus of the Natal in the late 19th century: a technologically superior and apparently irresistible military force. And yet, they are defeated, just like the French colonial armies in Algeria—the very subject, incidentally, upon which Ray's son Robbie (Justin Chatwin) is supposed to compose an essay for school. Koepp insists that one legitimate interpretation of War of the Worlds is that "the U.S. military intervention abroad is doomed by insurgency, just the way an alien invasion might be." Koepp's remarks help to explain why American resistance fighters engaging the enemy on their home soil resort to the desperate hit-and-run tactics of the Iraqi insurgents: Ray acts like a suicide bomber, clutching a satchel of hand grenades as he's seized by one of the alien tripods; the remnants of America's troops rely upon the equivalent of RPGs in their hit-and-run attacks to take down alien armored war machines.
Those who value their civil and political liberties might worry over the dark visions that have been on offer as light summer entertainment. Republics become despotic empires (Star Wars); the failures of incompetent and corrupt government incapable of upholding the rule of law or protecting its citizens give rise to a form of vigilantism nearly as lawless and destructive as the "foreign" enemies who wish to destroy Gotham (Batman Begins); the monstrous alien threat to American life, liberty, and property, which we properly fear and loathe, turns out, on closer inspection, to be us (War of the Worlds).
Of course, from Captain Ahab to Ethan Edwards (John Wayne) in John Ford's The Searchers and Pike Bishop (William Holden) in Sam Peckinpah's The Wild Bunch, the history of American culture is littered with morally ambivalent heroes who have revealed the ethical contradictions and political self-doubts at the heart of the American political experiment. Not for the first time, popular narratives reveal a profound division within the national psyche. As in the age of the Cold War and the Vietnam era, today's blockbusters hint at deep and even violent tensions at work in the body politic.
Not coincidentally, this summer's blockbusters insistently portray a city, nation, or galactic republic in which anarchy is loosed. The citizens of these crypto-American regimes are set fiercely and even murderously against each other. Each film ultimately pivots upon the efforts of a morally dubious hero to protect, revenge, or reconstitute the remnants of his family. His precarious familial attachment might be to a spouse (Star Wars), a child (War of the Worlds), or a parent (Batman Begins). Ominously, all of our summer heroes remain, at the end of their tales, conspicuously alone. (Even Ray Ferrier, heir to a typically sentimental Spielberg conclusion, though he manages to deliver his son and daughter safely to their grandparents' home in Boston, can only watch as his children rejoin a family circle—Ray's ex-wife, her new husband, and Ray's former in-laws—to which he can have at best a marginal relationship.) Perhaps it will be the visible signature of the post-9/11 epoch that the political and social conflicts that beset the popular hero inevitably frustrate his profound and persistent desire for domestic tranquility.
The original Star Wars (1977) grossed $513 million worldwide, and its 1997 re-release earned $460 million. International theatrical rentals of the film swelled the total by an additional $779 million. (Not bad for a film that cost $11 million to produce.) The Fellowship of the Ring (2001) has proved fabulously popular, with worldwide grosses totaling $860 million and worldwide rentals $500 million. The Harry Potter series has produced only slightly less magical returns, with $600 million in worldwide rentals alone for Harry Potter and the Sorcerer's Stone. Nor will the flood of such films subside anytime soon. The VHS and DVD editions of Harry Potter and the Chamber of Secrets have recently been released. The theatrical premiere of The Return of the King (the third part of The Lord of the Rings) is slated for December 2003. Harry Potter and the Prisoner of Azkaban and Harry Potter and the Goblet of Fire are due out in 2004 and 2005. The next Star Wars film will be released in 2005, and a third trilogy of Star Wars movies is now in the planning stages.
While each of these series is impressively original, collectively they exhibit enough common features to prompt the question of why a certain film genre, loosely categorized as science fiction/fantasy (purists would scoff at this conflation), has unexpectedly assumed pre-eminence in contemporary culture. The question is no less relevant for viewers, such as myself, who have been disappointed by some of the recent episodes in these series—The Phantom Menace, Attack of the Clones, and Harry Potter and the Chamber of Secrets.
The popularity of these films owes much to the immense talent that has made possible their cinematic realization (and, with The Lord of the Rings if not Harry Potter, to the literary genius that gave birth to the novels on which the films are based). Technological advances in special effects and computer-generated imaging have lent a narrative credibility and spectacular quality to these movies that hitherto was unachievable. (By contrast, recall Ralph Bakshi's feeble 1978 animated rendering of The Lord of the Rings). The fact that all three series appeal to a broad audience that includes children and adolescents, as opposed to more "serious" films that market themselves exclusively to "mature" viewers, no doubt helps explain their box office success. But we are left to wonder why these particular films, as opposed to so many other well-crafted, financially successful, and technologically advanced pictures aimed at a general audience—Ice Age, Stuart Little, Toy Story, Shrek—have proved so culturally resonant.
One significant feature common to all three series is a dramatically compelling (as opposed to a didactically plodding) struggle between good and evil. The protagonists of these films do battle with a potent, even superhuman incarnation of evil: Sauron in The Lord of the Rings, Voldemort in Harry Potter, Darth Vader in Star Wars. Neatly counterpoised to these demonic figures are characters possessing magical or mystical powers who lead the fight for goodness and justice: Gandalf, Dumbledore, Obi-Wan Kenobi (and Yoda). Between these moral poles stand a set of emblematic heroes—Frodo Baggins, Harry Potter, Luke Skywalker—who, though they struggle against evil, nevertheless discover that they are related to or tempted by the evil figure they ostensibly oppose. Their struggle against evil ultimately turns out to be internal as much as it is external. As the literary critic Tom Shippey has pointed out, J.R.R. Tolkien's Rings trilogy makes a strong theological appeal. Like Harry Potter and Star Wars, it offers a mythological explanation of the apparent chaos, pain, disappointment, horror, and violence of the world in terms of a Manichean struggle of cosmic forces.
But if each of these series reiterates a theological narrative rooted in Western European Christian societies, they nonetheless respond to a specifically modern set of social anxieties. Indeed, each expresses a deep discomfort with modernity itself. If the fundamental narrative structure of the films borrows heavily from tradition, the specific forms that both good and evil assume within them are those of the modern world.
This may seem counterintuitive, given that science fiction and fantasy famously provide an escape from the realities of contemporary life. Yet all three series remain thematically rooted in the social problems from which they provide a cinematic holiday. Each represents an imaginary world that is and yet is not our own. Each presents a critical view of our modern world, and each offers a glimpse of a different and preferable modernity. These alternative worlds integrate appealing elements of the premodern past into a vision of the future.
One of the contemporary discontents to which all three series respond is a general boredom with modern bourgeois existence. The escapism of these stories is an antidote to the routine that is the special curse of safe, static middle-class life. The suffocating and vulgarly materialistic world of the Dursleys, the family that initially raises the orphaned Harry Potter; the pettiness and relative inconsequence of life in the Shire, where Frodo Baggins was born; the laborious and task-centered existence of the young Luke Skywalker on his aunt and uncle's dusty provincial farm—all bespeak the ordinary world of the middle class. Harry Potter and the Sorcerer's Stone provides an especially unfavorable view of modern bourgeois life. The Dursley home is what Harry most eagerly wishes to escape: His life is literally cramped, his daily existence reduced to the meager dimensions of a closet-cum-bedroom under the stairs.
It is no accident that all three series initially focus on the fortunes of an adventurous youth. Harry, Luke, and Frodo all secretly hope not to become like their elders. Though by no means revolutionary firebrands, all rebel against the older generation. All share a common desire for freedom, travel, and heroic adventure, a yearning to leave behind the safe but restrictive world in which they were raised.
The fact that such narratives strike a responsive chord among parents as well as their children suggests that the lure of adolescent rebellion against bourgeois life may be as strong—or stronger—for middle-aged, middle-class adults whose hopes of realizing their youthful romantic aspirations are rapidly fading. If The Lord of the Rings, Star Wars, and Harry Potter reject the modern bourgeois order, each also implicitly rejects the notion that each generation should emulate and carry on the traditions of their elders. As such, all three exemplify the spirit of romantic rebellion against the modern bourgeois order—a rebellion that is itself fundamentally modern and bourgeois. Each generation thinks of itself as progressive and its parents as conservative or reactionary. Yet the traditions of the older generation are in no way deeply rooted in the past, emerging instead from a relatively recent rejection of the norms of a not-so-distant generation.
This cycle of generational revolt is complicated by the curious fact that the heroes of these romances ultimately stumble on a secret inheritance at odds with their staid middle-class lives. Each learns that his true parents are nothing like the people who actually raised him. Luke makes the fateful discovery that his father was a Jedi knight. Harry finds his real parents were wizards. Frodo takes after his errant, widely traveled, and eccentric uncle Bilbo, who bequeaths to him the very ancient Ring of Power. All these romantic heroes rebel against their given social role and paternal expectations in the name of a hitherto concealed "true" identity rooted in an obscure and mysterious past. To become a wizard, a Jedi, or the Ring bearer is to claim an ancient title and anachronistic position impossible to reconcile with the narrow social possibilities and historical limitations of life in Little Whinging, the Shire, or a rural community on the outskirts of a galactic empire.
Were the heroes of these films to leave their tedious lives for a world utterly alien from the mundane ones they had known, these stories would offer pure escapism. Instead, each encounters a hellish version of the modern world he has fled. In the first Star Wars trilogy, Luke battles an empire propped up by a vast, robot-like army of imperial storm troopers and equipped with the most deadly weapon of mass destruction, the Death Star. Joining a loose confederation of rebels, he helps spearhead a war of independence against a regime that is soulless, tyrannical, hegemonic, and technologically based. Given that George Lucas' original trilogy was produced during the final years of the Cold War, it is not hard to identify his Empire with what Ronald Reagan famously described as "the Evil Empire." (That critics of Reagan's Strategic Defense Initiative derisively dubbed that program "Star Wars" further melded the series with historical reality.) Despite costumes freely adapted from the Flash Gordon serials and titles borrowed from Roman antiquity, the Empire represents a fusion of fascist and communist elements.
Luke and the other freedom fighters who oppose this totalitarian menace are characterized by their relative independence (suggested by Han Solo's very name), their voluntary and negotiable participation in the great struggle, and their comparatively decentralized forms of resistance and military organization. Perhaps the most interesting thematic development of the second trilogy (now two-thirds complete) is the revelation that the Empire emerged historically out of an apparently free and republican federation of planets. While reiterating in simplified form the transformation of republican into imperial Rome, the overall arc of the Lucas films hints that the totalitarian Empire issued from the ostensibly democratic and free Federation. And whereas the first trilogy celebrated the exploits of independent smugglers (Han and Chewbacca) who join up to fight an oppressive Empire that controls galactic trade, the second trilogy begins with the revelation that an evil Trade Federation, sanctioned by the Republic, secretly plots to take over the independent planet of Naboo. Surely the doubts about the collusive relationship between big business and big government, and the growing concern with the attendant loss of freedom in the Republic, reflect the changed post–Cold War environment of the 1990s: The only hegemon remaining is not the Evil Empire of the Soviets (much less the ever more transparent ghost of Nazi Germany) but the last great superpower: America.
Lucas claims to have laid out the overall arc of his story long before the fall of the Soviet Union and the Eastern Bloc. Even so, there is an unexpectedly subversive strain in the Star Wars saga that has become more pronounced as the long narrative has unfolded. The whole series turns, after all, on the fact that the great hero of the Federation, the Jedi Anakin Skywalker, goes over to the "dark side." In the process, he literally loses his individual human features and becomes something halfway between man and machine, trapped within the inhuman armor and featureless technological mask of Darth Vader. That the defenders of individual freedom and democratic liberty might paradoxically come to resemble the totalitarian enemy they once opposed is darkly suggested by the final scene of the first Star Wars film. Lucas' concluding shot of Han, Luke, and Chewbacca triumphantly advancing toward the dais on which stands Princess Leia recalls a famous scene from the Nazi propaganda film Triumph of the Will: Leni Riefenstahl's high-angled shot of Hitler and two top aides (including Goebbels) walking in measured fashion between the vast crowds at the 1934 Nuremberg rallies. The grim suggestion of the Star Wars films, borne out by the more spectacular but dramatically less compelling second series, is that the regime that once represented freedom and democracy has itself become corrupt, centralized, soulless, intrusive, despotic, and imperial.
Frodo's quest in The Lord of the Rings brings him face to face with a similarly terrifying image of modern political life. As Shippey has noted, notwithstanding Tolkien's denial that his trilogy was an allegory of the political events of 1939–45, both Isengard and Mordor, the realms of the Tolkien villains Saruman and Sauron, exhibit many features of 20th-century totalitarian states. (To be fair, they also exemplify characteristics of the German, Austro-Hungarian, and Turkish powers of World War I.) Once enemies, Saruman and Sauron have become allies, though their alliance is built on the mutual recognition that the weaker must ultimately serve the stronger. Their dark kingdoms lack all individual liberties. They rule vast worker-states devoted to conquest and marked by compulsory military service and forced labor. Their orc armies consist of deformed and inhuman masses lacking relations, beliefs, traditions, and interests outside the direct control of the state. In a scene vividly captured in one of Peter Jackson's Lord of the Rings films, Saruman breeds a new species of orc, the Uruk-Hai. Such eugenically designed creatures literally owe their existence to the regime's dark powers.
The oppressive conditions of these militaristic states are underscored in Tolkien's final part of the trilogy, The Return of the King. Here Frodo learns that Sauron built the fearsome Tower of Cirith Ungol not for defensive purposes but in order to keep those under his power within the borders of Mordor. Any reader old enough to remember the great totalitarian regimes of the past century will recognize the allegorical dimension of Tolkien's work.
J.K. Rowling's vision of political and social evil is rather obscured in the first two Harry Potter films. But in the climactic scene of Rowling's first novel, in which Professor Quirrell reveals himself to be Harry's hidden enemy, the two-faced villain recites the creed of his master, Voldemort: "There is no good and evil, there is only power and those too weak to seek it." Voldemort's vulgar Nietzscheanism is a familiar moral nihilism that can never be entirely vanquished. As opposed to Harry, who represents the redeeming power of love, Quirrell has become, like Voldemort, "full of hatred, greed, and ambition." His amoral doctrine awaits a new incarnation by means of which it can re-enter the world. Should Voldemort ever succeed, Hogwarts—Harry's school of wizardry and witchcraft—would become "a school for the Dark Arts." Rowling suggests that this evil has manifested itself fairly recently in the modern era: "Dumbledore is particularly famous for his defeat of the Dark wizard Grindelwald in 1945."
But if the ultimate institutionalization of the Dark Side is kept at bay in the Harry Potter films, lesser forms of badness trouble Harry's years at school. In particular, Harry, the quintessential scholarship boy, constantly struggles against the malign intentions of his upper-crust classmate Draco Malfoy. Given the historically contentious class politics of modern Britain, the threat that Malfoy exemplifies, class prejudice and snobbery, likely touches a raw nerve among Rowling's British readers in a way that their American counterparts do not fully appreciate. Nonetheless, the virulent and unjustified antipathy that Malfoy feels toward "riffraff" and "Mudbloods" (those of non-magical ancestry) clearly represents the defining ethos of the ancient Malfoy family, Professor Snape, and the residents of the most exclusive (and illustrious) house at Hogwarts: Slytherin. Not surprisingly, the evil forces of Voldemort are surreptitiously linked with the upper-crust students and teachers of Slytherin. Ron Weasley, Harry's close friend, reports that Draco Malfoy's distinguished father (a Slytherin alumnus) was among those great wizards who joined Voldemort during his first grab for power—and that he needed little persuading to ally himself with the forces of evil.
One might say that the defining moment for Harry's future comes when he insistently asks not to be placed in Slytherin, despite his eligibility. In short, the great menace that haunts Hogwarts and indeed Britain itself is the old self-interested political, financial, and cultural elite that remains opposed to the new, more democratic heroism of Harry Potter.
If the Harry Potter series displays hostility toward conservatism and the traditional class hierarchies of England, its success nonetheless relies on recuperating many traditional features of English society for a more pluralistic, multi-ethnic, and egalitarian vision of the United Kingdom. The charm of these films relies not solely on the thrill of magic, but also on the appeal of the archaic and anachronistic. In a fundamental respect, the Potter stories are simply fanciful versions of a well-established and well-respected literary genre: the Bildungsroman or "novel of education" set at an elite public school or university (e.g., Tom Brown's Schooldays, Brideshead Revisited). Like the modern middle-class boy admitted on fellowship to an elite English educational institution, Harry must learn rituals, games, social protocols, and systems of knowledge that belong to the past.
To memorize ancient chants and spells is not so very different from learning the dead languages of Latin and Greek. To become a star "Seeker" at Quidditch is comparable to mastering schoolboy games such as Fives (played at Eton) or taking up a traditional pastime of the English social elite—cricket. The remote and bucolic location, immense wealth, privileged social standing, quaint academic dress, architectural grandeur, and ancient historical stature of Hogwarts all bespeak a rootedness in the English past. To enter Hogwarts is to travel back in time.
Indeed, before he passes through its ancient doors, Harry must buy his school supplies—a wand, an owl, books of magical lore—in Diagon Alley, a secret shopping district at the heart of present-day London that is a visual compendium of medieval, Renaissance, and Victorian England. The persistence of such places and institutions as Diagon Alley and Hogwarts in a contemporary Britain dominated by the spirit and customs of ordinary Muggles (as normal humans are called) is more truly magical than any spell cast at Hogwarts.
Of course, Harry's school does not represent a simple return to the moral values of the English past. Rowling carefully introduces crucial features of modern liberal Britain. The student body at Hogwarts is notably heterogeneous: Its houses hold a conspicuous mix of black, South Asian, Celtic, and Anglo-Saxon students. Girls are fully the equals of boys, as students, faculty, and Quidditch players. Most important, those who attend the prestigious school are drawn from rich and poor, privileged and obscure, urban and rural backgrounds. All accents are spoken at Hogwarts. If class conflict still lurks behind the institution's attractive face, the school also exemplifies the progressive spirit of the new multi-ethnic, feminist, democratic, postimperial, and communitarian U.K. The education of Harry Potter represents a magical attempt to wed the best of traditional English culture with Britain's most progressive social values.
Because Star Wars reflects an American rather than British worldview, it draws on a briefer national history. Luke Skywalker's status as a provincial farmboy making his way in the galaxy evokes a distinctively American nostalgia for the small towns and rural communities of the recent past. By visually evoking the American western—the dusty barren landscape of the planet Tatooine on which Luke's aunt and uncle make their home, the image of the burned-out homestead and the bodies of massacred settlers—Lucas manages to establish his hero as a representative of the American frontier. More generally, the aw-shucks demeanor of Luke and Han, their love of futuristic hot rods, their do-it-yourself Yankee ingenuity, and their go-it-alone determination are meant to summon up the virtues of a simpler, pre-urbanized, pre-corporate, decentralized, and more virtuous American era. Victory over the Empire depends crucially on the virtues and skills of Yankee farmers, frontiersmen, and commercial traders. (Han is the exemplary unregulated, freewheeling, independent trader cum smuggler).
Meanwhile, as in the Harry Potter series, the virtues of the American past are slyly blended with a forward-looking and progressive sense of the new American century. The conflicts between frontiersman and aboriginal, between white and black, between the "native" American citizen and the ethnic immigrant are largely effaced. (A few residual traces persist—the Jawas of Tatooine, for example, are troublesome nomadic desert scavengers, though not a serious threat to civilized existence.) In Lucas' future there exists amity and equality between white man and Wookie (Chewy is more of a full partner than Tonto to Han's Lone Ranger), between whites (Han, Luke, Obi-Wan) and blacks (Lando Calrissian and Mace Windu), between men and women (Princess Leia, Queen Padmé Amidala).
The irregular rebel army is the racially and sexually integrated volunteer force of the post-Vietnam era. Even the deeply fraught relations of master and slave are reinvented as the amicable relationship between a young man and his favorite androids. The most heterogeneous clientele gathered from the farthest reaches of the galaxy drink together in the same wayside cantina, and a rich interplanetary mix of ethnicities, races, and genders are conspicuously represented in the Republican Senate. Lest this multi-ethnic, gender-neutral future seem to lack ancient traditions or theological grounding, the Jedi are there to represent a deep wisdom—part Eastern mysticism, part Rosicrucianism, part traditional Protestantism, and wholly New Age. The American past, it seems, is to be the progenitor of the New American Millennium.
Of the three series, The Lord of the Rings represents the most conservative and critical assessment of modernity. Where both Harry Potter and Star Wars project a youthful optimism about the future, The Lord of the Rings exhibits a more deeply nostalgic and elegiac attitude toward the past.
To be sure, the forces of Saruman and Sauron are ultimately defeated by the unified efforts of "the four, the free peoples"—elves, dwarves, Ents (giant, peripatetic, talking trees), and men, to which we must add the fifth free people, the hobbits of the Shire. Aragorn is triumphantly crowned King Elessar and united in marriage to the elfin Lady Arwen. Yet any reader of Tolkien's novels or viewer of Jackson's films recognizes the tragic dimension of these works.
Bilbo is tormented until his dying day by the memory of the Ring. Frodo never fully recovers from the wound he suffers from the blade of the Ringwraiths, nor the one Gollum inflicts on him at Mount Doom. Boromir dies repenting his betrayal of the Ring bearer. The Shire does not escape the dreadful consequences of the War of the Ring: Before Frodo and Sam return home, Saruman brutally exploits and tyrannizes the Shire. And the longed-for destruction of the Ring of Power marks the end of Elvish civilization in Middle Earth (as Tolkien's world is called). Of the Elvish peoples only Arwen, who chooses a mortal life, will remain. The others, along with Frodo, are borne by ship "into the West."
The tragic and elegiac dimensions of The Lord of the Rings only heighten the appeal of traditional and premodern ways of life. Making spectacular use of the varied natural scenery of his native New Zealand, Jackson has succeeded in conveying a vision of preindustrial existence that is immensely appealing. The unsullied beauty of the country's mountains and lakes provides an inspiring backdrop for the simple village life of the Shire. The realms of the elves are fully integrated into their wondrous natural settings, while their architecture, furnishings, and clothing are based on organic forms; it is impossible at times to distinguish those things crafted by the elves from the natural shapes that flourish in their old forests and mountainous river valleys. Having spent much of his youth in and around the urban blight of Birmingham, one of England's first great industrial cities, Tolkien anticipated in his fiction many contemporary ecological concerns. Treebeard—leader of the Ents—speaks for Tolkien when he says Gandalf is "the only wizard that really cares about trees." With few exceptions, the free peoples of Middle Earth are distinguished by their willingness to integrate themselves into the natural world and not to disrupt or destroy it.
By contrast, Saruman "has a mind of metal and wheels" and "does not care for growing things." Jackson's films emphasize the hellish atmosphere of Isengard, where Saruman's orcs clear-cut the forest and ravage the landscape in order to build what amounts to an enormous munitions factory. Fittingly, the Ents destroy Isengard by unleashing the river dammed up by Saruman's forces.
Of course, the free peoples of Middle Earth are not entirely immune to the lure of technology, industrialism, and the modern project. The dwarves have proved too relentless and rapacious in their pursuit of mithril, the precious silver they mine from under the Mountains of Moria. In the course of their ambitious building and tunneling they have awakened the Balrog, an evil creature who engages Gandalf in mortal combat. The wizard draws a moral from the misfortune of the dwarves: "Even as mithril was the foundation of their wealth, so also it was their destruction: they delved greedily and too deep, and disturbed that from which they fled, Durin's Bane."
Although the Shire does not provide much of an arena for high adventure or personal ambition, and although its village customs can seem stifling, its deep attraction becomes apparent to Frodo after he leaves its confines. Until the onset of the War of the Ring, the Shire largely escaped the great political struggles. Its relative obscurity and isolation—most in Middle Earth have never heard of hobbits—is a measure of its political independence. The hobbits know only a very loose and undemanding form of local self-government. In this respect, Tolkien's much-vaunted medievalism (he was a prominent scholar of Anglo-Saxon and Middle English literatures at Oxford) seems consonant with certain 19th-century theories of the origins of English liberty.
For Tolkien, as for other Victorian defenders of the medieval English past, the premodern period of loose and competing feudal political relations and independent townships gave birth to modern English political and personal freedoms. An epoch of weak monarchs, locally based political authorities, and intersecting and sometimes conflicting ties to church, local landholder, township, guild, and king was more conducive to personal liberty than was a later, apparently more uniform and "progressive" era of strong centralized government, universal suffrage, and mass politics. In any case, when Frodo and his companions return to the Shire after the destruction of Sauron, they find that the traditional "medieval" world of village life has changed for the worse.
In the adventurers' absence, the Shire has fallen under the dominion of Saruman. Prosperity has been replaced by shortages and rationing. Personal property is seized by the new government for the sake of "fair distribution," though most of these goods end up in the hands of corrupt officials. Old family homes have been destroyed to make way for shabbier public housing. The old mill has been replaced by several larger ones that are "always a-hammering and a-letting out a smoke and a stench." Beautiful old trees have been cut down and the river polluted. Political corruption is rampant, as is physical intimidation by newly established public officials. Most significantly, the personal liberties of the hobbits have been sharply curtailed under an intrusive and alien form of government. As old Farmer Cotton puts it, "everything except Rules got shorter and shorter."
One of Frodo's exasperated companions proclaims, "This is worse than Mordor!" In fact, as the critic Shippey has astutely noted, Tolkien's postwar Shire, much like Orwell's Oceania in Nineteen Eighty-Four, most closely resembles postwar Britain under the Labour government of the mid- and late 1940s.
In Tolkien's fantasy, Frodo and his friends manage to restore the old village life of the Shire, though the expulsion of Saruman and his cronies requires a pitched battle in which several hobbits lose their lives. Afterward, the fate of the Shire mirrors that of the rest of Middle Earth. Frodo's only memorable political act as deputy mayor is to "reduce the Shirriffs to their proper functions and numbers."
Restored to the throne of Gondor, Aragorn frees the slaves of Mordor and restores to the wild men of the Forests of Druadan (his former allies against Mordor) their lands and political independence. Given the late imperial context in which Tolkien completed his trilogy, Aragorn's magnanimity hints at the necessity of extending liberty not only to those abject souls formerly enslaved by totalitarian regimes but also to "primitive" and politically subject peoples (i.e., colonized nations) more generally. Most significantly, Aragorn issues an edict that makes the Shire "a free Land under the protection of the Northern Sceptre." The hobbits of the Shire are guaranteed the right to elect their own mayors, run their own farms and small businesses, carry on their trade, and govern their own communities as they deem fit.
But this restoration of the traditional way of life is haunted by a pervasive sense of historical finitude. The rebirth of the Shire takes place against a grand historical narrative that witnesses the decline and fall of the great civilization of the elves in Middle Earth and the gradual and inevitable loss of their magical influence on the lives of men, dwarves, and hobbits. Like postwar Britain, the Shire survives, but only in diminished form, the quaint and picturesque relic of a once-glorious Elvish past that shall come no more. The Shire serves chiefly as a holiday stop for old wizards and displaced elves who indulge in one final nostalgic visit before heading "to the West" and to oblivion.
Tolkien's epic narrative turns on a single, stark moral choice: Those who would defeat Mordor must themselves refuse the Ring of Power. The great temptation for the books' and films' heroes is to believe that they can command that Ring for good. Tolkien's fantasy is deeply conservative insofar as this moral choice implies a rejection of the modern project: To preserve the good life, men must relinquish their efforts to acquire power, particularly technological and economic power over nature, for their own ends. But if Tolkien's Luddite fantasy is consonant with anti-modern environmentalism, it also embodies a desire for political liberties and personal freedoms increasingly imperiled by the expanded authority of modern nation-states, which is itself supported by many environmentalists through instruments such as the Kyoto protocols.
Mr. Butterbur, an innkeeper in Tolkien's trilogy, speaks for many lovers of fantasy, hobbit and human alike, when he expresses a deeply felt if often frustrated desire: "We want to be let alone." And yet the desire to be left alone, at least for the great majority of readers and viewers of contemporary science fiction and fantasy, is often inextricably combined with a contrary desire to arrange the future for the better, to direct the lives and welfare of others, if only generations yet to be born. The deepest utopian appeal of The Lord of the Rings, Star Wars, and Harry Potter is not to an adolescent yearning for a world inhabited by wizards, hobbits, and Jedi knights, but to a modern consciousness torn by mutually contradictory desires. In divine fashion we would redesign the entire cosmos according to our individual whims and throw off the chains of all external authority. We wish at once to be free and to be a god to others. We would return to an idyllic past and progress forward to an unbounded future. The truly magical power of these films and stories is that they allow us, if only for the brief moment in which we are enthralled by their spell, to believe that as modern individuals we can be both at home in the world and at one with ourselves.
Given that the FIFA World Cup generates only slightly more interest among U.S. sports fans than the Summer Grand Sumo Tournament in Tokyo or the Rugby Union Heineken Cup Final, it's not surprising that most Americans missed what was arguably the most important Irish sports story since the founding of the Free State in 1922.
In May, when the Irish national football team was training on the Pacific island of Saipan, its captain and only bona fide international star, Roy Keane, was summarily dismissed by the team manager, Mick McCarthy. The Irish press described Keane's sacking on the eve of the opening round of the World Cup as nothing less than "a national catastrophe."
Lest one dismiss such rhetoric as mere hyperbole, consider that Bertie Ahern, the newly re-elected Irish prime minister, offered (in vain) to intervene personally in the dispute in an attempt to "salvage his country's World Cup hopes." The wry comments of one Dublin sports fan put things in their proper perspective: "This is far more serious than Partition. That only brought us 80 years of bloodshed, but this could mean that we go out of the World Cup in the first group."
What shocked the Irish public was not so much the bitter nature of the disagreement between the feuding former teammates, but the vehemence and offensiveness of Keane's remarks to McCarthy at a team meeting that led to the sacking. According to the Dublin Evening Herald, Keane told McCarthy in front of the entire squad, "You were a crap player and you are a crap manager. The only reason I have any dealings with you is that, somehow, you are the manager of my country and you're not even Irish. You can stick it up your bollocks" (emphasis added).
What might have remained a "mere" sports story became a raging cultural debate because Keane impugned Mick McCarthy's Irishness! As it happens, McCarthy originally hails from Yorkshire and, like several other team members born to Irish parents, grew up in England. By extension, if McCarthy didn't make the cut as an authentic Irishman, then neither did several of Keane's teammates who also play for the Irish national squad. Keane seemed oblivious to the irony of his own peculiar position. The 30-year-old midfielder, who calls Cork home, currently captains one of Europe's most storied professional soccer teams, Manchester United, and thus lives for much of the year in England, where he plays for an English club.
The Keane-McCarthy spat replays one of the oldest running conflicts in Western Europe, that between the "native" Irish and the "foreign" English. But if the public disagreement necessarily evokes that 700-year-old conflict, it also points up one of the most recent and increasingly pressing issues in the contemporary Republic of Ireland: multiculturalism.
Having emerged in the last decade as "the Celtic Tiger," one of the fastest growing economies in Europe (Dubliners gleefully point out that the growth rate of the Irish Gross Domestic Product has recently outstripped England's), Ireland has for the first time in many centuries seen a rapid influx of new inhabitants. Indeed, the recent liberalization of the Irish economy, coupled with the country's rapid integration into the European Union, has signaled a sharp reversal in emigration patterns that have persisted for a century and a half.
From the mid-1840s, when Ireland suffered the last great famine in Western European history (scholars estimate that 1 million Irish perished and another 1.5 million emigrated), great numbers of Irish have steadily departed their native land for greener pastures in the United States, Australia, Canada, and England. Ireland today is thus unique among European nations insofar as its population is still below that of 1841 (even when one includes the present population of Northern Ireland, officially still part of the United Kingdom). The new Irish prosperity thus promises or threatens—depending on whether one takes a nativist or cosmopolitan perspective—to transform beyond recognition what has traditionally been one of modern Europe's most ethnically homogenous countries.
On a recent visit to Ireland, I was caught off-guard by a Dublin taxi driver who inveighed against the current wave of new immigrants. Vainly scrutinizing the streets for a face, complexion, or manner of dress that might stand out among the crowd thronging O'Connell Street, I finally asked whom he had in mind. "The Dutch!" he expostulated. It seems that one byproduct of EU membership, economic prosperity, and generous welfare benefits has been that ne'er-do-well Dutch and other European youth have found Dublin a hospitable place to idle away their time. (I continue to puzzle over how they can afford one of Europe's most expensive cities.) If from an American perspective the recent arrival of distant relatives of the Vanderbilts seems like a pretty innocuous instance of the new multiracial Ireland, the fact remains that multiculturalism is receiving a good deal of attention from Irish cultural critics and intellectuals.
A case in point is Multi-culturalism: The View from the Two Irelands, a brief book featuring essays by two of the island's most prominent literary critics: Edna Longley (who resides in Northern Ireland) and Declan Kiberd (who lives in the Republic). Its appearance marks an important moment in Irish cultural history. One of the first publications of the newly established Centre for Cross Border Studies (funded by the EU), the book is intended to represent the new cooperative cultural and political relationship between the Republic of Ireland and Northern Ireland established in the wake of the historic 1998 Good Friday Agreement.
It is a token of the immense importance that contemporary Ireland places on its arts and letters that this slim (78-page) volume features a laudatory foreword from Mary McAleese, the current president of the Irish Republic. She writes, "We are gradually moving away from the homogeneity and old certainties which have traditionally been the hallmarks of Irish life. We are rapidly becoming one of the wealthier states in the world, as well as a multi-cultural society."
Edna Longley's essay takes up in considerable detail the single most vexing cultural conflict in Ireland's recent history, that between Protestants and Catholics in Northern Ireland. While the academic and intellectual elite of the United States has been obsessively focused on the divisions marked by race, ethnicity, and gender, Longley's essay, "Multi-culturalism in Northern Ireland," provides a timely reminder that the truly fundamental, politically critical, and historically significant divisions among peoples in modern Western societies have been religious ones.
Yet one finds little or no interest among contemporary (and usually secular) North American and Western European academics in the religious or creedal differences among their students and fellow citizens. This remains true notwithstanding a post-9/11 interest in the rights of a beleaguered Islamic minority—a group all too frequently cast by left-liberal intellectuals as an ethnic or cultural rather than specifically religious one. But any historically informed and critically accurate consideration of the development of the political principles of toleration and civil rights must necessarily begin with the religious divisions that plunged late 16th- and 17th-century Europe into decades of civil violence, warfare, persecution, and bloody repression.
If, as Irish commentators of the last 30 years have had it, "it's still the 17th century" in Northern Ireland, then Longley's discussion of the challenge of a newly peaceful and culturally vibrant multicultural Northern Ireland might have a salutary effect on her American counterparts. Specifically, it might inspire them to revisit the politically arduous and philosophically complex struggle whereby modern liberalism overcame the sanguinary terror of religious conflict in post-Reformation Europe. In crucial respects, contemporary divisions among races or ethnicities, and between genders, merely replay in a minor key the historically antecedent and more violent ones fought among religions and creeds.
Declan Kiberd's fascinating and trenchant essay, "Strangers in Their Own Country," gets directly to the heart of contemporary Irish concerns about the prospects for the nation's multicultural future. Kiberd takes due note of the shocking instances of racial violence that have beset Ireland in recent years: unprovoked attacks on newly arrived immigrants, asylum seekers, and political refugees from Romania and Nigeria. He likewise notes the unprecedented challenge posed by the first generation of Muslim immigrants for a society that has been, since the establishment of the Irish Free State in 1922, overwhelmingly Roman Catholic.
Arguably the most prominent and prolific literary scholar living in Ireland today and one of his nation's most renowned cultural critics, Kiberd stands out among his European intellectual counterparts for his relative enthusiasm for economic liberalization. He is an eloquent defender of open borders and an astute commentator on the economic advantages that the free movement of labor across national borders offers to Ireland in particular, and to Europe and the Third World more generally. He writes that "the arguments for embracing immigrants are not just moral or cultural, but economic as well. At present the Irish labour force is seriously short of skilled and unskilled workers…and those who come to Ireland…are here to work, not to live on the state; and they will invariably pay far more in taxes than they will receive in state hand-outs."
In a rhetorical move meant to remind the Irish of their own beleaguered history, Kiberd continues, "by tradition, it is the energetic and enterprising people from poorer countries who usually get up and travel to another land: and the money which they earn in the host country helps, through the subventions which they send home, to reduce poverty in their native countries as well….The Irish, many of whom lived on remittance letters from Britain and the United States in the nineteenth century, should understand this better than most. So also should they recognise the immense levels of energy and creativity in those who migrate." Adam Smith could hardly have made the point any more succinctly.
Kiberd, who for some years was accused (erroneously, in my view) by his cultural critics of harboring "green" sympathies (that is, being too sympathetic to the irredentist ambitions of Irish Republicans who endeavored to unite the whole island under an Irish national flag), has become the nation's leading advocate of cultural hybridization and cosmopolitan globalization. In his dazzling and immensely erudite works, Kiberd has investigated the historical and cultural processes by which a seemingly monolithic and homogenous "traditional" Irish culture was, in fact, constructed by revolutionary Irish nationalists, politically motivated ideologues intent on uniting the citizenry against their imperial English overlords.
Following in the footsteps of the historians Hugh Trevor-Roper and Eric Hobsbawm, who have traced the creation of British "tradition," Kiberd notes that what often passes in Irish pubs from Boston to Belfast as authentic Irish culture are, in fact, examples of "invented traditions" and "instant archeology," creations of late 19th-century and early 20th-century Irish politicians, artists, journalists, and scholars.
In his now-classic work of literary criticism, Inventing Ireland, Kiberd tells the story of Gaelic-speaking Blasket islanders, gathered around a cottage hearth during Easter Week, 1916, who first hear the news of the violent uprising in Dublin. Informed of a rebellion mounted by Irish nationalists hoping to restore Ireland's glorious pre-colonial Gaelic identity, one of the more skeptical islanders, the writer Tomás Ó Criomhthainn, points out to his Gaelic-speaking friends that there is no word in Irish for the "republic" that has just been proclaimed on their behalf.
In contradistinction to less reflective celebrants of all things Irish, Kiberd willingly embraces the invented character of contemporary Irish culture. Indeed he is uncharacteristically acerbic in characterizing his well-heeled media antagonists on the far left who declaim against both the stultifying effects of traditional nationalism and the corrosive effects of globalization; they are, in his memorable phrase, "designer Stalinists." In fact, the invented or self-consciously chosen character of the Irish national culture is for Kiberd an invitation to the ongoing reformation and reinvention of Ireland today. As Kiberd sees it, "the nation is less a legacy of the past than the site of the future, a zone of pluralisms which will prove its durability precisely by the success with which it embraces refugees, exiles, and newcomers."
Kiberd sees in the rising generation that has come of age since the birth of the Celtic Tiger an admirable flexibility, creativity, and cultural promise: "What these young people grasp most clearly of all is that Ireland itself was always multi-cultural, in the sense of being eclectic, open, assimilative. The best definition of a nation was that given by Joyce's Leopold Bloom: the same people living in the same place. As an outcast Jew, condemned to wandering, Bloom may in fact have had more in common with the members of the historic Irish nation than most of the characters in Ulysses: and he would certainly endorse the view that mono-culture works as badly in the body politic as in agriculture, rapidly wearing out the earth's potential." Thus does Kiberd address a diasporic people still haunted by the potato famine that wiped out the only staple crop of the 19th century Irish peasantry and forever altered the course of Irish national history.
To be sure, Kiberd is critical of the excesses of materialist individualism and "economic self-interest." He expresses concern that a society that ceases to respect the "res publica" and loses all faith in a "national philosophy" may well drift into an asocial and culturally vacuous anomie. In particular, Kiberd is critical of what he sees as too sharp a divide between the private and public spheres in multicultural America. For him, the very notion of culture is ultimately a transindividual one, and hence incapable of complete privatization.
If Kiberd is sometimes overly critical of the American melting pot, it is in part because he feels that contemporary American (and especially academic) multiculturalism has abandoned its "secular republican ideal." The result, according to Kiberd, is an unfortunate obsession with the "identity politics" of one's own ethnic group that inhibits rather than encourages genuine cultural exchange. For Kiberd the "evolving multicultural syllabus" in the United States "may serve as a warning of how not to do multi-culturalism in modern Ireland. This is one which presents students with entirely separate versions of Hispanic, African or Indian cultures, each of them honourably rendered in some of its richness, but none of them shown in interaction." In such passages Kiberd hints that what passes as "multi-culturalism" on U.S. campuses is in fact more nearly akin to the cultural tribalism of which Northern Ireland has had such bitter and regrettable experience.
Kiberd also writes suggestively on the ways in which well-intentioned efforts of a government to impose a top-down model of the multicultural society may have unintended and undesirable consequences. Without taking issue with those commentators who've analyzed the racist attacks on Nigerians in Dublin during 1999, Kiberd notes that some have suggested that the deepest source of contemporary interethnic violence is not necessarily a profound or atavistic racism lurking in the Irish soul, but rather "a bureaucratic central government" that imperiously and "suddenly plants refugees" in the midst of local communities, thereby "massively disturb[ing]" long-established streets and villages.
While Kiberd does not call for government to abandon all efforts at social engineering, his comments nonetheless highlight a suggestive instance of the manner in which government bureaucracies create, or at the very least catalyze, those very problems of a multicultural society that they subsequently claim justify their existence and necessitate their intervention. No thought is given to how the unregulated and free-flowing cultural negotiations of individuals in civic society might better manage the challenges and conflicts of contemporary multiculturalism.
Kiberd's most important contribution to the global debate over multiculturalism is his insistence on a conceptual distinction recently made much of by Tom Garvin, one of Ireland's leading political scientists. That distinction is between the nativist or romantic notion of ethnic nationalism that understands the nation to rest on inherited linguistic, racial, or "organic" cultural traits, and an enlightenment or rationalist notion of civic nationalism that conceives the nation to be composed not of a naturally given group or groups, but of free rational individuals who choose to become citizens of a secular, liberal, and democratic republic. If the former sort of ethnic nationalism triumphed in late 19th-century and early 20th-century Ireland and continues to prevail among the more ardent adherents and sympathizers of the IRA and Sinn Fein, the latter notion is a lost but recuperable heritage of the United Irishmen.
The United Irishmen were an ecumenical organization of Protestants and Catholics, inspired by the examples of the American and French revolutions, who unsuccessfully attempted to overthrow British imperial rule in 1798; they offer one potential source of "traditional" Irish culture that waits to be realized. Suitably reimagined and reinvented, their legacy of civic nationalism, deeply indebted to the ideals of Enlightenment rationalism, and to a genuinely tolerant notion of citizenship indifferent to the categories of religion, race, ethnicity, gender, and class, might provide a wider, safer, and more promising path toward the multicultural future of Ireland—and of America, too.
In those heady days in 1967 when Gabriel García Márquez and his fellow writers of the Latin American literary "boom" regularly descended on Havana to attend Fidel's shrimp barbecues—in an age when Che still dashed about the globe on behalf of the Marxist millennium to come—two of "Gabo's" most illustrious companions, the Peruvian novelist Mario Vargas Llosa and his Mexican counterpart Carlos Fuentes, met in a London pub to hatch a grand literary enterprise. Together they projected an ambitious artistic project tentatively titled Los padres de las patrias (The Fathers of the Nations).
To this collective undertaking a number of the foremost contemporary Latin American writers were each to contribute a novel about a dictator from their respective countries: Fuentes was to write about Santa Ana, the Cuban novelist Alejo Carpentier about Gerardo Machado, the Paraguayan Augusto Roa Bastos about José Rodríguez de Francia, and the Argentine Julio Cortázar about Evita Perón. Like many centrally planned enterprises of the era, the project never quite materialized as envisioned. But during the long decades that followed that meeting, years that saw Castro transformed from the hemisphere's great emancipator into one of its last old-style caudillos and tyrants, and Vargas Llosa from a literary wunderkind and left-wing firebrand into the éminence grise of Latin American neoliberalism, a good many memorable examples of the Latin American dictator novel were published to considerable acclaim.
The most recent novel by the 66-year-old Vargas Llosa is The Feast of the Goat (La Fiesta del Chivo), which was published in Spanish in 2000 and in an admirable English translation by Edith Grossman in 2001. It is a work that recounts with gruesome detail and dramatic intensity the last days of the dictatorship of the Dominican tyrant Rafael Leonidas Trujillo Molina. The Feast of the Goat now takes its place with the finest novels to come out of Latin America in the last 50 years, joining such famous Latin American dictator novels as Miguel Ángel Asturias' The President (El señor Presidente, 1946), Roa Bastos' I the Supreme (Yo el Supremo, 1974), Carpentier's Reasons of State (El recurso del método, 1975), García Márquez's The Autumn of the Patriarch (El otoño del patriarca, 1975), and Tomás Eloy Martínez's The Perón Novel (La novela de Perón, 1985).
The Latin American dictator novel would seem to possess staying power as preternatural as that of "el Macho," the fictional dictator of The Autumn of the Patriarch who lives to be over 200 years old and, in a career coextensive with the history of modern Latin America, tyrannizes his island nation for what seems to his abject countrymen an eternity.
Of course, the enduring power of the Latin American dictator novel has everything to do with the enduring power of Latin American dictators. It is a non-fictional work of historical and political analysis, Facundo: Civilization and Barbarism (Facundo: civilización y barbarie, 1845), by the Argentine politician and statesman Domingo Faustino Sarmiento, that is regarded as the first great literary chronicle about a Latin American dictator and as a seminal influence on the great Latin American dictator novels to follow. In the course of a polemical critique of his political adversaries—Juan Manuel de Rosas, who ruthlessly wielded power in Argentina between 1829 and 1851, and the sanguinary provincial caudillo, Facundo Quiroga—Sarmiento presented himself as an advocate of enlightened and liberal principles as against the barbarism and savagery of the nation's strongmen.
Writing in an age before the advent of American gunboat diplomacy and Yankee imperialism in Latin America (the Monroe Doctrine notwithstanding), Sarmiento looked to the United States as a model of what a liberal, progressive, and modernizing Argentina might become. While Sarmiento's work has been long acknowledged as an ur-text by the many practitioners of the Latin American dictator novel, they have generally, and quite understandably, failed to share his enthusiasm for the burgeoning political influence of their powerful neighbor to the north.
Indeed, one of the constant themes running through the Latin American dictator novel, a theme that received special prominence during the literary "boom" of the 1960s and '70s, was the interdependence of the Latin American tyrant and Yankee imperialism. While more sympathetic than other Latin American novels to the (all too often unrealized) liberationist potential of the United States as a home of liberal democratic principles, institutions, and practices, The Feast of the Goat proves no exception.
One of the first graduates of the Dominican National Police (a constabulary force created by the U.S. military during its occupation of the Dominican Republic from 1916 to 1924), Trujillo remained in power for nearly 32 years with the tacit backing of the CIA and an impressive array of American "representatives, senators, governors, mayors, lawyers and reporters" who received generous bribes and political favors from the Dominican dictator. A longtime darling of U.S. governments because of his hard-line anti-communist stance, Vargas Llosa's Trujillo in his final days in May 1961 is outraged to find himself abruptly out of favor with a U.S. administration that, much like Claude Rains in Casablanca, is unaccountably shocked to discover that el Chivo ("the goat," Trujillo's nickname) has compiled a profoundly dismaying human rights record.
Vargas Llosa's novel revolves around the final day in the life of the beleaguered tyrant. A group of Dominican conspirators, among whom are several lifelong supporters of the regime, receive minimal material backing from the CIA and bring to a violent close the life and political career of this most unsavory of dictators. And yet, as with so many attempts before, Trujillo's assassination is nearly bungled; all but two of the conspirators meet a violent end (and in most cases far more gruesome than their victim's) at the hands of the vengeful Trujillo family and their remorseless political allies. But for chance, Trujillo might well have extended his reign until dotage brought it to an inglorious end.
That is because in his last days, like so many Latin American tyrants before him, Trujillo plans to turn the sudden opposition of his North American patron to his advantage. Nothing promises to reinvigorate his flagging popularity more than to face up to the Yankee aggressor in the name of la patria. Shortly before his death, Trujillo is already on the offensive, denouncing the U.S. in public speeches that appeal to Dominican nationalism and secretly plotting to realign the Dominican Republic with his former nemesis, the Soviet Union, and its communist allies (which include Castro's Cuba).
It's a credit to Vargas Llosa that his conversion to neoliberalism has not meant that he has become merely an apologist for U.S. political hegemony, any more than it has made him a reactionary apologist for authoritarian regimes (regardless of their anti-communist credentials). Vargas Llosa's complex view of the tangled interrelationship of American and Latin American politics is embodied in the fictional fate of one of his protagonists, Urania Cabral, a Dominican who returns in 1996 to her country after decades in exile in order to pay a last and none too reconciliatory visit to her ailing father, a man who once served in Trujillo's inner circle of advisers and confidantes. To placate the tyrant, who had cast him into political limbo in 1961, her father offered up his 14-year-old daughter as a sexual sacrifice to Trujillo. Permanently traumatized by her brutal violation, Urania flees the Dominican Republic for a sterile and wearisome expatriate existence before she belatedly returns one final time to her fatherland.
The fact that as a single Latin American female she manages without the support of family and intimate friends to find educational, financial, and professional—though not emotional—fulfillment in her adopted United States marks the political distance Vargas Llosa has traveled. If the Yankee imperialists in Washington have long been part of Latin America's political problems, the civic society that they ostensibly represent has in any case offered a uniquely inviting realm of political and economic freedom in which the Latin American expatriate might find refuge, dignity, and prosperity, if not consummate happiness.
Though the authors who most conspicuously contributed to the literary prestige of the Latin American dictator novel can generally boast impeccable left-wing credentials (García Márquez, Roa Bastos, and Carpentier were all supporters of the Cuban Revolution, Carpentier having once served as an official representative of the Castro government), the genre has not fared well with most leftist academic and literary critics, most especially those faithful to a tradition of Marxist or socialist literary analysis. Many have argued that the genre is unaccountably retrograde, for the "protagonist" of these novels—the larger-than-life dictator who exemplifies the dangers of personalismo, of personal authoritarian rule—was already a historical anachronism by the time the genre achieved renewed prominence in the mid-'70s.
By then Latin American dictatorships had increasingly become technocratic, impersonal, and bureaucratic police states on the model of Chile, Argentina, and Uruguay (if all the more vicious and oppressive for their scientific refinement of the methods of tyrannical rule). The practitioners of the dictator novel, so the critics alleged, had unaccountably returned to a bygone era of colorful and outrageous tyrants that typified Latin America in the 19th and early 20th centuries, rather than face the brave new Latin American world of the 1970s.
Moreover, in their need to captivate their readers, these novelists had inadvertently presented their fictional dictators in an all too attractive light. They were too powerful, too compelling, too charismatic, too funny to provide, even in retrospect, a cautionary lesson to the Latin American masses on the dangers of personal despotism.
Indeed, the more talented the novelist, the greater the risk that his tyrant would be so dramatically compelling as to win not only the interest, but even the secret admiration of the reader. Precisely because the audience was abject and oppressed, it could envy the dictator his fantastic power, his fabulous wealth, his prodigious sexual success with women, his legendary machismo. Insofar as novelists provided not an objective (and hence propagandistic) view of the dictator as a remote political figure, but instead rendered in intimate detail the subjective interiority of the man, with his personal failings, secrets, fears, anxieties, manias, and histories, he became much too human and consequently sympathetic, worthy even of pity and understanding.
Should the dictator serve as the subject of satire, the unintended effect of such comic diminishment was once again to win for him the reader's sympathy. As grand buffoon the tyrant became an inflated comic version of the Latin American loco, a gargantuan Latin male id, an amusing but still lovable cross of Cortez and Porfirio Rubirosa set loose in the brothels, taverns, dance halls, and national palaces of a crazy hemisphere where history constantly repeats itself as farce. In the hands of a García Márquez or Carpentier, el macho or el Presidente could even elicit nostalgia for the bad old days of the caudillos and caciques.
The Feast of the Goat does nothing to remedy the nervousness and discomfort of those soi-disant Marxist critics who doubt the political efficacy and "critical" potential of the Latin American dictator novel. For what Vargas Llosa has made explicit in his newest fiction is an insight that was sometimes entertained, but ultimately evaded or "dialectically" overcome in the narratives of his left-wing literary contemporaries: The dictator is the creation of the masses. They are mutually dependent and cannot exist without each other. As Urania tells herself, "You've come to understand how so many millions of people, crushed by propaganda and lack of information, brutalized by indoctrination and isolation, deprived of free will and even curiosity by fear and the habit of servility and obsequiousness, could worship Trujillo. Not merely fear him but love him, as children eventually love authoritarian parents, convincing themselves that the whippings and beatings are for their own good."
The Marxian notion of false consciousness returns in Vargas Llosa's work with a vengeance. For Urania's insight (which is to say, Vargas Llosa's) is the same that another great novelist who parted company with the Stalinist left, George Orwell, offered in 1984: The people, the plebes, the folk, the masses are not to be relied upon as the primary and initial source of anti-authoritarian resistance, and indeed may even prove a chief impediment to reform and liberation. Such popular nationalist tyrants as Trujillo did not have to rely on sophisticated if pseudoscientific ideologies to establish or maintain their popularity—they needed only to appeal to the tyrannical desires that lurk in the unenlightened souls of all human beings.
But if Urania willingly grants that the masses cannot, without assistance from above, save themselves, the truly disturbing question she poses is why the cultural and political elite of the nation, allegedly the beneficiaries of enlightenment, have proved so utterly complicit in Trujillo's tyranny. "What you've never understood," she tells herself, "is how the best-educated Dominicans, the intellectuals of the country, the lawyers, doctors, engineers, often graduates of very good universities in the United States or Europe, sensitive, cultivated men of experience, wide reading, ideas, presumably possessing a highly developed sense of the ridiculous, men of feeling and scruples, could allow themselves to be…savagely abused."
The reasons that the elite of Dominican society have for cooperating with Trujillo are as numerous and varied as the astounding array of characters who populate The Feast of the Goat. Fear, greed, savagery, racism, lust, cowardice, expediency, complacency, indifference, familial attachment, patriotism, ambition, courage, prudence, piety, loyalty, intelligence, idealism—all can be said to motivate one or another character who falls under the spell of the tyrant. Which is to say that Vargas Llosa writes no political tract or treatise but a novel in which the motivation of any given character is as unpredictable, idiosyncratic, as personal as can be imagined.
To be sure, Trujillo creates a system of despotism, a wide net of political, military, financial, and cultural influence that severely constrains resistance to his will. As one of the dictator's would-be assassins (himself a lifelong Trujillista) puts it, "what a perverse system Trujillo created, one in which all Dominicans sooner or later took part as accomplices, a system which only exiles (not always) and the dead could escape. In this country, in one way or another, everyone had been, was, or would be part of the regime."
But what most fundamentally distinguishes Trujillo is not in the end his thorough mastery of Machiavellian politics but rather his uncanny personal magnetism, his unique charisma and seductive charm that more closely resemble sorcery than science. Vargas Llosa's representation of the Dominican tyrant might seem simply fantastic and outlandishly anachronistic, like a Botero portrait of the Devil touched up with epaulettes and dark glasses, were it not that he strikingly resembles a fellow despot, who still plays upon the present political stage.
Isolated from the international community, at odds with an increasingly militant Roman Catholic Church, struggling against a hostile U.S. government that sponsors a punishing economic embargo, a septuagenarian with declining physical powers who has governed a once prosperous but increasingly impoverished Latin American island nation for decades, a clever populist nationalist willing to countenance any and all violations of human rights when it suits his purposes, Vargas Llosa's Trujillo seems to have rather more in common with Castro than their ideological differences might lead one to expect.
Vargas Llosa's novel offers two historical "conclusions." One storyline concludes with the restoration of civilian government under President Joaquín Balaguer (one of Trujillo's closest and most trusted aides and a Machiavellian politician of considerable abilities) in the months following the dictator's assassination in 1961. A second plot line ends with Urania's completion of her traumatic return visit to the Dominican Republic in 1996. The era between those two dates witnessed more than three decades of the Cold War, the rise and ongoing fall of the fortunes of Marxism in Latin America, what would appear to be the better part of Castro's despotic rule of Cuba, and the heyday of the Latin American literary boom and its long twilight—a period that saw the flourishing of the Latin American dictator novel.
Vargas Llosa thus returns to the origins of a recent phase of political, cultural, and literary history in order to mark its close. But in so doing, he darkly hints that the end of the Cold War, the fall of international communism, and the rise of neoliberalism may signal not some final end to the troubles of Latin America but a return to those difficulties that characterized that region (and much of the rest of the developing world) in the days before the superpower conflict that followed World War II. Indeed, the timeliness of Vargas Llosa's only seemingly belated novel lies in its representation of a dictator—dead for more than 40 years—whose career strikingly anticipates those despotic figures who continue to populate the world stage even after the end of the Cold War.
Vargas Llosa has publicly dismissed the easy consolations of Francis Fukuyama's "end of history," and his most recent fiction suggests that in the figure of Trujillo we spy a dangerous human type that transcends every ideology and historical period. In el Chivo the reader may detect not just the visage of a Castro or a Duvalier, but also that of a Saddam Hussein or a Milosevic. What sets The Feast of the Goat apart from the other great Latin American dictator novels of García Márquez and Carpentier (works that conclude optimistically and, in good socialist fashion, in anticipation that the age of the tyrant is at last coming to an end) is its chilling sense that the past will continue to haunt both the present and the future.
Significantly, Balaguer's astounding feat of engineering a relatively bloodless transition to democracy is shadowed by an event of which Vargas Llosa makes no mention but of which his (Latin American) readers are expected to remain cognizant. Just a few short years after the Dominican transition to democratic rule, the U.S. once again invaded and occupied the Dominican Republic. Nor does Urania's fate provide much greater solace. She is forever marked and deformed by Trujillo's tyranny. As the 49-year-old woman so poignantly puts it, shortly before she departs her fatherland for the final time, "my only man was Trujillo…I'm empty and still full of fear…Papa and His Excellency turned me into a desert."
If Vargas Llosa does not share the apparent naiveté of his former socialist comrades-in-arms about the future of Latin America (and this despite the recent liberalization of many Latin American regimes in the 1980s and '90s), it is perhaps due to his conviction that the region has more than once before experienced the giddiness that comes with a seeming liberation from the weight of historically outmoded despotisms. Having experienced firsthand, as a young writer and intellectual, the elation that followed Castro's sudden rise to influence in the early '60s, Vargas Llosa now refuses to forget that a similar euphoria swept Latin America in the early and mid-19th century following the wars of independence that freed much of the hemisphere from Spanish colonial rule.
It was, after all, out of the spectacular debacle of those liberal revolutions of the 19th century that the grim forerunners of Trujillo first emerged to assume their dark powers in the palaces, prisons, and torture chambers of modern Latin America. Vargas Llosa's grim historical novel should serve as a cautionary tale as much for his more recent neoliberal allies as for his old neo-Marxist critics.
The premiere of a new Schwarzenegger film about terrorism was put on hold, the broadcast of the mini-series Band of Brothers was interrupted for a week, and a rerun of an X-Files episode featuring the terrorist bombing of a bank patronized by FBI agents was cancelled. Overnight, the real terror of history had displaced (at least temporarily) the manufactured variety, and the prospect of a new kind of military conflict of incalculable dimensions and uncertain outcome crowded out the simulacra of war, violence, and bloodshed that are a staple of the entertainment industry.
One of the striking features of the cultural environment in which the events of September 11 exploded has been a burgeoning fascination with war, specifically with World War II. Since the release in 1998 of Steven Spielberg's Saving Private Ryan, America's theaters have been crowded with major motion pictures devoted to that war. The Thin Red Line (1998), Enemy at the Gates (2001), Pearl Harbor (2001), and Captain Corelli's Mandolin (2001) are soon to be followed by Windtalkers, the story of the WW II Navajo codetalkers, slated to open in early 2002. For those who cannot wait, HBO's Band of Brothers, based on Stephen Ambrose's book, is back.
Programs on the second world war are featured so regularly on the History Channel that wags commonly call it "The Hitler Channel." Since September, the History Book Club, which has always carried a long list of World War II titles, has barraged its subscribers with new books about Pearl Harbor, Iwo Jima, Omaha Beach, and the German army.
Meanwhile, Hampton Sides' Ghost Soldiers has climbed the New York Times bestseller list, and Tom Brokaw's An Album of Memories: Personal Histories from the Greatest Generation—his sequel to The Greatest Generation—can be found in every airport. Stephen Ambrose, a talented writer of considerable range, might well have made an entire career out of the books he's penned on the war: new hardback copies of Band of Brothers (originally published in 1992) are currently billeting in the major bookstores alongside his The Wild Blue: The Men and Boys Who Flew the B-24s Over Germany and The Victors—Eisenhower and His Boys: The Men of World War II. It is not overstating matters to claim that for the last three years we have been awash in World War II reminiscences and cultural memorabilia to an extent not experienced since before the Vietnam War.
Why now? How do we explain the current resurgence of interest in the second world war at a time when the last surviving veterans of the conflict are passing away? The producers and executives of Hollywood, to say nothing of its screenwriters, are more likely to belong to the baby boom generation (or even Generation X) than to "the greatest generation." In fact, many members of the target audience for Pearl Harbor or Enemy at the Gates are young enough to be the grandchildren or great-grandchildren of the vets who took up arms nearly 60 years ago. Why should they be so eager to weep for the dead of Pearl Harbor or Stalingrad?
Unlikely as it may seem, the 19th-century novelist Sir Walter Scott might provide us with a clue. Scott was one of the most famous writers of his age, and his reputation rested in large measure on his popularizing a new sort of literary work: the historical novel. Although American readers are more likely to know Scott's medieval tale Ivanhoe, it was the 1814 publication of his first historical novel, Waverley, that established an international craze for historical fiction.
Significantly, Scott subtitled that work 'Tis Sixty Years Since, a reference to the 1745 Jacobite rebellion of Bonnie Prince Charlie and the vain attempt to reclaim the British throne for the Stuarts. What 60 years before had been an immensely controversial political event that deeply divided public opinion in Scotland (and greatly inflamed passions in England) had become by 1814 a purely historical event suitable for fictional treatment that would appeal to a diverse international audience.
It was Scott's genius to recognize that a political event, especially one of great import, could best be transformed into fictional history precisely at the moment at which the living witnesses and participants of the event inevitably exited the scene. For all of his antiquarian attention to historical detail, Scott characteristically altered history for dramatic purposes. He offered a new and improved version of the past that championed his own bourgeois vision of the ideal political order, one characterized by peace, religious toleration, civility, moderation, commercial activity, individual freedom, and steadfast loyalty to God, king, and country.
Written at a time when the great political conflicts of Europe seemed momentarily at an end, Scott's historical novels looked back to the political and religious tumults of the past from what seemed the more secure and peaceful world of the late 1810s and 1820s. The lives of Scott's fictional heroes and heroines inevitably intersect with those of famous historical personages—Bonnie Prince Charlie, the Duke of Monmouth, Cromwell, Saladin—during a period of political crisis of far-reaching consequence: the Jacobite war of 1745, the Scottish religious wars and political rebellions of the 17th century, the English Civil War, the Crusades.
Scott's fictional protagonists are always "middling folk." These respectable, often prosperous and genteel characters with their "modern" middle-class mores appear anachronistic against the backdrop of historical epochs filled with violent religious fanatics, doomed aristocrats, deposed monarchs, wild romantic Highlanders, chivalric Christian knights, and Muslim warriors. But if Scott's fictional heroes seem to have been magically transported backward into time, they nonetheless fulfill an important narrative function by allowing Scott's readers to identify with the witnesses and minor actors in great historical events. Careful to appeal to the widest possible audience, Scott made certain that the stories he told offered a romantic hook; the resolution of the historical or political crisis is typically punctuated by the long-delayed and much anticipated union of Scott's fictional hero and heroine. Scott's formula is imitated to this day, especially in Hollywood.
Prior to the beginning of the present "War on Terror," in the post-Cold War era of the 1990s, writers, film producers, and the American public as a whole might be said to have found themselves in a historical lull comparable to that of Scott's readers in the 1820s. The United States in the 1990s seemed more peaceful and prosperous—and more self-satisfied and uninteresting—than the America of 30 or 60 years ago. The anxieties of having finally achieved cultural preeminence and political authority in uninteresting times seem to have particularly burdened the baby boomer generation.
If their youth was characterized by the turmoil of the Cold War, the nuclear threat, political assassinations, the civil rights movement, Watergate, and above all, the Vietnam War, such grand historical events were, with the possible exception of the civil rights struggle, resented or mourned rather than welcomed, forced upon a recalcitrant generation rather than passionately embraced in the manner with which their parents had greeted the epic challenge of World War II. At the moment that the first boomer was finally elected to the American presidency and his generation seemed at long last poised to determine the destiny of the nation, if not the world, the engine of history seemed to have stalled. The historical legacy of the boomers was not to be a new Camelot, of which they'd dreamed since their youth, but the Lewinsky White House. The great matter of the '90s was not a cause that would hallow any war, but the rancorous impeachment and Senate trial of a president whose most memorable acts were the proper subject of a fabliau rather than an epic.
In recent years, there has been a widespread and unfulfilled cultural yearning for a cause worth fighting for. The great majority—including the Big Boomer himself, Bill Clinton—had side-stepped or avoided the Vietnam War, rather than passionately engaging in it or opposing it. Clinton's "mixed emotions" on the events of September 11, as reported by Andrew Sullivan, reveal with shocking clarity the degree to which the ex-president nevertheless yearned for a historical cataclysm as a means to his own self-definition and glorification: "He [Clinton] has said there has to be a defining moment in a presidency that really makes a great presidency. He didn't have one."
The recent spate of World War II movies, books, and television programs has provided an ersatz means by which those anxieties, self-doubts, and recriminations—and, inevitably, the boomers' characteristic narcissism and self-justifications—could be expressed. Like Scott's historical novels, the films signaled an unfulfilled epic desire on the part of a generation that has felt itself diminished by comparison to the one that preceded it. They have, for better and worse, functioned as a means by which some influential boomers might, surreptitiously, assign themselves a central place, even an epic role, in a history that in fact did more to shape them than they it.
In an interview Spielberg granted when Saving Private Ryan was released, the director summed up his view of the great conflict. "I think it is the key—the turning point of the entire century. It was as simple as this: The century either was going to produce the baby boomers or it was not going to produce the baby boomers. World War II allowed my generation to exist." There you have it. The ultimate benefit, the highest justification and sanctification of the greatest, if not the bloodiest, war in human history: the birth of the baby boomers.
The World War II films of the last three years have been popular in part because they employ the most advanced special effects. The spectacular dimension of these films—the opening scenes on Omaha Beach in Saving Private Ryan, the scenes on the Volga in Enemy at the Gates, and the elaborate recreation of the Japanese surprise attack in Pearl Harbor—overwhelms the audience with visual and sonic power. No dialogue is necessary and little is supplied. In the age of the global audience, contemporary studio executives understand what their predecessors in the silent era surely appreciated: Visual spectacle and music cross borders easily.
Nonetheless, these films represent more than a triumph of special-effects technology. They also exhibit an impulse to remake the image of the generation that fought the second world war so as to flatter the values, beliefs, and mores of the baby boom generation. By reinventing their parents as an idealized version of themselves, the boomers have not so much memorialized the "greatest" generation as presented a glorified image of its successor.
Recent World War II epics have, a la Scott, allowed the boomers to dress up in their parents' vintage clothes and parade the dominant cultural and political ethos of the 1990s in the garb of the 1940s. In Pearl Harbor, the prevailing sexual mores of the 1990s, to some degree a cultural legacy of the 1960s, are peremptorily thrust back into the 1940s. Indeed, the film's main characters' casual embrace of premarital sexual relations is conveniently depicted as the inevitable result of a world crisis. If not for Hitler's aerial assault on Britain, Lt. Evelyn Johnson (Kate Beckinsale) and Capt. Rafe McCawley (Ben Affleck) would not have to rush matters, nor would Evelyn have to console herself, not long after Rafe apparently perishes in the Battle of Britain, in the arms of her beloved's fellow flyboy and best friend, Capt. Danny Walker (Josh Hartnett). You don't like free love? Blame it on history. And if not for the fiendish Japanese bombardment of what looks more like a holiday paradise than a functioning naval base, an attack that precipitates a series of events culminating in the death of bachelor #2 (that's Danny) in faraway Manchuria, our poor heroine might never have been able to make up her mind.
Of course, all ends happily when Evelyn and Rafe are ultimately reunited, with daddy Capt. Rafe nobly adopting as his own the son Evelyn and Danny had enthusiastically conceived amid billowing satin parachutes in a conveniently vacant airplane hangar. (Pleased with the return to family values? Thank history.)
The basics of this tangled romantic plot might have been pulled wholesale out of a Scott novel such as Old Mortality, in which the heroine finally marries the hero when his rival and friend is conveniently killed off in the last skirmish of the Scottish religious wars of the 17th century, despite the hero's best efforts to save him. But whereas Scott seemed genuinely interested in Scotland's 17th-century religious politics, the makers of Pearl Harbor—screenwriter Randall Wallace of Braveheart fame (who likely knows his Scott) and director Michael Bay—seem interested in the war mainly as a backdrop to their 1990s romantic fable. If Scott personalized the political to make it more accessible, Bay and Wallace simply reduce the political to the personal and leave it there.
As my female companion at the screening described it, Pearl Harbor is just your standard chick flick with vintage planes and heavy ordnance thrown in for effect. But the film does deliver a message of sorts: The non-traditional neo-nuclear family of the 1990s, which embodies in the bio-genetic diversity of its children the varied and liberated romantic pasts of its parents, is transferred to the 1940s, where it is comfortably situated within the American grain. Thus, in the final scene of the film, Evelyn, Rafe, and little Danny Walker Jr., back on the family farm in Tennessee (where Rafe grew up with his boyhood pal, Danny Sr.), are cinematically sanctified in a final visual tableau worthy of Norman Rockwell.
Other recent World War II films have served similar ends. In Enemy at the Gates, Vassili Zaitsev (Jude Law) is a talented young Russian sniper posted to besieged Stalingrad. He competes with his friend and patron Danilov (Joseph Fiennes), a propaganda officer working for the internal Russian security service under the command of Khrushchev (Bob Hoskins), for the affections of Tania (Rachel Weisz), a Russian-Jewish resistance fighter. Our young Russian hero must fight on two fronts. In addition to facing off against the top sniper in the German army (who comes to Stalingrad to pick off Vassili), he must worry about his friend Danilov picking off Tania.
Co-written and directed by the Frenchman Jean-Jacques Annaud—born in 1943, just ahead of the baby boom—the movie panders to the sizable American market despite its focus on a pivotal battle in which the U.S. played no significant role. Setting aside the question of the combat status of Russian women at Stalingrad, we might note that Enemy at the Gates reassures those worried about the fighting capabilities of a new unisex army. Women will, it seems, honorably take their place alongside their male counterparts on the front, even if they must assume a variety of uncomfortable postures when consorting with their beloved comrades-in-arms in crowded bunkers.
By encouraging a contemporary American and Western European audience to identify with its Russian heroes and heroines (much as Hollywood propaganda films of the early 1940s generated American and Western European sympathy for our Soviet allies), Enemy at the Gates clearly signals a post-Cold War realignment of political sympathies. The Russians are no longer our ideological and military foes, just patriotic individuals fighting for their homeland. Even Danilov, the propaganda officer, turns out to have ambivalent feelings toward the official Stalinist line that Khrushchev viciously enforces. Fascism turns out to be the one truly malignant force, for unlike even the most committed members of the Red Army, German officers are willing to kill children in cold blood.
Perhaps the most striking revisionism in Enemy at the Gates is its transformation of one of the longest and bloodiest battles of World War II into a private duel between two superstar snipers that is most meaningfully played out in the Soviet (and presumably German) press. By keeping its audience in the dark concerning the military strategies and large-scale engagements that actually determined the fate of Stalingrad, focusing instead on Danilov's propaganda war, the film reduces the battle of Stalingrad to a media campaign. Though the decisive Russian military breakthrough came in November of 1942—prior to the "climactic" duel depicted in the film—Enemy at the Gates reflects a post-Vietnam sensibility that understands the "winning of hearts and minds" as the most important and strategically significant undertaking in any military conflict. Even at Stalingrad, image is everything.
Paraphrasing what André Gide once said of Victor Hugo, one might concede that Spielberg is, alas, the greatest director of war films of his generation. Saving Private Ryan is at once the best and most cunning of the recent World War II epics. It artfully disguises its revisionist aims by virtue of its scrupulous attention to the gritty and unsavory details of combat.
In his successful bid to one-up Darryl Zanuck's 1962 epic The Longest Day, Spielberg has dazzled civilian and military audiences (including D-Day veterans) with his uncompromising depiction of slaughter on the beaches of Normandy. Taking his cue from Francis Ford Coppola's Apocalypse Now, Oliver Stone's Platoon, and Stanley Kubrick's Full Metal Jacket (Spielberg has recently claimed to have channeled the spirit of the late auteur), he seems to have combined the graphic realism of the Vietnam film with the seemingly traditional patriotism of the World War II saga.
But it might be more accurate to say that Spielberg has reinterpreted the experience of the second world war in light of that of Vietnam. The result is an exoneration of the conduct of Spielberg's generation and a subtle diminishment of the heroism of that of his parents. For if the visual center of Saving Private Ryan is the assault on Omaha Beach, the narrative center, as the title of the film suggests, is the saving of a single private in the U.S. Army.
In their efforts to rescue Ryan (Matt Damon) from behind enemy lines, the small patrol led by Capt. John Miller (Tom Hanks) endures a series of harrowing confrontations with the enemy that have more in common with Apocalypse Now than with The Longest Day. Miller's troops quarrel relentlessly among themselves, pointedly call into question the very purpose of their mission, and even threaten to mutiny or desert before they've rescued Ryan.
The film ultimately revolves around scenes that ask whether the shooting of unarmed German prisoners by American GIs is morally justified. No doubt such things took place during World War II, but Spielberg breaks new ground in anchoring his narrative in this issue. Spielberg's greatest generation, in spite of its bravery and patriotism, acts in a manner more closely resembling that of American troops at My Lai than that of John Wayne in World War II epics of an earlier age.
However, Spielberg is careful not to give away the game too easily. The audience is made to question Capt. Miller's compassionate decision to release a German prisoner rather than have him summarily shot by vengeful and disgruntled GIs. In the film's final battle scene, it is this same German soldier who fatally wounds Capt. Miller, and it is the most ethically upright and law-abiding member of Miller's squadron who subsequently kills the German. (Spielberg also takes pains to depict American soldiers helping, if thereby unintentionally and unnecessarily endangering, French civilians caught in the crossfire of the Normandy invasion). Critics argue that Miller's GIs should have shot the unarmed German when they had the chance, that the brutal killing was, in the context of battle, the moral thing to do. Spielberg thus implicitly justifies the behavior of his generation in Vietnam by making the case for battlefield conduct unavoidably shaped by the brutality inherent in all war. But perhaps Spielberg also suggests that one reason that American soldiers in World War II have been more favorably portrayed in the popular media than their sons in Vietnam was that the former conveniently faced a truly evil enemy—German fascists and anti-Semites—rather than one about which liberal members of Spielberg's generation have felt considerable ambivalence (Vietnamese communists).
Saving Private Ryan also serves to assuage the guilt of the Vietnam generation through another sleight of hand. If Miller's men are Vietnam-era soldiers dressed in World War II general issue, Pvt. Ryan nonetheless refuses an opportunity to avoid danger—an opportunity that so many members of the Clinton, Gore, Quayle, and Bush Jr. generation sought.
In the midst of a military conflict, Ryan is offered the chance to avoid combat, to go home, and to do so with the full approval of his government. Of course he refuses, and stays to fight alongside a new set of unfamiliar comrades whom he nonetheless will mourn and memorialize as an old man, accompanied by his loving wife, children, and grandchildren. Rather than functioning as an implicit criticism of Spielberg's generation, Ryan's choice serves, I would argue, as a kind of compensatory fantasy.
Ryan lives out (and also lives to reflect upon) an experience of war and national service that so many of Spielberg's fellow boomers were anxious to avoid. Spielberg offers a virtual salve to his audience and himself: If once more given Ryan's choice (and his good fortune to draw a truly diabolical opponent), he and his generation really would have served bravely, rather than demurred or objected, however conscientiously. Spielberg's narrative genius offers an apologia for his generation that in equal measure assuages guilt and glories in self-justification, if not self-heroization.
In his book The Campaigns of Alexander, the second-century Greek historian Arrian records the visit of Alexander the Great to the tomb of Achilles at the ancient site of Troy. Alexander is supposed to have described Achilles as "a lucky man in that he had Homer to proclaim his deeds and preserve his memory." Alexander understood that even the greatest political and military feats do not speak for themselves; they become truly important, memorable, and historical only insofar as they are recorded, interpreted, and subsequently appreciated by an audience.
Alexander's comment thus acknowledges that the fate of the hero rests not entirely in his own hands, but also in those of the artist or historian. Arrian's (perhaps apocryphal) tale casts much light on the murkier recesses of the cultural psyche of the boomers.
The boomer generation might well have been the first raised in an era in which history manifested itself primarily and most powerfully through the medium of the silver screen. Having come upon the stage of history too late to fight in World War II, the cultural elite of the boomer generation could nonetheless make movies about it. Or, to be more precise, they could remake and revise the WW II films of their youth. For in a crucial respect, the WW II films of the '90s ultimately have less to do with the Second World War per se than with earlier cinematic representations of the event.
The real (though invisible) antagonist of recent WW II films is not so much the German fascist or the Japanese imperialist as the specter of John Wayne, the larger-than-life self-image of "the greatest generation." Just as the epic deeds of Achilles in Homer's Iliad inflamed the envy and hubris of Alexander the Great, so too have the cinematic images of the Duke's heroic feats fed the jealousy and stoked the historical ambitions of the boomers.
The baby boom generation, for better or worse, is the first fully committed to the view that to control the visual representation of history is to control history itself, and thereby one's own destiny. This is a cultural reality that, for all of his anti-modern religious enthusiasm, even the malign director of the spectacular attacks of September 11 fully appreciates.
Clinton, ever the self-promoter, has long sought to link his presidency with that of FDR. In May 1997, Clinton presided at the opening of the memorial to FDR, a monument happily consonant with the self-image cultivated by both Bill and Hillary during their own years in the White House. Should any visitor fail to make the connection between the two presidents, the floor-to-ceiling granite plaque that graces the foyer of the Information Center of the memorial features the name of WILLIAM J. CLINTON prominently etched just below that of FRANKLIN DELANO ROOSEVELT.
Clinton's name is etched above, and in larger characters than, the names of the FDR Memorial Congressional Committee members, architect Lawrence Halprin (who worked on the project for more than 20 years), the private donors, and the citizens of the United States on whose behalf Congress so generously allocated funds.
To be sure, when architect Halprin first drafted his design in the mid-1970s, he could not have anticipated how faithfully his work would embody the animating spirit of the Clinton years. But such is genius that by representing the past it may peer into the future.
Bordering the Tidal Basin, the memorial occupies seven and a half acres, considerably more than is devoted to any other memorial on the Mall. Consisting chiefly of a series of massive granite walls that support a landscaped and mounded earthwork, the monument is constructed partly below ground. As a result, the FDR memorial is easy to miss, and from some vantage points invisible.
Nonetheless, the memorial to FDR fittingly embodies Bill Clinton's Orwellian pronouncement that the "era of big government is over." Enormous Carnelian granite blocks—31,239 of them—form a series of walls 12 feet high that constitute its chief architectural feature. One of the most massive and sprawling structures on the Mall, the memorial is cunningly crafted so that its gargantuan dimensions escape notice. Eschewing the formality and grandiose neo-classicism of the Washington Monument and the Jefferson and Lincoln Memorials, the modernist-inflected FDR Memorial unobtrusively melds into its "natural" setting.
The land on which the memorial stands is in fact composed of mud dredged from the Tidal Basin in the late 19th century. The unstable topography, still settling, could not support the structure's massive weight, so a reinforced concrete deck over 900 steel pilings driven 100 feet into the ground was built as a foundation. In his commemorative account, The Franklin Delano Roosevelt Memorial, Halprin assures us that despite its vast expanse and ambitious engineering, the memorial "feels as if it's part of the earth."
In short, the memorial casually symbolizes federal bloat at the end of the century. An elephantine governmental structure that took more than 50 years to be realized (the first resolution to build a memorial to FDR was introduced in Congress in 1946), the monument is discreetly disguised such that it appears to have always been an integral, even natural, feature of our American landscape.
The FDR Memorial consists of four outdoor "rooms," each corresponding to one of FDR's presidential terms. These rooms feature a variety of bronze statues and bas-reliefs, as well as an impressive series of waterfalls and pools symbolizing various events and achievements in the life of the man whom Halprin refers to as "one of the four great presidents of our 220-year history as a country." (It would seem that the wrong Roosevelt is carved into Mount Rushmore.)
Engraved into the walls in super-sized capital letters are THE GREAT WORDS of Roosevelt, whom Halprin compares, without a hint of irony, to the prophet Moses, "who led his people to the promised land but was never able to cross into it." Whereas Moses required a mere pair of stone tablets on which to record the word of God, FDR's deep thoughts stretch across hundreds of thousands, even millions, of tons of expensively quarried granite transported from South Dakota.
The official deification of an American president is not unprecedented (Lincoln in his memorial is loosely modeled on Zeus in the temple at Olympia), though rarely has a memorial architect been so insistent that the visitor comport himself as a religious pilgrim. Halprin repeatedly calls the monument "a sacred memorial space" set apart from the "secular" recreational areas that flank it. The memorial, we are assured, is not designed for thoughtless acts of tourism, but is meant to unfold "a processional narrative," to inspire the penitential supplicant with its "spiritual, contemplative character." Our "processional passage" is punctuated by a series of devotional stops, including a penultimate pause in the Fourth Room before the recessed niche containing Neil Estern's bronze statue of Our First Lady of the United Nations and World Peace, Eleanor Roosevelt, her hands dutifully crossed in a manner studiously imitated by Hillary Clinton.
Though Halprin designed the memorial as "a walking experience" (not just a walk, but a walking experience), the official CD audio guide sold by the National Park Service assures us that it is fully in compliance with the Americans with Disabilities Act. A good deal of controversy surrounded Neil Estern's original larger-than-life bronze statue of FDR. Given that Roosevelt, with the connivance of the contemporary press, scrupulously hid from public view his own disability (stricken with polio at age 39, he was paralyzed from the waist down for the remainder of his life), Halprin and his artistic comrades debated whether the image of "one of our greatest heroes" should refer to his handicap.
Their original solution is an artistic equivocation worthy of Clinton himself. The 1997 bronze image of FDR is based on a photograph taken of a recumbent and ailing president at the 1945 Yalta conference. Estern's Roosevelt is seated on a chair equipped with tiny casters; but the chair is modeled not on the one on which FDR was photographed in Yalta, but on the one he used in his private residence at Hyde Park. Three casters are concealed beneath a bronze cape that drapes over Roosevelt's body. Only one small caster, which faces toward the back of the stone niche in which FDR's bronze image rests, is visible, and then only to the visitor unceremonious enough to climb around the seated figure and into the niche.
The disabled, who since 1997 have had to solace themselves with the knowledge that FDR's leg braces, encased in glass like the relics of a medieval saint, are on view in the monument's foyer, may now celebrate the newly dedicated statue of FDR conspicuously seated in a wheelchair. Though only four of some 10,000 extant photographs of FDR show him in a wheelchair, the National Organization on Disability, its president, Alan A. Reich, and the philanthropic investor, Peter Kovlar, successfully lobbied Congress to include Robert Graham's new life-sized statue that openly displays the physical limitation FDR was always so careful to conceal. Ever the linguistic pioneer, President Clinton proclaimed the new statue a "monument of freedom." His deep identification with his hero perhaps led Clinton to an uncharacteristically self-revealing assessment of FDR: "He was a canny fellow, and he didn't want to risk any vote loss from letting people see him in a wheelchair."
Nor are the disabled the only group whose special interests are addressed and celebrated at the memorial. Greens are reassured by FDR's very own words that the president—whose legacy included the Tennessee Valley Authority and the construction of the Hoover Dam, two projects unlikely to pass muster with Naderites were they to be proposed today—was a deeply committed environmentalist: "MEN AND NATURE MUST WORK HAND IN HAND." (Halprin, whose studies at the Harvard School of Design evidently did not include economic history, assures us that the underlying causes of the Great Depression were "farming malpractice, destruction of the natural environment, and large-scale erosion.") Robert Graham's bronze mural, Social Programs, dutifully represents the faces of America's various ethnic and racial minorities; even the granite on which the bronze mural is mounted, Halprin assures us, though "uniform in grain" is "diverse in its makeup."
The more pacific wing of the Clinton generation, lest it be put off by the Third Term Room with its imposing Broken Wall and thunderous cataract, meant to evoke FDR's heroic leadership during the chaos of World War II, is to be comforted by the words of a 1936 presidential campaign speech that, according to the audio tour, promised "to shun political commitments that would draw [the nation] into foreign wars." Engraved not once, but twice upon the granite blocks in the Third Term Room is the single declarative sentence: "I HATE WAR." Halprin's equivocal memorialization of American triumph in World War II, filtered through the anti-war sentiments of the Vietnam era, is perhaps a more fitting tribute to the 42nd president, who dodged the draft in his youth and cynically bombed Sudan and Afghanistan to distract attention from Monica Lewinsky's grand jury appearance, than it is to our 32nd chief executive.
Of course, in the rough and tumble of special-interest politics, some groups will inevitably lose out. All reference to one of FDR's most memorable talismans, the smoldering cigarette holder, was quietly omitted from Estern's statue, indeed from all images of the president that adorn his memorial. No Cigarette Smoking Man in evidence here. Halprin's is an FDR perfectly suited to the politically correct Clinton '90s.
The two most remarkable features of the memorial are the Second and Third Term Rooms, celebrating the New Deal and FDR's war leadership respectively. The Second Term Room is subdivided into two facing chambers, the first punctuated by a set of bronze statues executed by George Segal—"The Fireside Chat," "The Breadline," and "The Appalachian Couple"—depicting the experiences of ordinary folks during the Great Depression. The second facing chamber celebrates the New Deal, focusing on the creation of 54 alphabet agencies and social programs that, according to Halprin, "elevated the country from the quagmire into which it had sunk." (Halprin's efforts at what he terms "an experimental history lesson" for the masses are partly undermined by the official audio tour. The "Appalachian Couple," we are instructed, is also meant to evoke the Dust Bowl of the Great Plains.) Significantly, neither Halprin's official commemorative volume nor the Park Service audio tour mentions that the nation sank more deeply into the quagmire of the Great Depression in the years following the initiation of the New Deal in 1933. But what's an experiment without a glorious failure or two, or 54 for that matter?
Segal's statues of the farm couple, a line of unemployed urbanites standing in a bread line, and a lone male figure sitting by his Philco radio "entranced by the voice" of FDR might be said to suffer from an "imitative fallacy." According to Halprin, Segal meant his bronze figures to embody the "inherent individual dignity" of ordinary men and women. Realistic and life-sized portraits of the down-trodden and degraded, they are unlovely to look upon. Segal and Halprin's gesture toward democratic inclusiveness merely accentuates the radical gulf that separates the patrician leader who wields real power in the White House from the plebes and proles. Halprin's book reveals that Segal wrapped his models from head to foot in bandages, before covering them in plaster molds, which were later used to cast the bronze statues. No doubt the process greatly enhanced each model's "inherent individual dignity."
Robert Graham's bronze mural, Social Programs, unintentionally manages to embody the more sinister and subterranean aspects of the New Deal, which, among other things, helped usher an often dehumanizing bureaucracy into everyday American life. The images of the common folk on his mural are rendered as fragments—a Sargasso Sea of isolated hands, faces, and random body parts, rarely if ever comprising a whole individual. These dismembered citizens are mere scattered pieces lost among Braille inscriptions, symbolic icons of federal programs, and a swarm of initials—the acronyms of the alphabet agencies. The classic republican relationship envisioned by Jefferson and the Founders is here reversed: The government is not the creation of individuals for the purpose of protecting their natural rights, but rather, a towering and mechanical federal bureaucracy that literally creates the people by stamping itself on the malleable clay of the nation. The resulting chaos of dismembered bodies and wandering souls (which might just as suitably serve to represent the carnage of World War II in the Third Term Room), we are assured, accords with a grand design conceived by a remote and all-knowing social engineer who really does love the people.
Though it is impossible to argue with FDR's principled struggle against fascism or to be unimpressed by the architectural achievement of the Third Term Room, it is no less beset by unintended irony. Having become familiar in the First Term Room with the words of FDR's ringing proclamation: "AMONG AMERICAN CITIZENS, THERE SHOULD BE NO FORGOTTEN MEN AND NO FORGOTTEN RACES," we are reminded in the passage to the room devoted to FDR's stewardship during World War II that "WE MUST SCRUPULOUSLY GUARD THE CIVIL RIGHTS AND CIVIL LIBERTIES OF ALL CITIZENS, WHATEVER THEIR BACKGROUND." Apparently Halprin felt it best to omit any unsettling allusion to FDR's executive order to intern Japanese-American citizens during the great man's finest hour. Similarly, his unwillingness to integrate the armed forces (a task that would fall to Truman) goes unmentioned.
The official audio tour offers a series of parallel narratives here: "FDR and Churchill," "FDR and Stalin," "FDR and Fala" (no kidding; a bronze statue of FDR's Scottish terrier sits at the president's feet), all meant to uplift our spirits and enumerate for us the war-hating president's wartime accomplishments. We are lectured on the president's Lend-Lease agreements with Churchill and Stalin, and reminded of his insistence that America become "THE GREAT ARSENAL OF DEMOCRACY." Unprecedented levels of U.S. government arms shipments to Stalin's totalitarian regime no doubt heartened advocates of democratic self-rule everywhere, just as the transfer of sensitive computer and missile technology to China did under Clinton.
There are no audio narratives on FDR's attempt to pack a Supreme Court that regularly found many of his progressive initiatives in violation of the U.S. Constitution, or on his unprecedented concentration of power in the White House, and only praise for his tremendous expansion of the size and role of the federal government and his vainglorious abandonment of Washington's historic precedent of not serving more than two terms.
In his commentary on the Fourth Term Room, which memorializes the death of FDR in 1945 and celebrates the subsequent establishment of the United Nations, Halprin would not have us forget that before his untimely passing, the president "had achieved his goal of freeing the world from the menace of dictatorships." Those who have listened attentively to the audio guide account of FDR's war-time relationship with his great ally Joseph Stalin will be forgiven if they are impatient to make their way to the exit.
In St. Paul's Cathedral in London, on the inconspicuous tomb of its architect, Christopher Wren, are engraved the words Si monumentum requiris, circumspice—roughly, "If you would seek his monument, look around you." Wren understood that the edifice of St. Paul's, visible throughout the whole of London, would serve as the architect's most fitting memorial, rather than another lavish tomb. Like Wren, Roosevelt never expressed the desire for an ostentatious monument to himself. He asked to be remembered only by a plain stone engraved with his name, and such a stone was long ago placed in front of the National Archives. FDR grasped, more firmly than his epigone Bill Clinton (who is ever ready to flaunt himself before the American public), the Machiavellian lesson that to wield great personal power in a democracy means to conceal the scope and nature of that power. It is not Halprin's memorial, but the seemingly countless colossal government buildings flanking the Mall in Washington, D.C., many bearing the names of the bureaucratic agencies he created, that are, alas, FDR's most enduring legacy.
W.B. Yeats: A Life: The Apprentice Mage, 1865-1914, by R.F. Foster, New York: Oxford University Press, 672 pages, $21.50
Yeats's Ghosts: The Secret Life of W.B. Yeats, by Brenda Maddox, New York: HarperPerennial, 496 pages, $16