Extracurricular Reading

Books every college student should read

The college curriculum has been a matter of great controversy for the past several years. But much undergraduate learning takes place outside of class and beyond the confines of formal reading lists. Taking advantage of this phenomenon, REASON asked a number of scholars, mostly university professors, to recommend works that students are not likely to encounter on typical college syllabi but that they would suggest to a curious undergraduate. We limited each contributor to a single book, but, as you'll see, some managed to cheat.

Gregory Benford

Most important, a book should hold your interest. Uplifting tomes left unread help no one. For pure intellectual fun, I salute Michael Hart's The One Hundred: A Ranking of the Most Influential Persons in History. Like peanuts, it's hard to nibble on just one entry in this 570-page book.

Most undergraduates emerge from universities agreeing with Henry Ford that history is bunk. This book can correct that, ladling in fascinating detail that makes the past seem real and relevant.

Hart skirts the usual dates-and-wars approach, avoids the social-movement models, and for his underlying drivers stresses two great currents in human endeavor: science/technology and religion. These, he argues in case after case, propelled humanity from hunter-gatherers to lords of the planet, for better and for worse.

When I bought the first hardcover edition in 1978 (a second, updated edition is now in trade paperback), I thought that in his list of those who have most moved humanity, surely I would know the top 10. I missed one, and no one I know has recognized his entry number 7: Ts'ai Lun, the inventor of paper. Yet Hart's case is compelling. This obscure Chinese eunuch devised paper in 105 A.D., and China kept the technique secret until 751, when Arabs tortured the trick out of Chinese craftsmen.

Paper was never independently invented elsewhere, even though its existence was known for centuries. Hart argues cogently that China's rise to cultural superiority over the West for a thousand years arose in large measure from the organizing utility of readily available paper. Only when Johann Gutenberg (entry number 8) devised the printing press did the West redress the balance and draw ahead of China.

Hart's Top Ten balances science/technology with religion, beginning with Muhammad (because he both founded and spread a major religion), then in order, Newton, Christ, Buddha, Confucius, St. Paul, Ts'ai Lun, Gutenberg, Columbus, and Einstein. My only major difference with Hart's choices is that I believe Euclid, the sole figure who put mathematical thinking on an orderly basis, has been more influential than Einstein.

Necessarily, Hart has to guess at the long-range influence of those born in the last few centuries, but still, engaging patterns emerge. An unusually large number of major figures flourished between the sixth and third centuries B.C. From the 15th century on, each century supplies more names.

Natives of the British Isles turn up often, with Scots the true standouts. Hart argues cogently that scientists and inventors (37 entries) exercise more sway over later lives than politicians and generals (30 entries). Nineteen of the hundred never married, and 26 had no known children, with both these groups concentrated near the top of the list. Maybe family life gets in the way of greatness? Hart estimates that about half of the list have no living descendants. Seven were illiterate, principally military leaders. Oddly, at least 10 suffered from gout, a fact of interest in medical research.

The book abounds in insights, points worth arguing, quirks of history. It makes the past a living landscape, and as Hart hopes, does indeed "open, as it did for me, a new perspective on history."

Gregory Benford is a professor of physics at the University of California, Irvine, and the author of Timescape.

Stephen Cox

Students who are trying to make sense out of history will probably not derive much help from the versions of historical theory that they encounter in the classroom. They will be especially disappointed if they want to understand the wonderful coincidence embodied in the progress of capitalist societies--the simultaneous expansion of personal liberty and material prosperity.

The historical theories that are normally embalmed in college syllabi won't come close to explaining how this happened. Some of them deny that there is such a thing as "progress" in the first place. Others view material progress as an effect of purely material forces, a kind of self-generating machine that periodically emits, as a byproduct, certain sentimental notions about the importance of individuals and individual liberty.

The conclusive refutation of all this nonsense can be found in Isabel Paterson's The God of the Machine (Transaction Publishers). Paterson demonstrates the dependence of economic forces on ideas, which are the products of individual minds; and she shows that the key to material progress is the discovery of social and political mechanisms that allow individuals to think and act freely. If modern civilization can be likened to a machine, then the creative mind is its dynamo; its circuits of energy are maintained by capitalist institutions; and its improvement results from a gradually increasing understanding of the "engineering principles" of a free society. "These are not sentimental considerations," Paterson says. "They constitute the mechanism of production and therefore of power."

Paterson traces the history of the engineering principles from Greek science and Roman law to the system of limited government created by the founders of the American republic. She then describes the subversion of that system by collectivist ideas and practices. She presents a withering analysis of the managed economy, public education, state-sponsored "security" schemes, and the supposedly humanitarian impulses that lead people to favor the welfare state.

When The God of the Machine was published in 1943, it was shockingly unfashionable. "Fit audience, though few": The general public was not attracted, but Paterson was admired by such fellow iconoclasts as Ayn Rand, Albert Jay Nock, and Russell Kirk. The God of the Machine remains a classic of individualist thought. But it is not a pale historical artifact, locked in its time of origin. It is focused on the great continuing issues of civilization, which it confronts with the authority of Paterson's special character and experience.

Brought up in poverty on the Western frontier, educated by herself and educated superbly, Paterson became an important novelist and literary critic. She did not merely preach individualism; she lived life as an individualist. And she was not merely a theorist; she had the creative imagination that brings theory to life and challenges the imaginations of others. There was nobody quite like Isabel Paterson, and there is nothing quite like The God of the Machine.

Stephen Cox is professor of literature at the University of California, San Diego.

William C. Dennis

Among the intellectual openings to the further expansion of socialism that will plague the early 21st century, three appear to me to be particularly troubling: the question of race, the yearning for equality, and the worry over environmental degradation. Thus the friend of liberty will need to do some reading in all three of these areas if he is to be an able advocate of human freedom. On equality, one might first turn to the writings of the late Aaron Wildavsky, particularly his The Rise of Radical Egalitarianism (1991); the Heartland Institute's recent Eco-Sanity (1994) would be one good place to start thinking about the environment for liberty.

But on the question of race, the premier scholar of good sense is Thomas Sowell, senior fellow at the Hoover Institution. To his Race and Economics (1975), Markets and Minorities (1981), and Ethnic America (1981), among others, Professor Sowell has added this year Race and Culture: A World View (Basic Books). In these books, Sowell deals with how culture and background influence the statistical outcome of group behavior. He does this not to denigrate the role of individual effort in life achievements but to show how different starting points, values, customs, attitudes, and beliefs are bound to make a difference in how persons in a particular ethnic or religious group turn out.

Starting points matter. They do not determine, but they do influence. How could things be any other way? Nor would we want them to be, for otherwise the wonderful variety of humanity would be reduced to some colorless norm. Therefore, Sowell argues, if one bases governmental policies on statistical "disparities" of group outcomes, he will, perforce, ignore culture, history, choice, and preference as he tries to wrench the lives of real people into patterns more closely aligned to his theories of how things ideally should be.

Yet this is not a book in which Sowell imposes some end-state view of his own to substitute for those of the coercive utopians. Rather, this is a work that might be subtitled "On How to Begin Thinking About Race." Full of interesting examples of group differences throughout history and around the globe, Sowell's book has thoughtful discussions of paternalism, slavery, migration and conquest, discrimination and segregation, the counterintuitive and unintended consequences of government planning, the question of racial reparations, and the idea of social justice, among other topics.

Sowell's chapter on race and intelligence is one of his best--undogmatic and sensible throughout. Even to discuss such questions as these frankly would keep this book off most college syllabi, but the implications of Sowell's study for current policy are so "politically incorrect" that I would not be surprised to hear of a book burning on one of our campuses this autumn. Come to think of it, this book has a lot to say about the problems of equality and of the best environment for humanity, as well.

William C. Dennis is a senior program officer at Liberty Fund Inc., in Indianapolis.

Thomas W. Hazlett

I once heard a very important second-year assistant professor in economics pronounce a harsh verdict upon Thomas Sowell's Knowledge and Decisions (Basic Books, 1980). "The book has nothing that's really new," he declared. "The entire 400 pages is little more than an application of the basic model."

How right he was. And how wrong. For the colossal achievement of Sowell's masterpiece is that it extends the essential logic of microeconomic analysis--and that's nothing new--into the very far reaches of modern life. Organized not in the manner of an economics text but as a classic social science reference work, the book repeatedly busts out of the narrow confines of pure theory.

For instance, in Part I, devoted to Social Institutions, Sowell divvies up his subject matter into Economic Trade-Offs, Social Trade-Offs, and Political Trade-Offs. In Part II, Trends and Issues, he categorizes his topics as Historical Trends, Trends in Economics, Trends in Law, and Trends in Politics. The book's title exemplifies its purpose: to show how people use their brains to make choices. That is the classic inquiry of economic analysis, often lost over the past two centuries in the relentless march of formalism. This book puts meat back on the bones of a war-weary homo economicus.

Sowell manages to squeeze out the jargon in marvelous fashion--ceteris paribus. He craftily replaces the abstruse "marginal analysis" with the paradigm of "incremental vs. categorical thinking." Categorical thinking is exemplified by such economic illogic as that employed by the jury that went lightly on a rapist because his victim offered him a condom. Incremental thinking focuses more clearly on the issue at hand: Given that one finds oneself in an injurious situation, it is intelligent to mitigate damage. Rational action to limit damages hardly condones the perpetration of a felony. The analytical result of this sort of clearheadedness is that the reader is painlessly processed through the economist's workshop. And thanks mainly to Sowell's relentless pursuit of institutional detail, the excursion turns out to be exotic.

On social issues, Sowell bravely travels where few economists dare to explore, and charts the incremental gains and losses from various sorts of personal relationships. Sowell's pathbreaking work in the economics of ethnic rivalry gives him a wealth of fascinating material to drop on the reader regarding racial and gender matters. The means by which discrimination works--and by which it fails--are especially intriguing.

But do not be surprised that topics such as discrimination and affirmative action are not the primary focus of this book; they simply provide a rich contextual reality for the larger framework of "incremental thinking." Indeed, the author spends perhaps more time on details of the camera market than on race, as he is an ardent photography buff.

On page 339, Dr. Sowell offers an observation that both demonstrates his mode of analysis and casts light upon the issue at hand--at least to many striving academics: "An intellectual is rewarded not so much for reaching the truth as for demonstrating his own mental ability. Recourse to well-established and widely accepted ideas will never demonstrate the mental ability of the intellectual, however valid its application to a particular question or issue. The intellectual's virtuosity is shown by recourse to the new, the esoteric, and if possible his own originality in concept or application--whether or not its conclusions are more or less valid than the received wisdom. Intellectuals have an incentive 'to study more the reputation of their own wit than the success of another's business,' as Hobbes observed more than three centuries ago."

In truth, Thomas Sowell does little more than bring Adam Smith's "invisible hand" up to the late 20th century--with metaphors in technicolor. Nothing particularly "elegant" or "rigorous" or "interesting" about that. Except to the human race.

Contributing Editor Thomas W. Hazlett teaches economics and public policy at the University of California, Davis.

Loren E. Lomasky

We are a species of busybodies, and no lesson is harder to learn--and more in need of frequent relearning--than forbearance. For most of us meddling comes all too naturally. Whether prompted by the love of virtue or abhorrence of vice, by outpourings of benevolence or stern imperatives of rectitude, we continually find ourselves impelled to extend our noses until they are firmly implanted in the business of others. In part we do so because to butt in is exquisitely pleasant. That, though, is only half of the story. Pleasures, even addictive ones, can be resisted. But when to the joys of intrusiveness is added the conviction that one is simply doing one's duty, the siren's call becomes well-nigh irresistible. That is why leaving other people alone is so terribly hard.

It is to the credit of the liberal philosophical tradition that its theorists have struggled heroically to set up sturdy bulwarks against this predilection. John Locke's Second Treatise of Government, J. S. Mill's On Liberty, and Immanuel Kant's Theory of Right develop conceptions of human beings as self-determiners thereby entitled to a domain of action within which they are not subject to interference. These and other philosophical classics must be part of our hypothetical student's curriculum. But alongside them ought to be a copy, a well-thumbed copy at that, of Miss Manners' Guide to Excruciatingly Correct Behavior.

"An etiquette book?" some will ask. "Is it really necessary for someone to know which fork to use with Belgian endive or how to address the Duchess of Kent at the Wimbledon reception?" No, of course not; though such tidbits of knowledge certainly don't hurt and may be no less illuminating than numerous others that could be culled from our scholar's social science textbooks. It is not merely disjoint instructions for performance that are to be gleaned from Miss Manners but rather an ethos.

Think of people as millions of vehicles whizzing along the superhighways, back roads, and interchanges of life. They set their courses independently and travel at different speeds. Sometimes they collide, and then carnage ensues. Last-second application of anti-lock disc brakes can avert some crashes, and when cars do collide, airbag deployment reduces injury. These devices are like moral codes and legal institutions, kicking in when collision appears imminent or has already occurred.

But skilled motoring features little reliance on brakes and none at all on airbags. One avoids the necessity of their use through taking prudent precautions: keeping a safe distance, staying in one's own lane, signaling intentions clearly. That is precisely the function of etiquette.

It is not a labyrinthine ritual of studied ostentation nor a mechanism of social dominance brought to class, race, or gender warfare, although etiquette can be misapplied in these ways. Rather, good manners are a means by which people preserve necessary distance between each other. They are how we send out signals that keep one person from veering suddenly into the path of someone else. Manners are not a substitute for moral imperatives or legal restraints any more than a course in defensive driving substitutes for having well-maintained brakes on one's car. Rather, etiquette complements morals and law. Each is an indispensable ingredient of civility.

Miss Manners (also known as Judith Martin) demonstrates for hundreds of life's mundane occasions how proper attentiveness to the principles of etiquette enables individuals to pursue their plans and projects while minimizing incursions on the interests of others. Rudeness, she insists, is not excusable even when--or perhaps especially when--it is prompted by allegedly higher motives. So, for example, the incontrovertible value of sincerity does not excuse one's failure politely to inquire, "How do you do?" when one is utterly uninterested in the content of the answer. Nor may one invoke honesty as a justification for responding to that question with a litany of one's current and prospective woes.

Please, no disapproving accusations of hypocrisy: This is not to announce an endorsement by etiquette of insincerity or dishonesty. Rather, the little ritual in which we engage when we meet is a wonderfully efficient formalism through which we acknowledge our interlocutor's status as a person deserving of solicitude and respect; that acknowledgment is duly reciprocated by a display of dignified restraint.

Of rules of etiquette there is no end, and Miss Manners seemingly knows them all. So too, no doubt, do dozens of dreary chiefs of protocol. But Miss Manners is no pedant; she is a sage; moreover, a sage who writes like a demon. That is why she amply merits her place on our student's reading list. And since academics can't resist piling footnote upon footnote, I hasten to add that should there be in the audience any postgraduates on the verge of establishing their own domestic nests, to you I commend as follow-up reading Miss Manners' Guide to Rearing Perfect Children.

Contributing Editor Loren E. Lomasky is a professor of philosophy at Bowling Green State University.

Mark O. Martin

"Every great advance in science has issued from a new audacity of imagination," wrote John Dewey. A lovely phrase, but one that hardly reflects the image of the scientist encountered by the public. This is a shame, since the practice of science has transformed both the nature of our world and how we perceive our place in it.

Yet science and scientists are repeatedly portrayed by the popular media in an unflattering and inaccurate light, as dinosaur-cloning profit mongers or monomaniacal and bespectacled superbomb designers. As I am a bespectacled microbiologist, this imagery concerns me a bit.

I know a number of scientists, and while there are those with more than a healthy dose of ego, the vast majority are nothing like the two-dimensional shadows created by Michael Crichton and Steven Spielberg. Still, media images are powerful and pervasive. How could anyone honestly expect the general educated public to know better?

There is a solution: the award-winning novel Timescape by Gregory Benford.

Fictional depictions of scientists have usually been unrealistic to say the least. Overly heroic attempts (Arrowsmith by Sinclair Lewis) are stacked against an almost rock-star focus on the pursuit of recognition (The Double Helix by James Watson).

Benford's novel bridges this gap in a balanced manner which is at once literate, accurate, and moving; it lacks the stereotypes of 1950s Giant Ant movies and our present media-powered fascination with the toppling of ethical standards. Timescape is a novel about interesting and intriguing people who happen to be scientists.

It takes place in two segments of time: an ecologically threatened 1998 and the more innocent world of 1962. A British scientist, as the millennium slouches toward a darker Bethlehem, has developed a technique to communicate a warning to the past. The enigmatic message from the future is received by a new professor (seeking tenure) at a California university in the fresh and optimistic days of JFK's version of Camelot.

Two world views collide in a spectacular fashion: Will the warning from the future be received, and understood? The concept of causality--the connections between past and future--shines brightly in Benford's prose, which is both clear and scientifically accurate.

More to the point, the scientists themselves come alive as people, not calculator-wielding plot devices. The triumvirate of scientific "types" (Theorist, Experimentalist, and Bureaucrat) reveal themselves in both 1962 and 1998. Their interactions remain fresh and vital, and uncomfortably familiar to anyone conversant with the true world of science.

When friends ask me what it is like to "do" science for a living, this is the book I hand them. We do not clone dinosaurs, nor release designer plagues into the world for profit. Benford's novel shows scientists realistically: men and women who work, love, and learn as their lives unfold.

Timescape can act as a conduit between C. P. Snow's two cultures, allowing the English major to appreciate that scientists are little different from other people who love their work. It is, perhaps, the antidote for the Creeping Crichtonism shouting from the many throats of the media today.

Mark O. Martin is an assistant professor of biology at Occidental College.

Donald N. McCloskey

Someone here will mention Milton and Rose Friedman's Free to Choose (1979), and I concur, adding only Wicksteed, Bastiat, and Adam Smith. Fine books, and seldom seen on college reading lists.

The problem, however, is that doctrine and counterdoctrine are not going to work with the audience we have in mind. We the middle-aged want to save the young the trouble, indoctrinating them with conveniently brief pamphlets about health reform or the crime bill. But the indoctrination hasn't worked in the college classroom. It's not going to work on the Offlist, either.

Better reach for their culture. That's why Ayn Rand continues to sell. Ditto science fiction of a libertarian sort. Storytelling and symbol-making think for us. We do not speak the language, said Coleridge: the language speaks us. Some rock music attacks collectivism, as it certainly did once in Eastern Europe. Movies about the modern state, such as In the Name of the Father, are more educational than any number of essays about liberty under socialism.

I can't tell you about rock music or television, though not because I believe that print available in 1940 is all one needs of culture or that the American mind has closed since I was a lad. It's merely that I stopped listening to rock music in 1959 and am ill-equipped to analyze the politics of The Brady Bunch. I leave that side to younger people, as in the new magazine of cultural libertarianism, The Exchange.

My side is old-fashioned and literary. Like the kids, though, I'm looking for sensibilities with no whiff of Lenin, Mussolini, Bertrand Russell, or the Bauhaus. The worst in 20th-century culture comes from the mixture of Science and Romance concocted by intellectuals in the 19th century. The best looks like Swift, Voltaire, Hume, Smith.

So give the students anything by Borges, such as Labyrinths (1969), or his luminous poetry, Selected Poems, 1923-1967 (1972, with the Spanish on one side and translations by English-language poets on the other). Or anything by Mencken, such as the old Vintage Mencken edited by Alistair Cooke in 1955, or his autobiographies. Or anything by Eric Hoffer (surprisingly little is in print).

The deep literary thinkers at The New York Times to the contrary, there are numerous English professors who have the sensibility I'm looking for, whatever the details of their politics. An example is the late Northrop Frye, as in The Educated Imagination (1964), the text of a series of Canadian radio talks. Another is Wayne Booth, as in Now Don't Try to Reason with Me (1970). Or Paul Fussell, in Thank God for the Atom Bomb, and Other Essays (1988).

Joseph Epstein and the magazine he edits, The American Scholar, convey the 18th-century, unromantic, ironical, satirical, individualistic, hardminded yet courteous tone I have in mind. It spreads with bourgeois virtue. I'm hoping it will make the next century less of a slaughterhouse than the one we have just been through.

One book? All right, I'll cheat. No one can browse the four volumes of George Orwell's The Collected Essays, Journalism and Letters (1971) and still celebrate the ideologies of the 20th century. Put it on The Offlist. Put it on The List.

Contributing Editor Donald N. McCloskey's latest book is Knowledge and Persuasion in Economics (Cambridge).

Richard Mitchell

Welcome to college. Be careful. In these days, when most of the preachers have lost their clout with intellectuals, one of whom you are about to become, our professors, like our politicians, have taken over the preaching, laboring mightily to foster true belief in notions defensible by faith alone. Nietzsche predicted it long ago. Don't worry. You can still be free, still get educated, nourished, and made even better than you are.

To do that, though, you may have to read even more than is assigned. Look around for authors who have been dead for a long time. Your advantage in reading the dead lies in this: They too may have had axes to grind, but nobody swings those axes anymore, so it is easier to find in them some ideas that might be continually true and useful even when the agendas they might have had for their readers are as dead as they are.

Here's a case in point. Lucius Apuleius, who died about 1,800 years ago, was an unreconstructed Roman, who became an initiate of the cult of Isis, a weird sort of mystery religion which, like the more famous Mysteries at Eleusis, had no creeds, no scriptures, and no clergy. Nevertheless, from lots of contemporary evidence, such mystery religions seem to have made people happy and decent. How they did it, we don't know; the initiates were sworn to secrecy, and none of them seem to have blabbed.

Except, maybe, for Lucius Apuleius. He wrote a rowdy, raunchy novel now called The Golden Ass. It's about a bright, handsome young man, maybe much like you, setting out to make something of himself. But, alas, through some indiscretions of a delicate nature, he finds himself transformed into a jackass, and, even worse, stolen by a band of robbers who use him as a pack animal. His adventures are various and entertaining, but still not to his liking. He would still rather be a man.

In the middle of the book, he is in a cave with two women, one an old crone cooking up some chicken soup for the other, a blushing young bride kidnapped at her wedding and being held for ransom. To comfort the girl, the old woman tells a story, a perfectly beautiful and wonderful story, which very possibly a man is not supposed to hear. But, of course, there's no man there, just this jackass. But the story sets the hero on the road to becoming a man again.

Young men will find this book valuable. They will, like all men, have to escape jackasserie again and again. Young women will find this book valuable. They will probably understand the story told in the cave, and even see the importance of letting a jackass overhear it once in a while. And no one at all is in any danger of being gulled into the cult of Isis. A very good book.

Richard Mitchell, who was the publisher of The Underground Grammarian, is the author of several books, including The Graves of Academe (Little, Brown) and The Gift of Fire (Simon & Schuster).

Ethan A. Nadelmann

Psychoactive drugs are not easily understood in American society today. Most Americans mistakenly assume that alcohol and tobacco are safer than most illicit drugs. Cocaine and heroin are blamed for many social problems, with little regard for the ways in which drug prohibition makes drugs even more dangerous than they need to be. All sorts of myths are associated with psychedelic drugs like LSD, mescaline, and psilocybin mushrooms.

Andrew Weil has long been a brilliant analyst of psychoactive drug use around the world. His first book, The Natural Mind (Houghton Mifflin, 1972, 1986), explained drug use in terms of a near-universal human desire to alter one's state of consciousness. A second book, The Marriage of the Sun and Moon (Houghton Mifflin, 1980), examined the diverse ways in which people around the world alter their consciousness--from peyote-based religious ceremonies to eating hot peppers, sitting in sweat lodges, munching on fresh mangoes, and experiencing lunar eclipses. From Chocolate to Morphine: Everything You Need to Know About Mind-Altering Drugs (Houghton Mifflin, 1983), written with Winifred Rosen, completes his trilogy on drugs and drug use, providing a valuable antidote to foolish thinking about drugs.

Designed as a drug education book for both young people and adults, From Chocolate to Morphine is remarkably well written and accessible. Weil and Rosen make clear from the start that the best way to avoid problems with drugs is never to use them in the first place. But they also know that virtually all of us are likely to use psychoactive drugs in one form or another, even if we restrict our consumption to those that are legal.

Weil and Rosen describe what drugs are, where they come from, and what their effects are likely to be. They provide guidance as to which drugs and means of consumption are safer or more dangerous. They suggest how to recognize the point at which drug use becomes habitual or otherwise dangerous. And they emphasize that human beings, not drugs, are ultimately responsible for the relationship between the two.

There is no question that drug policy in the United States would improve dramatically if policy makers properly digested this text. No other domain of policy affords public speakers such latitude in substituting distortions, lies, and ignorant beliefs for well-established truths backed by historical and scientific evidence. As enthusiasm for the "war on drugs" fades, there is reason to hope that public discourse and policy increasingly will be shaped by scientific evidence and common sense. From Chocolate to Morphine should be priority reading for anyone who uses drugs, wants to use drugs, wants to know about other people who use drugs, just wants to learn about drugs, or wants to shape policy on drugs.

Ethan A. Nadelmann is director of the Lindesmith Center in New York City.

Steven R. Postrel

Choosing books for others, especially young people, always brings out the pomposity in an intellectual--if you read this, You Will Be a Better Person, goes the subtext. In that vein, I considered recommending Thucydides' The Peloponnesian War for its unparalleled description of civic virtue and vice and for the pity and terror it evokes at the grim realities of power politics and total war. The Apocalyptics by Edith Efron, which is the best antidote to Rachel Carson and the popular hysteria over cancer from industrial chemicals, almost got the nod as well.

Instead, I chose a book that I can honestly say might make the reader a better person--Vladimir Bukovsky's To Build a Castle: My Life as a Dissenter. First published in English in 1978, it is the gripping story of a man whose thirst for liberty and sense of integrity were so strong that he could not submit to the will of the Communist Party, even when his defiance brought down upon him the entire repression apparatus of the Soviet state.

When imprisonment in the gulag wasn't sufficient to break his spirit (and his survival techniques are fascinating), Bukovsky was declared insane and placed in a mental hospital to be cured of his anti-communist delusions. Even there, surrounded by truly demented psychotics and threatened with horrifying drug treatments and physical torture, he managed to resist--smuggling out his medical records for independent evaluation by Western psychiatrists, exposing the fraud of Soviet psychiatry to the world at large, and becoming an internationally known prisoner of conscience whose fame intimidated the authorities.

Bukovsky's memoir offers the contemporary American college student many valuable things. It demonstrates the meaning of political courage in the most direct way and shows that flinty toughness rather than squishy compassion is the key to resisting oppression. It explodes the idea that propaganda and re-education can reshape human nature, undermining the fashionable notion that our ideas are determined by the oppressive power of "privileged" texts. It brings back to fresh, vivid life the meaning of individual rights, such as freedom of speech and due process of law, whose perceived importance has been dulled by ritualistic invocation.

Battles over political correctness at universities appear in clearer perspective after reading Bukovsky's accounts of his "crimes," their adjudication, and their punishment. On the one hand, the self-criticism sessions, guilt by ideological accusation, and illogical charges are eerily familiar; on the other hand, the punishments meted out to campus dissidents today seem paltry in comparison to the kind of thing Bukovsky and his fellows took for granted.

Perhaps the most valuable commodity To Build a Castle provides today's campuses is its exploration of the fundamental evil of Soviet communism, now in the process of being forgotten or trivialized. Bukovsky ties that evil directly to the professed ideals of the regime: "You have to understand that without the use of force it is realistic to create a theoretical equality of opportunity, but not equality of results. People attain absolute equality only in the graveyard, and if you want to turn your country into a giant graveyard, go ahead, join the socialists." Now there's a statement just perfect for use in your sociology class.

Steven R. Postrel teaches management strategy at the University of California, Irvine.

Paul A. Rahe

There was a time, not so long ago, when the study of war, politics, and power was thought to be part and parcel of an undergraduate education. In the course of the last four decades, this has ceased to be the case. Except at a handful of institutions, such as Ohio State University and Yale, military history has been abandoned. Within today's academy, the practitioners of this dying art are shunned as pariahs, for it is assumed that no one would devote a career to the study of armed conflict who is not somehow morally deranged. Programs in peace studies and in conflict resolution abound; rarely does anyone face up to the fact that conflicts quite often get resolved, and peace achieved, through the successful conduct of war.

One consequence of the reigning academic pieties is that in college one is far more likely to encounter an offering in the history of women's fashion than to have the opportunity to study Napoleon's campaigns. This is not because of a lack of demand. Books in military history have always sold well and still do; undergraduates are quick to enroll in courses dealing with politics and war when given the chance. Red-blooded they always were, and red-blooded they remain. Therein lies hope.

Ten years ago, the obvious place to begin would have been Thucydides' account of the Peloponnesian War. Nowhere can one find a subtler depiction of the moral and practical dilemmas faced by the statesman in a world torn by conflict. Moreover, Thucydides' environment was bipolar--as was ours in the great epoch of struggles on the European continent that stretched from 1914 to 1989--and Thucydides' war pitted a maritime power, such as we were, against a power, such as Germany or the Soviet Union, threatening dominance on land. But the two great world wars are now long gone; the Cold War has come to an end; and we no longer find ourselves hovering on the verge of conflict with a single foe.

Our situation more closely resembles the plight of the late Victorians than that of Pericles, Archidamus, Alcibiades, and Lysander; and while we still have much to learn from Thucydides, there is something to be said for asking contemporary students to read Winston Churchill's lively accounts of the dirty little wars that his countrymen had to fight at the end of the last century in defense of their empire and their way of life.

Unfortunately, the best of these--The River War: An Historical Account of the Reconquest of the Soudan (London: Longmans, Green, and Co., 1899)--was never reprinted in its original, two-volume edition. Those in search of instruction will have to turn to the library or even to interlibrary loan. The rewards will amply repay the effort of acquisition, for the reader will have no difficulty in understanding why Churchill was eventually awarded the Nobel Prize for literature.

Churchill's great, neglected work is, like Thucydides' history, a prose epic. His subjects--the Nile and its peoples; the conflict between Islam and modernity; the origins, character, and course of the Mahdist revolt against Egyptian rule within the Sudan; the resistance mounted by General Gordon at Khartoum; the fecklessness of Gladstone's Liberal administration; and the campaign of reconquest ultimately mounted on behalf of Egypt and Britain by Sir Herbert Kitchener--offered him the same sort of canvas available to Thucydides, and he took the endeavor as an occasion for reflection on the moral responsibilities attendant upon great power and as an opportunity to explore the relationships between civilization and decadence, between barbarism and courage, and between modern science and the changing character of war.

In an age when Americans are likely to be called on to respond to ugly little conflicts marked by social, sectarian, and tribal rivalries in odd corners of the world--the Arabian peninsula, the Caucasus, the Horn of Africa, the Balkans, Central Africa, the Maghreb, and the Caribbean, to mention the most recent examples--I can think of no other historical work that better deserves our attention.

Paul A. Rahe chairs the department of history at the University of Tulsa. His book Republics Ancient and Modern: Classical Republicanism and the American Revolution was reissued in August by the University of North Carolina Press in a three-volume paperback edition.

John Shelton Reed

Maybe it's not helpful to start by questioning this assignment's premise, but in fact I'm not sure there any longer are such things as "typical college syllabi," at least in the humanities and social sciences. With a few honorable exceptions, college faculties have exploited the elective system to make an implicit pact with our students: We don't require them to read anything in particular, and we get to teach whatever we want. The upshot of this cheerful permissiveness is that there's almost nothing except maybe The Color Purple that all students have read, and almost nothing--Zane Grey's novels, Turkish cookbooks, the opinions of Justice Brennan--that one can't imagine being assigned by some professors, somewhere.

Notice I said "almost nothing." We do have our orthodoxies, and it's tempting to suggest books just because they're offensive to received collegiate opinion: Jean Raspail's Camp of the Saints, for instance, or Michael Fumento's The Myth of Heterosexual AIDS, Steven Goldberg's The Inevitability of Patriarchy, or anything by Arthur Jensen on group differences in intelligence. But it's a pretty safe bet that students aren't much better acquainted with Plato, the Bible, or Shakespeare, so maybe they should read some of the staples of the old "Western Civ" courses first--except, of course, that one shudders to recall how the classics are often taught these days.

Anyway, enough stalling: I'll play your game. You want one book?

My nominee is I'll Take My Stand: The South and the Agrarian Tradition, published in 1930 by "Twelve Southerners," most of them literary chaps associated with Vanderbilt University (all white males, and all but one now dead). "The Vanderbilt Agrarians," intellectual historian Richard King has written, "offered the closest thing to an authentic conservative vision which America has seen," and certainly John Crowe Ransom, Allen Tate, Robert Penn Warren, and their colleagues stand in a tradition of anti-industrial, anti-modern thought far better represented in England and on the Continent (and latterly in the Third World) than in the United States, a tradition today's students will find almost totally unfamiliar.

Like their antebellum cousins the pro-slavery theorists, the Agrarians started with reasonable, even commonplace, observations, but took them in startling directions. They disliked alienated labor and automobiles, ecological degradation and total war, all fashionable complaints on modern campuses. But their prescriptions were very old-fashioned: The family farm. Revealed religion. Good manners. And, Lord knows, most of the 12 could write.

What would MTV addicts make of Andrew Lytle's admonition to turn off the Victrola and take the fiddle from the wall? What would a generation of mall rats make of his paean to the spinning wheel? I don't know, but it would be fun to see, and those who are capable of being puzzled might be the better for it.

John Shelton Reed is William Rand Kenan Jr. Professor of Sociology at the University of North Carolina at Chapel Hill.

Lynn Scarlett

So many and varied are the niches of inquiry left unexamined in our schools that I feel obliged to recommend as an educational supplement a nonfiction epic of ideas. Nobel laureate F. A. Hayek's The Constitution of Liberty is such an epic.

Education should nurture the capacity to think about ideas and their consequences. Hayek counts among those ideas worth revisiting "that ideal of freedom which inspired modern Western civilization and whose partial realization made possible the achievements of that civilization."

The idea of liberty deserves particular attention because, as Hayek proposes, it is "not merely one particular value but…the source and condition of most moral values. What a free society offers to the individual is much more than what he would be able to do if only he were free."

But just what is this liberty (or freedom) of which Hayek writes? Why should we care about it? How do notions of freedom relate to current events and issues?

These are questions Americans need to ask. The Constitution of Liberty can assist in this inquiry.

Abraham Lincoln wrote that "we all declare for liberty: but in using the same word, we do not mean the same thing." Hayek unpacks the rhetoric of freedom. He contrasts at least four different concepts: political freedom (democracy); inner freedom (free will); freedom as power (to do or get what we want); and freedom as the absence of coercion.

It is the fourth concept that Hayek views as essential to human progress. The first concept, political freedom, he regards as instrumental to real freedom only if limited. The other two concepts he regards, respectively, as an irrelevant fiction and a dangerous conceit.

Hayek's discussion of the idea of progress contrasts evolutionary, unplanned progress--spontaneous order--with attempts to deliberately design the "good world." Socrates said that "the recognition of ignorance is the beginning of wisdom." Hayek develops this Socratic idea as it relates to human institutions, economic action, and progress. For Hayek, a central human problem is how to coordinate human actions and pursue human ends in a context of ignorance. His writings on progress and knowledge anticipate the more contemporary works of Aaron Wildavsky (Searching for Safety) and Peter Huber (Liability) when he suggests that "progress depends on accidents." They also anticipate the great modern debates about urban planning and environmentalism.

The Constitution of Liberty offers a compact history of politics and Western political theory; of economics; of public policy; and of that oft-neglected concept, morality. Unlike some of his other renowned works, such as the three-volume Law, Legislation, and Liberty, The Constitution of Liberty is eminently accessible. Hayek reminds us that arguments for limited government are best cast in terms of fundamental principles about human progress and moral values. Arguments about the inexpedience of particular government measures simply reinforce the idea that "the desirability or undesirability of a particular measure could never be a matter of principle but is always one of expediency."

Lynn Scarlett is vice president for research at the Reason Foundation.