Boom and Gloom

Accentuating the negative in an age of plenty.


Every party needs a pooper, even the overwhelmingly robust and upbeat shindig that is end-of-millennium America. In the broadest terms possible, things are going pretty damn well, and the future looks pretty bright, too.

The economy has been nothing less than miraculous. Inflation appears relatively stable, while buying power and overall compensation for virtually all segments of the population are growing. According to the Census Bureau, poverty is at its lowest rate in 20 years, and median household incomes are at an all-time high. Home ownership hovers at the historic high of around 67 percent while unemployment shrinks below supposedly "frictional" levels–despite an influx of welfare recipients once thought to be unemployable. Between 1991 and 1997 (the latest year for which there are full data), the total crime rate dropped almost 17 percent, while violent crime reached its lowest point in a decade. More kids than ever are going on to college (two-thirds, according to American Demographics magazine), and the birthrate for unmarried teens is dropping steadily.

So what's the response of America's political and intellectual classes to such generally good times? A sinking feeling, mostly, and a somewhat desperate search for the mist-shrouded iceberg that will sink us all.

To wit, a September 13, 1999, article in National Review about sex and teenagers: "For the first time in decades, more than half of America's high-school students described themselves as virgins," writes Amy H. Holmes of the Independent Women's Forum. "The rate of sexual activity among 17-to-19-year-old boys in urban areas declined from 75 percent in 1979 to 68 percent in 1995. Over the same period, the proportion of young men who approved of nonmarital sex fell from 80 percent to 71 percent." After granting that "the statistics may be improving," Holmes nonetheless asserts that "sexual practices among teenagers are becoming increasingly dehumanizing," a conclusion supported only by a few dubious anecdotes about ostensibly disturbing new trends that would hardly seem novel to anyone who has come of age since the publication of Tropic of Cancer.

Declinists can barely stomach the idea that anything other than imminent ruin awaits the country unless their prescribed course of action is enacted, and fast; progress is inevitably measured not in terms of how far we've come but only by who or what has been left behind. Hence, presidential hopefuls Pat Buchanan and Bill Bradley continue to hammer away at themes such as corporate downsizing and the export of "good-paying" jobs overseas. These complaints were just barely plausible in the low-growth early '90s and are simply nonissues in today's economy.

Meanwhile, left-leaning critics such as Susan Faludi bemoan "union-breaking" as an indicator of an ugly new world of workplace dislocation. Current union membership is 14 percent of the work force, down from a high of around 30 percent in the mid-'50s. Such numbers actually reflect not some new wave of sinister Pinkerton-like activity but a shift away from the assembly-line drudge work that gave rise to unions in the first place. But you wouldn't know it from the self-styled defenders of the working man.

On the right, the focus is less on imagined economic despair and dislocation than on perceived and potentially irreversible moral dissipation. "Even during a time of record prosperity, many Americans believe that something has gone wrong at the core," writes former drug czar William J. Bennett in his updated edition of The Index of Leading Cultural Indicators, which documents that the '90s have in fact been something other than a headfirst slide into anarchy. In a Wall Street Journal article, fellow conservative Robert Bork raised his eyebrows at what he called Bennett's "new-found optimism about the direction of American culture" and contended that there is "no excuse for overlooking the very real and degenerate state of much of our politics and culture."

Why do such folks feel the need to accentuate the negative so relentlessly, often to the exclusion of any positive development? For politicians, especially those not yet in power, the impetus is fairly obvious and transparently self-interested. Crisis and dislocation are an office seeker's best friend, as a beleaguered Al Gore no doubt understands better than anyone else seeking the White House.

Back during the 1992 presidential campaign, Gore benefited from just such a strategy when he and Bill Clinton argued passionately–if ludicrously–that the U.S. economy was in its worst shape since the Great Depression. Now, as the number-two man in an administration that has coincided with a historic economic expansion, the vice president seems unsure of how to use such good news to his own advantage.

For many intellectuals–a term that covers a wide range of social critics, commentators, and writers of varying degrees of sophistication–the reasons for persistent declinism are more varied and less clear-cut. Intellectuals tend to define themselves against the vulgar crowd, and there is a long tradition in America of self-declared alienation from the mainstream they seek to influence. "I feel I am an exile here," wrote Herman Melville, a line that continues to resonate with figures as disparate as the Christian right's Paul Weyrich and the far left's Noam Chomsky. If the American people tend to be optimistic about the future–and all indications suggest they are–then by definition an intellectual must offer a darker vision of things, or else risk being unmasked as unworthy of the designation.

That oppositional stance is compounded by what historian Richard Hofstadter identified in Anti-Intellectualism in American Life (1963) as the self-defeating "hunger…for special deference" evinced by many intellectuals in a democratic society. Even as they recognize the relative freedom and opportunity afforded them by an open society, argued Hofstadter, intellectuals want to be appreciated as a breed apart. In a relatively egalitarian society, it has never been easy for intellectuals to gain and maintain special status, and many have resented having to sell their thoughts in a marketplace of ideas not markedly elevated from the one for consumer goods and services.

While at times captivated by an "expert"–whether a Harvard professor or someone like Ross Perot–with a perfect plan, Americans more often evince a healthy skepticism toward authority of all sorts, a tendency that has grown stronger over the past few decades. After all, generally wealthier and better-educated individuals feel more and more comfortable asserting control over their own lives in all their particulars. Like doctors, priests, and even stockbrokers, politicians and intellectuals have an increasingly weak hold over the public's mind. They are likely to confuse their relative decline in status with a larger social breakdown, rather than understanding it as a decentralization of power toward individuals.

Such a shift does not destroy political or intellectual authority. It alters the dynamic between professor and student, so to speak, to one between equals–a leveling trend that, come to think of it, may be one of the reasons things have been going so well.