Harold Pinter recently won the Nobel Prize in Literature. Trent Lott will never win that award. But the two have something in common. Pinter's characters avoid discussing the things that matter most, so their silences say more than their words. Lott's new autobiography, Herding Cats, starts and ends with the incident that prompted its publication, but in between it offers barely a grunt about the issue underlying that episode.
At a 2002 celebration of Sen. Strom Thurmond's 100th birthday, Lott said his state was proud to have backed Thurmond's 1948 presidential bid. "And if the rest of the country had followed our lead," he added, "we wouldn't have had all these problems over all these years, either."
Thurmond had run as a segregationist, and critics accused Lott of wishing for an alternative American history without integration. Actually, he was just trying to flatter a dying old man. He had used similar words in 1980, but few noticed at the time since he was just a mid-level member of the House minority. Now he was the top Senate Republican and was about to regain the title of majority leader. Moreover, he spoke while the C-SPAN cameras were on, which would enable newscasts to replay his fateful words over and over. Unhappily for him, critics linked the comment to his roots in Mississippi's muddy past, and he had to quit his leadership post.
In this book, Lott seeks to rehabilitate himself. He recounts his astonishment at his critics, saying that he has employed African-American aides since 1975. "So how could they accuse me of being callous to inequality?" he asks.
This rhetorical question could have led to a probing analysis of Lott's relationship to his home state, which has the nation's highest share of African Americans. Yet for most of the book, he maintains a Pinteresque silence about race.
Lott briefly mentions that he was attending the University of Mississippi when James Meredith became its first black student. White students protested, but Lott recalls urging his Sigma Nu fraternity brothers to shun violence. In a moment of seeming candor, he notes that his role was mainly passive. "I must confess that I was in the ranks of the clueless," he writes. "I believed then that segregation was wrong and that it was cruel, but we were living in a world our ancestors had created for us."
Call it uncandid candor. In a 1997 interview, he had a different memory: "Yes, you could say that I favored segregation then. I don't now." After the Thurmond incident, Time's Karen Tumulty reported that Lott had been more than passive. At a national convention of Sigma Nu, she wrote, he had fought admitting blacks to any of its chapters. Lott does not give his version of that story in Herding Cats.
Lott is equally dodgy about his years as chief aide to William Colmer, the Mississippi Democrat who chaired the House Rules Committee. "Like Strom Thurmond," Lott says, "he was well past his firebrand days, and had become more moderate on civil rights issues as each year passed." Since Colmer still opposed civil rights bills, it is hard to tell what Lott means by "moderate." In 1969, according to The New York Times, Lott drafted a letter for Colmer lamenting "enactment of legislation unduly favorable to the Negro race."
On Colmer's retirement in 1972, Lott successfully ran for his seat as a Republican. After 16 years in the House, he succeeded arch-segregationist John Stennis in the U.S. Senate. He says he was "ecstatic" to win 13 percent of the black vote in that election. "It was the start of a slow march of African Americans into the Republican ranks," he writes. Slow indeed. Lott neglects to mention his 2000 re-election, when he got 11 percent of the black ballots. He won the race by getting 88 percent of the white vote.
On Capitol Hill, Lott developed a reputation as a shrewd student of process and personality. He is the only man in history to serve as party whip in both chambers. In 1996, when Bob Dole resigned to devote his full time to his presidential race, Lott became majority leader.
With this background, Lott should have written a memoir rich in detail and insight. Instead, Herding Cats recycles familiar war stories and contains an annoying number of factual errors. It botches the names of fellow members and the years in which they served in Congress. It confuses the 1986 tax reform with the 1982 tax increase, and it even blames President Reagan for proposing a separate Department of Education. (That dubious honor belongs to President Carter.) During the years the book covers, Congress dealt with racial busing, voting rights, and affirmative action. Lott has nothing to say about these topics. Race disappears from the book until he gets back to the Thurmond story that frames his narrative.
Lott tells of his damage control efforts, recalling that he went on Black Entertainment Television "and groveled through another confession." He admits that the interview was a mistake but says that "you do, and say, strange things when you're desperate." It was worse than he lets on. When BET interviewer Ed Gordon asked whether he supported affirmative action, he said, "Absolutely across the board." He also promised to reconsider his support for judicial nominee Charles Pickering. The man had good character, Lott stammered, "But, having said that, you know, I'll have to weigh all my actions differently and more carefully."
The scene was worthy of Koestler or Orwell. One could picture Lott continuing: "Okay, Mister Gordon, if you hold up four fingers and say there are five, I will see five. I absolutely abjure all my thought crimes. And I love Big Brother!"
At first blush, it seems hard to explain Lott's performance. Surely he was not worrying about black support at home, since he had so little to begin with. And it's not as if endorsing affirmative action was going to bring him a deluge of white support. Republicans are supposedly the masters of the "race card," so how did Lott pull a joker? And why does he sidestep racial issues in his book?
To understand what happened to Lott, remember that he is not alone. A great unreported story in American politics is the silencing of overt debate on racial issues. When was the last time a major GOP politician came out squarely against racial preferences? After the Supreme Court upheld discriminatory admissions policies in the 2003 decision Grutter v. Bollinger, few Republicans had anything critical to say.
While Republicans have little chance at the black vote, they do hope for a share of Latinos. Although some surveys suggest otherwise, they think an attack on preferences would scuttle their Latino prospects. They also face pressure from business, which has surrendered on the issue. Discrimination in the name of diversity helps executives avoid protests and boycotts, and they cringe at the idea of revisiting the question.
The most potent cause of the Republicans' silence is fear of the "racist" label. Liberal Democrats have seized every possible opportunity to link Republicans to bigotry. In most cases, the charge is baseless, but there's a catch. Only a few decades ago, Democratic segregationists such as Strom Thurmond and Jesse Helms did indeed move over to the GOP. As long as such people held high positions in the Republican Party, Democrats could use them as rhetorical battering rams. (Yes, there is a double standard. As Republicans like to point out, former Klansman Robert Byrd is the ranking Democratic senator. But he has escaped the wrath of civil rights organizations by moving left over the years.)
Lott is a transitional figure. Although the worst of segregation was over by the time he entered Congress, he did not free himself from the past. For a long time, he had a warm relationship with the Council of Conservative Citizens, the successor to the racist Citizens Councils. In 1981 he filed a brief in a Supreme Court case concerning Bob Jones University. The IRS had revoked its tax-exempt status because it forbade interracial dating. Lott argued that "racial discrimination does not always violate public policy."
During the Thurmond furor, the liberal blogosphere used those examples to make Lott a symbol of GOP racism. By defending him, Republicans feared, they might expose themselves to the same accusation. With his friends going silent and his enemies getting louder, he knew he had to give up the leadership.
Lott's critics engaged in Pinteresque silence of their own when they omitted a key line of his Bob Jones brief: "Schools are allowed to practice racial discrimination in admissions in the interest of diversity." For backers of affirmative action, that statement remains uncomfortably accurate. In Herding Cats, Lott could have made a strong case for colorblind equality by revising his argument. As long as we allow one form of discrimination, he could have said, we foster other forms. Instead of using the comparison to back up Bob Jones, he could have used it to condemn racial preferences.
As Campaign 2004 entered its home stretch, we asked a variety of policy wonks, journalists, thinkers, and other public figures in the reason universe to reveal for whom they are voting this fall, for whom they pulled the lever last time around, their most embarrassing presidential vote, and their favorite president of all time. Their answers, as of late August, follow.
—The Editors
Contributing Editor Bagge is best known as author of the alternative comic book Hate.
2004 vote: If it looks like my home state could go either way by Election Day, I'll vote for John Kerry. Otherwise I'll vote for the Libertarian Party's candidate, Michael Badnarik. That's been my M.O. every election year, since the Democratic candidate usually strikes me as the lesser of two evils (if not by much).
2000 vote: Harry Browne.
Most embarrassing vote: Every time I've voted for a major-party candidate I've felt embarrassed. I vaguely recall voting in '88 for Michael Dukakis, whose only positive attribute was that his last name wasn't Bush (as is the case with John F. Kerry).
Favorite president: George Washington, for actually refusing to assume as much power as he could have gotten away with. I can't think offhand of another president about whom that could be said.
Bailey is Reason's science correspondent.
2004 vote: I'm undecided between Republican George W. Bush and Libertarian Michael Badnarik. Bush has been a great disappointment. But Kerry will be even worse—raising taxes, overregulating, and socializing more of medicine. What to do?
2000 vote: George W. Bush. I couldn't possibly have voted for Gore since he dislikes me personally. Besides, I was presciently worried (you can ask my wife) about a popular vote/electoral vote mismatch.
Most embarrassing vote: George McGovern, 1972. I was 18 and thought I was a socialist.
Favorite president: George Washington. The man spurned being made king and stepped peacefully down from office.
Barlow is a songwriter for the Grateful Dead and other bands, the co-founder and vice chair of the Electronic Frontier Foundation, and a Berkman Fellow at Harvard Law School.
2004 vote: I'm voting for John Kerry, though with little enthusiasm. This is only because I would prefer almost anything to another four years of George W. Bush. I don't believe the Constitution, the economy, or the environment can endure another Bush administration without sustaining almost irreparable damage.
2000 vote: John Hagelin of the Natural Law Party. I discovered, in the voting booth, that a friend of mine was his vice presidential candidate. I couldn't bring myself to vote for Bush, Gore, or Nader and had intended to cast no presidential vote.
Most embarrassing vote: I'm embarrassed for my country that in my entire voting life, there has never been a major-party candidate whom I felt I could vote for. All of my presidential votes, whether for George Wallace, Dick Gregory, or John Hagelin, have been protest votes.
Favorite president: Jefferson, who defined, in his works and in his person, just about everything I love about America.
Bovard is author of The Bush Betrayal (Palgrave Macmillan) and seven other books.
2004 vote: I will probably vote for Badnarik, the Libertarian Party candidate. Both of the major-party candidates brazenly flaunt their contempt for the U.S. Constitution. Regardless of who wins in November, the U.S. likely will have a lousy president for the next four years.
2000 vote: I abstained.
Most embarrassing vote: I voted for Gerald Ford in 1976. He was not that embarrassing, compared to Jimmy Carter. And compared to George W. Bush, Ford was verbally graceful.
Favorite president: It might be a coin toss between Washington and Jefferson.
Washington set a magnificent example of self-restraint, protecting the new nation from both his own power lust and unnecessary wars (despite foolish popular demands). Jefferson masterfully reined in the federal government from the tyrannical Alien and Sedition Act persecutions that John Adams launched.
Brand is the founder of the Whole Earth Catalog and the Long Now Foundation. He is the author of, among other books, The Media Lab (Viking) and How Buildings Learn (Penguin).
2004 vote: Kerry. He's knowledgeable enough and appears to do well in crunches. He has the skills and connections to begin to undo the damage of the Bush years. He's highly ambitious, which is fine with me. And he personally killed a man who was trying to kill him and his crew. You could also say he personally attacked a government that was trying to kill his generation. Those actions take sand. Too bad they won't come up in the debates.
2000 vote: Al Gore.
Most embarrassing vote: Lyndon Johnson.
Favorite president: Theodore Roosevelt for a fine blend of intellect and zzzzzest, tied with Bill Clinton for the same reasons.
Carey stars in Drew Carey's Green Screen Show, beginning October 7 on the WB.
2004 vote: Quit pretending that it matters, would you? Can you vote for all the nefarious cabals that really run the world? No. So fuck it.
2000 vote: I voted Libertarian, for all the good it did me.
Most embarrassing vote: Is it considered embarrassing to cast a vote out of principle for someone you know doesn't have a snowball's chance of winning? Oh, OK. Then they're all embarrassing.
Favorite president: Andrew Jackson, because he's what a lap dance costs (and because, ironically, he opposed having a National Bank).
Cavanaugh is Reason's Web editor.
2004 vote: Michael Badnarik, because he's Not Bush either.
2000 vote: Ralph Nader.
Most embarrassing vote: Dukakis in 1988. I thought he looked cool in that tank!
Favorite president: If we can't count John Hanson, then Warren G. Harding; would that they could all achieve so little.
Chapman is a columnist and editorial writer for the Chicago Tribune.
2004 vote: I haven't decided between John Kerry and Michael Badnarik. I have only the dimmest hopes for a Kerry presidency, but I think Bush has to be held accountable for Iraq, the worst foreign policy blunder since Vietnam, and the accelerated growth of the federal government.
2000 vote: Harry Browne, in keeping with my usual (though not automatic) practice of voting for the Libertarian presidential nominee.
Most embarrassing vote: Richard Nixon, in my first election, 1972, an experience that helped estrange me permanently from the Republican Party.
Favorite president: Thomas Jefferson, who took great and justified pride that as president, he eliminated internal taxes and avoided war, and who peacefully doubled the size of the young nation.
Senior Editor Doherty is author of This Is Burning Man (Little, Brown).
2004 vote: I am a principled nonvoter. If I were forced to vote at gunpoint, I'd pick the Libertarian Party's Michael Badnarik, whose views on the proper role of government most closely resemble mine.
2000 vote: I did not vote. Those who vote have no right to complain.
Most embarrassing vote: I've been saved the embarrassment of ever having to feel any sense of responsibility, of even the smallest size, for the actions of any politician.
Favorite president: In their roles as president, I can't be an enthusiastic fan of any of them, but for his role in crafting the Constitution, a document that held some (unrealized) promise to limit government powers, James Madison.
Epstein is a professor of law at the University of Chicago and author, most recently, of Skepticism and Freedom: A Modern Case for Classical Liberalism (University of Chicago).
2004 vote: I don't know who the Libertarian candidate is this time, but you can put me down as voting for him; anyone but the Big Two. As far as I can tell, the debate thus far has borne no relation to the important issues facing the nation…except Vietnam. It's just two members of the same statist party fighting over whose friends will get favors.
2000 vote: I can't remember.
Most embarrassing vote: Since I don't remember who I vote for from one election to the next, it's hard to say. I suppose Richard Nixon in '72, though that doesn't mean I'd want to have voted for George McGovern either.
Favorite president: I'm certainly a Calvin Coolidge fan; he made some mistakes, but he was a small-government guy.
Freund is a senior editor at Reason.
2004 vote: I'm still thinking about it.
2000 vote: Harry Browne.
Most embarrassing vote: Andre Marrou.
Favorite president: I have no favorite president.
Contributing Editor Garvin, author of Everybody Had His Own Gringo: The CIA and the Contras (Brassey's), writes about television for The Miami Herald.
2004 vote: I live in Florida. My votes are randomly assigned based on the interaction of our voting machines, the Miami-Dade Election Commission, and passing UFOs.
2000 vote: See above.
Most embarrassing vote: My presidential record is solid. However, I once cast a write-in ballot for Dynasty's Joan Collins for Congress. It was an immature act, insulting to America's democratic institutions, and I regret it. Upon reflection, China Beach's Dana Delany would have been a more deserving choice.
Favorite president: William Henry Harrison caught pneumonia while delivering his inaugural address, lay in bed barely conscious for six weeks, and then died, his presidency having done hardly any damage to the country.
George is a New York Post columnist, West Indian Catholic stand-up comic, and recovering Republican flunky.
2004 vote: Living in a maximum blue state allows me to vote my conscience. Rather than vote for an entitlement-expanding, tariff-imposing, deficit-increasing, big-government Johnson Republican or an entitlement-expanding, tariff-imposing, tax-increasing, big-government Nixon Democrat, I will vote for Libertarian Party candidate Michael Badnarik.
2000 vote: Harry Browne. Bush's last-second evasiveness on his DUI arrest was too reminiscent of the slippery tongue of the guy about to leave office. Bob Jones University didn't help either.
Most embarrassing vote: Never actually did it, but had I been a citizen the year I turned 18 (1980), I would have (gulp!) voted for Jimmy Carter. It's the secret shame I have carried around for decades.
Favorite president: Abraham Lincoln, for proving that it's best to keep the band together despite many years of creative differences. (Had the Beatles followed his example, the world would have been a much better place.)
Gillespie is editor of Reason and of the new anthology Choice: The Best of Reason (BenBella Books).
2004 vote: Probably no one but maybe Badnarik, if only to register dissent from the Crest and Colgate parties.
2000 vote: Harry Browne, I think, but possibly no one.
Most embarrassing vote: In 1984, the first time I could vote for president and the only time I've voted for a major-party candidate, I cast a ballot for Walter Mondale. I empathize with complete losers, and a guy whose only memorable campaign line—"Where's the beef?"—came from a Wendy's commercial (not even a McDonald's spot!) was a loser of historic proportions.
Favorite president: Richard Nixon, who has done more in my lifetime than any other U.S. pol to discredit the idea that government should wield massive and unexamined power over citizens.
Contributing Editor Godwin is legal director of Public Knowledge.
2004 vote: Kerry. Let's put it this way: After four years of Bush, the Republican Party has become an example of a political machine out of control. Everything they used to decry—reckless foreign intervention, fiscal irresponsibility, deficit spending—they now represent. You don't have to love Kerry or the Democrats to think it's time for a change. Worst case, at least we'd get divided government for four years.
2000 vote: Gore, only because as a Texan I already had strong reservations about George W. Bush.
Most embarrassing vote: Not a one.
Favorite president: I have a lot of fondness for Theodore Roosevelt: In him you had a strong, articulate president who never thought he lost manhood points by being pro-environment. I also like Eisenhower; he presided over a strong American response to a very polarized East-West world.
Hentoff, a nationally syndicated columnist, writes regularly for both the Village Voice and The Washington Times. An expanded paperback edition of his book The War on the Bill of Rights and the Gathering Resistance (Seven Stories Press) will be released this fall.
2004 vote: I'm not voting for anyone at the top of the ticket. I can't vote for Bush, who supports Ashcroft's various "revisions" to the Bill of Rights, since our liberties are what we're supposed to be fighting for. As for Kerry, I think he's an empty suit: How much time did he give his years in the Senate in his convention speech, about 40 seconds?
2000 vote: I voted for Nader last time. But he wants to pull the troops out of Iraq, which would lead to the kind of state of nature Thomas Hobbes described; it would be disastrous. He's also become part of the bash-Israel crowd, and to get on ballots he's been cooperating with Lenora Fulani, who has been accused of harboring anti-Semitic biases.
Most embarrassing vote: Well, I didn't mind voting for Nader in 2000, because Gore had a whole series of empty suits during that campaign, and I didn't think much of Bush either. I can't think of any votes I'm particularly embarrassed about.
Favorite president: FDR. He could have done much more to help the victims of the Holocaust, but he did act decisively (if trickily) to take us into the war, which was essential. Otherwise we'd all be speaking German. And as Cass Sunstein has pointed out, FDR was the one who laid out a "second bill of rights," with economic freedoms like a right to decent housing.
Higgs is a senior fellow in political economy at the Independent Institute and author, most recently, of Against Leviathan (Independent Institute).
2004 vote: I never vote. I don't wish to soil my hands.
2000 vote: Had I been forced to cast a ballot for president in the 2000 election, I might have died of septicemic disgust.
Most embarrassing vote: I voted only once in a presidential election, in 1976, and I did so on that occasion only so that I could irritate my left-liberal colleagues at the University of Washington by telling them that I had voted for "that idiot" Gerald Ford.
Favorite president: Grover Cleveland, because he, more so than any of the others, acted in accordance with his oath to preserve and protect the Constitution, despite great pressures to act otherwise.
Jillette is the larger, louder half of the comedy/magic team Penn & Teller and star of Showtime's Penn & Teller: Bullshit!
2004 vote: I'm undecided (always the stupidest position). I might do the moral thing and not vote at all, or do the sensible thing and vote Libertarian (Badnarik, right?), or I might make 100 bucks from my buddy Tony and vote for Bush. (I told Tony that Bush and Kerry were exactly the same, and he bet me 100 bucks that I didn't believe that enough to really truly vote for Bush.) But if you want to be pragmatic, I'm in Nevada, so who cares?
2000 vote: Harry Browne!
Most embarrassing vote: I must have voted Republicrat at least once, but voting is secret—the Founding Fathers didn't want us to be embarrassed by our evil pasts.
Favorite president: Teller (he's president of Buggs and Rudy Discount Productions [Penn & Teller's company]), because he can lie without saying a word.
Kopel is research director of the Independence Institute in Golden, Colorado.
2004 vote: George Bush. This will be the first election in which I have ever voted for a Republican for president. We're in a war in which the survival of civilization is at stake, and Bush is the only candidate who realizes the gravity of the danger we face and who is determined to win World War IV.
2000 vote: Ralph Nader.
Most embarrassing vote: Harry Browne, 1996.
Favorite president: George Washington, for leading the nation through extremely perilous times, and for setting the highest standard of personal conduct and patriotic leadership.
Contributing Editor McClaughry, a senior policy adviser in the early Reagan White House, is president of the Ethan Allen Institute in Vermont.
2004 vote: George W. Bush. Unlike his opponents, he at least understands that only America can defeat militant Islam by a combination of military force and the ideology of freedom. At home, his recent advocacy for "a new era of ownership" promises the only way out of statist stagnation.
2000 vote: Bush.
Most embarrassing vote: Nixon, 1972.
Favorite president: Jefferson. He respected the Constitution, shrank government, slashed taxes, paid cash for Louisiana, hammered the Algerine pirates, reined in the judiciary, and began to extinguish the national debt.
Contributing Editor McCloskey teaches economics, history, English, communications, and anything else they tell her to teach at the University of Illinois at Chicago.
2004 vote: Can you believe this? The last time I voted for anyone but a Libertarian was for George McGovern in 1972, against the war. This time, another Democrat gets it—though as an economist I know it's irrational to vote at all.
2000 vote: For whomever the Libertarian Party candidate was. You expect me to remember?
Most embarrassing vote: Lyndon Johnson in 1964, my very first vote. Because Goldwater was scary. Did I know scary?
Favorite president: Speaking of scary, the office is. So the least effective: Warren Harding.
McElroy is a Fox News columnist, the editor of ifeminists.net, and a fellow at the Independent Institute.
2004 vote: I'm voting for No One for at least three reasons: 1) As a Canadian, I am spared the insulting process of punching a ballot to express which power glutton should prevail; 2) as an anarchist, I refuse to legitimize the process that puts anyone in a position of unjust power over people's lives; and 3) as a practical matter of value returned for effort, the time is better spent enjoying family or working.
2000 vote: No One. I admit to being so anti-Clinton and so appalled by the prospect of political correctness continuing that I uttered out loud "anyone but Gore"—which, in practical terms, meant Bush. Those famous last words are right up there with Socrates saying, "I drank what?"
Most embarrassing vote: I have never voted in a political proceeding. But when I first became a libertarian while living in California, I did support the Libertarian Party candidate. This would be more embarrassing if I had not learned from my mistake. The lesson: It is not the particular man in power that I oppose but the power itself, which is unjust. As a matter of logic, if nothing else, I cannot oppose the office as illegitimate while waving a straw hat and yelling, "Elect my man to it!"
Favorite president: Thomas Jefferson in his first term because he did less harm than any other president.
Murray is W.H. Brady Scholar at the American Enterprise Institute and author, most recently, of Human Accomplishment (HarperCollins).
2004 vote: Reluctantly—very reluctantly—George Bush. I find the Democrats so extremely obnoxious that I have to vote against them, and I can't do that voting Libertarian.
2000 vote: Harry Browne.
Most embarrassing vote: For Bob Dole in 1996, because I found Bill Clinton so extremely obnoxious that I had to vote against him. Probably my 2004 vote will be my second most embarrassing ballot.
Favorite president: Gotta be George Washington. He was acutely conscious that everything he did would be a precedent, and just about every choice he made was right.
O'Rourke is H.L. Mencken Research Fellow at the Cato Institute and author, most recently, of Peace Kills (Atlantic Monthly Press).
2004 vote: George W. Bush, because I don't want Johnnie Cochran on the Supreme Court.
2000 vote: George W. Bush. (I always vote Republican because Republicans have fewer ideas. Although, in the case of George W., not fewer enough.)
Most embarrassing vote: A 1968 write-in for "Chairman Meow," my girlfriend's cat. It seemed very funny at the time. As I mentioned, this was 1968.
Favorite president: Calvin Coolidge—why say more?
Paglia is a professor of humanities and media studies at the University of the Arts in Philadelphia.
2004 vote: John Kerry. In the hope that he will restore our alliances and reduce rabid anti-Americanism in this era of terrorism when international good will and cooperation are crucial.
2000 vote: Ralph Nader. Because I detest the arrogant, corrupt superstructure of the Democratic Party, with which I remain stubbornly registered.
Most embarrassing vote: Bill Clinton the second time around. Because he did not honorably resign when the Lewinsky scandal broke and instead tied up the country and paralyzed the government for two years, leading directly to our blindsiding by 9/11.
Favorite president: John F. Kennedy. Not that he accomplished much. But he was the first candidate I campaigned for as an adolescent, and I still admire his articulateness and vigor. The Kennedys gave the White House sophistication and style.
Pinker is Johnstone Professor of Psychology at Harvard and author of The Blank Slate (Penguin), How the Mind Works (W.W. Norton), and The Language Instinct (HarperCollins).
2004 vote: Kerry. The reason is reason: Bush uses too little of it. In the war on terror, his administration stints on loose-nuke surveillance while confiscating nail clippers and issuing color-coded duct tape advisories. His restrictions on stem cell research are incoherent, his dismissal of possible climate change inexcusable.
2000 vote: Gore, with misgivings.
Most embarrassing vote: I left Canada shortly after turning 18 and became a U.S. citizen only recently, so I haven't voted enough to be too embarrassed yet.
Favorite president: James Madison, for articulating the basis for democracy in terms of the nature of human nature.
Contributing Editor Pitney is a professor of government at Claremont McKenna College and author of The Art of Political Warfare (University of Oklahoma Press).
2004 vote: I'm voting for Bush. He cut taxes. Kerry would raise them.
2000 vote: Bush. Former reason Editor Virginia Postrel put it well: "Bush is a mixed bag. But I think Al Gore is the devil."
Most embarrassing vote: I'm comfortable with my presidential votes in general elections. In 1980, though, I hesitated to back Reagan in the primaries because I didn't think he would win in November. Oops.
Favorite president: Abraham Lincoln preserved the Union and wrote some of the most beautiful prose that ever came from an American pen.
Poole is the founder, former president, and current director of transportation studies at the Reason Foundation. He served on the Bush-Cheney transportation policy task force during the 2000 campaign and on the administration's transition team.
2004 vote: Bush, reluctantly, despite his troubling expansions of the federal government and threats to civil liberties. The alternative is simply worse. We elect not an individual but a consulting team, and we'll have a far better team in place with Bush than with Kerry.
2000 vote: Bush.
Most embarrassing vote: Richard Nixon in 1968. What a disaster! But at least he kept his promise to eliminate the draft.
Favorite president: Thomas Jefferson, despite his flaws, for his intellect, limited government philosophy, and strong support for separation of church and state.
Rauch is a correspondent for The Atlantic and a columnist for National Journal.
2004 vote: I'll recuse myself from this…I never tell my vote. A journalist thing.
2000 vote: See above.
Most embarrassing vote: See above.
Favorite president: Lincoln. All the usual reasons. Favorite prez of past 40 years: Bush 41. Beats Reagan and everybody else hands down.
Rennie is editor-in-chief of Scientific American.
2004 vote: John Kerry. Anybody who has seen Scientific American's editorials during the last few years knows we're deeply unhappy with the de facto anti-scientism of the current administration. Science shouldn't trump all else in setting policy, but it would be a nice change of pace for a White House to put science ahead of ideology again. Of course, I'm keeping my expectations low.
2000 vote: Remember that guy? The one that everybody said claimed to have invented the Internet, except he hadn't said that at all? He seemed good.
Most embarrassing vote: Back in college in 1980, flushed with youthful sanctimony, I voted for John Anderson. The voting booth is a bad place to be an idealist. But at least when I threw my vote away on a third-party candidate, it was irrelevant.
Favorite president: John Quincy Adams showed that it was possible for the son of a president to rise to that same office in a highly disputed election without being remembered as a dangerous embarrassment.
Reynolds, a professor of law at the University of Tennessee, publishes InstaPundit.com.
2004 vote: Most likely George Bush, and for one reason: the war. I'm having trouble trusting Kerry on that.
2000 vote: Harry Browne.
Most embarrassing vote: Dukakis, '88.
Favorite president: Calvin Coolidge, who knew his limitations.
Along with his partner Jane Metcalfe, Rossetto started Wired.
2004 vote: Bush may be wrong about everything else, but he is right about the issue that matters most for my children's future: stopping Islamic fascism. And Manchurian candidate Kerry and the Copperheads, er, Democrats, are just a joke, preferring to act as though this probably generation-spanning war is about politics, not the survival of the West.
2000 vote: I am proud to say that I have never voted for president in my life, despite having eight opportunities to do so, and really resent being forced to do so now.
Most embarrassing vote: This one—but the alternative of not voting and allowing a billionaire currency speculator like George Soros to pick the next U.S. president is too dire to contemplate.
Favorite president: Chauncey Gardiner, from Being There. He was even less verbose than my next favorite president, Calvin Coolidge.
Sanchez is Reason's assistant editor.
2004 vote: Kerry would get my vote if I didn't live in the District of Columbia, but the prospect of raising his total from 94.0001 percent to 94.0002 percent isn't quite enough to lure me to the polls. Badnarik is embarrassing, and Bush is so egregiously dishonest and destructive that even electing a mannequin like Kerry is an acceptable price of ousting him.
2000 vote: I didn't vote, though (to my shame, in retrospect) I was optimistic about G.W. Bush after the convention speeches, where all that focus on Social Security reform, educational choice, and "humble" foreign policy led me to think he might do some net good.
Most embarrassing vote: I've managed to spare myself that particular breed of embarrassment by not voting. Blessed are the apathetic, for they get the better even of their political blunders.
Favorite president: Grover Cleveland, who vetoed a popular agricultural assistance bill with the phrase, "Though the people support the government, the government should not support the people."
Shafer writes the Press Box column for Slate.
2004 vote: Who is the Libertarian candidate this year? That's who. Because I'm a yellow dog Libertarian.
2000 vote: Who was the Libertarian candidate that year? That's who.
Most embarrassing vote: I've never been embarrassed in the slightest by my presidential ballot.
Favorite president: Richard Nixon, because he's the gift that keeps giving.
Shermer is publisher of Skeptic magazine, a monthly columnist for Scientific American, author of The Science of Good and Evil (Henry Holt), and a bicycling enthusiast.
2004 vote: John Kerry. I'm a libertarian, but in 2000 I voted my conscience under the assumption that it probably didn't matter who won between Bush and Gore (Tweedledee and Tweedledum when compared to Browne), and I was wrong. It did matter. The world situation is too precarious and too dangerous to flip a coin, the Libertarian candidate cannot win, Bush's foreign policy is making the world more dangerous and more precarious rather than less, and Kerry has a good chance to win and an even better chance to improve our situation. Most important, he's a serious cyclist who wears the yellow "LiveStrong" bracelet in support of Lance Armstrong's cancer foundation and Tour de France win.
2000 vote: Harry Browne, because like the Naderites on the other end of the spectrum I voted my conscience.
Most embarrassing vote: Richard Nixon, 1972, my first presidential vote cast, just out of high school. My poli-sci profs the next several years of college regaled us with daily updates about Watergate. Ooops…
Favorite president: Thomas Jefferson, because 1) he was a champion of liberty, 2) he applied scientific thinking to the political, economic, and social spheres, and 3) when he dined alone at the White House there was more intelligence in that room than when John F. Kennedy hosted a dinner there for a roomful of Nobel laureates.
Sirius, former editor-in-chief of Mondo 2000, edits NeoFiles at life-enhancement.com/NeoFiles and is author of, most recently, Counterculture Through the Ages: From Abraham to Acid House (Random House).
2004 vote: I can't bring myself to say it. I'm voting for the only guys who stand a chance of replacing the complete insanity of the Bush administration and returning us to the ordinary consensus madness we had come to know so well. You know, John and John.
2000 vote: Even though I ran as a write-in candidate myself, I wound up voting for Nader because I thought he gave such rousing and impressive speeches. I wouldn't actually want him to be president though. He's way too puritanical. Did anybody notice that he joined in on the Janet Jackson nipple crisis? Instead of objecting from a religious point of view, he objects from the view that corporate media are "spewing filth" into our environment. The health fascists are everywhere.
Most embarrassing vote: Ralph Nader in 2000. First of all, some other people actually voted for me. My insincerity is justifiable only in a dadaist context, which I therefore proclaim. And secondly, it encouraged Nader, who is now clearly addicted to the run.
Favorite president: It's difficult to rate the quality of an 18th-century president's decisions at this distance, but I choose Thomas Jefferson for eloquently elucidating many of the ideas and attitudes of the Age of Reason.
Smith is chairman of the Federal Election Commission.
2004 vote: That's one an election commissioner better not answer; we're not supposed to engage in partisan activities.
2000 vote: I don't want to answer that one either.
Most embarrassing vote: I'm too smart to cast embarrassing ballots.
Favorite president: Warren G. Harding; he's a vastly underrated president and a man of great ordinary decency.
Smith, the 2002 Nobel Prize winner in economics, was born and raised in Kansas and inspired to learn by a farmer-teacher in a one-room country schoolhouse.
2004 vote: I am not voting for Kerry. Yet to be decided is whether I will vote for Bush or for neither.
2000 vote: Bush.
Most embarrassing vote: Many of the presidents were embarrassments, but not for me as a voter, because I always think of myself as voting against the other one. This policy led me to vote "for" Humphrey, Nixon, and others, but I saw myself as choosing negatively.
Favorite president: Eisenhower, for whom the affairs of state could always await another round of golf—for his not wanting to "get bogged down in a land war in Asia," and his concern about the military-industrial complex, which generalizes to other complexes like the Treasury-investment-banking complex.
Sullivan, a senior editor at The New Republic, blogs at andrewsullivan.com.
2004 vote: I can't vote because I'm not a citizen. So I can only "support" candidates, and I'm not supporting anyone in this election.
2000 choice: Bush.
Most embarrassing choice: I'm unembarrassed by all my choices.
Favorite president: Lincoln, of course. He saved the Union.
Senior Editor Sullum is a syndicated columnist and author of Saying Yes: In Defense of Drug Use (Tarcher Penguin).
2004 vote: The thought of choosing between Bush and Kerry, or casting another pathetic protest vote for the Libertarian candidate, is so depressing that I probably won't be motivated to visit my local polling place this year. I'd like to see Bush lose, but without Kerry winning. Much as I disliked him when he was in office, Clinton is looking better and better to me in retrospect.
2000 vote: Harry Browne.
Most embarrassing vote: It's a tossup between Mondale in 1984 and Dukakis in 1988. Dukakis is more embarrassing because it was more recent (and because he's Dukakis), but Mondale is more embarrassing because I voted for him twice—in New York's open primary (when I almost went for Gary Hart) as well as the general election.
Favorite president: Thomas Jefferson for his writings rather than his performance in office, Grover Cleveland or Calvin Coolidge for taking seriously their oaths to uphold the Constitution and its limits on federal power.
Taylor writes Reason Express, a weekly news commentary for Reason Online.
2004 vote: George W. Bush, pathetic bastard that he is—and has made me. The only thing that I am certain that John Kerry would do is raise taxes. Plus I figure the potential confusion surrounding a new national security team may well get people killed. See? Pathetic.
2000 choice: Bush, in Maryland. A gimme.
Most embarrassing vote: Bob Dole in 1996. Knew that was going nowhere.
Favorite president: Ronald Reagan, because he beat back the flawed economic theory that was destroying America and the world.
Volokh teaches and writes, mostly about constitutional law and cyberspace law, at UCLA School of Law. He blogs at volokh.com.
2004 vote: George W. Bush. I almost always vote for the party, not the man, because the administration, its legislative agenda, and its judicial appointments generally reflect the overall shape of the party. I tend to think that Republicans' views on the war against terrorists, economic policy, taxes, and many though not all civil liberties questions—such as self-defense rights, school choice, color blindness, and the freedom of speech (at least as to political and religious speech)—are more sound than the Democrats' views. I certainly find plenty to disagree with the Republicans even on those topics, but if I waited for a party with which I agreed on everything or even almost everything, I'd be waiting a long time.
2000 choice: George W. Bush.
Most embarrassing choice: Can't think of any.
Favorite president: George Washington. As best I can tell, he did a crucial job better than anyone else could have done, and I don't know him well enough to have learned about all his warts.
Managing Editor Walker is author of Rebels on the Air: An Alternative History of Radio in America (NYU Press).
2004 vote: I'm not sure yet, but I'm increasingly inclined to write in Elmer Fudd.
2000 vote: Harry Browne.
Most embarrassing vote: Dukakis. Bush Sr.'s ACLU-baiting campaign was appalling, and I wasn't yet ready to start throwing my vote away on third-party candidates and frivolous write-ins. So I threw it away on a Democrat instead.
Favorite president: It would have to be one of those practically powerless presidents who served under the Articles of Confederation—maybe the anti-federalist Richard Henry Lee, chief of the Continental Congress from 1784 to 1785, who helped launch the American Revolution, tried to ban the importation of slaves, fought to include a Bill of Rights in the Constitution, and sang the goofiest song in 1776.
Wanniski midwifed supply-side economics, was the first Marxist to work as an editor of the Wall Street Journal editorial page, and is author of The Way the World Works (Gateway).
2004 vote: Bush does not deserve to be re-elected, and Kerry does not deserve to be elected—Bush because of Iraq and Kerry because his economics are dreadful. I'm leaning toward Kerry because I prefer recession to imperialist war, but Bush might tempt me back by firing Cheney, Rumsfeld, and company.
2000 vote: George W. Bush.
Most embarrassing vote: I only voted for Dole in '96 because Jack Kemp was on the ticket. I should have voted Perot in protest.
Favorite president: Abraham Lincoln, because he saved the Union. Otherwise, there would today be no USA.
Contributing Editor Welch is a columnist for Canada's National Post. He blogs at mattwelch.com.
2004 vote: John Kerry, because I think the foreign policy approach of Bush and his administration would make the world, on balance, less safe and less free, while expanding the target on Uncle Sam's back, compared to a presidency run by a flip-floppy Democrat with a Republican Congress. I think Bush needs to be fired.
2000 vote: Ralph Nader, despite covering his campaign.
Most embarrassing vote: Dukakis. I'm hoping not to feel the same way this November.
Favorite president: Lincoln, for freeing slaves, preserving the Union, and serving as a ghostly conscience to haunt the Republicans' "Southern Strategy."
Wilson helped found the Guns and Dope Party (gunsanddope.com), which urges everybody to vote for himself. His best-known book is Illuminatus! (Dell), co-written with Robert Shea.
2004 vote: I'm voting for myself because I don't believe anybody else can represent me as well as I can represent myself.
2000 vote: Nobody. I sat in my tent and sulked, like Achilles.
Most embarrassing vote: In 1964 I voted for LBJ because I thought Goldwater would escalate the war in Vietnam.
Favorite president: John Adams, because he didn't trust anybody in politics, including himself.
Contributing Editor Young is a columnist for the Boston Globe.
2004 vote: That's a little private, don't you think? Whichever way I answer, half my friends won't talk to me anymore. (Such are the perils of having a bipartisan social life.) Of course, I could cast a write-in vote for Xena, Warrior Princess…but that would probably piss off my friends who are Buffy fans.
2000 vote: See above.
Most embarrassing vote: Well, talk about private!
Favorite president: George Washington—can't offend anyone with that.
Bush's Brain: How Karl Rove Made George W. Bush Presidential, by James C. Moore and Wayne Slater, Hoboken, New Jersey: John Wiley & Sons, 395 pages, $27.95
Political consultants are like movie actors. A few achieve wealth and celebrity, even branding their tag lines onto our collective consciousness. James Carville's "It's the economy, stupid" is almost as familiar as Robert De Niro's "You talking to me?" Seeing these examples, ambitious youngsters may try to follow the same career path, not realizing that most people in the field never make the big time. Instead of clinking champagne glasses at ritzy receptions, the silent majority of consultants and actors must settle for store-brand beer and Cheez Whiz.
Elections turn on a wide array of forces, including economics, demographics, and sheer chance. Amid all of these variables, a well-executed campaign can lose, just as a troubled one can win. As soon as the outcome is clear, however, pundits credit the winners with brilliance and blame the losers for incompetence. The former become the subjects of books and the latter become questions in the political-junkie version of Trivial Pursuit.
When the Long Count of 2000 finally ended in a Bush victory, eyes turned to his lead strategist Karl Rove, the 52-year-old stalwart who has worked with Bush since his unsuccessful 1978 race for Congress and who has helped many Texas Republicans win in a onetime Democratic stronghold. Ever since the Supreme Court ruled that Bush was president, numerous press reports have analyzed Rove's influence, and he was even a character in the mercifully short-lived sitcom That's My Bush! Now we have two full-length biographies: Bush's Brain and Boy Genius.
There's no question that Rove is a smart political operative who deserves careful study. But it's important to remember how easily things could have been different. If a few hundred more voters in Florida's Palm Beach County had understood simple ballot instructions, Rove would be back in Texas, Gore would be in the White House, and Gore campaign manager Donna Brazile would be on the cover of a book titled Girl Genius.
Any study of a current political figure has inherent limitations. It is hard to get candid comments from people who must continue to work with the subject after the book is out. And aside from the occasional leak, private papers will stay out of reach until the archives open, far in the future. Within those constraints, both of these books about Rove have at least something to offer. Though one of them is often obscured by factual errors and reckless speculation, together the two books paint a picture of an intelligent operative thrust by the exigencies of politics into an exalted position of perceived influence. That perception benefits both friends and foes of the Bush administration.
Boy Genius, by reporters Lou Dubose, Jan Reid, and Carl M. Cannon, is by far the better of the two. It offers a workmanlike analysis of the Texas political wars where Rove honed his skills and an overview of his role since Bush moved into the White House. Dubose and Reid wrote the Texas sections, and they show how Rove successfully halted a Democratic counterattack against burgeoning GOP dominance in that state by running a series of tough statewide campaigns. Drawing on the state's underlying conservatism and aided by Democratic missteps, Rove helped open the way for Bush's governorship and the GOP's near-total dominance in Texas politics today.
Veteran Washington journalist Carl Cannon wrote the section on the 2000 election and the early Bush presidency. His treatment of the 2002 midterm is particularly valuable, as the president's party managed the unusual feat of gaining seats in both the House and Senate. Notwithstanding the GOP's rhetorical devotion to decentralization, Cannon describes a "top-down Bush era Republican Party" in which Rove made many of the key campaign decisions, including the emphasis on tax cuts and the recruitment of strong Senate candidates such as Norm Coleman of Minnesota.
In Bush's Brain, Texas journalists James C. Moore and Wayne Slater credit Rove with a vital decision that won Bush the 2000 campaign. Though West Virginia had long voted Democratic in close presidential races, Rove noticed that Clinton-Gore policies on gun control and the environment had irked the state's many hunters and mineworkers. He had the Bush campaign fight hard for the state while the Gore forces took it for granted. On election night, Bush won its five electoral votes, without which he would have lost the race.
The authors' account of the battle of West Virginia sheds some light on Bush's policies as president. Since the state is so pivotal, do not expect the administration to tack leftward on the Second Amendment or mining regulations. Supporters of limited government will find that much encouraging. On the other hand, the votes of West Virginia steelworkers were on policy makers' minds when they hiked steel tariffs last year.
Unfortunately, this intelligent account of Rove's West Virginia strategy is the only good thing about Bush's Brain. The rest is a mess—not just bad, but bad the way Steven Seagal movies, Yugo convertibles, and muskrat-flavored yogurt are bad.
To begin with, the book is disturbingly laden with factual errors. Moore and Slater say, for example, that Rove styled the 2000 Bush campaign "after the work of Mark Hanna, an industrialist at the turn of the twentieth century." Rove has indeed compared the current period to the McKinley era. But he has explained that the better-known Hanna was just the GOP's money man, whereas its chief strategist—and Rove's role model—was not Hanna but Charles Dawes. Then an obscure thirtysomething lawyer, Dawes went on to a stellar career as budget director, vice president, and winner of the Nobel Peace Prize.
Moore and Slater's error is important. By missing Rove's true influence, they botch an important lead. Rove's fascination with Dawes could offer telling clues about his worldview. (Boy Genius does get the Dawes reference right, but doesn't do much with it. Any future biography of Rove ought to develop it in depth.)
Moore and Slater also say that Rove tried to counter the Bush-as-Dumbo image by planting a story in the National Review listing all the big books that Bush had read. John J. Miller, the author of the story, replied in the National Review Online: "This is flatly wrong—or at least seriously misleading. Rove was not 'the source of the article.' I did speak with him and quote him, but he was one source among many. He was not even the originating source of the story, in the sense that I got the idea for it somewhere else."
Where Bush's Brain is not provably wrong, it relies overmuch on uneducated speculation. For instance, the authors devote two chapters to a 1986 incident in which Rove claimed that someone had bugged his office. He was running the Republican campaign for governor of Texas, and he hinted that Democrats might be responsible. At the time, some reporters suspected that Rove had concocted the whole thing in order to discredit the other side. "Maybe Rove did not plant the bug," say Moore and Slater, but it is "hard to disconnect him from culpability in the incident."
Why exactly do they think this? Rove had spoken of seeing Power, the silly Richard Gere movie about political consulting. In that movie, Gere finds a listening device in his phone. So that's where Rove must have gotten the idea, right? (I've seen The Godfather dozens of times, but I have not put a single horse's head in anybody's bed, or even thought about doing so.)
The ratio of speculation to fact in Bush's Brain goes way up as its story stretches into the Bush presidency. Moore and Slater have few sources in Washington's GOP community, so they depend on conjectures from fellow outsiders. It's like covering the Super Bowl by talking to people who couldn't get tickets.
After the September 11 attacks, they note, a former associate of Rove wrote an editorial saying that American foreign policy had helped spawn terrorism. He e-mailed a copy to Rove, who never answered. (It's possible that a senior White House aide might have been a tad busy at the time.) A right-wing Web site then posted the editorial, drawing volumes of angry e-mail to the author's inbox. Was Rove responsible, as his former friend suspects? "There was simply no way of knowing, not for sure," Moore and Slater admit, but nevertheless dwell on the tale at some length.
The authors' assessments of the power Rove wields in the Bush White House make them sound like the coke-addled Sherlock Holmes ranting about Professor Moriarty in The Seven-Percent Solution: "He is the organizer of half that is evil and of nearly all that is undetected in this great city. He is a genius, a philosopher, an abstract thinker. He has a brain of the first order." They even refer to the Cuban embargo as "the Rove doctrine." Egad, Eisenhower imposed a partial embargo in 1960, and JFK expanded it in 1962. Talk about a "boy genius"—Rove was in grade school at the time. Would Bush drop the embargo if Rove went off the payroll? And given the Cuban vote in the pivotal state of Florida, would Gore have dared to change it?
Moore and Slater also blame Rove's evil genius for the decision "to fight the war on terrorism, which was a just cause, and use the pureness of that purpose to advance the Republican political agenda." They start with a smidgen of fact. In June of 2002, a White House intern lost a computer disk in Lafayette Park. A Democratic Senate aide happened to find it, discovering a PowerPoint presentation by Rove and White House political director Ken Mehlman. Moore and Slater find dark significance in one slide that urged Republicans to "focus on war and the economy." Democrats damned that advice as a Rove effort to "politicize" terrorism. The plain fact is that those are the administration's top policy priorities, and would be if Karl Rove had never been born.
Carl Cannon gets the story straight in Boy Genius: "In truth, most of the stuff in the presentation, delivered to GOP donors at the posh Hay Adams Hotel, was boilerplate political fare. Still, it was embarrassing to misplace your own campaign materials. Hardly the stuff of genius."
But Moore and Slater go even further: Not only is Rove responsible for a political focus on the war—he was behind the war itself. The failure to catch Osama and crush al Qaeda was threatening Bush's political standing, the argument goes, so the administration redefined the war on terrorism as a global conflict with evil. The war on terror was big enough, they say, but now we would go after any enemy whose destruction could gratify American souls and boost Bush in the polls: "Rove's political strategy for the president transformed a policy whose scope and tenets were unprecedented in American history. All it needed was a little justification. And Iraq was handy."
One of their "sources" on this point is a "political operative who has closely observed Rove's tactics for many years." This person says that Rove and Bush can now take their fight against evil and "apply that to anyone they want. Tom Daschle or Hussein." Another operative calls the Iraq war "the most evil political calculation in American history."
Despite the gravity of their accusation, the authors fail to offer a single scrap of evidence that Rove masterminded the war. The "Rove-wags-the-dog" scenario does not even make sense on its own terms. Just suppose that Rove did have the power to start a war for its political payoff. When he was allegedly charting the road to Baghdad, the generals could not guarantee swift victory. There was a small but serious chance that the conflict might drag on and on, leaving Bush and Rove in political body bags.
In a cold electoral calculation, would the probable benefits of victory have outweighed the risks of defeat? Not bloody likely. As an amateur historian, Rove would have known that successful wars are seldom good for the party in power. After World War I, the nation spurned Wilson's Democrats and turned to Warren Harding. In the first midterm election after World War II, Republicans took control of Congress. And the year after the first Bush won the first Gulf War, he lost re-election with the lowest popular-vote share of any incumbent since Taft.
Though they take the theory to loony extremes, Moore and Slater are hardly alone in painting Rove as the real power of the Bush administration. The cover artist of Boy Genius illustrates the idea by placing a photo of Bush above the word Boy and a photo of Rove above Genius. (Lest anyone miss the point, there is also a light bulb over Rove's head.)
Journalists can easily find people who will agree that Rove is Professor Moriarty. In Washington, there are two kinds of politicians with a stake in hyping Rove's influence: Republicans and Democrats. When Republicans disagree with the administration but do not want to criticize their president, they blame Rove. In 2001, for instance, the administration yielded to Hispanic groups that wanted to end Navy bombing tests in Vieques, Puerto Rico. Sen. James M. Inhofe, an Oklahoma Republican who backed the tests, grumbled to The Washington Post: "It was Karl Rove who made the decision. It was politically motivated."
And when Democrats have to admit that anything intelligent comes out of the Bush administration—headed by a man they dismiss as a grinning, brainless frat boy—they need Rove to take the credit. They simply cannot bring themselves to believe that that dummy Bush could have beaten Ann Richards, Al Gore, and the Taliban. In January, Sen. Ernest Hollings (D-S.C.) addressed Senate Majority Leader Bill Frist (R-Tenn.) on the Senate floor about economic policy: "I ask my distinguished friend from Tennessee, I know you will see Karl Rove, and I want him to see he is leading his distinguished president into the same trap that Bush 41 got led into."
It's hard to assess just how much Rove has guided the policies and strategies of the administration, since Bush is running a notably leak-free operation. White House aides and Republican operatives feel a genuine loyalty to the president, so they seldom talk out of turn to the press.
One who did was political scientist John DiIulio, who briefly headed the office of faith-based initiatives. In an e-mail to journalist Ron Suskind, DiIulio said: "Little happens on any issue without Karl's okay, and, often, he supplies such policy substance as the administration puts out. Fortunately, he is not just a largely self-taught, hyper-political guy, but also a very well informed guy when it comes to certain domestic issues. (Whether, as some now assert, he even has such sway in national security, homeland security, and foreign affairs, I cannot say.)"
The e-mail also suggested that Rove and his fellow "Mayberry Machiavellis" have what DiIulio called a "libertarian" streak. So are they channeling the thoughts of Milton Friedman and Friedrich Hayek into substantive decisions? Alas, neither the e-mail nor Suskind's Esquire article was clear on this point, and DiIulio later backed away from his comments. (Policy outcomes have been a mixed bag for libertarians, who may applaud the tax cuts but worry about the enormous power of the Department of Homeland Security.)
Despite these two biographies and a fair amount of press coverage, Rove remains a mystery—which is a good position for a loyal political aide to be in. For a clearer picture of his role, we will have to wait for books that can analyze the Bush administration in retrospect. Will such works depict Rove as a success? The answer will depend on the 2004 election. A Bush defeat would diminish Rove's stature, even if it happens for reasons beyond his control.
Any future analyst of Rove, or of any political aide or consultant, would do well to keep in mind these words from Ecclesiastes: "The race is not to the swift, nor the battle to the strong, neither yet bread to the wise, nor yet riches to men of understanding, nor yet favor to men of skill; but time and chance happeneth to them all."
For supporters of limited government, Rep. Bob Inglis (R-S.C.) was an ideal candidate. Running in 1998 for the U.S. Senate, he scorned the idea that lawmakers should scrounge federal pork for their constituents. He actually voted against projects meant to benefit his home district, and he promised to take the same approach in the upper chamber.
His opponent was incumbent Democrat Ernest "Fritz" Hollings, who dealt with pork the same way the Rev. Jimmy Swaggart dealt with illicit sex—seeking it as vigorously as he damned it. The only difference was that Hollings had no shame.
Inglis tried to say that Hollings' approach was out of date. "In 1966 [the year Hollings first won the seat], we sent senators to Washington to get whatever they could because we were desperately poor," Inglis said. But the prosperous South Carolina of the 1990s, he insisted, no longer needed government guarantees. "The senator is selling day-old bread, and we're not going to buy it." Alas, South Carolinians find their day-old bread quite filling. They re-elected Hollings, 53 percent to 46 percent.
Earl and Merle Black, twin brothers who teach at Rice and Emory universities, offer this case study in The Rise of Southern Republicans, the latest in a series of books, including The Vital South and Politics and Society in the South, that have established them as leading experts on Southern politics. Inglis' story exemplifies their broader analysis.
On one hand, Inglis' strong showing demonstrates how far the GOP has come. During the 1950s, a Republican could not win a House seat in most of the South (the 11 states of the Confederacy), much less score 46 percent against a veteran Democratic senator.
On the other hand, the Hollings victory is a reminder that Republicans have not gained the overwhelming dominance that Democrats once enjoyed. Throughout the region, statewide elections remain competitive. Race caused the GOP's century of impotence in the South after the Civil War, and racial questions, as the Black brothers show, remain a regrettably dominant concern in Dixie politics.
This richly documented book does a good job of explaining both the Republicans' surge and the Democrats' survival. For decades after the Civil War, American politics exhibited a pattern that the Black brothers dub "battlefield sectionalism." The party of Lincoln did well in the North but was anathema below the Mason-Dixon line. In the 1930s, Republicans suffered on both sides of the line. The Depression cost them much of their support in the North while deepening anti-GOP feeling in the South.
Moreover, the Democrats' New Deal had a special appeal for the region. "Millions of dollars helped to construct dams, reservoirs, harbors, highways and airports," the Blacks write. "Federal money was political magic, a regional free lunch. It cost the South little because most southerners did not earn enough money to pay federal income taxes."
Two big changes followed World War II. First, the South's economic and demographic profile began to resemble that of the rest of the country. The Thomas Wolfe South of small-town eccentrics morphed into the Tom Wolfe South of urban bankers. Houston came to look like Los Angeles, even down to the urine-colored air. Northerners and Midwesterners moved south, bringing their GOP voting habits with them. And because of their rising incomes, Southerners now felt the tax bite they had escaped during the Depression.
The second big change, of course, involved civil rights. By integrating the armed forces and backing civil rights legislation, the Black brothers explain, President Harry Truman "undercut 'an unwritten understanding' between the northern and southern wings of the Democratic party." The "Solid South" cracked. Democrats lost some of the region's electoral votes to Dixiecrat Strom Thurmond in 1948, then to Dwight Eisenhower and Richard Nixon in the next three elections.
Sen. Barry Goldwater (R-Ariz.) sped up this change when he voted against the 1964 Civil Rights Act. Later that year as the Republican presidential nominee, he carried five states of the Deep South (Louisiana, Mississippi, Alabama, Georgia, and South Carolina) while losing every other state except Arizona. "Some whites who had been trained from childhood to hate Republicans and revere Democrats now saw in the Goldwater wing of the Republican party an alternative to the Democratic party," the Black brothers write. From then on, Southern whites would always give a plurality of their votes to Republican presidential candidates.
The Blacks make clear that it wasn't just about race. White Southerners liked the GOP's position on taxes and defense as well. In the 1980s, Ronald Reagan also drew religious right voters in the South with his positions on social issues such as abortion and school prayer.
This new Southern support for the Republicans did not immediately extend below presidential races. In the House and Senate, many conservative Democrats clung to their seats like moss, greatly slowing the realignment at the congressional level. Finally in the 1990s, the GOP won a majority of the region's House seats. The 2002 midterm election confirmed the shift. Most of the region's governors, senators, and House members are now Republicans. Though Democrats still lead in Southern state legislatures, the GOP scored historic gains, taking the Georgia Senate and Texas House for the first time since Reconstruction.
Civil rights proved a double-edged sword for Republican electoral prospects in the South. While the issue drove some Southern whites to the GOP, it also enabled many Southern African Americans to vote—and they voted almost unanimously for Democrats. In one of history's delicious ironies, African Americans, freed by the actions of the first Republican president, eventually became the Democratic Party's core vote throughout much of the South.
Still, no Southern state has an African-American majority, and not one has elected an African-American senator. The situation is different in House races. African-American Democrats can win in majority or near-majority African-American districts, which are largely the product of the Voting Rights Act. But by packing African Americans into majority-minority districts, the act also left neighboring districts whiter and more Republican, a process that cynics call "bleaching."
As a result, Southern Republicans can prevail either when African Americans make up a small share of the electorate or when Republican candidates win an overwhelming majority of the white vote. The first condition applies in many House races but not in Senate races. Most Southern GOP senators win because of huge white majorities. In 2000 Mississippi Sen. Trent Lott lost the African-American vote 87 percent to 11 percent but won re-election by carrying 88 percent of the white vote.
Such huge majorities are hard to achieve against a Democrat who can reach beyond African Americans. Many Southern Democratic politicians have learned the trick of building biracial coalitions, which is why they aren't extinct. For instance, Hollings beat Inglis by combining 90 percent of a high black turnout with 39 percent of whites.
Knowing that a tough Democratic challenge might lurk around the next corner, conservative Republican senators resort to shoring up their support by playing the pork game that Hollings has mastered. Even Phil Gramm, a rhetorically staunch free market economist, was never shy about claiming credit for the booty he brought home to Texas. By contrast, Sen. Mack Mattingly of Georgia largely ignored local concerns in favor of high-minded issues such as the line-item veto. He lasted one term.
The Black brothers make their points about the state of Southern Republicans with a wealth of data from elections and public opinion surveys. Though the book's elaborate detail might make for slow going among many readers, it will serve as a valuable resource for scholars and political activists.
Nevertheless, the book has its lapses. The Black brothers refer to the Republican Party of the Civil War era as "deliberately founded on sectional rather than national interests." Though it is true that the Republicans of the time had no appeal in the South, it is downright weird to refer to the limitation of slavery as a "sectional interest."
More significantly, the authors fail to develop a major implication of their own analysis. Mobilizing the African-American vote is now a key element of Democratic strategy, especially in statewide races. A common tactic is to demonize Republicans as racists, citing their opposition to racial preferences.
In 1999 Florida Gov. Jeb Bush announced his One Florida initiative to end racial preferences in government contracts and state university admissions. The initiative pre-empted a stronger measure that national anti-preference activist Ward Connerly was seeking to put on Florida's ballot. Republicans feared the Connerly measure would stoke strong opposition and hurt their party. As it turned out, Bush's One Florida hurt them anyway. Democrats used it to inspire anger among African-American voters, who turned out in high numbers and nearly tipped the state to Al Gore in 2000.
Even before this incident, Republicans in the South (and elsewhere) had grown increasingly gun-shy on the preference issue. Last June, Gov. Bush signed a bill creating the Florida Minority Business Loan Mobilization Program, which Connerly denounced as a new racial preference that backtracked from One Florida.
A half-century ago, Southern Democrats campaigned by opposing color-blind laws, stirring up racial fears, and silencing those who opposed them. They still do.
In The Future and Its Enemies (1998), former reason editor Virginia Postrel coined a word for people like Haynes Johnson: stasists. Preferring yesterday's quiet ("stasis") to tomorrow's open-ended future, stasists are always warning us about troubling trends that call for more planning and regulation. Johnson describes the Clinton years as a decade when every bright light cast a dozen dark shadows. "Technology in a way scares me," a Stanford senior who supposedly represents American youth tells Johnson. "Cloning and those kinds of issues are very scary to me. I want to make sure there are mechanisms in place to control those kinds of things, and I don't know that there will be." Johnson concludes in his last chapter that "the problems promise to become even more tortuous in the future."
To draw a convincingly dire picture of where we are heading, stasists need a plausible critique of where we have been. But The Best of Times is hardly the best of social histories. Johnson seems a stranger in his own time, often sounding like someone who has partially awoken from a long cryogenic sleep. The PBS NewsHour erred by including him on its regular panel of historians. He doesn't study yesteryear; he is yesteryear.
In the book's first section, "Technotimes," Johnson tells of a visit to Microsoft's corporate headquarters. "Its employees and top executives refer to it as their 'campus,'" he says with an air of discovery. "They all speak of its 'culture,' their 'Microsoft culture.'" He is apparently unaware that firms have long referred to office complexes as "campuses" and that "corporate culture" has been a corporate cliché at least since the 1980s.
Johnson has heard of the Internet, but he is hazy on its capabilities. For example, he quotes a Silicon Valley friend as saying that in 10 years it might just be possible to send e-mail in English to Germany and have it come out in German. That friend probably didn't survive the dot-com shakeout since he or she wasn't keeping up. E-mail translation services are already available on a number of Web sites, and although their treatment of idiomatic expressions leaves something to be desired, the basic technology is in place.
It is no surprise that Johnson uncritically accepts the notion of the "digital divide," the purported gap in Internet access between rich and poor, white and black. In a reason article three years ago ("Falling for the Gap," November 1999), Adam Clayton Powell III demolished this myth with data that showed increasing access, particularly in the workplace. More recently, other observers, including the U.S. Department of Commerce, have made similar points, but Johnson does not acknowledge their views or, for that matter, just about any free market argument.
Where others see opportunity, Johnson sees only turmoil. By producing cheap computer operating systems and software programs, Microsoft helped trigger a vast revolution in communications, but Johnson's discussion of the Microsoft antitrust case rails against "the reality of a cutthroat business filled with egotistic, arrogant people." If he really wanted a portrait of egotism and arrogance, he needed to look no further than Microsoft's nemesis, U.S. District Judge Thomas Penfield Jackson.
In his next section, "Teletimes," Johnson looks at the mass media and sees even more disturbing signs. Here he does make a point, albeit unintentionally. He worked for many years at The Washington Post and now teaches journalism. If his level of accuracy and grasp of historical context represent the best of the profession, then the media have deep troubles indeed. Johnson cites data showing that Internet usage is increasing but claims "that doesn't mean people are using the Internet news offerings in greater numbers." Wrong. He should have checked surveys by the Pew Research Center for the People and the Press (people-press.org). In 1998, 13 percent of adults reported going online for news at least three times a week. Two years later, that figure had grown to 23 percent. Now news junkies can bypass Johnson's friends in big-city editorial offices and instead pick what they want to see, whether from the Associated Press, Salon, or Reason Online.
The O.J. Simpson trial is Johnson's main case study of the press in the 1990s. Not only does it represent the blurring of entertainment and news, he says, but it also reveals fundamental attitudes about race. Sixteen months after the not-guilty verdict in the criminal trial, Johnson writes, "an all-white jury in Southern California's suburban Orange County unanimously found Simpson responsible for the murders." These words may sound like a powerful statement about race and society, but they're wrong in two crucial respects. First, the jury in the civil trial was not "all-white." According to self-descriptions, jurors included one Hispanic, one Asian, and one person of Asian and black heritage. Second, the trial took place not in suburban Orange County but in the relatively bohemian Los Angeles County community of Santa Monica. So Simpson lost with a multiethnic jury in a socially liberal city. Doesn't have the same punch when you put it that way, does it?
Johnson claims that television stations are downplaying news, and he blames deregulation by the Federal Communications Commission. Actually, because of market pressure and improved technology, local news broadcasts have become longer and more frequent. Public affairs broadcasting did take a hit during the 1990s, but as the result of increased regulation. The "must carry" provisions of the Cable Act of 1992 required cable operators to carry over-the-air broadcast signals on their systems. To make room, many operators dropped C-SPAN—the home of the sort of serious programming that Johnson claims is missing from the tube.
Johnson apparently believes that there was once a golden age of high-minded, morally pristine, and intellectually rigorous journalists. He mentions Walter Lippmann, but historical research has revealed that Lippmann wrote columns praising speeches that he himself had ghostwritten for favored politicians. Johnson quotes a 1958 speech in which the sainted Edward R. Murrow warned that television was already "being used to distract, delude, amuse, and insulate us." He sees prophecy in those words, but he misses their hypocrisy. Murrow himself helped found celebrity television journalism with Person to Person, an interview show in which he sat in a studio and asked chatty questions of guests sitting in their homes.
Johnson saves his purest venom for the new media: "the disgraceful attack talk-radio programs" and Internet sites featuring "cranks, sensation-seekers, professional hit men, rumor-and-hate mongers, maliciously motivated ideologues." He sneers at Matt Drudge's claim of an 80 percent accuracy rate, which "even the least reputable newspaper or TV broadcast would deem unacceptable." In light of the many inaccuracies in his own book, Johnson should ponder a line from Matthew (the gospel writer, not Drudge): "For with what judgment ye judge, ye shall be judged: and with what measure ye mete, it shall be measured unto you."
Johnson contends that Internet-generated bile has turned people away from public life, forgetting the many ways in which the Web fosters debate and discussion. In fact, if Johnson had wanted to find diverse and thoughtful criticisms of Internet news, he could have searched www.daypop.com and found them on any number of Web logs that offer exactly what he claims is missing in cyberspace.
The book's longest section, "Scandal Times," offers a painfully detailed account of the Monica Lewinsky affair. Johnson purports to use it as a window on America's corrupt soul but along the way throws in reams of needless detail. In a taped conversation with Lewinsky, a footnote tells us, Linda Tripp confessed that she had gone seven years without sex. Johnson adds: "How relevant, or revelatory, that may be is beyond my capacity to determine." So unless his purpose is just to sling a bit of slime in Tripp's direction, why does he mention it in the first place?
Toward the end of this 212-page section, Johnson quotes New York Times reporter Francis X. Clines calling the story "a toxic waste dump of fetid ingredients and methane energies." It's hard to understand why Johnson needs so much space to reach a conclusion that nearly everyone else reached four years ago. Nevertheless, this portion of the book could serve a socially useful purpose: If American armed forces started reading it aloud to the Al Qaeda prisoners, they'd break in no time flat.
In light of the horrific events that occurred after Johnson wrote the book's last lines, his title does not seem so ironic after all. Even in the best of times, however, political life requires probing and subtle analysis, which you won't find here. Johnson once published a book about the 1980s whose title sums up his own approach to the 1990s: Sleepwalking Through History.
Gephardt urged tax incentives for partnerships between universities and small businesses, investment in renewable energy, the purchase of fuel-saving vehicles, and improved energy efficiency in new buildings. He would also increase the Earned Income Tax Credit, make refundable the $500 child tax credit, and make permanent the credit for contributions to retirement savings plans.
Simplifying the tax code means getting rid of deductions and credits in exchange for a lower rate. It's tough to do that when you're adding deductions and credits.
Gephardt is hardly the first politician to take contradictory tax positions. In 1996 GOP presidential candidate Bob Dole called for "a fairer, flatter, simpler tax system" while simultaneously pushing for adoption tax credits, IRAs for homemakers, and other preferences. But Gephardt is unique in that he has been contradicting himself so brazenly for so long.
As a junior House member in the late 1970s, he supported tax credits for private and parochial school tuition. In 1987 he told The New York Times he no longer favored such credits because the Internal Revenue Code should not be "cluttered with credits and deductions." Now he's back to pushing tax deductions for educational costs.
In 1981 Gephardt voted for the Reagan tax cuts. When he ran for president in 1988, he defended that position, which his aides saw as a winner. "[Sen. Paul] Simon and [Gov. Michael] Dukakis stumbled into it, criticizing us on taxes," his deputy campaign manager told the Los Angeles Times. "If they want to talk about how Dick Gephardt voted to cut taxes in New Hampshire, we said, sure, we'll talk about that."
Last year Gephardt cited the Reagan cuts to denounce the Bush cuts. "People like me got calls from my constituents in 1981 saying, 'Give Ronald Reagan a chance,' " he said. "Well, after we lost our alternatives, people like me gave him a chance. I voted for the Reagan tax cut in the end in 1981. It was a mistake."
Speaking in Iowa in July 2001, Gephardt suggested that the 1993 tax increase was a model: "I'm glad we did what was right in 1993, and I'll do it again because I believe in being fiscally responsible with the taxpayers' money." A few days later, he issued this statement: "I never addressed the future of taxes in my remarks because I don't believe they need to be raised."
In the 1980s, Gephardt sponsored legislation to overhaul the tax code by scrapping tax preferences. He told a group of business executives in 1985, "I feel more comfortable with the free market system deciding where capital should be allocated than with Dick Gephardt planning it."
In the 1990s, Gephardt proposed another major overhaul, which would have reduced tax rates by repealing nearly all itemized deductions. In a debate with Jack Kemp, he explained that he wanted everyone on a level field. "If you go out and earn your wages every day by working, you get taxed at a certain rate," he said. "If you earn by investing in capital…then you will pay at a similar rate. Why do you want to prefer one set of actions over another?"
A striking feature of this proposal was a requirement for a national referendum before any increase in tax rates. In 1995 Gephardt said such a requirement was necessary to block costly tax breaks. "When people look at tax reform," he said, they think, " 'Oh, sure, there they go again. They're going to lower my rates and they'll be back in two years opening up some more loopholes for rich people, and I'm going to pay through the nose.' "
Yet even before his 2002 tax credit fusillade, Gephardt was supporting tax preferences for individuals and businesses. The Web site for his 2002 re-election campaign includes this boast: "Dick Gephardt has consistently co-sponsored the Historic Home Ownership Assistance Act, which provides a Federal tax credit to individuals who rehabilitate historic homes."
Such wild contradictions could prove a handicap if Gephardt seeks the presidency again. His spinmeisters might try to contain the damage by recalling a famous quotation from F. Scott Fitzgerald: "The test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function."
Literally before New York could clear the rubble of the World Trade Center, lawmakers were trying to spend more on anti-terrorism measures than even the administration wanted, prompting President Bush to threaten a veto. The Republican chair of the House Transportation and Infrastructure Committee said that his $71 billion railroad bill was vital because the terrorist hijackings "demonstrated even more the need for transportation alternatives."
We've been down this rhetorical road many times before. World War II and the Cold War spawned far-flung constituencies that sustained useless or obsolete military bases and weapon systems. Time and again, the word defense was the lubricant that helped unlock the Treasury doors. Hence, the National Defense Education Act and the National System of Interstate and Defense Highways.
History suggests other paths khaki socialism will take in the months and years ahead. Noting that Soviet propagandists liked to talk about America's slums, liberals in the 1960s said that solving domestic social problems would help win our psychological war with communism. The first Kennedy-Nixon debate in 1960 was about domestic policy, but JFK began by saying that such issues "involve directly our struggle with Mr. Khrushchev for survival."
Five years later, Lyndon Baines Johnson said: "All you have to do is look at the morning paper this morning to see the rockets that were paraded down the avenues in the Soviet Union yesterday or the day before, and realize that until we banish ignorance, until we drive disease from our midst, until we win the war on poverty, we cannot expect to continue to be the leaders not only of a great people but the leaders of all civilization." In the late 1960s, even peaceniks got into khaki socialism by expanding the definition of "national security" far beyond anything that could plausibly become a GI Joe accessory.
Their sons and daughters are at it again. The National Association of Social Workers says on its Web site, "The social work profession can contribute to a redefinition of national security that includes healthy children, the prevention of poverty, an adequate education for all residents, and a productive economy." Writing in USA Today, Ted Halstead and Michael Lind warn that current bills "ignore one of the weakest links in our homeland defense: the armies of Americans without health insurance."
That metaphor points to another variant of khaki socialism, one that goes something like this: If big government can win World War II, the Cold War, and the Gulf War, then surely it can win the war against poverty, illiteracy, or disease.
When the United States does manage to crush Al Qaeda and its allies, the War on Terror will become part of this refrain. The argument will be that we can solve domestic problems if only we apply national resources as massively and single-mindedly as we did in the military struggle. AFL-CIO President John Sweeney recently told the Los Angeles Times: "You know, the president's getting good marks for the war against the terrorists, but he is neglecting the domestic war." Of course, the federal government did declare wars on poverty and drugs, with deeply disappointing results. In both cases, the domestic D-Days turned into stateside Vietnams.
Although military metaphors do help us understand many aspects of politics, they are not a literal guide to public policy. Put a little differently: If we've learned one thing in the past 40 years, it's that government cannot always help people by daisy bombing them with tax money.
It would be a shame if we followed military victory with khaki-wrapped pork-barrel spending–and elaborate, if misguided, metaphorical wars.
I once poured for Sen. Daniel Patrick Moynihan. And poured. And poured. The year was 1984, and I was a low-level aide for another lawmaker. Moynihan and my boss were on a charter flight to Albany to attend a conference on acid rain. My job was to sit in the jump seat behind Moynihan and keep his glass full of champagne. Moynihan kept me busy. When the flight landed, the senator darted toward a nest of reporters waiting in the terminal. "This will be a lively press conference," I thought.
It was, though not for the reason I thought. Moynihan's answers were as florid as his face, but they were utterly lucid and knowledgeable. This brief encounter left me with the suspicion that, like Winston Churchill and Frank Sinatra, Moynihan actually works better with some adult beverages in his bloodstream.
That's smart-alecky speculation, of course, but it may just solve a mystery. In his new biography of Moynihan, British journalist Godfrey Hodgson aspires to describe "the interplay between ideas and action" throughout Moynihan's life. Judging from the senator's public record, however, the story is not about "interplay" at all. Moynihan's words have often diverged from his deeds. In fact, there are really two Moynihans: one with great insight into the limits of government and the booby traps of social policy, the other with an unstinting devotion to federal power and a lockstep liberal voting record.
Apropos of nothing in particular, I will call them Moynihan Wet and Moynihan Dry.
Hodgson prefers narrative to intellectual history and thus provides little insight into the dual nature of Moynihan's intellectual character. After a placid early childhood, Hodgson tells us, Moynihan went through a painful period when his father abandoned his family to troubled circumstances in New York City. Moynihan Dry has often cited this experience in explaining his support for social programs.
Hodgson's narrative suggests that Moynihan Wet may have sprouted in the unlikely soil of the London School of Economics. In spite of its reputation as a nursery for Third World socialism, he says, the LSE actually employed some free market thinkers when Moynihan did graduate work there in the early 1950s. The account of the LSE influence is sketchy, however. Hodgson even quotes Moynihan as saying, "Nothing and no one at LSE ever disposed me to be anything but a New York Democrat."
After serving as a top aide to New York Gov. Averell Harriman and completing a Ph.D. at Tufts in 1961, Moynihan became assistant secretary of labor in the Kennedy administration. When Lyndon Johnson became president, the capital got its first major taste of Moynihan Dry. With speechwriter Richard N. Goodwin, he drafted LBJ's historic 1965 address at Howard University, which helped lay the intellectual foundation for racial preferences by treating equal outcomes as more important than equal opportunities: "[I]t is not enough just to open the gates of opportunity. All our citizens must have the ability to walk through those gates….We seek not just freedom and opportunity—not just legal equity but human ability—not just equality as a right and a theory but equality as a fact and as a result."
The speech drew heavily on a report that Moynihan had written on the "pathology" of the black family. Liberals who believed in the power of government to right all wrongs were annoyed by the notion of "pathology" (though it would later become conventional wisdom) as an explanation for poor blacks' problems. When the report became public, Moynihan Dry, despite his call for government action to help blacks, found himself the target of liberal vilification.
Enter Moynihan Wet. By this time, he had left the Labor Department to run in the Democratic primary for president of the New York City Council. He lost. He would spend the next few years teaching, writing, and challenging the liberal orthodoxy he had recently championed.
In 1967 Moynihan Wet gave a series of lectures about "the war on poverty," which he later collected in a book, Maximum Feasible Misunderstanding. His summation of the fate of "community action" programs was bluntly skeptical: "This is the essential fact. The government did not know what it was doing. It had a theory. Or, rather, a set of theories. Nothing more."
In 1968 the well-known Democratic policy maker took the astonishing step of contributing to an anthology called Republican Papers. Moynihan Wet sounded close to conversion. In his contribution, he wrote: "Somehow liberals have been unable to acquire from life what conservatives seem to be endowed with at birth, namely a healthy skepticism of the powers of government agencies to do good."
Moynihan Dry and Wet fought most over racial preference. In 1968 Moynihan wrote an article for The Atlantic Monthly titled "The New Racialism," which Hodgson unfortunately overlooks. Whereas Moynihan Dry had just advocated "equality of result," Moynihan Wet now grasped the unanticipated consequences: "That which was specifically forbidden by the Civil Rights Act is now explicitly (albeit covertly) required by the federal government. Employers are given quotas of the black employees they will hire, records of minority-group employment are diligently maintained, and censuses repeatedly taken."
Moynihan feared that the process would snowball. Trouble is, he wrote, ethnic groups will never have equal success in different fields because they value different kinds of success and excel at those they value most. "But government knows little of such variegations," he said, "and I very much fear that if we begin to become formal about quotas for this or that group, we will very quickly come to realize that these are instantly translated into quotas against." And then came the kicker: "Let me be blunt. If ethnic quotas are to be imposed on American universities and similarly quasipublic institutions, it is Jews who will be almost driven out."
Such comments drew the attention of Richard Nixon in his search for intellectual firepower for his incoming administration. Shocking the Democrats who still could remember Moynihan Dry's service in the previous administrations, Moynihan Wet signed on with Nixon as a domestic policy adviser. With proximity to power, though, Moynihan Dry returned. His major project was the Family Assistance Plan, a radical welfare overhaul that included a guaranteed annual income. Nixon embraced the plan and announced it in a nationwide address. The plan passed the House but died in the Senate. Moynihan Dry missed his biggest potential victory.
Hodgson attributes the bill's defeat to petty political infighting and Nixon's preoccupation with the Vietnam War. He misses the point. Though some observers at the time thought Moynihan's plan was a variation of Milton Friedman's negative income tax, it was something quite different. Rather than give cash assistance in place of social welfare programs, as Friedman had proposed, the plan would have been an addition to them—a massive expansion of the welfare state.
Martin Anderson, who served with Moynihan in the Nixon White House, wrote the definitive analysis of this plan in his book Welfare (oddly absent from Hodgson's endnotes). According to Anderson, Democratic Sen. Russell Long of Louisiana figured out that the plan would not only explode the budget but also discourage welfare recipients from working. "How do you justify that?" he asked an administration witness at a Senate hearing. "What possible logic is there to it?" The witness had no answer.
After this fiasco, Moynihan turned to foreign policy, first as ambassador to India, then as the U.S. representative to the United Nations. In 1976, he ran for the U.S. Senate against conservative incumbent James Buckley. He won by attacking Buckley's "extremism" and reluctance to back a federal bailout for New York City. Still, some hoped that the occupant of the Senate seat would turn out to be Moynihan Wet instead of Moynihan Dry.
Not a chance.
Year after year, vote after vote, Moynihan endeared himself to labor unions and the Washington establishment. In 1990 Congressional Quarterly's reference book Politics in America reflected on his career. On most domestic issues, his entry read, "Moynihan has been less intent on rethinking fundamental questions than on lining up emotionally with liberal Democrats in support of preserving the New Deal or the Great Society."
Some of his colleagues saw a touch of cynicism, the report went on, "since they remember when he was identified as a critic of these ideas." Perhaps this assessment is unfair. Did his split personality reflect genuine ambivalence rather than crass political pandering? Hodgson occasionally alludes to this question without resolving it.
Moynihan Wet made occasional appearances but never came close to erasing the record of Moynihan Dry. In 1999 and 2000, he won a good deal of press—as well as praise from Republicans—for supporting partial privatization of Social Security. A small step in the right direction, perhaps, but he would have left the program's basic structure in place. Moreover, Moynihan Dry had not been above playing demagogue on the issue. In 1981, after Reagan budget director David Stockman had proposed some modest cuts, the senator introduced a resolution condemning the idea as "a breach of faith with those aging Americans who have contributed to the Social Security system."
During a 1995 visit to Arizona, Moynihan Wet offered a faint echo of his 1968 piece on racial preference: "There's a tendency to get more and more groups involved, and…when there are people who will take advantage, you'll end up with unanticipated consequences." That same year, the Senate considered a measure forbidding ethnic and gender preferences in contracting by the legislative branch. Moynihan Dry voted to kill it. Three years later, he voted the same way when Republican Sen. Mitch McConnell of Kentucky offered an anti-preference amendment to a transportation bill.
The old skepticism of Maximum Feasible Misunderstanding briefly resurfaced in 1993, when the incoming Clinton team began making pronouncements on welfare. Moynihan wrote that he had been "repeatedly impressed by the number of members of the Clinton administration who have assured me with great vigor that something or other is known in an area of social policy which, to the best of my understanding, is not known at all." After this flash of Moynihan Wet, Moynihan Dry became a rabid opponent of welfare reform. Literally speaking of an "apocalypse," as Hodgson recounts, he foresaw "children on grates, because there's no money in the states and cities to care for them."
Nothing of the sort happened, of course, which undercuts Hodgson's description of the senator as a "prophet." The comment also points to a problem that plagued Moynihan throughout his career: His affection for the well-turned phrase often trumped prudence and careful thought.
Take the Moynihan Report of 1965. The notion of family pathology was sensitive enough, but then he had to add this line: "The very essence of the male animal, from the bantam rooster to the four-star general, is to strut." In 1994 a Senate hearing on childbearing by the very young took a strange turn when he remarked: "I mean, if you were a biologist, you could find yourself talking about speciation here," that is, the creation of a new species. "It has something to do [with] a changed condition in biological circumstances."
Such lapses may have diminished Moynihan's political effectiveness. Given the dominance of Moynihan Dry in the public record, that's probably all to the good. Still, as the "gentleman from New York" retires from public life at the end of this Congress, one must look to the close of his career with a twinge of regret: Moynihan Wet would have made one hell of a senator.
Contributing Editor John J. Pitney Jr. (jpitney@mckenna.edu) is an associate professor of government at Claremont McKenna College.
When Government Was Good: Memories of a Life in Politics, by Henry S. Reuss, Madison, Wis.: University of Wisconsin Press, 185 pages, $22.95
There's a right way to be wrong and a wrong way to be wrong. Some supporters of big, intrusive government manage to be witty, erudite, and tolerant of opposing views. If we must have statists, they're the ones to have. Alas, too many others are crabby, smug, and dogmatic–the kind who'd serve as the bad guys in an Ayn Rand novel.
John Kenneth Galbraith is of the first type, a sterling model of how to err in style. At the age of 91, he can look back on a rewarding life as a university professor, political adviser, ambassador to India, and debating partner of such conservatives as William F. Buckley Jr. Though he's seldom been right, he's always been a gentleman. And he has always been a graceful writer, as we see in Name-Dropping, a short, readable volume that offers personality sketches of FDR, JFK, and other famous figures he has known.
Galbraith has a sharp eye for the telling anecdote. As he reminds us here, Bill Clinton was not the first president who tried to be all things to all people. In separate meetings with FDR, two aides gave contradictory advice, and both times he replied, "You are perfectly right." When Eleanor objected that he could not possibly agree with such opposing arguments, he answered, "Eleanor, you are perfectly right."
Galbraith keeps the focus on his subjects, minimizing his own part in historical events. Here is his description of working on FDR's 1940 speechwriting team: "We found out how much of what we had written had survived when we heard the President speak on the radio. We listened with some attention." He is also refreshingly candid about his own failures. Looking back on campaign addresses he drafted for Adlai Stevenson, he now sees the fatal flaw: "I had written seeking the approval of the candidate, his campaign acolytes and myself, not that of the larger public, the electorate as a whole."
Name-Dropping, sad to say, is not really an original book. In his first chapter, Galbraith admits there is "an occasional event or personal encounter of which I have told before." That is an understatement: He has recycled a distressingly large portion of the material from his 1981 autobiography, A Life in Our Times. The earlier work supplies much more detail and documentation, so it is the place to look for historical reference.
Although Galbraith is wrong in the right way, he is still wrong. He acknowledges JFK's health problems and extramarital affairs but dismisses them as irrelevant to his presidency. Many historians would disagree. Before his 1961 summit with Khrushchev, JFK took medications that may have impaired his judgment. And his personal misbehavior constituted a wild security risk, exposing him to tacit blackmail by J. Edgar Hoover. He may have been just as charming as Galbraith describes, but charm is no excuse for recklessness.
There is a quaint frozen-in-time quality to Galbraith's thought–sort of Austin Powers without the bad teeth and mojo. Looking at Great Society welfare programs, he maintains that the solution to poverty is simply to give money to poor people, without necessarily expecting them to do work. In the decades since LBJ's War on Poverty, all but the staunchest statists have surrendered to reality and abandoned such notions. Oddly, Galbraith vents inordinate anger about America's effort to defeat Soviet communism in the Cold War. Austin–I mean, Mr. Galbraith…we won.
Galbraith provides the foreword to the autobiography of Henry S. Reuss, a liberal Democrat who represented Milwaukee in Congress from 1955 to 1983. Galbraith's contribution is pretty much the only good thing about this book, which unintentionally makes a splendid case for term limits.
Granted, you would expect a REASON reviewer to pan something titled When Government Was Good. But I opened the cover with hope. During his career, Reuss witnessed major events and served with great characters: Sam Rayburn, Bella Abzug, Tip O'Neill, and Melvin Laird, to name a few. Notwithstanding his love for the welfare state, he could have given us real insights into the personalities and private debates that shaped a critical period in congressional history.
He doesn't. Memoirs are supposed to have memories, but this book makes only passing reference to the big names and backstage debates. It's all Reuss all the time, bent on showing the wrong way to be wrong.
After an unenlightening account of Reuss' early years, the book treats his congressional career as a series of position papers, detailing why he sponsored this measure or opposed that one. Here's a sample: "After much study I drew up an amendment to the revenue-sharing legislation of 1970, conditioning it on a good-faith state effort to draw up a long-term modern governments plan. I set this forth at length in my 1970 book, Revenue-Sharing: Crutch or Catalyst for State-Local Government?"
Makes you want to read more, doesn't it? Seldom has a book this short been this tedious.
Despite its brevity, When Government Was Good shows blatant signs of padding. There are many lengthy block quotations from long-ago speeches and articles, along with the typical staged congressional photos (on numbered pages, not inserts). When Reuss runs out of old bills to talk about, he pontificates on current issues.
Reuss takes credit for many things, including terms in common usage. He recalls appearing before the Ways and Means Committee to talk about tax loopholes. He says he is still proud of his elaborate exhibit, "which I called my Flow Chart." Reading that line, I couldn't help picturing Dr. Evil doing two-finger quotation marks around "flow chart."
There are some items for which Reuss conveniently neglects to take credit. He devotes a chapter to the House Banking Committee, which he chaired from 1975 through 1980. (The part where he assumes the chairmanship is titled "The Banking Committee Reborn.") While mentioning the 1980 banking law, the chapter overlooks its key provision, which increased federal insurance on savings and loan accounts. By signaling the operators of shaky thrifts that taxpayers would bail them out from bad investments, the law triggered the S&L debacle of the 1980s.
Reuss tells how he led a minor investigation of Watergate but leaves out his major proposal for ending the controversy. On June 1, 1973, he wrote an article urging Nixon and Agnew both to resign so that Democratic House Speaker Carl Albert could become president. (Agnew's own ethics scandal had not yet come to light.) During the Lewinsky affair, Democrats accused the GOP of attempting a "coup," but never did Republicans dream of something as nakedly partisan as Reuss' proposal.
Reuss shows us just how mean-spirited a liberal can be. He applauds a 1998 stock market downturn because it may have cost the affluent "enough wealth and income to measurably decrease inequality." Apart from its sheer nastiness, that sentiment is also nonsensical. More than 40 percent of American households own stock, either directly or indirectly: Market crashes hurt the not-so-rich types whose pension funds lose value. Fortunately, stocks recovered after Reuss wrote that line.
One brief anecdote is revealing, about both Reuss and other politicians. An ardent supporter of the St. Lawrence Seaway in the 1950s, he was disgusted to learn at a field hearing that some of his constituents favored private-sector financing. He recalls approvingly that a Democratic colleague slipped him a note saying, "You're too good for these people!"
"Too good for these people": There's a sentiment that politicians often harbor but rarely utter. Elected officials contract arrogance the way coal miners get black lung disease: just by breathing the air for too long. Without meaning to, Reuss helps us understand how government goes bad.
Contributing Editor John J. Pitney (jpitney@mckenna.edu) is associate professor of government at Claremont McKenna College. His Web page is faculty.mckenna.edu/jpitney.
Quick: What exactly did George Stephanopoulos do in the Clinton White House? He hosted the daily press briefing for a few months, but he botched it so badly that Clinton handed the job to Dee Dee Myers. So how did he spend the rest of his tenure?
Don't feel bad if the answer escapes you. Stephanopoulos himself seems confused about what he was supposed to be doing, a problem that has apparently plagued him for years. In 1989, he signed on with Rep. Richard Gephardt (D-Mo.), then the House majority leader. "My new job was as exciting as I expected," he writes, "even though I couldn't explain exactly what it was."
Two years later, he joined the Clinton presidential campaign: "Although my duties were not defined with precision, I didn't press for clarification." After victory, the Clintons told him that he would have a place in their White House, and that he would keep on doing what he did in the campaign. "Exactly what they meant would be worked out later," he writes. "For the moment, it was just nice to hear." When Clinton dragged him from the briefing room, he says, "my new title was the nebulous 'senior adviser for policy and strategy.' Only time would tell what it would mean." Clinton reassured Stephanopoulos that he wanted him by his side, but when Leon Panetta became chief of staff, fears of unemployment strafed his mind, "which meant that I needed a job description beyond 'by my side.' "
All too typical. Washington teems with young people sporting such titles as "assistant," "deputy," and "liaison," all of them hoping to become George Stephanopoulos. They spend 60-hour weeks churning out "policy initiatives" and writing frantic memoranda on "damage control," never reflecting that most of the damage comes from all those initiatives. If they'd all just go home, the government would shrink, the storm would clear, and their mental health would improve. Instead, they settle for Band-Aids: Stephanopoulos grew a beard to cover the stress-induced hives on his chin.
It wasn't just the workload that was making him tense. Anybody in the Clinton White House ran a high risk of legal trouble, and Stephanopoulos was no exception. During the Whitewater probe, he griped to a Treasury official about a Resolution Trust Corporation investigator's GOP connections, wondering aloud if there was a way to fire him. Unknown to Stephanopoulos, the official was keeping a diary, which later fell into the hands of the Office of the Independent Counsel and the Senate Banking Committee. The journal entry did not show any criminal wrongdoing, Stephanopoulos says, but it did cause him some tense and embarrassing moments.
The incident undoubtedly reminded him of what was already conventional wisdom among Washington insiders: Diaries are subject to subpoena, so never keep one. According to press accounts of Stephanopoulos' book deal, he abided by that rule throughout his White House years.
A question arises. Although Stephanopoulos surely has a fine memory, it is implausible to think that he summoned 456 pages of detail and dialogue from the recesses of his brain. It is hard enough to reconstruct a conversation from yesterday, much less seven years ago. So without a diary, where did he get all this material?
In his acknowledgments, Stephanopoulos says he refreshed his recollections by speaking with other participants and having interns research the public record. He deserves credit for his diligence, but the product is a book that is short on fresh information.
For example, he tells of a 1992 focus group where hand-held meters showed a negative response when Hillary Clinton appeared on videotape. He recalls Bill Clinton showing both spousal concern and a capacity for denial by saying: "Oh, man, they don't like her hair." When Newsweek ran excerpts from the book, its editors thought this anecdote was newsworthy enough to merit inclusion. Apparently, they forgot that it had appeared five years ago in All's Fair, the joint memoir by Mary Matalin and James Carville.
In fact, whenever Carville appears, Stephanopoulos' account tracks closely with the earlier book. Their urgent phone conversation about Gennifer Flowers, Hillary Clinton's role in naming the now-famous War Room, the tearful farewells at the final War Room meeting–in each case, Stephanopoulos offers the same story as Carville, with only slight variations.
There are other stale crumbs. During the 1993 budget debate, Clinton called Sen. Bob Kerrey (D-Neb.), who was on the verge of breaking with the White House line. Stephanopoulos remembers Clinton's increasingly heated responses, climaxing in an obscene Anglo-Saxonism. Bob Woodward published this conversation (including Kerrey's side) in his 1994 book The Agenda. Woodward, of course, probably got his information from Stephanopoulos in the first place–but that doesn't help the reader in search of something new.
In fact, by talking to so many reporters during his tenure, Stephanopoulos repeatedly scooped himself. In All Too Human, he tells of his anguish over Clinton's support of the death penalty–an anguish he aired three years ago in a Boston Globe interview. He offers a good account of Clinton's 1991 pre-primary skirmish with Mario Cuomo over welfare. But he had already related the episode to Joe Klein, who wrote it into his novel Primary Colors–narrated by a character based on Stephanopoulos.
In a weird, postmodern twist, Stephanopoulos seems to draw even from his fictional alter ego. During the 1992 campaign, he says, his own doubts about Clinton's character merely increased his determination to fight even harder: "Now I was a true believer." Joe Klein's "Henry Burton" put it this way: "True Believerism…I was caught up in the thing. I had no perspective."
The Clinton people are true believers in their own righteousness–and in the basic evil of anyone with a different viewpoint. When a Republican House candidate won an upset victory in a 1994 special election, Clinton said: "It's Nazi time out there. We've got to hit them back." Stephanopoulos shares this attitude, describing criticism of federally funded midnight basketball as "a fiscally conservative stance with a racist subtext."
Stephanopoulos now suggests that he has lost some of his faith in Clinton–but only because his lies got too gross to ignore. Never in this book does he seriously question his beliefs about the role of government, or even acknowledge that the issue might be subject to debate. There's plenty of introspection here, but it's all petty. Obsessing about whether he was up or down, he forgets to ponder whether he was right or wrong.
Contributing Editor John J. Pitney Jr. (jpitney@mckenna.edu) is an associate professor of government at Claremont McKenna College.
And Jimmy Carter is running for president again–only this time, his name is Elizabeth Dole.
It stands to reason. In Watergate's wake, people wanted the most un-Nixonian leader possible, and they found him in a Georgia peanut farmer who spoke of love, bragged of his status as a Washington outsider, and could say with a straight face, "I'll never lie to you." A similar phenomenon is at work today. Although Clinton's job approval numbers soared throughout Monicagate, Americans correctly judged his character to be the moral equivalent of a toxic waste dump. So in place of the wizard of id, they might be seeking someone who is utterly self-controlled–a leader who would again make the White House safe for interns.
Who better than Elizabeth Dole? Not only is she a woman, but it is hard to picture an immoral thought taking root beneath her hair helmet. Like Carter, she sports a smarmy manner and humorless grin that spell "the opposite of sex."
At this point, you may be wondering whether the Carter comparison is overdrawn. Like Carter in 1976, Dole can claim moral superiority to the winner of the last election. But in 1999, so can every other candidate. In fact, so can nearly everyone else on this planet, with the possible exception of Charles Manson. And come to think of it, Manson hasn't bombed any aspirin factories lately.
In Dole's case, however, the similarity to Carter goes beyond the ability to surpass a pitifully low moral threshold. It is with words that presidents govern, and her words bear a creepy likeness to Carter's.
Start with slogans. After Dole filed with the Federal Election Commission, her homepage (www.edole2000.org) featured these words: "The United States of America deserves a government worthy of its people." That's mighty close to Carter's promise of "a government as good as its people." In her announcement speech, she said, "We're beginning to lose faith in our own institutions. It's only a short step to losing faith in ourselves, and then we would be lost." She added that we "must renew faith in the goodness of our nation."
In 1976, Carter told the California Senate, "If I had to sum up in one word what this campaign is all about, the word would be faith. The American people want to have faith in their government. And it is our responsibility, as public men and women, to do everything in our power to help them regain the faith that they have lost."
At first glance, these phrases sound nice enough, fitting with the give-'em-heaven religiosity we associate with both Carter and Dole. But that's also a problem. The notion that we're all basically good clashes with the concept of original sin. As the Gospel of Matthew puts it, "No one is good but One, that is, God."
Theology aside, an emphasis on individual goodness implies that the main challenge of politics is not to limit government but to find angelic leaders who will use that power for good ends. "If angels were to govern men," wrote Madison, "neither external nor internal controls on government would be necessary." Since the political world is short on angels, he reasoned, we need strict checks on what government can do.
Madisonian ideas took a beating in the 1970s, just as Carter and Dole were coming into prominence. In the Georgia governorship, Carter enlarged government as much as the state's political culture would allow; and when he became president, he signed the bills establishing the Departments of Energy and Education. In the White House Office of Consumer Affairs and at the Federal Trade Commission, Dole favored increased government regulation and even wanted a Consumer Protection Agency.
Both went so far as to vouch for the virtues of bureaucrats. During the 1976 primary campaign, Carter's rival Jerry Brown criticized government administrators. Carter replied, "That is wrong. I have seen at first hand that most government employees want to do a good job." In her 1988 joint autobiography with Bob Dole, Elizabeth fondly recalled her FTC years: "I believe now what I believed then. Perhaps no one is more unjustly maligned than the bureaucrat."
REASON reader, I hope you weren't sipping coffee while reading that line. If you were, you probably spit it all over the magazine. Take a minute to clean up.
OK now? Back to Dole. Lest you think I got her words wrong, check pages 152-153 of the hardcover edition of The Doles: Unlimited Partners. You'll find that it gets worse: "I tell youthful audiences they can find no higher calling in life than that of the public service." What about the priesthood, the ministry, or the rabbinate? Heck, what about running a business and creating jobs?
"They may not get rich but they'll enrich the lives of countless others." Yeah, like trial lawyers and agribusiness lobbyists. "Because of them, the world is a little better." As Clinton would say, that depends on what the meaning of the word better is. If it includes frustration and impoverishment, she may be right.
If not bureaucrats, who causes policy problems? Carter pointed to special interests, saying that otherwise good people get selfish when they undertake political action. "Schoolteachers love their students," he said, "but when they organize and hire a lobbyist to work with the legislature, those lobbyists don't care anything about the students." Of course, that fine sentiment didn't stop him from promising to set up the Department of Education in return for teacher union support.
Dole is just as consistently inconsistent. In her campaign announcement, she said government is "paralyzed by special interests." Huh? In the Reagan White House, she headed the Office of Public Liaison, whose sole purpose was to cater to special interests. And then she headed two of the most clientele-serving Cabinet departments: Transportation and Labor. After all those experiences, plus four national campaigns at Bob Dole's side, she had the chutzpah to add, "I'm not a politician, and frankly today I think that may be a plus."
Carter portrayed his devotion to government as pragmatism, not ideology. "In the last analysis," he said in Los Angeles on August 23, 1976, "good government is not a matter of being liberal or conservative…. We want both progress and preservation." Dole plies the same murky waters. In a New Hampshire speech, she said: "Most Americans prefer solutions to sound bites"–which was her sound bite for the evening news. "This makes us naturally suspicious of what I call either/or politics: Liberal vs. conservative. Public school vs. private school. Us vs. them." Or in the immortal words of Yogi Berra: "When you come to a fork in the road, take it!"
The story isn't all bad. Just as Jimmy Carter helped deregulate airlines, Dole oversaw the privatization of Conrail. Moreover, she decries the high level of taxation and speaks with pride of her service as a "lieutenant in Ronald Reagan's army." So despite all the bad signs, might she be a closet free market type?
Nah. For any GOP candidate, damning taxes and blessing Reagan are obligatory gestures, more etiquette than ideology. With Dole, you have to think about the words you don't hear. Ponder her statement on education: "I regard public education as one of the glories of American democracy. Which is precisely why the number one priority of any education reform must be this: to restore our public schools to greatness." Would a real free-market Republican have omitted any mention of private or parochial schools?
At the Transportation Department, Dole's best-known accomplishment was a rule requiring that every new car have an air bag or automatic safety belt. Just imagine if she had headed the Department of Health and Human Services: All refrigerators would come with a "Liddy Light" that would flash if we got too much ice cream.
No wonder a former aide once told Fortune magazine, "She's progressive at the core." If she becomes president, we'll probably get what we got in the 1970s: a pro-government agenda clad in polyester fuzzwords.
Dole has one last similarity to Jimmy Carter: the persistence of a childhood nickname. Though she reportedly hates people to call her "Liddy," she's probably stuck with it, at least for the duration of the campaign. "Elizabeth Dole" is a bit long for buttons and bumper stickers. In light of recent costume movies, "Elizabeth" would summon up images of Cate Blanchett or Judi Dench. "Dole" would make people think of Bob, not Elizabeth. And "E.D." would not do, since the initials also stand for "erectile dysfunction." Thanks to a current public service ad campaign for Pfizer, that would also remind people of Bob.
Contributing Editor John J. Pitney Jr. (jpitney@mckenna.edu) is an associate professor of government at Claremont McKenna College.
To understand the link, think back to 1990. At the time, Gingrich was the House Republican whip and general chairman of GOPAC, a committee for the training of Republican candidates. As the midterm congressional campaign got under way, GOPAC issued "Language: A Key Mechanism of Control," a linguistic guide for those who pleaded, "I wish I could speak like Newt."
The guide consisted of two lists, both of which grew out of focus group research. "Optimistic Positive Governing Words" such as opportunity, challenge, and commitment would help Republicans define their own vision. "Contrasting Words" such as crisis, threaten, hypocricy [sic], and ideological would help them define their opponents.
After Democrats attacked the guide as cynical and demeaning, Gingrich quickly disowned it as an aide's mistake. But in spite of their public indignation, the Democrats adopted its central idea: that language is indeed a mechanism for shaping the way people think about politics. Sometimes they openly acknowledged their intellectual debt to Gingrich. At a 1995 political retreat, Democratic senators and staff received an information packet that included the GOPAC document.
One reason the Democrats have done so well lately is that they have mastered both lists. These days, any speech by a Democratic politician will contain long stretches of "optimistic positive governing words," along with a few verbs and prepositions that give the illusion of thought. A typical passage sounds like this: "We have a precious opportunity to preserve our commitment to our families and protect the dreams of our children." Thanks to the list, speechwriters do not have to worry about order and logic. String the words together in another sequence, and they sound just as good: "Our families have precious dreams for our children, so we must preserve and protect our commitment to opportunity."
While these fuzzwords are dulling the listeners' capacity for critical thinking, the "contrasting" words are calling forth demonic images. Again, the Democrats have relied heavily on Gingrich's list, making adept use of such standards as greed, selfish, and intolerant. They've also made some additions, including mean-spirited and, of course, extremist. In the 1998 campaign, Sen. Barbara Boxer (D-Calif.) got away with applying the "extremist" label to GOP challenger Matt Fong, who is about as wild-eyed as Mr. Rogers. And in nearly every race, Democrats linked the Republican candidate to the king of the "extremists," Gingrich himself. As George C. Scott said in Patton: "Rommel, you magnificent bastard, I read your book!"
The political use of language involves more than the creation of positive or negative feelings. A virtuoso of the art will use wordplay to redefine the very terms of discussion. Here is where the Democrats have outdone Gingrich.
Seldom anymore do they say, "We need to increase spending on federal domestic programs." Instead, they follow the lead of President Bill Clinton, who praised last fall's omnibus spending bill for making "critical investments in education and training." Not once in his statement did he use the word spend. Clever trick: "Spending" connotes loss, whereas "investment" implies the expectation of profit. Thus his words powerfully suggested that federal programs will pay for themselves–and then some–by making Americans more productive. Though the evidence points in the opposite direction, the language puts a daunting burden of proof on anyone who would balk at "investing" in workers and children.
Clinton's statement on the spending bill also said that it "added resources to protect the environment, to move people from welfare to work," and to do other nice things. That's typical: Spenders like to describe budget figures as "resources." When we hear the word, we tend to think vaguely of things natural and renewable–like pine cones. If you want to see how specious this language is, just try paying the IRS with pine cones.
Even more Orwellian is the voguish phrase "bill of rights." The term originally meant a set of strict limitations on federal power, but now the statists have applied it to proposals for increasing Uncle Sam's intrusiveness. The "Patients' Bill of Rights Act," for example, would give Washington tighter control over managed-care plans. We've come a long way from "Congress shall make no law respecting an establishment of religion" to "The Secretary shall specify (and may from time to time update) the data required to be included in the minimum uniform data set under subsection (a) and the standard format for such data."
To regain the political initiative, free market advocates need to regain control of the political vocabulary. We should replace verbal smog with the crisp language of truth.
Take the word surplus, which the dictionary defines as "that which remains above what is used or needed." What do we do with surplus wheat or cheese? We give it away to the needy. So by extension, it apparently makes sense to donate the budget surplus to federal social programs–sorry, "critical investments."
Washington Post columnist (and a Reason Foundation trustee) James K. Glassman suggests that we break out of this mind-set by referring to excess revenues as "the overcharge." Unlike surplus, this term clearly indicates that the money really belongs to the people who paid it, and that they deserve a refund. If the language of "surplus" favors spending increases, the language of "overcharge" favors tax cuts.
Likewise, we should stop talking about the Social Security "Trust Fund." The system does not maintain a real trust fund, but instead a pile of Treasury securities–IOUs from the federal government. So let's call it what it is: "The IOU Account." Politicians can give florid speeches about protecting the "Trust Fund," but they could not muster the same eloquence on behalf of the "IOU Account."
We can take this approach to social issues, too. In recent years, the defenders of color-blind justice have made some progress in substituting the descriptive preferential treatment for the deceptive affirmative action. But the term does not go far enough in emphasizing that these schemes are zero-sum games: Advantaging certain groups always means disadvantaging the rest. Therefore, we should refer to such policies as "discrimination against Italians, Armenians, and Jews, among other groups." Try it: It drives the bad guys nuts.
By making a conscious effort to reform political language, free market advocates will be rejoining an old fight. In his 1946 essay "Politics and the English Language," Orwell cataloged some of the linguistic swindles and perversions that had long served powerful people. He wrote that "one ought to recognize that the present political chaos is connected with the decay of language, and that one can probably bring about some improvement by starting at the verbal end."
The books did not have to explicitly engage in prediction: Indeed, often the most prescient works are those that identify something about current, or even past, conditions that turns out to be important in shaping the future. Conversely, many a book has been written on a current movement or "crisis" that fizzled.
Walter Truett Anderson
I don't much like the title of Jean-Marie Guéhenno's The End of the Nation-State (University of Minnesota Press, 1995, translated by Victoria Elliott), partly because I don't think the nation-state's end is anywhere on the horizon. Nor, for that matter, do I agree with many of the specific propositions in this quirky, idiosyncratic work. Nevertheless, I think Guéhenno captures the essence of what is and will be going on in the world much better than Samuel Huntington's much-discussed The Clash of Civilizations. Guéhenno, who recently served as France's ambassador to the European Union, argues that the kind of world now emerging is in a way more united and at the same time without a center, that fixed and definite territories are becoming largely irrelevant, and that networks are becoming more important than 19th-century institutions of power: "We are entering into the age of open systems, whether at the level of states or enterprises, and the criteria of success are diametrically different from those of the institutional age and its closed systems. The value of an organization is no longer measured by the equilibrium that it attempts to establish between its different parts, or by the clarity of its frontiers, but in the number of openings, or points of articulation that it can organize with everything external to it."
Conversely, there is much in Fritjof Capra's The Turning Point (Simon & Schuster, 1982) that I do agree with–his insistence on the importance and value of ecological thinking, feminism, and the human potential movements of the 1960s and '70s–at the same time that I heartily reject his simple pronouncement that all those hold The Answer and are destined to build the civilization of the future. It is a simplistic model of change, and such thinking historically proves not only wrong but dangerous. Capra reveals those dangers in his cheerful willingness to impose New Age agendas on the bad guys with the full power of the state: Nationalize oil, restructure society, do whatever is necessary to get everybody thinking right. In a fairly typical passage of policy recommendations he writes: "An important part of the necessary restructuring of information will be the curtailing and reorganization of advertising….legal restrictions on advertising resource-intensive, wasteful and unhealthy products would be our most effective way of reducing inflation and moving toward ecologically harmonious ways of living."
Finally, I nominate Susantha Goonatilake's Merged Evolution (Gordon and Breach, 1998) as a useful peek into the future. Goonatilake (of Sri Lankan birth, now based in the United States) brilliantly explores the interactions among what he calls three different "lineages" of evolution–biology, culture, and artifacts–and focuses on the capacity of information and communications technologies to, as he puts it, "cut a swath through the biological and cultural landscape as no other technology has hitherto done." He does not explicitly address the political and organizational themes that Guéhenno raises, yet his analysis provides a good understanding of how we are entering into an age of open systems–not only political systems but also biological ones–and also why narcissistic visions of green goodness are an inadequate guide to the 21st century.
Walter Truett Anderson is a political scientist and author of Future of the Self (Tarcher/Putnam).
Ronald Bailey
In the late 1960s and early '70s a plethora of terrible books about the future were published. In 1968, Paul Ehrlich published the neo-Malthusian classic The Population Bomb (Ballantine). "The battle to feed all of humanity is over," he notoriously declared. "In the 1970s the world will undergo famines–hundreds of millions of people are going to starve to death in spite of any crash programs embarked upon now." Ehrlich was far from alone. In 1967, the Paddock brothers, William and Paul, asserted in Famine 1975! (Little, Brown) that "the famines which are now approaching…are for a surety, inevitable….In fifteen years the famines will be catastrophic." In 1972, the Club of Rome's The Limits to Growth (Universe Books) suggested that at exponential growth rates, the world would run out of gold by 1981, mercury by 1985, tin by 1987, zinc by 1990, petroleum by 1992, and copper, lead, and natural gas by 1993. The end was nigh. The modern heirs to this strain of doomsaying include Lester Brown at the Worldwatch Institute and Vice President Al Gore.
But the silliness was not confined to the environmentalist front. Take a look at John Kenneth Galbraith's 1967 paean to planning, The New Industrial State (Houghton Mifflin), in which he asserted: "High technology and heavy capital use cannot be subordinate to the ebb and flow of market demand. They require planning and it is the essence of planning that public behavior be made predictable–that it be subject to control."
Galbraith, too, has heirs–most notably, Robert Reich and Lester Thurow. In his 1980 book The Zero-Sum Society (Basic Books) Thurow suggested that "solving our energy and growth problems demand [sic] that government gets more heavily involved in the economy's major investment decisions….Major investment decisions have become too important to be left to the private market alone." Thurow ended with this revealing claim: "As we head into the 1980s, it is well to remember that there is really only one important question in political economy. If elected, whose income do you and your party plan to cut in the process of solving the economic problems facing us?"
Ultimately, the neo-Malthusians and the zero-summers are pushing the same egalitarian agenda: Stop growth and then divvy up the static pie.
Fortunately, a far more prescient and optimistic intellectual tradition opposed the melancholy millenarians. In 1967, Herman Kahn, the founder of the Hudson Institute, published The Year 2000 (Macmillan), in which he laid out a variety of scenarios for the next 33 years. Today's U.S. GDP is at the low end of Kahn's most likely scenario. (Hudson Institute analysts claim that the breakdown of American public education is to blame for this lower-than-expected GDP.) But Kahn was spot on when he predicted the flooding of women into the work force, the growing equality between the races in the United States, and the boom in Japan. Later, in direct contrast to Thurow, Kahn published The Coming Boom (Simon & Schuster, 1982), which predicted the enormous economic growth that the United States has experienced since then. Kahn also pleaded for the reestablishment of "an ideology of progress":
"Two out of three Americans polled in recent years believe that their grandchildren will not live as well as they do, i.e., they tend to believe the vision of the future that is taught in our school system. Almost every child is told that we are running out of resources; that we are robbing future generations when we use these scarce, irreplaceable, or nonrenewable resources in silly, frivolous and wasteful ways; that we are callously polluting the environment beyond control; that we are recklessly destroying the ecology beyond repair; that we are knowingly distributing foods which give people cancer and other ailments but continue to do so in order to make a profit.
"It would be hard to describe a more unhealthy, immoral, and disastrous educational context, every element of which is either largely incorrect, misleading, overstated, or just plain wrong. What the school system describes, and what so many Americans believe, is a prescription for low morale, higher prices and greater (and unnecessary) regulations."
Kahn turned out to be right about the boom, but most of the intellectual class is still burdened with an anti-progress ideology which remains a significant drag on technological and policy innovation.
As for the future, Kahn's Hudson Institute colleague Max Singer is one of the surest guides. If you want to know what the next 50 years will look like, check out Singer's Passage to a Human World (Hudson Institute, 1987). He makes the often overlooked point that poor people in the developing world can see their futures by looking at our present. And because poor countries have a road map to development, they will be able to grow wealthier and healthier much faster than we did.
One of the important legacies of Kahn and Singer is the insight that a bright future depends on first believing that such a future is possible. Overcoming the pervasive pessimism of the intellectual class is a major piece of work left for us to do in the 21st century.
Contributing Editor Ronald Bailey is editing Earth Report 2000: The True State of the Planet Revisited, which will be published by McGraw-Hill next spring.
Gregory Benford
Surely the most infamous attempt to predict the economic future was the Club of Rome's The Limits to Growth (Universe Books). In 1972 the authors could foresee only dwindling resources, allowing for no substitutions or innovation. The oil shocks soon after lent their work credence, but markets have since erased their gloomy, narrow view of how dynamic economies respond to change. A famous bet over the prices of metals decisively refuted the central thesis of The Limits to Growth in 1990: The prices had fallen in real terms, contrary to the Club of Rome's predictions.
Rather than looking at the short run, and getting that wrong, consider peering over the heads of the mob to trace instead long-run ideas that do not necessarily parallel the present. An example of this is J.D. Bernal's The World, the Flesh and the Devil (long out of print but available online at physserv1.physics.wisc.edu/~shalizi/Bernal), which examined our prospects in terms that seemed bizarre in 1929 but resonate strongly today: engineered human reproduction, biotech, our extension into totally new environments such as the deep oceans and outer space. This slim book found its proper and continuing audience long after its first edition went out of print, and among hard-nosed futurologists it is still considered a neglected masterpiece.
To my mind, the best way to regard the future is to listen to scientists thinking aloud, making forays into territories seldom explored in our era of intense narrowness. Prediction is speculation, but it often arrives well-disguised. Sometimes it is a brief mention of a notion awaiting exploration, as when James Watson and Francis Crick alluded, in the last sentence of their paper reporting the discovery of DNA's double helix, to the implications for reproduction: "It has not escaped our notice that the specific pairing we have postulated immediately suggests a possible copying mechanism for the genetic material."
In a similarly laconic vein, consider a slim tome of stature comparable to Bernal's, Freeman Dyson's Imagined Worlds (Harvard University Press, 1997). Dyson in his lofty view shares an advantage with science fiction writers. Both are good at lateral thinking–the sideways swerve into future scenarios justified not by detail but by their intuitive sweep. Refusing to tell us how we get to their visions, Dyson and others take in a wider range of possibility than the hampered futurologists. "Science is my territory," he writes, "but science fiction is the landscape of my dreams."
In five admirably concise essays, Dyson takes us from the near past to the far future. Science and technology must progress through trial and error, he observes, while politics seeks to mandate outcomes and thus brings disasters. Dirigibles lost out to airplanes partly because of political agendas imposed upon them. The British Comet jetliner failed because management forced fast engineering results. Technology, then, must evolve through its own Darwinnowing.
Necessarily Dyson favors small science ("Tolstoyan") over big projects ("Napoleonic"). In our era of dwindling budgets, this seems the winning view. Luckily, small science governs in the crucial fields of neurobiology and microbiology, which will shape the next century. Attempts to impose big agendas on biology should be resisted.
Cleanly written, elegant in insight, this reflection by one of the great scientist-writers of our time beckons us to the far horizons of the human experience. Such vistas are more interesting, more inspiring, and ever more useful than the short views of the Club of Rome.
Contributing Editor Gregory Benford is a professor of physics at the University of California, Irvine. His most recent novel is Cosm (Avon Eos).
K. Eric Drexler
It was a great surprise when I realized that Robert Ettinger's The Prospect of Immortality (Doubleday, 1964) had actually made a sound argument. In the early 1970s I had heard of its thesis–that low temperatures could stabilize otherwise hopeless patients for later repair–but this looked like a technical impossibility. Cells often revive after freezing, but never whole mammals.
This observation, it turns out, is beside the point. By the late '70s, it had become clear that the onrush of molecular technologies would one day lead to thorough control of the structure of matter, including molecular machine systems able to examine and repair biological structures molecule by molecule. Suddenly Ettinger's case made sense. When I finally read his book, I found that he had anticipated molecular-level repair of the sort we now see how to develop. "Can mammals revive from freezing spontaneously?" is the wrong question, once molecular biorepair enters the picture. For us, the key question is instead, "Does freezing somehow erase the information content of the brain?"– which it clearly doesn't.
This idea of long-term, low-temperature first aid, a.k.a. biostasis, is catching on, especially among those who think of their minds as information processes. Watch for those medical bracelets (freely translated, "In case of system crash, do not discard; call…."), especially up here in Silicon Valley. The San Jose Mercury News has called this a "Silicon Valley trend."
Soon after realizing that biostasis would work, I came across an impressively false work of prediction: Entropy, by Jeremy Rifkin (Viking, 1980). It explains that our world is doomed and that human action must be severely limited, due to the inevitable "dissipation of matter" described by the Fourth Law of Thermodynamics. Senators, academics, and futurists endorsed the book, but it turns out that this "Fourth Law" isn't in the textbooks and is simply false, making the work an edifice of the purest piffle. Rifkin later fuzzed his justifications, but his call for salvation through oppression stays clear.
In their discussions on the future of technology, authors Chris Peterson and Gayle Pergamit observe: "If a thirty-year projection 'sounds like science fiction,' it may be wrong. But if it doesn't sound like science fiction, then it's definitely wrong."
Keep that in mind when you read Marc Stiegler's forthcoming Earthweb (Baen Books, April 1999), a novel that plausibly portrays a key part of the future. Stiegler sketches what the Web can become when it grows up–a fast, flexible marketplace of ideas and reputations. He combines the "idea futures" work of Robin Hanson with the "information markets" work of Phil Salin to depict a new and productive spontaneous order. The infoworld Stiegler describes may arrive in the next 10 years, soon enough to shape much of the next 30.
Imagine what the world might be like if good ideas more consistently won and bad ideas more consistently lost. Better media and incentives can help.
K. Eric Drexler is chairman of the Foresight Institute (www.foresight.org) and author of Engines of Creation (Doubleday) and Nanosystems (Wiley).
Charles Paul Freund
More nonsense has been written about television than about anything else in the last 30 years, perhaps in the whole of human history. TV, while indisputably reordering life, has purportedly made its audience stupid, inattentive, illiterate, violent, and worse. Choosing the single most ridiculous book on the effects of TV is a challenge, but Jerry Mander's 1978 rant, Four Arguments for the Elimination of Television (Morrow), is in a class of its own.
Between the covers of this one book are assertions that TV: is a form of sensory deprivation; is addictive; "trains people to accept authority"; is physically unhealthy; suppresses the imagination; hypnotizes its public; is the equivalent of "sleep teaching"; "redesign[s] human minds into a channeled, artificial, commercial form"; and much, much more. Mander wanted TV banned–literally banned–because it was turning people into passive zombies who would do anything they were told. Was he right? Ask the broadcast networks.
James B. Twitchell doesn't like TV much more than Mander does, but he understands it a lot better. "The purpose of television," Twitchell writes in Carnival Culture (Columbia, 1992), "is to keep you watching television." How does it try to achieve that goal? By showing us whatever it thinks we want to see. "Television is where we go to be hooked," he writes. "It's our carnival."
Twitchell's book isn't about TV; it's about "taste," and what has happened to it in recent decades. According to him, what happened was the collapse of the old taste hierarchy, the empowerment of the cultural consumer, and the triumph of the "vulgar." There's great power in the vulgar, he reminds us, which is why it was once institutionalized at the margins–as in the annual medieval carnivals–by those who sought to control culture. Now, he writes, thanks to modern media and the freedom made possible by technology, the popular carnival has displaced the high church of good taste.
Where are we and such media taking each other? Sherry Turkle is investigating an interesting line of possibility. In Life on the Screen (Simon & Schuster, 1995), Turkle sees whole new forms of personal liberation becoming available online. Not only new communities of people, but new people. Life online, she suggests, makes possible a re-evaluation of identity itself.
Turkle's work concentrates on multi-user domains and the great variety of selves that users can experiment with in such environments. This is a new dimension to our relationship with media and with technology, she argues, one that is shifting our understanding of self, other, and machine.
Turkle may or may not be right about what will happen in the next 30 years, but she and Twitchell are both right about the power that will shape the cultural future: It belongs to media's users. Mander's idea–that such users are the dupes of a technological oligarchy–has been virtually sacralized by a dyspeptic anti-media, technophobic class. But if there's one cultural lesson that the past 30 years have to offer, it is that however much this class pouts, the future doesn't care.
Charles Paul Freund is a REASON senior editor.
Michael Fumento
Whoever said even a stopped clock is right twice a day never heard of Paul Ehrlich. This professional fearmonger switched from studying butterflies to doomsaying in 1968, when he published The Population Bomb (Ballantine Books). Among its spectacular claims: "The battle to feed all of humanity is over. In the 1970s the world will undergo famines–hundreds of millions of people [including Americans] are going to starve to death in spite of any crash programs embarked upon now."
Between the "green revolution" in plant technology, a flattening Third World population curve, and imminent population shrinkage in many industrial countries, this prediction, like every major one Ehrlich uttered, fell flat. Still, The Population Bomb remains the standard to which all gloom-and-doom writers aspire. Fortunately, Ehrlich received the ultimate confirmation of his foolishness in 1990, when he received the MacArthur Foundation "genius award."
Ehrlich is always wrong because he can't comprehend that human brains are larger than those of butterflies; that we can and do adapt. As the late Julian Simon showed in his classic 1982 book The Ultimate Resource (Princeton University Press, updated in 1996), when the going gets tough, the tough rise to the occasion and make things as good as–and usually better than–they were. At least, that's true in a free society with a relatively free market. Simon's book is the standard by which environmental myth-busting books should be measured.
Biochemist and medical journalist Alexandra Wyke's 21st Century Miracle Medicine: RoboSurgery, Wonder Cures, and the Quest for Immortality (Plenum, 1997) is very much in the Simon mold. No talk here about turning old folks into Soylent Green to feed the teeming masses. Instead, she says, advances in biotech, computers, and information technology will change medicine so drastically in our lifetimes that today's therapies may eventually be equated with witch doctoring.
Gene therapy and genetic screening, Wyke says, will tremendously reduce the incidence of cancer, cardiovascular disease, and some neurological diseases. So-called "newly emergent" infections will quickly be controlled by medicines not discovered by happenstance but designed from the ground up by supercomputers that provide information far more useful than that from rodent testing. Surgery will depend not on the steady hand and experience of the doctor but on devices such as the recently invented ROBODOC, combined with new imagery technology and computers that essentially make flesh and bone transparent in 3-D images, allowing machines to make cuts or dissolve tumors and blockages in exactly the right place.
Initially, such developments will drive up health care costs, Wyke says, but as they proliferate, cut patient recovery times, and save productive lives, they will more than pay their way. She predicts that by 2050, the average life span in developed countries will be a century. Doctors won't disappear, but their role will greatly diminish as computers and robots are increasingly used to diagnose illness, prescribe medicine, and perform surgery.
Predictions beyond 20 years are usually more speculation than science, but I suspect Wyke's forecast is on target. We are already undergoing a medical revolution, with treatments and cures coming at a pace that's furious compared to just a few years ago. Luddites like Jeremy Rifkin will have to be defeated, and death never will. But hold on to your chairs, because we are on the verge of some very exciting times.
Michael Fumento is the author of four books on science and health issues and is writing two others, including a handbook on science and health issues for journalists.
Steven Hayward
First the wrong: Jacques Ellul's The Technological Society (Alfred A. Knopf), which was first published in France in 1954 but did not really make a mark until its publication in the United States in 1964, after which it became a favorite of the highbrows within the 1960s counterculture. Ellul, one of those curious figures who made French sociology so much more interesting than the Anglo-American version of the discipline, wrote several worthy books (some of them on theology), but The Technological Society was not one of them. It promoted the commonplace view that our advance toward a high-technology future would be dehumanizing. He wondered how we would manage to control this portentous technology. That fear has turned out to be unfounded, of course, as technology empowers people and diminishes government control.
The prophetic: When Charles Murray published Losing Ground to great fanfare in 1984, a few graybeards noted that it vindicated Edward Banfield's much-despised 1968 book The Unheavenly City (Little, Brown). Banfield challenged the premises of the War on Poverty while it was still in its ideological heyday (even though ordinary citizens had grown tired of it by 1968). He argued that lack of money was the least of the problems of the underclass and predicted that government programs aimed at solving urban poverty were sure to make things even worse. By 1984, Murray was able to fill in the details of Banfield's prophecy. You can draw a straight line from Banfield to Murray to today's welfare reforms, which impose time limits on eligibility, emphasize work, and require individual responsibility.
The future: Francis Fukuyama's 1992 book The End of History and the Last Man (The Free Press) deserves a second look, in part because current troubles in the world have given Fukuyama some second thoughts about whether he was correct that liberal democracy represented the final stage in the political evolution of mankind. But the second aspect of his book–the Last Man–also should prompt some fresh chin pulling. The sluggish public reaction to President Clinton's scandals suggests that we are indeed in the condition of Last Men who care only for comfortable self-preservation; the anti-smoking crusade looks like a perfect expression of the snobbery that Fukuyama, following Nietzsche and Kojève, predicted would be the most prominent moral trait of the Last Man. And yet, might the collapse of the tobacco bill have been a turning point? Might the growing support for serious Social Security privatization prove a portent that more of us would indeed like to break out of our rut, as Fukuyama predicted at the end of his book? This might be grasping at straws, but if in fact the progress of the nanny state is not an irreversible process, there is a decent chance that the Last Man of the liberal democratic age need not be a despicable man.
Contributing Editor Steven Hayward is Bradley Fellow at the Heritage Foundation.
Charles Murray
They appeared in the same year, 1962: The Other America, by Michael Harrington (Macmillan) and Capitalism and Freedom, by Milton Friedman (University of Chicago Press). The elite's response to their publication was almost a caricature of the biases of the time. The Other America was greeted ecstatically. Dwight Macdonald's New Yorker review of it was read by John Kennedy and prompted the staff work for the War on Poverty. Capitalism and Freedom was not reviewed in any major American newspaper or newsmagazine.
How wrong can one book be? The Other America has to be a contender for the world record. Ignore the many evidentiary problems with Harrington's attempt to portray America as a society with 50 million people in poverty. The real damage was done by his underlying premise: Poverty is the fault not of the individual but of the system. Seized upon as the new intellectual received wisdom, this view drove the design of social policy for the next decade: expanded welfare programs that asked nothing from the recipients, a breakdown of accountability in the criminal justice system, erosion of equality under the law in favor of preferences to achieve equal outcomes. All were bad policies that caused enormous damage, underwritten by the assumption that people are not responsible for the consequences of their actions.
Meanwhile, Friedman got it right. In a free society, the vast majority of people can be in control of their lives. It is a free society that best provides for those who remain in need. A free society produces in reality the broad economic prosperity that Harrington sought through Marxist theory. Harrington's book is a road map for understanding just about everything that went wrong with social policy in the last 30 years; Friedman's is a road map for understanding just about everything that went right.
A book that is likely to be seen 30 years from now as anticipating intellectual trends is E.O. Wilson's Consilience (Alfred A. Knopf, 1998). I find its broad vision of a unity of knowledge, a coming together of our understandings in the hard sciences, soft sciences, and humanities, to be compelling. It is perhaps most obvious that sociologists and political scientists must reconcile their conclusions with the discoveries of the geneticists and neuroscientists, but the hard scientists have some bridges to build as well. Ever since the quantum revolution began a century ago, the image of the clockwork universe where everything could be predicted if everything were known has been breaking down. The hard sciences are increasingly one with the poet, recognizing that the universe is not just stranger than we know but stranger than we can imagine. To me, E.O. Wilson's vision of the scholarly future is not just discerning but exciting.
Charles Murray is Bradley Fellow at the American Enterprise Institute. His most recent book is What It Means to Be a Libertarian: A Personal Interpretation (Broadway Books).
John V.C. Nye
Most Mistaken about 1968-1998: Paul Samuelson's Economics (various editions, current edition from Irwin co-authored with William Nordhaus). Some colleagues are going to shoot me for this, but Samuelson's introductory textbook deserves a prominent place in a list of seminal works that completely missed the boat. This great mathematical theorist somehow managed to produce a best-selling work that enshrined activist Keynesianism as the mainstream policy instrument (excepting a few "extreme left-wing and right-wing writers," seventh edition, 1967); praised high levels of government taxation and regulatory intervention (opining that the state could play "an almost infinite variety of roles in response to the flaws of the market mechanism," 15th edition, 1995); claimed that there was little observable waste in government spending (third edition, 1955); and systematically overestimated the economic success of the Soviet Union, claiming as late as 1989 that "the Soviet economy is proof that…a socialist command economy can function and even thrive" (13th edition).
Most Far-Sighted: The Rise of the Western World, by Douglass North and Robert Paul Thomas (Norton, 1973). Selecting this book might seem like special pleading, as North is both my colleague and a good friend, but there are objective grounds for highlighting the contributions of The Rise of the Western World. This book and subsequent work by North (who shared the 1993 Nobel Memorial Prize in Economics) helped to change the academic debates about development by focusing attention on the institutions of market capitalism, particularly the rule of law, secure property rights, and the low transactions costs that are the hallmarks of well-functioning markets.
The book dismissed arguments that attributed modern economic growth to technical change as putting the cart before the horse. North and Thomas memorably argued that technology did not cause economic progress, but that technological progress was itself part of the process of modern economic growth ultimately promoted by good institutions. The failure to understand that new technology without sound markets does not produce long-lasting development led to failed policies both in the East bloc, which thought that economic growth was all about building more steel mills, and in the West, with programs designed to transfer capital and technical know-how to the Third World while paying scant attention to the price system and existing political institutions. North's work inspired renewed interest in microeconomic as opposed to macro policies of development throughout the world, and it served as the inspiration for other groundbreaking books, such as Hernando de Soto's The Other Path. North's perspective also gave him the insight to criticize conservatives who pushed for simple-minded deregulation in Russia in 1991 without taking into consideration the institutional problems facing the ex-Soviets.
Most Relevant to the Future: Mercantilism, by Eli Heckscher (Allen and Unwin, 1935). The rebirth of economic competition as political struggle, best seen in the misguided economic nationalisms of Lester Thurow and Pat Buchanan and compounded by Asia's woes, will give rise to more intense mercantilist struggles in the near future. At home, the limits to direct tax-and-spend policies do not mean that the government will shrink. Instead, efforts increasingly will be shifted to direct regulation and control through unfunded mandates of all forms. Can't have a national health program? No big deal, just require businesses to provide health care and regulate the HMOs. Heckscher forces us to confront the unpleasant fact that the struggle to form viable states inevitably will be accompanied by unproductive and obtrusive interventions. There is no better way to think about regulation, the rise of the nation-state, and the future of the West than by pondering Heckscher's magnum opus about the growth of mercantilism and the pernicious effects of bad laws on the common welfare. Sadly, it is out of print. Go to your library and read.
John V.C. Nye is an associate professor of economics and history at Washington University in St. Louis.
Walter Olson
Policy buffs have long treasured John Kenneth Galbraith as the short sale that keeps on earning: Exhume a copy of The New Industrial State (Houghton Mifflin, 1967, out of print except as audiotapes) and marvel at the resistless advent of central planning and "administered" pricing, or the inevitably declining importance of individual creativity amid the coming ascendancy of committees and bureaucracies, to name only two of the trends that have so failed to typify the past 30 years.
With the toppling of the idea of a society run from above by experts, American cities have had a chance to recover from many of the social-engineering schemes that had begun to render them unlivable. Honors for seeing the problem early might be divided between Edward Banfield (The Unheavenly City, 1970; The Unheavenly City Revisited, reissued in 1974, Waveland) and Jane Jacobs (The Death and Life of Great American Cities, Vintage, 1961). Banfield often gets pegged as a neocon and Jacobs as vaguely countercultural, which points up the dangers of facile categorization based on choosing up sides regarding the '60s: both were in fact defending natural patterns of living against authorities' efforts to superimpose artificial structures on the social (Banfield) and physical (Jacobs) forms of community. Fearless, forthright, and intensely factual to the point of reportage, both books are still terrific reads, while the involuted condescension of Galbraith's style, which so impressed the critics of his day, has worn poorly.
Of the battles ahead, few will involve higher stakes than the defense of the Enlightenment and its libertarian values from assaults of both Left and Right. It's today's most unsettling alliance-in-practice: campus-based identitarians and a prominent body of conservative intellectuals agree with each other that "tolerance" and "free speech" are meaningless or overrated concepts, that claims for the objective authority of science should be cut down to size, that official neutrality on such matters as religion and identity is a chimera, that liberal distinctions between public and private are suspect because they tend to insulate bad conduct from social correction, and so forth. (In the August 17 & 24 New Republic, Richard Wolin traces parallels between today's postmodernists and the influential reactionary theorist Joseph de Maistre, who wrote that "what we ought not to know is more important than what we ought to know.")
To see how the antirationalist right met its downfall the first time around, head back to the freethinkers' shelf for another look at Hume and Gibbon and Lecky, Darwin and Tom Huxley, Mencken and Ingersoll; at the moment I'm browsing Andrew D. White's A History of the Warfare of Science with Theology in Christendom (1896, Prometheus paperback reissue). The former Cornell president can be dated or quaint, but more often his book is packed with astonishing, colorful, and, yes, inspiring accounts of the achievement we call the modern mind, formed as one discipline after another, from meteorology to sanitation, economics to philology, epistemology to medicine, pulled free from superstition.
Contributing Editor Walter Olson is a senior fellow at the Manhattan Institute and the author of The Excuse Factory (The Free Press).
Randal O'Toole
The accurate book: Jane Jacobs's The Death and Life of Great American Cities (Vintage, 1961). After 35 years, Jacobs's book remains the best critique of urban planning–and a wonderful critique of American planning in general. The book almost single-handedly demolished the federally funded urban-renewal movement.
Jacobs was scathing in her attacks on planners. "The pseudoscience of city planning and its companion, the art of city design," she wrote, "have not yet broken with the specious comfort of wishes, familiar superstitions, oversimplifications, and symbols." When "contradictory reality intrudes" on planners' preconceived notions, they merely "shrug reality aside." Ironically, planners today use The Death and Life of Great American Cities as a textbook as they try to turn suburbs into the dense cities that Jacobs admired. They ignore her specific warnings against doing so even as they overlook her assaults on their profession.
The fall-flat book: A.Q. Mowbray's Road to Ruin (J.B. Lippincott, 1969). Road to Ruin was one of the first in an unrelenting series of attacks on the automobile, attacks that continue today in Jane Holtz Kay's Asphalt Nation (Crown, 1997) and James Kunstler's The Geography of Nowhere (Simon & Schuster, 1993). Americans would rather not drive, these books claim, but they are forced to do so by a conspiracy of auto manufacturers, highway builders, and housing developers. "The automobile population is growing faster than the human population," warned Mowbray. "Under the doctrine that the machine must be served, the highway advocates are already laying plans for an accelerated effort to blanket the nation with asphalt."
Americans today drive three times as many miles as they did when Mowbray wrote. Yet well under 2 percent of the United States has been "blanketed" with asphalt–and much of that was roads before autos were invented. Though Road to Ruin and similar books convinced U.S. governments to spend billions of dollars subsidizing urban transit, Americans stubbornly continue to drive nearly everywhere they go. Those hostile to the automobile never see the enormous benefits that the auto has produced.
The next-30-years book: Joel Garreau's Edge City: Life on the New Frontier (Doubleday, 1991). Garreau, a Washington Post writer, coined the term "edge cities" for the most recent trend in urban development: concentrations of retailing, manufacturing, and entertainment in new towns on the fringes of major urban areas. Conditioned by writers such as Mowbray, Garreau at first reacted to an edge city with horror: "It seemed insane to me. It was a challenge to everything I had been taught: that what this world needed was More Planning; that cars were inherently Evil; that suburbia was morally wrong."
Garreau soon realized that the planners' visions had no foundation in reality, while the developers building edge cities had to be in touch with what people wanted, or they would lose their shirts. Edge City is as much a celebration of the marketplace as it is a prediction of what 21st-century American cities will look like–provided government doesn't try to keep them in the 19th century.
Randal O'Toole is an economist with the Thoreau Institute and the 1998 McCluskey Conservation Fellow at the Yale School of Forestry.
John J. Pitney Jr.
In 1969, at the age of 14, I read my first serious book about elections: The Emerging Republican Majority (Arlington House), by Kevin Phillips. Its 482 pages of maps, graphs, tables, and analysis grabbed me the way a baseball almanac would fascinate a more normal kid. I've kept my copy all these years, and I recently took a close look at it again. While Phillips's subsequent books (e.g., The Politics of Rich and Poor) have been more debatable, this one got it right.
Voting power, he said, was shifting from the Northeast to the Southern and Western states of the Sun Belt–a term that he coined in this book. Since Republicans were strong in the Sun Belt, they could look forward to an advantage in presidential elections. (He made no claims about congressional races.) At the time, many commentators belittled his analysis, noting Goldwater's massive 1964 defeat and Nixon's narrow 1968 victory. Phillips won the argument. Since the publication of the book, every presidential race has gone either to a Sun Belt Republican or a Republican-sounding Southern Democrat.
A year after Phillips foresaw the shape of presidential politics, Charles Reich took on all of society. "There is a revolution coming," he said in The Greening of America (Random House). "It promises a higher reason, a more human community, and a new and liberated individual." He never defined his terms precisely, but the new "Consciousness III" apparently spurned materialism, capitalism, and competition. It also meant wearing bell-bottoms. No kidding: "Bell bottoms have to be worn to be understood….They give the ankles a special freedom as if to invite dancing on the street."
Reich pictured the America of the future as "an extended family in the spirit of the Woodstock festival." It didn't happen, thank God. At Woodstock, people who took drugs and rolled around in mud were "hippies" or "flower children." Today we call them "the homeless." Fortunately, many of the Woodstock generation grew up, got haircuts, opened money-market accounts, and joined the emerging Republican majority. Some even subscribe to REASON.
Because of Greening, Reich was the most famous professor at Yale Law School at the time Bill Clinton was a student there. Some of Reich's goopier language about idealism seeped into Clinton's rhetoric, but here's one line he won't be quoting: "To be dishonest in love, to 'use' another person, is a major crime."
What comes next? That's the question James P. Pinkerton addressed in his aptly titled 1995 book, What Comes Next (Hyperion). The government's vast "Bureaucratic Operating System," he wrote, has degenerated past the point where incremental patches can work. We need a new system that includes privatization, decentralization, elimination of failed agencies, and a fairer way to raise revenue, such as the flat tax.
That's a positive vision of the future, but there's no guarantee that it will come true. Although Republicans have praised these goals, they have lately been timid about acting on the specifics. Too bad. If supporters of free minds and free markets don't act on their beliefs, supporters of bureaucratic government will stay in charge. And the future will bear a depressing resemblance to the past.
Contributing Editor John J. Pitney Jr. is associate professor of government at Claremont McKenna College.
Robert W. Poole Jr.
One of the most prescient books of the past 30 years appeared at the end of 1968: Peter Drucker's The Age of Discontinuity (Harper & Row). At a time when the world of policy and government was dominated by the ideas of people like John Kenneth Galbraith, Drucker challenged the conventional wisdom across the board. He foresaw a half-century of fundamental change, in both the U.S. and the global economy, and in the ideas by which we attempt to make sense of the respective roles of government and the private sector, both for-profit and nonprofit. He identified knowledge as the key factor in economic growth, and he challenged governments to rethink their policies so as not to inhibit the huge changes that would be necessary as societies adjusted to the emerging knowledge-based economy–especially the revolution to be unleashed by widespread access to inexpensive computer power.
For me, Drucker's book first identified the concept of "reprivatization," calling for a fundamental rethinking of government's role (seeing it primarily as policy maker and regulator, rather than provider of goods and services). This insight was one of the critical influences that led me to research and write about privatization, and to set up what became the Reason Foundation.
The booby prize for prescience surely belongs to another 1968 volume, Paul Ehrlich's The Population Bomb (Ballantine). Evincing complete economic ignorance, combined with blindness to demographic evidence already becoming available, Ehrlich presented a Malthusian scenario under which out-of-control population growth would lead to mass starvation. He predicted that even England "will not exist in the year 2000." Despite their obvious absurdity, Ehrlich's views helped ignite today's enormously influential environmental movement.
As for the best book identifying trends that will shape the next 30 years, I want to cheat just a bit by discussing two books. The best book that sets the stage, by documenting the move away from central planning over the past decade, is Daniel Yergin and Joseph Stanislaw's The Commanding Heights (Simon & Schuster, 1998). The book's central theme is the replacement of the idea of "government knowledge" with the idea of "market knowledge," an insight the authors correctly trace to Nobel laureate F.A. Hayek.
But while The Commanding Heights is reportorial, it is not very analytical or predictive. The most profound book that examines the underlying factors and trends that will shape this country over the next several decades is Virginia Postrel's The Future and Its Enemies (The Free Press, 1998). This delightful book is an exercise in applying Hayek's insights about the dynamics of a free market and a free society to turn-of-the-millennium America. If you want to see what "spontaneous order" means when applied to the complex, high-tech world in which we will spend the rest of our lives, you should read this book.
Robert W. Poole Jr. is president of the Reason Foundation.
Virginia Postrel
Thirty years ago, conventional wisdom held that to reap the benefits of science, technology, and markets, we must deny ourselves fun. This repression theory of progress, derived from turn-of-the-century sociologist Max Weber, was just as popular in the counterculture as it was in the establishment. The only question was which side of the tradeoff you preferred.
Among the most influential expositions of this view was Daniel Bell's The Cultural Contradictions of Capitalism (Basic Books). In this 1976 book, Bell embraced the repression theory but observed that rising living standards were eroding the Puritan ethic. Capitalism, he argued, would destroy itself by encouraging hedonistic play: "In America, the old Protestant heavenly virtues are largely gone, and the mundane rewards have begun to run riot….The world of hedonism is the world of fashion, photography, advertising, television, travel. It is a world of make-believe in which one lives for expectations, for what will come rather than what is….Nothing epitomized the hedonism of the United States better than the State of California."
Bell's book became a touchstone for intellectuals on both the left and the right, so much so that a 20th anniversary edition was proudly issued in 1996. But its thesis had been falsified during the intervening decades. Far from destroying capitalism, play proved a great spur to progress and prosperity. Obsession, not repression, was the key. Nothing epitomized the trend better than the state of California.
The book that correctly captured this trend was set, however, in the Puritans' old stomping ground of Massachusetts. The cover of my paperback edition describes The Soul of a New Machine (Little, Brown and Avon) as "the phenomenal bestseller!" and it was indeed a phenomenon, winning the Pulitzer Prize in 1982. Tracy Kidder's nonfiction tale of engineers racing the clock to build a minicomputer that would "go as fast as a raped ape" alerted the literati that something big was going on among the technology nerds. "In technical creativity they have found a fulfillment that occasionally verges on ecstasy," reported The New York Times. Nowadays, that insight is hardly news. Thirty years ago, it was unthinkable.
All we can say about work 30 years hence is that it will be different. Two short stories suggest possible evolutions. Bruce Sterling's "The Beautiful and the Sublime," anthologized in Crystal Express (Ace, 1990), imagines a world in which "the ability to reason…comes through the wires just like electricity," thanks to advances in artificial intelligence. With reason cheap and abundant, markets and social life reward the aesthetic, and artistic temperaments are prized. The story's particulars are fanciful, but it provides a useful antidote to assumptions that today's economic totem pole will be tomorrow's.
Neal Stephenson's "Jipi's Day at the Office," published in the 80th anniversary issue of Forbes (July 7, 1997), similarly imagines a world in which intangible personal qualities define one's economic value. The title character has an unusual knack for soothing the irritable, a quality she applies to making her employer's resort particularly pleasant. On this day, however, she must persuade a paranoid, artificially intelligent car bomb not to blow. Not exactly fun, but the sort of wits-matching game that fits nowhere in Bell's repression thesis.
Editor Virginia Postrel is the author of The Future and Its Enemies: The Growing Conflict over Creativity, Enterprise, and Progress, just published by The Free Press.
Adam Clayton Powell III
The future is always with us, but the past is ever more so. Only the gifted can peer into the future without fixating on the rear-view mirror of what we have already put behind us.
George Orwell's 1984 (Harcourt Brace) became an instant science fiction classic, widely viewed as a glimpse into future decades. But it was far more focused on 1948 than 1984, and the world of Big Brother was in truth the world of Hitler and Stalin. The Berlin Wall and what it represented ultimately collapsed, brought down by the determined rise of empowered, free individuals. And a key tool was the spreading of information technologies, from the Xerox machine to the Internet.
The best commentary on this shift was not in the book 1984 but in a television commercial in the year 1984. The ad was produced for Apple, introducing the then-new Macintosh computer. Broadcast only once, it featured a meeting hall full of drone-like people watching a huge screen displaying an image evoking 1984's Big Brother. A young woman burst into the back of the hall, ran up the aisle toward the screen and smashed it with a huge hammer.
Enter the Mac. Exit Big Brother.
Just a year before that commercial, Technologies of Freedom (Harvard University Press) correctly anticipated today's front-page clashes of central authority and the Internet. Writing in a world of slow, clunky, text-based networks, MIT professor Ithiel de Sola Pool anticipated the rise of a high-speed Internet and its expansion into a popular medium–and resistance from central authorities.
"As speech increasingly flows over those electronic media, the five-century growth of an unabridged right of citizens to speak without controls may be endangered," he wrote at the end of the book's very first paragraph. And then he proceeded to outline cases, past and future, describing how the new technology has in its nature the seeds of freedom.
"Electronic technology is conducive to freedom," he wrote. "The degree of diversity and plenitude of access that mature electronic technology allows far exceeds what is enjoyed today. Computerized information networks of the twenty-first century need not be any less free for all to use without let or hindrance than was the printing press. Only political errors make them so."
A decade later, in Winchell: Gossip, Power and the Culture of Celebrity (Vintage Books, 1995), author Neal Gabler captured the rise of mass media and of celebrity-centered popular infotainment. Gabler was concerned not with freedom itself but with the implications of the rise of nonelite media, and he told the story through Walter Winchell. Decades before his role narrating The Untouchables television series, Winchell had been a top columnist and radio commentator, inventing the modern popular gossip column, celebrity-centered network newscasts, and even the very feel of network news, "its breathlessness, its ellipses, its abrupt shifts, its drama."
Winchell covered a broad range of people we would now call celebrities: "chorus girls, prize fighters, restaurateurs, journalists and even politicians like [New York City mayor] Jimmy Walker"–and his friend, Franklin D. Roosevelt. And Winchell wasn't even a journalist!
Gabler could have been writing about Matt Drudge or the latest Web site designer. In telling Winchell's story, he may also have written a clear look at the future of popular media.
Adam Clayton Powell III is vice president of technology and programs at the Freedom Forum.
John Shelton Reed
A book that got it wrong, huh? But there are so many, and they're wrong in so many different ways….How about something by the Chicken Little of population studies, Paul Ehrlich? One of his more fevered imaginings is The End of Affluence: A Blueprint for Your Future (Ballantine Books, 1974), but any of a half-dozen others would do as well. Ehrlich started his doomsaying career in 1968 (a big year for doomsaying) with The Population Bomb (Ballantine Books), and he's been at it so long that he has become a sort of endearing figure, the Harold Stassen of environmentalism.
Speaking of Republicans, my candidate for a book that got it mostly right is The Emerging Republican Majority (Arlington House, 1969), by Kevin Phillips. How we laughed at Phillips's title when the Republicans got creamed in 1974! But he had the underlying demographic and cultural trends spot on, as Reagan demonstrated and the 1994 elections confirmed. The only way the Democrats can elect a president now is to nominate a Southerner who talks like a Republican (at least long enough to get elected), and even so it takes a real doofus like Ford, Bush, or Dole to screw it up for the Republicans. Of course, the Republicans seem to seek these guys out.
Finally, for a book that may tell us something about the next 30 years, I'm going to play a wild card: Walker Percy's Love in the Ruins: The Adventures of a Bad Catholic at a Time Near the End of the World (Farrar, Straus & Giroux). Published in 1971, this gets a few things wrong (Spiro Agnew is mentioned as a revered elder statesman), but what resident won't recognize Dr. Tom More's Louisiana hometown, which "has become a refuge for all manner of conservative folk, graduates of Bob Jones University, retired Air Force colonels, passed-over Navy commanders, ex-Washington, D.C., policemen, patriotic chiropractors, two officials of the National Rifle Association, and six conservative proctologists"? The center isn't holding: "Americans have turned against each other; race against race, right against left, believer against heathen, San Francisco against Los Angeles, Chicago against Cicero. Vines sprout in sections of New York where not even Negroes will live. Wolves have been seen in downtown Cleveland, like Rome during the Black Plague." The Republicans and Democrats have reorganized as the Knothead and Left parties, but it hardly matters: "Don't tell me the U.S.A. went down the drain because of Leftism, Knotheadism, apostasy, pornography, polarization, etcetera, etcetera," Dr. Tom says. "All these things may have happened, but what finally tore it was that things stopped working and nobody wanted to be a repairman." Bracing stuff.
John Shelton Reed is William Rand Kenan Jr. Professor of sociology at the University of North Carolina at Chapel Hill. His most recent book is Glorious Battle: The Cultural Politics of Victorian Anglo-Catholicism (Vanderbilt University Press).
Lynn Scarlett
Two ideas dominated political philosophy in the 20th century. The first was that mankind could collectively define the "good order." The second was that mankind could bring about this order through collective planning. These ideas were not new. But in the 20th century, they took root and sprouted as grand movements, like the Soviet communist experiment, and in more modest endeavors, like city renewal projects.
By the 1960s, almost no city planner challenged the idea of "the plan." Then came Jane Jacobs. In The Death and Life of Great American Cities (Vintage, 1961), she announced, "This book is an attack on current city planning and rebuilding." Jacobs described cities as "fundamentally dynamic." She celebrated diversity. Jacobs argued that cities were problems of "organized complexity," by which she meant that cities represented the "intricate minglings of different users." Cities were, she wrote, created through the "vastly differing ideas and purposes" of different people, all "planning and contriving outside the formal framework of public action." She saw attempts to impose order, by moving people around "like billiard balls" to new locations, as destructive of the organized complexity that kept cities vibrant.
Jacobs was right. She anticipated the public housing debacles and sterile redevelopment programs of the 1970s and '80s. But her vision extended beyond city problems. She recognized and celebrated unplanned order, the importance of feedback, competition, and innovation–ideas re-entering late 20th-century political debates about environmentalism, global economies, communications systems, and technological evolution.
Rewind half a century to the writings of French philosopher Jacques Ellul, in whom we find a mental map of another sort. Where Jacobs celebrated uncertainty, complexity, and spontaneous order, Ellul, in The Technological Society (Alfred A. Knopf, 1954) and subsequent books, offered a near-incoherent mix of contempt for uncertainty and yearning for self-will and choice. Jacobs launched her views using the city as her subject; Ellul targeted technology. His thesis: Technology dominates all else. It destroys traditional social structures, imagination, and values–all that makes us human.
Ellul is, above all, what REASON Editor Virginia Postrel calls a stasist. He does not oppose technology but wants to control it. By the 1990s, Ellul was lamenting that "no one has taken charge of the system." In uncertainty, he sees a kind of disorder and amorality.
But Ellul has it wrong. The very uncertainty and change in human structures that he describes is not generating a society of mere consumers and game players. Technology is allowing new social forms, new relationships, new possibilities for human imagination, interaction, and cooperation.
Cooperation is not always–or even often–wrought by some great "we" taking charge and dictating outcomes, an insight explored by Yale law and economics scholar Robert Ellickson in his 1991 book Order Without Law (Harvard University Press). Ellickson remarks that in everyday speech we refer to "'law and order,' which implies that governments monopolize the control of misconduct. This notion is false." Ellickson, like Jacobs 30 years earlier, is an idea buster. He proposes that among human settlements, cooperation is the norm; conflict the exception. And cooperation is often achieved without recourse to law. Instead, it is achieved through ever-evolving informal rules that generate mutual gain.
Ellickson's work is empirical. He shows how real people in real communities achieve cooperation. But his thesis is playing out in a broader political and philosophical tableau in which states, regions, and cities increasingly struggle with the clumsiness of statutes and the inability of laws to deal with the complexities of human interaction and physical reality. Not all circumstances are alike, but statutes require uniformity. Informal social and economic interactions offer a more resilient, adaptive, and evolutionary response to circumstance. Widening the realms in which informal bargaining, negotiation, and trade can supplant order generated through statute is the challenge for the next century. The former builds on human cooperation; the latter reinforces propensities for conflict. Ellickson's Order Without Law anticipates the potentially deepening tensions between formal and informal orders.
Lynn Scarlett (lynns@reason.org) is executive director of the Reason Public Policy Institute.
Lessons Learned the Hard Way: A Personal Report, by Newt Gingrich, New York: HarperCollins, 256 pages, $25.00.
When they took control of Congress in 1994, Republicans promised to cut spending and transform Washington politics. Since then, they've had good days and bad days. May 22 of this year was especially rotten. On that date, the House approved a six-year, $218 billion authorization of federal highway and mass transit programs.
It's easy to measure just how bad this bill is. First, give a literal interpretation to the term "pork barrel," and reckon how much real, edible, dead-pig pork we could buy with $218 billion. Pork comes in many varieties, so which should we use? In light of how much courage lawmakers showed on this issue, the logical choice is "boneless." Supermarkets sell boneless pork for $3.39 a pound, so the highway bill equals about 64 billion pounds of pork.
That's 237 pounds for every man, woman and child in the United States. Think about that: Unless you are unusually big, your share of the highway bill exceeds your body weight.
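For the skeptical, here is a minimal back-of-the-envelope check of that arithmetic, sketched in Python; the bill total and pork price come from the paragraph above, while the population figure (roughly 271 million in mid-1998) is an assumption chosen to match the Census estimates of the day:

    # Back-of-the-envelope check of the highway bill's pork arithmetic.
    # Bill total and pork price are from the text; the U.S. population
    # is an assumed mid-1998 figure of about 271 million.
    HIGHWAY_BILL_DOLLARS = 218e9   # six-year authorization
    PORK_PRICE_PER_POUND = 3.39    # boneless pork at the supermarket
    US_POPULATION = 271e6          # assumption, not from the text

    pounds_of_pork = HIGHWAY_BILL_DOLLARS / PORK_PRICE_PER_POUND
    pounds_per_person = pounds_of_pork / US_POPULATION

    print(f"{pounds_of_pork / 1e9:.1f} billion pounds of pork")  # about 64.3
    print(f"{pounds_per_person:.0f} pounds per person")          # about 237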
No diabolical Democrats force-fed fat into a reluctant GOP. Republicans were the main culprits, gleefully turning from Jenny Craig into Orson Welles. They voted for the bill 143-56. Even the "revolutionary" Republicans first elected in 1994 supported it by a margin of 32-18.
Accordingly, the subtitle of The Freshmen is especially timely: "What happened to the Republican Revolution?" Linda Killian, a former reporter for Forbes, takes a long, anecdotal look at the House GOP class of 1994, focusing on a dozen members. She gives special attention to Van Hilleary of Tennessee, a good case study. Early in 1995, Hilleary helped lead the fight for congressional term limits–which became an embarrassing rout. Republicans wasted energy squabbling over several different versions, including Hilleary's proposal to let each state set its own limit. Many nominal supporters were quietly happy to see term limits die, since they now had the majority and were never serious about the idea in the first place.
At the time, the term limits vote seemed a rare exception: Before their 100-day deadline, Republicans won House passage of all the other elements of the Contract with America. By the following year, however, retreats were becoming more frequent. From a free market perspective, an especially troubling vote came when the House approved a 90-cent increase in the minimum wage. Meeting with constituents in 1995, Hilleary voiced the feelings of most people in his party: "I'm a free-market person, and I think the free market ought to set the wage." Yet 93 Republicans ended up voting for the increase, including Hilleary and 28 other freshmen.
What happened? Despite its subtitle, Killian's book does not supply a detailed explanation, instead putting narrative ahead of analysis. (A more systematic study is Nicol Rae's recently released Conservative Reformers.) Nevertheless, The Freshmen does contain some clues. With a good eye for detail, Killian notices that lawmakers "are yessed, kowtowed to, and generally stroked by everyone, even staffers who work for other members." Who would want to lose such treatment over a little thing like principle?
The lust for re-election can quickly turn revolutionaries into Romanovs. Take Hilleary's fellow freshman from Tennessee, Zach Wamp. (Say the name aloud: It's the sound of a phaser stunning a Klingon.) He began his congressional career as a zealous budget cutter but soon became a vocal advocate of such classic pork barrels as the Tennessee Valley Authority and the Appalachian Regional Commission.
Even when Republicans want to do the right thing, they find it hard to make their case to the public. The class of 1994 had soaked up Newt Gingrich's teachings about strategy, tactics, and rhetoric, but balked at his suggestion that they actually read a list of books. Says Killian: "Many freshmen, Hilleary among them, ignored the list, insisting they had no time for reading." Gingrich's list included Tocqueville's Democracy in America, which Hilleary had never heard of.
That says a lot. Some GOP lawmakers, such as Californians Chris Cox and David Dreier, really can engage in high-level intellectual debate. As for the rest, too many start floundering when the discussion moves beyond their staff-written talking points. It's not that floor debate directly changes roll-call votes; rather, C-SPAN and talk shows influence attentive voters. When House Republicans go on television only to face evisceration at the hands of Barney Frank, their morale plummets and they cave in.
The Freshmen would have been a more useful book if it had spent more time on the political context and less on vignettes that go nowhere. A whole chapter describes the troubles of Utah Rep. Enid Waldholtz, whose career crashed when her husband turned out to be a big fat crook. It's a sad tale, but not representative of anything else. How many other lawmakers are married to 300 pounds of criminality?
For a while, Newt Gingrich was heading for the same weight class. But as he recounts in his memoir-cum-manifesto, Lessons Learned the Hard Way, Senate Majority Leader Trent Lott gave him some chastening advice just before President Clinton's 1997 State of the Union Address. When the camera's on, said Lott, lean forward so as to minimize your girth. An embarrassed Gingrich then went on a weight-loss program and, as the jacket photo suggests, it worked.
Alas, all Washington ailments do not lend themselves to such straightforward cures. In a thoughtful introductory chapter, Gingrich acknowledges that he had underrated the difficulty of changing the federal government. The experience of the 104th Congress reminded him that a speaker must contend with a Senate that gives blocking power to the minority party and a president who wields the power of the veto. Other chapter headings point to lessons both sensible and obvious: "Pick Your Fights Wisely," "Stay on Offense," and "Learn to Keep Your Mouth Shut." Apparently he remembers the title of a 1996 book by David Maraniss and Michael Weisskopf: Tell Newt to Shut Up!
This book has weaknesses. First, it's the memoir of an active politician who must still appeal to voters and deal with the colleagues he writes about. Works in this genre lack the candor of, say, The Confessions of St. Augustine. In Gingrich's telling, he regarded last summer's abortive internal coup as a series of misunderstandings among well-intentioned people. Press accounts indicate that he was a wee bit angrier than he lets on here.
Another problem is more peculiar to Gingrich. Though he has a lively mind, he often lacks intellectual discipline, undercutting himself through overstatement. In Lessons, he talks about Washington Post stories "designed to keep us off balance" and approvingly quotes a GOP consultant's claim that media polls are "deliberately" biased against the GOP. Bias, yes; conspiracy, no. Mainstream reporters surely see the world through a liberal lens, but it's hard to prove that they are intentionally skewing their own stories.
In the same vein, he wisely advises against underestimating the skill and tenacity of Democratic politicians, then proceeds to overestimate their unity. When he depicts the Democrats as a lockstep party "committed to policies and institutions that often violate the public's sense of decency," he is indulging in caricature, not serious political thought.
His policy prescriptions are sketchy. Noting that the federal, state, and local levels of government now take up about 38 percent of our income, he proposes to set a peacetime limit on all taxes at 25 percent. "This should not, however, be done by passing a federal law," he says. "As with social security reform, I think we must have a national dialogue and build a national majority for these goals."
The book is quite specific on one policy issue: the highway bill. Though deficit concerns led him to postpone action from the fall of 1997 to spring of 1998, Gingrich calls it "a meritorious bill." He argues that the money would come not from general revenues but from a highway trust fund overflowing with proceeds from the federal levy on gasoline. That rationale raises an obvious question: Instead of spending all the money on concrete, why not slash the gas tax? Wouldn't that be a big step toward the goal of limiting the total tax take?
Gingrich suggests the answer: "[E]ven in a conservative Republican Congress the pressure for more transportation spending is enormous." He could have struck transportation from that sentence. Though his own commitment to economic conservatism has been spotty (see "The Many Faces of Newt Gingrich," February 1997), the problem goes beyond the speaker's office. Committee chairs want power, which means bigger federal programs to supervise. Backbench members want re-election, which means more pork to take home. No matter how tight-fisted a leader may be, it is hard to restrain those pressures.
Would Republicans act differently if they had bigger majorities in both chambers, along with control of the White House? Maybe, but if the highway bill is any sign, nobody should expect a revolution. That's a lesson we've all learned the hard way.
Contributing Editor John J. Pitney Jr. (jpitney@mckenna.edu) is associate professor of government at Claremont McKenna College.
Spin Cycle: Inside the Clinton Propaganda Machine, by Howard Kurtz, New York: The Free Press, 352 pages, $25.00.
In Show Time, his meandering account of the 1996 campaign, syndicated columnist Roger Simon tells how President Clinton joked about a famous Inca mummy: "I don't know if you've seen that mummy. But, you know, if I were a single man, I might ask that mummy out. That's a good-looking mummy."
In Washington Post media reporter Howard Kurtz's look at White House press operations, we learn the rest of the story. During an "off-the-record" bull session with reporters later that day, White House press secretary Mike McCurry topped Clinton's gag with one of his own: "Probably she does look good compared to the mummy he's been [having sex with]."
These two passages illustrate a difference between the two books. Clinton's mummy line is, as the White House likes to say, "old news." It appeared in dozens of press reports, and Simon's account adds nothing to what we already knew: namely, that even a 500-year-old corpse can get this guy thinking about sex. By contrast, the McCurry jab is fresh information, which itself made news when Spin Cycle came out. What's more important, the story offers insight into McCurry's style: By making a naughty side comment about the Clintons, he sought to charm White House reporters, who might thus become a tad more receptive to his pro-Clinton spin on larger matters. He's so good at his work that publication of the remark did not cost him his job, though it probably did earn him a curse from the First Mummy.
Though the lesser book of the two, Show Time does merit attention. One could easily rework it into a textbook on political writing, whose new title would be Don't Do This. The jacket notes call it a "riveting, rollicking, behind-the-curtains peek at the greatest show on earth: the modern American presidential campaign." It's more like a cheap circus that promises a wild-animal show and produces a wretched pack of mangy dogs. Simon has caged every cliché from the 1996 campaign and billed it as a revelation. Dole was an old man who lacked vision and ran a rotten campaign! Clinton was a slippery politician! Perot was a loony!
One might forgive Simon for his trite themes if he compensated with witty prose. He doesn't. A chapter on Dole's war wound bears the title "Arm and the Man." If that high-culture reference is too sophisticated for you, Simon serves up plenty of pop. "On Bob Dole's bad days, he looked like Grandpa Munster. On his really bad days, he sounded like him." To anybody who remembers Al Lewis's role on The Munsters, Simon is saying that fatigue would cause Dole to grow a huge schnoz and adopt a Brooklyn accent. Either he has an exclusive worthy of Weekly World News or he's proving that writers should give a little thought to their analogies.
Simon is sloppy about his facts. In 1992, he recalls, Governor Clinton permitted an execution, figuring that nobody ever lost votes "by electrocuting a black man." Here, his hard-boiled prose has a small crack: Convicted murderer Ricky Ray Rector died by lethal injection. In 1996, Simon assures us, Clinton did not want to become "the first president in history to be elected twice without a majority of the vote." Actually, he'd be the third, since Cleveland and Wilson had both scored that dubious achievement long before. Simon claims Republicans have an edge in the electoral college, but in both 1992 and 1996, Clinton won more than two-thirds of the electoral vote while winning less than half of the popular vote.
Where Show Time is not wrong, it's usually derivative. Its observations about Dole's mannerisms–especially his habit of punctuating sentences with a grunt that sounded like "arghh"–owe much to Richard Ben Cramer's masterful story of the 1988 campaign, What It Takes. Even the publisher's flacks cannot find new material here. In 1992, according to a passage featured in the press release, a female flight attendant on Clinton's campaign plane received instructions not to appear on the tarmac with him when photographers were present and to decline his invitations to work out with him at the Little Rock YMCA. So does Simon have a scoop? Hardly. All of this information first appeared in a July 7, 1994, Washington Post story by Sharon LaFraniere.
A couple of times, Simon does start to handle something potentially important–and then drops it. In describing Dole's first swing through California, Simon mentions the state's ballot initiative to ban racial preferences. His sketch of the issue raises the faint hope that he will chuck the trivia and analyze the role of a substantive issue in the 1996 campaign. No such luck. In Simon's telling, Dole cut short his planned endorsement of the proposition simply because bad advance work had botched the arrangements for the speech. Simon misses the opportunity to discuss racial politics and the GOP's continuing inability to articulate the principle of equal opportunity.
Simon devotes several pages to the policy concessions and ego strokes that Clinton employed to keep Jesse Jackson out of the 1996 Democratic primaries. A key gesture involved campaign help for Jackson's son, who was running for the U.S. House. Simon quotes presidential aide Harold Ickes: "Certain Democratic fund-raisers close to the president gave money to the Jackson for Congress campaign." Simon then mentions that these fund-raisers included the notorious Lippo Group and a mysterious Indonesian landscaper whose huge contributions would later embarrass the Democratic National Committee. So how exactly did Clinton's foreign-money laundromat relate to his effort to appease Jackson? Simon doesn't say; he just moves on to other things.
In his epilogue, Simon recounts an interview with Mike McCurry. Simon asks him about the president's state of mind: "And why is he so at peace, why is he so spiritual?" McCurry takes the question seriously, talking about the Clinton legacy, even though the president's "spirituality" really consists of offerings at the temple of Aphrodite. McCurry also explains why Clinton thinks people will remember his accomplishments instead of his scandals: "He is confident because he has done nothing to besmirch his office in a major way." Ponder those last four words: the Clinton codicil to every ethical standard, including each of the Ten Commandments. (Thou shalt not commit adultery …in a major way.)
According to Howard Kurtz, McCurry is "the master of spin." He makes such an effective spokesman for the president because he understands a few basic lessons about the press. First, reporters compete among themselves. They will not necessarily aid a colleague in distress, as McCurry demonstrated when he rudely cut off New York Post correspondent Deborah Orin at a press conference, and no one stood up for her. Competition allows McCurry to play reporters against one another: He leaks a breaking development to one, forcing the rest to play catch-up, thereby keeping a story alive through several news cycles.
The second lesson is that a press secretary can avoid outright lying if he avoids learning the truth. To protect his integrity when discussing scandal, Kurtz says, McCurry follows a path of "willful ignorance," repeating only what the lawyers tell him and not quizzing Clinton directly if he can help it.
The third lesson is that reporters, like other human beings, enjoy flattery and dislike criticism. Courting the press does not guarantee favorable coverage, but failure to do so will guarantee bad coverage. Although President Clinton is very good at it when he wants to be, his own hypersensitivity sometimes causes him to lash out against reporters, usually with unhappy results.
Spin Cycle does a fine job of explaining how McCurry and other Clintonistas apply these lessons. Unlike Show Time, it depends mainly on original reporting instead of Nexis searches. Kurtz is especially perceptive in laying out the elaborate effort to defend Al "No Controlling Legal Authority" Gore against accusations of illegal fund raising. Among other things, Kurtz says, the battle involved the generation of sympathetic op-ed pieces. Good point: One of the open secrets of journalism is that many (perhaps most) op-eds and letters to the editor result from professional public relations campaigns.
In spite of the book's virtues, its own "spin cycle" will be brief. Kurtz focuses mainly on White House response to scandal news in 1996 and 1997, and he does not purport to cover most other aspects of the relationship between the president and the press. And within the narrow scope of his research, he had only fragmentary access to important information. For legal and political reasons, White House aides were probably not inclined to volunteer the whole truth. What's more, the story is still unfolding: Just before the book went to press, Kurtz had to add a sketchy epilogue on the Monica Lewinsky affair.
For people interested in how Clinton has spun the news, many questions remain open. Here are a few.
Why have the media overlooked so many harsh personal attacks on conservatives and free market advocates? When Hillary Clinton testified at a 1993 congressional hearing on health care, Rep. Dick Armey (R-Tex.) said he would like to make the debate as exciting as possible. She answered: "I'm sure you will do that, you and Dr. Kevorkian." In most contexts, such a remark would merely be snide, but in this case it was cruel: As White House opposition researchers undoubtedly told her, Dick Armey's father committed suicide. Newspaper stories reported Armey's flustered reaction, but none explained the reason.
How has President Clinton gotten away with gaffes that would have sunk other politicians? On August 6, 1994, he said in Detroit: "The interests–the violent, extremist interests in this country that are trying to keep health care out of the reach of ordinary American working people–are a disgrace to the American Dream." Three days later, a few reporters asked press secretary Dee Dee Myers if she could name any health care groups that were committing violent acts. When they scoffed at her initial answer–that he must have been talking about anti-abortion protesters–she said: "I don't have any better explanation for you." Yet apart from stories in the Boston Herald and a handful of other outlets, Clinton's bizarre outburst got very little ink.
How did the president and his allies manage to put Republicans on the defensive on affirmative action? In 1996, liberal leaders such as Jesse Jackson hinted at a "cultural conspiracy" linking church burnings with efforts to roll back racial preferences. Fearful that some would brand them "racist," Republicans began backing away from the issue, thereby yielding a point that was both principled and popular. Did the White House coordinate the attack?
A final question arises from the opening line of one of Kurtz's chapters: "The White House war against Kenneth Starr was a curious and covert operation." So what are the elements of this "covert operation"? The answer to that question might come out only in court, if it comes out at all.
As of early 1998, Clinton's spin wars were still succeeding brilliantly. One could sum up his performance in six simple words: He lies. He cheats. He wins.
Contributing Editor John J. Pitney Jr. (jpitney@mckenna.edu) is an associate professor of government at Claremont McKenna College.
In one sure sign that Steve Forbes has become a household figure, his name recently came up in an argument for an anti-cloning bill. A Democratic state legislator in New Hampshire said to the Manchester Union-Leader, "Do we want to have 10,000 Steve Forbes running around?"
Forbes's emergence is remarkable. Every president from Washington to Clinton has held either elective office or a high-level appointment such as a generalship. Apart from his stint on the Board for International Broadcasting, Forbes lacks government experience. Still, he does pass one key test: Like all Republican nominees between 1968 and 1996 except for Ford, he has already lost a bid for the White House. GOP primary voters like their candidates bloodied and humbled.
Forbes understands that any potential GOP nominee has to navigate the Republican Bermuda Triangle: the party's fractious coalition of libertarians, social conservatives, and office-holders. That's hard to do, and even at this early stage, his positions on abortion and drugs have alienated some elements of his libertarian base. But Forbes is no simple trimmer. Main Street Republicans fret about his forthright positions on issues such as Social Security and affirmative action, fearing that Democrats will brand him–and the party–as "extremist."
One label that people seldom associate with Steve Forbes is "daredevil." And yet he has indeed been taking riskier positions than George W. Bush, Lamar Alexander, and other top contenders. There are two possible explanations. One is that he's a sheltered political naif who just doesn't understand the trouble he's going to get into. The other is that he's a shrewd political leader who is firm in his beliefs and confident of his ability to change people's minds.
Either way, he's come much further than anyone expected a couple of years ago.
Forbes suffered from a late entry into the 1996 nomination campaign. Dole had already cornered the support of the blue-haired ladies who do the grunt work of primaries and caucuses, so Forbes had to compensate by waging an air war. His commercials rapidly pumped up his poll numbers, which collapsed just as quickly when the other candidates counterattacked.
In the early weeks, his stump style ranged from wooden to robotic. New Republic reporter Michael Lewis offered this cruel but telling account of how Forbes reacted to applause: "A few years back there was a television commercial for the Special Olympics that concluded with a retarded boy bursting through the tape at the finish line and breaking into a joyous, heart-tugging smile. Forbes now wears exactly the same expression. He has a Special Olympics smile."
Other observers dismissed him as a rich dabbler–Michael Huffington minus Arianna. But after Dole clinched the nomination, Forbes confounded the skeptics. He didn't skulk away like Huffington, or go into full-metal-wacko mode like Perot. He stayed in the arena, learning from his own missteps, and adapting techniques from political comebacks of the past.
Since the summer of 1996, Forbes has been walking the old Nixon trail, campaigning for GOP candidates and picking up political IOUs. In February of this year, for instance, he went to Richmond and praised Virginia Gov. Jim Gilmore's initiative to repeal the state's hated car tax: "The governor has proved that you can marry power to perspective." In an interview in these pages ("Happy Warrior," February 1997), anti-tax activist Grover Norquist observed: "Forbes is helping people. Sometimes by helping other people you help yourself."
As Reagan did after leaving the California governorship, Forbes has set up his own political committee: Americans for Hope, Growth and Opportunity. Among other things, AHGO sends out blast e-mails and maintains an impressive Web site (www.ahgo.org). Forbes has also started daily radio commentaries, available in RealAudio on the AHGO site. By coincidence, Forbes's radio producer is the man who produced Reagan's broadcasts in the 1970s.
Forbes is improving as a public speaker. At the Southern Republican Leadership Conference held in Biloxi, Mississippi, this February, he delivered many of his lines with simple humor and a natural-sounding inflection. "And so even if you had trouble with math or arithmetic when you were in school, which of course none of you did–See, I'm learning to pander…." The aside got a good laugh, and he responded with a graceful smile, not one of the "Special Olympics" variety.
In May 1997, he gave the commencement address at Claremont McKenna College, in Claremont, California. He concluded with words that warmed the hearts of professors weary of student grammar: "Just remember, it's not who you know; it's whom you know." Then he did a couple of things that previous commencement speakers had not done. During the distribution of diplomas, he stood up and shook hands with every graduate: a small touch, but a pleasant, unexpected one. After the ceremony, he stayed around for more than an hour to pose for snapshots and chat with anybody who wanted to talk with him. (He gained from comparison with the previous year's speaker, the imperious Vernon Jordan. After giving a harshly partisan speech praising racial preferences and hinting that Republicans were bigots, Jordan did not deign to mingle with the proles. Maybe he was eager to return to his job-referral service.)
Though he is getting better on stage, Forbes still has work to do. Too often, even when he is speaking off the cuff, he sounds as if he is reading a text for the first time. For one thing, he keeps committing the beginner's mistake of pronouncing the indefinite article a with a long-a sound ("ay"). At a New Hampshire town hall meeting, he said: "The IRS abuses…are ay symptom of ay corruptingly complex code."
During the same New Hampshire trip, Forbes toured downtown Nashua, with C-SPAN cameras rolling. At a shoe store, he awkwardly greeted clerks and customers, who seemed more interested in the merchandise. He stopped for a few moments with an 11-year-old girl, saying, "I hope you have a good shopping experience here." She thanked him and walked away. Throughout the visit, he seemed uneasy about getting in the way of market transactions–which, in a way, is a good sign.
In May 1991, long before Forbes even thought of seeking office, REASON asked him to describe his political philosophy. "I'm pro-growth," he answered. "If that means enterprise zones, I'm supportive, and if it means increasing the earned income tax credit as a way of enabling people to get off the welfare treadmill, I would be for it. That's really the guiding compass: How do you let people develop their talents with a minimum of interference?"
When it comes to broad free-market principles, Forbes "gets it." At the Claremont McKenna commencement, he said: "In times past, we thought of wealth as land, armies, piles of gold and jewels, and, sadly, slaves. This new era has made clear what has always been true–that the true source of wealth is the human mind, human imagination, inventiveness, stick-to-it-iveness."
Earlier this year, he wrote an obituary for Julian Simon, the economist who debunked dire forecasts about population and pollution. "In an environment of freedom, more people mean more knowledge and, thus, more breakthroughs and inventions. In no small part due to Simon's work, the Chicken Littles were routed." Perhaps looking ahead to a race against Al Gore, Forbes added a caveat: "But Malthusians never stay down. They are at work today propagating a new menace to frighten us–global warming."
Both Gore and Forbes spoke at the Microsoft CEO summit in May 1997, and their remarks highlighted their philosophical differences. Gore compared free-market types to the heartless Tin Man of The Wizard of Oz. Today's Tin Men, he said, "have a cold, calculating bead on the facts and figures and theories that measure the rise and fall of markets," and their economic policy "is simply to slash taxes and get the government entirely out of the way." Their approach failed during the Reagan years, Gore argued, and it cannot work in a high-tech age: "The Tin Men offer no prescription for upgrading the skills of workers or for sparking innovation."
Forbes offered a different vision, calling the tax code "a real dead weight on the economic life in America." Whereas Gore proposed a federal plan to wire classrooms to the Internet, Forbes reached deeper: "Technology won't make a fundamental difference in education as long as the old monopoly is running it." He backed a variety of reforms, including school choice, to "blast those systems open."
Within the party, Forbes is scarcely alone in endorsing such policies. His rhetoric, however, suggests a more sophisticated understanding of government power. When other Republican leaders mention the Cold War, they usually limit their discussion to President Reagan's successful fight to bring down the Soviet Union. That part is true enough, but Forbes stands apart from the others in noticing that the struggle also had a dark side. In a speech at the Cato Institute last year, Forbes said that the Cold War had harmful effects on social and political life. National security, he said, gave the federal government a rationale for enlarging its power over education, research, transportation, and many other fields. By the early 1960s, people concluded that if the government could win two world wars and contain communism, it would do other good things, too. "And thus we got the War on Poverty, Jimmy Carter's moral equivalent of war on the energy crisis, and the war on drugs. War, real or metaphorical, has been the motif of this century." (We'll shortly return to his stand on the drug war.)
Forbes has repeatedly stressed the moral dimensions of his economic beliefs. You don't succeed in business through coercion or deceit, he said in his Claremont McKenna address, "You succeed in business by providing a product and service that people are willing to buy voluntarily from you." He has also discussed the need for honesty and integrity at the highest levels of politics. Many libertarians have praised this line of argument. As Edward Crane, president of the Cato Institute, has written: "I applaud Forbes's decision to include a call for moral leadership along with his libertarian policy proposals, ranging from the flat tax to school choice to Social Security privatization."
Life, liberty, and the pursuit of happiness, Forbes told the Heritage Foundation, are the "moral basis of a free society." Government's role is to secure those rights–in the order that the Declaration lists them. "Switch the order–putting happiness before liberty, or liberty before life–and you end up with moral squalor." From this belief, Forbes has derived some positions that have pleased social conservatives while disturbing many of his libertarian allies.
Speaking to the Christian Coalition last September, Forbes got a rousing reaction when he said of America's national creed: "Remember, life begins at conception and ends at natural death." Many news reports quoted that line, but by itself, it carried less significance than reporters thought. They forgot that Forbes had used similar language in the 1996 campaign: "I believe that we should protect life from conception to natural death."
Pleasantly surprised social conservatives speak of Forbes's "conversion experience" while pro-choice Republicans complain about a sell-out to the religious right. In fact, he's been more consistent than either side thinks. On February 10, 1996, he said on CNN's Evans and Novak program: "I think that we have to bring public opinion along each step. I think we can start with a consensus by banning abortions in late pregnancy–barring a life-threatening emergency–barring abortion for purposes of sex selection, no mandatory government funding, parental consent in the case of minors." Robert Novak asked him if he would support a constitutional ban on abortion, provided that public opinion came along. "I wouldn't oppose it, if you have the culture with you," Forbes said.
That's largely his position today, but he has made some adjustments that have major political consequences. When the constitutional issue resurfaced during an Evans and Novak interview earlier this year, Forbes's answer suggested a crucial shift in tone: "I believe in life. I want that constitutional amendment."
Whereas he once downplayed abortion, he now emphasizes the issue. Earlier this year, he even supported a Republican National Committee "litmus test" on partial-birth abortion, which would have denied party funding to candidates who didn't oppose the procedure. (Many anti-abortion Republicans, most prominently Rep. Henry Hyde, opposed the test for dividing the party.) "We should not fund a Republican candidate who opposes partial-birth abortion bans unless there are special circumstances, as determined by a majority of the 165-member RNC," Forbes told The Washington Times. He added that there "may be circumstances, such as support for Rudolph Giuliani in New York, where you might want to make an exception."
Forbes's comments illustrate the risks of finessing the issue. Former Republican national chairman Richard Bond wrote: "Forbes has alienated activists on both sides of the abortion issue; encouraged Republicans to engage in an internal fight at a time when President Clinton and the Democrats are on uncertain footing; further driven away women and other potential supporters from the party; and encouraged further attempts for litmus tests to be foisted on Republicans by those who believe in single-issue candidacies."
On another social issue, Forbes has campaigned hard against medical marijuana. "What's going on?" asks Clifford Thies, chairman of the Republican Liberty Caucus. In an open letter to Forbes, available on the group's Web site (www.rlc.org), he continues: "Has he become so fixated on becoming president that he has adopted Bob Dole's failed strategy of trying to beat Clinton on the drug issue?" Libertarian writer Doug Bandow is even harsher: "Treating the sick and dying as the enemy is a particularly cheap way to win votes."
Forbes's supporters argue that his support for tough drug laws, like his abortion stand, is less a new position than a new emphasis on a longstanding and sincere belief. On July 1, 1996, he wrote in his magazine that "the beguiling notion that decriminalization of the use of 'mild' narcotics such as marijuana would allow authorities to crack down more effectively on hard drugs still persists (even in a recent Forbes story about the Netherlands). Alas, the idea is destructive nonsense."
Thies, however, argues that it's possible, indeed necessary, to have a separate discussion of medical marijuana: "As Republicans, we understand the importance of what we communicate to others, especially to youth. But it is immoral to hold sick people hostage so we don't miscommunicate our concern for drug abuse."
Some libertarians share Forbes's opposition to abortion. (Murray Sabrin, the Libertarian Party candidate for governor of New Jersey last year, attacked Republican incumbent Christine Whitman for her support of partial-birth abortion.) Some may even favor curbs on certain drugs. Inevitably, though, his high-profile stands on these issues will create friction between Forbes and the libertarian movement. If he is to hold onto at least a share of libertarian support, he has to take bold, pro-liberty stands on other issues. And he does.
Forbes has imported his favorite applause line from Transylvania: "Rather than further complicating a failed federal tax code, we should abolish it–kill it, drive a stake through its heart, bury it, and hope it never rises again to terrorize the American people." He urges Congress to take a "bold and radical step" by passing legislation to scrap the tax code by a date certain, a move that he hopes will trigger debate about "a new tax code for a new century."
Forbes thinks that the best alternative is a straightforward flat tax. His proposal features a 17 percent rate on wages and salary, with simple exemptions: $13,000 for singles, $26,000 for married couples filing jointly, and $17,000 for single heads of household. Taxpayers could also deduct $5,000 per child. He would not tax personal savings, pensions, Social Security benefits, capital gains, or inheritances. (Businesses would pay the 17 percent rate on net profits, and could write off investments in the first year.)
The flat-tax idea is compelling. For decades, Washington has used the tax code as an instrument of social and economic engineering. By getting rid of tax preferences, a flat tax would free individuals and businesses from a pervasive form of government power. With the elimination of most deductions and credits, taxpayers would no longer have to supply Uncle Sam with excruciatingly private information. Compliance costs would plunge, since people would no longer have to spend disproportionate sums on record-keeping and accountancy. What's more, many people would see their taxes go down. Currently, an average family of four owes about $3,000 in taxes for the first $36,000 in income. Under the Forbes plan, they would owe nothing up to the $36,000 level–the $26,000 joint exemption plus two $5,000 child deductions–and after that, they would pay 17 cents on the dollar.
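For readers who want to check the arithmetic, here is a minimal sketch of the wage-tax computation, using only the figures quoted above. The function name and filing-status labels are illustrative, not part of the Forbes proposal, and the sketch ignores the plan's other features (untaxed savings, pensions, capital gains, and so on).

```python
# Illustrative sketch of the Forbes flat-tax arithmetic described above.
# Figures as quoted: a 17 percent rate on wages and salary; exemptions of
# $13,000 (single), $26,000 (married filing jointly), and $17,000 (single
# head of household); plus a $5,000 deduction per child.

EXEMPTIONS = {
    "single": 13_000,
    "married_joint": 26_000,
    "head_of_household": 17_000,
}
RATE = 0.17
PER_CHILD = 5_000

def forbes_flat_tax(wages: float, status: str, children: int = 0) -> float:
    """Return the tax owed on wage income under the quoted figures."""
    exempt = EXEMPTIONS[status] + PER_CHILD * children
    return max(0.0, wages - exempt) * RATE

# The family of four in the text: a $26,000 joint exemption plus two
# $5,000 child deductions shields the first $36,000 of wages.
print(forbes_flat_tax(36_000, "married_joint", children=2))  # 0.0
print(forbes_flat_tax(50_000, "married_joint", children=2))  # 2380.0
```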
The proposal might seem an automatic winner, especially since abusive IRS activity has received massive publicity, and the total tax burden has reached unprecedented levels. There's a catch, however: To slay the vampire, you also have to kill a sacred cow. On NBC's Meet the Press, Tim Russert asked Forbes if he would allow a deduction for mortgage payments. Forbes answered: "I think in order to make change…you have to go with a strong and pure proposal, or else the essence of it will not survive the political process."
Nearly 30 million households take the home-mortgage deduction. Even though many of them would do better under the Forbes plan, they balk at giving up an existing benefit. When the 1986 tax bill ended a number of tax breaks, Americans suspected that Washington would eventually renege on the promised payoff: reduction of the top rate to 28 percent. Guess what: They were right. People don't like Democratic tax increases, which is partially why Republicans took Congress in 1994. But courtesy of the Bush-Darman tax betrayal of 1990, they don't trust Republican pledges about tax cuts.
Even Republican primary voters are skeptical, especially when it comes to proposals that would touch the mortgage deduction. Dole effectively used the issue against Forbes in New Hampshire. In the general election campaign, Dole got his comeuppance when nobody believed his proposal for a 15 percent reduction in income tax rates.
Dan Quayle, among others, has hedged by calling for a "modified" flat tax that would retain the mortgage deduction. That one key word, of course, represents the difference between radical reform and another round of incremental change. The contrast shows that Forbes's stand is gutsier than it may first appear. He's betting that he can lead public opinion and overcome the attacks that will surely come both from Democrats and his GOP rivals.
Forbes's risk-taking doesn't end with the flat tax. He has broken with congressional Republicans whose answer to the impending crisis of Social Security is the time-tested dodge of appointing a blue-ribbon panel. In January, he told the National Press Club that we do not need another commission: "We need action. We need real proposals on the table." Forbes's proposal is to phase in a new Social Security system for younger people, in which most of their payroll taxes would go to their own retirement accounts. In his magazine, he explained: "As things now stand, without major payroll tax boosts the actual return for our under-age-30 citizens will be decidedly negative. A new system would both avoid new levies and give participants vastly more than we are getting now."
Such a commonsensical position shouldn't require courage, but it does. Even though many Democrats have acknowledged the system's flaws, Republicans always invite trouble when they discuss programs for the elderly. In 1986, Senate Republicans lost their majority in part because they supported a modest adjustment in benefit increases. A couple of years ago, Democrats drew blood with their "Mediscare" campaign. In both cases, the GOP sustained damage even though it retreated–or perhaps because it retreated. In light of widespread public support for reform, Forbes believes that a strong, firm position will prevail in the end.
Forbes also differs from Hill Republicans on affirmative action. Increasingly paranoid about charges of "racism," the congressional GOP has been caving in to the regime of racial preferences. House Republicans could not get an anti-preference bill out of committee. While passing the shameful highway bill, Senate Republicans failed to repeal the even-more-shameful program that sets aside up to 10 percent of federal highway construction contracts for minority- and female-owned businesses. Forbes, however, has been speaking out against state-sponsored discrimination. His group sponsored an ad campaign to support the Washington State Civil Rights Initiative. Forbes said that the measure would "reassert the moral principle that all Americans have been created equal in the eyes of the law, regardless of race or gender or national origin."
Forbes has criticized GOP congressional leaders for yielding too much ground to the White House on budget issues, and they've hinted at retribution. On CNN's Evans and Novak, Senate Majority Leader Lott recently said: "I've got this to say to some of the people that are using that as a method to get the nomination: They'd better be real careful. They may be surprised whose help they might need….I don't appreciate it, and I'm not going to put up with it, I'll tell you that."
And on one important issue–immigration–Forbes has risked the antagonism of GOP grassroots activists, who see immigrants as burdens instead of assets. Forbes strongly supports legal immigration, because of his belief that human beings are the ultimate resource. He editorialized against California's Proposition 187 (which would have denied government benefits, including public schooling, to illegal aliens), and opposed a national version of the measure during his 1996 presidential campaign. He also came out against a constitutional ban on citizenship for children of illegal immigrants–a position that had enough GOP support to earn a place in the party platform. A leader of a California anti-immigration group told the Los Angeles Times that Forbes deserved special condemnation: "Steve Forbes basically says immigration makes Americans work harder. Here's a guy who inherited all of his money, never had to work for a dollar, and is telling us we have to compete against 4 billion people. Outrageous." (Although he inherited much of his wealth, of course, Forbes is in fact a CEO whose fortunes are tied closely to how well he runs the family publishing business.)
Whether out of courage or naiveté, Forbes is saying what he thinks. It's an unusual way to prepare for a presidential race.
At the moment, that race appears open. Forbes won the presidential straw poll at the Conservative Political Action Conference in late January. He was the first choice of 23 percent of the attendees, compared with 10 percent for Texas Gov. George W. Bush. At the Southern Republican Leadership Conference a few weeks later, Bush led with 18 percent. Forbes placed second with 15 percent, followed by Dan Quayle with 12 percent.
These unscientific results, together with soundings of GOP activists, indicate that Bush is Forbes's strongest competitor. The funny thing, though, is that few Republicans outside Texas can identify anything he has done as governor. Just as Zsa Zsa Gabor was famous for being famous, Bush is popular for being popular. Name identification, of course, has much to do with it, and even that is a mixed blessing. Because of his 1990 budget deal and his humiliating loss to Bill Clinton, the elder Bush has become the "yada yada" chapter of GOP history: "Reagan cut taxes and won the Cold War, then yada, yada, yada, Clinton came along."
At the Southern Republican Leadership Conference, Dan Quayle praised Reagan extravagantly, but only once did he mention the man who chose him as vice president–and then only as Saddam Hussein's assassination target. The rest of Quayle's remarks were in character, which is to say that they ranged from trite to embarrassing. At one point, he described "the centerpiece of our anti-crime plan: Three interns and you're out!" He held up three fingers, perhaps trying to prove that he could count.
On his geekiest day, Forbes does better than that.
And if Forbes sometimes has a "robotic" speaking style, Lamar Alexander must really be a robot. His total lack of originality or spontaneity suggests an artificial life form that someone assembled from pieces of old politicians, according to a focus-grouped blueprint. Consider the tag line of his Web site (www.lamaralexander.com): "For the new century, America needs not so much a new kind of government–not a Squarer Deal or a Fairer Deal or a Greater Society–but a New Spirit…." This depth of banality is far beyond the reach of any natural-born human.
Libertarians would be hard put to identify a potential GOP candidate (or, for that matter, a Democrat) whose positions are more acceptable than those of Forbes. Like other GOP proto-candidates, Bush, Quayle, and Alexander all oppose drug legalization and support abortion restrictions in varying degrees. Forbes outshines them on other issues, especially in his forthright stance on the flat tax. Compare his position with what Alexander said in 1996: "Taxes need simplifying and some of them need lowering."
Forbes is far from a perfect candidate. In his public style, he will never match the Great Communicator. In his issue positions, he will never completely please all the wings of the Republican Party. But the perfect candidate does not exist. Among the likely contenders, who else is there?
Contributing Editor John J. Pitney Jr. (jpitney@mckenna.edu) is associate professor of government at Claremont McKenna College. His Web address is faculty.mckenna.edu/jpitney.
To understand the ethics mess, skip the latest scoops. The real explanation came out some 1,400 years ago, when Pope Gregory the Great identified the Seven Deadly Sins. Gregory's list furnishes a concise guide to basic principles of contemporary politics: pride, envy, anger, sloth, avarice, gluttony, and lust.
Pride. Chaucer wrote that pride is the "general root of all evils." In politics, this root runs deep. Candidates exaggerate their own virtues, sometimes believing what they say. Once in office, they surround themselves with fawning staffs. Washington is to ego as Iowa is to corn: a place where abundant fertilizer promotes amazing growth.
Pride creates its own ethical logic. "I'm good," thinks the politician, "and since good people don't do bad things, then whatever I do is OK." During his "no controlling legal authority" press conference, Al Gore said: "I'm proud of what I did. I do not feel like I did anything wrong, much less illegal. I am proud to have done everything I possibly could to help support the re-election of this president and to help move his agenda forward." The agenda justifies the means.
Pride is the original sin of public policy. The "anointed," as Thomas Sowell calls them, believe that they know what's best for everybody else–hence such monstrosities as the Clinton health plan. They also think that they can divine long-range trends in economics, international relations, and even weather–hence Gore's crusade against global warming.
Prideful politicians think that they can get away with anything. When their policies fail or their misdeeds become public, they shift the blame or deny that anything has gone wrong. Sometimes these responses fail, as Presidents Johnson and Nixon discovered. Sometimes they work, which is why President Clinton survived his first term.
Envy. What do average Washingtonians want? Better job titles, bigger offices, richer perks, and more one-on-one contact with the powerful. Staffers refer to the last item as "face time." (Perhaps certain politicians offer "designated-body-part time.")
Inevitably, what everyone wants is what somebody else already has. In the White House, such envy helps explain why aides vie to catch the president's eye and fulfill his desires (including the noncarnal ones). The jostling for position includes even the lowliest ranks. According to press reports, intern Monica Lewinsky had a "coveted blue pass," which enabled her to enter the West Wing at will, while her peers labored elsewhere. No wonder some of them seemed quite eager to trash her reputation.
There is plenty of envy in the private world, but it's especially acute in politics because tangible accomplishments are so scarce. When you can't measure your achievements by the number of computers manufactured or customers served, you rate your position by the number of really cool meetings you get to attend.
Anger. Envy begets wrath. Since so much of political life hinges upon petty things, Washingtonians are hypersensitive to slights, snubs, and insults. They might not always remember the size of the national debt, but they do remember the time someone kept them waiting outside a conference room (and for how long). The ultimate example was Richard Nixon, who ended his political career as a congealed blob of resentment.
In a city of vendettas, leaks are a weapon of choice. Such leaks often come from disgruntled staffers and ex-staffers, whose ranks are legion. (Have you ever met a Washingtonian who was gruntled?) Last year, former White House aide Linda Tripp told Paula Jones's lawyers that President Clinton had made a pass at another woman. Clinton lawyer Robert Bennett then accused her of lying. Reportedly, her anger at Bennett spurred her to tape Lewinsky's phone calls.
Leaks and attacks have a profound effect on Bill and Hillary Clinton. In Affairs of State (See "A Thin Line Between Love and Hate," August/September 1997), historian Gil Troy describes their reaction: "The Clintons united in rage. Theirs was the angriest administration since Richard Nixon's." Angry people make dumb mistakes, but the Clintons have usually harnessed their anger to politically useful purposes. When Hillary Clinton complained of a "vast, right-wing conspiracy," she was venting her feelings while simultaneously providing a cue to reporters who were predisposed to believe in such things.
More often, it's Clinton's opponents who lose their heads. Consider the volcanic conservatives who prematurely called for his impeachment. Their reaction was understandable, since Clinton's slickness infuriates them. But from the standpoint of GOP political strategy, a successful effort to oust Clinton would backfire, since it would give Gore the incumbency advantage in the next presidential election.
As Nixon said as he left the White House: "Always remember, others may hate you, but those who hate you don't win unless you hate them–and then you destroy yourself." He should know.
Sloth. Contrary to the popular myth that Washington keeps bankers' hours, people in the political community put in long days. Physical sloth is not their problem. Instead, many suffer from intellectual sloth, which sets in when they fail to rethink their assumptions. The D'Amato hearings on Whitewater and the Thompson hearings on campaign finance both embodied this kind of sloth. Each time, Republicans were expecting Watergate in reverse, where noble Republicans could take down a tainted Democratic president. Each time, they flopped.
Notwithstanding all their hard work, they failed to take account of one big thing: The other side had studied Watergate, too. The White House recognized that it could hinder investigations by providing evidence at a glacial pace, a practice called "slow-walking." Congressional Democrats remembered that Sen. Sam Ervin (D-N.C.) was effective as chair of the Watergate committee because of his reputation for probity. Accordingly, they undercut the GOP chairs, hoping to make D'Amato look like a sleazebag and Thompson a shameless self-promoter. They succeeded.
Like the people they cover, reporters also represent a strange brew of hard work and sloth. They have long toiled to cover campaign scandals and the legislative proposals designed to prevent them. At the same time, they have seldom questioned the premise of campaign finance "reform," namely, that more red tape will produce a "cleaner" and more democratic process.
Avarice. No one can take money out of politics as long as people can make money by influencing public policy. With its wide array of rules and programs, contemporary government offers economic interests opportunities to gain subsidies, monopolies, and other advantages. When Sen. Carl Levin (D-Mich.) asked high-rolling contributor Roger Tamraz if one of the reasons for his donations was to gain access to the White House, he responded: "Senator, I am going even further; it is the only reason."
Even in a rural state such as Arkansas, a governor can be a helpful friend or a troublesome foe. That's why some colorful characters took an early interest in the financial well-being of the Clinton family. The fruit of this interest, of course, is the ongoing Whitewater controversy. And if it were not for Whitewater, Kenneth Starr would not hold the post of independent counsel, and there would have been no federal investigation of Monica Lewinsky's statements.
Gluttony. In its literal meaning, gluttony is not part of the Clinton scandals. Washington has become a nonfat yogurt kind of town, and in 1997, the president got into the act by curbing his appetite for rich foods.
In a metaphorical sense, however, gluttony does play a role. When the media find a hot new scandal, they engage in what political scientist Larry Sabato calls a "feeding frenzy." For a brief period, they compete with one another to gobble up all available bits of information, misinformation, and disinformation. Rather than digesting it carefully, they swallow it whole, often with disastrous results. When the pig-out is over, they feel nausea and shame, and they swear not to do it again.
In their remorse, they may leave some good food on the table; that is, they may overlook credible leads that take time and study to develop. In each of the Clinton scandals, serious questions lingered long after the initial frenzy came to an end.
Lust. This one doesn't require any explanation, except that the connection of sex and power didn't start with Clinton's inauguration. It goes all the way back to King David, who had an illicit affair and tried to cover it up. Bathshebagate broke because of a whistle-blower named Nathan, who shamed David into an apology.
God had warned that there would be days like this. As David Boaz wrote in Libertarianism: A Primer, the First Book of Samuel contains an early explanation of the need for limited government. When the people prayed for a king, God answered that abuse and tyranny would ultimately follow.
Like the rest of us, those in power are vulnerable to the Seven Deadly Sins. "If men were angels, no government would be necessary," wrote James Madison. "If angels were to govern men, neither internal nor external controls would be necessary." Angels are pretty scarce in the political world. That's why the framers of the Constitution built a system of separated powers, ensuring that a wayward president would eventually crash into judges, juries, and congressional investigations. The system can get pretty ugly at times, but as Madison asked, "What is government itself but the greatest of all reflections on human nature?"
Contributing Editor John J. Pitney Jr. (jpitney@mckenna.edu) is associate professor of government at Claremont McKenna College.
Most of this fall's college freshmen were born in the year that Ronald Reagan won the White House. These 18-year-olds have only blurry memories of his presidency, which ended when they were in the third grade. Their knowledge about this period comes mainly from books and popular culture, which should worry the Gipper's admirers.
Go to a library's Reagan shelf, and you'll see such titles as The Acting President, Gambling With History, Make-Believe, The Reagan Detour, and Visions and Nightmares. Rent some political movies at Blockbuster, and you'll have a similar experience. Films such as The Pelican Brief and Clear and Present Danger portray fictional Reaganesque presidents in the light that Hollywood sees Reagan: as equal parts faker, killer, and doofus.
Aside from Martin Anderson's Revolution and a few other works, there is empty space where serious defenses of the Reagan administration ought to be. Dinesh D'Souza's new biography is an effort to plug the gap. This concise book offers a useful, if flawed, introduction to Reagan, especially for those who have heard only from the bashers.
D'Souza worked in the White House toward the end of Reagan's tenure. At the time, he admits, he saw the president as a nice fellow atop an administration sinking into scandal and internal strife. Hindsight has changed his view. "Previously I admired the man but had doubts about his leadership," he says. "Now I see that he had faults as an individual but was an outstanding statesman and leader."
Reagan certainly fumbled with details: The book recounts the now-famous story of the time he mistook his own housing secretary for a big-city mayor. But on the large issues and in the long term, Reagan was closer to the truth than his liberal critics. "We're on the edge of a world crash," said Sen. Daniel Patrick Moynihan (D-N.Y.) in 1982–just when we were on the edge of an historic boom. In 1987 House Speaker Jim Wright sputtered at Reagan's "tear down this wall" challenge to Gorbachev: "It just makes me have utter contempt for Reagan. He spoiled the chance for relations between our two countries." You know what happened next.
History has forced the critics to adjust their initial image of Reagan as an economy-wrecking, war-mongering monster. ("The evil is in the White House at the present time," said House Speaker Tip O'Neill in 1984.) Instead, they have adopted what D'Souza calls "the Revised Standard Version," which acknowledges that, well, maybe we did end up with peace, prosperity, and the total collapse of Soviet communism. But that was just coincidence, says the RSV. Reagan could not possibly have had anything to do with it, since he was just a cheerful simpleton with incredible luck. D'Souza points out that even many conservatives have bought this view, at least in part.
This book guts the RSV. Its numerous examples show that the big decisions belonged to Reagan, not his cynical and pragmatic handlers. Rebutting the notion that he passively read the lines that others set before him, D'Souza explains that Reagan was Reaganism's primary author and that he wrote much of his own material. Anyone can verify D'Souza's point by visiting the Reagan Library in Simi Valley, California. In reading drafts of major addresses, such as the 1983 "Evil Empire" speech in Orlando, you will find that a good deal of the language is in Reagan's distinctive and legible handwriting.
Reagan accomplished so much, says D'Souza, because he had mastered the three basic elements of leadership: vision, a bias for action, and an ability to build support for his policies. Despite his ideological commitments, he was flexible in day-to-day maneuvers and tolerant of intraparty disagreements. His agility reflected his self-confidence: He could afford the occasional sidestep because he knew where he wanted to go.
Though vivid and readable, D'Souza's account falls short of being definitive. In his enthusiasm to paint Reagan in bold colors, he sometimes misses the pastels that are part of the historical record.
D'Souza lists the major domestic and international developments of the 1980s and concludes that "Reagan was the prime mover; he brought them about." That's a stretch, as D'Souza implicitly concedes later in the book. Discussing the 1982 recession, he says that even Reagan's critics "recognized that [Fed Chairman Paul] Volcker was the prime mover and that his agency is independent of executive control." Reagan did provide crucial support for Volcker's anti-inflationary policies, as D'Souza quickly adds. But by definition, any development can have only one "prime mover."
D'Souza's Reagan-centrism distorts his treatment of the massive 1986 tax reform bill, which he briefly describes as a neat compromise between Reagan and congressional Democrats. In fact, the bill had a wild ride through the House and nearly died in the GOP Senate, until Finance Committee Chairman Bob Packwood rewrote it while consuming mass quantities of beer. In this case, as in others, the constitutional separation of powers limited the president's control over the course of events.
On the broader issue of taxes, D'Souza properly praises Reagan's leadership in winning an across-the-board cut in 1981. The following year, however, scary deficit projections prompted Reagan to push the biggest peacetime tax increase in history. Although describing the move as a mistake, D'Souza says the Gipper learned his lesson: "Reagan wouldn't agree to any further tax increases, which he believed would stifle the incentives of entrepreneurs and inhibit economic growth." But as D'Souza should have noted, then came a nickel-a-gallon boost in gasoline taxes in 1982, accelerated increases in Social Security taxes in 1983, and various tax hikes in the budget pacts of 1984 and 1987. Although the latter tax packages mainly worked at the margins, they set a precedent for the Big Bad Deal of 1990.
The biggest disappointment of the Reagan administration was its failure to slash the Washington bureaucracy. Not only did all the cabinet departments survive, but Reagan signed legislation creating a new one: the Department of Veterans Affairs. D'Souza admits that Reagan flopped on the domestic-spending side but blames interest group politics and the public's appetites: "As a believer in popular government, Reagan had no intention of thwarting the shared preferences of the people." While plausible, that analysis clashes with an earlier comment: "Reagan did not merely follow the path of public opinion, however. Like a true leader, he worked hard to shape it, so that he could point out the best way for the country to achieve its ideals."
Given all the political constraints, Reagan could not have won a total victory against big government, but he could have done much more. In this respect, free-market activist Fred Smith got it right: "[T]he Reagan revolution hasn't failed–it really hasn't been tried."
Contributing Editor John J. Pitney Jr. (jpitney@mckenna.edu) is associate professor of government at Claremont McKenna College.
Behind the Oval Office: Winning the Presidency in the Nineties, by Dick Morris, New York: Random House, 359 pages, $25.95
Trail Fever: Spin Doctors, Rented Strangers, Thumb Wrestlers, Toe Suckers, Grizzly Bears, and Other Creatures on the Road to the White House, by Michael Lewis, New York: Alfred A. Knopf, 299 pages, $25.00
Whatever It Takes: The Real Struggle for Political Power in America, by Elizabeth Drew, New York: Viking, 294 pages, $24.95
"All Things Dull and Ugly," the title of a Monty Python song, makes a tempting label for the 1996 presidential campaign. Each of the three major candidates was a rerun, as familiar as a Python routine but not nearly as funny. Ross "The Very Silly Party" Perot managed the feat of turning clinical insanity into a cliché. Bill Clinton made presidential sleaziness seem commonplace, even acceptable: "Wink, wink, nudge, nudge, say no more!" And poor Bob Dole found himself in a political version of the Dead Parrot Sketch. (This campaign is no more! It has ceased to be! It's expired and gone to meet its maker! It's kicked the bucket, shuffled off its mortal coil, run down the curtain, and joined the bleedin' choir invisible! This is an EX-CAMPAIGN!)
Writing a good, readable book about such an election is tough–but feasible. As Raymond Chandler said, there are no dull subjects, only dull minds. The 1984 election ended in a predictable landslide for Reagan, and it still yielded such worthy volumes as Peter Goldman and Tony Fuller's The Quest for the Presidency 1984 and Richard Brookhiser's The Outside Story. This time, however, the literary world has fared less well.
First and least among the campaign books considered here is Back from the Dead. It consists of a main text by five Newsweek reporters, a foreword by Joe Klein (the "anonymous" author of Primary Colors), an afterword by Peter Goldman, and 75 pages of memoranda, mostly by mid-level campaign trolls. Back from the Dead confirms an old bit of Hollywood lore: A long list of writing credits suggests that a production is a patchwork mess.
The book's brevity (only 214 pages before the memoranda) stems less from succinctness than from superficiality. The authors' implicit message is: "We'd rather be doing something else, so we're trying to finish this damned thing as fast as we can." Instead of coming to grips with the remarkable historical forces that resulted in the re-election of a Democratic president and a Republican Congress, they concentrate on trivial tittle-tattle. Does anybody really want to learn about the backbiting between Don Sipple and Scott Reed? Does anybody even care who those guys were? (They worked for Dole, if you're interested, which I doubt.)
The authors have odd priorities. They devote an entire chapter to Colin Powell, who chose to stay out of the race, yet they scarcely mention Pat Buchanan and Steve Forbes, who actually won primaries. Their treatment of Forbes is especially deplorable. "The Forbes campaign deserves little more than a footnote in the history of politics," they say. "But it is worth looking back at as an object lesson in the effects of negative campaigning." That's nonsense. Alone among the GOP candidates, Forbes offered a program that was genuine (unlike Alexander's), coherent (unlike Dole's), and forward-looking (unlike Buchanan's).
When the authors do get the story right, they merely repeat things that political observers have long known. Over time, Clinton got better at acting presidential. Dole had honor but lacked vision. Gingrich made mistakes that hurt the GOP. And Dick Morris, the consultant who had guided Clinton since the early 1980s, gave cynical advice that enabled the White House to exploit Republican missteps. The book quotes Democratic pollster Pat Caddell: "When Clinton lost the [Arkansas gubernatorial] election in 1980, he sold his soul to the Devil, and the Devil sent him to Dickie Morris."
Perhaps he is not demonic, but Morris is definitely slimy. He originally signed a confidentiality agreement with the Clinton campaign, but right after it expired at the end of 1995, he weaseled his way out of signing an extension while secretly negotiating a $2.5 million book deal. When the deal became public, White House press secretary Mike McCurry said: "It does some violence to the concept of disclosure that we are attempting to establish." Ouch. Having "Stonewall" McCurry criticize your lack of candor is like having Mike Tyson disparage your table manners.
Morris's book, Behind the Oval Office, is revealing in a peculiar, unintentional way. A notorious tabloid story about his relationship with a prostitute ended his formal involvement with the Clinton campaign in August 1996. Morris writes that the trysts began in mid-1995, after President Clinton gave a Morris-inspired address about fiscal responsibility. Feeling a "sense of triumph," he thought he "could get away with anything." Only in Washington could someone regard a balanced-budget speech as an aphrodisiac.
In the text, Morris tries to stay in the good graces of future Democratic clients by explaining away his past Republican involvements (political, not sexual). He says he worked in the Jesse Helms 1990 re-election campaign because he "misjudged" the North Carolina senator. Yeah, right. By 1990, Helms had served for 18 years and had established a reputation for consistency, if not rigidity. Anyone who can work for both Helms and Clinton cannot care about principle, which is why Washington insiders tell this joke: "Why didn't the prostitute charge Dick Morris for her services? Professional courtesy."
Morris is still trying to butter up Clinton. Notwithstanding some mild pro forma criticisms, he heaps praise upon the president, including this jaw-dropper: "Lincoln and Clinton, it seemed to me, had a lot in common." That statement rings true only to those who can picture an evasive, lecherous, pudgy Lincoln.
Equally preposterous is this statement: "Race played no role in the 1996 presidential election even though anti-immigrant and anti-affirmative action ballot propositions threatened to make it the most racial of recent contests." Race played no role? In a way, it decided the election. According to the exit poll conducted by Voter News Service (a polling operation jointly used by many news organizations), non-Hispanic white voters favored Dole (46 percent) over Clinton (43 percent). The president won because of overwhelming support among blacks (84 percent) and Latinos (72 percent). In part, his margin reflected the longstanding Democratic advantage among ethnic minorities, but it also resulted from efforts to demonize opponents of racial preferences. While Democrats gained among voters who supported preferences, Republicans scored few points on the other side. Many stayed mum on the issue because they feared that Democrats would brand them church-burning bigots.
Morris is a man of some intelligence, and his book occasionally offers insights. Democratic political operatives, he says, "don't really know many Republicans well and often imagine them to be secretly evil." With his bipartisan experience, Morris understood how to flank Republican strengths and anticipate Republican weaknesses. After the 1994 election, he correctly argued that Clinton could dilute the GOP's appeal by embracing large portions of its policy agenda. And he also knew that Dole would flop with the tax issue–not because people liked taxes but because they doubted Dole would actually cut them.
Notwithstanding these spurts of straight political analysis, Morris's relationship with the truth was strictly a series of one-night stands. When he fell, many Democrats cheered. As Michael Lewis observes in Trail Fever, this response was "a reaction to being constantly told by people like Dick Morris that gray is white and two and two make five."
Based on a series of articles in The New Republic, Lewis's book is better than the Morris memoir or the Newsweek mishmash. Throughout 1996, Lewis hung around the edges of the presidential campaign, making witty, novelistic observations that other reporters envied. Indiana Sen. Richard Lugar "resembles a mechanical toy into which someone has inserted batteries one size too large." Alan Keyes, the black social conservative, "looks as if his joints could use a few squirts of WD-40." And Al Gore's conversation is "littered with 'frankly's' and 'to-be-honest-with-you's' and 'it-is-my-understanding's,' all of which translate into civilian English as 'I'm never going to tell you the truth about anything, so why on earth are you asking?'"
Lewis has a special affection for wheel magnate Morry "Grizz" Taylor, a minor candidate in the GOP primaries. Lewis calls the inarticulate Taylor "a truly representative citizen, who felt genuinely the same desires and ideals that motivate the mythical average American." Throughout Trail Fever, he uses Taylor's lack of political skill as a counterpoint to the more polished and manipulative figures in the campaign.
Unfortunately, Lewis carries this shtick too far, failing to note that Taylor simply did not know what he was talking about. He touted a demagogic protectionism: "Fair trade, not free trade–look out for Americans first!" And his "plan" for balancing the budget consisted of a proposal to fire one-third of federal managers, "starting at the top, but not anyone who is doing the work." In 1996, personnel costs for all civilian employees of the executive branch came to $73 billion, so he could not have balanced the budget even if he had sacked everyone down to the lowliest clerk. Taylor made Perot look like the patron saint of exactitude.
In contrast to his gushing treatment of Taylor, Lewis heaps scorn on the other self-financed millionaire candidate, Steve Forbes. Belittling Forbes's support for supply-side economics, Lewis says the Reagan tax cuts "had no measurable impact on the economy's growth rate." Huh? Where was Lewis during the longest peacetime expansion in American history? Even if he had been in a coma throughout the 1980s, he still could have looked at Lawrence Lindsey's 1990 book The Growth Experiment for a thorough account of how the tax cuts stimulated economic expansion. Guess he was too busy reading Morry Taylor's wisdom.
Lewis makes fun of Forbes's nerdy manner without understanding that it was a tremendous political asset. After all, we nerds are the sleeping giant of American politics. Not only do we cast millions of votes, but we rule large sectors of the economy, such as computer software and biotechnology. Every time Forbes flashed his awkward smile on television, we saw our champion. Each one of us could shout, "I am somebody!"
Apart from its economic illiteracy and anti-nerd prejudice, Trail Fever is generally amusing. Alas, it is no more than that, since Lewis aspires only to serve up a plateful of smirks. In Whatever It Takes, Elizabeth Drew takes a different and more satisfying approach. At the outset, she explains her purpose by quoting anti-tax advocate Grover Norquist (See "Happy Warrior," February.): "This is not going to be a Presidential race–it's going to be a race for the House, the Senate, governorships, the state legislatures–and some butthead who wants to be President."
Instead of focusing on the campaign for the White House, Drew examines the struggle for control of Congress. Her main characters are the party leaders and their interest-group allies. The Republicans worked with organizations ranging from Americans for Tax Reform to the Christian Coalition to the National Beer Wholesalers Association. (If you wonder why the beer wholesalers are in the mix, remember that they work in just about every district and suffer badly when federal excise taxes go up.) The Dems had the AFL-CIO, the League of Conservation Voters, and Emily's List, a feminist group that "bundles" campaign contributions.
Drew's key insight is that the interests of the presidential candidate may clash with those of the party. By mid-autumn of 1996, it was clear that Bob Dole was crashing and that he could take congressional Republicans down with him. At this point, Drew reports, Republican leaders and their coalition partners explicitly decided to ditch Dole and concentrate their resources on holding Capitol Hill. In the final days, the Republicans produced an ad that took Clinton's re-election for granted, warning voters not to give him a "blank check" by electing a Democratic Congress. Says Drew: "It was as close as the Republicans could come to not garrotting Dole publicly." On the Democratic side, President Clinton did eventually decide to devote party resources to congressional races–but only after his own re-election was certain. Congressional Democrats grumbled that an earlier and deeper commitment could have put them over the top.
Drew is generally fair, but her own political biases peek out at times, especially in her account of Dole's acceptance speech. He issued a rebuke of Boutros Boutros-Ghali that "was code for racism," and he "inexplicably attacked teachers unions." Drew's criticism rests on the dubious assumption that the world is better off because of the U.N.'s socialist bureaucracy and the NEA's choke hold on American education.
A more fundamental problem with Whatever It Takes is that it overemphasizes Washington. Drew does offer sketches of the national political conventions and the re-election races of a few House members. But most of the book describes who said what to whom at which meeting inside a Washington office building. If Drew had really taken Norquist's comments to heart, she would have gone to places such as Tallahassee and Sacramento, so that she could explain the historic shift of power in America's statehouses. Republicans made massive advances in the 1994 state legislative elections, in some states winning their first majorities since Reconstruction. In 1996, despite setbacks in California and a few other places, they kept most of these gains. And the GOP trend is even more striking at the gubernatorial level. Republicans now hold governorships in 32 states, comprising three-fourths of the nation's population.
The statehouse shift is crucial. With at least some federal power devolving to the states, Republicans are now in a better position to shape domestic public policy. And unless they suffer unexpected reversals, they will have enormous influence over the redrawing of congressional district lines after the next census.
What accounts for the depth and breadth of these changes? None of these four books comes close to answering this question, because they're mainly about personalities instead of ideas. A far-reaching account of recent elections would explain why Americans grew weary of government intervention and how Republicans appealed to that sentiment. It would also take a hard look at how Republicans are using their newfound power in Washington and the states, and ask whether they are botching their chance to bring about real reform.
To paraphrase an apocryphal Yogi Berra quotation, one can hear a lot just by listening. In the 1996 campaign, you didn't need to be an investigative journalist to tell that Bob Dole's brain was an idea-free zone: All you had to do was catch his disjointed speeches and shallow policy proposals. So here's an idea for would-be chroniclers of the next campaign: Forget about tracking the turf wars among staffers or finding clever ways to describe cheesy political rallies. Instead, just heed what the candidates actually say. When they make sense, explain why. When they engage in flim-flams, expose them by checking their facts and analyzing their logic. Even the most insincere speeches are worth noting, because they have a way of haunting candidates after they take office. If you didn't get that, read my lips.
Contributing Editor John J. Pitney Jr. (jpitney@mckenna.edu) is associate professor of government at Claremont McKenna College.
With Reverence and Contempt: How Americans Think About Their President, by Thomas S. Langston, Baltimore: Johns Hopkins University Press, 180 pages, $14.95 paper
A couple of years ago, a clinical psychologist published a damning analysis of Bill Clinton's personality titled The Dysfunctional President. Now that the Clinton scandal chart has grown more tangled than an Arkansas family tree, it's easy to see that many bad character traits took root in a place called Hope. But there is more to the dysfunction on Pennsylvania Avenue than one man's tarnished soul. According to books by historian Gil Troy and political scientist Thomas S. Langston, the American public has long had a troubled relationship with residents of the White House.
I use the word residents instead of men because Troy's Affairs of State concerns first couples, not just chief executives. Anyone seeking a collection of heartwarming anecdotes should look elsewhere: This book is serious political history with a tough-minded attitude. After examining presidential marriages from the Trumans to the Clintons, Troy reaches a dour conclusion: "The experience of the last half-century suggests that the First Lady has greater potential to hurt than help."
Any first lady must struggle with conflicting demands from the voters, who want her to have an active public role but don't want her to have any real power. Whatever course she takes, she will run into criticism. When Pat Nixon played it quiet, detractors mocked her as the passive "Plastic Pat." When Rosalynn Carter openly discussed policy issues, Newsweek asked whether "Lady Macbeth" lurked "beneath the soft voice."
Some first ladies have coped with the dilemma less adeptly than others, and Betty Ford gets a particularly harsh review. Noting that she wanted a salary for her duties, Troy observes: "If Betty Ford had received a salary, she would have been docked pay for dereliction of duties. Between her osteoarthritis, her chemotherapy regime, her periodic depressions, her daily tranquilizers, and her drinking, her aides never knew if the First Lady would show up to an event–or what her mood would be." Troy speculates that her liberal social attitudes may have cost President Ford the 1976 election. Here he uncharacteristically reaches too far. Mrs. Ford surely offended some Americans when she went on 60 Minutes and made tolerant remarks about premarital sex. But her impact was trivial compared with the bad things that had happened during President Ford's tenure: the Nixon pardon, the 1974 recession, and the American defeat in Vietnam.
Troy is on firmer ground when he identifies another pitfall of first couplehood: People think poorly of presidents as spouses. George Bush, the joke went, reminded every woman of her first husband–and had Bob Dole won in 1996, he would have reminded every woman of her first husband's divorce lawyer. Nixon fared terribly on this point: "[W]hile his wife often made him look good, their marriage often made him look bad." With his aversion to public displays of affection, he often seemed to treat Pat with condescension or even coldness. This appearance had basis in fact. In planning his first trip to China, Nixon privately said that if Pat "goes, she goes solely as a prop."
It figures. Most presidents reach the White House only after many years of "public service," a euphemism for pursuit of power and position. In campaign after campaign, politicians learn to put their ambition ahead of everything else and to treat people around them, including their spouses, as "props." When scandal or defeat forces them from office, politicians claim they look forward to spending more time with their families. That's usually a lie. If they really wanted to spend time with their families, they would not have entered politics in the first place.
Dwight Eisenhower was the lone "nonpolitician" among the postwar presidents, but even he does not come across as an inviting marriage partner. "Mamie, there's one thing you must understand," he told his wife early in his military career. "My country comes first and always will. You come second." No wonder she drank.
Despite all the problems surrounding presidential marriage, a number of chief executives have depended heavily on their wives, none more so than Bill Clinton. As Edith Efron pointed out in these pages nearly three years ago ("Can the President Think?," November 1994), Hillary has served as the first Vulcan, a cool, rational check on the president's undisciplined mental habits. "To an inordinate degree, Hillary Clinton thinks for Bill Clinton," Efron wrote. "Specifically, she is Bill Clinton's access to the laws of logic, without which no thinking is possible."
This dependence has led to embarrassing consequences. "Hillary certainly was the leftist ideologue her conservative critics made her out to be," Troy says, so she was an odd choice to direct policy in a supposedly centrist administration. But Clinton could not do without her, so he put her in charge of his major domestic initiative, a "restructuring" of health care. The product was, of course, a disaster.
Just as important, however, the policy making process raised questions about the first lady's standing. Soon after she convened the health care task force, opponents sued. Since she was not a federal employee, they argued, the task force constituted a "federal advisory committee" that by law had to do its business in public. The Justice Department countered that she was the "functional equivalent of a federal employee." But when Republicans suggested that her health-related investments raised ethical questions, the White House counsel said she was exempt from the conflict-of-interest regulations covering federal employees.
This case suggests a larger point. It's wrong for the president to delegate real authority to a spouse, because power without accountability is dangerous. White House aides have to abide by a variety of statutes that sharply circumscribe their political activities (at least, that's the way it's supposed to work). Officials in the Cabinet and subcabinet have to account for their activities at congressional hearings, and the president himself has to answer to the voters. But the president's spouse is not subject to any such restraints. Indeed, courts cannot even compel husbands and wives to testify against each other in criminal trials, a bit of legal doctrine that may assume great importance in the next couple of years.
To remedy the mischiefs of White House marriage, Troy makes modest and pragmatic recommendations (e.g., "Support each other, rely on each other, but don't forget who's boss"). He assumes that the presidential spouse will be a woman, but a future edition of this solid book will undoubtedly have to address the status of a first gentleman.
Troy wisely recognizes the limitations of his subject matter. "The marriage tells us some things about the man and the president, but does not tell the whole story." For a broader (if less satisfying) view of the presidency, one may turn to Thomas S. Langston's With Reverence and Contempt. I opened the book with some hesitation, because its cover bears a blurb from the notorious Michael Lind (see "Power Puff," January 1997). Lind praises the book's "fascinating analysis," which is about as promising as William Shatner's lauding a fellow actor's "subtle and convincing performance."
Nevertheless, the book is not terrible. It starts with the sensible argument that Americans have invested "too many meanings within our presidency." They want presidents to embody the nation and reflect its majesty, but they also want them to symbolize a commitment to democracy. Americans want great men and common men all at once.
They also want presidents to lead the "civil religion." Americans have long seen their country as a special place with a providential destiny, so presidents often describe the United States in spiritual terms, invoking God's blessings and praying for His guidance. While Langston broadly describes the intellectual and political difficulties of civil-religious leadership, he gives too little attention to the practical problems that such leadership entails. For one thing, there's a fine line between spirituality and sanctimony. Jimmy Carter crossed that line, which is one reason why people remember his presidency the way they remember a tedious sermon in a crowded church in the middle of August.
If a president wants to acquire an air of godliness by quoting the Bible, he'd better get it right. Langston cites a line from Clinton's 1992 acceptance speech: "Scripture says [that] our eyes have not yet seen, nor our ears heard, nor our minds imagined what we can build." Langston then moves on without noting something important: Clinton misquoted the verse. Says 1 Corinthians 2:9: "No eye has seen, no ear has heard, no mind has conceived what God has prepared for those who love him." By substituting earthly endeavor for divine providence, Clinton deeply offended fundamentalist Christians.
Langston also errs when he says that Reagan's employment of civil religion was relatively passive: "In the 1980s, there was no demon bank to be subdued, no Senate censure to condemn, no party to build from scratch." What about the Evil Empire? Significantly, Reagan introduced that term in a 1983 address to the National Association of Evangelicals. Reagan told the evangelicals that he hoped the Soviets would someday discover God. "But until they do, let us be aware that while they preach the supremacy of the state, declare its omnipotence over individual man, and predict its eventual domination of all peoples on the Earth, they are the focus of evil in the modern world." He urged them not "to ignore the facts of history and the aggressive impulses of an evil empire, to simply call the arms race a giant misunderstanding and thereby remove yourself from the struggle between right and wrong and good and evil." That speech was among the most significant and revealing of Reagan's career, and Langston's book suffers badly from neglecting it.
Reagan was trying to get the evangelicals to oppose the nuclear freeze, which suggests another point that Langston misses. For decades, the danger of nuclear annihilation shaped the relationship of the people and the president. During the Cuban Missile Crisis of 1962, Americans prepared their fallout shelters, stocked up on canned goods, and came to grips with the idea that their world could burn up within a few hours. At the end of the crisis, many believed that John F. Kennedy's skillful leadership had literally saved their lives.
The commander-in-chief thus became the Man Who Kept the Missiles Away. Naturally enough, presidents exploited this stature to rally support for their foreign policy. They also used it to rationalize the growth of the national security apparatus and the invasion of civil liberties. Nixon even used it to justify the sacking of Watergate special prosecutor Archibald Cox. "Elliot," he told Attorney General Richardson, "Brezhnev wouldn't understand if I didn't fire Cox after all this."
In another way, however, the prospect of nuclear war made life tougher for presidents and presidential candidates. The chief executive could prevent a nuclear war, people realized, but he could also start one. Politicians and reporters thus gained an airtight reason for probing deeply into a would-be president's character. After all, anyone seeking the White House was asking for the power to kill every human being on earth. Any aspect of his public or private life that could hint at poor judgment was now fair game.
Not coincidentally, the first presidential campaign after the Cuban Missile Crisis featured vicious attacks on Barry Goldwater's stability. Four years later, Richard Nixon's campaign pamphlets said: "This time, vote like your whole world depended on it." In 1972, revelations of electroshock therapy instantly disqualified Sen. Thomas Eagleton as the Democratic vice presidential nominee. And in the following decades, a number of campaign commercials would feature the legendary "red phone" and the message that we needed a steady hand to answer it. As late as 1987, questions about judgment and character could still wreck a candidacy, as Gary Hart learned in the Donna Rice episode.
Bill Clinton, whose personality flaws were just as blatant as his unconcern for foreign policy, could probably not have won the White House during the Cold War. But by the time he geared up his 1992 presidential campaign, the perceived threat of nuclear war was fading, and people expected a good deal less from their president.
In evaluating the chief executive, perhaps we've gone too far in defining deviancy down. Even though we're not on the eve of destruction, a president can still do really bad things, such as collecting hundreds of confidential FBI files on political opponents. In this light, it's disturbing to read Gil Troy's description of how Bill and Hillary reacted to criticism: "The Clintons united in rage. Theirs was the angriest administration since Richard Nixon's."
It's tempting to stretch the Nixon analogy and start speculating about impeachment, but that's beside the point here. Regardless of who's president, abuse of authority will always be a danger as long as the federal government is too big, too complicated, and too intrusive. We should try to keep the power graspers out of the White House whenever we can, but above all we should drain the bureaucratic swamps that gave rise to Watergate and Indogate. The appropriate attitude toward the White House is neither reverence nor contempt, but watchfulness.
Contributing Editor John J. Pitney Jr. (jpitney@mckenna.edu) is associate professor of government at Claremont McKenna College.
Yet socialist ideals survive in new guises. In the 1990s, the rhetoric of class struggle has given way to the language of sustainable growth and economic justice. Leftists still speak of anger and vengeance, but nowadays they are just as likely to talk about compassion and sensitivity. Joe Hill, meet Barney the Dinosaur.
Socialists and socialist wannabes haven't really changed their goals–they've just changed their colors. When you look to the left, you won't see red. Instead, you'll see a spectrum of greens, grays, baby blues, and khakis. At first glance, these shades of ideology all seem different. But beneath the surface, they all pay devotion to the same master: a more powerful government. What follows is a brief spectroscopic analysis of our varied modern socialisms.
Green Socialism. Everybody likes clean air and lush landscapes. If you want to sell statism, you might want to wrap it in green. In 1996, Green Party presidential candidate Ralph Nader gained 651,771 votes (outpolling the Libertarian Party's Harry Browne), and a handful of Greens won local offices. The party's national program states its philosophy forthrightly: "Concepts of ownership are provisional and temporary, to be employed in the context of stewardship and of social and ecological responsibility." While claiming to disown Soviet economics, the Greens also spurn competitive capitalism "because it creates a dynamic of endless growth that is incompatible with ecological sustainability and that fosters greed and domination in society."
The Greens support public ownership of major industries, a guaranteed minimum income, a mandatory maximum wage, and free health care "under democratic public ownership and control." They occasionally praise decentralization, but all their talk of public control suggests that their model is not the United States under the Articles of Confederation, but Yugoslavia under Tito.
The Green Party program might be too bold for mass consumption, but paler versions are flourishing. In his 1992 book Earth in the Balance, Vice President Al Gore says that capitalism's blindness to ecology is "the single most powerful force behind what seem to be irrational decisions about the global environment." Gore proposes a worldwide accounting system "that assigns appropriate values to the ecological consequences of both routine choices in the marketplace by individuals and companies, and larger, macroeconomic choices by nations." He foresees treaties embodying "the regulatory frameworks, specific prohibitions, enforcement mechanisms, sharing arrangements, incentives, penalties, and mutual obligations necessary to make the overall plan a success."
Throughout the book, Gore keeps repeating that he believes in free markets. Right. And Strom Thurmond believes in term limits.
Gray Socialism. The American tradition of individualism holds that able-bodied workers should take care of themselves. Decades ago, supporters of the welfare state realized that they could bypass this resistance by focusing benefits on the elderly, a group with whom everybody sympathizes. Programs for old people, in turn, would build constituencies for more of the same: a bureaucracy eager to perpetuate itself and citizen pressure groups, such as the American Association of Retired Persons, raring for more. Social Security thus begat Medicare and SSI.
Large-scale programs for the aged, however, end up affecting everybody. Social Security numbers, which began as a tool for tracking wages, now enable various government agencies to keep an eye on all of us, all the time. Medicare started as a limited effort to help old people with medical bills, but to control costs and prevent fraud, Washington has spread bureaucracy and red tape throughout the health care industry. The feds have turned Dr. Kildare into Dilbert.
Proponents of gray socialism want to give the government even more power. The AARP would solve Medicare's problems through a "comprehensive health and long-term care system that provides access for all," with the money coming from a value-added tax or increased corporate profit taxes. As for Social Security, gray socialists shudder at the idea of letting individuals choose how to invest their own contributions. One alternative is for the government to put much of the Social Security trust fund in the stock market. Many gray socialists favor this approach, since these stock holdings would give Uncle Sam effective control of large swaths of the private economy. President Clinton says the proposal is "quite interesting."
Baby-Blue Socialism. According to a recent poll commissioned by a group of children's organizations, two-thirds of voters are willing to spend additional tax dollars to ensure children's welfare, and three-quarters are more likely to back candidates who support children's programs. Baby-blue socialists take advantage of these sentiments by framing every issue as a children's issue. Unemployment? Parents need jobs to buy food and clothing for their children. The environment? Hey, children breathe!
If you think I exaggerate, check out the 1996 Democratic platform, which mentions the words child, children, or childhood 89 times. It says America needs the National Endowments for the Arts and Humanities because "investment in the arts and humanities and the institutions that support them is an investment in the education of our children." Government-subsidized television is essential because "we want our children to watch Sesame Street, not Power Rangers." The Consumer Product Safety Commission is "an effective guardian of children and families in and around their homes."
Even arms control becomes a children's issue: "Today, not a single Russian missile points at our children." All that time we were trying to protect the Pentagon and Fort Bragg, the Soviets were really targeting Mr. Rogers' Neighborhood.
Khaki Socialism. During the Cold War, most Americans worried about those Russian missiles. This fear was justified, but it had the unfortunate side effect of encouraging people to let down their guard against government power. After all, who wanted to oppose measures that served our defense needs? The khaki cloak of national security not only led to the growth of the Pentagon, but also provided cover for the expansion of domestic programs. We had to expand education in order to keep up with the Sputnik-launching Russkis, hence the National Defense Education Act. We needed to make sure military vehicles could cross the country quickly, hence the National System of Interstate and Defense Highways. Many even argued that poverty and welfare were national security issues, since U.S. slums provided the Soviets with propaganda material.
This political strategy may have seemed outdated at the end of the Cold War, but it's making a comeback. In his 1997 State of the Union address, President Clinton uttered the phrase "national security" only once–and not in regard to the military. "I ask parents, teachers, and citizens all across America for a new nonpartisan commitment to education–because education is a critical national security issue for our future."
During the Cold War, supporters of government defense policy liked to say that "politics stops at the water's edge," implying that dissent was unpatriotic. Adapting this line to his own situation, President Clinton added that "politics must stop at the schoolhouse door." Mind you, he does not want government to stop at the schoolhouse door–he opposes vouchers. Rather, he wants to declare that Washington's role in education policy is off-limits to fundamental debate and disagreement.
Not every issue lends itself to a national-security justification. But big government's friends can disguise this by using military language as a metaphor. In his Depression-era first inaugural address, Franklin Roosevelt spoke of "broad executive power to wage a war against the emergency, as great as the power that would be given to me if we were in fact invaded by a foreign foe." Following FDR's lead, politicians would later declare a War on Poverty, a War on Drugs, and a Moral Equivalent of War on energy shortages.
Such language is supposed to be inspirational, but its implications are disturbing. During wartime, combatants obey orders and civilians set aside their personal misgivings in support of the national effort. Armies and navies are strict hierarchies that administer rewards and punishments based on bureaucratic rules, not marketplace demands. Indeed, former Navy Secretary James Webb calls the military "a socialist meritocracy." If the whole country ran on that basis during peacetime, it would look like…well, a socialist state.
Khaki can be an attractive color. So can green, gray, and baby blue. But once the paint wears off, you might be stuck with something very ugly indeed.
Contributing Editor John J. Pitney Jr. (jpitney@mckenna.edu) is associate professor of government at Claremont McKenna College.
In this context, the term refers to a practice whereby a candidate may run on more than one party ticket. How does it work? Let's say that Leslie Liberty and Syd Statist are running as the major-party candidates for a House seat, and that you are mapping plans for a minor party. If your party fields its own candidate in this race, it might siphon votes from Liberty (who supports limited government) and tip the election to Statist (who combines the ideology of Leon Trotsky with the personality of Roseanne). As an alternative strategy, you might consider giving your party's nomination to Liberty, who would then have two spots on the ballot. That's "fusion."
Depending on the circumstances, as we'll see, a fusion strategy might make good political sense. But there's one catch. Only four states (New York, Oregon, Utah, Vermont) expressly allow candidates to accept more than one party's nomination, while a few other states are silent on the subject. The great majority directly or indirectly forbid fusion.
Things were not always so. Fusion was common in the 19th century, reaching its peak when both the Democrats and the Populists nominated William Jennings Bryan for president in 1896. Seeing a threat from the Populists and other grassroots political movements, however, the major parties in the state legislatures then started to pass antifusion statutes.
The current challenge to these laws started with a 1994 race for the Minnesota legislature in which the leftist New Party chose to give its nomination to an incumbent Democrat. Though the lawmaker and his party consented to this endorsement, Minnesota's secretary of state said no, citing a state law directly banning fusion. The New Party mounted a legal fight (McKenna v. Twin Cities Area New Party), losing in federal district court but winning in the Eighth Circuit Court of Appeals. A few years earlier, another circuit court had reached the opposite conclusion, upholding an antifusion law in Wisconsin (Swamp v. Kennedy). To settle the issue, the U.S. Supreme Court accepted the Minnesota case for its docket, and it heard oral arguments in December.
Supporters of fusion argue that restrictive ballot laws abridge First Amendment rights by keeping minor parties from nominating the candidates of their choice. "This tells them what the substance of their political alliances can be," said Harvard law professor Laurence Tribe in his Supreme Court argument for the New Party.
Fusion enthusiasts also say that multiple listings give the voters more-detailed information and permit them to send more-specific messages to political elites. If a candidate appears on both the Republican and Right-to-Life party lines, the electorate will know more about that candidate's abortion stance than if he or she ran on the GOP ticket alone. And by voting for that candidate on the Right-to-Life line, anti-abortion voters indicate not only whom they support but why.
Opponents of fusion cite the need to preserve the two-party system. If a major party's candidates get a large share of their vote on a minor party's ticket, their loyalties will be divided and the last remnants of party discipline will vanish.
This argument assumes that the Republican-Democratic duopoly is a fundamental principle of American government. It isn't. The U.S. Constitution says nothing about parties, and it's awfully hard to find an endorsement of the two-party system in The Federalist. An advantage of a large and diverse republic, wrote James Madison, is "the greater security afforded by a greater variety of parties, against one party being able to outnumber and oppress" the others. Pedants might quibble that Madison was referring more to interest groups than to party organizations as we know them, but that objection misses the point: Madison liked the idea of loose and shifting political coalitions, the very thing that the antifusion laws try to stop.
Opponents of fusion also say that multiple ballot listings confuse voters. This contention is silly. In New York state, politicians routinely run on more than one line: In 1994, for instance, gubernatorial candidate George Pataki not only had the Republican and Conservative party nominations, he also created a "Tax Cut Now Party" so that he could get a third spot on the ballot. Residents of the Empire State are confused about many things (see any episode of Seinfeld), but this ballot system is not one of them. When I served as a precinct worker in Saratoga Springs, New York, I never saw a single person stagger from the voting booth, whimpering in bewilderment. This experience led me to a firm conclusion: Those who can't handle a multiple-listing ballot aren't going to vote anyway. People who are that cognitively impaired can't possibly find their way to the polling place.
During oral argument, several justices voiced reluctance to strike down dozens of state laws. Yet the antifusion statutes rest on such a weak foundation that the New Party might just win. If that happens, minor parties will have to rethink their political game plans, and they will discover that the fusion strategy has several advantages.
The first is economy of effort. Minor parties have a tough time recruiting and financing high-quality candidates. Let's face it: When seeking contenders for the county commission or even the U.S. House, minor parties find that their candidate pool is mighty shallow. Rather than scrape the bottom, minor parties could find it easier to give nominations to acceptable major-party candidates. This approach could also prevent embarrassment for minor parties. In a 1996 California congressional race, the Reform Party nearly nominated Mistress Madison, a professional dominatrix. (On second thought, it's too bad she didn't win: She would have made the perfect party whip.)
Of course, candidate-poor minor parties can always choose not to nominate anyone for a particular office. Unfortunately, blank ballot spots undercut a party's standing. Like a grocery store with empty shelves, a party without a full slate will come across as a fly-by-night operation.
The grocery analogy suggests another advantage of fusion. Browsers prefer to see at least a few familiar brands: If they look in the cooler and find only Mechanicville Cola instead of Coke or Pepsi, they will probably walk out without buying anything. In the same vein, the absence of well-known names on a minor-party slate may cause voters to wonder whether the party is a bunch of wackos. On the other hand, if a respected governor or senator accepts the party's nomination, the voters' comfort level will rise and they might take a look at the party's candidates for other offices.
Perhaps the greatest advantage of the fusion strategy is that it gives minor parties the power of political life or death over major-party candidates. In 1993, Rudy Giuliani ran for mayor of New York on both the Republican and Liberal lines, and the number of votes he won as a Liberal exceeded his margin of victory. If the Liberal Party had given its nomination to incumbent Democrat David Dinkins, Giuliani probably would have lost. And even if the Liberal Party had left its mayoral ballot slot blank, he might still have gone down to defeat. A significant number of New Yorkers hate the GOP so much that they would never touch a Republican ballot lever–even though they would vote for the same candidate on the Liberal line.
Accordingly, the Liberals gained a great deal of leverage over Mayor Giuliani, who gave plum jobs to well-connected Liberal Party members and to both sons of the party's state leader. The state's Conservative Party wields similar influence over Pataki.
To supporters of fusion, the New York state Conservative Party demonstrates how the practice can further the cause of principled politics. The party arose in reaction to the governorship of Republican Nelson Rockefeller, whose Mussolini-style public works dwarfed those of any "big-spending" Democrat. After spectacular showings by Conservative candidates (including the election of James Buckley to the U.S. Senate in 1970), Republicans increasingly warmed to the idea of fusion. To win the Conservatives' support, the state GOP moved steadily to the right.
But the Conservative Party experience also raises a cautionary note about fusion. In building its alliance with the GOP, the party has sometimes blessed candidates with less than totally conservative records. Sen. Al D'Amato, a Conservative Party favorite, has openly criticized Republicans for moving too far to the right. And according to the 1996 National Journal vote rankings, D'Amato's conservative scores are anemic: 54 percent on economics, 60 percent on social issues, and 63 percent on foreign policy.
A Conservative Party defender could argue that D'Amato is conservative "enough" and that any candidate who took a harder line would not win a majority in New York. (Buckley won with less than 50 percent in a three-way race.) This line of reasoning has its merits, but it raises larger questions. When minor parties turn pragmatic, what purpose do they serve? If you want to split the difference and yield on principle, why not merely join the Republicans or Democrats? And if fusion encourages compromise, might it diminish the creative, hell-raising, china-breaking role that minor parties have so often served?
If the Supreme Court opens the door to fusion, the leaders of minor parties should be wary before walking in. The major parties are like the Borg in Star Trek: If you get close to them, they'll assimilate you.
Contributing Editor John J. Pitney Jr. (jpitney@mckenna.edu) is associate professor of government at Claremont McKenna College.
–Walt Whitman, Leaves of Grass
Newt Gingrich likes to present audiences with a puzzle: Draw three rows of three dots each and, without lifting pencil from paper, draw four straight lines that cover all nine dots. Those who quit in frustration are surprised that one can solve the puzzle easily–by going outside the box. In a House speech, Gingrich once explained, "[Y]ou say to them, pointing back to the original nine dots, you say, 'What box? What perimeter? What limits?'"
Newt Gingrich is always trying to go outside the dots. As he once told Adam Clymer of The New York Times, "I had set out to do a very unusual job, which was part revolutionary, part national political figure, part Speaker, part intellectual." His detractors believe such statements reveal him as both an egotist and an undisciplined dabbler unwilling to acknowledge constraints on his visions. His admirers prefer to think of him as "Whitmanesque" (Walt, not Christine).
Either way, his leadership style is subject to sharp changes in direction. During the past two years, the congressional agenda reflected his war on the welfare state and his use of the balanced budget as a goal to focus his troops. This time, things may be different. "If the last Congress was the 'confrontation Congress,'" he told the House GOP in November, "this Congress will be the 'implementation Congress.'"
To understand such shifts, and to anticipate his agenda, one has to look at Gingrich from several perspectives. He is a self-described conservative who departs from conservative ideology, a Tofflerian futurist who has sometimes allied himself with Al Gore, a disciple of Peter Drucker who often neglects Drucker's maxims, a warrior who speaks of conciliation, and a political pragmatist who made his name as an ideologue.
The Un-Conservative
I find one side a balance, and the antipodal side a balance;
Soft doctrine as steady help as stable doctrine;
Thoughts and deeds of the present, our rouse and early start.
Newt Gingrich is not a conservative.
In 1983, explaining the moral of his favorite puzzle, he said that "just as the liberals have been trapped in the nine dots of bureaucratic solutions of Washington, so conservatives have been trapped in the nine dots of penny-pinching and negativism." Gingrich may have been thinking of the Burkean conservatism of incremental change, which he rejects in favor of revolutionary rhetoric. "Revolutions have to occur fast or not at all," he wrote in 1984. "Revolutions have to occur fast because they represent a fundamental break with the paradigm and power structure of the past."
If Gingrich is not a Burkean, neither is he a social conservative in the vein of Ralph Reed. When he discusses God, his references come not from Scripture but from Thomas Jefferson, who was a Deist. When he advocates school prayer, his argument is more sociological than theological: "It goes to the core of why Alcoholics Anonymous starts with the belief in a Supreme Being." Paul Weyrich, a leading social conservative, explained in a PBS interview: "When I hear about an issue, or when I'm considering a policy, the first question I ask is, 'Does this conform to the Judeo-Christian teachings on whatever subject it is we're talking about?' He does not start at that point. He starts at a different point. Is this good for the country? Is this good for the Republicans? Is this going to strengthen his majority?"
Even as a Republican who opposes gay marriage, Gingrich has long voiced tolerance for cultural differences. In his 1971 doctoral dissertation on Belgian education policy in the Congo, he criticized an influential 1921 book by a Catholic priest, who called for abolishing African adultery. "His definition of adultery was Christian and therefore monogamous," Gingrich wrote. "Yet the very basis of some African societies was polygamy. Eliminating the incredibly complex family relationships meant destroying the essence of tribal stability in many regions of Central Africa." Gingrich also expressed his tolerance on a personal level by allying himself with Wisconsin Rep. Steve Gunderson, until recently the only openly gay Republican in Congress. For social conservatives, then, Gingrich is not a true believer.
He is not really an economic conservative either, notwithstanding his pursuit of budget cuts. Unlike Majority Leader Dick Armey, who has a Ph.D. in economics, Gingrich often disparages the dismal science, with Hayek and Friedman conspicuously absent from his many lists of required readings. Gingrich's quest for solutions outside the "nine dots" is a classic case of what Thomas Sowell calls "the unconstrained vision," which denies the tradeoffs that conservative economic thought treats as inevitable.
More important, Gingrich has often praised big government. In 1983, he said his philosophy could "involve very activist government," and he cited such GOP precedents as the transcontinental railroad. When a liberal interviewer once asked about problems with private enterprise, he said: "Oh yeah. But see, I'm not a libertarian. I say it pretty clearly in the book [Window of Opportunity]. I am not for untrammeled free enterprise. I am not for greed as the ultimate cultural value."
So if Gingrich is not a traditionalist, a social rightist, or an economic conservative, what is he?
In a 1989 interview with Ripon Forum, Gingrich suggested the answer: "There is almost a new synthesis evolving with the classic moderate wing of the party where, as a former Rockefeller state chairman, I've spent most of my life, and the conservative/activist right wing." In important ways, the Gingrich Republicanism of the 1990s echoes the liberal Republicanism of the 1960s.
Those Republicans stood out by their commitment to civil rights, a position that appealed to Gingrich. Even as an undergraduate political activist at Emory University, he denounced the racism of Georgia Democrats and urged Republicans to court black voters. In 1979, his first entry in the Congressional Record marked Martin Luther King's birthday. During the 1980s, Gingrich supported sanctions against South Africa's apartheid regime.
Civil rights leaders may attack Gingrich's stands, but one can still detect signs of his 1960s liberalism. In his inaugural address, he said, "The greatest leaders in fighting for an integrated America in the 20th century were in the Democratic Party." In the 104th Congress, he squelched a full-scale assault on affirmative action, arguing that the GOP should spend "four times as much effort reaching out to the black community to ensure that they know they will not be discriminated against, as compared to the amount of effort we've put into saying we're against quotas and set-asides."
In the 1960s and '70s, liberal Republicans stood in the vanguard of the environmental movement. (Rockefeller's interest was proprietary, since his family owned much of the planet.) Gingrich has carried this tradition to Congress, trying to soften the party's "anti-green" image. His November speech to the House Republicans nicely captured his attitude: "As Americans we should not accept a tradeoff which says you're either for bureaucrats bullying citizens or you're for killing off endangered species. We are for endangered species being saved, and we're for American liberties being saved. We're for the right technologies for the environment and the right opportunities for the economy. And yes, that takes creativity, but that's why we were elected: to be creative, not just coercive. And we're going to solve both."
This tack is reminiscent of the liberal Republicans' public policy approach, which faulted the Great Society less for its lofty aims than for its unresponsive bureaucracies. In his doctoral dissertation, Gingrich was already thinking that way. "Belgian colonialism was in fact a model of technocratic government," he said, concluding that "the dream of technocratic planning had all too many hidden limitations and so became a nightmare." Substitute "Washington bureaucracy" for "Belgian colonialism," and you have the makings of a Gingrich floor speech.
During the 1960s, Ripon Republicans advocated early versions of proposals such as enterprise zones, which they said would lead to an "opportunity state." In the 1980s, Gingrich changed the term to "opportunity society" and used it to sum up his guiding notions: devolving power and programs to states and localities; privatizing government functions wherever possible; replacing red tape with economic incentives; and reforming government to make it more accessible.
If this description sounds like Clintonism, don't be surprised. Clinton, after all, has cribbed much of his rhetoric from Gingrich. (See "The Adventures of 'But-Man,'" November 1996.) As Gingrich adopts a more cooperative attitude toward the administration, expect to see agreements on modest steps to improve federal performance. In the spirit of the Reinventing Government initiative, Gingrich told the House GOP, "I believe we can overhaul the mid-level bureaucracy of the Pentagon and that our goal should be to turn the Pentagon into a triangle by reducing at least 40 percent of the unnecessary duplication and waste that's in the system." (Here he was going outside the "nine dots" of traditional geometry.)
Gingrich's principles will run into problems. Cutting elements of the Pentagon budget might make sense, but there will be tradeoffs in the military's capabilities. More broadly, the GOP agenda is not entirely consistent. Rep. Christopher Shays (R-Conn.) admitted to reporter Elizabeth Drew that Republicans "have some conflicting interests, and we want block granting and freedom for local and state governments when it fits our agenda, and we want restrictions when that fits our agenda."
The Futurist
Chanter of Personality, outlining what is yet to be,
I project the history of the future.
Gingrich is both historian and futurist. In high school, he read Toynbee's A Study of History as well as the Foundation trilogy, in which Isaac Asimov recast the Roman Empire as a space-based civilization. As a House member, Gingrich made the trilogy required reading for his aides. As he explained to aide Frank Gregorsky, "[W]hat I'm trying to convey to you is that I'm a figure who thinks in terms of 100-year increments, and I think in terms of civilization's rising and falling over 500-year increments."
Asimov's main character, Hari Seldon, is a "psychohistorian" who forecasts his civilization's decline and devises a way to hasten its renewal. Gingrich summed up the underlying concepts: "The premise was that, while you cannot predict individual behavior, you can develop a pretty accurate sense of mass behavior…[Yet] Asimov did not believe in a mechanistic world. Instead, to Asimov, human beings always hold their fate in their own hands."
Another important influence on Gingrich has been the work of Alvin Toffler, author of Future Shock (1970) and The Third Wave (1980). Just as the world shifted from agricultural civilization ("The First Wave") to industrial civilization ("The Second Wave"), says Toffler, now it is shifting to a civilization based on knowledge and information ("The Third Wave"). According to Gingrich, "the information age means more decentralization, more market orientation, more freedom for individuals, more opportunity for choice, more capacity to be productive without controls by the state." One passage in Future Shock helps explain futurism's political value: "As we move from poverty toward affluence, politics changes from what mathematicians call a zero-sum game to a non-zero sum game….A system for generating imaginative policy ideas could help us take maximum advantage of the non-zero opportunities ahead." Technology will not just settle problems, it will transcend them–a painless solution to the nine-dot dilemmas of public policy.
Applying this idea to health care, Gingrich told his colleagues in November that "we represent better care with better science through better participation, so you have a better quality of life at lower cost. And frankly, I think we as a party can do an immense amount…at dramatically expanding the opportunities for the American people, and in the process, both improving the quality of life and lowering the cost to the taxpayers." In the 105th Congress, look for GOP initiatives on medical research and "wellness" programs.
This example, however, highlights a problem with the futurist approach. According to Charles Krauthammer, a centrist commentator with an M.D., Gingrich's optimism about costs is "nonsense on stilts." High-tech medicine extends lifespans, thereby increasing the ranks of the elderly, who need costlier treatments. In this case as in others, futurism downplays tradeoffs.
By encouraging government planning, futurism may also spawn bureaucracy. As a House member, Al Gore sponsored legislation to establish an Office of Critical Trends Analysis that would have prepared reports on "critical trends and alternative futures," with the help of an Advisory Commission on Critical Trends Analysis. His bill's co-sponsor was Newt Gingrich.
Both should have remembered what Hayek wrote: "Human reason can neither predict nor deliberately shape its own future. Its advances consist of finding out where it has been wrong."
The Executive and Entrepreneur
I hear it was charged against me that I sought to destroy institutions;
But really I am neither for nor against institutions;
(What indeed have I in common with them?–Or what with the destruction of them?)
Gingrich has often acknowledged his debt to the works of management scholar Peter Drucker, especially The Effective Executive (1966). During one of his college lectures, Gingrich rhetorically asked how he could do so many things at once. "And the answer is this book," he said. "This book taught me a quarter century ago how to systematically discipline, plan, think through, delegate, trust others to build a system." Anyone who knows Gingrich's career will find familiar concepts in The Effective Executive. For instance, Drucker's "rules for identifying priorities" embody much of Gingrich's style: "Pick the future as against the past….Focus on opportunity rather than on problem….Choose your own direction rather than climb on the bandwagon….Aim high, aim for something that will make a difference, rather than for something that is 'safe' and easy to do."
In his pre-leadership years, Gingrich always aimed high. Rather than embracing the "minority mentality," he sought ways to make the House GOP the majority. After the 1982 elections, he helped form the Conservative Opportunity Society, a new kind of congressional organization that sought to sharpen partisan distinctions and communicate the GOP message to the C-SPAN audience.
Gingrich was not merely tinkering with partisan tactics but creating a new, Gingrichian politics. "There are two ways to rise," he once said. "One is to figure out the current system and figure out how you fit into it. The other is to figure out the system that ought to be, and as you change the current system into the system that ought to be, at some point it becomes more practical for you to be a leader than somebody who grew out of the old order."
Gingrich could accomplish this goal because he observed another Drucker maxim: "Know thy time." That is, simply stop doing things that eat up work days without yielding results. As a backbencher, Gingrich chose to forgo the legislative detail work that consumes so many other members. Since the majority Democrats ignored GOP ideas anyway, Gingrich reasoned, why go through the motions?
After he became speaker, however, he let his self-confidence eclipse Drucker's advice about time. He took on too many duties, and when he got over-tired, he made serious mistakes. Thus he illustrated another Drucker saying: "Strong people always have strong weaknesses, too. Where there are peaks, there are valleys." Later on, he tried to rise from the valley by delegating more duties to Majority Leader Armey.
The first year of Gingrich's speakership saw harsh partisan struggles over the budget and social policy. Why did he begin by taking such a hard line? A passage from The Effective Executive explains that "one must start with what is right rather than what is acceptable because one must eventually compromise. But if one does not know what is right to satisfy the specifications and boundary conditions, one cannot distinguish between the right compromise and the wrong compromise–and will end up by making the wrong compromise."
In 1996, Gingrich and the Republicans did compromise on issues such as health and welfare reform. Because of their initial firmness, they argue, they ended up with the "right" compromises. Some conservatives would reply that the GOP "revolution" made the wrong compromises, leaving too much of the welfare state intact. They would quote another passage from Drucker: "The surgeon who only takes out half the tonsils or half the appendix risks as much infection or shock as if he did the whole job. And he has not cured the condition, has indeed made it worse."
In the next session of Congress, conservatives on Capitol Hill may hesitate to cede quite so much power to Gingrich, since they have seen that his valleys can dip very low indeed. Instead, some may turn to Senate Majority Leader Trent Lott, who lacks Gingrich's zigzag brilliance but wields power with a steadier hand.
The Warrior
Adieu, dear comrade!
Your mission is fulfill'd–but I, more warlike,
Myself, and this contentious soul of mine,
Still on our own campaigning bound…
Like all politicians, Gingrich uses military terminology. (After all, words such as campaign and strategy were born on battlefields.) Unlike most other political figures, he seriously thinks about applications of military analysis. In his first successful congressional race, he told a group of College Republicans: "A number of you are old enough to have been platoon leaders, or company commanders, depending on the situation, and how rapidly you move up in rank. This is the same business. We're just lucky, in this country, we don't use bullets, we use ballots instead. You're fighting a war. It is a war for power."
This mindset manifests itself in very concrete ways. Throughout 1995, Gingrich sent House Republican leaders and their aides to the U.S. Army Training and Doctrine Command centers in Virginia and Kansas. He reportedly asked military officers to prepare training materials for House Republicans and conduct "after-action reviews" of legislative maneuvers. Upon their disclosure, these practices exposed Gingrich's flank to Democratic attack. (See how easy it is to slip into military language?)
His model for long-range planning–"vision, strategy, projects, and tactics"–comes from military literature. Civilians may not always associate "vision" with olive-drab uniforms, but the idea is essential to soldiering. Clausewitz wrote that an indispensable quality for a military leader is "intellect that, even in the darkest hour, retains some glimmerings of inner light which leads to truth."
Gingrich's strategic vision reflects the influence of the ancient Chinese warrior Sun Tzu, who taught: "Anger [the opposing] general and confuse him….Keep him under a strain and wear him down." These proverbs encouraged Gingrich in his one-on-one battles with Democratic leaders. Tip O'Neill's famous 1984 outburst on the floor, a reaction to Gingrich attacks, confirmed ancient advice: "If the enemy is obstinate and prone to anger, insult and enrage him, so that he will be irritated and confused, and without a plan will recklessly advance against you."
Verbal attacks serve another purpose. Military leaders try to fire up their troops by telling them about the evils the enemy has committed and the even greater horrors that the enemy would perpetrate if it won. In the long march to the speakership, Gingrich took this approach against the Democrats. In 1988, he blamed Supreme Court nominee Robert Bork's defeat on a liberal smear campaign, adding: "The Left at its core understands in a way Grant understood after Shiloh that this is a civil war, that only one side will prevail, and that the other side will be relegated to history. This war has to be fought with the scale and duration and savagery that is only true of civil wars. While we are lucky in this country that our civil wars are fought at the ballot box, not on the battlefield, nonetheless it is a true civil war."
While Gingrich is sticking to his military planning model, the harsh publicity of 1995 and 1996 has prompted him to make his rhetoric a little more pacifistic. But in 1997, he may find the other side unwilling to call off the war. Liberal Democrats have taken every opportunity to attack his ethics and undercut his leadership, and they show no signs of letting up. In describing this strategy, Rep. George Miller (D-Calif.) himself used a military analogy: "Newt is the nerve center and the energy source. Going after him is like trying to take out command and control."
The Pragmatist
Do I contradict myself?
Very well, then, I contradict myself;
(I am large–I contain multitudes.)
"I think Newt's always been supremely pragmatic," says former Conservative Opportunity Society ally Vin Weber. "[F]or a long time, [that] simply meant keeping your mouth shut and going along. If pragmatic is defined more literally as doing what it takes to succeed, Newt's always been pragmatic." Newt Gingrich is a risk taker, but he is also a practicing politician. In 1986, he acknowledged that "you can trim some programs and you can kill some programs, but the first duty of a political coalition is to sustain its majority."
This pragmatism has cropped up on a number of issues.
In Window of Opportunity, Gingrich singled out the United Auto Workers as a praiseworthy, progressive union. He also wrote: "There are times and places when specific protectionist steps are appropriate: protectionism can defend an industry vital to national defense, can buy time for an industry to make adjustments to a sudden change in its environment, and can bludgeon a trading partner to force it to engage in fair trade." These comments, which clashed with GOP skepticism toward unions and its free trade ideology, reflected local concerns: Gingrich's district at the time included two auto plants, and protectionist sentiment was running high in Georgia.
In 1985, Gingrich persuaded Delta Air Lines to take reservations for Air Atlanta, the largest black-owned airline. Reporter Nicholas Lemann said: "Many conservatives would recoil in horror at the thought of politicians pressuring a company into a decision for reasons of race rather than efficiency; in the conservative movement racial quotas, minority business set-asides, and the like are at the top of the list of evils right now." In this case, Gingrich's intervention served two practical purposes: improving the GOP's image in the black enterprise community and serving a local business interest.
In 1992, during a difficult primary, Gingrich argued that Republicans should support him because he could bring home more federal benefits. Columnist George Will observed, "Gingrich may have saved his career as a professional legislator, but he ended his career as the scourge of the 'corruption' of the welfare state in the hands of career legislators." When an interviewer presented him with such examples of position shifting, he responded: "Oh, you can find more examples of chameleon-like behavior like that. Look, I believe in pragmatism. But it's tautological. Conservatism works. The work ethic works. Strength works. The free market works. Focusing on learning works. Preventive health works. So I can tell you with a straight face I am pragmatic, and as a result I am driven to conservatism. But I am not dogmatic. I think if non-conservatism works, I'll look at it, too. It just doesn't work as well."
Such pragmatism is fine for a split-the-difference Republican such as Bob Dole or Bob Michel, but it is a most peculiar attitude for a revolutionary. Try to picture Lenin saying, "If czarism works, I'll look at it."
Pragmatism also holds a more immediate political risk. In 1992, Gingrich told Republican congressional candidates in Georgia not to pledge their support to the unpopular George Bush in case a three-way election went to the House. In 1996, he told marginal Republican incumbents to "do what gets you re-elected," even if that meant ignoring Bob Dole. If the Democrats score direct hits on Gingrich over ethics or other issues, he will be looking for allies. He may not find them among Republicans who have fully absorbed his lessons about pragmatism.
Contributing Editor John J. Pitney Jr. (jpitney@mckenna.edu) is associate professor of government at Claremont McKenna College.
Movie buffs lovingly remember the name of Edward D. Wood Jr., director of such films as Glen or Glenda? and Bride of the Monster. Wood considered himself something of a genius, but he was so profoundly incompetent that his movies are actually fun to watch. In making his "masterpiece," Plan Nine from Outer Space, Wood used toy-store models to represent flying saucers and randomly alternated shots of daylight and darkness within the same sequence. When star Bela Lugosi died two days into filming, Wood replaced him in the remaining scenes with his wife's chiropractor. The substitute looked nothing like Lugosi, but that didn't stop Ed Wood: He merely had the man hold a cape over his face.
Michael Lind is becoming the Ed Wood of American political writing. Once a low-level munchkin in conservative intellectual circles, Lind has recently won himself a good deal of ink by denouncing his former patrons as divisive and reactionary. Earlier this year, he published Up from Conservatism, combining a memoir of his conversion with a wide-ranging attack on his forsaken ideology. Lind may have intended this book to be his own version of St. Augustine's Confessions, but it ended up as his Plan Nine from Outer Space.
Readers familiar with conservative and free market thought can derive perverse amusement from the book's countless omissions and mistakes. For instance, Lind says the "only libertarian organization of any importance in American politics is the Cato Institute." If you are holding this magazine in your hands, then you know about this other outfit located on South Sepulveda Boulevard in Los Angeles….
Like Ed Wood, Michael Lind works fast. Just a few months after Up from Conservatism, he has published Powertown, whose jacket notes identify it as his "fiction debut." Discerning consumers of the previous book would disagree, but that's not important here. Powertown is a different kind of work, and it's monumentally awful in a different way.
"Powertown" is Washington, D.C., which provides the backdrop for the novel's three loose storylines. The first involves a group of self-absorbed professionals who work in the political establishment and constantly obsess about their careers and sex lives. The second concerns black "gangbangers" who take drugs, shoot one another, and use the f-word a lot. The third centers on an illegal-immigrant cleaning lady, and in these chapters, the other characters mainly consist of people who are mean to her.
There's nothing wrong with a multilayered plot, so long as the author makes sure that readers can keep the characters straight. Within the first 30 pages of Powertown, Lind utterly bungles this elementary requirement of novel writing. We meet Stef Schonfeld, a woman who works for Rep. Jim Ritter on the Appropriations Committee. Stef is the inaugural-ball cover date for Avery Brackenridge, a gay black guest-booker at National Public Radio, whose lover is Ross Drummond, a white Republican political consultant. Ross's cover date is the wealthy Agalia Kazakis, who, as we later learn, heads a charitable foundation. Ross introduces Stef to the handsome Bruce Brandt, who works for Mrs. Gutierrez, the drug czar.
Don't complain yet: We're only up to page six. Now Lind introduces us to Velma and Curtis Hawkins, a working-class black couple who have two daughters, Lorena and Marilyn. With Velma's father, Dad Johnson, staying at their home, Curtis drives to another neighborhood to pick up Velma's sister-in-law Sharonda, together with her four children, Arnetta, Monique, Jamal, and Evander, as well as Sharonda's mother, Stella Morris. The next day, Evander learns from his friend Twon that the gangbanger Lookout Williams has been making threats against him, so he asks for protection from Frizzell, the chief gangbanger.
Meanwhile, evil landlady Mrs. Reyes evicts Graciela Herrera and her children, Rosa and Marcello. The three unfortunates move in with Graciela's cousin Yolanda, whose husband Isidro is displeased. Graciela goes to Senor Martinez, a purveyor of fake green cards, for help in finding a new place. Martinez sends Darryl Shelton, who moves her into a seedy apartment. When Rosa gets sick, Darryl arranges for Graciela to take her to Dr. Sorzano. But apparently, Darryl is seeking sexual favors in return for his assistance.
That's the state of play as of page 29, and it just keeps getting worse. I counted 92 different character names, and I probably missed a few. Powertown is not a novel, it's a census.
The names are as confusing as they are plentiful, so a reader literally needs a computer database. Sharonda, remember, is Velma Hawkins's sister-in-law, whereas Shirrelle is the aunt of black teen Latisha "Teesha" Madison, who becomes Evander's girlfriend. And do not confuse Latisha Madison with Letitia Monroe, who is a wealthy white woman. The same goes for Jack Vincent (legislative assistant to Rep. Lou Mayers) and Jack Dougherty (legislative assistant to Sen. Ted Chappell).
While creating this population explosion, Lind himself apparently got confused. He tells us that Dad Johnson had a son named Luke, who fathered Evander and died of a drug overdose. A few pages later, though, he writes that Evander's father was named Luke Turner and that he died in a car crash. On page 130, there's a character named "Willie." On page 131, we read about "Willy." Has young William suddenly decided to change the spelling of his name, or is Lind treating us to a guest appearance from the famous orca whale?
Lind's literary atrocities go far beyond the name game. He describes Evander's visit to a dance club: "The hammering beat coaxes his heart into sync with it; part of him wants to abandon himself to the amplified throbbing, to become one more stalk of grass in this field riffled by the thundering gale from the stage." He says of the hapless immigrant: "Graciela imagines what it must be like to be a soul floating free of the body, to be dazzled by a milky effulgence that burns through the very fabric of the universe like a flame eating through the walls and ceiling of a burning house."
And then: "The flame will speedily travel around the earth, back along the line of gasoline to the can, or the sun itself. It will explode this source and spread to every place that gasoline, our sunlight, touches. Explode the sunlight here, gentlemen, you explode the universe." Oh, sorry, that last passage really is from Plan Nine from Outer Space. You can see how easy it is to mix up Lind's lines with Ed Wood's.
There's one big difference, however: Wood's major characters were all white. In Powertown, Lind tries to get us in touch with the African-American experience by including a large number of black characters. Evidently, he had trouble figuring out what to call them all. He offers two different Jocelyns, one being Latisha's mother and the other being Lookout's girlfriend. After running out of handles such as Rasheed and Jamal, he turned to the names of real public figures. A gang member is called Tito, after one of Michael Jackson's lesser brothers. An NAACP lawyer is named John Conyers, after the long-serving congressman from Detroit.
His dialogue is a middle-class white guy's take on what ghetto dwellers sound like. Here's a reaction to a murder: "'Oh, lawdy,' says Velma, 'Somebody else got shot. Lawd a' mercy.'" And here's a discussion of young love: "'While we was in the kitchen,' Lookout tells him, 'she say, "Who that boy?" She say, "He look mighty fine." She wants you, cuz. Word, cuz. Go for it.'" To his credit, Lind is an equal-opportunity offender. None of his characters–black or white, straight or gay–sound like recognizable human beings.
Lind's Washington bears no resemblance to the real city on the Potomac. The Washingtonians who do the most harm are not the sex-obsessed cynics of Powertown, but nerdy idealists who spend their nights drafting new regulations for the Environmental Protection Agency and Occupational Safety and Health Administration. Perhaps Powertown is not really about people in Washington at all, but about alien creatures on a strange new world. In that case, it might supply the basis for a nifty science fiction movie.
Too bad Ed Wood isn't around to direct it.
Contributing Editor John J. Pitney Jr. (jpitney@mckenna.edu) is associate professor of government at Claremont McKenna College.
Walter Truett Anderson
In recent times we have seen the emergence of a new polarization–anti-technology vs. pro-technology, Luddite vs. techie. The neo-Luddites dream of people leaving technology behind and advancing into a future that looks–well, looks a lot like the past. The techies dream of artificial intelligence–computers so brilliant that they can advance and leave people behind. (See, for example, the cyberpunk classic Neuromancer.) Along with this goes a lot of argument–much of it useless hyperbole–about whether technology is good or bad, destroyer or savior.
What we really need to do, rather than take sides in any such simplistic fistfights, is to understand how inseparable technological change is from human evolution. Technology is us.
Two books that I will mention here address this issue straight on. The third, a work of science fiction, illuminates it in more indirect ways, as fiction should.
Origins of the Modern Mind (1991), by the Canadian psychologist Merlin Donald, argues that the human species has evolved by developing new "systems of representation," and that at each stage–as people invent new ways to communicate and manage information–we become in fact a different species. The first big jump, he says, was the invention of mimesis. Then came speech, then–much later–writing. We are now in the midst of another such transition, and it is literally changing the way we think: "The growth of the external memory system," he says, "has now so far outpaced biological memory that it is no exaggeration to say that we are permanently wedded to our great invention, in a cognitive symbiosis unique in nature." What this means is that we are now evolving into computer-connected beings with a computer culture and a computer civilization.
Bruce Mazlish of MIT makes a similar point in The Fourth Discontinuity (1993), although his framework is more historical than evolutionary. He takes his title from the proposition that the human species has in recent centuries gone through a number of "discontinuities," each of which involved learning new–and disturbing–lessons about the world and our place in it. We learned the Copernican lesson that our planet is not discontinuous from the heavenly bodies, we learned the Darwinian lesson that humans are not discontinuous from the animals, and we learned the Freudian lesson that the conscious mind is not discontinuous from its preconscious origins. Now, he says, "humans are on the threshold of decisively breaking past the discontinuity between themselves and machines," discovering "that tools and machines are inseparable from evolving human nature." Mazlish's book doesn't footnote Donald's, and I don't think that is an academic oversight. Rather, I suspect he merely moved along his own disciplinary path (he is a historian) and came to a quite similar conclusion.
Kim Stanley Robinson isn't trying to make any such point in Blue Mars (1996), but he makes quite a few anyway. This is the third volume in a trilogy (the previous installments were Red Mars and Green Mars) about an expedition to Mars that results in the deliberate transformation of the planetary ecology and the growth of a new human civilization. Robinson's books reflect most of the current scientific thinking about "terraforming," and also show how that issue might lead to a new kind of technophobe-technophile argument. The major political groupings in his story are the Greens, who are eager to modify the planet, and the Reds (more or less similar to the Greens here on Earth), who prefer to leave it alone. In the books the Reds win many of the arguments, but the Greens proceed to change Mars–while human beings move on to terraform Venus, various asteroids, and the moons of the outer planets. Along the way they go through several technological revolutions and evolve some fancy artificial intelligence, but remain recognizably human. Technological change and human evolution proceed inseparably from one another, much as (according to Donald and Mazlish) they always have.
Walter Truett Anderson (waltt@well.com) is a political scientist, journalist, and author/editor of numerous books–most recently Evolution Isn't What It Used to Be: The Augmented Animal and the Whole Wired World (W.H. Freeman) and The Truth About the Truth: De-confusing and Re-constructing the Postmodern World (Tarcher/Putnam).
Stephen Cox
To grasp the significance of technology, it's helpful to look at a society that didn't have much of it to go around. Conquest (1993), Hugh Thomas's magisterial account of the destruction of the Aztec Empire, shows precisely how far a society could advance without wheels, nails, or candles. (The lack of firearms was a comparatively minor problem.) Thomas demonstrates what can and can't be done in such a society, and he dramatically illustrates its vulnerability to any competitor that has marginally less primitive tools.
But material technology is the child of intellectual technology, whose best conquests are peaceful ones. A remarkable example is the sudden triumph of agriculture on the North American plains–an effect of the advanced intellectual technology of free enterprise. Willa Cather's great novel O Pioneers! (1913) richly evokes the experience. Cather's protagonist is a young woman who is distinguished by her skillful use of capitalist methods. Hoping to do more than scratch out a modest living through sheer hard work, she takes the risk of thinking. She invests in real estate and farming methods that other people scorn, and her investments pay off. She transforms both her land and her life.
You can't understand technology without understanding how people think about technology, and, of course, you need to know the bad ideas as well as the good. That's why I want to insert a recommendation of just one influentially bad book, Thorstein Veblen's The Engineers and the Price System (1921). Veblen ably advocates a leading myth of the machine age: the idea that material technology "advances" because of people's collective efforts, only to be manipulated and hindered by capitalists for the sake of their private profits. This idea represents a profound misapprehension of the ways in which material technology is affected by investment, market prices, and property rights. Veblen recommended that capitalists be replaced by a "Soviet of technicians" that could "take care of the material welfare of the underlying population"–a proposal that is either chilling or comic, depending on the way you want to take it, but that is very much in the 20th-century spirit.
One of the finest books written in opposition to that spirit is Isabel Paterson's The God of the Machine (1943; republished 1993, with an introduction by me). Paterson offers a complex and compelling theory of history that explains the relationship between a dynamically developing material technology and the concepts of individual rights that are fundamental to capitalism. And she adds a warning to anyone who assumes that the industrial machine on which modern life depends will keep on humming even in a world dominated by social engineering. "For the very reason that the action of inanimate machinery is predetermined," she says, "the men who use it must be free. No other arrangement is feasible." After reading Paterson, one can hardly look at a toaster, much less a computer, without thinking of the Bill of Rights. But that's as it should be. It's not an accident that rights came first and toasters came second.
Stephen Cox (sdcox@ucsd.edu) is a professor of literature at the University of California, San Diego, and the author of Love and Logic: The Evolution of Blake's Thought (University of Michigan Press).
Penn Jillette
When you're in a Spielberg state of mind, try this: Take a baby from 150,000 years ago and raise him/her in modern Manhattan. What have you got? You've got a 21st-century kid, with in-line skates. Now, take the next kid born now and send him/her back 150,000 years and what have you got? Some grub-scrounging missing link.
Technology is all that matters. Technology is all that makes us human. You want books on technology? Every goddamned book is about technology. Every conversation is technology. Technology is all we got. If you don't like technology, you don't like humans.
If you want the above premise written by authors who aren't smartasses, try Making Silent Stones Speak: Human Evolution and the Dawn of Technology (1993), by Kathy D. Schick and Nicholas Toth. They're a nutty couple that went out, lived in the bush, made stone-age tools, and used them for wacky stuff like butchering an elephant. Is that science or performance art? It's the best of both. Read it.
You want another book that'll rock your world? Try The Beak of the Finch (1994), by Jonathan Weiner. It's about another cool science-nut couple. The couple is perfect, but the author screws the pooch with biblical references. Jesus and Bible quotes in a science book are pure evil. He also misuses the phrase, "Possession is 9/10ths of the law," and yaps about Zen, global warming, and other hippie Luddite ideas. It's still a great book. This nutty couple goes and lives on one of those rocky, useless, Darwin islands and measures the length of the beaks of the finches (it's a well-named book). Again very close to performance art. They see evolution happening! Technology finding out exactly how we got here.
These books make you love goofy couples. The book I want now is the sex book about these wilderness-science couples. Never mind Pamela Anderson and Tommy Lee, I want to hear the nasty on the professionals that have nothing to do but discover our world like humans and screw like beasts. Those are the techno-videos I want to study.
The cheeses at REASON asked me to include one work of fiction. Why not include the best work of fiction in the world? The Mezzanine (1988), by Nicholson Baker, will kick your ass. I laughed, I cried. What the hell more you want? After science-sex couples use technology to learn about who we are and how we got here, let Nick think about it. His character takes a deep whiff of the madeleine of our modern world and records everything he thinks while riding the escalator. It's an importantly funny book. When your thoughts overlap his, you get that universal-one-world-one-people feel that's so great and when you don't overlap it's "oh-man-we're-all-alone," Camus-city. Both feelings are important. Both are true. The Mezzanine is true.
And it's technology that lets us think together.
Penn Jillette (mofo666@sincity.com) is more than half of Penn & Teller by weight. He has co-authored Penn & Teller's Cruel Tricks for Dear Friends (Villard) and Penn & Teller's How To Play with Your Food (Villard). He also does some sort of magic show.
Jonathan Kochmer and Jeff Bezos
Jonathan Swift's 1696 satire, "Battle of the Books," describes a passionate war between armies of living books–the Ancients and the Moderns. Exactly 300 years later, equally vigorous yet often simplistic battles rage among today's Ancients and Moderns: technophobes and technophiles.
The Pinball Effect, Out of Control, and Haroun and the Sea of Stories are three books that abandon simple explanation, instead acknowledging complex interdependencies between the makers and the made. They also explore the possibility that distinctions between society and technology are becoming less pronounced.
Everyone recognizes that technological history is complex, but most authors still clutch timelines and linear paths of cause and effect like drowning sailors. In contrast, James Burke's The Pinball Effect (1996) is a stunningly original and joyously otterine swim through the sea of history: Few other books we know so masterfully document the dizzyingly intricate symbioses of inventors and invention.
Kevin Kelly's Out of Control (1994) demonstrates that human artifacts such as machines, networks, and economies are becoming like organisms, ecologies, and societies–sometimes by design, but increasingly by emergence. Will centralized, engineering-oriented solutions be abandoned as the distinction between the made and the living dissolves? Some call Kelly a technophile, but he is in fact a syncretic biophile: His rallying cry is "life is the ultimate technology."
Finally, Salman Rushdie's Haroun and the Sea of Stories (1990) can be read as a lyrical parable of culture as a technology, with stories portrayed as the machinery of cultural replication. And few would suspect that Rushdie has written one of the best metaphorical descriptions of the Internet: "…the water…was made up of a thousand thousand thousand and one different currents, each one a different color weaving in and out of one another like a liquid tapestry of breathtaking complexity; and Iff explained that these were the Streams of Story, that each coloured strand represented and contained a single tale…as all the stories that had ever been told and many that were still in the process of being invented could be found here, the Ocean of the Streams of Story was in fact the biggest library in the universe….the stories were held here in fluid form, they retained the ability to change…unlike a library of books, the Ocean of the Streams of Story was much more than a storeroom of yarns. It was not dead, but alive….'And if you are very, very careful, or very, very highly skilled, you can dip a cup into the Ocean…and you can fill it with water from a single, pure Stream of Story.'"
Increasingly, the meaningful question may not be whether technology is good or bad, but instead, whether there are substantive differences between the makers and the made.
Jonathan Kochmer (jonathan@amazon.com) is a senior editor at Amazon.com Books, a Web-based bookseller, and author of four Internet books and scientific articles on evolutionary theory, animal behavior, and climate change. Jeff Bezos (jeff@amazon.com) is president of Amazon.
Bart Kosko
I first began to think how technology relates to society when I read and reread Daniel Defoe's Robinson Crusoe (1719) in grade school. The stranded Crusoe is a society of one person for much of the book. He defied the claim of John Donne and Ernest Hemingway that no man is an island. Crusoe was a social island on an ocean island. He fought his way through John Locke's state of nature only to later have to fight the island natives in Thomas Hobbes's war of all against all.
Crusoe used technology to fight these battles for personal and political survival. He used ideas to shape the structure of his world. Crusoe killed goats for their meat and hides. He made weapons from bones and stones. He built a shelter from tree parts and hides. Defoe helped him with the rifle and the shipwrecked barrels of gunpowder that had floated to shore. Crusoe gladly took those gifts from his maker and used them to make himself a better world.
A second book gave me a deeper insight into how technology drives survival and vice versa. I was 18 years old and stood in line at my high school graduation in the small Kansas farm town of Lansing. My physics teacher came up and shook my hand and gave me a book as a graduation gift. He was an Army colonel at nearby Fort Leavenworth. He came to our high school as a type of pro bono effort to teach just one physics class of seven students. The book was Carl Sagan's The Dragons of Eden. It went on to win the Pulitzer Prize in 1978 but was out of step with a Bible Belt farm town with a posted population of a little over 4,000 persons.
Sagan's book showed how the ancient battle between the reptiles and mammals has shaped our brains. Dreaming often involves our reptile-like midbrain. Most of us fear touching a harmless black garter snake more than we fear touching the more dangerous rabbit or raccoon. Brains house their evolutionary history in their present structure and function.
Sagan also made a conjecture that forced me to think about just how much of personal and social behavior stems from genes. You sometimes flinch yourself awake just as you fall asleep. Sagan suggested that this was a reflex that helped keep our hominid ancestors from falling out of their trees at night. The reptiles or "dragons" down below ate those great-grandparents who did not have the reflex or who had it and still fell. This book put me on a path that led me in time to write the text Neural Networks and Fuzzy Systems on the mathematical structure of the brain.
The book also led me in time to test one of its ideas on my newborn daughter just minutes after she emerged from a C-section birth. I held my swaddled daughter as all proud fathers do. Soon I could not resist touching the side of my forefinger to her small bare foot. Sagan was right. Her foot clutched at my finger just as a young monkey's paw would clutch at a tree branch.
Arthur C. Clarke showed where survival and technology could end in his 1956 novel The City and the Stars. A billion years have passed. Earth's mountains have crumbled and its seas have dried up. Humans have been to the stars and come back. They have lived now for eons in a huge city under the control of an omnipotent but benign central computer.
Humans have lost their teeth and nails in this dull steady state. They have lost much of the hominid within them–and with it their thirst for change and adventure. Hope comes only from the rare loner who challenges the system with no chance of changing it.
Bart Kosko is a professor of electrical engineering at the University of Southern California and directs USC's Signal and Image Processing Institute. He is the author of Fuzzy Thinking (Hyperion) and the new Prentice Hall text Fuzzy Engineering. Bantam/Broadway will publish his Heaven in a Chip and Avon will publish his novel Nanotime in the spring of 1997.
Brink Lindsey
The capacity to produce technology–to remake the world around us in the image of our thoughts–is a basic aspect of human nature. It is as old as our first ancestors, who chipped stone axes 2 million years ago: We have named them Homo habilis–"handy man"–because of their toolmaking.
In the past century or so, however, this capacity has achieved an entirely new potency. A sustained, focused, and intricately integrated creative outburst on the part of millions of people has redefined the pace and possibilities of human existence in ways previously only dreamed about. Life dominated by natural rhythms and limits has given way to life mediated and liberated by artifacts.
This transformation, still unfolding, is one of the greatest quantum leaps in human development. But of course technology cannot be disentangled from the rest of human nature, and all its frailties and contradictions. And so the process of building this new, man-made world has been an exceedingly messy one–fluky and unpredictable, often tragic and misunderstood.
In this regard, it is humbling to realize this great release of creative energies was made possible by violence and predation. Yet through history, warfare has done as much as anything to drive technological development. William McNeill's fascinating The Pursuit of Power (1982) tells of the unique contributions made by the intense but inconclusive military competition that roiled medieval and modern Europe. Incessant conflict accelerated the pace of innovation directly, but the indirect effects were even more momentous. By combining domestic order with international anarchy, the fractured political landscape of Europe created spaces within which commerce could develop and flourish. The addition of profit motive to plunder motive gave technological growth an unstoppable momentum.
The resulting Industrial Revolution created unprecedented wealth and opened broad new avenues for creativity and achievement. But its promise was perverted by two grievous misconceptions widely shared among its implementers and champions: first, a belief that the logic of industrial development required ever more centralized control on ever more massive scales; and second, a withering reductionism that rejected as irrelevant or even valueless anything that is not measurable and concretely utilitarian. These two ways of thinking combined in powerfully destructive cultural and political movements that sought to bring mankind down to the level of the clanking, soulless machines it had constructed. These movements–which can be summed up with the terms "technocracy" and "social engineering," and which can be seen as a kind of Industrial Counterrevolution–inflicted deadening bureaucracy at their best, totalitarian horror at their worst.
The misbegotten ideal of technocracy was brilliantly satirized in Yevgeny Zamyatin's We, written in 1920-21, just as the Soviet Union was being established as its purest historical incarnation. The book is set in a dystopia of the far future, where in a city cordoned off from lush, unruly nature by the "Green Wall," humanity has been brought nearly to mechanistic perfection. "Numbers," as people are known in the "One State," live in glass apartments, so that all actions are visible to the ever present but unidentified "Guardians." A "Table of Hours" prescribes every action of every day, down to 50 chews per bite at mealtime, masticated in unison by all the numbers in the One State. (Interestingly, the "ancient" identified as the chief prophet of the One State is not Karl Marx, but Frederick Winslow Taylor, the father of "scientific management" and, incidentally, no small influence on Lenin.)
In Zamyatin's fiction, the parts of human nature suppressed by the One State–the spontaneous and unpredictable, the imaginative, the biological–are not eliminated: They persist outside the Green Wall, and are cherished by dissidents within who plot the tyranny's overthrow. Zamyatin's slender hope has been fulfilled in our generation, as the technocratic ideal has suffered stunning reverses–from the collapse of the Soviet Union to the restructuring of corporate bureaucracies.
Here again, however, things are messy: The abuses of the social engineers have helped to strengthen the case of Luddite reactionaries who despise and fear the man-made world. What has happened, then, is that much of the intellectual critique of technocracy has been of the baby-and-bathwater variety. Both sides have shared a common false premise: that technology is dehumanizing. The technocrats reject humanity; the Luddites reject technology.
Where to go from here? Fortunately, a body of thought that transcends this false dichotomy has been developing in recent years. The new sciences of chaos and complexity focus on the general capacity–in material objects, living creatures, and human institutions–for spontaneous order: complex behavior that is not the product of any central design, and cannot be reduced to the sum of the parts. While there are many excellent books that provide the general reader an overview of this new thinking, let me recommend Louise Young's The Unfinished Universe (1986). Young offers a lyrical vision of the universe's unfolding complexification–from physical order to chemistry, then biology, and then mind and culture.
This new way of understanding the world–non-mechanistic, non-reductionist–reconnects the technological and biological by showing that both are manifestations of spontaneous order. In this way, it can support a new kind of idealism about technology's possibilities. The ongoing creation of a man-made world can be seen, in this view, as part of the larger aspiration of mankind's continuing complexification: the development, through free institutions, of the best of human nature in all its variety and richness.
Contributing Editor Brink Lindsey (102134.2224@compuserve.com) practices trade law in Washington, D.C.
David Link
I begin with "The Dynamo and the Virgin," a chapter from The Education of Henry Adams (1905). In it, Henry Adams visits the Great Exposition of 1900 and, viewing the incredible, towering power sources on display, begins to contemplate the awesomeness of the developing technologies. The only precedent he can think of is the inspiring and miraculous power the Virgin held in the history of art.
Adams was on to something about why technology has been so compelling in the 20th century. Men are every bit as moved by the might of the machines they can build as they are in thrall to the power of sexual beauty. Every little boy knows what it's like to stare wide-eyed at a fire engine or a fighter jet or a bulldozer. This is a large part of what has moved men to make bigger and more majestic bridges, buildings, and then machines. That sheer glory of force, of size and power and muscularity, is a large part of what made the film Top Gun such a hit. Sure, there was a perfunctory love story in the movie, but what it's about, what you can't forget, is the roar of those jets, the fire and the blaring sound of them, and the thrill and fun the pilots have in actually being in charge of all that force.
As our control over technology developed, though, a necessary and inevitable lesson accompanied our progress: humility. The second book on my list is Walter Lord's A Night to Remember (1956), on which the 1958 film was based. The story of the sinking of the Titanic is the stuff of pure myth–if it had not actually happened, someone would have had to make it up. The sinking of the Titanic is most useful as a story about hubris. Our technology is the manifestation of our dreams, and nothing in the story suggests we should stop trying to make our marvelous, outsized visions real. On the other hand, we ought to keep in mind that while we are godlike, we are not gods. Glorious luxury liners are well worth the effort and the cost, but that's no reason not to have enough lifeboats on board.
My third choice is The Dancing Wu Li Masters (1979), by Gary Zukav. The goal of the book was to make the obscure field of quantum mechanics comprehensible to non-scientists, and, surprisingly, it succeeds in illuminating what the attraction of all that heady stuff is. Like Adams, these men (and they're pretty much all men) experience real awe, this time around, not at size and force, but at the enigmatic elegance of the physical world. The Dancing Wu Li Masters helps explain why modern scientists are driven to explore particles so infinitesimally small that they are (sometimes quite literally) nothing but ideas.
This move from Adams to Zukav is summed up in Stanley Kubrick's 2001: A Space Odyssey. The film begins with man's fascination with the first tool; in one brilliant edit it moves dazzlingly to the end of our century, where the idea and execution of tools have been all but perfected. What is left is the realm of pure mystery. Which, in a way, brings everything back to where Adams found it in 1900–the point of wonder.
David Link (dflink@pacbell.net) is a Los Angeles writer whose essays appear in Beyond Queer: Challenging Gay Left Orthodoxy (The Free Press), edited by Bruce Bawer.
Paul Lukas
Most of us tend to think of technology in terms of the macro rather than the micro. I refer here not to sheer physical size–surely Silicon Valley has taught us that the mightiest technological achievements can come in the tiniest of processing chips. I refer instead to our notions of technological complexity and, especially, technological power, whether measured in horsepower, kilowatts, megatons, or gigabytes. For the most part, our cultural mindset goes, bigger is better and biggest is best.
But technological power, not to mention technological utility, comes in many sizes. For every supercomputer, digital camera, and electronics laboratory, there's a host of products that may seem far more mundane yet are no less remarkable: office supplies, kitchen gadgets, canned goods, children's toys. Don't mistake these items' ubiquity for technological simplicity. Our tendency to take them for granted is in fact a testament to an immensely sophisticated production system, a system so vast and efficient that it can provide these things without most of us even stopping to ponder how it all gets accomplished. The mechanical engineering and industrial design processes that produce these items may be less glamorous than, say, software design, but they're no less important or impressive. Think of them collectively as inconspicuous technology.
There are a number of books that focus on inconspicuous technology. One of the best is Henry Petroski's The Evolution of Useful Things (1992), a loving examination of such minor miracles as paper clips, tin cans, pins and needles, nuts and bolts, silverware, adhesive tape, pull tabs, and the like. You don't need to be a minutiae fetishist to appreciate Petroski's examination of the subtler aspects of our everyday world, and you don't need to be an engineer to understand his prose. (The book also devotes several pages to a discussion of the zipper, which is itself the subject of an excellent book: Robert Friedel's Zipper: An Exploration in Novelty [1994]).
Packaging is another key component of inconspicuous technology. Package-design elements like the Coca-Cola wave and the Wrigley's arrow have become subsumed into our collective cultural psyche. For a look at the history of packaging, including an examination of the formidable technological challenges packaging can present and the additional technological innovations it can facilitate, try Thomas Hine's The Total Package (1995), which examines everything from milk cartons to cereal boxes and will forever change your perceptions of your local supermarket.
These books are wonderful and instructive, but they're also a bit scholarly; the best way to appreciate inconspicuous technology is in the context of the real world. Curiously, the best example of this is a work of fiction: Nicholson Baker's The Mezzanine (1988), in which the narrator enthusiastically pursues a series of obsessive intellectual tangents on such unlikely subjects as soda straws, men's-room urinals, shoelaces, doorknobs, tear-off perforations, cigarette butts, and vending machines, all in the course of a short escalator ride. To fully appreciate the wonders of the inconspicuous world, in theory and practice, start here.
Paul Lukas (krazykat@pipeline.com) is a columnist for New York magazine, the editor of Beer Frame: The Journal of Inconspicuous Consumption, and the author of the forthcoming Inconspicuous Consumption: An Obsessive Look at the Stuff We Take for Granted, to be published by Crown in January.
Henry Petroski
To understand technology fully, it is necessary to understand the nature of engineering. The formulation and solution of technical engineering problems is, of course, at the heart of every technological endeavor, whether it be the design and production of an automobile or the generation and distribution of electricity, but dealing with technical problems within the constraints of the laws of nature is only one aspect of the total engineering enterprise. Real engineering in the real world is inextricably complicated by cultural, social, political, economic, and aesthetic goals that shape and in turn are shaped by the technical objectives.
The full story of just about any ostensibly technical project will illustrate the complex interrelationships among competing goals and constraints that engineers must deal with in the course of producing a technological artifact. Among the best of such stories is that of the building of the Brooklyn Bridge, told by David McCullough in his 1972 book, The Great Bridge. The true story of the conception and realization of a bridge that has become a cultural treasure is also the very human story of how John Roebling, his son, Washington Roebling, and his wife, Emily Warren Roebling, dealt with accidents and death, not to mention political corruption and greed, along with the physical and technical challenges of constructing the largest bridge in the world. It is as gripping as any novel.
A more recent book, The Innovators (1996), by David P. Billington, tells the stories of engineering pioneers who shaped modern technology and thereby made America modern. Rather than weaving an extended narrative about a single artifact, however, Billington uses famous technological achievements such as the steam engine, the distribution of electricity, the telegraph, and steel making to show how even the most technical aspects of engineering–its equations and formulas–are influenced by such factors as social, economic, and aesthetic considerations. By explaining in simple terms how technical decisions must incorporate a wide range of seemingly nontechnical considerations, Billington's book shows more explicitly than any other the true nature of engineering as a social and cultural, as well as a technical, endeavor.
Engineering is done by engineers, of course, and the 1976 book by Samuel C. Florman, The Existential Pleasures of Engineering, has become a classic for understanding the passion and enthusiasm individual engineers can feel for their work and the satisfaction they can experience as they make tangible contributions to society and culture. Florman's book has recently been reissued in a second edition (1994), which includes a new preface and chapters from some of his other books, and it is considered by many to be required reading for those wishing a full understanding of the engineer and engineering in modern society.
Henry Petroski (hp@egr.duke.edu) is A. S. Vesic Professor of Civil Engineering and professor of history at Duke University. His most recent books are Engineers of Dreams: Great Bridge Builders and the Spanning of America (Alfred A. Knopf) and Invention by Design: How Engineers Get from Thought to Thing (Harvard University Press).
John J. Pitney Jr.
Foundation (1951), by Isaac Asimov: In this classic of science fiction, a "psychohistorian" uses computers and advanced mathematics to predict the decline and fall of the Galactic Empire, and launches a plan to shorten the coming dark ages. In writing this novel, Asimov assumed that human nature would remain flawed but that technology would evolve to the point of producing accurate forecasts of events centuries in the future. The latter assumption is highly debatable, for as Hayek wrote, the "mind can never foresee its own advance." Asimov was on firmer ground when he included a parable about technological literacy. As the Empire declines, the planet Terminus gains power because its inhabitants are the only ones who still understand nuclear energy. In the surrounding kingdoms, people remember the operating instructions for their nuclear-powered machines, but have long forgotten how the things actually work. Terminus exploits this ignorance by creating a mystical cult around the technology, which allows it to manipulate the kingdoms with spells and sorcery. Remember this part of the novel the next time you read about the state of science education in the public schools.
The Ascent of Man (1973), by Jacob Bronowski: This companion volume to the excellent television series of the 1970s is not your typical coffee-table book. Though highly readable and handsomely illustrated, it also has a strong philosophical viewpoint. The title itself is revealing: The late Bronowski believed that the growth of scientific and technological knowledge did indeed constitute an "ascent." His book celebrates discovery and invention while bemoaning the "loss of nerve" represented by the all-too-frequent mysticism of pop culture. And it powerfully dismisses the widespread notion that technology will turn people into numbers. At one point, we see a photo of Bronowski squatting in a muddy field, and the accompanying text explains: "This is the concentration camp and crematorium at Auschwitz. This is where people were turned into numbers. Into this pond were flushed the ashes of four million people. And that was not done by gas. It was done by arrogance. It was done by dogma. It was done by ignorance."
See How They Ran: The Changing Role of the Presidential Candidate (1991), by Gil Troy: This volume may seem an odd choice in this context, but it is filled with wise observations about the ways in which technology has changed (and has failed to change) the relationship between voters and candidates. During the 1850s, for instance, rail travel and telegraphy allowed candidates to reach more people than ever before, enabling a hitherto obscure Illinois lawyer named Abraham Lincoln to become a national figure. Since then, technology has continued to expand the potential for public conversation on campaign issues. But the realization of that potential hinges on the substantive content of the campaigns, which is something that technology itself cannot determine.
Contributing Editor John J. Pitney Jr. (jpitney@mckenna.edu) is associate professor of government at Claremont McKenna College.
Virginia I. Postrel
One of the pleasures of editing a magazine is bringing together fine, insightful writers whose work enriches and extends your own ideas. On the subject of our relation with technology, then, it's no surprise that my touchstones include Walt Anderson's Evolution Isn't What It Used to Be (1996), Henry Petroski's The Evolution of Useful Things (1992), and Fred Turner's Tempest, Flute, and Oz (1991)–very different books, all important and well written, all ably represented by the presence of their authors in these pages. So, invoking another editor's pleasure, I can pick three other books.
Few writers better explore the tensions among technological innovation, scientific evolution, and politics than physicist Freeman Dyson. And few writers more appreciate the ecologies of human societies, with their complex dynamics, odd feedback effects, and occasional paradoxes. In From Eros to Gaia (1992), Dyson collects some of his finest essays, including the priceless "Six Cautionary Tales for Scientists," originally written in 1988. It contrasts the small-scale, unglamorous, and effective Plan A approach to technical problems with the politically successful Plan B–and finds the same patterns in the First, Second, and Third Worlds. The essay should be required reading for all science policy makers.
"Technology" means not only gadgets or computers but any embodied knowledge, including business systems. Both sorts of technology combine in Joseph Nocera's A Piece of the Action: How the Middle Class Joined the Money Class (1994), which tells the story of how bank credit cards and money market funds made credit and investment mass, middle-class products. Since these instruments work only because they are, in Paul Lukas's phrase, "inconspicuous technologies," we never appreciate the absolute genius, pigheaded determination, computer power, and sometimes disastrous experiments that building such systems required. Nocera is a terrific storyteller, and he has a fascinating story to tell. Plus his chapter on the Age of Inflation is itself a remarkable piece of technology: It so captures the utter, spiraling financial panic that hit middle-class America in the late 1970s that it will give you flashbacks. Good reading, too, for GenXers who think economic insecurity began with them.
Following my own instructions, I end with a novel or, rather, a series of novels. It's been 25 years since I read Laura Ingalls Wilder's Little House books, but they left some indelible technological images: the joys of a father's fiddle playing, the advantages of blindness (you can sew when it gets dark), the preservation and uses of a slaughtered hog, the economic triumph represented by glass window panes. Ghostwritten by Wilder's daughter, the important libertarian intellectual Rose Wilder Lane, the Little House books remind us of both the value of old technologies in their time and the wonders of newer ones in ours.
Virginia I. Postrel (VPostrel@aol.com) is the editor of REASON and a columnist for the technology magazine Forbes ASAP. She is writing a book, called The Future and Its Enemies, on clashing ideas about social and economic evolution, which will be published by The Free Press.
Adam Clayton Powell III
The future is always with us, in some form or outline, so wherever technology is taking us, the clues are already here. Each of these books informs us in its own way to look for hints of the future in what is already around us.
To a man with a hammer, goes the expression, the world appears a nail. And so to Andrew Grove–the president of Intel, the world's most successful producer of microprocessors–the world is a universe of hardware and software. Yet precisely because it remains so narrow, Grove's Only the Paranoid Survive (1996) opens exactly the right aperture for a glimpse of the future.
Computer makers are in territory so new and changing so rapidly that they (and we) must plan for a largely unknown and unknowable future, making a path across a trackless land: "It's about finding your way through uncharted territories." Grove insists this unknown territory is suffused with the thrill of the possible–and, too, with the fear that often accompanies the unknown. But he argues that fear can be a force for positive change: "Ideally, the fear of a new environment sneaking up on us should keep us on our toes." (Now we know what to make of the book's title.) Grove offers the analogy of a fire department: Firefighters do not know exactly where the next fire will be, but they can guess a rough total and the historical distribution pattern, so they can plan accordingly.
Grove also notes the fundamental generation gap now forming between most of us industrial-age, analog-era adults and the post-industrial, under-25 digerati who have spent most of their lives learning about the world, playing games, working on homework, and socializing with each other online. Just as their grandparents abandoned afternoon newspapers for television and their parents found generational refuge in FM radio, audience data now show that the computer generation born since 1970 is abandoning print and electronic mass media to embrace the more tailored, more fragmented, computer-based media. This gap has profound implications for public discourse; Grove calls it "a demographic time bomb ticking away."
And so it is with Grove's vision of future media, which may be science fiction to the industrial agers but to the under-25s is just another upgrade–and just a few years away: "Processing power is going to be practically free and practically infinite," said Grove in a September Newsweek interview. "This will allow us to turn automatic 3-D photorealistic animation into a ubiquitous reality within two or three years."
Kevin Kelly, executive editor of Wired magazine, takes Grove's vision of the future of computers and extrapolates that vision to describe the future of society, with a fundamentally different social contract. He embraces the chaos of ultimate decentralization and individual power, writing a book that in its very title celebrates the complete absence of authority imposed by any central place or person. "There is no central keeper of knowledge in [the Internet], only curators of particular views," writes Kelly in Out of Control (1994). So he argues we are moving to a "highly connected yet deeply fragmented society" where "distributed, headless, emergent wholeness becomes the social ideal."
And if that leaves us out on the edge, it may be a challenge for Tom Stoppard's puzzle of a play, Arcadia (1993), to take us even further. But take us further it does, in a work that has nothing to do with computers or microchips but everything to do with research, society, the past, and the future–and also manages to encompass Lord Byron, quantum physics, and trends in British garden design in the late 18th century.
Stoppard presents alternating scenes of today's researchers and the subjects of their research in 1809, and we can see how painstaking care and rigorous procedure lead our present-day characters to exquisitely erroneous conclusions about the past. Amid this lesson in humility is one exclamation that serves as a signpost, an emblem for our sea change to the brave new digital world. The character Valentine, a twentysomething mathematician, is entranced by the possibilities of chaos theory and quantum mechanics, as excited by the fall of the old order as are his real-life twentysomething digerati counterparts.
"The future is disorder," says Valentine. "A door like this has been cracked open five or six times since we got up on our hind legs. It's the best possible time to be alive, when everything you thought you knew is wrong!"
Adam Clayton Powell III (apowell@freedomforum.org) is vice president, technology programs, at The Freedom Forum.
John Tierney
For critics of technology, The State of Humanity is a terribly depressing read. The 1995 book, edited by the economist Julian Simon, is a collection of essays and graphs analyzing human welfare over the past few millennia. The trends are just about all positive: Humans are enjoying longer, healthier, wealthier, and freer lives in a world with less pollution and more plentiful resources. The reason for all this good news, of course, is technology. Doomsayers claim to be taking the long view of technology and its problems, but how can they ignore evidence like this?
The answer to that question can be found in William Tucker's book, Progress and Privilege: America in the Age of Environmentalism. This 1982 work (Tucker had the misfortune of being too far ahead of his time to sell many books) anticipates the neo-Luddites and environmentalists of the 1990s. As Tucker shows, today's critics of technology–including the ones who think they're progressives fighting to help the poor–are part of an elitist, conservative tradition. Aristocrats, clerics, and intellectuals have always come up with creative reasons for opposing any technological change that threatens their comfortable status or interferes with their determination to write the rules for everyone else. Tucker offers lovely analyses of our current environmental "crises." Consider his explanation for the angst over population growth in the Third World: "There have been two lines of demagogic argument that have always gone down well in history. The first is to tell the poor that the rich have too much money. The second is to tell the rich that there are too many poor people."
If these books haven't persuaded you to ignore the Luddites and Malthusians, try a work of inadvertent fiction by an ornithologist named William Vogt: Road to Survival. It preaches against technological innovators–"the freebooting, rugged individualist" must be recognized as "the Enemy of the People"–and warns of pending famine and poverty due to pollution, overpopulation, vanishing topsoil, depleted stores of natural resources, and various environmental catastrophes. It reads exactly like a tract from Bill McKibben or the Worldwatch Institute. But this book was published in 1948–and the doomsday predictions were being made about the United States. Reading it will make you laugh, restore your faith in modern technology, and free you from any temptation to take the doomsters seriously.
John Tierney is a staff writer for The New York Times Magazine.
Frederick Turner
In some ways the most interesting book on the relationship between humans and our technology that I have read recently is William R. Jordan III's remarkable work The Sunflower Forest, but it is as yet unpublished (smart publishers, take note). Jordan is a curator at the University of Wisconsin–Madison Arboretum, and his book is a searching analysis of the way in which our Cartesian and Puritan heritage has led us to divide humans from the rest of nature and thus, finally, to the view espoused by the likes of Bill McKibben, in which nature is by definition what is untouched by human hand, and humanity is a blight and cancer upon nature. Jordan presents an alternative view, in which the human appetite for ritual, myth, play, and gardening can actually improve the richness of the earth's living ecosystem, with beneficial economic results and without governmental invasion. We shall trade with the rest of nature, he believes, to our mutual benefit, in what can be thought of as an extension of biological evolution.
But since the book is unpublished, I cannot make it one of my three. A science fiction work, Neal Stephenson's The Diamond Age (1995), set in industrial China a hundred years hence, is a marvelous fountain of ideas about the human technological future. It is, like some recent works of Greg Bear and Gregory Benford, a convincing vision of a nanotechnological age (nano-engineered diamond has become the most popular building material in Stephenson's future). What is especially provocative about it is that Stephenson, despite his countercultural roots, has perceived, correctly in my view, that the masters and mistresses of a future technological age will be people with solid family backgrounds, personal honor, and discipline, the new Victorians or "Vickies" as he calls them. As the subtitle of his book, A Young Lady's Illustrated Primer, suggests, however, a sufficiently sophisticated and artistic piece of educational software, based on storytelling, might serve as a substitute family. But the point is that chemical technology is more potent than our present physics-based technology, biotechnology more potent than chemical technology, and nanotechnology is the most potent of all, exerting the greatest leverage upon the physical world.
My second bona fide book is Homer's Odyssey. This poem is among other things a good introduction to the brilliant simplicity of ancient Greek technology, and an exemplary demonstration of what constitutes user-friendliness in both software and hardware. Homer shows us how a bow, a door, a garden, a ship, a narrative verse tradition, an olive grove, an adze, a polytheism, a viticulture can work together to the enrichment of the world.
My last book is Martyn Fogg's Terraforming (1995), the current bible of those who believe we must begin to become a spacefaring civilization. He makes a compelling argument that we can colonize Mars using contemporary technology, and that we should do so. Reading Fogg, who is the president of the British Interplanetary Society, one feels a blush of shame at the pusillanimity and malingering of our times, the most prosperous in history, and of our generation, which would rather bicker over nasty little social prizes and slights than take up once again the heroic mantle of our destiny.
Frederick Turner is a poet, a theorist of the links between the sciences and the humanities, and Founders Professor of the Arts and Humanities at the University of Texas at Dallas. His most recent books include April Wind (University Press of Virginia) and The Culture of Hope (The Free Press).
When Hillary Clinton published It Takes a Village, she failed to give credit to Barbara Feinman, the ghostwriter who did much of the work on the book. This omission created bad publicity, and according to press reports, it also prompted Feinman to strike back by telling Bob Woodward about Mrs. Clinton's "seances" with Eleanor Roosevelt. Apparently, President Clinton does not want a similar problem. ("COL. SANDERS ASKS PREZ: DO YOU WANT FRIES WITH THAT?") At the end of Between Hope and History, he acknowledges policy consultant William E. Nothdurft, "who was primarily responsible for helping to draft this book."
Poor Mr. Nothdurft. In order to assemble the book, he first had to read the president's speeches, which are cholesterol for the eyes. Then he cut, pasted, and edited various passages into a first draft, which the president purportedly reworked. For a respected author such as Nothdurft, the process must have been a burden: It wasn't writing; it was cooking with a crock pot.
Thanks to the word-search capability of the White House site on the World Wide Web, it is easy to tell exactly where many of the ingredients come from. The opening passages, which celebrate how the American spirit overcame the Olympic Park bombing in Atlanta, derive from an August 5 speech in Washington, D.C. The proposal for "America's Hope Scholarships" appeared in the president's June 4 commencement address at Princeton. His praise for the job-saving efforts of Harman International (a firm headed by a Democratic lawmaker's husband) made its debut in California on March 8.
In this manner, you can probably track the origins of nearly every paragraph in this book. But if you actually go ahead and do so, that's a sign that you have way too much time on your hands.
Between Hope and History is the longest 178-page book in the history of publishing. Its three main sections–Opportunity, Responsibility, and Community–comprise page after tedious page of familiar policy prescriptions, such as preserving AmeriCorps and extending gun control. If you want the gist of President Clinton's message, look instead to the 1996 Democratic platform, which is much more concise and a lot cheaper. (Web surfers can access it for free at www.democrats.org.)
Still, the book holds some interest for political buffs. Through sheer repetition, it highlights a technique that defines the president's rhetorical style: firmly endorsing limited government or free markets, then adding a qualifying clause that turns the original statement into vapor.
Through these comments, the president inadvertently reveals his secret super power: the ability to transform himself from New to Old Democrat just by uttering a one-syllable conjunction. Since he wants the television networks to air more children's programming, perhaps he could make a contribution by starring in his own series: The Adventures of "But-Man" (and his Trusty Sidekick, "However").
The morphing of the president doesn't end there. At times, his prose bears a striking resemblance to that of Newt Gingrich.
In December 1994, Gingrich told House Republicans to read the Declaration of Independence, with special attention to the phrase "pursuit of happiness." He said: "Each individual has to be engaged in the pursuit–not guaranteed the finding." In Between Hope and History, Clinton says: "America promises the opportunity to pursue happiness but does not guarantee it."
In To Renew America, Gingrich wrote: "Our rights are pale shadows of our responsibilities. When the Founding Fathers pledged their lives, their fortunes and their sacred honor they meant it literally." The president says that America is about interdependence as well as independence: "And it was to both that our Founders in the Declaration of Independence said they would 'mutually pledge our lives, our fortunes and our sacred honor.'"
The speaker in To Renew America: "Why are governments so painfully slow at adapting to change? Governments almost always grant monopoly status to their own operations so they won't have to compete." The president in Between Hope and History: "Many of today's [government] operations still run like monopolistic, big industrial corporations. Government, however, was slow to recognize this and even slower to respond."
Is Clinton cribbing from Gingrich, or is it the other way around? I think it's the former, since Gingrich was already using such themes and language in his 1984 book, Window of Opportunity. Gingrich helped shape the contemporary Republican vocabulary, so if President Clinton's strategy is to sound like a Republican (sometimes), it's natural that he will sound like Gingrich.
Just as revealing as President Clinton's identity transformations are his rather unusual readings of history. In a rehash of a 1994 speech at UCLA, he says that Progressives such as Woodrow Wilson sought to mend the frayed fabric of family and community. At the end of World War I, he notes sadly, the country lost its way. "America withdrew from the world, seeking security in isolationism and protectionism. We withdrew here at home, too, into the trenches of racial prejudice and bigotry and away from the protection of our citizens and economic institutions. Ten years later, in 1929, that decade of neglect produced the Great Depression."
The president's chronology of protectionism is slightly defective: Smoot-Hawley passed in 1930. And leaving unanswered the question of how racial prejudice helped trigger the Depression, he also implies that this prejudice represented a backsliding from the Progressive Era. This is piffle. As C. Vann Woodward explained in The Strange Career of Jim Crow, national Progressives had a blind spot for blacks. "In fact," Woodward said, "the typical progressive reformer rode to power in the South on a disenfranchising or white-supremacy movement." The Klan gained enormous power during the administration of Woodrow Wilson–himself a 200-proof racist–but by the end of the 1920s, it was actually declining.
The president is rewriting history for his own purposes. He sees evil Republicanism as a cyclical phenomenon, so he adapts a faulty analysis of the 1980s ("decade of neglect") to the period of GOP ascendancy in the 1920s. He identifies himself with Woodrow Wilson and the Progressives, so he overlooks their sins, imputing them instead to the wicked Republicans. Curiously, the book's acknowledgments do not list Oliver Stone as an historical adviser.
A more benign whopper comes when Clinton quotes Thomas Jefferson as saying that democracy depended on the "yeoman farmer." So far, so good. Then he claims that "although Jefferson was a farmer himself, he wasn't making a pitch for agriculture." Today's "yeoman farmers," he says, are American families, whether they live on the farm or in the cities. That's a very inspiring image, Mr. President, but it's not what Jefferson meant. Calling farmers "the chosen people of God," Jefferson said that urban occupations corrupted morals: "The mobs of great cities add just so much to the support of pure government, as sores do to the strength of the human body." Clinton probably would not want to quote that line while campaigning in New York or Los Angeles. And in the industrial Midwest, he should probably forget about Jefferson's comments on manufacturing: "[L]et our work-shops remain in Europe. It is better to carry provisions and materials to workmen there, than bring them to the provisions and materials, and with them their manners and principles."
Clinton is no more accurate about his own history than he is about the country's. When he was growing up, he says, the only unpaved streets in Hope, Arkansas, were in the black neighborhood near his grandfather's store. He uses this observation to bring a personal dimension to his pitch for affirmative action–but there's just this one small problem. "I don't know what the hell he's talking about," one of the president's distant relatives told the Arkansas Democrat-Gazette when he first spun this yarn in 1995. "These streets were paved back then, sure to God. I know it was in '60–more than 30 years ago."
Between Hope and History has some significant omissions. There's no mention of the need to prevent the political abuse of confidential FBI files. Why not? He says that he has supported "expanding school choice and charter schools." Why does he forget to add that he only means choice among government-run schools? Why doesn't he discuss the fine private schools that he and the vice president have chosen for their own children?
These are serious questions, but Between Hope and History is not a serious work. It is a recycling bin with hard covers.
Meanwhile, the Republican majority in the House is the slimmest of any party in nearly 40 years. And even if only the editors of REASON and their immediate families composed the majority in both the House and Senate, there's still the small matter of Bill Clinton, the 43 percent solution left over from 1992. By vetoing anything and everything that cuts government, he effectively commands two-thirds of the votes in both chambers.
While America has no party willing to call itself socialist, we do have an admitted socialist in the House (Bernie Sanders of Vermont) who votes with the Democratic Party 97 percent of the time. This telling fact reveals the Democrats to be America's closet socialists. Unfortunately for them, as the 20th century draws to a close their ideology has suffered global defeat. Its leading state practitioners are dead or in denial: The Soviet Union collapsed under socialism's weight, while communist China pretends to the world that it is a free country in order to avoid defending the indefensible. Save for America's universities and the likes of Fidel Castro and Gennady Zyuganov, there is no one left to stand up for socialism.
Hence there is no longer a future for the party that has controlled our government in Washington by hook or by crook for my entire lifetime. To survive this long, it has had to employ professional bait-and-switch artists like Bill Clinton. It can't last. It might not even survive the next election.
Christopher Cox is a member of Congress from California's 47th District and chairman of the House Republican Policy Committee.
Ed Crane
There are, of course, reasons for the systemic growth of government. The tyranny of the status quo, concentrated benefits and diffuse costs, "public choice" self-interest on the part of politicians and bureaucrats, all lead to a government growth imperative. Yet Americans clearly desire less government–much less. The single strongest piece of evidence for that proposition is that 80 percent of them support term limits. It's hard to believe that Americans overwhelmingly prefer a citizen legislature over one populated with professional politicians because they believe it will bring them more government.
And term limits are one of two essentially process-oriented changes that I believe can radically change the culture inside the Beltway and lead to a dramatic downsizing of the federal leviathan. The other is the elimination of campaign contribution limits.
Term limits are key because they will fundamentally change the way Washington works. That's why a Gallup poll showed that a majority of congressional aides, corporate lobbyists, and federal bureaucrats oppose term limits. The most common argument for term limits is Lord Acton's dictum that power tends to corrupt. True. The less time a congressman spends in Washington, the better. A more powerful argument is that term limits solve an adverse selection problem: Simply put, the wrong people tend to seek office in the first place.
Today, a potential citizen legislator sees few open seats, knows that odds are 10-1 he'll lose against an incumbent, and that even if he did win he'd have to serve a long time to have much influence. Real term limits–six years in the House–would change that dynamic. A culture that accepted people serving for just one term would likely develop, allowing Congress to revisit the mountain of bad laws that are currently protected by careerists. The common sense of a citizen legislature would give us Medical Savings Accounts, privatized Social Security, a repeal of the income tax, and much more.
Finally, for citizen participation to flourish in politics and for the two parties to feel some outside competition, we should eliminate contribution limits to federal campaigns, have full disclosure, and create an open, dynamic political system. Such a system will challenge incumbents and allow individuals who are not career politicians or professional "activists" to effectively make their case to the American people.
Under the current system we're spending less than $3.00 per eligible voter per election cycle on congressional races. That's not enough, given the huge impact Congress currently has on our lives. Unlimited contributions will also take away the artificial bias the Federal Election Campaign Act has created in favor of the media. Why should Katherine Graham, Garry Trudeau, or Rush Limbaugh give the equivalent of millions of dollars in support of their candidate or cause when we're limited to $1,000? If the answer is the First Amendment, well, it's meant for all of us, not just the media.
In addition, the renewed interest in the enumerated powers doctrine of the Constitution (including, of course, the 10th Amendment) will facilitate fundamental downsizing. And if that's not enough, the culture in cyberspace is quite libertarian. Millions of Americans are learning how to conduct business and solve problems over the Internet, making the federal government less and less important to America in the 21st century.
Ed Crane is president of the Cato Institute.
Steven Hayward
Can Washington change?!? Is this an invitation to become a humorist? Readers may well find laughable the optimistic view that Washington can, and indeed is likely to, change over the next few years. For the better. No, really.
The hand-wringing about the collapse of the 104th Congress has blinded a lot of very bright people, including the congressional leaders of the purported "revolution," to some fundamental lessons of politics. The principal defect of this Congress is that it has believed its own press clippings–first that it was the most noble and revolutionary body since the Continental Congress, and now that it is the worst legislative body since Cromwell's Long Parliament. It is of course neither thing, though Cromwell's famous words–"You have sat too long here for any good you have been doing"–may be appropriate.
The second unremarked lesson of the present moment is that you can't change the government unless you control it. To adapt that great Aristotelian phrase for yet another purpose, controlling Congress is the necessary but not sufficient condition for changing government. Here our historian/House speaker has badly misread the lesson of history (the peril of studying European instead of American history, I suppose). While Gingrich likes to think of the present moment as the successor to the New Deal realignment of the 1930s, this Congress didn't pay much attention to its closest historical analogue, the Congress elected in 1930. In the off-year election of 1930, Democrats captured the House and ended a long era of Republican rule. But the New Deal revolution couldn't really begin until President Hoover was replaced and more Democrats were added to Congress. Instead of finishing off Hoover by passing legislation in 1931 and 1932, the Congress did nothing, which was even worse for Hoover. The 104th Congress thought it would finish off Clinton by legislating. We know how that story has played out.
The third lesson is that major change is only possible when there are big partisan majorities. Most significant expansions of government–Medicare is the best example–came at moments of swollen partisan majorities. Shrinking government will also require large majorities. The Republican majority at the moment is very thin (in the Senate it is arguable whether there is a majority at all). A disastrous second term from Clinton–a likely possibility–might set the stage for a large Republican majority down the road.
But, a cynic will rightly respond at this point, even a large and relatively principled Republican majority is unlikely to be able to deliver real change in the face of the "iron triangle" reform-thwarting machine. Ordinarily I make this argument myself. In the fullness of time, however, historical trends may come to overwhelm particular obstacles to change. Washington didn't change into what it is today spontaneously, the way an acorn becomes an oak. Washington grew for decades amid a general climate of opinion that "market failure" required the remedy of expanded government. Today, opinion is crystallizing that "government failure" requires the remedy of smaller government and market solutions. This trend is unlikely to be reversed, as the contradictions and incompetence of government will only become more obvious as time passes.
This doesn't mean that the status quo will lay down its arms and surrender, or that public opinion will always be clear-headed about the issues involved. To the contrary, each battle will require brutal, hand-to-hand combat. Many battles will be lost along the way (did I hear someone say "telecom act" or "minimum wage"?). The general conditions of the battlefield, however, favor our side. We should box on.
Contributing Editor Steven Hayward is vice president, research, for the Pacific Research Institute.
Susanne Lohmann
Modern democracies are trapped in a collective dilemma. In many cases, government policies are biased in favor of special interests at the expense of the general public. Pork-barrel politics can take a monetary form, as with subsidies or tax breaks, but it may also consist of market interventions, as with regulatory policy or trade policy. Such policies are often inefficient in the sense that the total losses imposed on the majority exceed the total gains enjoyed by the minority. While special interests form a minority on any one policy issue, just about every citizen is a member–whether active or not–of some special interest group on some policy issue; indeed, many citizens are affiliated with more than one special interest group. Each citizen prefers being favored by government policy even at the expense of inefficiencies imposed on the society at large. But, relative to the status quo involving inefficient government policies on all dimensions, each citizen would arguably be better off if government did not cater to special interests at all.
Voters clamor for government to streamline and reduce the scope of its redistributive activities. But any serious attempt to implement the expressed wish of the general public requires cutting the pet programs of special interests; and any politician who would do so can count on being dead on arrival at the polls. Yet in the aggregate special interests are the general public. Because voters are willing to punish government for doing what would appear to be good for them, it is hard to avoid the conclusion that special interest gridlock is due to some kind of voter illusion.
The persistence of this collective dilemma is puzzling. Political philosophers have threatened us with a tyranny of the majority–but we suffer under a tyranny of many minorities that sum up to The People. Still, it would seem that politicians should have incentives to act efficiently. Surely, if economists are correct in identifying the huge social dead-weight losses associated with political handouts to special interests, then an incumbent who eradicated such inefficiencies would preside over an increase in standards of living that would sweep him to re-election. If he is too short-sighted to follow this prescription, then it would seem that political competition should allow society to break free from special interest gridlock. Surely a political entrepreneur who stood for election offering to eliminate all the perks enjoyed by special interests would be elected unanimously. If not, why not?
Here's the problem: Special interests are better informed than the general public. Each voter knows how well off she is. She also knows that her wealth and well-being are affected by various government policies, but as a member of the general public she experiences policy consequences only indirectly, as mediated by complex social and market forces. As a special interest, however, she is very well informed about government policy.
Now suppose the government implements a policy that amounts to taking away $1.00 from the general public and giving 90 cents to special interests, with 10 cents standing for the social dead-weight losses of redistribution. As a result, the incumbent's electoral support increases. Why? The general public notices that it is worse off by $1.00, but it is unsure whether to blame the government or other forces. The special interest notices that it is better off by 90 cents, and it can assign the political responsibility for its good fortunes very precisely. At the margin, the government loses some electoral support among the general public, but this loss is outweighed by votes gained among special interests. Thus, we end up with a society in which each voter gets 90 cents on the dimensions of public policy where she is a special interest and pays $1.00 on all policy dimensions where she is part of the general public.
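To put the arithmetic in one place, here is a minimal sketch in Python. The 90-cent and $1.00 figures are the article's; the bookkeeping assumptions are mine, for illustration only: every voter belongs to exactly one interest group, and each policy dimension charges each member of the general public the full $1.00 while paying each of its insiders 90 cents.

    # Minimal sketch of the 90-cents-on-the-dollar model described above.
    # Assumptions (mine, for illustration; not Lohmann's exact model):
    # every voter belongs to exactly one interest group; each policy
    # dimension pays each insider $0.90 and costs each outsider $1.00.

    def net_payoff_per_voter(num_dimensions: int) -> float:
        """Net dollars per voter: +0.90 on her own issue, -1.00 on each other one."""
        gain_as_insider = 0.90
        loss_as_outsider = 1.00 * (num_dimensions - 1)
        return gain_as_insider - loss_as_outsider

    for d in (1, 2, 5, 10):
        print(f"{d:2d} policy dimensions -> each voter nets ${net_payoff_per_voter(d):+.2f}")

Past a single dimension, every voter is a net loser, yet each program survives: the 90-cent gain is visible and attributable, while the $1.00 losses are diffuse.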
Now we can see why political competition is impotent. The general public cannot effectively monitor whether a political candidate who promises to eliminate the policy bias toward special interests will keep his promise. Once elected, he shares the electoral incentives of his predecessors to favor special interests who are better able to monitor his performance.
Are there any solutions to the problem of special interest gridlock? Proposals for reform are often directed at real or perceived deficiencies of the political process, political institutions, or political elites. It is useful to note, however, that pork-barrel politics occurs across the world in countries with different political systems and political cultures. It makes little sense to explain special interest handouts with, say, a corrupt Congress responding to campaign contributions when similar patterns of government policy are observed in countries where monetary contributions to political candidates play an insignificant role, in some cases because campaigns are publicly financed.
Instead, I propose that the problem lies with The People themselves. A proposal for reform that does not come to grips with the informational asymmetry between the general public and special interests is unlikely to succeed. Overcoming this asymmetry is not easy, and perhaps impossible. We are not talking about simply producing a 10-second TV soundbite–"The government is taking away $1.00 from you and giving 90 cents to special interests."
We have representative democracy in the first place because government policy is complex. The People, who do not have incentives to become informed about complex policies that marginally affect their lives, delegate political decision making to elected representatives. On any given policy issue, there are some people–special interests–who have a greater stake in the outcome, and they have incentives to become informed and monitor the government's performance. In response, the government has incentives to favor special interests at the expense of the general public–not to speak of the inefficiencies created by such redistributive policies. Special interest gridlock is an integral part of representative democracy. It is here to stay.
Susanne Lohmann is an associate professor of political science at UCLA.
William H. Mellor III
It wasn't long ago that libertarians were rightly lauding James Buchanan's Nobel prize and the insights he provided through public choice theory. Those insights, of course, demonstrated the insatiable and self-enhancing nature of government. Likewise libertarians have long appreciated F.A. Hayek's and Milton Friedman's prescriptions for non-governmental social organization. That REASON chooses to dedicate much of an issue to the question of "Can Washington change?" reflects the fact that twice in the past 16 years, libertarians have been seduced by the prospect of fundamental change brought about through the electoral process.
In each instance–the Reagan administration, in which I served, and the 1994 elections–all of our zealous intentions and wishful thinking couldn't overcome the incentives and institutional inertia we were all aware of at the outset. For example, while I was in the Reagan Energy Department, we dramatically reduced energy regulations, but it was clear to all concerned that we were never going to be serious about dismantling the Energy Department despite all the vaunted rhetoric. And when the Republicans finally had a shot at ending welfare, they produced a bill over 400 pages long, ultimately driven more by block-grant-hungry governors than by sound welfare policy. It should therefore not be surprising that the answer to the question "Can Washington change?" is basically no. Washington is what it is–same as it ever was.
There will continue to be opportunities in Washington to limit the federal leviathan in specific ways, and there may be possibilities for Berlin Wall-like breakthroughs on issues such as Social Security. But the more propitious focus for libertarians should be away from issues defined by Washington politics and on such matters as establishing a rule of law conducive to a society of free and responsible individuals. Whether the issue is economic liberty, property rights, or the First Amendment, the Constitution provides the means to check Washington's real-world impact. At the same time, policy prescriptions for the state and local level should be a priority not merely as a matter of devolution, but to restore responsibility and accountability in the government closest to the people. And again, the Constitution properly construed should check grassroots tyranny and provide a legal climate in which individual liberty can flourish in communities built around voluntary associations, religious and secular.
Finally, we should take heart in the force of our ideas and principles, which, despite what may be happening in Washington, are indeed still resounding around the world with unprecedented vigor. For two centuries, the United States has been the world's beacon of liberty. That role was not created by, nor can it be preempted by, Washington.
William H. Mellor III is president and general counsel of the Institute for Justice.
Grover G. Norquist
The Republican victory of November 1994 has mortally wounded the welfare/warfare state, just as the hairline fracture in the Berlin Wall in the fall of 1989 doomed the Soviet Empire. The Republican Congress passed a seven-year budget plan that spends $300 billion less in the year 2002 than Bill Clinton asked for. That $300 billion in reduced spending (at an average salary or grant of $50,000 per person) translates into 6 million more Americans who, in the Republican future, will be in the private sector. In Clinton's future, those 6 million people would have been government employees or recipients of government loans, grants, or welfare. This demographic shift moves 6 million Americans from dependence on the federal government–where they have what Marxists call "objective class interests" that make them Democrats–to the private sector, where, as taxpayers, their objective class interests make them Republicans.
Since the Democratic Party is the party of government, it should be relatively painless for the Republicans to reduce government spending, downsize government, and lay off Democrat precinct captains. True, Bill Clinton has vetoed much of the Republican legislation in 1995 and 1996, but despite the Clinton veto pen, Republicans have kept the budget on its glide path to balance in 2002, cutting $42 billion from the budget in 1995 and 1996 combined.
Republicans have demonstrated the ability to build a coalition and consensus for budget cutting and devolution from Washington to the states that would not have been conceivable only two years ago. A Republican Congress passed the "Freedom to Farm" act drafted by the congressmen on the House and Senate Agriculture Committees who would have been presumed most hostile to phasing out all farm subsidies and ending government control over planting. Rep. Pat Roberts (R-Kan.) designed the model for dismantling statist agriculture programs. Farmers, who for the past 50 years were locked into subsidy dependence, were offered a deal: less money in return for more freedom. That same deal was offered to state governments, which had heretofore opposed having welfare and Medicaid sent to the states. The trade: less federal money for more freedom from federal regulations and restrictions.
Similar offers can be made in education: We'll send you a $3,000 voucher rather than spending $9,000 to educate each child, but you, the parent, can control where your child is educated. This was the GOP proposal for the District of Columbia; other federal education monies could be similarly voucherized. Less money, in return for more freedom.
One can already see the outlines of an agreement to privatize Social Security and Medicare taking shape. Giving younger Americans the opportunity to opt out of the government Ponzi schemes and into individually held and controlled retirement accounts brings about a phaseout of Social Security payments in return for a phaseout of Social Security taxes.
Bill Clinton's vetoes have slowed this process but cannot prevent it. The next four years will see the election of Bob Dole and a flurry of tax-cutting and government-dismantling legislation mirroring the activism of the 1964-68 Great Society–or it will see the re-election of Bill Clinton and the phased surrender of big government. The model for a Clinton second term is Richard Nixon's second term, from 1973 to 1974. A hostile Congress will use the battering ram of Whitewater and the Arkansas scandals to force Clinton to throw his allies overboard and continue to dismantle the state.
The players are in place. The strategy is set and tested. In 20 years, the American government will be half its present size.
Grover G. Norquist is president of Americans for Tax Reform.
John J. Pitney Jr.
One way to address the possibility of change in Washington is to picture a dialogue between a pessimist and an optimist.
Pessimist: It's hopeless. For years, Republicans yammered about government, and when they finally win Congress, they don't scrap a single Cabinet department. Heck, they can't even get rid of the chocolate-smearing pornographers at the National Endowment for the Arts. It's as if Carry A. Nation burst into the saloon with a Nerf hatchet.
Optimist: Put down that cup of hemlock. Washington can change, and it already has. In the 1980s, the Military-Industrial Complex lurked everywhere. Its lobbyists worked the corridors of Congress, while the payrolls of military bases and arms factories ensured grassroots support for the defense budget. But when the Cold War ended, even this gigantic political machine could not stop history. Between 1986 and 1996, inflation-adjusted defense spending dropped by nearly one-third. That's real change.
Pessimist: No, it isn't. Washington hasn't set us free: It's just transferred custody of the chains. In constant 1987 dollars, defense did fall by $79 billion during the past 10 years, but Social Security, Medicare, and other payments for individuals went up by $207 billion. The Military-Industrial Complex is a pip-squeak compared with the Greedhead-Geezer Complex. Any time a politician tries to curb the growth in their benefits, he or she ends up as another notch on their Metamucil jar.
Optimist: Things will change, because the system cannot continue. Medicare's health-insurance trust fund goes bust in 2001. Twelve years later, Social Security payments outpace revenues and the system starts tapping its "trust fund," which runs empty a decade or two after that. As the baby boomers edge closer to retirement, they'll see that there is no alternative to privatization of the old-age entitlements. With luck, that realization will come sooner rather than later–but it will come. And then, the odds will finally favor reform.
Pessimist: Since when do national crises lead to less government power? After the stock-market crash of 1929, did Congress seek new ways to encourage free markets? No, it passed the Smoot-Hawley tariff, which helped turn a problem into a calamity. Listen carefully to the current debates: Even the Republican congressional leaders say that they want to preserve the entitlements, not end them. There's not much reason to think that Congress or the people will respond rationally to the coming crisis. It could be like Atlas Shrugged: Things just get worse until all the lights go out–except that we don't have John Galt.
Optimist: But we do have a number of lawmakers who are serious about reform. And if you look at the elections of 1992 and 1994, you'll see that their ranks are growing. Nothing's automatic, of course, but there's a reasonable chance that they will win in the end.
Pessimist: In the end, we're all dead. Please pass the hemlock.
John J. Pitney Jr. is an associate professor of government at Claremont McKenna College.
Randy T. Simmons
Jonathan Rauch suggests that the public knows what it is doing when reforms are blocked. He is wrong. Voters do not know what they are doing, because their roles as beneficiaries and as taxpayers are not directly connected and do not discipline each other. Economically prudent decision making is not demanded by any voter's own budget constraints. In playing out their taxpayer role, well-meaning, public-spirited citizen-voters support reductions in their own tax payments while pursuing increases in benefits to themselves. Thus, citizen-voters are able to make ethical imperatives out of logically contradictory agendas. A citizen-voter, for example, might advocate increased spending on defense, social programs, and consumer protection while also promoting lower taxes, balanced budgets, and reduced government controls.
Voters may hope that other taxpayers, preferably "wealthy" ones, can be forced to finance their private desires and altruistic impulses. Besides, not to seek subsidies, feel-good social programs, or ideological preferences when others do seek them makes little sense. Everyone is caught up in what could be called "the budgetary tragedy of the commons," a frenzied scramble over the elusive public pie.
Elected officials contribute to the problem, as they realize that any given action or policy will please some voters, displease others, and escape the attention of most. It will also affect an official's power or influence with colleagues, bureaucrats, and organized interests. Reconciling these differences requires bargains, compromises, logrolling, and strategic choices. Successful politicians are those who can adopt policies which keep them in office by taking the wealth from the politically less powerful and redistributing it to supporters, who maintain or improve influence with other politicians, and who enlist the aid of bureaucrats. These goals are all accomplished in an arena where the prudent citizen-voter must pay a steep price even to be informed about the costs and benefits of policy options. Voters, therefore, remain ignorant of costs while politicians remind them of the alleged benefits. All this suggests a great tragedy for modern democracies: The better a political system represents the narrow private interests of some citizens, the worse it may be at fostering economic progress and safeguarding liberty.
But reforms are possible. Otherwise there would have been no Reagan tax cuts, water marketing would still be a gleam in some economist's eye, we wouldn't be talking about meaningful welfare reform, and there would be no justification for magazines like REASON or pro-market think tanks. The Gingrich reforms lost a public relations battle. Clear constituencies were not identified, informed, or mobilized. Costs of programs were not compared to their benefits in ways the voters understood. Political entrepreneurs opposed to reform, however, effectively mobilized constituencies, framed issues in ways that swayed voters, and separated the benefits of programs from their costs. What reform needs is a clear understanding of the dynamics of the process and good political entrepreneurship, not despair.
Randy T. Simmons is professor of political science at Utah State University and co-author of Beyond Politics: Markets, Welfare, and the Failure of Bureaucracy (Westview Press).
The post Can Washington Change? appeared first on Reason.com.
The leading philosopher of our time–Jerry Seinfeld–says that buying books makes you humble, "because to walk into a bookstore, you have to admit there's something you don't know." Accordingly, IDG Books has named a line of reference books "…for Dummies." Most of these volumes, such as Windows 95 for Dummies, aim at readers who actually have plenty of smarts but need some guidance about a specific technical subject. A new entry in this line, Politics for Dummies, tries to do for political participation what the other books have done for computing and business. It includes 26 short-attention-span chapters on topics ranging from campaign finance to national party conventions. With a lively format featuring checklists, sidebars, and cartoons, the book presents bits of useful material in a readable manner. Especially worthwhile is an appendix that explains each state's procedures for voter registration.
Alas, author Ann DeLaney takes the title too seriously. Much of Politics for Dummies really does assume that its readers are…well, if you can't tell where this sentence is heading, stop here, because this book is for you.
For the rest of us, Politics for Dummies is like a long day with a grade-school teacher who has little confidence in the pupils. Such a teacher doesn't trust the kids with the straight facts but instead gives them the version that is "good for them." In reading this book, I kept thinking of my seventh-grade nun, locally famous for her unique take on reality. Warning the class about physical vanity, she told of a woman who perished after dyeing her hair too many times: "And when they did the autopsy, they found that her brain had turned purple!" Even at age 12, we didn't believe these stories. Instead of improving our character, they turned us into little cynics.
DeLaney tries to get us to the polls by insisting that every vote counts and is equally important. This sentiment sounds uplifting, but it's about as credible as the parable of the purple brain. Notwithstanding the old anecdotes about close elections, a single vote is probably not going to change any outcomes in an electorate of 104 million people. In many cases, politicians have further diluted individual voting power through district lines that guarantee victory to a party or ethnic group. In California's 37th congressional district, for instance, the 6-to-1 Democratic registration edge means that any single Democratic vote is redundant and any Republican vote is irrelevant. This is the stupid-politician trick called gerrymandering, a term that even dummies have heard. Politics for Dummies fails to mention it, however, which is odd in light of the author's background as a congressional candidate in Indiana, site of some classic districting ploys.
Not content with praising the act of voting, DeLaney speaks of nonvoting the same way my nun spoke of hair-dyeing. If you don't register to vote, she says, your opinion can never show in public opinion surveys, because pollsters only speak to people on the rolls. She errs: Although some polls screen for registration, particularly at election time, many others do not. She goes on to say that "politically, if you neither register nor intend to vote, you don't exist. You are a nonperson!" She exaggerates: Although politicians may well pay less attention to abstainers, many figures in American history have made their mark outside the electoral process. As a convicted felon, Malcolm X could not vote, but he got his point across through demonstrations and other forms of direct action.
You can just see the author standing over her readers, ruler in hand, ready to whack them on the knuckles if they don't reach for a ballot. "You must vote. If you don't vote, you have no right to complain about politics, politicians, or government" (emphasis in the original). No right? I guess I overlooked that clause the last time I read the First Amendment. (By the way, this primer on American politics neglects to include a copy of the Constitution.)
There is a compelling case for voting, one grounded in the concept of civic obligation. The book would have been far better if DeLaney had developed that line of argument; however, she never lets careful analysis interfere with a cliché.
And boy, does this book have clichés coming out of its ears! (Intentional irony, folks.)
Clichés can be tolerable as long as they make sense, but the book embraces some adages that are wildly misleading. Consider this one: "You can't beat somebody with nobody." Recently, a veteran party operative told me that he was tired of hearing that line: "It's silly. We've beaten lots of somebodies with nobodies." He pointed to the 1994 elections, where major political figures such as Speaker Tom Foley, Sen. Jim Sasser, and Gov. Mario Cuomo lost to worthy but obscure challengers. DeLaney makes the cliché even dumber when she rewords it: "A somebody candidate, no matter how unpopular, beats a nobody candidate every time." Former Congressman Dan Rostenkowski might want to correct her on that point. He has the time.
The 1994 elections also undercut another of the book's favorite saws: "All politics is local." For 40 years, Democrats had controlled the House of Representatives by concentrating on constituency service and local issues. Newt Gingrich and the Republicans broke that pattern by drawing sharp partisan distinctions and recasting the midterm elections as a choice between two national parties. Failing to recognize any contradiction, DeLaney herself mentions the importance of the 1994 elections as "a wave of voter reaction."
The book contains other contradictions. After her stern lectures about the absolute political impotence of nonvoters, DeLaney encourages teenagers to get involved in campaigns: "If you're willing to work hard, you can make a difference before you're old enough to cast your first ballot." In her list of "Ten Common Political Mistakes," number four is "promising not to run for reelection" while number seven is "not knowing when to retire."
At some points, the book lapses into downright incoherence. "In government, as opposed to politics, the spokesperson is the elected official or a third-party group. The recent national debate over health care provides a perfect example of this model. The Republican Party opposed President Clinton's proposal, but it was the financial interests–the insurance industry, the nursing home industry, and the hospital association–that attacked the proposal openly." Elected officials never speak for themselves during campaigns? Republicans never openly attacked the health proposal? What planet is she talking about?
Floating in the muddle are faint hints of an ideological agenda. DeLaney likes government, and she encourages people to think of how they can benefit from it. In advising her readers how to take sides on an issue, she says that they should ask two questions: "Does this issue matter to me at all?" and "Will the issue have any impact on me?" The "me" emphasis, of course, is an invitation to pork-barreling and deficit-mongering. DeLaney personifies the "good government" types who take the leviathan state for granted and seek to confine political debate to the splitting of the loot. DeLaney does show a concern for costs, but in peculiar places. A sidebar on "The Ridiculous Cost of Campaigns" cites a U.S. Senate race in California as a horror story of extravagance. But DeLaney forgets that more than 7 million people voted in that election. Communicating with so many voters is inherently expensive, especially because California television stations give candidates few opportunities for free exposure.
More to the point, however, the Citizens' Research Foundation reports that the total cost of all federal, state, and local campaigns during the last presidential election cycle amounted to $3.2 billion. In the same two-year period, the total cost of federal, state, and local governments came to $3.9 trillion. In other words, government spending was more than a thousand times greater than election spending–yet to DeLaney high-priced campaigns are the problem. If she had been at Chernobyl, she probably would have complained about the lead-based paint on the reactor core.
Politics for Dummies is a good idea for a book. Politicians have made the system more complex than it needs to be, in part because mystification allows them to escape accountability. A simple, clear-minded, shrewd guide would help people see through the mummery and start to regain control of the government. Politics for Dummies is not that book. After I put it down, I thought not only of my seventh-grade nun but also of Jack Nicholson. At the climax of A Few Good Men, he sums up this book's implicit motto: "You can't handle the truth!"
John J. Pitney Jr. (jpitney@mckenna.edu) is an associate professor of government at Claremont McKenna College in Claremont, California.
The post Dumb and Dumber appeared first on Reason.com.
It was not always so. When I was a teenager in the late 1960s, I yearned to be…Richard Nixon. To an awkward kid with a bulging bookbag, the sight of cheering crowds at Nixon rallies made a politician's life look awfully attractive: Here was one career where a nerd could get standing ovations.
Now, 28 years after pinning on my first Nixon button, I'm still a Republican, a political junkie, and a nerd. But having spent much of that time in the company of politicians, I'm now convinced that I don't want to be one. Sure, it might be fun to debate on the Senate floor or go on the Brinkley show and kid Sam Donaldson about his toupee. The catch, I've learned, is that these activities account for only a tiny part of a politician's time. The rest of it stinks.
Life may be a banquet, but political life is fast food. The size of modern government means that politicians have to cope with frenzied daily schedules. They are constantly rushing from meeting to hearing to markup to floor debate and back to meeting again, seldom spending enough time on a single activity to learn anything from it. Amid the rush, they find it hard to squeeze in the little things, such as reading, study, and thought. It's no wonder that when a television interviewer asked Rep. (now Sen.) Ron Wyden (D-Ore.) to find Bosnia on a globe, he couldn't do it. Busy lawmakers don't have time for picky details.
One such detail is the actual text of legislation. In Washington, D.C., and many state capitals, the best you can say is that some of the lawmakers read some of the bills some of the time. The rest rely on staff-written summaries and word-of-mouth descriptions from colleagues and lobbyists. They often don't know what they're doing, and then they can't remember why they did it. Accordingly, one of the literary genres of the political world is the "vote justification," a memo in which staffers supply legislators with rationales for votes they have already cast.
Call me naive, but I'd hate to vote on bills I'd never had a chance to read. I'd feel as dirty and irresponsible as if I'd graded a term paper without looking past the cover sheet.
I'd also hate to perpetrate the deception known as political writing. Impressed with the "personal" response to the letter you sent your senator? In all likelihood, the senator didn't read it, much less write it: A recent college graduate did the work and affixed the senator's signature with a laser printer. Enjoy the governor's perceptive op-ed article in the local paper? Same story. Maybe the governor edited a draft, but the real author was some munchkin you never heard of.
If I held office, I'd try to do as much of my own writing as possible, but as a practical matter, the staff would have to do most of it. I've spent years warning my students about the evils of plagiarism, preaching that they should never put their names on other people's work. What would I tell a former student who saw me signing an article that one of my staffers had written? On an intellectual level, one could justify it as "routine staff work"–but on a gut level, it would still feel like cheating.
Another dilemma would arise out of my appointment book, because I would not want to meet with the people who'd want to meet with me. People usually seek out a politician because they want something for themselves, and it's typically something they shouldn't get, such as a tax break or a pork-barrel project. But they can usually make their pleas sound plausible–even heart-wrenching. Picture telling a sad-eyed 80-year-old lady: "Ma'am, we must cut spending, so I can't support additional funding for a senior citizens' center in your home town." Saying no is hardly a delight.
Neither is saying yes, at least if you remember where the money comes from. Every time politicians support new government spending, they are telling the butcher, the baker, and the janitor: "Hand over your hard-earned cash, and we will choose what to do with it. If you fail to pay up, people with badges will take away your property or throw you in prison." Perhaps it's a character flaw, but I would dislike practicing coercion on my fellow citizens, even for the sake of some higher good.
Of course, I've been assuming that I could win an election in the first place, which is highly doubtful. Running for office takes a lot of money; and since I'm not rich, I'd have to ask for it. Yuck. I can't stand to sell raffle tickets to my friends, much less badger rich strangers to give me thousands of dollars.
Even if I did scrounge up the necessary funds, my positions would alienate large voting blocs, and my rallies would serve as magnets for nasty protests. Take, for instance, the highly emotional issue of laboratory testing of animals. I'm for it, big time. If scientists must sacrifice a thousand pigs to develop a surgical procedure that will save a single human life, that's OK with me. Upon hearing me say such things, animal-rights wackos would shout me down with cries of "murderer!" and tie-dyed moms would lean down to their small children, and say, "He's the bad man who wants to kill Babe."
Likewise, I favor privatizing Social Security and Medicare. So when the animal-rights folks got hoarse, the greedy geezers would start yelling, and the tie-dyed moms would lean down again and say, "The bad man wants to kill Grandma, too."
Rushing, faking, begging, and making small children cry–that's not my idea of healthy living. I do not choose to run.
John J. Pitney Jr. is an associate professor of government at Claremont McKenna College in Claremont, California.
The post Politics: Don't Vote for Me appeared first on Reason.com.
And in 1996, the Internet has enabled ordinary Americans to do just that. Years ago, researchers had to spend hours and days pawing through yellowed newspaper clippings just to document a single evasion. Now, anyone with a reasonable amount of computer skill and political knowledge can conduct an extensive search in a matter of minutes. Candidates have loaded their Web sites with speeches, press releases, and policy statements–perhaps not thinking that anyone would actually read the stuff. Other material is available from a huge variety of on-line sources.
The GOP has some problems. A brief scan of cyberspace shows that Republican politicians have not set high standards of consistency on the issues of tax reform, abortion, and federalism.
The Flat Tax: The flat tax has stirred a great deal of interest among Republicans. In its pure form, this proposal would scrap all existing credits and deductions in return for a single low tax rate. (For detailed information on tax reform, check the Joint Economic Committee's site: http://www.jec.senate.gov/). One school of GOP thought holds that the idea will appeal to voters who are sick of garbled tax rules and grumpy old IRS bureaucrats. A top presidential campaign proclaims that its candidate "wants a fairer, flatter, simpler tax system so people can fill out their tax returns without a lawyer, an accountant, or both. He will end the IRS as we know it, and curb its abuses of taxpayers."
But another school holds that the current system benefits many voters, and if anything, we need to establish even more tax preferences. One campaign calls its candidate "a leading advocate of family tax relief, such as a $500 per child tax credit, 'marriage penalty' relief, adoption tax credits, IRAs for homemakers, and easing the estate tax burden on family businesses."
These diametrically opposed statements might have furnished the basis for an enlightening debate, except for one small problem: Both came from the Dole campaign. I have not wrenched them out of context, from separate documents issued months apart. The two positions appeared together in a Dole for President e-mail bulletin on January 19, 1996. You can also find them on the Dole Web site (http://www.dole96.com/) under "What Bob Dole will do as our President."
The sheer audacity of this contradiction has theological dimensions. Zen monks spend years contemplating such riddles in order to free their minds from earthly logic.
Steve Forbes has made a more straightforward flat-tax proposal, even though the statement on his Web site (http://www.forbes96.com/) does not directly mention the elimination of deductions. For the most thorough explanation and defense of the idea, a computer user must look beyond the presidential candidates to House Majority Leader Dick Armey (http://www.house.gov/armey.flattax/), whose site shows exactly what a tax form would look like under his legislation.
Abortion: As 1996 got under way, pro-choice Republicans lacked a strong voice either on the campaign trail or within the top ranks of the GOP congressional leadership. Among the presidential candidates, Lamar Alexander probably has come the closest to the pro-choice side, yet if you check his Web site (http://www.Nashville.Net/~lamar)–the one with mock red flannel on the home page–you would not think you had contacted NOW. Click "On the Issues/Abortion" and you find: "The federal government should never become involved with abortion–should not subsidize it, encourage it, or prohibit it….Because I believe that abortion is wrong and that states may restrict abortion, I would characterize my beliefs as pro-life."
The issue may still rock the party, however, because the anti-abortion side faces a serious internal split. Some pro-lifers insist that the GOP must hold fast to every word of its firm anti-abortion platform plank, which remained unchanged between 1984 and 1992. Others argue for a modification, saying that the party can broadly embrace the anti-abortion cause without committing itself to certain proposals that could alienate potential allies.
The platform language is tough indeed, and many of its supporters may not have thought it through to its logical conclusion. The platform endorses "legislation to make clear that the Fourteenth Amendment's protections apply to unborn children." The 14th Amendment secures for everyone "the equal protection of the laws," so under the proposed legislation, state homicide statutes would apply to the killing of fetuses. In other words, prosecutors and courts would have to treat abortion as murder. Women who have abortions, as well as their physicians, would have to go to prison.
Some hard-liners in the pro-life movement do have the intellectual courage to acknowledge and accept this consequence. For the most part, however, those who support the current language have preferred to avert their eyes from its real meaning. Even Pat Buchanan has blinked. His Web site (http://www.buchanan.org/) includes the full text of a speech titled "A Contract With the Unborn." If you download it and conduct a word search, you will not find the words jail, prison, or penalty. The speech does mention the word punish–but only in the context of punishing anti-abortion violence.
To see how Buchanan deals with the penalty issue, one must go to the University of New Hampshire's special site for the state's presidential primary (http://unhinfo.unh.edu:70/unh/acad/libarts/comm/nhprimary/nhprim.html), which has a fine directory of candidate statements on various issues. The directory includes a transcript of a talk radio appearance in which Buchanan said: "I think the woman's a victim in an abortion. I don't believe any of them should be put in jail or punished or chased around or get in their face or anything. My goal is very simple–don't do it, save lives." Fine–but he has specifically pledged his support to the current platform language.
Republican leaders prefer to change the subject when abortion comes up. Sooner or later, they will have to pay attention to what their party has been saying on the issue.
Federalism: Republicans like to recite the 10th Amendment: "The powers not delegated to the United States by the Constitution, nor prohibited by it to the States, are reserved to the States respectively, or to the people." They really mean it–except when they don't.
The 1993 Senate debate on the crime bill is especially revealing. The Congressional Record transcript of Senate action is available on Thomas (http://thomas.loc.gov), a not-so-user-friendly site run by the Library of Congress. (Leave a good deal of time for scrolling through extraneous material.) During this debate, Senate Republicans supported crime-bill amendments that amounted to a drive-by assault on federalism. With Sen. Al D'Amato (N.Y.), Phil Gramm sponsored an amendment that would have federalized virtually all firearms crimes, and Gramm is still taking credit for the proposal on his Web site (http://www.gramm96.org/). Dole himself sponsored the most excessive provision. The Dole amendment, later scratched in a House-Senate conference, would have made it a federal crime "to participate in, or to conspire to participate in, a criminal street gang, and to induce others to join the gang." In the Dole version of West Side Story, the role of Officer Krupke would have gone to Janet Reno.
Leave aside the silly proposition that the FBI and DEA are better equipped to handle street crime than local police. The more important point is that, if the 10th Amendment means anything, then the federal government has no business even trying. That concern, however, has not stopped Dole from including a 10th Amendment section on his Web site.
Cynical political handlers may dismiss all these points as academic piffle. They like to repeat the old joke that sincerity is the key to winning elections, and if you can fake that, you've got it made. But in the end, the joke is on them. Voters have always hated it when politicians have trifled with them, and this sentiment has grown more intense ever since President Bush broke his "Read My Lips" promise. More than anyone, Republican politicians should remember that.
In the electronic universe, politicians will find it harder and harder to get away with dissembling. On the Internet, the truth is out there.
John J. Pitney Jr. (jpitney@mckenna.edu) is an associate professor of government at Claremont McKenna College.
The post Politics: Tangled Web appeared first on Reason.com.
Take the example of U.S. Grant, a clean man with a dirty administration. Graft and incompetence flowed so freely during his tenure that Grantism quickly became a synonym for political debasement. Despite the slime that surrounded him, Grant easily won a second term in 1872, and his party held the presidency in the next election. OK, so the GOP's 1876 victory was a tad fraudulent. Still, the voters forgave and forgot: In 1880, they handed the Republicans yet another term in the White House and restored a GOP majority in the House of Representatives.
During the 1920s, the emerging Teapot Dome scandal lost political steam when Warren Harding conveniently died (from poisoning by his jealous wife, according to rumor) and was succeeded by the righteous Calvin Coolidge, who led the Republicans to triumph in 1924. Three decades later, stories about widespread corruption did contribute to Truman's retirement and the GOP sweep of the 1952 national elections. In 1954, however, Democrats regained the House and Senate, starting what would become the longest stretch of one-party control in congressional history.
In all these instances, the parties in power walked away with only scratches. Once the tainted individuals left the scene, the scandals lost their sting.
Watergate seems to be an exception. The Republicans suffered deep losses in the 1974 congressional elections and did not really recover until 1980—and even then, they failed to win control of the House. Ever since the 1970s, therefore, opposition researchers of both parties have lusted for "another Watergate," a series of revelations that would break the other side's legs and keep them broken for years.
The weakness of the Watergate paradigm is that the scandal was only one source of the GOP's problems in the '70s, and not necessarily the most important one. In 1972, the Committee to Re-Elect the President vacuumed up campaign resources that could have gone to GOP congressional candidates and then refused to give the slightest help to the party at large. The Nixon administration's policies—a guaranteed income, wage-price controls, racial quotas—temporarily blunted liberal criticism at the cost of obliterating partisan differences. In 1974, Nixonomics combined with an oil shortage (made worse by government controls) to produce a deep recession that probably cost the GOP far more seats than Watergate did.
Over the long run, in fact, Watergate may have hurt the Democrats more than the Republicans. Many in the party's dominant liberal wing thought they could forever scare voters away from the GOP by chanting, "Nixon, Watergate, dirty tricks." They passed up the chance to update their message by rethinking the New Deal welfare state, and they paid the price during the presidential elections of the 1980s.
Today's Republicans would be similarly ill-advised to sink all their capital into the Clintons' alleged misdeeds. (That would be as irrational as risking one's life savings in cattle futures without a miraculously well-connected broker.) For such scandals to yield long-term benefit for the GOP, two things would have to happen. First, the voters would have to conclude that wrongdoing involved Democrats outside the Clintons' immediate circle. But politicians, like cockroaches, have wondrous survival instincts. If things start to go badly, Clinton's partisans will distance themselves from him just as fast as House Democrats ditched Speaker Jim Wright when he became a liability.
Second, voters would have to be convinced that Republicans are much more ethical than Democrats. That is a tough sell. While Republicans can justly argue that Democrats hyped GOP misdeeds during the "decade of greed," they can hardly claim exemption from original sin. Among other things, Americans are painfully aware of Republican involvement in the House Bank and the S&L fiasco.
These scandals helped set off the current wave of political outsiderism. Some Republicans have assumed this sentiment would automatically help their side: Democrats hold more seats in Congress, so "anti-incumbent" usually means "pro-Republican," right? Wrong. As John Hood has argued in these pages ("The Third Way," February 1993), popular discontent with the political system has translated into growing support for minor parties.
This phenomenon extends beyond Ross Perot's presidential bid. Ryan Iwasaka, an undergraduate researcher at Claremont McKenna College, has calculated that minor-party candidates for the U.S. House received 1.8 million votes in 1992, or 2 percent of the total vote. The latter figure understates their significance, since they did not appear on the ballot in most districts. Among the races where they did participate, they won more than 9 percent of the vote.
If Republicans talk of nothing but Democratic character flaws, voters will spurn both sides and turn in even greater numbers to minor parties. To attract voters to the GOP banner instead of driving them away from the two-party system, Republicans need a compelling message about the substance of public policy.
Ideally, the issue of corruption could offer the starting point for such a message. After all, why does official corruption happen in the first place? Whenever government hands out material benefits and regulates economic activities, interested individuals and firms will seek to gorge themselves on the goodies and gain special dispensations from the rules. To curry government's favor, they will ply politicians with "gifts" and campaign contributions.
The 1972 Nixon campaign could afford luxuries such as burglary because it could raise millions from businesses afraid of government regulation. According to one of Woodward and Bernstein's sources, finance chairman Maurice Stans would make the following pitch to CEOs: "You know we got this crazy man [EPA Administrator William] Ruckelshaus back East who'd just as soon close your factory as let the smokestack belch. He's a hard man to control and he's not the only one like that in Washington. People need a place to go, to cut through the red tape when you've got a guy like that on the loose."
Since Watergate, Congress has responded to official corruption with complex regulations on ethics and campaign finance, which have mainly served to make influence peddling more subtle and professional. Using such rules to fight political rot is like treating obesity with a girdle: It rearranges the problem instead of solving it.
The best way to raise government's moral standard is to reduce its size and scope. Conversely, the worst possible course is to create costly new bureaucracies—and this is where ethics intersects with health-care reform. If you want to give physicians and other providers even more incentive to play the influence game, endow the federal government with total control over their livelihoods. If you want to create a black market in medical services, where desperate patients slip cash under the table to their doctors, then enact legislation empowering health alliances to limit fees and forbidding patients to pay doctors directly. By drawing more and more people into the swamp of regulatory politics, and by criminalizing market transactions, the Clinton health plan would set the stage for corruption far greater than anything ever imagined in Little Rock.
So in framing the campaign debate, the issue for Republicans is not policy or ethics, but policy as ethics. Focusing on the sins of individual politicians will do them little good in the long run, because the politicians will eventually go away. Instead, they need to make the case that big, bureaucratic government is bad government, meaning not just inefficient but ethically corrosive.
John J. Pitney Jr. is assistant professor of government at Claremont McKenna College.
The latter point is just as significant as the first. In every election between 1924 and 1956, one or both major parties chose a presidential nominee who had served as governor. But from 1960 to 1972, the major parties turned to the likes of Kennedy, Johnson, Goldwater, Nixon, Humphrey, and McGovern—all sons of Capitol Hill, and not a governor among them.
Political analysts observed that the Senate had replaced the state capital as the nursery of presidential ambitions. In the old days, a Franklin Roosevelt or a Thomas Dewey could win the nomination because his gubernatorial job kept him in touch with the "kingmakers"—the legendary bosses who saw the world from between the brim of a fedora and the tip of a cigar.
In the 1960s, however, reporters who saw life through a camera viewfinder ruled presidential politics. Senators had easy access to the elite Washington press corps, while most governors could only reach the small-fry and washouts assigned to the backwater beat of the statehouse. And during the early days of Camelot, senators could claim credit for new social programs and pontificate about foreign affairs, while governors were stuck with explaining bond issues and tax increases.
In the mid-1970s, things started to change again. Watergate, Vietnam, and the failure of the Great Society spoiled Washington's image. Senators kept getting press attention, but more of it was hostile: Compare the cover-up of John Kennedy's assignations with the ever-deeper excavations of Edward Kennedy's personal life. (The Senate had lost its luster but not its lusters.)
The percentage of Americans reporting "a great deal of confidence" in Congress plunged from 21 percent in 1972 to 9 percent in 1976. Senators were heading into political oil slicks: Their reelection rate had topped 86 percent between 1962 and 1970, but it dropped to 74 percent between 1972 and 1980.
Governors, meanwhile, saw an increase in their reelection rates in the same period, from 70 percent to 75 percent. The 1976 presidential election reflected this reversal of fortunes. Several big-name senators—including Frank Church and Henry Jackson—sought the Democratic nomination, yet they were all outrun and outgrinned by an obscure former governor of Georgia. And on the Republican side, a former governor of California came within a Brylcreemed hair of taking the crown from President Gerald Ford.
Former Gov. Reagan went on to win the next two elections. The rise in gubernatorial stock continued in 1988 when the Democrats chose Michael Dukakis, the first sitting governor to win a presidential nomination in 36 years. And in state elections between 1982 and 1990, the gubernatorial reelection rate rose again, to 80 percent. (House and Senate rates increased as well.)
Several trends combined to strengthen the governors. One was the monstrous increase in the size and reach of state governments. Since 1960, real per-capita state spending and the number of state employees have both more than doubled. Indeed, since the early 1970s, state governments have employed more civilians than the federal government. Among other things, these numerous state employees have been busy writing and enforcing regulations that cover many aspects of business and everyday life.
Through their authority to appoint officials and veto items in appropriation bills, governors generally wield a good deal of control over the states' burgeoning power. Those people who stand to gain or lose from state action—a rather large group—thus find it prudent to stay on a governor's good side.
Early in the 1988 race, Michael Dukakis gained a decisive fund-raising advantage over his rivals by getting contributions from corporate executives in Massachusetts. Sens. Paul Simon and Albert Gore, who each cast only a single vote in a 100-member body, lacked the financial leverage that a sitting governor could exercise.
In the 1970s and 1980s, the expansion of state government provided governors with numerous occasions to tout their "innovative policies," a.k.a. spending programs. Most readers have heard of Dukakis's Employment and Training program, along with other elements of the "Massachusetts Miracle." Dukakis's accomplishments proved as bogus as they were costly—yet they did generate a great deal of favorable publicity in the 1988 campaign.
Less well-known is that Ronald Reagan also campaigned on his gubernatorial spending initiatives. A 1980 campaign document noted that, as governor, Reagan "expanded state aid to education at all levels, from elementary schools to the state universities; increased the number of state scholarships awarded by over 500%, and the amount of state loans and scholarships by 915%."
Governors now have greater opportunities to win exposure and make political contacts beyond their states' borders. Many states now run overseas offices and sponsor foreign-trade missions, which governors often lead. In the autobiography circulated in his 1976 campaign, Jimmy Carter recalled: "During the last two years of my term, we made personal and official visits to ten other nations. In each instance, our goal was to promote friendship and trade, and to learn as much as possible about our hosts." In fact, his main goal was to pad the otherwise-vacant "international experience" line on his presidential résumé.
A governor can also amass mileage within the United States. Bill Clinton gained national prominence by heading the National Governors' Association task force on educational goals. And by winning the helm of the Democratic Leadership Council, he established himself as his party's leading quasi-moderate. (Note: In light of Clinton's amazing "flexibility," all adjectives applied to him should start with the prefix quasi.)
Changing terms of office also helped spread Potomac Fever to a number of state capitals. In 1960, 15 states elected governors to two-year terms. Their governors faced the unappealing prospect of giving up their offices if they wanted to run for president. Since then, 12 of these states have switched to four-year terms—and except for North Dakota, all have opted to hold elections in nonpresidential years, freeing their chief executives to seek the White House. Massachusetts changed in 1966, when Michael Dukakis was still a hungry state legislator. The most recent switch came in 1986—in Arkansas.
Yet for all their newfound assets, governors still have considerable disadvantages in presidential campaigns. Heading a state government is a demanding job that may leave its occupant with little time and energy for the campaign trail. Carter and Reagan both departed office in early 1975, so they had the luxury of becoming full-time candidates. By contrast, Mario Cuomo was probably not dissembling last December when he said New York's budget woes precluded him from entering the primaries. Even a free-spending statist such as Cuomo has a tough time keeping up with the orders for pork coming from New York's greedy and corrupt legislature.
Governors are magnets for blame as well as credit. Whereas lawmakers can dodge accountability for just about anything, governors rise or fall with their states' fortunes, whether or not they are actually responsible. Dukakis took the bows for the good economic times that Massachusetts enjoyed in the 1980s—even though the boom stemmed from tax cuts that he had opposed. When the state crashed after the 1988 election, the last embers of his political career were extinguished—even though he was only one of many culprits.
Fiscal reality is catching up with the states, so governors in the 1990s may lose the political benefits that accompanied the expansion of state government. Some, however, are turning austerity to their advantage. William Weld, who succeeded Dukakis, responded to the state's fiscal crisis with budget cuts instead of tax hikes, and he aims to restructure the state's government through initiatives such as privatization and school choice. If he succeeds, he may well position himself as a new kind of presidential contender: a governor who achieves prominence by making government better instead of bigger.
John J. Pitney Jr. is assistant professor of government at Claremont McKenna College, Claremont, California.
The proposal also posted a gain of sorts in Atlanta. In 1984, the Democrats forthrightly opposed both the amendment and the convention call. Their 1988 platform, though, said nothing about the issue. It merely stated that government "must provide the resources" to pay for social programs. (What do they mean by "resources"—raisins, dates, and hazelnuts?)
Elsewhere, the convention call is having hard times. In a number of state legislatures, lawmakers have introduced resolutions to rescind their states' previous support for a convention call. During 1988, such resolutions passed in Florida and Alabama. In the year ahead, supporters of the convention call will fight for its approval in several states, including New Jersey, Michigan, Minnesota, and West Virginia. Opponents will seek to block these efforts and, if need be, persuade more legislatures to backtrack.
Yet it's obvious to nearly everyone that federal overspending cannot go on. So what's all the fuss about?
The United States has not had a national constitutional convention since 1787. Calling a new one requires support from 34 states; a convention to write a balanced-budget amendment has won in 32 legislatures. If the Florida and Alabama rescissions prove valid—scholars disagree on whether a state can "take back" endorsement of a convention call—the number drops to 30.
Opponents of the idea range all over the ideological map. Some social conservatives hold that a convention could stray beyond its charter and write language threatening, for example, the right to worship. Social liberals worry that it could create a national religion. Laurence Tribe fears a convention full of supporters of Phyllis Schlafly, who fears a convention full of Laurence Tribes.
Supporters of the convention call, led by the National Taxpayers Union, say that's far-fetched. But they should think hard about a contrary danger: that the convention would produce not Frankenstein but Gumby. Instead of crafting an airtight lid on federal spending, it could well write a boneless document full of loopholes.
Before considering why the whole effort might flop, look at the "runaway convention" argument. With some variations in wording, the states' applications would limit the proposed convention to writing a balanced-budget amendment and nothing else. Some scholars argue that such curbs could not bind the convention: once assembled, it could propose whatever amendments it wanted. After all, the first constitutional convention turned into a runaway. In 1787, Congress instructed the delegates to meet to revise the Articles of Confederation, but once they reached Philadelphia the delegates scrapped the Articles and started over.
The result is generally deemed a happy ending. Opponents of a new convention warn that we wouldn't be so lucky again. Sen. Patrick Leahy (D–Vt.) is typical: "Nothing—not our first amendment rights of freedom of speech, freedom of religion, not our rights against search and seizure—none of the rights in the Constitution would be off limits. They all would be susceptible to revision. And I fear that the rights we take for granted could be the first to go."
Until a runaway happens and the Supreme Court rules whether it overstepped its bounds, legal questions about limiting a convention's work will remain open. But suppose the worst: a convention proposed repealing the First Amendment, and the Supreme Court refused to step in. Three-fourths (38) of the states would still have to ratify the change. And ratification would surely crash into the roadblock of public opinion.
Even in the unlikely event that a majority favored such a radical shift, it would still face tall odds. The Founders built a ratification process that enables a dedicated minority to thwart a hot-headed majority. Just 13 states, with as little as 3 percent of the population, could scuttle the new provision. Opponents would have an even easier time if Congress sent the amendment to state legislatures instead of special state conventions. (Only the 21st Amendment, repealing Prohibition, was ratified by special conventions.) Except in unicameral Nebraska, both chambers of a state legislature must vote to ratify, so opponents would have to get only 13 of 99 state legislative houses to withhold approval. They wouldn't even need a majority in those 13 houses; merely blocking a vote would have the same effect.
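The blocking arithmetic is worth making explicit. A minimal sketch, assuming nothing beyond the 50-state union, the three-fourths ratification rule, and Nebraska's lone single-chamber legislature:

```python
# The Article 5 blocking arithmetic described above. Everything here
# follows from three facts: 50 states, a three-fourths ratification
# requirement, and Nebraska's unicameral legislature.
import math

states = 50
needed_to_ratify = math.ceil(0.75 * states)        # 38 states must ratify
blocking_minority = states - needed_to_ratify + 1  # so 13 states can scuttle it

# Via the state-legislature route, every bicameral legislature must pass
# the amendment in both chambers; only Nebraska has a single chamber.
chambers = (states - 1) * 2 + 1                    # 99 legislative houses

print(needed_to_ratify, blocking_minority, chambers)  # 38 13 99
```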
With such a low threshold, opponents could win. Ponder the fate of the Equal Rights Amendment, a less drastic measure. In spite of presidential support and polls showing wide approval of the ERA, its foes mustered enough clout in enough statehouses to kill it.
It might seem that legislatures balk at enlarging civil liberties (as some claim the ERA would have done) yet would eagerly vote to take them away. The evidence suggests otherwise, however. Thirty-five legislatures supported the ERA, and it's hard to see them voting to repeal the First Amendment.
Another response concedes the improbability of these horror stories but argues that the stakes are so high that we cannot even afford a small chance. Advocates of a convention belittle this argument. Under Article 5, Congress may propose amendments at any time, without calling a convention. Since Congress has had nearly 200 years to hatch evil amendments, convention supporters ask, why haven't we seen any?
Opponents counter that Congress has produced some rotten constitutional amendments, including numbers 16 (the income tax) and 18 (Prohibition). And if lawmakers did not have to face reelection, they would probably write new ones requiring taxpayer-funded Jacuzzis in every Capitol office. And no electoral discipline would apply to convention delegates. They would run only once and so would have even greater freedom to act irresponsibly.
This may be the strongest case for believing the runaway-convention fears. It seems much more likely, however, that if a convention took a wrong turn, it would be toward timidity, not tyranny.
In drafting a balanced-budget amendment, the delegates would confront many temptations to weaken it. Everybody who gets special benefits from the federal government would fight to exempt his or her own program from the lid. And all of the beneficiaries would favor "escape clauses" allowing Congress to lift the lid in case of such national emergencies as poor surfing conditions or excessive reruns of "I Love Lucy."
Whether the convention bowed to such pressures would depend on its makeup. If the delegates were public-spirited defenders of economy and limited government, the convention would probably try to write a rat-proof amendment. But with enough special-interest pleaders, it would surely cook up rat pudding.
How delegates are apportioned and chosen thus becomes crucial. Article 5 of the Constitution says nothing on the issue, so Congress and the state legislatures would have to decide. Under a bill proposed by Sen. Orrin Hatch (R–Utah), apportionment would follow the Electoral College formula: each state's delegation would equal the number of its senators and House members. This would nod to the one-person-one-vote principle but also give an edge to the smaller states. Some may consider small states fiscally conservative, in which case this formula would foster a tight amendment. But the six leaders in per capita state spending are Alaska, Wyoming, North Dakota, Delaware, Hawaii, and New Mexico—all small states.
Senator Hatch's bill would let each state decide how to choose its delegates. He favors popular election but does not think Congress should specify particulars—and many scholars believe that Congress lacks constitutional authority to do so.
Some state legislatures might opt to appoint delegates. Consider the incentives under this arrangement. For many lawmakers who said aye to a convention call, the vote was a "freebie," a way to strike a pose for a balanced federal budget without taking any real action. But as the amendment moved closer to reality, new considerations would come into play. A strict balanced-budget requirement could slash federal aid and force state lawmakers to either cut spending or raise taxes. Seeking to duck that choice, they would want convention delegates to favor a weak amendment and would choose delegates accordingly.
Does this scenario take too harsh a view of state lawmakers? No doubt hundreds of them sincerely oppose "big government." Between 1970 and 1985, however, total state spending (minus federal aid and inflation) rose 60 percent. In Albany as in Washington, politics is survival of the fattest.
Popular election of delegates doesn't look any better. Although a balanced budget would benefit the country as a whole, those blessings would amount to scant personal benefit when split among 245 million Americans. The average citizen would have little material incentive to vote in delegate elections, much less work in a delegate campaign. By contrast, the costs of a balanced budget would fall hard on those who enjoy special federal benefits. Their big stake in the outcome would drive them to act. In delegate elections, those most likely to participate would be those most likely to back foes of a strong balanced-budget amendment.
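The asymmetry can be put in rough numbers. In the sketch below, the dollar amounts are purely hypothetical illustrations; only the 245 million population figure comes from the text:

```python
# Hypothetical illustration of concentrated benefits versus diffuse costs.
# The dollar amounts are invented for illustration; only the 245 million
# population figure appears in the text above.
population = 245e6

# Suppose, for argument's sake, a balanced budget were worth $50 billion
# a year to the country, spread evenly over everyone.
diffuse_gain_per_citizen = 50e9 / population  # about $204 a year

# Suppose a beneficiary stands to lose a $20,000-a-year federal subsidy.
concentrated_stake = 20_000

# The subsidy recipient's stake in a delegate election dwarfs the average
# citizen's, which is why beneficiaries show up and most voters do not.
print(round(concentrated_stake / diffuse_gain_per_citizen))  # roughly 100x
```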
Prospects for an ironclad amendment would also suffer from the mode of election. Except in small states, at-large races would be impractical; delegates would run by congressional district. Several states have gerrymandered their districts to elect big spenders, so opponents of a strong amendment would enjoy a head start. More important, many congressional districts rely on federal pork barrels such as water projects and military bases. People running for convention delegate would pledge to support austerity—except for programs that bring local bounty.
Advocates of an amendment with teeth could try public education and get-out-the-vote efforts. But federal beneficiaries have their own firepower: the American Association of Retired Persons has 18 million members; the National Education Association, 1.7 million. Such groups could show strength in every congressional district in the country. Fiscal conservatives can hardly make the same claim. Business organizations, despite their green-eyeshade reputation, would probably not shed blood for a balanced-budget amendment. Austerity would hurt the thousands of firms that do business with Uncle Sam. Ideological groups may have long mailing lists and busy offices in Washington, but so far they lack state-level organizations to match the pro-spending forces.
That might be changing. In California, Minnesota, and other states, free-market groups are starting to nourish the parched grass roots of economic conservatism. Besides, say advocates of the convention call, we should ultimately trust the people. Despite the scare-mongering opposition of special-interest groups, the people voted for Proposition 13 in California and Proposition 2½ in Massachusetts. Given a fair chance, the argument goes, the people will pick delegates who really want to curb spending.
Whatever its makeup, the convention would adopt its own rules of procedure. And spenders are adept at rigging the game. For example, by crafting rules that limit amendment and debate, the House leadership has steamrollered billions of dollars into the budget. Wanting instant passage of a tax hike in the fall of 1987, Speaker Jim Wright faced the hurdle of a one-day procedural waiting period. No problem—he simply had the House declare a whole new day. Wright and his henchmen would happily teach like-minded delegates how to pull similar tricks. So if a convention does take place, a word of warning to supporters of limited government: fight hard on procedure. The spenders will play J.R. Ewing, so don't get caught in the role of Cliff Barnes.
Yet, although a convention might well flop, the congressional route is hardly a better bet. Unless the House ousts Wright for ethical violations—which would be like a bawdy-house firing the madam for promiscuity—he will reign for years and continue to block anything resembling fiscal sanity. And even if Congress did write and approve a balanced-budget amendment, it would probably leave out the teeth. Witness how Congress exempted half the budget from the Gramm-Rudman knife. So despite all its dangers and pitfalls, the convention route still deserves serious attention from supporters of a balanced-budget amendment.
Any path to a more-limited government and lower spending will carry its share of hazards, because the forces of big government know how to fight. The answer is to learn how to fight back. Adam Smith offers one kind of insight, Machiavelli and Sun Tzu another.
John J. Pitney, Jr., is an assistant professor of government at Claremont McKenna College.
Jackson wants to stop "economic violence"—a vague term that seems to cover any business decision he dislikes, such as closing a factory. "Let's take on the giants who have the power," he says, "and make them responsible to the American people." In other words, big government would make businesses follow its master plan.
In The Road to Serfdom (1944), Friedrich Hayek explained the danger of that approach: "Economic control is not merely control of a section of human life which can be separated from the rest; it is the control of the means for all our ends.…Economic planning would involve direction of almost the whole of our life." And in fact, Jackson's stands on the issues add up to a vast expansion of the state's power over every American:
• He offers a program of "universal and comprehensive health care" with "cradle-to-grave protection." The yearly cost would start at $50 billion but would soar as more and more people took advantage of the program. The inevitable result: strict federal rationing of health services.
• He endorses the Harkin-Gephardt agriculture bill, which would place mandatory production controls on subsidized farm commodities and forbid farmers to sell more than their crop quota. Whoever failed to take part in the program would not be allowed to sell program crops. Even Mikhail Gorbachev might find this proposal a tad oppressive.
Jackson backs the bill, along with higher dairy price supports, because he wants farmers to get more money. It doesn't bother him that such programs would force everyone to spend more on food. "The reality is that if urban people are working," he says, "they can pay a few more pennies for a glass of milk or for a cone of ice cream or for a loaf of bread."
• He says that when "truckers are deregulated and when small banks are deregulated, and airlines are deregulated, and you cut the Housing and Urban Development budget back from $32 billion to less than $10 billion, a whole pocket of people are simply wiped out by a shift in our priorities." Apparently he would repeal deregulation—even though his rationale is goofy. Nearly every study shows that deregulation has given people greater choice at lower cost; for instance, discount prices for flights over 2,500 miles have dropped by 35 percent in real terms. Why would he make consumers pay more?
And his claim about the HUD budget is bogus: in constant 1982 dollars, outlays for fiscal 1988 are the same as for fiscal 1980—$15 billion. Besides, how much good does the money do? Ever visit a housing project? When Jesse speaks, a whole pocket of reality is simply wiped out by his mendacity.
• He wants a "code of conduct" for American business "to ensure that its investment decisions are made in the best interests of the community." That is, he would compel firms to put their money wherever the government told them to.
• He supports racial quotas in jobs and schooling. If you applied for an opening and your group were "overrepresented," Jackson would slam the door in your face.
• He backs "comparable worth," the notion that employers should be forced to pay men and women the same not just for equal work but for jobs of "comparable" value. To carry out such a policy, the government would have to judge how much money each job in America is worth. Never mind about such little things as collective bargaining or the marketplace: a huge new federal bureaucracy would decide your pay.
These proposals might strike some readers as unwise, costly, or restrictive. They're also dangerous: the greater the growth in government control, the greater the risk of overreaching and abuse.
That's especially true in Jackson's case. Extreme as his proposals may sound, they are probably only a moonshadow of what he would try to do if he had the power. Over the years, hints of a more radical agenda have slipped out. In the 1970s, he outlined his "kingdom theory" of economics, which says "the community" should make business management decisions:
"Control of the store means determining who will be clerks and bookkeepers, and who will have such monetary awarding (sic) jobs as butchers and meat managers within the store. By that same token, it is also to determine the amount of shelf space which the community feels should go to the products of black producers…which banks the store will transact its financial operations in; who conducts the collection services for the store.…At the same time, this principle applies to all other institutions in the community."
Jackson's apologists might claim that he was just proposing local boycotts of unfair businesses, not pushing Marxist nostrums for the whole country. His rhetoric, however, rings with the ideas of the Bearded One. He calls the plight of rural America "a new feudalism"; Part One of Marx's The German Ideology explains the central role of "feudalism" in Marx's theory of history. "Workers of the world must unite," Jackson says in his stump speech. Where have we heard that before?
His collectivist urges aren't tempered by traditional Western ideas of moderation and political restraint. At Stanford University last year, where students and instructors were protesting a course in Western civilization, he reportedly led demonstrators who chanted, "Hey, hey, ho, ho, Western culture's got to go."
That incident stirs bad memories. In The Road to Serfdom, Hayek recalled that many English and American students came back from the Continent in the 1930s unsure whether they were communists or Nazis "and certain only that they hated Western liberal civilization." Then as now, trashing the thoughts of Aristotle or Adam Smith clears the path for totalitarianism. And as Hayek taught, it doesn't matter whether the tyranny is leftist or rightist, because both brands amount to the same thing.
Jackson doesn't see that. While he passionately attacks South Africa, he smiles upon dictatorships that proclaim "national liberation." His 1984 tour of Latin tyrannies put his double-think on display. He visited indoctrination centers on Cuba's Isle of Youth and called them "creative." At the University of Havana, he cried, "Long live Castro. Long live Martin Luther King. Long live Che Guevara," thereby implying that communist thugs stood for the same things as an apostle of freedom and nonviolence. Later he flew to Managua, where he praised the communist government for putting Nicaragua "back on the road to democracy, peace, and reconciliation."
None of this means that Jackson aims to bring Cuban-style repression to the United States or name Daniel Ortega as secretary of Health and Human Services. Rather, it indicates his odd attitude toward power. Unlike the American Founders, who deemed no one virtuous enough to be trusted with unchecked political power, Jackson seems to approve of any government fiat as long as the motive is "progressive." He once said that if he were the mayor of a big city, he "would force the government to call out the National Guard to deal with existing injustices.…There's no reason why the Army couldn't come down the street with bayonets, looking for slumlords." Of course, Jackson was just talking big, not laying out an actual plan. But his willingness to mention martial law should trouble anyone who values civil liberty.
Says Adolph Reed, a Yale political scientist who criticizes Jackson from the left: "An irony of [Jackson's] political style, and the leadership model in which it is embedded, is that—while ostensibly popular and immediately representative—it is fundamentally antidemocratic." Although he has gone through the motions of the electoral process, he thinks that his real mandate comes from above. According to biographer Barbara Reynolds, he used to tell his underlings, "I am anointed, not appointed."
Like other leaders who considered themselves anointed, Jackson gains strength through demagoguery. Hayek described the strategy: "The contrast between the 'we' and the 'they,' the common fight against those outside the group [is] always employed by those who seek not merely support of a policy, but the unreserved allegiance of huge masses."
Jackson casts the issue of our time as big business against the common guy, "the barracudas versus the small fish." In a debate last October, he said: "Since 1973, we have lost 38 million jobs—multinational corporations, merging, been (sic) purging workers and submerging our economy.…These corporations are now flying beyond any accountability. They must be regulated and reinvest in America."
That quotation shows how a demagogue distorts the facts. The economy has not "submerged"—it's in the longest peacetime expansion in history. Corporations are hardly flying outside government's grasp, as any reader of the Federal Register can attest. And Jackson just plain lied about job losses: between 1973 and 1987, we had a net gain of 27 million jobs.
And yet Jackson's oratory still gets crowds to holler approval. Through his instinct for political symbols and his ability to tap popular resentments, he manages to overcome an audience's normal skepticism. Imagine what would happen if he could combine this power over minds with power over the U.S. federal government.
So let's go back to the original question: What would happen if he actually ran things? Even with defense cuts, his social programs would bloat the budget. "If you put all of it down on a piece of paper and you put a price tag on that piece of paper, there's no way we won't end up with a deficit that's not $200 billion but $300 billion," says Democratic consultant Ted Van Dyk.
Jackson has already proposed big tax increases. In the wake of a mounting deficit, he would propose more. And without doubt, a big recession would follow.
Jackson would not react to a slump with a sudden embrace of supply-side economics. If his public record offers any guide, he would seek an even larger role for government: more bureaucrats, more forms, more control of our lives. Normally, people would spurn a leader who took away so much of their liberty. But as a skillful demagogue, Jackson might find a way to direct their frustration toward a scapegoat: big business, or perhaps a foreign country.
All of us would have far less freedom, but some groups might stand at particular risk of discrimination. Jackson denies that he is anti-Semitic, and he has striven to improve his relations with the Jewish community. But anyone contemplating Jackson leadership must consider what he has said:
• "Once we began to move up, Jews who were willing to share decency were not willing to share power."
• "I'm sick and tired of hearing about the Holocaust."
• "Zionism is a poisonous weed."
• "The Jew began to be revealed as the landlord and shop owner."
• "You just can't trust the Jews. I never have trusted those people."
Nobody knows if American voters will ever trust the presidency to the man who said these things, but it is certain that he will wield great influence over his party for years to come. Already, other Democratic leaders have adopted his stands in one form or another. And in making their political calculations, they must constantly ask themselves what Jackson will think or say. The Democratic Party is well along the road to Jessedom.
John J. Pitney, Jr., is assistant professor of government at Claremont McKenna College.
Consider how he voted in the Senate. According to the Competitive Enterprise Institute, which analyzed 86 economic votes during the 99th Congress, Hart backed competition and free-market principles only 21 percent of the time—the eighth-lowest score in the Senate. Even Edward Kennedy scored better, with 25 percent.
That's not just the view of a free-market think tank. The centrist National Journal reckoned Hart's 1986 economic conservatism score at 20 percent.
Yet Hart appeals to many young pro-market voters, who apparently see him as a libidinous Adam Smith. One way he wins their support is by sprinkling speeches with passages such as this: "The problem with big government is the myth that it can solve every problem and meet every challenge. The problem of big government, frankly, is the demand placed on it by every interest group in our society." Gary Hart gives great sound-bite.
And grant the man some credit: He has taken a few sound positions. He voted against the Chrysler bailout, and he continues to fight protectionism. He says the Gephardt Amendment would draw the same blood as Smoot-Hawley: It "would fire the opening shot of a new trade war, and then draft American workers, farmers and businesses to do the heavy fighting. The difference is, in a military war, there are sometimes winners. In a trade war, everybody loses."
But fittingly, it was in a trade policy statement that his true philosophy slipped out. "We must make clear to the President that we do not subscribe to the laissez faire, 'let it happen,' economic theology," he said in the Congressional Record of October 15, 1986. "We believe in aidez faire—which means, literally, 'to make it happen.' We need a policy to help us master the forces of economic change and to use those forces to America's advantage."
With those few lines he said a lot about himself and his mindset:
1. He likes French.
2. He scorns free-market economics. (It's odd that a former divinity student should belittle a school of thought by calling it a "theology.")
3. He lusts for central planning. By "us," he did not mean "us individual consumers and producers"; he meant "us Washington policymakers." And a glance at a thesaurus points up some creepy connotations for the word master: "subdue," "quell," "crush," "reduce," "beat down," and "hold in bondage." No hidden hand for Gary.
4. He's one up on Joe Biden: he plagiarizes himself. His October 15, 1986, statement was nearly a word-for-word copy of one he had entered into the Record on August 8.
So what does he want to do to the economy? It's hard to tell from the title he has given to the collection of his most recent speeches, "Reform, Hope and the Human Factor: Ideas for National Restructuring." With that one title, he manages to invoke Robert LaFollette, Ronald Reagan, Graham Greene, and Mikhail Gorbachev.
The fullest account of Hart's vision lies in his 1983 manifesto, A New Democracy. The book called for an "industrial strategy" but denied that it would consist of government control or MITI-style planning. As disclaimers go, that's about as convincing as George Michael's assurance that his soft-porn music video "is not about casual sex."
Once the book got down to specifics, it said that the president should act as "the principal arbitrator" among business, labor, and government. Management would promise modernization and job security, workers would promise productivity increases, and Washington would seal the deal by targeting financial aid to deserving parties. Under Hart's plan, government would just assert the national interest, not aggrandize its own power. Members of Congress would not win aid for unworthy concerns in their constituencies. The president would not rig the process for political gain. In other words, angels would govern.
Two hundred years ago, James Madison taught that if men were angels, no government would be necessary. But like all statist visions, Hart's plan hinges on leaders of superhuman virtue. He vaguely recognized this in the book's introduction, where he lamented the "erosion of national character" and declared that America's test "is moral, not economic—ethical, not political."
During this campaign, Hart is pleading with us to skip the character issue and instead look at his ideas. Ironically, to take his ideas seriously is to demand a president who would never lie, cheat, or break a solemn vow.
John J. Pitney, Jr., is assistant professor of government at Claremont McKenna College.