One of the most important things economists can do in a pandemic is not forget what we know. We know that central planners don't have enough information and insight about the lives and activities of 330 million people to plan those lives in a thoughtful way. We know the problems that emerge when you distribute something valuable by giving it away. We know that government officials face bad incentives. We know that externalities pose problems for the straightforward "leave it to the market" viewpoint, but that large government interventions create new problems. In the rush to make pandemic policy, too many of these lessons were cast aside.
One of the most important controversies of the 20th century was the economic calculation debate. In his 1922 book, Socialism, Ludwig von Mises argued that without markets, central planners would not know how to "calculate." Specifically, they wouldn't know how many of various goods to produce, how to produce them, and whom to allocate them to. In the 1930s and 1940s, Mises' student Friedrich Hayek advanced the argument by noting the ways an economy depends on dispersed information that exists in the minds of millions of people. This information about individuals' "circumstances of time and place," he wrote, could not be captured by a central planner. Hayek's most famous contribution to the debate was his 1945 article "The Use of Knowledge in Society," published in the American Economic Review. That article led modern Hayekians to use the phrase "local knowledge" as a shorthand for Hayek's "circumstances of time and place."
By the end of the Cold War, most economists—even some socialists—were acknowledging that Mises and Hayek had won the debate: The Soviet planners had failed because they had embarked on a task that could not succeed.
But in the COVID-19 era, a lot of policy makers have let this lesson slip their minds. While few have advocated full-blown state socialism, many have forgotten the more general truth that officials don't have enough information to make detailed plans about people's lives.
Take Gavin Newsom, the first governor to impose a statewide lockdown. The California Democrat listed 16 infrastructure sectors deemed so essential that they would not have to lock down. Restaurants, hairdressers, gyms, and schools were not among them, so they were compelled to close, as were large swaths of the retail economy. But Newsom did not base these regulations on a sophisticated understanding of what is essential and what is not. He couldn't: No one has that understanding, for the reasons Hayek laid out long ago. The list of essential industries came from an old script; it was neither highly correlated with the relative value of various industries nor closely based on risks of spread.
What was missing from the discussion was something known only in the minds of the humans involved: the value of what was lost. Measuring the loss of gross domestic product (GDP) doesn't quite do it, because the private sector component of GDP is valued at market prices, but the value consumers put on goods and services typically exceeds the sticker price. (Economists refer to the value minus the price as consumer surplus.) Gatherings of more than a few people at funerals, for example, were prohibited; many mourners surely valued the gathering they had to miss at more than the ceremony's price.
Central planners tend to come up with one-size-fits-all policies even when the evidence shows a large range of "sizes." With the lockdowns, the most extreme instance of that may be the decision in various jurisdictions to close schools to in-person instruction. Even if, like me, you aren't a fan of government schools, they arguably create at least one large valuable service: day care. So shutting them down—while paying full, or close to full, salaries to public school teachers—took away one of the most valuable services the institutions provided, while shifting the costs onto parents.
Whether or not one ultimately agrees with it, one can understand the decision to close schools in March and April of last year. But as more data came out, it became increasingly clear that students age 15 or younger had only a tiny risk of dying from COVID-19. The latest data from the Centers for Disease Control and Prevention (CDC) show that from January 1, 2020, to February 17, 2021, only 140 U.S. residents under age 15 died from the disease—just 0.03 percent of overall COVID-19 deaths. During the 2019–20 flu season, according to the CDC, about 434 U.S. children under age 18 died from the flu. Yet no one advocated closing schools over that.
Of course, there is the risk of transmission from children to teachers. But teachers in Sweden, which avoided school shutdowns, had a slightly lower fatality rate than I.T. technicians. That comparison is relevant because many I.T. technicians can and do work from home, and probably did so increasingly after the worry about the coronavirus became widespread.
What about the risk of transmission from school children to their parents or other family members? If you've paid attention to recent protests in California and elsewhere, you'll realize that many families are eager to take that risk.
Moreover, the comparison between private schools and public schools is telling. In a January 2021 article in Axios, Erica Pandey notes that only 5 percent of private schools were "virtual" last fall, with presumably 95 percent being in person. That contrasts with the 62 percent of students in public schools who started school in the fall online. The difference is wonderful for the private school kids and tragic for many of those in public schools, but those with local knowledge and local control were most able to get kids back into classrooms.
The pathologies of central planning also played out with the COVID-19 vaccines. In early May, four economists—Susan Athey of Stanford, Michael Kremer of Harvard, Christopher Snyder of Dartmouth, and Alex Tabarrok of George Mason—wrote an op-ed in The New York Times titled "In the Race for a Coronavirus Vaccine, We Must Go Big. Really, Really Big." It advocated two major forms of federal spending for a vaccine. One, which they called a "pull incentive," was a commitment to buy 300 million courses of vaccine at a price of $100 per person. The second, which they called a "push incentive," was a guarantee of "partial reimbursement for production capacity built or repurposed at risk and partial reimbursement as they achieve milestones."
The authors didn't discuss how the vaccine should be distributed once the federal government paid for it. Presumably they wouldn't favor letting the drug companies sell a vaccine to the public after being paid by the federal government; that would have created an uproar. By not mentioning distribution at all, they probably left most readers assuming they wanted the federal government to distribute the vaccine.
The good news is that the feds are not distributing the vaccine. The bad news is that state governments are deciding who gets it. Furthermore, no one pays for it, so we lack a price system. Without prices, there are huge lines to get vaccinated; people who barely value the vaccine sometimes get it ahead of people who value it a lot; and the incentive for those administering the vaccine to do so quickly is lower than it would be in a free market.
Could it have been different? Yes. On January 11, 2020, Yong-Zhen Zhang of China distributed the virus's genetic code; two days later, the Moderna lab in Massachusetts formulated one of the vaccines now being used. That was more than three months before the economists' article in The New York Times. So even without federal subsidies, Moderna would have been ready to sell the vaccine by the time it actually did so in late December. Then it could have distributed it to front-line health care workers and older people in nursing homes for, say, $20 a pop and sold doses to a lot of the rest of us for more. If the prices were too high for some people to afford, the government could have helped low-income Americans pay for the vaccine without involving itself in the distribution process.
The Food and Drug Administration's power to say no to drugs and vaccines that it thinks haven't been sufficiently tested for safety and/or efficacy is also a form of central planning. Without that power, we could have had the vaccine earlier. Even if the agency's power to "just say no" to drugs were restricted to its pre-1962 powers, when it could insist on safety but not on efficacy, we would have had the vaccine months earlier. That would have saved tens of thousands of lives.
Another fundamental insight from economics—one that is arguably the basis of almost every other economic insight—is that incentives matter. Decision makers whose rewards are closely tied to the value their decisions create will tend to make decisions that create, or at least allow, a lot of value. They will sometimes fail, but they will try hard to make good decisions.
On the other hand, decision makers whose rewards are unrelated to the value of the decisions they make will make good decisions much less often. Even more perversely, if their decisions benefit only narrowly defined interest groups, such as government workers, their decisions might well destroy value.
Does this sound familiar? Consider Anthony Fauci's guidance to the American public early in the pandemic, from his perch on the White House Coronavirus Task Force, that there was no need to wear a mask. Later, Fauci conceded that this wasn't true; he had said otherwise, he maintained, to ensure an ample supply of masks for health care workers who needed them more than the average American. He didn't seem to take account of the damage that would do to the federal government's credibility.
Credibility is particularly important during a fast-moving pandemic. But Fauci would be paid the same $417,608 annually no matter what he said.
Moreover, the issue of incentives is relevant to the school opening issue discussed above. Private schools depend on tuitions and gifts to stay in business. Public schools, by contrast, get funded whether they teach in person or on Zoom. And public school teachers are typically paid full salaries even if they teach from home.
The strongest argument for lockdowns is that when one person passes the virus on to another, he creates a "negative externality"—economists' term for a cost that someone imposes on others that wasn't accounted for when the person decided to act. The classic example is air pollution from a factory. The plant spews smoke into the air; it makes its way to tens of thousands of lungs; and, absent liability, fines, or agreements, the factory owner has little or no incentive to care.
If the factory's smoke entered the lungs of workers voluntarily laboring at the factory, though, that would not be classified as an externality. Whatever the lowest wage those workers would have accepted in the absence of the smoke, the wage they demand with the smoke will be somewhat higher. The factory owner can then decide whether it's cheaper to reduce the smoke or to continue as usual and pay the higher wages.
Why do I note that caveat? Because what is sometimes described as a COVID-19 externality is not necessarily one.
Consider a customer going to a bar and knowing that there's a risk he could get infected. He has an incentive to take account of that risk. The other customers in the same position have an incentive to take account of the risk. The owner, knowing that some customers will be worried, has an incentive to take account of the risk. Private property helps "internalize" the externality.
It doesn't fully internalize it, because there are many bars, many restaurants, many gyms. So people leaving the first bar might spread the virus elsewhere, imposing a cost on patrons and workers at those other bars, restaurants, and gyms. But much of it is internalized.
In a recent paper, the George Mason economists Peter T. Leeson and Louis Rouanet note another way COVID-19 externalities differ from pollution. The polluter typically doesn't worry about the pollution that blows downstream. Many people, by contrast, do worry about being in contact with others and either infecting or getting infected by them. One need only look at the huge voluntary changes people made in their lives to see how important this factor is. Well before the first lockdown, Americans canceled trips and conferences and quit going to indoor restaurants and bars. On a smaller scale, people going about their business in town wear masks, and if they don't—and sometimes even if they do—they try to maintain a decent space between themselves and others, especially if those others are unmasked.
To the extent there is an externality, we should also remember a point made by Nobel-winning economist Ronald Coase: The person who suffers from pollution downwind from a factory would not suffer if he weren't there. That observation has led economists in Coase's tradition to the concept of "least-cost avoider." Economists tend to focus on the efficient outcome, and the efficient outcome requires looking at who has the lower cost of reducing or eliminating the externality. When people live near an airport, for example, the cheaper solution might be to have airplanes produce less noise. But it might instead be for homeowners to install double-pane or triple-pane windows.
In this pandemic, governments have chosen to prevent a huge number of interactions among people who are at low risk of suffering from the disease. Given that the risk of death from COVID-19 for older people with comorbidities is orders of magnitude higher than the risk for the general population, the lower-cost solution would probably have been for the elderly to isolate themselves.
That has a lower cost for two reasons. One is sheer numbers: It's easier for 40 million people to isolate than for more than 300 million people to isolate. The other reason is that elderly people with comorbidities are more likely to be retired or to work from home. So their loss from staying in their homes is low.
Moreover, it wouldn't necessarily require a mandate that the elderly isolate. They could do so if they wish, and most probably would choose to do so. But governments should not insist—as New York, New Jersey, and Pennsylvania did—that nursing homes readmit people who test positive for the virus.
Just as even paranoids can have real enemies, even optimists can have real grounds for hope.
I think almost all of us were surprised at how quickly most governors and many mayors moved to close down major sectors of the economy. This was a really large attack on economic freedom, the largest in my lifetime, and it happened within days. In most cases, executives did it with zero consent from legislatures. They used existing law to the limit and, some legal scholars say, beyond the limit. I doubt those officials typically thought in March 2020 that we would still be locked down in January 2021. But the lockdowns took on a life of their own.
Recall, though, an earlier anti-liberty episode that was not nearly as shocking as the lockdowns. In 2005's Kelo v. New London, the U.S. Supreme Court gave its blessing to a city government's use of eminent domain to expropriate property from homeowners and transfer it to a private entity, the New London Development Corporation. This sent shockwaves through the country. The Institute for Justice, which represented the losing side before the Supreme Court, has noted that the decision "sparked a nation-wide backlash against eminent domain abuse, leading eight state supreme courts and 43 state legislatures to strengthen protections for property rights."
Could we see a similar response to the lockdowns? Already there have been some moves at the state level to limit governors' lockdown powers. A bill that passed both the House and the Senate in Ohio would have limited the Ohio Department of Health's power to quarantine and isolate people, restricting it to only those who were directly exposed to COVID-19 or diagnosed with the disease. Similarly, in Michigan, the Senate and House passed a bill to repeal a 1945 law that Gov. Gretchen Whitmer had used to impose the state's rather extreme lockdowns. Both bills were vetoed, but I doubt that will be the end of the story.
Even if it doesn't happen until this particular pandemic is over, there's good reason to believe that some state legislatures will want a say in future decisions. Whatever the case for letting governors move so quickly early last year, that case gets weaker and weaker the longer the lockdowns last. At some point, legislators just might roll back those powers. Or so we can hope.
"The issue of wealth and income inequality, to my mind, is the greatest moral issue of our time," said presidential candidate Sen. Bernie Sanders (I–Vt.). Former Secretary of Labor Robert Reich claims that "great wealth amassed at the top" will cause us to lose democracy. To fight economic inequality, presidential candidate Sen. Elizabeth Warren (D–Mass.) is calling for a 2 percent annual tax on household net worth between $50 million and $1 billion and a 3 percent tax on net worth above $1 billion.
Language like that and proposals like Warren's make increasing inequality sound like a crisis. But they misread the situation and misdiagnose the underlying problems.
On a global scale, inequality is declining. While it has increased within the United States, it has not grown nearly as much as people often claim. The American poor and middle class have been gaining ground, and the much-touted disappearance of the middle class has happened mainly because the ranks of the people above the middle class have swollen. And while substantially raising tax rates on higher-income people is often touted as a fix for inequality, it would probably hurt lower-income people as well as the wealthy. The same goes for a tax on wealth.
Most important: Not all income inequality is bad. Inequality emerges in more than one way, some of it justifiable, some of it not. Most of what is framed as a problem of inequality is better conceived as either a problem of poverty or a problem of unjustly acquired wealth.
First, though, let's look at how much inequality there is. The Congressional Budget Office (CBO) produced a report in November 2018 on the growth of household income in each of five quintiles. Between 1979 and 2015, average real income for people in the top fifth of the population rose by 101 percent, while it rose for people in the bottom quintile by "only" 32 percent. For the middle three quintiles, average real income increased by 32 percent as well.
Or at least those are the numbers if you ignore the effects of taxes and direct government transfers. But you really shouldn't leave those out: If you're debating whether to increase taxes on the rich and transfers to the poor, it seems important to take into account the taxation and safety net already in place. Once the CBO researchers subtracted taxes and added welfare, Social Security, and so on, the picture changed dramatically for the lowest quintile: Income rose by 79 percent. (For the middle three quintiles, it increased by 46 percent. For the highest quintile, it went up by 103 percent—slightly more than before, probably thanks to Ronald Reagan's and George W. Bush's tax cuts.)
The above data on real income growth actually understate the growth of income for each quintile. When the CBO compares incomes over time, it measures inflation using the Consumer Price Index (CPI). But many economists have concluded that the CPI overstates inflation by not sufficiently adjusting for new products, improvements in quality, changes in the mix of goods and services purchased, and shifts in where consumers buy their goods. (That last factor is sometimes called "the Walmart effect," but the term is arguably dated. Maybe we should call it "the Amazon effect" instead.)
Stanford economist Michael Boskin estimates that the CPI overstates inflation by 0.8 to 0.9 percentage points a year. That's small for any given year, but over time it doesn't just add up—it compounds up. If you go with the conservative estimate of 0.8 percentage points and adjust the CBO's after-tax, after-transfer data accordingly, the top quintile's average real income between 1979 and 2015 increased by 168 percent and the bottom quintile's average real income increased by 136 percent.
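The compounding is easy to check for yourself. Here is a minimal sketch in Python, assuming Boskin's conservative 0.8-percentage-point annual overstatement and the CBO's after-tax, after-transfer growth figures as inputs; the 35-year adjustment window is my assumption, and the results land within a point of the figures cited above (small gaps come down to rounding in the underlying data):

```python
# Adjust measured real-income growth for an overstated price index.
# Assumptions: 0.8 pp/year CPI overstatement (Boskin's low estimate),
# 35 annual adjustments over the 1979-2015 window.
YEARS = 35

def adjusted_growth(measured_growth_pct, overstatement_pp=0.8, years=YEARS):
    """Growth under a corrected index that rises 0.8 points less per year.

    An overstated deflator understates real income by a compounding
    factor of (1.008)^years, so true growth is the measured growth
    factor times that correction.
    """
    measured_factor = 1 + measured_growth_pct / 100
    correction = (1 + overstatement_pp / 100) ** years
    return (measured_factor * correction - 1) * 100

print(round(adjusted_growth(79)))   # bottom quintile: roughly 137
print(round(adjusted_growth(103)))  # top quintile: roughly 168
```

Note how the correction works: 0.8 points a year seems trivial, but (1.008)^35 is about 1.32, which lifts every quintile's measured growth by nearly a third.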
That's still an increase in income inequality, of course. But it's not an inequality increase in which the poor and near-poor are worse off. They're much better off. Everyone is.
And those numbers don't do complete justice to how much better off we are. Donald J. Boudreaux, an economist at George Mason University, has compared the prices of items you could have bought from a Sears catalog in 1975 with prices for similar items in 2006. He shows that with the average wage in 2006, you would have to spend far less time working to earn enough to buy the items than you would have had to spend in 1975. Moreover, he notes, the 2006 items are almost always of much higher quality. Who wants a 1975 TV? In 2010, my local Goodwill wouldn't even accept a working 1999 TV. And those awful primitive cellphones everyone had in 1975? Oh, wait.
I asked Boudreaux to update his data to 2019. Since 2013, he told me, the "time cost" of his chosen goods has fallen by another 30 percent.
I should note that while most consumer goods have been getting cheaper, education, housing, and health care have become more expensive. Interestingly, these are all areas in which governments have had a substantial influence on prices. In education, state and local governments have almost a monopoly; in housing, governments on the West Coast and in the Northeast have so restricted new construction that supply has not kept up with demand, causing prices to explode; and in health care, extensive regulation and subsidization have driven up the cost, though not always the price, of health care. (The difference is that the price to the consumer is often low because insurance and government subsidies hide the true cost, which is often high.)
On a global level, meanwhile, inequality is declining—and it's likely to fall further.
Economists measure inequality with something called the Gini coefficient. A coefficient of 100 would mean that one person gets all the income while everyone else gets nothing; a coefficient of zero would mean complete equality. In a 2015 study published by the Peterson Institute for International Economics, Tomas Hellebrandt of the Bank of England and Paolo Mauro of the International Monetary Fund tracked the global Gini coefficient from 2003 to 2013. During that time it fell from 69 to 65, thanks to rapid economic growth in lower-income countries—not just India and China but also sub-Saharan Africa. Hellebrandt and Mauro project that by 2035 it will have declined to 61.
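For readers who want to see the measure in action, here is a minimal sketch of a Gini calculation on the 0–100 scale used above, via the mean-absolute-difference definition. The toy incomes are my own invention, not the Hellebrandt–Mauro data:

```python
# Gini coefficient on a 0-100 scale: the average absolute gap between
# every pair of incomes, normalized by twice the mean income.
def gini(incomes):
    n = len(incomes)
    mean = sum(incomes) / n
    diff_sum = sum(abs(x - y) for x in incomes for y in incomes)
    return 100 * diff_sum / (2 * n * n * mean)

print(gini([1, 1, 1, 1]))    # complete equality -> 0.0
print(gini([0, 0, 0, 100]))  # one person gets everything -> 75.0
```

One quirk worth noting: with a finite sample, "one person gets everything" tops out at 100 × (n − 1)/n rather than exactly 100, which is why the four-person example yields 75 rather than 100; the theoretical maximum of 100 is reached only in the limit of a large population.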
Often when we look at income inequality, we do it by comparing income "quintiles." That is, we ask how much better or worse the richest fifth of the population did over a span of time vs. the second-richest, the middle, the second-poorest, and the poorest fifths. But it's important to keep in mind that there is substantial mobility from one quintile to another, even over just a few years. In a 2015 report for the Census Bureau, Carmen DeNavas-Walt and Bernadette D. Proctor concluded that "57.1 percent of households remained in the same income quintile between 2009 and 2012, while the remaining 42.9 percent of households experienced either an upward or [a] downward movement across the income distribution."
That's important to remember when considering the frequently stated worry that the middle class is disappearing. The middle class is getting smaller—but it's disappearing, for the most part, because it's moving up.
Now, it matters how we define the middle class. If the middle class is defined as the middle three income quintiles, then in 2018 it consisted of households with income between $25,600 and $130,000. In 1967, the middle three quintiles had income ranging from $19,726 to $54,596 (in 2018 dollars). The people in the middle, in other words, are considerably richer than their counterparts a half century ago.
Of course, defining the middle class that way means that exactly 60 percent of households will always qualify. That seems too broad. American Enterprise Institute economist Mark Perry, on his blog Carpe Diem, defines the middle class more narrowly to include any household with an income, in 2018 dollars, of between $35,000 and $100,000. In 1967, he notes, 54 percent of households were in that category; by 2018, that was down to 42 percent. That wasn't because they slipped; it was because they rose. In 1967, only 9.7 percent of U.S. households had income of $100,000 or more (in 2018 dollars). By 2018, that percentage had more than tripled to 30.4 percent.
And remember that this calculation adjusts for inflation using the CPI, and the CPI overstates inflation. So in some ways, the improvements are even greater than Perry's data suggest. On the other hand, the numbers arguably overstate the progress for people who live in coastal California, other urban parts of the West Coast, and the coastal northeastern United States, where the cost of housing has skyrocketed thanks to barriers erected by local and state governments.
On a related note: It's important to distinguish the concepts of inequality and poverty. The distinction seems obvious, yet even some economists confuse the two. In 2015, for example, University of Oregon economist Mark Thoma, author of the popular Economist's View blog, wrote: "Recent research…from UCLA's Fielding School of Public Health provides evidence that income inequality is associated with inequality in health. In particular, lower income is associated with 'high levels of stress, exhaustion, cardiovascular disease, lower life expectancy and obesity.'"
Notice that Thoma subtly jumps from "income inequality" to "lower income." Absolute real income could plausibly be connected to health, but that is a separate question from how well off someone is relative to others. It's hard to believe that if group A's real income increases by a large percentage but group B's increases by an even larger percentage, group A's health would worsen.
One reason so many people worry about income inequality is that they believe the share of income that accrues to the top 1 percent has increased dramatically. This became a hot issue during Bill Clinton's run for the presidency in 1992, and President Barack Obama harped on it repeatedly during his years in office. French economist Thomas Piketty's influential 2014 bestseller, Capital in the Twenty-First Century, also put a lot of emphasis on this assertion.
Piketty and the Berkeley economist Emmanuel Saez have estimated that, for the United States between 1979 and 2015, the top 1 percent's share of pretax income (including capital gains) increased from 9.0 percent to 20.3 percent. But in a 2018 paper, economists Gerald Auten of the U.S. Treasury Department and David Splinter of the congressional Joint Committee on Taxation came to a very different conclusion. To get a better measure of income, they accounted for all of national income, including unreported income, retirement income missing from tax returns, and income due to changes in the tax base that resulted from the Tax Reform Act of 1986. Their conclusion: Between 1979 and 2015, the share of pretax income going to the top 1 percent rose from 9.5 percent to 14.2 percent, and the share of after-tax, after-transfer income rose even less, from 7.2 percent to 8.5 percent.
Who's right? Piketty and Saez have moved in Auten and Splinter's direction with a more complete accounting for income. Their new approach found that the top 1 percent's share rose from 9.1 percent in 1979 to 15.7 percent in 2014. Half of the remaining difference between Piketty/Saez and Auten/Splinter is due to how the pairs handle underreported business income. Piketty and Saez assume that underreported income is proportional to reported business income, whereas Auten and Splinter assume that lower-income business owners disproportionately underreport. Auten has been at Treasury since 1987, which makes me inclined to trust his instincts here, but you can decide for yourself.
In any case, one thing that is clear from the data is that the higher your income, the higher the percent of your income you pay to Washington. In 2015, according to a recent study from the Tax Foundation, people in the lowest quintile paid 1.5 percent of their income in federal taxes, on average; the second quintile paid 9.2 percent; the middle quintile, 14.0 percent; the fourth quintile, 17.9 percent; and the highest quintile, 26.7 percent. Those in the top 1 percent paid a whopping 33.3 percent. This includes all federal taxes: income taxes, taxes for Social Security and Medicare, corporate income taxes, and excise taxes.
That means that whenever there is a large federal tax cut, those in the top quintile will almost certainly get a much bigger benefit, both in dollars and as a percentage of their income, than other quintiles. This is especially true when most or all of the cut is in the individual income tax, because that tax is disproportionately paid by higher-income people. But do they get a bigger cut as a percentage of their federal tax burden? For the George W. Bush 2001, 2002, and 2003 tax cuts and the Donald Trump 2017 tax cuts, the answer has been no.
Because of Bush's tax cuts, people in the second-lowest quintile in 2004 saw a 17.6 percent cut in their income taxes, the biggest percentage tax cut of any quintile. The middle quintile's cut was 12.6 percent, the second-highest quintile's cut was 9.9 percent, and the highest quintile's cut was slightly more than 11 percent.
Similarly, the 2017 Trump cuts reduced taxes most, percentage-wise, for the second-lowest quintile, cutting their taxes by 10.3 percent. The middle quintile got an 8.7 percent cut; the second-highest quintile, a 7.5 percent cut; the top quintile, a 6.7 percent cut; and the top 1 percent, a 4.6 percent cut. The lowest quintile had its taxes cut by 7.3 percent—it's hard to cut taxes for a quintile whose members mostly don't pay them.
Many people who worry about income inequality want to tax higher-income people more. Given what economists know about the harmful effects from raising already high marginal tax rates even higher, tax increases could certainly reduce measured inequality—because they would cause higher-income people to reduce their taxable income by working less, by taking more pay in the form of untaxed fringe benefits, or by investing more in municipal bonds, whose interest is not taxable by the feds. Of course, none of this would make lower-income people better off. Indeed, to the extent that higher taxes discourage capital accumulation, they slow the growth of worker productivity. One of the main ways to increase worker productivity is to increase the amount of capital per worker. With a slower growth rate of capital, worker productivity will grow more slowly—and so will real wages. This makes lower-income people worse off than they would have been.
Piketty recognizes that higher tax rates won't yield much additional tax revenue. In his 2014 magnum opus, he wrote that when a government "taxes a certain level of income or inheritance at a rate of 70 or 80 percent, the primary goal is obviously not to raise additional revenue (because these high brackets never yield much)." Instead, he argued, the goal is to "put an end to such incomes and large estates."
Because of limited space, I have focused on income inequality rather than wealth inequality. I will point out, though, that a tax on wealth—proposed both by Piketty and by Warren—would reduce the incentive to invest in capital, decreasing worker productivity and, therefore, workers' wages.
When is a growth in inequality justified, and when is it not? Consider two opposite cases: an innovator and a seeker of privilege.
In 1949, Robert McCulloch introduced the 3-25, a one-man chainsaw weighing only 25 pounds. This revolutionized forestry. A friend of mine, now in his late 80s, told me that when he was a teenager, his father made him cut wood for a whole winter of heating a large house. When my friend found out about the 3-25, he used his own allowance to buy one. It changed his life.
McCulloch made a lot of money with his chainsaw. But everyone who bought a 3-25 wanted it. It's likely that almost all of them got a large benefit from the purchase. McCulloch got richer, and so did his customers, who were able to save huge amounts of time and effort. Eventually, competitors produced their own chainsaws to compete with McCulloch's—products that were better, cheaper, or both.
That increased the benefits to consumers while reducing the profits to McCulloch. Still, he made a lot—enough that his innovation almost certainly increased income inequality, by raising McCulloch's income far above most other people's.
Now consider a story in which someone used political power to make himself and his wife very wealthy. In 1942, a young congressman from Texas had a net worth of approximately zero. But by 1963, when he became president of the United States, Lyndon Johnson and his wife had a net worth of about $20 million, a large part of which could be attributed to a license from the Federal Communications Commission (FCC) to operate the radio station KTBC in Austin, Texas.
During the 1964 presidential campaign, Johnson claimed that his wife had turned an asset she bought for $17,500 into a property worth millions by working hard. Not quite. Lyndon had worked hard—at using his political influence as a congressman. Before his wife acquired it, KTBC's owners had spent years trying to get the FCC's permission to sell the station. On January 3, 1943, Lady Bird Johnson filed her application to buy it, and just 24 days later, the owners were suddenly allowed to sell. That June, the future first lady applied for permission to operate for more hours a day and at a much better part of the AM band. She received permission a month later.
While all this was happening, the FCC was under attack by a powerful congressman, Eugene Cox, who wanted to cut the FCC's budget to zero. Lyndon Johnson strategized secretly with an FCC official named Red James and used his influence with House Speaker Sam Rayburn to deflect the attack. James later admitted that he had recommended to Lady Bird that she apply for the license.
Over the subsequent decades, the FCC didn't just clear an easy path when her radio station (and, later, a television station as well) needed an application approved. When a competitor wanted to make a move, the agency would put regulatory barriers in its way. In this manner, Lady Bird's company came to dominate Austin broadcasting.
So here we have two examples of income and wealth inequality increasing. In the first case, inequality increased because a man's company introduced a product that made the lives of those who bought it substantially better, raising their real incomes. In the second case, inequality increased because a politician used his influence to get monopolistic privileges from a federal agency, making the politician wealthier and lowering the real incomes of people in the Austin area.
There are at least two reasons to think differently about these two cases. The first is that McCulloch's actions improved others' well-being in addition to his own, while the Johnsons' actions benefited themselves at the expense of others. The second is why McCulloch's actions had those benevolent social effects and Johnson's didn't. McCulloch's wealth—and the benefits to his customers—were rooted in voluntary transactions in the marketplace. The Johnsons' wealth was rooted in raw political power.
There is a third possible source of wealth for the very rich: inheritance. Some people inherit wealth from fortunes like McCulloch's, and some inherit it from fortunes like the Johnsons'. The important question, as far as I'm concerned, is how the money is initially acquired. But in any event, the long-term importance of inheritance in American inequality is overstated. In the May 2013 American Economic Review, Steven Kaplan of the University of Chicago and Joshua Rauh of Stanford analyzed the fortunes of the superwealthy. They found that only 32 percent of people on the Forbes 400 list in 2011 had come from very rich families—down from 60 percent in 1982. Moreover, 69 percent of the 400 had started their own business. In short, a majority of the 400 earned their fortunes rather than inheriting them. Maybe they made it the McCulloch way, maybe they made it the Johnson way, or maybe it was a mix. But they didn't simply rely on their parents.
The word inequality sparks thoughts of the very rich and the very poor. But data on the degree of inequality tell us nothing about the degree of poverty or the lives of the poor. Inequality can grow even while the poor and almost everyone else are becoming better off. Indeed, in the last half-century, while U.S. income inequality grew, the poor and the middle class became substantially better off. And the even better news is that global income inequality has fallen and is likely to fall even further.
Great wealth, meanwhile, is a problem only to the extent that it is unjustly extracted. Government favoritism to politically powerful people may increase income and wealth inequality, as it did in the case of Lyndon Johnson and his wife. But it is the government favoritism, not inequality per se, that is the true problem.
The Important Role of Work
Many people, including many economists, who worry about income inequality overlook the role of work. You might imagine that the rich are generally idle while the poor work their fingers to the bone. But that's not the full picture.
The Department of Commerce's Census Bureau gathers data annually on characteristics of the five income quintiles. One thing that changes very little from year to year is the number of workers per household.
The data for 2018 are no exception. Each quintile that year was made up of 25.7 million households. For the lowest quintile, 16.2 million households had no one working at all during 2018. For the highest quintile, by contrast, only 1.1 million households had no one working. How many households in the bottom quintile had two earners? Only 1.1 million, or 4.3 percent of the total number of households. For the top quintile, 14.1 million households, or 54.7 percent of the total, had two earners. Not surprisingly, therefore, the average number of workers per household in the bottom quintile was 0.4, while the average number for households in the top quintile was 2.1.
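For readers who want to check the arithmetic, the percentages above are simple ratios of the Census household counts. A quick back-of-envelope sketch (using only the figures quoted above; this is illustrative, not the Census Bureau's own calculation):

```python
# Back-of-envelope check of the 2018 Census quintile figures quoted above.
# All counts are in millions of households; each quintile holds 25.7 million.
HOUSEHOLDS_PER_QUINTILE = 25.7

bottom_two_earner = 1.1   # bottom-quintile households with two earners
top_two_earner = 14.1     # top-quintile households with two earners

bottom_share = 100 * bottom_two_earner / HOUSEHOLDS_PER_QUINTILE
top_share = 100 * top_two_earner / HOUSEHOLDS_PER_QUINTILE

print(round(bottom_share, 1))  # 4.3 percent, matching the text
print(round(top_share, 1))     # 54.9 percent; the article's 54.7 presumably
                               # reflects unrounded Census counts
```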
The lesson seems clear for people who want to avoid being in the bottom quintile: Try your hardest to get a job year-round. Fortunately, that is relatively easy in the current economy, with its less than 4 percent unemployment rate. To further improve your chances, get married to—or just live with—someone else who works year-round as well.
The post The Truth About Income Inequality appeared first on Reason.com.
On February 10, 2016, the Indiana-based furnace company Carrier announced that it would close two factories in the U.S.—one in Indianapolis and one in Huntington—and shift production to Mexico. Just three days later, presidential longshot Donald Trump bragged on Twitter: "I am the only one who can fix this. Very sad. Will not happen under my watch!" Then on April 20, just 13 days before the May 3 Indiana primary, he promised to impose stiff taxes on companies that moved jobs out of the country, singling out Carrier as part of the problem. On July 12, he attacked Carrier again.
On November 8, of course, Trump won the presidential election. Less than three weeks later, on Thanksgiving, he tweeted that he was "working hard" to keep the plant in the United States. On November 29, Carrier announced that it had reached an agreement with Trump to keep about 1,000 jobs in Indianapolis.
"Donald Trump's Carrier Deal is Pure Crony Capitalism," read the headline of a Newsweek op-ed by American Enterprise Institute fellow Claude Barfield. Trump supporter Sarah Palin denounced the deal as "crony capitalism," too, insisting in a op-ed of her own that "intrusion using a stick or carrot to bribe or force one individual business to do what politicians insist…isn't the answer." Certainly they had a basis for that reaction, especially after Vice President–elect Mike Pence's statement that "the free market has been sorting it out and America's been losing."
But a closer look reveals that Trump is up to something a little different, and potentially more damaging. His actions will almost certainly lead to more cronyism than we have now. But his behavior in the Carrier case looks more like President John F. Kennedy's treatment of U.S. Steel in the early 1960s and President Barack Obama's treatment of General Motors and Chrysler bondholders in 2009. And it has disturbing implications both for our economic well-being and for our freedom.
Bribe
The day before Trump's triumphant press conference, Carrier gave two reasons for its decision: first, that "the incoming Trump-Pence administration has emphasized to us its commitment to support the business community and create an improved, more competitive U.S. business climate," and second, that the "incentives offered by the state [government] were an important consideration." Carrier did not specify which incentives it had in mind.
By December 2, one of the incentives was publicized: a $7 million tax credit over 10 years from Indiana's state government. That may sound like a lot, but spread over 10 years and 1,000 jobs it comes to just $700 per job per year, hardly enough on its own to keep Carrier from moving. Companies don't relocate factories on a whim; the firm would have expected significant and lasting gains from the move. In fact, according to a Washington Post story, Carrier had claimed that its move to Mexico would have saved $65 million per year, two orders of magnitude more than the annual value of the tax credit. Moreover, Indiana's governor, Vice President–elect Mike Pence, claimed that United Technologies, Carrier's parent company, had turned down a similar package of tax incentives in March.
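The "two orders of magnitude" comparison is easy to verify. A quick sketch, using only the figures quoted above:

```python
# Back-of-envelope comparison of Indiana's tax credit with Carrier's
# claimed savings from moving production to Mexico (figures as reported).
credit_total = 7_000_000      # $7 million tax credit, total
years = 10
jobs = 1_000

credit_per_job_per_year = credit_total / years / jobs
print(credit_per_job_per_year)          # 700.0 dollars per job per year

annual_credit = credit_total / years    # $0.7 million per year
claimed_savings = 65_000_000            # $65 million per year, per the Post

ratio = claimed_savings / annual_credit
print(round(ratio))                     # ~93x: roughly two orders of magnitude
```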
"I was born at night, but it wasn't last night," United Technologies' CEO said. "I also know that about 10 percent of our revenue comes from the U.S. government."
That suggests that other factors were involved. One prime candidate would be a threat to go after United Technologies' military contracts. Of the parent company's $56 billion in annual sales, about 10 percent is to the military. That's about $5.6 billion. Again, just a smidge of basic math: If Trump's retaliation were to cause United Technologies to lose 15 percent of these sales, or $840 million, and its profit were, say, 8 percent of sales, that's an annual loss of $67 million. Compare that to the $65 million in expected gains from moving to Mexico, and we can start to see one good reason for Carrier to stay.
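The arithmetic behind that hypothetical is worth laying out. The 15 percent loss of military sales and the 8 percent profit margin are the illustrative assumptions stated above, not reported figures:

```python
# The article's hypothetical: what a partial loss of military sales could
# cost United Technologies, versus the gain from moving to Mexico.
# The loss fraction and profit margin are illustrative assumptions.
annual_sales = 56e9          # $56 billion in total annual sales
military_share = 0.10        # about 10 percent goes to the U.S. government
loss_fraction = 0.15         # assumed share of military sales lost
profit_margin = 0.08         # assumed profit as a share of sales

military_sales = annual_sales * military_share       # $5.6 billion
lost_sales = military_sales * loss_fraction          # $840 million
lost_profit = lost_sales * profit_margin             # ~$67 million per year

mexico_savings = 65e6                                # $65 million per year
print(lost_profit > mexico_savings)  # True: the threat outweighs the move
```

Under these assumptions, the annual profit at risk slightly exceeds the annual savings from relocating, which is all the hypothetical needs to show.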
Of course, this suggestion of a threat on military contracts is hypothetical. I wasn't there when the deals were made. But it's plausible.
What is not hypothetical is United Technologies CEO Greg Hayes' comment to Mad Money host Jim Cramer: "I was born at night, but it wasn't last night. I also know that about 10 percent of our revenue comes from the U.S. government." Also not hypothetical is the language that Trump used in a post-election phone call to Hayes. According to Trump, after Hayes told him that the company had already built the new facility in Mexico, Trump said, "Rent it. Sell it or knock it down. I don't care."
The president-elect could be lying about how aggressive he was, or overstating his willingness to see economic assets of American companies destroyed for political ends. But if there's one thing that seems to be a constant with our incoming president, it is Mr. You're Fired's willingness to threaten people.
Bully
It's tempting, looking in from the outside, to see the Carrier deal as classic cronyism. But cronyism is generally thought of as involving companies going to the government for special favors. Those favors could take the form of subsidies, as when Barack Obama's administration gave hundreds of millions of dollars to a failing solar firm in California. They could take the form of protection from foreign competition, as when President George W. Bush in 2002 imposed a 30 percent tariff on imported tin mill steel. Or they could be a regulation that hurts the favored firms' competitors, as with New York state's rules designed to benefit hotels by hobbling Airbnb.
None of these categories of cronyism fits the Carrier case. It's possible that Donald Trump made a larger commitment to give special favors to Carrier or United Technologies in return for keeping furnace production in the United States. But it's far more likely that United Technologies perceived—or actually received—a threat on its lucrative military contracts.
And that's a different model entirely. Though not one without precedent.
Consider John F. Kennedy's clash with U.S. Steel in 1962. With mediation from Labor Secretary Arthur Goldberg, 10 steel companies and the United Steelworkers of America agreed to a contract on March 31 of that year. Kennedy praised the deal, saying that it was "non-inflationary" and that both sides had shown "industrial statesmanship of the highest order." But on April 10, U.S. Steel Chairman Roger Blough visited Kennedy in the White House to tell him his company was about to raise prices by $6 a ton, an increase of 3.5 percent, and that other steel companies were about to do the same. (Parenthetically, I wonder how Blough knew other companies would do the same. Was he aware that, under antitrust laws then and now, companies aren't supposed to share private information about future price increases?)
Kennedy was enraged. Twitter did not exist then, so the president used a large part of his April 11 press conference to lambaste the companies. He accused them of "a wholly unjustifiable and irresponsible defiance of the public interest" and claimed that they showed "utter contempt for the interests of 185 million Americans." The one company he singled out by name at the start of the press conference was U.S. Steel.
Secretary of Defense Robert McNamara then announced that the Defense Department would shift a contract for submarine construction from U.S. Steel to a firm that had not raised prices. Kennedy's younger brother, Attorney General Robert Kennedy, announced an antitrust investigation of the steel companies and subpoenaed U.S. Steel. Then Inland Steel, the eighth largest steel company, refused to raise its prices. Bethlehem Steel, the second largest company, cried uncle too. And then U.S. Steel, the largest company, backed down on the price hike.
Beg and Borrow
Or consider Obama's bailout of Chrysler and General Motors. On the surface, it looks like a straightforward case of cronyism in which Obama helped both the auto companies and the United Auto Workers' union. But in a 2011 National Journal article, "The Auto Bailout and the Rule of Law," George Mason University law professor Todd Zywicki looks beneath the surface. He finds that the Obama administration put its thumb on the scales of what should have been a straightforward bankruptcy procedure and violated the rule of law. "In the years leading up to the economic crisis," Zywicki notes, "Chrysler had been unable to acquire routine financing and so had been forced to turn to so-called secured debt in order to fund its operations. Secured debt takes first priority in payment; it is also typically preserved during bankruptcy under what is referred to as the 'absolute priority' rule—since the lender of secured debt offers a loan to a troubled borrower only because he is guaranteed first repayment when the loan is up. In the Chrysler case, however, creditors who held the company's secured bonds were steamrolled into accepting 29 cents on the dollar for their loans. Meanwhile, the underfunded pension plans of the United Auto Workers—unsecured creditors, but possessed of better political connections—received more than 40 cents on the dollar."
Obama violated standard bankruptcy procedures. How did he get away with it? By lambasting various hedge funds in public. In an April 2009 speech, he claimed that a group of investment funds and hedge funds were holding out for an "unjustified taxpayer bailout." Further, he claimed, they were demanding "twice the return that other lenders were getting." Well, sure. That's because they would not have lent if they hadn't expected to be first in line for repayment. And as for the "unjustified taxpayer bailout," it was Obama who insisted on, and carried out, the bailout. He gave special treatment to the United Auto Workers. That's the cronyism part. But both Kennedy and Obama initiated actions against various parties—Kennedy against steel companies and Obama against holders of secured bonds. To the extent that Trump's actions were threats against Carrier and its parent company, they were a lot like Kennedy's threat against steel companies and Obama's abrogation of bondholders' rights.
Steal
Larry Summers, formerly Obama's chief economic adviser, has eloquently denounced Trump's actions as a violation of the rule of law. It's too bad he apparently didn't express the same view when he and his protégé, Treasury Secretary Tim Geithner, were busy violating the rule of law in the auto bailout. Better late than never, but I fear that Summers' respect for the rule of law will fade or even disappear the next time a Democratic president takes office.
Someone who doesn't worry about the implications of Trump's actions but instead actually likes them is Steve Pearlstein, a columnist for The Washington Post and a professor of political and international affairs at George Mason University. In a Post column titled "Donald Trump's Carrier deal could make American capitalism better," Pearlstein writes: "[Trump] knows that he and his new commerce secretary will have to engage in a few more bouts of well-publicized arm twisting before the message finally sinks in in the C-Suite. He may even have to make an example of a runaway company by sending in the tax auditors or the OSHA inspectors or cancelling a big government contract. It won't matter that, two years later, these highly publicized retaliations are thrown out by a federal judge somewhere. Most companies won't want to risk such threats to their 'brands.' They will find a way to conform to the new norm."
These predictions are, unfortunately, very plausible.
Most instances of crony capitalism are facilitated by the discretionary power of government. Because of past power grabs by presidents and past failures by Congress to exert its authority, the U.S. president, whoever he is, has an enormous amount of discretionary power. And Trump has never seemed like a man who worries about executive overreach.
The stage is already set for a great deal of cronyism because so many firms rely on government contracts. So Trump's exercise of his coming power over government contracts will inevitably lead to crony capitalism. Once firms recognize, as they probably already have, that the Trump administration can decide whatever narrow outcome it wants and then find the discretionary levers to pull to get that outcome, they will look for ways to kiss the emperor's ring. Then those firms that do so might well be rewarded with subsidies, tariffs, regulations that hobble their competitors, and so on.
Unlike Pearlstein, I see this as awful news. One of the basics of a free society is that if you obey the law, you are not in trouble. We have already gone far beyond that, with asset forfeiture, for example. But Trump's action with Carrier takes us one step further down that road. And if Trump follows Pearlstein's suggestion—sending in tax auditors and Occupational Safety and Health Administration (OSHA) inspectors, not as part of a regular tax audit or OSHA inspection but as part of a threat by a government that wants to get its way on something else—that would take us several steps more.
In his 1944 classic, The Road to Serfdom, economist Friedrich Hayek wrote eloquently about the danger to people of accumulations of government power: "As the coercive power of the state will alone decide who is to have what, the only power worth having will be a share in the exercise of this directing power." Larry Summers, drawing on his inner Hayek, actually said it well also. "Reliance on rules and law has enormous advantages," he writes. "It greatly increases predictability and reduces uncertainty. It reduces expenditures on both guarding property and seeking to appropriate property. It promotes freedom because most of the people most of the time do not take political positions with a view to gaining commercial advantage. The advantages of the rule of law are so great that I would claim that there is no country more than 2/3 as rich as the United States that does not have a strong tradition of the rule of law based capitalism. And I know of no country where the people are free where the rule of law does not largely govern market interactions."
We're already well along the path away from the rule of law, as Summers well knows, given his prominent role in the G.M. and Chrysler bailouts. The Trump maneuvers on Carrier, in themselves, are but a small further step away. But they are a step.
The less predictable are the rules, the less secure are our property rights. Making property rights less secure reduces the incentive to invest. Summers probably overstated the bad effects of Trump's move: We might not wind up objectively poorer, but we will certainly be poorer than we would have been. And we will definitely be less free.
As Donald Trump would say, "Very sad."
Correction 1/29/17: An earlier version of this story mistakenly described Robert Kennedy as John F. Kennedy's older brother.
The post Bribe Bully Beg Borrow Steal appeared first on Reason.com.
Ask any American born before 1960 for an example of corporate greed resulting in environmental disaster and the odds are good that he or she will name Love Canal. Love Canal, for readers who don't know, is a neighborhood in the city of Niagara Falls, N.Y., that was once a chemical waste dump. The dump became a major news story in the late 1970s, including sensational articles in the Niagara Falls Gazette by reporter Michael Brown, who would later write the book Laying Waste: The Poisoning of America by Toxic Chemicals (Pantheon, 1980). The incident led to passage of the so-called Superfund legislation of 1980, which imposed a tax on petroleum and chemical companies to generate revenue for government-directed cleanup of toxic chemical sites.
But the real story of Love Canal isn't the "corporate guys: bad; government guys and community activists: good" tale that many people believe. In its February 1981 issue, Reason magazine published an exhaustive, fact-filled, 13,000-word article on Love Canal written by independent investigative reporter Eric Zuesse. The article dramatically recast many of the characters in Brown's reports, including Brown himself. I recently asked Reason's longtime science writer, Ron Bailey, whether further information in subsequent years had led him to doubt any important factual claims in Zuesse's piece, and he replied, "I am not aware that his article has been contradicted or found deficient in any important way."
In the years since that article appeared, whenever I read about Love Canal, I do so with two questions in mind: (1) Does the work discuss Zuesse's version of the story? (2) Does it challenge his claims? Those questions were on my mind as I read the latest addition to this literature, historian Richard Newman's new book Love Canal: A Toxic History from Colonial Times to the Present. Newman does not mention Zuesse, but he does raise some of the issues that Zuesse did. Disappointingly, he ultimately sets those issues aside and adopts much of the story that Brown presented.
Before diving into the book, some backstory is needed. In the 1890s, an entrepreneur named William Love proposed to build a canal bypassing Niagara Falls, which would allow shipping between Lake Erie and Lake Ontario and better connect the hydro-powered industrial area to markets. As part of his vision, Love proposed a planned community along the waterway. His project was ultimately dashed by the Panic of 1893 and a congressional prohibition on diverting water from the Niagara River, and the partially dug canal was abandoned. However, some of the residential development Love envisioned did become reality.
Decades later, the unfinished canal, which had become filled with groundwater, became a dumpsite for the city's municipal waste. During World War II, a local chemical company, Hooker Electrochemical Company (later Hooker Chemical), received permission to dispose of chemical waste in the canal. Late in the 1940s, Hooker drained the canal, lined it with clay, and began depositing drums of chemicals at the site. Dumping continued through the early 1950s, when Hooker capped the site with clay.
But the site was soon disturbed. The Niagara Falls City School District decided it wanted the property for a school, threatening to use eminent domain to gain the land. Rather than fight the action, Hooker offered to sell the property to the school board for $1. The school board agreed, even though it was aware of the site's history.
In the deed of sale, Hooker included the following closing paragraph:
Prior to the delivery of this instrument of conveyance, the grantee herein has been advised by the grantor that the premises above described have been filled, in whole or in part, to the present grade level thereof with waste products resulting from the manufacturing of chemicals by the grantor at its plant in the City of Niagara Falls, New York, and the grantee assumes all risk and liability incident to the use thereof. It is therefore understood and agreed that, as a part of the consideration for this conveyance and as a condition thereof, no claim, suit, action or demand of any nature whatsoever shall ever be made by the grantee, its successors or assigns, for injury to a person or persons, including death resulting therefrom, or loss of or damage to property caused by, in connection with or by reason of the presence of said industrial wastes. It is further agreed as a condition hereof that each subsequent conveyance of the aforesaid lands shall be made subject to the foregoing provisions and conditions.
The new owner of the land automatically became liable for any damage done by toxic waste on the land, making such a clause legally unnecessary. Why, then, did Hooker insert the clause? Zuesse's explanation is that Hooker wanted to underscore that the chemicals could be dangerous and should not be disturbed. Consider that in March 1952, a Hooker official escorted school board officials to the site and, with them present, made test borings into the protective clay cover to convince the school board officials that the potentially dangerous chemicals were there. Yet, in August 1953, the school board unanimously voted to remove 4,000 cubic yards of fill from the waste site to complete the grading at another school site. The school board then began building the school on the Love Canal site, and that school opened in February 1955.
In 1957, the school board considered trading part of the property to two developers in exchange for some other land and $11,000 in cash. Hooker executives, upon hearing about the proposal, sent company attorney Arthur Chambers to attend the board meeting where the proposal was discussed. Chambers reminded the board members that chemicals were buried under the land's surface and pleaded with them not to let houses be built on the land. The board deadlocked 4–4, with the result that the resolution to sell the land failed.
Unfortunately, at the same time, city workmen, while constructing a sewer, punctured the walls of the site and its clay cover. They did this even though articles in the local paper at the time regularly warned that the construction was "dangerous" and "injurious."
In short, Hooker Chemical tried on several occasions to warn people about the dangers of the buried chemicals, and the irresponsible players in the drama were government officials. Two decades later, as groundwater tests began finding toxic chemicals and assertions were made that the chemicals were causing birth defects, Brown began writing his articles.
Newman tells some of this story. But at some points, he undercuts it with doubts about Hooker's actions. He writes, for example:
Hooker later claimed that developers removed the [clay] cap when building new homes and streets. But subsequent investigations doubted that the company had actually capped the entire dump (perhaps only part of it)… In short, the Love Canal dump may never have been completely contained.
Here's the problem: In a 306-page book with 32 pages of footnotes, this very important claim is not footnoted. So either Newman has failed to back up a correct claim, or he's simply stating some unspecified person's opinion. The way to plant credible doubt is to show, not just assert, that there is doubt.
Newman quotes much-celebrated Love Canal activist Lois Gibbs's claim that "residents of this blue-collar community have come to see that corporate power and influence are what dictated the actions at Love Canal, not the health and welfare of its citizens." Yet, if Zuesse's version of the story is correct, then it was a major corporation that warned against various politicians' plans to cut through the cover over the toxic dump. Perhaps Hooker Chemical was truly worried about "the health and welfare" of Love Canal's citizens. Perhaps it merely wanted to avoid adverse publicity and possible legal action (despite the legal protections in the deed). Regardless, the problem was not corporate power and influence but corporate impotence. The politicians had their narrow goal—building a school over a potentially toxic dump—and they were not about to be stopped by a mere corporation. Of course, the quote is from Gibbs, not from Newman, but Newman does not even attempt to gainsay her strong claim.
This failure is not just a careless slip. In one section, for example, Newman writes, "It all came back to the concept of justice, for Love Canal families felt that they had been sacrificed on the altar of profit and power." It seems far more accurate to say that they were sacrificed on the altar of the local school board's power; the idea that for-profit Hooker sacrificed them is hard to maintain in light of Hooker's warnings not to disturb the site. Moreover, elsewhere in the book, Newman refers to Hooker's "newfound concern" in 1980 with the "public's health and safety." Newfound? As documented above, Hooker stated and, more importantly, acted on its concern in the 1950s.
And what were the chemicals' health consequences for Love Canal residents? One would think that a 2016 book would at least partially answer that question. But even though Newman refers to bad health consequences, he is disturbingly vague about their nature. He refers, for example, to some blood tests of young children living in the area without giving even a hint about what those blood tests found. Elsewhere he refers to a "much debated genetic test showing that roughly one-third of the thirty-six people sampled may have suffered chromosome damage." One-third is high. What, then, was debated? Only when you actually read a 1983 New York Times article referenced in the footnote do you find the following: "Residents and former residents at the Love Canal toxic-waste site in Niagara Falls, N.Y., are no more likely to have suffered chromosomal damage than residents elsewhere in the city, a Government study concluded today." The study was conducted by the Centers for Disease Control. Note the asymmetry: Newman puts the horrific claim about chromosome damage in the body, relegates the New York Times article to a footnote, and doesn't even hint in that footnote at what the study found or that it was conducted by the Centers for Disease Control.
At one point, Newman refers to reporter Brown's being driven "by an old-fashioned sense of justice." I beg to differ. Consider, for example, Brown's claim, which Zuesse highlights: "At that time [1953], the company issued no detailed warnings about the chemicals; a brief paragraph in the quitclaim document disclaimed company liability for any injuries or deaths that might occur at the site." A brief paragraph? As Zuesse points out, this paragraph, quoted above, is the longest paragraph in the whole deed. Newman's idea of an old-fashioned sense of justice is certainly not mine.
It's usually a good idea to revisit important historical issues in light of new information. Unfortunately, Newman's retelling of the Love Canal story omits much of what Zuesse discovered 35 years earlier. People who read it will come away with many of the apparently false impressions produced by Brown's original reports.
If you want to know the history of Love Canal in the 19th century and first half of the 20th century, part one of Newman's book is for you. But if you want to really understand the key events from the early 1950s to 1980, Zuesse's 1981 article is the place to look.
This article originally appeared in the Winter 2016/2017 issue of the Cato Institute's Regulation.
The post Love Canal Down the Memory Hole appeared first on Reason.com.
Since 2008, the Federal Reserve has been trying to stave off economic disaster with an unconventional monetary policy tool known as quantitative easing. By buying financial assets from commercial banks and other institutions, the Fed has massively expanded the money supply, quadrupling it since the practice began.
Many economists, particularly followers of the Austrian school, deplored the practice and predicted that the unprecedented currency and asset price manipulation would lead to huge and damaging price inflation. reason was among them, declaring on our October 2009 cover: "Inflation Returns!" A group of free market economists was asked: "Has the time come to stockpile canned goods and pick up a wheelbarrow for transporting currency, or should we be afraid of the opposite: a prolonged contraction that causes prices to crash?"
Six years later, official consumer price index inflation sits at just 2 percent annually from July 2013 to July 2014, the latest period for which figures are available. This is identical to the rate for the previous year.
We asked four economists and market analysts to revisit what they originally predicted would happen after quantitative easing and assess whether (and why) they were right. Analyst Peter Schiff sticks to his guns, saying that any "claims of victory over inflation are premature and inaccurate. Inflation is easy to see in our current economy, if you make a genuine attempt to measure it." Economist Robert Murphy believes we are in a "calm before the storm" and is "confident that a day of price inflation reckoning looms." Contributing Editor David R. Henderson writes that the "financial crisis has brought such major changes in central banking that uncontrolled inflation from discretionary monetary policy is not as great a danger as it once was," though he remains critical of the Fed's growing powers. And economist Scott Sumner claims victory for the "market monetarists," noting that both Austrians and Keynesians have been proven wrong by events, and urging both sides to "take markets seriously."—Brian Doherty
Where Is the Inflation?
Peter Schiff
Back in 2009, when the federal government began running trillion-dollar-plus annual deficits and the Federal Reserve started printing trillions of dollars to buy Treasury debt and subprime mortgages, economists debated whether much higher inflation was inevitable. Mainstream economists (who hold sway in government, the corporate world, and academia) argued that as long as the labor market remained slack, inflation would not catch fire. My fellow Austrian economists and I loudly voiced the minority viewpoint that money printing is always inflationary—in fact, that it is the very definition of inflation.
Today, with price inflation still not rampant, it's hard to ignore the victory chants coming from the White House press room, the minutes of the Federal Reserve's Open Market Committee, the talking heads on financial television, and the editorial pages of The New York Times. They claim that the Fed's extraordinary monetary policy and the government's fiscal stimulus have succeeded in keeping the economy afloat through the Great Recession without sparking inflation in the slightest. Deflation, they argue, is still the bigger threat. Their claims of victory are premature and inaccurate. Inflation is easy to see in our current economy, if you make a genuine attempt to measure it.
The Consumer Price Index (CPI) doesn't qualify as a genuine attempt to measure inflation. The CPI report for July 2014 came in at 2.0 percent year-over-year. But because of consistent alterations in how the data is calculated, the CPI has hidden price increases under a blanket of subjective "adjustments." While the details are intricate, the results can be glaring.
For instance, between 1986 and 2003, the CPI rose by 68 percent (about 4 percent per year). Over that 17-year period, the "Big Mac Index," a data set compiled by The Economist that tracks the cost of the signature McDonald's burger, rose at a nearly identical pace. Since then, this correlation appears to have broken. Between 2003 and 2013, the Big Mac Index rose more than twice as fast as the CPI (61 percent vs. 25 percent). The sandwich, which reflects the average person's direct experience, may be a more accurate yardstick of inflation.
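The comparison between the two indices is easier to see as compound annual rates. A minimal Python sketch, using only the cumulative percentages quoted above (the function name is mine; the annualization formula is standard compound growth):

```python
def annualized(total_growth_pct, years):
    """Convert a cumulative percentage change into a compound annual rate (%)."""
    return ((1 + total_growth_pct / 100) ** (1 / years) - 1) * 100

# 2003-2013 figures cited in the text
cpi_annual = annualized(25, 10)      # cumulative CPI rise of 25%
big_mac_annual = annualized(61, 10)  # cumulative Big Mac Index rise of 61%

print(f"CPI: {cpi_annual:.1f}% per year")          # about 2.3% per year
print(f"Big Mac: {big_mac_annual:.1f}% per year")  # about 4.9% per year
```

On these numbers, the burger's compound annual rate over 2003-2013 is more than double the CPI's, matching the "more than twice as fast" comparison in the text.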
Meanwhile, the Fed is pushing up prices not reflected in the CPI. Through its zero-interest-rate policy and direct asset purchases via quantitative easing, the Fed has lowered the cost of capital and raised prices for stocks, bonds, and real estate. In doing so, it has argued that rising asset prices create a "wealth effect" and are thus a key goal of its monetary policy.
Over the past five years, the prices of these financial assets have risen dramatically. However, unlike past periods of bull asset markets, these increases have not been accompanied by robust economic growth. To the contrary, the last five years have seen the slowest non-recession economic growth since the Great Depression.
This Fed-driven dynamic explains the rich-get-richer economy we've seen since the alleged recovery of 2009 began. The wealth effect has allowed the elites to push up prices for high-end consumer goods such as luxury real estate, fine art, wine, and collectible cars. But that is cold comfort to rank-and-file Americans struggling to find work in an otherwise stagnant economy.
Broader consumer price inflation has been kept at bay because many of the newly printed dollars don't even hit our economy. Instead, foreign countries purchase them in an attempt to keep their own currencies from appreciating against the dollar. In the current environment, a weak currency is widely (and wrongly) seen as essential to economic growth. That's because a weak currency lowers the relative price of a particular country's manufactured goods on overseas markets. Nations hope those lower prices will lead to greater exports and more domestic jobs.
Thus we see "currency wars," in which the victors are those who most successfully debase their currencies. That policy perpetuates greater global imbalances (between those nations that borrow and those nations that lend) and the accumulation of dollar-based assets in the accounts of foreign central banks.
The more debt the U.S. government issues, the more purchases these foreign banks must make to keep their currencies from becoming more valuable relative to ours. It is no coincidence that many of the countries heavily buying U.S. dollars, such as China, the Philippines, and Indonesia, are experiencing high levels of domestic inflation. Inflation may now be America's leading export.
In recent years, U.S. federal deficits have declined from more than $1.2 trillion to less than $600 billion. This is not because the government has made hard choices to raise revenue or cut spending but because rising asset prices have resulted in greater tax receipts from the wealthy. Yet this windfall can only last until the next meaningful correction in asset prices. If tax revenues fall, growing federal deficits would compel the Fed to print the difference. In that case, foreign banks would need to buy even more dollars to maintain their currency valuations. If they lose the will to keep pace, the dollar would lose relative value. A weaker dollar could be the spark that finally ignites significant CPI inflation in the United States.
As foreign currencies gain strength, consumers in those countries will gain buying power and more finished products will gravitate toward foreign shelves. Given that a significant portion of the products we now buy are imported, the diminished domestic supply could push up prices for common products like apparel, electronics, and appliances.
The Fed used to be considered effective when it kept inflation low. Today, inflation is the goal. This is especially true for the Federal Reserve as led by Janet Yellen, who looks set to be the most dovish-on-inflation chairperson in the bank's history. The inflation created by the U.S. government has been delayed, not avoided. Already the costs of everyday goods are rising faster than incomes. This is why economic pessimism is so prevalent on Main Street. Sadly, that gap is likely to get much worse.
Mainstream economists would have us believe that inflation and a weak labor market can't exist simultaneously. Have they ever been to Argentina? Do none of them remember the stagflation of the 1970s?
The truth is that high levels of unemployment are historically correlated with higher inflation and low levels of unemployment with lower inflation. That is because an economy that more fully utilizes labor resources is more productive. More production brings down prices. In contrast, an economy that does not fully employ its citizens is less productive, and its government is more prone to pursue misguided inflationary policies to stimulate the economy.
Although America's policies may not differ dramatically from what has been disastrously tried by Argentina, the dollar is for now protected by the international reserve status that it has enjoyed for almost 70 years. But that privilege has its limits. The dollar may be bigger and more globally integrated than any other paper currency, but in the end, its value may be just as ephemeral.
Peter Schiff is CEO of Euro Pacific Capital and author of The Real Crash: America's Coming Bankruptcy (St. Martin's).
The Calm Before the Storm
Robert Murphy
Since the fall of 2008, I have been among the economists, many from the Austrian tradition, warning the public about the disastrous policies enacted by the Federal Reserve in response to the financial crisis. The Fed was generating unprecedented increases in the monetary base, which is the quantity of dollars held by the public as currency and held as reserves by commercial banks. In late 2009, I made a public wager with economist David R. Henderson in which I predicted a 10 percent year-over-year increase in the Consumer Price Index by January 2013. I lost that bet. In general, warnings about price inflation seem to have been premature at best, totally wrong at worst.
It's true that consumer prices did not zoom up as I had predicted, but my objection to the Fed's post-crisis policies was never dependent on that specific forecast. Indeed, the distinctive feature of Austrian business cycle theory is that "easy money" causes the familiar boom-bust cycle by affecting relative prices. Regardless of the purchasing power of the dollar, the Fed's actions have definitely interfered with interest rates, hindering the communication of information about the condition of the credit markets. By postponing needed readjustments in the structure of production, the Fed's actions have allowed the problems apparent in the fall of 2008 to fester.
I am still confident that a day of price inflation reckoning looms and that the U.S. dollar's days as the world's reserve currency are numbered, though I have no way of gauging the duration of this calm before the storm. Still, my 2009 predictions about consumer price inflation were wrong, and it's useful to analyze why.
At the time, I thought the Fed's policies were simply going to kick the can down the road and exacerbate the underlying structural imbalances in the economy. The housing bubble had itself been fueled by the artificial monetary stimulus and rate cuts under previous Fed chair Alan Greenspan (in response to the dot-com crash and the 9/11 attacks), and Bernanke seemed to be drawing from the same failed playbook. We would simply replace one bubble with another: in this case, swapping a bubble in U.S. Treasuries (and the U.S. stock market) for the collapsing housing market.
That all still seems true. My crucial mistake back in 2009 was in predicting that other investors would come to agree with my assessment in a year or two. In other words, I thought they would look ahead, realize Bernanke had no exit strategy, and then short the dollar (and other dollar-denominated assets) to avoid holding the bag. More specifically, I thought that commercial banks would eventually realize they needed to get their excess reserves in higher-yielding assets.
Once the commercial banks started this process, the quantity of money in the broader sense (captured in aggregates such as M1 and M2, which include the public's checking account balances at the banks) would begin to reflect the enormous spike in the monetary base the Fed had directly engineered. Remember that in our fractional reserve banking system, when the Fed buys $1 billion (say) in assets and thereby adds $1 billion in new reserves to the system, if the commercial banks proceed to make new loans, then in the process they will create (say) an additional $9 billion in new money, broadly measured. In 2009 I thought more and more investors would come to anticipate this process, expecting that the money supply held by the public eventually would start to soar, so that large-scale price inflation would become a self-fulfilling prophecy.
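The deposit-expansion arithmetic behind that $1 billion/$9 billion example can be made concrete. Below is a minimal Python sketch of the textbook money multiplier; the 10 percent reserve ratio is an illustrative assumption consistent with the example, not a claim about how banks actually behaved:

```python
def new_broad_money(reserve_injection, reserve_ratio, rounds=1000):
    """Sum the successive loans created as banks re-lend all excess reserves.

    Each round, banks lend out (1 - reserve_ratio) of the deposit they just
    received; the loan proceeds are redeposited and the cycle repeats.
    """
    total_loans = 0.0
    deposit = reserve_injection
    for _ in range(rounds):
        loan = deposit * (1 - reserve_ratio)
        total_loans += loan
        deposit = loan  # loan proceeds come back as a new deposit
    return total_loans

# $1 billion of new reserves with an assumed 10% reserve ratio
created = new_broad_money(1.0, 0.10)
print(f"New money created by lending: ${created:.2f} billion")  # ~$9.00 billion

# Closed form: total deposits grow by 1/r, so money *created* is (1/r - 1)
assert abs(created - (1 / 0.10 - 1)) < 1e-6
```

The point of the sketch is the mechanism Murphy describes: the broad aggregates only balloon if banks actually make the loans, which is exactly the step that did not happen.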
But the U.S. economy has stayed in this holding pattern, where people expect low consumer price inflation and so commercial banks keep their excess reserves parked at the Fed earning 25 basis points rather than make new loans. Thus the process I described above has been thwarted; the quantity of money held by the public right now is much lower than it would be if the banks decided they would rather make loans and earn a higher interest rate than the 25 basis points currently paid by the Fed.
I do not believe the Federal Reserve can gracefully exit from its current position. Fed officials eventually will be in an untenable position in which they must choose to either (a) crash the financial markets by selling off assets and letting interest rates rise sharply or (b) let the dollar fall quickly in value against consumer goods and services. But in the last six years, they have been granted a very generous grace period before having this hard choice foisted upon them.
According to Austrian business cycle theory, as developed by Ludwig von Mises and elaborated by Friedrich Hayek (who would later win the Nobel Prize partly for this work), interest rates serve a specific purpose in a market economy. Intuitively, the more society saves and is willing to defer immediate gratification, the more we want entrepreneurs to invest real resources in longer-term projects. When the central bank injects new money into the credit markets, this not only lowers the purchasing power of money (other things being equal) but artificially suppresses interest rates and renders long-term projects profitable that in reality should not be pursued.
In the Austrian view, therefore, consumer prices are not a reliable gauge of the "looseness" or "tightness" of monetary policy. Irving Fisher infamously thought the Fed in the 1920s had done a good job because the CPI had been tame, whereas Mises knew that a crash was brewing by the late 1920s.
Fearing an imminent spike in consumer prices because of the Fed's unprecedented actions since late 2008 turned out to be wrong. Whether it was wrong in spirit or merely in timing, only time will tell.
Bernanke's policies were harmful regardless of the impact on the CPI. Pumping enormous amounts of money into the credit markets doesn't make us richer. It just distorts the coordinating function of interest rates. Remember, Greenspan did us no favors by pumping up the housing bubble. Whether or not a massive bout of price inflation breaks out, a crash in the real economy should still be expected.
Robert P. Murphy (rpm@consultingbyrpm.com) is senior economist at the Institute for Energy Research and the author of The Politically Incorrect Guide to Capitalism (Regnery).
High Inflation Still Unlikely
David R. Henderson
Ever since Federal Reserve Chairman Ben Bernanke commenced quantitative easing in October 2008, many commentators have warned about the danger of inflation. There are good reasons to be concerned, including the unprecedented expansion of the monetary base and the Fed's bloated balance sheet, but many Fed watchers are fighting the last war.
The financial crisis has brought such major changes in central banking that uncontrolled inflation from discretionary monetary policy is not as great a danger as it once was. The Fed now has a variety of exotic tools that can prevent any sudden expansion of the broader monetary measures. These tools are partly what have already prevented quantitative easing from causing serious inflation.
Moreover, the Fed monitors banks and other financial institutions so closely that it cannot be caught napping, and implicit inflation targeting has become a dominant Fed goal. The real danger is not runaway inflation. It's that the Fed is becoming more and more intrusive in shaping how national savings are allocated.
The prevalent worry is that once the economy gets back to normal and interest rates start to rise, banks will increase their loans. And when that happens, many fear, the Fed has no viable exit strategy to hold back inflation.
Central banks traditionally dampen inflation by reducing the monetary base via selling off assets or, equivalently, calling in and not rolling over loans. Those who predict future uncontrolled inflation believe this cannot be accomplished quickly enough without major disruption of credit markets. Aggravating this problem is that the Fed's balance sheet now contains large quantities of mortgage-backed securities and long-term Treasury debt, not just the short-term Treasury securities that were its primary asset in the past. The Fed can quickly reduce its balance sheet by not buying new short-term Treasuries when the old ones mature. With long-term Treasuries, by contrast, the Fed must sell many of them on secondary markets. This could drive prices down, disrupting credit markets and causing losses for the Fed on its portfolio.
But the Fed no longer has to rely on dumping assets to impose monetary restraint. It can use four other methods to accomplish the same goal. These comprise either completely new tools or older tools that were previously of minor importance.
They are:
1. Loans to the Fed from the U.S. Treasury.
2. Reverse repurchase agreements (reverse repos), that is, borrowing by the Fed. The Fed sells a security from its portfolio with an agreement to buy it back. Under new Fed Chair Janet Yellen, the Fed uses this device extensively, borrowing over $200 billion.
3. Term Deposit Facility, a mechanism through which banks can convert their reserves deposited at the Fed (which are just like Fed-provided, interest-earning checking accounts for banks) into deposits of fixed maturity at higher interest rates set by auction (making them just like Fed-provided certificates of deposit for banks). Fed officials have made clear their intention to employ the Term Deposit Facility liberally if necessary. This would drain bank reserves by converting bank deposits at the Fed from implicit loans to explicit, higher-interest-earning loans with fixed maturities.
4. Interest on reserves. The most important way the Fed began borrowing and continues to do so is by paying interest to banks on their reserves. Permission to do so was included in the Troubled Asset Relief Program (TARP) Act, and the Fed started using that power within days. Interest-earning reserves have encouraged banks to raise their reserve ratios rather than expand loans to the private sector. This tool thus constitutes a flexible substitute for reserve requirements.
The rate that the Fed pays on reserves started out as high as 1.4 percent on required reserves and 1 percent on excess reserves, but it is now fairly low on both: 0.25 percent. Yet the alternatives available to the banks are little better, especially after adjusting for risk. It is the gap between these rates that determines the incentive for individual banks to hold reserves. The interest on three-month Treasury bills remains lower, and both T-bills and reserves are assets that impose no legally mandated capital requirements on banks.
By paying interest on reserves, the Fed has made itself the preferred destination for much bank lending. Should market interest rates begin to increase, raising the prospect of increased bank lending and inflation, the Fed can increase the interest rate it pays, locking up bank reserves and keeping reserve ratios high.
These four tools combined make it possible for the Fed to prevent any expansion of the broader monetary measures without selling off any assets. Of course, the Fed may supplement these tools with some asset sales, but sales are unlikely to be large initially. Interest on reserves will almost certainly be the dominant exit tool.
Treasury deposits may not become significant again, given the national government's huge deficits. Use of reverse repos could be somewhat constrained by Fed concerns about the solvency of the counterparties it borrows from, whereas this is not a serious consideration for highly regulated and monitored commercial banks. Term deposits are just a modified way of paying interest to banks.
Another issue that could affect the exact mix is how these tools affect Fed income. Treasury deposits are the only exit tool that cannot reduce Fed earnings. The Fed pays no interest on Treasury deposits directly, so the Treasury would bear the cost of rising market interest rates if it engaged in extra borrowing on behalf of the Fed. This would be offset only partly by the Fed's regular remittances of its excess earnings to the Treasury. On the other hand, paying higher interest on reserves, on reverse repos, and on term deposits would all directly curtail Fed earnings, whereas any large sale of assets could impose capital losses.
Rep. John Campbell, a California Republican who heads the monetary policy and trade subcommittee of the House Financial Services Committee, has warned that central bank losses are "a legitimate concern and something we will be watching." It would be truly ironic if congressional and popular hostility to the Fed pressured the Fed to create more money to keep Treasury remittances flowing, possibly contributing to the very inflation that so many Fed critics fear. And if the Fed's exit strategy coincides with a Treasury fiscal crisis, all bets are off. That last scenario, though, would arise, not from Fed monetary policy per se, but from Congress' and the president's deficits and debt.
My analysis is not advocacy. I do not claim that it is good for the Fed to have these powers. Indeed, as I argued in a presentation at the San Francisco Federal Reserve Bank in April, I would like the Fed to have zero power. The Federal Reserve's sorry century-long record is evidence for its failure. But the issue at hand is whether the Fed's actions will produce high inflation. That is highly unlikely.
Contributing Editor David R. Henderson is an associate professor of economics at the Naval Postgraduate School in Monterey, California, and a research fellow with Stanford University's Hoover Institution. He blogs at econlog.econlib.org.
Neither Hyperinflation Nor a Liquidity Trap
Scott Sumner
The Fed has responded to the Great Recession with a variety of unconventional monetary policies. These policies have been interpreted, or perhaps I should say misinterpreted, in two notable ways.
On the right, many economists and pundits have expressed concern that low interest rates and lots of monetary expansion would lead to high inflation. On the left, Keynesian economists argued that monetary policy is ineffective once interest rates have fallen close to zero. This led them to advocate fiscal stimulus.
Events of the last six years have decisively refuted both of these widely held views. Instead, a relatively small and little-known group, the market monetarists, has offered the most persuasive account of recent policy. Monetary policy continues to be highly effective at near-zero interest rates, but in most cases actual monetary stimulus has been far too weak to promote high inflation, or even an adequate recovery.
I am one of the market monetarists. Here are our arguments:
1. The decline in interest rates to near-zero levels did not represent easy money, but rather the effects of a weak economy. When nominal GDP growth is very slow, as in the early 1930s or in Japan in the 1990s, nominal interest rates tend to fall toward zero. That does not mean "easy money," nor does it mean high inflation is on the way. In contrast, very high interest rates usually occur during periods of very high inflation, when monetary policy is very expansionary.
2. Once interest rates have fallen close to zero, there's no opportunity cost in holding the kind of money produced by the Federal Reserve: currency in people's wallets and bank reserves (which are essentially cash held by commercial banks, often as electronic deposits at the Fed). In previous low-interest-rate environments, such as the 1930s in America and more recently in Japan, central banks were able to "print" lots of money without generating high inflation rates.
In addition, beginning in 2008 the Fed started to pay a small amount of interest on bank reserves, which made banks even more willing to hold onto that cash. Recall that traditional economic theory says that printing money is inflationary because it's a sort of hot potato. The kind of money produced by the Fed prior to 2008 did not earn any interest. Thus, when interest rates are positive, people and banks don't want to hold onto lots of currency and bank reserves. As they try to get rid of these excess cash balances, they purchase goods, services, and assets, driving up prices in the long run. That's why printing money is usually inflationary. But when interest rates are close to zero, the hot potato effect is much weaker, and central banks can print large quantities of money without creating substantial inflation.
If printing money is not inflationary at zero interest rates, what's wrong with the Keynesian argument that we need to rely on fiscal policy because monetary policy is stuck in a "liquidity trap"?
Many Keynesians ignore the role of expectations. Although the U.S. is currently experiencing near-zero interest rates, that won't go on forever. At some point in the future, interest rates will be positive and monetary injections will have an inflationary effect. More importantly, the expectation of higher inflation in the future is expansionary right now. These expectations lead to higher asset prices (stocks, bonds, real estate, commodities, and foreign exchange), which boosts aggregate demand and inflation.
If the Keynesian argument were true, it would be possible for a single central bank to buy up all of the world's assets with newly printed money without creating inflation. Does that sound too good to be true? Do you really believe that a fiat-money central bank would be unable to debase its currency if it wanted to? Do you believe the Fed doesn't know how to create the sort of hyperinflation experienced in Germany in the 1920s and in Zimbabwe more recently?
Contrary to what you may have read in the press, the Fed has never said it was "out of ammunition"; indeed, exactly the opposite. When they've made moves like "tapering" (slowing the pace of money printing), it's because they didn't think the economy needed more stimulus, not because they were running out of paper and green ink.
The Keynesians have made a number of predictions in recent years based on the assumption that monetary policy is ineffective when the economy is in a liquidity trap. Perhaps the most famous Keynesian pundit is Paul Krugman, who expressed skepticism about whether the Swiss central bank and the Bank of Japan would be able to depreciate their currencies when interest rates were stuck at zero.
Soon after he expressed those doubts, both central banks undertook bold policies that successfully held down the value of their currency in the foreign exchange market. The Bank of Japan's decision to adopt a 2 percent inflation target had an electrifying impact on asset markets in Japan and led to an acceleration in both inflation and real GDP growth in 2013.
Keynesians like Krugman also predicted that fiscal austerity would slow economic growth. And yet, though the U.S. has been more austere than the eurozone, since 2011 economic growth has been much higher in the U.S. than the eurozone. The reason for the difference is clear: The European Central Bank had a more contractionary monetary policy, leading to below-target inflation, while the U.S. engaged in unconventional stimulus efforts such as quantitative easing and "forward guidance" (trying to shape market expectations through announcements of its near-future intentions), generating slightly higher inflation.
In early 2013, Krugman said the year would be a "test" of market monetarism, specifically the assumption that fiscal policy is ineffective because it is offset by monetary policy. Most Keynesians expected 2013 to see a sharp slowdown, due to fiscal austerity such as higher income taxes, a payroll tax increase of 2 percentage points, and lower government spending associated with the sequester that took effect in the spring of 2013.
In late 2012, the Fed pursued a third round of quantitative easing and aggressive forward guidance with the goal of offsetting the expected fiscal austerity. The results of this experiment could not be clearer. Contrary to the Keynesian prediction, GDP growth did not slow. Indeed, real GDP growth from the fourth quarter of 2012 to the fourth quarter of 2013 was nearly double the pace of the previous four quarters.
We've seen plenty of experiments in Switzerland, Japan, Britain, and the U.S., which clearly demonstrate that monetary policy can be effective at near-zero interest rates. This should have been no surprise, as even Franklin Roosevelt was able to stimulate the economy in 1933 by devaluing the dollar when interest rates were close to zero.
Market monetarists succeeded in our predictions because we take markets seriously. Conservatives who predicted higher inflation should have seen the low inflation forecasts in the financial markets as a warning sign that their models were flawed. Liberals who thought that monetary policy couldn't work at near-zero interest rates should have paid more attention to the fact that financial markets often rallied strongly on signals of monetary stimulus and fell sharply on signs that the stimulus would be cut back.
Market signals aren't perfect, but they're much better than the predictions of academic economists or government bureaucrats.
Scott Sumner (ssumner@bentley.edu) is a professor of economics at Bentley University.
The post Whatever Happened to Inflation? appeared first on Reason.com.
New Zealand, Canada, and the postwar United States all managed to slash the state on a grand scale. Governments shed responsibility for forests, railways, radio spectrum, and more while relaxing labor markets, slimming the welfare state, and ending price controls. Far from damaging economies or increasing unemployment, these reductions in the size and scope of government boosted GDP, improved services, and created jobs.
Government cutters faced opposition along the way, from skeptical Keynesians to Kiwi bureaucrats. But they also found unlikely allies, with left-wing parties playing major roles in the Canadian and New Zealand examples. The stories below should encourage would-be cutters and reassure skeptics: It can be done.
Turning Guns to Butter
How postwar America brought the boys home without bringing the economy down
Arnold Kling
When World War II ended in 1945, President Harry Truman faced a problem. Public opinion called for a rapid demobilization that would bring the boys home as soon as possible. But the Keynesians who were gaining prominence in the economics profession warned that a rapid decline in government spending and the size of the public work force would produce, in the late economist Paul Samuelson's words, "the greatest period of unemployment and dislocation which any economy has ever faced."
Thankfully, Truman ignored the Keynesians. Government spending plummeted by nearly two-thirds between 1945 and 1947, from $93 billion to $36.3 billion in nominal terms. If we apply the "multiplier" of 1.5 for government spending that is favored by Obama administration economists, that $56.7 billion plunge should have caused GDP to fall by roughly $85 billion, a decline of nearly 40 percent. In reality, GDP increased almost 10 percent during that period, from $223 billion in 1945 to $244.1 billion in 1947. This is a rare precedent of a large drop in government spending, so its economic consequences are important to understand.
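The back-of-the-envelope comparison can be reproduced in a few lines. This is only a sketch using the nominal figures cited above; the variable names are mine:

```python
# Keynesian prediction vs. actual outcome, 1945-47 (nominal billions of dollars).
spending_1945, spending_1947 = 93.0, 36.3
gdp_1945, gdp_1947 = 223.0, 244.1
multiplier = 1.5  # the multiplier the text attributes to Obama administration economists

spending_drop = spending_1945 - spending_1947               # $56.7 billion
predicted_gdp_drop = multiplier * spending_drop             # roughly $85 billion
predicted_fall_pct = 100 * predicted_gdp_drop / gdp_1945    # ~38% decline predicted
actual_growth_pct = 100 * (gdp_1947 - gdp_1945) / gdp_1945  # ~+9.5% actual

print(f"Predicted GDP change: -{predicted_fall_pct:.0f}%")
print(f"Actual GDP change: +{actual_growth_pct:.1f}%")
```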
The end of World War II thrust more than 10 million demobilized servicemen back into the labor market, but without the catastrophic consequences Keynesians feared. Close to 1 million took advantage of the GI Bill to attend college. In addition, some of the increase in the male work force was offset by a decline in female labor force participation from World War II levels. But if Rosie the Riveter became a housewife, many of her friends continued to work outside the home. Overall, from 1945 to 1947 the civilian labor force increased by 7 million, or 12 percent. The vast majority found work, as civilian employment rose by 5 million, an increase of 9 percent.
In addition to the demobilized servicemen, the federal government let go of more than a third of its civilian employees—over 1 million workers. Many of these civilians had been engaged in government attempts to manage the economy. As the economist Gary M. Anderson has pointed out in The Freeman, more than 150,000 people were employed by various wartime economic regulatory agencies, such as the War Production Board, the War Labor Board, the Office of Civilian Supply, and the Office of Price Administration.
With responsibilities that extended well beyond wartime production to include restrictions and controls on the civilian nonmilitary economy, those agencies and boards disbanded with great reluctance. The 1946 election, which gave Republicans a majority in the House of Representatives for the first time since 1930, prompted a change of heart, with Price Administrator Chester Bowles removing virtually all remaining price controls five days after the vote.
The conversion to a peacetime economy was a remarkable undertaking by the private sector. It did not merely involve converting wartime manufacturing to peacetime uses. For example, of the 2.8 million workers let go by the "other transportation equipment" sector between 1943 and 1948, when military vehicles were no longer needed, just half a million were absorbed by the civilian automobile industry. The big employment gains turned out not to be in manufacturing at all. The sectors that saw the most hiring were retail trade, services, contract construction, and wholesale trade, which together added nearly 4 million workers.
There are important differences between circumstances today and the circumstances in 1945, of course. Back then, federal spending was much larger as a share of GDP (40 percent, vs. less than 10 percent today), and government employment was a much larger share of the labor force than now (20 percent vs. 2 percent), so a more significant adjustment was required.
But there are other factors that make change more difficult today. During World War II, the personal savings rate climbed to more than 20 percent, so after the war households were able to offset the decline in government spending by consuming a larger share of their incomes. Today, with a savings rate of about 5 percent, households have much less room to expand. In addition, the skill requirements of today's industries make it more difficult to match workers with jobs than was the case in the much simpler economy of the 1940s.
Any way you look at it, though, America's experience from 1945 to 1947 demonstrates that the private sector is capable of overcoming a tremendous drop in government spending. As a percentage of GDP, the decrease in government purchases then was larger than would result from the total elimination of government today. While no one can be sure what would happen if the government were to shrink that quickly, the '40s boom offers a hopeful example.
Arnold Kling (arnoldsk@us.net) is a member of the Financial Markets Working Group at the Mercatus Center at George Mason University. He blogs at econlog.econlib.org.
The New Zealand Miracle
When left and right worked together on far-reaching market reforms
Maurice McTigue
In the early 1980s, New Zealand was on the fast track to bankruptcy. By 1984, when the conservative National Party called a snap election, the deficit was approaching a massive 9 percent of GDP with no budget in place. The government's share of GDP was 45 percent, unemployment was 9 percent (it would later peak at 11 percent), the top tax rate was 66 percent, and the rate of economic growth was a sluggish 2 percent.
A decade later, New Zealand had one of the most competitive economies in the developed world. The government's share of GDP had fallen to 27 percent, unemployment was a healthy 3 percent, and the top tax rate was 30 percent. The government went from 23 years of deficits to 17 years of surpluses and repaid most of the nation's debt.
This remarkable change was not only possible; it was fast and comparatively easy. The incoming Labour Party government paved the way in 1984 with its market-oriented approach to the economy; the National Party administration that took over in 1990 enthusiastically expanded the successful reforms. To solve deep economic problems, successive governments of New Zealand set out to eliminate the deficit, lower unemployment, and increase investment by shrinking the public sector, reforming or eliminating expensive programs, privatizing government enterprises, and reforming a burdensome regulatory process that was weakening our economy. Here's how it happened.
Privatization: From 1986 through the mid-1990s, New Zealand sold off airlines, airports, maritime ports, shipping lines, irrigation projects, radio spectrum, printing offices, insurance companies, banks, securities, mortgages, railways, bus services, hotels, farms, forests, and more. The capital released by privatization paid down the debt and was reinvested in higher priorities. In each of the services sold, productivity increased and costs went down; the country's competitiveness improved, and taxes poured into government coffers.
Rightsizing government agencies: After we eliminated those government functions, the bureaucracies that used to perform them were too large for their remaining tasks. So the civil service was reduced by 66 percent. Some agencies remained almost the same size, while others were reduced by 90 percent to 100 percent. After we privatized our forests, for example, only 17 of the Ministry of Forestry's former 17,000 employees were deemed necessary.
Cutting taxes: At the same time, we reformed the revenue system by eliminating capital gains taxes, inheritance taxes, luxury taxes, and excise duties and by allowing income to be taxed only once. We halved tax rates, eliminated all deductions that were not a cost of earning income, and created a system where one-third of revenue came from consumption taxes and two-thirds came from income taxes. Under the simplified system, about 65 percent of the population no longer had to file tax returns—a major selling point for reform.
Reforming the appropriations process: Before 1987, a government appropriation was simply a grant to spend on a specific activity. If money was appropriated to employment programs, for example, there was no expectation that a certain number of unemployed people would become employed as a result. With the State Sector Act of 1987 and subsequent laws, funding was linked directly to results. Agency heads were now CEOs, chosen for capability. They received fixed-term contracts: five years with a possible three-year extension. The only grounds for removal was nonperformance, so a newly elected government couldn't replace department heads with its own people. In the new appropriations process, these CEOs signed a purchase contract identifying exactly what was to be produced for the money allocated.
Consider this case from one of my own portfolios, the Ministry for Employment. Under the old system, a total of $60 million was appropriated in 1989 to 34 different programs, which found jobs for 40,000 clients. After 1990, under the new purchase agreements, the same amount was allocated to just four programs; the other 30 were terminated. The contract required the ministry to place 120,000 people in jobs, with 56 percent of that figure drawn from the long-term unemployed, 25 percent from the Maori, 14 percent from people with disabilities, and 7 percent from people with social and drug or alcohol dependence problems. The ministry successfully placed 135,000 people in jobs that year.
Welfare policy: We now aimed not just to shrink the state but to reduce the number of people dependent upon it. Welfare had evolved into an entitlement to state support for social circumstances such as solo parenting, unemployment, health problems, and other forms of dependency. Under the new approach, resources were devoted to resolving health problems, education problems, social problems, and poor work skills. This approach moved 300 percent more people from dependency to independent living. It was a carrot-and-stick approach, though: If work was available and a citizen was capable of doing the job, he or she had to take it or forgo benefits.
As the New Zealand example demonstrates, it is possible to meet a crisis with serious, substantial reforms. All it requires is a good plan and strong political leadership.
Maurice McTigue (mmctigue@gmu.edu) helped guide New Zealand's reforms as a member of parliament, cabinet minister, and ambassador. He is currently the director of the Government Accountability Project at the Mercatus Center at George Mason University.
If Canada Can Do It…
Slashing the state in the Great White North
David R. Henderson
In 1994 government debt was 68 percent of Canada's GDP. By 2008 that number was down to 29 percent. Finance Minister Paul Martin Jr. and Prime Minister Jean Chrétien, both of the Liberal Party, are the two unlikely stars in this heroic tale of fiscal discipline.
Paul Martin's father was also known as "the father of Medicare," Canada's federally mandated single-payer health care plan, so revered by American liberals. Chrétien was the political heir of Pierre Trudeau, prime minister of Canada for all but nine months between 1968 and 1984, who bore a great deal of the responsibility for accumulating all that debt to begin with.
In the 1993 election, the Liberal Party promised to reduce the deficit. Almost unbelievably, its candidates actually kept this pledge after winning office. Chrétien and Martin accomplished the task with large budget cuts—not cuts in the growth of spending, but cuts in nominal dollars spent—and only small increases in taxes.
In assembling these cuts, Paul Martin didn't follow the usual pattern of consulting interest groups one by one. Instead, he held four televised regional consultations in which various lobbyists, experts, and ordinary citizens contended with one another. Martin also spoke directly to the public about what was needed to turn Canada's budget around. In October 1994, his Department of Finance published a report, A New Framework for Economic Policy, showing that in order to keep the ratio of debt to GDP from rising, government had to run a substantial surplus on its program budget—that is, have revenues significantly exceeding state expenditures.
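The condition Martin's report described—a program surplus large enough to keep the debt ratio from rising—can be written down with the standard debt-dynamics identity. This is a textbook formula, not drawn from A New Framework for Economic Policy, and the interest and growth rates below are hypothetical illustrations:

```python
# Standard debt dynamics: next year's debt-to-GDP ratio equals the current
# ratio scaled by (1 + r)/(1 + g), minus the primary surplus, where r is the
# nominal interest rate on debt and g is nominal GDP growth.
def debt_ratio_next(debt_ratio, r, g, primary_surplus):
    return debt_ratio * (1 + r) / (1 + g) - primary_surplus

def stabilizing_primary_surplus(debt_ratio, r, g):
    # Primary surplus (share of GDP) that holds the debt ratio constant.
    return debt_ratio * (r - g) / (1 + g)

# Illustrative: Canada's 68% debt ratio, with hypothetical r = 8%, g = 5%.
pb = stabilizing_primary_surplus(0.68, 0.08, 0.05)
print(f"Required primary surplus: {pb:.1%} of GDP")
```

With interest rates above nominal growth, as in mid-1990s Canada, the formula shows why merely balancing the program budget was not enough: a genuine surplus was required.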
Martin and Chrétien enforced discipline on other cabinet members with a zero-sum ground rule: If a cabinet member wanted a smaller cut in one program, he had to propose a bigger cut in another.
Martin's 1995 budget remains a shining example of how to deliver on promises of aggressive fiscal discipline. Government spending didn't just grow more slowly; it actually shrank. Spending on programs (as opposed to debt service) was lower in dollar terms, and therefore even lower when adjusted for inflation, than spending in 1993–94. Indeed, program spending was lower as a percentage of GDP than it had been at any time since 1951. The 1995 budget also privatized a number of government corporations, including a railway, a uranium company, and the air traffic control system; and it tightened Canada's unemployment insurance program.
In this and later budgets, Martin used conservative assumptions to make sure he achieved his goals come "hell or high water," an expression he used so often it became the title of his autobiography. Because his assumptions often turned out to be too pessimistic—a refreshing change from the usual budgetary wishful thinking—the ratio of debt to GDP fell even faster than projected. Martin also had a "no-deficit rule": Once he had managed to get rid of the deficit, he pledged to avoid future deficits by keeping spending in check.
From 1992–93 to 2000–01, Canadian spending on federal programs fell from 17.5 percent of GDP to 11.3 percent. The Canadian economist Thomas Courchene notes that the latter figure was the lowest "in more than half a century."
Not everything Martin did would meet with universal libertarian approval. Although there were six to seven dollars in budget cuts for every dollar of tax hikes, he did raise taxes. Virtually all of these increases were announced in the 1994 and 1995 budgets, and most were nickel-and-dime stuff: a reduction in the deductibility of meal and entertainment expenses, elimination of the $100,000 capital-gains tax exemption that a taxpayer could claim cumulatively over a lifetime, a 5.7-cent-per-gallon increase in the gasoline tax, a reduction of the upper limit on deductible contributions to Registered Retirement Savings Plans (Canada's version of a deductible IRA), an increase in the corporate income tax rate from 39.14 percent to 39.52 percent, and a few others.
Martin did not raise individual income tax rates. He did, however, increase the degree of means testing for various federal benefits. Within some income ranges, benefits ended up falling for every additional dollar of income, so the implicit marginal tax rate was several percentage points higher than the explicit rate.
As a result of this fiscal discipline, in every year between 1997 and 2008 Canada's federal government ran a budget surplus. In one fiscal year, 2000–01, its surplus was a whopping 1.8 percent of GDP. If the U.S. government had such a surplus today, it would amount to a cool $263 billion rather than the current deficit of more than $1.5 trillion.
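The dollar comparison above is easy to check. The only assumption here is that the quoted $263 billion is 1.8 percent of U.S. GDP, which implies a GDP of about $14.6 trillion:

```python
# If a surplus of 1.8% of GDP equals $263 billion, the implied U.S. GDP is:
implied_gdp = 263e9 / 0.018
print(f"Implied U.S. GDP: ${implied_gdp / 1e12:.1f} trillion")
```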
Chrétien and Martin's efforts were so successful that in 2000 they reduced the corporate tax rate by seven percentage points, cut income taxes, decreased the amount of capital gains subject to taxation, and increased the contribution limit for retirement accounts. Through most of this time period, Canada's economy boomed, thus belying the Keynesian view that large budget cuts reduce economic growth. If the Canadians can do it, maybe, just maybe, we can too.
Contributing Editor David R. Henderson (davidrhenderson1950@gmail.com), a former Canadian, is an associate professor of economics in the Graduate School of Business and Public Policy at the Naval Postgraduate School in Monterey, California, and a research fellow with the Hoover Institution at Stanford University.
The post It Can Happen Here appeared first on Reason.com.
But before we jump on board the anti-Merck bandwagon, we should realize the stakes involved. If the courts, the FDA, and Congress do not handle the Merck case judiciously, there is great danger that the real victims of this case will include many people reading this article. And they will be victims, not of Merck, but of an overly zealous and out-of-control legal and political system.
The facts are these: (1) Vioxx, like almost all drugs, carries its own risks but also has substantial benefits for many patients; (2) some members of the media, CBS in particular, have distorted the facts about Vioxx and about Merck; and (3) there is no evidence that Merck broke any law or even that it acted in any way irresponsibly. Let's take these points in turn:
Risks and Tradeoffs
In an October 5, 2004, Wall Street Journal article entitled "Despite Vioxx Withdrawal, The Benefits of Medicines Can Outweigh the Risks," Andrea Petersen writes:
Even in the case of Vioxx, the likelihood that an individual taking the drug would have a heart attack or stroke is relatively small. In a study of 2,600 patients, there were 15 heart attacks or strokes per 1,000 patients among those who took Vioxx for longer than 18 months. Among those who took a placebo, there were 7.5 heart attacks or strokes per 1,000 patients. And those who took Vioxx for less than 18 months had no increased cardiovascular risk.
In other words, the increased risk of heart attack or stroke from taking Vioxx rather than a placebo was only an additional 7.5 per 1,000, or less than 1 in 100. The study cited by Petersen was the so-called APPROVe study, designed to test whether Vioxx could help prevent the recurrence of colon polyps; it was this study that led Merck to withdraw Vioxx from the market.
You might argue, contrary to Petersen, that an almost one in 100 increased risk of a heart attack or stroke is fairly substantial. However, we're not talking only about the worst kind of stroke or heart attack. Most strokes and heart attacks do not cause death. Consider the results of Merck's early VIGOR study of Vioxx. VIGOR is the acronym for Vioxx Gastrointestinal Outcomes Research. In June 2000, Merck submitted the findings of this research to the Food and Drug Administration. It was a comparison of results for 4,047 patients given Vioxx and 4,029 given naproxen (Aleve). The results: five patients on Vioxx died of heart attacks and four on naproxen died of heart attacks. In other words, there was one extra death from a heart attack out of 4,000 people on Vioxx. This does seem like a small risk. Making the risk of actual usage even smaller is the fact that all the results were at double the maximum recommended daily dose for Vioxx.
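The excess-risk figures from the two trials reduce to a couple of lines of arithmetic, using exactly the counts quoted above:

```python
# APPROVe: heart attacks or strokes per 1,000 patients, beyond 18 months of use.
approve_excess = (15 - 7.5) / 1000   # 0.0075 -- less than 1 in 100
# VIGOR: fatal heart attacks, 5 of 4,047 on Vioxx vs. 4 of 4,029 on naproxen.
vigor_excess = 5 / 4047 - 4 / 4029   # roughly 1 extra death per 4,000 patients

print(f"APPROVe excess risk: {approve_excess:.2%}")
print(f"VIGOR excess fatal heart-attack risk: about 1 in {1 / vigor_excess:,.0f}")
```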
Of course, some people might judge the extra 1 in 4,000 chance of a fatal heart attack as being too big a risk to take. That's fine. No one is forcing people who make that judgment to take Vioxx. The great virtue of freedom to choose is that it allows people to make their own tradeoffs between risk and other things. And one of those other things is pain from arthritis. Which brings us to the tradeoff between risk and pain reduction.
Consider Dave Ellis, a 66-year-old retired pharmacist from Edmond, Oklahoma. For 30 years, he has had degenerative arthritis in his spine. By about mid-February, he will have run through his supply of Vioxx and will not be able to buy more. He tells The Wall Street Journal he "dreads" that day. As Hayes Wilson, a rheumatologist in Atlanta, put it in a December 21 Journal article, "If you live with intractable pain every day of your life, would you take a chance that you would have a heart attack? A lot of my patients would."
So here we have a number of patients who regret that they weren't simply told of the risks and allowed to choose for themselves. And who will argue plausibly that a pharmacist is uninformed about pharmaceuticals?
It's true that no government agency has taken Vioxx off the market. Merck chose to do that. But they made that choice under pressure. Given the huge benefits of Vioxx to a substantial subset of the people taking it, a company unconcerned about lawsuits would have continued to sell it, along with the warning. In this country at this time, however, companies have to be concerned about lawsuits, even from those who have been amply warned. So the danger of lawsuits from people who are warned about the risks is preventing people willing to take those risks from having a drug that would make their lives much better.
Let's not forget why there was a demand for a drug like Vioxx in the first place. It's true that drugs like Aleve and aspirin can reduce the pain from arthritis. But these drugs come with their own side effects: aspirin, Advil, and Aleve, all of which are non-steroidal anti-inflammatory drugs (NSAIDs), can cause gastrointestinal (GI) bleeding. The rate of major GI problems typically seen in clinical trials for NSAIDs like aspirin is two to four percent of patients treated for a year. One researcher reported that at least 6,000 U.S. deaths per year result from NSAID-induced GI bleeding. Other experts have estimated that NSAID-induced GI complications result in 16,500 deaths and more than 100,000 hospitalizations per year in the U.S.
Vioxx, which is certified by the FDA to have GI-protective effects, has produced fantastic results for those most vulnerable to the side effects of NSAIDs. Vioxx was approved by the FDA for the treatment of osteoarthritis, rheumatoid arthritis, acute pain, dysmenorrhea, and migraines. Even at twice the maximum recommended daily dose, Vioxx showed a 57 percent lower rate of serious GI bleeding than the rate for NSAIDs.
Misleading Media
CBS' 60 Minutes starts its November 14 segment on Vioxx with the story of a healthy, beautiful 39-year-old, Janet Huggins, who died within a month of starting to take Vioxx. CBS goes on to say:
Instead, Merck found something potentially worse: Patients taking Vioxx for longer than 18 months were twice as likely to suffer a heart attack or stroke than those taking a placebo.
CBS then shows Huggins' husband saying that he's sure that Vioxx killed his wife. But 18 months is not one month. Merck has solid evidence that Vioxx shows no additional cardiovascular risk until after 18 months of therapy. The odds that Mrs. Huggins died from Vioxx are slim indeed. But that didn't stop CBS from exploiting her death to make a point.
Merck's Responsibility
Of course, our view of Merck might change if we had reason to think that Merck withheld evidence of Vioxx's harmful side effects. But so far we have no basis for thinking Merck did withhold evidence and, in fact, we have a strong basis for believing that it didn't.
First, let's consider the easy case. No one seems to think that Merck moved too slowly in withdrawing Vioxx within four days of getting the results from the aforementioned APPROVe study. So the issue comes down to whether Merck withheld information from earlier studies.
The closest anyone has come to a "smoking gun" is an email about the VIGOR study that Merck's chief of research, Edward Scolnick, sent to his colleagues. Here's how The Wall Street Journal put it:
On March 9, 2000, the company's powerful research chief, Edward Scolnick, e-mailed colleagues that the cardiovascular events "are clearly there" and called it a "shame." He compared Vioxx to other drugs with known side effects and wrote, "there is always a hazard." (Anna Wilde Mathews and Barbara Martinez, "E-Mails Suggest Merck Knew Vioxx's Dangers at Early Stage," Wall Street Journal, November 1, 2004.)
In that same e-mail, Scolnick lamented, "it is a shame but it is a low incidence and it is mechanism based as we worried it was." In other words, Scolnick was saying that, contrary to what Merck researchers had believed, the higher incidence of heart attacks was due not to the fact that people taking Vioxx weren't taking NSAIDs but rather to something within Vioxx itself. This does sound like a cover-up. But notice something interesting in the time line that some anti-Vioxx lawyers put together. The next event they note after the March 9 email is that on March 17, only eight days after Scolnick's speculation, Merck updated its label by adding, in the "adverse events" section, "cardiovascular" reports. Some cover-up.
60 Minutes breathlessly added:
However, according to internal Merck documents 60 Minutes has seen, and interviews with outside scientists, Merck had concerns that Vioxx could possibly cause cardiovascular risks long before it was pulled off the market.
Of course they had these concerns. Those risks were known from the VIGOR trial, which was completed in March 2000, and whose results were reported to the FDA in June 2000.
Beginning in September 2002, this new information was printed on the Vioxx package insert. It was reprinted, shortly after that, in the widely used 2003 version of the Physicians' Desk Reference. The package insert devotes two full tables and a number of paragraphs to the risk of cardiovascular problems. The package insert shows that 45 of 4,047 patients on Vioxx experienced a cardiovascular thrombotic event, compared to only 19 of 4,029 patients on naproxen. The package insert goes on to say that Vioxx is not a substitute for aspirin and that patients who need cardiovascular protection should continue taking aspirin. It also notes the additional one in 4,000 risk of a fatal heart attack.
The real test of whether Merck was responsible is a market test: After this information was revealed to the public in September 2002, did the demand for Vioxx decline? If it didn't decline substantially, that suggests that people taking Vioxx were not very concerned about this small additional risk. You might argue that people are not informed enough to know about this risk. But their doctors are, and doctors, in the U.S. at least, write all the prescriptions for Vioxx. Doctors regularly page through the Physicians' Desk Reference, and word on Vioxx would have been out by early 2003. Yet worldwide demand for Vioxx grew slightly while U.S. demand declined slightly. The U.S. decline of five percent between the second quarter of 2003 and the second quarter of 2004 was so small that it could have been due to a number of other factors, including competition from similar drugs such as Celebrex. (Celebrex has also been linked to an increased risk of cardiovascular problems, but the drug's manufacturer, Pfizer Inc., has chosen not to withdraw it from the market.)
There's another test of Merck's responsibility or lack of same: Look at the behavior of Merck employees, especially those intimately involved with the VIGOR study. After learning of the VIGOR results, did those employees who took Vioxx stop taking it? We don't know, but those data are relevant, and Merck would do well to gather them for its legal defense. And we do know one data point. Merck's CEO, Ray Gilmartin, told a congressional committee that his wife took Vioxx until the day the company withdrew the drug. Unless we speculate that he was out to kill his wife slowly, and was willing to accept a low probability of succeeding, his and her behavior are powerful evidence that Merck was selling a drug it considered relatively safe.
Two final pieces of evidence come from the behavior of the FDA itself. First, the FDA is exceptionally conservative. It is acutely aware that of the two ways it can fail, approving a bad drug is significantly worse for its employees than failing to approve a good drug. Approving a bad drug may kill or otherwise harm patients, and an investigation of the approval process will lead to finger pointing. As one former FDA employee, Henry Miller, put it, "This kind of mistake is highly visible and has immediate consequences—the media pounces, the public denounces, and Congress pronounces." Given this, it seems highly unlikely that the FDA suspected any serious cardiovascular problems with Vioxx, even after it studied the VIGOR results.
The second piece of evidence is the current behavior of the FDA. No one likes to be accused of screwing up, especially if the accusation is false. If Merck had withheld important information about Vioxx from the FDA, then officials in the FDA would now have a strong incentive to lay the blame on Merck. But they aren't doing so. And it's presumably because they can't, because they were fully informed and saw the risks as being fairly small.
Why does this matter to anyone other than Vioxx patients, Merck stockholders, and trial lawyers? For two main reasons. First is the issue of simple justice. Merck is one of the most cautious and benevolent drug companies in the business. Merck has a long history of innovative discoveries that have improved human health around the world. In a huge breakthrough in 1989, Merck researchers determined the three-dimensional structure of HIV protease at a time when AIDS was much scarier. They could have used this tremendous competitive advantage, but they chose to make it publicly available. Every day that this information was withheld, they reasoned, was one more day for the AIDS epidemic to spread.
The second reason for caring about the Vioxx case is the issue of long-term consequences. The FDA already has incentives to turn down drugs that are beneficial if they have reason to expect any negative side effects. And the greatest scandal of all, even if people's worst fears about Vioxx are true, is not Vioxx but the dozens of new drugs that will never be developed because the costs of clearing the FDA's superconservative hurdles make those drugs not worth developing. One of the authors of this article already informs drug companies that some of the medicines they are developing don't make economic sense. Add a few years and a few hundred million dollars to the development timeline and the economics become that much worse. Emory University economist Paul Rubin reports an estimate that FDA regulation saves between 500 and 1,000 injuries (not deaths) per year but costs between 2,100 and 12,000 lives annually. That's not a good tradeoff. The last thing we need is further hurdles to new drug development.
The post Clear Thinking About Vioxx appeared first on Reason.com.
Natural Capitalism, by Paul Hawken, Amory Lovins, and L. Hunter Lovins, brings some real news to the authors' presumed audience of hard-core environmentalists. Hawken is an environmentalist and the author of The Ecology of Commerce; the Lovinses are co-founders of the Rocky Mountain Institute, a non-profit consultancy that advises firms on saving resources. Perhaps it's not as entertaining as a two-headed Marilyn Monroe clone marrying Michael Jackson, but to many greens who associate economic growth with shrinking environmental quality, Natural Capitalism's thesis might be even more shocking than the tabloids: Economic growth needn't coincide with environmental degradation.
The kind of industrial activity that upsets environmentalists involves the wasteful use of huge amounts of natural materials to create products that then dump pollutants back on Mother Nature. Do you see the hidden solution? The authors do, and that's what they call "natural capitalism," though the adjective isn't necessary—it's exactly the same kind of capitalism that free-market economists have been pushing for about 200 years.
Capitalism—the search for profits through making and selling in free markets—should move us toward both a healthy economy and clean air and water. Why? Because pollution and waste are inefficient and expensive. Is your factory polluting the air? You are wasting money. Polluting the water? You are wasting money. Using too much energy? Still wasting money. Add in the property rights of those downstream and the link between capitalism and environmentalism becomes still clearer. Factor in the true value of the clean air and water that nature produces each and every day for "free," and it becomes obvious that we will be richer with a cleaner environment.
This is where the greedy capitalists come in. Money can be made preventing the waste that causes resource depletion and pollution. Entrepreneurs clever enough to tap into this massive potential will make themselves and their companies wealthy. The government needn't order anyone to act.
That three authors generally associated with the environmental movement now realize capitalism can help save the environment is news indeed, and fairly good news for much of this book. Natural Capitalism's greatest virtues are in pointing out the tremendous waste caused by many of our activities and in laying out alternate, less-wasteful ways of doing things. Some of this waste, they note, is due to perverse government meddling. Interestingly—and more controversially—they attribute some of it to businesses' failure to maximize profits.
The authors devote a handful of chapters to stories of entrepreneurs who saved huge amounts of resources by thinking clearly at the start of investment projects. In example after example, they tell how various businesses came up with new ways to save energy and other wasted resources, with nice results for the bottom line. They also point out, disturbingly, that many businesses insist on about a 1.9-year payback for investment in energy savings, which is another way of saying that they insist on a rate of return of more than 50 percent. The necessary consequence is that many very profitable energy-saving investments are not being made.
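The payback-to-return arithmetic here is worth making explicit. A quick sketch (the dollar figures are hypothetical; only the 1.9-year payback threshold comes from the book):

```python
# Hypothetical project: the only number taken from the book is the
# 1.9-year payback threshold many businesses reportedly demand.
investment = 100_000.0
annual_savings = investment / 1.9  # savings implied by a 1.9-year payback

payback_years = investment / annual_savings
simple_annual_return = annual_savings / investment  # ignores discounting

print(f"payback: {payback_years:.1f} years")
print(f"implied simple return: {simple_annual_return:.1%}")  # about 52.6% a year
```

Even ignoring discounting, any project that clears a 1.9-year hurdle earns better than 50 percent a year, which is the reviewers' point about the profitable investments left on the table.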
Even a free-market economy isn't a perfectly efficient world where all opportunities are grasped and acted upon instantaneously; sometimes businesses make mistakes because they are stuck in a mind-set that, say, puts a higher threshold payback period on energy savings than on other investments. This doesn't mean government needs to step in—time and competition usually guarantee that profit opportunities will eventually be seized—but it does mean that a certain amount of educating is necessary.
One of the best chapters, "Aqueous Solutions," documents how water is wasted. The authors note that people in industrialized countries "rely on the highest-quality water for every task, flushing toilets and washing driveways with drinking water." They write that we also "build big dams and water projects by reflex, rather than asking what's the best solution and the right size for the job."
What they write about toilets is so compelling that I haven't been able to flush a toilet since without thinking about how absurdly wasteful they are. "Many sanitation experts have come to view [the toilet] as one of the stupidest technologies of all time: In an effort to make waste products 'invisible,' it mixes pathogen-bearing feces with relatively clean urine. Then it dilutes that slurry with about 100 times its volume in drinking water and further mixes the mess with industrial toxins in the sewer system, thus turning 'an excellent fertilizer and soil conditioner' into a serious, far-reaching, and dispersed disposal problem."
They don't mention, though, that government bears much responsibility for our sewer system. University of Alabama historian David T. Beito has written that, as late as 1880, more than two-thirds of urban U.S. households used "privy vaults," brick-lined holes in the ground that were emptied by companies that sold the wastes to farmers for manure. Government-sponsored sewage systems changed all that. The authors do note, however, that two-compartment toilets in Sweden separate the feces from the nutrient-rich urine, allowing both to be sold for fertilizer.
In writing about cars, the authors think the market could make cars much more fuel-efficient without making them any less safe. "Blending today's best technologies," they write, "can yield a family sedan, sport-utility, or pickup truck that combines Lexus comfort and refinement, Mercedes stiffness, Volvo safety, BMW acceleration, Taurus price, four- to eight-fold improved fuel economy (that is, 80 to 200 miles per gallon), a 600 to 800 mile range between refuelings, and zero emissions." Although their claims may sound far out, the Lovinses' Rocky Mountain Institute was early in seeing the "hybrid" car—running on both an internal combustion engine and a battery—as the car of the future. Honda's recent introduction of such a car suggests the Lovinses may be right.
The authors also welcome an expansion of capitalist markets into the world of air pollution, specifically a market in greenhouse-gas emissions quotas, with emissions capped internationally along the lines of the 1997 Kyoto treaty. That way, people who want to emit more than their quota could buy emission rights from those who use less. That would give emitters a profit incentive to find ways to emit less: anyone who comes in under the legal limit can sell the unused permits.
But the authors undercut the need for either the Kyoto treaty or emissions credit markets by arguing that, quotas aside, producers will find it profitable to cut carbon emissions below the Kyoto-treaty level. So, for example, if the treaty would place a 100-unit limit on emissions, they argue that companies on their own would want to emit only, say, 80 units. If this is so, even those who fear global warming can comfortably oppose the Kyoto treaty. Who needs a treaty to stop people from doing things they wouldn't be doing anyway? And the value of emissions credits would also be zero if they are correct, making the market pointless—if people want to emit less than is allowed, they will pay nothing for the right to emit more.
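That pricing logic can be put in a few lines of toy code (a hypothetical model, not anything from the book): if firms already want to emit less than the cap, permits are in excess supply and trade at zero.

```python
def permit_price(cap, desired_emissions, marginal_abatement_cost):
    """Toy permit market. If desired emissions fall below the cap,
    nobody needs to buy the right to emit more, so the price is zero.
    Otherwise the price settles at the cost of abating down to the cap."""
    excess = desired_emissions - cap
    if excess <= 0:
        return 0.0
    return marginal_abatement_cost(excess)

# A Kyoto-style cap of 100 units when firms on their own want only 80:
print(permit_price(100, 80, lambda units: 5.0 * units))   # 0.0
# If firms instead wanted 120 units, permits would carry a price:
print(permit_price(100, 120, lambda units: 5.0 * units))  # 100.0
```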
Despite its virtues, this is not a great book, largely because the authors deeply misunderstand some key views of mainstream economic thinkers and of what they call "conventional capitalism." This misunderstanding prevents them from seeing just how many natural allies they have among mainstream and free-market economists. They write, for example, that the economic theory analyzing the past 200 years of economic growth is "based on the fallacy that natural and human capital have little value as compared to final output." That would have surprised the late Theodore Schultz and would shock Gary Becker, both strong advocates of free markets and both winners of the Nobel Prize in economics. Schultz and Becker developed the concept of "human capital"—capital produced by investing in knowledge and residing in people's heads. Hawken and the Lovinses' ignorance of economics shows itself perhaps most clearly in their discussion of the gasoline lines that followed the 1973 Arab oil embargo: They don't mention that such lines would not have existed if the U.S. government hadn't used price controls to keep gas prices below the market-clearing level.
The authors also think that "natural capitalism" is superior to "conventional capitalism" because the former treats nature as if it's valuable. But in conventional capitalism, much of nature is treated as valuable. If I own a farm, I care about how the addition of various pesticides and fertilizers affects the value of my land. I might not care about what the pesticides and fertilizers do to the surrounding rivers, but that's because no one has well-defined property rights in those rivers. If they had, they could hold me liable for the damage I do.
The book's authors are right that we often don't take account of the damage we do to nature, but that's because private property is not complete enough. No one "owns" nature, though as technologies of demarcation improve (the way that the invention of barbed-wire fences made it practicable to own and manage cows and land on the wide-open range), we can expect to find private property spread in areas in which it is currently absent, like water and air sheds.
If people owned nature, we wouldn't treat it nearly as badly as we sometimes do now. If I owned the water off Pacific Grove, California, to take a recent controversy in my town, I would never let the local government spill sewage into it the way they do now, or at least I would charge the government enough to compensate for the damage. Economists at the Political Economy Research Center in Bozeman, Montana, and other environmentally conscious "conventional capitalists" have written hundreds of articles and books extending these insights. The authors of Natural Capitalism, not well-versed in the body of thought known as "free market environmentalism," never mention this work of people who are truly their allies.
They are also mistaken about the level of government subsidy involved in transportation. They argue that transportation "is the most subsidized and centrally planned sector of the majority of the world's economies—at least for such favored modes as road transport and aviation." We do have a centrally planned (the old word was socialist) system of roads, airports, and air traffic control, as does virtually every other country. But it's certainly not one of the most subsidized. Drivers pay, in the form of gasoline taxes, for the roads they use, and not all the taxes go to build roads—a substantial fraction now goes to subsidize the buses and subways of municipal transit.
Interestingly, the authors admit that roads are not on net subsidized, writing, "In fairness, the federal subsidies to the U.S. oil industry, unlike other fossil-fuel industries, are approximately offset by federal excise taxes collected on its retail products." They confine this admission, which undercuts their whole case, to a note at the end of the book. That's too bad, because they could have made the point that even though there is no net subsidy to roads, roads are probably not built where they should be and are over-congested because government control means that the price system is not allowed to direct resources. Government rarely charges people for their specific choices as to what roads to drive on and when, and almost never prices them higher at peak times of the day.
The authors realize, though, that much urban sprawl is caused by government zoning regulations that prevent "clustering" of housing, jobs, and shopping, though they ignore Jane Jacobs' pioneering work on that issue in her modern classic, The Death and Life of Great American Cities. They also note that widespread building regulations requiring developers to provide parking for each new shop, office, and apartment effectively subsidize drivers.
Despite its flaws, Natural Capitalism does provide some good arguments for a good message, even if the authors might not fully agree with all the implications of their analysis. Hawken and the Lovinses are essentially saying: Get rid of government restrictions that prevent us from using resources efficiently, tear down socialism in our transportation sector, and, if you're a businessperson who wants to be environmentally conscious, pay close attention to all available profit opportunities, even if they seem new. Then go out and get rich.
David R. Henderson (drhend@mbay.net) is a research fellow with the Hoover Institution and an economics professor at the Naval Postgraduate School in Monterey, California. From 1983-84, he served as the senior economist for energy policy on President Reagan's Council of Economic Advisers.
The post Save the Environment, and Get Rich appeared first on Reason.com.
Paul Krugman, an economics professor at MIT, is probably the country's most famous economist under the age of 50. In addition to knocking out interesting technical articles, mainly on international economics, Krugman writes highly readable articles and books for a general audience. His latest popular book is The Return of Depression Economics, which displays his familiar strengths and weaknesses.
Krugman's main strength is his ability to weave a complex yet understandable story about recent economic events. His main weakness, which becomes more apparent the more you read him, is his tendency to leave out important facts that undercut his view of the world. A more glaring weakness is his sarcasm and fondness for putdowns, which make it easier for him to dismiss rather than refute thoughtful people with different views.
The title of this book illustrates another Krugman weakness: his tendency, which has gotten stronger as he has gotten more famous, to exaggerate. When I think of a depression, I think of a substantial decline in an economy's output: At the trough of America's Great Depression in the early 1930s, output was 30 percent below its 1929 peak. By Krugman's own admission, Japan, the only big economy in the world that has suffered a prolonged slump in the 1990s, has had only two years of low single-digit decline in real gross domestic product.
One of Krugman's strengths is his talent for analogies that help readers understand economic reasoning. He starts with an analogy, developed by Joan and Richard Sweeney in the February 1977 Journal of Money, Credit, and Banking, between a baby-sitting co-op in Washington, D.C., and a large, complex economy.
In the co-op, people earned one half-hour coupon by providing one half-hour of baby-sitting services. At one point there was too little scrip in circulation. Couples would try to earn more scrip by offering to baby-sit. But there weren't enough other couples wanting to go out, because they didn't want to run down their inventory of coupons, and so there was an excess supply of baby-sitting services. This baby-sitting economy with about 150 couples was in a recession.
As Krugman points out, this situation is analogous to an excess supply of goods in a real economy, and thus, he concludes, it shouldn't be hard for people to accept that a much more complex economy in which millions of goods and services are exchanged can suffer from lack of demand. The solution that the baby-sitting co-op finally accepted was to print more coupons, and it worked. This is like printing money to get an economy out of a recession, which also usually works.
But nowhere does Krugman mention that another way to solve a problem of excess supply is to let prices fall. This missing piece is interesting, given that it was explicitly discussed in the article from which Krugman draws the analogy. The Sweeneys pointed out that because the founders of the co-op economy imposed price controls, decreeing that one unit of scrip must always exchange for a half-hour of baby-sitting services, there would be shortages when demand was too high and surpluses when it was too low. It's not surprising that Krugman left this out: He seems to be biased in favor of having government step in rather than letting markets work things out.
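The Sweeneys' price-control point generalizes neatly: with the scrip price fixed by decree, any gap between quantities demanded and supplied cannot close through price. A minimal sketch (the quantities are hypothetical):

```python
def market_outcome(qty_demanded, qty_supplied):
    """With the price fixed by decree, quantity gaps persist as
    shortages or surpluses instead of being priced away."""
    if qty_demanded > qty_supplied:
        return ("shortage", qty_demanded - qty_supplied)
    if qty_supplied > qty_demanded:
        return ("surplus", qty_supplied - qty_demanded)
    return ("cleared", 0)

# Too little scrip in circulation: few couples want to go out,
# many want to earn coupons by sitting.
print(market_outcome(qty_demanded=40, qty_supplied=90))  # ('surplus', 50)
```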
Krugman explains how various countries with previously strong economies have gotten into a mess in the last few years, often prodded to do so by the International Monetary Fund. Take Brazil. Krugman notes that Brazil's economy was starting to do fairly well by 1998 as inflation was cut to moderate levels. But when speculators started expecting Brazil's currency, the real, to fall, Brazil asked for the IMF's help in defending the real's value. Harvard economist Jeffrey Sachs has pointed out that the Brazilian government could have instead let the real's value fall, and then Brazil probably would have ended up like Australia, with a currency worth substantially less but with no major economic crisis. Krugman echoes that thought.
Instead, the IMF demanded that Brazil's government keep interest rates high. The IMF also wanted Brazil to reduce its budget deficit by raising taxes and cutting spending. As Krugman notes, the main cause of future budget deficits would be the policies the IMF wanted: Higher tax rates would depress the economy, reducing tax revenues, and higher interest rates on the public debt would increase government spending. Out of nowhere, though, Krugman advocates that Third World governments impose exchange controls, which, he admits, lead to corruption and stagnation.
One of the biggest stories in Asian economies is Japan's slow-to-negative growth in the 1990s. Krugman attributes this anemic record to lack of growth in the money supply and claims that the solution is to expand the money supply in order to cause inflation. Mild inflation, notes Krugman, would motivate Japan's consumers to spend more, and more consumer spending, he claims, would boost Japan's economy. He is probably right. Interestingly, though, he doesn't consider some of the other possible causes of Japan's troubles, such as the heavy taxes on land and on capital gains that were imposed in the late 1980s and early '90s with the explicit purpose of "breaking the bubble," i.e., reducing the value of land and of Japanese stocks. Isn't it possible that the taxes worked, and that's a major reason for Japan's poor performance in the '90s?
Krugman correctly criticizes some of his fellow economists for claiming, when they are surprised by an economic change, that it was obvious all along that the change would occur. Unfortunately, he commits the same sin. In his 1990 book The Age of Diminished Expectations, Krugman wrote, "By the year 2000…Japan will have a GNP that in dollar terms is 80% or more of the U.S. level." In fact, Japan's real GNP is less than 65 percent of U.S. GNP. Krugman overestimated by more than 20 percent. Yet here's what he says about Japan's economy in his latest book: "The weaknesses of the system were actually evident by the late 1980s, to anyone willing to see."
Whenever Krugman discusses supply-side economics, he goes ape, and his latest book is no exception. He says "the specific set of silly ideas that has laid claim to the name 'supply-side economics' is a crank doctrine, which would have little influence if it did not appeal to the prejudices of editors and wealthy men." In this one sentence are displayed three weaknesses that can be seen again and again in Krugman's writing.
First, note the ad hominem tactic. Krugman attacks supply-side economics by attacking the motives and objectivity of some of those who believe in it. Supply-side economics may well appeal to some prejudices of some people, but that tells us exactly zero about its merits.
Second, note Krugman's use of the words silly and crank to describe a view of the world that competes with his. That's Krugman's nasty side, which pops up whether he's attacking those to the right of him or those to his left. By dismissing the views as "silly" and "crank," he saves himself the trouble of telling his readers what is wrong with supply-side economics.
Third, Krugman's sweeping dismissal is not supported by arguments or evidence. Let us distinguish between two versions of supply-side economics. The broad version, which most American economists came to share in the late 1970s, is the view that the supply side of an economy (saving, investment, technological progress, and the size and productivity of the labor force) is what matters for output in the long run. Martin Feldstein, Krugman's and my boss at the Council of Economic Advisers (CEA) when Krugman was the international policy economist and I was the senior economist for health policy, is one of the best-known advocates of this view.
The narrow version of supply-side economics holds that high marginal tax rates discourage production by undermining incentives and that, therefore, when the government cuts marginal tax rates by X percent, taxable income increases and tax revenue falls by well under X percent. This was the view articulated by Arthur Laffer, Alan Reynolds, and other supply-siders during the late 1970s and early '80s. It is a view that Martin Feldstein shares, and it is the version of supply-side economics that Krugman ridicules.
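The narrow claim is, at bottom, arithmetic about behavioral response. A toy static calculation (the rate and elasticity values are hypothetical, chosen only for illustration):

```python
def revenue(rate, base_income, elasticity):
    """Toy model: taxable income responds to the net-of-tax share
    (1 - rate) with a constant elasticity; revenue is rate times income."""
    taxable_income = base_income * (1 - rate) ** elasticity
    return rate * taxable_income

before = revenue(0.70, 100_000, 0.4)  # 70% top rate
after = revenue(0.50, 100_000, 0.4)   # cut to 50%

rate_cut = (0.70 - 0.50) / 0.70           # about a 29% rate cut
revenue_drop = (before - after) / before  # about a 12% revenue drop
```

With any positive elasticity of taxable income, revenue falls by less than the rate cut; with a large enough one it can even rise, which is the Laffer-curve limiting case.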
Interestingly, some of the earliest evidence for this view was presented in 1980 by Jerry Hausman, one of Krugman's colleagues at MIT. The most systematic and careful evidence that Ronald Reagan's early-1980s cuts in marginal tax rates caused a huge increase in taxable income, thereby substantially offsetting the reduction in tax revenues, has been presented by Lawrence Lindsey. Lindsey was hired by Feldstein as the CEA's staff tax economist and was Krugman's and my colleague at the CEA during the 1982-83 academic year. Lindsey devoted his Ph.D. dissertation to the topic of Reagan's tax cuts once he returned to Harvard and wrote up his findings in the well-respected Journal of Public Economics. One signer of Lindsey's dissertation was Lawrence Summers, currently secretary of the treasury, who was also at the CEA during 1982-83. He is a man for whom Krugman generally shows much respect.
Yet Krugman mentions none of his present and past colleagues' respected academic work that supports the supply-side view. It's much more convenient for him to ignore their findings. In a 1993 article Krugman wrote, "Not all real-world questions are interesting–I find that almost anything having to do with taxation is better than a sleeping pill." Yet to appreciate the supply-siders' insights, you must delve into tax policy. Is it possible that Krugman has never taken the time to understand what supply-siders have to say?
Here's a challenge: I will give $100 to the first person who shows me anything close to a methodical examination by Krugman, written before August 1999, of the logic and evidence behind supply-side economics. You should know three things about me before you take the challenge: 1) I think I've read everything on supply-side economics that Krugman has ever written, 2) I'm a man of my word, and 3) I'm cheap.
Krugman seems to believe that the executive branch of government should have a lot of discretion in setting policy. After pointing out that the U.S. Congress refused to approve funds for a bailout of Mexico, Krugman writes: "Luckily, it turned out that the U.S. Treasury can at its own discretion make use of the Exchange Stabilization Fund (ESF), a pot of money set aside for emergency intervention in foreign exchange markets. The intent of the legislation that established that fund was clearly to stabilize the value of the dollar; but the language didn't actually say that. So with admirable creativity Treasury used it to stabilize the peso instead."
Note his words luckily and admirable. Yet it is this same sort of discretion that, by Krugman's own admission, has often led the IMF to make things worse. Some of the best of Krugman's work supports the view, although he never says it, that discretionary government power is on balance bad. It's hard to read this book with an open mind and feel more confident at the end that giving lots of power to government officials is the best way to maintain economic growth.
The post Artful Dodger appeared first on Reason.com.
"The father of the modern American state was a pipe-puffing executive at R.H. Macy & Co. named Beardsley Ruml." Thus opens Amity Shlaes' new book, The Greedy Hand. Shlaes is an editor at The Wall Street Journal, where she writes many excellent unsigned editorials about the absurdities, inefficiencies, and injustices of taxes. Her book shows, often dramatically, how the tax apparatus "pervades, dampens and makes knots of our lives" in our roles as workers, shoppers, investors, spouses, and parents. She discusses income taxes, Social Security taxes, sales taxes, property taxes, and estate taxes–in short, all the major taxes except the tax on corporate income. Along the way, she dispenses valuable nuggets of information about the tax system: its origins, its rationale, and some of its most twisted features.
Take tax withholding, whereby employers deduct from our gross pay an estimate of the federal and state income taxes we owe. Before 1943, there was no withholding; people just paid their annual taxes every March 15. But President Franklin Roosevelt and Congress dramatically increased tax rates in 1942, months after America entered World War II, and the new rates hit tens of millions of Americans who had never before paid federal income taxes. That, Shlaes writes, is when the "class tax" became a "mass tax."
There was one little problem. A Gallup poll showed that only 5 million of the 34 million people subject to the tax were saving to make their annual payment in March 1943. As March 15 approached, Treasury Secretary Henry Morgenthau fretted, "Suppose we have to go out and arrest five million people?"
Enter Beardsley Ruml, chairman of the Federal Reserve Bank of New York and an adviser to the president. While treasurer at Macy's, Ruml had noticed that customers didn't like big bills, preferring to pay in installments. So he proposed that employers deduct estimated taxes from workers' paychecks. The government accepted his proposal and added a sweetener: Taxpayers were granted amnesty on 75 percent of the lower of their 1942 and 1943 liabilities. So was born withholding, which Shlaes calls "the most ambitious bait-and-switch plan in America's history."
Withholding, writes Shlaes, is what allows the government to be so big. She has a point. It's hard to imagine someone earning, say, $50,000 and accepting this much government if, instead of paying $500 out of each month's paycheck, he had to pony up $6,000 in one lump sum just for federal income taxes.
One minor villain in the withholding episode, incidentally, was Nobel laureate Milton Friedman, at the time a 30-year-old Treasury Department economist. Friedman now agrees that withholding set the stage for today's big government: In Two Lucky People, the memoir he co-authored with his wife, Rose, he candidly states that he "was helping to develop machinery that would make possible a government that I would come to criticize severely as too large, too intrusive, too destructive of freedom."
Shlaes also writes thoughtfully about the Social Security tax. "For younger people," she notes, "the central tax is not income tax, it is FICA, which looms on their paycheck stubs like a Tyrannosaurus rex. To them Social Security is a job tax, indeed the job tax, the major tax fact of the first decade of their career life" (emphasis in original).
Shlaes quotes from a pamphlet published in 1936 by the newly formed Social Security Board. After explaining that employer and employee would each pay three cents of every dollar of the employee's earnings, up to $3,000, the pamphlet stated, "That is the most you will ever have to pay." Of course, this was utterly false. As Shlaes points out, the combined Social Security and Medicare taxes are now 7.65 percent each for employer and employee, on income up to about $70,000. The Social Security part alone is 5.3 percent for each.
The pamphlet also stated that "the check will come to you as your right." Unfortunately, though Shlaes quotes this statement, she doesn't challenge it. Yet in the 1960 case Flemming v. Nestor, the Supreme Court found that Social Security is not a legal right. Either of the board's two falsehoods, if made by a private pension fund, would have subjected it to a lawsuit.
Because the vast majority of FICA revenue goes to current recipients of benefits, many critics of Social Security, including me, have likened it to a Ponzi scheme. According to Shlaes, Larry Dewitt, the Social Security Administration's official historian, actually devoted part of the SSA's Web page to distinguishing one from the other. (The SSA, Dewitt informs me, has since removed his analysis.) Dewitt explained that Charles Ponzi lured investors with promises of 50 percent returns on foreign postal coupons. In fact, investors were being paid their returns out of the inflows from later investors. Ponzi's scam, in short, was a pyramid scheme.
So far, that sounds a lot like Social Security. The difference, argued Dewitt, is that Ponzi's swindle was a geometric progression that "works only so long as there is an ever-increasing number of new investors coming into the scheme." Social Security, by contrast, is a simple arithmetic progression that "is sustainable forever, provided that the number of new people entering the system maintains a rough balance with the number of people collecting from the system."
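Dewitt's own distinction is easy to formalize (a toy sketch with hypothetical numbers): a Ponzi scheme needs each cohort of new money to grow geometrically, while pay-as-you-go needs only a stable enough ratio of payers to collectors.

```python
def ponzi_cohorts(promised_return, rounds):
    """Each new cohort must supply the prior cohort's principal plus
    its promised return, so required inflows grow geometrically."""
    cohorts = [1.0]
    for _ in range(rounds):
        cohorts.append(cohorts[-1] * (1 + promised_return))
    return cohorts

def payg_sustainable(workers, retirees, tax_rate, benefit):
    """Pay-as-you-go holds together only if current payroll taxes
    cover current benefits, i.e. the worker/retiree ratio stays high."""
    return workers * tax_rate >= retirees * benefit

print(ponzi_cohorts(0.5, 4))  # [1.0, 1.5, 2.25, 3.375, 5.0625]
# Three workers per retiree, 12% payroll tax, benefit worth 30% of a wage:
print(payg_sustainable(workers=3.0, retirees=1.0, tax_rate=0.12, benefit=0.30))
```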
Hmm. As Shlaes points out, demography and the economy have not cooperated with Dewitt's plan. Moreover, even the system's planners knew, as early as the 1950s, that without big tax increases or benefit cuts the baby boom would bankrupt the system. So, by Dewitt's own definition, Social Security is a Ponzi scheme. Shlaes' book does a real service in bringing the moral and intellectual bankruptcy of this Social Security apologist out into the open.
Shlaes also notes that the earnings penalty for Social Security beneficiaries under age 70 imposes a huge tax on incremental dollars earned by the youngest and most able seniors. In 1996, 65-to-69-year-olds who earned more than $11,520 a year lost one dollar in benefits for every three dollars earned, a marginal tax rate of 33 percent. Americans aged 62 to 64 lost one dollar in benefits for every two dollars earned, a whopping 50 percent marginal tax rate. And that's before counting regular income taxes.
Moreover, Shlaes notes, for elderly people over a certain income level, 85 percent of their Social Security benefits are subject to income taxation. In fact, as I showed in an article in Fortune (March 4, 1996), when you put all these taxes together, the marginal tax rate on dollars earned from working is often more than 80 percent and could be more than 103 percent. "The overall effect," writes Shlaes, "is to send a clear message: `Don't work.'"
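The stacking described above can be roughed out in a line or two (the rates are hypothetical illustrations; the real interactions are messier than simple addition):

```python
def senior_marginal_rate(earnings_test, income_tax, benefit_tax_share):
    """Crude additive stack: the benefit clawback, plus ordinary income
    tax, plus income tax on the share of benefits made taxable."""
    return earnings_test + income_tax + benefit_tax_share * income_tax

# Hypothetical 62-to-64-year-old: 50% earnings test, 28% bracket,
# 85% of benefits subject to income tax.
rate = senior_marginal_rate(0.50, 0.28, 0.85)
print(f"{rate:.1%}")  # 101.8%: more than a dollar lost per extra dollar earned
```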
The earnings penalty was imposed when Social Security began, because FDR's brain trust thought that the Depression was caused by "too much labor." "So they intentionally idled seniors," Shlaes writes.
Shlaes also shows how our graduated income tax system, in which marginal rates rise with income, imposes an often large penalty when two earners get married. She leads off with a story of two blue-collar workers in Indiana named Darryl Pierce and Sharon Mallory. "They duly did what many couples in America now do when they consider a serious question like matrimony," she writes. "They went to their accountant." The swing in their tax liability, they learned, would be $3,700 a year.
Shlaes' story of how the marriage penalty developed over about 50 years is detailed and well worth reading. She sums up the quandary as follows: "You could have progressivity [higher tax rates on higher incomes], you could have low rates for the second earner, and you could have a tax arrangement that buttressed the traditional family. But you could not have all three. When you treated married women who worked as individuals, you gave them and their husbands a tax advantage over traditional couples–at least as long as there was progressivity. And when you buttressed the tax supports for married couples with a stay-at-home wife while retaining progressivity, you punished couples with working wives." Because President Clinton and most Democrats opposed reducing "progressivity" and social conservatives in the Republican Party wanted to help "the family," the mid-1990s, by default, became an era of loophole writing.
My major disappointment with the book, incidentally, is with Shlaes' failure to challenge the word "progressivity." Shlaes seems not to have learned the lesson taught by George Orwell and Thomas Szasz: Words affect the way we think about things. "Progressivity" sounds, well, so progressive. It's not.
To her credit, Shlaes attacks the idea of higher tax rates for higher-income people. She quotes George Harrison's 1966 song "Taxman," written shortly after the Beatles' success put them into Britain's highest tax bracket. Shlaes doesn't mention it, but the top British tax rate on "earned" income at the time was 83 percent, and the top rate on income from dividends and interest was 98 percent. The Beatles were shocked to be paying such taxes; thus the song, which, she notes, makes specific references to the tax collectors, Prime Minister Wilson and Prime Minister-to-be Heath. (Margaret Thatcher later cut the top rates to 40 percent.)
As Shlaes writes, "We are all Beatles now." Even those of us who are modestly successful–earning, say, $80,000 in a high-tax state like New York or California–are paying marginal tax rates somewhere between 40 percent and 50 percent. The stated purpose of "progressive" taxation, writes Shlaes, is to tax the rich. But high marginal tax rates, she notes, aren't really taxes on the wealthy; they're taxes on becoming wealthy.
Shlaes, by the way, buys into the idea that super-rich people such as Leona Helmsley don't pay taxes. This is false. Even if Helmsley was guilty of defrauding the government from 1983 to 1985 (and economist Paul Craig Roberts has presented powerful evidence that Helmsley's accountants, not she, committed the actual fraud; see "Leona May Be Guilty, but Not as Charged," Wall Street Journal, April 9, 1992), the Helmsleys still paid $53.7 million in federal taxes on adjusted gross income of $103.6 million, rather than the $55.4 million the government claimed they owed. In other words, the feds are quite successful at taxing the rich, too.
The only chapter I found unpersuasive is "Baby Taxes." In it, Shlaes shows how burdensome the government's tax and regulatory requirements are for those who hire nannies. That's presumably why President Clinton's first choice for attorney general, Zoe Baird, paid an immigrant off the books for providing day care.
So far, so good. Then Shlaes cites a statistic: Even after all the publicity about Baird, the IRS recently reported that only one in 13 taxpayers who owe "nanny taxes" actually pays them. This upsets Shlaes. I sympathize with her, because I too value honesty and hate the fact that, as Will Rogers once put it, taxes have made more liars than golf.
But there's another way to look at this. There was a saying in the late 19th century West: "A man who won't cheat a railroad ain't honest." At the time, the government gave many railroads special privileges. Cheating them–breaking the oppressive law of the day–became the moral thing to do. The same principle applies today. How many teenage babysitters or newspaper delivery boys do you know who pay taxes on their income? I had both jobs as a kid and would have been shocked had someone told me I was legally liable for taxes.
One of the ways the most oppressive laws get changed, as Milton Friedman wrote in the mid-1960s, is for people to ignore and break them. So only one in 13 people actually pays the nanny tax? That may be one of the most hopeful statistics in this book.
Contributing Editor David R. Henderson (drhend@mbay.net) is a research fellow at the Hoover Institution and an economics professor at the Naval Postgraduate School in Monterey, California.
The post Government Greed appeared first on Reason.com.
I've been a fan of Milton Friedman since age 17, when, fresh from reading Ayn Rand, I ran across a Newsweek column of his titled, "The Public Be Damned." Friedman, then a University of Chicago economist, wasn't writing about robber barons. It was the U.S. Post Office, he argued, that was damning the public; he advocated ending its legal monopoly. After reading every past Friedman column I could find, I was hooked. Here was a guy who wrote and thought clearly about economic freedom and, at the same time, was not a fringe player. He convinced me by example that one could be an advocate for freedom and still be part of mainstream society.
The next summer I worked in a mine in northern Canada, and I sometimes hitchhiked 80 miles round-trip to get the latest Newsweek containing Friedman's column. When I was 19, I knocked on his campus door, and he visited with me for 15 minutes, giving me some fatherly career advice. Now Milton and his economist wife and sometime co-author, Rose, have completed their memoirs, Two Lucky People. (Some of the book's sections are attributed to Milton, some to Rose, and some to both.) I have been awaiting Two Lucky People eagerly since 1990, when Milton first told me about it. It was with real anticipation that I opened the book to page one.
That was a mistake. As much as I love and appreciate both Friedmans, I found their recollections of their early life uninteresting. Disappointed, I put the book aside. But a few weeks later, I turned to the chapters on the 1960s. The Friedmans' stories about the '60s were fascinating. I jumped to the next interesting-sounding part, about the University of Chicago, then to their time in Washington during the war, and so on until finally I had read the entire book and even found myself wishing for more. I recommend starting the book at about page 91.
The '60s were a decade of intense activity and achievement for the Friedmans. Milton, working with Rose, completed Capitalism and Freedom, which is more subtle and nuanced than their more popular 1980 book, Free to Choose. Milton advised presidential candidates Barry Goldwater and Richard Nixon on economic issues. He and Anna Schwartz finished A Monetary History of the United States, which persuaded many economists and noneconomists that the Federal Reserve Board had played a major role in causing the Great Depression. During those years Friedman became an outspoken opponent of military conscription and served on the Nixon commission that almost unanimously endorsed an end to the draft. In 1966 he started his triweekly column for Newsweek.
Along with the achievements on display in these memoirs are the strengths of character that lay behind them. Among the most attractive of these strengths is the Friedmans' willingness to speak truth to power. Milton tells of visiting the White House in September 1971, a month after Nixon imposed comprehensive wage and price controls. His friend and former colleague George Shultz was in charge of administering the controls. As Friedman got up to leave his meeting with Nixon and Shultz, Nixon volunteered that wage and price controls were a monstrosity and that they would get rid of them as soon as possible, adding, "Don't blame George for this monstrosity." "As I remember it," writes Friedman, "I replied something like, 'I don't blame George. I blame you, Mr. President.'"
The story also illustrates Milton Friedman's capacity to make important distinctions and judge accordingly. Just two months after his White House meeting, I attended a libertarian conference at Columbia University at which many in the audience were trying to push Friedman to attack Nixon. Friedman denounced Nixon for the price controls but wouldn't go beyond that, insisting that we give credit to Nixon for trying to end the draft. Friedman's complete unwillingness to pander to his audience set an example that I have never forgotten.
One of Milton's most admirable traits is his willingness to share credit with those around him, and this too has found its way into the book. Friedman tells about a four-day conference on the draft at the University of Chicago in 1967. The proceedings are in The Draft, edited by Sol Tax, a book I read cover to cover in my late teens. One speaker at the conference was Walter Oi, a 38-year-old economist who, as a 12-year-old California Nisei, had been imprisoned by the U.S. government at the outbreak of World War II. During graduate school, Oi became blind as the result of a degenerative disease. (In the late 1970s, Oi was my colleague at the University of Rochester, where he is still on the faculty.) Friedman writes of Oi:
"A convinced libertarian, he strongly opposed the draft. At the conference he gave an eloquent paper presenting the case for ending the draft on grounds of both principle and expediency. The impact was dramatic. Here was a blind man, enormously impressive simply for his capacity to prepare a cogent, closely argued, and fully documented paper. He conveyed a clear sense of moral outrage on an issue about which he had no conceivable ax to grind. To me, it was the high point of the conference."
But there are drawbacks to the book aside from its slow opening chapters. When I read an autobiography of an intellectual, I expect to learn which key experiences helped form his or her views. Here were two people who grew up in Jewish-American households in the 1910s and 1920s and were not quite adults when the Great Depression began. Many prominent Jewish intellectuals of that generation turned left.
By contrast, Milton Friedman was to devote a chapter of Capitalism and Freedom to the way in which market economies weaken patterns of persecution, and was to address the issue of Jewish attitudes toward capitalism in a 1972 address to the Mont Pelerin Society (reprinted years later in Encounter). Pro-market Jews, he stated then, are "regarded not only as intellectual deviants but also as traitors to a supposed cultural and national tradition." Friedman went on to make a powerful case for the emancipatory benefits that Jews had derived from the market.
The Friedmans have won their argument; pro-market Jews are no longer intellectual outsiders. I would have been interested in learning in what way the events of the Friedmans' lives shaped their own belief in freedom. After reading their book, I don't have much more of a clue than when I began.
Even more strikingly, Milton doesn't talk much about how his views evolved during graduate school or after he completed his Ph.D. Yet evolve they did. Most notably, his views on the causes of inflation changed dramatically. The Milton Friedman that most of us know about is the one who said, in a famous 1968 debate with Keynesian economist Walter Heller, "the state of the budget by itself has no significant effect on…inflation" and who wrote in 1963, "Inflation is always and everywhere a monetary phenomenon." By contrast, here's what Friedman says about testimony he gave as a Treasury economist in 1942: "The most striking feature of this [testimony] is how thoroughly Keynesian it is. I did not even mention 'money' or 'monetary policy'! The only 'methods of avoiding inflation' I mentioned in addition to taxation were 'price control and rationing, control of consumers' credit, reduction in governmental spending, and war bond campaigns.'"
What happened between 1942 and the early 1950s that changed Friedman's mind? Maybe the explanation is simply that he gathered data that persuaded him of the power of monetary policy. But in most intellectual autobiographies I have read, there's one event, piece of evidence, story, conversation, or argument that starts the process of change. Friedman mentions no such epiphany.
While Friedman's defense of freedom was initially pragmatic, over the years he has increasingly emphasized moral arguments. His approach to price controls illustrates the shift. A 1946 pamphlet that he co-authored with George Stigler, "Roofs or Ceilings?," leads off the argument against the rationing of rent-controlled apartments by the Office of Price Administration as follows: "The defects in our present method of rationing by landlords are obvious and weighty. They are to be expected under private, personal rationing, which is, of course, why OPA assumed the task of rationing meats, fats, canned goods, and sugar during the war instead of letting grocers ration them."
This passage, among others, led Ayn Rand to write a letter to the publisher, Leonard Read, complaining that the pamphlet "advocates the nationalization of private homes." It's also probably why Rand never recommended anything Friedman or Stigler ever wrote, a serious failing on her part. But although Rand was incorrect in saying Friedman and Stigler advocated nationalizing homes, she had a point: Friedman and Stigler never mentioned that rent controls violate landlords' property rights. In 1971, by contrast, Friedman was to call price controls "immoral."
Was the omission of morality in 1946 due to Stigler, who was never comfortable talking about rights? Did Friedman think that he couldn't discuss rights in the intellectual environment of the time? Quite possibly the latter, because, as Friedman has often said, in the late 1940s the intellectual world was substantially more hostile to freedom than it has been since. Also, Friedman was a Jew without tenure, and anti-Semitism at the University of Wisconsin before the war had cost him his job. Indeed, before World War II, strange as it now sounds, Jews were almost nonexistent in academia. Friedman may have been justifiably wary of defending property rights forthrightly. Context is important. Unfortunately, neither Milton nor Rose gives the context.
Still, Two Lucky People has many strengths. One is the pithy comments it offers on various people. As a Treasury economist, for example, Friedman was assigned to explain national income accounts to Sen. Robert A. Taft, the famous conservative known as "Mr. Republican." Writes Friedman: "He proved to be an apt student. I would rank him with Richard Nixon as one of the intellectually ablest political figures with whom I have had close contact. He would have been an outstanding member of any university." About George Bush, by contrast, Milton writes, "I believe that Reagan made a mistake when he chose Bush as his vice-presidential candidate–indeed, I regard it as the worst decision not only of his campaign but of his presidency."
The book's other main strength is that it shows the Friedmans going through life enjoying themselves and keeping their fame in perspective. So many famous people expect to be treated better than those of us who are not famous; through the whole book, you never get the idea that this is the Friedmans' view. One of my favorite stories is Rose's telling of their being thoroughly searched by a U.S. Customs official on their return from traveling abroad. While rifling through Milton's wallet, the official found Milton's business card and suddenly became "respectful rather than overbearing. We were as annoyed by his change in demeanor on recognizing Milton's name as by being singled out for close examination without being informed what if anything we were suspected of," writes Rose. How refreshing that the two Friedmans care, not about their own privileges, but about everyone's rights.
In 1983, three years after the series Free to Choose was shown on television, the Friedmans met Queen Elizabeth at a party on her yacht, Britannia. Rose, instead of oohing and aahing about royalty in her account, shows the Friedmans and the queen with their feet on the ground. Rose writes, "When it came Milton's turn to be introduced to the queen while going through the receiving line, she remarked, 'I know you. Philip is always watching you on the telly.'"
Contributing Editor David R. Henderson (drhend@mbay.net) is a research fellow with the Hoover Institution and an associate professor of economics at the Naval Postgraduate School in Monterey, California.
Peter G. Peterson, chairman of the Blackstone Group, a private investment bank, and a former secretary of commerce under President Nixon, has been writing about the Social Security mess for some time. In Will America Grow Up Before It Grows Old?, Peterson shows just how grim the future will be due to out-of-control spending on Social Security and Medicare. His analysis, though off in some places, is often on target. Unfortunately, some of his proposed solutions would create new problems to replace the existing ones.
Peterson makes his case against Social Security's fiscal soundness with words, numbers, and graphs. To convey what the future will be like, he asks the reader to imagine "a nation of Floridas." Today, about one in five Floridians is age 65 or older; by the mid-2020s this will be true for the United States in general. His beautiful bar graph drives home the point. And while there are currently 3.3 workers per Social Security beneficiary, he notes that by 2040 the ratio will be less than 2 to 1. The implied Social Security taxes would rise from 11.5 percent of workers' payroll today to between 17 percent and 22 percent in 2040. And if Medicare spending rises as fast as it has historically, writes Peterson, taxes for these two programs alone would take more than a third of every worker's paycheck.
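Peterson's worker-to-beneficiary numbers follow a simple pay-as-you-go identity. The sketch below backs a replacement rate out of the figures quoted in the review (the ~38 percent number is my inference from those figures, not a number from the book):

```python
# Pay-as-you-go arithmetic implicit in Peterson's figures. If current
# benefits are paid entirely out of current payroll taxes, then:
#   tax_rate * wage * workers = replacement_rate * wage * beneficiaries
# so the required payroll tax rate is the replacement rate divided by
# the worker-to-beneficiary ratio.
def payg_tax_rate(replacement_rate, workers_per_beneficiary):
    """Payroll tax rate needed to fund current benefits from current wages."""
    return replacement_rate / workers_per_beneficiary

# With 3.3 workers per beneficiary and an 11.5% payroll tax, the implied
# average benefit is about 0.115 * 3.3, or roughly 38% of the average wage.
replacement = 0.115 * 3.3

# At fewer than 2 workers per beneficiary, the same benefits require a
# tax rate of roughly 19%, inside the 17-22% range Peterson projects.
required_2040 = payg_tax_rate(replacement, 2.0)
```

The point of the sketch is that the projected tax increase follows mechanically from the falling ratio; no assumption about benefit growth is needed.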
Peterson also presents scary data about the federal government's unfunded liability: the present value of all its future commitments minus the present value of the tax revenue earmarked to pay for those commitments. For Medicare and Social Security alone, this unfunded liability is a whopping $15.3 trillion. Compare that to the value of all the federal government's assets, which Peterson puts at $2.3 trillion. So much for the idea of selling off assets to avoid raising taxes for Medicare and Social Security.
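The unfunded-liability definition can be made concrete with a small present-value calculation. The cash flows below are made-up illustrations, not Peterson's actual projections:

```python
# Unfunded liability = PV(promised benefits) - PV(earmarked revenues).
# A minimal sketch with hypothetical cash flows; r is the real discount rate.
def present_value(cashflows, r):
    """Discount a list of annual cash flows (paid at end of years 1, 2, ...)."""
    return sum(cf / (1 + r) ** t for t, cf in enumerate(cashflows, start=1))

benefits = [100, 100, 100]   # hypothetical promised payouts, per year
revenues = [60, 60, 60]      # hypothetical earmarked taxes, per year
unfunded = present_value(benefits, 0.02) - present_value(revenues, 0.02)
# unfunded is positive: the promises exceed the earmarked funding.
```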
The mess started in 1935, when Franklin Roosevelt introduced a social security program, though there was no popular demand for it, and then required most people in the private sector to join, even if they already had a pension plan. In contrast to private pension plans, which are normally set up on an actuarially sound basis, Roosevelt's Social Security was pay-as-you-go, with current retirees' benefits paid out of current workers' taxes. Oddly, Peterson mentions very little about the historical origins of Social Security and, when he talks about Roosevelt at all, he is way too generous. Rather than seeing the current dilemma of Social Security as inherent in the program's design, Peterson suggests the problem started in the 1960s and '70s. While it is true that what happened then made an awful situation much, much worse, it is also clear that Social Security's pay-as-you-go financing is an insurmountable structural flaw.
Between 1967 and 1972, writes Peterson, Congress and the president raised Social Security benefits by 72 percent (38 percent after adjustment for inflation). President Lyndon Johnson started the boom years. When Wilbur Cohen, Johnson's secretary of health, education, and welfare, proposed a 10 percent hike in Social Security benefits, Johnson replied, "Come on, Wilbur, you can do better than that!" Johnson also introduced Medicare. Then President Nixon added to the problem by getting into a bidding war with Wilbur Mills, a powerful congressman who tried to get the Democratic presidential nomination in 1972. Net result: a 20 percent increase in benefits. Massachusetts Institute of Technology economist Paul Samuelson provided some of the intellectual backing for these policies. Peterson quotes Samuelson: "The beauty about social insurance is that it is actuarially unsound." Samuelson's point was that if real incomes were growing quickly, each generation could get more out of Social Security than it paid in. While contemporary critics call Social Security a Ponzi game, Samuelson beat them to the punch in 1967 by blessing it as one. "A growing nation," wrote Samuelson, "is the greatest Ponzi game ever contrived."
Peterson pushes the popular belief that the current elderly are receiving a windfall from Social Security. "The typical one-earner couple retiring in 1995," he writes, "will get about $123,000 more from Social Security than the average earner and his or her employers ever paid into it, plus interest." But only in a footnote does he say that he used a real (i.e., inflation-adjusted) interest rate of only 2 percent. That's below what you can make on Treasury bills nowadays and well below the long-term 7 percent real rate that the stock market has averaged during the last 70 years.
Indeed, when you use a more realistic real rate of return, say between 5 percent and 7 percent, you conclude that Social Security screws almost everyone. This is what Eugene Steuerle, an economist at the Urban Institute who normally uses a 2 percent rate, found when he used 6 percent for some unpublished computations, and what my student Shawn Duffy and I found using 7 percent and even 5 percent. Steuerle found that someone born in 1955 will, if married, a sole provider, and a high earner (making $60,000 in 1993 dollars), receive $750,000 less than he or she paid in. Even an average earner (making $24,444 in 1993 dollars), if a sole worker in a married couple, loses $268,000. If both spouses work, and one earns $60,000 while the other makes $24,444, they lose over $1.2 million. That's right. For such a couple, it's as if the government took more than a million dollars from them the day they retired. Duffy and I calculated that even someone born in 1929 who worked for the minimum wage his entire life and who retired in 1994–supposedly the quintessential Social Security windfall king–would have done better without Social Security. At a 5 percent real rate of return, he would have been over $34,000 better off. At a real rate of 7 percent, he would have been over $100,000 better off without Social Security.
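The sensitivity to the assumed real rate is easy to see by compounding a level stream of contributions. This is an illustrative sketch with hypothetical contribution amounts, not a reconstruction of the Steuerle or Duffy-Henderson computations:

```python
# Future value at retirement of a level annual contribution compounded
# at real rate r (contributions made at the start of each year).
def accumulated_value(annual_contribution, years, r):
    total = 0.0
    for _ in range(years):
        total = (total + annual_contribution) * (1 + r)
    return total

# Hypothetical worker: $2,000 a year (real) for a 45-year career.
low = accumulated_value(2000, 45, 0.02)   # the 2% real rate Peterson assumes
high = accumulated_value(2000, 45, 0.07)  # the stock market's long-run real rate

# At 7% the accumulated fund is roughly four times the 2% fund, which is
# why a "windfall" at a 2% discount rate flips to a large loss when the
# contributions are valued at a market rate of return.
```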
So, although Peterson is thought of as a Cassandra for his warnings on Social Security, he actually understates the problem. The problem arises because Social Security taxes are spent rather than invested privately.
Just as Peterson's diagnosis is sometimes on target and sometimes not, so are his prescriptions. First the good news. Peterson would end the so-called earnings penalty, the law that requires retirees below age 70 to give up Social Security benefits for every dollar they earn past a modest threshold. For those aged 62 to 64, the "giveback" is 50 cents per dollar earned over $8,280; for those aged 65 to 69 it is 33 cents per dollar earned over $11,520. This program alone, independent of the income tax, imposes a marginal tax rate ranging from 33 percent to 50 percent. Score one for Peterson. Also to his credit, Peterson advocates that the age at which people can receive full Social Security and Medicare benefits be raised gradually from 65 to 70 by the year 2015.
Now the bad news. Peterson would impose an "affluence test" for receipt of Social Security, Medicare, and other benefits. If your income exceeds $40,000, you would lose 10 percent of your federal benefits for every additional $10,000 of income. So a family making $50,000 and receiving $12,000 in federal benefits would lose $1,200. A family making $100,000 and receiving $12,000 in benefits would lose 60 percent of that $12,000, or $7,200.
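Peterson's affluence test reduces to a simple schedule. The sketch below implements the rule as described in the review; the cap at 100 percent is my assumption, since the passage doesn't spell out what happens at very high incomes:

```python
# Peterson's proposed "affluence test": lose 10% of federal benefits for
# each full $10,000 of income above $40,000. The 100% cap is an assumption,
# not stated in the review.
def affluence_test_loss(income, benefits):
    if income <= 40_000:
        return 0.0
    increments = (income - 40_000) // 10_000
    percent_lost = min(10 * increments, 100)
    return benefits * percent_lost / 100

print(affluence_test_loss(50_000, 12_000))   # 1200.0, as in the review
print(affluence_test_loss(100_000, 12_000))  # 7200.0
```

Note the implied marginal rates: each extra $10,000 of income costs this family another 10 percent of its benefits, on top of ordinary income tax, which is the saving disincentive discussed below.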
Aside from the difficulty of enforcing such a plan–Mr. Smith, we just learned that your income last year was $10,000 higher than the previous year; please send us a check for $1,500, which is 10 percent of the cost of your hip replacement–there is a fundamental moral and economic objection. The affluence test is morally objectionable because it penalizes people who make the same income as others their whole life, but who save more and earn a return on these savings.
The economic objection runs along similar lines. The "affluence test" would deter retirement saving, which is, after all, one of the main kinds of saving. Financial planners, tax attorneys, and others would discuss the law in tax and investment newsletters, and someone who saves enough to make over $40,000 a year in retirement is likely to pay attention to such information. Peterson disagrees. It "strains credibility," he writes, to think that people will look ahead so far and so clearly. If he's right (I think he's not), notice that Peterson's proposal relies on people not finding out.
At times, Peterson strains his own credibility through his ambivalence about government's proper role. Waxing eloquent about Americans of past centuries, Peterson writes, "[T]he ethic of individual responsibility taught that if you didn't save, you would bear the consequence personally. The spendthrift who hit bottom could apologize to his family, implore his neighbors, or beg from strangers–but would have no claim on the public purse." Peterson later states that "all groups of federal beneficiaries" must "make meaningful sacrifices" by having some benefits cut. But one page later he tells us he doesn't mean it, advocating that federal benefits for the elderly poor be increased. This, in spite of his own demonstration that Social Security benefits have risen so dramatically for all, including the lowest-income beneficiaries. Peterson also wants the government to "substantially increase spending" on education, worker training, and research and development.
In the end, Peterson fails to draw the important moral from his own tale. He shows how Democratic and Republican politicians made a mess for us with reckless abandon, not seeming to care a whit about any consequences beyond the next election. And what does he conclude? That government should do more.
David R. Henderson (drhend@mbay.net) is a research associate at the Hoover Institution and an economics professor at the Naval Postgraduate School in Monterey, California.
In the early 1970s, I was a graduate teaching assistant at UCLA in an undergraduate course taught by Charles Baird, a free-market economist. After explaining to the class the problems with the current welfare system—its disincentive to work, the amount of life-arranging (his word) that social workers do, etc.—Baird proposed as an alternative Milton Friedman's idea of a negative income tax.
Under Friedman's plan, the panoply of different government aid programs would be replaced by a simple cash giveaway, administered through the tax system, with no money specifically earmarked for certain items, as with food stamps, and no pesky social workers trying to manage your life. Poor people would be able to spend their welfare payment on anything they wanted.
Shortly after the class, an undergrad—one of the more promising and apparently idealistic ones, I might add—came by to discuss the Friedman proposal further. I expected him to focus on the proposal's effect on the poor. Instead, he considered solely its effect on himself.
He had calculated that under the Friedman plan he could be independent of his middle-class family because he would qualify for the negative income tax himself. I was stunned. Where I grew up, in rural Canada, there was a stigma about taking welfare. I had simply trusted that this stigma would be strong enough that few young, healthy people would take advantage of a negative income tax. This student challenged my naïveté, and unwittingly made me an opponent of the negative income tax.
I was reminded of that incident while reading Gertrude Himmelfarb's The De-moralization of Society. Himmelfarb, a history professor at the City University of New York, takes a fresh look at Victorian England—an era my 40-something generation was taught to ridicule—and finds much that was good. One of the good things was the way Victorians stigmatized those on "relief." We are told today that welfare should carry no stigma, because that is demeaning and dehumanizing to those dependent on it. The Victorians believed exactly the opposite.
Humanitarians in that era wanted to help only poor people who were unable to support themselves. Thus, they wanted able-bodied people on relief to feel stigmatized by those around them, as a way of motivating them to get off welfare as soon as possible. They wanted those who could be poor and independent not to turn into paupers, that is, people who were permanently dependent on others for their daily sustenance. They thought a lot about how to reduce this permanent dependence.
Their thinking led them to believe in "less-eligibility," which was the basis for the Poor Law reform of 1834. Before that welfare reform, people were entitled to relief. The 1834 law tightened eligibility. According to the principle of less-eligibility, the condition of the "able-bodied pauper" should be less "eligible," less desirable, than the condition of poor self-supporting laborers. Less-eligibility led to the workhouse principle, the idea that to get relief, the able-bodied pauper and his family (but not the sick, the aged, and widows with small children) would have to live in workhouses.
Weren't these workhouses the hellholes that Charles Dickens portrayed in Oliver Twist? Certainly not always. French writer Hippolyte Taine, who observed England in the 1860s, reported that the workhouse he visited was spacious and clean, the children were taught in classrooms, and the diet included meat once a week—a luxury in those days.
Though workhouses weren't the hellholes of popular myth, they did take away people's freedom and segregate them from the general community. And it worked. Taine reported that of the 350 "inmates," not one was a single able-bodied man.
"They prefer to be free and starve," he wrote. "The workhouse is looked upon as a prison and the poor make it a point of honor never to enter one."
The data on workhouse expenditures confirm that the reform worked in discouraging dependency. Annual expenditures averaged 6.7 million pounds in the five years before the reform and 4.5 million pounds after, in spite of a population increase of 1 million.
Illegitimacy has always been a decent predictor of future social pathologies. The Victorians, writes Himmelfarb, also condemned women who had children out of wedlock. While I still think condemnation was too harsh, contrast that to three years ago, when former Vice President Dan Quayle was the one condemned for even suggesting there might be problems associated with illegitimacy.
Himmelfarb quotes the recently departed surgeon general, Joycelyn Elders, who, asked whether having children out of wedlock should be condemned, answered: "No. Everyone has different moral standards….You can't impose your standards on someone else." This difference in moral outlook, argues Himmelfarb, is responsible for a huge difference in results. She notes that whereas in 1901 only 4 percent of births in England and Wales were out of wedlock, by 1992 the figure was 32 percent.
Himmelfarb's subtitle, "From Victorian Virtues to Modern Values," highlights the importance of words. What the Victorians called "virtues," we call "values." And the change in words speaks volumes. Virtue connotes something that is rock solid and definitely not arbitrary. Value sounds squishier, more subjective.
Himmelfarb notes that modern "moral education" courses explicitly avoid educating people about morality. Instead, the values clarification technique has students "discover" their own values by "exploring their likes and dislikes, preferences and feelings"—as if likes and dislikes have anything to do with morality. The Victorians would have little tolerance for this ethical relativism that surrounds us today.
Instead, the Victorians believed in the bourgeois virtues—being honest, industrious, punctual, sober, and law-abiding, to name a few. Living by these virtues almost guarantees that dependency will not become a problem. Those virtues also resulted in a very civil, and very safe, society. Hippolyte Taine wrote: "I have seen whole families of the common people picnicking on the grass in Hyde Park; they neither pulled up nor damaged anything." And, notes Himmelfarb, Britain's crime rate during the Victorian era was very low. By 1901, near the end of the Victorian era, the crime rate bottomed out at 250 indictable offenses per 100,000 population. Compare that to Britain's 1991 rate of 10,000, a staggering 40 times that 1901 rate. Taine commented, "The aim of every society must be a state of affairs in which every man is his own constable, until at last none other is required." The modern emphasis on values over virtue has done little to help us achieve this noble aim.
But Himmelfarb believes that abandoning failed welfare policies and releasing the resources of the free market wouldn't be enough to achieve that aim either. Faith in free markets, writes Himmelfarb, "underestimates the moral and cultural dimensions of the problem." Traditional values, she argues, must be legitimated, and this is difficult when the state and the dominant culture are legitimating their opposite.
Those who want to resist the dominant culture, asserts Himmelfarb, "may be obliged, however reluctantly, to invoke the power of the law and the state, if only to protect those private institutions and associations that are the best repositories of traditional values." She does not say clearly which powers of the state she would invoke and for what, but her further discussion hints that she would have no trouble with anti-pornography laws, for example.
Himmelfarb is right that a cultural change is needed. But she is wrong to believe that "invoking the power of the state" is the way to get there. Though she seems to understand the strong connection between government welfare policies and the decline in culture, she doesn't take the obvious next step: calling for a radical downsizing of government.
But only a large cut in government welfare programs, with abolition of most, can set the cultural forces in motion that would lead to declines in illegitimacy, crime, and other social pathologies. Trying to change the culture without changing its underlying incentives is, well, silly.
David Frum said this well in his 1994 book, Dead Right. In discussing the major strands of 1990s American conservatism, Frum wrote: "Conservatives who throw in the towel on issues like Social Security and Medicare and welfare in order to direct their full attention to 'the culture' are attempting to preserve bourgeois values in a world arranged in such a way as to render those virtues at best unnecessary and at worst active nuisances. The project is not one that is very likely to succeed."
Contributing Editor David R. Henderson is an associate professor of economics at the Naval Postgraduate School in Monterey, California, and the editor of The Fortune Encyclopedia of Economics.
The post Values Judgments appeared first on Reason.com.
Are you looking for an economist who can really write and who has insight after insight on free markets versus government regulation? Would you like it even better if you could get some good laughs from his clever way of putting things? Then Ronald H. Coase's Essays on Economics and Economists is the book for you.
One of the book's best articles is "The Market for Goods and the Market for Ideas." Presented originally in December 1973 at the American Economic Association meetings, it made many intellectuals squirm. In it, Coase notes that intellectuals tend to think government regulation of economic markets improves matters. When it comes to regulating the market for ideas, however, most intellectuals are positive that the government is utterly incompetent. Coase sticks the knife into this inconsistency, and, page by page, cleverly turns it. He points out that, while the press is often quite willing to accept and capitalize on stolen documents (as was the case with the New York Times publication of the "Pentagon papers" in 1971), it castigates others for stealing them. At the time of his speech, the Watergate burglary—an attempt to steal documents—was bringing down Richard Nixon. Quips Coase: "It is hard for me to believe that the main thing wrong with the Watergate affair was that it was not organized by The New York Times."
One of my other favorites is "Economics and Public Policy," based on a speech Coase gave in 1974 at UCLA. I remember the speech; I saw him give it. And it reads as well as I remember it. Coase's main message is that what economists have to say that "is important and true is quite simple—so simple indeed that little or no economics is required to understand it." But what is discouraging is that these simple truths are commonly ignored in economic policy discussions. One such insight is that if the government fixes a price below the market price, it will cause a shortage because at that lower price consumers will want to buy more and producers will want to supply less.
In the early 1970s, Nixon's price controls on gasoline were causing shortages, long lines, and violence. Coase tells of his colleague at the University of Chicago Law School, Edmund Kitch, giving a speech in Washington explaining how price controls were causing a shortage of natural gas. The audience was a typical Washington audience: Washington journalists, staff members of congressional committees, and so on. They showed little interest in Kitch's findings, says Coase, but much interest in who financed the study. Many assumed that it had been financed by the natural gas industry, which was not the case.
Coase writes, "A large part of the audience seemed to live in a simple world in which anyone who thought prices should rise was pro-industry and anyone who wanted prices to be reduced was pro-consumer. I could have explained that the essentials of Kitch's argument had been put forward earlier by Adam Smith—but most of the audience would have assumed that he was someone else in the pay of the American Gas Association."
Who is this guy? Few people outside the economics profession had heard of Coase before he won the Nobel Prize in economics in 1991. But he had a major impact on economics with just two articles, one published in 1937, the other in 1960. In the book's first essay, a reprint of the lecture he gave in accepting the Nobel Prize, Coase explains the significance of the two articles. The 1937 piece, "The Nature of the Firm," was probably the first attempt by an economist to explain why there are firms rather than just autonomous individuals exchanging things. Within firms, he pointed out, the price system—free markets—is rarely used. Coase asserted that firms arise as a way of economizing on the inevitable transaction costs of the price system. Competition ensures that we get the optimal combination of firms and arm's-length market transactions. Coase notes that he had presented the essentials of his 1937 article in a 1932 lecture when he was 21. "It is a strange experience to be praised in my 80s for work I did in my 20s," says Coase.
His other major article was "The Problem of Social Cost" (1960). In it, Coase shows that if property rights were complete and if transaction costs were zero, then problems like pollution would be solved. How so? If the people hurt by the pollution had the right not to be polluted, they could either stop the pollution or sell their right to the person who wanted to pollute. If those suffering from pollution valued not being polluted more than the polluter valued being able to pollute, then the polluter would not be willing to pay enough to pollute and the pollution would cease. If, on the other hand, the person polluting had the right to pollute but valued being able to pollute less than the victims valued being free of pollution, then the victims would be willing to pay enough to get the polluter to stop. Notice something interesting: Whether the pollution occurs does not depend on who has the right. Irrespective of who starts with the right, the party that values it most ends up with it. The outcome is efficient. The insight was so stunning at the time that George Stigler, Coase's colleague at the University of Chicago, labeled it the "Coase Theorem."
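The zero-transaction-cost logic can be made concrete with a toy sketch. The dollar valuations below are hypothetical, chosen only to illustrate the result; the function encodes the bargaining logic described above.

```python
# Illustrative sketch of the Coase result under zero transaction costs.
# Valuations are hypothetical dollar figures, not from the article.
def pollution_occurs(polluter_value, victims_value, polluter_has_right):
    """Return True if pollution happens after costless bargaining."""
    if polluter_has_right:
        # Victims buy the polluter out iff they value clean air more.
        return not (victims_value > polluter_value)
    else:
        # Polluter buys the right iff polluting is worth more to him.
        return polluter_value > victims_value

# Whichever side starts with the right, the outcome is the same:
print(pollution_occurs(100, 300, True))   # -> False (victims pay to stop it)
print(pollution_occurs(100, 300, False))  # -> False
print(pollution_occurs(300, 100, True))   # -> True  (pollution continues)
print(pollution_occurs(300, 100, False))  # -> True  (polluter buys the right)
```

The initial assignment of the right changes who pays whom, but not whether the pollution occurs.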
In the same Nobel Prize lecture, Coase outlines the significance of his theorem. He stresses that he is not claiming that transaction costs are zero. Rather, he is saying that if they were zero, there would be no problem; if they are positive, then we have to examine whether government or free markets would do a better job of solving pollution and other problems. The tendency before his article had been to assume that transactions are costly in free markets but that somehow the government could arrange things more efficiently.
A gentle man, Coase is also quite willing to take on some of the giants of economics when he disagrees with them. In one essay, "How Should Economists Choose?," Coase criticizes a famous 1953 article on methodology by Milton Friedman. Friedman had argued that the correctness of one's assumptions is unimportant and that all that matters for an economic theory is that it be capable of accurate predictions. Coase responds with a devastating counterexample.
"We could have predicted," writes Coase, "over the last few years what the American government's policies on oil and natural gas would be if we had assumed that the aim of the American government was to increase the power and income of the OPEC countries and to reduce the standard of living in the United States. But I am sure that we would prefer a theory that explains why the American government, which presumably did not want to bring about these results, was led to adopt policies which harmed American interests. Testable predictions are not all that matters. And realism in our assumptions is needed if our theories are ever to help us understand why the system works the way it does. Realism in assumptions forces us to analyze the world that exists, not some imaginary world that does not."
Or consider how he takes on Austrian-school economist Joseph Schumpeter in an essay on Adam Smith's Wealth of Nations. "Schumpeter," comments Coase, "acknowledges Smith's skill in rather grudging terms: `He disliked whatever went beyond common sense. He never moved above the heads of even the dullest readers. He led them on gently, encouraging them by trivialities and homely observations, making them feel comfortable all along.' What Schumpeter means is that the Wealth of Nations can be read with pleasure. It is clear, amusing, and persuasive. Adam Smith's style is, of course, very different from that of most modern economists, who are either incapable of writing simple English or have decided that they have more to gain by concealment."
My favorite part of Coase's Essays is the essays on economics; I found the essays on famous economists that make up the book's second half less interesting. Of the latter, the most captivating are an essay on George Stigler, written before Stigler died in 1991, and an essay on the London School of Economics in the 1930s. The LSE piece points out something known at the time but mostly forgotten now: the young economist F.A. Hayek was a major player at the school in the 1930s.
Coase also writes at length on Alfred Marshall, arguably the greatest economist of the late 19th century. The essay's best moment comes when Coase quotes from a letter Marshall wrote to a student about how mathematics should be used in economics: "(1) Use mathematics as a shorthand language, rather than as an engine of inquiry. (2) Keep to them till you have done. (3) Translate into English. (4) Then illustrate by examples that are important in real life. (5) Burn the mathematics. (6) If you can't succeed in (4), burn (3). This last I did often."
Contributing Editor David R. Henderson is an associate professor of economics at the Naval Postgraduate School in Monterey, California, a research fellow at the Hoover Institution, and the editor of The Fortune Encyclopedia of Economics.
The post Coase Encounters appeared first on Reason.com.
Opponents of further controls have a powerful ally in the mainstream health economists whose work implicitly or explicitly makes the case against government intervention in health care. Here are the most important facts that these economists agree on.
Fact One: Health-care spending is growing faster than gross domestic product almost everywhere.
We often hear that health-care spending is consuming a growing share of GDP in the United States, but this country is hardly unique in that respect. Take the so-called G-7 countries: Canada, France, Germany, Italy, Japan, the United Kingdom, and the United States. In all but Germany, health-care spending consumed a higher percentage of GDP in 1990 than in 1980. The United States led the way, with the share of GDP consumed by health care rising from 9.2 percent to 12.1 percent, a whopping 32-percent increase.
But guess which country was a fairly close second, with its health-care spending rising from 7.4 percent of GDP to 9.3 percent, an increase of 26 percent. Hint One: Proponents of socialized medicine–pardon me, "single-payer health care"–often hold it up as a model. Hint Two: I was born there and left for "the States" when I was 21. Hint Three: This country is directly north of us. Hint Four: Take off, hoser. That's right. Canada, which has "solved" the problem of access to health care by giving it away and which supposedly has kept spending down, has had to contend with massive cost increases.
The Clintons don't want you to know this because, despite the evidence from other countries, they plan to use the police power of the federal government to freeze private real per-capita spending on health care in 1999 and later years. Yet in all of the G-7 countries, including Germany, and in every other Western European country except Ireland, real per-capita spending on health care grew substantially between 1980 and 1990. Indeed, in all but a few, it grew by more than 1.5 percent a year, meaning a compound growth of over 15 percent for the decade. The Clintons would use so-called global budgets to make sure that private spending grew by no more than the consumer price index and the growth in population. And what if an area of the country hit its budget cap for 1999 by, say, October? Tough.
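The growth figures above are simple to verify. A quick sketch, using only numbers quoted in the text, checks the percentage increases in GDP shares and the compounding of 1.5 percent annual growth over a decade:

```python
# Verifying the article's growth arithmetic (all inputs are from the text).
def pct_increase(old, new):
    """Percentage increase from old to new."""
    return (new / old - 1) * 100

us = pct_increase(9.2, 12.1)      # U.S. health share of GDP, 1980 -> 1990
canada = pct_increase(7.4, 9.3)   # Canada, same period
print(round(us))      # -> 32, the "whopping 32-percent increase"
print(round(canada))  # -> 26

# 1.5 percent real per-capita growth, compounded over ten years:
decade = (1.015 ** 10 - 1) * 100
print(round(decade, 1))  # -> 16.1, i.e. "more than 15 percent" for the decade
```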
Fact Two: Government spending on health care has risen much more than private spending.
We often hear that it doesn't matter whether you look at government spending or private spending on health care because both have risen rapidly in the last few decades. Actually, it does matter. Spending by government on its two main health care programs, Medicare and Medicaid, has risen much more rapidly than private spending. In 1970 federal, state, and local governments spent $12.3 billion on Medicare and Medicaid combined. By 1991 this was up to $216.7 billion, an inflation-adjusted increase of 427 percent. By contrast, private spending on health care rose from $42.5 billion in 1970 to $377 billion in 1991, an inflation-adjusted increase of 165 percent. To be sure, this is a substantial increase, but it's less than half the increase in Medicare and Medicaid spending.
These numbers indicate that the explosion of health-care costs is mainly an explosion of government costs. Which makes you wonder why Bill Clinton is not laughed out of court when he says that one main reason he wants the government to control one-seventh of the economy is to keep costs down. Incidentally, the Clintons are aware that Medicare and Medicaid spending are growing faster than private health-care spending. That's probably why their plan has looser limits on Medicare and Medicaid spending than on private spending.
Fact Three: We're getting a better product for our health-care spending.
As Northwestern University health economist Burton A. Weisbrod recently wrote, "Fifty years ago, physicians were little more than diagnosticians…" All they could do was identify an illness and predict the likely outcome. Now they can actually do something. Weisbrod cites many effective medical procedures: kidney dialysis, organ transplants, polio vaccines, arthroscopic surgical techniques, CAT scans, magnetic resonance imaging, and in-vitro fertilization. These techniques have undoubtedly reduced deaths and made life easier. There's still a serious question whether they are worth their high cost. But the only reason that question arises is that those who benefit don't typically pay the bill. Which brings us to the next fact.
Fact Four: The amount of any good people consume is higher if they're spending other people's money than if they're spending their own.
This is not a controversial proposition in economics. It's one of the few principles that economists are really sure of. It applies to food, housing, and, yes, health care. But just to make sure it applied to health care, the U.S. government paid the Rand Corporation to do a five-year, $80-million experiment, beginning in 1974. For a few years, thousands of families in the experiment were given one of four health-insurance plans. The main difference between the plans was the copayment rate–the percentage of health expenses paid by the family–which was 0, 25, 50, or 95 percent. Under all the plans, if a family's out-of-pocket expenses reached $1,000, the insurance paid for all additional expenses.
The Rand experiment's main finding was that people do consume more health care if they're spending other people's money. The higher a family's copayment rate, the less often members of that family went to a doctor and the less often they incurred medical expenses generally. The researchers concluded that a catastrophic insurance plan–a plan in which patients pay a high deductible and then the insurance pays all costs in excess of the deductible–would reduce expenditures by about 31 percent relative to a plan in which people paid nothing out of pocket. If you doubt that, ask yourself how likely you are to have a doctor examine your scratchy throat when the insurance company covers $40 of his $50 fee, leaving you with a bill for $10. Now ask yourself how likely you are to see the doctor if you have to pay the whole $50.
Because almost all insured people, whether covered by their employers or by Medicare and Medicaid, pay close to zero out of pocket, we can straightforwardly apply the Rand results to most Americans. Updating the experiment's $1,000 deductible to 1994 dollars gives a deductible of about $2,000 today. If everyone in the United States switched to such a deductible, estimates Harvard health economist Joseph Newhouse, our health-care spending would fall by about 30 percent. Instead of spending over 14 percent of our GDP on medical care, as we do now, we would spend only about 10 percent.
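The GDP-share arithmetic implied by Newhouse's estimate is a one-liner: a roughly 30 percent cut applied to the current 14-percent-of-GDP level.

```python
# Checking the GDP-share figure implied by the Rand-based estimate.
current_share = 14.0   # percent of GDP now spent on health care (per the text)
reduction = 0.30       # ~30 percent spending cut from high deductibles
new_share = current_share * (1 - reduction)
print(round(new_share, 1))  # -> 9.8, i.e. "only about 10 percent" of GDP
```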
In short, we could get the waste out of the health-care system simply by buying health insurance with high deductibles. And we could get the share of GDP spent on health care to a level way below the one Clinton says he wants to achieve. Clinton doesn't want you to know that high deductibles would solve the problem because his plan makes any deductible higher than $400 per family illegal.
Fact Five: Employers provide low-deductible medical insurance for their employees–with deductibles that often amount to only $200 or $250 per year, after which the insurance pays 80 percent to 100 percent of costs–because the tax law gives them an artificial incentive to do so.
This fact was driven home by my former boss at Ronald Reagan's Council of Economic Advisers, Martin Feldstein. Feldstein, a Harvard professor and a leading health economist in the late 1960s and early '70s, was one of the first to point out that, because employers' contributions to their employees' health insurance are not taxable as employee income, employers have an incentive to load up their employees with health insurance. Consider an employer and employee trying to choose between an extra dollar in taxable wages and an extra dollar of health insurance. If the dollar is in wages, the employer must spend an extra 7.65 cents in Social Security and Medicare taxes. So the real cost to the employer is $1.0765. And the employee gets not $1.00 but $1.00 minus 7.65 cents for the employee's portion of Social Security and Medicare taxes, minus 28 cents if the employee's federal marginal tax rate is 28 percent, minus 4 cents if the employee's state marginal tax rate is 4 percent. Net take-home pay on $1.00: 60.35 cents.
But if instead the employer pays the dollar in higher premiums for a more generous health-insurance program, the government takes nothing. As long as the employee values the extra dollar in health insurance at more than about 60 cents, he or she is better off taking it in that form. That is one main reason that employers have paid for insurance policies with low deductibles. The employee is better off charging a $50 doctor's bill to the insurance company, even if the insurer spends $20 to process it, and having the employer pay the extra $70 in a higher premium. The alternative, having the employer pay an extra $70 in cash, yields the employee only about $42.
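Feldstein's tax-wedge arithmetic can be laid out explicitly. The rates below are the ones used in the article's example (7.65 percent employee FICA, a 28 percent federal marginal rate, a 4 percent state marginal rate):

```python
# Reproducing the article's take-home-pay arithmetic for $1.00 of wages.
FICA = 0.0765   # employee share of Social Security + Medicare taxes
FED = 0.28      # federal marginal tax rate in the example
STATE = 0.04    # state marginal tax rate in the example

# The employer also pays its own 7.65-cent FICA share on the dollar:
employer_cost = 1.00 * (1 + FICA)
take_home = 1.00 * (1 - FICA - FED - STATE)
print(round(employer_cost, 4))  # -> 1.0765
print(round(take_home, 4))      # -> 0.6035, i.e. 60.35 cents

# The $70 premium-vs-cash comparison from the text:
print(round(70 * take_home))    # -> 42, the "about $42" in cash
```

An untaxed dollar of insurance thus beats a taxed dollar of wages whenever the employee values the insurance at more than about 60 cents.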
Understanding these incentives, most economists have concluded that more regulation is not necessary to get people out of low-deductible plans. All the government would have to do is end the favorable tax treatment of health insurance. There are two ways to do this. One is for Congress to declare that henceforth employers' contributions to their employees' health insurance are taxable income. The other way is to make a certain amount of compensation–say, $3,000 per employee each year–tax-free income which the employee can use to buy health insurance or put in a medical savings account similar to an individual retirement account.
So is there a health-care crisis? There are actually two. The first, if we had truth in political advertising, would be labeled "Made in Washington." The second is about to be made in Washington.
Contributing Editor David R. Henderson is a senior research fellow at the Hoover Institution, an associate professor of economics at the Naval Postgraduate School in Monterey, California, and editor of The Fortune Encyclopedia of Economics. He served as senior health economist with President Reagan's Council of Economic Advisers from 1982 to 1984.
The post Health Care: Reality Check appeared first on Reason.com.
Clinton has had this idea for a while. In 1988, the Democratic Leadership Council proposed a plan for national service. In his foreword to that proposal, Sen. Sam Nunn (D–Ga.) lists Clinton, at the time a member of the DLC's governing board, as one of the proposal's developers. Will Marshall, the plan's author, now heads the Progressive Policy Institute, which The Wall Street Journal called "Clinton's campaign think tank."
So although Clinton was vague about the details during the campaign, we can reasonably assume that his national-service proposal will resemble the DLC's. And Clinton thought the idea important enough to mention it again in his November 12 news conference, his first after being elected.
Government service later in return for college aid now is not new. The U.S. military does it with its ROTC program, in which college students receive enough money to cover all college expenses, plus a small stipend. In return, students train during their summers and serve as military officers for a number of years after graduating.
Nevertheless, national service is a bad idea. It undercuts the concept of voluntarism; it would give taxpayers little in return for a lot of money; and, ominously, its strongest supporters have traditionally favored compulsory national service with prison sentences for those who refuse. Furthermore, there is a better way for students to get college aid that costs taxpayers nothing and would not even involve the government.
National-service advocates usually pitch their proposal by appealing to the strong tradition of service in America. They picture many "unmet needs" that national-service workers will supposedly meet. When he described his plan in his nomination-acceptance speech at the Democratic convention, Clinton envisioned "millions of energetic young men and women, serving their country by policing the streets, or teaching the children, or caring for the sick, or working with the elderly or people with disabilities, or helping young people to stay off drugs and out of gangs."
The Democratic Leadership Council claimed that its proposal built on the American tradition of voluntarism that Alexis de Tocqueville so admired in the 1830s. National service, the DLC claimed, would encourage citizens "to rely less on the impersonal agency of government to solve their problems."
But national service actually violates the tradition of voluntarism—precisely by getting the government involved. And that involvement doesn't come cheap.
An October 23 Washington Post article pegged the cost of Clinton's program at $10 billion to $13 billion. An additional $10 billion in spending amounts to, on average, an additional $100 in taxes for each of the roughly 100 million taxpayers in the United States. Clinton thinks his program will cost about $8 billion, an estimate he defended during one of the debates with George Bush and Ross Perot.
Yet this figure appears very squishy. In an October interview, campaign spokeswoman Avis Lavelle told REASON reporter Grant Thompson that the estimate does not rely on any particular assumptions about what percentage of students would choose national service, as opposed to paying off the loans out of their incomes. But without this information, how can anyone predict the program's cost?
What would the taxpayers get in return for their money? Probably not much, if Clinton's bill resembles Sam Nunn's 1989 national-service bill. Nunn's bill forbade "displacing" someone currently employed. This means that national-service workers would not even be able to take the place of any of the millions of workers now being paid the minimum wage of $4.25 per hour. Their labor would then have to be used in tasks worth even less than $4.25 per hour. No wonder the needs are unmet: At current wages, they aren't worth filling.
Yet the $10,000 per year paid by the government to the national-service worker, plus the loan forgiveness of $10,000 per year of work, amounts to $10 per hour. So the government would pay $10 an hour for work that is probably worth less than $4.25.
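The $10-an-hour figure follows from the numbers in the text, assuming a standard 2,000-hour work year (the article states the hourly rate but not the hours; the work year is the implied assumption):

```python
# The implied hourly cost of a national-service worker, per the text.
stipend = 10_000            # annual pay from the government
loan_forgiveness = 10_000   # loan forgiveness per year of work
hours_per_year = 2_000      # assumed standard full-time work year

hourly_cost = (stipend + loan_forgiveness) / hours_per_year
print(hourly_cost)                 # -> 10.0 dollars per hour
print(round(hourly_cost / 4.25, 2))  # ratio to the $4.25 minimum wage
```

At that rate the government would pay well over twice the minimum wage for work the article argues is worth less than the minimum wage.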
To use the workers in more-valuable pursuits, Clinton would have to take on municipal employees, who have the strongest unions in America. During the campaign, he was unwilling to do this.
National service would put the federal government in the position of defining what is service and what is not. Its proponents take a very narrow view. Is a pre-med student serving when she works hard to become a doctor, not only to make money but also to help people? Not according to Bill Clinton.
What about the youth who wants to become a businessman so he can revolutionize the production of a good and sell it at a lower price? Is he serving? Apparently not.
As the proponents of national service see it, someone "serves" only when engaged in an activity that no one values enough to pay for. The national-service advocates take service to others and turn it into work. They would rewrite Milton's sonnet to read: "Only they serve who stand and wait."
To Clinton's credit, his national-service plan is voluntary for the participants (though not for the taxpayers). But many prominent supporters of national service would prefer compulsion. For example, Sen. Charles Robb (D–Va.), one of the sponsors of Nunn's 1989 bill, has stated his preference for universal military training. Nunn himself used to be a strong advocate of the draft.
And Sen. Harris Wofford (D–Pa.), in a 1979 book on national service, expressed his excitement about "the extraordinary mobilization of the talent of young people possible under authoritarian, post-revolutionary conditions" such as those in China. In that same book, Roger Landrum, a contributor to the DLC's proposal, favorably cited Chairman Mao's forced removal of millions of Chinese youth from cities to the countryside and Castro's similar coercion of Cuban youth as examples that "fire some American imaginations."
None of the supporters who reject compulsion do so on principle. In a 1989 Washington Post op-ed, for example, Landrum gave three reasons for rejecting compulsory universal service: 1) "the cost is out of sight," 2) "debates over compulsion are an endless distraction," and 3) "the administrative requirements are a nightmare." So Landrum's only objection to compulsion, aside from cost and administrative hassles, is that it has to be debated. And red tape, not prison sentences for 18-year-olds, is his idea of a nightmare.
It isn't hard to build a scenario in which national service leads to a draft. Here's one: National service attracts few kids from higher-income families. Its advocates then argue that the only way to get broad participation across all income classes is to make national service compulsory. With the voluntary-service network in place, and with an existing constituency of organizations that benefit from the artificially cheap labor, the next step is compulsory service.
Implausible? In 1989, Scott Celley, a spokesman for Sen. John McCain (R–Ariz.), told me that McCain supported national service to "start plugging people into positions and setting up networks with state and local officials." Then, after two years, McCain would have switched to a mandatory system of universal service, with "disincentives" for those who do not participate.
"Could one 'disincentive' be a prison sentence?" I asked. Answered Celley: "Under consideration would be a full range of possible penalties to ensure mandatory participation."
It's true that national service addresses a real problem—funding for college education. But there's a better solution for those whose parents can't afford tuition. They can borrow the money. True, the interest rates for borrowing are high, but they're high because the loans are risky. And the loans are risky for one main reason: The U.S. Supreme Court refuses to allow enforcement of contracts in which people exchange their future labor services for money. The Court calls that "involuntary servitude," but it's no more servitude than the requirement that I pay my mortgage.
The Court does allow such contracts to be enforced when the employer is the U.S. military. National-service advocates would have the government do it for civilian jobs. People should have the same freedom to contract with private lenders to do private-sector jobs. And that would take care of funding for any deserving student who could benefit significantly from college.
Contributing Editor David R. Henderson is an associate professor of economics at the Naval Postgraduate School in Monterey, California, and a senior research fellow at the Hoover Institution.
The post Washington: For College and Country appeared first on Reason.com.
In 1971, as an undergraduate in economics, I wrote James Buchanan a fan letter after getting excited about his book Public Principles of Public Debt. Imagine my surprise when I got a reply the next week. Nor was it a perfunctory reply. In his letter, Buchanan suggested that I update some of the data in his book and submit the results as a paper to the National Tax Journal. (To my regret, I didn't.) Those of you who know academics will know just how rare it is for a senior, internationally known scholar to give this kind of nurturing to one of his or her own students, let alone to a 20-year-old stranger.
Buchanan is well-known for nurturing students, and the Nobel Prize in economics that he won in 1986 did not seem to change that side of him. He reveals more about his own character and personality, and gives punchy commentary on various other topics, in his book Better Than Plowing. The title, Buchanan explains, is a description of an academic career, taken from his mentor, Frank Knight. These essays about Buchanan's roots and his thoughts about his work are probably as close to an autobiography as he will ever write. Those who, like me, are fascinated by reading about people's intellectual odysseys will enjoy much of this book.
Buchanan, who grew up in relative poverty in Tennessee, received his undergraduate education at Middle Tennessee State Teachers College in Murfreesboro. Drafted into the U.S. Navy in August 1941, he spent World War II as a lieutenant at Pearl Harbor and on Guam. One incident in the Navy stands out, in his memory and in the book. The group of cadets with which he was training in 1941 was divided alphabetically. But because there were too few As and Bs with an Ivy League background but a surplus of Ivy Leaguers in the Rs and Ss, an R was imported to head Buchanan's platoon. His name: Bill Rockefeller.
Comments Buchanan: "This initial appointment of cadet officers 'radicalized' me to such an extent that emotional scars remain, even a half century later." (To this day, Buchanan favors confiscatory inheritance taxes, a view that he admits was conditioned by that experience.)
So, writes Buchanan, when he decided after the war to do graduate work in economics, he wanted no part of "eastern establishment" elitism. He decided to attend the University of Chicago.
The decision, he recognizes, was fateful. For it was there that Buchanan met and studied under economist Frank Knight. When Buchanan arrived at Chicago, he was a self-described "libertarian socialist." He had been antigovernment but also, after having read his grandfather's populist tracts on big business, had distrusted the establishment that he thought ran the U.S. economy. But within six weeks of enrolling in Knight's price theory class, Buchanan had become a zealous advocate of free markets.
Buchanan's main contribution to economics is his work on public choice. Public choice, in a nutshell, is based on the idea that just as consumers and producers in the private marketplace act on the basis of their self-interest, so do voters, politicians, and bureaucrats. Buchanan saw the idea of looking at the political system in this way as the obvious next step.
"Once we acknowledge that governmental or political outcomes are, themselves, produced by an interaction of persons, acting in varying political roles," he writes, "the political economist necessarily must extend analysis to the process of interaction and to the relation between process and patterns of results." With this starting point, Buchanan, Gordon Tullock, and others developed the idea that governments, no less than markets, fail in characteristic ways.
When Buchanan was awarded the Nobel Prize for his contributions to public choice, some journalists and economists complained that his thoughts about political behavior were obvious. Interestingly, Buchanan agrees. He says that at no point in writing his first book on public choice, The Calculus of Consent, co-authored with Tullock, was there a sense of "discovery." Rather, the exercise was "writing out the obvious." Yet when their book was published in 1962, it created a lot of excitement and led to much further analysis of political behavior.
Although Buchanan believed while working on The Calculus of Consent that government failed in certain ways, he still thought that it basically responded to the preferences of individual citizens. But as he observed the explosion of government spending to benefit special interests in the late 1960s, his fundamental optimism disappeared. Buchanan came to see the U.S. government as Leviathan. In his 1980 book, The Power to Tax, co-authored with Geoffrey Brennan, Buchanan, in his words, "explores the implications of the hypothesis that government maximizes revenues from any taxing authority" it has.
This view of the state is behind Buchanan's unwillingness to advise governments. He writes: "I have remained unable…to participate putatively in proffering advice to a presumed benevolently despotic government."
One of the book's chapters is devoted entirely to Buchanan's favorite quotes from others. Of these, the three I like most are: "You and I may both believe in planning, but my plan may be to kill you" (economist David McCord Wright). "To talk of states as if they were persons, endowed with the spiritual impulses and aspirations of human beings, and therefore morally accountable, is a piece of pure abstraction for which not even Hegel can be held responsible. It is a habit of modern journalism, catering to that mixture of vulgar passion and dominant materialism which renders unto Caesar the things that are God's because Caesar can be bribed" (W. A. Orton). "You have to belong to the intelligentsia to believe things like that; no ordinary man would be such a fool" (George Orwell).
Writing autobiographies is a new art form for economists. The only other one I know of by a recent American economist is George Stigler's Memoirs of an Unregulated Economist. I actually enjoyed Stigler's more because he managed to tell an exciting story about his intellectual discoveries. But Stigler did not let the reader in on his own thinking about matters other than economics or on his feelings. Buchanan tried something more difficult. He wrote about his reasons for taking various actions, and he is just introspective enough to make me want more. He even writes, "I have never experienced psychological hangups," although he wisely hedges by immediately saying, "at least at any level of rational recognition." We all have hangups, and surely his about wealthy people counts. Still, Buchanan took a risk by exposing his personal side. For that he deserves credit.
Contributing Editor David R. Henderson is an associate professor of economics at the Naval Postgraduate School in Monterey and editor of The Fortune Encyclopedia of Economics (forthcoming in 1993).
The post Choices, Public and Private appeared first on Reason.com.
Modern Times provides an incisive and readable portrait of world events since World War I, a cataclysmic tableau that should be understood by anyone hoping to lead the nation toward the 21st century. What makes Johnson's work particularly compelling is his devastating critique of both relativism and statism. The 20th century has been the "age of politics," he writes, and the grand experiment has proved to be a disastrous failure. As he observes: "The state had proved itself an insatiable spender, an unrivalled waster. Indeed, in the twentieth century it had also proved itself the great killer of all time." Modern Times might help officials at all levels learn not to look to politics—and ever newer and more expensive government programs—as the solution to today's problems.
What should government do? For the answer to this question the president should peruse Charles Murray's In Pursuit: Of Happiness and Good Government. Rather than provide a laundry list of "good" and "bad" programs, Murray delves deeper, arguing that the only appropriate role of the state is to promote people's "pursuit of happiness," that is, he writes, to enable them "to go about the business of being human beings as wisely and fully as they could." That, he finds, is most likely to occur if the government does not interfere with private economic and social interaction, especially the operation of the "little platoons" so active in America.
Indeed, he concludes that government social policy has to be ultimately judged by its impact on the functioning of private society. The state has some duty to help the "little platoons" by, for instance, preventing crime, but, Murray argues, there must be a "stopping point" beyond which government does not try to supplant private efforts. He rightly concludes that people's "satisfaction depends crucially on being left important things over which we take trouble," something recent Congresses and presidents have increasingly failed to recognize.
Also on the president's bedstand should be Richard John Neuhaus's The Naked Public Square: Religion and Democracy in America. The role of religion in the United States is a controversial one; some of the bitterest issues that a president confronts involve church-state relations. Neuhaus's thoughtful analysis should help the reader avoid both extremes: the sterile, even dangerous "naked public square," with religion banned from communal life, and the incestuous, equally dangerous union of church and state, with clerics and politicians allied for any number of dubious ends. What we need, Neuhaus argues, is a rearticulation of "the religious base of the democratic experiment." The Naked Public Square will help the president understand how he can simultaneously welcome religious values in the public debate and resist clerics who reach for the levers of power.
Contributing Editor Doug Bandow is a senior fellow at the Cato Institute and the author of The Politics of Plunder: Misgovernment in Washington and Beyond Good Intentions: A Biblical View of Politics. He formerly served as a special assistant to President Reagan.
Robert L. Bartley
Every president for at least two decades has been blindsided by powerful international economic forces that few of us begin to comprehend and that most of us don't even begin to recognize. To start to grasp the real meanings of global interdependence, a president ought to read Kenichi Ohmae's The Borderless World (Harper Business/HarperCollins, 1990) and Walter Wriston's The Twilight of Sovereignty (Scribner's, 1992). For a sense of what can happen if you get these matters wrong, he could dip back into the interwar period for John Maynard Keynes's The Economic Consequences of the Peace (London: Macmillan, 1920). And he ought to ask his new treasury secretary to read all of them twice.
Robert L. Bartley is the editor of The Wall Street Journal and the author of The Seven Fat Years: And How to Do It Again (The Free Press).
Clint Bolick
Unless Frank Zappa experiences an unexpected electoral surge, it looks like our next president will need guidance not only in public policy but in the style of governance. No one personifies passionate and principled leadership better than Václav Havel, and his Summer Meditations should top the president-elect's reading list.
Havel is the most heroic political leader to emerge on the world stage in recent years, and his life is an example of what it means to stand for something. His elegant and dignified leadership style stands in stark contrast to the frenzied arm waving and demagoguery that dominate American politics.
"Though my heart may be left of centre," Havel declares, "I have always known that the only economic system that works is a market economy," for "it is the only one that reflects the nature of life itself."
Yet the task of building a "state of ideas" is not one of ideology, Havel insists. "Something more is necessary. For the sake of simplicity, it might be called spirit. Or feeling. Or conscience." Havel's introspections could go far in helping our own president rediscover the principles and spirit that made our nation great.
Turning to policy, no issue bodes more ominously than the future of American education. With the collapse of communism in Eastern Europe, the most brazen example outside China of unreconstructed socialism in the world today is America's public-school system. The primary victims are millions of low-income youngsters who are consigned to inner-city educational cesspools and lack access to the most basic skills necessary for responsible citizenship and productive livelihoods.
John Chubb and Terry Moe's Politics, Markets & America's Schools establishes convincingly that more spending and superficial tinkering will not cure the ills of public schooling. Rather, the flaws are inherent in a highly bureaucratic system driven not by consumer satisfaction but by special-interest politics. Nothing less than full-scale market competition and a transfer of power from bureaucrats to parents, Chubb and Moe demonstrate, will accomplish the goal of equal and high-quality educational opportunities.
But let's get real. We're dealing here with an American president. So my go-for-broke selection is The Little Red Hen, the perfect presidential bedtime story that says it all. Ms. Hen, one of the great feminist heroes in popular literature, asks her barnyard neighbors to help her harvest wheat and bake bread. "Not I!" they all respond. But in the end, they all want to share in the fruits of Hen's labor. Think again, cackles the red-feathered fowl.
Hen rivals Charles Murray's works as a scathing indictment of the welfare state and offers a stirring moral defense of free trade and private property. Eight-year-old writer and paleontologist Evan Bolick told me he was persuaded. Maybe it will persuade a president too.
Clint Bolick is litigation director at the Institute for Justice in Washington and author of the forthcoming Grass Roots Tyranny and the Limits of Federalism.
David Brudnoy
We may well never again have a president as plain-spoken, as unaffected, and as little disposed to bend with the winds (and whims) of fashion as Harry Truman, whose (very likely) definitive biography has been written by David McCullough. Truman (Simon & Schuster, 1992) is old-fashioned history, meaning that it tells us what happened, when, why, and where, and leaves to our own ruminations the deeper meanings of happenings. Truman's Fair Deal liberalism may not be everyone's idea of the perfect political philosophy, but it was grounded on a consistent theory of government as enunciated by an honest man.
Rising from provincialism to competence in personal and professional life, avoiding the taint of his machine-politics sponsors, studying the issues and concluding what by his lights was the best course, Truman did what he believed needed doing, growing nobly into the office that was thrust upon him. He was sometimes gawky in his bluntness, but he stood for something and Americans knew where they stood with him. To him, even before his victory in the greatest upset of all time, polls were nonsense, all save the poll on election day. To read Truman is to see what we're missing in the White House.
If we are not to be gobbled up entirely by an ever-growing fraction of the American people who are somehow on the dole, we had best get a handle both on expenditures in the "entitlements" category and on the reasoning, or lack thereof, that got us into this mess in the first place. No one has better analyzed the matter than Charles Murray, whose Losing Ground: American Social Policy 1950–1980 (Basic Books, 1984) puts firmly into perspective the gross social (as well as economic) effects of the Great Society of Lyndon Johnson and its, and its successors', perhaps well-intentioned but nonetheless woeful enlargement of the proto-welfare state. A decade of intelligent analyses along similar lines has subsequently emerged; all owe their initial insights to Losing Ground.
The tragedy of what was Yugoslavia; the splintering of what was the Soviet Union; the halving of Czechoslovakia; the francophone separatism in Canada; tribalism in black Africa; and a score of other recent and hundreds of long-standing ethnic and racial conflicts worldwide demonstrate the bankruptcy of the notion of multiculturalism. Mexican irredentism in our own Southwest, the ongoing crisis of Haitians seeking asylum in the United States, and the craze for Afrocentric "education," to name just a few domestic instances, signal our own suicidal path if we don't snap out of the mentality that embodies the banal truism that "we're a nation of immigrants so no limits on immigration are just."
Jean Raspail's The Camp of the Saints (Scribner's, 1975, originally published in French in 1973, English translation by Norman Shapiro) creates in fiction a chilling dilemma: What will the advanced portion of the world do when the Third World claims a "right" to occupy Western lands without limit? On the one horn of this dilemma is the seeming cruelty of our recent policy of turning away the Haitians; on the other, to use words from the book jacket itself, is "the end of the white world."
Romantic sentimentality notwithstanding, our current policy merely advances the date by which we will be obliged to choose. The Camp of the Saints confronts our president for the mid-1990s with the bitter alternatives. At present our immigration "policy" is inertial, and our internal posture vis-à-vis bilingualism and multiculturalism is morosely tepid.
Contributing Editor David Brudnoy is New England's leading nighttime radio talk host (WBZ-AM) and a commentator on the region's leading TV station (WCVB-TV).
James M. Buchanan
My recommended selection emerges from an initial concentration on the issues that the incoming president in 1993 needs to understand. These issues are, first, the overextension of politics and the apparently inexorable tendencies for this extension to proceed unchecked. Although his model of politics is not fully congruent with modern public choice, Anthony de Jasay's book, The State (Blackwell, 1985) does capture the dynamics of government growth in this century. I recommend this book as a precautionary tale.
A more specific issue, which is itself derivative from the larger one, is the apparent inability of post-Keynesian Western democracies to escape from the regime of continuous, and accelerating, budget deficits. Cole Brembeck's Congress, Human Nature, and the Federal Debt (Praeger, 1991) discusses the origins, the implications, and the consequences of the deficit regime.
Any appreciation of the overreaching of politics must be accompanied by an understanding of how markets can provide nonpoliticized alternative solutions, even if, in many cases, organizational judgments must be made by pragmatic comparisons between an imperfect politics and an imperfect market. David Friedman's short book, The Machinery of Freedom (first edition, Harper, 1973; second edition, Open Court Press, 1989), should be required reading for all those whose natural proclivity is to turn to government as the solution. The president must, somehow, absorb Ronald Reagan's basic vision that politics is the problem, not the solution.
James M. Buchanan is Harris University Professor at George Mason University and the 1986 Nobel laureate in economics.
Craig M. Collins
Political "lifers" like Bush and Clinton don't need to read. Their ambition is to win elections. Having won, they do not need to know more.
To avoid dissolving completely into cynicism, however, I try to believe that mixed up with their selfish motives are traces of a desire to further the public good—so long as it doesn't hurt them politically. For those rare occasions when they have an opportunity to do good, I suggest they read Richard Posner's Economic Analysis of Law so they will know what to do. Posner minutely details the complex interconnectedness of laws and their many unintended effects.
Richard Epstein's Takings shows how we have socialized private property into near extinction. He generally concludes this was not a good thing.
Finally, Julian Simon's The Economic Consequences of Immigration shows that it was no accident that vigorous economic growth in this country occurred coincident with a significant influx of immigrants.
Teaching politicians the right thing to do is not the same as convincing them to do it. They well know their interests are vested in the present system of buying votes by reallocating property. For that to change, the public must first become aware of the corrupting effect of this system. This public awareness will not depend on which books the president reads. It will depend on which books the rest of us read.
Craig M. Collins, a former REASON assistant editor, is an attorney in Santa Monica, California, specializing in property law.
Steven Hayward
The problem of modern democratic government is not simply a tendency to bad policy; it is also that most modern politicians do not have a sufficient understanding of or respect for democratic institutions and procedures. A deeper understanding of the principles of limited government goes hand in hand with better policy. Hence what is most needed is remedial reading.
It may be wildly naive to suppose that, at the threshold of the Oval Office, our nation's pre-eminent political figure can be taught anything meaningful, but here goes. My first book is what I call "the owner's manual to the U.S. Constitution," The Federalist. Why not? Although it is true that these essays were the product of a partisan campaign, and are written in an unfamiliar idiom, there is nevertheless a carefully worked out theory of how our constitutional form of government should work. A president will learn as much from Publius's errors of judgment as from his wisdom. The Federalist shines especially brightly on the current problems of separation of powers, legislative and executive prerogative, and judicial review.
My second recommendation is Harvey C. Mansfield Jr.'s recent collection titled America's Constitutional Soul (Johns Hopkins University Press, 1991). In addition to Mansfield's always learned reflections on the state of constitutionalism and party politics in America, there are chapters analyzing the last four presidential elections, from which a president will learn that the distinction between politics and policy, between campaigning and governing, is false and pernicious. Mansfield's serious treatment of and obvious respect for the political ability and achievements of Ronald Reagan are a nice antidote to the standard cliches against Reagan.
My third recommendation is Jeremy Rabkin's Judicial Compulsions: How Public Law Distorts Public Policy (Basic Books, 1989). This may seem like an odd or narrow pick for a president's short reading list. But Judicial Compulsions focuses attention on a major crisis within our government that isn't receiving adequate attention and that impinges directly on a president's ability to administer the executive branch. Administrative law has become subject to a regime of judicial activism directed chiefly by special-interest litigation. What this means is that neither the executive branch nor the legislative branch is really in control of policy. The point is, limited government and the rule of law require a properly limited judiciary, and the president who understands this and sets out to tame the judiciary will render the republic a noble service. And the judiciary will probably be easier to tame than Congress.
Contributing Editor Steven Hayward is the research and publications director of the Pacific Research Institute in San Francisco.
Thomas W. Hazlett
In the 1988 V.P. debate an uppity journalist asked J. Danforth Quayle what recent book he had enjoyed reading. A bit of tension ensued, as America experienced a collective moment of embarrassment. To his staff's credit, Quayle had the name of some erudite tome at the ready. This put that pipsqueak reporter in his place (especially since the debate format allowed no time-out in which to test the senator's comprehension coefficient). While I have no staff (not counting my laptop), I have been given advance warning of the question: Which three marvelous books should the new president read?
1. Hedrick Smith's The Power Game (1989). This artistic hunk of applied political science describes the sources and uses of political clout in Washington, revealing everything Jimmy Carter should have known but was afraid to ask about the national government. Those boneheads who believe that the pols are crazy and that things get screwed up because we don't have enough smart, good citizens in Washington simply don't see what Hedrick Smith knows: Things happen in government for good reason (even if the results for the lowly American taxpayers are ugly).
"Some like to say that the power game is an unpredictable game of chance and improvisation," writes the New York Times reporter. "But most of the time politics is about as casual and offhand as the well-practiced triple flips of an Olympic high diver." Filled with insights (example: "Congress has a stake in the inefficiency of federal bureaucrats: It lets their staffs become important fixers…."), this volume is an excellent substitute for a Ph.D. at the Kennedy School for a busy chief executive on the go.
2. Ithiel de Sola Pool's Technologies of Freedom (1984). A masterful treatise on the evolution of free speech, this book explains how the opportunities for greater liberty afforded by the revolution in computer intelligence may be sabotaged by the political ghosts of censorship past. The American tradition broke historic new ground in moving firmly away from the stultification of a government-licensed press, yet our First Amendment rights have gone into retreat with the emerging electronic communications media.
This uncivil rejection of our libertarian values is all the more ironic in light of the immense possibilities for genuinely democratic free speech that the new technology has given us. Our transition from a press of newsprint to one of electronics is now a century along; computer technologies are stupendously accelerating the passage. Yet our law and institutions have strangely afforded smaller scope for freedom to the newer forms of speech than to the old, a delineation that makes poor sense legally and no sense technically. (It makes perfect sense politically; see Hedrick Smith, above.) Pool, the late, famous professor of political science at MIT, reminds us of our magnificent heritage as the world's freest speakers nestled happily under the protections of the Constitution's First Amendment. Upholding such, Mr. President, will be your job.
3. Robert Caro's The Years of Lyndon Johnson (either of the two volumes out now, or of the two due out soon). Caro's due diligence turned up the dirt on President Johnson, years after the legends (promulgated by the fearsome commander-in-chief himself) had been swallowed whole by journalists and biographers alike. Read Caro on Johnson and you will know a scoundrel. In glorious detail and riveting prose. Yeccccckkkkk! An odious perversion of public power on display for all the world to see. Can this massive dose of posthumous public shame inoculate our future president from hubris disease? Let us hope. Please read Robert Caro and remember: Someone will be watching. Closely.
Contributing Editor Thomas W. Hazlett teaches economics and public policy at the University of California, Davis.
David R. Henderson
Mr. President, you need three main things from your reading. First is a sense of what people's rights are and how a just government should treat them. Second is a basic understanding of how the world works. Third is a perspective on the 1980s. The following list fills the bill, to the extent any three books can.
1. Richard Epstein, Takings: Private Property and the Power of Eminent Domain. This book by a law professor starts from each person's right to his or her own body and ends up showing, on that basis, that government has no right to take people's property without just compensation. Epstein then shows, with flair and buzzsaw logic, that the Fifth Amendment's ban on takings without just compensation invalidates most zoning laws, all price controls, "progressive" taxation, and most government spending.
2. Paul Heyne, The Economic Way of Thinking. Because understanding how the world works requires a basic understanding of economics, I recommend this introductory textbook. It lays out beautifully how cooperation among people works in a free-market economy. It gives you a basic understanding of how a price system works, and works magnificently, to turn conflict into harmony. Among other things, Heyne's book shows why free trade makes both sides better off and how price controls cause destruction.
3. William A. Niskanen, Reaganomics. You cannot understand the 1980s without understanding what the economic policies were and what effects they had. Niskanen, even though a Reagan partisan, gives the most even-handed treatment of Reagan's economic policies available. Indeed, Herb Stein, no partisan of Reagan himself, called Niskanen's book "a lucid analysis of Reagan's economics by that rare creature, an objective insider." Lou Cannon, a "liberal" Washington Post columnist, called it "a definitive and notably objective account of administration economic policy." Niskanen tells the good—some budget cuts, large cuts in marginal tax rates, and a substantial reduction in inflation—along with the bad—failure to get spending under control, huge deficits (caused by the failure to control spending), and protectionist trade policies. Niskanen also gives a sense of the relative importance of various economic issues.
Contributing Editor David R. Henderson is an associate professor of economics at the Naval Postgraduate School in Monterey and editor of The Fortune Encyclopedia of Economics (forthcoming in 1993). He was previously a senior economist with President Reagan's Council of Economic Advisers.
Rick Henderson
The health of the economy will be the most important issue the next president will address. Effective economic policy is no longer a purely domestic matter. It requires a global view.
Economists Richard McKenzie and Dwight Lee recognize this. In Quicksilver Capital: How the Rapid Movement of Wealth Has Changed the World, they say that the information revolution allows nations, not just local regions, to compete for investments in capital and labor.
Fifty years ago, F. A. Hayek argued that central planners never possess enough information to efficiently direct economic activity. Back then, when planners tightened their grip on entrepreneurs and employees, those people suffered. Now, say McKenzie and Lee, capital can (and does) move faster than central planners can try to manipulate it. Policy makers who try to increase taxes and regulations will find their capital bases moving to more hospitable climes. The authors also insist that, so long as government remains intrusive, no amount of "investment" in worker retraining and public works can prevent private capital from fleeing. There's plenty here for either George Bush or Bill Clinton to chew on.
The president will also face a nation with decaying cities, disintegrating families, and a breakdown of what European liberals call "civil society"—the informal network of neighborhoods, churches, and other voluntary arrangements that (besides work) provide meaning and relevance to people in their everyday lives. Charles Murray's In Pursuit: Of Happiness and Good Government argues that government attempts to replace that voluntary sphere in poverty-stricken areas have had disastrous consequences.
How, Murray asks, can a person with little education or few skills find fulfillment? As a good neighbor, an effective parent, or a valued friend. If the government is incapable of keeping the streets safe enough for children to walk to school, neighborhoods—in any meaningful sense—can never come into being. Government can set the conditions that allow these bonds to form, for instance, by making neighborhoods safe. Otherwise, Murray says, it should get out of the way. A president who pays attention to In Pursuit could give millions of despairing Americans a chance to start working through these difficult times.
If Charles Murray provides theoretical justification for the importance of neighborliness, John Shelton Reed tells you how much fun it is to be a good neighbor. In Whistling Dixie: Dispatches from the South, the University of North Carolina sociologist spells out why minor-league baseball games, church picnics, and fishing trips are important.
Even though he calls himself a "crypto-semi-neo-Agrarian," Reed is not an enemy of modernity; nor is he an apologist for the Jim Crow days, and he doesn't imagine that everything was perfect 40 years ago. But he is onto something: Not so long ago, life was more civil. And (I would argue) we've lost much of that civility because we expect politicians and bureaucrats to solve every problem that comes our way.
John Shelton Reed and Charles Murray would probably agree about many things. The next president would be wise to listen to what they have to say.
Rick Henderson is Washington editor of REASON.
Karl Hess
Because the next U.S. president will face crucial decisions about abandoning or restoring a republican form of government, it is urged that he read two current books—In Pursuit: Of Happiness and Good Government, by Charles Murray (Simon & Schuster, 1988), and The Disuniting of America, by Arthur Schlesinger Jr. (W.W. Norton, 1992)—and the somewhat older The Institutional Imperative, by Robert Kharasch (Charterhouse Books, 1973).
Murray's book can take its place as the peer of any book ever written on the nature and propriety of government—not as an ideological treatise but as a careful questioning based on only one assumption: that the pursuit of happiness, person by person, is in fact why our own government, an epic and historic innovation, was created and constitutionally constrained. Murray writes about people, not society, and makes the difference crystal clear. His book is a guide to the preservation of liberty, which, in turn, is the essential condition for the pursuit of happiness. The next president, being a representative of some faction or another of exactly the sorts warned against in The Federalist Papers, probably will find Murray's book intolerable. Alas.
Because the next president will serve during a time when the factions will have developed their own special languages—and possibly will even have been elected by echoing special vocabularies—the Schlesinger book is a superb reminder of the success up until now of the melting-pot dynamics of the country that still remains the preferred destination of so many immigrants, legal or not. When people vote with their feet, they generally vote American, no matter the politically correct position of blaming America for most, if not all, the world's ills. The Schlesinger book is particularly impressive as a counter to anti-American slanders because of the author's long and honorable representation of the modern liberal position. In this book he even sounds a bit like a classical liberal.
The Kharasch book is one of those overlooked gems that can make your day when you find a copy on a shelf of used books. It is a lighthearted but actually most serious look at how bureaucracies operate. The Iron Law: Bureaucracies exist in order to exist, no matter their publicly stated goals or roles. Because the next president will be in large part ruled by the demands of the bureaucracies, this book is an essential guide to the facts behind the factions. It also presents serious recommendations as to how the bureaucracies could be tamed. An example: No agency authorized to declare an emergency should also be authorized to manage it.
Karl Hess is a writer living in Kearneysville, West Virginia.
John Hood
Despite the pandering rhetoric of the election campaign, foreign policy remains the first and foremost duty of any president. To fulfill that duty, our new president must first have a solid understanding of the century's worldwide conflicts, both their historical roots and their significance for future policy making.
If the rise and almost-fall of totalitarianism demonstrates anything, it is the principle that "ideas have consequences," as do idea makers. Intellectuals, far from being cloistered agents of learning and discourse, ultimately determine the course of human events—by creating rabid, revolutionary movements with millions of victims, or alternatively, by constructing a philosophical framework for protecting liberty. Leaders ignore the life of the mind to their peril. Today's philosophy students can be tomorrow's Khmer Rouge or Shining Path. Today's mild-mannered professor or author can be tomorrow's Karl Marx or Abimael Guzman.
Paul Johnson's Modern Times: The World from the Twenties to the Eighties demonstrates that the conflicts and catastrophes of the century have their roots in intellectual trends. The new president must understand the importance of ideas, of philosophy, and of rhetoric, if he is to lead his nation out of its current post–Cold War torpor. Ronald Reagan, despite policy miscues, will always be counted as among the greatest of our presidents because of his implicit understanding of the power of ideas (gleaned, perhaps, from his career as an actor—a field not too distant, in many ways, from that of rhetoricians and scholars).
More specifically, the new president must thoroughly understand why Marxism failed, both as a political system and as a system of economic, psychological, and cultural insights. Reading Thomas Sowell's Marxism: Philosophy and Economics would be an excellent start.
For a little light reading, I'd advise my president to read the plays of Shakespeare—particularly Hamlet, King Lear, and Julius Caesar—for their insights into human action and the nature of leadership. Both George Bush and Bill Clinton escaped the American education system before its demise and thus have no doubt read these works. But Shakespeare is best savored, not simply skimmed. And if all the world is indeed a stage, then the next president of the world's only superpower will play the lead. He had best memorize the right lines.
Contributing Editor John Hood is editor of Carolina Journal and a columnist for Spectator magazine in Raleigh.
F. Kenneth Iverson
The president should read:
Trashing the Planet, by Dixy Lee Ray with Lou Guzzo. A well-known scientist gives an even-handed, common-sense perspective on environmental issues. It avoids the distortions and hysterical rhetoric that seem to be the order of the day.
The Fair Trade Fraud, by James Bovard. The author provides an in-depth look at our chaotic trade laws, which give incompetent industries an entitlement to milk the American consumer. The Fair Trade Fraud is the frightening story of the 8,000 tariffs and 3,000 quotas that restrict foreigners' right to sell and American citizens' right to buy; it describes an area where the government has clearly invaded the rights of the individual.
The Next Century, by David Halberstam. A short book by a thoughtful observer of society on our problems and the changes we need to make a better tomorrow.
F. Kenneth Iverson is chairman and chief executive of Nucor Corp.
Elizabeth Larson
Set in South Africa half a century ago, Alan Paton's deeply moving tale, Cry, the Beloved Country (1948), is both a tragedy, in the classical meaning of the word, and a paean to the human virtues and dignity sadly lacking in much of American society today. Many nonfiction works have been written in recent years decrying the effects of the welfare society and the cult of victimization on personal morals and responsibility. For all their careful analysis, documentation, and statistics, however, none of those books brings home to the reader as Paton does the evil of abdicating individual responsibility and the human dignity of those who willingly live, and die, as a result of their actions.
A sidelight to Paton's central tale of a simple Zulu pastor and his wayward son is the story of the pastor's village—the land overworked and infertile and the people despondent. A wealthy white man arrives one day with plans to reverse this "tragedy of the commons" by dividing the land among the villagers. The right to private property is the subject of the other two books I suggest for our incoming president: Free Market Environmentalism, by Terry L. Anderson and Donald R. Leal (1991), and Takings: Private Property and the Power of Eminent Domain, by Richard Epstein (1985).
Anderson's and Leal's environmental reader is the most important book for any political leader surrounded by aides, policy makers, and green advocates claiming that only the government can remedy environmental "crises." While other free-market environmental books are essential resources for information on specific environmental problems and why government "solutions" have made them worse, Free Market Environmentalism provides the fundamental principles used by every free-market environmental writer. Anderson and Leal explain, with many historical examples, that environmental problems can be solved by providing the right incentives to the people involved and by letting human initiative, not government mandates, take charge.
Particularly in light of recent battles between property owners and environmental activists over the "taking" of private property by restricting an owner's use of his land, Epstein's authoritative analysis of the eminent domain power, as restricted by the Constitution, is the most important work on the subject available. The deceptively simple questions Epstein considers (What is a taking of property? Do current regulations—say, zoning or rent control—fall into that category?) ought to be posed to every policy maker from the president to your local zoning board—and, unfortunately for the security of property rights in America today, almost never are. A new president couldn't have a better foundation upon which to build his presidency than a profound respect for what the Founders considered one of the inalienable rights of women and men.
Elizabeth Larson is REASON's production editor.
Laura Main
Our nation's problems stem from an internal sort of cancer—call it lack of "family values" or, to be blunt, simply a lack of values. It touches every segment of our country, from crime on our streets to the well-being of our businesses, and it has very little to do with having children out of wedlock.
Even with large segments of the population receiving some sort of government aid, we still find a nation in the grips of so-called poverty. The book that blows the lid off the ineffectual hand-out system is Charles Murray's In Pursuit: Of Happiness and Good Government. Murray quite graciously touches on each of the foundations every individual needs to find true happiness—self-respect, education, a functional community, and family—and how our present system is providing everything but that. Everyone should read this book, not just the next president.
So that our president will further his understanding of the need for true self-esteem (not the pop version), I would recommend The Psychology of Self-Esteem, by Nathaniel Branden. And last, but not least, A Bend in the River, by V. S. Naipaul, to illustrate the negative effects of populating cities and towns with innumerable government employees ignorant of what a town truly is and what it means to be a citizen in one. A mere facade of freedom and prosperity, orchestrated by an irresponsible government, can only result in one thing: barely surviving in a jungle.
Laura Main, a former art director for REASON, is an artist and graphic designer in Los Angeles.
Donald N. McCloskey
If a president reads anything longer than 50 pages containing an argument, it's good news. Presidents—of universities and of companies as much as of the United States of America—have to be quick reads. But too much quick reading makes Jack a superficial boy. It makes him an arrogant boy, too, a Ross Perot, unaccustomed to the modesty of quiet listening. To read a good book with an argument you have to shut up and listen for a few hours, or you're not going to get it. The executive summary won't do. The last long-reading president was Harry Truman. Asked in his old age whether he liked to read himself to sleep he shot back, "No, young man: I like to read myself awake."
One book for the awakening would be Eric Hoffer's The Temper of Our Time (reprinted in 1992 by Buccaneer Books). Hoffer, who died in 1983, leaving 10 of these short but luminous books, was a San Francisco longshoreman and sage. He received no formal education, seizing it instead from libraries and bookstores on his way to pick fruit or offload cargo. He wrote in aphorisms, which make his books readable in rest periods from working out the schedule for the White House tennis court. Though a worker, Hoffer supported capitalism; though a thinker, he distrusted intellectuals. "In politics, the intellectual who as a 'man of words' should be a master in the art of persuasion refuses to practice the art once he is in power. He wants not to persuade but to command." A president should know that; Hoffer knew it at the height of American social engineering.
After Hoffer's aphorisms, try a sustained historical argument, from J. R. T. Hughes, a great economic historian at Northwestern who died this year prematurely: The Governmental Habit: Economic Controls from Colonial Times to the Present (new edition, Princeton University Press, 1991). From Hughes the president can learn the unhappy fact that we have always liked to interfere, we American individualists. Should "deregulation" be turned back by the new administration? Don't make me laugh. What Hughes called "the regulatory junk pile" is three centuries deep. We can crush modern economic growth with the junk pile if we try hard enough. The reborn state socialists in the environmental movement would like us to do just that. A president should know it.
And light relief: P. J. O'Rourke's A Parliament of Whores: A Lone Humorist Tries to Explain the Entire United States Government (Atlantic Monthly Press, 1991). The book is gut-wrenchingly funny. Mark Twain called Congress "America's only native criminal class"; O'Rourke extends the characterization to the entire U.S. government. You can imagine the new president not joining the laughter. He should, and would thereby gather O'Rourke's serious point. It came to him in the middle of a New England town meeting: "The whole idea of our government is this: If enough people get together and act in concert, they can take something and not pay for it." There's something every president should know.
Donald N. McCloskey teaches economics and history at the University of Iowa. His most recent book is an edited collection, Second Thoughts: Policy Lessons from American Economic History, just out from Oxford University Press.
Michael McMenamin
Mr. President, economics has never been your strong suit. You showed no more appreciation of how to generate real growth in the economy than your opponent did. Yet if the economy in the next four years doesn't begin to demonstrate the kind of growth it enjoyed during the '80s, your party may well be shut out of the White House for the next generation.
So what books can you read during the next two and a half months that might really make a difference in your new administration? Given the constraints on your time, the books should be 1) relatively short, 2) entertaining, 3) a source of ideas for improving the economy (if not the government), and 4) written by someone who has no interest in being appointed by you to a high government office.
The three books I recommend are (in the order in which they should be read): A Parliament of Whores, by P. J. O'Rourke (Atlantic Monthly Press, 1991); The Fair Trade Fraud, by James Bovard (St. Martin's Press, 1991); and For Free Trade, by Winston S. Churchill (Arthur L. Humphreys, 1906).
O'Rourke's A Parliament of Whores is the most accurate, insightful book on American government since Tocqueville's Democracy in America. Read it. Suggest to your staff that P. J. would make a fine White House director of communications. (Don't worry, he won't accept.) Fire those who disagree. They have no sense of humor and you're going to need people who do in your administration.
Armed with your newfound insight into how American democracy really works, move right on to the next book. It's time to start saving the economy. The secret lies in two words: tariffs and quotas. Eliminate them. Totally. Unilaterally. The result will be a $1,200 windfall to every American family, which will return $80 billion to the private sector by lowering prices to consumers everywhere, especially food and clothing for the poor and middle class. As a bonus, you will reduce federal bureaucracy. You will not increase the deficit and you will also cut the cost of goods for U.S. industry, thereby making it more competitive in world markets. It's all there and more in Bovard's The Fair Trade Fraud. All you need is the political will and skill to bring it off.
Which leads to the third book. You and your opponent gave lip service to free trade during the campaign. Everyone does. Then they go out and vote for "temporary" quotas and tariffs in order to achieve a "level playing field" and pocket the campaign contributions from those businesses who benefit. What will you do without those contributions and how do you respond to the demagogues who claim it is unpatriotic to buy less expensive, higher-quality, foreign-made products? Study Churchill's book.
He knew more about the politics and economics of free trade by the time he was 32 (when For Free Trade was published) than any 10 politicians you will have to face in the next four years. The arguments haven't changed in 90 years, and the numbers are still on your side. There are more cost-conscious consumers and competitive U.S. companies who benefit from the lower prices of free trade than there are inefficient businesses who wrap themselves in the flag and ask the government to save them. Organize the competitive companies; mobilize consumers. Collect campaign contributions from the former and votes from the latter. Churchill will show you how. Plagiarize him. His book is in the public domain. And if the Library of Congress can't find its copy, I'll lend you mine if you promise a) not to mark it up and b) to return it in good condition in February. Good luck.
Contributing Editor Michael McMenamin is an attorney in Cleveland.
Charles Murray
Takings: Private Property and the Power of Eminent Domain, by Richard A. Epstein, Harvard University Press, 1985. I grant that the possibility of George Bush or Bill Clinton actually getting all the way through Takings is, charitably, remote. But they should. Contrary to popular political rhetoric ("The United States is the only industrialized country in the world without a national…."), our country's excellence does not lie in becoming another Western European social democracy. Ronald Reagan had this one thing right: The United States is primarily not a geographic entity but the political expression of a few core ideas. The subtext of Epstein's intellectual tour de force is that those ideas are not rhetoric for the Fourth of July but are—or once were—the bones and muscle of the Constitution. It would be helpful to have a president who understood that.
Little House on the Prairie, by Laura Ingalls Wilder, Harper, 1935. This is not a joke. I read Little House to my youngest daughter a few months ago and was at least as engrossed as she was. Little House is not really fiction but a reminiscence of a real childhood and a common national experience. What comes home most forcibly to the modern reader is how much we once took for granted that people and neighbors could do, and would do, for themselves—and, in contrast, how pitifully small and cramped is the average politician's vision of the average citizen's capacity. We Americans like to think of ourselves this way. Perhaps we would like to live that way too.
There is also a delicious final touch to Little House, a tiny counterweight to the politically correct histories of the American West. Do you remember why Laura's parents had to abandon the farm they had carved out of the prairie? Because they had settled four miles inside lands that were granted to the Indians by treaty, and the U.S. Army forced the whites to leave.
In Pursuit: Of Happiness and Good Government, by Charles Murray, Simon & Schuster, 1988. OK, I understand that it's unseemly to choose one of my own books. But the point of In Pursuit was to lead people to think about the question, "What do we really want to accomplish?" in rigorous ways, applying it to the practical assessment of policy, and that is what being president is all about. It is a question neither Bush nor Clinton has visibly worried about in the past. It is time they did.
Charles Murray is the Bradley Fellow at the American Enterprise Institute.
Charles Oliver
Finding three books that distilled all of the accumulated wisdom of mankind was hard. Finding three that would also appeal to a man who has neither the time nor the inclination to read fat books dealing with difficult ideas was nearly impossible.
Obviously, the first book on the list should be about economics. Voters consistently said that the economy was the most important issue in this year's election. Unfortunately, despite occasional bows to business, the president has never demonstrated that he understands how markets work. My first thought was to recommend Ludwig von Mises's Human Action. This exhaustive work starts with the basis of economic activity—individuals acting for their own ends—and explains in detail how markets work and why government interference keeps them from working. But the book, though clearly written, is long and complicated, one the president probably wouldn't pick up, much less finish. So instead I'm advising him to read Henry Hazlitt's Economics in One Lesson. The best primer on economics, this book demolishes most of the fallacies the president will hear from his advisers.
The second big issue facing the country is race; divisions among various ethnic groups threaten to rip this country apart. I was tempted to pass along various books by Walter Williams, Thomas Sowell, Shelby Steele, Anne Wortham, and Stanley Crouch, but instead I advise the president to read Richard Epstein's new book, Forbidden Grounds. This sweeping book begins with the basic values of liberal society—freedom of contract and freedom of association—and shows how these values foster another liberal value, racial tolerance. Epstein then demonstrates how current civil-rights policies not only undermine freedom of contract and association but also promote racial division. Epstein's book should be the basis for a reevaluation of civil-rights law.
Deciding upon a third book proved to be the most difficult. Should I recommend something to counter all of the environmental doomsaying the president will undoubtedly hear from his advisers? Should I pass along a book outlining the benefits of free trade? What about foreign policy or defense?
I decided upon none of those options. Instead, I urge the president to read Robert Heinlein's The Moon Is a Harsh Mistress. Why? First of all, it's a damned good read, the best of Heinlein's novels, so the president won't put it down. That's good because the first part of the book paints a believable portrait of how a truly free society would work. This book isn't abstract ideas but people, albeit fictional ones, dealing with problems and solving them without the government's help. Quite frankly, this book could do more to impress the value of freedom upon the president than any other I could recommend.
Charles Oliver is assistant editor of REASON.
Robert W. Poole Jr.
What has been most sorely lacking in the Bush administration is a basic vision, a philosophy of government. The most profound and important book on this subject in many years is Thomas Sowell's Knowledge and Decisions.
Sowell's inspiration was F. A. Hayek's 1945 essay, "The Use of Knowledge in Society." Knowledge and Decisions is a book-length elaboration on that theme, drawing on the extensive body of knowledge produced during the '50s, '60s, and '70s in such fields as law and economics and public choice theory. The book's theoretical first half explains how knowledge is generated and used in society, the necessity of trade-offs (economic, social, and political), and the crucial importance of incentives in human organizations. Part II applies these principles to 20th-century trends in economics, law, and politics, showing how and why centralization of government fails to solve the problems it's intended to solve and creates a host of new ones. A thorough familiarity with these lessons would give the president a needed dose of humility about what government planning and programs can accomplish—plus a framework for shaping a new kind of presidential agenda.
Perhaps the most serious threat to Americans' well-being and prosperity today is the rise of pseudoscience—irrational attacks on foods, drugs, chemicals, energy supplies, and modern technology itself in the name of protecting us from cancer or saving the environment. The first book to document the perversion of science in the service of a new regulatory agenda was Edith Efron's vastly underappreciated 1984 book, The Apocalyptics. Efron's specific subject is cancer prevention, and she presents the book as an intellectual detective story: a journalist discovering and systematically documenting the gradual corruption of science in the service of environmental politics. The book's length can be intimidating, and its title may be off-putting. But Efron's message must be understood by policy makers, especially as the same type of pseudoscience now dominates far too much environmental and energy policy making.
Another issue high on any president's agenda must be urban policy. Yet until last year, most books about cities failed to acknowledge the profound changes that have taken place in urban form over the last two decades. Joel Garreau's Edge City is the first popular book to take seriously the shift of economic activity from traditional downtowns to suburbia. What makes Garreau's sometimes rambling account especially interesting is that he obviously began his research hostile to these changes but ended up discovering a highly decentralized market process at work—a process that reflects the way real people prefer to live and work. An urban policy based on trying to restore the predominance of traditional downtowns, served by traditional transit, is not only doomed to failure but also profoundly antidemocratic.
In recommending these three books, I take it for granted that the president-elect has already read David Osborne and Ted Gaebler's much-touted Reinventing Government. It reflects a "new paradigm" approach stressing choice, competition, cost-effectiveness, and accountability. While hardly laissez-faire, this approach would represent a welcome change of course.
Robert W. Poole Jr. is president of the Reason Foundation and publisher of REASON.
Virginia I. Postrel
Washington is a weirdly sterile place—the true home of the cultural elite, left and right—and the president is the most insulated person in America (with the possible exception of Michael Jackson). Reading should break the box and pull the president into the world where people don't all have identical suits, identical haircuts, and mostly identical ideas about what constitutes the good life.
For starters, I recommend two beautifully written books about the people who are transforming world business and world cultures: American Steel: Hot Metal Men and the Resurrection of the Rust Belt, by Richard Preston (Prentice-Hall, 1991), and The Outnation: A Search for the Soul of Japan, by Jonathan Rauch (Harvard Business School Press, 1992).
American Steel is an adventure, an absolutely riveting drama of the building of a minimill to make rolled steel with never-before-tried technology. This audacious undertaking is made all the more challenging by Nucor Corp.'s determination to do everything fast. The book has plenty to say about international and domestic competition—"man against man" in English-class jargon—but it is really about man against nature, about the joys and hazards of taming metal that's nearly 3,000 degrees Fahrenheit, "runny as water and as unpredictable as a cat."
And while Preston vividly portrays the romance of hot metal, American Steel is anything but romantic. A terrible accident destroys much of the mill and leaves a man to die a slow and painful death from burns. "Until you see the walls of a steel mill blown off and part of the roof blown away, the power of hot metal doesn't hit you." Neither I, nor I suspect the president, would be willing to take the risks that making steel requires. But some people relish them, and civilization is the better for it. The president should appreciate that. So should the risk-averse control freaks who populate Washington.
The Outnation is as tranquil as American Steel is hard driving. Less about trade, competitiveness, and international relations than about people, culture, and values, this tiny volume (180 pages, with photographs) has more insightful things to say about trade, competitiveness, and international relations than most books two or three times its length.
Those insights spring primarily from Rauch's willingness to look at Japan detail by detail instead of cramming an entire civilization—and a country of 125 million not-in-fact-homogeneous individuals—into a tidy thesis for talk-show bookers. An enormously subtle book filled with well-chosen stories about real people, The Outnation appreciates and exposes the myths Japanese and Americans tell about our cultures and our differences. It is suffused with a sense of history and with a great appreciation for liberal values and why we value them. Reading it, we learn not only about Japan but about ourselves, where we come from, and, perhaps, where we're going.
Dedicated "to the unknown civilization that is growing in America," The Constitution of Liberty, by F. A. Hayek (University of Chicago Press, 1960), is three times as long as The Outnation, has no pictures, and tells no anecdotes. It is not journalism. But it is profoundly about "the real world" and, though philosophy, it is not abstract.
Hayek's is the nuanced world of history and action, in which knowledge emerges from experience and experimentation and principles are different from revealed axioms. The Constitution of Liberty is one of the wisest books ever written, the most appreciative of liberty, and the most distant from today's Washington—a place where people actually believe the man in the White House "creates jobs" and dictates culture. Entering Hayek's world, even for a chapter, would be a radical step out of the box.
Virginia I. Postrel is the editor of REASON.
Jonathan Rauch
Three books, Mr. President? I can think of three dozen, and I can think of one. High on a list of three dozen would certainly be Mancur Olson's The Rise and Decline of Nations (Yale University Press, 1982), whose 10-year-old predictions today look depressingly accurate. Olson's hypothesis is that special-interest groups and their anticompetitive arrangements accumulate inexorably over time and gradually choke off economic and political vitality. Thus may postwar democracy, in America and elsewhere, seize up in much the way a man might choke on his own phlegm. Olson's hypothesis, though not uncontroversial, positively must be reckoned with, especially by a president, who needs to appreciate that pandering to interest groups is more dangerous than it seems.
Another among a handful would be Aaron Wildavsky's unheralded but fascinating The Rise of Radical Egalitarianism (The American University Press, 1991). Wildavsky looks at activists of seemingly quite different kinds, from land-use regulators to feminists to environmentalists to animal-rights advocates, and discovers a common cultural thread, namely the belief in the moral virtue of diminishing any given array of differences between people (or species). Most Americans believe in liberty and equality, but radical egalitarians are one-value people—like the antipodal radical libertarians, but much more influential. "Egalitarians exist not to be satisfied," writes Wildavsky (italics his own). He will help show you what makes them tick.
But really there is only one book, on a list by itself. It was published in 1988 and has become a monument to the fact that liberalism still has millions of committed enemies, and they will hurt us if they can.
You ought to read The Satanic Verses and make sure everybody knows you are reading it. Then you should ensure that there will be no semblance of normal relations with Iran until the death sentence against Salman Rushdie is revoked. In 1989, George Bush's reply to the death sentence was of oatmeal consistency, and the White House has been silent on the matter ever since. Please do better. If the president of the United States does not stand stoutly beside those who exercise the right to criticize (as Rushdie's novel harshly and justly criticized both the Ayatollah Khomeini in particular and Islamic fundamentalism in general), then the world does not need him.
Jonathan Rauch, a contributing editor of National Journal, is author of The Outnation: A Search for the Soul of Japan (Harvard Business School Press, 1992).
Jacob Sullum
It's tempting to recommend the works of important classical liberals or summaries of contemporary libertarianism. But I'm afraid these would be too easily dismissed as outmoded or radical. Instead I've selected three books that are provocative without being too threatening or alienating. Rather than produce instant converts, they should have a more subtle, long-term impact that may result in some salutary second thoughts.
Richard Nisbett and Lee Ross's Human Inference: Strategies and Shortcomings of Social Judgment (Prentice-Hall, 1980) is, despite the forbidding title, quite readable (for a serious psychology book, anyway). Nisbett and Ross explore how people go wrong in making judgments about themselves and the world around them, mainly by relying too heavily on cognitive rules of thumb and by failing to understand or apply principles of statistics and probability that are familiar to scientists. The book has no obvious political bent, but its implications for public policy, especially with regard to regulation and risk assessment, are profound.
Another psychologist, Stanton Peele, has built a career on challenging the conventional wisdom about addiction. His Diseasing of America: Addiction Treatment Out of Control (Lexington Books, 1989) is not a direct critique of the war on drugs, which may give his message a better chance of getting through. Like Thomas Szasz, Peele argues that the medical analogy has clouded our thinking about addiction and that we need to talk about people and responsibility rather than chemicals and diseases. He refutes many widely accepted myths about addiction, including the idea (accepted by a lot of Democrats and drug-policy reformers) that more money for treatment is the solution to "the drug problem."
Randy Barnett is not a psychologist, so far as I know. But he is the editor of and a contributor to The Rights Retained by the People (George Mason University Press, 1989), an attempt to understand the much-neglected Ninth Amendment. Although the book has a lot to say about liberty and individual rights—which you might think would be a turnoff for politicians—it draws on a wide range of perspectives, so it has a mainstream appearance.
Nevertheless, Barnett dares suggest that rights do not come from governments, that the Ninth Amendment might actually mean something, and that courts could even apply it in a principled fashion. Bold yet respectable, this book might plant a seed of doubt about the state's presumptions, or at least curiosity about our political heritage.
Jacob Sullum is associate editor of REASON.
Martin Morse Wooster
In asking this question, the editors of REASON are asking the president to do something that politicians rarely do—read for pleasure. Books are something most presidents avoid. Ronald Reagan read Tom Clancy, and John F. Kennedy enjoyed Ian Fleming novels, but Harry S. Truman was the last avid reader in the White House. Most presidents—and most politicians—subsist on a diet of words that consists of memos, government documents, and the occasional public-policy magazine.
So my first request would be that either George Bush or Bill Clinton find the time routinely to sit in a comfortable study full of books not directly related to his job. Most good writers are avid readers, and one of the reasons most politicians are incapable of giving a persuasive speech is that their "in" baskets largely consist of styleless mush. With that in mind, here are three books that will help the president sort through the issues of the day.
No better history of our sad and savage times exists than Paul Johnson's recently revised Modern Times (HarperCollins). Johnson wittily dissects the follies of dictators and statists in a stylish and decisive manner. Even though the age of tyrants has now mostly passed, Johnson's book is still essential for understanding how much of the world could have been deluded by fascism and communism.
Neither Ronald Reagan nor George Bush has done very much to alter the American welfare state, so Charles Murray's Losing Ground (Basic Books) is still essential, accurate, and necessary. Murray demolished the assumptions on which the welfare state was created, thus ensuring that welfare programs now have no intellectual base. Liberals have spent a great deal of time trying to refute Murray; they have found that his arguments are irrefutable.
Both Bush and Clinton want to reform American education, so they might learn something about how American education became as bureaucratic, sclerotic, and hierarchical as it is. There are not that many histories of education, but the best remains David Tyack's The One Best System (Harvard University Press, 1974). In this book, Tyack shows how misguided Progressives transformed American education from a decentralized system responsive to parental desires into a closed-minded institution strongly resistant to change. The book is long out of print, but an enterprising reprint publisher would discover that this hard-to-find volume has a wide audience.
Contributing Editor Martin Morse Wooster is REASON's magazine critic. His book on reforming public high schools will be published next year by the Pacific Research Institute.
The post What Should the President Read? appeared first on Reason.com.
First, The Growth Experiment is a great book. It is persuasive about important issues in tax policy. Second, I am biased. I worked with its author, Lawrence Lindsey—now on leave from Harvard for a stint at the White House Office of Policy Development—when he was the tax economist on President Reagan's Council of Economic Advisers and I was senior economist for health and energy.
I found Larry to be a man of great integrity. One piece of evidence: As the CEA person assigned to make sure that Reagan's speeches contained no errors of economic fact, Larry was a tough taskmaster. I remember hearing bits of arguments on the phone between him and Reagan speechwriters who wanted to give Reagan credit for every good development in the U.S. economy after January 20, 1981. Larry also has two traits that are rarely combined in an economist: He weighs evidence carefully, and he doesn't lose the big picture. All these characteristics are on display in his book.
Lindsey begins by detailing what many of us have forgotten or never knew: just how badly the U.S. economy was doing during Carter's administration. He points out that between 1979 and 1981, real wages of American workers had plunged by 9 percent to their 1962 level, wiping out nearly two decades of growth. This reduction in wages, combined with much higher tax burdens, left Americans worse off than their counterparts in 1962. Lindsey reminds us that economists and political leaders were arguing at the time that we were doomed to declining standards of living because we were in an "era of limits."
A major reason for this economic mess, argues Lindsey, was tax-bracket creep. High inflation, combined with a progressive income-tax system that wasn't indexed, pushed even middle-income taxpayers into tax brackets of 40 percent and more that had once been reserved for the rich. Middle-income taxpayers responded rationally—by not working as much overtime, by consuming instead of saving, by taking wage increases as untaxed fringe benefits, and by seeking tax shelters. Nor did standard Keynesian policy have an answer. Keynesians saw taxes only as a tool to raise or lower aggregate demand, not as something that affected supply.
Enter the supply-siders, whose numbers did not include Lindsey at the time. By cutting marginal tax rates, the supply-siders argued, the government could increase the incentive to work, to save, and to invest, and could decrease the incentive to avoid taxes with wasteful deductions and tax shelters. The supply-siders started pushing for a tax cut in 1977, and by the time Ronald Reagan won the 1980 presidential nomination, they had recruited him to their cause. Reagan made the tax cut the economic centerpiece of his campaign.
In his book, Lindsey focuses on how much revenue the government actually lost by cutting tax rates. Why such an apparently narrow focus? Because, Lindsey explains, the revenue effects are a good measure of how much high tax rates were hurting the economy. If Reagan's 23-percent cut in tax rates caused a revenue loss of 23 percent, this would mean that cutting tax rates did not expand the tax base at all and therefore that high tax rates were not shrinking the tax base and were not hurting production. But if Reagan's 23-percent tax cut caused a revenue loss substantially less than 23 percent, this would mean that high tax rates were hurting the economy a lot.
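The inference Lindsey draws here rests on simple arithmetic: revenue is rate times base, so a rate cut that loses proportionally less revenue than the size of the cut must have expanded the base. A toy calculation makes the logic concrete; the rate and base figures below are hypothetical illustrations, not numbers from the book.

```python
# Revenue = rate * base. If a 23% rate cut loses exactly 23% of revenue,
# the taxable base did not move; a smaller loss means the base grew,
# which is evidence the old high rates were suppressing activity.
def revenue(rate, base):
    return rate * base

before = revenue(0.50, 1000)                 # hypothetical rate and taxable base
after_static = revenue(0.50 * 0.77, 1000)    # 23% rate cut, base unchanged
after_dynamic = revenue(0.50 * 0.77, 1180)   # base expands 18% (illustrative)

loss_static = 1 - after_static / before      # exactly 0.23
loss_dynamic = 1 - after_dynamic / before    # about 0.09 -- far less than 23%
```

The gap between the static and dynamic loss is Lindsey's measuring stick for how much high rates were hurting the economy.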
Lindsey reports that the government's static loss in revenue—the loss assuming that the tax cuts had no effect on people's behavior—would have been $115 billion in 1985. But Lindsey calculates a demand-side effect of $40 billion in 1985 that even Keynesians would admit. Then, using estimates of the response in labor supply to reductions in tax rates, Lindsey estimates that tax revenue from the added labor supply—the so-called supply-side effect—was $21 billion by 1985. Finally, he estimates that simply reshuffling assets in response to lower tax rates—shifting from tax-exempt to taxable bonds, for example—caused taxpayers to pay more in taxes. He estimates this "pecuniary effect" at $21 billion by 1985.
Thus, the net result of the 1981 tax cut by 1985 was a loss not of $115 billion but of only $33 billion. Whereas $40 billion of the revenue resulting from the cut was due to the standard Keynesian demand-side effect, $42 billion was due to factors the supply-siders had focused on and that Keynesians had assumed to be zero. And in return for that $33-billion loss in federal tax revenue, Lindsey writes, real output of the economy was 2 percent to 3 percent higher. Not a bad deal.
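Lindsey's accounting of the 1981 cut can be restated as a short sum, using exactly the figures cited above (in billions of dollars, as of 1985).

```python
# Lindsey's revenue accounting for the 1981 tax cut, by 1985.
static_loss = 115   # loss assuming no change in behavior
demand_side = 40    # standard Keynesian demand-side recapture
supply_side = 21    # revenue from added labor supply
pecuniary = 21      # asset reshuffling toward taxable holdings

behavioral_recapture = demand_side + supply_side + pecuniary
net_loss = static_loss - behavioral_recapture           # Lindsey's $33 billion
non_keynesian_share = supply_side + pecuniary           # the $42 billion Keynesians assumed was zero
```

The arithmetic checks out: $115 billion minus $82 billion in behavioral responses leaves the $33-billion net loss Lindsey reports.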
The cut in rates was greatest for the taxpayers in the highest bracket, whose top rate was cut immediately from 70 percent to 50 percent. Lindsey shows convincingly that the tax cut caused taxpayers with annual incomes of more than $200,000 to increase their incomes and to pay more taxes.
Does this increase in taxable income mean that the rich got a lot richer? Not necessarily. Rather, Lindsey writes, the lower marginal rates simply caused them to change their behavior so as to expose more of their income to taxation. Lindsey makes a persuasive case that the revenue-maximizing tax rate is probably below 40 percent and that the optimal tax rate is well below 40 percent. Indeed, Lindsey goes further, arguing for a flat tax rate of 19 percent combined with a large standard deduction to shield lower-income people.
Lindsey buttresses his case that high marginal tax rates distort the economy by tracking the effects of the income tax since it began in 1913. He shows that whenever tax rates on high-income people rose, their taxable income fell and so did the share of tax revenue they paid. Conversely, after tax rates on high-income people fell—with the cuts instigated by Coolidge's treasury secretary, Andrew Mellon, for example—their taxable income and their share of tax revenue rose. Though these facts have been known at least since supply-sider Jude Wanniski reported them in his path-breaking 1978 book, The Way the World Works, Lindsey reports and interprets them more carefully.
While Exhibit A for Lindsey's supply-side case is the Reagan tax cuts, Exhibit B is President Kennedy's similar cuts. Lindsey shows that demand-side factors account for only one-quarter of the added growth caused by Kennedy's tax cut. As evidence that the other three-quarters was due to the tax-rate cuts, Lindsey notes that most of the growth in the tax base occurred among high-income taxpayers who had previously faced marginal tax rates as high as 91 percent and who now faced a top rate of 70 percent.
Each chapter is filled with nuggets of economic wisdom. Lindsey shows that the federal deficit is due almost entirely to increases in spending rather than to tax cuts, that simply keeping inflation-adjusted government spending constant would eliminate the deficit by 1994, and that the standard measure of American savings drastically understates true savings. Lindsey also documents our current tax code's strong bias against saving and proposes a plan for removing this tax bias.
One of the best chapters is the one titled "Helping Hands," in which Lindsey argues that private charities are much more effective than government subsidies: They target those who need help most, they must be successful to keep donors and volunteers, and they avoid huge bureaucracies. He describes the experience of the Brookings Institution's John Chubb, an education researcher who called the New York City Board of Education to find out how many bureaucrats worked in its central offices. Chubb made about six calls before finding someone who knew. But this person was not allowed to tell him. After at least six more calls, he found someone who could tell him. The answer: 6,000. Chubb then called the Archdiocese of New York to find out how many people worked in the central office of their school system, which serves about one-fifth as many pupils as the New York public schools. The first person he reached said she didn't know. Chubb expected the same runaround. Then she said, "Wait a minute. Let me count." The answer: 26.
Lindsey advocates expanding the charitable deduction in the personal income-tax system to give people a greater incentive to contribute to charities. Lindsey's own story of his meetings with congressional aides to discuss his proposal is very revealing. These aides, he reports, do not dispute economists' findings that a 1-percent reduction in the after-tax cost of charitable giving yields an additional 1.2 percent to 1.3 percent in donations. Instead they object to letting individuals decide which charities should receive the money! This, to them, is "undemocratic." Lindsey's understated commentary: "Their faith in the omniscience of the state inflicts a high price on the country's needy."
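The responsiveness figure the aides conceded implies that charitable giving is price-elastic: each 1-percent cut in the after-tax cost of giving raises donations by 1.2 to 1.3 percent. A minimal sketch of that relationship follows; the donation base is a made-up number for illustration, not a figure from the book.

```python
# Approximate response of donations to a cut in the after-tax "price" of giving,
# using the 1.2-to-1.3 responsiveness Lindsey cites (midpoint 1.25 assumed here).
def new_giving(base_donations, pct_price_cut, responsiveness=1.25):
    """Donations after a given percent cut in the after-tax cost of giving."""
    return base_donations * (1 + responsiveness * pct_price_cut / 100)

base = 100.0                      # hypothetical donations, in billions
after = new_giving(base, 10)      # a 10% price cut -> 112.5, a 12.5% rise
```

On these numbers, expanding the deduction raises private giving by more than the government loses in revenue, which is the core of Lindsey's case.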
My generally strong enthusiasm for The Growth Experiment is tempered by one main criticism. When he uses ideas or evidence from articles by Harvard colleagues or by fellow researchers associated with the National Bureau of Economic Research in Cambridge, Lindsey refers to them by name and gives the articles' titles, publication dates, etc. But when he talks about the supply-siders, he rarely refers to a person, let alone an article, by name, except to criticize. Lindsey cites Wanniski's book, for example, only to criticize (correctly) Wanniski's equation of the revenue-maximizing tax rate with the optimal tax rate.
In discussing the revenue effect of the Kennedy tax cut, Lindsey never refers to some similar work done on the subject by James Gwartney of Florida State University and Richard Stroup of Montana State. And nowhere does Lindsey even mention Paul Craig Roberts. Without Roberts's work as a congressional aide in the late 1970s on the Kemp-Roth tax cut and as a point man in Reagan's Treasury Department in 1981, the 1981 tax cut might never have happened.
But this asymmetry in the credit that Lindsey gives to others is a relatively small negative in a sea of positives. If you buy one book this year about the effects of taxation, The Growth Experiment is the one to buy.
Contributing Editor David R. Henderson is an associate professor of economics at the Naval Postgraduate School in Monterey, California, and a senior fellow with the National Center for Policy Analysis.
The post Marginal Success appeared first on Reason.com.
"Really?" I asked my wife, Rena. "What is it about?"
"It's about these poor Chicano farmers in New Mexico or somewhere who steal water from a big business to irrigate their beans. The movie treats them as heroes."
"I bet they don't steal it," I beamed confidently. "I'll bet you that they just reclaim it after the big business, with the government's help, had already grabbed their water."
Rena looked at me skeptically. Why was I so sure? Because of Henderson's Law of Heroic Movies. Henderson's Law states that antibusiness movies that have heroes are always based on, or consistent with, a libertarian premise.
I formulated this law after seeing and enjoying a lot of movies that liberal friends expected me to dislike on ideological grounds. Sure enough, my law was confirmed by The Milagro Beanfield War, which is, incidentally, not only profreedom and heroic but also very good.
The movie A Christmas Carol, based on Charles Dickens's novel, is another example. Because it is an attack on businessmen, liberal friends are sometimes surprised that I love it—especially the 1951 version with Alastair Sim. But A Christmas Carol is profoundly profreedom. It does not question Scrooge's right to keep his money for himself but rather his wisdom in denying himself the pleasure of giving. In one of the final scenes, Scrooge dances around the room with joy because he realizes it is not too late to change. He is excited, not because he can now go out and vote to have the government spend other people's money but because he can now personally, voluntarily help those around him.
But why is Henderson's Law true? Why aren't movies with heroes based on antilibertarian premises? There are two reasons. Both should give us hope for our cultural and political future.
First, even though many movie scriptwriters and directors are left-wing, they must think that a heroic antilibertarian movie will not make money. They know—implicitly or explicitly—that people who want a heroic movie want to have no qualms about supporting the hero. So they must enlist the hero in a just cause. The only kind of just cause that they can be sure the public will support is one based on an individualistic view of justice. Because the public, although most of them don't know it, have a profoundly libertarian view of justice.
Skeptical? Then consider. Which plot would stir the juices of American moviegoers: one where the hero stole money from a corporation that succeeded by being honest with people, or one where the hero stole money from a corporation that had been cheating people? To ask the question is to answer it. The second movie is the only possible contender.
Push the analysis further. Normally, when a corporation cheats someone, the person cheated can sue. But sometimes that doesn't work. When it doesn't, it is typically because the legal system is not doing its job of protecting people from fraud. So the hero who steals money from the cheating corporation is trying to achieve the profreedom outcome that the legal system was supposed to provide.
The second reason that left-wing screenwriters and directors often adopt libertarian premises is that they themselves are profoundly libertarian. To believe in heroism is almost necessarily (Ayn Rand would probably cut the almost) to believe in freedom. But writers and directors usually don't understand how the market works, and they think that big corporations, shielded by a corrupt legal system, are always ripping people off. In other words, they are antibusiness because of profreedom values coupled with a mistaken view of how the market works.
Henderson's Law also holds for books and TV shows. Take Robin Hood. We hear that he was a hero because he robbed from the rich to give to the poor. Robin Hood would be a clear-cut exception to Henderson's Law if the rich he robbed had earned their money. But invariably Robin Hood's rich victims are those who themselves robbed first. Robin Hood reclaimed from the rich and gave back to the poor.
I have not yet found an exception to Henderson's Law, and not for lack of trying. When I think I have one, it turns out either to be consistent with libertarianism or to lack a hero.
Take Norma Rae, a movie about a worker at a cotton mill who grows out of her low self-image and who courageously and effectively fights to unionize the mill. The movie qualifies as heroic. It is also consistent with libertarian premises. The turning point for the union's success comes when Norma Rae's father—who also works at the mill—dies because his employer refuses to let him sit down during a heart attack. The employer clearly violated an implicit contractual agreement: I know of no advocate of freedom who thinks that when you agree to work for someone, you also agree not to take a break during a heart attack.
Of course, when the union forms, the large minority of workers who voted against it will be represented by the union against their will. Forcing a bargaining agent on these people and forcing them to pay dues as well violates the workers' right to choose their own representatives, not to mention the employer's rights. But nowhere does the movie even hint that those who oppose the union will be forced to join. If the movie had covered this issue, then Norma Rae's heroism would not have been so clear-cut. The only way the director and scriptwriter could keep the movie both heroic and prounion was to mislead moviegoers about the antifreedom aspects of unions under American labor law. American labor law, not Norma Rae, is antifreedom.
Country is another apparent exception that turns out to be consistent with Henderson's Law. This movie portrays as a hero a woman, played by Jessica Lange, who tries to hold on to her farm and to prevent the government from foreclosing on it. I could nitpick and say that the movie is not antibusiness because the government, not business, is foreclosing. But I won't, because one can easily imagine a business doing the same thing. The more relevant points are that the government is not portrayed as clearly villainous and that the heroine attempts to hold on to her farm within the law. The neighbors help out, not by breaking laws or by violating property rights, but simply by refusing to buy her assets, a very creative use of their freedom.
Of course, in the late '60s many antibusiness movies were hostile to freedom. But they were also antiheroic, and therefore not exceptions to Henderson's Law.
Henderson's Law gives me hope. That heroic movies are based on or consistent with the principles of freedom means that millions of people can see as heroic only someone whose actions do not violate others' rights. This shared belief in freedom is a huge common ground. What remains, if believers in freedom wish to persuade people, is to show them this common ground and then to address the factual issues on which we disagree, the main one being whether corporations can easily rip people off without the government's help. Those who love heroes love freedom.
Contributing Editor David R. Henderson, an economics professor at the Naval Postgraduate School and a former White House economist under President Reagan, loves heroic movies.
The post Life & Liberty: Henderson's Law of Heroic Movies appeared first on Reason.com.
Reaganomics: An Insider's Account of the Policies and the People, by William A. Niskanen, New York and Oxford: Oxford University Press, 363 pages, $19.95
When asked to sign an academics' statement supporting Ronald Reagan during the presidential campaign of 1980, I refused. I did so on the basis of two pieces of evidence. The first was Reagan's mediocre record as governor of California. One of his worst acts, committed shortly after he took office, was to give California one of the country's most "progressive" (graduated) state income tax systems in an effort to erase a short-term budget deficit inherited from the senior Governor Brown. The second piece of evidence was his commitment during the 1980 campaign to cut government spending by attacking "waste, fraud, and abuse." Not that I favor waste, fraud, and abuse. Rather, my objection was that if Reagan interpreted those terms as narrowly as they are usually interpreted, he would not cut government spending programs that really should be eliminated—subsidies for farmers, for example. I never imagined during the 1980 campaign that Reagan would make a serious and sustained attempt to cut the size, let alone the growth, of the federal government.
I was wrong. In his first month in office, Reagan removed price controls on oil and disbanded the Council on Wage and Price Stability that had been running former President Carter's wage and price guidelines. He rescinded a number of "midnight" regulations that had been imposed during Carter's last days and proposed sharp budget cuts and even larger tax cuts. And that wasn't all. In his speeches to Congress and the nation, Reagan took the offensive, blaming government for the nation's economic problems.
That first month left me in shock. When I heard the president of the United States saying in his speeches some of the things I had said in mine, I felt mixed emotions. On the one hand, I felt gratified that finally a man who shared some of my most deeply held political and economic beliefs had the power to act on them. On the other hand, I felt threatened: Reagan was challenging my view of myself as a rebel, a view in which I had invested a lot of emotional capital.
That month, my political values struggled with my self-image. My political values won. I decided that I wanted to join Reagan's crusade to roll back the government. I went to Washington, where I worked first in the policy branch of the Labor Department under Hoover Institution economist John Cogan and later at the Council of Economic Advisers under Murray Weidenbaum and then under Martin Feldstein and Bill Niskanen.
By the time I got to Washington in late 1981, David Stockman had already helped torpedo the Reagan revolution by giving damaging interviews to then–Washington Post reporter William Greider. The 1981–82 recession had also begun. Both factors made further budget cuts even harder to attain. As a result, I missed the peak of Reagan's political power and was not in on many important attempts to tame the government.
The one major existing regulation I helped make less onerous was the fuel-economy standard for new cars. Mostly, I helped to prevent new government programs. Sitting on the inside, seeing Reagan and his advisors squander opportunities for cutting government and seeing Congress head Reagan off at almost every pass, I found it hard to think that I was part of a revolution.
Was there a Reagan revolution? Two recent books disagree on the issue. What makes the disagreement particularly interesting is that it is between two of the most libertarian of Reagan's high-level appointees, men who were and are political allies: Martin Anderson, formerly Reagan's chief advisor on domestic policy, and Niskanen, a member of Reagan's Council of Economic Advisers and subsequently its acting chairman. In Revolution, Anderson argues, as the title implies, that there was a genuine Reagan revolution. In Reaganomics: An Insider's Account of the Policies and the People, Niskanen says it ain't so.
If you look just at economic policies other than tax policy, you would have to say that Niskanen is right. Take government spending. According to Anderson's numbers, Reagan's goal was to bring spending to 18.6 percent of gross national product by 1985, down from an estimated 22.7 percent in 1981. Did he achieve it? The closest Anderson gets to answering is to describe the long economic expansion that we have had since 1982: "Reaganomics," he writes, "turned the economy into a money-making machine that allowed him to preside over the largest increases in social welfare spending of any country in history." In short, Reagan failed to cut domestic spending. By 1985, spending had increased to about 24 percent of GNP, over 5 percentage points above Reagan's goal. But you won't find that fact anywhere in Revolution.
Niskanen points out that Reagan made large cuts only in a few categories such as welfare support services and energy. How many spending programs did Reagan eliminate? Only one, but to Reagan's credit, a big one: the $5-billion revenue-sharing program for state and local governments. Reagan slowed the growth of new regulation but also slowed the deregulation movement that had gathered steam during Carter's term. How about Reagan's record on trade? Reagan, Niskanen states flatly, "imposed more new restraints on trade than any administration since Hoover."
Niskanen's conclusions are not totally negative. He credits Reagan for two major policy achievements. First, Federal Reserve Board Chairman Paul Volcker, backed by Reagan, reduced inflation by six percentage points—even more than Reagan had intended. In the process, they showed that inflation is easier to tame than many economists had thought. Niskanen claims that the true cost of cutting each percentage point of inflation turned out to be less than half the $100 billion (in 1980 dollars) in lost output predicted by President Carter's economists.
Second, Reagan dramatically reduced taxation of individual incomes. On tax policy, there was clearly a Reagan revolution. Reagan's biggest victory was his gutting of the "progressive" income tax: he reduced the top rate on individual income from 70 to 28 percent. Most of the industrialized countries around the world have been scrambling to follow the U.S. example: Canada's top federal rate, for example, is now only 29 percent; New Zealand's is 33 percent; Britain's, 40 percent. Certainly, Reagan has had a revolutionary impact on federal taxes that more than makes up for his earlier progressive-income-tax sins in California.
But the real revolution, argues Anderson, was in the realm of ideas. Anderson has that rare ability to look at the big picture—to keep his eyes on the prize. And the big picture, to Anderson, is the death of socialism worldwide and the corresponding shift in thinking toward "the new capitalism." Not that Anderson gives Reagan credit for this shift. He claims that Reagan, like Nixon and Goldwater before him, was created by, as much as a contributor to, the political movement toward the new capitalism.
Although I have trouble imagining Nixon as a closet believer in laissez-faire, Anderson is right about the dramatic shift toward procapitalist thinking. Take economics, for example. Only 18 years ago, I could read virtually everything written in journals and magazines by free-market economists. Now I can't keep up with 5 percent of what is written in English. And the foreign-language free-marketeers are at it too.
Revolution should not be read as a detailed analysis of economic policies. That is not what it is, nor what it claims to be. Instead, it is a kind of poetic political manifesto. Fellow REASON contributor Roy Childs has called the book "a libertarian counterpart to what liberal historians and commentators did to—or for—the administrations of Roosevelt and Kennedy." I second that. Revolution is about the values that motivated Reagan in his 25-year quest to tame the federal government. And Anderson writes with a power and a sweep that are rare for an economist.
In some ways, Revolution reminded me of Ayn Rand's novel The Fountainhead. This is perhaps not surprising, given Anderson's early association with Rand. Like Rand's novel, Revolution is about abstract ideas and the flesh-and-blood people who have those ideas. The most fascinating character in this large-stakes drama is Reagan himself.
As Anderson demonstrates with his tale of how Reagan made it to the White House, his biggest virtue is tenacity. Throughout the 1960s and 1970s, he stuck to the same message: the federal government spends too much on domestic programs and overregulates the economy, and our defenses are too weak. Ultimately, American citizens came around to his views. Then amazingly, after Reagan earned his way to Washington, he did not forsake the ideas he stood for but actually tried to implement them.
Reagan's biggest flaw is not, as is often believed, his failure to monitor subordinates. One human being can do only so much, and Reagan probably realized that he could not at the same time focus on his major goals, continue to be the Great Communicator, have any kind of private life, and still watch over his staff. The latter is what a chief of staff is supposed to do and what, according to Anderson, Jim Baker did well in the first term and Don Regan did badly in the second.
Reagan's biggest flaw was his susceptibility to making decisions based, not on logic, but on the personal appeal of those with access. This characteristic was best illustrated by Anderson's shocking story of why and how Reagan got Israel to quit bombing Lebanon. There was no careful weighing of the issues. Rather, Michael Deaver walked into Reagan's office one day, threatening to quit unless Reagan asked Israel's prime minister, Menachem Begin, to stop the bombing. Reagan called and made the request. Twenty minutes later Begin called back and told him that the bombing had stopped. Simple. This same willingness to give in to former CIA Director William Casey is what landed Reagan in the Iran-Contra soup.
Another character, less important but still fascinating, is young David Stockman. Anderson rightly calls Stockman's book, The Triumph of Politics, "a self-righteous work with the tone of a screeching bluejay." Stockman simply did not understand that you can't always get what you want. If pessimistic Stockman were to write a description of baseball great Ted Williams, says Anderson, it would read as follows: "Even at the height of his career, Williams manages to get base hits only 40 percent of the time and strikes out repeatedly."
Anderson's book is probably most valuable as an inspiration for those who want to enter the political arena and continue hacking away at government. But it says little about the lessons learned from some of Reagan's failed attempts. Niskanen's Reaganomics is much better on this. One of Reagan's biggest blunders, according to Niskanen, was on Social Security.
In 1981, Social Security was ripe for reform. The old age, survivors, and disability (OASDI) program was projected to run out of money within a few years despite large tax increases passed in 1977. The main problem—short- and long-term—was that benefits were tied to the consumer price index, which overstated inflation. Congress was already concerned about the problem. According to Niskanen, Democratic representative Jake Pickle (Tex.) and Republican senator William Armstrong (Colo.) would probably have come up with a good bill to solve the short-run problem and even make Social Security sounder in the long run.
But budget director David Stockman was impatient. He wanted immediate budget reductions and cared little about the long run. As he stated in one of the most revealing, and unheralded, quotes in his infamous interview with Greider, "I'm just not going to spend a lot of political capital solving some other guy's problem in 2010." As Niskanen points out, Stockman, allied with Anderson, got his way, and the administration proposed cutting early retirement benefits from 80 percent to 55 percent of full benefits effective in January 1982. The result: the Senate straight-armed Reagan, approving a resolution critical of his proposal by a vote of 92 to 0. Reagan quickly withdrew the proposal, and the Democrats had a hot issue that helped them in the crucial 1982 elections.
Moreover, Social Security did not get fixed. The main long-term problem—the indexing formula—was unchanged, and higher taxes were the main solution to the short-term problem.
This episode, a pivotal one given that Social Security accounts for over 20 percent of government spending, carries an important lesson: be patient. If you want to have a long-term impact, work for long-term goals.
Surprisingly, neither book mentions what could be one of Reagan's most durable and important legacies: his appointment of judges. For good or ill, judges have enormous power to decide what the law is and whether it is constitutional. Because Reagan has appointed three Supreme Court justices and hundreds of district and appeals court judges, all of whom can serve for life, the impact of his judges will be felt into the next century.
I am not qualified to say whether the net impact will be good or bad, but on economic policy it will most likely be good. Some of Reagan's Supreme Court justices—especially fire-breathing Antonin Scalia—have been instrumental in two landmark decisions that limit the power of local governments to regulate people's use of their own property—Nollan v. California Coastal Commission and First English Church v. Los Angeles County. Richard Epstein, in his book Takings, argues that the judges should go further and overturn other regulations that amount to a regulatory taking of private property by the government. If they do so, it will be in no small part because of Reagan's appointees.
Reagan definitely hastened the movement of the United States away from statism. At the same time, he squandered important opportunities for further moves. So it is too soon to tell whether there was a Reagan revolution. How lasting his impact is will depend on a number of factors. The most immediate is how the new administration handles economic policy. But more important than the policies of the new administration is whether Americans in general believe that government is a dangerous master. Even more important is whether the hundreds of believers in limited government whom Reagan appointed, along with the thousands of others who believe in limited government but are not in government, will continue to fight the beast. As Anderson writes: "Little in politics happens by chance,…virtually everything is carefully done, carefully plotted, sometimes years in advance."
Contributing Editor David R. Henderson, a former staff economist for President Reagan's Council of Economic Advisers, teaches at the Naval Postgraduate School in Monterey and is a regular contributor to Fortune.
The post Witness to a Revolution? appeared first on Reason.com.
Consider taxes. Dukakis has said that he would raise taxes only as a last resort. Bush has promised no new taxes, period. I'm not so naive as to believe either promise. After all, even though Ronald Reagan vowed in 1984 not to raise taxes, that is exactly what he did in 1987. But let's apply the same degree of skepticism to both Bush and Dukakis, and let's look at some evidence.
Early in the campaign, Michael Dukakis said that the worst mistake of the Reagan administration was its 1981 tax cut. In his own state, Dukakis opposed Proposition 2½, the 1980 law that limited Massachusetts property taxes to 2.5 percent of assessed value. Does this sound like someone who would raise taxes only as a last resort?
And anyway, what does it mean that you would raise taxes only as a last resort? After first trying to do what? To cut Social Security? To shut down government spending for two weeks while you call Congress's bluff? To pass a balanced budget amendment? Can you imagine Michael Dukakis doing any of these things?
Bush's record on taxes is harder to uncover. He was such a good soldier for Reagan that he never staked out an independent position on taxes. So you can't say much about his credibility on that issue. Instead, we have to retreat to common sense. Which candidate is more likely to raise taxes: one who vows not to or one who refuses to make such a vow?
On domestic spending, neither candidate has specified particular cuts he would make. But I can't think of a single domestic spending program that Dukakis proposes to trim, let alone abolish. Bush, for his part, has laid out an admittedly vague "flexible freeze." He would not touch Social Security, would allow defense spending to increase with inflation, and would balance the budget by 1993 by cutting other domestic spending. Unfortunately, the only way Bush can promise a balanced budget by 1993 is with unrealistic projections.
According to the Congressional Budget Office, if Bush does what he proposes with defense and Social Security, and if most domestic programs increase with inflation while Medicare and farm price supports continue as in current law, the budget deficit in 1993 would be $139 billion. This means that Bush would have to squeeze $139 billion out of Medicare, farm price supports, interest on the government debt, and other domestic spending. He projects $66 billion less in interest payments, but only by assuming that long-term interest rates in 1993 will be 4.5 percent—almost 3 percentage points below the CBO's prediction. This seems wildly optimistic. The CBO's number is probably much closer to the mark. The other $73 billion in Bush's deficit reductions comes from the $543 billion that is not Social Security, defense, or interest. That's a 13 percent cut.
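The arithmetic above is straightforward to verify. A quick sketch, using only the figures quoted in the text (all in billions of dollars; the variable names are mine, not the CBO's):

```python
# Back-of-the-envelope check of the deficit arithmetic quoted in the text.
# All figures are in billions of dollars, as given in the passage.
projected_deficit = 139   # CBO-based 1993 deficit under Bush's stated policy
interest_savings = 66     # Bush's assumed reduction in interest payments

# Whatever interest savings don't cover must come from other spending.
other_cuts = projected_deficit - interest_savings
cuttable_base = 543       # spending that is not Social Security, defense, or interest

print(other_cuts)                              # 73
print(round(other_cuts / cuttable_base * 100)) # 13 (percent cut required)
```

Cutting 13 percent from that base in a single budget cycle is the scale of the task the "flexible freeze" quietly assumes.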
Moreover, all of these calculations assume that Bush will neither increase spending on any other programs above the level assumed by current law nor introduce any new spending programs. Clearly, that is doable. Can George Bush do it? I don't know, but I wouldn't bet a dollar on it if you gave me 20 to 1 odds. A better bet is that the deficit could be decreased from the currently projected $139 billion to about $100 billion.
And that's not bad. Remember that that $100 billion would be in 1993 dollars and that the federal debt held by the public would be about $2.5 trillion by the start of 1993. Assuming an inflation rate of 4 percent, the real value of the debt owed by the feds at the beginning of 1993 would erode by 4 percent by the end, or by $100 billion. Which means that Bush could still run a deficit of $100 billion and keep the federal debt from rising in real terms. And he could do it without any tax increases.
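The debt-erosion claim rests on the same kind of simple arithmetic. A minimal sketch, assuming the 4 percent inflation rate and $2.5 trillion debt figure given in the passage:

```python
# With 4 percent inflation, a $2.5 trillion debt loses $100 billion of real
# value over the year, so a $100 billion nominal deficit leaves the real
# debt roughly unchanged. Figures in billions of dollars, from the text.
debt = 2500          # federal debt held by the public at the start of 1993
inflation = 0.04     # assumed annual inflation rate

real_erosion = debt * inflation
print(real_erosion)  # 100.0 -- matches the $100 billion deficit
```

This is why a deficit equal to inflation times the outstanding debt holds the real debt constant rather than growing it.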
Still not excited about Bush? Consider the alternative. Michael Dukakis and cousin Olympia assure us that he is frugal because he uses the same 25-year-old snow blower. But I don't give a damn how he spends his money—that's his business. What matters is how he spends our money. Even in the unusually promise-free Democratic platform, it is not hard to find tens of billions of added spending per year. If Dukakis plans, for example, to introduce government provision of long-term health care, as the platform hints, the price tag would be about $28 billion per year, according to Republican Rep. Harris Fawell (Ill.).
Although Dukakis complains about deficits, he is strangely silent about the increases in government spending that (absent "last resort" tax increases) cause deficits. For good reason. Even the liberal New Republic acknowledges that in his last six years as governor, Dukakis increased state spending by 73 percent. Compare this to the 47 percent increase in federal spending under Reagan and the approximately 58 percent increase in spending by state and local governments over the same time period.
And although Dukakis brags about balancing 10 state budgets, what he doesn't tell us is that by the same way of counting, Ronald Reagan has balanced 7 federal budgets in a row. Balancing a budget in Massachusetts simply means making sure that expenditures do not exceed revenues, where you include borrowed funds as revenues. Every dollar Ronald Reagan spent was either taxed or borrowed too. What's the alternative: knocking off a bank? The simple fact is that when Dukakis became governor in 1982, the Massachusetts debt was $5.1 billion and that it is projected to stand at about $10.2 billion in 1989.
But spending and taxes aren't the only economic policies that can end up hamstringing an economy. Constrained by large deficits, Dukakis would probably do what he has done in Massachusetts: regulate employers. We can't come up with federal funds to support laid-off workers? That's all right. Let's make businesses pay severance pay when they close down a plant or lay off workers—this is the essence of the "plant closing" law that Congress recently passed and Reagan failed to veto. Would national health insurance cost hundreds of billions per year? Too bad. We can make businesses provide health insurance for every employee—this is what Dukakis did earlier this year in Massachusetts. If Dukakis becomes president, look for the Europeanization of the American labor force.
I'll admit that I find morally repugnant such Bush positions as harsh penalties against those who peacefully sell drugs to willing buyers. But am I voting for Bush? Hell yes.
Contributing Editor David R. Henderson, formerly a senior economist with President Reagan's Council of Economic Advisers, writes frequently for Fortune.
The post Economics: Budgets Are Stubborn Things appeared first on Reason.com.
The alarm is spreading, not just among those who seek more excuses to regulate our lives but also among people who are genuinely concerned. What are the facts? And what should we do about them?
First, no economist or journalist who claims that America is a net debtor has presented data on net debt. Those who write about the subject typically classify foreign-owned U.S. stocks and real estate as debt. But they are not: My ownership of my house, for example, makes no one indebted to me.
The admittedly cumbersome term to describe what people are concerned about is the U.S. net international investment position, that is, U.S. assets abroad—stocks, bonds, structures, and equipment—minus foreign-owned assets here. It is true that our net international investment position is a negative $400 billion.
But let's put that in perspective. The net income (foreigners' earnings on their U.S. assets minus ours on foreign assets) is less than half a percent of GNP. And even if our net international investment continued to decline as fast as in 1987, by the year 2000 the net income would reach about 2 percent of GNP, roughly the same proportion as in Canada in recent years. It's hard to argue that this could make us poor.
So what are the critics of foreign investment worried about? Three things.
First, they say, we've been depending on foreigners to lend money to our government and to invest in our industries. Couldn't we suffer severe economic disruption, they ask, if all foreigners decided to sell off their Treasury bills, their high-rise buildings, and their factories in the United States? The answer is yes.
But what follows is that we should make sure the United States remains a safe haven where foreign investors can feel secure in their holdings. We shouldn't pass laws to make holding those assets less desirable. This would only encourage foreign investors to divest themselves of U.S. assets—precisely the awful event that the worriers want to avoid.
The critics might counter that we should at least avoid making the situation even worse by selling more Treasury bills and buildings to foreigners. But our government sells them Treasury bills to finance the budget deficit. Refusing to do so is not going to eliminate the deficit—it will simply raise the cost of financing it, because if we exclude a source of lending, the interest rate must rise. Certainly for other reasons we should slash the deficit, but as long as we have it we should finance it as cheaply as possible.
And as for selling more buildings to foreigners, we are getting goods—VCRs, computers, and cars—in return. We apparently as a group value these goods more than the buildings.
Second, critics often worry that foreign investors will have too much influence on our political system. Martin and Susan Tolchin, for example, in their recent book, Buying Into America, cite the key role played by foreign firms in persuading California to eliminate the unitary tax (where a company is taxed on its worldwide sales of products made in the state, not just on sales in the state).
But should foreign companies have no say about how our governments tax them? If so, would the Tolchins and others claim that American-owned firms abroad should have no say in how foreign governments tax them? It is hard to see what is wrong per se with individuals or companies trying to affect the laws they have to live under.
Third, critics argue that a high level of foreign investment reduces our control over our lives. No it doesn't. The manufacturer who sells his company to a British investor chooses to do so (the United Kingdom is the biggest source of investment in the United States—the Japanese, contrary to popular impression, are a distant third behind the Dutch). Consumers who buy from that company may not like the new owner's products as well—or they may like them better. Employees may not like their new boss as well—or they may like him better. No case I know of has been made that foreign owners are worse at serving consumers or managing workers.
Foreign investment in the United States is no different in principle from investment by New Yorkers in California. Both New Yorkers and Californians gain in the latter case, and both foreigners and Americans gain in the former. Foreign investment is simply the next logical step in the division of labor. By allowing specialization, the division of labor has raised living standards and has been one of the most liberating forces in human history. The international division of labor will do the same if we let it.
Contributing Editor David R. Henderson writes frequently for Fortune.
The post Economics: America For Sale? appeared first on Reason.com.
…as the decade in which we all grew older. This is about the only memory we will all share; the rest will depend on who watched what on video. Since the arts and political discourse, not to mention the educational system, are reduced by and large to entertainment, aimed chiefly to make people feel good, we have regressed toward the Middle Ages—that is, back to a radical division between a small educated class obsessed with ideology and a huge majority who are not only ignorant but have such a tenuous grip on reality that they believe the stars foretell their day and are bewitched by actresses who lecture about their experiences as prostitutes in ancient Egypt. All of which is bad news for shared memories of history. Perhaps the collapse of share prices and the promised economic downturn will be remembered as the beginning of a return to civilization. Less busy shopping, people may find time to think, read, locate America on the map, study history—amusements that cost very little.
In the meantime, some will remember the '80s for clean living inspired by AIDS; for Chernobyl, which demonstrated that it doesn't matter who drops the Bomb on whom, the wind will take it around to everybody; and for the Reagan-Gorbachev treaty, influenced by the growing realization that air currents are the ultimate deterrents.
Stephen Vizinczey is the author of In Praise of Older Women and Truth and Lies in Literature. A chapter from his novel An Innocent Millionaire appeared in the May 1987 REASON.
…for the paradoxical combination of the collapse of the socialist idea throughout the communist and Third worlds and even among Western intellectuals and the discrediting of free-market ideals in the West. In the '80s, the true historical function of the Chinese Marxist revolution became clear—destruction of the millennia-old feudal impediments to capitalist development. The decision in the Soviet Union genuinely to compete economically with the West and with the surging capitalist economies of the Far East forced radical economic liberalization and rehabilitation of the previously forgotten Russian economist Adamsky Smithovich. (Twenty years later, the Lenin Peace Prize was awarded posthumously to the great Russian atheist, materialist, and antifascist Ayn Rand.) And the last "Marxist" intellectual who maintained that Marx favored either a centrally planned economy or a forced egalitarianism was expelled from the American Society of Socialists.
In the '80s, due to the convincing rhetoric of Ronald Reagan, almost all Americans came to understand that free enterprise meant massive federal debt, payrolls, and taxes; social and economic stagnation; and the triumph of the aristocracy of pull. Believing in neither the old socialism nor the new free enterprise, the electorate turned increasingly to prayer—and overwhelmingly elected the ticket of Jackson and Robertson.
Eric Mack teaches philosophy at Tulane University.
…for the near-complete breakdown of civility and the libertarian impulse within the conservative movement. What had once been a genteel and thoughtful amalgam of traditionalist and libertarian elements became—in the hands of the manipulators surrounding the president and in the rhetoric and pamphleteering of the operators who took for granted their benediction from what they imagine is their God—a bitter and vicious thing. Conservatism, in power in the White House and at many of the significant agencies, demonstrated a sanctimoniousness and self-satisfied arrogance that resembled the behavior of liberalism rampant in earlier years.
Libertarianism demonstrated that as a political force it is for all intents and purposes meaningless except when somebody can bankroll a presidential or other race: such was the case in 1980. By 1984 the Libertarian Party ticket made about as much headway as the Vegetarian. As a confluence of ideas, libertarianism showed some energy in the '80s, though the American tendency to ride with the red-hot conservatives or the retrograde liberals or the primitive populists showed how little libertarianism had dented the public consciousness.
Ollie North and Eddie Murphy became '80s heroes, which just about says it all.
Contributing Editor David Brudnoy is a radio talk show host and film critic in Boston.
Before we had a chance to breathe a collective sigh of relief after a decade of polyester, discos, and unabashed promiscuity, the '80s lifestyle was upon us. The first hint of our future arrived in the Preppie Handbook—the hottest item to hit college campuses since marijuana. The book espoused a preference for pearls (real ones, of course) over plastic baubles, cotton (read pricey) over nylon, leather furniture over crushed velvet. Fully loaded Buicks gave way to back-to-basics Mercedes and BMWs. "They cost a little more, but it's the quality that counts." Yeah, right.
The tacky '70s gave birth to the beautiful '80s and, in turn, to a demand for high-quality goods—preferably the kind that appreciate in value. Framed posters of wildlife flooded thrift stores and yard sales, while the desire for high art you could cart home in the back of the Volvo helped the new art superstars flourish. To create the appropriate surroundings for our newly found taste, a new breed of architecture superstars designed everything right down to the teapot on our commercial ranges. Bye, bye, Lady Kenmore.
The '80s will certainly be remembered for this rebirth of materialism, and we will remember just how much it cost. But if you ever have doubts about whether it was all worth it, just pull out a snapshot of yourself from 1975 and remind yourself of the words of the immortal Billy (Nando's Place) Crystal: "It's better to look good than to feel good."
Laura Main is REASON's art director.
…for sense and nonsense. Nearly the whole world grew to recognize that markets foster prosperity. Pockets of political or tyrannical resistance were circumscribed. Tax rates mostly fell. Yet in the same decade, the welfare state entrenched itself where enterprise has brought wealth. Amassed public debt acted as somewhat of a check on new social transfers but didn't threaten the core middle-class entitlements that weigh down the West. Only economic calamity looms as a caller of that intergenerational loan, and thanks to wonders of trade and commerce that largely escaped establishment eyes, such a calamity may not be just around the corner.
Arms were ever more a preoccupation of the nation-states, although here again the fiscal cinch was felt. A dissolving of post–World War II arrangements seemed finally in order, but the question of will to contest Soviet Leninism was unresolved throughout much of the free world. Here at home, technology and expanded means made life inexorably better for most, though it became gruesomely worse for individuals (and their offspring) who had made unfortunate choices about personal conduct and family responsibilities. The decade leaves libertarians with a particular challenge to work with cultural conservatives to repair that rending of our society's fabric.
Tim W. Ferguson is editorial features editor for the Wall Street Journal.
…for the creation of the U.S. troy ounce gold bullion coin and the silver ounce coin. First, in the greatest sense of "remembered," the bullion coins will be unearthed by archaeologists thousands of years from now, and they will figure out what "MCMLXXXVI" means.
Second, politicized paper money systems are inherently unstable, and we have seen recently that floating international exchange rates disrupt both capital and goods markets, so there is a good chance a new world monetary system will evolve, based on a tangible monetary base. If this occurs, of course, the troy ounce gold bullion coins are most likely, in my opinion, to form the new monetary base—and the "units of account" will be troy ounces of gold. So the '80s will be remembered in the next century and thereafter as the beginning of the new international gold standard monetary system, transcending government-issued paper units.
Contributing Editor Joe Cobb is senior economist for the Joint Economic Committee of the U.S. Congress.
We don't know yet. When I came to the United States in the fall of 1962 "the '50s" were still heavily in progress. "The '60s" began, I would say, in 1965. "The '70s" began (let's say) with the fall of Saigon. In retrospect the '70s were a decade of increasing conservatism (election of Mrs. Thatcher and Ronald Reagan). I guess "the '80s" began in 1985, say with Reagan's second term. So far the decade has been characterized by a declining dollar and a renewed detente. Both, I suspect, will prove to be unreliable indicators. The events that in retrospect will define "the '80s" have not yet occurred and, on January 21, 1988, are not foreseeable. No doubt they will surprise us all.
Contributing Editor Tom Bethell is a media fellow at the Hoover Institution.
…for condoms…Just Say No…answering machines…aerobics, health clubs, triathlons, hard bodies, sweat…Oliver North…Safe Sex…yuppies…Madonna and Sean Penn…the Challenger exploding against a very blue sky…Tammy Bakker's eyelashes…Mary Beth Whitehead, test tube babies, women giving birth to their own grandchildren…John Lennon's death…the first black Miss America and the first to quit…frozen yogurt…"getting the government off our backs"…cheap airline tickets…Steven Spielberg and George Lucas…Charles and Diana, Fergie and Andrew…Michael Jackson's Thriller…AIDS…"I Want Your Sex"…Cuisinarts, microwaves…Ivan Boesky…nouvelle food, Thai food, diner food, sushi, blackened everything…crack…VCRs, CDs, PCs…homosexuals, intravenous drug users, Haitians…Nutrasweet…Band Aid, Live Aid, USA for Africa…Libya, Grenada, Chernobyl…MTV…Talking Heads, Springsteen, U2, rap…Ronald Reagan, Mikhail Gorbachev, Cory Aquino…Esprit…the crash of '87…Gary Hart, Donna Rice…"The Cosby Show," "Family Ties," "Miami Vice," "Dynasty"…Rock Hudson…Men Who Won't Eat Quiche and the Women Who Love Them.
Assistant Managing Editor Lucy Braun likes lists.
…for being a time of private versus public interest for most Americans—a time of Reaganism, when rollbacks of programs such as AFDC, student loans, and funding for the handicapped coexisted with rollbacks of tax rates for corporations and wealthy individuals. It spawned yuppies, who, though few in number, were as important symbols of the times as hippies were of theirs.
College students listed making a lot of money as their primary goal. The energy for social change came not from the left but from the right. Yet even with an enormously popular president who was more hospitable than any recent president to the right wing's fundamentalist agenda, and with the power of televangelism, the progressive attitudinal changes of the '60s and early '70s were too entrenched to tolerate major changes in the laws affecting matters ranging from school prayer and gay rights to censorship and abortion. Moreover, due to the belated acknowledgment by the administration of the serious implications of AIDS, condoms finally came out of the closet.
As a sociopolitical decade, the '80s have ended. A reemphasis on public concerns and a more progressive political agenda will almost certainly follow.
Christie Hefner is president of Playboy Enterprises. An interview with her appeared in REASON's June 1986 issue.
…for dramatic reductions in the top marginal tax rates on personal income, not just in the United States but around the world. Ten years ago who would have dreamed that by 1988 the top rate here would have fallen from 70 to 28 percent? In 1971 I attended a conference at Columbia University that featured Milton Friedman. He was asked which government interventions he would like to see eliminated or reformed first. One was the income tax. Friedman said he would drop tax rates and make sure that the highest rate was no more than double the lowest. His wish came true: 28 is less than twice 15.
These low tax rates have a good shot at being permanent. Economists are concluding—and governments are slowly learning—that high rates reduce the tax base so much that they lead to lower revenues. And academic economists who write on optimal taxation, even those with a strong egalitarian bent, are concluding that the top rate should be somewhere between 20 and 40 percent.
Economics columnist David R. Henderson is a regular contributor to Fortune.
…for failure. The '80s were a typical Republican decade—benign, comfortable, and nonradical. They were pleasant years for most of us, with a popular two-term president who…er, presided over relative peace and prosperity, kept the social engineers at bay, and avoided wrenching social upheavals. For those reasons, the '80s seem misleadingly "good."
But Reagan's '80s, like Eisenhower's very similar '50s, will be seen in retrospect as a time of lost opportunity. Early in Reagan's first term, he might have repealed large portions of Lyndon Johnson's Great Society—as Margaret Thatcher rolled back socialism in Great Britain—but he didn't really try. (Similarly, Eisenhower might have been able to repeal the New Deal—but he didn't try either, leaving Social Security to fester through the decades.) Oh, Reagan said some fine things, and he managed to reduce our income taxes, but he could have done so much more had he dared.
America is a revolutionary country. In 1787, Thomas Jefferson, commenting on Shays' Rebellion, said: "God forbid we should ever be twenty years without such a rebellion.…[W]hat country can preserve its liberties, if its rulers are not warned from time to time that this people preserve the spirit of resistance?…The tree of liberty must be refreshed from time to time with the blood of patriots and tyrants."
The '80s were ripe for a genuine restoration of individual liberty, but Reagan blew it. And I guess we blew it too.
Contributing Editor Warren Salomon is an attorney and tax specialist in Miami.
…for the personal computer revolution. From an installed base of 365,000 in 1979, the number of personal computers in the United States—in homes, in schools, and in the workplace—soared to 15 million in 1988. By 1990, that number is expected to grow to more than 26 million.
It was during the '80s that we started to become a computer-literate society and truly entered the Information Age. The implications extend far, from how we speak to what we teach our children. Back in '79, I don't think I ever had a casual conversation that contained the words modem or megabyte. But recently, in a long-distance phone call to an old friend, we spent a good 10 minutes going over the relative virtues of the Apple Macintosh and the IBM PC, as if we were discussing the latest car models.
The personal computer has made some of our lives easier, some harder, and others just more confused in adjusting to the new ways. It has transformed our society as no invention has since television. With television, we became a nation of vidiots. We are now, irrevocably, a nation of hackers.
Former Managing Editor Eric Marti, who computerized REASON, is a student at Stanford Business School.
…for the greatest mirage in libertarian/conservative political history. Never have so many right-thinking men and women been promised so much and received so little. President Ronald Reagan, whatever his good intentions, did not, in the end, rise much above rhetoric. The less-(government)-is-more crowd who helped put him in office got to see little in the way of genuine privatization, decreased federal spending, or meaningful reduction in government manipulation of the economy.
His tough-on-communism supporters, who had thrilled to the sound of his "Evil Empire" theme, were to witness, with few exceptions, a shameful substitution of empty words for action—from his feeble response to the downing of the KAL airliner; to a human rights policy so hypocritical it permitted a Ukrainian seaman named Medvid to lose his desperate bid for freedom and be delivered to almost certain death, allowed the State Department to condone the massacre of Tibetan demonstrators by Red Chinese troops, and made excuses for the man in charge of ravaging Afghanistan; to the ultimate obscenity of his lovefest with the current head of a totalitarian regime that has no modern rivals in systematized persecution of dissenters, institutionalized slave labor, mass murder, and world conquest.
Erika Holzer is a novelist and lawyer who lives in Bedford, New York.
…demographically, as a milestone for the so-called baby boomers: the first decade during which their own generation almost entirely constituted the work force, as that of their parents had completely retired from it by 1990.
…politically, as a time during which libertarian ideas could no longer be pooh-poohed away by the self-ordained intelligentsia: although often disguised as conservatism, individualism no longer garnered big laughs at cocktail parties.
…economically, as the salad days for small investors: the first time during which the proletariat was able to participate in the financial markets in a big way, thanks largely to computer technology.
…musically, as the era during which rock & roll was liberated from the numbing excesses of '70s pop and disco: New Wave minimalism returned drums, bass, and synthesizers to the background tracks—where they belong.
…personally, as the decade during which I became less melancholy about choosing the social sciences instead of a more lucrative major: perhaps something involving animal mutilation or generally accepted accounting principles.
Stephen G. Barone is a children's psychologist and free-lance writer.
On the geopolitical stage, the decade will be fondly remembered as the beginning of the end for totalitarian communism, an ugly political leftover from a century that began in a mindless orgy of bloodshed and ignorance. For the first time, in Afghanistan, an indigenous people was able to resist and defeat a Communist invasion. And, again for the first time, a nation began to dismantle its own Communist system as a failed experiment: the implications of China's turn away from Communism in all but name can hardly be overstated, as the billion-strong people begin to take their place with the Asian economic powerhouses of Japan, Taiwan, and South Korea.
Secondly, the '80s will be remembered for an explosion of scientific knowledge and technology such as the world has never before seen, led by computerization, biotechnology, and advances in medicine that could hardly be imagined a mere decade before. This trend, coupled with the turn toward capitalism and freedom in the Far East, could set the stage for a 21st century more enlightened and progressive than any the human species has ever experienced.
Contributing Editor Timothy Condon is a tax attorney.
…as a time of growing pessimism and uncertainty. The election of Ronald Reagan held out the great promise of putting the American dream back on track after our traumatic defeat in Vietnam. But the left will not let America forget its defeat. We must never be permitted to believe that our effort to save South Vietnam from communism was in any way justified.
In addition, the AIDS plague, growing illiteracy, disintegrating education, huge deficits, high consumer debt, our problems in Central America, our military disaster in Beirut, terrorism, hostage taking, space accidents, the farm depression, spy scandals, and the drug problem have just about convinced Americans that they are no longer the architects of their own future.
If it hadn't been for breathtaking advances in technology produced by capitalism, by entrepreneurs, and by the spirit of innovation, there would have been nothing visible to offset the climate of negativism. But glittering shopping malls, high-tech cars, jet travel, and Disney World cannot make up for spiritual misery.
The '80s will also be remembered for the revival of fundamentalist religious faith, a return to biblical principles by many Americans, the growth of home schooling, and the emergence of a new Calvinist intelligentsia within the heterogeneous conservative mix. These trends promise to upset many applecarts and make the '90s intellectually exciting.
Contributing Editor Samuel L. Blumenfeld is author of The New Illiterates.
…for the growth and growing success of free-market institutions the world over. Hayek's Road to Serfdom (1944) stoked the fire and his Intellectuals and Socialism (1949) set the strategic agenda. Inter alia, he enthused three people: Leonard Read, Antony Fisher, and F.A. "Baldy" Harper. Read started the Foundation for Economic Education (1946), Fisher the Institute of Economic Affairs (1957), and Harper the Institute for Humane Studies (1961). All three shared a deep concern for freedom; all took different tacks; all are heroes.
Read's FEE was the layman's high school for liberty. Fisher's IEA concentrated on Hayek's nonpejorative "secondhand dealers in ideas," such as teachers and journalists. And Harper's IHS set out to preserve and later expand the remnant of market scholars.
By the end of the '70s the success of IEA had led Fisher to become a full-time entrepreneur of pro-market think tanks. He today lists over 40 institutes in over 20 countries that he has helped through the Atlas Economic Research Foundation.
Three main trends stand out in the '80s. First is a solid growth in those institutes already in existence. Second is a surge of new institutes at the national level in the United States and abroad. And third is a still-developing wave of regional or state-wide institutes.
Suddenly in the '80s it's fashionable and feasible to promote free markets.
John Blundell is executive vice president of the Institute for Humane Studies.
If decades were neighborhoods, the '80s would be a place you wouldn't want your daughter to live, a low, mean cross between Disneyland and Port Said. It was a time of debt, deficits, and deceit—deceit of self and deceit of others. Politically, it was the decade of Ronald Reagan, Gary Hart, and the Continuing Resolution; religiously, the decade of Jim and Tammy Bakker; culturally, the decade of Steven Spielberg. Altogether, it was a decade of irresponsibility, amnesiac about the past and uncaring about the future.
The '80s offered the illusion of protracted adolescence for the predominant baby boom generation. As a group we flat-out refused to grow up. Everyone watched Spielberg movies about childish heroes in a childish world. We did not want to be reminded of adulthood by adult themes, so even the movies that came closest to speaking to those themes, like Breaking Away or Karate Kid, had to be disguised as stories about teenagers. Little wonder we refused to pay our bills. Bills and credit ratings are adult concerns.
The good news about the '80s is that the decade ended in 1987. This was merciful—the '20s dragged on until 1929.
The stock market crashed. The arms-for-hostages scandal exposed Reagan as an amiable old fool. Jim and Tammy were turned out of their PTL playland. Gary Hart's adolescent ideas, not to mention his lies and infidelities, were no longer a turn-on. Suddenly, in 1987, movies were made about adults. Fatal Attraction and Moonstruck told us that we were growing up, Broadcast News that we were waking once again to ideas.
Yet the '80s, like the Roaring '20s, will live better in recollection than they did in life, and they will seem more prosperous. A decade of false prosperity and false youth will seem high-spirited as we look back from the poorhouse. Casey Stengel gave us a phrase we remembered and repeated in the '80s: "It's not over till it's over." Now it's over.
Contributing Editor James Dale Davidson is chairman of the National Taxpayers Union and coauthor of Blood in the Streets.
…for the most serious efforts in this century to reduce the role of the state. Some of these efforts are still nascent, such as in the Soviet Union. Some were partial but more dramatic, such as in China and New Zealand. Some were broader but with limited success, such as in Britain, France, and the United States. These developments reflect the combined effects of a change in the ideological bounds and the accumulating evidence that government direction of the economy reduces both individual liberty and economic growth.
All of these developments, however, are vulnerable. Although Marxism has lost broad appeal, it is still the official credo in many nations. Freedom will continue to be threatened internally by innocent people who promote government "solutions" to most any perceived problem and appeal to others motivated by envy. A demonstrated record of government failure, as in education, will be used as a rationale to increase state control. Some problems that are the result of government programs, such as the rapid increase in the price of medical services, will be used as the rationale for new programs.
The future, however, will be different because of the developments of the '80s. Political history, like biological evolution, is path dependent. The '80s may be only a temporary pause in the relative growth of the state, or this decade could be a turning point in the direction of increased freedom. As always, historical change is the result of human action, not of some inexorable process for either good or evil.
William Niskanen is chairman of the Cato Institute.
…for being better than the '70s—a lot better. Especially if you were a teenager in the '70s.
Assistant Editor Virginia I. Postrel doesn't consider herself a baby boomer.
In the '80s, AIDS provided the major ammunition for a smug religious counterrevolution, cutting a bloody swath through all civil liberties, not just sexual ones, as civil libertarians had to concentrate on a fight against calls for concentration camps. By vetoing bills to establish tax credits for AIDS research (as in 1987 in California), the right kept AIDS research firmly in the hands of government. Promising lines of research that would have saved tens of thousands of lives were delayed or squelched.
Citizens of the '80s were concerned only about money and threw their votes in with the religious right in the vain hope that their taxes would be lowered. As Sunday closing laws returned, as many counties banned liquor, and as local theater owners were arrested for reviving Carnal Knowledge, few saw that the cause was their failure to fight for civil liberties for "fags" and "druggies." Even most libertarians failed to speak out. Libertarian soft-pedaling of civil liberties allowed AIDS to help calcify political debate in the old liberal-conservative terms, thus helping finish the transition of the libertarian movement from obscurity to total eclipse.
Former REASON Spotlight columnist John Dentinger is a Los Angeles free-lance writer.
…for trade with Mars. The Reagan administration's strategy to increase the deficit to force down domestic spending had an unintended consequence. Congressmen, feeling the heat as the standard pork barrel projects got squeezed (or at least their expected rate of increase got squeezed), found a substitute. Highly publicized deficits in merchandise trade provided them a new growth industry in xenophobic pork. With fewer new federal dollars for hometown projects, congressmen learned to zap foreign competitors of hometown manufactured goods. The new pork barrel of the '80s has something for every district, from roses and sugar to textiles and computer chips. And with politicians all over the world looking to boost their case for hometown protectionism, the whole world taken together is running an amazing combined trade deficit in the '80s (some say around $100 billion a year). Either trade with Mars accounts for the difference or somebody has been cooking the books. So the '80s may be remembered for worldwide howling about worldwide trade deficits that together couldn't possibly be, but were anyway, and helped entrepreneurial politicians get reelected.
Greg Rehmke is the Reason Foundation's educational programs director.
…for the desperate entrenchment of those whose worldview was stamped by a New Deal media and whose testosterone level was verified in World War II, struggling against offspring who bought Playboy, smoked marijuana, broke the speed limit, opposed a war, abandoned slide rules, made heroes out of the rogue North and the rake Hart, and cheated on their taxes.
Young adults of the "Great" Depression are thoughtfully retiring or dying, though income taxes, an intrusive international reputation, bureaucracies, Social Security, and the debt they created and consigned to their children linger on, creating penury and contempt. Their experiment feeble and failing, they and their sycophants struggle to institutionalize their subliminal political/philosophical dictionary unto perpetuity in a vain but ineluctable grasp for historical immortality.
The decade saw real economic power fall into the bewildered hands of those inseminated by James Dean, carried by Bob Dylan, delivered by J.F.K., spanked by Ho Chi Minh, nursed by Sergeant Pepper, weaned by G. Gordon Liddy, and toilet-trained by Letterman and Spielberg. "In the Mood" was relegated to nostalgic camp and "Won't Get Fooled Again" elevated to classic status.
Former REASON Spotlight columnist Patrick Cox is a regular contributor to USA Today.
…for a resurgence of skepticism about what government can do.
…for the major contribution of the Reagan presidency—appointment to the federal bench of numerous judges who understand economics and the need to protect individual rights, including property rights.
…internationally, for inauguration of the Gorbachev era, involving potentially massive internal reforms and steps to improve Soviet-American relations. And for another memorable aspect of the '80s—the Soviets' being effectively held to a stalemate by the freedom fighters in Afghanistan.
…for yuppies and dinks (double-income, no kids) and the baby boom entrepreneur, who came of age and helped signal the trend toward opposition to government intervention in economic activity and personal conduct.
…for the Kasparov-Karpov matches, with the emergence of Garry Kasparov as the best-known chess champion since Bobby Fischer.
…for the emergence of "foodies" and the chef as superstar, accompanied by a renewed focus on quality, freshness, and originality and by an increased popularity of ethnic cuisine ranging from Cajun to Caribbean to "California."
…for many excellent wine vintages and high-quality wine making in France, California, Australia, and Italy—1982 may be remembered as the vintage of the century for Bordeaux, and the 1985 red Burgundies will be difficult to surpass in the rest of this century.
…finally, for the growing importance of the Pacific Rim and the city of Los Angeles as a financial and business center…and for the Reason Foundation's move to Los Angeles in 1986 and the resultant expansion in Los Angeles of free-market ideas.
Senior Editor Manuel S. Klausner, a Los Angeles attorney, was one of the partners who published REASON from 1971 to 1978.
Yes, what will they be remembered for? History has not been kind to Republican eras of prosperity. The Roaring '20s are still seen as little more than a prelude to the Depression—an age of flapper frivolities and fatuous presidents. The '50s are The Silent Decade, when we were all terrified into conformity by The Bomb and a somnambulant President Eisenhower.
Few remember the '20s and '50s as periods of astonishing economic progress, when America leapt ahead of the world in prosperity. Liberal historians prefer the grim heroics of the '30s, the outlandish excesses of the '60s, or—failing all else—the countercultural triumphs of the Watergate '70s. And so it may seem almost foreordained that the '80s will eventually be written off as just another Republican interlude when "greed" and "passivity" temporarily triumphed over "morality" and "social consciousness."
Yet history retains the capacity to surprise, and 1988 may become what historian Lewis Namier called 1848, the "Year of Revolutions" in Europe—"the turning point at which history didn't turn." I suspect the Republicans will win the presidency in 1988 and it will suddenly become clear to everyone that, rather than being a fluke or an interlude, the '80s actually marked the beginning of a long historical period in which Americans finally gave up the collectivist fantasies of the early 20th century and became a mature, responsible, permanently growing nation.
Contributing Editor William Tucker is writing a book on the family.
…for confirming the cynics, who despair of the intractability of the polity. But there are some nuances of difference.
At the level of national defense, the decade has seen nuclear deterrence take a great battering—first from the freeze movement, later from the arms control movement, which now includes such improbables as Ronald Reagan, George Shultz, and Paul Nitze. Maybe Reagan will be remembered as the hardliner-gone-soft who began by describing the Soviet Union as an "evil empire" and ended by saying they have changed. Carter began his presidency as a dove and, after Afghanistan, ended it a hawk. Reagan began as a hawk and, after meeting Gorbachev, is ending his term a dove. Reagan's Strategic Defense Initiative was one of the interesting ideas of the '80s but seems likely to remain that—an idea being researched.
The decade began with great hope for progress in economic deregulation and translating the intellectual ascendancy of free-marketeers into greater economic freedom. It ends with some small progress—in taxation, transportation, oil. But the interventionists are ominously strong in their pressures to reregulate airlines and railroads and are using Black Monday and Ivan Boesky to justify newly intrusive controls of capital markets. And hysteria over the trade deficit continues to propel destructive protectionist moves in Congress, despite high employment.
The end of the '80s may see a long overdue recession or a revival of inflation to double digits. And since there is probably a one-in-four chance that the depression doomsters are right about dangerously excessive debt, there's a possibility the '80s will be remembered for laying the ground for a 1935–45-style period of grand interventionism, a new New Deal.
Contributing Editor Peter Samuel is a Washington-based journalist.
…as the decade that gave us Ronald Reagan, safe sex, freedom fighters, market meltdowns, shot shuttles, people power, antiapartheid activists, supply siders, and Classic Coke.
Also between now and 1991, it is possible that the New Wave will recede, the New Age will die of old age, and New Ideas will become a thing of the past. We can only hope.
Editorial intern James Taranto is too young to remember the '60s and considers himself fortunate.
…for a revival of democracy and capitalism, for privatization, deregulation, and reduced tax penalties for increasing output and income. It has been very much the era of Thatcher and Reagan, with flattering emulation worldwide. There were deep, radical reductions of marginal tax rates in Colombia, Bolivia, Jamaica, Mauritius, Botswana, Ciskei, Indonesia, Israel, Singapore, the Philippines, Turkey, New Zealand, China, and (for a few years) even India. All of these economies, like those of the United States and the United Kingdom, soon outperformed their neighbors. By 1988 there were also timid gestures toward marginal tax relief in Sweden, Norway, Australia, France, Canada, and Japan, soon to be followed by Greece, Austria, Ireland, and the Netherlands. Will any of this be remembered? Not by the academic elite, who never understood what was going on, but by taxpayers in the affected countries. A backlash is likely, since there are powerful financial and political interests in reregulation, in protection of sleepy managers from foreign rivalry or domestic takeovers, and in a tax and welfare system that punishes success and rewards failure.
Contributing Editor Alan Reynolds is the chief economist for Polyconomics Inc.
What will be remembered about the '80s depends upon who does the remembering. For lawyers like me who make a living defending media interests, as well as others who cherish a free and independent press, the '80s will not be recalled with fondness.
Juries delivered verdicts against the media whenever they got a chance. The media lost 89 percent of libel cases that went to trial, the more prominent of which included a multimillion-dollar judgment by a Las Vegas jury for hometown entertainer Wayne Newton against NBC and a similar verdict by a Chicago jury for a major tobacco company against the local CBS affiliate.
Judges joined in. After displaying an open hostility to the media while on the U.S. Court of Appeals, Antonin Scalia was elevated to the Supreme Court, which this year reversed long-standing precedent and held that high school newspapers are not entitled to protection of the First Amendment against government censorship.
Politicians were no better. When Rupert Murdoch's Boston Herald offended Massachusetts Senators Kennedy and Kerry, they conspired behind closed doors with liberal and conservative legislators from both parties to pass, without debate, a bill that required Murdoch to sell the paper or lose his Boston television station.
Defense of the media has always had more than its share of summer soldiers and sunshine patriots. In the '80s the struggle continued without them.
Contributing Editor Michael McMenamin is a trial lawyer in Cleveland.
…as the epic when America overcame entropy and socialism succumbed to it. The United States used supply-side economics and high technology to launch a record economic boom despite a near depression among our trading partners in Europe and the Third World. While Europe followed big-government industrial policy for high technology and job preservation, it lost nearly 2 percent of its employment and fell ever farther behind in the key technologies of the information age. Shaking the tin cup of socialism at every global conclave, the Third World sounded its own death rattle. Meanwhile, U.S. and Asian capitalism converged in one thriving Pacific economy based on human liberty and creativity in the information age.
But who rescinded the Second Law? We did. The crowning symbol of this era of unleashed innovation and progress was the simultaneous invention of the desktop supercomputer and the high-temperature superconductor. Together with the rising yield of bioengineering, these breakthroughs promise at long last an eventual escape from dependence on raw materials and other territorial forms of wealth that suffer decay and exhaustion in a world of thermodynamic decline. The Second Law will fall before the capitalist law of reason: knowledge grows as it is used. Remember, you read it here first.
George Gilder is the author of Wealth and Poverty and, most recently, of Microcosm.
…for the failure of politics to solve human problems. President Jimmy Carter inaugurated the decade by demonstrating the ineffectiveness of the liberal welfare state; President Ronald Reagan concluded the decade with his conservative revolution in disarray.
…as the time when the public sector openly institutionalized envy and greed. Agricultural subsidies expanded five-fold as farmers shamelessly demanded more. Social Security became a sacred cow. And agencies like the Small Business Administration and Economic Development Administration, widely recognized as special interest groups, survived every attack.
…for the loss of communism's credibility. In Poland, a labor union and a church together challenged the state. China dismantled its farm collectives, the Soviet Union amended its laws to encourage foreign investment, and Vietnam loosened its rigid economic controls.
…for the atmosphere of hope abroad, as countries like the Philippines and Korea moved toward democracy and even the Soviet Union adopted policies of glasnost and perestroika. And for the feeling of despair at home, as such traditional values as honesty, monogamy, and charity declined, AIDS became an epidemic, the drug war was lost, and the secular world's answers became increasingly inadequate.
Contributing Editor Doug Bandow is a syndicated columnist and a senior fellow at the Cato Institute.
Jeopardy contestants, Trivial Pursuit devotees, and the learned may remember Gary Hart, Ivan Boesky, and Raisa Gorbachev some years hence. But for the rest of us, alas, the '80s—the newsworthy stuff of the '80s—will reduce to a couple of well-worn epithets. Ah, yes, the '80s—the era of yuppies and Black Monday on the stock market. After all, the '20s conjure up images of flappers and the other big crash; the '30s notoriously bring to mind the Depression, Hitler, and a notion that the New Deal came along about that time; the '50s is all Mom and apple pie punctuated by the Beatniks. The '60s was a very busy decade, so we remember fully four things about it—the Cuban missile crisis, the Vietnam war, student political hullabaloo, and hippies. And the '70s—well now, the '70s were pretty unremarkable, unless we count Watergate, inflation, and Billy Carter.
If what I remember about those earlier eras is any guide to how most of us will remember the '80s, then yuppies and the stock market crash seem like good candidates. But if we move away from the Big Stuff of the nightly news and People magazine, the '80s will come into clearer focus. I will remember the '80s for the birth of my daughter and for all the little trials, tribulations, and personal encounters that are what life is really all about. I mean, consider, for example, my grandma, now 95. She remembers her neighbor, Peach Seeds, in more detail than she recalls the death of President McKinley. While our particulars will vary, I suspect that is the sort of thing most of us will remember about the '80s.
Book Review Editor Lynn Scarlett is also the Reason Foundation's research director.
The post 1968–1988–2008: The '80s Will Be Remembered… appeared first on Reason.com.
Surprisingly, evidence supporting this view comes from that Ivy League bastion of mainstream economics, Harvard University. In a recent article in the Journal of Public Economics, Harvard's Lawrence B. Lindsey shows that cutting the top rate from 70 percent in 1980 to 50 percent in 1982 actually increased tax revenues from high-income taxpayers.
In 1984, for example, the latest year for which Lindsey had data, taxpayers with incomes of $200,000 a year or more paid $42.1 billion in taxes. This was $8 billion more than they would have paid under the old tax law. In 1985, according to some unpublished data provided to me by Lindsey, they paid $49.5 billion—$9.6 billion more than they would have paid. Bottom line: to soak the rich, cut their tax rates!
How could Lindsey tell what tax revenues would have been? Using detailed data on 25,000 taxpayers in 1979, he developed a model to do just that. He adjusted for the recession of 1981–82 and the recovery and expansion of 1983–85. He tested his model by seeing how well it predicted revenues for 1980, the year before the first installment of the tax cut. The fit was very close.
Why were the supply-siders only partly right? Because the more extreme of them spoke as if total revenues, not just revenues from the rich, would increase (although I confess that I have not been able to find this claim in anything written by Laffer or Wanniski). As Lindsey shows, that did not happen. Not even close. Had there been no tax cut, revenues for 1984 would have been $385.6 billion. Their actual level: $304.0 billion.
This failure of the most extreme supply-side prediction is not surprising. First, cuts in the marginal tax rates of lower- and middle-income taxpayers were small—for a median-income family the marginal rate dropped from 32 percent to 25 percent. That means the after-tax reward for earning income rose from 68 cents on the dollar to 75 cents, an increase of only 10 percent. Compare that to a high-income taxpayer whose after-tax return goes from 30 cents on the dollar to 50 cents, a whopping 67 percent!
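The percentage comparisons in the paragraph above are easy to verify. Here is a quick sketch; the rates are the ones quoted in the text, while the helper function and its name are mine:

```python
# Check of the after-tax-reward arithmetic in the paragraph above.
# The rates come from the text; the helper function is my own naming.
def reward_gain(old_rate, new_rate):
    """Percent increase in the after-tax reward per extra dollar earned."""
    old_keep = 1 - old_rate   # cents kept per dollar before the cut
    new_keep = 1 - new_rate   # cents kept per dollar after the cut
    return 100 * (new_keep - old_keep) / old_keep

# Median-income family: marginal rate falls from 32 percent to 25 percent.
print(round(reward_gain(0.32, 0.25)))  # prints 10
# High-income taxpayer: marginal rate falls from 70 percent to 50 percent.
print(round(reward_gain(0.70, 0.50)))  # prints 67
```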
Second, the non-rich have less room to maneuver than the rich. As Lindsey points out, when rich taxpayers' rates drop, they can realize more capital gains. Also, the self-employed rich and even high-income employees can typically alter their compensation packages away from perks and toward money.
Based on his data, Lindsey estimates that a top tax rate of about 35 percent would maximize federal tax revenues. Since the period of time he studied, President Reagan and Congress cut the top rate further, from 50 percent down to 28 percent (or 33 percent for a few taxpayers with the alternative minimum tax). So the top rate is now close to the revenue-maximizing rate.
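The idea of a revenue-maximizing rate can be illustrated with a toy Laffer-style model. This is only a sketch under assumptions of mine: reported income is assumed to shrink as (1 − rate) raised to an elasticity, and the elasticity is chosen so the peak lands near Lindsey's 35 percent figure. It is not Lindsey's actual estimation, which used detailed taxpayer data.

```python
# Toy Laffer curve: revenue = rate * reported income, where reported income
# shrinks as the marginal rate rises. The functional form and the elasticity
# below are illustrative assumptions, not Lindsey's estimated model.
def revenue(rate, base_income=100.0, elasticity=1.857):
    reported = base_income * (1 - rate) ** elasticity
    return rate * reported

# Search a grid of top rates from 1% to 99% for the revenue peak.
rates = [r / 100 for r in range(1, 100)]
best = max(rates, key=revenue)
print(f"revenue-maximizing top rate: {best:.0%}")  # prints 35%
```

The point the column makes follows directly: pushing the rate above the peak (say, back toward 50 percent) would lower revenue in this model, not raise it.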
The implication of this finding for future tax and budget policy is enormous and crucial. Even if we wanted to increase taxes on the rich (I don't want to, and not just because I plan to be one of them some day), we can't. Raising the top tax rate further would yield at most a few billion dollars and could even cause revenues to fall.
What about raising tax rates on capital gains? As Lindsey shows in a recent paper, the revenue-maximizing rate on capital gains is 15–20 percent. Since the top rate on capital gains is now 28 percent, the only way to raise more revenues here is to cut this rate.
Follow the logic further. Because the non-rich don't have much room to maneuver when tax rates change, the only surefire way to raise more revenues from income taxes is to increase taxes on the non-rich. And that would be wrong, as someone in the White House once said. It would also be political suicide, as Mr. Mondale learned. The last year during peacetime that tax rates were raised explicitly (rather than through bracket creep caused by inflation) was 1940!
We could raise corporate taxes, but Stanford tax economist John Shoven argues that the poor (as well as the rich) bear a disproportionate burden of that tax: many of the poor are old people who depend on income from their investments.
A consumption tax? To be a sure money-raiser, it would have to hit the non-rich, or it would cause some of the same disincentive effects as the income tax.
Bottom line: we're stuck.
"What you mean, we, kemo sabe?"
"You're right, Tonto." They—the ones who want to tax us more—are stuck. They can't raise taxes without hurting poor and middle-income taxpayers. Period. They'd better figure out how to cut spending.
David R. Henderson writes frequently for Fortune.
The post Economics: The Supply-Siders Were Right (Partly) appeared first on Reason.com.
Not so with Buchanan. For the first time since 1969, when the prize in economics was first awarded, there has been serious dissent about whether the winner deserved it. The dissent has been expressed in public among journalists, and if Washington Post columnist Hobart Rowen is to be believed, in private among economists. Robert Lekachman, a socialist economist at the City University of New York, wrote in the New York Times that Buchanan's scholarly achievements are "rather modest." Rowen quoted an anonymous economist who was "shocked, surprised, and nonplussed" that Buchanan won the prize.
Rowen himself equated Buchanan's contributions to economics with those of—get this—columnists George Will and William Safire, and pollster Richard Wirthlin. Michael Kinsley, editor of The New Republic, called Buchanan "an obscure right-wing eccentric" and claimed Buchanan had said nothing that Kinsley hadn't said.
They are wrong. That these critics find Buchanan's insights obvious (Kinsley called them "trivial") shows just how profound an effect the public choice school has had: ideas about government that were once considered revolutionary are now common wisdom.
Public choice challenges the widespread belief that participants in the public sector pursue the public interest as diligently as people in the private sector pursue their own interests. Public choice economists assume that politicians, voters, bureaucrats, and lobbyists act in their self-interest—just like everybody else.
A personal story best illustrates how economists used to think about government. When I took an undergraduate economics course in 1971, before the public choice revolution had taken hold, the professor gave the typical treatment of government intervention in the economy. We learned that the free market is efficient if a number of very stringent conditions hold. But if these conditions don't hold, then all bets are off.
One possible cause of such "market failure" was "externalities." The usual example was of apples and bees. Apple blossoms provide nectar, which bees use to make honey. If unable to charge the beekeepers for use of their blossoms, the orchard owners, in deciding how many apples to grow, will not take account of the gains to beekeepers. Therefore they will produce too few apples. The positive effect of apple growing on honey production is called an externality.
Another case of market failure was "public goods." I already knew that the "free-rider problem" was a classic and widely accepted justification for government provision of national defense: there is no obvious way to charge people for the benefit they get from defense. If we tried to do so, everyone would have an incentive to free-ride on everyone else and we wouldn't get defended. Thus the justification for taxes to pay for defense.
But the professor who taught me, and the article he assigned by former JFK advisor Francis Bator, took the public goods rationale for intervention further. Even a radio signal is a public good, wrote Bator, because one person's tuning in does not prevent another person from doing so. How about allowing the radio station to scramble its signal and charge people for descramblers to exclude nonpayers? Bator objected that excluding nonpayers is economically inefficient, since the cost of servicing each additional person is zero.
Bator's analysis was not a foot in the door for government intervention—it removed the door completely. Our professor's explicit bottom line was that government intervention could be justified in virtually every area of people's economic lives. And the content of this course at a mainstream school was very similar to what hundreds of thousands of economics students had learned before and have learned since.
When we came to discuss the actual effects of existing government intervention—oil import quotas, agricultural subsidies, airline regulation—the tone changed. Our professor sounded like free-market exponent Milton Friedman. All these regulations and government programs had pernicious effects, he said, and ought to be abolished.
As he attacked program after program, a like-minded student and I could barely suppress a cheer. We egged him on, asking about this intervention and that one and not being disappointed by his answers. By the end of the hour, I felt exhilarated. Whatever this guy's views about the theoretical case for government interventions, he seemed to oppose every one we asked about.
The exhilaration was short-lived. The more I thought about what had gone on in those classes, the more profoundly dissatisfied I felt. Sure, the professor had analyzed particular government regulations and even called for their abolition. But it didn't lead him to challenge his theoretical case for intervention. We had seen the theoretical case for market failure. Where was the theoretical case for government failure?
Actually, there was such a case, and public choice economists had made it. Today, economics students are exposed to a very different view of government from the one we received.
Consider today's microeconomics textbooks. In one of the best-selling, Microeconomics: Private and Public Choice, authors James Gwartney and Richard Stroup take an unabashedly public choice approach. Even textbooks written by liberal economists bear the imprint of public choice. Listen to this quote from the latest edition of a textbook whose 12 editions have sold over 3 million copies: "Before we race off to our federal, state, or local legislature, we should pause to recognize that there are government failures as well as market failures" (emphasis theirs).
The text? Economics, by liberals Paul Samuelson of MIT and William Nordhaus of Yale. Clearly, economists view government differently from the way they used to.
Although most people today think of James Buchanan and his colleague Gordon Tullock as the originators of public choice, its first major contributor was Anthony Downs, now an economist at the Brookings Institution. In An Economic Theory of Democracy, published 30 years ago and based on his Ph.D. dissertation at the University of Chicago, Downs sowed the seeds of public choice.
Starting with the simple assumption that politicians' main motive is to get reelected, Downs derived some powerful conclusions. One is that rational voters will choose to be ignorant about government policies. Why? Because information is a lousy investment: the effect of one vote on an election's outcome is so small that it does not pay for a voter to become informed.
Downs also concluded that in a two-party system such as ours, once party A has taken a position on an issue, party B will position itself close to A so as to get as many votes as possible from opponents of A's position, while at the same time alienating as few voters as possible who favor A's position. For example, if the Democrats advocate spending $5 billion on some program, the Republicans would rationally advocate, say, $4 billion. That way, they could establish themselves as the clearly preferable choice for those who want no spending and at the same time pick up votes from voters who want spending but want it to be below $4.5 billion.
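The arithmetic behind Downs's positioning logic can be made concrete with a toy model. The numbers below are illustrative assumptions, not Downs's: suppose voters' preferred spending levels are spread uniformly from $0 to $10 billion, and each voter backs the party whose proposal is closer to his own preference.

```python
# Toy sketch of Downsian positioning (hypothetical numbers, not from Downs).
# Voters' preferred spending levels are uniform on [0, 10] ($ billions);
# each voter picks the party whose proposal is nearer his preference.

def vote_share(proposal: float, rival: float,
               low: float = 0.0, high: float = 10.0) -> float:
    """Fraction of voters closer to `proposal` than to `rival`."""
    midpoint = (proposal + rival) / 2
    if proposal < rival:  # voters below the midpoint prefer `proposal`
        return (midpoint - low) / (high - low)
    return (high - midpoint) / (high - low)

# Democrats propose $5B. If Republicans counter with $4B, every voter
# who wants less than $4.5B votes Republican: a 45% share.
print(vote_share(4.0, 5.0))  # 0.45

# Moving still closer to the rival's position captures even more voters:
print(vote_share(4.9, 5.0))  # 0.495
```

This is why the parties converge: under these assumptions, each step toward the rival's position peels away more of the voters in between, producing the Tweedledee-Tweedledum pattern described below.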
Clearly, this Tweedledee-Tweedledum behavior of the two major parties is what we usually observe. Candidates out on the extreme—for example, Barry Goldwater in 1964 and George McGovern in 1972—lose.
Downs also argued that government programs that hurt the economy overall are viable if they impose small losses on large numbers of people to finance big gains to small numbers of people. A good example is the federal program introduced in 1983 to subsidize milk producers for not producing milk. These subsidies enrich about 100,000 dairy farmers by thousands of dollars per year, while costing taxpayers and milk drinkers less than $50 per year each. Economists have shown that the total cost of such programs exceeds their benefits. For instance, taxes have disincentive effects; MIT economist Jerry Hausman estimates that these cost the economy about 29 cents for every $1.00 collected. Yet, in spite of greater costs than benefits, the milk subsidy program shows no sign of being repealed. Downs's model explains why.
The dairy farmers have a strong incentive to support the program. They are well informed about how members of Congress vote on it, and their payoffs to politicians who support the program are legendary—in just 1981 and 1982, the two years before the program was introduced, the dairy lobby contributed $2 million to congressional campaigns. On the other hand, taxpayers and milk drinkers have virtually no incentive to oppose it, because their individual gains from doing so would be so small.
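The asymmetry of incentives is easy to see in the figures above. A quick sketch, using an assumed point value of $5,000 for "thousands of dollars per year" (the article gives only the range):

```python
# Concentrated benefits vs. diffuse costs in the milk program.
# gain_per_farmer is an assumed point value for "thousands of dollars".
gain_per_farmer = 5_000   # dollars per year (assumption)
cost_per_consumer = 50    # "less than $50 per year each"

# A farmer's stake dwarfs any single consumer's, so farmers organize
# and lobby while consumers stay rationally ignorant.
print(gain_per_farmer / cost_per_consumer)  # 100.0 -- a 100-to-1 ratio

# Hausman's estimate: each $1.00 of taxes collected destroys about
# $0.29 of additional value, so every dollar transferred to farmers
# costs the rest of the economy more than a dollar.
excess_burden = 0.29
print(1 + excess_burden)  # 1.29
```

The transfer itself nets out to zero, but the 29-cent excess burden per dollar is pure loss, which is why the program's total costs exceed its benefits even though it persists.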
Popularized in the 1970s by Milton Friedman and others, this analysis of special interests' advantages now seems like mere common sense. But do not underrate the value of common sense. Downs's analysis would be news to David Stockman. In his recent book, The Triumph of Politics, Stockman asserts that in the early 1980s the United States in effect had a national referendum on government spending and that the vote was overwhelmingly against cutting it. In fact, there never was such a referendum. Voters never had a direct say in spending. Rather, they had to choose among particular politicians, virtually all of whom understood that the way to assure reelection is to cater to special interests, not to the general voter. The failure of Stockman, a very bright man, to distinguish between votes for spending and votes for politicians who have an interest in spending shows that the public choice message is less obvious than its critics keep saying.
Although Downs's book was the first major breakthrough, the impact of public choice ideas would have been much smaller had it not been for Buchanan and Tullock. They were the ones who institutionalized public choice. And the story of how they did so is fascinating.
It began in 1948. Buchanan, who had just completed his Ph.D. dissertation at the University of Chicago, was browsing in the "dusty stacks of Chicago's old Harper Library" when he discovered a little-known and untranslated 1896 essay that the Swedish economist Knut Wicksell had written early in his career. Its gist was that the only justifiable taxes and government expenditures are those the taxpayers approve unanimously. That way, explained Wicksell, proponents of programs have to devise taxes that actually take money from the beneficiaries of those programs.
Wicksell argued that such a system would be not only efficient but also morally sound: "It would seem to be a blatant injustice if someone should be forced to contribute toward the costs of some activity which does not further his interests or may even be diametrically opposed to them."
Reading that essay, says Buchanan, "was one of the most exciting intellectual moments of my career." Wicksell had stood orthodox public finance theory on its head in ways congenial to Buchanan's own way of thinking. The proposal and its justification were at odds with the mainstream view of the 1940s that there need be no connection between taxes paid and benefits received. That it came from Wicksell, a giant among early 20th-century economists, gave Buchanan "a tremendous surge of self-confidence." He vowed to spread Wicksell's message and immediately started translating the essay from German into English.
In the 1950s, there were few conservative or libertarian academics in the country, at most two or three per college. Most of them had developed a fortress mentality. The Volker Fund, a private philanthropic organization, was set up to remedy this. It worked.
The Fund held conferences to bring such academics together for moral and intellectual sustenance and nurture. At one of these, Buchanan and fellow Chicago graduate G. Warren Nutter, both of whom were teaching at the University of Virginia in Charlottesville, met the Fund's representative, Richard Cornuelle. Out of that came a five-year Volker grant to set up the Thomas Jefferson Center for Studies in Political Economy.
The grant authorized them to hire a postdoctoral fellow for a year. Nutter recommended his former debating partner at the University of Chicago. The candidate's credentials—a law degree, one course in economics, and nine years as a State Department employee specializing in Oriental communism—were not obvious qualifiers. But neither Buchanan nor Nutter was into credentialism.
Buchanan met this government bureaucrat at the American Economic Association meetings in Philadelphia and came away with a thick manuscript full of insights into how bureaucracies work. After reading it, he offered the fellowship. Gordon Tullock accepted. Thus was formed a close association that lasted for over 20 years. The manuscript turned into a book, The Politics of Bureaucracy, that should be thought of as a classic.
In 1959, Buchanan and Tullock decided to write The Calculus of Consent. This book is a classic. That it was published in 1962, the 175th anniversary of our Constitution, is appropriate. For in it, the authors defend a constitutional democracy with checks and balances, much like the one we started with in this country. Their book is really a modern-day version of James Madison's contributions to the Federalist Papers, informed by a much more sophisticated knowledge of economics.
In Calculus, Buchanan and Tullock analyze various possibilities for constitutional rules of governance. One they consider is a requirement that any government activity be approved unanimously. While such a rule assures that no government action will impose net costs on anybody, it is also unworkable in practice. So they consider various ways of modifying the rule to attain "workable" unanimity. But they do much more than this, and a few paragraphs cannot do justice to their accomplishments. This book alone should have won the Nobel for Buchanan and Tullock.
When I read books by people I've never seen, I form mental pictures of them. Reading Calculus in 1970, I pictured Buchanan and Tullock as urbane, pipe smoking, Ward Cleaver types in their mid-40s. I was roughly right about their ages, but dead wrong otherwise. Buchanan, who grew up in rural Tennessee, regards himself as a down-home country boy, or, in his words, "one of the great unwashed." He is also very shy. Tullock, who grew up in Chicago, has a cherubic face that makes him look like an overgrown kid. He is very abrasive. He loves to tangle people in knots when he disagrees with them and will push until they cry uncle. My guess is that his abrasiveness cost Tullock a Nobel. (The Nobel decision was apparently a cliff-hanger: Tullock claims he was told that the committee split at the last minute on whether to make a joint award.)
Had personalities counted as much to Buchanan and Tullock as they appeared to with the Nobel committee, their partnership would never have materialized. But what counted was their ideas. Although Tullock had entered the University of Chicago "as a moderate Republican who believed in high tariffs," and Buchanan as a self-described "libertarian socialist," within weeks both had become hard-core free-marketeers. Tullock attributes his conversion to a 10-week course in basic economics taught in the law school by Henry Simons. Buchanan was most influenced by the legendary Frank Knight.
(In 1986 Buchanan described himself to the Washington Post as a libertarian. But he holds some apparently unlibertarian ideas. In a 1971 article, Buchanan advocated equalizing people's economic opportunities with "confiscatory inheritance taxation," "massive public outlays on general education," and "selective programs to eliminate poverty." He still does. Explains Buchanan: "I don't think libertarianism makes much sense until you define who are the players in the game. People have to have the sense that it's a fair game. By evening up the starting point, we make it fairer.")
Together, Buchanan and Tullock nurtured the public choice school of thought. Tullock arrived permanently at the Thomas Jefferson Center in 1962. He left in 1967 over a conflict with the university administration—they had thrice denied him promotion from associate to full professor. Buchanan followed the next year: "I left because there had been repeated signs that the university's administration did not support us. They did nothing to keep Ronald Coase (who then went to the University of Chicago Law School) or Andy Whinston. Tullock was the last straw."
But during those years they had, according to Tullock, "an extraordinary collection of students." Among them were James C. Miller III, currently head of the Office of Management and Budget, and Robert Tollison, now a leading academic public choice economist. One of the students, J. Ronnie Davis, refers to those years as "Camelot."
A core group of these students made names for themselves even while in graduate school by publishing Why the Draft? in 1968, at the height of the draft protests. Edited and entrepreneured by Miller, the book made a solid case—on grounds of economics, justice, and tradition—against the draft. With an introduction by Senator Edward Brooke of Massachusetts and glowing praise on the cover by both Milton Friedman and John Kenneth Galbraith, it sold about 35,000 copies.
A popular book by graduate students in economics is almost unheard of. One reason this one came to be is that Buchanan nurtures students. He has them write weekly papers, many of which later become scholarly articles. One of his favorite sayings is, "Don't get it right—get it written." Buchanan contrasts his approach with what he calls the "UCLA disease." When I first heard him use the term, I knew immediately what it meant, having earned my Ph.D. there. We UCLAers learned very quickly that if you made a mistake, the gods of economic wisdom—the professors—would jump all over you. That made most of us gun shy—and many ex-UCLAers have taken years to recover.
Precisely the opposite happens to Buchanan's students. Their first work as graduate students is usually not their best. But they grow quickly. This encouragement is a big factor in the spread of public choice.
Tullock also did his share to disseminate the new paradigm, both with his own prolific writing—he dictates articles quickly into a tape recorder and then edits the transcripts—and by founding a journal. Started as Papers in Non-Market Decision-Making in 1966 and funded out of Tullock's own pocket, within a few years it became the journal Public Choice.
How did they come up with the phrase "public choice"? Says Tullock: "We started having annual meetings of people interested in the field. We considered calling ourselves the No-Name Society, then Polyconomics, then Synergetics. We didn't like any of them. We couldn't use the term 'Political Economy Society' because 'political economy' in those days meant Marxist. Bill Riker [a political scientist at the University of Rochester] suggested 'public choice.' No one liked it much, but it was neutral."
The journal spent its first years as Public Choice at Virginia Polytechnic Institute (VPI), where Tullock went in 1968 after a year at Rice. Buchanan followed Tullock to VPI a year later, and together they set up the Center for Study of Public Choice, funded partly by private money and with salaries paid by the school's economics department. In 1982, Buchanan and Tullock—and the Center—moved to George Mason University in Fairfax, Virginia.
Throughout this time, their interests diverged. Buchanan has focused on what he calls "constitutional economics." He spends a lot of time thinking about what kinds of rules people would choose if they could revise the Constitution. (For example, he himself favors some kind of balanced budget amendment.) His interests in the area have become more philosophic and less economic, as he struggles with some of the same issues as philosophers Robert Nozick (Anarchy, State, and Utopia) and John Rawls (A Theory of Justice).
Tullock, on the other hand, is an "economic imperialist" who has applied the economic approach to understanding revolutions, dictatorships, and court trials. He also pioneered the theory of "rent seeking": the use of political activity to redistribute wealth from others to oneself. Most economists who write about redistribution think of wealth being channeled from the rich to the poor. But Tullock, especially in his 1983 book Economics of Income Redistribution, emphasizes the even more substantial transfers that have nothing to do with rich and poor.
A good example is farm programs. Many farmers are rich, many are middle-class, and many are poor. Exactly the same is true of the consumers and taxpayers who pay for these programs. Tullock was the first to note that the attempt to redistribute income to oneself is costly and that many of the gains from rent seeking are dissipated in the process. His work has generated a great deal of controversy about what fraction of the redistributed income gets lost in the process. Tullock is currently trying to figure it out himself.
He is doing so at a new location. Last summer, Tullock moved to the University of Arizona. He says the move was not triggered by bitterness at not sharing the Nobel. "I am irritated—not at Buchanan, because he had nothing to do with it, but at the Nobel committee. I was very upset when it first happened: I couldn't sleep the first two nights. I have quieted down." So why the move? Part of it is the divergence of Buchanan's and Tullock's interests. But also, Tullock turned 65 this year, qualifying him for the Virginia state retirement plan. By moving to Arizona, he will be able to "double dip."
Of course, much research on public choice has gone on at places other than the Center. For example, William Niskanen, now chairman of the Cato Institute, wrote one of the classics on bureaucracy while at the Institute for Defense Analyses. In Bureaucracy and Representative Government, published in 1971, Niskanen applies to heads of government bureaus the central assumption of public choice—that people in the public sector pursue their own interests. Figuring that these bureau heads have a monopoly on the goods and services they produce for "sale" to the legislature, Niskanen concludes that bureaucracies will produce approximately twice the optimal level of output.
Other economists have criticized this conclusion and the analysis leading to it. For example, Earl Thompson of UCLA showed in a journal review at the time that if Niskanen were right, half the voters would want to eliminate the bureau altogether—something we don't observe.
Niskanen admits the criticism but responds that his monopoly assumption is the culprit. In practice, he notes, bureaus often do compete. The Federal Trade Commission and the Justice Department both enforce antitrust laws, and such competition, he claims, makes a bureau's output closer to the optimal level.
Another strand of public choice research is that of economists at the California Institute of Technology who have focused on the political agenda. Their thesis: he who controls the agenda controls the outcome. That is, by specifying the options from which voters can choose, and even by specifying the order of options, a clever agenda-setter can get whatever outcome he wants. An important way to limit the power of legislatures, therefore, is to have an initiative process, such as California's. That way, voters who are unhappy with the alternatives they're given can, with enough votes, get their preferred option on the ballot.
Much is brewing in public choice research today. Robert Tollison, director of the Public Choice Center at George Mason, has consistently produced new ideas. He is not content with having demonstrated, with coauthors William F. Long and Richard Schramm, that antitrust authorities prosecute firm A and not firm B because firm A's sales are higher. Nor does he stop at showing, with coauthor Robert McCormick, that the pay of a state's legislators is directly related to the date of its constitution (the more recent the constitution, the more power it gives to legislators).
Now Tollison is taking on the view of Milton Friedman and Anna Schwartz that honest mistakes by the Federal Reserve Board caused the money supply to shrink in the early 1930s, thus causing the Great Depression. Argues Tollison: "The Fed purposely engineered the Great Depression to hurt banks that were not members of the Federal Reserve System and to help banks that were members."
Friedman is not convinced. Yet he recently admitted that public choice has affected his view of the Federal Reserve. "I no longer give advice to the Fed. I instead try to figure out what kind of constitutional or institutional arrangements we could set up which would make it possible to abolish it."
Even outside the Center, a large research community works from the public choice framework. Gary Becker, a University of Chicago economist who pioneered the application of economic analysis to discrimination, "human capital," and the family, has recently weighed in on rent seeking. In a 1983 article, he reaches a controversial conclusion: that government programs to redistribute income will be the most efficient ones possible for achieving that redistribution.
Becker claims that the programs will not impose net losses. In fact, what he really demonstrates is that the losses will be as small as possible. Why? Because if the losses for a given amount of redistribution could have been smaller, then the gainers would have an incentive to substitute programs that give them the same benefits but cost the losers less: that way, less opposition would be generated among losers.
But Dwight Lee, a public choice economist at the University of Georgia, disagrees. Often, notes Lee, the most efficient way of redistributing is to tax everyone and give the proceeds directly to the politically powerful. "The problem is, this is also the most blatant. A blatant way of subsidizing the politically powerful will generate more opposition among the losers than an inefficient but subtle way that they are not even aware of."
Even with Lee's caveat, Becker's theory can explain why privatization of government assets has been much more extensive in Britain than in the United States. When the Thatcher government wanted to unload public housing on which it was taking a loss, it sold the housing to current tenants at below-market prices. This made the politically powerful tenants, who otherwise would have opposed privatization, into strong supporters. In the United States, on the other hand, when Reagan's Office of Management and Budget tried to privatize electric power in the West, it did not propose giving current beneficiaries of artificially low-priced power any share in the gains. Not surprisingly, OMB's initiative is going nowhere.
And not to be missed in a necessarily incomplete survey of the public choice landscape is a paper on gerrymandering being written by Ben Zycher. Gerrymandering—dividing states into election districts to give one political party an electoral majority in a large number of districts while concentrating the opposition's strengths in as few districts as possible—is often considered unsavory, even immoral. In recent years it has mainly helped Democrats who disproportionately control the state governments that do the redistricting.
But Zycher, a free-market economist at the Rand Corporation, defends gerrymandering. In its absence, he argues, many more elections would be close, thus increasing the power of concentrated groups and creating a bias toward bigger government. Furthermore, gerrymandering guarantees the minority party a minimum number of seats even in a landslide, thus preserving the political system's long-run competitiveness.
Of course, not all students of the political system have been persuaded by the public choice paradigm. Joseph Kalt of Harvard's Kennedy School of Government and Mark Zupan of the University of Southern California have presented overwhelming evidence that narrow self-interest alone cannot explain legislators' votes. In a 1984 article, they show that the ideology of U.S. senators was an important factor in their votes on a bill to regulate strip mining. In a recent book, Making Public Policy: A Hopeful View of American Government, Kalt's colleague Steven Kelman raises many similar cases that cast doubt on the public choice view that only self-interest matters, not ideas—for example, about the proper role of government.
Buchanan himself agrees: "Public choice scholars who assume that voters and legislators care only about their narrow self-interest have gone too far." But he points out that the public choice approach is useful even though people have broader, more altruistic concerns. Just as the tax deductibility of charity increases the amount of charitable giving, he argues, so allowing government officials the power to distribute "rents" will lead people seeking those rents to invest more to influence the officials' decisions.
Has public choice had an important impact? Buchanan and Tullock student and OMB chief James C. Miller III answers that question with a resounding yes. Says Miller: "The public choice approach has caused people to focus on a question that virtually no one was asking or answering: how do you structure incentives in the political system so that politicians will make the 'right' choices?" He sees the seeds of public choice thinking in the Gramm-Rudman law and in proposals for a balanced budget amendment.
Milton Friedman, not himself a member of the public choice school, finds its work "tremendously important." What they did, he said recently, "was to bring out explicitly…and develop logically and systematically" what had been implicit in the writings of earlier economists: that government officials are as self-interested as businessmen.
Buchanan, Tullock & Co. have reinforced a skepticism of government native to Americans. On questions of government intervention, the burden of proof is now more than ever on the proponents. And that is all to the good.
David R. Henderson, a former member of President Reagan's Council of Economic Advisers, is currently a regular contributor to Fortune.
The post James Buchanan & Co. appeared first on Reason.com.
Are the Northeast and the Midwest, the so-called Frostbelt, dying? Is the rise of the South and the West, the so-called Sunbelt, due to the closing of manufacturing plants in the Frostbelt? If companies were forced by law to give long advance notice and to make large payments to employees whenever they closed a plant, would this necessarily benefit workers?
These questions and more are taken up in Fugitive Industry, written by economist Richard McKenzie of Clemson University. His answers may surprise a lot of readers.
Advocates of laws making it difficult and costly to close factories claim that plant closings, contractions, and relocations eliminated over 15 million jobs in the United States between 1969 and 1976. But McKenzie points out that these lost jobs were more than compensated for by 24 million jobs created over that same period. Moreover, while manufacturing employment in the North fell by 1.6 million during that time, total employment (excluding agriculture) increased by 1.9 million. McKenzie shows that between 1965 and 1980, total nonagricultural employment increased in all major regions of the country—even in northern industrial cities such as Akron, Detroit, Youngstown, and Cleveland, which are supposedly the epitome of industrial decline.
Nor was the net loss of manufacturing jobs in the Frostbelt caused by a higher rate of job loss than in the Sunbelt. Rather, it was due to a lower rate of job creation. Moreover, McKenzie points out, over 98 percent of job losses in the Frostbelt between 1969 and 1972 resulted from companies scaling back or shutting down operations. Less than 2 percent were caused by companies moving to greener pastures. So much for the widespread belief that the Sunbelt is prospering at the expense of the Frostbelt.
Unfortunately, McKenzie fails to point out that the Frostbelt's industrial problems are concentrated in autos and steel, where very high wages prevail. In 1970, hourly compensation for auto and steel workers was about 30 percent higher than the average for all manufacturing employees. But by 1981 the difference had grown to 50 percent for auto and 70 percent for steel workers.
Do plant-closing laws benefit workers? McKenzie says no. Such laws, to the extent they prevent closings, tie up capital that could have been used to create new jobs. Unfortunately, this effect is not readily apparent. Consequently, workers who remain unemployed because new jobs are not created don't realize why few new jobs exist. So they don't lobby against plant-closing laws. On the other hand, workers who already have jobs, and hope to keep a lock on them with plant-closing laws, lobby very effectively.
In criticizing plant-closing laws, McKenzie makes one argument that does not make sense. He claims that such laws could themselves cause some individual plants to close: if a plant is doing badly and there is no law, the company might hold out in the hope that demand will pick up; but if there is a law that imposes a cost on the company for closing its plant, the company will be more likely to close it. McKenzie's logic here is obscure, for a company would be more likely to keep the plant open if its managers faced a penalty for closing it. McKenzie could reasonably claim, though, that if companies anticipated enactment of a closing law they would be more likely to shut down a marginal plant to avoid future penalties.
Often, worry about plant closings comes down to a perception that it's unfair to lay off employees who have spent years of their lives working in one plant. But, counters McKenzie, the supposed unfairness depends on what the employer has promised. If he has assured his workers that their jobs are secure for a long time, then it is unfair to close the plant suddenly. But when an employer breaks such a commitment, his employees have a just grievance that can be handled by the courts.
When an employer has made no such promises, why, asks McKenzie, is it "unfair" to close the plant suddenly? In such cases, the employer generally has to pay higher wages to compensate his employees for the higher risk of losing their jobs. For this reason, it is plant-closing laws themselves that are unfair. They force an employer to overcompensate his employees: he first pays them wage premiums that reflect the risk that their jobs might end, and then he cannot close the plant as quickly as was anticipated when those premium wages were agreed to.
Of course, as McKenzie points out, employees would be overcompensated only in the short run. In the long run, employers would insist on paying lower wages to make up for their reduced flexibility, so employees wouldn't necessarily benefit overall. In fact, they might lose. After all, they are always free to bargain for lower wages in exchange for less management flexibility to close plants. Employees who opt for higher wages must want that more than they want job stability. But plant-closing laws would prevent them from making that trade-off.
One caveat is in order. While the reader will probably find most of McKenzie's arguments persuasive and his evidence convincing, there are some real problems with his book. It is wordy and plodding and too seldom incisive. The book clearly needed a good editor. Also, McKenzie fails to present some evidence that strengthens his case. Some of the best evidence appears only in the brief foreword by economist Finis Welch.
Nevertheless, the issue McKenzie deals with is important. Restrictions on plant closings are already in force in Maine and Wisconsin and in Philadelphia. And the debate over such laws continues. McKenzie's Fugitive Industry provides a valuable source for learning more about the case against such restrictions.
David Henderson recently left the staff of the Council of Economic Advisers to return to teaching.
John Hospers
Streamers (parachutes that won't open) was a Broadway play, then a television drama, and now Robert Altman has made it into a film. Set entirely in an army boot camp in Virginia, it still has the trappings of the original stage play. Whether as theater or as film, it is a drama of raw power and concentrated emotion. It is set in the time of the Vietnam war but is not at all about war; it is about the conflicting emotions of men, until recently strangers to one another, who are thrown together in close proximity. Fear of death, fear of homosexual impulses, fear of confrontation with each other and their own deepest feelings, combine to produce an excruciatingly intense drama.
The performances, by a cast of unknowns, are superb. Best of all perhaps is that of George Dzundza as the Vietnam-weary officer. When he appeared in The Deer Hunter, for a few crucial moments the fate of the entire film was in his hands: at the end of the film, tears streaming down his face after the funeral, he prepared breakfast for the grief-stricken survivors and, as if to deflect their attention, started to hum "God Bless America," whereupon others took up the tune until everyone joined in. It was a tremendous chance to take, the risk being that, coming where it did, it would sound false and ruin the climax of the film; but thanks to his performance the ending was brilliant. He enacts a similar scene toward the end of Streamers, an unusually difficult and delicate thing to bring off, but he does it with extraordinary ability.
The film, however, is so harrowing that it could hardly be called entertainment, and it will certainly not suit everyone's taste.
Wolves are still perceived as the enemies of man, as vicious predators. The literature is full of myths about attacks on man by wolves—Red Riding Hood and the wolf, Saki's famous story about wolves' attacks, and so on. One result of man's tragic misperception of wolves is that they have been almost wiped out on the American mainland.
Actually, wolves are among the most highly evolved species in the world. Their complex system of signals, their manner of dividing up territories, their code of etiquette in the training of their young, and their relations with other animals, all marvels of complexity and utility, have been the subject of numerous documentaries, notably Wolves and Wolf-Men.
Now from Disney Productions comes a full-length film, Never Cry Wolf, filmed over the span of a year in the wilds of Alaska. It illustrates another facet of the life of wolves, amidst the most stunning Arctic scenery. A man is sent to Alaska to test the theory that the wolf is responsible for the depletion of caribou herds. As things turn out, it is man and not the wolf who is responsible for the slaughter of the caribou—the wolf lives largely on small rodents. But between the landing in the Alaskan wilderness and the final proof of the truth about wolves, there is a remarkable series of incidents, with an engrossing narrative, a serious purpose, and a pervasive sense of humor, all of which make the film richly informative and fascinating to watch. For anyone who cares about animals or about scenic beauty, this is one of the most rewarding films of this or any other year.
Compared to most contemporary films about the problems of youth, Rumble Fish is profound and disturbing. It concerns the problem of growing up in a city (Tulsa), with no parents worthy of the name, no money, nothing to do, wondering how to spend each day that comes along. It is a saga of alienated youth, one that makes most youth-oriented films seem superficial and fatuous by comparison.
From another point of view, however, a person could well say that the characters largely deserve their fate. People have come out of the slums before and made their lives succeed; these show no motivation at all in that direction—neither the main character (Matt Dillon) nor his ex-motorcycle champion brother (Mickey Rourke).
With whichever attitude one comes to the film, it seems clear that for director Francis Ford Coppola the art (some would say artiness) is more important than the story. Mists constantly swirl about the scene; the streets are always wet, the sunsets are always luridly dramatic. But then, the film is not an attempt at cinematic realism. It is a vision, presumably of life as seen from a specific point of view, in which the physical universe is imbued with the tortuous conflicts of the characters, much as the rocks appear agonized and tormented in El Greco's painting "Christ in Gethsemane." To criticize the film for not abiding by the standard of realism is to miss entirely what Coppola was trying to do.
But not everyone who sees what he was trying to do will like it. There is not much by way of overt incident in the film; it is highly atmospheric, and this is something one must absorb through one's pores slowly and gradually. Those interested only in what happens next should accordingly skip this one.
Beyond the Limit is a film adaptation of Graham Greene's novel The Honorary Consul. It is a fairly accurate rendition of the bare bones of the novel (except for the fortunate omission of the physician's mother). But some of the sharp lines of the novel have been lost, owing to the familiar tendency among screen writers, who already know the plot well, to assume that the viewer who hasn't read the book is equally familiar with it. So incidents in the book that contain rapier-sharp thrusts tend to be blunted in the film. Moreover, the agonizing sense of moral ambiguity that pervades Greene's work is largely lost in the film: the idealistic young physician sleeps with his friend the consul's wife; the antigovernment Paraguayan guerrillas have deceived the doctor into serving them by lying to him about his father's fate; the doctor thinks his first duty is to save lives, no matter whose, and thus remains apolitical but is forced by the situation in Argentina to take sides, a moral decision for which he pays heavily; and so on.
Michael Caine as the consul and Richard Gere as the doctor give their best to their roles. But if some of the moral dilemmas had been fleshed out more clearly, the film would have been an eminently worthy one instead of merely adequate.
John Hospers is the author of Understanding the Arts. He teaches philosophy at the University of Southern California and is the editor of the Monist.
David Brudnoy
There is a theater of, if you will, the weird sisters or the weird family. George S. Kaufman had a knack for it, and now and again somebody comes along with a sense of the fineness of that line separating the weird sisters from the dipsy-doodle sisters, the strange but engaging family from the deliberately nutsy family. You may find an evening's entertainment by the latter a kind of exercise in tolerance, but you'll likely find it a strain. With the cleverly written and engagingly acted weird sisters, however, you'll have a ball.
You will, to be specific, have a ball with the three MaGrath sisters in Beth Henley's wonderful Crimes of the Heart. It is currently on national tour after copping a Pulitzer in 1981 and enjoying a lengthy and profitable run in New York City.
We meet the sisters first in the form of Lenny (Caryn West), a spinster now suffering through the very day of her 30th birthday. Lenny lives with and tends Old Granddaddy MaGrath, who raised her and her sisters after their daddy absconded and their mother hanged herself (taking an old yellow cat along with her).
Lenny keeps a neat house in Hazelhurst, Mississippi, and frets. She has raised fretting to the finest of arts. She is not a southern incarnation of the yenta as gentile (she doesn't oppress others, she just agonizes more than her share), nor does she stand much chance of liberation. She is strongly rumored to have had a man once—one man, one episode—but something happened, and he's no longer around.
Crimes of the Heart portends many deep dark secrets. Most turn out to be medium grey, but they are secrets of the heart just the same. They are secrets in the modern South, not Southern Gothic secrets, however much some reviewers want to shove Henley's play into a niche it doesn't fit. This is a piece that resists neat pigeon-holing, as the three MaGrath sisters resist easy analysis.
The sisters have a cousin, Chick, who is a paragon of overweight rectitude, recalling how she almost missed getting into the ladies' social club because of Meg (Kathy Danzer), who, to hear Chick tell it, was the Scarlet Woman before Hester Prynne. It seems Meg was a mite quicker into the world of the liberated woman than most of the proper young ladies of Hazelhurst, and of late she has been flopping, albeit gutsily, as a nightclub singer in California. But Meg's on her way home to help baby sister Babe (Cyd Quilling), who has just been let out of jail. Babe has spent the night in the local keep because she shot her husband in the stomach—aimed for his heart but missed—since, as she puts it, she didn't like the way he looked.
Well, now, there's more to everybody's story than first appears. Meg's old beau is back and married to a northern woman—"Poor Doc," says Meg; "his children are half Yankee!"—and Meg still has a yen for him. As for Babe, she has grown sufficiently bored with her husband that she has taken up with a local teenager, who happens, oh my, to be of a decidedly duskier hue than is considered acceptable in Hazelhurst, Mississippi.
Beth Henley has given three sisters the secrets of about five and hung upon their attractive but slight shoulders more baggage than this fine though not great play can truly bear. Crimes of the Heart can hardly be put to the trimming process now, its Pulitzer hovering above it and its lengthy run in New York presaging its success on tour; but the play simply cries out for simplifying. The cast, those mentioned and Dawn Dadiwick as Chick, Tom Stechschulte as Doc, and David Allison Carpenter as the eager young lawyer who would like very much to save Babe from prison and earn her gratitude in every which way, carry the play to success despite the top-heaviness of its plot. Here is ensemble acting at its rare best, a textbook example of the power of well-directed performances to whisk away many of the problems embedded in a play's script.
Crimes is more than a well-crafted comedy of modern manners and timeless virtues conflicting and coalescing. It's an exceptionally thoughtful exploration of the nature of sibling connection. These are three young women who need and very genuinely love each other but who drag resentments along with them into their basically warm and supportive sisterhood.
It is, moreover, a shrewd examination of the equation of duty and self-interest. Lenny has taken on her felt obligations as something of a cross between a hairshirt and a housedress; she's ground down by it all, by her subservience to the needs of Old Granddaddy, but she's also grown comfy and safe in that role. After all, Lenny can send a suitor packing, saying to herself that he wouldn't want her because she has a shrunken ovary, though in time she learns that he loathes kids anyhow. But Lenny's balancing act with what she feels she must do and what she half suspects she would like to do is no more delicate and no more of a trap for her than is Meg's for her and Babe's for her.
Each of the sisters has come only some of the way toward freedom—meet the audience, ladies—and each has only half realized the fact. Meg thinks that she's washed up because her singing career is going nowhere; but what she thinks has done her in—her willfulness—has probably been her salvation, and what she thinks she has left behind—her smalltown conventional roots—has tagged along and held her down.
Babe? Well, see Babe in action, and you'll know.
Crimes of the Heart is wiser than most plays and kinder by a country mile. It's a gentle nod to the weird sisters in us all, and it skirts quite nicely the dipsy-doodle sisters we all fear to become. It's a swell play for all but congenital grouches, and infants, and it's going the rounds now to brighten the winter stage.
Contributing Editor David Brudnoy reviews the arts for Boston radio station WRKO-AM and is film critic for the Tab group of newspapers. He is host of a nightly talk program on WRKO, writes regularly for the Boston Herald, and syndicates a thrice-weekly newspaper column.
Copyright © 1984 by David Brudnoy
William Havender
A Feeling for the Organism, by Evelyn Fox Keller, San Francisco: W.H. Freeman, 1983, 235 pp., $17.95.
During high school in the early decades of this century, a young woman, in taking stock of her inclination for "doing the kinds of things"—like studying for a career—"that girls were not supposed to do," pondered how she could handle her "difference." Says that woman now,
I found that handling it in a way that other people would not appreciate, because it was not the standard conduct, might cause me great pain, but I would take the consequences. I would take the consequences for the sake of an activity that I knew would give me great pleasure. And I would do that regardless of the pain—not flaunting it, but as a decision that it was the only way I could keep my sanity, to follow that kind of regime. And I followed it straight through high school, and through college, through the graduate period, and subsequently. It was constant. Whatever the consequences, I had to go in that direction.
These resonant words might well have been spoken by one of the heroines of Ayn Rand's individualist novels. In fact, however, they were spoken by Barbara McClintock, one of the world's great geneticists. Her life—an uncommonly determined, purposeful, and accomplished life—is chronicled in Evelyn Fox Keller's book, A Feeling for the Organism, written and published before McClintock won the 1983 Nobel Prize in medicine for discoveries in the field of genetics that underlie much current research in genetic engineering and disease control.
The science of genetics must seem rather esoteric to people who are not schooled in it, so the significance of McClintock's work may not be clear. She made her major contributions to cytogenetics in the years before World War II, when scientists were still working out the relation between the phenomena that were observable by means of genetic crosses and the behavior of the microscopically observable cellular bodies called chromosomes. Suffice it to say that McClintock's brilliant contributions at this time were seminal in establishing the chromosomal theory of inheritance. These include the first demonstration that genetic recombination is correlated to a physical exchange between chromosomes ("crossing over"), the discovery of ring chromosomes, the identification of the nucleolar organizer, and the elucidation of the cytology of the important experimental organism Neurospora.
She won a worldwide reputation but still found it difficult to obtain a university position, a circumstance not entirely unrelated to the fact that she was the first woman to seek to pursue a full-time career in genetic research. Still, she eventually secured a position at the Cold Spring Harbor Laboratory on Long Island's North Shore, where she remains to this day. And there she carried out the research that she regards as the most important of her career, namely her discovery in the late 1940s and early '50s of movable controlling elements that govern the turning on and off of genes in the corn plant. The significance of this pathbreaking discovery was largely overlooked at the time in the lemming-like rush of the profession to embrace the new glamour discipline of molecular biology, a rush given irresistible impetus by the working out of the structure of DNA in 1953 by James Watson and Francis Crick.
Only in the early '60s did one part of McClintock's newer work—her finding that the functioning of the genes that code for proteins can be switched on and off by genetic factors outside of the coding sequences—receive external validation in the Nobel prize–winning work of Jacques Monod and Francois Jacob with bacteria. And only in the '70s was another portion of McClintock's later work—her finding that these controlling elements can move around from place to place on the chromosomes—confirmed by the discovery of "insertion elements" in bacteria and of related phenomena in yeast and higher cells. These phenomena are currently under intense investigation for the promise they hold in understanding the coordinated growth and differentiation of tissues in living organisms and, in particular, in understanding that defect in coordinated growth known as cancer.
For those interested in learning painlessly about the history of genetic research in this century (which equals 20th-century physics in intellectual brilliance), this book is superb. For feminists, the book is to be recommended, too, as the well-told tale of yet another woman who pioneered in a man's world. And for individualists and supporters of free enterprise, the book provides a valuable illustration of the importance of a social order with many "niches" and diverse sources of funding, so that mavericks can survive even when the herds of the profession are moving elsewhere. For this reason, the ever-increasing centralization of research support in the biological sciences (currently, virtually all money for fundamental biological research comes from the federal government), with its inevitable tendency only to support work that is conventionally understandable, is much to be regretted. Only in a world where the sources of support are multiple, independent, and various can the true entrepreneurs of the intellect, like Barbara McClintock, flourish.
William Havender is a biologist and a freelance writer.
Laurence Beilenson
Present History, by Theodore Draper, New York: Random House, 1983, 426 pp., $19.95.
Nuclear war, the Western alliance, Vietnam, Henry Kissinger's diplomacy, and the Arab-Israeli wars—these are Theodore Draper's subjects in Present History, a collection of articles written during the last 10 years by this writer and scholar of things political and historical. His aim, he says, is "to analyze present-day events historically [and] with convincing documentation and reasoned judgment."
Draper does not suffer fools, or even mistakes of the brightest, gladly. His acid pen is at its best in burning away the deadwood that erring commentators have heaped around his subjects. His own solutions, however, are flawed by his disregard or ignorance of essential facts and his failure to use historical patterns as a guide for present policy.
The virtue and the defect become equally evident in the articles on the related problems of nuclear war and the Western alliance. George F. Kennan's advocacy of parallel declarations by the United States and the Soviet Union promising to refrain from first use of nuclear weapons, Draper spears with a single sentence: "The awful truth is that [such declarations] have no reliability at all." And if, as Jonathan Schell (The Fate of the Earth) proposes, "we have to 'reinvent the world' to control nuclear weapons, the chance of saving the human race must be somewhere near the vanishing point." To negotiations and treaties between the United States and the Soviet Union—praised as the best hope for nuclear peace by the Catholic bishops, the media, and the peace fronts—Draper also gives short shrift. Altogether, he blows a refreshingly cool breeze through the mass of hot air beclouding the subject.
What a letdown, then, when Draper urges "minimal deterrence," as suggested by Lord Solly Zuckerman. If, as Draper asserts, deterrence is our only means of averting the calamity of nuclear war, why not deter to the maximum extent possible? Where survival is at stake, monetary cost should become unimportant. And as the author apparently does not know, nuclear weapons are the cheapest segment of our military establishment. We spend only one-eighth of our defense budget to deter nuclear war; seven-eighths goes to prepare to fight conventional wars in faraway places.
Having properly ridiculed the notion that treaties are reliable, Draper proposes to halt the arms race by prohibiting research and development through a test-ban treaty. The Soviets could easily pursue secret research and development and then deploy the weapons either without tests or with tests on the eve of deployment. The awful fact, as Draper would say, is that we would have "to re-invent the world" to stop the advance in science and technology.
We shouldn't want to; it is precisely science and technology, with to-be-expected leaps, that offer the best chance of escaping from dependence on deterrence. Deterrence counts on the absence of accidents and the presence of rational conduct in a world where human error and irrational conduct have occurred regularly. To see that beneficence is not a reliable means of preventing nuclear war, it is not necessary to point to Leninist ideology and its exaltation of violence; the blood-stained pages of history bear witness enough. Real hope lies not in beneficence but in humankind's history of tremendous material betterment.
The geometric progress in science and technology makes probable the discovery of an effective active defense against nuclear weapons if the necessary resources and effort are devoted to the task. Impossible? Again, history teaches that conventional wisdom has derided as impossible many great inventions, including the airplane on the eve of its first flight in 1903. For the nation that put a man on the moon, active defense against nuclear weapons is feasible. Meanwhile, civil defense (whose only mention by Draper is: "a vast and wasteful program of 'civil defense'") would save millions of American lives if nuclear war should come in spite of our every effort to prevent it.
After arguing the improbability that any Soviet-American nuclear exchange could be kept limited, a position with which I agree, Draper falls into the common snare of suggesting an increase in NATO's conventional forces as a remedy for our nuclear danger. Aside from the fact that Soviet doctrine asserts the necessity of the combined use of nuclear and conventional weapons from the outset of war, the Soviets would be fatuous to start a conventional war in Europe without a first strike against the United States, because of the great value the first blow confers and because of the fear that if they didn't strike first, we would. Nor would they be so foolish as to rely on a no-first-strike declaration by the US government.
Dwelling at length on the changed circumstances that have made NATO obsolete, and saying that French President Charles de Gaulle "knew what he was talking about," Draper characterizes the North Atlantic alliance as a "misalliance" and stresses the divergent interests pulling the United States and Europe apart. The appropriate resolution of this situation, which Draper never offers, is a phased American withdrawal and cancellation of the treaty pursuant to its terms, after giving Europe the nuclear knowledge to create its own effective nuclear deterrent. A major reason why the nuclear sword of Damocles hangs over the United States is the American military presence in Europe.
Draper devotes some 55,000 words to the Arab-Israeli wars; they are interesting and useful but devoid of a constructive remedy. I wish I knew one. The wry humor of Golda Meir illuminates some of the difficulties. To her dismay, as she said, she found that for her socialist friends in Europe, Arab oil was redder than Israeli blood. And when told the Israeli situation was desperate, she shrugged and smiled: "When hasn't it been?" For the United States, the applicable maxim may be: If you can't fix it, leave it alone.
Under various rubrics and with some detours, the rest of the book discusses the faults of containment. In mercilessly excoriating the mistakes of American statesmen about Vietnam, Draper finds the clue: "A good general assesses his own strengths as objectively as that of the enemy; he does not go into battle without taking into account what his own forces are capable of accomplishing and, in the particular case of the American people, what they are willing to fight for and at what cost."
Instead of pursuing this basic theme, however, Draper gets lost in demonstrating the many mistakes made by American statesmen. His view of the Cold War suffers from his failure to comprehend Lenin's blueprint for the protracted global triumph of communism with subversion as the preferred tool. This is the essential starting point of any plan to defeat Lenin's heirs. Where the cold warriors went wrong was in their assumption that we must contain at every point and ultimately by American military force. Sensibly, the American people do not want to spill American blood all over the globe fighting Soviet surrogates. To ask them to do so is, as Draper says, a grave misreading of the intangibles that are as much a part of the situation as the number of weapons on each side.
"The Soviet bloc and Soviet security system are not immune…from being reversed from within as well as from without." In this passage, Draper grasped the key; but instead of turning it, he let it slip from his hand. The substitute for containment is counterattack by another tool: subversion. The United States can point Lenin's preferred tool of subversion against Communist regimes without sending a single American soldier overseas. The long history of subversion, especially with Lenin's adaptation, weaves a pattern on which an American plan can be based. The United States can and should give the internal subversion of dissidents against Communist governments a helping push by truthful propaganda and openly announced aid in money and, when propitious, arms. Such foreign-aid-for-freedom would keep the Communist masters busy in their own backyards, with a consequent diversion from other pastures. In the end, such a course of action might well bring an overthrow at the center of the trouble, in the Soviet Union: another giant step to prevent nuclear war.
Draper's book lacks fairness. In gibing at "the domino theory," for example, he might have had the good grace to note that some of the dominoes have fallen. In addition, the publisher should have provided an index to a book of some 240,000 words dealing with present history.
Despite my criticisms, Draper's sprightly prose is good reading, and there are nuggets of information scattered throughout the book. While his solutions seem wanting in wisdom, I ruefully acknowledge that my own are not espoused by any political party or by the foreign-policy establishment.
Laurence Beilenson, a lawyer, has for many years devoted himself to research and writing. His books include Survival and Peace in the Nuclear Age and The Treaty Trap.
Eugenia Froedge Toma
The Troubled Crusade, by Diane Ravitch, New York: Basic Books, 1983, 384 pp., $19.95.
I initially assumed from its title that The Troubled Crusade was just another book by an educator lamenting a lack of sufficient funds for public education. However, I was pleasantly surprised.
Diane Ravitch has written a book that examines the history of thought in public education in the United States since World War II. In doing this, she attempts to demonstrate that the public schools have served as the battleground for struggles fought in society as a whole. Whether it be concern about communist infiltrators in the '50s, the racial problems and rebellion against authoritarian figures in the '60s, or the move toward equal opportunity and affirmative action in more recent times, the public schools have served as a locus for resolving the ills of society. In documenting these events and examining their effects on the public schools, The Troubled Crusade is excellent. It will serve as a significant source book for those interested in the post-World War II changes in public education.
The Troubled Crusade is largely factual. The author discusses the various movements in public education with very little of her own judgment coming into play. While this method of presentation has the advantage of lending the book a scientific tone, it also leaves the reader with a slight sense of disappointment, as there is virtually no analysis of the effects of these various movements. Ravitch attempts to remedy this in the final chapter, where she discusses the new politics of education.
This last chapter is perhaps the best and most important of the book. Up to this point, she has examined the growth of the federal government as an educational policymaker and financier but has done so by pointing out specific events and describing the environment that culminated in federal actions. Some of these events included the initiation of federal impact aid to compensate local school districts for providing education to students living on federal lands that are not taxable by the local governments; the funding of scientific programs following the launching of the Russian Sputnik; the passage of the Elementary and Secondary Education Act in 1965 as part of the Johnson administration's efforts to eliminate poverty; and finally, the numerous activities of the years since then, such as bilingual education and programs for the handicapped.
Not until the final chapter does Ravitch point to the implications of these program changes. She argues that increased federal funding and federal regulation have come at the expense of the authority of individual schools, and thus of the parents and children of those schools.
To illustrate this loss of authority at the local level, she points to some revealing figures. The number of pages of federal legislation affecting education increased from 80 in 1964 to 360 in 1976. The actual number of federal regulations increased from 92 in 1965 to nearly 1,000 in 1977. The trend in federal court decisions affecting education was quite similar: 112 decisions between 1946 and 1956, 729 from 1956 to 1966, and then over 1,200 in the next four years. According to Ravitch, a major reason for this increased federal role after the mid-60s was the passage of Title VI of the Civil Rights Act of 1964, which empowered federal officials to withdraw funds from any program violating antidiscrimination laws and regulations. The ambiguity associated with defining violations served as the stepping stone for the entrance of the federal government both legislatively and judicially.
In reading The Troubled Crusade one is struck by the public's dissatisfaction with the public schools over the preceding four decades as well as today. At least since World War II, the cry by the public has been for a return to the "good old days." But on reflecting upon the events described here, one might conclude that those good old days in our system of public education never existed. Ravitch makes a valid and extremely important point that the public's dissatisfaction has increased with the increasing shift of control to the state and federal governments. But even in the '40s, educators in the public school system managed to bring about changes in the curriculum of which parents disapproved.
The major weakness of this book is Ravitch's failure fully to develop this point and its implications. At the end of the book, she still has hope for the future of the public education system. She does not point out (and perhaps does not grasp) that the system's ability to perform is inherently poor. As long as the control of schooling is vested in the political arena, the ability to satisfy the consumers of education remains limited. To remain in office, politicians must make decisions that generate the greatest political support. Although voters as a whole may be interested in and concerned about public education, it is the educators who have the greater incentive to lobby on behalf of their particular interests. Consequently, the politicians find it in their own interest to respond to the educators rather than to the general consumer of education.
This weakness of democratic decision-making exists whether control rests at the local, state, or federal level. As Ravitch correctly points out, however, its pernicious effects are diminished when control is local. The costs to an individual voter, or consumer of education, of expressing his dissatisfaction with decisions concerning public education are less when the size of the political group is smaller. In other words, the actions of a single individual are more likely to affect the outcome in voting on a smaller scale. For this reason, educators will have less power over political decisions made at the local level than over those made at the state and federal levels.
A final and crucial step in analyzing the problems of public schools is skipped completely by Ravitch: she does not question whether education should even be provided publicly. The advantages afforded by the private market in satisfying individual preferences for education are never mentioned. While it may be argued that a discussion of privatizing education lies beyond the scope of this book, it certainly deserves mention in any attempt to examine the troubles confronting the institution of education.
I closed this book with the feeling that Ravitch had almost gotten to the source of the problem. A final chapter on private alternatives would have made it complete.
Eugenia Froedge Toma is a professor in the Department of Economics at California State University, Northridge.
Jack Shafer
The Heroin Solution, by Arnold S. Trebach, New Haven, Conn.: Yale University Press, 1982, 331 pp., $9.95.
The Hardest Drug: Heroin and Public Policy, by John Kaplan, Chicago: University of Chicago Press, 1983, 247 pp., $20.00.
In 1972, in his influential book The Natural Mind, physician Andrew Weil offered a new way of looking at drugs and drug use. "The use of drugs to alter consciousness is nothing new," he wrote. "It has been a feature of human life in all places on the earth and in all ages of history." The ubiquity of drug use suggested to Weil that there is nothing particularly aberrant about it and no reason automatically to castigate drug users as sinners, criminals, deviants, or mentally and physically ill. For Weil, it appeared that the pursuit of an altered state of consciousness is a natural development. Framed in this context, the bliss of the meditator and the pothead were almost indistinguishable.
In the decade since The Natural Mind was published, the "new way" of looking at drugs does not seem to have taken hold. Misinformation and hysteria surround what passes for a drug debate in this country. That is why Arnold S. Trebach's The Heroin Solution (recently published in paperback) and John Kaplan's The Hardest Drug can only fill the heart of the serious student of drug policy with encouragement. Together these two books advance the scholarship of what has become the hardest drug for American society to come to grips with.
Trebach's tack is to rehabilitate the much-tarnished reputation of heroin. When first marketed in the 1890s, it was proclaimed a wonder drug. Later, with the passage of the Harrison Act of 1914, the "demonization" of heroin began, and eventually the drug was made medically and legally verboten. Doctors dispensing it and heroin users were carted off to jail, and the drug was driven into the underground, where it has remained to this day—feared and misunderstood.
Trebach's attempt to "exorcise the devil in heroin" is a rousing success. He documents how heroin has remained in the legal pharmacopoeia of doctors in foreign countries, most notably the United Kingdom. In that country doctors can prescribe heroin to habitual users (addicts) or to those suffering intractable pain. Trebach advocates the disengagement of lawmakers from what he argues is medical turf, although he does not advocate a complete repeal of drug laws.
Like Trebach, Kaplan is out to deflate a few heroin myths. In his exhaustive study we find that heroin withdrawal is rarely more painful than a case of the flu, that heroin use alone causes no organ damage, that it takes more than one taste of heroin to be "hooked" (often, it requires two weeks of thrice-daily doses to reach the plateau at which withdrawal symptoms occur if heroin use is ended). We also learn that most "addicts" mature out of heroin in their middle to late 30s. Kaplan agrees with Trebach that "our heroin problems would probably be fewer today had we begun some kind of opiate maintenance [like the British] at the time we passed the Harrison Act." But Kaplan fears to tread where even the cautious Trebach boldly marches. "Turning back the clock and starting from that point a second time is obviously impossible, so we must consider the costs and benefits of heroin maintenance in today's world."
This retreat to cost-benefit analysis negates the power of the opening chapters of The Hardest Drug. Kaplan points to a number of reasons, such as crime, why heroin use should be curbed—but the link between crime and heroin is stronger in the minds of social scientists than it is on the street. He warns that one of the "costs" of free availability of heroin is a possible increase in its use—but since he has discounted the physical harm of heroin use, what "cost" can he be talking about? He fears a drop in social productivity—but if an individual chooses to experience heroin rather than a 40-hour workweek, whose business is that but his own?
Kaplan's survey of the social "costs" of heroin conveniently ignores the costs of heroin prohibition. As he acknowledges, enforcement of drug laws cannot eliminate drug use; it can only drive up the price of drugs. As those prices rise, users don't always lay off the stuff. In the case of heroin and, to a lesser degree, cocaine, higher drug prices encourage users to find ways to get more "bang per buck" out of their drug. This has resulted in the use of the needle and syringe for intravenous injection, which is physically dangerous. In markets in which heroin is not so dear—Vietnam in the late '60s and early '70s, Iran before the shah "banned" opium, or Hong Kong before a similar drug crackdown there—most users safely sniffed or smoked their opiates without doing any physical harm to themselves. Similarly, the high price of cocaine has led many users away from the relatively safe administration of cocaine through the nose, to the physically debilitating use of needles and free-base smoking. One could predict that if the authorities were as successful at banning alcohol as they are at banning heroin, most alcoholics would take their precious drug via a needle, too.
Another "cost" of heroin prohibition is the dangerous drug experimentation it precipitates. Like any other consumers, heroin users are willing to explore the use of substitutes when their product of choice vanishes. Heroin users have plundered the pharmacopoeia and performed wild experiments on their own bodies—sometimes with deadly results. In recent years, methadone diverted from methadone maintenance clinics has been at the root of as many drug deaths as heroin use. Heroin users have discovered that the prescription drug Talwin can be combined with the antihistamine Pyribenzamine to satisfy their habit. The trouble is, scores of people have dropped dead on it. Another mixture, called "loads," is made by combining the sleeping pill Doriden and codeine. Making the rounds as budget heroin, it too kills people. "Bathtub chemists" have gone to the literature and found the formulas for other heroin-like drugs. A group of drugs based on the well-known pain-killer fentanyl has been unleashed by clandestine chemists on the West Coast. A hundred times more potent than heroin, fentanyl has killed over 20 people by pure pharmacological overdose—something that rarely happens with heroin.
It's unfortunate that Kaplan is not ready to concede that heroin prohibition produces monumental costs such as these for individuals and society. Rather than cope with a drug that we are just beginning to understand, Kaplan would unwittingly have us battling ever more unpredictable "brews."
The publication of these two books—from academic presses, no less—marks a giant leap in heroin scholarship. Whatever relevant information on heroin is available in the hostile contemporary environment, it is collected here. These works by Kaplan and Trebach signify that the major points raised by Andrew Weil in 1972 are finally making it into the policy mainstream. Neither book delivers the alarmist and anecdotal testimony we have grown accustomed to hearing since the declaration of "war on drugs." Neither book damns drug users as "sick" or "criminal" or "sinful." That in itself is a great accomplishment. Weil's ideas regarding how people naturally experiment with consciousness have been adopted by both Kaplan and Trebach. Furthermore, both authors have integrated into their books Weil's tenet that the expectation, environment, and previous experience of users are as important as the substance being consumed.
When limited to the drug itself, both books shine. Trebach argues successfully for rehabilitating heroin as a potentially good medicine. While he avoids the subject of total drug-law repeal, he has favorably approached that position in interviews. Kaplan, alas, is too busy madly performing "cost-benefit" arithmetic to consider his own evidence. He has penned the most comprehensive interdisciplinary study of heroin yet written—the bibliography alone is worth the price. It's a shame that he does not recognize that living in a free society cannot be without costs and that the only valid balancing of costs and benefits is that done by individuals, not aspiring social engineers like himself.
Jack Shafer is managing editor of Inquiry magazine.
Walter E. Williams
Economics of Income Redistribution, by Gordon Tullock, Hingham, Mass.: Kluwer-Nijhoff, 1983, 187 pp., $28.00.
The difficult domestic problem for the United States is to somehow reduce the growth of government spending. The major growth sector of government spending at every level is the class of expenditures officially classified as transfer payments, or what has become known as income redistribution. Coping with some aspect of poverty—inadequate health care, housing, nutrition, etc.—is the usual justification for these payments.
Gordon Tullock's new book, Economics of Income Redistribution, sheds some new light on transfer payments. He concludes that, contrary to conventional wisdom, altruism may not fully explain our current level of transfers. His primary reason for suspicion is the fact that in the budget year of 1981, enough transfer money was spent so that a family of four in the bottom 10 percent of the income distribution could have received $48,000 that year. Or, if we wanted to make the "safety net" larger, the bottom 20 percent could have received $24,000 for a family of four. Now we all know that poor people are not getting all that transfer money. So naturally we ask, who is getting all that money?
According to Tullock, "The bulk of the transfer goes to the politically influential and well-organized." The primary motivation behind income redistribution is simply that people want other people's money, and government provides the mechanism. Economics of Income Redistribution gives the reader a good rule of thumb for telling whether a program is designed to help poor people or somebody else. The rule, according to Tullock, "is that if there is a means test" (that is, if people with incomes higher than the official poverty line are ineligible for the program), then the program is designed to help the poor. If the program has no means test, then it is not designed to help the poor, even though it may do so to some extent.
It is significant, therefore, that such large income-transfer programs as medicare, social security, and education do not have a means test. And this is evidence, argues Tullock, of the desire of some people to get other people's money. According to Tullock, some people oppose the means test because the poor may be insulted. A more important reason is that the middle class will not vote for taxpayer-supported medical care, old age pensions, etc., if the benefits go only to the poor. But if the middle class view themselves as beneficiaries, they will vote larger amounts of money. Thus the push for expansion and universalization of transfer programs.
Then there is perverse income redistribution—the kind where lower-income people make transfers to higher-income people. Tax support of higher education falls into this category, which Tullock labels "regressive transfers." In higher education, for example, the average taxpayer subsidizes the education of people whose lifetime income is likely to be well above the average. As a condition of admission, colleges often require combined SAT scores of 900 plus (the national average is 893), and the student must be in the top 25 percent of his high school class. These requirements effectively weed out most of the poor. But, as Tullock points out, "The taxes that support this apparatus are general taxes which fall across the income structure." Of course, higher-income people pay more taxes, but the distribution of students in publicly supported higher education is skewed toward higher-income families even more sharply than the distribution of taxes is.
There is another often-ignored class of transfers that Tullock discusses. These are transfers that appear in the form of economic regulatory activities such as zoning changes, tariffs and quotas, transportation regulations, and farm subsidies. In these transfer schemes, both the motivation and pattern conform to that found in income-transfer programs. That is, the beneficial effects are biased toward the more well-to-do. And the more well-to-do are motivated by the desire to have more, at others' expense.
Tullock has made a fine contribution to our understanding of income redistribution. However, in his admirable frankness about the motivation for income redistribution, he forgot a minor thought or two that I think could have fit nicely into the discussion.
The first is a debunking of the term income redistribution itself, which in my opinion contributes immensely to the demagoguery and confusion over the sources of income. "Income redistribution" suggests that out there is a distributor of dollars—a dollar dealer. The reason that some people have fewer dollars than other people do is that the "dealer" is unfair—a sexist, a racist, or a multinationalist, whatever. Thus, to promote justice and equity, according to this view, there must be a redealing of the dollars, an income redistribution. Those with more money must relinquish their ill-gotten gains for the sake of equity.
In fact, however, most income results from productive efforts by individuals singly or in combination with others as partners or corporations. Different individuals wind up with different incomes in part reflecting differences in abilities and differences in consumer tastes. I earn less than Moses Malone because of private, independent decisions made by millions of people about whom they wish to watch play basketball. People who argue for income redistribution wish forcibly to cancel all those decisions.
Also, Tullock overlooked one motivation for transfers—the important make-me-whole motivation. For example, a man who pays $40,000 a year in taxes may ask, "What am I getting out of government that somebody who pays $1,000 doesn't get?" He may conclude that nothing adds up to the $39,000 difference. He may then call for special transfers and government services to justify his taxes, to make him whole. The classical economist Frederic Bastiat characterized this activity as the contagion of legalized plunder.
Neither one of these minor suggestions or criticisms detracts from the strengths and insights of Tullock's book. I strongly recommend it as an excellent addition to one's domestic-policy bookshelf.
Walter Williams is a professor of economics at George Mason University and the author of The State Against Blacks.
David R. Henderson
Risk by Choice: Regulating Health and Safety in the Workplace, by W. Kip Viscusi, Cambridge, Mass.: Harvard University Press, 1983, 200 pp., $18.50.
True or false? Companies make workplaces safe because if they don't, they could be fined by the Occupational Safety and Health Administration (OSHA).
True or false? American workers are generally ignorant of hazardous conditions at their work sites.
True or false? Congress passed the Occupational Safety and Health Act in response to an increase in hazards in America's workplaces in the 1960s.
If you answered "true" to any of these, then read this review. Better yet, read W. Kip Viscusi's excellent book, Risk by Choice.
An economist at Duke University, Viscusi was a regulatory reformer with President Carter's Council on Wage and Price Stability. He is a leading expert on regulation of workplace health and safety. His book, based on his academic work, is written for a wide audience.
Viscusi argues that there is a market for safety. Workers are paid to take risks; and the greater the risk, the higher the pay. An employer who makes his workplace safer can reduce the wages he pays. This wage reduction, which measures how much his employees value the reduction in risk, gives the employer an incentive to increase safety. He will increase safety if this incentive exceeds the cost of doing so.
But if the risk reduction costs the employer more in labor and capital than it saves him in reduced wages, he will not do so. Furthermore, points out Viscusi, if the government forces the employer to increase safety, then workers won't value the additional safety by as much as it cost the employer to produce it.
Is Viscusi's concept of a market for safety realistic? Do riskier jobs pay more? Are workers informed enough about the riskiness of their jobs relative to other jobs to make the market work? Yes, argues Viscusi.
By correlating wages with injury rates, Viscusi shows that the riskier the job, the higher the pay, other things being equal. He estimates the total "risk premium" paid to all US private-sector workers in 1979 at $69 billion, an average of $925 per worker. This average, of course, understates the amount paid to workers in the riskiest jobs. A worker who risks a 1-in-1000 chance of being killed receives annually thousands of dollars in risk premiums alone.
Viscusi notes that the $69 billion paid in risk premiums to US workers in 1979 provides a strong incentive for employers to make workplaces safe. Of course, because employers choose to pay this amount rather than totally eliminate risk, the cost of greater safety must be even more than $69 billion. But, says Viscusi, the striking fact is that this $69 billion is 3,000 times the total penalties levied by OSHA in 1979. It is the extra wages that workers demand to endure unsafe conditions, not OSHA, that make workplaces safe.
Furthermore, that workers receive risk premiums means at least some of them are informed. Viscusi also presents evidence that the greater the risk of injury in an industry, the higher the proportion of workers in that industry who see their job as dangerous.
Viscusi reports other evidence that the market for safety works. In a table on fatal accident trends since 1930, he shows that as US per capita income has risen, accidental death rates have fallen. Between 1940 and 1980, for instance, the accidental death rate per 100,000 workers fell by 66 percent. One would predict that as workers became wealthier, they would demand more safety. The figures show that added safety has indeed been supplied.
Viscusi also points out that a reported increase in workplace hazards in the 1960s, used to justify the Occupational Safety and Health Act, was nothing more than a statistical artifact. The injury frequency rate for manufacturing industries compiled by the Bureau of Labor Statistics increased 2.4 percent annually between 1958 and 1970, but the BLS statistics do not distinguish between minor and major accidents. Viscusi presents more meaningful risk measures that all show a decline throughout the pre-OSHA period.
While skeptical of government regulation of workplace health and safety, Viscusi doesn't want to abolish OSHA. Instead, he proposes that the agency shift its focus from regulating workplaces to informing workers. At first blush, this seems reasonable, because Viscusi argues persuasively that information about workplace safety is a public good that is underprovided in a free market. But the public good problem doesn't vanish when the government gets into the information business. Viscusi still needs to show what incentives would get a government bureaucracy to provide the appropriate quantity of valuable safety information to workers. And this he fails to do.
On a less substantive matter, Viscusi at times asserts conclusions from his academic articles without explaining them. For instance, he argues that if workers are not perfectly informed about job risks, employers tend to use new technologies that are not well understood. I don't see why, and Viscusi doesn't show why. Instead, he refers readers to a highly mathematical academic article. But such are minor imperfections in a book full of valuable information and insights.
David Henderson is a senior staff economist with the Council of Economic Advisers. The views expressed here do not necessarily represent those of his employer.
The post Arts & Letters appeared first on Reason.com.
The claim that "Big Oil" has a monopoly on the oil market is nothing new. For at least a decade now, popular writers have been feeding the public with this sensational news. Fred Cook, who writes regularly on the energy industry for the left-leaning Nation, offers up the standard fare for a wider audience in The Great Energy Scam. Too bad Macmillan bit.
Cook does at times make a convincing case for Big Oil's alleged monopoly—but only by using facts selectively and interpreting them with an utter disregard for economics. So his feast is fatally poisoned.
Cook's main evidence for Big Oil monopoly is "its" behavior during the 1979 gasoline shortage. He argues that the Iranian revolution had nothing to do with the US shortage, because increased oil production by other countries more than offset reduced Iranian production, and US imports in 1979 were greater than in 1978. Moreover, he points out, US reserves of crude oil in 1979 were at their highest level ever. Therefore, he concludes, the major oil companies must have purposely engineered the gasoline shortage to drive up prices.
Let's look at some facts that Cook leaves out. US imports in 1979 were greater than in 1978 because 1978 imports were unusually small. More important, what matters for production is not imports but the total amount of crude oil available. In calculating crude reserves, Cook fails to subtract the oil in the federal government's Strategic Petroleum Reserve. Why should he? Because the oil companies couldn't count on the government to release this oil, both because it was for use only in emergencies and because the government had failed to install equipment for pumping out the oil. Without these irrelevant reserves, US reserves in March 1979 were in fact at their lowest level since April 1977.
Prudent refiners wanted to have on hand enough gasoline for the peak driving season and adequate home heating oil supplies. Because they feared a reduction in world oil output, they rationally chose to reduce gasoline production rather than risk the much more costly alternative of shutting down refineries later. Although world oil output didn't fall, their fear was quite sensible.
But reduced supplies alone don't cause shortages; artificially low prices do. Without price controls, higher prices would have matched demand with supply, but the federal government prevented gasoline prices from rising enough. It also forced refiners to allocate gas to farming areas to meet all the demands there, worsening the shortages in urban areas. Cook doesn't mention this.
Nor does he point out that because prices were set by a predetermined formula, they increased no more because of the gasoline shortage than they would have otherwise. So much for Big Oil's monopoly power to drive up prices. (Stephen Chapman presented a more complete analysis of the shortage in "The Gas Lines of '79," in the Public Interest, Summer 1980.)
Cook's two other main pieces of evidence for monopoly are the large profits per gallon earned by some refiners and the large profits earned by all oil companies in 1979. It is true that some refiners' prices increased by a larger percentage than did crude costs. But the reason their margins were high is that they could buy limited oil supplies from the Saudis at a below-market price, while their competitors were paying the market price. Thus they earned high profit margins even while pricing no higher than their competitors. And while US oil companies earned billions of dollars in "windfall profits," this is not evidence of monopoly. When the price of a resource rises, people who own some of that resource do earn windfalls, be they monopolists or the Beverly Hillbillies.
Moreover, how could refiners suddenly choose to become monopolistic? If Cook cites the increased profit margins of the 1970s as evidence for monopoly, then he can't claim that Big Oil was monopolistic in the 1960s when "it" had lower profit margins. So something must have changed to facilitate monopoly. But what? Cook doesn't say.
Although Cook says he believes in competition, he doesn't show the least understanding of competition. For instance, he argues that decontrolling the price of natural gas would cause its price to rise to the equivalent price of oil, marking "the end of any meaningful competition" between oil and gas.
In fact, knowing how producers and potential producers respond to the possibility of profits, we know that higher prices would increase natural-gas production, which necessarily creates more competition between oil and natural gas.
At another point, Cook states that Big Oil damaged the industrial base in the 1970s by redirecting capital from other industries into oil and gas. On the contrary, if US oil companies were monopolistic, they would be able to and would keep capital out of their industry.
His demonstrated ignorance of economics makes Cook politically naive. He recounts how the Carter administration suppressed a 1977 government study that concluded that an increase in natural-gas prices would bring forth large reserves. Carter also fired the head of the Geological Survey, who had been saying that there were huge US reserves of natural gas. Cook strongly criticizes Carter's actions. He doesn't seem to understand that the suppressed evidence strengthened the case for natural-gas decontrol, which Cook vehemently opposed. I don't mean to imply that Cook should favor the suppression of evidence that weakens his case, but rather that he isn't even aware that it does so.
Cook is most naive when he buys the Saudis' explanation for their advocacy of a per-barrel tax (euphemistically labeled a "windfall profits" tax) on US oil. Although the Saudis threatened dire consequences for Americans if the government failed to impose a tax, it was really the Saudis who faced dire consequences. They feared that the high price of oil would encourage US production, ultimately undercutting their cartel. The Saudis wanted to nip this competition in the bud. Hardly in America's interest.
Space constraints prevent me from discussing some of Cook's other errors. He makes reasonable arguments against the Alaska natural-gas pipeline, the Washington state government's nuclear power plants, and synthetic fuels. But overall, one gets the feeling that Cook's visceral hatred of Big Oil has blinded him to economic reality. In a revealing passage, Cook tells us that his angry reaction to the decontrol of oil prices was "almost automatic." Leave out "almost," and I believe him.
David Henderson is a senior staff economist with the President's Council of Economic Advisers.
The post Pumping Away at "Big Oil" appeared first on Reason.com.
Revolt Against Regulation: The Rise and Pause of the Consumer Movement, by Michael Pertschuk, Berkeley and Los Angeles: University of California Press, 1982, 192 pp., $12.95.
Social Regulation is a collection of essays by specialists in regulation presenting strategies for reforming food and drug, occupational safety, and pollution regulation. The quality and interest of the essays vary greatly, but those by William Havender, Michael Levin, Thomas Grumbly, and Michael O'Hare best succeed in making their case.
Havender, a research biochemist, discusses the problems involved in regulating foods and drugs. After giving a lucid introduction to research on health hazards, he explains how perverse incentives guide the Food and Drug Administration (FDA). The result is that the agency holds drugs off the market even when their dangerous side effects are minimal or nonexistent and even when they could substitute for riskier drugs currently on the market. For instance, he tells of the FDA keeping low-dose contraceptives off the market until 1974, although it was known in the 1960s that high-dose pills were causing harmful side effects and women in Communist China had been taking the low-dose versions without adverse consequences since the late '60s.
Havender's essay is full of insights and interesting but little-known facts. He points out that the Environmental Protection Agency (EPA) released its commissioned study of chromosome defects among residents of houses near the Love Canal in Niagara Falls, New York, without having compared their frequency of chromosome defects with that of a matched control group. Only after President Carter decided to evacuate 700 families to temporary quarters did an EPA panel review the study. The panel concluded that there was no indication of excessive chromosome abnormalities.
But Social Regulation contains more than essays about the harm done by government regulation. There are two essays about successful reductions in regulation. The first is "Getting There: Implementing the 'Bubble' Policy," by Michael Levin. It is a fascinating account of how the EPA allowed firms to meet pollution standards by "trading" pollution reductions from one source for equivalent pollution increases from other sources.
Under the bubble policy, each plant is treated as if under a bubble: all that matters is the total amount of pollution, not its source within a plant. This approach allows firms to reduce pollution where it is most feasible to do so and to forgo reductions where they would be very costly. Since plant managers can judge their own capacities to reduce pollution better than centralized regulators can, the same amount of pollution reduction occurs as with command-and-control regulation, but at a lower cost.
Although the principle seems obvious, implementing it was not easy. Levin, who is chief of the Regulatory Reform Staff in the EPA, draws some conclusions from the episode. Since some of his conclusions are relevant for those who seek political change in general, they are worth repeating. First, he says, reform should start as a supplement, not a replacement. This will avoid threatening vested interests. Although the reform should start small, it should be structured to create an internal dynamic that will broaden once it starts being used. Second, it must be "gotten out on the street" to be used and must be structured to produce quick real-world success stories. Third, it must build a constituency within and outside an agency.
The other success story about regulatory reduction is Thomas Grumbly's "Self-Regulation: Private Vice and Public Virtue Revisited." It tells how the Department of Agriculture, under President Carter, shifted federal meat and poultry inspection away from government and toward self-inspection by firms. (One can just imagine the cries of foul play were the Reagan administration to try such a reform!) The lessons Grumbly draws from this successfully executed reduction in government regulation are similar to Levin's.
Michael O'Hare's essay, "Information Strategies as Regulatory Surrogates," points out the many possibilities for replacing heavy-handed regulation with simple requirements that producers provide information for consumers. Unlike traditional regulation, which forecloses consumer options, information requirements allow consumers to make their own choices.
O'Hare cautions, however, that even information requirements can be burdensome to consumers as well as to producers. He tells of a bank that inserted in its electronic funds transfer disclosure statement a sentence offering $10 to anyone who wrote the words "Regulation E" on a postcard and sent it in. Not one of 115,000 recipients of this offer responded! In this case, an information requirement had the perverse effect of discouraging consumers from reading the bank statement, possibly leaving them less informed than if there had been no requirement at all.
While the other essays were somewhat informative, I would not recommend the book as a whole. Those interested in the nitty-gritty of how regulations can be altered could put it to good use; otherwise, I recommend borrowing it from a library and reading the four essays discussed above.
Revolt Against Regulation, by a member of the Federal Trade Commission (FTC) who was its chairman under President Carter, begins the way one would expect. In the first chapter, "On the Side of the Angels," Michael Pertschuk recalls the glory days of the 1960s "consumer" movement. In his mind, "consumer legislation" is good, and a politician's effectiveness is measured by the amount of such lawmaking he sponsors. He doesn't question whether consumer legislation helps or hurts consumers. Fairly soon, though, some cracks appear in the liberal foundation. For instance, Pertschuk admits that the cost of much consumer regulation falls on consumers in the form of higher prices.
Pertschuk goes on in the next few chapters to tell of the business revolt against regulation in the mid- to late '70s and of his efforts as FTC chairman to save as much power as possible from the axe of congressional oversight. All pretty much par for the course. But then he turns on a dime. In the final chapter he discusses some of the lessons the regulators have learned from economists. "We boasted then, 'There is a law that makes cars safer!' We know now…there is no such thing as a free lunch. And laws don't make anything."
Even then, he says, "The economists earned my grudging respect…for their dogged insistence that we think through (emphasis his) the reality of what we believed we were achieving with our intervention in the marketplace." Pertschuk confesses that economists helped teach him "respect, if not reverence, for the marketplace—or, more precisely, for the power of market incentives, of self-interest. They have taught us how much more likely we are to gain our objectives by channeling the flow of such incentives than by vain efforts to block their passage."
In case you think this is just window-dressing, and that his views on particular regulations haven't changed, listen to his comments on a proposal for a federal automobile regulatory agency that would set minimum quality and performance standards for automobiles. When Caspar Weinberger, FTC chairman under Richard Nixon, first proposed such an agency, Pertschuk, then a congressional staffer, loved it. But now, in commenting on Weinberger's original report, Pertschuk writes: "A fundamental problem with that report was that nowhere in its pages does one find so much as a discussion of the costs of providing such protection, or of the tradeoffs between such costs and benefits to consumers."
The quote speaks for itself. I recommend Pertschuk's last chapter to those who doubt that reasonable people will change their minds in response to evidence.
David Henderson is a staff economist for the President's Council of Economic Advisers.
The post Can Regulators Reform? appeared first on Reason.com.
The Politics and Philosophy of Economics shocked me into thinking about my discipline. For that reason I highly recommend it to economists and to anyone else interested in economics. In a series of nine essays, British economist T. W. Hutchison covers subject matter ranging from Marxist economist Friedrich Engels, to John Maynard Keynes versus the Keynesians, to the postwar German economic miracle, to the Austrian school of economic thought, to the limitations of general theories in macroeconomics. If there is a unifying theme, it is that hewing to a "party line" can cause intelligent people to ignore reality.
Hutchison makes his point very effectively by drawing on issues involving many of the great economists of the last 100 years. In an essay on Engels, he notes that Engels and Marx foresaw 40 revolutions in 30 years, none of which took place, but that they were undaunted in their faith that there would be a revolution of the proletariat in an advanced industrial country. He also quotes at length from a criticism by Engels of a fellow socialist for not understanding that a price mechanism is necessary for allocating resources efficiently. Engels's criticism reads very much like F.A. Hayek's statement of the same argument 50 years later, yet Engels apparently ignored his own criticism completely.
In his essay on the German postwar economic miracle, Hutchison quotes from British and American economists' assertions in 1948 that free-market policies would not lead to German recovery. In fact, it was adoption of such policies that brought Germany to prosperity within a few years. In an essay on the Cambridge school economists, Hutchison quotes Maurice Dobb's assertion in 1941 that the forced collectivization of Soviet agriculture that starved literally millions "resulted in something approaching famine conditions in certain areas."
Two of the most interesting essays deal with the thought of John Maynard Keynes. I had never known how to reconcile the Keynes who called for massive government spending during the Depression with the Keynes who enthusiastically reviewed The Road to Serfdom, Friedrich Hayek's critique of central economic planning. Hutchison's essay, "Keynes versus the Keynesians," resolves the paradox. Keynes advocated massive public spending as a temporary solution to an unemployment rate in Britain of 22 percent, but not as a general cure for unemployment. In 1937, Keynes opposed increases in general government spending as a solution to Britain's 11–12 percent unemployment rate.
Moreover, Keynes was an ardent opponent of rationing. In 1940, with Hitler threatening Britain's survival, Keynes wrote:
If the community's aggregate rate of spending can be regulated, the way in which personal incomes are spent and the means by which demand is satisfied can be safely left free and individual.…This is the only way to avoid the destruction of choice, and initiative, whether by consumers or producers, through the complex tyranny of all-round rationing.
Why then do we get such a distorted view of his ideas today? Hutchison says that the "Keynesians" did not really agree with Keynes on a wide range of issues but used his name to push their own agenda. Many of the Keynesians discounted his opposition to government control. Keynesian economist Joan Robinson claimed that some of Keynes's clear statements against centralized economic planning were "ill-considered" and "quite contrary to his main argument." In describing Keynes's last writings advocating only a limited role for government in international balance-of-payments policy, Lord Kahn, another famous British Keynesian, called Keynes "a sick man" who had made his recommendations "against his better judgment." Robinson even went so far as to say that some Keynesians "sometimes had some trouble getting Maynard to see what the point of his revolution really was." As Hutchison points out, it is no wonder that Keynes is reported to have said, "I am not a Keynesian."
The moral of the Keynes story is the danger of following a party line and of confusing a system of conclusions with a system of thought. Many of the Keynesians embraced Keynes's advocacy of government spending as a solution for unemployment. But they were not really believers in a system of thought. Instead, they became slaves to certain policy conclusions and never really did much thinking after that. Britain's sad economic state, attributable to its economic policymakers' following Keynesian advice, illustrates the harm that can come from being attached to conclusions rather than thought.
Having seen the consequences of being tied to a set of conclusions, I could hardly resist looking more objectively at my own views. Hutchison's essay on the Austrian school forced me to do so. I had not realized that there are two Friedrich Hayeks, one before 1937 and one after. Hutchison points out that before 1937, Hayek believed that an economy reaches equilibrium. But that is true only if people have full knowledge, or what Hutchison calls "Allwissenheit." After 1937, Hayek argued that it is impossible to have full knowledge.
The implications of imperfect knowledge are enormous. We can't be sure that the economy will reach equilibrium. It is possible for a free-market economy to sink into a depression. It is also not certain that government intervention in people's economic decisions will make things worse, since people without full knowledge might not have made the right decisions anyway.
To say all this is not to deny that government intervention is usually harmful. But after reading Hutchison's book, I realize that I hold that view less as a theoretical necessity than as a conviction based on experiences and morality.
David Henderson is an economist in Washington, D.C. He received his Ph.D. in economics from the University of California, Los Angeles.
The post Economic Conclusions vs. Economic Thinking appeared first on Reason.com.