When Barack Obama was just a baby, nuclear energy was touted as the technology that would finally provide pollution-free, limitless electricity for all. In its famous 1962 Port Huron Statement, the left-wing Students for a Democratic Society gushed about how “our monster cities…might now be humanized” thanks to nuclear power. Like so many predictions about the future, that one rather dramatically missed the mark.
Surprising as it may seem, the United States still generates around 20 percent of its electricity from nuclear power plants. This despite the fact that no new facilities have been built since the notorious Three Mile Island accident of 1979, which released small amounts of radioactive gases and iodine into the environment after a partial meltdown at a nuclear power plant in Dauphin County, Pennsylvania. Public opinion has remained steadfast against the technology ever since. In February The Economist reported that 64 percent of Americans opposed building new reactors. Disputes over waste disposal have never been resolved, and the Fukushima reactor meltdown in March 2011 cast further doubt on the idea that nuclear power will ever be a long-term clean-energy solution in the United States.
All of this has not stopped the Obama administration from betting on nukes. Even though the president prefers talking up more fashionable (and less economically viable) technologies such as wind and solar, in February his Nuclear Regulatory Commission quietly approved construction of what would be the first two new nuclear reactors in two generations. In 2010 Secretary of Energy Steven Chu touted the White House’s commitment to “restarting the American nuclear industry and creating thousands of new jobs and export opportunities in the process.”
But jump-starting nuclear power is not just bad politics. It’s awful economics.
The nuclear energy industry in the United States is powered by corporate welfare on plutonium. What is in theory a wonderful technology is in practice an economic white elephant. The data accumulated during the last 30 years suggest strongly that nuclear plants will never be able to cover their operating costs, let alone recoup the billions it costs to build them.
A 2009 Massachusetts Institute of Technology study led by physicist Ernest J. Moniz and engineer Mujid S. Kazimi showed that nuclear energy costs 14 percent more than gas and 30 percent more than coal. And that’s after taking into account a baked-in taxpayer subsidy that artificially lowers nuclear plants’ operating costs.
A 2010 study by the U.S. Energy Information Administration projected that nuclear power will remain more expensive to produce than other conventional sources of electricity in 2016 (see chart). Based on this analysis, nuclear power is also more expensive than wind power, although cheaper than solar and clean coal.
While the nuclear industry in the United States has seen continued improvement in operating performance over time, it remains uncompetitive with coal and natural gas on price. This cost differential is primarily driven by high capital costs and long construction times, often more than 10 years.
According to the Congressional Budget Office, nuclear power plants, on average, wind up costing three times more to build than original estimates suggest. Inflation, especially during the high-inflation 1970s, played some role in the problem of ballooning costs. But when a project takes more than a decade to complete, labor and capital costs can grow in unexpected ways as well.
Historically, nuclear energy has flourished only in countries, such as France and Japan, where governments have stepped in with heavy subsidies. Yet dating back to at least the Reagan years, many conservatives have argued that if it weren’t for the regulatory costs and other barriers imposed by the federal government, nuclear energy would be competitive in the United States as well. While these conservatives rarely have a kind word for a nation of cheese-eating surrender monkeys, they don’t hesitate to point to France—which gets about 75 percent of its electricity from nuclear power and has never suffered a large-scale disaster—as demonstrable proof that nuclear power can be affordable and safe if companies are given the opportunity to build plants.
But producing nuclear energy in France is not magically cheaper than elsewhere. French citizens are forced to pay inflated costs to support grand government schemes, such as the decision, made after the first oil shock in 1974, to go nuclear at any cost.
In a 2010 paper published by the Institute for Energy and the Environment, Vermont Law School economist Mark Cooper found that the overall cost of generating nuclear power in France is similar to that in the United States. The price ranges for the two countries (after adjusting for purchasing power) overlap, despite the fact that the U.S. has a relatively strict regulatory regime and France has a relatively loose one: France's estimated cost is between $4,500 and $5,000 per kilowatt; the U.S. estimate is between $4,000 and $6,000. Those figures remain stubbornly high, despite decades of efforts to get them down to manageable levels.
The main reason no new nuclear power plants have been built in the United States in 30 years is that they have proven to be poor investments, producing far more expensive electricity than originally promised. Giving the thumbs-up to start work now on two new nuclear plants is an act of desperation by a president who has realized he is running out of other options.
Contributing Editor Veronique de Rugy is a senior research fellow at the Mercatus Center at George Mason University.