What's So Smart About Investing in the Smart Grid?

It makes no sense to throw $4.5 billion at electric power infrastructure

Smart grid technology is all the rage. General Electric just paid $2.4 million for a Super Bowl ad featuring an animated scarecrow singing "If I Only Had a Brain" to promote its smart grid initiatives. IBM, meanwhile, is running full-page "Smarter power for a smarter planet" ads in major newspapers like the New York Times. These corporations are in perfect sync with the new administration in Washington.

Earlier this month, President Barack Obama promised to retrofit America by "updating the way we get our electricity, by starting to build a new smart grid that will save us money, protect our power sources from blackout or attack, and deliver clean, alternative forms of energy to every corner of our nation." To that end, the House version of the American Recovery and Reinvestment Act authorizes the Department of Energy to spend $4.5 billion to stimulate the deployment of smart grid technologies.

The Energy Information Administration (EIA) describes the current national power grid as the "largest interconnected machine on earth." The U.S. electric power infrastructure is worth over $1 trillion. It consists of more than 9,200 electric generating units with more than 1 million megawatts of generating capacity connected to more than 300,000 miles of high voltage transmission lines and 5.5 million miles of distribution lines. Utility companies have only the most rudimentary ability to monitor this vast grid. More often than not, utility companies only learn about a local power outage when customers call to complain.

Creating a smart grid means computerizing the current electric grid using advanced wireless two-way information and communications equipment, deploying an array of sensors to monitor activity, and developing the software to control and track in real time all aspects of electricity generation, transmission, and consumption. The smart grid would also more easily integrate supplies from decentralized renewable energy suppliers and enable consumers to better manage their energy consumption.

Modernization would certainly help the current transmission network, which is so overburdened that blackouts are now bigger, lengthier, and more common. The EIA estimates that outages currently cost the economy as much as $150 billion per year. Even as demand for electricity has grown, transmission capacity has been lagging. In addition, Americans are projected to use about 30 percent more electricity by 2030. The Electric Power Research Institute (EPRI), the think tank of the utility industry, estimates that smart grid technologies could potentially lower projected annual energy consumption in 2030 by 1.2 to 4.3 percent. This would mean that fewer power plants and transmission lines would need to be built in the future.

For many proponents, however, the chief reason to create a smart grid is that it promotes energy conservation. For example, the smart grid concept envisions smart meters in homes or businesses allowing consumers to fine-tune their energy consumption, such as setting dishwashers or washing machines to turn on at night when electric power is relatively cheap and more plentiful. And consumers could also grant utility companies permission to send signals that lower the temperatures on residential water heaters or reduce air conditioning when the grid is threatened with an overload on hot summer days. Some pilot projects report that consumers using this technology have cut their energy bills by an average of 10 percent.

But why is there such a push to conserve energy, especially electricity? After all, the U.S. has plenty of coal, natural gas, and uranium to generate power and, if promoters of renewable fuels are right, there'll be plenty of those, too. The answer, of course, centers on concerns about man-made global warming. About 50 percent of our electricity is produced using coal and 20 percent more is generated by burning natural gas, both of which emit carbon dioxide that contributes to raising the earth's average temperature.

If the goal is reducing carbon dioxide emissions rather than achieving energy efficiency for its own sake, then the simplest and most effective policy would be to place a price on carbon dioxide emissions. Pricing carbon would push generators and consumers to switch to low- and no-carbon fuels while encouraging innovators to develop such fuels. Proponents of the smart grid often liken it to the Internet. And that's actually a pretty good analogy. After all, no central computer company created the Internet by installing a desktop in every home. So why should we jigger electricity rate regimes in order to push utility companies to install smart meters or appliances? As higher carbon prices cause electric bills to increase, consumers themselves will start installing smart meters and appliances while seeking out relatively cheaper low-carbon power.

Here's another question: Why haven't utility companies and electric generation companies already started investing in a smart grid? One word: incentives. The chief problem is that power companies make more money when they sell more electricity to consumers. Building a smart grid means utilities would pay for infrastructure that could reduce the amount of electricity they sell. In other words, given the current regulatory system, utilities would be spending money so that they would make even less money. Obviously, this incentive scheme won't work.

The leading proposal to change the incentive structure is called decoupling. Instead of earning money by selling more electricity, a utility's profits are decoupled from the amount of electricity it sells. Regulators guarantee that a utility's fixed costs, including a profit margin, can be recovered no matter how much energy it sells through a rate adjustment formula. So if a utility sells less than was forecasted, the rates to consumers are adjusted to make up the difference. Generally speaking, such adjustments have amounted to an additional 2 to 3 percent on consumers' bills.
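The true-up arithmetic behind decoupling can be sketched in a few lines. The figures below are invented for illustration; they are not drawn from any actual utility filing, though the shortfall chosen lands in the 2 to 3 percent range of bill adjustments the text cites.

```python
# Hypothetical decoupling "true-up": regulators guarantee the utility its
# allowed revenue regardless of how much electricity it actually sells.
ALLOWED_REVENUE = 100_000_000   # fixed costs plus approved profit margin ($)
forecast_kwh = 2_000_000_000    # sales volume the original rate was based on
base_rate = ALLOWED_REVENUE / forecast_kwh   # $/kWh

actual_kwh = 1_950_000_000      # sales come in 2.5% under forecast
collected = base_rate * actual_kwh
shortfall = ALLOWED_REVENUE - collected

# The shortfall is recovered via a surcharge on the next period's bills.
surcharge_rate = shortfall / actual_kwh
bill_increase_pct = 100 * surcharge_rate / base_rate

print(f"revenue shortfall: ${shortfall:,.0f}")
print(f"bill adjustment:   {bill_increase_pct:.2f}%")
```

Note that the adjustment percentage tracks the sales shortfall: the less the utility sells relative to forecast, the larger the surcharge consumers see.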

The House stimulus bill contains provisions that require states to consider decoupling as a condition for applying for $3.4 billion in energy efficiency grants. Some free marketers dislike decoupling because consumers could perversely see their energy use go down while their bills remain the same or even go up. Some progressives also reject decoupling on the grounds that it provides windfall profits to utilities.

While decoupling removes the incentive for utilities to sell more power, it doesn't provide much impetus for utilities to boost energy efficiency. One proposed fix for this problem is shared savings. If a utility invests in some type of energy efficiency—say the installation of smart meters or better insulation in houses—the utility shares the energy savings with the consumers. How? Consumers get lower utility bills because they use less energy, and regulators award the utility higher rates to pay back its investment in energy efficiency. For example, California utilities get rate hikes that amount to between 9 and 12 percent of the energy efficiency savings. But again, why rig electricity markets so that utility companies end up in charge of insulating houses, paying for energy efficient appliances, and installing energy management systems? Instead, utilities need to create an open information exchange network that both consumers and power generators can tap into.
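The shared-savings split works out as simple arithmetic. The percentage below is taken from the 9 to 12 percent California range the text cites; the dollar amount is invented for illustration.

```python
# Hypothetical shared-savings split between a utility and its customers.
efficiency_savings = 50_000_000   # value of energy saved by the program ($)
utility_share = 0.10              # within the 9-12% range cited for California

utility_recovery = efficiency_savings * utility_share   # recouped via rate hikes
consumer_benefit = efficiency_savings - utility_recovery  # kept as lower bills

print(f"utility rate recovery: ${utility_recovery:,.0f}")
print(f"net consumer savings:  ${consumer_benefit:,.0f}")
```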

In 2004, the Electric Power Research Institute calculated that it would cost $165 billion over the next 20 years—about $8 billion per year—to build out the smart grid. And one of the first challenges is mobilizing sufficient investment. But that problem won't be solved by throwing $4.5 billion at the electric grid as a sop to the environmental lobby—even if it does stimulate the bottom lines of favored corporations.

Ronald Bailey is Reason magazine's science correspondent. His book Liberation Biology: The Scientific and Moral Case for the Biotech Revolution is now available from Prometheus Books.