Last week, President Bush stepped back from the global warming precipice over which his Environmental Protection Agency administrator, Christine Todd Whitman, and Treasury Secretary Paul O'Neill were trying to push him. These two Bush administration officials wanted the EPA to be given the authority to regulate carbon dioxide, the gas released by burning oil, coal, and natural gas, as a pollutant. With this authority, the EPA would have been able to implement regulations aimed at reducing carbon dioxide emissions along the lines proposed in the unratified Kyoto Protocol. That agreement would require the U.S. to reduce its carbon dioxide emissions to 7 percent below its 1990 levels between 2008 and 2012.
Imposing curbs on carbon dioxide emissions would be expensive. This past December, the Energy Information Administration, the research arm of the U.S. Department of Energy, released an analysis that found that regulating carbon dioxide as a pollutant would cost American consumers nearly $115 billion per year. Whitman and O'Neill have evidently embraced the arguments emanating from the U.N.'s Intergovernmental Panel on Climate Change. The IPCC believes that raising the amount of carbon dioxide in the earth's atmosphere could lead to catastrophic global warming in which the average temperature of the Earth might rise as much as 10 degrees Fahrenheit (about 5.6 degrees centigrade) by 2100.
Scientifically, Bush has good reason to step back from the Kyoto climate control process. First, the highly accurate satellite record shows the Earth's temperature cooling at about 0.06 degrees centigrade per decade since 1979, when the first satellite data were gathered. (See http://www.ghcc.msfc.nasa.gov/MSU/msusci.html for a chart of this.) This contrasts with the temperature trend as measured by a spotty network of surface thermometers, which report warming of about 0.10 to 0.15 degrees centigrade per decade.
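To see how far apart those two trends put us, here is a back-of-the-envelope tally using only the per-decade figures cited above; the assumption that the satellite era runs from 1979 through roughly 2001 (the time of writing) is mine.

```python
# Accumulate each cited per-decade trend over the satellite era.
# Trend figures come from the article; the 2001 endpoint is an assumption.
decades = (2001 - 1979) / 10          # about 2.2 decades of satellite data

satellite_trend = -0.06               # deg C per decade (MSU satellite record)
surface_trend = (0.10, 0.15)          # deg C per decade (surface thermometers)

satellite_total = satellite_trend * decades
surface_total = tuple(t * decades for t in surface_trend)

print(f"Satellite record implies {satellite_total:+.2f} C since 1979")
print(f"Surface record implies {surface_total[0]:+.2f} to {surface_total[1]:+.2f} C")
```

Over two decades, the two records thus disagree not just in magnitude but in sign, which is the disparity the article goes on to discuss.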
The scariest and most widely cited of the 245 controversial climate storylines mentioned in the IPCC's Policymakers Summary, released earlier this year, suggested that global temperatures could rise by 4 degrees centigrade by 2100, or about 0.4 degrees per decade. In order to account for the actual, much lower rates of temperature rise, modelers have inserted a number of cooling factors into their models, particularly sulfate particles produced by burning dirty coal. The modelers postulate that sulfate particles brighten clouds, which cool the surface by reflecting sunlight back into space. Other climatologists think that the sulfate cooling effect in the models is vastly overstated.
The physics of atmospheric warming are uncontroversial, as is the fact that atmospheric carbon dioxide is increasing. Added carbon dioxide should increase temperature, which should increase water vapor (the most ubiquitous greenhouse gas), which should lead to still higher average temperatures. Climate computer models try to replicate this process so as to predict what the earth's climate will be over the next century. So far, though, the model results do not match the actual temperature trends found in the satellite record. What causes this disparity? Probably clouds. It is well known that the computer models are rotten at explaining the effects of clouds on climate.
Richard Lindzen, a professor of atmospheric sciences at the Massachusetts Institute of Technology, has published a study in the current issue of the Bulletin of the American Meteorological Society that could explain what's going on. Lindzen and his colleagues find that the tropical Pacific Ocean may be able to open a "vent" in its heat-trapping cirrus cloud cover and release enough energy into space to significantly diminish the projected climate warming caused by a buildup of greenhouse gases in the atmosphere.
High, thin cirrus clouds tend to act as insulators, trapping heat before it can radiate into space. Bright cumulus clouds (thunderstorm clouds) tend to reflect sunlight back into space before it reaches the surface, thus cooling the atmosphere. What Lindzen and his colleagues think they've found is a negative feedback loop (a thermostat) that tends to lower the temperature of the atmosphere. Looking at weather satellite cloud data and ocean surface temperature data for the western Pacific, Lindzen found that for every 1 degree centigrade increase in ocean surface temperature, there is an average 22 percent decrease in cirrus cloud cover. Lindzen suggests that cumulus clouds over warmer seas produce rain more efficiently, thus leaving less moisture in the atmosphere to produce heat-trapping cirrus clouds. Fewer cirrus clouds mean that excess heat can vent into space.
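The reported relationship can be sketched as a simple function: cirrus cover shrinking about 22 percent for each 1 degree centigrade rise in sea surface temperature. The linear form, and extending it beyond the observed range, are my illustrative assumptions, not part of Lindzen's study.

```python
# Illustrative sketch of the reported "iris" relationship: roughly a
# 22 percent drop in cirrus cloud cover per 1 C rise in sea surface
# temperature. Linearity outside the observed range is an assumption.
def cirrus_cover(baseline_cover, sst_rise_c, decrease_per_degree=0.22):
    """Fraction of baseline cirrus cover remaining after an SST rise."""
    return baseline_cover * max(0.0, 1 - decrease_per_degree * sst_rise_c)

# Seas 1 C warmer would leave 78 percent of the baseline cirrus cover.
print(cirrus_cover(1.0, 1.0))
```

With less heat-trapping cirrus overhead, more of the extra heat escapes to space, which is the "vent" the paragraph above describes.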
Preliminary calculations using this cloud thermostat cut computer climate model temperature estimates of global warming by almost two-thirds. So instead of rising 1.5 to 4.0 degrees centigrade (as the early models suggest), the earth's temperature would rise between 0.64 and 1.6 degrees centigrade by 2100, all other things being equal. For perspective, keep in mind that the last century saw a temperature increase of about 0.6 degrees centigrade.
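The "almost two-thirds" claim can be checked against the two ranges quoted above; this is bookkeeping on the article's own figures, not a climate calculation.

```python
# Check the quoted reduction: early model range of 1.5-4.0 C versus the
# 0.64-1.6 C range after the cloud thermostat. Figures from the article.
early_range = (1.5, 4.0)       # deg C warming by 2100 in early models
adjusted_range = (0.64, 1.6)   # deg C with the cirrus "vent" included

cuts = [1 - after / before
        for before, after in zip(early_range, adjusted_range)]

for (before, after), cut in zip(zip(early_range, adjusted_range), cuts):
    print(f"{before} C -> {after} C: a {cut:.0%} cut")
```

Both endpoints come out cut by roughly 60 percent, consistent with the article's "almost two-thirds."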
Despite claims from environmentalists committed to global warming catastrophe scenarios and the politicized scientists who run the IPCC, it is clear that the science of global warming is far from settled. As Carl Sagan once said (but didn't practice), "Extraordinary claims require extraordinary evidence."
President Bush is wise to insist that global warming alarmists come up with at least better, if not extraordinary, evidence before he commits the country and the world to a perilous, expensive, and perhaps needless effort to cut energy use.