Some skeptics of man-made global warming have pointed to the sun as being responsible, at least in part, for the recent warming of the planet. One recently advanced theory suggests that the sun's magnetic activity modulates the amount of cosmic rays reaching the earth. The basic idea is that cosmic rays affect cloud cover: the more cosmic rays, the more cloud cover and the cooler the temperature. On this theory, a stronger solar magnetic field deflects more cosmic rays, which would lead to fewer clouds and higher temperatures.
Two European researchers are now publishing a study in the Proceedings of the Royal Society A which looks at data on the sun's irradiance, the incidence of cosmic rays, and other solar factors bearing on the earth's recent climate. What do they find?
There is considerable evidence for solar influence on the Earth's pre-industrial climate and the Sun may well have been a factor in post-industrial climate change in the first half of the last century. Here we show that over the past 20 years, all the trends in the Sun that could have had an influence on the Earth's climate have been in the opposite direction to that required to explain the observed rise in global mean temperatures.
One of the researchers tells the New Scientist:
"The upshot is that somewhere between 1985 and 1987 all the solar factors that could have affected climate have been going in the wrong direction. If they were really a big factor we would have cooling by now."
Of course, in areas that are prey to big uncertainties, no study is definitive. However, as the evidence for man-made global warming continues to accumulate, policymakers and citizens should turn their attention to what should be done about it. See some of my thoughts on carbon taxes vs. carbon markets here.
Disclosure: Just in case any H&R readers missed it, I am a former skeptic of man-made global warming. See my mea culpa here.