As if devising climate models to predict Earth's future climate were not challenging enough, some astrophysicists have now built what is essentially a climate model for the sun. As many news outlets have reported, Northumbria University mathematics professor Valentina Zharkova claims that her sunspot model is 97 percent accurate.
If the model is right, the sun is heading toward a grand sunspot minimum by 2030, of the sort last seen during the Maunder Minimum in the late 17th and early 18th centuries. During the Maunder Minimum, sunspots basically disappeared and the energy output of the sun fell. The result was the Little Ice Age, in which temperatures dropped significantly in the Northern Hemisphere. Famously, the weather was so cold that the Thames River in London regularly froze, and residents used the iced-over river as the site for "Frost Fairs."
One theory for why a lack of sunspots produces an earthly chill is that reduced solar output, especially of ultraviolet light, means less ozone is produced in the stratosphere, which alters atmospheric circulation in ways that lower average temperatures. By how much? According to some modeling results, a new grand minimum would lower temperatures by perhaps 0.2 to 0.3 degrees Celsius below what they would otherwise be. A 2013 modeling study in Geophysical Research Letters, "Could a future 'Grand Solar Minimum' like the Maunder Minimum stop global warming?," concluded:
…a grand solar minimum in the middle of the 21st century would slow down human-caused global warming and reduce the relative increase of surface temperatures by several tenths of a degree. … Therefore, results here indicate that such a grand solar minimum would slow down and somewhat delay, but not stop, human-caused global warming.
Assuming that the University of Alabama in Huntsville global temperature trend of +0.11 degree Celsius per decade is sustained, such a new grand minimum would counteract three decades of warming.
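That "three decades" figure is easy to check with back-of-envelope arithmetic: divide the modeled grand-minimum cooling (0.2 to 0.3 degrees Celsius) by the +0.11 degree Celsius per decade warming trend. A quick sketch of the calculation:

```python
# Back-of-envelope check: how many decades of warming would a
# grand solar minimum's cooling offset, given the UAH trend?
warming_rate = 0.11  # degrees C per decade (UAH satellite trend)

for cooling in (0.2, 0.3):  # modeled grand-minimum cooling range, degrees C
    decades = cooling / warming_rate
    print(f"{cooling} C of cooling offsets about {decades:.1f} decades of warming")
```

The upper end of the range works out to roughly 2.7 decades, which rounds to the approximately three decades cited above.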