Global Cooling, Part 2 of 3
Among many in the climate science community, global warming is taken as a given. No consideration is given to the possibility of the earth losing heat and cooling.
But is the chance of cooling really zero? And, if it is not zero, what might the implications of cooling be? This is Part 2 of a three-part essay on global cooling. Part 1 is below.
Let's examine whether the “consensus” regarding continued warming is well-founded.
Earth's history is made up of alternating cooling and warming periods. Here is a graph of temperatures for the last 2,000 years:
The Roman Warm Period was in progress at the time of Christ. Temperatures plummeted a few centuries later. Rome fell as civilizations relocated to escape the cold.
Temperatures warmed again about a thousand years ago. During the Medieval Warm Period, wine grapes were grown in Newfoundland and Leif Ericson set up a settlement in Greenland (which he described as “green”). Over the next few centuries, temperatures dropped so much that the glaciers advanced and the world experienced the Little Ice Age. Temperatures have been in recovery mode since.
It is very likely that humans have influenced temperatures and other aspects of climate, especially in the past few decades as fossil fuel use has increased. But because of solar activity, volcanoes, cosmic rays, and climate feedbacks (e.g., all else being equal, warmer temperatures mean more evaporation and more clouds, which should produce cooling), it is impossible, at the present state of the science, to know the exact extent of man’s influence.
Climate scientists attempt to investigate the extent of man’s influence through the use of climate models. The models are complex computer simulations of the atmosphere, the ocean, the sun, and demographic trends. The value of these simulations is limited because, to cite just one example, we don’t fully understand the role of clouds and tiny dust particles in regulating incoming solar radiation and outgoing “long wave” radiation. With so much about the sun, cosmic rays, and other influences not understood, the models cannot be relied upon to make accurate forecasts (see Part 1 for examples of wildly incorrect climate forecasts). In fact, the models are so unreliable that we do not use them to make 90-day climate forecasts. How can we believe they can make accurate 90-year forecasts?
It is not just my opinion that the forecasts are unreliable. You can download a 2007 paper from The Wharton School that concludes, “We have been unable to identify any scientific forecasts to support global warming. Claims the earth will get warmer have no more credence than saying it will get colder.”
Mainstream climate scientists are starting to agree. Within the last week, Dr. Roger Pielke, Sr. writes:
There is no evidence that the global climate model multi-decadal predictions (and even shorter term runs on a year or less into the future) have the needed skill [to make accurate forecasts].
My conclusion: There is no demonstrable skill at forecasting climate.
With that background, let's ask a question: What if all of the recent evidence of colder weather means just that — the earth is starting to cool?
Now, I want to clearly state: I don’t know whether the earth’s heat content will increase, decrease, or stay the same in future decades. Neither does anyone else.
As a risk management professional, I’m asking which is the greater threat — major cooling or major warming — and whether we should prepare for either.
I’ll offer some thoughts tomorrow.
UPDATE: New Year's Eve. Here is an article about botched environmental forecasts.