President Barack Obama’s approach to climate change is based on an acceptance of the conclusions of the United Nations Intergovernmental Panel on Climate Change (IPCC). Perhaps the most widely cited of these conclusions are those of the 2013 IPCC report which stated:
“Warming of the climate system is unequivocal”
“Human influence on the climate system is clear”
These sentences mask deep ambiguity and confusion about what the dangerous anthropogenic global warming (DAGW) hypothesis is all about.
Whether “the planet is getting warmer,” as Obama told the graduating class of the United States Coast Guard Academy on May 20, depends entirely on the time period considered.
For example, mild global warming occurred between the end of the Little Ice Age (about 1860) and now, and also between 1979 and 1997. However, global cooling of about one degree has occurred since the peak of the Medieval Warm Period (about A.D. 1000), and of perhaps two degrees since the Holocene Climatic Optimum about 8,000 years ago.
It would seem then that Planet Earth is on a long-term cooling trend that overshadows the 20th century warming the IPCC focuses on.
No statistically significant global warming has occurred since 1997, an 18-year-long period during which atmospheric carbon dioxide (CO2) levels increased by 10 percent. That increase represents fully 30 percent of all the human-related emissions since the start of the industrial revolution, all for no warming.
The lack of warming in this period confirms that the DAGW hypothesis is wrong.
The issue at hand is not, as the IPCC and Obama maintain, that warming is happening, but specifically whether dangerous global warming is likely to happen in the foreseeable future as a result of human-induced CO2 emissions. If any warming that eventuates is not potentially dangerous, then it should not be a public policy issue at all, let alone worth expending vast amounts of public money trying to stop.
IPCC’s second statement, one repeated verbatim on the White House Web site, is equally meaningless. No sensible scientist would dispute that “human influence on the climate system is clear.”
The building of towns and cities replaces natural vegetation and land surfaces with industrial materials that trap solar radiation, causing the urban heat island effect. Similarly, in the countryside, farmers cut down dark-colored native vegetation and replace it with light-colored crops such as wheat. These fields then reflect more incoming solar radiation than did the native forest, which results in local, human-caused cooling.
Adding up our various warming and cooling influences across the globe results in a net human effect on the planet. But the effect is so small that it has yet to be calculated accurately, let alone measured.
The question, then, is not ‘Is there a human influence on climate?’ but ‘How big is it?’
Over the past three decades, thousands of scientists have expended hundreds of billions of dollars researching this issue without finding any convincing evidence that the human effect exceeds natural, random variations. The human impact on global climate is too small even to be detected with current instrumentation.
Given the highly variable nature of both weather and climate through time, the simplest hypothesis that explains all the facts is the ‘null hypothesis’ that ‘observed modern changes in the climate system, or in plants and animals affected by it, are due to natural causes unless and until specific evidence indicates otherwise.’
Neither governments nor the IPCC have yet produced any convincing evidence that contradicts this hypothesis.
Yet, in attempting to secure a climate change legacy before the end of his presidency, Obama appears to have not even considered the null hypothesis. With worldwide expenditures on climate finance now at $1 billion a day, it’s time he did.