17 February 2018
Overheated claims on temperature records by Dr. Tim Ball and Tom Harris
Now that the excitement has died down over the news that Earth’s surface temperature made 2017 one of the hottest years on record, it is time for sober second thoughts.
Did the January 18 announcement by the National Oceanic and Atmospheric Administration (NOAA) that 2017 was our planet’s third-hottest year since 1880, and NASA’s claim that it was the second hottest year, actually mean anything?
Although the Los Angeles Times called 2017 “a top-three scorcher for planet Earth,” neither the NOAA nor the NASA records are significant. One would naturally expect the warmest years to come during the most recent years of a warming trend. And thank goodness we have been in a gradual warming trend since the depths of the Little Ice Age in the late 1600s! Back then, the River Thames was covered by a meter of ice, as Jan Griffier’s 1683 painting “The Great Frost” illustrates.
Regardless, recent changes have been too small for most thermometers even to detect. More importantly, they are often smaller than the government’s own estimates of uncertainty in the measurements. In fact, we lack the data to properly and scientifically compare today’s temperatures with those of the past.
This is because, until the 1960s, surface temperature data was collected using mercury thermometers located at weather stations situated mostly in the United States, Japan, the United Kingdom and eastern Australia. Most of the rest of the planet had very few temperature sensing stations. And none of the Earth’s oceans, which constitute 70 percent of the planet’s surface area, had more than an occasional station separated from its neighbors by thousands of kilometers or miles.
The data collected at the weather stations in this sparse grid had, at best, an accuracy of +/-0.5 degrees Celsius (0.9 degrees Fahrenheit). In most cases, the real-world accuracy was no better than +/-1 deg C (1.8 deg F). Averaging such poor data in an attempt to determine global conditions cannot yield anything meaningful. Displaying average global temperature to tenths or even hundredths of a degree, as is done in the NOAA and NASA graphs, clearly defies common sense.
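For readers who want to see how the quoted +/-0.5 deg C figure interacts with averaging, here is a minimal Python sketch. The station values are invented, and the two uncertainty models (fully independent errors versus a shared systematic bias) are assumptions made purely for illustration; which model best describes real station networks is part of the dispute.

    import math

    # Hypothetical annual-mean readings (deg C) from a handful of stations.
    readings = [14.2, 13.8, 15.1, 14.6, 13.9, 14.4]

    # Per-reading instrument uncertainty quoted in the text (deg C).
    sigma = 0.5
    n = len(readings)

    mean = sum(readings) / n

    # Case 1: errors are independent and random -> they partially cancel,
    # and the uncertainty of the mean shrinks as 1/sqrt(n).
    sigma_independent = sigma / math.sqrt(n)

    # Case 2: errors share a common systematic bias -> averaging does not
    # help, and the mean inherits the full +/-0.5 deg C uncertainty.
    sigma_systematic = sigma

    print(f"mean = {mean:.2f} C")
    print(f"uncertainty if errors are independent: +/-{sigma_independent:.2f} C")
    print(f"uncertainty if errors are systematic:  +/-{sigma_systematic:.2f} C")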
Modern weather station surface temperature data is now collected using precision thermocouples. But, starting in the 1970s, less and less ground surface temperature data was used for plots such as those by NOAA and NASA. This was done initially because governments believed satellite monitoring could take over from most of the ground surface data collection.
However, the satellites did not show the warming forecast by computer models, which had become so crucial to climate studies and energy policy-making. So bureaucrats closed most of the colder rural surface temperature sensing stations – the ones furthest from much warmer urban areas – thereby yielding the warming desired for political purposes.
Today, virtually no data exist for approximately 85 percent of the Earth’s surface. Indeed, fewer weather stations are in operation now than in 1960.
That means surface temperature computations by NOAA and NASA after about 1980 are meaningless. Combining this with the problems with earlier data renders an unavoidable conclusion: It is not possible to know how Earth’s so-called average surface temperature has varied over the past century and a half.
The data is therefore useless for input to the computer models that form the basis of policy recommendations produced by the United Nations Intergovernmental Panel on Climate Change (IPCC) and used to justify eliminating fossil fuels, and replacing them with renewable energy.
But the lack of adequate surface data is only the start of the problem. The computer models on which the climate scare is based are mathematical constructions that require the input of data above the surface, as well as on it. The models divide the atmosphere into cubes piled on top of each other, ideally with wind, humidity, cloud cover and temperature conditions known for different altitudes. But we currently have even less data above the surface than on it, and there is essentially no historical data at altitude.
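To make the picture of stacked atmospheric cubes concrete, here is a purely schematic Python sketch of the gridded state such a model would need filled in at every time step. The grid dimensions and variable names are hypothetical and far coarser than anything real; the point is only how many values must be supplied or estimated per cell.

    import numpy as np

    # Purely schematic grid: 36 latitude bands x 72 longitude bands x 10 levels.
    # Real models use far finer grids; these numbers are only illustrative.
    n_lat, n_lon, n_lev = 36, 72, 10
    shape = (n_lat, n_lon, n_lev)

    # One array per physical field mentioned in the text, for a single time step.
    state = {
        "temperature_C":  np.full(shape, np.nan),  # nan = no observation or estimate
        "humidity_pct":   np.full(shape, np.nan),
        "wind_u_mps":     np.full(shape, np.nan),
        "wind_v_mps":     np.full(shape, np.nan),
        "cloud_fraction": np.full(shape, np.nan),
    }

    cells = n_lat * n_lon * n_lev
    values_per_step = cells * len(state)
    print(f"{cells} grid cells -> {values_per_step} values needed per time step")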
Many people think the planet is adequately covered by satellite observations – data that provide global, 24/7 coverage and are far more accurate than anything determined at weather stations. But the satellites are unable to collect data from the north and south poles, regions that the IPCC, NOAA and NASA tout as critical to understanding global warming. Besides, space-based temperature data collection did not start until 1979, and 30 years of weather data are required to generate a single data point on a climate graph.
So the satellite record is far too short to allow us to come to useful conclusions about climate change.
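As a simple illustration of the 30-years-per-data-point convention mentioned above, here is a hedged Python sketch that collapses annual means into 30-year averages, the way climate “normals” are conventionally defined. The annual values are made up; the point is only that a record beginning in 1979 yields very few such averages.

    # Hypothetical annual mean temperatures keyed by year (values are made up).
    annual_means = {year: 14.0 + 0.01 * (year - 1979) for year in range(1979, 2018)}

    def thirty_year_average(start_year, data):
        """Average 30 consecutive annual means, if all of them are available."""
        years = range(start_year, start_year + 30)
        if not all(y in data for y in years):
            return None  # not enough data for a single climate-scale point
        return sum(data[y] for y in years) / 30

    # A record starting in 1979 supports only one complete 30-year window so far.
    for start in (1979, 1990):
        avg = thirty_year_average(start, annual_means)
        label = f"{start}-{start + 29}"
        print(label, "->", f"{avg:.2f} C" if avg is not None else "insufficient data")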
In fact, there is insufficient data of any kind – temperature, land and sea ice, glaciers, sea level, extreme weather, ocean pH, and so on – to be able to determine how today’s climate differs from the past. Lacking such fundamental data, climate forecasts cited by climate activists therefore have no connection with the real world.
British Professor Hubert Lamb is often identified as the founder of modern climatology. In his comprehensive 1972 treatise, Climate: Past, Present and Future, he clearly showed that it is not possible to understand climate change without having vast amounts of accurate weather data over long time frames. Lamb also noted that funding for improving the weather database was dwarfed by money being spent on computer models and theorizing. He warned that this would result in wild and unsubstantiated theories and assertions, while predictions failed to improve. That is precisely what happened.
Each and every prediction made by the computer models cited by the IPCC has turned out to be incorrect. Indeed, the first predictions they made for the IPCC’s 1990 Assessment Report were so wrong that the panel started to call them “projections” and offered low, medium and high “confidence” ranges for future guesstimates, which journalists, politicians and others nevertheless treated as reliable predictions for future weather and climate.
IPCC members seemed to conclude that, if they provided a broad enough range of forecasts, one was bound to be correct. Yet, even that was too optimistic. All three ranges predicted by the IPCC have turned out to be wrong.
US Environmental Protection Agency (EPA) Administrator Scott Pruitt is right to speak about the need for a full-blown public debate among scientists about the causes and consequences of climate change. In his February 6 television interview on KSNV, an NBC affiliate in Las Vegas, Mr. Pruitt explained:
“There are very important questions around the climate issue that folks really don’t get to. And that’s one of the reasons why I’ve talked about having an honest, open, transparent debate about what do we know, and what don’t we know, so the American people can be informed and they can make decisions on their own with respect to these issues.”
On January 30, Pruitt told the Senate Environment and Public Works Committee that a “red team-blue team exercise” (an EPA-sponsored debate between climate scientists holding differing views) is under consideration. It is crucially important that such a debate take place.
The public needs to understand that even the most basic assumptions underlying climate concerns are either in doubt or simply wrong. The campaign to force America, Canada, Europe and the rest of the world to switch from abundant and affordable coal and other fossil fuels – to expensive, unreliable, land-intensive alternatives – supposedly to control Earth’s always fluctuating climate, will then finally be exposed for what it really is: the greatest, most damaging hoax in history.
Dr. Tim Ball is an environmental consultant and former climatology professor at the University of Winnipeg in Manitoba. Tom Harris is executive director of the Ottawa, Canada-based International Climate Science Coalition.
My Note:
I added the yellow highlighting. This is a point I have also long made. I will add another point: much of the historical data has been "corrected" in recent times, and the corrections are very substantial compared to the temperature trends themselves. Somehow, they almost always make the older temperatures colder. If the older data really does need such large corrections, then it is worthless as scientific data and should be treated as such. There is no point in making corrections on such a wobbly, uncertain base.
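One way to make this concern concrete is to compare the linear trend of a raw series against its adjusted counterpart and against the size of the adjustments themselves. The series below are entirely invented, and the comparison is only a sketch of how such a check could be run, not a claim about any real data set.

    # Hypothetical raw and adjusted annual series (deg C); values are invented
    # purely to show how one could compare an adjustment to the trend itself.
    years = list(range(1900, 2000, 10))
    raw      = [14.3, 14.2, 14.4, 14.3, 14.5, 14.4, 14.5, 14.6, 14.6, 14.7]
    adjusted = [14.0, 14.0, 14.2, 14.2, 14.4, 14.4, 14.5, 14.6, 14.6, 14.7]

    def trend_per_century(xs, ys):
        """Ordinary least-squares slope, scaled to degrees per 100 years."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = sum((x - mx) ** 2 for x in xs)
        return (num / den) * 100

    print(f"raw trend:      {trend_per_century(years, raw):.2f} C / century")
    print(f"adjusted trend: {trend_per_century(years, adjusted):.2f} C / century")
    print(f"largest adjustment: {max(abs(a - r) for a, r in zip(adjusted, raw)):.2f} C")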
1 comment:
I think Tony Heller's work continues to be a thorn in the side of the alarmists who actually work with and oversee these large data sets, such as Zeke Hausfather and Gavin Schmidt, because he focuses on the one data set of sufficient quality in both space and time to actually track temperature trends over the last century. The data set is USHCN, and it undoubtedly has had its observed long-term cooling trend reversed into a warming trend.
https://imgur.com/a/vpJ18
And the pattern of changes to the underlying data match CO2 rise with an almost perfect correlation.
https://imgur.com/a/hjapW
If you bring your attention to the comments, you'll notice a very revealing statement from Steven Mosher of Berkeley Earth Systems, a colleague of Zeke Hausfather.
A common retort by the above researchers is that, while the US data sets are admittedly adjusted severely, the global average is affected minimally. This, to me, is extremely troubling because the vast, vast majority of actual thermometer readings have historically come from North America. I believe that around 1900 something like 95 percent of min/max readings came from the US.
I think what's going on is something that many have suspected for more than a decade, that "bad quality" data is being favored over "good quality" data and that much of the good quality data sets are being unfairly manipulated because they are at odds with data of more dubious consistency. I believe at one point Dr James Hansen was puzzled over the lack of warming in the US, despite a warming trend globally.