Global Warming Data Twisted by Advocates

Before you read these words, you have probably already made up your mind as to whether you will agree or disagree with what I am about to report. That’s too bad; I wish there could be real dialogue on this issue, which is of paramount importance to all mankind.

As you may remember, when Mr. Al “I made hundreds of millions from an Inconvenient Truth and the green industry” Gore reported about global warming, there were loud protests that the data had been manipulated and charted in a way to maximize claims that global warming was melting the earth and that it was all caused by man.  In addition, both global warming advocates and the press have been very sloppy in differentiating between “climate change,” which is a natural part of Earth’s planetary cycles, and the impact man has had on those otherwise natural cycles.  Dr. Richard Muller’s BEST (Berkeley Earth Surface Temperature) program sought to review and reanalyze the data, correcting prior perceived deficiencies.  Scientists from around the world were curious as to how the data would be portrayed and just what it would show.

In an October 21, 2011, Wall Street Journal article entitled There was good reason to doubt . . . until now, Dr. Muller argued that a new review of the data should eliminate any reasonable skepticism about the validity of man-made global warming fears.  Unfortunately, Dr. Muller’s conclusions don’t seem consistent with his own summary of his findings.  One of the key problems with earlier analyses was that the stations used to collect global warming data were faulty, producing inconsistent results.  Dr. Muller writes:

The temperature-station quality is largely awful. The most important stations in the U.S. are included in the Department of Energy’s Historical Climatology Network. A careful survey of these stations by a team led by meteorologist Anthony Watts showed that 70% of these stations have such poor siting that, by the U.S. government’s own measure, they result in temperature uncertainties of between two and five degrees Celsius or more. We do not know how much worse are the stations in the developing world.

Using data from all these poor stations, the U.N.’s Intergovernmental Panel on Climate Change estimates an average global 0.64ºC temperature rise in the past 50 years, “most” of which the IPCC says is due to humans. Yet the margin of error for the stations is at least three times larger than the estimated warming.

 We know that cities show anomalous warming, caused by energy use and building materials; asphalt, for instance, absorbs more sunlight than do trees. Tokyo’s temperature rose about 2ºC in the last 50 years. Could that rise, and increases in other urban areas, have been unreasonably included in the global estimates? That warming may be real, but it has nothing to do with the greenhouse effect and can’t be addressed by carbon dioxide reduction.

The faulty data from these collection stations create numerous problems with the methodology of the BEST study.  Steve McIntyre, proprietor of Climate Audit, noticed:

“BEST’s estimate of the size of the temperature increase since the start of the 19th century is much larger than previous estimates….The decade of the 1810s is shown in their estimates as being nearly 2 degrees colder than the present….It’s also interesting to interpret these results from the context of ‘dangerous climate change’, defined by the UN as 2 deg C. Under BEST’s calculations, we’ve already experienced nearly 2 deg C of climate change since the early 19th century.” 

McIntyre could not replicate some of Muller’s results using raw station data.  Moreover, the land surface temperature record does not agree with the satellite record, nor does it agree well with the sea surface temperature record.  BEST uses only land-based temperature data and ignores sea surface temperature data.  According to Jonathan DuHamel:

“The whole thing boils down to what Muller said, “The temperature-station quality is largely awful.”  That means the surface temperature data is inadequate to come to any valid conclusions.  BEST measured a larger subset of the surface temperature record than some other researchers. BEST merged and “filtered” the data.  Various methods of massaging data lead to different conclusions, none of which may be close to reality.  The science is still not settled and the BEST study, despite its good intentions, provides nothing new.”

Yesterday the controversy “heated up” even more when BEST co-author Dr. Judith Curry denied claims that the BEST study supported continued global warming: “the project’s research data show there has been no increase in world temperatures since the end of the Nineties – a fact confirmed by a new analysis.”  In an interesting “twist,” the flattening and stabilizing of the global warming trend over the past decade and a half were “hidden” by the way the graph was depicted.

The top graph would seem to support claims that global warming is rampant and threatening the world we live in.  Actually, if you take the same data and enlarge the graph to show just the past decade and a half, you get an almost flat line with no alarming spike.  Even global warming skeptics are quick to point out that this “data display manipulation” was very different from the now infamous Climategate data, which hid problems in ways that made real analysis difficult.  In this case, the data was transparently provided, even if presented in a way intended to improperly persuade.  Meteorologist Anthony Watts says:

Indeed Best seems to have worked hard to obscure it. They present data covering … almost 200 years… with a short x-axis and a stretched y-axis to accentuate the increase. The data is then smoothed using a ten year average which is ideally suited to removing the past five years of the past decade and mix the earlier standstill years with years when there was an increase. This is an ideal formula for suppressing the past decade’s data.
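The smoothing effect Watts describes is easy to demonstrate. The sketch below uses purely synthetic numbers (not the BEST series): an anomaly record that rises steadily for 185 years and then goes flat for 15. A ten-year trailing average still shows a rise over the final decade, because each window mixes the flat recent years with earlier rising ones.

```python
# Synthetic annual "anomaly" series: 185 years of steady warming followed
# by 15 flat years. These are made-up numbers for illustration only,
# not the BEST data.
years = list(range(1811, 2011))
anomaly = [min(y, 1996) * 0.01 - 1811 * 0.01 for y in years]

def moving_average(series, window=10):
    """Ten-year trailing average, the kind of smoothing Watts criticizes."""
    return [sum(series[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(series))]

smoothed = moving_average(anomaly)

# The raw series is perfectly flat over its final decade, but the smoothed
# curve still climbs there, because its windows reach back into rising years.
print(anomaly[-1] - anomaly[-10])    # 0.0 -- the standstill is visible
print(smoothed[-1] - smoothed[-10])  # positive -- the standstill is hidden
```

Stretching the y-axis then magnifies that residual slope, which is exactly the presentation choice the critics object to.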

About Geoff Willis