Warming Bias in Temperature Data due to Consensus Belief, not Conspiracy

In the Cliscep article Science: One Damned Adjustment After Another?, Geoff Chambers wrote:-

So is the theory of catastrophic climate change a conspiracy? According to the strict dictionary definition, it is, in that the people concerned clearly conferred together to do something wrong – namely introduce a consistent bias in the scientific research, and then cover it up.

This was in response to the latest David Rose article in the Mail on Sunday, about claims that the infamous Karl et al. 2015 paper breached the National Oceanic and Atmospheric Administration's (NOAA) own rules on scientific integrity.

I would counter this claim about conspiracy in respect of temperature records, even on the strict dictionary definition. Still less does it conform to a conspiracy theory in the sense of some group who, grasping what they believe to be the real truth, act together to provide an alternative to that truth, or to divert attention and resources away from it, like an internet troll. A clue as to why this is the case comes from one of the most notorious Climategate emails, sent by Kevin Trenberth to Michael Mann on Mon, 12 Oct 2009 and copied to most of the leading academics in the “team” (including Thomas R. Karl).

The fact is that we can’t account for the lack of warming at the moment and it is a travesty that we can’t. The CERES data published in the August BAMS 09 supplement on 2008 shows there should be even more warming: but the data are surely wrong. Our observing system is inadequate.

It is the first sentence that is commonly quoted, but it is the last part that is most relevant for temperature anomalies. There are inevitably a number of homogenisation runs behind a single set of anomalies. For example, the Reykjavik temperature data was (a) adjusted by the Iceland Met Office by standard procedures to allow for known local biases, (b) adjusted for GHCNv2 (the “raw data”), (c) adjusted again in GHCNv3, and (d) homogenized by NASA to be included in GISTEMP.

There are steps that I have missed. Certainly GISTEMP homogenizes the data quite frequently as new sets of data arrive. As Paul Matthews notes, the adjustments are unstable. Although one data set might on average be pretty much the same as previous ones, quite large anomalies will be thrown out every time the algorithms are re-run on new data. What is more, due to the nature of the computer algorithms, there is no audit trail, so the adjustments are largely unexplainable with reference to the data from the previous run, let alone with reference to the original thermometer readings. So how does one know whether the adjustments are reasonable or not, except through a belief in how the results ought to look?

In the case of climatologists like Kevin Trenberth and Thomas R. Karl, variations that show warmer than the previous run will be more readily accepted as correct than variations that show cooler. That is, they will find reasons why a particular temperature data set now shows greater warming than before, but will reject as outliers results that show less warming than before. It is the same when choosing techniques, or adjusting for biases in the data. This is exacerbated when a number of different bodies with similar belief systems seek a consensus of results, as Zeke Hausfather alludes to in his article at CarbonBrief. Rather than being verified against the real world, temperature data comes to conform to the opinions of others with similar beliefs about the world.
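A toy sketch of this instability (hypothetical numbers; real pairwise homogenisation, such as NOAA's, is far more sophisticated): suppose a station's series is adjusted by removing its mean offset from a composite of neighbouring stations, computed over the full period of record. When one more year of data arrives and the algorithm is re-run, the offset changes, so every past adjusted value changes too, even though the station's raw readings have not.

```python
# Toy illustration of adjustment instability (made-up data, not a real algorithm).
# The "homogenisation" here removes the station's mean offset from the
# neighbour composite over the whole period of record.

def homogenise(station, neighbours):
    """Adjust the whole series by its mean offset from the neighbour series."""
    offset = sum(s - n for s, n in zip(station, neighbours)) / len(station)
    return [round(s - offset, 3) for s in station]

station    = [10.2, 10.5, 9.8, 10.9]    # raw annual means, degC (hypothetical)
neighbours = [10.0, 10.1, 10.0, 10.2]   # neighbour composite (hypothetical)

run1 = homogenise(station, neighbours)

# A new year of data arrives; the same algorithm is re-run on the longer series.
run2 = homogenise(station + [11.5], neighbours + [10.1])

print(run1)       # the adjusted values for the first four years
print(run2[:4])   # the *same* four years after the re-run -- different values
```

Each re-run rewrites the entire adjusted history, and nothing in the output explains which past values moved or why, which is the "no audit trail" problem described above.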

Kevin Marshall

The Climate Alarmist Reaction to a Trump Presidency

A few weeks ago Cliscep had a piece, Trump, climate and the future of the world, that looked at the immediate reactions within the climate community to the surprise victory in the US Presidential election. Brad Keyes noted Joe Romm’s piece Will President Trump pull the plug on a livable climate?. To support this Romm stated:-

Indeed, one independent firm, Lux Research, projected last week that “estimated emissions would be 16 percent higher after two terms of Trump’s policies than they would be after two terms of Clinton’s, amounting to 3.4 billion tons greater emissions over the next eight years.”

There is a little graph to sort of back this up.

Whilst Romm then states two reasons why he does not think emissions will rise by so much (Trump will cause a massive recession and will not win a second term), he follows with this quote, shared on Twitter:-

That said, the damage and delay that even a one-term President Trump could do will make the already difficult task of keeping total warming well below 2°C essentially impossible.

So a difference of much less than 3.4 GtCO2e over eight years will make keeping total warming well below 2°C essentially impossible.
Before looking at the evidence that contradicts this, consider the even more bizarre claims made by the expert climate scientists at RealClimate. They use a different graph, which is probably a couple of years old, and explain:-

Here are some numbers. Carbon emissions from the United States have been dropping since the year 2000, more than on-track to meet a target for the year 2020. Perhaps with continued effort and improving technology, emissions might have dropped to below the 2020 target by 2020, let’s say to 5 gigatons of CO2 per year (5000 megatons in the plot). In actuality, now, let’s say that removing restrictions on energy inefficiency and air pollution could potentially lead to US emissions by 2020 of about 7 gigatons of CO2. This assumes that future growth in emissions followed the faster growth rates from the 1990’s.
Maybe neither of these things will happen exactly, but these scenarios give us a high-end estimate for the difference between the two, which comes to about 4 gigatons of CO2 over four years. There will also probably be extra emissions beyond 2020 due to the lost opportunity to decarbonize and streamline the energy system between now and then. Call it 4-6 gigatons of Trump CO2.
This large quantity of gas can be put into the context of what it will take to avoid the peak warming threshold agreed to in Paris. In order to avoid exceeding a very disruptive warming of 1.5 °C with 66% probability, humanity can release approximately 220 gigatons of CO2 after January, 2017 (IPCC Climate Change 2014 Synthesis report, Table 2.2, corrected for emissions since 2011). The 4-6 Gtons of Trump CO2 will not by itself put the world over this threshold. But global CO2 emission rates are now about 36 gigatons of CO2 per year, giving a time horizon of only about six years of business-as-usual (!) before we cross the line, leaving basically no time for screwing around. To reach the catastrophic 2 °C, about 1000 gigatons of CO2 remain (about 20 years of business as usual). Note that these estimates were done before global temperatures spiked since 2014 — we are currently at 1.2 °C! So these temperature boundaries may be closer than was recently thought.

RealClimate come up with nearly twice the difference estimated by Joe Romm / Lux Research, but at least admit in the final paragraph that whoever won would not have made much difference.
There are two parts to putting these analyses into context – the US context and the global one.
In the USA, emissions have indeed been falling since 2000, despite a growing population. The rate of decline increased significantly during the Obama Presidency, but for reasons quite separate from actions to reduce emissions. First there was the credit crunch, followed by the slowest recovery in US history. Second, the high oil price encouraged emissions reductions, along with the loss of energy-intensive industries to countries with lower energy costs. Third, the shale gas revolution has meant switching from coal to gas in electricity production.
But the global context is even more important. RealClimate does acknowledge the global figure, but only mentions CO2 emissions. The 36 GtCO2 is only about two-thirds of total greenhouse gas emissions of about 55 GtCO2e, and that figure is rising by 1-2% a year. The graph – reproduced from the USA INDC submission to the UNFCCC – clearly states that it is in million tonnes of carbon dioxide equivalent. What is more, these are vague policy proposals that President Obama would have been unable to get through Congress. Further, most of the proposed emission reductions came from extrapolating trends of what has been happening without any policy intervention.
If the 1.5°C limit is breached after 220 GtCO2e of additional emissions, it will be breached in the run-up to Christmas 2020. The 1000 GtCO2e for the 2°C limit was from 2011. By simple arithmetic it is now below 800 GtCO2e, with about 15 years remaining if (a) a doubling of CO2 levels (or the equivalent in other GHGs) leads to 3°C of warming, (b) the estimated quantity of emissions per unit rise in atmospheric gas levels is correct, and (c) the GHGs emitted are retained in the atmosphere for a very long period.
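That arithmetic can be sketched with the round numbers quoted above (the 55 GtCO2e/yr figure is this post's estimate of total annual GHG emissions, not RealClimate's CO2-only 36 GtCO2/yr):

```python
# Carbon-budget arithmetic using the round numbers quoted in the post.
# All figures are GtCO2e and are rough estimates, not precise data.

annual_ghg = 55.0   # total global GHG emissions per year (the post's figure)

# 1.5 degC: ~220 GtCO2e said to remain after January 2017
years_to_1p5 = 220 / annual_ghg
print(f"1.5C budget exhausted in ~{years_to_1p5:.1f} years")  # ~4 years -> late 2020

# 2 degC: ~1000 GtCO2e from 2011; the post puts the remainder below 800 by 2017
remaining_2c = 800.0
years_to_2c = remaining_2c / annual_ghg
print(f"2C budget exhausted in ~{years_to_2c:.0f} years")     # ~15 years
```

Dividing the CO2-only budget by total GHG emissions overstates how quickly the budget is used, but it illustrates the point that the timescales shrink substantially once non-CO2 gases are counted.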
Even simple arithmetic is not required. Prior to the Paris talks, all the INDCs – including the USA's proposed emissions cuts shown in the graph above – were aggregated by the UNFCCC and compared to the approximate emissions pathways for 1.5°C and least-cost 2°C warming. The updated, post-Paris version is below.

The difference Donald Trump will make is somewhere within the thickness of the thick yellow line. There is no prospect of achieving the aimed-for blue emissions pathways. No amount of ranting or protesting at President-elect Trump will change the insignificant difference the United States will make with any politically acceptable and workable set of policies, in a country with less than a twentieth of the global population and less than one seventh of global emissions.

Kevin Marshall