How the “more than 50% of warming since 1950 is human caused” claim is deeply flawed

Over at Cliscep, Jaime Jessop has, rather jokingly, issued a challenge on a central claim of the IPCC Fifth Assessment Report, after someone on Twitter had accused her of not being a real person.

So here’s the deal: Michael Tobis convinces me, on here, that the IPCC attribution statement is scientifically sound and it is beyond reasonable doubt that more than half of the warming post 1950 is indeed caused by emissions, and I will post a photo verifying my actual existence as a real person.

The Report states (AR5 WG1 Ch10 Page 869)

It is extremely likely that human activities caused more than half of the observed increase in GMST from 1951 to 2010.

This “extremely likely” is at the 95% confidence level and includes all human causes. The more specific quote on human greenhouse gas emissions is from page 878, section “10.2.4 Single-Step and Multi-Step Attribution and the Role of the Null Hypothesis”:

Attribution results are typically expressed in terms of conventional ‘frequentist’ confidence intervals or results of hypothesis tests: when it is reported that the response to anthropogenic GHG increase is very likely greater than half the total observed warming, it means that the null hypothesis that the GHG-induced warming is less than half the total can be rejected with the data available at the 10% significance level.

It is a much more circumspect message than the “human influence on the climate system is clear” announcements of WG1 four years ago. In describing attribution studies, the section states

Overall conclusions can only be as robust as the least certain link in the multi-step procedure.

There are a number of candidates for the “least certain link” among the empirical estimates. In general, if one estimate is made with reference to the other estimates, or is biased by theory or beliefs, then the statistical test is invalidated. This includes the surface temperature data.

Further, if the models have been optimised to fit the surface temperature data, then the >50% figure is an absolute maximum, whilst the real figure, based on perfect information, is likely to be lower.

Most of all there are the possibilities of unknown unknowns. For instance, the suggestion that non-human causes could explain pretty much all the post-1950 warming can be inferred from some paleoclimate studies. This Greenland ice core reconstruction (graphic from climate4you) shows warming around as great as, or greater than, the current warming in the distant past. The timing of a warm cycle is not too far out either.

In the context of Jaime’s challenge, there is more than reasonable doubt in the IPCC attribution statement, even if a statistical confidence of 90% (GHG emissions) or 95% (all human causes) were acceptable as persuasive evidence.

There is a further problem with the statement. Human greenhouse gas emissions are meant to account for all the current warming, not just over 50%. If the full impact of a doubling of CO2 is eventually 3°C of warming, then the 1960-2010 CO2 rise from 317ppm to 390ppm alone will eventually produce about 0.9°C of warming, and possibly 1.2°C from all sources. This graphic from AR5 WG1 Ch10 shows the issues.
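The arithmetic above can be checked with the standard logarithmic relationship, where eventual warming is the sensitivity per doubling multiplied by the base-2 logarithm of the concentration ratio. The sensitivity of 3°C per doubling is the post's assumption, not an established figure:

```python
from math import log2

S = 3.0  # assumed equilibrium warming per doubling of CO2, in °C

c_1960, c_2010 = 317.0, 390.0  # ppm, as quoted in the post

# Eventual (equilibrium) warming from the CO2 rise alone
dT_co2 = S * log2(c_2010 / c_1960)
print(f"CO2-only equilibrium warming: {dT_co2:.2f} °C")  # ~0.90 °C

# The post's figure for all anthropogenic sources
dT_all = 1.2
# Observed warming of ~0.8 °C is then roughly two-thirds of the
# expected equilibrium response
print(f"observed / expected: {0.8 / dT_all:.0%}")
```

With these inputs the CO2-only figure comes out at 0.90°C, matching the text, and observed warming of 0.8°C is about two-thirds of the 1.2°C expected from all sources.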

The orange line of anthropogenic forcing accounts for nearly 100% of the measured post-1960 warming of around 0.8°C, shown by the large dots. Yet this is only about 60% of the warming expected from GHG rises if a doubling of CO2 will produce 3°C of warming. The issue is with the cluster of dots at the right of the graph, representing the pause, or slowdown, in warming around the turn of the century. I have produced a couple of charts that illustrate the problem.

In the first graph, the long-term impact on temperatures of the CO2 rise from 2003-2012 is 2.5 times that from 1953-1962. Similarly, from the second graph, the long-term impact on temperatures of the CO2 rise from 2000-2009 is 2.6 times that from 1950-1959. It is a darn funny lagged response if the rate of temperature rise can significantly slow down while the alleged dominant element causing it to rise accelerates. It could be explained by rising GHG emissions being a minor element in temperature rise, with natural factors both causing some of the warming in the 1976-1998 period, then reversing and causing cooling in the last few years.
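The decadal comparison can be reproduced in the same way. The concentrations below are approximate values taken from the ice-core and Mauna Loa records, not the post's own source data, so the ratio will not match the charts' 2.5-2.6 exactly; the point is only that the later decade's CO2 rise implies a much larger long-term temperature impact:

```python
from math import log2

S = 3.0  # assumed °C of equilibrium warming per doubling of CO2

# Approximate decade-end concentrations in ppm; assumed values, since
# the post does not state which dataset its charts use.
c_1950, c_1959 = 311.0, 316.0
c_2000, c_2009 = 369.5, 387.4

early = S * log2(c_1959 / c_1950)  # eventual warming from the 1950s rise
late = S * log2(c_2009 / c_2000)   # eventual warming from the 2000s rise
print(f"ratio of long-term impacts: {late / early:.1f}")
```

With these assumed values the ratio comes out near 3, the same order as the post's 2.5-2.6; yet the observed rate of warming slowed in the later decade.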

Kevin Marshall



Warming Bias in Temperature Data due to Consensus Belief not Conspiracy

In a Cliscep article Science: One Damned Adjustment After Another? Geoff Chambers wrote:-

So is the theory of catastrophic climate change a conspiracy? According to the strict dictionary definition, it is, in that the people concerned clearly conferred together to do something wrong – namely introduce a consistent bias in the scientific research, and then cover it up.

This was in response to the recent David Rose article in the Mail on Sunday, about claims that the infamous Karl et al 2015 paper breached the US National Oceanic and Atmospheric Administration’s (NOAA) own rules on scientific integrity.

I would counter this claim about conspiracy in respect of temperature records, even on the strict dictionary definition. Still less does it conform to a conspiracy theory in the sense of some group who, with a grasp of what they believe to be the real truth, act together to provide an alternative to that truth, or to divert attention and resources away from it, in the manner of an internet troll. A clue as to why this is the case comes from one of the most notorious Climategate emails: Kevin Trenberth to Michael Mann on Mon, 12 Oct 2009, copied to most of the leading academics in the “team” (including Thomas R. Karl).

The fact is that we can’t account for the lack of warming at the moment and it is a travesty that we can’t. The CERES data published in the August BAMS 09 supplement on 2008 shows there should be even more warming: but the data are surely wrong. Our observing system is inadequate.

It is the first sentence that is commonly quoted, but it is the last part that is most relevant for temperature anomalies. There are inevitably a number of homogenisation runs behind any single set of anomalies. For example, the Reykjavik temperature data was (a) adjusted by the Iceland Met Office by standard procedures to allow for known local biases; (b) adjusted for GHCNv2 (the “raw data”); (c) adjusted again in GHCNv3; and (d) homogenized by NASA for inclusion in Gistemp.
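The multi-step chain matters because each stage applies its own offset on top of the last, so the published figure can drift well away from the thermometer reading. A toy sketch, with every number invented purely for illustration and not taken from the actual Reykjavik record:

```python
# Toy illustration of a multi-step adjustment chain. All values are
# invented for illustration; they are not real Reykjavik data.
raw = 2.10  # hypothetical monthly mean temperature, °C

adjustments = [
    ("Iceland Met Office (local biases)", -0.30),
    ("GHCNv2", +0.10),
    ("GHCNv3", +0.20),
    ("NASA Gistemp homogenisation", +0.15),
]

value = raw
for stage, delta in adjustments:
    value += delta
    print(f"after {stage}: {value:+.2f} °C")

# The final published value differs from the thermometer reading by
# the sum of all four offsets, and re-running any later stage on new
# data can shift it again.
```

The point of the sketch is that without an audit trail, only the final figure is visible, and nothing ties it back to the original reading.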

There are steps that I have missed, and Gistemp certainly homogenizes the data quite frequently as new data comes in. As Paul Matthews notes, adjustments are unstable. Although one data set might on average be pretty much the same as previous ones, quite large anomalies will be thrown out every time the algorithms are re-run for new data. What is more, due to the nature of the computer algorithms there is no audit trail, so the adjustments are largely unexplainable with reference to the data before, let alone the original thermometer readings. So how does one know whether the adjustments are reasonable or not, except through a belief in how the results ought to look?

In the case of climatologists like Kevin Trenberth and Thomas R. Karl, variations that show warmer than the previous run will be more readily accepted as correct than variations that show cooler. That is, they will find reasons why a particular temperature data set now shows greater warming than before, but will reject as outliers results that show less warming. It is the same when choosing techniques, or adjusting for biases in the data. This is exacerbated when a number of different bodies with similar belief systems try to seek a consensus of results, as Zeke Hausfather alludes to in his article at the CarbonBrief. Rather than being verified against the real world, temperature data is made to conform to the opinions of others with similar beliefs about the world.

Kevin Marshall

The Climate Alarmist Reaction to a Trump Presidency

A few weeks ago Cliscep had a piece, Trump, climate and the future of the world, that looked at the climate community’s immediate reactions to the surprise victory in the US Presidential election. Brad Keyes noted Joe Romm’s piece Will President Trump pull the plug on a livable climate?. In support of this, Romm stated

Indeed, one independent firm, Lux Research, projected last week that “estimated emissions would be 16 percent higher after two terms of Trump’s policies than they would be after two terms of Clinton’s, amounting to 3.4 billion tons greater emissions over the next eight years.”

There is a little graph to sort of back this up.

Whilst Romm then gives two reasons why he does not think emissions will rise by that much (Trump will cause a massive recession, and will not win a second term), he goes on to state:-

That said, the damage and delay that even a one-term President Trump could do will make the already difficult task of keeping total warming well below 2°C essentially impossible.

So a difference of much less than 3.4 GtCO2e over eight years will make keeping total warming well below 2°C essentially impossible.
Before looking at the evidence that contradicts this, consider the even more bizarre claims made by the expert climate scientists at RealClimate. They use a different graph, which is probably a couple of years old, and explain:-

Here are some numbers. Carbon emissions from the United States have been dropping since the year 2000, more than on-track to meet a target for the year 2020. Perhaps with continued effort and improving technology, emissions might have dropped to below the 2020 target by 2020, let’s say to 5 gigatons of CO2 per year (5000 megatons in the plot). In actuality, now, let’s say that removing restrictions on energy inefficiency and air pollution could potentially lead to US emissions by 2020 of about 7 gigatons of CO2. This assumes that future growth in emissions followed the faster growth rates from the 1990’s.
Maybe neither of these things will happen exactly, but these scenarios give us a high-end estimate for the difference between the two, which comes to about 4 gigatons of CO2 over four years. There will also probably be extra emissions beyond 2020 due to the lost opportunity to decarbonize and streamline the energy system between now and then. Call it 4-6 gigatons of Trump CO2.
This large quantity of gas can be put into the context of what it will take to avoid the peak warming threshold agreed to in Paris. In order to avoid exceeding a very disruptive warming of 1.5 °C with 66% probability, humanity can release approximately 220 gigatons of CO2 after January, 2017 (IPCC Climate Change 2014 Synthesis report, Table 2.2, corrected for emissions since 2011). The 4-6 Gtons of Trump CO2 will not by itself put the world over this threshold. But global CO2 emission rates are now about 36 gigatons of CO2 per year, giving a time horizon of only about six years of business-as-usual (!) before we cross the line, leaving basically no time for screwing around. To reach the catastrophic 2 °C, about 1000 gigatons of CO2 remain (about 20 years of business as usual). Note that these estimates were done before global temperatures spiked since 2014 — we are currently at 1.2 °C! So these temperature boundaries may be closer than was recently thought.

RealClimate comes up with nearly twice the difference estimated by Joe Romm / Lux Research, but at least admits in the final paragraph that whoever won would not make much difference.
There are two parts to putting these analyses into context – the US context and the global one.
In the USA, emissions have indeed been falling since 2000, despite a growing population. The rate of decline increased significantly in the years of the Obama Presidency, but for reasons quite separate from actions to reduce emissions. First, there was the credit crunch, followed by the slowest recovery in US history. Second, the high oil price encouraged emissions reductions, along with the loss of energy-intensive industries to countries with lower energy costs. Third, the shale gas revolution has meant switching from coal to gas in electricity production.
But the global context is even more important. RealClimate does acknowledge the global figure, but only mentions CO2 emissions. The 36 GtCO2 is only two-thirds of total greenhouse gas emissions of about 55 GtCO2e, and that figure is rising by 1-2% a year. The graph (reproduced from the USA INDC submission to the UNFCCC) clearly states that it is in million tonnes of carbon dioxide equivalent. What is more, these are vague policy proposals that President Obama would have been unable to get through Congress. Further, most of the proposed emission reductions came from extrapolating trends of what has been happening without any policy intervention.
If the 1.5°C limit is breached after 220 GtCO2e of additional emissions, it will be breached in the run-up to Christmas 2020. The 1000 GtCO2e for the 2°C limit was from 2011. By simple arithmetic it is now below 800 GtCO2e, with about 15 years remaining, if (a) a doubling of CO2 levels (or equivalent GHG gases) leads to 3°C of warming, (b) the estimated quantity of emissions per unit rise in atmospheric gas levels is correct, and (c) the GHGs emitted are retained in the atmosphere for a very long period.
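The budget arithmetic can be laid out explicitly. One reading of the figures quoted above is that the 2011 budget has been drawn down at RealClimate's CO2-only rate of 36 Gt/yr, while the years remaining are counted at the all-GHG rate of 55 GtCO2e/yr:

```python
# Carbon-budget arithmetic from the figures quoted in the post.
annual_ghg = 55.0  # GtCO2e per year, total greenhouse gases
annual_co2 = 36.0  # GtCO2 per year, CO2 only (RealClimate's figure)

# 1.5 °C: 220 Gt remaining from January 2017
years_to_1p5 = 220.0 / annual_ghg
print(f"1.5 °C budget exhausted after {years_to_1p5:.1f} years")  # 4.0 -> late 2020

# 2 °C: 1000 Gt remaining from 2011, drawn down for six years
remaining_2C = 1000.0 - 6 * annual_co2
years_to_2C = remaining_2C / annual_ghg
print(f"2 °C budget: {remaining_2C:.0f} Gt left, ~{years_to_2C:.1f} years")
```

This gives exactly four years from January 2017 for the 1.5°C budget (hence the run-up to Christmas 2020), and a 2°C budget of 784 Gt (below 800) lasting a little over 14 years, in line with the "about 15 years remaining" above.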
Even simple arithmetic is not required. Prior to the Paris talks, the UNFCCC aggregated all the INDCs (including that of the USA, with the emissions cuts shown in the graph above) and compared the global total to the approximate emissions pathways for 1.5°C and least-cost 2°C warming. The updated version, post-Paris, is below.

The difference Donald Trump will make is somewhere in the thickness of the thick yellow line. There is no prospect of achieving the aimed-for blue emissions pathways. No amount of ranting or protests at President-elect Trump will change the insignificant difference the United States can make with any politically acceptable and workable set of policies, in a country with less than a twentieth of the global population and less than one seventh of global emissions.

Kevin Marshall