How strong is the Consensus Evidence for human-caused global warming?

You cannot prove a vague theory wrong. If the guess that you make is poorly expressed and the method you have for computing the consequences is a little vague then ….. you see that the theory is good as it can’t be proved wrong. If the process of computing the consequences is indefinite, then with a little skill any experimental result can be made to look like an expected consequence.

Richard Feynman – 1964 Lecture on the Scientific Method

It’s self-evident that democratic societies should base their decisions on accurate information. On many issues, however, misinformation can become entrenched in parts of the community, particularly when vested interests are involved. Reducing the influence of misinformation is a difficult and complex challenge.

The Debunking Handbook 2011 – John Cook and Stephan Lewandowsky

My previous post looked at the attacks on David Rose for daring to suggest that the rapid fall in global land temperatures at the end of the El Nino event was strong evidence that the record highs in global temperatures were not due to human greenhouse gas emissions. The technique used against him was to look at long-term linear trends. The main problems with this argument were
(a) according to AGW theory, warming rates from CO2 alone should be accelerating, and at a higher rate than the estimated linear warming rates from HADCRUT4;
(b) HADCRUT4 shows that warming stopped from 2002 to 2014, yet in theory the warming from CO2 should have accelerated.

Now there are at least two ways to view my arguments. The first is Feynman’s approach. The climatologists and associated academics attacking journalist David Rose chose to do so from a very blurred specification of AGW theory: human emissions will cause greenhouse gas levels to rise, which will cause global average temperatures to rise. Global average temperatures have clearly risen in all the long-term (>40 year) data sets, so the theory is confirmed. On a rising trend, with large variations due to natural variability, any new records will be primarily “human-caused”. But making the theory and the data slightly less vague reveals the opposite conclusion. Around the turn of the century the annual percentage increase in CO2 levels went from 0.4% to 0.5% a year (figure 1), which should have led to an acceleration in the rate of warming. In reality warming stalled.
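To make that expectation less vague, here is a minimal numerical sketch of my own (not from the post) using the standard simplified forcing relation F = 5.35 ln(C/C0): with CO2 growing at a constant percentage rate, the forcing rises by a fixed amount each year, and a higher growth rate raises that amount proportionally.

```python
import math

# Simplified CO2 radiative forcing: F = 5.35 * ln(C / C0) in W/m^2.
# With a constant fractional growth rate g, the forcing rises by a constant
# 5.35 * ln(1 + g) W/m^2 each year, so a step up in g should appear as a
# step up in the underlying warming rate.
def annual_forcing_increment(g):
    return 5.35 * math.log(1.0 + g)

for g in (0.004, 0.005):  # 0.4% and 0.5% per year
    print(f"CO2 growing at {g:.1%}/yr -> forcing rises by "
          f"{annual_forcing_increment(g):.4f} W/m^2 per year")
```

On these figures the move from 0.4% to 0.5% a year implies roughly a 25% faster rise in forcing, which is why a slowdown in warming at exactly that point is the opposite of what the theory predicts.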

The reaction was to come up with a load of ad hoc excuses. The Hockey Schtick blog had collected 66 separate excuses for the “pause” by November 2014, ranging from the peer-reviewed literature to a comment in the UK Parliament. This could be because climate is highly complex, with many variables: whether each factor is present at all can only be guessed at, let alone its magnitude and its interrelationships with all the other factors. So how do you tell which statements are valid information and which are misinformation? I agree with Cook and Lewandowsky that misinformation is pernicious, and difficult to get rid of once it becomes entrenched. So how does one distinguish between the good information and the bad, misleading or even pernicious?

The Lewandowsky / Cook answer is to follow the consensus of opinion. But what is the consensus of opinion? In climate, one variation is to follow the small subset of academics in the area who answer in the affirmative to the following two questions.

1. When compared with pre-1800s levels, do you think that mean global temperatures have generally risen, fallen, or remained relatively constant?

2. Do you think human activity is a significant contributing factor in changing mean global temperatures?

The problem is that the first question is just reading a graph, and the second is a belief statement with no precision. Anthropogenic global warming has been a hot topic for over 25 years now, yet these two very vague, empirically-based questions form the foundations of the subject. They should be capable of being formulated more precisely. For the second, that would mean having pretty clear and unambiguous estimates of the percentage of the warming, so far, that is human-caused. On that, the consensus of leading experts is unable to say whether it is 50% or 200% of the warming so far. (There are meant to be time lags and factors like aerosols that might suppress the warming.) This from the 2013 UNIPCC AR5 WG1 SPM section D3:-

It is extremely likely that more than half of the observed increase in global average surface temperature from 1951 to 2010 was caused by the anthropogenic increase in greenhouse gas concentrations and other anthropogenic forcings together.

The IPCC, encapsulating the state-of-the-art knowledge, cannot provide firm evidence in the form of a percentage, or even a fairly broad range, even with over 60 years of data to work on. It is even worse than it appears. The “extremely likely” phrase is a Bayesian probability statement. Ron Clutz’s simple definition from earlier this year was:-

Here’s the most dumbed-down description: Initial belief plus new evidence = new and improved belief.

For the IPCC to claim, at the fifth attempt, that their statement is extremely likely, they should be able to show some sort of progress in updating their beliefs in response to new evidence. That would mean narrowing the estimate of the magnitude of impact of a doubling of CO2 on global average temperatures. As Clive Best documented in a cliscep comment in October, the IPCC reports from 1990 to 2013 failed to change the estimated range of 1.5°C to 4.5°C. Looking up Climate Sensitivity in Wikipedia we get the origin of the range estimate.

A committee on anthropogenic global warming convened in 1979 by the National Academy of Sciences and chaired by Jule Charney estimated climate sensitivity to be 3 °C, plus or minus 1.5 °C. Only two sets of models were available; one, due to Syukuro Manabe, exhibited a climate sensitivity of 2 °C, the other, due to James E. Hansen, exhibited a climate sensitivity of 4 °C. “According to Manabe, Charney chose 0.5 °C as a not-unreasonable margin of error, subtracted it from Manabe’s number, and added it to Hansen’s. Thus was born the 1.5 °C-to-4.5 °C range of likely climate sensitivity that has appeared in every greenhouse assessment since…

It is revealing that this quote sits under the subheading Consensus Estimates. The climate community has collectively failed to update the original beliefs, which were based on a very rough estimate. The emphasis on referring to consensus beliefs about the world, rather than looking outward for evidence in the real world, is, I would suggest, the primary reason for this failure. Yet such community-based beliefs completely undermine the integrity of the Bayesian estimates, making their use in statements about climate clear misinformation in Cook and Lewandowsky’s use of the term. What is more, those in the climate community who look primarily to these consensus beliefs rather than to the data of the real world will endeavour to dismiss the evidence, or make up ad hoc excuses, or smear those who try to disagree.

A caricature of these perspectives with respect to global average temperature anomalies is available in the form of a flickering widget at John Cook’s skepticalscience website. This purports to show the difference between “realist” consensus and “contrarian” non-consensus views. Figure 2 is a screenshot of the consensus view, interpreting warming as a linear trend. Figure 3 is a screenshot of the non-consensus or contrarian view, which is supposed to interpret warming as a series of short, disconnected periods of no warming, each of which just happens to be at a higher level than the previous one. There are a number of things that this indicates.

(a) The “realist” view is of a linear trend throughout any data series. Yet the period from around 1940 to 1975 shows no warming, or slight cooling, depending on the data set. Therefore any linear trend line derived for a period starting earlier than 1970–1975 and ending in 2015 will show a lower rate of warming. That would be consistent with the rate of CO2 increase rising over time, as shown in figure 1. But shorten the period, again ending in 2015, and once it becomes less than about 30 years the warming trend also decreases. This contradicts the theory, unless ad hoc excuses are used, as shown in my previous post using the HADCRUT4 data set (a sketch of the trend calculation follows this list).

(b) Those who agree with the consensus are called “Realists”, despite looking inwards towards common beliefs, while those who disagree are labelled “Contrarians”. That is not inaccurate when there is a dogmatic consensus. But it is utterly false to lump all those who disagree together as holding the same views, especially when no examples are provided of anyone who actually holds such views.

(c) The linear trend appears a more plausible fit than the series of “contrarian” lines. By implication, those who disagree with the consensus are presented as having a distinctly more blinkered and distorted perspective than those who follow it. Yet even using the GISTEMP data set (which gives the greatest support to the consensus view) there is a clear break in the linear trend. The less partisan HADCRUT4 data shows an even greater break.
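A minimal sketch of the trend comparison in (a), assuming the annual global mean anomalies (HADCRUT4 or GISTEMP) have been saved locally as a two-column CSV of year and anomaly – the filename here is hypothetical:

```python
import numpy as np

# Hypothetical local file: one "year,anomaly" row per year of the chosen data set.
years, anoms = np.loadtxt("hadcrut4_annual.csv", delimiter=",", unpack=True)

END = 2015
for start in (1880, 1910, 1940, 1975, 1985, 2002):
    mask = (years >= start) & (years <= END)
    slope_per_year = np.polyfit(years[mask], anoms[mask], 1)[0]  # °C per year
    print(f"{start}-{END}: {slope_per_year * 10:+.3f} °C/decade")
```

Start years before the mid-1970s pull the trend down because of the flat 1940–1975 period, while windows of less than about 30 years ending in 2015 also show a reduced trend because of the post-2002 slowdown.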

Those who spot the obvious – that around the turn of the century warming stopped or slowed down, when in theory it should have accelerated – are given a clear choice. They can conform to the scientific consensus, denying the discrepancy between theory and data. Or they can act as scientists, denying the false and empirically empty scientific consensus, and receive the full weight of the false and career-damaging opprobrium that accompanies such dissent.

Figure 2: screenshot of the skepticalscience widget’s “realist” (consensus) view – warming as a linear trend.

Figure 3: screenshot of the widget’s “contrarian” (non-consensus) view – warming as a series of disconnected flat periods.

Kevin Marshall

 

Hiroshima Bombs of Heat Accumulation – Skeptical Science reversing scientific reality

The Skeptical Science blog has a little widget that counts the heat the climate system has accumulated since 1998 in terms of Hiroshima atomic bombs.

One of the first uses of the Hiroshima bomb analogy was by skepticalscience.com stalwart Dana Nuccitelli, in the Guardian.

The rate of heat building up on Earth over the past decade is equivalent to detonating about 4 Hiroshima atomic bombs per second. Take a moment to visualize 4 atomic bomb detonations happening every single second.

But what does this mean in actual heat energy? I did a search, and found that the estimated heat generated by the Hiroshima bomb was about 63 TJ (terajoules), or 63 × 10¹² joules. A quick calculation reveals the widget actually uses 62 TJ, so I will use that lower value. It is a huge number. The energy was sufficient to kill over 100,000 people, cause horrific injuries to many more, and destroy every building within a large radius of the blast site. Yet in the last 17 years the climate system has accumulated over two billion times that energy.
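As a quick order-of-magnitude check on that “two billion” figure, using only the numbers already quoted (the widget’s 62 TJ per bomb and four bombs per second):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600
BOMB_JOULES = 62e12                  # heat per Hiroshima bomb used by the widget

bombs = 4 * SECONDS_PER_YEAR * 17    # four bombs per second for 17 years
print(f"{bombs:.2e} bombs, about {bombs * BOMB_JOULES:.2e} joules accumulated")
```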

Most of that energy goes into the oceans, so I was curious to estimate the impact that this phenomenal heat accumulation would have on the average temperature of the oceans. Specifically, how long would it take to heat the oceans by 1°C?

The beauty of metric measurements is that mass and volume are tied together through water: a litre of water weighs a kilogram, and a cubic metre weighs a tonne. I will ignore the slight differences due to the impurities of sea water for this exercise.

The metric unit of energy, the joule, is not quite so easy to relate to water. The old British thermal unit is better, being the quantity of energy sufficient to raise a pound of water through 1°F. Knowing that 1 lb = 454 g, 1.8°F = 1°C and 1 btu ≈ 1055 J means that about 4.2 joules is the energy needed to raise 1 gram of water by one degree Celsius.

So the Hiroshima bomb had the energy to raise (62 × 10¹²)/4.2 ≈ 15 × 10¹² grams of water through one degree.

That is 15 × 10⁹ kilograms (litres) of water, or 15 × 10⁶ tonnes (cubic metres) of water – the volume of a lake of 1 square kilometre in area, with an average depth of 15 metres.
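The same arithmetic in a few lines, using only figures already given in the text (62 TJ per bomb and the btu-derived specific heat of water):

```python
BOMB_JOULES = 62e12                 # joules per Hiroshima bomb (the widget's value)
SPECIFIC_HEAT = 1055 * 1.8 / 454    # J per gram per °C from the btu conversion (~4.18)

grams = BOMB_JOULES / SPECIFIC_HEAT # grams of water warmed by 1 °C per bomb
tonnes = grams / 1e6                # 1 tonne = 1e6 g = 1 cubic metre of water
depth_m = tonnes / 1e6              # spread over a 1 km² (1e6 m²) lake
print(f"{tonnes:.2e} tonnes per bomb: a 1 km² lake about {depth_m:.0f} m deep")
```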

The largest lake in England is Lake Windermere, which has a volume of approximately 1 cubic kilometre, or 1 billion tonnes of water. (The biggest freshwater lake in the United Kingdom by volume is Loch Ness, with about 9 km³ of water.)

It would take the energy of 67 Hiroshima bombs to heat Lake Windermere by 1 degree. Put another way, the oceans are accumulating heat at a rate that would raise the temperature of this lake by one degree roughly every 17 seconds.
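The Windermere comparison, again with only the numbers already used above (roughly 1 km³ of water, 62 TJ per bomb, four bombs per second):

```python
BOMB_JOULES = 62e12                 # joules per Hiroshima bomb (the widget's value)
SPECIFIC_HEAT = 1055 * 1.8 / 454    # J per gram per °C (~4.18)
LAKE_GRAMS = 1e15                   # Lake Windermere: ~1 km³ = 1e9 tonnes = 1e15 g

bombs = LAKE_GRAMS * SPECIFIC_HEAT / BOMB_JOULES
print(f"about {bombs:.0f} bombs to warm the lake by 1 °C, "
      f"or roughly {bombs / 4:.0f} seconds at 4 bombs per second")
```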

Although Lake Windermere can look quite large when standing on its shoreline, it is tiny relative to the Great Lakes, let alone the oceans of the world. With a total area of about 360,000,000 km² and an average depth of at least 3,000 metres, the oceans have a volume of about 1,080,000,000 km³, or contain about 1.08 × 10¹⁸ tonnes of water. If all the heat absorbed by the global climate system since 1998 went into the oceans, it would take about 18 billion seconds to raise the average ocean temperature by 1°C. That is 5,000,000 hours, or over 208,000 days, or about 570 years.
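Putting the whole ocean calculation together, with the area, depth, specific heat and bomb energy all as given in the text:

```python
BOMB_JOULES = 62e12                   # joules per Hiroshima bomb (the widget's value)
SPECIFIC_HEAT = 1055 * 1.8 / 454      # J per gram per °C (~4.18)
OCEAN_KM3 = 360e6 * 3.0               # ~360 million km² times ~3 km average depth
OCEAN_GRAMS = OCEAN_KM3 * 1e15        # 1 km³ of water ≈ 1e15 g

seconds = OCEAN_GRAMS * SPECIFIC_HEAT / (4 * BOMB_JOULES)   # at 4 bombs per second
years = seconds / (365.25 * 24 * 3600)
print(f"{seconds:.2e} s ≈ {years:.0f} years to warm the oceans by 1 °C")
print(f"equivalent to about {10 / years:.4f} °C per decade")
```

This reproduces the roughly 570-year figure, and the per-decade equivalent used below, to within rounding.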

Here I am slightly exaggerating the warming rate: the UNIPCC estimates that only 93% of the extra heat absorbed by the climate system goes into the oceans.

But have I got this wrong by a huge margin? The standard way of stating warming rates – used by the UNIPCC – is in degrees centigrade per decade, the same metric used for average surface temperatures. Warming of one degree in 570 years becomes 0.0175°C/decade. In Chapter 3 of the UNIPCC AR5 Working Group 1 Report, the relevant evidence is Figure 3.3 (a) on page 263.

The ocean below about 1000 metres, or more than two-thirds of the water volume, is warming at a rate of less than 0.0175°C/decade. This may be an overstatement. Below 2000 metres, the average water temperature rise is around 0.005°C/decade, or 1°C of temperature rise every 2000 years.

The energy of four Hiroshima bombs a second is trivial on a global scale. It causes an amount of temperature change that is barely measurable on a year-on-year basis.

There are two objectives that I believe the Skeptical Science team are trying to achieve with their little widget.

The first objective is to reverse people’s perception of reality. Nuclear explosions are clearly seen by everybody: you do not have to be an expert to detect one if you are within a thousand miles of the detonation. Set one off anywhere in the world, even deep underground, and sensitive seismic detectors will register the event from the other side of the globe. Rejection of the evidence of a blast can only be on the basis of clear bias or lying.

But changes of thousandths of a degree in the unimaginable vastness of the oceans, overlaid with shifting currents and seasonal cycles, are not detectable with a single instrument, or even thousands of such instruments. Measuring them requires careful collation and aggregation of the data, with computer modelling filling in the gaps. Small biases in the modelling techniques, whether known or unknown, arising for technical reasons or from the desire to get a particular result, will matter more than the accuracy of the instruments. Even without these issues, there is the small matter of using ten years of good-quality data, and longer periods of sparser and lower-quality data, to determine the underlying trends and their causes. Understanding the nature of this measurement problem puts the onus on anyone claiming there is only one possible answer to substantiate that claim.

The second objective is to turn a very tiny change, over the very short period for which we have data, into a perception of a scientifically-validated catastrophic problem in the present. Whether it is a catastrophic problem relies on the projections of climate models.

It is easy to see why Skeptical Science needs this switch in the public perception of reality. A true understanding of climate heat accumulation means awareness of the limits and boundaries of our current knowledge. That requires a measure of humility, and a recognition of when existing knowledge is undermined. It is an inter-disciplinary subject that could support a whole range of conclusions of equal merit. It does not accord with their polarized vision of infallible enlightened scientists against a bunch of liars and ignoramuses who get nothing right.

Kevin Marshall