Nature tacitly admits the IPCC AR5 was wrong on Global Warming

There has been a lot of comment on a recent paper in Nature Geoscience, “Emission budgets and pathways consistent with limiting warming to 1.5°C” (hereafter Millar et al. 2017).

When making a case for public policy, I believe that something akin to a process of due diligence should be carried out on the claims. That is, the justifications ought to be scrutinized to validate the claims. With Millar et al. 2017, there are a number of issues with the make-up of the claims (a) that warming of 1.5°C or greater will be reached without policy, and (b) that constraining emissions to within a given budget will prevent that warming being exceeded.

The baseline warming

The introduction states
Average temperatures for the 2010s are currently 0.87°C above 1861–80,

A similar quote comes from UNIPCC AR5 WG1 SPM, page 5:

The total increase between the average of the 1850–1900 period and the 2003–2012 period is 0.78 [0.72 to 0.85] °C, based on the single longest dataset available.

These figures are all from the HADCRUT4 dataset. Three factors account for the difference of 0.09°C. Mostly it is the shorter, and different, baseline period of 1861–80 rather than 1850–1900. In addition, the last three years have been influenced by a powerful and natural El Niño, and the IPCC figure is an average over the ten years 2003–2012 rather than over the 2010s to date.

The warming in the pipeline

There are valid reasons for the authors differing from the IPCC’s methodology. They start with the emissions from 1870 (even though emissions estimates go back to 1850). Also, without a definite finish date it is very difficult to calculate the warming impact to date. Consider first the full version of the sentence partially quoted above.

Average temperatures for the 2010s are currently 0.87°C above 1861–80, which would rise to 0.93°C should they remain at 2015 levels for the remainder of the decade.

This implies that there is some warming still to come through from the impact of the higher greenhouse gas levels. But it seems a remarkably small amount, realised over a very short time period. Of course, not all the warming since the mid-nineteenth century is from anthropogenic greenhouse gas emissions. The anthropogenic element is just guesstimated, as shown in AR5 WG1 Ch10, page 869:

More than half of the observed increase in global mean surface temperature (GMST) from 1951 to 2010 is very likely due to the observed anthropogenic increase in greenhouse gas (GHG) concentrations.

It was after 1950 that the largest increase in CO2 levels was experienced. From 1870 to 1950, CO2 levels rose from around 290ppm to 310ppm, or 7%. From 1950 to 2010, CO2 levels rose from around 310ppm to 387ppm, or 25%. Add in other GHGs and the human-caused warming should be 3–4 times greater in the later period than in the earlier one, whereas the warming in the later period was just over twice the amount. Therefore, if there is just over a 90% chance (very likely in IPCC-speak) that over 50% of the post-1950 warming was human-caused, a statistical test relating to a period more than twice as long would support a lower human-caused share of the warming. Even then, I view the greater-than-50% statistic as deeply flawed, especially post-2000, when the rate of rise in CO2 levels accelerated whilst the rise in average temperatures dramatically slowed.

This suggests two things. First, it could be explained by rising GHG levels being a minor element in the temperature rise, with natural factors causing some of the warming in the 1976–1998 period, then reversing and causing cooling in the last few years. Second, there could be a darn funny lagged response of temperatures to rising GHGs (especially CO2); that is, the amount of warming in the pipeline has increased dramatically. If either idea has any traction, then the implied warming to come of just 0.06°C is a false estimate. This needs to be elaborated.
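
To put a rough number on that comparison, here is a minimal sketch in Python using the standard logarithmic relationship between CO2 concentration and forcing. The ppm values are the approximate ones quoted above, and other GHGs are ignored.

```python
import math

# Approximate CO2 concentrations quoted above (ppm)
c_1870, c_1950, c_2010 = 290.0, 310.0, 387.0

# Under the logarithmic forcing relationship, the eventual warming contribution
# of each period is proportional to ln(C_end / C_start).
early = math.log(c_1950 / c_1870)   # 1870-1950
late = math.log(c_2010 / c_1950)    # 1950-2010

print(f"1870-1950 forcing increment: {early:.3f}")
print(f"1950-2010 forcing increment: {late:.3f}")
print(f"Ratio (later / earlier):     {late / early:.1f}")   # roughly 3.3 for CO2 alone
```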

Climate Sensitivity

If a doubling of CO2 leads to 3.00°C of warming (the assumption of the IPCC in their emissions calculations), then a rise in CO2 levels from 290ppm to 398 ppm (1870 to 2014) eventually gives 1.37°C of warming. With other GHGs this figure should be around 1.80°C. Half that warming has actually occurred, and some of that is natural. So there is well over 1.0°C still to emerge. It is too late to talk about constraining warming to 1.5°C as the cause of that warming has already occurred.

The implication of the paper’s claim that 0.94°C of warming will result from human emissions over the period 1870–2014 is to reduce the climate sensitivity estimate to around 2.0°C for a doubling of CO2 if only CO2 is considered, or to around 1.5°C if all GHGs are taken into account (see below). Compare this to AR5 WG1 section D.2, Quantification of Climate System Responses:

The equilibrium climate sensitivity quantifies the response of the climate system to constant radiative forcing on multicentury time scales. It is defined as the change in global mean surface temperature at equilibrium that is caused by a doubling of the atmospheric CO2 concentration. Equilibrium climate sensitivity is likely in the range 1.5°C to 4.5°C (high confidence), extremely unlikely less than 1°C (high confidence), and very unlikely greater than 6°C (medium confidence).

The implied equilibrium climate sensitivity (ECS) is at the very bottom of the IPCC’s range, and the equilibrium climate response is reached in 5–6 years instead of on multicentury time scales. This is on top of the implied assumption that there is no net natural warming between 1870 and 2015.
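
The arithmetic behind the figures in this section can be checked with a short sketch. It assumes the simple equilibrium relationship dT = ECS × log2(C/C0), treats the response as fully realised, and considers CO2 only; the concentrations are those used above.

```python
import math

def equilibrium_warming(c_start, c_end, ecs):
    """Eventual warming (°C) for a rise in CO2, assuming dT = ECS * log2(C_end / C_start)."""
    return ecs * math.log(c_end / c_start, 2)

# Warming eventually implied by the 1870-2014 CO2 rise (290ppm to 398ppm) at ECS = 3.0
print(round(equilibrium_warming(290, 398, ecs=3.0), 2))   # ~1.37 °C, as stated above

# ECS implied if 0.94 °C is the full equilibrium response to that same CO2 rise
print(round(0.94 / math.log(398 / 290, 2), 2))            # ~2.06 °C per doubling, CO2 only
```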

How much GHG emissions?

With respect to policy, insofar as global warming is caused by human greenhouse gas emissions, preventing further human-caused warming requires reducing, and possibly eliminating, global greenhouse gas emissions. In conjunction with the publication of the AR5 Synthesis Report, the IPCC produced a slide show of the policy case laid out in the three vast reports, effectively a short summary of a summary of the Synthesis Report. Approaching the policy climax at slide 30 of 35:-

Apart from the fact that the policy objective in AR5 was to limit warming to 2°C, not 1.5°C, it also mentions the need to constrain GHG emissions, not just CO2 emissions. Then slide 33 gives the simplified policy position for keeping warming to 2°C.

To the end of 2011, an estimated 1900 GtCO2e of GHGs had been emitted, whilst around a further 1000 GtCO2e could be emitted before the 2°C of warming is reached.

That is the highly simplified version. At the other end of the scale, AR5 WG3 Ch6 p431 has a very large table in a very small font covering a lot of the policy options. It is reproduced below, though at much poorer resolution than the original.

Note 3 states

For comparison of the cumulative CO2 emissions estimates assessed here with those presented in WGI AR5, an amount of 515 [445 to 585] GtC (1890 [1630 to 2150] GtCO2), was already emitted by 2011 since 1870

The top line is for 1.5°C of warming – the most ambitious policy aim. Of note:-

  • The CO2 equivalent concentration in 2100 (ppm CO2eq ) is 430-480ppm.
  • Cumulative CO2 emissions (GtCO2) from 2011 to 2100 is 630 to 1180.
  • CO2 concentration in 2100 is 390-435ppm.
  • Peak CO2 equivalent concentration is 465-530ppm. This is higher than the 2100 concentration and, if it were for CO2 alone with ECS = 3, would eventually produce 2.0°C to 2.6°C of warming (see the sketch after this list).
  • The probability of exceeding 1.5°C in 2100 is 49-86%. They had to squeeze really hard to be able to say that staying below 1.5°C was more than 50% likely.
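
As a check on the figure in the fourth bullet, a minimal sketch treating the peak CO2-equivalent concentration as if it were CO2 alone, with ECS = 3 and the same 290ppm starting point as before:

```python
import math

# Peak CO2-equivalent range from Table 6.3, treated as if it were CO2 alone,
# with ECS = 3 and a 290ppm starting point
for peak_ppm in (465, 530):
    warming = 3.0 * math.log(peak_ppm / 290, 2)
    print(peak_ppm, round(warming, 1))   # 465 -> ~2.0 °C, 530 -> ~2.6 °C
```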

Compare the figures from Table 6.3 above to this, from the abstract of Millar et al. 2017:

If CO2 emissions are continuously adjusted over time to limit 2100 warming to 1.5°C, with ambitious non-CO2 mitigation, net future cumulative CO2 emissions are unlikely to prove less than 250 GtC and unlikely greater than 540 GtC. Hence, limiting warming to 1.5°C is not yet a geophysical impossibility, but is likely to require delivery on strengthened pledges for 2030 followed by challengingly deep and rapid mitigation.

They use tonnes of carbon as the unit of measure, as against CO2 equivalent. The conversion factor is 3.664, so cumulative CO2 emissions need to be in the 870-1010 GtCO2 range. As this is to the end of 2015, not 2011 as in the IPCC report, it will be different. Subtracting 150 from the IPCC report’s figures would give a range of 480 to 1030. That is, Millar et al. 2017 have reduced the emissions range by 75%, to the top end of the IPCC’s range. Given that the IPCC considered a range of 1.5-1.7°C of warming, it seems somewhat odd to then say it relates to the lower end of the warming band, until you take into account that ECS has been reduced. But then why curtail the range of emissions instead of calculating your own? It appears that again the authors are trying to squeeze a result within existing constraints.
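
The conversion factor itself follows from the molar masses of CO2 and carbon, and can be checked against the figure in Note 3 above:

```python
# GtC to GtCO2: the factor is the ratio of the molar masses of CO2 and carbon
factor = 44.01 / 12.011
print(round(factor, 3))       # ~3.664

# Check against Note 3 above: 515 GtC emitted by 2011 since 1870
print(round(515 * factor))    # ~1887 GtCO2, matching the quoted 1890 GtCO2
```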

However, this does not take into account the much higher levels of peak CO2 equivalent concentrations in Table 6.3. Peak CO2 equivalent concentrations are around 75-95ppm higher than the CO2 concentrations in 2100. Compare this to the green line in the central graph in Millar et al. 2017.

This is less than 50ppm higher than in 2100. Further, in 2100 Millar et al. 2017 has CO2 levels of around 500ppm, as against a mid-point of 410ppm in AR5. CO2 rising from 290ppm to 410ppm with ECS = 3.0 would eventually produce 1.50°C of warming. CO2 rising from 290ppm to around 500ppm with ECS = 2.0 would eventually produce a similar amount (see the sketch below). Further, this does not include the warming impact of other GHGs. To squeeze into the 1.5°C band, the mid-century overshoot in Millar et al. 2017 is much less than in AR5. This might be required by the modelling assumptions, due to the very short time assumed for reaching the full equilibrium climate response.
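
Using the same logarithmic relationship as in the earlier sketch, the comparison in this paragraph comes out as follows (290ppm starting point, equilibrium response treated as fully realised, CO2 only):

```python
import math

def equilibrium_warming(c_start, c_end, ecs):
    # Eventual warming from a CO2 rise: dT = ECS * log2(C_end / C_start), CO2 only
    return ecs * math.log(c_end / c_start, 2)

# AR5 mid-point of ~410ppm CO2 in 2100, at the IPCC's central ECS of 3.0
print(round(equilibrium_warming(290, 410, 3.0), 2))   # ~1.50 °C

# Millar et al. 2017's ~500ppm CO2 in 2100, with ECS reduced to 2.0
print(round(equilibrium_warming(290, 500, 2.0), 2))   # ~1.57 °C - a similar magnitude
```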

Are the authors playing games?

The figures do not appear to stack up. But then the authors appear to be playing around with figures, as indicated by a statement in the explanation of Figure 2:

Like other simple climate models, this lacks an explicit physical link between oceanic heat and carbon uptake. It allows a global feedback between temperature and carbon uptake from the atmosphere, but no direct link with net deforestation. It also treats all forcing agents equally, in the sense that a single set of climate response parameters is used in for all forcing components, despite some evidence of component-specific responses. We do not, however, attempt to calibrate the model directly against observations, using it instead to explore the implications of ranges of uncertainty in emissions, and forcing and response derived directly from the IPCC-AR5, which are derived from multiple lines of evidence and, importantly, do not depend directly on the anomalously cool temperatures observed around 2010.

That is:-

  • The model does not consider an “explicit physical link between oceanic heat and carbon uptake.” The IPCC estimated that over 90% of heat accumulation since 1970 was in the oceans. If the oceans were to belch out some of this heat at a random point in the future, the 1.5°C limit would be exceeded.
  • No attempt has been made to “calibrate the model directly against observations”. Therefore there is no attempt to properly reconcile beliefs with the real world.
  • The “multiple lines of evidence” in IPCC-AR5 do not include a glaring anomaly that potentially falsifies the theory, and therefore any “need” for policy at all: the divergence of actual temperature trends from theory in this century.

Conclusions

The authors of Millar et al. 2017 have pushed out the boundaries to continue to support climate mitigation policies. To justify constraining emissions sufficiently to stop 1.5°C of warming being exceeded, the authors would appear to have:-

  • Assumed that all the warming since 1870 is caused by anthropogenic GHG emissions when there is not even a valid statistical test that confirms even half the warming was from this source.
  • Largely ignored any hidden heat or other long-term response to rises in GHGs.
  • Ignored the divergence between model predictions and actual temperature anomalies since around the turn of the century. This has two consequences. First, the evidence appears to strongly contradict the belief that humans are a major source of global warming and, by implication, of dangerous climate change. Second, if it does not contradict the theory, it suggests that the amount of warming in the pipeline consequent on human GHG emissions has massively increased. Thus the 1.5°C warming could be breached anyway.
  • Made ECS as low as possible within the long-standing 1.5°C to 4.5°C range. Even assuming ECS is at the mid-point of the range (as the IPCC has done for policy purposes in all its reports) means that warming will breach the 1.5°C level without any further emissions.

The authors live in their closed academic world of models and shared beliefs. Yet the paper is being used for the continued support of mitigation policies that are both failing to get anywhere close to achieving their objectives and massively net harmful to any country that applies them, whether financially or politically.

Kevin Marshall

Commentary at Cliscep, Jo Nova, Daily Caller, Independent, The GWPF

Update 25/09/17 to improve formatting.

How the “greater than 50% of warming since 1950 is human caused” claim is deeply flawed

Over at Cliscep, Jaime Jessop has rather jokingly raised a central claim of the IPCC Fifth Assessment Report, after someone on Twitter had accused her of not being a real person.

So here’s the deal: Michael Tobis convinces me, on here, that the IPCC attribution statement is scientifically sound and it is beyond reasonable doubt that more than half of the warming post 1950 is indeed caused by emissions, and I will post a photo verifying my actual existence as a real person.

The Report states (AR5 WG1 Ch10 Page 869)

It is extremely likely that human activities caused more than half of the observed increase in GMST from 1951 to 2010.

This “extremely likely” is at the 95% confidence level and includes all human causes. The more specific quote on human greenhouse gas emissions is from page 878, section “10.2.4 Single-Step and Multi-Step Attribution and the Role of the Null Hypothesis”:

Attribution results are typically expressed in terms of conventional ‘frequentist’ confidence intervals or results of hypothesis tests: when it is reported that the response to anthropogenic GHG increase is very likely greater than half the total observed warming, it means that the null hypothesis that the GHG-induced warming is less than half the total can be rejected with the data available at the 10% significance level.

It is a much more circumspect message than the “human influence on the climate system is clear” announcements of WG1 four years ago. In describing attribution studies, the section states:

Overall conclusions can only be as robust as the least certain link in the multi-step procedure.

There are a number of candidates for “least certain link” in terms of empirical estimates. In general, if the estimates are made with reference to the other estimates, or biased by theory/beliefs, then the statistical test is invalidated. This includes the surface temperature data.

Further, if the models have been optimised to fit the surface temperature data, then the >50% is an absolute maximum, whilst the real figure, based on perfect information, is likely to be less than that.

Most of all, there are the possibilities of unknown unknowns. For instance, the suggestion that non-human causes could explain pretty much all the post-1950 warming can be inferred from some paleoclimate studies. This Greenland ice core reconstruction (graphic from climate4you) shows warming in the distant past around as great as, or greater than, the current warming. The timing of a warm cycle is not too far out either.

In the context of Jaime’s challenge, there is more than reasonable doubt in the IPCC attribution statement, even if a statistical confidence of 90% (GHG emissions) or 95% (all human causes) were acceptable as persuasive evidence.

There is a further problem with the statement. Human greenhouse gas emissions are meant to account for all the current warming, not just over 50%. If the full impact of a doubling of CO2 is eventually 3°C of warming, then the 1960-2010 CO2 rise from 317ppm to 390ppm alone will eventually produce 0.9°C of warming, and possibly 1.2°C of warming from all sources. This graphic from AR5 WG1 Ch10 shows the issues.
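
As a quick check before turning to the graphic, the 0.9°C figure follows from the same logarithmic relationship used earlier, treating the equilibrium response as eventually fully realised:

```python
import math

# Eventual warming from the 1960-2010 CO2 rise alone (317ppm to 390ppm), ECS = 3
print(round(3.0 * math.log(390 / 317, 2), 2))   # ~0.9 °C
```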

The orange line of anthropogenic forcing accounts for nearly 100% of all the measured post-1960 warming of around 0.8°C – shown by the large dots. Yet this is only about 60% of the warming expected from GHG rises if a doubling of CO2 will produce 3°C of warming. The issue is with the cluster of dots at the right of the graph, representing the pause, or slowdown, in warming around the turn of the century. I have produced a couple of charts that illustrate the problem.

In the first graph, the long-term impact on temperatures of the CO2 rise from 2003-2012 is 2.5 times that from 1953-1962. Similarly, from the second graph, the long-term impact on temperatures of the CO2 rise from 2000-2009 is 2.6 times that from 1950-1959. It is a darn funny lagged response if the rate of temperature rise can significantly slow down when the alleged dominant element causing it to rise accelerates. It could be explained by rising GHG emissions being a minor element in the temperature rise, with natural factors causing some of the warming in the 1976-1998 period, then reversing and causing cooling in the last few years.

Kevin Marshall

 

 

How strong is the Consensus Evidence for human-caused global warming?

You cannot prove a vague theory wrong. If the guess that you make is poorly expressed and the method you have for computing the consequences is a little vague then ….. you see that the theory is good as it can’t be proved wrong. If the process of computing the consequences is indefinite, then with a little skill any experimental result can be made to look like an expected consequence.

Richard Feynman – 1964 Lecture on the Scientific Method

It’s self-evident that democratic societies should base their decisions on accurate information. On many issues, however, misinformation can become entrenched in parts of the community, particularly when vested interests are involved. Reducing the influence of misinformation is a difficult and complex challenge.

The Debunking Handbook 2011 – John Cook and Stephan Lewandowsky

My previous post looked at the attacks on David Rose for daring to suggest that the rapid fall in global land temperatures following the recent El Niño event was strong evidence that the record highs in global temperatures were not due to human greenhouse gas emissions. The technique used against him was to look at long-term linear trends. The main problems with this argument were
(a) according to AGW theory, warming rates from CO2 alone should be accelerating, and at a higher rate than the estimated linear warming rates from HADCRUT4;
(b) HADCRUT4 shows that warming stopped from 2002 to 2014, yet in theory the warming from CO2 should have accelerated.

Now there are at least two ways to view my arguments. The first is to look at Feynman’s approach. The climatologists and associated academics attacking journalist David Rose chose to do so from the perspective of a very blurred specification of AGW theory: human emissions will cause greenhouse gas levels to rise, which will cause global average temperatures to rise. Global average temperatures clearly have risen in all long-term (>40 year) data sets, so the theory is confirmed. On a rising trend, with large variations due to natural variability, any new records will be primarily “human-caused”. But making the theory and data slightly less vague reveals the opposite conclusion. Around the turn of the century the annual percentage increase in CO2 levels went from 0.4% to 0.5% a year (figure 1), which should have led to an acceleration in the rate of warming. In reality, warming stalled.
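
The link between the growth rate of CO2 levels and the expected warming rate can be sketched as follows. It assumes the logarithmic forcing relationship, an illustrative sensitivity of 3°C per doubling, and, for simplicity, a temperature response that appears without any lag; the 0.4% and 0.5% annual growth figures are those referred to above.

```python
import math

def warming_rate_per_decade(annual_growth, sensitivity=3.0):
    """Expected warming rate (°C/decade) if CO2 levels grow at a constant
    fractional rate per year, assuming dT = sensitivity * log2(C/C0) with no lag."""
    return sensitivity * math.log(1 + annual_growth, 2) * 10

print(round(warming_rate_per_decade(0.004), 2))  # ~0.17 °C/decade at 0.4% a year
print(round(warming_rate_per_decade(0.005), 2))  # ~0.22 °C/decade at 0.5% a year
# A faster rise in CO2 levels should therefore mean a faster, not slower, rate of warming.
```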

The reaction was to come up with a load of ad hoc excuses. The Hockey Schtick blog had reached 66 separate excuses for the “pause” by November 2014, ranging from the peer-reviewed literature to a comment in the UK Parliament. This could be because climate is highly complex, with many variables; the presence of each contributing factor can only be guessed at, let alone its magnitude and its interrelationships with all the other factors. So how do you tell which statements are valid information and which are misinformation? I agree with Cook and Lewandowsky that misinformation is pernicious, and difficult to get rid of once it becomes entrenched. So how does one distinguish between the good information and the bad, misleading or even pernicious?

The Lewandowsky / Cook answer is to follow the consensus of opinion. But what is the consensus of opinion? In climate, one variation is to follow a small subset of academics in the area who answer in the affirmative to two questions:

1. When compared with pre-1800s levels, do you think that mean global temperatures have generally risen, fallen, or remained relatively constant?

2. Do you think human activity is a significant contributing factor in changing mean global temperatures?

The problem is that the first question is just reading a graph, and the second is a belief statement with no precision. Anthropogenic global warming has been a hot topic for over 25 years now, yet these two very vague, empirically-based questions, forming the foundations of the subject, ought to be capable of being formulated more precisely. On the second, it is a case of having pretty clear and unambiguous estimates of the percentage of the warming so far that is human-caused. On that, the consensus of leading experts is unable to say whether it is 50% or 200% of the warming so far. (There are meant to be time lags, and factors like aerosols that might suppress the warming.) This is from the 2013 UNIPCC AR5 WG1 SPM, section D3:-

It is extremely likely that more than half of the observed increase in global average surface temperature from 1951 to 2010 was caused by the anthropogenic increase in greenhouse gas concentrations and other anthropogenic forcings together.

The IPCC, encapsulating state-of-the-art knowledge, cannot provide firm evidence in the form of a percentage, or even a fairly broad range, despite having over 60 years of data to work on. It is even worse than it appears. The “extremely likely” phrase is a Bayesian probability statement. Ron Clutz’s simple definition from earlier this year was:-

Here’s the most dumbed-down description: Initial belief plus new evidence = new and improved belief.

For the IPCC to claim, at the fifth attempt, that their statement is extremely likely, they should be able to show some sort of progress in updating their beliefs in the light of new evidence. That would mean narrowing the estimate of the magnitude of the impact of a doubling of CO2 on global average temperatures. As Clive Best documented in a cliscep comment in October, the IPCC reports from 1990 to 2013 failed to change the estimated range of 1.5°C to 4.5°C. Looking up Climate Sensitivity in Wikipedia, we get the origin of the range estimate.

A committee on anthropogenic global warming convened in 1979 by the National Academy of Sciences and chaired by Jule Charney estimated climate sensitivity to be 3 °C, plus or minus 1.5 °C. Only two sets of models were available; one, due to Syukuro Manabe, exhibited a climate sensitivity of 2 °C, the other, due to James E. Hansen, exhibited a climate sensitivity of 4 °C. “According to Manabe, Charney chose 0.5 °C as a not-unreasonable margin of error, subtracted it from Manabe’s number, and added it to Hansen’s. Thus was born the 1.5 °C-to-4.5 °C range of likely climate sensitivity that has appeared in every greenhouse assessment since…

It is revealing that the quote is under the subheading Consensus Estimates. The climate community have collectively failed to update the original beliefs, which were based on a very rough estimate. The emphasis on referring to consensus beliefs about the world, rather than looking outward for evidence in the real world, is, I would suggest, the primary reason for this failure. Yet such community-based belief completely undermines the integrity of the Bayesian estimates, making their use in statements about climate clear misinformation in Cook and Lewandowsky’s sense of the term. What is more, those in the climate community who look primarily to these consensus beliefs, rather than to the data of the real world, will endeavour to dismiss the evidence, or make up ad hoc excuses, or smear those who try to disagree.

A caricature of these perspectives with respect to global average temperature anomalies is available in the form of a flickering widget at John Cook’s skepticalscience website. This purports to show the difference between “realist” consensus and “contrarian” non-consensus views. Figure 2 is a screenshot of the consensus view, interpreting warming as a linear trend. Figure 3 is a screenshot of the non-consensus or contrarian view, which is supposed to interpret warming as a series of short, disconnected periods of no warming, where over time each period just happens to be at a higher level than the previous one. There are a number of things that this indicates.

(a) The “realist” view is of a linear trend throughout any data series. Yet the period from around 1940 to 1975 shows no warming, or slight cooling, depending on the data set. Therefore any linear trend line starting earlier than 1970-1975 and ending in 2015 will show a lower rate of warming. This would be consistent with the rate of CO2 increase rising over time, as shown in figure 1. But shorten the period, again ending in 2015, and once the period becomes less than 30 years the warming trend will also decrease. This contradicts the theory, unless ad hoc excuses are used, as shown in my previous post using the HADCRUT4 data set.

(b) Those who agree with the consensus are called “Realists”, despite looking inwards towards common beliefs. Those who disagree are labelled “Contrarians”. That is not inaccurate where there is a dogmatic consensus. But it is utterly false to lump together all those who disagree as holding the same views, especially when no examples are provided of anyone who actually holds the views depicted.

(c) The linear trend appears a more plausible fit than the series of “contrarian” lines. By implication, those who disagree with the consensus are viewed as having a distinctly more blinkered and distorted perspective than those who follow it. Yet even using the GISTEMP data set (which gives the greatest support to the consensus view) there is a clear break in the linear trend. The less partisan HADCRUT4 data shows an even greater break.

Those who spot the obvious – that around the turn of the century warming stopped or slowed down, when in theory it should have accelerated – are given a clear choice. They can conform to the scientific consensus, denying the discrepancy between theory and data. Or they can act as scientists, denying the false and empirically empty scientific consensus, receiving the full weight of all the false and career-damaging opprobrium that accompanies it.

[Figure 2: screenshot of the skepticalscience widget’s “realist” view (fig2-sks-realists)]

[Figure 3: screenshot of the skepticalscience widget’s “contrarian” view (fig3-sks-contras)]

Kevin Marshall

 

Hiroshima Bombs of Heat Accumulation – Skeptical Science reversing scientific reality

Skeptical Science blog has a little widget that counts the heat the climate has accumulated since 1998 in terms of Hiroshima Atomic Bombs.

One of the first uses of the Hiroshima bomb analogy was by skepticalscience.com stalwart Dana Nuccitelli, in the Guardian.

The rate of heat building up on Earth over the past decade is equivalent to detonating about 4 Hiroshima atomic bombs per second. Take a moment to visualize 4 atomic bomb detonations happening every single second.

But what does this mean in actual heat energy? I did a search, and found that the estimated heat generated by the Hiroshima bomb was about 63TJ (terajoules), or 63 × 10¹² joules. A quick calculation reveals that the widget actually uses 62TJ, so I will use that lower value. It is a huge number. The energy was sufficient to kill over 100,000 people, cause horrific injuries to many more, and destroy every building within a large radius of the blast site. Yet in the last 17 years the climate system has accumulated over two billion times that energy.

Most of that energy goes into the oceans, so I was curious to estimate the impact that phenomenal heat accumulation would have on the average temperature of the oceans. Specifically, how long would it take to heat the oceans by 1°C?

The beauty of metric measurements is that mass and volume are tied together through the properties of water: a litre of water weighs a kilogram, and a cubic metre weighs a tonne. I will ignore the slight differences due to the impurities of sea water for this exercise.

The metric unit of energy, the joule, is not quite so easy to relate to water. The old British thermal unit is better, being the quantity of energy sufficient to raise a pound of water through 1°F. Knowing that 1lb = 454g, 1.8°F = 1°C and 1btu ≈ 1055J means that about 4.2 joules is the energy needed to raise 1 gram of water through one degree Celsius.

So the Hiroshima bomb had the energy to raise (62 × 10¹²)/4.2 ≈ 15 × 10¹² grams of water through one degree.

That is 15 × 10⁹ kilograms (litres) of water, or 15 × 10⁶ tonnes (cubic metres) of water. That is the volume of a lake of 1 square kilometre in area with an average depth of 15 metres.
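
The arithmetic of the last few paragraphs can be reproduced in a few lines; the 62TJ bomb energy and the 4.2 J/g°C figure are those derived above.

```python
BOMB_J = 62e12         # energy of one Hiroshima bomb in joules (the widget's value)
SPECIFIC_HEAT = 4.2    # joules to raise 1 gram of water by 1 °C

grams_per_bomb = BOMB_J / SPECIFIC_HEAT
print(f"{grams_per_bomb:.1e} grams of water warmed by 1 °C per bomb")      # ~1.5e13 g
print(f"{grams_per_bomb / 1e12:.0f} million tonnes")                       # ~15 million tonnes
print(f"depth if spread over 1 km2: {grams_per_bomb / 1e6 / 1e6:.0f} m")   # ~15 m

# The widget's running total: 4 bombs a second for 17 years
print(f"{4 * 17 * 365.25 * 24 * 3600:.2e} bombs")                          # over two billion
```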

The largest lake in England is Lake Windermere, which has a volume of approximately 1 cubic kilometre of water, or 1 billion tonnes. (The biggest freshwater lake in the United Kingdom by volume is Loch Ness, with about 9 km³ of water.)

It would take the energy of 67 Hiroshima bombs to heat Lake Windermere by 1 degree. Or the oceans are accumulating heat at a rate that would raise the temperature of this lake by one degree every 16.67 seconds.

Although Lake Windermere can look quite large when standing on its shoreline, it is tiny relative to the Great Lakes, let alone the oceans of the world. With a total area of about 360,000,000 km² and an average depth of at least 3000 metres, the oceans have a volume of about 1,080,000,000 km³, and so contain about 1.08 × 10¹⁸ tonnes of water. If all the heat absorbed by the global climate system since 1998 went into the oceans, it would take about 18 billion seconds to raise the average ocean temperature by 1°C. That is 5,000,000 hours, or 208,600 days, or about 570 years.
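
Scaling up, here is a minimal sketch of the Lake Windermere comparison and the ocean calculation. The ocean area and depth are the round figures used in the text, so the result differs slightly from the 570 years above through rounding.

```python
BOMB_J = 62e12              # joules per Hiroshima bomb (the widget's value)
RATE_J_PER_S = 4 * BOMB_J   # four bombs per second
SPECIFIC_HEAT = 4.2         # joules per gram per °C

# Lake Windermere: ~1 km3 of water, i.e. about 1e15 grams
windermere_j = 1e15 * SPECIFIC_HEAT
print(round(windermere_j / BOMB_J), "bombs to warm the lake by 1 °C")           # ~68 (67 with 63TJ)
print(round(windermere_j / RATE_J_PER_S), "seconds per °C at 4 bombs/second")   # ~17

# The oceans: ~360 million km2 in area, ~3000 m average depth (round figures from the text)
ocean_grams = 360e6 * 1e6 * 3000 * 1e6    # km2 -> m2, then m3 of water -> grams
seconds = ocean_grams * SPECIFIC_HEAT / RATE_J_PER_S
years = seconds / (365.25 * 24 * 3600)
print(f"{seconds:.1e} seconds, roughly {years:.0f} years to warm the oceans by 1 °C")
print(f"{10 / years:.4f} °C per decade")   # ~0.017 °C/decade
```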

Here I am slightly exaggerating the ocean warming rate: the UNIPCC estimates that only about 93% of the extra heat absorbed by the climate system went into the oceans.

But have I got this wrong by a huge margin? The standard way of stating the warming rates – used by the UNIPCC – is in degrees centigrade per decade. This is the same metric that is used for average surface temperatures. Warming of one degree in 570 years becomes 0.0175°C/decade. In Chapter 3 of the UNIPCC AR5 Working Group 1 Report, Figure 3.3 (a) on page 263 is the following.

The ocean below about 1000 metres, or more than two-thirds of the water volume, is warming at a rate of less than 0.0175°C/decade. This may be an overstatement. Below 2000 metres, the average water temperature rise is around 0.005°C/decade, or 1°C of temperature rise every 2000 years.

The energy of four Hiroshima bombs a second is trivial on a global scale. It causes an amount of temperature change that is barely measurable on a year-on-year basis.

There are two objectives that I believe the Skeptical Science team are trying to achieve with their little widget.

The first objective is to reverse people’s perception of reality. Nuclear explosions are clearly seen by everybody. You do not have to be an expert to detect one if you are within a thousand miles of the detonation. Set one off anywhere in the world, even deep underground, and sensitive seismic detectors will register the event from the other side of the globe. Rejection of the evidence of a blast can only be on the basis of clear bias or lying.

But changes of thousandths of a degree in the unimaginable vastness of the oceans, with shifting currents and seasonal variations as well, are not detectable with a single instrument, or even with thousands of such instruments. Measuring them requires careful collation and aggregation of the data, with computer modelling filling in the gaps. Small biases in the modelling techniques, whether known or unknown, whether for technical reasons or through a desire to get a particular result, will matter more than the accuracy of the instruments. Even without these issues, there is the small matter of using ten years of good quality data, and longer periods of sparser and lower quality data, to determine the underlying trends and their causes. Understanding the nature of this measurement problem puts the onus on anyone claiming there is only one possible answer to substantiate that claim.

The second objective is to turn a very tiny change, over the very short period for which we have data, into the perception of a scientifically validated catastrophic problem in the present. Whether it is a catastrophic problem depends on the projections of climate models.

It is easy to see why Skeptical Science needs this switch in the public perception of reality. True understanding of climate heat accumulation means awareness of the limits and boundaries of our current knowledge. That requires a measure of humility and a recognition of when existing knowledge is undermined. It is an inter-disciplinary subject in which a whole range of results could have equal merit. It does not accord with their polarized vision of infallible enlightened scientists set against a bunch of liars and ignoramuses who get nothing right.

Kevin Marshall