Hansen et al 1988 Global Warming Predictions 30 Years on

Last month marked the 30th anniversary of James Hansen’s Congressional testimony that kicked off the attempts to control greenhouse gas emissions. The testimony was clearly an attempt, by linking human greenhouse gas emissions to dangerous global warming, to influence public policy. Unlike previous attempts (such as by then Senator Al Gore), Hansen’s testimony was hugely successful. But do the scientific projections that underpinned the testimony hold up against the actual data? The key part of that testimony was a graph from Hansen et al. 1988*, Global climate changes as forecast by Goddard Institute for Space Studies three-dimensional model, reproduced below.

Figure 1: Hansen et al 1988 – Figure 3(a) in the Congressional Testimony

Note the language of the title of the paper. This is a forecast of global average temperatures contingent upon certain assumptions. The ambiguous part is the assumptions.

The assumptions of Hansen et al. 1988

From the paper.

4. RADIATIVE FORCING IN SCENARIOS A, B AND C

4.1. Trace Gases

  We define three trace gas scenarios to provide an indication of how the predicted climate trend depends upon trace gas growth rates. Scenario A assumes that growth rates of trace gas emissions typical of the 1970s and 1980s will continue indefinitely; the assumed annual growth averages about 1.5% of current emissions, so the net greenhouse forcing increases exponentially. Scenario B has decreasing trace gas growth rates, such that the annual increase of the greenhouse climate forcing remains approximately constant at the present level. Scenario C drastically reduces trace gas growth between 1990 and 2000 such that the greenhouse climate forcing ceases to increase after 2000.

Scenario A is easy to replicate: each year, increase emissions by 1.5% on the previous year. Scenario B assumes that emissions are growing and that policy takes time to be enacted, so some reduction is required to bring the annual increase in forcing back down to the current (1987 or 1988) level. Scenario C, one presumes, requires emissions so low that trace gas levels stop increasing. As trace gas levels were rising in 1988, and (from Scenario B) continuing emissions at the 1988 level would continue to raise atmospheric levels, emissions by the year 2000 would have to be considerably lower than in 1988. They might still be above zero, as small amounts of emissions may not have an appreciable impact on atmospheric levels.

The graph formed Fig. 3. of James Hansen’s testimony to Congress. The caption to the graph repeats the assumptions.

Scenario A assumes continued growth rates of trace gas emissions typical of the past 20 years, i.e., about 1.5% yr-1 emission growth; scenario B has emission rates approximately fixed at current rates; scenario C drastically reduces trace gas emissions between 1990 and 2000.

In other words, Scenario B fixes annual emissions at the levels of the late 1980s, whilst Scenario C sees drastic emission reductions.

James Hansen in his speech gave a more succinct description.

We have considered cases ranging from business as usual, which is scenario A, to draconian emission cuts, scenario C, which would totally eliminate net trace gas growth by year 2000.

Note that the resultant warming from fixing emissions at the current rate (Scenario B) is much closer to that of Scenario A (emissions growth of +1.5% year-on-year) than to Scenario C, which stops global warming. Yet Scenario B results from global policy being successfully implemented to stop the rise in global emissions.

Which Scenario most closely fits the Actual Data?

To understand which scenario most closely fits the data, we need to look at the trace gas emissions data. There are a number of sources, which give slightly different results. One source, and the one that ought to be the most authoritative, is the IPCC Fifth Assessment Report WG3 Summary for Policy Makers graphic SPM.1, reproduced in Figure 2.

 Figure 2 : AR5 WG3 SPM.1 Total annual anthropogenic GHG emissions (GtCO2eq/yr) by groups of gases 1970-2010. FOLU is Forestry and Other Land Use.

Note that in Figure 2 the other greenhouse gases – F-Gases, N2O and CH4 – are expressed in CO2 equivalents. It is very easy to see which of the three scenarios fits. The historical data up until 1988 shows increasing emissions. After that date emissions have continued to increase. Indeed there is some acceleration, stated on the graph comparing 2000-2010 (+2.2%/yr) with 1970-2000 (+1.3%/yr). In 2010 GHG emissions were not at a level similar to the 1980s (about 35 GtCO2e) but much higher. By implication, Scenario C, which assumed draconian emissions cuts, is the furthest from the reality of what has happened. Before considering how closely Scenario A compares to the temperature rise, the question is therefore how closely actual emissions growth compares to the +1.5%/yr of Scenario A.

From my own rough calculations, total GHG emissions from 1990 to 2010 rose about 29%, or 1.3% a year, compared to 41%, or 1.7% a year, in the period 1970 to 1990. Exponential growth of 1.3% is not far short of the 1.5%. The assumed 1.5% growth rate would have resulted in 2010 emissions of 51 GtCO2e instead of the 49 GtCO2e estimated, well within the margin of error. That is, actual trends over 20 years were pretty much the business-as-usual scenario. The narrower measure of CO2 emissions from fossil fuels and industrial sources rose about 42% from 1990 to 2010, or 1.8% a year, compared to 51%, or 2.0% a year, in the period 1970 to 1990 – above the Scenario A growth rate.
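
For anyone who wants to check these figures, a minimal sketch of the growth-rate arithmetic is below. The 1970, 1990 and 2010 totals of roughly 27, 38 and 49 GtCO2e are my own approximate readings from the SPM.1 graphic, not official published figures.

```python
# Minimal sketch of the growth-rate arithmetic. The 1970, 1990 and 2010 totals
# (roughly 27, 38 and 49 GtCO2e) are my approximate readings from AR5 WG3 SPM.1.

def cagr(start, end, years):
    """Compound annual growth rate between two emissions levels."""
    return (end / start) ** (1.0 / years) - 1.0

ghg_1970, ghg_1990, ghg_2010 = 27.0, 38.0, 49.0  # GtCO2e, rough readings

print(f"1970-1990 growth: {cagr(ghg_1970, ghg_1990, 20):.1%} per year")  # ~1.7%
print(f"1990-2010 growth: {cagr(ghg_1990, ghg_2010, 20):.1%} per year")  # ~1.3%

# Scenario A style compounding of 1.5% a year from the 1990 level:
print(f"2010 emissions at 1.5%/yr from 1990: {ghg_1990 * 1.015 ** 20:.0f} GtCO2e")  # ~51
```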

The breakdown is shown in Figure 3.

Figure 3 : Rough calculations of exponential emissions growth rates from AR5 WG3 SPM Figure SPM.1

These figures are somewhat out of date. The UNEP Emissions Gap Report 2017 (pdf) estimated GHG emissions in 2016 at 51.9 GtCO2e. This represents a slowdown in emissions growth in recent years.

Figure 4 compares the actual decadal exponential growth trends in estimated GHG emissions (with a linear trend to the 51.9 GtCO2e of emissions in 2016 from the UNEP Emissions Gap Report 2017 (pdf)) with my interpretations of the scenario assumptions. That is, from 1990, Scenario A has 1.5% annual growth in emissions; Scenario B has emissions reducing from 38 to 35 GtCO2e (the level of 1987) during the 1990s and continuing at that level indefinitely; Scenario C has emissions reducing to 8 GtCO2e during the 1990s.

Figure 4 : Hansen et al 1988 emissions scenarios, starting in 1990, compared to actual trends from UNIPCC and UNEP data. Scenario A – 1.5% pa emissions growth; Scenario B – Linear decline in emissions from 38 GtCO2e in 1990 to 35 GtCO2e in 2000, constant thereafter; Scenario C – Linear decline  in emissions from 38 GtCO2e in 1990 to 8 GtCO2e in 2000, constant thereafter. 

This overstates the differences between A and B, as it is the cumulative emissions that matter. From my calculations, although Scenario B’s 2010 emissions are 68% of Scenario A’s, cumulative emissions for the period 1991-2010 are 80% of Scenario A’s.
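
These ratios can be verified with a short script. The scenario paths below follow my interpretation set out above (a 38 GtCO2e starting level in 1990 and a linear decline through the 1990s for Scenarios B and C); they are my assumptions, not Hansen’s own emissions series.

```python
# Sketch of my interpretation of the emissions scenarios (as in Figure 4) and the
# cumulative comparison of A and B. The 38 GtCO2e start and the linear-decline
# shapes are my assumptions, not Hansen's own series.

def scenario_a(start=38.0, growth=0.015, years=range(1991, 2011)):
    """Scenario A: 1.5% per year compound growth from the 1990 level."""
    return {y: start * (1 + growth) ** (y - 1990) for y in years}

def scenario_decline(start=38.0, floor=35.0, years=range(1991, 2011)):
    """Linear decline from `start` to `floor` over the 1990s, constant thereafter."""
    return {y: max(floor, start - (start - floor) * (y - 1990) / 10) for y in years}

a = scenario_a()
b = scenario_decline(floor=35.0)  # Scenario B: back to the 1987 level
c = scenario_decline(floor=8.0)   # Scenario C: drastic cuts

print(f"2010 emissions, B as a share of A: {b[2010] / a[2010]:.0%}")  # ~68%
print(f"2010 emissions, C as a share of A: {c[2010] / a[2010]:.0%}")  # ~16%
print(f"Cumulative 1991-2010, B as a share of A: {sum(b.values()) / sum(a.values()):.0%}")  # ~80%
```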

Looking at cumulative emissions is consistent with the claims from the various UN bodies that limiting the global temperature rise to 1.5°C or 2.0°C of warming, relative to some reference point, is contingent on a certain volume of emissions not being exceeded. One of the most recent examples is the key graphic from the UNEP Emissions Gap Report 2017.

Figure 5 : Figure ES.2 from the UNEP Emissions Gap Report 2017, showing the projected emissions gap in 2030 relative to 1.5°C or 2.0°C warming targets. 

Warming forecasts against “Actual” temperature variation

Hansen’s testimony was a clear case of political advocacy. By making Scenario B one of constant emissions, the authors were making a bold policy statement. That is, to stop catastrophic global warming (and thus prevent potentially catastrophic changes to climate systems) requires draconian reductions in emissions. Simply maintaining emissions at the levels of the mid-1980s will make little difference. That is due to the forcing being related to the cumulative quantity of emissions.

Given that the data is not quite in line with Scenario A, if the theory is correct, then I would expect:-

  1. The warming trend to be somewhere between Scenario A and Scenario B. Most people accept that the equilibrium climate sensitivity of the Hansen model, at 4.2ºC for a doubling of CO2, was too high. The IPCC now uses 3ºC for ECS. More recent research has it much lower still. However, although the rate of warming might be less, the pattern of warming over time should be similar.
  2. Average temperatures after 2010 to be significantly higher than in 1987.
  3. The rate of warming in the 1990s to be marginally lower than in the period 1970-1990, but still strongly positive.
  4. The rate of warming in the 2000s to be strongly positive, and marginally higher than in the 1990s.

From the model’s Scenario C, there seems to be about a five-year lag between changes in emission rates and changes in temperatures. However, looking at the actual temperature data, there is quite a different warming pattern. Five years ago C3 Headlines had a post, 2013: The NASA/Hansen Climate Model Prediction of Global Warming Vs. Climate Reality. The main graphic is in Figure 6.

Figure 6 : C3 Headlines – NASA Hansen Prediction Vs Reality

The first thing to note is that the scenario assumptions are incorrect. Not only are they labelled as CO2, rather than GHG, emissions, but they are all stated wrongly. Stating them correctly would show a greater contradiction between forecasts and reality. However, the scenario data appears to be reproduced correctly, and the actual temperature graph appears to be in line with a graphic produced last month by Gavin Schmidt in his defense of Hansen’s predictions.

The data contradicts the forecasts. Although average temperatures are clearly higher than in 1987, they are not in line with the forecast of Scenario A, which is closest to the actual emissions trends. The rise is way below even 70% of the model’s forecast – the rough scaling implied by inputting the lower IPCC climate sensitivity and allowing for GHG emissions growth being fractionally below the 1.5% per annum of Scenario A. But the biggest problem is where the main divergence occurred. Rather than warming accelerating slightly in the 2000s (after a possible slowdown in the 1990s), there was no slowdown in the 1990s, while in the 2000s warming either collapsed to zero or massively reduced, depending on the data set used. This is in clear contradiction of the model. Unless there is an unambiguous and verifiable explanation (rather than a bunch of waffly and contradictory excuses), the model should be deemed to be wrong. There could be largely unknown natural factors or random data noise that could explain the discrepancy. But equally (and quite plausibly) those same factors could have contributed to the late twentieth century warming.
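
For what it is worth, my reading of the roughly 70% figure is simply the ratio of the IPCC’s central climate sensitivity to the Hansen model’s, before the further small allowance for emissions growth running just below Scenario A; a trivial check:

```python
# Rough source of the ~70% scaling used above (my own reading, not a published figure):
# the IPCC central ECS relative to the Hansen et al. 1988 model ECS.
hansen_ecs = 4.2  # deg C per doubling of CO2, Hansen model
ipcc_ecs = 3.0    # deg C per doubling, IPCC central estimate

print(f"Implied scaling of the model's warming: {ipcc_ecs / hansen_ecs:.0%}")  # ~71%
```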

This simple comparison has an important implication for policy. As there is no clear evidence to link most of the observed warming to GHG emissions, by implication there is no clear support for the belief that reducing GHG emissions will constrain future warming. But reducing global GHG emissions is merely an aspiration. As the graphic in Figure 5 clearly demonstrates, over twenty months after the Paris Climate Agreement was signed there is still no prospect of aggregate GHG emissions falling through policy. Hansen et al. 1988 is therefore a double failure: both as a scientific forecast and as a tool for policy advocacy in terms of reducing GHG emissions. If only its supporters would realize that failure, the useless and costly climate policies could be dismantled.

Kevin Marshall

*Hansen, J., I. Fung, A. Lacis, D. Rind, S. Lebedeff, R. Ruedy, G. Russell, and P. Stone, 1988: Global climate changes as forecast by Goddard Institute for Space Studies three-dimensional model. J. Geophys. Res., 93, 9341-9364, doi:10.1029/JD093iD08p09341.

Nature tacitly admits the IPCC AR5 was wrong on Global Warming

There has been a lot of comment on a recent paper at Nature Geoscience, “Emission budgets and pathways consistent with limiting warming to 1.5°C” (hereafter Millar et al. 2017).

When making a case for public policy I believe that something akin to a process of due diligence should be carried out on the claims. That is, the justifications ought to be scrutinized to validate the claims. With Millar et al. 2017, there are a number of issues with the make-up of the claims that (a) warming of 1.5°C or greater will be reached without policy, and (b) constraining emissions within the stated budget can keep warming to 1.5°C.

The baseline warming

The introduction states:

Average temperatures for the 2010s are currently 0.87°C above 1861–80,

A similar quote from UNIPCC AR5 WG1 SPM page 5

The total increase between the average of the 1850–1900 period and the 2003–2012 period is 0.78 [0.72 to 0.85] °C, based on the single longest dataset available.

These figures are all from the HADCRUT4 dataset. There are three things that account for the difference of 0.09°C. Mostly it is the shorter baseline period. Also, the last three years have been influenced by a powerful and natural El Niño, along with the IPCC using an average of the last 10 years of its period.

The warming in the pipeline

There are valid reasons for the authors differing from the IPCC’s methodology. They start with the emissions from 1870 (even though emissions estimates go back to 1850). Also, if there is no definite finish date, it is very difficult to calculate the warming impact to date. Consider first the full sentence quoted above.

Average temperatures for the 2010s are currently 0.87°C above 1861–80, which would rise to 0.93°C should they remain at 2015 levels for the remainder of the decade.

This implies that there is some warming still to come through from the impact of the higher greenhouse gas levels. Yet it seems a remarkably low figure, and over a very short time period. Of course, not all the warming since the mid-nineteenth century is from anthropogenic greenhouse gas emissions. The anthropogenic element is just guesstimated. This is shown in AR5 WG1 Ch10 Page 869

More than half of the observed increase in global mean surface temperature (GMST) from 1951 to 2010 is very likely due to the observed anthropogenic increase in greenhouse gas (GHG) concentrations.

It was after 1950 that the largest increase in CO2 levels was experienced. From 1870 to 1950, CO2 levels rose from around 290ppm to 310ppm, or 7%. From 1950 to 2010, CO2 levels rose from around 310ppm to 387ppm, or 25%. Add in the other GHGs and the human-caused warming should be 3-4 times greater in the later period than in the earlier one, whereas the warming in the later period was just over twice the amount. Therefore, if there is just over a 90% chance (very likely in IPCC-speak) that over 50% of the warming post-1950 was human-caused, a statistical test relating to a period more than twice as long would find a lower human-caused share of the warming to be statistically significant. Even then, I view the greater-than-50% statistic as being deeply flawed, especially post-2000, when the rate of rise in CO2 levels accelerated whilst the rise in average temperatures dramatically slowed. This suggests two things. First, the impact could be explained by rising GHG emissions being a minor element in the temperature rise, with natural factors both causing some of the warming in the 1976-1998 period and then reversing, causing cooling, in the last few years. Second, there is a darn funny lagged response of temperatures to rising GHGs (especially CO2). That is, the amount of warming in the pipeline has increased dramatically. If either idea has any traction then the implied warming to come of just 0.06°C is a false estimate. This needs to be elaborated.
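
As a rough check on the “3-4 times greater” claim, greenhouse forcing from CO2 is approximately proportional to the logarithm of the concentration ratio, so the relative size of the two periods’ forcing increments can be sketched as follows (the concentration figures are the rounded ones quoted above, and other GHGs would add to the later period):

```python
import math

# Rough check of the "3-4 times greater" claim. CO2 forcing is roughly proportional
# to the log of the concentration ratio; the ppm figures are the rounded values above.
c_1870, c_1950, c_2010 = 290.0, 310.0, 387.0

f_early = math.log(c_1950 / c_1870)  # CO2 forcing increment 1870-1950 (relative units)
f_late = math.log(c_2010 / c_1950)   # CO2 forcing increment 1950-2010 (relative units)

print(f"1950-2010 increment is {f_late / f_early:.1f}x the 1870-1950 increment")  # ~3.3x
```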

Climate Sensitivity

If a doubling of CO2 leads to 3.00°C of warming (the assumption of the IPCC in their emissions calculations), then a rise in CO2 levels from 290ppm to 398 ppm (1870 to 2014) eventually gives 1.37°C of warming. With other GHGs this figure should be around 1.80°C. Half that warming has actually occurred, and some of that is natural. So there is well over 1.0°C still to emerge. It is too late to talk about constraining warming to 1.5°C as the cause of that warming has already occurred.
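
The arithmetic behind these figures, and the roughly 2°C sensitivity implied by the paper’s 0.94°C discussed below, follows from the standard logarithmic relationship between CO2 concentration and forcing. A minimal sketch, using the rounded concentrations quoted above:

```python
import math

def eventual_warming(ecs, c_start, c_end):
    """Equilibrium warming for a given ECS, assuming forcing scales with log(CO2)."""
    return ecs * math.log(c_end / c_start) / math.log(2.0)

# Warming eventually implied by the 1870-2014 CO2 rise at the IPCC's central ECS:
print(f"ECS 3.0, 290 -> 398 ppm: {eventual_warming(3.0, 290, 398):.2f} C")  # ~1.37 C

# Working backwards: the ECS implied if only 0.94 C eventually results from that
# rise (CO2 only, ignoring other GHGs):
implied_ecs = 0.94 * math.log(2.0) / math.log(398 / 290)
print(f"Implied ECS for 0.94 C of warming: {implied_ecs:.1f} C per doubling")  # ~2.1
```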

The implication of the paper’s claim that 0.94°C of warming will result from human emissions in the period 1870-2014 is to reduce the climate sensitivity estimate to around 2.0°C for a doubling of CO2 if only CO2 is considered, or around 1.5°C for a doubling of CO2 if all GHGs are taken into account. (See below.) Compare this to AR5 WG1 section D.2 Quantification of Climate System Responses

The equilibrium climate sensitivity quantifies the response of the climate system to constant radiative forcing on multicentury time scales. It is defined as the change in global mean surface temperature at equilibrium that is caused by a doubling of the atmospheric CO2 concentration. Equilibrium climate sensitivity is likely in the range 1.5°C to 4.5°C (high confidence), extremely unlikely less than 1°C (high confidence), and very unlikely greater than 6°C (medium confidence).

The implied equilibrium climate sensitivity (ECS) is at the very bottom of the IPCC’s range, and the equilibrium climate response is reached in 5-6 years instead of on multicentury time scales. This is on top of the implied assumption that there is no net natural warming between 1870 and 2015.

How much GHG emissions?

With respect to policy, as global warming is caused by human greenhouse gas emissions, preventing further human-caused warming requires reducing, and possibly eliminating, global greenhouse gas emissions. In conjunction with the publication of the AR5 Synthesis Report, the IPCC produced a slide show of the policy case laid out in the three vast reports. It was effectively a short summary of a summary of the synthesis report. Approaching the policy climax at slide 30 of 35:-

Apart from the policy objective in AR5 being to limit warming to 2°C, not 1.5°C, it also mentions the need to constrain GHG emissions, not just CO2 emissions. Then slide 33 gives the simplified policy position for keeping warming to 2°C.

To the end of 2011, an estimated 1900 GtCO2e of GHGs had been emitted, whilst around a further 1000 GtCO2e could be emitted before the 2°C of warming is reached.

That is the highly simplified version. At the other end of the scale, AR5 WG3 Ch6 p431 has a very large table in a very small font to consider a lot of the policy options. It is reproduced below, though the resolution is much poorer than the original.

Note 3 states

For comparison of the cumulative CO2 emissions estimates assessed here with those presented in WGI AR5, an amount of 515 [445 to 585] GtC (1890 [1630 to 2150] GtCO2), was already emitted by 2011 since 1870

The top line is for 1.5°C of warming – the most ambitious policy aim. Of note:-

  • The CO2 equivalent concentration in 2100 (ppm CO2eq ) is 430-480ppm.
  • Cumulative CO2 emissions (GtCO2) from 2011 to 2100 is 630 to 1180.
  • CO2 concentration in 2100 is 390-435ppm.
  • Peak CO2 equivalent concentration is 465-530ppm. This is higher than the 2100 concentration and, if it were CO2 alone with ECS = 3, would eventually produce 2.0°C to 2.6°C of warming (see the sketch after this list).
  • The probability of exceeding 1.5°C in 2100 is 49-86%. They had to squeeze really hard to say that staying below 1.5°C was more than 50% likely.
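
The eventual warming from the peak concentrations can be checked with the same logarithmic relationship, treating the CO2-equivalent range as if it were CO2 alone (as the bullet above does) against a 290ppm pre-industrial level; both the CO2-only treatment and the base level are assumptions on my part:

```python
import math

# Eventual warming if the peak CO2-equivalent range (465-530 ppm) is treated as CO2
# alone, with ECS = 3, against an assumed 290 ppm pre-industrial level.
def eventual_warming(ecs, c_start, c_end):
    return ecs * math.log(c_end / c_start) / math.log(2.0)

for peak in (465, 530):
    print(f"Peak {peak} ppm: {eventual_warming(3.0, 290, peak):.1f} C")  # ~2.0 C and ~2.6 C
```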

Compare the above to this from the abstract of Millar et al. 2017.

If CO2 emissions are continuously adjusted over time to limit 2100 warming to 1.5°C, with ambitious non-CO2 mitigation, net future cumulative CO2 emissions are unlikely to prove less than 250 GtC and unlikely greater than 540 GtC. Hence, limiting warming to 1.5°C is not yet a geophysical impossibility, but is likely to require delivery on strengthened pledges for 2030 followed by challengingly deep and rapid mitigation.

They use tonnes of carbon as the unit of measure, as against CO2 equivalent. The conversion factor is 3.664, so cumulative CO2 emissions need to be in the 870-1010 GtCO2 range. As this is to the end of 2015, not 2011 as in the IPCC report, it will be different. Subtracting 150 from the IPCC report’s figures would give a range of 480 to 1030. That is, Millar et al. 2017 have reduced the emissions range by 75%, to the top end of the IPCC’s range. Given the IPCC considered a range of 1.5-1.7°C of warming, it seems somewhat odd to then say this relates to the lower end of the warming band, until you take into account that ECS has been reduced. But then why curtail the range of emissions instead of calculating your own? It appears that, again, the authors are trying to squeeze a result within existing constraints.

However, this does not take into account the much higher levels of peak CO2 equivalent concentrations in Table 6.3. Peak CO2 equivalent concentrations are around 75-95ppm higher than the CO2 concentrations in 2100. Compare this to the green line in the central graph in Millar et al. 2017.

This is less than 50ppm higher than in 2100. Further, in 2100 Millar et al. 2017 has CO2 levels of around 500ppm, as against a mid-point of 410ppm in AR5. CO2 rising from 290 to 410ppm with ECS = 3.0 produces 1.50°C of warming; CO2 rising from 290 to around 500ppm with ECS = 2.0 produces a similar 1.57°C of warming. Further, this does not include the warming impact of other GHGs. To squeeze into the 1.5°C band, the mid-century overshoot in Millar et al. 2017 is much less than in AR5. This might be required in the modelling assumptions due to the very short time assumed in reaching the full equilibrium climate response.

Are the authors playing games?

The figures do not appear to stack up. But then they appear to be playing around with figures, indicated by a statement in the explanation of Figure 2

Like other simple climate models, this lacks an explicit physical link between oceanic heat and carbon uptake. It allows a global feedback between temperature and carbon uptake from the atmosphere, but no direct link with net deforestation. It also treats all forcing agents equally, in the sense that a single set of climate response parameters is used for all forcing components, despite some evidence of component-specific responses. We do not, however, attempt to calibrate the model directly against observations, using it instead to explore the implications of ranges of uncertainty in emissions, and forcing and response derived directly from the IPCC-AR5, which are derived from multiple lines of evidence and, importantly, do not depend directly on the anomalously cool temperatures observed around 2010.

That is:-

  • The model does not consider an “explicit physical link between oceanic heat and carbon uptake.” The IPCC estimated that over 90% of heat accumulation since 1970 was in the oceans. If the oceans were to belch out some of this heat at a random point in the future, the 1.5°C limit would be exceeded.
  • No attempt has been made to “calibrate the model directly against observations”. Therefore there is no attempt to properly reconcile beliefs to the real world.
  • The “multiple lines of evidence” in IPCC-AR5 do not include a glaring anomaly that potentially falsifies the theory, and therefore any “need” for policy at all. That is the divergence of actual temperature trends from theory in this century.

Conclusions

The authors of Millar et al. 2017 have pushed out the boundaries to continue to support climate mitigation policies. To justify constraining emissions sufficiently to stop 1.5°C of warming being exceeded, the authors would appear to have:-

  • Assumed that all the warming since 1870 is caused by anthropogenic GHG emissions when there is not even a valid statistical test that confirms even half the warming was from this source.
  • Largely ignored any hidden heat or other long-term response to rises in GHGs.
  • Ignored the divergence between model predictions and actual temperature anomalies since around the turn of the century. This has two consequences. First, the evidence appears to strongly contradict the belief that humans are a major source of global warming and, by implication, dangerous climate change. Second, if it does not contradict the theory, it suggests that the amount of warming in the pipeline consequential on human GHG emissions has massively increased, so the 1.5°C warming could be breached anyway.
  • Made ECS as low as possible within the long-standing 1.5°C to 4.5°C range. Assuming ECS is at the mid-point of the range (as the IPCC has done for policy in all its reports) means that warming will breach the 1.5°C level without any further emissions.

The authors live in their closed academic world of models and shared beliefs. Yet the paper is being used for the continued support of mitigation policy that is both failing to get anywhere close to achieving its objectives and massively net harmful, whether financially or politically, in any country that applies it.

Kevin Marshall

Commentary at Cliscep, Jo Nova, Daily Caller, Independent, The GWPF

Update 25/09/17 to improve formatting.