Valve Turner Michael Foster’s Climate Necessity Defense

The Climate Necessity Defence cannot justify criminal acts that impede the lawful business of the fossil fuel industry. Such acts will never of themselves have a significant impact in constraining global greenhouse gas emissions. In any event, there will always be more than sufficient proven fossil fuel reserves in countries beyond the reach of any activist action, or even Government-backed action, to keep aggregate cumulative fossil fuel emissions far above the levels commensurate with constraining warming to 2°C. What such action does do is impose immediate harms on the actual victims of the crimes, and harms on the countries in which the crimes are committed. Part of the harm lies in benefitting the non-policy countries who produce fossil fuels. The conviction last week of climate activist Michael Foster is a clear case study.


The New York Times reports (hattip GWPF) on the conviction in North Dakota of Seattle resident Michael Foster.

Foster took part in an effort on Oct. 11, 2016, to draw attention to climate change by turning off valves on five pipelines that bring Canadian oil south. Foster targeted the Keystone Pipeline in North Dakota. Other activists targeted pipelines in Minnesota, Montana and Washington state.

A jury in North Dakota’s Pembina County on Friday convicted Foster after a weeklong trial of criminal mischief, criminal trespass and conspiracy. He faces up to 21 years in prison when he’s sentenced Jan. 18. The man who filmed his protest action, Samuel Jessup of Winooski, Vermont, was convicted of conspiracy and faces up to 11 years.

What I found interesting was the next sentence.

Foster had hoped to use a legal tactic known as the climate necessity defense — justifying a crime by arguing that it prevented a greater harm from happening.

The Climate Disobedience Center in its article for activists on the climate necessity defense says

The basic idea behind the defense — also known as a “choice of evils,” “competing harms,” or “justification” defense — is that the impacts of climate change are so serious that breaking the law is necessary to avert them.

Foster had his action filmed, shown from 2.07 here.

Keystone Pipeline. North Dakota. I’m Michael Foster. In order to preserve life as we know it and civilization, give us a fair chance and our kids a fair chance, I’m taking this action as a citizen. I am duty bound.

This was a significant action. The video quotes Reuters news agency.

Was this action “preserving life as we know it”? In shutting down the pipeline (along with the four other pipelines in the coordinated action), 590,000 barrels of oil failed to be transported from Canada to the USA that morning. It was merely delayed. If the pipelines are working at full capacity, it would maybe have been transported by rail instead. Or more produced in the USA. Or more imported from the Middle East. But suppose that those 590,000 barrels (83,000 tonnes) had been left in the ground, never to be extracted, rather than production being delayed. What marginal difference would that make to climate change?

From the BP Statistical Review of World Energy 2016 (full report), I find that global oil production in 2015 was around 92 million barrels per day, or 4362 million tonnes in the full year. Global production would have been 0.6% lower on Oct. 11, 2016 or 0.002% lower in the full year. Yet there is plenty of the stuff in the ground. Proven global reserves are around 50.7 years of global production. Leaving 590,000 barrels in the ground will reduce proven reserves by around 0.000038%. That is less than one part in a million of proven oil reserves. Yet in the last few years, proven reserves have been increasing, as extraction techniques keep improving. This despite production climbing as well. 2015 production was 21% higher than in 2000 and 56% higher than in 1985. Proven reserves in 2015 were 30% higher than in 2000 and 112% higher than in 1985.
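These marginal percentages are straightforward to verify. A short Python sketch, using the approximate BP Statistical Review figures quoted above:

```python
# Marginal impact of the 590,000 barrels, using the approximate
# BP Statistical Review figures quoted in the text.
barrels_delayed = 590_000          # barrels not transported on Oct. 11, 2016
tonnes_delayed = 83_000            # the same quantity in tonnes
daily_production = 92_000_000      # global barrels per day, 2015
annual_production_mt = 4362        # global million tonnes, 2015
reserves_years = 50.7              # proven reserves as years of production

share_of_day = barrels_delayed / daily_production
share_of_year = tonnes_delayed / (annual_production_mt * 1e6)
share_of_reserves = tonnes_delayed / (reserves_years * annual_production_mt * 1e6)

print(f"{share_of_day:.2%}")       # ~0.64% of one day's output
print(f"{share_of_year:.4%}")      # ~0.002% of a year's output
print(f"{share_of_reserves:.6%}")  # ~0.000038% of proven reserves
```

The reserves figure is less than one part in a million of proven oil reserves alone.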

I have divided up those 50.7 years of reserves by major areas.

The effect of turning off the oil pipeline is posturing unless it shuts down oil production in Canada and the USA. But that would still leave over 40 years of proven reserves elsewhere. Are Russia and Middle Eastern countries going to shut down their production because of the criminal acts of a few climate activists in the USA?

But oil is not the only major fossil fuel. Production of coal in 2015 was 3830 million tonnes of oil equivalent, 88% of oil production. Proven coal reserves are 123 years of current production. Further, if oil prices rise to the levels seen over the last few years, it will become economic to convert more coal to liquids, a process which emits four to five times the CO2 of burning oil.

Are China, Russia, India, Australia, Ukraine, Indonesia, South Africa and many other countries going to shut down their production because of the criminal acts of a few climate activists in the USA?

The third major fossil fuel is gas. Production in 2015 was 3200 million tonnes of oil equivalent, 73% of oil production. Proven reserves are equivalent to 52.8 years of current production levels.

The reserves are slightly more centralized than for oil or coal. As with oil, a large part of the available reserves is located in Russia and the Middle East.

Leaving 590,000 barrels in the ground would reduce proven reserves of fossil fuels by around one part in ten million.
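The one-part-in-ten-million figure follows from adding the three reserve bases together in oil-equivalent terms. A rough Python check on the reserves-to-production figures given above, treating a tonne of oil as one tonne of oil equivalent:

```python
# Proven fossil fuel reserves in million tonnes of oil equivalent (Mtoe),
# from the reserves-to-production ratios quoted above.
oil_reserves = 50.7 * 4362    # years of production x Mtoe per year
coal_reserves = 123 * 3830
gas_reserves = 52.8 * 3200
total_mtoe = oil_reserves + coal_reserves + gas_reserves

share = 0.083 / total_mtoe    # 83,000 tonnes of oil = 0.083 Mtoe
print(f"1 part in {1 / share:,.0f}")   # roughly 1 part in 10 million
```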

The 50+ years of proven reserves of oil and gas, and 120+ years of proven reserves of coal, need to be put into a policy context. The IPCC AR5 Synthesis Report gave a very rough guide to how much CO2 (or equivalent greenhouse gases) could be emitted to limit warming to less than 2°C. From 2012 it was 1000 GtCO2e.

With emissions in 2011 at around 50 GtCO2e, that gave 20 years. From next year that will be less than 15 years. The recent paper “Emission budgets and pathways consistent with limiting warming to 1.5C” (hereafter Millar et al. 2017) reevaluated the figures, with 1.5°C not being breached for a further 20 years. Whichever way you look at the figures, most of the proven fossil fuels in the world will have to be left in the ground. That requires the agreement of Saudi Arabia, Russia, Iran, Iraq, Qatar, Kuwait, Turkmenistan, China, India and Venezuela, alongside the USA, Canada, Australia and a large number of other countries.
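The years-remaining arithmetic is a simple division of the remaining budget by the annual emissions rate:

```python
# Remaining carbon budget expressed as years of current emissions,
# using the AR5 figures quoted above.
budget_gtco2e = 1000   # remaining budget from 2012 for ~2°C of warming
annual_gtco2e = 50     # approximate global emissions in 2011

years_left_from_2012 = budget_gtco2e / annual_gtco2e
print(years_left_from_2012)                  # 20.0
print(years_left_from_2012 - (2018 - 2012))  # 14.0 - under 15 years from 2018
```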

Further, there can be no extraction of fossil fuels from currently unproven reserves, which will likely exceed the proven reserves.

The efforts of Michael Foster and his mates could incite further criminal acts. But even massive lawbreaking throughout the United States would be insufficient to significantly dent the production and distribution of fossil fuels in the USA. Even if it did, there are plenty of other countries who would willingly meet the existing demand. All that the action is likely to do is push up the costs of production and distribution in the USA, harming the US economy and the futures of the people involved in the fossil fuel industries and energy-intensive industries.

It is this failure to make a significant marginal difference through the action – that is, in reducing global greenhouse gas emissions – that renders the climate necessity defense void. Even if large numbers of other actions are inspired by Foster and others, it would still be insufficient to get anywhere close to the constraint in emissions needed to hold warming to 1.5°C or 2°C. On a larger scale, even if all major Western economies shut down all fossil fuel production and consumption immediately, it would merely delay by a few years the point at which cumulative aggregate emissions from 2012 onwards exceed 1000 GtCO2e.

It gets worse. A particular case must be decided on the damage caused to the victims of the crime – in this case the owners of the pipeline, the employees of the business, the customers who do not get their oil, and so on. If there are beneficiaries, it is the billions of people in generations to come. The marginal difference to the victims of the action is tangible and has happened. The marginal difference to the beneficiaries is imperceptible, and even then based on belief in what amounts to nothing more than pseudo-scientific prophecies. But given that a shut-down of production in the USA is likely to be met by increased production elsewhere, even these future dispersed and speculative benefits are unlikely to accrue.

More broadly, if specific people need to have their immediate interests sacrificed for the greater good, surely that is the function of Government, not of some wayward activists? In that way the harms could be more equitably distributed. With random acts of criminality, the harms are more likely to be based on the prejudices of the activists.


The Climate Necessity Defence is an invalid justification for the criminal actions of Michael Foster and others in shutting down the oil pipelines from Canada into the USA. The marginal impact of the action on reducing greenhouse gas emissions, even if the lost output were not made up by increased production elsewhere, is about one part in ten million. But given that most of the global proven fossil fuel reserves are concentrated in a small number of countries – many of whom have no commitment to reduce emissions, let alone leave the source of major revenues in the ground – the opportunity of producing more is likely to be taken up. Further, the harm from the activists’ action is immediate, very definite and concentrated, whilst the benefits of reduced climate change impacts from reduced emissions are speculative and dispersed over tens of billions of people.

Kevin Marshall

The Policy Gap in Achieving the Emissions Goals

Millar et al. 2017 has severe problems with the numbers, as my previous post suggested. But there is a more fundamental problem in achieving the emissions goals. It is contained in the introductory paragraphs of an article lead author Richard Millar posted at Carbon Brief.

The Paris Agreement set a long-term goal of limiting global warming to “well-below” 2C above pre-industrial levels and to pursue efforts to restrict it to 1.5C.

A key question for the upcoming rounds of the international climate negotiations, particularly when countries review their climate commitments next year, is exactly how fast would we have to cut emissions to reach these goals?

In a new paper, published in Nature Geoscience, we provide updated estimates of the remaining “carbon budget” for 1.5C. This is the total amount of CO2 emissions that we can still emit whilst limiting global average warming to 1.5C.

Our estimates suggest that we would have a remaining carbon budget equivalent to around 20 years at current emissions rates for a 2-in-3 chance of restricting end-of-century warming to below 1.5C.

This suggests that we have a little more breathing space than previously thought to achieve the 1.5C limit. However, although 1.5C is not yet a geophysical impossibility, it remains a very difficult policy challenge.

The problem is with the mixing of singular and plural statements. The third paragraph shows the problem.

In a new paper, published in Nature Geoscience, we provide updated estimates of the remaining “carbon budget” for 1.5C. This is the total amount of CO2 emissions that we can still emit whilst limiting global average warming to 1.5C.

In the first sentence, the collective “we” refers to the ten authors of the paper. That is Richard J. Millar, Jan S. Fuglestvedt, Pierre Friedlingstein, Joeri Rogelj, Michael J. Grubb, H. Damon Matthews, Ragnhild B. Skeie, Piers M. Forster, David J. Frame & Myles R. Allen. In the second sentence, the collective “we” refers to the approximately 7500 million people on the planet, who live in about 195 countries. Do the authors speak for all the people of Russia, India, Nigeria, Iran, Iraq, China, Taiwan, North and South Korea, the United States and Australia, for instance? What I would suggest is that they are speaking figuratively about what they believe the world ought to be doing.

Yet the political realities are that even though most countries have signed the Paris Agreement, it does not commit them to a particular emissions pathway, nor to eliminating their emissions by a particular date. It only commits them to produce further INDC submissions every five years, along with attending meetings and making the right noises. Their INDC submissions are not scrutinized, still less sent back for “improved ambition” if they are inadequate contributions to the aggregate global plan.

Looking at the substance of the Paris Agreement, point 17 gives an indication of the policy gap.

17. Notes with concern that the estimated aggregate greenhouse gas emission levels in 2025 and 2030 resulting from the intended nationally determined contributions do not fall within least-cost 2 ˚C scenarios but rather lead to a projected level of 55 gigatonnes in 2030, and also notes that much greater emission reduction efforts will be required than those associated with the intended nationally determined contributions in order to hold the increase in the global average temperature to below 2 ˚C above pre-industrial levels by reducing emissions to 40 gigatonnes or to 1.5 ˚C above pre-industrial levels by reducing to a level to be identified in the special report referred to in paragraph 21 below;

But the actual scale of the gap is best seen from the centerpiece graphic of the UNFCCC Synthesis report on the aggregate effect of INDCs, prepared in the run-up to COP21 Paris. Note that this website also has all the INDC submissions in three large Pdf files.

I have updated the graphic with my estimates of the policy gaps implied by the revised Millar et al. 2017 figures, shown by the red arrows.

The extent of the arrows could be debated, but that will not alter the fact that Millar et al. 2017 are assuming that, by adjusting the figures and by thinking on behalf of the whole world, the emissions objectives will be achieved. The reality is that very few countries have committed to reducing their emissions by anything like an amount consistent with even a 2°C pathway. Further, that commitment runs only until 2030, not for the 70 years beyond that. There is no legally-binding commitment in the Paris Agreement for a country to reduce emissions to zero sometime before the end of the century. Further, a number of countries (including Nigeria, Togo, Saudi Arabia, Turkmenistan, Iraq and Syria) have not signed the Paris Agreement, and the United States has given notification of leaving the Agreement. Barring huge amounts of funding or some technological miracle, most developing countries, with a majority of the world’s population, will go on increasing their emissions for decades. This includes most of the countries who were Non-Annex Developing Countries to the 1992 Rio Declaration. Collectively they accounted for just over 100% of the global GHG emissions growth between 1990 and 2012.

As some of these countries’ INDC submissions clearly state, most will not sacrifice economic growth and the expectations of their people for the unproven dogma of politicized academic activists in completely different cultures, who say that the world ought to cut emissions. They will attend climate conferences and be seen on the world stage, then sign meaningless agreements afterward that commit them to nothing.

As a consequence, if catastrophic anthropogenic global warming is true (like the fairies at the bottom of the garden) and the climate mitigation targets are achieved, the catastrophic climate change will be only slightly less catastrophic, and the most extreme climate mitigation countries will be a good deal poorer. The non-policy countries will be the ones better off. It is the classic free-rider problem, which results in an underprovision of the goods or services concerned. If AGW is somewhat milder, then even the non-policy countries will be no worse off.

This is what really irritates me. I live in Britain, where the Climate Change Act 2008 has probably the most ludicrous targets in the world. That Act was meant to lead the world on climate change. The then Environment Secretary David Miliband introduced the bill with this message in March 2007.

The graphic above shows that at COP21 Paris most of the world was not following Britain’s lead. But the “climate scientists” are so stuck in their manipulated models that they forget that their models, and the beliefs of their peers, are not the realities of the wider world. The political realities mean that the reduction of CO2 emissions is net harmful to the people of Britain, both now and for future generations of Britons. The activists are just as wilfully negligent in shutting down any independent review of policy as a pharmaceutical company that pushes one of its products onto consumers without an independent evaluation of both the benefits and potential side effects.

Kevin Marshall

Nature tacitly admits the IPCC AR5 was wrong on Global Warming

There has been a lot of comment on a recent paper in Nature Geoscience, “Emission budgets and pathways consistent with limiting warming to 1.5C” (hereafter Millar et al. 2017).

When making a case for public policy, I believe that something akin to a process of due diligence should be carried out on the claims. That is, the justifications ought to be scrutinized to validate them. With Millar et al. 2017, there are a number of issues with the make-up of the claims that (a) warming of 1.5°C or greater will be reached without policy and (b) constraining cumulative emissions to the stated budget will keep warming below 1.5°C.

The baseline warming

The introduction states
Average temperatures for the 2010s are currently 0.87°C above 1861–80,

A similar quote from IPCC AR5 WG1 SPM, page 5:

The total increase between the average of the 1850–1900 period and the 2003–2012 period is 0.78 [0.72 to 0.85] °C, based on the single longest dataset available.

These figures are all from the HADCRUT4 dataset. Three factors account for the difference of 0.09°C. Mostly it is the shorter baseline period. In addition, the last three years have been influenced by a powerful and natural El Niño, and the IPCC used an average of the 10 years to 2012 rather than of the 2010s to date.

The warming in the pipeline

There are valid reasons for the authors differing from the IPCC’s methodology. They start with the emissions from 1870 (even though emissions estimates go back to 1850). Also, if there is no definite finish date, it is very difficult to calculate the warming impact to date. Consider first the full sentence quoted above.

Average temperatures for the 2010s are currently 0.87°C above 1861–80, which would rise to 0.93°C should they remain at 2015 levels for the remainder of the decade.

This implies that there is some warming still to come through from the impact of the higher greenhouse gas levels. Yet this seems remarkably low, and over a very short time period. Of course, not all the warming since the mid-nineteenth century is from anthropogenic greenhouse gas emissions. The anthropogenic element is just guesstimated. This is shown in AR5 WG1 Ch10, page 869.

More than half of the observed increase in global mean surface temperature (GMST) from 1951 to 2010 is very likely due to the observed anthropogenic increase in greenhouse gas (GHG) concentrations.

It was after 1950 that the largest increase in CO2 levels occurred. From 1870 to 1950, CO2 levels rose from around 290ppm to 310ppm, or 7%. From 1950 to 2010, CO2 levels rose from around 310ppm to 387ppm, or 25%. Add in other GHGs and the human-caused warming should be 3-4 times greater in the later period than in the earlier one, whereas the warming in the later period was just over twice the amount. Therefore, if there is just over a 90% chance (very likely in IPCC-speak) that over 50% of the warming post-1950 was human-caused, a statistical test relating to a period more than twice as long would find a lower human-caused element of the warming to be statistically significant. Even then, I view the greater-than-50% statistic as being deeply flawed, especially post-2000, when the rate of rise in CO2 levels accelerated whilst the rise in average temperatures dramatically slowed. This suggests two things. First, rising GHG emissions could be a minor element in the temperature rise, with natural factors both causing some of the warming in the 1976-1998 period, then reversing and causing cooling in the last few years. Second, there could be a decidedly odd lagged response of temperatures to rising GHGs (especially CO2). That is, the amount of warming in the pipeline has increased dramatically. If either idea has any traction, then the implied warming to come of just 0.06°C is a false estimate. This needs to be elaborated.
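The concentration rises, and the ratio of the implied CO2-only forcings, can be sketched as follows. Radiative forcing from CO2 scales with the logarithm of concentration; the 3-4 times figure in the text also includes other GHGs, which are not modelled here:

```python
from math import log

# CO2 rises in the two periods, and the ratio of the implied
# CO2-only radiative forcings (forcing scales with ln(C/C0)).
rise_1870_1950 = (310 - 290) / 290   # ~7%
rise_1950_2010 = (387 - 310) / 310   # ~25%

forcing_ratio = log(387 / 310) / log(310 / 290)
print(f"{rise_1870_1950:.1%}, {rise_1950_2010:.1%}")  # 6.9%, 24.8%
print(f"{forcing_ratio:.1f}")  # ~3.3x the CO2-only forcing in the later period
```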

Climate Sensitivity

If a doubling of CO2 leads to 3.00°C of warming (the assumption of the IPCC in their emissions calculations), then a rise in CO2 levels from 290ppm to 398 ppm (1870 to 2014) eventually gives 1.37°C of warming. With other GHGs this figure should be around 1.80°C. Half that warming has actually occurred, and some of that is natural. So there is well over 1.0°C still to emerge. It is too late to talk about constraining warming to 1.5°C as the cause of that warming has already occurred.
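The 1.37°C figure follows from the standard logarithmic relationship: equilibrium warming is ECS degrees per doubling of CO2. A minimal check:

```python
from math import log2

# Equilibrium warming = ECS x number of CO2 doublings.
def warming(ecs, c0, c1):
    return ecs * log2(c1 / c0)

print(f"{warming(3.0, 290, 398):.2f}")  # 1.37°C from CO2 alone, 1870-2014
```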

The implication of the paper’s claim that 0.94°C will result from human emissions in the period 1870-2014 is to reduce the climate sensitivity estimate to around 2.0°C for a doubling of CO2 if only CO2 is considered, or around 1.5°C if all GHGs are taken into account (see below). Compare this to AR5 WG1 section D.2, Quantification of Climate System Responses.

The equilibrium climate sensitivity quantifies the response of the climate system to constant radiative forcing on multicentury time scales. It is defined as the change in global mean surface temperature at equilibrium that is caused by a doubling of the atmospheric CO2 concentration. Equilibrium climate sensitivity is likely in the range 1.5°C to 4.5°C (high confidence), extremely unlikely less than 1°C (high confidence), and very unlikely greater than 6°C (medium confidence).

The equilibrium climate sensitivity (ECS) is at the very bottom of the IPCC’s range, and the equilibrium climate response is reached in 5-6 years instead of on multicentury time scales. This is on top of the implied assumption that there was no net natural warming between 1870 and 2015.
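The implied sensitivity can be back-calculated from the paper’s 0.94°C figure. In the all-GHG case below, I assume other GHGs scale the CO2 forcing by the ratio of the 1.80°C and 1.37°C figures given earlier; that scaling is my assumption for illustration, not a figure from the paper:

```python
from math import log2

# Back-of-envelope implied ECS if 0.94°C is attributed to the
# 1870-2014 CO2 rise (290ppm to 398ppm).
co2_doublings = log2(398 / 290)
implied_ecs_co2 = 0.94 / co2_doublings
print(round(implied_ecs_co2, 1))   # ~2.1, i.e. around 2.0

# Assume other GHGs scale the forcing by 1.80/1.37 (illustrative only).
implied_ecs_all = 0.94 / (co2_doublings * (1.80 / 1.37))
print(round(implied_ecs_all, 1))   # ~1.6, i.e. around 1.5
```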

How much GHG emissions?

With respect to policy, as global warming is caused by human greenhouse gas emissions, preventing further human-caused warming requires reducing, and possibly eliminating, global greenhouse gas emissions. In conjunction with the publication of the AR5 Synthesis Report, the IPCC produced a slide show of the policy case laid out in the three vast reports. It was effectively a short summary of a summary of the synthesis report. Approaching the policy climax at slide 30 of 35:-

Apart from the fact that the policy objective in AR5 was to limit warming to 2°C, not 1.5°C, it also mentions the need to constrain GHG emissions, not just CO2 emissions. Then slide 33 gives the simplified policy position for achieving the 2°C limit.

To the end of 2011, an estimated 1900 GtCO2e of GHGs had been emitted, whilst around a further 1000 GtCO2e could be emitted before the 2°C of warming is reached.

This is the highly simplified version. At the other end of the scale, AR5 WG3 Ch6 p431 has a very large table in a very small font considering a lot of the policy options. It is reproduced below, though the resolution is much poorer than the original.

Note 3 states

For comparison of the cumulative CO2 emissions estimates assessed here with those presented in WGI AR5, an amount of 515 [445 to 585] GtC (1890 [1630 to 2150] GtCO2), was already emitted by 2011 since 1870

The top line is for the 1.5°C of warming – the most ambitious policy aim. Of note:-

  • The CO2 equivalent concentration in 2100 (ppm CO2eq ) is 430-480ppm.
  • Cumulative CO2 emissions (GtCO2) from 2011 to 2100 is 630 to 1180.
  • CO2 concentration in 2100 is 390-435ppm.
  • Peak CO2 equivalent concentration is 465-530ppm. This is higher than the 2100 concentration and, if treated as CO2 alone with ECS = 3, would eventually produce 2.0°C to 2.6°C of warming.
  • The Probability of Exceeding 1.5 °C in 2100 is 49-86%. They had to squeeze really hard to say that 1.5°C was more than 50% likely.
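The eventual warming from the peak concentrations in the fourth bullet can be checked with the same logarithmic relationship, treating the CO2-equivalent concentration as if it were CO2 alone:

```python
from math import log2

# Eventual warming from the peak CO2-equivalent concentrations,
# treated as CO2 alone with ECS = 3 and a 290ppm starting point.
peak_warming = [round(3 * log2(peak / 290), 1) for peak in (465, 530)]
print(peak_warming)   # [2.0, 2.6]
```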

Compare the above to this from the abstract of Millar et al. 2017.

If CO2 emissions are continuously adjusted over time to limit 2100 warming to 1.5°C, with ambitious non-CO2 mitigation, net future cumulative CO2 emissions are unlikely to prove less than 250 GtC and unlikely greater than 540 GtC. Hence, limiting warming to 1.5°C is not yet a geophysical impossibility, but is likely to require delivery on strengthened pledges for 2030 followed by challengingly deep and rapid mitigation.

They use tonnes of carbon as the unit of measure, as against tonnes of CO2 equivalent. The conversion factor is 3.664, so the cumulative CO2 emissions are in the 870-1010 GtCO2 range. As this is to the end of 2015, not 2011 as in the IPCC report, it will be different. Subtracting 150 from the IPCC report’s figures would give a range of 480 to 1030. That is, Millar et al. 2017 have reduced the emissions range by 75%, to the top end of the IPCC’s range. Given the IPCC considered a range of 1.5-1.7°C of warming, it seems somewhat odd to then say it relates to the lower end of the warming band, until you take into account that ECS has been reduced. But then why curtail the range of emissions instead of calculating your own? It appears that again the authors are trying to squeeze a result within existing constraints.
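The conversion factor between the two units comes from the molecular weights: a tonne of carbon oxidises to 44/12 tonnes of CO2.

```python
# Tonnes of carbon to tonnes of CO2: molecular weight of CO2 (44.01)
# over the atomic weight of carbon (12.011).
factor = 44.01 / 12.011
print(round(factor, 3))   # 3.664
```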

However, this does not take into account the much higher levels of peak CO2 equivalent concentrations in table 6.3. Peak CO2 concentrations are around 75-95ppm higher than in 2100. Compare this to the green line in the central graph in Millar et al. 2017.

This is less than 50ppm higher than in 2100. Further, in 2100 Millar et al. 2017 has CO2 levels of around 500ppm, as against a mid-point of 410ppm in AR5. CO2 rising from 290 to 410ppm with ECS = 3.0 eventually produces 1.50°C of warming, whilst CO2 rising from 290 to 500ppm with ECS = 2.0 produces 1.57°C of warming. Further, this does not include the warming impact of other GHGs. To squeeze into the 1.5°C band, the mid-century overshoot in Millar et al. 2017 is much less than in AR5. This might be required by the modeling assumptions, given the very short time assumed in reaching the full equilibrium climate response.
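A quick check of that comparison, using the ~500ppm Millar et al. end-point against the ~410ppm AR5 mid-point (CO2 only, 290ppm starting point):

```python
from math import log2

# Equilibrium warming = ECS x number of CO2 doublings.
def warming(ecs, c0, c1):
    return ecs * log2(c1 / c0)

print(f"{warming(3.0, 290, 410):.2f}")  # 1.50°C: AR5 mid-point, ECS = 3.0
print(f"{warming(2.0, 290, 500):.2f}")  # 1.57°C: Millar end-point, ECS = 2.0
```

The two combinations give almost the same eventual warming, which is the point of the comparison.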

Are the authors playing games?

The figures do not appear to stack up, and the authors appear to be playing around with them, as indicated by a statement in the explanation of Figure 2.

Like other simple climate models, this lacks an explicit physical link between oceanic heat and carbon uptake. It allows a global feedback between temperature and carbon uptake from the atmosphere, but no direct link with net deforestation. It also treats all forcing agents equally, in the sense that a single set of climate response parameters is used for all forcing components, despite some evidence of component-specific responses. We do not, however, attempt to calibrate the model directly against observations, using it instead to explore the implications of ranges of uncertainty in emissions, and forcing and response derived directly from the IPCC-AR5, which are derived from multiple lines of evidence and, importantly, do not depend directly on the anomalously cool temperatures observed around 2010.

That is:-

  • The model does not consider an “explicit physical link between oceanic heat and carbon uptake.” The IPCC estimated that over 90% of heat accumulation since 1970 was in the oceans. If the oceans were to belch out some of this heat at a random point in the future, the 1.5°C limit would be exceeded.
  • No attempt has been made to “calibrate the model directly against observations”. Therefore there is no attempt to properly reconcile beliefs with the real world.
  • The “multiple lines of evidence” in IPCC-AR5 do not include a glaring anomaly that potentially falsifies the theory, and therefore any “need” for policy at all. That is the divergence of actual temperature trends from theory in this century.


The authors of Millar et al. 2017 have pushed out the boundaries to continue to support climate mitigation policies. To justify constraining emissions sufficiently to stop 1.5°C of warming, the authors would appear to have:-

  • Assumed that all the warming since 1870 is caused by anthropogenic GHG emissions when there is not even a valid statistical test that confirms even half the warming was from this source.
  • Largely ignored any hidden heat or other long-term response to rises in GHGs.
  • Ignored the divergence between model predictions and actual temperature anomalies since around the turn of the century. This has two consequences. First, the evidence appears to strongly contradict the belief that humans are a major source of global warming and by implication dangerous climate change. Second, if it does not contradict the theory, suggests the amount of warming in the pipeline consequential on human GHG emissions has massively increased. Thus the 1.5°C warming could be breached anyway.
  • Made ECS as low as possible within the long-standing 1.5°C to 4.5°C range. Even assuming ECS at the mid-point of the range for policy (as the IPCC has done in all its reports) means that warming will breach the 1.5°C level without any further emissions.

The authors live in their closed academic world of models and shared beliefs. Yet the paper is being used for the continued support of mitigation policy that is both failing to get anywhere close to achieving its objectives and is massively net harmful in any country that applies it, whether financially or politically.

Kevin Marshall

Commentary at Cliscep, Jo Nova, Daily Caller, Independent, The GWPF

Update 25/09/17 to improve formatting.

The Inferior Methods in Supran and Oreskes 2017

In the previous post I looked at one aspect of the article Assessing ExxonMobil’s Climate Change Communications (1977–2014) by Geoffrey Supran and Naomi Oreskes. I concluded that the basis for the evaluation of ExxonMobil’s sponsored climate papers – “AGW is real, human-caused, serious, and solvable” – is a mantra held by people who fail to distinguish between empirical and verifiable statements, tautologies, opinions, and public policy that requires some fanciful global political implementation. In this post I look at how the application of that mantra in analyzing journal articles can lead to grossly misleading interpretations.

Under Section 2, Method, in Table 2 the authors lay out their evaluation criteria in terms of how the wording supports (endorses) or doubts elements of the mantra. For “AGW is real & human-caused” there are quite complex criteria. But the criteria for whether it is “serious” and “solvable” are much more straightforward, and I have reproduced them below.

The acknowledgment or doubt of “AGW as serious” or “AGW as solvable” is judged in relation to the mantra. That is the only criterion used. Supran and Oreskes would claim that this does not matter. What they are looking at is the positions communicated in the papers relative to the positions expressed by ExxonMobil externally. But there are problems with this methodology in terms of the alternative perspectives that are missing.

First is that the underlying quality and clarity of the results, and the relevancy of each paper, are ignored. What matters to Supran and Oreskes is the language used.

Second is that ExxonMobil’s papers are not the only research on whether “AGW is real, human-caused, serious, and solvable”. The authors could also take into account the much wider body of papers out there within the broad areas covered by the mantra.

Third, if the totality of the research – whether ExxonMobil’s or the totality of climate research – does not amount to a strong case for anthropogenic global warming being a serious global problem, nor to its having a workable solution, why should they promote politicized delusions?

Put this into the context of ExxonMobil – one of the World’s most successful businesses over decades – by applying some of the criteria it would likely use in assessing a major project or major strategic investment. For instance:

  • How good is the evidence that there is a serious problem on a global scale emerging from human GHG emissions?
  • How strong is the evidence that humans have caused the recent warming?
  • Given many years of research, what is the track record of improving the quality and refinement of the output in the climate area?
  • What quality controls and KPIs are in place to enable both internal and external auditors to validate the work?
  • Where projections are made, what checks on the robustness of those projections have been done?
  • Where economic projections are produced, have they been done by competent mainstream economists, what are the assumptions made, and what sensitivity analyses have been done on those assumptions?
  • Does the project potentially harm investors, employees, customers and other stakeholders in the business? Where are the risk assessments of such potential harms, along with the procedures for the reporting and investigation of non-compliances?
  • Does a proposed project risk contravening laws and internal procedures relating to bribery and corruption?
  • Once a project is started, is it possible to amend that project over time or even abandon it should it fail to deliver? What are the contractual clauses that enable project amendment or abandonment and the potential costs of doing so?

Conclusions and further thoughts

Supran and Oreskes evaluate the ExxonMobil articles for AGW and policy in terms of a belief mantra applied to a small subset of the literature on the subject. Each article is looked at independently of all other articles, and indeed of all other available information. Further, any legitimate argument or evidence that undermines the mantra is counted as evidence of doubt. The whole onus is thrown onto ExxonMobil to disprove the allegations, while Supran and Oreskes are never required to justify that their mantra or their method of analysis is valid.

There are some questions arising from this, that I hope to pursue in later posts.

1. Is the method of analysis just a means of exposing ExxonMobil’s supposed hypocrisy by statistical means, or does it stem from a deeply flawed and ideological way of perceiving the world, that includes trying to shut out the wider realities of the real world, basic logic and other competing (and possibly superior) perspectives?

2. Whatever spread of misinformation and general hypocrisy might be shown on the part of ExxonMobil from more objective and professional perspectives, is there not greater misinformation sown by the promoters of the “climate consensus“?

3. Can any part of the mantra “AGW is real, human-caused, serious, and solvable” be shown to be false in the real world, beyond reasonable doubt?

Kevin Marshall


Supran and Oreskes on ExxonMobil’s Communication of Climate Change

Over at Cliscep, Geoff Chambers gave a rather bitter review (with foul language) about a new paper, Assessing ExxonMobil’s Climate Change Communications (1977–2014) by Geoffrey Supran and Naomi Oreskes.
One point that I would like to explore is part of a quote Geoff uses:-

The issue at stake is whether the corporation misled consumers, shareholders and/or the general public by making public statements that cast doubt on climate science and its implications, and which were at odds with available scientific information and with what the company knew. We stress that the question is not whether ExxonMobil ‘suppressed climate change research,’ but rather how they communicated about it.

It is the communication of climate science by a very powerful oil company, that the paper concentrates upon. The approach reveals a lot about the Climate Change movement as well. In particular, this statement in the introduction:-

Research has shown that four key points of understanding about AGW—that it is real, human-caused, serious, and solvable—are important predictors of the public’s perceived issue seriousness, affective issue involvement, support for climate policies, and political activism [62–66].

The references are as follows

[62] Krosnick J A, Holbrook A L, Lowe L and Visser P S 2006 The origins and consequences of democratic citizens’ policy agendas: a study of popular concern about global warming Clim. Change 77 7–43
[63] Ding D, Maibach E W, Zhao X, Roser-Renouf C and Leiserowitz A 2011 Support for climate policy and societal action are linked to perceptions about scientific agreement Nat. Clim. Change 1 462–6
[64] Roser-Renouf C, Maibach E W, Leiserowitz A and Zhao X 2014 The genesis of climate change activism: from key beliefs to political action Clim. Change 125 163–78
[65] Roser-Renouf C, Atkinson L, Maibach E and Leiserowitz A 2016 The consumer as climate activist Int. J. Commun. 10 4759–83
[66] van der Linden S L, Leiserowitz A A, Feinberg G D and Maibach E W 2015 The scientific consensus on climate change as a gateway belief: experimental evidence PLoS One 10 e0118489

For the purposes of Supran and Oreskes’ study, the understanding that people have does not require any substance at all beyond beliefs. For instance, the Jehovah’s Witnesses developed an “understanding” that Armageddon would occur in 1975. This certainly affected their activities in the lead-up to the momentous history-ending event. Non-believers or members of mainstream Christian churches may have been a little worried, shrugged their shoulders, or thought the whole idea ridiculous. If studies similar to those on climate activism had been conducted on the prophecy of Armageddon in 1975, similar results could have been found to those quoted for AGW beliefs in references 62-66. That is, the stronger the belief in the cause – whether religious evangelism in the case of Jehovah’s Witnesses, or ideological environmentalism in the case of AGW – the stronger the predictor of activism in support of the cause. The studies cannot go further because of a constraint on scholarly articles: claims made must be substantiated, something that cannot be done with respect to the prophecies of climate catastrophism, except in a highly nuanced form.
But the statement that AGW is “real, human-caused, serious, and solvable” – repeated five times in the article – indicates something about the activists’ understanding of complex issues.
“AGW is real” is not a proper scientific statement, as it is not quantified. Given that the impacts on surface temperatures can be muffled and delayed nearly indefinitely by natural factors, or swallowed by the oceans, the belief can be independent of any contrary evidence for decades to come.
“AGW is human-caused” is saying “Human-caused global warming is human-caused”. It is a tautology that tells us nothing about the real world.
“AGW is serious” is an opinion. It may be a very widely-held opinion, with many articles written with confirming evidence, and many concerned people attending massive conferences where it is discussed. But without clear evidence of emerging net adverse consequences, the opinion is largely unsubstantiated.
“AGW is solvable” could refer to whether it is theoretically solvable, given the technology and the policies being implemented. But the statement also includes whether it is politically solvable – getting actual policies to reduce emissions fully implemented. If the “solution” is the reduction of global emissions to a level commensurate with 2°C of warming (hence a partial solution), then COP21 in Paris shows that AGW is a long way from being solvable, with no actual solution in sight. Whereas the 2°C limit requires global emissions to be lower in 2030 than in 2015, and falling rapidly, fully implemented policies would still see emissions higher in 2030 than in 2015, and still increasing.

The statement AGW is “real, human-caused, serious, and solvable” is, therefore, nothing more than a mantra held by people who fail to distinguish between empirical and verifiable statements, tautologies, opinions and public policy that requires some fanciful global political implementation. 

Kevin Marshall

Met Office Extreme Wet Winter Projections

I saw an article in the Telegraph

Met Office warns Britain is heading for ‘unprecedented’ winter rainfall, with records broken by up to 30pc 

Britain is heading for “unprecedented” winter rainfall after the Met Office’s new super computer predicted records will be broken by up to 30 per cent.

Widespread flooding has hit the UK in the past few years leading meteorologists to search for new ways to “quantify the risk of extreme rainfall within the current climate”.

In other words, the Telegraph is reporting the Met Office projection that if the current record is, say, 100mm, new records of 130mm could be set.

The BBC is reporting something slightly different

High risk of ‘unprecedented’ winter downpours – Met Office

There is an increased risk of “unprecedented” winter downpours such as those that caused extensive flooding in 2014, the UK Met Office says.

Their study suggests there’s now a one in three chance of monthly rainfall records being broken in England and Wales in winter.

The estimate reflects natural variability plus changes in the UK climate as a result of global warming.

The BBC has a nice graphic, of the most extreme winter month of recent years for rainfall.

The BBC goes onto say

Their analysis also showed a high risk of record-breaking rainfall in England and Wales in the coming decade.

“We found many unprecedented events in the model data and this comes out as a 7% risk of a monthly record extreme in a given winter in the next few years, that’s just over Southeast England,” Dr Vikki Thompson, the study’s lead author told BBC News.

“Looking at all the regions of England and Wales we found a 34% chance of an extreme event happening in at least one of those regions each year.”

Not only is there a greater risk, but the researchers were also able to estimate that these events could break existing records by up to 30%.

“That is an enormous number, to have a monthly value that’s 30% larger, it’s a bit like what we had in 2014, and as much again,” said Prof Adam Scaife from the Met Office.

The “30% larger” figure is an outlier.

But over what period is the record?

The Met Office website has an extended version of what the BBC reports, but strangely no figures. There is a little video by Dr Vikki Thompson to explain.

She does say only recent data is used, but gives no definition of what constitutes recent. A clue lies not in the text, but in an explanatory graphic.

It is from 35 years of winters, which ties into the BBC’s graphic from 1981. There are nine regions in England and Wales by the Met Office definition. The tenth political region of London is included in the South East. There could be different regions for the modeling. As Ben Pile and Paul Homewood pointed out in the comments to the Cliscep article, elsewhere the Met Office splits England and Wales into six regions. What is amazing is that the Met Office article does not clarify the number of regions, still less show the current records in the thirty-five years of data. There is therefore no possibility of ever verifying the models.

Put this into context. Northern Ireland and Scotland are excluded, which seems a bit arbitrary. If rainfall were random, the chance of this coming winter setting a new record in any one region is nearly 3%. Across nine regions, if rainfall data were independent between regions (which they are not), it is nearly a 26% chance. 34% is higher. But consider the many alternative ways for climate patterns to become more extreme and variable. After all, with global warming the climate could be thrown into chaos, so more extreme weather should be emerging as a foretaste of much worse to come. Given the many different aspects of weather, there could be hundreds of possible ways climate could get worse. With rainfall, it could be wetter or drier, in either summer or winter. That is four variables, of which the Met Office chose just one. Or the record could be over any 1, 2, 3… or 12 month period. Then again, climate change could mean more frequent and violent storms, such as that of 1987. Or it could mean more heatwaves. Statistically, heatwave records could be defined in a number of different ways, such as, say, 5 consecutive days in a month where the peak daily temperature is more than 5°C above the long-term monthly average peak temperature.
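The back-of-envelope probabilities above can be sketched in a few lines. This is a rough check only: it assumes 35 years of data and nine regions, and since real regional rainfall is positively correlated, the true chance lies below the independence figure.

```python
# Chance of a record-wet winter month under the null assumption that
# each of the 35 observed winters is equally likely to be the wettest.
years = 35
regions = 9

p_one = 1 / years                      # single region: ~2.9%
p_sum = regions * p_one                # simple sum over nine regions: ~25.7%
p_indep = 1 - (1 - p_one) ** regions   # if regions were independent: ~23.0%

print(f"single region: {p_one:.1%}")
print(f"sum over nine regions: {p_sum:.1%}")
print(f"independent regions: {p_indep:.1%}")
```

Either way, a "record in at least one region" is far from improbable under pure chance, which is the point being made.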
So why choose rainfall in winter? Maybe it is because in recent years there have been a number of unusually wet winters. It looks like the Met Office, for all the power of their mighty computers, have fallen for a common fallacy.


Texas sharpshooter fallacy is an informal fallacy which is committed when differences in data are ignored, but similarities are stressed. From this reasoning, a false conclusion is inferred. This fallacy is the philosophical/rhetorical application of the multiple comparisons problem (in statistics) and apophenia (in cognitive psychology). It is related to the clustering illusion, which refers to the tendency in human cognition to interpret patterns where none actually exist.
The name comes from a joke about a Texan who fires some gunshots at the side of a barn, then paints a target centered on the tightest cluster of hits and claims to be a sharpshooter.

A run of extremely wet winters might be due to random clustering, it could reflect genuine patterns of natural variation, or it could be a sign of human-caused climate change. An indication of random clustering would be to look at many other aspects of weather, to see if there is a recent trend of emerging climate chaos. Living in Britain, I suspect that the recent wet weather is just drawing the target around the tightest clusters. Even then, high winter rainfall in Britain is usually accompanied by slightly milder temperatures than average. Extreme winter cold usually occurs on cloud-free days. So, if winter rainfall is genuinely getting worse, it seems that the whole global warming thing for Britain is predicted to be a bit of a damp squib.

Kevin Marshall


Larsen C ice-shelf break-away is not human-caused but Guardian tries hard to imply otherwise

A couple of days ago the BBC had an article Giant iceberg splits from Antarctic.

The giant block is estimated to cover an area of roughly 6,000 sq km; that’s about a quarter the size of Wales.

A US satellite observed the berg on Wednesday while passing over a region known as the Larsen C Ice Shelf.

Scientists were expecting it. They’d been following the development of a large crack in Larsen’s ice for more than a decade.

The rift’s propagation had accelerated since 2014, making an imminent calving ever more likely.

After looking at various evidence the BBC concludes

“Most glaciologists are not particularly alarmed by what’s going on at Larsen C, yet. It’s business as usual.”

Researchers will be looking to see how the shelf responds in the coming years, to see how well it maintains a stable configuration, and if its calving rate changes.

There was some keen interest a while back when the crack, which spread across the shelf from a pinning point known as the Gipps Ice Rise, looked as though it might sweep around behind another such anchor called the Bawden Ice Rise. Had that happened, it could have prompted a significant speed-up in the shelf’s seaward movement once the berg came off.

As it is, scientists are not now expecting a big change in the speed of the ice.

That is, the theory of a link with accelerating global warming is no longer held, due to lack of evidence. But the Guardian sees things differently.

Unlike thin layers of sea ice, ice shelves are floating masses of ice, hundreds of metres thick, which are attached to huge, grounded ice sheets. These ice shelves act like buttresses, holding back and slowing down the movement into the sea of the glaciers that feed them.

“There is enough ice in Antarctica that if it all melted, or even just flowed into the ocean, sea levels [would] rise by 60 metres,” said Martin Siegert, professor of geosciences at Imperial College London and co-director of the Grantham Institute for Climate Change & Environment. 

Despite the lack of evidence for the hypothesis about accelerating ice loss due to glaciers slipping into the sea, the Guardian still quotes the unsupported hypothesis. Then the article has a quote from someone who seems to extend the hypothesis to the entire continent. Inspection of their useful map of the location of Larsen C might have been helpful.

Larsen C is located mid-way up the Antarctic Peninsula, which comprises around 2% of the area of Antarctica. The Peninsula has seen some rapid warming, quite unlike East Antarctica, where very little warming has been detected. That is, the Antarctic Peninsula is climatically different from the vast majority of the continent, where nearly all of the ice mass is located.

The article then goes on to contradict the implied link with climate change, so the quote is out of context.

Andrew Shepherd, professor of Earth Observation at the University of Leeds, agreed. “Everyone loves a good iceberg, and this one is a corker,” he said. “But despite keeping us waiting for so long, I’m pretty sure that Antarctica won’t be shedding a tear when it’s gone because the continent loses plenty of its ice this way each year, and so it’s really just business as usual!”

However, the Guardian then slips in another out of context quote at the end of the article.

The news of the giant iceberg comes after US president Donald Trump announced that the US will be withdrawing from the 2015 Paris climate accord – an agreement signed by more than 190 countries to tackle global warming. 

Another quote from the BBC article helps give more perspective.

How does it compare with past bergs?

The new Larsen berg is probably in the top 10 biggest ever recorded.

The largest observed in the satellite era was an object called B-15. It came away from the Ross Ice Shelf in 2000 and measured some 11,000 sq km. Six years later, fragments of this super-berg still persisted and passed by New Zealand.

In 1956, it was reported that a US Navy icebreaker had encountered an object of roughly 32,000 sq km. That is bigger than Belgium. Unfortunately, there were no satellites at the time to follow up and verify the observation.

It has been known also for the Larsen C Ice Shelf itself to spawn bigger bergs. An object measuring some 9,000 sq km came away in 1986. Many of Larsen’s progeny can get wound up in a gyre in the Weddell sea or can be despatched north on currents into the Southern Ocean, and even into the South Atlantic.

A good number of bergs from this sector can end up being caught on the shallow continental shelf around the British overseas territory of South Georgia where they gradually wither away.

Bigger events have happened in the past. It is only due to recent technologies that we are able to measure the break-up of ice shelves, or even to observe icebergs the size of small countries.

Note that the Guardian graphic is sourced from Swansea University. Bloomberg has a quote that puts the record straight.

“Although this is a natural event, and we’re not aware of any link to human-induced climate change,” said Martin O’Leary, a glaciologist at Swansea University, in a statement.

Kevin Marshall

The Closest yet to my perspective on Climate Change

Michael S. Bernstam of the Hoover Institution has produced a short post Inconvenient Math (hattip The GWPF). The opening paragraphs are:-

Climate change faces a neglected actuarial problem. Too many conditions must be met to warrant a policy action on climate change. The following four stipulations must each be highly probable:

1. Global warming will accumulate at 0.12 degrees Celsius or higher per decade.

2. It is anthropogenic, due largely to carbon dioxide emissions.

3. The net effect is harmful to human well-being in the long run.

4. Preventive measures are efficient, that is, feasible at costs not exceeding the benefits.

But even if the probability of each of these stipulations is as high as 85 percent, their compound probability is as low as 50 percent. This makes a decision to act or not to act on climate change equivalent to flipping a coin.
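Bernstam's compound-probability arithmetic is easy to verify, under his stated assumption that the four conditions are independent:

```python
# Four independent conditions, each holding with probability 0.85,
# as in Bernstam's illustration.
p_each = 0.85
compound = p_each ** 4
print(f"{compound:.3f}")  # 0.522 - roughly a coin flip
```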

Bernstam later states

In the case of climate change, the conditions are four. They are not random, nor are they arbitrary. To see this, one can run a thought experiment and drop or ignore any of the above foursome. At once, the entire call for action on climate change becomes pointless. If global warming is not ongoing, there is no need to stop it. If it is not anthropogenic, there is no need to curb carbon dioxide emissions. If it is not harmful, there is no need to worry. If preventive measures are inefficient, they would not help and there is no use applying them. It follows that all four conditions are necessary. If just one of them does not hold, action is unnecessary or useless.

That is, for action on climate change to be justified (in terms of having a reasonable expectation that by acting to combat climate change a better future will be created than by not acting) there must be human-caused warming of sufficient magnitude to produce harmful consequences, AND measures that cost less than the expected future costs that they offset.

These sentiments are a simplified version of a series of posts I made in October 2013, where I very crudely derived two cost curves (the costs of climate change and of climate mitigation). This aimed to replicate a takeaway quote from the Stern Review.

Using the results from formal economic models, the Review estimates that if we don’t act, the overall costs and risks of climate change will be equivalent to losing at least 5% of global GDP each year, now and forever. If a wider range of risks and impacts is taken into account, the estimates of damage could rise to 20% of GDP or more. In contrast, the costs of action – reducing greenhouse gas emissions to avoid the worst impacts of climate change – can be limited to around 1% of global GDP each year.

I looked at the idea of multiplying the various probabilities together, at least for the costs of climate change. But instead of a single boundary, it is a continuous function over an infinite number of possible scenarios. In general I believe that the more extreme the costs of warming, the less likely they are to happen. The reason is that the non-visible part of the cost curve can only be objectively derived from the revealed warming of the recent past. Separating the costs of warming-induced climate change from the costs of random extreme weather events is extremely difficult. Even worse, the costs of extreme natural weather events (especially in terms of death toll) have been falling over time, as Indur Goklany has documented. The fall-back for global-warming theory is to use the late Milton Friedman’s Methodology of Positive Economics, that is, to evaluate a theory’s credibility on its predictive ability. If in the short run climate scientists (or anyone who believes in climate alarmism, like Al Gore) are able to make predictions about the signals of impending climate apocalypse, then this should lend some credibility to claims of substantially worse to come. The problem is that there are a huge number of failed predictions of climate worsening, but not a single one that has come true. This would signify that the true risk of climate change (as opposed to the risk perceived by the climate community) is approximately zero. The divergence of belief from the evidence likely arises from the collective navel-gazing of post-normal science.

The policy aspect that Bernstam fails to explore is the redistributional aspect of policy. The theory is that global warming is caused by global greenhouse gas emissions. Therefore climate mitigation must comprise reducing those global emissions. However, as COP21 in Paris showed, most of the world’s population live in countries where no GHG emissions reduction policies are even proposed. But actually reducing emissions means increasing energy costs and hampering businesses with onerous regulations. Policy countries are put at a comparative disadvantage to non-policy countries, as I tried to show here. The implication is that if developed countries strongly pursue high-cost mitigation policies, the marginal cost of non-policy emerging economies switching to emissions reduction policies increases. Thus, whilst Donald Trump’s famous tweet that global warming is a Chinese hoax to make US manufacturing non-competitive is false, the impact of climate mitigation policies as currently pursued is the same as if it were true.

There is also a paradox with the costs of climate change. The costs are largely related to the unexpected nature of the costly events. For instance, ceteris paribus, a category 1 hurricane could be more costly in a non-hurricane area than a stronger hurricane in, say, Florida. The reason is that in the non-hurricane area buildings will not be as resistant to storms, nor will there be early-warning procedures in place as in Florida. The paradox is that the more successful climate scientists are in forecasting the risks of climate change, the more people can adapt to it, reducing the costs. The current focus on climate consensus, rather than on increasing competency and developing real expertise in the field, is actually harmful to future generations if climate change is actually a serious emerging problem. But the challenge for the climate alarmists is that developing real expertise may reveal that their beliefs about the world are false.

Finally, Bernstam fails to acknowledge an immutable law of public policy. Large, complex public policy projects with vague aims, poorly defined plans and a lack of measurable costs tend to overshoot on costs and under-deliver on benefits. Climate mitigation is an extreme example of complexity, lack of clear objectives and lack of objective measurement of costs per unit of emissions saved.

Kevin Marshall

Joe Romm eco-fanaticism shown in Sea-Level Rise claims

The previous post was quite long and involved. But to see why Joe Romm is so out of order in criticizing President Trump’s withdrawal from the Paris Climate Agreement, one only has to examine the sub-heading of his rant Trump falsely claims Paris deal has a minimal impact on warming:-

It may be time to sell your coastal property.

This follows with a graphic of Florida.

This implies that people in Southern Florida should take into account a 6 metre (236 inch) rise in sea levels as a result of President Trump’s decision. Does this implied claim stack up? As in the previous post, let us take a look at Climate Interactive’s data.

Without policy, Climate Interactive forecast that US emissions will be 14.44 GtCO2e in 2100, just over 10% of global GHG emissions, and up from 6.8 GtCO2e in 2010. At most, even on CI’s flawed reasoning, global emissions will be just 7% lower in 2100 with US policy. In the real world, the expensive, job-destroying policy of the US would make global emissions around 1% lower, even under the implausible assumption that the country were to extend the policy through to the end of the century. That would mean a tiny fraction of one degree less warming, even making the further assumption that a doubling of CO2 levels causes 3°C of warming (an assumption contradicted by recent evidence). Now it could be that every other country will follow suit and abandon all climate mitigation policies. This would be an unlikely scenario, given that I have not sensed a great enthusiasm among other countries to follow the lead of the current Leader of the Free World. But even if that did happen, the previous post showed that current policies do not amount to much difference in emissions. Yet let us engage in a flight of fancy and assume for the moment that President Trump abandoning the Paris Climate Agreement will make (a) the difference between 1.5°C of warming, with negligible sea-level rise, and 4.2°C of warming with the full impact of sea-level rise being felt, or (b) 5% of that difference. What difference will this make to sea-level rise?
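As a rough check on the "just over 10%" claim (the variable names are mine; the 137.58 GtCO2e reference-scenario total is the Climate Interactive figure quoted in the next post):

```python
# Climate Interactive 2100 emissions estimates (GtCO2e)
us_2100_no_policy = 14.44
global_2100_reference = 137.58

share = us_2100_no_policy / global_2100_reference
print(f"US share of global emissions in 2100: {share:.1%}")  # ~10.5%
```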

The Miami-Dade Climate Change website has a report from The Sea Level Rise Task Force that I examined last November. Figure 1 of that report gives projections of sea-level rise assuming no global climate policy.

Taking the most extreme NOAA projection, it will be around the end of next century before sea levels rise by 6 metres. Under the IPCC AR5 median estimates – and this is meant to be the Climate Bible for policy-makers – it would be hundreds of years before that sea-level rise would be achieved. Let us assume that the time horizon of any adult thinking of buying a property is through to 2060, 42 years from now. The NOAA projection is 30 inches (0.76 metres) for the full difference in sea-level rise, or 1.5 inches (0.04 metres) for the slightly more realistic estimate. Using the mainstream IPCC AR5 median estimate, sea-level rise is 11 inches (0.28 metres) for the full difference, or 0.6 inches (0.01 metres) for the slightly more realistic estimate. The real-world evidence suggests that even these tiny projected sea-level rises are exaggerated. Tide gauges around Florida have failed to show an acceleration in the rate of sea-level rise. For example, this from NOAA for Key West.

2.37mm/year is 9 inches a century. Even this might be an exaggeration, as in Miami itself, where the recorded increase is 2.45mm/year, the land is estimated to be sinking at 0.53mm/year.
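A quick sketch of the arithmetic behind these figures. The tide-gauge numbers are those quoted above; the 30-inch and 11-inch values are the 2060 projections from the previous paragraph, with the "slightly more realistic" 5% scenario applied.

```python
MM_PER_INCH = 25.4

# Key West tide-gauge trend: mm/year converted to inches/century
key_west = 2.37
print(f"Key West: {key_west * 100 / MM_PER_INCH:.1f} inches/century")  # ~9.3

# Miami: recorded rise minus estimated land subsidence
miami_recorded, miami_subsidence = 2.45, 0.53
print(f"Miami sea-level component: {miami_recorded - miami_subsidence:.2f} mm/year")  # 1.92

# 5% of the full projected rise to 2060
for label, full_rise in (("NOAA extreme", 30), ("IPCC AR5 median", 11)):
    print(f"{label}: 5% of {full_rise} in = {0.05 * full_rise:.1f} in")  # 1.5 and 0.6
```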

Concluding Comments

If people base their beliefs on real-world evidence, President Trump pulling out of the Paris Climate Agreement will make somewhere between zero and an imperceptible difference to sea-level rise. If they base their assumptions on mainstream climate models, the difference is still imperceptible. But those with the biggest influence on policy are more swayed by crazy alarmists like Joe Romm. The real worry is that many policy-makers at State level will be encouraged to waste even more money on unnecessary flood defenses, and could effectively render low-lying properties near-worthless through planning blight when there is no real risk.

Kevin Marshall


Joe Romm inadvertently exposes why Paris Climate Agreement aims are unachievable


Joe Romm promotes the myth that the Paris Climate Agreement will make a huge difference to future greenhouse gas emissions. Below I show how the conclusion of think tank Climate Interactive that it makes a large difference rests on emissions forecasts of implausibly large emissions growth in policy countries and low emissions growth in the non-policy developing countries.


In the previous post I looked at how blogger Joe Romm falsely rebutted a claim that President Donald Trump had made that the Paris climate deal would reduce future warming in 2100 by a mere 0.2°C. Romm was wrong on two fronts. First, he did not check the data behind his assertions; second, in comparing two papers by the same organisation he did not actually read the explanation in the later paper of how it differed from the first. In this post I look at how he has swallowed whole the fiction of bogus forecasts, whereby the mere act of world leaders signing a bit of paper leads to huge changes in forecast emissions.

In his post  Trump falsely claims Paris deal has a minimal impact on warming, Romm states

In a speech from the White House Rose Garden filled with thorny lies and misleading statements, one pricks the most: Trump claimed that the Paris climate deal would only reduce future warming in 2100 by a mere 0.2°C. White House talking points further assert that “according to researchers at MIT, if all member nations met their obligations, the impact on the climate would be negligible… less than .2 degrees Celsius in 2100.”

The Director of MIT’s System Dynamics Group, John Sterman, and his partner at Climate Interactive, Andrew Jones, quickly emailed ThinkProgress to explain, “We are not these researchers and this is not our finding.”

They point out that “our business as usual, reference scenario leads to expected warming by 2100 of 4.2°C. Full implementation of current Paris pledges plus all announced mid-century strategies would reduce expected warming by 2100 to 3.3°C, a difference of 0.9°C [1.6°F].”

The reference scenario is RCP8.5, used in the IPCC AR5 report published in 2013 and 2014. This is essentially a baseline non-policy forecast against which the impact of climate mitigation policies can be judged. The actual RCP website produces emissions estimates by type of greenhouse gas, of which around three-quarters is CO2. The IPCC and Climate Interactive add these different gases together into an estimate of global emissions in 2100. Climate Interactive's current estimate, as of April 2017, is 137.58 GtCO2e for the reference scenario, with the National Plans producing 85.66 GtCO2e. The National Plans would allegedly make global emissions 37.7% lower than they would have been without them, assuming they are extended beyond 2030. Climate Interactive have summarized this in a graph.
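The 37.7% figure can be verified from the two Climate Interactive values quoted above; a minimal arithmetic check in Python:

```python
# Check the claimed 37.7% reduction from Climate Interactive's figures.
# Both values are global GHG emissions in 2100, in GtCO2e.
reference = 137.58       # RCP8.5 reference (no-policy) scenario
national_plans = 85.66   # full implementation of the National Plans

reduction_pct = (reference - national_plans) / reference * 100
print(f"Implied reduction in 2100 emissions: {reduction_pct:.1f}%")  # 37.7%
```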

To anyone who has actually read the submissions, this does not make sense. The submissions made prior to the December 2015 COP21 in Paris were mostly political exercises, with very little of real substance from all but a very few countries, such as the United Kingdom. Why it does not make sense becomes clear from the earlier data that I extracted from Climate Interactive's C-ROADS Climate Simulator version v4.026v.071 around November 2015. This put the RCP8.5 global GHG emissions estimate in 2100 at the equivalent of 139.3 GtCO2e. But policy is decided and implemented at country level. To determine the impact of policy proposals there must be some sort of breakdown of emissions. C-ROADS does not provide a breakdown by all countries, but it does divide the world into up to 15 countries and regions. One derived breakdown is into 7 countries or regions: the USA, Russia, China and India, along with the country groups of EU27, Other Developed Countries and Other Developing Countries. Also available are population and GDP historical data and forecasts. Using this RCP8.5 data and the built-in population forecasts, I derived the following GHG emissions per capita for the historical period 1970 to 2012 and the forecast period 2013 to 2100.
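The derivation itself is simple: each region's per capita figure is its total GHG emissions for a year divided by its population in that year. A minimal sketch, using hypothetical illustrative values rather than the actual C-ROADS data:

```python
# Per capita GHG emissions: total emissions divided by population.
# The example values below are hypothetical, for illustration only;
# they are not taken from C-ROADS.
def ghg_per_capita(total_mtco2e: float, population_millions: float) -> float:
    """Return tonnes of CO2e per person per year."""
    return total_mtco2e / population_millions

# e.g. a region emitting 6,000 MtCO2e with a population of 320 million:
print(ghg_per_capita(6000, 320))  # 18.75 tCO2e per person
```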

As when I looked at Climate Interactive's per capita CO2 emissions from fossil fuels estimates at the end of 2015, these forecasts do not make much sense. Given that these emissions are the vast majority of total GHG emissions, it is not surprising that the same picture emerges.

In the USA and the EU I can think of no apparent reason for the forecast per capita emissions to rise when they have been falling since 1973 and 1980 respectively. It would require energy prices to keep falling, and all sectors to be needlessly wasteful. The same goes for the Other Developed Countries, which, along with Canada and Australia, include the lesser developed countries of Turkey and Mexico. Indeed, why would these countries go from per capita emissions similar to the EU27 now to those of the USA in 2100?

In Russia, emissions have risen since the economy bottomed out in the late 1990s following the collapse of communism. It might end up with higher emissions than the USA in 1973 due to the much harsher and more extreme climate. But technology has vastly improved in the last half century, and it should be the default assumption that it will continue to improve through the century. It looks like someone, or a number of people, have failed to reconcile the country estimate with the forecast decline in population from 143 million in 2010 to 117 million in 2100. But more than this, there is something seriously wrong with emissions estimates that imply the Russian people becoming ever more inefficient and wasteful in their energy use.

In China there are similar issues. Emissions have increased massively in the last few decades on the back of even more phenomenal economic growth, which surpasses the growth of any large economy in history. But emissions per capita will likely peak for economic reasons in the next couple of decades, and probably at a much lower level than the USA in 1973. And like Russia, the population is forecast to be much lower than currently. From a population of 1340 million in 2010, Climate Interactive forecasts the population to peak at 1420 million in 2030 (from 2020 to 2030 growth slows to 2 million a year), then fall to 1000 million in 2100. From 2080 (forecast population 1120 million) to 2100 the population is forecast to decline by 6 million a year.
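The internal consistency of that rate of decline can be checked from the figures quoted above alone:

```python
# Check the implied rate of decline in Climate Interactive's China
# population forecast (figures in millions, as quoted above).
pop_2080, pop_2100 = 1120, 1000
decline_per_year = (pop_2080 - pop_2100) / (2100 - 2080)
print(decline_per_year)  # 6.0 million a year
```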

The emissions per capita forecasts for India I would suggest are far too low. When the forecasts were made, India's high levels of economic growth were predicted to collapse post-2012. When I wrote the previous post on 30th December 2015, to meet the growth forecast for 2010-2015, India's GDP would have needed to drop by 20% in the next 24 hours. That did not happen, and in the 18 months since, actual growth has further widened the gap with the forecast. Similarly, the forecast growth in GHG emissions is far too low. The impact of 1.25 billion people today (and 1.66 billion in 2100) is made largely irrelevant, nicely side-lining a country that has declared economic growth a priority.

As with the emissions forecast for India, the emissions forecast for the Other Developing Countries is far too pessimistic, being based again on growth forecasts that are too pessimistic. This mixed group of countries includes the 50+ African nations, plus nearly all of South America. Other countries in the group include Pakistan, Bangladesh, Myanmar, Thailand, Indonesia, Vietnam, Haiti, Trinidad, Iraq, Iran and Kiribati. There are at least a score more I have omitted; in total the group makes up around 42% of the current global population and 62% of the forecast population in 2100. That is 3 billion people today against 7 billion in 2100. A graph summarizing Climate Interactive's population figures is below.
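Those shares are consistent with round-number global totals. Note that the world totals used below (roughly 7.2 billion today and 11.2 billion forecast for 2100) are my own round-number assumptions for the check, not figures quoted from Climate Interactive:

```python
# Rough consistency check on the population shares quoted above.
# World totals are round-number assumptions, not Climate Interactive data.
group_now, world_now = 3.0, 7.2        # billions, today
group_2100, world_2100 = 7.0, 11.2     # billions, forecast for 2100

print(f"{group_now / world_now * 100:.1f}%")      # 41.7%, close to the 42% quoted
print(f"{group_2100 / world_2100 * 100:.1f}%")    # 62.5%, close to the 62% quoted
```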

This can be compared with total GHG emissions.

For the USA, the EU27, the other Developed countries and China, I have made more reasonable emissions per capita estimates for 2100.

These more reasonable estimates (assuming there is no technological breakthrough that makes zero-carbon energy much cheaper than any carbon technology) produce a reduction in projected emissions of the same order of magnitude as the supposed reduction resulting from implementation of the National Plans. However, global emissions will not be at this level, as the non-policy developing nations are likely to have much higher emissions. Adjusting for this gives my rough estimate for global emissions in 2100.

The overall emissions forecast is not very dissimilar to that of RCP8.5. Only this time the emissions growth has shifted dramatically from the policy countries to the non-policy countries. This is consistent with the data from 1990 to 2012, where I found that the net emissions growth was accounted for by the increase in emissions from developing countries that were not signatories to reducing emissions under the 1992 Rio Declaration. As a final chart I have put the revised emissions estimates for India and the Other Developing Countries to scale alongside Climate Interactive's Scoreboard graphic at the top of the page.

This clearly shows that the emissions pathway consistent with constraining warming to 2°C will only be attained if the developing world collectively starts reducing its emissions within a very few years from now. In reality, the priority of many countries is continued economic growth, which will see emissions rise for decades.

Concluding Comments

This is a long post, covering a lot of ground. In summary, though, it shows that environmental activist Joe Romm has failed to check the claims he is promoting. An examination of the Climate Interactive (CI) data underlying the claim that current policies will reduce global temperature rise by 0.9°C through reducing global GHG emissions shows it does not stand up to scrutiny. That 0.9°C claim is based on global emissions being around 35-40% lower than they would have been without policy. Breaking the CI data down into 7 countries and regions reveals that

  • the emissions per capita forecasts for China and Russia show implausibly high levels of emissions growth, when in reality their emissions are likely to peak within a couple of decades.
  • the emissions per capita forecasts for USA and EU27 show emissions increasing after being static or falling for a number of decades.
  • the emissions per capita forecasts for India and the Other Developing Countries show emissions increasing at implausibly lower rates than in recent decades.

The consequence is that the mere act of signing an agreement appears to make huge differences to projected future emissions. In reality it is folks playing around with numbers and not achieving anything at all, except higher energy prices and job-destroying regulations. However, it does save the believers in the climate cult from having to recognize the real world. Given the massed hordes of academics and political activists involved, that is a very big deal indeed.

Kevin Marshall