Valve Turner Michael Foster’s Climate Necessity Defense

The Climate Necessity Defence for criminal acts that impede the lawful business of the fossil fuel industry cannot be justified. The acts will never of themselves have a significant impact in constraining global greenhouse gas emissions. In any event, there will always be more than sufficient proven fossil fuel reserves in countries beyond the reach of any activist action, or even Government-backed action, to prevent aggregate cumulative fossil fuel emissions being constrained to anywhere near the levels commensurate with limiting warming to 2°C. What such acts do achieve is to impose immediate harms on the actual victims of the crimes, and harms on the countries in which the crimes are committed. Some of the harms arise from benefitting fossil-fuel-producing countries that have no emissions reduction policies. The conviction last week of climate activist Michael Foster is a clear case study.

 

The New York Times reports (hat tip GWPF) on the conviction in a North Dakota court of Seattle resident Michael Foster.

Foster took part in an effort on Oct. 11, 2016, to draw attention to climate change by turning off valves on five pipelines that bring Canadian oil south. Foster targeted the Keystone Pipeline in North Dakota. Other activists targeted pipelines in Minnesota, Montana and Washington state.

A jury in North Dakota’s Pembina County on Friday convicted Foster after a weeklong trial of criminal mischief, criminal trespass and conspiracy. He faces up to 21 years in prison when he’s sentenced Jan. 18. The man who filmed his protest action, Samuel Jessup of Winooski, Vermont, was convicted of conspiracy and faces up to 11 years.

What I found interesting was the next sentence.

Foster had hoped to use a legal tactic known as the climate necessity defense — justifying a crime by arguing that it prevented a greater harm from happening.

The Climate Disobedience Center in its article for activists on the climate necessity defense says

The basic idea behind the defense — also known as a “choice of evils,” “competing harms,” or “justification” defense — is that the impacts of climate change are so serious that breaking the law is necessary to avert them.

Foster had his action filmed, shown from 2.07 here.

Keystone Pipeline. North Dakota. I’m Michael Foster. In order to preserve life as we know it and civilization, give us a fair chance and our kids a fair chance, I’m taking this action as a citizen. I am duty bound.

This was a significant action. The video quotes Reuters news agency.

Was this action “preserving life as we know it“? In shutting down the pipeline (along with four other pipelines in the coordinated action), 590,000 barrels of oil were prevented from being transported from Canada to the USA that morning. The flow was merely delayed. If the pipelines are working at full capacity, the oil would maybe have been transported by rail instead. Or more would have been produced in the USA. Or more imported from the Middle East. But suppose that those 590,000 barrels (83,000 tonnes) had been left in the ground, never to be extracted, rather than production being delayed. What marginal difference would it make to climate change?

From the BP Statistical Review of World Energy 2016 (full report), I find that global oil production in 2015 was around 92 million barrels per day, or 4362 million tonnes in the full year. Global production would have been 0.6% lower on Oct. 11, 2016 or 0.002% lower in the full year. Yet there is plenty of the stuff in the ground. Proven global reserves are around 50.7 years of global production. Leaving 590,000 barrels in the ground will reduce proven reserves by around 0.000038%. That is less than one part in a million of proven oil reserves. Yet in the last few years, proven reserves have been increasing, as extraction techniques keep improving. This despite production climbing as well. 2015 production was 21% higher than in 2000 and 56% higher than in 1985. Proven reserves in 2015 were 30% higher than in 2000 and 112% higher than in 1985.
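As a rough cross-check on those percentages, the arithmetic can be reproduced from the BP figures quoted above (the 83,000-tonne equivalent of 590,000 barrels is the post’s own conversion); a minimal sketch in Python:

```python
# Rough check of the oil figures quoted above (BP Statistical Review 2016).
barrels_stopped = 590_000          # barrels delayed on Oct. 11, 2016
tonnes_stopped = 83_000            # the same quantity in tonnes (post's conversion)

daily_production_bbl = 92_000_000  # global production, barrels per day (2015)
annual_production_mt = 4_362       # global production, million tonnes (2015)
reserve_years = 50.7               # proven reserves as years of current production

print(f"Share of one day's production: {barrels_stopped / daily_production_bbl:.2%}")
# ~0.64%, the "0.6% lower on Oct. 11, 2016"

print(f"Share of a year's production:  {tonnes_stopped / (annual_production_mt * 1e6):.4%}")
# ~0.0019%, the "0.002% lower in the full year"

proven_reserves_tonnes = reserve_years * annual_production_mt * 1e6
print(f"Share of proven oil reserves:  {tonnes_stopped / proven_reserves_tonnes:.6%}")
# ~0.000038%, less than one part in a million
```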

I have divided up those 50.7 years of reserves by major areas.

The effect of turning off the oil pipeline is posturing unless it shuts down oil production in Canada and the USA. But that would still leave over 40 years of proven reserves elsewhere. Are Russia and Middle Eastern countries going to shut down their production because of the criminal acts of a few climate activists in the USA?

But oil is not the only major fossil fuel. Production of coal in 2015 was 3830 million tonnes of oil equivalent, 88% of oil production. Proven coal reserves are equivalent to 123 years of current production. Further, if oil prices rise to the levels seen over the last few years, it will become economic to convert more coal to liquids, a process which consumes four to five times the CO2 of burning oil.

Are China, Russia, India, Australia, Ukraine, Indonesia, South Africa and many other countries going to shut down their production because of the criminal acts of a few climate activists in the USA?

The third major fossil fuel is gas. Production in 2015 was 3200 million tonnes of oil equivalent, 73% of oil production. Proven reserves are equivalent to 52.8 years of current production levels.

The reserves are slightly more centralized than for oil or coal. Like with oil, a large part of available reserves are concentrated in Russia and the Middle East.

Leaving 590,000 barrels in the ground would reduce proven reserves of fossil fuels by around one part in ten million.
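The one-in-ten-million figure can be checked the same way, combining the oil, coal and gas reserve lives quoted above on a tonnes-of-oil-equivalent basis; a rough sketch, using the post’s own figures:

```python
# Proven reserves in million tonnes of oil equivalent (Mtoe), from the
# production levels and reserve lives quoted above (BP 2016).
oil_reserves  = 50.7  * 4362    # ~221,000 Mtoe
coal_reserves = 123.0 * 3830    # ~471,000 Mtoe
gas_reserves  = 52.8  * 3200    # ~169,000 Mtoe
total_mtoe = oil_reserves + coal_reserves + gas_reserves

left_in_ground_mtoe = 83_000 / 1e6   # 590,000 barrels expressed in Mtoe
print(f"{left_in_ground_mtoe / total_mtoe:.1e}")
# ~1e-07, i.e. roughly one part in ten million of proven fossil fuel reserves
```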

The 50+ years of proven reserves of oil and gas, and 120+ years of proven reserves of coal, need to be put into a policy context. The IPCC AR5 Synthesis Report gave a very rough guide to how much CO2 (or equivalent greenhouse gases) could be emitted to limit warming to less than 2°C. From 2012 it was 1000 GtCO2e.

With emissions in 2011 at around 50 GtCO2e, that gave 20 years. From next year that will be less than 15 years. The recent paper “Emission budgets and pathways consistent with limiting warming to 1.5C” (hereafter Millar et al. 2017) reevaluated the figures, with 1.5°C not being breached for a further 20 years. Whatever way you look at the figures, most of the proven fossil fuels in the world will have to be left in the ground. That requires the agreement of Saudi Arabia, Russia, Iran, Iraq, Qatar, Kuwait, Turkmenistan, China, India, Venezuela, alongside the USA, Canada, Australia and a large number of other countries.
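The years-remaining arithmetic is simple enough to set out explicitly, using the figures quoted above (a 1000 GtCO2e budget from 2012 and emissions held at roughly 50 GtCO2e a year):

```python
budget_gtco2e = 1000        # remaining budget from 2012 for ~2°C (AR5 Synthesis Report)
annual_emissions = 50       # GtCO2e per year, assumed constant at the 2011 level

print(budget_gtco2e / annual_emissions)              # 20 years from 2012
used_2012_to_2017 = 6 * annual_emissions             # six years already used up
print((budget_gtco2e - used_2012_to_2017) / annual_emissions)  # 14 years left from 2018
```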

Further, there can be no more extractions of fossil fuels from unproven reserves, which will likely exceed the proven reserves.

The efforts of Michael Foster and his mates could incite further criminal acts. But even with massive lawbreaking throughout the United States, it would still be insufficient to significantly dent the production and distribution of fossil fuels in the USA. Even if that happened, there are plenty of other countries who would willingly meet the existing demand. All that the action is likely to do is push up the costs of production and distribution in the USA, harming the US economy and the futures of people involved in the fossil fuel industries and energy-intensive industries.

It is this aspect of failing to make a significant marginal difference through the action – that is, failing to reduce global greenhouse gas emissions – that renders the climate necessity defense void. Even if large numbers of other actions are inspired by Foster and others, it would still be insufficient to get anywhere close to the emissions constraint needed to limit warming to 1.5°C or 2°C. On a larger scale, even if all major Western economies shut down all fossil fuel production and consumption immediately, it would merely delay by a few years the point at which cumulative aggregate emissions from 2012 onwards exceed 1000 GtCO2e.

It gets worse. A particular case must be decided on the damage caused to the victims of the crime. In this case that means the owners of the pipeline, the employees of the business, the customers who do not get their oil, and so on. If there are beneficiaries, it is the billions of people in generations to come. The marginal difference to the victims of the action is tangible and has happened. The marginal difference to the beneficiaries is imperceptible, and even then based on belief in what amounts to nothing more than pseudo-scientific prophecies. But given that a shut-down of production in the USA is likely to be met by increased production elsewhere, even these future dispersed and speculative benefits are unlikely to accrue.

More broadly, if specific people need to have their immediate interests sacrificed for the greater good, surely that is the function of Government, not some wayward activists? In that way the harms could be more equitably distributed. With random acts of criminality, the harms are more likely to be based on the prejudices of the activists.

Summary

The Climate Necessity Defence is an invalid justification for the criminal actions of Michael Foster and others in shutting down the oil pipelines from Canada into the USA. The marginal impact of the action on reducing greenhouse gas emissions, even if it were not made up by increased production elsewhere, is about one part in ten million. But given that most of the global proven fossil fuel reserves are concentrated in a small number of countries – many of whom have no commitment to reduce emissions, let alone leave the source of major revenues in the ground – the opportunity of producing more is likely to be taken up. Further, the harm from the activists’ action is immediate, very definite and concentrated, whilst the benefits of reduced climate change impacts from reduced emissions are speculative and dispersed over tens of billions of people.

Kevin Marshall

Nature tacitly admits the IPCC AR5 was wrong on Global Warming

There has been a lot of comment on a recent paper at Nature Geoscience, “Emission budgets and pathways consistent with limiting warming to 1.5C” (hereafter Millar et al. 2017).

When making a case for public policy I believe that something akin to a process of due diligence should be carried out on the claims. That is, the justifications ought to be scrutinized to validate the claims. With Millar et al. 2017, there are a number of issues with the make-up of the claims that (a) warming of 1.5°C or greater will occur without policy, and (b) that constraining emissions can still keep warming below that level.

The baseline warming

The introduction states
Average temperatures for the 2010s are currently 0.87°C above 1861–80,

A similar quote from UNIPCC AR5 WG1 SPM page 5

The total increase between the average of the 1850–1900 period and the 2003–2012 period is 0.78 [0.72 to 0.85] °C, based on the single longest dataset available.

These figures are all from the HADCRUT4 dataset. There are three factors that account for the difference of 0.09°C. Mostly it is the shorter baseline period. In addition, the last three years have been influenced by a powerful and natural El Niño, and the IPCC used an average of the last 10 years rather than the 2010s to date.

The warming in the pipeline

There are valid reasons for the authors differing from the IPCC’s methodology. They start with the emissions from 1870 (even though emissions estimates go back to 1850). Also, if there is no definite finish date, it is very difficult to calculate the warming impact to date. Consider first the full sentence quoted above.

Average temperatures for the 2010s are currently 0.87°C above 1861–80, which would rise to 0.93°C should they remain at 2015 levels for the remainder of the decade.

This implies that there is some warming still to come through from the impact of the higher greenhouse gas levels. It seems a remarkably low amount, and over a very short time period. Of course, not all the warming since the mid-nineteenth century is from anthropogenic greenhouse gas emissions. The anthropogenic element is just guesstimated. This is shown in AR5 WG1 Ch10 Page 869

More than half of the observed increase in global mean surface temperature (GMST) from 1951 to 2010 is very likely due to the observed anthropogenic increase in greenhouse gas (GHG) concentrations.

It was after 1950 that the largest increase in CO2 levels was experienced. From 1870 to 1950, CO2 levels rose from around 290ppm to 310ppm, or 7%. From 1950 to 2010, CO2 levels rose from around 310ppm to 387ppm, or 25%. Add in other GHGs and the human-caused warming should be 3-4 times greater in the later period than the earlier one, whereas the warming in the later period was just over twice the amount. Therefore, if there is just over a 90% chance (very likely in IPCC-speak) that over 50% of the warming post-1950 was human-caused, a statistical test covering a period more than twice as long would support a lower human-caused share of the warming. Even then, I view the greater than 50% statistic as being deeply flawed, especially as post-2000 the rate of rise in CO2 levels accelerated whilst the rise in average temperatures dramatically slowed. This suggests two things. First, rising GHG emissions could be a minor element in the temperature rise, with natural factors causing some of the warming in the 1976-1998 period and then reversing, causing cooling, in the last few years. Second, there could be a long lagged response of temperatures to rising GHGs (especially CO2), meaning that the amount of warming in the pipeline has increased dramatically. If either idea has any traction, then the implied warming to come of just 0.06°C is a false estimate. This needs to be elaborated.
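As a rough cross-check on the 3-4 times claim, the standard logarithmic approximation for CO2 forcing can be applied to the concentrations quoted above (CO2 only, other GHGs ignored); this is my own sketch, not the percentage comparison used in the paragraph above:

```python
from math import log

# CO2-only forcing is approximately proportional to ln(C / C0).
forcing_1870_1950 = log(310 / 290)   # ~0.067
forcing_1950_2010 = log(387 / 310)   # ~0.222

print(forcing_1950_2010 / forcing_1870_1950)
# ~3.3, i.e. the forcing increase after 1950 was 3-4 times that before 1950
```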

Climate Sensitivity

If a doubling of CO2 leads to 3.00°C of warming (the assumption of the IPCC in their emissions calculations), then a rise in CO2 levels from 290ppm to 398 ppm (1870 to 2014) eventually gives 1.37°C of warming. With other GHGs this figure should be around 1.80°C. Half that warming has actually occurred, and some of that is natural. So there is well over 1.0°C still to emerge. It is too late to talk about constraining warming to 1.5°C as the cause of that warming has already occurred.
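A minimal sketch of that arithmetic, using the standard approximation that eventual warming scales with the logarithm of the concentration ratio (ΔT = ECS × ln(C1/C0)/ln 2); the function below is my own illustration, and the second calculation anticipates the paper’s 0.94°C figure discussed in the next paragraph:

```python
from math import log

def equilibrium_warming(c0_ppm, c1_ppm, ecs):
    """Eventual warming for a CO2 rise from c0 to c1, for a given ECS per doubling."""
    return ecs * log(c1_ppm / c0_ppm) / log(2)

print(equilibrium_warming(290, 398, ecs=3.0))   # ~1.37°C, as stated above

# Inverting the same formula: if only 0.94°C results from the 1870-2014 CO2 rise,
# the implied sensitivity (CO2 only) is about 2°C per doubling.
print(0.94 * log(2) / log(398 / 290))           # ~2.1
```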

The implication from the paper in claiming that 0.94°C will result from human emissions in the period 1870-2014 is to reduce the climate sensitivity estimate to around 2.0°C for a doubling of CO2, if only CO2 is considered, or around 1.5°C for a doubling of CO2, if all GHGs are taken into account. (See below) Compare this to AR5 WG1 section D.2 Quantification of Climate System Responses

The equilibrium climate sensitivity quantifies the response of the climate system to constant radiative forcing on multicentury time scales. It is defined as the change in global mean surface temperature at equilibrium that is caused by a doubling of the atmospheric CO2 concentration. Equilibrium climate sensitivity is likely in the range 1.5°C to 4.5°C (high confidence), extremely unlikely less than 1°C (high confidence), and very unlikely greater than 6°C (medium confidence).

The equilibrium climate sensitivity ECS is at the very bottom of the IPCC’s range, and the equilibrium climate response is reached in 5-6 years instead of on multicentury time scales. This is on top of the implied assumption that there is no net natural warming between 1870 and 2015.

How much GHG emissions?

With respect to policy, as global warming is caused by human greenhouse gas emissions, to prevent further human-caused warming requires reducing, and possibly eliminating global greenhouse emissions. In conjunction with the publication of the AR5 Synthesis report, the IPCC produced a slide show of the policy case laid out in the three vast reports. It was effectively a short summary of a summary of the synthesis report. Approaching the policy climax at slide 30 of 35:-

Apart from the fact that the policy objective in AR5 was to limit warming to 2°C, not 1.5°C, it also mentions the need to constrain GHG emissions, not just CO2 emissions. Then slide 33 gives the simplified policy position for keeping warming to 2°C.

To the end of 2011, an estimated 1900 GtCO2e of GHGs had been emitted, whilst around a further 1000 GtCO2e could be emitted before the 2°C of warming is reached.

This is the highly simplified version. At the other end of the scale, AR5 WG3 Ch6 p431 has a very large table in a very small font to consider a lot of the policy options. It is reproduced below, though the resolution is much poorer than the original.

Note 3 states

For comparison of the cumulative CO2 emissions estimates assessed here with those presented in WGI AR5, an amount of 515 [445 to 585] GtC (1890 [1630 to 2150] GtCO2), was already emitted by 2011 since 1870

The top line is for the 1.5°C of warming – the most ambitious policy aim. Of note:-

  • The CO2 equivalent concentration in 2100 (ppm CO2eq ) is 430-480ppm.
  • Cumulative CO2 emissions (GtCO2) from 2011 to 2100 is 630 to 1180.
  • CO2 concentration in 2100 is 390-435ppm.
  • Peak CO2 equivalent concentration is 465-530ppm. This is higher than the 2100 concentration and, if it were for CO2 alone with ECS = 3, would eventually produce 2.0°C to 2.6°C of warming.
  • The Probability of Exceeding 1.5 °C in 2100 is 49-86%. They had to squeeze really hard to say that 1.5°C was more than 50% likely.

Compare the above to this from the abstract of Millar et al. 2017.

If CO2 emissions are continuously adjusted over time to limit 2100 warming to 1.5°C, with ambitious non-CO2 mitigation, net future cumulative CO2 emissions are unlikely to prove less than 250 GtC and unlikely greater than 540 GtC. Hence, limiting warming to 1.5°C is not yet a geophysical impossibility, but is likely to require delivery on strengthened pledges for 2030 followed by challengingly deep and rapid mitigation.

They use tonnes of carbon as the unit of measure, as against CO2 equivalent. The conversion factor is 3.664, so cumulative CO2 emissions need to be in the 870-1010 GtCO2 range. As this is to the end of 2015, not 2011 as in the IPCC report, it will be different. Subtracting 150 from the IPCC report’s figures would give a range of 480 to 1030. That is, Millar et al. 2017 have reduced the width of the emissions range by around 75%, towards the top end of the IPCC’s range. Given that the IPCC considered a range of 1.5-1.7°C of warming, it seems somewhat odd to then say it relates to the lower end of the warming band, until you take into account that ECS has been reduced. But then why curtail the range of emissions instead of calculating your own? It appears that again the authors are trying to squeeze a result within existing constraints.

However, this does not take into account the much higher levels of peak CO2 equivalent concentrations in table 6.3. Peak CO2 concentrations are around 75-95ppm higher than in 2100. Compare this to the green line in the central graph in Millar et al. 2017.

This is less than 50ppm higher than in 2100. Further, in 2100 Millar et al. 2017 has CO2 levels of around 500ppm, as against a mid-point of 410ppm in AR5. CO2 rising from 290 to 410ppm with ECS = 3.0 produces 1.50°C of warming; CO2 rising from 290 to around 500ppm with ECS = 2.0 produces a similar amount. Further, this does not include the warming impact of other GHGs. To squeeze into the 1.5°C band, the mid-century overshoot in Millar et al. 2017 is much less than in AR5. This might be required in the modeling assumptions due to the very short time assumed in reaching full equilibrium climate response.
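The comparison can be checked with the same simple warming formula used earlier (CO2 only, from a 290 ppm pre-industrial level); a sketch under those assumptions:

```python
from math import log

def warming(c0_ppm, c1_ppm, ecs):
    return ecs * log(c1_ppm / c0_ppm) / log(2)

print(warming(290, 410, ecs=3.0))   # ~1.50°C at the AR5 mid-point concentration
print(warming(290, 500, ecs=2.0))   # ~1.57°C at the Millar et al. 2017 concentration
# Broadly similar warming despite very different 2100 CO2 levels, because the
# lower implied ECS offsets the higher concentration.
```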

Are the authors playing games?

The figures do not appear to stack up. But then they appear to be playing around with figures, indicated by a statement in the explanation of Figure 2

Like other simple climate models, this lacks an explicit physical link between oceanic heat and carbon uptake. It allows a global feedback between temperature and carbon uptake from the atmosphere, but no direct link with net deforestation. It also treats all forcing agents equally, in the sense that a single set of climate response parameters is used in for all forcing components, despite some evidence of component-specific responses. We do not, however, attempt to calibrate the model directly against observations, using it instead to explore the implications of ranges of uncertainty in emissions, and forcing and response derived directly from the IPCC-AR5, which are derived from multiple lines of evidence and, importantly, do not depend directly on the anomalously cool temperatures observed around 2010.

That is:-

  • The model does not consider an “explicit physical link between oceanic heat and carbon uptake.” The IPCC estimated that over 90% of heat accumulation since 1970 was in the oceans. If the oceans were to belch out some of this heat at a random point in the future, the 1.5°C limit would be exceeded.
  • No attempt has been made to “calibrate the model directly against observations”. Therefore there is no attempt to properly reconcile beliefs to the real world.
  • The “multiple lines of evidence” in IPCC-AR5 do not include a glaring anomaly that potentially falsifies the theory and therefore any “need” for policy at all. That is the divergence of actual temperature trends from theory in this century.

Conclusions

The authors of Millar et al. 2017 have pushed out the boundaries to continue to support climate mitigation policies. To justify constraining emissions sufficiently to stop 1.5°C of warming, the authors would appear to have

  • Assumed that all the warming since 1870 is caused by anthropogenic GHG emissions when there is not even a valid statistical test that confirms even half the warming was from this source.
  • Largely ignored any hidden heat or other long-term response to rises in GHGs.
  • Ignored the divergence between model predictions and actual temperature anomalies since around the turn of the century. This has two consequences. First, the evidence appears to strongly contradict the belief that humans are a major source of global warming and, by implication, dangerous climate change. Second, if it does not contradict the theory, it suggests that the amount of warming in the pipeline consequent on human GHG emissions has massively increased. Thus the 1.5°C warming could be breached anyway.
  • Made ECS as low as possible in the long-standing 1.5°C to 4.5°C range. Even assuming ECS is at the mid-point of the range for policy (as the IPCC has done in all its reports) means that warming will breach the 1.5°C level without any further emissions. 

The authors live in their closed academic world of models and shared beliefs. Yet the paper is being used for the continued support of mitigation policy that is both failing to get anywhere close to achieving the objectives and is massively net harmful in any countries that apply it, whether financially or politically.

Kevin Marshall

Commentary at Cliscep, Jo Nova, Daily Caller, Independent, The GWPF

Update 25/09/17 to improve formatting.

CO2 Emissions from Energy production forecast to be rising beyond 2040 despite COP21 Paris Agreement

Last week the US Energy Information Administration (EIA) published their INTERNATIONAL ENERGY OUTLOOK 2016. The Daily Caller (and the GWPF) highlighted the EIA’s summary of energy production. This shows that despite the predicted strong growth in nuclear power and implausibly high growth in renewables, usage of fossil fuels is also predicted to rise, as shown in their headline graphic below.

For policy purposes, the important aspect is the translation into CO2 emissions. In the final chapter, 9. Energy-related CO2 Emissions, figure 9.3 shows the equivalent CO2 emissions in billions of tonnes of CO2. I have reproduced the graphic as a stacked bar chart.

Data reproduced as a stacked bar chart.

In 2010 these CO2 emissions are just under two-thirds of total global greenhouse gas emissions. The question is how this fits into the policy requirements to avoid 2°C of warming from the IPCC’s Fifth Assessment Report. The International Energy Agency summarized the requirements very succinctly in the World Energy Outlook 2015 Special Report, page 18

The long lifetime of greenhouse gases means that it is the cumulative build-up in the atmosphere that matters most. In its latest report, the Intergovernmental Panel on Climate Change (IPCC) estimated that to preserve a 50% chance of limiting global warming to 2 °C, the world can support a maximum carbon dioxide (CO2) emissions “budget” of 3 000 gigatonnes (Gt) (the mid-point in a range of 2 900 Gt to 3 200 Gt) (IPCC, 2014), of which an estimated 1 970 Gt had already been emitted before 2014. Accounting for CO2 emissions from industrial processes and land use, land-use change and forestry over the rest of the 21st century leaves the energy sector with a carbon budget of 980 Gt (the midpoint in a range of 880 Gt to 1 180 Gt) from the start of 2014 onwards.

From the forecast above, cumulative CO2 emissions from 2014 will reach 980 Gt in 2038. Yet in 2040, there is no sign of peak emissions.

Further corroboration comes from the UNFCCC. In preparation for COP21, from all the country policy proposals they produced a snappily titled Synthesis report on the aggregate effect of intended nationally determined contributions. The UNFCCC have updated the graphics since. Figure 2 of 27 Apr 2016 shows the total GHG emissions, which in 2010 were about 17 Gt higher than the CO2 emissions from energy production.

The graphic clearly shows that the INDCs – many with very vague and non-verifiable targets – will make very little difference to the non-policy emissions path. Yet even this small impact is contingent on those submissions being implemented in full, which is unlikely in many countries. The 2°C target requires global emissions to peak in 2016 and then head downwards. There are no additional policies even being tabled to achieve this, except possibly by some noisy, but inconsequential, activist groups. Returning to the EIA’s report, figure 9.4 splits the CO2 emissions between the OECD and non-OECD countries.

The OECD countries represent nearly all the countries who propose to reduce their CO2 emissions from the 1990 baseline, but their emissions are forecast by the EIA still to be 19% higher in 2040. However, the increase is small compared to the non-OECD countries – who mostly are either proposing to constrain emissions growth or have no emissions policy proposals – with emissions forecast to treble in fifty years. As a result the global forecast is for CO2 emissions to double. Even if all the OECD countries completely eliminate CO2 emissions by 2040, global emissions will still be a third higher than in 1990. As the rapid economic growth in the former Third World reduces global income inequalities, it is also reducing the inequalities in fossil fuel consumption for energy production. This will continue beyond 2040, when the OECD, with a sixth of the world population, will still produce a third of global CO2 emissions.

Unless the major emerging economies peak their emissions in the next few years, then reduce the emissions rapidly thereafter, the emissions target allegedly representing 2°C or less of global warming by 2100 will not be met. But for countries like India, Vietnam, Indonesia, Bangladesh, Nigeria, and Ethiopia to do so, with the consequent impact on economic growth, is morally indefensible.

Kevin Marshall

 

Degenerating Climatology 1: IPCC Statements on Human Caused Warming

This is the first in an occasional series illustrating the degeneration of climatology away from an empirical science. In my view, for climatology to be progressing it needs to be making ever clearer empirical statements that support the Catastrophic Anthropogenic Global Warming (CAGW) hypothesis and moving away from the bland statements that can just as easily support a weaker form of the hypothesis, or random fluctuations. In figure 1 this progression is illustrated by the red arrow, with increasing depth of colour. The example given below is an illustration of the opposite tendency.

Obscuring the slowdown in warming in AR5

Every major temperature data set shows that the warming rate this century has been lower than that towards the end of the twentieth century. This is becoming a severe issue for those who believe that the main driver of warming is increasing atmospheric greenhouse gas levels. It gave the IPCC a severe problem in trying to find evidence for the theory when they published in late 2013.

In the IPCC Fifth Assessment Report Working Group 1 (The Physical Science Basis) Summary for Policy Makers, the headline summary on the atmosphere is:-

Each of the last three decades has been successively warmer at the Earth’s surface than any preceding decade since 1850. In the Northern Hemisphere, 1983–2012 was likely the warmest 30-year period of the last 1400 years (medium confidence).

There are three parts to this.

  • The last three decades have been successively warmer according to the major surface temperature data sets. The 1980s were warmer than the 1970s; the 1990s warmer than the 1980s; and the 2000s warmer than the 1990s.
  • The 1980s was warmer than any preceding decade from the 1850s.
  • In the collective opinion of the climate experts there is a greater than 66% chance that 1983–2012 was the warmest 30-year period in the Northern Hemisphere of the last 1400 years.

What the statement does not include is the following.

  1. That global average temperature rises have slowed down in the last decade compared with the 1990s. From 2003 in the HADCRUT4 temperature series warming had stopped.
  2. That global average temperature also rose significantly in the mid-nineteenth and early twentieth centuries.
  3. That global average temperature fell in 4 or 5 of the 13 decades from 1880 to 2010.
  4. That in the last 1400 years there was a warm period about 1000 years ago and a significantly cold period that bottomed out around 1820. That is, a Medieval Warm Period and the Little Ice Age.
  5. That there is strong evidence of a Roman Warm Period about 2000 years ago and a Bronze Age warm period about 3000 years ago.

Points (i) to (iii) can be confirmed from figure 2. Both of the major surface temperature anomaly data sets show warming trends in each of the last three decades, implying successive warming. A similar statement could have been made in 1943 if the data had been available.

In so far as the CAGW hypothesis is broadly defined as a non-trivial human-caused rise in temperatures (the narrower more precise definition being that the temperature change has catastrophic consequences) there is no empirical support found from the actual temperature records or from the longer data reconstructions from proxy data.

The headline statement above is amplified by this statement from the press release of 27/09/2013.

It is extremely likely that human influence has been the dominant cause of the observed warming since the mid-20th century. The evidence for this has grown, thanks to more and better observations, an improved understanding of the climate system response and improved climate models.

This statement excludes other types of temperature change, let alone other causes of temperature change. The cooling in the 1960s is not included. The observed temperature change is only the net impact of all influences, known or unknown. Further, the likelihood is based upon expert opinion. If the experts have always given prominence to human influences on warming (as opposed to natural and random influences) then their opinion will be biased. Over time, if this opinion is not objectively adjusted in the light of evidence that does not conform to the theory, the basis of Bayesian statistics is undermined.

Does the above mean that climatology is degenerating away from a rigorous scientific discipline? I have chosen the latest expert statements, but not compared them with previous statements. A comparable highlighted statement to the human influence statement from the fourth assessment report WG1 SPM (Page 3) is

The understanding of anthropogenic warming and cooling influences on climate has improved since the TAR, leading to very high confidence that the global average net effect of human activities since 1750 has been one of warming, with a radiative forcing of +1.6 [+0.6 to +2.4] W m–2

The differences are

  • The greenhouse gas effect is no longer emphasised. It is now the broader “human influence”.
  • The previous statement was prepared to associate the influence with a much longer period. Probably the collapse of hockey stick studies, with their portrayal of unprecedented warming, has something to do with this.
  • Conversely, the earlier statement is only prepared to say that since 1750 the net effect of human influences has been one of warming. The more recent statement claims that human influence has been the dominant cause of the warming.

This leads to my final point indicating the degeneration of climatology away from science. When comparing the WG1 SPMs for TAR, AR4 and AR5 there are shifting statements. In each report the authors have chosen the best statements to fit their case at that point in time. The result is a lack of continuity that might otherwise demonstrate an increasing correspondence between theory and data.

Kevin Marshall

Global Emissions Reductions Targets for COP21 Paris 2015

There is a huge build-up underway for the COP21 climate conference to be staged in Paris in November. Many countries and NGOs are pushing for an agreement that will constrain warming to just 2°C, but there are no publicly available figures of what this means for all the countries of the world. This is the gap I seek to close with a series of posts. The first post is concerned with getting a perspective on global emissions and the UNIPCC targets.

In what follows, all the actual figures are obtained from three primary sources.

  • Emissions data comes from the Carbon Dioxide Information Analysis Centre or CDIAC.
  • Population data comes from the World Bank, though a few countries are missing. These are mostly from Wikipedia.
  • The Emissions targets can be found in the Presentation for the UNIPCC AR5 Synthesis Report.

All categorizations and forecast estimates are my own.

The 1990 Emissions Position

A starting point for emissions reductions is to stabilize emissions at 1990 levels, around the time that climate mitigation was first proposed. To illustrate the composition of emissions I have divided the countries of the world into the major groups meaningful at that time – roughly the First World developed nations, the Second World developed communist countries and the Third World developing economies. The First World is represented by the OECD. I have only included members in 1990, with the USA split off. The Second World is the ex-Warsaw Pact countries, with the countries of the former Yugoslavia included as well. The rest of the world is divided into five groups. I have charted the emissions per capita against the populations of these groups to come up with the following graph.

In rough terms, one quarter of the global population accounted for two-thirds of global emissions. A major reduction on total emissions could therefore be achieved by these rich countries taking on the burden of emissions reductions, and the other countries not increasing their emissions, or keeping growth to a minimum.

The 2020 emissions forecast

I have created a forecast of both emissions and population for 2020 using the data up to 2013. Mostly these assume the same change in the next seven years as in the last seven. For emissions in the rapidly-growing countries this might be an understatement. For China and India I have done separate forecasts based on their emissions commitments. This gives the following graph.

The picture has changed dramatically. Population has increased by 2.4 billion or 45% and emissions by over 80%. Global average emissions per capita have increased from 4.1 to 5.2 tCO2. Due to the population increase, returning global emissions to 1990 levels would mean reducing average emissions per capita to 2.85 tCO2.
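A minimal check of the per-capita figures, using only the ratios quoted above (the small differences from the charted 5.2 and 2.85 are rounding in the inputs):

```python
per_capita_1990 = 4.1      # tCO2 per head in 1990, as charted above
population_growth = 1.45   # population up 45% by 2020
emissions_growth = 1.80    # emissions up (over) 80% by 2020

print(per_capita_1990 * emissions_growth / population_growth)  # ~5.1 tCO2 per head in 2020
print(per_capita_1990 / population_growth)                     # ~2.8 tCO2 per head needed to
                                                               # hold emissions at 1990 levels
```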

The change in the composition of emissions is even more dramatic. The former First and Second World countries will see a slight fall in emissions from 14.9 to 14.0 billion tonnes of CO2, and their global share will have reduced from 68% to 36%. Although their total population will have increased on 1990, the slower growth than elsewhere means their share of global population has shrunk to just 19%. China will have a similar population and, with forecast emissions of 13.1 billion tonnes of CO2, 33% of the global total.

The picture is not yet complete. On slide 30 of their Synthesis Report presentation the UNIPCC state

Measures exist to achieve the substantial emissions reductions required to limit likely warming to 2°C (40-70% emissions reduction in GHGs globally by 2050 and near zero GHGs in 2100)

The baseline is 2011, when global emissions were 29.74 billion tonnes of CO2. In 2050 global population will be nearly nine billion. This gives an upper limit of 2.2 tCO2 per capita and a lower limit of 1.1 tCO2 per capita.

To put this in another perspective, consider the proportion of people living in countries whose emissions per capita are greater than 2.2 tCO2, and which would therefore need emission reduction targets.

In 1990, it was just a third of the global population. In 2020 it will be three quarters. No longer can an agreement on constraining global CO2 emissions be limited to a few countries. It needs to be truly global. The only area that meets the target is Africa, but even here the countries of Algeria, Egypt, Libya, Tunisia and South Africa would need to have emission reduction targets.

Further Questions

  1. What permutations are possible if other moral considerations are taken into account, like the developed countries bear the burden of emission cuts?
  2. What targets should be set for non-fossil fuel emissions, such as from Agriculture? Are these easier or harder to achieve than for fossil fuels?
  3. What does meeting emission targets mean for different types of economies? For instance, are emission reductions more burdensome for the fast-growing emerging economies than for the developed economies?
  4. What are the measures that IPCC claims exist to reduce emissions? Are they more onerous than the consequences of climate change?
  5. Are there in place measures to support the states dependent on the production of fossil fuels? In particular, the loss of income to the Gulf States from leaving oil in the ground may further destabilize the area.
  6. What sanctions will there be if some countries refuse to sign up to an agreement, or are politically unable to implement an agreement?
  7. What penalties will be imposed if countries fail to abide by the agreements made?

Kevin Marshall

BBC understates Cost of Climate Policy by 45 to 50 times

The UNIPCC has just finished a major meeting in Copenhagen to finalize the wording of their AR5 Synthesis Report. BBC News Environment correspondent Matt McGrath said

The IPCC says that the cost of taking action to keep the rise in temperature under 2 degrees C over the next 76 years will cost about 0.06% of GDP every year.

Over the same period, world GDP is expected to grow at least 300%

The figure of 0.06% of GDP (strictly Gross World Product) seemed a bit low. So I looked up the source of this quote.

The Synthesis Report states on pages 116-117

Estimates of the aggregate economic costs of mitigation vary widely depending on methodologies and assumptions, but increase with the stringency of mitigation (high confidence). Scenarios in which all countries of the world begin mitigation immediately, in which there is a single global carbon price, and in which all key technologies are available, have been used as a cost-effective benchmark for estimating macroeconomic mitigation costs. (Figure 3.4). Under these assumptions, mitigation scenarios that are likely to limit warming to below 2 °C through the 21st century relative to pre-industrial levels entail losses in global consumption —not including benefits of reduced climate change (3.2) as well as co-benefits and adverse side-effects of mitigation (3.5, 4.3) — of 1% to 4% (median: 1.7%) in 2030, 2% to 6% (median: 3.4%) in 2050, and 3% to 11% (median: 4.8%) in 2100, relative to consumption in baseline scenarios that grows anywhere from 300% to more than 900% over the century. These numbers correspond to an annualized reduction of consumption growth by 0.04 to 0.14 (median: 0.06) percentage points over the century relative to annualized consumption growth in the baseline that is between 1.6% and 3% per year.

Matt McGrath (or a press officer) has wrongly assumed that 0.06% of GDP is the reduction in output, whereas the Synthesis Report talks about a reduction in the growth rate. At any rate of growth, the impact of a 0.06 percentage point reduction in growth rates will mean output in 2100 is around 4.8% lower. We can put a monetary impact on this through to 2090. The World Bank estimates global output was $74,910 billion in 2013. To keep the figures simple I will assume $75,000bn for 2014. The figures for 2090 are below.

With 1.94% growth, global output in 2090 will be $323,038bn, about $14,774bn less than if there were 2% growth. Cumulatively, a 0.06 percentage point reduction in growth costs $369,901bn. But a cost of 0.06% of global output each year, with 2% growth, is a mere $8,087bn. The misstatement of the UNIPCC’s position understates the cumulative cost by 45.7 times.

Similarly, with 2.94% growth, global output in 2090 will be $678,356bn, about $30,716bn less than if there were 3% growth. Cumulatively, a 0.06 percentage point reduction in growth costs $644,144bn. But a cost of 0.06% of global output each year, with 3% growth, is a mere $13,107bn. The misstatement of the UNIPCC’s position understates the cumulative cost by 49.1 times.
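A sketch of the arithmetic behind those figures, compounding annually from an assumed $75,000bn in 2014 out to 2090; the results approximately reproduce the numbers quoted above.

```python
def policy_costs(base=75_000, years=76, growth=0.02, drag=0.0006):
    """Compare a 0.06 percentage point cut in the growth rate with a flat
    0.06% of output each year, compounding from 2014 (t=0) to 2090 (t=76)."""
    without = [base * (1 + growth) ** t for t in range(years + 1)]
    with_drag = [base * (1 + growth - drag) ** t for t in range(years + 1)]
    gap_2090 = without[-1] - with_drag[-1]
    cumulative_gap = sum(a - b for a, b in zip(without, with_drag))
    flat_cost = sum(drag * a for a in without)
    return gap_2090, cumulative_gap, flat_cost

for g in (0.02, 0.03):
    gap, cum, flat = policy_costs(growth=g)
    print(f"growth {g:.0%}: 2090 gap ~${gap:,.0f}bn, cumulative ~${cum:,.0f}bn, "
          f"flat 0.06%/yr ~${flat:,.0f}bn, ratio ~{cum / flat:.0f}")
# growth 2%: gap ~ $14,800bn, cumulative ~ $369,000bn vs ~ $8,100bn, ratio ~46
# growth 3%: gap ~ $30,700bn, cumulative ~ $644,000bn vs ~ $13,100bn, ratio ~49
```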

The BBC or the UNIPCC needs to issue a correction. The UNIPCC have at last recognized that policy will affect economic growth. The estimate is way too low, particularly for the high-policy countries who are put at an economic disadvantage relative to those countries without policies. Now they need to also look at the additional estimated costs of low carbon energy, along with the hidden costs of regulation and failed policies.

Thanks to Joanne Nova for highlighting the quote.

Kevin Marshall

UNIPCC Risk Management Process

Thanks to Tom0Mason for pointing out the following graphic, SPM.3, from the UNIPCC AR5 WGII report.

He states

Within the documentation (page 9 of the full report) is Figure SPM.3 | Climate-change adaptation as an iterative risk management process with multiple feedbacks. People and knowledge shape the process and its outcomes. [Figure 2-1]

This graphic implies that the UN has the ability to tell governments what to do. You all voted for that didn’t you?

Yes, the UN minions have set themselves up as identifiers of risk, assessors of risk, establishers of decision-making criteria, and implementers of decisions, and then they’ll monitor your compliance.

I am not sure that I entirely agree. The UNIPCC might have set themselves up as telling governments what to do, but they only partially heed what they claim in the chart, and governments even less so. For instance on “scoping“, the identification of risks and vulnerabilities is only partially followed through. In AR4 the UNIPCC scraped around for every possible risk they could find, and then embellished them. They later admitted the Himalayan glacier claim was fabricated, but there was nothing on similar fabrications for crop failures in Africa or for the collapse of the Amazon Rainforest. Nor was there an admission that claims of increasing hurricane activity were unsupported, or that the vanishing snows of Kilimanjaro were not from rising temperatures. The process of scoping should include categorizing risks according to magnitude, likelihood and the quality of the evidence. But no such critical evaluation takes place.

Implementation is a loop of

Implement Decision → Monitor → Review and Learn

In practice (with the UK as an example) implementation is accompanied by an enforcing agency whose monitoring consists of justifying the policy, with no independent audits of the success of the policy, nor identification of any adverse consequences. As a result the reviews do not learn from mistakes, nor how to improve the quality of policy, nor how to take into account new evidence, nor consider the increasing evidence that the optimal policy is to do nothing.

Kevin Marshall


Reconciling UNIPCC AR5 polar ice melt data with sea level rise

For over a year I have been pondering how to reconcile the near constant rise in sea levels with the accelerating polar ice melt. At the end of September the UNIPCC published the AR5 Working Group I (The Physical Science Basis) Summary for Policymakers, which provides some useful evidence.

The following from the UNIPCC gives some estimates of the rate of polar ice melt. On page 9

• The average rate of ice loss from the Greenland ice sheet has very likely substantially increased from 34 [–6 to 74] Gt yr–1 over the period 1992 to 2001 to 215 [157 to 274] Gt yr–1 over the period 2002 to 2011.

• The average rate of ice loss from the Antarctic ice sheet has likely increased from 30 [–37 to 97] Gt yr–1 over the period 1992–2001 to 147 [72 to 221] Gt yr–1 over the period 2002 to 2011. There is very high confidence that these losses are mainly from the northern Antarctic Peninsula and the Amundsen Sea sector of West Antarctica.

Put in sea level rise terms, the combined average rate of ice loss from the polar ice caps increased from 0.18 mm yr–1 over the period 1992 to 2001 to 1.00mm yr–1 over the period 2002 to 2011.
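The conversion from gigatonnes of melted ice to millimetres of sea level uses the global ocean surface area: roughly 361.8 Gt of water spread over the oceans adds 1 mm. A minimal sketch assuming that standard conversion factor (not a figure from the SPM itself):

```python
GT_PER_MM = 361.8   # ~1 mm of global sea level per 361.8 Gt of water (assumed constant)

early = (34 + 30) / GT_PER_MM     # Greenland + Antarctica, 1992-2001
late = (215 + 147) / GT_PER_MM    # Greenland + Antarctica, 2002-2011
print(round(early, 2), round(late, 2))   # ~0.18 and ~1.00 mm/yr, as stated above
```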

There is a problem with these figures. The melting ice will end up raising sea levels. The satellite data from the University of Colorado shows a near constant rate of rise of 3.2mm yr–1.

Assuming a one year lag in raising sea levels, the 0.18 mm yr–1 over the period 1992 to 2001 is equivalent to 5% of the 3.3mm yr–1 average sea level rise from 1993 to 2002, whilst the 1.00mm yr–1 over the period 2002 to 2011 is equivalent to 32% of the 3.1mm yr–1 average sea level rise from 2003 to 2012. Some other component of sea level rise must be decreasing. The estimates of the other components are given on page 11

Since the early 1970s, glacier mass loss and ocean thermal expansion from warming together explain about 75% of the observed global mean sea level rise (high confidence). Over the period 1993 to 2010, global mean sea level rise is, with high confidence, consistent with the sum of the observed contributions from ocean thermal expansion due to warming (1.1 [0.8 to 1.4] mm yr–1), from changes in glaciers (0.76 [0.39 to 1.13] mm yr–1), Greenland ice sheet (0.33 [0.25 to 0.41] mm yr–1), Antarctic ice sheet (0.27 [0.16 to 0.38] mm yr–1), and land water storage (0.38 [0.26 to 0.49] mm yr–1). The sum of these contributions is 2.8 [2.3 to 3.4] mm yr–1.

The biggest component of sea level rise is thermal expansion. The contribution from this element must be decreasing. Ceteris paribus, that suggests the rate of heat accumulation is decreasing. This contradicts the idea that the lack of surface temperature warming is accounted for by this heat accumulation.

The problem is that all things are not equal. The thermal expansion of water varies greatly with the temperature of that water. On page 10 there is the following graphic

The heat content of the upper ocean increased by around 10 x 10^22 J from 1993 to 2010. For 700m of ocean depth I estimate this would be about 0.1°C of warming. It is a tiny amount, and the resulting expansion varies greatly with the temperature of the water, as shown by the graph below.
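A rough back-of-envelope for that 0.1°C estimate, assuming an ocean surface area of about 3.6 × 10^14 m², seawater density of about 1025 kg/m³ and specific heat of about 4000 J/(kg·K); these are round-number assumptions of mine, not figures from the report.

```python
heat_added_j = 10e22      # ~10 x 10^22 J added to the upper ocean, 1993-2010
ocean_area_m2 = 3.6e14    # global ocean surface area (assumed)
depth_m = 700             # upper-ocean layer considered
density_kg_m3 = 1025      # seawater density (assumed)
specific_heat = 4000      # J/(kg K) (assumed)

mass_kg = ocean_area_m2 * depth_m * density_kg_m3
delta_t = heat_added_j / (mass_kg * specific_heat)
print(round(delta_t, 2))  # ~0.1°C warming of the top 700 m
```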

As sea temperature varies greatly according to location and depth, it is possible to hypothesise a decline in the thermal expansion alongside an increase in heat content. This is whilst also accepting both that the rate of rise in the heat content of the oceans has accelerated and that the contribution to sea level rise due to the increase in heat content has decreased. For example, one would just have to hypothesise that the increasing heat content had been predominately in the tropics during the 1990s and switched to the Arctic in the 2000s.

Even this switch is not necessary. There is huge variation between areas in the amount of temperature increase over a twenty-year period. Consistent with an average increase of 0.1°C could have been a decrease in average temperatures in an area of ocean as large as the Atlantic and Indian Oceans combined.

But what makes this less than credible is that this shift almost exactly offsets the estimated increase in the ice melt component. Most likely no one will try to calculate this, as the data is not there. Even with 3,000 Argo buoys in the oceans, there is still on average just one buoy per 200,000 km3 of ocean, taking about 25 dips a year. The consensus viewpoint appears less likely than the view that climate models embody an exaggerated belief in the impact of greenhouse gases on average temperatures.

There is an opportunity for some further investigation with the data. But a huge amount of work may not yield anything, or may yield a conclusion at odds with the “real”, unknown one. However, the first step is to determine how the UNIPCC calculated the figure for thermal expansion. Hopefully it was more substantial than taking the difference between the total sea level rise and the estimates for the other factors.

Update

At Bishop Hill Unthreaded, michael hart (Jun 1, 2014 at 4:13 AM) refers to some other variables that determine how warming oceans will affect sea level rise through thermal expansion. So now the list includes:

  • The quantity of heat. (see above)
  • The initial temperature of the water which the heat was applied to. (see above)
  • The initial temperature is in turn related to
  1. Latitude – at mid latitudes there is a seasonal temperature variation down to about 300 metres.
  2. Depth
  3. Density variation due to salinity (see pdf page 9)

However, there are local variations as well, due to ocean currents that shift over time.

For these reasons, any attempt at estimating thermal expansion will be reliant on assumptions and estimates. The UNIPCC will simply have taken the known sea level rise and deducted the estimated “known” factors – ice melt and land water storage.

In terms of reconciling polar ice melt to sea level rise, there is something that I missed. According to the UNIPCC, glacier melt has a larger contribution to sea level rise than polar ice melt – 0.76 mm yr–1 against 0.60 mm yr–1. It is quite conceivable – particularly since temperatures have stopped rising – that glacier melt has effectively ceased or even gone into reverse. Unlike with thermal expansion, there should be estimates available to confirm this.


Radiative Forcing – UNIPCC AR5 undermines AR4, but scientists have unshaken confidence in their work

Last year in “Aerosols – The UNIPCC AR4 adjustment factor” I claimed that in 2007 the UNIPCC engineered the radiative forcing components to tell a story. It basically manipulated the figures to account for the lack of warming up to that point. The release of the AR5 Working Group 1 report yesterday shows the extent of the false levels of certainty in the scientists’ estimates in 2007.

The Data

In 2007 Figure 2.4 of the Synthesis Report was as follows

In 2013, Figure SPM.5 is below (note 1)

There are slight changes in format and terminology. I have put the two tables side-by-side for comparison, with analysis:-

 

For the range of forcings, I have expressed the range as a percentage of the mid-point.

Below are comments on the individual forcing components.

Carbon Dioxide CO2

The most important anthropogenic greenhouse gas has hardly moved, from 1.66 to 1.68 W m-2. In 1750 CO2 levels were 280 ppm, rising to 379 ppm in 2005 and 392 ppm in 2011. In 2007, the scientists estimated that it took a rise of 60 ppm to increase radiative forcing by 1 W m-2, compared to 66 ppm in 2013. Scientists have found that CO2 is 10% less effective as a greenhouse gas than previously thought. They are far less certain about this figure, as the range has doubled, yet scientists have switched from high confidence to very high confidence in their figures (notes 3 and 4).
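The implied ppm-per-W/m² figures come from dividing the concentration rise since 1750 by the stated forcing; a crude linear measure of potency (not the logarithmic forcing formula), reproduced here from the concentrations quoted above:

```python
# ppm of CO2 rise per W/m^2 of forcing, implied by the 2007 and 2013 reports
print((379 - 280) / 1.66)   # ~60 ppm per W/m^2 (AR4, 2007)
print((392 - 280) / 1.68)   # ~67 ppm per W/m^2 (AR5, 2013): ~10% less potent per ppm
```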

Methane CH4

CH4 has practically doubled in impact, from 0.48 to 0.97 W m-2. In 1750 CH4 levels were 715 ppb, rising to 1774 ppb in 2005 and 1803 ppb in 2011. In 2007, the scientists estimated that it took a rise of 2200 ppb to increase radiative forcing by 1 W m-2, compared to 1120 ppb in 2013. Scientists have found that CH4 is practically twice as potent a greenhouse gas as previously thought. They are far less certain about this figure, as the range has more than doubled relative to the mid-point. More significantly, the new potency is well outside the confidence range of the 2007 report. There the high point of the uncertainty range was 0.53 W m-2, whereas the low point of the new uncertainty range is 0.74 W m-2. Despite having been so far out six years ago, the scientists still have high confidence in their figures. The reason given on page 9 is

This difference in estimates is caused by concentration changes in ozone and stratospheric water vapour due to CH4 emissions and other emissions indirectly affecting CH4.

 

The potency of CH4 is a modelled estimate based on other factors. It is by including these indirect effects that the uncertainty is increased.

As a side point, of the 1100 ppb rise in CH4 levels since 1750, 80% was prior to 1975. It has ceased to be a significant contributor to increasing radiative forcing. Given the increased recognised potency, it is a minor explanation of the pause in warming.

Nitrous Oxide N2O

This has hardly moved in impact, from 0.16 to 0.17 W m-2. In 1750 N2O levels were 270 ppb, rising to 319 ppb in 2005 and 324 ppb in 2011. Scientists are far less certain about these figures, as the range has nearly doubled, but they still have high confidence in their figures.

Halocarbons (note 4)

Although a minor group of greenhouse gases, their impact has reduced from 0.34 to 0.18 W m-2, but the magnitude of the uncertainty band has increased more than five-fold, from 0.06 (0.37-0.31) to 0.34 (0.35-0.01). Instead of reducing the scientists’ confidence, they have gone from “high confidence” to “very high confidence” in the figures.

Aerosols

Of the 2007 report I claimed they were a fudge factor, suppressing the warming effect of greenhouse gases. The combined mid-point of the direct and cloud albedo effects is now 1.20 W m-2, down more than 30% on 2007. The range of uncertainty is more significant. This has increased from 0.8 to 0.9 W m-2, with the impact at the high end being a net warming effect. Despite now being uncertain about whether the direct effect of aerosols warms or cools the planet, and despite being less certain than the already wide “confidence” range of six years ago, the scientists still have high confidence in their figures.

Forecasts for Radiative Forcing in 2100 for CO2 and CH4

Let us assume that CO2 continues to increase at 3ppm a year and CH4 increases by 5ppb a year until 2100. Using the 2007 potency estimates, CO2 forcing will be 6.34 W m-2 and CH4 will be 0.69 W m-2 above 1750 levels. Using the 2013 potency estimates, CO2 forcing will be 5.72 W m-2 and CH4 will be 1.37 W m-2 above 1750 levels. The combined estimated forcing is less than 1% different, despite the doubling of the potency of CH4. Maybe we will have a much greater reason to worry about the melting of permafrost in the tundra, causing a huge rise in atmospheric methane levels. Suppressed warming from this factor has been doubled.
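A sketch of that 2100 projection, applying the implied potencies from the sections above to the assumed growth rates (3 ppm/yr CO2 and 5 ppb/yr CH4 from the 2011 levels):

```python
years = 2100 - 2011
co2_2100 = 392 + 3 * years     # ppm in 2100
ch4_2100 = 1803 + 5 * years    # ppb in 2100

# Implied ppm (or ppb) per W/m^2, from the 2007 and 2013 forcing figures above
co2_2007, co2_2013 = (379 - 280) / 1.66, (392 - 280) / 1.68
ch4_2007, ch4_2013 = (1774 - 715) / 0.48, (1803 - 715) / 0.97

print((co2_2100 - 280) / co2_2007, (ch4_2100 - 715) / ch4_2007)
# ~6.3 and ~0.7 W/m^2 above 1750 levels on the 2007 potencies
print((co2_2100 - 280) / co2_2013, (ch4_2100 - 715) / ch4_2013)
# ~5.7 and ~1.4 W/m^2 above 1750 levels on the 2013 potencies
```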

Conclusion

Scientists now implicitly admit that they were much too confident about the potency of greenhouse gases in 2007. They have now doubled the uncertainty bands on the three major greenhouse gases. Yet recognizing this past over-confidence seems to have had no impact on current levels of confidence.

Kevin Marshall

 

Notes

  1. The graphic at the time of writing was only available in pdf format.
  2. NMVOC = Non-methane volatile organic compounds. They have a role in the production of ozone. Defra have a fuller explanation.
  3. All these figures are available from the 2007 “Full report” page and the 2013 WG1 Summary for Policymakers page 7. This is the 27-09-13 version. Page numbering will change once tables are properly inserted.
  4. Upon re-reading I have made two adjustments. For CO2, I note that scientists have increased their confidence despite doubling the size of their uncertainty bands. I have also added a comment on halocarbons, where confidence has increased, despite a more than five-fold increase in the uncertainty band.

Assessing the UNIPCC fifth assessment report

The first part of the UNIPCC AR5 is due to be published in the coming days. At the Conversation, Research Fellows Roger Jones and Celeste Young at Victoria University have posted Explainer: how to read an IPCC report. It contains some useful stuff on penetrating the coded language of the IPCC report. You will be better able to decode what the IPCC mean by various levels of confidence. However, the authors are very careful not to give people a free rein in thinking for themselves. Therefore they stress that the language is complex, and any questions need to be answered by an expert. After all, it would not do to have people misinterpreting the science.

I suggest an alternative method of understanding the science. That is comparing what is said now with what the consensus said back in 2007 in AR4. The AR4 is available at the United Nations Intergovernmental Panel on Climate Change website at the following location.

http://www.ipcc.ch/publications_and_data/publications_ipcc_fourth_assessment_report_synthesis_report.htm

Figure 2.4 Radiative forcing components of SYR.

It would be nice to see the comparative estimates, particularly on whether aerosols have a comparatively large negative role and whether natural factors are still less than 10% of the net total.


Figure 2.4. Global average radiative forcing (RF) in 2005 (best estimates and 5 to 95% uncertainty ranges) with respect to 1750 for CO2, CH4, N2O and other important agents and mechanisms, together with the typical geographical extent (spatial scale) of the forcing and the assessed level of scientific understanding (LOSU). Aerosols from explosive volcanic eruptions contribute an additional episodic cooling term for a few years following an eruption. The range for linear contrails does not include other possible effects of aviation on cloudiness. {WGI Figure SPM.2}

Figure SPM.6. Projected surface temperature changes for the late 21st century (2090-2099).

An updated map on a comparable basis would be useful, especially for the most concerning area of the Arctic.


Figure SPM.6. Projected surface temperature changes for the late 21st century (2090-2099). The map shows the multi-AOGCM average projection for the A1B SRES scenario. Temperatures are relative to the period 1980-1999. {Figure 3.2}

Table SPM.2. Examples of some projected regional impacts.


It would be nice to have an update on how the short-term impacts are doing. These all had high confidence or very high confidence.

In Africa

By 2020, between 75 and 250 million people are projected to be exposed to increased water stress due to climate change.

By 2020, in some countries, yields from rain-fed agriculture could be reduced by up to 50%. Agricultural production, including access to food, in many African countries is projected to be severely compromised. This would further adversely affect food security and exacerbate malnutrition.

In Australia and New Zealand

By 2020, significant loss of biodiversity is projected to occur in some ecologically rich sites, including the Great Barrier Reef and Queensland Wet Tropics.

Small Islands

Sea level rise is expected to exacerbate inundation, storm surge, erosion and other coastal hazards, thus threatening vital infrastructure, settlements and facilities that support the livelihood of island communities.

Please note the graphs used are available at this website and are IPCC Copyright.