Camp Fire California – Lessons in the Limits & Harms of Political Action

I believe that there is a prayer that politicians should adopt.
Called the Serenity Prayer, and written by Reinhold Niebuhr it begins

God grant me the serenity
to accept the things I cannot change;
courage to change the things I can;
and wisdom to know the difference.

The order is the right way round. Most "things" a politician – or even a ruling political party – cannot change. It is in identifying the things they can change for the better that politicians can make a positive difference.

An example comes from the end of last month, when for a few days the news in Britain was dominated by stories of the deadliest and most destructive wildfire in California's history. Called the Camp Fire, it killed 86 people, destroyed 19,000 homes and burnt 240 square miles (62,000 ha).
CBS News 60 Minutes has this short report.
Many politicians, including Governor Brown, blamed climate change. Yet even if wildfires were solely attributable to climate change, the ultimate cause would be global greenhouse gas emissions. As California's emissions in 2016 were around 430 MtCO2e – about 0.8% of the global total – any Californian climate policies will make virtually zero difference to global emissions. Even the USA's proposed contribution of 2015 would not have made much difference, as most of the forecast drop in emissions was due to non-policy trends rather than actual policies. A policy that achieves much less than a 10% real reduction, from a country with one-eighth of global emissions, is hardly going to have an impact in a period when net global emissions are increasing. That is, any mitigation policies by the State of California or the United States will have approximately zero impact on global emissions.
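As a rough check on the 0.8% figure: taking the global total as roughly 53,500 MtCO2e (the 2017 figure cited later in these posts; applying it to 2016 is an approximation), the share works out as claimed.

```python
# Rough check: California's emissions as a share of the global total.
# The 53.5 GtCO2e denominator is the 2017 global figure; using it
# for 2016 is an approximation.
california_mt = 430.0     # MtCO2e, California 2016
global_mt = 53_500.0      # MtCO2e (53.5 GtCO2e)

share_pct = 100 * california_mt / global_mt
print(f"California's share: {share_pct:.2f}%")  # roughly 0.8%
```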
But no reasonable person would claim that it was all down to climate change, only that climate change may have made the risk of wildfires a little greater.
What are the more immediate causes of wildfires? This is what Munich Re has to say on wildfires in Southeast Australia. (Bold mine)

The majority of bushfires in southeast Australia are caused by human activity

Bushfire is the only natural hazard in which humans have a direct influence on the hazard situation. The majority of bushfires near populated areas are the consequence of human activity. Lightning causes the smaller portion naturally. Sometimes, a carelessly discarded cigarette or a glass shard, which can focus the sun’s rays is all it takes to start a fire. Heat from motors or engines, or electric sparks from power lines and machines can ignite dry grass. Besides this accidental causes, a significant share of wildfires are started deliberately.

Humans also change the natural fire frequency and intensity. They decrease the natural fire frequency due to deliberate fire suppression near populated areas. If there is no fuel-reduction burning in forests for the purposes of fire prevention, large quantities of combustible material can accumulate at ground level.

Surface fires in these areas can become so intense due to the large amounts of fuel that they spread to the crowns of the trees and rapidly grow into a major fire. If humans had not intervened in the natural bushfire regime, more frequent low-intensity fires would have consumed the forest undergrowth and ensured that woodland grasses and scrubs do not proliferate excessively.

David Evans expands on the issue of fuel load in a 2013 article.

The immediate cause of wildfires is human. Near people's homes or businesses there is little that can be done to prevent fires, whether accidental or deliberate.

But, as any fire safety course will explain, a fire requires heat, fuel and oxygen. A few tenths of a degree of warming is not going to increase the heat source significantly. As Munich Re explains, successful suppression of small fires, or forest management that allows dead material to accumulate, or that does not thin the forest or create fire-breaks, will increase the continuous and rich fuel that allows fires to spread. That is, the unintended consequence of certain types of forest management is to increase the risk of severe fires.

President Trump was therefore correct in blaming poor forest management for the horrific fire. The reaction from firefighters that the tweets were "demeaning" and "ill-informed" was misplaced. If bad policy contributed to the severity of a fire then politicians should share some of the blame for the damage caused. They should not be defended by those risking their lives to limit the damage resulting from bad policies. If poor building regulations led to many deaths in a large building, then those responsible for the regulations would shoulder some of the blame for those deaths even if an arsonist started the fire. The same applies to forests. After major disasters such as air crashes and earthquakes, regulations are often put in place to prevent similar future disasters, even when such regulations would not have prevented the actual disaster. The effect of a disaster is to concentrate minds on the wider aspects and plug the gaps. But if regulations contributed to the extent of the disaster, the aftermath will be to shift blame elsewhere, then "fix" the underlying problem with a raft of mostly unnecessary regulations. President Trump broke these unwritten political rules. But the results are the same, and have occurred quite quickly.

When Trump visited the site of the Camp Fire on November 19th he met with outgoing Governor Jerry Brown and Lt. Gov. Gavin Newsom, and stated

Is it happening? Things are changing. ….. And I think, most importantly, we’re doing things about. We’re going to make it better. We’re going to make it a lot better. And it’s going to happen as quickly as it can possibly happen.

From the Daily Caller and WUWT, on December 23rd President Trump signed into law new wildfire legislation that will better allow such fire-prevention management policies. On Christmas Eve President Trump followed this up with an executive order allowing agencies to do more to prevent massive wildfires.

Returning to the Serenity Prayer: in issuing an executive order allowing government agencies to reduce fire risk, President Trump has done something that is within his power. The GOP legislation enabling others to carry out similar forest management policies has a slightly less direct impact. Democrats whinging about climate change goes far beyond failing to accept the things they cannot change. It means blocking actions that could limit the risk and extent of wildfires, in order to maintain ineffectual and costly policies.

Kevin Marshall

The BBC's misleading reporting of the COP 24 Katowice Agreement

As usual, the annual UNFCCC COP meeting reached an agreement after extra time, said nothing substantial, but tried to dress up the failure as something significant. The BBC's exuberant reporting of the outcome by Matt McGrath seriously misleads readers as to the substance of the agreement when he states

The Katowice agreement aims to deliver the Paris goals of limiting global temperature rises to well below 2C.

I have written to the BBC Complaints Department asking that they make a correction. Within that letter I cite four references that demonstrate why McGrath's statement is misleading.

First, there is Climate Action Tracker's thermometer. I do not believe there have been any additional pledges made in the last few days that would cause CAT to lower their projection from 3°C of warming to below 2°C.
Instead, I believe that the COP24 Agreement merely tries to ensure that the pledges are effectively implemented, thus ensuring 3°C of warming rather than the "current policy" 3.3°C of warming.

Second, I do not believe that additional pledges made during the Katowice conference will cut emissions by at least 15 GtCO2e in 2030, the minimum difference required to be on track to stop global average temperatures exceeding 2°C. I enclose a screenshot of Climate Action Tracker's Emissions Gap page.

For the original source, I direct readers to the UNEP Emissions Gap Report 2018, published towards the end of November. In particular, look at Figure ES.3 on page xviii. The three major points in bold in the Executive Summary (pages xiv to xvii) clarify this graphic.

Third, I also draw readers' attention to "Table 2.1: Overview of the status and progress of G20 members, including on Cancun pledges and NDC targets" on page 9 of the full UNEP report. A screenshot (without footnotes) is shown below.

The G20 countries accounted for 78% of the 2017 global GHG emissions excluding LULUCF of 49.2 GtCO2e. This was equivalent to 72% of total GHG emissions of 53.5 GtCO2e. It might be worth focusing on which countries have increased their pledges in the past couple of weeks. In particular, consider those countries whose 2015 INDC submissions imply emissions increases between 2015 and 2030 of at least 0.5 GtCO2e (China, India, Russia, Turkey and Indonesia, plus Pakistan, Nigeria and Vietnam outside the G20), as they collectively more than offset the combined potential emissions decreases of developed countries such as the USA, EU, Canada and Australia. In a previous post I graphed these proposed emissions increases in figures 2 and 3. They are reproduced below.
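The equivalence of the two percentages can be verified with the report's own figures; a minimal arithmetic check:

```python
# Check that 78% of emissions excluding LULUCF is consistent with
# 72% of the total including LULUCF, using the report's figures.
excl_lulucf = 49.2   # GtCO2e, 2017 global emissions excluding LULUCF
total = 53.5         # GtCO2e, 2017 total including LULUCF

g20 = 0.78 * excl_lulucf               # G20 emissions excluding LULUCF
print(f"{100 * g20 / total:.0f}%")     # approximately 72%
```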

Fourth, the UNFCCC press announcement makes no mention of any major breakthrough. The only government mentioned is that of Scotland, which provided £200,000 of additional funding. Scotland is not an independent nation recognized by the United Nations. As part of the UK within the EU, it is not even part of a nation state that makes submissions directly to the UNFCCC. The SUBMISSION BY LATVIA AND THE EUROPEAN COMMISSION ON BEHALF OF THE EUROPEAN UNION AND ITS MEMBER STATES of 6 March 2015 can be found here. Within the UNFCCC, Scotland is thus two levels below Liechtenstein or Tuvalu, despite having respectively 140 and 480 times their population. But even if Scotland were independent of both the UK and the EU, it would hardly seem fair for it to be accorded the same voice as India or China, which each have about 250 times the population of Scotland.

In the spirit of objectivity and balance, I hope that the BBC makes the  necessary correction.

Kevin Marshall

UNEP Emissions Gap Report 2018 Part 3 – UNEP tacitly admits climate mitigation is a failure

To those following the superficial political spin of climate policy, a UN organisation admitting that climate mitigation has failed may come as a surprise. Yet one does not have to go too deeply into the new UNEP Emissions Gap Report 2018 to see that this tacit admission is clearly the case. It is contained within the 3 major points in the Executive Summary.

By policy failure, I mean failure to achieve a substantial global reduction in GHG emissions in the near future, even if that reduction is not in line with either the 1.5°C or 2.0°C warming objective. On this measure, the UNEP is tacitly admitting failure in the summary.
The Executive Summary of the UNEP Emissions Gap Report 2018 starts on pdf page 14 of 112, numbered page xiv.

Point 1 – Current commitments are inadequate

1. Current commitments expressed in the NDCs are inadequate to bridge the emissions gap in 2030. Technically, it is still possible to bridge the gap to ensure global warming stays well below 2°C and 1.5°C, but if NDC ambitions are not increased before 2030, exceeding the 1.5°C goal can no longer be avoided. Now more than ever, unprecedented and urgent action is required by all nations. The assessment of actions by the G20 countries indicates that this is yet to happen; in fact, global CO2 emissions increased in 2017 after three years of stagnation.

This is not a statement about a final push to get policy over the line, but a call for a complete change of direction. The tacit admission is that this is politically impossible. In the amplification it is admitted that in the G20 major economies – most of them developing countries – even the "NDC ambitions" for 2030 are unlikely to be achieved. As I showed in the Part 2 post, 9 of the G20 will actually increase their emissions from 2015 to 2030 if the commitments are fully met, and the sum of the emissions increases will be greater than the emissions decreases. The exhortation for "unprecedented and urgent action" is not like Shakespeare's Henry V rallying his men with a "once more unto the breach chaps and we will crack it", but more like "Hey good fellows, if we are really going to breach the defenses we need to upgrade from the colorful fireworks to a few kegs of proper gunpowder, then make a few genuine sacrifices. I will be cheering you all the way from the rear". This sentiment is contained in the following statement.

As the emissions gap assessment shows, this original level of ambition needs to be roughly tripled for the 2°C scenario and increased around fivefold for the 1.5°C scenario.

Point 2 – Emissions are increasing, not decreasing rapidly

2. Global greenhouse gas emissions show no signs of peaking. Global CO2 emissions from energy and industry increased in 2017, following a three-year period of stabilization. Total annual greenhouse gases emissions, including from land-use change, reached a record high of 53.5 GtCO2e in 2017, an increase of 0.7 GtCO2e compared with 2016. In contrast, global GHG emissions in 2030 need to be approximately 25 percent and 55 percent lower than in 2017 to put the world on a least-cost pathway to limiting global warming to 2°C and 1.5°C respectively.

In just 13 years from 2017, global emissions need to be down by a quarter, or by more than a half, to achieve the respective 2°C and 1.5°C targets. Yet emissions are still going up. Again, this is an admission that the progress of over two decades is small in relation to the steps needed to achieve anything like the desired outcome.
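The 2030 levels implied by the quoted 25% and 55% reductions can be computed directly; a minimal sketch of the report's arithmetic:

```python
# Required 2030 emissions implied by the report's 25% / 55% reductions
# relative to the record 2017 total of 53.5 GtCO2e.
emissions_2017 = 53.5  # GtCO2e, including land-use change

required = {target: emissions_2017 * (1 - cut)
            for target, cut in [("2C", 0.25), ("1.5C", 0.55)]}
for target, level in required.items():
    print(f"{target}: emissions must fall to about {level:.1f} GtCO2e by 2030")
```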

Point 3 – Scale of the gap in numbers

3. The gap in 2030 between emission levels under full implementation of conditional NDCs and those consistent with least-cost pathways to the 2°C target is 13 GtCO2e. If only the unconditional NDCs are implemented, the gap increases to 15 GtCO2e. The gap in the case of the 1.5°C target is 29 GtCO2e and 32 GtCO2e respectively. This gap has increased compared with 2017 as a result of the expanded and more diverse literature on 1.5°C and 2°C pathways prepared for the IPCC Special Report.

Some developing countries said they would change course, conditional on massive amounts of funding. It is clear this will not be forthcoming. Fleshing out the 1.5°C target in the SR1.5 Report showed that it requires more onerous policies than previously thought. Each year UNEP produces a chart that nicely shows the scale of the problem. The 2018 version, on page xviii, is reproduced as figure 1.

Figure 1 : The emissions GAP in 2030 under the 1.5°C and 2°C scenarios, from the UNEP Emissions Gap Report 2018.

The widening gap between the 1.5°C and 2°C pathways and current projected commitments over the last five reports is shown in figure 2.

This widening gap is primarily a result of recalculations. Increased emissions in 2017 are secondary.

Conclusion

That nearly 200 nations would fail to agree to collectively and substantially reduce global emissions was obvious from the Rio Declaration in 1992, which exempted developing countries from any obligation to reduce their emissions. These developing countries now have at least four-fifths of the global population and around two-thirds of emissions. It was even more obvious from reading the Paris Agreement, with its vague aspirations. It is left to the reader to work out the implications of paragraphs like 4.1 and 4.4, which render the UNFCCC impotent in reducing emissions. The latest UNEP Emissions Gap Report presents the magnitude of the mitigation policy failure, along with very clear statements about that failure.

Kevin Marshall

Leave EU Facebook Overspending and the Brexit Result

Last week an Independent article claimed

Brexit: Leave ‘very likely’ won EU referendum due to illegal overspending, says Oxford professor’s evidence to High Court

The article began

It is “very likely” that the UK voted for Brexit because of illegal overspending by the Vote Leave campaign, according to an Oxford professor’s evidence to the High Court.

Professor Philip Howard, director of the Oxford Internet Institute, at the university, said: “My professional opinion is that it is very likely that the excessive spending by Vote Leave altered the result of the referendum.
“A swing of just 634,751 people would have been enough to secure victory for Remain.
“Given the scale of the online advertising achieved with the excess spending, combined with conservative estimates on voter modelling, I estimate that Vote Leave converted the voting intentions of over 800,000 voters in the final days of the campaign as a result of the overspend.”

Is the estimate conservative? Anthony Masters, a Statistical Ambassador for the Royal Statistical Society, questions the statistics in the Spectator. The 800,000 figure was based upon 80 million Facebook users, 10% of whom clicked on the advert; of those clicking, 10% changed their minds.

Masters gave some amplification in a follow-up blog post, Did Vote Leave's overspending cause their victory?
The reasons for doubting the "conservative" figures are multiple, including:
– There were not 80 million voters on Facebook. Of the 46 million voters, at most 25.6 million had Facebook accounts.
– The click-through rate for ads is far less than 10%. In the UK in 2016 it was estimated at 0.5%.
– Advertising is not the core of campaigning. It is not even viewed as the primary element, merely bolstering other parts of a campaign through awareness and presence.
– 10% of those reading the advert changing their minds is unlikely; the evidence suggests far less.
Anthony Masters concludes the Spectator piece by using Professor Howard’s own published criteria.

Prof Howard’s 2005 book, New Media Campaigns and the Managed Citizen, also argues that we should apply a different calculation to that submitted to the High Court. His book says to apply a one per cent click-through rate, where 10 per cent “believe” what they read; and of that 10 per cent act. This ‘belief’ stage appears to have been omitted in the High Court submission’s final calculation. Using these rates, this calculation turns 25.6 million people into 2,560 changed votes – hardly enough to have swung the referendum for Leave, given that their margin of victory was over a million votes. If we share a belief in accuracy, this erroneous claim should have limited reach.

There is further evidence that runs contrary to Prof Howard’s claims.

1. The Polls
To evaluate the statistical evidence for a conjecture – particularly for a contentious and opinionated issue like Brexit – I believe one needs to look at the wider context. If a Facebook campaign swung the referendum in the final few days from Remain to Leave, then there should be evidence of a swing in the polls. In the blog article Masters presented three graphs based on the polls that contradict this swing. Through the four weeks of the official campaign the Remain/Leave split was fairly consistent on a poll-of-polls basis. From analysis by pollster YouGov, the Leave share peaked on 13th June, ten days before the referendum. The third graphic, from a statistical analysis by the LSE, provides the clearest evidence.

The peak was just three days before the murder of MP Jo Cox by Thomas Mair. Jo Cox was a Remain campaigner, whilst it was reported that her attacker shouted slogans like "Britain First". The shift in the polls could have been influenced by the glowing tributes to the murdered MP, alongside speculation about the vile motives of a clear Vote Leave supporter. That Jo Cox's murder should have had no influence, especially when campaigning was suspended as a result, does not seem credible.

On Twitter, Anthony Masters also pointed to a question in Lord Ashcroft's poll carried out on the day of the referendum – How the United Kingdom voted on Thursday… and why – with a graphic showing when people decided which way to vote. At most 16% of Leave voters made up their minds in the last few days, slightly fewer than the 18% of Remain voters who did so.

The same poll looked at the demographics.


This clearly shows the split by age group. The younger a voter, the more likely they were to vote Remain. It is not a minor relationship: 73% of 18-24s voted Remain, whilst only 40% of the over-65s did so. Similarly, the younger a person, the greater the time spent on social media such as Facebook.

2. Voting by area
Another aspect is to look at the geographical analysis. Using Chris Hanretty's estimates of the EU referendum results by constituency, I concluded that the most pro-Remain areas were the centres of major cities and the university cities of Oxford, Cambridge and Bristol. This is where the most vocal people reside.

The most pro-Leave areas were in the minor towns such as Stoke and Boston. Greater Manchester provides a good snapshot of the national picture. Of its 22 constituencies, it is estimated that just 3 had a majority Remain vote, those central to the City of Manchester. The constituencies on the periphery voted Leave, the strongest being on the east of Manchester, a few miles from the city centre. Manchester Central contains many of the modern flats and converted warehouses of Manchester. Manchester Withington has a preponderance of media types working at Media City for the BBC and ITV, along with education sector professionals.

The peripheral constituencies, by contrast, contain people who are not just geographically marginalised, but often feel politically marginalised as well.

Concluding comments

Overall, Professor Howard's claims of late Facebook posts swinging the referendum result are not credible at all. They are about as crackpot as (and contradict) the claims of Russian influence on the Brexit result.
To really understand the issues one needs to look at the data from different perspectives and in the wider context. But the more dogmatic Remainers appear to be using their power and influence – backed by scanty assertions – to thrust their dogmas onto everyone else. This is undermining academic credibility and the democratic process. By using the courts to pursue their dogmas, it also threatens to pull the legal system into the fray, potentially undermining respect for the rule of law among those on the periphery.

Kevin Marshall

Natural Variability in Alaskan Glacier Advances and Retreats

One issue with global warming is discerning how much of that warming is human-caused. Global temperature data is only available since 1850, and may contain biases, some recognized (like the urban heat island effect) and others perhaps less so. Going further back is notoriously difficult, as proxies for temperature have to be used. Given that (a) recent warming in the Arctic has been significantly greater than warming at other latitudes (see here) and (b) the prominence given a few years ago to the impact of melting ice sheets, the retreat of Arctic glaciers ought to be a useful proxy. I was reminded of this by yesterday's Microsoft screensaver of Johns Hopkins Glacier and inlet in Glacier Bay National Park, Alaska.

The caption caught my eye

By 1879, when John Muir arrived here, he noticed that the huge glacier had retreated and the bay was now clogged with multiple smaller glaciers.
I did a quick search for more information on this retreat. At the National Park Service website there are four images of the estimated glacier extent.
The glacier advanced from 1680 to 1750, retreated dramatically over the next 130 years to 1880, and then retreated less dramatically over the 130+ years since. This does not fit the picture of unprecedented global warming since 1950.

The National Park Service has more detail on the glacial history of the area, with four maps of the estimated glacial extent.

The glacial advance after 1680 enveloped a village of some early peoples. This is something new to me: previous estimates of glacier movement in Glacier Bay have only been of the retreat. For instance, this map from a 2012 WUWT article shows the approximate retreat extents, not the earlier advance. Is this recently discovered information?

I have marked up the Johns Hopkins Glacier, where the current front is about 50 miles from the glacier extent in 1750.
The National Park Service has a more detailed map of Glacier Bay, with more detailed estimated positions of the glacier terminus at various dates. From this map, the greatest measured retreat of Johns Hopkins Glacier was in 1929. By 1966 it had advanced over a mile, and the current terminus is slightly in front of the 1966 terminus. This is an exception among the glaciers in Glacier Bay, which are otherwise still retreating, though at a slower rate than in the nineteenth century.

As human-caused warming is supposed to have occurred predominantly after 1950, the glacial advance and retreat patterns of the extensive Glacier Bay area do not appear to conform to that signal.

A cross check is from the Berkeley Earth temperature anomaly for Anchorage.

Whilst it might explain the minor glacial advances from 1929 to 1966, it does not explain the more significant glacial retreat in the nineteenth century, nor the lack of significant glacial retreat since the 1970s.

Kevin Marshall

UNEP Emissions Gap Report 2018 Part 2 – Emissions Projections by Country

In previous UNEP Emissions Gap Reports I found that although they showed aggregate global projected emissions, there was no breakdown by country. As mitigation policies are mostly enacted by nation states, and the aim is to reduce global emissions, it would be useful to see how each of the nearly 200 nation states has pledged to contribute to that objective. Table 2.1 on page 9 of the UNEP Emissions Gap Report 2018 (published last week) goes part way to remedying this glaring omission. The caption states

Table 2.1: Overview of the status and progress of G20 members, including on Cancun pledges and NDC targets.


The G20 economies accounted for 78% of global emissions (excluding LULUCF) in 2017. The table does not show the estimated emissions in 2015 and 2030, only the emissions per capita in 2015 (including LULUCF) and the percentage change in emissions per capita from 2015 to 2030. So I have done my own calculations based on this data, using the same future population estimates as UNEP – that is, the medium fertility variant of the United Nations World Population Prospects 2017 edition. There are two additional assumptions I have made in arriving at these figures. First, each country's share of global emissions in 2015 was exactly the same as in 2017. Second, the global shares including LULUCF (land use, land-use change and forestry) are the same as those excluding LULUCF. This second assumption will likely understate the total emissions shares of countries like Brazil and Indonesia, where land use has high, and variable, emissions impacts. It may alter the country rankings by a small amount. However, the overall picture shown in Figure 1 will not be materially changed, as the report states on page xv that the land use element was just 4.2 GtCO2e of the 53.5 GtCO2e estimated emissions in 2017.
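The reconstruction method described above can be sketched as follows. All the country inputs here are illustrative placeholders, not values from Table 2.1, and the 2015 global total is an assumed round figure:

```python
# Sketch of the estimation method: rebuild a country's 2015 and 2030
# emissions from its 2017 global share, its per-capita change, and
# population projections. All inputs below are illustrative only.
GLOBAL_2015 = 51.0  # GtCO2e, assumed 2015 global total

def estimate(share_2017, pc_change_pct, pop_2015_m, pop_2030_m):
    """Return (2015, 2030) emissions in GtCO2e for one country."""
    e_2015 = GLOBAL_2015 * share_2017     # assumes 2015 share equals 2017 share
    pc_2015 = e_2015 / pop_2015_m         # per capita (GtCO2e per million people)
    pc_2030 = pc_2015 * (1 + pc_change_pct / 100)
    return e_2015, pc_2030 * pop_2030_m

# Hypothetical country: 7% share, +50% per capita, population 1300m -> 1500m
e15, e30 = estimate(0.07, 50, 1300, 1500)
print(f"2015: {e15:.2f} GtCO2e, 2030: {e30:.2f} GtCO2e")
```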

Figure 1 shows that only the G20 countries accounting for 33% of current global emissions are projected to have lower emissions in 2030 than in 2015. The other G20 countries, with 45% of global emissions, are projected to be higher. There are wide variations. I calculate that Argentina is projected to increase its emissions by 7% or 32 MtCO2e, Turkey by 128% or 521 MtCO2e, and India by 93% or 2546 MtCO2e.
To get a clearer picture I have looked at the estimated changes between 2015 and 2030 in Figure 2. Please note the assumptions made above, particularly concerning LULUCF. I also make the additional assumption that in the rest of the world emissions will increase in line with projected population growth, so emissions per capita will be unchanged.

The calculated figures show a net increase of 7.4 GtCO2e, compared to the EGR2018 estimate of 6 GtCO2e including LULUCF. It might be reasonable to assume net reductions in the removal of rainforests by burning, an increase in trees due to more planting, and a net positive impact of increased growth due to higher CO2 levels.
Note that whilst the USA has given notice of exiting the Paris Agreement, and thus its pledges, the pledge was a very soft target. It is more than likely that the United States will have the greatest emissions reduction of any country between 2015 and 2030, and one of the largest percentage reductions as well. These reductions are mostly achieved without economically damaging mitigation policies.
The figures used for the G20 countries in Table 2.1 are only vague estimates, as section 2.4.2 (Emissions trends and targets of individual G20 members) implies. However, the assumption of a net increase of 29% for the rest of the world might not be valid if one uses country INDC submissions as a basis for calculation. A few countries have pledged to reduce emissions; Andorra and Liechtenstein are two examples. But among the more populous emerging economies, it is clear from the INDC submissions that there is no intention to reduce emissions.

Figure 3 estimates the implied increase in emissions in the more populous countries outside of the G20 for the unconditional scenarios.

I would also have liked to include the DR Congo, Egypt and Iran, with a combined population of 260 million, but lack of data in their INDCs prevented this.
Although the 8 countries in Figure 3 contain one-eighth of the global population, they currently have just 4% of global emissions. But their aggregate projected emissions increase without outside assistance is 3.0 GtCO2e, on top of 2.1 GtCO2e in 2015. Combined with the 7.4 GtCO2e estimated increase for the G20 countries, it is difficult to see how the UNEP estimates an increase of just 3 GtCO2e (see Figure ES.3 on page xviii).

There appear to be no countries with a population of more than 40 million outside of the G20 who are promising to decrease their emissions. Tanzania, Colombia, Kenya and Algeria (combined population 190 million) are all projecting significant emissions increases, whilst Myanmar and Sudan have inadequate data to form an estimate. A quick check of 8 non-G20 countries with populations of 30-40 million gives the same result: either an increase in emissions or no data.

Implications for mitigation policy

In summary, of the 45 nations with a population above 30 million, just 10 have pledged to have lower emissions in 2030 than in 2015. The United States will likely achieve this objective as well. The other 34 nations will likely have higher emissions in 2030, most of them significantly higher. The 11 emissions-reducing nations have a population of 1.1 billion, against 5.3 billion in the 34 other nations and 1.15 billion in nations or territories with populations below 30 million. In terms of emissions, barring economic disaster, I estimate it is likely that countries with in excess of 60% of 2017 global emissions will have higher emissions in 2030 than in 2015.

To put this in context, the Emissions Gap report states on page xv

According to the current policy and NDC scenarios, global emissions are not estimated to peak by 2030.

My analysis confirms this. The Report further states

Total annual greenhouse gases emissions, including from land-use change, reached a record high of 53.5 GtCO2e in 2017, an increase of 0.7 GtCO2e compared with 2016. 
In contrast, global GHG emissions in 2030 need to be approximately 25 percent and 55 percent lower than in 2017 to put the world on a least-cost pathway to limiting global warming to 2°C and 1.5°C respectively.

After over 20 years of annual meetings to achieve global reductions in emissions, there is still no chance of that happening. In the light of this failure UNEP appear to have fudged the figures. Part of this is justified, as many developing countries appear to have put through unjustifiable BAU scenarios, then claimed “climate actions” that will bring the projection more into line with what would be a non-policy forecast. COP 24 at Katowice will just be another platform for radical environmentalists to denigrate capitalist nations for being successful, and for a collective finger-wagging at the United States. 

The next part will look at the coded language of the Emissions Gap Report 2018 that effectively admits the 2°C and 1.5°C ambitions are impossible.

Kevin Marshall

 

UNEP Emissions Gap Report 2018 Part 1 – The BBC Response

Over the past year I have mentioned the UNEP Emissions Gap Report 2017 a number of times. The successor, the 2018 EGR (ninth in the series), has now been published. This is the first in a series of short posts looking at issues with the report. First up is an issue with the reporting by the BBC.
On the 27th Matt McGrath posted an article Climate change: CO2 emissions rising for first time in four years.
The sub-heading gave the real thrust of the article.

Global efforts to tackle climate change are way off track says the UN, as it details the first rise in CO2 emissions in four years.

Much of the rest of the article gives a fair view of EGR18.  But there is a misleading figure. Under “No peaking?” the article has a figure titled

Number of countries that have pledged to cap emissions by decade and percentage of emissions covered.

In the report Figure 2.1 states

Number of countries that have peaked or are committed to peaking their emissions, by decade (aggregate) and percentage of global emissions covered (aggregate).

The shortened BBC caption fails to recognize that in the past countries peaked their emissions unintentionally.  In looking at Climate Interactive's bogus emissions projections at the end of 2015, I found that, collectively, the current EU28 countries peaked their emissions in 1980. In the USA emissions per capita peaked in 1973; any increases since then have been less than the rise in population. Yet Climate Interactive's RCP8.5, non-policy, projection apportionment by country assumed that 

(a) Emissions per capita would start to increase again in the EU and USA after falling for decades

(b) In China and Russia emissions per capita would increase for decades to levels many times that of any country.

(c) In India and African countries emissions per capita would hardly change through to 2100, on the back of stalled economic growth. For India, the projected drop in economic growth was so severe that, to achieve the projection as at Dec 30th 2015, the Indian economy would have needed to shrink by over 20% before Jan 1st 2016. 

Revising the CO2 emissions projections (about 75% of the GHG emissions EGR18 refers to) would have largely explained the difference between the resultant 4.5°C of warming in 2100 from the BAU scenario of all GHG emissions and the 3.5°C consequential on the INDC submissions. I produced a short summary of more reasonable projections in December 2015.

Note that EGR18 now states the fully implemented INDC submissions will achieve 3.2°C of warming in 2100 instead of 3.5°C that CI was claiming three years ago.

The distinction between outcomes consequential on economic activity and those resultant from the deliberate design of policy is important if one wants to distinguish between commitments that inflict economic pain on their citizens (e.g. the UK) and commitments that are almost entirely diplomatic hot air (the vast majority). The BBC fails to make the distinction historically and in the future, whilst EGR18 merely fails with reference to the future.  

The conclusion is that the BBC should correct its misreporting, and the UN should start distinguishing between hot air and substantive policy that could cut emissions. But that would mean recognizing climate mitigation is not just useless, but net harmful to every nation that enacts policy that will make deep cuts in actual emissions.

Kevin Marshall

Two Contrasting Protests on Climate Change

Yesterday marked two protests related to climate change. One in central London by a group of climate extremists baying for more stringent climate policies. The other right across France demanding the removal of a small additional tax on fuel.

The Climate Extremists

Yesterday a group calling themselves !EXTINCTION REBELLION! had a series of protests around London, including blocking off five major bridges. They have a long history, having been founded almost three weeks ago on Halloween. Their aims are quite clear from a mercifully short video.

It is based on “science“.

The Crux

Even without the other ecological drivers of mass species extinction, natural resource exhaustion and growing human population pressure, human-caused (anthropogenic) climate breakdown alone is enough to wipe out the human species by the end of this century, if governments do not immediately begin to reverse their extractivism and ‘growth’-based economic policies.

This is why the Extinction Rebellion campaign has at its core a group of activists who are prepared to go to prison for peaceful civil disobedience, to get the necessary media coverage for long enough to leverage the government and the public into war-level mobilisation mode.

When you repeatedly come across the figure of 2 degrees i.e. limiting global warming to 2 degrees, think of what happens to a human body when it experiences a temperature increase of more than 2 degrees.

The recent IPCC SR1.5 was the product of two and a half years of trying to come up with scary stories to frighten governments into action. Here are two examples of the scary headlines from the SPM.

Temperature extremes on land are projected to warm more than GMST (high confidence): extreme hot days in mid-latitudes warm by up to about 3°C at global warming of 1.5°C and about 4°C at 2°C, and extreme cold nights in high latitudes warm by up to about 4.5°C at 1.5°C and about 6°C at 2°C (high confidence). The number of hot days is projected to increase in most land regions, with highest increases in the tropics (high confidence).

By 2100, global mean sea level rise is projected to be around 0.1 metre lower with global warming of 1.5°C compared to 2°C (medium confidence).

In Britain we will be wiped out by a few 20°C+ hot nights and extra sea level rise of four inches. Maybe we could listen to the 40% of the global population that lives in the tropics.

The “science” section includes this quote from Bill McKibben.

What those numbers mean is quite simple. This industry has announced…in promises to shareholders, that they are determined to burn five times more fossil fuel than the planet’s atmosphere can begin to absorb.

This is not science, but blinkered ideology. Why blinkered? Try going to the CDP Carbon Majors Report 2017 Appendix 1 – Cumulative emissions 1988-2015 %. Below are the top 10.

If the !XR! really believe in the climate apocalypse, shouldn’t they be protesting outside the Chinese, Russian, Iranian and Indian Embassies, and inciting rebellion in those countries? Or are they just climate totalitarians trying to wreck the well-being of the British people?

The Carbon Tax Revolt

On the same day in France there were massive nationwide protests after the Macron government raised its hydrocarbon tax this year by 7.6 cents per litre on diesel and 3.9 cents on petrol. This led to the formation of the gilets jaunes (yellow vests) movement, which has organised at least 630 protests nationwide. From the website blocage17novembre.com I grabbed a screenshot map of the protest locations.

These protests became far from peaceful, as frustrated drivers tried to push their way through the protesters. The BBC reports one person killed and 227 injured. The BBC also reports that the 200,000+ protesters are backed by about 75% of the French public.

Yet !EXTINCTION REBELLION! should be supporting the Macron government.

Lessons for the Climate Extremists

Protests in a single country will not work. Protests in many countries will not work either, as people have other priorities. Further, it is too late to convince countries to sign up to massive cuts in emissions. That opportunity was missed in 1992, when “developing” countries were exempted from any obligation to constrain their emissions. Those countries, with at least 80% of the global population and up to two-thirds of global emissions, have shown no inclination to change course. The protests in France show how even very small changes can lead to massive protests. In the UK fuel duty is not raised due to political unpopularity.

If such extremists still believe they are correct in their prophesies, and I am in denial, there are a number of strategies that they can legitimately use to evangelize.

  • Let ideas contrary to their own be evaluated on the same unemotive, level playing field as their own. In the past, on hearing reports of court cases of heinous crimes, I have been convinced more by the daft excuses of the defendant than by the prosecution's evidence.  Alternatively, the overturned terrorist convictions in the 1970s of the Guildford Four and the Birmingham Six undermined belief in the Rule of Law.  So too false climate alarmism undermines my belief in scientific evidence.
  • Rather than accept whatever “science” supports alarmism, seek to clarify the likelihood, type, extent, location and timing of the coming catastrophes. That way, people can better adapt to changing conditions. The problem here is that predictions of doom are most likely false prophesies.
  • Support and encourage Governments where they are encountering popular opposition. Why were !XR! not in France supporting President Macron? He not only supports the ban on fracking (France having maybe 80% of Europe's frackable gas reserves), but has also banned any fossil fuel extraction on French soil. After all, !XR! believe this is a WW2-type emergency. Winston Churchill swallowed his loathing of the Bolsheviks to extinguish the Nazi Empire. Is climate not important enough to seek allies and give them some encouragement in their time of need?

Climate alarmists will not accept what I say, as this would threaten their world views. They have plenty of others to fall back on for reassurance, but in reality they are just supporting policies that are net harmful.

Kevin Marshall

Australian Beer Prices set to Double Due to Global Warming?

Earlier this week Nature Plants published a new paper Decreases in global beer supply due to extreme drought and heat

The Scientific American has an article “Trouble Brewing? Climate Change Closes In on Beer Drinkers” with the sub-title “Increasing droughts and heat waves could have a devastating effect on barley stocks—and beer prices”. The Daily Mail headlines with “Worst news ever! Australian beer prices are set to DOUBLE because of global warming“. All those climate deniers in Australia have denied future generations the ability to down a few cold beers with their barbecued steaks tofu salads.

This research should be taken seriously, as it is by a crack team of experts across a number of disciplines and universities. Said Steven J Davis of the University of California at Irvine:

The world is facing many life-threatening impacts of climate change, so people having to spend a bit more to drink beer may seem trivial by comparison. But … not having a cool pint at the end of an increasingly common hot day just adds insult to injury.

Liking the odd beer or three, I am really concerned about this prospect, so I rented the paper for 48 hours to check it out. What a sensation it is. Here are a few impressions.

Layers of Models

From the Introduction, a series of models was used.

  1. Created an extreme events severity index for barley based on extremes in historical data for 1981-2010.
  2. Plugged this into five different Earth Systems models for the period 2010-2099, run against different RCP scenarios, the most extreme of which shows over 5 times the warming of the 1981-2010 period. What is more, severe climate events are a non-linear function of temperature rise.
  3. Then model the impact of these severe weather events on crop yields in 34 World Regions using a “process-based crop model”.
  4. (W)e examine the effects of the resulting barley supply shocks on the supply and price of beer in each region using a global general equilibrium model (Global Trade Analysis Project model, GTAP).
  5. Finally, we compare the impacts of extreme events with the impact of changes in mean climate and test the sensitivity of our results to key sources of uncertainty, including extreme events of different severities, technology and parameter settings in the economic model.
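
To illustrate how the layers stack, here is a deliberately toy sketch of such a modelling chain. None of the functions below are from the paper; the severity response, yield sensitivity and price elasticity are invented placeholders, purely to show how an assumption made at each layer propagates through to the final price.

```python
# Toy model chain (NOT the paper's models): warming -> severity -> yield -> price.
# Every coefficient below is a made-up placeholder for illustration only.
def severity_index(warming_c):
    # Hypothetical non-linear severity response: nothing below 1.5C, then quadratic
    excess = max(0.0, warming_c - 1.5)
    return excess ** 2

def yield_shock(severity):
    # Hypothetical: each unit of severity cuts barley yields by 5%
    return -0.05 * severity

def price_change(shock, elasticity=-0.3):
    # Hypothetical supply shock passed through via a constant price elasticity
    return shock / elasticity

for w in (1.0, 2.0, 3.8):
    s = severity_index(w)
    y = yield_shock(s)
    p = price_change(y)
    print(f"warming {w}C -> severity {s:.2f}, yield {y:+.1%}, price {p:+.1%}")
```

The point of the sketch is that the final price number is wholly a creature of the parameters chosen at each layer, none of which can be validated against observed data at the warming levels modelled.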

What I found odd was that they made no allowance for increasing demand for beer over the 90-year period, despite mentioning in the second sentence that

(G)lobal demand for resource-intensive animal products (meat and dairy) processed foods and alcoholic beverages will continue to grow with rising incomes.

Extreme events – severity and frequency

As stated in point 2, the paper uses different RCP scenarios. These featured prominently in the IPCC AR5 of 2013 and 2014. They go from RCP2.6, the most aggressive mitigation scenario, through to RCP8.5, the non-policy scenario, which projected around 4.5°C of warming from 1850-1870 through to 2100, or about 3.8°C of warming from 2010 to 2090.

Figure 1 has two charts. On the left it shows that extreme events will increase in intensity with temperature. RCP2.6 will do very little, but RCP8.5 would result by the end of the century in events 6 times as intense as today. The problem is that for up to 1.5°C of warming there appears to be no noticeable change whatsoever; that is about the amount of warming the world has experienced from 1850 to 2010 per HADCRUT4. Beyond that, things take off. How the models empirically project well beyond known experience, for a completely different scenario, defeats me. It could be largely based on their modelling assumptions, which are in turn strongly tainted by their beliefs in CAGW. There is no reality check that their models are not falling apart, or reliant on arbitrary non-linear parameters.

The right-hand chart shows that extreme events are projected to increase in frequency as well: under RCP2.6 there is a ~4% chance of an extreme event, rising to ~31% under RCP8.5. Again, there is the issue of projecting well beyond any known range.

Fig 2 average barley yield shocks during extreme events

The paper assumes that the current geographical distribution and area of barley cultivation is maintained. From the 1981-2010 data they have modelled average yield changes in 2099 on a 0.5° x 0.5° grid, creating four colorful world maps, one for each of the four RCP emissions scenarios. At the equator, each grid cell is about 56 x 56 km, an area of 3,100 km2 or 1,200 square miles; nearer the poles the area diminishes significantly. This is quite a fine level of detail for projections based on 30 years of data applied to radically different circumstances 90 years in the future. Map a) is for RCP8.5, where on average yields are projected to be 17% down. As Paul Homewood showed in a post on the 17th, this projected yield fall should be put in the context of a doubling of yields per hectare since the 1960s.
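
The quoted grid-cell area can be checked with a simple spherical-Earth approximation. The 111.32 km per degree of latitude is a standard approximation, not a figure from the paper:

```python
import math

# Approximate area of a 0.5 x 0.5 degree grid cell at a given latitude,
# using ~111.32 km per degree (spherical-Earth approximation).
KM_PER_DEG = 111.32

def cell_area_km2(lat_deg, res_deg=0.5):
    ns = KM_PER_DEG * res_deg                                     # north-south extent, km
    ew = KM_PER_DEG * res_deg * math.cos(math.radians(lat_deg))   # east-west shrinks with latitude
    return ns * ew

print(f"equator: {cell_area_km2(0):.0f} km2")   # ~3,100 km2, as in the text
print(f"60N:     {cell_area_km2(60):.0f} km2")  # roughly half the equatorial area
```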

This increase in productivity has often been ascribed solely to improvements in seed varieties (see Norman Borlaug), mechanization and the use of fertilizers. These have undoubtedly had a large part to play. But it is also important that agriculture has become more intensive. Forty years ago there was a clear distinction between the intensive farming of Western Europe and the extensive farming of the North American prairies and the Russian steppes, and it was not due to better soils or climate in Western Europe. The difference can be staggering. In the Soviet Union about 30% of agricultural output came from around 1% of the available land: the plots on which workers on the state and collective farms could produce their own food and sell the surplus in the local markets.

Looking at chart a in Figure 2, there are wide variations about this average global decrease of 17%.

In North America, Montana and North Dakota have areas where barley shocks during extreme years will lead to mean yield changes over 90% higher than normal, with the surrounding areas >50% higher than normal. But go less than 1,000 km north into Canada, to the Calgary/Saskatoon area, and there are small decreases in yields.

In Eastern Bolivia – the part due North of Paraguay – there is the biggest patch of > 50% reductions in the world. Yet 500-1000 km away there is a North-South strip (probably just 56km wide) with less than a 5% change.

There is a similar picture in Russia. On the Kazakhstani border there are areas of >50% increases, but in a thinly populated band further north and west, running from around Kirov to southern Finland, there are massive decreases in yields.

Why, over the course of decades, those with increasing yields would not increase output, and those with decreasing yields would not switch to something else, defeats me. After all, if overall yields are decreasing due to frequent extreme weather events, most farmers would be losing money, and those farmers who do well when overall yields are down would be making extraordinary profits.

A Weird Economic Assumption

Building up to looking at costs, there is a strange assumption.

(A)nalysing the relative changes in shares of barley use, we find that in most case barley-to-beer shares shrink more than barley-to-livestock shares, showing that food commodities (in this case, animals fed on barley) will be prioritized over luxuries such as beer during extreme events years.

My knowledge of farming and beer is limited, but I believe that cattle can be fed on things other than barley, for instance grass, silage and sugar beet. Yet beers require precise quantities of barley and hops of certain grades.

Further, cattle feed is a large part of the cost of a kilo of beef or a litre of milk, but it takes only around 250-400g of malted barley to produce a litre of beer. The current wholesale price of malted barley is about £215 a tonne, or 5.4 to 8.6p a litre. About the cheapest 4% alcohol lager I can find in a local supermarket is £3.29 for 10 x 250ml bottles, or £1.32 a litre. Taking off 20% VAT and excise duty leaves about 30p a litre for raw materials, manufacturing costs, packaging, manufacturer's margin, transportation, supermarket's overheads and supermarket's margin. For comparison, four pints (2.276 litres) of fresh milk costs £1.09 in the same supermarket, working out at 48p a litre. This carries no excise duty or VAT. It might have greater costs due to refrigeration, but I would suggest it costs more to produce, and that feed is far more than 5p a litre.
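
A quick sketch of the cost arithmetic above, using the prices and grain quantities quoted in the text:

```python
# Back-of-envelope check of the barley-cost figures quoted in the text.
barley_price_per_kg = 215 / 1000          # £215 a tonne, as quoted
grain_low, grain_high = 0.250, 0.400      # kg of malted barley per litre of beer

cost_low = grain_low * barley_price_per_kg    # cheapest case, £/litre
cost_high = grain_high * barley_price_per_kg  # dearest case, £/litre

retail_per_litre = 3.29 / 2.5             # 10 x 250 ml bottles at £3.29

print(f"barley cost: {cost_low*100:.1f}p to {cost_high*100:.1f}p per litre")
print(f"cheapest lager: {retail_per_litre*100:.0f}p per litre retail")
print(f"barley share of retail price: at most {cost_high/retail_per_litre:.0%}")
```

Even at the dearest end, barley is well under a tenth of the retail price of the cheapest lager, which is why a barley price shock cannot plausibly double the price of beer.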

I know that a reasonable 0.5 litre bottle of ale is £1.29 to £1.80 in the supermarkets I shop in, and it is the cheapest beers that will likely suffer the biggest percentage rise from an increase in raw material prices. Due to taxation and other costs, large changes in raw material prices will have very little impact on final retail prices. Even less so in pubs, where a British pint (568ml) costs the equivalent of £4 to £7 a litre.

That is, the assumption is the opposite of what would happen in a free market. In the face of a shortage, farmers will substitute other forms of cattle feed for barley, whilst beer manufacturers will absorb the extra cost.

Disparity in Costs between Countries

The most bizarre claim in the article is contained in the central column of Figure 4, which looks at the projected increases in the cost of a 500 ml bottle of beer in US dollars. Chart h shows this for the most extreme RCP 8.5 model.

I was very surprised that a global general equilibrium model would come up with such huge disparities in costs after 90 years. After all, my understanding is that these models use utility-maximizing consumers, profit-maximizing producers, perfect information and instantaneous adjustment. Clearly there is something very wrong with this model. So I decided to compare where I live in the UK with neighbouring Ireland.

In the UK and Ireland there are similar high taxes on beer, with Ireland's being slightly higher. Both countries have lots of branches of the massive discount chain Aldi, which lists some products on its websites aldi.co.uk and aldi.ie.  In Ireland a 500 ml can of Sainte Etienne Lager is €1.09, or €2.18 a litre, equivalent to £1.92 a litre. In the UK it is £2.59 for 4 x 440ml cans, or £1.59 a litre. The lager is about 21% more in Ireland, but the tax difference should only be about 15% on a 5% beer (Sainte Etienne is 4.8%). Aldi are not making bigger profits in Ireland; they may just have higher costs there, or lesser margins on other items, and it is also comparing a single can against a multipack. So pro-rata the £1.80 ($2.35) bottle of beer in the UK would be about $2.70 in Ireland. Under the RCP 8.5 scenario, the models predict the bottle of beer to rise by $1.90 in the UK and $4.84 in Ireland. Strip out the excise duty and VAT and the price differential goes from zero to $2.20.
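
The percentage comparison can be reproduced from the per-litre figures quoted above. The euro-to-pound rate used is the one implied by the text's own conversion:

```python
# Reproducing the Aldi price comparison from the figures quoted in the text.
ie_per_litre_eur = 1.09 / 0.5           # €1.09 for 500 ml -> €2.18 a litre
eur_to_gbp = 1.92 / 2.18                # rate implied by the text's conversion (~0.88)
ie_per_litre_gbp = ie_per_litre_eur * eur_to_gbp

uk_per_litre_gbp = 1.59                 # per-litre figure quoted for the UK multipack

premium = ie_per_litre_gbp / uk_per_litre_gbp - 1
print(f"Ireland: £{ie_per_litre_gbp:.2f}/litre, UK: £{uk_per_litre_gbp:.2f}/litre, "
      f"Irish premium {premium:.0%}")
```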

Now suppose you were a small beer manufacturer in England, Wales or Scotland. If beer was selling for $2.20 more in Ireland than in the UK, would you not want to stick 20,000 bottles in a container and ship it to Dublin?

If the researchers really understood the global brewing industry, they would realize that there are major brands sold across the world. Many are brewed in a number of countries to the same recipe. It is the barley that is shipped to the brewery, where equipment and techniques are identical to those in other parts of the world. These researchers seem to have failed to get away from their computer models to conduct field work in a few local bars.

What can be learnt from this?

When making projections well outside of any known range, the results must be sense-checked. Clearly, although the researchers have used an economic model, they have not understood the basics of economics. People are not dumb automatons waiting for some official to tell them to change their patterns of behavior in response to changing circumstances. They notice changes in the world around them and respond; a few seize the opportunities presented and can become quite wealthy as a result. Farmers have been astute enough to note mounting losses and change how and what they produce. There is also competition between regions. For example, in the 1960s Brazil produced over half the world's coffee. The major region for production was centered around Londrina in the north of Paraná state. Despite straddling the Tropic of Capricorn, every few years there would be a spring-time frost which would destroy most of the crop. By the 1990s most of the production had moved north to Minas Gerais, well out of the frost belt. The rich fertile soils around Londrina are now used for other crops, such as soya, cassava and mangoes. The movement did not occur by human design, but simply because the farmers in Minas Gerais could make bumper profits in the frost years.

The publication of this article shows a problem with peer review. Nature Plants is basically a biology journal. Reviewers are unlikely to have specialist skills in climate models or economic theory, though those selected should have experience in agricultural models. If peer review is literally that, it will fail anyway in an inter-disciplinary subject, where the participants do not have a general grounding in all the disciplines. In this paper it is not just economics, but knowledge of product costing as well. It is academics expert in each of the specialisms that are required for review, not inter-disciplinary peers.

Kevin Marshall

 

IPCC SR1.5 – Notes on Calculations and Assumptions

Given that my previous post was about failing to reconcile the emissions estimates for 1.5°C and 2.0°C of warming in the IPCC fifth assessment report (AR5), I was intrigued to see how the new IPCC “special report on the impacts of global warming of 1.5°C above pre-industrial levels” would fare. However, that will have to wait for another post, as first there are some “refinements” from AR5 in how results are obtained. From my analysis it would appear that key figures on temperatures and climate sensitivities are highly contrived.

Isn’t 1.5°C of warming already built in? 

Chapter 1 Page 24

Expert judgement based on the available evidence (including model simulations, radiative forcing and climate sensitivity) suggests that if all anthropogenic emissions were reduced to zero immediately, any further warming beyond the 1°C already experienced would likely be less than 0.5°C over the next two to three decades, and also likely less than 0.5°C on a century timescale.

This basically states that if all emissions were stopped now, there is more than a 50% chance that warming would never exceed 1.5°C. But using previous assumptions, 1.5°C should already be built in. 

If ECS = 3.0 (as in AR5) then that implies the net effect of all GHGs and all aerosols is less than 396 ppm CO2-eq, despite CO2 on its own in September 2018 being 405.5 ppm (1.6°C of eventual warming). Further, in 2011 the impact of all GHGs combined was equivalent to 430 ppm, or an extra 40 ppm more than CO2 on its own. On that basis we are now at around 445 ppm, or fractionally above the 2.0°C warming level. However, in AR5 it was assumed (based on vague estimates) that the negative human impacts of aerosols exactly offset the addition of other GHGs (e.g. methane), so that only CO2 is considered. Even then, based on ECS = 3.0, without further emissions 1.5°C will eventually be reached.
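
These eventual-warming figures follow from the standard logarithmic forcing relation assumed throughout this post, warming = ECS x log2(C/C0), with C0 = 280 ppm pre-industrial. A minimal sketch:

```python
import math

# Eventual (equilibrium) warming from a CO2-eq concentration, using the
# standard logarithmic relation: warming = ECS * log2(C / C0), C0 = 280 ppm.
def eventual_warming(c_ppm, ecs, c0=280.0):
    return ecs * math.log2(c_ppm / c0)

print(f"ECS 3.0, 405.5 ppm CO2 alone: {eventual_warming(405.5, 3.0):.2f}C")  # ~1.60C
print(f"ECS 3.0, 445 ppm CO2-eq:      {eventual_warming(445.0, 3.0):.2f}C")  # just above 2.0C
```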

But ECS has been lowered.

From Chapter 1 Annex Page 11

…Equilibrium Climate Sensitivity (ECS) of 2.7°C and Transient Climate Response (TCR) of 1.6°C and other parameters as given in Millar et al. (2017).

This raises the CO2-eq level needed to achieve 1.5°C of warming by 15-16 ppm from 396 ppm, and the CO2-eq level needed to achieve 2.0°C by 23-24 ppm from 444 ppm. Mauna Loa CO2 levels in September averaged 405.5 ppm. With ECS = 2.7 this is equivalent to just 1.44°C of eventual warming, compared to 1.60°C when ECS = 3.0. What is more significant is that if ECS were 2.8, eventual warming of 1.50°C would be in the pipeline sometime before the end of the year. ECS = 2.7 is the highest ECS that is currently compatible with the statement made above if CO2 alone is taken into account. Consider this in the light of the 2013 AR5 WG1 SPM, which stated on page 16
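
Inverting the same logarithmic relation gives the CO2-eq level at which a warming target is eventually reached, which reproduces the thresholds quoted above:

```python
import math

# CO2-eq concentration at which a given eventual warming target is reached,
# inverting warming = ECS * log2(C / C0): C = C0 * 2**(target / ECS).
def threshold_ppm(target_c, ecs, c0=280.0):
    return c0 * 2 ** (target_c / ecs)

for ecs in (3.0, 2.7):
    print(f"ECS {ecs}: 1.5C at {threshold_ppm(1.5, ecs):.0f} ppm, "
          f"2.0C at {threshold_ppm(2.0, ecs):.0f} ppm")
```

With ECS = 3.0 the thresholds are ~396 and ~444 ppm; lowering ECS to 2.7 moves them to ~412 and ~468 ppm, which is exactly the 15-16 ppm and 23-24 ppm of headroom described above.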

Equilibrium climate sensitivity is likely in the range 1.5°C to 4.5°C

And in a footnote on the same page.

No best estimate for equilibrium climate sensitivity can now be given because of a lack of agreement on values across assessed lines of evidence and studies.

In AR5 they chose ECS = 3.0 as it was in the middle of the range, a range unchanged since the Charney Report of 1979. I am not aware of any research that establishes ECS in a range that would justify ECS = 2.7 and is not contradicted by other research. For instance, Lewis and Curry 2018 gives a median estimate for ECS of 1.66.

Transient Climate Response (TCR)

But how does the Transient Climate Response (TCR) of 1.6°C fit into this? Some context can be had from the very start of the Summary for Policy-Makers SPM-4

A1.1. Reflecting the long-term warming trend since pre-industrial times, observed global mean surface temperature (GMST) for the decade 2006–2015 was 0.87°C (likely between 0.75°C and 0.99°C)

With TCR = 1.6°C for a doubling of CO2 levels, what is the warming generated from a rise in CO2 levels from 280 to 400.83 ppm, that is, from pre-industrial times to the average level in 2015? I calculate it to be 0.83°C. Make TCR = 1.7°C and that increases to 0.88°C. This effectively assumes both that 100% of the rise in average temperatures over more than 150 years is due to CO2 alone (consistent with AR5), and that there has been no movement whatsoever from the short-term Transient Climate Response towards the long-term Equilibrium Climate Sensitivity. However, if TCR is a variable figure derived by calculation from the revealed warming and CO2 rise, it becomes meaningless nonsense unless you can clearly demonstrate the other assumptions are robust. That is: (1) 100% of past warming was due to human emissions; (2) the impact of GHGs other than CO2 is effectively cancelled out by aerosols etc.; (3) natural factors are net zero; (4) the average temperature anomaly data is without any systematic biases. For instance, when measured CO2 levels were about 390 ppm, the AR5 WG3 SPM stated in the last sentence on page 8

For comparison, the CO2-eq concentration in 2011 is estimated to be 430 ppm (uncertainty range 340 to 520 ppm)

It seems a pretty shaky foundation for the assumption that the negative impact of aerosols (with such wide uncertainties) will offset the combined impact of the other GHG increases.
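
The TCR arithmetic a few paragraphs above can be verified with the same logarithmic relation, applied to the transient rather than the equilibrium response:

```python
import math

# Transient warming attributed to CO2 alone, from 280 ppm pre-industrial
# to the 2015 average of 400.83 ppm: warming = TCR * log2(C / C0).
def transient_warming(tcr, c_ppm, c0=280.0):
    return tcr * math.log2(c_ppm / c0)

print(f"TCR 1.6: {transient_warming(1.6, 400.83):.2f}C")  # ~0.83C
print(f"TCR 1.7: {transient_warming(1.7, 400.83):.2f}C")  # ~0.88C
```

Both figures land in the middle of the SPM's quoted 0.75-0.99°C observed range, which is what makes the TCR value look reverse-engineered from the temperature record rather than independently established.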

Summary and Concluding Comments

On the estimates of climate sensitivity, the value appears to be set so that the IPCC can still claim that if emissions stopped tomorrow, there would be a greater than 50% chance of 1.5°C of warming never being exceeded. The ECS value of 2.7°C is the maximum value compatible with that claim, given the assumptions. But, ceteris paribus, this will not hold if

  • One waits 3 years and CO2 levels continue increasing at the rate of the last few years.
  • ECS is slightly higher, but still well within the accepted range of estimates. Indeed if ECS = 3.0, as in AR5 and AR4 in 2007, then 1.5°C of warming was exceeded 5 years ago.
  • The impact of all GHGs together is slightly more than the offsetting impact of aerosols.
  • 0.06°C, or more, of the observed rise in temperature since 1850 is not due to GHG emissions.

Then there is the Transient Climate Response (TCR), which appears to be little more than taking the historical temperature change, assuming all of it is down to human GHG emissions, and calculating a figure. Including rises in CO2 from a century or more ago is hardly transient.

Based on my calculations, the results are highly contrived. They appear to be a very fine balance between claiming the maximum possible values for human-caused warming and not admitting that 1.5°C, or even 2°C, is already passed. There is a huge range of empirical assumptions, equally as valid as the ones used in SR1.5, that go one way or the other. Rather than being a robust case, it is empirically a highly improbable one.

Finally, there is a conundrum here. I have calculated that if ECS = 2.7 and the starting level of CO2 is 280 ppm then, in round numbers, 1.5°C of warming results from CO2 levels of 412 ppm and 2.0°C of warming results from CO2 levels of 468 ppm. With CO2 levels in September 2018 at 406 ppm, reaching 2.0°C of warming requires a rise in CO2 ten times greater than that for 1.5°C of warming. So how can the IPCC claim that it requires only about twice the amount of emissions? In my previous post I could not find an explanation, even though the emissions numbers reconciled with both past data and the future emissions needed to generate 2.0°C of warming, given certain assumptions. In the next post I hope to provide an answer, which fits the figures quite closely, but looks pretty embarrassing.
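
A minimal check of this conundrum's arithmetic, using the same logarithmic relation as before:

```python
import math

# With ECS = 2.7, C0 = 280 ppm and CO2 currently at 406 ppm, compare the
# further CO2 rise needed for 1.5C against that needed for 2.0C of warming.
def threshold_ppm(target_c, ecs=2.7, c0=280.0):
    return c0 * 2 ** (target_c / ecs)

current = 406.0
rise_15 = round(threshold_ppm(1.5)) - current   # rise to the 1.5C threshold (~412 ppm)
rise_20 = round(threshold_ppm(2.0)) - current   # rise to the 2.0C threshold (~468 ppm)

print(f"to 1.5C: +{rise_15:.0f} ppm; to 2.0C: +{rise_20:.0f} ppm; "
      f"ratio ~{rise_20 / rise_15:.0f}x")
```

The logarithmic relation means the last few ppm before a threshold count just as much as the first, so a target only 6 ppm away takes a tenth of the CO2 rise of one 62 ppm away, despite being three-quarters of the warming.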

Kevin Marshall