Hansen’s 1988 Scenario C against Transient Climate Response in IPCC TAR 2001

In a recent comment at Cliscep Jit made the following request

I’ve been considering compiling some killer graphs. A picture paints a thousand words, etc, and in these days of short attention spans, that could be useful. I wanted perhaps ten graphs illustrating “denialist talking points” which, set in a package, would be to the unwary alarmist like being struck by a wet fish. Necessarily they would have to be based on unimpeachable data.

One of the most famous graphs in climate is of the three scenarios used in the Congressional Testimony of Dr James Hansen, June 23 1988. Copies are poor, being reproductions of a type-written manuscript. The following is from the SeaLevel.info website.

Fig 3 of Hansen’s Congressional Testimony, June 23 1988

The reason for choosing this version rather than the clearer version in the paper online is that the blurb contains the assumptions behind the scenarios. In particular, “scenario C drastically reduces trace gas emissions between 1990 and 2000.” The original article states

scenario C drastically reduces trace gas emissions between 1990 and 2000 such that greenhouse forcing ceases to increase after 2000.

In current parlance this is net zero. In the graph this results in temperature peaking about 2007.

In the IPCC Third Assessment Report (TAR) 2001 there is the concept of Transient Climate Response.

TAR WG1 Figure 9.1: Global mean temperature change for 1%/yr CO2 increase with subsequent stabilisation at 2xCO2 and 4xCO2. The red curves are from a coupled AOGCM simulation (GFDL_R15_a) while the green curves are from a simple illustrative model with no exchange of energy with the deep ocean. The transient climate response, TCR, is the temperature change at the time of CO2 doubling and the equilibrium climate sensitivity, T2x, is the temperature change after the system has reached a new equilibrium for doubled CO2, i.e., after the additional warming commitment has been realised.

Thus, conditional on CO2 rising at 1% a year and the eventual warming from a doubling of CO2 being around 3C, at the point when doubling is reached temperatures will have risen by about 2C. From the Mauna Loa data, annual average CO2 levels rose from 316 ppm in 1959 to 414 ppm in 2020. That is 31% in 61 years, or less than 0.5% a year. Assuming 3C of eventual warming from a CO2 doubling, and given the long time period of the transient climate response, it follows that

  • much less than 1C of warming could so far have resulted from the rise in CO2 since 1959;
  • warming could continue for decades after net zero is achieved;
  • the rates of natural absorption of CO2 from the atmosphere are of huge significance;
  • calculation of climate sensitivity, even with many decades of CO2 and temperature data, is near impossible unless constraining assumptions are made about the contribution of natural factors; the rate of absorption of CO2 from the atmosphere; outgassing or absorption of CO2 by the oceans; & the time lag between actual rates of CO2 increase and the resulting rise in temperatures;
  • that is, changes in a huge number of variables, all within the range of acceptable mainstream beliefs, significantly impact the estimates of emissions pathways required to constrain warming to 1.5C or 2C;
  • if James Hansen in 1988 was not demonstrably wrong about the response time of the climate system, and neither is the TAR on the transient climate response, then it may not be possible to exclude either the possibility that 1.5C of warming will not be reached this century, or that 2C of warming will be surpassed even if global net zero emissions were achieved a week from now.
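The arithmetic behind the first bullet can be sketched in a few lines of Python, using the sensitivity values quoted above (TCR of 2C and equilibrium warming of 3C per doubling) and the standard assumption that forcing is logarithmic in CO2 concentration. On these assumptions the realised warming from CO2 alone should lie somewhere between the transient and equilibrium figures:

```python
import math

# Mauna Loa annual mean CO2 (ppm), as quoted above
co2_1959, co2_2020 = 316.0, 414.0
years = 2020 - 1959

# Compound annual growth rate: well under the 1%/yr of the TCR experiment
growth = (co2_2020 / co2_1959) ** (1 / years) - 1

# Fraction of a doubling achieved, assuming logarithmic forcing
doublings = math.log(co2_2020 / co2_1959) / math.log(2)

transient_warming = 2.0 * doublings    # TCR-scaled response
equilibrium_warming = 3.0 * doublings  # after the oceans catch up

print(f"CO2 growth rate: {growth:.2%} per year")
print(f"Transient warming: {transient_warming:.2f}C")
print(f"Equilibrium warming: {equilibrium_warming:.2f}C")
```

This gives a growth rate of about 0.44% a year, transient warming of about 0.78C and equilibrium warming of about 1.17C, consistent with the “much less than 1C” point above.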

Kevin Marshall

Excess and Covid-19 death rates

Last month the Daily Mail had an article on excess deaths against Covid deaths over 12 months for 30 countries. This was based on a more detailed article in the Economist. What I found surprising was that the countries making the headlines here in the UK for Covid deaths – UK, USA & Brazil – were well down the list in terms of excess deaths per 100,000 population. Since then the Economist has extended the data set to include 78 countries and the cities of Istanbul and Jakarta. The time period also varies, from around 180 to 400 days, though mostly it is about a year.

Given this limitation, there are a number of observations that can be made.

  • Overall the 78 countries account for well under half the global population. Notable absences from the Economist data set are China, India, Indonesia (except Jakarta), Pakistan, Bangladesh and Nigeria.
  • Total excess deaths are around 50% higher than reported Covid deaths overall. That is 3.64 as against 2.43 million.
  • Excess deaths are slightly negative in a small number of countries, most notably Japan, South Korea, Taiwan, Malaysia and the Philippines. Is this a cultural issue or a policy issue?
  • The worst country for excess deaths is Peru with 503 deaths per 100,000 for the period Mar 30th 2020-May 2nd 2021. Even allowing for the longer period, Peru is well above any other country. Covid deaths at 62,110 are just 38% of the excess deaths.
  • Next on the list with excess deaths per 100,000 and covid deaths as a percentage of excess deaths in brackets are Bulgaria (433, 53%), Mexico (354, 45%), Russia (338, 20%), Serbia (320, 24%), Lithuania (319, 43%), Ecuador (319, 34%), North Macedonia (304, 50%), Czechia (300, 81%) and Slovakia (270, 64%). Britain and USA, for comparison, are respectively 26th (180, 126%) and 25th (182, 93%).
  • All countries in the top 10 are either in Central / South America or Eastern Europe. Of the top 20, only South Africa (14th) and Portugal (20th) are outside these areas.
  • If countries are separated in the excess death rankings by geography, maybe comparisons should be made between similar countries? In Western Europe the five large countries are Italy in 23rd (197, 74%), Britain in 26th (180, 126%), Spain in 28th (170, 100%), France in 37th (126, 125%) and Germany in 57th (63, 155%). Why should Germany be so much lower on excess and Covid deaths? Might it be that the Germans lead in following instructions, whilst the Italians & Spanish ignore them and the British tend more to think rules apply to people in general, but with many worthy exceptions for themselves and their immediate peers?
  • Peru not only has the highest excess death rate in the world, but some of the most draconian anti-covid policies. Could it be that some of the high excess deaths are the result of the policies? In Brazil, where lockdown policies are determined at state level, in some areas people are both deprived of a means to earn a living and get no assistance from the state.
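The scale of under-reporting implied by the bracketed pairs above can be computed directly: the ratio of excess to reported Covid deaths is just the reciprocal of the quoted percentage. A short sketch using a handful of the countries listed:

```python
# (excess deaths per 100k, reported Covid deaths as % of excess),
# figures as quoted in the list above
countries = {
    "Peru": (503, 38), "Bulgaria": (433, 53), "Mexico": (354, 45),
    "Russia": (338, 20), "Britain": (180, 126), "Germany": (63, 155),
}

# Ratio of excess to reported deaths: above 1 suggests under-reporting,
# below 1 that reported Covid deaths exceed the measured excess
undercount = {name: 100 / pct for name, (_, pct) in countries.items()}

for name, ratio in sorted(undercount.items(), key=lambda kv: -kv[1]):
    print(f"{name:8s} excess/reported = {ratio:.1f}")
```

On these figures Russia’s excess deaths are five times its reported Covid deaths, whilst Britain and Germany reported more Covid deaths than the measured excess.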

There are many ways to look at the data. The Economist excess and covid deaths data gives far more insights than just crude death totals. Superficially it would suggest that the problem areas are not, as early last year, in Western Europe, but in Eastern Europe & South America. With the lowest death rates in the Far East, globally there are huge disparities that cannot be explained by differences in policy responses. It is more likely that cultural factors play the greater role, although it is perfectly understandable why policy-makers would pooh-pooh what is strongly suggested by the data. Moreover, with a lack of data from much of the world, and likely under-reporting of Covid deaths in many countries, the true scale of the pandemic is likely vastly understated.

Kevin Marshall

Imperial self-congratulations on saving 3 million lives

Late last night there was an article at the BBC that I cannot find anymore on the BBC website. Fortunately my history still has the link.

The article linked to a pre-publication article in Nature from the team at Imperial College – Estimating the effects of non-pharmaceutical interventions on COVID-19 in Europe

On pdf page 16 of 18 are modelled estimates that by 4th May lockdowns across 11 countries had saved 2.8 – 3.1 million lives.
On pdf page 6 (article page 5) four countries are listed in Fig 1, the last being the UK.

The graphics show for all countries that the day of the lockdowns saw a massive step reduction in daily new infections & the magic R. For the UK, modelled daily infections spiked at over 400,000 on the day of the lockdown and were less than 100,000 the day after. R fell from around 4 at lockdown to 1 the day after. Prime Minister Boris Johnson gave a very powerful and sincere speech ordering the lockdown, but he did not realize how transformative the power of his words would be. It was the same in all the other countries surveyed. Further, all countries surveyed had a massive spike in infections in the few days leading up to lockdown being imposed. If only the wisdom of the scientists had been listened to a few days earlier – at slightly different dates in each country – then thousands more lives could have been saved.
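Part of the explanation for these step changes is structural rather than empirical: in the model, R is only allowed to change on the dates of interventions, so the fitted infection curve has to peak at lockdown whatever the deaths data show. A toy renewal-style sketch (illustrative parameters, not the paper’s fitted values) reproduces the shape:

```python
# Toy model: daily infections grow by a factor of R^(1/serial_interval).
# R is assumed to change only on the lockdown day, mimicking a model in
# which R shifts only at intervention dates.
def simulate(days, lockdown_day, r_before=3.8, r_after=0.97,
             serial_interval=5.0, i0=100.0):
    infections, i = [], i0
    for t in range(days):
        r = r_before if t < lockdown_day else r_after
        i *= r ** (1 / serial_interval)
        infections.append(i)
    return infections

traj = simulate(days=40, lockdown_day=20)

# Because R can only change at lockdown, the peak lands exactly there
peak_day = traj.index(max(traj))
print(f"Infections peak on day {peak_day}")
```

Whatever growth rates are chosen, infections rise right up to the lockdown day and fall immediately after: the step is built into the model structure, not discovered in the data.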

I believe that the key to these startling conclusions lies in the model assumptions, not in data from the natural world. We have no idea of the total number of coronavirus cases in any country at the start of lockdown, just the identified number of cases. Thus whilst the model estimates of the number of cases cannot be proved wrong, they are very unlikely to be correct. I can think of five reasons to back up my claim, with particular reference to the UK.

First, in the model the measures taken prior to the lockdown had no impact on coronavirus cases and hardly any on R. This includes the stopping of visitors to care homes, social distancing measures, voluntary closing of public places like pubs and the start of home working.

Second, in the model Prime Minister Boris Johnson going on TV to order a lockdown had an immediate impact, as did other leaders, such as President Macron in France. This is nonsense. People were locked down in households, so there would still have been infections within those households in the few days after lockdown.

Third, we know that many of the coronavirus deaths were of people infected whilst in hospitals or care homes. The lockdown did not segregate people within those communities.

Fourth, the assumed pre-lockdown spike followed by a massive drop in daily new infections was not followed a few days later by any such corresponding pattern in daily deaths. It is far easier to make the case for a zero impact of lockdowns rather than this extreme impact. The truth, if perfect data were available, is likely to be nearer zero lives saved than 3 million.

Fifth, in Italy the lockdown was not imposed nationally on the same day. The North of Italy was locked down first, followed by the rest of the country some days later. Like with the imposition of less draconian measures pre-lockdown in the UK, this should have seen a less immediate effect than suggested by Figure 1.

Are the authors and peer-reviewers at Nature likely to be aware of the problems with the headline BBC claims and the underlying paper? Compare the caption to “Extended Data Table 1” (pdf page 16)

Forecasted deaths since the beginning of the epidemic up to 4th May in our model vs. a counterfactual model assuming no interventions had taken place.

to the report from the BBC;

Lockdowns have saved more than three million lives from coronavirus in Europe, a study estimates. …….

They estimated 3.2 million people would have died by 4 May if not for measures such as closing businesses and telling people to stay at home.

That meant lockdown saved around 3.1 million lives, including 470,000 in the UK, 690,000 in France and 630,000 in Italy, the report in the journal Nature shows.

from the Evening Standard;

Around three million deaths may have been prevented by coronavirus lockdowns across Europe, research suggests.

from yahoo! News;

By comparing the number of deaths against those predicted by their model in the absence of interventions, the researchers believe that  3.1 million deaths have been averted due to non-pharmaceutical measures.

and from eCAM Biomed – GLOBAL IMPACT FUND

This study found that nonpharmaceutical interventions, including national “lockdowns,” could have averted approximately 3.1 million COVID-19 deaths across 11 European countries.

These press reports are not dissimilar to the title of the Imperial College press release

Lockdown and school closures in Europe may have prevented 3.1m deaths

I would suggest that they are different from the caption to Extended Table 1. The difference is between a comparison of actual data to modelled data based on some unlikely assumptions, and actual lives saved in the real world. The difference is between the lockdowns having saved 3 million lives and having saved far less. It is between the decisions of governments sacrificing economic prosperity to save hundreds of thousands of lives, and ruining the lives of millions based on pseudo-science for near zero benefit. The authors should be aware of this, and so should the reviewers at the world’s leading science journal.
Are we going to see a quiet withdrawal, like that of the BBC report?

Kevin Marshall

Dumb hard left proclamation replaces pluralistic thirst for knowledge and understanding

Last week Guido Fawkes had a little piece that, in my opinion, illustrates how nasty the world is becoming. I quote in full.

IMPERIAL COLLEGE DROPS “IMPERIAL” MOTTO
ROOTED IN POWER & OPPRESSION

In response to representations from students inspired by the Black Lives Matter movement Imperial College’s President, Professor Alice Gast, has announced they are dropping their “imperialist” Latin motto.

“I have heard from many of you with concerns about the university motto and its appearance on our crest. The Latin motto appears on a ribbon below the crest and is commonly translated to ‘Scientific knowledge, the crowning glory and the safeguard of the empire’. We have removed this ribbon and the motto in a revised crest you can see below in this briefing. This modified crest is already in use by my office and the Advancement team and will be integrated into all of our materials over the coming year. We will commission a group to examine Imperial’s history and legacy. We have a long way to go, but we will get better. We will build upon our community’s spirit, commitment and drive. We will draw strength from your commitment and support.”

The College’s motto, coined in 1908, was ‘Scientia imperii decus et tutamen’ which translates as ‘Scientific knowledge, the crowning glory and the safeguard of the empire’. As Titania McGrath might say this motto “is a reminder of a historical legacy that is rooted in colonial power and oppression”. That’s an actual quote from the college’s President, in the interests of diversity she is erasing the past. As someone once wrote “Who controls the past controls the future. Who controls the present controls the past.”

UPDATE: This old article from 1995 describes the arms and motto of Imperial College, paying particular attention to the deliberate ambiguity of the Latin:

Thus DECUS ET TUTAMEN translates as ‘an honour and a protection’. The rest of the motto is deliberately ambiguous. SCIENTIA means ‘knowledge’ but is also intended as a pun on the English word ‘science’. IMPERII could mean ‘power’, ‘dominion over’, ‘universal’, ‘of the empire’, ‘of the state’, or ‘superior’; and again is intended as a pun on the English word ‘imperial’.

Because of this ambiguity the full motto can be translated in many different ways. One translation could be: ‘Dominion over science is an honour and a protection’. A more politically correct translation might be: ‘Universal knowledge is beautiful and necessary’.

The Black Lives Matter translation of the motto – ‘Scientific knowledge, the crowning glory and the safeguard of the empire’ – might be valid, but so are many other formulations. Indeed, although Britain at the start of the last century was the most powerful nation and ruled the most extensive empire in history, competing with the United States & Germany as a leader in the pursuit of scientific knowledge, the motto has proved untrue. The imperialists who backed the foundation of Imperial College in the belief that scientific knowledge would safeguard the empire were mistaken. What is left is an Imperial College ranked about tenth in the world rankings of universities: a glorious product of imperialist thinking. Given that it is still thriving, it is more glorious than the majestic ruins of earlier empires, such as the Colosseum in Rome or the Parthenon in Athens.

Deeper than this is that the motto is deliberately a pun. It is superficially meaningful in different ways to those from a diverse range of backgrounds and belief systems. But those with deeper understanding – achieved through high-level study and reflection – know that more than one valid perspective is possible. That leads to the realisation that our own knowledge, or the collective knowledge of any groups we might identify as belonging to, is not the totality of all possible knowledge, and might even turn out to be false some time in the future. This humility gives a basis for furthering understanding of both the natural world and the place of people within it. Rather than shutting out alternative perspectives, we should develop understanding of our own world view, and aim to understand those of others. This is analogous to the onus in English Common Law on the prosecution to prove a clearly defined case beyond reasonable doubt based on the evidence. It also applies to the key aim of the scientific method: conjectures about the natural world are ultimately validated by experiments in the natural world.

Consider the alternative “ideal” that we are heading towards at an alarming rate of knots. What counts as knowledge is the collective opinion of those on the self-proclaimed moral high ground. In this perspective those who do not accept the wisdom of the intellectual consensus are guilty of dishonesty and should not be heard. All language and observations of the natural world are interpreted through this ideological position. Any conflict is resolved by the consensus. Is it far-fetched? A quote from Merchants of Doubt – Oreskes & Conway, 2010.

Sunday Times exaggerates price gouging on Amazon

It has been many months since last posting on this blog due to being engaged in setting up a small business after many years working as a management accountant, mostly in manufacturing. For the first time this year I purchased the Sunday Times. What caught my attention was an article “Amazon sellers rolling in dough from flour crisis“. As my speciality was product costing I noticed a few inaccuracies and exaggerations.

Sunday Times article from page 6 of print edition 03/05/20


The first issue was on the fees.

Amazon sells many products directly to consumers but more than half of its sales are from third-party sellers on its “Marketplace”. They pay fees to the online giant of up to 15% of the sale price.

The fees are at least 15% of the sale price. This is if sellers despatch the goods themselves, incurring the cost of postage and packing.

Let us take an example of the price rises.

A packet of 45 rolls of Andrex lavatory roll rose from under £24 to more than £95.

For somebody purchasing from Amazon with Prime, they will get free postage on purchases over £20. So they can get 45 rolls delivered for about the standard supermarket price of 5 x 9-roll packs at £4.50 each. Using a third-party app (which might not be accurate) for the Classic Clean Andrex, I find that third party sellers were selling at £23.45 up to March 8, when stocks ran out. Further, Amazon were selling for about 3 days at £18.28 until Sunday March 8, when they also ran out. Apart from on Fri Mar 13, Amazon did not have supplies until late April. It was during this period that 3rd party sellers were selling at between £50 & £99.99. Any items offered for sale sold very quickly.

Now suppose an enterprising individual managed to grab some Andrex from a wholesaler (est. £15 inc. VAT) and list them for sale on Amazon. How much would they make? If they already had an account (cost about £30 per month) they could despatch the goods themselves. They would need a large box (at least 45 x 45 x 35 cm), which they might be able to buy for under £30 for a pack of 15. They would have to pay postage: £20.45 at the Post Office. If anyone can find a carrier (for 6.5kg) cheaper than £12, including insurance and pick-up, please let me know in the comments. Selling at £50, the costs would be at least £7.50 + £15 + £2 + £12 = £36.50. For a quick buck it is a lot of work.
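Laying the costing out explicitly (the figures are my estimates from above, not an actual Amazon fee schedule):

```python
# Margin on the hypothetical 45-roll Andrex sale at £50
sale_price = 50.00
referral_fee = 0.15 * sale_price  # "at least 15%" Marketplace fee
wholesale = 15.00                 # est. cost incl. VAT
box = 30.00 / 15                  # pack of 15 large boxes at under £30
carriage = 12.00                  # cheapest 6.5 kg carrier found

costs = referral_fee + wholesale + box + carriage
profit = sale_price - costs
print(f"Costs £{costs:.2f}, profit £{profit:.2f}")
```

That is £13.50 gross per box, before the £30 monthly account fee and the seller’s time.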

This is, however, a bad example. Let us try a much lower weight product: the classic Uno Card Game, which the Sunday Times claims was listed at £5.60 on March 1st and £5.60-£17.99 on April 30th. This compares with £7.00 at Argos & Sainsbury’s and £6.97 at Asda. The inaccuracy here is with the range of prices, as there were multiple sellers on both dates, with £5.60 being the price charged by Amazon themselves. Actual selling prices fluctuated during March, and this evening the prices are between £5.49 and £17.99. It is usually the case with popular products that there are multiple sellers. During March and April Amazon were out of stock, with actual selling prices between £4.99 and £19.00. Most often it was in the range of £9.00-£11.50.

A final example from the Sunday Times is for Carex Handwash Sensitive 250ml. As an antibacterial product, as soon as the Government recommended frequent hand washing the product sold out in supermarkets. As such it was ripe for making super-profits during the period of panic buying. This product used to be frequently available at £1 or slightly more. The Sunday Times lists the Amazon price at £1.99 on March 1st and at £5.98-£11.50 on April 30th. My app shows a single seller at £7.99, with a March 1st price of £3.25. The Sunday Times have probably picked up a different listing that is no longer available. The best value Carex antibacterial at the time of writing appears to be this listing for 2 x 250ml, where the price ranges from £9.49 to £13.16 including delivery. Selling at around £3.64 prior to March, the selling price peaked at around £32.99 in mid-March.

Whilst the Sunday Times article may not have the best examples, it does highlight that some people have made extraordinary profits by either being in the right place at the right time, or by quickly reacting to changing market information and anticipating the irrational panic buying of many shoppers. Here the problem is not with entrepreneurs meeting demand, but with consumers listening to the fearmongers in the media and social media, believing that a pandemic “shutdown” would stop the movement of goods, along with a cultural ignorance of the ability of markets to respond rapidly to new information. In the supermarkets there were many needlessly emptied shelves. Much of the fresh food bought in panic was binned. Further, many households will not be purchasing any more rice, tinned tomatoes and toilet rolls for many months. Since then there has been an extraordinary response by suppliers and supermarkets in filling the shortages. The slowest responses to shortages have been where the state is the dominant purchaser or the monopoly supplier and purchaser: the former in the area of PPE, and the latter in the area of coronavirus testing.

Finally, there is a puzzle as to why there is such a range of prices available for an item on Amazon. One reason is that many of the high-priced sellers were once competitive, but the price has fallen dramatically. Another is that the higher-priced sellers are either hoping people make a mistake, or have “shops” on Amazon that lure people in with low-priced products in the hope they occasionally buy other, over-priced products. Like the old-fashioned supermarket loss-leaders, but on steroids. Alternatively they may have the products listed elsewhere (e.g. actual shops or on Ebay) and/or a range of products, with the extraordinary profits of the few offsetting the long-term write-offs of the many. There is the possibility that these hopefuls will be the future failures, as will be the majority of entrepreneurial ventures in any field.

Kevin Marshall

How misleading economic assumptions can show Brexit making people worse off

Last week the BBC News headlined “Brexit deal means ‘£70bn hit to UK by 2029’”. ITV News had a similar report. The source, NIESR, summarizes their findings as follows:-

Fig 1 : NIESR headline findings 

£70bn appears to be a lot of money, but this is a 10 year forecast on an economy that currently has a GDP of £2,000bn. The difference is about one third of one percent a year. The “no deal” scenario is just £40bn worse than the current deal on offer, hardly an apocalyptic scenario that should not be countenanced. Put another way, if underlying economic growth is 2%, then on the NIESR figures in ten years the economy will be between 16% and 22% larger. In economic forecasting, the longer the time frame, the more significant the underlying assumptions. The reports are based on an NIESR open-access paper, Prospects for the UK Economy – Arno Hantzsche, Garry Young, first published 29 Oct 2019. The key basis is contained in Figures 1 & 2, reproduced below.
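A quick check of the magnitudes, treating the £70bn (deal) and £110bn (no deal) as cumulative losses spread over the ten years, against a £2,000bn economy growing at a 2% trend:

```python
gdp = 2000.0          # current UK GDP, £bn
hit_deal = 70.0       # NIESR 10-year cost of the deal, £bn
hit_no_deal = 110.0   # deal cost plus the extra £40bn for "no deal"
years, trend = 10, 0.02

annual_drag = hit_deal / gdp / years       # about a third of 1% a year
growth_10yr = (1 + trend) ** years - 1     # trend growth over the decade

low = growth_10yr - hit_no_deal / gdp
print(f"Annual drag: {annual_drag:.2%}")
print(f"Economy in ten years: {low:.0%} to {growth_10yr:.0%} larger")
```

This reproduces the 0.35% a year drag and the “between 16% and 22% larger” range.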

Fig 2 : Figures 1 & 2 from the NIESR paper “Prospects for the UK Economy”

The two key figures purport to show that Brexit has made a difference. Business investment growth has apparently ground to a halt since mid-2016 and economic growth slowed. What it does not show is a decline in business investment, nor a halting of economic growth.

After these figures the report states:-

The reason that investment has been affected so much by the Brexit vote is that businesses fear that trade with the EU will be sufficiently costly in the future – especially with a no-deal Brexit – that new investment will not pay off. Greater clarity about the future relationship, especially removing the no-deal threat, might encourage some of that postponed investment to take place. But that would depend on the type of deal that is ultimately negotiated. A deal that preserved the current close trading relationship between the UK and EU could result in an upsurge in investment. In contrast, a deal that would make it certain that there would be more trade barriers between the UK and EU in the future would similarly remove the risk of no deal but at the same time eliminate the possibility of closer economic ties, offsetting any boost to economic activity.

This statement asserts, without evidence, that the cause of the change in investment trend is singular. That is due to business fears over Brexit. There is no corroborating evidence to back this assumption, such as surveys of business confidence, or decline in the stock markets. Nor is there a comparison with countries other than the UK, to show that any apparent shifts are due to other causes, such as the normal business cycle. Yet it is this singular assumed cause of the apparent divergence from trend that is used as the basis of forecasting for different policy scenarios a decade into the future.

The rest of this article will concentrate on the alternative evidence, to show that any alleged changes in economic trends are either taken out of context or did not occur as a result of Brexit. For this I use World Bank data over a twenty year period, comparing the UK to the Euro area. If voting to leave the EU has had a significant impact on economic trends, it should show up in these comparisons.

Net Foreign Direct Investment

There is no data for the narrow business investment at the World Bank. The alternative is net foreign direct investment.


Fig 3 : Data for net foreign direct investment from 1999 to 2018 for the Euro area and the UK.

UK net foreign direct investment was strongly negative from 2014 to 2016, becoming around zero in 2017 and 2018. The Euro area shows the opposite trend. Politically, in 2014 UKIP won the UK elections to the European Parliament, followed in 2015 by the promise of a referendum on the EU. Maybe the expectation of Britain voting to leave the EU could have had an impact? More likely this net outflow is connected to the decline in the value of the pound. From xe.com

Fig 4 : 10 year GBP to USD exchange rates. Source xe.com

The three years of net negative FDI were years of steep declines in the value of the pound. In the years before and after, when exchange rates were more stable, net FDI was near zero.

GDP growth rates %

The NIESR chose the value of quarterly output to show a purported decline in the rate of economic growth post EU Referendum. More visible are the GDP growth rates.

Fig 5 : Annual GDP growth rates for the Euro area and the UK from 1999 to 2018. 

The Euro area and the UK suffered economic crashes of similar magnitude in 2008 and 2009. From 2010 to 2018 the UK has enjoyed unbroken economic growth, peaking in 2014. Growth rates were declining well before the EU referendum. The Euro area was again in recession in 2012 and 2013, which more than offsets its stronger growth relative to the UK from 2016 to 2018. In the years 2010 to 2018 Euro area GDP growth averaged 1.4%, compared with 1.5% for the years 1999 to 2009. In the UK it was 1.9% in both periods. The NIESR is essentially claiming that leaving the EU without a deal will reduce UK growth to levels comparable with most of the EU.

Unemployment – total and youth

Another metric is unemployment rates. If voting to leave has impacted business investment and economic growth, one would expect a lagged impact on unemployment.

Fig 6 : Unemployment rates (total and youth) for the Euro area and the UK from 1999 to 2019. The current year is to September.

Unemployment in the Euro area has been consistently higher than in the UK. The second recession of 2012 and 2013 in the Euro area resulted in unemployment peaking at least two years later than in the UK. But in both places there have been over five years of falling unemployment. Brexit seems to have had zero impact on the trend in the UK, where unemployment is now the lowest since the early 1970s.

The average rates of total unemployment for the period 1999-2018 are 8.2% in the Euro area and 6.0% in the UK. For youth unemployment they are 20.9% and 14.6% respectively. 

The reason for higher rates of unemployment in EU countries for decades is largely down to greater regulatory rigidities than the UK. 

Concluding comments

NIESR’s assumption that the slowdowns in business investment and economic growth are solely due to the uncertainties created by Brexit is not supported by the wider evidence. Without support for that claim, the ten year forecasts of slower economic growth due to Brexit fail entirely. Instead Britain should be moving away from EU stagnation with its high youth unemployment, charting a better course that our European neighbours will want to follow.

Kevin Marshall

Cummings, Brexit and von Neumann

Over at Cliscep, Geoff Chambers has been reading some blog articles by Dominic Cummings, now senior advisor to PM Boris Johnson and formerly the key figure behind the successful Vote Leave campaign in the 2016 EU Referendum. In a 2014 article on game theory Cummings demonstrates he has actually read von Neumann’s articles and the seminal 1944 book (with Oskar Morgenstern) “Theory of Games and Economic Behavior” that he quotes. I am sure that he has drawn on secondary sources as well.
A key quote in the Cummings article is from Von Neumann’s 1928 paper.

‘Chess is not a game. Chess is a well-defined computation. You may not be able to work out the answers, but in theory there must be a solution, a right procedure in any position. Now, real games are not like that at all. Real life is not like that. Real life consists of bluffing, of little tactics of deception, of asking yourself what is the other man going to think I mean to do. And that is what games are about in my theory.’

Cummings states that the paper

introduced the concept of the minimax: choose a strategy that minimises the possible maximum loss.

Neoclassical economics starts from the assumption of utility maximisation, based on everyone being in the same position and having the same optimal preferences. In relationships they are usually just suppliers and demanders, with both sides gaining. Game theory posits that there may be net trade-offs in relationships, with possibilities of some parties gaining at the expense of others. What von Neumann (and also Cummings) do not fully work out is a consequence of people bluffing: as players do not reveal their preferences, it is not possible to quantify the utility they receive. As such, mathematics is only of use in working through hypothetical situations, not for empirically working out optimal strategies in most real-world situations. But the discipline imposed by laying out the problem in game theory is to recognize that opponents in the game both have different preferences and may be bluffing.
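Von Neumann’s minimax can be illustrated with a toy zero-sum game. The payoff matrix below is invented for illustration – it is not from Cummings or von Neumann – and the sketch restricts itself to pure strategies:

```python
# Minimax (maximin) for a two-player zero-sum game.
# Rows are the row player's strategies; entries are the row player's payoff.
# The payoff matrix is purely illustrative.
payoffs = [
    [3, -1, 2],
    [1,  0, 1],
    [-2, 4, 0],
]

# For each row strategy, the worst case is the minimum payoff in that row.
worst_cases = [min(row) for row in payoffs]

# The minimax choice maximises that minimum possible payoff.
best_row = max(range(len(payoffs)), key=lambda i: worst_cases[i])

print(best_row, worst_cases[best_row])  # prints: 1 0
```

Here the row player picks strategy 1, whose worst case (a payoff of 0) is better than the worst cases of the alternatives. Von Neumann’s theorem extends this to mixed (randomised) strategies, which is where bluffing enters.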

In my view one has to consider the situation of the various groups in the Brexit “game”.

The EU is a major player whose gains or losses from Brexit need to be considered. More important than the economic aspects (the loss of 15% of EU GDP; a huge net contributor to the EU budget; a growing economy when the EU as a whole is heading towards recession) is the loss of face at having to compromise for a deal, or the political repercussions of an independent Britain being at least as successful as a member.

By coming out as the major national party of Remain the Liberal Democrats have doubled their popular support. However, in so doing they have taken an extreme position, which belies their traditional occupation of the centre ground in British politics. Further, in a hung Parliament it is unlikely that they would go into coalition with either the Conservatives or Labour.  The nationalist Plaid Cymru and SNP have similar positions. In a hung Parliament the SNP might go into coalition with Labour, but only on the condition of another Scottish Independence Referendum.

The Labour Party have a problem. Comparing Chris Hanretty’s estimates of the referendum vote split for the 574 parliamentary constituencies in England and Wales with the 2015 General Election results, Labour seats are more deeply divided than the country as a whole. Whilst Labour held just 40% of the seats, they had just over half of the 231 seats with a 60% or more Leave vote, and almost two-thirds of the 54 seats with a 60% or more Remain vote. Adding in the constituencies where Labour came second by a margin of less than 12% of the vote (the seats needed to win a Parliamentary majority), I derived the following chart.

Tactically, Labour would have to move towards a Leave position, but most of the MPs were very pro-Remain and a clear majority of Labour voters likely voted Remain. Even in some Labour constituencies where the constituency as a whole voted Leave, a majority of Labour voters may have voted Remain. Yet leading members of the current Labour leadership, and a disproportionate number of the wider leadership, are in very pro-Remain London constituencies.

The Conservative-held seats had a less polarised spread of opinion. Whilst less than 30% of their 330 England and Wales seats voted >60% Leave, the vast majority voted Leave and very few were virulently pro-Remain.

But what does this tell us about a possible Dominic Cummings strategy in the past few weeks?

A major objective since Boris Johnson became Prime Minister and Cummings was appointed less than two months ago has been the drive to leave the EU on 31st October. The strategy has been to challenge the EU to compromise on the Withdrawal Agreement to obtain a deal acceptable to the UK Parliament. Hilary Benn’s EU Surrender Act was passed to hamper the negotiating position of the Prime Minister, thus shielding the EU from either having to compromise or being seen by the rest of the world as intransigent against reasonable and friendly approaches. A further aim has been to force other parties, particularly Labour, to clarify where they stand. As a result, Labour seems to have a clear Remain policy. In forcing the Brexit policy the Government have lost their Parliamentary majority. However, they have caused Jeremy Corbyn to conduct a complete about-turn on a General Election: calling for an immediate election, then twice turning down the opportunity to have one.

Returning to the application of game theory to the current Brexit situation, I believe there to be a number of possible options.

  1. Revoke Article 50 and remain in the EU. The Lib Dem, Green, SNP and Plaid Cymru position.
  2. Labour’s current option of negotiating a Withdrawal Agreement to its liking, then holding a second referendum on leaving with that Withdrawal Agreement or remaining in the EU. As I understand the current situation, the official Labour position would be to Remain, but members of a Labour Cabinet would be allowed a free vote. That is, Labour would respect the EU Referendum result only very superficially, whilst not permitting the UK to break away from the umbrella of EU institutions and diktats.
  3. To leave on a Withdrawal Agreement negotiated by PM Boris Johnson and voted through Parliament.
  4. To leave the EU without a deal.
  5. To extend Article 50 indefinitely until public opinion gets so fed up that it can be revoked.

Key to this is understanding the perspectives of all sides. For Labour (and many others in Parliament) the biggest expressed danger is a no-deal Brexit. This I believe is either a bluff on their part, or a failure to get a proper sense of proportion. This is illustrated by reading the worst-case No Deal Yellowhammer document (released today) as a prospective reality rather than as a “brainstorm” working paper forming a basis for contingency planning. By imagining such situations, however unrealistic, action plans can be created to prevent the worst impacts should they arise. Positing maximum losses allows the impacts to be minimized. Governments usually keep such papers confidential precisely because political opponents and journalists evaluate them as credible scenarios which will not be mitigated against.

Labour’s biggest fear – and that of many others who have blocked Brexit – is of confronting the voters. This is especially due to having told Leave voters they were stupid for voting the way they did, or were taken in by lies. Although the country is evenly split between Leave and Remain supporting parties, the more divided nature of the Remainers means that the Conservatives will likely win a majority on around a third of the vote. Inputting yesterday’s YouGov/Times opinion poll results into the Electoral Calculus user-defined poll gives the Conservatives a majority of 64 with just 32% of the vote.

I think when regional differences are taken into account the picture is slightly different. The SNP will likely end up with 50 seats, whilst Labour could lose seats to the Brexit Party in the North and maybe to the Lib Dems. If the Conservatives do not win a majority, the fifth scenario is most likely to play out.

In relation to Cummings and Game Theory, I would suggest that the game is still very much in play, with new moves to be made and further strategies to come into play. It is Cummings and other Government advisors who will be driving the game forward, with the Remainers being the blockers.

Kevin Marshall

Updated 29/09/19

How climate damage costings from EPA Scientists are misleading and how to correct

The Los Angeles Times earlier this month had an article

From ruined bridges to dirty air, EPA scientists price out the cost of climate change. (Hattip Climate Etc.)

By the end of the century, the manifold consequences of unchecked climate change will cost the U.S. hundreds of billions of dollars per year, according to a new study by scientists at the Environmental Protection Agency.
…..
However, they also found that cutting emissions of carbon dioxide and other greenhouse gases, and proactively adapting to a warming world, would prevent a lot of the damage, reducing the annual economic toll in some sectors by more than half.

The article is based on the paper
Climate damages and adaptation potential across diverse sectors of the United States – Jeremy Martinich & Allison Crimmins – Nature Climate Change 2019

The main problem is with the cost alternatives, contained within Figure 2 of the article.

Annual economic damages from climate change under two RCP scenarios. RCP8.5 has no mitigation and RCP4.5 massive mitigation. Source Martinich & Crimmins 2019 Figure 2

I have a lot of issues with the cost estimates. But the fundamental issue centers around costs that are missing from the RCP4.5 costs to enable a proper analysis to be made.

The LA Times puts forward the 2006 Stern Review – The Economics of Climate Change – as an earlier attempt at calculating “the costs of global warming and the benefits of curtailing emissions.”
The major policy headline from the Stern Review (4.7MB pdf)

Using the results from formal economic models, the Review estimates that if we don’t act, the overall costs and risks of climate change will be equivalent to losing at least 5% of global GDP each year, now and forever. If a wider range of risks and impacts is taken into account, the estimates of damage could rise to 20% of GDP or more. In contrast, the costs of action – reducing greenhouse gas emissions to avoid the worst impacts of climate change – can be limited to around 1% of global GDP each year.

The Stern Review implies a straight alternative: either the costs of unmitigated climate change OR the costs of mitigation policy. The RCP4.5 figures are the residual climate damage costs after costly policies have been successfully applied. The Stern Review quotation only looked at the policy costs, not the residual climate damage costs after policy has been applied, whereas Martinich & Crimmins 2019 only look at the residual climate damage costs and not the policy costs.

The costs of any mitigation policy to combat climate change must include both the policy costs and the damage costs. But this is not the most fundamental problem.

The fundamental flaw in climate mitigation policy justifications

The estimated damage costs of climate change result from global emissions of greenhouse gases, which raise the average levels of atmospheric greenhouse gases which in turn raise global average temperatures. This rise in global average temperatures is what is supposed to create the damage costs.
By implication, the success of mitigation policies in reducing climate damage costs is measured in relation to the reduction in global emissions. But 24 annual COP meetings have failed to produce even vague policy intentions that would collectively stabilize emissions at current levels. From the UNEP Emissions Gap Report 2018, Figure ES.3 shows the gap between intentions and the emissions reductions required to constrain global warming to 1.5°C and 2.0°C.

Current mitigation policies in aggregate will achieve very little. If a country were to impose additional policies, the marginal impact in reducing global emissions would be very small. By implication, any climate mitigation policy costs imposed by a country, or a sub-division of that country, will only result in very minor reductions in the future economic damage costs to that country. This is true even if the climate mitigation policies are the most economically efficient, getting the biggest reductions for a given amount of expenditure. As climate mitigation is net costly, current climate mitigation policies will necessarily impose burdens on the current generation, whilst doing far less in reducing the climate impacts on future generations in the policy area. Conversely, elimination of costly policies will be net beneficial to that country. Given the global demands for climate mitigation, the politically best policy is to do as little as possible, whilst appearing to be as virtuous as possible.
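The asymmetry described above can be made concrete with a back-of-the-envelope sketch. All the numbers are assumptions chosen for exposition (the 3% damage figure and one-seventh emissions share echo those discussed later in this post):

```python
# Illustrative sketch - the numbers are assumptions for exposition, not taken
# from Nordhaus or the UNEP report. A country bears the full cost of its own
# mitigation policy, but the damage reduction it buys is diluted by its share
# of global emissions.
global_damage_pct_gdp = 3.0     # assumed unmitigated climate damage, % of GDP
policy_cost_pct_gdp = 1.0       # assumed cost of the country's policy, % of GDP
damage_cut_if_global = 0.25     # fraction of damages avoided if EVERY country did the same
emissions_share = 1 / 7         # the country's share of global emissions (~USA)

# Acting alone, the country only abates its own share of global emissions.
unilateral_benefit = global_damage_pct_gdp * damage_cut_if_global * emissions_share
print(round(unilateral_benefit, 2))  # prints: 0.11 - far below the 1.0% of GDP policy cost
```

The point survives any reasonable choice of numbers: the cost term is borne in full, while the benefit term is scaled down by the emissions share.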

Is there a way forward for climate policy?

A basic principle in considering climate mitigation is derived from Reinhold Niebuhr’s Serenity Prayer

God, grant me the serenity to accept the things I cannot change,
Courage to change the things I can,
And wisdom to know the difference.

An emended prayer (or mantra) for policy-makers would be to change things for the better. I would propose not requiring perfect knowledge of the future, but a reasonable expectation that a policy will change the world for the better: if a policy is costly, its benefits should exceed its costs. Whether policy-makers are aiming to serve their own group, or humanity as a whole, they should have the serenity to accept that policy has costs and harms. In this light consider a quote by Nobel Laureate Prof William Nordhaus from an article in the American Economic Journal last year.

The reality is that most countries are on a business-as-usual (BAU) trajectory of minimal policies to reduce their emissions; they are taking noncooperative policies that are in their national interest, but far from ones which would represent a global cooperative policy.

Nordhaus agrees with the UNEP Emissions Gap report. Based on the evidence of COP24 Katowice, it is highly unlikely that most countries will do a sudden about-face, implementing policies that are clearly against their national interest. If this is incorrect, maybe someone can start by demonstrating to countries that rely on fossil fuel production for a major part of their national income – such as Russia, Saudi Arabia, Iran and other Middle Eastern countries – how leaving fossil fuels in the ground and embracing renewables is in their national interests. In the meantime, how many politicians will publicly accept that it is not in their power to reduce global emissions, yet continue implementing policies whose success requires that they are part of a collective effort that will reduce global emissions?

If climate change is going to cause future damages, what are the other options?

Martinich & Crimmins 2019 have done some of the work in estimating the future costs of climate change for the United States. Insofar as these are accurate forecasts, actions can be taken to reduce those future risks. But those future costs are contingent on a whole series of assumptions. Most crucially, the large magnitude of the damage costs is usually contingent on “dumb economic actor” assumptions. That is, people have zero behavioural response to changing conditions over many decades. Two examples I looked at last year illustrate the dumb economic actor assumptions.

A Government report last summer claimed that unmitigated climate change would result in 7,000 excess heat deaths in the UK by the 2050s. The amount of warming was small. The underlying report was based on the coldest region of England and Wales only experiencing average summer temperatures in the 2050s on a par with those of London (the warmest region) today. Most of the excess deaths would be in the over-75s in hospitals and care homes. The “dumb actors” in this case are the health professionals, caring for patients in an extreme heatwave in the 2050s in exactly the same way as they do today, even though the temperatures would be slightly higher. Nobody would think to try to adapt practices by learning from places with hotter summers than the UK at present – that is, from the vast majority of countries in the world.

Last year a paper in Nature Plants went by the title “Decreases in global beer supply due to extreme drought and heat”. I noted the paper made a whole series of dubious assumptions, including two “dumb actor” assumptions, to arrive at the conclusion that beer prices in some places could double due to global warming. One was that although in the agriculture models barley yields would shrink globally by 16% by 2100 compared to today, contingent on a rise in global average temperatures of over 3.0°C, in Montana and North Dakota yields could double. The lucky farmers in these areas would not try to increase output, nor would farmers faced with shrinking yields reduce output. Another was that large price discrepancies in a bottle of beer would open up over the next 80 years between adjacent countries. This includes between Britain and Ireland, despite most of the beer sold being produced by large brewing companies, often in plants in third countries. No one would have the wit to buy a few thousand bottles of beer in Britain and re-sell them at a huge profit in higher-priced Ireland.

If the prospective climate damage costs in Martinich & Crimmins 2019 are based on similar “dumb actor” assumptions, then any costly adaptation policies derived from the report might be largely unnecessary. People on the ground will have more effective localized, efficient adaptation strategies. Generalized regulations and investments based on the models will fail on a cost-benefit basis.

Concluding comments

Martinich & Crimmins 2019 look at US climate damage costs under two scenarios, one with little or no climate mitigation policies, the other with considerable successful climate mitigation. In the climate mitigation scenario they fail to add in the costs of climate mitigation policies. More importantly, actual climate mitigation policies have only been enacted by a small minority of countries, so costs expended on mitigation will not be matched by significant reductions in future climate costs. Whilst in reality any climate mitigation policies are likely to lead to worse outcomes than doing nothing at all, the paper implies the opposite.
Further, the assumptions behind Martinich & Crimmins 2019 need to be carefully checked. If it includes “dumb economic actor” assumptions then on this alone the long-term economic damage costs might be grossly over-estimated. There is a real risk that adaptation policies based on these climate damage projections will lead to worse outcomes than doing nothing.
Overall, if policy-makers want to make a positive difference to the world in combating climate change, they should acquire the wisdom to identify the areas where they can only do net harm. In the current environment, that will take an extreme level of courage. Yet such justifications are far less onerous than the rigorous testing and approval process that new medical treatments need to go through before being allowed into general circulation.

Kevin Marshall

Nobel Laureate William Nordhaus demonstrates that pursuing climate mitigation will make a nation worse off

Summary

Nobel Laureate Professor William Nordhaus shows that the optimal climate mitigation policy involves far less mitigation than the UNFCCC proposes: constraining warming by 2100 to 3.5°C instead of to 2°C or less. But this optimal policy is based on a series of assumptions, including that policy is optimal and near-universally applied. In the current situation, with most countries lacking any effective mitigation policies, climate mitigation policies within a country will likely make that country worse off, even though it would be better off were the same policies near-universally applied. Countries applying costly climate mitigation policies are making their people worse off.

Context

Last week Bjorn Lomborg tweeted a chart derived from a Nordhaus paper of August 2018 in the American Economic Journal.

The paper citation is

Nordhaus, William. 2018. “Projections and Uncertainties about Climate Change in an Era of Minimal Climate Policies.” American Economic Journal: Economic Policy, 10 (3): 333-60.

The chart shows that the optimal climate mitigation policy – based upon minimization of the combined (a) projected costs of climate mitigation policy and (b) residual net costs of human-caused climate change – is much closer to the no-policy outcome of 4.1°C than to restraining warming to 2.5°C. By the assumptions of Nordhaus’s model, greater constraint on warming can only be achieved through much greater policy costs. The abstract concludes

The study confirms past estimates of likely rapid climate change over the next century if major climate-change policies are not taken. It suggests that it is unlikely that nations can achieve the 2°C target of international agreements, even if ambitious policies are introduced in the near term. The required carbon price needed to achieve current targets has risen over time as policies have been delayed.

A statement whose implications are ignored

This study is based on mainstream projections of greenhouse gas emissions and the resultant warming. Prof Nordhaus is in the climate mainstream, not a climate agnostic like myself. Given this, I find the opening statement interesting. (My bold)

Climate change remains the central environmental issue of today. While the Paris Agreement on climate change of 2015 (UN 2015) has been ratified, it is limited to voluntary emissions reductions for major countries, and the United States has withdrawn and indeed is moving backward. No binding agreement for emissions reductions is currently in place following the expiration of the Kyoto Protocol in 2012. Countries have agreed on a target temperature limit of 2°C, but this is far removed from actual policies, and is probably infeasible, as will be seen below.
The reality is that most countries are on a business-as-usual (BAU) trajectory of minimal policies to reduce their emissions; they are taking noncooperative policies that are in their national interest, but far from ones which would represent a global cooperative policy.

Although there is a paper agreement to constrain emissions commensurate with 2°C of warming, most countries are doing nothing – or next to nothing – to control their emissions. The real-world situation is completely different to the assumptions made in the model. The implications of this are skirted over by Nordhaus, but will be explored below.

The major results at the beginning of the paper are
  • The estimate of the SCC has been revised upward by about 50 percent since the last full version in 2013.
  • The international target for climate change with a limit of 2°C appears to be infeasible with reasonably accessible technologies even with very ambitious abatement strategies.
  • A target of 2.5°C is technically feasible but would require extreme and virtually universal global policy measures in the near future.

SCC is the social cost of carbon. The conclusions about policy are not obtained from understating the projected costs of climate change, yet the aim of limiting warming to 2°C appears infeasible. By implication, limits tighter than this – such as 1.5°C – should not be considered by rational policy-makers. Even a target of 2.5°C requires special conditions to be fulfilled and is still less optimal than doing nothing. The conclusion from the paper, without going any further, is that achieving the aims of the Paris Climate Agreement will make the world a worse place than doing nothing: the combined costs of policy and any residual costs of climate change will be much greater than the projected costs of climate change alone.

Some assumptions

The outputs of a model are achieved by making a number of assumptions. When evaluating whether the model results are applicable to real-world mitigation policy, consideration needs to be given to whether those assumptions hold true, and to the impact on policy if they are violated. I have picked out some of the assumptions. The ones that are a direct or near-direct quote are in italics.

  1. Mitigation policies are optimal.
  2. Mitigation policies are almost universally applied in the near future.
  3. The abatement-cost function is highly convex, reflecting the sharp diminishing returns to reducing emissions.
  4. For the DICE model it is assumed that the rate of decarbonization going forward is −1.5 percent per year.
  5. The existence of a “backstop technology,” which is a technology that produces energy services with zero greenhouse gas (GHG) emissions.
  6. Assumed that there are no “negative emissions” technologies initially, but that negative emissions are available after 2150.
  7. Assumes that damages can be reasonably well approximated by a quadratic function of temperature change.
  8. Equilibrium climate sensitivity (ECS) is a mean warming of 3.1°C for an equilibrium CO2 doubling.

This list is far from exhaustive. For instance, it does not include assumptions about the discount rate, economic growth or emissions growth. However, the case against current climate mitigation policies, or proposed policies, can be made by consideration of the first four.
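The trade-off embodied in assumptions 3 and 7 can be sketched numerically. The functional forms follow those assumptions (quadratic damages, a highly convex abatement-cost function), but the coefficients are invented for illustration and are not Nordhaus’s DICE calibration:

```python
# Stylised total-cost curve: quadratic damages plus a highly convex abatement
# cost. Coefficients are invented for illustration, not Nordhaus's calibration.
def damages(t):
    # Damage cost, % of GDP, quadratic in warming t (deg C) - assumption 7.
    return 0.2 * t ** 2

def abatement(t, t_bau=4.1):
    # Abatement cost rises steeply as t is pushed below the 4.1 deg C
    # no-policy baseline - the "highly convex" assumption 3.
    return 1.6 * (t_bau - t) ** 4 if t < t_bau else 0.0

# Search warming targets between 1.5 deg C and the no-policy baseline.
targets = [1.5 + 0.01 * i for i in range(261)]
optimum = min(targets, key=lambda t: damages(t) + abatement(t))
print(round(optimum, 2))  # prints: 3.5
```

With these made-up coefficients the minimum of combined costs lands near 3.5°C, the same qualitative result as the summary above: pushing the target down towards 2°C sends abatement costs up much faster than damages fall.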

Implications of assumptions being violated

I am using a deliberately strong term for the assumptions not holding.

Clearly a policy is not optimal if it does not work, or even spends money to increase emissions. More subtle is the use of sub-optimal policies. For instance, raising the cost of electricity is less regressive if the poor are compensated; as a result the emissions reductions are less, and the cost per tonne of CO2 mitigated rises. Or nuclear power is not favoured, so is replaced by a more expensive system of wind turbines and backup energy storage. These might be trivial issues if, in general, policy was focussed on the optimal policy of a universal carbon tax. No country is even close. Attempts to impose carbon taxes in France and Australia have proved deeply unpopular.

Given the current state of affairs described by Nordhaus in the introduction, the most violated assumption is that mitigation policy is universally applied. Most countries have no effective climate mitigation policies, and very few have policies in place that are likely to result in anywhere near the huge global emission cuts required to achieve the 2°C warming limit. (The most recent estimate, from the UNEP Emissions Gap Report 2018, is that global emissions need to be 25% lower in 2030 than in 2017.) Thus globally the costs of unmitigated climate change will be close to the unmitigated 3% of GDP, with global policy costs being a small fraction of 1% of GDP. But a country that spends 1% of GDP on policy – even if that is optimal policy – will only see a minuscule reduction in its expected climate costs. Even the USA, with about one seventh of global emissions, might on Nordhaus’s assumptions expect efficiently spending 1% of output to cut its future climate costs by maybe 0.1%. The policy-cost-to-benefit ratio for a country acting on its own is quite different from that of the entire world working collectively on similar policies. Assumption four, of a 1.5% per year reduction in global emissions, illustrates the point in a slightly different way: if the USA started cutting its emissions by an additional 1.5% a year (they are falling without policy), global emissions would likely keep on increasing.

The third assumption is another that is sufficient on its own to undermine climate mitigation. The UK and some US states are pursuing what would be a less-than-2°C pathway if it were universally applied. That means they are committing to a highly convex policy cost curve (often made steeper by far-from-optimal policies) with virtually no benefits for future generations.

Best Policies under the model assumptions

The simplest alternative to climate mitigation policies would be to have no policies at all. However, if the climate change cost functions are a true representation, and given the current Paris Agreement, this is not a viable option for those less thick-skinned than President Trump, or for those facing a majority who believe in climate change. Economic theory can provide some insights into the strategies to be employed. For instance, if the climate cost curve is a quadratic as in Nordhaus (or steeper – in Stern I believe it was at least a quartic), there are rapidly diminishing returns to mitigation policies in terms of costs mitigated. For a politician who wants to serve their country, the simplest strategies are to

  • Give the impression of doing something to appear virtuous
  • Incur as little cost as possible, especially those that are visible to the majority
  • Benefit special interest groups, especially those with climate activist participants
  • Get other countries to bear the real costs of mitigation.

This implies that many political leaders who want to serve the best interests of their countries need to adopt a strategy of showing they are doing one thing to appear virtuous, whilst in reality doing something quite different.

In countries dependent on extracting and exporting fossil fuels for a large part of their national income (e.g. the Gulf States, Russia, Kazakhstan, Turkmenistan etc.) different priorities apply, and the marginal policy costs of global mitigation are much higher. In particular, if, as part of climate policies, other countries were to shut down existing fossil fuel extraction, or fail to develop new sources of supply to a significant extent, then market prices would rise, to the benefit of the remaining producers.

Conclusion

Using Nordhaus’s model assumptions, even if the world as a whole fulfilled the Paris Climate Agreement collectively with optimal policies, the world would be worse off than if it did nothing. Given that most countries are in fact pursuing little or no actual climate mitigation policy, a country pursuing any costly climate mitigation policies of its own will be worse off still than doing nothing.

Assuming political leaders have the best interests of their country at heart, and regardless of whether they regard climate change as a problem, the optimal policy strategy is to impose as little costly policy as possible for the maximum appearance of being virtuous, whilst doing the utmost to get other countries to pursue costly mitigation policies.

Finally

I reached the conclusion that climate mitigation will always make a nation worse off, using neoclassical graphical analysis, in October 2013.

Kevin Marshall

Australian Beer Prices set to Double Due to Global Warming?

Earlier this week Nature Plants published a new paper Decreases in global beer supply due to extreme drought and heat

The Scientific American has an article “Trouble Brewing? Climate Change Closes In on Beer Drinkers” with the sub-title “Increasing droughts and heat waves could have a devastating effect on barley stocks—and beer prices”. The Daily Mail headlines with “Worst news ever! Australian beer prices are set to DOUBLE because of global warming“. All those climate deniers in Australia have denied future generations the ability to down a few cold beers with their barbecued steaks tofu salads.

This research should be taken seriously, as it is by a crack team of experts across a number of disciplines and universities. Said Steven J. Davis of the University of California at Irvine:

The world is facing many life-threatening impacts of climate change, so people having to spend a bit more to drink beer may seem trivial by comparison. But … not having a cool pint at the end of an increasingly common hot day just adds insult to injury.

Liking the odd beer or three, I am really concerned about this prospect, so I rented the paper for 48 hours to check it out. What a sensation it is. Here are a few impressions.

Layers of Models

From the Introduction, a series of models was used.

  1. Created an extreme events severity index for barley based on extremes in historical data for 1981-2010.
  2. Plugged this into five different Earth System models for the period 2010-2099, run against different RCP scenarios, the most extreme of which shows over 5 times the warming of the 1981-2010 period. What is more, severe climate events are a non-linear function of temperature rise.
  3. Then model the impact of these severe weather events on crop yields in 34 World Regions using a “process-based crop model”.
  4. (W)e examine the effects of the resulting barley supply shocks on the supply and price of beer in each region using a global general equilibrium model (Global Trade Analysis Project model, GTAP).
  5. Finally, we compare the impacts of extreme events with the impact of changes in mean climate and test the sensitivity of our results to key sources of uncertainty, including extreme events of different severities, technology and parameter settings in the economic model.

What I found odd was that they made no allowance for increasing demand for beer over a 90-year period, despite mentioning in the second sentence that

(G)lobal demand for resource-intensive animal products (meat and dairy), processed foods and alcoholic beverages will continue to grow with rising incomes.

Extreme events – severity and frequency

As stated in point 2, the paper uses different RCP scenarios, which featured prominently in the IPCC AR5 reports of 2013 and 2014. They go from RCP2.6, the most aggressive mitigation scenario, through to RCP8.5, the non-policy scenario, which projected around 4.5C of warming from 1850-1870 through to 2100, or about 3.8C of warming from 2010 to 2090.

Figure 1 has two charts. On the left it shows that extreme events will increase in intensity with temperature. RCP2.6 will do very little, but RCP8.5 would result by the end of the century in events 6 times as intense as today. The problem is that for up to 1.5C of warming there appears to be no noticeable change whatsoever, and that is about the same amount of warming the world has experienced from 1850 to 2010 per HADCRUT4. Beyond that, things take off. How the models can empirically project well beyond known experience, for a completely different scenario, defeats me. It could be largely based on their modelling assumptions, which are in turn strongly tainted by their beliefs in CAGW. There is no reality check that their models are not falling apart, or reliant on arbitrary non-linear parameters.

The right-hand chart shows that extreme events are projected to increase in frequency as well: under RCP2.6 there is a ~4% chance of an extreme event, rising to ~31% under RCP8.5. Again, there is an issue of projecting well beyond any known range.

Fig 2: Average barley yield shocks during extreme events

The paper assumes that the current geographical distribution and area of barley cultivation is maintained. From the 1981-2010 baseline they have modelled average yield changes out to 2099 on a grid of 0.5° x 0.5° resolution, creating four colorful world maps, one for each of the four RCP emissions scenarios. At the equator, each grid cell is about 56 x 56 km, an area of around 3,100 km2, or 1,200 square miles. Of course, nearer the poles the area diminishes significantly. This is quite a fine level of detail for projections based on 30 years of data applied to radically different circumstances 90 years in the future. Map a) is for RCP8.5, where on average yields are projected to be 17% down. As Paul Homewood showed in a post on the 17th, this projected yield fall should be put in the context of a doubling of yields per hectare since the 1960s.
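The cell sizes are easy to check. A minimal sketch, assuming a spherical Earth of radius 6,371 km:

```python
import math

def cell_km(lat_deg, res_deg=0.5):
    """North-south and east-west extent of a res_deg x res_deg grid cell, in km."""
    r = 6371.0                                   # mean Earth radius, km
    ns = math.pi * r * res_deg / 180.0           # north-south extent, same at all latitudes
    ew = ns * math.cos(math.radians(lat_deg))    # east-west extent shrinks towards the poles
    return ns, ew

for lat in (0, 52, 60):
    ns, ew = cell_km(lat)
    print(f"{lat:>2}N: {ns:.0f} x {ew:.0f} km = {ns * ew:,.0f} km^2")
# At the equator a cell is ~56 x 56 km, ~3,100 km^2; at 60N the area is halved.
```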

This increase in productivity has often been solely ascribed to improvements in seed varieties (see Norman Borlaug), mechanization and the use of fertilizers. These have undoubtedly had a large part to play in the productivity improvement. But also important is that agriculture has become more intensive. Forty years ago there was a clear distinction between the intensive farming of Western Europe and the extensive farming of the North American prairies and the Russian steppes, and it was not due to better soils or climate in Western Europe. This difference can be staggering. In the Soviet Union about 30% of agricultural output came from around 1% of the available land: the plots on which workers on the state and collective farms could produce their own food and sell the surplus in the local markets.

Looking at chart a in Figure 2, there are wide variations about this average global decrease of 17%.

In North America, Montana and North Dakota have areas where barley shocks during extreme years will lead to mean yield changes over 90% higher than normal, and the surrounding areas have >50% higher than normal. But go less than 1,000 km north into Canada, to the Calgary/Saskatoon area, and there are small decreases in yields.

In Eastern Bolivia, the part due north of Paraguay, there is the biggest patch of >50% reductions in the world. Yet 500-1,000 km away there is a north-south strip (probably just 56 km wide) with less than a 5% change.

There is a similar picture in Russia. On the Kazakhstani border there are areas of >50% increases, but in a thinly populated band further north and west, running from around Kirov to southern Finland, there are massive decreases in yields.

Why, over the course of decades, those with increasing yields would not increase output, and those with decreasing yields would not switch to something else, defeats me. After all, if overall yields are decreasing due to frequent extreme weather events, the farmers would be losing money, whilst those farmers who do well when overall yields are down would be making extraordinary profits.

A Weird Economic Assumption

Building up to looking at costs, there is a strange assumption.

(A)nalysing the relative changes in shares of barley use, we find that in most cases barley-to-beer shares shrink more than barley-to-livestock shares, showing that food commodities (in this case, animals fed on barley) will be prioritized over luxuries such as beer during extreme events years.

My knowledge of farming and beer is limited, but I believe that cattle can be fed on things other than barley, for instance grass, silage and sugar beet. Beers, on the other hand, require precise quantities of barley and hops of certain grades.

Further, cattle feed is a large part of the cost of a kilo of beef or a litre of milk, but it takes only around 250-400g of malted barley to produce a litre of beer. The current wholesale price of malted barley is about £215 a tonne, or 5.4 to 8.6p a litre. About the cheapest 4% alcohol lager I can find in a local supermarket is £3.29 for 10 x 250ml bottles, or £1.32 a litre. Take off 20% VAT and excise duty and that leaves around 30p a litre for raw materials, manufacturing costs, packaging, manufacturer’s margin, transportation, supermarket’s overheads and supermarket’s margin. For comparison, four pints (2.272 litres) of fresh milk costs £1.09 in the same supermarket, working out at 48p a litre. This carries no excise duty or VAT. It might have greater costs due to refrigeration, but I would suggest it costs more to produce, and that feed is far more than 5p a litre.
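The arithmetic can be laid out explicitly. A back-of-envelope sketch using the figures quoted above; the UK beer duty rate (£19.08 per hectolitre per 1% ABV, the rate in force in 2018) is my own addition:

```python
# Barley cost per litre of lager, from the wholesale price quoted above.
barley_gbp_per_g = 215.0 / 1_000_000             # £215 a tonne of malted barley
for grams in (250, 400):
    print(f"{grams} g/litre -> {barley_gbp_per_g * grams * 100:.1f}p of barley")
# 250 g/litre -> 5.4p of barley
# 400 g/litre -> 8.6p of barley

# Where the £1.32/litre retail price of the cheapest 4% lager goes.
retail = 3.29 / 2.5                              # £3.29 for 10 x 250ml bottles, per litre
ex_vat = retail / 1.2                            # strip 20% VAT
duty = 0.1908 * 4                                # assumed duty: £19.08/hl per 1% ABV, at 4%
print(f"ex-VAT £{ex_vat:.2f}, duty £{duty:.2f}, residual £{ex_vat - duty:.2f} per litre")
```

The residual of roughly 30p a litre covers everything other than tax, so even a worst-case barley shock of a few pence moves the retail price very little.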

A reasonable 0.5 litre bottle of ale costs £1.29 to £1.80 in the supermarkets I shop in, and it is the cheapest beers that will likely suffer the biggest percentage rise from an increase in raw material prices. Due to taxation and other costs, large changes in raw material prices will have very little impact on final retail prices. Even less so in pubs, where a British pint (568ml) costs the equivalent of £4 to £7 a litre.

That is, the assumption is the opposite of what would happen in a free market. In the face of a shortage, farmers will substitute other forms of cattle feed for barley, whilst beer manufacturers will absorb the extra cost of the barley.

Disparity in Costs between Countries

The most bizarre claim in the article is contained in the central column of Figure 4, which looks at the projected increases in the cost of a 500ml bottle of beer in US dollars. Chart h shows this for the most extreme scenario, RCP8.5.

I was very surprised that a global general equilibrium model would come up with such huge disparities in costs after 90 years. After all, my understanding is that these models assume utility-maximizing consumers, profit-maximizing producers, perfect information and instantaneous adjustment. Clearly there is something very wrong with this model, so I decided to compare where I live in the UK with neighbouring Ireland.

In the UK and Ireland there are similarly high taxes on beer, with Ireland’s being slightly higher. Both countries have lots of branches of the massive discount chain Aldi, which lists some products on its websites aldi.co.uk and aldi.ie. In Ireland a 500ml can of Sainte Etienne Lager is €1.09, or €2.18 a litre, or £1.92 a litre. In the UK it is £2.59 for 4 x 440ml cans, or £1.59 a litre. The lager is about 21% more in Ireland, but the tax difference should only be about 15% on a 5% beer (Sainte Etienne is 4.8%). Aldi are not making bigger profits in Ireland; they may just have higher costs there, or lesser margins on other items. It is also comparing a single can against a multipack. So, pro-rata, the £1.80 ($2.35) bottle of beer in the UK would be about $2.70 in Ireland. Under the RCP8.5 scenario, the models predict the bottle of beer to rise by $1.90 in the UK and $4.84 in Ireland. Strip out the excise duty and VAT and the price differential goes from zero to $2.20.
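Taking the model's dollar figures at face value, the cross-border price gap it implies is easy to compute; this sketch simply works through the numbers above:

```python
# Projected RCP8.5 price gap between a UK and an Irish 500ml bottle,
# using the dollar figures quoted above.
uk_now, ie_now = 2.35, 2.70                  # pro-rata comparable bottles today
uk_rise, ie_rise = 1.90, 4.84                # model's projected rises under RCP8.5

gap_now = ie_now - uk_now
gap_2099 = (ie_now + ie_rise) - (uk_now + uk_rise)
print(f"gap today ${gap_now:.2f}, projected gap ${gap_2099:.2f}")
# gap today $0.35, projected gap $3.29

# Gross margin on a (hypothetical) 20,000-bottle container shipped to Dublin:
print(f"${gap_2099 * 20000:,.0f} extra per container")
```

No equilibrium model with free trade between two adjacent markets should sustain a gap like that.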

Now suppose you were a small beer manufacturer in England, Wales or Scotland. If beer was selling for $2.20 more in Ireland than in the UK, would you not want to stick 20,000 bottles in a container and ship it to Dublin?

If the researchers really understood the global brewing industry, they would realize that there are major brands sold across the world. Many are brewed in a number of countries to the same recipe. It is the barley that is shipped to the brewery, where equipment and techniques are identical to those in other parts of the world. The researchers seem to have failed to get away from their computer models to conduct field work in a few local bars.

What can be learnt from this?

When making projections well outside of any known range, the results must be sense-checked. Clearly, although the researchers have used an economic model, they have not understood the basics of economics. People are not dumb automatons waiting for some official to tell them to change their patterns of behavior in response to changing circumstances. They notice changes in the world around them and respond to them. A few seize the opportunities presented and can become quite wealthy as a result. Farmers have been astute enough to note mounting losses and change how and what they produce. There is also competition between regions. For example, in the 1960s Brazil produced over half the world’s coffee. The major region for production was centered around Londrina in the north of Parana state. Despite straddling the Tropic of Capricorn, every few years there would be a spring-time frost which would destroy most of the crop. By the 1990s most of the production had moved north to Minas Gerais, well out of the frost belt. The rich fertile soils around Londrina are now used for other crops, such as soya, cassava and mangoes. It was not out of human design that the movement occurred, but simply that the farmers in Minas Gerais could make bumper profits in the frost years.

The publication of this article shows up a problem with peer review. Nature Plants is basically a biology journal, so reviewers are not likely to have specialist skills in climate models or economic theory, though those selected should have experience of agricultural models. If peer review is literally that (review by peers), it will fail anyway in an inter-disciplinary subject where the participants do not have a general grounding in all the disciplines. In this paper it is not just economics that is lacking, but knowledge of product costing as well. It is academics from each of the specialisms that are required for review, not inter-disciplinary peers.

Kevin Marshall