Empirical evidence contradicts the theory that coral bleaching of the Great Barrier Reef is a result of Global Warming

At Cliscep, Jaime Jessop looks at Coral Reefs again. She quotes from

Spatial and temporal patterns of mass bleaching of corals in the Anthropocene, DOI: 10.1126/science.aan8048 (Hughes et al 2018)

The first line is 

The average surface temperature of Earth has risen by close to 1°C since the 1880s (1), and global temperatures in 2015 and 2016 were the warmest since instrumental record keeping began in the 19th century.

The global surface temperature average consists of two parts: land and ocean data. The HADCRUT4 data since 1850 is as follows.

Recent land warming is significantly greater than ocean warming. Further, in the last 50 years the warming in the tropics was slightly less than the global average, with the greatest warming being north of the tropics. Below is a split of the HADCRUT4 data into eight bands of latitude that I compiled last year.

NASA GISS have maps showing trends across the globe. The default is to compare the most recent month with the 1951-1980 average.

The largest coral reef on the planet is the Great Barrier Reef off the North East Coast of Australia. From the map the warming is -0.2 to 0.2 °C. By implication, Hughes et al are claiming that coral bleaching in the Southern Hemisphere is being caused not by local average surface temperature rise but by a global average heavily influenced by land-based Northern Hemisphere temperature rise.

However, this is only a modeled estimate of trends. Although local data trends for the sea are not readily available, Berkeley Earth does provide trends for towns on the coastline adjacent to the GBR. I have copied the trends for Cairns and Rockhampton, one located in the middle section of the GBR, the other at the Southern tip.

Cairns, in the middle of the GBR, has no warming since 1980, whilst Rockhampton has warming nearer the global average, but no warming trend from 1998 to 2013. This is consistent with the NASA GISS map.

Berkeley Earth is extremely thorough, providing the stations which make up each trend, along with their distance from the location. The raw data reveals a more complex picture. For Townsville (one-third of the way from Cairns to Rockhampton) the station list is here. Looking at the list, many of the temperature data sets are of short duration, have poor quality data (e.g. Burdekin Shire Council 4875), or have breaks in the data (e.g. Ayr, Burdekin Shire Council 4876). Another issue with respect to the Great Barrier Reef is that many are inland, so might not be a good proxy for sea surface temperatures. However, there are a couple of stations with long records near the coast that can be picked out.
Cardwell Post Office 152368 had peak temperatures in the 1970s and cooling since. Relative to other stations, BE’s algorithms estimated there was a station bias of over 0.5°C in the 1970s.

Cairns Airport 152392 (with data since 1908, twenty years before planes first flew from the site!) has cooling in the 1930s and warming from 1940 to the late 1950s. The opposite of the global averages. There are no station bias adjustments until 1950, showing that this is typical of the regional expectation. Recent warming is confined to the 1980s and a little post 2000.

These results are confined to the land. I have found two sites on the GBR that give a similar picture: Lihou Reef (17.117 S 152.002 E) and Marion Reef (19.090 S 152.386 E). Both cover fairly short periods and the quality of the data is poor, which is not surprising considering the locations. But neither shows any warming trend since the 1980s, whereas the faint grey line of the global land data does show a warming trend.

The actual temperature data of the GBR indicates that not only are average temperatures not a cause of GBR bleaching, but that calculated global average temperature trends are not replicated on the North East Australian coast. With respect to the world’s largest coral reef, the increase in coral bleaching is not empirically linked to any increase in average global temperatures.

UPDATE 11/03/19 – 20:10

Following a comment by Paul Matthews, I have found the sea surface temperature data by location. The HADSST3 data is available in 5° by 5° gridcells. From data that I downloaded last year I have extracted the gridcells for 145-150°E/10-15°S and 145-150°E/15-20°S, which cover most of the Great Barrier Reef, plus a large area besides. I have charted the annual averages alongside the HADCRUT4 global and HADSST3 ocean anomalies.

Ocean surface temperatures for the Great Barrier Reef show no warming trend at all, whilst the global averages show a quite distinct warming trend. More importantly, if coral bleaching is related to sudden increases in sea temperatures, then it is the much larger swings in the local data that matter, not the global average. Testing whether increases in temperatures are behind bleaching events requires looking for anomalous summer months in the data. Another post is required.
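As a sketch of the sort of check that follow-up post would need, the snippet below computes annual means for the two gridcells and flags summer months that are unusually warm relative to the local summer climatology. It assumes the HadSST3 anomalies have already been extracted to a CSV of monthly values; the file name and column names are my own invention.

```python
import pandas as pd

# Hypothetical extract of HadSST3 monthly anomalies for the two 5-degree
# gridcells covering the GBR: columns "year", "month", "gbr_north"
# (145-150E/10-15S) and "gbr_south" (145-150E/15-20S).
df = pd.read_csv("hadsst3_gbr_gridcells.csv")

# Annual averages, for charting against the global HADCRUT4/HadSST3 series.
annual = df.groupby("year")[["gbr_north", "gbr_south"]].mean()
print(annual.tail())

# Southern-hemisphere summer: December to February.
summer = df[df["month"].isin([12, 1, 2])]

# Flag summer months more than two standard deviations above the local
# summer mean - candidate "anomalous" months to check against bleaching years.
for cell in ["gbr_north", "gbr_south"]:
    threshold = summer[cell].mean() + 2 * summer[cell].std()
    hot = summer.loc[summer[cell] > threshold, ["year", "month", cell]]
    print(f"{cell}: unusually warm summer months")
    print(hot.to_string(index=False))
```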

The context of Jaime Jessop’s Cliscep article

After multiple comments at a blogpost by Jaime Jessop in early January 2018, Geoff M Price wrote a post at his own blog “On Coal Alarmism” on 2nd April 2018. ATTP re-posted it 11 months later on 5th March 2019. Personally I find the post, along with many of the comments, a pseudo-scientific and discriminatory attack piece. That may be the subject of another post.

Kevin Marshall

What would constitute AGW being a major problem?

Ron Clutz has an excellent post. This time he reports on A Critical Framework for Climate Change. In the post Ron looks at the Karoly/Tamblyn–Happer Dialogue on Global Warming at Best Schools, particularly at Happer’s major statement. In my opinion these dialogues are extremely useful, as (to use an old-fashioned British term) the antagonists are forced by skilled opponents to look at the issues on a level playing field. With the back and forth of the dialogue, the relative strengths and weaknesses are exposed. This enables those on the outside to compare and contrast for themselves. Further, as such dialogues never resolve anything completely, they can point to new paths for developing understanding.

Ron reprints two flow charts. Whilst the idea of showing the issues in this way to highlight them is extremely useful, I have issues with the detail.

 

In particular on the scientific question “Is it a major problem?“, I do not think the “No” answers are correct.
If there was no MWP, Roman warming, or Bronze Age warming, then this would be circumstantial evidence for current warming being human-caused. If there have been three past warming phases at about 1000, 2000 and 3000 years ago, then this is strong circumstantial evidence that current warming is at least in part due to some unknown natural or random factors. Without any past warming phases at all, it would point to the distinctive uniqueness of the current warming, but that still does not necessarily mean that it is a major net problem. There could be benefits as well as adverse consequences to warming. But given the existence of previous warming phases in many studies, and that it is only by flawed statistics that the majority of warming since 1950 can be claimed as human-caused (when there was some net warming for at least 100 years before that), the demonstrable marginal impact of human causes is far less than 100% of total warming. Further, there are issues with

(a) the quantity of emissions of a trace gas required to raise atmospheric levels by a unit amount

(b) the amount of warming from a doubling of the trace gas – climate sensitivity

(c) the time taken for rises in a trace gas to raise temperatures.

As these are all extremely difficult to measure, there is a huge range of equally valid answers. It is an example of the underdetermination of scientific theory.

At the heart of the underdetermination of scientific theory by evidence is the simple idea that the evidence available to us at a given time may be insufficient to determine what beliefs we should hold in response to it.

But even if significant human-caused warming can be established, this does not in itself constitute a major problem. Take sea-level rise. If it could be established that human-caused warming was leading to sea level rise, this may not, on a human scale, be a major problem. At current rates sea levels measured from the satellites are rising on average by 3.5mm per year. The average adjusted rise from the tide gauges is less than that – and the individual tide gauges show little or no acceleration in the last century. But even if that rate accelerated to three or four times that level, it would not be catastrophic in terms of human planning timescales.
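A rough sketch of the arithmetic (the 80-year planning horizon is my own illustrative choice, not a figure from any study):

```python
# Sea level rise over a planning horizon, at current and accelerated rates.
satellite_rate_mm = 3.5   # mm/year, current satellite-era average
horizon_years = 80        # an illustrative planning timescale

for multiple in (1, 3, 4):
    rise_m = satellite_rate_mm * multiple * horizon_years / 1000
    print(f"{multiple}x current rate: {rise_m:.2f} m over {horizon_years} years")

# Output: 0.28 m at the current rate; 0.84-1.12 m even at 3-4x that rate.
```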

The real costs to humans are expressed in values. The really large costs of climate change are based not so much on the physical side, but on implausible assumptions about the lack of human responses to ongoing changes in the environment. In economics, the neoclassical assumptions of utility maximisation and profit maximisation are replaced by the dumb actor assumption.

An extreme example I found last year. In Britain it was projected that unmitigated global warming could lead to 7000 excess heatwave deaths in the 2050s compared to today. The projection was that most of these deaths would occur among the over-75s in hospitals and care homes. The assumption was that medical professionals and care workers would carry on treating those in their care in the same way as currently, oblivious to increasing suffering and death rates.

Another extreme example from last year was an article in Nature Plants (a biology journal), Decreases in global beer supply due to extreme drought and heat. There were at least two examples of the dumb actor assumption. First was the failure of farmers to adjust output according to changing yields and hence profits. For instance, in Southern Canada (Calgary / Edmonton) barley yields under the most extreme warming scenario were projected to fall by around 10-20% by the end of the century. But in parts of Montana and North Dakota – just a few hundred miles south – they would almost double. It was assumed that farmers would continue producing at the same rates regardless, with Canadian farmers making losses and those in the Northern USA making massive windfall profits. The second was in retail. For instance, the price of a 500ml bottle of beer in Ireland was projected to increase under the most extreme scenario by $4.84, compared to $1.90 in neighbouring Britain. Given that most of the beer sold comes from the same breweries; that current retail prices in the UK and Ireland are comparable (in Ireland higher taxes mean prices up to 15% higher); that the cost of a 500ml bottle is about $2.00-$2.50 in the UK; and that there are no significant trade barriers, there is plenty of scope with even a $1.00 differential for an entrepreneur to purchase a lorry load of beer in the UK and ship it over the Irish Sea.

On the other hand, nearly all of the short-term forecasts of an emerging major problem have turned out to be false, or highly exaggerated. Examples are

  • Himalayan Glaciers will disappear by 2035
  • Up to 50% reductions in crop yields in some African Countries by 2020
  • Arctic essentially ice-free in the summer of 2013
  • Children in the UK not knowing what snow is a few years after 2000
  • In the UK after 2009, global warming will result in milder and wetter summers

Another example of the distinction between a mere difference and a major problem is the February weather. Last week the UK experienced some record high daytime maximum temperatures of 15-20°C. It was not a problem. In fact, accompanied by very little wind and clear skies, it was extremely pleasant for most people. Typical weather for the month is light rain, clouds and occasional gales. Children on half-term holidays were able to play outside, and when back in school last week many lessons were moved outdoors. Over in Los Angeles, average highs were 61°F (16°C) compared to a February average of 68°F (20°C). This created issues for the elderly staying warm, but better skiing conditions in the mountains. More a difference than a major problem.

So in summary, for AGW to be a major problem it is far from sufficient to establish that most of the global warming is human caused. It is necessary to establish that the impact of that warming is net harmful on a global scale.

Kevin Marshall

 

Two false claims on climate change by the IPPR

An IPPR report, This is a crisis: Facing up to the age of environmental breakdown, published yesterday, within a few hours received criticism from Paul Homewood at notalotofpeopleknowthat, Paul Matthews at cliscep and Andrew Montford at The GWPF. It is based on an April 2018 paper by billionaire Jeremy Grantham. The two major issues that I want to cover in this post are contained in a passage on page 13.

Climate Change : Average global surface temperature increases have accelerated, from an average of 0.007 °C per year from 1900–1950 to 0.025 °C from 1998–2016 (Grantham 2018). ……. Since 1950, the number of floods across the world has increased by 15 times, extreme temperature events by 20 times, and wildfires sevenfold (GMO analysis of EM-DAT 2018).

These two items are lifted from an April 2018 paper The Race of Our Lives Revisited by British investor Jeremy Grantham CBE. I will deal with each in turn.

Warming acceleration

The claim concerning how warming has accelerated comes from Exhibit 2 of The Race of Our Lives Revisited.

The claimed Gistemp trends are as follows

1900 to 1958  – 0.007 °C/year

1958 to 2016  – 0.015 °C/year

1998 to 2016  – 0.025 °C/year

Using the Skeptical Science trend calculator for Gistemp I get the following figures.

1900 to 1958  – 0.066 ±0.024 °C/decade

1958 to 2016  – 0.150 ±0.022 °C/decade

1998 to 2016  – 0.139 ±0.112 °C/decade

That is odd. Warming rates seem to be slightly lower for 1998-2016 compared to 1958-2016, not higher. Here is how Grantham may have derived the incorrect 1998-2016 figure.

For 1998-2016 the range of uncertainty is 0.003 to 0.025 °C/year.

It would appear that the 1900 to 1958 & 1958 to 2016 warming rates are as from the trend calculator, whilst the 1998 to 2016 warming rate of 0.025 °C/year is at the top end of the 2σ uncertainty range.
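A quick check of the arithmetic, converting the trend calculator figures from °C/decade to °C/year (my own sketch, using only the numbers quoted above):

```python
# Gistemp trends from the Skeptical Science trend calculator, in °C/decade,
# with their ±2-sigma uncertainties.
trends = {
    "1900 to 1958": (0.066, 0.024),
    "1958 to 2016": (0.150, 0.022),
    "1998 to 2016": (0.139, 0.112),
}

for period, (central, two_sigma) in trends.items():
    low, high = (central - two_sigma) / 10, (central + two_sigma) / 10
    print(f"{period}: {central / 10:.3f} °C/year ({low:.3f} to {high:.3f})")

# 1998 to 2016 gives 0.003 to 0.025 °C/year: Grantham's 0.025 °C/year
# sits at the very top of the 2-sigma uncertainty range.
```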

Credit for spotting this plausible explanation should go to Mike Jackson.

Increase in climate-related disasters since 1950

The IPPR report states

Since 1950, the number of floods across the world has increased by 15 times, extreme temperature events by 20 times, and wildfires sevenfold

Exhibit 7 of The Race of Our Lives Revisited.

The 15 times “Floods” increase is for 2001-2017 compared to 1950-1966.
The 20 times “Extreme Temperature Events” increase is for 1996-2017 compared to 1950-1972.
The 7 times “Wildfires” increase is for 1984-2017 compared to 1950-1983.

Am I alone in thinking there is something a bit odd in the statement about the increases being “since 1950”? Grantham is comparing different time periods, yet the IPPR makes it appear the starting point is a single year.

But is the increase in the data replicated in reality?

Last year I downloaded all the EM-DAT – The International Disasters Database – data from 1900 to the present day. I have classified their disaster types into four categories.

Over 40% are the “climate”-related disaster types from Grantham’s analysis. Note that the database lists the number of “occurrences” in a year. If, within a country in a year, there is more than one occurrence of a disaster type, they are lumped together.

I have split the number of occurrences by the four categories by decade. The 2010s covers only 8.5 years.

“Climate” disasters have increased in the database. Allowing for 8.5 years in the current decade, compared to 1900-1949 “Climate” disasters are 65 times more frequent. Similarly, epidemics are 47 times more frequent, geological events 16 times and “other” disasters 34 times.

Is this based on reality, or just vastly improved reporting of disasters from the 1980s? The real impacts are indicated by the numbers of reported deaths.

The number of reported disaster deaths has decreased massively compared to the early twentieth century in all four categories, despite the number of reported disasters increasing many times over. Allowing for 8.5 years in the current decade, compared to 1900-1949 “Climate” disaster deaths are down 84%. Similarly, epidemic deaths are down by 98% and “other” disaster deaths by 97%. Geological disaster deaths are, however, up by 27%. The 272,431 reported deaths in the 2010s that I have classified under “Geology” include the estimated 222,570 deaths in the 2010 Haitian Earthquake.

If one looks at the death rate per reported occurrence, “Climate” disaster death rates have declined by 97.7% between 1900-1949 and the 2010s. Due to the increase in reporting, and the more than doubling of the world population, this decline is most likely understated. 
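For anyone wanting to replicate this from an EM-DAT download, the core of the calculation is a decade-level aggregation. The sketch below assumes the occurrences have already been classified into the four categories; the file name and column names are my own invention:

```python
import pandas as pd

# Hypothetical extract of the EM-DAT download: one row per occurrence, with
# "year", "category" (Climate/Epidemic/Geology/Other) and "deaths" columns.
df = pd.read_csv("emdat_classified.csv")
df["decade"] = (df["year"] // 10) * 10

agg = df.groupby(["decade", "category"]).agg(
    occurrences=("year", "size"),
    deaths=("deaths", "sum"),
)
# Deaths per reported occurrence: the measure that falls ~97.7% for
# "Climate" disasters between 1900-1949 and the 2010s.
agg["deaths_per_occurrence"] = agg["deaths"] / agg["occurrences"]
print(agg)

# Note: the 2010s cover only 8.5 years of data, so occurrence counts need
# scaling by 10/8.5 before comparing frequencies with full decades.
```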

The Rôle of Progressives in Climate Mitigation

The IPPR describes itself as The Progressive Policy Think Tank. From the evidence of the two issues above, they have not actually thought about what they are saying. Rather, they have just copied the highly misleading data from Jeremy Grantham. There appears to be no real climate crisis emerging when one examines the available data properly. The death rate from extreme weather-related events has declined by at least 97.7% between the first half of the twentieth century and the current decade. This is a very important point for policy. Humans have adapted to the current climate conditions, just as they have reduced the impact of infectious diseases and are increasingly adapting to the impacts of earthquakes and tsunamis. If the climate becomes more extreme, or sea level rise accelerates significantly, humans will adapt as well.

There is a curious symmetry here between the perceived catastrophic problem and the perceived efficacy of the solution: that governments reduce global emissions to zero. The theory is that rising human emissions, mostly from the burning of fossil fuels, are going to cause dangerous climate change. Global emissions involve 7600 million people in nearly 200 countries. Whatever the UK does, with less than 1% of the global population and less than 1% of global emissions, makes no difference to global emissions.

Globally, there are two major reasons that reducing global emissions will fail.

First is that developing countries, with 80%+ of the global population and 65% of emissions, are specifically exempted from any obligation to reduce their emissions (see Paris Agreement Articles 2.1(a), 2.2 and 4.1). Based on the evidence of the UNEP Emissions Gap Report 2018, and from the COP24 Katowice meeting in December, there is no change of heart in prospect.

Second is that the reserves of fossil fuels, both proven and estimated, are both considerable and spread over many countries. Reducing global emissions to zero in a generation would mean leaving in the ground fossil fuels that provide a significant part of government revenue in countries such as Russia, Iran, Saudi Arabia, and Turkmenistan. Keeping some fossil fuels in the ground in the UK, Canada, Australia or the United States will increase the global prices and thus the production elsewhere.

The IPPR is promoting costly and ideological policies in the UK that will have virtually zero benefits for future generations in terms of climate catastrophes averted. In my book such policies are both regressive and authoritarian, based on a failure to understand the distinction between the real, very marginal impacts of policy and the theoretical total impacts.

If the IPPR, or even the climate academics, gave proper thought to the issue, they would conclude the correct response would be to more accurately predict the type, timing, magnitude and location of future climate catastrophes. This information would help people on the ground adapt to those circumstances. In the absence of that information, the best way of adapting to a changing climate is the same way people have always adapted to extreme events, whether weather-related or geological: through sustained long-term economic growth, in the initial stages promoted by cheap and reliable energy sources. If there is a real environmental breakdown on its way, the Progressives, with their false claims and exaggerations, are best kept well away from the scene. Their ideological beliefs render them incapable of getting a rounded perspective on the issues and the damage their policies will cause.

Kevin Marshall

Was time running out for tackling CO2 pollution in 1965?

In a recent Amicus Brief it was stated that the Petroleum Industry was told

– CO2 would cause significant warming by the year 2000.
– Time was running out to save the world’s peoples from the catastrophic consequence of pollution.

The Amicus Brief does not mention

– The Presentation covered legislative impacts on the petroleum industry in the coming year, with a recommendation to prioritize according to the “thermometer of legislative activity”.
– The underlying report was on pollution in general.
– The report concluded CO2 emissions were not controllable at local or even the national level.
– The report put off taking action on climate change to around 2000, when it hoped “countervailing changes by deliberately modifying other processes” would be possible.

The Claim

In the previous post I looked at a recent Amicus Brief that is in the public domain

In this post I look at the following statement. 

Then in 1965, API President Frank Ikard delivered a presentation at the organization’s annual meeting. Ikard informed API’s membership that President Johnson’s Science Advisory Committee had predicted that fossil fuels would cause significant global warming by the end of the century. He issued the following warning about the consequences of CO2 pollution to industry leaders:

This report unquestionably will fan emotions, raise fears, and bring demands for action. The substance of the report is that there is still time to save the world’s peoples from the catastrophic consequence of pollution, but time is running out.

The Ikard Presentation

Note 6 contains a link to the presentation 

6. Frank Ikard, Meeting the challenges of 1966, Proceedings of the American Petroleum Institute 12-15 (1965), http://www.climatefiles.com/trade-group/american-petroleuminstitute/1965-api-president-meeting-the-challenges-of-1966/.

The warning should be looked at in the context of the presentation.
– Starts with the massive increase in Bills introduced in the current Congress – more than the previous two Congresses combined.
– Government fact gathering
– Land Law Review
– Oil and Gas Taxation
– Air and Water Conservation, where the statement quoted above was made.
– Conclusion

The thrust of the presentation is about how new legislation impacts the industry. I have transcribed a long quotation from the Air and Water Conservation section, where the “time is running out” statement was made.

Air and Water Conservation
The fact that our industry will continue to be confronted with problems of air and water conservation for many years to come is demonstrated by the massive report of the Environment Pollution Panel of the President’s Science Advisory Committee, which was presented to President Johnson over the weekend.
This report unquestionably will fan emotions, raise fears and bring demands for action. The substance of the report is that there is still time to save the world’s peoples from the catastrophic consequence of pollution, but time is running out.
One of the most important predictions of the report is that carbon dioxide is being added to the earth’s atmosphere at such a rate that by the year 2000 the heat balance will be so modified as possibly to cause marked changes in climate beyond local or even national efforts. The report further states, and I quote: “… the pollution from internal combustion engines is so serious, and is growing so fast, that an alternative nonpollution means of powering automobiles, buses and trucks is likely to become a national necessity.”
The report, however, does conclude that urban air pollution, while having some unfavourable effects, has not reached the stage where the damage is as great as that associated with cigarette smoking. Furthermore, it does not find that present levels of pollution in air, water, soils and living organisms are such as to be a demonstrated cause of disease or death in people: but it is fearful of the future. As a safeguard it would attempt to assert the right of man to freedom from pollution and to deny the right of anyone to pollute air, land or water.
There are more than 100 recommendations in this sweeping report, and I commend it to your study. Implementation of even some of them will keep local, state and federal legislative bodies, as well as the petroleum and other industries, at work for generations.
The scope of our involvement is suggested, once again, by the thermometer of legislative activity this past year. On the federal level, hearings and committee meetings relating to air and water conservation were held almost continuously. The results, of course, are the Water Quality Act of 1965 and an important amendment to the Clean Air Act of 1963.

My reading is that Ikard is referring to a large report on pollution as a whole, with more than 100 recommendations, when saying “time is running out”. However, whether the following paragraph on atmospheric CO2 is related to the urgency claim will depend on whether the report treats tackling pollution from atmospheric CO2 with great urgency. Ikard commends the report for study, prioritizing by the “thermometer of legislative activity”.
Further, this Amicus Brief was submitted by a group of academics, namely Dr. Naomi Oreskes, Dr. Geoffrey Supran, Dr. Robert Brulle, Dr. Justin Farrell, Dr. Benjamin Franta and Professor Stephan Lewandowsky. When I was at University, I was taught to read the original sources. In his presentation Frank Ikard also commends listeners to study the original document. Yet the Amicus Brief contains no reference to the original document. Instead, the authors base their opinion on an initial reaction voiced just after publication.

1965 Report of the Environmental Pollution Panel 

Nowadays, mighty internet search engines can deliver now-obscure documents more quickly than a professional researcher could once have worked out where to find the relevant catalogues in a major library.
I found two sources.
First, from the same website that had the Ikard presentation – climatefiles.com.
http://www.climatefiles.com/climate-change-evidence/presidents-report-atmospher-carbon-dioxide/
As the filename indicates, it is not a copy of the full report. The contents include a letter from President Johnson; Contents; Acknowledgements; Introduction; and Appendix Y4 – Atmospheric Carbon Dioxide. Interestingly, it does not include “Climatic Effects of Pollution” on page 9.
Fortunately a full copy of the report is available at https://babel.hathitrust.org/cgi/pt?id=uc1.b4315678;view=1up;seq=5

I have screen-printed President Johnson’s letter and an extract of Page 9, with some comments.

President Johnson’s letter makes a reference to air pollution in general, but says nothing about the specific impacts of carbon dioxide on climate. Page 9 is more forthcoming.

CLIMATIC EFFECTS OF POLLUTION

Carbon dioxide is being added to the earth’s atmosphere by the burning of coal, oil and natural gas at the rate of 6 billion tons a year. By the year 2000 there will be about 25% more CO2 in our atmosphere than at present. This will modify the heat balance of the atmosphere to such an extent that marked changes in the climate, not controllable though local or even national efforts, could occur. Possibilities of bringing about countervailing changes by deliberately modifying other processes that affect climate may then be very important.

The page 9 paragraph is very short. It makes the prediction that Ikard referred to in his presentation: by 2000, there could be “marked changes in climate not controllable though local or even national efforts”. I assume there is a typo here, as “not controllable through local or even national efforts” makes more sense.
I interpret the conclusion, in more modern language, as follows:-
The earth is going to warm significantly due to fossil fuel emissions, which might cause very noticeable changes in the climate by 2000. But the United States, the world’s largest source of those emissions, cannot control those emissions. Around 2000 there might be ways of controlling the climate that will counteract the impact of the higher CO2 levels.

Concluding Comments

Based on my reading of API President Frank Ikard’s presentation, he was not warning about the consequences of CO2 emissions when he stated

This report unquestionably will fan emotions, raise fears, and bring demands for action. The substance of the report is that there is still time to save the world’s peoples from the catastrophic consequence of pollution, but time is running out.

This initial interpretation is validated by the lack of urgency the report gives to tackling the possible impacts of rising CO2 levels. Given that Ikard very clearly recommends reading the report, one would have expected, over fifty years later, a group of scholars to follow that lead before formulating an opinion.
The report is not of the opinion that “time is running out” for combating the climatic effects of carbon dioxide. It pushes taking action to beyond 2000, with action on climate seeming to be of a geo-engineering type rather than adaptation. Insofar as Ikard may have implied urgency with respect to CO2, the report flatly contradicts this.
The bigger question is why the report chose not to recommend taking urgent action at the time. This might inform why people of the time did not see rising CO2 as something on which they needed to act. It is Appendix Y4 (authored by the leading American climatologists of the time) that makes the case for the impact of CO2 and courses of action to tackle those impacts. In another post I aim to look at the report through the lens of those needing to be convinced.

Kevin Marshall

Climate Alarmism from Edward Teller in 1959

The Daily Caller had an article on 30th January SEVERAL HIGH-PROFILE ENVIROS ARE WORKING TO RESUSCITATE CALIFORNIA’S DYING CLIMATE CRUSADE

What caught my interest was the following comment

Researchers Naomi Oreskes and Geoffrey Supran were among those propping up the litigation, which seeks to hold Chevron responsible for the damage climate change has played on city infrastructure.

The link is to an Amicus Brief submitted by Dr. Naomi Oreskes, Dr. Geoffrey Supran, Dr. Robert Brulle, Dr. Justin Farrell, Dr. Benjamin Franta and Professor Stephan Lewandowsky. I looked at the Supran and Oreskes paper Assessing ExxonMobil’s Climate Change Communications (1977–2014) in a couple of posts back in September 2017. Professor Lewandowsky probably gets more mentions on this blog than any other academic.

The Introduction starts with the following allegation against Chevron

At least fifty years ago, Defendants-Appellants (hereinafter, “Defendants”) had information from their own internal research, as well as from the international scientific community, that the unabated extraction, production, promotion, and sale of their fossil fuel products would result in material dangers to the public. Defendants failed to disclose this information or take steps to protect the public. They also acted affirmatively to conceal their knowledge and discredit climate science, running misleading nationwide marketing campaigns and funding junk science to manufacture uncertainty, in direct contradiction to their own research and the actions they themselves took to protect their assets from climate change impacts such as sea level rise.

These are pretty serious allegations to make against a major corporation, so I have been reading the Amicus Brief with great interest and making notes. As an ardent climate sceptic, I started reading with trepidation. Maybe the real truth of climate denial would be starkly revealed to me. Instead, it has made very entertaining reading. After three thousand words of notes, and having only got up to 1972 in the story, I have decided to break the story into a few separate posts.

Edward Teller 1959

The Amicus Brief states

In 1959, physicist Edward Teller delivered the first warning of the dangers of global warming to the petroleum industry, at a symposium held at Columbia University to celebrate the 100th anniversary of the industry. Teller described the need to find energy sources other than fossil fuels to mitigate these dangers, stating, “a temperature rise corresponding to a 10 per cent increase in carbon dioxide will be sufficient to melt the icecap and submerge New York. All the coastal cities would be covered, and since a considerable percentage of the human race lives in coastal regions, I think that this chemical contamination is more serious than most people tend to believe.”

Edward Teller was at the height of his fame, being credited with developing the world’s first thermonuclear weapon, and he became known in the United States as “the father of the H-bomb.” At the height of the Cold War it must have been quite a coup to have one of the world’s leading physicists, and a noted anti-communist, give an address. As top executives from all the major oil companies would have been there, I am not sure they would have greeted the claims with rapturous applause. More likely they thought the Professor had caught some new religion. They might have afterwards made some inquiries. Although climatology was in its infancy, the oil majors would have had teams of geologists who could make enquiries. The geologists may have turned up the Revelle and Suess 1957 paper Carbon Dioxide Exchange Between Atmosphere and Ocean and the Question of an Increase of Atmospheric CO2 during the Past Decades, 9 Tellus 18 (1957), that is mentioned in the previous paragraph of the Amicus Brief.

Revelle and Suess state in the Introduction

(A) few percent increase in the CO2 content of the air, even if it has occurred, might not produce an observable increase in average air temperature near the ground in the face of fluctuations due to other causes. So little is known about the thermodynamics of the atmosphere that it is not certain whether or how a change in infrared back radiation from the upper air would affect the temperature near the surface. Calculations by PLASS (1956) indicate that a 10% increase in atmospheric carbon dioxide would increase the average temperature by 0.36°C. But amplifying or feed-back processes may exist such that a slight change in the character of the back radiation might have a more pronounced effect.

So some experts in the field reported that it was uncertain how much warming could occur from a small rise in CO2 levels. The only actual estimate was 0.36°C from a 10% rise. So how could that melt the icecap and flood New York? If this was the first introduction that oil executives had to the concept of CO2-induced global warming, might they have become a little on their guard about any future, more moderate, claims?

They would have been right to be uneasy. 1959 was the first full year CO2 levels were monitored at Mauna Loa, Hawaii. The mean CO2 level for that year was 315.97 ppm. The 10% increase was passed in 1987, and for 2018 the figure was 408.52 ppm, 29.3% higher. The polar icecaps are still in place. From Sea Level Info, tide gauges show linear sea level rises over the last 60 years of 7.6 inches for Washington DC, 6.9 inches for Philadelphia, and 6.6 inches for The Battery at the tip of Lower Manhattan.
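The arithmetic is simple to check; the linear scaling of Plass’s estimate to the actual rise is my own crude extrapolation, not a claim from the paper:

```python
co2_1959 = 315.97   # ppm, Mauna Loa annual mean
co2_2018 = 408.52   # ppm

rise_pct = (co2_2018 / co2_1959 - 1) * 100
print(f"CO2 rise 1959-2018: {rise_pct:.1f}%")            # 29.3%

# Plass (1956), as quoted by Revelle and Suess: a 10% CO2 rise
# gives 0.36°C of warming. Scaled linearly to the actual rise:
print(f"Implied warming: {0.36 * rise_pct / 10:.2f}°C")  # ~1.05°C
```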

The chart for The Battery, NY shows no discernible acceleration in the last 60 years, despite the acceleration in the rate of CO2 rise shown in green. It is the same for the other tide gauges.

The big question is this: 60 years later, what were the authors of the Amicus Brief thinking when they quoted such a ridiculous claim?

Kevin Marshall

East Antarctica Glacial Melting through the filter of BBC reporting

An indication of how little solid evidence there is for catastrophic anthropogenic global warming comes from a BBC story carried during the COP24 Katowice conference in December. It carried the headline “East Antarctica’s glaciers are stirring” and began

Nasa says it has detected the first signs of significant melting in a swathe of glaciers in East Antarctica.

The region has long been considered stable and unaffected by some of the more dramatic changes occurring elsewhere on the continent.

But satellites have now shown that ice streams running into the ocean along one-eighth of the eastern coastline have thinned and sped up.

If this trend continues, it has consequences for future sea levels.

There is enough ice in the drainage basins in this sector of Antarctica to raise the height of the global oceans by 28m – if it were all to melt out.

Reading this excerpt one could conclude that the drainage basins on “one-eighth of the eastern coastline” have sufficient ice to raise sea levels by 28m. But that is not the case, as the melting of all of Antarctica would only raise sea levels by around 60m. The map reproduced from NASA’s own website is copied below.

The study area is nowhere near a third or more of Antarctica. Further, although the study covers one-eighth of the eastern coastline, that is only a small fraction of the coastline of East Antarctica, which makes up two-thirds or more of the continent.

NASA does not mention the 28m of potential sea level rise in its article, only the 3 metres from the disappearance of the Totten Glacier. So how large is this catchment area? A Washington Post article from 2015 provides a map.

The upper reaches of the catchment area may include Vostok Station, known for being the location of the lowest reliably measured natural temperature on Earth, −89.2 °C (−128.6 °F). The highest temperature recorded there in over 60 years is −14.0 °C. In other words, what is being suggested is that a slight increase in ocean current temperatures will cause, through gravity, glaciers hundreds of miles long to slip into the ocean from an area covering ten times the Totten Glacier catchment.

The Guardian article of 11th December also does not mention the potential 28m of sea level rise. This looks to be an insertion by the BBC, making the significance of the NASA research appear orders of magnitude greater than the reality.

The BBC’s audio interview with Dr Catherine Walker gives some clarification of the magnitude of the detected changes. At 2.30 there is a question on the scale of the changes.

Physically the fastest changing one is Vincennes Bay, which is why we were looking at that one. And, for instance, in 2017 they changed on average about 0.5 meters a year. So that is pretty small.

Losing 0.5 metres out of hundreds of thousands of metres of length is not very significant. It mainly shows the accuracy of the measurements. Dr Walker then goes on to compare this to the Fleming Glacier in West Antarctica, which is losing about 8 meters a year. The interview continues:-

Q. But the point is that compared to 2008 there is definitely an acceleration here.
A. Yes. We have shown that looking at 2008 and today they have increased their rate of mass loss by 5 times.
Q. So it is not actually a large signal is it? How do we describe this then. Is this East Antarctica waking up? Is it going to become a West Antarctica here in a few decades time or something?
A. I think its hard, but East Antarctica given how cold it is, and it still does have that layer insulating it from warm Antarctic circumpolar current … that really eats away at West Antarctica. We’ve seen it get up under Totten, so of you know, but it is not continuous you know. Every so often it comes up and (…….) a little bit.

There is acceleration detected over a decade, but at these rates the disappearance of the glaciers would take tens or hundreds of thousands of years.
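As a rough order-of-magnitude check, taking the 0.5 m/yr figure as length lost (the glacier length is an illustrative assumption of mine, not a figure from the study):

```python
# Order-of-magnitude check: years to lose a glacier at the quoted rate.
glacier_length_m = 150_000    # assumed length, ~150 km (illustrative only)
loss_rate_m_per_year = 0.5    # the Vincennes Bay figure quoted above

print(f"~{glacier_length_m / loss_rate_m_per_year:,.0f} years")  # ~300,000
```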

Walker goes on to say that for these small changes to increase further

you would have to change the Antarctic circumpolar current significantly. But the fact that you are seeing these subtle changes I guess you could say Antarctica is waking up.
We are seeing these smaller glaciers – which couldn’t be seen before – see them also respond to the oceans. So the oceans are warming enough now to make a real difference in these small glaciers.

This last carry-away point – about glaciers smaller than Totten – is not related to the earlier comments. It is not ocean warming but movements in the warm Antarctic circumpolar current that seem to impact West Antarctica and this small section of the East Antarctic coast. That implies a heat transfer from elsewhere could be the cause as much as additional heat.

This account also misses another possible cause of the much higher rates of glacier movement in West Antarctica. It might be just a spooky coincidence, but the areas of most rapid melt seem to have volcanoes beneath them.

Yet even these small movements in glaciers should be looked at in the context of net change in ice mass. Is the mass loss from moving glaciers offset by snow accumulation?
In June 2018 Jay Zwally claimed his 2015 paper showing net mass gain in Antarctica would be confirmed in a forthcoming study. It is contentious (as is anything that contradicts the consensus). But the mainstream estimate of 7.6 mm of sea-level rise over 25 years is just 0.30mm a year. It is in East Antarctica that the difference lies.

From the Daily Caller

Zwally’s 2015 study said an isostatic adjustment of 1.6 millimeters was needed to bring satellite “gravimetry and altimetry” measurements into agreement with one another.

Shepherd’s paper cites Zwally’s 2015 study several times, but only estimates eastern Antarctic mass gains to be 5 gigatons a year — yet this estimate comes with a margin of error of 46 gigatons.

Zwally, on the other hand, claims ice sheet growth is anywhere from 50 gigatons to 200 gigatons a year.

To put this in perspective, the Shepherd study has a central estimate of 2,720 billion tonnes of ice loss in 25 years, leaving about 26,500,000 billion tonnes. That is a 0.01% reduction.
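Both figures are simple to verify:

```python
ice_loss_gt = 2_720          # Gt over 25 years, Shepherd central estimate
ice_mass_gt = 26_500_000     # Gt, approximate Antarctic ice mass

print(f"Reduction: {ice_loss_gt / ice_mass_gt * 100:.2f}%")   # ~0.01%

# The equivalent mainstream sea-level figure: 7.6 mm over 25 years.
print(f"Sea level: {7.6 / 25:.2f} mm/year")                   # 0.30 mm/year
```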

As a beancounter I prefer any study that attempts to reconcile and understand differing data sets. It is by looking at differences (whether between different data sets; different time periods; hypothesis or forecast and empirical reality; word definitions; etc.) that one can gain greater understanding of a subject, or at least start to map out the limits of one’s understanding.

On the measure of reconciliation, I tend towards the Zwally estimates with the isostatic adjustment. But the differences are so small in relation to the data issues that one can only say there is more than reasonable doubt about the claim that Antarctica lost ice mass in the last 25 years. The data issues are most clearly shown by figure 6 of Zwally et al 2015, reproduced below.

Each colour band represents 25mm per annum, whereas the isostatic adjustment is 1.6mm pa. In the later period the vast majority of Antarctica is shown as gaining ice, nearly all at 0-50mm pa. The greatest ice loss from 1992 to 2008 is from West Antarctica and around the Totten Glacier in East Antarctica. This contradicts the BBC headline “East Antarctica’s glaciers are stirring“, but not the detail of the article nor the NASA headline “More glaciers in East Antarctica are waking up“.

Concluding Comments

There are a number of concluding statements that can be made about the BBC article, along with the context of the NASA study.

  1. The implied suggestion by the BBC that recent glacier loss over a decade in part of East Antarctica could be a portent of 28m of sea level rise is gross alarmism.
  2. The BBC’s headline “East Antarctica’s glaciers are stirring” implies the melt is new to the area, but the article makes clear this is not the case.
  3. There is no evidence put forward in the BBC, or elsewhere, to demonstrate that glacier melt in Antarctica is due to increased global ocean heat content or due to average surface temperature increase. Most, or all, could be down to shifts in ocean currents and volcanic activity. 
  4. Most, or all, of any ice loss from glaciers to the oceans will be offset by ice gain elsewhere. There are likely more areas gaining ice than losing it, and overall in Antarctica there could be a net gain of ice.
  5. Although satellites can perform measurements with increasing accuracy, especially of glacier retreat and movement, the net changes in ice mass are so small that adjustment and modelling assumptions for East Antarctica can make the difference between net gain and net loss.

The NASA study of some of East Antarctica’s glaciers has to be understood in the context of when it was published. It was during the COP24 conference to control global emissions, with the supposed aim of saving the world from potentially dangerous human-caused climate change. The BBC dressed up the study to make it appear a signal of this danger, when it was a trivial, localized and likely natural example of climate variation. The prominence given to such a study indicates the lack of strong evidence for a big problem that could justify costly emissions reduction policies.

Kevin Marshall

Camp Fire California – Lessons in the Limits & Harms of Political Action

I believe that there is a prayer that politicians should adopt.
Called the Serenity Prayer and written by Reinhold Niebuhr, it begins

God grant me the serenity
to accept the things I cannot change;
courage to change the things I can;
and wisdom to know the difference.

The order is the right way round. Most “things” a politician – or even a ruling political party – cannot change. It is in identifying the things that they can change for the better that they can make a positive difference.

An example comes from the end of last month. For a few days the news in Britain was dominated by stories of the deadliest ever wildfire in California. Called the Camp Fire, it killed 86 people, destroyed around 19,000 buildings and burnt 240 square miles (62,000 ha).
CBS News 60 Minutes has this short report.
Many politicians, including Governor Brown, blamed climate change. Yet even if wildfires were solely from that cause, the ultimate cause is supposed to be global greenhouse gas emissions. As California’s emissions in 2016 were around 430 MtCO2e – about 0.8% of the global total – any Californian climate change policies will make virtually zero difference to global emissions. Even the USA’s proposed contribution of 2015 would not have made much difference, as most of the forecast drop in emissions was due to non-policy trends, not actual policies. A policy that achieves much less than a 10% real reduction, from a country with one-eighth of global emissions, is hardly going to have an impact in a period when net global emissions are increasing. That is, any mitigation policies by the State of California or the United States will have approximately zero impact on global emissions.
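The proportions are easily checked; the global total below is an approximate figure (the 53.5 GtCO2e that UNEP reports for 2017), so the percentages are rough:

```python
california_mt = 430      # MtCO2e, California's 2016 emissions
global_mt = 53_500       # MtCO2e, approximate global GHG total

print(f"California's share: {california_mt / global_mt * 100:.1f}%")  # ~0.8%

# Even a 10% cut in Californian emissions is ~0.08% of the global total -
# lost in the noise of rising global emissions.
print(f"A 10% cut: {california_mt * 0.1 / global_mt * 100:.2f}% of global")
```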
But no reasonable person would claim that it was all down to climate change, just that climate change may have made the risk of wild fires a little greater.
What are the more immediate causes of wild fires? This is what Munich Re has to say on wildfires in Southeast Australia. (Bold mine) 

The majority of bushfires in southeast Australia are caused by human activity

Bushfire is the only natural hazard in which humans have a direct influence on the hazard situation. The majority of bushfires near populated areas are the consequence of human activity. Lightning causes the smaller portion naturally. Sometimes, a carelessly discarded cigarette or a glass shard, which can focus the sun’s rays is all it takes to start a fire. Heat from motors or engines, or electric sparks from power lines and machines can ignite dry grass. Besides this accidental causes, a significant share of wildfires are started deliberately.

Humans also change the natural fire frequency and intensity. They decrease the natural fire frequency due to deliberate fire suppression near populated areas. If there is no fuel-reduction burning in forests for the purposes of fire prevention, large quantities of combustible material can accumulate at ground level.

Surface fires in these areas can become so intense due to the large amounts of fuel that they spread to the crowns of the trees and rapidly grow into a major fire. If humans had not intervened in the natural bushfire regime, more frequent low-intensity fires would have consumed the forest undergrowth and ensured that woodland grasses and scrubs do not proliferate excessively.

David Evans expands on the issue of fuel load in a 2013 article.

The immediate cause of wildfires is human. Near to people’s homes or businesses there is little that can be done to prevent fires being started, whether accidentally or deliberately.

But, as any fire safety course will explain, a fire requires heat, fuel and oxygen. A few tenths of a degree of warming is not going to increase the heat source significantly. As Munich Re explains, successful suppression of small fires, or forest management that allows dead material to accumulate, does not thin the forest, or does not create fire-breaks, will increase the continuous and rich fuel that lets fires spread. That is, the unintended consequence of certain types of forest management is to increase the risk of severe fires.

President Trump was therefore correct in blaming poor forest management for the horrific fire. The reaction from firefighters that the tweets were “demeaning” and “ill-informed” was misplaced. If bad policy contributed to the severity of a fire, then politicians should share some of the blame for the damage caused. They should not be defended by those risking their lives to limit the damage resulting from bad policies. If poor building regulations led to many deaths in a large building, then those responsible for the regulations would shoulder some of the blame for those deaths, even if an arsonist started the fire. The same applies to forests. After major disasters such as air crashes and earthquakes, regulations are often put in place to prevent future similar disasters, even when such regulations would not have prevented the actual disaster. The result of a disaster is to concentrate minds on the wider aspects and plug gaps. The usual political response, where regulations contributed to the extent of a disaster, is to shift blame elsewhere and then fix the underlying problem in a raft of mostly unnecessary regulations. President Trump broke these unwritten political rules. But the results are the same, and have occurred quite quickly.

When Trump visited the site of the Camp Fire and met with outgoing Governor Jerry Brown and Lt. Gov. Gavin Newsom, he stated on November 19th

Is it happening? Things are changing. ….. And I think, most importantly, we’re doing things about. We’re going to make it better. We’re going to make it a lot better. And it’s going to happen as quickly as it can possibly happen.

From the Daily Caller and WUWT, on December 23rd President Trump signed into law new wildfire legislation that will better allow such fire-prevention management policies. On Christmas Eve President Trump followed this up with an executive order allowing agencies to do more to prevent massive wildfires.

Returning to the Serenity Prayer: in issuing an Executive Order allowing government agencies to reduce fire risk, President Trump has done something that is within his power. GOP legislation to better enable others to carry out similar forest management policies has a slightly less direct impact. Democrats whinging about climate change goes far beyond failing to accept the things they cannot change. It blocks actions that can limit the risk and extent of wildfires, in order to maintain ineffectual and costly policies.

Kevin Marshall

The BBC’s misleading reporting of the COP 24 Katowice Agreement

As usual, the annual UNFCCC COP meeting reached an agreement after extra time, said nothing substantial, but tried to dress up the failure as something significant. The BBC’s exuberant reporting of the outcome by Matt McGrath seriously misleads readers as to the substance of the agreement when he states

The Katowice agreement aims to deliver the Paris goals of limiting global temperature rises to well below 2C.

I have written to the BBC Complaints Department asking that they make a correction. Within that letter I cite four references that demonstrate why McGrath’s statement is misleading.

First, there is Climate Action Tracker’s thermometer. I do not believe there have been any additional pledges made in the last few days that would cause CAT to lower their projection from 3°C of warming to below 2°C.
Instead I believe that the COP24 Agreement merely tries to ensure that the pledges are effectively implemented, thus ensuring 3°C of warming rather than the “current policy” 3.3°C of warming.

Second, I do not believe that additional pledges were made during the Katowice conference that will cut emissions by at least 15 GtCO2e in 2030 – the minimum difference required to be on track to stop global average temperatures exceeding 2°C. I enclose a screen shot of Climate Action Tracker’s Emissions Gap page.

For the original source, I direct readers to the UNEP Emissions Gap Report 2018, published towards the end of November. In particular, look at Figure ES.3 on page xviii. The three major points in bold of the Executive Summary (pages xiv to xvii) clarify this graphic.

Third, I also draw readers’ attention to “Table 2.1: Overview of the status and progress of G20 members, including on Cancun pledges and NDC targets” on page 9 of the full UNEP report. A screenshot (without footnotes) is shown below.

The G20 countries accounted for 78% of 2017 global GHG emissions excluding LULUCF of 49.2 GtCO2e, equivalent to 72% of total GHG emissions of 53.5 GtCO2e. It might be worth focusing on which countries have increased their pledges in the past couple of weeks. In particular, consider those countries whose INDC submission pledges of 2015 imply increases in emissions between 2015 and 2030 of at least 0.5 GtCO2e (China, India, Russia, Turkey and Indonesia in the G20, plus Pakistan, Nigeria and Vietnam outside it), as they collectively more than offset the combined potential emissions decreases of developed countries such as the USA, EU, Canada and Australia. In a previous post I graphed these proposed emissions increases in figures 2 and 3; they are reproduced below, after a quick check of the headline percentages.
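A consistency check on the two percentages above (my own arithmetic from the quoted figures):

```python
g20_share_ex_lulucf = 0.78
global_ex_lulucf_gt = 49.2   # GtCO2e, 2017 emissions excluding LULUCF
global_total_gt = 53.5       # GtCO2e, 2017 total including LULUCF

g20_gt = g20_share_ex_lulucf * global_ex_lulucf_gt
print(f"G20 emissions: {g20_gt:.1f} GtCO2e")                     # ~38.4
print(f"Share of total: {g20_gt / global_total_gt * 100:.0f}%")  # ~72%
```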

Fourth, the UNFCCC press announcement makes no mention of any major breakthrough. The only national government mentioned is that of Scotland, which provided £200,000 of additional funding. Scotland is not an independent nation recognized by the United Nations. As a part of the EU, it is not even part of a recognized nation state that makes submissions direct to the UNFCCC. The SUBMISSION BY LATVIA AND THE EUROPEAN COMMISSION ON BEHALF OF THE EUROPEAN UNION AND ITS MEMBER STATES of 6 March 2015 can be found here. By being part of the EU, in the UNFCCC Scotland is two levels below Liechtenstein or Tuvalu, despite having respectively 140 and 480 times their populations. But even if Scotland were independent of both the UK and the EU, as a nation state it would hardly seem fair that it was accorded the same voice as India or China, which each have about 250 times the population of Scotland.

In the spirit of objectivity and balance, I hope that the BBC makes the necessary correction.

Kevin Marshall

UNEP Emissions Gap Report 2018 Part 3 – UNEP tacitly admits climate mitigation is a failure

To those following the superficial political spin of climate policy, a UN organisation admitting that climate mitigation has failed may come as a surprise. Yet one does not have to go far into the new UNEP Emissions Gap Report 2018 to see that this tacit admission is clearly the case. It is contained within the three major points in the Executive Summary.

By policy failure, I mean failure to achieve a substantial global reduction in GHG emissions in the near future, even if that reduction is not in line with either the 1.5°C or 2.0°C warming objective. On this measure, the UNEP is tacitly admitting failure in the summary.
The Executive Summary of the UNEP Emissions Gap Report 2018 starts on pdf page 14 of 112, numbered page xiv.

Point 1 – Current commitments are inadequate

1. Current commitments expressed in the NDCs are inadequate to bridge the emissions gap in 2030. Technically, it is still possible to bridge the gap to ensure global warming stays well below 2°C and 1.5°C, but if NDC ambitions are not increased before 2030, exceeding the 1.5°C goal can no longer be avoided. Now more than ever, unprecedented and urgent action is required by all nations. The assessment of actions by the G20 countries indicates that this is yet to happen; in fact, global CO2 emissions increased in 2017 after three years of stagnation.

This is not a statement about a final push to get policy over the line, but a call for a complete change of direction. The tacit admission is that this is politically impossible. In the amplification it is admitted that among the G20 major economies – most of them developing countries – even the “NDC ambitions” for 2030 are not likely to be achieved. As I showed in the Part 2 post, 9 of the G20 will actually increase their emissions from 2015 to 2030 if the commitments are fully met, and the sum of the emissions increases will be greater than the emissions decreases. The exhortation for “unprecedented and urgent action” is not like Shakespeare’s Henry V rallying his men with a “once more unto the breach chaps and we will crack it”, but more like “Hey good fellows, if we are really going to breach the defenses we need to upgrade from the colorful fireworks to a few kegs of proper gunpowder, then make a few genuine sacrifices. I will be cheering you all the way from the rear.” This sentiment is contained in the following statement.

As the emissions gap assessment shows, this original level of ambition needs to be roughly tripled for the 2°C scenario and increased around fivefold for the 1.5°C scenario.

Point 2 – Emissions are increasing, not decreasing rapidly

2. Global greenhouse gas emissions show no signs of peaking. Global CO2 emissions from energy and industry increased in 2017, following a three-year period of stabilization. Total annual greenhouse gases emissions, including from land-use change, reached a record high of 53.5 GtCO2e in 2017, an increase of 0.7 GtCO2e compared with 2016. In contrast, global GHG emissions in 2030 need to be approximately 25 percent and 55 percent lower than in 2017 to put the world on a least-cost pathway to limiting global warming to 2°C and 1.5°C respectively.

In the 13 years from 2017 to 2030, global emissions need to fall by a quarter or by more than a half to achieve the respective 2°C and 1.5°C targets, yet emissions are still going up. Again, this is an admission that the progress of more than two decades of climate policy is small in relation to the steps needed to achieve anything like the desired outcome.
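In absolute terms (a minimal sketch; only the 53.5 GtCO2e 2017 total and the 25% and 55% reductions are taken from the report):

```python
# Implied 2030 emission levels from the reductions quoted in point 2
emissions_2017 = 53.5                     # GtCO2e, 2017 total incl. land-use change
target_2C  = emissions_2017 * (1 - 0.25)  # 25% lower for the 2°C pathway
target_15C = emissions_2017 * (1 - 0.55)  # 55% lower for the 1.5°C pathway

print(f"2°C pathway, 2030:   {target_2C:.1f} GtCO2e")   # ~40.1
print(f"1.5°C pathway, 2030: {target_15C:.1f} GtCO2e")  # ~24.1
```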

Point 3 – Scale of the gap in numbers

3. The gap in 2030 between emission levels under full implementation of conditional NDCs and those consistent with least-cost pathways to the 2°C target is 13 GtCO2e. If only the unconditional NDCs are implemented, the gap increases to 15 GtCO2e. The gap in the case of the 1.5°C target is 29 GtCO2e and 32 GtCO2e respectively. This gap has increased compared with 2017 as a result of the expanded and more diverse literature on 1.5°C and 2°C pathways prepared for the IPCC Special Report.

Some developing countries said they would change course conditional on massive amounts of funding. It is clear this will not be forthcoming. Fleshing out the 1.5°C target in the SR1.5 Report showed that it requires more onerous policies than previously thought. Each year UNEP produces a chart that nicely shows the scale of the problem. The 2018 version on page xviii is reproduced as figure 1.

Figure 1: The emissions gap in 2030 under the 1.5°C and 2°C scenarios, from the UNEP Emissions Gap Report 2018.
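The gap figures in point 3 can be cross-checked against the 2030 pathway levels derived earlier. A minimal sketch (the implied 2030 NDC emission levels are inferred by adding each gap back onto the pathway level; they are not quoted directly from the report):

```python
# Internal consistency of the report's gap numbers. Pathway levels
# (~40 and ~24 GtCO2e) are derived above; the gaps are as quoted in point 3.
pathway_2C, pathway_15C = 40.1, 24.1  # GtCO2e

# Implied 2030 emissions under the NDCs, inferred from each gap
print(pathway_2C + 13)    # conditional NDCs vs 2°C     -> ~53 GtCO2e
print(pathway_15C + 29)   # conditional NDCs vs 1.5°C   -> ~53 GtCO2e
print(pathway_2C + 15)    # unconditional NDCs vs 2°C   -> ~55 GtCO2e
print(pathway_15C + 32)   # unconditional NDCs vs 1.5°C -> ~56 GtCO2e
# The ~1 GtCO2e spread in the unconditional case reflects rounding
# in the published figures.
```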

The widening gap between the 1.5°C and 2°C pathways and current projected commitments over the last five reports is shown in figure 2.

This widening gap is primarily a result of recalculations of the pathways; the increase in emissions in 2017 is a secondary factor.

Conclusion

That nearly 200 nations would fail to agree to collectively and substantially reduce global emissions was obvious from the Rio Declaration in 1992, which exempted developing countries from any obligation to reduce their emissions. These developing countries now have at least four-fifths of the global population and around two-thirds of emissions. It was even more obvious from reading the Paris Agreement, with its vague aspirations. It is left to the reader to work out the implications of paragraphs like 4.1 and 4.4, which render the UNFCCC impotent in reducing emissions. The latest UNEP Emissions Gap Report presents the magnitude of the mitigation policy failure alongside very clear statements about that failure.

Kevin Marshall

Leave EU Facebook Overspending and the Brexit Result

Last week an Independent article claimed

Brexit: Leave ‘very likely’ won EU referendum due to illegal overspending, says Oxford professor’s evidence to High Court

The article began

It is “very likely” that the UK voted for Brexit because of illegal overspending by the Vote Leave campaign, according to an Oxford professor’s evidence to the High Court.

Professor Philip Howard, director of the Oxford Internet Institute, at the university, said: “My professional opinion is that it is very likely that the excessive spending by Vote Leave altered the result of the referendum.
“A swing of just 634,751 people would have been enough to secure victory for Remain.
“Given the scale of the online advertising achieved with the excess spending, combined with conservative estimates on voter modelling, I estimate that Vote Leave converted the voting intentions of over 800,000 voters in the final days of the campaign as a result of the overspend.”

Is the estimate conservative? Anthony Masters, a Statistical Ambassador for the Royal Statistical Society, questions the statistics in the Spectator. The 800,000 figure was based upon 80 million Facebook users, 10% of whom clicked on the advert; of those clicking, 10% changed their minds.
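Professor Howard’s headline figure follows from straight multiplication of those assumptions. A minimal sketch (the official referendum vote totals are added for context; they are not part of Howard’s submission):

```python
import math

# Professor Howard's claimed conversion figure, as reported
facebook_users = 80_000_000  # claimed Facebook audience
click_rate = 0.10            # claimed click-through rate
conversion = 0.10            # claimed share of clickers changing their vote

converted = facebook_users * click_rate * conversion
print(f"{converted:,.0f} voters")  # 800,000

# Context: official totals were Leave 17,410,742 vs Remain 16,141,241,
# so just over half the margin switching would have reversed the result.
margin = 17_410_742 - 16_141_241   # 1,269,501
print(math.ceil(margin / 2))       # 634,751 - the swing quoted in the evidence
```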

Masters gave some amplification in a follow-up blog post, Did Vote Leave’s overspending cause their victory?
The reasons for doubting the “conservative” figures are multiple, including:
– There were not 80 million voters on Facebook. Of the 46 million UK voters, at most 25.6 million had Facebook accounts.
– The click-through rate for ads is far less than 10%. In the UK in 2016 it was estimated at 0.5%.
– Advertising is not the sole component of campaigning. It is not even viewed as the primary one, merely bolstering other parts of a campaign through awareness and presence.
– That 10% of those seeing an advert change their minds is unlikely; the evidence points to far lower rates.
Anthony Masters concludes the Spectator piece by applying Professor Howard’s own published criteria.

Prof Howard’s 2005 book, New Media Campaigns and the Managed Citizen, also argues that we should apply a different calculation to that submitted to the High Court. His book says to apply a one per cent click-through rate, where 10 per cent “believe” what they read; and of that 10 per cent act. This ‘belief’ stage appears to have been omitted in the High Court submission’s final calculation. Using these rates, this calculation turns 25.6 million people into 2,560 changed votes – hardly enough to have swung the referendum for Leave, given that their margin of victory was over a million votes. If we share a belief in accuracy, this erroneous claim should have limited reach.
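Running Masters’ alternative calculation with the rates from Prof Howard’s own book (a sketch of the same arithmetic as quoted above):

```python
# Masters' recalculation using the rates in Howard's 2005 book
uk_facebook_voters = 25_600_000  # voters with Facebook accounts (upper bound)
click_rate = 0.01                # 1% click-through rate
believe = 0.10                   # 10% of clickers believe what they read
act = 0.10                       # 10% of believers act on it

changed_votes = uk_facebook_voters * click_rate * believe * act
print(f"{changed_votes:,.0f} changed votes")  # 2,560
```

The figure is three orders of magnitude below Howard’s 800,000, and trivial against a winning margin of over a million votes.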

There is further evidence that runs contrary to Prof Howard’s claims.

1. The Polls
To evaluate the statistical evidence for a conjecture – particularly for a contentious and opinionated issue like Brexit – I believe one needs to look at the wider context. If a Facebook campaign swung the referendum in the final few days from Remain to Leave, then there should be evidence of a swing in the polls. In the blog article Masters presented three graphs based on the polls that contradict this swing. Through the four weeks of the official campaign the Remain/Leave split was fairly consistent on a poll-of-polls basis. From analysis by pollster YouGov, the Leave share peaked on 13th June – ten days before the referendum. The third graphic, from a statistical analysis by the LSE, provides the clearest evidence.

The peak was just three days before the murder of MP Jo Cox by Thomas Mair. Jo Cox was a Remain campaigner, whilst it was reported that the attacker shouted slogans like “Britain First”. The shift in the polls could have been influenced by the glowing tributes to the murdered MP, alongside the speculation about the vile motives of a clear Vote Leave supporter. That Jo Cox’s murder should have had no influence, especially when campaigning was suspended as a result, does not seem credible.

On Twitter, Anthony Masters also pointed to a question in Lord Ashcroft’s poll carried out on the day of the referendum – How the United Kingdom voted on Thursday… and why – with a graphic showing when people had decided which way to vote. At most 16% of Leave voters made up their minds in the last few days, slightly fewer than the 18% of Remain voters who did so.

The same poll looked at the demographics.


This clearly shows the split by age group. The younger a voter, the more likely they were to vote Remain. It is not a minor relationship: 73% of 18-24s voted for Remain, whilst only around 40% of those aged 65 and over did so. Similarly, the younger a person, the greater the time spent on social media such as Facebook.

2. Voting by area
Another aspect is the geographical analysis. Using Chris Hanretty’s estimates of the EU referendum results by constituency, I concluded that the most pro-Remain areas were the centres of major cities and the university cities of Oxford, Cambridge and Bristol. This is where the most vocal people reside.

The most pro-Leave areas were in the minor towns such as Stoke and Boston. Greater Manchester provides a good snapshot of the national picture. Of its 22 constituencies, it is estimated that just three had a majority Remain vote, those central to the City of Manchester. The constituencies on the periphery voted to Leave, the strongest being to the east of Manchester, a few miles from the city centre. Manchester Central contains many of the modern flats and converted warehouses of Manchester. Manchester Withington has a preponderance of media types working at Media City for the BBC and ITV, along with education sector professionals.

The pro-Leave voters of the periphery, by contrast, are not just geographically marginalised, but often feel politically marginalised as well.

Concluding comments

Overall, Professor Howard’s claims of late Facebook posts swinging the referendum result are not credible at all. They are about as crackpot as – and contradict – the claims of Russian influence on the Brexit result.
To really understand the issues one needs to look at the data from different perspectives and in the wider context. But the more dogmatic Remainers appear to be using their power and influence – backed by scanty assertions – to thrust their dogmas onto everyone else. This is undermining academic credibility and the democratic process. By using the courts to pursue their dogmas, it also threatens to pull the legal system into the fray, potentially undermining respect for the rule of law among those on the periphery.

Kevin Marshall