Increasing Extreme Weather Events?

Over at Cliscep, Ben Pile posted Misleading Figures Behind the New Climate Economy. Ben looked at the figures behind the recent New Climate Economy Report from the Global Commission on the Economy and Climate, which claims to be

… a major international initiative to examine how countries can achieve economic growth while dealing with the risks posed by climate change. The Commission comprises former heads of government and finance ministers and leaders in the fields of economics and business, and was commissioned by seven countries – Colombia, Ethiopia, Indonesia, Norway, South Korea, Sweden and the United Kingdom – as an independent initiative to report to the international community.

In this post I will briefly look at Figure 1 from the report, re-posted by Ben Pile.

Fig 1 – Global Occurrences of Extreme Weather Events from New Economy Climate Report

Clearly these graphs seem to demonstrate a rapidly worsening situation. However, I am also aware of a report from a few years ago, authored by Indur Goklany and published by The Global Warming Policy Foundation – GLOBAL DEATH TOLL FROM EXTREME WEATHER EVENTS DECLINING

Figure 2 : From Goklany 2010 – Global Death and Death Rates Due to Extreme Weather Events, 1900–2008. Source: Goklany (2009), based on EM-DAT (2009), McEvedy and Jones (1978), and WRI (2009).

 

Note that The International Disaster Database is EM-DAT; the website is here to check. Clearly these show two very different pictures of events. The climate consensus (or climate alarmist) position is that climate change is getting much worse. The climate sceptic (or climate denier) position is that human-caused climate change is somewhat exaggerated. Is one side outright lying, or is there some truth on both sides?

Indur Goklany recognizes the issue in his report. His Figure 2, I reproduce as figure 3.

Figure 3: Average Number of Extreme Weather Events per Year by Decade, 1900–2008.  Source: Goklany (2009), based on EM-DAT (2009).

I am from a management accounting background. That means that I check my figures. This evening I registered at the EM-DAT website and downloaded the figures to verify the data. The website looks at all sorts of disaster information, not just climate information. It collates data on the occurrence and impacts of disasters worldwide.

Figure 4 : No of Climatic Occurrences per decade from EM-DAT. Note that 2010-2016 pro rata is similar to 2000-2009

The updated figures through to 2016 show that, pro rata, occurrences of climate-related events in the current decade are similar to the last decade. If one is concerned about the human impacts, deaths are more relevant.
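The pro-rata comparison used here is a simple scaling of a part-decade count up to a full decade. A minimal sketch follows; the occurrence count in the example is purely illustrative, not an EM-DAT figure:

```python
# Scale a part-decade count up to a full decade so it can be compared
# with complete decades. The EM-DAT download covers 2010-2016, i.e. seven years.
def pro_rata_decade(count, years_of_data):
    return count * 10 / years_of_data

# hypothetical example: 231 occurrences recorded over 2010-2016
print(pro_rata_decade(231, 7))  # 330.0 occurrences per full decade
```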

Figure 5 : No of Climatic Deaths per decade from EM-DAT. Note that 2010-2016 pro rata is similar to 2000-2009

This shows unprecedented flood deaths in the 1930s. Of the 163,218 flood deaths in 6 occurrences, 142,000 were due to a flood in China in 1935. Wikipedia’s Ten deadliest natural disasters since 1900 lists at No. 8 the 1935 Yangtze river flood, with 145,000 dead. At No. 1 is the 1931 China floods, with 1-4 million deaths. EM-DAT has not registered this disaster.

The decade 1970-1979 was extreme for deaths from storms: 300,000 deaths were due to a Bangladesh storm in 1970. Wikipedia’s Ten deadliest natural disasters since 1900 lists at No. 2 the 1970 Bhola cyclone, with ≥500,000 dead.

The decade 1990-1999 had a high flood death toll. Bangladesh 1991 stands out, with 138,987 dead. Wikipedia’s No. 10 is the 1991 Bangladesh cyclone, with 138,866 dead.

In the decade 2000-2009 EM-DAT records the Myanmar storm of 2008 with 138,366 dead. If Wikipedia had a top 11 deadliest natural disasters since 1900, then Cyclone Nargis of 2 May 2008 could have made the list: with the BBC’s estimate of 200,000 dead it would have qualified, but with the Red Cross estimate of 84,500 it may not have made the top 20.

This leaves a clear issue of data. The International Disaster Database will accept occurrences of disasters according to clear criteria. For the past 20-30 years disasters have been clearly recorded. The build-up of a tropical cyclone / hurricane is monitored by satellites, and film crews are on hand to televise across the world pictures of damaged buildings, dead bodies, and victims lamenting the loss of homes. As I write, Hurricane Florence is about to pound the Carolinas, and evacuations have been ordered. The Bhola Cyclone of 1970 was no doubt more ferocious and impacted a far greater number of people. But the primary reason for the extreme death toll in 1970 Bangladesh was the lack of warning and the lack of places to evacuate to. Even in The Wizard of Oz, set in 1930s America, most families had a storm cellar to shelter from a tornado. In the extreme poverty of 1970 Bangladesh there was nothing. Now, after decades of moderate growth and some rudimentary warning systems, it is unlikely that a similar storm would cause even a tenth of the death toll.

Even more significant, is that even if (as I hope) Hurricane Florence causes no deaths and limited property damage, it will be sufficiently documented to qualify for an entry on the International Disaster Database. But the quality of evidence for the 1931 China Floods, occurring in a civil war between the Communists and the Kuomintang forces, would be insufficient to qualify for entry. This is why one must be circumspect in interpreting this sort of data over periods when the quality and availability of data varies significantly. The issue I have is not with EM-DAT, but those who misinterpret the data for an ideological purpose.

Kevin Marshall

Excess Deaths from 2018 Summer Heatwaves

Last month I looked at the claims by the UK Environmental Audit Committee warning of 7,000 heat-related deaths in the 2050s, finding it was the result of making a number of untenable assumptions. Even if the forecast turned out to be true, cold deaths would still be more than five times the hot deaths. With the hottest summer since 1976, it is not surprising that there have been efforts to show there are excess heat deaths.

On the 6th August, The Daily Express headlined UK heatwave turns KILLER: 1,000 more people die this summer than average as temps soar.

Deaths were up in all seven weeks from June 2 to July 20, which saw temperatures reach as high as 95F (35C).

A total of 955 people more than the average have died in England and Wales since the summer began, according to the Office for National Statistics (ONS).

On the 3rd August the Guardian posted Deaths rose 650 above average during UK heatwave – with older people most at risk.

The height of the heatwave was from 25 June to 9 July, according to the Met Office, a run of 15 consecutive days with temperatures above 28C. The deaths registered during the weeks covering this period were 663 higher than the average for the same weeks over the previous five years, a Guardian analysis of data from the Office of National Statistics shows.

Note the Guardian’s lower figure was from a shorter time period.

I like to put figures in context, so I looked up the ONS Dataset:Deaths registered monthly in England and Wales

There they have detailed data from 2006 to July 2018. Estimating the excess deaths from these figures requires some allowance for other factors. However, some indication of excess deaths can be gleaned from taking the variation from the average. In July 2018 there were 40,624 recorded deaths, as against an average of 38,987 deaths in July in the years 2006-2018. There were therefore 1,637 deaths more than average. I have charted the variation from average for each year.
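The deviation-from-average figure is simple arithmetic on the quoted ONS numbers:

```python
# Excess July 2018 registered deaths over the 2006-2018 July average,
# using the figures quoted above from the ONS monthly deaths dataset.
july_2018 = 40624
july_avg_2006_2018 = 38987

print(july_2018 - july_avg_2006_2018)  # 1637
```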

There were above-average deaths in July 2018, but there were similar figures in the same month in 2014 and 2015. Maybe the mean July temperatures from the Central England Temperature Record show a similar variation?

Not really. July 2006 had high mean temperatures and average deaths, whilst 2015 had low mean temperatures and higher than average deaths.

There is a further element to consider. Every month so far this year has had higher than average deaths. Below I have graphed the variation by month.

January is many times more significant than July. In the first seven months of this year there were 30,000 more deaths recorded than the January-July average for 2006 to 2018. But is this primarily due to the cold start to the year followed by a barbecue summer? Looking at the variations from the average of around 300,000 deaths for the January to July period, it does not seem this is the case.

Looking at individual months, if extreme temperatures alone caused excess deaths, I would expect an even bigger peak in January 2010, when there was record cold, than this year. In January 2010 there were 48,363 recorded deaths, against 64,157 in January 2018 and a 2006-2018 average of 52,383. Clearly there is a large seasonal element to deaths, as the average for July is 39,091, around three-quarters of the January level. But discerning the temperature-related element is extremely tricky, and any estimate of excess deaths to a precise number should be treated with extreme caution.
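The seasonal ratio follows directly from the quoted averages:

```python
# Average July deaths as a share of average January deaths, 2006-2018,
# using the ONS figures quoted above.
july_avg = 39091
january_avg = 52383

print(round(july_avg / january_avg, 2))  # 0.75, i.e. about three-quarters
```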

Kevin Marshall

Milk loss yields down to heat stress

Last week, Wattsupwiththat posted “Climate Study: British Children Won’t Know What Milk Tastes Like”. Whilst I greatly admire Anthony Watts, I think this title entirely misses the point.
It refers to an article at The Conversation, “How climate change will affect dairy cows and milk production in the UK – new study”, by two authors at Aberystwyth University, West Wales. This in turn is a write-up of a PLOS One article published in May, “Spatially explicit estimation of heat stress-related impacts of climate change on the milk production of dairy cows in the United Kingdom“. The reason I disagree is that, even with very restrictive assumptions and large changes in temperature, this paper shows that the unmitigated costs of climate change are very small. The authors actually give some financial figures. Referring to the 2090s, the PLOS One abstract ends:-

In the absence of mitigation measures, estimated heat stress-related annual income loss for this region by the end of this century may reach £13.4M in average years and £33.8M in extreme years.

The introduction states

The value of UK milk production is around £4.6 billion per year, approximately 18% of gross agricultural economic output.

For the UK on average, Annual Milk Loss (AML) due to heat stress is projected to rise from 40 kg/cow to over 170 kg/cow. Based on current yields, that is a rise from 0.5% to 1.8% of yield in average years. The most extreme region is the south-east, where average AML is projected to rise from 80 kg/cow to over 320 kg/cow – that is, from 1% to 4.2% in average years. In other words, if UK dairy farmers totally ignore the issue of heat stress for decades, the industry could see average revenue losses from heat stress rise from around £23m to £85m a year. The financial losses are based on constant prices of £0.30 per litre.
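As a rough cross-check, applying the quoted loss percentages to the £4.6 billion annual value of UK milk production gives revenue losses of the same order. The percentages are from the paper; the arithmetic is mine:

```python
# Heat-stress milk loss as a share of output, applied to the quoted
# £4.6bn annual value of UK milk production (average years).
uk_milk_value_gbp = 4.6e9

for loss_share in (0.005, 0.018):  # 0.5% currently, 1.8% projected
    print(round(loss_share * uk_milk_value_gbp / 1e6, 1), "£m")
```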

With modeled estimates over very long periods, it is worth checking the assumptions.

Price per litre of milk

The losses are based upon a constant price of £0.30 a litre. But prices fluctuate according to market conditions. Data on annual average prices paid is available from AHDB Dairy, “a levy-funded, not-for-profit organisation working on behalf of Britain’s dairy farmers.” Each month since 2004, the annual average prices paid by dairies over a certain size are reported here – that is, 35-55 reported prices in any one month. I have taken the minimum and maximum prices reported in June each year, shown in Figure 1.

Even annual average milk prices fluctuate depending on market conditions. If milk production is reduced in summer months due to an unusual heat wave causing heat stress, ceteris paribus, prices will rise. It could be that a short-term reduction in supply would increase average farming profits if prices are not fixed. It is certainly not valid to assume fixed prices over many decades.

Dumb farmers

From the section in the paper “Milk loss estimation methods”:

It was assumed that temperature and relative humidity were the same for all systems, and that no mitigation practices were implemented. We also assumed that cattle were not significantly different from the current UK breed types, even though breeding for heat stress tolerance is one of the proposed measures to mitigate effects of climate change on dairy farms.

This paper is looking over 70 years into the future. If heatwaves were increasing, with yields falling and cattle suffering, is it valid to assume that farmers would ignore the problem? Would they not learn from areas with more extreme summer heatwaves elsewhere, such as central Europe? After all, in the last 70 years (since the late 1940s) breeding has increased milk yields phenomenally – from AHDB data, milk yields per cow increased 15% from 2001/2 to 2016/7 alone – so a bit of breeding to cope with heatwaves should be a minor issue.

The Conversation article acknowledges the implausibility of these assumptions in a concluding point.

These predictions assume that nothing is done to mitigate the problems of heat stress. But there are many parts of the world that are already much hotter than the UK where milk is produced, and much is known about what can be done to protect the welfare of the animals and minimise economic losses from heat stress. These range from simple adaptations, such as the providing shade, to installing fans and water misting systems.

Cattle breeding for increased heat tolerance is another potential, which could be beneficial for maintaining pasture-based systems. In addition, changing the location of farming operations is another practice used to address economic challenges worldwide.

What is not recognized here is that farmers in a competitive market have to adapt in the light of new information to stay in business. That is, the authors are telling farmers what they will already be fully aware of, to the extent that their farms conform to the average. Effectively assuming people are dumb, then telling them the obvious, is hardly going to get those people to take on board one’s viewpoints.

Certainty of global warming

The Conversation article states

Using 11 different climate projection models, and 18 different milk production models, we estimated potential milk loss from UK dairy cows as climate conditions change during the 21st century. Given this information, our final climate projection analysis suggests that average ambient temperatures in the UK will increase by up to about 3.5℃ by the end of the century.

This warming is consistent with the IPCC global average warming projections under the RCP8.5 non-mitigation scenario. There are two alternative, indeed opposite, perspectives that might lead rational decision-makers to think this quantity of warming is less than certain.

First, the mainstream media, where the message being put out is that the Paris Climate Agreement can constrain global warming to 2°C or 1.5°C above the levels of the mid-nineteenth century. With around 1°C of warming already, if it is still possible to constrain additional global warming to 0.5°C, why should one assume that 3.5°C of warming for the UK is more than a remote possibility in planning?

Second, one could look at the track record of global warming projections from the climate models. The real global warming scare kicked off with James Hansen’s testimony to Congress in 1988. Despite actual greenhouse gas emissions tracking the scenario with rapid warming, actual global warming has been most closely aligned with the scenario that assumed the impact of GHG emissions was eliminated by 2000. Now, if farming decision-makers still want to believe that emissions are the major driver of global warming, they can find plenty of excuses for the failure linked from here. But rational decision-makers tend to look at the track record, and thus take consistently failed projections with more than a pinch of salt.

Planning horizons

The Conversation article concludes

(W)e estimate that by 2100, heat stress-related annual income losses of average size dairy farms in the most affected regions may vary between £2,000-£6,000 and £6,000-£14,000 (in today’s value), in average and extreme years respectively. Armed with these figures, farmers need to begin planning for a hotter UK using cheaper, longer-term options such as planting trees or installing shaded areas.

This compares to the current UK average annual dairy farm business income of £80,000, according to the PLOS One article.

There are two sides to investment decision-making: the potential benefits – in this case avoidance of profit loss – netted against the costs. AHDB Dairy gives some figures for the average herd size in the UK: in 2017 it averaged 146 cows, almost double the 75 cows in 1996. In South East England that is potentially £41-£96 a cow, if the average herd size there is the same as the UK average. If the costs rose in a linear fashion, that would be around 50p to just over a pound per cow per year in the most extremely affected region. But the PLOS One article states that costs will rise exponentially. That means there will be no business justification for even considering heat stress for the next few decades.
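The per-cow figures follow from dividing the projected farm losses by the herd size, assuming (as above) that the South East average herd matches the 2017 UK average of 146 cows:

```python
# Per-cow share of the projected extreme-year farm losses in 2100,
# at the 2017 UK average herd size of 146 cows.
herd_size = 146

for farm_loss_gbp in (6000, 14000):  # extreme-year losses per farm
    print(round(farm_loss_gbp / herd_size))  # £ per cow
```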

For that investment to be worthwhile, it would require the annual cost of mitigating heat stress to be less than these amounts. Most crucially, rational decision-makers apply some sort of NPV calculation to investments. This includes a discount rate. If most of the costs are to be incurred decades from now – beyond the working lives of the current generation of farmers – then there is no rational reason to take into account heat stress even if global warming is certain.
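To illustrate the NPV point: a cost falling many decades out is worth very little today under any conventional discount rate. The 5% rate and the £96-per-cow extreme-year figure below are illustrative assumptions for this sketch, not figures from the paper:

```python
# Present value of a cost incurred decades in the future,
# using standard discounted-cash-flow arithmetic.
def present_value(cost, rate, years):
    return cost / (1 + rate) ** years

# illustrative: a £96/cow extreme-year loss in 2100, discounted at 5% a year
print(round(present_value(96, 0.05, 80), 2))  # under £2 per cow in today's money
```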

Summary

The paper Spatially explicit estimation of heat stress-related impacts of climate change on the milk production of dairy cows in the United Kingdom makes a number of assumptions to reach its headline conclusion of decreased milk yields due to heat stress by the end of the century. The assumption of constant prices defies the economic reality that prices fluctuate with changing supply. The assumption of dumb farmers defies the reality of a competitive market, where farmers have to respond to new information to stay in business. The assumption of 3.5°C of warming in the UK can be taken as unlikely, either from the belief that the Paris Climate Agreement will constrain further warming to 1°C or less, or from the inability of past climate projections to conform to the actual pattern of warming, which casts more than reasonable doubt on the credibility of current projections. Further, the authors seem to be unaware of the planning horizons of normal businesses. Where there will be no significant costs for decades, applying any sort of discount rate to potential investments will mean instant dismissal by the current generation of farmers of any consideration of heat stress issues at the end of the century.

Taking all these assumptions together makes one realize that it is quite dangerous for specialists in one field to take the long-range projections of climate models and apply them to their own areas, without also considering the economic and business realities.

Kevin Marshall 

UK Government Committee 7000 heat-deaths in 2050s assumes UK’s climate policies will be useless

Summary

Last week, on the day forecast to have record temperatures in the UK, the Environmental Audit Committee warned of 7,000 heat-related deaths every year in the UK by the 2050s if the Government did not act quickly. That prediction was based upon Hajat S, et al 2014. Two principal assumptions behind that prognosis did not hold at the date when the paper was submitted. First, any trend of increasing summer heatwaves in the data period of 1993 to 2006 had ended by 2012; the six following summers were distinctly mild, dull and wet. Second, based upon estimates from the extreme 2003 heatwave, most of the projected heat deaths would occur in NHS hospitals; the assumption is that health professionals in those hospitals would not only ignore the increasing death toll, but fail to take adaptive measures in response to an observed trend of ever more frequent summer heatwaves. Instead, it would require a central committee to co-ordinate the data gathering and provide the analysis. Without the politicians and bureaucrats producing reports and making recommendations, the world will collapse.
There is a third, implied, assumption in the projection. The 7,000 heat-related deaths in the 2050s assumes the complete failure of the Paris Agreement to control greenhouse emissions, let alone keep warming within any arbitrary 1.5°C or 2°C limit. That means other countries will have failed to follow Britain’s lead in reducing their emissions by 80% by 2050. The implied assumption is that the considerable costs and hardships imposed on the British people by the Climate Change Act 2008 will have been for nothing.

Announcement on the BBC

In the early morning of last Thursday – a day when there were forecasts of possible record temperatures – the BBC published a piece by Roger Harrabin “Regular heatwaves ‘will kill thousands’”, which began

The current heatwave could become the new normal for UK summers by 2040 because of climate change, MPs say.
The Environmental Audit Committee warns of 7,000 heat-related deaths every year in the UK by 2050 if the government doesn’t act quickly. 
Higher temperatures put some people at increased risk of dying from cardiac, kidney and respiratory diseases.
The MPs say ministers must act to protect people – especially with an ageing population in the UK.

I have left the link in. It is not to a Report by the EAC but to a 2014 paper mentioned once in the report. The paper is Hajat S, et al. J Epidemiol Community Health DOI: 10.1136/jech-2013-202449 “Climate change effects on human health: projections of temperature-related mortality for the UK during the 2020s, 2050s and 2080s”.

Hajat et al 2014

Unusually for a scientific paper, Hajat et al 2014 contains very clear highlighted conclusions.

What is already known on this subject

▸ Many countries worldwide experience appreciable burdens of heat-related and cold-related deaths associated with current weather patterns.

▸ Climate change will quite likely alter such risks, but details as to how remain unclear.

What this study adds

▸ Without adaptation, heat-related deaths would be expected to rise by around 257% by the 2050s from a current annual baseline of around 2000 deaths, and cold-related mortality would decline by 2% from a baseline of around 41 000 deaths.

▸ The increase in future temperature-related deaths is partly driven by expected population growth and ageing.

▸ The health protection of the elderly will be vital in determining future temperature-related health burdens.

There are two things of note. First, the current situation is viewed as static. Second, four decades from now heat-related deaths will dramatically increase without adaptation.
With Harrabin’s article there is no link to the Environmental Audit Committee’s report page, nor direct to the full report, the announcement, or even its homepage.

The key graphic in the EAC report relating to heat deaths reproduces figure 3 in the Hajat paper.

The message being put out is that, given certain assumptions, deaths from heatwaves will increase dramatically due to climate change, but cold deaths will only decline very slightly by the 2050s.
The message from the graphs is that, if the central projections are true (note the arrows for error bars), in the 2050s cold deaths will still be more than five times the heat deaths. If the desire is to minimize all temperature-related deaths, then even in the 2050s the greater emphasis still ought to be on cold deaths.
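The five-times claim can be checked directly from the Hajat figures quoted earlier: a 257% rise from a baseline of around 2,000 annual heat deaths, against a 2% fall from around 41,000 cold deaths:

```python
# Projected 2050s heat and cold deaths from the quoted baselines and changes.
heat_2050s = 2000 * (1 + 2.57)   # 257% rise -> about 7,140, the "7,000" headline
cold_2050s = 41000 * (1 - 0.02)  # 2% fall -> 40,180

print(round(cold_2050s / heat_2050s, 1))  # 5.6, i.e. more than five times
```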
The companion figure 4 of the Hajat et al 2014 should also be viewed.

Figure 4 shows that both heat and cold deaths are almost entirely an issue with the elderly, particularly the 85+ age group.
Hajat et al 2014 looks at regional data for England and Wales. There is something worthy of note in the text to Figure 1(A).

Region-specific and national-level relative risk (95% CI) of mortality due to hot weather. Daily mean temperature 93rd centiles: North East (16.6°C), North West (17.3°C), Yorks & Hum (17.5°C), East Midlands (17.8°C), West Midlands (17.7°C), East England (18.5°C), London (19.6°C), South East (18.3°C), South West (17.6°C), Wales (17.2°C).

The coldest region, the North East, has mean temperatures a full 3°C lower than London, the warmest region. Even with high climate sensitivities, the coldest region is unlikely to see temperature rises of 3°C in 50 years, making its mean temperature as high as London’s today. Similarly, London will not be as hot as Milan. Yet there would be an outcry if London had more than three times the heat deaths of Newcastle, or if Milan had more than three times the heat deaths of London. So how does Hajat et al 2014 reach these extreme conclusions?
There are a number of assumptions made, both explicit and implicit.

Assumption 1 : Population Increase

(T)otal UK population is projected to increase from 60 million in mid-2000s to 89 million by mid-2080s

By the 2050s there is roughly a 30% increase in population. Heat death rates per capita therefore show a smaller increase – around 150% over five decades – than the 257% rise in total heat deaths.

 

Assumption 2 : Lack of improvement in elderly vulnerability
Taking the Hajat et al figure 4, the relative proportions of hot and cold deaths between age bands are not assumed to change, as my little table below shows.

The same percentage changes for all three age bands I find surprising. As the population ages, I would expect the 65-74 and 75-84 age bands to become relatively healthier, continuing the trends of the last few decades. That would make them less vulnerable to temperature extremes.

Assumption 3 : Climate Sensitivities

A subset of nine regional climate model variants corresponding to climate sensitivity in the range of 2.6–4.9°C was used.

This compares with the IPCC AR5 WG1 SPM, page 16:

Equilibrium climate sensitivity is likely in the range 1.5°C to 4.5°C (high confidence)

A mid-point of 3.75°C, compared to the IPCC’s 3°C, does not make much difference over 50 years. The IPCC’s RCP8.5 unmitigated emissions growth scenario has 3.7°C (4.5-0.8) of warming from 2010 to 2100. Pro rata, the higher sensitivities give about 2.5°C of warming by the 2050s, still making mean temperatures in the North East just below that of London today.
The IPCC WG1 report was published a few months after the Hajat paper was accepted for publication. However, the ECS range of 1.5-4.5°C was unchanged from the 1979 Charney report, so there should be at least a footnote justifying the higher sensitivity. An alternative approach to these vague estimates derived from climate models is estimates derived from changes over the historical instrumental data record using energy budget models. The latest – Lewis and Curry 2018 – gives an estimate of 1.5°C. This finding from the latest research would more than halve any predicted warming to the 2050s from the Hajat paper’s central ECS estimate.
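The pro-rata warming arithmetic can be sketched as follows. Taking the 2050s to run to 2059 and scaling the century warming linearly are my simplifying assumptions:

```python
# RCP8.5 warming of 3.7C over 2010-2100, scaled linearly to the end of
# the 2050s, then scaled up for the paper's 3.75C mid-point climate
# sensitivity against the IPCC's 3C.
rcp85_2010_2100 = 3.7  # degrees C
fraction_of_century = (2059 - 2010) / (2100 - 2010)
warming_2050s = rcp85_2010_2100 * fraction_of_century * (3.75 / 3.0)

print(round(warming_2050s, 1))  # about 2.5 C
```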

Assumption 4 : Short period of temperature data

The paper examined both regional temperature data and deaths for the period 1993-2006. This 14-year period had significant heatwaves in 1995, 2003 and 2006. Climatically this is a very short period, ending a full six years before the paper was submitted.
From the Met Office Hadley Centre Central England Temperature Data I have produced the following graphic of seasonal data for 1975-2012, with 1993-2006 shaded.

Typical mean summer temperatures (JJA) were generally warmer than in both the period before and the six years after. Winter (DJF) average temperatures for 2009 to 2011 were the coldest three-year run of winters in the whole period. Is this significant?
A couple of weeks ago the GWPF drew attention to a 2012 Guardian article The shape of British summers to come?

It’s been a dull, damp few months and some scientists think we need to get used to it. Melting ice in Greenland could be bringing permanent changes to our climate
The news could be disconcerting for fans of the British summer. Because when it comes to global warming, we can forget the jolly predictions of Jeremy Clarkson and his ilk of a Mediterranean climate in which we lounge among the olive groves of Yorkshire sipping a fine Scottish champagne. The truth is likely to be much duller, and much nastier – and we have already had a taste of it. “We will see lots more floods, droughts, such as we’ve had this year in the UK,” says Peter Stott, leader of the climate change monitoring and attribution team at the Met Office. “Climate change is not a nice slow progression where the global climate warms by a few degrees. It means a much greater variability, far more extremes of weather.”

Six years of data had accrued after the end of the data period. Five months before the paper was submitted on 31/01/2013, and nine months before the revised draft was submitted, there was a completely new projection saying the opposite – duller, wetter summers rather than more extreme heatwaves.
The inclusion of the more recent available temperature data would likely have materially impacted the modelled extreme hot and cold death projections for many decades into the future.

Assumption 5 : Lack of Adaptation
The heat and cold death projections are “without adaptation”. This assumption means that over the decades people do not learn from experience, buy air conditioners, drink water or look out for the increasingly vulnerable. People basically ignore the rise in temperatures, so by the 2050s treat a heatwave of 35°C exactly the same as one of 30°C today. To put this into context, it is worth looking at another paper used in the EAC report.
Mortality in southern England during the 2003 heat wave by place of death – Kovats et al – Health Statistics Quarterly Spring 2006
The only table is reproduced below.

Over half the total deaths were in general hospitals. What does this “lack of adaptation” assumption imply about the care given by health professionals to vulnerable people in their charge? Surely, seeing rising death tolls, they would be taking action? Or do they need a political committee in Westminster, looking at data well after the event, to point out what is happening under their very noses – even when the data has been collated and analysed in such publications as the Government-run Health Statistics Quarterly? The assumption of no adaptation should have been set alongside an assumption of “adaptation after the event and full report”, with new extremes of temperature coming as a complete surprise. However, that might still be unrealistic, considering “cold deaths” are a current problem.

Assumption 6 : Complete failure of Policy
The assumption of high climate sensitivities, resulting in large rises in global average temperatures by the 2050s and 2080s, implies another assumption with political implications. The projection of 7,000 heat-related deaths assumes the complete failure of the Paris Agreement to control greenhouse emissions, let alone keep warming within any arbitrary 1.5°C or 2°C limit. The Hajat paper may not state this assumption, but by assuming increasing temperatures from rising greenhouse gas levels, it implies that no effective global climate mitigation policies have been implemented. This is a fair assumption. The UNEP Emissions Gap Report 2017 (pdf), published in October last year, is the latest attempt to estimate the scale of the policy issue. The key is the diagram reproduced below.

The aggregate impact of climate mitigation policy proposals (as interpreted by the promoters of such policies) is much closer to the non-policy baseline than to the 1.5°C or 2°C emissions pathways. That means other countries have failed to follow Britain’s lead in reducing their emissions by 80% by 2050. In its headline “Heat-related deaths set to treble by 2050 unless Govt acts”, the Environmental Audit Committee is implicitly accepting that the Paris Agreement will be a complete flop, and that the considerable costs and hardships imposed on the British people by the Climate Change Act 2008 will have been for nothing.

Concluding comments

Projections about the consequences of rising temperatures require making restrictive assumptions to achieve a result. In academic papers, some of these assumptions are explicitly stated, others are not. The assumptions are required to limit the “what-if” scenarios that are played out. The expected utility of modelled projections depends on whether the restrictive assumptions bear any relation to actual reality and empirically-verified theory. The projection of over 7,000 heat deaths in the 2050s is based upon:

(1) Population growth of 30% by the 2050s

(2) An aging population not getting healthier at any particular age

(3) Climate sensitivities higher than the consensus, and much higher than the latest data-based research findings

(4) A short period of temperature data with trends not found in the next few years of available data

(5) Complete lack of adaptation over decades – an implied insult to health professionals and carers

(6) Failure of climate mitigation policies to control the growth in temperatures.

Assumptions (2) to (5) are unrealistic, and making any of them more realistic would significantly reduce the projected number of heat deaths in the 2050s. The assumption of a complete lack of adaptation is an implied insult to the many health professionals who monitor and adapt to changing conditions. Assuming a lack of effective climate mitigation policies implies that the £319bn Britain is projected to spend on combating climate change between 2014 and 2030 is a waste of money. Based on the available data, this assumption is realistic.

Kevin Marshall

Plan B Environmental Activists deservedly lose High Court battle over Carbon Target

Breaking News

From Belfast Telegraph & itv.com and Science Matters (my bold)

Lawyers for the charity previously argued the Government should have, in light of the current scientific consensus, gone further than its original target of reducing carbon levels by 2050 to 80% of those present in 1990.

They said the decision not to amend the 2050 target put the UK in breach of its international obligations under the Paris Agreement on Climate Change and was influenced by the Government’s belief that a “more ambitious target was not feasible”.

At a hearing on July 4, Jonathan Crow QC told the court: “The Secretary of State’s belief that he needs to have regard to what is feasible, rather than what is necessary, betrays a fundamental misunderstanding of the scheme of the 2008 Act and must be quashed.

“All of the individual claimants are deeply concerned about climate change.”

The barrister argued the Secretary of State’s “continuing refusal” to amend the 2050 target means the UK is playing “Russian roulette with two bullets, instead of one”.

But, refusing permission for a full hearing, Mr Justice Supperstone said Plan B Earth’s arguments were based on an “incorrect interpretation” of the Paris Agreement.

He said: “In my view the Secretary of State was plainly entitled … to refuse to change the 2050 target at the present time.

In a previous post I wrote that

Taking court action to compel Governments to enforce the Paris Climate Agreement is against the real spirit of that Agreement. Controlling global GHG emissions consistent with 2°C, or 1.5°C, is only an aspiration, made unachievable by allowing developing countries to decide for themselves when to start reducing their emissions. ……. Governments wanting both to be players on the world stage and to serve their countries give the appearance of taking action to control emissions, whilst in substance doing very little. This is the real spirit of the Paris Climate Agreement. To take court action to compel a change of policy in the name of that Agreement should be struck out on that basis.

Now I would not claim Mr Justice Supperstone supports my particular interpretation of the Paris Agreement as an exercise in political maneuvering, allowing Governments to appear to be doing one thing whilst doing another. But we are both agreed that “Plan B Earth’s arguments were based on an ‘incorrect interpretation’ of the Paris Agreement”.

The UNFCCC PDF of the Paris Agreement is here to check. Then check it against my previous post, which argues that if the Government acted in the true spirit of the Paris Agreement, it would suspend the costly Climate Change Act 2008 and put its efforts into merely being seen to be doing something about climate change. Why?

  • China was praised for joining the emissions party by proposing to stop increasing emissions by 2030.
  • Very few of the INDC submissions will make real, large cuts in emissions.
  • The aggregate forecast impact of all the INDC submissions, if fully enacted, will see global emissions in 2030 slightly higher than today, when according to the UNEP Emissions Gap Report 2017 they need to be 30% lower in just 12 years’ time to meet the 1.5°C warming target. Paris Agreement Article 4.1 states something that is empirically incompatible with that aim.

In order to achieve the long-term temperature goal set out in Article 2, Parties aim to reach global peaking of greenhouse gas emissions as soon as possible, recognizing that peaking will take longer for developing country Parties,

  • The Paris Agreement allows “developing” countries to keep on increasing their emissions. With about two-thirds of global emissions (and over 80% of the global population) coming from these countries, the required 30% emissions cuts may not be achieved even if all the developed countries cut their emissions to zero within 12 years.
  • Nowhere does the Paris Agreement recognize the many countries that rely on fossil fuels for a large part of their national income, for instance in the Middle East and Russia. Cutting emissions to near zero by mid-century would impoverish them within a generation. Yet, with the developing countries also relying on cheap fossil fuels to promote high levels of economic growth, for political stability, and to meet the expectations of their people (e.g. Pakistan, Indonesia, India, Turkey), most of the world can carry on for decades whilst some enlightened Governments in the West damage the economic futures of their countries for appearances’ sake. Activists trying to dictate Government policy through the Courts in a supposedly democratic country ain’t going to change their minds.

Plan B have responded to the judgement. I find this statement interesting.

Tim Crosland, Director of Plan B and former government lawyer, said: ‘We are surprised and disappointed by this ruling and will be lodging an appeal.

‘We consider it clear and widely accepted that the current carbon target is not compatible with the Paris Agreement. Neither the government nor the Committee on Climate Change suggested during our correspondence with them prior to the claim that the target was compatible.

Indeed, it was only in January of this year that the Committee published a report accepting that the Paris Agreement was ‘likely to require’ a more ambitious 2050 target.

What I find interesting is that the only point a lawyer has for contradicting Mr Justice Supperstone’s statement that “Plan B Earth’s arguments were based on an ‘incorrect interpretation’ of the Paris Agreement” is a reference to a report by the Committee on Climate Change. From the CCC website:

The Committee on Climate Change (the CCC) is an independent, statutory body established under the Climate Change Act 2008.

Our purpose is to advise the UK Government and Devolved Administrations on emissions targets and report to Parliament on progress made in reducing greenhouse gas emissions and preparing for climate change.

The Committee is set up for partisan aims and, from its latest report, appears to be quite zealous in fulfilling those aims. Even as a secondary source (to a document which is easy to read) it should be treated as tainted. But I would suggest that to really understand the aims of the Paris Agreement you need to read the original and put it in the context of the global empirical and political realities. From my experience, the climate enlightened will keep on arguing for ever, and get pretty affronted when anyone tries to confront their blinkered perspectives.

Kevin Marshall

Why Plan B’s Climate Court Action should be dismissed

Summary

Taking court action to compel Governments to enforce the Paris Climate Agreement is against the real spirit of that Agreement. Controlling global GHG emissions consistent with 2°C, or 1.5°C, is only an aspiration, made unachievable by allowing developing countries to decide for themselves when to start reducing their emissions. In the foreseeable future, the aggregate impact of emissions reduction policies will fail even to reduce global emissions. Therefore, costly emissions reduction policies will always end up being net harmful to the countries where they are imposed. Governments wanting both to be players on the world stage and to serve their countries give the appearance of taking action to control emissions, whilst in substance doing very little. This is the real spirit of the Paris Climate Agreement. To take court action to compel a change of policy in the name of that Agreement should be struck out on that basis. I illustrate this with activist group Plan B’s case before the British High Court, seeking to compel the British Government to make even deeper emissions cuts than those required under the Climate Change Act 2008.

Plan B’s Case at the High Court

Last week BBC’s environment analyst Roger Harrabin reported Court action to save young from climate bill.

The campaigners – known collectively as Plan B – argue that if the UK postpones emissions cuts, the next generation will be left to pick up the bill.

It is seeking permission from a judge to launch formal legal action.

The government has promised to review its climate commitments.

A spokesperson said it was committed to tackling emissions.

But Plan B believes ministers may breach the law if they don’t cut emissions deeper – in line with an international agreement made in Paris at the end of 2015 to restrict global temperature rise to as close to 1.5C as possible.

From an obscure website crowdjustice

Plan B claim that the government is discriminating against the young by failing to cut emissions fast enough. During the hearing, they argued that the UK government’s current target of limiting global temperature rises to 2°C was not ambitious enough, and that the target ought to be lowered to 1.5°C, in line with the Paris Agreement that the UK ratified in 2015. Justice Supperstone postponed the decision until a later date.

Plan B on their own website state

Plan B is supporting the growing global movement of climate litigation, holding governments and corporations to account for climate harms, fighting for the future for all people, all animals and all life on earth.

What is the basis of discrimination?

The widely-accepted hypothesis is that unless global greenhouse gas (GHG) emissions are reduced to near zero in little more than a generation, global average temperature will rise more than 2°C above pre-industrial levels. A further hypothesis is that this in turn will cause catastrophic climate change. Both hypotheses being true gives the case for policy action. Therefore, failure to reduce global GHG emissions will imperil the young.

A further conjecture is that if all signatories to the Paris Agreement fulfil their commitments, that will be sufficient to prevent 1.5°C or 2°C of warming. There are a number of documents to consider.

First are the INDC submissions (i.e. Nation States’ communications of their intended nationally determined contributions), collected together at the UNFCCC website. Most are in English. To find a country’s submission, I suggest clicking on the relevant letter of the alphabet.

Second, to prevent my readers being sent on a wild goose chase through small-country submissions, some perspective is needed on the relative magnitude of emissions. A clear secondary source (though based only on CO2 emissions) is the BP Data Analysis Global CO2 Emissions 1965-2017. More data on GHG emissions are available from the EU Commission’s EDGAR emissions data and the World Resources Institute CAIT Climate Data Explorer.

Third is the empirical scale of the policy issue. The UNEP Emissions Gap Report 2017 (pdf), published in October last year, is the latest attempt to estimate that scale. The key is the diagram reproduced below.

The total of all commitments will still see aggregate emissions rising well into the future. So is the response somehow to persuade Nation States to strengthen their vague commitments to such an extent that aggregate emissions follow pathways sufficient to prevent 1.5°C or 2°C of warming?

The relevant way to do this ought to be through the Paris Agreement.

Fourth is the Adoption of the Paris Agreement itself, as held on the UNFCCC website (pdf).

 

Paris Agreement key points

I would draw readers to Article 2.1(a)

  • Holding the increase in the global average temperature to well below 2°C above pre-industrial levels and pursuing efforts to limit the temperature increase to 1.5°C above pre-industrial levels, recognizing that this would significantly reduce the risks and impacts of climate change;

Article 2.2

  • This Agreement will be implemented to reflect equity and the principle of common but differentiated responsibilities and respective capabilities, in the light of different national circumstances.

My interpretation is that the aggregate reduction will only be achieved if those countries that (in the light of their national circumstances) fail to follow the aggregate pathway are offset by other countries cutting their emissions by a greater amount. It is a numbers game. It is not just a case of compelling some countries to meet the 1.5°C pathway, but of compelling them to exceed it by some margin.
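The offsetting point can be made concrete with a toy calculation. The 50% target and the share of non-cutting countries below are assumed round figures for illustration, not taken from the Agreement itself:

```python
# "It is a numbers game": a toy sketch with assumed round figures.
required_total_cut = 0.50   # suppose the aggregate pathway needs a 50% cut
non_cutting_share = 0.64    # assumed share of emissions from countries making no cut
cutting_share = 1 - non_cutting_share

# The cut required from the remaining countries to hit the aggregate target:
needed_from_cutters = required_total_cut / cutting_share
print(round(needed_from_cutters, 2))  # 1.39, i.e. an impossible 139% cut
```

On these assumed shares, the countries that do cut would have to reduce emissions by more than they emit, which is the sense in which some must "exceed the pathway by some margin".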

I would also draw readers to Article 4.1

In order to achieve the long-term temperature goal set out in Article 2, Parties aim to reach global peaking of greenhouse gas emissions as soon as possible, recognizing that peaking will take longer for developing country Parties, and to undertake rapid reductions thereafter in accordance with best available science, so as to achieve a balance between anthropogenic emissions by sources and removals by sinks of greenhouse gases in the second half of this century, on the basis of equity, and in the context of sustainable development and efforts to eradicate poverty.

My reading is that any country defined as “developing” has only an aim of reducing emissions after their emissions have peaked. When they choose to do so depends on a number of criteria. There is no clear mechanism for deciding this, and no surrender of decision-making by countries to external bodies.

Implications of the Paris Agreement

Many developing countries are increasing their emissions. The Agreement does not compel them to change course in the near future. Empirically, that means that to achieve the goals, the aggregate emission reductions of countries cutting their emissions must cancel out the emissions increases in the developing countries. Using EDGAR figures for GHG emissions, and the Rio Declaration 1992 definition of developing countries (called Non-Annex countries), I estimate they accounted for 64% of global GHG emissions in 2012, the latest year available.

 

All other sources sum to 19 GtCO2e, the same as the emissions gap between the unconditional INDC case and the 1.5°C case. This presents a stark picture. Even if emissions from all other sources are eliminated by 2030, AND the developing countries do not increase their emissions to 2030, cumulative global emissions are very likely to exceed the 1.5°C and 2°C warming targets unless the developing countries reduce their emissions rapidly after 2030. That is, close down fairly new fossil fuel power stations; remove millions of cars, lorries and buses from the roads; and curb the aspirations of the emerging middle classes to improving lifestyles. The reality is quite the opposite. No new policies are on the horizon that would significantly reduce global GHG emissions, either from the developed countries in the next couple of years, or from the developing countries starting in just over a decade from now. Reading the comments in the INDC submissions (e.g. Indonesia, Pakistan, India), a major reason is that these governments are not willing to sacrifice the futures of their young by risking economic growth and political stability to cut their emissions. So rather than taking the UK Government to a UK court, Plan B should be persuading those Governments who do not share their views (most of them) of the greater importance of their case. After all, unlike proper pollution (such as smoke), it does not matter where the emissions are generated in relation to the people affected.
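The arithmetic behind this stark picture can be sketched. The 53 GtCO2e global total below is an assumed round figure for illustration, not the actual EDGAR number:

```python
# Rough illustration of the emissions arithmetic; the 53 GtCO2e total
# is an assumed round figure, not the actual EDGAR data.
total_2012 = 53.0          # GtCO2e global total, assumed
developing_share = 0.64    # Non-Annex share estimated in the text

developing = total_2012 * developing_share  # ~34 GtCO2e
all_other = total_2012 - developing         # ~19 GtCO2e, matching the gap quoted

# Even if "all other" emissions go to zero by 2030 and developing-country
# emissions merely stay flat, roughly 34 GtCO2e per year remains.
print(round(all_other, 1))  # 19.1
```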

It gets worse. It could be argued that the countries most affected by mitigation policies are not the poorest, who would see economic growth and political stability smashed, but the fossil-fuel-dependent countries. McGlade and Ekins 2015 (The geographical distribution of fossil fuels unused when limiting global warming to 2°C) estimated that to achieve even the 2°C target, 75% of proven reserves and 100% of new discoveries must be left in the ground. Using these global estimates and the BP estimates of proven fossil fuel reserves, I created the following apportionment by major country.

 

The United States has the greatest proven fossil fuel reserves in terms of potential emissions. But if one looks at fossil fuel revenues relative to GDP, it is well down the league table. To achieve emissions targets, countries such as Russia, Saudi Arabia, Kuwait, Turkmenistan, Iraq and Iran must all be persuaded to shut down their sales of fossil fuels long before the reserves are exhausted, or before the markets in developing countries dry up. To do this in a generation would decimate their economies. However, given the increase in fossil fuel usage in developing countries, and the failure of developed countries to significantly reduce emissions through policy, this hardly seems a large risk.

However, this misses the point. The spirit of the Paris Agreement is not to cut emissions, but to be seen to be doing something about climate change. For instance, China was held up by the likes of President Obama for aiming both to peak its emissions by 2030 and to reduce emissions per unit of GDP. The USA and the EU achieved the latter decades ago, so China’s commitments are little more than a business-as-usual scenario. Many other countries’ emissions reduction “targets” are attainable without much actual policy. For example, Brazil’s commitment is to “reduce greenhouse gas emissions by 43% below 2005 levels in 2030.” It sounds impressive, until one reads this comment under “Fairness and Ambition”:

Brazil’s current actions in the global effort against climate change represent one of the largest undertakings by any single country to date, having reduced its emissions by 41% (GWP-100; IPCC SAR) in 2012 in relation to 2005 levels.
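A quick check of the arithmetic implied by the pledge and this statement (indexing 2005 emissions to 1):

```python
# Brazil's INDC arithmetic, using the two percentages quoted above.
base_2005 = 1.0                        # 2005 emissions, indexed to 1
level_2012 = base_2005 * (1 - 0.41)    # already 41% below 2005 by 2012
target_2030 = base_2005 * (1 - 0.43)   # pledge: 43% below 2005 by 2030

further_cut_vs_2005 = level_2012 - target_2030      # 0.02 of the 2005 level
further_cut_vs_2012 = 1 - target_2030 / level_2012  # ~3.4% of the 2012 level

print(round(further_cut_vs_2005 * 100, 1))  # 2.0
print(round(further_cut_vs_2012 * 100, 1))  # 3.4
```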

That is, Brazil intends to reduce emissions by only a further 2% compared to 2005 levels. Very few targets are more than soft targets relative to current or projected trends. Yet the outcome of COP21 Paris enabled headlines throughout the world to proclaim that a deal had been reached “to limit global warming to ‘well below’ 2C, aiming for 1.5C”. It enables most Governments to juggle being key players on a world stage, having alarmists congratulate them on doing their bit to save the planet, whilst making sure that serving the real needs of their countries is not greatly impeded. It is mostly win-win, as long as countries do not really believe that the targets are achievable. This is where Britain has failed. Under Tony Blair, when the fever of climate alarmism was at its height, backed up by the political spin of New Labour and a Conservative opposition wanting to ditch its unelectable image, Green activists wrote the Climate Change Act 2008, with its strict targets, to be passed into law. Britain swallowed climate alarmism whole, and now, as a country that keeps its promises, is implementing useless and costly policies. But it has kept some form of moderation in policies until now. This is illustrated by a graphic from a Committee on Climate Change report last week, “Reducing UK emissions 2018 – Progress Report to Parliament” (pdf) (and referenced at cliscep).

Whilst emissions have come down in the power sector, they are flat in transport, industry and buildings. Pushing real and deep reductions in these sectors means, for young people, pushing up the costs of motoring (placing driving a car out of the reach of many), of industry (raising costs relative to other countries, especially the non-policy developing countries) and of buildings, in a country where planning laws make home-owning unaffordable for many and where the cost of renting is very high. On top of this, further savings in the power industry will become ever more costly as the law of diminishing returns sets in. Forcing more urgent policy actions will increase the financial and other burdens on the young people of today, but do virtually nothing to reach the climate aspirations of the Paris Agreement, as Britain now has less than 1% of global emissions. The Government could be forced out of political fudging to impose policies that will be net harmful to the young and future generations.

Plan B are using an extreme activist interpretation. As reported in Climate Home News after the postponement:

“The UK is not doing enough,” Tim Crosland, director of Plan B told Climate Home News. “The benchmark target is now out of place. We are arguing that it is a breach of human rights.”

The UK has committed to cut emissions by at least 80% of 1990 levels by 2050, with an aim to limit global temperature rise to 2C.

Under the 2008 Climate Change Act, the secretary can revise the target to reflect significant developments in climate change science or in international law or policy.

Plan B want to see the target lowered to be in line with 1.5C, the lower target of the Paris Agreement, which the UK ratified in 2016.

As stated, insofar as the Paris Climate Agreement is a major development of policy, it is one of appearing to do a lot whilst doing very little. On these terms, the stronger case is for repealing the Act, not strengthening its clauses.

But what if I am wrong about the Paris Agreement being just an exercise in appearances? Then it should be recognized that developing countries will only start to reduce their emissions at some time in the future. By implication, for the world to meet the 1.5°C warming limit, developed countries should be pursuing an emissions reduction pathway much steeper than the 25% reduction between 2015 and 2030 implied in the Emissions Gap Report graphic. It should be at least 50%, and nearer 100%, in the next decade. Given that the Climate Change Act was brought in so that Britain could lead the world on climate change, Plan B should be looking for a 100% reduction by the end of the year.

Kevin Marshall

 

Changing a binary climate argument into understanding the issues

Last month Geoff Chambers posted “Who’s Binary, Us or Them?”. Being at Cliscep, the question was naturally about whether sceptics or alarmists are binary in their thinking. It reminded me of something that went viral on YouTube a few years ago: Greg Craven’s The Most Terrifying Video You’ll Ever See.

To his credit, Greg Craven, in introducing his argument, recognizes both that human-caused climate change could have a trivial impact and that mitigating climate change (taking action) is costly. But for the purposes of his decision grid he side-steps these issues to set up binary positions on both. The decision is thus based on the belief that the likely consequences (costs) of catastrophic anthropogenic global warming are greater than the likely consequences (costs) of taking action. A more sophisticated statement of this came from a report commissioned in the UK to justify draconian climate action of the type Greg Craven is advocating. Sir Nicholas (now Lord) Stern’s report of 2006 separated the two concepts of warming costs and policy costs when it claimed (in the Executive Summary):

Using the results from formal economic models, the Review estimates that if we don’t act, the overall costs and risks of climate change will be equivalent to losing at least 5% of global GDP each year, now and forever. If a wider range of risks and impacts is taken into account, the estimates of damage could rise to 20% of GDP or more. In contrast, the costs of action – reducing greenhouse gas emissions to avoid the worst impacts of climate change – can be limited to around 1% of global GDP each year.

Craven has merely simplified the issue and made it more starkly binary. But Stern poses the same binary choice: between taking costly action, or suffering the much greater possible consequences. I will look at the policy issue first.

Action on Climate Change

The alleged cause of catastrophic anthropogenic global warming (CAGW) is human greenhouse gas emissions. It is not just some people’s emissions that must be reduced, but the aggregate emissions of all 7.6 billion people on the planet. Action on climate change (i.e. reducing GHG emissions to near zero) must therefore include all of the countries in which those people live. The UNFCCC, in the run-up to COP21 Paris 2015, invited countries to submit Intended Nationally Determined Contributions (INDCs). Most did so before COP21, and as at June 2018, 165 INDCs have been submitted, representing 192 countries and 96.4% of global emissions. The UNFCCC has made them available to read. So will these intentions amount to sufficient “action” to remove the risk of CAGW? Prior to COP21, the UNFCCC produced a synthesis report on the aggregate effect of the INDCs. (The link no longer works, but the main document is here.) It contained a graphic, which I have shown on multiple occasions, of the gap between policy intentions and the desired policy goals. A more recent graphic is from the UNEP Emissions Gap Report 2017, published last October.

Figure 3 : Emissions GAP estimates from the UNEP Emissions GAP Report 2017

In either policy scenario, emissions are likely to be slightly higher in 2030 than now, and increasing, whilst the policy objective is for emissions to be substantially lower than today and decreasing rapidly. Even with policy proposals fully implemented, global emissions will be at least 25% above, and possibly more than 50% above, the desired policy objectives. Thus, even if the proposed policies achieve their objectives, in Greg Craven’s terms we are left with pretty much all the possible risks of CAGW, whilst incurring some costs. But the “we” is 7.6 billion people in nearly 200 countries, whilst the real costs are being incurred by very few countries. For the United Kingdom, the Climate Change Act 2008 is placing huge costs on the British people, but future generations of Britons will receive very little or zero benefit.

Most people in the world live in poorer countries that will do nothing significant to constrain emissions growth if it conflicts with economic growth or other more immediate policy objectives. For some of the most populous developing countries, it is quite clear that achieving the policy objectives will leave emissions considerably higher than today. For instance, China’s main aims of peaking CO2 emissions around 2030 and lowering carbon emissions per unit of GDP by 60-65% in 2030 compared to 2005 could be achieved with emissions in 2030 some 20-50% higher than in 2017. India has a lesser but similar target of reducing emissions per unit of GDP by 30-35% in 2030 compared to 2005. If its ambitious economic growth targets are achieved, emissions could double in 15 years, and still be increasing past the middle of the century. Emissions in Bangladesh and Pakistan could both more than double by 2030, and continue increasing for decades after.
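Why an emissions-intensity target is compatible with rising emissions can be sketched; the 7% annual GDP growth rate below is an assumed figure for illustration, not China's actual growth path:

```python
# Emissions = GDP x (emissions per unit of GDP). An intensity cut can be
# swamped by GDP growth; the 7%/yr growth rate here is an assumption.
gdp_multiple = 1.07 ** 25      # GDP growth, 2005 -> 2030, at ~7% a year
intensity_multiple = 1 - 0.65  # intensity falls 65%, the top of China's pledge

emissions_multiple = gdp_multiple * intensity_multiple
print(round(emissions_multiple, 2))  # 1.9: emissions nearly double vs 2005
```

So even with the pledged intensity cut fully delivered, sustained rapid growth leaves absolute emissions far above the 2005 level.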

Within these four countries are over 40% of the global population. Many other countries are also likely to have emissions increasing for decades to come, particularly in Asia and Africa. Yet without them changing course global emissions will not fall.

There is another group of countries that have vested interests in obstructing emission reduction policies: the major suppliers of fossil fuels. In a letter to Nature in 2015, McGlade and Ekins (The geographical distribution of fossil fuels unused when limiting global warming to 2°C) estimate that the proven global reserves of oil, gas and coal would produce about 2900 GtCO2e. They further estimate that the “non-reserve resources” of fossil fuels represent a further 8000 GtCO2e of emissions. They estimated that to constrain warming to 2°C, 75% of proven reserves, and any future proven reserves, would need to be left in the ground. Using figures from the BP Statistical Review of World Energy 2016, I produced a rough split by major country.

Figure 4 : Fossil fuel Reserves by country, expressed in terms of potential CO2 Emissions

Activists point to the reserves in the rich countries having to be left in the ground. But in the USA, Australia, Canada and Germany, production of fossil fuels is not a major part of the economy. Ceasing production would be harmful but not devastating. One major comparison is between the USA and Russia. Gas and crude oil production volumes are similar in both countries. But the nominal GDP of the US is more than ten times that of Russia. The crude oil production of both countries in 2016 was about 550 million tonnes, or 3,900 million barrels. At $70 a barrel, that is around $275bn, equivalent to 1.3% of America’s GDP and 16% of Russia’s. In gas, prices vary, being very low in the highly competitive USA, and highly variable for Russian supply, with major supplier Gazprom acting as a discriminating monopolist. But America’s gas revenue is likely to be less than 1% of GDP, and Russia’s equivalent to 10-15%. There is even greater dependency in the countries of the Middle East. In terms of achieving emissions targets, what is being attempted is the elimination of the major source of these countries’ economic prosperity within a generation, with year-on-year contractions in fossil fuel sales volumes.
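The oil-revenue comparison can be sketched as follows. The GDP figures are round assumptions for illustration, which is why the shares come out slightly different from those quoted above:

```python
# Oil revenue as a share of GDP, USA vs Russia (round assumed figures).
barrels = 3.9e9   # ~3,900 million barrels produced in 2016, each country
price = 70.0      # assumed $ per barrel
revenue = barrels * price  # ~$273bn for each country

gdp_usa = 18.6e12     # assumed nominal 2016 GDP, ~$18.6tn
gdp_russia = 1.3e12   # assumed nominal 2016 GDP, ~$1.3tn

share_usa = revenue / gdp_usa        # ~1.5% of GDP
share_russia = revenue / gdp_russia  # ~21% of GDP
print(round(100 * share_usa, 1), round(100 * share_russia, 1))
```

The same revenue stream is an order of magnitude more important to Russia than to the USA, which is the point of the comparison.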

I propose that there are two distinct groups of countries that have a lot to lose from a global contraction in GHG emissions to near zero: the developing countries, who would have to reduce long-term economic growth, and the major fossil-fuel-dependent countries, who would lose the very foundation of their economic output within a generation. From the evidence of the INDC submissions, there is now no possibility of these countries being convinced to embrace major economic self-harm in the timescales required. The emissions targets are not going to be met. The emissions gap will not be closed to any appreciable degree.

This leaves Greg Craven’s binary decision of taking action, or not, as irrelevant. As taking action by a single country will not eliminate the risk of CAGW, pursuing aggressive climate mitigation policies will impose net harms wherever they are implemented. Further, it is not the climate activists who are making the decisions, but the policy-makers of the countries themselves. If the activists believe that others should follow another path, it is they who must make the case. To win over the policy-makers they should have sought to understand the perspectives of those countries, then persuaded them to accept their more enlightened outlook. The INDCs show that the climate activists have failed in this mission. Until such time, when activists talk about what “we” are doing to change the climate, or what “we” ought to be doing, they are not speaking for those who actually make the decisions.

But the activists have won over the United Nations, those who work for many Governments, and they dominate academia. For most countries, this puts political leaders in a quandary. To maintain good diplomatic relations with other countries, and to appear as movers on a world stage, they create for the outside world the appearance of taking significant action on climate change. On the other hand, they serve their countries by minimizing the real harms that imposing the policies would create. Any “realities” of climate change have become largely irrelevant to climate mitigation policies.

The Risks of Climate Apocalypse

Greg Craven recognized a major issue with his original video: in the shouting match over global warming, whom should you believe? In How it all Ends (which was followed up by further videos and a book) Craven believes he has the answer.

Figure 5 : Greg Craven’s “How it all Ends”

It was pointed out that the logic behind the grid is bogus. As, in Devil’s advocate guise, Craven himself says at 3:50:

Wouldn’t that grid argue for action against any possible threat, no matter how costly the action or how ridiculous the threat? Even giant mutant space hamsters? It is better to go broke building a load of rodent traps than risk the possibility of being hamster chow. So this grid is useless.

His answer is to get a sense of how likely it is that catastrophic global warming is TRUE or FALSE, given that science is always uncertain and opinions are divided.

The trick is not to look at what individual scientists are saying, but instead to look at what the professional organisations are saying. The more prestigious they are, the more weight you can give their statements, because they have got huge reputations to uphold and they don’t want to say something that later makes them look foolish. 

Craven points to the “two most respected in the world“: the National Academy of Sciences (NAS) and the American Association for the Advancement of Science (AAAS). Back in 2007 they had “both issued big statements calling for action, now, on global warming“. The crucial question for scientists (that is, people with a demonstrable expert understanding of the natural world) is not one of political advocacy, but whether their statements say there is a risk of climate apocalypse. These two bodies still have statements on climate change.

National Academy of Sciences (NAS) says

There are well-understood physical mechanisms by which changes in the amounts of greenhouse gases cause climate changes. The US National Academy of Sciences and The Royal Society produced a booklet, Climate Change: Evidence and Causes (download here), intended to be a brief, readable reference document for decision makers, policy makers, educators, and other individuals seeking authoritative information on the some of the questions that continue to be asked. The booklet discusses the evidence that the concentrations of greenhouse gases in the atmosphere have increased and are still increasing rapidly, that climate change is occurring, and that most of the recent change is almost certainly due to emissions of greenhouse gases caused by human activities.

Further climate change is inevitable; if emissions of greenhouse gases continue unabated, future changes will substantially exceed those that have occurred so far. There remains a range of estimates of the magnitude and regional expression of future change, but increases in the extremes of climate that can adversely affect natural ecosystems and human activities and infrastructure are expected.

Note that this is in conjunction with the Royal Society, which is arguably (or was) the most prestigious scientific organisation of them all. What is not said is as important as what is actually said. They are saying that there is an expectation that extremes of climate could get worse. There is nothing that solely backs up the climate apocalypse, but a range of possibilities, including changes somewhat trivial on a global scale. The statement endorses a spectrum of possible positions, which undermines the binary TRUE/FALSE framing of the decision-making.

The RS/NAS booklet has no estimates of the scale of possible climate catastrophism to be avoided. Point 19 is the closest.

Are disaster scenarios about tipping points like ‘turning off the Gulf Stream’ and release of methane from the Arctic a cause for concern?

The summary answer is

Such high-risk changes are considered unlikely in this century, but are by definition hard to predict. Scientists are therefore continuing to study the possibility of such tipping points beyond which we risk large and abrupt changes.

This appears not to support Stern’s contention that unmitigated climate change will cost at least 5% of global GDP by 2100. Another context for the back-tracking on potential catastrophism is to compare with Lenton et al 2008 – Tipping elements in the Earth’s climate system. Below is a map showing the various elements considered.

Figure 6 : Fig 1 of Lenton et al 2008, with explanatory note.

Of the 14 possible tipping elements discussed, only one makes it into the booklet six years later. Surely if the other 13 were still credible, more of them would have been included in the booklet, with less space devoted to documenting trivial historical changes.

American Association for the Advancement of Science (AAAS) has a video

Figure 7 : AAAS “What We Know – Consensus Sense” video


It starts with the 97% Consensus claims. After asking the listener how many scientists agree, Marshall Shepherd, Professor of Geography at the University of Georgia, states:

The reality is that 97% of scientists are pretty darn certain that humans are contributing to the climate change that we are seeing right now and we better do something about it soon.

There are two key papers that claimed a 97% consensus. Doran and Zimmerman 2009 asked two questions,

1. When compared with pre-1800s levels, do you think that mean global temperatures have generally risen, fallen, or remained relatively constant?

2. Do you think human activity is a significant contributing factor in changing mean global temperatures?

The second of these two questions was answered in the affirmative by 77 of 79 climate scientists. Those 79 were whittled down from the 3,146 responses received. Read the original to find out why.

Dave Burton has links to a number of sources on these studies. A relevant quote on Doran and Zimmerman is from the late Bob Carter

Both the questions that you report from Doran’s study are (scientifically) meaningless because they ask what people “think”. Science is not about opinion but about factual or experimental testing of hypotheses – in this case the hypothesis that dangerous global warming is caused by human carbon dioxide emissions.

The abstract to Cook et al. 2013 begins

We analyze the evolution of the scientific consensus on anthropogenic global warming (AGW) in the peer-reviewed scientific literature, examining 11 944 climate abstracts from 1991–2011 matching the topics ‘global climate change’ or ‘global warming’. We find that 66.4% of abstracts expressed no position on AGW, 32.6% endorsed AGW, 0.7% rejected AGW and 0.3% were uncertain about the cause of global warming. Among abstracts expressing a position on AGW, 97.1% endorsed the consensus position that humans are causing global warming. 

Expressing a position does not mean a belief; it could be an assumption. The papers were not necessarily by scientists, but merely by authors of academic papers that involved the topics ‘global climate change’ or ‘global warming’. Jose Duarte listed some of the papers that were included in the survey, along with some that were left out.

Neither paper asked a question concerning belief in future climate catastrophism. Shepherd does not make clear the scale of the climate change trends relative to the norm, so the human-caused element could be insignificant. The 97% consensus does not cover the policy claims.

The booklet is also misleading about the scale of changes. For instance, on sea-level rise it states:

Over the past two decades, sea levels have risen almost twice as fast as the average during the twentieth century.

You will get that result if you compare the tide gauge data with the two decades of satellite data. The question is whether those two sets of data are accurate. As individual tide gauges do not tend to show acceleration, and other analysts cannot find statistically significant acceleration, the claim seems unsupported.

At around 4:15 in the consensus video, AAAS CEO Alan I. Leshner says:

America’s leaders should stop debating the reality of climate change and start deciding the best solutions. Our What we Know report makes clear that climate change threatens us at every level. We can reduce the risk of global warming to protect our people, businesses and communities from harm. At every level from our personal and community health, our economy and our future as a global leader.  Understanding and managing climate change risks is an urgent problem. 

The statement is about combating the potential risks from CAGW. The global part of global warming is significant for policy. The United States’ share of global emissions is around 13%. That share has been falling, as America’s emissions have been falling while global aggregate emissions have been rising. The INDC submission for the United States aimed at getting US emissions in 2025 to 26-28% below 2005 levels, with a large part of that reduction already “achieved” when the report was published. The actual policy difference is likely to be less than 1% of global emissions. So any reduction in risks with respect to climate change seems tenuous. A consensus of the best scientific minds should have been able to work this out for themselves.

The AAAS does not give a collective expert opinion on climate catastrophism. This is shown by the inability to distinguish between banal opinions and empirical evidence for a big problem. This carries over into policy advocacy, where they fail to distinguish between the United States and the world as a whole.

Conclusions

Greg Craven’s decision-making grid is inapplicable to real world decision-making. The decision whether to take action is not a unitary one, but needs to be taken at country level. Different countries will have different perspectives on the importance of taking action on climate change relative to other issues. In the real world, the proposals for action are available. In aggregate they will not “solve” the potential risk of climate apocalypse. Whatever the actual scale of CAGW, countries who pursue expensive climate mitigation policies are likely to make their own people worse off than if they did nothing at all.

Craven’s grid assumes that the costs of the climate apocalypse are potentially far greater than the costs of action, no matter how huge. He tries to cut through the arguments by getting the opinions of the leading scientific societies. To put it mildly, they do not currently provide strong scientific evidence for a potentially catastrophic problem. The NAS / Royal Society suggest a range of possible climate change outcomes, with only vague evidence for potentially catastrophic scenarios. They do not seem to back the huge potential costs of unmitigated climate change in the Stern Review. The AAAS seems to offer vague, banal opinions in support of political advocacy, rather than the rigorous analysis based on empirical evidence that one would expect from the scientific community.

It would appear that binary thinking on both the “science” and the “policy” leads to a dead end, and to net harmful public policy.

What are the alternatives to binary thinking on climate change?

My purpose in looking at Greg Craven’s decision grid is not to destroy an alternative perspective, but to understand where the flaws are, so as to find better alternatives. As a former, slightly manic, beancounter, I would (like the Stern Review and William Nordhaus) look at translating potential CAGW into costs, but then weight those costs according to a discount rate and the strength of the evidence. In terms of policy, I would similarly look at the likely expected costs of the implemented policies against the expected harms actually foregone. As I have tried to lay out above, the costs of policy, and indeed the potential costs of climate change, are largely subjective. Further, those implementing policies might be boxed in by other priorities and various interest groups jostling for position.

But what of the expert scientist who can see the impending catastrophes to which I am blind, and against which climate mitigation will be useless? The answer is to endeavor to pin down the where, when, type and magnitude of potential changes to climate. With this information, ordinary people can adjust their plans. The challenge for those who believe there are real problems is to focus on the data from the natural world, and away from the inbuilt biases of the climate community. The most difficult part is that such methods may cost them their beliefs, status and friends.

First is to obtain some perspective. In terms of the science, it is worth looking at the broad range of different perspectives on the Philosophy of Science. The Stanford Encyclopedia of Philosophy article on the subject is long, but very up to date. In its conclusions, the references to Paul Hoyningen-Huene’s views on what sets science apart seem to offer a way out of consensus studies.

Second is to develop strategies to move away from partisan positions, using simple principles, or contrasts, that other areas use. In Fundamentals that Climate Science Ignores I list some of these.

Third, in terms of policy, it is worthwhile having a theoretical framework in which to analyze the problems. After looking at Greg Craven’s videos in 2010, I developed a graphical analysis that will be familiar to people who have studied Marshallian Supply and Demand curves or Hicksian IS-LM. It is very rough at the edges, but armed with it you will not fall into the trap of thinking, like the AAAS, that US policy will stop US-based climate change.

Fourth is to look from other perspectives. Appreciate that other people might have perspectives that you can learn from. Alternatively, they may have entrenched positions which, although you might disagree with them, you are powerless to overturn. It should then be possible to orientate yourself, whether as an individual or as part of a group, towards aims that are achievable.

Kevin Marshall

Sea Level Rise Acceleration as a sign of Impending Climate Apocalypse

Global warming alarmism first emerged in the late 1980s, three decades ago. Put very simply, the claim is that climate change, resulting from human-caused increases in trace gases, is a BIG potential problem. The BIG solution is to reduce global greenhouse gas emissions through co-ordinated global action. The actual evidence shows a curious symmetry. The proponents of alarmism have failed to show that rises in greenhouse gas levels are making a non-trivial difference on a global scale, and the aggregate impact of the policy proposals, if fully implemented, will make a trivial difference to global emissions pathways. The Adoption of the Paris Agreement communique, paragraph 17, clearly states the failure. My previous post puts forward reasons why the impact of mitigation policies will remain trivial.

In terms of an emerging large problem, the easiest impact to visualize, and the most direct impact of rising average temperatures, is rising sea levels. Rising temperatures will raise sea levels principally through meltwater from the polar ice-caps and thermal expansion of the oceans. Given that sea levels have been rising since the last ice age, if a BIG climate problem is emerging it should be detectable as accelerating sea level rise. If the alarmism is credible, then after 30 years of failure to implement the BIG solution, unrelenting increases in global emissions and decades of accelerating rises in CO2 levels, there should be a clear acceleration in the rate of sea level rise.

There is a strong debate as to whether sea-level rise is accelerating or not. Dr. Roy Spencer at WUWT makes a case for there being mild acceleration since about 1950. Based on the graph below (from Church and White 2013) he concludes:-

The bottom line is that, even if (1) we assume the Church & White tide gauge data are correct, and (2) 100% of the recent acceleration is due to humans, it leads to only 0.3 inches per decade that is our fault, a total of 2 inches since 1950.

As Judith Curry mentioned in her continuing series of posts on sea level rise, we should heed the words of the famous oceanographer, Carl Wunsch, who said,

“At best, the determination and attribution of global-mean sea-level change lies at the very edge of knowledge and technology. Both systematic and random errors are of concern, the former particularly, because of the changes in technology and sampling methods over the many decades, the latter from the very great spatial and temporal variability. It remains possible that the database is insufficient to compute mean sea-level trends with the accuracy necessary to discuss the impact of global warming, as disappointing as this conclusion may be.”

In metric, the so-called human element of 2 inches since 1950 is 5 centimetres. The total over more than 60 years is less than 15 centimetres. The time period for improving sea defences to cope with this is way beyond normal human planning horizons. Go to any coastal strip with sea defences, such as the dykes protecting much of the Netherlands, with a measure, and imagine increasing those defences by 15 centimetres.

However, a far more thorough piece is from Dave Burton (of Sealevel.info) in three comments. Below is a repost of his comments.

Agreed. On Twitter, or when sloppy and in a hurry, I say “no acceleration.” That’s shorthand for, “There’s been no significant, sustained acceleration in the rate of sea-level rise, over the last nine or more decades, detectable in the measurement data from any of the longest, highest-quality, coastal sea-level records.” Which is right.

That is true at every site with a very long, high-quality measurement record. If you do a quadratic regression over the MSL data, depending on the exact date interval you analyze, you may find either a slight acceleration or deceleration, but unless you choose a starting date prior to the late 1920s, you’ll find no practically-significant difference from perfect linearity. In fact, for the great majority of cases, the acceleration or deceleration doesn’t even manage statistical significance.

What do I mean by “practically-significant,” you might wonder? I mean that, if the acceleration or deceleration continued for a century, it wouldn’t affect sea-level by more than a few inches. That means it’s likely dwarfed by common coastal processes like vertical land motion, sedimentation, and erosion, so it is of no practical significance.

For instance, here’s one of the very best Pacific tide gauges. It is at a nearly ideal location (mid-ocean, which minimizes ENSO effects), on a very tectonically stable island, with very little vertical land motion, and a very trustworthy, 100% continuous, >113-year measurement record (1905/1 through 2018/3):

As you can see, there have been many five-year to ten-year “sloshes-up” and “sloshes-down,” but there’s been no sustained acceleration, and no apparent effect from rising CO2 levels.

The linear trend is +1.482 ±0.212 mm/year (which is perfectly typical).

Quadratic regression calculates an acceleration of -0.00539 ±0.01450 mm/yr².

The minus sign means deceleration, but it is nowhere near statistically significant.

To calculate the effect of a century of sustained acceleration on sea-level, you divide the acceleration by two, and multiply it by the number of years squared, 100² = 10,000. In this case, -0.00539/2 × 10,000 = -27 mm (about one inch).

That illustrates a rule-of-thumb that’s worth memorizing: if you see claimed sea-level acceleration or deceleration numbers on the order of 0.01 mm/yr² or less, you can stop calculating and immediately pronounce it practically insignificant, regardless of whether it is statistically significant.

However, the calculation above actually understates the effect of projecting the quadratic curve out another 100 years, compared to a linear projection, because the starting rate of SLR is wrong. On the quadratic curve, the point of “average” (linear) trend is the midpoint, not the endpoint. So to see the difference at 100 years out, between the linear and quadratic projections, we should calculate from that mid-date, rather than the current date. In this case, that adds 56.6 years, so we should multiply half the acceleration by 156.6² = 24,524.

-0.00539/2 × 24,524 = -66 mm = -2.6 inches (still of no practical significance).

Church & White have been down this “acceleration” road before. Twelve years ago they published the most famous sea-level paper of all, A 20th Century Acceleration in Global Sea-Level Rise, known everywhere as “Church & White (2006).”

It was the first study anywhere which claimed to have detected an acceleration in sea-level rise over the 20th century. Midway through the paper they finally tell us what that 20th century acceleration was:

“For the 20th century alone, the acceleration is smaller at 0.008 ± 0.008 mm/yr² (95%).”

(The paper failed to mention that all of the “20th century acceleration” which their quadratic regression detected had actually occurred prior to the 1930s, but never mind that.)

So, applying the rule-of-thumb above, the first thing you should notice is that 0.008 mm/yr² of acceleration, even if correct, is practically insignificant. It is so tiny that it just plain doesn’t matter.

In 2009 they posted on their web site a new set of averaged sea-level data, from a different set of tide gauges. But they published no paper about it, and I wondered why not. So I duplicated their 2006 paper’s analysis, using their new data, and not only did it, too, show slight deceleration after 1925, all the 20th century acceleration had gone away, too. Even for the full 20th century their data showed a slight (statistically insignificant) deceleration.

My guess is that the reason they wrote no paper about it was that the title would have had to have been something like this:

Church and White (2009), Never mind: no 20th century acceleration in global sea-level rise, after all.

There is no real disagreement between the two accounts. Roy Spencer is saying that if the Church and White paper is correct there is trivial acceleration; Dave Burton is making the more general point that there is no statistically significant acceleration or deceleration in any data set.

At Key West in low-lying Florida, the pattern of near-constant sea level rise over the past century is similar to Honolulu. The rate of rise is about 50% more, at 9 inches per century, and more in line with the long-term global average from tide gauges. Given that Hawaii is a growing volcanic island, this should not come as a surprise.
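Dave Burton’s regression and rule-of-thumb calculations above can be reproduced in a few lines. This is only a sketch: the acceleration is simply twice the quadratic coefficient of a least-squares fit to the mean sea level series, and the century effect is his a/2 × years² rule.

```python
import numpy as np

def slr_acceleration(years, msl_mm):
    """Fit msl = c2*t^2 + c1*t + c0; acceleration is twice the quadratic coefficient (mm/yr^2)."""
    c2, _, _ = np.polyfit(years, msl_mm, 2)
    return 2.0 * c2

def century_effect_mm(accel_mm_yr2, years=100):
    """Burton's rule of thumb: sea-level effect of sustained acceleration = a/2 * years^2."""
    return accel_mm_yr2 / 2.0 * years ** 2

# Check the figures quoted above:
print(century_effect_mm(-0.00539))  # Honolulu quadratic term: about -27 mm over a century
print(century_effect_mm(0.008))     # Church & White (2006): 40 mm, about 1.6 inches
```

Either result is far below the vertical land motion at many gauges, which is the point of the rule of thumb.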

I chose Key West in Florida because, supposedly by projecting from this real data and climate models, the Miami-Dade Sea Level Rise Task Force produced the following Unified Sea Level Rise Projection.

The projections of significant acceleration in the rate of sea level rise are at odds with the historical data, and any acceleration should already be discernible, as the projection period includes over two decades of actual data. Further, as the IPCC AR5 RCP8.5 scenario is the projection without climate mitigation policy, the implied assumption of this report on adapting to climate change is that climate mitigation policies will be completely useless. As this graphic is central to the report, it would appear that the most biased projections are influencing public policy. Basic validation of theory against modelled trends in the peer-reviewed literature (Dr Roy Spencer) or against actual measured data (Dave Burton) appears to have been rejected in favour of beliefs in the mainstream climate consensus.

The curious symmetry of climate alarmism, between the evidence for a BIG potential climate problem and the lack of an agreed BIG mitigation policy solution, is evident in sea level rise projections. Unfortunately, given that policy is based on these ridiculous projections, it is people outside of the consensus who will suffer. Expensive and unnecessary flood defences will be built, and low-lying areas will be blighted by alarmist reports.


Kevin Marshall


Charles Moore nearly gets Climate Change Politics post Paris Agreement

Charles Moore of the Telegraph has long been one of the towering figures of the mainstream media. In Donald Trump has the courage and wit to look at ‘green’ hysteria and say: no deal (see also at GWPF, Notalotofpeopleknowthat and Tallbloke) he understands not only the impact of Trump withdrawing from the climate agreement on future global emissions, but also recognizes that two other major developed countries – Germany and Japan – whilst committed to reducing their emissions and spending lots of money on renewables, are also investing heavily in coal. So without climate policy, the United States is reducing its emissions, while with climate commitments, Japan and Germany are increasing theirs. However, there is one slight inaccuracy in Charles Moore’s account. He states:

As for “Paris”, this is failing, chiefly for the reason that poorer countries won’t decarbonise unless richer ones pay them stupendous sums.

It is worse than this. Many of the poorer countries have not said they will decarbonize. Rather they have said that they will use the money to reduce emissions relative to a business as usual scenario.

Take Pakistan’s INDC. In 2015 they estimated emissions were 405 MtCO2e, up from 182 in 1994. As a result of ambitious planned economic growth, they forecast BAU emissions of 1603 MtCO2e in 2030. However, they can reduce that by 20% with about $40 billion in finance. That is, with $40bn, average annual emissions growth from 2015-2030 will still be twice that of 1994-2015. Pakistan would also like $7-$14bn pa for adaptation to climate change. The INDC Table 7 summarizes the figures.

Or Bangladesh’s INDC. Estimated BAU increase in emissions from 2011 to 2030 is 264%. They will unconditionally cut this by 5% and conditionally by a further 15%. The BAU is 7.75% annual emissions growth, cut to 7.5% unconditionally and 6% with lots of finance. The INDC Table 7 summarizes the figures.
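The “twice the growth rate” claim for Pakistan can be checked with a standard compound annual growth rate calculation. This is a sketch using only the MtCO2e figures quoted above:

```python
def cagr(start, end, years):
    """Compound annual growth rate between two emissions levels."""
    return (end / start) ** (1.0 / years) - 1.0

# Pakistan INDC figures quoted above (MtCO2e)
historical = cagr(182, 405, 2015 - 1994)       # 1994-2015: about 3.9% per year
with_finance = 1603 * (1 - 0.20)               # 20% cut off the 2030 BAU, with ~$40bn
future = cagr(405, with_finance, 2030 - 2015)  # 2015-2030: about 8.0% per year
print(round(historical * 100, 1), round(future * 100, 1))
```

So even with the conditional finance, the implied annual emissions growth roughly doubles.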

I do not blame either country for taking such an approach, or the many others adopting similar strategies. They are basically saying that they will do nothing that impedes raising living standards through high levels of sustained economic growth. They will play the climate change game, so long as nobody demands that their Governments compromise on serving the best interests of their peoples. If only the Governments of the so-called developed nations would play similar games, rather than impose useless burdens on the people they are supposed to be serving.

There is another category of countries that will not undertake to reduce their emissions – the OPEC members. Saudi Arabia, Iran, Venezuela, Kuwait, UAE and Qatar have all made submissions. Only Iran gives a figure: it will unilaterally cut emissions by 4% against BAU. With the removal of “unjust sanctions”, and some financial assistance and technology transfer, its conditional offer would be much more. But nowhere is the BAU scenario stated in figures. The reason these OPEC countries will not play ball is quite obvious. To achieve the IPCC objective of constraining warming to 2°C would, according to McGlade and Ekins 2015 (The geographical distribution of fossil fuels unused when limiting global warming to 2°C), mean leaving 75% of proven reserves of fossil fuels in the ground, along with all of the unproven reserves. I did an approximate breakdown by major countries last year, using the BP Statistical Review of World Energy 2016.

It does not take a genius to work out that meeting the 2°C climate mitigation target would shut down a major part of the economies of fossil fuel producing countries in about two decades. No-one has proposed either compensating them, or finding alternatives.

But the climate alarmist community are too caught up in their Groupthink to notice the obvious huge harms that implementing global climate mitigation policies would entail.

Kevin Marshall

Does data coverage impact the HADCRUT4 and NASA GISS Temperature Anomalies?

Introduction

This post started with the title “HADCRUT4 and NASA GISS Temperature Anomalies – a Comparison by Latitude“.  After deriving a global temperature anomaly from the HADCRUT4 gridded data, I was intending to compare the results with GISS’s anomalies by 8 latitude zones. However, this opened up an intriguing issue. Are global temperature anomalies impacted by a relative lack of data in earlier periods? This leads to a further issue: whether infilling of the data can be meaningful, and hence be considered to “improve” the global anomaly calculation.

A Global Temperature Anomaly from HADCRUT4 Gridded Data

In a previous post, I looked at the relative magnitudes of early twentieth century and post-1975 warming episodes. In the Hadley datasets, there is a clear divergence between the land and sea temperature data trends post-1980, a feature that is not present in the early warming episode. This is reproduced below as Figure 1.

Figure 1 : Graph of Hadley Centre 7 year moving average temperature anomalies for Land (CRUTEM4), Sea (HADSST3) and Combined (HADCRUT4)

The question that needs to be answered is whether the anomalous post-1975 warming on the land is due to real divergence, or due to issues in the estimation of global average temperature anomaly.

In another post – The magnitude of Early Twentieth Century Warming relative to Post-1975 Warming – I looked at the NASA Gistemp data, which is usefully broken down into 8 Latitude Zones. A summary graph is shown in Figure 2.

Figure 2 : NASA Gistemp zonal anomalies and the global anomaly

This is more detail than the HADCRUT4 data, which is just presented as three zones: the Tropics, plus the Northern and Southern Hemispheres. However, the Hadley Centre, on their HADCRUT4 Data: download page, have, under HadCRUT4 Gridded data: additional fields, a file HadCRUT.4.6.0.0.median_ascii.zip. This contains monthly anomalies for 5° by 5° grid cells from 1850 to 2017. There are 36 zones of latitude and 72 zones of longitude. Over 2016 months, there are over 5.22 million grid cells, but only 2.51 million (48%) have data. From this data, I have constructed a global temperature anomaly. The major issue in the calculation is that the grid cells are of different areas. A grid cell nearest the equator, at 0° to 5°, has about 23 times the area of a grid cell adjacent to the poles, at 85° to 90°. I used the appropriate weighting for each band of latitude.
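The weighting follows from spherical geometry: the area of a latitude band is proportional to the difference of the sines of its bounding latitudes. A short sketch, from which the roughly 23:1 equator-to-pole cell ratio quoted above falls out directly:

```python
import numpy as np

def band_weight(lat_lo, lat_hi):
    """Relative area of a latitude band on a sphere: sin(hi) - sin(lo)."""
    return np.sin(np.radians(lat_hi)) - np.sin(np.radians(lat_lo))

equator_cell = band_weight(0, 5)    # ~0.0872
polar_cell = band_weight(85, 90)    # ~0.0038
print(round(equator_cell / polar_cell, 1))  # about 22.9

# Weights for all 36 bands of 5 degrees, normalized to sum to 1
edges = np.arange(-90, 91, 5)
weights = band_weight(edges[:-1], edges[1:])
weights = weights / weights.sum()
```

Within a band, all 72 longitude cells have equal area, so only the band weight matters for the global average.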

The question is whether I have calculated a global anomaly similar to the Hadley Centre’s. Figure 3 is a reconciliation between the published global anomaly mean (available from here) and my own.

Figure 3 : Reconciliation between HADCRUT4 published mean and calculated weighted average mean from the Gridded Data

Prior to 1910, my calculations are slightly below the HADCRUT4 published data. The biggest differences are in 1956 and 1915. Overall the differences are insignificant and do not impact the analysis.

I split down the HADCRUT4 temperature data by eight zones of latitude on a similar basis to NASA Gistemp. Figure 4 presents the results on the same basis as Figure 2.

Figure 4 : Zonal surface temperature anomalies and the global anomaly calculated using the HADCRUT4 gridded data.

Visually, there are a number of differences between the Gistemp and HADCRUT4-derived zonal trends.

A potential problem with the global average calculation

The major reason for differences between HADCRUT4 & Gistemp is that the latter has infilled estimated data into areas where there is no data. Could this be a problem?

In Figure 5, I have shown the build-up in global coverage, that is, the percentage of 5° by 5° grid cells with an anomaly in the monthly data.

Figure 5 : HADCRUT4 Change in the percentage coverage of each zone in the HADCRUT4 gridded data. 

Figure 5 shows a build-up in data coverage during the late nineteenth and early twentieth centuries. The World Wars (1914-1918 & 1939-1945) had the biggest impact on Southern Hemisphere data collection. This is unsurprising when one considers that the fighting was mostly in the Northern Hemisphere, and the European powers withdrew resources from their far-flung empires to protect the mother countries. The only zones with significantly less than 90% grid coverage in the post-1975 warming period are the Arctic and the region below 45S – around 19% of the global area.

Finally, comparing comparable zones in the Northern and Southern Hemispheres, the tropics seem to have similar coverage, whilst for the polar, temperate and mid-latitude areas the Northern Hemisphere seems to have had better coverage after 1910.
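The coverage percentages behind Figure 5 can be computed along these lines. The function name is my own, and the latitude bounds are assumed to be multiples of 5 on the same 36x72 grid layout as before.

```python
import numpy as np

def coverage_pct(grid, lat_lo, lat_hi):
    """Percentage of 5x5 cells with data between two latitudes
    (multiples of 5 degrees), for one month's 36x72 grid with
    NaN marking missing cells."""
    i0, i1 = (lat_lo + 90) // 5, (lat_hi + 90) // 5
    zone = grid[i0:i1, :]
    return 100.0 * np.count_nonzero(~np.isnan(zone)) / zone.size

# toy month: data in the northern half of the grid only
g = np.full((36, 72), np.nan)
g[18:, :] = 0.3
print(coverage_pct(g, -90, -45))   # 0.0
print(coverage_pct(g, 0, 90))      # 100.0
```

Applied month by month and zone by zone to the real gridded file, this would reproduce the kind of coverage series plotted in Figure 5.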

This variation in coverage can potentially lead to wide discrepancies between any calculated temperature anomalies and a theoretical anomaly based upon data in all the 5° by 5° grid cells. As an extreme example, in my own calculation, if just one of the 72 grid cells in a band of latitude had a figure, then an “average” would have been calculated for that month for a band stretching right around the world and 555km (345 miles) from north to south. In the annual figures by zone, it only requires one of the 72 grid cells, in one of the months, in one of the bands of latitude to have data for an annual anomaly to be calculated. For the tropics or the polar areas, that is just one in 4320 data points. This issue will affect the early twentieth-century warming episode far more than the post-1975 one. Although I would expect the Hadley Centre to have cleaned up the more egregious examples in their calculation, lack of data in grid cells could have quite random impacts, potentially biasing the global temperature anomaly trends to an unknown, but significant, extent. How this could play out can be appreciated from an example using NASA GISS Global Maps.
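A toy illustration of the sparse-band problem, with invented numbers: if anomalies vary substantially along a band of latitude, a single surviving cell can sit a long way from the full-coverage mean of that band.

```python
import numpy as np

lons = np.arange(0, 360, 5)                      # 72 cells of longitude
band = 0.4 + 1.5 * np.sin(np.radians(lons))      # mean 0.4C, +/-1.5C swing

print(round(band.mean(), 2))       # full coverage: 0.4
print(round(float(band[18]), 2))   # a lone cell at 90E: 1.9
```

With full coverage the band averages 0.4C, but if the cell at 90E were the only one reporting, the band "average" would be 1.9C – nearly five times the true figure, purely from where the data happened to survive.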

NASA GISS Global Maps Temperature Trends Example

NASA GISS Global Maps from GHCN v3 data provide maps of the calculated change in average temperatures. I have run the maps comparing annual data for 1940 against a baseline of 1881-1910, capturing much of the early twentieth-century warming, at both the 1200km and 250km smoothing radii.

Figure 6 : NASA GISS Global anomaly Map and average anomaly by Latitude comparing 1940 with a baseline of 1881 to 1910 and a 1200km smoothing radius

Figure 7 : NASA GISS Global anomaly Map and average anomaly by Latitude comparing 1940 with a baseline of 1881 to 1910 and a 250km smoothing radius. 

With respect to the maps in Figures 6 & 7:

  • There is no apparent difference in the sea data between the 1200km and 250km smoothing radii, except in the polar regions, where the former has more coverage. The differences lie in the land areas.
  • The grey areas with insufficient data all apply to the land, or to ocean areas in the polar regions.
  • Figure 6, with 1200km smoothing, has most of the land infilled, whilst the 250km smoothing shows the lack of data coverage for much of South America, Africa, the Middle East, South-East Asia and Greenland.

Even with these land-based differences in coverage, it is clear from either map that at any latitude there are huge variations in the calculated average temperature change. For instance, take 40N. This line of latitude runs north of San Francisco on the US West Coast and clips Philadelphia on the East Coast. On the other side of the Atlantic, Madrid, Ankara and Beijing are at about 40N. There are significant points on the line of latitude with estimated warming greater than 1C (e.g. California), whilst at the same time in Eastern Europe cooling may have exceeded 1C over the period. More extreme still, at 60N (Southern Alaska, Stockholm, St Petersburg) the difference in temperature change along the line of latitude is over 3C. This compares to a calculated global rise of 0.40C.

This lack of data may have contributed (along with a faulty algorithm) to the differences in the zonal mean charts by latitude. The 1200km smoothing radius chart bears little relation to the 250km one. For instance:

  • 1200km shows 1.5C of warming at 45S; 250km shows about zero. 45S cuts through the South Island of New Zealand.
  • From the equator to 45N, 1200km shows a rise from 0.5C to over 2.0C; 250km shows a drop from less than 0.5C to near zero, then a rise to 0.2C. At around 45N lie Ottawa, Maine, Bordeaux, Belgrade, Crimea and the northernmost point of Japan.

The differences in the NASA GISS maps, in a period when the available data covered only around half of the 2592 5° by 5° grid cells, indicate huge differences in trends between different areas. As a consequence, interpolating warming trends from one area into adjacent areas appears to give quite different results in terms of trends by latitude.

Conclusions and Further Questions

The issue I originally focussed upon was the relative size of the early twentieth-century warming compared to the post-1975 warming. The greater warming in the later period seemed to be due to the greater warming on land, which covers just 30% of the total global area. The sea temperature warming phases appear to be pretty much the same.

The issue examined here is one of data. The early twentieth century had much less data coverage than the period after 1975. Further, the Southern Hemisphere had worse data coverage than the Northern Hemisphere, except in the tropics. This means that in my calculation of a global temperature anomaly from the HADCRUT4 gridded data (which in aggregate was very similar to the published HADCRUT4 anomaly) the averages by latitude will not be comparing like with like across the two warming periods. In particular, in the early twentieth century a calculation by latitude will not average right the way around the globe, but only over a limited selection of bands of longitude – on average about half, with massive variations. This would not matter if the changes in anomalies at a given latitude were roughly the same over time. But an examination of NASA GISS global maps for a period covering the early twentieth-century warming phase reveals that trends in anomalies at the same latitude differ considerably. This implies that there could be large, but unknown, biases in the data.

I do not believe the analysis ends here. There are a number of areas that I (or others) can try to explore.

  1. Does the NASA GISS infilling of the data get us closer to, or further away from, what a global temperature anomaly would look like with full data coverage? My guess, based on the extreme example of the Antarctica trends (discussed here), is that the infilling moves the result further away. The data could show otherwise.
  2. Are the changes in data coverage more significant on land than for the globe as a whole, or less? Looking at the CRUTEM4 data could resolve this question.
  3. Would anomalies based upon similar grid coverage after 1900 give different relative trend patterns to the published ones based on dissimilar grid coverage?

Whether I get the time to analyze these is another issue.

Finally, the problem of trends varying considerably, and quite randomly, across the globe is the same issue that I found with land data homogenisation, discussed here and here. To derive a temperature anomaly for a grid cell, it is necessary to make the data homogeneous. Standard homogenisation techniques assume that the underlying trends in an area are pretty much the same, so that any differences in trend between adjacent temperature stations result from data imperfections. I found numerous examples where there were likely real differences in trend between adjacent temperature stations. Homogenisation will therefore eliminate real but local climatic trends. Averaging incomplete global data, where the missing data could contain unknown regional trends, may cause biases at a global scale.

Kevin Marshall