Is increasing Great Barrier Reef coral bleaching related to climate change or observation bias?

In the previous post I looked at whether the claimed increase in coral bleaching in the Great Barrier Reef was down to global average temperature rise. I concluded that this was not the case, as the GBR has not warmed, or at least not warmed as much as the global average. Here I look further at the data.
The first thing to state is that I recognize that heat stress can occur in corals. Blogger Geoff Price (in a post at his own blog on April 2nd 2018, reposted at ATTP eleven months later) stated

(B)leaching via thermal stress is lab reproducible and uncontroversial. If you’re curious, see Jones et al 1998, “Temperature-induced bleaching of corals begins with impairment of the CO2 fixation mechanism in zooxanthellae”.

I am curious. The abstract of Jones et al 1998 states

The early effects of heat stress on the photosynthesis of symbiotic dinoflagellates (zooxanthellae) within the tissues of a reef‐building coral were examined using pulse‐amplitude‐modulated (PAM) chlorophyll fluorescence and photorespirometry. Exposure of Stylophora pistillata to 33 and 34 °C for 4 h resulted in ……….Quantum yield decreased to a greater extent on the illuminated surfaces of coral branches than on lower (shaded) surfaces, and also when high irradiance intensities were combined with elevated temperature (33 °C as opposed to 28 °C). …..

If I am reading this right, the coral was exposed to a temperature increase of 5-6 °C for a period of 4 hours. I can appreciate that the coral would suffer from this sudden change in temperature. Most waterborne creatures would become distressed if the water temperature were increased rapidly. How much of an increase would seriously stress them might vary, but it is not a series of tests I would like to carry out. But is there evidence of increasing heat stress causing increasing coral bleaching in the real world? That is, has there been both a rise in coral bleaching and a rise in these heat stress conditions? Clearly there will be seasonal changes in water temperature, even though in the tropics these might not be as large as, say, around the coast of the UK. Also, many of the corals migrate up and down the reef, so they could be tolerant of a range of temperatures. Whether worsening climate conditions have exacerbated heat stress to such an extent that increased coral bleaching has occurred can only be confirmed by confronting the conjectures with the empirical data.


Rise in instances of coral bleaching

I went looking for long-term data showing that coral bleaching is on the increase and came across an early example.

P. W. Glynn: Coral reef bleaching: Ecological perspectives. Coral Reefs 12, 1–17 (1993). doi:10.1007/BF00303779

From the introduction

Mass coral mortalities in contemporary coral reef ecosystems have been reported in all major reef provinces since the 1870s (Stoddart 1969; Johannes 1975; Endean 1976; Pearson 1981; Brown 1987; Coffroth et al. 1990). Why, then, should the coral reef bleaching and mortality events of the 1980s command great concern? Probably, in large part, because the frequency and scale of bleaching disturbances are unprecedented in the scientific literature.

One such example of observed bleaching is graphed in Glynn’s paper as Figure 1c.

But have coral bleaching events actually risen, or have the observations risen? That is, in the past were there fewer observed bleaching events because there was much less bleaching, or because there were far fewer observations? Since the 1990s, have observations of bleaching events increased further because far more researchers have left their families in the safe climates of temperate countries to endure the perils of diving in waters warmer than a swimming pool? It is only by accurately estimating the observational impact that it is possible to estimate the real impact.
This reminds me of the recent IPPR report, widely discussed, including by me, at cliscep and at notalotofpeopleknowthat (e.g. here and here). Extreme claims were lifted from a report by billionaire investor Jeremy Grantham, which stated

Since 1950, the number of floods across the world has increased by 15 times, extreme temperature events by 20 times, and wildfires sevenfold

The primary reason was the increase in the number of observations. Grantham mistook increasing recorded observations in a database for real-world increases, then embellished the increase in the data to make it appear much more significant. The IPPR then lifted the false perception, and the BBC’s Roger Harrabin copied the sentence into his report. The reality is that many extreme weather events occurred prior to the conscientious worldwide cataloguing of them from the 1980s. Just because disasters were not observed and reported to a centralized body does not mean they did not happen.
With respect to catastrophic events in the underlying EM-DAT database, it is possible to gain some perspective on whether the frequency of reported disasters is related to an increase in actual disasters by looking at the number of deaths. Despite the number of reports going up, total deaths have gone down. Compared to 1900-1949, in the current decade to mid-2018 “Climate” disaster deaths are down 84%, yet reported “Climate” disasters are 65 times more frequent.
I am curious to know how one might estimate the real quantity of coral bleaching from the reported instances in this data. It would certainly be a lot less than the graph above shows.


Have temperatures increased?

In the previous post I looked at temperature trends in the Great Barrier Reef. There are two main sources that suggest that, contrary to the world as a whole, GBR average temperatures have not increased, or have increased much less than the global average. This was shown on the NASA Giss map comparing Jan-2019 with the 1951-1980 average, and for two HADSST3 ocean data 5°x5° gridcells. For the latter I only charted the temperature anomaly for two gridcells, at the north and middle of the GBR. I have updated this chart to include the gridcell 150-155°E / 20-25°S at the southern end of the GBR.
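
For anyone wanting to reproduce this sort of chart, below is a minimal sketch in Python of pulling the monthly anomalies for one 5°x5° gridcell from a local copy of the HadSST3 NetCDF file. The file name, the variable name "sst" and the coordinate conventions are my assumptions, not a description of the exact workflow behind the charts here; check the header of whichever copy you download.

```python
import xarray as xr

# Open a local copy of the HadSST3 gridded anomalies (hypothetical file name)
ds = xr.open_dataset("HadSST3_median.nc")

# Pick the centre of the 145-150E / 10-15S cell; coordinate names are assumptions
cell = ds["sst"].sel(latitude=-12.5, longitude=147.5, method="nearest")

# Annual means of the monthly anomalies, 1970 onwards
annual = cell.groupby("time.year").mean("time")
annual.sel(year=slice(1970, 2018)).plot()
```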

There is an increase in the warming trend post 2000, influenced particularly by 2001 and 2003. This is not replicated further north. This is in agreement with the Gistemp map of temperature trends in the previous post, where the southern end of the GBR showed moderate warming.


Has climate change still impacted on coral bleaching?

However, there is still an issue. If any real, but unknown, increase in coral bleaching has occurred, it could still be due to sudden increases in sea surface temperatures, something more in accordance with the test in the lab.
Blogger ATTP (aka Professor Ken Rice) called attention to a recent paper in a comment at cliscep.

The link is to a pre-publication copy, without the graphics or supplementary data, of

Global warming and recurrent mass bleaching of corals – Hughes et al Nature 2017

The abstract states


The distinctive geographic footprints of recurrent bleaching on the Great Barrier Reef in 1998, 2002 and 2016 were determined by the spatial pattern of sea temperatures in each year.


So in 2002 the GBR had a localized mass bleaching episode, but did not share in the 2010 pan-tropical events of Rice’s quote. The spatial patterns, and the criteria used, are explained in the paper.

Explaining spatial patterns
The severity and distinctive geographic footprints of bleaching in each of the three years can be explained by differences in the magnitude and spatial distribution of sea-surface temperature anomalies (Fig. 1a, b and Extended Data Table 1). In each year, 61-63% of reefs experienced four or more Degree Heating Weeks (DHW, °C-weeks). In 1998, heat stress was relatively constrained, ranging from 1-8 DHWs (Fig. 1c). In 2002, the distribution of DHW was broader, and 14% of reefs encountered 8-10 DHWs. In 2016, the spectrum of DHWs expanded further still, with 31% of reefs experiencing 8-16 DHWs (Fig. 1c). The largest heat stress occurred in the northern 1000 km-long section of the Great Barrier Reef. Consequently, the geographic pattern of severe bleaching in 2016 matched the strong north-south gradient in heat stress. In contrast, in 1998 and 2002, heat stress extremes and severe bleaching were both prominent further south (Fig. 1a, b).

For clarification:-

Degree Heating Week (DHW) The NOAA satellite-derived Degree Heating Week (DHW) is an experimental product designed to indicate the accumulated thermal stress that coral reefs experience. A DHW is equivalent to one week of sea surface temperature 1 deg C above the expected summertime maximum.
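
The quoted definition can be turned into a rough calculation. The sketch below accumulates weekly exceedances of at least 1°C over an assumed summertime maximum across a rolling 12-week window, which is my simplified reading of the product; NOAA's operational DHW works from satellite SST and a maximum monthly mean climatology, so treat this as illustration only.

```python
import numpy as np
import pandas as pd

def degree_heating_weeks(weekly_sst: pd.Series, summer_max: float) -> pd.Series:
    """Accumulated degC-weeks of heat stress over a rolling 12-week window (simplified)."""
    hotspot = (weekly_sst - summer_max).clip(lower=0.0)     # degC above the expected maximum
    hotspot = hotspot.where(hotspot >= 1.0, 0.0)            # only count exceedances of 1 degC or more
    return hotspot.rolling(window=12, min_periods=1).sum()  # degC-weeks

# Made-up weekly temperatures drifting above a 28.5 degC summertime maximum
weeks = pd.date_range("2016-01-01", periods=16, freq="7D")
sst = pd.Series(28.0 + np.linspace(0.0, 2.5, 16), index=weeks)
print(degree_heating_weeks(sst, summer_max=28.5).round(2))
```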

That is, rather than the long-term rise in global average temperatures causing the alleged increase in coral bleaching, it is human-caused global warming changing the climate by the more indirect means of making extreme heat events more frequent. This seems a bit of a stretch. However, the “Degree Heating Week” can be corroborated against the gridcell monthly HADSST3 ocean temperature data for the summer months, if both measures are accurate estimates of the underlying temperatures. A paper published last December in Nature Climate Change (also with lead author Prof Terry Hughes) highlighted 1998, 2002, 2016 & 2017 as being major years of coral bleaching. Eco Watch has a short video of maps from the paper showing the locations of bleaching events, with many more observed events in 2016 and 2017 than in 1998 and 2002.

From the 2017 paper any extreme temperature anomalies should be most marked in 2016 across all areas of the GBR. 2002 should be less significant and predominantly in the south. 1998 should be a weaker version of 2002.
Further, if summer extreme temperatures are the cause of heat stress in corals, then 1998, 2002, 2016 & 2017 should have warm summer months.
For gridcells 145-150°E / 10-15°S and 150-155°E / 20-25°S, respectively representing the northern and southern extents of the Great Barrier Reef, I have extracted the January, February and March anomalies since 1970, then circled the years 1998, 2002, 2016 and 2017. Also shown is the average of the three summer months.
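
For those who want to repeat the exercise, the sketch below shows the kind of extraction involved, assuming `monthly` is a pandas Series of gridcell monthly anomalies indexed by date (for example, built as in the earlier snippet). The column labels and the 1970 start year follow the description above; everything else is an assumption.

```python
import pandas as pd

def summer_anomalies(monthly: pd.Series, start_year: int = 1970) -> pd.DataFrame:
    """January-March anomalies by year, plus the three-month average."""
    jfm = monthly[monthly.index.month.isin([1, 2, 3])]
    jfm = jfm[jfm.index.year >= start_year]
    table = jfm.groupby([jfm.index.year, jfm.index.month]).mean().unstack()
    table.columns = ["Jan", "Feb", "Mar"]
    table["JFM mean"] = table.mean(axis=1)
    return table

bleaching_years = [1998, 2002, 2016, 2017]   # the years circled in the charts
# summer = summer_anomalies(monthly); print(summer.loc[bleaching_years])
```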

In the North of the GBR, 2016 and 2017 were unusually warm, whilst 2002 was a cool summer and 1998 was not unusual. This is consistent with the paper’s findings. But 2004 and 2010 were warm years without bleaching.
In the South of the GBR, 1998 was exceptionally warm in February. This might suggest an anomalous reading. 2002 was cooler than average, and 2016 and 2017 were about average.
Also note that in the North of the GBR summer temperatures appear to be a few tenths of a degree higher from the late 1990s onwards than in the 1980s and early 1990s. In the South there appears to be no such increase. This is the reverse of what was found for the annual average temperatures, and the reverse of where the most serious coral bleaching has occurred.
On this basis the monthly summer temperature anomalies do not seem to correspond to the levels of coral bleaching. A further check is to look at the change in the anomaly from the previous month. If sea surface temperatures increase rapidly in summer, this may cause heat stress as much as the absolute magnitude above the long-term average.
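
The month-on-month check is a one-liner on the same assumed `monthly` Series:

```python
import pandas as pd

def summer_step_changes(monthly: pd.Series) -> pd.Series:
    """Change in each summer month's anomaly from the previous month's anomaly."""
    delta = monthly.diff()
    return delta[delta.index.month.isin([1, 2, 3])]

# e.g. summer_step_changes(monthly)["1998"] shows the January-to-February 1998 jump
```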

In the North of the GBR the February 1998 anomaly was almost a degree higher than the January anomaly. This is nothing exceptional in the record. 2002, 2016 & 2017 do not stand out at all.

In the South of the GBR, the changes in anomaly from one month to the next are much greater than in the North. February 1998 stands out; it could be due to problems in the data. 2002, 2016 and 2017 are unexceptional years. There also appears to be less volatility post 2000, contradicting any belief that the climate is getting more extreme. I believe it could be an indication that data quality has improved.

Conclusions

Overall, the conjecture that global warming is resulting in increased coral bleaching in the Great Barrier Reef, whether directly through rising average temperatures or indirectly through greater volatility in temperatures, is not supported by the HADSST3 sea surface temperature data from either the North or South of the reef. This does not necessarily mean that there is not a growing problem of heat stress, although the absence of one seems the most likely conclusion. Alternative explanations could be that the sea surface temperature anomaly data are inadequate, or that other gridcells show something different.
Which brings us back to the problem identified above: how much of the observed increase in coral bleaching is down to a real increase in coral bleaching, and how much is down to increased observations? In all areas of climate, there is a crucial difference between our perceptions based on limited data and the underlying reality.

Kevin Marshall

Two false claims on climate change by the IPPR

An IPPR report, This is a crisis: Facing up to the age of environmental breakdown, published yesterday, within a few hours received criticism from Paul Homewood at notalotofpeopleknowthat, Paul Matthews at cliscep and Andrew Montford at The GWPF. It is based on an April 2018 paper by billionaire Jeremy Grantham. Two major issues that I want to cover in this post are contained in a passage on page 13.

Climate Change : Average global surface temperature increases have accelerated, from an average of 0.007 °C per year from 1900–1950 to 0.025 °C from 1998–2016 (Grantham 2018). ……. Since 1950, the number of floods across the world has increased by 15 times, extreme temperature events by 20 times, and wildfires sevenfold (GMO analysis of EM-DAT 2018).

These two items are lifted from an April 2018 paper The Race of Our Lives Revisited by British investor Jeremy Grantham CBE. I will deal with each in turn.

Warming acceleration

The claim concerning how warming has accelerated comes from Exhibit 2 of The Race of Our Lives Revisited.

The claimed Gistemp trends are as follows

1900 to 1958  – 0.007 °C/year

1958 to 2016  – 0.015 °C/year

1998 to 2016  – 0.025 °C/year

Using the Skeptical Science trend calculator for Gistemp I get the following figures.

1900 to 1958  – 0.066 ±0.024 °C/decade

1958 to 2016  – 0.150 ±0.022 °C/decade

1998 to 2016  – 0.139 ±0.112 °C/decade

That is odd. Warming rates seem to be slightly lower for 1998-2016 compared to 1958-2016, not higher. This is how Grantham may have derived the incorrect 1998-2016 figure.

For 1998-2016 the range of uncertainty is 0.003 to 0.025 °C/year.

It would appear that the 1900 to 1958 and 1958 to 2016 warming rates are taken straight from the trend calculator, whilst the 1998 to 2016 warming rate of 0.025 °C/year is simply the top end of the 2σ uncertainty range.
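
The arithmetic is simple enough to show. Converting the Skeptical Science figure of 0.139 ±0.112 °C/decade for 1998-2016 into °C/year gives the range quoted above:

```python
# Skeptical Science trend calculator output for GISTEMP 1998-2016, in degC/decade
trend, two_sigma = 0.139, 0.112

low, high = (trend - two_sigma) / 10, (trend + two_sigma) / 10      # convert to degC/year
print(f"1998-2016 trend range: {low:.3f} to {high:.3f} degC/year")  # ~0.003 to 0.025
```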

Credit for spotting this plausible explanation should go to Mike Jackson.

Increase in climate-related disasters since 1950

The IPPR report states

Since 1950, the number of floods across the world has increased by 15 times, extreme temperature events by 20 times, and wildfires sevenfold

Exhibit 7 of The Race of Our Lives Revisited.

The 15 times “Floods” increase is for 2001-2017 compared to 1950-1966.
The 20 times “Extreme Temperature Events” increase is for 1996-2017 compared to 1950-1972.
The 7 times “Wildfires” increase is for 1984-2017 compared to 1950-1983.

Am I alone in thinking there is something a bit odd in the statement about being “since 1950”? Grantham is comparing different time periods, yet the IPPR makes it appear that the starting point is a single year.

But is the increase in the data replicated in reality?

Last year I downloaded all the data from EM-DAT – The International Disasters Database – from 1900 to the present day. I have classified their disaster types into four categories.

Over 40% are the “climate”-related disaster types from Grantham’s analysis. Note that this lists the number of “occurrences” in a year. If, within a country in a year there is more than one occurrence of a disaster type, they are lumped together.

I have split the number of occurrences in the four categories by decade. The 2010s covers only 8.5 years.

“Climate” disasters have increased in the database. Allowing for 8.5 years in the current decade, compared to 1900-1949, “Climate” disasters are 65 times more frequent. Similarly, epidemics are 47 times more frequent, geological events 16 times and “other” disasters 34 times.

Is this based on reality, or just on vastly improved reporting of disasters from the 1980s? The real impacts are indicated by the numbers of reported deaths.

The number of reported disaster deaths has decreased massively compared to the early twentieth century in all four categories, despite the number of reported disasters increasing many times over. Allowing for 8.5 years in the current decade, compared to 1900-1949, “Climate” disaster deaths are down 84%. Similarly, epidemic deaths are down by 98% and “other” disaster deaths by 97%. Geological disaster deaths are, however, up by 27%. The 272,431 deaths in the 2010s that I have classified under “Geology” include the estimated 222,570 deaths in the 2010 Haitian Earthquake.

If one looks at the death rate per reported occurrence, “Climate” disaster death rates have declined by 97.7% between 1900-1949 and the 2010s. Due to the increase in reporting, and the more than doubling of the world population, this decline is most likely understated. 
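
For anyone repeating the exercise with their own EM-DAT download, the normalisation is simply annualising each period's counts before comparing, then dividing deaths by occurrences. The sketch below shows the calculation; the counts are placeholders, not the totals behind the figures quoted here.

```python
def compare_periods(occ_a, deaths_a, years_a, occ_b, deaths_b, years_b):
    """Compare two periods of unequal length on an annualised basis."""
    freq_ratio = (occ_b / years_b) / (occ_a / years_a)            # e.g. "65 times more frequent"
    deaths_ratio = (deaths_b / years_b) / (deaths_a / years_a)    # e.g. "down 84%" gives 0.16
    per_occurrence = (deaths_b / occ_b) / (deaths_a / occ_a)      # deaths per reported occurrence
    return freq_ratio, deaths_ratio, per_occurrence

# compare_periods(occ_1900_49, deaths_1900_49, 50, occ_2010s, deaths_2010s, 8.5)
```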

The Rôle of Progressives in Climate Mitigation

The IPPR describes itself as The Progressive Policy Think Tank. From the evidence of the two issues above, they have not actually thought about what they are saying. Rather, they have just copied the highly misleading data from Jeremy Grantham. There appears to be no real climate crisis emerging when one examines the available data properly. The death rate from extreme weather-related events has declined by at least 97.7% between the first half of the twentieth century and the current decade. This is a very important point for policy. Humans have adapted to the current climate conditions, just as they have reduced the impact of infectious diseases and are increasingly adapting to the impacts of earthquakes and tsunamis. If the climate becomes more extreme, or sea level rise accelerates significantly, humans will adapt as well.

There is a curious symmetry here between the perceived catastrophic problem and the perceived efficacy of the solution, namely for governments to reduce global emissions to zero. The theory is that rising human emissions, mostly from the burning of fossil fuels, are going to cause dangerous climate change. Global emissions involve 7,600 million people in nearly 200 countries. Whatever the UK does, with less than 1% of the global population and less than 1% of global emissions, will make no meaningful difference to the global total.

Globally, there are two major reasons that reducing global emissions will fail.

First is that developing countries, with 80%+ of the global population and 65% of emissions, are specifically exempted from any obligation to reduce their emissions (see Paris Agreement Articles 2.1(a), 2.2 and 4.1). Based on the evidence of the UNEP Emissions Gap Report 2018, and from the COP24 Katowice meeting in December, there is no change of heart in prospect.

Second is that the reserves of fossil fuels, both proven and estimated, are considerable and spread over many countries. Reducing global emissions to zero in a generation would mean leaving in the ground fossil fuels that provide a significant part of government revenue in countries such as Russia, Iran, Saudi Arabia and Turkmenistan. Keeping some fossil fuels in the ground in the UK, Canada, Australia or the United States will increase global prices and thus production elsewhere.

The IPPR is promoting costly and ideological policies in the UK that will have virtually zero benefits for future generations in terms of climate catastrophes averted. In my book such policies are both regressive and authoritarian, based on a failure to understand the distinction between the very marginal real impacts of a policy and its theoretical total impacts.

If the IPPR, or even the climate academics, gave proper thought to the issue, they would conclude that the correct response is to more accurately predict the type, timing, magnitude and location of future climate catastrophes. This information would help people on the ground adapt to those circumstances. In the absence of that information, the best way of adapting to a changing climate is the same way people have always adapted to extreme events, whether weather-related or geological. That is through sustained long-term economic growth, in the initial stages promoted by cheap and reliable energy sources. If there is a real environmental breakdown on its way, the Progressives, with their false claims and exaggerations, are best kept well away from the scene. Their ideological beliefs render them incapable of getting a rounded perspective on the issues and the damage their policies would cause.

Kevin Marshall

Australian Beer Prices set to Double Due to Global Warming?

Earlier this week Nature Plants published a new paper, Decreases in global beer supply due to extreme drought and heat.

The Scientific American has an article “Trouble Brewing? Climate Change Closes In on Beer Drinkers” with the sub-title “Increasing droughts and heat waves could have a devastating effect on barley stocks—and beer prices”. The Daily Mail headlines with “Worst news ever! Australian beer prices are set to DOUBLE because of global warming“. All those climate deniers in Australia have denied future generations the ability to down a few cold beers with their barbecued steaks (or rather, tofu salads).

This research should be taken seriously, as it is by a crack team of experts across a number of disciplines and universities. Said Steven J Davis of the University of California at Irvine,

The world is facing many life-threatening impacts of climate change, so people having to spend a bit more to drink beer may seem trivial by comparison. But … not having a cool pint at the end of an increasingly common hot day just adds insult to injury.

Liking the odd beer or three, I am really concerned about this prospect, so I rented the paper for 48 hours to check it out. What a sensation it is. Here are a few impressions.

Layers of Models

From the Introduction, a series of models was used, as follows.

  1. Created an extreme events severity index for barley based on extremes in historical data for 1981-2010.
  2. Plugged this into five different Earth Systems models for the period 2010-2099, run against different RCP scenarios, the most extreme of which shows over 5 times the warming of the 1981-2010 period. What is more, severe climate events are a non-linear function of temperature rise.
  3. Then model the impact of these severe weather events on crop yields in 34 World Regions using a “process-based crop model”.
  4. (W)e examine the effects of the resulting barley supply shocks on the supply and price of beer in each region using a global general equilibrium model (Global Trade Analysis Project model, GTAP).
  5. Finally, we compare the impacts of extreme events with the impact of changes in mean climate and test the sensitivity of our results to key sources of uncertainty, including extreme events of different severities, technology and parameter settings in the economic model.

What I found odd was that they made no allowance for increasing demand for beer over a 90-year period, despite mentioning in the second sentence that

(G)lobal demand for resource-intensive animal products (meat and dairy) processed foods and alcoholic beverages will continue to grow with rising incomes.

Extreme events – severity and frequency

As stated in point 2, the paper uses different RCP scenarios. These featured prominently in the IPCC AR5 of 2013 and 2014. They go from RCP2.6, the most aggressive mitigation scenario, through to RCP8.5, the non-policy scenario, which projects around 4.5°C of warming from 1850-1870 through to 2100, or about 3.8°C of warming from 2010 to 2090.

Figure 1 has two charts. The left-hand chart shows that extreme events will increase in intensity with temperature. RCP2.6 will do very little, but RCP8.5 would result, by the end of the century, in events 6 times as intense as today. The problem is that for up to 1.5°C of warming there appears to be no noticeable change whatsoever. That is, for about the same amount of warming as the world has experienced from 1850-2010 per HADCRUT4, there will be no change. Beyond that, things take off. How the models can empirically project well beyond known experience, for a completely different scenario, defeats me. It could be largely based on their modelling assumptions, which are in turn strongly tainted by their beliefs in CAGW. There is no reality check that the models are not falling apart, or reliant on arbitrary non-linear parameters.

The right-hand chart shows that extreme events are projected to increase in frequency as well: under RCP2.6 there is a ~4% chance of an extreme event in a given year, rising to ~31% under RCP8.5. Again, there is the issue of projecting well beyond any known range.

Fig 2 average barley yield shocks during extreme events

The paper assumes that the current geographical distribution and area of barley cultivation is maintained. They have modelled, for 2099 against the 1981-2010 baseline, a gridded average yield change at 0.5° x 0.5° resolution to create four colorful world maps, one for each of the four RCP emissions scenarios. At the equator, each gridcell is about 56 x 56 km, an area of around 3,100 km² or 1,200 square miles. Of course, nearer the poles the area diminishes significantly. This is quite a fine level of detail for projections based on 30 years of data applied to radically different circumstances 90 years in the future. Map a) shows the results for RCP8.5. On average, yields are projected to be 17% down. As Paul Homewood showed in a post on the 17th, this projected yield fall should be put in the context of a doubling of yields per hectare since the 1960s.
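
The gridcell-size arithmetic is worth setting out, since it shows how the east-west extent shrinks away from the equator. A degree of latitude is roughly 111 km; the rest follows:

```python
import math

KM_PER_DEGREE_LAT = 111.3            # approximate length of one degree of latitude
side = 0.5 * KM_PER_DEGREE_LAT       # north-south extent of a 0.5-degree cell, ~56 km

for lat in (0, 45, 60):
    width = side * math.cos(math.radians(lat))   # east-west extent shrinks with latitude
    print(f"lat {lat}: {side:.0f} x {width:.0f} km = {side * width:,.0f} km^2")
```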

This increase in productivity has often been ascribed solely to improvements in seed varieties (see Norman Borlaug), mechanization and the use of fertilizers. These have undoubtedly had a large part to play in this productivity improvement. But also important is that agriculture has become more intensive. Forty years ago it was clear that there was a distinction between the intensive farming of Western Europe and the extensive farming of the North American prairies and the Russian steppes. It was not due to better soils or climate in Western Europe. This difference can be staggering. In the Soviet Union about 30% of agricultural output came from around 1% of the available land. These were the plots on which workers on the state and collective farms could produce their own food and sell the surplus in the local markets.

Looking at chart a in Figure 2, there are wide variations about this average global decrease of 17%.

In North America, Montana and North Dakota have areas where barley shocks during extreme years will lead to mean yield changes over 90% higher than normal, and the surrounding areas have >50% higher than normal. But go less than 1000 km north into Canada, to the Calgary/Saskatoon area, and there are small decreases in yields.

In Eastern Bolivia – the part due North of Paraguay – there is the biggest patch of > 50% reductions in the world. Yet 500-1000 km away there is a North-South strip (probably just 56km wide) with less than a 5% change.

There is a similar picture in Russia. On the Kazakhstani border there are areas of >50% increases, but in a thinly populated band further north and west, running from around Kirov to southern Finland, there are massive decreases in yields.

Why, over the course of decades, those with increasing yields would not increase output, and those with decreasing yields would not switch to something else, defeats me. After all, if overall yields are decreasing due to frequent extreme weather events, most farmers would be losing money, and those farmers who do well when overall yields are down would be making extraordinary profits.

A Weird Economic Assumption

Building up to looking at costs, there is a strange assumption.

(A)nalysing the relative changes in shares of barley use, we find that in most case barley-to-beer shares shrink more than barley-to-livestock shares, showing that food commodities (in this case, animals fed on barley) will be prioritized over luxuries such as beer during extreme events years.

My knowledge of farming and beer is limited, but I believe that cattle can be fed on things other than barley, for instance grass, silage and sugar beet. Yet beers require precise quantities of barley and hops of certain grades.

Further, cattle feed is a large part of the cost of a kilo of beef or a litre of milk, but it takes only around 250-400g of malted barley to produce a litre of beer. The current wholesale price of malted barley is about £215 a tonne, or 5.4 to 8.6p a litre of beer. About the cheapest 4% alcohol lager I can find in a local supermarket is £3.29 for 10 x 250ml bottles, or £1.32 a litre. Take off 20% VAT and excise duty and that leaves around 30p a litre for raw materials, manufacturing costs, packaging, manufacturer’s margin, transportation, supermarket’s overhead and supermarket’s margin. For comparison, four pints (2.276 litres) of fresh milk costs £1.09 in the same supermarket, working out at 48p a litre. This carries no excise duty or VAT. It might have greater costs due to refrigeration, but I would suggest it costs more to produce, and that the feed is far more than 5p a litre.
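
The cost arithmetic above can be set out in a few lines. The excise duty figure below is a parameter I have plugged in to reproduce the rough "30p a litre" residual, not a quoted rate; substitute the prevailing duty on a 4% lager.

```python
# Barley content of a litre of beer at the quoted wholesale price
barley_per_tonne_gbp = 215.0
for grams in (250, 400):
    pence = barley_per_tonne_gbp / 1000 * grams / 1000 * 100
    print(f"{grams} g of malted barley per litre -> {pence:.1f}p")   # 5.4p to 8.6p

# Strip tax out of the cheapest supermarket lager
retail_per_litre = 3.29 / 2.5          # GBP: ten 250 ml bottles at 3.29
ex_vat = retail_per_litre / 1.2        # remove 20% VAT
duty_per_litre = 0.80                  # assumption: approximate excise duty on a 4% lager
print(f"Left for everything else: {ex_vat - duty_per_litre:.2f} GBP per litre")  # ~0.30
```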

I know that a reasonable 0.5 litre bottle of ale is £1.29 to £1.80 in the supermarkets I shop in, but it is the cheapest beers that will likely suffer the biggest percentage rise from an increase in raw material prices. Due to taxation and other costs, large changes in raw material prices will have very little impact on final retail prices. Even less so in pubs, where a British pint (568ml) costs the equivalent of £4 to £7 a litre.

That is, the assumption is the opposite of what would happen in a free market. In the face of a shortage, farmers will substitute barley for other forms of cattle feed, whilst beer manufacturers will absorb the extra cost.

Disparity in Costs between Countries

The most bizarre claim in the article is contained in the central column of Figure 4, which looks at the projected increases in the cost of a 500ml bottle of beer in US dollars. Chart h shows this for the most extreme RCP8.5 scenario.

I was very surprised that a global general equilibrium model would come up with such huge disparities in costs after 90 years. After all, my understanding is that these models use utility-maximizing consumers, profit-maximizing producers, perfect information and instantaneous adjustment. Clearly there is something very wrong with this model, so I decided to compare where I live in the UK with neighbouring Ireland.

In the UK and Ireland there are similarly high taxes on beer, with Ireland’s being slightly higher. Both countries have lots of branches of the massive discount chain Aldi, which lists some products on its websites aldi.co.uk and aldi.ie. In Ireland a 500ml can of Sainte Etienne Lager is €1.09, or €2.18 a litre, or £1.92 a litre. In the UK it is £2.59 for 4 x 440ml cans, or £1.59 a litre. The lager is about 21% more in Ireland, but the tax difference should only be about 15% on a 5% beer (Sainte Etienne is 4.8%). Aldi are not making bigger profits in Ireland; they may just have higher costs there, or lesser margins on other items. It is also comparing a single can against a multipack. So pro-rata, the £1.80 ($2.35) bottle of beer in the UK would be about $2.70 in Ireland. Under the RCP8.5 scenario, the models predict the bottle of beer to rise by $1.90 in the UK and $4.84 in Ireland. Strip out the excise duty and VAT and the price differential goes from zero to $2.20.

Now suppose you were a small beer manufacturer in England, Wales or Scotland. If beer was selling for $2.20 more in Ireland than in the UK, would you not want to stick 20,000 bottles in a container and ship it to Dublin?

If the researchers really understood the global brewing industry, they would realize that there are major brands sold across the world. Many are brewed in a number of countries to the same recipe. It is the barley that is shipped to the brewery, where equipment and techniques are identical to those in other parts of the world. These researchers seem to have failed to get away from their computer models to conduct field work in a few local bars.

What can be learnt from this?

When making projections well outside of any known range, the results must be sense-checked. Clearly, although the researchers have used an economic model, they have not understood the basics of economics. People are not dumb automatons waiting for some official to tell them to change their patterns of behavior in response to changing circumstances. They notice changes in the world around them and respond. A few seize the opportunities presented and can become quite wealthy as a result. Farmers have been astute enough to note mounting losses and change how and what they produce. There is also competition between regions. For example, in the 1960s Brazil produced over half the world’s coffee. The major region for production in Brazil was centered around Londrina in the North-East of Parana state. Despite straddling the Tropic of Capricorn, every few years there would be a spring-time frost which would destroy most of the crop. By the 1990s most of the production had moved north to Minas Gerais, well out of the frost belt. The rich fertile soils around Londrina are now used for other crops, such as soya, cassava and mangoes. It was not by human design that the movement occurred, but simply that the farmers in Minas Gerais could make bumper profits in the frost years.

The publication of this article shows a problem with peer review. Nature Plants is basically a biology journal. Reviewers are not likely to have specialist skills in climate models or economic theory, though those selected should have experience of agricultural models. If peer review is literally that, it will fail anyway in an inter-disciplinary subject where the participants do not have a general grounding in all the disciplines. In this paper it is not just economics, but knowledge of product costing as well. It is academic superiors from the relevant specialisms that are required for review, not inter-disciplinary peers.

Kevin Marshall

 

NOAA Future Aridity against Al Gore’s C20th Precipitation Graphic

Paul Homewood has taken a look at an article in yesterday’s Daily Mail – A quarter of the world could become a DESERT if global warming increases by just 2ºC.

The article states

Aridity is a measure of the dryness of the land surface, obtained from combining precipitation and evaporation.  

‘Aridification would emerge over 20 to 30 per cent of the world’s land surface by the time the global temperature change reaches 2ºC (3.6ºF)’, said Dr Manoj Joshi from the University of East Anglia’s School of Environmental Sciences and one of the study’s co-authors.  

The research team studied projections from 27 global climate models and identified areas of the world where aridity will substantially change.  

The areas most affected areas are parts of South East Asia, Southern Europe, Southern Africa, Central America and Southern Australia.

Now, Al Gore’s authoritative book An Inconvenient Truth contains statements first about extreme flooding, and then about aridity (pages 108-113). The reason for flooding coming first is a graphic of twentieth-century changes in precipitation on pages 114 & 115.

This graphic shows that, overall, the amount of precipitation has increased globally in the last century by almost 20%.

 However, the effects of climate change on precipitation is not uniform. Precipitation in the 20th century increased overall, as expected with global warming, but in some regions precipitation actually decreased.

The blue dots mark the areas with increased precipitation, the orange dots those with decreases. The larger the dot, the larger the change. So, according to Nobel Laureate Al Gore, increased precipitation should be far more common than increased aridity. If all warming is attributed to human-caused climate change (as the book seems to imply) then over a third of the dangerous 2ºC occurred in the 20th century. Therefore there should be considerable coherence between the areas that have recently become more arid and the projected future arid areas.

The Daily Mail reproduces a map from the UEA, showing the high-risk areas.

There are a couple of areas with big differences.

Southern Australia

In the 20th century, much of Australia saw increased precipitation. Within the next two or three decades, the UEA projects it getting considerably more arid. Could this change in forecast be the result of the extreme drought that broke in 2012 with extreme flooding? Certainly, the pictures of empty reservoirs taken a few years ago, alongside claims that they would likely never refill, show the false predictions.

One such reservoir is Lake Eildon in Victoria. Below is a graphic of capacity levels in selected years. It is possible to compare other years by following the historical water levels for EILDON link.

Similarly, in the same post, I linked to a statement by re-insurer Munich Re stating increased forest fires in Southern Australia were due to human activity. Not by “anthropogenic climate change”, but by discarded fag ends, shards of glass and (most importantly) fires that were deliberately started.

Northern Africa

The UEA makes no claims about increased aridity in Northern Africa, particularly with respect to the southern and northern fringes of the Sahara. Increasing desertification of the Sahara used to be claimed as a major consequence of climate change. In the year following Al Gore’s movie and book, the UNIPCC produced its Fourth Assessment Report. The Working Group II report, Chapter 9 (pg 448) on Africa, made the following claim.

In other countries, additional risks that could be exacerbated by climate change include greater erosion, deficiencies in yields from rain-fed agriculture of up to 50% during the 2000-2020 period, and reductions in crop growth period (Agoumi, 2003).

Richard North took a detailed look at the background of this claim in 2010. The other African countries were Morocco, Algeria and Tunisia. Agoumi 2003 compiled three reports, only one of which – Morocco – had anything near a 50% claim. Yet Morocco seems, from Al Gore’s graphic to have had a modest increase in rainfall over the last century.

Conclusion

The UEA’s latest doom-laden prophesy of increased aridity flies in the face of the accepted wisdom that human-caused global warming will result in increased precipitation. In two major areas (Southern Australia and Northern Africa), increased aridity is at odds with changes in precipitation claimed to have occurred in the 20th century by Al Gore in An Inconvenient Truth. Yet over a third of the dangerous 2ºC warming limit occurred in the last century.

Kevin Marshall

 

Blighting of Fairbourne by flawed report and BBC reporting

The Telegraph is reporting (hattip Paul Homewood)

A Welsh village is to sue the government after a climate change report suggested their community would soon be washed away by rising sea levels.

The document says Fairbourne will soon be lost to the sea, and recommends that it is “decommissioned”.

However, I was not sure about some of the figures in the Telegraph report, so I checked for myself.

West of Wales Shoreline Management Plan 2(SMP2) is available in sections. Fairbourne is covered in file 4d3 – Section 4 Coastal Area D PDZ11.pdf under folder West of W…\Eng…\Coastal Area D

On page 16 is the following graphic.

Fairbourne is the grey area to the bottom left of the image. In 50 years about a third of the village will be submerged at high tide, and in 100 years all of it. This is without changes to flood defences. Even worse is this comment.

Over the 100 years with 2m SLR the area would be typically 1.5m below normal tidal levels.

Where would they have got this 1-2m of sea level rise from? The Gwynedd council Cabinet Report of 22/01/13, Topic: Shoreline Management Plan 2, states

The WoWSMP2 was undertaken in defined stages as outlined in the Defra guidance published in March 2006.

And on sea level rise it states

There is a degree of uncertainty at present regarding the rate of sea level rise. There is an upper and lower estimate which produces a range of possibilities between 1m and 2m in the next 100 years. It will take another 10 to 20 years of data to determine where we are on the graph and what the projection for the future is.

Does the Defra guidance bear any resemblance to the expert opinion? In the UNIPCC AR5 Working Group 1 Summary for Policymakers page 21 is Table SPM.2

At the foot of the table is the RCP8.5 business as usual scenario for sea level rise.

The flood risk images produced in 2011 assume 0.36m of sea level rise in 50 years or about 2061. This is at the very top end of the RCP8.5 scenario estimates for 2046-2065. It is above the sea level rise projections with mitigation policies. Similarly a rise of 1m in 100 years is equivalent to the top end of the RCP8.5 scenario estimates for 2081-2100 of 0.82m. With any other mitigation scenario, the sea level rise is below that.

This means that the West of Wales Shoreline Management Plan 2 assumes that the Climate Change Act 2008 (which has increased electricity bills by at least 30% since it was passed, and blighted many rural areas with wind turbines) will have no impact at all. For added effect, it takes the most extreme estimate of sea level rise and doubles it.

It gets worse. The action group Fairbourne Facing Change has a website

The Fairbourne Facing Change Community Action Group (FFC) was established in direct response to the alarming way the West of Wales Shoreline Management Plan 2(SMP2) was publicised on national and local television. The BBC programme ‘Week in Week Out’ broadcast on Tuesday, 11th February 2014, did not present an accurate and balanced reporting of the situation. This, then followed with further inaccurate coverage culminating in unnecessary concern, anxiety, and panic for the community.

The BBC has long been the mouthpiece for an extremist view of climate change. The lack of balance has caused real distress and helped exacerbate the situation. Even though the 2013 report was unduly alarmist in its forecasts, there is nothing in the figures to support the statement that

Fairbourne is expected to enter into “managed retreat” in 2025 when the council will stop maintaining defences due to rising sea levels.

And

More than 400 homes are expected to be abandoned in the village by 2055 as part of the council’s shoreline management plan (SMP) policy.

With sea level rise of about 3mm a year, and with the forecast acceleration, the council is alleged to find it no longer worthwhile to maintain the sea defences once sea levels have risen by one or two inches, and to have completely abandoned the village after a sea level rise of less than 14 inches. With a shovel on my own I could construct an 18-inch high barrier out of the loose stone along a couple of miles of front well before 2025.
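
As a back-of-the-envelope check on those figures, here is the linear arithmetic, on my own assumptions of a constant 3mm a year and 2014 (the broadcast year) as the start point; the SMP itself assumes an accelerating rise.

```python
RATE_MM_PER_YEAR = 3.0     # assumption: current rate continues unchanged
START_YEAR = 2014          # assumption: the year of the broadcast

for year in (2025, 2055):
    rise_mm = (year - START_YEAR) * RATE_MM_PER_YEAR
    print(f"By {year}: {rise_mm:.0f} mm, or {rise_mm / 25.4:.1f} inches")
```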

Kevin Marshall

 

 

Defining “Temperature Homogenisation”

Summary

The standard definition of temperature homogenisation is of a process that cleanses the temperature data of measurement biases, leaving only variations caused by real climatic or weather variations. This is at odds with GHCN & GISS adjustments, which delete some data and add in other data as part of the homogenisation process. A more general definition is to make the data more homogenous, for the purposes of creating regional and global average temperatures. This is only compatible with the standard definition if one assumes that there are no real data trends existing within the homogenisation area. From various studies it is clear that there are cases where this assumption does not hold good. The likely impacts include:-

  • Homogenised data for a particular temperature station will not be the cleansed data for that location. Instead it becomes a grid reference point, encompassing data from the surrounding area.
  • Different densities of temperature data may lead to different degrees to which homogenisation results in smoothing of real climatic fluctuations.

Whether or not this failure of understanding is limited to a number of isolated instances with a near zero impact on global temperature anomalies is an empirical matter that will be the subject of my next post.

Introduction

A common feature of many concepts involved with climatology, the associated policies and sociological analyses of non-believers, is a failure to clearly understand the terms used. In the past few months it has become evident to me that this failure of understanding extends to the term temperature homogenisation. In this post I look at the ambiguity of the standard definition against the actual practice of homogenising temperature data.

The Ambiguity of the Homogenisation Definition

The World Meteorological Organisation in its 2004 Guidelines on Climate Metadata and Homogenization1 wrote this explanation.

Climate data can provide a great deal of information about the atmospheric environment that impacts almost all aspects of human endeavour. For example, these data have been used to determine where to build homes by calculating the return periods of large floods, whether the length of the frost-free growing season in a region is increasing or decreasing, and the potential variability in demand for heating fuels. However, for these and other long-term climate analyses –particularly climate change analyses– to be accurate, the climate data used must be as homogeneous as possible. A homogeneous climate time series is defined as one where variations are caused only by variations in climate.

Unfortunately, most long-term climatological time series have been affected by a number of nonclimatic factors that make these data unrepresentative of the actual climate variation occurring over time. These factors include changes in: instruments, observing practices, station locations, formulae used to calculate means, and station environment. Some changes cause sharp discontinuities while other changes, particularly change in the environment around the station, can cause gradual biases in the data. All of these inhomogeneities can bias a time series and lead to misinterpretations of the studied climate. It is important, therefore, to remove the inhomogeneities or at least determine the possible error they may cause.

That is, temperature homogenisation is necessary to isolate and remove what Steven Mosher has termed measurement biases2 from the real climate signal. But how does this isolation occur?

Venema et al 20123 states the issue more succinctly.

The most commonly used method to detect and remove the effects of artificial changes is the relative homogenization approach, which assumes that nearby stations are exposed to almost the same climate signal and that thus the differences between nearby stations can be utilized to detect inhomogeneities (Conrad and Pollak, 1950). In relative homogeneity testing, a candidate time series is compared to multiple surrounding stations either in a pairwise fashion or to a single composite reference time series computed for multiple nearby stations. (Italics mine)
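
As a purely illustrative sketch of the relative homogenisation idea in that quote, the snippet below differences a candidate series against a composite of its neighbours and flags the date where the difference series shifts most. Real algorithms (SNHT, pairwise detection and the like) are far more elaborate, and none of the names here come from GHCN or Berkeley Earth code.

```python
import pandas as pd

def difference_series(candidate: pd.Series, neighbours: pd.DataFrame) -> pd.Series:
    """Candidate minus a simple composite of nearby stations; the shared climate signal largely cancels."""
    return candidate - neighbours.mean(axis=1)

def crude_breakpoint(diff: pd.Series) -> pd.Timestamp:
    """Date at which the mean of the difference series shifts the most."""
    scores = {t: abs(diff[:t].mean() - diff[t:].mean()) for t in diff.index[12:-12]}
    return max(scores, key=scores.get)
```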

Blogger …and Then There’s Physics (ATTP) partly recognizes these issues may exist in his stab at explaining temperature homogenisation4.

So, it all sounds easy. The problem is, we didn’t do this and – since we don’t have a time machine – we can’t go back and do it again properly. What we have is data from different countries and regions, of different qualities, covering different time periods, and with different amounts of accompanying information. It’s all we have, and we can’t do anything about this. What one has to do is look at the data for each site and see if there’s anything that doesn’t look right. We don’t expect the typical/average temperature at a given location at a given time of day to suddenly change. There’s no climatic reason why this should happen. Therefore, we’d expect the temperature data for a particular site to be continuous. If there is some discontinuity, you need to consider what to do. Ideally you look through the records to see if something happened. Maybe the sensor was moved. Maybe it was changed. Maybe the time of observation changed. If so, you can be confident that this explains the discontinuity, and so you adjust the data to make it continuous.

What if there isn’t a full record, or you can’t find any reason why the data may have been influenced by something non-climatic? Do you just leave it as is? Well, no, that would be silly. We don’t know of any climatic influence that can suddenly cause typical temperatures at a given location to suddenly increase or decrease. It’s much more likely that something non-climatic has influenced the data and, hence, the sensible thing to do is to adjust it to make the data continuous. (Italics mine)

The assumption that nearby temperature stations have the same (or very similar) climatic signal, if true, would mean that homogenisation would cleanse the data of the impurities of measurement biases. But there is only a cursory glance given to the data. For instance, when Kevin Cowtan gave an explanation of the fall in average temperatures at Puerto Casado, neither he nor anyone else checked to see if the explanation stacked up, beyond checking that there had been a documented station move at roughly that time. Yet the station move is at the end of the drop in temperatures, and a few minutes’ checking would have confirmed that other nearby stations exhibit very similar temperature falls5. If you have a preconceived view of how the data should be, then a superficial explanation that conforms to that preconception will be sufficient. If you accept the authority of experts over personally checking for yourself, then the claim by experts that there is not a problem is sufficient. Those with no experience of checking the outputs following processing of complex data will not appreciate the issues involved.

However, this definition of homogenisation appears to be different from that used by GHCN and NASA GISS. When Euan Mearns looked at temperature adjustments in the Southern Hemisphere and in the Arctic6, he found numerous examples in the GHCN and GISS homogenisations of infilling of some missing data and, to a greater extent, deletion of huge chunks of temperature data. For example, this graphic is Mearns’ spreadsheet of adjustments between GHCNv2 (raw data + adjustments) and GHCNv3 (homogenised data) for 25 stations in Southern South America. The yellow cells are where V2 data exist but V3 data do not; the green cells where V3 data exist but V2 data do not.

Definition of temperature homogenisation

A more general definition that encompasses the GHCN / GISS adjustments is of broadly making the data homogenous. It is not done by simply blending the data together and smoothing it out. Homogenisation also adjusts anomalous data as a result of pairwise comparisons between local temperature stations, or, in the case of extreme differences in the GHCN / GISS process, deletes the most anomalous data. This is a much looser and broader process than the homogenisation of milk, or putting some food through a blender.

I cover the definition in more depth in the appendix.

The Consequences of Making Data Homogeneous

A consequence of cleansing the data in order to make it more homogenous is a distinction that is missed by many. It arises from making the strong assumption that there are no climatic differences between the temperature stations in the homogenisation area.

Homogenisation is aimed at adjusting for the measurement biases to give a climatic reading for the location of the temperature station that is a closer approximation to what that reading would be without those biases. With the strong assumption, making the data homogenous is identical to removing the non-climatic inhomogeneities. Cleansed of these measurement biases, the temperature data is then both the average temperature readings that would have been generated if the temperature station had been free of biases and a representative reading for the area. This latter aspect is necessary to build up a global temperature anomaly, which is constructed by dividing the surface into a grid. Homogenisation, in the sense of making the data more homogenous by blending, is an inappropriate term. All that is happening is adjusting for anomalies within the data through comparisons with local temperature stations (the GHCN / GISS method) or comparisons with an expected regional average (the Berkeley Earth method).
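
To make the grid-averaging step concrete, here is a minimal sketch of assigning homogenised station series to 5° cells and averaging within each cell. The data layout (a DataFrame of monthly anomalies with one column per station, plus a metadata table with "lat" and "lon" columns) is my assumption for illustration, not how any of the agencies actually store their data.

```python
import pandas as pd

def gridcell_averages(anoms: pd.DataFrame, meta: pd.DataFrame, cell: float = 5.0) -> pd.DataFrame:
    """Average homogenised station anomalies within each cell of a regular grid."""
    corners = (meta[["lat", "lon"]] // cell) * cell            # south-west corner of each station's cell
    labels = corners["lat"].astype(str) + "," + corners["lon"].astype(str)
    return anoms.T.groupby(labels).mean().T                    # one averaged series per occupied cell
```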

But if the strong assumption does not hold, homogenisation will adjust away these climate differences, and will to some extent fail to eliminate the measurement biases. Homogenisation is in fact made more necessary if movements in average temperatures are not the same everywhere and the spread of temperature data is spatially uneven. Then homogenisation needs not only to remove the anomalous data, but also to make specific locations more representative of the surrounding area. This enables any imposed grid structure to create an estimated average for that area by averaging the homogenized temperature data sets within the grid area. As a consequence, the homogenised data for a temperature station will cease to be a closer approximation to what the thermometers would have read free of any measurement biases. As homogenisation is calculated by comparisons with temperature stations beyond those immediately adjacent, there will be, to some extent, influences of climatic changes beyond the local temperature stations. The consequences of climatic differences within the homogenisation area include the following.

  • The homogenised temperature data for a location could appear largely unrelated to the original data or to the data adjusted for known biases. This could explain the homogenised Reykjavik temperature, where Trausti Jonsson of the Icelandic Met Office, who had been working with the data for decades, could not understand the GHCN/GISS adjustments7.
  • The greater the density of temperature stations in relation to the climatic variations, the less that climatic variations will impact on the homogenisations, and the greater will be the removal of actual measurement biases. Climate variations are unlikely to be much of an issue with the Western European and United States data. But on the vast majority of the earth’s surface, whether land or sea, coverage is much sparser.
  • If the climatic variation at a location is of a different magnitude to that of other locations in the homogenisation area, but over the same time periods and in the same direction, then the data trends will be largely retained. For instance, in Svalbard the warming temperature trends of the early twentieth century and from the late 1970s were much greater than elsewhere, so were adjusted downwards8.
  • If there are differences in the rate of temperature change, or in the time periods over which similar changes occur, then any “anomalous” data due to climatic differences at the location will be eliminated or severely adjusted, on the same basis as “anomalous” data due to measurement biases. For instance, in a large part of Paraguay at the end of the 1960s average temperatures fell by around 1°C. Because this phenomenon did not occur in the surrounding areas, both the GHCN and Berkeley Earth homogenisation processes adjusted out this trend. As a consequence of this adjustment, a mid-twentieth century cooling in the area was effectively adjusted out of the data9.
  • If a large proportion of temperature stations in a particular area have consistent measurement biases, then homogenisation will retain those biases, as they will not appear anomalous within the data. For instance, much of the extreme warming post 1950 in South Korea is likely to have been a result of urbanization10.

Other Comments

Homogenisation is just part of the process of adjusting data for the twin purposes of attempting to correct for biases and building regional and global temperature anomalies. It cannot, for instance, correct for time of observation biases (TOBS); this needs to be done prior to homogenisation. Neither will homogenisation build a global temperature anomaly. Extrapolating from the limited data coverage is a further process, whether for fixed temperature stations on land or the ship measurements used to calculate the ocean surface temperature anomalies. This extrapolation has further difficulties. For instance, in a previous post11 I covered a potential issue with the Gistemp proxy data for Antarctica prior to permanent bases being established on the continent in the 1950s. Making the data homogenous is but the middle part of a wider process.

Homogenisation is a complex process. The Venema et al 20123 paper on the benchmarking of homogenisation algorithms demonstrates that different algorithms produce significantly different results. What is clear from the original posts on the subject by Paul Homewood and the more detailed studies by Euan Mearns and Roger Andrews at Energy Matters, is that the whole process of going from the raw monthly temperature readings to the final global land surface average trends has thrown up some peculiarities. In order to determine whether they are isolated instances that have near zero impact on the overall picture, or point to more systematic biases that result from the points made above, it is necessary to understand the data available in relation to the overall global picture. That will be the subject of my next post.

Kevin Marshall

Notes

  1. GUIDELINES ON CLIMATE METADATA AND HOMOGENIZATION by Enric Aguilar, Inge Auer, Manola Brunet, Thomas C. Peterson and Jon Wieringa
  2. Steven Mosher – Guest post : Skeptics demand adjustments 09.02.2015
  3. Venema et al 2012 – Venema, V. K. C., Mestre, O., Aguilar, E., Auer, I., Guijarro, J. A., Domonkos, P., Vertacnik, G., Szentimrey, T., Stepanek, P., Zahradnicek, P., Viarre, J., Müller-Westermeier, G., Lakatos, M., Williams, C. N., Menne, M. J., Lindau, R., Rasol, D., Rustemeier, E., Kolokythas, K., Marinova, T., Andresen, L., Acquaotta, F., Fratianni, S., Cheval, S., Klancar, M., Brunetti, M., Gruber, C., Prohom Duran, M., Likso, T., Esteban, P., and Brandsma, T.: Benchmarking homogenization algorithms for monthly data, Clim. Past, 8, 89-115, doi:10.5194/cp-8-89-2012, 2012.
  4. …and Then There’s Physics – Temperature homogenisation 01.02.2015
  5. See my post Temperature Homogenization at Puerto Casado 03.05.2015
  6. For example

    The Hunt For Global Warming: Southern Hemisphere Summary

    Record Arctic Warmth – in 1937

  7. See my post Reykjavik Temperature Adjustments – a comparison 23.02.2015
  8. See my post RealClimate’s Mis-directions on Arctic Temperatures 03.03.2015
  9. See my post Is there a Homogenisation Bias in Paraguay’s Temperature Data? 02.08.2015
  10. NOT A LOT OF PEOPLE KNOW THAT (Paul Homewood) – UHI In South Korea Ignored By GISS 14.02.2015

Appendix – Definition of Temperature Homogenisation

When discussing temperature homogenisations, nobody asks what the term actually means. In my house we consume homogenised milk. This is the same as the pasteurized milk I drank as a child except for one aspect. As a child I used to compete with my siblings to be the first to open a new pint bottle, as it had the cream on top. The milk now does not have this cream, as it is blended in, or homogenized, with the rest of the milk. Temperature homogenizations are different, involving changes to figures, along with (at least with the GHCN/GISS data) filling the gaps in some places and removing data in others1.

But rather than note the differences, it is better to consult an authoritative source. From Dictionary.com, the definitions of homogenize are:-

verb (used with object), homogenized, homogenizing.

  1. to form by blending unlike elements; make homogeneous.
  2. to prepare an emulsion, as by reducing the size of the fat globules in (milk or cream) in order to distribute them equally throughout.
  3. to make uniform or similar, as in composition or function:

    to homogenize school systems.

  4. Metallurgy. to subject (metal) to high temperature to ensure uniform diffusion of components.

Applying the dictionary definitions, homogenization in science is about blending unlike elements together to make them uniform; it is not about making additions to or subtractions from the data set, or adjusting the data. This is particularly the case in chemistry.

For USHCN and NASA GISS temperature data, homogenization involves removing or adjusting elements in the data that are markedly dissimilar from the rest. It can also mean infilling data that was never measured. The verb homogenize does not fit the processes at work here. This has led some, like Paul Homewood, to refer to the process as data tampering or worse. A better idea is to look further at the dictionary.

Again from Dictionary.com, the first two definitions of the adjective homogeneous are:-

  1. composed of parts or elements that are all of the same kind; not heterogeneous:

a homogeneous population.

  2. of the same kind or nature; essentially alike.

I would suggest that temperature homogenization is a loose term for describing the process of making the data more homogeneous; that is, for smoothing out the data in some way. A false analogy is when I make a vegetable soup. After cooking I end up with a stock containing lumps of potato, carrot, leeks etc. I put it through the blender to get an even consistency. I end up with the same weight of soup before and after. A similar process of getting the same out of homogenization as was put in is clearly not what is happening to temperatures. The aim of making the data homogeneous is both to remove anomalous data and to blend the data together.

Understanding GISS Temperature Adjustments

A couple of weeks ago something struck me as odd. Paul Homewood had been going on about all sorts of systematic temperature adjustments, showing clearly that the past has been cooled between the USHCN “raw data” and the GISS Homogenised data used in the data sets. When I looked at eight stations in Paraguay, at Reykjavik and at two stations on Spitzbergen, I was able to corroborate this result. Yet Euan Mearns has looked at groups of stations in central Australia and in Iceland, in both cases finding no difference in warming trend between the raw and adjusted temperature data. I thought that Mearns must be wrong, so when he published on 26 stations in Southern Africa1, I set out to evaluate those results and find the flaw. I have been unable to fully reconcile the differences, but the notes I have made on the Southern African stations may enable a greater understanding of temperature adjustments. What I do find is that clear trends in the data across a wide area have been largely removed, bringing the data into line with Southern Hemisphere trends. The most important point to remember is that looking at data in different ways can lead to different conclusions.

Net difference and temperature adjustments

I downloaded three lots of data – raw, GHCNv3 and GISS Homogenised (GISS H) – then replicated Mearns’ method of calculating temperature anomalies. Using 5 year moving averages, in Chart 1 I have mapped the trends in the three data sets.
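
For anyone wanting to replicate this step outside a spreadsheet, a minimal Python sketch of the anomaly-and-smoothing calculation might look like the following. The monthly DataFrame layout and the 1961-1990 baseline are my assumptions rather than Mearns’ documented choices.

    import pandas as pd

    def smoothed_anomalies(monthly, base_start=1961, base_end=1990):
        # monthly: DataFrame of temperatures in °C indexed by a DatetimeIndex,
        # one column per station.
        annual = monthly.groupby(monthly.index.year).mean()      # calendar-year means
        baseline = annual.loc[base_start:base_end].mean()        # per-station baseline
        anomalies = annual - baseline
        return anomalies.rolling(window=5, center=True).mean()   # 5 year moving average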

There is a large divergence prior to 1900, but for the twentieth century the warming trend is not excessively increased. Further, the warming trend from around 1900 is about half of that in the GISTEMP Southern Hemisphere or global anomalies. Looked at in this way, Mearns would appear to have a point. But there has been considerable downward adjustment of the early twentieth century warming, so Homewood’s claim of cooling the past is also substantiated. This might be the more important aspect, as the adjusted data makes the warming since the mid-1970s appear unusual.

Another feature is that the GHCNv3 data is very close to the GISS Homogenised data. So looking at the GISS H data used in the creation of the temperature data sets is very much the same as looking at the GHCNv3 data that forms the source for GISS.

But why not mention the pre-1900 data where the divergence is huge?

The number of stations gives a clue in Chart 2.

It was only in the late 1890s that there were more than five stations of raw data. The first year in which there are more data points left in than removed is 1909 (5 against 4).
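
The counting behind Chart 2 is simple enough to sketch in code. The assumed layout is a pair of annual-mean DataFrames (raw and GISS H), indexed by year with matching station columns; the function is illustrative only.

    import pandas as pd

    def station_counts(raw, giss_h):
        # For each year, count stations reporting raw data and how many of
        # those survive into the GISS Homogenised set.
        counts = pd.DataFrame({
            "raw": raw.notna().sum(axis=1),
            "retained": (raw.notna() & giss_h.notna()).sum(axis=1),
        })
        counts["removed"] = counts["raw"] - counts["retained"]
        return counts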

Removed data would appear to have a role in the homogenisation process. But is it material? Chart 3 graphs five year moving averages of raw data anomalies, split between the raw data removed and retained in GISS H, along with the average for the 26 stations.

Where there are a large number of data points, the removal of data does not materially affect the larger picture, but it does remove some of the extreme “anomalies” from the data set. Where there is very little data available, the impact is much larger; that is particularly the case prior to 1910. After 1910, any data deletions pale into insignificance next to the adjustments.

The Adjustments

I plotted the average difference between the raw data and the GISS Homogenised data (that is, the adjustment), along with the max and min values, in Chart 4.
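
The calculation behind Chart 4 can be sketched on the same assumed layout of annual DataFrames with matching station columns:

    import pandas as pd

    def adjustment_spread(raw, giss_h):
        # Per-year mean, maximum and minimum adjustment (GISS H minus raw)
        # across the stations.
        adjustment = giss_h - raw
        return pd.DataFrame({
            "mean": adjustment.mean(axis=1),
            "max": adjustment.max(axis=1),
            "min": adjustment.min(axis=1),
        })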

The max and min of the net adjustments are consistent with Euan Mearns’ graph “safrica_deltaT” when flipped upside down and made back to front. It shows a difficulty of comparing adjusted data where the whole series has been shifted. For instance, the maximum figures are dominated by Windhoek, which I looked at a couple of weeks ago. Between the raw data and the GISS Homogenised there was a uniform increase of 3.6°C. There were a number of other, lesser differences that I have listed in note 3. Chart 5 shows the impact that adjusting the adjustments (removing these uniform shifts) has on both the range of the adjustments and the pattern of the average adjustments.

Comparing this with the average variance between the raw data and the GISS Homogenised shows the closer fit of the adjustments to the variance. Please note the difference in scale of Chart 6 from the above!

The earlier period has by far the most deletions of data, hence the lack of closeness of fit between the average adjustment and the average variance. After 1945, the consistent pattern of the average adjustment being slightly higher than the average variance is probably due more to a light-touch approach to adjustment corrections than to further data deletions. There might be other reasons as well for the lack of fit, such as the impact of different lengths of data set on the anomaly calculations.

Update 15/03/15

Of note is that the adjustments in the early 1890s and around 1930 are about three times the size of the change in trend. This might be partly due to zero net adjustments in 1903 and partly due to the small downward adjustments post 2000.

The consequences of the adjustments

It should be remembered that GISS use this data to create the GISTEMP surface temperature anomalies. In Chart 7 I have amended Chart 1 to include Southern Hemisphere annual mean data on the same basis as the raw data and GISS H.

It seems fairly clear that the homogenisation process has succeeded in bringing the Southern Africa data sets into line with the wider data sets. Whether the early twentieth century warming and mid-century cooling are outliers that have been correctly cleansed is a subject for further study.

What has struck me in doing this analysis is that looking at individual surface temperature stations becomes nonsensical, as they are effectively treated as grid reference points. Thus comparing the station moves for Reykjavik with the adjustments will not achieve anything. The implications of this insight will have to wait for another day.

Kevin Marshall

Notes

1. 26 Data sets

The temperature stations, with the periods for the raw data are below.

Location | Lat | Lon | ID | Pop. | Years
Harare | 17.9 S | 31.1 E | 156677750005 | 601,000 | 1897 – 2011
Kimberley | 28.8 S | 24.8 E | 141684380004 | 105,000 | 1897 – 2011
Gwelo | 19.4 S | 29.8 E | 156678670010 | 68,000 | 1898 – 1970
Bulawayo | 20.1 S | 28.6 E | 156679640005 | 359,000 | 1897 – 2011
Beira | 19.8 S | 34.9 E | 131672970000 | 46,000 | 1913 – 1991
Kabwe | 14.4 S | 28.5 E | 155676630004 | 144,000 | 1925 – 2011
Livingstone | 17.8 S | 25.8 E | 155677430003 | 72,000 | 1918 – 2010
Mongu | 15.2 S | 23.1 E | 155676330003 | < 10,000 | 1923 – 2010
Mwinilunga | 11.8 S | 24.4 E | 155674410000 | < 10,000 | 1923 – 1970
Ndola | 13.0 S | 28.6 E | 155675610000 | 282,000 | 1923 – 1981
Capetown Safr | 33.9 S | 18.5 E | 141688160000 | 834,000 | 1880 – 2011
Calvinia | 31.5 S | 19.8 E | 141686180000 | < 10,000 | 1941 – 2011
East London | 33.0 S | 27.8 E | 141688580005 | 127,000 | 1940 – 2011
Windhoek | 22.6 S | 17.1 E | 132681100000 | 61,000 | 1921 – 1991
Keetmanshoop | 26.5 S | 18.1 E | 132683120000 | 10,000 | 1931 – 2010
Bloemfontein | 29.1 S | 26.3 E | 141684420002 | 182,000 | 1943 – 2011
De Aar | 30.6 S | 24.0 E | 141685380000 | 18,000 | 1940 – 2011
Queenstown | 31.9 S | 26.9 E | 141686480000 | 39,000 | 1940 – 1991
Bethal | 26.4 S | 29.5 E | 141683700000 | 30,000 | 1940 – 1991
Antananarivo | 18.8 S | 47.5 E | 125670830002 | 452,000 | 1889 – 2011
Tamatave | 18.1 S | 49.4 E | 125670950003 | 77,000 | 1951 – 2011
Porto Amelia | 13.0 S | 40.5 E | 131672150000 | < 10,000 | 1947 – 1991
Potchefstroom | 26.7 S | 27.1 E | 141683500000 | 57,000 | 1940 – 1991
Zanzibar | 6.2 S | 39.2 E | 149638700000 | 111,000 | 1880 – 1960
Tabora | 5.1 S | 32.8 E | 149638320000 | 67,000 | 1893 – 2011
Dar Es Salaam | 6.9 S | 39.2 E | 149638940003 | 757,000 | 1895 – 2011

2. Temperature trends

To calculate the trends I used the OLS method, both from the formula and using the EXCEL “LINEST” function, getting the same answer each time. If you are able please check my calculations. The GISTEMP Southern Hemisphere and global data can be accessed direct from the NASA GISS website. The GISTEMP trends are from the skepticalscience trends tool. My figures are:-
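
For anyone who would rather check the trend calculation in code than in EXCEL, an equivalent ordinary least squares slope can be computed as in the sketch below; the conversion to °C per century and the handling of missing years are my own choices.

    import numpy as np

    def ols_trend(years, anomalies):
        # Least-squares warming trend in °C per century.
        years = np.asarray(years, dtype=float)
        anomalies = np.asarray(anomalies, dtype=float)
        mask = ~np.isnan(anomalies)                    # ignore missing years
        slope, _intercept = np.polyfit(years[mask], anomalies[mask], 1)
        return slope * 100.0                           # per year -> per century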

3. Adjustments to the Adjustments

Location | Recent adjustment (°C) | Other adjustment (°C) | Other Period
Antananarivo | 0.50 | |
Beira | | 0.10 | Mid-70s + inter-war
Bloemfontein | 0.70 | |
Dar Es Salaam | 0.10 | |
Harare | | 1.10 | About 1999-2002
Keetmanshoop | 1.57 | |
Potchefstroom | -0.10 | |
Tamatave | 0.39 | |
Windhoek | 3.60 | |
Zanzibar | -0.80 | |

RealClimate’s Mis-directions on Arctic Temperatures

Summary

Real Climate attempted to rebut the claims that the GISS temperature data is corrupted with unjustified adjustments by

  • Attacking the commentary of Christopher Booker, not the primary source of the allegations.
  • Referring readers instead to a dogmatic source who claims that only 3 stations are affected, something clearly contradicted by Booker and the primary source.
  • Alleging that the complaints are solely about cooling the past, using a single counter example for Svalbard of a GISS adjustment excessively warming the past compared to the author’s own adjustments.
  • However, compared to the raw data, the author’s adjustments, based on local knowledge, were smaller than those of GISS, showing the GISS adjustments to be unjustified. But the adjustments bring the massive warming trend into line with the (still large) Reykjavik trend.
  • Examination of the site reveals that the Stevenson screen at Svalbard airport is right beside the tarmac of the runway, with the heat from planes and from snow-clearing likely affecting measurements. With increasing use of the airport over the last twenty years, it is likely the raw data trend should be reduced, and by an increasing adjustment over time, not a decreasing one.
  • Further, data from a nearby temperature station at Isfjord Radio reveals that the early twentieth century warming on Spitzbergen may have been more rapid and of greater magnitude. The GISS adjustments reduce that trend by up to 4 degrees, compared with just 1.7 degrees for the late twentieth century warming.
  • Questions arise as to how raw data for Isfjord Radio could be available for 22 years before the station was established, and how the weather station managed to keep on recording “raw data” between being destroyed and abandoned in 1941 and being re-opened in 1946.

Introduction

In climate blogging I am used to mis-direction and spin, but in this post I may have found the largest temperature adjustments to date.

In early February, RealClimate – the blog of the climate science consensus – had an article attacking Christopher Booker in the Telegraph. It had strong similarities to the methods used by the anonymous blogger ….andthentheresphysics. In a previous post I provided a diagram to illustrate ATTP’s methods.


One would expect that a blog supported by the core of the climate scientific consensus would provide a superior defence to that of an anonymous blogger who censors views that challenge his beliefs. However, RealClimate may have dug an even deeper hole. Paul Homewood covered the article on February 12th, but I feel he only scratched the surface. Using the procedures outlined above, I note the similarities include:-

  • Attacking the secondary commentary, and not mentioning the primary sources.
  • Misleading statements that understate the extent of the problem.
  • Avoiding comparison of the raw and adjusted data.
  • Single counter examples that do not stand up.

Attacking the secondary commentary

Like ATTP, RealClimate attacked the same secondary source – Christopher Booker – but a different article. True academics would have referred to Paul Homewood, the source of the allegations.

Misleading statement about number of weather stations

The article referred to was by Victor Venema of Variable Variability. The revised title is “Climatologists have manipulated data to REDUCE global warming“, but the original title can be found from the link address – http://variable-variability.blogspot.de/2015/02/evil-nazi-communist-world-government.html

It was published on 10th February and only refers to Christopher Booker’s original Telegraph article of 24th January, without naming the author or linking to it. After quoting from the article Venema states:-

Three, I repeat: 3 stations. For comparison, global temperature collections contain thousands of stations. ……

Booker’s follow-up article of 7th February states:-

Following my last article, Homewood checked a swathe of other South American weather stations around the original three. ……

Homewood has now turned his attention to the weather stations across much of the Arctic, between Canada (51 degrees W) and the heart of Siberia (87 degrees E). Again, in nearly every case, the same one-way adjustments have been made, to show warming up to 1 degree C or more higher than was indicated by the data that was actually recorded.

My diagram above was published on the 8th February, and counted 29 stations. Paul Homewood’s original article on the Arctic of 4th February lists 19 adjusted sites. If RealClimate had actually read the cited article, they would have known that the quotation was false in connection with the Arctic. Any undergraduate who made this mistake in an essay would be failed.

Misleading Counter-arguments

Øyvind Nordli – the Real Climate author – provides a counter example from his own research. He compares his adjustments of the Svalbard data (made as part of a temperature reconstruction for Spitzbergen last year) with those of NASA GISS.

Clearly, he is right in pointing out that his adjustments created a lower warming trend than those of GISS.

I checked the “raw data” against the “GISS Homogenised” data for Svalbard, and compared both with the Reykjavik data I looked at last week, as the raw data is not part of Nordli’s comparison. To make them comparable, I created anomalies based on the raw data average of 2000-2009. I have also used a 5 year centered moving average.
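
A minimal sketch of that calculation, assuming each station’s annual means are held in a pandas Series indexed by year, might be:

    import pandas as pd

    def comparable_anomalies(annual, base_start=2000, base_end=2009):
        # Anomalies from the station's own raw 2000-2009 average, smoothed with
        # a 5 year centred moving average, so that stations with very different
        # absolute temperatures can share one chart.
        baseline = annual.loc[base_start:base_end].mean()
        return (annual - baseline).rolling(window=5, center=True).mean()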

The raw data is in dark, the adjusted data in light. For Reykjavik prior to 1970 the peaks in the data have been clearly constrained, making the warming since 1980 appear far more significant. For the much shorter Svalbard series the total adjustments from GHCN and GISS reduce the warming trend by a full 1.7°C, bringing the warming trend into line with the largely unadjusted Reykjavik. The GHCN & GISS data seem to have been adjusted to a pre-conceived view of what the data should look like. What Nordli et al. have effectively done is to restore the trend present in the raw data. So Nordli et al., using data on the ground, have effectively reached a similar conclusion to Trausti Jonsson of the Iceland Met Office: the adjustments made thousands of miles away in the United States by homogenization algorithms are massive and unjustified. It just so happens that in this case they are in the opposite direction to cooling the past. I find it somewhat odd that Øyvind Nordli, an expert on local conditions, should not challenge these adjustments but chooses to give the opposite impression.

What is even worse is that there might be a legitimate reason to adjust downwards the recent warming. In 2010, Anthony Watts looked at the siting of the weather station at Svalbard Airport. Photographs show it to be right beside the runway. With frequent snow, steam de-icers will regularly pass, along with planes with hot exhausts. The case is there for a downward adjustment over the whole of the series, with an increasing trend to reflect the increasing aircraft movements. Tourism quintupled between 1991 and 2008. In addition, the University Centre in Svalbard, founded in 1993, now has 500 students.

Older data for Spitzbergen

Maybe the phenomenal warming in the raw data for Svalbard is unprecedented, despite some doubts about the adjustments. Nordli et al 2014 is titled “Long-term temperature trends and variability on Spitsbergen: the extended Svalbard Airport temperature series, 1898-2012”. It is a study that gathers together all the available data from Spitzbergen, aiming to create a composite temperature record from fragmentary records from a number of places around the islands. From NASA GISS, I can only find Isfjord Radio for the earlier period. It is about 50 km west of Svalbard Airport, so should give a similar shape of temperature anomaly. According to Nordli et al

Isfjord Radio. The station was established on 1 September 1934 and situated on Kapp Linne´ at the mouth of Isfjorden (Fig. 1). It was destroyed by actions of war in September 1941 but re-established at the same place in July 1946. From 30 June 1976 onwards, the station was no longer used for climatological purposes.

But NASA GISS has data from 1912, twenty-two years prior to the station being established, as does Berkeley Earth. I calculated a relative anomaly to Reykjavik based on 1930-1939 averages, and added the Isfjord Radio figures to the graph.

The portion of the raw data for Isfjord Radio that seems to have been recorded before any thermometer was installed shows a full 5°C rise in the 5 year moving average temperature. The anomaly for 1917 was -7.8°C, compared with 0.6°C in 1934 and 1.0°C in 1938. For Svalbard Airport the lowest anomalies are -4.5°C in 1976 and -4.7°C in 1988. The peak year is 2.4°C in 2006, followed by 1.5°C in 2007. The total GHCNv3 and GISS adjustments are also of a different order. At the start of the Svalbard series every month was adjusted up by 1.7°C. The Isfjord Radio 1917 data was adjusted up by 4.0°C on average, and 1918 by 3.5°C. February of 1916 & 1918 have been adjusted upwards by 5.4°C.

So the Spitzbergen trough-to-peak warming of 1917 to 1934 may have been more rapid and greater in magnitude than the similar warming from 1976 to 2006. But from the adjusted data one gets the opposite conclusion.

Also we find from Nordli et al

During the Second World War, and also during five winters in the period 1898–1911, no observations were made in Svalbard, so the only possibility for filling data gaps is by interpolation.

The latest any data recording could have been made was mid-1941, and the island was not reoccupied for peaceful purposes until 1946. The “raw” GHCN data is actually infill. If it followed the pattern of Reykjavik – likely the nearest recording station – temperatures would have peaked during the Second World War, not fallen.

Conclusion

Real Climate should review their articles better. You cannot rebut an enlarging problem by referring to out-of-date and dogmatic sources. You cannot pretend that unjustified temperature adjustments in one direction are somehow made right by unjustified temperature adjustments in another direction. Spitzbergen is not only cold, it clearly experiences vast and rapid fluctuations in average temperatures. Any trend is tiny compared to these fluctuations.

Is there a Homogenisation Bias in Paraguay’s Temperature Data?

Last month Paul Homewood at Notalotofpeopleknowthat looked at the temperature data for Paraguay. His original aim was to explain the GISS claims of 2014 being the hottest year.

One of the regions that has contributed to GISS’ “hottest ever year” is South America, particularly Brazil, Paraguay and the northern part of Argentina. In reality, much of this is fabricated, as they have no stations anywhere near much of this area…

….there does appear to be a warm patch covering Paraguay and its close environs. However, when we look more closely, we find things are not quite as they seem.

In “Massive Tampering With Temperatures In South America”, Homewood looked at the “three genuinely rural stations in Paraguay that are currently operating – Puerto Casado, Mariscal and San Juan.” A few days later, in “All Of Paraguay’s Temperature Record Has Been Tampered With”, he looked at the remaining six stations.

After identifying that all of the three rural stations currently operational in Paraguay had had huge warming adjustments made to their data since the 1950’s, I tended to assume that they had been homogenised against some of the nearby urban stations. Ones like Asuncion Airport, which shows steady warming since the mid 20thC. When I went back to check the raw data, it turns out all of the urban sites had been tampered with in just the same way as the rural ones.

What Homewood does not do is to check the data behind the graphs, to quantify the extent of the adjustment. This is the aim of the current post.

Warning – This post includes a lot of graphs to explain how I obtained my results.

Homewood uses comparisons of two graphs, which he helpfully provides the links to. The raw GHCN data + USHCN corrections is available here, up until 2011 only. The current “after GISS homogeneity adjustment” data is available here.

For all nine data sets I downloaded both the raw and homogenised data. By simple subtraction I found the differences. In any one year they are mostly the same for each month, but for clarity I selected a single month – October – the month of my wife’s birthday.
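
The subtraction itself is trivial; a sketch of it, under my assumed layout of monthly DataFrames with matching station columns, would be:

    import pandas as pd

    def october_adjustments(raw, giss_h):
        # October-only adjustment (GISS Homogenised minus raw) for each station.
        # Both inputs are monthly DataFrames in °C indexed by a DatetimeIndex.
        diff = giss_h - raw                    # adjustment applied, month by month
        return diff[diff.index.month == 10]    # keep just the October values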

For the Encarnacion (27.3 S,55.8 W) data sets the adjustments are as follows.

In 1967 the adjustment was -1.3°C, in 1968 +0.1°C. There is cooling of the past.

The average adjustments for all nine data sets is as follows.

This pattern is broadly consistent across all data sets. These are the maximum and minimum adjustments.

However, this issue is clouded by the special adjustments required for the Pedro Juan CA data set. The raw data set has been patched together from four separate files.

Removing it does not affect the average picture.

But it does affect the maximum and minimum adjustments. This shows the consistency in the adjustment pattern.

The data sets are incomplete. Before 1941 there is only one data set – Asuncion Aero. The count for October each year is as follows.

In recent years there are huge gaps in the data, but for the late 1960s when the massive switch in adjustments took place, there are six or seven pairs of raw and adjusted data.

Paul Homewood’s allegation that the past has been cooled is confirmed. However, it does not give a full understanding of the impact on the reported data. To assist, for the full year mean data, I have created temperature anomalies based on the average anomaly in that year.

The raw data shows a significant cooling of up to 1°C in the late 1960s. If anything, there has been over-compensation in the adjustments. Since 1970, any warming in the adjusted data has been through further adjustments.

Is this evidence of a conspiracy to “hide a decline” in Paraguayan temperatures? I think not. My alternative hypothesis is that this decline, consistent over a number of thermometers, was unexpected. Anybody looking at just one of these data sets recently would assume that a step change in 40-year-old data from a distant third world country is bound to be incorrect. (Shub has a valid point.) That change goes against the known warming trend for over a century from the global temperature data sets, and against the near stationary temperatures from 1950-1975. More importantly, cooling goes against the “known” major driver of recent temperature change – rises in greenhouse gas levels. Do you trust some likely ropey instrument data, or trust your accumulated knowledge of the world? The clear answer is that the instruments are wrong. Homogenisation is then not to the local instruments in the surrounding areas, but to the established expert wisdom about the world. The consequent adjustment cools past temperatures by one degree. The twentieth century warming is enhanced as a consequence of not believing what the instruments are telling you. The problem is that this step change is replicated over a number of stations. Paul Homewood has shown that it probably extends into Bolivia as well.

But what happens if the converse occurs? What if there is a step rise in some ropey data set from the 1970s or 1980s? This might be large, but not inconsistent with what is known about the world. It is unlikely to be adjusted downwards. So if there have been local or regional step changes in average temperature over time, both up and down, the impact will be to increase the apparent rate of warming if the data analysts believe that the world is warming and that human beings are the cause of it.

Further analysis is required to determine the extent of the problem – but not from this unpaid blogger giving up my weekends and evenings.

Kevin Marshall

All first time comments are moderated. Please also use the comments as a point of contact, stating clearly that this is the case and I will not click the publish button, subject to it not being abusive. I welcome other points of view, though may give a robust answer.

The Propaganda methods of ….and Then There’s Physics on Temperature Homogenisation

There has been a rash of blog articles about temperature homogenisations that is challenging the credibility of the NASA GISS temperature data. This has led to attempts by the anonymous blogger andthentheresphysics (ATTP) to crudely deflect from the issues identified. It is a propagandist’s trick of turning people’s perspectives. Instead of a dispute about some scientific data, ATTP turns the affair into a dispute between those with authority and expertise in scientific analysis and a few crackpot conspiracy theorists.

The issues on temperature homogenisation are to do with the raw surface temperature data and the adjustments made to remove anomalies or biases within the data. “Homogenisation” is a term used for the process of adjusting the anomalous data into line with that from the surrounding data.

The blog articles can be split into three categories. The primary articles are those that make direct reference to the raw data set and the surrounding adjustments. The secondary articles refer to the primary articles, and comment upon them. The tertiary articles are directed at the secondary articles, making little or no reference to the primary articles. I perceive the two ATTP articles as fitting into the scheme below.

Primary Articles

The source of complaints about temperature homogenisations is Paul Homewood at his blog notalotofpeopleknowthat. The source of the data is NASA’s Goddard Institute for Space Studies (GISS) database. For any weather station GISS provide nice graphs of the temperature data. The current “after GISS homogeneity adjustment” data is available here and the raw GHCN data + USHCN corrections is available here, up until 2011 only. Homewood’s primary analysis was to show the “raw data” and adjusted graphs side by side.

20/01/15 Massive Tampering With Temperatures In South America

This looked at all three available rural stations in Paraguay. The data from all three – Puerto Casado, Mariscal and San Juan Bautista/Misiones – had the same pattern of homogenization adjustments. That is, cooling of the past, so that instead of the raw data showing the 1960s being warmer than today, it was cooler. What could they have been homogenized to?

26/01/15 All Of Paraguay’s Temperature Record Has Been Tampered With

This checked the six available urban sites in Paraguay. Homewood’s conclusion was that

warming adjustments have taken place at every single, currently operational site in Paraguay.

How can homogenization adjustments all go the same way? There is no valid reason for making such adjustments, as there is no reference point for the adjustments.

29/01/15 Temperature Adjustments Around The World

Homewood details other examples from Southern Greenland, Iceland, Northern Russia, California, Central Australia and South-West Ireland. Instead of comparing the raw with the adjusted data, he compared the old adjusted data with the recent data. Adjustment decisions are changing over time, making the adjusted data sets give even more pronounced warming trends.

30/01/15 Cooling The Past In Bolivia

Then he looked at all 14 available stations in neighbouring Bolivia. His conclusion

At every station, bar one, we find the ….. past is cooled and the present warmed.”

(The exception was La Paz, where the cooling trend in the raw data had been reduced.)

Why choose Paraguay in the first place? In the first post, Homewood explains that within a NOAA temperature map for the period 1981-2010 there appeared to be a warming hotspot around Paraguay. Being a former accountant, he checked the underlying data to see if the warming existed in the data. Finding an anomaly in one area, he checked more widely.

The other primary articles are

26/01/15 Kevin Cowton NOAA Paraguay Data

This YouTube video was made in response to Christopher Booker’s article in the Telegraph, a secondary source of data. Cowton assumes Booker is the primary source and that he is criticizing NOAA data. A screen shot of the first paragraph shows both assumptions are untrue.

Further, if you read down the article, Cowton’s highlighting of the data from one weather station is also misleading. Booker points to three stations, but Cowton illustrates just one.

Despite this, it still ranks as a primary source, as there are direct references to the temperature data and the adjustments. They are not GISS adjustments, but might be the same.

29/01/15 Shub Niggurath – The Puerto Casado Story

Shub looked at the station moves. He found that the metadata for the station data is a mess, so there is no actual evidence of the location changing. But Shub reasons that the fact that there was a step change in the data meant that the station moved, and the fact that it moved meant there was a change. Shub is a primary source as he looks at the reason for the adjustment.


Secondary Articles

The three secondary articles by Christopher Booker, James Delingpole and BishopHill are just the connectors in this story.


Tertiary articles of “…and Then There’s Physics”

25/01/15 Puerto Cascado

This looked solely at Booker’s article. It starts

Christopher Booker has a new article in the The Telegraph called Climategate, the sequel: How we are STILL being tricked with flawed data on global warming. The title alone should be enough to convince anyone sensible that it isn’t really worth reading. I, however, not being sensible, read it and then called Booker an idiot on Twitter. It was suggested that rather than insulting him, I should show where he was wrong. Okay, this isn’t really right, as there’s only so much time and effort available, and it isn’t really worth spending it rebutting Booker’s nonsense.

However, thanks to a tweet from Ed Hawkins, it turns out that it is really easy to do. Booker shows data from a site in Paraguay (Puerto Casado) in which the data was adjusted from a trend of -1.37o C per century to +1.36o C per century. Shock, horror, a conspiracy?


ATTP is highlighting an article, but is strongly discouraging anybody from reading it. That is why the referral is shown as a red line in the graphic above. He then says he is not going to provide a rebuttal. ATTP is as good as his word and does not provide a rebuttal. Basically he is saying “Don’t look at that rubbish, look at the real authority“. But he is wrong for a number of reasons.

  1. ATTP provides misdirection to an alternative data source. Booker quite clearly states that the source of the data is the NASA GISS temperature set. ATTP cites Berkeley Earth.
  2. Booker clearly states that there are three rural temperature stations, spatially spread, that show similar results. ATTP’s argument that a single site was homogenized with the others in the vicinity falls over.
  3. This was further undermined by Paul Homewood’s posting on the same day on the other 6 available sites in Paraguay, all giving similar adjustments.
  4. It was further undermined by Paul Homewood’s posting on 30th January on all 14 sites in Bolivia.

The story is not of a wizened old hack making some extremist claims without any foundation, but of a retired accountant seeing an anomaly and exploring it. In audit, if there is an issue then you keep exploring it until you can bottom it out. Paul Homewood has found an issue, found it is extensive, but is still far from finding the full extent or depth. ATTP, when confronted by my summary of the 23 stations that corroborate each other, chose to delete it. He has now issued an update.

Update 4/2/2015 : It’s come to my attention that some are claiming that this post is misleading my readers. I’m not quite sure why, but it appears to be related to me not having given proper credit for the information that Christopher Booker used in his article. I had thought that linking to his article would allow people to establish that for themselves, but – just to be clear – the idiotic, conspiracy-laden, nonsense originates from someone called Paul Homewood, and not from Chistopher Booker himself. Okay, everyone happy now? J

ATTP cannot accept that he is wrong. He has totally misrepresented the arguments. When confronted with alternative evidence ATTP resorts to vitriolic claims. If someone is on the side of truth and science, they will encourage people to compare and contrast the evidence. He seems to have forgotten the advice about what to do when in a hole…..

01/02/15 Temperature homogenisation

ATTP’s article on Temperature Homogenisation starts

Amazing as it may seem, the whole tampering with temperature data conspiracy has managed to rear its ugly head once again. James Delingpole has a rather silly article that even Bishop Hill calls interesting (although, to be fair, I have a suspicion that in “skeptic” land, interesting sometimes means “I know this is complete bollocks, but I can’t bring myself to actually say so”). All of Delingpole’s evidence seems to come from “skeptic” bloggers, whose lack of understand of climate science seems – in my experience – to be only surpassed by their lack of understanding of the concept of censorship J.

ATTP starts with a presumption of being on the side of truth, with no fault possible on his side. Any objections are due to a conscious effort to deceive. The theory of cock-up, or of people simply not checking their data, does not seem to have occurred to him. Then there is a link to Delingpole’s secondary article, but calling it “silly” again deters readers from looking for themselves. If they did, the readers would be presented with flashing images of all the “before” and “after” GISS graphs from Paraguay, along with links to the 6 global sites and Shub’s claims that there is a lack of evidence for the Puerto Casado site being moved. Delingpole was not able to include the more recent evidence from Bolivia, which further corroborates the story.

He then makes a tangential reference to his deleting my previous comments, though I never once used the term “censorship”, nor did I tag the article “climate censorship”, as I have done to some others. Like on basic physics, ATTP claims to have a superior understanding of censorship.

There are then some misdirects.

  • The long explanation of temperature homogenisation makes some good points. But what it does not do is explain that the size and direction of any adjustment is a matter of opinion, and as such can be wrong. It is a misdirection to say that the secondary sources are against any adjustments. They are against adjustments that create biases within the data.
  • Quoting Richard Betts’s comment on Booker’s article about negative adjustments in sea temperature data is a misdirection, as Booker (a secondary source) was talking about Paraguay, a land-locked country.
  • Referring to Cowton’s alternative analysis is another misdirect, as pointed out above. Upon reflection, ATTP may find it a tad embarrassing to have this as his major source of authority.

Conclusions

When I studied economics, many lecturers said that if you want to properly understand an argument or debate you need to look at the primary sources, and then compare and contrast the arguments. Although the secondary sources were useful background, particularly in a contentious issue, it is the primary sources on all sides that enable a rounded understanding. Personally, being challenged by viewpoints that I disagreed with enhanced my overall understanding of the subject.

ATTP has managed to turn this on its head. He uses methods akin to those of the crudest propagandists of the last century. They started from deeply prejudiced positions; attacked an opponent’s integrity and intelligence; and then deflected away to what they wanted to say. They never gave the slightest hint that their own side might be at fault, or any acknowledgement that the other side might have a valid point. For ATTP, and similar modern propagandists, rather than have a debate about the quality of evidence and science, it becomes a war of words between “deniers“, “idiots” and “conspiracy theorists” on one side, and the basic physics and the overwhelming evidence that supports that science on the other.

If there is any substance to these allegations concerning temperature adjustments, then for dogmatists like ATTP they become a severe challenge to their view of the world. If temperature records have systematic adjustment biases then climate science loses its grip on reality. The climate models cease to be about understanding the real world, and instead conform to people’s flawed opinions about the world.

The only way to properly understand the allegations is to examine the evidence. That is to look at the data behind the graphs Homewood presents. I have now done that for the nine Paraguayan weather stations. The story behind that will have to await another day. However, although I find Paul Homewood’s claims of systematic biases in the homogenisation process to be substantiated, I do not believe that it points to a conspiracy (in terms of a conscious and co-ordinated attempt to deceive) on the part of climate researchers.