aTTP falsely attacks Bjorn Lomborg’s “Impact of Current Climate Proposals” Paper

The following is a comment to be posted at Bishop Hill, responding to another attempt by the blogger …and Then There's Physics to undermine the work of Bjorn Lomborg. The previous attempt was discussed here. This post includes a number of links, as well as a couple of illustrative screen captures at the foot of the post.

aTTP’s comment is

In fact, you should read Joe Romm’s post about this. He’s showing that the INDCs are likely to lead to around 3.5C which I think is relative to something like the 1860-1880 mean. This is very similar to the MIT’s 3.7, and quite a bit lower than the RCP8.5 of around 4.5C. So, yes, we all know that the INDCs are not going to do as much as some might like, but the impact is likely to be a good deal greater than that implied by Lomborg who has essentially assumed that we get to 2030 and then simply give up.

Nov 11, 2015 at 9:31 AM | …and Then There’s Physics

My Comment

aTTP at 9.31 refers to Joe Romm's blog post of Nov 3 "Misleading U.N. Report Confuses Media On Paris Climate Talks". Romm uses Climate Interactive's Climate Scoreboard Tool to show that the INDC submissions (if fully implemented) will result in 3.5°C of warming, as against the 4.5°C in the non-policy "No Action" scenario. This is about six times the maximum impact of 0.17°C claimed in Lomborg's new paper. Who is right? What struck me first was that Romm's first graph, copied straight from Climate Interactive, seems to have a very large estimate for emissions in the "No Action" scenario. Downloading the underlying data, I find the "No Action" global emissions in 2100 are 139.3 GtCO2e, compared with about 110 GtCO2e in Figure SPM5(a) of the AR5 Synthesis Report for the RCP8.5 high emissions scenario. But it is the breakdown per country or region that matters.

For the USA, emissions without action are forecast to rise by 40% from 2010 to 2030, in contrast to a rise of just 9% in the period 1990 to 2010. It is likely that, even without policy, emissions will fall and will be no higher in 2100 than in 2010. The "No Action" scenario therefore overestimates emissions by 2-3 GtCO2e in 2030 and about 7-8 GtCO2e in 2100.

For China the overestimation is even greater. Emissions will peak during the next decade as China fully industrializes, just as emissions peaked in most European countries in the 1970s and 1980s. Climate Interactive assumes that China's emissions will peak at 43 GtCO2e in 2090, whereas other estimates suggest the peak will be around 16-17 GtCO2e before 2030.

Together, the overestimations in the US and China "No Action" scenarios account for over half of the 55-60 GtCO2e difference in 2100 emissions between the "No Action" and "Current INDC" scenarios. A very old IT term applies here – GIGO. If aTTP had actually checked the underlying assumptions he would realise that Romm's rebuttal of Lomborg based on China's emission assumptions (and repeated on his own blog) is as false as claiming that the availability of free condoms is why population peaks.
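As a rough cross-check on the "over half" claim, here is a minimal sketch in Python. The midpoints are my own assumptions, taken from the ranges quoted above, and the China overestimate is read as the gap between the two peak estimates.

```python
# Rough check of the "over half" claim, using midpoints of the ranges quoted
# above (these midpoints and the treatment of China are my own assumptions).
us_overestimate_2100 = 7.5           # GtCO2e, midpoint of the 7-8 range
china_overestimate_2100 = 43 - 16.5  # Climate Interactive peak less the ~16-17 peak
gap_no_action_vs_indc = 57.5         # GtCO2e, midpoint of the 55-60 range

share = (us_overestimate_2100 + china_overestimate_2100) / gap_no_action_vs_indc
print(f"US + China overestimate: {us_overestimate_2100 + china_overestimate_2100:.1f} GtCO2e")
print(f"Share of the No Action vs Current INDC gap: {share:.0%}")  # roughly 59%
```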

Links posted at

Kevin Marshall


Figures referred to (but not referenced) in the comment above

Figure 1: Climate Interactive’s graph, referenced by Joe Romm.

Figure 2: Reproduction of Figure SPM5(a) from Page 9 of the AR5 Synthesis Report.


Update – posted the following to ATTP’s blog


Lomborg and the Grantham Institute on the INDC submissions

Bjorn Lomborg has a new paper published in the Global Policy journal, titled: Impact of Current Climate Proposals. (hattip Bishop Hill and WUWT)

From the Abstract

This article investigates the temperature reduction impact of major climate policy proposals implemented by 2030, using the standard MAGICC climate model. Even optimistically assuming that promised emission cuts are maintained throughout the century, the impacts are generally small. ………… All climate policies by the US, China, the EU and the rest of the world, implemented from the early 2000s to 2030 and sustained through the century will likely reduce global temperature rise about 0.17°C in 2100. These impact estimates are robust to different calibrations of climate sensitivity, carbon cycling and different climate scenarios. Current climate policy promises will do little to stabilize the climate and their impact will be undetectable for many decades.

That is pretty clear. COP21 in Paris is a waste of time.

An alternative estimate is provided in a paper by Boyd, Turner and Ward (BTW) of the LSE Grantham Institute, published at the end of October.

They state

The most optimistic estimate of global emissions in 2030 resulting from the INDCs is about halfway between hypothetical ‘business as usual’ and a pathway that is consistent with the 2°C limit

The MAGICC climate model used by both Lomborg and the IPCC predicts warming of about 4.7°C under BAU. If the INDC outcome is halfway between BAU and the 2°C pathway, that implies the INDCs reduce warming by up to 1.35°C, compared to the 0.17°C maximum calculated by Lomborg, about 8 times as much. Lomborg says his figure is contingent on no carbon leakage (exporting industry from policy to non-policy countries), whilst citing studies showing that leakage could offset 10-40%, or even over 100%, of the emissions reduction. So the difference between sceptic Lomborg and the mighty LSE Grantham Institute is even greater than 8 times. Yet Lomborg refers extensively to the August edition of BTW. So why the difference? There is no explicit indication in BTW of how they arrive at their halfway conclusion, nor any comparison by Lomborg.
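As a rough check on the arithmetic, here is a minimal sketch, assuming (as the BTW quote suggests for 2030 emissions) that the INDC outcome sits exactly halfway between the MAGICC BAU warming and the 2°C limit:

```python
# Minimal sketch of the arithmetic above; the "halfway" assumption is my own
# crude extension of the BTW 2030 emissions statement to century-end warming.
bau_warming = 4.7         # °C, MAGICC business-as-usual
two_degree_limit = 2.0    # °C
lomborg_impact = 0.17     # °C, Lomborg's maximum estimate

halfway_warming = (bau_warming + two_degree_limit) / 2    # 3.35 °C
implied_reduction = bau_warming - halfway_warming         # 1.35 °C
print(round(implied_reduction, 2), round(implied_reduction / lomborg_impact, 1))  # 1.35, ~7.9 times
```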

Two other estimates are from the UNFCCC, and Climate Action Tracker. Both estimate the INDCs will constrain warming to 2.7°C, or about 2.0°C below the MAGICC BAU scenario. They both make assumptions about massive reductions in emissions post 2030 that are not in the INDCs. But at least the UNFCCC and CAT have graphs that show the projection through to 2100. Not so with BTW.

This is where the eminent brain surgeons and Nobel-Prize winning rocket scientists among the readership will need to concentrate to achieve the penetrating analytical powers of a lesser climate scientist.

From the text of BTW, the hypothetical business as usual (BAU) scenario for 2030 is 68 GtCO2e. The most optimistic scenario for emissions from the INDCs (and the most pessimistic for economic growth in the emerging economies) is that 2030 emissions will be 52 GtCO2e. The sophisticated climate projection models have whispered in code to the climate scientists that, to be on target for the 2.0°C limit, 2030 emissions should be no more than 36 GtCO2e. The mathematicians will be able to determine that 52 is exactly halfway between 36 and 68.

Now for the really difficult bit. I have just spent the last half hour in the shed manically cranking the handle of my patent beancounter extrapolator machine to get this result. By extrapolating this halfway result for the forecast period 2010-2030 through to 2100 my extrapolator tells me the INDCs are halfway to reaching the 2.0°C maximum warming target.

As Bob Ward will no doubt point out in his forthcoming rebuttal of Bjorn Lomborg’s paper, it is only true climate scientists who can reach such levels of analysis and understanding.

I accept no liability for any injuries caused, whether physical or psychological, by people foolishly trying to replicate this advanced result. Please leave this to the experts.

But there is a serious side to this policy advocacy. The Grantham Institute, along with others, is utterly misrepresenting the effectiveness of policy to virtually every government on the planet. Lomborg shows by rigorous means that policy is ineffective even if loads of ridiculous assumptions are made, whether on climate science forecasting, policy theory, technological solutions, government priorities, or the ability of current governments to make policy commitments for governments decades ahead. My prediction is that the reaction of the Grantham Institute, along with plenty of others, will be a thuggish denunciation of Lomborg. What they will not allow is the rational response to wide differences of interpretation. That is to compare and contrast the arguments and the assumptions made, both explicit and implicit.

Kevin Marshall

WORLD RESOURCES INSTITUTE and Indonesian Emission Figures

In looking at the Indonesian INDC submission, I came across a confusing array of estimates for Indonesia’s total greenhouse gas emissions. These are the ones I found.

Estimates of Indonesia's Total Greenhouse Emissions in MtCO2e

Source             1990     2000     2005     2010
UNFCCC            1,101    1,444    2,829    1,908
EDGAR             1,165      622    1,171      745
WRI CAIT 2.0      1,026    1,372    1,584    1,928
WRI Blog              –    1,000    1,400    1,500
Indonesian Govt       –        –    1,800        –

In graph format the figures are:-

The Indonesian INDC Submission says it will unconditionally cut emissions by 29% from a BAU of 2881 MtCO2e. This means that in 2030 emissions will be about 100 MtCO2e lower than in 2005, not 1120 MtCO2e lower (UNFCCC) or 530 MtCO2e higher (EDGAR). But on the basis of the UNFCCC or EDGAR figures, emissions had already fallen by a third by 2010, so meeting the 2030 unconditional target should prove a doddle. Alternatively, use the World Resources Institute CAIT 2.0 data and Indonesia has unconditionally agreed to something much more drastic. Between 2005 and 2010 emissions grew at 4% a year. On that trend, the 2030 BAU becomes 4200 MtCO2e, not 2881 MtCO2e, so the unconditional emissions "cut" is not 29% but 51%.
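A minimal sketch of this extrapolation, using the WRI CAIT 2.0 figures from the table above (the 2881 MtCO2e BAU and the 29% cut come from the INDC itself):

```python
# Sketch of the CAIT-based extrapolation: project the 2005-2010 trend to 2030
# and compare the INDC's unconditional cap against that trend BAU.
cait_2005, cait_2010 = 1584, 1928
growth = (cait_2010 / cait_2005) ** (1 / 5) - 1       # ~4.0% per annum
bau_2030 = cait_2010 * (1 + growth) ** 20              # just over 4200 MtCO2e
indc_cap_2030 = 0.71 * 2881                            # the 29% cut on the INDC's own BAU
implied_cut = 1 - indc_cap_2030 / bau_2030
print(f"Trend growth {growth:.1%}, CAIT-based BAU 2030: {bau_2030:.0f} MtCO2e")
print(f"Implied cut against that BAU: {implied_cut:.1%}")  # roughly half, not 29%
```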

The worst example is contained in a graph about the Indonesian INDC Submission at the World Resources Institute Blog and reproduced below.

There are a number of things wrong with this graph, including

  • Scale is in KtCO2e, not MtCO2e.
  • Does not use WRI's own CAIT 2.0 data. This is despite WRI claiming it provides free access to comprehensive, reliable, and comparable greenhouse gas emissions data sets, as well as other climate-relevant indicators, to enable analysis on a wide range of climate-related data questions.
  • Nor is there any trace of Indonesia's claimed emissions of 1800 MtCO2e in 2005. So where does this wibbly-wobbly projection come from? The reference includes BAPPENAS 2015 – the Indonesian "National Development Planning Agency". A search finds this graph.

The figure for 2005 is about 1400 MtCO2e, not the 1800 MtCO2e stated in the INDC. The Indonesians have fiddled their own unaudited figures to get a politically desired result – an easily achievable "reduction" in GHG emissions. Even worse, the WRI does not check the data. There are minor clues: the Indonesian "dalam ribu ton" translates on Google as "in thousand tons", and anyone who knows climate data would realize that 1,000,000 MtCO2e is far greater than 49 GtCO2e, the UNIPCC's AR5 global estimate of GHG emissions in 2010.

Finally, the Carbon Brief, in a recent article, says that 1997 was a record year for forest fires – a record that may be broken in 2015. Already about 1600 MtCO2e has been emitted from forest fires this year. On this basis, 1997 total Indonesian emissions are likely to have been well in excess of 2000 MtCO2e, a considerable spike in the record.

The WRI CAIT 2.0 data shows only a minor spike in 1997. The narrower "GHG Emissions from Land-Use Change and Forestry" category was estimated at 904 MtCO2e, as against 1321 MtCO2e in 2006. This is nowhere near the emissions record implied by the Carbon Brief for 1997.

In summary, emissions figures for Indonesia are just arbitrary estimates, based on extremely limited and contradictory data. Both the WRI and the Indonesian Government cherry-pick data to suit their cause. Whether that is justified depends on the purpose. The WRI states its mission clearly.

That is to impose their environmentalist beliefs and perspectives on everybody else.

Indonesia’s INDC submission begins

This is, in my view, a far more rounded and focused mission. Against the environmentalist ideologies of the UNFCCC, I believe that in manipulating the figures Indonesia is serving the interests of 250 million Indonesians.

Kevin Marshall

Indonesia Outflanks the Climate Activists in its INDC Submission

I have spent a few weeks trying to make sense of the INDC submissions. One of the most impenetrable appeared to be that from Indonesia. This view is shared by The Carbon Brief.

Uncertain emissions

As well as being hazy on policy and financing needs, it is also difficult to gauge the ambition of Indonesia’s INDC emissions targets. This is despite the document including a projected figure for BAU emissions in 2030 of 2.9bn tonnes of CO2 equivalent (GtCO2e).

The pledge to reduce emissions by at least 29% compared to this trajectory means an effective cap in 2030 of 2GtCO2e. With the more ambitious 41% reduction compared to BAU, the cap would be 1.7GtCO2e.


Similarly the World Resources Institute states

(T)he current draft contribution still displays several important gaps in transparency and ambition, which must be addressed before submitting a final INDC to the United Nations Framework Convention on Climate Change (UNFCCC). By eliminating these gaps, the Indonesian government could bring its contribution into line with international best practices on transparency, demonstrate leadership internationally by enhancing ambition, and help ensure success at COP 21.

The context from Indonesia’s perspective is stated in the opening paragraph of Indonesia’s INDC Submission.

In more basic language, Indonesia has more important and immediate priorities than “climate change“. From a national point of view, imposing drastic and ineffective policies will go against the Indonesian Government’s perceived duty to its people. This will happen regardless of the truth of the projected catastrophes that await the planet without global mitigation. The policies will be ineffective because most other emerging economies have similar priorities to Indonesia, and are taking similar measures of policy avoidance. In the case of Indonesia these are

  • Cherry-picking a base year.
  • Making reductions relative to a fictional “Business as Usual” scenario with inflated economic growth figures.
  • Making sure that even the most ambitious objectives are achievable within the range of an objective forecast.
  • Focusing the negotiations on achieving the conditional objectives, subject to outside assistance. Any failure to reach agreement then becomes the fault of rich countries failing to provide the finance.
  • Allowing some room to make last-minute concessions not in the original submission, contingent on further unspecified outside assistance that is so vast the money will never be forthcoming.

The calculations to achieve the figures in the submissions are fairly simple to work out with a bit of patience.


Calculating the 2030 Business as Usual 2881 MtCO2e

The Indonesian INDC submission states that in 2005 total emissions were 1800 MtCO2e and that combustion of fossil fuels made up 19% of this total. That implies about 342 MtCO2e from the combustion of fossil fuels. The Carbon Dioxide Information Analysis Center (CDIAC1) has an estimated figure of 341.71 MtCO2e, and the UNFCCC Country Brief figure for 2005 "CO2 emissions from fuel combustion" is 335.71 MtCO2e. For 2011[2] the CDIAC estimate is 472.53 MtCO2e, rounded to 473. Let us now assume a growth rate in emissions of 6.0% per annum from 2012 to 2030, against an economic growth rate of around 5.2% from 2000 to 2010 and 5.8% from 2005 to 2010[3]. At 6.0% compound growth, fossil fuel emissions in 2030[4] will be 1431 MtCO2e.

The non-fossil fuel emissions are a bit more problematic to work out. The 2005 baseline estimate is 81% of 1800[5], which is 1458. It is only a vague estimate, so round it down to 1450 and then assume it remains constant in the Business as Usual (BAU) scenario.

The BAU 2030 total emissions forecast for Indonesia is therefore 1431 + 1450 = 2881 MtCO2e.
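A minimal sketch of this projection, assuming (as footnote 4 indicates) that each year's figure is rounded to the nearest whole MtCO2e before the next year's growth is applied:

```python
# Sketch of the BAU fossil fuel projection: 473 MtCO2e in 2011, grown at 6.0%
# per annum to 2030, rounding each year's figure to the nearest whole MtCO2e.
def project(start, rate, years):
    value = start
    for _ in range(years):
        value = round(value * (1 + rate))
    return value

fossil_2030 = project(473, 0.060, 2030 - 2011)    # -> 1431 MtCO2e
non_fossil_bau = 1450                              # assumed constant under BAU
print(fossil_2030, fossil_2030 + non_fossil_bau)   # 1431, 2881 MtCO2e
```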

There might be other ways to derive this figure, but none are simpler, nor do the figures fall out so exactly.


How does Indonesia achieve the unconditional 29% reduction against BAU?

The easiest part to achieve is outside of fossil fuel emissions. The major cause of these emissions is the reduction of the rainforests. The Carbon Brief claims the biggest source of non-fossil fuel emissions is illegal forest clearance to grow palm oil. Although in 2015 the forest fires are closing in on the record set in 1997, it is safe to say that these emissions will reduce considerably in the coming years, as Indonesia already has 52% of world palm oil production. By assuming a 3.34% reduction per annum in these emissions from 2005, they will fall from 1450 MtCO2e to 611 MtCO2e in 2030. Total emissions of 2042 MtCO2e (1431+611) are 29.1% lower than BAU, without any expense on the part of the Indonesian Government.


How does Indonesia achieve the conditional 41% reduction against BAU?

Indonesia claims that it needs international cooperation to increase the reduction against BAU to 41%. In whole numbers, if BAU is 2881 then a 41% reduction gives 1700. Not 1699 or 1701, but 1700. This is 100 less than the estimated 1800 MtCO2e total GHG emissions for 2005. It will be achieved without any "international cooperation", a euphemism for foreign aid. The reason is simple. From the UNFCCC Country Brief for Indonesia, average GDP growth from 1990 to 2012 was 4.9% per annum and growth in CO2 emissions from fuel combustion was 5.1% per annum. Normally GDP growth exceeds emissions growth. As a country develops this gap widens, until emissions growth ceases altogether and emissions even fall slightly. In India, GDP growth from 1990 to 2012 averaged 6.5% and emissions growth 5.7%. In China the respective figures are 10.3% and 6.1%. In China, emissions will peak around 2025 to 2030 without any policy change. It is therefore reasonable to assume that forecast fossil fuel emissions growth will be at a lower rate than the forecast GDP growth of 6.0%. A conservative estimate is that the fossil fuel emissions growth rate will be 25% lower than the GDP growth rate from 2011 to 2030, at 4.50%. Rounding as before4 gives forecast emissions of 1089 MtCO2e as against a BAU of 1431.

The revised 2030 total emissions forecast for Indonesia is 1089 + 611 = 1700 MtCO2e. This is a 41.0% reduction on the BAU of 2881 MtCO2e.
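The same rounding extrapolation, with the growth rate lowered to 4.50%, reproduces the conditional target:

```python
# Reusing the rounding extrapolation above with a 4.50% growth rate for fossil
# fuel emissions (25% below the assumed 6.0% GDP growth rate).
def project(start, rate, years):
    value = start
    for _ in range(years):
        value = round(value * (1 + rate))
    return value

fossil_2030 = project(473, 0.045, 2030 - 2011)      # -> 1089 MtCO2e
total_2030 = fossil_2030 + 611                       # non-fossil emissions as before
print(total_2030, round(1 - total_2030 / 2881, 3))   # 1700 MtCO2e, 0.41 (41.0% below BAU)
```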


Why should Indonesia engage in such a cynical manipulation of the numbers?

Indonesia is caught between a rock and a hard place. The stated major priorities for this country of 250 million people are at odds with doing its bit to save the world. In this Indonesia is not alone. India, China and Vietnam are other major emerging nations who cite other priorities. Ranged against them are the activist scientists behind the climate scare, who hold as a priori truth the prophesied global warming catastrophes that await the planet if we do not amend our wicked ways. Further, mitigation policies are good for the soul, regardless of their effectiveness, and the practice of these policies will lead others to the enlightenment they have found. They will not recognize that any alternative points of view exist, whether moral, political or scientific. Rather than argue, the best policy is to outflank them. The activists will accept official policy objectives without question so long as they appear to fit the cause. So the Indonesians gave them massive cuts related to fictitious projected figures, cloaked in the language of climate speak to throw them off the scent. They should be applauded for protecting 250 million people, rather than inflicting ineffective burdens upon them. The real shame is that the leaders of the so-called developed economies have fallen for this rubbish.

Kevin Marshall


  1. Reference of the full global carbon budget 2014: C. Le Quéré, R. Moriarty, R. M. Andrew, G. P. Peters, P. Ciais, P. Friedlingstein, S. D. Jones, S. Sitch, P. Tans, A. Arneth, T. A. Boden, L. Bopp, Y. Bozec, J. G. Canadell, F. Chevallier, C. E. Cosca, I. Harris, M. Hoppema, R. A. Houghton, J. I. House, A. K. Jain, T. Johannessen, E. Kato, R. F. Keeling, V. Kitidis, K. Klein Goldewijk, C. Koven, C. S. Landa, P. Landschützer, A. Lenton, I. D. Lima, G. H. Marland, J. T. Mathis, N. Metzl, Y. Nojiri, A. Olsen, T. Ono, W. Peters, B. Pfeil, B. Poulter, M. R. Raupach, P. Regnier, C. Rödenbeck, S. Saito, J. E. Sailsbury, U. Schuster, J. Schwinger, R. Séférian, J. Segschneider, T. Steinhoff, B. D. Stocker, A. J. Sutton, T. Takahashi, B. Tilbrook, G. R. van der Werf, N. Viovy, Y.-P. Wang, R. Wanninkhof, A. Wiltshire, and N. Zeng 2014. Global Carbon Budget 2014. Earth System Science Data Discussions, doi:10.5194/essdd-7-521-2014
  2. 2011 is the baseline year for the IPCC reports.
  3. This can be obtained from two sources. First the INDC submission notes that “GDP Growth Rate has slowed between 2010-2015 from 6.2-6.5% per annum to only 4.0% per annum (first quarter of 2015).” A return to the higher levels of growth is an assumption of successful government policy.
  4. Each year's emissions figure, after applying the growth rate of 6.0%, is rounded to the nearest whole number.
  5. The 2005 total emissions estimate of 1800 MtCO2e is at odds with other estimates. The WRI CAIT 2.0 figure is 1600; the EDGAR estimate is 1171; and the UNFCCC estimate is 2828. There might be another method of estimation. Maybe it is being a bit too cynical to assume that someone could have taken the average of the three (1866) and rounded down.

Plans to Increase Global Emissions at COP21 Paris


It is a necessary, but far from sufficient, condition for cutting global greenhouse gas emissions that any increases in emissions in some parts of the world are offset by emissions cuts elsewhere. The proposed emissions targets for 2010 to 2030 contained in the INDC submissions for COP21 in Paris suggest the opposite will be the case. For every tonne of emissions reductions in 32 leading developed countries there will be at least three tonnes of emissions increases in 7 major developing countries. The net effect of these targets being achieved by these countries (which combined make up both 60% of global emissions and 60% of global population) will be to make global emissions 20% higher in 2030 than in 2010. Using UNIPCC AR5 projections, unless there are large and rapid cuts in global greenhouse emissions post 2030, any agreement based on those submissions will not save the world from two degrees of dangerous global warming, and will likely not save the world from three degrees of warming. This leads to a policy problem. Emissions reduction policies will only reduce a small part of the harms of climate change. So even if the more extreme claims of climate catastrophism are true, it might be more beneficial for a nation to avoid emissions reduction policies.


The following analysis makes these assumptions.

  • UNIPCC estimates of the relationship between global average temperature and atmospheric greenhouse gas levels are accurate.
  • UNIPCC estimates of the relationship between greenhouse gas emissions and atmospheric greenhouse gas levels are accurate.
  • Policy commitments will always turn into concrete policy.
  • Climate change policy priorities will not conflict with other priorities.
  • All policy will be effectively implemented in full, implying the requisite technological and project management capacities are available.

The Context

The world's leaders meet in Paris from 30 November to 11 December to thrash out a plan to save the world from a dangerous two degrees of warming. In preparation, 146 countries, representing 87% of global emissions, have submitted plans to the United Nations Framework Convention on Climate Change (UNFCCC). These are available at the submissions website here. No-one appears to have gone through these submissions to evaluate whether they are consistent with this objective. I have chosen a small sample of 7 major developing nations and 32 developed nations (the EU 28 have a single target), which combined represent about 60% of global emissions and 60% of global population.

The level of global emissions control required to constrain global warming is given by the IPCC in their final version of the 2014 AR5 Synthesis Report page 21 Figure SPM 11(a) and reproduced below.

The dark blue band is the maximum emissions pathway to avoid going beyond 2 degrees of warming, with RCP2.6 denoting the central pathway. The dark orange pathway would produce 2.5-3.0 degrees of warming. According to figure SPM 5(a), annual GHG emissions in 2010 were 49 GtCO2e. They are currently increasing by at least 2% a year. The extrapolated projection for 2030 is 70-75 GtCO2e, roughly following the solid black line of the RCP8.5 BAU (non-policy) scenario. In 2015 emissions will be about 54 GtCO2e. The minimum requirement for policy is that global emissions should be no higher than they were in 2010, and preferably below that level, to offset the cumulative overshoot that will occur.
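As a quick check of that extrapolation, a minimal sketch assuming a constant 2% per annum growth from the 2010 figure:

```python
# Rough check of the extrapolation above: 49 GtCO2e in 2010 growing at 2% a year.
base_2010 = 49.0   # GtCO2e, from AR5 figure SPM 5(a)
rate = 0.02
print(round(base_2010 * (1 + rate) ** 5, 1))    # 2015: ~54 GtCO2e
print(round(base_2010 * (1 + rate) ** 20, 1))   # 2030: ~73 GtCO2e, within the 70-75 range
```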

How does the global policy requirement fit in with the country submissions?

If the IPCC projections are correct, to avoid 2 degrees of warming being exceeded there needs to be a global cap on greenhouse gas emissions of around 50 GtCO2e almost immediately, and for that level to start falling in the early 2020s. Alternatively, if global emissions reach 60 GtCO2e without any prospect of major reductions thereafter, then from the model projections three degrees of warming is likely to be exceeded. There is a large gap between these two scenarios, but even with submissions from a limited number of the major countries it is possible to state that the lower limit will be exceeded. This can be done by calculating the emissions increases in the major high-growth developing countries and the proposed emissions reductions in the major developed countries. This is not straightforward, as in most country submissions there are no clear figures, so various assumptions need to be made. For developing countries this is particularly difficult, as the estimated business as usual (BAU) emissions are usually not stated and are dependent upon assumptions of economic growth, though sometimes there are clues within the text. For the developed countries the projections are easier to calculate, as they are relative to a date in the past. There is a further issue of which measure of emissions to use. I have used the UNFCCC estimates of GHG emissions in its Country Briefs for 1990, 2000, 2005 & 2010.1 In many of the submissions there are both conditional and unconditional estimates of 2030 emissions. For developing countries the lower estimates are dependent on external funding. For the developed countries, emissions reductions are expressed as a range. In every case I have used the lower emissions figure.2

For the developing countries, those with major projected emissions increases are as follows.3

Estimated targeted emissions increases from 2010 to 2030 for major developing countries based on INDC Submissions

[Table not reproduced: for each of the seven countries it showed the estimated emissions change between 2010 and 2030, alongside the figures from the INDC Submission and the UNFCCC Country Brief.]
The targeted total increase in GHG emissions for these seven countries between 2010 and 2030 is estimated to be in excess of 13 Gt.

According to World Bank Data there were 3300 million people in these seven countries in 2013, or 46% of the global population.

For the developed countries, those with the largest quantitative emissions reductions are as follows.4

Estimated targeted emissions change from 2010 to 2030 for major developed countries from INDC Submissions

[Table not reproduced: for each of the thirty-two countries it showed the estimated emissions change between 2010 and 2030, alongside the figures from the INDC Submission and the UNFCCC Country Brief.]
The targeted total decrease in GHG emissions for these thirty-two countries between 2010 and 2030 is estimated to be 4 Gt.

According to World Bank Data there were 900 million people in these thirty-two countries in 2013, or 13% of the global population.

For every one tonne of emissions reductions by the developed countries, there will be at least three tonnes of emissions increases elsewhere. Bigger reductions by these developed countries will not close the gap, as their total 2010 emissions are just 12.9 Gt. The sample of developing countries does not include a single African country, nor Pakistan, Iran, Venezuela, or numerous other countries; yet the sample does include all the major developed countries.
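A minimal sketch of the headline arithmetic, using the rounded totals above and assuming, purely for illustration, that emissions in the rest of the world stay flat:

```python
# Sketch of the headline arithmetic using the rounded totals above.
developing_increase = 13.0   # Gt, seven major developing countries, 2010-2030
developed_decrease = 4.0     # Gt, thirty-two developed countries, 2010-2030
global_2010 = 49.0           # GtCO2e, AR5 figure SPM 5(a)

print(developing_increase / developed_decrease)        # 3.25 tonnes added per tonne cut
net_change = developing_increase - developed_decrease  # ~9 Gt net increase
# If emissions elsewhere stayed flat, global emissions would be ~18-20% higher in 2030.
print(round(net_change / global_2010, 2))
```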

Whilst the developing countries may not achieve this increase in emissions by 2030, collectively they will achieve it shortly after that date. Many of the developed countries may not achieve their emissions reductions due to changing priorities. For instance, the EU's targeted reduction may not be achieved due to Germany abandoning nuclear power in favour of coal, and Southern European states reducing renewables subsidies in response to recent economic crises.

The Elephant in the Room

In 2030, even with an agreement based on the INDC submissions signed this December in Paris and then fully implemented without compromise, there is still a problem. If the IPCC models are correct, the only way to stop 3 degrees of warming being exceeded is through rapid reductions in emissions in those countries where emissions have recently peaked (e.g. South Korea and China), along with steep reductions in emissions in countries where they are still increasing rapidly (e.g. India and Bangladesh). Unless a technological miracle happens in the next decade this is not going to happen. More likely, global emissions will keep on rising as many slower-growing African and Asian nations have ever larger unit increases in emissions each year.

The Policy Problem

The justification for mitigation policy is most clearly laid out in the British 2006 Stern Review Summary of Conclusions page vi

Using the results from formal economic models, the Review estimates that if we don’t act, the overall costs and risks of climate change will be equivalent to losing at least 5% of global GDP each year, now and forever. If a wider range of risks and impacts is taken into account, the estimates of damage could rise to 20% of GDP or more.

That is, the unknown and random costs of climate change can be exchanged for the lesser and predictable costs of policy. A necessary, but far from sufficient, condition of this happening is that policy eradicates all the prospective costs of climate change. It could be that if warming is constrained to less than 2 degrees the costs of climate change would be trivial, so the reality could be a close approximation of Stern's viewpoint. But if warming exceeds 3 degrees and the alleged harms are correct, then emissions-reducing policies are likely to lead to net harms for the countries implementing those policies and a small net benefit for those countries without policy.

Kevin Marshall


  1. The exception is for Bangladesh. They are one of the few countries that clearly lays out 2030 estimates in MtCO2, but the 2010 estimate is about 20% lower than the UNFCCC figure. I have just lifted the Bangladeshi figures.
  2. For instance, the USA's target is to reduce its emissions by 26-28% on the 2005 level. I have used the 28% figure. The United States is about the only country not providing target figures for 2030. It would be imprudent to assume any greater reductions, given that it is not certain even this level will be ratified by Congress.
  3. Not all the countries outside of the rich are targeting emissions increases. Brazil and Argentina are targeting emissions reductions, whilst Thailand and South Korea appear to be aiming to maintain emissions at around 2010 levels.
  4. Not all developed countries have emissions reduction targets.
  5. South Korea, with 1.3% of 2010 global emissions, could be included in the developed countries, but its target is to roughly maintain emissions at 2010 levels. Switzerland, Norway and Singapore are all committed to emissions reductions, but combined they have less than 0.3 Gt of emissions.

A note on Bias in Australian Temperature Homogenisations

Jo Nova has an interesting and detailed guest post by Bob Fernley-Jones on rural sites in Australia that have been heavily homogenised by the Australian BOM.

I did a quick comment that was somewhat lacking in clarity. This post is to clarify my points.

In the post Bob Fernley-Jones stated

The focus of this study has been on rural stations having long records, mainly because the BoM homogenisation process has greatest relevance the older the data is.

Venema et al. 2012 stated (Italics mine)

The most commonly used method to detect and remove the effects of artificial changes is the relative homogenization approach, which assumes that nearby stations are exposed to almost the same climate signal and that thus the differences between nearby stations can be utilized to detect inhomogeneities (Conrad and Pollak, 1950). In relative homogeneity testing, a candidate time series is compared to multiple surrounding stations either in a pairwise fashion or to a single composite reference time series computed for multiple nearby stations.

This assumption that nearby temperature stations are exposed to the same climate signal is standard practice. Victor Venema (who has his own blog) is a leading academic expert on temperature homogenisation. However, there are extreme examples where the assumption does not hold. One example is at the end of the 1960s in much of Paraguay, where average temperatures fell by one degree. As this fall was not replicated in the surrounding area, both the GISTEMP and Berkeley Earth homogenisations eliminated the anomaly, despite using very different homogenisation techniques. My analysis is here.

On a wider scale take a look at the GISTEMP land surface temperature anomaly map for 2014 against 1976-2010. (obtained from here)

Despite being homogenised and smoothed, it is clear that trends differ. Over much of North America there was cooling, bucking the global trend. What this suggests to me is that the greater the distance between weather stations, the greater the likelihood that the climate signals will be different. Most importantly for temperature anomaly calculations, over the twentieth century the number of weather stations increased dramatically. So homogenisation is more likely to end up smoothing out local and sub-regional variations in temperature trends in the early twentieth century than in the later period. This is testable.

Why should this problem occur with expert scientists? Are they super beings who know the real temperature data, but have manufactured some falsehood? I think it is something much more prosaic. Those who work at the Australian BOM believe that the recent warming is human caused. In fact they believe that more than 100% of warming is human caused. When looking at outlier data records, or records that show inconsistencies there is a very human bias. Each time the data is reprocessed they find new inconsistencies, having previously corrected the data.

Kevin Marshall

Degenerating Climatology 1: IPCC Statements on Human Caused Warming

This is the first in an occasional series illustrating the degeneration of climatology away from an empirical science. In my view, for climatology to be progressing it needs to be making ever clearer empirical statements that support the Catastrophic Anthropogenic Global Warming (CAGW) hypothesis, and moving away from bland statements that can just as easily support a weaker form of the hypothesis, or random fluctuations. In figure 1 this progression is illustrated by the red arrow, with increasing depth of colour. The example given below is an illustration of the opposite tendency.

Obscuring the slowdown in warming in AR5

Every major temperature data set shows that the warming rate this century has been lower than that towards the end of the twentieth century. This is becoming a serious issue for those who believe that the main driver of warming is increasing atmospheric greenhouse gas levels. It gave the IPCC a severe problem in trying to find evidence for the theory when they published in late 2013.

In the IPCC Fifth Assessment Report Working Group 1 (The Physical Science Basis) Summary for Policy Makers, the headline summary on the atmosphere is:-

Each of the last three decades has been successively warmer at the Earth’s surface than any preceding decade since 1850. In the Northern Hemisphere, 1983–2012 was likely the warmest 30-year period of the last 1400 years (medium confidence).

There are three parts to this.

  • The last three decades have been successively warmer according to the major surface temperature data sets. The 1980s were warmer than the 1970s; the 1990s warmer than the 1980s; and the 2000s warmer than the 1990s.
  • The 1980s was warmer than any preceding decade from the 1850s.
  • In the collective opinion of the climate experts there is greater than a 66% chance that the 1980s was the warmest decade in 1400 years.

What the statement does not include is the following.

  1. That global average temperature rises have slowed down in the last decade compared with the 1990s. From 2003, in the HADCRUT4 temperature series, warming had stopped.
  2. That global average temperature also rose significantly in the mid-nineteenth and early twentieth centuries.
  3. That global average temperature fell in 4 or 5 of the 13 decades from 1880 to 2010.
  4. That in the last 1400 years there was a warm period about 1000 years ago and a significantly cold period that could have bottomed out around 1820. That is, a Medieval Warm Period and the Little Ice Age.
  5. That there is strong evidence of a Roman Warm Period about 2000 years ago and a Bronze Age warm period about 3000 years ago.

Points (i) to (iii) can be confirmed from figure 2. Both of the major surface temperature anomaly data sets show warming trends in each of the last three decades, implying successive warming. A similar statement could have been made in 1943 if the data had been available.

In so far as the CAGW hypothesis is broadly defined as a non-trivial human-caused rise in temperatures (the narrower, more precise definition being that the temperature change has catastrophic consequences), there is no empirical support to be found in the actual temperature records, or in the longer reconstructions from proxy data.

The headline statement above is amplified by the key statement from the press release of 27/09/2013.

It is extremely likely that human influence has been the dominant cause of the observed warming since the mid-20th century. The evidence for this has grown, thanks to more and better observations, an improved understanding of the climate system response and improved climate models.

This statement excludes other types of temperature change, let alone other causes of temperature change. The cooling in the 1960s is not included. The observed temperature change is only the net impact of all influences, known or unknown. Further, the likelihood is based upon expert opinion. If the experts have always given prominence to human influences on warming (as opposed to natural and random influences) then their opinion will be biased. Over time, if this opinion is not objectively adjusted in the light of evidence that does not conform to the theory, the basis of Bayesian statistics is undermined.
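To illustrate the Bayesian point, here is a minimal sketch with made-up numbers, purely for illustration:

```python
# Illustrative Bayes update with invented numbers: a strong prior belief that
# warming is dominantly human-caused should weaken if the evidence (e.g. a
# slowdown) is less likely under that hypothesis than under the alternatives.
prior = 0.9                   # strong prior belief in the hypothesis
p_evidence_if_true = 0.3      # evidence fairly unlikely if the hypothesis holds
p_evidence_if_false = 0.6     # more likely under the alternatives

posterior = (prior * p_evidence_if_true) / (
    prior * p_evidence_if_true + (1 - prior) * p_evidence_if_false
)
print(round(posterior, 2))    # 0.82: the belief should be revised down, not held fixed
```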

Does the above mean that climatology is degenerating away from a rigorous scientific discipline? So far I have only looked at the latest expert statements, without comparing them with previous ones. A comparable highlighted statement on human influence from the Fourth Assessment Report WG1 SPM (page 3) is

The understanding of anthropogenic warming and cooling influences on climate has improved since the TAR, leading to very high confidence that the global average net effect of human activities since 1750 has been one of warming, with a radiative forcing of +1.6 [+0.6 to +2.4] W m–2

The differences are

  • The greenhouse gas effect is no longer emphasised. It is now the broader “human influence”.
  • The previous statement was prepared to associate the influence with a much longer period. Probably the collapse of hockey stick studies, with their portrayal of unprecedented warming, has something to do with this.
  • Conversely, the earlier statement is only prepared to say that since 1750 the net effect of human influences has been one of warming. The more recent statement claims that human influence has been the dominant cause of warming since the mid-20th century.

This leads to my final point indicating the degeneration of climatology away from science. When comparing the WG1 SPMs for TAR, AR4 and AR5 there are shifting statements. In each report the authors have chosen the statements that best fit their case at that point in time. The result is a lack of the continuity that might demonstrate an increasing correspondence between theory and data.

Kevin Marshall

John Cook undermining democracy through misinformation

It seems that John Cook was posting comments in 2011 under the pseudonym Lubos Motl. The year before, physicist and blogger Luboš Motl had posted a rebuttal of Cook's then 104 Global Warming & Climate Change Myths. When someone counters your beliefs point for point, it is natural to feel some anger. Taking the online identity of Motl is potentially more than identity theft. It can be viewed as an attempt to damage the reputation of someone you oppose.

However, there is a wider issue here. In 2011 John Cook co-authored with Stephan Lewandowsky The Debunking Handbook, which is still featured prominently on his Skeptical Science website. This short tract starts with the following paragraphs:-

It’s self-evident that democratic societies should base their decisions on accurate information. On many issues, however, misinformation can become entrenched in parts of the community, particularly when vested interests are involved. Reducing the influence of misinformation is a difficult and complex challenge.

A common misconception about myths is the notion that removing its influence is as simple as packing more information into people’s heads. This approach assumes that public misperceptions are due to a lack of knowledge and that the solution is more information – in science communication, it’s known as the “information deficit model”. But that model is wrong: people don’t process information as simply as a hard drive downloading data.

If Cook was indeed using the pseudonym Lubos Motl then he was knowingly putting misinformation into the public arena in a malicious form. If he misrepresented Motl's beliefs, then the public may not know who to trust. Targeted against one effective critic, it could trash that critic's reputation. At a wider scale it could allow morally and scientifically inferior views to gain prominence over superior viewpoints. If the alarmist beliefs were superior, it would not be necessary to misrepresent alternative opinions. Open debate would soon reveal which side had the better views. But in debating and disputing, all sides would sharpen their arguments. What would quickly disappear is the reliance on opinion surveys and the rewriting of dictionaries. Instead, proper academics would be distinguishing quality, relevant evidence from dogmatic statements based on junk sociology and psychology. They would start defining the boundaries of expertise between the basic physics, computer modelling, results analysis, public policy-making, policy implementation, economics, ethics and the philosophy of science. They might then start to draw on the understanding that has been achieved in these subject areas.

Kevin Marshall

Climatic Temperature Variations

In the previous post I identified that the standard definition of temperature homogenisation assumes there is little or no variation in climatic trends within the homogenisation area. I also highlighted specific instances where this assumption has failed. However, those examples may be just isolated and extreme instances, or there might be other, offsetting instances, so the failures could cancel each other out without a systematic bias globally. Here I explore why this assumption should not be expected to hold anywhere, and how it may have biased the picture of recent warming. After a couple of proposals to test for this bias, I look at alternative scenarios that could bias the global average temperature anomalies. I concentrate on the land surface temperatures, though my comments may also have application to the sea surface temperature data sets.


Comparing Two Recent Warming Phases

An area that I am particularly interested in is the relative size of the early twentieth century warming compared to the more recent warming phase. This relative size, along with the explanations for those warming periods, gives a route into determining how much of the recent warming was human caused. Dana Nuccitelli tried such an explanation at the skepticalscience blog in 2011[1]. Figure 1 shows the NASA Gistemp global anomaly in black, along with a split by eight bands of latitude. Of note are the polar extremes, each covering 5% of the surface area. For the Arctic, the trough to peak of 1885-1940 is pretty much the same as the trough to peak from 1965 to the present. But in the earlier period it is effectively cancelled out by the cooling in the Antarctic. This cooling, I found, was likely caused by the use of inappropriate proxy data from a single weather station3.

Figure 1. Gistemp global temperature anomalies by band of latitude2.

For the current issue, of particular note is the huge variation in trends by latitude from the global average derived from the homogenised land and sea surface data. Delving further, GISS provide some very useful maps of their homogenised and extrapolated data4. I compare two identical time lengths – 1944 against 1906-1940 and 2014 against 1976-2010. The selection criteria for the maps are in figure 2.

Figure 2. Selection criteria for the Gistemp maps.

Figure 3. Gistemp map representing the early twentieth surface warming phase for land data only.

Figure 4. Gistemp map representing the recent surface warming phase for land data only.

The later warming phase is almost twice the magnitude of, and has much better coverage than, the earlier warming: 0.43°C against 0.24°C. In both cases the range of warming in the 250km grid cells is between -2°C and +4°C, but the variations are not the same. For instance, the most extreme warming in both periods is at the higher latitudes. But, with respect to North America, in the earlier period the most extreme warming is over the Northwest Territories of Canada, whilst in the later period the most extreme warming is over Western Alaska, with the Northwest Territories showing near-average warming. In the United States, in the earlier period there is cooling over Western USA, whilst in the later period there is cooling over much of Central USA, and strong warming in California. In the USA, the coverage of temperature stations is quite good, at least compared with much of the Southern Hemisphere. Euan Mearns has looked at a number of areas in the Southern Hemisphere4, which he summarised on the map in Figure 5.

Figure 5. Euan Mearns says of the above: "S Hemisphere map showing the distribution of areas sampled. These have in general been chosen to avoid large centres of human population and prosperity."

For the current analysis Figure 6 is most relevant.

Figure 6. Euan Mearns says of the above: "The distribution of operational stations from the group of 174 selected stations."

The temperature data for the earlier period is much sparser than for the later period. Even where data is available in the earlier period, it could be based on a fifth of the number of temperature stations used in the later period. This may slightly exaggerate the issue, as the coasts of South America and Eastern Australia are avoided.

An Hypothesis on the Homogenisation Impact

Now consider again the description of homogenisation in Venema et al 2012[5], quoted in the previous post.


The most commonly used method to detect and remove the effects of artificial changes is the relative homogenization approach, which assumes that nearby stations are exposed to almost the same climate signal and that thus the differences between nearby stations can be utilized to detect inhomogeneities. In relative homogeneity testing, a candidate time series is compared to multiple surrounding stations either in a pairwise fashion or to a single composite reference time series computed for multiple nearby stations. (Italics mine)


The assumption of the same climate signal over the homogenisation area will not apply where the temperature stations are thin on the ground. The degree to which homogenisation eliminates real-world variations in trend could be, to some extent, inversely related to station density. Given that the density of temperature data points diminishes rapidly in most areas of the world when one goes back in time beyond 1960, homogenisation in the early warming period is far more likely to be between climatically different temperature stations than in the later period. My hypothesis is that, relatively, homogenisation will reduce the early twentieth century warming phase compared with the recent warming phase, as in the earlier period homogenisation will be over much larger areas with larger real climate variations within the homogenisation area.
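To illustrate the hypothesis, here is a toy sketch of my own (not any agency's actual algorithm): if a candidate station with a genuinely different trend is adjusted towards a distant reference series, part of its real trend is smoothed away.

```python
# Toy illustration: adjusting a candidate station towards a climatically
# different composite reference partly removes the candidate's real trend.
import numpy as np

years = np.arange(1900, 1941)
candidate = 0.020 * (years - 1900)     # real local trend: 0.20 °C per decade
reference = 0.005 * (years - 1900)     # distant composite: 0.05 °C per decade

# Crude "adjustment": pull the candidate halfway towards the reference, as can
# happen when the difference between them is treated as an inhomogeneity.
adjusted = candidate - 0.5 * (candidate - reference)

def trend(series):
    return np.polyfit(years, series, 1)[0] * 10   # °C per decade

print(round(trend(candidate), 3), round(trend(adjusted), 3))   # 0.2 vs 0.125
```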

Testing the Hypothesis

There are at least two ways that my hypothesis can be evaluated. Direct testing of information deficits is not possible.

First is to conduct temperature homogenisations on similar densities of actual data for the entire twentieth century. If done for a region, the actual data used in the global temperature anomalies should be run for that region as well. This should show whether the recent warming phase, post homogenisation, is reduced when less data is used.

Second is to examine how the size of adjustments relates to the availability of comparative data. This can be done in various ways. For instance, I quite like the examination of the Manaus grid block record that Roger Andrews did in his post The Worst of BEST6.

Counter Hypotheses

There are two counter hypotheses on temperature bias. These may undermine my own hypothesis.

First is the urbanisation bias. Euan Mearns, in looking at temperature data for the Southern Hemisphere, tried to avoid centres of population due to the data being biased. It is easy to surmise that the lack of warming Mearns found in central Australia7 reflects the absence of the urbanisation bias present in the large cities on the coast. However, the GISS maps do not support this. Ronan and Michael Connolly8 of Global Warming Solved claim that the urbanisation bias in the global temperature data is roughly equivalent to the entire warming of the recent epoch. I am not sure that the urbanisation bias is so large but, even if it were, it could be complementary to my hypothesis based on trends.

Second is that homogenisation adjustments could be greater the more distant in the past they occur. It has been noted (by Steve Goddard in particular) that each new set of GISS adjustments alters past data. The same data set used to test my hypothesis above could also be used to test this one, by conducting homogenisation runs on the data to date, then only to 2000, then to 1990, and so on. It could be that the earlier warming trend is somehow suppressed by homogenizing the most recent data first, then working backwards through a number of iterations, each one using the results of the previous pass. For trends that operate over different time periods but converge over longer periods, this could magnify the divergence and thus cause differences in trends decades in the past to be magnified, as such differences in trend appear to the algorithm to be more anomalous than they actually are.

Kevin Marshall


  1. Dana Nuccitelli – What caused early 20th Century warming? 24.03.2011
  2. Source
  3. See my post Base Orcadas as a Proxy for early Twentieth Century Antarctic Temperature Trends 24.05.2015
  4. Euan Mearns – The Hunt For Global Warming: Southern Hemisphere Summary 14.03.2015. Area studies are referenced on this post.
  5. Venema et al 2012 – Venema, V. K. C., Mestre, O., Aguilar, E., Auer, I., Guijarro, J. A., Domonkos, P., Vertacnik, G., Szentimrey, T., Stepanek, P., Zahradnicek, P., Viarre, J., Müller-Westermeier, G., Lakatos, M., Williams, C. N., Menne, M. J., Lindau, R., Rasol, D., Rustemeier, E., Kolokythas, K., Marinova, T., Andresen, L., Acquaotta, F., Fratianni, S., Cheval, S., Klancar, M., Brunetti, M., Gruber, C., Prohom Duran, M., Likso, T., Esteban, P., and Brandsma, T.: Benchmarking homogenization algorithms for monthly data, Clim. Past, 8, 89-115, doi:10.5194/cp-8-89-2012, 2012.
  6. Roger Andrews – The Worst of BEST 23.03.2015
  7. Euan Mearns – Temperature Adjustments in Australia 22.02.2015
  8. Ronan and Michael Connolly – Summary: “Urbanization bias” – Papers 1-3 05.12.2013

Defining “Temperature Homogenisation”


The standard definition of temperature homogenisation is of a process that cleanses the temperature data of measurement biases, leaving only variations caused by real climatic or weather variations. This is at odds with the GHCN & GISS adjustments, which delete some data and add in other data as part of the homogenisation process. A more general definition is to make the data more homogenous, for the purposes of creating regional and global average temperatures. This is only compatible with the standard definition if one assumes that there are no real variations in climatic trend within the homogenisation area. From various studies it is clear that there are cases where this assumption does not hold good. The likely impacts include:-

  • Homogenised data for a particular temperature station will not be the cleansed data for that location. Instead it becomes a grid reference point, encompassing data from the surrounding area.
  • Different densities of temperature data may lead to different degrees to which homogenisation results in smoothing of real climatic fluctuations.

Whether or not this failure of the assumption is limited to a number of isolated instances, with a near-zero impact on global temperature anomalies, is an empirical matter that will be the subject of my next post.



A common feature of many concepts involved with climatology, the associated policies and the sociological analyses of non-believers, is a failure to clearly understand the terms used. In the past few months it has become evident to me that this failure of understanding extends to the term temperature homogenisation. In this post I look at the ambiguity of the standard definition compared with the actual practice of homogenising temperature data.


The Ambiguity of the Homogenisation Definition

The World Meteorological Organisation, in its 2004 Guidelines on Climate Metadata and Homogenization1, wrote this explanation.

Climate data can provide a great deal of information about the atmospheric environment that impacts almost all aspects of human endeavour. For example, these data have been used to determine where to build homes by calculating the return periods of large floods, whether the length of the frost-free growing season in a region is increasing or decreasing, and the potential variability in demand for heating fuels. However, for these and other long-term climate analyses –particularly climate change analyses– to be accurate, the climate data used must be as homogeneous as possible. A homogeneous climate time series is defined as one where variations are caused only by variations in climate.

Unfortunately, most long-term climatological time series have been affected by a number of nonclimatic factors that make these data unrepresentative of the actual climate variation occurring over time. These factors include changes in: instruments, observing practices, station locations, formulae used to calculate means, and station environment. Some changes cause sharp discontinuities while other changes, particularly change in the environment around the station, can cause gradual biases in the data. All of these inhomogeneities can bias a time series and lead to misinterpretations of the studied climate. It is important, therefore, to remove the inhomogeneities or at least determine the possible error they may cause.


That is, temperature homogenisation is necessary to isolate and remove what Steven Mosher has termed measurement biases2 from the real climate signal. But how does this isolation occur?

Venema et al 2012[3] state the issue more succinctly.


The most commonly used method to detect and remove the effects of artificial changes is the relative homogenization approach, which assumes that nearby stations are exposed to almost the same climate signal and that thus the differences between nearby stations can be utilized to detect inhomogeneities (Conrad and Pollak, 1950). In relative homogeneity testing, a candidate time series is compared to multiple surrounding stations either in a pairwise fashion or to a single composite reference time series computed for multiple nearby stations. (Italics mine)


Blogger …and Then There’s Physics (ATTP) partly recognizes these issues may exist in his stab at explaining temperature homogenisation4.

So, it all sounds easy. The problem is, we didn’t do this and – since we don’t have a time machine – we can’t go back and do it again properly. What we have is data from different countries and regions, of different qualities, covering different time periods, and with different amounts of accompanying information. It’s all we have, and we can’t do anything about this. What one has to do is look at the data for each site and see if there’s anything that doesn’t look right. We don’t expect the typical/average temperature at a given location at a given time of day to suddenly change. There’s no climatic reason why this should happen. Therefore, we’d expect the temperature data for a particular site to be continuous. If there is some discontinuity, you need to consider what to do. Ideally you look through the records to see if something happened. Maybe the sensor was moved. Maybe it was changed. Maybe the time of observation changed. If so, you can be confident that this explains the discontinuity, and so you adjust the data to make it continuous.

What if there isn’t a full record, or you can’t find any reason why the data may have been influenced by something non-climatic? Do you just leave it as is? Well, no, that would be silly. We don’t know of any climatic influence that can suddenly cause typical temperatures at a given location to suddenly increase or decrease. It’s much more likely that something non-climatic has influenced the data and, hence, the sensible thing to do is to adjust it to make the data continuous. (Italics mine)

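The adjustment step ATTP describes can be sketched in the same spirit. This is a deliberately simplified, hypothetical version: the offset is taken from the means either side of an already-identified break in a single record, whereas operational methods estimate it from comparisons with neighbouring stations.

```python
import numpy as np

def adjust_break(series, break_index, window=10):
    """Shift the pre-break segment so the series is continuous at the break.

    The offset is the difference between the mean values in a window
    either side of the (already detected) break -- a simplified stand-in
    for an operational adjustment."""
    series = np.asarray(series, dtype=float)
    before = series[max(0, break_index - window):break_index].mean()
    after = series[break_index:break_index + window].mean()
    adjusted = series.copy()
    adjusted[:break_index] += after - before   # remove the apparent jump
    return adjusted

# Example: a flat record that jumps by 0.8 when the sensor is changed.
raw = np.concatenate([np.full(30, 14.0), np.full(30, 14.8)])
print(adjust_break(raw, 30)[:5])   # pre-break values are shifted up to 14.8
```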
The assumption that nearby temperature stations have the same (or a very similar) climatic signal would, if true, mean that homogenisation cleanses the data of the impurities of measurement biases. But only a cursory glance is given to the data. For instance, when Kevin Cowtan gave an explanation of the fall in average temperatures at Puerto Casado, neither he, nor anyone else, checked whether the explanation stacked up beyond confirming that there had been a documented station move at roughly that time. Yet the station move comes at the end of the drop in temperatures, and a few minutes of checking would have confirmed that other nearby stations exhibit very similar temperature falls5. If you have a preconceived view of how the data should be, then a superficial explanation that conforms to that preconception will be sufficient. If you accept the authority of experts over personally checking for yourself, then the claim by experts that there is not a problem is sufficient. Those with no experience of checking the outputs from the processing of complex data will not appreciate the issues involved.


However, this definition of homogenisation appears to be different from that used by GHCN and NASA GISS. When Euan Mearns looked at temperature adjustments in the Southern Hemisphere and in the Arctic6, he found numerous examples in the GHCN and GISS homogenisations of infilling of some missing data and, to a greater extent, of deletion of huge chunks of temperature data. For example, this graphic is from Mearns’ spreadsheet of adjustments between GHCNv2 (raw data + adjustments) and GHCNv3 (homogenised data) for 25 stations in Southern South America. The yellow cells are where V2 data exist but V3 data do not; the green cells are where V3 data exist but V2 data do not.
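The kind of V2-versus-V3 comparison Mearns tabulates can be reproduced for any single station in a few lines of pandas. The numbers below are invented purely to show the mechanics of flagging deletions (the “yellow” cells) and infills (the “green” cells); they are not GHCN values.

```python
import numpy as np
import pandas as pd

# Hypothetical monthly values for one station: 'v2' stands for the GHCNv2
# record (raw data + adjustments), 'v3' for the GHCNv3 homogenised record.
idx = pd.period_range("1950-01", periods=6, freq="M")
df = pd.DataFrame({
    "v2": [12.1, 11.8, np.nan, 13.0, 12.5, 12.9],
    "v3": [12.1, np.nan, 12.4, 13.0, np.nan, 12.6],
}, index=idx)

deleted = df["v2"].notna() & df["v3"].isna()    # "yellow": dropped in v3
infilled = df["v2"].isna() & df["v3"].notna()   # "green": added in v3
adjusted = df["v2"].notna() & df["v3"].notna() & (df["v2"] != df["v3"])

print(df.assign(deleted=deleted, infilled=infilled, adjusted=adjusted))
```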



Definition of Temperature Homogenisation

A more general definition, one that encompasses the GHCN / GISS adjustments, is of broadly making the data homogeneous. This is not done by simply blending the data together and smoothing it out. Homogenisation also adjusts anomalous data as a result of pairwise comparisons with local temperature stations or, in the case of extreme differences, the GHCN / GISS process deletes the most anomalous data. This is a much looser and broader process than the homogenisation of milk, or putting some food through a blender.

I cover the definition in more depth in the appendix.



The Consequences of Making Data Homogeneous

A consequence of cleansing the data in order to make it more homogeneous is a distinction that is missed by many. It arises from making the strong assumption that there are no climatic differences between the temperature stations in the homogenisation area.

Homogenisation is aimed at adjusting for measurement biases so as to give a climatic reading, for the location where the temperature station is sited, that is a closer approximation to what that reading would have been without those biases. Under the strong assumption, making the data homogeneous is identical to removing the non-climatic inhomogeneities. Cleansed of these measurement biases, the temperature data then represents both the average temperature readings that would have been generated had the temperature station been free of biases and a reading representative of the surrounding area. This latter aspect is necessary to build up a global temperature anomaly, which is constructed by dividing the surface into a grid. Homogenisation, in the sense of making the data more homogeneous by blending, is an inappropriate term. All that is happening is an adjustment for anomalies within the data, either through comparisons with local temperature stations (the GHCN / GISS method) or through comparisons with an expected regional average (the Berkeley Earth method).

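The grid construction referred to above can be illustrated with a toy example. This is a bare-bones sketch under my own simplifying assumptions (equal-weight cells, no area weighting, invented station anomalies); the actual GISS and Berkeley Earth procedures are considerably more elaborate.

```python
import numpy as np

def grid_cell_average(stations, cell_size=5.0):
    """Average station anomalies within lat/lon grid cells, then average
    the cell means. `stations` is a list of (lat, lon, anomaly) tuples."""
    cells = {}
    for lat, lon, anom in stations:
        key = (int(lat // cell_size), int(lon // cell_size))
        cells.setdefault(key, []).append(anom)
    cell_means = [np.mean(values) for values in cells.values()]
    return float(np.mean(cell_means))

# Two stations crowded into one cell do not outweigh a lone station elsewhere:
stations = [(51.2, -0.5, 0.40), (51.9, -1.1, 0.44), (-22.3, -57.6, -0.30)]
print(grid_cell_average(stations))   # average of 0.42 and -0.30, i.e. 0.06
```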

But if the strong assumption does not hold, homogenisation will adjust away these climatic differences, and will to some extent fail to eliminate the measurement biases. Homogenisation is in fact made more necessary if movements in average temperatures are not the same everywhere and the spread of temperature data is spatially uneven. Then homogenisation needs not only to remove the anomalous data, but also to make specific locations more representative of the surrounding area. This enables any imposed grid structure to create an estimated average for each grid area by averaging the homogenised temperature data sets within it. As a consequence, the homogenised data for a temperature station will cease to be a close approximation to what the thermometers would have read free of any measurement biases. As homogenisation is calculated by comparisons with temperature stations beyond those immediately adjacent, there will be, to some extent, influences of climatic changes beyond the local temperature stations. The consequences of climatic differences within the homogenisation area include the following.


  • The homogenised temperature data for a location could appear largely unrelated to the original data or to the data adjusted for known biases. This could explain the homogenised Reykjavik temperature, where Trausti Jonsson of the Icelandic Met Office, who had been working with the data for decades, could not understand the GHCN/GISS adjustments7.
  • The greater the density of temperature stations in relation to the climatic variations, the less that climatic variations will impact on the homogenisations, and the greater will be the removal of actual measurement biases. Climate variations are unlikely to be much of an issue with the Western European and United States data. But on the vast majority of the earth’s surface, whether land or sea, coverage is much sparser.
  • If the climatic variation at a location is of a different magnitude to that at other locations in the homogenisation area, but occurs over the same time periods and in the same direction, then the data trends will be largely retained. For instance, in Svalbard the warming temperature trends of the early twentieth century and from the late 1970s were much greater than elsewhere, so were adjusted downwards8.
  • If there are differences in the rate of temperature change, or in the time periods over which similar changes occur, then any “anomalous” data due to climatic differences at the location will be eliminated or severely adjusted, on the same basis as “anomalous” data due to measurement biases (see the sketch after this list). For instance, in a large part of Paraguay at the end of the 1960s average temperatures fell by around 1°C. Because this phenomenon did not occur in the surrounding areas, both the GHCN and Berkeley Earth homogenisation processes adjusted out this trend. As a consequence, a mid-twentieth century cooling in the area was effectively adjusted out of the data9.
  • If a large proportion of temperature stations in a particular area have consistent measurement biases, then homogenisation will retain those biases, as it will not appear anomalous within the data. For instance, much of the extreme warming post 1950 in South Korea is likely to have been as a result of urbanization10.

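Here is the sketch promised above: a toy simulation in which a genuine local shift, loosely modelled on the Paraguayan example, is indistinguishable from a station move from the algorithm’s point of view, and so gets adjusted away. The series and the one-break adjustment are my own invented simplifications, not the GHCN or Berkeley Earth code.

```python
import numpy as np

rng = np.random.default_rng(1)
years = 60
regional = rng.normal(0.0, 0.2, years)            # neighbours: no shift
candidate = regional + rng.normal(0, 0.1, years)
candidate[40:] -= 1.0     # a real, purely local ~1C drop in the candidate

# Relative homogenisation only sees the candidate-minus-neighbour difference,
# so a real local shift looks exactly like a sensor change or station move
# and is removed by the same adjustment.
diff = candidate - regional
step = diff[40:].mean() - diff[:40].mean()
homogenised = candidate.copy()
homogenised[40:] -= step          # the genuine cooling is adjusted away

print(f"apparent 'bias' removed: {step:.2f}C")
print(f"post-break mean, raw vs homogenised: "
      f"{candidate[40:].mean():.2f} vs {homogenised[40:].mean():.2f}")
```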

Other Comments

Homogenisation is just one part of the process of adjusting the data for the twin purposes of attempting to correct for biases and of building regional and global temperature anomalies. It cannot, for instance, correct for time of observation biases (TOBS). This needs to be done prior to homogenisation. Neither will homogenisation build a global temperature anomaly. Extrapolating from the limited data coverage is a further process, whether for fixed temperature stations on land or for the ship measurements used to calculate the ocean surface temperature anomalies. This extrapolation has further difficulties. For instance, in a previous post11 I covered a potential issue with the Gistemp proxy data for Antarctica prior to permanent bases being established on the continent in the 1950s. Making the data homogeneous is but the middle part of a wider process.

Homogenisation is a complex process. The Venema et al 20123 paper on the benchmarking of homogenisation algorithms demonstrates that different algorithms produce significantly different results. What is clear from the original posts on the subject by Paul Homewood and the more detailed studies by Euan Mearns and Roger Andrews at Energy Matters, is that the whole process of going from the raw monthly temperature readings to the final global land surface average trends has thrown up some peculiarities. In order to determine whether they are isolated instances that have near zero impact on the overall picture, or point to more systematic biases that result from the points made above, it is necessary to understand the data available in relation to the overall global picture. That will be the subject of my next post.


Kevin Marshall



  1. GUIDELINES ON CLIMATE METADATA AND HOMOGENIZATION by Enric Aguilar, Inge Auer, Manola Brunet, Thomas C. Peterson and Jon Wieringa
  2. Steven Mosher – Guest post : Skeptics demand adjustments 09.02.2015
  3. Venema et al 2012 – Venema, V. K. C., Mestre, O., Aguilar, E., Auer, I., Guijarro, J. A., Domonkos, P., Vertacnik, G., Szentimrey, T., Stepanek, P., Zahradnicek, P., Viarre, J., Müller-Westermeier, G., Lakatos, M., Williams, C. N., Menne, M. J., Lindau, R., Rasol, D., Rustemeier, E., Kolokythas, K., Marinova, T., Andresen, L., Acquaotta, F., Fratianni, S., Cheval, S., Klancar, M., Brunetti, M., Gruber, C., Prohom Duran, M., Likso, T., Esteban, P., and Brandsma, T.: Benchmarking homogenization algorithms for monthly data, Clim. Past, 8, 89-115, doi:10.5194/cp-8-89-2012, 2012.
  4. …and Then There’s Physics – Temperature homogenisation 01.02.2015
  5. See my post Temperature Homogenization at Puerto Casado 03.05.2015
  6. For example

    The Hunt For Global Warming: Southern Hemisphere Summary

    Record Arctic Warmth – in 1937

  7. See my post Reykjavik Temperature Adjustments – a comparison 23.02.2015
  8. See my post RealClimate’s Mis-directions on Arctic Temperatures 03.03.2015
  9. See my post Is there a Homogenisation Bias in Paraguay’s Temperature Data? 02.08.2015
  10. NOT A LOT OF PEOPLE KNOW THAT (Paul Homewood) – UHI In South Korea Ignored By GISS 14.02.2015



Appendix – Definition of Temperature Homogenisation

When discussing temperature homogenisations, nobody asks what the term actually means. In my house we consume homogenised milk. This is the same as the pasteurized milk I drank as a child except for one aspect. As a child I used to compete with my siblings to be the first to open a new pint bottle, as it had the cream on top. The milk now does not have this cream, as it is blended in, or homogenized, with the rest of the milk. Temperature homogenizations are different, involving changes to figures, along with (at least with the GHCN/GISS data) filling the gaps in some places and removing data in others1.

But rather than note the differences, it is better to consult an authoritative source. The dictionary definitions of the verb homogenize are:-

verb (used with object), homogenized, homogenizing.

  1. to form by blending unlike elements; make homogeneous.
  2. to prepare an emulsion, as by reducing the size of the fat globules in (milk or cream) in order to distribute them equally throughout.
  3. to make uniform or similar, as in composition or function:

    to homogenize school systems.

  4. Metallurgy. to subject (metal) to high temperature to ensure uniform diffusion of components.

Applying the dictionary definitions, data homogenization in science should be about blending various elements together to make them uniform, not about additions or subtractions from the data set, or adjusting the data. This is particularly true in chemistry.

For GHCN and NASA GISS temperature data, homogenization involves removing or adjusting elements in the data that are markedly dissimilar from the rest. It can also mean infilling data that was never measured. The verb homogenize does not fit the processes at work here. This has led some, like Paul Homewood, to refer to the process as data tampering or worse. A better idea is to look further at the dictionary.

Again from the same dictionary, the first two definitions of the adjective homogeneous are:-

  1. composed of parts or elements that are all of the same kind; not heterogeneous:

a homogeneous population.

  2. of the same kind or nature; essentially alike.

I would suggest that temperature homogenization is a loose term for describing the process of making the data more homogeneous, that is, for smoothing out the data in some way. A false analogy is when I make a vegetable soup. After cooking I end up with a stock containing lumps of potato, carrot, leeks etc. I put it through the blender to get an even consistency. I end up with the same weight of soup before and after. A similar process of getting the same out of homogenization as was put in is clearly not what is happening to temperatures. The aim of making the data homogeneous is both to remove anomalous data and to blend the data together.



