aTTP falsely attacks Bjorn Lomborg’s “Impact of Current Climate Proposals” Paper

The following is a comment to be posted at Bishop Hill, responding to another attempt by the blogger …and Then There's Physics to undermine the work of Bjorn Lomborg. The previous attempt was discussed here. This post includes a number of links, as well as a couple of illustrative screen captures at the foot of the post.

aTTP’s comment is

In fact, you should read Joe Romm’s post about this. He’s showing that the INDCs are likely to lead to around 3.5C which I think is relative to something like the 1860-1880 mean. This is very similar to the MIT’s 3.7, and quite a bit lower than the RCP8.5 of around 4.5C. So, yes, we all know that the INDCs are not going to do as much as some might like, but the impact is likely to be a good deal greater than that implied by Lomborg who has essentially assumed that we get to 2030 and then simply give up.

Nov 11, 2015 at 9:31 AM | …and Then There’s Physics

My Comment

aTTP at 9.31 refers to Joe Romm's blog post of Nov 3, "Misleading U.N. Report Confuses Media On Paris Climate Talks". Romm uses Climate Interactive's Climate Scoreboard tool to show that the INDC submissions (if fully implemented) will result in 3.5°C of warming, as against 4.5°C in the non-policy "No Action" scenario. That 1°C difference is six times the maximum impact of 0.17°C claimed in Lomborg's new paper. Who is right? What struck me first was that Romm's first graph, copied straight from Climate Interactive, seems to have a very large estimate for emissions in the "No Action" scenario. Downloading the underlying data, I find the "No Action" global emissions in 2100 are 139.3 GtCO2e, compared with about 110 GtCO2e for the RCP8.5 high-emissions scenario in Figure SPM5(a) of the AR5 Synthesis Report. But it is the breakdown per country or region that matters.

For the USA, emissions without action are forecast to rise by 40% from 2010 to 2030, in contrast to a rise of just 9% in the period 1990 to 2010. It is more likely that, even without policy, emissions will fall and will be no higher in 2100 than in 2010. The "No Action" scenario therefore overestimates US emissions by 2-3 GtCO2e in 2030 and by about 7-8 GtCO2e in 2100.

For China the overestimation is even greater. Emissions will peak during the next decade as China completes its industrialization, just as emissions peaked in most European countries in the 1970s and 1980s. Climate Interactive assumes that China's emissions will peak at 43 GtCO2e in 2090, whereas other estimates put the peak at around 16-17 GtCO2e before 2030.

Together, the overestimates in the US and China "No Action" scenarios account for over half of the 55-60 GtCO2e difference in 2100 emissions between the "No Action" and "Current INDC" scenarios. A very old IT term applies here – GIGO. If aTTP had actually checked the underlying assumptions he would realise that Romm's rebuttal of Lomborg, based on China's emission assumptions (and repeated on his own blog), is as false as claiming that the availability of free condoms is why population peaks.
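As a rough check on the "over half" claim, here is a minimal sketch using only the figures quoted above; the assumption (mine, for illustration) is that China's "No Action" emissions in 2100 stay near the assumed 43 GtCO2e peak, against the alternative estimates of 16-17 GtCO2e.

    # Rough check of the "over half" claim, using the figures quoted above.
    us_overestimate_2100 = 7.5            # GtCO2e, midpoint of the 7-8 range
    china_overestimate_2100 = 43 - 16.5   # GtCO2e, assumed peak vs alternative estimates
    combined = us_overestimate_2100 + china_overestimate_2100
    for gap in (55, 60):                  # GtCO2e gap between the two scenarios
        print(combined, "of", gap, "=", round(100 * combined / gap), "%")
    # Prints roughly 57-62%, i.e. over half of the 55-60 GtCO2e difference.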

Links posted at

Kevin Marshall


Figures referred to (but not referenced) in the comment above

Figure 1: Climate Interactive’s graph, referenced by Joe Romm.

Figure 2: Reproduction of Figure SPM5(a) from Page 9 of the AR5 Synthesis Report.


Update – posted the following to ATTP’s blog


Lomborg and the Grantham Institute on the INDC submissions

Bjorn Lomborg has a new paper published in the Global Policy journal, titled: Impact of Current Climate Proposals. (hattip Bishop Hill and WUWT)

From the Abstract

This article investigates the temperature reduction impact of major climate policy proposals implemented by 2030, using the standard MAGICC climate model. Even optimistically assuming that promised emission cuts are maintained throughout the century, the impacts are generally small. ………… All climate policies by the US, China, the EU and the rest of the world, implemented from the early 2000s to 2030 and sustained through the century will likely reduce global temperature rise about 0.17°C in 2100. These impact estimates are robust to different calibrations of climate sensitivity, carbon cycling and different climate scenarios. Current climate policy promises will do little to stabilize the climate and their impact will be undetectable for many decades.

That is pretty clear. COP21 in Paris is a waste of time.

An alternative estimate is provided in a paper by Boyd, Turner and Ward (BTW) of the LSE Grantham Institute, published at the end of October.

They state

The most optimistic estimate of global emissions in 2030 resulting from the INDCs is about halfway between hypothetical ‘business as usual’ and a pathway that is consistent with the 2°C limit

The MAGICC climate model used by both Lomborg and the IPCC predicts warming of about 4.7°C under BAU. Being "halfway" to the 2°C limit therefore implies up to a 1.35°C reduction from the INDCs, eight times the 0.17°C maximum calculated by Lomborg. Lomborg says his figure is contingent on no carbon leakage (exporting industry from policy to non-policy countries), whilst citing studies showing that leakage could offset 10-40%, or even over 100%, of the emissions reduction. So the difference between sceptic Lomborg and the mighty LSE Grantham Institute is even greater than eight times. Yet Lomborg refers extensively to the August edition of BTW. So why the difference? There is no explicit indication in BTW of how they arrive at their halfway conclusion, nor a comparison by Lomborg.
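For clarity, the arithmetic behind the "eight times" comparison is simply the gap between the MAGICC BAU warming and the 2°C limit, halved, set against Lomborg's 0.17°C maximum; a minimal sketch:

    # Arithmetic behind the "eight times" comparison.
    bau_warming = 4.7          # deg C, MAGICC business-as-usual
    two_degree_limit = 2.0     # deg C
    halfway_impact = (bau_warming - two_degree_limit) / 2   # 1.35 deg C
    lomborg_max = 0.17         # deg C, Lomborg's maximum estimate
    print(round(halfway_impact, 2), round(halfway_impact / lomborg_max, 1))   # 1.35 and ~7.9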

Two other estimates are from the UNFCCC, and Climate Action Tracker. Both estimate the INDCs will constrain warming to 2.7°C, or about 2.0°C below the MAGICC BAU scenario. They both make assumptions about massive reductions in emissions post 2030 that are not in the INDCs. But at least the UNFCCC and CAT have graphs that show the projection through to 2100. Not so with BTW.

This is where the eminent brain surgeons and Nobel-Prize winning rocket scientists among the readership will need to concentrate to achieve the penetrating analytical powers of a lesser climate scientist.

From the text of BTW, the hypothetical business as usual (BAU) scenario for 2030 is 68 GtCO2e. The most optimistic scenario for emissions from the INDCs (and the most pessimistic for economic growth in the emerging economies) is that 2030 emissions will be 52 GtCO2e. The sophisticated climate projection models have whispered in code to the climate scientists that to be on target for the 2.0°C limit, 2030 emissions should be no more than 36 GtCO2e. The mathematicians will be able to determine that 52 is exactly halfway between 36 and 68.
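For the avoidance of doubt, the "halfway" claim is just the midpoint calculation below (a trivial sketch using the three figures above):

    # BTW's "halfway" claim as simple arithmetic.
    bau_2030 = 68        # GtCO2e, hypothetical business as usual
    two_deg_2030 = 36    # GtCO2e, on target for the 2.0C limit
    indc_2030 = 52       # GtCO2e, most optimistic INDC outcome
    print((bau_2030 + two_deg_2030) / 2 == indc_2030)   # True: 52 is exactly halfway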

Now for the really difficult bit. I have just spent the last half hour in the shed manically cranking the handle of my patent beancounter extrapolator machine to get this result. By extrapolating this halfway result for the forecast period 2010-2030 through to 2100 my extrapolator tells me the INDCs are halfway to reaching the 2.0°C maximum warming target.

As Bob Ward will no doubt point out in his forthcoming rebuttal of Bjorn Lomborg’s paper, it is only true climate scientists who can reach such levels of analysis and understanding.

I accept no liability for any injuries caused, whether physical or psychological, by people foolishly trying to replicate this advanced result. Please leave this to the experts.

But there is a serious side to this policy advocacy. The Grantham Institute, along with others, is utterly misrepresenting the effectiveness of policy to virtually every government on the planet. Lomborg shows by rigorous means that policy is ineffective even when loads of optimistic assumptions are made, whether on climate science forecasting, policy theory, technological solutions, government priorities, or the ability of current governments to make policy commitments for the governments of decades ahead. My prediction is that the reaction of the Grantham Institute, along with plenty of others, will be a thuggish denunciation of Lomborg. What they will not allow is the rational response to wide differences of interpretation: to compare and contrast the arguments and the assumptions made, both explicit and implicit.

Kevin Marshall

WORLD RESOURCES INSTITUTE and Indonesian Emission Figures

In looking at the Indonesian INDC submission, I came across a confusing array of estimates for Indonesia’s total greenhouse gas emissions. These are the ones I found.

Estimates of Indonesia’s Total Greenhouse Emissions in MtCO2e





Source            1990    2000    2005    2010
UNFCCC            1,101   1,444   2,829   1,908
EDGAR             1,165     622   1,171     745
WRI CAIT 2.0      1,026   1,372   1,584   1,928
WRI Blog                  1,000   1,400   1,500
Indonesian Govt                   1,800

In graph format the figures are:-

The Indonesian INDC Submission says Indonesia will unconditionally cut emissions by 29% from a BAU of 2881 MtCO2e, and by 41% (to about 1700 MtCO2e) with international assistance. Against the Government's own 2005 estimate of 1800 MtCO2e, that conditional target means 2030 emissions will be about 100 MtCO2e lower than in 2005 – not 1120 MtCO2e lower (UNFCCC) or 530 MtCO2e higher (EDGAR). But on the basis of the UNFCCC or EDGAR figures, by 2010 Indonesia's emissions had already fallen by a third, so meeting the 2030 targets should prove a doddle. Alternatively, use the World Resources Institute CAIT 2.0 data and Indonesia has unconditionally agreed to something much more drastic. Between 2005 and 2010 emissions grew at 4% a year. On that trend the 2030 BAU becomes 4200 MtCO2e, not 2881 MtCO2e, so the unconditional emissions "cut" is not 29% but 51%.
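A minimal sketch of that alternative calculation (my own illustration), using the WRI CAIT 2.0 figures from the table above and assuming the 2005-2010 trend simply continues to 2030:

    # Extrapolating the WRI CAIT 2.0 trend to a 2030 BAU, as described above.
    cait_2005, cait_2010 = 1584.0, 1928.0              # MtCO2e
    growth = (cait_2010 / cait_2005) ** (1 / 5) - 1    # ~4% per annum, 2005-2010
    bau_2030 = cait_2010 * (1 + growth) ** 20          # ~4,200 MtCO2e
    indc_cap = 2881 * (1 - 0.29)                       # ~2,045 MtCO2e unconditional cap
    print(round(100 * growth, 1), round(bau_2030), round(100 * (1 - indc_cap / bau_2030), 1))
    # Prints about 4.0 (% growth), 4232 (BAU) and 51.7 - an implied cut of roughly 51-52%, not 29%.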

The worst example is contained in a graph about the Indonesian INDC Submission at the World Resources Institute Blog and reproduced below.

There are a number of things wrong with this graph, including

  • Scale is in KtCO2e, not MtCO2e.
  • Does not use WRI’s own CAIT 2.0. This is despite WRI claiming itprovides free access to comprehensive, reliable, and comparable greenhouse gas emissions data sets, as well as other climate-relevant indicators, to enable analysis on a wide range of climate-related data questions.
  • Nor is there any trace of Indonesia's claimed emissions of 1800 MtCO2e in 2005. So where does this wibbly-wobbly projection come from? The reference includes BAPPENAS 2015 – the Indonesian "National Development Planning Agency". A search finds this graph.

The figure for 2005 is about 1400 MtCO2e, not the 1800 MtCO2e stated in the INDC. The Indonesians have fiddled their own unaudited figures to get a politically desired result – an easily achievable "reduction" in GHG emissions. Even worse, the WRI does not appear to check the data. There are minor points too: the Indonesian "dalam ribu ton" translates on Google as "in thousand tons", and anyone who knows climate data would realize that 1,000,000 MtCO2e is far greater than 49 GtCO2e, the UN IPCC's AR5 global estimate of GHG emissions in 2010.

Finally, the Carbon Brief, in a recent article, says that 1997 was a record year for forest fires – a record that may be broken in 2015. Already 1600 MtCO2e has been emitted from this year's forest fires. On this basis, total Indonesian emissions in 1997 are likely to have been well in excess of 2000 MtCO2e, a considerable spike in the record.

The WRI CAIT 2.0 data shows only a minor spike. The narrower "GHG Emissions from Land-Use Change and Forestry" category was estimated at 904 MtCO2e, as against 1321 MtCO2e in 2006. This is nowhere near the 1997 emissions record implied by the Carbon Brief.

In summary, emissions figures for Indonesia are just arbitrary estimates, based on extremely limited and contradictory data. Both the WRI and the Indonesian Government cherry-pick the data that suits their cause. Whether that is justified depends on the purpose. The WRI states its mission clearly.

That is to impose their environmentalist beliefs and perspectives on everybody else.

Indonesia’s INDC submission begins

This is, in my view, a far more rounded and focused mission. Against the environmentalist ideologies of the UNFCCC, I believe that in manipulating figures Indonesia is serving the interests of 250 million Indonesians.

Kevin Marshall

Indonesia Outflanks the Climate Activists in its INDC Submission

I have spent a few weeks trying to make sense of the INDC submissions. One of the most impenetrable appeared to be that from Indonesia. This view is shared by The Carbon Brief.

Uncertain emissions

As well as being hazy on policy and financing needs, it is also difficult to gauge the ambition of Indonesia’s INDC emissions targets. This is despite the document including a projected figure for BAU emissions in 2030 of 2.9bn tonnes of CO2 equivalent (GtCO2e).

The pledge to reduce emissions by at least 29% compared to this trajectory means an effective cap in 2030 of 2GtCO2e. With the more ambitious 41% reduction compared to BAU, the cap would be 1.7GtCO2e.


Similarly the World Resources Institute states

(T)he current draft contribution still displays several important gaps in transparency and ambition, which must be addressed before submitting a final INDC to the United Nations Framework Convention on Climate Change (UNFCCC). By eliminating these gaps, the Indonesian government could bring its contribution into line with international best practices on transparency, demonstrate leadership internationally by enhancing ambition, and help ensure success at COP 21.

The context from Indonesia’s perspective is stated in the opening paragraph of Indonesia’s INDC Submission.

In more basic language, Indonesia has more important and immediate priorities than “climate change“. From a national point of view, imposing drastic and ineffective policies will go against the Indonesian Government’s perceived duty to its people. This will happen regardless of the truth of the projected catastrophes that await the planet without global mitigation. The policies will be ineffective because most other emerging economies have similar priorities to Indonesia, and are taking similar measures of policy avoidance. In the case of Indonesia these are

  • Cherry-picking a base year.
  • Making reductions relative to a fictional “Business as Usual” scenario with inflated economic growth figures.
  • Making sure that even the most ambitious objectives are achievable within the range of an objective forecast.
  • Focus the negotiations on achieving the conditional objectives subject to outside assistance. Any failure to reach agreement then becomes the fault of rich countries failing to provide the finance.
  • Allow some room to make last minute concessions not in the original submission, contingent on further unspecified outside assistance that is so vast the money will never be forthcoming.

The calculations to achieve the figures in the submissions are fairly simple to work out with a bit of patience.


Calculating the 2030 Business as Usual 2881 MtCO2e

The Indonesian INDC submission states that in 2005 total emissions were 1800 MtCO2e and that combustion of fossil fuels was 19% of this total. That implies about 342 MtCO2e from the combustion of fossil fuels. The Carbon Dioxide Information Analysis Center (CDIAC [1]) has an estimated figure of 341.71 MtCO2e, and in the UNFCCC Country Brief the 2005 "CO2 emissions from fuel combustion" were 335.71 MtCO2e. For 2011 [2] the CDIAC estimate is 472.53 MtCO2e, rounded to 473. Let us now assume a growth rate in emissions of 6.0% per annum from 2012 to 2030, against an economic growth rate of around 5.2% from 2000 to 2010 and 5.8% from 2005 to 2010 [3]. At 6.0% compound growth, fossil fuel emissions in 2030 [4] will be 1431 MtCO2e.

The non-fossil fuel emissions are a bit more problematic to work out. The 2005 baseline estimate is 81% of 1800 [5], or 1458 MtCO2e. It is only a vague estimate, so round it down to 1450 and then assume it remains constant in the Business as Usual (BAU) scenario.

The BAU 2030 total emissions forecast for Indonesia is therefore 1431 + 1450 = 2881 MtCO2e.
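A minimal sketch reproducing that derivation (my own illustration, replicating the year-by-year rounding described in footnote 4 and using only the figures quoted above):

    # Reconstructing the 2881 MtCO2e BAU figure described above.
    fossil = 473.0                       # MtCO2e in 2011 (CDIAC estimate, rounded)
    for year in range(2012, 2031):
        fossil = round(fossil * 1.06)    # 6.0% compound growth, rounded each year
    non_fossil = 1450                    # MtCO2e, held constant in the BAU case
    print(fossil, fossil + non_fossil)   # 1431 and 2881 MtCO2e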

There might be other ways to derive this figure, but none are simpler and the figures do not fall out exactly.


How does Indonesia achieve the unconditional 29% reduction against BAU?

The easiest part to achieve is outside of fossil fuel emissions. The major cause of these emissions is the clearing of the rainforests, and The Carbon Brief claims the biggest source is illegal forest clearances to grow palm oil. Although in 2015 the forest fires are closing in on the record set in 1997, it is safe to say that these emissions will reduce considerably in the coming years, as Indonesia already has 52% of world palm oil production. By assuming a 3.34% reduction per annum in these emissions from 2005, they fall from 1450 MtCO2e to 611 MtCO2e in 2030. Total emissions of 2042 MtCO2e (1431 + 611) are then 29.1% lower than BAU, without any expense on the part of the Indonesian Government.


How does Indonesia achieve the conditional 41% reduction against BAU?

Indonesia claims that it needs international cooperation to increase the reduction against BAU to 41%. In whole numbers, if BAU is 2881 then a 41% reduction gives 1700. Not 1699 or 1701, but 1700. This is 100 less than the estimated 1800 MtCO2e total GHG emissions for 2005. It can be achieved without any "international cooperation", a euphemism for foreign aid. The reason is simple. From the UNFCCC Country Brief for Indonesia, GDP growth from 1990 to 2012 averaged 4.9% per annum and growth in CO2 emissions from fuel combustion averaged 5.1%. Normally GDP growth exceeds emissions growth. As a country develops this gap widens, until emissions growth ceases altogether and even falls slightly. In India, GDP growth from 1990 to 2012 averaged 6.5% and emissions growth 5.7%. In China the respective figures are 10.3% and 6.1%, and emissions will peak around 2025 to 2030 without any policy change. It is therefore reasonable to assume that forecast fossil fuel emissions growth will be at a lower rate than the forecast GDP growth of 6.0%. A conservative estimate is that the fossil fuel emissions growth rate will be 25% lower than the GDP growth rate from 2011 to 2030, at 4.50%. Rounding as before [4] gives forecast 2030 emissions of 1089 MtCO2e, as against a BAU of 1431.

The revised 2030 total emissions forecast for Indonesia is 1089 + 611 = 1700 MtCO2e. This is a 41.0% reduction on the BAU of 2881 MtCO2e.
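The same approach reproduces the conditional figure; a sketch using only the numbers stated above, with the 611 MtCO2e land-use figure taken as given from the previous section:

    # Reconstructing the 41% conditional reduction described above.
    fossil = 473.0                        # MtCO2e in 2011 (CDIAC estimate, rounded)
    for year in range(2012, 2031):
        fossil = round(fossil * 1.045)    # 4.5% growth: 25% below the 6.0% GDP assumption
    land_use = 611                        # MtCO2e in 2030, as derived above
    bau = 2881                            # MtCO2e, the BAU total
    print(fossil, fossil + land_use)                        # 1089 and 1700 MtCO2e
    print(round(100 * (1 - (fossil + land_use) / bau), 1))  # 41.0 (% cut, conditional)
    print(round(100 * (1 - (1431 + land_use) / bau), 1))    # 29.1 (% cut, unconditional)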


Why would Indonesia engage in such a cynical manipulation of the numbers?

Indonesia is caught between a rock and a hard place. The stated major priorities for this country of 250 million people are at odds with doing its bit to save the world. In this Indonesia is not alone: India, China and Vietnam are other major emerging nations that cite other priorities. Ranged against them are the activist scientists behind the climate scare, who hold as a priori truth the prophesied global warming catastrophes that await the planet if we do not amend our wicked ways. Further, to them mitigation policies are good for the soul, regardless of their effectiveness, and the practice of these policies will lead others to the enlightenment they have found. They will not recognize that any alternative points of view exist, whether moral, political or scientific. Rather than argue, the best policy is to outflank them. The activists will accept official policy objectives without question so long as they appear to fit the cause. So the Indonesians gave them massive cuts relative to fictitious projected figures, cloaked in the language of climate-speak to throw them off the scent. They should be applauded for protecting 250 million people, rather than inflicting ineffective burdens upon them. The real shame is that the leaders of the so-called developed economies have fallen for this rubbish.

Kevin Marshall


  1. Reference of the full global carbon budget 2014: C. Le Quéré, R. Moriarty, R. M. Andrew, G. P. Peters, P. Ciais, P. Friedlingstein, S. D. Jones, S. Sitch, P. Tans, A. Arneth, T. A. Boden, L. Bopp, Y. Bozec, J. G. Canadell, F. Chevallier, C. E. Cosca, I. Harris, M. Hoppema, R. A. Houghton, J. I. House, A. K. Jain, T. Johannessen, E. Kato, R. F. Keeling, V. Kitidis, K. Klein Goldewijk, C. Koven, C. S. Landa, P. Landschützer, A. Lenton, I. D. Lima, G. H. Marland, J. T. Mathis, N. Metzl, Y. Nojiri, A. Olsen, T. Ono, W. Peters, B. Pfeil, B. Poulter, M. R. Raupach, P. Regnier, C. Rödenbeck, S. Saito, J. E. Sailsbury, U. Schuster, J. Schwinger, R. Séférian, J. Segschneider, T. Steinhoff, B. D. Stocker, A. J. Sutton, T. Takahashi, B. Tilbrook, G. R. van der Werf, N. Viovy, Y.-P. Wang, R. Wanninkhof, A. Wiltshire, and N. Zeng 2014. Global Carbon Budget 2014. Earth System Science Data Discussions, doi:10.5194/essdd-7-521-2014
  2. 2011 is the baseline year for the IPCC reports.
  3. This can be obtained from two sources. First the INDC submission notes that “GDP Growth Rate has slowed between 2010-2015 from 6.2-6.5% per annum to only 4.0% per annum (first quarter of 2015).” A return to the higher levels of growth is an assumption of successful government policy.
  4. Each year growth of 6.0% is rounded to the nearest whole number.
  5. The 2005 total emissions estimate of 1800 MtCO2 is at odds with other estimates. The WRI CAIT 2.0 figure is 1600; the EDGAR estimate is 1171; and the UNFCCC estimate is 2828. There might be another method of estimation. Maybe it is being a bit too cynical to assume that someone could have taken the average of the three (1866) and rounded down.

Plans to Increase Global Emissions at COP21 Paris


It is a necessary, but far from sufficient, condition for cutting global greenhouse gas emissions that any increases in emissions in some parts of the world are offset by emissions cuts elsewhere. The proposed emissions targets for 2010 to 2030 contained in the INDC submissions for COP21 in Paris suggest the opposite will be the case. For every tonne of emissions reductions in 32 leading developed countries there will be at least three tonnes of emissions increases in 7 major developing countries. The net effect of these targets being achieved in these countries (which combined make up both 60% of global emissions and 60% of global population) will be to make global emissions 20% higher in 2030 than in 2010. Using UNIPCC AR5 projections, unless there are large and rapid cuts in global greenhouse gas emissions post 2030, any agreement based on those submissions will not save the world from two degrees of dangerous global warming, and will likely not save the world from three degrees of warming. This leads to a policy problem. Emissions reduction policies will only reduce a small part of the harms of climate change. So even if the more extreme claims of climate catastrophism are true, it might be more beneficial for a nation to avoid emissions reduction policies.


The following analysis makes these assumptions.

  • UNIPCC estimates of the relationship between global average temperature and atmospheric greenhouse gas levels are accurate.
  • UNIPCC estimates of the relationship between greenhouse gas emissions and atmospheric greenhouse gas levels are accurate.
  • Policy commitments will always turn into concrete policy.
  • Climate change policy priorities will not conflict with other priorities.
  • All policy will be effectively implemented in full, implying the requisite technological and project management capacities are available.

The Context

The World’s leaders meeting from 30 November to December 11 in Paris together to thrash out a plan to save the world from a dangerous two degrees of warming. In preparation 146 countries, representing 87% of Global Emissions have submitted plans to the United Nations Framework Convention on Climate Change (UNFCCC). These are available at the submissions website here. There is no-one who has gone through to evaluate whether these submissions are consistent with this objective. I have chosen a small sample of 7 major developing nations and 32 developing nations (EU 28 have a single target) which combined represent about 60% of global emissions and 60% of global population.

The level of global emissions control required to constrain global warming is given by the IPCC in their final version of the 2014 AR5 Synthesis Report page 21 Figure SPM 11(a) and reproduced below.

The dark blue band is the maximum emissions pathway to avoid going beyond 2 degrees of warming, with RCP2.6 denoting the central pathway. The dark orange pathway would produce 2.5-3.0 degrees of warming. According to figure SPM 5(a), annual GHG emissions in 2010 were 49 GtCO2e. They are currently increasing by at least 2% a year, roughly following the solid black line of the RCP8.5 BAU (non-policy) scenario. On that trend emissions will be about 54 GtCO2e in 2015, and the extrapolated projection for 2030 is 70-75 GtCO2e. The minimum requirement for policy is that global emissions should be no higher than they were in 2010, and preferably below that level, to offset the cumulative overshoot that will occur.
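That extrapolation is simple compound growth; a minimal sketch using the 49 GtCO2e 2010 figure and the 2% a year assumption stated above:

    # Extrapolating 2010 global emissions at ~2% a year, as described above.
    emissions_2010 = 49.0                            # GtCO2e, figure SPM 5(a)
    emissions_2015 = emissions_2010 * 1.02 ** 5      # ~54 GtCO2e
    emissions_2030 = emissions_2010 * 1.02 ** 20     # ~73 GtCO2e, within the 70-75 range
    print(round(emissions_2015, 1), round(emissions_2030, 1))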

How does the global policy requirement fit in with the country submissions?

If the IPCC projections are correct, to avoid exceeding 2 degrees of warming there needs to be a global cap on greenhouse gas emissions of around 50 GtCO2e almost immediately, with that level starting to fall in the early 2020s. Alternatively, if global emissions reach 60 GtCO2e without any prospect of major reductions thereafter, then on the model projections three degrees of warming is likely to be exceeded. There is a large gap between these two scenarios, but even with submissions from a limited number of the major countries it is possible to state that the lower limit will be exceeded. This can be done by calculating the emissions increases in the major high-growth developing countries and the proposed emissions reductions in the major developed countries. This is not straightforward, as most country submissions contain no clear figures, so various assumptions need to be made. For developing countries this is particularly difficult, as the estimated business as usual (BAU) emissions are usually not stated and are dependent upon assumptions about economic growth, though sometimes there are clues within the text. For the developed countries the projections are easier to calculate, as they are relative to a date in the past. There is a further issue of which measure of emissions to use. I have used the UNFCCC estimates of GHG emissions in its Country Briefs for 1990, 2000, 2005 and 2010.1 Many of the submissions contain both conditional and unconditional estimates of 2030 emissions. For developing countries the lower estimates are dependent on external funding. For the other countries, emissions reductions are expressed as a range. In every case I have used the lower emissions figure.2

For the developing countries, those with major projected emissions increases are as follows.3

Estimated targeted emissions increases from 2010 to 2030 for major developing countries based on INDC Submissions

[Table: emissions change from 2010 to 2030 by country, with the INDC Submission and UNFCCC Country Brief figures on which it is based.]
The targeted total increase in GHG emissions for these seven countries between 2010 and 2030 is estimated to be in excess of 13 Gt.

According to World Bank Data there were 3300 million people in these seven countries in 2013, or 46% of the global population.

For the developed countries those with the largest quantitative emissions reductions are as follows.4

Estimated targeted emissions change from 2010 to 2030 for major developed countries from INDC Submissions

[Table: emissions change from 2010 to 2030 by country, with the INDC Submission and UNFCCC Country Brief figures on which it is based.]
The targeted total decrease in GHG emissions for these thirty-two countries between 2010 and 2030 is estimated to be 4 Gt.

According to World Bank Data there were 900 million people in these thirty-two countries in 2013, or 13% of the global population.

For every tonne of emissions reduction by the developed countries, there will be at least three tonnes of emissions increases elsewhere. Bigger reductions by these developed countries cannot close the gap, as their total 2010 emissions are just 12.9 Gt. The sample of developing countries does not include a single African country, nor Pakistan, Iran, Venezuela, or numerous other countries, yet the developed-country sample includes all the major developed economies.
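The "three tonnes for every one" ratio follows directly from the two totals above; a minimal sketch:

    # Ratio of targeted increases to targeted decreases, from the totals above.
    developing_increase = 13.0   # GtCO2e, the seven developing countries, 2010-2030
    developed_decrease = 4.0     # GtCO2e, the thirty-two developed countries, 2010-2030
    print(developing_increase / developed_decrease)   # 3.25, i.e. more than 3 to 1
    print(developing_increase - developed_decrease)   # a net increase of ~9 GtCO2e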

Whilst the developing countries may not achieve this increase in emissions by 2030, collectively they will achieve it shortly after that date. Many of the developed countries may not achieve their emissions reductions due to changing priorities. For instance, the EU's targeted reduction may not be achieved due to Germany abandoning nuclear power in favour of coal, and Southern European states reducing renewables subsidies in response to recent economic crises.

The Elephant in the Room

In 2030, even with an agreement based on the INDC submissions signed this December in Paris and then fully implemented without compromise, there is still a problem. If the IPCC models are correct, the only way to stop three degrees of warming being exceeded is through rapid reductions in emissions in those countries where emissions have recently peaked (e.g. South Korea and China), along with steep reductions in countries where emissions are still increasing rapidly (e.g. India and Bangladesh). Unless a technological miracle happens in the next decade this is not going to happen. More likely, global emissions will keep on rising as many slower-growing African and Asian nations post ever larger unit increases in emissions each year.

The Policy Problem

The justification for mitigation policy is most clearly laid out in the British 2006 Stern Review Summary of Conclusions page vi

Using the results from formal economic models, the Review estimates that if we don’t act, the overall costs and risks of climate change will be equivalent to losing at least 5% of global GDP each year, now and forever. If a wider range of risks and impacts is taken into account, the estimates of damage could rise to 20% of GDP or more.

That is, the unknown and random costs of climate change can be exchanged for the lesser and predictable costs of policy. A necessary, but far from sufficient, condition for this to happen is that policy eradicates all the prospective costs of climate change. It could be that if warming is constrained to less than 2 degrees the costs of climate change would be trivial, so the reality could be a close approximation of Stern's viewpoint. But if warming exceeds 3 degrees and the alleged harms are correct, then emissions-reducing policies are likely to lead to net harms for the countries implementing those policies and a small net benefit for those countries without policy.

Kevin Marshall


  1. The exception is for Bangladesh. They are one of the few countries that clearly lays out 2030 estimates in MtCO2, but the 2010 estimate is about 20% lower than the UNFCCC figure. I have just lifted the Bangladeshi figures.
  2. For instance, the USA's target is to reduce its emissions by 26-28% on the 2005 level. I have used the 28% figure. The United States is about the only country not providing target figures for 2030. It would be imprudent to assume any greater reductions, given that it is not certain even this level will be ratified by Congress.
  3. Not all countries outside the rich world are targeting emissions increases. Brazil and Argentina are targeting emissions reductions, whilst Thailand and South Korea appear to be targeting maintaining emissions at around 2010 levels.
  4. Not all developed countries have emissions reduction targets.
  5. South Korea, with 1.3% of 2010 global emissions, could be included in the developed countries, but its target is roughly to maintain emissions at 2010 levels. Switzerland, Norway and Singapore are all committed to emissions reductions, but combined they have less than 0.3 Gt of emissions.

A note on Bias in Australian Temperature Homogenisations

Jo Nova has an interesting and detailed guest post by Bob Fernley-Jones on rural sites in Australia whose records have been heavily homogenised by the Australian BOM.

I did a quick comment that was somewhat lacking in clarity. This post is to clarify my points.

In the post Bob Fernley-Jones stated

The focus of this study has been on rural stations having long records, mainly because the BoM homogenisation process has greatest relevance the older the data is.

Venema et al. 2012 stated (Italics mine)

The most commonly used method to detect and remove the effects of artificial changes is the relative homogenization approach, which assumes that nearby stations are exposed to almost the same climate signal and that thus the differences between nearby stations can be utilized to detect inhomogeneities (Conrad and Pollak, 1950). In relative homogeneity testing, a candidate time series is compared to multiple surrounding stations either in a pairwise fashion or to a single composite reference time series computed for multiple nearby stations.

This assumption that nearby temperature stations are exposed to the same climate signal is standard practice. Victor Venema (who has his own blog) is a leading academic expert on temperature homogenisation. However, there are extreme examples where the assumption does not hold. One example is at the end of the 1960s in much of Paraguay, where average temperatures fell by one degree. As this fall was not replicated in the surrounding area, both the GISTEMP and Berkeley Earth homogenisations eliminated the anomaly, despite using very different homogenisation techniques. My analysis is here.

On a wider scale take a look at the GISTEMP land surface temperature anomaly map for 2014 against 1976-2010. (obtained from here)

Despite being homogenised and smoothed, it is clear that trends differ. Over much of North America there was cooling, bucking the global trend. What this suggests to me is that the greater the distance between weather stations, the greater the likelihood that the climate signals will differ. Most importantly for temperature anomaly calculations, the number of weather stations increased dramatically over the twentieth century. So homogenisation is more likely to end up smoothing out local and sub-regional variations in temperature trends in the early twentieth century than in the later period. This is testable.

Why should this problem occur with expert scientists? Are they super-beings who know the real temperature data but have manufactured some falsehood? I think it is something much more prosaic. Those who work at the Australian BOM believe that the recent warming is human-caused; in fact they believe that more than 100% of the warming is human-caused. When looking at outlier records, or records that show inconsistencies, there is a very human bias. Each time the data is reprocessed they find new inconsistencies, having previously corrected the data.

Kevin Marshall

Islamophobic and Anti-Semitic Hate Crime in London

The BBC has rightly highlighted the 70.7% rise in Islamophobic crime, to 718 instances, in the 12 months to July 2015 compared with the previous 12 months. Any such jump in crime rates should be taken seriously and tackled. To be attacked for one's religion, including being punched and having dog faeces smeared on one's head, is repulsive. However, according to the Metropolitan Police crime figures this is still less than 0.1% of the 720,939 total crimes reported, and still a fraction of the crimes of rape (5,300) and robbery against the person (20,300).

Raheem Kassam of Breitbart has a point when he states that there was a 93.4% rise in Anti-Semitic crimes, to 499, in the same period. He then points out that a Jew in London is several times more likely than a Muslim to be a victim of a religious hate crime. However, he fluffs the figures, as he compares London crime figures with the total numbers of adherents of each religion in the whole of the UK. Yet the Greater London Authority Datastore has the population of each borough, along with the proportion belonging to each religious group, and the Metropolitan Police crime figures are also by borough. From these I have looked at the ten worst boroughs for Islamophobic and Anti-Semitic hate crime, which I have appended below.

In Summary

  • The London borough with the highest number of reported Islamophobic hate crimes was Westminster, with 54 reported in the 12 months ended July 2015, but relative to the number of Muslims living in the borough, Islington had the highest rate, with 3.0 hate crimes per 1,000 Muslims.
  • Overall in London, the 718 reported Islamophobic hate crimes were equivalent to 0.6 per 1,000 Muslims.
  • The London borough with the highest number of reported Anti-Semitic hate crimes was Hackney, with 122 reported in the 12 months ended July 2015, but relative to the number of Jews living in the borough, Tower Hamlets had the highest rate, with 10.6 hate crimes per 1,000 Jews.
  • Overall in London, the 499 reported Anti-Semitic hate crimes were equivalent to 3.2 per 1,000 Jews.
  • A Jew in London is therefore more than five times more likely than a Muslim to be the victim of a religious hate crime. In the London Borough of Tower Hamlets a Jew is over thirty times more likely to be a victim, and even in Islington, proportionately the worst borough for Muslims, a Jew is still more than twice as likely to be a victim as a Muslim. A rough sketch of the rate calculation is given below the list.
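The borough-level GLA Datastore populations are not reproduced in this post, so the population figures in the sketch below are illustrative round numbers chosen only to be consistent with the rounded per-1,000 rates above; they are not the figures used in the analysis.

    # Illustrative per-1,000 rate calculation; the populations are placeholder
    # round numbers, not the GLA Datastore figures used for the analysis above.
    islamophobic_crimes = 718
    antisemitic_crimes = 499
    muslim_population = 1_200_000    # placeholder, London-wide order of magnitude
    jewish_population = 155_000      # placeholder, London-wide order of magnitude
    muslim_rate = 1000 * islamophobic_crimes / muslim_population   # ~0.6 per 1,000
    jewish_rate = 1000 * antisemitic_crimes / jewish_population    # ~3.2 per 1,000
    print(round(muslim_rate, 1), round(jewish_rate, 1), round(jewish_rate / muslim_rate, 1))
    # ~0.6, ~3.2 and ~5.4 - "more than five times more likely"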

As a final note, late yesterday evening there was an extreme Anti-Semitic attack in North Manchester. Four young men were brutally attacked at a Metrolink station. The youngest was, for a period, in a coma, according to The Jewish Chronicle. I join in the prayers for his speedy and full recovery.

Kevin Marshall

Degenerating Climatology 1: IPCC Statements on Human Caused Warming

This is the first in an occasional series illustrating the degeneration of climatology away from an empirical science. In my view, for climatology to be progressing it needs to be making ever clearer empirical statements that support the Catastrophic Anthropogenic Global Warming (CAGW) hypothesis, moving away from bland statements that could just as easily support a weaker form of the hypothesis, or mere random fluctuations. In Figure 1 this progression is illustrated by the red arrow, with increasing depth of colour. The example given below illustrates the opposite tendency.

Obscuring the slowdown in warming in AR5

Every major temperature data set shows that the warming rate this century has been lower than that towards the end of the twentieth century. This is becoming a severe issue for those who believe that the main driver of warming is increasing atmospheric greenhouse gas levels, and it posed a severe problem for the IPCC in trying to find evidence for the theory when they published in late 2013.

In the IPCC Fifth Assessment Report Working Group 1 (The Physical Science Basis) Summary for Policy Makers, the headline summary on the atmosphere is:-

Each of the last three decades has been successively warmer at the Earth’s surface than any preceding decade since 1850. In the Northern Hemisphere, 1983–2012 was likely the warmest 30-year period of the last 1400 years (medium confidence).

There are three parts to this.

  • The last three decades have been successively warmer according to the major surface temperature data sets. The 1980s were warmer than the 1970s; the 1990s warmer than the 1980s; and the 2000s warmer than the 1990s.
  • The 1980s was warmer than any preceding decade from the 1850s.
  • In the collective opinion of the climate experts there is greater than a 66% chance that the 1980s was the warmest decade in 1400 years.

What the statement does not include is the following.

  1. That global average temperature rises have slowed down in the last decade compared with the 1990s. From 2003 in the HADCRUT4 temperature series warming had stopped.
  2. That global average temperature also rose significantly in the mid-nineteenth and early twentieth centuries.
  3. That global average temperature fell in 4 or 5 of the 13 decades from 1880 to 2010.
  4. That in the last 1400 years there was a warm period about 1000 years ago, and a significantly cold period that could have bottomed out around 1820 – that is, a Medieval Warm Period and the Little Ice Age.
  5. That there is strong evidence of a Roman Warm Period about 2000 years ago and a Bronze Age warm period about 3000 years ago.

Points (1) to (3) can be confirmed from Figure 2. Both of the major surface temperature anomaly data sets show warming trends in each of the last three decades, implying successive warming. A similar statement could have been made in 1943, had the data been available.

In so far as the CAGW hypothesis is broadly defined as a non-trivial human-caused rise in temperatures (the narrower, more precise definition being that the temperature change has catastrophic consequences), no empirical support for it is to be found in the actual temperature records or in the longer reconstructions from proxy data.

The headline statement above is amplified by this statement from the press release of 27/09/2013.

It is extremely likely that human influence has been the dominant cause of the observed warming since the mid-20th century. The evidence for this has grown, thanks to more and better observations, an improved understanding of the climate system response and improved climate models.

This statement excludes other types of temperature change, let alone other causes of temperature change. The cooling in the 1960s is not included. The observed temperature change is only the net impact of all influences, known and unknown. Further, the likelihood is based upon expert opinion. If the experts have always given prominence to human influences on warming (as opposed to natural and random influences), then their opinion will be biased. Over time, if this opinion is not objectively adjusted in the light of evidence that does not conform to the theory, the basis of Bayesian statistics is undermined.

Does the above mean that climatology is degenerating away from being a rigorous scientific discipline? So far I have quoted the latest expert statements without comparing them with previous ones. A comparable highlighted statement on human influence, from the Fourth Assessment Report WG1 SPM (page 3), is:

The understanding of anthropogenic warming and cooling influences on climate has improved since the TAR, leading to very high confidence that the global average net effect of human activities since 1750 has been one of warming, with a radiative forcing of +1.6 [+0.6 to +2.4] W m–2

The differences are

  • The greenhouse gas effect is no longer emphasised. It is now the broader “human influence”.
  • The previous statement was prepared to associate the influence with a much longer period. Probably the collapse of the hockey stick studies, with their portrayal of unprecedented warming, has something to do with this.
  • Conversely, the earlier statement was only prepared to say that since 1750 the net effect of human influences has been one of warming. The more recent statement claims that human influence has been the dominant cause of the warming since the mid-20th century.

This leads to my final point indicating the degeneration of climatology away from science. When comparing the WG1 SPMs for TAR, AR4 and AR5 there are shifting statements. In each report the authors have chosen the statements that best fit their case at that point in time. The result is a lack of continuity that might otherwise have demonstrated an increasing correspondence between theory and data.

Kevin Marshall

Can Climatology Ever Be Considered a Science?

Can climatology ever be considered a science? My favourite Richard Feynman quote.

You cannot prove a vague theory wrong. If the guess that you make is poorly expressed and the method you have for computing the consequences is a little vague then ….. you see that the theory is good as it can’t be proved wrong. If the process of computing the consequences is indefinite, then with a little skill any experimental result can be made to look like an expected consequence.

I would maintain that by its nature climatology will always be a vague theory. Climate consists of an infinite number of interrelationships that can only be loosely modelled by empirical generalisations. These can only ever be imperfectly measured, although measurement is improving both in scope and in period of observation. Tweaking the models can always produce a desired outcome. In this sense climatology is never going to be a science in the way that physics and chemistry have become. But this does not mean that climatology cannot become more scientific. A step forward might be to classify empirical statements according to the part of the global warming theory they support, and the empirical content of those statements.

Catastrophic Anthropogenic Global Warming (CAGW) is a subset of AGW. The other elements of AGW are trivial, or positive. I would also include the benign impacts of aerosols in reducing the warming impacts. So AGW’ is not an empty set.

AGW is a subset of GW, where GW is the hypothesis that an increase in greenhouse gas levels will cause temperatures to rise. There could be natural causes of the rise in greenhouse gases as well, so GW’ is not an empty set.

GW is a subset of Climate Change CC. That is all causes of changing climate, both known and unknown, including entirely random causes.

In summary:

CAGW ⊂ AGW ⊂ GW ⊂ CC
Or diagrammatically the sets can be represented by a series of concentric rings.

To become more scientific, climatology as an academic discipline should be moving on two complementary fronts. First, by generating clearer empirical confirmations, as against banal statements or conditional forecasts. Second, by making statements more unambiguously ascribable to the CAGW hypothesis in particular, rather than just as easily ascribable to vague and causeless climate change in general. These twin aims are shown in the diagram below, where the discipline should be aiming in the direction of the red progressing arrow towards science, rather than the green degenerating arrow.

Nullis in verba, on a recent Bishop Hill discussion forum, rightly points out that the statement

“you acknowledge that scientists predicted warming. And warming is what we observed”

commits the fallacy of “confirming the consequent”.

If your definition of climate change is loose enough, the observed rise could be a member of the CC set. But to infer that it is not part of GW' (outside of the GW set) requires more empirical content. As Nullis has shown in his tightly worded comment, to prove this is impossible. But greater empirical content will give more confidence that the scientists did not just strike lucky. Two years ago Roy Spencer attempted just that. Across 73 climate models the prediction was that between 1979 and 2012 average global temperatures would rise by between 0.3 and 1.5°C, with an average estimate of 0.8°C. Most were within the 0.6 to 1.2°C range, so any actual rise in that range – which would be pretty unusual historically – would be a fairly strong confirmation of a significant AGW impact. The actual satellite and weather balloon data showed a rise of about 0.2°C. The scientists got it wrong on the basis of their current models. At a minimum the models are running too hot, and they fail to confirm the CAGW hypothesis.

By more clearly specifying the empirical content of statements the scope of alternative explanations is narrowed. In this case we have an explanation for someone using a more banal statement.

I would contend that to obtain confirmation of CAGW requires a combination of the warming and the adverse consequences. So even if the hurricanes had got worse after Katrina in 2005, with zero warming that on its own is just an observation that the climate has changed. But together they form a more empirically rich story that is explained by the CAGW theory. Still better is a number of catastrophic consequences.

In the next post I shall show some further examples of the discipline moving in the direction of degenerating climatology.

Kevin Marshall

John Cook undermining democracy through misinformation

It seems that John Cook was posting comments in 2011 under the pseudonym Lubos Motl. The year before, the physicist and blogger Luboš Motl had posted a rebuttal of Cook's then 104 Global Warming & Climate Change Myths. When someone counters your beliefs point for point, most people would naturally feel some anger. Taking the online identity of Motl is potentially more than identity theft: it can be viewed as an attempt to damage the reputation of someone you oppose.

However, there is a wider issue here. In 2011 John Cook co-authored, with Stephan Lewandowsky, The Debunking Handbook, which is still featured prominently on the Skeptical Science website. This short tract starts with the following paragraphs:-

It’s self-evident that democratic societies should base their decisions on accurate information. On many issues, however, misinformation can become entrenched in parts of the community, particularly when vested interests are involved. Reducing the influence of misinformation is a difficult and complex challenge.

A common misconception about myths is the notion that removing its influence is as simple as packing more information into people’s heads. This approach assumes that public misperceptions are due to a lack of knowledge and that the solution is more information – in science communication, it’s known as the “information deficit model”. But that model is wrong: people don’t process information as simply as a hard drive downloading data.

If Cook was indeed using the pseudonym Lubos Motl then he was knowingly putting misinformation into the public arena in a malicious form. If he misrepresented Motl's beliefs, then the public may not know whom to trust. Targeted against one effective critic, it could trash that critic's reputation. At a wider scale it could allow morally and scientifically inferior views to gain prominence over superior viewpoints. If the alarmist beliefs were superior it would not be necessary to misrepresent alternative opinions: open debate would soon reveal which side had the better views. And in debating and disputing, all sides would sharpen their arguments. What would quickly disappear is the reliance on opinion surveys and the rewriting of dictionaries. Instead, proper academics would be distinguishing quality, relevant evidence from dogmatic statements based on junk sociology and psychology. They would start defining the boundaries of expertise between the basic physics, computer modelling, results analysis, public policy-making, policy implementation, economics, ethics and the philosophy of science. They might then start to draw on the understanding that has been achieved in these subject areas.

Kevin Marshall

