Britain Stronger in Europe Letter

I received a campaign letter from Britain Stronger in Europe today headed

RE: THE FACTS YOU NEED TO KNOW ABOUT EUROPE AND THE EU REFERENDUM

Putting the “RE:” in front is a bit presumptuous. It is not a reply to my request. However, I believe in looking at both sides of the argument, so here is my analysis. First the main points in the body of the letter:-

  1. JOBS – Over 3 million UK jobs are linked to our exports to the EU.
  2. BUSINESSES – Over 200,000 UK Businesses trade with the EU, helping them create jobs in the UK.
  3. FAMILY FINANCES – Leaving the EU will cost the average UK household at least £850 a year, and potentially as much as £1,700, according to research released by the London School of Economics.
  4. PRICES – Being in Europe means lower prices in UK shops, saving the average UK household over £350 a year. If we left Europe, your weekly shop would cost more.
  5. BENEFITS vs COSTS – For every £1 we put into the EU, we get almost £10 back through increased trade, investment, jobs, growth and lower prices.
  6. INVESTMENT – The UK gets £66 million of investment every day from EU countries – that’s more than we pay to be a member of the EU.

The first two points are facts, but only show part of the picture. The UK not only exports to the EU, but also imports. Indeed there is a net deficit with the EU, and a large deficit in goods. It is only due to a net surplus in services – mostly in financial services based in the City of London – that the trade deficit is not larger. The ONS provides a useful graphic illustrating both the declining share of exports to the EU, and the increasing deficit, reproduced below.

No one in the UK is suggesting that Brexit would mean an end to trade with the EU, and it would be counter-productive for the EU not to reach a trade agreement with an independent UK when the EU has this large surplus.

The impact on FAMILY FINANCES is based upon work by the Centre for Economic Performance, an LSE-affiliated organisation. There is both a general paper and a technical paper to back up the claims. These are modelled estimates of the future, not facts. The modelled costs assume Britain exits the European Union without any trade agreements, despite such agreements being in the economic interests of both the UK and the EU. The report also performs a sleight of hand in estimating the contributions the UK would make post-Brexit. From page 18 of the technical paper:-

We assume that the UK would keep contributing 83% of the current per capita contribution as Norway does in order to remain in the single market (House of Commons, 2013). This leads to a fiscal saving of about 0.09%.

The table at the foot of report page 22 (pdf page 28) gives the breakdown of the estimate from 2011 figures. The Norway figures are gross and have a fixed-cost element. The UK economy is about six times that of Norway, so even on the same basis the UK would not end up spending nearly as much per capita. The UK figures are also net figures. The UK pays into the EU twice as much as it gets out. Ever since joining the Common Market in 1973, Britain has been the biggest loser in terms of net contributions, despite the rebates that Mrs Thatcher secured with much effort in the 1980s.

The source of the PRICES information is again the Centre for Economic Performance, but again with no direct reference. I assume it is from the same report, and forms part of the modelled forecast costs.

The BENEFITS vs COSTS statement is not comparing like with like. The alleged benefits to the UK are not all due to being a member of a club, but are a consequence of being an open economy trading with its neighbours. A true BENEFITS vs COSTS comparison would set future scenarios of Brexit against future scenarios of Remain. Leading economist Patrick Minford has published a paper for the Institute of Economic Affairs, which finds a net benefit in leaving, particularly when likely future economic growth is taken into account.

The INVESTMENT issue is just part of the BENEFITS vs COSTS statement. So, as with the PRICES statement, it is making one point into two.

In summary, Britain Stronger in Europe claims I need to know six facts relevant to the referendum decision, but actually fails to provide a single one. The actual facts are not solely due to the UK being a member of the European Union, whilst the relevant statements are opinions on modelled future scenarios that are unlikely to happen. The choice is between various possible future scenarios in the European Union and various possible future scenarios outside it. The case for Remain should be proclaiming the achievements of the European Union in making a positive difference to the lives of the 500 million people in its 28 States, along with future pathways where it will build on these achievements. The utter lack of these arguments, in my opinion, is the strongest argument for voting to leave.

Kevin Marshall

 

Copy of letter from Britain Stronger in Europe

Climate Interactive’s Bogus INDC Forecast

Summary

Joe Romm wrote a post in early November claiming UNFCCC Executive Secretary Christiana Figueres had misled the public in claiming that the “INDCs have the capability of limiting the forecast temperature rise to around 2.7 degrees Celsius by 2100”. Using Climate Interactive’s figures Romm claims the correct figure is 3.5°C. That Romm had one of the two sources of the 2.7°C staring at him is a side issue. The major question is how Climate Interactive can achieve a full 1.0°C reduction in expected temperature rise in 2100, and a reduction of 40% in 2100 GHG emissions, from pledges covering only the period 2015-2030, when the UNFCCC estimates those pledges will have a much smaller impact in 2030. Looking at the CO2 emissions, which account for 75-80% of GHG emissions, I have found most of the answer. For the OECD countries, where emissions per capita have been stable or falling for decades, the “No Action” scenario forecasts that they will rise for decades. For Russia and China, where per capita emissions are likely to peak before 2030 without any policy action, the “No Action” scenario also forecasts that they will rise for decades. This is largely offset by Climate Interactive assuming that both emissions and economic growth in India and Africa (where there are no real attempts to control emissions) will stagnate in the coming decades. Just making more reasonable CO2 emissions forecasts for the OECD, Russia and China accounts for half of the claimed 2100 reduction in GHG emissions from the INDCs. Climate Interactive’s “No Action” scenario is bogus.

 

Joe Romm’s use of the Climate Interactive projection

A couple of months ago, prior to the COP21 Paris climate talks, Joe Romm at Climate Progress criticized the claim made in a press release by UNFCCC Executive Secretary Christiana Figueres:

The INDCs have the capability of limiting the forecast temperature rise to around 2.7 degrees Celsius by 2100, by no means enough but a lot lower than the estimated four, five, or more degrees of warming projected by many prior to the INDCs

Romm’s note to the media is

If countries go no further than their current global climate pledges, the earth will warm a total of 3.5°C by 2100.

At a basic level Romm should have done some proper research. As I found out, there are two sources of the claim, tucked away at the end of a technical appendix to the UNFCCC Synthesis report on the aggregate effect of INDCs. One of these is Climate Action Tracker. On their home page they have a little thermometer which shows the 2.7°C figure. Romm would have seen this, as he refers in the text to CAT’s page on China, but the significance may not have registered.

However, the purpose of this post is not to criticize Romm, but to examine the Climate Interactive analysis on which Romm bases his claim. Is the Climate Interactive graph (reproduced in Figure 1) a reasonable estimate of the impact of the INDC submissions (policy pledges) on global emissions?1

Figure 1. Climate Interactive’s graph of impact of the INDC submissions to 2100

What struck me as odd when I first saw this graph was how the INDCs could make such a large impact beyond the 2015-2030 timeframe that they cover, when the overall impact within that timeframe is fairly marginal. This initial impression is confirmed by the UNFCCC’s own estimate of the INDCs.

Figure 2. UNFCCC’s estimate of emissions impact of the INDCs, with the impact shown by the yellow bars. Original here.

There are two things that do not stack up. First, the “No Action” scenario appears to be a fairly reasonable extrapolation of future emissions without policy. Second, and contrary to the first, the “Current INDCs” scenario does not make sense in terms of what I have read in the INDCs themselves. To resolve this requires looking at the make-up of the “No Action” scenario. Climate Interactive usefully provide the model for others to do their own estimates,2 with the “User Reference Scenario” giving the “No Action” data3, split by type of greenhouse gas and into twenty regions or countries. As about 75-80% of emissions within the model are CO2 fossil fuel emissions, I will just look at this area. For simplicity I have also reduced the twenty regions or countries to just seven: USA, Other OECD, Russia, China, India, Africa and Rest of the World. There are many ways to look at the data, but some give a better understanding than others. Climate Interactive also have population estimates. Population changes over a long period can themselves result in changed emissions, so looking at emissions per capita gives a better sense of the reasonableness of the forecast. I have graphed the areas in Figure 3 for the historical period 1970-2010 and the forecast period 2020-2100; a minimal sketch of the per-capita transformation follows the figure.

Figure 3 : Fossil Fuel Emissions per capita for six regions from the Climate Interactive “No Action” Scenario.
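To make the transformation concrete, here is a minimal sketch. The totals and populations are illustrative round numbers of my own (roughly 2010-era values), not the Climate Interactive data; the point is simply dividing regional emissions by regional population.

```python
# Minimal sketch of the per-capita transformation behind Figure 3.
# The totals and populations are illustrative round numbers, NOT the
# Climate Interactive data.
regions = {
    # region: (CO2 emissions in GtCO2, population in billions)
    "USA":   (5.4, 0.31),
    "China": (8.0, 1.34),
    "India": (1.7, 1.22),
}

for name, (emissions_gt, population_bn) in regions.items():
    # gigatonnes divided by billions of people = tonnes per person
    per_capita_t = emissions_gt / population_bn
    print(f"{name}: {per_capita_t:.1f} tonnes CO2 per person")
```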

Understanding the CO2 emissions forecasts

In the USA, emissions per capita peaked at the time of the 1973 oil embargo. Since then they have declined slightly. There are a number of reasons for this.

First, higher oil prices gave economic incentives to use oil more efficiently. In cars there have been massive improvements in fuel efficiency since that time, and industry has also used energy more efficiently. Second, there has been a growth in the use of nuclear power, for strategic reasons more than economic ones. Third, some of the most energy-intensive industries have shifted to other countries, particularly steel and chemicals. Fourth, growth in developed countries is mostly in the service sector, whereas growth in developing countries is mostly in manufacturing, and manufacturing tends to have much higher energy usage per unit of output than services. Fifth, domestic energy usage is dominated by cars and power for the home. In an emerging economy energy usage rises rapidly as a larger proportion of the population acquire cars and full heating and lighting systems in the home; growth is much slower once most households have these luxuries. Sixth, in the near future emissions might continue to fall with the development of shale gas, with its lower emissions per unit of power than coal.

I therefore cannot understand why anyone would forecast increasing emissions per capita in the near future, when they have been stable or falling for decades. Will everyone start to switch to less efficient cars? When these forecasts were made oil was at around $100 a barrel, and many thought peak oil was upon us. Would private sector companies abandon more efficient energy usage for less efficient and higher-cost usage? The USA may abandon nuclear power and shift back to coal for political reasons. But in all forms of energy, production and distribution are likely to continue becoming more efficient.

In the rest of the OECD, there are similar patterns. In Europe energy usage was never as high. In some countries CO2 emissions may rise slightly without policy; in Germany, for instance, nuclear power stations are being replaced with coal. But market incentives will increase energy efficiency and manufacturing will continue to shift to emerging nations. Again, there appears to be no reason for emissions per capita to increase steadily in the future.

Russia has a slightly different recent past. Communist central planning was highly inefficient and led to hugely inefficient energy usage. With the collapse of communism, energy usage fell dramatically. Since then emissions have been increasing, but more slowly than the economy as a whole. Emissions will likely peak again in a couple of decades. This will probably be at a lower level than in the USA in 1970, despite the harsher climate, as Russia will benefit from technological advances made in the intervening period. There is no reason for emissions to go on increasing at such a rapid rate.4

China has recently had phenomenal growth rates. According to UN data, from 1990 to 2012 economic growth averaged 10.3% per annum and CO2 emissions growth 6.1%. In the not too distant future economic growth will slow as per capita income approaches rich-country levels, and emissions growth will slow or peak. But the Climate Interactive forecast has total emissions only peaking in 2090. The reason for China’s and Russia’s forecast per capita emissions exceeding those of the USA is likely a failure to allow for population changes: in the USA population is forecast to grow, whilst in China and Russia population is forecast to fall.
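As a quick check of the compound growth arithmetic (a sketch of my own, using only the two annual rates stated above):

```python
# Check of the compound growth arithmetic for China, 1990-2012,
# using only the two annual rates quoted in the text.
years = 2012 - 1990                  # 22 years

gdp_multiple = 1.103 ** years        # 10.3% p.a. -> ~8.6x over the period
co2_multiple = 1.061 ** years        # 6.1% p.a.  -> ~3.7x over the period

# Implication: CO2 per unit of GDP fell by roughly 57% over the period.
intensity_change = co2_multiple / gdp_multiple - 1

print(f"GDP multiple 1990-2012: {gdp_multiple:.1f}x")
print(f"CO2 multiple 1990-2012: {co2_multiple:.1f}x")
print(f"Change in CO2 per unit of GDP: {intensity_change:+.0%}")
```

Emissions growing far more slowly than GDP is consistent with the point made above: emissions intensity falls as an economy matures, so a forecast of per capita emissions rising until 2090 needs some justification.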

India presents the opposite picture. In recent years economic growth and CO2 emissions growth have taken off, and the current policies of Prime Minister Narendra Modi are to accelerate these growth rates. But in the Climate Interactive forecast, economic growth and CO2 emissions growth plummet in the near future. The economic growth forecast is already wrong. I am writing on 30/12/15; to meet the growth forecast for 2010-2015, India’s GDP would need to drop by 20% in the next 24 hours.5

For the continent of Africa, there have been encouraging growth signs in the last few years, after decades in which many countries saw stagnant or even shrinking economies. Climate Interactive forecasts similar growth to India, but with a forecast of rapid population growth, so the emissions per capita will hardly move.

Revised CO2 emissions forecasts

It is extremely difficult and time-consuming to make more accurate CO2 emissions forecasts. As a shortcut, I will look at the impact of revisions on 2100 emissions, then at the effect on the impact of the INDCs. This is laid out in Figure 4.

Figure 4 : Revised Forecast CO2 Emissions from Fossil Fuels

The first three columns, in pale lilac, are CO2 emissions per capita calculated from the Climate Interactive data. The 2100 Revised column contains more realistic estimates, for the reasons discussed in the text. In the orange part of the table are the total forecast 2100 Climate Interactive figures for population and CO2 emissions from fossil fuels. In darker orange are the revised emissions forecast (emissions per capita multiplied by forecast population) and the impact of the revision. Overall the revised forecast is 10.2 GtCO2e lower, as no calculation has been made for the Rest of the World (ROW). To balance back to the Climate Interactive total would require emissions of 11.89 tonnes per capita from 2.9 billion people. As ROW includes such countries as Indonesia, Bangladesh, Iran, Vietnam, Brazil and Argentina, this figure seems unreasonable 85 years from now.
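The “balance back” arithmetic can be sketched as follows, using only the three numbers stated above; the back-derived ROW baseline is my own inference, not a figure from the post.

```python
# "Balance back" arithmetic for the Rest of the World (ROW) in Figure 4,
# using the three numbers stated in the text.
reduction_gt = 10.2            # revised forecast is 10.2 GtCO2e lower overall
row_population_bn = 2.9        # ROW population in 2100, billions
balance_per_capita_t = 11.89   # tonnes per person needed to balance back

row_balance_total_gt = balance_per_capita_t * row_population_bn   # ~34.5 Gt

# Back-derived ROW baseline in the CI forecast (my inference, not in the post):
ci_row_total_gt = row_balance_total_gt - reduction_gt             # ~24.3 Gt
ci_row_per_capita_t = ci_row_total_gt / row_population_bn         # ~8.4 t

print(f"ROW total needed to balance back: {row_balance_total_gt:.1f} Gt")
print(f"Implied CI ROW baseline: {ci_row_total_gt:.1f} Gt "
      f"({ci_row_per_capita_t:.1f} t per person)")
```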

The revised impact on the INDC submissions

The INDC submissions can be broken down.

The USA, EU, Japan and Australia all have varying levels of cuts to total emissions. So for the OECD as a whole I estimate that Climate Interactive overestimates the impact of the INDCs by 8.4 GtCO2e.

The Russian INDC pledge is unclear, but it seems to be saying that emissions will peak before 2030 at below 1990 levels6. As my revised forecast is above this level, I estimate that Climate Interactive overestimates the impact of the INDCs by 3.2 GtCO2e.

The Chinese INDC pledges that its emissions will have peaked by 2030. This will have happened anyway, at around 10-12 tonnes per capita. I have therefore assumed that emissions will stay constant from 2030 to 2100 whilst the population falls. On that basis I estimate that Climate Interactive overestimates the impact of the INDCs by 19.5 GtCO2e.

Overall, for these areas the overestimation is around 31 GtCO2e. Instead of the 63.5 GtCO2e forecast for these countries in 2100, it should be nearer 32.5 GtCO2e. This is about half the total 2100 reduction that Climate Interactive claims the INDC submissions will make across all types of greenhouse gases. A more rigorous forecast might have lower per capita emissions in the OECD and China, and there may be other countries where similar forecasting issues with CO2 emissions apply. In addition, in note 7 I briefly look at the “No Action” CH4 emissions, the second largest greenhouse gas; there appear to be similar forecast issues there.
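A quick check of these totals (my own sketch, using the figures stated above):

```python
# Check of the overestimate totals stated in the text, GtCO2e in 2100.
overestimates = {"OECD": 8.4, "Russia": 3.2, "China": 19.5}

total_over = sum(overestimates.values())   # 31.1, quoted as "around 31"
ci_forecast = 63.5                         # CI 2100 forecast for these areas
revised = ci_forecast - total_over         # 32.4, quoted as "nearer 32.5"

print(f"Total overestimate: {total_over:.1f} GtCO2e")
print(f"Revised 2100 forecast for these areas: {revised:.1f} GtCO2e")
```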

In summary, the make-up of the CO2 emissions “No Action” forecast is bogus. It deviates from an objective and professional forecast in a way that grossly exaggerates the impact of any actions to control GHG emissions, even of pledges that constitute nothing more than saying what would happen anyway.

Notes

  1. The conversion of a given quantity of emissions into average surface temperature changes is outside the scope of this article. Also we will assume that all policy pledges will be fully implemented.
  2. On the Home page use the menu for Tools/C-ROADS. Then on the right hand side select “Download C-ROADS”. Install the software. Run the software. Click on “Create New Run” in the centre of the screen.


    This will generate a spreadsheet “User Scenario v3 026.xls”. The figures I use are in the User Reference Scenario tab. The software version I am working from is v4.026v.071.

  3. The “User Reference Scenario” is claimed to be RCP 8.5. I may post at another time on my reconciliation between the original and the Climate Interactive versions.
  4. The forecast estimates for economic growth and emissions for Russia look quite bizarre when the 5 year percentage changes are graphed.


I cannot see any reason for growth rates to fall to 1% p.a. in the long term, but this is the situation with most other areas as well. Nor can I think of a reason for emissions growth rates to increase from 2030 to 2055, or after 2075, except as a contrivance for consistency purposes.

  5. The forecast estimates for economic growth and emissions for India look even more bizarre than for Russia when the 5 year percentage changes are graphed.


I am writing on 30/12/15. To meet the growth forecast for 2010-2015, India’s GDP would need to drop by 20% in the next 24 hours. From 2015 to 2030, the period of the INDC submissions, CO2 emissions are forecast to grow by 8.4%, whereas India’s INDC submission implies GHG emissions growth from 2014 to 2030 of 90% to 100%. Beyond that, India’s growth is forecast to stagnate to EU growth rates, despite it being a lower-to-middle income country. Also, quite contrary to Russia, emissions growth rates are always lower than economic growth rates.

  6. The Russian Federation INDC states

    Limiting anthropogenic greenhouse gases in Russia to 70-75% of 1990 levels by the year 2030 might be a long-term indicator, subject to the maximum possible account of absorbing capacity of forests.

This appears ambiguous, but could be taken as meaning a long-term maximum.

  7. CH4 (Methane) emissions per Capita

I have quickly done a similar analysis of methane emissions per capita, as in Figure 3 for CO2 emissions. The scale this time is in kilos, not tonnes.

    There are similarities

  • OECD emissions had been falling but are forecast to rise. The rise is not as great as for CO2.
  • In Russia and China emissions are forecast to rise. In Russia this is by a greater amount than for CO2, in China by a lesser amount.
  • In Africa, per capita emissions are forecast to fall slightly. Between 2010 and 2100, CH4 emissions are forecast to rise 3.1 times and population 4.3 times.
  • In both the USA and Other OECD (a composite of CI’s categories) total CH4 emissions are forecast to be 2.778 times higher in 2100 than in 2010. In both China and India total CH4 emissions are forecast to be 2.420 times higher in 2100 than in 2010. A sketch converting these multiples into implied annual growth rates follows this list.
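A small sketch (mine, not Climate Interactive’s) converting those 2010-2100 multiples into implied compound annual growth rates:

```python
# Convert the 2010-2100 CH4 multiples quoted above into implied
# compound annual growth rates over the 90-year period.
def annual_rate(multiple: float, years: int = 90) -> float:
    """Compound annual growth rate implied by a total multiple."""
    return multiple ** (1 / years) - 1

for label, multiple in [("USA / Other OECD", 2.778),
                        ("China / India", 2.420)]:
    print(f"{label}: {multiple}x over 90 years "
          f"= {annual_rate(multiple):.2%} per annum")
```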



Shotton Open Cast Coal Mine Protest as an example of Environmental Totalitarianism

Yesterday, in The Greens and the Fascists, Bishop Hill commented on Jonah Goldberg’s book Liberal Fascism. In summing up, BH stated:-

Goldberg is keen to point out that the liberal and progressive left of today do not share the violent tendencies of their fascist forebears: theirs is a gentler totalitarianism (again in the original sense of the word). The same case can be made for the greens. At least for now; it is hard to avoid observing that their rhetoric is becoming steadily more violent and the calls for unmistakably fascist policy measures are ever more common.

The link is to an article in the Ecologist (reprinted from the Open Democracy blog) – “Coal protesters must be Matt Ridley’s guilty consience”.

The coal profits that fill Matt Ridley’s bank account come wet with the blood of those killed and displaced by the climate disaster his mines contribute to, writes T. If hgis consicence is no longer functioning, then others must step into that role to confront him with the evil that he is doing. (Spelling as in the original)

The protest consisted of blocking the road for eight hours to Shotton open cast coal mine. The reasoning was

This was an effective piece of direct action against a mine that is a major contributor to climate disaster, and a powerful statement against the climate-denying Times columnist, Viscount Matt Ridley, that owns the site. In his honour, we carried out the action as ‘Matt Ridley’s Conscience’.

The mine produces about one million tonnes of coal a year, out of about 8,000 million tonnes produced globally. The blocking may have reduced annual output by 0.3%, and this will be made up from the mine, or from other sources. Coal is not the only source of greenhouse gas emissions, so the coal produced results in less than 0.004% of global greenhouse gas emissions. Further, the alleged impact of GHG emissions on the climate is cumulative. The recoverable coal at Shotton is estimated at 6 million tonnes, or 0.0007% of the estimated global reserves of 861 billion tonnes (Page 5). These global reserves could increase as new deposits are found, as has happened in the recent past for coal, gas and oil. So far from being “a major contributor to climate disaster”, Shotton Open Cast Coal Mine is a drop in the ocean. A rough check of these proportions is sketched below.
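A back-of-envelope check (my own sketch, using only the figures quoted above):

```python
# Back-of-envelope check of the Shotton proportions quoted in the text.
annual_output_mt = 1.0          # ~1 million tonnes of coal a year
global_output_mt = 8_000.0      # ~8,000 million tonnes produced globally

recoverable_mt = 6.0            # recoverable coal at Shotton, Mt
global_reserves_mt = 861_000.0  # 861 billion tonnes, expressed in Mt

print(f"Share of annual global coal output: "
      f"{annual_output_mt / global_output_mt:.4%}")    # 0.0125%
print(f"Share of global coal reserves: "
      f"{recoverable_mt / global_reserves_mt:.5%}")    # ~0.00070%
```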

But is there a climate disaster of which Matt Ridley is in denial? Anonymous author and convicted criminal T does not offer any evidence of current climate disasters. This is not about modelled projections, but about currently available evidence. So where are all the dead bodies, or the displaced persons? Where are the increased deaths through drought-caused famines? Where are the increased deaths from malaria or other diseases from warmer and worsening conditions? Where is the evidence of increased deaths from extreme weather, such as hurricanes? Where are the refugees from drought-stricken areas, or from low-lying areas now submerged beneath the waves?

The inability to evaluate the evidence is shown by the following comment.

Ridley was ( … again) offered a platform on BBC Radio 4 just a week before our hearing, despite his views being roundly debunked by climate scientists.

The link leads to a script of the Radio 4 interview with annotated comments. I am not sure that all the collective brains do debunk (that is, expose the falseness or hollowness of) Matt Ridley’s comments. Mostly it is based on nit-picking, or on pointing out contradictions with their own views and values. Among the 75 comments there are two extreme examples I would like to highlight.

First, Matt Ridley mentioned the Hockey Stick graphs and the work of Steve McIntyre in exposing the underlying poor data. The lack of a medieval warm period would provide circumstantial (or indirect) evidence that the warming of the last 200 years is unprecedented. Gavin Schmidt responded with comments (5) and (6), shown below.

Schmidt is fully aware that Steve McIntyre also examined the Wahl and Amman paper and thoroughly discredited it. In 2008 Andrew Montford wrote a long paper on the shenanigans that went into the publication of the paper, and its lack of statistical significance. Following on from this, Montford wrote The Hockey Stick Illusion in 2010, which was reviewed by Tamino of RealClimate. Steve McIntyre was able to refute the core arguments in Tamino’s polemic by reposting Tamino and the Magic Flute, which was written in 2008 and covered all the substantial arguments that Tamino made. Montford’s book further shows a number of instances where peer review in academic climatology journals is not a quality control mechanism, but more a device of discrimination between those that support the current research paradigm and those that would undermine that consensus.

Comment 6 concludes

The best updates since then – which include both methodology improvements and expanded data sources – do not show anything dramatically different to the basic picture shown in MBH.

The link is to Chapter 5 of the IPCC AR5 WG1 assessment report. The paleoclimate discussion is a small subsection, a distinct reversal from the prominent place given to the original hockey stick in the third assessment report of 2001. I would contend the picture is dramatically different. Compare the original hockey stick of the past 1,000 years with Figure 5.7 on page 409 of AR5 WG1 Chapter 5.

In 2001, the MBH reconstruction was clear. From 1900 to 2000 average temperatures in the Northern Hemisphere rose by over 1°C, far more than the change in any other century. But at least two of the AR5 reconstructions – Ma08eivl and Lj10cps – show similarly sized fluctuations in other periods. The evidence now seems to back up Matt Ridley’s position of some human influence on temperatures, but does not support the contention of unprecedented temperature change. Gavin Schmidt’s opinions are not those of an expert witness, but of a blinkered activist.

Schmidt’s comments on the hockey stick graphs are nothing compared to comment 35.

The Carbon Brief (not the climate scientists) rejects evidence that contradicts their views, based on nothing more than ideological prejudice. A search for Indur Goklany will find his own website, where he has copies of his papers. Under the “Climate Change” tab is not only the 2009 paper, but a 2011 update – Wealth and Safety: The Amazing Decline in Deaths from Extreme Weather in an Era of Global Warming, 1900–2010. Of interest are two tables.

Table 2 is a reproduction of World Health Organisation data from 2002. It clearly shows that global warming is well down the list of causes of deaths. Goklany states in the article why these figures are based on dubious assumptions. Anonymous T falsely believes that global warming is currently a major cause of death.

Figure 6 for the period 1990-2010 shows

  • the Global Death and Death Rates per million Due to Extreme Weather Events
  • CO2 Emissions
  • Global average GDP Per Capita

Figure 6 provides strong empirical evidence that increasing CO2 emissions (about 70-80% of total GHG emissions) have not caused increased deaths. They are a consequence of increasing GDP per capita, which, as Goklany argues, has resulted in fewer deaths from extreme weather. More importantly, increasing GDP has resulted in increased life expectancy and reductions in malnutrition and in deaths that can be averted by access to rudimentary health care. Anonymous T would not know this even if he had read all the comments, yet it completely undermines the beliefs that caused him to single out Matt Ridley.

The worst part of Anonymous T’s article

Anonymous T concludes the article as follows (Bold mine)

The legal process efficiently served its function of bureaucratising our struggle, making us attempt to justify our actions in terms of the state’s narrow, violent logic. The ethics of our action are so clear, and declaring myself guilty felt like folding to that.

We found ourselves depressed and demoralised, swamped in legal paperwork. Pleading guilty frees us from the stress of a court case, allowing us to focus on more effective arenas of struggle.

I faced this case from a position of relative privilege – with the sort of appearance, education and lawyers that the courts favour. Even then I found it crushing. Today my thoughts are with those who experience the racism, classism and ableism of the state and its laws in a way that I did not.

That reflection makes me even more convinced of the rightness of our actions. Climate violence strikes along imperialist lines, with those least responsible, those already most disadvantaged by colonial capitalism, feeling the worst impacts.

Those are the people that lead our struggle, but are often also the most vulnerable to repression in the struggle. When fighting alongside those who find themselves at many more intersections of the law’s oppression than I do, I have a responsibility to volunteer first when we need to face up to the police and the state.

Faced with structural injustice and laws that defend it, Matt Ridley’s Conscience had no choice but to disobey. Matt Ridley has no conscience and neither does the state nor its system of laws. Join in. Be the Conscience you want to see in the world.

The writer rejects the rule of law, and is determined to carry out more acts of defiance against it. He intends to commit more acts of violence, with “climate” as a cover for revolutionary Marxism. Further, the writer is trying to incite others to follow his lead. He claims to know Matt Ridley’s Conscience better than Ridley himself, but in the next sentence claims that “Matt Ridley has no conscience“. Further, this statement would seem to contradict the justification for the criminal acts allegedly made in Bedlington Magistrates Court on December 16th – that the protesters were frustrated by the lack of UK Government action to combat climate change.

It is not clear who is the author of this article, but he/she is one of the following:-

Roger Geffen, 49, of Southwark Bridge Road, London;

Ellen Gibson, 21, of Elm Grove, London;

Philip MacDonald, 28, of Blackstock Road, Finsbury Park, London;

Beth Louise Parkin, 29, of Dodgson House, Bidborough Street, London;

Pekka Piirainen, 23, of Elm Grove, London;

Thomas Youngman, 22, of Hermitage Road, London;

Laurence Watson, 27, of Blackstock Road, Finsbury Park, London;

Guy Shrubsole, 30, of Bavent Road, London;

Lewis McNeill, 34, of no fixed address.

Kevin Marshall

aTTP falsely attacks Bjorn Lomborg’s “Impact of Current Climate Proposals” Paper

The following is a comment to be posted at Bishop Hill, responding to another attempt by blogger …andThenThere’sPhysics to undermine the work of Bjorn Lomborg. The previous attempt was discussed here. This post includes a number of links, as well as a couple of illustrative screen captures at the foot of the post.

aTTP’s comment is

In fact, you should read Joe Romm’s post about this. He’s showing that the INDCs are likely to lead to around 3.5C which I think is relative to something like the 1860-1880 mean. This is very similar to the MIT’s 3.7, and quite a bit lower than the RCP8.5 of around 4.5C. So, yes, we all know that the INDCs are not going to do as much as some might like, but the impact is likely to be a good deal greater than that implied by Lomborg who has essentially assumed that we get to 2030 and then simply give up.

Nov 11, 2015 at 9:31 AM | …and Then There’s Physics

My Comment

aTTP at 9.31 refers to Joe Romm’s blog post of Nov 3, “Misleading U.N. Report Confuses Media On Paris Climate Talks“. Romm uses Climate Interactive’s Climate Scoreboard tool to show that the INDC submissions (if fully implemented) will result in 3.5°C of warming, as against 4.5°C in the non-policy “No Action” scenario. This is six times the claimed maximum impact of 0.17°C in Lomborg’s new paper. Who is right? What struck me first was that Romm’s first graph, copied straight from Climate Interactive, seems to have a very large estimate for emissions in the “No Action” scenario. Downloading the underlying data, I find the “No Action” global emissions in 2100 are 139.3 GtCO2e, compared with about 110 GtCO2e in Figure SPM5(a) of the AR5 Synthesis Report for the RCP8.5 high emissions scenario. But it is the breakdown per country or region that matters.

For the USA, without action emissions are forecast to rise by 40% from 2010 to 2030, in contrast to a rise of just 9% in the period 1990 to 2010. It is more likely that emissions will fall without policy and will be no higher in 2100 than in 2010. The “No Action” scenario therefore overestimates emissions by 2-3 GtCO2e in 2030 and about 7-8 GtCO2e in 2100.

For China the overestimation is even greater. Emissions will peak during the next decade as China fully industrializes, just as emissions peaked in most European countries in the 1970s and 1980s. Climate Interactive assumes that emissions will peak at 43 GtCO2e in 2090, whereas other estimates suggest the peak will be around 16-17 GtCO2e before 2030.

Together, the overestimations in the US and China “No Action” scenarios account for over half of the 55-60 GtCO2e difference in 2100 emissions between the “No Action” and “Current INDC” scenarios; a rough check is sketched below. A very old IT term applies here – GIGO (garbage in, garbage out). If aTTP had actually checked the underlying assumptions he would realise that Romm’s rebuttal of Lomborg based on China’s emission assumptions (and repeated on his own blog) is as false as claiming that the availability of free condoms is why population peaks.
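A rough check of that “over half” claim, taking midpoints of the quoted ranges (the midpoints are my assumption, not figures from the comment):

```python
# Rough check of the "over half" claim from the figures quoted above.
us_over = 7.5                    # midpoint of "about 7-8 GtCO2e in 2100"
china_over = 43.0 - 16.5         # CI peak of 43 less midpoint of "16-17"

total_over = us_over + china_over          # ~34 GtCO2e
gap = (55 + 60) / 2                        # midpoint of "55-60 GtCO2e"

print(f"US + China overestimate: ~{total_over:.0f} GtCO2e")
print(f"Share of the scenario difference: {total_over / gap:.0%}")  # ~59%
```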

Links posted at https://manicbeancounter.com/2015/11/11/attp-falsely-attacks-bjorn-lomborgs-impact-of-current-climate-proposals-paper/

Kevin Marshall

 

Figures referred to (but not referenced) in the comment above

Figure 1: Climate Interactive’s graph, referenced by Joe Romm.


Figure 2: Reproduction of Figure SPM5(a) from Page 9 of the AR5 Synthesis Report.

 

Update – posted the following to ATTP’s blog



 

John Cook undermining democracy through misinformation

It seems that John Cook was posting comments in 2011 under the pseudonym Lubos Motl. The year before, physicist and blogger Luboš Motl had posted a rebuttal of Cook’s then-104 Global Warming & Climate Change Myths. When someone counters your beliefs point for point, most people would naturally feel some anger. Taking the online identity of Motl is potentially more than identity theft; it can be viewed as an attempt to damage the reputation of someone you oppose.

However, there is a wider issue here. In 2011 John Cook co-authored The Debunking Handbook with Stephan Lewandowsky, which is still featured prominently on skepticalscience.com. This short tract starts with the following paragraphs:-

It’s self-evident that democratic societies should base their decisions on accurate information. On many issues, however, misinformation can become entrenched in parts of the community, particularly when vested interests are involved. Reducing the influence of misinformation is a difficult and complex challenge.

A common misconception about myths is the notion that removing its influence is as simple as packing more information into people’s heads. This approach assumes that public misperceptions are due to a lack of knowledge and that the solution is more information – in science communication, it’s known as the “information deficit model”. But that model is wrong: people don’t process information as simply as a hard drive downloading data.

If Cook was indeed using the pseudonym Lubos Motl then he was knowingly putting misinformation into the public arena in a malicious form. If he misrepresented Motl’s beliefs, then the public may not know who to trust. Targeted against one effective critic, it could trash their reputation. At a wider scale it could allow morally and scientifically inferior views to gain prominence over superior viewpoints. If the alarmist beliefs were superior it would not be necessary to misrepresent alternative opinions; open debate would soon reveal which side had the better views. But in debating and disputing, all sides would sharpen their arguments. What would quickly disappear is the reliance on opinion surveys and the rewriting of dictionaries. Instead, proper academics would be distinguishing quality, relevant evidence from dogmatic statements based on junk sociology and psychology. They would start defining the boundaries of expertise between the basic physics, computer modelling, results analysis, public policy-making, policy implementation, economics, ethics and the philosophy of science. They might then start to draw on the understanding that has been achieved in these subject areas.

Kevin Marshall

Ivanpah Solar Project Still Failing to Achieve Potential

Paul Homewood yesterday referred to a Marketwatch report titled “High-tech solar projects fail to deliver.” This was reposted at Tallbloke.

Marketwatch looks at the Ivanpah solar project. They comment

The $2.2 billion Ivanpah solar power project in California’s Mojave Desert is supposed to be generating more than a million megawatt-hours of electricity each year. But 15 months after starting up, the plant is producing just 40% of that, according to data from the U.S. Energy Department.

I looked at the Ivanpah solar project last fall, when the investors applied for a $539 million federal grant to help pay off a $1.5 billion federal loan. One of the largest investors was Google, which at the end of 2013 had Cash, Cash Equivalents & Marketable Securities of $58,717 million, $10,000 million more than the year before.

Technologically the Ivanpah plant seems impressive. It is worth taking a look at the website.

That might have been the problem. The original projections were for 1,065,000 MWh annually from a 392 MW nameplate capacity, implying a planned output of 31% of capacity. When I look at the costings on Which? for solar panels on the roof of a house, they assume just under 10% of capacity. Another site, Wind and Sun UK, says

1 kWp of well sited PV array in the UK will produce 700-800 kWh of electricity per year.

That is around 8-9.5% of capacity. Even considering the technological superiority of the project and the climatic differences, three times that is a bit steep, although 12.5% (40% of 31%) is very low. From Marketwatch, some of the difference can be explained by

  • Complex equipment constantly breaking down
  • Optimization of complex new technologies
  • Steam pipes leaking due to vibrations
  • Generating the initial steam takes longer than expected
  • It is cloudier than expected

However, even all of this cannot account for the output being only 40% of expected. With the strong sun of the desert I would never expect output to exceed about 40% of the theoretical maximum, as it is daylight for only 50% of the time, and just after sunrise and before sunset the sun is less strong than at midday. As well as the teething problems with complex technology, it appears that the engineers were over-optimistic. A lack of due diligence in appraising the scheme – a factor common to many large-scale Government-backed initiatives – will have let the engineers obtain the finance for a fully scaled-up version of what should have been a small-scale project to prove the technology. The capacity-factor arithmetic is sketched below.
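A minimal sketch of the capacity-factor arithmetic (my own, using the figures quoted above):

```python
# Capacity-factor arithmetic for Ivanpah, from the figures quoted above.
HOURS_PER_YEAR = 365 * 24              # 8,760

nameplate_mw = 392                     # nameplate capacity
planned_mwh = 1_065_000                # projected annual output, MWh

max_possible_mwh = nameplate_mw * HOURS_PER_YEAR        # ~3.43 million MWh
planned_cf = planned_mwh / max_possible_mwh             # ~31% of capacity

actual_cf = planned_cf * 0.40          # "producing just 40% of that"

print(f"Planned capacity factor: {planned_cf:.1%}")     # ~31.0%
print(f"Actual capacity factor:  {actual_cf:.1%}")      # ~12.4%
```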

 

Has NASA distorted the data on global warming?

The Daily Mail has published some nice graphics from NASA on how the Earth’s climate has changed in recent years. The Mail says

Twenty years ago world leaders met for the first ever climate change summit but new figures show that since then the globe has become hotter and weather has become more weird.

Numbers show that carbon dioxide emissions are up, the global temperature has increased, sea levels are rising along with the earth’s population.

The statistics come as more than 190 nations opened talks on Monday at a United Nations global warming conference in Lima, Peru.

http://www.dailymail.co.uk/news/article-2857093/Hotter-weirder-How-climate-changed-Earth.html

See if anyone can find a reason for the following.

  1. A nice graphic compares the minimum sea ice extent in 1980 with that in 2012 – even though the piece appeared nearly three months after the 2014 minimum. Why not use the latest data?

  2. There is a nice graphic showing the rise in global carbon emissions from 1960 to the present. Notice the gradient is quite steep until the mid-70s; there is a much shallower gradient to around 2000, when the gradient increases again. Why do NASA not produce their temperature anomaly graph to show us all how these emissions are heating up the world?

    Data from http://cdiac.ornl.gov/GCP/.

     

  3. There is a simple graphic on sea level rise, derived from the satellite data. Why does the NASA graph start in 1997, when the University of Colorado data, which is available free to download, starts in 1993? http://sealevel.colorado.edu/

     

     

Some Clues

Sea Ice extent

COI | Centre for Ocean and Ice | Danmarks Meteorologiske Institut

Warming trends – GISTEMP & HADCRUT4

The black lines are an approximate fit of the warming trends.

Sea Level Rise

Graph can be obtained from the University of Colorado.

 

NB. This is in response to a post by Steve Goddard on Arctic Sea Ice.

Kevin Marshall

Proximity to Natural Gas Wells and Reported Health Status Study

A new study has been published[a] tentatively suggesting that there are significant health effects for those living in close proximity to gas fracking sites. The study may make headlines despite the authors expressly stating that the results should be viewed as ‘hypothesis generating’. There are a number of problems with the survey which could indicate that small sample size and biases in adjusting for other factors account for the difference. Alternatively, there is the possibility that the reported health effects of living near the fracking sites are due to stress from false perceptions of the risks of living near such a site. Anti-fracking environmentalists may be damaging people’s health and happiness through misinformation.

The study is

Proximity to Natural Gas Wells and Reported Health Status: Results of a Household Survey in Washington County, Pennsylvania (Environ Health Perspect; DOI:10.1289/ehp.1307732)

Peter M. Rabinowitz, Ilya B. Slizovskiy, Vanessa Lamers, Sally J. Trufan, Theodore R. Holford, James D. Dziura, Peter N. Peduzzi, Michael J. Kane, John S. Reif, Theresa R. Weiss, and Meredith H. Stowe

 

The households were split into three groups based on distance from a gas well: <1km (62 households), 1-2km (57) & >2km (61). The major result was

The number of reported health symptoms per person was higher among residents living <1 km (mean 3.27 ± 3.72) compared with >2 km from the nearest gas well (mean 1.60 ± 2.14, p=0.02).

The study also found significantly higher incidences in two out of five health symptom groups in the <1km group than in the >2km group; a naive check of the headline comparison is sketched below.
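As a naive check, the headline comparison can be recomputed from the published summary statistics with an unadjusted Welch t-test. This is my own sketch, not the authors’ analysis: their p=0.02 comes from a model adjusted for covariates, so this raw test will not reproduce it.

```python
# Naive, unadjusted two-sample Welch t-test from the published summary
# statistics. This is NOT the authors' covariate-adjusted analysis, so it
# will not reproduce their reported p = 0.02.
from scipy import stats

result = stats.ttest_ind_from_stats(
    mean1=3.27, std1=3.72, nobs1=150,   # <1 km group (150 people)
    mean2=1.60, std2=2.14, nobs2=192,   # >2 km group (192 people)
    equal_var=False,                    # Welch's test
)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.2g}")
```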

There are multiple reasons for expecting these tentative results will not be replicated.

  • The small sample size for a very complex set of data.
  • Perceived water quality is not related to fracking.
  • Failure to control properly for obesity and smoking
  • Failure to repeat the sampling process with the same model.
  • Failure to corroborate the results by checks for actual contamination.
  • Biases in answering the questions.

 

  1. Small sample size

There is an obvious problem with the health status study: the sample size of 180 households with 472 people is too small to generate meaningful results when there are a number of inter-related factors involved.

Consider how this sample was selected. The researchers randomly selected 20 points on a map in each of 38 townships, then located the house nearest to each point. The researchers were concerned with the possible impact of fracking on ground-fed water supplies, which only applied to a minority of households. This was the main reason for reducing the sample from 760 data points to 227 households; 47 refusals reduced this down to the 180 households from which questionnaires were received. They then put the data through a model “that adjusted for age, gender, household education, smoking, awareness of environmental risk, work type, and animals in house.”

The results were based on comparing two of the sample groups – one with 62 households and 150 people, the other with 61 households and 192 people. The >2km group was 30% larger in people than the <1km group, and its average age was 7 years lower. Not only were the numbers small, but there were material differences between the sample groups that it was necessary to adjust for.

  2. Perceived water quality is not related to fracking.

Sixty-six percent reported using their ground-fed water (well or natural spring) for drinking water and 84% reported using it for other activities such as bathing.

If there were health effects from contaminated water due to fracking, then there should be a difference between those who drank the water and those who did not. Although more households said the water had an unnatural appearance in the <1km group (13/62 for <1km v 6/61 for >2km), the position was reversed for those who said taste/odour prevented water use (14/62 for <1km v 19/61 for >2km). If people believed there was a problem with the water due to fracking, then those living near the wells might be expected to avoid drinking the water more than those further away. That was not the case: the proportions drinking the water were the same. It would appear that water quality is generally considered poor in the whole area, a point that could be checked by water sampling. The reported proportions are laid out in the sketch below.
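Laying out the reported counts as proportions (my own sketch, using the household numbers quoted above):

```python
# The reported household counts on water quality, shown as proportions.
group_sizes = {"<1 km": 62, ">2 km": 61}

responses = {
    "Unnatural appearance":     {"<1 km": 13, ">2 km": 6},
    "Taste/odour prevents use": {"<1 km": 14, ">2 km": 19},
}

for question, counts in responses.items():
    parts = [f"{g}: {counts[g]}/{n} = {counts[g] / n:.0%}"
             for g, n in group_sizes.items()]
    print(f"{question} -> " + "; ".join(parts))
```

The direction of the difference reverses between the two questions, which is the point made above.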

  3. Failure to control properly for obesity and smoking

Obesity and smoking have long been accepted as having consequences for health. The questionnaire is in the Supplemental Material. For obesity it asks respondents for their own height and weight, but not the height and weight of the other members of the household. For smoking the question is

Does anyone in this household smoke regularly inside the house?

Smoking causes health problems whether or not someone smokes inside their home. The number of people smoking in a household also matters, along with the number of years smoked and the quantity of cigarettes smoked.

  4. Failure to repeat the sampling process with the same model.

The model that filtered out other elements could have had some very large biases within it. For instance, the model could have over-adjusted for smoking. Conducting a completely fresh survey with the same sampling method would have eliminated this possibility.

  5. Failure to corroborate the results by checks for actual contamination.

If there were actual health issues from water or air contamination, then there should be some evidence in water and air samples, but the authors did not consult any actual monitoring results to show contamination. In the case of air quality they threw everything at the issue, including ‘operation of diesel equipment and vehicles‘. If there is something in the air and/or in the water that is causing real health problems, then it will be something that cannot simply be perceived.

  6. Biases in answering the questions.

In the introduction the authors say

A convenience sample survey of 53 community members living near Marcellus Shale development found that respondents attributed a number of health impacts and stressors to the development. Stress was the symptom reported most frequently (Ferrar et al. 2013).

The study said

We found instead that the refusal rate, while less than 25% overall, was higher among households farther from gas wells, suggesting that such households may have been less interested in participating due to lesser awareness of hazards.

If participation was higher among people nearer to wells because of perceived hazards, and those people were stressed by the perceived risk, then that stress could exacerbate the symptoms, and/or people who have heard stories of possible health effects may notice their own conditions more. That is, the reported health effects of living near fracking sites may be to some extent real, but caused by the stress of believing the scare stories. This could be coupled with such fears leading people to remember minor health symptoms, as there now might be a cause. This alone could explain why the number of reported symptoms was twice the level for people living near to the gas wells. One test would be to conduct a similar but larger survey covering both dwellings with mains water supply and those using ground-fed wells. If there is “something in the water”, then those who are mains supplied should not suffer health effects to the same degree.

  a. Thanks to commentator “Entropic Man” at a Bishop-Hill discussion thread for alerting me to this study.

Kevin Marshall

Theconsensusproject – unskeptical misinformation on Global Warming

Summary

Following the publication of a survey finding a 97% consensus on global warming in the peer-reviewed literature, the team at “skepticalscience.com” launched the theconsensusproject.com website. Here I evaluate the claims using two of website owner John Cook’s own standards. The first is that “genuine skeptics consider all the evidence in their search for the truth”. The second is that misinformation is highly damaging to democratic societies, and reducing its effects a difficult and complex challenge.

Applying these standards, I find that

  • The 97% consensus paper is very weak evidence to back global warming. Stronger evidence, such as predictive skill and increasing refinement of the human-caused warming hypothesis, is entirely lacking.
  • The claim that “warming is human caused” has been contradicted at the Sks website. Statements about catastrophic consequences are unsupported.
  • The prediction of 8°F of warming this century without policy is contradicted by the UNIPCC reference.
  • The prediction of 4°F of warming with policy fails to state that this is contingent on successful implementation by all countries.
  • The costs of unmitigated warming and the costs of policy and residual warming are from cherry-picking from two 2005 sources. Neither source makes the total claim. The claims of the Stern Review, and of its critics, are ignored.

Overall, by his own standards, John Cook’s Consensus Project website is a source of extreme unskeptical misinformation.

 

Introduction

Last year, following the successful publication of their study on “Quantifying the consensus on anthropogenic global warming in the scientific literature“, the team at skepticalscience.com (Sks) created the spinoff website theconsensusproject.com.

I could set some standards of evaluation of my own, but the best way to evaluate this website is by the standards of Sks owner and leader John Cook himself.

First, he has a rather odd definition of what a skeptic is. In an opinion piece in 2011 Cook stated:-

Genuine skeptics consider all the evidence in their search for the truth. Deniers, on the other hand, refuse to accept any evidence that conflicts with their pre-determined views.

This definition might be totally at odds with the world’s greatest dictionary in any language, but it is the standard Cook sets.

Also Cook co-wrote a short opinion pamphlet with Stephan Lewandowsky called The Debunking Handbook. It begins

It’s self-evident that democratic societies should base their decisions on accurate information. On many issues, however, misinformation can become entrenched in parts of the community, particularly when vested interests are involved. Reducing the influence of misinformation is a difficult and complex challenge.

Cook fully believes that accuracy is hugely important. Therefore we should see evidence of great care in ensuring the accuracy of anything that he or his followers promote.

 

The Scientific Consensus

The first page is based on the paper

Cook’s definition of a skeptic considering “all the evidence” is technically not breached. With the abstracts of over 12,000 papers evaluated, it is a lot of evidence. The problem is nicely explained by Andrew Montford in the GWPF note “FRAUD, BIAS AND PUBLIC RELATIONS – The 97% ‘consensus’ and its critics“.

The formulation ‘that humans are causing global warming’ could have two different meanings. A ‘deep’ consensus reading would take it as all or most of the warming is caused by humans. A ‘shallow’ consensus reading would imply only that some unspecified proportion of the warming observed is attributable to mankind.

It is the shallow consensus that the paper followed, as shown by a leaked email from John Cook that Montford quotes.

Okay, so we’ve ruled out a definition of AGW being ‘any amount of human influence’ or ‘more than 50% human influence’. We’re basically going with Ari’s porno approach (I probably should stop calling it that) which is AGW= ‘humans are causing global warming’. e.g. – no specific quantification which is the only way we can do it considering the breadth of papers we’re surveying.

There is another aspect. A similar methodology applied to social science papers produced in the USSR would probably find an overwhelming consensus supporting the statement “communism is superior to capitalism”. Most of those papers would now be considered worthless.

Then there is the quality of that evidence. Surveying the abstracts of peer-reviewed papers is a very roundabout way of taking an opinion poll. It is basically some people’s opinions of other people’s implied opinions, derived from short statements on tangentially related issues. In legal terms it is an extreme form of hearsay.

More important still is whether, as a true “skeptic”, all the evidence (or at least the most important parts) has been considered. Where is the actual evidence that humans cause significant warming, beyond the weak correlation between rising greenhouse gas levels and rising average temperatures? Where is the evidence that the huge numbers of climate scientists have an understanding of their subject, demonstrated by a track record of successful short-term predictions and increasing refinement of the human-caused warming hypothesis? Where is the evidence that they are true scientists following in the traditions of Newton, Einstein, Curie and Feynman, and not the followers of Comte, Marx and Freud? If John Cook is a true “skeptic”, and is presenting the most substantial evidence, then climate catastrophism is finished. But if Cook leaves out much better evidence, then his survey is misinformation, undermining the case for necessary action.

 

Causes of global warming

The next page is headed.

There is no exclusion of other causes of the global warming since around 1800. But, with respect to the early twentieth century warming, Dana Nuccitelli said:-

CO2 and the Sun played the largest roles in the early century warming, but other factors played a part as well.

However, there is no clear way of sorting out the relative contributions of the components. The statement that “the causes of global warming are clear” is therefore false.

On the same page there is this.

This is a series of truth statements about the full-blown catastrophic anthropogenic global warming hypothesis. Regardless of the strength of the evidence in its support, it is still a hypothesis. One could treat some scientific hypotheses as being essentially truth statements, such as “smoking causes lung cancer” and “HIV causes AIDS”, as they are so very strongly supported by multiple lines of evidence1. But no scientific evidence is provided to substantiate the claim that global warming is harmful, just the shallow 97% consensus belief that humans cause some warming.

This core “global warming is harmful” statement is clear misinformation. It is extremely unskeptical, as it is arrived at by not considering any evidence.

 

Predictions and Policy

The final page is in three parts – warming prediction without policy; warming prediction with policy; and the benefits and costs of policy.

Warming prediction without policy

The source for the prediction of 8°F (4.4°C) of warming by 2100 without policy is the 2007 UNIPCC AR4 report, now seven years out of date. The relevant table linked to is this:-

There are a whole range of estimates here, all with uncertainty bands. The highest has a best estimate of 4.0°C, or 7.2°F. They seem to have taken the highest best estimate and rounded up. But that scenario is strictly for the temperature change at 2090-2099 relative to 1980-1999: a 105-year period, against the 87-year period on the graph. Pro-rata, the best estimate for the A1FI scenario is 3.3°C, or 6°F; the arithmetic is sketched below.
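The pro-rata arithmetic (my own sketch of the adjustment described above):

```python
# Pro-rata adjustment of the AR4 A1FI best estimate, as described above.
best_estimate_c = 4.0      # A1FI best estimate, 2090-2099 vs 1980-1999
full_period_years = 105    # period covered by that estimate
graph_period_years = 87    # period shown on the graph (2013-2100)

pro_rata_c = best_estimate_c * graph_period_years / full_period_years
pro_rata_f = pro_rata_c * 9 / 5

print(f"Pro-rata best estimate: {pro_rata_c:.1f} degC = {pro_rata_f:.1f} degF")
# ~3.3 degC, or ~6 degF
```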

But a genuine “skeptic” considers all the evidence, rather than cherry-picking the evidence that suits their argument. If there is a best estimate to be chosen, which of the various models should it come from? In other areas of science, when faced with a number of models for future predictions, the one chosen is the one that performs best. Leading climatologist Dr Roy Spencer has provided us with such a comparison. Last year he ran 73 of the latest CMIP5 climate models. Compared to the actual data, every single one was running too hot.

A best estimate on the basis of all the evidence would be somewhere between zero and 1.1°C, the lowest figure available from any of the climate models. To claim a figure higher than the best estimate of the most extreme of the models is not only dismissing reality, but denying the scientific consensus.

But maybe the hiatus in warming of the last 16-26 years is just an anomaly? There are possibly 52 explanations of this hiatus, with more coming along all the time. However, given that they allow for natural factors and/or undermine the case for climate models accurately describing the climate, they further weaken the case for a single extreme prediction of warming to 2100. To maintain the 8°F figure is – by Cook’s own definition – an extreme case of climate denial.

Warming prediction with policy

If the 8°F of predicted human-caused warming is extreme, then the warming avoided by a policy that successfully halves that potential warming is not 4°F, but half of whatever the accurate prediction would be. There are further problems. To be successful, that policy involves every major Government of the developed countries (at least the USA, Russia, the EU and Japan) reducing emissions by 80% by around 2050, and every other major country (at least Russia, China, India, Brazil, South Africa, Indonesia and Ukraine) constraining emissions at current levels forever. To get all countries to sign up to such a policy, prioritising the combatting of global warming over all other commitments, is near impossible. Then take a look at the world map in 1925-1930 and ask whether you could reasonably have expected those Governments to sign commitments binding on the Governments of 1945, let alone today. To omit policy considerations is an act of gross naivety, and clear misinformation.

The benefits and costs of policy

The benefits and costs of policy are the realm of economics, not of climatology, so here Cook’s definition of “skeptic” does not apply. There is no consensus in economics. However, there are general principles that are applied – or at least were applied when I studied the subject in the 1980s:-

  • Modelled projections are contingent on assumptions, and are adjusted for new data.
  • Any competent student must be aware of the latest developments in the field.
  • Evaluation of competing theories is by comparing and contrasting.
  • If you are referencing a paper in support of your arguments, at least check that it does just that.

The graphic claims that the “total costs by 2100” of action are $10 trillion, as against $20 trillion for inaction – the costs of action being made up of the policy costs plus the more limited damage costs. There are two sources for this claim, both from 2005. The first is “The Impacts and Costs of Climate Change”, a report commissioned by the EU. In its Executive Summary it is stated:-

Given that €1.00 ≈ $1.20, the costs of inaction come to $89 trillion, and the costs of reducing to 550ppm CO2 equivalent (the often-quoted crucial level, equating to 2-3 degrees of warming from a doubling of CO2 levels above pre-industrial levels) to $38 trillion. These figures do not match the graphic. However, the average of 43 and 32 is 37.5, or about half of 74. That gives the halving of total costs.
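
Working backwards, the underlying figures appear to be €74 trillion for inaction and €43 trillion and €32 trillion for the two action scenarios – an assumption on my part, as the quoted summary is not reproduced here. On that basis the arithmetic runs:

```latex
% Converting at €1.00 ≈ $1.20
74 \times 1.2 = 88.8 \approx 89 \;\text{(\$ trillion, inaction)}
\qquad
32 \times 1.2 = 38.4 \approx 38 \;\text{(\$ trillion, 550ppm)}

% Averaging the two action scenarios gives the "halving"
\tfrac{1}{2}(43 + 32) = 37.5 \approx \tfrac{74}{2}
```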

The second is from the German Institute for Economic Research. They state:-

If climate policy measures are not introduced, global climate change damages amounting to up to 20 trillion US dollars can be expected in the year 2100.

This gives the $20 trillion.

The costs of an active climate protection policy implemented today would reach globally around 430 billion US dollars in 2050 and around 3 trillion US dollars in 2100.

This gives the low policy costs of combatting global warming.

It is only by this arbitrary sampling of figures from the two papers that the website’s figures can be established. But there is a problem in reconciling the two papers. The first paper gives cumulative figures up to 2100 – the shorthand for this is “total costs by 2100”. The $20 trillion figure, however, is an estimate for the single year 2100, as the statement about the policy costs confirms. This confusion makes the policy costs appear to be less than 0.1% of global output, instead of around 1% or more.
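
To illustrate the confusion with round numbers of my own (purely illustrative, not taken from either report): suppose world output in the single year 2100 is of the order of $300 trillion, with cumulative output over the century running to several thousand trillion dollars. Then:

```latex
% $3tn of annual policy costs against year-2100 output
\tfrac{3}{300} = 1\%\ \text{of output}
% the same $3tn mistakenly set against cumulative output to 2100
\tfrac{3}{3000} = 0.1\%\ \text{of output, or less}
```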

Further, the figures are contradicted by the Stern Review of 2006, which was widely quoted in the UNIPCC AR4. In the summary of conclusions, Stern stated:-

Using the results from formal economic models, the Review estimates that if we don’t act, the overall costs and risks of climate change will be equivalent to losing at least 5% of global GDP each year, now and forever. If a wider range of risks and impacts is taken into account, the estimates of damage could rise to 20% of GDP or more.

In contrast, the costs of action – reducing greenhouse gas emissions to avoid the worst impacts of climate change – can be limited to around 1% of global GDP each year.

The benefit/cost ratio is dramatically different. Tol and Yohe provided a criticism of Stern, showing he used the most extreme estimates available. A much fuller criticism was provided by Peter Lilley in 2012. The upshot is that even with a single prediction of the amount and effects of warming, there is a huge range of cost impacts. Cook is truly out of his depth when stating single outcomes. What is worse, the costs of policy to curb greenhouse emissions are far greater, and its effectiveness far less, than the benefit-cost analyses allow.


Conclusion

Taking all the evidence into account, and presenting the conclusions in a way that clearly conveys the information available, are extremely high standards to adhere to. But theconsensusproject.com does not just fail to get close to these benchmarks; it does the opposite. It totally fails to consider all the evidence. Even the sources it cites are grossly misinterpreted. The conclusion I draw is that the benchmarks Cook and the skepticalscience.com team have set are just weapons to shut down opponents, leaving the field clear for their shallow, dogmatic and unsubstantiated beliefs.

Kevin Marshall


Notes

  1. The evidence for “smoking causes lung cancer” I discuss here. The evidence for “HIV causes AIDS” is very ably considered by AVERT, an international HIV and AIDS charity based in the UK, at this page.
  2. Jose Duarte has examples here.

The Lewandowsky Smooth

Summary

The Risbey et al. 2014 paper has already attracted criticism of its claim that some climate models can still account for actual temperature trends. However, those criticisms did not look at the “actual” data used, nor did they account for why Stephan Lewandowsky, a professor of psychology, should be a co-author of a climate science paper. I construct a simple model in Excel of surface temperature trends that accurately replicates the smoothed temperature data in Risbey et al. 2014. Whereas the HADCRUT4 data set shows a cooling trend since 2006, a combination of three elements smooths it away to give the appearance of a minimal downturn in a warming trend. Those elements are: the use of the biased Cowtan and Way 2013 data; the use of decadal changes in the data (as opposed to anomalies from a fixed base period); and the use of 15-year centred moving averages. As Stephan Lewandowsky was responsible for the “analysis of models and observations”, this piece of gross misinformation must be attributed to him – hence the title.

Introduction

Psychology Professor Stephan Lewandowsky has previously claimed that “inexpert mouths” should not be heard. He is first a psychologist-cum-statistician; then a specialist on ethics and peer review; then publishes on the maths of uncertainty. Now Lewandowsky re-emerges as a climate scientist, in

“Well-estimated global surface warming in climate projections selected for ENSO phase”, James S. Risbey, Stephan Lewandowsky, Clothilde Langlais, Didier P. Monselesan, Terence J. O’Kane & Naomi Oreskes, Nature Climate Change (Risbey et al. 2014)

Why the involvement?

Risbey et al. 2014 was the subject of a long post at WUWT by Bob Tisdale, which was concerned with the claim that the projections of some climate models could replicate the surface temperature data.

Towards the end Tisdale notes

The only parts of the paper that Stephan Lewandowsky was not involved in were writing it and the analysis of NINO3.4 sea surface temperature data in the models. But, and this is extremely curious, psychology professor Stephan Lewandowsky was solely responsible for the “analysis of models and observations”.

Lewandowsky summarizes his contribution at shapingtomorrowsworld. The following is based on that commentary.

Use of Cowtan and Way 2013

Lewandowsky asks “Has global warming ‘stopped’?”. To answer in the negative he uses Cowtan and Way 2013. This was an attempt to correct the coverage biases in the HADCRUT4 data set by infilling, through modelling, where the temperature series lacked data – principally at the poles and in parts of Africa. However, the authors first removed some of the HADCRUT4 data, stating their reasons for doing so. Roman M found that this amounted to just 3.34% of the filled-in grid cells, but was strongly biased towards the poles – exactly where the HADCRUT4 data was lacking. The computer model was thus not just infilling where data was absent, but replacing sparse data with modelled data.
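
Conceptually, the infilling works along the following lines. This is a minimal sketch only: a made-up toy grid, with nearest-neighbour interpolation as a crude stand-in for the kriging that Cowtan and Way actually used, and without their removal-and-replacement of sparse cells:

```python
import numpy as np
from scipy.interpolate import griddata

# Toy 5x5 grid of temperature anomalies; NaN marks cells with no observations
rng = np.random.default_rng(0)
grid = rng.normal(0.5, 0.2, size=(5, 5))
grid[0, :] = np.nan    # a "polar" band lacking coverage
grid[4, 3:] = np.nan   # a sparsely observed region

rows, cols = np.indices(grid.shape)
known = ~np.isnan(grid)

# Estimate every cell from the observed cells alone
filled = griddata(
    points=np.column_stack([rows[known], cols[known]]),
    values=grid[known],
    xi=(rows, cols),
    method="nearest",  # Cowtan and Way used kriging; nearest-neighbour here
)
```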

Steve McIntyre plotted the differences between CW2013 and HADCRUT4.

Stephan Lewandowsky should have acknowledged that, through the greater use of modelling techniques, Cowtan and Way is a more circumstantial estimate of global average surface temperature trends than HADCRUT4. This would be the case even if its results were achieved by robust methods.

Modelling the smoothing methods

The Cowtan and Way modelled temperature series was then smoothed to create the following series in red.

The smoothing was achieved by two methods. The first was to look at decadal changes rather than temperature anomalies – the difference from a fixed point in time. The second was to use 15-year centred moving averages.

To help understand the impact these methods had on the observational data, I have constructed a simple model of the major HADCRUT4 temperature changes. The skepticalscience.com website very usefully has a temperature trends calculator.

The phases I use, in Kelvin per decade, are

The Cowtan and Way trend is simply HADCRUT4 with a trend of 0.120 Kelvin per decade for the 2005-2013 period. This simply converts a cooling trend since 2005 into a warming one, as illustrated below.
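
A minimal pandas equivalent of the Excel construction is sketched below. The phase values are illustrative placeholders, since the phase table is not reproduced in this text; the 0.120 K/decade adjustment for 2005-2013 is the figure quoted above:

```python
import numpy as np
import pandas as pd

# Piecewise-linear phases in K/decade; placeholder values, not the actual table
phases = [
    ("1950-01", "1973-12", -0.05),  # slight cooling
    ("1974-01", "1985-12",  0.18),  # first warming phase
    ("1986-01", "1990-12",  0.00),  # brief pause
    ("1991-01", "2004-12",  0.21),  # second warming phase
    ("2005-01", "2013-12", -0.05),  # recent cooling
]

months = pd.date_range("1950-01", "2013-12", freq="MS")
monthly_rate = pd.Series(0.0, index=months)
for start, end, rate in phases:
    monthly_rate.loc[start:end] = rate / 120.0   # K per month
hadcrut_like = monthly_rate.cumsum()             # synthetic HADCRUT4-style series

# Cowtan-and-Way-style variant: replace the 2005-2013 cooling with a
# warming trend of 0.120 K per decade
cw_like = hadcrut_like.copy()
n = len(cw_like.loc["2005-01":"2013-12"])
cw_like.loc["2005-01":"2013-12"] = (
    float(cw_like.loc["2004-12-01"]) + 0.120 / 120.0 * np.arange(1, n + 1)
)
```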

The next step is to make the trends into decadal trends, by finding the difference between the current month’s figure and that of 120 months previous. This derives the following for the Cowtan and Way trend data.

Applying decadal trends spreads the impact of a change in trend over the ten years following the change. Using unadjusted HADCRUT4 would mean the latest decadal trends are now around zero.
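
Continuing the sketch, the decadal change is simply the 120-month difference:

```python
# Decadal change: each month's value minus the value 120 months earlier
decadal_cw = cw_like - cw_like.shift(120)
decadal_had = hadcrut_like - hadcrut_like.shift(120)

# The unadjusted series ends near zero K/decade; the adjusted one stays positive
print(round(decadal_had.iloc[-1], 2), round(decadal_cw.iloc[-1], 2))
```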

The next step is to apply 15-year centred moving averages.

The centred moving average spreads the impact of a change in trend to before the change occurred, so warming starts in 1967 instead of 1973. This partly offsets the impact of the decadal changes, but further smothers any step changes. Together the two elements create a nice smoothing of the data. The contribution of Cowtan and Way is to future-proof this conclusion against the recent cooling.
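
The final step of the sketch is the 15-year (180-month) centred moving average. Because each value draws on 90 months either side, a change of trend bleeds roughly seven and a half years backwards – which is how warming that starts in 1973 appears to start in 1967:

```python
# 15-year centred moving average of the decadal changes; edge values are
# undefined where a full 180-month window is unavailable
smoothed = decadal_cw.rolling(window=180, center=True).mean()
```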

Comparison of modelled trend with the “Lewandowsky Smooth”

Despite dividing up over sixty years of data into just five periods, I have managed to replicate the essential features of the decadal trend data.

A. Switch from slight cooling to warming trend in late 1960s, some years before the switch occurred.

B. Double peaks in warming trend, the first below 0.2 degrees per decade, the second slightly above.

C. The smoothed graph ending with warming not far off the peak, obliterating the recent cooling in the HADCRUT4 data.

Lewandowsky may not have used the decadal change as the extra smoothing technique, but whatever technique was used achieved very similar results to my simple Excel effort. So to Lewandowsky’s question “Has global warming ‘stopped’?”, the answer is “Yes”. Lewandowsky knew this, so he manipulated the data to smooth the problem away. The significance lies in a quote from “the DEBUNKING Handbook”:-

It’s self-evident that democratic societies should base their decisions on accurate information. On many issues, however, misinformation can become entrenched in parts of the community, particularly when vested interests are involved. Reducing the influence of misinformation is a difficult and complex challenge.

Lewandowsky is providing misinformation, and has an expert understanding of its pernicious effects.

Kevin Marshall
