IPCC AR5 Synthesis Report Presentation Miscalculated the Emissions for 2C of Warming

In a previous post I mistakenly claimed that the Ladybird Book on Climate Change (lead author HRH The Prince of Wales) had incorrectly interpreted the AR5 IPCC Synthesis Report in its egg-timer. It is the IPCC that is at fault.
In 2014 the IPCC produced a simplified presentation of 35 slides to summarize the AR5 Synthesis Report Summary for policy makers. A quick summary of a summary of the synthesis report.

Slide 30 on Limiting Temperature Increase to 2C, clearly states that it is global reductions in greenhouse gas emissions that are needed.

The Ladybird egg-timer is adapted from slide 33 of 35.

As a (slightly manic) beancounter I like to reconcile the figures. How are the 1900 GtCO2 and the 1000 GtCO2 arrived at? It could be that the figures are GtCO2e, as used throughout the synthesis report, where other greenhouse gases are recast in terms of CO2, which accounts for well over half of the warming from trace gases.

Some assumptions for my quick calculations.

1. A doubling of CO2 will lead to a warming of 3C. This was the central estimate of the Charney Report 1979 (pdf), along with all five of the UNIPCC assessment reports.
2. If the pre-industrial level of CO2 was 280ppm, the dangerous 2C of warming will be reached at 445ppm. Rounded this is 450ppm.
3. In 2011 the Mauna Loa CO2 level was 391.63 ppm.
4. Using the CDIAC World CO2 emission figures gives the figures in the graph below for the billions of tonnes of CO2 needed to achieve a 1ppm rise in CO2 levels. In the five years to 2011, on average it took 17.02 billion tonnes of CO2 to raise CO2 levels by 1 ppm. Let's round it to 17.

Now some quick calculations.
Start with 280ppm
Add 111.76 (=1900/17), giving 391.76. Pretty close to the CO2 level in 2011 of 391.63ppm.
Add 58.82 (=1000/17), giving 450.58. Given rounding, this is pretty close to 450ppm.
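The reconciliation can be scripted as a quick check. This is a sketch only: the 17 GtCO2-per-ppm ratio and the 1900/1000 GtCO2 figures are the rough inputs from the slides and calculations above, not precise physical constants.

```python
# Reconciling the IPCC's cumulative emissions figures with CO2 levels.
# Inputs are the rough figures used in this post, not precise constants.

GT_CO2_PER_PPM = 17.0      # billion tonnes CO2 per 1 ppm rise (5-year average to 2011)
PRE_INDUSTRIAL_PPM = 280.0

cumulative_to_2011 = 1900.0   # GtCO2 emitted to 2011 (IPCC presentation)
budget_for_2C = 1000.0        # GtCO2 remaining for 2C (IPCC presentation)

level_2011 = PRE_INDUSTRIAL_PPM + cumulative_to_2011 / GT_CO2_PER_PPM
level_at_2C = level_2011 + budget_for_2C / GT_CO2_PER_PPM

print(f"Implied 2011 level: {level_2011:.2f} ppm")   # ~391.8, vs Mauna Loa 391.63
print(f"Implied 2C level:   {level_at_2C:.2f} ppm")  # ~450.6, i.e. roughly 450 ppm
```

In other words, the IPCC's two headline numbers only reconcile if CO2 is treated as the whole greenhouse gas story.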

There are problems with these calculations.

  • The estimate of 17 GtCO2 per ppm is on the high side. The World CO2 emissions from the CDIAC National Emissions spreadsheet give a sum of 1069.68 GtCO2 from 1960 to 2011, against a rise in CO2 of 74.72 ppm. That is 14.3 GtCO2 per ppm over the whole period. Since 2011 there has been a drop back towards this long-term average.
  • The Ladybird Book, like the UNFCCC at COP21 Paris in December 2015, talks about restraining warming to 1.5C. If a doubling of CO2 leads to 3.000C of warming, then going from 280ppm to 401ppm (the average level in 2015) will eventually produce 1.555C of warming. This is a tacit admission that climate sensitivity is vastly overstated.
  • But the biggest error of all is to treat CO2 as if it were the only greenhouse gas, when it is merely the major one (if you forget about water vapour). It might account for the majority of the warming impact and two-thirds of emissions, but according to the theory it is not all of the warming impact. That alone would indicate a climate sensitivity of 2 instead of 3. But actual warming from 1780 to 2011 was less than 1C, against the 1C from CO2 alone if CS=2. That indicates that CS ≈ 1.3. And not all of the warming in the last 230 years has been due to changes in GHG levels; there was also recovery from the Little Ice Age. Worst of all for climate alarmism is the divergence problem. In this century the rate of warming should have increased as the rate of rise in CO2 levels increased, in turn driven by an increase in the rate of rise in CO2 emissions. But warming stopped. Even with the impact of a strong El Nino, the rate of warming slowed dramatically.
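The sensitivity arithmetic in these bullets follows from the standard simplification that equilibrium warming scales with the logarithm of the CO2 concentration ratio. A minimal sketch, using the assumed sensitivities discussed above:

```python
from math import log2

def eventual_warming(sensitivity, co2_ppm, baseline_ppm=280.0):
    """Equilibrium warming (C), assuming warming scales with log2 of the
    CO2 concentration ratio -- the simplification used throughout this post."""
    return sensitivity * log2(co2_ppm / baseline_ppm)

# With the central sensitivity of 3C per doubling, the 2015 average level
# of 401 ppm already implies ~1.55C of eventual warming:
print(round(eventual_warming(3.0, 401.0), 2))

# With a sensitivity of 2C per doubling, CO2 alone gives ~0.97C at 2011 levels:
print(round(eventual_warming(2.0, 391.63), 2))
```

The function names and the 2C case are illustrative only; the 3C-per-doubling figure is the Charney/IPCC central estimate cited above.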



The IPCC calculated their figure of 1000 billion tonnes of CO2 emissions for 2C of warming on the basis that CO2 is the only greenhouse gas and that a doubling of CO2 levels produces 3C of warming. On that basis, a 401ppm CO2 level should produce >1.5C of warming. Add in the other greenhouse gases and we are in for 2C of warming without any more greenhouse gas emissions. Only if climate sensitivity is much lower is it theoretically possible to prevent 2C of warming by drastically reducing global CO2 emissions. The IPCC have concocted figures knowing that they do not reconcile back to their own assumptions.

The questions that arise are (a) where do the cumulative emissions figures come from? and (b) has the UNFCCC copied these blatant errors into the COP processes?

This is an extended version of a comment made at Paul Homewood’s notalotofpeopleknowthat blog.

Kevin Marshall

Friends of the Earth distorting the evidence for Fracking in the UK


Friends of the Earth have a webpage claiming to present “fracking facts”. The key points I make are:

  • The claims of the dangers of fracking raise questions that need to be answered before they can be considered credible.
  • The claim that fracking could affect house prices is totally unsupported.
  • The claim that shale gas will not significantly affect energy prices is based on out-of-date data. The British Geological Survey has shown that the potential of shale gas is huge. Friends of the Earth has played a major role in preventing that potential from being realized.
  • FoE has consequently helped prevent shale gas from relieving the energy crisis brought on by the Climate Change Act 2008.
  • Claims that pursuing shale gas in Britain will affect global emissions are pure fantasy. Also a fantasy is the belief that Britain is leading the way on emissions reductions. We ain’t leading if collectively the world is not following. The evidence clearly shows this.

In the previous post I looked at how FoE blatantly misled about an agreement they reached with the Advertising Standards Authority, which caused the unusual step of ASA Chief Executive Guy Parker issuing a strongly worded statement to defend the ASA’s integrity.

In this post I will look at FoE’s position on fracking, from Fracking definition? What does fracking mean? Read our fracking facts

I will look at various statements made (with FoE quotes in purple), showing how well they are supported by the evidence and/or providing alternative perspectives.

From the section What are the dangers of fracking?

Industry statistics from North America show that around 6% of fracking wells leak immediately.

Leaking wells lead to a risk of water contamination. Lord Smith, former chair of the Environment Agency, has said this is the biggest risk posed by fracking.

So it’s particularly concerning that the Government has now said it will allow fracking companies to drill through aquifers which provide household drinking water.

This raises some questions.

  • If leaks are a problem, with respect to fracking in the UK has this been risk assessed, with appropriate measures taken to prevent leaks?
  • Does that statistic of 6% allow for natural leakage in the area of fracking, where gas was already seeping into water supplies or venting into the atmosphere before fracking occurred? This was the case with the images of the flaming water faucet in the movie Gasland.
  • Have there been steps taken in the USA to reduce genuine leaks?
  • Has the proportion of wells leaking gas in the USA been increasing or decreasing?
  • Has the average amount of gas leaked been increasing or decreasing?
  • When gas is extracted from well below the water aquifers through a lined tube that is both water-tight and gas-tight, how is that gas (along with fracking fluids) meant to leach into the water supply?

Then there is a statement made without any evidence.

Fracking could also affect house prices.

This was one of the claims covered by FoE’s agreement with the ASA: the assurance not to repeat the claim that fracking affects property prices, unless the evidence changes. Legally there might be a cop-out in that the assurance does not apply to claims made on its website. Literally, the statement is not untrue, but only in the same sense as the claim that a butterfly flapping its wings on the North Downs could lead to a typhoon in the South China Sea.

Would fracking bring down energy bills?

It’s very unlikely. Fracking company Cuadrilla has admitted that any impact on bills would be “basically insignificant”.

Claims that fracking would create a lot of jobs have also been overstated. According to Cuadrilla, each of its proposed 6-year projects in Lancashire that were recently rejected by the council would only have created 11 jobs.

The claim about Cuadrilla is sourced from an Independent article in June 2013.

“We’ve done an analysis and it’s a very small…at the most it’s a very small percentage…basically insignificant,” said Mark Linder, a public relations executive at Bell Pottinger who is also responsible for Cuadrilla’s corporate development.

The article later says

“According to Poyry, Lancashire shale gas production could also reduce the country’s wholesale gas and electricity prices by as much as 4 per cent between 2014 and 2035, which corresponds to an average saving of £810m/year,”

It is not surprising that shale gas developments in Lancashire alone will not have a significant impact on UK energy prices, especially if that is restricted to a few sites by one company. But over three years later the landscape has changed. The British Geological Survey has been publishing estimates of the quantities of shale gas (and oil) that exists beneath the ground.

The figures are at first hard to comprehend. Large numbers, in units of measure that ordinary people (even people with some knowledge of the field) are unfamiliar with, are hard to comprehend, let alone put into perspective. In my view, the figures need to be related to annual British consumption. Page 8 of the DECC UK Energy Statistics, 2015 & Q4 2015 estimates gas demand at 794 TWh in 2015.

The BGS uses tcf (trillion cubic feet) for its estimates, which (like the kWh on a domestic gas bill) can be converted from TWh. The 794 TWh is about 2.7 tcf. Not all shale gas is recoverable. In fact, possibly only 10% of the resource is recoverable with existing technology, depending on the quality of the deposits.
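The TWh-to-tcf conversion behind that figure can be sketched as follows, assuming a typical energy content of roughly 1,000 Btu per cubic foot of natural gas (the exact calorific value varies with gas quality):

```python
# Convert annual UK gas demand in TWh to trillion cubic feet (tcf).
# Assumes ~1,000 Btu per cubic foot; actual calorific values vary.

BTU_PER_CUBIC_FOOT = 1000.0
BTU_PER_KWH = 3412.14

def twh_to_tcf(twh):
    kwh = twh * 1e9                 # 1 TWh = 1e9 kWh
    btu = kwh * BTU_PER_KWH
    return btu / (BTU_PER_CUBIC_FOOT * 1e12)

print(round(twh_to_tcf(794.0), 2))  # 2015 UK gas demand: about 2.7 tcf
```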

There are also shale oil deposits, measured by the BGS in both barrels and millions of tonnes. Refinery production (a rough estimate of consumption) was 63 million tonnes in 2015. I will again assume 10% recovery.

The biggest shock was published just a few weeks after the Independent article, on 27th July 2013. The size of the Bowland shale was truly staggering. The central estimate is 1329 tcf, meaning enough to satisfy 49 years of current UK gas demand. Potentially it is more, due to the depth of deposits in many areas. No significant deposits of oil are thought to be present.

On 23rd May 2014 the BGS published the results for the Weald Basin, a large area in the South East of England. Whilst there were no significant deposits of gas, the central estimate of 591 million tonnes of oil is enough to supply the UK for about one year.

On 25 June 2014 the Welsh Government published the estimates for Wales. The main gas deposits are thought to be in Wrexham/Cheshire and in South Wales, estimated at about 65 tcf in total, or just over two years of UK demand. (Strictly, the Welsh estimate is somewhat below this, as Wrexham is on the Welsh border and Cheshire is an English county.)

On 23rd May 2014 BGS published the results for the Midland Valley of Scotland. The central estimate for shale gas was 80.3 tcf (3 years of UK demand) and for shale oil 800 million tonnes (15 months of refinery production).

Most recently on 13th October 2016, BGS published the results for the Jurassic shale of the Wessex area. Central estimate for shale oil was 149 million tonnes, equivalent to three months of UK refinery production.

In all, conservatively there is estimated to be sufficient gas to supply the UK for over 54 years and oil for two and a half years. The impact on supply, and therefore the impact on jobs and (in the case of gas) on energy prices, depends on the ability of businesses to profitably develop these resources. As has happened in the USA, the impact on jobs is mostly dependent on the impact on prices, as low prices benefit other industries. In the USA, industries that are sensitive to energy prices (or use gas as a raw material) have returned from overseas, boosting jobs. FoE has played no small part in delaying planning applications with spurious arguments, along with generating false fears that could have made regulations more onerous than if an objective assessment of the risks had been made.
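Putting the BGS central estimates together gives the totals quoted above. A sketch, using this post's assumptions of 10% recoverability, 2.7 tcf of annual gas demand and 63 million tonnes of annual refinery production:

```python
# Sum the BGS central estimates quoted above and express them as years
# of UK demand, under this post's 10% recoverability assumption.

RECOVERABLE = 0.10
ANNUAL_GAS_TCF = 2.7     # ~794 TWh of 2015 gas demand
ANNUAL_OIL_MT = 63.0     # 2015 refinery production, million tonnes

gas_tcf = {"Bowland": 1329.0, "Wales": 65.0, "Midland Valley": 80.3}
oil_mt = {"Weald": 591.0, "Midland Valley": 800.0, "Wessex": 149.0}

gas_years = sum(gas_tcf.values()) * RECOVERABLE / ANNUAL_GAS_TCF
oil_years = sum(oil_mt.values()) * RECOVERABLE / ANNUAL_OIL_MT

print(round(gas_years, 1))  # over 54 years of gas demand
print(round(oil_years, 1))  # roughly two and a half years of oil production
```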

Fracking can’t help any short term or medium term energy crisis.

Even if the industry was able to move ahead as fast as it wants, we wouldn’t see significant production until about 2025.

This is actually true and up to date. If it were not for the Climate Change Act, along with eco-activists blocking every move to meet real energy demands in the most affordable and efficient way possible, there would be no prospective energy crisis. In terms of shale gas meeting energy demands (and gas-fired power stations being built), FoE should claim some of the credit for preventing the rapid development of cheap and reliable energy sources, and thus for exacerbating fuel poverty.

Will fracking help us to tackle climate change?

Shale gas and shale oil are fossil fuels. They emit greenhouse gases. Avoiding the worst impacts of climate change means getting off fossil fuels as soon as possible.

Scientists agree that to stop dangerous climate change, 80% of fossil fuels that we know about need to stay in the ground.

Setting up a whole new fossil fuel industry is going in completely the wrong direction, if the UK is to do its fair share to stop climate change.

The hypothesis is that global emissions lead to higher levels of greenhouse gases. In respect of CO2 this is clear. But the evidence that the accelerating rise in CO2 levels has led to accelerating average global temperatures is strongly contradicted by real-world data. There is no scientific consensus that contradicts this conclusion. Further, there is no proper scientific evidence to suggest that climate is changing for the worse, if you look at the actual data, as leading climate scientist Dr John Christy does in this lecture. But even if the catastrophic global warming hypothesis were true (despite the weight of real-world data against it), global warming is global. Britain is currently emitting about 1.1% of global emissions. Even with all the recently discovered shale gas and oil deposits, under the UK is probably less than 1% of all estimated fossil fuel deposits. Keeping the fossil fuels under British soil in the ground will do nothing to change the global emissions situation. Britain tried to lead the way with the Climate Change Act of 2008, committing to reduce its emissions by 80% by 2050. The INDC submissions leading up to COP21 Paris in December 2015 clearly showed that the rest of the countries were collectively not following that lead. The UNFCCC produced a graph showing the difference the vague policy proposals might make. I have stuck on it the approximate emissions pathway to which the UK is committed.

The FoE is basically objecting to fracking to keep up the appearance that the UK is “doing its bit” to save the world from catastrophic global warming. But in the real world, global warming ain’t happening, and neither are the predicted catastrophes. Even if it were, whatever Britain does will make no difference. FoE is attempting to deny future jobs growth and stop the alleviation of fuel poverty to maintain the fantasy that Britain is leading the way on climate change.


Isn’t it better to have our own gas rather than importing it?


If we went all out for shale, our gas imports would stay at current levels as the North Sea supply declines – and imports could increase by 11%.

This claim, made without any reference, is likely based on the same out-of-date sources as the energy price claims above. If FoE and fellow-travellers had kept out of the way with their erroneous claims, then shale gas would have huge potential to cause imports to decline.

Kevin Marshall

Friends of the Earth still perverting the evidence for fracking harms

Yesterday, the Advertising Standards Authority at long last managed to informally resolve the complaints about a misleading leaflet by Friends of the Earth Trust and Friends of the Earth Ltd. This is no fault of the ASA. Rather FoE tried to defend the indefensible, drawing out the process much like they try to draw out planning inquiries. From the Guardian

“After many attempts by Friends of the Earth to delay this decision, the charity’s admission that all of the claims it made, that we complained about, were false should hopefully put a stop to it misleading the UK public on fracking,” said Francis Egan, chief executive of Cuadrilla. …..

According to the BBC

Friends of the Earth (FOE) must not repeat misleading claims it made in an anti-fracking leaflet, the advertising watchdog has said.

The fundraising flyer claimed fracking chemicals could pollute drinking water and cause cancer and implied the process increases rates of asthma.

The charity “agreed not to repeat the claims,” the Advertising Standards Authority (ASA) said.

All pretty clear. Yet as the BBC reports, the eco-worriers are not willing to be told that they are misleading the public.

Donna Hume, a campaigner for the environmental charity, said it would “continue to campaign against fracking” because it was “inherently risky for the environment”.


Ms Hume said Cuadrilla “started this process to distract from the real issues about fracking” and was trying to “shut down opposition”.

“It hasn’t worked though. What’s happened instead is that the ASA has dropped the case without ruling,” she said.

“We continue to campaign against fracking, alongside local people, because the process of exploring for and extracting shale gas is inherently risky for the environment, this is why fracking is banned or put on hold in so many countries.”

Donna Hume was just acting as a mouthpiece for FoE, who issued a misleading statement about the case. They stated

Last year fracking company Cuadrilla complained to the Advertising Standards Authority about one of our anti fracking leaflets.

But after more than a year, the complaint has been closed without a ruling.

The scientific evidence that fracking can cause harm to people and the environment keeps stacking up. Friends of the Earth is not alone in pointing out the risks of fracking, to the climate, to public health, of water contamination, and to the natural environment.

ASA Chief Executive Guy Parker, took the unusual step of setting the record straight.

But amidst the reports, the public comments by the parties involved and the social media chatter, there’s a risk that the facts become obscured.

So let me be clear. We told Friends of the Earth that based on the evidence we’d seen, claims it made in its anti-fracking leaflet or claims with the same meaning cannot be repeated, and asked for an assurance that they wouldn’t be. Friends of the Earth gave us an assurance to that effect. Unless the evidence changes, that means it mustn’t repeat in ads claims about the effects of fracking on the health of local populations, drinking water or property prices.

Friends of the Earth has said we “dropped the case”. That’s not an accurate reflection of what’s happened. We thoroughly investigated the complaints we received and closed the case on receipt of the above assurance. Because of that, we decided against publishing a formal ruling, but plainly that’s not the same thing as “dropping the case”. Crucially, the claims under the microscope mustn’t reappear in ads, unless the evidence changes. Dropped cases don’t have that outcome.

The ASA, which tries to be impartial and objective, had to take the unusual step of making a statement to combat FoE’s deliberate misinformation. So what is the scientific evidence that FoE claims? This is from the false statement that the ASA was forced to rebut.

The risks of fracking

In April 2016, a major peer-reviewed study by research institute PSE Healthy Energy was published in academic journal PLOS ONE, which assessed 685 pieces of peer-reviewed scientific literature from around the world over 2009-2015 and found:

  • “84% of public health studies contain findings that indicate public health hazards, elevated risks, or adverse health outcomes”

  • “69% of water quality studies contain findings that indicate potential, positive association, or actual incidence of water contamination”

  • “87% of air quality studies contain findings that indicate elevated air pollutant emissions and/or atmospheric concentrations”

I suggest readers actually read what is said. Hundreds of studies cannot identify, beyond reasonable doubt, a significant risk to human health. If any single study did establish this, it would be world news. It is just hearsay that would be dismissed by a criminal court in the UK. A clue comes from what the PLOS ONE journal does not include in its submission criteria, something that is normal in traditional journals: that submissions should have something novel to say about the subject area. As an online journal it does not have to pay its way by subscriptions, as authors usually have to pay a fee of $1495 prior to publication.

But this still leaves the biggest piece of misinformation that FoE harps on about, but was not included in the ruling. Below is the BBC’s two pictures of the leaflet.



It is the issue of climate change that goes unchallenged. Yet it is the most pernicious and misleading claim of the lot. If fracking goes ahead in the UK it will make not a jot of difference. According to the EU EDGAR data, the UK emitted just 1.1% of global GHG emissions in 2012. That proportion is falling principally because emissions are rising in other countries. It will continue to fall as emissions in developing countries rise, as those countries develop. That is China, India, the rest of South East Asia and 50+ African nations. These developing countries, which are exempt from any obligation to constrain emissions under the 1992 Rio Declaration, have 80% of the global population and accounted for over 100% of emissions growth between 1990 and 2012. I have summarized the EDGAR data below.


So who does the FoE speak for when they say “We could trigger catastrophic global temperature increases if we burn shale in addition to other fossil fuels“?

They do not speak for the USA, where shale gas has replaced coal, and where total emissions have reduced as a result, with real pollutants falling. There the bonus has been hundreds of thousands of extra jobs. They do not speak for China where half of the global increase (well 53%) in GHG emissions between 1990 and 2012 occurred. They cannot speak for Britain, as if it triggers massive falls in energy costs like in the USA (and geologically the North of England Bowland shale deposits look to be much deeper than the US deposits, so potentially cheaper to extract) then industry could be attracted back to the UK from countries like China with much higher emissions per unit of output.

Even worse, FoE do not speak for the British people. In promoting renewables, they are encouraging higher energy prices, which have led to increasing fuel poverty and increased winter deaths among the elderly. On the other hand, the claims of climate catastrophism from human emissions look far-fetched when this century’s global average temperature rises have stalled, when according to theory they should have increased at an accelerated rate.

Kevin Marshall

Update 7th Jan 11am

Ron Clutz has posted a good summary of the initial ruling, along with pointing to a blog run by retired minister Rev. Michael Roberts, who was one of the two private individuals (along with gas exploration company Cuadrilla) who made the complaint to ASA.

The Rev Roberts has a very detailed post on the 4th January giving extensive background history of FoE’s misinformation campaign against shale gas exploration in the Fylde. There is one link I think they should amend. The post finishes with what I believe to be a true statement.

Leaflet omits main reason for opposition is Climate change


The link is just to a series of posts on fracking in Lancashire. One of them is

Lancashire fracking inquiry: 3 reasons fracking must be stopped

The first reason is climate change. But rather than relate emissions to catastrophic global warming, they point to how allowing the development of fossil fuels appears in relation to Government commitments made in the Climate Change Act 2008 and the Paris Agreement. FoE presents their unbalanced case in much fuller detail in the mis-named Fracking Facts.

Update 2 7th Jan 2pm

I have rechecked the post Cuadrilla’s leaflet complaint is closed without a ruling, while evidence of fracking risks grows, where Friends of the Earth activist Tony Bosworth makes the grossly misleading statement that the ASA closed the case without a ruling. The claim is still there, but with no acknowledgement of the undertaking that FoE made to the ASA. FoE misled the public in order to gain donations, and now tries to hide the information from its supporters by misinformation. Below is a screenshot of the beginning of the article.

How strong is the Consensus Evidence for human-caused global warming?

You cannot prove a vague theory wrong. If the guess that you make is poorly expressed and the method you have for computing the consequences is a little vague then ….. you see that the theory is good as it can’t be proved wrong. If the process of computing the consequences is indefinite, then with a little skill any experimental result can be made to look like an expected consequence.

Richard Feynman – 1964 Lecture on the Scientific Method

It’s self-evident that democratic societies should base their decisions on accurate information. On many issues, however, misinformation can become entrenched in parts of the community, particularly when vested interests are involved. Reducing the influence of misinformation is a difficult and complex challenge.

The Debunking Handbook 2011 – John Cook and Stephan Lewandowsky

My previous post looked at the attacks on David Rose for daring to suggest that the rapid fall in global land temperatures after the El Nino event was strong evidence that the record highs in global temperatures were not due to human greenhouse gas emissions. The technique used to attack him was to look at long-term linear trends. The main problems with this argument were
(a) according to AGW theory warming rates from CO2 alone should be accelerating and at a higher rate than the estimated linear warming rates from HADCRUT4.
(b) HADCRUT4 shows warming stopped from 2002 to 2014, yet in theory the warming from CO2 should have accelerated.
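Point (a) can be illustrated numerically. Under the log-forcing simplification used in these posts (warming = sensitivity × log2 of the concentration ratio), an accelerating rise in CO2 levels implies an accelerating warming rate. The CO2 levels below are approximate Mauna Loa annual averages; applying the equilibrium sensitivity to instantaneous concentrations overstates the transient response, so the point here is the direction, not the magnitudes:

```python
from math import log2

SENSITIVITY = 3.0  # assumed C per doubling of CO2 (the IPCC central estimate)

def warming(co2_ppm, baseline=280.0):
    return SENSITIVITY * log2(co2_ppm / baseline)

# Approximate Mauna Loa annual-average CO2 levels at decade intervals.
levels = [(1982, 341.0), (1992, 356.0), (2002, 373.0), (2012, 394.0)]

for (y0, c0), (y1, c1) in zip(levels, levels[1:]):
    rate = warming(c1) - warming(c0)  # implied equilibrium warming per decade
    print(f"{y0}-{y1}: {rate:.2f} C/decade")
```

The implied warming rate rises each decade, which is exactly the acceleration that the HADCRUT4 record for 2002 to 2014 fails to show.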

Now there are at least two ways to view my arguments. First is to look at Feynman’s approach. The climatologists and associated academics attacking journalist David Rose chose to do so from the perspective of a very blurred specification of AGW theory: that human emissions will cause greenhouse gas levels to rise, which will cause global average temperatures to rise. Global average temperatures clearly have risen in all long-term (>40 year) data sets, so the theory is confirmed. On a rising trend, with large variations due to natural variability, any new records will be primarily “human-caused”. But making the theory and data slightly less vague reveals the opposite conclusion. Around the turn of the century the annual percentage increase in CO2 levels went from about 0.4% to 0.5% a year (figure 1), which should have led to an acceleration in the rate of warming. In reality warming stalled.

The reaction was to come up with a load of ad hoc excuses. The Hockey Schtick blog had reached 66 separate excuses for the “pause” by November 2014, ranging from the peer-reviewed literature to a comment in the UK Parliament. This could be because climate is highly complex, with many variables: whether each factor contributes can only be guessed at, let alone its magnitude and its interrelationships with all the other factors. So how do you tell which statements are valid information and which are misinformation? I agree with Cook and Lewandowsky that misinformation is pernicious, and difficult to get rid of once it becomes entrenched. So how does one distinguish between the good information and the bad, misleading or even pernicious?

The Lewandowsky / Cook answer is to follow the consensus of opinion. But what is the consensus of opinion? In climate one variation is to follow a small subset of academics in the area who answer in the affirmative to

1. When compared with pre-1800s levels, do you think that mean global temperatures have generally risen, fallen, or remained relatively constant?

2. Do you think human activity is a significant contributing factor in changing mean global temperatures?

The problem is that the first question is just reading a graph, and the second is a belief statement with no precision. Anthropogenic global warming has been a hot topic for over 25 years now, yet these two very vague empirically-based questions form the foundations of the subject. They should be able to be formulated more precisely. The second should be a case of having pretty clear and unambiguous estimates of the percentage of warming, so far, that is human-caused. On that, the consensus of leading experts is unable to say whether it is 50% or 200% of the warming so far. (There are meant to be time lags and factors like aerosols that might suppress the warming.) This is from the 2013 UNIPCC AR5 WG1 SPM section D3:-

It is extremely likely that more than half of the observed increase in global average surface temperature from 1951 to 2010 was caused by the anthropogenic increase in greenhouse gas concentrations and other anthropogenic forcings together.

The IPCC, encapsulating the state-of-the-art knowledge, cannot provide firm evidence in the form of a percentage, or even a fairly broad range, despite having over 60 years of data to work on. It is even worse than it appears. The extremely likely phrase is a Bayesian probability statement. Ron Clutz’s simple definition from earlier this year was:-

Here’s the most dumbed-down description: Initial belief plus new evidence = new and improved belief.

For the IPCC to claim that their statement was extremely likely, at the fifth attempt, they should be able to show some sort of progress in updating their beliefs in response to new evidence. That would mean narrowing the estimate of the magnitude of the impact of a doubling of CO2 on global average temperatures. As Clive Best documented in a cliscep comment in October, the IPCC reports from 1990 to 2013 failed to change the estimated range of 1.5°C to 4.5°C. Looking up Climate Sensitivity in Wikipedia we get the origin of the range estimate.

A committee on anthropogenic global warming convened in 1979 by the National Academy of Sciences and chaired by Jule Charney estimated climate sensitivity to be 3 °C, plus or minus 1.5 °C. Only two sets of models were available; one, due to Syukuro Manabe, exhibited a climate sensitivity of 2 °C, the other, due to James E. Hansen, exhibited a climate sensitivity of 4 °C. “According to Manabe, Charney chose 0.5 °C as a not-unreasonable margin of error, subtracted it from Manabe’s number, and added it to Hansen’s. Thus was born the 1.5 °C-to-4.5 °C range of likely climate sensitivity that has appeared in every greenhouse assessment since…

It is revealing that the quote is under the subheading Consensus Estimates. The climate community have collectively failed to update the original beliefs, which were based on a very rough estimate. The emphasis on referring to consensus beliefs about the world, rather than looking outward for evidence in the real world, is, I would suggest, the primary reason for this failure. Yet such community-based beliefs completely undermine the integrity of the Bayesian estimates, making their use in statements about climate clear misinformation in Cook and Lewandowsky’s use of the term. What is more, those in the climate community who look primarily to these consensus beliefs rather than to real-world data will endeavour to dismiss the evidence, or make up ad hoc excuses, or smear those who try to disagree. A caricature of these perspectives with respect to global average temperature anomalies is available in the form of a flickering widget at John Cook’s skepticalscience website. This purports to show the difference between “realist” consensus and “contrarian” non-consensus views. Figure 2 is a screenshot of the consensus view, interpreting warming as a linear trend. Figure 3 is a screenshot of the non-consensus or contrarian view, which is supposed to interpret warming as a series of short, disconnected periods of no warming, where each period just happens to be at a higher level than the previous one. There are a number of things that this indicates.

(a) The “realist” view is of a linear trend throughout any data series. Yet the period from around 1940 to 1975 shows no warming, or slight cooling, depending on the data set. Therefore any linear trend line starting before 1975 and ending in 2015 will show a lower rate of warming than one starting in 1975. This would be consistent with the rate of CO2 increase rising over time, as shown in figure 1. But shorten the period, again ending in 2015, and once the period becomes less than 30 years the warming trend will also decrease. This contradicts the theory, unless ad hoc excuses are used, as shown in my previous post using the HADCRUT4 data set.
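The dependence of a fitted linear trend on the chosen window can be illustrated with a toy anomaly series. The numbers below are synthetic, chosen only to mimic the shape described (flat to 1975, warming to 2000, then a pause); they are not taken from HADCRUT4 or any real data set.

```python
# Toy illustration of how a least-squares warming trend depends on the
# window chosen. The anomaly series is synthetic: flat to 1975, rising
# at 1.2 C/century to 2000, then flat. Not real data.

def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def toy_anomaly(year):
    if year <= 1975:
        return 0.0                    # mid-century standstill
    if year <= 2000:
        return 0.012 * (year - 1975)  # late-century warming phase
    return 0.3                        # post-2000 plateau

years = list(range(1940, 2016))
anoms = [toy_anomaly(y) for y in years]

# Fit trends all ending in 2015 but starting at different years
for start in (1940, 1975, 2000):
    xs = [y for y in years if y >= start]
    ys = [toy_anomaly(y) for y in xs]
    print(start, round(ols_slope(xs, ys), 4))
# The 1975-2015 window shows the steepest fitted trend; extending the
# window back before 1975, or restricting it to the post-2000 years,
# both lower the fitted rate of warming.
```

The same qualitative behaviour is what the paragraph above describes for the real anomaly series: the fitted rate is an artefact of the window as much as of the data.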

(b) Those who agree with the consensus are called “Realists”, despite looking inwards towards common beliefs. Those who disagree are labelled “Contrarians”. That label is not inaccurate when there is a dogmatic consensus. But it is utterly false to lump all those who disagree together as holding the same views, especially when no examples are provided of anyone who actually holds such views.

(c) The linear trend appears a more plausible fit than the series of “contrarian” lines. By implication, those who disagree with the consensus are presented as having a distinctly more blinkered and distorted perspective than those who follow it. Yet even in the gistemp data set (which gives the greatest support to the consensus views) there is a clear break in the linear trend. The less partisan HADCRUT4 data shows an even greater break.

Those who spot the obvious – that around the turn of the century warming stopped or slowed down, when in theory it should have accelerated – are given a clear choice. They can conform to the scientific consensus, denying the discrepancy between theory and data. Or they can act as scientists, denying the false and empirically empty scientific consensus, and receive the full weight of the career-damaging opprobrium that accompanies doing so.





Kevin Marshall


Jeremy Corbyn needs to do the Maths on Boundary Commission Proposals

In my previous post I noted how some Labour MPs were falsely claiming that the Boundary Commission’s recommendations for England and Wales were party-political gerrymandering. Labour Party Leader, the Rt Hon Jeremy Corbyn MP, makes a quite different claim from that of some of his more desperate MPs.

Corbyn claims that since last December (which the Boundary Commission used as a basis of the boundary changes) the electorate has grown by two million people. That is nearly 5% of the electorate. As a result of the wrong figures “you cannot deliver a fair and democratic result on the basis of information that is a year out of date.”
Actually it would still be fair and democratic if the growth in the electorate were evenly spread across the country. That should be the default position, which Corbyn needs to disprove. The question is how uneven the growth would have to be to wipe out the disadvantage Labour suffers from the boundary review – a disadvantage due to the current 231 Labour seats in England and Wales having on average 3515 fewer constituents than the 329 Conservative seats in May 2015.

Let us do the maths, ignoring the 13 seats held by other parties and the Speaker. To even up average constituency size, Labour constituencies would need about 812,000 extra voters (231 x 3515), with the rest of the two million spread evenly across the 560 Labour and Conservative constituencies. That is about 2120 extra voters each. It is not impossible that the average Labour constituency has added 5635 to the electoral roll (>8% extra) while the average Conservative constituency has added 2120 (<3% extra). Winning millions on the Lotto is not impossible either. But both are highly unlikely, as the reason for the Boundary Review is that constituency sizes have diverged, with greater growth in the South of England than in the North of England and Wales. So, like other Labour MPs, Jeremy Corbyn’s opposition to the Boundary Commission’s proposals seems to be opposition to greater equality and fairness in British democratic processes.
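As a beancounter’s check, the arithmetic above can be reproduced in a few lines, using only the figures quoted in this post:

```python
# Sanity-check of the boundary-review arithmetic. Seat counts and the
# 3515-voter gap are the figures quoted in the post; the two million is
# Corbyn's claimed growth in the electorate since last December.
labour_seats = 231
conservative_seats = 329
gap_per_labour_seat = 3515       # fewer electors per Labour seat on average
total_new_electors = 2_000_000

# Extra electors Labour seats need just to close the gap with the Conservatives
catch_up = labour_seats * gap_per_labour_seat
print(catch_up)                  # 811965, i.e. about 812,000

# Spread the remainder evenly across all 560 Labour and Conservative seats
baseline = (total_new_electors - catch_up) / (labour_seats + conservative_seats)
print(round(baseline))           # 2121 extra voters per seat

# Implied growth per average Labour seat for Corbyn's claim to hold
labour_growth = gap_per_labour_seat + baseline
print(round(labour_growth))      # 5636, over 8% of a typical constituency
```

The rounding in the post (2120 and 5635) matches these figures to within a few voters.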
Two graphs to illustrate this point. Figure 1 from the previous post shows the average constituency size by party and region.

Figure 4 from the previous post shows that average constituency size per region is made much closer to the average constituency size for England and Wales in the proposed changes.


Kevin Marshall

Are the Proposed Boundary Changes Designed to hurt the Labour Party?

Yesterday the proposed new boundaries for England and Wales were published by the Boundary Commission. Nationally the total number of constituencies will be reduced from 650 to 600, still leaving Britain with one of the largest numbers of representatives of any democratic parliament. In England the reduction is from 533 to 501 and in Wales from 40 to 29. The UK Polling Report website reports

The changes in England and Wales result in the Conservatives losing 10 seats, Labour losing 28 seats, the Liberal Democrats losing 4 and the Greens losing Brighton Pavilion (though notional calculations like these risk underestimating the performance of parties with isolated pockets of support like the Greens and Lib Dems, so it may not hit them as hard as these suggest).

The Guardian Reports under the banner Boundary changes are designed to hurt us at next election, says Labour MP

Jon Ashworth, the shadow Cabinet Office minister leading the process for Labour, said the party was convinced the proposals were motivated by party politics.

The Manchester Evening News carries this comment

Jonathan Reynolds, Labour MP for Stalybridge and Hyde, accused the Conservatives of ‘old-fashioned gerrymandering’.
“I will contest these proposals, because I believe they are a naked attempt to increase the electoral prospects of the Conservative Party at the expense of coherent parliamentary representation,” he said.

This is quite a serious claim to make, particularly as the Boundary Commission clearly states

The Boundary Commission for England is the independent and impartial body that is considering where the boundaries of the new constituencies should be. We must report to Parliament in September 2018.
In doing so, we have to ensure that every new constituency has roughly the same number of electors: no fewer than 71,031 and no more than 78,507. While proposing a set of boundaries which are fairer and more equal, the Commission will also try to reflect geographic factors and local ties. The Commission will also look at the boundaries of existing constituencies and local government patterns in redrawing the map of parliamentary constituency boundaries across England.
In undertaking the 2018 Review, we rely heavily on evidence from the public about their local area. Though we have to work within the tight electorate thresholds outlined above, we seek to recommend constituency boundaries that reflect local areas as much as we can. You can find more detailed guidance in our Guide to the 2018 Review.

I thought I would look at the figures myself to see whether the Boundary Commission has done a fair job overall, or has basically lied, producing a deliberately partisan result that the UK Polling Report has been complicit in supporting.
For previous posts I downloaded the results of the May 2015 General Election by constituency. I then split the results into the regions of England and Wales.
Figure 1 shows the average size of constituency by Region and Party. Spk is the Speaker of the House of Commons.

On average the Conservative-held constituencies had 3515 more voters than Labour-held ones. But there are large regional differences. Figure 2 shows the number of constituencies by region and political party.

In the South East and South West, where Labour have larger average constituency sizes, they hold very few seats. In these regions the regional average seat size is greater than the England and Wales average, so there will be proportionately fewer seat reductions. The Conservatives, with the vast majority of seats in these regions, do not lose from a reduction in the national total and a more equitable distribution. In the East Midlands, West Midlands and Yorkshire and The Humber, Labour are well represented but have smaller average seat sizes than the Conservatives. In the North West and in Wales Labour are well represented and the average sizes of Labour seats are similar to those of Conservative seats, but the regional average seat sizes are smaller than the England and Wales average. Smaller average seat sizes in these regions will hit Labour harder than the Conservatives due to Labour’s higher representation.
The only exception to the England and Wales picture is London. The region has larger than average constituencies at present, the average Labour constituency is bigger than the average Conservative one, and over 60% of the 73 constituencies are Labour held. But the region, and Labour, still lose seats, though proportionately fewer than elsewhere.
The effect of the revisions is shown in the average seat size. In Figure 3, with fewer seats the average seat size increases, but in some regions by far more than others, resulting in much less regional variation after the proposed boundary changes.

Figure 4 emphasizes the more even distribution of seat size. Currently, the variation of average constituency by region from the England and Wales average is between -14740 (Wales) and +4517 (South East). Under the proposals, the variation is between -1160 (East Midlands) and +2135 (London).

In London’s case an argument could be made for another two constituencies, but this is hardly significant. Also, given that London MPs spend their week far nearer to their constituents than those of any other region, an extra 2-3% of people to represent is hardly a burden.
Finally, in Figure 5 I have made my own estimate of the impact on Labour, Conservative and Green seats, based on changes in regional average seat sizes. Even though I do not include the Lib Dems, the results are very similar to UK Polling Report’s. The much more even (and therefore fairer) distribution of seats, along with a modest reduction in the total, disadvantages the Labour Party far more than the Conservatives, despite the Conservatives holding the majority of the seats.

The Labour Party MPs who are doubting the independence of the Boundary Commission should apologize. The evidence is clearly against them.

Kevin Marshall

Going for Brexit or denying the EU Referendum

The Rt Hon David Davis MP, Secretary of State for Exiting the EU, gave an update to the House of Commons today. He made quite clear what Brexit means

Naturally, people want to know what Brexit will mean.
Simply, it means the UK leaving the European Union. We will decide on our borders, our laws, and taxpayers’ money.
It means getting the best deal for Britain – one that is unique to Britain and not an ‘off the shelf’ solution. This must mean controls on the numbers of people who come to Britain from Europe – but also a positive outcome for those who wish to trade in goods and services.

He went on to lay out the principles on which Britain would proceed.

…as we proceed, we will be guided by some clear principles. First, as I said, we wish to build a national consensus around our position. Second, while always putting the national interest first, we will always act in good faith towards our European partners. Third, wherever possible we will try to minimise any uncertainty that change can inevitably bring. And, fourth, crucially, we will – by the end of this process – have left the European Union, and put the sovereignty and supremacy of this Parliament beyond doubt.

In other words, Britain will Brexit in a very British fashion.

– It will be from principles, not from specific objectives or adhering to specific rules.
– Britain will act honourably, something that the British have long been known for in commercial dealings.
– It will recognize that other EU members have interests as well. The outcome being aimed for is where Britain’s relationship to the EU is based on co-operation and trade where both sides are net winners.
– At the end of the process Britain will have a more sovereign Parliament. That is, the democratically elected Government will be able to decide the future course of the country, for better or worse.

Text is at ConservativeHome
Emily Thornberry MP, speaking for the Labour Party, gave a somewhat different perspective from about 13:10

– Strategy consists of a clearly laid out and concrete plan.
– There are areas of policy that should be placed outside the scope of a sovereign Parliament, such as “workers’ rights” and guarantees for EU nationals currently resident in the UK.
– A “positive vision” consists of definite objectives.
– You listen to outside gloomy prophecies that support your perspective.
– The Government is rushing to start negotiations without a well-thought-out plan. Given that the Government is delaying triggering Article 50 until 2017, this means she wants an even slower pace. But on 24th June, when the referendum result was announced, Labour Leader Jeremy Corbyn was all for triggering Article 50 straight away. Is this another open split with the Labour Leader, or an about-face in Labour policy?
– Article 50 should not be triggered without a parliamentary vote to authorize it.

On triggering Article 50, David Davis pointed out (at 20:35) that there was a referendum bill that went through the House of Commons and was voted for by 6 to 1. Emily Thornberry voted in favour. It was made perfectly clear by the then Foreign Secretary that the EU referendum was not a consultation, or advice to Parliament, but a decision by the electorate. The words of the Act do not state that, but people were led to believe it during the campaign. Most importantly, Will Straw, leader of Britain Stronger in Europe (the official Remain campaign), said the decision was for the voters.

On 23rd June you will get to vote on the EU Referendum and decide whether Britain remains in or leaves Europe.

Apart from the inaccuracy of framing the decision as whether to leave the geographical continent rather than the political organisation, the statement could not be clearer. Yet the losers in the referendum want to re-interpret the meaning of the result.

Kevin Marshall

Guardian Images of Global Warming Part 2 – A Starved Dead Polar Bear

In Part 2 of my look at Ashley Cooper’s photographs of global warming, published in The Guardian on June 3rd, I concentrate on a single image of a dead, emaciated polar bear.
The caption reads

A male polar bear that starved to death as a consequence of climate change. Polar bears need sea ice to hunt their main prey, seals. Western fjords of Svalbard which normally freeze in winter, remained ice free all season during the winter of 2012/13, one of the worst on record for sea ice around the island archipelago. This bear headed hundreds of miles north, looking for suitable sea ice to hunt on before it finally collapsed and died.

The US National Snow and Ice Data Center (NSIDC) has monthly maps of sea ice extent. The Western Fjords were indeed ice free during the winter of 2012/13, even in March 2013 when sea ice reaches its maximum. In March 2012 the Western Fjords were also ice free, as was most of the North Coast. Maps are also available for March of 2011, 2010, 2009 and 2008. The earliest available year seems to have the minimum extent. Screenshots of Svalbard are shown below.

As the sea ice extent has been diminishing for years, maybe this has impacted the polar bear population? This is not the case. A survey published late last year showed that polar bear numbers increased by 42% between 2004 and 2015 for Svalbard and the neighbouring archipelagos of Franz Josef Land and Novaya Zemlya.

Even more relevantly, studies have shown that the biggest threat to polar bears is not low sea ice levels but unusually thick spring sea ice. This affects the seal population, the polar bears’ main food source, at the time of year when the bears are rebuilding fat after the long winter.
Even if diminishing sea ice is a major cause of some starvation, it may have been a greater cause in the past. There is no satellite data prior to the late 1970s, when sea ice levels started diminishing. The best proxies are average temperatures. Last year I looked at the two major temperature data sets for Svalbard, both located on the West Coast where the dead polar bear was found. It would appear that there was a more dramatic rise in temperatures in Svalbard in the period 1910-1925 than in the period since the late 1970s. But in the earlier warming period polar bear numbers were likely decreasing, and they continued to decrease into the later cooling period; the recovery in numbers corresponds to the recent warming period. These changes have nothing to do with average temperatures or sea ice levels. It is because until recent decades polar bears were hunted, a practice that has now largely stopped.

The starvation of this pictured polar bear may have a more mundane cause. Polar bears are at the top of the food chain, relying on killing fast-moving seals for food. As a polar bear gets older it slows down, due to arthritis and muscles not working as well. As speed and agility are key factors in catching food, along with a bit of luck, starvation might be the most common cause of death in polar bears.

Kevin Marshall

Guardian Images of Global Warming Part 1 – Australian Droughts

On Friday June 3rd the Guardian presented some high quality images with the headline

Droughts, floods, forest fires and melting poles – climate change is impacting Earth like never before. From Australia to Greenland, Ashley Cooper’s work spans 13 years and over 30 countries. This selection, taken from his new book, shows a changing landscape, scarred by pollution and natural disasters – but there is hope too, with the steady rise of renewable energy.

The purpose is to convince people that human-caused climate change is happening now, to bolster support for climate mitigation policies. But the real stories of what the pictures show are quite different. I will start with three images relating to drought in Australia.

Image 5

Forest ghosts: Lake Eildon in Victoria, Australia was built in the 1950’s to provide irrigation water, but the last time it was full was in 1995. The day the shot was taken it was at 29% capacity with levels down around 75ft.

Data from Lake Eildon (which is accessible with a simple search of Lake Eildon capacity) links to a graph where up to 7 years of data can be compared.

In 1995 the dam was not at full capacity, but it was full, for a short period, in the following year. However, more recently after the recent drought broke, in 2011 the reservoir was pretty much full for all the year.

But were the low levels due to more extreme drought brought on by climate change? That is very difficult to determine, as Lake Eildon is an artificial lake, constructed to provide water for irrigation and occasional hydro-electric power, as well as recreational facilities. The near-empty levels at the end of the biggest drought in many decades could simply be due to a failure to predict the duration of the drought, or to a policy of supplying irrigation water for the maximum length of time. A further reason why water levels did not reach full capacity for many years is indicated by a 2003 article in The Age

The dam wall at Lake Eildon, Victoria’s biggest state-run water storage, has been declared unsafe and will need a $30 million upgrade if the lake is to be refilled.

The dam, which is at its lowest level since being completed in 1956, will be restricted to just 65 per cent capacity because it no longer meets safety standards for earthquakes and extreme floods.

Image 6

Forest destroyed by bush fires near Michelago, New South Wales, Australia.

The inference is that this is caused by global warming.

According to Munich Re

The majority of bushfires in southeast Australia are caused by human activity

Bushfire is the only natural hazard in which humans have a direct influence on the hazard situation. The majority of bushfires near populated areas are the consequence of human activity. Lightning causes the smaller portion naturally. Sometimes, a carelessly discarded cigarette or a glass shard, which can focus the sun’s rays is all it takes to start a fire. Heat from motors or engines, or electric sparks from power lines and machines can ignite dry grass. Besides this accidental causes, a significant share of wildfires are started deliberately.

Humans also change the natural fire frequency and intensity. They decrease the natural fire frequency due to deliberate fire suppression near populated areas. If there is no fuel-reduction burning in forests for the purposes of fire prevention, large quantities of combustible material can accumulate at ground level.

Surface fires in these areas can become so intense due to the large amounts of fuel that they spread to the crowns of the trees and rapidly grow into a major fire. If humans had not intervened in the natural bushfire regime, more frequent low-intensity fires would have consumed the forest undergrowth and ensured that woodland grasses and scrubs do not proliferate excessively.

David Evans expands on the issue of fuel load in a 2013 article.

As with the water levels in an artificial lake, forest fires are strongly influenced by the management of those forests. Extinguishing forest fires before they have run their natural course results in bigger and more intense fires at a later date. More frequent or intense droughts would not change this primary cause of many horrific forest fire disasters seen in recent years.

Image 7

Where has all the water gone?: Lake Hume is the largest reservoir in Australia and was set up to provide irrigation water for farms further down the Murray Basin and drinking water for Adelaide. On the day this photograph was taken it was at 19.6% capacity. By the end of the summer of 2009 it dropped to 2.1 % capacity. Such impacts of the drought are likely to worsen as a result of climate change. The last time the water was anywhere near this road bridge was 10 years ago, rendering this no fishing sign, somewhat redundant.

Again this is old data. As for Lake Eildon, it is easy to construct graphs.

Following the end of the drought, the reservoir came back to full capacity. Worsening drought is only apparent to those who look over a short time range.

When looking at drought in Australia, Dorothea Mackellar’s 1908 poem “My Country” provides some context. Written for a British audience, the poem begins

I love a sunburnt country,

A land of sweeping plains,

Of ragged mountain ranges,

Of droughts and flooding rains

To understand the difference that human-caused climate change is having on the climate first requires an understanding of natural climatic variation over multiple time-scales. It then requires an understanding of how other human factors are influencing the environment, both intended and unintended.

Kevin Marshall

Britain Stronger in Europe Letter

I received a campaign letter from Britain Stronger in Europe today headed


Putting the “RE:” in front is a bit presumptuous. It is not a reply to my request. However, I believe in looking at both sides of the argument, so here is my analysis. First the main points in the body of the letter:-

  1. JOBS – Over 3 million UK jobs are linked to our exports to the EU.
  2. BUSINESSES – Over 200,000 UK Businesses trade with the EU, helping them create jobs in the UK.
  3. FAMILY FINANCES – Leaving the EU will cost the average UK household at least £850 a year, and potentially as much as £1,700, according to research released by the London School of Economics.
  4. PRICES – Being in Europe means lower prices in UK shops, saving the average UK household over £350 a year. If we left Europe, your weekly shop would cost more.
  5. BENEFITS vs COSTS – For every £1 we put into the EU, we get almost £10 back through increased trade, investment, jobs, growth and lower prices.
  6. INVESTMENT – The UK gets £66 million of investment every day from EU countries – that’s more than we pay to be a member of the EU.

The first two points are facts, but only show part of the picture. The UK not only exports to the EU, but also imports. Indeed there is a net deficit with the EU, and a large deficit in goods. It is only due to a net surplus in services – mostly in financial services based in the City of London – that the trade deficit is not larger. The ONS provides a useful graphic illustrating both the declining share of exports to the EU, and the increasing deficit, reproduced below.

No one in the UK is suggesting that Brexit would mean an end to trade with the EU, and it would be counter-productive for the EU not to reach a trade agreement with an independent UK when the EU has this large surplus.

The impact on FAMILY FINANCES is based upon work by the Centre for Economic Performance, an LSE-affiliated organisation. There is both a general paper and a technical paper to back up the claims. They are modelled estimates of the future, not facts. The modelled costs assume Britain exits the European Union without any trade agreements, despite such agreements being in the economic interests of both the UK and the EU. The report also performs a sleight of hand in estimating the contributions the UK would make post-Brexit. From page 18 of the technical paper

We assume that the UK would keep contributing 83% of the current per capita contribution as Norway does in order to remain in the single market (House of Commons, 2013). This leads to a fiscal saving of about 0.09%.

The table at the foot of report page 22 (pdf page 28) gives the breakdown of the estimate from 2011 figures. The Norway figures are gross and contain a fixed-cost element; the UK economy is about six times the size of Norway’s, so it would not end up spending nearly as much per capita even on the same basis. The UK figure, by contrast, is a net figure: the UK pays into the EU twice as much as it gets out. Ever since joining the Common Market in 1973, Britain has been the biggest loser in terms of net contributions, despite the rebates that Mrs Thatcher secured with much effort in the 1980s.

The source of the PRICES claim is again the Centre for Economic Performance, but again with no direct reference. I assume it is from the same report and forms part of the modelled forecast costs.

The BENEFITS vs COSTS statement is not comparing like with like. The alleged benefits to the UK are not all due to being a member of a club, but are a consequence of being an open economy trading with its neighbours. A true BENEFITS vs COSTS comparison would set future scenarios of Brexit against future scenarios of Remain. Leading economist Patrick Minford has published a paper for the Institute of Economic Affairs which finds a net benefit in leaving, particularly when likely future economic growth is taken into account.

The INVESTMENT issue is just part of the BENEFITS vs COSTS statement. So, like the PRICES statement, it is making one point into two.

In summary, Britain Stronger in Europe claims I need to know six facts relevant to the referendum decision, but actually fails to provide one. The actual facts stated are not solely due to the UK being a member of the European Union, whilst the remaining statements are opinions based on modelled future scenarios that are unlikely to happen. The choice is between various possible future scenarios inside the European Union and various possible future scenarios outside it. The case for Remain should be proclaiming the achievements of the European Union in making a positive difference to the lives of the 500 million people in its 28 member states, along with future pathways where it will build on those achievements. The utter lack of such arguments is, in my opinion, the strongest argument for voting to leave.

Kevin Marshall


Copy of letter from Britain Stronger in Europe