Climate Delusions 1 – Karl et al 2015 propaganda

This is the first in a planned series of climate delusions. These are short pieces on where the climate alarmists are either deluding themselves, or deluding others, about the evidence to support the global warming hypothesis; the likely implications of changing the climate; the consequential implications of a changing / changed climate; or the associated policies to either mitigate or adapt to the harms. In each case I will make suggestions of ways to avoid the delusion.

Why is the Karl et al 2015 paper, Possible artifacts of data biases in the recent global surface warming hiatus proclaimed to be the pause-buster?

The concluding comments to the paper give the following boast:

Newly corrected and updated global surface temperature data from NOAA’s NCEI do not support the notion of a global warming “hiatus.”  …..there is no discernable (statistical or otherwise) decrease in the rate of warming between the second half of the 20th century and the first 15 years of the 21st century. Our new analysis now shows that the trend over the period 1950–1999, a time widely agreed as having significant anthropogenic global warming (1), is 0.113°C decade−1 , which is virtually indistinguishable from the trend over the period 2000–2014 (0.116°C decade−1 ). Even starting a trend calculation with 1998, the extremely warm El Niño year that is often used as the beginning of the “hiatus,” our global temperature trend (1998–2014) is 0.106°C decade−1 —and we know that is an underestimate because of incomplete coverage over the Arctic. Indeed, according to our new analysis, the IPCC’s statement of 2 years ago—that the global surface temperature “has shown a much smaller increasing linear trend over the past 15 years than over the past 30 to 60 years”—is no longer valid.

An opinion piece in Science, Much-touted global warming pause never happened, basically repeats these claims.

In their paper, Karl’s team sums up the combined effect of additional land temperature stations, corrected commercial ship temperature data, and corrected ship-to-buoy calibrations. The group estimates that the world warmed at a rate of 0.086°C per decade between 1998 and 2012—more than twice the IPCC’s estimate of about 0.039°C per decade. The new estimate, the researchers note, is much closer to the rate of 0.113°C per decade estimated for 1950 to 1999. And for the period from 2000 to 2014, the new analysis suggests a warming rate of 0.116°C per decade—slightly higher than the 20th century rate. “What you see is that the slowdown just goes away,” Karl says.

The Skeptical Science temperature trend calculator gives very similar results. For 1950-1999 it gives a linear trend of 0.112°C decade−1 against Karl's 0.113°C decade−1, and for 2000-2014 it gives 0.097°C decade−1 against Karl's 0.116°C decade−1. There is no real sign of a slowdown.

However, looking at any temperature anomaly chart, whether Karl, NASA Gistemp, or HADCRUT4, it is clear that the period 1950-1975 showed little or no warming, whilst the last quarter of the twentieth century showed significant warming. This is confirmed by the Sks trend calculator figures in Figure 1.

What can be clearly seen is that the claim of no slowdown in the twenty-first century compared with previous years depends on the selection of the period. To repeat the Karl et al. concluding claim:

Indeed, according to our new analysis, the IPCC’s statement of 2 years ago—that the global surface temperature “has shown a much smaller increasing linear trend over the past 15 years than over the past 30 to 60 years”—is no longer valid.

The period 1976-2014 is in the middle of the range, and from the Sks temperature trend calculator the figure is 0.160°C decade−1. That trend is significantly higher than the 0.097 for 2000-2014, so a slowdown has taken place. Any remotely competent peer review would have checked this, the paper's most startling claim. The comparative figures from HADCRUT4 are shown in Figure 2.
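The period-selection effect is easy to verify. Below is a minimal sketch (my own illustration, using a synthetic anomaly series shaped like the real data – flat to 1975, warming thereafter – not the actual Karl or HADCRUT4 figures) of the ordinary least squares trend calculation that the Sks trend calculator performs:

```python
def trend_per_decade(years, anomalies):
    """Ordinary least squares slope, expressed in degC per decade."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(anomalies) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, anomalies))
    var = sum((x - mean_x) ** 2 for x in years)
    return 10 * cov / var  # degC/yr -> degC/decade

# Synthetic series mimicking the shape discussed: flat 1950-1975,
# then warming at 0.17 degC/decade from 1976 onwards.
years = list(range(1950, 2015))
anoms = [0.0 if y < 1976 else 0.017 * (y - 1976) for y in years]

full = trend_per_decade(years[:50], anoms[:50])      # 1950-1999
late = trend_per_decade(years[26:50], anoms[26:50])  # 1976-1999
print(round(full, 3), round(late, 3))
```

Running it shows the 1950-1999 trend diluted well below the 1976-1999 trend by the flat early years, which is the whole point at issue.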

With the HADCRUT4 temperature trend it is not so easy to claim that there is no significant slowdown. But the Karl et al paper's full claim to be a pause-buster can only be made by a combination of recalculating the temperature anomaly figures and selecting the 1950-1999 period for comparison with the twenty-first century warming. It is the latter part that makes the “pause-buster” claims a delusion.

Kevin Marshall


Stoke Central By-Election – Labour’s achievement in statistics

Yesterday’s Parliamentary By-Elections were quite significant. The number of firsts in the result in Copeland has been gone over in fine detail. But in Stoke Central the winning Labour candidate, Gareth Snell, can point to some records and distinctions of his own. Purely in the interests of balance, I would like to help out. 🙂

Of the 650 MPs currently in the House of Commons, he will have the distinction of being elected on the fewest votes cast. Snell, in winning with 7853 votes, has displaced from bottom place Angus MacNeil, SNP MP for Na h-Eileanan An Iar, who won with just 8662 votes. But that constituency, covering the Hebrides, has less than half the Stoke-on-Trent Central electorate. Further, 94% of sitting MPs won their seats with at least twice Snell's number of votes. In 2015 Tristram Hunt won Stoke Central with just 19.3% of the electorate voting for him – the lowest in England. Gareth Snell MP won with just 14.2% of the electorate voting for him, the lowest in Britain. Bottom place was previously held by Alasdair McDonnell, SDLP MP for Belfast South, with 14.7% of the electorate voting for him. But in Belfast South six candidates saved their deposit, and the seventh-placed UKIP candidate just missed out on 4.9% of the vote. In Stoke only four candidates saved their deposit, and the fifth-placed Green candidate got only 1.4% of the vote. Whilst in Belfast South the majority was 2.3% of the votes cast, in Stoke Central it was 12.4%.

Another statistic is to look at the runners-up in the 2015 General Election. 560 of the 650 second-placed candidates received more than Gareth Snell's 7853 votes. On average in GE 2015 the winners received 23634 votes and the runners-up 12121 votes, respectively 3 times and 1.5 times Snell's mighty vote count. Although there were just 232 Labour MPs elected in 2015, 506 Labour candidates received more than the 7853 votes Snell received yesterday. In the constituencies where they stood, Labour candidates received on average 14813 votes, nearly twice the votes with which Snell won Stoke Central by a considerable margin. Of the 125 Labour candidates who received fewer votes than Gareth Snell, only 11 achieved the runner-up slot. The rest were lower-placed.

But this was a by-election, where turnout is usually much lower than at General Elections. Yet here Gareth Snell again sets records. You have to go all the way back to 15 July 2004 to find a candidate who won a by-election with fewer votes. That was when Labour candidate Liam Byrne became the MP for Birmingham Hodge Hill with just 7451 votes. There have been 44 by-elections in between. Yet back then candidates, on average, won by-elections with smaller numbers of votes.

In the current Parliament, winning by-election candidates achieve on average 50% more votes than in the 2001-2005 Parliament. It looks like more people turn out for by-elections now, maybe due to more focussed campaigning by the parties and the greater national significance of the result than when Labour had large majorities in the House of Commons. Maybe it is because fewer people tend to vote in Labour-held seats than in those of other parties. Below I show the numbers of by-elections held, splitting the winners into Conservative, Labour and Other.

The Labour Party seem to win by-elections with about 40% more votes than they did in 2001-2005.

Data for the 2015 General Election can be derived from

Kevin Marshall

Petitions on EU Referendum and Trump State Visit show dominance of Labour Party by London activists

In the UK it is possible to raise a petition to Parliament. If that petition obtains 10,000 signatures, there is a written response from the Government. If there are more than 100,000 signatures, the matter is discussed in Parliament. In less than two years 48 proposals have been discussed in Parliament, with another 14 pending. By far the largest was for EU Referendum Rules triggering a 2nd EU Referendum, which had 4.15 million signatures. It was never going to get far, as it would have meant changing the rules for the referendum vote after the vote had taken place. But it acted as a protest for the substantial and vocal minority who did not like the result.

The signatures by constituency are available for download. There are also non-UK signatures, which I shall ignore. I ranked the signatures by constituency and divided the 650 constituencies into tenths, or decile groups. I then classified the constituencies by the political party of the current MP, giving the graph shown in Figure 1.
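For anyone wanting to repeat the exercise, a minimal sketch of the decile classification is below. It uses only the standard library and made-up sample records; the real exercise used the constituency download from the petitions site and the actual party of each sitting MP.

```python
from collections import Counter

def decile_groups(constituencies):
    """Sort (name, party, signatures) records by signature count, descending,
    and split the ranking into ten equal-sized decile groups."""
    ranked = sorted(constituencies, key=lambda c: c[2], reverse=True)
    size = len(ranked) // 10
    return [ranked[i * size:(i + 1) * size] for i in range(10)]

def party_counts(decile):
    """Count how many seats each party holds within one decile group."""
    return Counter(party for _, party, _ in decile)

# Hypothetical sample of 650 seats -- purely to show the mechanics.
sample = [(f"Seat {i}", "Lab" if i % 3 == 0 else "Con", 10000 - i * 10)
          for i in range(650)]
top_decile = decile_groups(sample)[0]
print(party_counts(top_decile))
```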

Compared to the Conservative constituencies, the Labour Party has a few dominant activist constituencies in terms of wanting to overturn the EU Referendum result, whilst most are far less active. It is even worse if you include the SNP seats, many of which were Labour constituencies prior to 2015. Figure 2 splits these 231 Labour seats into the 14 regions.

Of the 34 Labour-held seats in the top decile, 27 are in London. The Labour heartlands of the North of England, parts of the Midlands and Wales are far less activist. Those 27 London constituencies (15% of Labour seats) registered 41% of all signatures in Labour seats – slightly more than the lowest 140 seats, or 60%, combined. This lines up with an analysis of the estimated split of the EU Referendum vote I did last year, shown again as Figure 3.

The Labour seats that voted most strongly to remain in the EU are, unsurprisingly, the Labour seats with the most signatories wanting to overturn the democratic result that went against them. But in terms of signatories, London-based activists skew the result even more, meaning that within the political party their views are likely dominant over those held in the majority of Labour-held seats. As Labour Party members are mostly pro-Remain, this means MPs going with the party are not following the majority view in the constituencies that they represent. There is a similarity with attitudes to Donald Trump's prospective State Visit to the UK. A petition against this is Prevent Donald Trump from making a State Visit to the United Kingdom. This currently has 1.85m signatures, up from the 1.82m when I downloaded the figures a few days ago. Figure 4 shows the decile groups by political party of the current MP, and Figure 5 shows the split by region of the Labour constituencies.

The Labour constituencies dominate the top 65 constituencies by signatories even more, with the same 27 London constituencies represented in the top decile. With 15% of Labour seats, they registered 32% of all signatures in Labour seats – slightly more than the lowest 144 seats, or 62%. A very similar pattern to the EU referendum.

This petition has been countered by Donald Trump should make a State Visit to the United Kingdom. With just 307,000 signatories, or one-sixth of the Prevent petition's, it might not seem as relevant. Figure 6 and Figure 7 are from when the signatories numbered about 275,000.

The Labour constituencies are fairly united in their apathy towards a Donald Trump State Visit, but are divided in their expressed opposition to one. But are the far greater numbers of the “Stop Trump” signatories reflected in the wider population? YouGov published an opinion poll on 1st February on the topic. Almost half the sample thought the State Visit should go ahead, whilst just over a third thought it should not. In the detail, the poll also divides the country into five regions, with London separated out. Even there, opinion was 46% to 38% in favour of the Trump State Visit. The real problems for Labour are shown in the extract of the detail in Figure 8 below.


Those who intend to vote Labour now are a smaller group than those who voted Labour in 2015: proportionately, if 30.4% voted Labour in 2015, 25% would do so now. In the unweighted sample, this implies around 70% of the 67 lost voters would support the State Visit. The remaining Labour voters are much more at odds with the majority who expressed an opinion than in GE2015. This indicates a party in general decline. Given that the opposition seems to be centred on London, it indicates that the collapse in the Labour vote in the traditional Labour heartlands of the Midlands, the North and Wales has further to go.

Yet if the visit does go ahead, the noisy protesters who come out in their thousands will mostly be Labour supporters based in London, shouting down everybody else.




Warming Bias in Temperature Data due to Consensus Belief not Conspiracy

In a Cliscep article Science: One Damned Adjustment After Another? Geoff Chambers wrote:-

So is the theory of catastrophic climate change a conspiracy? According to the strict dictionary definition, it is, in that the people concerned clearly conferred together to do something wrong – namely introduce a consistent bias in the scientific research, and then cover it up.

This was in response to the latest David Rose article in the Mail on Sunday, about claims that the infamous Karl et al 2015 paper breached America's National Oceanic and Atmospheric Administration's (NOAA's) own rules on scientific integrity.

I would counter this claim of conspiracy in respect of temperature records, even on the strict dictionary definition. Still less does it conform to a conspiracy theory in the sense of some group who, with a grasp of what they believe to be the real truth, act together to provide an alternative to that truth, or to divert attention and resources away from that understanding of the truth, like an internet troll. A clue as to why this is the case comes from one of the most notorious Climategate emails, sent by Kevin Trenberth to Michael Mann on Mon, 12 Oct 2009 and copied to most of the leading academics in the “team” (including Thomas R. Karl).

The fact is that we can’t account for the lack of warming at the moment and it is a travesty that we can’t. The CERES data published in the August BAMS 09 supplement on 2008 shows there should be even more warming: but the data are surely wrong. Our observing system is inadequate.

It is the first sentence that is commonly quoted, but the last part is the most relevant for temperature anomalies. There are inevitably a number of homogenisation runs to get to a single set of anomalies. For example, the Reykjavik temperature data was (a) adjusted by the Iceland Met Office by standard procedures to allow for known local biases, (b) adjusted for GHCNv2 (the “raw data”), (c) adjusted again in GHCNv3, and (d) homogenized by NASA to be included in Gistemp.

There are steps that I have missed. Certainly Gistemp homogenizes the data quite frequently as new sets of data arrive. As Paul Matthews notes, adjustments are unstable. Although one data set might on average be pretty much the same as previous ones, quite large anomalies will be thrown out every time the algorithms are re-run on new data. What is more, due to the nature of the computer algorithms, there is no audit trail, so the adjustments are largely unexplainable with reference to the previous data, let alone with reference to the original thermometer readings. So how does one know whether the adjustments are reasonable or not, except through a belief in how the results ought to look? In the case of climatologists like Kevin Trenberth and Thomas R. Karl, variations that show warmer than the previous run will be more readily accepted as correct than variations that show cooler. That is, they will find reasons why a particular temperature data set now shows greater warming than before, but will reject as outliers results that show less warming than before. It is the same when choosing techniques, or adjusting for biases in the data. This is exacerbated when a number of different bodies with similar belief systems try to seek a consensus of results, as Zeke Hausfather alludes to in his article at the CarbonBrief. Rather than being verified against the real world, temperature data is made to conform to the opinions of others with similar beliefs about the world.
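The instability is easy to demonstrate with a toy adjustment scheme. The sketch below is purely illustrative – far simpler than the real pairwise homogenisation algorithms – but it shares their defining feature that the reference network drives the adjustments, so re-running after new data arrives rewrites the past.

```python
def homogenise(station, neighbours):
    """Toy adjustment: shift each year's reading halfway towards the
    mean of the reference neighbours for that year."""
    adjusted = []
    for i, value in enumerate(station):
        ref = sum(n[i] for n in neighbours) / len(neighbours)
        adjusted.append(value + 0.5 * (ref - value))
    return adjusted

station = [10.0, 10.2, 10.1, 10.4]
run1 = homogenise(station, [[10.1, 10.3, 10.2, 10.5]])
# A new neighbour joins the network; the algorithm is simply re-run.
run2 = homogenise(station, [[10.1, 10.3, 10.2, 10.5],
                            [9.6, 9.8, 9.9, 10.0]])
# Every historical adjusted value has changed, including years long past,
# with no audit trail back to the original thermometer readings.
print(run1)
print(run2)
```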

Kevin Marshall

IPCC AR5 Synthesis Report Presentation Miscalculated the Emissions for 2C of Warming

In a previous post I mistakenly claimed that the Ladybird Book on Climate Change (lead author HRH The Prince of Wales) had incorrectly interpreted the AR5 IPCC Synthesis Report in its egg-timer. It is the IPCC that is at fault.
In 2014 the IPCC produced a simplified presentation of 35 slides to summarize the AR5 Synthesis Report Summary for policy makers. A quick summary of a summary of the synthesis report.

Slide 30 on Limiting Temperature Increase to 2C, clearly states that it is global reductions in greenhouse gas emissions that are needed.

The Ladybird egg-timer is adapted from slide 33 of 35.

As a (slightly manic) beancounter I like to reconcile the figures. How are the 1900 GtCO2 and the 1000 GtCO2 arrived at? It could be that it is GtCO2e, as throughout the synthesis report, where other greenhouse gases are recast in terms of CO2, which accounts for well over half of the warming from trace gases.

Some assumptions for my quick calculations.

1. A doubling of CO2 will lead to a warming of 3C. This was the central estimate of the Charney Report 1979 (pdf), along with all five of the UNIPCC assessment reports.
2. If the pre-industrial level of CO2 was 280ppm, the dangerous 2C of warming will be reached at 445ppm. Rounded this is 450ppm.
3. In 2011 the Mauna Loa CO2 level was 391.63 ppm.
4. Using the CDIAC World CO2 emission figures gives the figures for billions of tonnes of CO2 required to raise CO2 levels by 1 ppm, shown in the graph below. In the five years to 2011 it took on average 17.02 billion tonnes of CO2 to raise CO2 levels by 1 ppm. Let's round it to 17.

Now some quick calculations.
Start with 280ppm
Add 111.76 (=1900/17) gives 391.76. Pretty close to the CO2 level in 2011 of 391.63ppm
Add 58.82 (=1000/17) gives 450.58. Given rounding, this pretty close to 450ppm.
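The reconciliation can be set out in a few lines of code (my own check; the 17 GtCO2 per ppm is the rounded five-year average derived above):

```python
PREINDUSTRIAL_PPM = 280.0
GT_CO2_PER_PPM = 17.0  # rounded five-year average to 2011, from CDIAC

def ppm_after_emissions(start_ppm, cumulative_gt_co2):
    """Implied CO2 level after a cumulative tonnage of emissions,
    assuming a fixed ratio of 17 GtCO2 per ppm rise."""
    return start_ppm + cumulative_gt_co2 / GT_CO2_PER_PPM

level_2011 = ppm_after_emissions(PREINDUSTRIAL_PPM, 1900)  # past emissions
budget_end = ppm_after_emissions(level_2011, 1000)         # remaining budget
# Both land within rounding of the observed 391.63 ppm in 2011
# and the 450 ppm threshold for 2C.
print(round(level_2011, 2), round(budget_end, 2))
```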

There are problems with these calculations.

  • The estimate of 17 GtCO2 per ppm is on the high side. The World CO2 emissions from the CDIAC National Emissions spreadsheet sum to 1069.68 GtCO2 from 1960 to 2011, against a rise in CO2 of 74.72 ppm. That is 14.3 GtCO2 per ppm over the whole period. Since 2011 there has been a drop towards this long-term average.
  • The Ladybird Book, like the UNFCCC at COP21 Paris in December 2015, talks about restraining warming to 1.5C. If a doubling of CO2 leads to 3C of warming, then going from 280ppm to 401ppm (the average level in 2015) will eventually produce 1.555C of warming. This is a tacit admission that climate sensitivity is vastly overstated.
  • But the biggest error of all is that CO2 is only the major greenhouse gas (water vapour aside), not the only one. It might account for the majority of the warming impact and two-thirds of emissions, but it is not all of the warming impact according to theory. That alone would indicate a climate sensitivity of 2 instead of 3. But actual warming from 1780 to 2011 was less than 1C, against the 1C from CO2 alone if CS=2. That indicates that CS ≈ 1.3. Moreover, not all of the warming in the last 230 years has been due to changes in GHG levels: there was also recovery from the Little Ice Age. Worst of all for climate alarmism is the divergence problem. In this century the rate of warming should have increased as the rate of rise in CO2 levels increased, in turn due to an increase in the rate of rise in CO2 emissions. But warming stopped. Even with the impact of a strong El Nino, the rate of warming slowed dramatically.
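The sensitivity arithmetic in these bullets follows from the standard logarithmic relation, equilibrium warming = CS × log2(C/C0). A quick sketch of my own checks (the two-thirds CO2 share of the warming impact is the assumption stated above):

```python
from math import log2

def equilibrium_warming(sensitivity, ppm, baseline=280.0):
    """Equilibrium warming for a given climate sensitivity (degC per
    doubling of CO2), using the standard logarithmic forcing relation."""
    return sensitivity * log2(ppm / baseline)

# CS = 3 gives ~1.55C at the 2015 level of 401 ppm, as quoted above.
print(round(equilibrium_warming(3.0, 401), 3))

# If CO2 is two-thirds of the warming impact, total warming scales by 3/2.
# Matching observed warming of <1C up to 2011 (391.63 ppm) implies a low CS.
implied_cs = 1.0 / (1.5 * log2(391.63 / 280))
print(round(implied_cs, 2))
```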



The IPCC calculated their figures for 1000 billion tonnes of CO2 emissions for 2C of warming based on CO2 being the only greenhouse gas and a doubling of CO2 levels producing 3C of warming. On that basis a 401ppm CO2 level should produce >1.5C of warming. Add in other greenhouse gases and we are in for 2C of warming without any more greenhouse gas emissions. Only if climate sensitivity is much lower is it theoretically possible to prevent 2C of warming by drastically reducing global CO2 emissions. The IPCC have concocted figures knowing that they do not reconcile back to their own assumptions.

The questions that arise are (a) where do the cumulative emissions figures come from? and (b) has the UNFCCC copied these blatant errors into the COP processes?

This is an extended version of a comment made at Paul Homewood's notalotofpeopleknowthat blog.

Kevin Marshall

My Amazon Review of Ladybird Book of Climate Change

The following is my Amazon review of Ladybird Book of Climate Change.

The format goes back to the Ladybird Books of my childhood, with text on the left and a nice colour picture on the right. Whilst lacking in figures and references it provides an excellent summary of the current case of climate alarmism and the mitigation policies required to “save the world”. As such it is totally lopsided.
For instance, on page 35 is a drawing of three children holding a banner with “1.5 to stay alive”. The central estimate of the climate consensus since the Charney Report of 1979 is that a doubling of CO2 levels will lead to 3C of warming. That means a rise from 280 to 400ppm would give 1.54C of warming. With the impact of the rise in other greenhouse gas levels, the 2C of warming should already have happened. Either it is somehow hidden, ready to jump out at us unawares, or the impact of emissions on climate has been exaggerated, so policy is not required.
The other major problem is with policy. The policy proposals are centred around what individuals in the UK can do: recycle more, eat less red meat and turn the heating down. There is no recognition that it is global GHG emissions that cause atmospheric GHG levels to rise. If the theory is correct, constraining global warming means global emissions reductions. That includes the 80%+ of the global population who live in countries exempt from any obligation to constrain emissions. Including all the poorest countries, these exempt countries accounted for all the emissions growth from 1990 to at least 2012.
If people genuinely want to learn about a controversial subject then they need to read different viewpoints. This is as true of climate change as history, economics or philosophy.

Ladybird Book on Climate Change

A couple of weeks ago there was a big splash about the forthcoming Ladybird Book for adults on Climate Change (Daily Mail, Guardian, Sun, Telegraph etc.). Given that it was inspired by HRH The Prince of Wales, who wrote the foreword, it should sell well. Even better, I have just received a copy, in a format that harks back to the Ladybird Books I grew up with: on each double page, words on the left and a high-quality coloured picture filling the right-hand page. Unlike the previous adult Ladybird series, which was humorous, this is the first in a series that seeks to educate.

The final paragraph of the foreword states:-

I hope this modest attempt to alert a global public to the “wolf at the door” will make some small contribution towards requisite action; action that must be urgently scaled up, and scaled up now.

The question is whether there is enough here to convince the undecided. If this is founded on real science, then there should be a sufficient level of evidence to show:

(a) there is a huge emerging problem with climate;

(b) the problem is human-caused;

(c) there are a set of potential steps that can be taken to constrain this problem;

(d) the cure is not worse than the disease;

(e) sufficient numbers will take up the policy to meet the targets.

My approach is to look at whether there is sufficient evidence to persuade a jury. Is there evidence that would convict humanity of the collective sin of destroying the planet for future generations? And is there evidence to show that, through humanity collectively working for the common good, catastrophe can be averted and a better future can be bequeathed to those future generations? That presumes a sufficient quality of evidence that an impartial judge would not throw it out as hearsay.

Evidence for an Emerging Problem with Climate.

Page 8, on melting ice and rising sea levels, starts with the reduced Arctic sea ice. It contains the only quantifiable estimate of climate change other than the temperature graph on page 6, claiming that at the end of the 2016 melt season sea ice levels were two-thirds of those at the end of the twentieth century.

Any jury would hear that there has only been satellite data of sea ice extent since 1979; that this was the end of a period known as the “sea ice years“; that the maximum winter ice extent in April was likely less in the eighteenth century than today; that ships' log books suggest that general sea ice extent was roughly the same one hundred and fifty years ago as today; and that in the Antarctic the average sea ice extent increase has largely offset the Arctic decrease.

The rest, about sea levels, correctly states that they have risen, and that the reasons for the rise are a combination of warming seas and melting ice caps. It is also correct that flooding occurs in storm surges. But there is no quantification of the rise in sea levels (about 8-12 inches a century), nor of the lack of evidence for the predicted acceleration.

Page 10, on heatwaves, droughts, floods and storms, states that they can cause disruption, economic damage and loss of life. There are also recent examples, and speculation about future trends. But there is no evidence of emerging trends, particularly of increasing loss of life. This lack of evidence is because the harms of extreme weather appear to be on the decrease. Indur Goklany has been a rich source of the counter-evidence over many years.

Page 12 begins

Threats to food and water supply, human health and national security, and the risk of humanitarian crises are all potentially increased by climate change.

The rest is just padding out this speculation.

Page 14 is on disappearing wildlife. One quote

The polar bear has come to symbolize the threats posed to wildlife by climate change….

You can probably find many images of starved dead polar bears to back this up. But the truth is that these creatures live by hunting, and as they get older they slow down, so they are no longer fast enough to catch seals, their main food source. Zoologist Susan Crockford has a blog detailing how polar bear numbers have increased in recent years, and far from being threatened the species is thriving.

The climate change problem is mostly human caused

The book details that emissions of greenhouse gases have gone up, and so have the levels of greenhouse gases. The only quantification is for CO2, the major greenhouse gas (page 20). There is a simple diagram explaining how CO2 emissions impact on atmospheric CO2 levels, before explaining the major sources of the net increase – fossil fuel emissions and the clearing of forests. There is no actual testing of the theory against the data. But page 20 begins

The scientific evidence shows that the dominant cause of the rapid warming of the Earth’s climate over the last half century has been the activities of people…

The relevant quote from UNIPCC AR5 WG1 SPM section D3 says something slightly different.

It is extremely likely that more than half of the observed increase in global average surface temperature from 1951 to 2010 was caused by the anthropogenic increase in greenhouse gas concentrations and other anthropogenic forcings together.

The “extremely likely” phrase is a Bayesian estimate: a belief that should be updated on the best available evidence. A lack of evidence, after much searching, suggests the original guess was wrong; true Bayesians would therefore downgrade their certainties if they cannot refine the estimates over time. But this was written in 2013. Since the Charney Report of 1979 and the previous four IPCC reports of 1990 to 2007, there has been no refinement in the estimate of how much warming will eventually result from a doubling of CO2.

But how does the evidence stack up? On page 6 there is a chart of global surface temperature anomalies. That increase in temperatures can be tested against the doubling effect of CO2. Since around the turn of the century, the rate of rise in CO2 emissions and atmospheric CO2 levels has accelerated. But global warming stopped for over a decade until 2014, only to restart due to a natural phenomenon. Comparing the actual data to the theory fails to support the strong belief that GHG emissions are the dominant cause of recent warming.

Policy to contain the problem

Page 34 goes into the benefits of containing warming to 1.5C. Given that the central estimate from the climate community since 1979 has been that a doubling of CO2 will lead to an eventual rise in average temperature of 3C, a rise in CO2 levels from the pre-industrial 280ppm to the 400ppm reached in 2015 would give 1.544C of warming. With other greenhouse gases it should be nearer to 2C of warming. Either it is way too late (and the warming is lurking, like the Loch Ness monster, in the dark and murky depths) or the central estimate is exaggerated. So the picture of three young people holding a banner with “1.5 to stay alive” is either of the doomed, about whom we can do nothing, or of false alarmism.

Page 36 has a nice graphic adapted from the IPCC Synthesis Report of 2014, showing liquid dripping through an egg-timer. It shows the estimate that 2000 billion tonnes of CO2 have been emitted so far, and that 1000 billion tonnes more can be emitted before the 2C of warming is breached. This was from a presentation to summarize the IPCC AR5 Synthesis Report of 2014, slide 33 of 35.

The problem is that this was the data up to 2011, not five years later to 2016; it was for GHG emissions in billions of tonnes of CO2 equivalents; and the 40 billion tonnes of CO2 emissions should be around 52-55 billion tonnes of CO2e GHG emissions. See for instance the EU Commission's EDGAR figures, estimating 54 GtCO2e in 2012 and 51 GtCO2e in 2010 (against the IPCC's 49 GtCO2e). So the revised figure is about 750 GtCO2e of emissions before this catastrophic figure is breached. The Ladybird book does not have references, to keep things simple, but should at least properly reflect the updated numbers. The IPCC stretched the numbers in 2014 to keep the show on the road, to such an extent that they fall apart on even a cursory examination. The worst part is at the very top of the egg-timer, coloured scarlet: “Coal, oil and gas reserves that cannot be used“. These are spread across the globe. Most notably, the biggest reserves are in China, the USA, Russia, Canada, Australia, the Middle East and Venezuela, with the rest of the world having a substantial share of the remainder.
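The revision to roughly 750 GtCO2e is simple arithmetic, sketched below (assuming roughly five years of emissions at an EDGAR-range rate since the 2011 data cut-off; the ~52 GtCO2e/yr figure is a mid-range assumption, not from the report):

```python
def remaining_budget(budget_at_cutoff, annual_emissions, years_elapsed):
    """Roll an emissions budget forward by deducting the years already used."""
    return budget_at_cutoff - annual_emissions * years_elapsed

# The egg-timer's 1000 Gt figure was for data to 2011; by 2016 roughly
# five more years at ~52 GtCO2e/yr had been emitted.
print(remaining_budget(1000, 52, 5))  # about 750 GtCO2e remains
```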

The cure is worse than the disease

In suggesting European solutions like recycling, eating less red meat, turning down the heating to 17C and more organic farming, the authors write about making very marginal differences to emissions in a few countries with a small minority of global emissions. Most of those reserves will not be left in the ground no matter how hot under the collar the first in line to the Throne gets. Global emissions will keep on increasing from the non-policy countries, which have over 80% of the global population, two-thirds of global emissions and nearly 100% of the world's poorest people. Below is a breakdown of those countries.

These countries collectively produced 35,000 MtCO2e in 2012, or 35 GtCO2e. That will increase well into the future, short of inventing a safe nuclear reactor the size, weight and cost of a washing machine. Now compare this to the global emissions pathways to stop the 1.5C or 2C of warming, prepared by the UNFCCC for the 2015 Paris talks.


The combined impact of all the vague policy proposals does not stop global emissions from rising. It is the non-policy developing countries that make the real difference between the policy proposals and the modelled warming pathways. If those countries are to stop using fossil fuels at increasing rates, billions of people will be deprived of rising living standards for themselves and their children. Yet this must happen very quickly if the mythical 2°C of warming is not to be breached. So in the UK we just keep on telling people not to waste so much food, buy organic, ride a bike and put on a jumper.

There is no strong evidence that would convict humanity of the collective sin of destroying the planet for future generations. Nor is there evidence to show that a better future can be bequeathed to those future generations when the policies would destroy the economic future of the vast majority. The book neatly encapsulates how blinkered the climate alarmists are to both the real-world evidence and the wider moral policy perspectives.

Kevin Marshall


Carbon Capture and Storage Loses another £100m but saves up to £10bn

Last week the National Audit Office published a report Carbon Capture and Storage: the second competition for government support. The main headline was

“The Department has now tried twice to kick start CCS in the UK, but there are still no examples of the technology working. There are undoubtedly challenges in getting CCS established, but the Department faced an uphill battle as a result of the way it ran the latest competition. Not being clear with HM Treasury about what the budget is from the start would hamper any project, and caused particular problems in this case where the upfront costs are likely to be high. The Department must learn lessons from this experience if it is to stand any chance of ensuring the first CCS plants are built in the near future.”

Amyas Morse, head of the National Audit Office, 20 January 2017

Key elements

  • Two projects in the competition.
  • When the project was cancelled, £100m had already been spent.
  • The first competition ran from 2007 to 2011.
  • The full subsidy from the Treasury (i.e. taxpayers) would have been £1 billion.
  • Over 15 years, the subsidy from consumers would have been £3.9 billion to £8.9 billion.
  • The schemes would have captured 1 Mt to 2 Mt of CO2 a year.
  • The consumer subsidy would have been between £105 and £172 per MWh, on top of the current wholesale price of around £45 per MWh.

The BBC carried the story, correctly citing many of the costs, as did the Express, which stated

At the time it was cancelled, the competition had two preferred bidders: the White Rose consortium in North Yorkshire which planned to build a new coal plant with the technology, and Shell’s scheme in Peterhead, Aberdeenshire, to fit CCS to an existing gas plant operated by SSE.

The NAO report said the department initially estimated it would cost consumers – who would subsidise electricity from the schemes – between £2 billion and £6 billion over 15 years, but by 2015, this estimate had risen to as much as £8.9 billion.

The report found the Treasury was concerned over the costs to consumers, and that the competition was aiming to deliver CCS before it was cost-efficient to do so.

Joanne Nova points to a July 2015 post on the subject of CCS by Anton Lang. He stated

CCS artificially raises the costs of coal fired power in two ways

First, it raises the initial construction cost for any new large scale coal fired plant by around 60%.

Second, the CCS process is hugely energy intensive — consuming up to 40% of the electricity generated by the plant. So  the plant can only sell 60% of the actual power it produces.

As a (slightly manic) beancounter, I like to put the costs in context.

  1. How much would the cost per tonne of CO2 saved have been if the Treasury had not pulled the plug?
  2. What would the value of the subsidy be if China and India adopted the plan?

In the full NAO report (a 389kb pdf) Figure 6 gives details of the two schemes shortlisted in the competition.

It is the Peterhead scheme that would incur the lower subsidy of £105 per MWh. The £3.9 billion works out at an average 290 MW of production, or 76% of capacity, over 15 years. It is cheaper due to adapting an old plant. The disadvantage is that there is only 30 Mt of CO2 storage capacity in the area, so there is no scope for much further development unless additional infrastructure is built to pump the CO2 offshore into old oil wells.

The White Rose scheme has the higher subsidy of £172 per MWh. The £8.9 billion works out at an average 394 MW of production, or 88% of capacity, over 15 years. It is a new plant, but has the advantage of 520 Mt of CO2 storage capacity in the area.
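The average-output figures above follow directly from the subsidy totals and the per-MWh rates: total subsidy divided by the rate gives lifetime MWh, which spread over 15 years of hours gives average MW. A minimal check:

```python
# Back-of-envelope check of the average output implied by the subsidies.
HOURS_PER_YEAR = 8760
YEARS = 15

def avg_mw(total_subsidy_gbp, rate_gbp_per_mwh):
    # Lifetime electricity attracting the subsidy, in MWh...
    lifetime_mwh = total_subsidy_gbp / rate_gbp_per_mwh
    # ...spread over 15 years of continuous hours gives average MW.
    return lifetime_mwh / (YEARS * HOURS_PER_YEAR)

peterhead = avg_mw(3.9e9, 105)   # ≈ 283 MW (the post quotes ~290 MW)
white_rose = avg_mw(8.9e9, 172)  # ≈ 394 MW
```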

If we add in the £1bn subsidy without interest, over 15 years the cost per tonne of CO2 saved is about £264 (US$330, A$435) for the Peterhead project and £300 (US$374, A$490) for the White Rose project.
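The cost-per-tonne method can be sketched as follows. The annual tonnes captured per project and the 40/60 split of the £1bn Treasury subsidy are my assumptions (the NAO quotes only a combined 1–2 Mt/yr range), chosen to illustrate the calculation rather than reproduce the exact figures above.

```python
# Sketch: total subsidy divided by lifetime tonnes of CO2 captured.
# ASSUMPTIONS: ~1.1 Mt/yr at Peterhead, ~2 Mt/yr at White Rose, and a
# 40/60 split of the £1bn Treasury capital subsidy (no interest).
YEARS = 15

def cost_per_tonne(consumer_subsidy_gbp, capital_subsidy_gbp, mt_per_year):
    total_cost = consumer_subsidy_gbp + capital_subsidy_gbp
    total_tonnes = mt_per_year * 1e6 * YEARS
    return total_cost / total_tonnes

peterhead = cost_per_tonne(3.9e9, 0.4e9, 1.1)   # ≈ £261/t (post: £264)
white_rose = cost_per_tonne(8.9e9, 0.6e9, 2.0)  # ≈ £317/t (post: £300)
```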

The NAO report notes in figure 12 that the subsidy could come down to £94 per MWh with scale.

Let us see what the cost would be if India and China adopted CCS for their current coal-fired power stations, increasing capacity by 25% to cover the efficiency losses. Assume the subsidy is just $100 per MWh.

According to Greenpeace (which could be unreliable), China has about 900,000 MW of coal-fired capacity. Add in 25% and assume a 70% capacity factor, and this gives around $700bn a year in subsidy. This is about 6% of current GDP.

From Wikipedia, India had 310,000 MW of capacity in 2015. Add in 25% and assume a 70% capacity factor, and this gives around $240bn a year in subsidy. This is about 12% of current GDP.
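The China and India figures above follow from the same three-step recipe: add 25% capacity for the CCS energy penalty, apply a 70% capacity factor, and multiply the resulting MWh by the $100/MWh subsidy. A minimal sketch:

```python
# Sketch of the China/India CCS subsidy extrapolation described above.
HOURS_PER_YEAR = 8760

def annual_subsidy_usd(coal_mw, rate_usd_per_mwh=100):
    effective_mw = coal_mw * 1.25           # +25% to cover CCS energy losses
    mwh = effective_mw * 0.70 * HOURS_PER_YEAR  # 70% capacity factor
    return mwh * rate_usd_per_mwh

china = annual_subsidy_usd(900_000)   # ≈ $690bn/yr (post rounds to ~$700bn)
india = annual_subsidy_usd(310_000)   # ≈ $238bn/yr (post rounds to ~$240bn)
```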

I am sure that China and India will want to follow the UK’s lead. The only slight issue is finding a hole big enough. Maybe instead they could build some big greenhouses and grow tomatoes very rapidly.

Kevin Marshall


Bernie Sanders demonstrates why he was not fit to be President

Senator Bernie Sanders of Vermont was for a while running a close second to Hillary Clinton in the Democrat primaries. Had it not been for his extreme left views, his advanced years, and the fact that he is the junior Senator from the 49th most populous state, he might have stood a chance against a former First Lady and Secretary of State. But Senator Sanders' recent questioning of Scott Pruitt shows why he is unfit for high office. Ron Clutz has transcribed more of the dialogue, but I think two statements encapsulate this.

At 0.45

As you may know, some 97% of scientists who have written articles for peer-reviewed journals have concluded that climate change is real, it is caused by human activity, and it is already causing devastating problems in the US and around the world. Do you believe that climate change is caused by carbon emissions from human activity?

There is no 97% survey of scientists which concludes these things. As Ron Clutz observes, the nearest thing to a definite question is in Examining the Scientific Consensus on Climate Change – Doran and Zimmerman 2009, where the second question was

2. Do you think human activity is a significant contributing factor in changing mean global temperatures?

One could answer "yes" if one thought that 10% of the rise in temperatures was due to land use changes, and the rest due to natural factors. It does not ask about fossil fuel emissions, and the question allows for belief in factors other than human activity, whether known or unknown.

There is also the Cook et al. survey of peer-reviewed academic papers, which I looked at after listening to a lecture Cook gave at Bristol University in late 2014. The survey just looked for the assumption that humans cause some warming, whether explicit or implied. Like the Doran and Zimmerman survey, it is just hearsay. This is what Sen. Sanders presents as good evidence that there is already a clear catastrophic problem caused by changes in the climate. If there is real and overwhelming evidence, why does Sen. Sanders not refer to that instead of misrepresenting bogus opinion polls?

Senator Sanders then goes even further.  At 1.50

While you are not certain, the vast majority of scientists are telling us that if we do not get our act together and transform our energy system away from fossil fuel there is a real question as to the quality of the planet that we are going to be leaving our children and our grandchildren. So you are applying for a job as Administrator at the EPA to protect our environment. The overwhelming majority of scientists say we have to act boldly and you're telling me that there needs to be more debate on this issue and that we should not be acting boldly.

Sanders now says a majority of scientists are telling us we must change our energy systems. Aside from the fact that only a very small minority of scientists have any sort of competency in the field of climate (and there is evidence of a lot of demonstrated incompetency within that small group, e.g. here), they have no expertise in the economic or moral cases for policy. For policy, the interpretation of the moral imperatives and the practical possibilities should be the realm of politicians. Those who sit on specialist committees should at least have their own developed views on the field.

Senator Bernie Sanders has taken some very dodgy opinion polls, grossly exaggerated the findings, and then ascribed statements to the climatologists that are far removed from, and way beyond, any competencies they might have. As I see it, the role of President of the United States, as a leader, is to critically interpret what they are given in order to make decisions for the nation. That is the exact opposite of what Sanders did last week.

Kevin Marshall 


Tristram Hunt MP confirms the Labour split on Brexit

In July, following the Brexit vote, I made a couple of posts looking at Chris Hanretty's estimated EU referendum vote split for the 574 parliamentary constituencies in England and Wales. In The Democratic Deficit in the Referendum Result, I concluded

The results show two things.
First is that there is a huge divergence in Referendum vote across the English and Welsh constituencies.
Second is that a disproportionate number of the constituencies with strong votes either for remaining in the EU or leaving the EU have a Labour Party MP.

The divergence is shown by two graphs of the Leave / Remain constituency split by region – the overall result and the 231 constituencies with a Labour MP.

The second post looked at the seats that Labour must win if it is to become the largest party at the next general election. In England and Wales most of the target seats voted to leave the EU. But in Scotland, where Labour lost 40 seats to the SNP, every single constituency likely voted to remain in the EU. Further, about 40% of Labour members reside in London, which was strongly for Remain. There is a fundamental split.

One of the most Pro-Leave constituencies is Stoke on Trent Central. Pro-Remain Labour MP Tristram Hunt is resigning to take up the prestigious post of Director of the V&A Museum. Guido Fawkes had a post this afternoon TRISTRAM HUNT ON LABOUR’S EXISTENTIAL CRISIS.

Guido’s comment concurs with what I have concluded:-

In other words, Labour is increasingly irrelevant in Brexit Britain, and Tristram doesn’t have the answer…

Kevin Marshall