Can Climatology Ever Be Considered a Science?

Can climatology ever be considered a science? Consider my favourite Richard Feynman quote.

You cannot prove a vague theory wrong. If the guess that you make is poorly expressed and the method you have for computing the consequences is a little vague then ….. you see that the theory is good as it can’t be proved wrong. If the process of computing the consequences is indefinite, then with a little skill any experimental result can be made to look like an expected consequence.

I would maintain that by its nature climatology will always be a vague theory. Climate consists of an infinite number of interrelationships that can only be loosely modelled by empirical generalisations, and those generalisations can only ever be imperfectly measured, although measurement is improving in both scope and period of observation. Tweaking the models can always produce a desired outcome. In this sense climatology is never going to be a science in the way that physics and chemistry have become. But this does not mean that climatology cannot become more scientific. A step forward might be to classify empirical statements according to the part of the global warming theory they support, and according to the empirical content of those statements.

Catastrophic Anthropogenic Global Warming (CAGW) is a subset of AGW. The other elements of AGW are trivial, or even positive, in their impacts. I would also include the benign impact of aerosols in reducing the warming. So AGW’ (the part of AGW outside CAGW) is not an empty set.

AGW is a subset of GW, where GW is the hypothesis that an increase in greenhouse gas levels will cause temperatures to rise. There could be natural causes of the rise in greenhouse gases as well, so GW’ (the part of GW outside AGW) is not an empty set.

GW is a subset of Climate Change (CC), which covers all causes of changing climate, both known and unknown, including entirely random causes.

In summary: CAGW ⊂ AGW ⊂ GW ⊂ CC.
Or diagrammatically the sets can be represented by a series of concentric rings.
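As a minimal sketch of the classification proposed above, the nesting can be expressed in a few lines of code. The example statements and their assignments below are hypothetical illustrations of mine, not drawn from any survey.

```python
# Nested sets of the global warming theory, innermost to outermost:
# CAGW is a subset of AGW, AGW of GW, and GW of CC.
NESTED_SETS = ["CAGW", "AGW", "GW", "CC"]

def supported_sets(narrowest: str) -> list[str]:
    """A statement supporting a set trivially supports every superset."""
    return NESTED_SETS[NESTED_SETS.index(narrowest):]

# Hypothetical empirical statements, classified by the narrowest set
# they could support.
statements = {
    "Average temperatures changed over the last century": "CC",
    "Temperatures rose as greenhouse gas levels rose": "GW",
    "Human emissions caused the rise in greenhouse gases": "AGW",
    "The warming will have catastrophic consequences": "CAGW",
}

for text, narrowest in statements.items():
    print(f"{text!r}: supports {supported_sets(narrowest)}")
```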

To become more scientific, climatology as an academic discipline should be moving on two complementary fronts. Firstly, by generating clearer empirical confirmations, as against banal statements or conditional forecasts. Secondly, by making statements unambiguous, so that they can be ascribed solely to the CAGW hypothesis in particular, rather than just as easily ascribed to vague and causeless climate change in general. These twin aims are shown in the diagram below, where the discipline should be aiming in the direction of the red progressing arrow towards science, rather than the green degenerating arrow.

Nullis in verba, on a recent Bishop Hill discussion forum, rightly points out that the statement

“you acknowledge that scientists predicted warming. And warming is what we observed”

commits the fallacy of “affirming the consequent”.

If your definition of climate change is loose enough, the observed rise could be a member of the CC set. But to infer that it is not part of GW’ (outside of the GW set) requires more empirical content. As Nullis has shown in his tightly worded comment, proof is impossible. But greater empirical content will give more confidence that the scientists did not just strike lucky. Two years ago Roy Spencer attempted just that. Across 73 climate models, the prediction was that between 1979 and 2012 average global temperatures would rise by between 0.3 and 1.5°C, with an average estimate of 0.8°C. Most were within 0.6 to 1.2°C, so any actual rise in that range, which would be pretty unusual historically, would be a fairly strong confirmation of a significant AGW impact. The actual satellite and weather balloon data showed a rise of about 0.2°C. The scientists got it wrong on the basis of their current models. At a minimum the models are running too hot, and at a minimum they fail to confirm the CAGW hypothesis.
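As a rough check on those numbers (a sketch using only the figures quoted above, not Spencer's underlying model data):

```python
# Compare the observed 1979-2012 warming against the quoted model range.
model_low, model_high, model_mean = 0.3, 1.5, 0.8  # deg C, 73-model spread
strong_band = (0.6, 1.2)    # band containing most of the 73 models
observed = 0.2              # satellite and weather balloon estimate

print(f"within full model range? {model_low <= observed <= model_high}")
print(f"within the 0.6-1.2 band? {strong_band[0] <= observed <= strong_band[1]}")
print(f"models ran hot by about {model_mean - observed:.1f} deg C on average")
```

Both checks return False: the observed rise falls below even the widest range of the quoted models.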

By more clearly specifying the empirical content of statements, the scope of alternative explanations is narrowed. In this case it also explains why someone might retreat to the more banal statement.

I would contend that confirmation of CAGW requires a combination of the warming and the adverse consequences. So even if hurricanes had got worse after Katrina in 2005, with zero warming that on its own is just an observation that climate has changed. But together they form a more empirically rich story that is explained by CAGW theory. Still better is a number of catastrophic consequences.

In the next post I shall show some further examples of the discipline moving in the direction of degenerating climatology.

Kevin Marshall

A Great and Humble Man Dies

Sir Nicholas Winton died today at 106 years old. A true hero of mine.

Royal Baby Names to Save the United Kingdom

In a complete break from my normal posts on climate, after discussions with my daughter, I am going to speculate on Royal baby names.

In less than two weeks’ time there will be a General Election. Given that the betting is that a Labour/Scots Nat coalition is likely, there is a strong possibility that the incoming Government could lead to the breakup of the United Kingdom. The Scottish Nationalists have very similar left-of-centre policies to Labour, which is what makes such a coalition likely. The Royal Family, particularly The Queen, firmly believes in the United Kingdom, has long been proud of its Scottish roots (the Queen is half Scots), but at the same time does not directly intercede in politics, except in the most tangential ways. Naming a Royal Baby who is fourth in line to the throne is one of the few methods open to the Royal Family of sending a political message. The name cannot offend the Scots, but at the same time must satisfy the far more numerous English. It must also be a name seen to be reasonably modern, but in keeping with royal traditions.

If Katherine, Duchess of Cambridge gives birth to a baby boy there will be a number of names that could be chosen. Prince George Alexander Louis set a precedent. George is the name of six British Monarchs, but is identified as very English. It was in the reign of George II (1727-1760), for instance, that the Young Pretender Bonnie Prince Charlie was finally defeated at Culloden in 1746. The second name, whilst highly international, was also the name of three medieval Scottish Kings, and its Gaelic form is Alistair. Louis is from Lord Louis Mountbatten, the great-uncle to whom the Prince of Wales was very close. There were both traditional English and Scottish elements in the name, without it seeming too old fashioned. Alexander has already been used, so is counted out. Some names of Scottish Kings cannot be considered. “Kenneth” and “Duncan” are very old fashioned. Macbeth was trashed as a plausible name by William Shakespeare. Lulach, Amlaíb, Cuilén, Dub and Indulf are too lost in time to inflict on any child, and would need explaining. This leaves James, David and Robert. In Scotland James is currently the fourth most popular boys’ name, behind Jack, Lewis and Riley. At Betfair it is the favourite boys’ name, so this might be a strong contender. However, the Royal Family will want to make an imprint less than two weeks before a General Election that could destroy the United Kingdom the Queen pledged to defend. James is both Scottish and English. We have the King James Bible of 1611 that helped unite the factions in the Church of England for a while. But King James VI of Scotland (and James I of England) was an anomaly: a strong Scottish Presbyterian who, in commissioning this great work, sought to bring together both the Puritan and Catholic elements of the Anglican Church. His grandson, James II, almost caused a second Civil War through his Catholic tendencies, resulting in the current inability of the heir to the throne to marry a Roman Catholic. The betting markets, along with my daughter, may favour such a name, but the Queen may advise against.

So what is an appropriate boys’ name for a possible (but unlikely) future monarch, whose only role may be to save the Union by being born?

There are two courses that the Royal Family may take. I believe that they will take the safe course, and call the boy David. It has both strong Scottish roots, and David is the patron saint of Wales. But the option to save the United Kingdom is Robert. On 24th June 1314, Robert the Bruce defeated the English forces of Edward II (son of Edward I, the “Hammer of the Scots”) at Bannockburn near Stirling. Two and a half centuries earlier, William of Normandy had defeated the Anglo-Saxons (English) at the Battle of Hastings. Although “Edward” was Anglo-Saxon in origin, the “English Kings” still spoke French at Court. Most of those fighting on the side of Edward could understand their Sovereign’s words as little as the Gaelic-speaking Scots could. If Robert is chosen, a second name cannot be Edward. But another Anglo-Saxon royal name, still used today, is Edmund. How better for the Royal Family to remember the subjugated of centuries ago, uniting the downtrodden of both Scotland and England, reconciling ancient enmities, and remembering the ancient Kings of both countries. A third name could be David, or one that nods to the Irish, such as Kevin.

Girls’ names are more difficult. The most famous Scottish girls’ name is Margaret, which until the 1960s was easily the most popular name in Scotland. I have an Auntie Margaret, have fond memories of my Great Aunt Margaret, and had (by all accounts) a formidable Great Grandmother Margaret Ross, who died at the age of 93 when I was 3 years old. Many will remember the Queen’s sister, Princess Margaret. But the name is now not in the top 100 girls’ names in Scotland, and (due to the Royal connection) will not be viewed as particularly Scottish. In left-of-centre Scottish minds, it is also the Christian name of one of the twentieth century’s greatest Prime Ministers.

There are not many Queens of Scotland. The most famous is Mary Queen of Scots, who, being a French-speaking Catholic, was hardly a figurehead for an increasingly Presbyterian Scotland of the time, nor for a British Monarchy that has defended the middle-of-the-road Anglican Communion for well over 300 years. Scottish Queens consort were undistinguished, and names such as Maud, Joan, Sybilla, Ethelreda and Gruoch are hardly able to capture the imagination of the Scottish public. Among them Margaret is again the most popular name, followed by Elizabeth. The most popular Scottish girls’ names in 2012 were Sophie, followed by Emily, Olivia, Ava and Lucy. Hardly Royal, and not much different from England. A statement cannot easily be made. The last truly Scottish Royal was Queen Elizabeth the Queen Mother, so Elizabeth might be a family option. The policy might be to play safe, or, in a thorough break with tradition, let the parents decide.

Kevin Marshall

Declaration of Interest

I was born and bred in England, but my Mother is, and three of my grandparents were, Scottish. I named my son Edmund Alexander. The latter name was after a Great Grandfather and an Uncle who was always known as Alistair. I consider myself British, and am proud of both my Derbyshire and North Scottish ancestry.


Freeman Dyson on Climate Models

One of the leading physicists on the planet, Freeman Dyson, has given a video interview to the Vancouver Sun. Whilst the paper emphasizes Dyson’s statements about the impact of more CO2 greening the Earth, there is something more fundamental that can be gleaned.

Referring to a friend who constructed the first climate models, Dyson says at about 10:45:

These climate models are excellent tools for understanding climate, but that they are very bad tools for predicting climate. The reason is simple – that they are models which have very few of the factors that may be important, so you can vary one thing at a time ……. to see what happens – particularly carbon dioxide. But there are a whole lot of things that they leave out. ….. The real world is far more complicated than the models.

I believe that Climate Science has lost sight of what climate models actually are: attempts to understand the real world, but not the real world itself. It reminds me of something another physicist spoke about fifty years ago. Richard Feynman, a contemporary whom Dyson got to know well in the late 1940s and early 1950s, said of theories:-

You cannot prove a vague theory wrong. If the guess that you make is poorly expressed and the method you have for computing the consequences is a little vague then ….. you see that the theory is good as it can’t be proved wrong. If the process of computing the consequences is indefinite, then with a little skill any experimental result can be made to look like an expected consequence.

Complex mathematical models suffer from this vagueness in abundance. When I see supporters of the climate models arguing that the critics are wrong by stating some simple model and using selective data, they are doing what lesser scientists and pseudo-scientists have been doing for decades. How do you confront this problem? Climate is hugely complex, so simple models will always fail on the predictive front. However, unlike Dyson I do not think that all is lost. The climate models have had a very bad track record due to climatologists not being able to relate their models to the real world. There are a number of ways they could do this. A good starting point is to learn from others, drawing upon insights from varied sources. With respect to the complexity of the subject matter, the lack of detailed, accurate data and the problems of prediction, climate science has much in common with economics, and there are insights to be drawn on prediction.

One of the first empirical methodologists was the preeminent (or notorious) economist of the late twentieth century, Milton Friedman. Even without his monetarism and free-market economics, he would be known for his 1953 essay “The Methodology of Positive Economics”. Whilst not agreeing with the entirety of the views expressed (there is no fully satisfactory methodology of economics), Friedman does lay emphasis on making simple, precise and bold predictions. It is the exact opposite of the Cook et al. survey, which claims a 97% consensus on climate, implying a massive and strong relationship between greenhouse gases and catastrophic global warming, when in fact it relates to circumstantial evidence for a minimal belief in (or assumption of) the most trivial form of human-caused global warming.

In relation to climate science, Friedman would say that it does not matter about consistency with the basic physics, nor how elegantly the physics is stated. You could even believe that the warming comes from the hot air produced by the political classes. What matters is that you make bold predictions based on the models that, despite seeming simple and improbable to the non-expert, nevertheless turn out to be true. However, where bold predictions have been made that appear improbable (such as worsening hurricanes after Katrina, or the effective disappearance of Arctic Sea ice in late 2013), they have turned out to be false.

Climatologists could also draw upon another insight, held by Friedman but first clearly stated by John Neville Keynes (father of John Maynard Keynes): the need to clearly distinguish between the positive (what is) and the normative (what ought to be). But that distinction would alienate the funders and political hangers-on. It would also mean a clear split between the science and the policy.

Hat tips to Hilary Ostrov, Bishop Hill, and Watts up with that.


Kevin Marshall

Windhoek Temperature adjustments

At Euan Mearns’ blog I made reference to my findings, posted in full last night, that the Isfjord Radio weather station had adjustments that varied between +4.0°C in 1917 and -1.7°C in the 1950s. I challenged anyone to find bigger adjustments than that. Euan came back with the example of Windhoek in Namibia, claiming 5°C of adjustments between the “raw” and GISS homogenised data.

I cry foul, as the adjustments run throughout the data set.

That is, the whole of the data set has been adjusted up by about 4°C!

However, comparing the “raw” data with the GISS homogenised data using 5-year moving averages (alongside the net adjustments), there are some interesting features.

The overall temperatures have been adjusted up by around 4°C, but:

  • From the start of the record in 1920 to 1939 the cooling has been retained, if not slightly amplified.
  • The warming from 1938 to 1947 of 1.5°C has been erased by a combination of deleting the 1940 to 1944 data and reducing the 1945-1948 adjustments by 1.4°C.
  • The 1945-1948 adjustments, along with random adjustments and deletion of data, mostly remove the near 1.5°C of cooling from the late 1940s to the mid-1950s and the slight rebound through to the early 1960s.
  • The early 1970s cooling and the warming to the end of the series in the mid-1980s are largely untouched.

The overall adjustments leave a peculiar picture that cannot be explained by a homogenisation algorithm. The cooling in the 1920s offsets the global trend. Deletion of data and adjustments counter the peak of warming in the early 1940s seen in the global data. Natural variations in the raw data between the late 1940s and 1970 appear to have been removed, while the slight early 1970s cooling and the subsequent warming in the raw data are left alone. However, the raw data shows average temperatures in the 1980s to be around 0.8°C higher than in the early 1920s. The adjustments seem to have removed this.
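For anyone wanting to reproduce this kind of comparison, a minimal sketch is below. The file names and column layout are hypothetical; real GISS station records would first need parsing into annual means.

```python
import pandas as pd

# Hypothetical CSV files of annual mean temperatures, columns: year, temp.
raw = pd.read_csv("windhoek_raw.csv", index_col="year")["temp"]
homog = pd.read_csv("windhoek_giss_homog.csv", index_col="year")["temp"]

# Net adjustment in each year = homogenised minus raw.
adjustment = (homog - raw).rename("net_adjustment")

# 5-year centred moving averages smooth out year-to-year noise.
smooth = pd.DataFrame({
    "raw_5yr": raw.rolling(5, center=True).mean(),
    "homog_5yr": homog.rolling(5, center=True).mean(),
    "adjustment": adjustment,
})
print(smooth.describe())       # overall size and spread of the adjustments
print(smooth.dropna().head())  # year-by-year comparison
```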

This removal of the warming trend tends to disprove something else. There appears to be no clever conspiracy, with a secret set of true figures. Rather, there are a lot of people dipping in to adjust already-adjusted data to their view of the world, but nobody really questioning the results. They have totally lost sight of what the real data actually is. If they had compared the final adjusted data with the raw data, they would have realised that the adjustments had managed to eliminate a warming trend of over 1°C per century.

Kevin Marshall

The Propaganda methods of ….and Then There’s Physics on Temperature Homogenisation

There has been a rash of blog articles about temperature homogenisations that is challenging the credibility of the NASA GISS temperature data. This has led to attempts by the anonymous blogger andthentheresphysics (ATTP) to crudely deflect from the issues identified. It is a propagandist’s trick of turning people’s perspectives. Instead of a dispute about some scientific data, ATTP turns the affair into a dispute between those with authority and expertise in scientific analysis and a few crackpot conspiracy theorists.

The issues on temperature homogenisation are to do with the raw surface temperature data and the adjustments made to remove anomalies or biases within the data. “Homogenisation” is the term used for the process of adjusting anomalous data into line with that from the surrounding stations.
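As a rough illustration of the idea (a sketch with made-up numbers, not any agency's actual algorithm): if a station's record steps away from its neighbours at a known point, the offset can be estimated and removed.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1950, 2000)

# Made-up example: a reference series from neighbouring stations, and a
# target station that reads 1.5 C too low before a 1975 site move.
reference = 20.0 + 0.01 * (years - 1950) + rng.normal(0, 0.2, years.size)
target = reference + rng.normal(0, 0.2, years.size)
target[years < 1975] -= 1.5   # artificial step change (e.g. station move)

# Estimate the offset as the mean target-minus-reference difference
# before the breakpoint, then adjust the earlier segment.
offset = np.mean((target - reference)[years < 1975])
homogenised = target.copy()
homogenised[years < 1975] -= offset

print(f"estimated offset: {offset:+.2f} C")  # close to -1.5 by construction
```

The disputes below are about what happens when such adjustments are applied without a sound reference point, or all lean the same way.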

The blog articles can be split into three categories. The primary articles are those that make direct reference to the raw data set and the surrounding adjustments. The secondary articles refer to the primary articles, and comment upon them. The tertiary articles are directed at the secondary articles, making little or no reference to the primary articles. I perceive the two ATTP articles as fitting into the scheme below.

Primary Articles

The source of the complaints about temperature homogenisations is Paul Homewood at his blog notalotofpeopleknowthat. The data source is NASA’s Goddard Institute for Space Studies (GISS) database. For any weather station GISS provide nice graphs of the temperature data: the current “after GISS homogeneity adjustment” data is available here, and the “raw GHCN data + USHCN corrections” is available here, though only up until 2011. Homewood’s primary analysis was to show the raw and adjusted data side by side.

20/01/15 Massive Tampering With Temperatures In South America

This looked at all three available rural stations in Paraguay. The data from all three, at Puerto Casado, Mariscal and San Juan Bautista/Misiones, had the same pattern of homogenization adjustments: cooling of the past, so that instead of the raw data showing the 1960s as warmer than today, the adjusted data showed them as cooler. What could they have been homogenized to?

26/01/15 All Of Paraguay’s Temperature Record Has Been Tampered With

This checked the six available urban sites in Paraguay. Homewood’s conclusion was that

warming adjustments have taken place at every single, currently operational site in Paraguay.

How can homogenization adjustments all go the same way? There is no valid reason for making such adjustments, as there is no reference point for them.

29/01/15 Temperature Adjustments Around The World

Homewood details other examples from Southern Greenland, Iceland, Northern Russia, California, Central Australia and South-West Ireland. Instead of comparing the raw with the adjusted data, he compared the old adjusted data with the recent data. Adjustment decisions are changing over time, making the adjusted data sets give even more pronounced warming trends.

30/01/15 Cooling The Past In Bolivia

Then he looked at all 14 available stations in neighbouring Bolivia. His conclusion

At every station, bar one, we find the ….. past is cooled and the present warmed.”

(The exception was La Paz, where the cooling trend in the raw data had been reduced.)

Why choose Paraguay in the first place? In the first post, Homewood explains that within a NOAA temperature map for the period 1981-2010 there appeared to be a warming hotspot around Paraguay. Being a former accountant, he checked the underlying data to see if the hotspot existed in the data. Finding an anomaly in one area, he checked more widely.

The other primary articles are

26/01/15 Kevin Cowtan NOAA Paraguay Data

This YouTube video was made in response to Christopher Booker’s article in the Telegraph, a secondary source of data. Cowtan assumes that Booker is the primary source, and that he is criticizing NOAA data. A screenshot of the first paragraph shows both assumptions are untrue.

Further, if you read down the article, Cowtan’s highlighting of the data from one weather station is also misleading. Booker points to three stations, but Cowtan illustrates just one.

Despite this, it still ranks as a primary source, as there are direct references to the temperature data and the adjustments. They are not GISS adjustments, but might be the same.

29/01/15 Shub Niggurath – The Puerto Casado Story

Shub looked at the station moves. He found that the metadata for the station is a mess, so there is no actual evidence of the location changing. But, Shub reasons, the fact that there was a step change in the data was taken to mean that the station moved, and the fact that it moved was then used to justify the adjustment. Shub is a primary source as he examines the reason for the adjustment.


Secondary Articles

The three secondary articles by Christopher Booker, James Delingpole and Bishop Hill are just the connectors in this story.


Tertiary articles of “…and Then There’s Physics”

25/01/15 Puerto Casado

This looked solely at Booker’s article. It starts

Christopher Booker has a new article in the The Telegraph called Climategate, the sequel: How we are STILL being tricked with flawed data on global warming. The title alone should be enough to convince anyone sensible that it isn’t really worth reading. I, however, not being sensible, read it and then called Booker an idiot on Twitter. It was suggested that rather than insulting him, I should show where he was wrong. Okay, this isn’t really right, as there’s only so much time and effort available, and it isn’t really worth spending it rebutting Booker’s nonsense.

However, thanks to a tweet from Ed Hawkins, it turns out that it is really easy to do. Booker shows data from a site in Paraguay (Puerto Casado) in which the data was adjusted from a trend of -1.37°C per century to +1.36°C per century. Shock, horror, a conspiracy?


ATTP is highlighting an article, but is strongly discouraging anybody from reading it. That is why the referral is a red line in the graphic above. He then says he is not going to provide a rebuttal, and ATTP is as good as his word: he does not provide one. Basically he is saying “Don’t look at that rubbish, look at the real authority“. But he is wrong for a number of reasons.

  1. ATTP provides misdirection to an alternative data source. Booker quite clearly states that the source of the data is the NASA GISS temperature set. ATTP cites Berkeley Earth.
  2. Booker clearly states that there are three rural temperature stations, spatially spread, that show similar results. ATTP’s argument that a single site was homogenized with the others in the vicinity falls over.
  3. This was further undermined by Paul Homewood’s posting on the same day on the other 6 available sites in Paraguay, all giving similar adjustments.
  4. It was further undermined by Paul Homewood’s posting on 30th January on all 14 sites in Bolivia.

The story is not of a wizened old hack making some extremist claims without any foundation, but of a retired accountant seeing an anomaly, and exploring it. In audit, if there is an issue then you keep exploring it until you can bottom it out. Paul Homewood has found an issue, found it is extensive, but is still far from finding its full extent or depth. ATTP, when confronted by my summary of the 23 stations that corroborate each other, chose to delete it. He has now issued an update.

Update 4/2/2015 : It’s come to my attention that some are claiming that this post is misleading my readers. I’m not quite sure why, but it appears to be related to me not having given proper credit for the information that Christopher Booker used in his article. I had thought that linking to his article would allow people to establish that for themselves, but – just to be clear – the idiotic, conspiracy-laden, nonsense originates from someone called Paul Homewood, and not from Christopher Booker himself. Okay, everyone happy now?

ATTP cannot accept that he is wrong. He has totally misrepresented the arguments. When confronted with alternative evidence, ATTP resorts to vitriolic claims. If someone is on the side of truth and science, they will encourage people to compare and contrast the evidence. He seems to have forgotten the advice about what to do when in a hole…..

Temperature homogenisation

ATTP’s article on Temperature Homogenisation starts

Amazing as it may seem, the whole tampering with temperature data conspiracy has managed to rear its ugly head once again. James Delingpole has a rather silly article that even Bishop Hill calls interesting (although, to be fair, I have a suspicion that in “skeptic” land, interesting sometimes means “I know this is complete bollocks, but I can’t bring myself to actually say so”). All of Delingpole’s evidence seems to come from “skeptic” bloggers, whose lack of understanding of climate science seems – in my experience – to be only surpassed by their lack of understanding of the concept of censorship.

ATTP starts with a presumption of being on the side of truth, with no fault possible on his side. Any objections are due to a conscious effort to deceive. The theory of cock-up, or of people simply not checking their data, does not seem to have occurred to him. Then there is a link to Delingpole’s secondary article, but calling it “silly” again deters readers from looking for themselves. If they did look, readers would be presented with flashing images of all the “before” and “after” GISS graphs from Paraguay, along with links to the six global sites and Shub’s claims that there is a lack of evidence for the Puerto Casado site being moved. Delingpole was not able to include the more recent evidence from Bolivia, which further corroborates the story.

He then makes a tangential reference to his deleting my previous comments, though I never once used the term “censorship”, nor did I tag the article “climate censorship”, as I have done to some others. Like on basic physics, ATTP claims to have a superior understanding of censorship.

There are then some misdirects.

  • The long explanation of temperature homogenisation makes some good points. But what it does not do is explain that the size and direction of any adjustment is a matter of judgement, and as such can be wrong. It is a misdirection to say that the secondary sources are against any adjustments; they are against adjustments that create biases within the data.
  • Quoting Richard Betts’s comment on Booker’s article about negative adjustments in sea temperature data is a misdirection, as Booker (a secondary source) was talking about Paraguay, a land-locked country.
  • Referring to Cowtan’s alternative analysis is another misdirect, as pointed out above. Upon reflection, ATTP may find it a tad embarrassing to have this as his major source of authority.


When I studied economics, many lecturers said that if you want to properly understand an argument or debate you need to look at the primary sources, and then compare and contrast the arguments. Although the secondary sources were useful background, particularly in a contentious issue, it is the primary sources on all sides that enable a rounded understanding. Personally, being challenged by viewpoints that I disagreed with enhanced my overall understanding of the subject.

ATTP has managed to turn this on its head. He uses methods akin to those of the crudest propagandists of the last century. They started from deeply prejudiced positions; attacked an opponent’s integrity and intelligence; and then deflected away to what they wanted to say. They never gave the slightest hint that their own side might be at fault, or any acknowledgement that the other may have a valid point. For ATTP, and similar modern propagandists, rather than a debate about the quality of evidence and science, it becomes a war of words between “deniers“, “idiots” and “conspiracy theorists” on the one hand and the basic physics and the overwhelming evidence that supports that science on the other.

If there is any substance to these allegations concerning temperature adjustments, then for dogmatists like ATTP they become a severe challenge to their view of the world. If temperature records have systematic adjustment biases, then climate science loses its grip on reality. The climate models cease to be about understanding the real world, and instead conform to people’s flawed opinions about it.

The only way to properly understand the allegations is to examine the evidence. That is to look at the data behind the graphs Homewood presents. I have now done that for the nine Paraguayan weather stations. The story behind that will have to await another day. However, although I find Paul Homewood’s claims of systematic biases in the homogenisation process to be substantiated, I do not believe that it points to a conspiracy (in terms of a conscious and co-ordinated attempt to deceive) on the part of climate researchers.

DECC’s Dumb Global Calculator Model

On the 28th January 2015, the DECC launched a new policy emissions tool, so everyone can design policies to save the world from dangerous climate change. I thought I would try it out. By simply changing the parameters one by one, I found that the model is massively over-sensitive to small changes in input parameters and appears to be based on British data. From the model, it is possible to entirely eliminate CO2 emissions by 2100 by a combination of three things: reducing the percentage of urban travel by car from 43% to 29%; reducing the average size of homes from 110m2 today to 95m2; and everyone going vegetarian.

The DECC website says

Cutting carbon emissions to limit global temperatures to a 2°C rise can be achieved while improving living standards, a new online tool shows.

The world can eat well, travel more, live in more comfortable homes, and meet international carbon reduction commitments according to the Global Calculator tool, a project led by the UK’s Department of Energy and Climate Change and co-funded by Climate-KIC.

Built in collaboration with a number of international organisations from US, China, India and Europe, the calculator is an interactive tool for businesses, NGOs and governments to consider the options for cutting carbon emissions and the trade-offs for energy and land use to 2050.

Energy and Climate Change Secretary Edward Davey said:

“For the first time this Global Calculator shows that everyone in the world can prosper while limiting global temperature rises to 2°C, preventing the most serious impacts of climate change.

“Yet the calculator is also very clear that we must act now to change how we use and generate energy and how we use our land if we are going to achieve this green growth.

“The UK is leading on climate change both at home and abroad. Britain’s global calculator can help the world’s crucial climate debate this year. Along with the many country-based 2050 calculators we pioneered, we are working hard to demonstrate to the global family that climate action benefits people.”

Upon entering the calculator I was presented with some default settings. Starting from baseline emissions in 2011 of 49.9 Gt CO2e, these would give predicted emissions of 48.5 Gt CO2e in 2050 and 47.9 Gt CO2e in 2100 – virtually unchanged. Cumulative emissions to 2100 would be 5248 Gt CO2e, compared with the 3010 Gt CO2e target that gives a 50% chance of limiting warming to a 2°C rise. So the game is on to save the world.

I only dealt with the TRAVEL, HOMES and DIET sections on the left.

I went through each of the parameters, noting the results and then resetting back to the baseline.

The TRAVEL section seems to be based on British data, and concentrates on urban travel. Extrapolating this to the rest of the world seems a bit of a stretch, particularly when over 80% of the world is poorer. I was struck first by changing the mode of travel. If car usage in urban areas fell from 43% to 29% of journeys, global emissions from all sources in 2050 would be 13% lower. If car usage in urban areas increased from 43% to 65%, global emissions from all sources in 2050 would be 7% higher. The proportions are wrong (a 14-point fall gives -13%, but a 22-point rise gives only +7%), and urban travel looks like too high a proportion of global emissions.
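A quick way to see the asymmetry is to express each result per percentage point of car share (a sketch using only the two results quoted above):

```python
# Output sensitivity per percentage point of urban car use, from the
# Global Calculator results quoted above.
runs = [
    {"car_share_change": -14, "emissions_change_pct": -13},  # 43% -> 29%
    {"car_share_change": +22, "emissions_change_pct": +7},   # 43% -> 65%
]
for run in runs:
    ratio = run["emissions_change_pct"] / run["car_share_change"]
    print(f"{run['car_share_change']:+d} points -> "
          f"{run['emissions_change_pct']:+d}% global emissions "
          f"({ratio:.2f}% per point)")
```

A fall is worth about 0.93% of global emissions per point, a rise only about 0.32% per point.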

The HOMES section has similar anomalies. Reducing the average home area by 2050 to 95m2, from 110m2 today, reduces total global emissions in 2050 by 20%. Independently, setting average urban home temperatures in 2050 to 17°C in winter and 27°C in summer, instead of 20°C and 24°C, reduces total global emissions in 2050 by 7%. Both seem to be based on British data, and are highly implausible in a global context.

In the DIET section things get really silly. Cutting average calorie consumption globally by 10% reduces total global emissions in 2050 by 7%. I never realised that saving the planet required some literal belt tightening. Then we move onto meat consumption. The baseline for 2050 is 220 kcal per person per day, against the current European average of 281 kcal. Reducing that to 14 kcal reduces global emissions from all sources in 2050 by 73%. Alternatively, plugging in the “worst case” 281 kcal increases global emissions from all sources in 2050 by 71%. That is, if the world becomes as carnivorous in 2050 as the average European was in 2011, global emissions from all sources, at 82.7 Gt CO2e, will be over six times higher than the 13.0 Gt CO2e with minimal meat-eating. For comparison, OECD and Chinese emissions from fossil fuels in 2013 were respectively 10.7 and 10.0 Gt CO2. It seems it will be nut cutlets all round at the climate talks in Paris later this year. No need for China, India and Germany to scrap all their shiny new coal-fired power stations.

Below is the before and after of the increase in meat consumption.

Things get really interesting if I take the three most sensitive, yet independent, scenarios together. That is: reducing urban car use from 43% to 29% of journeys in 2050; reducing the average home area by 2050 to 95m2 from 110m2; and effectively making a sirloin steak (medium rare) and venison in redcurrant sauce things of the past. Adding the separate impacts together gives global emissions of -2.8 Gt CO2e in 2050 and -7.1 Gt CO2e in 2100, with cumulative emissions to 2100 of 2111 Gt CO2e. The model does have some combination effect: run together, it gives global emissions of 3.2 Gt CO2e in 2050 and -0.2 Gt CO2e in 2100, with cumulative emissions to 2100 of 2453 Gt CO2e. Below is the screenshot of the combined elements, along with a full table of my results.
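The additive arithmetic can be checked in a few lines (a sketch using only the percentage impacts quoted above, not the calculator's internal workings):

```python
# Naive additive combination of the three Global Calculator scenarios,
# using the 2050 percentage impacts quoted earlier in this post.
baseline_2050 = 48.5            # Gt CO2e, the calculator's default for 2050
impacts = {"car use 43% -> 29%": -0.13,
           "homes 110m2 -> 95m2": -0.20,
           "meat 220kcal -> 14kcal": -0.73}

combined = baseline_2050 * (1 + sum(impacts.values()))
print(f"sum of impacts: {sum(impacts.values()):+.0%}")           # -106%
print(f"additive 2050 emissions: {combined:.1f} Gt CO2e")        # about -2.9
# The calculator itself reports 3.2 Gt CO2e for the combined run, so it
# applies some interaction between the measures rather than simple addition.
```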

It might be great to laugh at the DECC for not sense-checking the outputs of its glitzy bit of software. But it concerns me that the people responsible for this nonsense are more than likely also responsible for the glossy plans to cut Britain’s emissions by 80% by 2050 without destroying hundreds of thousands of jobs, eviscerating the countryside, or reducing living standards, especially of the poor. Independent and critical review and audit of DECC output is long overdue.

Kevin Marshall


A spreadsheet model is also available, but I used the online tool, with its excellent graphics. The calculator is built by a number of organisations.

Global Emissions Reductions Targets for COP21 Paris 2015

There is a huge build-up underway for the COP21 climate conference to be staged in Paris in November. Many countries and NGOs are pushing for an agreement that will constrain warming to just 2°C, but there are no publicly available figures for what this means for all the countries of the world. This is the gap I seek to close with a series of posts. This first post is concerned with getting a perspective on global emissions and the UNIPCC targets.

In what follows, all the actual figures are obtained from three primary sources.

  • Emissions data comes from the Carbon Dioxide Information Analysis Centre or CDIAC.
  • Population data comes from the World Bank, though a few countries are missing. These are mostly from Wikipedia.
  • The Emissions targets can be found in the Presentation for the UNIPCC AR5 Synthesis Report.

All categorizations and forecast estimates are my own.

The 1990 Emissions Position

A starting point for emissions reductions is to stabilize emissions at 1990 levels, around the time that climate mitigation was first proposed. To illustrate the composition of emissions I have divided the countries of the world into the major groups meaningful at that time: roughly, the First World developed nations, the Second World developed communist countries and the Third World developing economies. The First World is represented by the OECD; I have only included countries that were members in 1990, with the USA shown separately. The Second World is the ex-Warsaw Pact countries, with the countries of the former Yugoslavia included as well. The rest of the world is divided into five groups. I have charted the emissions per capita against the populations of these groups to produce the following graph.

In rough terms, one quarter of the global population accounted for two-thirds of global emissions. A major reduction on total emissions could therefore be achieved by these rich countries taking on the burden of emissions reductions, and the other countries not increasing their emissions, or keeping growth to a minimum.

The 2020 emissions forecast

I have created a forecast of both emissions and population for 2020 using the data up to 2013. Mostly these forecasts assume the same change in the seven years to 2020 as in the previous seven. For emissions in the rapidly-growing countries this might be an understatement. For China and India I have done separate forecasts based on their emissions commitments. This gives the following graph.

The picture has changed dramatically. Population has increased by 2.4 billion, or 45%, and emissions by over 80%. Global average emissions have increased from 4.1 to 5.2 t CO2 per capita. Due to the population increase, to return global emissions to 1990 levels would mean reducing average emissions to 2.85 t CO2 per capita.
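The per-capita target follows directly from the population growth; a quick check using the rounded figures above:

```python
# Arithmetic behind the 2.85 t CO2 per capita figure quoted above.
percap_1990 = 4.1      # t CO2 per person in 1990
pop_growth = 1.45      # 2020 population = 145% of 1990 population

# Holding total emissions at 1990 levels while population grows 45%:
percap_target_2020 = percap_1990 / pop_growth
print(f"{percap_target_2020:.2f} t CO2 per capita")
# Gives about 2.83; the post's 2.85 reflects unrounded inputs.
```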

The change in the composition of emissions is even more dramatic. The former First and Second World countries will see a slight fall in emissions, from 14.9 to 14.0 billion tonnes of CO2, and their global share will have reduced from 68% to 36%. Although their total population will have increased since 1990, slower growth than elsewhere means their share of global population will have shrunk to just 19%. China will have a similar population, and with forecast emissions of 13.1 billion tonnes of CO2, 33% of the global total.

The picture is not yet complete. On slide 30 of their Synthesis Report presentation the UNIPCC state

Measures exist to achieve the substantial emissions reductions required to limit likely warming to 2°C (40-70% emissions reduction in GHGs globally by 2050 and near zero GHGs in 2100)

The baseline is 2011, when global emissions were 29.74 billion tonnes of CO2. In 2050 global population will be nearly nine billion. This gives an upper limit of 2.2 t CO2 per capita and a lower limit of 1.1 t CO2 per capita.

To put this in another perspective, consider the proportion of people living in countries that would need emissions reduction targets because their emissions are greater than 2.2 t CO2 per capita.

In 1990, it was just a third of the global population. In 2020 it will be three quarters. No longer can an agreement on constraining global CO2 emissions be limited to a few countries. It needs to be truly global. The only area that meets the target is Africa, but even here the countries of Algeria, Egypt, Libya, Tunisia and South Africa would need to have emission reduction targets.

Further Questions

  1. What permutations are possible if other moral considerations are taken into account, such as the developed countries bearing the burden of emission cuts?
  2. What targets should be set for non-fossil-fuel emissions, such as from agriculture? Are these easier or harder to achieve than for fossil fuels?
  3. What does meeting emission targets mean for different types of economies? For instance, are emission reductions more burdensome for the fast-growing emerging economies than for the developed economies?
  4. What are the measures that the IPCC claims exist to reduce emissions? Are they more onerous than the consequences of climate change?
  5. Are there measures in place to support the states dependent on the production of fossil fuels? In particular, the loss of income to the Gulf States from leaving oil in the ground may further destabilize the area.
  6. What sanctions will apply if some countries refuse to sign up to an agreement, or are politically unable to implement one?
  7. What penalties will be imposed if countries fail to abide by the agreements made?

Kevin Marshall

The Truth About Davey’s Energy Savings


Ed Davey’s claim that the DECC published “a complete picture of everything that affects final energy bills” is refuted by Paul Homewood below.
This is far from an exhaustive list. For instance, there are also the costs of upgrading the National Grid to transport the electricity generated by remote wind turbines to the centres of population; the impact on jobs and growth of increasing energy costs relative to other nations; and the more esoteric costs to democracy of having a group of people with dogmatic beliefs in a specialist applied subject claiming that this gives them superior insights into public policy-making, policy implementation and economic theory.

Originally posted on NOT A LOT OF PEOPLE KNOW THAT:

By Paul Homewood


Ed Davey has been stung into defending his disastrous energy policies, following revelations that his department had disgracefully attempted to hide data, showing that electricity prices would soon be 40% higher, as a result of climate policies.

The above letter was published in last week’s Sunday Telegraph. Unfortunately, he is being rather economical with the truth.

First, let’s recap on the energy savings which Davey says will make us so much better off. The table below is from the data that DECC tried to hide.


The so-called savings are listed under 2).


Ed Hoskins: Capital Cost and Production Effectiveness of Renewable Energy in Europe – the Data


Ed Hoskins provides a very wide-ranging analysis of the capital costs of renewables in Europe, with information on all the major countries. Despite total investment of $500bn so far, renewables provide just 2.9% of actual power generated. Hoskins also provides some graphical data on the “Intermittency and Non-dispatchability” of energy output, helping highlight that renewables are not just expensive, they are also pretty useless at providing power when required.
The one weakness in the analysis is in the costs per unit of output, something outside the main purpose of the post. The source of that data is the U.S. Energy Information Administration. This uses “Overnight Capital Cost” (Table 2-5 on page 44 of the pdf file), which measures capital costs per unit of capacity, not per unit of output. So, for instance, “Onshore Wind” appears to have only 2.2 times the capital cost of “Natural Gas Advanced Combined Cycle”. But assuming the former operates at 25% of capacity and the latter at 85%, the capital cost of wind power becomes 7.5 times that of gas per unit of output. Similarly, assuming offshore wind operates at 35% of capacity, its relative capital cost rises from 6.2 to 14.8 times that of gas.

Another point is that the EIA does not consider conventional coal-fired power stations, possibly inflating the price by some measure of the “Social Cost of Carbon”. Using the average price in AR4 of $12 per tonne of CO2 (Synthesis Report, page 69), and given that a coal-fired power station produces about 500 kg per megawatt-hour, this $6 per megawatt-hour is trivial compared with the much higher cost of renewables.
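The adjustment from capacity cost to output cost is simple arithmetic; below is a sketch using the figures above (the capacity factors are the assumptions stated in the text, not EIA data):

```python
# Capacity-factor adjustment of relative capital costs, as described above.
def cost_per_output(overnight_cost_ratio, capacity_factor, cf_gas=0.85):
    """Scale a capital-cost ratio (vs gas CC) by relative utilisation."""
    return overnight_cost_ratio * cf_gas / capacity_factor

onshore = cost_per_output(2.2, 0.25)   # ~7.5x gas, as quoted
offshore = cost_per_output(6.2, 0.35)  # ~15x gas (the post quotes 14.8,
                                       # presumably from unrounded inputs)
scc_per_mwh = 12 * 0.5                 # $12/t CO2 x 0.5 t/MWh = $6/MWh

print(f"onshore wind: {onshore:.1f}x gas per unit of output")
print(f"offshore wind: {offshore:.1f}x gas per unit of output")
print(f"social cost of carbon for coal: ${scc_per_mwh:.0f}/MWh")
```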

Originally posted on Tallbloke's Talkshop:

Guest post from Ed Hoskins
A comparison of both the Capital Cost and Energy Production Effectiveness of the Renewable Energy in Europe.

The diagrams and table below collate the cost and capacity factors of Renewable Energy power sources, Onshore and Off-shore Wind Farms and Large scale Photovoltaic Solar generation, compared to the cost and output capacity of conventional Gas Fired Electricity generation.


The associated base data is shown below:


