Climate Alarmist Bob Ward’s poor analysis of Research Data

After Christopher Booker’s excellent new report for the GWPF, “Global Warming: A Case Study In Groupthink”, was published on 20th February, Bob Ward (Policy and Communications Director at the Grantham Research Institute on Climate Change and the Environment at the LSE) wrote a rebuttal article, “Do male climate change ‘sceptics’ have a problem with women?“. Ward commenced the article with a highly misleading statement.

On 20 February, the Global Warming Policy Foundation launched a new pamphlet at the House of Lords, attacking the mainstream media for not giving more coverage to climate change ‘sceptics’.

I will leave it to the reader to judge for themselves how misleading the statement is, by reading the report or alternatively by reading his summary.

At Cliscep (reproduced at WUWT), Jaime Jessop has looked into Ward’s distracting claims about GWPF gender bias. This comment by Ward particularly caught my eye.

A tracking survey commissioned by the Department for Business, Energy and Industrial Strategy showed that, in March 2017, 7.6% answered “I don’t think there is such a thing as climate change” or “Climate change is entirely caused by natural processes”, when asked for their views. Among men the figure was 8.1%, while for women it was 7.1%.

I looked at the Tracking Survey. It is interesting that the Summary of Key Findings contains no mention of gender bias, nor of beliefs on climate change. It is only in the Wave 21 full dataset spreadsheet that you find the results of question 22.

Q22. Thinking about the causes of climate change, which, if any, of the following best describes your opinion?
1. Climate change is entirely caused by natural processes
2. Climate change is mainly caused by natural processes
3. Climate change is partly caused by natural processes and partly caused by human activity
4. Climate change is mainly caused by human activity
5. Climate change is entirely caused by human activity
6. I don’t think there is such a thing as climate change.
7. Don’t know
8. No opinion

Note that the first option presented to the respondent is 5, then 4, then 3, then 2, then 1. There may, therefore, be a built-in bias towards overstating support for climate change being attributed to human activity. But the data is clearly presented, so a quick pivot table was able to check Ward’s results.

The sample was of 2180 – 1090 females and 1090 males. Adding the responses to “I don’t think there is such a thing as climate change” or “Climate change is entirely caused by natural processes” I get 7.16% for females – (37+41)/1090 – and 8.17% for males – (46+43)/1090. Clearly, Bob Ward has failed to remember what he was taught in high school about rounding.

Another problem is that this is raw data. The opinion pollsters have taken time and care to adjust for various demographic factors by adding a weighting to each line. On this basis, Ward should have reported 6.7% for females, 7.6% for males and 7.1% overall.
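The raw pivot-table check can be sketched in a few lines of Python. Note this reproduces only the unweighted figures quoted above; the per-response weights in the actual BEIS spreadsheet vary line by line and are not shown here.

```python
# Check of Ward's percentages from the raw (unweighted) Wave 21 counts
# quoted in the text: 37 + 41 sceptical responses among 1090 females,
# 46 + 43 among 1090 males.

def sceptic_share(no_such_thing, entirely_natural, sample_size):
    """Percentage answering 'no such thing as climate change' or
    'entirely caused by natural processes'."""
    return 100 * (no_such_thing + entirely_natural) / sample_size

female = sceptic_share(37, 41, 1090)  # 7.16%, not Ward's 7.1%
male = sceptic_share(46, 43, 1090)    # 8.17%, not Ward's 8.1%

print(f"Female: {female:.2f}%  Male: {male:.2f}%")
```

The same function applied to the weighted counts in the spreadsheet would give the 6.7% and 7.6% figures discussed below.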

More importantly, if males tend to be more sceptical of climate change than females, then they should also be less alarmist than females. But the data says something different. Of the weighted responses, 12.5% of females and 14.5% of males opted for the most extreme alarmist option, “Climate change is entirely caused by human activity”. So, very fractionally, at the extremes men are proportionally more alarmist than women as well as more sceptical. More importantly, men are slightly more extreme in their opinions on climate change (for or against) than women.

The middle ground is the response to “Climate change is partly caused by natural processes and partly caused by human activity“. The weighted response was 44.5% of females and 40.7% of males, confirming that men are more extreme in their views than women.

There is a further finding that can be drawn. The IPCC’s projections of future unmitigated global warming assume that all, or the vast majority, of global warming since 1850 is human-caused. Only 41.6% of British women and 43.2% of British men agree with this assumption that justifies climate mitigation policies.

Below are my summaries. My results are easily replicated for those with an intermediate level of proficiency in Excel.

Learning Note

The most important lesson in understanding data is to analyse it from different perspectives, against different hypotheses. Bob Ward’s claim of a male gender bias towards climate scepticism in an opinion survey, upon a slightly broader analysis, becomes a finding that British males are slightly more extreme and forthright in their views than British females, whether for or against. This has parallels with my conclusion on the 2013 US study The Role of Conspiracist Ideation and Worldviews in Predicting Rejection of Science – Stephan Lewandowsky, Gilles E. Gignac, Klaus Oberauer. There I found that, rather than conspiracist ideation being “associated with the rejection of all scientific propositions tested” as the paper claimed, the data strongly indicated that people with strong opinions on one subject, whether for or against, tend to have strong opinions on other subjects, whether for or against. As with any bias of perspective (ideological, religious, gender, race, social class, national, football team affiliation etc.), the way to counter bias is to concentrate on the data. Opinion polls are a poor starting point, but at least they may report on perspectives outside of one’s own immediate belief systems.

Kevin Marshall

Sea Level Rise Projections and Policy

One blog I follow is TrustYetVerify. The latest post – Projecting sea level 300, nah, 1000 years in the future – is straightforward and highlights some significant issues for climate policy.

He compares the claim of an activist in a Belgian newspaper, that unmitigated climate change will result in sea level rise of 5 metres in 300 years, with a graphic from UNIPCC AR5 WG1 Chapter 13 on sea level rise, which showed at most around a 3 metre rise.

There was a good spot by Michel in relation to a graphic from a December 2017 presentation on the impacts of an 8 metre rise in sea levels by the year 3000. It was originally from a 2004 Greenpeace document. However, the earlier document also showed the impacts of current sea inundation and of a 1 metre sea level rise.

There are some lessons that can be learnt.

Marginal Difference of policy

The graphic’s “current” sea coverage includes large areas of the Netherlands that are not in fact covered by sea water. To create the graphic, the authors have removed the dykes that have enabled the Netherlands to vastly increase its land area. This not only vastly exaggerates the impact of sea level rise, but contains the assumption that people are too dumb to counter rising seas by building the dykes higher. Even the exaggerated claim of 5 metres in 300 years implies an average rate of rise of about 17mm per annum, with a maximum rate of maybe 30mm. What is more, any rise is predictable over decades. Decisions can be made over 20-50 year timescales, which are far less onerous than taking the very long-term perspective. Even if a 5 metre rise over 300 years were accurate, either building dykes now on the assumption that sea levels will be 5 metres higher, or abandoning areas that will eventually be inundated, would impose needless costs on this generation and the next few generations.
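The rate arithmetic behind the 5-metre claim is worth setting out. Even taken at face value, the claimed rise works out to a modest annual rate, well within the reach of dyke-building decisions made on 20-50 year horizons:

```python
# Average annual rate implied by the activist's claim of a 5 m rise
# over 300 years, and the rise a 50-year planning horizon would face.

rise_mm = 5 * 1000   # claimed total rise, in millimetres
years = 300

average_rate = rise_mm / years  # ~16.7 mm per annum
print(f"Average rate: {average_rate:.1f} mm/yr")

# At the average rate, even a 50-year planning horizon implies under
# a metre of rise to design dykes against.
per_50_years = average_rate * 50 / 1000
print(f"Rise per 50-year horizon: {per_50_years:.2f} m")
```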
There is an even greater policy assumption, one that I repeatedly point out. Climate mitigation through reducing greenhouse gas emissions requires that global emissions are reduced. It does not matter whether Belgium and the Netherlands make massive cuts in their emissions if most other countries do not follow similar policies. As graphic 3.1 from the UNEP Emissions Gap Report 2017 clearly demonstrates, the net impact of all proposed policies is very little compared with doing nothing, and a long way from the 1.5°C or 2°C targets. This is after over 20 years of annual COP meetings aimed at obtaining much bigger reductions.

The marginal impact of sea-level rise is therefore exaggerated by

  • Assuming that the existing flood defences vanish.
  • Assuming people do not build any more defences.
  • Exaggerating the projected rise.
  • Looking at a far greater timescale than rational planning ought to take place.
  • Falsely promoting emissions reductions to combat sea level rise impacts, knowing that whatever a few countries do will not make a difference to overall emissions. If significant warming is caused by human GHG emissions, and this leads to significant sea level rise, then current emissions policies are largely a waste of time.


Checking and Interpreting Forecasts / Projections

Consider the sea level rise graphic from UNIPCC AR5 WG1 Chapter 13.

Consider the projections for the year 2500.

The high scenarios show sea level rise of 1.5 to 6.5m in 2500 for >700ppm CO2.
The medium scenarios show sea level rise of 0.2 to 2.3m in 2500 for 500-700ppm CO2.
The low scenarios show sea level rise of 0.5 to 1.0m in 2500 for <500ppm CO2.

How can the medium scenarios project a lower bottom end than the low scenarios?

The explanation probably lies in different modelling assumptions. After all, the further a scenario departs from the current state of affairs, the greater the uncertainty range should be – unless you assume that the structure of the model contains truths not revealed by any observations.

Further, note that the high scenarios’ lower limit is only 30cm a century, and the top end 1.3m a century, whilst the medium scenarios’ bottom end over five centuries is roughly the rate of sea level rise per century of the last few centuries. That is, well within the medium scenario uncertainty range is the possibility that some global warming will make no difference to the rate of sea level rise.

What I also find interesting is that under the medium scenarios Antarctica is gaining ice, hence reducing sea levels, but under the low scenarios it has no impact whatsoever. Again, this shows the different modelling assumptions used.

Concluding note

Suppose a pharmaceutical company promoted a product with clearly exaggerated claims of its effectiveness, raised false alarm about the need for the product, and deliberately played down the harms that the product could cause to the patient. Even in a world without regulations there would be an outcry, and the company would be sued. In most countries, strict regulations mean that to market a new product, the onus is on the company to demonstrate that the product works, and that the side effects are known. Yet it is apparently all right to promote such falsehoods to “save the planet for future generations“ – and, indeed, to shout down critics as deniers of climate change.

Kevin Marshall

“We’re going to miss the 2°C Warming target” study and IPCC AR5 WG3 Chapter 6

WUWT had a post on 22nd January

Study: we’re going to miss (and overshoot) the 2°C warming target

This comment (from a University of Southampton pre-publication news release) needs some explanation to relate it to IPCC AR5.

Through their projections, Dr Goodwin and Professor Williams advise that cumulative carbon emissions needed to remain below 195-205 PgC (from the start of 2017) to deliver a likely chance of meeting the 1.5°C warming target while a 2°C warming target requires emissions to remain below 395-455 PgC.

The PgC is petagrams of carbon. For small weights one normally uses grams, for larger weights kilograms, and for still larger weights tonnes. (Under the Imperial measurement system, one uses ounces, pounds and tons.) One petagram is thus a billion tonnes, or a gigatonne.
Following the IPCC’s convention, GHG emissions are expressed in units of CO2, not carbon, with other GHGs expressed in CO2e. So 1 PgC = 3.664 GtCO2.

So the emissions budgets from the start of 2017 are 715-750 GtCO2e for 1.5°C of warming and 1447-1667 GtCO2e for 2°C of warming. To make these comparable to IPCC AR5 (specifically to table 6.3 from IPCC AR5 WG3 Chapter 6, p431), one needs to adjust for two things: the IPCC’s projections are from 5 years earlier, and they are for CO2 emissions only – about 75% of GHG emissions.
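The carbon-to-CO2 conversion above can be checked directly, using the mass ratio of CO2 to carbon (44.01/12.01 ≈ 3.664):

```python
# Converting the Goodwin-Williams carbon budgets (PgC) into GtCO2.
# 1 PgC = 1 GtC, and each tonne of carbon corresponds to 3.664 tonnes
# of CO2 (the ratio of the molecular masses, 44.01/12.01).

PGC_TO_GTCO2 = 3.664

def budget_gtco2(low_pgc, high_pgc):
    """Convert a (low, high) budget range from PgC to GtCO2."""
    return (low_pgc * PGC_TO_GTCO2, high_pgc * PGC_TO_GTCO2)

print(budget_gtco2(195, 205))   # ~ (715, 751) GtCO2 for 1.5°C
print(budget_gtco2(395, 455))   # ~ (1447, 1667) GtCO2 for 2°C
```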

The IPCC’s projections of CO2 emissions are 630-1180 GtCO2 for 1.5-1.7°C of warming and 960-1550 GtCO2 for 1.7-2.1°C of warming.

With GHG emissions roughly 50 GtCO2e a year and CO2 emissions 40 GtCO2 a year, the IPCC’s figures, updated to the start of 2017 and expressed in GtCO2e, become 570-1300 GtCO2e for 1.5-1.7°C of warming and 1010-1800 GtCO2e for 1.7-2.1°C of warming.

Taking the mid-points of the IPCC’s and the Goodwin-Williams figures, the new projections say that at current emissions levels 1.5°C will be breached four years earlier, and 2°C one year later. But the mid-points are for 1.6°C and 1.9°C of warming, so it makes no real difference whatsoever. The Goodwin-Williams figures just narrow the ranges and use different units of measure.
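The two-step adjustment described above – subtract five years of CO2 emissions, then scale the CO2-only budget up to all GHGs assuming CO2 is 75% of the total – can be sketched as follows:

```python
# Updating the AR5 table 6.3 CO2-only budgets (from 2012) to a start
# of 2017 and to all-GHG (CO2e) terms: subtract 5 years x 40 GtCO2 of
# CO2 emissions, then divide by CO2's assumed 75% share of total GHGs.

CO2_PER_YEAR = 40    # GtCO2 of CO2 emissions per annum
CO2_SHARE = 0.75     # CO2 as an assumed share of all GHG emissions

def update_budget(low, high, years_elapsed=5):
    """Shift a (low, high) CO2 budget forward in time and express in CO2e."""
    adjust = years_elapsed * CO2_PER_YEAR
    return ((low - adjust) / CO2_SHARE, (high - adjust) / CO2_SHARE)

print(update_budget(630, 1180))   # ~ (573, 1307): "570-1300 GtCO2e"
print(update_budget(960, 1550))   # ~ (1013, 1800): "1010-1800 GtCO2e"
```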

But there is still a major problem. Consider the mega table 6.3, reproduced at lower quality below.

Notice that Column A is for CO2-equivalent concentration in 2100 (ppm CO2eq). Current CO2 levels are around 405 ppm, but GHG levels are around 450 ppm CO2eq. Then notice columns G and H, with the joint heading Concentration (ppm): Column G is for CO2 levels in 2100 and Column H for CO2-equivalent levels. Note also that for the first few rows of data Column H is greater than Column A, implying that sometime this century peak CO2 levels will be higher than at the end of the century, and (subject to the response period of the climate system to changes in greenhouse gas levels, and to the models being correct) average global temperatures could exceed the projected 2100 levels. By how much, though?

I will use the “magic equation” found at the Skeptical Science blog and (after adjusting it so that a doubling of CO2 converts to exactly 3°C of warming) assume that all changes in CO2 levels instantly translate into average temperature changes. Further, I assume that other greenhouse gases are irrelevant to the warming calculation, and that peak CO2 concentrations can be calculated from the peak GHG, 2100 GHG and 2100 CO2 concentrations. I derived the following table.

The 1.5°C warming scenario is actually 1.5-1.7°C warming in 2100, with a mid-point of 1.6°C. The peak implied temperatures are about 2°C.

The 2°C warming scenario is actually 1.7-2.1°C warming in 2100, with a mid-point of 1.9°C. The peak implied temperatures are about 2.3°C, with 2.0°C of warming in 2100 implying about 2.4°C peak temperature rise.
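The warming formula used in these calculations can be sketched in code. Note that the concentrations below are illustrative values chosen to reproduce the 1.9°C and 2.3°C figures above, not the exact numbers from table 6.3, and 280 ppm is assumed as the pre-industrial baseline:

```python
import math

# The "magic equation" calibrated so a doubling of CO2 gives exactly
# 3°C of warming: dT = 3 * log2(C / C0), with C0 the pre-industrial
# CO2 concentration (assumed here to be 280 ppm).

def warming(ppm, baseline=280.0, sensitivity=3.0):
    """Instantaneous warming (°C) implied by a CO2 concentration in ppm."""
    return sensitivity * math.log2(ppm / baseline)

# Illustrative 2100 vs peak concentrations for the "2°C" scenario:
t_2100 = warming(434)   # ~1.9°C at the 2100 concentration
t_peak = warming(476)   # ~2.3°C at the higher peak concentration
print(f"2100: {t_2100:.1f}°C  peak: {t_peak:.1f}°C")
```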

So when the IPCC talk about constraining temperature rise, it is about projected temperature rise in 2100, not about stopping global average temperature rise breaching 1.5°C or 2°C barriers.

Now consider the following statement from the University of Southampton pre-publication news release, emphasis mine.

“Immediate action is required to develop a carbon-neutral or carbon-negative future or, alternatively, prepare adaptation strategies for the effects of a warmer climate,” said Dr Goodwin, Lecturer in Oceanography and Climate at Southampton. “Our latest research uses a combination of a model and historical data to constrain estimates of how long we have until 1.5°C or 2°C warming occurs. We’ve narrowed the uncertainty in surface warming projections by generating thousands of climate simulations that each closely match observational records for nine key climate metrics, including warming and ocean heat content.”

Professor Williams, Chair in Ocean Sciences at Liverpool, added: “This study is important by providing a narrower window of how much carbon we may emit before reaching 1.5°C or 2°C warming. There is a real need to take action now in developing and adopting the new technologies to move to a more carbon-efficient or carbon-neutral future as we only have a limited window before reaching these warming targets.” This work is particularly timely given the work this year of the Intergovernmental Panel on Climate Change (IPCC) to develop a Special Report on the Impacts of global warming of 1.5°C above pre-industrial levels.


The basic difference between IPCC AR5 Chapter 6 Table 6.3 and the new paper is the misleading message that various emissions policy scenarios will prevent warming breaching either 1.5°C or 2°C, when the IPCC scenarios are clear that these are 2100 warming levels. The IPCC scenarios imply that before 2100 warming could peak at respectively around 1.75°C or 2.4°C. My calculations can be validated by assuming (a) a doubling of CO2 gives 3°C of warming, (b) other GHGs are irrelevant, (c) there is no significant lag between the rise in CO2 level and the rise in global average temperature.

Kevin Marshall

Is China leading the way on climate mitigation?

At the Conversation is an article on China’s lead in renewable energy: “China wants to dominate the world’s green energy markets – here’s why”, by University of Sheffield academic Chris G Pope. The article starts:-

If there is to be an effective response to climate change, it will probably emanate from China. The geopolitical motivations are clear. Renewable energy is increasingly inevitable, and those that dominate the markets in these new technologies will likely have the most influence over the development patterns of the future. As other major powers find themselves in climate denial or atrophy, China may well boost its power and status by becoming the global energy leader of tomorrow.

The effective response ought to be put into the global context. At the end of October UNEP produced its Emissions Gap Report 2017, just in time for the COP23 meeting in Bonn. The key figure on the aimed-for constraint of warming to 1.5°C to 2°C above pre-industrial levels – an “effective policy response” – is E5.2, reproduced below.

An “effective response” by any one country means at least reducing its emissions substantially by 2030 compared with now, at the start of 2018. To be a world leader in the response to climate change requires reducing emissions over the next 12 years by more than the required global average of 20-30%.

Climate Action Tracker – which, unlike myself, strongly promotes climate mitigation – rates China’s overall policies as Highly Insufficient in terms of limiting warming to 1.5°C to 2°C. The reason is that it forecasts that, on the basis of current policies, emissions in China will increase in the next few years, instead of rapidly decreasing.

So why has Chris Pope got China’s policy so radically wrong? After all, I accept the following statement.

Today, five of the world’s six top solar-module manufacturers, five of the largest wind turbine manufacturers, and six of the ten major car manufacturers committed to electrification are all Chinese-owned. Meanwhile, China is dominant in the lithium sector – think: batteries, electric vehicles and so on – and a global leader in smart grid investment and other renewable energy technologies.

Reducing net emissions means not just having lots of wind turbines, hydro schemes, solar farms and electric cars. It means those renewable forms of energy replacing CO2-emitting energy sources. The problem is that renewables are adding to total energy production alongside fossil fuels. The principal source of China’s energy for electricity and heating is coal.

The Global Coal Plant Tracker has some useful statistics. In terms of coal-fired power stations, China now has 922 GW operating (47% of the global total), with a further 153 GW “Announced + Pre-permit + Permitted” (28% of that global total) and 147 GW under construction (56%). Further, from 2006 to mid-2017, China’s newly operating coal plants had a capacity of 667 GW, fully 70% of the global total. It is estimated that coal-fired power stations account for 72% of global GHG emissions from the energy sector, with the energy sector contributing 41% of global GHG emissions. With China’s coal-fired power stations accounting for 47% of the global total, and assuming similar capacity utilization, China’s coal-fired power stations account for 13-14% of global GHG emissions, or around 7 GtCO2e of around 52 GtCO2e. It does not stop there. Many homes in China use coal for domestic heating; there is a massive coal-to-liquids program (which may not be currently operating due to the low oil price); manufacturers (such as metal refiners) burn coal directly; and recently there have been reports of producing gas from coal. So why would China pursue a massive renewables program?
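The chain of shares behind the 13-14% figure can be set out explicitly. The capacity-utilisation assumption is as stated in the text:

```python
# Estimating China's coal-power share of global GHG emissions by
# multiplying through the chain of shares quoted in the text
# (assuming similar capacity utilisation worldwide).

CHINA_COAL_CAPACITY_SHARE = 0.47  # China's share of global coal-power capacity
COAL_SHARE_OF_ENERGY = 0.72       # coal power's share of energy-sector emissions
ENERGY_SHARE_OF_GLOBAL = 0.41     # energy sector's share of global GHG emissions
GLOBAL_EMISSIONS = 52             # total global GHG emissions, GtCO2e per annum

china_share = (CHINA_COAL_CAPACITY_SHARE * COAL_SHARE_OF_ENERGY
               * ENERGY_SHARE_OF_GLOBAL)

print(f"Share of global emissions: {china_share:.1%}")        # ~13.9%
print(f"Absolute: {china_share * GLOBAL_EMISSIONS:.1f} GtCO2e")  # ~7.2 GtCO2e
```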

Possible reasons for the Chinese “pro-climate” policies

First are strategic energy reasons. I believe that China does not want to be dependent on world oil price fluctuations, which could harm economic growth. China therefore builds massive hydro schemes, despite them being damaging to the environment and sometimes displacing hundreds of thousands of people. China also pursues coal-to-liquids programs, alongside promoting solar and wind farms. Although this duplicates effort, it means that if oil prices suffer another hike, China is more immune from the impact than most other countries.

Second is an over-riding policy of a fast increase in perceived living standards. For over 20 years China managed average growth rates of up to 10% per annum, increasing average incomes by up to eight times and moving hundreds of millions of people out of grinding poverty. Now that economic growth is slowing (to rates still fast by Western standards), the raising of perceived living standards is being achieved by other means. One such method is to reduce particulate pollution, particularly in the cities. The recent heavy-handed banning of coal burning in cities (with people freezing this winter) is one example. Another is the push for electric cars, with the electricity mostly coming from distant coal-fired power stations. In terms of reducing CO2 emissions, electric cars do not make sense, but they do make sense in densely-populated areas with an emerging middle class wanting independent means of travel.

Third is the push to dominate areas of manufacturing. With many countries pursuing hopeless renewables policies, the market for wind turbines and solar panels is set to increase. The “rare earths” required for wind turbine magnets, such as neodymium, are produced in large quantities in China, for instance in highly polluted Baotou. With lithium (required for batteries), China may currently be only the world’s third-largest producer – some way behind Australia and Chile – but its reserves are the world’s second largest, and sufficient on their own to supply current global demand for decades. With raw material supplies and low, secure energy costs from coal, along with still relatively low labour costs, China is well placed to dominate these higher added-value manufacturing areas.

Concluding Comments

The wider evidence shows that an effective response to climate change is not emanating from China. Its current energy policies are dominated, and will continue to be dominated, by coal. This will far outweigh any apparent reductions in emissions from the manufacturing of renewables. Rather, the growth of renewables should be viewed in the context of promoting the continued rapid and secure increase in living standards of the Chinese people, whether in per capita income or in the standards of the local environment.

Kevin Marshall


NOAA Future Aridity against Al Gore’s C20th Precipitation Graphic

Paul Homewood has taken a look at an article in yesterday’s Daily Mail – A quarter of the world could become a DESERT if global warming increases by just 2ºC.

The article states

Aridity is a measure of the dryness of the land surface, obtained from combining precipitation and evaporation.  

‘Aridification would emerge over 20 to 30 per cent of the world’s land surface by the time the global temperature change reaches 2ºC (3.6ºF)’, said Dr Manoj Joshi from the University of East Anglia’s School of Environmental Sciences and one of the study’s co-authors.  

The research team studied projections from 27 global climate models and identified areas of the world where aridity will substantially change.  

The areas most affected are parts of South East Asia, Southern Europe, Southern Africa, Central America and Southern Australia.

Now, Al Gore’s authoritative book An Inconvenient Truth contains statements first about extreme flooding, and then about aridity (pages 108-113). The reason flooding comes first is a graphic of twentieth-century changes in precipitation on pages 114 & 115.

This graphic shows that, overall, the amount of precipitation has increased globally in the last century by almost 20%.

 However, the effects of climate change on precipitation is not uniform. Precipitation in the 20th century increased overall, as expected with global warming, but in some regions precipitation actually decreased.

The blue dots mark areas with increased precipitation, the orange dots areas with decreases; the larger the dot, the larger the change. So, according to Nobel Laureate Al Gore, increased precipitation should be far more common than increased aridity. If all warming is attributed to human-caused climate change (as the book seems to imply), then over a third of the dangerous 2ºC of warming occurred in the 20th century. There should therefore be considerable coherence between recently arid areas and future arid areas.

The Daily Mail reproduces a map from the UEA, showing the high-risk areas.

There are a couple of areas with big differences.

Southern Australia

In the 20th century, much of Australia saw increased precipitation. Yet within the next two or three decades, the UEA projects it becoming considerably more arid. Could this change in forecast be the result of the extreme drought that broke in 2012 with extreme flooding? Certainly, the pictures of empty reservoirs taken a few years ago, alongside claims that they would likely never refill, show how false those predictions were.

One such reservoir is Lake Eildon in Victoria. Below is a graphic of capacity levels in selected years. Other years can be compared by following the “historical water levels for EILDON” link.

Similarly, in the same post, I linked to a statement by re-insurer Munich Re attributing increased forest fires in Southern Australia to human activity – not to “anthropogenic climate change”, but to discarded fag ends, shards of glass and (most importantly) fires that were deliberately started.

Northern Africa

The UEA makes no claims about increased aridity in Northern Africa, particularly with respect to the southern and northern fringes of the Sahara. Increasing desertification of the Sahara used to be claimed as a major consequence of climate change. In the year following Al Gore’s movie and book, the UNIPCC produced its Fourth Climate Assessment Report. The Working Group II report, Chapter 9 on Africa (p448), made the following claim.

In other countries, additional risks that could be exacerbated by climate change include greater erosion, deficiencies in yields from rain-fed agriculture of up to 50% during the 2000-2020 period, and reductions in crop growth period (Agoumi, 2003).

Richard North took a detailed look at the background of this claim in 2010. The other African countries were Morocco, Algeria and Tunisia. Agoumi 2003 compiled three reports, only one of which – Morocco – had anything near a 50% claim. Yet Morocco seems, from Al Gore’s graphic, to have had a modest increase in rainfall over the last century.


The UEA’s latest doom-laden prophecy of increased aridity flies in the face of the accepted wisdom that human-caused global warming will result in increased precipitation. In two major areas (Southern Australia and Northern Africa), increased aridity is at odds with the changes in precipitation claimed by Al Gore in An Inconvenient Truth to have occurred in the 20th century. Yet over a third of the dangerous 2ºC warming limit occurred in the last century.

Kevin Marshall


President Trump’s Tweet on record cold in New York and Temperature Data

As record-breaking winter weather grips North-Eastern USA (and much of Canada as well), President Donald Trump has caused quite a stir with his latest Tweet.

There is nothing new in the current President’s tweets causing controversy. But this hard-hitting one highlights a point of real significance for AGW theory. After decades of human-caused global warming, record cold temperatures are more significant than record warm temperatures. Record cold can be accommodated within the AGW paradigm by claiming greater variability in climate resulting from the warming. This would be a portent of the whole climate system being thrown into chaos once some tipping point had been breached. But that would also require that warm records are
(a) far more numerous than cold records, and
(b) that many new warm records outstrip the old records of a few decades ago by a greater amount than the rise in average temperatures in that area.
I will illustrate with three temperature data sets I looked at a couple of years ago – Reykjavík in Iceland, and Isfjord Radio and Svalbard Airport on Svalbard.

Suppose there had been an extremely high and an extremely low temperature in 2009 in Reykjavík. For the high temperature to count as a record, it would only have to be nominally higher than a record set in 1940, as the unadjusted average anomaly data is about the same for the two years. If the previous record had been set in, say, 1990, a new high record would only be confirmation of a more extreme climate if it were at least 1°C higher than the previous record; conversely, a new cold record in 2009 could be up to 1°C higher than a 1990 low record and still count as evidence of greater climate extremes. Similarly, in the case of Svalbard Airport, new warm records in 2008 or 2009 would need to be over 4°C higher than records set around 1980, and new cold records could be up to 4°C higher than records set around 1980, to count as effective new warm and cold records.
By rebasing in terms of unadjusted anomaly data (and looking at monthly data), a very large number of possible records could be generated from one temperature station. With thousands of temperature stations with long records, it is possible to generate a huge number of “records” with which to analyse whether temperatures are becoming more extreme. Absolute cold records should be few and far between. However, if relative cold records outstrip relative warm records, then there are questions to be asked of the average data. Similarly, a lack of absolute records, or a decreasing frequency of relative records, would undermine beliefs in impending climate chaos.
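The rebasing idea can be sketched in code. The series and the warming trend below are invented purely for illustration (the trend is exaggerated so the effect shows in a six-value series); real analysis would use monthly station data and locally estimated average changes:

```python
# Sketch of 'relative records': remove an assumed linear warming trend
# from a yearly series, then count new highs in the detrended
# (anomaly) terms rather than in absolute terms.

def relative_records(values, trend_per_year):
    """Count successive new highs after removing a linear trend.

    values: yearly temperatures (°C); trend_per_year: assumed
    background warming per year (°C) to subtract before comparing.
    """
    anomalies = [v - trend_per_year * i for i, v in enumerate(values)]
    records = 0
    best = anomalies[0]
    for a in anomalies[1:]:
        if a > best:
            records += 1
            best = a
    return records

# An invented series: the final year is a nominal record high, but
# once an exaggerated 0.05°C/yr trend is removed it no longer is.
series = [10.0, 10.3, 9.8, 10.5, 10.2, 10.6]
print(relative_records(series, 0.00))   # absolute record count
print(relative_records(series, 0.05))   # relative (detrended) count
```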

I would not want to jump ahead with the conclusions. The most important element is to mine the temperature data and then analyze the results in multiple ways. There are likely to be surprises that could enhance understanding of climate in quite novel ways.

Kevin Marshall

Climate Public Nuisance as a justification for Climate Mitigation

Ron Clutz, at his Science Matters blog, has a penchant for producing some interesting articles that open up new areas outside of the mainstream of either climate alarmism or climate scepticism, some of which may challenge my own perspectives. With Critical climate intelligence for jurists and others, Ron has done this again.
There is a lot of ground covered here, and I am sure that it just touches on a few of the many issues. The first area covered is the tort of Public Nuisance, explained by legal scholar Richard O. Faulk. This touches upon areas that I have dealt with recently, particularly this section. (bold mine)

Generally in tort cases involving public nuisance, there is a term, which we all know from negligence cases and other torts, called proximate causation. In proximate causation, there is a “but for” test: but for the defendant’s activity, would the injury have happened? Can we say that climate change would not have happened if these power plants, these isolated five power plants, were not emitting greenhouse gases? If they completely stopped, would we still have global warming? If you shut them down completely and have them completely dismantled, would we still have global warming? 

I think Faulk then goes off track with his argument when he states:

Is it really their emissions that are causing this, or is it the other billions and billions of things on the planet that caused global warming—such as volcanoes? Such as gases being naturally released through earth actions, through off-gassing?

Is it the refinery down in Texas instead? Is it the elephant on the grasses in Africa? Is it my cows on my ranch in Texas who emit methane every day from their digestive systems? How can we characterize the public utilities’ actions as “but for” causes or “substantial contributions?” So far, the courts haven’t even reached these issues on the merits.

A necessary (but not sufficient) condition for adverse human-caused global warming to be abated is that most, if not all, human GHG emissions must be stopped. Unlike with smoke particulates, where elimination in the local area will make a big difference, GHGs are well-mixed. So shutting down a coal-fired power station in Oak Creek will have the same impact on future climate change for the people of South-East Wisconsin as shutting down a similar coal-fired power station in Boxburg, Ningxia, Mpumalanga, Kolkata or Porto do Pecém. That impact is in the range of zero, or insignificantly different from zero, depending on your perspective on CAGW.

Proximate causation was a term that I should have used to counter the Minnesotan valve-turners' climate necessity defense. As I noted in that post, to reduce global emissions by the amount desired by the UNIPCC – constraining future emissions to well below 1000 GtCO2e – requires not only reducing the demand for fossil fuels and other sources of GHG emissions, but also requires many countries dependent on the supply of fossil fuels for a large part of their national incomes to leave at least 75% of known fossil fuel reserves in the ground.

An example of proximate causation is to be found in the post of 27 December, Judge delivers crushing blow to Washington Clean Air Rule. Governor Inslee called the legislation "the nation's first Clean Air Rule, to cap and reduce carbon pollution." But the legislation will only reduce so-called carbon pollution if the reduction within Washington State is greater than the net increase in other areas of the world. That will not happen, as neither global demand nor global supply is covered by agreements that would constrain the aggregate impact.

Kevin Marshall

Thomas Fuller on polar-bear-gate at Cliscep

This is an extended version of a comment made at Thomas Fuller’s cliscep article Okay, just one more post on polar-bear-gate… I promise…

There are three things highlighted in the post and the comments that illustrate how the Polar Bear smear paper is a rich resource for understanding the worst of climate alarmism.

First is from Alan Kendall @ 28 Dec 17 at 9:35 am

But what Harvey et al. ignores is that Susan Crockford meticulously quotes from the “approved canon of polar bear research” and exhorts her readers to read it (making an offer to provide copies of papers difficult to obtain). She provides an entree into that canon- an entree obviously used by many and probably to the fury of polar bear “experts”.

This is spot on about Susan Crockford, and, in my opinion, what proper academics should be aiming at. To assess an area where widely different perspectives are possible, I was taught that it is necessary to read and evaluate the original documents. Climate alarmists in general, and this paper in particular, evaluate in relation to collective opinion as opposed to more objective criteria. In the paper, "science" is about support for a partly fictional consensus, while "denial" is seeking to undermine that fiction. On polar bears this is clearly stated in relation to the two groups of blogs.

We found a clear separation between the 45 science-based blogs and the 45 science-denier blogs. The two groups took diametrically opposite positions on the “scientific uncertainty” frame—specifically regarding the threats posed by AGW to polar bears and their Arctic-ice habitat. Scientific blogs provided convincing evidence that AGW poses a threat to both, whereas most denier blogs did not.

A key element is to frame statements in terms of polar extremes.

Second is the extremely selective use of data (or of selective analysis methods) to enable the desired conclusion to be reached. Thomas Fuller has clearly pointed out in the article, and restated in the comments, the following with respect to WUWT.

Harvey and his 13 co-authors state that WUWT overwhelmingly links to Crockford. I have shown that this is not the case.

Selective use of data (or of selective analysis methods) is common in climate alarmism. For instance:

  • The original MBH 98 Hockey-Stick graph used out-of-date temperature series, or tree-ring proxies such as at Gaspe in Canada, that were not replicated by later samples.
  • Other temperature reconstructions. Remember Keith Briffa’s Yamal reconstruction, which relied on one tree for the post-1990 reconstructions? (see here and here)
  • Lewandowsky et al “Moon Hoax” paper. Just 10 out of 1145 survey respondents supported the “NASA faked the Moon Landings” conspiracy theory. Of these just 2 dogmatically rejected “climate”. These two faked/scam/rogue respondents 860 & 889 supported every conspiracy theory, underpinning many of the correlations.
  • Smoothing out the pause in warming in Risbey, Lewandowsky et al 2014 “Well-estimated global surface warming in climate projections selected for ENSO phase”. In The Lewandowsky Smooth, I replicated the key features of the temperature graph in Excel, showing how no warming for a decade in Hadcrut4 was made to appear as if there was hardly a cessation of warming.
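The smoothing effect described in the last bullet is easy to reproduce. The sketch below uses made-up numbers, not the actual Hadcrut4 data or the paper's smoothing method: a wide centered moving average applied to a series with twenty years of warming followed by a ten-year pause still shows a rise over the final decade.

```python
# Illustration with made-up numbers (not the actual Hadcrut4 data): a wide
# centered moving average can make a flat final decade appear to keep warming.

def moving_average(xs, window):
    """Centered moving average, with shorter windows at the series ends."""
    half = window // 2
    out = []
    for i in range(len(xs)):
        lo, hi = max(0, i - half), min(len(xs), i + half + 1)
        out.append(sum(xs[lo:hi]) / (hi - lo))
    return out

# Twenty years of steady warming followed by a ten-year pause.
anoms = [0.02 * y for y in range(20)] + [0.38] * 10
smoothed = moving_average(anoms, 15)

# Change over the final decade: zero in the raw data, still positive smoothed.
print(anoms[-1] - anoms[-10], round(smoothed[-1] - smoothed[-10], 3))
```

The wide window drags the earlier warming years into the averages for the pause decade, so the smoothed curve keeps rising where the raw data is flat.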

Third, is to frame the argument in terms of polar extremes. Richard S J Tol @ 28 Dec 17 at 7:13 am

And somehow the information in those 83 posts was turned into a short sequence of zeros and ones.

Not only are there, on many issues, a vast number of possible intermediate positions (the middle ground); there are other dimensions as well. One is the strength of evidential support for a particular perspective. There could be little or no persuasive evidence. Another is whether there is support for alternative perspectives. For instance, although sea ice data is lacking for the early twentieth-century warming, average temperature data is available for the Arctic. NASA Gistemp (despite its clear biases) has estimates for 64N-90N.

The temperature data indicates that it is unlikely that all of the decline in Arctic sea ice from 1979 can be attributed to AGW. From the 1880s to 1940 there was Arctic warming of a similar magnitude to that from 1979 to 2010, with cooling in between. Yet the rate of increase in GHG levels was greater in 1975-2010 than in 1945-1975, which was in turn greater than in the decades before.

Kevin Marshall


Evidence for the Stupidest Paper Ever

Judith Curry tweeted a few days ago

This is absolutely the stupidest paper I have ever seen published.

What might cause Judith Curry to make such a statement about Internet Blogs, Polar Bears, and Climate-Change Denial by Proxy? Below are some notes that illustrate what might be considered stupidity.

Warmest years are not sufficient evidence of a warming trend

The US National Oceanic and Atmospheric Administration (NOAA) and National Aeronautics and Space Administration (NASA) both recently reported that 2016 was the warmest year on record (Potter et al. 2016), followed by 2015 and 2014. Currently, 2017 is on track to be the second warmest year after 2016. 

The theory is that rising greenhouse gas levels are leading to warming. The major greenhouse gas is CO2, supposedly accounting for about 75% of the impact. There should, therefore, be a clear relationship between the rising CO2 levels and rising temperatures. The form that the relationship should take is that an accelerating rise in CO2 levels will lead to an accelerating rate of increase in global average temperatures. Earlier this year I graphed the rate of change in CO2 levels from the Mauna Loa data.
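The calculation behind that graph is straightforward. A minimal sketch, using illustrative figures rather than the actual Mauna Loa annual means:

```python
# Sketch of the calculation described (illustrative values, not the real
# Mauna Loa annual means): year-on-year change in the annual average CO2
# level, in parts per million. Assumes consecutive years in the data.

annual_ppm = {1960: 316.9, 1961: 317.6, 1962: 318.5, 1963: 319.0}

def annual_changes(series):
    """Map each year to the change in ppm from the previous year."""
    years = sorted(series)
    return {y: round(series[y] - series[y - 1], 2) for y in years[1:]}

print(annual_changes(annual_ppm))  # {1961: 0.7, 1962: 0.9, 1963: 0.5}
```

An accelerating rise in CO2 would show up as these year-on-year changes themselves trending upward over the decades.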

The trend over nearly sixty years has been an accelerating one, so the rate of warming should have accelerated too. Yet, depending on which temperature dataset you use, around the turn of the century warming either stopped or dramatically slowed until 2014. A strong El Nino then caused a sharp spike in the last two or three years. The data contradicts the theory in the very period when the signal should be strongest.

Only the stupid would see record global average temperatures (which were rising well before the rise in CO2 was significant) as strong evidence of human influence when a little understanding of theory would show the data contradicts that influence.

Misrepresentation of Consensus Studies

The vast majority of scientists agree that most of the warming since the Industrial Revolution is explained by rising atmospheric greenhouse gas (GHG) concentrations (Doran and Zimmerman 2009, Cook et al. 2013, Stenhouse et al. 2014, Carlton et al 2015, Verheggen et al. 2015), 

Doran and Zimmerman 2009 asked two questions

1. When compared with pre-1800s levels, do you think that mean global temperatures have generally risen, fallen, or remained relatively constant?

2. Do you think human activity is a significant contributing factor in changing mean global temperatures?

Believing that human activity is a significant contributing factor to rising global temperatures does not mean one believes the majority of warming is due to rising GHG concentrations. Only the stupid would fail to see the difference. Further, the results were from a subset of all scientists, namely geoscientists. The reported 97% consensus was from just 79 responses, a small subset of the total 3,146 responses. Read the original to find out why.

The abstract to Cook et al. 2013 begins

We analyze the evolution of the scientific consensus on anthropogenic global warming (AGW) in the peer-reviewed scientific literature, examining 11 944 climate abstracts from 1991–2011 matching the topics ‘global climate change’ or ‘global warming’. We find that 66.4% of abstracts expressed no position on AGW, 32.6% endorsed AGW, 0.7% rejected AGW and 0.3% were uncertain about the cause of global warming. Among abstracts expressing a position on AGW, 97.1% endorsed the consensus position that humans are causing global warming. 

Expressing a position does not mean a belief. It could be an assumption. The papers were not necessarily by scientists, but merely by authors of academic papers that involved the topics 'global climate change' or 'global warming'. Jose Duarte listed some of the papers that were included in the survey, along with looking at some that were left out. It shows a high level of stupidity to use these flawed surveys as supporting the statement "The vast majority of scientists agree that most of the warming since the Industrial Revolution is explained by rising atmospheric greenhouse gas (GHG) concentrations".

Belief is not Scientific Evidence

The most recent edition of the climate bible from the UNIPCC states (AR5 WG1 Ch10 Page 869):

It is extremely likely that human activities caused more than half of the observed increase in GMST from 1951 to 2010.

Misrepresenting surveys about beliefs is necessary because the real-world data, even when that data is a deeply flawed statistic, does not support the belief that "most of the warming since the Industrial Revolution is explained by rising atmospheric greenhouse gas (GHG) concentrations".

Even if the survey data supported the statement, the authors are substituting banal statements about beliefs for empirically-based scientific statements. This is the opposite direction to achieving science-based understanding. 

The false Consensus Gap

The article states

This chasm between public opinion and scientific agreement on AGW is now commonly referred to as the consensus gap (Lewandowsky et al. 2013)

Later it is stated, in relation to sceptical blogs:

Despite the growing evidence in support of AGW, these blogs continue to aggressively deny the causes and/or the projected effects of AGW and to personally attack scientists who publish peer-reviewed research in the field with the aim of fomenting doubt to maintain the consensus gap.

There is no reference that tracks the growing evidence in support of AGW. From WUWT (and other blogs) there has been a lot of debunking of claimed signs of climate apocalypse, such as:

  • Malaria increasing as a result of warming
  • Accelerating polar ice melt / sea level rise
  • Disappearing snows of Kilimanjaro due to warming
  • Kiribati and the Maldives disappearing due to sea level rise
  • Mass species extinction
  • Himalayan glaciers disappearing
  • The surface temperature record being a true and fair estimate of real warming
  • Climate models consistently over-estimating warming

To the extent that a consensus gap exists, it is between the consensus beliefs of the climate alarmist community and the actual data. Scientific support for claims about the real world comes from conjectures being verified, not from the volume of publications on the subject.

Arctic Sea Ice Decline and threats to Polar Bear Populations

The authors conjecture (with references) with respect to Polar Bears that

Because they can reliably catch their main prey, seals (Stirling and Derocher 2012, Rode et al. 2015), only from the surface of the sea ice, the ongoing decline in the seasonal extent and thickness of their sea-ice habitat (Amstrup et al. 2010, Snape and Forster 2014, Ding et al. 2017) is the most important threat to polar bears’ long-term survival.

That seems plausible enough. Now for the evidence to support the conjecture.

Although the effects of warming on some polar-bear subpopulations are not yet documented and other subpopulations are apparently still faring well, the fundamental relationship between polar-bear welfare and sea-ice availability is well established, and unmitigated AGW assures that all polar bears ultimately will be negatively affected. 

There is a tacit admission that the existing evidence contradicts the theory. There is data showing a declining trend in sea ice for over 35 years, yet in that time the various polar bear populations have been growing significantly, not just "faring well". Surely there should be a decline by now in the peripheral Arctic areas where the sea ice has disappeared? The only historical evidence of decline is this comment criticizing Susan Crockford's work.

For example, when alleging sea ice recovered after 2012, Crockford downplayed the contribution of sea-ice loss to polar-bear population declines in the Beaufort Sea.

There is no reference to this claim, so readers cannot check if the claim is supported. But 2012 was an outlier year, with record lows in the Summer minimum sea ice extent due to unusually fierce storms in August. Losses of polar bears due to random & extreme weather events are not part of any long-term decline in sea ice.

Concluding Comments

The stupid errors made include

  • Making a superficial point from the data to support a conjecture, when deeper understanding contradicts it. This is the case with the conjecture that rising GHG levels are the main cause of recent warming.
  • Clear misrepresentation of opinion surveys.
  • Even if the opinion surveys were correctly interpreted, using opinion to support scientific conjectures, as opposed to looking at statistical tests of actual data or estimates, should appear stupid from a scientific perspective.
  • Claims of a consensus gap between consensus and sceptic views, when the real gap is between consensus opinion and actual data.
  • The claim that polar bear populations will decline as sea ice declines is contradicted by the historical data. There is no recognition of this contradiction.

I believe the Harvey et al. paper gives some lessons for climatologists in particular and academics in general.

First is that claims crucial to the argument need to be substantiated. That substantiation needs to be more than referencing others who have made the same claims before.

Second is that points drawn from referenced articles should be accurately represented.

Third, is to recognize that scientific papers need to first reference actual data and estimates, not opinions.  It is by comparing the current opinions with the real world that opportunities for advancement of understanding arise.

Fourth is that any academic discipline should aim to move from conjectures to empirically-based verifiable statements.

I have only picked out some of the more obvious of the stupid points. The question that needs to be asked is why such stupidity was agreed upon by 14 academics and then passed peer review.

Kevin Marshall

The Supply-Side of Climate Mitigation is Toothless

To eliminate global greenhouse gas emissions requires a two-pronged policy approach. Much is made of reducing demand for greenhouse gases through the switch to renewables, regulations and carbon taxes. But, with respect to fossil fuels, the supply also needs to be reduced and eventually ceased. Climate activists like valve-turner Michael Foster recognize that to achieve the climate mitigation targets, much of the potential supply of fossil fuels must be left in the ground. With respect to the valve-turners' actions of October 16th 2016, whilst it is possible to look at the minuscule impact they had on global oil supply and proven reserves of oil, it is more difficult to estimate the marginal impact on overall greenhouse gas emissions of their broader objective of permanently shutting down Canadian oil production. That requires estimates of CO2 emissions per unit of oil, coal and gas. In searching for figures to make my own estimates I came across a letter to Nature. McGlade and Ekins 2015 (The geographical distribution of fossil fuels unused when limiting global warming to 2°C) estimate that the proven global reserves of oil, gas and coal would produce about 2900 GtCO2e. They further estimate that the "non-reserve resources" of fossil fuels represent a further 8000 GtCO2e of emissions.

There is no breakdown by country, so I input their values of CO2 per unit into BP's estimates of global reserves of oil, gas and coal, coming up with a similar 2800 GtCO2e. These represent roughly 50 years of oil and gas supply and 120 years of coal supply at current usage rates. This should be put into the context of the policy objectives. From the abstract:

It has been estimated that to have at least a 50 per cent chance of keeping warming below 2 °C throughout the twenty-first century, the cumulative carbon emissions between 2011 and 2050 need to be limited to around 1,100 gigatonnes of carbon dioxide (Gt CO2).

This is similar to the IPCC's central estimate of 1000 GtCO2e from 2012 onwards. With just over 50 GtCO2e of GHG emissions per annum, the remaining budget from the beginning of 2018 is around 700-800 GtCO2e. Taking into account other GHG emissions, to achieve the emissions target around 75% of proven reserves, and 100% of any non-reserve sources or future discoveries, must be left in the ground. I have produced a chart of the countries where these proven reserves lie, measured in terms of the CO2 produced from burning them.
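The arithmetic can be checked in a few lines, using the round figures stated above (the 50 GtCO2e per annum is an approximation of "just over 50"):

```python
# Rough check of the arithmetic above, in GtCO2e, using the round figures
# stated in the text. The annual emissions figure is an approximation.

budget_from_2012 = 1000   # IPCC central estimate of the remaining budget
annual_emissions = 50     # approximate global GHG emissions per year
years_elapsed = 6         # 2012 through 2017

remaining_2018 = budget_from_2012 - annual_emissions * years_elapsed
proven_reserves = 2800    # GtCO2e from proven reserves (BP-based estimate)

share_left_in_ground = 1 - remaining_2018 / proven_reserves
print(remaining_2018, share_left_in_ground)  # 700 0.75
```

With slightly lower emissions assumptions the remaining budget lands towards the upper end of the 700-800 GtCO2e range; either way, roughly three-quarters of proven reserves must stay unburnt.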

These are very rough estimates, based upon the assumption that the emissions per unit of each fossil fuel are the same as the McGlade and Ekins averages. This is clearly not the case. A better estimate for oil, for instance, would likely have higher potential emissions from Venezuela and Canada, and lower potential emissions from the Middle East, particularly Saudi Arabia. However, it is clear that if global emissions constraints are to be achieved, the UN must get binding agreements from the USA, Russia, Iran, Venezuela, China, Saudi Arabia, India, Qatar – plus many other countries – to abandon these vital resources within a few years. This would need to be done fairly and equitably in the eyes of all parties. But in such matters there are widely different perspectives on what is fair, and the UN lacks the ability to impose a settlement. There are also considerable economic costs to those nations whose economies rely on producing fossil fuels, with the compensation that they might demand being unimaginably high. Further, as with any cartel, there are considerable economic advantages in reneging on such deals, whilst ensuring that rival countries are held to their part of the agreement.

The problem is even greater. McGlade and Ekins 2015 is likely to have underestimated the unproven reserves of fossil fuels, even though the 8000 GtCO2e figure is truly staggering. The short 2013 GWPF paper THE ABUNDANCE OF FOSSIL FUELS by Phillip Mueller estimates that unproven, but potentially recoverable, reserves of tar sands in Canada and the Green River Basin in Wyoming, heavy oil in Venezuela and shale oil in Saudi Arabia could each be similar to, or exceed, the global proven reserves of oil. Combined, these could produce around the same CO2 emissions as all the proven reserves of oil, gas and coal put together.

Then there are methane hydrates, which could contain 500 to 5000+ GtCO2e of emissions if burnt. The very nature of the hydrates could mean large amounts of methane being released directly into the atmosphere. This US Geological Survey graphic (from a 2014 BBC article) shows the very wide distribution of the hydrates, meaning many countries could have large deposits within their territorial waters. This is especially significant for African nations, most of whom have very low, or nil, proven fossil fuel resources.

Mueller does not explore the potential reserves of coal. Under the North Sea alone there are estimates of 3 to 23 trillion tonnes of the stuff. (Searches reveal a number of other sources.) This compares to the BP estimate of around 800 billion tonnes of global proven reserves. 3 to 23 trillion tonnes of hard coal, if burnt, would represent 7000 to 55000 GtCO2e of emissions, compared to the less than 1000 GtCO2e the IPCC claims is sufficient to reach the 2°C warming limit.
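A back-of-envelope check of these coal figures. The emissions factor of roughly 2.4 tonnes of CO2 per tonne of hard coal is my assumption, chosen to be consistent with the range quoted above:

```python
# Back-of-envelope check of the coal figures. The emissions factor of about
# 2.4 tonnes of CO2 per tonne of hard coal is an assumption, chosen to be
# consistent with the 7000-55000 GtCO2e range quoted in the text.

CO2_PER_TONNE_COAL = 2.4  # tCO2 per tonne of hard coal (assumed round figure)

def coal_to_gtco2(trillion_tonnes):
    """Convert trillions of tonnes of coal to GtCO2 if fully burnt."""
    return trillion_tonnes * 1000 * CO2_PER_TONNE_COAL  # 1 trillion t = 1000 Gt

print(round(coal_to_gtco2(3)), round(coal_to_gtco2(23)))  # 7200 55200
```

Even the low end of the North Sea estimate alone is several times the entire remaining carbon budget.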

How many other vast fossil fuel reserves are out there? It may be just economic factors that stop fossil fuel reserves being proven and then exploited. What is clear is that whilst activists might be able to curtail or stop production of fossil fuels in Western countries, they are powerless to stop vast reserves being exploited in much of the rest of the world. The only significant consequence is to harm the economic futures of any country in which they gain successes, inadvertently working to the benefit of some pretty intolerant and oppressive regimes.

However, this does not leave climate activists impotent. They can work on better identifying when and where the catastrophic impacts of climate change will occur. But that would mean recognizing that previous prophecies of impending doom have been either totally false or massively exaggerated.

Kevin Marshall