UK votes for divorce from EU

The unexpected has happened. Despite the efforts of most of the British political establishment, the UK has voted by a narrow margin to leave the European Union. It should be viewed as a divorce that the interested parties had tried to prevent. As with a divorce, there need to be deep breaths all round to accept the coming dissolution. As with a divorce where children are involved, Britain and the EU need to work constructively to achieve the best futures for all.
British politicians need to reflect as well. Perhaps two-thirds of MPs supported Remain. Many were in line with their constituents, especially in London, Scotland and the M4 corridor where Prime Minister David Cameron’s constituency lies. But most of the North of England, particularly the Labour heartlands, voted for Leave. MPs must clearly state that they accept the result, and join in obtaining the best futures for Britain and the countries of Europe. Those who cannot accept this should recognize they have no future in public service and resign from leading roles in politics.

Kevin Marshall

Guardian Images of Global Warming Part 1 – Australian Droughts

On Friday June 3rd the Guardian presented some high-quality images with the headline:

Droughts, floods, forest fires and melting poles – climate change is impacting Earth like never before. From Australia to Greenland, Ashley Cooper’s work spans 13 years and over 30 countries. This selection, taken from his new book, shows a changing landscape, scarred by pollution and natural disasters – but there is hope too, with the steady rise of renewable energy.

The purpose is to convince people that human-caused climate change is happening now, to bolster support for climate mitigation policies. But the real stories behind the pictures are quite different. I will start with three images relating to drought in Australia.

Image 5

Forest ghosts: Lake Eildon in Victoria, Australia was built in the 1950’s to provide irrigation water, but the last time it was full was in 1995. The day the shot was taken it was at 29% capacity with levels down around 75ft.

Data for Lake Eildon (accessible with a simple search for “Lake Eildon capacity”) links to a graph on which up to seven years of data can be compared.

In 1995 the dam was not at full capacity, though it was full for a short period the following year. More recently, after the drought broke, the reservoir was close to full throughout 2011.

But were the low levels due to more extreme drought brought on by climate change? That is very difficult to determine, as Lake Eildon is an artificial lake, constructed to provide water for irrigation, occasional hydro-electric power and recreational facilities. The near-empty levels at the end of the biggest drought in many decades could be due simply to a failure to predict the duration of the drought, or to a policy of supplying irrigation water for the maximum length of time. That water levels failed to reach full capacity for many years is partly explained by a 2003 article in The Age:

The dam wall at Lake Eildon, Victoria’s biggest state-run water storage, has been declared unsafe and will need a $30 million upgrade if the lake is to be refilled.

The dam, which is at its lowest level since being completed in 1956, will be restricted to just 65 per cent capacity because it no longer meets safety standards for earthquakes and extreme floods.

Image 6

Forest destroyed by bush fires near Michelago, New South Wales, Australia.

The intended inference is that the fire was caused by global warming.

According to Munich Re

The majority of bushfires in southeast Australia are caused by human activity

Bushfire is the only natural hazard in which humans have a direct influence on the hazard situation. The majority of bushfires near populated areas are the consequence of human activity. Lightning causes the smaller portion naturally. Sometimes, a carelessly discarded cigarette or a glass shard, which can focus the sun’s rays, is all it takes to start a fire. Heat from motors or engines, or electric sparks from power lines and machines can ignite dry grass. Besides these accidental causes, a significant share of wildfires are started deliberately.

Humans also change the natural fire frequency and intensity. They decrease the natural fire frequency due to deliberate fire suppression near populated areas. If there is no fuel-reduction burning in forests for the purposes of fire prevention, large quantities of combustible material can accumulate at ground level.

Surface fires in these areas can become so intense due to the large amounts of fuel that they spread to the crowns of the trees and rapidly grow into a major fire. If humans had not intervened in the natural bushfire regime, more frequent low-intensity fires would have consumed the forest undergrowth and ensured that woodland grasses and scrubs do not proliferate excessively.

David Evans expands on the issue of fuel load in a 2013 article.

As with the water levels in an artificial lake, forest fires are strongly influenced by the management of the forests. Extinguishing forest fires before they have run their natural course results in bigger and more intense fires at a later date. More frequent or intense droughts would not change this primary cause of many of the horrific forest fire disasters seen in recent years.

Image 7

Where has all the water gone?: Lake Hume is the largest reservoir in Australia and was set up to provide irrigation water for farms further down the Murray Basin and drinking water for Adelaide. On the day this photograph was taken it was at 19.6% capacity. By the end of the summer of 2009 it had dropped to 2.1% capacity. Such impacts of the drought are likely to worsen as a result of climate change. The last time the water was anywhere near this road bridge was 10 years ago, rendering this no fishing sign somewhat redundant.

Again, this is old data. As with Lake Eildon, it is easy to construct graphs of recent water levels.

Following the end of the drought, the reservoir came back to full capacity. Worsening drought is only apparent to those who look over a short time range.

When looking at drought in Australia, Dorothea Mackellar’s 1908 poem “My Country” provides some context. Written for a British audience, the poem begins

I love a sunburnt country,

A land of sweeping plains,

Of ragged mountain ranges,

Of droughts and flooding rains

To understand the difference that human-caused climate change is having on the climate first requires an understanding of natural climatic variation over multiple time-scales. It then requires an understanding of how other human factors are influencing the environment, both intended and unintended.

Kevin Marshall

Are the Paris Floods due to climate changing for the worse?

The flood of the River Seine is now past the 6.1m peak reached in the early hours of Saturday 4th June. Thirty-six hours later, the official measurements at Pont d’Austerlitz show that the level is below 5.7m. The peak was just below the previous major flood of 1982, at 6.15m, but well above the previous flood emergency in 2000, when waters peaked at 3.92m. Below is a snapshot of a continually-updated graphic at the Environment Ministry VIGICRUES site.

Despite it being 16 years since the last emergency, the reaction of the authorities has been impressive: giving people warnings of the rising levels; evacuating people; stopping all non-emergency vessels on the Seine; protecting those who live on the river; and putting into operation emergency procedures to move art treasures out of basement storage in the Louvre. Without these measures the death toll and the estimated €600m cost of the flood would undoubtedly have been much higher.

The question that must be asked is whether human-caused climate change has made flooding worse on a river that has flooded for centuries. The data is hard to come by. An article in Le Figaro last year gave the top ten record floods, the worst being in 1658.

Although this does show that the current high of 6.10m is a full 50cm below the tenth worst flood, in 1920, there is no indication of increasing frequency.

From a 2012 report, Les entreprises face au risque inondation, I have compiled a graphic of all flood maxima of six metres or higher.

This shows that major floods were much more frequent in the period 1910 to 1960 than in the periods before or after. Superficially it would seem that flooding has recently been getting less severe. But this conclusion would ignore the many measures that were put in place after the flood of 1910. The 2014 OECD Report Seine Basin, Île-de-France: Resilience to Major Floods stated:-

Since 1910, the risk of a Seine River flood in the Ile-de-France region has been reduced in various stages by protective structures, including dams built upstream and river development starting in the 1920s, then in the 1950s up until the early 1990s. Major investments have been limited in the last decades, and it appears that protection levels are not up to the standards of many other comparable OECD countries, particularly in Europe. On the other hand, the exposure to the risk and the resulting vulnerability are accentuated by increasing urban density in the economic centre of France, as well as by the construction of a large number of activity centres and critical infrastructures (transport, energy, communications, water) along the Seine River.

If the climate impact had become more severe, then one would expect the number of major floods to increase given the limited new measures to prevent them. However, the more substantial measures taken in the last century could explain the reduced frequency of major floods, though the lack of floods between 1882 and 1910 suggests that the early twentieth century could have been an unusually wet period. Without detailed weather records my guess is that it is a bit of both. Extreme rainfall has decreased, whilst flood prevention measures have also had some impact on flood levels.
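
The compilation itself is easy to reproduce. Below is a minimal sketch in Python; the (year, level) pairs shown are illustrative placeholders, with the real maxima to be taken from the 2012 report.

```python
# Count Seine flood maxima of six metres or more per period. The
# (year, level_m) pairs below are illustrative placeholders; the real
# values come from "Les entreprises face au risque inondation" (2012).
floods = [(1658, 8.96), (1910, 8.62), (1920, 6.60), (1924, 7.30),
          (1945, 6.85), (1955, 7.10), (1982, 6.15)]

def count_major(records, start, end, threshold=6.0):
    """Number of flood maxima at or above `threshold` metres in [start, end)."""
    return sum(1 for year, level in records
               if start <= year < end and level >= threshold)

for start, end in [(1600, 1910), (1910, 1960), (1960, 2017)]:
    print(f"{start}-{end}: {count_major(floods, start, end)} floods of 6m or more")
```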

Kevin Marshall

Beliefs and Uncertainty: A Bayesian Primer

Ron Clutz’s introduction, based on a Scientific American article by John Horgan on January 4, 2016, starts to grapple with the issues involved.

The take-home quote from Horgan is on the subject of false positives.

Here is my more general statement of that principle: The plausibility of your belief depends on the degree to which your belief–and only your belief–explains the evidence for it. The more alternative explanations there are for the evidence, the less plausible your belief is. That, to me, is the essence of Bayes’ theorem.

“Alternative explanations” can encompass many things. Your evidence might be erroneous, skewed by a malfunctioning instrument, faulty analysis, confirmation bias, even fraud. Your evidence might be sound but explicable by many beliefs, or hypotheses, other than yours.

In other words, there’s nothing magical about Bayes’ theorem. It boils down to the truism that your belief is only as valid as its evidence. If you have good evidence, Bayes’ theorem can yield good results. If your evidence is flimsy, Bayes’ theorem won’t be of much use. Garbage in, garbage out.

With respect to the question of whether global warming is human-caused, there is basically a combination of three elements: (i) human-caused, (ii) naturally caused, and (iii) random chaotic variation. There may be a number of sub-elements and an infinite number of combinations, including some elements counteracting others, such as El Niño events counteracting underlying warming. Evaluation of new evidence takes place in the context of explanations arrived at within a community of climatologists with strong shared beliefs that at least 100% of recent warming is due to human GHG emissions. It is that same community who also decide the measurement techniques for assessing the temperature data, the relevant time frames, and the categorization of the new data. With complex decisions the only clear decision criterion is conformity to the existing consensus conclusions. As a result, the original Bayesian estimates become virtually impervious to new perspectives or evidence that contradicts them.
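
To make that last point concrete, here is a minimal sketch of Bayes’ theorem in Python. The probabilities are purely hypothetical, chosen only to show how a near-certain prior barely moves even when new evidence favours the alternative explanations.

```python
# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E),
# where P(E) = P(E|H) * P(H) + P(E|~H) * (1 - P(H)).

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Update a prior belief P(H) on seeing evidence E."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1.0 - prior)
    return p_e_given_h * prior / p_e

# Hypothetical evidence that is twice as likely if the hypothesis is
# false as if it is true -- i.e. mildly contradictory evidence.
p_e_h, p_e_not_h = 0.3, 0.6

for prior in (0.5, 0.9, 0.99):
    print(f"prior {prior:.2f} -> posterior {posterior(prior, p_e_h, p_e_not_h):.3f}")

# prior 0.50 -> posterior 0.333
# prior 0.90 -> posterior 0.818
# prior 0.99 -> posterior 0.980
```

On these made-up figures an agnostic 50% prior is cut to a third, while a 99% prior barely shifts. And if the categorization of each new piece of data is itself decided by those holding the prior, the contradictory likelihoods never even reach the calculation.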

Science Matters

Those who follow discussions regarding Global Warming and Climate Change have heard from time to time about the Bayes Theorem. And Bayes is quite topical in many aspects of modern society:

Bayesian statistics “are rippling through everything from physics to cancer research, ecology to psychology,” The New York Times reports. Physicists have proposed Bayesian interpretations of quantum mechanics and Bayesian defenses of string and multiverse theories. Philosophers assert that science as a whole can be viewed as a Bayesian process, and that Bayes can distinguish science from pseudoscience more precisely than falsification, the method popularized by Karl Popper.

Named after its inventor, the 18th-century Presbyterian minister Thomas Bayes, Bayes’ theorem is a method for calculating the validity of beliefs (hypotheses, claims, propositions) based on the best available evidence (observations, data, information). Here’s the most dumbed-down description: Initial belief plus new evidence = new and improved belief.   (A fuller and…


CO2 Emissions from Energy production forecast to be rising beyond 2040 despite COP21 Paris Agreement

Last week the US Energy Information Administration (EIA) published their INTERNATIONAL ENERGY OUTLOOK 2016. The Daily Caller (and the GWPF) highlighted the EIA’s summary of energy production. This shows that, despite the predicted strong growth in nuclear power and implausibly high growth in renewables, usage of fossil fuels is also predicted to rise, as shown in their headline graphic below.

For policy purposes, the important aspect is the translation into CO2 emissions. In the final chapter, 9. Energy-related CO2 Emissions, figure 9.3 shows the equivalent CO2 emissions in billions of tonnes of CO2. I have reproduced the graphic as a stacked bar chart.

Data reproduced as a stacked bar chart.

In 2010 these CO2 emissions were just under two-thirds of total global greenhouse gas emissions. The question is how this fits with the policy requirements to avoid 2°C of warming from the IPCC’s Fifth Assessment Report. The International Energy Agency summarized the requirements very succinctly in the World Energy Outlook 2015 Special Report, page 18:

The long lifetime of greenhouse gases means that it is the cumulative build-up in the atmosphere that matters most. In its latest report, the Intergovernmental Panel on Climate Change (IPCC) estimated that to preserve a 50% chance of limiting global warming to 2 °C, the world can support a maximum carbon dioxide (CO2) emissions “budget” of 3 000 gigatonnes (Gt) (the mid-point in a range of 2 900 Gt to 3 200 Gt) (IPCC, 2014), of which an estimated 1 970 Gt had already been emitted before 2014. Accounting for CO2 emissions from industrial processes and land use, land-use change and forestry over the rest of the 21st century leaves the energy sector with a carbon budget of 980 Gt (the midpoint in a range of 880 Gt to 1 180 Gt) from the start of 2014 onwards.

From the forecast above, cumulative CO2 emissions from 2014 will reach 980 Gt in 2038. Yet by 2040 there is no sign of emissions peaking.
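
As a rough check on the 2038 figure, here is a minimal sketch of the cumulative-budget arithmetic. The end-point emission rates are approximations read off the EIA headline graphic, interpolated linearly; they are not the EIA’s published series.

```python
# Cumulative energy-sector CO2 emissions against the 980 Gt budget.
# The 2014 and 2040 annual rates are rough assumed figures, linearly
# interpolated -- approximations, not the EIA's published data.
BUDGET_GT = 980.0  # energy-sector budget from the start of 2014

def annual_emissions_gt(year, start=(2014, 36.0), end=(2040, 43.0)):
    (y0, e0), (y1, e1) = start, end
    return e0 + (e1 - e0) * (year - y0) / (y1 - y0)

cumulative = 0.0
for year in range(2014, 2041):
    cumulative += annual_emissions_gt(year)
    if cumulative >= BUDGET_GT:
        print(f"budget exhausted in {year} ({cumulative:.0f} Gt cumulative)")
        break

# On these rough figures the budget runs out around 2038,
# with forecast emissions still rising at that date.
```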

Further corroboration comes from the UNFCCC. In preparation for COP21 they produced, from all the country policy proposals, a snappily-titled Synthesis report on the aggregate effect of intended nationally determined contributions. The UNFCCC have updated the graphics since. Figure 2 of the 27 Apr 2016 update shows total GHG emissions, which in 2010 were about 17 Gt higher than the CO2 emissions from energy production.

The graphic clearly shows that the INDCs – many with very vague and non-verifiable targets – will make very little difference to the non-policy emissions path. Yet even this small impact is contingent on those submissions being implemented in full, which is unlikely in many countries. The 2°C target requires global emissions to peak in 2016 and then head downwards. There are no additional policies even being tabled to achieve this, except possibly by some noisy, but inconsequential, activist groups. Returning to the EIA’s report, figure 9.4 splits the CO2 emissions between the OECD and non-OECD countries.

The OECD countries represent nearly all the countries who propose to reduce their CO2 emissions against the 1990 baseline, yet their emissions are forecast by the EIA still to be 19% higher in 2040. However, that increase is small compared to the non-OECD countries – who mostly propose either to constrain emissions growth or have no emissions policy proposals at all – whose emissions are forecast to treble in fifty years. As a result the global forecast is for CO2 emissions to double. Even if all the OECD countries completely eliminated CO2 emissions by 2040, global emissions would still be a third higher than in 1990. As the rapid economic growth in the former Third World reduces global income inequalities, it is also reducing the inequalities in fossil fuel consumption for energy production. This will continue beyond 2040, when the OECD, with a sixth of the world’s population, will still produce a third of global CO2 emissions.
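
The narrowing can be seen with a little arithmetic on the shares just quoted; only the one-sixth and one-third figures from the forecast are used, and the calculation is a simple ratio.

```python
# Per-capita emission ratio implied by the 2040 shares quoted above:
# the OECD holds one-sixth of world population and one-third of emissions.
oecd_pop_share, oecd_emit_share = 1 / 6, 1 / 3

oecd_per_capita = oecd_emit_share / oecd_pop_share                   # = 2.0
non_oecd_per_capita = (1 - oecd_emit_share) / (1 - oecd_pop_share)   # = 0.8

print(f"OECD per-capita emissions: {oecd_per_capita / non_oecd_per_capita:.1f}x non-OECD")
# 2.5x -- still higher, but a far smaller multiple than in 1990.
```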

Unless the major emerging economies peak their emissions in the next few years, and then reduce them rapidly thereafter, the emissions target allegedly representing 2°C or less of global warming by 2100 will not be met. But to demand that countries like India, Vietnam, Indonesia, Bangladesh, Nigeria and Ethiopia do so, with the consequent impact on their economic growth, is morally indefensible.

Kevin Marshall


Britain Stronger in Europe Letter

I received a campaign letter from Britain Stronger in Europe today headed


Putting the “RE:” in front is a bit presumptuous. It is not a reply to my request. However, I believe in looking at both sides of the argument, so here is my analysis. First the main points in the body of the letter:-

  1. JOBS – Over 3 million UK jobs are linked to our exports to the EU.
  2. BUSINESSES – Over 200,000 UK Businesses trade with the EU, helping them create jobs in the UK.
  3. FAMILY FINANCES – Leaving the EU will cost the average UK household at least £850 a year, and potentially as much as £1,700, according to research released by the London School of Economics.
  4. PRICES – Being in Europe means lower prices in UK shops, saving the average UK household over £350 a year. If we left Europe, your weekly shop would cost more.
  5. BENEFITS vs COSTS – For every £1 we put into the EU, we get almost £10 back through increased trade, investment, jobs, growth and lower prices.
  6. INVESTMENT – The UK gets £66 million of investment every day from EU countries – that’s more than we pay to be a member of the EU.

The first two points are facts, but only show part of the picture. The UK not only exports to the EU, but also imports. Indeed there is a net deficit with the EU, and a large deficit in goods. It is only due to a net surplus in services – mostly in financial services based in the City of London – that the trade deficit is not larger. The ONS provides a useful graphic illustrating both the declining share of exports to the EU, and the increasing deficit, reproduced below.

No one in the UK is suggesting that Brexit would mean a decline in trade, and it would be counter-productive for the EU not to reach a trade agreement with an independent UK when the EU has this large surplus.

The impact on FAMILY FINANCES is based upon work by the Centre for Economic Performance, an LSE-affiliated organisation. There are both a general paper and a technical paper to back up the claims. They are modelled estimates of the future, not facts. The modelled costs assume Britain exits the European Union without any trade agreements, despite agreements being in the economic interests of both the UK and the EU. The report also performs a sleight of hand in estimating the contributions the UK would make post-Brexit. From page 18 of the technical paper:

We assume that the UK would keep contributing 83% of the current per capita contribution as Norway does in order to remain in the single market (House of Commons, 2013). This leads to a fiscal saving of about 0.09%.

The table at the foot of page 22 of the report (pdf page 28) gives the breakdown of the estimate from 2011 figures. The Norway figures are gross and have a fixed-cost element. The UK economy is about six times that of Norway, so it would not end up spending nearly as much per capita even on the same basis. The UK figure is also a net figure: the UK pays into the EU twice as much as it gets out. Ever since joining the Common Market in 1973 Britain has been the biggest loser in terms of net contributions, despite the rebates that Mrs Thatcher secured with much effort in the 1980s.

The source of the PRICES claim is again the Centre for Economic Performance, but again with no direct reference. I assume it is from the same report, and forms part of the modelled forecast costs.

The BENEFITS vs COSTS statement is not comparing like with like. The alleged benefits to the UK are not all due to being a member of a club, but are a consequence of being an open economy trading with its neighbours. A true BENEFITS vs COSTS comparison would be between future scenarios of Brexit and Remain. Leading economist Patrick Minford has published a paper for the Institute of Economic Affairs, which finds a net benefit in leaving, particularly when likely future economic growth is taken into account.

The INVESTMENT issue is just part of the BENEFITS vs COSTS statement. So, as with the PRICES statement, it makes one point into two.

In summary, Britain Stronger in Europe claims I need to know six facts relevant to the referendum decision, but actually fails to provide a single one. The actual facts are not solely due to the UK being a member of the European Union, whilst the relevant statements are opinions on modelled future scenarios that are unlikely to happen. The choice is between various possible future scenarios inside the European Union and various possible scenarios outside it. The case for Remain should be proclaiming the achievements of the European Union in making a positive difference to the lives of the 500 million people in its 28 states, along with future pathways where it will build on those achievements. The utter lack of such arguments is, in my opinion, the strongest argument for voting to leave.

Kevin Marshall


Copy of letter from Britain Stronger in Europe

Blighting of Fairbourne by flawed report and BBC reporting

The Telegraph is reporting (hattip Paul Homewood)

A Welsh village is to sue the government after a climate change report suggested their community would soon be washed away by rising sea levels.

The document says Fairbourne will soon be lost to the sea, and recommends that it is “decommissioned”.

However, I was not sure about some of the figures in the Telegraph report, so I checked for myself.

The West of Wales Shoreline Management Plan 2 (SMP2) is available in sections. Fairbourne is covered in file 4d3 – Section 4 Coastal Area D PDZ11.pdf under folder West of W…\Eng…\Coastal Area D

On page 16 is the following graphic.

Fairbourne is the grey area to the bottom left of the image. In 50 years about a third of the village will be submerged at high tide and in 100 years all of the village. This is without changes to flood defences. Even worse is this comment.

Over the 100 years with 2m SLR the area would be typically 1.5m below normal tidal levels.

Where would they have got this 1-2m of sea level rise from? The Gwynedd council Cabinet Report of 22/01/13, Topic: Shoreline Management Plan 2, states

The WoWSMP2 was undertaken in defined stages as outlined in the Defra guidance published in March 2006.

And on sea level rise it states

There is a degree of uncertainty at present regarding the rate of sea level rise. There is an upper and lower estimate which produces a range of possibilities between 1m and 2m in the next 100 years. It will take another 10 to 20 years of data to determine where we are on the graph and what the projection for the future is.

Does the Defra guidance bear any resemblance to the expert opinion? In the UNIPCC AR5 Working Group 1 Summary for Policymakers, on page 21, is Table SPM.2.

At the foot of the table is the RCP8.5 business as usual scenario for sea level rise.

The flood risk images produced in 2011 assume 0.36m of sea level rise in 50 years, or by about 2061. This is at the very top end of the RCP8.5 scenario estimates for 2046-2065, and above the sea level rise projections for the scenarios with mitigation policies. Similarly, a rise of 1m in 100 years is equivalent to the top end of the RCP8.5 scenario estimates for 2081-2100 of 0.82m. Under any mitigation scenario the projected sea level rise is below that.

This means that the West of Wales Shoreline Management Plan 2 assumes that the Climate Change Act 2008 (which has increased electricity bills by at least 30% since it was passed, and blighted many rural areas with wind turbines) will have no impact at all. For added effect it takes the most extreme estimate of sea level rise and doubles it.

It gets worse. The action group Fairbourne Facing Change has a website

The Fairbourne Facing Change Community Action Group (FFC) was established in direct response to the alarming way the West of Wales Shoreline Management Plan 2(SMP2) was publicised on national and local television. The BBC programme ‘Week in Week Out’ broadcast on Tuesday, 11th February 2014, did not present an accurate and balanced reporting of the situation. This, then followed with further inaccurate coverage culminating in unnecessary concern, anxiety, and panic for the community.

The BBC has long been the centre of an extremist view on climate change. The lack of balance has caused real distress and helped exacerbate the situation. Yet in its 2014 report the BBC did not mention this alarmism when reporting that

Fairbourne is expected to enter into “managed retreat” in 2025 when the council will stop maintaining defences due to rising sea levels.


More than 400 homes are expected to be abandoned in the village by 2055 as part of the council’s shoreline management plan (SMP) policy.

With sea level rise of about 3mm a year, and with forecast acceleration, the council is alleged to find it no longer worthwhile to maintain the sea defences when sea levels have risen by one or two inches, and will have completely abandoned the village on a sea level rise of less than 14 inches.
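
A quick arithmetic check of those figures, in a minimal sketch; the 3mm-a-year rate is the approximate current rate of rise, and the milestone dates are those in the BBC report.

```python
# Sea level rise at the current ~3 mm/yr, measured from the 2014
# report date to the SMP2 milestones (2025: defences no longer
# maintained; 2055: village abandoned).
MM_PER_INCH = 25.4
RATE_MM_PER_YR = 3.0

for target in (2025, 2055):
    rise_mm = (target - 2014) * RATE_MM_PER_YR
    print(f"by {target}: {rise_mm:.0f} mm = {rise_mm / MM_PER_INCH:.1f} inches")

# by 2025: 33 mm = 1.3 inches
# by 2055: 123 mm = 4.8 inches
# Even the SMP2's own assumption of 0.36 m in 50 years implies an
# average rate of 7.2 mm/yr, well over double the current rate.
```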

Kevin Marshall



James Ross Island warming of past 100 years not unusual

At Wattsupwiththat there is a post by Sebastian Lüning The Medieval Warm Period in Antarctica: How two one-data-point studies missed the target.

Lüning has the following quote and graphic from Mulvaney et al. 2012, published in Nature.

But the late Bob Carter frequently went on about the recent warming being nothing unusual. Using mainstream thinking, would you trust a single climate denialist against proper climate scientists?

There is a simple test: will similar lines fit the data of the last two thousand years? It took me a few minutes to produce the following.

Bob Carter is right, and nine leading experts, plus their peer reviewers, are wrong. From the temperature reconstruction there were at least five times in the last 2000 years when there were similar or greater jumps in average temperature. There are also about seven temperature peaks similar to the most recent.
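
For anyone wishing to repeat the test, here is a minimal sketch of the approach in Python. The file name and column names are hypothetical; the reconstruction data are archived alongside Mulvaney et al. 2012, and the sketch assumes a yearly-resolved series.

```python
import pandas as pd

# Hypothetical file and column names; the ice-core reconstruction is
# archived with Mulvaney et al. 2012. Assumes one row per year.
df = (pd.read_csv("james_ross_island_reconstruction.csv")
        .sort_values("year"))

# Temperature change over each trailing 100-year window.
jumps = df["temp_anomaly"].diff(periods=100)

recent = jumps.iloc[-1]              # the most recent century's warming
earlier = jumps.iloc[:-1].dropna()   # all earlier 100-year windows

print(f"recent 100-year change: {recent:.2f} C")
print(f"earlier windows at least as large: {(earlier >= recent).sum()}")
```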

It is yet another example of why one should look at the basic data rather than rely on the statements of the experts. It is akin to a court preferring actual evidence to hearsay.

Kevin Marshall

Insight into the mindset of FoE activists

Bishop Hill comments about how

the Charities Commissioners have taken a dim view of an FoE leaflet that claimed that silica – that’s sand to you or me – used in fracking fluid was a known carcinogen.

Up pops an FoE activist making all sorts of comments, including attacking the host’s book The Hockey Stick Illusion. Below is my comment.

Phil Clarke’s comments on the host’s book are an insight into the mindset of Green activists.
He says Jan 30, 2016 at 9:58 AM

So you’ve read HSI, then?
I have a reading backlog of far more worthwhile volumes, fiction and non-fiction. Does anybody dispute a single point in Tamino’s adept demolition?


Where did I slag off HSI? I simply trust Tamino; the point about innuendo certainly rings true, based on other writings.
So no, I won’t be shelling out for a copy of a hatchet job on a quarter-century old study. But I did read this, in detail

Tamino’s article was responded to twice by Steve McIntyre. The first response looks at the use of non-standard statistical methods; the second, “Re-post of ‘Tamino and the Magic Flute’”, simply repeats the post of two years before. Tamino had ignored these previous rebuttals. A simple illustration is the Gaspé series that Tamino defends. He misses out many issues with this key element of the reconstruction, including that a later sample from the area failed to show a hockey stick.
So Phil Clarke has attacked a book that he has not read, based on a biased review by an author in line with his own prejudices. He ignores the counter-arguments, just as the biased reviewer does. It says a lot about the rubbish Cuadrilla are up against.

Kevin Marshall

William Connolley is on the side of anti-science, not the late Bob Carter

In the past week there have been a number of tributes to Professor Bob Carter, retired Professor of Geology and leading climate sceptic. These include Jo Nova, James Delingpole, Steve McIntyre, Ian Plimer at the GWPF, Joe Bast of The Heartland Institute and E. Calvin Beisner of the Cornwall Alliance. In complete contrast, William Connolley posted this comment in a post titled Science advances one funeral at a time:

Actually A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it, but I’m allowed to paraphrase in titles. And anyway he said it in German, naturally. Today brings us news of another such advancement in science, with the reported death of Robert Carter.

Below is a comment I posted at Climate Scepticism

I believe Max Planck did have a point. In science people tenaciously hold onto ideas even after they have been falsified by the evidence or (as more often happens) supplanted by better ideas. Where the existing ideas form an institutionalized consensus, discrimination can occur against those whose hypotheses would undermine that consensus. It can be that a new research paradigm only gains prominence when the numbers in the old paradigm dwindle. As a result the advance of new knowledge and understanding is held back.

To combat this innate conservatism in ideas I propose four remedies.

First is to promote methods of evaluating competing theories that are independent of consensus or opinion. In pure science that means conducting experiments that could falsify a hypothesis. For complex concepts, where experiment is not possible and data are incomplete and of poor quality (like the AGW hypothesis or economic theories), comparative analysis needs to be applied based upon independent standards.

Second is to recognize institutional bias by promoting pluralism and innovation.

Third is to encourage better definition of concepts and more rigorous standards of data within the existing research paradigm, so as to push its boundaries.

Fourth is to train people to separate scientific endeavours from belief systems, whether religious, political or ethical.

The problem for William Connolley is that all his efforts within climatology – such as editing Wikipedia to fit his narrow views, or helping set up Real Climate to save the Mannian Hockey Stick from exposure of its many flaws – have been in enforcing the existing paradigm and blocking any challenges. He is part of the problem that Planck was talking about.

As an example of the narrow and dogmatic views that Connolley supports, here is the late Bob Carter on his major point: how beliefs in unprecedented human-caused warming are undermined by the long-term temperature proxies from ice core data. The video quality is poor, probably reflecting the lack of the professional funding that Connolley and his fellow-travellers fought so hard to deny him.

Kevin Marshall