Daniel Hannan on the selfishness of running a deficit and post-truth realities

In the latest Ici Londres production Dan Hannan looks at the morality of deficits.

Daniel Hannan starts by quoting Matthew 7:9-10

If the son shall ask bread of any of you that is a father, will you give him a stone? Or if he asks for a fish will you give him a serpent?

The passage goes on to say that even if you are evil, you understand how to give good gifts to your children. By implication, to act for good we must also understand how to act for the good, not just have the moral injunction.

Hannan goes on to say that we do not run up large debts to bequeath to our children. Yet many impose a very different standard as voters, convincing themselves that they are being unselfish. Those asking for more money from the State, whether to pay for care in old age, a pay rise in the public sector, or remission of tuition fees, might have a very good claim, but it is not an intrinsically unselfish claim, as they are asking everybody else to chip in and pay for their cause. Conversely, those who try to impose some fiscal discipline are deemed selfish, yet it is they who are standing up for future generations. Austerity is not a random preference but a simple reality.

This is all pretty obvious stuff to anyone who understands basic morality and has the slightest notion of finance. It is certainly within the understanding of anybody brought up with a traditional British public school education. But I would suggest it is totally alien to the vast majority of the British public. The reason is described by a new word that entered the Oxford English Dictionary last month.

post-truth

Relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.

The General Election campaign is a clear illustration of the domination of post-truthers in public life. There is no understanding of public finances, just mass beliefs that are not based on any moral tradition. The beliefs spread on social media, driven by those who most forcefully and repeatedly express their ideas. People are deemed wrong because they disagree with the mass beliefs, and are shouted down (or trolled, in the electronic version) because of it.

In a post last month – General Election 2017 is a victory for the Alpha Trolls over Serving One’s Country – I concluded

It is on the issue of policy to combat climate change that there is greatest cross-party consensus, and the greatest concentration of alpha trolls. It is also where there is the clearest illustration of policy that is objectively useless and harmful to the people of this country.

Like public finances, climate change is an area where post-truthers dominate. Two examples illustrate this.

Consensus messaging

There is no clear evidence of an emerging large human-caused problem with climate, and there is no prospect of action to reduce greenhouse gas emissions to near zero. Instead we have a dodgy survey that claimed 97% of academic papers found by an internet search matching the topics ‘global climate change’ or ‘global warming’ expressed support (belief / assumptions) in the broadest, most banal, form of the global warming hypothesis. This was converted by Senator Bernie Sanders, in questioning Scott Pruitt, into the following:-

As you may know, some 97% of scientists who have written articles for peer-reviewed journals have concluded that climate change is real, it is caused by human activity, and it is already causing devastating problems in the US and around the world.

And

While you are not certain, the vast majority of scientists are telling us that if we do not get our act together and transform our energy system away from fossil fuel there is a real question as to the quality of the planet that we are going to be leaving our children and our grandchildren.

The conversion from banal belief to these sweeping statements is not the fault of the Senator, though he (or his speech-writers) should have checked. Rather, it is the fault of lead author John Cook and his then PhD supervisor, Cognitive Psychology Professor Stephan Lewandowsky. Post-truthers will not recognize the glaring difference between the dodgy survey and the Senator’s statements, as it is appeals to emotion and belief that are primary in evaluating political realities.

Mitigating Climate Change

Dangerous climate change is allegedly caused by human greenhouse gas emissions. The proposed solution is to reduce those emissions (mostly CO2 emissions from the burning of fossil fuels) to near zero. The key point for policy is that emissions are global, yet most countries, covering over 80% of the global population, have no primary obligation under the 1992 Rio Declaration to reduce their emissions. These developing “non-Annex” countries have accounted for all the increase in emissions since 1990, as shown in this graph.

The problem can be expressed in my First Law of Climate Mitigation

To reduce global greenhouse gas emissions, the aggregate reduction in countries that reduce their emissions must be greater than the aggregate increase in emissions in all other countries.
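As a minimal numerical sketch of this truism, with purely illustrative figures rather than actual emissions data:

```python
# A minimal sketch of the First Law of Climate Mitigation with made-up numbers.
# Units are notional GtCO2 per year; none of these figures are real data.
policy_countries_cut = 2.0    # aggregate reduction by the countries cutting emissions
other_countries_rise = 3.5    # aggregate increase in all other countries

net_change = other_countries_rise - policy_countries_cut
print(f"Net change in global emissions: {net_change:+.1f} GtCO2/yr")
# +1.5: global emissions still rise, so the policy countries bear the costs
# without the global aim being achieved.
```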

All the ranting about supporting the Paris Agreement ignores this truism. As a result, countries like the UK that pursue climate mitigation will increase their energy costs and make life harder for their people, whilst not achieving the policy aims. It is the poorest in those policy countries who will bear the biggest burden, and the policies will create comparative disadvantages relative to the non-policy countries. For the developing countries (shown in purple in the graph) to reduce their emissions would destroy their economic growth, preventing the slow climb out of extreme poverty still endured by the majority of people on this planet. In so doing we ignore the moral tradition from our Christian heritage that the primary moral concern of public policy should be to help the poor, the disadvantaged and the marginalized. Ignoring the truism and pursuing mitigation anyway bequeaths a worse future to our children and our grandchildren. This is the same for climate change as for public finances. But in both cases it is the post-truth “reality” that prevents recognition of basic logic and wider morality.

Kevin Marshall

 

Bernie Sanders demonstrates why he was not fit to be President

Senator Bernie Sanders of Vermont was for a while running a close second to Hillary Clinton in the Democrat Primaries. Had it not been for his extreme left views, his advanced years and the fact that he is the junior Senator from the 49th most populous State, he might have stood a chance against a former First Lady and Secretary of State. But Senator Sanders’ recent questioning of Scott Pruitt shows why he is unfit for high office. Ron Clutz has transcribed more of the dialog, but I think two statements encapsulate this.

At 0.45

As you may know, some 97% of scientists who have written articles for peer-reviewed journals have concluded that climate change is real, it is caused by human activity, and it is already causing devastating problems in the US and around the world. Do you believe that climate change is caused by carbon emissions from human activity?

There is no 97% survey of scientists which concludes these things. As Ron Clutz observes, the nearest to a definite question was in Examining the Scientific Consensus on Climate Change – Doran and Zimmerman 2009, where the second question was

2. Do you think human activity is a significant contributing factor in changing mean global temperatures?

One could answer “yes” if one thought that 10% of the rise in mean global temperatures was due to land use changes, and the rest due to natural factors. It does not ask about fossil fuel emissions, and the question allows for belief in factors other than human activity, whether known or unknown. Neither does it ask whether the temperature rise is net harmful, with huge devastating impacts already evident.

There is also the Cook et al. survey of peer-reviewed academic papers that I looked at after listening to a lecture Cook gave at Bristol University in late 2014. The survey just looked for the assumption, whether explicit or implied, that humans cause some warming. Like the Doran and Zimmerman survey it is just hearsay. Yet Sen. Sanders presents this as good evidence that there is already a clear catastrophic problem caused by changes in the climate. If there is real and overwhelming evidence, why does Sen. Sanders not refer to that instead of misrepresenting bogus opinion polls?

Senator Sanders then goes even further.  At 1.50

While you are not certain, the vast majority of scientists are telling us that if we do not get our act together and transform our energy system away from fossil fuel there is a real question as to the quality of the planet that we are going to be leaving our children and our grandchildren. So you are applying for a job as Administrator at the EPA to protect our environment. Overwhelming majority of scientists say we have to act boldly and you’re telling me that there needs to be more debate on this issue and that we should not be acting boldly.

Sanders now says a majority of scientists are telling us we must change our energy systems. Aside from the fact that only a very small minority of scientists have any sort of competency in the field of climate (and there is evidence of a lot of demonstrated incompetency within that small group, e.g. here), they have no expertise in the economic or moral cases for policy. For policy, the interpretation of the moral imperatives and the practical possibilities should be the realm of politicians. Those who sit on specialist committees should at least have their own developed views on the field.

Senator Bernie Sanders has taken some very dodgy opinion polls, grossly exaggerated the findings, and then ascribed to climatologists statements that are far removed from, and way beyond, any competencies they might have. As I see it, the role of President of the United States, as a leader, is to critically interpret what they are given in order to make decisions for the nation. That is the exact opposite of what Sanders did last week.

Kevin Marshall 

 

How strong is the Consensus Evidence for human-caused global warming?

You cannot prove a vague theory wrong. If the guess that you make is poorly expressed and the method you have for computing the consequences is a little vague then ….. you see that the theory is good as it can’t be proved wrong. If the process of computing the consequences is indefinite, then with a little skill any experimental result can be made to look like an expected consequence.

Richard Feynman – 1964 Lecture on the Scientific Method

It’s self-evident that democratic societies should base their decisions on accurate information. On many issues, however, misinformation can become entrenched in parts of the community, particularly when vested interests are involved. Reducing the influence of misinformation is a difficult and complex challenge.

The Debunking Handbook 2011 – John Cook and Stephan Lewandowsky

My previous post looked at the attacks on David Rose for daring to suggest that the rapid fall in global land temperatures at the end of the El Nino event was strong evidence that the record highs in global temperatures were not due to human greenhouse gas emissions. The technique used against him was to look at long-term linear trends. The main problems with this argument were
(a) according to AGW theory warming rates from CO2 alone should be accelerating and at a higher rate than the estimated linear warming rates from HADCRUT4.
(b) HADCRUT4 shows warming stopped from 2002 to 2014, yet in theory the warming from CO2 should have accelerated.

Now there are at least two ways to view my arguments. The first is to look at Feynman’s approach. The climatologists and associated academics attacking journalist David Rose chose to do so from a very blurred specification of AGW theory: human emissions will cause greenhouse gas levels to rise, which will cause global average temperatures to rise. Global average temperatures have clearly risen in all long-term (>40 year) data sets, so the theory is confirmed. On a rising trend, with large variations due to natural variability, any new records will be primarily “human-caused”. But making the theory and data slightly less vague reveals an opposite conclusion. Around the turn of the century the annual increase in CO2 levels went from about 0.4% to 0.5% a year (figure 1), which should have led to an acceleration in the rate of warming. In reality warming stalled.
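A rough sketch of why this matters, assuming the standard simplified forcing relation for CO2 (an extra 5.35 × ln(C1/C0) W/m² of forcing) and treating the underlying warming rate, to a first approximation, as proportional to the rate at which the forcing grows:

```python
# A minimal sketch: at a constant fractional growth rate g in CO2 concentration,
# the forcing added each year is 5.35 * ln(1 + g). Only the relative change matters here.
import math

def annual_forcing_increase(growth_rate):
    """Extra radiative forcing (W/m^2) added in one year at a given fractional CO2 growth."""
    return 5.35 * math.log(1 + growth_rate)

before = annual_forcing_increase(0.004)   # ~0.4% per year
after = annual_forcing_increase(0.005)    # ~0.5% per year
print(f"{after / before - 1:.0%}")        # ~25%: forcing now grows about a quarter faster
# Other things being equal, the underlying warming rate should therefore have
# increased by roughly a quarter, not stalled.
```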

The reaction was to come up with a load of ad hoc excuses. The Hockey Schtick blog had reached 66 separate excuses for the “pause” by November 2014, from the peer-reviewed literature to a comment in the UK Parliament. This could be because climate is highly complex, with many variables: the presence of each contributing factor can only be guessed at, let alone the magnitude of each factor and the interrelationships between all the factors. So how do you tell which statements are valid information and which are misinformation? I agree with Cook and Lewandowsky that misinformation is pernicious, and difficult to get rid of once it becomes entrenched. So how does one distinguish between the good information and the bad, misleading or even pernicious?

The Lewandowsky / Cook answer is to follow the consensus of opinion. But what is the consensus of opinion? In climate, one variation is to follow a small subset of academics in the area who answer in the affirmative to the two questions

1. When compared with pre-1800s levels, do you think that mean global temperatures have generally risen, fallen, or remained relatively constant?

2. Do you think human activity is a significant contributing factor in changing mean global temperatures?

The problem is that the first question just involves reading a graph, and the second is a belief statement with no precision. Anthropogenic global warming has been a hot topic for over 25 years now. These two very vague, empirically-based questions, forming the foundations of the subject, should by now be capable of more precise formulation. For the second, that would mean having pretty clear and unambiguous estimates of the percentage of warming, so far, that is human-caused. On that, the consensus of leading experts is unable to say whether it is 50% or 200% of the warming so far. (There are meant to be time lags and factors like aerosols that might suppress the warming.) This from the 2013 UNIPCC AR5 WG1 SPM section D3:-

It is extremely likely that more than half of the observed increase in global average surface temperature from 1951 to 2010 was caused by the anthropogenic increase in greenhouse gas concentrations and other anthropogenic forcings together.

The IPCC, encapsulating the state-of-the-art knowledge, cannot provide firm evidence in the form of a percentage, or even a fairly broad range, even with over 60 years of data to work on. It is even worse than it appears. The “extremely likely” phrase is a Bayesian probability statement. Ron Clutz’s simple definition from earlier this year was:-

Here’s the most dumbed-down description: Initial belief plus new evidence = new and improved belief.
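A minimal sketch of that idea, with purely illustrative numbers:

```python
# Bayes' rule in its simplest form: prior belief plus new evidence gives an updated belief.
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Posterior probability of a hypothesis after seeing one piece of evidence."""
    numerator = prior * likelihood_if_true
    denominator = numerator + (1 - prior) * likelihood_if_false
    return numerator / denominator

prior = 0.5                                # initial belief in the hypothesis
posterior = bayes_update(prior, likelihood_if_true=0.8, likelihood_if_false=0.4)
print(round(posterior, 3))                 # 0.667: the evidence has shifted the belief
# The relevance here: with each assessment report the evidence base grows, so the
# stated belief, and in particular its range, ought to move if updating is happening.
```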

For the IPCC to claim, at the fifth attempt, that their statement is extremely likely, they should be able to show some sort of progress in updating their beliefs in the light of new evidence. That would mean narrowing the estimate of the magnitude of impact of a doubling of CO2 on global average temperatures. As Clive Best documented in a cliscep comment in October, the IPCC reports from 1990 to 2013 failed to change the estimated range of 1.5°C to 4.5°C. Looking up Climate Sensitivity in Wikipedia, we get the origin of the range estimate.

A committee on anthropogenic global warming convened in 1979 by the National Academy of Sciences and chaired by Jule Charney estimated climate sensitivity to be 3 °C, plus or minus 1.5 °C. Only two sets of models were available; one, due to Syukuro Manabe, exhibited a climate sensitivity of 2 °C, the other, due to James E. Hansen, exhibited a climate sensitivity of 4 °C. “According to Manabe, Charney chose 0.5 °C as a not-unreasonable margin of error, subtracted it from Manabe’s number, and added it to Hansen’s. Thus was born the 1.5 °C-to-4.5 °C range of likely climate sensitivity that has appeared in every greenhouse assessment since…

It is revealing that the quote is under the subheading Consensus Estimates. The climate community has collectively failed to update the original beliefs, which were based on a very rough estimate. The emphasis on referring to consensus beliefs about the world, rather than looking outward for evidence in the real world, is, I would suggest, the primary reason for this failure. Yet such community-based beliefs completely undermine the integrity of the Bayesian estimates, making their use in statements about climate clear misinformation in Cook and Lewandowsky’s use of the term. What is more, those in the climate community who look primarily to these consensus beliefs rather than to the data of the real world will endeavour to dismiss the evidence, or make up ad hoc excuses, or smear those who try to disagree.

A caricature of these perspectives with respect to global average temperature anomalies is available in the form of a flickering widget at John Cook’s skepticalscience website. This purports to show the difference between “realist” consensus and “contrarian” non-consensus views. Figure 2 is a screenshot of the consensus view, interpreting warming as a linear trend. Figure 3 is a screenshot of the non-consensus or contrarian view, which is supposed to interpret warming as a series of short, disconnected periods of no warming, where each period just happens to be at a higher level than the previous one. There are a number of things that this indicates.

(a) The “realist” view is of a linear trend throughout any data series. Yet the period from around 1940 to 1975 shows no warming, or slight cooling, depending on the data set. Therefore any linear trend line derived for a period starting earlier than 1970-1975 and ending in 2015 will show a lower rate of warming. This would be consistent with the rate of CO2 increase rising over time, as shown in figure 1. But shorten the period, again ending in 2015, and once it becomes less than 30 years the warming trend also decreases. This contradicts the theory, unless ad hoc excuses are used, as shown in my previous post using the HADCRUT4 data set. (A toy illustration follows after point (c) below.)

(b) Those who agree with the consensus are called “Realists”, despite looking inwards towards common beliefs. Those who disagree are labelled “Contrarians”. This is not inaccurate when there is a dogmatic consensus. But it is utterly false to lump together all those who disagree as holding the same views, especially when no examples are provided of anyone who actually holds such views.

(c) The linear trend appears a more plausible fit than the series of “contrarian” lines. By implication, those who disagree with the consensus are viewed as having a distinctly more blinkered and distorted perspective than those who follow it. Yet even using the GISTEMP data set (which gives the greatest support to the consensus view) there is a clear break in the linear trend. The less partisan HADCRUT4 data shows an even greater break.
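The point in (a) can be shown with a toy calculation. The series below is synthetic, constructed only to mimic the pattern described in the text (roughly flat to 1975, steady warming to 2002, flat thereafter); it is not HADCRUT4 or GISTEMP data.

```python
# Toy illustration: how the fitted linear warming trend depends on the period chosen.
import numpy as np

years = np.arange(1940, 2016)
# Synthetic anomaly series: flat 1940-1974, rising 0.017 C/yr 1975-2001, flat 2002-2015.
anom = np.piecewise(
    years.astype(float),
    [years < 1975, (years >= 1975) & (years < 2002), years >= 2002],
    [0.0, lambda y: 0.017 * (y - 1975), 0.017 * (2002 - 1975)],
)

def trend(start, end):
    """OLS warming trend in C per decade between two years (inclusive)."""
    mask = (years >= start) & (years <= end)
    return 10 * np.polyfit(years[mask], anom[mask], 1)[0]

for start in (1940, 1975, 1990, 2002):
    print(f"{start}-2015 trend: {trend(start, 2015):+.2f} C/decade")
# The 1975-2015 trend is the steepest; start earlier (1940) or later (1990, 2002)
# and the fitted trend falls, even though CO2 rose fastest in the latest period.
```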

Those who spot the obvious – that around the turn of the century warming stopped or slowed down, when in theory it should have accelerated – are given a clear choice. They can conform to the scientific consensus, denying the discrepancy between theory and data. Or they can act as scientists, denying the false and empirically empty scientific consensus, receiving the full weight of all the false and career-damaging opprobrium that accompanies it.

Figure 2: fig2-sks-realists

Figure 3: fig3-sks-contras

Kevin Marshall

 

John Cook undermining democracy through misinformation

It seems that John Cook was posting comments in 2011 under the pseudonym Lubos Motl. The year before, physicist and blogger Luboš Motl had posted a rebuttal of Cook’s then 104 Global Warming & Climate Change Myths. When someone counters your beliefs point for point, it is natural to feel some anger. But taking the online identity of Motl is potentially more than identity theft. It can be viewed as an attempt to damage the reputation of someone you oppose.

However, there is a wider issue here. In 2011 John Cook co-authored with Stephan Lewandowsky The Debunking Handbook, which is still featured prominently on skepticalscience.com. This short tract starts with the following paragraphs:-

It’s self-evident that democratic societies should base their decisions on accurate information. On many issues, however, misinformation can become entrenched in parts of the community, particularly when vested interests are involved. Reducing the influence of misinformation is a difficult and complex challenge.

A common misconception about myths is the notion that removing its influence is as simple as packing more information into people’s heads. This approach assumes that public misperceptions are due to a lack of knowledge and that the solution is more information – in science communication, it’s known as the “information deficit model”. But that model is wrong: people don’t process information as simply as a hard drive downloading data.

If Cook was indeed using the pseudonym Lubos Motl then he was knowingly putting misinformation into the public arena in a malicious form. If he misrepresented Motl’s beliefs, then the public may not know who to trust. Targeted against one effective critic, it could trash that critic’s reputation. At a wider scale it could allow morally and scientifically inferior views to gain prominence over superior viewpoints. If the alarmist beliefs were superior, it would not be necessary to misrepresent alternative opinions. Open debate would soon reveal which side had the better views. But in debating and disputing, all sides would sharpen their arguments. What would quickly disappear is the reliance on opinion surveys and the rewriting of dictionaries. Instead, proper academics would be distinguishing quality, relevant evidence from dogmatic statements based on junk sociology and psychology. They would start defining the boundaries of expertise between the basic physics, computer modelling, results analysis, public policy-making, policy implementation, economics, ethics and the philosophy of science. They may then start to draw on the understanding that has been achieved in these subject areas.

Kevin Marshall

Prof Lewandowsky – Where is the overwhelming evidence of climate change?

On Stephan Lewandowsky’s blog (funded by the Australian people) he claims that there is overwhelming evidence of climate change. My question is as follows

You claim that there is “overwhelming scientific evidence on climate change”. Does this apply to:-

  1. The trivial proposition that there is a greenhouse effect, so a rise in GHG levels will cause some rise in temperature?

    OR

  2. The non-trivial proposition that the unmitigated increase in GHG levels will lead to significant warming with catastrophic consequences?

The trivial proposition is something for a few academics to ponder. It is only when there is reasonable scientific evidence for the non-trivial proposition that a global policy to mitigate could be seriously contemplated.

Having attended John Cook’s lecture at Bristol University a few days ago, I found out that the vast survey of academic papers that found a 97% consensus was about belief in the trivial proposition, and that some of the papers were authored by non-scientists. That is, Cook presented weak, secondary evidence for the trivial proposition.

Cook’s lecture also mentioned the four Hiroshima bombs a second of heat accumulation in the climate system since 1998, the widget for which you have on the left-hand side of this blog. Stated this way, it appears to be a non-trivial amount of warming that anybody can perceive. Yet it is equivalent to the average temperature of the oceans increasing at a rate of less than 0.0018°C per annum. That is weak evidence for the trivial proposition.
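As a rough order-of-magnitude check on that conversion (the bomb yield, ocean mass and heat capacity below are round-number assumptions):

```python
# Rough check: converting "four Hiroshima bombs a second" into ocean warming per year.
bomb_joules = 6.3e13          # ~15 kt of TNT, the usual "Hiroshima" figure (approximate)
bombs_per_second = 4
seconds_per_year = 3.15e7

heat_per_year = bomb_joules * bombs_per_second * seconds_per_year   # ~7.9e21 J

ocean_mass_kg = 1.4e21        # approximate mass of the world's oceans
specific_heat = 4000          # J per kg per deg C, approximate for seawater

warming_per_year = heat_per_year / (ocean_mass_kg * specific_heat)
print(f"{warming_per_year:.4f} C per year")   # roughly 0.0014 C per year
# Consistent with the figure quoted above: spread through the oceans, the headline
# number amounts to warming of less than two-thousandths of a degree a year.
```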

So where is the overwhelming evidence that can justify policy?


This gives rise to a question that Australian citizens may want to raise with their elected representatives.

Should Australian taxpayers be funding a partisan blog that is strongly critical of mainstream political opinion, whose sole current contributor is a non-Australian working outside of Australia?

Kevin Marshall

Notes on John Cook’s presentation at Bristol University

On Friday 19th September John Cook talked on “Dogma vs. consensus: Letting the evidence speak on climate change” at Bristol University. He was introduced by Stephan Lewandowsky, who is now a professor there. The slides are available at Skepticalscience. Here are some notes on the talk, along with brief comments.

The global warming hypothesis

John Cook started by asking people whether they could define the greenhouse effect, as a way of detecting whether people know what they are talking about. However, he did not then apply this criterion in evaluating consensus views.

He stated that there is no single cause of global warming (other causes include solar and volcanic influences), but that there is a major or principal one. From then on Cook proceeded as if there were a single cause. There was no evidence for the relative size of each cause of global warming. Nor was there any consideration of the implications if AGW accounted for half or less of the warming rather than the entirety of it.

He stated that there are multiple lines of evidence for AGW actually operating, but made no mention of the quality of that evidence, or of contrary evidence that the pattern of warming does not fit the models.

Cook et al. 97% scientific consensus paper

Cook then went on to talk about his 97% consensus paper. He then showed the Barack Obama tweet.

In the Q&A John Cook admitted to two things. First, the paper only dealt with declared belief in the broadest, most banal, form of the global warming hypothesis: that greenhouse gas levels are increasing and there is some warming as a consequence. Second, the survey included papers that were outside the realm of climate science1, quite possibly written by people without a science degree. The Barack Obama tweet account seems to have got the wrong impression.

This should be seen in the light of a comment about why consensus is important.

Communicating consensus isn’t about proving climate change. It addresses a public misconception about expert opinion.

John Cook has spectacularly failed on his own terms.

Fake Experts

Cook pointed to a petition of a few years ago signed by over 31,000 American scientists, opposing the Kyoto Treaty on the basis that it would harm the environment, hinder the advance of science and technology and damage the health and welfare of mankind. They also signed to say that there was no convincing evidence of catastrophic global warming.

He calls these people “fake experts” because, although they are “scientists”, they are not “climate scientists”. But, as we have seen, neither were all the authors of the papers in his climate consensus survey.

If scientists from other areas are “fake experts” on climate science, then this equally applies to those making statements in support of the “climate consensus”. That means all the statements by various scientific bodies outside of the field of “climate” are equally worthless. Even more worthless are proclamations by political activists and politicians.

But most of all neither is John Cook a “climate expert”, as his degree is in physics.

Four Hiroshima Bombs and a Zillion Kitten Sneezes

As an encore, Cook had a short presentation on global warming. There were no hockey sticks showing the last thousand years of warming, or even a NASA Gistemp global surface temperature anomaly graph for the last century. The new graph is the earth’s cumulative heat energy accumulation since 1970, broken down into components. It was a bit like the UNIPCC’s graph below from AR5 WG1 Chapter 3. However, I do not remember the uncertainty bands being on Cook’s version.

Seeing that, I whispered to my neighbour “Four Hiroshima Bombs”. Lo and behold the next slide mentioned them. Not a great prediction on my part, as skepticalscience.com has a little widget. But an alternative variant of this was a zillion kitten sneezes a second, or some such preposterous figure. The next slide was a cute picture of a kitten. Cook seems to be parodying his work.

The Escalator with cherries on top

The last slide was of Cook’s “Escalator” graph, or at least the “Skeptics” view. The special feature for the evening was a pair of cherries in the top left, to emphasise that “skeptics” cherry-pick the evidence.

It was left flickering away for the last 15 minutes.

 

My thoughts on the presentation

Some of the genuine sceptics left the room seething, although they hung around and chatted.

But having reviewed my notes and the slides, my view is different. John Cook started the presentation by trying to establish his expert authority on the global warming hypothesis. Then he let slip that he does not believe all global warming is from rising greenhouse gas levels. The centrepiece was the 97.4% scientific consensus paper of which he was lead author. But, as Cook himself admitted, the survey looked for support for the most banal form of global warming, and the surveyed papers were not all written by climate scientists. Yet Barack Obama is enacting policy based on the false impression of a scientific consensus of dangerous warming.

Then in dissing an alternative viewpoint from actual scientists, Cook has implicitly undermined years of hard campaigning and entryism by green activists in getting nearly every scientific body in the world to make propaganda statements in support of the catastrophic global warming hypothesis and the necessity of immediate action to save the planet. Cook then parodied his own “four Hiroshima bombs a second” widget, before finishing off with a flickering gross misrepresentation of the sceptics, a number of whom were in the room listening politely.

Just about the final question was from someone who asked why nearly all the questions were coming from sceptics, when the vast majority of the people in the room were in support of the “science”. At the end there was polite applause, and the room quickly emptied. I think the answer to the lack of questions was the embarrassment people felt. If John Cook is now the leading edge of climate alarmism, then the game is up.

Kevin Marshall

Notes

  1. This was in response to a question from blogger Katabasis, who pointed out some papers that were clearly not climate science, I believe using Jose Duarte’s list.

Theconsensusproject – unskeptical misinformation on Global Warming

Summary

Following the publication of a survey finding a 97% consensus on global warming in the peer-reviewed literature, the team at “skepticalscience.com” launched the theconsensusproject.com website. Here I evaluate the claims using two of website owner John Cook’s own terms. First, that “genuine skeptics consider all the evidence in their search for the truth”. Second, that misinformation is highly damaging to democratic societies, and that reducing its effects is a difficult and complex challenge.

Applying these standards, I find that

  • The 97% consensus paper is very weak evidence to back global warming. Stronger evidence, such as predictive skill and increasing refinement of the human-caused warming hypothesis, are entirely lacking.
  • The claim that “warming is human caused” has been contradicted at the Sks website. Statements about catastrophic consequences are unsupported.
  • The prediction of 8°F of warming this century without policy is contradicted by the UNIPCC reference.
  • The prediction of 4°F of warming with policy fails to state that this is contingent on successful implementation by all countries.
  • The costs of unmitigated warming and the costs of policy and residual warming are cherry-picked from two 2005 sources. Neither source makes the total claim. The claims of the Stern Review, and of its critics, are ignored.

Overall, by his own standards, John Cook’s Consensus Project website is a source of extreme unskeptical misinformation.

 

Introduction

Last year, following the successful publication of their study on “Quantifying the consensus on anthropogenic global warming in the scientific literature“, the team at skepticalscience.com (Sks) created the spinoff website theconsensusproject.com.

I could set some standards of evaluation of my own. But the best way to evaluate this website is by Sks owner and leader, John Cook’s, own standards.

First, he has a rather odd definition of what a skeptic is. In an opinion piece in 2011 Cook stated:-

Genuine skeptics consider all the evidence in their search for the truth. Deniers, on the other hand, refuse to accept any evidence that conflicts with their pre-determined views.

This definition might be totally at odds with the world’s greatest dictionary in any language, but it is the standard Cook sets.

Also Cook co-wrote a short opinion pamphlet with Stephan Lewandowsky called The Debunking Handbook. It begins

It’s self-evident that democratic societies should base their decisions on accurate information. On many issues, however, misinformation can become entrenched in parts of the community, particularly when vested interests are involved. Reducing the influence of misinformation is a difficult and complex challenge.

Cook fully believes that accuracy is hugely important. Therefore we should see evidence of great care in ensuring the accuracy of anything that he or his followers promote.

 

The Scientific Consensus

The first page is based on the 97% consensus paper.

Cook’s definition of a skeptic considering “all the evidence” is technically not breached. With the abstracts of over 12,000 papers evaluated, it is a lot of evidence. The problem is nicely explained by Andrew Montford in the GWPF note “FRAUD, BIAS AND PUBLIC RELATIONS – The 97% ‘consensus’ and its critics”.

The formulation ‘that humans are causing global warming’ could have two different meanings. A ‘deep’ consensus reading would take it as all or most of the warming is caused by humans. A ‘shallow’ consensus reading would imply only that some unspecified proportion of the warming observed is attributable to mankind.

It is the shallow consensus that the paper followed, as found by a leaked email from John Cook that Montford quotes.

Okay, so we’ve ruled out a definition of AGW being ‘any amount of human influence’ or ‘more than 50% human influence’. We’re basically going with Ari’s porno approach (I probably should stop calling it that) which is AGW= ‘humans are causing global warming’. e.g. – no specific quantification which is the only way we can do it considering the breadth of papers we’re surveying.

There is another aspect. A similar methodology applied to social science papers produced in the USSR would probably produce an overwhelming consensus supporting the statement “communism is superior to capitalism”. Most papers would now be considered worthless.

Yet another aspect is the quality of that evidence. Surveying the abstracts of peer-reviewed papers is a very roundabout way of taking an opinion poll. It is basically some people’s opinions of other people’s implied opinions, derived from short statements on tangentially related issues. In legal terms it is an extreme form of hearsay.
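As an aside on how a headline figure of this kind is arrived at: abstracts that take no position are normally excluded before the percentage is calculated. The sketch below uses round, illustrative numbers of roughly the magnitudes reported for the ~12,000-abstract survey; the exact counts should be checked against the paper itself.

```python
# Illustrative arithmetic only; the counts are rounded assumptions, not the paper's exact figures.
total_abstracts = 12000
no_position = 7980            # abstracts taking no position on the cause of warming
endorse = 3900                # abstracts endorsing (often only implying) some human cause
reject = 120                  # abstracts rejecting the hypothesis, or uncertain

consensus = endorse / (endorse + reject)      # "no position" abstracts are excluded
share_of_all = endorse / total_abstracts
print(f"Headline consensus: {consensus:.1%}")                     # about 97%
print(f"Share of all abstracts endorsing: {share_of_all:.1%}")    # about a third
```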

More important still is whether, as a true “skeptic”, all the evidence (or at least the most important parts) has been considered. Where is the actual evidence that humans cause significant warming, beyond the weak correlation between rising greenhouse gas levels and rising average temperatures? Where is the evidence that the huge number of climate scientists have an understanding of their subject, demonstrated by a track record of successful short-term predictions and increasing refinement of the human-caused warming hypothesis? Where is the evidence that they are true scientists following in the traditions of Newton, Einstein, Curie and Feynman, and not the followers of Comte, Marx and Freud? If John Cook is a true “skeptic”, and is presenting the most substantial evidence, then climate catastrophism is finished. But if Cook leaves out much better evidence then his survey is misinformation, undermining the case for necessary action.

 

Causes of global warming

The next page is headed with the claim that the causes of global warming are clear.

There is no exclusion of other causes of the global warming since around 1800. But, with respect to the early twentieth century warming Dana Nuccitelli said

CO2 and the Sun played the largest roles in the early century warming, but other factors played a part as well.

However, there is no clear way of sorting out the relative contributions of the components. The statement that “the causes of global warming are clear” is false.

On the same page there is this.

This is a series of truth statements about the full-blown catastrophic anthropogenic global warming hypothesis. Regardless of the strength of the evidence in its support, it is still a hypothesis. One could treat some scientific hypotheses as being essentially truth statements, such as “smoking causes lung cancer” and “HIV causes AIDS”, as they are so strongly supported by multiple lines of evidence1. But there is no scientific evidence provided to substantiate the claim that global warming is harmful, just the shallow 97% consensus belief that humans cause some warming.

This core “global warming is harmful” statement is clear misinformation. It is extremely unskeptical, as it is arrived at by not considering any evidence.

 

Predictions and Policy

The final page is in three parts – warming prediction without policy; warming prediction with policy; and the benefits and costs of policy.

Warming prediction without policy

The source info for the prediction of 8°F (4.4°C) warming by 2100 without policy is from the 2007 UNIPCC AR4 report. It is now seven years out of date. The relevant table linked to is this:-

There are a whole range of estimates here, all with uncertainty bands. The highest has a best estimate of 4.0°C, or 7.2°F. They seem to have taken the highest best estimate and rounded up. But this scenario is strictly for the temperature change at 2090-2099 relative to 1980-1999, a 105-year period, against an 87-year period on the graph. Pro-rata, the best estimate for the A1FI scenario is 3.3°C, or 6°F.
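The pro-rata adjustment, assuming simple linear scaling over the two period lengths stated above:

```python
# Pro-rata scaling of the highest best estimate to the shorter period shown on the graph.
best_estimate = 4.0        # deg C, highest AR4 best estimate, 2090-2099 vs 1980-1999
period_ipcc = 105          # years covered by the AR4 comparison (approximate midpoints)
period_graph = 87          # years covered by the website's graph

adjusted = best_estimate * period_graph / period_ipcc
print(f"{adjusted:.1f} C  ({adjusted * 9 / 5:.1f} F)")   # about 3.3 C, or 6 F
```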

But a genuine “skeptic” considers all the evidence, not cherry-picking the evidence that suits their arguments. If there is a best estimate to be chosen, which one of the various models should it be? In other areas of science, when faced with a number of models to use for future predictions, the one chosen is the one that performs best. Leading climatologist Dr Roy Spencer has provided us with such a comparison. Last year he ran 73 of the latest CMIP5 climate models. Compared to actual data, every single one was running too hot.

A best estimate on the basis of all the evidence would be somewhere between zero and 1.1°C, the lowest figure available from any of the climate models. To claim a higher figure than the best estimate of the most extreme of the models is not only dismissing reality, but denying the scientific consensus.

But maybe this hiatus in warming of the last 16-26 years is just an anomaly? There are possibly 52 explanations of this hiatus, with more coming along all the time. However, given that they allow for natural factors and/or undermine the case for climate models accurately describing the climate, the case for a single extreme prediction of warming to 2100 is further undermined. To maintain the prediction of 8°F of warming is, by Cook’s own definition, an extreme case of climate denial.

Warming prediction with policy

If the 8°F of predicted human-caused warming is extreme, then the result of a policy that successfully halves that potential warming is not 4°F of warming, but half of whatever the accurate prediction would be. But there are further problems. To be successful, that policy involves every major developed-country Government (at least including the USA, Russia, the EU and Japan) reducing emissions by 80% by around 2050, and every other major country (at least including China, India, Brazil, South Africa, Indonesia and Ukraine) constraining emissions at current levels for ever. To get all countries to sign up to such a policy, putting combatting global warming above all other commitments, is near impossible. Then take a look at the world map in 1925-1930 and ask whether you could reasonably have expected those Governments to have signed commitments binding on the Governments of 1945, let alone today. To omit policy considerations is an act of gross naivety, and clear misinformation.

The benefits and costs of policy

The benefits and costs of policy are the realm of economics, not of climatology. Here Cook’s definition of a skeptic does not apply. There is no consensus in economics. However, there are general principles that are applied, or at least were applied when I studied the subject in the 1980s.

  • Modelled projections are contingent on assumptions, and are adjusted for new data.
  • Any competent student must be aware of the latest developments in the field.
  • Evaluation of competing theories is by comparing and contrasting.
  • If you are referencing a paper in support of your arguments, at least check that it does just that.

The graphic claims that the “total costs by 2100” of action are $10 trillion, as against $20 trillion for inaction. The costs of action are made up of policy costs plus the more limited damage costs. There are two sources for this claim, both from 2005. The first is “The Impacts and Costs of Climate Change”, a report commissioned by the EU, whose Executive Summary gives cumulative cost figures to 2100 in euros.

Given that €1.00 ≈ $1.20, the report’s cost of inaction of €74 trillion converts to about $89 trillion, and its cost of reducing to 550ppm CO2 equivalent (the often-quoted crucial level for 2-3 degrees of warming from a doubling of CO2 levels above pre-industrial levels) of €32 trillion converts to about $38 trillion, so the website’s figures do not add up. However, the average of the report’s two policy-scenario figures, €43 trillion and €32 trillion, is €37.5 trillion, or about half of the €74 trillion cost of inaction. This appears to be the source of the halving of total costs.

The second is from the German Institute for Economic Research. They state:-

If climate policy measures are not introduced, global climate change damages amounting to up to 20 trillion US dollars can be expected in the year 2100.

This gives the $20 trillion.

The costs of an active climate protection policy implemented today would reach globally around 430 billion US dollars in 2050 and around 3 trillion US dollars in 2100.

This gives the low policy costs of combatting global warming.

It is only by this arbitrary sampling of figures from the two papers that the website’s figures can be established. But there is a problem in reconciling the two papers. The first paper gives cumulative figures up to 2100; the shorthand for this is “total costs by 2100”. The $20 trillion figure, however, is an estimate for the year 2100 alone, as the statement about the policy costs confirms. This confusion makes the policy costs appear to be less than 0.1% of global output, instead of around 1% or more.
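A sketch of how large that difference is, using purely hypothetical world-GDP assumptions (2% real growth from about $75 trillion today); the $3 trillion is the in-year 2100 policy cost quoted above:

```python
# Hypothetical GDP path, only to show why an in-year cost looks tiny if misread as cumulative.
gdp_now = 75.0                 # trillion dollars in 2014, assumed
growth = 1.02                  # 2% real growth per year, assumed
years = range(2014, 2101)

gdp_by_year = {y: gdp_now * growth ** (y - 2014) for y in years}
gdp_2100 = gdp_by_year[2100]                 # roughly $410 trillion
cumulative_gdp = sum(gdp_by_year.values())   # many thousands of trillions

policy_cost_2100 = 3.0         # trillion dollars, the in-year figure for 2100

print(f"Share of output in the year 2100: {policy_cost_2100 / gdp_2100:.1%}")   # ~0.7%
print(f"Misread as a total against cumulative output: "
      f"{policy_cost_2100 / cumulative_gdp:.2%}")                               # ~0.02%
```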

Further, the figures are contradicted by the Stern Review of 2006, which was widely quoted in the UNIPCC AR4. In the summary of conclusions, Stern stated:

Using the results from formal economic models, the Review estimates that if we don’t act, the overall costs and risks of climate change will be equivalent to losing at least 5% of global GDP each year, now and forever. If a wider range of risks and impacts is taken into account, the estimates of damage could rise to 20% of GDP or more.

In contrast, the costs of action – reducing greenhouse gas emissions to avoid the worst impacts of climate change – can be limited to around 1% of global GDP each year.

The benefit/cost ratio is dramatically different. Tol and Yohe provided a criticism of Stern, showing that he used the most extreme estimates available. A much fuller criticism was provided by Peter Lilley in 2012. The upshot is that, even with a single prediction of the amount and effects of warming, there is a huge range of cost impacts. Cook is truly out of his depth in stating single outcomes. What is worse, the costs of policy to reduce greenhouse gas emissions are far greater, and its effectiveness far less, than such benefit-cost analyses allow.

 

Conclusion

To take all the evidence into account, and to present the conclusions in a way that clearly conveys the information available, are extremely high standards to adhere to. But theconsensusproject.com does not just fail to get close to these benchmarks; it does the opposite. It totally fails to consider all the evidence. Even the sources it cites are grossly misinterpreted. The conclusion that I draw is that the benchmarks that Cook and the skepticalscience.com team have set are just weapons to shut down opponents, leaving the field clear for their shallow, dogmatic and unsubstantiated beliefs.

Kevin Marshall

 

Notes

  1. The evidence for “smoking causes lung cancer” I discuss here. The evidence for “HIV causes AIDS” is very ably considered by the AIDS charity AVERT at this page. AVERT is an international HIV and AIDS charity, based in the UK, working to avert HIV and AIDS worldwide, through education, treatment and care. – See more here.
  2. Jose Duarte has examples here.

William Connolley’s “correction” of the dictionary

William Connolley, at Roy Spencer’s blog, claims that those who disagree with him are not skeptics.

He hyperlinks to his 2004 posting “Septics and skeptics; denialists and contrarians”.

Consider his definition of the word “skeptic”

the true definition of skeptic in this context is something like: 

skeptic [Gr. skeptiko`s thoughtful, reflective, fr. ske`ptesqai to look carefully or about, to view, consider] 1. One who is yet undecided as to what is true; one who is looking or inquiring for what is true; an inquirer after facts or reasons. 

(I got that from here and edited it lightly (update 2004/12/11: but! they’ve changed the page. Argh. OK, so for the moment you can get the version I saw from googles cache, and if that fails, the original source is Webster’s Revised Unabridged Dictionary. I’ve also created an entry at wiktionary in frustration; and the same defn is also available from BrainyDictionary. Anyway you know what I mean…)).

I got that from here and edited it lightly” is a confession that he manipulated the definition to suit his purposes.

The “light” editing was of a definition from dictionary.com, whose current definition is:

1. a person who questions the validity or authenticity of something purporting to be factual.

2. a person who maintains a doubting attitude, as toward values, plans, statements, or the character of others.

3. a person who doubts the truth of a religion, especially Christianity, or of important elements of it.

4. (initial capital letter) Philosophy.

a. a member of a philosophical school of ancient Greece, the earliest group of which consisted of Pyrrho and his followers, who maintained that real knowledge of things is impossible.

b. any later thinker who doubts or questions the possibility of real knowledge of any kind.

The first definition is about questioning something “purporting” to be factual. If somebody makes a claim that they earnestly believe to be true, they may not comprehend how anybody can be somewhat sceptical (or even incredulous) about that claim. Those who believe in alien abductions, for instance, may present “overwhelming” evidence to support that belief. If you try to convince them otherwise, you will be called stupid, or even accused of being part of the conspiracy to discredit the truth.

The second definition is about a doubting attitude. There is nothing in those definitions that demarcates between good and bad scepticism. There can be a huge number of reasons for the doubt. For instance, a good marriage depends on trust. If one party has an affair, there will likely be a breakdown in that trust. The betrayed will now question every statement and every motive. Once lost, that trust is very hard to regain – a point that Dale Carnegie makes in “How To Win Friends And Influence People“. Shifting blame, or failing to acknowledge fault, will only make matters worse.

However, given that it is worth having a healthy scepticism towards any claims on the internet, a more reliable source is the printed word. My dictionary is a Shorter Oxford English Dictionary, 1983 reprint edition. William Connolley, with a DPhil from Oxford, can hardly dispute its authority. This is what I wrote a couple of years ago:-

Definition 1 pertains to a school of philosophy after the Greek Pyrrho, which doubts the possibility of knowledge of any kind.

Definition 2 is someone who doubts the validity of knowledge claims in a particular area of inquiry. This includes, but is not confined to the natural sciences.

Definition 2.1 is “one who maintains a doubting attitude with reference to a particular question or statement”. The OED has this as the popular definition.

Definition 3 is one who doubts the truth of Christianity.

Definition 4 is one who is seeking the truth. That is “an inquirer who has not arrived at definite convictions”. This is only occasionally used, at least in the late 20th century.

As with the dictionary.com definitions, there is no implied demarcation between scepticism and denial of the truth. William Connolley’s definition is nearest to 4, implying that scepticism is a transitional stage on the road to enlightenment or denial. But the oldest senses, denial of the possibility of knowledge in general and doubt of the truth of Christianity, can be static states.

As noted above, there are a huge number of possible reasons for the doubt that is scepticism, and once trust is lost it is very hard to regain. William Connolley has helped betray the trust that people bestow on the authority of Wikipedia and on the authority of science. Rather than trying to restore that trust, he just makes comments that confirm people’s scepticism.

Kevin Marshall

 

 

Michael Mann and John Cook at Bristol University

Lucia at The Blackboard last month publicized that John Cook is to speak at Bristol University on Dogma vs. consensus: Letting the evidence speak on climate change on Friday 19th September. There are still 395 free tickets left for the event.

Stephan Lewandowsky also notes that Michael Mann is to lecture at the same venue on Tuesday 23rd September on The Hockey Stick and the climate wars – the battle continues. Just 102 free tickets are left for this event.

Given Mann’s belief that continued climate denial is due to “massive funding of climate change denial by monied interests” (HuffPo), it might provide some light entertainment for the students.

Update 19th Sept. There are still 309 tickets left for the John Cook lecture for tomorrow – Friday 20th September. See http://www.bristol.ac.uk/cabot/events/2014/488.html

The Michael Mann lecture is now SOLD OUT, or more accurately, all the tickets have been given away.

Lewandowsky’s setback on campaign to undermine academic pluralism and excellence

The “Recursive Fury” paper, which allegedly libelled a number of bloggers1, has been taken down2. Lead author Stephan Lewandowsky has given his reaction at Shaping Tomorrow’s World.

Two of the “Recursive Fury” paper authors were Prof Lewandowsky and the blogger John Cook3. In 2011 they co-wrote “The Debunking Handbook“. I ask that readers view my comments in the context of the following opening statement:-

It’s self-evident that democratic societies should base their decisions on accurate information. On many issues, however, misinformation can become entrenched in parts of the community, particularly when vested interests are involved. Reducing the influence of misinformation is a difficult and complex challenge.

My comment is copied below. In brief I try to cover:-

  • Lewandowsky’s smearing of the majority with the views expressed by a minority.
  • Total failure to empathise with alternative points of view.
  • How his appeals for academic freedom are the reverse.
  • How the false allegations and smears are used to shut down questions on public policy.
  • How the “Lewandowsky Episode” can become a textbook example of why promotion of pluralism is necessary in our universities.
    56.ManicBeancounter at 20:37 PM on 23 March, 2014

    Stephan Lewandowsky,
    As a professor, you should be my intellectual superior. As a scientist you should be able to provide novel explanations about your subject area that go beyond what the non-specialist would find out for themselves, but at the same time accommodate the basic understanding that the non-specialist has.
    Your “Hoax” paper ignored the obvious conclusion of the data. The vast majority of respondents did not believe in the cranky conspiracy theories, regardless of their views on “climate science”. Any “conspiracist ideation” revolves around differences in the small proportions who do. That means that the vast majority of “skeptics”, who hold no such views, will feel insulted. Morally you should have clearly stated that any conclusions only apply to a small minority. The first part of the paper’s title implied the opposite.
    “NASA Faked the Moon Landing—Therefore, (Climate) Science Is a Hoax”
    Out of 1145 respondents, just 2 strongly rejected “climate science” and strongly supported the faked moon landing theory. Those two people were not asked whether they actually followed that path of reasoning. Unsurprisingly, when you smear people with ideas that they find insulting, they express outrage. There is nothing “confected” about this.
    There are three things that make this beyond the pale of academic freedom.
    First, you do not advance knowledge, but instead replace the obvious empirical statement (the vast majority do not believe in cranky conspiracy theories) with its opposite.
    Second, the smears serve to deny a voice to a group of people with whom you disagree.
    Third, you use false allegations of an intellectual inability to evaluate climate “science” to deny them a voice in matters of public policy. Yet the voices that you seek to repress often have far greater understanding and knowledge of economics and policy implementation than you and your fellow-travelling academics.
    Academic freedom must be protected so that ideas and knowledge that challenge society’s established beliefs can be nurtured. But that must be accompanied by a deliberate policy of pluralism, for there are none so defensive of their beliefs or ideas as those who have spent their lives developing them. Professor Lewandowsky, your work of the last three years should become a textbook example of attempts to suppress that freedom, and of their consequences.

    69.ManicBeancounter at 06:39 AM on 24 March, 2014

    Geoff,

    Your comment 68 shows a basic function of peer review: correcting the obvious errors. If there is no such quality control then the demarcation between academic and non-academic literature simply collapses. Further, if academia cannot easily distinguish the excellent from the dross, then there must be quality control before its recommendations are passed into public policy, in much the same way that new pharmaceuticals must go through rigorous regulatory testing before being prescribed to the public.

    70.ManicBeancounter at 06:59 AM on 24 March, 2014

    My comments at 57 and 70 should be viewed in the context of the opening comment in “The Debunking Handbook”, written by John Cook and Stephan Lewandowsky and accessible in the right column.

    “It’s self-evident that democratic societies should base their decisions on accurate information. On many issues, however, misinformation can become entrenched in parts of the community, particularly when vested interests are involved. Reducing the influence of misinformation is a difficult and complex challenge.”

    By any independent measure the “Hoax” and “Recursive Fury” papers are full of misinformation. The authors aim at establishing a monopoly on truth, but by their very words, and subsequent behaviour, show that they are the last people you would entrust with that monopoly. There is no better example of the need for democratic societies to promote pluralism, through competition in their universities, to prevent the establishment of dogma. This is particularly true in Australia and the UK, where Governments would like their universities to be world-leading.

Notes

  1. This includes Steve McIntyre, Barry Woods, Geoff Chambers and “Foxgoose”.
  2. See BishopHill (here and here), Geoff Chambers, Steve McIntyre, Australian Climate Madness (here and here), and the Guardian.
  3. This is the same John Cook who thinks he can define the meaning of words better than a dictionary.

Kevin Marshall