Lewandowsky on Radio 4 – missing out basic human psychology

Mike Haseler comments upon the appearance of Prof Stephan Lewandowsky on Radio 4 this week.

Lewandowsky is a nasty piece of work who set out to fabricate data, using bogus questions, in an attempt to prove sceptics are conspiracy theorists. All he managed to prove is that he is incapable of admitting the poor quality of his own work. So imagine my disgust tonight when I heard the BBC were broadcasting some of his material:

“Why do we continue to believe information even when we are told it’s wrong? Claudia Hammond discovers how the brain stores facts and why we don’t erase erroneous explanations.” (all in the mind)

That section of the program wasn’t very interesting (I fell asleep listening), but having had the misfortune to read the scenario before, the gist of it was that people will sometimes use ideas that they have been explicitly told are wrong, showing that most people do not trust academics like Lewandowsky.

Obviously that’s not what he intended the result to be.

The scenario given was that subjects were told there was a fire in a barn. They were told oil paints were stored in the barn, and then told that they were not (at which point, is anyone going to believe the researcher?). They were then asked why the fire had thick smoke. Lewandowsky is trying to prove “false memories” or some such junk, by showing that people still use the information about the oil paints after being told it is false. The reality is that what he proves is that very often people don’t believe the information academics force down their throats. They come up with quite plausible explanations (the smoke was caused by the oil paints the researcher told them weren’t present) which don’t agree with the “truth” handed down by academics like Lewandowsky. What this clearly shows is that the general public are more inclined to trust their own ideas of what happened than to rely on academics who are so untrustworthy they can’t make up their minds whether there is or is not paint in the barn.

My comment was

Your point about not believing somebody who has fed you false information is an enormously important part of human psychology. In close relationships, such as with one’s partner or a close friend, we trust the other implicitly. If that trust is betrayed – such as a wife finding out after many years of marriage that the husband has a mistress – then it is not easily regained. A lot of the distrust of climate science stems from the fact that when the science gets it wrong, or is found to be giving false certainties (as with Glaciergate and Climategate), the reaction has not been to confess to error, but to sweep the issue under the carpet, or to blame others.

Another aspect is that people tend to trust new information from those they trust and respect, rather than from those they are prejudiced against. However hard we try to be neutral, we more easily accept the words of politicians who share our world view than those of the opposite party. A life-long Tory from Haslemere has similar prejudices to a Labour supporter from Middlesbrough. They would far sooner trust a politician from their own party than from the other side.

The problem with Lewandowsky is that he fails to understand the difficulty of regaining trust once it has been breached, and instead tries to create prejudice against those who question his dogmatic views.


Wonthaggi Desal plant – Mothball to save Money and the Environment

Jo Nova has posted on the flagrant waste of money involved in the new Desalination Plant to serve the people of Melbourne. Here is my comment

Remember Topher, with his excellent “Forbidden History” video? Well, his earlier videos were on the problems of water shortage in Melbourne, and the Labor Government’s attempts to solve them. In his “Unpopular View #3”, made in 2010, he looks at a magic solution. Rather than build a 150GL desalination plant, the Victorian Government could have spent $2.6bn on a pipeline from Tasmania supplying 350GL of water. Topher further argues it would have helped Tasmanians. Why? The water is currently used for hydro power. Sold as water to the Victorians, the Tasmanians would make far more money than they get from the electricity, which in turn could fund water infrastructure and environmental protection in Tasmania.
Yet, in a huge published report, the authorities ignored this win-win solution, despite having four submissions that mentioned it.
Spend 15 minutes and check it out for yourself.


Now for a bit of beancounting.

On these projects, the more you dig, the worse it gets.

Comparing Topher’s costs of the Tasmanian pipe-line (TPL) with your Shiny Desalination Plant (SDP).

Capital Cost – TPL $2.6bn, SDP $3.5bn (+$1.0bn?)

Annual costs – TPL $0.11bn (+ up to $0.04bn running/maintenance costs?), SDP $1.0bn

Increase in Victorian Water Bills – TPL <5% (my estimate), SDP 34% (Herald Sun).

But it does not end there. The Tasmanian Pipeline would need no power to deliver 350GL of water down a 2.5m pipe, as it would be gravity fed. The SDP requires massive amounts of power. The capital cost of the wind generators to meet that power requirement (as the project is committed to do) is estimated at $1.2bn. However, to be properly carbon neutral in operation, like the TPL, the desalination plant would require an estimated investment of approximately $6.0bn (see appendix).

Even though at least $3.5bn has already been spent, there is a serious economic case for mothballing the desalination plant – and still building the Tasmanian Pipe-Line. In finance, one should only look forward, and let bygones be bygones. In politics, it is different. There are five possible scenarios.

  1. Mothball the desalination plant, and build the Tasmanian Pipe-line. Additional investment and damages might be $10bn, but it is carbon neutral. Over 24 years it would pay around $2.5bn to Tasmania (paying for additional water infrastructure and/or protecting the wilderness), with huge economic benefits for Victorian farmers from plentiful water supplies. Would require first voting out the Victorian Labor administration. Could recover $1m or so by suing the Labor administration of Victoria for gross negligence. (Financially not worthwhile, but it would deter others from similar mad schemes for a generation.)
  2. Go ahead with the desalination plant and make it properly carbon neutral. Additional investment and damages might be $7bn, but with around $10–$24bn of running costs, this “green and honest” policy is expensive and electoral suicide.
  3. Go ahead with the desalination plant and pretend to be carbon neutral by using the actual capacity of wind farms. Additional investment and damages might be $2.2bn, but with around $10–$24bn of running costs, this “green and pretending to be honest” policy is expensive and would lose enough votes to guarantee losing an election.
  4. Go ahead with the desalination plant and pretend to be carbon neutral by using the nameplate capacity of wind farms. Additional investment for 100MW is $240m, and damages might be $1.0bn, but with around $10–$24bn of running costs, this “proclaiming to be honest” policy is expensive, and would lose votes for throwing money away.
  5. Go ahead with the desalination plant and forget about the green commitments. Additional investment is nil, and damages might be $1.0bn, but with around $10–$24bn of running costs this “ducking the issue” policy is expensive, though it would lose fewer votes than being honest. However, the carbon tax, at $10/MWh, equates to around $8.7m per annum if 150GL is produced (using the power figures in the note below). That is a small amount on the water bills and, even as it rises year on year, will hardly be noticed within the much bigger costs of the desalination plant.

So, in the interests of Melbourne and Tasmanian citizens, the best policy is to vote out the Labor administrations both nationally and in Victoria. What will actually happen is the worst of the options: politicians will duck the issue, lumbering the Melbourne population with huge extra costs for a generation, going against national Labor policy on the environment, and failing to provide income to Tasmania that could help Tasmanian farmers and finance the protection of the Tasmanian wilderness.

Appendix – Carbon Offsetting the Desalination Plant

The SDP will require 90-120MW to operate. Further, says Wikipedia, “additional energy will be required to pump the desalinated water from Wonthaggi to Cardinia Reservoir in Melbourne”. To make the SDP carbon neutral, I will assume the use of wind power, the most popular type of renewable at present. To make the numbers easy, I will assume 100MW is required (see the note below). Two recent wind farms in the State of Victoria are the 192MW Waubra Wind Farm, which cost $600m, and the 195MW Portland Wind Farm, projected to cost $330m. That is $3.1m and $1.7m per megawatt of nameplate capacity respectively, averaging $2.4m. Wind turbines, however, only have an output of around 20% of nameplate. So to produce an average of 100MW requires 500MW of capacity. Further, to be properly carbon neutral in Victoria, you need to allow for the coal-fired power stations running as back-up; true abatement levels are around 4% of nameplate. So for the SDP to be properly carbon neutral in Victoria, offsetting the 100MW will require 2,500MW of nameplate wind capacity. Producing the required electricity from wind farms means investing $2.4m x 500 = $1.2bn. Being properly carbon neutral means investing $2.4m x 2,500 = $6.0bn.
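The appendix arithmetic can be checked with a short script. This is only a sketch restating the figures above in code (the plant costs, and the assumed 20% output and 4% “true abatement” factors, are the post’s own estimates; the variable names are mine):

```python
# Cost of wind capacity needed to offset the desalination plant's
# ~100MW average load, using the post's own figures.

waubra_cost_per_mw = 600 / 192      # $m per MW nameplate (~$3.1m)
portland_cost_per_mw = 330 / 195    # $m per MW nameplate (~$1.7m)
avg_cost_per_mw = (waubra_cost_per_mw + portland_cost_per_mw) / 2  # ~$2.4m

load_mw = 100                        # average load to offset

# Wind output assumed at 20% of nameplate -> 500MW nameplate needed
nameplate_20pct = load_mw / 0.20
invest_20pct = nameplate_20pct * avg_cost_per_mw / 1000   # $bn

# "True abatement" assumed at 4% of nameplate -> 2500MW nameplate needed
nameplate_4pct = load_mw / 0.04
invest_4pct = nameplate_4pct * avg_cost_per_mw / 1000     # $bn

print(f"Nominal offset: {nameplate_20pct:.0f}MW nameplate, ${invest_20pct:.1f}bn")
print(f"True abatement: {nameplate_4pct:.0f}MW nameplate, ${invest_4pct:.1f}bn")
```

Note that the whole gap between the $1.2bn and $6.0bn figures rests on the 4% abatement assumption.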

Note – Power Requirements.

The figure of 100MW is calculated as follows. Producing 150GL of water with the plant operating at 410 megalitres per day, 365 days a year, corresponds to the 90MW usage in normal operation. Extended operation, at 550 megalitres per day, needs 120MW, and will be needed to allow for some maintenance downtime. Let us assume 30 days of normal downtime. To produce 150GL in 335 days then requires running the plant at 90MW for 244 days and at 120MW for 91 days. Assuming pumping adds around 10% gives an annual requirement of around 868,000MWh (36,168 MW-days), or an average load of 99MW. Rounded, that is 100MW.
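The note’s arithmetic can be sketched in a few lines, using the assumptions above (410ML/day at 90MW, 550ML/day at 120MW, pumping adding 10%, with the normal-operation days set to 244 so that the running days sum to 335):

```python
# Check on the ~100MW average load figure for the desalination plant.

water_gl = (410 * 244 + 550 * 91) / 1000              # ~150 GL produced
energy_mwh = (90 * 24 * 244 + 120 * 24 * 91) * 1.10   # annual MWh incl. 10% pumping
avg_load_mw = energy_mwh / (365 * 24)                 # averaged over the full year

print(f"{water_gl:.0f}GL produced, {energy_mwh:.0f}MWh, average load {avg_load_mw:.0f}MW")
```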

Kevin Marshall 

 

 

Alcohol Concern’s anti-poor campaign

Although I am not in any way a socialist, I vigorously oppose anything whereby the poor and weak are made to subsidise the rich and powerful. I also strongly oppose the enactment of policies that will be to the net detriment of society as a whole. This is why I strongly oppose the latest report from Alcohol Concern, “Binge – Drinking to get drunk: Influences on young adult drinking behaviours”. Before anybody gets the wrong idea, I share their concern about binge drinking, especially amongst minors, and I believe that if there were ways to improve the situation, they should be enacted. However, where economic price incentives are involved, one should also look at the unintended consequences.

The policy proposed is, again, a minimum price for alcohol. This has long been touted by the last Labour Government, the BMA and David Cameron, yet none really understands the harm it will cause to society. The proposal is to impose a minimum retail price of about 40p to 50p per unit of alcohol. This will not affect prices in the pubs and clubs, where the cheapest pint of standard lager is already around twice this level. It will, however, dramatically impact retail prices in both small off-licences and the supermarkets. Below are some examples.


The way prices work is that premium products have not just premium prices, but larger profit margins, both in absolute and in percentage terms. A minimum price for alcohol will invert this position. Suddenly a 3 litre bottle of cheap cider will have the highest profit margin, not just in absolute but also in percentage terms. This will create very perverse incentives for the retailer. One direct consequence will be a rise in the price of drinks already over the minimum price. Consider the situation of the cheapest wine at £2.99 per bottle and a more mainstream wine at £4.99.


Even at 50p a unit, the cheap wine is still cheaper than the mainstream one. If the mainstream wine price remained unchanged, the price premium to the consumer would drop by 75%. Better quality would carry less of a premium, and the retailer would have the margins reversed: the margin on the premium product goes from being 86% more to 48% less than on the cheaper product. It makes sense for the retailer to increase the price, perhaps not proportionally to the cheap wine, but at least enough to make a greater margin in value terms.

Will the retailer end up making a greater profit? This depends on something called the elasticity of demand. To make less money on the cheap wine, demand would have to drop by over 72%. To make less money on the mainstream wine, demand would have to drop by more than 53%.

Will this be of benefit to the supermarkets? It depends on the elasticity of demand. From Investopedia

Definition of ‘Price Elasticity Of Demand’

A measure of the responsiveness of the quantity demanded of a good to a change in its price. It is calculated as:

Price elasticity of demand = % change in quantity demanded ÷ % change in price

For the cheap wine, the break-even elasticity is 72% / 50.5% = 1.43

For the mainstream wine, the break-even elasticity is 53% / 25% = 2.12

Demand for alcohol is well known to be highly price-inelastic; that is, the elasticity measure is much less than 0.5. The supermarkets and the off-licences will therefore make much, much larger profits on sales of cheap booze. With an elasticity of less than 1, consumers will end up spending more on alcohol than before, even though they are buying a smaller quantity. The biggest proportionate impact will be on those least able to afford the price rise. This is a double hit: the poor spend a larger proportion of their income on alcohol than those on higher incomes, and they are more likely to buy the cheaper forms of booze, which will see the larger percentage price rises.
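The effect of inelastic demand on total spending can be sketched numerically. The 0.5 elasticity and 50% price rise below are illustrative values of mine, not estimates:

```python
# With price elasticity well below 1, the percentage fall in quantity is
# smaller than the percentage price rise, so total spend goes UP.

def spend_change(price_rise, elasticity):
    """Fractional change in total spend for a given fractional price rise,
    using the simple linear elasticity approximation."""
    quantity_change = -elasticity * price_rise
    return (1 + price_rise) * (1 + quantity_change) - 1

# e.g. a 50% price rise on cheap wine with elasticity 0.5:
change = spend_change(0.50, 0.5)
print(f"Total spend changes by {change:+.1%}")  # +12.5%: more spent for less drink
```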

The more equitable solution is to restructure the excise duties. The tax on alcohol should be shifted not just onto a per-unit basis, but in such a way that it specifically targets the low-cost booze most attractive to minors. Therefore strong ciders (which I like), alcopops and strong lagers should all have premium rates, higher than, say, standard-strength beers and wine. Weak-tasting drinks (vodka, white cider) should carry a premium over strong-tasting drinks such as real ale, whisky or full-flavoured cider. The added bonus is that there would be a net gain in excise taxes, rather than just a gain in VAT receipts.

Monbiot and BBC – Accusing an innocent man due to a common prejudice?

Boris Johnson has it spot on about Newsnight’s accusing Lord McAlpine of paedophilia in a children’s home. Morally, it is probably the worst crime somebody could be accused of today. Mass murder is treated as less bad, so long as the perpetrator claims a higher motive; even though it is meant to inspire terror in ordinary peaceful folk, it will not be called terrorism. The BBC will probably point to the excuse that, having suppressed the Jimmy Savile story out of sensitivity to Savile’s family, Newsnight did not want to fail in its duty a second time. But there is something more to this, suggested by the twittering George Monbiot. He was one of two prominent Twitter users to falsely “finger” Lord McAlpine as the culprit. Monbiot is now profusely apologetic, but I would suggest that his knee-jerk reaction was not out of character. It has some commonality with his take on the Gleick affair.

Earlier this year a cache of documents from the libertarian Heartland Institute was “released”. Peter Gleick, a dogmatic climate activist and scientist with a passionate dislike of any opposition, obtained the documents by deception and then released them anonymously. Most were innocuous, except for a “2012 Strategy Document”. Gleick was “outed” as the likely leaker, as this document was in Gleick’s peculiar writing style, not the more polished house style of Heartland. It also contained a number of errors. George Monbiot praised Gleick’s actions as those of a “democratic hero” exposing the secret funding of climate denial by this right-wing think tank. There was no acknowledgement of the piffling size of this funding compared with government and private funding of alarmism, and no acknowledgement of the evidence of forgery. Monbiot has no perspective on figures. If a few million dollars of Heartland “denial” is so effective against the billions poured into the science, Heartland should be chock full of infiltrators from every major ad agency and democratic political party on the planet. Further, if there is a dominant, untenable, ideological position, then democracy is endangered not by those who seek to confront the dominant view, but by those who seek to obliterate criticism. If the vast majority are on the side of the overwhelming truth, then publicly examining falsities can only strengthen the perception of that truth. But if it is a falsity, then exposing those who speak out to ad hominem attacks and slander is the thuggish way of silencing opposition. This principle is ingrained in the trial-by-jury system.

The reactions of the now BBC-departed Richard Black were in a similar vein.

What possible bearing can this have on George Monbiot’s judgement of the (false) allegations that Lord McAlpine was a paedophile? Might the fact that Lord McAlpine was the former Treasurer (and very effective fundraiser) of the Conservative Party during the Thatcher years have something to do with it? When a tiny think tank can be so effective in sustaining climate denial, is not Lord McAlpine principally responsible for all that Mrs T inflicted on Britain? And with the BBC culturally inculcated with similar pro-Guardian views, is it not conceivable that its failure to question the evidence might have something to do with McAlpine’s history?

Has Kevin Trenberth Reversed his position on Reversing the Null Hypothesis?

There is an interesting quote from Kevin Trenberth at SciGuy on Hurricane Sandy

It is true that hurricanes normally recurve and head east, especially at this time of year. So we do have a negative NAO and some blocking anticyclone in place, but the null hypothesis has to be that this is just “weather” and natural variability.

(emphasis mine)

Now would this be the same Kevin Trenberth who just 12 months ago was advocating that we reverse the null hypothesis?

“Humans are changing our climate. There is no doubt whatsoever,” said Trenberth. “Questions remain as to the extent of our collective contribution, but it is clear that the effects are not small and have emerged from the noise of natural variability. So why does the science community continue to do attribution studies and assume that humans have no influence as a null hypothesis?”

Has Trenberth now reversed his position on reversing the null hypothesis?

(I linked to SciGuy from Wattsupwiththat)

Comment made at Jo Nova’s Weekend Unthreaded.

Stephan Lewandowsky on Hurricane Sandy

Jo Nova posts on Stephan Lewandowsky’s analysis of Hurricane Sandy. Below is my comment, with the relevant links.

Lewandowsky has a lot to say about the overwhelming evidence for smoking causing lung cancer, but in substance has just this to say about the impending catastrophic global warming.

Trends such as the tripling of the number of weather-related natural disasters during the last 30 years or the inexorable rise in sea levels. Climate scientists predicted those trends long ago. And they are virtually certain that those trends would not have occurred without us pumping billions of tons of CO2 into the atmosphere.

There are 3 parts to this.

First, the economic analysis of natural disasters is Lewandowsky’s own. He completely ignores the opinions of Roger Pielke Jr, an expert in the field with many peer-reviewed studies on the subject. Pielke Jr has shown there is nothing exceptional in the normalised cost of Hurricane Sandy. Furthermore, a 2009 report showed that New York is vulnerable to hurricanes, and that the shape of the coastline makes it particularly vulnerable to storm surges.

Second, the sea level rise is a trivial issue. From the University of Colorado graph, it is clear that sea levels are rising at a steady rate of 31cm a century.

Third, he claims the predictions of unnamed “experts” have been fulfilled. A balanced analysis would point out that the CO2 levels have risen faster than predicted, but temperatures have not.

Last week I posted a proposal for analysing the costly impacts of global warming. Using that “equation”, I would suggest Lewandowsky overstates both the magnitude and the likelihood that Sandy was caused by global warming. He misperceives the change in frequency (1/t). Furthermore, given that he has a track record of highly biased use of statistics in his own field, and given his deliberate lack of balance, the weighting attached to anything he says should be negative. That is, like the newspapers of the Soviet Union, if Lewandowsky claims something, we should read between the lines to see what he does not say. However, unlike in the Soviet Union, we are still able to look for alternative opinions.


Normalized US Hurricane damage impacts


2012_rel4: Global Mean Sea Level Time Series (seasonal signals removed)

Costs of Climate Change in Perspective

This is a draft proposal for framing our thinking about the climatic impacts of global warming, without getting lost in trivial details or questioning motives. It builds upon my replication of the thesis of the Stern Review in graphical form, although in a slightly modified format.

The continual rise in greenhouse gases due to human emissions is predicted to cause a substantial rise in average global temperatures. This in turn is predicted to lead to severe disruption of the global climate. Scientists project that the costs (both to humankind and other life forms) will be nothing short of globally catastrophic.

That is

CGW = f{K}                 (1)

The costs of global warming, CGW, are a function of the change in global average surface temperatures, K. This is not a linear function: costs increase per unit of temperature rise. That is

CGW = f{K^x} where x > 1            (2)

Graphically


The curve is largely unknown, with large variations in estimates of the slope. Furthermore, the function may be discontinuous, as there may be tipping points beyond which the costly impacts of warming are magnified many times. Being unknown, the cost curve is an expectation derived from computer models. The equation thus becomes

E(CGW) = f{K^x}                (3)

The cost curve can be considered as having a number of interrelated elements: magnitude M, time t and likelihood L. There are also costs involved in taking actions based on false expectations. Over a time period, costs are normally discounted, and when considering a policy response a weighting W should be given to the scientific evidence. That is

E(CGW) = f{M, 1/t, L, │Pr-E()│, r, W}    (4)

Magnitude M is both the severity and the extent of the impacts on humankind or the planet in general.

Time t is highly relevant to the severity of the problem. Rapid changes in conditions are far more costly than gradual changes. Also, impacts in the near future are more costly than those in the more distant future, due to the shorter time horizon in which to put in place measures to lessen those costs.

Likelihood L is also relevant to the issue. Discounting a possible cost that is not certain to happen by the expected likelihood of that occurrence enables unlikely, but catastrophic, events to be considered alongside near certain events.

│Pr-E()│ is the difference between the predicted outcome, based on the best analysis of current data at the local level, and the expected outcome that forms the basis of adaptive responses. It can work two ways. If there is a failure to predict and adapt to changing conditions, there is a cost. If there is adaptation in anticipation of future conditions that do not emerge, or are less severe than forecast, there is also a cost. │Pr-E()│ = 0 when the outturn is exactly as forecast in every case. Given the uncertainty of future outcomes, there will always be costs incurred that would be unnecessary with perfect knowledge.

Discount rate r recognizes that people prioritize according to time horizons. Discounting future costs or revenues enables us to evaluate the distant future alongside the near future.
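The quantitative elements defined so far (magnitude, time, likelihood and the discount rate) can be combined in a toy calculation. Every number below is hypothetical, chosen only to show how the components of equation (4) interact:

```python
# Toy discounted expected-cost calculation for equation (4).
# All values are hypothetical illustrations, not estimates.

magnitude = 1000       # M: cost if the impact occurs ($bn)
likelihood = 0.1       # L: probability the impact occurs
years_ahead = 50       # t: how far in the future the impact falls
discount_rate = 0.03   # r

expected_cost = magnitude * likelihood / (1 + discount_rate) ** years_ahead
print(f"Discounted expected cost: ${expected_cost:.1f}bn")
```

With these numbers a $1,000bn impact with a 10% chance 50 years out has a discounted expected cost of around $23bn, showing how heavily likelihood and discounting shrink headline magnitudes.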

Finally the Weighting (W) is concerned with the strength of the evidence. How much credence do you give to projections about the future? Here is where value judgements come into play. I believe that we should not completely ignore alarming projections about the future for which there is weak evidence, but neither should we accept such evidence as the only possible future scenario. Consider the following quotation.

There are uncertain truths — even true statements that we may take to be false — but there are no uncertain certainties. Since we can never know anything for sure, it is simply not worth searching for certainty; but it is well worth searching for truth; and we do this chiefly by searching for mistakes, so that we have to correct them.

Popper, Karl. In Search of a Better World. 1984.

Popper was concerned with hypothesis testing, whilst we are concerned here with accurate projections about states well into the future. However, the same principles apply. We should search for the truth, by looking for mistakes and (in the context of projections) inaccurate perceptions as well. However, this is not to be dismissive of uncertainties. If future climate catastrophe is the true future scenario, the evidence, or signal, will be weak amongst historical data where natural climate variability is quite large. This is illustrated in the graphic below.


The precarious nature of climate costs prediction.

Historical data is based upon an area where the signal of future catastrophe is weak.

Projecting on the basis of this signal is prone to large errors.

In light of this, it is necessary to concentrate on positive criticism, while giving due weighting to the evidence.

Looking at individual studies, due weighting might include the following:-

  • Uses verification procedures from other disciplines
  • Similarity of results from using different statistical methods and tests to analyse the data
  • Similarity of results using different data sets
  • Corroborated by other techniques to obtain similar results
  • Consistency of results over time as historical data sets become larger and more accurate
  • Consistency of results as data gathering becomes independent of the scientific theorists
  • Consistency of results as data analysis techniques become more open, and standards developed
  • Focus on projections at the local (sub-regional) level, for which adaptive responses might be possible

To gain increased confidence in the projections, due weighting might include the following:-

  • Making way-marker predictions that are accurate
  • Lack of way-marker predictions that are contradicted
  • Acknowledgement of, and taking account of, way-marker predictions that are contradicted
  • Major pattern predictions that are generally accurate
  • Increasing precision and accuracy as techniques develop
  • Changing the perceptions of the magnitude and likelihood of future costs based on new data
  • Challenging and removal of conflicts of interest that arise from scientists verifying their own projections

Kevin Marshall

East Australia High Speed Rail – Opening Comments

Bernd Felsche has been blogging recently on proposals for a High Speed Rail project for Eastern Australia. The details and Phase 1 report are here.

In Britain a HSR project from London to Birmingham has recently been approved, costing at least £17.1bn (A$26.7bn) for just 190km of track. The estimated cost of A$61bn to A$108bn for around 1,644km looks remarkably good value in comparison. However, it is worth studying the underlying assumptions.

The Taxpayers’ Alliance has made a number of damning criticisms of the UK project, in particular that the actual costs could be nearly three times the estimate if supporting infrastructure improvements are taken into account. Having also looked at the Manchester Congestion Charging Scheme in 2008, I thought the proposal might be worth a perusal.

The basis for the project is the projected demand, so my first comments are on population and demand levels.

Initial Thoughts on Population

The study assumes a high level of population growth for Australia as a whole. From the current 23m, population is forecast to be between 30 and 40m in 2056. That is growth of 30% to 74% over 45 years. Taking the mid-point, that is 52.2% growth to 35m. East Australia is forecast to grow 58.3% from 17.8m to 28.2m, leaving growth in the rest of Australia of 30.7% (5.2 to 6.8m).
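These growth percentages can be checked with a quick script (the report’s population figures as quoted above are the inputs; the tiny differences from the 58.3% and 30.7% quoted are rounding in the source numbers):

```python
# Recompute the population growth percentages from the figures quoted.
current, low, high = 23.0, 30.0, 40.0     # Australia total, millions
mid = (low + high) / 2                     # mid-point forecast: 35m

overall_growth = (mid / current - 1) * 100                  # ~52.2%
east_growth = (28.2 / 17.8 - 1) * 100                       # ~58.4%
rest_growth = ((mid - 28.2) / (current - 17.8) - 1) * 100   # ~30.8%

print(f"Overall {overall_growth:.1f}%, east {east_growth:.1f}%, rest {rest_growth:.1f}%")
```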


Map from page iii of Executive Summary, annotated with city population growth projections for 2011 to 2056.

The highest population growth (using Australian Bureau of Statistics, Population Projections Australia 2006–2101, 2008, Series B forecasts updated) is projected to be in the Brisbane area. Given that this is the least populated end of the line, these population projections need to be put through a sensitivity analysis. With much lower projections for South East Queensland growth, it could be that the northern stretch of the line, and a third of the estimated cost, is not economically justified.

Passenger Growth

From the Executive Summary page iv

The population of the east coast states and territory of Australia is forecast to increase from 18 million people in 2011 to 28 million people by 2056. Over 100 million long distance trips are made on the east coast of Australia each year, and this is forecast to grow to 264 million long-distance trips over the next 45 years.

So the population will grow by 58% and long-distance trips by 164%. By 2036 (with 35% population growth), the line is projected to have grabbed half the air market for Melbourne to Sydney and Brisbane to Sydney. With such a huge capital outlay, how can this be?

Capital Cost

From the Executive Summary

International experience suggests it is unrealistic to expect the capital cost of a HSR network to be recovered.

The reason the projected fares look so cheap is that there is not going to be any recovery of the capital costs through fares. So the

competitive ticket prices, with one way fares (in $2011) from Brisbane to Sydney costing $75–$177; Sydney to Melbourne $99–$197; and $16.50 for daily commuters between Newcastle and Sydney

are no such thing. A quick check on single flights from Melbourne to Sydney reveals prices of $125 economy and $850 business. The HSR will be financed out of taxation to grab market share from air travel.

Kevin Marshall


Electric Cars – toys of the rich, subsidised by the masses

Joanna Nova reports on a new study showing that electric cars produce more CO2 than either petrol or diesel cars if the electricity is produced principally from coal-fired power stations.

The most practical electric car

In Britain there is more of a market for electric vehicles, but sales are still puny. The European Car of the Year is the Chevrolet Volt, which has a 1.4 litre petrol engine to accompany the electric motor. At £29,995 it costs 50% more than a similarly-sized Ford Focus diesel, even after the £5,000 government subsidy. In fact, it costs more than a similarly-sized Audi, BMW or Mercedes and will not last nearly as long. Looking at the detail, the Volt has a claimed CO2 emission of 27g/km, as against 99g/km for the best diesels. This takes no account of the CO2 emissions from the power stations: in Britain electricity is mostly from gas, with much of the rest from coal and nuclear.

There is also a question of equity. Domestic electricity has 5% tax added; diesel has over 120% added. So the cost per 100km (using official figures and 15p per kWh + 5% VAT) is £2.66 for the Volt and £6.00 for the equivalent diesel car (combined 67.3mpg at £1.43 per litre). But the tax content is £0.13 and £3.30 respectively, so most of the cost saving is in tax. In the UK the average mileage is 12,000 miles, or 19,300km, per year, so the tax saving from driving the Volt is up to £610 per annum. However, anyone travelling that distance will make a number of long-distance journeys. Let us assume half the 12,000 miles is on the petrol engine at 50mpg, with petrol at £1.38 per litre. Then the annual tax saving drops to just £70.
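The per-100km figures can be reproduced as follows. The Volt’s electricity consumption is not quoted directly, so the 16.9kWh/100km below is backed out of the £2.66 figure (an assumption of mine); the other inputs are the post’s own:

```python
# Running-cost comparison, Volt vs equivalent diesel, per 100km.

elec_price = 0.15 * 1.05           # 15p/kWh plus 5% VAT
volt_kwh_per_100km = 16.9          # implied by the quoted £2.66 (assumption)
volt_cost = volt_kwh_per_100km * elec_price              # ~£2.66 per 100km

# 67.3 mpg (imperial): 1 mile = 1.609km, 1 gallon = 4.546 litres
diesel_litres_per_100km = 100 / (67.3 * 1.609 / 4.546)
diesel_cost = diesel_litres_per_100km * 1.43             # ~£6.00 per 100km

tax_saving_per_100km = 3.30 - 0.13   # tax content quoted in the text
annual_saving = tax_saving_per_100km * 19300 / 100       # ~£610 a year

print(f"Volt £{volt_cost:.2f}, diesel £{diesel_cost:.2f}, saving £{annual_saving:.0f}/yr")
```

At 19,300km a year, the £3.17/100km difference in tax content gives the roughly £610 annual tax saving quoted above.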

The biggest saving for electric car owners is in London, with the congestion charge. Drive into London five days a week for 11 months of the year, and the conventional car owner will pay £2,750 a year in charges. Drive an electric car or hybrid and the charge is zero.

So what sort of people would be persuaded to buy such a device? It is the small minority who have the money for at least two cars, but want to appear concerned about the environment. They have the open-top sports car for summer days, the luxury car for long journeys, and the Volt for trips to the supermarket or to friends’ houses. It is the new form of conspicuous consumption for the intelligentsia, making the Toyota Prius so last year.

The least practical electric car

Launched this year, the Renault Twizy is claimed to be about the cheapest “car” available today. It is also by far the smallest, being more of a quadricycle, with no proper doors. The cost is kept low by not including the battery, which is rented for at least £48 a month. As the Telegraph concludes, it is an expensive toy. My 12-year-old son said he would love one when he saw it in a car showroom recently. But he would soon regret it if he were transported to school in it every day instead of riding on the top deck of a bus. At least if his dad forgot to plug it in, it would be small enough for him to push.

Lewandowsky et al 2012 from two alternative philosophies of science

The following comment was made on Joanne Nova’s blog, in response to a comment by Jonathan Fordsham that Stephan Lewandowsky did not know what he was getting into when he published his paper and then defended it.

Whilst Lewandowsky may not have known what he was getting into, the aim of the paper was to find further reasons to dogmatically dismiss any views that question the established orthodoxy. It stems from a view of science that sees conformity to, and belief in, that orthodoxy as the mark of a scientist. From this conformity comes the importance, to this view, of opinion polls and declarations of belief by scientific bodies. Promoting evidence or hypotheses that contradict the orthodoxy risks being branded a heretic or a denier.

The alternative, “Popperian” view of science is that progress is often made by overturning existing hypotheses, or by subsuming them within more profound theories. Getting results that contradict a hypothesis is a cause for celebration, as it raises a whole series of new questions. In this view of science, belief in a specific hypothesis is dangerous: people do not like having their beliefs contradicted, and it would be hugely damaging psychologically to constantly attempt to undermine one’s own core beliefs. Belief instead lies in finding new understanding of the world by the most rigorous method.

The questionnaire, despite all its biases, clearly showed that the vast majority of respondents, whether skeptic or alarmist, rejected cranky conspiracy theories. Lewandowsky’s theory that climate “deniers” have a conspiracist orientation was clearly contradicted by the evidence. A team of people then spent 18 months producing the paper. There is strong circumstantial evidence that the time was spent manipulating the data, choosing the statistical methods that best corroborated their story, and carefully phrasing what they wrote to claim the opposite of what the data revealed.

The “orthodox” view of science was clearly Lewandowsky’s enemy once the evidence contradicted his hypothesis. He could not publish the full results without risking his status as a scientist and the future funding of his work. The “Popperian” view would still have allowed publication, as the data falsify a hypothesis that Lewandowsky and others believe in.

Kevin Marshall.