Another example of Censorship of Skeptics

The blog Zone5 (written by an environmentalist who is thoughtfully sceptical of global warming) has had an article taken down from one of the more moderate pro-CAGW blogs. I left the following comment:

The removal of your article is another small example of what you were writing about. Any attempt to offer counter-arguments, or to criticise, is being shut down, whether in blog comments or in peer-reviewed papers. But enough of the negative. Your article made some excellent points, particularly on Al Gore’s movie:

First he misrepresents the science by claiming we are facing near certain doom, then he completely downplays the kind of changes we would have to make to prevent catastrophe if we accept the worst case scenario.

It is the crux of what I consider to be the problem of the climate change agenda. I believe there is quite strong science to back up the claim that a doubling of CO2 will cause about one degree of warming. Maybe the climate models are right, and this effect will be doubled or more by cloud feedbacks (though the virulence with which scientific papers suggesting otherwise have been attacked, and the praise given to a similarly weak rebuttal suggesting the opposite, suggests this is an Achilles heel). However, your comment on Al Gore’s film neatly summarises the issue in general. The potential effects of climate change are over-estimated in two ways – magnitude and likelihood. The most important magnitude is time. For instance, the potential sea level rise is treated as if it would be in metres per year: so fast that large areas of land would be swamped before the harvest could be brought in. But even if global temperatures rose by five degrees in a generation (very unlikely), the resultant sea level rise would be sufficiently slow to relocate homes and agriculture, or to build dykes. People’s ability to adapt rapidly to change is remarkable, as emigrants from Britain to Australia (or from Asia to Britain) can testify, yet this is vastly underplayed.

The downplaying of effective policy issues is, if anything, even worse. It is assumed that with a little extra tax, everybody will switch to electric cars or bicycles, and plug a few draughts to cut heating bills by 90%. All this until we get a technological breakthrough in a few years to allow super-abundant, near-costless carbon-free power. If Britain (or the EU) takes the lead, then everybody else will follow. No problem about over-running on costs, or pursuing the wrong type of green energy. No concern that a million or more families will enter fuel poverty every year, whilst still falling far behind on emissions reduction targets.

The overplay of risks / underplay of policy costs was put in a more sophisticated way in the Stern Review. I have attempted to analyse this at

https://manicbeancounter.wordpress.com/2011/02/11/climate-change-policy-in-perspective-%E2%80%93-part-1-of-4/

Please continue to encourage people to think for themselves and compare the various perspectives.

Is this another example of shutting down any sort of dissent, like the increasing dogmatism and extremism of Skeptical Science? (see here, here and here)

Feedbacks in Climate Science and Keynesian Economics

Warren Meyer posts on a parallel between Climate Science and Keynesian Economics. I commented on a subject close to his heart, and central to Keynesianism – feedbacks. I have also attempted an update on the current debate on feedbacks.

Warren

There is a parallel between Keynes and the CAGW that is close to your heart – feedbacks. Pure Keynesianism is that an increase in government expenditure at less than full employment would have a positive feedback response. Keynes called the feedback measure the multiplier. (The multiplier is the reciprocal of the proportion of Government expenditure to GDP. So if government expenditure was 20% of GDP, then a $1bn fiscal boost would increase output by $5bn.)
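As a numerical sketch of the multiplier arithmetic described above – using the reciprocal-of-government-share formulation as stated in the text, not the textbook 1/(1 − MPC) form:

```python
# Sketch of the multiplier arithmetic as stated in the text: the multiplier
# taken as the reciprocal of government expenditure's share of GDP.
def multiplier(govt_share_of_gdp):
    return 1.0 / govt_share_of_gdp

share = 0.20   # government expenditure at 20% of GDP
boost = 1.0    # a $1bn fiscal boost
print(multiplier(share) * boost)  # 5.0, i.e. a $5bn increase in output
```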

By the 1950s the leading sceptic was Milton Friedman who, in his 1962 book “Capitalism and Freedom”, estimated empirically that the multiplier was about 1 – that is, a fiscal boost had no multiplied impact. Friedman was denounced as a denier and a dinosaur. (At the same time, mainstream economics adopted his verificationist methodology.) Indeed by the end of the 1960s it was generally agreed that the long-term feedback impact of government demand management was negative, as increased government expenditure crowded out the private sector, caused escalating inflation (as economic actors ceased to be fooled by the false signals of increased expenditure), slowed economic growth and generally undermined the very structures of the capitalist system. (See Friedman’s Nobel Prize lecture “Inflation and Unemployment”.)

Keynesian thinking is that the capitalist economic system is inherently unstable, and that stability is only achieved through the guiding hand of government. Keynes contrasted this with a caricature of neoclassical economics, in which the macroeconomic system would rapidly come back into equilibrium. Similarly, the climate models’ assumption of chronic instability is contrasted with an extreme caricature of those who disagree with them: that the “deniers” are saying the climate is incredibly stable, with human beings having no influence. In both cases the consequence of this caricaturing is to claim any extreme occurrence as automatic vindication of their perspective.

The Positives of Global Warming in Context

David Friedman makes some good points about the positive aspects of global warming. I would like to put the positives of global warming into context and point the way to making the analysis of the consequences of global warming more rigorous.

Global warming may have positive and negative consequences. The severity of any consequence should be assessed according to three factors.

  1. Magnitude – how large it will be. This can be over a number of dimensions. So a predicted worsening of hurricanes, for instance, might be in frequency, power and area.
  2. Likelihood – the probability of a forecast event occurring.
  3. Randomness. It is predicted that weather systems will become destabilised, so that extreme and unpredictable weather will become the norm.

When extreme events are postulated, the magnitude that is most often over-stated is time. So sea levels are imagined to rise by a foot a year, rather than roughly a foot a century at the current rate (3.2mm per year is the best estimate). The rate of change is crucial here. Incremental changes over generational timescales will hardly be noticed globally, as economic conditions change much more rapidly than this. Also there are unstated assumptions about the likelihood of the events. From an economic point of view, the potential costs can be many times over-stated by a combination of magnitude and likelihood. There are two main reasons to believe this is the case – adaptation and way-markers.
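The time-scale point can be checked with simple arithmetic, using the 3.2mm-per-year figure quoted above:

```python
# Best-estimate current rate of sea level rise, per the text
rate_mm_per_year = 3.2

rise_per_century_m = rate_mm_per_year * 100 / 1000.0
print(rise_per_century_m)  # 0.32 metres -- roughly a foot per century, not per year
```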

Adaptation is people adjusting to changed circumstances. The reason that living standards are over 30 times greater and the world population is more than 10 times greater than 300 years ago is that the human race does not merely adapt to changing conditions – in wealthy countries extreme weather events and failed harvests are hardly a problem. Looking back to the 1960s and 1970s, the mainstream forecasts were for increasing poverty and starvation. With the exceptions where governments are extremely bad (North Korea, Zimbabwe) or there has been extensive conflict (Zaire), this has not been the case. But many of the prophesies of doom assume no adaptation at all. So, literally, farmers will grow the same crops they always have, and people will not think of moving as the sea immerses their houses.

Way-markers are the signals of climate change happening now. Many of the extreme short-term forecasts have been falsified, or shown to be based on pseudo-science. Sea levels have not begun rising towards 25 metres; the Arctic was not ice-free in the summer of 2008, nor will it be in 2013; the snows of Kilimanjaro are not primarily disappearing due to rising temperatures; and the Himalayan glaciers will not be gone by 2035. The Bangladesh landmass has increased; the Amazon rainforest is not about to reach a tipping point; and the Maldives will not disappear beneath the waves. With these clear near-term failures, it is reasonable to expect that longer-term extrapolations will likewise be exaggerated in magnitude.

On the other side, whilst individuals and communities are assumed incapable of adapting to change, it is also assumed that Governments can fix anything at minimal cost. So, subject to a global agreement, CO2 can be constrained (according to the UK Stern Review) at one fifth to one twentieth of the likely costs of doing nothing. No allowance is made for the fact that government projects tend to overrun on costs and underperform on benefits, nor that this degree of underperformance tends to rise proportionately with lack of planning, vagueness of objectives, complexity of the organisations involved and scale.

Finally, for those with a grounding in economics, I have an unfinished project analysing the above issues graphically here and here.

Evangelical Christians and Climate Change Skepticism

Wm Briggs reports on a “forthcoming Bulletin of the American Meteorological Society paper ‘Making the climate a part of the human world’, in which University of British Columbia geographer Simon Donner argues that religion is the cause of global warming denial.” (Pre-publication copy here)

Simon Donner’s Views

Donner’s summary is:

“Ongoing public uncertainty about climate change is rooted in a perceived conflict between the scientific evidence for a human role in the climate and a common belief that the weather and climate are controlled by higher powers.”

This is backed up by a number of studies of religions, both ancient and primitive, from various parts of the world, including Fiji and Papua New Guinea. I can find no reference to the major religions of Islam, Hinduism or Buddhism. There is only one biblical reference, from the Old Testament book of Job, but none from the New Testament – the stories about Jesus and his disciples. Neither is there a distinction between Catholicism and Protestantism, nor a split between evangelical and liberal Protestants.

The Religious Sceptics in USA

The majority of the religious sceptics in the USA are the Protestant Evangelicals. Their type of Christianity is centred on biblical study, both individually and corporately, to perceive the revealed word of God and the interpretation for current circumstances. There are the specialists – the ordained pastors – who provide interpretations through sermons. However, this is just the lead for personal study and reflection.

Collectively, these evangelicals are not a unified body theologically. For instance, a quick comparison of the Southern Baptist Convention and the Assemblies of God websites will quickly demonstrate the point. Nor are there strong ecumenical links between the major churches, as found in Britain.

This bible-based view of Christianity comes directly from the Reformation. In medieval Europe the Bible was handwritten and only available in Latin. With most people illiterate, reading of the Bible was limited to a few dedicated scholars, with interpretation highly centralised and strictly controlled. Any deviation was treated as heresy, often punishable by death. A combination of the advent of printing and translation into the vernacular suddenly made the word of God accessible to a much wider population. It soon became evident that the established religious orthodoxy was, in many places, unsupported from the sacred text and in some cases fundamentally at odds with that text. It was this need to study that changed public worship so dramatically, with teaching replacing the Mass as the centrepiece.

Politically, access to the Bible democratised understanding and the questioning of authority and centralised power. This gave a scholarly impetus to the development of modern science, and also the Liberal political philosophy of John Locke and the Scottish Enlightenment that in turn heavily influenced the Founding Fathers.

An Alternative Thesis

Evangelicals have as their primary resource the Bible and the interpretation of God’s purpose from within their local congregation. Your average church member will have quite a detailed knowledge of the Bible, being able to quote much of the primary doctrine and some major passages. Generally they also “cherish and defend religious liberty, and deny the right of any secular or religious authority to impose a confession of faith upon a church or body of churches” (Southern Baptist Convention). The scepticism towards climate change comes from its presentation. It comes across as a core doctrine agreed upon by a consensus of leading scientists. Yet the truth cannot be perceived by the lay person, only revealed to scientific experts by impenetrable computer models. Any deviation from, or questioning of, core doctrine is treated with contempt and as heresy. Yet the high scientific standards that these experts are supposed to follow have been found wanting. There are two areas where this is demonstrated most clearly.

First, the poster hockey stick of a decade ago – showing global temperatures to be dramatically higher than at any time in the last millennium – was investigated by Steve McIntyre. He showed the results arose from a number of elements, including cherry-picking data; giving undue weighting to favourable results; excluding some unfavourable data points; and failing to apply proper statistical tests. A book charting this episode is found here, and my comparison of an exchange following a savage book review is here.

Second is the Climategate email release, which showed that the core scientists were a fairly small group, that they viewed the science as far from settled, and that they adhered to lower standards of scholarship than was the public perception.

The Inferences from the Donner Paper

Donner either has little understanding of mainstream Christianity in the USA, or he deliberately misrepresents what it stands for. In so doing, he not only completely misses the point of why religious Americans are sceptical but does so in such a way as to make them more antagonistic. That peer review should allow through a paper that clearly lacks a proper argument to support its thesis shows a failure of that process. That a person with no qualifications or prior publishing record in sociology or theology should be allowed to publish on the subject in a journal specialising in the weather shows how far climate science is straying beyond its area of competency. For Christians who are unsure of the global warming arguments, clear evidence of a climate scientist not knowing what he is talking about will make them more sceptical. They will be more likely to accept the sceptical arguments that the science is flawed, whether in the theory, the computer models or the statistics.

A note on HADCRUT3 v GISSTEMP

Have just posted to WUWT the following on global temperature anomalies:-

Thanks Luboš for a well-thought out article, and nicely summarised by

“The “error of the measurement” of the warming trend is 3 times larger than the result!”

One of the implications of this wide variability, and the concentration of temperature measurements in a small proportion of the land mass (with very little from the oceans covering 70% of the globe) is that one must be very careful in the interpretation of the data. Even if the surface stations were totally representative and uniformly accurate (no UHI) and the raw data properly adjusted (Remember Darwin, Australia on this blog?), there are still normative judgements to be made to achieve a figure.

I have done some (much cruder) analysis comparing HADCRUT3 to GISSTEMP for the period 1880 to 2010, which helps illustrate these judgemental decisions.

1. The temperature series agree on the large fluctuations, with the exception of the post 1945 cooling – it happens 2 or 3 years later and more slowly in GISSTEMP.

2. One would expect greater agreement in more recent years. But since 1997 the difference in temperature anomalies has widened by nearly 0.3 celsius – GISSTEMP showing rapid warming and HADCRUT3 showing none.

3. If you take the absolute change in anomaly from month to month and average from 1880 to 2010, GISSTEMP is nearly double that of HADCRUT3 – 0.15 degrees v 0.08. The divergence in volatility reduced from 1880 to the middle of last century, when GISSTEMP was around 40% more volatile than HADCRUT3. But since then the relative volatility has increased. The figures for the last five years are respectively about 0.12 and 0.05 degrees. That is, GISSTEMP is around 120% more volatile than HADCRUT3.
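The volatility measure in point 3 – the average absolute month-to-month change in anomaly – can be sketched as below. The two series here are made-up illustrations, not the actual HADCRUT3 or GISSTEMP data:

```python
def mean_abs_monthly_change(anomalies):
    """Average absolute month-to-month change in a temperature anomaly series."""
    diffs = [abs(b - a) for a, b in zip(anomalies, anomalies[1:])]
    return sum(diffs) / len(diffs)

# Illustrative numbers only -- not real HADCRUT3/GISSTEMP values
smooth_series = [0.10, 0.14, 0.12, 0.16, 0.13]
noisy_series = [0.05, 0.30, 0.08, 0.33, 0.10]
print(mean_abs_monthly_change(smooth_series))  # the less volatile series
print(mean_abs_monthly_change(noisy_series))   # the more volatile series
```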

This all indicates that there must be greater clarity in the figures. We need the temperature indices to be compiled by qualified independent statisticians, not by those who major in another subject. This is particularly true of the major measure of global warming, where there is more than a modicum of partisan elements.

These graphs help illustrate the points made. Please note that I use overlapping moving averages, so it is for illustrative purposes only.

NB. Luboš Motl’s article was cross-posted from his blog here

Tamino on Australian Sea-Levels

Tamino attempts a hatchet-job on a peer-reviewed paper on Australian sea levels. Whilst he makes some valid comments, his post gives the misleading impression that he has overturned the paper’s main conclusion.

The sceptic blogs (GWPF, Wattsupwiththat, Jo Nova) are highlighting a front page article in The Australian about a peer-reviewed paper by P.J. Watson on Australian sea level trends over the past century.

The major conclusion is that:-

“The analysis reveals a consistent trend of weak deceleration at each of these gauge sites throughout Australasia over the period from 1940 to 2000. Short period trends of acceleration in mean sea level after 1990 are evident at each site, although these are not abnormal or higher than other short-term rates measured throughout the historical record.”

The significance is that Watson shows a twentieth-century rise of 17cm +/-5cm in Australia, whilst Government policy is based on a sea level rise of up to 90cm by the end of the century. If there is deceleration from an already low base, then government action is no longer required, potentially saving billions of dollars.

Looking for other viewpoints, I found a direction from Real Climate to Tamino’s Open Mind blog. Given my last encounter, when he tried to defend the deeply flawed Hockey Stick (see my comments here and here), I was curious to know if this was another misdirection. I was not disappointed. Tamino manages to produce a graph showing the opposite of Watson’s result: rapid acceleration, not gentle deceleration.

How does he end up with this contrary result? In summary, he:

  1. Chooses just one of the four data sets used – the Fremantle data set.
  2. Makes valid, but largely irrelevant, criticisms to undermine the scientific and statistical competency of the author.
  3. Takes time to make the point about treating 20-year moving averages as data for analysis purposes. The problem is that this underweights the data points at the beginning and the end; in particular, any recent acceleration will be understated.
  4. Criticises the modelling method, with good reason.
  5. Slips in an alternative model that may answer that criticism.
  6. Shows the results of that model’s output.

Tamino’s choice of the Fremantle data set should be justified, especially as Watson gives this comment in the conclusion:

“There is evidence of significant mine subsidence embedded in the historical tide gauge record for Newcastle and a likelihood of inferred subsidence within the later (after the mid 1990s) portion of the Fremantle record. In this respect, it is timely and necessary to augment these relative tide gauge measurements with CGPS to gain accurate data on the vertical movement (if any) at each gauge site to measure eustatic sea level rise. At present only the Auckland gauge is fitted with such precision levelling technology.”

That is, the Fremantle data shows the largest acceleration towards the end, and this extra acceleration might be because land levels are falling, not sea levels rising.

The underweighting of recent data is important and could be dealt with by looking at shorter-period moving averages and observing the acceleration rates – that is, looking at moving averages for 19, 18, 17 years and so on. If the acceleration rates cross the 20cm-a-century rate as the time periods shorten, then this will undermine Watson’s conclusion. Tamino does not do this, despite it being well within his capabilities. Until such an analysis is carried out, the claim in the abstract that “(s)hort period trends of acceleration in mean sea level after 1990 … are not abnormal or higher than other short-term rates measured throughout the historical record” is not undermined.
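The endpoint-loss problem with moving averages can be sketched as follows. The `centred_moving_average` helper and the data are hypothetical illustrations, not Watson’s actual method or data; the point is simply that shorter windows retain more of the recent record:

```python
def centred_moving_average(series, window):
    """Centred moving average. The smoothed series is shorter than the input:
    (window - 1) points are lost at the ends, which is the underweighting of
    recent data discussed above."""
    half = window // 2
    return [sum(series[i - half:i - half + window]) / window
            for i in range(half, len(series) - (window - half) + 1)]

data = [float(x) for x in range(60)]  # hypothetical annual sea-level record
for window in (20, 19, 18, 17):
    smoothed = centred_moving_average(data, window)
    print(window, len(smoothed))  # shorter windows lose fewer recent points
```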

Instead of pursuing the point, Tamino goes on to substitute for Watson’s modelling method an arbitrary one plucked from the air, with the comment:

“Finally, we come to the other very big problem with this analysis: the model itself. Watson models his data as a quadratic function of time:

y(t) = a + bt + ct²

He then uses 2c (the 2nd time derivative of the model) as the estimated acceleration. But this model assumes that the acceleration is constant throughout the observed time span. That’s clearly not so.”

Instead he flippantly inserts a quartic equation, which gives a time-varying acceleration (the second derivative) that is itself a quadratic function of time.

There are some problems with a quadratic function as a model of acceleration against time. Primarily, it has only one turning point, and extend the graph far enough and it reaches infinity. So at some point in the future sea levels will reach the sun, and later the rate of rise will be faster than the speed of light. More seriously, if this quadratic is the closest fit to all the data series, it will either have already overstated, or will soon overstate, the actual acceleration. If used to project 90 years or more ahead, it will provide a grossly exaggerated projection based on known data.
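The extrapolation problem can be illustrated numerically. The coefficients below are invented for illustration; whatever their values, a positive t² term eventually dominates, so the fitted “acceleration” grows without bound:

```python
def quadratic_acceleration(t, a, b, c):
    """A quadratic model of acceleration against time: a + b*t + c*t**2.
    Coefficients here are invented for illustration."""
    return a + b * t + c * t * t

a, b, c = 0.1, -0.01, 0.0001  # made-up fit with a small positive curvature
for t in (10, 100, 1000, 10000):
    print(t, quadratic_acceleration(t, a, b, c))
# However small c is, the t**2 term eventually swamps the others, so a long
# extrapolation grossly exaggerates the acceleration.
```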

On this basis I have edited Tamino’s graph to give all the inferences that can be drawn from rising sea levels in Australasia.

That is, a pure maths exercise in plotting a quadratic equation on a graph, unrelated to any reality.

An alternative to this is to claim simply that there is not sufficient valid data, or that the analysis is too poor, to draw any long-term inferences.

An alternative approach is to relate the sea level rises to the global temperature rises. Try comparing Watson’s graph of rate of change in sea levels to the two major temperature anomalies.



First, it should be pointed out that Watson uses a twenty-year moving average, so his data should lag the temperature data. The strong warming in the HADCRUT data in the 1920s to 1940s is replicated in the Fort Denison and Auckland sea level data. The lack of warming in the 1945 to 1975 period is replicated by marked deceleration in all four data sets from 1950 to the 1970s. The warming phase thereafter is similarly replicated in all four data sets. The current static phase, according to the more reliable HADCRUT data, should similarly be marked by a deceleration in sea level rise from an already low level. Further analysis of Watson’s data is needed to confirm this.

There is no reason in the existing data to believe that Watson’s conclusions are invalid. It is necessary to play fast and loose with the data, and to get lost in computer modelling games, to draw alternative inferences. Yet if a member of the Australian Parliament says legislation to cope with sea level rise should be withdrawn due to a new study, the alarmist consensus (who will have just skimmed through Tamino’s debunking) will say that the study has been overturned. As a result, ordinary, coastal-dwelling people in Australia will continue to endure real hardship due to legislation based on alarmist exaggerations. (here & here)

IPCC & Greenpeace

Shub Niggurath’s arguments (hat tip Bishop Hill) against the IPCC’s SRREN growth figures are complex. The Greenpeace model on which they were based essentially took a baseline projection and backcast from there. A cursory look at the GDP figures shows that the economic models point to a knife-edge scenario. The economic models indicate that the wrong combination of policies, successfully applied, could cause a global depression for nigh-on a generation and lead to 330 million fewer people in 2050 than the do-nothing scenario. But the successful combination of policies will have absolutely no economic impact.

Shub examines this table:-

Table 10.3, page 1187, chapter 10 IPCC SRREN

(Page 32 of 106 in Chapter 10. Download available from here)

I have looked at the GDP per capita and population figures.


To see whether the per capita GDP projections are realistic, I have first estimated the implied annual growth rates. The IEA calculates a baseline of around 2% growth to 2030. The German Aerospace Centre then believes growth rates will fall to 1.7% in the following 20 years. Why, I am not sure, but it certainly gives a lower target to aim at. Projecting the 2030 to 2050 growth rate forward to the end of the century gives a GDP per capita (in 2005 constant values) of $56,000. That is a greater than five-fold increase in 93 years.

On a similar basis there are two scenarios examined for climate change policies. In the Category III+IV case, growth rates drop to 0.5% until 2030, then pick up to 2% per annum. Why a policy that reduces global growth by 75% for 23 years should then cause a rebound is beyond me. However, the impact on living standards is profound: almost 30% lower than the baseline by 2030. Even if the higher growth is extrapolated to the end of the century, future generations are still 12% worse off than if nothing was done.

But the Category I+II case makes this global policy disaster seem mild by comparison. Here the assumption is that global output per capita will fall year-on-year by 0.5% for nearly a generation. That is falling living standards for 23 years, ending up at little over half what they would have been on the do-nothing path from 2007. This scenario will be little changed in 2050 or 2100. Falling living standards mean lower life expectancy and a reduction in population growth. The model reflects this by projecting that these climate change policies will lead to 330 million fewer people than a do-nothing scenario.
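The scenario arithmetic above can be reproduced with compound growth. The growth rates are as stated in the text; treating the baseline as 2% throughout, and extrapolating 1.7% from 2030 to 2100 for the five-fold check, is my reading of the figures:

```python
def compound(rate, years):
    """Growth factor after compounding `rate` for `years` years."""
    return (1 + rate) ** years

# Baseline: ~2% to 2030 (23 years from 2007), then 1.7% extrapolated to 2100
baseline_2100 = compound(0.02, 23) * compound(0.017, 70)
print(round(baseline_2100, 2))  # ~5.13 -- the greater-than-five-fold increase

# Category III+IV: growth cut to 0.5% until 2030, versus the 2% baseline
print(round(compound(0.005, 23) / compound(0.02, 23), 3))   # ~0.71, almost 30% lower

# Category I+II: output per capita falling 0.5% a year for 23 years
print(round(compound(-0.005, 23) / compound(0.02, 23), 3))  # ~0.57, little over half
```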

Let us be clear what this table is saying. If the world gets together and successfully implements a set of policies to contain CO2 levels at 440ppm, global output in 2050 will be 40% lower. There is a downside risk here as well – that this cost will not contain the growth in CO2, or that the alternative power supplies will mean power outages, or that large-scale, long-term government projects will, as they tend to do, massively overrun on costs and underperform on benefits.

Let us hark back to the Stern Review, published in 2006. From the Summary of Conclusions

“Using the results from formal economic models, the Review estimates that if we don’t act, the overall costs and risks of climate change will be equivalent to losing at least 5% of global GDP each year, now and forever. If a wider range of risks and impacts is taken into account, the estimates of damage could rise to 20% of GDP or more. In contrast, the costs of action – reducing greenhouse gas emissions to avoid the worst impacts of climate change – can be limited to around 1% of global GDP each year.”

Stern looked at the costs, but not at the impact on economic growth. So even if you accept his alarmist prediction of costs of 5% or more of GDP, would you bequeath that to your great-grandchildren, or a 40% or more reduction in their living standards along with the risk of the policies being ineffective? Add into the mix that the Stern Review took the more alarming estimates, rather than a balanced perspective (1), and the IPCC case for reducing CO2 by more solar panels and wind farms looks highly irresponsible.

From my own perspective, I would not have thought that the impact of climate mitigation policies could be so harmful to economic growth. If the models are correct that the wrong policies are hugely harmful to economic growth, then due diligence should be applied to any policy proposals. If the economic models from the IPCC are too sensitive to minor changes, then we must ask if their climate models suffer from the same failings.

  1. See for instance Tol & Yohe (WORLD ECONOMICS • Vol. 7 • No. 4• October–December 2006)

Update 27th July.

Have just read through Steve McIntyre’s posting on the report. Unusually for him, he concentrates on the provenance of the report rather than on analysing the data.

Outflanking Al Gore & other alarmists

At Wattsupwiththat there is a proposal to build a database by

Find(ing) every false, misleading, scary, idiotic, non-scientific statement they have made in the past twenty years. Create an index by name with pages listing those statement with links to the source. Keep it factual. Let their own words come back to haunt them.

My comment was

A database of all the exaggerations, errors and false prophesies will on its own do no good. No matter how extensive, thorough and rigorous, it will be dismissed as having been compiled by serial deniers funded by big oil. Getting a fair hearing in the MSM will be impossible. In the coming battle the alarmists have chosen the field of battle and have impenetrable armour.

To be brief, there needs to be two analogies brought to the fore.

First is the legal analogy. If there is a case for CAGW, it must be demonstrated by primary, empirical evidence, and that evidence must be tested by opponents. It is not the individual parts, which may be true – like lots more CO2 causing some warming – that must be demonstrated, but the whole chain: that there is sufficient CO2 to cause some warming, that this will be magnified by positive feedbacks to cause even greater warming, and that this substantive warming will destabilise the planet’s weather systems in a highly negative way. The counter-argument is two-fold: that many of the dire, immediate forecasts have been highly exaggerated and, more importantly, that the compound uncertainties have been vastly underestimated. That the case is weak is shown by the prominence given to what is hearsay evidence, such as the consensus, the proclamations of groups of scientists, or the image of the hockey stick. In some cases, it has been tantamount to jury-tampering.

Second is the medical analogy. A medical doctor, in prescribing a painful and potentially harmful course of treatment, should at least have a strong professionally-based expectation that after treatment the patient will be better off than if nothing was done. The very qualities that make politicians electable – being able to build coalitions by fudging, projecting an image, and undermining opponents by polarising views – make them patently unfit for driving through and micro-managing effective policy to reduce CO2. They will of necessity overstate the benefits and massively understate the costs, whether financial or in human suffering. They will not admit that the problem is beyond their capabilities, nor that errors have been made. The problem is even worse in powerful dictatorships than in democracies.

I have tried to suggest a method (for those who are familiar with microeconomics) of analysing the IPCC/Stern case for containing CO2 here.

https://manicbeancounter.wordpress.com/2011/02/11/climate-change-policy-in-perspective-%E2%80%93-part-1-of-4/

Also, why there is no effective, global political solution possible.

https://manicbeancounter.wordpress.com/2011/02/13/climate-change-in-perspective-%E2%80%93-part-2-of-4-the-mitigation-curve/

What is missing is why the costs of global warming have been grossly exaggerated.

Question for Sir John Beddington

According to Bishop Hill, "Sir John Beddington is seeking feedback on the climate impacts report I blogged about yesterday."

My question is of a technical nature. Given that the Stern Review of 2006 received worldwide acclamation for its novel conclusions, I would have thought Sir John Beddington would have utilised this work. Apart from a footnote or two, the only reference is in a box on page 63.

Dear Sir John,

I am a humble beancounter, who spends his time analysing complex project costs and application forms for capital expenditure. In this vein: on page 63 of your report you claim that the Stern Review used a social discount rate of 1.4%, whilst

conclude that Lord Stern used a discount rate of 0.1%. Have we all misread the report?
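The practical effect of that difference can be illustrated with a simple present-value calculation. This is a sketch with my own illustrative figures (the damage amount and the 100-year horizon are assumptions, not numbers taken from the Stern Review):

```python
# Present value of a future climate cost under two social discount rates.
# The cost figure and horizon are illustrative assumptions only.

def present_value(future_cost, rate, years):
    """Discount a cost incurred `years` ahead back to today's money."""
    return future_cost / (1 + rate) ** years

# 100 units of damage incurred 100 years from now:
pv_at_1_4_percent = present_value(100, 0.014, 100)  # roughly 25
pv_at_0_1_percent = present_value(100, 0.001, 100)  # roughly 90
```

At 1.4% a cost a century away shrinks to about a quarter of its face value, while at 0.1% it barely shrinks at all. That is why the choice of rate can dominate the whole cost-benefit case for spending on mitigation today.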

Oppenheimer – False prophet or multi-layered alarmist?

Haunting the Library has a posting “Flashback 1988: Michael Oppenheimer Warn Seas to Surge 83 Feet Inland by 2020“.

In retrospect the forecast was false and alarmist; but even if, in 1988, the climate models on which it was based had been correct and unbiased, the scenario would still have had less than a 1 in 1000 chance of occurring. Here is why.

The relevant quote from the “Hour” newspaper is

“Those actions could force the average temperature up by 2 degrees Fahrenheit in the next three decades….Such a temperature increase, for example, would cause the sea level to rise by 10 inches, bringing sea water an average of 83 feet inland”

There are possibly three or more levels of alarmism upon which this conclusion depends:-

  1. The sea level rise was contingent on a 2°F (1.1°C) rise over 32 years, which would have been at the top end of forecasts. Although the projected centennial rate of increase is around 3.5°C, my understanding of the climate models is that it is not just global temperatures that are projected to rise, but also the decadal rate of increase in temperatures, consistent with the accelerating rate of CO2 increase. Normally the range of projections covers a 95% probability band, so the models would have given about a 2.5% chance of this temperature increase.
  2. The rise in sea levels lags air temperature rises by a number of years, due to the twin primary causes of sea level rise – thermal expansion of the oceans and melting land ice. Therefore, I would suggest a combination of three reasons for this projection. First, the models' projection of a 10 inch (25 cm) rise was exaggerated, due to faulty modelling. (IPCC AR4 of 2007 estimates a centennial rise of 30 cm to 60 cm, with accelerating rates of sea level rise correlating with, but lagging, temperature rises.) Second, it was at the top end of the forecast probability range, so there was just a 2.5% chance of the sea level rise reaching this level for a 2°F rise. Third, time lags were not fully taken into account.
  3. The mention of an average horizontal movement of sea water of 83 feet (25 m) inland simply spreads alarmism. For low-lying populated coastal areas, such as Holland, it probably assumes the non-existence (or non-maintenance) of coastal defences. The calculation may also assume that land levels do not naturally change; in the case of the heavily populated deltas and the coral islands, this ignores natural processes that have caused land levels to rise along with sea levels.

So it could be that, based on the climate models of 1988, there was a 2.5% chance of a 2.5% chance of sea levels rising by 10 inches in 32 years, subject to the models being correct. There are a number of reasons to suspect that the models of climate and sea level rise are extreme. For instance, the levels of temperature rise rely on extreme estimates of the sensitivity of temperature to CO2 and/or the feedback effect of temperature increases on water vapour levels (see Roy Spencer here). Sea level rises were probably overstated, as it was assumed that Antarctic temperatures would rise in parallel with those of the rest of the world. As 70-80% of the world's land ice is located there, the absence of warming on the coldest continent will have a huge impact on future sea level forecasts.
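The compounding of those two tail probabilities can be checked with a line of arithmetic. This is a sketch assuming, as above, that each is an independent 2.5% tail event:

```python
# Two independent 2.5% tail events: a top-of-range temperature rise,
# and a top-of-range sea level response to that rise.
p_temperature = 0.025
p_sea_level = 0.025

# Probability that both occur together:
p_combined = p_temperature * p_sea_level  # 0.000625
odds_against = 1 / p_combined             # 1 in 1600
```

A 1 in 1,600 chance is comfortably less than the 1 in 1,000 figure mentioned above, and that is before questioning whether the models themselves are biased.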

Although this forecast was made by a climate scientist, it was not couched in the nuanced terms that empirical scientific modelling requires. Yet it is on such statements that policy is made.