6 A halving of climate sensitivity

Lord Lawson, in a spirited attack on the Energy Bill passing through parliament, said in the House of Lords on 18th June 2013:

There is an emerging consensus among scientists that the climate sensitivity of carbon is probably less than they thought. That means, importantly, that any dangers from warming, if they occur, are postponed well into the next century. It means that there is no urgency to go ahead in this way, not only because the uncertainties are in the distant future but because we have no idea what technologies will develop over the next 100 years.

The analysis I have developed shows that Lord Lawson understates how significant the climate sensitivity issue is to the problem of catastrophic warming and mitigation policy.

In my analysis, the maximum warming would be 7°C. There is a case for warming topping out at some level, as

  • There are diminishing returns to increases in greenhouse gas on temperature.
  • There are diminishing returns at some point for unit rises in emissions on levels of GHGs. That is, higher levels of GHGs in the atmosphere will lead to higher levels of absorption.

I will assume that climate sensitivity is halved, but that temperatures eventually reach 5°C above pre-industrial levels.

The simpler curve for working out the consequences is the policy curve. Any constraint of greenhouse gas levels will now have only half the impact on temperature, so the policy curve will shift to the right, to PC1.

The climate costs curve is somewhat more difficult. The elements to consider in the curve are

E(CGW) = f{M, 1/t, L, A, │Pr-E()│, r, R, W}

I have highlighted the elements to consider.

Time t will be doubled, so warming rates will be halved. Some of the harmful consequences of warming stem from unprecedentedly rapid change. For many animals and plants, it is speculated that sudden change is much more damaging than slower change. More importantly, sudden changes in average temperature could jolt climate systems into different patterns. Savannahs could become deserts, or the monsoon could shift. Another aspect to consider is that rapid warming of the tundra could release massive amounts of CH4 into the atmosphere, further accelerating warming. Or rapid warming could lead to rapid disintegration and breakup of the polar ice caps, leading to rapid acceleration of sea level rise. Slower warming makes us much less likely to cross these climate tipping points.

Adaptations A can be phased in more gradually. For instance, with sea level rise the Thames Barrier will eventually no longer be adequate, but a replacement designed to last 50 years will need to be much less extensive. If warming causes crop yields to fall through increased drought, there is more time to adjust.

With changes happening more slowly (and less chaotically), the adaptation cost errors, │Pr-E()│, are likely to be smaller.

For any positive rate of discount r, the current net present value of climate costs will be lower over the much extended warming period. Moreover, as Stern used a discount rate not much different from zero, allowing for a realistic discount rate would on its own outweigh all the other issues.

For all of these reasons, the climate cost curve will move down to CC1 and the total cost curve to TC1. The point where policy costs equal climate change costs moves from A to B – that is, at a significantly higher temperature, and for a much lower level of policy cost.

I have steered away from the weighting W issues. But given that sensitivity is a core issue that the climate models have consistently got wrong, any weighting given to their other predictions should be viewed with greater scepticism.

7 Appendix – Deriving the Policy and Forecast Graph

In the introduction, the derivation of the graph replicating the claim that the costs of catastrophic global warming will be many times greater than mitigation policy costs was logically incomplete. Here is a derivation of the two cost functions as a series of PowerPoint slides, which I find somewhat more satisfactory.

Slide 1

First draw two axes – one for temperature and one for relative cost.

Slide 2

Next, add in five points.

A. If there had been no rise in human greenhouse gases, there would be no rise in temperatures and thus no consequential costly climate impacts.

B. With “business as usual”, there will be a huge amount of warming, with hugely costly consequential climate impacts.

C. Globally, policy could be used to stop any further rise in greenhouse gases, but with huge global cost.

D. No policy and no policy costs.

E. The intersection of the two curves, which in Stern’s view is at the point of constraining warming to about 3 degrees above pre-twentieth-century levels.

Slide 3

Connecting up the points AB (climate costs) and CD (policy costs) with straight lines (linear functions) creates an intersection at point F.

To replicate Stern, we need cost functions that intersect at point E. That is, the climate cost curve connects AEB and the policy cost curve connects CED.

Slide 4

Drawing curves within PowerPoint is beyond my current skills. Simple curves have symmetrical properties. The required cost curves do not have such properties.

Slide 5

Above is the actual graph used.

Slide 6

On my graphs the cost curves are unstable functions.

For climate costs

RC = f(T^4)

For policy costs

RC = f((10-T)^5)
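As an illustration, here is a minimal Python sketch of these two functions, assuming unit scaling constants (the slides give no coefficients, so the intersection found below is an artefact of that assumption; with suitably tuned constants it would sit at point E):

```python
# Toy versions of the slide cost curves, with assumed unit coefficients.
from scipy.optimize import brentq

def climate_cost(temp):
    """Relative cost of climate impacts: RC = f(T^4)."""
    return temp ** 4

def policy_cost(temp):
    """Relative cost of constraining warming to temp: RC = f((10-T)^5)."""
    return (10 - temp) ** 5

# The curves intersect where their difference changes sign.
t_cross = brentq(lambda t: climate_cost(t) - policy_cost(t), 0.1, 9.9)
print(f"Curves intersect at about {t_cross:.1f} degrees")  # ~5.9 here
```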

To justify policy

  1. One must have reliable forecasts of the consequences of warming beyond human experience. Climate models must be robust for the high temperature rises forecast, and have a phenomenal degree of precision in the shorter-term cost impact forecasts.
  2. One must be sure there are achievable high-impact, low-cost policies, with a highly results-driven approach to policy implementation.

Radiative Forcing – UNIPCC AR5 undermines AR4, but scientists have unshaken confidence in their work

Last year in “Aerosols – The UNIPCC AR4 adjustment factor” I claimed that in 2007 the UNIPCC engineered the radiative forcing components to tell a story. It basically manipulated the figures to account for the lack of warming up to that point. The release of the AR5 Working Group 1 report yesterday shows the extent of the false levels of certainty in the scientists’ estimates in 2007.

The Data

In 2007 Figure 2.4 of the Synthesis Report was as follows

In 2013, Figure SPM.5 is below.1

There are slight changes in format and terminology. I have put the two tables side-by-side for comparison, with analysis:-

[Table: the 2007 and 2013 radiative forcing estimates side-by-side, with analysis]

For the range of each forcing, I have expressed the range as a percentage of its mid-point.

Below are comments on the individual forcing components.

Carbon Dioxide CO2

The most important anthropogenic greenhouse gas has hardly moved, from 1.66 to 1.68 W m-2. In 1750 CO2 levels were 280 ppm, rising to 379 ppm in 2005 and 392 ppm in 2011. In 2007, the scientists estimated that it took a rise of 60 ppm to increase radiative forcing by 1 W m-2, compared to 66 ppm in 2013. That is, scientists have found that CO2 is 10% less effective as a greenhouse gas than previously thought. They are far less certain about this figure, as the range has doubled, yet they have switched from high confidence to very high confidence in their figures.3,4
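The implied-potency arithmetic here (and for CH4 in the next section) can be checked with a few lines of Python. This is a sketch of the text's implicit linear approximation; in reality forcing rises roughly logarithmically with concentration:

```python
# Concentration rise needed per 1 W m-2 of forcing, from the figures
# quoted in the text (a linear approximation, for checking only).

def conc_per_watt(c_1750, c_now, forcing):
    return (c_now - c_1750) / forcing

print(conc_per_watt(280, 379, 1.66))   # CO2, AR4: ~60 ppm per W m-2
print(conc_per_watt(280, 392, 1.68))   # CO2, AR5: ~67 ppm per W m-2
print(conc_per_watt(715, 1774, 0.48))  # CH4, AR4: ~2200 ppb per W m-2
print(conc_per_watt(715, 1803, 0.97))  # CH4, AR5: ~1120 ppb per W m-2
```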

Methane CH4

CH4 has practically doubled in impact, from 0.48 to 0.97 W m-2. In 1750 CH4 levels were 715 ppb, rising to 1774 ppb in 2005 and 1803 ppb in 2011. In 2007, the scientists estimated that it took a rise of 2200 ppb to increase radiative forcing by 1 W m-2, compared to 1120 ppb in 2013. That is, scientists have found that CH4 is practically twice as potent a greenhouse gas as previously thought. They are far less certain about this figure, as the range has more than doubled relative to the mid-point. More significantly, the new potency is well outside the confidence range of the 2007 report: there the high point of the uncertainty range was 0.53 W m-2, whereas the new low point of the uncertainty range is 0.74 W m-2. Despite having been so far out six years ago, the scientists still have high confidence in their figures. The reason given on page 9 is

This difference in estimates is caused by concentration changes in ozone and stratospheric water vapour due to CH4 emissions and other emissions indirectly affecting CH4.

 

The potency of CH4 is a modelled estimate based on other factors. It is by including these indirect effects that the uncertainty is increased.

As a side point, of the 1100 ppb rise in CH4 levels since 1750, 80% occurred prior to 1975. CH4 has since ceased to be a significant contributor to increasing radiative forcing, so even given its increased recognised potency, this slowdown is only a minor explanation of the pause in warming.

Nitrous Oxide N2O

This has hardly moved in impact, from 0.16 to 0.17 W m-2. In 1750 N2O levels were 270 ppb, rising to 319 ppb in 2005 and 324 ppb in 2011. Scientists are far less certain about these figures, as the range has nearly doubled, but they still have high confidence in their figures.

Halocarbons4

Although halocarbons are a minor group of greenhouse gases, their impact has reduced from 0.34 to 0.18 W m-2, whilst the magnitude of the uncertainty band has increased more than five-fold, from 0.06 (0.37-0.31) to 0.34 (0.35-0.01). Instead of this reducing the scientists’ confidence, they have gone from “high confidence” to “very high confidence” in the figures.

Aerosols

Of the 2007 report I claimed they were a fudge factor, suppressing the warming effect of greenhouse gases. The combined mid-point of the direct and cloud albedo effects is now 1.20 W m-2, down more than 30% on 2007. More significant is the range of uncertainty, which has increased from 0.8 to 0.9 W m-2, with the high end now implying a net warming effect. Despite now being uncertain whether the direct effect of aerosols warms or cools the planet, and despite being less certain than the already wide “confidence” range of six years ago, the scientists still have high confidence in their figures.

Forecasts for Radiative Forcing in 2100 for CO2 and CH4

Let us assume that CO2 continues to increase at 3 ppm a year and CH4 by 5 ppb a year until 2100. Using 2007 potency estimates, CO2 forcing will then be 6.34 W m-2 and CH4 forcing 0.69 W m-2 above 1750 levels. Using 2013 potency estimates, CO2 forcing will be 5.72 W m-2 and CH4 forcing 1.37 W m-2 above 1750 levels. The combined estimated forcing differs by less than 1%, despite the doubling of CH4’s potency. Maybe we will have much greater reason to worry about the melting of permafrost in the tundra causing a huge rise in atmospheric methane levels, as the projected warming from that factor has been doubled.
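A minimal sketch reproducing this forecast, again using the text's linear potency figures (the real forcing relationship is roughly logarithmic, so treat these as rough checks):

```python
# 2100 forcing above 1750 levels, from linear growth assumptions and
# the implied potencies above (conc units: ppm for CO2, ppb for CH4).

def forcing_2100(c_2011, growth_per_yr, c_1750, conc_per_watt):
    c_2100 = c_2011 + growth_per_yr * (2100 - 2011)
    return (c_2100 - c_1750) / conc_per_watt

print(forcing_2100(392, 3, 280, 59.6))    # CO2, 2007 potency: ~6.3 W m-2
print(forcing_2100(392, 3, 280, 66.7))    # CO2, 2013 potency: ~5.7 W m-2
print(forcing_2100(1803, 5, 715, 2206))   # CH4, 2007 potency: ~0.7 W m-2
print(forcing_2100(1803, 5, 715, 1122))   # CH4, 2013 potency: ~1.4 W m-2
```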

Conclusion

Scientists now implicitly admit that they were much too confident about the potency of greenhouse gases in 2007. They have now doubled the uncertainty bands on the three major greenhouse gases. Yet recognizing this past overconfidence seems to have had no impact on current levels of confidence.

Kevin Marshall

 

Notes

  1. The graphic at the time of writing was only available in pdf format.
  2. NMVOC = Non-methane volatile organic compounds. They have a role in the production of ozone. Defra have a fuller explanation.
  3. All these figures are available from the 2007 “Full report” page and the 2013 WG1 Summary for Policymakers page 7. This is the 27-09-13 version. Page numbering will change once tables are properly inserted.
  4. Upon re-reading I have made two adjustments. For CO2, I note that scientists have increased their confidence despite doubling the size of their uncertainty bands. I have also added a comment on halocarbons, where confidence has increased despite a more than five-fold increase in the magnitude of the uncertainty band.

Assessing the UNIPCC fifth assessment report

The first part of the UNIPCC AR5 is due to be published in the coming days. At the Conversation, Research Fellows Roger Jones and Celeste Young at Victoria University have posted Explainer: how to read an IPCC report. It contains some useful stuff on penetrating the coded language of the IPCC report. You will be better able to decode what the IPCC mean by various levels of confidence. However, the authors are very careful not to give people a free rein in thinking for themselves. Therefore they stress that the language is complex, and any questions need to be answered by an expert. After all, it would not do to have people misinterpreting the science.

I suggest an alternative method of understanding the science. That is comparing what is said now with what the consensus said back in 2007 in AR4. The AR4 is available at the United Nations Intergovernmental Panel on Climate Change website at the following location.

http://www.ipcc.ch/publications_and_data/publications_ipcc_fourth_assessment_report_synthesis_report.htm

Figure 2.4 Radiative forcing components of SYR.

It would be nice to see the comparative estimates, particularly on whether aerosols have a comparatively large negative role and whether natural factors are still less than 10% of the net total.


Figure 2.4. Global average radiative forcing (RF) in 2005 (best estimates and 5 to 95% uncertainty ranges) with respect to 1750 for CO2, CH4, N2O and other important agents and mechanisms, together with the typical geographical extent (spatial scale) of the forcing and the assessed level of scientific understanding (LOSU). Aerosols from explosive volcanic eruptions contribute an additional episodic cooling term for a few years following an eruption. The range for linear contrails does not include other possible effects of aviation on cloudiness. {WGI Figure SPM.2}

Figure SPM.6. Projected surface temperature changes for the late 21st century (2090-2099).

An updated map on a comparable basis would be useful, especially for the most concerning area of the Arctic.


Figure SPM.6. Projected surface temperature changes for the late 21st century (2090-2099). The map shows the multi-AOGCM average projection for the A1B SRES scenario. Temperatures are relative to the period 1980-1999. {Figure 3.2}

Table SPM.2. Examples of some projected regional impacts.


It would be nice to have an update on how the short-term impacts are doing. These all had high confidence or very high confidence.

In Africa

By 2020, between 75 and 250 million of people are projected to be exposed to increased water stress due to climate change.

By 2020, in some countries, yields from rain-fed agriculture could be reduced by up to 50%. Agricultural production, including access to food, in many African countries is projected to be severely compromised. This would further adversely affect food security and exacerbate malnutrition.

In Australia and New Zealand

By 2020, significant loss of biodiversity is projected to occur in some ecologically rich sites, including the Great Barrier Reef and Queensland Wet Tropics.

Small Islands

Sea level rise is expected to exacerbate inundation, storm surge, erosion and other coastal hazards, thus threatening vital infrastructure, settlements and facilities that support the livelihood of island communities.

Please note the graphs used are available at this website and are IPCC Copyright.


Fundamentals that Climate Science Ignores

Updated 08/09/13 am

Updated 08/09/13 pm – M The Null Hypothesis

Climate Science is a hugely complex subject, dealing with phenomena that are essentially chaotic, with vague patterns. Yet the promotion of that science is banal and superficial. Below are some of the fundamentals that have been addressed in established areas like economics, philosophy and English Common Law, but which the Climate Science community ignores. Most overlap, or are different ways of looking at the same thing.

A Positive and Normative

I do not hold with the logical positivism in vogue in the early part of the C20th, which later underpinned the “positive economics” ideas of Milton Friedman, popular from the 1950s to the 1980s. But it made the useful distinction between positive statements (empirically based) and normative statements (what ought to be). The language of climate science is heavily value-laden. There is no attempt to distinguish positive from normative in the language, nor to highlight that competency in the sphere of positive statements is not necessarily an indication of competency in normative ones. For instance, when scientists make statements about the moral imperative for policy, they may overemphasize the moral questions raised, as they may be too close to the subject. In fact, believing that rising greenhouse gas levels cause a worsening of climate can lead to a bias towards the simplified solution of constraining that growth. It takes understanding of the entirely separate fields of economics and public policy-making to determine whether this is achievable, or the best solution.

B Boundary conditions

There is no clear definition of science in general or the study of climate in particular. The only underlying definitions are tantamount to saying that science is what scientists do, and scientific statements are those made by scientists. Without a clear definition of science, scientists end up making unsupported statements, outside their area of competency. For instance, scientists often make statements about the economic case for policy. With the term “climate change” encompassing both, the general public are misled into believing that “climate scientists” cover both areas.

C Open and closed questions

A closed question can be answered in a single word. The narrowest closed questions are those that can be answered “Yes/No” or “True/False”. Open questions need fuller answers. Climate change is not just about closed questions; it is about how much, how likely, when and where. In terms of boundaries, science versus non-science is not a closed question – in practice the boundary drawn is between work published in a peer-reviewed journal and work published outside. That leads on to non-triviality, quality conditions and relevancy.

D Trivial v. Non-trivial

The strongest evidence for global warming suggests a trivial issue. In one aspect this is true by definition. The non-trivial part – the potential climate catastrophe that policy seeks to avert – relies upon future projections, and upon temperature rises many times greater than so far experienced. Projections will always be weaker than the actual evidence. But there is an empirical aspect as well: if the actual trends are far below those predicted (surface temperature warming trends), or fail to show a switch to a path pointing to catastrophe (acceleration in the rate of sea level rise), the empirical case for the non-trivial part is undermined.

E Quality

There is good quality science and poor quality. Peer review should help, but (as suggested in the Climategate emails) acceptance/rejection can be based on criteria other than science. In most areas of science, and indeed in many professions, efforts have been made to improve the quality of results. One minor step towards improvement of quality is the insistence on publishing the data behind peer-reviewed articles. This has led to the quick exposure of shoddy work like Gergis et al 2012 and LOG12 papers, whereas it took many years of persistence by Steve McIntyre to get the full data on Keith Briffa’s deeply flawed Yamal tree-ring temperature proxy. However, as the forthcoming UNIPCC AR5 report will demonstrate, increasing quality is sacrificed in promoting climate catastrophism.

F False Positives and False Negatives

A particular subset of the quality issue is that of false positives and false negatives. With activists pressuring governments and scientific bodies to agree with the dogma, and promotion of pejorative language (e.g. deniers, fake skeptics), misattribution of significant weather events to climate change is a consequence. Whilst in cancer screening there have been efforts made to reduce the number of false positives and false negatives, in climate science there seems to be every effort to increase the numbers of false positives. (Superstorm Sandy that hit New York state last year, the extreme heat wave in Europe in 2003, the low sea ice point in September 2012).

G Relevancy and significance

Some pieces of information, or scientific papers, are more important than others. The vast majority of papers published are on trivial issues and/or fail to make a lasting impact. In terms of catastrophic global warming, most papers in the field are tangential to the subject. The same is true of items of information, statistics and opinions.

H Necessary and Sufficient

For a climate policy to give net benefits, a number of conditions are necessary, both in the science (greenhouse gas effect, significant warming, adverse consequences) and in the policy area (a policy with theoretical net benefits greater than the costs of doing nothing, a large enough policy area, effective policy management). For policy success (net policy benefits > costs of doing nothing), all of these conditions are to some extent necessary. For policy failure, it is sufficient for just one of the necessary conditions to fail (see the sketch after this list). It does not matter whether this is

–       climate sensitivity being much lower than assumed

–       or adaptation at the non-governmental local level is much more effective than assumed

–       or the net adverse consequences of any given amount of warming are grossly exaggerated

–       or the theoretical economic case for policy is flawed (such as demand for energy is far more inelastic with respect to price over time than assumed, or that renewable energy is not a close substitute to fossil fuel energy)

–       or the actual policy enacted does not encapsulate the economic theory, diluting or nullifying the effectiveness

–       or a unilateralist policy where success requires the vast majority of the biggest economies to participate

–       or the policy on paper is potentially successful, but it is not project managed to drive through the maximum benefits at least cost
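The logical asymmetry can be put in a few lines of Python; the condition names are mine, purely for illustration:

```python
# Necessary vs sufficient: policy success needs every necessary
# condition to hold; failure needs only one of them to fail.
conditions = {
    "climate_sensitivity_as_assumed": True,
    "adverse_consequences_as_assumed": True,
    "economic_theory_sound": True,
    "policy_encapsulates_theory": False,  # a single failure...
    "enough_economies_participate": True,
    "policy_well_managed": True,
}

policy_success = all(conditions.values())  # ...and success is gone
print(policy_success)  # False
```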

I Levels of evidence

In the legal systems, especially in criminal law, it has long been recognized that there are different qualities of evidence. The strongest is DNA, fingerprints, or catching somebody in the act. There is then secondary evidence from witnesses. There is then circumstantial evidence, such as the accused being near to the scene at the time, with no clear reason to be there. The lowest form of evidence, and usually rejected, is hearsay evidence. That is opinions of people with little interest in the case, giving unsupported opinions. The judicial process also views more highly evidence that is corroborated by other pieces of evidence, and evidence that on its own seems quite strong is downgraded or ruled out by contrary evidence, or alternative explanations.

J Values of the Legal Process in Reverse

Climate science fails to grapple with the grading of evidence, as one of its strongest arguments – consensus amongst scientists – is actually hearsay. Improving the quality of evidence would mean critically examining past forecasts in the light of outcomes. In the judicial process, creating prejudice in the eyes of the jury against the defendant, or seeking to deny the accused a defence, is forcefully dealt with. In climate science, creating prejudice against those who question the climate change dogmas, and denying them a voice, is viewed as part of the cause.

K Underdetermination Thesis

“The underdetermination thesis – the idea that any body of evidence can be explained by any number of mutually incompatible theories”

Quote from Kuhn vs Popper – Steve Fuller 2003

The global warming hypothesis is but one of a number of hypotheses trying to explain why climate changes over time. The problem is not just the potential number of competing theories. It is that there might be a number of different elements influencing climate, with the various weightings dependent on the method and assumptions in the analysis. It is not just a matter of determining which one, but which ones, and to what extent they interact.

L Vulnerability

Every scientific hypothesis is vulnerable to being refuted. Human-caused catastrophic global warming (CAGW) is based on extremely tentative assumptions, and is a forecast of future events. As the warming of the past one hundred years is tiny compared with that forecast to happen in the future, and as that warming is partly obscured by natural variations, the signal of future catastrophe will be weak. The issue is further clouded by the lack of long periods of data on climate variability from before human emissions became significant – that is, data prior to 1945, when the post-war economic boom led to a huge increase in human emissions. Even assuming the forecasts of CAGW are correct, the hypothesis is incredibly vulnerable to rejection.
But if CAGW is false, or massively exaggerated, then the hypothesis is deeply susceptible to confirmation bias by those who only look to find evidence of its truth. The core belief of climate science is that the catastrophist hypothesis is true and the job of the “science” is to reveal this truth. The core mission of many co-believers is to stop any questioning of these core beliefs. The alternative view is that evidence for CAGW has become stronger over the last twenty-five years, making the hypothesis less vulnerable over time. This can be tested by looking at the success of the short-term predictions.

M The Null Hypothesis

Wikipedia’s definition is

In statistical inference of observed data of a scientific experiment, the null hypothesis refers to a general or default position: that there is no relationship between two measured phenomena… Rejecting or disproving the null hypothesis – and thus concluding that there are grounds for believing that there is a relationship between two phenomena… – is a central task in the modern practice of science, and gives a precise sense in which a claim is capable of being proven false.

It applies to AGW theory, as the hypotheses are empirical relationships. With highly complex, and essentially chaotic, systems it is only by confronting the data using a battery of statistical tests that you can disprove the null hypothesis. Without the null hypothesis, and without such rigorous testing, all the data and observations will only confirm what you want to believe. Some of the best established empirically-based hypotheses, like “HIV causes AIDS” and “long-term heavy smoking significantly reduces life expectancy” have been confronted with the null hypothesis many times against large, high quality data sets. At extremely high levels of significance, the null hypothesis of no relationship can be rejected.

It could be claimed that the null hypothesis is not applicable to AGW theory, as it forecasts something much worse happening than has so far been experienced. However, it is all the more important because of this. There is no bridge between reality and the theoretical relationships (with assumed magnitudes) in the climate models. The null hypothesis (general or default position) for testing against actual data is not that there is no relationship, but the double-negative of no non-trivial relationship. So the null hypothesis for testing “CO2 causes warming” is not “CO2 does not affect temperature”, but “CO2 has no non-trivial impact on warming”. The reason is that the claimed requirement for policy is avoidance of a climate catastrophe, with relationships being non-trivial in magnitude.
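To illustrate what such a confrontation with data might look like, here is a minimal sketch on synthetic data (the trend and noise are invented, not real temperature records):

```python
# Testing a warming trend against a null hypothesis, on synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
years = np.arange(1980, 2014)
# Invented series: 0.005 C/yr trend buried in 0.1 C of natural noise.
temps = 0.005 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

res = stats.linregress(years, temps)
print(f"trend = {res.slope:.4f} C/yr, p-value = {res.pvalue:.3f}")
# A small p-value only rejects "no relationship". The stricter null of
# "no non-trivial impact" means testing the slope against a threshold
# (e.g. H0: slope <= some trivial rate), a much harder bar to clear.
```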

Showing Warming when it has Stopped

There has been no statistically significant warming for at least 15 years. Yet some people, like commentator “Michael the Realist”, who is currently trolling Joanne Nova’s blog, are claiming otherwise. For instance

Again look at the following graph.

Now let me explain it to the nth degree.
# The long term trend over the whole period is obviously up.
# The long term trend has pauses and dips due to natural variations but the trend is unchanged.
# The current period is at the top of the trend.
# 2001 to 2010 is the hottest decade on the record despite a preponderance of natural cooling trends. (globally, ocean, land and both hemispheres)
# Hotter than the previous decade of 1991 to 2000 with its preponderance of natural warming events.
# Every decade bar one has been hotter than the previous decade since 1901.

Please explain why the above is true if not AGW with proof.

State of the climate 2012
http://www.climate.gov/news-features/understanding-climate/state-climate-2012-highlights

The three highlighted comments are the ones that this posting addresses.

Using decadal average temperature changes to cover up the standstill.

The latest way to avoid the truth that warming has stopped for 15 years or more is by decadal averages. This can be illustrated by using an approximate model of the data. Assume constant average temperatures from 1960 to 1975, a linear warming of 0.6°C from 1976 to 1998, followed by a further standstill.


The decadal averages are


So, instead of 24 years of warming, we have four consecutive decades, each warmer than the last. The 2000s are warmer than the 1990s simply because there was warming in the 1990s. It is political spin, relying on an ignorance of basic statistics, that is needed to make such claims.
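The toy model above is easy to reproduce; a minimal sketch:

```python
# The approximate model: flat to 1975, +0.6 C linearly over 1976-1998,
# flat thereafter -- then decadal averages computed from it.
def temp_anomaly(year):
    if year <= 1975:
        return 0.0
    if year <= 1998:
        return 0.6 * (year - 1975) / 23.0
    return 0.6

for start in (1961, 1971, 1981, 1991, 2001):
    decade = [temp_anomaly(y) for y in range(start, start + 10)]
    print(f"{start}-{start + 9}: {sum(decade) / 10:.2f} C")
# Prints 0.00, 0.04, 0.27, 0.53, 0.60 -- each decade warmer than the
# last, even though the underlying series stopped rising in 1998.
```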


Lamar Smith and Implementing effective policy on climate change

There has been considerable ire directed at Texan Congressman Lamar Smith for his Washington Post op-ed entitled “Overheated rhetoric on climate change doesn’t make for good policies”.


Lamar begins

Climate change is an issue that needs to be discussed thoughtfully and objectively. Unfortunately, claims that distort the facts hinder the legitimate evaluation of policy options

Lamar concludes

Instead of pursuing heavy-handed regulations that imperil U.S. jobs and send jobs (and their emissions) overseas, we should take a step back from the unfounded claims of impending catastrophe and think critically about the challenge before us. Designing an appropriate public policy response to this challenge will require that we fully assess the facts and the uncertainties surrounding this issue, and that we set aside the hyped rhetoric.

I could not agree more. Judith Curry shows that the so-called “scientific” criticism is less balanced than the politician’s initial comments.

To think critically and objectively about any complex problem, it needs to be broken down into sub-sections with relevant areas of expertise. Nowhere is this more important than in climate change policy, where the science demands belief and people get lost in irrelevant detail. A starting point is to divide the issue into three parts, with the relevant experts in brackets.

1. Whether there is a potential problem. (Scientists)

2. Whether that potential problem is non-trivial. (Economists interpreting the scientists work)

3. Whether there is the ability to do something positive about that problem. (Economists and public policy-makers to formulate any policy. Economist/auditors, with some input from scientists, to interpret the results.)

1. Whether there is a potential problem.

Most would accept the potential problem: increase the level of greenhouse gases, and average temperatures will increase. It actually folds into the second question.

2. Whether that potential problem is non-trivial.

But the second is far more important. The starting point in assessing the size of the problem is to break any potential impacts down into the components of magnitude, likelihood, time for changes to occur, and the weighting that can be given to the scientific evidence. This is discussed here. As in many other areas, the weighting we give to expert opinion should be based on a track record. Climate science is still very much in its infancy, and many of the projected signposts were either wrong (worsening storms, accelerating sea level rises) or much too extreme (temperature rises). In fact any alleged successes are either through luck or through the initial prediction being so vague that it could hardly fail to be correct. There should also be a recognition of any potential benefits. For instance, Scotland would benefit from being a tad warmer, and increased CO2 may help plant growth. There is also a question of the quality of the climate model projections. There seems little or no attempt at quality improvement through learning from past mistakes and building on successes. Further, I see plenty of claims of being on the side of peer-reviewed science and of consensus, along with a huge public-relations effort, but little about building on the traditions of the greatest scientists, or learning from the philosophers of science. “Climate Science” seems somewhat outside that mainstream.

3. Whether there is the ability to do something positive about that problem.

The third is where the policy-makers step in. Are they able to deliver a policy that will tackle the issues at a lower cost than the benefits? To give a medical analogy, have they sufficient qualifications, and the moral duty of care, that where they inflict painful treatments the patient (the human race and/or Mother Gaia) is better off than if they had done nothing? Given the massive policy failures so far, the answer seems highly negative. Given that much of the effort by believers in the science and in the policy is going into shutting down any policy discussion, failures seem set to continue through deliberate negligence of this issue.

To take the medical analogy further, treatment is tempered by the uniqueness of the ailment and the track record in treating that ailment. For instance, hip replacements have been performed for many years and are quite frequent, so the risks and pain of treatment, along with the mortality rates, are known. So an otherwise reasonably healthy person of forty whose hip joints need replacing to enable them to walk would be recommended for the operation; a frail ninety-year-old would not. But we have never had human-caused climate change before. Indeed, there is a huge dispute about how serious the symptoms will actually be, and they have not come to fruition just yet. Furthermore, the “treatment” has not been properly tested, and those devising the treatment have no qualifications or track record in devising similar treatments. Why do I know this? Because there has never been a global initiative to use economic tools to drive through a solution to a problem whose outward characteristics (though not necessarily the causes) are those of a naturally-occurring phenomenon, nor are the people involved those with experience in getting consensus on global issues, such as nuclear non-proliferation.

Note on the Moral View

I have a strong moral view that politicians should act to make the world a better place, as the underlying desired outcome of public service. It can be on the world stage or in a local community. Climate policy means imposing costs now to avoid much higher costs later. It might be a simplistic and naïve view, but the opposites – that politicians work to make a net negative impact, or do not care what effect they have, or simply work to serve some small factional interest (and to hell with everybody else) – are views that are at best distasteful and at worst downright evil. Like a medical professional, politicians have a duty of care to make sure there is a reasonable expectation that net positive outcomes will happen, and to monitor that progress.

Kevin Marshall

Are Climate Change and Obesity Linked?

Judith Curry has a (somewhat tongue-in-cheek) look at the links between climate change and obesity.

One of the two references is to the care2 website.

Consider the three alleged “links” between climate change and obesity that Dr Curry summarised:-

  • Rising inactivity rates because of hot temperatures
  • Drought-induced high prices on healthy foods
  • Food insecurity promotes unhealthy food choices

Rising inactivity is commonly thought to be due to less manual work, the rise of the car and ever more staring at the TV or computer. If a rise of 0.8°C in temperature were a major factor, then in Britain you would see (for instance) the Scots being more active than those in the South of England, or people being more active in winter than in summer. In both cases the opposite is true.

For drought-induced high prices to be the link, droughts would have to be the main cause of the high prices of healthy foods compared to junk foods. Maybe convenience and taste have more to do with the preference for unhealthy diets. You would also need to show that rising food prices are connected to decreasing crop yields; biofuels may have more to do with rising food prices.

Food insecurity diminishes as per capita income rises, whilst obesity increases. That is, the poorest of the world have hunger as a problem, whilst the rich countries have obesity as a growing problem. Obesity may be a problem of the poor in developed nations, but food supply as a whole is not.

The above article is a very extreme example of

The underdetermination thesis – the idea that any body of evidence can be explained by any number of mutually incompatible theories

Kuhn vs. Popper: The Struggle for the Soul of Science – Steve Fuller 2003, page 46.

Kevin Marshall

AR5 First Order Draft Summary for Policymakers – a few notes on pages 1 to 8

Alec Rawls has taken the brave step of releasing the first order draft of the UNIPCC AR5 Report. Anthony Watts has republished at Wattsupwiththat.

Although Alec Rawls published in breach of signed undertakings, I comment and quote the report in the public interest. There is more than a single, unequivocal, interpretation of the data. To claim otherwise is dogma. This dogma is being used to justify policies that promote net harm to western economies, particularly the poorer and more vulnerable sections of society. In the name of this dogma, impartiality is being annulled and dissenters called nutters.

I have started with some initial observations on the first eight pages of the Summary for Policymakers – the only bit that people ever read. Like utterings from the Kremlin in the 1970s and 1980s, the coded language says as much as, or more than, the actual words.

Major points

  1. No admission of lack of recent rise in the surface temperature record.
  2. But the lack of recent rise is accounted for by a step change in the warming in the Southern Oceans.
  3. AR4 got it wrong on decreasing precipitation in the tropics (which underlay Africagate), and on increasing hurricanes.
  4. Sea level rise is not accelerating. In fact the recent rise since 1993 is similar to the 1930-1950 period.
  5. Global glacier melt is not accelerating. Himalayas do not even get a mention.
  6. The Medieval Warm Period gains more recognition than in AR4. However, recent studies will render AR5 out of date before it is even published.

Page 3 Lines 21-25.
On temperatures there is a cover-up of the recent lack of warming. They cannot admit that global average temperatures have not changed for 15 years.

Page 3 Lines 38-40. Precipitation in the tropics likely increased over the last decade, reversing a previous trend from the mid-70s to the mid-90s. The AR4 prediction of some African countries experiencing up to a 50% reduction in crop yields by 2020 (Africagate) was based upon a belief in increasing extreme drought.

Page 3 Lines 46-48

Changes in many extreme weather and climate events have been observed, but the level of confidence in these changes varies widely depending on type of extreme and regions considered. Overall the most robust global changes are seen in measures of temperature {FAQ 2.2, 2.6} (see Table SPM.1).

Translation – Saying that extreme weather events are evidence of global warming has no scientific validity. The best measures are of global temperature, which we can’t admit has been failing to rise.

Page 4 Line 14. An admission that previous IPCC reports got it wrong on tropical cyclones getting more extreme.

Page 4. A lot of stuff on Trenberth’s missing heat being in the oceans. Oceans have been warming since 1971. The lack of warming of air temperatures since the mid-90s could be accounted for by this comment on lines 36-37:

It is very likely that the Southern Ocean has warmed throughout the full ocean depth since the 1990s, at a rate of about 0.03°C per decade.

The lack of temperature rise is explained by the heating up of the oceans. Global warming is now confined to the Southern Ocean. It is imperceptible – at the Southern perimeter it has not even been sufficient to stop Antarctic sea ice from extending slightly.

Then this

Warming of the ocean accounts for more than 90% of the extra energy stored by the Earth between 1971 and 2010. Upper ocean (0–700 m) heat content very likely increased at a rate between 74 [43 to 105] × 10^12 W and 137 [120 to 154] × 10^12 W for the relatively well-sampled 40 year period from 1971 to 2010. Warming has also been observed globally below 4000 m and below 1000 m in the Southern Ocean, in spite of sparse sampling (see Figure SPM.1). {3.2, Box 3.1, Figure 3.2, Figure 3.3}

The very likely heating of the Southern Ocean is based on sparse sampling?

Page 4, Line 46. Seas have very likely become saltier. The oceans have also become less alkaline: on page 6, lines 30-31, the pH decline is put at 0.015 to 0.024 per decade over the last 3 decades. Calling becoming less alkaline “acidification” is inaccurate – the oceans are heading towards pH neutrality.

Page 5. Glaciers are globally still shrinking. There is no mention of the Himalayas, and no mention of global acceleration. The range is “210 [145 to 275] Gt yr–1 to 371 [321 to 421] Gt yr–1”. The report omits to convert these figures to sea level rise: 210 Gt = 0.64 mm and 421 Gt = 1.29 mm (taking oceans = 326.2 million km2 and 1 Gt water = 1 km3). In old money, glaciers are contributing 2.5 to 5.1 inches per century.
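A sketch of that conversion, using the ocean area assumed in the text (about 326.2 million km²; other sources put it nearer 361 million km²):

```python
# Glacier melt (Gt/yr) to sea level rise, via 1 Gt water = 1 km^3.
OCEAN_AREA_KM2 = 326.2e6  # the text's assumed ocean area

def melt_to_mm_per_year(gigatonnes):
    km_of_rise = gigatonnes / OCEAN_AREA_KM2  # km^3 / km^2 = km
    return km_of_rise * 1e6                   # km -> mm

for gt in (210, 421):
    mm = melt_to_mm_per_year(gt)
    print(f"{gt} Gt/yr = {mm:.2f} mm/yr = {mm * 100 / 25.4:.1f} in/century")
```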

Page 5 Lines 47-49. Sea levels

It is virtually certain that over the 20th century the mean rate of increase was between 1.4 to 2.0 mm yr-1, and between 2.7 and 3.7 mm yr-1 since 1993. It is likely that rates of increase were similar to the latter between 1930 and 1950.

Translation. Sea levels are rising but not accelerating. If sea levels are a lagged response to rising surface temperatures, then (using the HADCRUT3 surface temperature data) we would expect the rise in sea levels to level off in the next few years, unless there is continued warming in the oceans.

Pages 6 to 7 Long-Term Perspective from Paleoclimatic Records

There was a medieval warm period, despite what Michael Mann and others said in 1998 and 1999. But the MWP is put at less than the temperatures at the end of the twentieth century. However, due to the time schedules for acceptance into AR5, they ignore Christiansen and Ljungqvist April 2012 and Ljungqvist et al 2012. The latter, despite including discredited proxies such as Briffa’s notorious Yamal data, quite clearly shows from 120 proxies that the 10th century had higher temperatures than the end of the 20th century.


Similarly, the Esper et al. 2012 study of summer temperatures in Northern Scandinavia will render this part of the report out of date before it is published.

In 2006 the UNIPCC could bring themselves to bend the rules to allow in a corrupt scientific paper that suited their purposes, but this time they ignore two strong studies that undermine their case. If there is an AR6 around 2020, the UNIPCC will have to face the scientific evidence.

Page 8. The last IPCC report overestimated the impact of aerosols. The net impact of greenhouse gases and aerosols rises from 1.72 W m-2 to 2.40 W m-2 as the negative forcings dramatically fall. The share of the total from CO2 falls, despite the major contributor rising from 1.66 W m-2 to 1.82 W m-2: the net impact of CO2 reduces from around 100% to around 75% of the warming impact. It is no longer possible to talk of “rising CO2” as a shorthand for anthropogenically-caused rising greenhouse gases.
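The share arithmetic behind that last claim, as a quick check:

```python
# CO2 forcing as a share of the net anthropogenic total, AR4 vs AR5.
for label, co2, net in (("AR4", 1.66, 1.72), ("AR5 draft", 1.82, 2.40)):
    print(f"{label}: CO2 is {co2 / net:.0%} of net forcing")
# AR4: 97% ("around 100%"); AR5 draft: 76% -- hence CO2 alone no longer
# stands in for the whole of anthropogenic forcing.
```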

NB – the SPM file I refer to can be accessed below. Please compare my comments with the file.

SummaryForPolicymakers_WG1AR5-SPM_FOD_Final

Kevin Marshall

Costs of Climate Change in Perspective

This is a draft proposal for framing our thinking about the climatic impacts of global warming, without getting lost in trivial details or questioning motives. It builds upon my replication of the thesis of the Stern Review in graphical form, although in a slightly modified format.

The continual rise in greenhouse gases due to human emissions is predicted to cause a substantial rise in average global temperatures. This in turn is predicted to lead to severe disruption of the global climate. Scientists project that the costs (both to humankind and other life forms) will be nothing short of globally catastrophic.

That is

CGW = f{K}     (1)

The costs of global warming, CGW, are a function of the change in global average surface temperature, K. This is not a linear function; costs increase more than proportionately per unit of temperature rise. That is

CGW = f{K^x} where x > 1     (2)

Graphically


The curve is largely unknown, with large variations in the estimate of the slope. Furthermore, the function may be discontinuous, as there may be tipping points beyond which the costly impacts of warming become magnified many times. Being unknown, the cost curve is an expectation derived from computer models. The equation thus becomes

E(CGW) = f{K^x}     (3)

The cost curve can be considered as having a number of interrelated elements: magnitude M, time t and likelihood L. There are also costs involved in taking actions based on false expectations. Over a time period, costs are normally discounted, and when considering a policy response, a weighting W should be given to the scientific evidence. That is

E(CGW) = f{M, 1/t, L, │Pr-E()│, r, W}     (4)

Magnitude M is both the severity and the extent of the impacts on humankind or the planet in general.

Time t is highly relevant to the severity of the problem. Rapid changes in conditions are far more costly than gradual changes. Also impacts in the near future are more costly than those in the more distant future due to the shorter time horizon to put in place measures to lessen those costs.

Likelihood L is also relevant to the issue. Discounting a possible cost that is not certain to happen by the expected likelihood of that occurrence enables unlikely, but catastrophic, events to be considered alongside near certain events.

│Pr-E()│ is the difference between the predicted outcome, based on the best analysis of current data at the local level, and the expected outcome that forms the basis of adaptive responses. It can work two ways. If there is a failure to predict and adapt to changing conditions, there is a cost. If there is adaptation in anticipation of a future condition that does not emerge, or is less severe than forecast, there is also a cost. │Pr-E()│ = 0 when the outturn is exactly as forecast in every case. Given the uncertainty of future outcomes, there will always be costs incurred that would be unnecessary with perfect knowledge.

Discount rate r is a device that recognizes that people prioritize according to time horizons. Discounting future costs or revenues enables us to evaluate the distant future alongside the near future.
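To make the quantitative elements of equation (4) concrete, here is a minimal sketch with invented placeholder numbers (the magnitudes, likelihoods and rates below are illustrations only, not estimates):

```python
# Expected present value of a possible impact of magnitude M with
# likelihood L, arriving t years out, at discount rate r. The evidence
# weighting W remains a judgement call outside the arithmetic.
def expected_discounted_cost(M, L, t, r):
    return M * L / (1.0 + r) ** t

# A hypothetical 100-unit impact, 50% likely:
print(expected_discounted_cost(100, 0.5, 50, 0.05))    # ~4.4 if 50 years out
print(expected_discounted_cost(100, 0.5, 100, 0.05))   # ~0.4 if 100 years out
print(expected_discounted_cost(100, 0.5, 100, 0.001))  # ~45 at a near-zero (Stern-like) rate
```

Note how doubling t at any meaningfully positive r slashes the present value, which is the point made in the climate sensitivity section above.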

Finally the Weighting (W) is concerned with the strength of the evidence. How much credence do you give to projections about the future? Here is where value judgements come into play. I believe that we should not completely ignore alarming projections about the future for which there is weak evidence, but neither should we accept such evidence as the only possible future scenario. Consider the following quotation.

There are uncertain truths — even true statements that we may take to be false — but there are no uncertain certainties. Since we can never know anything for sure, it is simply not worth searching for certainty; but it is well worth searching for truth; and we do this chiefly by searching for mistakes, so that we have to correct them.

Popper, Karl. In Search of a Better World. 1984.

Popper was concerned with hypothesis testing, whilst we are concerned here with accurate projections about states well into the future. However, the same principles apply. We should search for the truth, by looking for mistakes and (in the context of projections) inaccurate perceptions as well. However, this is not to be dismissive of uncertainties. If future climate catastrophe is the true future scenario, the evidence, or signal, will be weak amongst historical data where natural climate variability is quite large. This is illustrated in the graphic below.


[Graphic: the precarious nature of climate cost prediction. Historical data comes from a region where the signal of future catastrophe is weak; projecting on the basis of that signal is prone to large errors.]

In light of this, it is necessary to concentrate on positive criticism, while giving due weighting to the evidence.

Looking at individual studies, due weighting might include the following:-

  • Uses verification procedures from other disciplines
  • Similarity of results from using different statistical methods and tests to analyse the data
  • Similarity of results using different data sets
  • Corroborated by other techniques to obtain similar results
  • Consistency of results over time as historical data sets become larger and more accurate
  • Consistency of results as data gathering becomes independent of the scientific theorists
  • Consistency of results as data analysis techniques become more open, and standards developed
  • Focus on projections at the local (sub-regional) level, for which adaptive responses might be possible

To gain increased confidence in the projections, due weighting might include the following:-

  • Making way-marker predictions that are accurate
  • Lack of way-marker predictions that are contradicted
  • Acknowledgement of, and taking account of, way-marker predictions that are contradicted
  • Major pattern predictions that are generally accurate
  • Increasing precision and accuracy as techniques develop
  • Changing the perceptions of the magnitude and likelihood of future costs based on new data
  • Challenging and removal of conflicts of interest that arise from scientists verifying their own projections

    Kevin Marshall