Fundamentals that Climate Science Ignores

Updated 08/09/13 am

Updated 08/09/13 pm – M The Null Hypothesis

Climate Science is a hugely complex subject, dealing with phenomena that are essentially chaotic, with vague patterns. Yet the promotion of that science is banal and superficial. Below are some of the fundamentals that have been addressed in established areas like economics, philosophy and English Common Law, but which the Climate Science community ignores. Most overlap, or are different ways of looking at the same thing.

A Positive and Normative

I do not hold with the logical positivism in vogue in the early part of the C20th, which later underpinned the “positive economics” ideas of Milton Friedman that were popular from the 1950s to the 1980s. But it made the useful distinction between positive statements (empirically based statements) and normative statements (what ought to be). The language of climate science is heavily value-laden. There is no attempt to distinguish positive from normative in language, nor to highlight that competency in the sphere of positive statements is not necessarily an indication of competency in normative ones. For instance, when scientists make statements about the moral imperative for policy, they may overemphasize the moral questions raised, as they may be too close to the subject. In fact, believing that rising greenhouse gas levels cause a worsening of climate can lead to a bias towards the simplified solution of constraining that growth. It takes understanding of the entirely separate fields of economics and public policy-making to determine whether this is achievable, or the best solution.

B Boundary conditions

There is no clear definition of science in general or the study of climate in particular. The only underlying definitions are tantamount to saying that science is what scientists do, and scientific statements are those made by scientists. Without a clear definition of science, scientists end up making unsupported statements, outside their area of competency. For instance, scientists often make statements about the economic case for policy. With the term “climate change” encompassing both, the general public are misled into believing that “climate scientists” cover both areas.

C Open and closed questions

A closed question can be answered with a single word. The narrowest closed questions are those that can be answered “Yes/No” or “True/False”. Open questions need fuller answers. Climate change is not just about closed questions. It is about how much, how likely, when and where. In terms of boundaries, there is no closed question of science versus non-science – the boundary in actual work being between that published in a peer-reviewed journal and that published outside. That leads on to the conditions of non-triviality, quality and relevancy.

D Trivial v. Non-trivial

The strongest evidence for global warming points to a trivial issue. In one aspect this is true by definition. The non-trivial part – the potential climate catastrophe that policy seeks to avert – relies upon future projections. These rely on temperature rises many times greater than those so far experienced. Projections will always be weaker than the actual evidence. But there is an empirical aspect as well. If the actual trends are far below those predicted (surface temperature warming trends), or fail to show a switch to a path pointing to catastrophe (acceleration in the rate of sea level rise), then the case for the non-trivial element is empirically weakened.

E Quality

There is good quality science and poor quality. Peer review should help, but (as suggested in the Climategate emails) acceptance/rejection can be based on criteria other than science. In most areas of science, and indeed in many professions, efforts have been made to improve the quality of results. One minor step towards improvement of quality is the insistence on publishing the data behind peer-reviewed articles. This has led to the quick exposure of shoddy work like the Gergis et al 2012 and LOG12 papers, whereas it took many years of persistence by Steve McIntyre to get the full data on Keith Briffa’s deeply flawed Yamal tree-ring temperature proxy. However, as the forthcoming UNIPCC AR5 report will demonstrate, quality is increasingly sacrificed in the promotion of climate catastrophism.

F False Positives and False Negatives

A particular subset of the quality issue is that of false positives and false negatives. With activists pressuring governments and scientific bodies to agree with the dogma, and the promotion of pejorative language (e.g. deniers, fake skeptics), misattribution of significant weather events to climate change is a consequence. Whilst in cancer screening efforts have been made to reduce the number of both false positives and false negatives, in climate science there seems to be every effort to increase the number of false positives (e.g. Superstorm Sandy that hit New York State last year, the extreme heat wave in Europe in 2003, the low sea ice point in September 2012).

G Relevancy and significance

Some pieces of information, or scientific papers, are more important than others. The vast majority of papers published are on trivial issues and/or fail to make a lasting impact. In terms of catastrophic global warming, most papers in the field are tangential to the subject. The same is true of items of information, statistics and opinions.

H Necessary and Sufficient

For a climate policy to give net benefits, a number of conditions are necessary, both in the science (greenhouse gas effect, significant warming, adverse consequences) and in the policy area (a policy with theoretical benefits greater than the costs of doing nothing, a large enough policy area, effective policy management). For policy success (net policy benefits > costs of doing nothing) all are to some extent necessary, but none alone is sufficient. For policy failure, it is sufficient for just one of the necessary conditions to fail. It does not matter whether this is

–       climate sensitivity being much lower than assumed

–       or adaptation at the non-governmental local level is much more effective than assumed

–       or the net adverse consequences of any given amount of warming are grossly exaggerated

–       or the theoretical economic case for policy is flawed (such as demand for energy is far more inelastic with respect to price over time than assumed, or that renewable energy is not a close substitute to fossil fuel energy)

–       or the actual policy enacted does not encapsulate the economic theory, diluting or nullifying the effectiveness

–       or a unilateralist policy, where success requires the vast majority of the biggest economies to participate

–       or the policy on paper is potentially successful, but it is not project managed to drive through the maximum benefits at least cost

I Levels of evidence

In legal systems, especially in criminal law, it has long been recognized that there are different qualities of evidence. The strongest is DNA, fingerprints, or catching somebody in the act. Then there is secondary evidence from witnesses. Then there is circumstantial evidence, such as the accused being near the scene at the time, with no clear reason to be there. The lowest form of evidence, and usually rejected, is hearsay: the unsupported opinions of people with little direct involvement in the case. The judicial process also gives greater weight to evidence that is corroborated by other pieces of evidence, while evidence that on its own seems quite strong is downgraded or ruled out by contrary evidence or alternative explanations.

J Values of the Legal Process in Reverse

Climate science fails to grapple with the grading of evidence, as some of its strongest arguments – consensus amongst scientists – are actually hearsay. Improving the quality of evidence would mean critically examining past forecasts in the light of evidence. In the judicial process, creating prejudice in the eyes of the jury against the defendant, or seeking to deny the accused a defence, is forcefully dealt with. In climate science, creating prejudice against, and denying a voice to, those who question the climate change dogmas is viewed as part of the cause.

K Underdetermination Thesis

“The underdetermination thesis – the idea that any body of evidence can be explained by any number of mutually incompatible theories”

Quote from Kuhn vs Popper – Steve Fuller 2003

The global warming hypothesis is but one of a number of hypotheses trying to explain why climate changes over time. The problem is not just the potential number of competing theories. It is that there might be a number of different elements influencing climate, with the various weightings dependent on the method and assumptions of the analysis. It is not just a matter of determining which one, but which ones, to what extent, and how they interact.

L Vulnerability

Every scientific hypothesis is vulnerable to being refuted. Human-caused catastrophic global warming (CAGW) is based on extremely tentative assumptions, and is a forecast of future events. As the warming of the past one hundred years is tiny compared with that forecast to happen in the future, and as that warming is partly obscured by natural variations, the signal of future catastrophe will be weak. The issue is further clouded by the lack of long periods of data on climate variability from before human emissions became significant – that is, data prior to 1945, when the post-war economic boom led to a huge increase in human emissions. Even assuming the forecasts of CAGW are correct, the hypothesis is incredibly vulnerable to rejection.
But if CAGW is false, or massively exaggerated, then the hypothesis is deeply susceptible to confirmation bias by those who only look to find evidence of its truth. The core belief of climate science is that the catastrophist hypothesis is true and the job of the “science” is to reveal this truth. The core mission of many co-believers is to stop any questioning of these core beliefs. The alternative view is that evidence for CAGW has become stronger over the last twenty-five years, making the hypothesis less vulnerable over time. This can be tested by looking at the success of the short-term predictions.

M The Null Hypothesis

Wikipedia’s definition is

In statistical inference of observed data of a scientific experiment, the null hypothesis refers to a general or default position: that there is no relationship between two measured phenomena… Rejecting or disproving the null hypothesis – and thus concluding that there are grounds for believing that there is a relationship between two phenomena… – is a central task in the modern practice of science, and gives a precise sense in which a claim is capable of being proven false.

It applies to AGW theory, as the hypotheses are empirical relationships. With highly complex, and essentially chaotic, systems it is only by confronting the data using a battery of statistical tests that you can disprove the null hypothesis. Without the null hypothesis, and without such rigorous testing, all the data and observations will only confirm what you want to believe. Some of the best established empirically-based hypotheses, like “HIV causes AIDS” and “long-term heavy smoking significantly reduces life expectancy” have been confronted with the null hypothesis many times against large, high quality data sets. At extremely high levels of significance, the null hypothesis of no relationship can be rejected.

It could be claimed that the null hypothesis is not applicable to AGW theory, as it forecasts something much worse happening than has so far been experienced. However, it is all the more important because of this. There is no bridge between reality and the theoretical relationships (with assumed magnitudes) in the climate models. The null hypothesis (general or default position) for testing against actual data is not that there is no relationship, but the double negative of no non-trivial relationship. So the null hypothesis for testing “CO2 causes warming” is not “CO2 does not affect temperature”, but “CO2 has no non-trivial impact on warming”. The reason is that the claimed requirement for policy is avoidance of a climate catastrophe, which requires relationships that are non-trivial in magnitude.
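As a rough illustration of the distinction, a “no non-trivial relationship” null can be framed as a one-sided test of a regression slope against a threshold rather than against zero. The sketch below uses synthetic data and an arbitrary triviality threshold; it is not a test of any real climate series, and the threshold value is an assumption for illustration only.

```python
import numpy as np

def slope_test(x, y, h0_slope=0.0):
    """One-sided t-test of the OLS slope against a null value.

    Tests H0: slope <= h0_slope against H1: slope > h0_slope.
    Setting h0_slope above zero turns the usual 'no relationship'
    null into a 'no non-trivial relationship' null.
    Returns (estimated slope, t statistic, degrees of freedom).
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    b = np.cov(x, y, bias=True)[0, 1] / np.var(x)   # OLS slope
    a = y.mean() - b * x.mean()                     # intercept
    resid = y - (a + b * x)
    # standard error of the slope: sqrt( SSE/(n-2) / sum((x - xbar)^2) )
    se = np.sqrt(resid @ resid / (n - 2) / (np.var(x) * n))
    return b, (b - h0_slope) / se, n - 2

# Synthetic example: a weak (trivial) underlying relationship
rng = np.random.default_rng(0)
x = np.arange(100.0)
y = 0.01 * x + rng.normal(0, 1, 100)

b, t_zero, df = slope_test(x, y, h0_slope=0.0)   # classic 'no relationship' null
_, t_triv, _ = slope_test(x, y, h0_slope=0.02)   # 'no non-trivial relationship' null
# A weak slope can reject the zero null yet still fail against the
# non-trivial threshold: the t statistic is necessarily smaller.
```

The point of the sketch is only that the two nulls are different tests: the same data can comfortably reject one while failing the other.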

Was the twentieth century warming mostly due to human emissions?

There has been no statistically significant warming for at least 15 years. Yet some people, like commentator “Michael the Realist”, who is currently trolling Joanne Nova’s blog, are claiming otherwise. His full claims are as follows

Again look at the following graph.

Now let me explain it to the nth degree.
# The long term trend over the whole period is obviously up.
# The long term trend has pauses and dips due to natural variations but the trend is unchanged.
# The current period is at the top of the trend.
# 2001 to 2010 is the hottest decade on the record despite a preponderance of natural cooling trends. (globally, ocean, land and both hemispheres)
# Hotter than the previous decade of 1991 to 2000 with its preponderance of natural warming events.
# Every decade bar one has been hotter than the previous decade since 1901.

Please explain why the above is true if not AGW with proof.

The claims against the warming standstill I will deal with in a later posting. Here I will look at whether the argument proves, beyond reasonable doubt, that AGW exists and is significant.

There might be a temperature series, but there is no data on greenhouse gases: there is data on the outcome, but no presentation of data on the alleged cause. It is like a prosecution conducting a murder trial with a dead body, but with the cause of death not established and no evidence presented linking the accused to the death. So I will have to fill this gap in. The alleged cause of most of the twentieth century global warming is human greenhouse gas emissions, the primary one being CO2. First, I will compare estimated global CO2 emissions with the warming trend. Second, I will show evidence that the twentieth century warming is nothing exceptional.

The relationship of CO2 emissions to average temperature is weak

Some time ago I downloaded estimates of national CO2 emissions from what is now the CDIAC website, then filled in my own estimates for all major countries where there were data gaps, using the patterns of other countries and my knowledge of economic history. This shows steady growth up to 1945 (with dips in WW1, the Great Depression and at the end of WW2). The post-war economic boom, the 1973 oil crisis, the recession of 1980-81 and the credit crunch of 2008 are all clearly visible. The series therefore seems reasonable, and is not too dissimilar from the growth in atmospheric CO2 levels.

I have charted the growth in human CO2 emissions against the HADCRUT3 data, putting them on a comparative scale. The 5 year moving average temperature increased by around 0.5°C between 1910 and 1944 and 0.6°C between 1977 and 2004. In the former period, estimated CO2 emissions increased from 0.8 to 1.4 giga tonnes; in the latter, from 4.9 to 7.4 giga tonnes. In the period in between, the 5 year moving average temperature decreased very slightly, while CO2 emissions increased from 1.4 to 4.9 giga tonnes. 1945 and the late 1990s have two things in common – the start of a stall in average surface temperature increases and an acceleration in the rate of increase of CO2 emissions. On the face of it, in so far as there is a relationship between CO2 emissions and temperature, it seems to be a pretty weak one.
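The 5 year moving averages used in the comparison above can be computed with a short routine. The anomaly figures below are made up for illustration; they are not actual HADCRUT3 values.

```python
import numpy as np

def moving_average(series, window=5):
    """Centred moving average; result is shorter by window - 1 points."""
    kernel = np.ones(window) / window
    return np.convolve(series, kernel, mode="valid")

# Illustrative (made-up) annual temperature anomalies, not real HADCRUT3 data
anoms = np.array([-0.3, -0.25, -0.35, -0.2, -0.3, -0.15, -0.1, -0.2, -0.05, 0.0])
smooth = moving_average(anoms, window=5)
# len(smooth) == len(anoms) - 4; each value averages 5 consecutive years
```

The same routine applied to an emissions series puts both on the smoothed basis used in the chart.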

The longer view

The case for claiming human emissions affect temperature is even weaker if you take a longer perspective. Human CO2 emissions were negligible before the industrial revolution, yet there is plenty of evidence that temperatures have shown larger fluctuations over the last couple of millennia. Four examples are Law Dome, Esper et al 2012, Gergis et al 2012 and the CO2 Science website.

The Law Dome ice cores are the highest quality ice cores in Antarctica.

There seems to be no warming there at all. With 75% of the global ice pack in Antarctica, it is fortunate that there is nothing exceptional about Antarctic warming. But maybe the Arctic is different.

Esper et al 2012, published in Nature, has the following Summer temperature reconstruction for Northern Scandinavia over two millennia.

There is a twentieth century uptick, but only in the context of a long term cooling trend.

Focussing on the last 130 years shows something at odds with the global position.

The highest temperatures were in the 1930s, just like the record temperatures in the USA. The warming trend from the mid-1970s is still far greater than the global average, but less than the warming trend of the early twentieth century. It corroborates data showing that recent warming trends are higher in the Arctic than the global average, but also suggests that there is nothing unprecedented in these trends.

I find the most convincing evidence is from the withdrawn Gergis 2012 temperature reconstruction for the combined land and oceanic region of Australasia (0°S-50°S, 110°E-180°E). This is because it set out with the aim of showing the opposite – that the recent warming was much more significant than anything in the last millennium. Despite breaking their own selection rules for proxies, they managed only to demonstrate that the last decade of the last millennium was the warmest, and by the narrowest of margins. See below.

There are many reasons to reject the paper (see here), but one significant point can be illustrated. Only three reconstructions had any data prior to 1430: two tree ring studies (from Tasmania and New Zealand) and a coral study from Palmyra Atoll. Plotting the decadal averages shows that the erratic Palmyra data suppresses the medieval period and exaggerates the late twentieth century warming. Further, Palmyra Atoll is over 2000 km outside the study area.

Finally, the CO2 Science website specialises in accumulating evidence of the impacts of CO2. It also has a database of studies on the medieval warm period, and a graph that summarizes the quantitative studies.

Figure Description: The distribution, in 0.5°C increments, of Level 1 Studies that allow one to identify the degree by which peak Medieval Warm Period temperatures either exceeded (positive values, red) or fell short of (negative values, blue) peak Current Warm Period


In conclusion, on the face of it, there is very weak support for human emissions being the cause of most of the warming in the last century, given that changes in human emissions do not appear to move in line with changes in temperature. The case is further weakened by evidence that some periods in the last 2000 years were warmer than the current period. This does not discount the possibility that human emissions are responsible for some of the warming. But demonstrating that empirically would mean understanding and accurately measuring the full extent of the natural processes, then demonstrating that these were not operating as strongly as in previous epochs. By definition, the evidence will be more circumstantial than if there were a direct correlation. Furthermore, the larger the actual human impact, the more circumstantial the evidence will be.

Gergis 2012 Mark 2 – Hurdles to overcome

BishopHill reported yesterday on the withdrawn Gergis paper that

The authors are currently reviewing the data and methods. The revised paper will be re-submitted to the Journal of Climate by the end of July and it will be sent out for peer review again.

It is worth listing the many criticisms that have been made of the paper. There are a lot of hurdles to overcome before Gergis et al 2012 can qualify for the status of a scientific paper.

My own, quite basic, points are:-

  1. Too few proxies for such a large area – just 27 for > 5% of the globe.
  2. Even then, 6 are well outside the area.
  3. Of these six, Gergis’s table makes it appear that 3 are inside the area. My analysis is below.

  4. Despite the huge area, there are significant clusters – with massive differences between proxies at the same or nearby sites.
  5. There are no proxies from the sub-continental land mass of Australia.
  6. Need to remove the Palmyra proxy because (a) it has errant readings, (b) it fails the ‘t’ test, (c) it is > 2000km outside the area, in the Northern Hemisphere.
  7. Without Palmyra the medieval period becomes the warmest of the millennium. But with just two tree ring proxies, one at 42°S and the other at 43°S, representing a range from 0°S to 50°S, this is hardly reliable. See the sum of proxies by year. Palmyra is the coral proxy in the 12th, 14th and 15th centuries.

On top of this are Steve McIntyre’s (with assistance from JeanS and RomanM) more fundamental criticisms:-

  1. The filtering method of Gergis excluded the high quality Law Dome series, but included the lower quality Vostok data, and the Oroko tree ring proxy. McIntyre notes that Jones and Mann 2003 rejected Oroko, but included Law Dome on different criteria.
  2. Gergis’s screening correlations were incorrectly calculated. JeanS calculated them properly: only 6 out of 27 proxies passed. (NB none of the six proxies outside the area passed)

  3. Gergis initially screened 62 proxies. Given that the screening included 21 proxies that should have been excluded, should it also have included some of the 35 proxies that were screened out? We do not know, as Gergis has refused to reveal these excluded proxies.
  4. Screening creates a bias in the results in favour of the desired result if the correlation is with a short period of the data. RomanM states the issues succinctly here. My more colloquial take is that if the proxies (to some extent) randomly show a C20th uptick or not, then you will accept the proxies with a C20th uptick. If proxies show previous fluctuations (to some extent) randomly and (to some extent) independently of the C20th uptick, then those previous fluctuations will be understated. There only has to be a minor amount of randomness to show bias, given that a major conclusion was

    The average reconstructed temperature anomaly in Australasia during A.D. 1238-1267, the warmest 30-year pre-instrumental period, is 0.09°C (±0.19°C) below 1961-1990 levels.
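The screening bias described above can be demonstrated with pure noise: screen random series for correlation with a rising calibration-period target, and the average of the survivors acquires a spurious late uptick even though no series contains any signal. This is a minimal sketch with assumed window sizes and an arbitrary correlation cutoff, not a re-analysis of the Gergis screening.

```python
import numpy as np

rng = np.random.default_rng(42)
n_proxies, years = 200, 200
cal = slice(150, 200)                  # last 50 "years": calibration window
target = np.linspace(0.0, 1.0, 50)     # rising, instrumental-style target

# Pure noise "proxies": by construction they carry no climate signal
proxies = rng.normal(0.0, 1.0, (n_proxies, years))

# Screening step: keep only proxies correlating with the target in the window
corrs = np.array([np.corrcoef(p[cal], target)[0, 1] for p in proxies])
kept = proxies[corrs > 0.2]            # 0.2 is an arbitrary illustrative cutoff

composite_kept = kept.mean(axis=0)
composite_all = proxies.mean(axis=0)

# The screened composite shows a spurious rise within the calibration
# window (second half vs first half); the unscreened composite does not.
uptick_kept = composite_kept[175:].mean() - composite_kept[150:175].mean()
uptick_all = composite_all[175:].mean() - composite_all[150:175].mean()
```

Because the survivors were chosen precisely for resembling the target, their average inherits the target's shape in the calibration window while averaging to noise elsewhere, which is the mechanism by which earlier fluctuations are understated.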

UPDATE 03/08/12

The end of July submissions date seems to have slipped to the end of September.

How Gergis Suppressed The Medieval Warm Period

The now withdrawn Gergis paper proudly proclaimed in the abstract

The average reconstructed temperature anomaly in Australasia during A.D. 1238-1267, the warmest 30-year pre-instrumental period, is 0.09°C (±0.19°C) below 1961-1990 levels.

On this basis, Gergis was able to say

A preliminary assessment of the roles of solar, volcanic, and anthropogenic forcings and natural ocean–atmosphere variability is performed using CSIRO Mk3L model simulations and independent palaeoclimate records. Solar and volcanic forcing does not have a marked influence on reconstructed Australasian temperature variations, which appear to be masked by internal variability.

This conclusion is from a single rogue proxy – the coral proxy from Palmyra Atoll.

There were only three temperature proxies covering this medieval peak period. Both Mt. Read in Tasmania and Oroko in New Zealand are tree ring proxies that cover the entire millennium. The Palmyra Atoll coral proxy data covers just 398 years over 4 periods: 1151-1221, 1319-1465, 1637-1704 and 1888-1999. Following Gergis, I have calculated the decadal averages. Below is the plot from my pivot table for the three proxies.

I contend that Palmyra is distinctly “odd” due to the following.

  1. Nowhere in the world am I aware of a single instance of massive cooling during the early 13th century. If not rogue data, it must be a purely local phenomenon.
  2. Nowhere in the world am I aware of a single instance of massive cooling during the 17th century. Nor was I aware that the early 17th century had a significant warm period. If not rogue data, it must be a purely local phenomenon.
  3. The Hadcrut3 global temperature set has slight cooling at the end of the 19th century / start of the 20th century, and a warming period from 1910 to 1940 almost as large as the warming period from 1975 to 1998. If not rogue data, it must be a purely local phenomenon.
  4. The post 1960 warming trend is phenomenal. In fact it makes the twentieth century warming trend the largest of all the 27 proxies. (See table “Analysis of the Proxies in Gergis et al. 2012” below)

For these reasons it would appear to be an outlier. So what is the impact?

I looked at the decadal averages of the two tree-ring proxies and ranked the hundred decades from 1 for the highest to 100 for the lowest. I then took the decadal averages of all three proxies (adding Palmyra) and similarly ranked the results.

The change in the decadal ranking was as follows:-

The medieval warm period is suppressed, and the twentieth century is “enhanced”.
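The ranking exercise can be sketched in a few lines. The decadal values below are invented and condensed to five decades for illustration, but they show the mechanism: one erratic series with a cold medieval dip and a large late spike reshuffles which decade ranks warmest.

```python
import numpy as np

def rank_decades(values):
    """Rank decadal averages, 1 = warmest. Returns an int array of ranks."""
    order = np.argsort(-np.asarray(values))        # descending
    ranks = np.empty(len(values), dtype=int)
    ranks[order] = np.arange(1, len(values) + 1)
    return ranks

# Made-up decadal means for two well-behaved proxies and one erratic one
tree1 = np.array([0.3, 0.4, 0.1, 0.0, 0.2])     # early-warm ("medieval") shape
tree2 = np.array([0.2, 0.5, 0.0, 0.15, 0.35])
coral = np.array([0.0, -0.85, 0.0, 0.0, 1.5])   # erratic: cold dip, huge late spike

two = rank_decades((tree1 + tree2) / 2)
three = rank_decades((tree1 + tree2 + coral) / 3)
# Two proxies: the early-warm decade ranks 1. Adding the erratic series
# demotes it to last and promotes the final decade to warmest.
```

With the tree-ring pair alone, the second (medieval-style) decade is warmest; adding the erratic series flips the warmest decade to the modern end, which is the suppression/enhancement effect described in the text.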

Now let us be clear. There were 24 other proxies in the data set. However, none of the others had data prior to 1430, so the impact on the overall ranking will not be quite so marked. The fact remains, though, that the conclusion that the last decade of the 20th century is the warmest of the millennium is based on this one rogue data set.

But there are two more reasons that the Palmyra data set should not have been included in the reconstruction.

Firstly, the Gergis paper was withdrawn upon the publication of the JeanS ‘Gergis Significance’ t-values. Unsurprisingly, Palmyra was one of the proxies that failed the t-test, so it is a rogue data set. See the table below.

Secondly, is geography. The study is a “temperature reconstruction for the combined land and oceanic region of Australasia (0°S-50°S, 110°E-180°E)“. Palmyra Atoll is located at 5°52′ N, 162°06′ W, or over 2100Km (1300 miles) outside the study area.


The Palmyra Atoll coral proxy study is clearly an outlier statistically and geographically. In no way can it be considered a suitable proxy for the Australasia region, yet the major, headline, conclusion of the Gergis et al 2012 temperature reconstruction relied upon it.

Mid-Pacific Coral temperature proxies from Gergis et al. 2012

How odd is the Palmyra Atoll Coral Proxy?

In the last post I noted that there was something odd about the Palmyra proxy used in the Gergis paper, particularly in the late 20th century. This is at 5°52′ N, 162°06′ W.

There are four other coral proxies in the Mid-Pacific area: two proxy studies from Rarotonga in the Cook Islands at 21° 14′ 0″ S, 159° 47′ 0″ W and two from Fiji. For all five proxies I calculated a nine year centred moving average.

Palmyra shows a late 20th century warming trend more than twice that of the other series. Unless there is a locally recorded temperature anomaly on the atoll, this is clearly wrong. If there is a local temperature spike, then one should question why it is included in a reconstruction for which it is over 2000km outside the boundary. Either way, it should be deleted from the study.

So how reliable are coral proxies? Here we have two pairs. If they are a good proxy for temperature, then they should be a good proxy for each other. On Fiji, the studies are less than 150km apart, and on Rarotonga less than 10km apart, meaning they should be near identical. So I have plotted the differences between the moving averages.
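The pair-consistency idea can be sketched with synthetic series: two co-located proxies sharing the same underlying signal should correlate strongly, and their difference should be small relative to the signal's own variability. The signal and noise levels below are assumptions for illustration, not estimates for any real coral record.

```python
import numpy as np

rng = np.random.default_rng(1)
years = 100
temperature = np.cumsum(rng.normal(0, 0.1, years))   # hypothetical shared signal

# Two co-located "proxies": the shared signal plus independent local noise
noise = 0.05
site_a = temperature + rng.normal(0, noise, years)
site_b = temperature + rng.normal(0, noise, years)

r = np.corrcoef(site_a, site_b)[0, 1]
diff_sd = np.std(site_a - site_b)
# If both track temperature, the paired series correlate strongly and
# the spread of their difference stays near sqrt(2) * noise.
```

A pair of real proxies whose difference series is large relative to their common variability fails this check, which is the point being made about the Fiji and Rarotonga pairs.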

It is not a statistically sound method, but it is indicative of the real issues with the proxy data sets. It also seems that the further back you go, the greater the consistency. The Palmyra study has four sections, the oldest of which starts in the 12th century. Although Gergis claims to have done a series of tests for robustness, there is no correlation test over the known temperature record. Given that a central conclusion is:-

The average reconstructed temperature anomaly in Australasia during A.D. 1238-1267, the warmest 30-year pre-instrumental period, is 0.09°C (±0.19°C) below 1961-1990 levels.

Given that there is also some question over the selection of the ice core studies at Vostok in preference to the closer and more robust studies at Law Dome, the central conclusion of the study is not credible on such a small number of proxies.

Palmyra Atoll Coral Proxy in Gergis et al 2012

There is a lot of discussion on Bishop Hill (here and here) and Climate Audit of a new paper in Journal of Climate “Evidence of unusual late 20th century warming from an Australasian temperature reconstruction spanning the last millennium“, with lead author, Dr Joëlle Gergis. The reconstruction was based upon 27 climate proxies, one of which was a coral proxy from Palmyra Atoll.

There are two issues with this study.


The study is a “temperature reconstruction for the combined land and oceanic region of Australasia (0°S-50°S, 110°E-180°E)“. The study lists Palmyra Atoll as being at 6° S, 162° E, so within the study area. Wikipedia has the location at 5°52′ N, 162°06′ W, over 2100km (1300 miles) outside the study area. On a similar basis, Rarotonga in the Cook Islands (for which there are two separate coral proxy studies) is listed as being at 21° S, 160° E, again well within the study area. Wikipedia has the location at 21° 14′ 0″ S, 159° 47′ 0″ W, about 2000km (1250 miles) outside the study area. The error appears to have occurred due to a table with columns headed “Lon (°E)” and “Lat (°S)”. Along with the two ice core studies from Vostok Station, Antarctica (over 3100km, 1900 miles south of 50°S), 5 of the 27 proxies are significantly outside the region.

Temperature Reconstruction

The Palmyra Atoll reconstruction is one of just three reconstructions that have any data before 1430. From the abstract, a conclusion was

The average reconstructed temperature anomaly in Australasia during A.D. 1238-1267, the warmest 30-year pre-instrumental period, is 0.09°C (±0.19°C) below 1961-1990 levels.

From the proxy matrix I have plotted the data.

This indicates a massive change in late twentieth century temperatures, with 1996 being the most extreme on record.

The other two data sets with pre-1430 data are tree ring proxies from Mount Read, Tasmania and Oroko, New Zealand. These I have plotted with a 30 year moving average, with the data point at the last year.

There is something not right with the Palmyra Atoll proxy. The late 20th century trend is far too extreme. In the next posting I will compare to some other coral data sets.