More Coal-Fired Power Stations in Asia

A lovely feature of the GWPF site is its extracts of articles related to all aspects of climate and related energy policies. Yesterday the GWPF extracted from an opinion piece in the Hong Kong-based South China Morning Post, “A new coal war frontier emerges as China and Japan compete for energy projects in Southeast Asia”.
The GWPF’s summary:-

Southeast Asia’s appetite for coal has spurred a new geopolitical rivalry between China and Japan as the two countries race to provide high-efficiency, low-emission technology. More than 1,600 coal plants are scheduled to be built by Chinese corporations in over 62 countries. It will make China the world’s primary provider of high-efficiency, low-emission technology.

A summary point in the article is not entirely accurate. (Italics mine)

Because policymakers still regard coal as more affordable than renewables, Southeast Asia’s industrialisation continues to consume large amounts of it. To lift 630 million people out of poverty, advanced coal technologies are considered vital for the region’s continued development while allowing for a reduction in carbon emissions.

Replacing a less efficient coal-fired power station with one of the latest technology will reduce carbon (i.e. CO2) emissions per unit of electricity produced. In China, this replacement process may produce efficiency savings that outstrip the growth in power supply from fossil fuels. But in the rest of Asia, the new coal-fired power stations will be mostly additional capacity in the coming decades, so will lead to an increase in CO2 emissions. It is this additional capacity that will be primarily responsible for driving the economic growth that will lift the poor out of extreme poverty.
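As a rough illustration of why newer plants emit less CO2 per unit of electricity, emissions scale inversely with thermal efficiency. The efficiencies and the coal emission factor below are assumed, typical-order figures for illustration, not data from the article:

```python
# Illustrative only: CO2 per MWh scales inversely with plant thermal efficiency.
EMISSION_FACTOR = 94.6   # kg CO2 per GJ of coal burned (assumed typical value)
GJ_PER_MWH = 3.6         # energy content of 1 MWh of electricity

def tonnes_co2_per_mwh(thermal_efficiency):
    """CO2 emitted per MWh of electricity for a given plant efficiency."""
    fuel_gj = GJ_PER_MWH / thermal_efficiency   # fuel energy needed per MWh out
    return fuel_gj * EMISSION_FACTOR / 1000     # convert kg to tonnes

subcritical = tonnes_co2_per_mwh(0.33)   # older plant, assumed 33% efficient
ultra_super = tonnes_co2_per_mwh(0.45)   # HELE plant, assumed 45% efficient
saving = 1 - ultra_super / subcritical
print(f"{subcritical:.2f} t/MWh vs {ultra_super:.2f} t/MWh: {saving:.0%} less CO2 per unit")
```

On these assumed figures the newer plant emits roughly a quarter less CO2 per unit of electricity, which is a saving per unit, not a cut in total emissions where the plant is additional capacity.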

The newer technologies are also important for other types of emissions, particularly the particulate emissions that have caused high levels of choking pollution and smog in many cities of China and India. By using the new technologies, other countries can avoid the worst excesses of this pollution, whilst still using a cheap fuel available from many different sources of supply. The thrust in China will likely be to replace the high-pollution power stations with new technologies, or to adapt them to reduce the emissions and increase efficiencies. Politically, it is a different way of raising living standards and quality of life than by increasing real disposable income per capita.

Kevin Marshall

 

HADCRUT4, CRUTEM4 and HADSST3 Compared

In the previous post, I compared early twentieth-century warming with the post-1975 warming in the Berkeley Earth global temperature anomaly. From a visual inspection of the graphs, I determined that the greater warming in the later period is due to more land-based warming, as the warming in the oceans (70% of the global area) was very much the same. The Berkeley Earth data ends in 2013, so does not include the impact of the strong El Niño event of the last three years.

The global average temperature series page of the Met Office Hadley Centre Observation Datasets has the average annual temperature anomalies for CRUTEM4 (land-surface air temperature), HADSST3 (sea-surface temperature) and HADCRUT4 (combined). From these datasets, I have derived the graph in Figure 1.

Figure 1 : Graph of Hadley Centre annual temperature anomalies for Land (CRUTEM4), Sea (HADSST3) and Combined (HADCRUT4)

  Comparing the early twentieth century with 1975-2010:

  • Land warming is considerably greater in the later period.
  • Combined land and sea warming is slightly more in the later period.
  • Sea surface warming is slightly less in the later period.
  • In the early period, the surface anomalies for land and sea have very similar trends, whilst in the later period, the warming of the land is considerably greater than the sea surface warming.

The impact is more clearly shown with the 7-year centred moving average figures in Figure 2.

Figure 2 : Graph of Hadley Centre 7 year moving average temperature anomalies for Land (CRUTEM4), Sea (HADSST3) and Combined (HADCRUT4)
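For readers wanting to replicate the smoothing, a 7-year centred moving average takes each year's value as the mean of that year, the three years before and the three years after. A minimal sketch, using hypothetical anomaly values rather than the actual Hadley Centre data:

```python
# 7-year centred moving average: only years with a full window on both
# sides receive a smoothed value, so three years are lost at each end.
def centred_moving_average(values, window=7):
    half = window // 2
    return [sum(values[i - half:i + half + 1]) / window
            for i in range(half, len(values) - half)]

anomalies = [0.1, 0.3, 0.2, 0.4, 0.3, 0.5, 0.4, 0.6, 0.5, 0.7]  # hypothetical
smoothed = centred_moving_average(anomalies)
print(len(anomalies), len(smoothed))  # 10 input years give 4 smoothed values
```

The same function applied to the annual CRUTEM4, HADSST3 and HADCRUT4 series would reproduce the smoothing used in Figure 2.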

This is not just a feature of the HADCRUT dataset. NOAA Global Surface Temperature Anomalies for land, ocean and combined show similar patterns. Figure 3 is on the same basis as Figure 2.

Figure 3 : Graph of NOAA 7 year moving average temperature anomalies for Land, Ocean and Combined.

The major common feature is that the estimated land temperature anomalies have shown a much greater warming trend than the sea surface anomalies since 1980, but no such divergence existed in the early twentieth-century warming period. Given that the temperature data sets are far from complete in terms of coverage, and the data is of variable quality, would this divergence be reflected in the true average temperature anomalies, based on far more complete and accurate data? There are a number of alternative possibilities that need to be investigated to help determine (using beancounter terminology) whether the estimates are a true and fair reflection of the perspective that more perfect data and techniques would provide. My list might be far from exhaustive.

  1. The sea-surface temperature set understates the post-1975 warming trend due to biases within the data set.
  2. The spatial distribution of data changed considerably over time. For instance, in recent decades more data has become available from the Arctic, a region with the largest temperature increases in both the early twentieth century and post-1975.
  3. Land data homogenization techniques may have suppressed differences in climate trends where data is sparser. Alternatively, if relative differences in climatic trends between nearby locations increase over time, then the further back in time homogenization goes, the more accentuated these differences become, and therefore the greater the suppression of genuine climatic differences. These aspects I discussed here and here.
  4. There is deliberate manipulation of the data to exaggerate recent warming. Having looked at numerous examples three years ago, this is a perspective that I do not believe to have had any significant impact. However, simply believing something not to be the case, even with examples, does not mean that it is not there.
  5. Strong beliefs about how the data should look have, over time and through multiple data adjustments, created biases within the land temperature anomalies.

What I do believe is that an expert opinion as to whether this divergence between the land and sea surface anomalies is a “true and fair view” of the actual state of affairs can only be reached by a detailed examination of the data. Jumping to conclusions – which is evident from many people across the broad spectrum of opinions in the catastrophic anthropogenic global warming debate – will fall short of the most rounded opinion that can be gleaned from the data.

Kevin Marshall

 

The magnitude of Early Twentieth Century Warming relative to Post-1975 Warming

I was browsing the Berkeley Earth website and came across their estimate of global average temperature change. Reproduced as Figure 1.

Figure 1 – BEST Global Temperature anomaly

What clearly stands out is the 10-year moving average line. It clearly shows the warming in the early twentieth century (the period 1910 to 1940) being very similar to the warming from the mid-1970s to the end of the series, in both duration and magnitude. Maybe the later warming period is up to one-tenth of a degree Celsius greater than the earlier one. The period from 1850 to 1910 shows stasis or a little cooling, but with high variability. The period from the 1940s to the 1970s shows stasis or slight cooling, and low variability.

This is largely corroborated by HADCRUT4, or at least the version I downloaded in mid-2014.

Figure 2 – HADCRUT4 Global Temperature anomaly

HADCRUT4 estimates that the later warming period is about three-twentieths of a degree Celsius greater than the earlier period, and that the recent warming is slightly less than in the BEST data.

The reason for the close fit is obvious. 70% of the globe is ocean, and for that BEST uses the same HADSST dataset as HADCRUT4. Graphics of HADSST are a little hard to come by, but KevinC at skepticalscience usefully produced a comparison in 2012 of the latest HADSST3 with the previous version.

Figure 3  – HADSST Ocean Temperature anomaly from skepticalscience 

This shows the two periods having pretty much the same magnitudes of warming.

It is the land data where the differences lie. The BEST Global Land temperature trend is reproduced below.

Figure 4 – BEST Global Land Temperature anomaly

For BEST global land temperatures, the recent warming was much greater than the early twentieth-century warming, whilst the sea surface temperatures showed pretty much the same warming in the two periods. But if greenhouse gases were responsible for a significant part of global warming, then the warming for both land and sea would be greater after the mid-1970s than in the early twentieth century. Whilst there was a rise in GHG levels in the early twentieth century, it was less than in the period from 1945 to 1975, when there was no warming, and much less than the post-1975 period, when CO2 levels rose massively. Whilst there can be alternative explanations for the early twentieth-century warming and the subsequent lack of warming for 30 years (when the post-WW2 economic boom led to a continual and accelerating rise in CO2 levels), without such explanations being clear and robust, the attribution of post-1975 warming to rising GHG levels is undermined. It could be just unexplained natural variation.

However, as a preliminary to examining explanations of warming trends, as a beancounter I believe it is first necessary to examine the robustness of the figures. In looking at temperature data in early 2015, one aspect that I found unsatisfactory in the NASA GISS temperature data was the zonal data. GISS usefully divides the data into 8 bands of latitude, which I have replicated as 7-year centred moving averages in Figure 5.

Figure 5 – NASA Gistemp zonal anomalies and the global anomaly

What is significant is that some of the regional anomalies are far greater in magnitude than the global anomaly.

The most southerly is for 90S-64S, which is basically Antarctica, an area covering just under 5% of the globe. I found it odd that there should be a temperature anomaly for the region from the 1880s, when there were no weather stations recording on the frozen continent until the mid-1950s. The nearest is Base Orcadas, located at 60.8 S 44.7 W, or about 350km north of 64 S. I found that whilst the Base Orcadas temperature anomaly was extremely similar to the Antarctica zonal anomaly in the period until 1950, it was quite dissimilar in the period after.

Figure 6. Gistemp 64S-90S annual temperature anomaly compared to Base Orcadas GISS homogenised data.

NASA Gistemp has attempted to infill the missing temperature anomaly data by using the nearest data available. However, in this case, Base Orcadas appears to be climatically different from the average anomalies for Antarctica, and from the global average as well. The result is to effectively cancel out the impact of the massive warming in the Arctic on global average temperatures in the early twentieth century. A false assumption has effectively shrunk the early twentieth-century warming. The shrinkage will be small, but it undermines the claim of NASA GISS to be the best estimate of a global temperature anomaly given the limited data available.

Rather than saying that the whole exercise of determining a valid comparison of the two warming periods since 1900 is useless, I will instead attempt to evaluate how much the lack of data impacts on the anomalies. To this end, in a series of posts, I intend to look at the HADCRUT4 anomaly data. This will be a top-down approach, looking at monthly anomalies for 5° by 5° grid cells from 1850 to 2017, available from the Met Office Hadley Centre Observation Datasets. An advantage over previous analyses is the inclusion of anomalies for the 70% of the globe covered by ocean. The focus will be on the relative magnitudes of the early twentieth-century and post-1975 warming periods. At this point in time, I have no real idea of the conclusions that can be drawn from the analysis of the data.

Kevin Marshall

 

 

Ocean Impact on Temperature Data and Temperature Homogenization

Pierre Gosselin’s notrickszone looks at a new paper.

Temperature trends with reduced impact of ocean air temperature – Frank Lansner and Jens Olaf Pepke Pedersen.

The paper’s abstract:

Temperature data 1900–2010 from meteorological stations across the world have been analyzed and it has been found that all land areas generally have two different valid temperature trends. Coastal stations and hill stations facing ocean winds are normally more warm-trended than the valley stations that are sheltered from dominant oceans winds.

Thus, we found that in any area with variation in the topography, we can divide the stations into the more warm trended ocean air-affected stations, and the more cold-trended ocean air-sheltered stations. We find that the distinction between ocean air-affected and ocean air-sheltered stations can be used to identify the influence of the oceans on land surface. We can then use this knowledge as a tool to better study climate variability on the land surface without the moderating effects of the ocean.

We find a lack of warming in the ocean air sheltered temperature data – with less impact of ocean temperature trends – after 1950. The lack of warming in the ocean air sheltered temperature trends after 1950 should be considered when evaluating the climatic effects of changes in the Earth’s atmospheric trace amounts of greenhouse gasses as well as variations in solar conditions.

More generally, the paper’s authors are saying that over fairly short distances temperature stations will show different climatic trends. This has a profound implication for temperature homogenization. From Venema et al 2012.

The most commonly used method to detect and remove the effects of artificial changes is the relative homogenization approach, which assumes that nearby stations are exposed to almost the same climate signal and that thus the differences between nearby stations can be utilized to detect inhomogeneities (Conrad and Pollak, 1950). In relative homogeneity testing, a candidate time series is compared to multiple surrounding stations either in a pairwise fashion or to a single composite reference time series computed for multiple nearby stations. 

Lansner and Pedersen are, by implication, demonstrating that the principal assumption on which homogenization is based (that nearby temperature stations are exposed to almost the same climatic signal) is not valid. As a result, data homogenization will not only eliminate biases in the temperature data (such as measurement biases, the impacts of station moves and the urban heat island effect where it impacts a minority of stations) but will also adjust out actual climatic trends. Where the climatic trends are localized and not replicated in surrounding areas, they will be eliminated by homogenization. What I found in early 2015 (following the examples of Paul Homewood, Euan Mearns and others) is that there are examples from all over the world where the data suggests that nearby temperature stations are exposed to different climatic signals. Data homogenization will, therefore, cause quite weird and unstable results. A number of posts were summarized in my post Defining “Temperature Homogenisation”. Paul Matthews at Cliscep corroborated this in his post of February 2017 “Instability of GHCN Adjustment Algorithm“.
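A minimal sketch of the relative homogenization idea described in the Venema et al quote, using hypothetical numbers: the candidate station is compared to a composite reference built from its neighbours, and the largest mean shift in the difference series is flagged as a potential inhomogeneity. The point at issue is that such a test cannot, by itself, distinguish an artificial change (a station move) from a genuinely local climatic shift:

```python
# Sketch of relative homogeneity testing: candidate minus composite reference,
# then find the largest step change in the difference series. Data hypothetical.
def difference_series(candidate, neighbours):
    """Candidate minus the mean (composite reference) of nearby stations."""
    reference = [sum(vals) / len(vals) for vals in zip(*neighbours)]
    return [c - r for c, r in zip(candidate, reference)]

def largest_step(diff):
    """Return (year_index, size) of the largest mean shift in the series."""
    best = (None, 0.0)
    for k in range(1, len(diff)):
        before = sum(diff[:k]) / k
        after = sum(diff[k:]) / (len(diff) - k)
        if abs(after - before) > abs(best[1]):
            best = (k, after - before)
    return best

# The candidate warms 0.5C relative to its neighbours from year 5; the test
# flags this as an inhomogeneity whether or not the shift is genuinely climatic.
candidate = [0.0, 0.1, 0.0, 0.1, 0.0, 0.6, 0.5, 0.6, 0.5, 0.6]
neighbours = [[0.0, 0.1, 0.0, 0.1, 0.0, 0.1, 0.0, 0.1, 0.0, 0.1],
              [0.1, 0.0, 0.1, 0.0, 0.1, 0.0, 0.1, 0.0, 0.1, 0.0]]
idx, size = largest_step(difference_series(candidate, neighbours))
print(idx, round(size, 2))  # breakpoint flagged at year 5
```

Real homogenization algorithms use statistical significance tests and multiple reference series, but the core logic of comparing to neighbours is the same.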

During my attempts to understand the data, I also found that those who support AGW theory not only do not question their assumptions but also have strong shared beliefs in what the data ought to look like. One of the most significant in this context is a Climategate email sent on Mon, 12 Oct 2009 by Kevin Trenberth to Michael Mann of Hockey Stick fame, and copied to Phil Jones of the Hadley centre, Thomas Karl of NOAA, Gavin Schmidt of NASA GISS, plus others.

The fact is that we can’t account for the lack of warming at the moment and it is a travesty that we can’t. The CERES data published in the August BAMS 09 supplement on 2008 shows there should be even more warming: but the data are surely wrong. Our observing system is inadequate. (emphasis mine)

Homogenizing data a number of times, and evaluating the unstable results in the context of strongly-held beliefs, will bring the trends ever more into line with those beliefs. There is no requirement for some sort of conspiracy behind deliberate data manipulation for this emerging pattern of adjustments. Indeed, a conspiracy in terms of a group knowing the truth and deliberately perverting that evidence does not really apply. Another reason the conspiracy does not apply is the underlying purpose of homogenization. It is to allow a temperature station to be representative of the surrounding area. Without that, it would not be possible to compile an average for the surrounding area, from which the global average is constructed. It is this requirement, in the context of real climatic differences over relatively small areas, that I would suggest leads to the deletions of “erroneous” data and the infilling of estimated data elsewhere.

The gradual bringing of the temperature data sets into line with beliefs is most clearly shown in the NASA GISS temperature data adjustments. Climate4you produces regular updates of the adjustments since May 2008. Below is the March 2018 version.

The reduction of the 1910 to 1940 warming period (which is at odds with theory) and the increase in the post-1975 warming phase (which correlates with the rise in CO2) supports my contention of the influence of beliefs.

Kevin Marshall

 

Climate Alarmist Bob Ward’s poor analysis of Research Data

After Christopher Booker’s excellent new Report for the GWPF “Global Warming: A Case Study In Groupthink” was published on 20th February, Bob Ward (Policy and Communications Director at the Grantham Research Institute on Climate Change and the Environment at the LSE) typed a rebuttal article “Do male climate change ‘sceptics’ have a problem with women?“. Ward commenced the article with a highly misleading statement.

On 20 February, the Global Warming Policy Foundation launched a new pamphlet at the House of Lords, attacking the mainstream media for not giving more coverage to climate change ‘sceptics’.

I will leave it to the reader to judge for themselves how misleading the statement is, by reading the report or alternatively reading his summary at Capx.co.

At Cliscep (reproduced at WUWT), Jaime Jessop has looked into Ward’s distracting claims about GWPF gender bias. This comment by Ward particularly caught my eye.

A tracking survey commissioned by the Department for Business, Energy and Industrial Strategy showed that, in March 2017, 7.6% answered “I don’t think there is such a thing as climate change” or “Climate change is caused entirely caused by natural processes”, when asked for their views. Among men the figure was 8.1%, while for women it was 7.1%.

I looked at the Tracking Survey. It is interesting that the Summary of Key Findings contains no mention of gender bias, nor of beliefs on climate change. It is only in the Wave 21 full dataset spreadsheet that you find the results of question 22.

Q22. Thinking about the causes of climate change, which, if any, of the following best describes your opinion?
[INVERT ORDER OF RESPONSES 1-5]
1. Climate change is entirely caused by natural processes
2. Climate change is mainly caused by natural processes
3. Climate change is partly caused by natural processes and partly caused by human activity
4. Climate change is mainly caused by human activity
5. Climate change is entirely caused by human activity
6. I don’t think there is such a thing as climate change.
7. Don’t know
8. No opinion

Note that the first option presented to the respondent is 5, then 4, then 3, then 2, then 1. There may, therefore, be an inbuilt bias towards overstating the support for climate change being attributed to human activity. But the data is clearly presented, so a quick pivot table was able to check Ward’s results.

The sample was of 2180 – 1090 females and 1090 males. Adding the responses to “I don’t think there is such a thing as climate change” and “Climate change is entirely caused by natural processes”, I get 7.16% for females – (37+41)/1090 – and 8.17% for males – (46+43)/1090. Clearly, Bob Ward has failed to remember what he was taught in high school about rounding.
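The raw percentages can be replicated in a few lines. The response counts are those quoted above; the weighted figures discussed below would additionally need the survey's weight column, which is not reproduced here:

```python
# Raw (unweighted) sceptic percentages from the Wave 21 response counts.
counts = {"female": {"no_such_thing": 37, "entirely_natural": 41, "n": 1090},
          "male":   {"no_such_thing": 46, "entirely_natural": 43, "n": 1090}}

sceptic_pct = {sex: round(100 * (r["no_such_thing"] + r["entirely_natural"]) / r["n"], 2)
               for sex, r in counts.items()}
print(sceptic_pct)  # {'female': 7.16, 'male': 8.17}
```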

Another problem is that this is raw data. The opinion pollsters have taken time and care to adjust for various demographic factors by adding a weighting to each line. On this basis, Ward should have reported 6.7% for females, 7.6% for males and 7.1% overall.

More importantly, if males tend to be more sceptical of climate change than females, then they should also be less alarmist than females. But the data says something different. Of the weighted responses to the most extreme alarmist option, “Climate change is entirely caused by human activity“, 12.5% of females and 14.5% of males agreed. Very fractionally, at the extreme, men are proportionately more alarmist than women, just as they are more sceptical. That is, men are slightly more extreme in their opinions on climate change (for or against) than women.

The middle ground is the response to “Climate change is partly caused by natural processes and partly caused by human activity“. The weighted response was 44.5% of females and 40.7% of males, confirming that men are more extreme in their views than women.

There is a further finding that can be drawn. The projections by the IPCC for future unmitigated global warming assume that all, or the vast majority of, global warming since 1850 is human-caused. At most 41.6% of British women and 43.2% of British men agree with this assumption that justifies climate mitigation policies.

Below are my summaries. My results are easily replicated for those with an intermediate level of proficiency in Excel.

Learning Note

The most important lesson for understanding data is to analyse that data from different perspectives, against different hypotheses. Bob Ward’s claim of a male gender bias towards climate scepticism in an opinion survey becomes, upon a slightly broader analysis, one where British males are slightly more extreme and forthright in their views than British females, whether for or against. This has parallels to my conclusion when looking at the 2013 US study The Role of Conspiracist Ideation and Worldviews in Predicting Rejection of Science – Stephan Lewandowsky, Gilles E. Gignac, Klaus Oberauer. There I found that, rather than the paper’s finding that conspiracist ideation is “associated with the rejection of all scientific propositions tested”, the data strongly indicated that people with strong opinions on one subject, whether for or against, tend to have strong opinions on other subjects, whether for or against. As with any bias of perspective (ideological, religious, gender, race, social class, national, football team affiliation etc.), the way to counter bias is to concentrate on the data. Opinion polls are a poor starting point, but at least they may report on perspectives outside of one’s own immediate belief systems.

Kevin Marshall

Sea Level Rise Projections and Policy

One blog I follow is TrustYetVerify. The latest post – Projecting sea level 300, nah, 1000 years in the future – is straightforward and highlights some significant issues for climate policy.

He compares claims by an activist in a Belgian newspaper that unmitigated climate change will result in a sea level rise of 5 metres in 300 years with a graphic from UNIPCC AR5 WG1 Chapter 13 on sea level rise, which showed at most around a 3 metre rise.

There was a good spot by Michel in relation to a graphic from a December 2017 presentation on the impacts of an 8 metre rise in sea levels by the year 3000. It was originally from a 2004 Greenpeace document, except that the earlier document also showed the impacts of current sea inundation and of a 1 metre sea level rise.

There are some lessons that can be learnt.

Marginal Difference of policy

The graphic shows sea coverage of large areas of the Netherlands that are not currently covered by sea water. To create the graphic, they have removed the dykes that have enabled the Netherlands to vastly increase its land area. This not only vastly exaggerates the impact of sea level rise, but contains the assumption that people are too dumb to counter the impact of sea level rise by building dykes higher. Given that even the exaggerated claims are 5 metres in 300 years, that means an average rate of rise of 17mm per annum and a maximum rate of maybe 30mm. What is more, any rise is predictable over maybe decades. Decisions can be made over 20-50 year timescales, which are far less onerous than taking the long-term perspective. Even if a 5 metre rise over 300 years were accurate, either building dykes now on the assumption that sea levels will be 5 metres higher, or abandoning areas that will be inundated, would cause needless costs for this generation and the next few generations.
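The average rate quoted above is simple arithmetic:

```python
# Even taking the claimed 5 metres in 300 years at face value, the implied
# average rate of rise is modest on 20-50 year planning timescales.
total_rise_mm = 5 * 1000
years = 300
average_rate = total_rise_mm / years    # mm per year
print(round(average_rate, 1))           # about 17 mm per annum
```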
There is an even greater policy assumption, which I repeatedly point out. Climate mitigation through reducing greenhouse gas emissions requires that global emissions are reduced. It does not matter whether Belgium and the Netherlands make massive cuts in their emissions if most other countries do not follow similar policies. As graphic 3.1 from the UNEP Emissions Gap Report 2017 clearly demonstrates, the net impact of all proposed policies is very little compared to doing nothing, and a long way from the 1.5°C or the 2°C targets. This is after over 20 years of annual COP meetings to obtain much bigger reductions.

The marginal impact of sea-level rise is therefore exaggerated by

  • Assuming that the existing flood defences vanish.
  • Assuming people do not build any more defences.
  • Exaggerating the projected rise.
  • Looking at a far greater timescale than rational planning ought to take place.
  • Falsely promoting emissions reductions to combat sea level rise impacts, knowing that whatever a few countries do will not make a difference to overall emissions. If significant warming is caused by human GHG emissions, and this leads to significant sea level rise, then current emissions policies are largely a waste of time.

 

Checking and Interpreting Forecasts / Projections

Consider the sea level rise graphic from UNIPCC AR5 WG1 Chapter 13 .

Consider the projections for the year 2500.

The high scenarios show a sea level rise of 1.5 to 6.5m in 2500 for >700ppm CO2.
The medium scenarios show a sea level rise of 0.2 to 2.3m in 2500 for 500-700ppm CO2.
The low scenarios show a sea level rise of 0.5 to 1.0m in 2500 for <500ppm CO2.

How can the medium scenarios project a lower bottom end than the low scenarios?

The explanation probably lies in different modelling assumptions. After all, the further a scenario departs from the current state of affairs, the greater the uncertainty range, unless you assume that the structure of the model contains truths not revealed by any observations.

Further, note that the high scenarios’ lower limit is only 30cm a century, and the top end is 1.3m a century, whilst the medium scenarios’ bottom end over five centuries is roughly the rate of sea level rise per century for the last few centuries. That is, well within the medium scenario uncertainty range is the possibility that some global warming will make no difference to the rate of sea level rise.

What I also find interesting is that under the medium scenarios Antarctica is gaining ice, hence reducing sea levels, but under the low scenarios it has no impact whatsoever. Again, this shows the different modelling assumptions used.

Concluding note

Suppose a pharmaceutical company promoted a product with clearly exaggerated claims of its effectiveness, raised false alarm about the need for the product, and deliberately played down the harms that the product could cause to the patient. Even in a world without regulations there would be an outcry, and the company would be sued. In most countries, strict regulations mean that to market a new product, the onus is on that company to demonstrate the product works, and that the side effects are known. But it is apparently alright to promote such falsehoods to “save the planet for future generations“ – and, indeed, to shout down critics as deniers of climate change.

Kevin Marshall

“We’re going to miss the 2°C Warming target” study and IPCC AR5 WG3 Chapter 6

WUWT had a post on 22nd January

Study: we’re going to miss (and overshoot) the 2°C warming target

This comment (from a University of Southampton pre-publication news release) needs some explanation to relate it to IPCC AR5.

Through their projections, Dr Goodwin and Professor Williams advise that cumulative carbon emissions needed to remain below 195-205 PgC (from the start of 2017) to deliver a likely chance of meeting the 1.5°C warming target while a 2°C warming target requires emissions to remain below 395-455 PgC.

The PgC is petagrams of carbon. For small weights, one normally uses grams; for larger weights, kilograms; for still larger weights, tonnes. (Under the Imperial measurement system, one uses ounces, pounds and tons.) So one petagram is a billion tonnes, or one gigatonne.
Following the IPCC’s convention, GHG emissions are expressed in units of CO2, not carbon, with other GHGs expressed in CO2e. So 1 PgC = 3.664 GtCO2e.

So the emissions from the start of 2017 are 715-750 GtCO2e for 1.5°C of warming and 1447-1667 GtCO2e for 2°C of warming. To make these comparable to IPCC AR5 (specifically to table 6.3 from IPCC AR5 WG3 Chapter 6, p431), one needs to adjust for two things: the IPCC’s projections are from 5 years earlier, and they are for CO2 emissions only, which are about 75% of GHG emissions.
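The carbon-to-CO2 conversion is the molar mass ratio of CO2 to carbon (44.01/12.011 ≈ 3.664), with 1 PgC equal to 1 GtC. Replicating it gives the ranges above, except that the top of the 1.5°C budget comes out at 751 rather than the 750 quoted, a small rounding difference:

```python
# Convert the paper's carbon budgets (PgC) into GtCO2e via the CO2/C mass ratio.
CO2_PER_C = 44.01 / 12.011   # ~3.664

def pgc_to_gtco2e(low, high):
    return round(low * CO2_PER_C), round(high * CO2_PER_C)

print(pgc_to_gtco2e(195, 205))   # 1.5C budget: (715, 751)
print(pgc_to_gtco2e(395, 455))   # 2C budget: (1447, 1667)
```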

The IPCC’s projections of CO2 emissions are 630-1180 GtCO2 for 1.5-1.7°C of warming and 960-1550 GtCO2 for 1.7-2.1°C of warming.

With GHG emissions roughly 50 GtCO2e a year and CO2 emissions roughly 40 GtCO2 a year, the IPCC’s figures, updated to the start of 2017 and expressed in GtCO2e, are 570-1300 GtCO2e for 1.5-1.7°C of warming and 1010-1800 GtCO2e for 1.7-2.1°C of warming.
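The adjustment can be reproduced directly: subtract five years of CO2 emissions at 40 GtCO2 a year to move the start date to 2017, then divide by 0.75 to express the CO2-only budget as all-GHG CO2e. Rounded to the nearest 10 GtCO2e this matches the figures above, except that the top of the 1.5-1.7°C range comes out at 1310 rather than 1300:

```python
# Update the IPCC AR5 CO2-only budgets to a 2017 start, expressed in GtCO2e.
CO2_PER_YEAR = 40        # GtCO2 emitted per year (rough figure from the text)
YEARS_ELAPSED = 5        # IPCC projections start 5 years earlier
CO2_SHARE_OF_GHG = 0.75  # CO2 taken as ~75% of all GHG emissions

def updated_budget_gtco2e(low, high):
    used = CO2_PER_YEAR * YEARS_ELAPSED
    to_co2e = lambda x: round((x - used) / CO2_SHARE_OF_GHG / 10) * 10
    return to_co2e(low), to_co2e(high)

print(updated_budget_gtco2e(630, 1180))   # 1.5-1.7C: (570, 1310)
print(updated_budget_gtco2e(960, 1550))   # 1.7-2.1C: (1010, 1800)
```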

Taking the mid-points of the IPCC’s and the Goodwin-Williams figures, the new projections are saying that at current emissions levels, 1.5°C will be breached four years earlier, and 2°C will be breached one year later. But the mid-points are 1.6°C and 1.9°C, so it makes no real difference whatsoever. The Goodwin-Williams figures just narrow the ranges and use different units of measure.

But there is still a major problem. Consider this mega table 6.3 reproduced, at lower quality, below.

Notice that Column A is for CO2-equivalent concentration in 2100 (ppm CO2eq). Current CO2 levels are around 405 ppm, but GHG levels are around 450 ppm CO2eq. Then notice columns G and H, with a joint heading of Concentration (ppm): Column G is for CO2 levels in 2100 and Column H is for CO2-equivalent levels. Note also that for the first few rows of data, Column H is greater than Column A, implying that sometime this century peak CO2eq levels will be higher than at the end of the century. Subject to the response period of the climate system to changes in greenhouse gas levels, and subject to the models being correct, average global temperatures could therefore exceed the projected 2100 levels. How much though?

I will use a magic equation from the skeptical science blog and (after correcting it so that a doubling of CO2 converts to exactly 3°C of warming) assume that all changes in CO2 levels instantly translate into average temperature changes. Further, I assume that other greenhouse gases are irrelevant to the warming calculation, and that peak CO2 concentrations can be calculated from peak GHG, 2100 GHG, and 2100 CO2 concentrations. I derived the following table.
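With the correction described, the equation reduces to ΔT = 3 × log2(C/280). A sketch of the peak-temperature calculation follows; the concentrations used are hypothetical, purely to illustrate the method, not values from Table 6.3:

```python
import math

# Simplified relation: 3C per doubling of CO2, applied instantly, other GHGs
# ignored. Peak CO2 is estimated by scaling the 2100 CO2 level by the ratio
# of peak GHG to 2100 GHG concentrations.
def warming(co2_ppm, co2_preindustrial=280.0, sensitivity=3.0):
    return sensitivity * math.log(co2_ppm / co2_preindustrial) / math.log(2)

co2_2100, ghg_2100, ghg_peak = 500.0, 580.0, 650.0   # ppm, hypothetical
co2_peak = co2_2100 * ghg_peak / ghg_2100

print(f"2100: {warming(co2_2100):.2f}C, peak: {warming(co2_peak):.2f}C")
```

Applying this to each row of Table 6.3 gives the peak implied temperatures discussed below.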

The 1.5°C warming scenario is actually 1.5-1.7°C warming in 2100, with a mid-point of 1.6°C. The peak implied temperatures are about 2°C.

The 2°C warming scenario is actually 1.7-2.1°C warming in 2100, with a mid-point of 1.9°C. The peak implied temperatures are about 2.3°C, with 2.0°C of warming in 2100 implying about 2.4°C peak temperature rise.

So when the IPCC talks about constraining temperature rise, it means the projected temperature rise in 2100, not stopping the global average temperature rise from breaching the 1.5°C or 2°C barriers.

Now consider the following statement from the University of Southampton pre-publication news release, emphasis mine.

“Immediate action is required to develop a carbon-neutral or carbon-negative future or, alternatively, prepare adaptation strategies for the effects of a warmer climate,” said Dr Goodwin, Lecturer in Oceanography and Climate at Southampton. “Our latest research uses a combination of a model and historical data to constrain estimates of how long we have until 1.5°C or 2°C warming occurs. We’ve narrowed the uncertainty in surface warming projections by generating thousands of climate simulations that each closely match observational records for nine key climate metrics, including warming and ocean heat content.”

Professor Williams, Chair in Ocean Sciences at Liverpool, added: “This study is important by providing a narrower window of how much carbon we may emit before reaching 1.5°C or 2°C warming. There is a real need to take action now in developing and adopting the new technologies to move to a more carbon-efficient or carbon-neutral future as we only have a limited window before reaching these warming targets.” This work is particularly timely given the work this year of the Intergovernmental Panel on Climate Change (IPCC) to develop a Special Report on the Impacts of global warming of 1.5°C above pre-industrial levels.

Summary

The basic difference between IPCC AR5 Chapter 6 Table 6.3 and the new paper is the misleading message that various emissions policy scenarios will prevent warming breaching either 1.5°C or 2°C, when the IPCC scenarios are clear that these are 2100 warming levels. The IPCC scenarios imply that before 2100 warming could peak at around 1.75°C and 2.4°C respectively. My calculations rest on assuming that (a) a doubling of CO2 gives 3°C of warming, (b) other GHGs are irrelevant, and (c) there is no significant lag between a rise in CO2 levels and the rise in global average temperature.

Kevin Marshall

Is China leading the way on climate mitigation?

At the Conversation is an article on China’s lead in renewable energy.
China wants to dominate the world’s green energy markets – here’s why is by University of Sheffield academic Chris G Pope. The article starts:-

If there is to be an effective response to climate change, it will probably emanate from China. The geopolitical motivations are clear. Renewable energy is increasingly inevitable, and those that dominate the markets in these new technologies will likely have the most influence over the development patterns of the future. As other major powers find themselves in climate denial or atrophy, China may well boost its power and status by becoming the global energy leader of tomorrow.

The effective response ought to be put into the global context. At the end of October UNEP produced its Emissions Gap Report 2017, just in time for the COP23 meeting in Bonn. The key figure on the aimed-for constraint of warming to 1.5°C to 2°C above pre-industrial levels – an “effective policy response” – is ES.2, reproduced below.

An “effective response” by any one country means at least reducing its emissions substantially by 2030, compared with now at the start of 2018. To be a world leader in the response to climate change requires reducing emissions in the next 12 years by more than the required global average of 20-30%.

Climate Action Tracker – which, unlike myself, strongly promotes climate mitigation – rates China’s overall policies as Highly Insufficient in terms of limiting warming to 1.5°C to 2°C. The reason is that, on the basis of current policies, they forecast that China’s emissions will increase in the next few years, instead of rapidly decreasing.

So why has Chris Pope got China’s policy so radically wrong? After all, I accept the following statement.

Today, five of the world’s six top solar-module manufacturers, five of the largest wind turbine manufacturers, and six of the ten major car manufacturers committed to electrification are all Chinese-owned. Meanwhile, China is dominant in the lithium sector – think: batteries, electric vehicles and so on – and a global leader in smart grid investment and other renewable energy technologies.

Reducing net emissions means not just having lots of wind turbines, hydro schemes, solar farms and electric cars. It means those renewable forms of energy replacing CO2-emitting energy sources. The problem is that renewables are adding to total energy production, alongside fossil fuels. The principal source of China’s energy for electricity and heating is coal. The Global Coal Plant Tracker at endcoal.org has some useful statistics. In terms of coal-fired power stations, China now has 922 GW of coal-fired capacity operating (47% of the global total), with a further 153 GW “Announced + Pre-permit + Permitted” (28%) and 147 GW under construction (56%). Further, from 2006 to mid-2017, China’s newly operating coal plants had a capacity of 667 GW, fully 70% of the global total. Endcoal.org estimates that coal-fired power stations account for 72% of GHG emissions from the energy sector, with the energy sector contributing 41% of global GHG emissions. With China’s coal-fired power stations accounting for 47% of the global total, and assuming similar capacity utilization, China’s coal-fired power stations account for 13-14% of global GHG emissions, or around 7 GtCO2e of around 52 GtCO2e. It does not stop there. Many homes in China use coal for domestic heating; there is a massive coal-to-liquids program (which may not be currently operating due to the low oil price); manufacturers (such as metal refiners) burn coal directly; and recently there are reports of producing gas from coal. So why would China pursue a massive renewables program?
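The 13-14% figure follows from simple multiplication of the quoted shares; a quick check of the arithmetic:

```python
# Shares as quoted from endcoal.org and the Global Coal Plant Tracker
china_coal_capacity_share = 0.47   # China's share of global coal-fired capacity
coal_share_of_energy_ghg = 0.72    # coal plants' share of energy-sector GHG emissions
energy_share_of_global_ghg = 0.41  # energy sector's share of global GHG emissions
global_ghg = 52.0                  # global GHG emissions, GtCO2e per year

# Assuming similar capacity utilization worldwide:
china_share = (china_coal_capacity_share
               * coal_share_of_energy_ghg
               * energy_share_of_global_ghg)
china_coal_ghg = china_share * global_ghg

print(f"China's coal plants: {china_share:.1%} of global GHG, ~{china_coal_ghg:.1f} GtCO2e")
```

The product comes out at just under 14% of global GHG emissions, or a little over 7 GtCO2e a year.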

Possible reasons for the Chinese “pro-climate” policies

First, there are strategic energy reasons. I believe that China does not want to be dependent on world oil price fluctuations, which could harm economic growth. China therefore builds massive hydro schemes, despite them damaging the environment and sometimes displacing hundreds of thousands of people. China also pursues coal-to-liquids programs, alongside promoting solar and wind farms. Although this duplicates effort, it means that if oil prices suffer another hike, China is more immune from the impact than it would otherwise be.

Second is an over-riding policy of fast increases in perceived living standards. For over 20 years China managed average growth rates of up to 10% per annum, increasing average incomes by up to eight times and moving hundreds of millions of people out of grinding poverty. Now that economic growth is slowing (to rates still fast by Western standards), the raising of perceived living standards is being achieved by other means. One such method is to reduce particulate pollution, particularly in the cities. The recent heavy-handed banning of coal burning in cities (with people freezing this winter) is one example. Another is the push for electric cars, with the electricity mostly coming from distant coal-fired power stations. In terms of reducing CO2 emissions, electric cars do not make sense, but they do make sense in densely-populated areas with an emerging middle class wanting independent means of travel.

Third is the push to dominate areas of manufacturing. With many countries pursuing hopeless renewables policies, the market for wind turbines and solar panels is set to increase. The “rare earths” required for wind turbine magnets, such as neodymium, are produced in large quantities in China, such as in highly polluted Baotou. With lithium (required for batteries), China might currently be only the world’s third-largest producer – some way behind Australia and Chile – but its reserves are the world’s second largest and sufficient on their own to supply current global demand for decades. With raw material supplies and low, secure energy costs from coal, along with still relatively low labour costs, China is well-placed to dominate these higher added-value manufacturing areas.

Concluding Comments

The wider evidence shows that an effective response to climate change is not emanating from China. Its current energy policies are dominated, and will continue to be dominated, by coal. This will far outweigh any apparent reductions in emissions from the manufacturing of renewables. Rather, the growth of renewables should be viewed in the context of promoting a continued rapid and secure increase in living standards for the Chinese people, whether in per capita income or in standards of the local environment.

Kevin Marshall

 

NOAA Future Aridity against Al Gore’s C20th Precipitation Graphic

Paul Homewood has taken a look at an article in yesterday’s Daily Mail – A quarter of the world could become a DESERT if global warming increases by just 2ºC.

The article states

Aridity is a measure of the dryness of the land surface, obtained from combining precipitation and evaporation.  

‘Aridification would emerge over 20 to 30 per cent of the world’s land surface by the time the global temperature change reaches 2ºC (3.6ºF)’, said Dr Manoj Joshi from the University of East Anglia’s School of Environmental Sciences and one of the study’s co-authors.  

The research team studied projections from 27 global climate models and identified areas of the world where aridity will substantially change.  

The most affected areas are parts of South East Asia, Southern Europe, Southern Africa, Central America and Southern Australia.

Now, Al Gore’s authoritative book An Inconvenient Truth makes statements first about extreme flooding, and then about aridity (pages 108-113). Flooding comes first because of a graphic of twentieth-century changes in precipitation on pages 114 & 115.

This graphic shows that, overall, the amount of precipitation has increased globally in the last century by almost 20%.

However, the effects of climate change on precipitation are not uniform. Precipitation in the 20th century increased overall, as expected with global warming, but in some regions precipitation actually decreased.

The blue dots mark the areas with increased precipitation, the orange dots those with decreases. The larger the dot, the larger the change. So, according to Nobel Laureate Al Gore, increased precipitation should be far more common than increased aridity. If all warming is attributed to human-caused climate change (as the book seems to imply), then over a third of the dangerous 2ºC of warming occurred in the 20th century. Therefore there should be considerable coherence between the recent arid areas and future arid areas.

The Daily Mail reproduces a map from the UEA, showing the high-risk areas.

There are a couple of areas with big differences.

Southern Australia

In the 20th century, much of Australia saw increased precipitation. Within the next two or three decades, the UEA projects it becoming considerably more arid. Could this change in forecast be the result of the extreme drought that broke in 2012 with extreme flooding? Certainly, the pictures of empty reservoirs taken a few years ago, alongside claims that they would likely never refill, show how false those predictions were.

One such reservoir is Lake Eildon in Victoria. Below is a graphic of capacity levels in selected years. It is possible to compare other years by following the “historical water levels for EILDON” link.

Similarly, in the same post, I linked to a statement by re-insurer Munich Re stating increased forest fires in Southern Australia were due to human activity. Not by “anthropogenic climate change”, but by discarded fag ends, shards of glass and (most importantly) fires that were deliberately started.

Northern Africa

The UEA makes no claims about increased aridity in Northern Africa, particularly with respect to the Southern and Northern fringes of the Sahara. Increasing desertification of the Sahara used to be claimed as a major consequence of climate change. In the year following Al Gore’s movie and book, the UNIPCC produced its Fourth Climate Assessment Report. The Working Group II report, Chapter 9 (page 448), on Africa made the following claim.

In other countries, additional risks that could be exacerbated by climate change include greater erosion, deficiencies in yields from rain-fed agriculture of up to 50% during the 2000-2020 period, and reductions in crop growth period (Agoumi, 2003).

Richard North took a detailed look at the background of this claim in 2010. The other African countries were Morocco, Algeria and Tunisia. Agoumi 2003 compiled three reports, only one of which – Morocco – had anything near a 50% claim. Yet Morocco seems, from Al Gore’s graphic, to have had a modest increase in rainfall over the last century.

Conclusion

The UEA’s latest doom-laden prophesy of increased aridity flies in the face of the accepted wisdom that human-caused global warming will result in increased precipitation. In two major areas (Southern Australia and Northern Africa), increased aridity is at odds with the changes in precipitation claimed by Al Gore in An Inconvenient Truth to have occurred in the 20th century. Yet over a third of the dangerous 2ºC warming limit occurred in the last century.

Kevin Marshall

 

President Trump’s Tweet on record cold in New York and Temperature Data

As record-breaking winter weather grips North-Eastern USA (and much of Canada as well), President Donald Trump has caused quite a stir with his latest Tweet.

There is nothing new in the current President’s tweets causing controversy. But this hard-hitting one highlights a point of real significance for AGW theory. After decades of human-caused global warming, record cold temperatures are more significant than record warm temperatures. Record cold can be accommodated within the AGW paradigm by claiming greater variability in climate resultant on the warming. This would be a portent of the whole climate system being thrown into chaos once some tipping point had been breached. But that would also require that warm records are
(a) far more numerous than cold records, and
(b) outstripping the old records of a few decades ago by a greater amount than the rise in average temperatures in that area.
I will illustrate with three temperature data sets I looked at a couple of years ago – Reykjavík, Iceland and Isfjord Radio and Svalbard Airport on Svalbard.

Suppose there had been an extremely high and an extremely low temperature in 2009 in Reykjavík. For the extreme high temperature to be a record, it would only have to be nominally higher than a record set in 1940, since the unadjusted average anomaly data for the two periods is about the same. If the previous record had been set in, say, 1990, a new high record would only be confirmation of a more extreme climate if it was at least 1°C higher than the previous record, as the average has risen by about that much in between. Conversely, a new cold record in 2009 could be up to 1°C higher than a 1990 low record and still count as evidence of greater climate extremes. Similarly, in the case of Svalbard Airport, new warm records in 2008 or 2009 would need to be over 4°C higher than records set around 1980, and new cold records could be up to 4°C higher than records set around 1980, to count as effective new warm and cold records.
By rebasing in terms of unadjusted anomaly data (and looking at monthly data), a very large number of possible records could be generated from one temperature station. With thousands of temperature stations with long records, it is possible to generate a huge number of “records” with which to analyze whether temperatures are becoming more extreme. Absolute new cold records should be few and far between. However, if relative cold records outstrip relative warm records, then there are questions to be asked of the average data. Similarly, if there were a lack of absolute records, or a decreasing frequency of relative records, then the belief in impending climate chaos would be undermined.
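The distinction between absolute and relative records can be sketched as follows. The threshold logic is my illustration of the idea, with made-up numbers rather than actual station data:

```python
def is_relative_warm_record(new_temp, old_record, mean_rise):
    """A new warm extreme signals a more extreme climate only if it
    beats the old record by more than the rise in the local mean."""
    return new_temp > old_record + mean_rise

def is_relative_cold_record(new_temp, old_record, mean_rise):
    """A new cold extreme counts as a relative record if it is below
    the old record plus the rise in the local mean."""
    return new_temp < old_record + mean_rise

# Illustrative Svalbard-style case: local mean up ~4C since around 1980.
# A new high only 2C above the old warm record is NOT a relative warm
# record, while a new low 3C above the old cold record IS a relative
# cold record, because both fall short of the 4C rise in the mean.
```

The same two functions applied to every station-month pair would generate the large population of relative records the paragraph above describes.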

I would not want to jump ahead with the conclusions. The most important element is to mine the temperature data and then analyze the results in multiple ways. There are likely to be surprises that could enhance understanding of climate in quite novel ways.

Kevin Marshall