Guardian Images of Global Warming Part 2 – A Starved Dead Polar Bear

In Part 2 of my look at Ashley Cooper's photographs of global warming, published in The Guardian on June 3rd, I concentrate on the single image of a dead, emaciated polar bear.
The caption reads:

A male polar bear that starved to death as a consequence of climate change. Polar bears need sea ice to hunt their main prey, seals. Western fjords of Svalbard which normally freeze in winter, remained ice free all season during the winter of 2012/13, one of the worst on record for sea ice around the island archipelago. This bear headed hundreds of miles north, looking for suitable sea ice to hunt on before it finally collapsed and died.

The US National Snow and Ice Data Center (NSIDC) has monthly maps of sea ice extent. The Western Fjords were indeed ice free during the winter of 2012/13, even in March 2013, when sea ice reaches its annual maximum. In March 2012 the Western Fjords were also ice free, as was most of the north coast. The maps are also available for March of 2011, 2010, 2009 and 2008. Of these, it is the earliest available year that seems to have had the minimum extent. Screen shots of Svalbard are shown below.

As the sea ice extent has been diminishing for years, has this impacted the polar bear population? Apparently not. A survey published late last year showed that polar bear numbers increased by 42% between 2004 and 2015 for Svalbard and the neighbouring archipelagos of Franz Josef Land and Novaya Zemlya.

Even more relevantly, studies have shown that the biggest threat to polar bears is not low sea ice levels but unusually thick spring sea ice. This affects the seal population, the main polar bear food source, at the time of year when polar bears are rebuilding fat after the long winter.
Even if diminishing sea ice is a major cause of some starvation, it may have been a greater cause in the past. There is no satellite data prior to the late 1970s, when sea ice levels started diminishing. The best proxies are average temperatures. Last year I looked at the two major temperature data sets for Svalbard, both located on the west coast where the dead polar bear was found. It would appear that there was a more dramatic rise in temperatures in Svalbard in the period 1910-1925 than in the period since the late 1970s. But in the earlier warming period polar bear numbers were likely decreasing, and continued to decrease into the later cooling period. The recovery in numbers corresponds to the recent warming period. These changes have nothing to do with average temperatures or sea ice levels. It is because until recent decades polar bears were hunted, a practice that has largely stopped.

The starvation of this pictured polar bear may have a more mundane cause. Polar bears are at the top of the food chain, relying on killing fast-moving seals for food. As a polar bear gets older it slows down, due to arthritis and muscles not working as well. As speed and agility are key factors in catching food, along with a bit of luck, starvation might be the most common cause of death in polar bears.

Kevin Marshall

Guardian Images of Global Warming Part 1 – Australian Droughts

On Friday June 3rd the Guardian presented some high quality images with the headline

Droughts, floods, forest fires and melting poles – climate change is impacting Earth like never before. From Australia to Greenland, Ashley Cooper’s work spans 13 years and over 30 countries. This selection, taken from his new book, shows a changing landscape, scarred by pollution and natural disasters – but there is hope too, with the steady rise of renewable energy.

The purpose is to convince people that human-caused climate change is happening now, to bolster support for climate mitigation policies. But the real stories of what the pictures show are quite different. I will start with three images relating to drought in Australia.

Image 5

Forest ghosts: Lake Eildon in Victoria, Australia was built in the 1950’s to provide irrigation water, but the last time it was full was in 1995. The day the shot was taken it was at 29% capacity with levels down around 75ft.

Data for Lake Eildon (accessible with a simple search for "Lake Eildon capacity") links to a graph where up to seven years of data can be compared.

In 1995 the dam was not at full capacity, but it was full, for a short period, in the following year. More recently, after the drought broke, the reservoir was pretty much full throughout 2011.

But were the low levels due to more extreme drought brought on by climate change? That is very difficult to determine, as Lake Eildon is an artificial lake, constructed to provide water for irrigation and occasional hydro-electric power, as well as recreational facilities. The near-empty levels at the end of the biggest drought in many decades could be due simply to a failure to predict the duration of the drought, or to a policy of supplying irrigation water for the maximum length of time. That water levels did not reach full capacity for many years is partly explained by a 2003 article in The Age:

The dam wall at Lake Eildon, Victoria’s biggest state-run water storage, has been declared unsafe and will need a $30 million upgrade if the lake is to be refilled.

The dam, which is at its lowest level since being completed in 1956, will be restricted to just 65 per cent capacity because it no longer meets safety standards for earthquakes and extreme floods.

Image 6

Forest destroyed by bush fires near Michelago, New South Wales, Australia.

The inference is that this is caused by global warming.

According to Munich Re

The majority of bushfires in southeast Australia are caused by human activity

Bushfire is the only natural hazard in which humans have a direct influence on the hazard situation. The majority of bushfires near populated areas are the consequence of human activity. Lightning causes the smaller portion naturally. Sometimes, a carelessly discarded cigarette or a glass shard, which can focus the sun’s rays, is all it takes to start a fire. Heat from motors or engines, or electric sparks from power lines and machines can ignite dry grass. Besides these accidental causes, a significant share of wildfires are started deliberately.

Humans also change the natural fire frequency and intensity. They decrease the natural fire frequency due to deliberate fire suppression near populated areas. If there is no fuel-reduction burning in forests for the purposes of fire prevention, large quantities of combustible material can accumulate at ground level.

Surface fires in these areas can become so intense due to the large amounts of fuel that they spread to the crowns of the trees and rapidly grow into a major fire. If humans had not intervened in the natural bushfire regime, more frequent low-intensity fires would have consumed the forest undergrowth and ensured that woodland grasses and scrubs do not proliferate excessively.

David Evans expands on the issue of fuel load in a 2013 article.

As with the water levels in an artificial lake, forest fires are strongly influenced by the management of those forests. Extinguishing forest fires before they have run their natural course results in bigger and more intense fires at a later date. More frequent or intense droughts would not change this primary cause of many of the horrific forest fire disasters seen in recent years.

Image 7

Where has all the water gone?: Lake Hume is the largest reservoir in Australia and was set up to provide irrigation water for farms further down the Murray Basin and drinking water for Adelaide. On the day this photograph was taken it was at 19.6% capacity. By the end of the summer of 2009 it dropped to 2.1 % capacity. Such impacts of the drought are likely to worsen as a result of climate change. The last time the water was anywhere near this road bridge was 10 years ago, rendering this no fishing sign, somewhat redundant.

Again this is old data. As for Lake Eildon, it is easy to construct graphs.

Following the end of the drought, the reservoir came back to full capacity. Worsening drought is only apparent to those who look over a short time range.

When looking at drought in Australia, Dorothea Mackellar’s 1908 poem “My Country” provides some context. Written for a British audience, the poem begins

I love a sunburnt country,
A land of sweeping plains,
Of ragged mountain ranges,
Of droughts and flooding rains

To understand the difference that human-caused climate change is having on the climate first requires an understanding of natural climatic variation over multiple time-scales. It then requires an understanding of how other human factors are influencing the environment, both intended and unintended.

Kevin Marshall

Are the Paris Floods due to climate changing for the worse?

The flood of the River Seine is now past the 6.1m peak reached in the early hours of Saturday 4th June. 36 hours later, the official measurements at Pont d’Austerlitz show that the level is below 5.7m. The peak was just below the 6.15m of the previous major flood in 1982, but well above the previous flood emergency in 2000, when waters peaked at 3.92m. Below is a snapshot of a continually-updated graphic at the Environment Ministry VIGICRUES site.

Despite it being 16 years since the last emergency, the reaction of the authorities has been impressive: giving people warnings of the rising levels; evacuating people; stopping all non-emergency vessels on the Seine; protecting those who live on the river; and putting into operation emergency procedures for the movement of art treasures out of basement storage in the Louvre. Without these measures the death toll and the estimated €600m cost of the flood would undoubtedly have been much higher.

The question that must be asked is whether human-caused climate change has made flooding worse on a river that has flooded for centuries. The data is hard to come by. An article in Le Figaro last year gave the top ten record floods, the worst being in 1658.

Although this does show that the current high of 6.10m is a full 50cm below the tenth worst, in 1920, there is no indication of increasing frequency.

From a 2012 report, Les entreprises face au risque inondation, I have compiled a graphic of all flood maxima of six metres or higher.

This shows that major floods were much more frequent in the period 1910 to 1960 than in the periods before or after. Superficially it would seem that flooding has recently been getting less severe. But this conclusion would ignore the many measures that were put in place after the flood of 1910. The 2014 OECD Report Seine Basin, Île-de-France: Resilience to Major Floods stated:-

Since 1910, the risk of a Seine River flood in the Ile-de-France region has been reduced in various stages by protective structures, including dams built upstream and river development starting in the 1920s, then in the 1950s up until the early 1990s. Major investments have been limited in the last decades, and it appears that protection levels are not up to the standards of many other comparable OECD countries, particularly in Europe. On the other hand, the exposure to the risk and the resulting vulnerability are accentuated by increasing urban density in the economic centre of France, as well as by the construction of a large number of activity centres and critical infrastructures (transport, energy, communications, water) along the Seine River.

If the climate impact had become more severe, then one would expect the number of major floods to increase given the limited new measures to prevent them. However, the more substantial measures taken in the last century could explain the reduced frequency of major floods, though the lack of floods between 1882 and 1910 suggests that the early twentieth century could have been an unusually wet period. Without detailed weather records my guess is that it is a bit of both. Extreme rainfall has decreased, whilst flood prevention measures have also had some impact on flood levels.

Kevin Marshall

Beliefs and Uncertainty: A Bayesian Primer

Ron Clutz’s introduction, based on a Scientific American article by John Horgan on January 4, 2016, starts to grapple with the issues involved.

The take-home quote from Horgan is on the subject of false positives.

Here is my more general statement of that principle: The plausibility of your belief depends on the degree to which your belief–and only your belief–explains the evidence for it. The more alternative explanations there are for the evidence, the less plausible your belief is. That, to me, is the essence of Bayes’ theorem.

“Alternative explanations” can encompass many things. Your evidence might be erroneous, skewed by a malfunctioning instrument, faulty analysis, confirmation bias, even fraud. Your evidence might be sound but explicable by many beliefs, or hypotheses, other than yours.

In other words, there’s nothing magical about Bayes’ theorem. It boils down to the truism that your belief is only as valid as its evidence. If you have good evidence, Bayes’ theorem can yield good results. If your evidence is flimsy, Bayes’ theorem won’t be of much use. Garbage in, garbage out.

With respect to the question of whether global warming is human-caused, there is basically a combination of three elements: (i) human causes, (ii) natural causes, (iii) random chaotic variation. There may be a number of sub-elements and an infinite number of combinations, including some elements counteracting others, such as El Nino events counteracting underlying warming. Evaluation of new evidence takes place within a community of climatologists with strong shared beliefs that at least 100% of recent warming is due to human GHG emissions. It is that same community who also decide the measurement techniques for assessing the temperature data, the relevant time frames, and the categorization of the new data. With complex decisions the only clear decision criterion is conformity to the existing consensus conclusions. As a result, the original Bayesian estimates become virtually impervious to new perspectives or evidence that contradicts those original estimates.
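
Bayes' theorem itself is simple enough to sketch in a few lines of code. Below is a minimal illustration in Python, with purely made-up numbers: it shows how a near-certain prior barely moves when each piece of contrary evidence is categorized as barely informative, which is the imperviousness described above.

```python
# Minimal sketch of Bayesian updating - all numbers purely illustrative.
def update(prior, p_e_given_h, p_e_given_not_h):
    # Bayes' theorem: P(H|E) = P(E|H).P(H) / P(E)
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

belief = 0.99  # a strong shared prior

# Contrary evidence taken at face value (three times more likely if H is false)
print(round(update(belief, 0.2, 0.6), 3))   # 0.971 - the prior starts to erode

# The same evidence re-categorized as barely informative, applied ten times
for _ in range(10):
    belief = update(belief, 0.50, 0.55)
print(round(belief, 3))                      # 0.974 - virtually unchanged
```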

Science Matters

Those who follow discussions regarding Global Warming and Climate Change have heard from time to time about the Bayes Theorem. And Bayes is quite topical in many aspects of modern society:

Bayesian statistics “are rippling through everything from physics to cancer research, ecology to psychology,” The New York Times reports. Physicists have proposed Bayesian interpretations of quantum mechanics and Bayesian defenses of string and multiverse theories. Philosophers assert that science as a whole can be viewed as a Bayesian process, and that Bayes can distinguish science from pseudoscience more precisely than falsification, the method popularized by Karl Popper.

Named after its inventor, the 18th-century Presbyterian minister Thomas Bayes, Bayes’ theorem is a method for calculating the validity of beliefs (hypotheses, claims, propositions) based on the best available evidence (observations, data, information). Here’s the most dumbed-down description: Initial belief plus new evidence = new and improved belief.   (A fuller and…


CO2 Emissions from Energy production forecast to be rising beyond 2040 despite COP21 Paris Agreement

Last week the US Energy Information Administration (EIA) published their INTERNATIONAL ENERGY OUTLOOK 2016. The Daily Caller (and the GWPF) highlighted the EIA’s summary of energy production. This shows that, despite the predicted strong growth in nuclear power and implausibly high growth in renewables, usage of fossil fuels is also predicted to rise, as shown in their headline graphic below.

For policy purposes, the important aspect is the translation into CO2 emissions. In the final chapter, Chapter 9: Energy-related CO2 Emissions, Figure 9.3 shows the equivalent CO2 emissions in billions of tonnes of CO2. I have reproduced the graphic as a stacked bar chart.

Data reproduced as a stacked bar chart.

In 2010 these CO2 emissions were just under two-thirds of total global greenhouse gas emissions. The question is how this fits with the policy requirements, derived from the IPCC’s Fifth Assessment Report, to avoid 2°C of warming. The International Energy Agency summarized the requirements very succinctly in World Energy Outlook 2015 Special Report, page 18:

The long lifetime of greenhouse gases means that it is the cumulative build-up in the atmosphere that matters most. In its latest report, the Intergovernmental Panel on Climate Change (IPCC) estimated that to preserve a 50% chance of limiting global warming to 2 °C, the world can support a maximum carbon dioxide (CO2) emissions “budget” of 3 000 gigatonnes (Gt) (the mid-point in a range of 2 900 Gt to 3 200 Gt) (IPCC, 2014), of which an estimated 1 970 Gt had already been emitted before 2014. Accounting for CO2 emissions from industrial processes and land use, land-use change and forestry over the rest of the 21st century leaves the energy sector with a carbon budget of 980 Gt (the midpoint in a range of 880 Gt to 1 180 Gt) from the start of 2014 onwards.

From the forecast above, cumulative CO2 emissions from 2014 will reach 980 Gt in 2038. Yet in 2040, there is no sign of peak emissions.
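
As a rough check on that date, here is a minimal sketch in Python. The linear emissions path – from about 36 Gt in 2014 to about 43 Gt in 2040 – is my own illustrative approximation of the shape of the EIA reference case, not figures taken from the report itself.

```python
# When does cumulative energy CO2 pass the 980 Gt budget (from start of 2014)?
# Assumed path: linear rise from ~36 Gt/yr (2014) to ~43 Gt/yr (2040) -
# an illustrative approximation, not the EIA's own numbers.
BUDGET_GT = 980.0

cumulative = 0.0
for year in range(2014, 2041):
    emissions = 36.0 + (43.0 - 36.0) * (year - 2014) / (2040 - 2014)
    cumulative += emissions
    if cumulative >= BUDGET_GT:
        print(year, round(cumulative))  # 2038 981
        break
```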

Further corroboration comes from the UNFCCC. In preparation for COP21, they produced from all the country policy proposals a snappily-titled Synthesis report on the aggregate effect of intended nationally determined contributions. The UNFCCC has updated the graphics since. Figure 2 of the 27 Apr 2016 version shows total GHG emissions, which in 2010 were about 17 Gt higher than the CO2 emissions from energy.

The graphic clearly shows that the INDCs – many with very vague and non-verifiable targets – will make very little difference to the non-policy emissions path. Yet even this small impact is contingent on those submissions being implemented in full, which is unlikely in many countries. The 2°C target requires global emissions to peak in 2016 and then head downwards. There are no additional policies even being tabled to achieve this, except possibly by some noisy, but inconsequential, activist groups. Returning to the EIA’s report, figure 9.4 splits the CO2 emissions between the OECD and non-OECD countries.

The OECD countries represent nearly all the countries that propose to reduce their CO2 emissions below the baseline 1990 level, but their emissions are forecast by the EIA still to be 19% higher in 2040. However, the increase is small compared to the non-OECD countries – which are mostly either proposing to constrain emissions growth or have no emissions policy proposals – with emissions forecast to treble in the fifty years to 2040. As a result the global forecast is for CO2 emissions to double. Even if all the OECD countries completely eliminated CO2 emissions by 2040, global emissions would still be a third higher than in 1990. As the rapid economic growth in the former Third World reduces global income inequalities, it is also reducing the inequalities in fossil fuel consumption for energy production. This will continue beyond 2040, when the OECD, with a sixth of the world population, will still produce a third of global CO2 emissions.
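
A rough consistency check on these figures, using only the ratios stated above (the 1990 regional shares are the unknowns), can be sketched as follows:

```python
# Stated growth ratios, 1990 -> 2040: OECD +19%, non-OECD trebles, global doubles.
oecd_growth, non_oecd_growth, global_growth = 1.19, 3.0, 2.0

# Solve o + n = 1 and 1.19*o + 3.0*n = 2.0 for the 1990 emission shares.
n = (global_growth - oecd_growth) / (non_oecd_growth - oecd_growth)
o = 1.0 - n
print(round(o, 2), round(n, 2))       # 0.55 0.45

# With OECD emissions eliminated entirely by 2040:
print(round(non_oecd_growth * n, 2))  # 1.34 - about a third above 1990
```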

Unless the major emerging economies peak their emissions in the next few years, then reduce the emissions rapidly thereafter, the emissions target allegedly representing 2°C or less of global warming by 2100 will not be met. But for countries like India, Vietnam, Indonesia, Bangladesh, Nigeria, and Ethiopia to do so, with the consequent impact on economic growth, is morally indefensible.

Kevin Marshall

 

Insight into the mindset of FoE activists

Bishop Hill comments about how

the Charities Commissioners have taken a dim view of an FoE leaflet that claimed that silica – that’s sand to you or me – used in fracking fluid was a known carcinogen.

Up pops an FoE activist making all sorts of comments, including attacking the host’s book The Hockey Stick Illusion. Below is my comment.

Phil Clarke’s comments on the host’s book are an insight into green activists.
He says (Jan 30, 2016 at 9:58 AM):

So you’ve read HSI, then?
I have a reading backlog of far more worthwhile volumes, fiction and non-fiction. Does anybody dispute a single point in Tamino’s adept demolition?

and

Where did I slag off HSI? I simply trust Tamino; the point about innuendo certainly rings true, based on other writings.
So no, I won’t be shelling out for a copy of a hatchet job on a quarter-century old study. But I did read this, in detail
http://www.nature.com/ngeo/journal/v6/n5/full/ngeo1797.html

Tamino’s article was responded to twice by Steve McIntyre. The first response looks at the use of non-standard statistical methods; the second, Re-post of “Tamino and the Magic Flute”, simply repeats a post of two years before. Tamino had ignored previous rebuttals. A simple illustration is the Gaspé series that Tamino defends. He misses out many issues with this key element in the reconstruction, including that a later sample from the area failed to show a hockey stick.
So Phil Clarke has attacked a book that he has not read, based on a biased review whose author shares his own prejudices. He ignores the counter-arguments, just as the review’s author does. Says a lot about the rubbish Cuadrilla are up against.

Kevin Marshall

William Connolley is on the side of anti-science, not the late Bob Carter

In the past week there have been a number of tributes to Professor Bob Carter, retired Professor of Geology and leading climate sceptic. These include Jo Nova, James Delingpole, Steve McIntyre, Ian Plimer at the GWPF, Joe Bast of The Heartland Institute and E. Calvin Beisner of Cornwall Alliance. In complete contrast, William Connolley posted this comment in a post titled Science advances one funeral at a time:

Actually A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it, but I’m allowed to paraphrase in titles. And anyway he said it in German, naturally. Today brings us news of another such advancement in science, with the reported death of Robert Carter.

Below is a comment I posted at Climate Scepticism

I believe Max Planck did have a point. In science people tenaciously hold onto ideas even if they have been falsified by the evidence or (as more often happens) supplanted by better ideas. Where the existing ideas form an institutionalized consensus, discrimination occurs against those whose hypotheses could undermine that consensus. It can be that a new research paradigm only gains prominence when the numbers in the old paradigm dwindle. As a result the advance of new knowledge and understanding is held back.

To combat this innate conservatism in ideas I propose four remedies.

First is to promote methods of evaluating competing theories that are independent of consensus or opinion. In pure science that is by conducting experiments that could falsify a hypothesis. For complex concepts where experiment is not possible and data is incomplete and of poor quality, like the AGW hypothesis or economic theories, comparative analysis needs to be applied, based upon independent standards.

Second is to recognize institutional bias by promoting pluralism and innovation.

Third is to encourage better definition of concepts and more rigorous standards of data within the existing research paradigm, to push its boundaries.

Fourth is to train people to separate scientific endeavours from belief systems, whether religious, political or ethical.

The problem for William Connolley is that all his efforts within climatology – such as editing Wikipedia to fit his narrow views, or helping set up Real Climate to save the Mannian hockey stick from exposure of its many flaws – are directed at enforcing the existing paradigm and blocking any challenges. He is part of the problem that Planck was talking about.

As an example of the narrow and dogmatic views that Connolley supports, here is the late Bob Carter on his major point about how beliefs in unprecedented human-caused warming are undermined by the long-term temperature proxies from ice core data. The video quality is poor, probably due to a lack of professional funding that Connolley and his fellow-travellers fought so hard to deny.

Kevin Marshall

Shotton Open Cast Coal Mine Protest as an example of Environmental Totalitarianism

Yesterday, in the Greens and the Fascists, Bishop Hill commented on Jonah Goldberg’s book Liberal Fascism. In summing up, BH stated:-

Goldberg is keen to point out that the liberal and progressive left of today do not share the violent tendencies of their fascist forebears: theirs is a gentler totalitarianism (again in the original sense of the word). The same case can be made for the greens. At least for now; it is hard to avoid observing that their rhetoric is becoming steadily more violent and the calls for unmistakably fascist policy measures are ever more common.

The link is to an article in the Ecologist (reprinted from the Open Democracy blog) – “Coal protesters must be Matt Ridley’s guilty conscience”

The coal profits that fill Matt Ridley’s bank account come wet with the blood of those killed and displaced by the climate disaster his mines contribute to, writes T. If hgis consicence is no longer functioning, then others must step into that role to confront him with the evil that he is doing. (Spelling as in the original)

The protest consisted of blocking the road for eight hours to Shotton open cast coal mine. The reasoning was

This was an effective piece of direct action against a mine that is a major contributor to climate disaster, and a powerful statement against the climate-denying Times columnist, Viscount Matt Ridley, that owns the site. In his honour, we carried out the action as ‘Matt Ridley’s Conscience’.

The mine produces about one million tonnes of coal a year out of 8,000 million tonnes globally. The blocking may have reduced annual output by 0.3%. This will be made up from the mine, or from other sources. Coal is not the only source of greenhouse gas emissions, so the coal from this mine results in less than 0.004% of global greenhouse gas emissions. Further, the alleged impact of GHG emissions on the climate is cumulative. The recoverable coal at Shotton is estimated at 6 million tonnes, or 0.0007% of the estimated global reserves of 861 billion tonnes (page 5). These global reserves could increase as new deposits are found, as has happened in the recent past for coal, gas and oil. So far from being “a major contributor to climate disaster”, Shotton Open Cast Coal Mine is a drop in the ocean.
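
A minimal sketch of that arithmetic, using only the figures stated above:

```python
# Shotton's share of global coal, using the figures quoted in the post.
mine_annual_mt = 1.0            # Shotton output, million tonnes per year
global_annual_mt = 8_000.0      # global output, million tonnes per year
recoverable_mt = 6.0            # recoverable coal at Shotton, million tonnes
global_reserves_mt = 861_000.0  # 861 billion tonnes, in million tonnes

print(f"{mine_annual_mt / global_annual_mt:.4%}")    # 0.0125% of annual output
print(f"{recoverable_mt / global_reserves_mt:.4%}")  # 0.0007% of reserves
```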

But is there a climate disaster of which Matt Ridley is in denial? Anonymous author and convicted criminal T does not offer any evidence of current climate disasters. He is not talking about modelled projections, but currently available evidence. So where are all the dead bodies, or the displaced persons? Where are the increased deaths through drought-caused famines? Where are the increased deaths from malaria or other diseases from warmer and worsening conditions? Where is the evidence of increased deaths from extreme weather, such as hurricanes? Where are the refugees from drought-stricken areas, or from low-lying areas now submerged beneath the waves?

The inability to evaluate the evidence is shown by this comment:

Ridley was ( … again) offered a platform on BBC Radio 4 just a week before our hearing, despite his views being roundly debunked by climate scientists.

The link leads to a script of the Radio 4 interview, with annotated comments. I am not sure that all the collective brains do debunk (that is, expose the falseness or hollowness of an idea or belief) Matt Ridley’s comments. Mostly it is based on nit-picking or pointing out contradictions with their own views and values. Among the 75 comments there are two extreme examples I would like to highlight.

First, Matt Ridley mentioned the hockey stick graphs and the work of Steve McIntyre in exposing the underlying poor data. The lack of a medieval warm period would provide circumstantial (or indirect) evidence that the warming of the last 200 years is unprecedented. Gavin Schmidt responded with comments (5) and (6), shown below.

Schmidt is fully aware that Steve McIntyre also examined the Wahl and Amman paper and thoroughly discredited it. In 2008 Andrew Montford wrote a long paper on the shenanigans that went into the publication of the paper, and its lack of statistical significance. Following from this, Montford wrote The Hockey Stick Illusion in 2010, which was reviewed by Tamino of RealClimate. Steve McIntyre was able to refute the core arguments in Tamino’s polemic by reposting Tamino and the Magic Flute, which was written in 2008 and covered all the substantial arguments that Tamino made. Montford’s book further shows a number of instances where peer review in academic climatology journals is not a quality control mechanism, but more a device for discriminating between those who support the current research paradigm and those who would undermine that consensus.

Comment 6 concludes

The best updates since then – which include both methodology improvements and expanded data sources – do not show anything dramatically different to the basic picture shown in MBH.

The link is to Chapter 5 of the IPCC AR5 WG1 assessment report. The paleoclimate discussion is a small subsection, a distinct reversal from the prominent place given to the original hockey stick in the third assessment report of 2001. I would contend the picture is dramatically different. Compare the original hockey stick of the past 1,000 years with Figure 5.7 on page 409 of AR5 WG1 Chapter 5.

In 2001, the MBH reconstruction was clear. From 1900 to 2000 average temperatures in the Northern Hemisphere rose by over 1°C, far more than the change in any other century. But in at least two of the reconstructions – Ma08eivl and Lj10cps – there have been similarly sized fluctuations in other periods. The evidence now seems to back up Matt Ridley’s position of some human influence on temperatures, but does not support the contention of unprecedented temperature change. Gavin Schmidt’s opinions are not those of an expert witness, but of a blinkered activist.

Schmidt’s comments on hockey stick graphs are nothing compared to comment 35

The Carbon Brief (not the climate scientists) rejects evidence that contradicts its views based on nothing more than ideological prejudice. A search for Indur Goklany will find his own website, where he has copies of his papers. Under the “Climate Change” tab is not only the 2009 paper, but a 2011 update – Wealth and Safety: The Amazing Decline in Deaths from Extreme Weather in an Era of Global Warming, 1900–2010. Of interest are a table and a figure.

Table 2 is a reproduction of World Health Organisation data from 2002. It clearly shows that global warming is well down the list of causes of deaths. Goklany states in the article why even these figures are based on dubious assumptions. Anonymous T falsely believes that global warming is currently a major cause of deaths.

Figure 6 for the period 1990-2010 shows

  • the Global Death and Death Rates per million Due to Extreme Weather Events
  • CO2 Emissions
  • Global average GDP Per Capita

Figure 6 provides strong empirical evidence that increasing CO2 emissions (about 70-80% of total GHG emissions) have not caused increased deaths. They are a consequence of increasing GDP per capita, which, as Goklany argues, has resulted in fewer deaths from extreme weather. More importantly, increasing GDP has resulted in increased life expectancy and reductions in malnutrition and in deaths that can be averted by access to rudimentary health care. Anonymous T would not know this even if he had read all the comments, yet it completely undermines the beliefs that caused him to single out Matt Ridley.

The worst part of Anonymous T’s article

Anonymous T concludes the article as follows (Bold mine)

The legal process efficiently served its function of bureaucratising our struggle, making us attempt to justify our actions in terms of the state’s narrow, violent logic. The ethics of our action are so clear, and declaring myself guilty felt like folding to that.

We found ourselves depressed and demoralised, swamped in legal paperwork. Pleading guilty frees us from the stress of a court case, allowing us to focus on more effective arenas of struggle.

I faced this case from a position of relative privilege – with the sort of appearance, education and lawyers that the courts favour. Even then I found it crushing. Today my thoughts are with those who experience the racism, classism and ableism of the state and its laws in a way that I did not.

That reflection makes me even more convinced of the rightness of our actions. Climate violence strikes along imperialist lines, with those least responsible, those already most disadvantaged by colonial capitalism, feeling the worst impacts.

Those are the people that lead our struggle, but are often also the most vulnerable to repression in the struggle. When fighting alongside those who find themselves at many more intersections of the law’s oppression than I do, I have a responsibility to volunteer first when we need to face up to the police and the state.

Faced with structural injustice and laws that defend it, Matt Ridley’s Conscience had no choice but to disobey. Matt Ridley has no conscience and neither does the state nor its system of laws. Join in. Be the Conscience you want to see in the world.

The writer rejects the rule of law, and is determined to carry out more acts of defiance against it. He intends to commit more acts of violence, with “climate” as a cover for revolutionary Marxism. Further, the writer is trying to incite others to follow his lead. He claims to know Matt Ridley’s Conscience better than Ridley himself, but in the next sentence claims that “Matt Ridley has no conscience“. Further, this statement would seem to contradict a justification for the criminal acts allegedly made in Bedlington Magistrates Court on December 16th:
that the protesters were frustrated by the lack of UK Government action to combat climate change.

It is not clear who is the author of this article, but he/she is one of the following:-

Roger Geffen, 49, of Southwark Bridge Road, London.

Ellen Gibson, 21, of Elm Grove, London;

Philip MacDonald, 28, of Blackstock Road, Finsbury Park, London;

Beth Louise Parkin, 29, of Dodgson House, Bidborough Street, London;

Pekka Piirainen, 23, of Elm Grove, London;

Thomas Youngman, 22, of Hermitage Road, London.

Laurence Watson, 27, of Blackstock Road, Finsbury Park, London;

Guy Shrubsole, 30, of Bavent Road, London;

Lewis McNeill, 34, of no fixed address.

Kevin Marshall

aTTP falsely attacks Bjorn Lomborg’s “Impact of Current Climate Proposals” Paper

The following is a comment to be posted at Bishop Hill, responding to another attempt by blogger ….andThenThere’sPhysics to undermine the work of Bjorn Lomborg. The previous attempt was discussed here. This post includes a number of links, as well as a couple of illustrative screen captures at the foot of the post.

aTTP’s comment is

In fact, you should read Joe Romm’s post about this. He’s showing that the INDCs are likely to lead to around 3.5C which I think is relative to something like the 1860-1880 mean. This is very similar to the MIT’s 3.7, and quite a bit lower than the RCP8.5 of around 4.5C. So, yes, we all know that the INDCs are not going to do as much as some might like, but the impact is likely to be a good deal greater than that implied by Lomborg who has essentially assumed that we get to 2030 and then simply give up.

Nov 11, 2015 at 9:31 AM | …and Then There’s Physics

My Comment

aTTP at 9.31 refers to Joe Romm’s blog post of Nov 3, “Misleading U.N. Report Confuses Media On Paris Climate Talks“. Romm uses Climate Interactive’s Climate Scoreboard tool to show that the INDC submissions (if fully implemented) will result in 3.5°C of warming, as against 4.5°C in the non-policy “No Action” scenario. This 1.0°C difference is six times the maximum impact of 0.17°C claimed in Lomborg’s new paper. Who is right? What struck me first was that Romm’s first graph, copied straight from Climate Interactive, seems to have a very large estimate for emissions in the “No Action” scenario. Downloading the underlying data, I find the “No Action” global emissions in 2100 are 139.3 GtCO2e, compared with about 110 GtCO2e for the RCP8.5 high-emissions scenario in Figure SPM5(a) of the AR5 Synthesis Report. But it is the breakdown per country or region that matters.

For the USA, without action, emissions are forecast to rise by 40% from 2010 to 2030, in contrast to a rise of just 9% in the period 1990 to 2010. It is likely that emissions will fall without policy and will be no higher in 2100 than in 2010. The “No Action” scenario overestimates emissions by 2-3 GtCO2e in 2030 and about 7-8 GtCO2e in 2100.

For China the overestimation is even greater. Emissions will peak during the next decade as China fully industrializes, just as emissions peaked in most European countries in the 1970s and 1980s. Climate Interactive assumes that emissions will peak at 43 GtCO2e in 2090, whereas other estimates suggest the peak will be around 16-17 GtCO2e, reached before 2030.

Together, the overestimations for the US and China account for over half of the 55-60 GtCO2e difference in 2100 emissions between the “No Action” and “Current INDC” scenarios. A very old IT term applies here – GIGO. If aTTP had actually checked the underlying assumptions he would realise that Romm’s rebuttal of Lomborg, based on China’s emission assumptions (and repeated on aTTP’s own blog), is as false as claiming that the availability of free condoms is why population peaks.
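
The tally behind that “over half” can be sketched with the mid-points of the ranges quoted above; the China figure assumes, for illustration, that the 2100 gap is roughly the difference between the two peak estimates.

```python
# Mid-points of the ranges quoted above, GtCO2e in 2100.
us_overestimate = 7.5            # "about 7-8 GtCO2e in 2100"
china_overestimate = 43 - 16.5   # 43 GtCO2e peak vs ~16-17 GtCO2e peak
scenario_gap = 57.5              # "55-60 GtCO2e" No Action vs Current INDC

share = (us_overestimate + china_overestimate) / scenario_gap
print(f"{share:.0%}")  # 59% - over half of the difference
```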

Links posted at https://manicbeancounter.com/2015/11/11/attp-falsely-attacks-bjorn-lomborgs-impact-of-current-climate-proposals-paper/

Kevin Marshall

 

Figures referred to (but not referenced) in the comment above

Figure 1: Climate Interactive’s graph, referenced by Joe Romm.


Figure 2: Reproduction of Figure SPM5(a) from Page 9 of the AR5 Synthesis Report.

 

Update – posted the following to ATTP’s blog



 

Lomborg and the Grantham Institute on the INDC submissions

Bjorn Lomborg has a new paper published in the Global Policy journal, titled: Impact of Current Climate Proposals. (hattip Bishop Hill and WUWT)

From the Abstract

This article investigates the temperature reduction impact of major climate policy proposals implemented by 2030, using the standard MAGICC climate model. Even optimistically assuming that promised emission cuts are maintained throughout the century, the impacts are generally small. ………… All climate policies by the US, China, the EU and the rest of the world, implemented from the early 2000s to 2030 and sustained through the century will likely reduce global temperature rise about 0.17°C in 2100. These impact estimates are robust to different calibrations of climate sensitivity, carbon cycling and different climate scenarios. Current climate policy promises will do little to stabilize the climate and their impact will be undetectable for many decades.

That is pretty clear. COP21 in Paris is a waste of time.

An alternative estimate is provided in a paper by Boyd, Turner and Ward (BTW) of the LSE Grantham Institute, published at the end of October.

They state

The most optimistic estimate of global emissions in 2030 resulting from the INDCs is about halfway between hypothetical ‘business as usual’ and a pathway that is consistent with the 2°C limit

The MAGICC climate model used by both Lomborg and the IPCC predicts warming of about 4.7°C under BAU, implying a difference of up to 1.35°C from the INDCs, compared to the 0.17°C maximum calculated by Lomborg – eight times the amount. Lomborg says this is contingent on no carbon leakage (exporting industry from policy to non-policy countries), whilst citing studies showing that leakage could offset 10-40%, or even over 100%, of the emissions reduction. So the difference between sceptic Lomborg and the mighty LSE Grantham Institute is even greater than eight times. Yet Lomborg refers extensively to the August edition of BTW. So why the difference? There is no explicit indication in BTW of how they arrive at their halfway conclusion, nor a comparison by Lomborg.

Two other estimates are from the UNFCCC, and Climate Action Tracker. Both estimate the INDCs will constrain warming to 2.7°C, or about 2.0°C below the MAGICC BAU scenario. They both make assumptions about massive reductions in emissions post 2030 that are not in the INDCs. But at least the UNFCCC and CAT have graphs that show the projection through to 2100. Not so with BTW.

This is where the eminent brain surgeons and Nobel-Prize winning rocket scientists among the readership will need to concentrate to achieve the penetrating analytical powers of a lesser climate scientist.

From the text of BTW, the hypothetical business as usual (BAU) scenario for 2030 is 68 GtCO2e. The most optimistic scenario for emissions from the INDCs (and the most pessimistic for economic growth in the emerging economies) is that 2030 emissions will be 52 GtCO2e. The sophisticated climate projection models have whispered in code to the climate scientists that, to be on target for the 2.0°C limit, 2030 emissions should be not more than 36 GtCO2e. The mathematicians will be able to determine that 52 is exactly halfway between 36 and 68.
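
For those without access to advanced climate models, the midpoint arithmetic can be verified in a few lines of Python, using only the figures quoted above:

```python
bau_2030, indc_2030, two_deg_2030 = 68.0, 52.0, 36.0  # GtCO2e, as quoted above

print((bau_2030 + two_deg_2030) / 2 == indc_2030)  # True: 52 is the midpoint

# The same halfway logic applied to MAGICC temperatures:
bau_warming_c, target_c = 4.7, 2.0
print((bau_warming_c + target_c) / 2)  # 3.35C, i.e. 1.35C below BAU
```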

Now for the really difficult bit. I have just spent the last half hour in the shed manically cranking the handle of my patent beancounter extrapolator machine to get this result. By extrapolating this halfway result for the forecast period 2010-2030 through to 2100 my extrapolator tells me the INDCs are halfway to reaching the 2.0°C maximum warming target.

As Bob Ward will no doubt point out in his forthcoming rebuttal of Bjorn Lomborg’s paper, it is only true climate scientists who can reach such levels of analysis and understanding.

I accept no liability for any injuries caused, whether physical or psychological, by people foolishly trying to replicate this advanced result. Please leave this to the experts.

But there is a serious side to this policy advocacy. The Grantham Institute, along with others, is utterly misrepresenting the effectiveness of policy to virtually every government on the planet. Lomborg shows by rigorous means that policy is ineffective even if loads of ridiculous assumptions are made, whether on climate science forecasting, policy theory, technological solutions, government priorities, or the ability of current governments to make policy commitments for decades ahead. My prediction is that the reaction of the Grantham Institute, along with plenty of others, will be a thuggish denunciation of Lomborg. What they will not consider is the rational response to wide differences of interpretation: to compare and contrast the arguments and the assumptions made, both explicit and implicit.

Kevin Marshall