The Morning Star’s denial of the Venezuelan Dictatorship

Guido Fawkes has an excellent example of the hard left’s denial of realities that conflict with their beliefs. From the Daily Politics, this is Morning Star editor Ben Chacko saying that the UN human rights report on Venezuela was one-sided.

The Human Rights report can be found here.

The recent demonstrations need to be put into context. There are two contexts that can be drawn upon. The Socialist Side (with which many Socialists will disagree) is from Morning Star’s piece of 25th August The Bolivarian Revolution hangs in the balance.

They say

One of the world’s largest producers of oil, on which 95 per cent of its economy depends, the Bolivarian socialist government of Venezuela has, over the last 17 years, used its oil revenues to cut poverty by half and reduce extreme poverty to 5.4 per cent.

The government has built social housing; boosted literacy; provided free healthcare and free education from primary school to universities and introduced arts, music and cultural analysis programmes and many others targeting specific problems at the local level.

This sentence emphasises the hard-left bias.

The mainly middle-class protesters, most without jobs and income, accused President Nicolas Maduro of dictatorship and continued with their daily demonstrations and demands for a change of government. 

Folks without “jobs and income” are hardly middle-class, but might be former middle-class. They have been laid low by circumstances. Should they be blaming the Government or forces outside the Government’s control?


From Capx.co on 16th August – Socialism – not oil prices – is to blame for Venezuela’s woes. Also from upi.com on 17th February – Venezuela: 75% of the population lost 19 pounds amid crisis. This is the classic tale of socialism’s failure.

  • Government control of food supplies leads to shortages, which leads to rationing, which leads to more shortages and black market profiteering. This started in 2007 when oil prices were high, but not yet at the record high.
  • Inflation is rampant, potentially rising from 720% in 2016 to 1600% this year. This is one of the highest rates in the world.
  • The weight loss is due to food shortages. It is the poorest who suffer the most, though most of the population are in extreme poverty.
  • An oil-based economy needs to diversify. Venezuela has not. It needed to use high oil prices to invest in infrastructure. Instead, the Chavez regime expropriated oil production from successful private companies and handed it to government cronies. A graphic from Forbes illustrates the problem.

About a decade ago at the height of the oil price boom, Venezuela’s known oil reserves more than tripled, yet production fell. It now has the highest oil reserves of any country in the world.

  • Crime has soared, whilst people are going hungry.
  • Maybe a million children are missing school through hunger and lack of resources to run schools. Short-run “successes” based on expropriating the wealth of others have reversed to create a situation far worse than before Chavez came to power.
  • Oil prices are in real terms above the level they were from 1986 to 2003 (with the exception of a peak for the first Gulf War) and comparable to the peak reached in 1973 with the setting up of the OPEC Cartel and oil embargo.

The reality is that socialism always fails. But there is always a hardcore in denial, coming up with empty excuses for failure, often blaming it on others. With the rise of Jeremy Corbyn (who receives a copy of the Morning Star daily), this hardcore has taken over the Labour Party. The example of Venezuela indicates the long-term consequences of their attaining power.

Kevin Marshall

New EU Vacuum Cleaner Regulations likely promoted with false claims

Summary

On September 1st, the EU Commission launched new regulations limiting the maximum power of vacuum cleaners to 900 watts.  A news item claimed

The updated rules will result in vacuum cleaners that use less energy for a better cleaning performance. This will help consumers to save money, as switching to a more efficient product can save €70 over its lifetime.

Elsewhere there is a claim that “with more efficient vacuum cleaners, Europe as a whole can save up to 20 TWh of electricity per year by 2020”.

There is no reference to the source of the claims. Pulling in data from various sources, I have calculated how the figures may have been derived. Based on these figures, it would appear that:

  • The assumed savings are 200 kWh per vacuum cleaner, based on switching from a 1600 watt machine to a 900 watt one, and 290 hours of use over the average lifetime.
  • This ignores that many vacuum cleaners are already below 1600 watts due to competition, not to the rules in place.
  • Cost savings are based on the average electricity costs in the EU, when in reality electricity costs in the most expensive country are 2.6 times that of the cheapest.
  • Cost savings are not net of cost increases, such as more time spent cleaning and the increased cost of the appliance.
  • Claims of reduction in electricity consumption are based on the requirement that all 350 million vacuum cleaners of 1600 watts are replaced by 900 watt cleaners by the start of 2020.

If any business made bald unsubstantiated claims about a new product, it would be required to back up the claims or withdraw them. Morally, I believe the EU Commission should aspire to emulate the standards that it imposes on others in marketing its own products. A law-making authority cannot be regulated and brought to account for the harms it causes. But I feel that it owes its citizens a moral duty of care to serve them, by minimising the harms that it can cause and maximising the benefits.

The Launch of the New Regulations

The BBC had an article on September 1st, Sales of inefficient vacuum cleaners banned.

They state

The EU’s own website says: “With more efficient vacuum cleaners, Europe as a whole can save up to 20 TWh of electricity per year by 2020.

“This is equivalent to the annual household electricity consumption of Belgium.

“It also means over 6 million tonnes of CO2 will not be emitted – about the annual emissions of eight medium-sized power plants.”

Although the BBC does not link to the webpage, a search on the phrase reveals the following link.

http://ec.europa.eu/energy/en/topics/energy-efficiency/energy-efficient-products/vacuum-cleaners

Vacuum cleaners are subject to EU energy labelling and ecodesign requirements. By switching to one of the most energy efficient vacuum cleaners, you can save €70 over the lifetime of the product.  With more efficient vacuum cleaners, Europe as a whole can save up to 20 TWh of electricity per year by 2020. This is equivalent to the annual household electricity consumption of Belgium. It also means over 6 million tonnes of CO2 will not be emitted – about the annual emissions of eight medium-sized power plants.

There are no references to where the figures come from.

Another source is much nearer in the menu tree to the EU homepage and is on a news page.

http://ec.europa.eu/energy/en/news/updated-energy-efficiency-rules-vacuum-cleaners-will-save-consumers-money

Updated energy efficiency rules for vacuum cleaners will save consumers money

Friday, 01 September 2017

From today, vacuum cleaners sold in Europe will be more cost- and energy-efficient. The European Commission is making use of the latest state-of-the-art technologies to ensure that European consumers have the most energy efficient products available. The updated ecodesign requirements will lower appliances’ maximum power, annual energy consumption and noise levels. They will also increase their minimum ability to pick up dust.

The updated rules will result in vacuum cleaners that use less energy for a better cleaning performance. This will help consumers to save money, as switching to a more efficient product can save €70 over its lifetime. With more efficient vacuum cleaners, Europe as a whole will be in a position to save up to 20 TWh of electricity per year by 2020.

As with the first EU source (to which this press release links back), there is no reference to the source of the claims.

Establishing the calculations behind the claims

However, the claims, together with other data and some assumptions, have enabled me to piece together the numbers behind them. These are:

  1. The maximum of 20 TWh of electricity that could be saved by 2020. A terawatt hour is one billion kilowatt hours.
  2. According to Eurostat’s Household Composition Statistics, there are 495.6 million EU citizens living in households, with an average of 2.3 persons per household. That is around 215 million households.
  3. There is more than one vacuum cleaner in the average household.
  4. All vacuum cleaners are operated at maximum power all the time.
  5. All current vacuum cleaners are 1600 watts. By 2020 they will all be at 900 watts.
  6. The life of the average vacuum cleaner is five years. This I worked out by slotting in the other variables.
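Pieced together, the assumed 200 kWh saving follows from the power difference and the hours of use. A minimal Python sketch; the wattages, hours and the illustrative electricity prices are assumptions from this analysis, not published EU figures:

```python
# Back-of-envelope check of the assumed per-cleaner saving.
# All inputs are assumptions pieced together in the text, not EU figures.
old_power_w = 1600    # assumed typical pre-regulation rating
new_power_w = 900     # new maximum under the 2017 rules
lifetime_hours = 290  # assumed use over a five-year life

saving_kwh = (old_power_w - new_power_w) * lifetime_hours / 1000
print(saving_kwh)  # 203.0 kWh, close to the assumed 200 kWh

# The cash value depends heavily on local prices. Illustrative rates:
# ~EUR 0.35/kWh (Danish-level) gives ~EUR 70; a rate 2.6 times lower
# (Hungary/Estonia level) gives only ~EUR 27.
for price_eur_per_kwh in (0.35, 0.35 / 2.6):
    print(round(saving_kwh * price_eur_per_kwh))
```

Running the same arithmetic with a lower starting wattage shows why the headline saving overstates the typical case.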


To understand how many kilowatt hours make up the maximum cost saving of €70, one needs to know the cost of a unit of electricity. In a recent post on electricity prices in South Australia, Joanne Nova provided a graphic based on data from MARKINTELL and the US Energy Information Administration. Based on this I have produced a graphic showing what the savings would be in the other EU countries if a person in Denmark, where electricity is most expensive, saved €70 on their electricity bill.

If the Danes will save €70 from buying a vacuum cleaner under the new regulations, in the UK the saving will be about €49, France €39 and in Hungary and Estonia just €27. This is because of the huge difference in electricity costs, with Danish electricity being 2.6 times that in Hungary and Estonia. It is a simple step to work out the number of kilowatt hours of electricity saved for a spend of €70.  Assuming $1.00 = €0.85, the next graph shows how many units of electricity will be saved in each country.

If the EU Commission had properly checked its figures when quoting the maximum saving, it would have based it on the highest electricity rates in the EU, not the average rates. It would therefore have assumed maximum savings for the EU of around 133 kilowatt hours, not 200 kilowatt hours. Otherwise, the maximum savings in Denmark, Germany, Italy and Portugal could be greater than the claimed maximum, whilst people in some other countries with lower than average electricity costs would be misled as to the extent of the possible savings.

I have put together a table that fits the assumptions and known variables based on €70 of savings in both Denmark and the fictional EU average.

The 200 kWh saving over a five-year vacuum cleaner life seems more reasonable than 133 kWh. The 350 million vacuum cleaners in the EU or two for every three people, seems more reasonable than 538 million, which is both less of a rounded estimate and would mean around 35 million more vacuum cleaners than people. The assumption that the average household spends 1 hour and 50 minutes per week vacuuming might be a bit high, but there again I know of people who regularly exceed this amount by quite a margin.

Based on how the numbers fit, the maximum saving of €70 per vacuum cleaner appears to have been based on the average cost of electricity in the EU. As such, it is an incorrect statement.

Evaluating the claims

There are other issues that arise from consideration of these figures, though they are not necessarily solely reliant upon them.

First, the 20 TWh of savings assumes that all the current vacuum cleaners (assumed to be at a 1600 W rating) will be replaced by the start of 2020. That is in just 2.33 years. If vacuum cleaners have an average five-year life, many people will be scrapping their existing vacuum cleaners before the end of their useful life. Even with a maximum marginal cost saving of €14 a year, this would mean incurring unnecessary additional costs and throwing out perfectly serviceable vacuum cleaners. However, if they replace a 2000 watt or higher vacuum cleaner purchased prior to September 2014, then the savings will be much higher. In which case the EU Commission news item should have noted that some savings were from regulations already in place.

Second, many households have an old vacuum cleaner in reserve. They may have it for a number of reasons, such as having upgraded in the past, or having purchased it prior to the regulations coming into force in 2014. So when their main vacuum cleaner finally keels over, they will fall back on the reserve rather than purchase a low-powered one. It will therefore be very many years before anything approaching 100% of existing vacuum cleaners have been replaced, especially if the perception is that the newer products are inferior.

Third, is an assumption that every vacuum cleaner is on the limit of the regulations. Greater efficiency (saving money) is something people are willing to pay for, so the market provides this anyway without the need for regulation, just as people pay for more fuel efficient cars. It is only the people who max out on the power permitted that will be affected to the full extent. As greater power is a cheap way of increasing performance, this will most affect the cheapest cleaners. The poor and those setting up a home for the first time (with severe budget constraints) are likely to be those most disadvantaged, whilst those who are willing and able to upgrade to the latest gadgets will make the lowest savings.

Fourth, the cost savings appear to be only on electricity costs. The extra costs of upgrading to a more technologically advanced machine that compensates for the loss of power do not appear to have been taken into account in the calculations. If they had, then the electricity savings would have to be much greater, to cover the additional costs. In which case, the fictional European average household would have to be saving far more on their electricity than €70. Let us say people upgrade from a €100 to a €300 machine, both with a five-year average life. To make €70 of savings over five years a Danish household would have to be running their vacuum cleaner for nearly three hours a week, a British or Dutch household over four hours per week, and the Hungarian and Estonian households over seven hours a week. But this defies other assumptions and would also shorten the average life of a vacuum cleaner. No allowance appears to have been made for more expensive vacuum cleaners.

Fifth, there are other, simpler ways of replacing the loss of suction from lack of power than technological wizardry that pushes up costs. The simplest is to reduce the area in contact with the floor. This means that people spend more time using the machines, offsetting some of the energy savings. Alternatively, there could be some loss of suction, which again means people spend more time cleaning, and getting frustrated due to the lack of performance. Some of this could be by more frequent swapping of cleaning heads. If you value people’s leisure time at just €5.00 an hour, then over the short five year life of a cleaner (about 290 hours based on 65 minutes a week of use), the average household will “lose” the €70 of electricity savings if they have to spend more than 5% more time cleaning. In reality it will be much more, and many people will feel aggrieved at having a less efficient machine.
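The roughly 5% break-even figure can be checked directly. A small sketch; the €5/hour valuation of leisure time is the assumption stated above:

```python
# How much extra cleaning time wipes out the claimed EUR 70 saving?
saving_eur = 70.0
lifetime_hours = 290.0  # ~65 minutes a week over a five-year life
leisure_value_eur_per_hour = 5.0  # assumed value of leisure time

break_even_hours = saving_eur / leisure_value_eur_per_hour
break_even_share = break_even_hours / lifetime_hours
print(break_even_hours)                  # 14.0 extra hours over the life
print(round(break_even_share * 100, 1))  # 4.8 (% more time cleaning)
```

So just 14 extra hours of cleaning, spread over five years, cancels the entire claimed electricity saving.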

Sixth, the extra power allows simpler, proven and more robust technologies. Efficiency savings come about through complex optimisation strategies, which can reduce the life of cleaners.

So the claim by the EU that people will save money from the new regulations seems to be false for any one of a number of reasons. More likely than not, people will be made net worse off by the regulations. Further, the alleged benefits from the new regulations in terms of savings in electricity (and hence CO2 emissions) seem to have been grossly exaggerated.

But won’t there be a massive saving in CO2 emissions? Even if the 6 million tonnes of emissions saved lies in the more distant future, it still sounds a large number. For a small country like Belgium, it is a large amount. But considered in the context of the EU’s INDC submission to the Paris climate talks it is quite small.

The EU and its Member States are committed to a binding target of an at least 40% domestic reduction in greenhouse gases emissions by 2030 compared to 1990,

From the accompanying country brief, the 1990 emissions were 5368 MtCO2e, so a 40% cut is 2147 MtCO2e. In 2012 emissions were 4241 MtCO2e (a fall achieved mostly for non-policy reasons), so there are just 1020 million tonnes left to cut. 6 million is just 0.6% of that target.
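The 0.6% figure follows directly from the numbers quoted, as a quick check shows:

```python
# The 6 MtCO2e annual saving set against the EU's remaining Paris cut,
# using the figures quoted in the text.
emissions_1990 = 5368.0  # MtCO2e
emissions_2012 = 4241.0  # MtCO2e
target_2030 = emissions_1990 * (1 - 0.40)  # a 40% cut off 1990 levels

remaining_cut = emissions_2012 - target_2030
print(round(remaining_cut))                 # 1020 MtCO2e still to cut
print(round(6.0 / remaining_cut * 100, 1))  # 0.6 (% of the remaining cut)
```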

On a global perspective, even with all the vague policy proposals fully enacted, global emissions by 2030 will be nearly 60,000 MtCO2e and will still be rising. There seems no prospect of additional policies being proposed that would start reducing global emissions. A policy that makes around 0.01% of difference to the larger picture is inconsequential. To achieve the policy goals a few thousand similar-sized schemes are required. Nothing like that is going to happen. Countries in the developing world, with over half the global population, will see emissions grow for decades, dwarfing any reductions made in the EU.

Concluding comments

The new vacuum cleaner regulations appear to be justified on the basis of grossly exaggerated and untenable claims of the benefits in terms of cost savings and reductions in GHG emissions, whilst ignoring the costs that they impose.

If any business made bald unsubstantiated claims about a new product, it would be required to back up the claims or withdraw them. If such sweeping claims were made about a product such as anti-aging creams or vitamin pills, whose effects could be attributed to other factors, then it would be prosecuted. Morally, I believe the EU Commission should aspire to emulate the standards that it imposes on others in marketing its own products. A law-making authority cannot be regulated and brought to account for the harms it causes. But I feel that it owes its citizens a moral duty of care to serve them, by minimising the harms that it can cause and maximising the benefits.

Kevin Marshall

 

How the “greater than 50% of warming since 1950 is human-caused” claim is deeply flawed

Over at Cliscep, Jaime Jessop has rather jokingly raised a central claim of the IPCC Fifth Assessment Report, after someone on Twitter had accused her of not being a real person.

So here’s the deal: Michael Tobis convinces me, on here, that the IPCC attribution statement is scientifically sound and it is beyond reasonable doubt that more than half of the warming post 1950 is indeed caused by emissions, and I will post a photo verifying my actual existence as a real person.

The Report states (AR5 WG1 Ch10 Page 869)

It is extremely likely that human activities caused more than half of the observed increase in GMST from 1951 to 2010.

This “extremely likely” is at the 95% confidence level and includes all human causes. The more specific quote on human greenhouse gas emissions is from page 878, section “10.2.4 Single-Step and Multi-Step Attribution and the Role of the Null Hypothesis”.

Attribution results are typically expressed in terms of conventional ‘frequentist’ confidence intervals or results of hypothesis tests: when it is reported that the response to anthropogenic GHG increase is very likely greater than half the total observed warming, it means that the null hypothesis that the GHG-induced warming is less than half the total can be rejected with the data available at the 10% significance level.

It is a much more circumspect message than the “human influence on the climate system is clear” announcements of WG1 four years ago. In describing attribution studies, the section states

Overall conclusions can only be as robust as the least certain link in the multi-step procedure.

There are a number of candidates for “least certain link” in terms of empirical estimates. In general, if the estimates are made with reference to the other estimates, or biased by theory/beliefs, then the statistical test is invalidated. This includes the surface temperature data.

Further, if the models have been optimised to fit the surface temperature data, then the >50% is an absolute maximum, whilst the real figure, based on perfect information, is likely to be less than that.

Most of all, there are the possibilities of unknown unknowns. For instance, the suggestion that non-human causes could explain pretty much all the post-1950 warming can be inferred from some paleoclimate studies. This Greenland ice core reconstruction (graphic from climate4you) shows warming around as great as, or greater than, the current warming in the distant past. The timing of a warm cycle is not too far out either.

In the context of Jaime’s challenge, there is more than reasonable doubt in the IPCC attribution statement, even if a statistical confidence of 90% (GHG emissions) or 95% (all human causes) were acceptable as persuasive evidence.

There is a further problem with the statement. Human greenhouse gas emissions are meant to account for all the current warming, not just over 50%. If the full impact of a doubling of CO2 is eventually 3°C of warming, then the 1960-2010 CO2 rise from 317 ppm to 390 ppm alone will eventually produce 0.9°C of warming, and possibly 1.2°C of warming from all sources. This graphic from AR5 WG1 Ch10 shows the issues.

The orange line of anthropogenic forcing accounts for nearly 100% of the measured post-1960 warming of around 0.8°C, shown by the large dots. Yet this is only about 60% of the warming expected from the GHG rise if a doubling of CO2 will produce 3°C of warming. The issue is with the cluster of dots at the right of the graph, representing the pause, or slowdown, in warming around the turn of the century. I have produced a couple of charts that illustrate the problem.
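The 0.9°C figure quoted above can be checked from the standard logarithmic forcing relationship, assuming the 3°C-per-doubling sensitivity used in the text:

```python
import math

# Eventual warming from the 1960-2010 CO2 rise, assuming an equilibrium
# sensitivity of 3C per doubling of CO2 (the figure used in the text).
sensitivity_per_doubling = 3.0  # degrees C
c_1960, c_2010 = 317.0, 390.0   # ppm, as quoted above

warming = sensitivity_per_doubling * math.log2(c_2010 / c_1960)
print(round(warming, 1))  # 0.9 degrees C
```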

In the first graph, the long-term impact on temperatures of the CO2 rise from 2003-2012 is 2.5 times that from 1953-1962. Similarly, from the second graph, the long-term impact on temperatures of the CO2 rise from 2000-2009 is 2.6 times that from 1950-1959. It is a darn funny lagged response if the rate of temperature rise can significantly slow down when the alleged dominant element causing temperatures to rise accelerates. It could be explained by rising GHG emissions being a minor element in temperature rise, with natural factors causing some of the warming in the 1976-1998 period, then reversing, causing cooling, in the last few years.
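Under the same logarithmic relationship, the decade-on-decade ratio can be reproduced approximately. The concentrations below are approximate annual means; the 1950s pair predates the Mauna Loa record and is an ice-core-based estimate, so treat the exact ratio as illustrative:

```python
import math

# Ratio of the eventual warming implied by the 2003-2012 CO2 rise to
# that implied by the 1953-1962 rise. Concentrations (ppm) are
# approximate annual means; the early pair is an estimate.
early_rise = math.log(318.5 / 312.5)  # 1953 -> 1962
late_rise = math.log(393.8 / 375.8)   # 2003 -> 2012
print(round(late_rise / early_rise, 1))  # ~2.5, in line with the charts
```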

Kevin Marshall

 

 

Forest Trump – stupid is as stupid does

Last Tuesday’s BBC climate propaganda piece for the day was ‘Donald Trump forest’ climate change project gains momentum.

A campaign to plant trees to compensate for the impact of President Trump’s climate policies has 120,000 pledges.
The project was started by campaigners upset at what they call the president’s “ignorance” on climate science.
Trump Forest allows people either to plant locally or pay for trees in a number of poorer countries.
Mr Trump says staying in the climate pact will damage the US economy, cost jobs and give a competitive advantage to countries such as India and China.
The organisers say they need to plant an area the size of Kentucky to offset the Trump effect.

Trump Forest website (motto Where ignorance grows trees) explains

Breathe easy, Mr President.

 US President Donald Trump doesn’t believe in the science of human-caused climate change. He wants to ignore one of the greatest threats to healthy life on Earth.

Trump wants to bring back coal despite scientists telling us we cannot afford to burn it, and despite economists telling us there’s more money to be made and more jobs available in renewable energy.

So we’re planting a forest to soak up the extra greenhouse gases Trump plans to put into our atmosphere.

We’re planting a global forest to offset Trump’s monumental stupidity.

The claim that Trump wants to “bring back coal”, or just to rescind the policies to phase it out, is a question that can be answered by the empirical evidence. The BP Statistical Review of World Energy 2016 has estimates of coal consumption by country, measured in millions of tonnes of oil equivalent. For the USA I have created a graph.

US coal consumption in 2015 was 31% below the level of 2005, but it is far from being phased out. Further, much of the fall in consumption is not primarily down to government policy, but to switching to cleaner and cheaper shale gas. Add the two together in terms of millions of tonnes of oil equivalent, and consumption of the two fossil fuels has hardly changed in 20 years.

Natural Gas is not only cleaner, in terms of far fewer particulates emitted when burnt, it has the added benefit, for climate alarmists, of having around half the CO2 emissions. As a result, net emissions have been falling.

However, global warming is allegedly the result of rising levels of greenhouse gases, which in turn are mostly the result of increasing fossil fuel emissions. How does the falling consumption of coal in the USA compare to the global picture? Again the BP estimates give a fairly clear answer.

In 1965 the USA accounted for 20.8% of global coal consumption, and other rich OECD countries 42.3%. Fifty years later the shares had fallen to 10.3% and 15.2%, even though combined OECD consumption had increased by 11%. The lesson from this is that reducing global GHG emissions requires that developing countries reduce their emissions. China, which now accounts for just over 50% of global coal consumption, has committed to peak its emissions by 2030. India, whose coal consumption exceeded that of the USA for the first time in 2015, has no such commitment. With a similar population to China, fast economic growth will lead to fairly high rates of increase in coal consumption over the next few years. Into the distant future, the rest of the world, with around half the global population, is likely to see significant increases in coal consumption.

The switch from coal to shale gas is a major reason why total USA emissions have been falling, as evidenced in this graph from the USA INDC Submission.

The 2025 target is a bit of a cheat. Most of the reduction would have been achieved without any policy; in fact, about one-third had been achieved by 2013.

Trump Forest have a science page to explain the thinking behind the scheme. It states

If executed in its entirety the Clean Power Plan would prevent approximately 650 million tons of carbon dioxide from reaching the atmosphere over the next 8 years. Along with other actions, including tailpipe regulations (which Trump has also moved to eliminate), the United States was steering toward meeting its target for the Paris Agreement.

Also

The Paris Agreement, negotiated in the French capital in December 2015 was agreed to by over 190 nations. It is the first time the global community has agreed to address climate change by striving to keep the average global temperature increase below 2°C.

So how does the 650mtCO2e over 8 years measure up against those of the global community in the context of “striving to keep the average global temperature increase below 2°C”?

The UNFCCC produced a useful graphic, summarizing all the INDC submissions.

 

Without the 650 MtCO2e claimed reduction from the US Clean Power Plan, if fully implemented, global emissions would be just over 1% higher. Rather than global emissions being about 12.5% above the 2°C warming path, they might be 14%. In other words, even if a doubling of CO2 (or equivalent) will lead to 3°C of warming, and such warming will have catastrophic consequences (despite the lack of strong, let alone incontrovertible, evidence), the US Clean Power Plan would make no noticeable difference to climate change, using the figures presented by the UNFCCC.
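The “just over 1%” can be sketched as follows. The annual global emissions figure for 2030 is my assumption here, roughly in line with the UNFCCC INDC synthesis, not a number taken from the Trump Forest material:

```python
# The claimed Clean Power Plan reduction set against projected global
# emissions. The ~57,000 MtCO2e annual figure for 2030 is an assumption,
# roughly in line with the UNFCCC INDC synthesis report.
cpp_saving_mt = 650.0     # MtCO2e, claimed over 8 years
global_2030_mt = 57000.0  # MtCO2e, assumed annual global emissions

print(round(cpp_saving_mt / global_2030_mt * 100, 1))  # ~1.1 (%)
```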

It gets worse. Under the science page, Trump Forest have the following graphic, lifted from Climate Interactive.

I have looked at Climate Interactive’s figures before. At least in their figures from December 2015, they claimed that future per capita emissions in the USA would rise without policy, even though per capita emissions had been falling since the 1973 oil crisis. It was the same with the EU, only their per capita emissions had been falling since 1980. For China and Russia, per capita emissions are shown rising through the roof. It is as though, without the guiding hand of the green apostles, Governments will deliberately and wastefully burn ever-increasing amounts of fossil fuels rather than promote the welfare of their nations. This is a graphic I produced from Climate Interactive’s C-ROADS software version v4.026v.071 RCP8.5 baseline scenario and the built-in population forecasts.

China is the most globally significant. Despite a forecast decline in population to 1.00 billion in 2100, GHG emissions are forecast to peak at nearly 43 GtCO2e in 2090. That compares with 49 GtCO2e from over 7 billion people in 2010. Conversely, non-policy developing countries (who do not want to play the game by committing to emissions reductions) are forecast to do disastrously economically and hence have very low emissions growth. That includes India, 50+ African nations, Pakistan, Bangladesh, the Philippines, Vietnam, Indonesia, Saudi Arabia, Iran, Iraq etc.

The mere act of countries signing a bit of paper produces most of the drop in emissions. The 650 MtCO2e claimed reduction from the US Clean Power Plan shrinks considerably if the marginal impact of the policy is taken into account, rather than the difference between an unreasonable forecast and an objective.

It gets worse. The elimination of cheap sources of energy, along with the plethora of regulations, make energy more expensive. Apart from directly harming the living standards of households, this will increase energy costs to business, especially the high energy using industries such as steel, aluminum, and bulk chemicals. US industries will be placed at a competitive disadvantage to their competitors in non-policy emerging economies. Some of the US savings from the policy will be from emissions increases elsewhere. There are no built-in safeguards to stop this happening.

It gets worse. Emerging economies not only have lower labour productivity per unit of output, they also make less efficient use of energy per unit of output. Further, countries like China and India have a larger element of coal in the energy mix than the USA. For these reasons, an unintended consequence of reducing emissions in the USA (and other developed countries) through shifting production overseas could be a net increase in global emissions. Virtue signaling achieves the opposite of intentions.

However, the real world must not be allowed to confront the anointed in their evangelical zeal to save the world from Donald Trump. They might have to accept that their virtue signaling is both wrong and, if fully implemented, will cause great net harm. That would seriously hurt their feelings. Like in the 1994 film Forrest Gump, the lesson is that the really stupid people are not those with naturally low IQs, but those with intelligence who do stupid things. This is what Forest Trump’s backers have achieved.


Kevin Marshall

 

Met Office Extreme Wet Winter Projections

I saw an article in the Telegraph

Met Office warns Britain is heading for ‘unprecedented’ winter rainfall, with records broken by up to 30pc 

Britain is heading for “unprecedented” winter rainfall after the Met Office’s new super computer predicted records will be broken by up to 30 per cent.

Widespread flooding has hit the UK in the past few years leading meteorologists to search for new ways to “quantify the risk of extreme rainfall within the current climate”.

In other words, the Telegraph is reporting that the Met Office is projecting that if the current record is, say, 100mm, new records of 130mm could be set.

The BBC is reporting something slightly different.

High risk of ‘unprecedented’ winter downpours – Met Office

There is an increased risk of “unprecedented” winter downpours such as those that caused extensive flooding in 2014, the UK Met Office says.

Their study suggests there’s now a one in three chance of monthly rainfall records being broken in England and Wales in winter.

The estimate reflects natural variability plus changes in the UK climate as a result of global warming.

The BBC has a nice graphic, of the most extreme winter month of recent years for rainfall.

The BBC goes on to say

Their analysis also showed a high risk of record-breaking rainfall in England and Wales in the coming decade.

“We found many unprecedented events in the model data and this comes out as a 7% risk of a monthly record extreme in a given winter in the next few years, that’s just over Southeast England,” Dr Vikki Thompson, the study’s lead author told BBC News.

“Looking at all the regions of England and Wales we found a 34% chance of an extreme event happening in at least one of those regions each year.”

Not only is there a greater risk, but the researchers were also able to estimate that these events could break existing records by up to 30%.

“That is an enormous number, to have a monthly value that’s 30% larger, it’s a bit like what we had in 2014, and as much again,” said Prof Adam Scaife from the Met Office.

The “30% larger” value is an outlier, not a central estimate.

But over what period is the record?

The Met Office website has an extended version of what the BBC reports, but strangely no figures. There is a little video by Dr Vikki Thompson to explain.

She does say that only recent data is used, but gives no definition of what constitutes recent. A clue lies not in the text, but in an explanatory graphic.

It is from 35 years of winters, which ties in with the BBC’s graphic starting in 1981. There are nine regions in England and Wales by the Met Office definition; the tenth political region, London, is included in the South East. Different regions could have been used for the modeling. As Ben Pile and Paul Homewood pointed out in the comments to the Cliscep article, elsewhere the Met Office splits England and Wales into six regions. What is amazing is that the Met Office article does not clarify the number of regions, still less show the current records in the thirty-five years of data. There is therefore no possibility of ever verifying the models.

Put this into context. Northern Ireland and Scotland are excluded, which seems a bit arbitrary. If rainfall were random, the chance of this coming winter setting a new record in a given region is nearly 3%. Across nine regions, if the rainfall data were independent between regions (which they are not), there is nearly a 26% chance. 34% is higher. But consider the many alternative ways for climate patterns to become more extreme and variable. After all, with global warming the climate could be thrown into chaos, so more extreme weather should be emerging as a foretaste of much worse to come. Given the many different aspects of weather, there could be hundreds of possible ways the climate could get worse. Rainfall could become wetter or drier, in either summer or winter. That is four variables, of which the Met Office chose just one. Or records could be set over any 1, 2, 3… or 12 month period. Then again, climate change could mean more frequent and violent storms, such as that of 1987. Or it could mean more heatwaves. Statistically, heatwave records could be defined in a number of different ways, such as, say, five consecutive days in a month where the peak daily temperature is more than 5°C above the long-term monthly average peak temperature.
So why choose rainfall in winter? Maybe it is because in recent years there have been a number of unusually wet winters. It looks like the Met Office, for all the power of its mighty computers, has fallen for a common fallacy.
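The back-of-envelope odds above can be checked with a short sketch. This assumes, purely as a null hypothesis, that each region’s winter rainfall is exchangeable from year to year and independent between regions (which, as noted, it is not):

```python
# Chance a new winter sets a monthly rainfall record, given 35 prior
# winters of data: under the null hypothesis that any of the 36 winters
# is equally likely to be the wettest, it is simply 1 in 36.
prior_years = 35
p_one_region = 1 / (prior_years + 1)

# For nine regions, treated (unrealistically) as independent, the chance
# of at least one new regional record is the complement of no record
# being set anywhere.
regions = 9
p_any_region = 1 - (1 - p_one_region) ** regions

print(f"One region:       {p_one_region:.1%}")   # ~2.8%
print(f"Any of 9 regions: {p_any_region:.1%}")   # ~22%
```

Even with no warming trend at all, a record somewhere in England and Wales in a given winter is far from surprising, so the quoted 34% is not dramatically above this naive baseline.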

 

Texas sharpshooter fallacy is an informal fallacy which is committed when differences in data are ignored, but similarities are stressed. From this reasoning, a false conclusion is inferred. This fallacy is the philosophical/rhetorical application of the multiple comparisons problem (in statistics) and apophenia (in cognitive psychology). It is related to the clustering illusion, which refers to the tendency in human cognition to interpret patterns where none actually exist.
The name comes from a joke about a Texan who fires some gunshots at the side of a barn, then paints a target centered on the tightest cluster of hits and claims to be a sharpshooter.

A run of extremely wet winters might be due to random clustering, or it could be a genuine pattern of natural variation, or it could be a sign of human-caused climate change. An indication of random clustering would be to look at many other aspects of weather, to see if there is a recent trend of emerging climate chaos. Living in Britain, I suspect that the recent wet weather is just drawing the target around the tightest cluster. Even then, high winter rainfall in Britain is usually accompanied by slightly milder temperatures than average, while extreme winter cold usually occurs on cloud-free days. So if winter rainfall is genuinely getting worse, it seems the whole global warming thing is, for Britain, predicted to be a bit of a damp squib.

Kevin Marshall

 

John McDonnell accusation that Grenfell fire victims were murdered examined

At Glastonbury (which people think to be a music festival) John McDonnell MP, the shadow chancellor and closet Marxist, made the accusation that the victims of the Grenfell fire were murdered. Guido Fawkes has a video recording here. He repeated these allegations in a videoed interview with the NME, a publication which, when I was at school, stood for New Musical Express. He tried to justify the comments when questioned about them by Andrew Marr. All three are reproduced in an appendix.

From McDonnell’s perspective, any causes of the fire that can be determined by an inquiry, conducted by experts in their fields objectively assessing the evidence, are superficial. Before the evidence has even been evaluated, the ultimate causes are the wrong decisions of political opponents, particularly the Tories, but also those within the former mainstream of the Labour Party. McDonnell and his fellow travelers know the true interests of the people, and can point to instances in the past where they stated these ultimate causes and were not listened to. The fact that these anointed people have not been listened to is not only a failure of democracy; the resulting deaths are murder.

But look at it from a different perspective.

As Andrew Marr pointed out, the legal definition of murder (the killing of a human being by a sane person, with intent and malice aforethought – see below) does not embrace acts of indirect killing. At most, such unintended killings are manslaughter, a lesser offence. This does not change by pointing to a long tradition of the term’s use, any more than racial theories are validated by centuries of use.

The BBC corroborates that long tradition.

It was in the 19th Century that philosopher Friedrich Engels sought to prove that society commits “social murder” in his book Condition of the Working-Class in England in 1844.
When society places hundreds of proletarians in such a position that they inevitably meet a too early and an unnatural death… When it deprives thousands of the necessaries of life… forces them, through the strong arm of the law, to remain in such conditions until that death ensues… its deed is murder,” he wrote of Victorian England.
Engels went on to found Marxist theory with fellow German philosopher, Karl Marx.

Such thinking was developed into a conspiracy theory. When society does not progress towards a socialist utopia, as the forces of history dictate, it must be due to the collective and secretive actions of the capitalist class, who would lose out. “Social murder” has benefits for those perpetrating it. Under Stalin, when the collectivization of agriculture failed to progress towards the socialist utopia, it was blamed on a conspiracy by the kulaks. In a famine, this gave justification for closing off the grain supplies to these peasant farmers and feeding the cities. Similarly, when factories failed to meet arbitrary targets, it must have been due to managers sabotaging production for the capitalist enemy. Simply being “outed” by another was sufficient evidence to be shot after torture and a show trial.
English Common Law long ago developed the concept of trial by jury. Charges must be clearly stated and substantiated by evidence. The prosecution must convince a jury of the accused’s peers beyond reasonable doubt. The accused has the right to rebut any allegations. The judge oversees proceedings to make sure established rules and procedures are adhered to. The system recognizes that no matter how good a case might appear to be, it might be fundamentally flawed. Approached from an alternative perspective, what appears to be a convincing case might not be so watertight, or could completely unravel. Of course, allowing the defendant to speak might work the other way: a defense based on outright lies and contradictions will serve to convince a jury of the prosecution’s accusations. In so doing, the aim is for the decision of the court to be the truth.
The development of the British concepts of “fair play” and “a level playing field” emerged alongside those developments in criminal trial by jury. However, a new word entered the Oxford English Dictionary last month, which indicates that trend has now sharply reversed.

Post-truth

Relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.

The objective facts of the Grenfell fire can only be established by careful examination of the evidence by competent persons. In assessing those facts, real lessons can be learned, not only to prevent such a horror recurring but possibly also for people in social housing to be served better. What John McDonnell has achieved with his pronouncements, based on his blinkered prejudices, is to derive conclusions from empty opinion and anger. As a result, any objective assessment that might show that non-Tory politicians share in the blame will be drowned out, along with the lessons about the limits of political competency.

Kevin Marshall

Appendix – John McDonnell’s Claims

McDonnell at Glastonbury 25th June 2017

Is democracy working? It didn’t work if you were a family living on the 20th floor of Grenfell Tower. Those families, those individuals, 79 so far and there will be more, were murdered by political decisions taken over recent decades. The decision not to build homes and to view housing as only for financial speculation rather than meeting a basic human need murdered those families. The decision to close fire stations and to cut 10,000 fire fighters and then to freeze their pay for over a decade. They were political decisions.

McDonnell Interview with NME – Grenfell Tower from 5.35

My Transcript, without editing.

We’re all angry. We’re all angry.

We are angry because we know the causes of that fire. You know, we know what we know that that the physical causes might have been a fridge that burst on fire, the cladding was wrong or .. that will come out at the inquiry, but I think we know roughly that what those causes are. But the real causes are decisions made by successive Tory Governments in particular. Who basically refused to build homes, in London in particular. And then housing then being used not for housing need but for speculative gain. And as a result you get people crammed into unsafe tower blocks and as a result people lose their lives. It’s a scandal, an absolute scandal.

We, we’ve campaigned over the years, for house building, council house building and investment in the housing program. Jeremy and I have been campaigning on that for nearly thirty years. In addition to that we have been campaigning for thirty years for safety. We were both members, in fact we set up the Fires Brigade Union Parliamentary Group that as far back as…. One speech I dug out was 2004 when I was calling for sprinklers as part of safety measures. So what we have said is when we go back into Government, first of all we will start building homes again. We have promised a million new homes. Half of them will be council houses. And that we tackle some of the housing crisis that we have got. Secondly we will ensure that we invest in our public services and that does to me making sure that homes are safe. Last year Labour put up an amendment to legislation which said that landlords should have a legal responsibility to make sure that their homes are fit for human habitation. That was voted down by Conservative MPs, 75 of whom were landlords. Absolutely disgraceful. So when we go back in we’ll build homes and make them safe.

McDonnell on Andrew Marr Show 17th July 2017 Grenfell from 10:35

Unedited Transcript

AM Do you regret saying that the people who died in the Grenfell tower were killed by political murder?

JMcD No I don’t regret that. I was extremely angry with what went on and I am a West London MP.  This site is not far from … Political decisions were made which resulted in the deaths of these people. That’s a scandal.

AM But murder means a specific thing. Murder means a volition to actually kill another human being – intentional killing.

JMcD There is a long history in this country of concept of social murder where decisions are made with no regard to the consequences of that, and as a result of that people have suffered. That whats happened here and I am angry about that.

AM Do you regard it as murder?

JMcD I believe that social murder has occurred in this instance and I believe that people should be held accountable for that.

AM So who are the murderers?

JMcD I think that it has being a consequence of political decisions over the years that have not addressed the housing crisis that we have had. That have cut back on local government, so proper inspections have not been made. Cut back 11,000 firefighters jobs been cut as well. Even the investment in aerial ladders and things like that in our country.

Appendix – definitions

manslaughter
http://dictionary.law.com/Default.aspx?selected=1209

  1. the unlawful killing of another person without premeditation or so-called “malice aforethought” (an evil intent prior to the killing). It is distinguished from murder (which brings greater penalties) by lack of any prior intention to kill anyone or create a deadly situation.

murder
http://dictionary.law.com/default.aspx?selected=1303

  1. the killing of a human being by a sane person, with intent, malice aforethought (prior intention to kill the particular victim or anyone who gets in the way) and with no legal excuse or authority.

Daniel Hannan on the selfishness of running a deficit and post-truth realities

In the latest Ici Londres production Dan Hannan looks at the morality of deficits.

Daniel Hannan starts by quoting Matthew 7:9-10

If the son shall ask bread of any of you that is a father, will you give him a stone? Or if he asks for a fish will you give him a serpent?

The passage goes on to say that even you who are evil know how to give good gifts to your children. By implication, to act for the good, we must also understand how to act for the good, not just have the moral injunction.

Hannan goes on to say that we do not run up large debts to bequeath to our children. Yet many impose a very different standard as voters, convincing themselves that they are being unselfish. Those asking for more money from the State, whether to pay for care in old age, a pay rise in the public sector, or remission of tuition fees, may have a very good claim, but it is not an intrinsically unselfish one, as they are asking everybody else to chip in and pay for their cause. Conversely, those who try to impose some fiscal discipline are deemed selfish, when they are standing up for future generations. Austerity is not a random preference but a simple reality.

This is all pretty obvious stuff to anyone who understands basic morality and has the slightest notion of finance. It is certainly within the understanding of anybody brought up in a traditional British public school education. But I would suggest it is totally alien to the vast majority of the British public. The reason is described by a new word that entered the Oxford English Dictionary last month.

post-truth

Relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.

The General Election campaign is a clear illustration of the domination of post-truthers in public life. There is no understanding of public finances, just mass beliefs that are not based on any moral tradition. These beliefs spread on social media, driven by those who most forcefully and repeatedly express their ideas. People are deemed wrong because they disagree with the mass beliefs, and are shouted down (or trolled, in the electronic version) because of it.

In a post last month – General Election 2017 is a victory for the Alpha Trolls over Serving One’s Country – I concluded

It is on the issue of policy to combat climate change that there is greatest cross-party consensus, and the greatest concentration of alpha trolls. It is also where there is the clearest illustration of policy that is objectively useless and harmful to the people of this country.

Like public finances, climate change is an area where post-truthers dominate. Two examples illustrate this.

Consensus messaging

There is no clear evidence of an emerging large human-caused problem with climate, and there is no prospect of action to reduce greenhouse gas emissions to near zero. Instead we have a dodgy survey that claimed 97% of academic papers matching an internet search on the topics ‘global climate change’ or ‘global warming’ expressed support (belief / assumptions) for the broadest, most banal form of the global warming hypothesis. This was converted by Senator Bernie Sanders, in questioning Scott Pruitt, into the following:-

As you may know, some 97% of scientists who have written articles for peer-reviewed journals have concluded that climate change is real, it is caused by human activity, and it is already causing devastating problems in the US and around the world.

And

While you are not certain, the vast majority of scientists are telling us that if we do not get our act together and transform our energy system away from fossil fuel there is a real question as to the quality of the planet that we are going to be leaving our children and our grandchildren. 

The conversion from banal belief to these sweeping statements is not the fault of the Senator, though he (or his speech-writers) should have checked. Rather, it is the fault of lead author John Cook and his then PhD supervisor, Cognitive Psychology Professor Stephan Lewandowsky. Post-truthers will not recognize the glaring difference between the dodgy survey and the Senator’s statements, as appeals to emotion and belief are primary in evaluating political realities.

Mitigating Climate Change

Dangerous climate change is allegedly caused by human greenhouse gas emissions. The proposed solution is to reduce those emissions (mostly CO2 emissions from the burning of fossil fuels) to near zero. The key for policy is that emissions are global, yet most countries, covering over 80% of the global population, have no primary obligation under the 1992 Rio Declaration to reduce their emissions. These developing “non-Annex” countries have accounted for all the increase in emissions since 1990, as shown in this graph.

The problem can be expressed in my First Law of Climate Mitigation

To reduce global greenhouse gas emissions, the aggregate reduction in countries that reduce their emissions must be greater than aggregate increase in emissions in all other countries.

All the ranting about supporting the Paris Agreement ignores this truism. As a result, countries like the UK that pursue climate mitigation will increase their energy costs and make life harder for their people, whilst not achieving the policy aims. It is the poorest in those policy countries who will bear the biggest burden, creating comparative disadvantages relative to the non-policy countries. For the developing countries (shown in purple in the graph) to reduce their emissions would destroy their economic growth, preventing the slow climb out of the extreme poverty still endured by the majority of people on this planet. In so doing we ignore the moral tradition from our Christian heritage that the primary moral concern of public policy should be to help the poor, the disadvantaged and the marginalized. Ignoring the truism and pursuing such policies bequeaths a worse future for our children and our grandchildren. This is as true for climate change as for public finances. But in both cases it is the post-truth “reality” that prevents recognition of basic logic and wider morality.
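The First Law of Climate Mitigation is simple arithmetic, which a toy sketch can illustrate (the country figures below are purely hypothetical, chosen for illustration only):

```python
# Hypothetical annual changes in emissions, in GtCO2e. Policy countries
# cut; non-policy countries grow. The global change is just the sum.
policy_cuts = {"UK": -0.10, "Germany": -0.20, "France": -0.05}
non_policy_growth = {"China": 0.80, "India": 0.50, "Indonesia": 0.20}

net_change = sum(policy_cuts.values()) + sum(non_policy_growth.values())
print(f"Net global change: {net_change:+.2f} GtCO2e")  # +1.15, still rising
```

Unless the aggregate cuts exceed the aggregate increases, global emissions rise regardless of how drastic the policy countries’ efforts are.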

Kevin Marshall

 

Time will run out to prevent 2°C warming barrier being breached

I have a number of times referred to a graphic “Figure 2 Summary of Results” produced by the UNFCCC for the Paris COP21 Climate Conference in December 2015. It was a centerpiece of the UNFCCC Synthesis report on the aggregate effect of INDCs.

The updated graphic (listed as Figure 2, below the Main Document pdf) is below

This shows in yellow the impact of the INDC submissions (covering the period 2015 to 2030) if fully implemented, against limiting warming to 2°C and 1.5°C. It showed the gulf between the vague policy reality and the targets. Simply put, the net result of the INDC submissions would be insufficient even for global emissions to peak. Yet in reaching an “agreement” the representatives of the entire world collectively put off recognizing that gulf.

For the launch of the UNIPCC AR5 Synthesis Report in 2014, a set of slides was produced to briefly illustrate the policy problem. This is slide 20 of 35, showing the emissions reduction pathways.

 

The 2°C  of warming central estimate is based upon total GHG emissions in the 21st Century being around 2500 GtCO2e.

At the launch of 2006 Stern Review Sir Nicholas Stern did a short Powerpoint presentation. Slide 4 of the PDF file is below.

 

The 450ppm CO2e emissions pathway is commensurate with 2°C  of warming. This is based upon total GHG emissions in the 21st Century being around 2000 GtCO2e, with the other 500 GtCO2e presumably coming in the 22nd Century.

Since the UNFCCC Paris graphic is also based on 2500 GtCO2e, it is possible to calculate the emissions reduction pathway if we assume (a) all INDC commitments are met, (b) forecasts are correct, and (c) no additional mitigation policies are enacted.

I have produced a basic graph showing the three different scenarios.

The Stern Review assumed global mitigation policy would be enacted around 2010. Cumulative 21st Century emissions would then have been around 450 GtCO2e. With 500 GtCO2e allowed for post-2100, this gave average emissions of around 17 GtCO2e per annum for the rest of the century. 17 GtCO2e is just under 40% of the emissions in the year the policy would be enacted.

IPCC AR5  assumed global mitigation policy would be enacted around 2020. Cumulative 21st Century emissions would then have been around 950 GtCO2e. A presentation to launch the Synthesis Report rounded this to 1000 GtCO2e as shown in slide 33 of 35.

Assuming that global emissions were brought to zero by the end of the century, this gave average emissions of 20 GtCO2e per annum for the rest of the century. 20 GtCO2e is just under 40% of the emissions in the year the theoretical global policy would be enacted. The stronger assumption of global emissions being reduced to zero before the end of the century, along with a bit of rounding, offsets the delay.

If the Paris Agreement had been fully implemented, then by 2030 cumulative 21st Century emissions would have been around 1500 GtCO2e, leaving average emissions of around 14 GtCO2e per annum for the rest of the century. 14 GtCO2e is just over 25% of the emissions in the year the policy would be enacted. The failure of the Paris Agreement makes it necessary for true global mitigation policies, if in place by 2030, to be far more drastic than those of just a few years before to achieve the same target.
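The arithmetic behind the three scenarios can be laid out explicitly. The budgets and cumulative figures below simply restate the assumptions in this post (all in GtCO2e); the residual budget is averaged over the years remaining to 2100, so the computed AR5 figure of 18.75 is what the Synthesis Report presentation rounds up to around 20:

```python
# (21st-century emissions budget, policy start year, cumulative emissions
# at the start year), all in GtCO2e, as assumed in the text above.
scenarios = {
    "Stern Review (2006)": (2000, 2010, 450),   # a further 500 assumed post-2100
    "IPCC AR5 (2014)":     (2500, 2020, 1000),
    "Paris INDCs (2030)":  (2500, 2030, 1500),
}

for name, (budget, start, cumulative) in scenarios.items():
    avg = (budget - cumulative) / (2100 - start)
    print(f"{name}: average {avg:.1f} GtCO2e per annum to 2100")
```

The later the start of genuine global mitigation, the smaller the permitted annual average, hence the more drastic the cuts required for the same 2°C target.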

But the Paris Agreement will not be fully implemented. As Manhattan Contrarian (hattip The GWPF) states, the US was the only major country proposing to reduce its emissions. It looks like China, India, Indonesia, Russia and Germany will all increase their emissions. Further, there is no indication that most countries have any intention of drastically reducing their emissions. To pretend otherwise is to ignore a truism, what I term the First Law of Climate Mitigation

To reduce global greenhouse gas emissions, the aggregate reduction in countries that reduce their emissions must be greater than aggregate increase in emissions in all other countries.

Modeled projections and targets are rendered meaningless if this truism is ignored. Yet this is what the proposers of climate mitigation policy have been effectively doing for many years. Emissions will therefore breach the mythical 2°C warming barrier, but based on recent data I believe warming will be nowhere near that level.

Kevin Marshall

 

 

Larsen C ice-shelf break-away is not human-caused but Guardian tries hard to imply otherwise

A couple of days ago the BBC had an article Giant iceberg splits from Antarctic.

The giant block is estimated to cover an area of roughly 6,000 sq km; that’s about a quarter the size of Wales.

A US satellite observed the berg on Wednesday while passing over a region known as the Larsen C Ice Shelf.

Scientists were expecting it. They’d been following the development of a large crack in Larsen’s ice for more than a decade.

The rift’s propagation had accelerated since 2014, making an imminent calving ever more likely.

After looking at various evidence the BBC concludes

“Most glaciologists are not particularly alarmed by what’s going on at Larsen C, yet. It’s business as usual.”

Researchers will be looking to see how the shelf responds in the coming years, to see how well it maintains a stable configuration, and if its calving rate changes.

There was some keen interest a while back when the crack, which spread across the shelf from a pinning point known as the Gipps Ice Rise, looked as though it might sweep around behind another such anchor called the Bawden Ice Rise. Had that happened, it could have prompted a significant speed-up in the shelf’s seaward movement once the berg came off.

As it is, scientists are not now expecting a big change in the speed of the ice.

That is, the theory of a link with accelerating global warming is no longer held, due to lack of evidence. But the Guardian sees things differently.

Unlike thin layers of sea ice, ice shelves are floating masses of ice, hundreds of metres thick, which are attached to huge, grounded ice sheets. These ice shelves act like buttresses, holding back and slowing down the movement into the sea of the glaciers that feed them.

“There is enough ice in Antarctica that if it all melted, or even just flowed into the ocean, sea levels [would] rise by 60 metres,” said Martin Siegert, professor of geosciences at Imperial College London and co-director of the Grantham Institute for Climate Change & Environment. 

Despite the lack of evidence for the hypothesis about accelerating ice loss due to glaciers slipping into the sea, the Guardian still quotes it. The article then has a quote from someone who seems to extend the hypothesis to the entire continent. Inspection of the Guardian’s own useful map of the location of Larsen C might have been helpful.

Larsen C is located mid-way up the Antarctic Peninsula, which comprises around 2% of the area of Antarctica. The Peninsula has seen some rapid warming, quite unlike East Antarctica, where very little warming has been detected. That is, the Antarctic Peninsula is climatically different from the vast majority of the continent, where nearly all of the ice mass is located.

The article then goes on to contradict the implied link with climate change, so the quote is out of context.

Andrew Shepherd, professor of Earth Observation at the University of Leeds, agreed. “Everyone loves a good iceberg, and this one is a corker,” he said. “But despite keeping us waiting for so long, I’m pretty sure that Antarctica won’t be shedding a tear when it’s gone because the continent loses plenty of its ice this way each year, and so it’s really just business as usual!”

However, the Guardian then slips in another out of context quote at the end of the article.

The news of the giant iceberg comes after US president Donald Trump announced that the US will be withdrawing from the 2015 Paris climate accord – an agreement signed by more than 190 countries to tackle global warming. 

Another quote from the BBC article helps give more perspective.

How does it compare with past bergs?

The new Larsen berg is probably in the top 10 biggest ever recorded.

The largest observed in the satellite era was an object called B-15. It came away from the Ross Ice Shelf in 2000 and measured some 11,000 sq km. Six years later, fragments of this super-berg still persisted and passed by New Zealand.

In 1956, it was reported that a US Navy icebreaker had encountered an object of roughly 32,000 sq km. That is bigger than Belgium. Unfortunately, there were no satellites at the time to follow up and verify the observation.

It has been known also for the Larsen C Ice Shelf itself to spawn bigger bergs. An object measuring some 9,000 sq km came away in 1986. Many of Larsen’s progeny can get wound up in a gyre in the Weddell sea or can be despatched north on currents into the Southern Ocean, and even into the South Atlantic.

A good number of bergs from this sector can end up being caught on the shallow continental shelf around the British overseas territory of South Georgia where they gradually wither away.

Bigger events have happened in the past. It is only due to recent technologies that we are able to measure the break-up of ice shelves, or even to observe icebergs the size of small countries.

Note that the Guardian graphic is sourced from Swansea University. Bloomberg has a quote that puts the record straight.

“Although this is a natural event, and we’re not aware of any link to human-induced climate change,” said Martin O’Leary, a glaciologist at Swansea University, in a statement.

Kevin Marshall

The Closest yet to my perspective on Climate Change

Michael S. Bernstam of the Hoover Institution has produced a short post, Inconvenient Math (hattip The GWPF). The opening paragraphs are:-

Climate change faces a neglected actuarial problem. Too many conditions must be met to warrant a policy action on climate change. The following four stipulations must each be highly probable:

1. Global warming will accumulate at 0.12 degrees Celsius or higher per decade.

2. It is anthropogenic, due largely to carbon dioxide emissions.

3. The net effect is harmful to human well-being in the long run.

  4. Preventive measures are efficient, that is, feasible at the costs not exceeding the benefits.

But even if the probability of each of these stipulations is as high as 85 percent, their compound probability is as low as 50 percent. This makes a decision to act or not to act on climate change equivalent to flipping a coin.

Bernstam later states

In the case of climate change, the conditions are four. They are not random, nor are they arbitrary. To see this, one can run a thought experiment and drop or ignore any of the above foursome. At once, the entire call for action on climate change becomes pointless. If global warming is not ongoing, there is no need to stop it. If it is not anthropogenic, there is no need to curb carbon dioxide emissions. If it is not harmful, there is no need to worry. If preventive measures are inefficient, they would not help and there is no use applying them. It follows that all four conditions are necessary. If just one of them does not hold, action is unnecessary or useless.

That is, for action on climate change to be justified (in terms of having a reasonable expectation that by acting to combat climate change a better future will be created than by not acting) there must be human-caused warming of sufficient magnitude to produce harmful consequences, AND measures that cost less than the expected future costs that they offset.

These sentiments are a simplified version of a series of posts I made in October 2013, where I very crudely derived two cost curves (the costs of climate change and of climate mitigation). This aimed to replicate a takeaway quote from the Stern Review.

Using the results from formal economic models, the Review estimates that if we don’t act, the overall costs and risks of climate change will be equivalent to losing at least 5% of global GDP each year, now and forever. If a wider range of risks and impacts is taken into account, the estimates of damage could rise to 20% of GDP or more. In contrast, the costs of action – reducing greenhouse gas emissions to avoid the worst impacts of climate change – can be limited to around 1% of global GDP each year.

I looked at the idea of multiplying the various probabilities together, at least for the costs of climate change. But rather than a single boundary figure, the result is a continuous function over an infinite number of possible scenarios. In general I believe that the more extreme the projected costs of warming, the less likely they are to occur. The reason is that the non-visible part of the cost curve can only be objectively derived from the warming revealed in the recent past, and separating the costs of warming-induced climate change from the costs of random extreme weather events is extremely difficult. Even worse, the costs of extreme natural weather events (especially in terms of death toll) have been falling over time, as Indur Goklany has documented.

The fall-back for global-warming theory is the late Milton Friedman’s Methodology of Positive Economics: evaluate a theory’s credibility by its predictive ability. If, in the short run, climate scientists (or believers in climate alarmism such as Al Gore) were able to predict the signals of impending climate apocalypse, this would lend some credibility to claims of substantially worse to come. The problem is that there is a huge number of failed predictions of climate worsening, but not a single one that has come true. This would signify that the true risk of climate change (as opposed to the risk perceived by the climate community) is approximately zero. The divergence of belief from the evidence likely stems from the collective navel-gazing of post-normal science.
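The claim that more extreme warming costs are less likely can be sketched as a probability-weighted sum over discrete scenarios. The figures below are purely hypothetical, chosen only to illustrate how an expected cost emerges when probabilities decline as costs rise (the 5% and 20% cost levels echo the Stern Review quote above, but the probabilities attached to them are invented for illustration):

```python
# Hypothetical scenarios: (cost as % of global GDP, assumed probability).
# Probabilities decline as costs rise and sum to 1.0. These numbers
# are illustrative only, not taken from any study.
scenarios = [
    (1.0, 0.50),   # mild warming costs
    (5.0, 0.30),   # Stern's central "at least 5%" case
    (10.0, 0.15),  # severe
    (20.0, 0.05),  # Stern's upper "20% or more" case
]

# Expected cost = sum of each scenario's cost weighted by its probability.
expected_cost = sum(cost * prob for cost, prob in scenarios)
print(f"Expected cost of climate change: {expected_cost:.1f}% of global GDP")
```

A continuous version would replace the discrete list with an integral over a cost density, but the principle — weighting each outcome by its (falling) probability — is the same.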

The policy aspect that Bernstam fails to explore is the redistributional aspect of policy. The theory is that global warming is caused by global greenhouse gas emissions; therefore climate mitigation must consist of reducing those global emissions. However, as COP21 in Paris showed, most of the world’s population live in countries where no GHG emissions reduction policies have even been proposed. Yet actually reducing emissions means increasing energy costs and hampering businesses with onerous regulations. Policy countries are put at a comparative disadvantage to non-policy countries, as I tried to show here. The implication is that if developed countries strongly pursue high-cost mitigation policies, the marginal cost to non-policy emerging economies of switching to emissions reduction policies increases. Thus, whilst Donald Trump’s famous tweet that global warming is a Chinese hoax to make US manufacturing non-competitive is false, the impact of climate mitigation policies as currently pursued is the same as if it were true.

There is also a paradox with the costs of climate change. The costs of climate change are largely related to the unexpected nature of the costly events. For instance, ceteris paribus, a category 1 hurricane could be more costly in a non-hurricane area than a stronger hurricane in, say, Florida. The reason is that in the non-hurricane area buildings will not be as resistant to storms, nor will early warning procedures be in place as they are in Florida. The paradox is that the more successful climate scientists are in forecasting the risks of climate change, the more people can adapt, reducing the costs. The current focus on climate consensus, rather than on increasing competency and developing real expertise in the field, is actually harmful to future generations if climate change really is a serious emerging problem. But the challenge for the climate alarmists is that developing real expertise may reveal that their beliefs about the world are false.

Finally, Bernstam fails to acknowledge an immutable law of public policy. Large, complex public policy projects with vague aims, poorly defined plans and no measurable costs tend to overshoot on costs and under-deliver on benefits. Climate mitigation is an extreme example of complexity, lack of clear objectives and lack of objective measurement of costs per unit of emissions saved.

Kevin Marshall