Windhoek Temperature adjustments

At Euan Mearns’ blog I made reference to my findings, posted in full last night, that the Isfjord Radio weather station had adjustments varying between +4.0°C in 1917 and -1.7°C in the 1950s. I challenged anyone to find bigger adjustments than that. Euan came back with the example of Windhoek in Namibia, claiming 5°C of adjustments between the “raw” and GISS homogenised data.

I cry foul, as the adjustments run throughout the data set.

That is, the whole of the data set has been adjusted up by about 4°C!

However, comparing the “raw” data with the GISS homogenised data using 5-year moving averages (alongside the net adjustments) reveals some interesting features.
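A comparison of this kind takes only a few lines of pandas. The sketch below uses made-up stand-in numbers (the real raw and homogenised series would be downloaded from the GISS station pages) simply to show the shape of the calculation:

```python
import pandas as pd

# Illustrative stand-in series; real data would come from the GISS station pages.
years = range(1920, 1930)
raw = [18.0, 18.1, 17.9, 18.2, 18.0, 17.8, 18.3, 18.1, 18.0, 18.2]
homogenised = [t + 4.0 for t in raw]  # the ~4°C net uplift described above

df = pd.DataFrame({"raw": raw, "homogenised": homogenised},
                  index=pd.Index(years, name="year"))

# Net adjustment applied in each year
df["adjustment"] = df["homogenised"] - df["raw"]

# Centred 5-year moving averages smooth out year-to-year noise
smoothed = df[["raw", "homogenised"]].rolling(window=5, center=True).mean()
```

With the real series loaded in place of the stand-in lists, plotting `smoothed` alongside `df["adjustment"]` reproduces the comparison discussed here.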

The overall temperatures have been adjusted up by around 4°C, but

  • From the start of the record in 1920 to 1939 the cooling has been retained, if not slightly amplified.
  • The warming from 1938 to 1947 of 1.5°C has been erased by a combination of deleting the 1940 to 1944 data and reducing the 1945-1948 adjustments by 1.4°C.
  • The 1945-1948 adjustments, along with random adjustments and deletion of data, mostly remove the near 1.5°C of cooling from the late 1940s to the mid-1950s and the slight rebound through to the early 1960s.
  • The early 1970s cooling and the warming to the end of the series in the mid-1980s are largely untouched.

The overall adjustments leave a peculiar picture that cannot be explained by a homogenisation algorithm. The cooling in the 1920s offsets the global trend. Deletion of data and adjustments counter the peak of warming in the early 1940s seen in the global data. Natural variations in the raw data between the late 1940s and 1970 appear to have been removed, while the slight early 1970s cooling and the subsequent warming in the raw data are left alone. However, the raw data shows average temperatures in the 1980s to be around 0.8°C higher than in the early 1920s. The adjustments seem to have removed this.

This removal of the warming trend tends to disprove something else. There appears to be no clever conspiracy, with a secret set of true figures. Rather, there are a lot of people dipping in to adjust already-adjusted data to their view of the world, but nobody really questioning the results. They have totally lost sight of what the real data actually is. Had they compared the final adjusted data with the raw data, they would have realised that the adjustments had eliminated a warming trend of over 1°C per century.

Kevin Marshall

The Propaganda methods of …and Then There’s Physics on Temperature Homogenisation

There has been a rash of blog articles about temperature homogenisations that is challenging the credibility of the NASA GISS temperature data. This has led to attempts by the anonymous blogger andthentheresphysics (ATTP) to crudely deflect from the issues identified. It is a propagandist’s trick of turning people’s perspectives. Instead of a dispute about some scientific data, ATTP turns the affair into a dispute between those with authority and expertise in scientific analysis and a few crackpot conspiracy theorists.

The issues on temperature homogenisation are to do with the raw surface temperature data and the adjustments made to remove anomalies or biases within the data. “Homogenisation” is the term used for the process of bringing anomalous data into line with that from the surrounding stations.

The blog articles can be split into three categories. The primary articles are those that make direct reference to the raw data set and the surrounding adjustments. The secondary articles refer to the primary articles, and comment upon them. The tertiary articles are directed at the secondary articles, making little or no reference to the primary articles. I perceive the two ATTP articles as fitting into the scheme below.

Primary Articles

The source of complaints about temperature homogenisations is Paul Homewood at his blog notalotofpeopleknowthat. The source of the data is NASA’s Goddard Institute for Space Studies (GISS) database. For any weather station GISS provide nice graphs of the temperature data. The current “after GISS homogeneity adjustment” data is available here, and the “raw GHCN data + USHCN corrections” is available here, though only up until 2011. Homewood’s primary analysis was to show the “raw” and adjusted data side by side.

20/01/15 Massive Tampering With Temperatures In South America

This looked at all three available rural stations in Paraguay. The data from all three, at Puerto Casado, Mariscal and San Juan Bautista/Misiones, had the same pattern of homogenization adjustments. That is, cooling of the past: whereas the raw data showed the 1960s as warmer than today, the adjusted data showed them as cooler. What could they have been homogenized to?

26/01/15 All Of Paraguay’s Temperature Record Has Been Tampered With

This checked the six available urban sites in Paraguay. Homewood’s conclusion was that

warming adjustments have taken place at every single, currently operational site in Paraguay.

How can homogenization adjustments all go the same way? There is no valid reason for making such adjustments, as there is no reference point for them.

29/01/15 Temperature Adjustments Around The World

Homewood details other examples from Southern Greenland, Iceland, Northern Russia, California, Central Australia and South-West Ireland. Instead of comparing the raw with the adjusted data, he compared the old adjusted data with the recent adjusted data. Adjustment decisions have changed over time, giving the adjusted data sets even more pronounced warming trends.

30/01/15 Cooling The Past In Bolivia

Then he looked at all 14 available stations in neighbouring Bolivia. His conclusion was

“At every station, bar one, we find the … past is cooled and the present warmed.”

(The exception was La Paz, where the cooling trend in the raw data had been reduced.)

Why choose Paraguay in the first place? In the first post, Homewood explains that a NOAA temperature map for the period 1981-2010 showed a warming hotspot around Paraguay. Being a former accountant, he checked the underlying data to see if the hotspot existed in the data. Finding an anomaly in one area, he checked more widely.

The other primary articles are

26/01/15 Kevin Cowtan NOAA Paraguay Data

This YouTube video was made in response to Christopher Booker’s article in the Telegraph, a secondary source of the data. Cowtan assumes that Booker is the primary source and that he is criticizing NOAA data. A screenshot of the first paragraph shows both assumptions to be untrue.

Further, if you read down the article, Cowtan’s highlighting of the data from one weather station is also misleading. Booker points to three stations, though he illustrates just one.

Despite this, it still ranks as a primary source, as there are direct references to the temperature data and the adjustments. They are not the GISS adjustments, but might be the same.

29/01/15 Shub Niggurath – The Puerto Casado Story

Shub looked at the station moves. He found that the metadata for the station is a mess, so there is no actual evidence of the location changing. But Shub reasons that the step change in the data means the station moved, and that the move explains the step change. Shub is a primary source as he looks at the reasons for the adjustments.

 

Secondary Articles

The three secondary articles by Christopher Booker, James Delingpole and Bishop Hill are just the connectors in this story.

 

Tertiary articles of “…and Then There’s Physics”

25/01/15 Puerto Cascado

This looked solely at Booker’s article. It starts

Christopher Booker has a new article in The Telegraph called Climategate, the sequel: How we are STILL being tricked with flawed data on global warming. The title alone should be enough to convince anyone sensible that it isn’t really worth reading. I, however, not being sensible, read it and then called Booker an idiot on Twitter. It was suggested that rather than insulting him, I should show where he was wrong. Okay, this isn’t really right, as there’s only so much time and effort available, and it isn’t really worth spending it rebutting Booker’s nonsense.

However, thanks to a tweet from Ed Hawkins, it turns out that it is really easy to do. Booker shows data from a site in Paraguay (Puerto Casado) in which the data was adjusted from a trend of -1.37°C per century to +1.36°C per century. Shock, horror, a conspiracy?

 

ATTP is highlighting an article, but strongly discouraging anybody from reading it. That is why the referral is a red line in the graphic above. He then says he is not going to provide a rebuttal. ATTP is as good as his word and does not provide one. Basically he is saying “Don’t look at that rubbish, look at the real authority“. But he is wrong for a number of reasons.

  1. ATTP provides misdirection to an alternative data source. Booker quite clearly states that the source of the data is the NASA GISS temperature set. ATTP cites Berkeley Earth.
  2. Booker clearly states that there are three rural temperature stations, spatially spread, that show similar results. ATTP’s argument that a single site was homogenized with the others in the vicinity falls over.
  3. This was further undermined by Paul Homewood’s posting on the same day on the other 6 available sites in Paraguay, all giving similar adjustments.
  4. It was further undermined by Paul Homewood’s posting on 30th January on all 14 sites in Bolivia.

The story is not of a wizened old hack making some extremist claims without any foundation, but of a retired accountant seeing an anomaly and exploring it. In audit, if there is an issue you keep exploring it until you can bottom it out. Paul Homewood has found an issue, found it to be extensive, but is still far from finding its full extent or depth. ATTP, when confronted by my summary of the 23 stations that corroborate each other, chose to delete it. He has now issued an update.

Update 4/2/2015 : It’s come to my attention that some are claiming that this post is misleading my readers. I’m not quite sure why, but it appears to be related to me not having given proper credit for the information that Christopher Booker used in his article. I had thought that linking to his article would allow people to establish that for themselves, but – just to be clear – the idiotic, conspiracy-laden, nonsense originates from someone called Paul Homewood, and not from Christopher Booker himself. Okay, everyone happy now?

ATTP cannot accept that he is wrong. He has totally misrepresented the arguments. When confronted with alternative evidence he resorts to vitriolic claims. If someone is on the side of truth and science, they will encourage people to compare and contrast the evidence. He seems to have forgotten the advice about what to do when in a hole…..

01/02/15 Temperature homogenisation

ATTP’s article on Temperature Homogenisation starts

Amazing as it may seem, the whole tampering with temperature data conspiracy has managed to rear its ugly head once again. James Delingpole has a rather silly article that even Bishop Hill calls interesting (although, to be fair, I have a suspicion that in “skeptic” land, interesting sometimes means “I know this is complete bollocks, but I can’t bring myself to actually say so”). All of Delingpole’s evidence seems to come from “skeptic” bloggers, whose lack of understanding of climate science seems – in my experience – to be only surpassed by their lack of understanding of the concept of censorship.

ATTP starts with a presumption of being on the side of truth, with no fault possible on his side. Any objections are due to a conscious effort to deceive. The theory of cock-up, or of people simply not checking their data, does not seem to have occurred to him. Then there is a link to Delingpole’s secondary article, but calling it “silly” again deters readers from looking for themselves. If they did look, readers would be presented with flashing images of all the “before” and “after” GISS graphs from Paraguay, along with links to the 6 global sites and Shub’s claims that there is a lack of evidence for the Puerto Casado site having been moved. Delingpole was not able to include the more recent evidence from Bolivia, which further corroborates the story.

He then makes a tangential reference to his deleting my previous comments, though I never once used the term “censorship”, nor did I tag the article “climate censorship”, as I have done to some others. As with basic physics, ATTP claims to have a superior understanding of censorship.

There are then some misdirects.

  • The long explanation of temperature homogenisation makes some good points. But what it does not do is explain that the size and direction of any adjustment is a matter of opinion, and as such can be wrong. It is a misdirection to say that the secondary sources are against any adjustments. They are against adjustments that create biases within the data.
  • Quoting Richard Betts’s comment on Booker’s article about negative adjustments in sea temperature data is a misdirection, as Booker (a secondary source) was talking about Paraguay, a land-locked country.
  • Referring to Cowtan’s alternative analysis is another misdirect, as pointed out above. Upon reflection, ATTP may find it a tad embarrassing to have this as his major source of authority.

Conclusions

When I studied economics, many lecturers said that if you want to properly understand an argument or debate you need to look at the primary sources, and then compare and contrast the arguments. Although the secondary sources were useful background, particularly in a contentious issue, it is the primary sources on all sides that enable a rounded understanding. Personally, being challenged by viewpoints that I disagreed with enhanced my overall understanding of the subject.

ATTP has managed to turn this on its head. He uses methods akin to those of the crudest propagandists of the last century. They started from deeply prejudiced positions; attacked an opponent’s integrity and intelligence; and then deflected away to what they wanted to say. They never gave the slightest hint that their own side might be at fault, or any acknowledgement that the other might have a valid point. For ATTP, and similar modern propagandists, rather than a debate about the quality of evidence and science, it becomes a war of words between “deniers“, “idiots” and “conspiracy theorists” on one side and the basic physics and the overwhelming evidence that supports that science on the other.

If there is any substance to these allegations concerning temperature adjustments, then for dogmatists like ATTP they become a severe challenge to their view of the world. If temperature records have systematic adjustment biases then climate science loses its grip on reality. The climate models cease to be about understanding the real world, and instead conform to people’s flawed opinions about the world.

The only way to properly understand the allegations is to examine the evidence. That is to look at the data behind the graphs Homewood presents. I have now done that for the nine Paraguayan weather stations. The story behind that will have to await another day. However, although I find Paul Homewood’s claims of systematic biases in the homogenisation process to be substantiated, I do not believe that it points to a conspiracy (in terms of a conscious and co-ordinated attempt to deceive) on the part of climate researchers.

DECC’s Dumb Global Calculator Model

On the 28th January 2015, the DECC launched a new policy emissions tool, so everyone can design policies to save the world from dangerous climate change. I thought I would try it out. By simply changing the parameters one by one, I found that the model is both massively over-sensitive to small changes in input parameters and based on British data. From the model, it is possible to entirely eliminate CO2 emissions by 2100 by a combination of three things – reducing the percentage of urban travel by car from 43% to 29%; reducing the average size of homes to 95m2 from 110m2 today; and everyone going vegetarian.

The DECC website says

Cutting carbon emissions to limit global temperatures to a 2°C rise can be achieved while improving living standards, a new online tool shows.

The world can eat well, travel more, live in more comfortable homes, and meet international carbon reduction commitments according to the Global Calculator tool, a project led by the UK’s Department of Energy and Climate Change and co-funded by Climate-KIC.

Built in collaboration with a number of international organisations from US, China, India and Europe, the calculator is an interactive tool for businesses, NGOs and governments to consider the options for cutting carbon emissions and the trade-offs for energy and land use to 2050.

Energy and Climate Change Secretary Edward Davey said:

“For the first time this Global Calculator shows that everyone in the world can prosper while limiting global temperature rises to 2°C, preventing the most serious impacts of climate change.

“Yet the calculator is also very clear that we must act now to change how we use and generate energy and how we use our land if we are going to achieve this green growth.

“The UK is leading on climate change both at home and abroad. Britain’s global calculator can help the world’s crucial climate debate this year. Along with the many country-based 2050 calculators we pioneered, we are working hard to demonstrate to the global family that climate action benefits people.”

Upon entering the calculator I was presented with some default settings. Starting from baseline emissions in 2011 of 49.9 GT/CO2e, these would give predicted emissions of 48.5 GT/CO2e in 2050 and 47.9 GT/CO2e in 2100 – virtually unchanged. Cumulative emissions to 2100 would be 5248 GT/CO2e, compared with the 3010 GT/CO2e target to give a 50% chance of limiting warming to a 2°C rise. So the game is on to save the world.

I only dealt with the TRAVEL, HOMES and DIET sections on the left.

I went through each of the parameters, noting the results and then resetting back to the baseline.

The TRAVEL section seems to be based on British data, and concentrates on urban people. Extrapolating to the rest of the world seems a bit of a stretch, particularly when over 80% of the world is poorer. I was struck first by changing the mode of travel. If car usage in urban areas fell from 43% to 29%, global emissions from all sources in 2050 would be 13% lower. If car usage in urban areas increased from 43% to 65%, global emissions from all sources in 2050 would be 7% higher. The proportions are wrong (-14 points gives -13%, but +22 points gives only +7%), and urban travel looks like far too high a proportion of global emissions.
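The asymmetry can be made explicit by working out the implied sensitivity of emissions per percentage point of car share, using only the figures above (a sketch; a well-behaved model would give similar sensitivities in both directions):

```python
baseline_share = 43  # % of urban journeys by car in the default settings

# (new car share %, resulting % change in 2050 global emissions) from the calculator
scenarios = [(29, -13), (65, +7)]

for share, emissions_change in scenarios:
    delta = share - baseline_share
    sensitivity = emissions_change / delta  # % emissions change per point of share
    print(f"{delta:+d} points of car share -> {emissions_change:+d}% emissions "
          f"({sensitivity:.2f}% per point)")
```

The downward scenario implies about 0.93% of global emissions per point of car share, the upward one only about 0.32% per point, nearly a threefold difference.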

The HOMES section has similar anomalies. Reducing the average home area by 2050 to 95m2 from 110m2 today reduces total global emissions in 2050 by 20%. Independently, setting average urban house temperatures in 2050 to 17°C in winter and 27°C in summer, instead of 20°C and 24°C, reduces total global emissions in 2050 by 7%. Both seem to be based on British data, and are highly implausible in a global context.

In the DIET section things get really silly. Cutting average calorie consumption globally by 10% reduces total global emissions in 2050 by 7%. I never realised that saving the planet required some literal belt-tightening. Then we move on to meat consumption. The baseline for 2050 is 220 Kcal per person per day, against the current European average of 281 Kcal. Reducing that to 14 Kcal reduces global emissions from all sources in 2050 by 73%. Alternatively, plugging in the “worst case” 281 Kcal increases global emissions from all sources in 2050 by 71%. That is, if the world becomes as carnivorous in 2050 as the average European was in 2011, global emissions from all sources, at 82.7 GT/CO2e, will be over six times higher than the 13.0 GT/CO2e of the low-meat case. For comparison, OECD and Chinese emissions from fossil fuels in 2013 were respectively 10.7 and 10.0 GT/CO2e. It seems it will be nut cutlets all round at the climate talks in Paris later this year. No need for China, India and Germany to scrap all their shiny new coal-fired power stations.
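The meat figures can be cross-checked against the calculator’s default 2050 forecast of 48.5 GT/CO2e quoted earlier. A quick sketch, using only numbers from the text above (small residuals come from rounding in the quoted percentages):

```python
default_2050 = 48.5  # GT/CO2e, the calculator's default 2050 forecast
high_meat = 82.7     # 281 Kcal/person/day scenario
low_meat = 13.0      # 14 Kcal/person/day scenario

# The reported +71% and -73% swings are consistent with the absolute figures
assert abs(default_2050 * (1 + 0.71) - high_meat) < 0.5
assert abs(default_2050 * (1 - 0.73) - low_meat) < 0.5

ratio = high_meat / low_meat
print(f"High-meat scenario is {ratio:.1f} times the low-meat scenario")
```

The ratio comes out at about 6.4, i.e. the "over six times" stated above.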

Below is the before and after of the increase in meat consumption.

Things get really interesting if I take the three most sensitive, yet independent, scenarios together. That is, reducing urban car use from 43% to 29% of journeys in 2050; reducing the average home area by 2050 to 95m2 from 110m2; and effectively making a sirloin steak (medium rare) and venison in redcurrant sauce things of the past. Adding the separate results together gives global emissions of -2.8 GT/CO2e in 2050 and -7.1 GT/CO2e in 2100, with cumulative emissions to 2100 of 2111 GT/CO2e. The model does have some combination effect: run together, it gives global emissions of 3.2 GT/CO2e in 2050 and -0.2 GT/CO2e in 2100, with cumulative emissions to 2100 of 2453 GT/CO2e. Below is the screenshot of the combined elements, along with a full table of my results.
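The nonsensical negative figure from naive addition can be reproduced from the individual percentage reductions reported above (a sketch; rounding in the quoted percentages accounts for the small difference from the -2.8 figure, and the calculator's own combined run, which allows for interactions, gives 3.2 GT/CO2e instead):

```python
baseline_2050 = 48.5  # GT/CO2e, the calculator's default 2050 forecast

# Individual reductions in 2050 emissions reported for each measure
reductions = {
    "urban car share 43% -> 29%": 0.13,
    "home area 110m2 -> 95m2": 0.20,
    "meat 220 -> 14 Kcal/day": 0.73,
}

# Adding the savings independently over-counts, driving emissions negative
additive = baseline_2050 * (1 - sum(reductions.values()))
print(f"Additive estimate for 2050: {additive:.1f} GT/CO2e")
```

That three such modest-sounding lifestyle changes sum to more than 100% of forecast emissions is itself a sign of the model's over-sensitivity.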

It might be easy to laugh at the DECC for not sense-checking the outputs of its glitzy bit of software. But it concerns me that the people responsible for this nonsense are more than likely also responsible for the glossy plans to cut Britain’s emissions by 80% by 2050 without destroying hundreds of thousands of jobs; eviscerating the countryside; and reducing living standards, especially of the poor. Independent and critical review and audit of DECC output is long overdue.

Kevin Marshall

 

A spreadsheet model is also available, but I used the online tool, with its excellent graphics. The calculator is built by a number of organisations.

Global Emissions Reductions Targets for COP21 Paris 2015

There is a huge build-up underway for the COP21 climate conference to be staged in Paris in November. Many countries and NGOs are pushing for an agreement that will constrain warming to just 2°C, but there are no publicly available figures of what this means for all the countries of the world. This is the gap I seek to close with a series of posts. The first post is concerned with getting a perspective on global emissions and the UNIPCC targets.

In what follows, all the actual figures are obtained from three primary sources.

  • Emissions data comes from the Carbon Dioxide Information Analysis Centre or CDIAC.
  • Population data comes from the World Bank, though a few countries are missing; figures for these are mostly from Wikipedia.
  • The Emissions targets can be found in the Presentation for the UNIPCC AR5 Synthesis Report.

All categorizations and forecast estimates are my own.

The 1990 Emissions Position

A starting point for emissions reductions is to stabilize emissions at 1990 levels, around the time that climate mitigation was first proposed. To illustrate the composition of emissions I have divided the countries of the world into the major groups meaningful at that time – roughly, First World developed nations, Second World developed communist countries and Third World developing economies. The First World is represented by the OECD; I have only included members as of 1990, with the USA split off. The Second World is the ex-Warsaw Pact countries, with the countries of the former Yugoslavia included as well. The rest of the world is divided into five groups. I have charted the emissions per capita against the populations of these groups to come up with the following graph.

In rough terms, one quarter of the global population accounted for two-thirds of global emissions. A major reduction on total emissions could therefore be achieved by these rich countries taking on the burden of emissions reductions, and the other countries not increasing their emissions, or keeping growth to a minimum.

The 2020 emissions forecast

I have created a forecast of both emissions and population for 2020 using the data up to 2013. Mostly these assume the same change over the next seven years as over the last seven. For emissions in the rapidly-growing countries this might be an understatement. For China and India I have done separate forecasts based on their emissions commitments. This gives the following graph.

The picture has changed dramatically. Population has increased by 2.4 billion, or 45%, and emissions by over 80%. Global average emissions per capita have increased from 4.1 to 5.2 t/CO2 per capita. Due to the population increase, returning global emissions to 1990 levels would mean reducing average emissions per capita to 2.85 t/CO2.
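That per-capita figure follows directly from the population growth: holding total emissions at 1990 levels while population rises 45% means per-capita emissions must fall by the same factor. A sketch using the rounded figures above (which gives about 2.83 against the 2.85 from the unrounded data):

```python
per_capita_1990 = 4.1     # t/CO2 per person, 1990 global average
population_growth = 0.45  # 45% population increase, 1990 -> 2020

# The same total emissions spread over 45% more people
required_per_capita = per_capita_1990 / (1 + population_growth)
print(f"Required 2020 per-capita emissions: {required_per_capita:.2f} t/CO2")
```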

The change in the composition of emissions has been even more dramatic. The former First and Second World countries will see a slight fall in emissions from 14.9 to 14.0 billion tonnes of CO2, and their global share will have reduced from 68% to 36%. Although their total population will have increased since 1990, slower growth than elsewhere means their share of global population has shrunk to just 19%. China will have a similar population and, with forecast emissions of 13.1 billion tonnes of CO2, 33% of the global total.

The picture is not yet complete. On slide 30 of their Synthesis Report presentation the UNIPCC state

Measures exist to achieve the substantial emissions reductions required to limit likely warming to 2°C (40-70% emissions reduction in GHGs globally by 2050 and near zero GHGs in 2100)

The baseline is 2011, when global emissions were 29.74 billion t/CO2. In 2050 global population will be nearly nine billion. This gives an upper limit of 2.2 t/CO2 per capita and a lower limit of 1.1 t/CO2 per capita.

To put this in another perspective, consider the proportion of people living in countries that would need emissions reduction targets because their emissions exceed 2.2t/CO2 per capita.

In 1990, it was just a third of the global population. In 2020 it will be three-quarters. No longer can an agreement on constraining global CO2 emissions be limited to a few countries; it needs to be truly global. The only area that meets the target is Africa, and even here the countries of Algeria, Egypt, Libya, Tunisia and South Africa would need emission reduction targets.

Further Questions

  1. What permutations are possible if other moral considerations are taken into account, such as the developed countries bearing the burden of emission cuts?
  2. What targets should be set for non-fossil fuel emissions, such as from Agriculture? Are these easier or harder to achieve than for fossil fuels?
  3. What does meeting emission targets mean for different types of economies? For instance, are emission reductions more burdensome for the fast-growing emerging economies than for the developed economies?
  4. What are the measures that the IPCC claims exist to reduce emissions? Are they more onerous than the consequences of climate change?
  5. Are there measures in place to support the states dependent on the production of fossil fuels? In particular, the loss of income to the Gulf States from leaving oil in the ground may further destabilize the area.
  6. What sanctions will there be if some countries refuse to sign up to an agreement, or are politically unable to implement one?
  7. What penalties will be imposed if countries fail to abide by the agreements made?

Kevin Marshall

The Truth About Davey’s Energy Savings

manicbeancounter:

Ed Davey’s claim that the DECC published “a complete picture of everything that affects final energy bills” is refuted by Paul Homewood below.
This is far from an exhaustive list. For instance, there are also the costs of upgrading the National Grid to transport the electricity generated by remote wind turbines to the centres of population; the impact on jobs and growth of increasing energy costs relative to other nations; and the more esoteric costs to democracy of having a group of people with dogmatic beliefs in a specialist applied subject claiming that this gives them superior insights into public policy-making, policy implementation and economic theory.

Originally posted on NOT A LOT OF PEOPLE KNOW THAT:

By Paul Homewood


Ed Davey has been stung into defending his disastrous energy policies, following revelations that his department had disgracefully attempted to hide data showing that electricity prices would soon be 40% higher as a result of climate policies.

The above letter was published in last week’s Sunday Telegraph. Unfortunately, he is being rather economical with the truth.

First, let’s recap on the energy savings which Davey says will make us so much better off. The table below is from the data that DECC tried to hide.


https://www.gov.uk/government/publications/estimated-impacts-of-energy-and-climate-change-policies-on-energy-prices-and-bills-2014

The so-called savings are listed under 2).


Ed Hoskins: Capital Cost and Production Effectiveness of Renewable Energy in Europe – the Data

manicbeancounter:

Ed Hoskins provides a very wide-ranging analysis of the capital costs of renewables in Europe, with information about all the major countries. Despite total investment of $500bn so far, renewables provide just 2.9% of actual power generated. Hoskins also provides some graphical data on “Intermittency and Non-dispatchability” of energy output, helping highlight that renewables are not just expensive, they are also pretty useless at providing power when required.
The one weakness in the analysis is in the costs per unit of output – something outside the main purpose of the post. The source of that data is the U.S. Energy Information Administration. This uses (Table 2-5 on page 44 of the pdf file) “Overnight Capital Cost”, which measures capital and maintenance costs per unit of capacity. So, for instance, “Onshore Wind” appears to have only 2.2 times the capital cost of “Natural Gas Advanced Combined Cycle”. But assuming the former operates at 25% of capacity and the latter at 85%, the capital cost of wind power becomes 7.5 times that of gas. Similarly, assuming offshore wind operates at 35% of capacity, its relative capital cost rises from 6.2 to 14.8 times that of gas.

http://www.eia.gov/forecasts/capitalcost/pdf/updated_capcost.pdf

Another point is that the EIA does not consider conventional coal-fired power stations, possibly inflating the price by some measure of “The Social Cost of Carbon”. Using the AR4 average price of $12 per tonne of CO2 (Synthesis Report, page 69), and given that a coal-fired power station produces about 500kg of CO2 per megawatt-hour, this $6 per megawatt-hour is trivial compared with the much higher cost of renewables.
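Both of the adjustments above are simple enough to sketch. The helper below converts an overnight-cost ratio (cost per unit of capacity) into a cost per unit of energy actually produced, using the capacity factors assumed above, and the last line redoes the social-cost-of-carbon sum; the small difference from the quoted 14.8 figure presumably comes from rounding in the published ratios:

```python
def cost_per_energy_ratio(overnight_cost_ratio, capacity_factor,
                          gas_capacity_factor=0.85):
    """Scale a capital-cost ratio (vs gas) by relative output.

    Overnight cost is per unit of capacity; dividing by the capacity factor
    converts it to a cost per unit of energy actually generated.
    """
    return overnight_cost_ratio * gas_capacity_factor / capacity_factor

onshore = cost_per_energy_ratio(2.2, 0.25)   # ~7.5 times gas
offshore = cost_per_energy_ratio(6.2, 0.35)  # ~15 times gas

# Social cost of carbon: $12/tonne CO2 at ~0.5 tonnes CO2 per MWh of coal
scc_per_mwh = 12 * 0.5                       # $6 per MWh
```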

Originally posted on Tallbloke's Talkshop:

Guest post from Ed Hoskins
A comparison of both the Capital Cost and Energy Production Effectiveness of the Renewable Energy in Europe.

The diagrams and table below collate the cost and capacity factors of Renewable Energy power sources, Onshore and Off-shore Wind Farms and Large scale Photovoltaic Solar generation, compared to the cost and output capacity of conventional Gas Fired Electricity generation.


The associated base data is shown below:


Nissan Leaf Fails The Test

manicbeancounter:

Paul Homewood has a very useful comparison between the cost of the electric Nissan Leaf car and a couple of super-efficient Ford Focuses. The electric car turns out to be a much worse buy. But looking at the costs of motoring to the consumer, and the tax costs can be complex, so there are a couple of points that I would amend.
First is that the £5000 rebate on an electric car is relevant to the buying decision. Otherwise it would not be in place. The purchaser of the car ends up paying £5000 less, so that is a reduction in both the depreciation and the borrowing they will face. As a result the annual cost differential on your figures reduces from £3200 to £1350. However, due to the differential in maintenance this figure is more like £1700.
Second is the difference in tax revenue. New cars attract 20% VAT. For the Leaf this is £4750. After the rebate, the exchequer gives out £250. VAT on the Focus diesel is about £3300. Over 3 years, the net tax revenue on the Leaf (purchase price, 5% VAT on electricity, 20% VAT on maintenance) is £50. On both Fords it is £5100.
The figures, by chance, fall out the same. Buy a Nissan Leaf instead of a Ford Focus and both you and the Exchequer will be about £5000 worse off over three years.
The differences do not stop there. As AC Osborn rightly points out, there is a problem with range. The Leaf is limited to about 100 miles before a recharge of over four hours. As such, for families, it becomes a second car, whereas a Focus, with a range of at least 400 miles and a five-minute refill, can serve both for the school run / daily commute and for longer trips as well. An electric car becomes more of a lifestyle car, so on cost the Leaf is competing with an Audi A3 or similar.
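The tax figures in my comment can be laid out as a back-of-the-envelope sketch. All amounts are the illustrative ones quoted above, not independently sourced, and the variable names are mine:

```python
# Back-of-the-envelope tax comparison, using the figures quoted in the text.
# All amounts in GBP, assumed over a three-year ownership period.
vat_on_leaf_purchase = 4750     # 20% VAT on the Nissan Leaf
electric_car_rebate = 5000      # UK plug-in car grant
vat_on_focus_purchase = 3300    # approx. 20% VAT on either Ford Focus

# Net position of the exchequer on the Leaf purchase alone:
net_leaf_purchase_tax = vat_on_leaf_purchase - electric_car_rebate
print(net_leaf_purchase_tax)    # -250: the exchequer pays out £250

# Over three years, running-cost VAT roughly offsets that £250,
# leaving net tax revenue on the Leaf of about £50 (per the text),
# against about £5100 on either Focus.
leaf_three_year_tax = 50
focus_three_year_tax = 5100
tax_difference = focus_three_year_tax - leaf_three_year_tax
print(tax_difference)           # 5050: roughly the £5000 the exchequer loses
```

The point of the sketch is simply that the exchequer's loss on a Leaf sale is of the same order as the buyer's extra cost: both come out at around £5000 over three years.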
Kevin Marshall

Originally posted on NOT A LOT OF PEOPLE KNOW THAT:

By Paul Homewood

With oil prices falling through the floor, and confirmation of just how much electricity prices are going to rise in the next few years, it is time to look again at the comparative costs of electric and conventional cars.

The Nissan Leaf seems to be the most popular electric car in the UK, and is comparable, from a specification point of view, to the Ford Focus. The Leaf Acenta is the mid range version, and can be compared with the Focus Zetec, which I have shown for both the 1.6 TDCi diesel and Eco 1.0 petrol options.

So first, some basic costs and specifications.


Spending Money on Foreign Aid instead of Renewables

On the Discussion at BishopHill, commentator Raff asked people whether the $1.7 trillion spent so far on renewables should have been spent on foreign aid instead. This is an extended version of my reply.

The money spent on renewables has been net harmful by any measure. It has not only failed to even dent global emissions growth; it will also fail even if the elusive global agreement is reached, as the country targets do not stack up. So the people of the emissions-reducing countries will bear both the cost of those policies and practically all the costs of the unabated warming as well. The costs of those policies have been well above anything justified in the likes of the Stern Review. There are plenty of British examples at Bishop Hill of costs being higher than expected and (often) solutions being much less effective than planned, from wind, solar, CCS, power transmission, domestic energy saving etc. The consequences have been to create a new category of poverty and to make our energy supplies less secure. In Spain the squandering of money has been proportionately greater and has likely had a significant impact on the severity of the economic depression.1

The initial justification for foreign aid came out of the Harrod and Domar growth models. Lack of economic growth was due to lack of investment, and poor countries cannot get finance for that necessary investment. Foreign Aid, by bridging the "financing gap", would create the desired rate of economic growth. William Easterly looked at 40 years of data in his 2002 book "The Elusive Quest for Growth". Out of over 80 countries, he could find just one – Tunisia – where foreign aid conformed to the theory. That is, where increased aid was followed by increased investment, which was followed by increased growth. There were plenty of examples where countries received huge amounts of aid relative to GDP over decades and their economies shrank. Easterly graphically confirmed what the late Peter Bauer said over thirty years ago – "Official aid is more likely to retard development than to promote it."

In both constraining CO2 emissions and Foreign Aid the evidence shows that the pursuit of these policies is not just useless, but possibly net harmful. An analogy could be made with a doctor who continues to pursue courses of treatment when the evidence shows that the treatment not only does not work, but has known and harmful side effects. In medicine it is accepted that new treatments should be rigorously tested, and results challenged, before being applied. But a challenge to that doctor’s opinion would be a challenge to his expert authority and moral integrity. In constraining CO2 emissions and promoting foreign aid it is even more so.

Notes

  1. The rationale behind this claim is explored in a separate posting.

Kevin Marshall

Michael Mann and John Cook at Bristol University

Lucia at The Blackboard last month publicized that John Cook is to speak at Bristol University on Dogma vs. consensus: Letting the evidence speak on climate change on Friday 19th September. There are still 395 free tickets left for the event.

Stephen Lewandowsky also notes that Michael Mann is to lecture at the same event on Tuesday 23rd September on The Hockey Stick and the climate wars – the battle continues. Just 102 free tickets left for this event.

Given Mann’s belief that continued climate denial is due to “massive funding of climate change denial by monied interests” (HuffPo), it might provide some light entertainment for the students.

Update 19th Sept. There are still 309 tickets left for the John Cook lecture for tomorrow – Friday 20th September. See http://www.bristol.ac.uk/cabot/events/2014/488.html

The Michael Mann lecture is now SOLD OUT, or more accurately, all the tickets have been given away.

Understanding the role of Peer Review

In “Newton, Einstein, Watson and Crick, were not peer reviewed“, Jo Nova questions whether peer review is valid at all. I think the answer is somewhat more nuanced. This is an extended version of a comment made.

Before dismissing peer review, we should first ask what the boundaries of peer review are. That is, what peer review can achieve and what it cannot.

Proper peer review should check that the thesis of a paper is original and properly references other works in the field. It should also make sure that the claims made are coherent, not demonstrably false, and have a reason (or reasons) for originality, and that all assumptions are clearly stated. It might also check to ensure that certain ethical boundaries are not breached. There is also more basic checking, like that of an editor.

Peer review cannot determine the following:-

(1) The ultimate truth. Make sure that the claims made are the last word on the subject. That is, that the thesis will never be falsified, contradicted, or supplanted by more general theories.

(2) The best to date. Determine that the thesis is superior to what is already available. There is a place for literature reviews to compare and contrast the existing body of knowledge.(i)

(3) That every point is correct, or every assumption known and stated.

(4) That every conjecture that the paper is built upon is correct, or every assumption is valid. Certain stated hypotheses or conjectures might be themselves based upon other conjectures. Assumptions might be accepted, but be false or exclude other, contradictory but quite valid, lines of enquiry.

(5) That a paper is hugely significant, or of little consequence.

(6) That a paper is of outstanding quality, as against mediocre.

(7) That the absence of superior, contradictory views in the academic literature is a demonstration of the truth or quality of a research program.

Academic study is a combination of building on the work that has gone before, whilst noticing the empirical or logical gaps and anomalies. It can be quite valid to make conjectures upon conjectures, as long as you do not lose sight that the falsification of a root conjecture will partially or completely undermine every piece of work built upon it.(ii) In climatology the vast majority of papers are built upon looking at the consequences of the catastrophic warming hypothesis. Falsifying CAGW will mean entire research programs will be null and void. That includes many studies in other areas such as economics and public-policy making.

 

Notes

  1. For instance, the Journal of Economic Literature has long performed this service in economics.
  2. Until Andrew Wiles proved Fermat’s last theorem, large areas of mathematical proofs relied upon a conjecture. Watch the video here.