Category Archives: Temperature

Post-Truth Global and Arctic Temperatures

“Post-truth” is the Oxford Dictionaries word of the year for 2016. The definition reads as follows:

post-truth – an adjective defined as ‘relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief’.

and according to Oxford Dictionaries:

The concept of post-truth has been in existence for the past decade, but Oxford Dictionaries has seen a spike in frequency this year in the context of the EU referendum in the United Kingdom and the presidential election in the United States. It has also become associated with a particular noun, in the phrase post-truth politics.

Post-truth has gone from being a peripheral term to being a mainstay in political commentary, now often being used by major publications without the need for clarification or definition in their headlines.

Our old friend David Rose has been remarkably quiet on the topic of Arctic sea ice recently. Presumably the objective facts from the Arctic are impossible to spin to his satisfaction even for a man of David’s talents? However that didn’t stop him from penning an article for The Mail on Sunday at the end of November on the topic of the recent “record highs in global temperatures“:

Global average temperatures over land have plummeted by more than 1C since the middle of this year – their biggest and steepest fall on record.

The news comes amid mounting evidence that the recent run of world record high temperatures is about to end. The fall, revealed by Nasa satellite measurements of the lower atmosphere, has been caused by the end of El Niño – the warming of surface waters in a vast area of the Pacific west of Central America.


The Mail article helpfully included this one year old video from the World Meteorological Organization, explaining the basics of the El Niño phenomenon:

According to the commentary:

This phenomenon affects weather conditions across the equatorial Pacific, with potential knock on effects in other parts of the world.

We’ll get on to the “potential knock on effects” in the Arctic eventually, but let’s start with a snippet of Mr. Rose’s “post-truth politics”:

Some scientists, including Dr Gavin Schmidt, head of Nasa’s climate division, have claimed that the recent highs were mainly the result of long-term global warming.

Last year, Dr Schmidt said 2015 would have been a record hot year even without El Nino. ‘The reason why this is such a warm record year is because of the long-term underlying trend, the cumulative effect of the long-term warming trend of our Earth,’ he said. This was ‘mainly caused’ by the emission of greenhouse gases by humans.

Other experts have also disputed Dr Schmidt’s claims. Professor Judith Curry, of the Georgia Institute of Technology, and president of the Climate Forecast Applications Network, said yesterday: ‘I disagree with Gavin. The record warm years of 2015 and 2016 were primarily caused by the super El Nino.’ The slowdown in warming was, she added, real, and all the evidence suggested that since 1998, the rate of global warming has been much slower than predicted by computer models – about 1C per century.

David Whitehouse, a scientist who works with Lord Lawson’s sceptic Global Warming Policy Foundation, said the massive fall in temperatures following the end of El Nino meant the warming hiatus or slowdown may be coming back. ‘According to the satellites, the late 2016 temperatures are returning to the levels they were at after the 1998 El Nino. The data clearly shows El Nino for what it was – a short-term weather event,’ he said.

In case you’re wondering where the politics is in all of this, you need look no further than here:

The Twitter account of the United States’ House of Representatives Committee on Science, Space, and Technology quotes a Breitbart article by another old friend of ours, James Delingpole, which quotes David Rose’s article in the Mail on Sunday:

The last three years may eventually come to be seen as the final death rattle of the global warming scare. Thanks [sic] what’s now recognised as an unusually strong El Nino, global temperatures were driven to sufficiently high levels to revive the alarmist narrative – after an unhelpful pause period of nearly 20 years – that the world had got hotter than ever before.

In case you’re also wondering about the objective facts of the matter, David Rose quotes with approval “the authoritative Met Office ‘Hadcrut4’ surface record” in his latest article in the Mail on Sunday this very morning:

New official data issued by the Met Office confirms that world average temperatures have plummeted since the middle of the year at a faster and steeper rate than at any time in the recent past.

The huge fall follows a report by this newspaper that temperatures had cooled after a record spike. Our story showed that these record high temperatures were triggered by naturally occurring but freak conditions caused by El Nino – and not, as had been previously suggested, by the cumulative effects of man-made global warming.

The Mail on Sunday’s report was picked up around the world and widely attacked by green propagandists as being ‘cherry-picked’ and based on ‘misinformation’. The report was, in fact, based on Nasa satellite measurements of temperatures in the lower atmosphere over land – which tend to show worldwide changes first, because the sea retains heat for longer.

There were claims – now exploded by the Met Office data shown here – that our report was ‘misleading’ and ‘cherry-picked’.

Yet bizarrely, the fiercest criticism was reserved for claims we never made – that there isn’t a long-term warming trend, mainly caused by human emissions.

This just wasn’t in our report – which presumably, critics hadn’t even read.

We’ve explained all this to David before, yet bizarrely we obviously need to do so again. Here’s the Mail’s version of the latest HadCRUT4 data from the Met Office:

hadcrut-mail-20161211

and here’s ours:

hadcrut-wft-20161211

Can you spot any “cumulative effects of man-made global warming”?

Messrs Smith, Rose, Delingpole, Whitehouse et al. may well be unaware of the fact that the satellite temperature data they’re so fond of cherry picking doesn’t include data from the lower troposphere between 80 degrees North and the North Pole. Just in case they fancy spinning the latest objective facts from the Arctic in the near future, here’s the long term autumnal temperature trend:

80n-son-20161211

and here’s the long term November Arctic sea ice extent trend:

monthly_ice_11_nh

How to Make a Complete RSS of Yourself (With Sausages)

In the wake of the recent announcement from NASA’s Goddard Institute for Space Studies that global surface temperatures in February 2016 were an extraordinary 1.35 °C above the 1951-1980 baseline, we bring you the third in our series of occasional guest posts.

Today’s article is a pre-publication draft prepared by Bill the Frog, who has also authorised me to reveal to the world the sordid truth that he is in actual fact the spawn of a “consummated experiment” conducted between Kermit the Frog and Miss Piggy many moons ago. Please ensure that you have a Microsoft Excel compatible spreadsheet close at hand, and then read on below the fold.


In a cleverly orchestrated move immaculately timed to coincide with the build up to the CoP21 talks in Paris, Christopher Monckton, the 3rd Viscount Monckton of Brenchley, announced the following startling news on the climate change denial site, Climate Depot …

Morano1

Upon viewing Mr Monckton’s article, any attentive reader could be forgiven for having an overwhelming feeling of déjà vu. The sensation would be entirely understandable, as this was merely the latest missive in a long-standing series of such “revelations”, stretching back to at least December 2013. In fact, there has even been a recent happy addition to the family, as we learned in January 2016 that …

Morano2

The primary eye-candy in Mr Monckton’s November article was undoubtedly the following diagram …

https://lh3.googleusercontent.com/-xAdiohdkcU4/VjpSKNYP9SI/AAAAAAACa8Q/639el4qIzpM/s720-Ic42/monckton1.png

Fig 1: Copied from Nov 2015 article in Climate Depot

It is clear that Mr Monckton has the ability to keep churning out virtually identical articles, and this is a skill very reminiscent of the way a butcher can keep churning out virtually identical sausages. Whilst on the subject of sausages, the famous 19th Century Prussian statesman, Otto von Bismarck, once described legislative procedures in a memorably pithy fashion, namely that … “Laws are like sausages, it is better not to see them being made”.

One must suspect that those who are eager and willing to accept Mr Monckton’s arguments at face value are somehow suffused with a similar kind of “don’t need to know, don’t want to know” mentality. However, some of us are both able and willing to scratch a little way beneath the skin of the sausage. On examining one of Mr Monckton’s prize sausages, it takes all of about 2 seconds to work out what has been done, and about two minutes to reproduce it on a spreadsheet. That simple action is all that is needed to see how the appropriate start date for his “pause” automatically pops out of the data.

However, enough of the hors d’oeuvres, it’s time to see how those sausages get made. Let’s immediately get to the meat of the matter (pun intended) by demonstrating precisely how Mr Monckton arrives at his “no global warming since …” date. The technique is incredibly straightforward, and can be done by anyone with even rudimentary spreadsheet skills.

One basically uses the spreadsheet’s built-in features, such as the SLOPE function in Excel, to calculate the rate of change of monthly temperature over a selected time period. The appropriate formula would initially be inserted on the same row as the first month of data, with its range set to run from that month to the latest date available. This would be repeated (using a feature such as Auto Fill) on each subsequent row, down as far as the penultimate month. On each row, the start date therefore advances by one month, but the end date remains fixed. (As the SLOPE function measures a rate of change, there must be at least two items in the range, which is why the penultimate month is the latest possible start date.)
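For anyone who would rather see this in code than in a spreadsheet, here is a minimal Python sketch of the same recipe. It uses synthetic monthly anomalies (not the real RSS data, and not Mr Monckton’s actual spreadsheet), with numpy’s polyfit standing in for Excel’s SLOPE function:

    # Minimal sketch of the "trailing trend" recipe described above.
    # Synthetic data only - the real exercise uses the RSS monthly TLT anomalies.
    import numpy as np

    rng = np.random.default_rng(42)
    months = np.arange(240)                                   # 20 years of monthly data
    anoms = 0.001 * months + 0.15 * rng.standard_normal(240)  # weak trend plus noise

    slopes_per_decade = []
    for start in range(len(months) - 1):     # the penultimate month is the last valid start
        x, y = months[start:], anoms[start:]
        slope_per_month = np.polyfit(x, y, 1)[0]              # least-squares gradient
        slopes_per_decade.append(slope_per_month * 120)       # convert to degrees per decade

    # The "no warming since mm/yy" date is simply the earliest start month
    # whose trailing trend is zero or negative.
    first_flat = next((i for i, s in enumerate(slopes_per_decade) if s <= 0), None)
    print("Earliest start index with a non-positive trend:", first_flat)
    print("Average of all trailing trends: %.3f C/decade" % np.mean(slopes_per_decade))

Plotting slopes_per_decade against the start month reproduces the sort of curve shown in Fig 2.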

That might sound slightly complex, but if one then displays the results graphically, it becomes obvious what is happening, as shown below…

Fig 2: Variation in temperature gradient. End date June 2014

On the above chart (Fig 2), it can clearly be seen that, after about 13 or 14 years of stability, the rate of change of temperature starts to oscillate wildly as one looks further to the right. Mr Monckton’s approach has been simply to note the earliest transition point, and then declare that there has been no warming since that date. One could, if being generous, describe this as a somewhat naïve interpretation, although others may feel that a stronger adjective would be more appropriate. However, given his classical education, it is difficult to say why he does not seem to comprehend the difference between veracity and verisimilitude. (The latter being the situation when something merely has the appearance of being true – as opposed to actually being the real thing.)

Fig 2 is made up from 425 discrete gradient values, each generated (in this case) using Excel’s SLOPE function. Of these, 122 are indeed below the horizontal axis, and can therefore be viewed as demonstrating a negative (i.e. cooling) trend. However, that also means that just over 70% show a trend that is positive. Indeed, a simple arithmetic average across all 425 data points yields a warming trend of 0.148 degrees Celsius per decade.

(In the spirit of honesty and openness, it must of course be pointed out that the aggregated warming trend of 0.148 degrees Celsius/decade thus obtained has just about the same level of irrelevance as Mr Monckton’s “no warming since mm/yy” claim. Neither has any real physical meaning, as, once one gets closer to the end date(s), the values can swing wildly from one month to the next. In Fig 2, the sign of the temperature trend changes 8 times from 1996 onwards. A similar chart created at the time of his December 2013 article would have had no fewer than 13 sign changes over a similar period. This is because the period in question is too short for the warming signal to unequivocally emerge from the noise.)

As one adds more and more data, a family of curves gradually builds up, as shown in Fig 3a below.

Fig 3a: Family of curves showing how end-date also affects temperature gradient

It should be clear from Fig 3a that each temperature gradient curve migrates upwards (i.e. more warming) as each additional 6-month block of data comes in. This is only to be expected, as the impact of isolated events – such as the temperature spike created by the 1997/98 El Niño – gradually wanes as it is diluted by the addition of further data. The shaded area in Fig 3a is expanded below as Fig 3b in order to make this effect more obvious.

Fig 3b: Expanded view of curve family

By the time we are looking at an end date of December 2015, the relevant curve now consists of 443 discrete values, of which just 39, or 9%, are in negative territory. Even if one only considers values to the right of the initial transition point, a full 82% of these are positive. The quality of Mr Monckton’s prize-winning sausages is therefore revealed as being dubious in the extreme. (The curve has not been displayed, but the addition of a single extra month – January 2016 – further reduces the number of data points below the zero baseline to just 26, or 6%.) To anyone tracking this, there was only ever going to be one outcome: eventually, the curve was going to end up above the zero baseline. The ongoing El Niño conditions have merely served to hasten the inevitable.

With the release of the February 2016 data from RSS, this is precisely what happened. We can now add a fifth curve using the most up-to-date figures available at the time of writing. This is shown below as Fig 4.

Fig 4: Further expansion of curve families incorporating latest available data (Feb 2016)

As soon as the latest (Feb 2016) data is added, the fifth member of the curve family (in Fig 4) no longer intersects the horizontal axis – anywhere. When this happens, all of Mr Monckton’s various sausages reach their collective expiry date, and his entire fantasy of “no global warming since mm/yy” simply evaporates into thin air.

Interestingly, although Mr Monckton chooses to restrict his “analysis” to only the Lower Troposphere Temperatures produced by Remote Sensing Systems (RSS), another TLT dataset is available from the University of Alabama in Huntsville (UAH). Now, this omission seems perplexing, as Mr Monckton took time to emphasise the reliability of the satellite record in his article dated May 2014.

In his Technical Note to this article, Mr Monckton tells us…

The satellite datasets are based on measurements made by the most accurate thermometers available – platinum resistance thermometers, which not only measure temperature at various altitudes above the Earth’s surface via microwave sounding units but also constantly calibrate themselves by measuring via spaceward mirrors the known temperature of the cosmic background radiation, which is 1% of the freezing point of water, or just 2.73 degrees above absolute zero. It was by measuring minuscule variations in the cosmic background radiation that the NASA anisotropy probe determined the age of the Universe: 13.82 billion years.

Now, that certainly makes it all sound very easy. It’s roughly the metaphorical equivalent of the entire planet being told to drop its trousers and bend over, as the largest nurse imaginable approaches, all the while gleefully clutching at a shiny platinum rectal thermometer. Perhaps a more balanced perspective can be gleaned by reading what RSS themselves have to say about the difficulties involved in Brightness Temperature measurement.

When one looks at Mr Monckton’s opening sentence referring to “the most accurate thermometers available”, one would certainly be forgiven for thinking that there must perforce be excellent agreement between the RSS and UAH datasets. This meme, that the trends displayed by the RSS and UAH datasets are in excellent agreement, is one that appears to be very pervasive amongst those who regard themselves as climate change sceptics. Sadly, few of these self-styled sceptics seem to understand the meaning behind the motto “Nullius in verba”.

Tellingly, this “RSS and UAH are in close agreement” meme is in stark contrast to the views of the people who actually do that work for a living.

Carl Mears (of RSS) wrote an article back in September 2014 discussing the reality – or otherwise – of the so-called “Pause”. In the section of this article dealing with measurement errors, he wrote that …

 A similar, but stronger case can be made using surface temperature datasets, which I consider to be more reliable than satellite datasets (they certainly agree with each other better than the various satellite datasets do!) [my emphasis]

The views of Roy Spencer from UAH concerning the agreement (or, more accurately, the disagreement) between the two satellite datasets must also be considered. Way back in July 2011, Dr Spencer wrote

… my UAH cohort and boss John Christy, who does the detailed matching between satellites, is pretty convinced that the RSS data is undergoing spurious cooling because RSS is still using the old NOAA-15 satellite which has a decaying orbit, to which they are then applying a diurnal cycle drift correction based upon a climate model, which does not quite match reality.

So there we are, Carl Mears and Roy Spencer, who both work independently on satellite data, have views that are somewhat at odds with those of Mr Monckton when it comes to agreement between the satellite datasets. Who do we think is likely to know best?

The closing sentence in that paragraph from the Technical Note did give rise to a wry smile. I’m not sure what relevance Mr Monckton thinks there is between global warming and a refined value for the Hubble Constant, but, for whatever reason, he sees fit to mention that the Universe was born nearly 14 billion years ago. The irony of Mr Monckton mentioning this in an article which treats his target audience as though they were born yesterday appears to have passed him by entirely.

Moving further into Mr Monckton’s Technical Note, the next two paragraphs basically sound like a used car salesman describing the virtues of the rust bucket on the forecourt. Instead of trying to make himself sound clever, Mr Monckton could simply have said something along the lines of … “If you want to verify this for yourself, it can easily be done by simply using the SLOPE function in Excel”. Of course, Mr Monckton might prefer his readers not to think for themselves.

The final paragraph in the Technical Note reads as follows…

Dr Stephen Farish, Professor of Epidemiological Statistics at the University of Melbourne, kindly verified the reliability of the algorithm that determines the trend on the graph and the correlation coefficient, which is very low because the data are highly variable and the trend is flat.

Well, this is an example of the logical fallacy known as “Argument from Authority” combined with a blatant attempt at misdirection. The accuracy of the “… algorithm that determines the trend …” has absolutely nothing to do with Mr Monckton’s subsequent interpretation of the results, although that is precisely what the reader is meant to think. The good professor may well be seriously gifted at statistics, but that doesn’t mean he speaks with any authority about atmospheric science or about satellite datasets.

Also, for the sake of the students at Melbourne University, I would hope that Mr Monckton was extemporizing at the end of that paragraph. It is simply nonsense to suggest that the “flatness” of the trend displayed in his Fig 1 is in any way responsible for the trend equation also having an R2 value of (virtually) zero. The value of the coefficient of determination (R2) ranges from 0 to 1, and wildly variable data can most certainly result in an R2 value of zero, or thereabouts, but the value of the trend itself has little or no bearing upon this.

The phraseology used in the Technical Note would appear to imply that, as both the trend and the coefficient of determination are effectively zero, this should be interpreted as two distinct and independent factors which serve to corroborate each other. Actually, nothing could be further from the truth.

The very fact that the coefficient of determination is effectively zero should be regarded as a great big blazing neon sign which says “the equation to which this R2 value relates should be treated with ENORMOUS caution, as the underlying data is too variable to infer any firm conclusions”.

To demonstrate that a (virtually) flat trend can have an R2 value of 1, anyone can try inputting the following numbers into a spreadsheet …

10.0 10.00001 10.00002 10.00003 etc.

Use the Auto Fill capability to do this automatically for about 20 values. The slope of this set of numbers is a mere one part in a million, and is therefore, to all intents and purposes, almost perfectly flat. However, if one adds a trend line and asks for the R2 value, it will return a value of 1 (or very, very close to 1). (NB When I tried this first with a single recurring integer – i.e. absolutely flat – Excel returned an error value. That’s why I suggest using a tiny increment, such as the 1 in a million slope mentioned above.)
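For anyone without a spreadsheet to hand, the same point can be made in a few lines of Python (using scipy purely by way of illustration); the correlation coefficient measures scatter about the fitted line, not the size of the slope:

    # A (virtually) flat trend with an R2 of (almost) 1.
    import numpy as np
    from scipy.stats import linregress

    x = np.arange(20)
    y = 10.0 + 1e-5 * x              # 10.0, 10.00001, 10.00002, ... as above

    fit = linregress(x, y)
    print("slope:", fit.slope)       # about one part in a million, i.e. essentially flat
    print("R2:   ", fit.rvalue**2)   # 1.0, or very, very close to it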

Enough of the Technical Note nonsense; let’s look at the UAH dataset as well. Fig 5 (below) is a rework of the earlier Fig 2, but this time with the UAH dataset added, as well as an equally weighted (RSS+UAH) composite.

Fig 5: Comparison between RSS and UAH (June 2014)

The difference between the RSS and UAH results makes it clear why Mr Monckton chose to focus solely on the RSS data. At the time of writing this present article, the RSS and UAH datasets each extended to February 2016, and Fig 6 (below) shows graphically how the datasets compare when that end date is employed.

Fig 6: Comparison between RSS and UAH (Feb 2016)

In his sausage with the November 2015 sell-by-date, Mr Monckton assured his readers that “…The UAH dataset shows a Pause almost as long as the RSS dataset.”

Even just a moment or two spent considering the UAH curves on Fig 5 (June 2014) and then on Fig 6 (February 2016) would suggest precisely how far that claim is removed from reality. However, for those unwilling to put in this minimal effort, Fig 7 is just for you.

Fig 7: UAH TLT temperature gradients over three different end dates.

From the above diagram, it is rather difficult to see any remote justification for Mr Monckton’s bizarre assertion that “…The UAH dataset shows a Pause almost as long as the RSS dataset.”

Moving on, it is clear that, irrespective of the exact timeframe, both datasets exhibit a reasonably consistent “triple dip”. To understand the cause(s) of this “triple dip” in the above diagrams (at about 1997, 2001 and 2009), one needs to look at the data in the usual anomaly format, rather than in the gradient format used in Figs 2 – 7.

Fig 8: RSS TLT anomalies smoothed over 12-month and 60-month periods

The monthly data looks very messy on a chart, but the application of 12-month and 60-month smoothing used in Fig 8 perhaps makes some details easier to see. The peaks resulting from the big 1997/98 El Niño and the less extreme 2009/10 event are very obvious on the 12-month data, but the impact of the prolonged series of 4 mini-peaks centred around 2003/04 shows up more on the 60-month plot. At present, the highest 60-month rolling average is centred near this part of the time series. (However, that may not be the case for much longer. If the next few months follow a similar pattern to the 1997/98 event, both the 12- and 60-month records are likely to be surpassed. Given that the March and April RSS TLT values recorded in 2015 were the two coolest months of that year, it is highly likely that a new rolling 12-month record will be set in April 2016.)
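This sort of smoothing is straightforward to reproduce; a hedged sketch using pandas, with a made-up anomaly series standing in for the real RSS TLT file, might look like this:

    # Centred 12-month and 60-month rolling means, as used in Fig 8.
    # "anoms" is a synthetic stand-in for the real monthly TLT anomalies.
    import numpy as np
    import pandas as pd

    dates = pd.date_range("1979-01-01", periods=446, freq="MS")
    anoms = pd.Series(0.2 * np.random.default_rng(0).standard_normal(446), index=dates)

    smooth_12 = anoms.rolling(12, center=True).mean()
    smooth_60 = anoms.rolling(60, center=True).mean()
    print("60-month average peaks at:", smooth_60.idxmax().date())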

Whilst this helps explain the general shape of the curve families, it does not explain the divergence between the RSS and the UAH data. To show this effect, two approaches can be adopted: one can plot the two datasets together on the same chart, or one can derive the differences between RSS and UAH for every monthly value and plot that result.

In the first instance, the equivalent UAH rolling 12- and 60-month values have effectively been added to the above chart (Fig 8), as shown below in Fig 9.

Fig 9: RSS and UAH TLT anomalies using 12- and 60-month smoothing

On this chart (Fig 9) it can be seen that the smoothed anomalies start a little way apart, diverge near the middle of the time series, and then gradually converge as one looks toward the more recent values. Interestingly, although the 60-month peak at about 2003/04 in the RSS data is also present in the UAH data, it has long since been overtaken.

The second approach would involve subtracting the UAH monthly TLT anomaly figures from the RSS equivalents. The resulting difference values are plotted on Fig 10 below, and are most revealing. The latest values on Figs 9 and 10 are for February 2016.

Fig 10: Differences between RSS and UAH monthly TLT values up to Feb 2016

Even without the centred 60-month smoothed average, the general shape emerges clearly. The smoothed RSS values start off about 0.075 Celsius above the UAH values, but by about 1999 or 2000, this delta has risen to +0.15 Celsius. It then begins a virtually monotonic drop such that the 6 most recent rolling 60-month values have gone negative.

NB It is only to be expected that the dataset comparison begins with an offset of this magnitude. The UAH dataset anomalies are based upon a 30-year climatology stretching from 1981 to 2010. However, RSS instead uses a 20-year baseline running from 1979 to 1998. The mid-points of the two baselines are therefore 7 years apart. Given that the overall trend is currently of the order of 0.12 Celsius per decade, one would reasonably expect the starting offset to be pretty close to 0.084 Celsius. The actual starting point (0.075 Celsius) was therefore within about one hundredth of a degree Celsius of this figure.
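A back-of-the-envelope sketch of that comparison, again with synthetic stand-ins for the real RSS and UAH series, shows both the differencing used for Fig 10 and the expected baseline offset arithmetic:

    # Differencing two monthly anomaly series and smoothing the result, as in Fig 10.
    # rss and uah are synthetic stand-ins, built with an offset of ~0.084 C for illustration.
    import numpy as np
    import pandas as pd

    dates = pd.date_range("1979-01-01", periods=446, freq="MS")
    rng = np.random.default_rng(1)
    trend = (0.012 / 12) * np.arange(446)                   # roughly 0.12 C per decade
    rss = pd.Series(trend + 0.1 * rng.standard_normal(446), index=dates)
    uah = pd.Series(trend - 0.084 + 0.1 * rng.standard_normal(446), index=dates)

    diff = rss - uah                                        # monthly RSS minus UAH
    diff_60 = diff.rolling(60, center=True).mean()          # centred 60-month smoothing

    # Expected starting offset from the differing baselines: mid-points ~7 years
    # apart, overall trend ~0.12 C/decade, hence roughly 0.7 x 0.12 = 0.084 C.
    print("Expected offset: %.3f C" % (0.12 * 0.7))
    print("Smoothed difference at start: %.3f C" % diff_60.dropna().iloc[0])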

Should anyone doubt the veracity of the above diagram, here is a copy of something similar taken from Roy Spencer’s web pages. Apart from the end date, the only real difference is that whereas Fig 10 has the UAH monthly values subtracted from the RSS equivalent, Dr Spencer has subtracted the RSS data from the UAH equivalent, and has applied a 3-month smoothing filter. This is reproduced below as Fig 11.

Fig 11: Differences between UAH and RSS (copied from Dr Spencer’s blog)

This actually demonstrates one of the benefits of genuine scepticism. Until I created the plot on Fig 10, I was sure that the 97/98 El Niño was almost entirely responsible for the apparent “pause” in the RSS data. However, it would appear that the varying divergence from the equivalent UAH figures also has a very significant role to play. Hopefully, the teams from RSS and UAH will, in the near future, be able to offer some mutually agreed explanation for this divergent behaviour. (Although both teams are about to implement new analysis routines – RSS going from Ver 3.3 to Ver 4, and UAH going from Ver 5.6 to Ver 6.0 – mutual agreement appears to be still in the future.)

Irrespective of this divergence between the satellite datasets, the October 2015 TLT value given by RSS was the second largest in that dataset for that month. That was swiftly followed by monthly records for November, December and January. The February value went that little bit further and was the highest in the entire dataset. In the UAH TLT dataset, September 2015 was the third highest for that month, with each of the 5 months since then breaking the relevant monthly record. As with its RSS equivalent, the February 2016 UAH TLT figure was the highest in the entire dataset. In fact, the latest rolling 12-month UAH TLT figure is already the highest in the entire dataset. This would certainly appear to be strange behaviour during a so-called pause.

As sure as the sun rises in the east, these record breaking temperatures (and their effect on temperature trends) will be written off by some as merely being a consequence of the current El Niño. It does seem hypocritical that these people didn’t feel that a similar argument could be made about the 1997/98 event. An analogy could be made concerning the measurement of Sea Level Rise. Imagine that someone – who rejects the idea that sea level is rising – starts their measurements using a high tide value, and then cries foul because a subsequent (higher) reading was also taken at high tide.

This desperate clutching at straws will doubtless continue unabated, and a new “last, best hope” has already appeared in the guise of Solar Cycle 25. Way back in 2006, an article by David Archibald appeared in Energy & Environment telling us how Solar Cycles 24 & 25 were going to cause temperatures to plummet. In the Conclusion to this paper, Mr Archibald wrote that …

A number of solar cycle prediction models are forecasting weak solar cycles 24 and 25 equating to a Dalton Minimum, and possibly the beginning of a prolonged period of weak activity equating to a Maunder Minimum. In the former case, a temperature decline of the order of 1.5°C can be expected based on the temperature response to solar cycles 5 and 6.

Well, according to NASA, the peak for Solar Cycle 24 passed almost 2 years ago, so it’s not looking too good at the moment for that prediction. However, Solar Cycle 25 probably won’t peak until about 2025, so that will keep the merchants of doubt going for a while.

Meanwhile, back in the real world, it is very tempting to make the following observations …

  • The February TLT value from RSS seems to have produced the conditions under which certain allotropes of the fabled element known as Moncktonite will spontaneously evaporate, and …
  • If Mr Monckton’s sausages leave an awfully bad taste in the mouth, it could be due to the fact that they are full of tripe.

Inevitably however, in the world of science at least, those who seek to employ misdirection and disinformation as a means to further their own ideological ends are doomed to eventual failure. In the closing paragraph to his “Personal observations on the reliability of the Shuttle”, the late, great Richard Feynman used a phrase that should be seared into the consciousness of anyone writing about climate science, especially those who are economical with the truth…

For a successful technology, reality must take precedence over public relations, for nature cannot be fooled.

Remember that final phrase – “nature cannot be fooled”, even if vast numbers of voters can be!

Explaining the Greenhouse Effect

Prompted by my (no doubt vain?) attempt to hold a sensible discussion about the greenhouse effect in a rather hostile environment, we have just added a brand new page to the long list of Great White Con educational resources:

Greenhouse Effect Explanations

A very brief history of the science of the greenhouse effect, courtesy of Ed Hawkins:

In the 1820s, the French mathematician Joseph Fourier was trying to understand the various factors that affect Earth’s temperature. But he found a problem – according to his calculations, the Earth should have been a ball of ice.

In 1861, the Irish physicist John Tyndall performed an experiment which changed our view of the atmosphere. Tyndall demonstrated that gases such as methane and carbon dioxide absorbed infrared radiation, and could trap heat within the atmosphere.

Svante Arrhenius, a Swedish chemist, provided the first numerical estimates of “climate sensitivity” – defined as the temperature change corresponding to a doubling of carbon dioxide in the atmosphere. He suggested a value around 4°C in 1896.

In 1938, Guy Stewart Callendar revealed evidence for a 0.3°C rise in global temperatures over the previous 50 years.

Much more recently “Tamino” has just penned an article on his Open Mind blog entitled:

Global Warming Basics: Greenhouse Gas

and Rasmus Benestad has written an article for RealClimate entitled

What is the best description of the greenhouse effect?

Intriguingly the supplementary materials accompanying Rasmus’ paper include ‘R’ source code!

If you have any constructive comments to make about efforts to explain the physics of the greenhouse effect to a lay audience and/or our new resource on that topic please feel free to do so in the space provided below.

More Heat Heading for the North Pole

We speculated a few days ago about whether the “Son of Storm Frank” might have battered Britain by now, and be sending a 10 meter swell past Svalbard towards the Arctic sea ice edge. That’s not quite how things have worked out in practice however! We haven’t had another named storm affecting the United Kingdom directly, but we have received a series of long distance swells from a sequence of hurricane force storms further out in the North Atlantic. I even managed to test my Arctic surfing equipment by personally partaking in the swell generated by Hurricane Alex!

Moving from the water into the air, here’s the Danish Meteorological Institute’s forecast for Greenland tomorrow:

Greenland-20160123+24h

If you’re at all familiar with isobars you’ll note yet another storm off Southern Greenland and that comparatively warm, moist air will be heading up the east coast of Greenland towards the Fram Strait, albeit not at the speeds generated by Storm Frank! As a consequence here is Climate Reanalyzer’s surface temperature anomaly map for first thing tomorrow:

CCI-AnomT-20160123+24h

and here is how it looks by Wednesday lunchtime:

CCI-AnomT-20160123+108h

As you can see, the ultimate effect of the recent hurricane force storms in both the Atlantic and the Pacific is to attack the Arctic with warm, moist air from both sides. Whilst we wait to see exactly how this much shorter term forecast pans out, particularly at the North Pole itself, the DMI’s graph of temperatures in the central Arctic has burst back into life after a “brief hiatus” in the New Year. Here’s how it looks at the moment:

DMI-T80N-20160123

2015 Really Is “The Warmest Year in Modern Record”!

Our regular reader(s) may recall that this time last year we took umbrage at an article by David Rose in the Mail on Sunday about the joint NASA/NOAA press briefing outlining their findings about global surface temperatures in 2014.

We’ve been discussing Mr. Rose’s recent misleading “Tweets” about the Arctic with him:

As a consequence we also found ourselves in conversation with Gavin Schmidt of NASA about this year’s NASA/NOAA press briefing about global surface temperatures in 2015, which takes place on January 20th. Pencil it into your diary:

Climate experts from NASA and the National Oceanic and Atmospheric Administration (NOAA) will discuss the release of new data on 2015 global temperatures, and the most important weather and climate events of the year, during a media teleconference at 11 a.m. EST Wednesday, Jan. 20.

The teleconference panelists are:

Gavin Schmidt, director of NASA’s Goddard Institute for Space Studies in New York
Thomas R. Karl, director of NOAA’s National Centers for Environmental Information in Asheville, North Carolina, and chair of the Subcommittee on Global Change Research for the U.S. Global Change Research Program in Washington

Media can participate in the teleconference by calling 888-790-1804 (toll-free in the United States and Canada) or 415-228-4885 (international) and use the passcode “climate.”

Audio of the briefing, as well as supporting graphics, will stream live.

Whilst we wait with bated breath for the NASA/NOAA announcement, here’s how the Gavin, David & Snow show has been going over on Twitter:

 

You will note from the exchange on Twitter that the Berkeley Earth Surface Temperature project are one of the organisations that have already declared 2015 “The Warmest Year in the Modern Record”, which brings me to the Arctic connection. Tamino explains over at “Open Mind”, in an article entitled “Hottest Year On Record“:

When it comes to global temperature over land and sea, Berkeley produces two versions, different in the way they treat areas covered with sea ice. Version 1 uses air temperature estimates for sea-ice covered regions, version 2 uses ocean temperature estimates.

and quotes BEST as follows:

For most of the ocean, sea-surface temperatures are similar to near-surface air temperatures; however, air temperatures above sea ice can differ substantially from the water below the sea ice. The air temperature version of this average shows larger changes in the recent period, in part this is because water temperature changes are limited by the freezing point of ocean water. We believe that the use of air temperatures above sea ice provides a more natural means of describing changes in Earth’s surface temperature.

As Tamino puts it:

Let’s not keep you in suspense any longer. Here are annual averages through 2015 (which is now complete) according to version 1:

2015-berk1

Here it is according to version 2:

2015-berk2

Any way you look at it, 2015 is the hottest. Any way you look at it, there was no “pause” in global temperature.

[Edit – 17:30 UTC on January 20th 2016]

The joint NASA/NOAA media briefing on 2015 global average surface temperatures has just finished. The recording of the event is due to go online “in 2 hours” or so from:

http://www.noaanews.noaa.gov/advisories/011516-advisory-noaa-nasa-to-announce-official-analyses-of-2015-global-temperature-climate-conditions.html

You can get a good flavour of things by taking a look at the second half of the Twitter “Storify” above. This one sums things up:

Whilst we all wait for the recording to emerge, you can download the slides for the briefing from:

http://www.nasa.gov/sites/default/files/atoms/files/noaa_nasa_global_analysis_2015.pdf

and here’s a video summary NASA have released:

Note too that the UK Met Office released their numbers this afternoon. Here’s how they look:

and here’s the Met Office’s video in which Peter Stott explains their conclusion that:

2015 – Warmest year on record globally

I waited patiently in the NASA/NOAA queue to ask some Arctic related questions, but never received the call. I’ll let you know when I receive the promised answers by email.

The Telegraph is Wrong Again on Temperature Adjustments

Regular readers will recall that we recently sent The Telegraph a lesson or two about global surface temperature “adjustments”, both of which included “a video by a scientist who has studied such matters”. It seems nobody at The Telegraph, least of all Christopher Booker, bothered to watch it or do the homework assignments. The stated view of Jess McAree, Head of Editorial Compliance at the Telegraph Media Group, is that:

Only the most egregious inaccuracy could be significantly misleading.

We therefore take great pleasure in welcoming Dr. Kevin Cowtan from the University of York, the “scientist” mentioned above, who has kindly allowed us to reprint an article of his originally published at Skeptical Science. Please read on below the fold, and don’t forget to do your homework!


There has been a vigorous discussion of weather station calibration adjustments in the media over the past few weeks. While these adjustments don’t have a big effect on the global temperature record, they are needed to obtain consistent local records from equipment which has changed over time. Despite this, the Telegraph has produced two highly misleading stories about the station adjustments, the second including the demonstrably false claim that they are responsible for the recent rapid warming of the Arctic.

In the following video I show why this claim is wrong. But more importantly, I demonstrate three tools to allow you to test claims like this for yourself.

The central error in the Telegraph story is the attribution of Arctic warming (and somehow sea ice loss) to weather station adjustments. This conclusion is based on a survey of two dozen weather stations. But you can of course demonstrate anything you want by cherry picking your data, in this case in the selection of stations. The solution to cherry picking is to look at all of the relevant data – in this case all of the station records in the Arctic and surrounding region. I downloaded both the raw and adjusted temperature records from NOAA, and took the difference to determine the adjustments which had been applied. Then I calculated the trend in the adjustment averaged over the stations in each grid cell on the globe, to determine whether the adjustments were increasing or decreasing the temperature trend. The results are shown for the last 50 and 100 years in the following two figures:

Trend in weather station adjustments over the period 1965-2014, averaged by grid cell. Warm colours show upwards adjustments over time, cold colours downwards. For cells with less than 50 years of data, the trend is over the available period.

Trend in weather station adjustments over the period 1915-2014, averaged by grid cell. Warm colours show upwards adjustments over time, cold colours downwards. For cells with less than 100 years of data, the trend is over the available period.
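For readers who would like to try something similar themselves, here is a rough Python sketch of the procedure described above, with synthetic station records standing in for the real GHCN raw and adjusted files from NOAA, and an assumed 5-degree grid:

    # Sketch only: difference raw and adjusted records to recover the adjustments,
    # fit a linear trend to each station's adjustment series, then average those
    # trends over the stations falling in each grid cell.
    import numpy as np

    rng = np.random.default_rng(3)
    n_stations, n_months = 200, 600
    lats = rng.uniform(-90, 90, n_stations)
    lons = rng.uniform(-180, 180, n_stations)
    raw = rng.standard_normal((n_stations, n_months))
    adjusted = raw + 0.0005 * np.arange(n_months) * rng.standard_normal((n_stations, 1))

    months = np.arange(n_months)
    adj_trends = np.array([np.polyfit(months, adjusted[i] - raw[i], 1)[0]
                           for i in range(n_stations)])      # adjustment trend per station

    cells = {}
    for i in range(n_stations):
        key = (int((lats[i] + 90) // 5), int((lons[i] + 180) // 5))  # 5-degree cells (assumed)
        cells.setdefault(key, []).append(adj_trends[i])
    cell_means = {k: float(np.mean(v)) for k, v in cells.items()}
    print(len(cell_means), "occupied grid cells;",
          "mean adjustment trend:", round(float(np.mean(adj_trends)), 6))

Mapping cell_means onto the globe, with warm and cold colours for positive and negative values, gives plots of the kind shown above.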


The majority of cells show no significant adjustment. The largest adjustments are in the high Arctic, but are downwards, i.e. they reduce the warming trend. This is the opposite of what is claimed in the Telegraph story. You can check these stations using the GHCN station browser.

The upward adjustments to the Iceland stations, referred to in the Telegraph, predate the late 20th century warming. They occur mostly in the 1960’s, so they only appear in the centennial map. Berkeley Earth show a rather different pattern of adjustments for these stations.

Iceland is a particularly difficult case, with a small network of stations on an island isolated from the larger continental networks. The location of Iceland with respect to the North Atlantic Drift, which carries warm water from the tropics towards the poles, may also contribute to the temperature series being mismatched with records from Greenland or Scotland. However given that the Iceland contribution is weighted according to land area in the global records, the impact of this uncertainty is minimal. Global warming is evaluated on the basis of the land-ocean temperature record; the impact of adjustments on recent warming is minimal, and on the whole record it is small compared to the total amount of warming. As Zeke Hausfather has noted, the land temperature adjustments in the early record are smaller than and in the opposite direction to the sea surface temperature adjustments.

Impact of the weather station adjustments on the global land-ocean temperature record, calculated using the Skeptical Science temperature record calculator in ‘CRU’ mode.

Manual recalibration of the Iceland records may make an interesting citizen science project. Most of the stations show good agreement since 1970; however, they diverge in the earlier record. The challenge is to work out the minimum number of adjustments required to bring them into agreement over the whole period. But the answer may not be unique, and noise and geographical differences may also cause problems. To facilitate this challenge, I’ve made annualized data available for the eight stations as a spreadsheet file.

In the video I demonstrate three tools which are useful in understanding and evaluating temperature adjustments:

  • A GHCN (global historical climatology network) station report browser. GHCN provide graphical reports on the adjustments made to each station record, but you need to know the station ID to find them. I have created an interactive map to make this easier.
  • The Berkeley Earth station browser. The Berkeley Earth station reports provide additional information to help you understand why particular adjustments have been made.
  • The Skeptical Science temperature record calculator. This allows you to construct your own version of the temperature record, using either adjusted or unadjusted data for both the land and sea surface temperatures.

Data for the temperature calculator may be obtained from the following sources:

Finally, here are some interesting papers discussing why adjustments are required.

  • Menne et al (2009) The U.S. historical climatology network monthly temperature data, version 2.
  • Bohm et al (2010) The early instrumental warm-bias: a solution for long central European temperature series 1760–2007.
  • Brunet et al (2010) The minimization of the screen bias from ancient Western Mediterranean air temperature records: an exploratory statistical analysis.
  • Ellis (1890) On the difference produced in the mean temperature derived from daily maximum and minimum readings, as depending on the time at which the thermometers are read

 

Finally, for the moment at least, and lapsing back into our by now familiar (and adversarial?) style:

 

Us:


 

Them:

We’ll keep you posted!

 

Willie Soon Gate at the Mail Online

I’ve just had a long phone conversation with John Wellington of the Mail on Sunday. He assures me he is fit and well and back in his hot seat there, but that the long standing bone that I’ve been eager to pick with the Mail Online about Victoria Woollaston’s January 21st article entitled “Is climate change really that dangerous? Predictions are ‘very greatly exaggerated’, claims study” is the responsibility of Tal Gottesman. Here’s a brief extract of the article, to give you a little taste:

The paper, ‘Why models run hot: results from an irreducibly simple climate model’, was written by Lord Christopher Monckton of Brenchley, astrophysicist and geoscientist Willie Soon, Professor of Geography at the University of Delaware David Legates, and statistician Dr Matt Briggs.

It has been peer reviewed and is published in the journal Science Bulletin.

Mathematical equations used for large climate model typically require supercomputers that perform calculations quickly – some make more than 80 million calculations an hour.

and here’s the accompanying “infographic”:

I’ve been trying to get in touch with Vicky and/or her editor for several weeks now, so this is a big step forward! After a series of phone calls and emails that elicited no response, “Willie SoonGate” broke earlier this week, so…

Us:

 

which also elicited no response, so following John’s phone call:

Hello Tal,

John was kind enough to telephone me and pass on your email address.

Further to the correspondence copied below you will note that I have had a singular lack of success directing my enquiries on the above topic to the generic MailOnline editorial address.

At the risk of repeating myself repeating myself

How do you suggest we go about starting a conversation?

Best wishes,

Jim Hunt

followed by:
 

Them:

Thank you for your email which has been passed to the Managing Editor’s office.

We understand that you would like to submit a complaint on the story below:

http://www.dailymail.co.uk/sciencetech/article-2920311/Is-climate-change-really-dangerous-Predictions-greatly-exaggerated-claims-study.html

If you could please outline what the issue is, we will ensure your complaint is investigated immediately.

Yours sincerely

MailOnline

 

Us:

Dear Madam,

Your understanding is (at long last) correct!

My first complaint is that I have received no previous response, not even an acknowledgement, to my emails of January 26th, February 2nd and February 23rd 2015. Can I safely assume that conversation using this email address will be more timely from now on? Do you by any chance possess a telephone number?

Moving on to the article in question, I will prepare a more substantive reply as soon as I have finished addressing similar issues with the Sunday Telegraph and the BBC. In brief the paper by Monckton, Soon, Legates & Briggs described in the article in question is scientific nonsense. As a famous scientist called Albert Einstein allegedly once put it:

“Everything should be as simple as possible, but not simpler”

As such the article is scientifically inaccurate and/or misleading, and as I am sure you must be aware allegations have recently been made elsewhere about possible reasons for that. I have been denied any discussion about a reply or “correction” to the article for 5 weeks thus far, and counting. Did you hear back from the IPCC by the way? If so, what did they say and where did you “print” it?

Best wishes,

Jim Hunt

 

Them:

We’ll keep you posted!

 

A Letter to the Editor of the Sunday Telegraph

I called Ian Marsden, managing editor at the Telegraph Media Group, earlier this week and informed him that I wished to register a complaint about some of their content. Ian told me that in the shiny new world of the Independent Press Standards Organisation the first thing I would need to do is fill in a form. That is what I have just done:

Us:

See also the print version of Christopher Booker’s article.

As I mentioned in my telephone conversation with Ian Marsden, this article is so full of scientific inaccuracies that it’s hard to know where to begin, and what actions The Telegraph could take that would be sufficient to correct the incredibly misleading portrayal of the underlying science.

As Ian is well aware, my particular specialisation is the Arctic, so let’s start there. Booker starts off:

“New data shows that the ‘vanishing’ of polar ice is not the result of runaway global warming”

What “new data”? There is none!

He goes on to say “Homewood has now turned his attention to the weather stations across much of the Arctic, between Canada (51 degrees W) and the heart of Siberia (87 degrees E). Again, in nearly every case, the same one-way adjustments have been made, to show warming up to 1 degree C or more higher than was indicated by the data that was actually recorded.”

That’s “old data” and the statement is inaccurate. Have you heard of Steven Mosher? The author of “Climategate – The Crutape Letters”? He tells me:

https://andthentheresphysics.wordpress.com/2015/02/09/guest-post-skeptics-demand-adjustments/#comment-47424

“Looking at some maps I have of the Arctic It looks to me like we “cool” the Arctic. That is but for our adjustments the raw data would show a warmer arctic. I’ll try to check that in detail.

The Homewood approach (and by extension Delingpole and Booker) is pretty simple. Look for stations that are warmed and complain. Of course, he fails to look at the entire picture, fails to look at the large parts of Africa (20% of the globe) that our algorithm “cools”.

By looking at the whole we know that the scientifically interesting result (the world is getting warmer) STANDS. it stands with adjustments. It stands with no adjustments. Any local detail that may be wrong or questionable is not material to this conclusion.”

Here’s a video by a scientist who has studied such matters, which explains the truth:

Watch it, check the inaccuracy of Booker’s statements for yourself if you so desire, then get back to me. I’ll be more than happy to go through all the other inaccurate and misleading statements in the article once you have attempted to justify this one.

Them:

From an email dated 20/02/2015 17:55:

Dear Mr Hunt

The fiddling with temperature data is the biggest science scandal ever, 7 Feb 2015
and The Sunday Telegraph, Feb 8 2015

Thank you for contacting us about this article.

As you are aware, climate change is a complex and controversial topic. A newspaper is not a scientific journal, and is not required to represent all the possible shades of evidence and interpretation that might have a bearing upon any given topic.

This is clearly an opinion article and identifiable as such. Against the background described above, readers can be expected to understand that any evidence offered is almost certainly contestable. It follows that in an opinion article of this nature only the most egregious inaccuracy could be significantly misleading. None of the points you raise qualify as such.

The phrase ‘new data’ is readily understandable, in context, as meaning the new study into existing Arctic weather station data undertaken by Paul Homewood, which is the focus of the article.

You say that Homewood’s analysis is ‘inaccurate’, and seek to prove this by reference to the work of others. The existence of contrary views and interpretations does not negate Christopher Booker’s right to describe Homewood’s findings and comment upon them. There is nothing in the points you raise that would engage the terms of the Editor’s Code of Conduct.

I trust this is of some assistance.

Yours sincerely

Jess McAree | Head of Editorial Compliance

 

Us:

Jess McAree’s email didn’t include a telephone number, so I called The Telegraph’s switchboard (on the morning of February 24th). They told me “He doesn’t take calls”. I persisted and they put me through to Andy, who assured me that whilst Mr. McAree was currently in a meeting he would tell him that I had called as soon as he emerged. Whilst waiting for a call back I registered another complaint via The Telegraph’s online form, this time checking the “Opportunity to reply” box:

This is a supplementary note to my original complaint of February 13th 2015, a copy of which is available online here:

https://greatWhiteCon.info/2015/02/a-letter-to-the-editor-of-the-sunday-telegraph/

It is now 11:30 on February 24th 2015. I spoke at length to Ian Marsden yesterday, and for some strange reason he didn’t mention Jess McAree’s email of the 20th inst. to me. Does the left hand at The Telegraph not know what the right hand is doing? I pointed out to Ian that your complaints policy states:

“We aim to acknowledge your complaint within 5 working days of receipt”

Ian reminded me about the “We aim” bit, and assured me that my complaint was being dealt with. Following the recommendation of an IPSO complaints officer I am registering this further complaint about the lack of a timely “right to reply” on what Ian referred to yesterday as The Telegraph’s “audit trail”. I shall also send a more detailed response to his email to Mr. McAree’s personal email address.

 

Them:
 
2015-02-24_1200_Telegraph 

Us:

We’ll keep you posted!

The Science of the David Rose “Climate of Hate” Self-Interview

Much like yesterday I was idly browsing my Twitter feed this morning whilst simultaneously consuming my habitual Sunday coffee + BLT when news reached me that David Rose had published yet another article in the Mail on Sunday that purports to investigate “climate science”:


 

Here it is:

Climate of Hate: His children are urged to kill him, he’s compared to Adolf Hitler and labelled a ‘denier’ – even though he’s Jewish. Disturbing article reveals what happens if you dare to doubt the Green prophets of doom

Perhaps due to all our sterling work here at the Great White Con extracting the Michael, it doesn’t seem to fall under the Mail’s “Great Green Con” banner anymore. The general drift is the same though, apart from that lurid title of course!

I think current ‘renewable’ sources such as wind and ‘biomass’ are ruinously expensive and totally futile. They will never be able to achieve their stated goal of slowing the rate of warming and are not worth the billions being paid by UK consumers to subsidise them.

Skipping over all the (merely rhetorical?) self-pity, let’s move on to the climate science, such as it is!

Last Monday… a Met Office press release stated: ‘2014 one of the warmest years on record globally’.

The previous week, almost every broadcaster and newspaper in the world had screamed that 2014 was emphatically The Hottest Year Ever. They did so because NASA told them so. Its Goddard Institute for Space Studies (GISS), the custodian of one of the main American temperature datasets, had announced: ‘The year 2014 ranks as Earth’s warmest since 1880.’ If you’d bothered to click on the sixth of a series of internet links listed at the end of the press release, you could have found deep within it the startling fact that GISS was only ’38 per cent confident’ that 2014 really did set a record.

In other words, it was 62 per cent confident that it wasn’t. Another detail was that the ‘record’ was set by just two hundredths of a degree. The margin of error was five times bigger. These boring details were ignored. The ‘2014 was a record’ claim went to the very top. President Obama cited it in his State of the Union address. Like the news outlets, it’s unlikely he will issue a correction or clarification any time soon.

Al Gore repeatedly suggested that the Arctic would likely be ice-free in summer by 2014. In fact Arctic ice has recovered in the past two years, and while the long term trend is down, it looks likely to last several more decades.

Unfortunately that is misleading and/or inaccurate, apart from the bit about the long term trend in Arctic sea ice. Hence I’ve just popped yet another Dear John (and Poppy) virtual letter to Mr. Rose’s managing editor (+PA) at the Mail on Sunday, and I’ll have yet another long chat with IPSO tomorrow:

Us:

Dear John/Poppy,

Would you believe that David Rose is at it again? Not only is he “interviewing” himself in your esteemed organ today, he is misrepresenting the underlying science yet again.

I really must insist that whoever owns the desk on which the buck currently stops for the following article starts communicating with me yesterday if not sooner:

http://www.dailymail.co.uk/news/article-2934540/What-happens-dare-doubt-Green-prophets-doom.html

Best wishes,

Jim Hunt
 

Them:

I am away from the office until Tuesday, February 10. I will be checking emails occasionally but if your message is urgent, please contact my assistant Poppy Swann.

Ultimately followed by:

Dear Jim

If you have a complaint about last Sunday’s article, you should set out exactly what it is. If you disagree with any opinions expressed you are welcome to write a letter that we will consider for publication.

You mention that you have sent us a number of inquiries recently. The only other, to my knowledge is that you wanted to know the source of some data that David Rose mentioned in an article some months ago. David Rose told me it came from the official website. Perhaps my colleague Poppy Hall can find it for you since David is probably unwilling to help after your insult.

Best regards

John

 

Us:

Dear Poppy (and John)

Please would you ask David to let me know where exactly, and on which “official website”, he obtained the DMI extent numbers he quoted in his article last Summer?

FYI John, at Poppy’s suggestion I have also emailed the editorial team @MailOnline. They have yet to even acknowledge receipt of my email of January 26th.

Best wishes,

Jim Hunt

 

Them:

We’ll keep you posted!

Was 2014 Really “The Warmest Year in Modern Record”?

I don’t usually get involved in debates about “the global warming pause”, but as you will eventually see there is an Arctic connection, so please bear with me. Personally I reckon “global heat” is more relevant than “global surface temperature”, but nevertheless NASA and NOAA issued a “news release” a couple of days ago stating that:

The year 2014 ranks as Earth’s warmest since 1880, according to two separate analyses by NASA and National Oceanic and Atmospheric Administration (NOAA) scientists.

The 10 warmest years in the instrumental record, with the exception of 1998, have now occurred since 2000. This trend continues a long-term warming of the planet, according to an analysis of surface temperature measurements by scientists at NASA’s Goddard Institute for Space Studies (GISS) in New York.

In an independent analysis of the raw data, also released Friday, NOAA scientists also found 2014 to be the warmest on record.

The announcement was accompanied by this video:

I figured our old friend David Rose would have something to say about all that in the Mail on Sunday, and I was not disappointed. Yesterday David reported, in bold headlines:

Nasa climate scientists: We said 2014 was the warmest year on record… but we’re only 38% sure we were right

  • Nasa’s Goddard Institute for Space Studies claimed its analysis of world temperatures showed ‘2014 was the warmest year on record’

  • But it emerged that GISS’s analysis is subject to a margin of error

  • Nasa admits this means it is far from certain that 2014 set a record at all

David Rose includes this NASA video in the online version of his article:

which finishes up showing the Arctic blanketed in red for the period 2010-14. In the body of the article David suggests that:

GISS’s director Gavin Schmidt has now admitted Nasa thinks the likelihood that 2014 was the warmest year since 1880 is just 38 per cent.

but for some strange reason David neglects to mention this NASA/NOAA “press briefing”, which includes the following figure:

[Figure: NASA/NOAA table of probabilities that individual years were the warmest on record]

or this January 16th “Tweet” from Gavin Schmidt:

all of which was discussed on the NASA/NOAA conference call last Friday, a recording of which is available from the NOAA website:

http://www.noaanews.noaa.gov/advisories/011415-advisory-2014-global-climatehighlights.html

As you can see and hear, Gavin Schmidt’s “admission” was pretty public, and available for anyone doing their due diligence on this thorny topic to see well before the Mail on Sunday published David Rose’s article. For still more from Gavin see also the second half of yet another video from NASA, which we’ve hastily made embeddable from YouTube since NASA’s Goddard Space Flight Center don’t seem to have done so themselves as yet:

[Edit – 23/01/2015]

By way of further elucidation of the NASA/NOAA table of probabilities above, here’s a new graphic courtesy of Skeptical Science:

[Graphic: Skeptical Science – probability of being the warmest year, NOAA data as at January 2015]

Given the margin of uncertainty and the small differences between years, the probability that 2014 was the warmest year is almost ten times that of 1998. And the contrarians were very certain that year was warm!

Does that help make things clearer, for those who evidently have difficulty understanding statistics? A rough sketch of how such probabilities can be estimated follows below.

[/Edit]
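
For the statistically curious, here’s a minimal sketch of how a “probability of being the warmest year” of that sort can be estimated: perturb each year’s anomaly with random measurement error and count how often each year comes out on top. Please note that the anomaly values and the ±0.05°C (1σ) uncertainty below are illustrative assumptions of mine, not the official GISS or NOAA figures, and the agencies’ own calculations are rather more sophisticated than this:

```python
# Rough Monte Carlo sketch of "probability of being the warmest year".
# The anomalies and the 0.05°C (1 sigma) uncertainty are illustrative
# assumptions, NOT the official GISS/NOAA numbers.
import random
from collections import Counter

anomalies = {      # illustrative annual global anomalies (°C)
    1998: 0.63,
    2005: 0.65,
    2010: 0.66,
    2013: 0.65,
    2014: 0.68,
}
sigma = 0.05       # assumed measurement uncertainty (1 sigma, °C)
trials = 100_000

wins = Counter()
for _ in range(trials):
    # Perturb every year by its own random measurement error...
    sample = {yr: t + random.gauss(0.0, sigma) for yr, t in anomalies.items()}
    # ...and record which year comes out warmest in this trial.
    wins[max(sample, key=sample.get)] += 1

for yr, n in wins.most_common():
    print(f"{yr}: {100 * n / trials:.1f}% chance of being the warmest")
```

With numbers in that ballpark no single year comes out as a statistical certainty, yet the most recent year is still several times more likely than 1998 to top the list – which is precisely the point the table and the graphic above are making.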

I also figured that the likes of “Steve Goddard” and Anthony Watts would be jumping on the same bandwagon, so you can imagine my disappointment when I discovered that they have both, unlike Gavin, blocked me from their Twitter feeds! Venturing over to the so-called “Real Science” blog instead, I discovered that Steve/Tony does at least read Gavin’s Twitter feed, although apparently not NASA/NOAA press briefings:

 

Them:

Implausible Deniability

Gavin is playing his usual game, trying to cover his ass with “uncertainty” that wasn’t mentioned in the NASA press release.

They get the propaganda out there for the White House and major news outlets, then try to generate implausible deniability through back channels like twitter. None of this was mentioned in the NASA press release.

Us:

I take it you weren’t on the call either Tony? Have you by any chance seen this press briefing?

http://www.ncdc.noaa.gov/sotc/briefings/201501.pdf

 

Them:

I’m amazed you have the gall to show up around here, after saying I should be jailed for accurately reporting and predicting Arctic ice.

Pathetic and quite psychotic Jim. And the NASA press release said nothing about uncertainty or satellites.

http://www.nasa.gov/press/2015/january/nasa-determines-2014-warmest-year-in-modern-record/

World class wanker

 

Us:

I’ll take that as a no then.

Since you mention it, how did your 2014 Arctic sea ice predictions work out in the end?

 

Them:

Almost spot on.

[Chart: “Steve Goddard’s” DMI Arctic sea ice extent graphic, 23rd April 2014]

 

Us:

“The NASA press release said nothing about uncertainty”

I didn’t say it did. I did however answer Daffy Duck’s question for him. What precisely is “pathetic and quite psychotic” about that?

No answer to that question as yet, so……

That’ll teach me to get involved in debates about “the global warming pause”. I can feel another blog post or two coming on!

What do you make of this recent Arctic sea ice extent chart from your beloved DMI?

[Chart: DMI “old” Arctic sea ice extent graph, 19th January 2015]

Them:

Sensor error. Happens quite often. maybe you should go blog about and call for people to be jailed.

 

Us:

For once I agree with you, about the “sensor error” in the most recent 2015 data at least.

Actually I was wondering how that data justifies your “almost spot on” claim for 2014 above. See for example:

https://stevengoddard.wordpress.com/2014/08/01/my-arctic-forecast-4/

“The minimum this summer will likely be close to the 2006 minimum, which was the highest minimum of the past decade.”

That’s not really how things turned out, is it?

 

See “Implausible Deniability of 2014 Arctic Sea Ice Predictions” for further “debate” about Arctic sea ice. Meanwhile back to temperature…..

Them:

THE DATA ON WEATHER AND CLIMATE (NASA AND NOAA) CAN BE COMPARED TO THE STOCK MARKET ON WALL STREET, MUCH CORRUPTION AND ALTERING. WE ARE NOT GUARANTEED A CERTAIN TEMPERATURE EVERYDAY; ALTHOUGH, THAT IS WHAT THEY WOULD HAVE US THINK, JUST BECAUSE OF SEASONS IN GENERAL.

http://www.nbcnews.com/science/environment/2014-breaks-record-warmest-year-noaa-nasa-experts-say-n287551

 

Us:

What do you make of this bullish channel?
[Chart: “There is no pause” – global temperatures with a rising trend channel]
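
For anyone wondering what a trend “channel” on a temperature chart actually involves, here’s a minimal sketch of fitting an ordinary least-squares trend line to some annual anomalies and wrapping a “channel” around it at the width of the largest residual. The numbers below are illustrative stand-ins, not the actual GISTEMP or NOAA series:

```python
# Minimal sketch of a linear trend "channel" fitted to annual global
# temperature anomalies. Illustrative numbers only, not the real series.
years = list(range(1998, 2015))
anoms = [0.63, 0.42, 0.42, 0.54, 0.63, 0.62, 0.54, 0.68, 0.64,
         0.66, 0.54, 0.66, 0.72, 0.61, 0.65, 0.68, 0.75]

n = len(years)
mean_x = sum(years) / n
mean_y = sum(anoms) / n

# Ordinary least-squares slope and intercept
sxx = sum((x - mean_x) ** 2 for x in years)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, anoms))
slope = sxy / sxx
intercept = mean_y - slope * mean_x

# "Channel" half-width: the largest deviation from the trend line
residuals = [y - (slope * x + intercept) for x, y in zip(years, anoms)]
half_width = max(abs(r) for r in residuals)

print(f"Trend: {slope * 100:+.2f}°C per century")
print(f"Channel: trend line ± {half_width:.2f}°C")
```

Fit a line like that through the real surface temperature records and the slope comes out positive, which is presumably what the chart above is getting at.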

 

Further to previous correspondence on similar matters, on January 27th 2015 I received the following email from the Personal Assistant to John Wellington, David Rose’s managing editor at the Mail on Sunday:

Them:

Dear Jim,

Thank you for your email.

I am afraid the best person to deal with your question is John Wellington who will reply on his return at the beginning of March.

Thank you for your patience.

Kind regards

Poppy Hall

 

Us:

CC: IPSO.co.uk

Dear Poppy,

Thanks for that information, but I am afraid my almost infinite patience in this matter is exhausted.

In John’s absence perhaps I might reiterate a question posed by Bob Ward of The Grantham Institute on Twitter yesterday:

Please would you ask whoever owns the desk on which the buck currently stops for the article entitled “Nasa climate scientists: We said 2014 was the warmest year on record… but we’re only 38% sure we were right” by David Rose to communicate with me as soon as possible. FYI – Here it is:

https://archive.today/SUTA8

As I’m sure you must realise by now, unfortunately it includes some inaccurate and/or misleading statements which as far as I can ascertain have still not been publicly corrected.

Best wishes,

Jim Hunt

 

Post Script:

Bob Ward lodged a formal complaint with the Independent Press Standards Organisation about the Mail on Sunday article. Their conclusion?

The complaint was not upheld.

Remedial Action Required – N/A

Date complaint received: 13/02/2015
Date decision issued: 22/06/2015

Their “reasoning”?

The Committee noted that information about the margin of error had been made available by GISS, but that it was not in dispute that these details had been omitted from the press release. The article had made clear that this specifically was the basis for its criticism of Nasa, and the newspaper was entitled to present its view that this omission represented a failure on the part of the organisation. While the information had been released by Nasa, it had been released to a limited selection of people, in comparison to those who would have had access to the press release, and had not been publicised to the same level as the information in the release. The press briefing images referred to by the complainant were available on Nasa’s website, but were not signposted by the press release. In this context, it was not misleading to report that the information relating to the margin of error had emerged in circumstances where the position was not made clear in the press release. While these details of the margin of error may have been noted in a press briefing two days previously, rather than “yesterday”, as reported, this discrepancy did not represent a significant inaccuracy requiring correction under the terms of the Code.