- The media is widely reporting 2014 as the warmest year on record, yet the underlying data don't support that conclusion.
- The data actually support a different finding: 2014 statistically tied with 2010 and 2005 within the margin of error, reflecting little warming since 2005.
I suppose it's understandable that the Post's editors and those of many other media outlets reporting the same finding might rely on the expertise of the government agencies involved, rather than digging deeper. The Post's columnists apparently based their comments on information provided to the media by NASA's Goddard Space Flight Center. The NASA press release, titled "NASA, NOAA Find 2014 Warmest Year in Modern Record," included links enabling one to scrutinize the raw data upon which this conclusion was based. I've reproduced the relevant portion below in picture form as of noon today, since these data are subject to periodic revisions.
NASA's dataset displays the differences between measured temperatures and the 14.0°C average for 1951-80. On this basis it confirms that the average recorded temperature in 2014 was 0.02°C higher than that of the previous warmest year, 2010, which was in turn 0.01°C higher than 2005's. Unfortunately, neither that page nor the press release includes any information about the uncertainty inherent in these figures, which turns out to be larger than the increase from 2010 to 2014.
All physical measurements, including those from the weather stations providing data to NASA, are accurate only to plus-or-minus some error, and averaging them doesn't entirely negate that. Within the accuracy of these temperatures, it's not possible to distinguish among 2005, 2010 and 2014; they represent a statistical tie. That fact was explained, more clearly than I have done here, in a January 14, 2015 report from the team of scientists at Berkeley Earth. Hardly climate skeptics, this is the same group that made headlines a couple of years ago with a comprehensive study of existing climate data.
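The notion of a statistical tie can be sketched in a few lines of Python. The year-to-year differences below come from the NASA figures quoted above; the ±0.05°C uncertainty is purely illustrative, since the NASA page gives none — the only thing the argument needs is that the uncertainty exceeds the 0.02°C gap:

```python
# Anomalies relative to 2005, per the text: 2014 was 0.02 °C above 2010,
# which was 0.01 °C above 2005.
anomalies = {2005: 0.00, 2010: 0.01, 2014: 0.03}

# Illustrative half-width of each measurement's uncertainty interval
# (assumed here for the sketch; NASA's release did not state one).
UNCERTAINTY = 0.05  # °C

def statistical_tie(a, b, u=UNCERTAINTY):
    """Two readings are indistinguishable if their ±u intervals overlap."""
    return abs(a - b) <= 2 * u

print(statistical_tie(anomalies[2014], anomalies[2010]))  # True: a tie
print(statistical_tie(anomalies[2014], anomalies[2005]))  # True: still a tie
```

With any uncertainty larger than half the 0.02°C gap, the intervals overlap and the "record" claim dissolves into a tie, which is the Berkeley Earth point.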
Why does this distinction matter? After all, measured temperatures have warmed nearly 2° Fahrenheit since the early 20th century, as shown in the graph above. Whether last year or 2010 was warmer might seem like more of an academic point than a practical one. However, the refrain of "record temperature" reports gives a false sense that the warming is accelerating. Instead, as the Berkeley Earth report found, "the Earth's average temperature for the last decade has changed very little." That's a very different impression than the one created by the stories I saw, with implications for how we respond to the risks of climate change.
7 comments:
If you took the temperature data from 1860 until today to a college statistics class without labeling what the data were about, most students would conclude that they represent a segment of a sine wave going from minimum to maximum. Furthermore, if asked on a test to predict where the sequence was headed, they would predict down.
The globe has generally warmed since the trough of the Little Ice Age, interspersed with periods of "hiatus" and cooling. Most of that warming has been the result of natural variations, as was the cooling which produced the Little Ice Age. Our ability to instrumentally measure the extent of temperature variations began with the Central England Temperature record, in the mid-1600s, near the trough of the LIA. Our ability to measure global surface temperatures, such as it is, extends from the mid-1800s. Prior to those instrumental records, our understanding of climate history is based on the analysis of various proxy records.
The current concern regarding anthropogenic warming focuses on the period since the industrial revolution, with primary focus on the period since the 1950s. Man is believed to influence climate through two primary mechanisms: the emission of greenhouse gases, particulates and aerosols; and land use changes, including deforestation, agricultural expansion and the effects of structures such as cities. The individual effects of each of these factors are very difficult to assess. That assessment has been made more difficult by the poor quality of the surface temperature data, much of which is the result of land use changes in proximity to the sensors and poor sensor siting.
Climate science focuses on temperature anomalies, in part in recognition of the poor quality of the surface temperature data. However, the surface temperature anomalies are calculated from "adjusted" temperatures, rather than from the actual data, again in recognition of the poor quality of the data. Reliance on the calculated surface temperature anomalies must be based on one of two assumptions: that the accuracy of the temperature measurement at any site does not vary over the period of the anomaly calculation, as the result of either sensor degradation or changes in the environment surrounding the sensor location; or that the "adjustments" completely correct for any such variations. Both of these assumptions are questionable.
As a result, we find ourselves concerned about annual global surface temperature variations of one or two hundredths of a degree Centigrade, based on "adjusted" temperature measurements made predominantly to one half degree Centigrade, using sensors which are estimated to be in error by two or more degrees Centigrade on average, compounded by "infilling" of estimates where data do not exist. Further, we even report decadal trends based on these adjusted temperatures to thousandths of a degree Centigrade. The mind boggles!
As brief as our ability to measure temperatures is, particularly on a global scale, our record of accurately measured surface temperatures from sets of highly accurate instruments at "ideal" sites (the US Climate Reference Network) is even shorter. The satellite temperature record, our most comprehensive global record, is still subject to significant variation between the assessments of the two organizations tasked with analyzing it.
It is important to distinguish between what we KNOW and what we BELIEVE.
Ed,
I recall my Chem 1A prof's heavy emphasis on the concepts of accuracy, precision and significant figures. While I can understand how statistical techniques can improve the accuracy or precision of the data, I'm not sure I understand how they generate additional sig figs for the anomalies.
I would also think that in line with your discussion of how errors creep into the long-term record, the most recent data would generally be the most accurate.
Geoff,
Computers are excellent and prolific generators of insignificant digits.
Assuming that the custodians of the data were diligent in checking sensor calibration, the condition of the sensor enclosures and the nature of the surrounding terrain, that would likely be true. Regrettably, that is not the case. Based on the criteria used to evaluate the US CRN locations, the typical US surface temperature sensor is likely to be in error by 2°C or more, judging from its location and surroundings. That situation was documented by www.surfacestations.org and has not been resolved. Speaking as someone who spent half a career collecting and analyzing data from thermal systems, the surface temperature data "suck". The fact that the data cannot be used without "adjustment" says it all.
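The random-versus-systematic distinction behind this exchange can be sketched numerically. The noise and bias magnitudes below are arbitrary illustrations, not properties of any real sensor network: averaging many readings beats down random scatter roughly as 1/√N, but a bias shared by the sensors survives averaging untouched, so no amount of statistics manufactures accuracy (or significant figures) against it.

```python
import random

random.seed(42)

TRUE_TEMP = 15.0       # hypothetical true value, °C
RANDOM_NOISE = 0.5     # per-reading random scatter (illustrative)
SYSTEMATIC_BIAS = 2.0  # bias common to all readings, e.g. siting (illustrative)

def mean(xs):
    return sum(xs) / len(xs)

# Random error: the mean of many noisy readings converges on the truth.
readings = [TRUE_TEMP + random.gauss(0, RANDOM_NOISE) for _ in range(10_000)]
print(abs(mean(readings) - TRUE_TEMP))  # small: scatter averages out

# Systematic error: the same bias in every reading passes straight through.
biased = [r + SYSTEMATIC_BIAS for r in readings]
print(abs(mean(biased) - TRUE_TEMP))  # near 2.0: bias does not average out
```

This is why a network of well-calibrated sensors can justify quoting an averaged anomaly more finely than any single reading, while uncorrected siting biases of the kind documented at surfacestations.org cannot be averaged away.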
Note that GISS and NCDC use essentially the same data, or subsets of the same data, to produce their anomaly calculations. For December 2014 the GISS anomaly increased by 0.06°C, while the NCDC anomaly increased by 0.12°C. However, each discrete data point available for their analysis increased by exactly the same amount. (In the immortal words of Arte Johnson: "Velly intellesting!")
Neither of the satellite records shows 2014 as "the warmest year on record". We have yet to hear from the Hadley Centre, though its anomaly calculation was running ~0.2°C below GISS and NCDC, again using subsets of the same data.
Geoff,
One further observation regarding the accuracy of the more recent data. An "adjusted" temperature anomaly is the difference between the most recent "adjusted" temperature and the "adjusted" temperature at some time in the past, typically 30+ years. Therefore, even if the most recent data were the most accurate, they would be of little significance because of the short duration of the measurement period.
The US Climate Reference Network constitutes the most recent effort at accurate land surface temperature measurement. The sites are carefully selected and equipped with three highly accurate resistance temperature devices, which makes it possible to detect and correct instrument drift or failure. The data from these 100+ sites do not require "adjustment" prior to use.
http://wattsupwiththat.com/2015/01/27/uk-met-office-says-2014-was-not-the-hottest-year-ever-due-to-uncertainty-ranges-of-the-data/
The actual release on 2014 temperatures from the UK Met Office is somewhat different, at least in tone, from what's reported on WUWT. But it is clearly much closer to Berkeley Earth's conclusions than to NASA's:
http://www.metoffice.gov.uk/news/release/archive/2015/2014-global-temperature