Wednesday, July 21, 2010

How Much Warmer?

If the present global temperature trend continues for the remainder of the year, we're bound to hear a growing chorus of reports about 2010 being the warmest year since records have been kept. The first six months of 2010 already appear to have been the warmest first half on record. Or were they? When you examine the numerical result from the National Climatic Data Center upon which this determination rests, it turns out that January-June of this year apparently topped the previous six-month record, set in 1998, by just 0.03°F. Not only is that difference quite small, but there's a good chance it doesn't exist at all: it may merely be the result of average temperature data being tallied to more decimal places than the accuracy of the instruments recording them warrants. So when someone tells you this is the warmest year ever, you should at least ask for more detail on that assertion.

Before going any further, let me clarify that this point doesn't affect the validity of climate change. When I look at the accumulating evidence, including the climate data that's publicly available, I see a decade-by-decade warming trend since the turn of the previous century, with a few time-outs. It's open to debate whether that trend is currently in abeyance; at this point, neither a couple of relatively cooler years recently--giving rise to claims of global cooling that must at a minimum be regarded as highly premature--nor a single hotter year this year necessarily alters it. However, there are good reasons why media-hyped claims about any one year being warmer or colder than another are pretty much irrelevant to the larger discussion concerning climate change. And in at least the current instance, they likely rest on a foundation that simply can't bear their weight.

The global annual temperatures we see reported are really averages of the averages of numerous temperature readings from thousands of weather stations around the world. Such averaged data can only be as accurate as the least-accurate individual readings on which they are based. This reflects a principle called "significant figures" or "significant digits" that is drummed into students of college chemistry and physics. (If you're interested in the details, the USGS has a good overview here.) So if you take three temperatures, say 59.1°F, 56.3° and 58.7°, their average is not 58.03333° (as my calculator tells me), or even 58.03°, in the manner that most of the climate data centers report such figures, but simply 58.0°. Furthermore, if you take the difference of two such averages, that difference can't be any more accurate than the individual readings. For example, if I subtract from the above figure the global average temperature of 57.2°F for the period 1951-1980 used by NASA's Goddard Institute for Space Studies (GISS), the result is not 0.83333° or 0.83°, but 0.8°.
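The arithmetic above can be sketched in a few lines of Python. The temperature values are the ones from the example; the one-decimal rounding stands in for the significant-figures rule:

```python
# Illustrative sketch of the significant-figures rule from the example above.
# Individual readings are only good to one decimal place, so the average and
# any difference of averages should be reported to one decimal place as well.

readings_f = [59.1, 56.3, 58.7]   # individual readings, °F

raw_average = sum(readings_f) / len(readings_f)
print(raw_average)                # 58.0333... -- what the calculator says

reported_average = round(raw_average, 1)
print(reported_average)           # 58.0 -- all the precision we're entitled to

# Subtracting the GISS 1951-1980 baseline of 57.2°F likewise can't
# manufacture extra precision in the resulting anomaly:
baseline_f = 57.2
anomaly = round(reported_average - baseline_f, 1)
print(anomaly)                    # 0.8, not 0.83333...
```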

Applying this common-sense principle becomes even more important when you take into account the actual accuracy and precision (repeatability) of the underlying measurements. A quick Google search turned up a 2004 paper in the Journal of Atmospheric and Oceanic Technology on this subject. After analyzing several sources of measurement error in commonly used air temperature sensors, the authors found that these devices were accurate to no better than +/-0.2°C over a typical range of temperatures, and less accurate beyond it. So not only are the temperature readings that go into the averages upon which comparisons of global annual temperatures are based only good to one decimal place, but they may not be quite that good. Even if some of this error averages out over the large number of observations recorded (assuming it is random error), we still shouldn't read more into these data than is there, and the second of the two digits in the "temperature anomalies" (differences vs. an agreed average) that are reported should probably only be used to ensure that rounding is done consistently.
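To make the "averaging out" caveat concrete, here's a toy simulation of my own--the station count and error model are assumptions for illustration, not actual network data. Each hypothetical sensor carries an independent random error of up to +/-0.2°C, and we look at how much of that error survives in the network-wide average:

```python
# Toy simulation (illustrative assumptions, not real station data): if each of
# n_stations sensors carries an independent random error drawn uniformly from
# +/-0.2°C, how much of that error survives in the network average?
import random

random.seed(0)              # fixed seed so the sketch is repeatable
true_temp_c = 14.6          # a hypothetical "true" global value, °C
n_stations = 2000           # assumed station count, for illustration only

trial_averages = []
for _ in range(100):
    readings = [true_temp_c + random.uniform(-0.2, 0.2)
                for _ in range(n_stations)]
    trial_averages.append(sum(readings) / n_stations)

spread = max(trial_averages) - min(trial_averages)
print(f"spread of averages over 100 trials: {spread:.3f} °C")
# The spread is on the order of a hundredth of a degree -- the random part of
# the error does shrink in the average. But this says nothing about any
# systematic bias in the sensors, which would not average out at all.
```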

What does all this mean in practice? Well, referring to the GISS data, it appears the global average temperatures for 1998, 2002, 2005, 2007 and 2009 were all essentially indistinguishable from each other at 14.6°C (58.3°F). 2010 might be on track to beat that by a full 0.1°C, though it could still easily end up in a tie with these other years. Whether this year sets a new record or not is of little consequence to the climate change discussion. Although not likely to compete with such a finding for headlines, it's much more relevant, important and accurate that the average of temperatures in the 2000s was apparently 0.2°C warmer than the average of the 1990s, which was in turn 0.1°C warmer than that of the 1980s, and so on.
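The comparison rule implied here can be sketched as a short helper function. The function name and the 0.1°C resolution threshold are my own assumptions for illustration; the annual values are the ones cited from the GISS data:

```python
# Sketch of the comparison rule above: two annual averages are "essentially
# indistinguishable" if they differ by less than the resolution we can trust,
# assumed here to be 0.1°C. (Names and threshold are illustrative, not from
# any official methodology.)

RESOLUTION_C = 0.1  # assumed trustworthy resolution, °C

def indistinguishable(a_c, b_c, resolution=RESOLUTION_C):
    """True if two temperatures differ by less than the trusted resolution.

    Rounding the difference first avoids floating-point artifacts such as
    14.7 - 14.6 evaluating to 0.0999... instead of exactly 0.1.
    """
    return round(abs(a_c - b_c), 1) < resolution

# The years cited from the GISS data, all 14.6°C at one-decimal resolution:
years = {1998: 14.6, 2002: 14.6, 2005: 14.6, 2007: 14.6, 2009: 14.6}
print(all(indistinguishable(t, 14.6) for t in years.values()))  # True

# A hypothetical 14.7°C year would differ by exactly one resolution step:
print(indistinguishable(14.7, 14.6))  # False -- a distinguishable difference
```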
