A Closer Look At USHCN TOBS Adjustments

According to the USHCN documentation, the V2 TOBS adjustment hasn't changed since V1:

This net effect is about the same as that of the TOB adjustments in the HCN version 1 temperature data (Hansen et al. 2001), which is to be expected since the same TOB-adjustment method is used in both versions.

ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/v2/monthly/menne-etal2009.pdf

The V1 TOBS adjustment was +0.3°F, and went flat after 1990.

[Figure: ts.ushcn_anom25_diffs_pg.gif - USHCN V1 TOBS adjustment]

But the V2 adjustment (which is supposed to be the same) is twice as large, and negative.

[Figure: ScreenHunter_536 - USHCN V2 TOBS adjustment]
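
If an adjustment series like this is computed as the TOBS-adjusted data minus the raw data, averaged over stations, the step looks roughly like the sketch below. The file names and column layout are placeholders for illustration, not the actual USHCN station-file format.

    import pandas as pd

    # Per-station annual means, raw and TOBS-adjusted. The CSV layout
    # (columns: station, year, temp) is an assumption for illustration.
    raw = pd.read_csv("ushcn_raw_annual.csv")
    tobs = pd.read_csv("ushcn_tobs_annual.csv")

    merged = raw.merge(tobs, on=["station", "year"], suffixes=("_raw", "_tobs"))
    merged["adj"] = merged["temp_tobs"] - merged["temp_raw"]  # per-station TOBS adjustment

    # National TOBS adjustment: mean over all reporting stations in each year
    adjustment = merged.groupby("year")["adj"].mean()
    print(adjustment.loc[1990:])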

The next graph overlays the V2 adjustments (red) on the V1 adjustments (black). They claim that it is the same algorithm and nearly the same data.

[Figure: ScreenHunter_539 - V2 adjustments (red) overlaid on V1 adjustments (black)]

Below, I normalized the two data sets to 1940, and you can see that there is a large divergence between V2 and V1 after 1950, and particularly after 1990. In V1, TOBS goes flat after 1990, but in V2 it rises exponentially.

[Figure: ScreenHunter_538 - V1 and V2 TOBS adjustments normalized to 1940]
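
The overlay and the 1940 normalization come down to shifting each series so it reads zero in the baseline year and plotting both on one axis. A minimal sketch, assuming each adjustment series has been saved as a small CSV of annual values (the file names and the "adj" column are placeholders):

    import pandas as pd
    import matplotlib.pyplot as plt

    # Annual adjustment series saved earlier; names are placeholders.
    v1 = pd.read_csv("tobs_adjustment_v1.csv", index_col="year")["adj"]
    v2 = pd.read_csv("tobs_adjustment_v2.csv", index_col="year")["adj"]

    def normalize(series, base_year):
        """Shift a series so it reads zero in the chosen baseline year."""
        return series - series.loc[base_year]

    normalize(v1, 1940).plot(color="black", label="V1 TOBS adjustment")
    normalize(v2, 1940).plot(color="red", label="V2 TOBS adjustment")
    plt.ylabel("Adjustment relative to 1940 (°F)")
    plt.legend()
    plt.show()

Swapping 1940 for 1998 in the two plot calls reproduces the view normalized to 1998 shown further down.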

Here is their claim again.

This net effect is about the same as that of the TOB adjustments in the HCN version 1 temperature data (Hansen et al. 2001), which is to be expected since the same TOB-adjustment method is used in both versions.

One more view shows the adjustments normalized to 1998. As always, cooling the past to create the appearance of warming.

[Figure: ScreenHunter_543 - TOBS adjustments normalized to 1998]


6 Responses to A Closer Look At USHCN TOBS Adjustments

  1. Gail Combs says:

    Why the heck are they even doing TOBS adjustments? The min/max thermometer has been around for centuries!

    Six's registering thermometer, which can record the maximum and minimum temperatures, was invented in 1782. The most that would happen is someone might get a reading for the day before. Given that most people would read and record just after breakfast or after dinner, I doubt TOBS is a real issue. Anthony Watts and Jones are looking into the issue.
    http://www.jstor.org/discover/10.2307/531601?uid=3739776&uid=2&uid=4&uid=3739256&sid=21103445273577

    Here are the relevant comments from a recent thread at WUWT:

    1sky1 says:
    January 30, 2014 at 1:17 pm

    Gail Combs:

    I always suspected that AGWers drank gin-and-buttermilk martinis instead of kool-aid!

    On a more serious note, Six's max-min-registering thermometer has long been the standard instrument used by Met services in English-speaking countries. The TOBS error is a misnomer; it should be called "time-of-reading" error. Inasmuch as the temperature at time of reading is always recorded alongside, the possible error introduced when one of the registered extrema coincides with YESTERDAY'S temperature at time of reading can be readily fixed by simple clerical changes. Instead, Karl introduces a misguided blanket "correction" based upon empirical comparisons with hourly instantaneous temperatures, which have precious little relationship to diurnal extrema. Small wonder that no other Met service accepts his nonsense.

    IN THE USA:

    evanmjones says:
    January 30, 2014 at 9:20 am

    The newer USHCN systems, MMTS/Nimbus-plus-the-new-toys, and ASOS/AWOS, have automated data gathering. So you don’t have to get up at 6:00 (one of THE WORST times for observation, BTW) anymore.

    But it’s all the same Min-Max principle. (And that principle has slashed my dataset and gives me no end of headaches.)
    http://wattsupwiththat.com/2014/01/29/important-study-on-temperature-adjustments-homogenization-can-lead-to-a-significant-overestimate-of-rising-trends-of-surface-air-temperature/#comment-1554718

    And further down:

    …+0.124 Raw, but that is without accounting for MMTS conversion and you really do need to do that. Not all adjustments are out of court (unfortunately).

    (That being the current result.)

    This is probably the most significant of Evan’s comments:

    evanmjones says:
    January 30, 2014 at 10:58 am

    “They remove the outliers from the station population. It is the most significant cause of the great dying of the thermometers.”

    That applies far more to GHCN than it does to USHCN. In the latter case it is generally stations that have been closed for many years but are still part of the record that are removed. About 50 were replaced from USHCN1 to USHCN2 out of 1200+. I have surveyed, I guess, at least two thirds of the new stations. They appear, on the face of it, to be around as bad as the old ones, in terms of siting. (Long live the new boss, same as the old boss? Maybe. TBD.)

    But, yes, what you say is, in essence, correct for GHCN — I think — having not surveyed them myself (yet). And when most of the stations are poorly sited and therefore reading spuriously high, there is a distinct tendency to identify the good stations as outliers and remove them, and to promote the bad. This has a profound effect on trend, quite apart from step changes in offset. That stipulates that bad microsite affects not only offset, but trend, which is what we are hypothesizing. But you know this already. That wouldn't matter so much if the majority of sites were good and it was therefore the bad ones being identified as outliers.

  2. NikFromNYC says:

    An organizational flow chart is needed to visually trace the path from raw data sources to the various global average plots. For instance, is NASA really using raw data and thus creating their own burst in adjustments, or are they just piggybacking on another archive? Is the raw data itself being altered? After all these years I'm still confused for lack of a simple visual map of the territory.

  3. Brian H says:

    When the house is bottom-dealing, prepare to lose all your money.

  4. Anto says:

    As an aside, Anthony Watts has been showing for years that their homogenisation methods for siting issues also result in a cooling of the past. In fact, the method does this even for the best-sited stations available (i.e., even where there is no reason to touch a rural station's record for siting, their algorithms still adjust the past downwards at those stations):
    http://wattsupwiththat.com/2010/01/27/rumours-of-my-death-have-been-greatly-exaggerated/

  5. Anto says:

    It is also no surprise to discover (via that same post at Anthony’s site) that their adjustment for the change from mercury to MMTS measurements is exactly the opposite of what it should be. From the post:

    Menne et al. 2010 mentioned a "counterintuitive" cooling trend in some portions of the data. Interestingly enough, former California State Climatologist James Goodridge did an independent analysis (I wasn't involved in the data crunching; it was a sole effort on his part) of COOP stations in California that had gone through modernization, switching from Stevenson Screens with mercury LIG thermometers to MMTS electronic thermometers. He sifted through about 500 COOPs in California and chose stations that had at least 60 years of uninterrupted data, because, as we know, a station move can cause all sorts of issues. He used the "raw" data from these stations as opposed to adjusted data.

    He writes:

    Hi Anthony,
    I found 58 temperature stations in California with data for 1949 to 2008 where the thermometers had been changed to MMTS and the earlier parts were liquid-in-glass. The average for the earlier part was 59.17°F and the MMTS fraction averaged 60.07°F.

    Jim

    A 0.9°F (0.5°C) warmer offset due to modernization is significant, yet NCDC insists that the MMTS units are tested at about 0.05°C cooler. I believe they add this adjustment into the final data. Our experience shows the exact opposite should be done and with a greater magnitude.

  6. E. Martin says:

    NikFromNYC says: "An organizational flow chart is needed …"
    Indeed it is, and complete with the names and titles so as to expose the perps responsible for all the "adjustments".
