A Graphical View Of Anomalies And Infilling

The use of anomalies and infilling cleans up the temperature record, as shown below.

[Graph: ScreenHunter_705 Jun. 29 16.20]
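For readers new to the jargon: "anomalies" are a station's departures from its own baseline average, and "infilling" estimates missing readings from neighboring stations. Below is a minimal Python sketch of both techniques in generic form; the baseline window, the toy station data, and the neighbor-averaging rule are illustrative assumptions, not NOAA's actual algorithm.

    # Generic sketch of "anomalies" and "infilling". The baseline period and
    # the neighbor-average infilling rule are illustrative assumptions only,
    # NOT the actual NOAA/USHCN procedure.

    BASELINE = range(1961, 1991)  # hypothetical 1961-1990 baseline

    def anomalies(series, baseline_years=BASELINE):
        """Convert {year: temp} to departures from the station's baseline mean."""
        base = [t for y, t in series.items() if y in baseline_years and t is not None]
        mean = sum(base) / len(base)
        return {y: (None if t is None else t - mean) for y, t in series.items()}

    def infill(station, neighbors, year):
        """If `station` lacks a reading for `year`, borrow the mean anomaly
        of whichever neighbors do have one."""
        if station.get(year) is not None:
            return station[year]
        vals = [n[year] for n in neighbors if n.get(year) is not None]
        return sum(vals) / len(vals) if vals else None

    # Tiny demo with a made-up 3-year baseline for brevity:
    a = {2000: 10.0, 2001: 11.0, 2002: 12.0, 2003: None}
    b = {2000: 9.0, 2001: 10.0, 2002: 11.0, 2003: 13.0}
    a_anom = anomalies(a, range(2000, 2003))
    b_anom = anomalies(b, range(2000, 2003))
    print(infill(a_anom, [b_anom], 2003))  # a's missing 2003 filled from b: 3.0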


12 Responses to A Graphical View Of Anomalies And Infilling

  1. MikeTheDenier says:

    Tony, I thought you would enjoy this. I’m sure you’ve seen it already 🙂

    https://wattsupwiththat.files.wordpress.com/2014/06/josh_kansas.jpg

  2. Keith says:

    I imagine “Steven Goddard” is puking as he sees this post at WUWT. While it is excellent, the irony is too thick: Real Science has been reporting this for ages and has taken heavy-duty criticism from WUWT, where Anthony is now essentially telling the same story.

  3. Chip Bennett says:

    Watts still doesn’t seem to want to make the only logical connection. Here’s my response (held in moderation; I’m not sure I’ve ever commented there before) to this passage from Anthony’s post:

    “This is not acceptable. It is not being honest with the public. It is not scientific. It violates the Data Quality Act…

    NOAA has been accused by others of ‘fabricating’ data, and while that is a strong word that I don’t like to use, it looks to be more and more accurate.

    That said, I don’t believe this is a case where somebody purposely has their hand on a control knob for temperature data; I think all of this is nothing more than artifacts of a convoluted methodology and typical bureaucratic blundering. As I’ve always said, never attribute malice to what can be explained by simple incompetence.”

    Mere incompetence – artifacts of a convoluted methodology and typical bureaucratic blundering – would reasonably be expected to produce a normal distribution of error, especially given the hundreds of thousands of data points impacted.

    But that’s not what we see. All of the error serves to cool the past, and to warm the present – in other words, all of the error serves to bolster the very assertion being promoted. At the micro level, as original, historic station data are being adulterated on a monthly basis, the data are modified upward or downward subtly. But at the macro level, the net adulteration always – always – serves to cool the past, and to warm the present.

    It is utterly implausible that the observed error is random.
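    A quick way to see the point numerically: with synthetic data and an arbitrary ±0.1 adjustment size (both assumptions for illustration only), sign-random errors leave the trend essentially untouched, while sign-consistent errors of the same magnitude manufacture one.

        # Synthetic illustration: random-sign vs. one-directional adjustments.
        # The series, the 0.1 magnitude, and the 1960 cutoff are made up.
        import random

        random.seed(0)
        years = list(range(1900, 2015))
        raw = [14.0] * len(years)  # flat synthetic record: the true trend is zero

        # Sign-random errors ("incompetence"): net effect on the trend ~ 0
        random_adj = [t + random.choice([-0.1, 0.1]) for t in raw]

        # Sign-consistent errors: cool the early years, warm the recent ones
        biased_adj = [t - 0.1 if y < 1960 else t + 0.1 for y, t in zip(years, raw)]

        def trend(series):
            """Crude trend: mean of the later half minus mean of the earlier half."""
            half = len(series) // 2
            return sum(series[half:]) / (len(series) - half) - sum(series[:half]) / half

        print(f"sign-random adjustments:     {trend(random_adj):+.3f} C")  # near zero
        print(f"one-directional adjustments: {trend(biased_adj):+.3f} C")  # ~ +0.19 C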

    • Exactly, and unbelievable that any skeptics would defend it.

    • _Jim says:

      “I think all of this is nothing more than artifacts of a convoluted methodology and typical bureaucratic blundering.”

      I speculated this had its roots many years ago in semi-automated procedures from the card-deck and tape-dataset days, before the widespread use of disks (DASD, or Direct Access Storage Devices). Back then, I/O consisted of reading in one ‘tape’ and performing some operation on the data as a new tape was written in ‘blocks’; this is how multi-megabyte datasets were ‘worked’ with only, say, 64K or 128K of ‘core’ (the pattern is sketched below).

      There may be work procedures, marked up (ECN’d, in drawing-control lingo) since the ’50s, describing the process; but no one who was alive then is able to get on the ’net today AND (note the ‘AND’) tell the story.
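      For what it’s worth, the tape-to-tape pattern looks roughly like this in modern terms. A sketch only, in Python rather than the era’s JCL or assembler, with made-up file names, block size, and adjustment step:

          # Tape-to-tape block processing: stream a dataset far larger than "core"
          # through a small fixed buffer. Python stands in for the era's JCL or
          # assembler; file names, block size, and adjust() are hypothetical.

          BLOCK_SIZE = 64 * 1024  # pretend only ~64K of "core" is available for data

          def adjust(block: bytes) -> bytes:
              """Placeholder for whatever per-block operation the job performed."""
              return block

          def tape_to_tape(in_path: str, out_path: str) -> None:
              with open(in_path, "rb") as in_tape, open(out_path, "wb") as out_tape:
                  while True:
                      block = in_tape.read(BLOCK_SIZE)  # read one block from input "tape"
                      if not block:  # end of tape
                          break
                      out_tape.write(adjust(block))  # write the transformed block out

          # Usage (hypothetical file names):
          # tape_to_tape("raw_tape.dat", "adjusted_tape.dat")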


      • slimething says:

        Jim,
        Thanks for pointing out the new format at WUWT, even after I had already read it. 🙂

        If there are ECNs, where are they? I don’t think anything is organized in such a manner. It appears to me to be done on the fly, and if the Boss likes the results, they go with it. There is no real V&V (verification and validation) done. If there were, there would be a reference.

        If our technical writer presented manuals without correct information on calibration procedures, maintenance schedules, etc., the way it is done in this sloppy field known as “climate science,” I’d have been out of a job 20 years ago.

        • Gail Combs says:

          Heck, I wrote up test methods for the lab and procedures for production in a number of itty-bitty companies with under 50 employees.
