Quick Summary Of NCDC Data Tampering Forensics

It may not be obvious to everyone yet, but this morning’s TOBS discovery is huge.

I need to run now, but here is a quick summary of things I can prove so far about the US temperature record.

  • Until 1999 NASA said the US was on a long term cooling trend
  • Until 1989 NOAA said there was no long term warming in the US
  • Sometime after 2000, NOAA made a large downwards shift in the absolute baseline temperature. This is probably why Nick and Zeke keep insisting on the use of anomalies, as it hides the shift.
  • Temperatures are being adjusted by an average of about 1.5°F relative to the 1930s
  • The raw data does not support the validity of a TOBS adjustment
  • NOAA is doing something in their conversion from daily data to monthly data that creates a bias which selectively cools the past – which in turn creates the appearance that TOBS is valid (see the sketch after this list).
  • Since 1990, almost all warming is due to infilling of non-existent temperature data.
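
To make the daily-to-monthly point concrete, here is a minimal sketch of the kind of comparison implied: build monthly means directly from the raw daily TMIN/TMAX and difference them against a separately published monthly series. The file names and column layout are hypothetical placeholders, not NOAA’s actual pipeline.

```python
# Sketch only: compare raw-derived monthly means against a published monthly series.
# "raw_daily.csv" and "published_monthly.csv" are hypothetical placeholder files.
import pandas as pd

def monthly_from_daily(daily_csv="raw_daily.csv"):
    """Monthly mean of (TMIN+TMAX)/2 per station, using only raw daily rows."""
    df = pd.read_csv(daily_csv, parse_dates=["date"])        # columns: station, date, tmin, tmax
    df = df.dropna(subset=["tmin", "tmax"])                  # drop incomplete station-days
    df = df[df["tmax"] > df["tmin"]]                         # basic sanity filter
    df["tavg"] = (df["tmin"] + df["tmax"]) / 2.0
    df["month"] = df["date"].dt.to_period("M")
    return df.groupby(["station", "month"])["tavg"].mean()

def adjustment_by_month(published_csv="published_monthly.csv"):
    """Published minus raw-derived monthly mean, averaged across stations per month."""
    raw = monthly_from_daily()
    pub = pd.read_csv(published_csv, parse_dates=["month"])  # columns: station, month, tavg
    pub["month"] = pub["month"].dt.to_period("M")
    pub = pub.set_index(["station", "month"])["tavg"]
    diff = (pub - raw).dropna()                              # positive => published runs warmer than raw
    return diff.groupby(level="month").mean()

if __name__ == "__main__":
    print(adjustment_by_month().describe())
```

If the daily-to-monthly conversion were neutral, this difference would hover near zero in every era rather than trending.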

And to top it all off, the UHI adjustment is much too small. The US has been on a long term cooling trend for over 90 years, and used to be hotter. NCDC US temperature graphs do not even remotely resemble the actual US climate, and actually reverse the trend.

I will write this up in more detail later.


24 Responses to Quick Summary Of NCDC Data Tampering Forensics

  1. Password protected says:

    Thank you for your efforts.

  2. daveandrews723 says:

    Good work! Prepare for the warmists to demonize your research and conclusions. They can’t deal with the truth.

  3. Latitude says:

    I can’t wait!!!

  4. Henry says:

    thanks for all the work!

    JoNova reports that Ken Stewart is looking at some Australian govt numbers, and coming to similar skeptical conclusions as you:
    http://joannenova.com.au/2014/07/wow-look-at-those-bom-adjustments-trends-up-by-two-degrees-c/

    FWIW

  5. omanuel says:

    Steven,

    Climategate emails that surfaced in late November 2009 allowed us to understand:

    A. This video by the former Head of Radiation Safety at the Hanford plutonium-recovery plant, and

    https://www.youtube.com/watch?v=ejCQrOTE-XA&feature=player_embedded

    B. Why Hiroshima and Nagasaki completely recovered into bustling cities after the Second World War.

    C. How Stalin emerged powerfully to rule the world after:

    a) The CHAOS and FEAR of August 1945:

    https://dl.dropboxusercontent.com/u/10640850/CHAOS_and_FEAR_August_1945.pdf

    b) The destruction of the foundations of religion, science and democracy:

    https://dl.dropboxusercontent.com/u/10640850/Humanity_Lost_WWII.pdf

    c) False propaganda on:

    1. The internal compositions of stars were changed from mostly iron (Fe) in 1945 to mostly hydrogen (H) in 1946.

    2. Albert Einstein’s and Francis Aston’s valid equations for nuclear stability were replaced with Carl von Weizsacker’s invalid nuclear binding energy equation, exaggerating proton-proton repulsion and minimizing NEUTRON-NEUTRON REPULSION . . . the source of energy that powers the cosmos from the cores of heavy atoms, some planets, ordinary stars, galaxies and our ever-expanding universe!

  6. omanuel says:

    This CSPAN video shows the head of NASA in 1998 releasing isotope data from the 1995 Galileo probe of Jupiter, isotope data that confirmed:

    1. A 1972 report in Nature that meteorites formed out of unmixed supernova debris

    2. A 1977 debate in Science and a 1979 report in Nature that the Sun birthed the solar system

    3. A 1983 report in Meteoritics (based on analysis of Apollo samples returned to Earth in 1969) that the interior of the Sun is mostly iron encasing the pulsar core that Peter Toth reported in Nature in 1977.

    https://m.youtube.com/watch?v=m3VIFmZpFco

  7. tom0mason says:

    Steven, thank you for all your hard work; you are showing the world how the scam is being maintained. I predict that similar patterns of adjustments will be found in other countries.

    There is no reasonable explanation for all the ‘adjustments’ except politics and the agencies, bureaus, and data centers that need to stay attached to the teat.

  8. Eliza says:

    Maybe you should be thinking of doing “sticky posts” like this one, for example, which is too important to let pass, so that the media etc. can get a look at it.

  9. Dave Day says:

    I bought you a few sips of Chardonnay in the tip jar.

    Keep up the good work.

    Dave

  10. Scarface says:

    Steven, I just can’t wait to see you get a paper published that proves the malicious data tampering by these institutions, which you expose time and time again.
    Some other readers have also pointed to this conversation before:

    Lee says: February 24, 2014 at 8:12 pm
    Are you really “quite serious about this” or are you “just having fun”?
    I read your blog every day. I’m not sure when you sleep.
    I am going to assume that perhaps you are serious, and I hope you are.
    What is the most important subject you have been writing about lately? I think it is the data tampering. Do you believe what you are writing? Given the opportunity can you really prove it? Can you look at the subject from both sides and actually prove that the adjustments can only be explained as nefarious? If you can really accomplish that goal, you will be able to get the funding.
    Be serious and go for it.

    stevengoddard says: February 24, 2014 at 8:17 pm
    There is almost no money for skeptics.

    Lee says:
    February 24, 2014 at 8:50 pm
    You have my email address. If you are serious about this you should contact me. Funding can be had for something as important as this.

    Source: http://stevengoddard.wordpress.com/2014/02/24/show-me-the-money/

    Did you at one time contact him? You obviously don’t have to answer this question, but it looked like a serious proposal, hence this well-intended reminder.

    Keep fighting the good fight!

  11. emsnews says:

    Just this week they issued another report claiming that last year was one of the top ten hottest in history!!!

    And I bet that this globally cool summer they will say the same thing, thanks to using faked data that cools down the past so significantly that the super-hot 1930s are wiped out along with the Medieval Warm Period.

    This is utterly insane! Thanks for the research, Steven, on this data tampering.

  12. Anto says:

    Amazing, Steve. I’ve been following you for some years now. When you first started this investigation, I was already convinced that catastrophic AGW theory was bunkum. What I never expected to see was that the entire warming trend is an artifact of data manipulation.

    Your work is extremely important, not just on the subject of temperature manipulations, but also for the contemporaneous accounts from newspapers and magazines of the past. The latter provide corroborating evidence of the fraud.

    When this entire charade collapses, you will have played no small part in it.

  13. Truthseeker says:

    Steve,

    You need to get in touch with this guy who is doing the same type of analysis in Australia …

    http://kenskingdom.wordpress.com/2014/07/16/the-australian-temperature-record-revisited-part-4-outliers/

  14. anthonyvioli says:

    Yes Ken is doing a great job.

    So far his requests to the BOM have been flatly refused.

    We all know why.

  15. Crowbar of Daintree Forest says:

    Clearly, the authorities that produce these “homogenized” temperature data sets do not perform any meaningful “sanity-checking” on their results.

    I have never understood the value of destroying the true raw data covering Min and Max temperature to produce a fictitious Daily Temperature for a station. Surely the raw data values should be averaged across as many readings as possible to get a decent average? Starting the process with an average of 2 numbers (Min and Max) corrupts the raw data from the get-go.

    I would like to see the following simple sanity-check to determine roughly what the raw data are saying:

    1. Start with ALL raw daily Min and Max temps for the contiguous US. The more stations, the better, irrespective of their length of record or gaps.
    2. The only data-check is to ignore a daily station reading if its Max is not greater than its Min, or if either Min or Max is missing for that day.
    3. Calculate the following for each individual day:

    a. Average Daily Min across all stations
    b. Average Daily Max across all stations
    c. Average Daily Latitude across all stations
    d. Average Daily Altitude across all stations
    e. Total Number of Stations with a reading

    Just to re-iterate, these are the daily averages of (and a count of) the stations with valid Min and Max readings for that day.
    4. If a decent number of station records go back, say, 100 years, then this would produce 36,525 daily records. If that is too many to represent on a single date-based graph, then use the daily averages to produce the same set of monthly averages, giving 1,200 records.
    5. Produce a single graph showing all 4 averages and the number of stations.

    From such a graph, we could tell how valid the trends of Min and Max temperatures are, given the movement in the Latitude, Altitude and Number of Stations. Assuming that these 3 metrics “settle down” such that there is not much movement up or down on a month-by-month basis, I would argue that the trend (from that point), represented by the sheer weight of raw numbers, would point to a closer “truth” than any amount of cherry-picking, infilling, extrapolating and homogenizing.
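
    A minimal sketch of steps 1–5 above, assuming a hypothetical CSV of raw daily readings with columns station, date, lat, alt, tmin and tmax (the file name and column names are placeholders):

    ```python
    import pandas as pd
    import matplotlib.pyplot as plt

    def daily_sanity_series(raw_csv="us_raw_daily.csv"):
        """Steps 1-3: per-day averages of Min, Max, Latitude, Altitude, plus a station count."""
        df = pd.read_csv(raw_csv, parse_dates=["date"])
        # Step 2: keep a station-day only if both readings exist and Max > Min.
        df = df.dropna(subset=["tmin", "tmax"])
        df = df[df["tmax"] > df["tmin"]]
        # Step 3: average across all stations reporting on each calendar day.
        return df.groupby("date").agg(
            avg_min=("tmin", "mean"),
            avg_max=("tmax", "mean"),
            avg_lat=("lat", "mean"),
            avg_alt=("alt", "mean"),
            n_stations=("station", "nunique"),
        )

    def monthly_rollup(daily):
        """Step 4: collapse the daily rows (about 36,525 for 100 years) to about 1,200 monthly rows."""
        return daily.resample("MS").mean()

    if __name__ == "__main__":
        monthly = monthly_rollup(daily_sanity_series())
        # Step 5: one figure with panels for the four averages and the station count.
        monthly.plot(subplots=True, figsize=(10, 8))
        plt.tight_layout()
        plt.show()
    ```

    If the latitude, altitude and station-count panels hold steady from some point onward, the Min and Max panels from that point are the “sheer weight of raw numbers” trend described above.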

    Am I hallucinating about the worth of such a graph?

    Steven, I hit your Tip Jar on 3rd July. I’d be happy to repeat the dose (times 2) for this graph, if you think it will contribute to the greater understanding.

    • cdquarles says:

      I think that this sort of study would have more value if you’d include all relevant weather data recorded. That is, the barometric pressure, the change in barometric pressure, the sign of the change in barometric pressure, the dew/frost point temperature, the change in the dew/frost point temperature, the sign of the change of the dew/frost point temperature, the wind vector (speed and direction), gusts, sunlight hours, cloudiness, and precipitation. At least you’d have a better idea of co-variables and confounding variables that seem to be ignored/hand-waved away.

      • Crowbar of Daintree Forest says:

        Yes, cdquarles, and please be very clear on why I think this graph may be useful. It is NOT an attempt to be scientifically or statistically 100% accurate (or even 97%). It is, in my mind, the MOST SIMPLE, UNADULTERATED presentation of the massive cache of raw Min and Max temperature readings that we have across a single contiguous land mass. The supporting data (Average Latitude, Altitude and Number of Stations) are there to hopefully indicate that the concept of averaging the Min and Max temps across the whole of the US per day is not being unduly affected by major shifts in these supporting metrics over time.

        If any of the metrics that you mention are part of the daily record going back 100 years, then let’s include them. There may be another metric, derived from the Latitude and Longitude of each station, that points to the station’s distance from the moderating effects of the ocean (a rough sketch follows). Again, we’d hope that as stations are added and others are dropped, this metric doesn’t move too much.
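
        A hypothetical helper for that distance-from-the-ocean idea: the haversine distance from a station to the nearest of a user-supplied list of coastline points (the coordinates below are made-up placeholders, not a real coastline):

        ```python
        # Sketch only: nearest-coastline distance from a station's latitude/longitude.
        from math import radians, sin, cos, asin, sqrt

        EARTH_RADIUS_KM = 6371.0

        def haversine_km(lat1, lon1, lat2, lon2):
            """Great-circle distance in km between two (lat, lon) points in degrees."""
            p1, p2 = radians(lat1), radians(lat2)
            dphi = radians(lat2 - lat1)
            dlam = radians(lon2 - lon1)
            a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlam / 2) ** 2
            return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

        def distance_to_coast_km(station_lat, station_lon, coast_points):
            """Minimum distance from the station to any point in coast_points."""
            return min(haversine_km(station_lat, station_lon, lat, lon)
                       for lat, lon in coast_points)

        if __name__ == "__main__":
            coast = [(36.97, -122.03), (29.95, -90.07)]   # placeholder coastline points
            print(round(distance_to_coast_km(39.74, -104.99, coast), 1), "km")
        ```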

        This is the kind of sanity-checking that leads to either confirmation, or new discovery. Either way, you win.

  16. Mike says:

    I think it’s time for some “Real Science” T-Shirts and other marketing gear.

  17. NotAGolfer says:

    You are completely right! I hope you can be heard. The thin little research papers that support the tiny adjustment for UHI are laughable.
