With Fake Data, NOAA Can Accomplish Anything

NOAA claims that the area of the US with unusually hot summer afternoon temperatures has increased over the past 50 years to record levels.

Natural Disasters – Our World in Data

The National Climate Assessment shows that summers used to be much hotter, and that peak temperatures are way down over most of the country.


Temperature Changes in the United States – Climate Science Special Report

There are no metrics which support the NOAA graph.

So how did NOAA create the fake data? They simply made it up.  Almost half of all data from NOAA now is completely fake, with no underlying thermometer data.


39 Responses to With Fake Data, NOAA Can Accomplish Anything

  1. Robertvd says:

    Just imagine what they could do with Sanders or AOC as president.

  2. Stephen Long says:

    1. If the data is faked, how did you get the real data?
    2. How do we know this data is not fake?
    3. What are your qualifications in this field?

    • paul courtney says:

      Stephen Long: Haven’t read much here? Keep reading, all 3 questions are answered. Then you’ll have 4. How are so many enviros taken in by such a blazingly obvious scam as AGW? Enjoy.

  3. D Boss says:

    These folks seem to be doing what Tony does with regard to unadjusted temperature data, except they are doing it for the globe:


    Earth’s surface temp is declining over the last 5 years…

  4. Martin says:

    There was this smart dude Mallen Baker on YouTube debunking Tony’s claim of fraud at NOAA and NASA. He said that the stations changed positions to higher elevations which had to be homogenized. What is Tony’s response to this?


    • David A says:

      Martin, AFAIK Tony uses USHCN stations.

      NOAA has made the entire record FUBAR.

    • Aussie says:

      Yes, I saw Mallen’s weak response in another of his posts when questioned about this. He said that sometimes data had to be adjusted as methods became more accurate. BUT DATA IS DATA….

      Tony – worth getting in touch with Jennifer Marohasy, if you have not already done so, regarding the BOM’s adjustments.


      The BOM here in Australia are adjusting past data all the time, and their secret homogenisation program, well, who knows what happens, except that all that comes out is higher temperatures each year. I am certain a new Ice Age could come and the BOM would still be telling us it’s a balmy 25 deg C as the thermometers register below freezing…

    • xenomoly says:

      He already talked about the Time of Observation Bias claim in another blog post. Even if you only include the observation stations that never changed location, you see the same pattern for the various “percent of stations above 100F” or other thresholds. In fact, the rural stations that have not been exposed to urban heat islands (a major source of uncorrected warming) show a very clear cyclical warming, cooling, and warming again in the 20th century that goes along with the phase of the AMO.

    • arn says:

      As I remember, the huge decline in weather stations since the ’90s
      resulted in the shutdown of stations that were mostly located at higher altitudes.

      I wonder how the “homogenisation” went there (besides the obvious rhetoric that those stations showed an even higher temperature increase).

      Read the comment section of your own link;
      in terms of “debunking” it may help you a bit.

    • Logic n reason says:

      Mallen is a failed politician who has no science qualifications, as far as can be ascertained. He was a member of the Green Party in the UK but became a Liberal Democrat. Apparently his discipline is in corporate social responsibility.

    • Caleb Shaw says:


      Higher elevations? Hmmm. All the observing stations were hoisted? Why? Were they afraid the seas would rise?

      In actual fact some stations are where they always were, while others should be seen as separate histories because the stations moved. For example, Concord New Hampshire has been keeping records since 1869, but the station has moved three or four times.

      Furthermore, some stations that were originally rural have become urban, while some originally located at grassy airports with biplanes are now tar-covered with roaring jets. Tony, Anthony Watts, and others have spent a lot of time locating the stations which have always been in the same place, in rural locales, and it has been seen that such areas show very little warming, whereas areas of explosive urban growth show far more warming.

      This should suggest that, if “adjustments” are made, then urban areas should be adjusted down, to subtract the heat coming from buildings, pavement and engines. Is this the case? No. The adjustments are upwards, always upwards.

      The propensity of certain “climate scientists” to always adjust upwards has been noted for far longer than a decade. I first saw it in August 2007, when Steve McIntyre caught James Hansen tweaking the numbers, and Hansen needed to backtrack.


      Those of us who have been on this case for thirteen years and longer have heard all sorts of excuses given for “adjusting upwards”. Last year it was that the old thermometer-readers read at the wrong time of day, and past temperatures needed to be adjusted-down as recent ones were adjusted-up.

      I have no time for your Youtube post at this time, but fully expect it to be more of the same.

    • Archie says:

      Are you being facetious when you say “smart dude?” I can’t tell. I watched part of his presentation, skipping around, and what I noticed is that he didn’t present any actual evidence to support his assertion that stations were being redistributed in such a way that it would support making adjustments. And, as for outright fraud, I really like this article: https://www.dailymail.co.uk/sciencetech/article-4192182/World-leaders-duped-manipulated-global-warming-data.html

      Mallen seems to be just another “arm waver.” At least Tony is showing us actual historic articles and data. If you want a “real” scientist’s view, look at what Dr Spencer has to say as the U.S. Science Team leader for the Advanced Microwave Scanning Radiometer flying on NASA’s Aqua satellite. http://www.drroyspencer.com/

      Oh, and if you were being facetious about Mallen, I apologize for this reply.

    • Eric Hatfield says:

      I saw that the video is out, but I didn’t view it. (I viewed a different video of his in which he tried to debunk the idea that the effect of CO2 is largely saturated, but that’s a different issue.) Tony has dealt with this issue in a different video. Tony graphed the average elevation and latitude of US stations over time. In reality, the average elevation of the stations has dropped by about 40 or 50 feet since the 1930s. They have also shifted south by about 4 miles.

      The point is that while individual stations might have shifted to higher elevations, the average elevation is lower now than it was then. Thus there is no basis for an in-bulk adjustment of the average temperature based on elevation.

      The reality is 40-50 feet lower and a few miles further south aren’t going to impact the average enough to warrant an adjustment. If anything you might be able to argue past temperatures should be raised by a tenth of degree F.
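
For what it’s worth, the “tenth of a degree” figure can be sanity-checked against a standard environmental lapse rate (about 6.5 °C per km, roughly 3.6 °F per 1000 ft). This is a rough sketch of my own arithmetic, not anything from the thread, and the exact threshold depends on which lapse rate you assume:

```python
# Rough lapse-rate arithmetic (an illustrative sanity check, not from the thread).
# Assumes the standard environmental lapse rate of ~6.5 C per km.

FT_PER_M = 3.2808
LAPSE_F_PER_1000FT = 6.5 * 9 / 5 / FT_PER_M  # ~3.57 F per 1000 ft

def elevation_effect_f(delta_feet):
    """Approximate temperature change (F) implied by an elevation change in feet."""
    return delta_feet * LAPSE_F_PER_1000FT / 1000.0

# A 40-50 ft shift in average station elevation is worth only ~0.14-0.18 F,
# while a 1.5 F adjustment would need an elevation change of roughly 400+ ft
# at this lapse rate:
feet_for_1p5F = 1.5 * 1000.0 / LAPSE_F_PER_1000FT
```

On these assumptions, a 40-foot swing is worth about a seventh of a degree F, consistent with the point that it cannot justify a 1.5 °F adjustment.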

      There is one real effect no one (not even Tony) has dealt with, and that is the increasing urban heat island effect as populations and energy usage grow over time. This is particularly important in developing and third-world countries, as well as in the West and South here in the US. There has been no attempt to define and extract that from the temperature record. By cooling the past and warming the recent record, NOAA etc. are not only failing to extract the effect, but have actually made it worse. It’s safe to claim we have no real sound idea what the climate has actually been doing over the last 100 years or so.

    • Eric Hatfield says:

      Let me correct a post that hasn’t shown up yet as I’m writing this.

      I just saw the chart I was referring to about the average elevation over time. The overall swings were about 40 feet. The average elevation of the stations in the 1930s was about the same as over the last couple of decades.

      In any event there isn’t any basis for reducing the average temps of the 1930s by something like 1.5 degrees F based on station elevation (or by time of day bias as addressed by Tony in various videos.)

    • Martin says:

      David A
      Logic n reason
      Caleb Shaw

      Everybody thank you for attempting to answer my question and your time. I do remember Tony talking about stations being moved to more northern latitudes.

      f/e https://realclimatescience.com/corruption-of-the-us-temperature-record/

      Stations being hoisted to higher elevations is new to me. Mallen might have made it up, or it is about minor adjustments in the elevation of the stations used, of some decimeters (0.1 m), which cannot result in huge temperature deviations of whole degrees.

      In my country, the Netherlands, all temperature data before 1951 has been homogenized as well, because they used different thermometers, which were indeed hoisted a bit.


    • JCalvertN(UK) says:

      At RAAF Amberley near Brisbane Australia, BOM adjusted away a full 2 degrees Celsius. What kind of “higher elevation” makes for a change of 2 degrees? Airforce bases tend to be rather flat . . .

    • Gary Pearse says:

      Martin, here is a raw temp record for South Africa. It has exactly the same pattern as the US; moreover, the same pattern occurs in Paraguay, Bolivia, Canada, Greenland, and Russia. If this isn’t corroboration, then what is? You can check this out at a UK site called “Not a lot of people know that”, a blog by Paul Homewood. Here is the South African one to twig your interest. Note the 1930s were the highs there too. It was posted on WUWT by a South African.


      Naturally this has been “homogenized” by the fiddlers and completely changed. They use the “station move” stuff to great effect every time you point these things out. I’ve come to believe these guys will even move a station just to justify changing the numbers!

  5. rah says:

    “With Fake Data, You Can Accomplish Anything”
    Except change the actual temperatures, weather, and climate that is.

    • Gator says:

      Your perception is your reality. This is the main reason our federal government hijacked our public schools.

      • rah says:

        Facts are facts, and perception cannot change them. Saw another trucker wearing shorts and sandals in single digit temperatures the other morning. He can have his “reality”.

  6. David Appell says:

    Not one link to a NOAA site, despite writing “NOAA claims….”

    • spike55 says:

      What does the little script at the bottom of the chart say, you poor illiterate clown !!

    • paul courtney says:

      Mr. Appell: Did you try googling “NOAA” to find the site? If you linked us to the NOAA site that debunks Mr. Heller’s allegations, boy that’d show us! Why don’t you?

  7. David A says:

    Martin, AFAIK Tony uses USHCN stations.

    NOAA has made the entire record FUBAR.
    USHCN is a designated subset of the NOAA Cooperative Observer Program (COOP) Network with sites selected according to their spatial coverage, record length, data completeness,

  8. David Appell says:

    Heller censored my comment.

    What else do you need to know?

    • spike55 says:

      We know your name, and that you are a rancid fantasy-based troll.

      A putrid rotten appell.

      A greenie-brown slime that just keeps oozing.

      You should be censored, or more appropriately, locked in a mental asylum for degenerate non-humans..

    • Archie says:

      Which comment? It can take a long time for comments to post. Patience!

    • Kneel says:

      David, if the adjustments are justified, then they should all be individually justified – they are all different, and that adjustments are needed in Anchorage doesn’t mean the same adjustment(s) is valid for Dallas, for example.
      Under NO circumstances does the data itself provide a reason for adjustment, regardless of its statistics – that’s just post-hoc rubbish to turn what you actually got into what you expected.
      Even should you have valid individual reason(s) for adjustment(s), those adjustments should be added to the error margin.
      When “bias adjustment” and “homogenization” turn a negative trend into a positive one (a la Aus BoM), and you cannot immediately provide solid documentation for it (a la Aus BoM), you are certainly fooling yourself, even if not everyone else.
      Remember David that questioning science – the data, the analysis and the conclusions, all three – is not only expected but encouraged in science. Trust but verify – the more expensive the “solution”, the more closely you need to look. Even the most evidence based science that approaches the complexity of climate (medicine) has made some pretty spectacular “fails” by following consensus instead of evidence.
      I think there are too many “spherical cows” in climate science…

  9. Logic n Reason says:

    Mallen Baker is a failed politician with no science background as far as can be ascertained. He is an ‘expert’ on corporate social responsibility. A point in his favour is he appears not to be a fan of the cult of extinction rebellion. He seems to have a problem with Tony especially and tries to find ways to debunk him at every opportunity. Would love to see these two slug it out in a televised or streamed online debate but no one seems to dare to take Tony on.

    • Steve Prewitt says:

      Scientific debate appears to be dead in the politics of globalism. Name any controversial topic, and the establishment is either afraid to debate the facts or treats facts as malleable things…

  10. DM says:

    Can Joe 6pack access the fabricated data? If so, how?

    I ask in order to test the following hypothesis: The fabricated data’s long term trend differs profoundly from the actual weather stations’ long term trend. Based on Tony’s work, at a minimum, the fab data’s temp change per unit of time is a multiple of the actual data’s change. One should not be surprised if the former is rising (rapidly) while the latter is falling.

    If data analysis confirms the hypothesis, climate realists and alarmists will be able to agree that Mann (and fellow travelers) cause global warming. Consensus will finally be reached;-} Comity will replace animosity between the 2 groups.

  11. Gary Puckering says:

    User Martin asks about Mallen Baker’s critique of Tony’s claims. Rather than post this as a reply, I’m posting it as a general comment, so that readers can ask me questions about it and because the answer may be of general use to others.

    Baker makes many valid points in his critique. Tony does indeed claim fraudulent motivation, perhaps without all the evidence one would need to make that claim. But between the “climategate” emails and some very dodgy behavior from key climate authorities such as Michael Mann, Katharine Hayhoe and others, he’s got some good reasons to suspect that there is outright fraud at play here.

    In any event, it’s the methodology arguments I’d like to address here. Baker begins by using the NOAA website to generate temperature graphs and showing that they don’t look like Tony’s. That’s right – they don’t. Because NOAA’s graph generator uses adjusted temperatures, not raw temperatures. If you look closely at the red line on Tony’s graph and compare it to the orange line on Baker’s graph, and adjust for the fact that Tony covers 1920 to 2020 whereas Baker covers 1895 to 2020, then you can see that the two are a match.

    All Baker is doing is illustrating that NOAA’s temperature tool uses adjusted temperatures, not raw temperatures. Not that the page on which you generate the graphs mentions this! If NOAA wanted to be a little more transparent, the parameter name in the dropdown would be “Average Temperature (adjusted)” instead of just “Average Temperature”. Or they would have included a parameter called “Average Temperature (unadjusted)”.

    The web page Baker is using can be found here: https://www.ncdc.noaa.gov/cag/national/time-series

    If you go to this page and click on the Data Information tab, you can find out some details about the underlying dataset. It says the data is from the U.S. Climate Divisional Database, which has a link to a description of its history, where it says:

    There are 344 climate divisions in the CONUS. For each climate division, monthly station temperature and precipitation values are computed from the daily observations. The divisional values are weighted by area to compute statewide values and the statewide values are weighted by area to compute regional values. (Karl and Koss, 1984).

    And this is where things begin to go wrong.

    The temperatures displayed by the NOAA graph are adjusted using various statistical methods in order to create a more uniform distribution of coverage across the land. Put another way, gaps are filled in by various interpolation methods. This is intended to compensate for areas with no station coverage, and for places where stations have come and gone, leaving an incomplete record.

    It also includes adjustments for Time of Observation bias – which is the idea that many station operators in the past took temperature readings at various times of the day, rather than at the true maximum or minimum points of the day, or at consistent times like 4 am and 4 pm. The theory is that modern digital stations which do continuous record-keeping are more accurate. Remarkably though, their corrections negatively correlate extremely well with atmospheric CO2 trends. Too well. By this I mean that there are large corrections for when CO2 was low, and increasingly smaller corrections as it has increased. The result is a temperature graph that correlates with the CO2 trend. Coincidence? Or manipulation?

    There are ways to find out.

    One way is to look at individual stations rather than the aggregate of many stations. If we spot check individual stations around the country, and select stations that have kept records for a very long time, and which are well isolated from urban heat effects, ocean circulation patterns, and large bodies of water, then we can see whether the statistically homogenized trend differs from reliable individual trends. That may give us a clue whether the statistical manipulation is biasing the result.

    I wrote my own software to download NOAA temperature data from the GHCN daily temperature dataset. It parses the daily station files and outputs the data in tab-separated format so I can import it into Excel for analysis. It can also aggregate the data by day, month or year.
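
Gary doesn’t share his code, but a minimal version of the kind of parser he describes might look like the sketch below. This is my illustration, not his program: field positions follow NOAA’s published GHCN-Daily readme (fixed-width .dly records: 11-char station ID, year, month, element code, then 31 eight-character day slots whose first 5 characters hold the value in tenths of °C, with -9999 as the missing sentinel), and only TMAX/TMIN are kept.

```python
# Hypothetical sketch of a GHCN-Daily (.dly) to tab-separated converter,
# along the lines described in the comment. Layout per NOAA's GHCN-Daily
# readme; only the temperature elements TMAX and TMIN are extracted.

def parse_dly_line(line):
    """Yield (station_id, year, month, day, element, value_c) from one record."""
    sid = line[0:11]
    year = int(line[11:15])
    month = int(line[15:17])
    element = line[17:21]
    if element not in ("TMAX", "TMIN"):
        return
    for day in range(31):
        raw = line[21 + day * 8 : 26 + day * 8]  # 5-char value field
        value = int(raw)
        if value == -9999:                       # missing-value sentinel
            continue
        yield sid, year, month, day + 1, element, value / 10.0  # tenths of C

def to_tsv(lines):
    """Flatten parsed records into tab-separated rows for spreadsheet import."""
    rows = []
    for line in lines:
        for rec in parse_dly_line(line):
            rows.append("\t".join(str(x) for x in rec))
    return "\n".join(rows)
```

A real pipeline would also honor the quality-control flags in each day slot; this sketch ignores them for brevity.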

    I’ve used this software to spot check stations at various locations around the world, including many in the US. One spot I examined was the station at McPherson Kansas, smack dab in the geographic center of the continental U.S. I picked this station because it has continuous records back to 1900, it sits in a farm field far from urban heat island effects, and it’s in the center of the U.S. far from the Great Lakes and from the effects of ocean circulation patterns.

    My full analysis can be found here: https://cogitoergosum921815780.wordpress.com/2019/10/21/no-temperature-change-in-mcpherson-ks-since-1893/

    What the temperature records show is that in 125 years there’s been no increase or decrease in average, minimum or maximum daily temperature; no change in the number of very hot days or very cold days; no change in temperature range; no significant change in precipitation; and no correlation to rising atmospheric CO2 levels.

    Of course, one station doesn’t tell the whole story. So, I used my software to compute the trend for a large selection of stations across the U.S.

    Baker’s other big criticism of Heller’s charts is that they use actual temperatures rather than temperature anomalies. His explanation of why anomalies are preferred is a bit off, but close enough. Anomalies are calculated at the station level, giving you a set of relative temperatures that can be aggregated with other stations. It’s, I hope, intuitively obvious that if you were to average the temperatures from Nome Alaska with those from Orlando Florida that the result would be useless. But, you can average the temperature anomalies and examine their trends to see if both stations are experiencing an upward or downward trend. Anomalies allow stations that are in different locations and elevations to be examined in aggregate.

    Using temperature anomalies, with 1971-2000 as the baseline, I generated a chart for all US stations that had data for 1920 to 2019 inclusive. There are 115,081 station entries in the NOAA GHCN dataset. Of those, 61,867 are US stations (54%). Of those, only 1,597 stations had data for 1920 to 2019. Of those, 396 were missing more than 10% of their data in at least one year and were discarded from my review. That gave me 1,201 stations to work with.
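
The anomaly calculation described here can be sketched as follows. This is an illustrative reconstruction (the details of the actual pipeline are not given in the comment): compute each station’s mean over the 1971-2000 baseline, subtract it from that station’s yearly means, then average the anomalies across stations per year.

```python
# Sketch of a per-station-baseline anomaly average (illustrative, not the
# commenter's actual code). Input: {station_id: {year: mean_temp_c}}.

def yearly_anomalies(station_temps, base_start=1971, base_end=2000):
    """Return {year: mean anomaly across stations}, baseline per station."""
    anomalies = {}  # year -> list of per-station anomalies
    for sid, by_year in station_temps.items():
        base = [t for y, t in by_year.items() if base_start <= y <= base_end]
        if not base:
            continue  # station has no baseline coverage; skip it
        baseline = sum(base) / len(base)
        for year, temp in by_year.items():
            anomalies.setdefault(year, []).append(temp - baseline)
    return {y: sum(v) / len(v) for y, v in sorted(anomalies.items())}
```

Because each station is differenced against its own baseline, a station in Nome and one in Orlando can be averaged together, which is exactly the point made above about anomalies.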

    I did a geomapping of the remaining stations and they provide very good coverage of the United States.

    When I chart the temperature anomalies from these stations, I see a lot of variability – up and down years – in the range from about -0.5°C to +1°C. Between 1931 and 1954 there were 9 years where the average anomaly was above 0.5°C – very hot, as evidenced in the historical news accounts and stories of the period. The Great Dust Bowl. The Dirty Thirties. The Grapes of Wrath.

    Temperatures got cooler between 1955 and 1997. There are more below-zero temperatures during that period than any other. Then things warm up again. Between 1998 and 2017 there are 10 years where the temperature is above 0.5°C, although two of those are barely above it.

    Since 2016, the trend is rapidly downward.

    Fitting a linear regression to this data gives you an upward temperature trend of 0.06°C per century, with a correlation coefficient of 0.0013 (very weak).

    Fitting a 4th order polynomial curve to the data gives you a clear up and down wave. It’s up during the first half of the 20th century, down in the last half, up again this century, and now trending back down. Even so, the correlation coefficient is only 0.1463, which is still weak. A moderately reliable trend would have a coefficient between 0.20 and 0.40.

    Baker’s chart from NOAA shows a trend of 0.15°F per decade, which is 0.83°C per century. But it only shows a linear trend, and it doesn’t show the correlation coefficient. If it did, and if it also showed a polynomial trend, it would be much clearer that the trend may be cyclical and that it’s not a reliable predictor, because the data is so variable.
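
The linear-versus-polynomial comparison and the °F/decade to °C/century conversion are easy to reproduce. A minimal sketch (using numpy in place of whatever tooling was actually used; the fit function and its name are my own illustration):

```python
# Illustrative trend fitting and unit conversion (not the commenter's code).
import numpy as np

def trend_and_r2(years, anomalies, degree=1):
    """Fit a polynomial trend and return (coefficients, R^2)."""
    coeffs = np.polyfit(years, anomalies, degree)
    fitted = np.polyval(coeffs, years)
    ss_res = np.sum((anomalies - fitted) ** 2)
    ss_tot = np.sum((anomalies - np.mean(anomalies)) ** 2)
    return coeffs, 1 - ss_res / ss_tot

def f_per_decade_to_c_per_century(f_per_decade):
    """0.15 F/decade -> ~0.83 C/century: 10 decades per century, F to C scale."""
    return f_per_decade * 10 * 5 / 9
```

With `degree=1` this gives the linear trend and its R²; with `degree=4` it gives the 4th-order wave described above. The conversion confirms the figure quoted from Baker’s chart: 0.15 × 10 × 5/9 ≈ 0.83.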

    So whether you use actual temperatures taken from reliable individual stations, or use anomalies to compute trends for large numbers of stations, there are many ways to show that the NOAA surface temperature graphs should come with a disclaimer, such as the one found at the bottom of every stock portfolio report I’ve ever gotten: “Past performance is not a reliable indicator of future performance”.

    A key issue here is that some people and agencies cherry-pick the start and end points of these charts to generate a linear trend that supports their narrative. That is manipulative. And climate scientists routinely omit any information about the weaknesses of their projections. Instead, they offer up 80-year projections, couched in language like “if current trends continue, then by 2100 …”.

    Heller has done a good job of exposing the data manipulation that has been done, but he’s also done a good job digging out hundreds of archived newspaper accounts which testify to temperature readings during past warm periods – such as the 1930’s. These accounts undermine the story told by the adjusted temperatures, and therefore undermine the validity of the adjustment methodologies.

    Baker is likely right that there is no “grand conspiracy” amongst scientists. But there is a whole lot of selection and confirmation bias going on, a whole lot of money being thrown at researchers whose research starts with the premise that temperatures are rising because of CO2, a whole lot of third parties (including scientists) who’ve never looked at the data itself or who don’t understand the statistical methods being used to massage it for public consumption, and a whole lot of silencing of the few voices brave enough to offer a contrary assessment of the facts.

    Hoping this is a useful reply for user Martin.

  12. Gary Puckering says:

    I forgot to attach an image to my last post.

  13. Peter Carroll says:

    The elephant is in the room, and it’s a biggy!
    All the data homogenization, parameterization, adjusting, and general buggering about with the data, without fail, result in higher temperatures.
    None ever result in no change and none, God forbid, result in a lower temperature.
    There is only ever to be one result, and that is UP!

  14. Eric Hatfield says:

    Not sure what happened to the 2 posts last night, but I don’t see either.

    Anyway, Tony in a very recent video addressed the elevation-change reason for cooling the past. He graphed the average elevations of the sites existing then and now and found a total swing of about 40 feet. Average elevations of the recent period and of the 1930s were about the same. You would need at least 200 feet of additional elevation today to justify cooling the past by the 1.5 degrees F that was done.

    Tony’s point stands. There’s no justification for such a cooling of the past (even with the time of day bias question).

  15. Spurwing Plover says:

    The 1990s Save the Rain Forests campaign was one of the biggest scams ever, just like the anti-pesticide campaigns in the 1970s and the CFC ban from the 1980s. Note that back in the 1970s it was Global Cooling and a New Ice Age, and that liberal rag Time gave it front-page coverage; it was on the cover of their infernal rag.
