Extreme Fraud At NOAA And NASA


8 Responses to Extreme Fraud At NOAA And NASA

  1. john says:

    42% of the data for 2020 was generated by a model!
    Wow, Tony, thank you for exposing this.
    That is criminal.

    • Conrad Ziefle says:

      Yeah, that is phenomenal. 42% of current temperature station data is faked. Now I’m wondering why they removed those stations to begin with. Was it to give them a path to fake data? They speak with forked tongue. They say the old timers didn’t know what they were doing and screwed up the data by resetting the thermometer at the high temperature point in the day. Looks like they are digging really hard to find an excuse to modify the data. Then they say that the modern stations were also run by nincompoops and also have to be adjusted, but in the opposite direction. That’s really odd, because one would expect nincompoopism to be a random walk, not always in the same direction at the same time. That could only be planned misdirection, which is more like what their adjustments are.

  2. john says:

    Also: around the 3:20 mark, you show the daily temperature graph and a monthly temperature graph. I’m guessing that the monthly chart is a rolling 30-day average.
    What surprised me is that the monthly chart isn’t smoothed a lot more vis-à-vis the daily chart.

  3. Barry Sheridan says:

    Good reporting, Anthony. Thanks.

  4. D. Boss says:

    Yes, criminal indeed! But sadly I fear they are going to get away with it. If the general public is too stupid to grasp simple math, physics, and logic, then the fraud may succeed.

    If you want an indication of how stupid the troupes of talking monkeys are, look at how many are dutifully putting masks back on for that other incredible fraud being peddled, the fake pandemic. An even larger fraud than the climate hype is being perpetrated with the plandemic bull schist.

    As one example, a month or two ago they stopped recording how many fully “vaccinated” (it’s not a vaccine) persons are getting the disease, so now they can say “only those who are unvaccinated are getting covid,” when in fact places that continued to record vax status show that the majority now getting the virus have been fully vaccinated. (Ergo either the vaccine doesn’t work, or it makes the wild virus worse, as was predicted by many experts who’ve been silenced by the media mob.)

    We’re in for some seriously bad times with the Alice in Wonderland mentality of these massive frauds and literal crimes against humanity.

  5. G W Smith says:

    Saul Alinsky tactics (the Obama and Hillary playbook) all the way: demonize the messenger.

  6. G W Smith says:

    A third of the population wants to be fooled, another third doesn’t know or care if they are being fooled, and the last third is demonized as deplorable conspiracy theorists.

  7. Solar Mutant Ninjaneer says:

    Back in the 1970s we used to say, “it’s not paranoia if they really are out to get you.” Similarly, it is not a conspiracy theory if they really are conspiring to falsify data.

    There is absolutely no excuse to “adjust” data. It is criminal in the extreme! In engineering it might be acceptable, or even advisable, to “normalize” data. For example, in characterizing the performance of engines like Otto-cycle engines or gas turbines, power and efficiency vary with inlet temperature and pressure, so it is common to normalize performance to account for different operating conditions. An engine on a hot day in Denver will produce less power than the exact same engine on a cold day at sea level. Based on theory, corroborated empirically, this kind of “normalization” is done routinely. It is important that the algorithms used for normalization and their basis be well understood by everyone in the community. Anyone analyzing the raw data should be able to normalize performance and get the same results.
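    The engine-normalization idea described above can be sketched with the conventional gas-turbine correction factors (δ = p/p_ref, θ = T/T_ref). The reference conditions, function name, and sample measurement below are illustrative, not taken from the comment:

```python
# Illustrative sketch of gas-turbine performance "normalization":
# correcting measured power to standard (ISA sea-level) inlet conditions.
# Reference values and the sample reading are invented for illustration.

P_REF_KPA = 101.325  # reference inlet pressure (ISA sea level)
T_REF_K = 288.15     # reference inlet temperature (15 C)

def corrected_power(power_kw, inlet_t_k, inlet_p_kpa):
    """Correct measured shaft power to reference inlet conditions.

    Uses the conventional correction P_corr = P / (delta * sqrt(theta)),
    with delta = p / p_ref and theta = T / T_ref. Because the rule is
    published and deterministic, anyone with the raw reading recovers
    the same corrected number.
    """
    delta = inlet_p_kpa / P_REF_KPA
    theta = inlet_t_k / T_REF_K
    return power_kw / (delta * theta ** 0.5)

# A hot day in Denver (low pressure, high temperature): the corrected
# figure comes out higher than the raw reading, as expected.
denver_corrected = corrected_power(900.0, inlet_t_k=308.15, inlet_p_kpa=83.0)
```

    The point of the sketch is the one the commenter makes: the correction is a fixed, transparent function of measured conditions, not a discretionary adjustment.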

    For long-term temperature data, there are only three things that I can think of that should legitimately be “normalized”: (1) latitude, (2) elevation, and (3) urban heat island. The first two should be relatively straightforward. One could empirically determine how temperatures vary by latitude or elevation, and then adjust the temperature up or down based on how the average latitude or elevation of the stations changed over time. The data would be normalized to an “average latitude,” for example, for the data set over the historical period. Normalization data fits could be done on a monthly or even daily basis. Whatever approach is used, anyone anywhere should be able to get the same normalized result.
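    The elevation piece of this can be sketched in a few lines, assuming the standard mean tropospheric lapse rate of about 6.5 °C per kilometer. The station names, elevations, readings, and reference elevation below are invented for illustration:

```python
# Minimal sketch: normalize station temperatures to a common reference
# elevation using the standard atmospheric lapse rate (~6.5 C per km).
# All station data here is made up for illustration only.

LAPSE_RATE_C_PER_M = 0.0065  # mean tropospheric lapse rate
REF_ELEVATION_M = 500.0      # arbitrary common reference elevation

def normalize_to_ref_elevation(temp_c, station_elev_m):
    """Shift a raw reading to the reference elevation.

    A station above the reference is expected to read colder, so we add
    back lapse_rate * (elevation - reference). Since the rule is fixed
    and public, anyone applying it to the raw data gets the same answer.
    """
    return temp_c + LAPSE_RATE_C_PER_M * (station_elev_m - REF_ELEVATION_M)

stations = [
    # (name, elevation in m, raw reading in C) -- invented values
    ("valley", 200.0, 22.0),
    ("foothill", 900.0, 17.5),
    ("mountain", 2100.0, 9.8),
]

normalized = [normalize_to_ref_elevation(t, z) for _, z, t in stations]
```

    With these made-up numbers the three stations land close together once normalized, which is the behavior one would want: the adjustment removes a known physical effect rather than steering the trend.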

    The urban heat island normalization is conceptually more difficult, but might account for population density variations over time. Whatever approach is used, it needs to be such that anyone can get the same result. I doubt latitude, or especially elevation normalization, would have much effect. The urban heat island normalization might be significant, and no doubt in the wrong direction for the criminals at NASA and NOAA.
