Nick Stokes Busted : Part 4

In part 2 of this series, I took down Nick’s claim :

Steven Goddard produces these plots, and they seem to circulate endlessly, with no attempt at fact-checking, or even sourcing. I try, but it’s wearing.

He said that there was no attempt at fact checking or sourcing. I pointed out that all of the source code and executables have been available online for years, and that he or anybody else could have checked them at any time. Perhaps Nick lacks basic computer skills, or can't follow simple instructions? Lots of other people have run the code.

So after being shown wrong, he has changed his story and made an idiotic statement that he bets no one circulating the graphs has done any fact checking.

There are so many things wrong with this statement that it is difficult to know where to begin. Lots of people have run my easy-to-use software – but I can guarantee that close to 0% of the people circulating NOAA and NASA graphs have done any fact checking. If they had, they would know that NASA and NOAA graphs are complete garbage. People who read my blog are very well informed about this topic.

Then Nick accidentally stumbles onto the heart of what is wrong with NOAA data tampering.

It’s simple, and just wrong. There were (USHCN has been obsolete for years) 1218 stations in the final set. There were a varying number, usually somewhere around 900, in the raw set.

What Nick is saying is that, on average, the final data set contains 1218 – 900 = 318 fake monthly temperatures, meaning that more than 25% of the data is fake each month. The raw data set is the measured temperatures. The extra 318 temperatures in the final data set are simply made up. There is no underlying data. Is Nick too dense to understand his own words?

But it is worse than that. In recent years, the percent of fake data has skyrocketed to close to 50%.
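
To check the infilling numbers directly, here is a minimal sketch (Python, not this blog's own software) that tallies what fraction of the adjusted USHCN monthly values carry the estimated-data flag. It assumes the USHCN v2.5 fixed-width station-file layout and that an "E" in the first flag column marks an infilled value; both assumptions should be verified against NOAA's USHCN readme, and the file glob is a placeholder for wherever the FLs.52j files are unpacked.

    # Minimal sketch: tally what fraction of adjusted (final) USHCN monthly
    # values are flagged as estimated rather than measured. Assumed v2.5
    # layout: 11-char station ID, year in columns 13-16, then twelve
    # 9-character groups of (6-char value, 3 flags); missing value = -9999.
    from collections import defaultdict
    import glob

    def count_estimated(pattern="ushcn_data/*.FLs.52j.tavg"):
        totals = defaultdict(lambda: [0, 0])        # year -> [values, estimated]
        for path in glob.glob(pattern):             # pattern is a placeholder
            with open(path) as f:
                for line in f:
                    year = int(line[12:16])
                    for m in range(12):
                        start = 16 + 9 * m
                        value = line[start:start + 6].strip()
                        flag = line[start + 6:start + 7]
                        if not value or value == "-9999":
                            continue                # nothing was reported at all
                        totals[year][0] += 1
                        if flag == "E":             # infilled; no measurement behind it
                            totals[year][1] += 1
        return totals

    for year, (n, est) in sorted(count_estimated().items()):
        if n:
            print(f"{year}: {est}/{n} monthly values estimated ({100.0 * est / n:.1f}%)")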

Nick claims that USHCN is obsolete, which is also complete BS. USHCN makes up almost the entire NOAA/NCDC US temperature record. Here is the current NCDC graph :

Here is the current USHCN Final graph :

The two graphs are almost identical. The graph below plots both USHCN final and NCDC temperatures.

There is no meaningful difference between the two data sets.
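
That claim is easy to check. The sketch below assumes each data set has already been reduced to a plain two-column CSV of year and annual average with no header row (the file names are placeholders, not official NOAA products), and prints how far apart the two series are.

    # Minimal sketch: compare two annual US temperature series.
    # The CSV file names and the two-column (year, value) layout are
    # assumed placeholders -- substitute whatever your own processing produces.
    import csv

    def load_series(path):
        series = {}
        with open(path) as f:
            for year, temp in csv.reader(f):    # no header row assumed
                series[int(year)] = float(temp)
        return series

    ushcn = load_series("ushcn_final_annual.csv")   # hypothetical file name
    ncdc = load_series("nclimdiv_annual.csv")       # hypothetical file name

    years = sorted(set(ushcn) & set(ncdc))
    diffs = [ushcn[y] - ncdc[y] for y in years]
    print(f"{len(years)} overlapping years")
    print(f"mean difference: {sum(diffs) / len(diffs):+.3f}")
    print(f"largest absolute difference: {max(abs(d) for d in diffs):.3f}")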

National Temperature Index | Temperature, Precipitation, and Drought | National Centers for Environmental Information (NCEI)

NCDC data is a minor variant of USHCN. Nick is trying to make it sound like an important point, but it is close to meaningless. He is just kicking up dust.

Bottom line is that NOAA and all of their apologists are committing fraud. The data is being adjusted to match CO2 theory.

People like Nick will kick up as much dust as they can to obscure this fact, but Nick has nothing intelligent to say on this topic. It is wearing listening to his endless BS. Nick and his scam are obsolete. He calls the NOAA fraud a “Goddard Spike.” All I am doing is reporting the data and making it accessible to millions of people more intelligent than Nick Stokes.

Note that Nick is now changing his idiotic story from “Goddard is wrong” to “USHCN is no longer used.” Looks like a tacit admission that my graphs are correct.


47 Responses to Nick Stokes Busted : Part 4

  1. angech says:

    Frustrating is the word. He does know his stuff; it's just that he chooses to deliberately distort facts to suit his own personal agenda and beliefs. Perhaps he has connections working at BOM and feels conflicted?
    Who knows.
    More likely he is just blindly following the cause.

  2. gator69 says:

    Maybe I am in the minority, but I was never impressed with Nick; he just doesn't get, or admit to, obvious facts. And frankly, I really do wonder if he has ever enjoyed anything remotely approaching sanity.

  3. sunsettommy says:

    Posted this at WUWT, to see if Nick will continue his stupid B.S.

  4. John of Cloverdale, WA, Australia says:

    Off the subject of US temperatures, but interesting article by Jo Nova which includes a guest post by Dr Bill Johnston, titled:
    Canberra’s “hottest ever” September record due to thermometer changes and a wind profiler/Welcome to Canberra airport where it’s always sometimes hotter.
    http://joannenova.com.au/2017/10/canberras-hottest-ever-september-record-due-to-thermometer-changes-and-a-wind-profiler/

  5. kyle_fouro says:

    Tony,

    I know you’re busy with this blog and contract work, on top of prioritizing your health at the moment, but in my opinion you need to have an increased presence in the comments section on sites like WUWT and Climate Etc.

    Will Watts not let you post anything?

    • RW says:

      I back this.

      I recall years ago a bit of shade had been thrown on Tony there. I get the impression though that things are turning or have turned around on that front.

      Part of it is that the notion that whole ‘scientific’ government institutes are involved in peddling fake data is so far outside the possibility space for a lot of people that their response is to tune out the information, or to indulge in a level of counter-scepticism that they don't come close to applying to their own default global warming belief.

  6. sunsettommy says:

    Nick shoots himself in the foot with his first reply to your new post:

    “Sunset,
    ‘Never could find where Nick claimed that USHCN is obsolete, even with the link he supplied.’

    I gave the link. Here it is again, describing developments in 2014:
    ‘As a result, NCEI began using nClimDiv data to compute contiguous (CONUS) temperatures, replacing USHCN as the official CONUS temperature data set.’

    So TH points to the ‘current NCDC graph’. But where does it say USHCN?”

    https://wattsupwiththat.com/2017/10/04/quote-of-the-week-anonymous-cowards-please-take-note/comment-page-1/#comment-2627971

    I am thinking he is just plain stupid, trying to bluff the obvious away.

  7. Adamant de-Nye-er says:

    His comments show either a basic lack of understanding of computer-based modeling or of software systems development, or possibly both. I am afraid that I don't understand any of his comments as coherent explanations, just a series of negative, emotion-laden sentences with little or no fact.
    At the risk of repeating myself, if software always runs as intended, the authors have at least double the burden of proof to show that it is actually working correctly. Meteorological and climate models are software. So if the assertion that an “… algorithm is working exactly as it was designed…” is not backed up by heavily documented independent validation and verification, it is most likely presenting a faulty “answer,” even if there is no nefarious intent. I am unaware of any climate study that is accompanied by open source code and by independent verification — this being distinct from “peer review” of a report or its conclusions. The question isn't whether someone else has verified the Unhiding software (I have, a little, but the source code is there and the methods are straightforward); the question is what verification was done, and how was it done, on GISS and IPCC models? Without that test report, any conclusions based on those models are suspect, peer review or no.
    It seems as if there is an alarmist desire to rely primarily on satellite-sourced temperature data, without regard to its reduced calibration precision and its limited availability for verification. This could translate into regarding or referring to any ground station data as “obsolete” or unreliable.
    Is “Goddard spike” supposed to be a reference to your pseudonym, Tony? Impressive, as the 1930s peak temperatures match unmolested data from a lot of other sources.

  8. sunsettommy says:

    Tony,

    he is insisting that USHCN is obsolete, having been replaced by nClimDiv for CONUS.

    https://wattsupwiththat.com/2017/10/04/quote-of-the-week-anonymous-cowards-please-take-note/comment-page-1/#comment-2627994

    I have not been able to find the most recent USHCN data.

    • tonyheller says:

      All of this is addressed in this blog post.
      USHCN updates their data every day. I explained how to get the data in part 2
      https://realclimatescience.com/2017/09/nick-stokes-busted-part-2/
      All of my software depends on it.
      There is essentially no difference between USHCN and NCDC’s current data, as shown in the graph above
      https://realclimatescience.com/wp-content/uploads/2017/10/USHCNVsNCDC_shadow.png

      Nick is completely full of cr@p and just blowing smoke.
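
      A minimal sketch for fetching and unpacking the current adjusted files, assuming the FTP path NOAA uses for USHCN v2.5 (verify it against the readme in that directory before relying on it):

          # Minimal sketch: download and unpack the latest adjusted USHCN
          # monthly mean temperature archive. The FTP path is an assumption
          # based on NOAA's v2.5 layout -- check it before use.
          import tarfile
          import urllib.request

          URL = ("ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/v2.5/"
                 "ushcn.tavg.latest.FLs.52j.tar.gz")

          local, _ = urllib.request.urlretrieve(URL, "ushcn.tavg.latest.FLs.52j.tar.gz")
          with tarfile.open(local, "r:gz") as tar:
              names = tar.getnames()
              tar.extractall("ushcn_data")          # one fixed-width file per station
          print(f"Extracted {len(names)} entries into ushcn_data/")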

    • AndyG55 says:

      USHCN was stopped in 2013 (August iirc) but the data is still there.

      The data is NOT OBSOLETE, they just don’t calculate it past August 2013.

      The data is still VERY relevant to showing the mal-faeces of the AGW scam.

      The fact that ClimDiv matches it back to 1895 shows that they are essentially the SAME THING, just a re-name and a slight change in methodology.

      They are not fooling anyone… except AGW fools !!!

    • richard verney says:

      I have posted the following at WUWT:

      Nick is a very competent mathematician and that is the reason why he knows that all the time series thermometer temperature reconstruction sets are meaningless, and that it is impossible to make any comparison of temperature fluctuations with respect to time, because the sample set (the source data from which the temperature is obtained) for any one year is continually changing over time.

      I have made this point to Nick many times. The data set from which the 1880 temperature is assessed, is not the same data set from which the 1900 temperature is assessed, which in turn is not the same data set from which the 1920 temperature is assessed, which in turn is not the same data set from which the 1940 temperature is assessed, etc and so forth. This means that at no stage is it possible to make a like for like comparison. One cannot look at the time series reconstructions and conclude that temperatures are rising (or have changed) because this may be nothing more than the consequence of using a different sample set.

      It is akin to seeking to ascertain whether the height of men has changed over time by measuring the height of adult men in Norway, Sweden, Finland, Iceland and the Netherlands during the period 1961 to 1990, averaging all of those heights, and then comparing that average to the height of Spanish men measured in 1880 and concluding that the height of men has dramatically increased over time. One cannot conclude from that that men in 1880 were less tall than men measured in 1961 to 1990.

      Nick recognises this fact, which besets all the time series thermometer temperature reconstruction sets (eg., GISS, HadCrut etc). Nick states:

      It’s simple, and just wrong. There were…1218 stations in the final set. There were a varying number, usually somewhere around 900, in the raw set. He subtracts the average absolute temperatures, and says the result is due to adjustment. But they are different sets. The 900 raw stations may just, on average, be warmer or cooler places than the 1218 final. If there is inhomogeneity (lat, altitude etc) you either have to use the same set, or carefully correct for the difference. Else you get things like the Goddard spike. (my emphasis)

      There is no way that they can be carefully corrected for the difference, and it is absurd to even try. What one needs is good quality unadjusted RAW data that can be compared directly with other good quality RAW data, with no adjustments whatsoever.

      We should not be trying to make global or hemispherical sets with infilling, kriging, spatial coverage adjustments etc; we should just make like for like direct comparisons at the same point locations. We should simply select, say, the 200 best sited stations, where there can be no doubt that there have been no manmade/material changes in the surrounding locality, and then retrofit these stations with the same type of enclosure (painted with the same type of paint), fitted with the same type of LIG thermometer (calibrated using the same methods as were historically applied at that location), and then observe using the same practice and procedure as was used in the 1930s/1940s.

      In that manner we can obtain modern day RAW data that can be directly compared to historic RAW data from the station from the 1930s/1940s without the need for any adjustment whatsoever to the data. This would be done on an individual station by station basis, simply comparing each station with itself to see whether there has been any change in temperature at that particular site since the highs of the 1930s/1940s.

      There would be no fancy statistics, simply a list of the number of stations that show, say, 0.2 degC cooling, 0.1 degC cooling, no change, +0.1 degC warming, +0.2 degC warming, etc. In that manner a like for like comparison can be made, and we would quickly get a feel as to whether there has been any significant warming during the period in which some 95% of all manmade emissions have occurred. Of course, that would not establish that CO2 is responsible, but it would provide us with a better insight into temperature change.
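
      A rough sketch of that tally, with made-up station values standing in for real measurements:

          # Toy sketch: given, for each well-sited station, a 1930s/1940s mean
          # and a modern mean taken with identical equipment and practice, bin
          # the station-by-station changes in 0.1 degC steps. The numbers are
          # made-up placeholders, not real data.
          from collections import Counter

          station_means = {
              # station: (mean 1930s/1940s, modern mean), degrees C
              "Station A": (11.8, 11.7),
              "Station B": (9.4, 9.6),
              "Station C": (14.1, 14.1),
          }

          bins = Counter()
          for name, (old, new) in station_means.items():
              bins[round((new - old) * 10) / 10.0] += 1   # nearest 0.1 degC

          for change in sorted(bins):
              label = "no change" if change == 0 else f"{change:+.1f} degC"
              print(f"{label}: {bins[change]} station(s)")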

      • Mark Luhman says:

        They can't do that; that would require real work. Real work, something that is unknown in climate science.

      • AndyG55 says:

        I think you would be really hard pressed to find 200 well distributed “good” sites around the world.

        Many of the long term sites are heavily affected by UHI and other issues such as thermometer type, casing, etc.

  9. CheshireRed says:

    Can’t help but notice Nick’s reluctance to come on here and defend his position in person. It’s the online equivalent of a police suspect who under cross examination endlessly repeats ‘no comment’.

  10. AndyG55 says:

    Yo, heads up to all

    https://wattsupwiththat.com/2017/10/05/america-first-energy-conference-announced/

    Tony, if you want to go, I’ll chip in $50US.

    Can you manage to get on a panel or something…. PAID trip.. even better :-)

  11. Rob says:

    Oh, how the mind works when it is so attached to something it believes in that it will ignore any reality that goes against it. That guy sounds nutty and needs some help. Tony has clearly laid out the logic.

  12. Tony

    You often post graphs of just USHCN sites which have continuous data, and these tell the same story.

    This should once and for all stop Stokes’ “changing station mix” argument

  13. richard verney says:

    For what it is worth, I will chuck my two pennies into this debate.

    What is important for determining whether there has been any change (over time) is to compare like with like. Tony does this by using the same data stream, namely USHCN. The comparison plots that Tony posts are a like for like comparison, and are therefore valid to make the point that the past was significantly warmer than today, and that despite ever rising levels of CO2 there has been no warming; to the contrary, there has been cooling.

    This would not be materially altered even if Nick was right that USHCN is no longer used. It does appear that it may have fallen out of favour sometime around 2014, but the data stream is still in use and still being updated, so there is no fundamental issue here.

    It may be that the ClimDiv data stream is (or is not) an improvement. It does appear to have a lot more stations. But do all of these stations have complete data sets going back as far as those in the USHCN data stream? If so, then the greater sampling and spatial coverage of ClimDiv would appear to be an improvement, but that alone does not render the like for like comparisons based upon USHCN materially wrong, and/or not relevant to the assessment of what, if any, temperature change there has been.

    I found Nick’s comment

    There were (USHCN has been obsolete for years) 1218 stations in the final set. There were a varying number, usually somewhere around 900, in the raw set. He subtracts the average absolute temperatures, and says the result is due to adjustment. But they are different sets. The 900 raw stations may just, on average, be warmer or cooler places than the 1218 final. If there is inhomogeneity (lat, altitude etc) you either have to use the same set, or carefully correct for the difference.

    to be of particular interest, since it is the fundamental problem with all the time series thermometer reconstructions. I have for a long time been pointing out that very fact to Nick!!!

    The problem with GISS/HadCrut etc is that the stations that reported temperature in 1860 are not the same stations that reported temperature in 1880, which are not the same stations that reported temperature in 1900, etc., so that they are not the same stations that reported temperature in 2016.

    The sample that makes up the data set in any given year is a constantly moving feast such that no meaningful comparison can be made over time. One is never making a like for like comparison.

    One cannot average data obtained during 1961 to 1990 and then compare it to data obtained in 1940 or 1880. It is like seeking to ascertain whether the height of men has changed over time by measuring the height of adult men in Norway, Sweden, Finland, Iceland and the Netherlands during the period 1961 to 1990, averaging all of this, and then comparing it to the height of Spanish men measured in 1880 and concluding that the height of men has dramatically increased over time.

    The type of comparison set forth in the time series thermometer reconstruction sets is completely meaningless.
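
    To make that concrete, here is a toy simulation (entirely made-up numbers, not real stations) in which no station warms at all, yet the average of absolute temperatures rises simply because cooler stations drop out of the reporting mix over time:

        # Toy simulation: every station has a constant temperature, so there is
        # zero real warming. Cooler stations are preferentially lost from the
        # network, and the average of the absolute temperatures of whichever
        # stations still report drifts upward anyway.
        stations = [2.0 + 20.0 * i / 99 for i in range(100)]   # fixed values, 2-22 C

        for year, n_reporting in [(1880, 100), (1920, 85), (1960, 70), (2000, 55)]:
            reporting = sorted(stations)[100 - n_reporting:]   # keep the warmer end
            avg = sum(reporting) / len(reporting)
            print(f"{year}: {n_reporting} stations reporting, average {avg:.2f} C")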

    • Mark Fife says:

      “If there is inhomogeneity (lat, altitude etc) you either have to use the same set, or carefully correct for the difference.”

      Amazing how he is accusing Tony of doing exactly what their “proof” depends on. Laughable really.

      But here is what I want to know. Tell me exactly how you “carefully correct for the difference” between two stations for which there is no overlap in reporting dates.

      I know how I did that for a record of annual averages. There has to be a 3rd station which overlaps the other two completely. You then determine a compensation factor for the original two stations to transpose those annual averages.

      My analogy for this is actually music. Transposing a melody from the key of C to the key of C# means raising each note by a half step.

      That seems fairly intuitive to me. The problem is in how you compute the compensation factor. Therein lies the uncertainty. In music I know the key because it is well defined. However, the absolute true difference in average temperature between two locations is based upon two estimates. How good those estimates are depends upon how many records you have.

      Then again, the value of adding a station record to a grand average depends upon how long the record is. A record covering 1 year or even 5 years really doesn’t contribute any meaningful information anyway.
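
      A minimal sketch of that transposition, with made-up annual values: stations A and B never overlap, but both overlap station C, which supplies the compensation factor. How trustworthy the result is depends, as noted above, on how long the overlaps with C are.

          # Minimal sketch of the "compensation factor via a third station" idea:
          # A and B never overlap, but both overlap C, so C is used to estimate
          # the fixed offset between them (like transposing to another key).
          # All annual values below are made-up placeholders.

          def offset(x, y):
              """Mean of (x - y) over the years both stations report."""
              common = set(x) & set(y)
              if not common:
                  raise ValueError("no overlapping years")
              return sum(x[yr] - y[yr] for yr in common) / len(common)

          station_a = {1900 + i: 10.0 + 0.01 * i for i in range(40)}   # 1900-1939
          station_b = {1950 + i: 12.5 + 0.01 * i for i in range(40)}   # 1950-1989
          station_c = {1900 + i: 11.0 + 0.01 * i for i in range(90)}   # overlaps both

          a_to_b = offset(station_a, station_c) - offset(station_b, station_c)
          spliced = {yr: t - a_to_b for yr, t in station_a.items()}    # A on B's basis
          spliced.update(station_b)
          print(f"estimated A-to-B offset: {a_to_b:+.2f} C")
          print(f"spliced record spans {min(spliced)}-{max(spliced)}")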

  14. TA says:

    “NCDC data is a minor variant of USHCN. Nick is trying to make it sound like an important point, but it is close to meaningless. He is just kicking up dust.”

    Exactly!

  15. Frank K. says:

    In my view, you could examine just a few rural US stations with reliable temperature data going back to the late 1800s to assess whether a global warming signature is present. One has to remember that in the late 1800s and through the 20th century, farming and agriculture were an important part of the US economy, and having reliable temperature and precipitation data was essential to success. It would make no sense for agricultural communities to report shoddy temperature or precipitation measurements – in fact, I'm sure they were quite careful to report high quality data. Hence, one can look at newspapers, agricultural journals, and other sources to corroborate the “official” US climate data. For example, the journal “Monthly Weather Review” has archives stretching back to the 1890s with tables and charts for climate and weather in the US and the world. I have been downloading these and plan to look at the reported data and see how it compares to the current data reported by the NCDC software.

    BTW, here is the “Monthly Weather Review” online archive:

    http://journals.ametsoc.org/toc/mwre/current

    • gator69 says:

      Years ago I was able to access unaltered NASA weather station data, and studied stations in the midwest with over a century of data. Many of these stations had remained rural and they showed either zero warming, or a slight cooling over the past 100+ years. The site gave actual data, along with recent photos of the station site. All that has since vanished.

    • tonyheller says:

      Gavin said the same thing a few years ago

  16. David M. says:

    Good evening, Tony & all other well informed thoughtful deniers:-)

    “Do NOT argue with fools” is an adage that has withstood the test of time and experience. You have already devastated Nick Stokes', and his fellow travelers', ludicrous claims about Tony's analytical prowess and integrity. Ditto their claims about those who read and consider Tony's research. Your time and efforts are now better spent devastating even more facets of the catastrophic climate change nonsense.

    Let me suggest three:

    1. Sea surface temperatures. Climate catastrophists claim the oceans are warming, even though the two best data sets refute the claim. The best data is collected by buoys and satellites. Hurricane and cyclone frequency and severity trends over the past few decades also refute the alarmists. Please see Tony's numerous articles about severe weather.

    2. The concept that global warming is necessarily catastrophic. Wrong. Global temperature changes will increase food production in some areas, overwhelming or largely offsetting diminished food production elsewhere. Food is already shipped around the world. Ocean shipping is getting cheaper, while food handling and temporary storage practices are improving dramatically. Also, modern HVAC equipment makes comfortable living and working conditions possible in hot and cold climates.

    3. Typical “climate models” have long been known to have a severe fundamental flaw: they assume TOO LARGE an increase in atmospheric water vapor as atmospheric CO2 increases. The assumed increase in water vapor is largely responsible for the models' global temperature “estimates”. The models suffer many other severe flaws too, including excessive complexity.

    Let me close with a sports analogy. A scoring, preferably overpowering, offense is needed to win baseball, football, soccer, basketball and other games. Defense merely reduces the odds of losing. Tony et al. NEED TO BE ON OFFENSE most of the time. The biggest challenge will be choosing which issues to focus on, in a field of play as target-rich as catastrophic climate change.

  17. RickS says:

    ” Is Nick too dense to understand his own words?” ???

    Umm, not exactly correct !

    “Nick” is what you call an “Asshole” !!

    Plain and simply an “extreme” ASSHOLE !!!

    Ya know, there are just too many ASSHOLES in today’s World, and it’s not a good thing for the rest of Us !

    Plain and simply it is not a Healthy thing to endure…

    AssholeOn
