USHCN raw monthly data show that July maximum temperatures were much hotter in 1936 than they are now, but July average temperatures were only slightly hotter in 1936. The gap exists because stations are reporting higher minimum temperatures now.
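Here is a minimal sketch of the kind of check described above: averaging the raw July TMAX values across USHCN stations for 1936 and a recent year. The filename, column offsets, and value scaling are assumptions modeled on a GHCN-M-style fixed-width monthly layout, so verify them against the readme that ships with the USHCN raw data before trusting any numbers.

```python
# Sketch: station-average July TMAX from a raw USHCN monthly file.
# Layout assumptions (check the dataset's readme): 11-char station ID,
# 4-digit year, then twelve monthly groups of a 6-char value plus 3 flag chars.

YEAR_SLICE = slice(12, 16)  # 4-digit year (assumed offset)
FIRST_VALUE = 16            # first monthly value starts here (assumed)
GROUP_WIDTH = 9             # per month: 6-char value + 3 flag chars (assumed)
VALUE_WIDTH = 6
JULY_INDEX = 6              # July is the 7th monthly group
MISSING = -9999             # sentinel for a missing month

def july_mean(path, year):
    """Mean of the July values for `year`, averaged over all stations."""
    vals = []
    with open(path) as f:
        for line in f:
            if int(line[YEAR_SLICE]) != year:
                continue
            start = FIRST_VALUE + JULY_INDEX * GROUP_WIDTH
            v = int(line[start:start + VALUE_WIDTH])
            if v != MISSING:
                vals.append(v / 10.0)  # assumed scaling: tenths of a degree
    return sum(vals) / len(vals) if vals else float("nan")

# Hypothetical local filename for the raw (unadjusted) TMAX file.
for year in (1936, 2012):
    print(year, round(july_mean("ushcn.tmax.raw.txt", year), 2))
```

A real comparison would area-weight the stations rather than take a flat mean over them, and would repeat the calculation for TMIN and TAVG, but this shows the shape of the calculation.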
Our friends claim that this is due to more humidity in the air because of global warming.
Just one minor problem with that theory: the Midwest was hot in July because of the drought, which pushed humidity below normal.
Anyone with half a brain will realize that the USHCN temperature record is severely contaminated by the urban heat island (UHI) effect, which is why alarmists can't figure it out.
Washington, DC used to rarely see an 80-degree minimum; now they are commonplace. The temperature of the Potomac has gotten close to 90 degrees at times. Do you think the proximity of the thermometer to the river might have something to do with that? The way any thermometer is situated can make a huge difference in the readings. If you are out to prove it's gotten hotter, any idiot can do it.
I see a 5-10 degree difference almost every night between the Safeway parking lot and the open space on the opposite corner of the intersection.
This should be no surprise to us, because these people have a clear-cut agenda, and they won't let anything stand in their way.
Steven, I know I don't thank you nearly enough for your endless critical data analysis. I have attributed the majority of AGW to UHI since my college days, but I never went to the trouble you have of proving it to anyone else. It is easy to explain, but actually quantifying it is a task that apparently only you are up to. 😉
Thanks again.
PS: It's been a while since we had an update on Hendrik's dog.
With the record amount of CO2 in the atmosphere, it must continue to get hotter: the theory says so, the models say so, NASA says so, the political consensus says so, publicity says so, and most importantly MSNBC says so.
Can they all be wrong?
/sarc off