The temperature record is calculated with a precision several orders of magnitude greater than its accuracy. Then they pat themselves on the back and think they are being really smart.
My chemistry lecturer at university used to laugh at all of us who suffered the OCD of calculating results to numerous decimal places from original measurements that were, at best, accurate to half a millilitre.
The really strange thing is why the academic community isn’t condemning such stupidity – they certainly ridiculed students who made such outrageous claims of accuracy in lab assignment reports.
When the biggest kid in grade school is a bully, the rest of the kids do not tell him that the reason why he is the biggest, is that he was too stupid to pass fourth grade for three years in a row.
The so-called “climate scientists” are ignorant bullies. But they are also the biggest, and dangerous to mess with.
As I recall, we had an entire class devoted to significant figures. It seems the climastrologists skipped that class.
Worse, they think the law of large numbers applies to what is actually ONE data point.
…Can you change the phrase to “…several orders of magnitude greater…”?
Many decades ago, when I studied mathematics, we did a short course on numerical approximation. The set book for this was Numerical Approximation by B.R. Morton (Dover Publications Inc.). I still have that book and have used it often over the years. It’s unfortunately long out of print, but I feel that our current band of scientists could do well to get a copy and study it, as well as studying the difference between precision and accuracy.
The key to precision for alarmists is to have fewer stations and more extrapolation plus adjustments.
What seems to be little mentioned is that extrapolations and adjustments also introduce their own error, which further limits the accuracy.
I laugh when I read that the temperature can be measured globally to the nearest thousandth or hundredth of a degree. Even a tenth of a degree is suspect to me, yet it amazes me how many “scientists” take this hook, line and sinker. Anyone who knows the slightest bit about measurement, accuracy and precision should laugh at the alarmists’ claims.
I figure they are doing well if they are within +/- 5 degrees Celsius for the “global temperature”.
Amen
I looked at GISS’s Fig.D.txt from May 2001, in which there are two decimal places of precision (hundredths). I was amazed later when I looked at May 2014’s table, in which there are FOUR decimal places of precision (ten thousandths).
Whom do they expect to believe that we can measure atmospheric temperatures to ten thousandths of a degree? (Even hundredths of a degree is difficult to believe.) This must be, therefore, calculated precision not measured precision.
But we learned about significant figures in elementary school, didn’t we?
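The jump from two to four decimal places is almost certainly arithmetic, not measurement. A minimal Python sketch (the whole-degree readings below are invented for illustration, not GISS data) shows how averaging integer data mechanically generates digits that look like extra precision:

```python
# Hypothetical whole-degree station readings -- not real GISS data.
readings = [14, 15, 15, 16, 14, 15, 15]

mean = sum(readings) / len(readings)
print(mean)         # 14.8571... : digits created by division, not by measurement
print(round(mean))  # 15: the precision the inputs actually support
```

Division will happily produce as many decimal places as you care to print; none of them reflect anything the thermometer measured.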
Those look to be the result of ‘derivative calculations’; it would not make a lot of sense to calculate and then report ‘whole’ numbers as a statistic. The temperature was probably received (data intake) as a whole number, but statistics kinda mandate additional ‘resolution’, shall we say.
For example, calc the average of 2, 3 and 4.
Now calc the average of 2, 3, 4 and 5.
What are your answers using whole numbers?
What are your answers using one decimal point?
The average of 2,3 & 4 is 3
The average of 2,3,4 and 5 is 3.
If you report the average with anything more than one significant figure, you are introducing precision that isn’t there. If you were in my class, I would mark it wrong.
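The worked example above can be checked directly. A quick Python sketch (illustrative only; note that the arithmetic mean of 2, 3, 4 and 5 carries a decimal digit the whole-number inputs cannot justify):

```python
def avg(xs):
    """Arithmetic mean of a list of numbers."""
    return sum(xs) / len(xs)

print(avg([2, 3, 4]))         # 3.0
print(avg([2, 3, 4, 5]))      # 3.5 -- a digit of precision the inputs don't have
print(round(avg([2, 3, 4])))  # 3, reported at the inputs' whole-number precision
```

Whether 3.5 should then be reported as 3 or 4 is exactly the kind of rounding-convention question being argued here (Python's built-in round, for what it's worth, rounds halves to the nearest even number).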
What if this is a cumulative process, say, for paying you a percentage of sales?
Do you still object?
If you’re talking about money, it would probably be 2.00, 3.00 and 4.00, since we usually know money to the nearest cent. Then you could report the data to the nearest cent.
Any good scientist knows you should not gain or lose precision when manipulating data.
Bzzzzzt!
You’re an educator?
Please. I doubt it.
A better example would be the case of a group of families, calculating the average number of children for that sample group.
The reporting can ONLY be a whole number.
The CALCULATED number of chillens per family can be a decimal value.
It only has meaning in a statistical sense, a result of a statistical inference.
If you are counting, then the data has unlimited significant figures, so yes, you would report the average to whatever number of digits you calculate. If you measure something, like a temperature, then when you report an average, you should not gain or lose precision when manipulating the data. If your temperature precision is to the nearest degree, then your average temperature should be reported to the nearest degree.
Yes, I am an educator, after working many years in industry.
The daily weather report for my city shows variations of 3 or 4 degrees C across the area. They would like us to believe their grid extrapolations mean something?
Yeah, accuracy and precision are often totally mucked up these days. I just purchased several “professional grade” USB port temperature/humidity loggers (~$150 each). Stated temperature accuracy is +/- 1 degree C, yet the software reports hundredths of a degree. Strapped two of them to an aluminum heatsink: one reads 25.79 and the other 24.05. So what is the true temperature of my aluminum heatsink? I figure 25 degrees (+/- 1), or the “true” temperature is anywhere between 24 and 26 degrees. It could be 24.0 and it could also be 26.0, or anywhere in between. It might be 25.00, but it could also be 24.10, or 25.67, or 24.89765, or…..
Its true temperature is between 24 and 26 degrees (also stated as 25 +/- 1); that’s all I can really know.
And anybody that argues that if I just bought a thousand of them I would know the “true” temperature to 0.01 degrees just doesn’t understand the problem.
Back in the old days, one of the first lessons in engineering school was “significant digits”. Of course, they also taught us how to convert a magnitude/angle vector into X and Y components (and in the other direction) using GRAPH PAPER and a PROTRACTOR. OH the horror.
Digits, lots and lots of Digits, that’s the path to true enlightenment…..
/sarc off
Cheers, Kevin
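The “buy a thousand of them” point is easy to simulate. Averaging does beat down random noise, but a shared systematic bias (say, every unit checked against the same slightly-off standard) survives any amount of averaging. A hedged Monte Carlo sketch in Python; the 0.8 °C common bias and 0.5 °C noise figures are invented for illustration:

```python
import random

random.seed(0)
TRUE_TEMP = 25.0   # the (unknowable in practice) true heatsink temperature
N = 1000           # number of sensors purchased

# Case 1: purely random, independent noise -- the mean converges on truth.
noisy = [TRUE_TEMP + random.gauss(0, 0.5) for _ in range(N)]

# Case 2: the same noise plus a common systematic bias within the
# +/- 1 degree C spec -- averaging 1000 readings does nothing to remove it.
BIAS = 0.8
biased = [TRUE_TEMP + BIAS + random.gauss(0, 0.5) for _ in range(N)]

print(abs(sum(noisy) / N - TRUE_TEMP))   # small: noise averages out
print(sum(biased) / N - TRUE_TEMP)       # stays near 0.8: bias remains
```

The 1/sqrt(N) improvement from averaging applies only to the independent random part of the error; it says nothing about errors the instruments have in common.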
“And anybody that argues that if I just bought a thousand of them I would know the “true” temperature to 0.01 degrees just doesn’t understand the problem.”
Exactly! And these are the same people who argue that “missing heat” has gone into the deep ocean. Why? “Because”, they say, “the global average temperature of the world’s oceans has increased by perhaps six one-hundredths of a degree over the last half century.”
LOL, you bring back many memories! You probably remember log-log graph paper too!
Now you got me thinking about slide rules, which I still have!
We have two lying on my desk downstairs.
Logarithmic graph plotting is extremely useful in the natural sciences, as most of nature runs in natural logarithmic cycles of growth and decay. Plotting them logarithmically allows you to see the true effects of parameter variations.
Why is it not done with temperature?
YES!
We were taught PLOT THE DATA. You do nothing statistics wise until you first plot the data. That is when you find out that bell curve you were counting on has two humps….
Yes indeed, the log-log plots. Look up the “Smith chart”; it’s both log-log and circular as well. I had a few of them (the old paper versions) laying around for a while, but I figured I would never need them again, so I tossed them.
Maybe I should have kept a few to enlighten these fools who believe everything their computer tells them….
Cheers, Kevin.
Smith Charts; although now the data is readily displayed on a PC screen or a hand-held device vs the human hard-copy version …
Accuracy of measurement instruments also depends on proper calibration. Any such instrument worthy of the name “professional” should be capable of periodic calibration.
The problem with that is you need a standard to calibrate against, and your calibration will only be as good as your standard. I was trained as an electronic technician; multimeters suffer the same problem as thermometers. A cabinet full of multimeters will diverge measuring the same voltage. Yes, you can calibrate them, but again, how can you be sure that the standard is correct? After you have calibrated them they will more or less agree, at least for the voltage you calibrated them at, but again only within their rated accuracy. The same goes in music: I once had a person tell me a tuning fork had to be the standard to calibrate my frequency meter against. You should have seen his face when I took a reading with the tuning fork at room temperature and then took a second reading after I cooled it down with some freeze spray (Freon). Funny, the readings were different, and which one was correct is anyone’s guess: it could have been the meter, or the tuning fork at room temperature, or the tuning fork after I cooled it down.
If only you had the estimated temperature data from 1200km away, you could have come to an accurate temperature within 0.0001 degree C.
re: KevinK June 7, 2014 at 1:56 am
… Digits, lots and lots of Digits, that’s the path to true enlightenment…..
You might be overlooking a couple of not-so-obvious points.
1. A sensor of that nature is cal’d off the assembly line (presumably) to the accuracy quoted by the manufacturer.
Over shorter periods of time greater accuracy is possible with in-house ‘calibration’ (or the use of cal tables or a cal ‘factor’). This is up to you and your company to work out. Full characterization knowing the type of sensor (and any latent hysteresis) might be required for the greatest accuracy.
2. Differential temperature measurement is fully possible if two or more devices can be placed in the same ‘bath’ (temperature environment) and their difference in readings noted. This can be recorded and used later during the data reduction/analysis phase (after testing is complete.)
In this case the increased ‘precision’, as opposed to accuracy, is a usable asset.
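Point 2 can be sketched concretely using the two loggers mentioned earlier in the thread (25.79 and 24.05 on the same heatsink). Their offset in a common ‘bath’ can be recorded and applied later, even though neither absolute reading is trusted beyond +/- 1 °C. The field reading below is invented for illustration:

```python
# Offset observed with both loggers in the same 'bath' (the shared heatsink):
bath_a, bath_b = 25.79, 24.05
offset = bath_a - bath_b           # logger A reads ~1.74 C above logger B

# Hypothetical later field reading from logger B, expressed on A's scale:
field_b = 28.50
field_b_on_a_scale = field_b + offset

print(round(offset, 2))              # 1.74
print(round(field_b_on_a_scale, 2))  # 30.24
```

The differential number is repeatable to the instruments’ resolution even though the absolute temperature of either reading is only known to +/- 1 °C.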
1. A sensor of that nature is cal’d off the assembly line (presumably) to the accuracy quoted by the manufacturer…..
>>>>>>>>>>>>
Snicker. Don’t bet on it. I have bought “calibrated thermometers” for my lab only to find them off by a degree or more.
We had an NBS standard thermometer we used to calibrate all the lab thermometers monthly and then we had a professional come in to certify the thermometers and also the balances once a year or every six months depending on what the procedures called for.
Thanks Jim, I am fully aware of how to do in-house calibrations. And I understand the difference between precision and accuracy just fine.
Some of what I do is absolute radiometric calibrations of Earth imaging satellite focal planes (a big digital camera in space) to NIST traceable standards. It makes temperature measurements look like a “walk in the park”.
Untraceable “digits” are a fools paradise, as shown by the climate “science” community.
Cheers, Kevin.
I learned about significant figures 25.7564738 years ago
On ocean heat, the IPCC’s AR5 tells us in Chapter 3:
“It is virtually certain that the upper ocean (0 to 700 m) warmed from 1971 to 2010….The warming rate is 0.11 [0.09 to 0.13]°C per decade in the upper 75 m, decreasing to about 0.015°C per decade by 700 m … the water column south of the Subantarctic Front warmed at a rate of about 0.03°C per decade from 1992–2005, and waters of Antarctic origin warmed below 3000 m at a global average rate approaching 0.01°C per decade at 4500 m over the same time period.”
Really? They can measure world-wide ocean warming rate accurate to 3 places? What rot!
And if you unscramble the convolutions, it isn’t any different from what they said in chapter 5 of their AR4 report:
“Over the period 1961 to 2003, global ocean temperature has risen by 0.10°C from the surface to a depth of 700 m.”
So the only thing they did in the AR5 was to bump up their accuracy claims from two places to three. What a joke!
“Over the period 1961 to 2003, global ocean temperature has risen by 0.10°C from the surface to a depth of 700 m.”
And these people apparently write that stuff with a straight face.
I am too lazy to redo the calculations, but even with today’s MUCH higher rate of sampling (thanks, JASON), a global average to tenths or hundredths is foolish. Today’s ocean sample rate, the best we have ever had, is about the equivalent of sampling a body of water the size of the Great Lakes at one location, once or twice a year. From that you get a hundredth of a degree?!