The entire AGW movement is founded on a false premise.

Consensus isn't science, moron. Truth isn't determined by a majority vote.
The current best knowledge is. It's called the scientific consensus. You are a scientific illiterate if you don't know that.

But why am I stating the obvious?
 
Please tell us how we have 1880 temperatures accurate to a tenth of a degree.
Probably because accurate thermometers were carefully calibrated. Please tell us why you are such a dullard.

Thermometers weren't accurate to a tenth of a degree in 1880. In fact, it was only a few short decades ago that we reached that level of accuracy. Why pretend we have data that can't possibly exist?
 
Please tell us how we have 1880 temperatures accurate to a tenth of a degree.

I've explained it to you and other deniers before, several times. Here are some examples of me doing that.

http://www.usmessageboard.com/posts/11774569/

http://www.usmessageboard.com/posts/13259890/

http://www.usmessageboard.com/posts/10425762/

And being that you're a mewling cult liar, you're still pretending to have never seen the explanations. You've proven yourself to be profoundly dishonest, so "fuck off, liar" is the only response owed to you by anyone.

Now, for those willing to learn, here's the answer Frank and his pals always run from.

Standard error - Wikipedia
---
SEM is usually estimated by the sample estimate of the population standard deviation (sample standard deviation) divided by the square root of the sample size (assuming statistical independence of the values in the sample):
SEM ≈ s / √n

where
s is the sample standard deviation (i.e., the sample-based estimate of the standard deviation of the population), and
n is the size (number of observations) of the sample.
---

That is, the more measurements you use in the average, the smaller the error of the average. The error shrinks by a factor of the square root of the number of measurements. If you average 10,000 measurements, the error of the average is 100 times smaller than the error of each individual measurement.
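For anyone who wants to sanity-check that square-root behavior numerically, here is a minimal simulation sketch in Python. The 15.0-degree "true" temperature and the 1.0-degree single-reading error are made-up illustrative values, not real station specifications.

---
import random
import statistics

# Illustrative assumptions (not real station specs): every reading has an
# independent random error with a standard deviation of 1.0 degree.
TRUE_TEMP = 15.0
READING_ERROR_SD = 1.0

def error_of_average(n):
    """Average n noisy readings and return how far the average lands from truth."""
    readings = [random.gauss(TRUE_TEMP, READING_ERROR_SD) for _ in range(n)]
    return statistics.mean(readings) - TRUE_TEMP

# Repeat the experiment a few hundred times per sample size and look at the
# typical size of the error of the average.
for n in (1, 100, 10_000):
    errors = [error_of_average(n) for _ in range(500)]
    print(f"n = {n:6d}: spread of the average ≈ {statistics.pstdev(errors):.3f} "
          f"(theory s/sqrt(n) = {READING_ERROR_SD / n ** 0.5:.3f})")
---

With 10,000 readings the spread of the average comes out near 0.01 degrees, i.e. 100 times smaller than the 1.0-degree error assumed for one reading, which is exactly the s/√n behavior in the quoted formula.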

That's why measurements of average temperature from long ago can have such small errors. Most of the deniers here, including the ones claiming a science background, are shockingly clueless about such basic statistics. They'd literally fail a Statistics 101 class, which is how we know that the ones claiming such a science background are open frauds. Nobody who really has a science background would make such a bonehead error. And it won't matter that I just took the time to explain it to them, again. They'll just add statistics to their list of "Basic math and science which has been known for centuries which we now know is really totally wrong, because the denier cult says so."

Fake news using statistics to give the impression of accuracy.

The data refuses to support your theory, so you fake it and alter the data that does exist.

The ocean data from 1880 come from water drawn up in wooden buckets and measured with a glass tube accurate to 5 degrees. How many of those readings do you have? One?
 
Consensus isn't science, moron. Truth isn't determined by a majority vote.
The current best knowledge is. It's called the scientific consensus. You are a scientific illiterate if you don't know that.

But why am I stating the obvious?

"Best knowledge" isn't absolute truth, Majority vote doesn't determine scientific facts.
 
"Best knowledge" isn't absolute truth, Majority vote doesn't determine scientific facts.
But it is the best knowledge we have. If we're going to act, we should act on the best knowledge. It's not exactly rocket science to work that out.

Again, absolute truth does not exist in scientific theories. Evidence and consensus are what count. I don't know how many times I'll have to say that. I know, I'll let someone else say it for me.

Common misconceptions about science I: “Scientific proof”

https://www.psychologytoday.com

One of the most common misconceptions concerns the so-called “scientific proofs.” Contrary to popular belief, there is no such thing as a scientific proof.

Proofs exist only in mathematics and logic, not in science. Mathematics and logic are both closed, self-contained systems of propositions, whereas science is empirical and deals with nature as it exists. The primary criterion and standard of evaluation of scientific theory is evidence, not proof. All else equal (such as internal logical consistency and parsimony), scientists prefer theories for which there is more and better evidence to theories for which there is less and worse evidence. Proofs are not the currency of science.

Proofs have two features that do not exist in science: They are final, and they are binary. Once a theorem is proven, it will forever be true and there will be nothing in the future that will threaten its status as a proven theorem (unless a flaw is discovered in the proof). Apart from a discovery of an error, a proven theorem will forever and always be a proven theorem.

In contrast, all scientific knowledge is tentative and provisional, and nothing is final. There is no such thing as final proven knowledge in science. The currently accepted theory of a phenomenon is simply the best explanation for it among all available alternatives. Its status as the accepted theory is contingent on what other theories are available and might suddenly change tomorrow if there appears a better theory or new evidence that might challenge the accepted theory. No knowledge or theory (which embodies scientific knowledge) is final. That, by the way, is why science is so much fun.
 
"Best knowledge" isn't absolute truth, Majority vote doesn't determine scientific facts.
But it is the best knowledge we have. If we're going to act we should act on the best knowledge. It's not exactly rocket science to work that out.

Again, absolute truth does not exist in scientific theories. Evidence and consensus is what counts. I don't know how many times I'll have to say that. I know, I'll let someone else say it for me.

Common misconceptions about science I: “Scientific proof”

https://www.psychologytoday.com

One of the most common misconceptions concerns the so-called “scientific proofs.” Contrary to popular belief, there is no such thing as a scientific proof.

Proofs exist only in mathematics and logic, not in science. Mathematics and logic are both closed, self-contained systems of propositions, whereas science is empirical and deals with nature as it exists. The primary criterion and standard of evaluation of scientific theory is evidence, not proof. All else equal (such as internal logical consistency and parsimony), scientists prefer theories for which there is more and better evidence to theories for which there is less and worse evidence. Proofs are not the currency of science.

Proofs have two features that do not exist in science: They are final, and they are binary. Once a theorem is proven, it will forever be true and there will be nothing in the future that will threaten its status as a proven theorem (unless a flaw is discovered in the proof). Apart from a discovery of an error, a proven theorem will forever and always be a proven theorem.

In contrast, all scientific knowledge is tentative and provisional, and nothing is final. There is no such thing as final proven knowledge in science. The currently accepted theory of a phenomenon is simply the best explanation for it among all available alternatives. Its status as the accepted theory is contingent on what other theories are available and might suddenly change tomorrow if there appears a better theory or new evidence that might challenge the accepted theory. No knowledge or theory (which embodies scientific knowledge) is final. That, by the way, is why science is so much fun.

Wrong, moron, "consensus" doesn't count. Consensus isn't science. It's politics. One man with a good theory can defeat the entire scientific community. It happens all the time.
 
Fake news using statistics to give the impression of accuracy

Yep, cultist Frank is now completely off the rails. He's actually screaming that the statistical laws that have been known for centuries are "fake news".

Sadly, that sort of butthurt insanity is now seen across the entire denier cult. The whole cult considers it a point of pride to be stupid and totally detached from reality, and looks with paranoid suspicion upon any intelligent person who uses liberal concepts like "facts" and "data".
 
Yep, cultist Frank is now completely off the rails. He's actually screaming that the statistical laws that have been known for centuries are "fake news".

Show us your 10,000 data points from 1880

Wait. How's this for a thumb on the scale... they merged the sea data from 1854 into the land readings. We all know how accurate the sea readings were back in 1880, right? And there were so, so many readings, too.

Extended Reconstructed Sea Surface Temperature (ERSST) v4 | National Centers for Environmental Information (NCEI) formerly known as National Climatic Data Center (NCDC)

Counting the days until the AGW Cult has to Show and Tell
 
Please tell us how we have 1880 temperatures accurate to a tenth of a degree.
Probably because accurate thermometers were carefully calibrated. Please tell us why you are such a dullard.

Thermometers weren't accurate to a tenth of a degree in 1880. In fact, it was only a few short decades ago that we reached that level of accuracy. Why pretend we have data that can't possibly exist?
The temperature measurements you generally see plotted on a graph are an average over a year from many thermometers. When you average just 100 readings, the error of the average drops by a factor of 10, which is how you can get 1/10-degree accuracy: the standard error of the mean is proportional to the inverse of the square root of the number of readings.

That is why you can see higher precision in the plotted averages even when just one thermometer is involved: a single thermometer contributes about 365 readings per year.

However, that doesn't mean the earth's average temperature was measured accurately. It only means that the averaged results can be plotted on a graph with 1/10-degree precision.

The accuracy of the yearly temperature trend is even greater when it is estimated with linear regression.
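To make the annual-mean and trend point concrete, here is a small sketch in the same spirit, again in Python. The 15-degree baseline, the 0.5-degree per-reading error and the 0.01-degree-per-year trend are invented illustrative numbers, not real 1880 station data.

---
import random
import statistics

random.seed(42)

BASE_TEMP = 15.0          # assumed long-term mean, degrees
READING_ERROR_SD = 0.5    # assumed error of one daily reading, degrees
TREND_PER_YEAR = 0.01     # assumed underlying warming trend, degrees/year
YEARS = 30

# One thermometer, 365 daily readings per year, each with random error.
annual_means = []
for year in range(YEARS):
    truth = BASE_TEMP + TREND_PER_YEAR * year
    daily = [random.gauss(truth, READING_ERROR_SD) for _ in range(365)]
    annual_means.append(statistics.mean(daily))

# The annual mean is far more precise than any single reading:
# roughly 0.5 / sqrt(365) ≈ 0.026 degrees.
print("expected error of one annual mean ≈",
      round(READING_ERROR_SD / 365 ** 0.5, 3), "degrees")

# An ordinary least-squares slope through the annual means pins down the
# underlying trend even more tightly than any single annual mean.
xs = list(range(YEARS))
x_bar = statistics.mean(xs)
y_bar = statistics.mean(annual_means)
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, annual_means))
         / sum((x - x_bar) ** 2 for x in xs))
print("fitted trend:", round(slope, 4), "degrees/year (true value used: 0.01)")
---

Note that this only illustrates the precision of averages and of a fitted trend; it says nothing about instrument bias, which is the separate "accuracy" caveat made in the post above.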
 
One man with a good theory can defeat the entire scientific community.
You are a fuckwit. It's not a good theory until the scientific community agrees it's a good theory because it has good evidence.

'Defeat the entire scientific community'!

All you do is reveal greater depths of ignorance.
 
When you average just 100 readings, the error of the average drops by a factor of 10, which is how you can get 1/10-degree accuracy.

Can you please link to the data set showing "100 readings" from 1880?

Thank you
 
Thermometers weren't accurate to a tenth of a degree in 1880. In fact, it was only a few short decades ago that we reached that level of accuracy. Why pretend we have data that can't possibly exist?
You really are a dullard. How many lab or meteorological thermometers have you read? Yet you witter on as though you have a clue.

Temperature measurements in the late 1800s were accurate to one- or two-tenths of a degree Fahrenheit.
http://articles.chicagotribune.com
 
Can you please link to the data set showing "100 readings" from 1880?
You didn't understand the post! No link is needed. As I said, a single thermometer yields around 365 readings in one year.
 
You didn't understand the post! No link is needed. As I said, a single thermometer yields around 365 readings in one year.

Since you're committed to statistical dishonesty, you can say that one thermometer took a reading every second and therefore had 86,400 readings in a single day! So you're probably accurate to a hundredth of a degree back in 1880, notwithstanding the 5 °F margin of error.
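As an aside for readers following the statistics: the standard-error formula quoted earlier in the thread explicitly assumes statistically independent readings, and that assumption is exactly what a reading-per-second scheme would violate, because an instrument's error barely changes from one second to the next. Here is a rough sketch of the difference, in Python with made-up illustrative numbers (15-degree true temperature, 1.0-degree error):

---
import random
import statistics

random.seed(1)

TRUE_TEMP = 15.0
ERROR_SD = 1.0
N = 86_400     # one reading per second for a day
TRIALS = 100

def daily_mean_independent_errors():
    # Idealized case: every reading gets its own independent error,
    # which is the assumption behind the s/sqrt(n) formula.
    return statistics.mean(random.gauss(TRUE_TEMP, ERROR_SD) for _ in range(N))

def daily_mean_shared_error():
    # Closer to rapid re-reads of one instrument: the error is essentially
    # the same for every reading that day, so averaging removes nothing.
    return TRUE_TEMP + random.gauss(0.0, ERROR_SD)

indep = [daily_mean_independent_errors() - TRUE_TEMP for _ in range(TRIALS)]
shared = [daily_mean_shared_error() - TRUE_TEMP for _ in range(TRIALS)]
print("independent errors:       daily-mean spread ≈", round(statistics.pstdev(indep), 4))
print("one shared error per day: daily-mean spread ≈", round(statistics.pstdev(shared), 4))
---

In other words, 86,400 readings taken a second apart on one instrument do not behave like 86,400 independent samples; the square-root gains discussed above come from readings spread over many days, stations and instruments, which is a much better (though still imperfect) approximation of independence.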
 
Since you're committed to statistical dishonesty, you can say that one thermometer took a reading every second and therefore had 86,400 readings in a single day! So you're probably accurate to a hundredth of a degree back in 1880, notwithstanding the 5 °F margin of error.
I said one reading a day. That's 365 a year.
 
One man with a good theory can defeat the entire scientific community.
You are a fuckwit. It's not a good theory until the scientific community agrees it's a good theory because it has good evidence.

ROFL! Yeah, right. So was Einstein's theory of relativity good? The entire scientific community refused to accept it. How about Copernicus's theory that the Earth orbits the sun? How about the theory of continental drift? All these theories were originally rejected by the entire scientific community.

“In questions of science, the authority of a thousand is not worth the humble reasoning of a single individual.”
Galileo Galilei

When Einstein was told of the publication of a book entitled, '100 Authors Against Einstein', he replied: "Why one hundred? If I were wrong, one would have been enough."

It appears that Einstein and Galileo disagree with you, fuck wit.
 
What the fuck are you talking about, Pattycake? Einstein's Theory of Relativity was accepted in record time, considering that it was a paradigm that upended the basic understanding of nearly the whole of physics at that time. You seem not to know much about the history of science.

Do you think that when someone proposes a radical new theory, even if it is 100% correct, everyone just says, "Oh, that's right"? No, the scientists take their time and measure that theory against the real world, and look for cases where it is self-contradictory.

And that is why the denialists are subject to such ridicule. AGW has been measured against what is happening in nature at present, and it has been found to accurately describe and predict what we are seeing. It is you denialists who have failed on all fronts. You offer no alternative explanation for the present warming, and you present no evidence as to why the GHGs our activities have added to the atmosphere would not raise the temperature of the Earth.
 
The entire scientific community refused to accept it
So when did it become the scientific consensus? See how it works, science illiterate?

Also, general relativity clashes with quantum theory, so we know something is not right somewhere, but it is still the best knowledge available.
 
Since you're committed to statistical dishonesty, you can say that one thermometer took a reading every second and therefore had 86,400 readings in a single day! So you're probably accurate to a hundredth of a degree back in 1880, notwithstanding the 5 °F margin of error.
I said one reading a day. That's 365 a year.

[attached image]


^ A margin of error of 2-3 degrees, yet you pretend you're accurate to a tenth of a degree.
 
You didn't understand the post! No link is needed. As I said, a single thermometer yields around 365 readings in one year.

This is why AGW needs to be put under oath, so you can describe your methodology.

NOAA added in the ocean data from 1854 forward, too.
 
