All readings are recorded to the whole number. For 4, 4, 5 the average is 4.333333333333333...
How many digits are significant?
That depends on the error.
If there are 1,000 data points with an average of 4.33333333, how many digits are significant?
That also depends on the error.
If there are 1,000,000 data points with an average of 4.3333333, how many digits are significant?
The same: it still depends on the error.
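To make that concrete, here's a minimal sketch. It assumes the readings were rounded to the nearest whole number, so each one carries a uniform ±0.5 rounding error with standard deviation 1/√12 ≈ 0.289; that assumption is mine, not stated above. The error of the average then shrinks as 1/√n, and that, not the string of digits the calculator prints, is what determines how many digits are significant.

```python
import math

# Assumption: readings rounded to the nearest whole number carry a
# uniform +/-0.5 rounding error, whose standard deviation is 1/sqrt(12).
sigma = 1 / math.sqrt(12)  # ~0.289 per reading

for n in (3, 1_000, 1_000_000):
    sem = sigma / math.sqrt(n)  # standard error of the mean
    print(f"n = {n:>9}: error of the average ~= {sem:.6f}")
```

Under that assumption, 3 readings support roughly one decimal place of the average, a thousand about two, and a million about three.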
If you need to convert from °F to °C, how many working digits do you use? How many significant digits are in the final answer? Did the conversion have an effect?
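On the conversion question: the factor 5/9 and the offset 32 in C = (F − 32) × 5/9 are exact, so the conversion itself introduces no new error. Standard practice is to carry extra working digits internally and round to the justified significant digits only once, at the end. A sketch (assuming, purely for illustration, that the averages above are in °F, which wasn't actually stated):

```python
def f_to_c(temp_f: float) -> float:
    """Convert Fahrenheit to Celsius. 32 and 5/9 are exact constants,
    so the conversion adds no uncertainty of its own."""
    return (temp_f - 32.0) * 5.0 / 9.0

avg_f = 4.333333333333333      # full working precision kept internally
avg_c = f_to_c(avg_f)
print(f"{avg_c:.2f}")          # round to the justified digits only at the end
```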
You gave us a problem without enough information to determine an answer, and didn't realize it. That's because, like almost every denier, you fail at basic statistics.
Every denier on this thread is proudly displaying the Dunning-Kruger effect; that is, they're too stupid to grasp that they're stupid. There's also some serious narcissism at work. A normal person, upon finding that the world disagrees with them, will conclude that the most likely explanation is that they themselves are wrong, and will try to educate themselves before spouting any crazy accusations. Deniers? They consider themselves absolutely infallible and incapable of error, so when the world disagrees with them, they declare the only possible explanation is a global plot against them.
I posted this on the other thread, and got no response. Let's try it again. It's statistics so simple, even a denier should be able to grasp it.
You have the following data set of independent temperature readings.
13, 15, 10, 14, 12
The error on each reading is ±0.50, and the error is Gaussian.
What's the average temperature, and what's the error of the average?
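For anyone who wants to check their work, a minimal sketch of the standard computation follows. The mean is the ordinary arithmetic average; for n independent readings with equal Gaussian error σ, standard error propagation gives the error of the mean as σ/√n.

```python
import math

readings = [13, 15, 10, 14, 12]
sigma = 0.50                       # stated Gaussian error on each reading

mean = sum(readings) / len(readings)
# Independent, equal Gaussian errors: the error of the mean is sigma/sqrt(n).
error_of_mean = sigma / math.sqrt(len(readings))

print(f"average = {mean} +/- {error_of_mean:.2f}")   # 12.8 +/- 0.22
```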