Oh, and you state that satellites (even though they're good at measuring temperatures on other planets, the sun, and light frequencies from stars hundreds of light years away)...are not good at reading surface temps on earth.
Yes, exactly.
Accuracy matters. Getting the temperature on Jupiter to within 10 C is good enough. But that kind of error would be useless for climate studies on earth. For that, you've got to get the errors under 0.1 C.
Your standard weather satellite on earth gets you to within a couple degrees. That's good enough for the weather forecast, but no good for climate studies.
I guess infrared thermometers are s**t tech
For climate studies, they are pretty useless, since they can't see through clouds.
Some earth-orbiting satellites try to determine temperature by measuring the microwave radiation emitted by atmospheric oxygen. The atmosphere is mostly transparent to that, but not completely. Hence, corrections have to be made for time of day, clouds, humidity, observation angle, satellite orbit degradation, sensor drift, and other factors. A lot of guesswork is involved. Surface temperature measurements are much more reliable, since they measure temperature directly.
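To give a feel for why those corrections matter, here's a minimal sketch of how additive adjustments might stack up against the 0.1 C accuracy target mentioned above. This is purely illustrative: the function name and the correction values are made up, and the real satellite retrieval pipelines are vastly more complicated than a sum of offsets.

```python
def corrected_brightness_temp(raw_k, diurnal_k=0.0, orbit_decay_k=0.0, sensor_drift_k=0.0):
    """Apply hypothetical additive corrections (in kelvin) to a raw
    microwave brightness temperature. Illustrative only -- real
    retrievals involve far more than simple offsets."""
    return raw_k + diurnal_k + orbit_decay_k + sensor_drift_k

# Made-up example: each correction is small, but they're of the same
# order as the 0.1 C precision climate studies need, so getting any
# one of them wrong swamps the signal you're trying to measure.
t = corrected_brightness_temp(
    250.00,
    diurnal_k=-0.15,      # drift in local observation time
    orbit_decay_k=0.08,   # orbit slowly decaying
    sensor_drift_k=-0.03, # instrument calibration drift
)
print(round(t, 2))
```

The point is just that each correction term is comparable in size to the climate signal itself, which is why surface thermometers, which skip all of this, are considered more reliable.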
Some satellites do measure IR coming out of the stratosphere, since there's little water vapor to interfere up there. That's how we know the stratosphere is cooling, which is one of the smoking guns for greenhouse-gas-driven global warming. An increase in solar output would warm the stratosphere too, so we know increasing solar output is not the cause of the warming on earth.