SSDD
Gold Member
- Nov 6, 2012
- Thread starter
- #61
It all says that you're as stupid as stupid can be. Go back to the thread where you linked to that article and look at what I quoted from its abstract that clearly shows the author's conclusion was that the difference would be reduced radiation to space and higher future temperatures. See if the light comes on for you then.
Yeah, I read it...and saw the author's error immediately. He assumed that low emissivity in the far IR wavelengths would equal warming...because he failed to consider, or simply didn't know, that low emissivity must be coupled with low absorptivity. The author assumed, as is the case with all of climate science, that even though emissivity was low in the far IR, absorptivity must still be 100%...that is the sort of stupidity that is rampant in climate science...assumption after assumption. Lowered emissivity only equals warming if absorptivity doesn't change...alas, that isn't the case.
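For what it's worth, the arithmetic behind that claim can be sketched with a simple gray-body energy-balance toy model (this is my illustration, not anything from the article; the function name `t_eq` and the choice of a single broadband absorptivity and emissivity are my own simplifications, and real surfaces are wavelength-dependent):

```python
# Toy gray-body equilibrium: absorbed = alpha * S/4, emitted = eps * sigma * T^4.
# Solving for T shows why lowering emissivity alone warms, while lowering
# absorptivity and emissivity together leaves the equilibrium temperature unchanged.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0              # solar constant, W m^-2 (assumed round value)

def t_eq(alpha, eps):
    """Equilibrium temperature (K) for broadband absorptivity alpha
    and broadband emissivity eps in this toy model."""
    return (alpha * S / (4.0 * eps * SIGMA)) ** 0.25

base     = t_eq(0.7, 0.7)    # absorptivity and emissivity matched
low_eps  = t_eq(0.7, 0.35)   # emissivity halved alone -> higher T
both_low = t_eq(0.35, 0.35)  # both halved together -> same T as base
```

In this sketch `low_eps` comes out warmer than `base` by a factor of 2^0.25, while `both_low` equals `base` exactly, which is the point being argued: only the ratio of absorptivity to emissivity moves the equilibrium temperature in a model like this.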
If 70% of the earth's surface is a very poor absorber of the peak wavelengths of CO2, what does that do to the AGW hypothesis...and the greenhouse hypothesis for that matter?