In Support of the A in AGW

jc - we have been over this many times.

Do you agree that the atmosphere warms and moderates the temperature at the surface? If yes, then why?

An atmosphere of any composition will have this effect. GHGs exacerbate it because they impede the escape of IR directly to space. E.g., if the surface were losing 400 W/m2 while receiving only 160 W/m2 of solar input, we would cool rapidly, both the surface AND the atmosphere. Without GHGs the new equilibrium might be something like 250 or 300 W/m2, which is very cold.

You are focusing your attention on minutiae and ignoring the bigger picture. Of course colder objects can warm warmer objects that are being heated by an outside source. Temperature is an equilibrium between energy input and energy loss. Reducing loss is just as effective as adding input.
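The energy-balance point above can be sketched with the Stefan-Boltzmann law. This is a minimal illustration, not a climate model; the 40% direct-escape fraction is an illustrative assumption, not a measured value.

```python
# Minimal energy-balance sketch using the Stefan-Boltzmann law.
# The 0.4 "escape fraction" below is an illustrative assumption.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def equilibrium_temp(absorbed, escape_fraction=1.0):
    """Temperature at which the emission that escapes balances the absorbed flux.

    Solves absorbed = escape_fraction * SIGMA * T**4 for T.
    """
    return (absorbed / (escape_fraction * SIGMA)) ** 0.25

# All emission escapes directly (no GHGs): 160 W/m^2 in, 160 W/m^2 out.
print(round(equilibrium_temp(160.0)))        # ~230 K, very cold
# If GHGs let only ~40% of surface emission escape directly, the surface
# must run warmer to push the same 160 W/m^2 out the top.
print(round(equilibrium_temp(160.0, 0.4)))   # ~290 K
```

The point is only that reducing the loss term raises the equilibrium temperature just as adding input does.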

Atmospheric thermal effect...and oddly enough, it accurately predicts the temperature of every planet in the solar system with an atmosphere while the greenhouse hypothesis can't even predict the temperature of earth without a fudge factor....the greenhouse hypothesis is wrong, Ian....it has failed, as evidenced by the failing climate models based on it...if it were accurate, the models wouldn't fail in mere hours...
 
Sorry, but it doesn't....and still it accurately predicts the temperature of every planet in the solar system while the greenhouse hypothesis can't even accurately predict the temperature of the earth without a fudge factor.
What does your blog theory say about the 400 W/m2 continually radiating from the earth?
It isn't a matter of belief...you can't measure back scatter at ambient temperature even though it is supposedly twice as much radiation as comes in from the sun....while you can measure the incoming radiation from the sun at ambient temperature with no problem...energy moving from a warm object to a cooler object is not back scatter...it is simple energy transfer...
So what happens when you remove the detector? The back radiation ceases?
But you guys not providing it does make it true....for all the talk, not one shred of observed, measured, empirical evidence has been presented that supports the A in AGW...but if you believe some has, by all means point it out and say how you think it supports the anthropogenic component of the AGW hypothesis.
Others have presented the observed, measured, empirical evidence of back scatter in this thread. Go back and read it. But don't forget that 100% of all physicists believe that quantum mechanics shows that thermal radiation from a body at any temperature always strikes any other object in the radiation path no matter what temperature they both are. You cannot prove otherwise. To even think otherwise is ridiculous. If you want to believe in smart photons that don't like to hit warmer objects, then you are not talking about science. Period.
 
Why does it matter? The answer is that warm flows to cold and not vice versa. It is true physics. Testable as well. You post up the experiment that shows an ice cube warming a pan at room temperature.

Why does it matter? Good question. Were you to try - and inevitably fail - to answer it, you'd realize a major misunderstanding of science that currently hampers your understanding of the earth's climate system.

Let's make this (thought) experiment instead. Assume you fly at a fixed point one light-minute above the sun's surface. Sure, you'd be hit with radiation from the sun, right? Now, I replace you with your likeness at the same place earlier occupied by you, only your likeness is ten times hotter than the sun.

According to your theory, the sun would no longer radiate in the direction of your likeness. So, what happens to one minute's worth of the sun's radiation that was sent out when the sun still thought it was sending radiation towards a cooler object (you), but which now heads towards a much hotter object - your incredibly hot likeness (remember, you were one light-minute above the sun)? Does the sun call back the radiation?
 
Why does it matter? The answer is that warm flows to cold and not vice versa. It is true physics. Testable as well. You post up the experiment that shows an ice cube warming a pan at room temperature.

Why does it matter? Good question. Were you to try - and inevitably fail - to answer it, you'd realize a major misunderstanding of science that currently hampers your understanding of the earth's climate system.

Let's make this (thought) experiment instead. Assume you fly at a fixed point one light-minute above the sun's surface. Sure, you'd be hit with radiation from the sun, right? Now, I replace you with your likeness at the same place earlier occupied by you, only your likeness is ten times hotter than the sun.

According to your theory, the sun would no longer radiate in the direction of your likeness. So, what happens to one minute's worth of the sun's radiation that was sent out when the sun still thought it was sending radiation towards a cooler object (you), but which now heads towards a much hotter object - your incredibly hot likeness (remember, you were one light-minute above the sun)? Does the sun call back the radiation?
Hahahaha Hahahaha, dude way too hard. Relax, you don't have it. I knew it.

Now, take an ice cube and hold it in your hand, does your hand get warm or cold?
 
What does your blog theory say about the 400 W/m2 continually radiating from the earth?

Not my theory...do feel free to read what it says about everything if you like, not that I would expect one of the AGW faithful to get anything out of it regardless of the fact that it accurately predicts the temperature of every planet in the solar system while the greenhouse hypothesis can't even predict the temperature of earth without a fudge factor...

https://tallbloke.files.wordpress.com/2011/12/unified_theory_of_climate_poster_nikolov_zeller.pdf
http://www.wcrp-climate.org/conference2011/posters/C7/C7_Nikolov_M15A.pdf

So what happens when you remove the detector? The back radiation ceases?

The more sensible question is what happens when you warm the detector....answer, it stops measuring radiation incoming from the atmosphere....if it is no longer measuring radiation coming from the atmosphere, what do you think is happening?...What makes more sense?.....the detector breaks and stops working at temperatures higher than the atmosphere.......or the atmosphere simply doesn't radiate to the warmer object....you can swing the uncooled detector towards warmer objects and immediately it starts detecting radiation again...so clearly the instrument is working....must be your hypothesis that isn't working.

Others have presented the observed, measured, empirical evidence of back scatter in this thread. Go back and read it. But don't forget that 100% of all physicists believe that quantum mechanics shows that thermal radiation from a body at any temperature always strikes any other object in the radiation path no matter what temperature they both are. You cannot prove otherwise. To even think otherwise is ridiculous. If you want to believe in smart photons that don't like to hit warmer objects, then you are not talking about science. Period.

No...crick produced observed, measured, quantified evidence of energy moving from a radiator that is warmer than -80F to a detector cooled to a temperature lower than -80F...that is not back scatter....that is energy moving from a warmer radiator to a cooler radiator...he hasn't shown any measurement of back scatter at ambient temperature... So the fact remains that zero observed, measured, quantified evidence has been shown that supports the A in AGW...

If back scatter is physical fact, why can it not be observed and measured at ambient temperature....the claim is that more than 300 W/m2 of it is coming in...why can't it be measured unless the instrument is cooled to a temperature lower than that of the atmosphere?
 
Hahahaha Hahahaha, dude way too hard. Relax, you don't have it. I knew it.

Now, take an ice cube and hold it in your hand, does your hand get warm or cold?

You aren't really that much into answering questions, are you?

Moreover, time and again watching folks burst into maniacal laughter, I find they are laughing at either their own ignorance or the hilarious consequences of the false beliefs they nevertheless staunchly hold, unwittingly.

Of course, my hand would get cold(er). So, we see, you have a firm grasp of heat flux. That's good, as far as it goes, as it is the dominant mode of energy transfer between close-contact bodies of different temperatures. Yet, when we're discussing the world's energy budget, we're talking overwhelmingly about radiative flux. The two concepts differ in important respects, for instance: while the former mode of energy transmission follows the heat gradient, the latter is a result of warm (that is, above 0 K) bodies radiating heat in every direction. If you weren't warmed (!) by the radiative flux of that 70°F atmosphere surrounding you (all else equal), you'd probably freeze to death in short order, even though your body is at something like 100°F. Imagine that!

You could help your understanding of the earth's energy budget, and, indeed, AGW, a great deal if you tried to wrap your head around radiative flux.
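That two-way exchange can be put in rough numbers; the sketch below treats skin and surroundings as blackbodies (emissivity of 1 is an illustrative assumption):

```python
# Rough sketch of the two-way radiative exchange: skin at ~100 F and
# surroundings at ~70 F both radiate; the net loss is the difference.
# Blackbody emissivity (=1) is an illustrative assumption.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def f_to_k(f):
    """Convert Fahrenheit to kelvin."""
    return (f - 32.0) * 5.0 / 9.0 + 273.15

t_skin, t_air = f_to_k(100.0), f_to_k(70.0)
emitted = SIGMA * t_skin ** 4    # ~530 W/m^2 leaving the skin
received = SIGMA * t_air ** 4    # ~425 W/m^2 arriving from the surroundings
net_loss = emitted - received    # ~105 W/m^2: a modest, survivable loss
# Facing 2.7 K space instead, you'd lose nearly the full ~530 W/m^2.
print(round(emitted), round(received), round(net_loss))
```

Remove the incoming ~425 W/m^2 and the net loss quintuples, which is the "freeze to death" point above.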
 
Hahahaha Hahahaha, dude way too hard. Relax, you don't have it. I knew it.

Now, take an ice cube and hold it in your hand, does your hand get warm or cold?

You aren't really that much into answering questions, are you?

Moreover, time and again watching folks burst into maniacal laughter, I find they are laughing at either their own ignorance or the hilarious consequences of the false beliefs they nevertheless staunchly hold, unwittingly.

Of course, my hand would get cold(er). So, we see, you have a firm grasp of heat flux. That's good, as far as it goes, as it is the dominant mode of energy transfer between close-contact bodies of different temperatures. Yet, when we're discussing the world's energy budget, we're talking overwhelmingly about radiative flux. The two concepts differ in important respects, for instance: while the former mode of energy transmission follows the heat gradient, the latter is a result of warm (that is, above 0 K) bodies radiating heat in every direction. If you weren't warmed (!) by the radiative flux of that 70°F atmosphere surrounding you (all else equal), you'd probably freeze to death in short order, even though your body is at something like 100°F. Imagine that!

You could help your understanding of the earth's energy budget, and, indeed, AGW, a great deal if you tried to wrap your head around radiative flux.
I could help myself by getting evidence that such radiation actually exists. I answered your question, you didn't like my answer.

Now if you took that same ice cube and held it three inches above your hand, would your hand warm up or cool down? Now they are radiating, right? Does my hand stay warmer longer?
 
Not my theory...do feel free to read what it says about everything if you like, not that I would expect one of the AGW faithful to get anything out of it regardless of the fact that it accurately predicts the temperature of every planet in the solar system while the greenhouse hypothesis can't even predict the temperature of earth without a fudge factor...

https://tallbloke.files.wordpress.com/2011/12/unified_theory_of_climate_poster_nikolov_zeller.pdf
http://www.wcrp-climate.org/conference2011/posters/C7/C7_Nikolov_M15A.pdf

Thank you for the references. The whole theory of Nikolov and Zeller relies on adiabatic heating. That occurs as a reversible process when work is done. According to the authors, the work is done by the gravitational force. In experiments, the adiabatic process must take place before any heat can dissipate, otherwise it is not reversible. If it's done quickly there is not enough time for any energy to transfer as heat to or from the system.

Here is an example: your hand pump gets hot when pumping up a tire due to adiabatic compression. If you wait, heat will dissipate and the pump will cool down. At that point the process is no longer adiabatic.
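The pump example in numbers; a sketch assuming reversible adiabatic compression of air, where gamma = 1.4 and the 3x pressure ratio are illustrative choices:

```python
# Sketch of the pump example: rapid (adiabatic) compression of air.
# gamma = 1.4 (diatomic air) and the 3x pressure ratio are illustrative.
def adiabatic_temp(t1_k, p_ratio, gamma=1.4):
    """T2 for reversible adiabatic compression: T2 = T1 * (P2/P1)**((g-1)/g)."""
    return t1_k * p_ratio ** ((gamma - 1.0) / gamma)

# Room-temperature air (293 K) squeezed quickly to 3x pressure:
print(round(adiabatic_temp(293.0, 3.0)))  # ~401 K during the fast stroke
# Wait a while and heat leaks into the pump body; the air relaxes back
# toward 293 K and the process is no longer adiabatic.
```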

In order for the atmosphere to be in an adiabatic condition, all the air must start out, say, a few hundred miles above the earth. When the air falls to the earth it will be compressed most at the lowest levels and be the hottest. At higher levels the pressure will be less and the atmosphere will be cooler, according to the ideal gas law. That is the temperature profile the authors are referring to, and, as they claim, it is similar to the profile of other planets.
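That profile can be sketched as the dry adiabatic lapse rate, g/cp of about 9.8 K/km; the 288 K surface temperature below is an illustrative value:

```python
# Sketch of the adiabatic temperature profile the authors invoke: the dry
# adiabatic lapse rate g/cp ~ 9.8 K/km. Surface temperature is illustrative.
G = 9.81      # gravitational acceleration, m/s^2
CP = 1004.0   # specific heat of dry air at constant pressure, J/(kg K)

def column_temp(t_surface_k, z_m):
    """Temperature at altitude z for a dry adiabatic column of air."""
    return t_surface_k - (G / CP) * z_m

print(round(column_temp(288.0, 5000.0)))  # ~239 K at 5 km
```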

However, what the authors fail to include is the fact that without external energy, the atmosphere would eventually even out in temperature after the initial adiabatic heat is dissipated. Our atmosphere does not do that. The reason, of course, is that thermal energy is continually being pumped into the system from the sun; the earth warms and radiates LWIR, etc.

That is the reason for my remark, "what does your blog theory say about the 400 W/m2 continually radiating from the earth?"

The more sensible question is what happens when you warm the detector....answer, it stops measuring radiation incoming from the atmosphere....if it is no longer measuring radiation coming from the atmosphere, what do you think is happening?...What makes more sense?.....the detector breaks and stops working at temperatures higher than the atmosphere.......or the atmosphere simply doesn't radiate to the warmer object....you can swing the uncooled detector towards warmer objects and immediately it starts detecting radiation again...so clearly the instrument is working....must be your hypothesis that isn't working.

No...crick produced observed, measured, quantified evidence of energy moving from a radiator that is warmer than -80F to a detector cooled to a temperature lower than -80F...that is not back scatter....that is energy moving from a warmer radiator to a cooler radiator...he hasn't shown any measurement of back scatter at ambient temperature... So the fact remains that zero observed, measured, quantified evidence has been shown that supports the A in AGW...

If back scatter is physical fact, why can it not be observed and measured at ambient temperature....the claim is that more than 300 W/m2 of it is coming in...why can't it be measured unless the instrument is cooled to a temperature lower than that of the atmosphere?
Thermal detectors are cooled to prevent the detector housing from interfering with the measurement. The detector will still work if its temperature is above the ambient temperature, but the temperature of the housing must be subtracted out to get the desired reading.

The original CMB detector in 1964 was at 4 K. It successfully used a method to subtract the 4 K housing interference to get the 2.7 K CMB. Today's detectors can drop to much lower temperatures, so that mechanism for subtraction is no longer needed.
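The subtraction idea can be sketched in the style of a pyrgeometer: the sensor reads the net radiative exchange with the sky, and the instrument's own emission is added back to recover the incoming flux. The numbers below are illustrative, not real readings.

```python
# Illustrative model of the subtraction idea: a thermopile reads the NET
# exchange with the sky; adding back the instrument's own emission
# (SIGMA * T_body**4) recovers the incoming flux. Numbers are illustrative.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def downwelling(net_signal, t_body_k):
    """Incoming LW flux = net thermopile signal + instrument's own emission."""
    return net_signal + SIGMA * t_body_k ** 4

# A 288 K instrument emits ~390 W/m^2. A net reading of -50 W/m^2 (the
# instrument losing heat to a cooler sky) implies ~340 W/m^2 coming down.
print(round(downwelling(-50.0, 288.0)))
```

A detector warmer than the sky thus still yields a measurement; the sign of the net signal just flips relative to a cooled one.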
 
Not my theory...do feel free to read what it says about everything if you like, not that I would expect one of the AGW faithful to get anything out of it regardless of the fact that it accurately predicts the temperature of every planet in the solar system while the greenhouse hypothesis can't even predict the temperature of earth without a fudge factor...

https://tallbloke.files.wordpress.com/2011/12/unified_theory_of_climate_poster_nikolov_zeller.pdf
http://www.wcrp-climate.org/conference2011/posters/C7/C7_Nikolov_M15A.pdf
One further note. Although that concept was picked up in many blogs, your first reference is a "poster presentation." At conferences most papers are orally presented to an audience. However, some papers that aren't accepted for oral presentation are allowed a poster presentation. Those are sort of like "science fair" story boards set up in the lobby of the conference center. During breaks the authors generally stand around their poster set-up and field questions. Your second reference is no doubt exactly the poster in three sections set up on a table.
 
The more sensible question is what happens when you warm the detector....answer, it stops measuring radiation incoming from the atmosphere

Absolutely false. Anyone can now buy thermal cameras that require no cooling. Cooling just reduces the electronics noise that would otherwise swamp the signal, but with enough processing power, there are ways to filter that noise out, which is exactly what the new cameras do.

So, that kook claim of yours is conclusively debunked, and with it, your whole argument collapses. Thanks for playing, we have some lovely parting gifts for you.
 
I could help myself by getting evidence that such radiation actually exists. I answered your question, you didn't like my answer.

Now if you took that same ice cube and held it three inches above your hand, would your hand warm up or cool down? Now they are radiating, right? Does my hand stay warmer longer?

You didn't answer any one of my questions.

I see, you're now seemingly into convection as another way of energy transport. Fine, we're still not quite there, though. Really, take a closer look at radiative flux, and realize that both hand and ice cube are radiating, which doesn't violate the Second Law of Thermodynamics, nor does it contradict the fact that warmer bodies radiate more, nor does it mean there isn't a net radiative flux from hand to ice cube. They're both still emitting energy.
 
Not my theory...do feel free to read what it says about everything if you like, not that I would expect one of the AGW faithful to get anything out of it regardless of the fact that it accurately predicts the temperature of every planet in the solar system while the greenhouse hypothesis can't even predict the temperature of earth without a fudge factor...

https://tallbloke.files.wordpress.com/2011/12/unified_theory_of_climate_poster_nikolov_zeller.pdf
http://www.wcrp-climate.org/conference2011/posters/C7/C7_Nikolov_M15A.pdf

Thank you for the references. The whole theory of Nikolov and Zeller relies on adiabatic heating. That occurs as a reversible process when work is done. According to the authors, the work is done by the gravitational force. In experiments, the adiabatic process must take place before any heat can dissipate, otherwise it is not reversible. If it's done quickly there is not enough time for any energy to transfer as heat to or from the system.

Here is an example: your hand pump gets hot when pumping up a tire due to adiabatic compression. If you wait, heat will dissipate and the pump will cool down. At that point the process is no longer adiabatic.

In order for the atmosphere to be in an adiabatic condition, all the air must start out, say, a few hundred miles above the earth. When the air falls to the earth it will be compressed most at the lowest levels and be the hottest. At higher levels the pressure will be less and the atmosphere will be cooler, according to the ideal gas law. That is the temperature profile the authors are referring to, and, as they claim, it is similar to the profile of other planets.

However, what the authors fail to include is the fact that without external energy, the atmosphere would eventually even out in temperature after the initial adiabatic heat is dissipated. Our atmosphere does not do that. The reason, of course, is that thermal energy is continually being pumped into the system from the sun; the earth warms and radiates LWIR, etc.

The atmosphere would only achieve equilibrium if it were a static column of air...the atmosphere is the farthest thing from static...the work of pressure never stops and the heat keeps bleeding into space...

The more sensible question is what happens when you warm the detector....answer, it stops measuring radiation incoming from the atmosphere....if it is no longer measuring radiation coming from the atmosphere, what do you think is happening?...What makes more sense?.....the detector breaks and stops working at temperatures higher than the atmosphere.......or the atmosphere simply doesn't radiate to the warmer object....you can swing the uncooled detector towards warmer objects and immediately it starts detecting radiation again...so clearly the instrument is working....must be your hypothesis that isn't working.

No...crick produced observed, measured, quantified evidence of energy moving from a radiator that is warmer than -80F to a detector cooled to a temperature lower than -80F...that is not back scatter....that is energy moving from a warmer radiator to a cooler radiator...he hasn't shown any measurement of back scatter at ambient temperature... So the fact remains that zero observed, measured, quantified evidence has been shown that supports the A in AGW...

If back scatter is physical fact, why can it not be observed and measured at ambient temperature....the claim is that more than 300 W/m2 of it is coming in...why can't it be measured unless the instrument is cooled to a temperature lower than that of the atmosphere?

Thermal detectors are cooled to prevent the detector housing from interfering with the measurement. The detector will still work if its temperature is above the ambient temperature, but the temperature of the housing must be subtracted out to get the desired reading.

Tell yourself that all you like....if there were in fact 300+ W/m2 of energy radiating from the atmosphere, you would not need to worry about the detector housing interfering with the measurement...instruments certainly don't need to be cooled to measure the piddling (by comparison) 161 W/m2 of radiation coming in from the sun... the instruments must be cooled to a temperature lower than that of the target radiator so that the energy can move from the warmer radiator to the cooler instrument.

The original CMB detector in 1964 was at 4 K. It successfully used a method to subtract the 4 K housing interference to get the 2.7 K CMB. Today's detectors can drop to much lower temperatures, so that mechanism for subtraction is no longer needed.

Radio waves have no temperature....originally CMB was detected via a resonant radio signal...in order to detect actual CMB you must have an instrument cooled to about 3 K
 
Not my theory...do feel free to read what it says about everything if you like, not that I would expect one of the AGW faithful to get anything out of it regardless of the fact that it accurately predicts the temperature of every planet in the solar system while the greenhouse hypothesis can't even predict the temperature of earth without a fudge factor...

https://tallbloke.files.wordpress.com/2011/12/unified_theory_of_climate_poster_nikolov_zeller.pdf
http://www.wcrp-climate.org/conference2011/posters/C7/C7_Nikolov_M15A.pdf
One further note. Although that concept was picked up in many blogs, your first reference is a "poster presentation." At conferences most papers are orally presented to an audience. However, some papers that aren't accepted for oral presentation are allowed a poster presentation. Those are sort of like "science fair" story boards set up in the lobby of the conference center. During breaks the authors generally stand around their poster set-up and field questions. Your second reference is no doubt exactly the poster in three sections set up on a table.


So you don't like the format....ok....got any specific comments on the fact that it accurately predicts the temperature of every planet in the solar system with an atmosphere while the greenhouse hypothesis can't even predict the temperature here on planet earth without a fudge factor?
 
The more sensible question is what happens when you warm the detector....answer, it stops measuring radiation incoming from the atmosphere

Absolutely false. Anyone can now buy thermal cameras that require no cooling. Cooling just reduces the electronics noise that would otherwise swamp the signal, but with enough processing power, there are ways to filter that noise out, which is exactly what the new cameras do.

So, that kook claim of yours is conclusively debunked, and with it, your whole argument collapses. Thanks for playing, we have some lovely parting gifts for you.

Idiot....thermal cameras work via temperature measurements of an internal thermopile...and before you bring them up, FLIR cameras that are able to detect bodies that are cooler than the air surrounding them are, in fact, cooled....
 
The atmosphere would only achieve equilibrium if it were a static column of air...the atmosphere is the farthest thing from static...the work of pressure never stops and the heat keeps bleeding into space...
You have a bit of contradictory confusion. Yes, about the static column. Yes, the atmosphere is far from static. But that is exactly my point about why Nikolov et al. are wrong. They assume an adiabatic process, which is far from correct, since the atmosphere is chaotic between altitude levels and any possible adiabatic process would have died out eons ago.

Tell yourself that all you like....if there were in fact 300+ W/m2 of energy radiating from the atmosphere, you would not need to worry about the detector housing interfering with the measurement...instruments certainly don't need to be cooled to measure the piddling (by comparison) 161 W/m2 of radiation coming in from the sun... the instruments must be cooled to a temperature lower than that of the target radiator so that the energy can move from the warmer radiator to the cooler instrument.
Radiation can move anywhere.

Radio waves have no temperature....originally CMB was detected via a resonant radio signal...in order to detect actual CMB you must have an instrument cooled to about 3 K
Nevertheless, Penzias and Wilson did measure the 2.7 K CMB in 1964 with an instrument cooled to 4 K. That is a measurable, repeatable, observable experiment showing that a detector warmer than the source will work.

You are right that radio waves have no temperature. Neither do photons that carry the visible and UV from the hot sun. That EM energy is just the medium for exchange of energy from one body to another. A high power laser can melt steel, but the narrow frequency band has no temperature of its own either.

So, as you say, if radio waves have no temperature, why do you think they are forbidden from hitting anything at any other temperature? Surely those EM waves that have lost all information of their origin have no restrictions except to move in the original direction until absorbed. Freely moving photons no longer have a connection with the 2nd law of thermodynamics.

This is the connection of radiation physics with the 2nd law: a hotter body will emit more EM energy than a colder body. It follows that the net energy flow will always be from the hotter body to the colder one, just as the second law says.
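That last paragraph in numbers; a sketch assuming blackbody surfaces, with the temperatures below chosen only for illustration:

```python
# Sketch of the paragraph's point: both bodies emit, so photons travel both
# ways, but the hotter one emits more, making the NET flow hot -> cold.
# Blackbody surfaces and the temperatures below are illustrative.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def net_flux(t_hot_k, t_cold_k):
    """Net radiative exchange between two blackbody surfaces, W/m^2."""
    return SIGMA * (t_hot_k ** 4 - t_cold_k ** 4)

# 310 K skin facing a 273 K ice surface: net flow is skin -> ice...
print(round(net_flux(310.0, 273.0)))   # ~209 W/m^2, positive
# ...but the ~315 W/m^2 the ice emits is real; facing 2.7 K space instead,
# the skin would lose nearly the full ~524 W/m^2.
print(round(net_flux(310.0, 2.7)))
```

The second law constrains only the net: it is always from hot to cold, even though both fluxes exist.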
 
The more sensible question is what happens when you warm the detector....answer, it stops measuring radiation incoming from the atmosphere

Absolutely false. Anyone can now buy thermal cameras that require no cooling. Cooling just reduces the electronics noise that would otherwise swamp the signal, but with enough processing power, there are ways to filter that noise out, which is exactly what the new cameras do.

So, that kook claim of yours is conclusively debunked, and with it, your whole argument collapses. Thanks for playing, we have some lovely parting gifts for you.
Right you are.

PHOTONIC FRONTIERS: ROOM-TEMPERATURE IR IMAGING: Microbolometer arrays enable uncooled infrared camera

An attractive alternative for such LWIR applications is detecting the absorption of radiant heat rather than photons. Radiant heat sensors, called bolometers, have long been used for IR measurements. Now arrays of many thermally isolated microbolometers can record images in the thermal IR. Crucially, microbolometers do not require cooling, so this reduces their cost, size, and complexity.

http://www.laserfocusworld.com/arti...r-arrays-enable-uncooled-infrared-camera.html

This totally screws up SSDD's ill-conceived objection about detecting "resonance frequencies". Bolometers directly detect heat, not individual radio frequencies.
 
why does it matter? The answer is that warm flows to cold and not vice versa. It is true physics. Testable as well. you post up the experiment that shows an ice cube warming a pan at room temperature.

Why does it matter? Good question. Were you to try - and inevitably fail - to answer it, you'd realize a major misunderstanding of science that currently hampers your understanding of the earth's climate system.

Let's make this (thought) experiment instead. Assume you fly at a fixed point one light-minute above the sun's surface. Sure, you'd be hit with radiation from the sun, right? Now, I replace you with your likeness at the same place earlier occupied by you, only your likeness is ten times hotter than the sun.

According to your theory, the sun would no longer radiate in the direction of your likeness. So, what happens to one minute's worth of the sun's radiation that was sent out when the sun still thought it was sending radiation towards a cooler object (you), but which now heads towards a much hotter object - your incredibly hot likeness (remember, you were one light-minute above the sun)? Does the sun call back the radiation?
Hahahaha Hahahaha, dude way too hard. Relax, you don't have it. I knew it.

Now, take an ice cube and hold it in your hand, does your hand get warm or cold?


A more realistic experiment would be to close your eyes in a heated room on a cold night. Put your flat hand close to the wall but not touching. As you circle the room, I guarantee most people could pick out the outside wall(s). Why? Because the amount of radiant energy received by the hand would be different from the cooler outside walls than from the warmer inner walls.

Likewise, the same type of experiment could pick out warmer southern facing walls on a hot summer day.
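The difference the hand feels can be estimated from the Stefan-Boltzmann law; the 20 C / 15 C wall temperatures below are illustrative:

```python
# Sketch of the wall experiment: how much less radiant energy arrives from
# a cooler outside wall. The 20 C / 15 C temperatures are illustrative.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def wall_irradiance(t_celsius):
    """Blackbody flux from a wall surface at the given Celsius temperature."""
    return SIGMA * (t_celsius + 273.15) ** 4

inner = wall_irradiance(20.0)   # ~419 W/m^2 from a 20 C interior wall
outer = wall_irradiance(15.0)   # ~391 W/m^2 from a 15 C outside wall
print(round(inner - outer))     # ~28 W/m^2 less reaching the hand
```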
 
The more sensible question is what happens when you warm the detector....answer, it stops measuring radiation incoming from the atmosphere

Absolutely false. Anyone can now buy thermal cameras that require no cooling. Cooling just reduces the electronics noise that would otherwise swamp the signal, but with enough processing power, there are ways to filter that noise out, which is exactly what the new cameras do.

So, that kook claim of yours is conclusively debunked, and with it, your whole argument collapses. Thanks for playing, we have some lovely parting gifts for you.
Right you are.

PHOTONIC FRONTIERS: ROOM-TEMPERATURE IR IMAGING: Microbolometer arrays enable uncooled infrared camera

An attractive alternative for such LWIR applications is detecting the absorption of radiant heat rather than photons. Radiant heat sensors, called bolometers, have long been used for IR measurements. Now arrays of many thermally isolated microbolometers can record images in the thermal IR. Crucially, microbolometers do not require cooling, so this reduces their cost, size, and complexity.

http://www.laserfocusworld.com/arti...r-arrays-enable-uncooled-infrared-camera.html

This totally screws up SSDD's ill-conceived objection about detecting "resonance frequencies". Bolometers directly detect heat, not individual radio frequencies.
I love it. Some people will believe anything.
 
why does it matter? The answer is that warm flows to cold and not vice versa. It is true physics. Testable as well. you post up the experiment that shows an ice cube warming a pan at room temperature.

Why does it matter? Good question. Were you to try - and inevitably fail - to answer it, you'd realize a major misunderstanding of science that currently hampers your understanding of the earth's climate system.

Let's make this (thought) experiment instead. Assume you fly at a fixed point one light-minute above the sun's surface. Sure, you'd be hit with radiation from the sun, right? Now, I replace you with your likeness at the same place earlier occupied by you, only your likeness is ten times hotter than the sun.

According to your theory, the sun would no longer radiate in the direction of your likeness. So, what happens to one minute's worth of the sun's radiation that was sent out when the sun still thought it was sending radiation towards a cooler object (you), but which now heads towards a much hotter object - your incredibly hot likeness (remember, you were one light-minute above the sun)? Does the sun call back the radiation?
Hahahaha Hahahaha, dude way too hard. Relax, you don't have it. I knew it.

Now, take an ice cube and hold it in your hand, does your hand get warm or cold?


A more realistic experiment would be to close your eyes in a heated room on a cold night. Put your flat hand close to the wall but not touching. As you circle the room, I guarantee most people could pick out the outside wall(s). Why? Because the amount of radiant energy received by the hand would be different from the cooler outside walls than from the warmer inner walls.

Likewise, the same type of experiment could pick out warmer southern facing walls on a hot summer day.
That's no different than what I stated. Take the ice cube and hold it three inches from your hand. Does your hand get warmer? Nope. So it still comes down to: the cooler atmosphere does not warm the surface.
 
SSDD doesn't seem to realize that any atmosphere is only there because of stored solar energy.

All heatsinks cause increased temperatures at some point in the system as energy input moves to energy output.
 
