Surface Evaporation

IanC

Sep 22, 2009
From Spencer's blog -

I frequently see the assertion made that infrared (IR) radiation cannot warm a water body because IR only affects the skin (microns of water depth), whereas solar radiation is absorbed over water depths of meters to tens of meters.

Before discussing the issue, though, we first must agree that temperature and temperature change of a body is related to rates of energy gain and energy loss by that body. If we cannot agree on the basic concept that temperature changes when energy gain does not equal energy loss, then there is no basis for further discussion.

If the surface of a water body is emitting IR, then IR must be part of its energy budget, and therefore of its temperature. Evaporation only occurs at the skin, and we know that evaporation is the major component of heat loss by water bodies. How is it that evaporation can perform this function, and IR cannot?

The temperature of land clearly is affected by IR, and that only occurs at the surface of the soil. So, how can IR affect land temperature and not ocean temperature?

If you claim that any additional IR (say, due to increasing carbon dioxide) is immediately lost by the water body through evaporation, how exactly does that occur? The surface doesn’t know why it has the temperature it does, it will evaporate water based (partly) on surface temperature, and it does not distinguish where the heat comes from (solar radiation from above, mixing from below, IR from above, sensible heat flux across the air/water interface). To claim that any energy gain from IR is immediately lost by evaporation is just an assertion.

NEVERTHELESS…

It might well be that solar radiation is more efficient (on a Watt per Watt basis) than IR radiation at changing ocean temperature. In other words, that IR warming of a water body is more likely to be lost through evaporation, since its warming effect does occur only at the surface, and so that energy is more likely to be lost through evaporation than absorbed solar radiation would be.

The effect would be greatest during low wind conditions, when vertical mixing of the water is weakest.

From what little I know of ocean modelling, this potential effect – which I presume is what people are talking about — is not taken into account. I suspect it would have to be parameterized, because it occurs on such a small scale. As far as I know, a Watt of IR is treated the same as a Watt of solar in the energy budget of the top ocean layer in a model.

I would like to hear what others know about this issue. I suspect it is something that would have to be investigated with a controlled experiment of some sort.

End quote



This is a simple yet incredibly important part of climate physics, yet it is not 'settled science' or even modeled in the GCMs. The one point I am especially intrigued by is whether one watt of solar is equivalent to one watt of IR.
 

Of course it's the same. The difference is that each IR photon carries less energy, so you need more photons (or a wider frequency range) with IR to deliver the same energy as you would at a shorter wavelength, since E = h*c/λ.
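To put numbers on the E = h*c/λ point, here is a minimal sketch (the 500 nm and 10 µm wavelengths are illustrative choices for visible light and thermal IR):

```python
# Energy per photon E = h*c/lambda: a watt is a watt, but IR needs
# many more photons per joule than visible light does.
h = 6.626e-34  # Planck constant, J*s
c = 2.998e8    # speed of light, m/s

def photon_energy(wavelength_m):
    """Energy of a single photon, in joules."""
    return h * c / wavelength_m

E_vis = photon_energy(0.5e-6)   # 500 nm, visible
E_ir  = photon_energy(10e-6)    # 10 um, thermal IR

# Photons per second needed to deliver 1 W at each wavelength:
n_vis = 1.0 / E_vis
n_ir  = 1.0 / E_ir
ratio = n_ir / n_vis  # IR needs 20x more photons per watt here
```

The ratio is simply the ratio of the wavelengths (10 µm / 0.5 µm = 20), so per watt the IR flux carries 20 times as many photons in this example.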

Liquid water has a much higher attenuation coefficient for IR than for shorter wavelengths:
[Figure: absorption coefficient of liquid water vs. wavelength]


Visible region
Visible light absorption spectrum of pure water (attenuation coefficient vs. wavelength). [16][21][3]

Very weak light absorption, in the visible region, by liquid water has been measured using an integrating cavity absorption meter (ICAM).[16] The absorption was attributed to a sequence of overtone and combination bands whose intensity decreases at each step, giving rise to an absolute minimum at 418 nm, at which wavelength the attenuation coefficient is about 0.0044 m−1, meaning that energy of 418 nm light drops to one thousandth after traveling about 1570 metres in water.
It's no problem to calculate the increase in temperature if you know the watts and the mass (of water).
Obviously the mass that absorbs a given number of watts of visible light is much greater than the mass of H2O that absorbs the same watts of IR, because the shorter-λ light penetrates much deeper.
Which in turn means that you heat more mass but the temperature increase is lower, while with IR the same number of watts heats less mass, but as a consequence to a higher temperature.
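A rough sketch of that point, using the 0.0044 m⁻¹ attenuation coefficient quoted above for 418 nm light and an illustrative (assumed) ~1e5 m⁻¹ for 10 µm IR, with Beer-Lambert absorption and ΔT = Q/(m·c):

```python
import math

# Same energy absorbed over very different depths gives very
# different temperature rises in the absorbing layer.

def e_folding_depth(alpha):          # depth where intensity falls to 1/e
    return 1.0 / alpha

def depth_to_fraction(alpha, frac):  # depth where intensity falls to frac
    return -math.log(frac) / alpha

def temp_rise(energy_j, depth_m, area_m2=1.0, rho=1000.0, c_p=4186.0):
    """Warming (kelvin) of the water layer that absorbed the energy."""
    mass = rho * area_m2 * depth_m   # kg
    return energy_j / (mass * c_p)

d_vis = depth_to_fraction(0.0044, 0.001)  # ~1570 m for the visible minimum
d_ir  = e_folding_depth(1e5)              # ~10 um for thermal IR

# 1 J absorbed per m^2 into each layer:
dT_vis = temp_rise(1.0, d_vis)
dT_ir  = temp_rise(1.0, d_ir)
```

The same joule warms the thin IR-absorbing skin by roughly eight orders of magnitude more than the deep visible-light column, which is exactly the "less mass, higher temperature" point.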

If you have enough mechanical mixing action to avoid thermal stratification, then the end result will be the same, and it would not matter whether an equal number of watts came from IR or from UV/visible.

If not, then the warmer top layer which was heated with IR will have a higher vapor pressure and a higher evaporation rate...
...losing heat.
It will also radiate more watts in proportion to T to the fourth power...again losing more heat (than it would at a lower T).

So in the final analysis, 1 watt of IR is not as efficient at heating water as 1 watt at a shorter wavelength.
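The "T to the fourth power" point can be put in numbers with the Stefan-Boltzmann law; a minimal sketch treating the skin as a black body, with temperatures chosen purely for illustration:

```python
# Extra power radiated by a slightly warmer skin layer,
# assuming emissivity ~1 (black-body approximation).
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiated_power(T):
    """Black-body emission in W/m^2 at temperature T (kelvin)."""
    return SIGMA * T**4

T0 = 290.0   # well-mixed surface temperature, K
T1 = 290.5   # skin warmed 0.5 K by absorbed IR
extra = radiated_power(T1) - radiated_power(T0)  # extra W/m^2 lost
```

A skin only half a kelvin warmer radiates roughly an extra 2.8 W/m², which is why heat deposited right at the surface is shed more readily than heat deposited at depth.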
 
So -- all that PBear said appears to be true.. Here's another part. All IR exchanges are calculated as BIDIRECTIONAL (actually omnidirectional, but we live in an up-and-down world). So in terms of energy wavelength conversion -- the land and the ocean do SIMILAR but different things.

In the ocean, the wider band solar radiation creates a deeper heating. But it can't pass converted IR to the atmos budget thru deep water. It CAN however "convect" that heat to the surface where it does contribute to the UP direction IR exchange. Both the ocean and the land are net UP heat flows wrt GH IR warming. But this skin/depth issue makes the ocean a much larger UP contributor than the land IN TERMS OF THE GHouse induced IR THERMAL FLOW.. No conclusion based on OVERALL thermal flow -- but for GH heating, it's pretty apparent. IMO

Here's a question.. Actually hadn't considered this before.. But how much REFLECTION of short wave comes from BELOW the surface skin of water?? We KNOW it's coming from relatively deep water otherwise we couldn't see below the surface. And how much does the heat lapse rate AFFECT this amount of reflection..
Remember -- we are looking for incredibly SMALL effects in terms of W/m2 heating..
 

These are a lot of different issues.
Let's start with the specific heat of water, which is close to 1 cal/(g·°C), and compare that with "land".
Specific Heat of some common Substances

Sand is 0.1 cal (0.42 watt-seconds) per gram and degree C; rocks would not be much more than that. It only takes 0.22 calories (= 0.92 watt-seconds) to heat a gram of asphalt by 1 °C, as in these "urban heat islands".
So we have a rough idea for deserts and cities. For the rest of it, let's use wet mud, which is listed at 0.6 cal per gram and degree C.
All are significantly lower than water, and neither IR nor short-wave light will penetrate any of them to the depths that short-wave light can penetrate water.
So the spectral nature of the incident radiation does not make as much of a difference with any of these ("land") materials as it does when the material is water....
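The specific heats quoted above can be compared directly; a small sketch converting them to joules and asking how far one joule warms a gram of each material:

```python
# Temperature rise per joule for 1 g of each material, using the
# specific heats quoted in the text (cal/(g*C) converted to J/(g*C)).
CAL = 4.184  # joules per calorie

specific_heat_j = {  # J/(g*C)
    "water":   1.00 * CAL,
    "wet mud": 0.60 * CAL,
    "asphalt": 0.22 * CAL,
    "sand":    0.10 * CAL,
}

def temp_rise_per_joule(material, grams=1.0):
    """Kelvin of warming per joule absorbed by `grams` of material."""
    return 1.0 / (specific_heat_j[material] * grams)

# Gram for gram, sand warms 10x faster than water per joule:
ratio = temp_rise_per_joule("sand") / temp_rise_per_joule("water")
```

This is why land surface temperatures respond so much faster to radiative input than a well-mixed water column does, quite apart from the penetration-depth issue.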
With water it gets a lot more complicated than that, for a number of reasons.
"Climate science" does consider the albedo for water, which is kind of laughable, because when it comes to water and collimated (sun)light penetration, what matters is the reflectivity, i.e. specular reflection.
Reflectivity - Wikipedia, the free encyclopedia
Specular reflection from a body of water is calculated by the Fresnel equations.[6]



[Figure: partial transmission and reflection amplitudes of a wave traveling from a low to high refractive index medium]




Fresnel reflection is directional and therefore does not contribute significantly to albedo which is primarily diffuse reflection.
A real water surface may be wavy. Reflectivity assuming a flat surface as given by the Fresnel equations can be adjusted to account for waviness.
...and as you know, sunlight is collimated, not diffuse light.
So now the angle of incidence, which changes by 15 degrees per hour over a particular spot and varies with latitude, really matters!

Here is a graph:
[Figure: reflectivity of water vs. angle of incidence]


Hard to see, but at around 50 degrees off a vertical line the reflectivity of water has already doubled, and it climbs steeply from there toward grazing incidence.
Translate that into time: only for a few hours around local noon does this imaginary m^2 these climate models use get irradiated with (and absorb) the number of watts they use as an average.
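The Fresnel numbers are easy to check; a minimal sketch assuming a flat air-water interface with n ≈ 1.33 (wind, waves, and turbidity ignored):

```python
import math

def fresnel_reflectance(theta_i_deg, n1=1.0, n2=1.33):
    """Unpolarized Fresnel reflectance at an air-water interface."""
    ti = math.radians(theta_i_deg)
    tt = math.asin(n1 * math.sin(ti) / n2)   # Snell's law
    rs = ((n1 * math.cos(ti) - n2 * math.cos(tt)) /
          (n1 * math.cos(ti) + n2 * math.cos(tt))) ** 2
    rp = ((n2 * math.cos(ti) - n1 * math.cos(tt)) /
          (n2 * math.cos(ti) + n1 * math.cos(tt))) ** 2
    return (rs + rp) / 2.0                    # average of s and p

R0  = fresnel_reflectance(0)    # ~2% at normal incidence
R60 = fresnel_reflectance(60)   # ~6% at 60 degrees off vertical
R80 = fresnel_reflectance(80)   # ~35% near grazing incidence
```

At normal incidence only about 2% of collimated light is reflected; the reflectance roughly doubles around 50 degrees off vertical and then rises steeply, exceeding a third of the incident light by 80 degrees.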
Again, with any "land" surface, which is roughly "Lambertian" as opposed to water, none of that matters much. But with water it does, and over 70% of the earth's surface is water.
Who knows if "climate science" factored in any of this with their simplistic averaging...and if they did, just how could anybody "average" any of that?
Even if there were a more sophisticated model which incorporates the Fresnel equations for water reflectivity how would they deal with variables like wind, wave height & frequency, water turbidity and so on ?
Somebody like prophet Abe the 3rd, who has no clue how to calculate pH, let alone solve a Fresnel equation, would of course claim that they did.
I'm quite certain that they did not, because such a model would also be quite good at making accurate short-term forecasts for evaporation, overcast and precipitation....and none of the existing models perform adequately in that area.
 
thanks, good answers which bring up even more questions.

my question really should have been couched in terms of entropy. one unit of solar also has a component of order which can be expended to change the degree of order somewhere else in the system, something which atmospheric IR cannot do to the surface. at this time I can mostly only come up with biological examples, and a few chemical ones. I would be pretty surprised if entropy wasn't powering some change in the thermal equilibria somewhere though.
 

And it does. We do know that the average rate of energy capture by photosynthesis globally is approximately 130 terawatts
Photosynthesis - Wikipedia, the free encyclopedia
Today, the average rate of energy capture by photosynthesis globally is approximately 130 terawatts,[8][9][10] which is about six times larger than the current power consumption of human civilization.[11] Photosynthetic organisms also convert around 100–115 thousand million metric tonnes of carbon into biomass per year.[12][13]
My take on the "also" is that the other 100–115 thousand million metric tonnes of carbon converted into biomass was not included in the photosynthesis which consumes 130 terawatts.
So I'm going to assume that these 130 terawatts which go into photosynthesis are for land surfaces and do not include the oceans...which "also convert around 100–115 thousand million metric tonnes of carbon into biomass per year".
Per square meter (land surface) that would be an energy consumption of about 0.85 watts.
Not insignificant in magnitude compared to the 1.8 W/m^2 CO2 radiative forcing which the IPCC is using.
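The per-square-meter figure is a one-line arithmetic check; the land area used here (~1.49e14 m², a rounded standard value) is my assumption:

```python
# 130 TW of photosynthetic energy capture spread over Earth's land area.
power_w = 130e12          # global photosynthesis, watts (from the text)
land_area_m2 = 1.49e14    # Earth's land area, m^2 (assumed rounded value)

per_m2 = power_w / land_area_m2  # roughly 0.87 W/m^2
```

Depending on the exact land-area figure used, this lands in the 0.85-0.9 W/m² range, consistent with the number quoted above.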
 
