In Part 1 I showed how they use a sneaky 'diagonal comparison' (i.e. they don't compare like-with-like) to create the illusion that Earth is warmer than it should be. Part 1 covers the balance between Earth's system and the Sun/outer space.
Then there's what goes on within the Earth's system itself. The other sneaky thing they do is to say that the atmosphere is far warmer than it is; it's just a straight lie. From NASA's Earth Fact Sheet:
Terrestrial Atmosphere
Surface pressure: 1014 mb
Surface density: 1.217 kg/m³
Scale height: 8.5 km
Total mass of atmosphere: 5.1 x 10^18 kg
Total mass of hydrosphere: 1.4 x 10^21 kg
Average temperature: 288 K (15°C)
Say what?
Imagine you are asked to measure the average temperature of the water in a deep lake. If you just take the surface temperature, you might get ~288K. But that's not the average temperature of all the water in the lake. By and large it gets colder as you go down, so the true average is much lower.
The reverse applies in the troposphere (the lowest ~11km of the atmosphere). This is the bit we are interested in. It's where the weather happens and the layer which warms and cools the surface.
It gets cooler as you go up, so if you only measure the temperature in the warmest layer, at or slightly above sea-level (where most measuring stations are), you will get an artificially high average temperature (i.e. ~288K).
~288K is a fair estimate of the average surface temperature, but that's something completely different to the average temperature of the air in the troposphere. That's a lot colder. If you take a fair sample of readings at all altitudes, you get ~255K, which is not coincidentally the temperature we expect from looking at the Earth vs Sun/outer space balance. See also Climatologists are Flat Earthers.
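As a rough back-of-envelope check on that claim (the ~6.5 K/km figure is my assumption, a typical environmental lapse rate, not from the post): if temperature falls roughly linearly from the surface to the top of the troposphere, the average of the column is just the mid-altitude temperature.

```python
# Back-of-envelope: mean temperature of a linearly cooling troposphere.
# Assumed value (not from the post): environmental lapse rate ~6.5 K/km.
T_surface = 288.0   # K, average surface temperature
lapse_rate = 6.5    # K per km, typical environmental lapse rate
depth = 11.0        # km, approximate depth of the troposphere

# For a linear profile, the average equals the temperature at half depth.
T_mean = T_surface - lapse_rate * depth / 2
print(T_mean)  # -> 252.25
```

~252K, within a few degrees of the ~255K effective temperature the post refers to.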
The vertical temperature gradient is no mystery. Basic maths, a rudimentary understanding of the Gas Laws and common sense (principles and worked example) tell us that it must be warmer than the ~255K average at sea level and colder than the ~255K average at the tropopause. They worked this out in the 19th century and it was part of normal physics textbooks until a few decades ago. There's a given amount of thermal energy, and gravity and the Gas Laws constantly recycle it downwards.
The precise temperature gradient (aka 'lapse rate') is primarily the trade-off between thermal energy (temperature) and potential energy (altitude). We all know that warm air cools as it rises. Energy cannot be created or destroyed, so what happens to the 'lost' thermal energy? Easy: air loses thermal energy as it rises... and gains potential energy. The reverse happens with Chinook and Föhn winds (Föhn is a German word, pronounced roughly 'fern', and is also the German name for a hand-held hair dryer), when falling air warms up. So the lapse rate = gravity ÷ the specific heat capacity of 'air'.
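Plugging in the standard textbook values (my figures for g and the specific heat of dry air, not the post's), the gravity-divided-by-heat-capacity formula gives the dry adiabatic lapse rate:

```python
# Dry adiabatic lapse rate = g / cp (the formula stated in the post).
g = 9.81      # m/s^2, gravitational acceleration
cp = 1005.0   # J/(kg*K), specific heat of dry air at constant pressure

lapse = g / cp                  # K per metre of ascent
print(round(lapse * 1000, 2))   # in K per km -> 9.76
```

About 9.8 K/km for dry air; as the next paragraph notes, latent heat from water vapour reduces the real-world figure to roughly 6-7 K/km.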
(The lapse rate is reduced by latent heat, which has the opposite effect. The surface is cooled when water evaporates, and the latent heat manifests itself again higher up when the water vapour condenses. The latent heat in one gram of water vapour is enough to warm a cubic metre of air by about 2 degrees, which is a lot.)
------------------------------------------
The AGW theorists make great play of the fact that Earth's surface (being ~288K, not ~255K) radiates ~390 W/m2 but only ~240 W/m2 gets to space. They claim that the missing ~150 W/m2 is trapped by 'greenhouse gases'. This is part of Diagonal Comparison #1. Two-thirds of the surface doesn't radiate directly to space because it's covered by clouds; some of the surface radiation is reflected back down (in a quite literal sense, like clouds reflecting visible light) and the clouds themselves emit the required ~165 W/m2 to space. The average emitted to space ≈ 240 W/m2, which is what Earth receives from the Sun.
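The arithmetic in that paragraph can be checked directly. The Stefan-Boltzmann law links the two temperatures to the two radiation figures, and a two-thirds/one-third weighted average reproduces the ~240 W/m² going to space (the two-thirds cloud-cover split is the post's own figure):

```python
# Cross-check the post's radiation figures.
sigma = 5.67e-8   # W/(m^2*K^4), Stefan-Boltzmann constant

# A 288 K surface radiates ~390 W/m^2; a 255 K body radiates ~240 W/m^2.
print(round(sigma * 288**4))  # -> 390
print(round(sigma * 255**4))  # -> 240

# Weighted average to space: one-third clear sky radiating the surface's
# 390 W/m^2, two-thirds cloud tops radiating 165 W/m^2.
to_space = (1/3) * 390 + (2/3) * 165
print(round(to_space))  # -> 240
```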
The AGW steamroller never stops, of course. For sure clouds reflect some infrared radiation back down (which is why a cloudy night is warmer than a clear night), but clouds don't 'trap' radiation or warm the surface overall; on the whole, it's cooler if it's cloudy (there's no 'positive feedback'). And clouds certainly do not warm the atmosphere overall: the extra warmth under a cloud is equal and opposite to the missing warmth above it.
Radiation isn't pollution like plastic in the oceans; it can transform into other forms of energy instantaneously. Trying to account for it is like trying to catch sunshine in your hands. You cannot add, subtract or multiply 'radiations'; the maths is insane, and entirely unnecessary for explaining and understanding the basic equilibrium position with temperatures etc. You need to bring in radiation to reconcile the warming effect of Ozone Depletion, but that's another story...
Tuesday, 4 May 2021
AGW theory is based on two blatant 'diagonal comparisons' (Part 2)
Posted by Mark Wadsworth at 19:19
Labels: diagonal comparison, global warming, Physics, Science
2 comments:
What happened to the W/m2 emitted by the area below the cloud cover, less the W/m2 emitted by the clouds above the cloud-covered area? 390 W/m2 - 165 W/m2 = 225 W/m2.
Din, what do you think happens? What is the relevance when we know the outcome?