I have been involved in some pretty extensive discussions with the TRCS group over the past few weeks. The posts lately have been offshoots of that discussion. The following is one I am putting together for that team as well; enjoy.
What determines how quickly the Earth loses energy? There is a simple answer and a complex answer to that. Since the Earth can only lose energy to space by infra-red (IR) emission, the simple answer is that the Earth's temperature determines the rate of energy loss, since it is temperature that determines the intensity of the IR emission, as described by the Stefan-Boltzmann law.
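The Stefan-Boltzmann law is simple enough to evaluate directly. As a minimal sketch (the two temperatures below, 255 K for the Earth's effective emission temperature and 288 K for a typical mean surface temperature, are standard textbook values, not figures from this post):

```python
# Stefan-Boltzmann law for a black body: flux F = sigma * T^4
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def sb_flux(temp_k):
    """Radiated flux (W/m^2) of a black body at temperature temp_k."""
    return SIGMA * temp_k ** 4

# Effective emission temperature (~255 K) vs. mean surface temperature (~288 K)
print(sb_flux(255.0))  # ~240 W/m^2, close to the observed global-mean OLR
print(sb_flux(288.0))  # ~390 W/m^2, what the surface alone would emit
```

The gap between those two numbers is exactly the point made below: the surface emits far more than actually escapes to space, because the atmosphere does the final emitting at a colder temperature.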
The actual temperature of the Northern Hemisphere of the Earth is shown below.
If the NH lost energy directly to space at the rates associated with those temperatures, the Earth would be a much colder place. This is where the importance of the atmosphere comes into play. It acts as an intermediary: it absorbs the higher-energy radiation from the surface but emits to space at a lower rate, and it is the temperature of the atmosphere that determines the rate of energy loss. This is easily shown by comparing the measured outgoing long-wave radiation (OLR) to the absolute surface temperature.
This clearly shows that the warmer the Earth is, the faster it is losing energy to space. The measurements fit the theory very nicely here. It is possible from this to show the correlation between the surface temperature of the NH and the OLR.
What this indicates is that for each 1 K increase in temperature, there is an associated 2.2 W/m^2 increase in the OLR. It is hard to imagine a more effective feedback mechanism for regulating the Earth's temperature. There are many reasons for this, but the best is simplicity: the warmer the Earth is, the faster it loses energy, which means it cools down faster.
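For comparison, differentiating the Stefan-Boltzmann law gives the idealized black-body version of this sensitivity, dF/dT = 4·sigma·T^3. A quick sketch (the 255 K emission temperature is a standard textbook value, not a number from this post's data):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def sb_sensitivity(temp_k):
    """dF/dT = 4 * sigma * T^3: extra radiated flux per kelvin of warming."""
    return 4.0 * SIGMA * temp_k ** 3

print(sb_sensitivity(255.0))  # ~3.8 W/m^2 per K at the effective emission temperature
```

The pure black-body value (~3.8 W/m^2 per K) and the regressed 2.2 W/m^2 per K are the same order of magnitude; they need not match exactly, since the regression relates OLR to surface temperature while emission to space happens from the colder atmosphere.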
Many warmists have noted that the annual change in OLR is smaller than the calibration error of the measurement device (a space-based satellite in this case). Fortunately, there is no need to depend on the overall annual data; all that is needed is to look at monthly data and build from the monthly changes in temperature and OLR.
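The monthly approach amounts to regressing monthly OLR anomalies against monthly temperature anomalies, where the large month-to-month swings dwarf the calibration offset. A sketch with synthetic data (the 2.2 W/m^2 per K slope is taken from the regression above; the anomaly ranges and noise level are illustrative assumptions, not the actual satellite record):

```python
import random

random.seed(42)
TRUE_SLOPE = 2.2  # W/m^2 per K, the slope found in the regression above

# 34 years of synthetic monthly temperature anomalies (K) and noisy OLR anomalies
temps = [random.uniform(-1.0, 1.0) for _ in range(34 * 12)]
olr = [TRUE_SLOPE * t + random.gauss(0.0, 0.5) for t in temps]

# Ordinary least-squares slope of OLR anomaly vs. temperature anomaly
n = len(temps)
mt = sum(temps) / n
mo = sum(olr) / n
slope = (sum((t - mt) * (o - mo) for t, o in zip(temps, olr))
         / sum((t - mt) ** 2 for t in temps))
print(round(slope, 2))  # recovers a value close to 2.2
```

Because each month contributes a data point, a fixed calibration offset only shifts the intercept; the slope relating temperature to OLR survives.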
Based on the OLR measurements, the Earth was losing 2.6 W/m^2 more over the 5-year period from 2007-2011 than it did in the 5-year period from 1979-1983. The satellite temperature difference for those two periods shows that the later period was 0.27 °C warmer. Based on the easily demonstrated temperature dependency of the OLR, there is no reason to believe that the difference is satellite calibration error (although that doesn't mean there is none).
While the Earth has been warmer over the past 10 years than it was 30 years ago, it is also losing energy at a higher rate, even though the CO2 level is higher now. Energy is what matters, and if the Earth is losing it faster now than ever before (based on the entire 34 years of satellite data), then it doesn't look like CO2 is doing a very good job at slowing the rate of energy loss. Conversely, it appears that the tried and true Stefan-Boltzmann law is working just fine.