Global Temperature Anomaly: January


The source sets I use for the Blended Temperature set have finally been updated for January 2011.  The historical behavior of the month of January is very different from the one presented by the theory of Global Warming.  There are many claims that the Earth is warming more in the Dec-Feb months, but when the actual monthly data is examined on its own, the results look distinctly different.  January is a very good example of a month that shows a very different picture of the past 160 years than the warmists paint.

The warmest January in the blended set was January of 1863, with a monthly anomaly of 0.751 °C.  Both the warmest and the coldest Januaries on record fall in that early period: the coldest was 1850, with an anomaly of -1.646 °C.  That is a difference of nearly 2.4 °C only 13 years apart.  Large variations in January temperatures are normal.  Certainly part of this could be the very limited data that was available globally during this period, but since the warmist consensus tells us that the data is good and should be trusted, I will go ahead and use it.

Here is what the past 160 years look like when shown by year.  The moving average is a 9-year average.

[Figure] Yearly January anomalies for the past 160 years, with a 9-year moving average. The biggest difference between now and then is not that we are warmer, but that there are fewer frigid Januaries now. I suppose the warmists would like more winters that are much worse than the past one.
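For reference, here is a minimal sketch of how a 9-year centered moving average like the one in the chart can be computed. The series below is synthetic, drawn from the mean and standard deviation quoted later in the post; the actual blended set is not reproduced here.

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for the yearly January anomalies (°C);
# the real blended set is not reproduced here.
rng = np.random.default_rng(0)
years = np.arange(1850, 2012)
anomalies = pd.Series(rng.normal(-0.124, 0.377, years.size), index=years)

# A 9-year centered moving average, the smoothing used in the chart.
smoothed = anomalies.rolling(window=9, center=True).mean()
print(smoothed.dropna().head())
```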

There are three periods with months as warm as the current ones: 1850-1880, 1930-1950, and finally from about 1995 to now.  It is no surprise that these periods also coincide with warm phases of the Pacific and Atlantic ocean oscillations.  The biggest difference in the first period is that very cold months were intermixed with the warm ones; both the warmest and the coldest Januaries took place back then.  There have been fewer truly frigid months in the past 100 years, but the warmest months are no different now than they were 160 years ago.  Since the warmists say that the world was better then than now, I suppose they want winters that are much colder than the one the world experienced this year.  The recent claim that 2011’s difficult January was caused by global warming looks very weak against the actual historical data, regardless of what Al Gore says.  All around, a review of the month of January is bad for the idea of global warming.

From a statistical point of view, the average anomaly for the past 160 years is -0.124 °C and the standard deviation is 0.377 °C, so any anomaly between -0.879 and 0.630 °C is within 2 standard deviations.  In a true process-control situation 3-sigma is used, but since climate science generally uses 2-sigma I will stick with that standard.  By that standard only two years are 2-sigma warm: 2007 and 1863, the all-time warmest.  There were many 2-sigma cold years before 1880, though.  So the variation of cold Januaries is by far the most significant change.
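A short sketch of that 2-sigma check, again on a synthetic stand-in series; only the quoted mean and standard deviation come from the post.

```python
import numpy as np

# Synthetic stand-in for the yearly January anomalies (°C).
rng = np.random.default_rng(0)
years = np.arange(1850, 2011)
anomalies = rng.normal(-0.124, 0.377, years.size)

mean = anomalies.mean()
sigma = anomalies.std()
# For the post's numbers these come out near 0.630 and -0.879 °C.
upper, lower = mean + 2 * sigma, mean - 2 * sigma

print("2-sigma warm:", years[anomalies > upper])  # the post finds only 1863 and 2007
print("2-sigma cold:", years[anomalies < lower])  # the post finds these cluster before 1880
```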

More interesting is that over the 60-year period from 1925 to 1985, when CO2 was steadily increasing, there is no warming at all in the data.  The trend is perfectly flat for that entire period, even as CO2 climbed from what is considered a natural level to 340 ppm.  Some would argue that it takes time for CO2 to have an effect, but that isn’t how radiative heat transfer works.

[Figure] There is no trend at all in the data over the period when CO2 levels started to increase.
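A flat sub-period trend like this can be checked with an ordinary least-squares fit over the window in question. The sketch below uses the same synthetic stand-in data as above, so only the method is illustrated, not the post's actual result.

```python
import numpy as np

# Synthetic stand-in for the yearly January anomalies (°C).
rng = np.random.default_rng(0)
years = np.arange(1850, 2011)
anomalies = rng.normal(-0.124, 0.377, years.size)

# Least-squares line over 1925-1985; a slope near zero corresponds
# to the "perfectly flat" trend described in the text.
mask = (years >= 1925) & (years <= 1985)
slope, intercept = np.polyfit(years[mask], anomalies[mask], 1)
print(f"1925-1985 trend: {slope * 10:+.3f} °C per decade")
```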

Only if the past 20 years are included does any significant trend show up in the data.  Perhaps even less shocking is that the satellite data does not show the same trend that the other data sets do.  So in the one period that actually shows warming, satellite data is available and does not show the same results; the warming shows up primarily in the station data sets.  This is precisely why so many people are skeptics.  The data just isn’t there to support the claim that significant change has happened in the Earth’s climate.  What is visible is natural variation.  That there is less variation now than there was previously is easily explained by the limitations of data from 150 years ago.

[Figure] Using the same scale, the variation from 1985 to now is insignificant, especially since the warmest year was 2007 and the behavior since then has not been warming despite CO2 levels increasing.

From a historical perspective there is no evidence that the month of January is warmer now than it was 150 years ago.  A chart will show that the average is warmer, but only because the older data includes very cold months.  That the past 4 years appear to be reversing the warming is all the more reason to be skeptical of the theory of global warming.

That this is the third period of “warmer” months in the past 160 years is another reason to consider that there is real and natural variation in the Earth’s climate.  Since the warm Januaries of the first two periods could not have been caused by CO2 emissions, there is strong evidence that natural variation is a very important part of the Earth’s climate.  Saying otherwise is denying reality.

Posted in Anomaly by inconvenientskeptic on February 22nd, 2011 at 12:49 am.


This post has 4 comments

  1. intrepid_wanders Feb 22nd 2011

    John,

    Question: if there cannot be an agreed-upon thesis of the unit, is the rest technically superfluous?

    “In chemistry, standard condition for temperature and pressure (informally abbreviated as STP) are standard sets of conditions for experimental measurements, to allow comparisons to be made between different sets of data. The most used standards are those of the International Union of Pure and Applied Chemistry (IUPAC) and the National Institute of Standards and Technology (NIST), although these are not universally accepted standards. Other organizations have established a variety of alternative definitions for their standard reference conditions. The current version of IUPAC’s standard is a temperature of 0 °C (273.15 K, 32 °F) and an absolute pressure of 100 kPa (14.504 psi, 0.986 atm),[1] while NIST’s version is a temperature of 20 °C (293.15 K, 68 °F) and an absolute pressure of 101.325 kPa (14.696 psi, 1 atm). International Standard Metric Conditions for natural gas and similar fluids [2] is 288.15 Kelvin and 101.325 kPa.”
    http://en.wikipedia.org/wiki/Standard_conditions_for_temperature_and_pressure

    I am approaching this from a metrology standpoint, and if I did not have a NIST-traceable standard, my reference would have to have parameters based off of a NIST-traceable standard. I worked with Neutron Transmutation Doped standards (uniform CC characteristics), which in principle are good from a diffusion standpoint, but tested poorly on a capacitance/voltage sample depth/ramp (on straight I/V, very stable). Maybe stray capacitance in atmosphere, contact conditions… dunno.

    I am curious about your “proxy” evaluation…

  2. inconvenientskeptic Feb 22nd 2011

    Intrepid,
    The blended set is the combined anomaly of the CRU, GHCN, RSS and UAH temperature anomalies. There is no actual proxy.

    They each have monthly data (although of different lengths). One argument for anomaly data is that it allows multiple sets to be combined. That is how the different sets function; I just combined them into a single one.

    I understand that there is some difference in the sets, but I estimate that a multi-set combination will be more accurate than any single set.

    There is an earlier post about the blended set.
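    A minimal sketch of the kind of blending described above, using hypothetical monthly series of different lengths; the real CRU/GHCN/RSS/UAH data is not reproduced here.

    ```python
    import pandas as pd

    # Two hypothetical monthly anomaly series of different lengths,
    # standing in for sets like CRU or UAH.
    months = pd.period_range("1979-01", "1979-06", freq="M")
    set_a = pd.Series([0.10, 0.20, 0.00, -0.10, 0.30, 0.20], index=months)
    set_b = pd.Series([0.00, 0.10, 0.10, -0.20, 0.20], index=months[:5])

    # The row-wise mean skips missing months, so sets of different
    # lengths can still be averaged into one blended anomaly.
    blended = pd.concat({"a": set_a, "b": set_b}, axis=1).mean(axis=1)
    print(blended)
    ```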

  3. intrepid_wanders Feb 23rd 2011

    I apologize, John, I was not questioning your datasets. I am making a silly high-school statement about the enormous fuss made over something for which we, in all reality, have no real measurement standard. We do have the coefficient of heat expansion, which works well for TCs, but the re-calibrations that I experienced with these devices were quite frequent.

    Again, I agree with your presentation analysis, and find it refreshing that you approach the data in the same format I would. Data is data.

  4. inconvenientskeptic Feb 24th 2011

    Intrepid,

    No worries. The theory goes that with enough devices the errors will, on average, cancel out, as calibration issues should be random.

    Thermocouples are different, but the ones I use in my industry are very stable over years at up to 1000 °C. So they can also be reliable, but the high-end ones are expensive.

    I don’t worry much about calibration, as there are thousands of measurements involved.
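    A quick illustration of the averaging argument above: if each device's calibration offset is random, the mean of many readings converges on the true value (the scatter of the mean shrinks like 1/sqrt(n)). The numbers here are invented for the demonstration.

    ```python
    import numpy as np

    # Random per-device calibration offsets around a known true value.
    rng = np.random.default_rng(0)
    true_value = 15.0
    for n in (10, 100, 10_000):
        readings = true_value + rng.normal(0.0, 0.5, n)  # 0.5 °C random offsets
        print(f"n={n:>6}: mean = {readings.mean():.3f}")  # drifts toward 15.0 as n grows
    ```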
