How Likely Is The Observed Recent Warmth?


With 2015 having now come to completion, the preliminary numbers suggest that it will, by a substantial margin, be the new record-holder, the warmest year in recorded history for both the globe and the Northern Hemisphere. The title was sadly short-lived for previous record-holder 2014. And 2016 could be yet warmer if the current global warmth persists through the year.

One might well wonder: just how likely is it that we would be seeing this sort of streak of record-breaking temperatures if not for human-caused warming of the planet?

Precisely that question was posed by several media organizations a year ago, in the wake of the then-record 2014 temperatures. Various press accounts reported odds anywhere from 1-in-27 million to 1-in-650 million that the observed run of global temperature records (9 of the 10 warmest years and 13 of the 15 warmest years each having occurred since 2000) might have resulted from chance alone, i.e. without any assistance from human-caused global warming.

My colleagues and I suspected the odds quoted were way too slim. The problem is that each year was treated as though it were statistically independent of neighboring years (i.e. that each year is uncorrelated with the year before it or after it), but that's just not true. Temperatures don't vary erratically from one year to the next. Natural variations in temperature wax and wane over a period of several years.

For example, we've had a couple very warm years in a row now due in part to El Niño-ish conditions that have persisted since late 2013, and it is likely that the current El Niño event will boost 2016 temperatures as well. That is an example of a natural variation that is internally generated. There are also natural variations in temperature that are externally caused or 'forced', e.g. the multi-year cooling impact of large, explosive volcanic eruptions like the 1991 Mt. Pinatubo eruption, or the small-but-measurable changes in solar output that occur on timescales of a decade or longer. Each of these natural sources of temperature variation leads to correlations in temperature from one year to the next that would be present even in the absence of global warming. These correlations must be taken into account to get reliable answers to the questions being posed.

Let the reader be warned that we'll need to get a bit wonkish now to explore this in further depth.

The particular complication at hand is referred to, in the world of statistics, as "serial correlation" or "autocorrelation". In this case, it means that the effective size of the temperature dataset is considerably smaller than one would estimate based purely on the number of years available. There are N=136 years of annual global temperature data from 1880-2015. However, when the natural correlations between neighboring years are accounted for, the effective size of the sample is a considerably smaller N'~30. That means that warm and cold periods tend to occur in stretches of roughly 4 years at a time. Runs of several cold or warm years are far more likely to happen based on chance alone than one would estimate under the incorrect assumption that natural temperature fluctuations are uncorrelated from one year to the next.
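For readers who want to see the effect concretely, here is a minimal Python sketch of how serial correlation shrinks the effective sample size. The toy series and the lag-1 autocorrelation `rho` are hypothetical values chosen for illustration, not quantities taken from the actual temperature data or from our study; the correction shown is the standard first-order (AR(1)) formula N' = N(1 - r)/(1 + r).

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a toy AR(1) process as a stand-in for internal
# temperature variability. rho is a hypothetical lag-1
# autocorrelation chosen purely for illustration.
N, rho = 136, 0.64
x = np.zeros(N)
for t in range(1, N):
    x[t] = rho * x[t - 1] + rng.normal()

# Estimate the lag-1 autocorrelation from the sample itself.
r1 = np.corrcoef(x[:-1], x[1:])[0, 1]

# Effective sample size under the AR(1) assumption:
# a series with positive year-to-year correlation carries
# far fewer independent pieces of information than N years.
n_eff = N * (1 - r1) / (1 + r1)
```

With a lag-1 correlation around 0.6, 136 years of data behave statistically like only a few dozen independent values, which is why naive odds calculations that treat every year as independent come out far too slim.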

One can account for such effects by using a more sophisticated statistical model that faithfully reproduces the characteristics of actual natural climate variability. My co-authors and I used such an approach to more rigorously assess the likelihood of recent runs of record-breaking temperatures. We have now reported our findings in an article just published in the Nature journal Scientific Reports. With the study having come out shortly after the New Year, we are able to update its results to include the new record 2015 temperatures.

Our approach combines information from the state-of-the-art climate model simulations used in the most recent report of the Intergovernmental Panel on Climate Change (IPCC) with historical observations of global and Northern Hemisphere (NH) average temperature. Averaging over the various model simulations provides an estimate of the 'forced' component of temperature change--the component that is driven by external natural (i.e. volcanic and solar) and human (emission of greenhouse gases and pollutants) factors.

Fig 1. Historical NH mean temperatures (black solid) along with estimated 'forced' component of temperature change (blue dashed). The difference between the two curves provides an estimate of the 'internal' variability. The post-2000 era of particular interest is denoted (vertical dashed line) as are the record-breaking years of 1998, 2005, 2010, 2014, and 2015 (circles). Here and in the figures below, temperature departures are defined relative to the long-term 1880-2015 average.

The actual NH series (updated using the preliminary 2015 value) can be compared to the model-estimated 'forced' component of temperature change alone (Figure 1). The difference between the two series provides an estimate of the purely unforced, internal component of climate variability (e.g. the component associated with internal fluctuations in temperature such as those associated with El Niño). It is that component that can be considered random, and which we represent using a statistical model.
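The separation of 'forced' and 'internal' components described above can be sketched in a few lines of Python. Everything here is synthetic and for illustration only: the simulated ensemble, the linear stand-in trend, and the noise levels are all hypothetical, not the IPCC simulations or the observed NH series.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical stack of climate-model simulations of NH mean
# temperature (rows = simulations, columns = years 1880-2015).
n_models, n_years = 40, 136
trend = np.linspace(0.0, 1.0, n_years)           # stand-in forced signal
sims = trend + rng.normal(0, 0.2, (n_models, n_years))

# Averaging across the ensemble cancels each simulation's own
# internal variability, leaving an estimate of the forced component.
forced = sims.mean(axis=0)

# Subtracting the forced estimate from an "observed" series
# (synthetic here) leaves an estimate of the internal variability.
obs = trend + rng.normal(0, 0.2, n_years)
internal = obs - forced
```

The key design point is that internal fluctuations differ from one simulation to the next, so they average toward zero across the ensemble, while the externally driven response is common to all simulations and survives the averaging.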

Using the statistical model, we generate a million alternative versions of the original series, called 'surrogates', each of which has the same basic statistical properties as the original series, but differs in its historical details, i.e. the magnitude and sequence of individual annual temperature values. Adding the forced component of natural temperature change (due to volcanic and solar impacts) to each of these surrogates yields an ensemble of a million surrogates for the total natural component of temperature variation.
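The surrogate-generation step can be sketched as follows. This is a deliberately simplified version: the input series and its parameters are made up for illustration, the statistical model here is a plain AR(1) process rather than the more carefully constructed noise model of the actual study, and the ensemble is much smaller than the million surrogates we actually used.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "internal variability" series, standing in for the observed
# NH temperature minus the model-estimated forced component.
n_years = 136
true_rho = 0.5
resid = np.zeros(n_years)
for t in range(1, n_years):
    resid[t] = true_rho * resid[t - 1] + rng.normal(0, 0.1)

# Fit a simple AR(1) model: lag-1 correlation plus innovation size.
r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
sigma = resid.std() * np.sqrt(1.0 - r1**2)

# Generate surrogates: alternative histories that share the same
# basic statistical properties as the original series but differ
# in the magnitude and sequence of individual annual values.
n_surr = 10_000   # the study used one million
surr = np.zeros((n_surr, n_years))
for t in range(1, n_years):
    surr[:, t] = r1 * surr[:, t - 1] + rng.normal(0, sigma, n_surr)
```

Each row of `surr` is one plausible alternative history of internal variability; adding the (volcanic and solar) forced component to each row would then give surrogates for the total natural component, as described above.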

These surrogates can be compared with the estimated natural component of the actual NH series as well as the full NH series itself (Figure 2). Tabulating results from the surrogates, which provide a million alternative, realistic scenarios of natural climate variability, we are able to diagnose how often a given run of record temperatures (e.g. 9 of the warmest 10 and 13 of the warmest 15 years each having occurred since 2000) is likely to have arisen naturally.
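The tabulation itself is conceptually simple, and can be sketched like this. Note the big caveat in the code: white noise is used here purely for brevity, whereas the study's surrogates carry realistic serial correlation, which makes warm runs considerably more likely than white noise would suggest.

```python
import numpy as np

rng = np.random.default_rng(2)

years = np.arange(1880, 2015)        # 1880-2014 inclusive
post_2000 = years >= 2000            # the era of interest

# Hypothetical surrogate ensemble (white noise for brevity; the
# study's surrogates include realistic serial correlation, which
# makes such runs far more likely than white noise does).
n_surr = 50_000
surr = rng.normal(size=(n_surr, years.size))

# For each surrogate, locate its 15 warmest years and count how
# many of them fall in the post-2000 era.
warmest15 = np.argsort(surr, axis=1)[:, -15:]
hits = post_2000[warmest15].sum(axis=1)

# Fraction of surrogates reproducing the observed run:
# 13 or more of the 15 warmest years occurring since 2000.
p = (hits >= 13).mean()
```

The likelihood quoted in the study is just this kind of fraction, computed over a million properly constructed surrogates rather than the toy white-noise ensemble above.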

Fig 2. Historical NH mean temperatures (black solid) along with the estimated natural component alone (black dashed) and five of the surrogates (colored curves) for the natural component.

While the precise results depend on various details of the analysis, for the most defensible of assumptions our analysis suggests that the odds are no greater than 1-in-170,000 that 13 of the 15 warmest years would have occurred since 2000 for the NH average temperature, and 1-in-10,000 for the global average temperature (even when we vary those assumptions, the odds never exceed 1-in-5000 and 1-in-1700 respectively). While not nearly as unlikely as past press reports might have suggested, the observed runs of record temperatures are nonetheless extremely unlikely to have occurred in the absence of global warming.

Updating the analysis to include 2015, we find that the record temperature run is even less likely to have arisen from natural variability. The odds are no greater than 1-in-300,000 that 14 of the 16 warmest years would have occurred since 2000 for the NH. The odds of back-to-back records (something we hadn't seen in several decades) as witnessed with 2014 and 2015 are roughly 1-in-1500.
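A back-to-back-record probability can be tabulated from a surrogate ensemble in the same spirit. Again this is only a sketch with a made-up white-noise ensemble, so the number it produces is not the study's 1-in-1500 figure; it simply shows the counting logic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy ensemble: each row is one simulated 136-year history
# (white noise for simplicity; the study's surrogates include
# serial correlation and, for Figure 3, the forced component).
n_surr, n_years = 50_000, 136
surr = rng.normal(size=(n_surr, n_years))

# "Back-to-back records" means the final two years each exceed
# every year that came before them.
rec_penult = surr[:, -2] > surr[:, :-2].max(axis=1)
rec_final = surr[:, -1] > surr[:, :-1].max(axis=1)
p = (rec_penult & rec_final).mean()
```

For an uncorrelated series this probability is tiny (roughly 1 in N times N-1 for an N-year record); serial correlation and, especially, the warming trend raise it substantially, which is how the odds move from vanishing to the 1-in-1500 and then 8% figures discussed in this article.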

We can also use the surrogates to assess the likelihoods of individual annual temperature records, such as those that occurred during 1998, 2005, 2010, 2014 and now 2015. Here we require not only that particular years are warmer than certain previous years, but that they reach a particular threshold of warmth. This is even less likely to happen in the absence of global warming, for reasons that are obvious from Figure 2: the natural temperature series almost never exceeds a maximum value of 0.4C relative to the long-term average, while the warmest actual year--2015--exceeds 1C. For none of the record-setting years--1998, 2005, 2010, 2014, or 2015--do the odds exceed 1-in-a-million for temperatures having reached the levels they did due to chance alone, for either the NH or global mean temperature.

Fig 3. Historical NH mean temperatures (black solid) along with five different surrogates (colored solid curves) for the NH series.

Finally, by adding the human-forced component to the surrogates (see Figure 3), we are able to assess the likelihood of the various temperature records and warm streaks when accounting for the effects of global warming.

Using data through 2014, we estimate a 76% likelihood that 13 of the warmest 15 years would occur since 2000 for the NH. Updating the analysis to include 2015, we find there is likewise a 76% likelihood that 14 of the 16 warmest years would occur since 2000. The likelihood of back-to-back records during the two most recent years, 2014 and 2015, is just over 8%: still a bit of a fluke, but hardly out of the question.

As for individual record years, we find that the 1998, 2005, 2010, 2014, and 2015 records had likelihoods of 7%, 18%, 23%, 40% and 7% respectively. So while the 2014 temperature record had nearly even odds of occurring, the 2015 record had relatively long odds.

There is good reason for that. The 2015 temperature didn't just beat the previous record, but smashed it, coming in nearly 0.2C warmer than 2014. The 2015 warmth was boosted by an unusually large El Niño event--indeed, by some measures, the largest on record. A similar story holds for 1998, which was boosted by what was, prior to 2015, the largest El Niño on record; it too beat the previous record (1995) by a whopping 0.2C. Each of the two monster El Niño events was, in a statistical sense, something of a fluke. And each of them imparted considerably greater large-scale warmth than would have been expected from global warming alone.

That analysis, however, neglects one intriguing possibility. Could it be that human-caused climate change is actually boosting the magnitude of El Niño events themselves, leading to more monster events like the '98 and '15 events? That proposition indeed finds some support in the recent peer-reviewed literature. If the hypothesis turns out to be true, then the record warmth of '98 and '15 might not have been flukes after all.

To summarize, we find that the various record temperatures and runs of unusually warm years since 2000 are extremely unlikely to have happened in the absence of human-caused climate change, and reasonably likely to have happened when we account for climate change. We can, in this sense, attribute the record warmth to human-caused climate change at a high level of confidence.

Will the onslaught of record-breaking temperatures finally put the discredited "global warming has stopped" talking point to rest? Probably not, as many who continue to advance this canard appear far more motivated by political ideology than reality or reason.

But the next time you hear someone call into question the threat of human-caused climate change, you might explain to them that the likelihood we would be witnessing the recent record warmth in the absence of human-caused climate change is somewhere between one-in-a-thousand and one-in-a-million. You might ask them: Would you really gamble away the future of our planet with those sorts of odds?

Other versions of this commentary have been published at LiveScience and RealClimate.


Michael Mann is Distinguished Professor of Meteorology at Pennsylvania State University and author of The Hockey Stick and the Climate Wars: Dispatches from the Front Lines and the recently updated and expanded Dire Predictions: Understanding Climate Change.