Search Results

You are looking at 1–10 of 19 items for:

  • Author or Editor: Peter A. Stott
  • Journal of Climate
  • All content
Nikolaos Christidis and Peter A. Stott

Abstract

The new Hadley Centre system for attribution of weather and climate extremes provides assessments of how human influence on the climate may lead to a change in the frequency of such events. Two different types of ensembles of simulations are generated with an atmospheric model to represent the actual climate and what the climate would have been in the absence of human influence. Estimates of the event frequency with and without the anthropogenic effect are then obtained. Three experiments conducted so far with the new system are analyzed in this study to examine how anthropogenic forcings change the odds of warm years, summers, or winters in a number of regions where the model reliably reproduces the frequency of warm events. In all cases warm events become more likely because of human influence, but estimates of the likelihood may vary considerably from year to year depending on the ocean temperature. While simulations of the actual climate use prescribed observational data of sea surface temperature and sea ice, simulations of the nonanthropogenic world also rely on coupled atmosphere–ocean models to provide boundary conditions, and this is found to introduce a major uncertainty in attribution assessments. Improved boundary conditions constructed with observational data are introduced in order to minimize this uncertainty. In more than half of the 10 cases considered here anthropogenic influence results in warm events being 3 times more likely and extreme events 5 times more likely during September 2011–August 2012, as an experiment with the new boundary conditions indicates.
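The core of the calculation described in this abstract can be sketched in a few lines. The ensembles, threshold, and numbers below are synthetic and purely illustrative, not the Hadley Centre system itself: event probabilities are estimated in the "actual" and "natural" ensembles, and their ratio gives the change in likelihood attributed to human influence.

```python
# Illustrative sketch only: synthetic ensembles and an arbitrary event threshold.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in ensembles of regional annual-mean temperature anomalies (K).
actual = rng.normal(loc=0.4, scale=0.5, size=500)    # with human influence
natural = rng.normal(loc=0.0, scale=0.5, size=500)   # without human influence

threshold = 1.0  # a "warm event" is an anomaly above this value

p_actual = np.mean(actual > threshold)
p_natural = np.mean(natural > threshold)

print(f"P(event | actual climate)  = {p_actual:.3f}")
print(f"P(event | natural climate) = {p_natural:.3f}")
print(f"Probability ratio          = {p_actual / p_natural:.1f}")  # how much more likely the event becomes
```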

Full access
Fraser C. Lott and Peter A. Stott

Abstract

Although it is critical to assess the accuracy of attribution studies, the fraction of attributable risk (FAR) cannot be directly assessed from observations since it involves the probability of an event in a world that did not happen, the “natural” world where there was no human influence on climate. Instead, reliability diagrams (usually used to compare probabilistic forecasts to the observed frequencies of events) have been used to assess climate simulations employed for attribution and by inference to evaluate the attribution study itself. The Brier score summarizes this assessment of a model by the reliability diagram. By constructing a modeling framework where the true FAR is already known, this paper shows that Brier scores are correlated to the accuracy of a climate model ensemble’s calculation of FAR, although only weakly. This weakness exists because the diagram does not account for accuracy of simulations of the natural world. This is better represented by two reliability diagrams from early and late in the period of study, which would have, respectively, less and greater anthropogenic climate forcing. Two new methods are therefore proposed for assessing the accuracy of FAR, based on using the earlier observational period as a proxy for observations of the natural world. It is found that errors from model-based estimates of these observable quantities are strongly correlated with errors in the FAR estimated in the model framework. These methods thereby provide new observational estimates of the accuracy in FAR.
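For orientation, the two quantities at the heart of this abstract can be written down compactly. The sketch below uses synthetic numbers and hypothetical variable names: FAR is computed from assumed event probabilities in the all-forcings and natural worlds, and the Brier score is the mean squared difference between forecast probabilities and binary observed outcomes.

```python
# Minimal sketch under assumed synthetic data; not the study's framework.
import numpy as np

rng = np.random.default_rng(1)

# Fraction of attributable risk from event probabilities in the two worlds.
p_all = 0.30      # P(event) with anthropogenic forcing (assumed)
p_nat = 0.10      # P(event) in the counterfactual natural world (assumed)
far = 1.0 - p_nat / p_all
print(f"FAR = {far:.2f}")

# Brier score: mean squared difference between forecast probabilities and
# binary observed outcomes (1 if the event occurred, 0 otherwise).
forecast_probs = rng.uniform(0, 1, size=40)                          # per-year ensemble probabilities
outcomes = (rng.uniform(0, 1, size=40) < forecast_probs).astype(float)
brier = np.mean((forecast_probs - outcomes) ** 2)
print(f"Brier score = {brier:.3f}")
```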

Full access
Nikolaos Christidis, Andrew Ciavarella, and Peter A. Stott

Abstract

Attribution analyses of extreme events estimate changes in the likelihood of their occurrence due to human climatic influences by comparing simulations with and without anthropogenic forcings. Classes of events are commonly considered that only share one or more key characteristics with the observed event. Here we test the sensitivity of attribution assessments to such event definition differences, using the warm and wet winter of 2015/16 in the United Kingdom as a case study. A large number of simulations from coupled models and an atmospheric model are employed. In the most basic case, warm and wet events are defined relative to climatological temperature and rainfall thresholds. Several other classes of events are investigated that, in addition to threshold exceedance, also account for the effect of observed sea surface temperature (SST) anomalies, the circulation flow, or modes of variability present during the reference event. Human influence is estimated to increase the likelihood of warm winters in the United Kingdom by a factor of 3 or more for events occurring under any atmospheric and oceanic conditions, but also for events with a similar circulation or oceanic state to 2015/16. The likelihood of wet winters is found to increase by at least a factor of 1.5 in the general case, but results from the atmospheric model, conditioned on observed SST anomalies, are more uncertain, indicating that decreases in the likelihood are also possible. The robustness of attribution assessments based on atmospheric models is highly dependent on the representation of SSTs without the effect of human influence.
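The sensitivity to event definition can be illustrated with a toy calculation. Everything below is synthetic and hypothetical (an NAO-like circulation index, arbitrary thresholds, idealized ensembles); it is not the study's model output, but it shows how a likelihood ratio can differ between an unconditional event class and one restricted to circulation states resembling the reference event.

```python
# Synthetic example of conditional versus unconditional event classes.
import numpy as np

rng = np.random.default_rng(2)
n = 2000

def make_ensemble(warming):
    """Stand-in ensemble: winter temperature anomaly and a circulation index."""
    circ = rng.normal(0.0, 1.0, n)                       # e.g. an NAO-like index
    temp = warming + 0.6 * circ + rng.normal(0.0, 0.8, n)
    return temp, circ

temp_all, circ_all = make_ensemble(warming=0.7)          # with anthropogenic forcing
temp_nat, circ_nat = make_ensemble(warming=0.0)          # natural-only counterfactual

warm = 1.2                       # temperature threshold defining a "warm winter"
circ_lo, circ_hi = 0.5, 1.5      # circulation window around the reference event

def prob(temp, circ, conditional):
    sel = (circ > circ_lo) & (circ < circ_hi) if conditional else np.ones(n, bool)
    return np.mean(temp[sel] > warm)

for conditional, label in [(False, "any circulation"), (True, "reference-like circulation")]:
    ratio = prob(temp_all, circ_all, conditional) / prob(temp_nat, circ_nat, conditional)
    print(f"{label:26s}: likelihood ratio = {ratio:.1f}")
```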

Full access
Peter A. Stott and Simon F. B. Tett

Abstract

Spatially and temporally dependent fingerprint patterns of near-surface temperature change are derived from transient climate simulations of the second Hadley Centre coupled ocean–atmosphere GCM (HADCM2). Trends in near-surface temperature are calculated from simulations in which HADCM2 is forced with historical increases in greenhouse gases only and with both greenhouse gases and anthropogenic sulfur emissions. For each response an ensemble of four simulations is carried out. An estimate of the natural internal variability of the ocean–atmosphere system is taken from a long multicentury control run of HADCM2.

The aim of the study is to investigate the spatial and temporal scales on which it is possible to detect a significant change in climate. Temporal scales are determined by taking temperature trends over 10, 30, and 50 yr using annual mean data, and spatial scales are defined by projecting these trends onto spherical harmonics.

Each fingerprint pattern is projected onto the recent observed pattern to give a scalar detection variable. This is compared with the distribution expected from natural variability, estimated by projecting the fingerprint pattern onto a distribution of patterns taken from the control run. Detection is claimed if the detection variable is greater than the 95th percentile of the distribution expected from natural variability. The results show that climate change can be detected on the global mean scale for 30- and 50-yr trends but not for 10-yr trends, assuming that the model’s estimate of variability is correct. At subglobal scales, climate change can be detected only for 50-yr trends and only for large spatial scales (greater than 5000 km).
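A schematic version of this detection test, using random synthetic fields in place of spherical-harmonic trend patterns, is sketched below; dimensions and amplitudes are assumed for illustration only.

```python
# Schematic fingerprint detection test with synthetic patterns.
import numpy as np

rng = np.random.default_rng(3)
npoints = 200      # stand-in for the number of pattern coefficients

fingerprint = rng.normal(0, 1, npoints)                                   # model-predicted trend pattern
observed = 0.4 * fingerprint + rng.normal(0, 1, npoints)                  # observed trend pattern (assumed)

# Null distribution: projections of unforced control-run trend patterns.
control_patterns = rng.normal(0, 1, size=(500, npoints))
null_proj = control_patterns @ fingerprint
obs_proj = observed @ fingerprint                                         # scalar detection variable

threshold_95 = np.percentile(null_proj, 95)
print(f"Detection variable      = {obs_proj:.1f}")
print(f"95th percentile of null = {threshold_95:.1f}")
print("Detected" if obs_proj > threshold_95 else "Not detected")
```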

Patterns of near-surface temperature trends for the 50 yr up to 1995 from the simulation that includes only greenhouse gas forcing are inconsistent with the observed patterns at small spatial scales (less than 2000 km). In contrast, patterns of temperature trends for the simulation that includes both greenhouse gas and sulfate forcing are consistent with the observed patterns at all spatial scales.

The possible limits to future detectability are investigated by taking one member of each ensemble to represent the observations and other members of the ensemble to represent model realizations of future temperature trends. The results show that for trends to 1995 the probability of detection is greatest at spatial scales greater than 5000 km. As the future signal of climate change becomes larger relative to the noise of natural variability, detection becomes very likely at all spatial scales by the middle of the next century.

The model underestimates climate variability as seen in the observations at spatial scales less than 2000 km. Therefore, some caution must be exercised when interpreting model-based detection results that include a contribution of small spatial scales to the climate change fingerprint.

Full access
Donald P. Cummins, David B. Stephenson, and Peter A. Stott

Abstract

This study has developed a rigorous and efficient maximum likelihood method for estimating the parameters in stochastic energy balance models (with any number k > 0 of boxes) given time series of surface temperature and top-of-the-atmosphere net downward radiative flux. The method works by finding a state-space representation of the linear dynamic system and evaluating the likelihood recursively via the Kalman filter. Confidence intervals for estimated parameters are straightforward to construct in the maximum likelihood framework, and information criteria may be used to choose an optimal number of boxes for parsimonious k-box emulation of atmosphere–ocean general circulation models (AOGCMs). In addition to estimating model parameters, the method enables hidden state estimation for the unobservable boxes corresponding to the deep ocean, and also enables noise filtering for observations of surface temperature. The feasibility, reliability, and performance of the proposed method are demonstrated in a simulation study. To obtain a set of optimal k-box emulators, models are fitted to the 4 × CO2 step responses of 16 AOGCMs in CMIP5. It is found that for all 16 AOGCMs three boxes are required for optimal k-box emulation. The number of boxes k is found to influence, sometimes strongly, the impulse responses of the fitted models.
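A minimal sketch of the state-space/Kalman-filter idea is given below for the two-box case (k = 2). All parameter values, noise levels, and the step forcing are assumed for illustration; this is not the authors' implementation.

```python
# Two-box stochastic EBM as a linear Gaussian state-space model, with the
# log-likelihood of a surface-temperature series evaluated by a Kalman filter.
import numpy as np

def two_box_matrices(lam, gamma, c1, c2, dt=1.0):
    """Discrete-time (Euler) transition matrices for a two-box EBM."""
    a = np.array([[-(lam + gamma) / c1, gamma / c1],
                  [gamma / c2,          -gamma / c2]])
    A = np.eye(2) + dt * a
    B = np.array([dt / c1, 0.0])   # radiative forcing enters the surface box only
    return A, B

def kalman_loglik(y, forcing, A, B, Q, R):
    """Log-likelihood of a surface-temperature series via prediction-error decomposition."""
    H = np.array([[1.0, 0.0]])     # only the surface box is observed
    x = np.zeros(2)                # state mean [surface box, deep-ocean box]
    P = np.eye(2)                  # state covariance (vague prior)
    loglik = 0.0
    for t in range(len(y)):
        # Prediction step
        x = A @ x + B * forcing[t]
        P = A @ P @ A.T + Q
        # Update step with the observed surface temperature
        innovation = y[t] - (H @ x)[0]
        S = (H @ P @ H.T)[0, 0] + R            # innovation variance
        K = (P @ H.T)[:, 0] / S                # Kalman gain
        x = x + K * innovation
        P = P - np.outer(K, (H @ P).ravel())
        loglik += -0.5 * (np.log(2 * np.pi * S) + innovation**2 / S)
    return loglik

# Synthetic abrupt-4xCO2-style experiment with assumed parameter values.
rng = np.random.default_rng(4)
nyears = 150
forcing = np.full(nyears, 7.4)                 # assumed constant forcing, W m^-2
A, B = two_box_matrices(lam=1.2, gamma=0.7, c1=8.0, c2=100.0)
Q = np.diag([0.05, 0.001])                     # process-noise (internal variability) covariance
R = 0.02                                       # observation-error variance

x, y = np.zeros(2), np.empty(nyears)
for t in range(nyears):
    x = A @ x + B * forcing[t] + rng.multivariate_normal([0.0, 0.0], Q)
    y[t] = x[0] + rng.normal(0.0, np.sqrt(R))

print(f"log-likelihood at the true parameters: {kalman_loglik(y, forcing, A, B, Q, R):.1f}")
```

In the maximum likelihood setting, this function would be maximized over the EBM parameters, and an information criterion compared across k to choose the number of boxes.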

Restricted access
Nikolaos Christidis, Peter A. Stott, and Simon J. Brown

Abstract

Formal detection and attribution analyses of changes in daily extremes give evidence of a significant human influence on the increasing severity of extremely warm nights and decreasing severity of extremely cold days and nights. This paper presents an optimal fingerprinting analysis that also detects the contributions of external forcings to recent changes in extremely warm days using nonstationary extreme value theory. The authors’ analysis is the first that attempts to partition the observed change in warm daytime extremes between its anthropogenic and natural components and hence attribute part of the change to possible causes. Changes in the extreme temperatures are represented by the temporal changes in a parameter of an extreme value distribution. Regional distributions of the trend in the parameter are computed with and without human influence using constraints from the global optimal fingerprinting analysis. Anthropogenic forcings alter the regional distributions, indicating that extremely warm days have become hotter.
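As a rough illustration of nonstationary extreme value fitting of the kind described here, the sketch below fits a GEV distribution whose location parameter varies linearly in time to synthetic annual maxima; the data, starting values, and parameterization (SciPy's genextreme) are assumptions, not the study's setup.

```python
# Nonstationary GEV fit with a linear trend in the location parameter (illustrative).
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

rng = np.random.default_rng(5)
years = np.arange(1950, 2011)
t = (years - years.mean()) / 10.0                 # time in decades, centered

# Synthetic annual warmest-day temperatures with an imposed warming trend.
true_trend = 0.3                                  # K per decade (assumed)
tmax = genextreme.rvs(c=-0.1, loc=30.0 + true_trend * t, scale=1.5,
                      size=years.size, random_state=rng)

def neg_loglik(params):
    mu0, mu1, log_sigma, shape = params
    return -np.sum(genextreme.logpdf(tmax, c=shape,
                                     loc=mu0 + mu1 * t,
                                     scale=np.exp(log_sigma)))

fit = minimize(neg_loglik, x0=[30.0, 0.0, np.log(1.5), -0.1], method="Nelder-Mead")
mu0, mu1, log_sigma, shape = fit.x
print(f"Fitted location trend: {mu1:.2f} K per decade (true value {true_trend})")
```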

Full access
Dáithí A. Stone, Myles R. Allen, and Peter A. Stott

Abstract

This paper presents an update on the detection and attribution of global annual mean surface air temperature changes, using recently developed climate models. In particular, it applies a new methodology that permits the inclusion of many more general circulation models (GCMs) into the analysis, and it also includes more recent observations. This methodology involves fitting a series of energy balance models (EBMs) to the GCM output in order to estimate the temporal response patterns to the various forcings.
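The EBM-fitting step can be illustrated with a one-box toy version: integrate a simple energy balance model under a prescribed forcing series and tune its parameters so that its temperature response matches a (here synthetic) GCM series. The forcing ramp, parameter values, and noise below are assumed for illustration only.

```python
# One-box EBM fitted to a synthetic "GCM" greenhouse-gas-only response.
import numpy as np
from scipy.optimize import curve_fit

def ebm_response(forcing, lam, heat_capacity):
    """Annual Euler integration of C dT/dt = F - lam * T."""
    temp = np.zeros(len(forcing))
    for i in range(1, len(forcing)):
        temp[i] = temp[i-1] + (forcing[i-1] - lam * temp[i-1]) / heat_capacity
    return temp

years = np.arange(1901, 2006)
ghg_forcing = 2.5 * (years - years[0]) / (years[-1] - years[0])   # assumed forcing ramp, W m^-2

rng = np.random.default_rng(6)
gcm_temp = ebm_response(ghg_forcing, lam=1.3, heat_capacity=9.0) + rng.normal(0, 0.08, len(years))

# Fit the EBM parameters; the fitted curve is the estimated temporal response
# pattern that would feed into the fingerprinting step.
(lam_fit, cap_fit), _ = curve_fit(ebm_response, ghg_forcing, gcm_temp, p0=[1.0, 10.0])
print(f"lambda = {lam_fit:.2f} W m^-2 K^-1, C = {cap_fit:.1f} (assumed units)")
```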

Despite considerable spread in estimated EBM parameters, characteristics of model performance, such as the transient climate response, appear to be more constrained for each of the forcings. The resulting estimated response patterns are provided as input to the standard fingerprinting method used in previous studies. The estimated GCM responses to changes in greenhouse gases are detected in the observed record for all of the GCMs, and are generally found to be consistent with the observed changes; the same is generally true for the responses to changes in stratospheric aerosols from volcanic eruptions. GCM responses to changes in tropospheric sulfate aerosols and solar irradiance also appear consistent with the observed record, although the uncertainty is larger. Greenhouse gas and solar irradiance changes are found to have contributed to a best guess of ∼0.8 and ∼0.3 K warming over the 1901–2005 period, respectively, while sulfate aerosols have contributed a ∼0.4 K cooling. This analysis provides an observationally constrained estimate of future warming, which is found to be fairly robust across GCMs. By 2100, a warming of between about 1.5 and 4.5 K can be expected according to the Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES) A1B emissions scenario.

These results indicate an emerging constraint for global mean surface temperature responses to external forcings across GCMs, which is corroborated in the observed record. This implies that observationally constrained estimates of past warming and predictions of future warming are indeed becoming robust.

Full access
Peter A. Stott, Gareth S. Jones, and John F. B. Mitchell

Abstract

Current attribution analyses that seek to determine the relative contributions of different forcing agents to observed near-surface temperature changes underestimate the importance of weak signals, such as that due to changes in solar irradiance. Here a new attribution method is applied that does not have a systematic bias against weak signals.
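For context, the sketch below illustrates the conventional least-squares scaling-factor regression and the weak-signal attenuation that motivates the new method; the signals, noise levels, and amplitudes are synthetic, and the paper's improved method is not reproduced here.

```python
# Synthetic illustration of weak-signal attenuation in least-squares attribution.
import numpy as np

rng = np.random.default_rng(7)
nyears = 100
time = np.linspace(0.0, 1.0, nyears)

# Model-simulated responses: a strong greenhouse-gas signal and a weak,
# quasi-cyclic solar signal; the "observations" contain both plus noise.
ghg_signal = 0.8 * time**2
solar_signal = 0.1 * np.sin(2 * np.pi * 9 * time)
obs = ghg_signal + solar_signal + rng.normal(0, 0.1, nyears)

# The signal patterns are only known from noisy model estimates (e.g. finite
# ensembles); ordinary least squares then attenuates the fitted scaling
# factors, with the largest relative effect on the weak signal.
ghg_est = ghg_signal + rng.normal(0, 0.05, nyears)
solar_est = solar_signal + rng.normal(0, 0.05, nyears)

X = np.column_stack([ghg_est, solar_est])
betas, *_ = np.linalg.lstsq(X, obs, rcond=None)
print(f"GHG scaling factor   = {betas[0]:.2f}   (true value 1)")
print(f"Solar scaling factor = {betas[1]:.2f}   (true value 1)")
```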

It is found that current climate models underestimate the observed climate response to solar forcing over the twentieth century as a whole, indicating that the climate system has a greater sensitivity to solar forcing than do models. The results from this research show that increases in solar irradiance are likely to have had a greater influence on global-mean temperatures in the first half of the twentieth century than the combined effects of changes in anthropogenic forcings. Nevertheless the results confirm previous analyses showing that greenhouse gas increases explain most of the global warming observed in the second half of the twentieth century.

Full access
Peter A. Stott, Gareth S. Jones, Jason A. Lowe, Peter Thorne, Chris Durman, Timothy C. Johns, and Jean-Claude Thelen

Abstract

The ability of climate models to simulate large-scale temperature changes during the twentieth century when they include both anthropogenic and natural forcings and their inability to account for warming over the last 50 yr when they exclude increasing greenhouse gas concentrations has been used as evidence for an anthropogenic influence on global warming. One criticism of the models used in many of these studies is that they exclude some forcings of potential importance, notably from fossil fuel black carbon, biomass smoke, and land use changes. Herein transient simulations with a new model, the Hadley Centre Global Environmental Model version 1 (HadGEM1), are described, which include these forcings in addition to other anthropogenic and natural forcings, and a fully interactive treatment of atmospheric sulfur and its effects on clouds. These new simulations support previous work by showing that there was a significant anthropogenic influence on near-surface temperature change over the last century. They demonstrate that black carbon and land use changes are relatively unimportant for explaining global mean near-surface temperature changes.

The pattern of warming in the troposphere and cooling in the stratosphere that has been observed in radiosonde data since 1958 can only be reproduced when the model includes anthropogenic forcings. However, there are some discrepancies between the model simulations and radiosonde data, which are largest where observational uncertainty is greatest in the Tropics and high latitudes.

Predictions of future warming have also been made using the new model. Twenty-first-century warming rates, following policy-relevant emissions scenarios, are slightly greater in HadGEM1 than in the Third Hadley Centre Coupled Ocean–Atmosphere General Circulation Model (HadCM3) as a result of the extra forcing in HadGEM1. An experiment in which greenhouse gases and other anthropogenic forcings are stabilized at 2100 levels and held constant until 2200 predicts a committed twenty-second-century warming of less than 1 K, whose spatial distribution resembles that of warming during the twenty-first century, implying that the local feedbacks that determine the pattern of warming do not change significantly.

Full access
David E. Rupp, Philip W. Mote, Nathaniel L. Bindoff, Peter A. Stott, and David A. Robinson

Abstract

Significant declines in spring Northern Hemisphere (NH) snow cover extent (SCE) have been observed over the last five decades. As one step toward understanding the causes of this decline, an optimal fingerprinting technique is used to look for consistency in the temporal pattern of spring NH SCE between observations and simulations from 15 global climate models (GCMs) that form part of phase 5 of the Coupled Model Intercomparison Project. The authors examined simulations from 15 GCMs that included both natural and anthropogenic forcing and simulations from 7 GCMs that included only natural forcing. The decline in observed NH SCE could be largely explained by the combined natural and anthropogenic forcing but not by natural forcing alone. However, the 15 GCMs, taken as a whole, underpredicted the combined forcing response by a factor of 2. How much of this underprediction was due to underrepresentation of the sensitivity to external forcing of the GCMs or to their underrepresentation of internal variability has yet to be determined.
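The single-signal scaling-factor regression behind a statement like "underpredicted by a factor of 2" can be sketched with synthetic series as below; the units, trends, and noise are hypothetical, not the study's data.

```python
# Scaling factor of observations onto the multimodel-mean forced response (synthetic).
import numpy as np

rng = np.random.default_rng(8)
years = np.arange(1967, 2013)
t = (years - years[0]) / (years[-1] - years[0])

model_response = -1.0 * t                                        # multimodel-mean SCE anomaly (assumed)
observed = 2.0 * model_response + rng.normal(0, 0.15, len(t))    # stronger observed decline plus noise

beta = np.dot(model_response, observed) / np.dot(model_response, model_response)
print(f"Scaling factor = {beta:.2f}  (values above 1 indicate the models underpredict the response)")
```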

Full access