Search Results

You are looking at 1-8 of 8 items for:

  • Author or Editor: Claudia Tebaldi
  • Journal of Climate
  • All content
Gerald A. Meehl, Aixue Hu, and Claudia Tebaldi

Abstract

A “perfect model” configuration with a global coupled climate model 30-member ensemble is used to address decadal prediction of Pacific SSTs. All model data are low-pass filtered to focus on the low-frequency decadal component. The first three EOFs in the twentieth-century simulation, representing nearly 80% of the total variance, are used as the basis for early twenty-first-century predictions. The first two EOFs represent the forced trend and the interdecadal Pacific oscillation (IPO), respectively, as noted in previous studies, and the third has elements of both trend and IPO patterns. The perfect model reference simulation, the target for the prediction, is taken as the experiment that ran continuously from the twentieth to twenty-first century using anthropogenic and natural forcings for the twentieth century and the A1B scenario for the twenty-first century. The other 29 members use a perturbation in the atmosphere at year 2000 and are run until 2061. Since the IPO has been recognized as a dominant contributor to decadal variability in the Pacific, information late in the twentieth century and early in the twenty-first century is used to select a subset of ensemble members that are more skillful in tracking the time evolution of the IPO (EOF2) in relation to a notional start date of 2010. Predictions for the 19-yr period centered on the year 2020 use that subset of ensemble members to construct Pacific SST patterns based on the predicted evolution of the first three EOFs. Compared to the perfect model reference simulation, the predictions show some skill for Pacific SSTs, with anomaly pattern correlations greater than +0.5. An application of the Pacific SST prediction is made to precipitation over North America and Australia. Even though there are additional far-field influences on Pacific SSTs and North American and Australian precipitation involving the Atlantic multidecadal oscillation (AMO) as well as Indian Ocean and South Asian monsoon variability, there is qualitative skill for the pattern of predicted precipitation over North America and Australia using predicted Pacific SSTs. This exercise shows that, in the presence of a large forced trend like that in this 30-member ensemble, much of Pacific region decadal predictability about 20 years into the future arises from increasing greenhouse gases.
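
To make the EOF-based procedure concrete, here is a minimal sketch (not the authors' code) of low-pass filtering, EOF decomposition via SVD, and an anomaly pattern correlation on synthetic data; the array sizes, the 13-yr running-mean window, and all variable names are assumptions made purely for illustration.

```python
# Illustrative sketch only: synthetic stand-in for low-pass-filtered SST anomalies.
import numpy as np

rng = np.random.default_rng(0)
nt, nspace = 100, 500                       # hypothetical: 100 years, 500 grid points
sst = rng.standard_normal((nt, nspace))     # stand-in for SST anomaly fields (time, space)

# Crude low-pass filter: 13-yr running mean as a proxy for isolating the decadal component
window = 13
kernel = np.ones(window) / window
sst_lp = np.apply_along_axis(lambda ts: np.convolve(ts, kernel, mode="same"), 0, sst)

# EOFs of the filtered anomalies via SVD; keep the three leading modes
anoms = sst_lp - sst_lp.mean(axis=0)
u, s, vt = np.linalg.svd(anoms, full_matrices=False)
eofs = vt[:3]                               # spatial patterns (3, nspace)
pcs = u[:, :3] * s[:3]                      # principal-component time series (nt, 3)
explained = s[:3] ** 2 / np.sum(s ** 2)     # fraction of variance per mode

def pattern_correlation(a, b):
    """Centered anomaly pattern correlation between two spatial fields."""
    a, b = a - a.mean(), b - b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Reconstruct a field from the three leading EOFs and compare it with the "truth"
predicted = pcs[-1] @ eofs
print(explained, pattern_correlation(predicted, anoms[-1]))
```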

Claudia Tebaldi, Richard L. Smith, Doug Nychka, and Linda O. Mearns

Abstract

A Bayesian statistical model is proposed that combines information from a multimodel ensemble of atmosphere–ocean general circulation models (AOGCMs) and observations to determine probability distributions of future temperature change on a regional scale. The posterior distributions derived from the statistical assumptions incorporate the criteria of bias and convergence in the relative weights implicitly assigned to the ensemble members. This approach can be considered an extension and elaboration of the reliability ensemble averaging method. For illustration, the authors consider the output of mean surface temperature from nine AOGCMs, run under the A2 emission scenario from the Special Report on Emissions Scenarios (SRES), for boreal winter and summer, aggregated over 22 land regions and into two 30-yr averages representative of current and future climate conditions. The shapes of the final probability density functions of temperature change vary widely, from unimodal curves for regions where model results agree (or outlying projections are discounted) to multimodal curves where models that cannot be discounted on the basis of bias give diverging projections. Besides the basic statistical model, the authors consider including correlation between present and future temperature responses, and test alternative forms of probability distributions for the model error terms. It is suggested that a probabilistic approach, particularly in the form of a Bayesian model, is a useful platform from which to synthesize the information from an ensemble of simulations. The probability distributions of temperature change reveal features such as multimodality and long tails that could not otherwise be easily discerned. Furthermore, the Bayesian model can serve as an interdisciplinary tool through which climate modelers, climatologists, and statisticians can work more closely. For example, climate modelers, through their expert judgment, could contribute to the formulations of prior distributions in the statistical model.
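
As a rough illustration of the bias and convergence criteria, the following is a minimal point-estimate sketch, not the paper's full Bayesian model or its MCMC sampler: each model's weight shrinks with its squared distance from the observed present-day value (bias) and from the evolving ensemble consensus (convergence). All numbers are invented for illustration.

```python
# Illustrative sketch only: iterative precision-style weighting of a small ensemble.
import numpy as np

x_obs = 0.2                                           # hypothetical observed present-day anomaly (degC)
x_present = np.array([0.10, 0.30, 0.25, 0.80, 0.15])  # model present-day anomalies (made up)
y_future = np.array([2.0, 2.4, 2.2, 3.9, 2.1])        # model future-period anomalies (made up)

nu = y_future.mean()                                  # initial consensus for future change
for _ in range(200):
    # Large weight when bias (vs. observations) and departure from consensus are both small
    w = 1.0 / ((x_present - x_obs) ** 2 + (y_future - nu) ** 2 + 1e-6)
    nu = np.sum(w * y_future) / np.sum(w)             # update the weighted consensus

print("weighted estimate of future change:", round(nu, 2))
print("relative weights:", np.round(w / w.sum(), 3))
```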

Ana Lopez, Claudia Tebaldi, Mark New, Dave Stainforth, Myles Allen, and Jamie Kettleborough

Abstract

A Bayesian statistical model developed to produce probabilistic projections of regional climate change using observations and ensembles of general circulation models (GCMs) is applied to evaluate the probability distribution of global mean temperature change under different forcing scenarios. The results are compared to probabilistic projections obtained using optimal fingerprinting techniques that constrain GCM projections by observations. It is found that, due to the different assumptions underlying these statistical approaches, the predicted distributions differ significantly, in particular in their uncertainty ranges. Results presented herein demonstrate that probabilistic projections of future climate are strongly dependent on the assumptions of the underlying methodologies.
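
For orientation, the sketch below shows the fingerprinting idea in its simplest ordinary-least-squares form: regress synthetic "observations" onto a model response pattern to get a scaling factor, then apply that factor to a projection. Real optimal fingerprinting uses noise covariance estimated from control simulations (generalized or total least squares), so this toy version and all of its numbers are assumptions, not the method of the paper.

```python
# Illustrative sketch only: OLS scaling factor from a single synthetic fingerprint.
import numpy as np

rng = np.random.default_rng(1)
nt = 50                                         # hypothetical number of years
signal = np.linspace(0.0, 1.0, nt)              # model-simulated forced response (fingerprint)
obs = 0.8 * signal + 0.1 * rng.standard_normal(nt)   # synthetic "observations"

X = signal[:, None]                             # design matrix with one fingerprint
beta, *_ = np.linalg.lstsq(X, obs, rcond=None)  # least-squares scaling factor

projection = 3.0                                # hypothetical unconstrained projection (degC)
constrained = beta[0] * projection              # observationally scaled projection
print(f"scaling factor {beta[0]:.2f}, constrained projection {constrained:.2f} degC")
```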

Reto Knutti, Reinhard Furrer, Claudia Tebaldi, Jan Cermak, and Gerald A. Meehl

Abstract

Recent coordinated efforts, in which numerous general circulation climate models have been run for a common set of experiments, have produced large datasets of projections of future climate for various scenarios. Those multimodel ensembles sample initial conditions, parameters, and structural uncertainties in the model design, and they have prompted a variety of approaches to quantifying uncertainty in future climate change. International climate change assessments also rely heavily on these models. These assessments often provide equal-weighted averages as best-guess results, assuming that individual model biases will at least partly cancel and that a model average prediction is more likely to be correct than a prediction from a single model, based on the result that a multimodel average of present-day climate generally outperforms any individual model. This study outlines the motivation for using multimodel ensembles and discusses various challenges in interpreting them. Among these challenges are that the number of models in these ensembles is usually small, that their distribution in the model or parameter space is unclear, and that extreme behavior is often not sampled. Model skill in simulating present-day climate conditions is shown to relate only weakly to the magnitude of predicted change. It is thus unclear by how much the confidence in future projections should increase based on improvements in simulating present-day conditions, a reduction of intermodel spread, or a larger number of models. Averaging model output may further lead to a loss of signal; for precipitation change, for example, the predicted changes are spatially heterogeneous, so the true expected change is very likely to be larger than suggested by a model average. Last, there is little agreement on metrics to separate “good” and “bad” models, and there is concern that model development, evaluation, and posterior weighting or ranking are all using the same datasets. While the multimodel average still appears to be useful in some situations, these results show that more quantitative methods to evaluate model performance are critical to maximize the value of climate change projections from global models.
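
The loss-of-signal point about averaging spatially heterogeneous changes can be seen in a toy example: the five synthetic "models" below project a wet/dry pattern of identical amplitude but shifted in space, so the multimodel mean is damped relative to every individual model. The one-dimensional fields and shifts are made up purely for illustration.

```python
# Illustrative sketch only: averaging shifted patterns damps the apparent signal.
import numpy as np

x = np.linspace(0, 2 * np.pi, 200)              # idealized 1-D spatial coordinate
shifts = np.linspace(0, np.pi / 2, 5)           # five "models" with progressively shifted patterns
models = np.array([np.sin(x - s) for s in shifts])   # precipitation change per model (arbitrary units)

multimodel_mean = models.mean(axis=0)
print("single-model amplitude:   ", round(np.abs(models).max(axis=1).mean(), 2))
print("multimodel-mean amplitude:", round(np.abs(multimodel_mean).max(), 2))
```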

Gerald A. Meehl, Warren M. Washington, Caspar M. Ammann, Julie M. Arblaster, T. M. L. Wigley, and Claudia Tebaldi

Abstract

Ensemble simulations are run with a global coupled climate model employing five forcing agents that influence the time evolution of globally averaged surface air temperature during the twentieth century. Two are natural (volcanoes and solar), and the other three are anthropogenic [greenhouse gases (GHGs), ozone (stratospheric and tropospheric), and the direct effect of sulfate aerosols]. In addition to the five individual-forcing experiments, eight further sets are performed with the forcings in various combinations. The late-twentieth-century warming can only be reproduced in the model with anthropogenic forcing (mainly GHGs), while the early twentieth-century warming is mainly caused by natural forcing in the model (mainly solar). However, the signature of globally averaged temperature at any time in the twentieth century is a direct consequence of the sum of the forcings. The similarity of the response to the forcings on decadal and interannual time scales is tested by performing a principal component analysis of the 13 ensemble mean globally averaged temperature time series. A significant portion of the variance of the reconstructed time series can be retained in residual calculations compared to the original single and combined forcing runs. This demonstrates that the statistics of the variances for decadal and interannual time-scale variability in the forced simulations are similar to the response from a residual calculation. That is, the variance statistics of the response of globally averaged temperatures in the forced runs are additive, since they can be reproduced in the responses calculated as a residual from other combined forcing runs.
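
A minimal sketch of the additivity idea, using synthetic stand-ins rather than any model output: the response to one forcing is estimated as a residual (combined-forcing series minus the other single-forcing series) and compared with the directly "simulated" response. All series and coefficients below are invented for illustration.

```python
# Illustrative sketch only: additivity check via a residual calculation.
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1900, 2000)
ghg = 0.006 * (years - 1900) + 0.02 * rng.standard_normal(years.size)   # "GHG-only" response
solar = 0.1 * np.sin(2 * np.pi * (years - 1900) / 11.0)                  # "solar-only" response
volcanic = np.where((years > 1960) & (years < 1965), -0.3, 0.0)          # "volcano-only" response

# "All-forcings" response assembled as the sum of the pieces plus internal variability
combined = ghg + solar + volcanic + 0.02 * rng.standard_normal(years.size)

# Residual estimate of the GHG response: combined run minus the natural-only responses
ghg_residual = combined - (solar + volcanic)
print("correlation(direct, residual):", round(np.corrcoef(ghg, ghg_residual)[0, 1], 2))
print("variance ratio (residual/direct):", round(np.var(ghg_residual) / np.var(ghg), 2))
```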

Gerald A. Meehl, Warren M. Washington, Julie M. Arblaster, Aixue Hu, Haiyan Teng, Claudia Tebaldi, Benjamin N. Sanderson, Jean-Francois Lamarque, Andrew Conley, Warren G. Strand, and James B. White III

Abstract

Results are presented from experiments performed with the Community Climate System Model, version 4 (CCSM4), for the Coupled Model Intercomparison Project phase 5 (CMIP5). These include multiple ensemble members of twentieth-century climate with anthropogenic and natural forcings as well as single-forcing runs, sensitivity experiments with sulfate aerosol forcing, twenty-first-century representative concentration pathway (RCP) mitigation scenarios, and extensions of those scenarios beyond 2100 to 2300. The equilibrium climate sensitivity of CCSM4 is 3.20°C, and the transient climate response is 1.73°C. Global surface temperatures averaged for the last 20 years of the twenty-first century compared to the 1986–2005 reference period for six-member ensembles from CCSM4 are +0.85°, +1.64°, +2.09°, and +3.53°C for RCP2.6, RCP4.5, RCP6.0, and RCP8.5, respectively. The ocean meridional overturning circulation (MOC) in the Atlantic, which weakens during the twentieth century in the model, nearly recovers to early twentieth-century values in RCP2.6, partially recovers in RCP4.5 and RCP6.0, and does not recover by 2100 in RCP8.5. Heat wave intensity is projected to increase almost everywhere in CCSM4 in a future warmer climate, with the magnitude of the increase proportional to the forcing. Precipitation intensity is also projected to increase, with dry days increasing in most subtropical areas. For future climate, there is almost no summer sea ice left in the Arctic in the high RCP8.5 scenario by 2100, but in the low RCP2.6 scenario there is substantial sea ice remaining in summer at the end of the century.
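
For reference, the quoted warming values are differences of 20-yr means relative to the 1986–2005 base period. The toy calculation below shows only that arithmetic; the temperature trajectory is entirely made up.

```python
# Illustrative sketch only: 2081-2100 mean minus the 1986-2005 reference mean.
import numpy as np

years = np.arange(1850, 2101)
tas = 0.01 * np.maximum(years - 1970, 0)        # synthetic global-mean temperature anomaly (degC)

reference = tas[(years >= 1986) & (years <= 2005)].mean()
late_century = tas[(years >= 2081) & (years <= 2100)].mean()
print(f"warming relative to 1986-2005: {late_century - reference:.2f} degC")
```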
