Search Results

You are looking at 1–9 of 9 items for:

  • Author or Editor: Michael K. Tippett
  • Bulletin of the American Meteorological Society
Laurie Trenary, Timothy DelSole, Michael K. Tippett, and Brian Doty
Full access
Laurie Trenary, Timothy DelSole, Brian Doty, and Michael K. Tippett
Full access
Anthony G. Barnston, Michael K. Tippett, Michelle L. L'Heureux, Shuhua Li, and David G. DeWitt

Real-time model predictions of ENSO conditions during the 2002–11 period are evaluated and compared to skill levels documented in studies of the 1990s. ENSO conditions are represented by the Niño-3.4 SST index in the east-central tropical Pacific. The skills of 20 prediction models (12 dynamical, 8 statistical) are examined. Results indicate skills somewhat lower than those found for the less advanced models of the 1980s and 1990s. Using hindcasts spanning 1981–2011, this finding is explained by the relatively greater predictive challenge posed by the 2002–11 period, and it suggests that decadal variations in the character of ENSO variability are a greater skill-determining factor than the steady but gradual trend toward improved ENSO prediction science and models. After adjusting for the varying difficulty level, the skills of 2002–11 are slightly higher than those of earlier decades. Unlike earlier results, the average skill of dynamical models slightly, but statistically significantly, exceeds that of statistical models for start times just before the middle of the year, when prediction has proven most difficult. The greater skill of dynamical models is largely attributable to the subset of dynamical models with the most advanced, high-resolution, fully coupled ocean–atmosphere prediction systems using sophisticated data assimilation systems and large ensembles. This finding suggests that additional advances in skill remain likely, with the expected implementation of better physics, numerics, and assimilation schemes; finer resolution; and larger ensemble sizes.

Full access
Anthony G. Barnston, Michael K. Tippett, Michelle L. L'Heureux, Shuhua Li, and David G. DeWitt
Full access
John T. Allen, Michael K. Tippett, Adam H. Sobel, and Chiara Lepore
Full access
Gregory W. Carbin, Michael K. Tippett, Samuel P. Lillo, and Harold E. Brooks

Abstract

Two novel approaches to extending the range of prediction for environments conducive to severe thunderstorm events are described. One approach charts Climate Forecast System, version 2 (CFSv2), run-to-run consistency of the areal extent of severe thunderstorm environments using grid counts of the supercell composite parameter (SCP). These environments are charted for each 45-day CFSv2 run initialized at 0000 UTC. CFSv2 ensemble-mean forecast maps of SCP coverage over the contiguous United States are also produced for those forecasts meeting certain criteria for high-impact weather. The applicability of this approach to the severe weather prediction challenge is illustrated using CFSv2 output for a series of severe weather episodes occurring in March and April 2014. Another approach, which may extend severe weather predictability from CFSv2, uses a run-cumulative time-averaging technique applied to SCP grid counts. This process is described and subjectively verified against severe weather events from early 2014.
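The run-cumulative time-averaging idea described above can be sketched as follows. This is a minimal illustration, not the authors' code: the array shape, the values, and the name `run_cumulative_mean` are all hypothetical, standing in for CFSv2 SCP grid counts from successive runs.

```python
import numpy as np

# Hypothetical SCP grid counts: rows = successive model runs (one per day),
# columns = forecast valid dates. NaN marks dates a run does not cover.
counts = np.array([
    [120.0, 90.0, 40.0, np.nan],
    [110.0, 95.0, 55.0, 30.0],
    [np.nan, 80.0, 60.0, 35.0],
])

# Run-cumulative time average: for each valid date, average the grid counts
# over all runs issued so far, skipping runs that do not cover that date.
cumsum = np.nancumsum(counts, axis=0)          # NaNs treated as zero
n_runs = np.cumsum(~np.isnan(counts), axis=0)  # how many runs cover each date
run_cumulative_mean = np.where(
    n_runs > 0, cumsum / np.maximum(n_runs, 1), np.nan
)
```

Averaging over runs this way damps run-to-run noise, so a signal that persists across successive initializations stands out from one that appears in only a single run.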

Full access
Michelle L. L’Heureux, Daniel S. Harnos, Emily Becker, Brian Brettschneider, Mingyue Chen, Nathaniel C. Johnson, Arun Kumar, and Michael K. Tippett

Abstract

Did the strong 2023–24 El Niño live up to the hype? While climate prediction is inherently probabilistic, many users compare El Niño events against a deterministic map of expected impacts (e.g., wetter or drier regions). Here, using this event as a guide, we show that no El Niño perfectly matches the ideal image and that observed anomalies will only partially match what was anticipated. In fact, the degree to which the climate anomalies match the expected ENSO impacts tends to scale with the strength of the event. The 2023–24 event generally matched well with ENSO expectations around the United States. However, this will not always be the case, as the analysis shows larger deviations from the historical ENSO pattern of impacts are commonplace, with some climate variables more prone to inconsistencies (e.g., temperature) than others (e.g., precipitation). Users should incorporate this inherent uncertainty in their risk and decision-making analysis.

Open access
Kathy Pegion, Ben P. Kirtman, Emily Becker, Dan C. Collins, Emerson LaJoie, Robert Burgman, Ray Bell, Timothy DelSole, Dughong Min, Yuejian Zhu, Wei Li, Eric Sinsky, Hong Guan, Jon Gottschalck, E. Joseph Metzger, Neil P. Barton, Deepthi Achuthavarier, Jelena Marshak, Randal D. Koster, Hai Lin, Normand Gagnon, Michael Bell, Michael K. Tippett, Andrew W. Robertson, Shan Sun, Stanley G. Benjamin, Benjamin W. Green, Rainer Bleck, and Hyemi Kim

Abstract

The Subseasonal Experiment (SubX) is a multimodel subseasonal prediction experiment designed around operational requirements with the goal of improving subseasonal forecasts. Seven global models have produced 17 years of retrospective (re)forecasts and more than a year of weekly real-time forecasts. The reforecasts and forecasts are archived at the Data Library of the International Research Institute for Climate and Society, Columbia University, providing a comprehensive database for research on subseasonal to seasonal predictability and predictions. The SubX models show skill for temperature and precipitation 3 weeks ahead of time in specific regions. The SubX multimodel ensemble mean is more skillful than any individual model overall. Skill in simulating the Madden–Julian oscillation (MJO) and the North Atlantic Oscillation (NAO), two sources of subseasonal predictability, is also evaluated, with skillful predictions of the MJO 4 weeks in advance and of the NAO 2 weeks in advance. SubX is also able to make useful contributions to operational forecast guidance at the Climate Prediction Center. Additionally, SubX provides information on the potential for extreme precipitation associated with tropical cyclones, which can help emergency management and aid organizations to plan for disasters.
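The finding that the SubX multimodel ensemble mean is more skillful than any individual model can be illustrated with a toy sketch. Nothing below comes from SubX: the data are synthetic anomalies, and `anomaly_correlation` is an illustrative helper, assuming each model shares the predictable signal but has independent errors.

```python
import numpy as np

# Synthetic "observed" anomalies over ten forecast cases.
rng = np.random.default_rng(0)
obs = rng.standard_normal(10)

# Three toy models: shared signal plus independent error (noise level made up).
models = np.stack([obs + rng.standard_normal(10) for _ in range(3)])

def anomaly_correlation(fcst, verif):
    """Pearson correlation between forecast and verifying anomalies."""
    fa, va = fcst - fcst.mean(), verif - verif.mean()
    return float((fa @ va) / np.sqrt((fa @ fa) * (va @ va)))

single_skill = [anomaly_correlation(m, obs) for m in models]
mme_skill = anomaly_correlation(models.mean(axis=0), obs)
```

Because the errors are independent across models, averaging the members tends to cancel them while preserving the shared signal, which is the usual argument for the multimodel ensemble mean verifying better on average than its individual members.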

Free access
Ben P. Kirtman, Dughong Min, Johnna M. Infanti, James L. Kinter III, Daniel A. Paolino, Qin Zhang, Huug van den Dool, Suranjana Saha, Malaquias Pena Mendez, Emily Becker, Peitao Peng, Patrick Tripp, Jin Huang, David G. DeWitt, Michael K. Tippett, Anthony G. Barnston, Shuhua Li, Anthony Rosati, Siegfried D. Schubert, Michele Rienecker, Max Suarez, Zhao E. Li, Jelena Marshak, Young-Kwon Lim, Joseph Tribbia, Kathleen Pegion, William J. Merryfield, Bertrand Denis, and Eric F. Wood

The recent U.S. National Academies report, Assessment of Intraseasonal to Interannual Climate Prediction and Predictability, was unequivocal in recommending the need for the development of a North American Multimodel Ensemble (NMME) operational predictive capability. Indeed, this effort is required to meet the specific tailored regional prediction and decision support needs of a large community of climate information users.

The multimodel ensemble approach has proven extremely effective at quantifying the prediction uncertainty that arises from uncertainty in model formulation, and it produces better prediction quality, on average, than any single-model ensemble. This approach is the basis for several international collaborative prediction research efforts and an operational European system, and there are numerous examples of multimodel ensembles yielding superior forecasts compared to any single model.

Based on two NOAA Climate Test Bed (CTB) NMME workshops (18 February and 8 April 2011), a collaborative and coordinated implementation strategy for an NMME prediction system has been developed and is currently delivering real-time seasonal-to-interannual predictions on the NOAA Climate Prediction Center (CPC) operational schedule. The hindcast and real-time prediction data are readily available (e.g., http://iridl.ldeo.columbia.edu/SOURCES/.Models/.NMME/) and in graphical format from CPC (www.cpc.ncep.noaa.gov/products/NMME/). Moreover, the NMME forecast is already being used as guidance for operational forecasters. This paper describes the new NMME effort and presents an overview of the multimodel forecast quality and the complementary skill associated with individual models.

Full access