Search Results

You are looking at 21–30 of 37 items for

  • Author or Editor: J.C. Wilson
Jesse E. Bell, Michael A. Palecki, C. Bruce Baker, William G. Collins, Jay H. Lawrimore, Ronald D. Leeper, Mark E. Hall, John Kochendorfer, Tilden P. Meyers, Tim Wilson, and Howard J. Diamond

Abstract

The U.S. Climate Reference Network (USCRN) is a network of climate-monitoring stations maintained and operated by the National Oceanic and Atmospheric Administration (NOAA) to provide climate-science-quality measurements of air temperature and precipitation. The stations in the network were designed to be extensible to other missions, and the National Integrated Drought Information System program determined that the USCRN could be augmented to provide observations that are more drought relevant. To increase the network’s capability of monitoring soil processes and drought, soil observations were added to USCRN instrumentation. In 2011, the USCRN team completed the installation of triplicate-configuration soil moisture and soil temperature probes at five standard depths (5, 10, 20, 50, and 100 cm), as prescribed by the World Meteorological Organization, at each USCRN station in the conterminous United States; in addition, the project included the installation of a relative humidity sensor at each of the stations. Work is also under way to eventually install soil sensors at the expanding USCRN stations in Alaska. USCRN data are stewarded by the NOAA National Climatic Data Center, and instrument engineering and performance studies, installation, and maintenance are performed by the NOAA Atmospheric Turbulence and Diffusion Division. This article provides a technical description of the USCRN soil observations in the context of U.S. soil-climate–measurement efforts and discusses the advantage of the triple-redundancy approach applied by the USCRN.
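
To illustrate how a triple-redundancy layout can be exploited, the minimal sketch below averages three co-located soil moisture readings at one depth after a simple median-based consistency check. The function name, the 0.05 m^3 m^-3 tolerance, and the exclusion rule are illustrative assumptions, not the USCRN quality-control algorithm.

import numpy as np

def merge_triplicate(probes, max_spread=0.05):
    """Combine three co-located soil moisture readings (m^3 m^-3) at one depth.

    Hypothetical check: a probe departing from the triplet median by more than
    max_spread is excluded before averaging. Returns the mean of the retained
    probes and the number of probes used.
    """
    values = np.asarray(probes, dtype=float)
    values = values[~np.isnan(values)]          # drop probes reporting no data
    if values.size == 0:
        return np.nan, 0
    good = values[np.abs(values - np.median(values)) <= max_spread]
    return good.mean(), good.size

# Example: the third probe has drifted high and is excluded from the average.
mean_vwc, n_used = merge_triplicate([0.21, 0.22, 0.35])
print(f"layer soil moisture = {mean_vwc:.3f} m^3 m^-3 from {n_used} probes")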

Full access
Steven M. Martinaitis, Katie A. Wilson, Nusrat Yussouf, Jonathan J. Gourley, Humberto Vergara, Tiffany C. Meyer, Pamela L. Heinselman, Alan Gerard, Kodi L. Berry, Andres Vergara, and Justin Monroe

Abstract

There are ongoing efforts to move beyond the current paradigm of using deterministic products driven by observation-only data to make binary warning decisions. Recent works have focused on severe thunderstorm hazards, such as hail, lightning, and tornadoes. This study discusses one of the first steps toward having probabilistic information combined with convective-scale short-term precipitation forecasts available for the prediction and warning of flash flooding. Participants in the Hydrometeorology Testbed–MRMS Hydrology (HMT-Hydro) experiment evaluated several probabilistic hydrologic model outputs from the probabilistic Flooded Locations and Simulated Hydrographs (PRO-FLASH) system during experimental real-time warning operations. Evaluation of flash flood warning performance combined with product surveys highlighted how forecasters perceived biases within the probabilistic information and how the different probabilistic approaches influenced warnings that were verified versus those that were unverified. The incorporation of the Warn-on-Forecast System (WoFS) ensemble precipitation forecasts into the PRO-FLASH product generation provided an opportunity to evaluate the first coupling of subhourly convective-scale ensemble precipitation forecasts with probabilistic hydrologic modeling at the flash flood warning time scale through archived case simulations. The addition of WoFS precipitation forecasts resulted in an increase in warning lead time, including four events with ≥29 min of additional lead time, but with increased probabilities of false alarms. Additional feedback from participants provided insights into the application of WoFS forecasts in warning decisions, including how flash flood expectations and confidence evolved for verified flash flood events and how forecast probabilistic products can positively influence the communication of the potential for flash flooding.
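
As a minimal illustration of the lead-time metric discussed above, the sketch below compares the lead time of a warning informed only by observed rainfall with one informed by forecast-driven probabilistic guidance. The times and the variable names are hypothetical and are not taken from the HMT-Hydro cases.

from datetime import datetime

def lead_time_minutes(warning_issued, event_onset):
    """Lead time (min) from warning issuance to the observed flash flood onset."""
    return (event_onset - warning_issued).total_seconds() / 60.0

# Hypothetical event comparing two warning strategies.
onset = datetime(2021, 5, 17, 22, 45)
obs_only_warning = datetime(2021, 5, 17, 22, 30)   # warning from observed rainfall only
wofs_warning = datetime(2021, 5, 17, 21, 58)       # warning aided by forecast-driven probabilities

gain = lead_time_minutes(wofs_warning, onset) - lead_time_minutes(obs_only_warning, onset)
print(f"additional lead time from forecast-aided guidance: {gain:.0f} min")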

Open access
T. Keenan, P. Joe, J. Wilson, C. Collier, B. Golding, D. Burgess, P. May, C. Pierce, J. Bally, A. Crook, A. Seed, D. Sills, L. Berry, R. Potts, I. Bell, N. Fox, E. Ebert, M. Eilts, K. O'Loughlin, R. Webb, R. Carbone, K. Browning, R. Roberts, and C. Mueller

The first World Weather Research Programme (WWRP) Forecast Demonstration Project (FDP), with a focus on nowcasting, was conducted in Sydney, Australia, from 4 September to 21 November 2000 during a period associated with the Sydney 2000 Olympic Games. Through international collaboration, nine nowcasting systems from the United States, United Kingdom, Canada, and Australia were deployed at the Sydney Office of the Bureau of Meteorology (BOM) to demonstrate the capability of modern forecast systems and to quantify the associated benefits in the delivery of a real-time nowcast service. Ongoing verification and impact studies supported by international committees assisted by the WWRP formed an integral part of this project. A description is given of the project, including component systems, the weather, and initial outcomes. Initial results show that the nowcasting systems tested were transferable and able to provide valuable information enhancing BOM nowcasts. The project provided an unprecedented interchange of concepts and ideas between forecasters, researchers, and end users in an operational framework where they all faced common issues relevant to real-time nowcast decision making. A training workshop sponsored by the World Meteorological Organization (WMO) was also held in conjunction with the project so that other member nations could benefit from the FDP.

Full access
L. C. Shaffrey, I. Stevens, W. A. Norton, M. J. Roberts, P. L. Vidale, J. D. Harle, A. Jrrar, D. P. Stevens, M. J. Woodage, M. E. Demory, J. Donners, D. B. Clark, A. Clayton, J. W. Cole, S. S. Wilson, W. M. Connolley, T. M. Davies, A. M. Iwi, T. C. Johns, J. C. King, A. L. New, J. M. Slingo, A. Slingo, L. Steenman-Clark, and G. M. Martin

Abstract

This article describes the development and evaluation of the U.K.’s new High-Resolution Global Environmental Model (HiGEM), which is based on the latest climate configuration of the Met Office Unified Model, known as the Hadley Centre Global Environmental Model, version 1 (HadGEM1). In HiGEM, the horizontal resolution has been increased to 0.83° latitude × 1.25° longitude for the atmosphere, and 1/3° × 1/3° globally for the ocean. Multidecadal integrations of HiGEM, and the lower-resolution HadGEM, are used to explore the impact of resolution on the fidelity of climate simulations.

Generally, SST errors are reduced in HiGEM. Cold SST errors associated with the path of the North Atlantic Drift improve, and warm SST errors are reduced in upwelling stratocumulus regions where the simulation of low-level cloud is better at higher resolution. The ocean model in HiGEM allows ocean eddies to be partially resolved, which dramatically improves the representation of sea surface height variability. In the Southern Ocean, most of the heat transport in HiGEM is achieved by resolved eddy motions, which replace the parameterized eddy heat transport of the lower-resolution model. HiGEM is also able to more realistically simulate small-scale features in the wind stress curl around islands and oceanic SST fronts, which may have implications for oceanic upwelling and ocean biology.

Higher resolution in both the atmosphere and the ocean allows coupling to occur on small spatial scales. In particular, the small-scale interaction recently seen in satellite imagery between the atmosphere and tropical instability waves in the tropical Pacific Ocean is realistically captured in HiGEM. Tropical instability waves play a role in improving the simulation of the mean state of the tropical Pacific, which has important implications for climate variability. In particular, all aspects of the simulation of ENSO (spatial patterns, the time scales at which ENSO occurs, and global teleconnections) are much improved in HiGEM.
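
A generic way to quantify SST error reductions of the kind noted above is an area-weighted RMSE against a reference climatology; the sketch below applies that metric to two synthetic fields with different error levels. It is not the HiGEM evaluation code, and the grids and error magnitudes are invented for illustration.

import numpy as np

def area_weighted_rmse(field, reference, lat):
    """Area-weighted RMSE of a (lat, lon) field against a reference, using cos(lat) weights."""
    weights = np.cos(np.deg2rad(lat))[:, None] * np.ones_like(field)
    return np.sqrt(np.average((field - reference) ** 2, weights=weights))

# Synthetic illustration: two "models" with different SST error magnitudes.
lat = np.linspace(-89.5, 89.5, 180)
reference = 15.0 + 12.0 * np.cos(np.deg2rad(lat))[:, None] * np.ones((180, 360))
rng = np.random.default_rng(0)
coarse = reference + rng.normal(0.0, 1.2, reference.shape)   # larger SST errors
fine = reference + rng.normal(0.0, 0.7, reference.shape)     # reduced SST errors

for name, sst in [("coarse", coarse), ("fine", fine)]:
    print(name, f"SST RMSE = {area_weighted_rmse(sst, reference, lat):.2f} K")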

Full access

Cloudnet

Continuous Evaluation of Cloud Profiles in Seven Operational Models Using Ground-Based Observations

A. J. Illingworth, R. J. Hogan, E. J. O'Connor, D. Bouniol, M. E. Brooks, J. Delanoé, D. P. Donovan, J. D. Eastment, N. Gaussiat, J. W. F. Goddard, M. Haeffelin, H. Klein Baltink, O. A. Krasnov, J. Pelon, J.-M. Piriou, A. Protat, H. W. J. Russchenberg, A. Seifert, A. M. Tompkins, G.-J. van Zadelhoff, F. Vinit, U. Willén, D. R. Wilson, and C. L. Wrench

The Cloudnet project aims to provide a systematic evaluation of clouds in forecast and climate models by comparing the model output with continuous ground-based observations of the vertical profiles of cloud properties. In the models, the properties of clouds are simplified and expressed in terms of the fraction of the model grid box that is filled with cloud, together with the liquid and ice water content of the clouds. These models must get the clouds right if they are to correctly represent both their radiative properties and their key role in the production of precipitation, but there are few observations of the vertical profiles of cloud properties to show whether or not they are successful. Cloud profiles derived from cloud radars, ceilometers, and dual-frequency microwave radiometers operated at three sites in France, the Netherlands, and the United Kingdom for several years have been compared with the clouds in seven European models. The advantage of this continuous appraisal is that feedback on how new versions of models are performing is provided in quasi-real time, as opposed to the much longer time scale needed for in-depth analysis of complex field studies. Here, two occasions are identified when the introduction of new versions of the ECMWF and Météo-France models led to an immediate improvement in the representation of the clouds, and statistics on the performance of all seven models are provided. The Cloudnet analysis scheme is currently being expanded to include sites outside Europe and further operational forecasting and climate models.
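
The kind of matched model-versus-observation comparison described above can be reduced to simple contingency-table scores for cloud occurrence; the sketch below shows one generic choice (hit rate and false-alarm ratio). This is an assumption for illustration rather than the specific statistics used by Cloudnet.

import numpy as np

def cloud_occurrence_scores(model_cf, obs_cf, threshold=0.05):
    """Hit rate and false-alarm ratio for cloud occurrence in matched profiles.

    model_cf, obs_cf: arrays of shape (time, height) holding cloud fraction.
    A grid box counts as cloudy when its fraction exceeds `threshold`.
    """
    m = model_cf > threshold
    o = obs_cf > threshold
    hits = np.sum(m & o)
    misses = np.sum(~m & o)
    false_alarms = np.sum(m & ~o)
    hit_rate = hits / max(hits + misses, 1)
    far = false_alarms / max(hits + false_alarms, 1)
    return hit_rate, far

# Synthetic example: model clouds slightly displaced in height from observations.
rng = np.random.default_rng(2)
obs = rng.random((240, 60)) * (rng.random((240, 60)) > 0.7)
model = np.roll(obs, 2, axis=1)                    # same clouds, shifted two levels
print(cloud_occurrence_scores(model, obs))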

Full access
Corey K. Potvin, Burkely T. Gallo, Anthony E. Reinhart, Brett Roberts, Patrick S. Skinner, Ryan A. Sobash, Katie A. Wilson, Kelsey C. Britt, Chris Broyles, Montgomery L. Flora, William J. S. Miller, and Clarice N. Satrio

Abstract

Thunderstorm mode strongly impacts the likelihood and predictability of tornadoes and other hazards, and thus is of great interest to severe weather forecasters and researchers. It is often impossible for a forecaster to manually classify all the storms within convection-allowing model (CAM) output during a severe weather outbreak, or for a scientist to manually classify all storms in a large CAM or radar dataset in a timely manner. Automated storm classification techniques facilitate these tasks and provide objective inputs to operational tools, including machine learning models for predicting thunderstorm hazards. Accurate storm classification, however, requires accurate storm segmentation. Many storm segmentation techniques fail to distinguish between clustered storms, thereby missing intense cells, or to identify cells embedded within quasi-linear convective systems that can produce tornadoes and damaging winds. Therefore, we have developed an iterative technique that identifies these constituent storms in addition to traditionally identified storms. Identified storms are classified according to a seven-mode scheme designed for severe weather operations and research. The classification model is a hand-developed decision tree that operates on storm properties computed from composite reflectivity and midlevel rotation fields. These properties include geometrical attributes, whether the storm contains smaller storms or resides within a larger-scale complex, and whether strong rotation exists near the storm centroid. We evaluate the classification algorithm using expert labels of 400 storms simulated by the NSSL Warn-on-Forecast System or analyzed by the NSSL Multi-Radar/Multi-Sensor product suite. The classification algorithm emulates expert opinion reasonably well (e.g., 76% accuracy for supercells), and therefore could facilitate a wide range of operational and research applications.
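
For readers unfamiliar with mode classification by decision tree, the toy sketch below shows how a hand-built tree over a few storm properties might assign mode labels. The property names, thresholds, and labels are illustrative assumptions and do not reproduce the seven-mode NSSL scheme described above.

def classify_storm(props):
    """Toy decision tree over a few storm properties (not the NSSL seven-mode scheme).

    props keys:
      area_km2        - storm area from composite reflectivity
      aspect_ratio    - major/minor axis ratio of the storm outline
      in_complex      - True if the cell resides within a larger convective system
      strong_rotation - True if strong midlevel rotation exists near the centroid
    """
    if props["in_complex"]:
        return ("rotating cell embedded in a larger system"
                if props["strong_rotation"] else "cell embedded in a larger system")
    if props["aspect_ratio"] >= 3.0:
        return "quasi-linear convective system"
    if props["strong_rotation"]:
        return "supercell"
    return "ordinary cell" if props["area_km2"] < 300.0 else "cell cluster"

print(classify_storm({"area_km2": 250.0, "aspect_ratio": 1.4,
                      "in_complex": False, "strong_rotation": True}))   # -> supercell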

Significance Statement

We have developed a new technique for automatically identifying intense thunderstorms in model and radar data and classifying storm mode, which informs forecasters about the risks of tornadoes and other high-impact weather. The technique identifies storms that are often missed by other methods, including cells embedded within storm clusters, and successfully classifies important storm modes that are generally not included in other schemes, such as rotating cells embedded within quasi-linear convective systems. We hope the technique will facilitate a variety of forecasting and research efforts.

Full access
Marcin J. Kurowski, Joao Teixeira, Chi Ao, Shannon Brown, Anthony B. Davis, Linda Forster, Kuo-Nung Wang, Matthew Lebsock, Mary Morris, Vivienne Payne, Mark T. Richardson, Richard Roy, David R. Thompson, and Robert C. Wilson

Abstract

To address critical gaps identified by the National Academies of Sciences, Engineering, and Medicine in the current Earth system observation strategy, the 2017–27 Decadal Survey for Earth Science and Applications from Space recommended incubating concepts for future targeted observables including the atmospheric planetary boundary layer (PBL). A subsequent NASA PBL Incubation Study Team Report identified measurement requirements and activities for advancing the maturity of the technologies applicable to the PBL targeted observables and their associated science and applications priorities. While the PBL is the critical layer where humans live and surface energy, moisture, and mass exchanges drive the Earth system, it is also the farthest and most inaccessible layer for spaceborne instruments. Here we document a PBL retrieval observing system simulation experiment (OSSE) framework suitable for assessing existing and new measurement techniques and determining their accuracy and improvements needed for addressing the elevated Decadal Survey requirements. In particular, the benefits of large-eddy simulation (LES) are emphasized as a key source of high-resolution synthetic observations for key PBL regimes: from the tropics, through subtropics and midlatitudes, to subpolar and polar regions. The potential of LES-based PBL retrieval OSSEs is explored using six instrument simulators: Global Navigation Satellite System–Radio Occultation, differential absorption radar, visible to shortwave infrared spectrometer, infrared sounder, Multi-angle Imaging SpectroRadiometer, and microwave sounder. The crucial role of LES in PBL retrieval OSSEs and some perspectives for instrument developments are discussed.
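
One way to see the role of LES in such an OSSE is as the "truth" to which a simplified instrument forward operator is applied; the sketch below smooths a high-resolution temperature profile to a coarser effective vertical resolution and adds instrument noise. The kernel width and noise level are assumptions for illustration and do not correspond to any of the six simulators named above.

import numpy as np

def synthetic_profile(truth, halfwidth=3, noise_std=0.3, seed=0):
    """Very simplified instrument simulator for an OSSE: boxcar-smooth the
    high-resolution LES 'truth' (coarse effective vertical resolution) and add
    Gaussian instrument noise."""
    kernel = np.ones(2 * halfwidth + 1)
    kernel /= kernel.sum()
    smoothed = np.convolve(truth, kernel, mode="same")
    rng = np.random.default_rng(seed)
    return smoothed + rng.normal(0.0, noise_std, truth.shape)

# Toy LES-like temperature profile with a sharp inversion near the PBL top (~1 km).
z = np.arange(0.0, 3000.0, 25.0)                  # height (m)
truth = 300.0 - 0.0065 * z + 2.0 * (z > 1000.0)   # temperature (K)
obs = synthetic_profile(truth)
print(f"synthetic-minus-truth RMSE: {np.sqrt(np.mean((obs - truth) ** 2)):.2f} K")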

Open access
Leo J. Donner, Bruce L. Wyman, Richard S. Hemler, Larry W. Horowitz, Yi Ming, Ming Zhao, Jean-Christophe Golaz, Paul Ginoux, S.-J. Lin, M. Daniel Schwarzkopf, John Austin, Ghassan Alaka, William F. Cooke, Thomas L. Delworth, Stuart M. Freidenreich, C. T. Gordon, Stephen M. Griffies, Isaac M. Held, William J. Hurlin, Stephen A. Klein, Thomas R. Knutson, Amy R. Langenhorst, Hyun-Chul Lee, Yanluan Lin, Brian I. Magi, Sergey L. Malyshev, P. C. D. Milly, Vaishali Naik, Mary J. Nath, Robert Pincus, Jeffrey J. Ploshay, V. Ramaswamy, Charles J. Seman, Elena Shevliakova, Joseph J. Sirutis, William F. Stern, Ronald J. Stouffer, R. John Wilson, Michael Winton, Andrew T. Wittenberg, and Fanrong Zeng

Abstract

The Geophysical Fluid Dynamics Laboratory (GFDL) has developed a coupled general circulation model (CM3) for the atmosphere, oceans, land, and sea ice. The goal of CM3 is to address emerging issues in climate change, including aerosol–cloud interactions, chemistry–climate interactions, and coupling between the troposphere and stratosphere. The model is also designed to serve as the physical system component of earth system models and models for decadal prediction in the near-term future—for example, through improved simulation of tropical land precipitation relative to earlier-generation GFDL models. This paper describes the dynamical core, physical parameterizations, and basic simulation characteristics of the atmospheric component (AM3) of this model. Relative to GFDL AM2, AM3 includes new treatments of deep and shallow cumulus convection, cloud droplet activation by aerosols, subgrid variability of stratiform vertical velocities for droplet activation, and atmospheric chemistry driven by emissions with advective, convective, and turbulent transport. AM3 employs a cubed-sphere implementation of a finite-volume dynamical core and is coupled to LM3, a new land model with ecosystem dynamics and hydrology. Its horizontal resolution is approximately 200 km, and its vertical resolution ranges from approximately 70 m near the earth’s surface to 1–1.5 km near the tropopause and 3–4 km in much of the stratosphere. Most basic circulation features in AM3 are simulated as realistically as, or more realistically than, in AM2. In particular, dry biases have been reduced over South America. In coupled mode, the simulation of Arctic sea ice concentration has improved. AM3 aerosol optical depths, scattering properties, and surface clear-sky downward shortwave radiation are more realistic than in AM2. The simulation of marine stratocumulus decks remains problematic, as in AM2. The most intense 0.2% of precipitation rates occur less frequently in AM3 than observed. The last two decades of the twentieth century warm in CM3 by 0.32°C relative to 1881–1920. The Climatic Research Unit (CRU) and Goddard Institute for Space Studies analyses of observations show warming of 0.56° and 0.52°C, respectively, over this period. CM3 includes anthropogenic cooling by aerosol–cloud interactions, and its warming by the late twentieth century is somewhat less realistic than in CM2.1, which warmed 0.66°C but did not include aerosol–cloud interactions. The improved simulation of the direct aerosol effect (apparent in surface clear-sky downward radiation) in CM3 evidently acts in concert with its simulation of cloud–aerosol interactions to limit greenhouse gas warming.
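
The late-twentieth-century warming numbers quoted above are epoch differences of global-mean surface temperature; the sketch below computes such a difference from an annual series. The series here is synthetic, and the function is only a generic illustration of the arithmetic, not the analysis performed for CM3.

import numpy as np

def epoch_warming(years, tas, base=(1881, 1920), epoch=(1981, 2000)):
    """Mean of `epoch` minus mean of `base` in an annual global-mean series (°C)."""
    years = np.asarray(years)
    tas = np.asarray(tas)
    base_mean = tas[(years >= base[0]) & (years <= base[1])].mean()
    epoch_mean = tas[(years >= epoch[0]) & (years <= epoch[1])].mean()
    return epoch_mean - base_mean

# Synthetic series with a gentle warming trend plus interannual noise.
years = np.arange(1881, 2001)
rng = np.random.default_rng(3)
tas = 13.8 + 0.005 * (years - 1881) + rng.normal(0.0, 0.1, years.size)
print(f"1981-2000 warming relative to 1881-1920: {epoch_warming(years, tas):.2f} °C")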

Full access
S. Pawson, K. Kodera, K. Hamilton, T. G. Shepherd, S. R. Beagley, B. A. Boville, J. D. Farrara, T. D. A. Fairlie, A. Kitoh, W. A. Lahoz, U. Langematz, E. Manzini, D. H. Rind, A. A. Scaife, K. Shibata, P. Simon, R. Swinbank, L. Takacs, R. J. Wilson, J. A. Al-Saadi, M. Amodei, M. Chiba, L. Coy, J. de Grandpré, R. S. Eckman, M. Fiorino, W. L. Grose, H. Koide, J. N. Koshyk, D. Li, J. Lerner, J. D. Mahlman, N. A. McFarlane, C. R. Mechoso, A. Molod, A. O'Neill, R. B. Pierce, W. J. Randel, R. B. Rood, and F. Wu

To investigate the effects of the middle atmosphere on climate, the World Climate Research Programme is supporting the project “Stratospheric Processes and their Role in Climate” (SPARC). A central theme of SPARC, to examine model simulations of the coupled troposphere–middle atmosphere system, is being performed through the initiative called GRIPS (GCM-Reality Intercomparison Project for SPARC). In this paper, an overview of the objectives of GRIPS is given. Initial activities include an assessment of the performance of middle atmosphere climate models, and preliminary results from this evaluation are presented here. It is shown that although all 13 models evaluated represent most major features of the mean atmospheric state, there are deficiencies in the magnitude and location of the features, which cannot easily be traced to the formulation (resolution or the parameterizations included) of the models. Most models show a cold bias in all locations, apart from the tropical tropopause region where they can be either too warm or too cold. The strengths and locations of the major jets are often misrepresented in the models. Looking at three-dimensional fields reveals, for some models, more severe deficiencies in the magnitude and positioning of the dominant structures (such as the Aleutian high in the stratosphere), although undersampling might explain some of these differences from observations. All the models have shortcomings in their simulations of the present-day climate, which might limit the accuracy of predictions of the climate response to ozone change and other anomalous forcing.
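
The cold-bias diagnosis summarized above amounts to a latitude-weighted mean of model-minus-reference temperature at each level; the sketch below shows that calculation on synthetic zonal-mean fields. The grid, the 2-K offset, and the tropical-tropopause exception are invented for illustration and are not taken from the GRIPS assessment.

import numpy as np

def level_mean_bias(model_t, ref_t, lat):
    """Cosine-latitude-weighted mean bias (model minus reference, K) per level.

    model_t, ref_t: arrays of shape (level, latitude) of zonal-mean temperature.
    """
    weights = np.cos(np.deg2rad(lat))
    return np.average(model_t - ref_t, axis=1, weights=weights)

# Synthetic check: a model 2 K too cold everywhere except near the tropical tropopause.
lat = np.linspace(-85.0, 85.0, 35)
ref = np.full((20, lat.size), 220.0)
model = ref - 2.0
model[12, np.abs(lat) < 15.0] += 3.0       # warm anomaly at one tropical level
print(np.round(level_mean_bias(model, ref, lat), 2))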

Full access
Pamela L. Heinselman, Patrick C. Burke, Louis J. Wicker, Adam J. Clark, John S. Kain, Jidong Gao, Nusrat Yussouf, Thomas A. Jones, Patrick S. Skinner, Corey K. Potvin, Katie A. Wilson, Burkely T. Gallo, Montgomery L. Flora, Joshua Martin, Gerry Creager, Kent H. Knopfmeier, Yunheng Wang, Brian C. Matilla, David C. Dowell, Edward R. Mansell, Brett Roberts, Kimberly A. Hoogewind, Derek R. Stratman, Jorge Guerra, Anthony E. Reinhart, Christopher A. Kerr, and William Miller

Abstract

In 2009, advancements in NWP and computing power inspired a vision to advance hazardous weather warnings from a warn-on-detection to a warn-on-forecast paradigm. This vision would require not only the prediction of individual thunderstorms and their attributes but also the likelihood of their occurrence in time and space. During the last decade, the warn-on-forecast research team at the NOAA National Severe Storms Laboratory met this challenge through the research and development of 1) an ensemble of high-resolution convection-allowing models; 2) ensemble- and variational-based assimilation of weather radar, satellite, and conventional observations; and 3) unique postprocessing and verification techniques, culminating in the experimental Warn-on-Forecast System (WoFS). Since 2017, we have directly engaged users in the testing, evaluation, and visualization of this system to ensure that WoFS guidance is usable and useful to operational forecasters at NOAA national centers and local offices responsible for forecasting severe weather, tornadoes, and flash floods across the watch-to-warning continuum. Although an experimental WoFS is now a reality, we close by discussing many of the exciting opportunities that remain, including folding this system into the Unified Forecast System, transitioning WoFS into NWS operations, and pursuing next-decade science goals for further advancing storm-scale prediction.
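
A common way to express such storm-scale probabilistic guidance is a neighborhood ensemble probability of exceedance; the sketch below computes one from an ensemble of hazard-proxy grids. The member count, threshold, and neighborhood size are assumptions for illustration, not the WoFS operational postprocessing.

import numpy as np
from scipy.ndimage import maximum_filter

def neighborhood_exceedance_probability(members, threshold, size=5):
    """Fraction of ensemble members exceeding `threshold` anywhere within a
    square neighborhood of each grid point.

    members: array (n_members, ny, nx) of a hazard proxy (e.g., updraft helicity).
    """
    exceed = np.stack([maximum_filter(m, size=size) >= threshold for m in members])
    return exceed.mean(axis=0)

# Toy 18-member ensemble; one member contains a strong local maximum.
rng = np.random.default_rng(1)
members = rng.gamma(2.0, 10.0, size=(18, 50, 50))
members[3, 25, 25] = 300.0
prob = neighborhood_exceedance_probability(members, threshold=150.0)
print(f"maximum gridpoint probability: {prob.max():.2f}")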

Significance Statement

The purpose of this research is to develop an experimental prediction system that forecasts the probability of severe weather hazards associated with individual thunderstorms up to 6 h in advance. This capability is important because some people and organizations, like those living in mobile homes, caring for patients in hospitals, or managing large outdoor events, require extended lead time to protect themselves and others from potential severe weather hazards. Our results demonstrate a prediction system that enables forecasters, for the first time, to message probabilistic hazard information associated with individual severe storms within the watch-to-warning time frame in the United States.

Restricted access