Search Results

You are looking at 1–10 of 79 items for:

  • Author or Editor: Russ S. Schumacher
Russ S. Schumacher

Abstract

In this study, idealized numerical simulations are used to identify the processes responsible for initiating, organizing, and maintaining quasi-stationary convective systems that produce locally extreme rainfall amounts. Of particular interest are those convective systems that have been observed to occur near mesoscale convective vortices (MCVs) and other midlevel circulations. To simulate the lifting associated with such circulations, a low-level momentum forcing is applied to an initial state that is representative of observed extreme rain events. The initial vertical wind profile includes a sharp reversal of the vertical wind shear with height, indicative of observed low-level jets.

Deep moist convection initiates within the region of mesoscale lifting, and the resulting convective system replicates many of the features of observed systems. The low-level thermodynamic environment is nearly saturated, which is not conducive to the production of a strong surface cold pool; yet the convection quickly organizes into a back-building line. It is shown that a nearly stationary convectively generated low-level gravity wave is responsible for the linear organization, which continues for several hours. New convective cells repeatedly form on the southwest end of the line and move to the northeast, resulting in large local rainfall amounts. In the later stages of the simulated convective system, a cold pool does develop, but its interaction with the strong reverse shear at low levels is not optimized for the maintenance of deep convection along its edge. A series of sensitivity experiments shows some of the effects of hydrometeor evaporation and melting, planetary rotation, and the imposed mesoscale forcing.

Full access
Russ S. Schumacher

Abstract

Using a method for initiating a quasi-stationary, heavy-rain-producing elevated mesoscale convective system in an idealized numerical modeling framework, a series of experiments is conducted in which a shallow layer of drier air is introduced within the near-surface stable layer. The environment is still very moist in the experiments, with changes to the column-integrated water vapor of only 0.3%–1%. The timing and general evolution of the simulated convective systems are very similar, but rainfall accumulation at the surface is changed by a much larger fraction than the reduction in moisture, with point precipitation maxima reduced by up to 29% and domain-averaged precipitation accumulations reduced by up to 15%. The differences in precipitation are partially attributed to increases in the evaporation rate in the shallow subcloud layer, though this is found to be a secondary effect. More importantly, even though the near-surface layer has strong convective inhibition in all simulations and the convective available potential energy of the most unstable parcels is unchanged, convection is less intense in the experiments with drier subcloud layers because less air originating in that layer rises in convective updrafts. An additional experiment with a cooler near-surface layer corroborates these findings. The results from these experiments suggest that convective systems assumed to be elevated are, in fact, drawing air from near the surface unless the low levels are very stable. Considering that the moisture differences imposed here are comparable to observational uncertainties in low-level temperature and moisture, the strong sensitivity of accumulated precipitation to these quantities has implications for the predictability of extreme rainfall.
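The magnitude of the imposed moisture perturbations can be made concrete with a column-integrated water vapor (precipitable water) calculation. The sketch below is illustrative only: the sounding, layer depth, and drying fraction are hypothetical placeholders, not the model configuration used in the study.

```python
import numpy as np

G = 9.81        # gravitational acceleration (m s^-2)
RHO_W = 1000.0  # density of liquid water (kg m^-3)

def precipitable_water(p_pa, q_kgkg):
    """Column-integrated water vapor in mm: PW = (1 / (rho_w * g)) * integral of q dp.

    p_pa   : pressure levels in Pa, surface first (decreasing upward)
    q_kgkg : water vapor mixing ratio (kg/kg) on the same levels
    """
    dp = -np.diff(p_pa)                       # layer thicknesses (Pa), positive
    q_mid = 0.5 * (q_kgkg[:-1] + q_kgkg[1:])  # layer-mean mixing ratio
    return np.sum(q_mid * dp) / (RHO_W * G) * 1000.0

# Hypothetical moist sounding: 1000-100 hPa, moisture decaying with height
p = np.linspace(100000.0, 10000.0, 37)             # Pa
q = 0.016 * np.exp(-(100000.0 - p) / 25000.0)      # kg/kg, made-up profile

pw_control = precipitable_water(p, q)

# Dry only a shallow near-surface layer (lowest ~50 hPa) by 10% in mixing ratio
q_dry = q.copy()
q_dry[p > 95000.0] *= 0.90
pw_dry = precipitable_water(p, q_dry)

print(f"control PW: {pw_control:.1f} mm")
print(f"dried PW:   {pw_dry:.1f} mm "
      f"({100.0 * (pw_control - pw_dry) / pw_control:.2f}% reduction)")
```

Drying a shallow layer by even 10% changes the column total by well under 1%, which is the sense in which the experiments above perturb the moisture field only slightly while the rainfall response is much larger.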

Full access
Russ S. Schumacher

Abstract

On 31 May 2013, a supercell thunderstorm initiated in west-central Oklahoma and produced a deadly tornado. This convection then grew upscale, with a nearly stationary line developing early on 1 June that produced very heavy rainfall and caused deadly flash flooding in the Oklahoma City area. Real-time convection-allowing (Δx = 4 km) model forecasts used during the Mesoscale Predictability Experiment (MPEX) provided accurate guidance regarding the timing, location, and evolution of convection in this case. However, attempts to simulate this event at higher resolution degraded the forecast, with the primary supercell failing to initiate and the evolution of the overnight MCS not resembling the observed system. Experiments to test the dependence of forecasts of this event on model resolution show that with grid spacing smaller than 4 km, mixing along the dryline in northwest Texas was more vigorous, causing low-level dry air to move more quickly eastward into Oklahoma. This drying prevented the supercell from initiating near the triple point in the higher-resolution simulations. Then, the lack of supercellular convection and its associated cold pool altered the evolution of subsequent convection. Whereas in observations and the 4-km forecast, a nearly stationary MCS developed parallel to, but displaced from, the supercell’s cold pool, the higher-resolution simulations instead had a faster-moving squall line that produced less rainfall. Although the degradation of convective forecasts at higher resolution is probably unusual and appears sensitive to the choice of boundary layer parameterization, these findings demonstrate that how numerical models treat boundary layer processes at different grid spacings can, in some cases, have profound influences on predictions of high-impact weather.

Full access
Russ S. Schumacher

Abstract

Floods and flash floods are, by their nature, a multidisciplinary problem: they result from a convergence of atmospheric conditions, the underlying topography, hydrological processes, and the built environment. Research aimed at addressing various aspects of floods, on the other hand, often follows paths that do not directly address all of these fundamental connections. With this in mind, the NSF-sponsored Studies of Precipitation, Flooding, and Rainfall Extremes Across Disciplines (SPREAD) workshop was organized and held in Colorado during the summers of 2013 and 2014. SPREAD brought together a group of 27 graduate students from a wide variety of academic disciplines, but with the unifying theme being research interests in extreme precipitation or flooding. During the first meeting of the workshop, groups of graduate student participants designed interdisciplinary research projects that they then began work on over the intervening year, with the second meeting providing a venue to present their results. This article will outline the preliminary findings of these research efforts. Furthermore, the workshop participants had the unique and meaningful experience of visiting several locations in Colorado that had flooded in the past, and then visiting them again in the aftermath of the devastating 2013 floods. In total, the workshop resulted in several fruitful research activities that will advance understanding of precipitation and flooding. Even more importantly, the workshop fostered the development of a network of early-career researchers and practitioners who will be “multilingual” in terms of scientific disciplines, and who are poised to lead within their respective careers and across the scientific community.

Full access
Russ S. Schumacher

Abstract

This study makes use of operational global ensemble forecasts from the European Centre for Medium-Range Weather Forecasts (ECMWF) to examine the factors contributing to, or inhibiting, the development of a long-lived continental vortex and its associated rainfall. From 25 to 30 June 2007, a vortex developed and grew upscale over the southern plains of the United States. It was associated with persistent heavy rainfall, with over 100 mm of rain falling in much of Texas, Oklahoma, Kansas, and Missouri, and amounts exceeding 300 mm in southeastern Kansas. Previous research has shown that, in comparison with other rainfall events of similar temporal and spatial scales, this event was particularly difficult for numerical models to predict.

Considering the ensemble members as different possible realizations of the evolution of the event, several methods are used to examine the processes that led to the development and maintenance of the long-lived vortex and its associated rainfall, and to its apparently limited predictability. Linear statistics are calculated to identify synoptic-scale flow features that were correlated to area-averaged precipitation, and differences between composites of “dry” and “wet” ensemble members are used to pinpoint the processes that were favorable or detrimental to the system’s development. The maintenance of the vortex, and its slow movement in the southern plains, are found to be closely related to the strength of a closed midlevel anticyclone in the southwestern United States and the strength of a midlevel ridge in the northern plains. In particular, with a weaker upstream anticyclone, the shear and flow over the incipient vortex are relatively weak, which allows for slow movement and persistent heavy rains. On the other hand, when the upstream anticyclone is stronger, there is stronger northerly shear and flow, which causes the incipient vortex to move southwestward into the high terrain of Mexico and dissipate. These relatively small differences in the wind and mass fields early in the ensemble forecast, in conjunction with modifications of the synoptic and mesoscale flow by deep convection, lead to very large spread in the resulting precipitation forecasts.
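The kind of ensemble-based linear statistics and dry/wet compositing described above can be sketched as follows. The arrays are random placeholders standing in for the ECMWF ensemble fields (a 500-hPa height field per member and each member's area-averaged precipitation); this is a minimal illustration of the diagnostic, not the study's analysis code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ensemble: n_members forecasts of a 2D midlevel height field and
# the corresponding area-averaged precipitation for each member.
n_members, ny, nx = 50, 40, 60
z500 = rng.normal(5800.0, 30.0, size=(n_members, ny, nx))            # m
area_avg_precip = rng.gamma(shape=2.0, scale=10.0, size=n_members)   # mm

# Pointwise linear correlation between the height field and area-averaged
# precipitation across ensemble members (an ensemble-sensitivity-style diagnostic).
z_anom = z500 - z500.mean(axis=0)
p_anom = area_avg_precip - area_avg_precip.mean()
corr = (z_anom * p_anom[:, None, None]).mean(axis=0) / (
    z500.std(axis=0) * area_avg_precip.std()
)

# Composite difference between the "wet" and "dry" thirds of the ensemble.
order = np.argsort(area_avg_precip)
dry_members, wet_members = order[: n_members // 3], order[-(n_members // 3):]
composite_diff = z500[wet_members].mean(axis=0) - z500[dry_members].mean(axis=0)

print("max |correlation|:", np.abs(corr).max())
print("max wet-minus-dry height difference (m):", np.abs(composite_diff).max())
```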

Full access
Russ S. Schumacher and Richard H. Johnson

Abstract

This study examines the characteristics of a large number of extreme rain events over the eastern two-thirds of the United States. Over a 5-yr period, 184 events are identified where the 24-h precipitation total at one or more stations exceeds the 50-yr recurrence amount for that location. Over the entire region of study, these events are most common in July. In the northern United States, extreme rain events are confined almost exclusively to the warm season; in the southern part of the country, these events are distributed more evenly throughout the year. National composite radar reflectivity data are used to classify each event as a mesoscale convective system (MCS), a synoptic system, or a tropical system, and then to classify the MCS and synoptic events into subclassifications based on their organizational structures. This analysis shows that 66% of all the events and 74% of the warm-season events are associated with MCSs; nearly all of the cool-season events are caused by storms with strong synoptic forcing. Similarly, nearly all of the extreme rain events in the northern part of the country are caused by MCSs; synoptic and tropical systems play a larger role in the South and East. MCS-related events are found to most commonly begin at around 1800 local standard time (LST), produce their peak rainfall between 2100 and 2300 LST, and dissipate or move out of the affected area by 0300 LST.
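For illustration, a hedged sketch of how exceedances of a 50-yr recurrence amount might be flagged from station data, assuming a hypothetical table of 24-h station totals and per-station recurrence thresholds; the event identification and radar-based classification in the study involve considerably more steps.

```python
import pandas as pd

# Hypothetical inputs: 24-h precipitation totals by station, and the 50-yr
# recurrence amount (mm) for a 24-h accumulation at each station.
obs = pd.DataFrame(
    {
        "station": ["KSGF", "KSGF", "KTUL", "KDSM"],
        "date": pd.to_datetime(["2003-07-12", "2003-09-02", "2004-06-25", "2004-07-04"]),
        "precip_mm": [168.0, 95.0, 212.0, 140.0],
    }
)
recurrence_50yr = pd.Series({"KSGF": 150.0, "KTUL": 180.0, "KDSM": 160.0})

# Flag station-days exceeding the local 50-yr 24-h recurrence amount.
obs["threshold_mm"] = obs["station"].map(recurrence_50yr)
events = obs[obs["precip_mm"] > obs["threshold_mm"]]

# Simple monthly distribution of the identified extreme rain events.
print(events)
print(events["date"].dt.month.value_counts().sort_index())
```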

Full access
Gregory R. Herman and Russ S. Schumacher

Abstract

A continental United States (CONUS)-wide framework for analyzing quantitative precipitation forecasts (QPFs) from NWP models from the perspective of precipitation return period (RP) exceedances is introduced using threshold estimates derived from a combination of NOAA Atlas 14 and older sources. Forecasts between 2009 and 2015 from several different NWP models of varying configurations and spatial resolutions are analyzed to assess bias characteristics and forecast skill for predicting RP exceedances. Specifically, NOAA’s Global Ensemble Forecast System Reforecast (GEFS/R) and the National Severe Storms Laboratory WRF (NSSL-WRF) model are evaluated for 24-h precipitation accumulations. The climatology of extreme precipitation events for 6-h accumulations is also explored in three convection-allowing models: 1) NSSL-WRF, 2) the North American Mesoscale 4-km nest (NAM-NEST), and 3) the experimental High Resolution Rapid Refresh (HRRR). The GEFS/R and NSSL-WRF are both found to exhibit similar 24-h accumulation RP exceedance climatologies over the U.S. West Coast to those found in observations and are found to be approximately equally skillful at predicting these exceedance events in this region. In contrast, over the eastern two-thirds of the CONUS, GEFS/R struggles to predict the predominantly convectively driven extreme QPFs, predicting far fewer events than are observed and exhibiting inferior forecast skill to the NSSL-WRF. The NSSL-WRF and HRRR are found to produce 6-h extreme precipitation climatologies that are approximately in accord with those found in the observations, while NAM-NEST produces many more RP exceedances than are observed across all of the CONUS.
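A minimal sketch of verifying QPF-based return-period exceedances against observations on a common grid follows. The QPF, observed, and threshold arrays are random placeholders; the framework described above (Atlas 14 thresholds, multiple models, resolutions, and accumulation windows) is far more involved.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical gridded 24-h fields (mm): model QPF, observed analysis, and the
# local 1-yr return-period (RP) threshold at each grid point.
ny, nx = 100, 150
qpf = rng.gamma(2.0, 8.0, size=(ny, nx))
obs = rng.gamma(2.0, 8.0, size=(ny, nx))
rp_threshold = np.full((ny, nx), 60.0)

fcst_exceed = qpf >= rp_threshold
obs_exceed = obs >= rp_threshold

hits = np.sum(fcst_exceed & obs_exceed)
misses = np.sum(~fcst_exceed & obs_exceed)
false_alarms = np.sum(fcst_exceed & ~obs_exceed)

# Frequency bias and critical success index for RP exceedance forecasts.
bias = (hits + false_alarms) / max(hits + misses, 1)
csi = hits / max(hits + misses + false_alarms, 1)
print(f"grid points forecast to exceed: {fcst_exceed.sum()}, observed: {obs_exceed.sum()}")
print(f"frequency bias: {bias:.2f}, CSI: {csi:.2f}")
```

A frequency bias well above one would correspond to the over-prediction of RP exceedances noted for NAM-NEST, while a bias well below one corresponds to the under-prediction noted for the GEFS/R over the eastern CONUS.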

Full access
Gregory R. Herman and Russ S. Schumacher

Abstract

Approximately 11 years of reforecasts from NOAA’s Second-Generation Global Ensemble Forecast System Reforecast (GEFS/R) model are used to train a contiguous United States (CONUS)-wide gridded probabilistic prediction system for locally extreme precipitation. This system is developed primarily using the random forest (RF) algorithm. Locally extreme precipitation is quantified for 24-h precipitation accumulations in the framework of average recurrence intervals (ARIs), with two severity levels: 1- and 10-yr ARI exceedances. Forecasts are made from 0000 UTC forecast initializations for two 1200–1200 UTC periods: days 2 and 3, comprising, respectively, forecast hours 36–60 and 60–84. Separate models are trained for each of eight forecast regions and for each forecast lead time. GEFS/R predictors vary in space and time relative to the forecast point and include not only the quantitative precipitation forecast (QPF) output from the model, but also variables that characterize the meteorological regime, including winds, moisture, and instability. Numerous sensitivity experiments are performed to determine the effects of the inclusion or exclusion of different aspects of forecast information in the model predictors, the choice of statistical algorithm, and the effect of performing dimensionality reduction via principal component analysis as a preprocessing step. Overall, it is found that the machine learning (ML)-based forecasts add significant skill over exceedance forecasts produced from both the raw GEFS/R ensemble QPFs and from the European Centre for Medium-Range Weather Forecasts’ (ECMWF) global ensemble across almost all regions of the CONUS. ML-based forecasts are found to be underconfident, while raw ensemble forecasts are highly overconfident.
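A hedged sketch of the core statistical step, training a random forest on reforecast-derived predictors to output a probability of a 1-yr ARI exceedance at a forecast point, is given below. The predictor matrix and labels are random placeholders standing in for the GEFS/R predictors and observed exceedances; the regional training, predictor construction, and verification in the paper are far more extensive.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)

# Placeholder training data: rows are forecast point/days, columns are
# reforecast predictors (QPF plus regime variables such as winds, moisture,
# and instability, sampled in space and time around the forecast point).
n_samples, n_features = 20000, 40
X = rng.normal(size=(n_samples, n_features))
# Binary labels: did observed 24-h precipitation exceed the local 1-yr ARI amount?
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n_samples)) > 2.5

rf = RandomForestClassifier(n_estimators=200, min_samples_leaf=20,
                            n_jobs=-1, random_state=0)
rf.fit(X[:15000], y[:15000])

# Probabilistic 1-yr ARI exceedance forecasts for held-out cases.
prob_exceed = rf.predict_proba(X[15000:])[:, 1]
print("mean forecast probability:", prob_exceed.mean())
print("observed exceedance frequency:", y[15000:].mean())
```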

Full access
Russ S. Schumacher and Adam J. Clark

Abstract

This study investigates probabilistic forecasts made using different convection-allowing ensemble configurations for a three-day period in June 2010 when numerous heavy-rain-producing mesoscale convective systems (MCSs) occurred in the United States. These MCSs developed both along a baroclinic zone in the Great Plains, and in association with a long-lived mesoscale convective vortex (MCV) in Texas and Arkansas. Four different ensemble configurations were developed using an ensemble-based data assimilation system. Two configurations used continuously cycled data assimilation, and two started the assimilation 24 h prior to the initialization of each forecast. Each configuration was run with both a single set of physical parameterizations and a mixture of physical parameterizations. These four ensemble forecasts were also compared with an ensemble run in real time by the Center for the Analysis and Prediction of Storms (CAPS). All five of these ensemble systems produced skillful probabilistic forecasts of the heavy-rain-producing MCSs, with the ensembles using mixed physics providing forecasts with greater skill and less overall bias compared to the single-physics ensembles. The forecasts using ensemble-based assimilation systems generally outperformed the real-time CAPS ensemble at lead times of 6–18 h, whereas the CAPS ensemble was the most skillful at forecast hours 24–30, though it also exhibited a wet bias. The differences between the ensemble precipitation forecasts were found to be related in part to differences in the analysis of the MCV and its environment, which in turn affected the evolution of errors in the forecasts of the MCSs. These results underscore the importance of representing model error in convection-allowing ensemble analysis and prediction systems.

Full access
Gregory R. Herman and Russ S. Schumacher

Abstract

Three different statistical algorithms are applied to forecast locally extreme precipitation across the contiguous United States (CONUS) as quantified by 1- and 10-yr average recurrence interval (ARI) exceedances for 1200–1200 UTC forecasts spanning forecast hours 36–60 and 60–84, denoted, respectively, day 2 and day 3. Predictors come from nearly 11 years of reforecasts from NOAA’s Second-Generation Global Ensemble Forecast System Reforecast (GEFS/R) model and derive from a variety of thermodynamic and kinematic variables that characterize the meteorological regime in addition to the quantitative precipitation forecast (QPF) output from the ensemble. In addition to encompassing nine different atmospheric fields, predictors also vary in space and time relative to the forecast point. Distinct models are trained for eight different hydrometeorologically cohesive regions of the CONUS. One algorithm supplies the GEFS/R predictors directly to a random forest (RF) procedure to produce extreme precipitation forecasts; the second also employs RFs, but the predictors instead undergo principal component analysis (PCA), and extracted leading components are supplied to the RF. In the last algorithm, dimension-reduced predictors are supplied to a logistic regression (LR) algorithm instead of an RF. A companion paper investigated the quality of the forecasts produced by these models and other RF-based forecast models. This study is an extension of that work and explores the internals of these trained models and what physical and statistical insights they reveal about forecasting extreme precipitation from a global, convection-parameterized model.
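The three algorithm variants described above can be sketched with standard tools: raw predictors to a random forest, PCA-reduced predictors to a random forest, and PCA-reduced predictors to a logistic regression. The predictor matrix and labels below are the same kind of random placeholders used in the earlier random forest example; this is illustrative scaffolding, not the paper's trained models.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
X = rng.normal(size=(20000, 40))                                # placeholder predictors
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=20000)) > 2.5    # placeholder ARI exceedances

models = {
    # 1) raw predictors supplied directly to a random forest
    "RF": RandomForestClassifier(n_estimators=200, min_samples_leaf=20, random_state=0),
    # 2) predictors reduced with PCA, leading components supplied to a random forest
    "PCA+RF": make_pipeline(StandardScaler(), PCA(n_components=10),
                            RandomForestClassifier(n_estimators=200, random_state=0)),
    # 3) PCA-reduced predictors supplied to a logistic regression
    "PCA+LR": make_pipeline(StandardScaler(), PCA(n_components=10),
                            LogisticRegression(max_iter=1000)),
}

for name, model in models.items():
    model.fit(X[:15000], y[:15000])
    probs = model.predict_proba(X[15000:])[:, 1]
    print(f"{name}: mean forecast probability {probs.mean():.3f}")
```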

Full access