Search Results

You are looking at 1–7 of 7 items for:

  • Author or Editor: David R. Novak
  • Bulletin of the American Meteorological Society
  • Refine by Access: All Content
David R. Novak and Brian A. Colle
Full access
Joseph C. Picca, David M. Schultz, Brian A. Colle, Sara Ganetis, David R. Novak, and Matthew J. Sienkiewicz

The northeast U.S. extratropical cyclone of 8–9 February 2013 produced blizzard conditions and more than 0.6–0.9 m (2–3 ft) of snow from Long Island through eastern New England. A surprising aspect of this blizzard was the development and rapid weakening of a snowband to the northwest of the cyclone center with radar reflectivity factor exceeding 55 dBZ. Because the radar reflectivity within snowbands in winter storms rarely exceeds 40 dBZ, this event warranted further investigation. The high radar reflectivity was due to mixed-phase microphysics in the snowband, characterized by high differential reflectivity (ZDR > 2 dB) and low correlation coefficient (CC < 0.9), as measured by the operational dual-polarization radar in Upton, New York (KOKX). Consistent with these radar observations, heavy snow and ice pellets (both sleet and graupel) were observed. Later, as the reflectivity decreased to less than 40 dBZ, surface observations indicated a transition to primarily high-intensity dry snow, consistent with lower-tropospheric cold advection. Therefore, the rapid decrease of the 50+ dBZ reflectivity resulted from the transition from higher-density, mixed-phase precipitation to lower-density, dry-snow crystals and aggregates. This case study indicates the value that dual-polarization radar can have in an operational forecast environment for determining the variability of frozen precipitation (e.g., ice pellets, dry snow aggregates) on relatively small spatial scales.
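The dual-polarization thresholds quoted in the abstract can be read as a simple gate-level decision rule. The sketch below is illustrative only: the function name, class labels, and exact threshold combinations are assumptions for demonstration, not the authors' classification algorithm.

```python
# Minimal sketch of a dual-polarization hydrometeor check using the values
# cited in the abstract: high reflectivity (> 40 dBZ), high differential
# reflectivity (ZDR > 2 dB), and low correlation coefficient (CC < 0.9)
# suggest mixed-phase precipitation; lower Z with high CC suggests dry snow.
def classify_gate(z_dbz: float, zdr_db: float, cc: float) -> str:
    """Rough precipitation class from three dual-polarization moments."""
    if z_dbz > 40 and zdr_db > 2.0 and cc < 0.9:
        return "mixed-phase"   # e.g., heavy snow mixed with ice pellets
    if z_dbz <= 40 and cc >= 0.9:
        return "dry snow"      # crystals and aggregates
    return "indeterminate"

# The banded 55-dBZ signature vs. the post-transition dry snow
print(classify_gate(55, 2.5, 0.85))  # mixed-phase
print(classify_gate(35, 0.3, 0.98))  # dry snow
```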

Full access
Faye E. Barthold, Thomas E. Workoff, Brian A. Cosgrove, Jonathan J. Gourley, David R. Novak, and Kelly M. Mahoney

Abstract

Despite advancements in numerical modeling and the increasing prevalence of convection-allowing guidance, flash flood forecasting remains a substantial challenge. Accurate flash flood forecasts depend not only on accurate quantitative precipitation forecasts (QPFs), but also on an understanding of the corresponding hydrologic response. To advance forecast skill, innovative guidance products that blend meteorology and hydrology are needed, as well as a comprehensive verification dataset to identify areas in need of improvement.

To address these challenges, in 2013 the Hydrometeorological Testbed at the Weather Prediction Center (HMT-WPC), partnering with the National Severe Storms Laboratory (NSSL) and the Earth System Research Laboratory (ESRL), developed and hosted the inaugural Flash Flood and Intense Rainfall (FFaIR) Experiment. In its first two years, the experiment has focused on ways to combine meteorological guidance with available hydrologic information. One example of this is the creation of neighborhood flash flood guidance (FFG) exceedance probabilities, which combine QPF information from convection-allowing ensembles with flash flood guidance; these were found to provide valuable information about the flash flood threat across the contiguous United States.
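The core of an FFG exceedance probability like the one described above is the fraction of ensemble QPF members whose forecast rainfall exceeds the flash flood guidance value. The sketch below is a minimal illustration of that idea only; the neighborhood pooling step and all numbers are assumptions, not the operational FFaIR product.

```python
# Illustrative exceedance probability from a convection-allowing ensemble:
# the fraction of member QPFs exceeding the flash flood guidance (FFG)
# threshold at a point. Units and values below are hypothetical.
def ffg_exceedance_prob(member_qpf_mm: list[float], ffg_mm: float) -> float:
    """Probability (0-1) that ensemble QPF exceeds the FFG threshold."""
    if not member_qpf_mm:
        return 0.0
    hits = sum(1 for qpf in member_qpf_mm if qpf > ffg_mm)
    return hits / len(member_qpf_mm)

# 10-member ensemble QPF (mm) against a hypothetical 50-mm FFG value
print(ffg_exceedance_prob([30, 45, 52, 60, 48, 55, 40, 70, 35, 58], 50.0))  # 0.5
```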

Additionally, WPC has begun to address the challenge of flash flood verification by developing a verification database that incorporates observations from a variety of disparate sources in an attempt to build a comprehensive picture of flash flooding across the nation. While the development of this database represents an important step forward in the verification of flash flood forecasts, many of the other challenges identified during the experiment will require a long-term community effort in order to make notable advancements.

Full access
David R. Novak, Sarah E. Perfater, Julie L. Demuth, Stephen W. Bieda III, Gregory Carbin, Jeffrey Craven, Michael J. Erickson, Matthew E. Jeglum, Joshua Kastman, James A. Nelson, David E. Rudack, Michael J. Staudenmaier, and Jeff S. Waldstreicher

Abstract

Winter storms are disruptive to society and the economy, and they often cause significant injuries and deaths. Innovations in winter storm forecasting have occurred across the value chain over the past two decades, from physical understanding, to observations, to model forecasts, to postprocessing, to forecaster knowledge and interpretation, to products and services, and ultimately to decision support. These innovations enable more accurate and consistent forecasts, which are increasingly being translated into actionable information for decision-makers. This paper reviews the current state of winter storm forecasting in the context of the U.S. National Weather Service operations and describes a potential future state. Given predictability limitations, a key challenge of winter storm forecasting has been characterizing uncertainty and communicating the forecast in ways that are understandable and useful to decision-makers. To address this challenge, particular focus is placed on establishing a probabilistic framework, with probabilistic hazard information serving as a foundation for winter storm decision support services. The framework is guided by social science research to ensure effective communication of risk to meet users’ needs. Solutions to gaps impeding progress in winter storm forecasting are highlighted, including better understanding of mesoscale phenomena, the need for better ensemble calibration, a rigorous and consistent database of observed impacts, and linking multiparameter probabilities (e.g., probability of intense snowfall rates at rush hour) with users’ information needs and decisions.
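A multiparameter probability of the kind mentioned above (e.g., probability of intense snowfall rates at rush hour) can be estimated from an ensemble as the fraction of members meeting both conditions at once. The sketch below is a hypothetical illustration; the data layout, threshold, and time window are assumptions, not an NWS implementation.

```python
# Illustrative joint probability from an ensemble: fraction of members
# forecasting a snowfall rate above a threshold at any hour within a
# specified window (e.g., morning rush hour). Rates are hypothetical.
def prob_rate_in_window(members: list[dict[int, float]],
                        threshold_in_hr: float,
                        window_hours: range) -> float:
    """P(snowfall rate > threshold at some hour inside the window)."""
    if not members:
        return 0.0
    hits = sum(
        1 for m in members
        if any(m.get(h, 0.0) > threshold_in_hr for h in window_hours)
    )
    return hits / len(members)

# Three members with hourly rates (in/hr); rush hour = 0700-0900 local
members = [
    {7: 0.5, 8: 1.2, 9: 0.8},  # exceeds 1 in/hr at 0800
    {7: 0.2, 8: 0.4, 9: 0.3},  # never exceeds
    {7: 1.1, 8: 0.9, 9: 0.2},  # exceeds at 0700
]
print(prob_rate_in_window(members, 1.0, range(7, 10)))  # 2 of 3 members
```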

Open access
Lynn A. McMurdie, Gerald M. Heymsfield, John E. Yorks, Scott A. Braun, Gail Skofronick-Jackson, Robert M. Rauber, Sandra Yuter, Brian Colle, Greg M. McFarquhar, Michael Poellot, David R. Novak, Timothy J. Lang, Rachael Kroodsma, Matthew McLinden, Mariko Oue, Pavlos Kollias, Matthew R. Kumjian, Steven J. Greybush, Andrew J. Heymsfield, Joseph A. Finlon, Victoria L. McDonald, and Stephen Nicholls

Abstract

The Investigation of Microphysics and Precipitation for Atlantic Coast-Threatening Snowstorms (IMPACTS) is a NASA-sponsored field campaign to study wintertime snowstorms focusing on East Coast cyclones. This large cooperative effort takes place during the winters of 2020–23 to study precipitation variability in winter cyclones to improve remote sensing and numerical forecasts of snowfall. Snowfall within these storms is frequently organized in banded structures on multiple scales. The causes for the occurrence and evolution of a wide spectrum of snowbands remain poorly understood. The goals of IMPACTS are to characterize the spatial and temporal scales and structures of snowbands, understand their dynamical, thermodynamical, and microphysical processes, and apply this understanding to improve remote sensing and modeling of snowfall. The first deployment took place in January–February 2020 with two aircraft that flew coordinated flight patterns and sampled a range of storms from the Midwest to the East Coast. The satellite-simulating ER-2 aircraft flew above the clouds and carried a suite of remote sensing instruments including cloud and precipitation radars, lidar, and passive microwave radiometers. The in situ P-3 aircraft flew within the clouds and sampled environmental and microphysical quantities. Ground-based radar measurements from the National Weather Service network and a suite of radars located on Long Island, New York, along with supplemental soundings and the New York State Mesonet ground network provided environmental context for the airborne observations. Future deployments will occur during the 2022 and 2023 winters. The coordination between remote sensing and in situ platforms makes this a unique publicly available dataset applicable to a wide variety of interests.

Full access
Charles O. Stanier, R. Bradley Pierce, Maryam Abdi-Oskouei, Zachariah E. Adelman, Jay Al-Saadi, Hariprasad D. Alwe, Timothy H. Bertram, Gregory R. Carmichael, Megan B. Christiansen, Patricia A. Cleary, Alan C. Czarnetzki, Angela F. Dickens, Marta A. Fuoco, Dagen D. Hughes, Joseph P. Hupy, Scott J. Janz, Laura M. Judd, Donna Kenski, Matthew G. Kowalewski, Russell W. Long, Dylan B. Millet, Gordon Novak, Behrooz Roozitalab, Stephanie L. Shaw, Elizabeth A. Stone, James Szykman, Lukas Valin, Michael Vermeuel, Timothy J. Wagner, Andrew R. Whitehill, and David J. Williams

Abstract

The Lake Michigan Ozone Study 2017 (LMOS 2017) was a collaborative multiagency field study targeting ozone chemistry, meteorology, and air quality observations in the southern Lake Michigan area. The primary objective of LMOS 2017 was to provide measurements to improve air quality modeling of the complex meteorological and chemical environment in the region. LMOS 2017 science questions included spatiotemporal assessment of nitrogen oxides (NOx = NO + NO2) and volatile organic compounds (VOC) emission sources and their influence on ozone episodes; the role of lake breezes; contribution of new remote sensing tools such as GeoTASO, Pandora, and TEMPO to air quality management; and evaluation of photochemical grid models. The observing strategy included GeoTASO on board the NASA UC-12 aircraft capturing NO2 and formaldehyde columns, an in situ profiling aircraft, two ground-based coastal enhanced monitoring locations, continuous NO2 columns from coastal Pandora instruments, and an instrumented research vessel. Local photochemical ozone production was observed on 2 June, 9–12 June, and 14–16 June, providing insights on the processes relevant to state and federal air quality management. The LMOS 2017 aircraft mapped significant spatial and temporal variation of NO2 emissions as well as polluted layers with rapid ozone formation occurring in a shallow layer near the Lake Michigan surface. Meteorological characteristics of the lake breeze were observed in detail and measurements of ozone, NOx, nitric acid, hydrogen peroxide, VOC, oxygenated VOC (OVOC), and fine particulate matter (PM2.5) composition were conducted. This article summarizes the study design, directs readers to the campaign data repository, and presents a summary of findings.

Full access
Adam J. Clark, Steven J. Weiss, John S. Kain, Israel L. Jirak, Michael Coniglio, Christopher J. Melick, Christopher Siewert, Ryan A. Sobash, Patrick T. Marsh, Andrew R. Dean, Ming Xue, Fanyou Kong, Kevin W. Thomas, Yunheng Wang, Keith Brewster, Jidong Gao, Xuguang Wang, Jun Du, David R. Novak, Faye E. Barthold, Michael J. Bodner, Jason J. Levit, C. Bruce Entwistle, Tara L. Jensen, and James Correia Jr.

The NOAA Hazardous Weather Testbed (HWT) conducts annual spring forecasting experiments organized by the Storm Prediction Center and National Severe Storms Laboratory to test and evaluate emerging scientific concepts and technologies for improved analysis and prediction of hazardous mesoscale weather. A primary goal is to accelerate the transfer of promising new scientific concepts and tools from research to operations through the use of intensive real-time experimental forecasting and evaluation activities conducted during the spring and early summer convective storm period. The 2010 NOAA/HWT Spring Forecasting Experiment (SE2010), conducted 17 May through 18 June, had a broad focus, with emphases on heavy rainfall and aviation weather, through collaboration with the Hydrometeorological Prediction Center (HPC) and the Aviation Weather Center (AWC), respectively. In addition, using the computing resources of the National Institute for Computational Sciences at the University of Tennessee, the Center for Analysis and Prediction of Storms at the University of Oklahoma provided unprecedented real-time conterminous United States (CONUS) forecasts from a multimodel Storm-Scale Ensemble Forecast (SSEF) system with 4-km grid spacing and 26 members and from a 1-km grid spacing configuration of the Weather Research and Forecasting model. Several other organizations provided additional experimental high-resolution model output. This article summarizes the activities, insights, and preliminary findings from SE2010, emphasizing the use of the SSEF system and the successful collaboration with the HPC and AWC.

A supplement to this article is available online (DOI:10.1175/BAMS-D-11-00040.2)

Full access