Search Results

You are looking at 1 - 10 of 19 items for

  • Author or Editor: Gregory J. Stumpf
Caren Marzban
and
Gregory J. Stumpf

Abstract

The National Severe Storms Laboratory's (NSSL) mesocyclone detection algorithm (MDA) is designed to search for patterns in Doppler velocity radar data that are associated with rotating updrafts in severe thunderstorms. These storm-scale circulations are typically precursors to tornadoes and severe weather in thunderstorms, yet not all circulations produce such phenomena.

A neural network has been designed to diagnose which circulations detected by the NSSL MDA yield tornadoes. The data used both for the training and the testing of the network are obtained from the NSSL MDA. In particular, 23 variables characterizing the circulations are selected to be used as the input nodes of a feed-forward neural network. The output of the network is chosen to be the existence/nonexistence of tornadoes, based on ground observations. It is shown that the network outperforms the rule-based algorithm existing in the MDA, as well as statistical techniques such as discriminant analysis and logistic regression. Additionally, a measure of confidence is provided in terms of probability functions.
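
As a rough illustration of the setup described above, the sketch below (Python, with a hypothetical hidden-layer size and untrained placeholder weights) maps 23 circulation attributes through one hidden layer to a single tornado probability; it is not the NSSL code.

import numpy as np

# Hypothetical sketch of the feed-forward setup described in the abstract: 23
# circulation attributes feed one hidden layer, and a sigmoid output is read as
# the probability that the detected circulation is tornadic. Weights here are
# random placeholders; the actual network was trained on MDA detections.

rng = np.random.default_rng(0)
n_inputs, n_hidden = 23, 5          # 23 MDA attributes; hidden size is illustrative

W1 = rng.normal(scale=0.1, size=(n_hidden, n_inputs))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=n_hidden)
b2 = 0.0

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tornado_probability(attributes):
    """Forward pass: 23 circulation attributes -> probability of a tornado."""
    hidden = np.tanh(W1 @ attributes + b1)
    return sigmoid(W2 @ hidden + b2)

example_circulation = rng.normal(size=n_inputs)   # stand-in for one MDA detection
print(tornado_probability(example_circulation))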

Full access
Caren Marzban
and
Gregory J. Stumpf

Abstract

A neural network is developed to diagnose which circulations detected by the National Severe Storms Laboratory’s Mesocyclone Detection Algorithm yield damaging wind. In particular, 23 variables characterizing the circulations are selected to be used as the input nodes of a feed-forward, supervised neural network. The outputs of the network represent the existence/nonexistence of damaging wind, based on ground observations. A set of 14 scalar, nonprobabilistic measures and a set of two multidimensional, probabilistic measures are employed to assess the performance of the network. The former set includes measures of accuracy, association, discrimination, and skill, while the latter consists of reliability and refinement diagrams. Two classification schemes are also examined.

It is found that a neural network with two hidden nodes outperforms a neural network with no hidden nodes when performance is gauged with any of the 14 scalar measures, except for a measure of discrimination where the results are opposite. The two classification schemes perform comparably to one another. As for the performance of the network in terms of reliability diagrams, it is shown that the process by which the outputs are converted to probabilities allows for the forecasts to be completely reliable. Refinement diagrams complete the representation of the calibration-refinement factorization of the joint distribution of forecasts and observations.
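
A minimal sketch of how a reliability (calibration) curve of the kind mentioned above can be computed, assuming forecast probabilities and binary damaging-wind observations are available; the data and binning choices here are synthetic and illustrative, not those of the paper.

import numpy as np

# Illustrative reliability curve: forecast probabilities are binned, and the
# observed relative frequency of the event is computed within each bin. The
# per-bin counts also give the refinement (how often each probability is issued).

def reliability_curve(forecast_probs, observations, n_bins=10):
    """Return (mean forecast, observed frequency, count) for each probability bin."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    bin_index = np.clip(np.digitize(forecast_probs, edges) - 1, 0, n_bins - 1)
    rows = []
    for b in range(n_bins):
        mask = bin_index == b
        if mask.any():
            rows.append((forecast_probs[mask].mean(), observations[mask].mean(), mask.sum()))
    return rows

rng = np.random.default_rng(1)
p = rng.uniform(size=1000)                       # synthetic forecast probabilities
o = (rng.uniform(size=1000) < p).astype(float)   # synthetic, perfectly calibrated events
for mean_p, obs_freq, n in reliability_curve(p, o):
    print(f"{mean_p:.2f}  {obs_freq:.2f}  {n}")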

Full access
Gregory J. Stumpf
and
Alan E. Gerard

Abstract

Threats-in-Motion (TIM) is a warning generation approach that would enable the NWS to advance severe thunderstorm and tornado warnings from the current static polygon system to continuously updating polygons that move forward with a storm. This concept is proposed as a first stage for implementation of the Forecasting a Continuum of Environmental Threats (FACETs) paradigm, which eventually aims to deliver rapidly updating probabilistic hazard information alongside NWS warnings, watches, and other products. With TIM, a warning polygon is attached to the threat and moves forward along with it. This provides more uniform, or equitable, lead time for all locations downstream of the event. When forecaster workload is high, storms remain continually tracked and warned. TIM mitigates gaps in warning coverage and improves the handling of storm motion changes. In addition, warnings are automatically cleared from locations where the threat has passed. This all results in greater average lead times and lower average departure times than current NWS warnings, with little to no impact on average false alarm time. This is particularly noteworthy for storms expected to live longer than the average warning duration (30 or 45 min) such as long-tracked supercells that are more prevalent during significant tornado outbreaks.
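
The following is a minimal sketch, not the NWS implementation, of the basic Threats-in-Motion geometry: a warning polygon is attached to a storm-motion vector and translated forward in time, so downstream locations are warned earlier and trailing locations are cleared as the threat passes. All names and units are hypothetical.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class MovingWarning:
    vertices: List[Tuple[float, float]]   # polygon vertices in km (x east, y north)
    motion: Tuple[float, float]           # storm motion in km per minute

    def at(self, minutes: float) -> List[Tuple[float, float]]:
        """Polygon position after the given number of minutes."""
        dx, dy = self.motion[0] * minutes, self.motion[1] * minutes
        return [(x + dx, y + dy) for x, y in self.vertices]

warning = MovingWarning(vertices=[(0, 0), (20, 0), (20, 10), (0, 10)],
                        motion=(0.8, 0.2))       # roughly 50 km/h toward the east-northeast
print(warning.at(15))   # warning polygon 15 minutes into the event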

Open access
Robert J. Trapp
,
Gregory J. Stumpf
, and
Kevin L. Manross

Abstract

A large set of data collected by numerous Weather Surveillance Radar-1988 Doppler (WSR-88D) units around the United States was analyzed to reassess the percentage of tornadic mesocyclones. Out of the 5322 individual mesocyclone detections that satisfied the relatively stringent WSR-88D Mesocyclone Detection Algorithm objective criteria, only 26% were associated with tornadoes. In terms of height or altitude of mesocyclone base, 15% of midaltitude mesocyclone detections were tornadic, and more than 40% of low-altitude mesocyclone detections (e.g., those with bases ≤ 1000 m above radar level) were tornadic. These results confirm that a low-altitude mesocyclone is much more likely to be associated with a tornado than is a midaltitude mesocyclone, and more generally, that the percentage of tornadic mesocyclones is indeed lower than previously thought.

Full access
Gregory J. Stumpf
,
Richard H. Johnson
, and
Bradley F. Smull

Abstract

An analysis has been carried out of the surface pressure field in a highly complex mesoscale convective system that occurred on 3–4 June 1985 during the Oklahoma–Kansas Preliminary Regional Experiment for STORM-Central (OK PRE-STORM). During its mature stage the storm consisted of two primary intersecting convective bands approximately 200 km in length, one oriented NE–SW (to the north) and the other N–S (to the south), with a stratiform precipitation region extending to the northwest of the bands. Stratiform precipitation was weak to nonexistent in the southernmost portion of the storm.

Although the organization of the storm was complex, the surface pressure field resembled those associated with simpler, quasi-linear squall systems containing trailing stratiform regions: a mesohigh existed near the convective line and a wake low was observed to the rear of the stratiform region. A strong system-relative, descending rear inflow jet was observed in the northern part of the storm near the wake low. Significantly, only the northern portion of the storm had a trailing stratiform region, and it was only in that region that a wake low and a descending rear inflow jet occurred.

An analysis of dual-Doppler radar data taken in the northern part of the storm indicates remarkably strong, localized subsidence at low levels within the rear inflow jet, up to 6 m s−1 on a 10-km scale at the back edge of the trailing stratiform region. The maximum sinking occurred (a) to the rear of the highest-reflectivity portion of the trailing stratiform region, (b) within the region of the strongest low-level reflectivity gradient, and (c) coincident with the strongest surface pressure gradient [up to 2 mb (5 km)−1] ahead of the wake low center.

These findings indicate that the trailing stratiform precipitation regions of mesoscale convective systems can be dynamically significant phenomena, generating rapidly descending inflow jets at their back edges and, consequently, producing pronounced lower-tropospheric warming, intense surface pressure gradients, and strong low-level winds.

Full access
Caren Marzban
,
E. De Wayne Mitchell
, and
Gregory J. Stumpf

Abstract

It is argued that the strength of a predictor is an ill-defined concept. At best, it is contingent on many assumptions, and, at worst, it is an ambiguous quantity. It is shown that many of the contingencies are met (or avoided) only in a bivariate sense, that is, one independent variable (and one dependent variable) at a time. Several such methods are offered, after which data produced by the National Severe Storms Laboratory’s Tornado Detection Algorithm are analyzed for the purpose of addressing the question of which storm-scale vortex attributes based on Doppler radar constitute the “best predictors” of tornadoes.
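
As one example of a bivariate (one-predictor-at-a-time) assessment in the spirit of the abstract, the sketch below scores each candidate vortex attribute against a tornado/no-tornado label with a rank-based separation statistic (the area under the ROC curve); the attribute names and data are synthetic, and the specific measures used in the paper may differ.

import numpy as np

# Score each predictor on its own against the binary label: the probability that
# a randomly chosen event outranks a randomly chosen non-event (ROC area).

def bivariate_auc(predictor, label):
    pos = predictor[label == 1]
    neg = predictor[label == 0]
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties

rng = np.random.default_rng(2)
label = (rng.uniform(size=500) < 0.3).astype(int)      # synthetic ground truth
attributes = {"shear": rng.normal(label, 1.0),         # synthetic vortex attributes
              "depth": rng.normal(0.3 * label, 1.0)}
for name, values in attributes.items():
    print(name, round(bivariate_auc(values, label), 3))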

Full access
John P. Monteverdi
,
Roger Edwards
, and
Gregory J. Stumpf

Abstract

This manuscript documents the tornado in the Rockwell Pass area of Sequoia National Park, California, that occurred on 7 July 2004. Since the elevation of the tornado’s ground circulation was approximately 3705 m (~12 156 ft) MSL, this is the highest-elevation tornado documented in the United States. The investigation of the storm’s convective mode was performed mostly inferentially on the basis of an analysis of the radar imagery from Edwards Air Force Base (which was in clear-air mode on this day), objectively produced soundings and/or CAPE estimates from two mesoscale models, an objectively produced proximity sounding and hodograph, and analyses of satellite imagery. The nearest Weather Surveillance Radar-1988 Doppler (WSR-88D) in Hanford, California, could not be used to observe this storm because of terrain blockage by the Sierra Nevada, and the nearest sounding sites were too distant and in a different meteorological environment on this day. The near-storm environment may have been favorable briefly for a supercell in the upper portion of the Kern River Canyon. The limitations of the radar data precluded the authors from making a definitive conclusion on the convective mode of the storm but do not rule out the possibility that the storm briefly might have been a supercell. There was insufficient evidence, however, to support the notion that the tornado itself was mesocyclone induced. High LCL heights in the proximity sounding also suggest that the tornado was formed by processes not associated with a mesocyclone (popularly known as a “landspout”), but do not allow us to dismiss the possibility that the tornado was mesocyclone induced.

Full access
Douglas A. Speheger
,
Charles A. Doswell III
, and
Gregory J. Stumpf

Abstract

The tornado events of 3 May 1999 within the county warning area of the Norman, Oklahoma, office of the National Weather Service are reviewed, emphasizing the challenges associated with obtaining accurate information about the existence, timing, location, and intensity of individual tornadoes. Accurate documentation of tornado and other hazardous weather events is critical to research, is needed for operational assessments, and is important for developing hazard mitigation strategies. The situation following this major event was unusual because of the high concentration of meteorologists in the area, relative to most parts of the United States. As a result of this relative abundance of resources, it is likely that these tornadoes were reasonably well documented. Despite this unique situation in central Oklahoma, it is argued that this event also provides evidence of a national need for a rapid-response scientific and engineering survey team to provide documentation of major hazardous weather events before cleanup destroys important evidence.

Full access
Valliappa Lakshmanan
,
Travis Smith
,
Kurt Hondl
,
Gregory J. Stumpf
, and
Arthur Witt

Abstract

With the advent of real-time streaming data from various radar networks, including most Weather Surveillance Radars-1988 Doppler and several Terminal Doppler Weather Radars, it is now possible to combine data in real time to form 3D multiple-radar grids. Herein, a technique for taking the base radar data (reflectivity and radial velocity) and derived products from multiple radars and combining them in real time into a rapidly updating 3D merged grid is described. An estimate of that radar product combined from all the different radars can be extracted from the 3D grid at any time. This is accomplished through a formulation that accounts for the varying radar beam geometry with range, vertical gaps between radar scans, the lack of time synchronization between radars, storm movement, varying beam resolutions between different types of radars, beam blockage due to terrain, differing radar calibration, and inaccurate time stamps on radar data. Techniques for merging scalar products like reflectivity, and innovative, real-time techniques for combining velocity and velocity-derived products are demonstrated. Precomputation techniques that can be utilized to perform the merger in real time and derived products that can be computed from these three-dimensional merger grids are described.
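
A hedged sketch of the general idea of a distance- and age-weighted combination at a single grid point, assuming simple exponential weighting functions; the operational merger described above additionally accounts for beam geometry, blockage, calibration differences, and storm advection, and the parameter values here are purely illustrative.

import numpy as np

# Combine per-radar reflectivity estimates at one grid point: each radar's value
# is weighted by how close the contributing beam is and how fresh the observation is.

def merge_reflectivity(estimates, distances_km, ages_s,
                       length_scale_km=50.0, time_scale_s=300.0):
    """estimates    : reflectivity (dBZ) from each contributing radar
    distances_km : distance from each radar to the grid point
    ages_s       : time since each radar last observed the grid point"""
    estimates = np.asarray(estimates, dtype=float)
    w = (np.exp(-np.asarray(distances_km) / length_scale_km) *
         np.exp(-np.asarray(ages_s) / time_scale_s))
    return float(np.sum(w * estimates) / np.sum(w))

# Example: two radars see the same grid point; the nearer, fresher radar dominates.
print(merge_reflectivity(estimates=[52.0, 45.0],
                         distances_km=[30.0, 120.0],
                         ages_s=[60.0, 240.0]))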

Full access
Arthur Witt
,
Michael D. Eilts
,
Gregory J. Stumpf
,
E. De Wayne Mitchell
,
J. T. Johnson
, and
Kevin W. Thomas

Abstract

This paper discusses some important issues and problems associated with evaluating the performance of radar-based severe storm detection algorithms. The deficiencies of using Storm Data as a source of verification are examined. Options for equalizing the time- and space scales of the algorithm predictions and the corresponding verification data are presented. Finally, recommendations are given concerning the different evaluation procedures that are available.

Full access