Search Results

You are looking at 1 - 5 of 5 items for

  • Author or Editor: Mark Shafer
  • All content
Darrian Bertrand and Mark Shafer

Abstract

State hazard mitigation plans guide state and local agencies in actions they may take to reduce their vulnerability to extreme events. Because these plans are written for a general audience, they must be understandable to a layperson. In many cases, the people writing them are not meteorologists or do not have access to meteorological expertise. Consequently, descriptions of hazards may be drawn from websites, other documents, or, in some cases, authoritative sources. This leads to inconsistencies in how hazards are portrayed in the plans, which makes it more difficult to translate proposed actions to local governments or to other states.

This article examines these variances and how they affect those who write state hazard mitigation plans. It compares the hazard definitions used in state plans within the region of the National Oceanic and Atmospheric Administration (NOAA) Regional Integrated Sciences and Assessments (RISA) Southern Climate Impacts Planning Program (SCIPP) with definitions from the National Weather Service (NWS) and the American Meteorological Society (AMS). States within the SCIPP region are Oklahoma, Texas, Arkansas, Louisiana, Mississippi, and Tennessee. The study found that states more commonly use key words from NWS and AMS hazard definitions than exact definitions. The goal of this article is to prompt discussion about the inconsistent terminology used in state hazard mitigation plans and to raise awareness of the issue so that future plans can keep their unique elements while providing a clearer description and understanding of the included hazards.
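
The abstract does not describe the comparison procedure itself; purely as an illustration of the distinction it draws between adopting an exact NWS/AMS definition and reusing only its key words, a minimal sketch might look like the following. The example definition text and keyword list are invented for this sketch and are not taken from any state plan or official glossary.

```python
# Illustrative sketch: classify how closely a state plan's hazard definition
# matches a reference (NWS/AMS-style) definition. Example text and keywords
# are invented, not quoted from any plan or official source.
import re

def normalize(text):
    """Lowercase and strip punctuation/whitespace for comparison."""
    return " ".join(re.findall(r"[a-z0-9]+", text.lower()))

def compare_definitions(plan_text, reference_text, keywords):
    """Return 'exact', 'keyword overlap (...)', or 'neither'."""
    if normalize(plan_text) == normalize(reference_text):
        return "exact"
    plan_words = set(normalize(plan_text).split())
    shared = [k for k in keywords if k in plan_words]
    return f"keyword overlap ({', '.join(shared)})" if shared else "neither"

reference = "A flash flood is a rapid rise of water along a stream or low-lying area."
plan = "Flash flooding occurs when water rises rapidly in low-lying areas after heavy rain."
print(compare_definitions(plan, reference, ["flash", "rapid", "water", "low"]))
```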

Full access
Daniela Spade, Kirsten de Beurs, and Mark Shafer

Abstract

Evaluation of the standardized precipitation index (SPI) dataset published monthly in the National Oceanic and Atmospheric Administration/National Centers for Environmental Information (NOAA/NCEI) climate divisional database revealed that drought frequency is being mischaracterized in climate divisions across the United States. The 3- and 6-month September SPI values were downloaded from the database for all years between 1931 and 2019; the SPI was also calculated for the same time scales and span of years following the SPI method laid out by NOAA/NCEI. Drought frequency is defined as the total number of years in which the SPI fell below −1. SPI values across 1931–90, the calibration period cited by NOAA/NCEI, showed regional patterns in climate divisions that are biased toward or away from drought, as indicated by the average SPI values. For both time scales examined, the majority of the climate divisions in the central, midwestern, and northeastern United States showed negative averages, indicating bias toward drought, whereas climate divisions in the western United States, the northern Midwest, and parts of the Southeast and Texas had positive averages, indicating bias away from drought. The standard deviation of the SPI also differed from the expected value of 1. These regional patterns in the NCEI’s SPI values are the result of a different (sliding) calibration period, 1895–2019, instead of the cited standardized period of 1931–90. The authors recommend that the NCEI modify its SPI computational procedure to reflect the best practices identified in the benchmark papers, namely, a fixed baseline period.
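
The central point of the comparison is how the calibration period enters the SPI computation. The sketch below is illustrative only, not the NCEI procedure: it fits a gamma distribution to a fixed 1931–90 baseline, transforms the full record to SPI values, and counts drought years as SPI < −1. The synthetic precipitation series and all variable names are assumptions made for the example.

```python
# Minimal sketch of a 3-month SPI with a *fixed* calibration period, in the
# spirit of the fixed 1931-90 baseline the abstract recommends. Follows the
# standard gamma-fit-then-normal-transform construction; data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
years = np.arange(1931, 2020)
# Synthetic September 3-month precipitation totals (mm), one per year.
precip_3mo = rng.gamma(2.0, scale=40.0, size=years.size)

# Fit a gamma distribution to the fixed 1931-90 calibration period only.
calib = precip_3mo[(years >= 1931) & (years <= 1990)]
shape, loc, scale = stats.gamma.fit(calib, floc=0)

# Transform every year through the calibration-period CDF to a standard normal.
spi = stats.norm.ppf(stats.gamma.cdf(precip_3mo, shape, loc=loc, scale=scale))

# Drought frequency: number of years with SPI below -1, as in the abstract.
drought_years = int(np.sum(spi < -1))
print(f"mean SPI {spi.mean():+.2f}, std {spi.std():.2f}, drought years {drought_years}")
```

With a fixed baseline, values outside the calibration period can legitimately drift from a mean of 0 and standard deviation of 1; the bias the abstract describes arises when a sliding calibration period is used instead of the cited fixed one.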

Open access
Mark A. Shafer, Donald R. MacGorman, and Frederick H. Carr

Abstract

Cloud-to-ground (CG) lightning data are examined relative to digitized radar data for a storm system that occurred in Oklahoma on 26 May 1985. This system evolved through three stages: 1) two lines of cells, one near the dryline and the other 60 km ahead of it; 2) a supercell storm; and 3) a mesoscale convective system (MCS). The behavior of lightning in each stage was different. Initially no ground flashes were observed in either line until reflectivity increased to ≥46 dBZ and vertically integrated liquid (VIL) increased to ≥10 kg m−2; then ground flash rates remained <1.2 min−1 for >1 h. Most ground flashes in the line of storms near the dryline were negative (18 −CG, 3 +CG), while most in the leading line were positive (11 +CG, 3 −CG), a pattern of polarity opposite to what usually has been observed. Approximately 3 h after radar detected the first storm, ground flash rates increased to >5 min−1 and remained so for 6 h. A mesocyclone formed approximately 30 min after flash rates exceeded 5 min−1, and a few positive ground flashes occurred near it. Ground flash rates increased briefly to >20 min−1 as the mesocyclone dissipated and then remained >10 min−1 as a squall line formed along the outflow boundary from the dissipating supercell and produced a stratiform region. Most ground flashes in this MCS occurred in the convective line and had negative polarity. The few ground flashes in the stratiform region tended to be positive (42 +CG, 32 −CG during 3 h). During 1 h of the MCS, ground flash rates decreased and then increased again simultaneously in both the convective and stratiform regions, a previously undocumented behavior. It is possible that this was caused by updrafts in both the convective line and stratiform region changing at roughly the same time. It is also possible that most ground flashes in the stratiform region originated near the convective line, and so were influenced by the line. Overall trends in ground flash density, flash relative frequency, reflectivity, VIL, and severe hail reports appeared similar as the storm system evolved.
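
To make the onset thresholds concrete, the short sketch below finds the first radar scan at which reflectivity reaches 46 dBZ and VIL reaches 10 kg m−2, the conditions the abstract associates with the first ground flashes. This is not the authors' analysis code; the scan list and variable names are synthetic examples.

```python
# Illustrative only: locate the first radar scan meeting the thresholds the
# abstract associates with the onset of cloud-to-ground lightning
# (reflectivity >= 46 dBZ and VIL >= 10 kg m^-2). Data are synthetic.
scans = [
    # (minutes after first echo, max reflectivity in dBZ, VIL in kg m^-2)
    (0, 32.0, 2.1),
    (10, 41.5, 6.4),
    (20, 47.2, 11.8),
    (30, 52.0, 18.3),
]

REFL_THRESHOLD_DBZ = 46.0
VIL_THRESHOLD = 10.0  # kg m^-2

def first_onset_scan(scan_list):
    """Return minutes of the first scan meeting both thresholds, else None."""
    for minutes, refl, vil in scan_list:
        if refl >= REFL_THRESHOLD_DBZ and vil >= VIL_THRESHOLD:
            return minutes
    return None

print(f"Thresholds first met {first_onset_scan(scans)} min after first echo")
```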

Full access
Jason A. Otkin, Mark Shafer, Mark Svoboda, Brian Wardlow, Martha C. Anderson, Christopher Hain, and Jeffrey Basara
Full access
Mark A. Shafer, Christopher A. Fiebrich, Derek S. Arndt, Sherman E. Fredrickson, and Timothy W. Hughes

Abstract

High-quality data sources are critical to scientists, engineers, and decision makers alike. The models that scientists develop and test with quality-assured data are eventually used by a wider community, from policy makers’ long-term strategies based upon weather and climate predictions to emergency managers’ decisions to deploy response crews. The process of developing high-quality data in one network, the Oklahoma Mesonetwork (Mesonet), is detailed in this manuscript.

The Oklahoma Mesonet quality-assurance procedures consist of four principal components: an instrument laboratory, field visits, automated computer routines, and manual inspection. The instrument laboratory ensures that all sensors that are deployed in the network measure up to high standards established by the Mesonet Steering Committee. Routine and emergency field visits provide a manual inspection of the performance of the sensors and replacement as necessary. Automated computer routines monitor data each day, set data flags as appropriate, and alert personnel of potential errors in the data. Manual inspection provides human judgment to the process, catching subtle errors that automated techniques may miss.
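
The abstract describes the automated routines only at this level of detail. As a purely illustrative sketch, not the Mesonet's actual QA tests, a daily range check that sets flags on suspect values and collects them into an alert list for manual review might look like the following; the station record layout, thresholds, and flag names are assumptions.

```python
# Illustrative daily QA pass: range-check observations, set flags, and build
# an alert list for human review. Field names, limits, and flag codes are
# invented for this sketch, not the Oklahoma Mesonet's actual procedures.
from dataclasses import dataclass

# Plausible physical limits per variable (assumed values).
RANGE_LIMITS = {
    "air_temp_c": (-40.0, 55.0),
    "rel_humidity_pct": (0.0, 103.0),
    "wind_speed_mps": (0.0, 75.0),
}

@dataclass
class Observation:
    station: str
    variable: str
    value: float
    flag: str = "OK"

def run_range_check(obs_list):
    """Flag values outside plausible limits and return items needing review."""
    suspects = []
    for obs in obs_list:
        lo, hi = RANGE_LIMITS.get(obs.variable, (float("-inf"), float("inf")))
        if not (lo <= obs.value <= hi):
            obs.flag = "RANGE_FAIL"
            suspects.append(obs)
    return suspects

if __name__ == "__main__":
    daily = [
        Observation("NRMN", "air_temp_c", 28.4),
        Observation("NRMN", "rel_humidity_pct", 121.0),  # implausible reading
    ]
    for s in run_range_check(daily):
        print(f"ALERT {s.station} {s.variable}={s.value} flagged {s.flag}")
```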

The quality-assurance (QA) process is tied together through efficient communication links. A QA manager serves as the conduit through whom all questions concerning data quality flow. The QA manager receives daily reports from the automated system, issues trouble tickets to guide the technicians in the field, and issues summary reports to the broader community of data users. Technicians and other Mesonet staff remain in contact through cellular communications, pagers, and the World Wide Web. Together, these means of communication provide a seamless system: from identifying suspicious data, to field investigations, to feedback on action taken by the technician.

Full access