Search Results

You are looking at 1–10 of 17 items for

  • Author or Editor: Christopher Fiebrich
Christopher A. Fiebrich and Kenneth C. Crawford

To ensure quality data from a meteorological observing network, a well-designed quality control system is vital. Automated quality assurance (QA) software developed by the Oklahoma Mesonetwork (Mesonet) provides an efficient means to sift through over 500 000 observations ingested daily from the Mesonet and from a Micronet sponsored by the Agricultural Research Service of the United States Department of Agriculture (USDA). However, some of nature's most interesting meteorological phenomena produce data that fail many automated QA tests. This means perfectly good observations are flagged as erroneous.

Cold air pooling, “inversion poking,” mesohighs, mesolows, heat bursts, variations in snowfall and snow cover, and microclimatic effects produced by variations in vegetation are meteorological phenomena that pose a problem for the Mesonet's automated QA tests. Although the QA software is engineered so that most observations of real meteorological phenomena pass the various tests, while remaining stringent enough to catch malfunctioning sensors, erroneous flags are often placed on data during extreme events.

This manuscript describes how the Mesonet's automated QA tests responded to data captured during microscale meteorological events, and how those data came to be flagged as erroneous. The Mesonet's operational plan is to catalog these extreme events in a database so that QA flags can be changed manually by expert eyes.
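The flag-then-override workflow described above can be sketched as a simple automated step test paired with a manually curated list of verified extreme events. This is an illustrative sketch only; the threshold, function names, and data structures are assumptions, not the Mesonet's actual QA implementation.

```python
# Illustrative step test: flag any temperature change between consecutive
# observations that exceeds a (hypothetical) plausibility threshold, then
# let a manually maintained set of verified extreme events clear the flag.
STEP_LIMIT_C = 10.0  # assumed maximum plausible change between observations

def step_test(temps_c):
    """Return one flag per observation: 'ok' or 'suspect'."""
    flags = ["ok"]  # the first observation has no predecessor to compare
    for prev, curr in zip(temps_c, temps_c[1:]):
        flags.append("suspect" if abs(curr - prev) > STEP_LIMIT_C else "ok")
    return flags

def apply_overrides(flags, verified_indices):
    """Manually clear flags for observations verified as real extreme events."""
    return ["ok" if i in verified_indices else f for i, f in enumerate(flags)]

obs = [24.1, 24.3, 36.0, 35.2]      # a heat-burst-like jump at index 2
flags = step_test(obs)              # index 2 is automatically flagged
flags = apply_overrides(flags, {2}) # expert review restores it to 'ok'
```

The point of the sketch is the division of labor: the automated test errs on the side of flagging, and the event catalog supplies the human judgment that restores good data.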

Full access
Claude Duchon, Christopher Fiebrich, and David Grimsley

Abstract

To better understand the undercatch process associated with tipping-bucket rain gauges, a high-speed camera normally used in determining the structure of lightning was employed. The photo rate was set at 500 frames per second to observe the tipping of the bucket in a commonly used tipping-bucket rain gauge. The photos showed detail never seen before as the bucket tipped from one side to the other. Two fixed rain rates of 19.9 mm h⁻¹ (0.78 in. h⁻¹) and 175.2 mm h⁻¹ (6.90 in. h⁻¹), the minimum and maximum available, respectively, were used.

The data from four tips at each rain rate were examined. The results show that the time from the beginning of a tip to the time the bucket assembly is horizontal—defined as the period during which undercatch occurs—was an average of 0.450 s for the eight cases. The average time for a complete tip was 0.524 s; thus, the vast majority of the time of a tip, 86%, is spent in undercatch mode. Because there was no apparent dependence of these times on rain rate, it should be possible to apply an accurate linear correction for undercatch as a function of rain rate given the time that undercatch occurs during a tip. Over all eight tips, the undercatch was found to be 0.98% for the 19.9 mm h⁻¹ rate and 8.78% for the 175.2 mm h⁻¹ rate. The procedure used to estimate the undercatch is described. Slow motion videos of the tipping of a bucket are available online.
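The linear correction suggested above can be sketched from the two undercatch values reported in the abstract (0.98% at 19.9 mm h⁻¹ and 8.78% at 175.2 mm h⁻¹). The function names and the choice of a simple two-point fit are illustrative assumptions, not the authors' procedure.

```python
# Two-point linear fit of undercatch percentage versus rain rate,
# using the values reported in the abstract.
R1, U1 = 19.9, 0.98     # rain rate (mm/h), undercatch (%)
R2, U2 = 175.2, 8.78

def undercatch_percent(rate_mm_h):
    """Linearly interpolate the undercatch percentage at a given rain rate."""
    slope = (U2 - U1) / (R2 - R1)
    return U1 + slope * (rate_mm_h - R1)

def corrected_rainfall(measured_mm, rate_mm_h):
    """Scale a measured accumulation up by the estimated undercatch."""
    return measured_mm * (1.0 + undercatch_percent(rate_mm_h) / 100.0)
```

Under this sketch, a gauge reporting 100 mm at the high rate would be corrected upward by roughly the 8.78% undercatch found at that rate.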

Full access
Christopher A. Fiebrich and Kenneth C. Crawford

Abstract

The research documented in this manuscript demonstrates that undeniable differences exist between values of daily temperature recorded by the National Weather Service Cooperative Observer Program network and data recorded by the Oklahoma Mesonet. Because of this fact, a transition to automated observations would have the effect of changing the climate record for Oklahoma. However, the change to automated observations would produce an improvement in overall data quality.

A sampling of daily data from the two networks was compared for closely spaced station pairs for the period 1 January 2003 through 31 December 2005. As a result, a host of observer errors was discovered (including transcription errors, incorrectly resetting the manual sensors, and delaying the observation time). These errors created daily differences between the two datasets that sometimes exceeded 5°C. More than 55% of the paired observations were found to differ by more than 1°C.
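The paired-station comparison amounts to counting the fraction of days whose absolute difference exceeds a threshold. A minimal sketch follows; the sample values are made-up placeholders, not the study's observations.

```python
# Fraction of paired daily observations whose absolute difference
# exceeds a threshold (1 degC, as in the comparison described above).
def fraction_exceeding(pairs, threshold_c=1.0):
    """Return the fraction of (a, b) pairs with |a - b| > threshold_c."""
    exceed = sum(1 for a, b in pairs if abs(a - b) > threshold_c)
    return exceed / len(pairs)

# Hypothetical co-located daily maxima: (COOP, Mesonet), degC
coop_vs_mesonet = [(31.1, 30.0), (28.9, 28.7), (25.0, 27.4), (33.3, 32.1)]
frac = fraction_exceeding(coop_vs_mesonet)  # 3 of 4 pairs differ by >1 degC
```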

Full access
Claude E. Duchon, Christopher A. Fiebrich, and Bradley G. Illston

Abstract

The May 2015 record rainfall that occurred across Oklahoma was the result of a large number of high-intensity rain events. A unique set of observations from gauges in the Oklahoma Mesonet, the NWS Cooperative Observer (COOP) network, the Community Collaborative Rain, Hail and Snow (CoCoRaHS) network, an experimental pit gauge system, and NWS radar was available that covered an area in and around Norman, Oklahoma. This paper documents the performance of the various gauges throughout the course of the month. Key findings are 1) observations from all methods significantly exceeded the 200-yr return interval; 2) a weighing-bucket gauge at ground level recorded amounts up to 4.5% higher than a similarly located ground-level tipping-bucket gauge and up to 8.2% higher than a nearby aboveground tipping-bucket gauge; 3) a manual COOP gauge recorded nearly identical (within 1.2%) observations as compared to an automated tipping-bucket gauge at a collocated Mesonet station; and 4) observations from 26 CoCoRaHS gauges yielded an average rainfall within 1% of the areally averaged radar rainfall derived from the Multisensor Precipitation Estimator.

Full access
Gary McManus, Thomas W. Schmidlin, and Christopher A. Fiebrich

A minimum temperature of −31°F (−35°C) was recorded at Nowata, Oklahoma, on 10 February 2011. This broke the previous record minimum temperature for Oklahoma of −27°F (−32.8°C). The Nowata station is in the Oklahoma Mesonet network. High pressure was centered over Oklahoma on the morning of the record with clear skies, calm winds, and a fresh snow cover of 38 cm at Nowata. A State Climate Extremes Committee (SCEC) examined the record, including the siting of the station, calibration of the thermometer, and depth of snow. The SCEC voted unanimously to approve the reading as the new lowest minimum temperature record for Oklahoma.

Full access
Christopher A. Fiebrich, Jadwiga R. Ziolkowska, Phillip B. Chilson, and Elizabeth A. Pillar-Little

Abstract

In recent years, technological developments in engineering and meteorology have provided the opportunity to introduce innovative extensions to traditional surface mesonets through the application of uncrewed aircraft systems (UAS). This new approach of measuring vertical profiles of weather variables by means of UAS in the atmospheric boundary layer, in addition to surface stations, has been termed a 3D mesonet. Technological innovations of a potential 3D mesonet have recently been described in the literature. However, a broader question remains about potential socioeconomic and environmental benefits and beneficiaries of this new extension. Because the 3D mesonet is a new concept, studies of its socioeconomic and environmental advantages (as compared with traditional mesonets) do not appear to exist in the peer-reviewed literature. This paper aims to fill this gap by providing a first perspective on potential benefits and ripple effects of a 3D mesonet, addressing both the added value and prevented losses in specific sectoral applications and for different groups. A better understanding of qualitative economic aspects related to a 3D mesonet can facilitate future developments of this technology for more cost-effective applications and to mitigate environmental challenges in more efficient ways.

Open access
Mark A. Shafer, Christopher A. Fiebrich, Derek S. Arndt, Sherman E. Fredrickson, and Timothy W. Hughes

Abstract

High quality data sources are critical to scientists, engineers, and decision makers alike. The models that scientists develop and test with quality-assured data eventually serve a wider community, from policy makers forming long-term strategies based upon weather and climate predictions to emergency managers deciding when to deploy response crews. This manuscript details the process of developing high quality data in one network, the Oklahoma Mesonetwork (Mesonet).

The Oklahoma Mesonet quality-assurance procedures consist of four principal components: an instrument laboratory, field visits, automated computer routines, and manual inspection. The instrument laboratory ensures that all sensors that are deployed in the network measure up to high standards established by the Mesonet Steering Committee. Routine and emergency field visits provide a manual inspection of the performance of the sensors and replacement as necessary. Automated computer routines monitor data each day, set data flags as appropriate, and alert personnel of potential errors in the data. Manual inspection provides human judgment to the process, catching subtle errors that automated techniques may miss.

The quality-assurance (QA) process is tied together through efficient communication links. A QA manager serves as the conduit through whom all questions concerning data quality flow. The QA manager receives daily reports from the automated system, issues trouble tickets to guide the technicians in the field, and issues summary reports to the broader community of data users. Technicians and other Mesonet staff remain in contact through cellular communications, pagers, and the World Wide Web. Together, these means of communication provide a seamless system: from identifying suspicious data, to field investigations, to feedback on action taken by the technician.

Full access
Christopher A. Fiebrich, Janet E. Martinez, Jerald A. Brotzge, and Jeffrey B. Basara

Abstract

In 1999, the Oklahoma Mesonet deployed infrared temperature (IRT) sensors at 89 of its environmental monitoring stations. A 3-yr dataset collected since that time provides a unique opportunity to analyze longer-term, continuous, mesoscale observations of skin temperature across a large area. Several limitations of the sensor have been identified and include 1) failure of the calibration equation during the cold season, 2) difficulty in keeping the sensor's lens clean at remote sites, and 3) limited representativeness of local conditions due to the sensor's narrow field of view. Despite these limitations, the Oklahoma Mesonet's skin temperature network provides a wealth of information that can be used to better understand many land–atmosphere interactions. Not only can the observations be used to estimate the partitioning of latent and sensible heat flux, they also provide beneficial “ground truth” estimates to validate remotely sensed estimates of skin temperature. This manuscript describes the IRT sensor, evaluates its performance, and provides analysis of time series data and observed spatial variability across Oklahoma.

Full access
Christopher A. Fiebrich, Cynthia R. Morgan, Alexandria G. McCombs, Peter K. Hall Jr., and Renee A. McPherson

Abstract

Mesoscale meteorological data present their own challenges and advantages during the quality assurance (QA) process because of their variability in both space and time. To ensure data quality, it is important to perform quality control at many different stages (e.g., sensor calibrations, automated tests, and manual assessment). As part of an ongoing refinement of quality assurance procedures, meteorologists with the Oklahoma Mesonet continually review advancements and techniques employed by other networks. This article’s aim is to share those reviews and resources with scientists beginning or enhancing their own QA program. General QA considerations, general automated tests, and variable-specific tests and methods are discussed.
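Two of the general automated tests this kind of QA program typically includes, a range test and a persistence test, can be sketched as follows. The limits shown are illustrative placeholders, not values from the Mesonet or any other network.

```python
# Generic automated QA sketches: a range test (value outside fixed
# climatological bounds) and a persistence test (variability so low
# over a window that the sensor is likely stuck).
AIR_TEMP_RANGE_C = (-40.0, 50.0)  # assumed plausibility bounds

def range_test(value, bounds=AIR_TEMP_RANGE_C):
    """Flag a single value that falls outside fixed bounds."""
    lo, hi = bounds
    return "ok" if lo <= value <= hi else "suspect"

def persistence_test(values, min_std=0.05):
    """Flag a window whose standard deviation is implausibly small."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return "ok" if var ** 0.5 >= min_std else "suspect"
```

In practice such thresholds are tuned per variable and per season, which is exactly the kind of variable-specific refinement the article surveys.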

Full access
Christopher A. Fiebrich, David L. Grimsley, Renee A. McPherson, Kris A. Kesler, and Gavin R. Essenberg

Abstract

The Oklahoma Mesonet, jointly operated by the University of Oklahoma and Oklahoma State University, is a network of 116 environmental monitoring stations across Oklahoma. Technicians at the Oklahoma Mesonet perform three seasonal (i.e., spring, summer, and fall) maintenance passes annually. During each 3-month-long pass, a technician visits every Mesonet site. The Mesonet employs four technicians who each maintain the stations in a given quadrant of the state. The purpose of a maintenance pass is to 1) provide proactive vegetation maintenance, 2) perform sensor rotations, 3) clean and inspect sensors, 4) test the performance of sensors in the field, 5) standardize maintenance procedures at each site, 6) document the site characteristics with digital photographs, and 7) inspect the station’s hardware. The Oklahoma Mesonet has learned that routine and standardized station maintenance has two unique benefits: 1) it allows personnel the ability to manage a large network efficiently, and 2) it provides users access to a multitude of station metadata.

Full access