Quality Control and Processing of Cooperative Observer Program Hourly Precipitation Data

Jay H. Lawrimore, NOAA/National Centers for Environmental Information, Asheville, North Carolina
David Wuertz, NOAA/National Centers for Environmental Information, Asheville, North Carolina
Anna Wilson, Center for Western Weather and Water Extremes, Scripps Institution of Oceanography, University of California, San Diego, La Jolla, California
Scott Stevens, Cooperative Institute for Satellite Earth System Studies, North Carolina State University, Asheville, North Carolina
Matthew Menne, NOAA/National Centers for Environmental Information, Asheville, North Carolina
Bryant Korzeniewski, Riverside Technology Inc. at NOAA/NCEI, Asheville, North Carolina
Michael A. Palecki, NOAA/National Centers for Environmental Information, Asheville, North Carolina
Ronald D. Leeper, Cooperative Institute for Satellite Earth System Studies, North Carolina State University, Asheville, North Carolina
Thomas Trunk, NOAA/National Weather Service/Surface and Upper Air Division, College Park, Maryland

Abstract

The National Oceanic and Atmospheric Administration (NOAA) has operated a network of Fischer & Porter gauges providing hourly and subhourly precipitation observations as part of the U.S. Cooperative Observer Program since the middle of the twentieth century. A transition from punched paper recording to digital recording was completed by NOAA’s National Weather Service in 2013. Subsequently, NOAA’s National Centers for Environmental Information (NCEI) upgraded its quality assurance and data stewardship processes to accommodate the new digital record, better assure the quality of the data, and improve the timeliness with which hourly precipitation observations are made available to the user community. Automated methods for removing noise, detecting diurnal variations, and identifying malfunctioning gauges are described along with quality control algorithms that are applied on hourly and daily time scales. The quality of the hourly observations during the digital era is verified by comparison with hourly observations from the U.S. Climate Reference Network and summary of the day precipitation totals from the Global Historical Climatology Network dataset.

Corresponding author: Jay Lawrimore, jay.lawrimore@noaa.gov


1. Introduction

From 1901 through 2018, average annual precipitation for the contiguous United States increased 7.6% (NCEI 2019). The frequency of heavy precipitation events also increased, in part due to the greater water-holding capacity of the atmosphere, which is attributable to warming global temperatures (Trenberth et al. 2003). Such an understanding of U.S. trends in precipitation relies on the collection of data from a variety of observational networks, some of which are operated by the National Oceanic and Atmospheric Administration (NOAA). The largest and longest-lived network in the United States is the NOAA National Weather Service (NWS) Cooperative Observer Program (COOP) network. This network, which began in 1895, consists of 8143 stations as of July 2019, supported largely by volunteer observers who use 8-in. precipitation gauges to make once-a-day observations of precipitation collected over the previous 24 h. [These data are available as part of the Global Historical Climatology Network–Daily (GHCNd) dataset; https://www.ncdc.noaa.gov/ghcn-daily-description.]

A small but significant subset of the COOP network also provides precipitation observations on temporal scales less than one day (e.g., hourly and subhourly). This subnetwork of more than 1900 Fischer & Porter (F&P) precipitation gauges is the primary source of observations for the COOP Hourly Precipitation Dataset (C-HPD) produced by NOAA’s National Centers for Environmental Information (NCEI). This dataset provides observations of hourly and 15-min precipitation amounts (rainwater and melted snow and ice) from the mid-1900s to present. In addition to supporting studies of climate variability and change (Hayhoe et al. 2018), the lengthy record of this network can be used to characterize the nature of precipitation rates and extremes at a variety of time intervals. Combined with its relatively high spatial density, the C-HPD provides an important source of data for many hydrometeorological applications, supporting water resource decision makers (Raff et al. 2013), infrastructure engineering (NCHRP 2012), and flash flood guidance for weather forecasting (Clark et al. 2014), among many other uses. It is important to note that this network complements many other observational networks recording precipitation, such as the Community Collaborative Rain, Hail and Snow Network (CoCoRaHS) and state and regional mesonets, which provide denser coverage and in some cases observations as often as every 5 min.

Throughout much of the network’s lifetime, the F&P gauges recorded precipitation through mechanical conversion of the weight of liquid collected in the instrument bucket to a punched paper tape record of the depth of precipitation every 15 min. In 2005, the NWS began upgrading the F&P stations by replacing the paper tape mechanism with a digital recorder (U.S. Department of Commerce 2005). The upgrade, designated as the F&P Upgrade (FPU) when it was first initiated, and later as the F&P Rebuild (FPR), aimed to improve the quality and completeness of hourly precipitation observations while reducing maintenance costs. As a replacement for paper tapes, which are subject to tearing, deterioration, and being expended between site visits, the upgrade introduced digital recording via a datalogger.

Accompanying this transition to digital recording are new data acquisition, integration, and quality control processes developed at NCEI. This new approach replaces a largely manual review-and-edit process with a fully automated system that removes the subjectivity previously required for quality control and processing. The conversion from paper to digital recording, along with the development of an objective quality control process, is aimed at greatly improving the quality of the C-HPD data, which are available from NCEI as version 2 of the C-HPD. The remainder of this paper describes the automated processing and quality control system: section 2 describes the data sources, and section 3 summarizes data acquisition and the process of converting gauge depths to incremental precipitation totals. The quality control of hourly precipitation totals is described in section 4, section 5 contains an evaluation of the data, and conclusions are provided in section 6.

2. Data sources

Stations in the C-HPD network record the depth of precipitation in the gauge every 15 min. The network originally consisted of several different weighing rain gauge instruments before F&P gauges became the standard. Automated measurements recorded on paper tape were phased into the network in the early 1960s (Wilson et al. 2010). The locations of the HPD stations operating in 2019 are shown in Fig. 1. The number of stations operating each month during the period of record, in both the legacy and digital eras, is shown in Fig. 2. Note that in 2019 there were no C-HPD stations operating in the states of Massachusetts and Rhode Island, and only one station each in New Jersey and Delaware. The distribution of stations in the C-HPD network is based on a number of factors such as the availability of data from stations in other complementary observing networks.

Fig. 1. The location of each F&P station operating in 2019 and the period of record length (years) as indicated by the color.

Fig. 2. The number of operating stations in the C-HPD network from 1940 through December 2019: total number of stations (red), number of legacy (punched paper recording) stations (green), and number of stations converted to digital recording (blue).

a. Legacy punched paper data

Until the conversion to digital recording, data collection required removal and replacement of paper tapes each month by a trained observer. The tapes were packaged by the respective Weather Forecast Office (WFO) and sent to NCEI or a contract-managed processing facility. Precipitation data were extracted by running the tapes through a MITRON punched paper tape reader, which converted the observations into a digital precipitation record of gauge depth at 15-min increments (Hammer and Reek 1997). The gauge depths were subsequently converted to 15-min precipitation amounts and summed into hourly totals as part of operational processing at NCEI. Stewardship of and access to the hourly and 15-min observations were provided through the datasets designated DSI-3240 and DSI-3260, respectively.

The legacy data from DSI-3240 (NCDC 2003) are retained for the HPD stations converted to digital recording and merged with the record produced from the digital gauge depth observations to produce a C-HPD dataset that spans the full period of record from the 1940s to the present. The data processing described in section 3 and the following sections pertains only to data collected during the digital era. However, because the quality control flags of the legacy DSI-3240 dataset are retained in the merged version 2 dataset, a short description of those checks is included in section 4g.

b. Digital recording

The punched paper recording device provided a half-century record of precipitation across the United States. However, observations recorded on the punched paper tape were subject to loss due to tearing, deterioration, and the tape being expended between site visits. As aging equipment became more difficult to maintain, and with better technologies available, the NWS began the upgrade and replacement of the legacy equipment in 2005 (U.S. Department of Commerce 2005).

The upgrades took place in a series of deployments beginning with approximately 250 stations designated as F&P Upgrade (FPU). This was followed by deployment of F&P Rebuild (FPR) equipment designated FPR-D and FPR-E, provided by Sutron Corp., and Coastal Environmental Systems, Inc., respectively. (The FPU equipment was subsequently replaced with FPR-D or FPR-E equipment before the end of the upgrade project.) The last stations were converted to digital recording in 2014 and as of October 2019 there are 1961 digital recording stations in the version 2 inventory.

The various designations (FPU, FPR-D, and FPR-E) reflect different vendor-specific equipment. Although there are some differences between vendors, all equipment provides a digital record of 15-min gauge depths recorded on dataloggers at each site. To collect and disseminate the data, the COOP site observer accesses the FPR, downloads the data to a portable memory device, and subsequently transfers the data to a WFO workstation in the respective forecast area (FA). The data files for all stations in the WFO FA are then bundled into a Windows zip file and transferred to NCEI. Data downloads are accomplished monthly at most stations and are relatively simple to perform, and the procedures for submitting the data files to NCEI have proven robust.

Each zip file contains a set of ASCII files of 15-min gauge depths, one for each F&P station. Automated processing at NCEI includes ingest and archival of each zip file as it is received, followed by parsing and downstream quality control. An automated review of each station file is made to identify problems associated with filename conventions and data corruption so that feedback can be provided to the NWS WFOs. This is described in more detail in section 3.

c. Other F&P stations

Approximately 200 F&P stations were not designated for inclusion in the digital recording upgrade project. More than half of these stations continue to operate, but were not included in the initial release of C-HPD version 2. These include 120–130 stations that are transmitted hourly via satellite or telephone telemetry and collected as part of the NWS Hydrometeorological Acquisition Data System (HADS). These stations were specially outfitted with telemetry capabilities in order to support precipitation forecasting and warning activities for the NWS, the U.S. Army Corps of Engineers, and the U.S. Geological Survey. An additional 30–40 stations that were not part of the upgrade project are transcribed from the punched paper tape at some WFO offices and sent to NCEI each month via NWS Form 79-IDs. Future plans include the addition of these stations to the C-HPD dataset.

d. Local climatological data

For many years the F&P data in the HPD dataset were augmented by approximately 270 stations operating at major airports and NWS offices. These stations provided primarily manual observations of hourly precipitation totals until the Automated Surface Observing System (ASOS) was deployed in the mid-1990s (National Research Council 2012). The hourly precipitation totals from these stations were processed and quality controlled primarily as part of NCEI’s Local Climatological Data (LCD) program, but they also were merged with hourly precipitation totals from the F&P network to improve the spatial coverage of hourly observations in the legacy DSI-3240 dataset. There are now more than 900 stations in the ASOS network and thousands more in the Automated Weather Observing System (AWOS) and other networks. No ASOS or AWOS stations are now included as part of the C-HPD version 2 dataset. Rather, there are ongoing parallel efforts at NCEI to incorporate C-HPD data along with data from these and other networks into a single integrated hourly dataset that will be released in the future.

3. Preprocessing, filtering, and conversion of 15-min gauge depth data

Data processing begins with acquisition and preprocessing steps, followed by filtering and conversion of the gauge depths to incremental precipitation, then quality control, and data output. Following each site visit and consolidation of the files within the forecast area, the responsible WFO transfers a single zip file to NCEI via ftp, typically within the first two to three weeks of each month.

When data arrive, a check is performed to confirm that the files are successfully unzipped, the filenames and date ranges are valid, and the data are correctly formatted. Any data duplicating previous transmissions are removed. The status of processing is written to a log file that is made available to each WFO to confirm that their data were successfully processed at NCEI or to determine the source of any problem that needs to be resolved.
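As an illustration of the kind of integrity checking described above, the following Python sketch validates an incoming WFO zip bundle and screens for duplicate transmissions. It is not the operational NCEI ingest code; the file-naming convention (`<COOPID>_<YYYYMM>.txt`) and the report format are assumptions made for the example.

```python
import hashlib
import zipfile
from pathlib import Path

def check_wfo_bundle(zip_path, seen_digests):
    """Report corrupt members, unexpected filenames, and duplicate files in a WFO bundle."""
    report = []
    with zipfile.ZipFile(zip_path) as zf:
        bad = zf.testzip()                              # first corrupt member, or None
        if bad is not None:
            report.append(f"CORRUPT: {bad}")
        for name in zf.namelist():
            parts = Path(name).stem.split("_")
            # assumed naming convention: <COOPID>_<YYYYMM>.txt
            if len(parts) != 2 or len(parts[1]) != 6 or not parts[1].isdigit():
                report.append(f"BAD NAME: {name}")
                continue
            digest = hashlib.sha256(zf.read(name)).hexdigest()
            if digest in seen_digests:                  # byte-identical to an earlier transmission
                report.append(f"DUPLICATE: {name}")
            else:
                seen_digests.add(digest)
    return report
```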

Four filtering and processing steps, described below, are then performed to convert the gauge depth values to 15-min precipitation totals, which are summed to hourly amounts (Fig. 3).

Fig. 3. Data flow diagram of the C-HPD ingest and quality control process. This consists of four major parts: data ingest and integrity checks, 15-min gauge depth filtering, conversion to hourly period of record data, and quality control of the hourly precipitation totals.

Changes in gauge depth that are not associated with precipitation can occur for a variety of reasons, and the sensitivity of gauge measurements varies with the differing levels of attention given to instrument maintenance across the network. To distinguish changes in gauge depth resulting from precipitation from those caused by other factors, a series of thresholds was evaluated across hundreds of station-months to determine the optimum values. These thresholds are reflected in the processing for missing observations, high-frequency noise, diurnal fluctuations, and malfunctioning gauges as described below.

a. Handling missing observations

It is not uncommon to find instances of one or more missing 15-min gauge depth measurements in a monthly HPD file. When producing the monthly series of incremental changes in gauge depth we replace periods of missing observations with estimated values if the period of missing data is less than one day. The gauge depths at the start and end of each missing period are compared to determine if infilling is possible. In instances when the gauge depth values are identical before and after the missing period, the missing period of 15-min values is filled with increments of zero. If the difference between the beginning and end of the missing period is less than or equal to 0.51 mm (0.02 in.), the missing values are replaced with the average of the incremental change in gauge depth values. This is applied to missing segments of three or more values. Otherwise the values remain missing. In only a small number of cases (~1% of all observations) is it not possible to fill in a missing value because the difference between gauge depths before and after the missing period is too large.
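A minimal Python sketch of this infilling step is shown below. It assumes the month of data is available as a NumPy array of 15-min cumulative gauge depths (in millimeters) with NaN marking missing reports; the function name and array layout are illustrative, not the operational code.

```python
import numpy as np

def fill_missing_increments(depth_mm, max_gap=96, tol_mm=0.51, min_len=3):
    """Fill short gaps in the 15-min gauge-depth series and return the incremental changes."""
    depth = np.asarray(depth_mm, dtype=float)
    inc = np.diff(depth)                           # inc[k] = depth[k+1] - depth[k]
    missing = np.isnan(depth)
    i, n = 0, len(depth)
    while i < n:
        if not missing[i]:
            i += 1
            continue
        j = i
        while j < n and missing[j]:
            j += 1                                 # depth[i:j] is the missing run
        run = j - i
        if 0 < i and j < n and run < max_gap:      # bounded on both sides, shorter than one day
            delta = depth[j] - depth[i - 1]        # net change in depth across the gap
            if delta == 0.0:
                inc[i - 1:j] = 0.0                 # identical depths: fill with zero increments
            elif abs(delta) <= tol_mm and run >= min_len:
                inc[i - 1:j] = delta / (run + 1)   # spread the small change evenly
            # otherwise the gap stays missing (NaN)
        i = j
    return inc
```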

b. Removal of high-frequency noise

High-frequency, low-amplitude noise is a common issue with the F&P instruments. This can occur due to factors associated with equipment age and the level of maintenance for a station as well as natural and manmade influences such as fluctuations associated with moderate to high wind speeds, small earthquakes, and the passage of nearby rail traffic. Small negative and positive changes [less than ±0.76 mm (0.03 in.)] in gauge depth that occur in the absence of precipitation are removed by locating offsetting negative and positive oscillations.
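One simple way to express the idea of canceling offsetting oscillations is sketched below. The pairing rule (searching a short look-ahead window for an increment of opposite sign and similar magnitude) and the window length are illustrative choices; the text above does not specify the operational pairing logic.

```python
import numpy as np

def cancel_small_oscillations(inc_mm, amp_mm=0.76, window=8, tol_mm=0.05):
    """Zero out small, offsetting up/down wiggles in the 15-min increments (illustrative)."""
    inc = np.asarray(inc_mm, dtype=float).copy()
    n = len(inc)
    for i in range(n):
        if not (0.0 < abs(inc[i]) < amp_mm):
            continue
        # look ahead a short distance for a small increment of opposite sign that offsets it
        for j in range(i + 1, min(i + 1 + window, n)):
            if (0.0 < abs(inc[j]) < amp_mm and np.sign(inc[i]) != np.sign(inc[j])
                    and abs(inc[i] + inc[j]) <= tol_mm):
                inc[i] = inc[j] = 0.0
                break
    return inc
```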

c. Removal of diurnal fluctuations

Diurnal fluctuations of bucket values unrelated to precipitation are present in almost all F&P gauges, with varying degrees of amplitude depending on maintenance procedures and climate zone. We have observed that diurnal fluctuations typically manifest as a slow increase in reported gauge depth, apparently in response to temperatures rising through the day.

This procedure to remove the effects of diurnal fluctuations begins by summing the incremental changes in gauge depth in order to compute the cumulative precipitation (CP) totals throughout the month. The differences between the smallest and largest CP values for each day of the month (i.e., daily amplitude) are determined, then the 10th percentile of these daily amplitudes is computed. Most 24-h changes in gauge depth caused by precipitation are larger than diurnal cycles of gauge depth on dry days. Therefore, a day is determined to be dry if the difference between the current day and previous day’s lowest CP is less than the 10th percentile and the difference between the current day and previous day’s highest CP is less than the 10th percentile. Days identified as dry have each 15-min increment set to zero; otherwise, the 15-min incremental values remain unchanged.
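The dry-day test described above can be expressed compactly with pandas, as in the following sketch. The inputs (a NumPy array of 15-min increments and matching timestamps) and the handling of the month boundary are simplified assumptions of the example.

```python
import numpy as np
import pandas as pd

def zero_dry_days(inc_mm, times):
    """Set 15-min increments to zero on days classified as dry by the amplitude test."""
    s = pd.Series(np.asarray(inc_mm, dtype=float), index=pd.DatetimeIndex(times))
    cp = s.cumsum()                                   # cumulative precipitation through the month
    daily_min = cp.resample("D").min()
    daily_max = cp.resample("D").max()
    p10 = (daily_max - daily_min).quantile(0.10)      # 10th percentile of daily amplitudes
    dry = (daily_min.diff() < p10) & (daily_max.diff() < p10)
    mask = s.index.normalize().isin(dry[dry].index)   # every 15-min slot on a dry day
    s[mask] = 0.0
    return s.to_numpy()
```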

An example of diurnal fluctuations is shown for Canyon Dam, Texas, in April 2013 (Fig. 4a). The gauge depth during the early days of the month varied slowly, with only small diurnal noise, which was detected and set to zero; the values after 20 April were noisy for other reasons and were flagged as described in the next section.

Fig. 4. Example of (a) diurnal variations in gauge depth values for station Canyon Dam, Texas (COOP ID 411429), during April–May 2013. Large fluctuations began suddenly on 20 Apr and ended on 3 May. Small diurnal fluctuations are visible before and after this period; these fluctuations are identified and the gauge depth incremental changes set to zero except during periods of precipitation. Routine gauge emptying occurred on 21 May. (b) The measured accumulation of precipitation during the period of functioning instrumentation is also shown.

d. Detection of malfunctioning gauges

Large fluctuations unrelated to precipitation can occur due to malfunctions in the gauge. Algorithms are used to identify these periods, to flag observations as invalid when necessary, and to preserve as much of the record of precipitation as possible. The checks are performed within 24-h blocks (0000–2359 LST). Days are invalidated when gauge depth oscillations are greater than ±0.76 mm (0.03 in.) or spikes exceed 20.3 mm (0.8 in.). Figure 4 provides an example of an instrument malfunction that occurred from 20 April to 3 May 2013 for the station in Canyon Dam, Texas. This station was operating nominally until a malfunction occurred on 20 April. Data during this period were invalidated and the hourly observations set to missing.

At the conclusion of the gauge depth filtering process the four 15-min values are summed across each hour to compute hourly precipitation amounts. Any hour with one or more missing 15-min totals is set to missing. Additional quality control is performed on the hourly totals as described in the next section.
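The following sketch illustrates the spirit of the day-level malfunction check and the roll-up to hourly totals. The oscillation and spike tests are simplified relative to the operational algorithm (large negative 15-min changes are treated here as evidence of malfunction because precipitation cannot lower the gauge depth); only the thresholds come from the text.

```python
import numpy as np
import pandas as pd

def to_hourly(inc_mm, times, osc_mm=0.76, spike_mm=20.3):
    """Invalidate malfunctioning days, then sum 15-min increments to hourly totals."""
    s = pd.Series(np.asarray(inc_mm, dtype=float), index=pd.DatetimeIndex(times))
    for day, block in s.groupby(s.index.normalize()):
        big_drops = block[block < -osc_mm]              # depth cannot fall this much from rain alone
        spikes = block[block.abs() > spike_mm]          # implausibly large 15-min jumps
        if len(big_drops) > 0 or len(spikes) > 0:
            s.loc[block.index] = np.nan                 # invalidate the entire 0000-2359 LST block
    # an hour is valid only if all four 15-min values are present
    return s.resample("1h").sum(min_count=4)
```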

4. Quality control of hourly precipitation totals

Before the quality control algorithms are applied, the hourly precipitation totals from the C-HPD digital era (generally 2005 to present) are combined with historical data from NCEI’s DSI-3240 dataset. The DSI-3240 dataset contains hourly precipitation through December 2013, which includes any data from stations converted to digital recording beginning in 2005. In the C-HPD dataset, hourly values derived from the 15-min gauge depth values of the digital era using the processes described in this article take precedence over the hourly data sourced from DSI-3240.
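In pandas terms, this precedence rule amounts to a `combine_first`, sketched below under the assumption that both sources are hourly Series indexed by timestamp; this illustrates the rule itself, not the operational merge code.

```python
import pandas as pd

def merge_hourly(digital_era, legacy_dsi3240):
    """Digital-era hourly values win wherever both sources report; DSI-3240 fills the rest."""
    return digital_era.combine_first(legacy_dsi3240)
```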

The combined hourly record is subjected to a set of four automated quality control algorithms. An additional quality control check is performed at the daily time scale using sums of the 24 hourly totals. Although the DSI-3240 hourly values were quality controlled when the data were first collected and processed (section 4g), the new automated hourly quality control checks are applied to the full period of record, providing an additional layer of quality assurance for the DSI-3240 era.

The quality control process is based on a model established by Durre et al. (2008) and first applied in development of quality assurance processes for NCEI’s GHCNd data (Menne et al. 2012). The strategy involves complete automation using a quality control system in which data are analyzed consistently and objectively. Manual intervention is used prior to the implementation of the quality control algorithms to identify and validate the thresholds used in the system’s decision making. For each potential threshold value an empirical assessment of random samples of flagged values is conducted to select the best threshold based on flag rate, false positive rate, types of errors detected, and the conditions under which errors might not be identified.

Advantages of this method include the removal of the subjective component of operational quality control and the use of a consistent set of checks throughout the period of record. Most importantly, the ability to process the entire period of record makes it possible to apply quality control retrospectively as new methods are developed and to do so in a consistent manner throughout the life of the data.

a. U.S. extreme exceedance (hourly)

The greatest all-time hourly precipitation total in the United States (and globally) was recorded in Holt, Missouri, located in the U.S. Central Plains north of Kansas City, Missouri. This is a region often subject to severe storms produced by a variety of systems including cold fronts, squall lines, and arctic fronts (Locatelli and Hobbs 1995). On 22 June 1947 a very intense small-area rainstorm associated with local intensification ahead of a surface cold front produced a total of 304.8 mm (12.0 in.) of rainfall during a 42-min period. This total was estimated by a bucket survey conducted by the U.S. Army Corps of Engineers (Lott 1954). This U.S. and global hourly rainfall record is used as the upper limit of valid hourly observations in the C-HPD dataset. Any hourly observation exceeding this amount is flagged as invalid as the first step in the quality control process. If an extreme event of similar or greater magnitude were to produce a valid hourly total exceeding this amount it would be reviewed and corrected through the Datzilla process (section 4f).

b. State extreme exceedance (daily)

In addition to the hourly extreme check described above, a second extremes check is performed on the daily time scale using records maintained by NOAA’s State Climate Extremes Committee (SCEC; https://www.ncdc.noaa.gov/extremes/scec/records). These records provide the all-time highest precipitation recorded over a 24-h period in each of the 50 U.S. states. They serve as a valuable indicator of data quality since their accuracy is evaluated and verified through the SCEC.

This check is performed after summing the C-HPD hourly totals for each day (0000–2359 LST) for every C-HPD station. The daily totals for each station are compared to the record 24-h totals for the respective state. For any daily total found to exceed the statewide record, all hourly observations for the day are invalidated.
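A compact sketch of both extreme-exceedance checks (sections 4a and 4b) is given below. It assumes an hourly pandas Series on a local-standard-time index and the SCEC 24-h record for the station's state; flag encoding and bookkeeping are omitted.

```python
import pandas as pd

US_HOURLY_RECORD_MM = 304.8   # Holt, Missouri, 22 June 1947 (section 4a)

def extreme_flags(hourly, state_record_24h_mm):
    """Boolean flags for the U.S. hourly record check and the SCEC daily record check."""
    flags = hourly > US_HOURLY_RECORD_MM                 # section 4a: exceeds the all-time hourly record
    daily = hourly.resample("D").sum(min_count=1)        # 0000-2359 LST daily totals
    bad_days = daily[daily > state_record_24h_mm].index
    flags |= hourly.index.normalize().isin(bad_days)     # section 4b: invalidate every hour of that day
    return flags
```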

c. Streak check

This is the first of three checks based on the quality control algorithms first developed for the GHCNd dataset (Durre et al. 2010). A streak is an implausibly long sequence of identical values occurring consecutively, even though the individual values may lie within the climatological distribution of data values for the station. The minimum number of values that constitutes an implausibly long streak was determined through a series of evaluations of various thresholds to identify those that support reliable detection of such streaks without producing a high rate of false positives. In evaluating streaks of precipitation, missing and zero hourly totals are ignored. A streak of 20 or more identical nonzero hourly totals less than or equal to 7.6 mm (0.30 in.) results in all values being flagged as invalid. For heavy precipitation (≥7.6 mm; 0.30 in.), values are flagged for streaks of five or more identical nonzero hourly totals. An example of a streak of 12 consecutive hourly values of 25.4 mm (1.0 in.) recorded 11–13 October 1980 at Yaquina Bay, Oregon, is shown in Fig. 5. Contained within the streak is an hourly value of 50.8 mm (2.0 in.) that was previously flagged invalid as an extreme value as part of the legacy quality control (section 4g) and thus is considered a missing value when evaluating the series for streaks.

Fig. 5. Example of hourly values of 25.4 mm (1.0 in.) flagged as invalid by the streak check for USC00359581, Yaquina Bay, Oregon, on 11–13 Oct 1980. An extreme value previously flagged and removed by the legacy extreme check [50.8 mm (2.0 in.)] also is shown.
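For illustration, the streak logic can be written as follows, using the thresholds stated in the text. The function operates on a NumPy array of hourly totals with NaN for missing or previously flagged hours; values exactly at 7.6 mm are handled by the heavy-precipitation rule, an assumption of the example.

```python
import numpy as np

def streak_flags(hourly_mm, light_len=20, heavy_len=5, heavy_mm=7.6):
    """Flag implausibly long runs of identical nonzero hourly totals (NaN and zero hours are skipped)."""
    vals = np.asarray(hourly_mm, dtype=float)
    flags = np.zeros(len(vals), dtype=bool)

    def close_run(run):
        if not run:
            return
        value = vals[run[0]]
        needed = heavy_len if value >= heavy_mm else light_len
        if len(run) >= needed:
            flags[run] = True                 # flag every member of the streak

    nonzero = [i for i, v in enumerate(vals) if np.isfinite(v) and v != 0.0]
    run = nonzero[:1]
    for i in nonzero[1:]:
        if vals[i] == vals[run[-1]]:
            run.append(i)                     # extend the current run of identical values
        else:
            close_run(run)
            run = [i]
    close_run(run)
    return flags
```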

d. Gap check

The gap check examines the frequency distribution of hourly observations to identify values in the distribution’s tail when the tail is unrealistically separated from the remainder of the distribution. The gap threshold, or maximum allowable separation of the tail from the remainder of the distribution, is independent of station location and time.

The distribution of hourly precipitation totals is computed within 31-day moving windows for each station. These moving windows are overlapping so that each consists of 31-day periods computed beginning with day 1 through day 31, then day 2 through day 32, etc. Each of these periods contains data throughout the station’s period of record for the respective 31-day period. Within each period hourly totals are sorted from smallest to largest, and the differences between consecutive values are then calculated. Any hourly values separated by more than a gap threshold of 31.75 mm (1.25 in.) from the next closest value are identified and all values on the upper side of the gap are flagged as invalid. The threshold was determined using the method of Durre et al. (2010) as noted above. A value flagged by the gap check is shown in Fig. 6 for Spring Branch, Texas.

Fig. 6. Example of a single hourly value flagged as invalid by the gap check (red) for USC00418544, Spring Branch, Texas, on 29 Sep 2013. Note that the value does not exceed the threshold (7 × 95th percentile) for a climatological outlier (section 4e).
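A simplified sketch of the gap check follows. It approximates the overlapping 31-day windows with a window centered on each calendar day and assumes an hourly pandas Series on a DatetimeIndex; the operational implementation may organize the windows differently.

```python
import numpy as np
import pandas as pd

def gap_flags(hourly, gap_mm=31.75):
    """Flag hourly totals lying above an unrealistic gap in the sorted 31-day distribution."""
    flags = pd.Series(False, index=hourly.index)
    doy = hourly.index.dayofyear
    for center in np.unique(doy):
        in_window = ((doy - center) % 366 <= 15) | ((center - doy) % 366 <= 15)
        sample = hourly[in_window].dropna().sort_values()
        if len(sample) < 2:
            continue
        jumps = sample.diff()
        too_big = jumps[jumps > gap_mm]
        if too_big.empty:
            continue
        cutoff = sample.loc[too_big.index[0]]            # first value on the upper side of the gap
        flags |= (hourly >= cutoff) & in_window          # flag everything at or above it in this window
    return flags
```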

e. Climatological outlier

The climatological outlier check is applied to identify hourly observations that are so far outside the climatological distribution of the station’s period of record as to very likely be invalid. This is a percentile-based outlier check that flags precipitation totals that exceed 7 times the corresponding climatological 95th percentile for the calendar day on which the total was observed. The percentile is computed from nonzero hourly totals observed during all available years and within a 31-day window centered on the calendar day of the observation. An outlier recorded on 7 April 2007 for Pulaski, Virginia, is shown in Fig. 7.

Fig. 7. Example of a single hourly value flagged as invalid by the climatological outlier check (red) for USC00446955, Pulaski, Virginia, on 7 Apr 2007.
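The outlier rule translates directly into the following sketch, which pools nonzero hourly totals from a 31-day window centered on each calendar day across all available years and flags values above 7 times the 95th percentile. The pandas Series input is an assumption of the example.

```python
import numpy as np
import pandas as pd

def climatological_outlier_flags(hourly, factor=7.0, pct=0.95):
    """Flag hourly totals above `factor` times the 31-day climatological 95th percentile."""
    flags = pd.Series(False, index=hourly.index)
    doy = hourly.index.dayofyear
    for center in np.unique(doy):
        in_window = ((doy - center) % 366 <= 15) | ((center - doy) % 366 <= 15)
        pool = hourly[in_window]
        pool = pool[pool > 0]                       # percentile is computed from nonzero totals only
        if pool.empty:
            continue
        threshold = factor * pool.quantile(pct)
        flags |= (doy == center) & (hourly > threshold)
    return flags
```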

f. Expert analysis (Datzilla)

Following quality control, all nonmissing observations are either unflagged (i.e., valid) or they are flagged to indicate the quality control check under which the observation was determined to be invalid. If the quality of any observation is subsequently found to be different than that classified by the automated quality control process, an exception can be made and documented through NCEI’s Datzilla system (Shein 2008).

For valid observations later determined to be erroneously flagged by automated algorithms (false positives), a Datzilla override is applied to ensure the value is not flagged during the quality control process described above. Conversely, for unflagged observations that are found to be invalid through other corroborating evidence, a quality control flag is appended to invalidate the values. The flags set through the Datzilla process take precedence over all other quality control checks. During the quality control process, the Datzilla flags are set before the automated checks are performed (Fig. 3).

Datzilla is a web-based system used by NOAA staff to document all information associated with the suspected data quality issue, all steps taken to investigate the issue, and any actions taken to correct the error. All supporting materials are also included. This includes specific verifiable information such as that provided by a local expert who witnessed the extreme event or has other evidence to support the change in quality. All information related to the event in question and the evidence which supports a change to a quality indicator is documented in the Datzilla system and a corresponding source flag appended to the observation.

g. Legacy checks

Quality control flags from the legacy DSI-3240 dataset are retained in version 2 in order to preserve the historical data quality information. The quality control process for the legacy dataset was a manually intensive process requiring a series of decisions by trained meteorological technicians. Because of the lack of full automation, when new quality control processes were established it was not possible to re–quality control previous months and years of data with the newly established checks. Thus some checks are noted as being used only during certain periods of time. Additional information on the legacy quality control checks is available in Phillips (1985) and Hammer and Reek (1997).

  • Extreme value test (prior to 1996): For observations that were not accumulated totals, the value was flagged if it exceeded the statewide 1-h, 100-yr return period precipitation amount. For accumulated precipitation totals, the value was flagged if it exceeded the statewide 24-h extreme precipitation total.

  • Missing 15-min observation (after January 1996): The hourly value excludes one or more 15-min periods.

  • Suspect value (after January 1996): Value was determined to be suspect with regard to the times or period of occurrence.

The next three legacy quality control checks were retained for completeness even though they are consistent with C-HPD version 2 measurement flags.

  • Missing: Value is missing in the DSI-3240 dataset and no alternate data source is available.

  • Deleted: Value was deleted during the quality control process.

  • Accumulation: Value is not an hourly precipitation total but rather an accumulation total for a period greater than an hour in duration and lasting through the end of the hour. Accumulations across multiple hours exist only from the legacy DSI-3240 data source.

h. Flag rates

Flag rates for the years 1950 through June 2019 are shown in Table 1. Of the more than 630 million observations in the C-HPD dataset, only 72 152 are flagged invalid (0.01%). This is a very low flag rate for an in situ observational dataset. The reason for this is twofold. First, in the predigital era, observations were most often deleted rather than flagged if they were determined to be invalid. Second, the processing steps for digital-era data involved in converting 15-min gauge depth values to incremental precipitation (as described in section 3) include quality assurance algorithms that remove periods when malfunctions of the gauge would result in invalid observations. Although the flag rates for each hourly quality control check are low, the ability to identify and flag invalid observations is essential to the overall quality of the C-HPD dataset.

Table 1. Number (and rate) of hourly values flagged by each of the automated and legacy (DSI-3240) quality control checks. The number of flagged values is for period-of-record data available as of June 2019.

5. Verification through comparative analysis

We conducted an analysis of the C-HPD hourly precipitation data over the period since digital recording began to verify that the gauge filtering and quality control procedures provide an accurate record of hourly precipitation. This was performed in two ways. First, we compared the C-HPD data to hourly precipitation observations from NOAA’s U.S. Climate Reference Network (USCRN; Diamond et al. 2013). The second involved a comparison of daily precipitation totals (computed by summing hourly totals across each day) against daily totals from the GHCNd dataset (Menne et al. 2012).

a. Frequency of hourly totals exceeding various thresholds (comparison with USCRN)

The USCRN is a very high quality climate observing network, designed to adhere to Global Climate Observing System (GCOS) climate monitoring principles (www.wmo.int/pages/prog/gcos/documents/GCOS_Climate_Monitoring_Principles.pdf) and with instrumentation that is regularly calibrated to National Institute of Standards and Technology (NIST) traceable standards (Diamond et al. 2013). Configured with triplicate measurements of temperature, precipitation, soil moisture, and soil temperature, the USCRN provides the best standard against which the C-HPD data could be compared. Precipitation is measured in 5-min increments using a Geonor weighing precipitation gauge; the 12 values are summed to hourly totals (Leeper et al. 2015). A wetness sensor at each station is used to ensure noise in the vibrating wires of the Geonor does not result in false precipitation reports. (The F&P gauges have no wetness sensor to eliminate the effect of noise.)

This analysis also includes stations from the U.S. Regional Climate Reference Network (USRCRN). The USRCRN operated in the Southwest and Alabama until this program ended in 2012 and stations were transferred to each state for operations and maintenance. The higher density of reference network observations in those states is clearly seen in coverage maps as discussed below. Subsequent discussion of our analysis refers to all USCRN and USRCRN stations as USCRN.

We made initial comparisons by computing the frequency of hourly precipitation amounts exceeding various thresholds for heavy precipitation [7.6 mm (0.30 in.) or greater], moderate to heavy [2.54 mm (0.10 in.) or greater], and light to heavy precipitation [1.0 mm (0.04 in.) or greater]. We conducted seasonal analyses using all nonmissing hours for which there were valid C-HPD and USCRN data values. Looking at the frequency of heavy precipitation (Fig. 8), there is strong consistency in the patterns across all four seasons. In the winter season, heavy precipitation occurs approximately 0.1%–0.3% of the time across most of the United States in both C-HPD and USCRN networks, with the exception of the Southeast and parts of the West Coast and Hawaii, where heavy precipitation occurs 0.5% or more. In the spring, the area of 0.5% and higher frequencies extends northward into parts of the mid-Atlantic and Midwest in both networks, and in the summer all areas east of the Rockies have frequencies of heavy precipitation 0.5% and higher, while lower frequencies stretch to the west coast. By fall, the area of higher frequencies retreats to a pattern generally similar to that of spring.

Fig. 8. Frequency of occurrence of heavy precipitation [hourly totals of 7.62 mm (0.30 in.) or more] for stations in the USCRN/USRCRN networks (squares) and the C-HPD network (circles) during the C-HPD digital era for (a) winter, (b) spring, (c) summer, and (d) fall.

The benefit of the greater spatial density of the C-HPD network is evident most notably in the summer. While the USCRN and C-HPD stations show frequencies of heavy precipitation greater than 0.6% in southern Florida and the east coast of the state, the extension of frequencies exceeding 1.0% along the immediate Florida Gulf Coast and parts of the Panhandle is visible only in the higher density C-HPD dataset.

Patterns of moderate to heavy precipitation [2.54 mm (0.10 in.) or more] also are similar between the C-HPD and USCRN networks (Fig. 9). These amounts generally occur 1.5% or more of the time from the Gulf Coast to the Ohio Valley and coastal areas of the Northeast in the winter. Much of the west coast from central California to the Pacific Northwest also has higher frequencies of moderate to heavy precipitation in both networks. The higher frequency of occurrence extends northward to the east of the Rocky Mountains in spring, and reaches as far as the Canadian border during the summer.

Fig. 9. Frequency of occurrence of moderate to heavy precipitation [hourly totals of 2.54 mm (0.10 in.) or more] for stations in the USCRN/USRCRN networks (squares) and the C-HPD network (circles) during the C-HPD digital era for (a) winter, (b) spring, (c) summer, and (d) fall.

Frequencies of light to heavy precipitation [1.0 mm (0.04 in.) or more; not shown] are equally similar between the networks. The highest frequencies occur in the winter season; more than 5% of the time along parts of the Gulf Coast and more than 8% of the time along much of the west coast. An exception is precipitation in Puerto Rico, where the highest frequencies occur in the summer and fall.
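The seasonal frequencies discussed above reduce to a simple exceedance calculation over valid hours, sketched below for a single station series. The thresholds correspond to the heavy, moderate-to-heavy, and light-to-heavy categories; the function and its interface are illustrative assumptions (the input is taken to be a pandas Series of hourly totals indexed by timestamp).

```python
def exceedance_frequency(hourly, threshold_mm, season_months):
    """Percentage of valid hours in the given months with totals at or above the threshold."""
    in_season = hourly.index.month.isin(season_months)   # hourly is a pandas Series
    valid = hourly[in_season].dropna()
    return 100.0 * (valid >= threshold_mm).mean()

# e.g., winter heavy-precipitation frequency: exceedance_frequency(hourly, 7.62, (12, 1, 2))
```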

b. Dry days analysis

We performed two types of comparative analysis on the daily time scale, one for dry periods and the other for wet periods. The first sought to determine the extent to which noise or malfunctions in the F&P gauges were being properly identified and excluded from calculations of hourly precipitation, rather than falsely reported as precipitation in the C-HPD dataset. This analysis relied on identification of dry periods using observations from the COOP network that are part of NCEI’s GHCNd dataset. We selected a subset of C-HPD stations collocated with a COOP station and within 10 km of a USCRN station. The locations of these stations are shown in Fig. 10. Using observations from the COOP network, we identified periods when no precipitation fell within 5-day windows from 2005 through 2017, and performed the analysis on the day centered within the 5-day dry windows for 20 stations. The use of the center value of a 5-day window ensured that time of observation differences would not affect the analysis. There were a total of 192 136 station-hours on these dry days in which C-HPD and USCRN data also were available.
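The selection of dry center days can be sketched as follows, assuming a complete daily pandas Series of COOP (GHCNd) totals in millimeters; handling of missing daily reports is omitted for brevity.

```python
def dry_center_days(coop_daily, window=5):
    """Days whose surrounding 5-day COOP window recorded zero precipitation."""
    window_sum = coop_daily.rolling(window, center=True, min_periods=window).sum()
    return coop_daily.index[(window_sum == 0.0).to_numpy()]
```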

Fig. 10. Locations of the 20 paired C-HPD/COOP and USCRN stations used in the dry and wet days analyses. The C-HPD F&P gauges are collocated with the 8-in. COOP gauge. The distance to the USCRN station is indicated by the color of the circle.

Of these dry hours, there were 1872 hourly observations (0.97%) of 0.25 mm (0.01 in.) or more of precipitation in the C-HPD dataset and 578 hourly observations (0.30%) of 0.2 mm or more in the USCRN dataset (Table 2). Of the 1872 h of reported precipitation in C-HPD on “dry days,” more than 90% were 1 mm (0.04 in.) or less, and most of the others were hourly reports of less than 5 mm (0.20 in.). During 253 of these station-hours, both C-HPD and USCRN reported precipitation, suggesting that the error may have been a false zero within the COOP record in the GHCNd dataset. Since the COOP network relies on volunteer observers to take observations from an 8-in. rain gauge once each day, light precipitation is more likely than heavier precipitation amounts to go unreported in the COOP summary of the day precipitation total.

Table 2. Number of hours (percentage) in which precipitation was not measured (HPD = 0, CRN = 0) and the number of hours (percentage) when precipitation was measured (HPD > 0, CRN > 0) in the HPD and CRN networks on days in which no precipitation was recorded at a nearby COOP (GHCNd) station.

It is not surprising that a small number of false reports of light precipitation are present in the C-HPD dataset, given noise in the F&P gauges that was not completely filtered out; the relatively small number of such cases (less than 1%) indicates that the filtering algorithms perform well. As noted previously, a wetness sensor is used in the USCRN network to reduce or eliminate false precipitation reports associated with sensor noise, and the benefit of that sensor is evident in the lower frequency of reported precipitation events during dry periods.

A diurnal signal was found in the timing of false reports of very light precipitation in the C-HPD data, peaking near noon during daylight hours (Fig. 11). This is consistent with an increase in gauge depth oscillations that can occur in association with diurnal heating of the F&P gauge, especially when solar radiation is strong. The lack of a strong diurnal signal for USCRN precipitation over dry days indicates the value of a wetness sensor when distinguishing noise from precipitation.

Fig. 11. Diurnal distribution of the 1872 hours with precipitation in the C-HPD (blue) and the 578 hours with precipitation in the USCRN (red) on days otherwise identified as dry using COOP/GHCNd data.

c. Comparison on wet days

To evaluate the accuracy of the C-HPD data during periods of precipitation, we used COOP data from GHCNd to identify precipitation events that had a total precipitation of at least 25.4 mm (1.0 in.) over one or more consecutive days with two or more consecutive dry days on either side. There were 742 unique station-precipitation events from the same 20 sites used in the dry-days analysis. Precipitation amounts for each event were compared between the GHCNd, C-HPD, and USCRN stations.
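A sketch of this event definition is given below: runs of consecutive wet days in the COOP record totaling at least 25.4 mm, bounded by at least two dry days on each side. It assumes a complete daily pandas Series; missing days are treated as dry for simplicity, which the operational analysis may handle differently.

```python
def precipitation_events(coop_daily, min_total_mm=25.4, dry_buffer=2):
    """Return (start, end, total) tuples for qualifying COOP precipitation events."""
    wet = (coop_daily > 0).astype(int)        # NaN compares False, so missing days count as dry here
    runs, start = [], None
    for i, w in enumerate(wet):
        if w and start is None:
            start = i                         # a wet run begins
        elif not w and start is not None:
            runs.append((start, i - 1))       # a wet run ends
            start = None
    if start is not None:
        runs.append((start, len(wet) - 1))
    events = []
    for a, b in runs:
        before = wet.iloc[max(0, a - dry_buffer):a]
        after = wet.iloc[b + 1:b + 1 + dry_buffer]
        bounded = (len(before) == dry_buffer and before.sum() == 0 and
                   len(after) == dry_buffer and after.sum() == 0)
        total = coop_daily.iloc[a:b + 1].sum()
        if bounded and total >= min_total_mm:
            events.append((coop_daily.index[a], coop_daily.index[b], total))
    return events
```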

Good agreement was found between GHCNd and the C-HPD network stations and between GHCNd and the USCRN network stations. The R2 value between the C-HPD event totals and the totals for the same events as measured by the COOP stations in GHCNd is 0.82 (Fig. 12). The R2 value for the USCRN and GHCNd event totals was lower (0.68) but still showed good agreement. The higher R2 value for the C-HPD data is likely because the C-HPD stations were collocated with the COOP stations, while the USCRN stations could be located farther (up to 10 km) from the COOP site.

Fig. 12. Scatterplot showing the relationship between precipitation event totals [25.4 mm (1.0 in.) or more] for GHCNd and C-HPD (blue) and GHCNd and USCRN (red).

6. Summary

The U.S. Cooperative Observer Program’s network of F&P precipitation gauges provides relatively high-density coverage of hourly precipitation. With a period of record beginning in the middle of the twentieth century, it is the longest running network of subdaily precipitation measurements in the country. The upgrade from punched paper to digital recording that the National Weather Service began in 2005 and completed in 2013 has been combined with a modernized processing and quality control system at NOAA’s National Centers for Environmental Information. This new system provides rapid and repeatable processing and delivery of hourly precipitation observations to the user community via the COOP Hourly Precipitation Data version 2 dataset.

The design of a fully automated system of data ingest, gauge filtering, hourly quality control, and data output provides the user community with timely access to high-quality hourly observations at more than 1900 stations. Although the instrumentation is subject to variations in gauge depth unrelated to precipitation, filtering algorithms preserve the precipitation signal while identifying and removing anomalous gauge depth fluctuations due to factors such as diurnal heating of the instrument, evaporation from the bucket, anomalous oscillations around zero, and extreme fluctuations that can result from instrument maintenance issues or unusual environmental factors such as nearby railroad traffic. Subsequent processing using quality control algorithms that operate on hourly and daily time scales identifies and flags hourly values that exceed expected climatological thresholds.

Validation of the C-HPD data included comparisons of the quality controlled hourly observations with data from the U.S. Climate Reference Network as well as comparisons of daily totals against neighboring COOP 8-in. gauge measurements of daily precipitation from the GHCNd dataset. The distribution in the frequency of light, moderate, and heavy hourly precipitation amounts in the C-HPD matched that of the USCRN network in all seasons, with the benefit of the higher-density C-HPD network especially evident in areas with strong gradients in precipitation amounts such as coastal locations in Florida in summer.

An analysis performed on days determined to be dry based on neighboring COOP 8-in. gauge measurements as present in the GHCNd dataset showed that the C-HPD dataset rarely reported precipitation on these dry days. On occasions when precipitation was reported, it tended toward mostly very light precipitation events. There were also a small number of cases when USCRN reported precipitation on these dry days, suggesting some of the network differences may be due in part to inaccuracies in the COOP record, but more often than not attributable to the difficulty of identifying small anomalous fluctuations (0.01 in.) in the F&P gauge depths that are unrelated to precipitation.

A comparison performed on wet days showed good agreement between the COOP and the C-HPD network stations and between the COOP and USCRN network stations. The R2 value of 0.82 between the C-HPD and COOP event totals provided an added level of verification that the C-HPD stations are operating well during the digital era.

The improvements in processing implemented in the C-HPD version 2 dataset enable NCEI to provide a dataset of higher quality with rapid updates and better accessibility to the user community. These data are available at https://data.nodc.noaa.gov/cgi-bin/iso?id=gov.noaa.ncdc:C00988. The raw 15-min gauge depth data are available from NCEI upon request. Future work includes official release of the associated 15-min precipitation data in a dataset that similarly combines the legacy era (DSI-3260) with data in the digital era.

Acknowledgments

The authors wish to thank the many National Weather Service staff and volunteer observers for their valuable contributions to the nation’s weather and climate observations through the U.S. Cooperative Observer Program. We would also like to acknowledge Deb Misch for her graphics support and the anonymous reviewers for their valuable comments and suggestions. S.S. and R.D.L. are supported by NOAA’s National Centers for Environmental Information (NCEI) through the Cooperative Institute for Satellite Earth System Studies (CISESS) under Cooperative Agreement NA19NES4320002. B.K. is supported by NCEI under Contract ST-1330-17-CQ-0058.

REFERENCES

  • Clark, R. A., J. J. Gourley, Z. L. Flamig, Y. Hong, and E. Clark, 2014: CONUS-wide evaluation of National Weather Service flash flood guidance products. Wea. Forecasting, 29, 377–392, https://doi.org/10.1175/WAF-D-12-00124.1.

  • Diamond, H. J., and Coauthors, 2013: U.S. Climate Reference Network after one decade of operations. Bull. Amer. Meteor. Soc., 94, 485–498, https://doi.org/10.1175/BAMS-D-12-00170.1.

  • Durre, I., M. J. Menne, and R. S. Vose, 2008: Strategies for evaluating quality-control procedures. J. Appl. Meteor. Climatol., 47, 1785–1791, https://doi.org/10.1175/2007JAMC1706.1.

  • Durre, I., M. J. Menne, B. E. Gleason, T. G. Houston, and R. S. Vose, 2010: Comprehensive automated quality assurance of daily surface observations. J. Appl. Meteor. Climatol., 49, 1615–1633, https://doi.org/10.1175/2010JAMC2375.1.

  • Hammer, G. R., and T. Reek, 1997: The processing of recording rain gage data at the National Climatic Data Center. Proc. 13th Conf. on Hydrology, Long Beach, CA, Amer. Meteor. Soc., 223–226.

  • Hayhoe, K., and Coauthors, 2018: Our changing climate. Impacts, Risks, and Adaptation in the United States: Fourth National Climate Assessment, D. R. Reidmiller et al., Eds., Vol. II, U.S. Global Change Research Program, 72–144, https://doi.org/10.7930/NCA4.2018.CH2.

  • Leeper, R. D., M. A. Palecki, and E. Davis, 2015: Methods to calculate precipitation from weighing-bucket gauges with redundant depth measurements. J. Atmos. Oceanic Technol., 32, 1179–1190, https://doi.org/10.1175/JTECH-D-14-00185.1.

  • Locatelli, J. D., and P. V. Hobbs, 1995: A world record rainfall rate at Holt, Missouri: Was it due to cold frontogenesis aloft? Wea. Forecasting, 10, 779–785, https://doi.org/10.1175/1520-0434(1995)010<0779:AWRRRA>2.0.CO;2.

  • Lott, G. A., 1954: The world-record 42-minute Holt, Missouri, rainstorm. Mon. Wea. Rev., 82, 50–59, https://doi.org/10.1175/1520-0493(1954)082<0050:TWMHMR>2.0.CO;2.

  • Menne, M. J., I. Durre, R. S. Vose, B. E. Gleason, and T. G. Houston, 2012: An overview of the Global Historical Climatology Network-Daily database. J. Atmos. Oceanic Technol., 29, 897–910, https://doi.org/10.1175/JTECH-D-11-00103.1.

  • National Research Council, 2012: The National Weather Service Modernization and Associated Restructuring: A Retrospective Assessment. National Academies Press, 120 pp., https://doi.org/10.17226/13216.

  • NCDC, 2003: Data documentation for data set 3240 (DSI-3240). National Climatic Data Center, 10 pp., ftp://ftp.ncdc.noaa.gov/pub/data/hourly_precip-3240/dsi3240.pdf.

  • NCEI, 2019: Climate at a glance: National time series. NOAA/NCEI, accessed 2 August 2019, https://www.ncdc.noaa.gov/cag/national/time-series/110/pcp/12/12/1901-2018.

  • NCHRP, 2012: BMP sizing and design. Guidelines for evaluating and selecting modifications to existing roadway drainage infrastructure to improve water quality in ultra-urban areas, NCHRP Rep. 728, National Academies Press, 83–93, https://doi.org/10.17226/22031.

  • Phillips, C. S., 1985: An objective method for minimizing non-precipitation effects in precipitation data from punched paper tape. Proc. Int. Conf. on Interactive Information and Processing Systems for Meteorology, Oceanography and Hydrology, Los Angeles, CA, Amer. Meteor. Soc., 178–182.

  • Raff, D., and Coauthors, 2013: Short-term water management decisions: User needs for improved climate, weather, and hydrologic information. Tech. Rep. CWTS 2013-1, 261 pp.

  • Shein, K., 2008: Interactive quality assurance practices. 24th Conf. on IIPS, New Orleans, LA, Amer. Meteor. Soc., 6A.9, http://ams.confex.com/ams/88Annual/techprogram/paper_131217.htm.

  • Trenberth, K. E., A. Dai, R. M. Rasmussen, and D. B. Parsons, 2003: The changing character of precipitation. Bull. Amer. Meteor. Soc., 84, 1205–1218, https://doi.org/10.1175/BAMS-84-9-1205.

  • U.S. Department of Commerce, 2005: Cooperative Observer Program product improvement implementation plan [Addendum I] for Fischer & Porter sensor upgrade. NOAA/NWS, 29 pp.

  • Wilson, A., S. Hinson, D. Manns, R. Ray, and J. H. Lawrimore, 2010: Hourly precipitation data processing changes at NCDC. 15th Symp. on Meteorological Observation and Instrumentation, Atlanta, GA, Amer. Meteor. Soc., 8.3, https://ams.confex.com/ams/90annual/techprogram/paper_159693.htm.

  • Shein, K., 2008: Interactive quality assurance practices. 24th Conf. on IIPS, New Orleans, LA, Amer. Meteor. Soc., 6A.9, http://ams.confex.com/ams/88Annual/techprogram/paper_131217.htm.

    • Search Google Scholar
    • Export Citation
  • Trenberth, K. E., A. Dai, R. M. Rasmussen, and D. B. Parsons, 2003: The changing character of precipitation. Bull. Amer. Meteor. Soc., 84, 12051218, https://doi.org/10.1175/BAMS-84-9-1205.

    • Crossref
    • Search Google Scholar
    • Export Citation
  • U.S. Department of Commerce, 2005: Cooperative Observer Program product improvement implementation plan [Addendum I] for Fischer & Porter sensor upgrade. NOAA/NWS, 29 pp.

  • Wilson, A., S. Hinson, D. Manns, R. Ray, and J. H. Lawrimore, 2010: Hourly precipitation data processing changes at NCDC. 15th Symp. on Meteorological Observation and Instrumentation, Atlanta, GA, Amer. Meteor. Soc., 8.3, https://ams.confex.com/ams/90annual/techprogram/paper_159693.htm.

    • Search Google Scholar
    • Export Citation
  • Fig. 1.

    The location of each F&P station operating in 2019; color indicates the length of the period of record (years).

  • Fig. 2.

    The number of operating stations in the C-HPD network from 1940 through December 2019: total number of stations (red), legacy (punched paper recording) stations (green), and stations converted to digital recording (blue).

  • Fig. 3.

    Data flow diagram of the C-HPD ingest and quality control process. This consists of four major parts: data ingest and integrity checks, 15-min gauge depth filtering, conversion to hourly period of record data, and quality control of the hourly precipitation totals.

  • Fig. 4.

    Example of (a) diurnal variations in gauge depth values for station Canyon Dam, Texas (COOP ID 411429), during April–May 2013. Large fluctuations began suddenly on 20 Apr and ended on 3 May. Small diurnal fluctuations are visible before and after this period; these are identified, and the incremental gauge depth changes are set to zero except during periods of precipitation. Routine gauge emptying occurred on 21 May. (b) The measured accumulation of precipitation during the period of functioning instrumentation is also shown.

  • Fig. 5.

    Example of hourly values of 25.4 mm (1.0 in.) flagged as invalid by the streak check for USC00359581, Yaquina Bay, Oregon, on 11–13 Oct 1980. An extreme value [50.8 mm (2.0 in.)] previously flagged and removed by the legacy extreme check is also shown.

  • Fig. 6.

    Example of a single hourly value flagged as invalid by the gap check (red) for USC00418544, Spring Branch, Texas, on 29 Sep 2013. Note that the value does not exceed the climatological outlier threshold of 7 times the 95th percentile (section 4e); an illustrative sketch of this threshold check follows the figure list.

  • Fig. 7.

    Example of a single hourly value flagged as invalid by the climatological outlier check (red) for USC00446955, Pulaski, Virginia, on 7 Apr 2007.

  • Fig. 8.

    Frequency of occurrence of heavy precipitation [hourly totals of 7.62 mm (0.30 in.)] for stations in the USCRN/USRCRN networks (squares) and the C-HPD network (circles) during the C-HPD digital era for (a) winter, (b) spring, (c) summer, and (d) fall.

  • Fig. 9.

    Frequency of occurrence of moderate to heavy precipitation [hourly totals of 2.54 mm (0.10 in.) or more] for stations in the USCRN/USRCRN networks (squares) and the C-HPD network (circles) during the C-HPD digital era for (a) winter, (b) spring, (c) summer, and (d) fall.

  • Fig. 10.

    Locations of the 20 paired C-HPD/COOP and USCRN stations used in the dry and wet days analyses. The C-HPD F&P gauges are collocated with the 8-in. COOP gauge. The distance to the USCRN station is indicated by the color of the circle.

  • Fig. 11.

    Diurnal distribution of the 1872 C-HPD hours with precipitation (blue) and the 578 USCRN hours with precipitation (red) on days otherwise identified as dry in the COOP/GHCNd data.

  • Fig. 12.

    Scatterplot showing the relationship between precipitation event totals [25.4 mm (1.0 in.) or more] for GHCNd and C-HPD (blue) and GHCNd and USCRN (red).

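As an illustrative aside to the climatological outlier threshold referenced in the captions for Figs. 6 and 7 (an hourly total is flagged when it exceeds 7 times the station's climatological 95th percentile), the following minimal Python sketch shows how such a threshold check might be expressed. The function name, the use of nonzero hourly totals as the climatological basis, and the absence of any seasonal stratification or minimum-sample requirements are assumptions made for illustration only, not the operational NCEI implementation.

    import numpy as np

    def flag_climatological_outliers(hourly_totals_mm, multiplier=7.0, percentile=95.0):
        """Flag hourly precipitation totals exceeding `multiplier` times the
        climatological `percentile` of nonzero hourly totals at a station.

        Illustrative sketch only; the operational C-HPD check may stratify the
        climatology by month or season and apply minimum-data requirements.
        """
        values = np.asarray(hourly_totals_mm, dtype=float)
        wet_hours = values[values > 0.0]          # climatology built from hours with precipitation
        if wet_hours.size == 0:
            return np.zeros(values.shape, dtype=bool)
        threshold = multiplier * np.percentile(wet_hours, percentile)
        return values > threshold                  # True where an hourly total is flagged

    # Example usage (hypothetical values): a 63.5 mm hour stands out against a record of light totals.
    # flags = flag_climatological_outliers([0.0, 1.3, 2.5, 0.0, 63.5])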