1. Introduction
For more than 100 yr, the Cooperative Observer Program (COOP; U.S. Department of Commerce 2003) of the National Oceanic and Atmospheric Administration (NOAA) has monitored the nation’s climate. This network relies primarily on human volunteers to manually record daily air temperature and precipitation observations. Applications of COOP data have grown exponentially, such that today’s users require the data with ever-decreasing latency (U.S. Department of Commerce 2004). A National Research Council report (1998), however, warned that “despite its increasing importance to the nation, over the past several years the COOP network has been weakened by a combination of technological, organizational, and budgetary factors.” In addition, “users of the network’s observations are deeply concerned that little attention has been paid to this important source of data … and that network capability has deteriorated” (National Research Council 1998).
In 2004, the National Weather Service (NWS)—today’s overseer of the COOP equipment and observers—took initial steps to modernize the COOP network by upgrading a limited number of sites to automated weather stations (Fiebrich et al. 2005). Although increased temporal resolution and real-time availability of automated weather observations clearly provide significant new benefits to a wide spectrum of users, many scientists wonder how this transition to automated observations will affect the quality of the climate record (Hubbard et al. 2004; Wu et al. 2005; Holder et al. 2006).
The state of Oklahoma is fortunate to have collected almost 15 yr of automated observations from the Oklahoma Mesonet (Mesonet; McPherson et al. 2007) that overlap with COOP observations. To assess the effect of automation on the temperature record, a sample of Oklahoma COOP data was compared with nearly collocated data from the Mesonet. This manuscript documents the characteristics of and differences between these two temperature datasets. The greatest discrepancies were found to be caused by COOP observer errors—errors that could be eliminated if automated stations were used. Therefore, this paper illustrates how a transition to automated COOP stations would improve the quality of the nation’s temperature climate record. Although the historical climate archives of our nation are invaluable, the thrust of this paper is forward looking: the intent is not to adjust the climate archives but to begin a new era of dramatically improved quality in the temperature data. The focus is on daily observations from the U.S. Historical Climatology Network (HCN) subset of COOP observers. Photographic depictions of the sites are provided in the supplement available at the Journals Online Web site: http://dx.doi.org/10.1175/2009JTECHA1241.s1.
2. The COOP
Many scientists consider the COOP network to be the most authoritative source of climate information for U.S. temperature and precipitation (Wu et al. 2005). Throughout COOP’s history, an estimated 25 000 stations have participated in the network (Reek et al. 1992). Currently, the number of COOP observers who measure air temperature across the United States totals between 5000 and 6000 (Guttman and Quayle 1990; Quayle et al. 1991; Reek et al. 1992; National Research Council 1998). Although COOP stations are established, supervised, maintained, and managed by the NWS, COOP data are processed, quality controlled, and archived by the National Climatic Data Center (NCDC) of NOAA’s National Environmental Satellite, Data, and Information Service.
a. COOP temperature sensors
Currently across the COOP, the most common sensor for measuring temperature is the maximum–minimum temperature sensor (MMTS; Quayle et al. 1991). The MMTS consists of an electronic thermistor housed inside a small unaspirated shield (Fig. 1). Its radiation shield is approximately 25 cm in height and 20 cm in diameter. Because the shield is unaspirated, field inaccuracy of temperature measurements can be as high as 1.0°C when light winds and strong radiation occur (Hubbard et al. 2004). The MMTS sensor is a Dale/Vishay 1140 thermistor (Vishay Intertechnology, Inc.) with a nominal resistance of 20 000 ohm at 25°C (Hubbard et al. 2004). A small number of observers use maximum–minimum liquid-in-glass thermometers mounted in Cotton Region Shelters (Fig. 2; Chenoweth 1993; Wendland and Armstrong 1993).
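As background on how such a thermistor reading becomes a temperature, the sketch below applies the standard beta-parameter model for an NTC thermistor, using the 20 000-ohm-at-25°C nominal resistance cited above. The beta coefficient (and the choice of this simple model at all) is an illustrative assumption, not the documented MMTS calibration.

```python
import math

def thermistor_temp_c(resistance_ohm, r0_ohm=20000.0, t0_k=298.15, beta_k=3500.0):
    """Beta-parameter model for an NTC thermistor.

    r0_ohm is the nominal resistance at t0_k (25 deg C), matching the
    20,000-ohm figure cited for the MMTS thermistor; beta_k is a
    hypothetical coefficient chosen only for illustration.
    """
    t_kelvin = 1.0 / (1.0 / t0_k + math.log(resistance_ohm / r0_ohm) / beta_k)
    return t_kelvin - 273.15

# At the nominal resistance the model returns 25 deg C exactly.
print(round(thermistor_temp_c(20000.0), 2))  # 25.0
```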
b. COOP data quality
The fact that COOP observations are recorded by thousands of volunteers across the United States makes it difficult to fully categorize potential errors that may affect data quality. Although COOP observers receive some training, they are not required to take any certification examination. Thus, it is difficult to assess objectively the quality of the observers. Today, management of the COOP is shared by NWS headquarters, six regional headquarters, and 122 forecast offices (U.S. Department of Commerce 2003). Unfortunately, standardization does not exist in the quality checks made by the many federal offices (Guttman 2005; Del Greco et al. 2006).
At NCDC, COOP data for the study period were processed through the Validation of Historical Daily Data (ValHiDD; Reek et al. 1992) and temperature validation (TempVal; Hubbard et al. 2007) programs. ValHiDD builds its quality checks on knowledge of how observers made and recorded observations, as well as of errors introduced by card-punching techniques and storage problems. TempVal compares COOP observations with estimated values based on gridded data from first-order stations of the NWS. TempVal is fully automated and produces replacement values for COOP observations that differ significantly from station data interpolated to a 0.5° grid (Angel et al. 2005). In the latest version of NCDC’s quality checks, all interactive features [i.e., those that allow for manual quality assurance (QA)] have been removed to increase objectivity (Angel et al. 2005).
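The interpolation scheme and tolerances that TempVal applies operationally are not detailed here; the following is a minimal sketch of the general approach, assuming inverse-distance weighting from nearby first-order stations and a hypothetical 5°C tolerance.

```python
import numpy as np

def idw_estimate(lat, lon, stn_lats, stn_lons, stn_temps, power=2.0):
    """Inverse-distance-weighted temperature estimate at (lat, lon) from
    surrounding first-order stations; a flat-earth degree distance is
    used purely for illustration."""
    d = np.hypot(np.asarray(stn_lats) - lat, np.asarray(stn_lons) - lon)
    d = np.maximum(d, 1e-6)            # guard against zero distance
    w = d ** -power
    return float(np.sum(w * np.asarray(stn_temps)) / np.sum(w))

def tempval_like_flag(coop_value_c, estimate_c, tolerance_c=5.0):
    """Flag a COOP daily value that departs too far from the gridded
    estimate; the tolerance is an assumption, not TempVal's actual rule."""
    return abs(coop_value_c - estimate_c) > tolerance_c
```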
c. The U.S. HCN
The U.S. HCN is a subset of 1219 long-term COOP stations in the contiguous United States. HCN sites were selected based on an extended length of record, a small percentage of missing data, a limited number of station moves, and an overall contribution to complete spatial coverage across the United States (Karl et al. 1990; Davey and Pielke 2005). Whether each station met accepted siting standards was not considered. Nevertheless, many scientists incorrectly assume that if a COOP site is classified as part of the HCN, then its data are of exceptionally high quality.
d. COOP/HCN data used in this study
Oklahoma COOP data from nine objectively chosen HCN stations for the 2003–05 period were used in this research. These data include raw observations from the NCDC TD3200 dataset obtained from the Oklahoma Climatological Survey (OCS).
3. The Mesonet
The Oklahoma Mesonet was designed in the early 1990s to be a multipurpose, statewide, mesoscale, real-time environmental monitoring network (Brock et al. 1995; McPherson et al. 2007). The founders of the Mesonet sought to build a network to overcome four issues in the Oklahoma COOP that made its data difficult to use: 1) data availability, 2) data timeliness, 3) variables measured (i.e., only daily temperature and precipitation were available), and 4) data quality. In 1994, the Mesonet consisted of 110 commissioned stations. By 2008, the number of sites had increased to 120.
Oklahoma Mesonet sites report air temperature with a 5-min time resolution (McPherson et al. 2007). Campbell Scientific dataloggers record the observations on site. The observations are transmitted to the OCS every 5 min via very high frequency (VHF) radio and the Oklahoma Law Enforcement Telecommunications System.
a. Mesonet temperature sensors
Prior to 2004, air temperature measurements at the standard 1.5-m height were made with the combination thermistor–sorption HMP35C probe (Brock et al. 1995). The HMP35C probe had a range of −30° to 50°C with a specified accuracy of ±0.35°C; it used a Fenwal Electronics UUT51J1 thermistor added by Campbell Scientific, Inc. On 1 January 2004, the Mesonet transitioned to a bare-bead thermistor assembly, manufactured by Thermometrics, for its 1.5-m air temperature measurements. This sensor is composed of the Thermometrics Unitherm Interchangeable Thermistor DC95 mounted in a stainless-steel housing so that the thermistor is coupled directly with the atmosphere. The Thermometrics sensor responds faster to changing temperatures (e.g., following the passage of strong cold fronts and thunderstorm outflows) than did the HMP35C, and it has an operating range of −30° to 50°C with a specified accuracy of ±0.4°C (McPherson et al. 2007). Because it was economically feasible, the Mesonet continues to support both temperature sensors at each site; more than 7 yr of overlapping data are available from the two sensors at each site in the network.
During the study period, the Mesonet’s air temperature sensor was housed in an R. M. Young multiplate radiation shield (Fig. 3) to protect the sensor from solar radiation. Similar to the unaspirated shield of the MMTS sensor (used in the COOP), field inaccuracy of temperature measurements from Mesonet stations can be as high as 1.0°C when light winds and strong radiation occur (Hubbard et al. 2004).
b. Quality assurance
The Mesonet QA system (Shafer et al. 2000; Fiebrich and Crawford 2001; Martinez et al. 2004; Hall et al. 2008) is built around five main components: 1) laboratory calibration, 2) automated quality checks, 3) manual inspection by specially trained meteorologists, 4) in-field comparisons, and 5) routine maintenance. During laboratory calibration, each sensor is checked or calibrated before being deployed to a station. Mesonet air temperature sensors are calibrated using a commercial temperature chamber and two National Institute of Standards and Technology (NIST)-certified Hart Scientific reference temperature probes. The automated quality checks identify erroneous data in real time, in addition to providing reports for review by QA meteorologists. During manual inspection, additional data are flagged as the true start times of problems are determined or as more subtle problems are detected in the data. At least once per year, in-field comparisons are conducted at each site to compare the station’s temperature sensor with a portable system housing a reference Rotronic PT100 resistance temperature detector (RTD; Fiebrich et al. 2006). Finally, during routine maintenance, each station is visited at least three times per year to clean and inspect sensors, maintain the vegetation, and rotate sensors (as needed).
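As an illustration of component 2, the sketch below implements generic range, step, and persistence tests on a 5-min temperature series. The thresholds are assumptions chosen for illustration; the Mesonet’s operational tests are documented in Shafer et al. (2000).

```python
import numpy as np

def qa_flags(temps_c, interval_min=5, valid_range=(-30.0, 50.0),
             max_step_c=10.0, persist_hours=3.0):
    """Generic range, step, and persistence tests for a 5-min series.
    All thresholds are illustrative assumptions, not operational values."""
    t = np.asarray(temps_c, dtype=float)
    flags = np.zeros(t.size, dtype=bool)

    # Range test: values outside the sensor's plausible operating limits.
    flags |= (t < valid_range[0]) | (t > valid_range[1])

    # Step test: an implausible jump between consecutive 5-min samples.
    flags[1:] |= np.abs(np.diff(t)) > max_step_c

    # Persistence test: a flatlined sensor repeating one value too long.
    n = int(persist_hours * 60 / interval_min)
    for i in range(t.size - n + 1):
        if np.all(t[i:i + n] == t[i]):
            flags[i:i + n] = True
    return flags
```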
c. Mesonet data used in this study
Oklahoma Mesonet data from the period 2003–05 were used in this research. The study data are the quality-assured observations obtained from the OCS.
4. Comparison of COOP data with automated data from Oklahoma
The focus of this study was to assess the accuracy of the daily temperature measurements recorded by COOP observers. To begin the study, automated and COOP data from Stillwater, Oklahoma, were compared. The Stillwater COOP site was established in 1893. It is also part of the HCN. Stillwater is unique in that an automated Mesonet site and an automated Climate Reference Network (CRN; Gallo 2005) site are located within 500 m of each other. The Stillwater CRN was established in 2002 and was located 325 m from the COOP/HCN site and 500 m from the Mesonet site (Fig. 4).
A comparison of mean daily temperatures for 2003 from the Stillwater CRN station and those from the COOP/HCN indicated that only 65.8% of the days were within 1.0°C (Fig. 5, top).1,2 In fact, the difference in the mean daily temperature approached 5°C on some days. In contrast, the mean daily temperatures observed by the Mesonet station were within 1°C of those from the CRN station on more than 99% of the days during 2003.3 Because Mesonet data so closely emulated the “climate reference” data, the authors concluded that the Mesonet could serve as a reference dataset for investigating what causes COOP observations to differ from more accurate, collocated measurements.
a. Methodology
Because the HCN subset of the COOP network is the one most commonly used in climate studies, this work focused its comparisons on data obtained from collocated HCN and Mesonet stations. For each climate division in Oklahoma, the pair of nearest HCN and Mesonet stations was determined (Fig. 6). Site photos for each pair of locations are available (refer to the supplemental information). In each of the nine climate divisions, the distance between the two network locations was less than or equal to 8.0 km (Table 1). The first step in comparing data from the paired stations was to generate Mesonet daily maximum and minimum values based on the same 24-h observation window as the associated HCN site. Days with missing or flagged observations (i.e., via either NCDC or Mesonet quality control) in either dataset were eliminated. Next, the values for each day between 1 January 2003 and 31 December 2005 were compared; Table 2 summarizes the differences. For detailed analysis, each observation pair that differed by more than 5°C was inspected to determine the cause of the discrepancy (roughly 1.0% of the total data, or 165 cases).
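A minimal sketch of this pairing procedure, assuming the Mesonet record is a pandas Series of 5-min temperatures indexed by local time and that COOP daily values sit in a date-indexed table with a hypothetical coop_tmax column, might look like the following.

```python
import pandas as pd

def daily_extremes(meso_5min, obs_hour=7):
    """Daily max/min from 5-min data over the 24-h window ending at the
    COOP observation hour; the window ending at, e.g., 0700 on day d is
    labeled day d."""
    shifted = meso_5min.copy()
    shifted.index = shifted.index + pd.Timedelta(hours=24 - obs_hour)
    return shifted.resample("1D").agg(["max", "min"]).dropna()

def count_discrepancies(coop_daily, meso_daily, thresholds=(1, 2, 3, 4, 5)):
    """Join paired daily maxima, drop days missing or flagged in either
    network, and count differences exceeding each threshold (cf. Table 2)."""
    merged = coop_daily.join(meso_daily, how="inner").dropna()
    diff = (merged["coop_tmax"] - merged["max"]).abs()
    return {f">{k} C": int((diff > k).sum()) for k in thresholds}
```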
b. Results
Time series plots of Mesonet data were produced for each day with a disparity greater than 5°C (i.e., those events listed in the “>5°C” column of Table 2). The authors used the high temporal resolution data to detect and characterize any systematic problems in the HCN observations. On the basis of this careful analysis, the data errors and observer behavior primarily were characterized as 1) incorrect dates archived, 2) routine date shifting, 3) incorrect resets of the MMTS by the HCN observer, 4) late or early observations, 5) nonstandard observation times, 6) rapidly changing temperatures at observation time, 7) transcription errors, and 8) unknown errors.
1) Incorrect date
The authors discovered that some observers apparently listed observations on the incorrect line of their observation form. For example, the Stillwater observer appeared to have listed the 28 February 2005 temperatures on the line for 27 February 2005. This subtlety is illustrated in Fig. 7. The Mesonet station recorded a maximum temperature of 15.8°C and a minimum of 3.8°C for the 24-h period ending at 0700 LT (all times are local) on 27 February 2005 (inferred from the solid black trace); the HCN observer recorded 8.9° and −2.2°C (indicated by the dashed horizontal lines) for the same period. However, the maximum and minimum temperatures calculated from Mesonet data for the 24-h period on the following day exactly matched the HCN observations for the first period (Fig. 7, shaded region). This error caused a 6.9°C cool bias in the maximum temperature and a 6.0°C cool bias in the minimum temperature recorded for 27 February. This practice occurred in 16 of the 18 cases with large observational errors (>5°C) in Stillwater, and in all 16 cases the day of the week was a Saturday, Sunday, or Monday. Also noteworthy is that the Stillwater observer frequently reported recurring low temperatures on consecutive weekend days (see Fig. 8). Thus, this problem likely was caused by an observer who was unable to take readings at the prescribed time on weekends.
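This reasoning can be recast as a simple screening rule, sketched below under the assumption that daily values are stored as date-keyed (Tmax, Tmin) pairs; it is an illustration of the logic, not the authors’ published procedure.

```python
from datetime import timedelta

def suspected_wrong_line(coop, meso, day, tol_c=1.0):
    """coop, meso: dicts mapping datetime.date -> (tmax_c, tmin_c).
    Returns True when the COOP entry for `day` fits the *next* day's
    Mesonet extremes within tol_c but misses the nominal day badly,
    as in the Stillwater example (Fig. 7)."""
    def mismatch(a, b):
        return max(abs(a[0] - b[0]), abs(a[1] - b[1]))
    nominal = mismatch(coop[day], meso[day])
    shifted = mismatch(coop[day], meso[day + timedelta(days=1)])
    return shifted <= tol_c < nominal
```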
2) Date shifting
A problem similar to the incorrect date error is the occurrence of “date shifting.” Date shifting occurs when a morning observer shifts the maximum temperature recorded by the MMTS to the previous day’s entry on the COOP form (Rumbaugh 1934; Hubbard et al. 2007). Some observers shift these observations because they assume the maximum temperature occurred on the previous day. As can be seen in Fig. 9, the observer in Altus likely shifted the maximum temperature recorded on 8 March 2003 to the 7 March 2003 line on the COOP form (on 7 March, the Mesonet maximum was 14.9°C versus the “reported” COOP maximum of 20.6°C). This shift caused a warm bias of 5.7°C in the daily maximum temperature on 7 March; the minimum temperature, in contrast, was similar to that measured by the Mesonet station (Fig. 9). Date shifting was found to affect all maximum temperatures for the 3-yr period in Altus. In fact, maximum temperature data from only 40% of the days during the period passed NCDC quality control.
3) Incorrect resets of the MMTS
The MMTS requires the observer to independently reset the maximum and minimum temperatures each day. It is not uncommon for an observer to forget to reset either the maximum or minimum temperature on the module. Such an occurrence is displayed in Fig. 10. The Cherokee observer appeared to have forgotten to reset the maximum temperature on the morning of 9 April 2004. Hence, the maximum temperature stored in the MMTS on the morning of 10 April 2004 covered a 48-h period. As a result, the observer recorded a maximum temperature of 19.4°C on both 9 and 10 April. The maximum was actually 12.8°C for the 24-h period ending on 10 April 2004. Because of this occurrence, the maximum temperature for 10 April was biased 6.6°C warm. Similar examples were discovered when HCN observers forgot to reset the minimum temperature on the MMTS module.
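A corresponding screen for missed resets, again an illustration of the reasoning rather than the authors’ procedure, checks for a repeated COOP maximum that agrees with the Mesonet maximum over the preceding 48 h.

```python
def suspected_missed_reset(coop_max_today_c, coop_max_prev_c,
                           meso_max_48h_c, tol_c=1.0):
    """True when today's COOP maximum repeats yesterday's reading and
    matches the Mesonet maximum over the preceding 48 h, as in the
    Cherokee example (Fig. 10); both tolerances are assumptions."""
    repeated = abs(coop_max_today_c - coop_max_prev_c) <= 0.3
    matches_48h = abs(coop_max_today_c - meso_max_48h_c) <= tol_c
    return repeated and matches_48h
```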
4) Late or early observations
Understandably, it is improbable that an HCN observer can maintain exactly the same observation time each day, yet deviations of even a few hours can introduce a significant bias in daily observations of air temperature. One such example is shown in Fig. 11. The data suggest that the observer in Goodwell reset his/her MMTS several hours late on the morning of 7 January 2005. Hence, the minimum temperature recorded on 8 January 2005 was −3.9°C (lower dashed line) rather than the −9.1°C measured at the nearby Mesonet station (solid black line). This error, likely insignificant to the observer, caused the minimum temperature to be warm biased by 5.2°C.
5) Nonstandard observation times
In some instances, the HCN observer employed a nonstandard observing method. For example, the Ada HCN observer identified him/herself on each COOP form as a 1700 observer, but the minimum temperature observations appeared to be valid for the 24-h period ending at 0800. To make matters worse, the observer appeared to occasionally date shift his/her maximum temperatures. Figure 12 reveals that the minimum temperature recorded by the HCN for 9 February 2003 differed by more than 5.5°C from that observed by the nearby Mesonet site; instead, it matched the Mesonet minimum for the 24-h period ending at 0800 (shaded region of Fig. 12). Forty-six of the 48 large discrepancies detected at the Ada site were attributed to this problem. This observing practice makes it impossible for data users to properly interpret the data, whether on a daily, monthly, or seasonal time scale.
6) Rapidly changing temperatures at observation time
When temperatures changed rapidly at observation time, large discrepancies could occur between the HCN and Mesonet observations. This problem is similar to that of a late observer [section 4b(4)], but in these instances even delays of less than an hour could generate large biases in the observations. At the Antlers site on 1 January 2003, the temperature increased 6°C between 0800 and 0900 LT (Fig. 13). Thus, a large warm bias was introduced by the HCN observer’s delayed reset of the MMTS, even though the delay likely was less than an hour.
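The sensitivity to small delays can be demonstrated directly from 5-min data; the sketch below recomputes the 24-h maximum as the assumed reset time slips later, with arbitrary offsets.

```python
import pandas as pd

def max_vs_reset_delay(meso_5min, nominal_end, delays_min=(0, 15, 30, 60)):
    """24-h maximum ending at nominal_end plus each delay; meso_5min is
    a pandas Series of 5-min temperatures indexed by local time."""
    out = {}
    for delay in delays_min:
        end = nominal_end + pd.Timedelta(minutes=delay)
        start = end - pd.Timedelta(hours=24)
        out[delay] = float(meso_5min.loc[start:end].max())
    return out
```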
7) Transcription errors
For each of the cases analyzed, the handwritten observer form also was reviewed to identify possible transcription errors. In some cases, the HCN observations that disagreed by more than 5°C from the Mesonet observation were written in a way that made the HCN observation questionable. For example, Fig. 14 reveals data for Claremore on 21 August 2004. For the 24-h period ending at 0700 on 21 August, the HCN observer reported a maximum temperature of 27.2°C, whereas the Mesonet data indicated a maximum of 21.9°C. Upon inspecting the COOP observation form (Fig. 15), the authors noticed that someone likely changed the entry from 71°F (21.7°C) to 81°F (27.2°C); the “8” on the line for 21 August looked suspiciously different than the other “8”s on the form. The original observation (i.e., “71”) would have matched the Mesonet data within 0.2°C; the edited value, however, generated a greater than 5°C warm bias in the COOP data. In this case, it is unknown whether the transcription error was created by the observer, by a reviewer, or by the person responsible for data entry at NCDC.
8) Unknown errors
Of the 165 cases of large temperature discrepancies, the authors were unable to determine the cause of 32 discrepancies between the HCN and Mesonet observations. For example, Fig. 16 represents a data plot for 18 February 2003 at the Tahlequah site. Neither the minimum nor maximum temperatures recorded by the HCN observer (dashed horizontal lines) agreed with those depicted in the 5-min Mesonet data (solid black line). In this instance, the HCN maximum appears to be biased 2.3°C cool while the minimum temperature appears 5.1°C too warm. Neither the handwritten COOP form nor data from days prior to or after the observation revealed an explanation. In addition, snow cover was not suspected as a possible cause for the discrepancy because snowfall had not been reported by the HCN observer during the preceding week. Therefore, this case, along with 31 others, was characterized as being caused by unknown errors.
5. Discussion
The seven identified error sources described in sections 4b(1)–(7) summarize the most common problems found in the HCN dataset in Oklahoma. What is surprising is that most of these error sources seem minor in nature, yet each was responsible for causing errors in excess of 5°C in the daily data. Although the daily maximum and minimum values from the automated data were calculated so that they matched the HCN observer’s recorded measurement times (e.g., 0800, 1700, and so on), those recorded measurement times were sometimes imprecise [see sections 4b(4)–(6)]. When a manual observation is made at a time different from that recorded in the station’s metadata, time-of-observation errors are possible. Numerous scientists have documented the effect on daily data when an observer changes his/her observation time (e.g., from a “morning observer” to an “evening observer,” or vice versa; Rumbaugh 1934; Baker 1975; Schaal and Dale 1977; Byrd 1985; Karl et al. 1986; Redmond 1992; DeGaetano 2000; Janis 2002; Wu et al. 2005; Belcher and DeGaetano 2005). Individual daily time-of-observation errors (e.g., an observer who makes observations at 0800 one day and 1000 the next), however, are much more difficult to correct. Lastly, it should be noted that all of these errors [with the possible exception of the unknown errors in section 4b(8)] were likely independent of siting and sensor calibration issues.
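The systematic morning-versus-evening effect documented in these studies can likewise be estimated from 5-min data; the following sketch compares monthly means of (Tmax + Tmin)/2 computed for two observation hours, with the same assumed pandas Series interface as the section 4a sketch.

```python
import pandas as pd

def monthly_mean_for_obs_hour(meso_5min, obs_hour):
    """Monthly mean of (Tmax + Tmin)/2 when the observing day ends at
    obs_hour; meso_5min is a pandas Series of 5-min temperatures."""
    s = meso_5min.copy()
    s.index = s.index + pd.Timedelta(hours=24 - obs_hour)  # relabel windows
    daily = s.resample("1D").agg(["max", "min"]).dropna()
    return ((daily["max"] + daily["min"]) / 2.0).resample("MS").mean()

# A morning vs. evening observer's monthly means can then be differenced:
# tob = monthly_mean_for_obs_hour(series, 17) - monthly_mean_for_obs_hour(series, 8)
```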
In terms of climatic and topographic ranges, Oklahoma is representative of about half of the nation: it has relatively large diurnal and interdiurnal temperature ranges (Karl et al. 1986), and it ranks 23rd in terms of elevation variability (available online at http://www.ngs.noaa.gov). Even so, some may question how applicable the findings of this research are to locations outside of Oklahoma. Although this study inspected data from Oklahoma exclusively, the most severe problems in the COOP network were determined not to be specific to geography, topography, or climate; instead, they were common to all weather observations that are acquired manually. In addition, the stations studied spanned three NWS Weather Forecast Office (WFO) domains: Amarillo, Texas; Norman, Oklahoma; and Tulsa, Oklahoma. Thus, the authors feel the results presented herein are relevant to many other regions of the United States.
6. Conclusions
The daily COOP/HCN data revealed great disparity with observations from nearby Mesonet sites. In fact, upon inspection of the data from one HCN site in each climate division, differences greater than 5°C were discovered to occur more frequently than reported in the scientific literature. In addition, more than 55% of all paired observations were found to differ by more than 1°C (see Table 2). Understandably, the effect of these errors diminishes as the data are averaged into monthly, annual, or statewide means. In fact, the authors found that as the daily observations were averaged spatially and temporally to estimate monthly-mean temperatures at the climate division level, the resulting average discrepancies typically reduced to less than 1°C (not shown).
This research provides undeniable evidence that a significant difference exists between data recorded by the National Weather Service Cooperative Observer Program and data recorded by the Mesonet. Thus, a transition to automated observations would change the climate record for Oklahoma on the daily scale. The greatest discrepancies were found to be caused by COOP observer errors. In conclusion, this study illustrates that most of the largest errors archived in the daily COOP/HCN dataset likely could have been eliminated if automated temperature observations were available.
Acknowledgments
The authors thank Drs. Renee McPherson, David Karoly, Michael Richman, Kelly Redmond, and May Yuan for their constructive criticism of this manuscript. Grant Goodge, Richard Heim, Russell Vose, Steve DelGreco, and William Angel of the NCDC and Forrest Mitchell of the NWS in Norman, Oklahoma, were gracious in explaining the characteristics of the COOP network and its data. Continued funding for maintenance of the Oklahoma Mesonet is provided by the taxpayers of the state of Oklahoma through the Oklahoma State Regents for Higher Education.
REFERENCES
Angel, W. E., M. L. Urzen, S. A. Del Greco, and M. W. Bodosky, 2005: Automated validation for summary of the day temperature data. Preprints, 19th Conf. on IIPS, Savannah, GA, Amer. Meteor. Soc., 15.3. [Available online at http://ams.confex.com/ams/pdfpapers/57274.pdf.]
Baker, D. G., 1975: Effect of observation time on mean temperature estimation. J. Appl. Meteor., 14, 471–476.
Belcher, B. N., and A. T. DeGaetano, 2005: A method to infer time of observation at US Cooperative Observer Network stations using model analyses. Int. J. Climatol., 25, 1237–1251.
Brock, F. V., K. C. Crawford, R. L. Elliott, G. W. Cuperus, S. J. Stadler, H. L. Johnson, and M. D. Eilts, 1995: The Oklahoma Mesonet: A technical overview. J. Atmos. Oceanic Technol., 12, 5–19.
Byrd, G. P., 1985: An adjustment for the effects of observation time on mean temperature and degree-day computations. J. Climate Appl. Meteor., 24, 869–874.
Chenoweth, M., 1993: Nonstandard thermometer exposures at U.S. cooperative weather stations during the late nineteenth century. J. Climate, 6, 1787–1797.
Davey, C. A., and R. A. Pielke Sr., 2005: Microclimate exposures of surface-based weather stations. Bull. Amer. Meteor. Soc., 86, 497–504.
DeGaetano, A. T., 2000: A serially complete simulated observation time metadata file for U.S. daily historical climatology network stations. Bull. Amer. Meteor. Soc., 81, 49–67.
Del Greco, S. A., N. Lott, K. Hawkins, R. Baldwin, D. D. Anders, R. Ray, D. Dellinger, P. Jones, and F. Smith, 2006: Surface data integration at NOAA’s National Climatic Data Center: Data format, processing, QC, and product generation. Preprints, 22nd Int. Conf. on Interactive Information Processing Systems for Meteorology, Oceanography, and Hydrology, Atlanta, GA, Amer. Meteor. Soc., J2.1. [Available online at http://ams.confex.com/ams/pdfpapers/100500.pdf.]
Fiebrich, C. A., and K. C. Crawford, 2001: The impact of unique meteorological phenomena detected by the Oklahoma Mesonet and ARS Micronet on automated quality control. Bull. Amer. Meteor. Soc., 82, 2173–2187.
Fiebrich, C. A., R. A. McPherson, C. C. Fain, J. R. Henslee, and P. D. Hurlbut, 2005: An end-to-end quality assurance system for the modernized COOP network. Preprints, 15th Conf. on Applied Climatology, Savannah, GA, Amer. Meteor. Soc., J3.3. [Available online at http://ams.confex.com/ams/pdfpapers/92198.pdf.]
Fiebrich, C. A., D. L. Grimsley, R. A. McPherson, K. A. Kesler, and G. R. Essenberg, 2006: The value of routine site visits in managing and maintaining quality data from the Oklahoma Mesonet. J. Atmos. Oceanic Technol., 23, 406–416.
Gallo, K. P., 2005: Evaluation of temperature differences for paired stations of the U.S. Climate Reference Network. J. Climate, 18, 1629–1636.
Guttman, N. B., 2005: Standardizing the quality assessment of data: Partnership activities between NCDC and RCCs. Preprints, 15th Conf. on Applied Climatology, Savannah, GA, Amer. Meteor. Soc., J3.5. [Available online at http://ams.confex.com/ams/pdfpapers/94043.pdf.]
Guttman, N. B., and R. G. Quayle, 1990: A review of cooperative temperature data validation. J. Atmos. Oceanic Technol., 7, 334–339.
Hall, P. K., Jr., C. R. Morgan, A. D. Gartside, N. E. Bain, R. Jabrzemski, and C. A. Fiebrich, 2008: Use of climate data to further enhance quality assurance of Oklahoma Mesonet observations. [Available online at http://ams.confex.com/ams/pdfpapers/130407.pdf.]
Holder, C., R. Boyles, A. Syed, D. Niyogi, and S. Raman, 2006: Comparison of collocated automated (NCECONet) and manual (COOP) climate observations in North Carolina. J. Atmos. Oceanic Technol., 23, 671–682.
Hubbard, K. G., X. Lin, C. B. Baker, and B. Sun, 2004: Air temperature comparison between the MMTS and the USCRN temperature systems. J. Atmos. Oceanic Technol., 21, 1590–1597.
Hubbard, K. G., N. B. Guttman, J. You, and Z. Chen, 2007: An improved QC process for temperature in the daily cooperative weather observations. J. Atmos. Oceanic Technol., 24, 206–213.
Janis, M. J., 2002: Observation-time-dependent biases and departures for daily minimum and maximum air temperatures. J. Appl. Meteor., 41, 588–603.
Karl, T. R., C. N. Williams, P. J. Young, and W. M. Wendland, 1986: A model to estimate the time of observation bias associated with monthly mean maximum, minimum and mean temperatures for the United States. J. Climate Appl. Meteor., 25, 145–160.
Karl, T. R., C. N. Williams Jr., and F. T. Quinlan, 1990: United States Historical Climatology Network (HCN) serial temperature and precipitation data. Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory Tech. Rep. ORNL/CDIAC-30, NDP-019/R1, 274 pp.
Martinez, J. E., C. A. Fiebrich, and M. A. Shafer, 2004: The value of a quality assurance meteorologist. Preprints, 14th Conf. on Applied Climatology, Seattle, WA, Amer. Meteor. Soc., 7.4. [Available online at http://ams.confex.com/ams/pdfpapers/69793.pdf.]
McPherson, R. A., and Coauthors, 2007: Statewide monitoring of the mesoscale environment: A technical update on the Oklahoma Mesonet. J. Atmos. Oceanic Technol., 24, 301–321.
National Research Council, 1998: Future of the National Weather Service Cooperative Observer Network. National Academy Press, 65 pp.
Quayle, R. G., D. R. Easterling, T. R. Karl, and P. Y. Hughes, 1991: Effects of recent thermometer changes in the cooperative station network. Bull. Amer. Meteor. Soc., 72, 1718–1723.
Redmond, K. T., 1992: Effects of observation time on interpretation of climatic time series—a need for consistency. Proc. Eighth Annual Pacific Climate (PACLIM) Workshop, Interagency Ecological Program Tech. Rep. 31, 141–150.
Reek, T., S. R. Doty, and T. W. Owen, 1992: A deterministic approach to the validation of historical daily temperature and precipitation data from the cooperative network. Bull. Amer. Meteor. Soc., 73, 753–762.
Rumbaugh, W. F., 1934: The effect of time of observation on mean temperature. Mon. Wea. Rev., 62, 375–376.
Schaal, L. A., and R. F. Dale, 1977: Time of observation temperature bias and “climate change.” J. Appl. Meteor., 16, 215–222.
Shafer, M. A., C. A. Fiebrich, D. S. Arndt, S. E. Fredrickson, and T. W. Hughes, 2000: Quality assurance procedures in the Oklahoma Mesonet. J. Atmos. Oceanic Technol., 17, 474–494.
Sinnott, R. W., 1984: Virtues of the haversine. Sky Telesc., 68, 159.
U.S. Department of Commerce, 2003: Cooperative Station Management. National Weather Service Instruction 10-1307, 11 pp.
U.S. Department of Commerce, 2004: COOP modernization: Building the National Cooperative Mesonet, program development plan. U.S. Department of Commerce, 73 pp.
Wendland, W. M., and W. Armstrong, 1993: Comparison of maximum–minimum resistance and liquid-in-glass thermometer records. J. Atmos. Oceanic Technol., 10, 233–237.
Wu, H., K. G. Hubbard, and J. You, 2005: Notes and correspondence: Some concerns when using data from the cooperative weather station networks: A Nebraska case study. J. Atmos. Oceanic Technol., 22, 592–602.
Fig. 1. MMTS used in the COOP network.
Fig. 2. Cotton Region Shelter (housing the maximum–minimum liquid-in-glass thermometers) used at a small number of stations in the COOP network.
Fig. 3. R. M. Young multiplate radiation shield used by the Oklahoma Mesonet.
Fig. 4. Locations of the Stillwater COOP, Mesonet, and CRN stations.
Fig. 5. Comparison of differences in the mean daily temperatures for 2003 observed (top) at the Stillwater CRN and COOP stations and (bottom) at the CRN and Mesonet stations.
Fig. 6. Locations of HCN (open circles) and Mesonet (filled circles) stations available during the study period. Station labels indicate the pair of HCN and Mesonet stations that were closest to each other in each climate division.
Fig. 7. Time series of air temperature (°C) from the Mesonet station in Stillwater, OK (solid black line). The dashed vertical lines denote 0700 LT each day; the dashed horizontal lines mark the maximum and minimum temperatures (°C) reported by the HCN observer for the 24-h period ending at 0700 LT 27 Feb 2005. The shaded region indicates the period (i.e., the following day) during which the Mesonet data matched the HCN observation.
Fig. 8. COOP form for the Stillwater observer for March 2005. Highlighted cells indicate the Saturday, Sunday, and Monday minimum temperatures. During each weekend period, at least two of the minimum temperatures were repeated.
Fig. 9. As in Fig. 7, except for the 0800 HCN observer at the Altus site on 7 Mar 2003. Numerical annotations indicate the daily maximum and minimum temperatures reported by the HCN observer for each of the 24-h observation periods. In this case, the minimum temperatures matched closely with the Mesonet data, but the maximum temperatures were shifted backward one day.
Fig. 10. As in Fig. 7, except for the 0700 HCN observer at the Cherokee site on 10 Apr 2004. Because the MMTS was not reset on 9 Apr, the maximum recorded on 10 Apr was a 48-h maximum (shaded region).
Fig. 11. As in Fig. 7, except for the 0800 HCN observer at the Goodwell site on 8 Jan 2005. The MMTS likely was reset several hours late on 7 Jan 2005, causing the minimum temperature recorded by the HCN to be warm biased by more than 5°C when reported on the morning of 8 Jan.
Fig. 12. As in Fig. 7, except for the 1700 observer at the Ada site on 9 Feb 2003. The observer’s minimum temperatures typically agree with the Mesonet’s observations for the 24-h period ending at 0800 (shaded region) rather than that ending at 1700.
Fig. 13. As in Fig. 7, except for the 0800 observer at the Antlers site on 1 Jan 2003. The temperature increased more than 6°C shortly after the observation time (highlighted region), creating the warm bias attributed to a delayed observer.
Fig. 14. As in Fig. 7, except for the 0700 observer at the Claremore site on 21 Aug 2004. A transcription error is the most likely cause for the 5.5°C warm bias in the maximum temperature recorded by the HCN observer.
Fig. 15. COOP observation form for the Claremore observer during August 2004. The entry on 21 Aug is highlighted because it appears that the maximum temperature was changed from 71° to 81°F.
Fig. 16. As in Fig. 7, except for the 1800 observer at the Tahlequah site on 18 Feb 2003. No explanation seemed valid for the discrepancy between the HCN and Mesonet observations.
Table 1. Pairs of nearest HCN and Mesonet stations in each climate division. Distances were calculated using the haversine formula (Sinnott 1984).
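For reference, the haversine formula gives the great-circle distance d between points with latitudes φ1, φ2 and longitudes λ1, λ2 (in radians) on a sphere of radius R as

d = 2R\,\arcsin\!\left[\sin^{2}\!\left(\frac{\phi_{2}-\phi_{1}}{2}\right)+\cos\phi_{1}\,\cos\phi_{2}\,\sin^{2}\!\left(\frac{\lambda_{2}-\lambda_{1}}{2}\right)\right]^{1/2}.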
Table 2. Number of maximum or minimum daily temperatures that differed by more than 1°–6°C at collocated Mesonet and HCN locations from 1 Jan 2003 through 31 Dec 2005. The “>5°C” column is italicized to indicate the cases (i.e., discrepancies) that were researched in detail.
1 Mean daily temperatures were calculated as the average of the 24-h minimum and maximum temperatures.
2 Daily maximum and minimum temperatures for the automated stations were calculated from 5-min observations to match the COOP/HCN station’s 0700 documented observation time.
3 See note 2 above.
* Supplemental information related to this paper is available at the Journals Online Web site: http://dx.doi.org/10.1175/2009JTECHA1241.s1.