
Quality Control of Meteorological Data for the Chemical Stockpile Emergency Preparedness Program

  • 1 Argonne National Laboratory, Argonne, Illinois
  • 2 U.S. Army Chemical Materials Agency, Edgewood, Maryland

Abstract

The Chemical Stockpile Emergency Preparedness Program Meteorological Support Project ensures the accuracy and reliability of data acquired by meteorological monitoring stations located at seven U.S. Army chemical weapons depots where storage and weapons destruction (demilitarization) activities are ongoing. The data are delivered in real time to U.S. Army plume dispersion models, which are used to plan for and respond to a potential accidental release of a chemical weapons agent. The project provides maintenance, calibration, and audit services for the instrumentation; collection, automated screening, visual inspection, and analysis of the data; and problem reporting and tracking to carefully control the data quality. The resulting high-quality meteorological data enhance emergency response modeling and public safety.

Corresponding author address: James C. Liljegren, Argonne National Laboratory, 9700 South Cass Ave., Argonne, IL 60439. Email: jcliljegren@anl.gov


1. Introduction

The Chemical Stockpile Emergency Preparedness Program (CSEPP) enhances emergency planning and preparedness for the unlikely event of an accidental release of chemical weapons agent from any of seven¹ U.S. Army chemical weapons storage depots in the continental United States (depots shown in Fig. 1). Until the U.S. stockpile is destroyed, it will pose a continuing threat to depot workers and residents of surrounding communities.

At each depot, a network of meteorological monitoring stations provides real-time data to Army plume dispersion modelers. In case of an accidental release, these modelers must recommend protective actions (e.g., immediate evacuation or sheltering in place) to Army and off-post civilian authorities for communication to depot personnel and the general public. Because an accidental agent release could occur at any time, these meteorological instrument networks must provide instant, accurate data reliably around the clock.

The meteorological data acquired by the monitoring network serve several additional purposes. First, they help to meet national and local permitting requirements for the destruction of chemical weapons at the depots. The data also support compliance with the federal statutory requirement to provide “maximum protection for the environment, the general public, and the personnel who are involved in the destruction of the lethal chemical agents and munitions” (U.S. Code 1986) and help the United States comply with section 10 of article 3 of the Chemical Weapons Convention, which provides that each state party must “assign the highest priority to ensuring the safety of people and to protecting the environment” during chemical weapons destruction. Finally, the data are used to provide heat index, wind chill, and lightning information to increase the safety of daily operations at the storage depots.

This paper describes the quality control (QC) procedures developed by the CSEPP Meteorological Support Project at Argonne National Laboratory to ensure the accuracy and reliability of the data acquired by the meteorological monitoring stations at the storage depots. Following the discussion by Lockhart (1989) and the definitions set forth in ANSI–ASQC (1994a), we use “quality control” to indicate “operational techniques and activities that are used to fulfill the requirements for quality” and “quality assurance” to refer to oversight activities that are used “to provide adequate confidence that an entity will fulfill requirements for quality.”

A description of the meteorological monitoring network is presented first, followed by an overview of the QC procedures. Descriptions of the instrument audit and calibration, data verification and validation, and problem reporting and tracking are presented in turn and accompanied by illustrative examples.

2. The CSEPP meteorological system

a. Tower network

The CSEPP meteorological system summarized in Table 1 contains 73 towers on and around Anniston Army Depot (ANAD) near Anniston, Alabama; Blue Grass Army Depot (BGAD) near Richmond, Kentucky; Deseret Chemical Depot (DCD) near Tooele, Utah; Newport Chemical Depot (NECD) near Newport, Indiana; Pine Bluff Arsenal (PBA) near Pine Bluff, Arkansas; Pueblo Chemical Depot (PCD) near Pueblo, Colorado; and Umatilla Chemical Depot (UMCD) near Umatilla, Oregon.

These towers were established by different owners for varying purposes and serve CSEPP in different ways. As Table 1 indicates, three types of towers exist within the CSEPP system. The demilitarization (DEMIL) towers were originally installed for U.S. Environmental Protection Agency (U.S. EPA) permitting purposes but are now maintained by CSEPP. These towers satisfy all U.S. EPA (2000) guidelines and CSEPP requirements for instrumentation; at the depots where they exist, they provide the most complete and highest-quality data.

The on-post CSEPP towers are comparable to the DEMIL towers except that they may not be equipped with the full complement of instrumentation or the instrumentation may not fully satisfy U.S. EPA specifications. These towers provide redundancy as well as a spatial description of local meteorological conditions. The off-post (community) towers and independent non-CSEPP towers are typically equipped with only a wind speed, wind direction, temperature, and relative humidity sensor at a single level. These towers enhance the spatial description of the local conditions.

Argonne specified, installed, and maintains a subset (18) of the towers and instruments, which are indicated in Table 1. This complements the 30 CSEPP towers maintained by the Bureau of Land Management (BLM) and 33 non-CSEPP towers. Argonne collects, reviews, and archives the data from all 73 towers. Argonne audits the performance of all instrumentation except for that on the 17 independent towers.

b. Tower configurations

The towers in the CSEPP meteorological system are configured to provide at least the minimum dataset required by the plume dispersion models: wind speed, wind direction, and the inputs needed to compute stability class (solar radiation, vertical temperature gradient, and the standard deviation of wind direction σθ). These are provided as 15-min averages (or standard deviations) of 1-s samples. All towers are at least 10 m high and generally have a 2-m level with a temperature and relative humidity sensor; the 10-m level is generally equipped with temperature, wind speed, and wind direction sensors. DEMIL towers and some CSEPP towers are 30 or 60 m high. Temperature, wind speed, and wind direction are measured at the 30- and 60-m levels. The temperature and relative humidity sensors are enclosed in radiation shields, which may be mechanically aspirated (DEMIL towers) or naturally aspirated (CSEPP and community towers). At least one tower at each depot has a barometric pressure sensor. Each depot has two solar radiation sensors for redundancy. On some towers, the solar radiation sensors are mounted at the 2-m level, but they are often mounted away from the tower on a post to prevent problems with shadows and reflections from the tower. Each depot has two precipitation gauges that are mounted on a small cement slab and surrounded by a suitable windscreen. Ground surface temperature is measured at each depot to predict evaporation in the event of a chemical agent spill. The towers use a combination of commercial power and solar power; all towers are equipped with a battery system to bridge power interruptions.

c. Meteorological instrumentation

The instrumentation installed on the CSEPP towers designed by Argonne is summarized in Table 2. The sensors chosen by Argonne were selected to meet or exceed the accuracies and resolutions needed to obtain permits for demilitarization, as specified by the U.S. EPA (2000). The equipment provided by BLM is the same type used by the National Interagency Fire Center for Remote Automated Weather Stations, as described by NWCG Fire Weather Committee (2008).

The instrumented towers have been in operation for as many as 20 yr. The most common types of problems are worth noting. The bearings and potentiometers in the wind sensors periodically need to be replaced. Bird damage used to be a common problem with the R.M. Young wind speed and wind direction sensors until bird perches were installed above the sensors. Temperature sensor failures typically manifest themselves initially as an intermittent problem. To aid in rapidly identifying such intermittent problems, the dataloggers at the Argonne-maintained towers record maxima and minima from selected sensors.

d. Data communication

At each depot, secure digital radio links transmit real-time data from the dataloggers at each tower to a local meteorological data collection computer (MetPC). The MetPC is connected to the Army’s secure local network, which rapidly provides these data to local plume dispersion modeling computers. The data from all depots are also transmitted to the CSEPP meteorological data archive at Argonne, which provides secure data access to the U.S. Army Chemical Materials Agency headquarters.

e. MetView data reporting system

MetView is a meteorological data management application developed by Argonne as part of CSEPP. The dataloggers installed at the towers generate raw data in a variety of formats. The MetView software reads each of these data formats and integrates the incoming data into a single cohesive data stream. The MetView software is configured to provide these data automatically to Army plume dispersion models.

In each depot’s emergency operations center (EOC), the MetView software displays the current meteorological conditions in a concise status board form and raises alarms (both audible and visual) in response to a variety of trigger events, including missing or stale incoming meteorological data; format errors in incoming meteorological data; failure of incoming meteorological data to pass simple limit tests; meteorological conditions that may prohibit chemical agent operations (U.S. Army 2002); and the presence of hazardous situations, such as high wet-bulb globe temperature heat index [computed using an algorithm developed by Argonne for CSEPP (Liljegren et al. 2008)], low wind chill index, or nearby lightning.

MetView can display archived meteorological data in a variety of forms, as shown in Fig. 2, including graphs, tables, maps, wind roses, and stability arrays. MetView can also export data in delimited-column text format and Network Common Data Format (NetCDF) for subsequent analysis.

3. Data quality control procedures

The QC procedures established by Argonne for the CSEPP meteorological system follow U.S. EPA (2000) guidelines and are consistent with the ANSI–ASQC (1994b) standard, although not all aspects of the standard have been fully implemented. In addition to the U.S. EPA recommendations, which emphasize air-quality modeling applications, the QC procedures developed for CSEPP reflect the real-time use of the data for emergency management purposes. The need for a continuous quality-controlled data stream at each storage depot dictates a multifaceted continuous QC approach—not only to rapidly identify and correct problems when they arise but also to identify unfavorable trends and prevent problems before they occur.

The CSEPP QC procedures fall into three broad categories: 1) instrument inspections, performance audits, and calibration; 2) data validation and verification; and 3) problem reporting and tracking. These processes involve the coordinated, complementary efforts of Argonne and depot personnel. At three depots, an Argonne-trained technician inspects the DEMIL towers weekly and follows established problem-reporting procedures to alert Argonne personnel of any problems observed. Minor problems (e.g., weathered cables or connectors) are added to the list of tasks for the next scheduled calibration/audit trip. If necessary (e.g., for a sensor failure), an Argonne meteorological technician is immediately dispatched to the depot to correct the problem. Argonne meteorological technicians audit the performance of the meteorological instrumentation on all DEMIL, CSEPP, and community towers at least twice per year, recalibrating or replacing sensors as needed to ensure proper operation. Including the BLM calibration visits, each depot is visited at least four times per year.

Depot personnel in the EOC respond to data-related alarms generated by MetView. A 24-h “hotline” has been established to permit depot personnel in the EOC to contact an Argonne meteorological instrument technician if the problem cannot be resolved locally. Upon receipt of a hotline call, the technician initiates a secure Internet connection to the local MetPC to diagnose the problem, which can usually be resolved remotely. The technician then prepares a problem report to document and, if necessary, track the issue.

Every morning, MetView automatically applies a comprehensive suite of QC algorithms to the previous day’s data and e-mails reports for each depot to CSEPP personnel at Argonne, BLM, and the Chemical Materials Agency summarizing the results. The Argonne data system manager—an experienced meteorological data analyst—reviews the QC reports and inspects the data to confirm the validity of any reported problems. Once a problem is confirmed, the data system manager determines the appropriate course of action and issues a data quality assessment (QA) report to document the problem and track its resolution. Problems with sensors on BLM-maintained towers are referred to BLM for diagnosis and resolution.

4. Performance audits and calibration

Argonne meteorological technicians audit each tower at least semiannually. Towers used for U.S. EPA permitting purposes are audited quarterly. During an audit, the performance of the sensors is carefully evaluated to ensure that they continue to meet U.S. EPA guidelines. Dataloggers, cables, and the battery backup system are also checked. In addition, the tower itself is inspected to ensure that it is in good condition and that the guying cable tensions are correct. Fiebrich et al. (2006) describe audits of similar frequency and scope performed for the Oklahoma Mesonet. For Argonne-maintained towers, items that fail an audit are replaced or corrected during the audit. BLM is notified if a problem is detected with any of the towers it maintains.

Consistent with the U.S. EPA guidelines and the ANSI–ASQC (1994a) standard, the standard operating procedures for weekly inspections and audits, as well as the results of each weekly inspection and audit, are documented and archived. The CSEPP Meteorological Support Project archive is accessible through a password-protected Web site. Annual emergency response exercises held at each depot test the entire CSEPP meteorological system, including the dispersion models.

a. Performance audit procedures

During an audit, each sensor is checked against a specific tolerance:

  • Wind direction is checked by placing the sensor in a test fixture and checking the output of the datalogger at 30° intervals; the error must be less than 5°. The orientation of the wind direction sensor is verified by attaching an alignment rod to the wind direction sensor mount; the direction that the alignment rod points is compared with a surveyed marker previously driven into the ground for that sensor.
  • Wind speed is checked by rotating the shaft of the wind sensor at a set of known revolutions per minute. The sensor passes when the error in wind speed is less than 0.25 m s−1, which is more stringent than the U.S. EPA requirement of 0.2 m s−1 + 5% of observed. The wind speed and direction sensors are replaced when their starting torque approaches the maximum value recommended by the manufacturer.
  • Temperature sensors are checked by placing the sensor in an ice bath at 0°C and then a stirred warm water bath to obtain a secondary test point. Towers that provide vertical temperature difference data must have a combined temperature error of less than 0.1°C. Towers that only measure temperature are allowed by the U.S. EPA to have an error of 0.5°C; however, the sensor is recalibrated if the error is greater than 0.1°C.
  • The relative humidity sensor is checked with a collocated sensor and must have an error less than 10%.
  • The solar radiation sensors are also checked with a collocated sensor; the error must be less than 5% of observed.
  • Precipitation is checked by dripping a measured amount of water into the sensor; the error must be less than 10%.
  • Barometric pressure is checked with a collocated sensor; the error must be less than 3 mb.

b. Calibration

For most sensors selected by Argonne for CSEPP, the manufacturer’s calibration is sufficient to achieve the accuracy specified in the U.S. EPA guidelines. If a sensor is replaced during an audit, it is returned to the manufacturer for recalibration or refurbishment. However, to achieve the ±0.1°C accuracy for the vertical temperature difference specified in the U.S. EPA guidelines, the temperature sensors are recalibrated to ensure that each individual temperature measurement is accurate to ±0.05°C.

The YSI 703 is a dual-thermistor sensor with a linearizing resistor network that provides a nearly linear output voltage with temperature. This feature allows the measurement sensitivity to remain constant as the temperature changes. Error in the manufacturer’s calibration arises primarily from two sources: 1) variation in the mix of resistive material during manufacturing, which causes a change in the shape of the resistance versus temperature curve, and 2) an error in the total amount of resistive material deposited during the laser-trimming step of manufacturing. Errors resulting from laser trimming cause the actual thermistor resistance to differ from the predicted resistance; however, the ratio of the actual thermistor resistance to the predicted resistance is independent of temperature.

Calibration of the YSI 703 dual-thermistor sensor is improved by accurately measuring the actual resistance of the thermistor (and the linearizing resistors) in an ice-water slurry to determine the ratio of the actual thermistor resistance to the predicted resistance, then using the Steinhart–Hart equation (Steinhart and Hart 1968) and the manufacturer-supplied calibration coefficients to calculate actual thermistor resistances as a function of temperature. A fifth-order polynomial fitted to these values of resistance and temperature produces an expected temperature accuracy of ±0.05°C from −30° to 50°C.

This calibration technique will largely remove any errors resulting from laser trimming but will not correct errors caused by deviations in the resistance versus temperature curve. The new calibration is verified at 0°, 10°, and 30°C by comparison with a National Institute of Standards and Technology (NIST)-traceable mercury-in-glass thermometer accurate to 0.01°C. Any thermistors found to have significant inaccuracy at 10° or 30°C are presumed to also have an error in the thermistor mix or other problems. These are considered defective and are not used by CSEPP. Figure 3 shows the expected error in temperature based on the manufacturer’s calibration and after recalibration.
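
To make the recalibration procedure concrete, the following minimal sketch (Python) implements the resistance-ratio correction under stated assumptions: the Steinhart–Hart coefficients and the 0°C resistance reading are hypothetical placeholders for the manufacturer-supplied and measured values, and the fifth-order polynomial is fitted in ln(resistance) for numerical conditioning.

```python
import numpy as np
from scipy.optimize import brentq

# Steinhart-Hart coefficients (hypothetical values for illustration;
# the operational coefficients are supplied by the probe manufacturer).
A, B, C = 1.2e-3, 2.4e-4, 9.5e-8

def predicted_temperature(R):
    """Temperature (K) predicted from thermistor resistance (ohms)
    via the Steinhart-Hart equation: 1/T = A + B ln R + C (ln R)^3."""
    lnR = np.log(R)
    return 1.0 / (A + B * lnR + C * lnR**3)

def predicted_resistance(T):
    """Numerically invert Steinhart-Hart: resistance (ohms) at T (K)."""
    return brentq(lambda R: predicted_temperature(R) - T, 1.0, 1.0e7)

# Step 1: measure the actual resistance in an ice-water slurry (0 C) and
# form the temperature-independent actual/predicted ratio described above.
R_measured_0C = 9795.0  # hypothetical ohmmeter reading
ratio = R_measured_0C / predicted_resistance(273.15)

# Step 2: tabulate "actual" resistance vs. temperature over -30..50 C and
# fit the fifth-order polynomial used to convert resistance to temperature.
T_grid = np.arange(-30.0, 50.5, 1.0) + 273.15
R_grid = ratio * np.array([predicted_resistance(T) for T in T_grid])
coeffs = np.polyfit(np.log(R_grid), T_grid - 273.15, 5)

def calibrated_temperature(R):
    """Temperature (C) from the recalibrated fifth-order polynomial."""
    return np.polyval(coeffs, np.log(R))
```

As in the text, this removes the temperature-independent laser-trimming error but not deviations in the shape of the resistance-temperature curve, which is why the result is verified against a NIST-traceable thermometer.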

5. Data validation and verification

CSEPP meteorological data are evaluated daily by automated screening and manual inspection. The purpose of the automated data screening is to objectively identify anomalous data values for subsequent review by an experienced data analyst. The review is necessary to determine whether an anomaly results from a problem with the instrumentation—and what maintenance action may be necessary—or whether it accurately reflects unusual meteorological conditions.

a. Automated data screening

The automated data screening process implemented in MetView involves a suite of tests that are applied to all measurements of wind speed, wind direction, σθ, temperature, pressure, relative humidity, solar radiation, precipitation, and battery voltage. It may seem redundant to test σθ in addition to the average wind direction. However, both the wind direction and its standard deviation are critical inputs to the dispersion models. Testing σθ also provides additional constraints on the wind direction measurements.

A decision algorithm collectively interprets the results of the individual tests to determine whether each datum is acceptable or whether it is potentially anomalous and requires visual review. Gandin (1988) first applied this approach, which he referred to as “complex quality control,” to meteorological data. It has since been applied successfully to similar data quality problems by the 115-station Oklahoma Mesonet (Shafer et al. 2000; Fiebrich and Crawford 2001) and the National Climatic Data Center (DeGaetano 1997; Graybeal et al. 2004).

b. Data quality tests

The data quality tests fall into five categories ranging from simple to complex, from less restrictive to more restrictive: existence, limits, range, temporal, and spatial. These are consistent with the types of tests recommended by the U.S. EPA (2000). Whereas the specific screening criteria listed in the U.S. EPA recommendations are necessarily generic, a site-specific approach has been adopted for screening CSEPP meteorological data to account for the local climatology and terrain. Because the CSEPP meteorological tower network has been in operation for several years, these data (10⁵–10⁶ observations per variable per depot) have been used to determine—and periodically refine—the specific screening criteria for each site. Commensurate with CSEPP’s public safety mission, the screening criteria have been chosen conservatively to minimize false negative outcomes (bad data values not flagged) at the expense of a potentially higher rate of false positive identifications (valid data values flagged as anomalous).

1) Existence test (missing value)

The existence test is clearly the simplest; if this test fails, then no other tests are necessary. A missing value is identified by the occurrence of “-6999”, “-9999”, or “-99999” in the data stream. These values are inserted by the dataloggers to indicate loss of signal or out-of-range voltage from the sensor. MetView performs a separate check to identify gaps between successive data records.

2) Limit tests (valid minimum/maximum)

The limits tests establish the extreme boundaries for the measurements. These may be physical limits, sensor limits, or climatological limits. Values that fall outside these boundaries fail immediately and do not require further testing. Limits for CSEPP measurements are listed in Table 3.
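
A minimal sketch of the existence and limit tests follows (Python); the sentinel values are those named above, whereas the limits shown are illustrative stand-ins for the operational values in Table 3.

```python
MISSING_SENTINELS = {-6999.0, -9999.0, -99999.0}

# Illustrative stand-ins for the valid limits in Table 3.
VALID_LIMITS = {
    "wind_speed_10m": (0.0, 50.0),          # m/s
    "temperature_10m": (-40.0, 50.0),       # deg C
    "relative_humidity_2m": (0.0, 100.0),   # percent
}

def existence_test(value):
    """Fail (False) if the datalogger inserted a missing-value sentinel."""
    return value not in MISSING_SENTINELS

def limit_test(name, value):
    """Fail (False) if the value falls outside its valid min/max bounds."""
    lo, hi = VALID_LIMITS[name]
    return lo <= value <= hi
```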

3) Probable range test (climatological consistency)

Probable range tests may involve multiple sensors on the same tower at the same moment in time, such as wind speed and wind direction (similar to a wind rose). These tests provide a more stringent constraint than simple valid maximum/minimum limit tests by requiring consistency among the measurements as well as consistency with historical data. The probable range is determined from several years of prior data for a given location, which have been screened to eliminate periods of erroneous data that would affect the results.

A typical probable range involving multiple sensors is presented in Fig. 4 for wind speed and wind direction at the 10-m level from the Blue Grass Army Depot. The contours show which combinations of wind speed and wind direction fall within a given percentile of the joint probability density. The percentiles were determined by first binning the observations (1 m s−1 bins for speed, 10° bins for direction) in a joint histogram. The bins are then sorted in order from most to least observations, cumulatively summed, and divided by the total number of observations to yield the percentile for each bin. Combinations of wind speed and wind direction that fall outside the 99.9% boundary are considered improbable. Following a review of the values that fall outside the 99.9% boundary, the probable–improbable boundary has been adjusted in some cases to reduce the possibility of a false positive identification.

The 99.9th percentile was selected as the boundary of acceptability because, for 35 040 15-min observations per year, an average of 35 observations per year would be incorrectly flagged as anomalous. This represents a false positive rate of approximately one per 10-day period, which is further reduced by the decision algorithm.
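
The percentile construction described above can be sketched as follows (Python/NumPy). The bin widths follow the text; the manual boundary adjustments made after analyst review are omitted.

```python
import numpy as np

def joint_percentile_map(speed, direction, ws_bin=1.0, wd_bin=10.0):
    """Assign each (speed, direction) bin the cumulative fraction of all
    observations contained in bins at least as densely populated."""
    counts, ws_edges, wd_edges = np.histogram2d(
        speed, direction,
        bins=[np.arange(0.0, speed.max() + ws_bin, ws_bin),
              np.arange(0.0, 360.0 + wd_bin, wd_bin)])
    order = np.argsort(counts, axis=None)[::-1]    # densest bins first
    cum = np.cumsum(counts.ravel()[order]) / counts.sum()
    pct = np.empty_like(cum)
    pct[order] = cum                               # percentile per bin
    return pct.reshape(counts.shape), ws_edges, wd_edges

def probable_range_test(ws, wd, pct, ws_edges, wd_edges, threshold=0.999):
    """True if the observation lies inside the 99.9th-percentile region."""
    i = np.clip(np.searchsorted(ws_edges, ws, side="right") - 1,
                0, pct.shape[0] - 1)
    j = np.clip(np.searchsorted(wd_edges, wd, side="right") - 1,
                0, pct.shape[1] - 1)
    return pct[i, j] <= threshold
```

Empty bins accumulate last and therefore receive percentiles near 1.0, so combinations never observed in the training years are automatically treated as improbable.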

Probable range tests may also compare measurements from a single sensor with the day of the year, similar to climatological high–low records. Figure 5 shows the probable range for 10-m air temperature and day of the year. Table 4 provides a list of the probable range comparisons. Because wind speed is involved in several comparisons, the wind speed measurements are checked first; if a wind speed measurement fails the existence or limits tests, the subsequent range tests involving that measurement are skipped.

Improbable values are not necessarily incorrect; to allow for the possibility of unusual meteorological conditions, the decision algorithm does not report a range anomaly unless at least one of the spatial consistency tests described below also fails (if these tests are available for the variable in question).

4) Temporal consistency

To examine the temporal consistency of the data, two tests involving the rate of change of the variables are applied: the spike/step test and the persistence test.

(i) Spike/step test (DeltaMax)

In the spike/step (or DeltaMax) test, the magnitude of the change between successive 15-min averages (of 1-s samples) is compared with the maximum probable change for a 15-min period (listed in Table 3). This test can reveal spikes and step changes in the data that may result from electrical or communications problems. Like the probable range test, the maximum probable change is based on the 99.9th percentile change for several years of prior data for a given location. Following a review of the detected anomalies, the exact value of the maximum probable change for each variable has generally been adjusted slightly higher than the 99.9th percentile change to further minimize the possibility of a false positive identification. The maximum probable change values for wind speed vary depending on the location and sensor mounting height.

Because wind gusts and rain associated with thunderstorms can produce large changes in successive data values, the decision algorithm ignores spike/step test failures during periods when precipitation is recorded. To further minimize the possibility of a false positive identification (e.g., during frontal passages), if a similar sensor on more than one other tower at the same depot also fails the spike/step test, the decision algorithm does not report a spike/step anomaly.
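
A sketch of the DeltaMax test with the two overrides just described; `max_change` would come from Table 3, and `precip` and `peer_failures` are illustrative, time-aligned series.

```python
def spike_step_test(series, max_change, precip, peer_failures):
    """Flag changes between successive 15-min averages that exceed the
    maximum probable change, ignoring failures during recorded
    precipitation or when more than one other tower fails in step."""
    flags = [False]  # the first value has no predecessor
    for k in range(1, len(series)):
        failed = abs(series[k] - series[k - 1]) > max_change
        if failed and (precip[k] > 0.0 or peer_failures[k] > 1):
            failed = False  # decision-algorithm overrides
        flags.append(failed)
    return flags
```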

(ii) Persistence test (DeltaMin)

In the persistence (or DeltaMin) test, the number of consecutive measurements that fail to change by more than a minimum amount is compared with the maximum probable (99th percentile) number of consecutive periods of little or no change, which is determined from several years of prior data. This test can reveal a bearing in a wind speed or wind direction sensor that is beginning to fail (such that the threshold wind speed increases) or a sensor that may be frozen because of cold temperatures. Because the choice of the 99th percentile is somewhat arbitrary, the allowable number of persistent periods, listed in Table 5, has generally been set to a higher value based on a review of the flagged data.

The persistence test is complicated by variations in the output resolution of the dataloggers (i.e., the number of significant figures reported) at different depots. For example, on Argonne-maintained towers, the resolutions of the temperature, pressure, and relative humidity are 0.01°C, 0.01 mb, and 0.01%, respectively. On BLM-maintained towers the temperature resolution is either 0.1° or 0.56°C (1°F); the pressure resolution is either 0.1 or 0.34 mb (0.01 in. Hg); and the RH resolution is either 0.1% or 1%. When temperature, pressure, and/or relative humidity are varying slowly, limited resolution can cause the persistence test to incorrectly indicate an anomaly. To prevent this, the maximum allowable number of persistent periods is higher for these locations.

If a similar sensor on more than one other tower at the same depot also fails the persistence test, the decision algorithm does not report a persistence anomaly when the ambient temperature is above 0°C. This further minimizes the possibility of a false positive identification (e.g., during unusually long calm periods). If the temperature is below 0°C, a persistence anomaly is reported based on the assumption that freezing conditions can affect multiple sensors.
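
A corresponding sketch of the DeltaMin test; `min_change` and `max_persist` stand for the per-variable, per-site values of the kind listed in Table 5, and the multi-tower frozen/calm interpretation is left to the decision algorithm (section 5c).

```python
def persistence_test(series, min_change, max_persist):
    """Flag values once the count of consecutive changes smaller than
    min_change exceeds the maximum allowable persistent period, so an
    anomaly is reported only after that period has elapsed."""
    flags = [False] * len(series)
    run = 0
    for k in range(1, len(series)):
        run = run + 1 if abs(series[k] - series[k - 1]) < min_change else 0
        flags[k] = run > max_persist
    return flags
```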

5) Spatial consistency tests

Two types of spatial tests are applied to the data where possible: 1) horizontal comparisons of the same measurement at the same height on different towers and 2) vertical comparisons of the same measurement at different heights on the same tower.

(i) Horizontal test

In the horizontal test, the differences between a measurement and the corresponding measurements on other towers are compared (at the 99.9th percentile) with probable differences established from several years of prior data. This is similar to the probable range test, as illustrated in Fig. 6 for the horizontal difference of 10-m wind speed at Blue Grass Army Depot. For all horizontal tests, the probable differences depend on wind speed. To account for elevation differences, all pressure measurements are adjusted to the elevation of the local DEMIL tower before horizontal tests are applied.

At least half of the comparisons with other towers must be out of the probable range for the measurement to fail the horizontal test. For example, if a depot has nine towers and each reports 10-m wind speed, then eight horizontal differences are possible for each wind speed measurement. If four differences fall outside the probable range, then the measurement fails the test; however, if only three differences fall outside the probable range, then the measurement passes the test.

If measurements are only available for two towers (e.g., Newport Chemical Depot), then only a single difference is available for comparison and the decision algorithm cannot unambiguously identify which tower is problematic. In this case, if the measurement from the comparison tower has passed the existence, valid limits, and temporal consistency tests, then the measurement in question is marked as anomalous. If a measurement fails any of the existence, valid limits, or temporal consistency tests, then the horizontal test is skipped.
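
The majority rule and the two-tower special case can be sketched as follows; `probable_diff` stands for the wind-speed-dependent 99.9th-percentile difference.

```python
def horizontal_test(value, peers, probable_diff, two_tower_peer_ok=True):
    """True (pass) unless at least half of the inter-tower differences
    exceed the probable difference. `peers` holds the same measurement
    from the other towers (None when unavailable or already failed)."""
    diffs = [abs(value - p) for p in peers if p is not None]
    if not diffs:
        return True  # no comparison possible
    if len(diffs) == 1:
        # Two-tower case: flag only if the comparison tower's value
        # passed its existence, limits, and temporal tests.
        return not (diffs[0] > probable_diff and two_tower_peer_ok)
    n_out = sum(d > probable_diff for d in diffs)
    return n_out < (len(diffs) + 1) // 2  # fail when at least half are out
```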

(ii) Vertical test

The vertical test is similar to the horizontal test: differences between measurements at any two levels are compared with probable differences at the 99.9th percentile. As an example, Fig. 7 presents the probable differences between the 60- and 10-m σθ as a function of the 30-m wind speed for the BGAD. As the wind speed decreases, the maximum allowable difference between the measured standard deviations increases, and then decreases for near-calm conditions as the minimum wind speed threshold of the sensor is approached. For all vertical tests the probable differences are functions of wind speed, except for temperature. The probable vertical temperature differences are a function of the cosine of the solar zenith angle, as the example in Fig. 8 illustrates, to account for variations in the length of the day/night over the course of a year.

Because two comparisons are necessary to unambiguously identify which level is problematic, at least two vertical tests (comparing three levels) must fail for the decision algorithm to report an anomaly. Consequently, the decision algorithm will never report a vertical anomaly for towers where only two measurement levels are available. A single vertical test is still valuable, however, because a single failed vertical test can confirm a range test failure and cause the decision algorithm to report a range anomaly.

If a measurement fails any existence, valid limits, or temporal consistency test, then the vertical test is skipped.

c. Decision algorithm

A generic decision algorithm has been developed that is applicable to all variables and all sites. The decision algorithm interprets the results of the individual checks described above to determine whether a measurement is anomalous. Further experience with this decision algorithm may lead to refinements to increase its sophistication and provide more variable-specific and/or site-specific considerations.

The flowchart in Fig. 9 shows the logic of the decision algorithm. The algorithm proceeds sequentially through each step until a failure mode is identified; if no failure mode is identified the measurement is judged to be valid.

If the measurement is missing or if the quality checks reveal that the measurement is outside the valid range, no interpretation is necessary. However, if the measurement fails the DeltaMin test, then it may be that the sensor has failed, but it is also possible that an exceptionally long calm period may have occurred or that a winter storm may have caused the sensors to freeze. To decide which outcome is most likely, the result of the persistence test for the same variable on other towers is checked. If none or only one other tower fails the persistence test, then the decision is that the sensor has failed. This allows for the unlikely possibility that two sensors at the same level on different towers could fail at overlapping times. When two or more towers fail the persistence test, the temperature is checked: if it is below 0°C the decision is that the sensors have frozen; otherwise, the decision is that an exceptionally long calm period has occurred and that the sensors are operating properly.

If the measurement fails the DeltaMax test, then the results of the DeltaMax test for the previous time period are checked to determine whether an isolated spike occurred (e.g., because of an unusually large wind gust) or whether a step change has taken place. This evaluation could be extended to include N previous measurements to distinguish large temporary excursions (e.g., because of unusually strong prefrontal winds) that last more than a single measurement period from persistent offsets that may reveal a more serious sensor misalignment or calibration shift. The result of the DeltaMax test is ignored when precipitation is measured because storms can cause unusual changes in the measurements that are not indicative of sensor problems. In addition, if a similar sensor on more than one other tower at the same depot also fails the spike/step test, the decision algorithm does not report a spike/step anomaly for any of these sensors.

If the measurement fails the range test, which is based on the statistical likelihood of occurrence of the value in the past, then the horizontal and vertical test results are used (if available) to determine whether the measurement is likely to be problematic or whether the value is simply outside the statistical range. If the measurement also fails the horizontal or vertical tests, it is judged to have failed the range check. If it passes all of the horizontal and vertical tests, then it is judged to be valid.

If the measurement passes the range test but still fails the horizontal test (i.e., it does not agree with similar measurements on at least half of the other towers at the depot), then it is judged likely to be anomalous. Finally, if the measurement fails vertical comparisons with two other levels, then it is also judged to be anomalous. Both vertical tests must fail to determine which level has the problem. If the check of the 30 − 10 m wind speed difference fails, for example, then the checks of the 60 − 10 m and 60 − 30 m differences should reveal which height is anomalous. If both of the subsequent checks pass, then the measurements are judged to be valid.
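
The flowchart logic can be condensed into the following sketch; the field names are illustrative rather than the operational MetView schema, and the spike-versus-step distinction and spatial-test availability checks are omitted for brevity.

```python
def classify(m):
    """Interpret individual test results for one measurement, supplied
    as a dict of precomputed flags; returns the decision for the datum."""
    if m["missing"]:
        return "missing"
    if not m["within_limits"]:
        return "outside valid range"
    if m["failed_delta_min"]:  # persistence failure
        if m["peers_failing_persistence"] <= 1:
            return "sensor failure"
        # Two or more towers persisting together: frozen or calm.
        return "sensors frozen" if m["temp_C"] < 0.0 else "valid (calm)"
    if (m["failed_delta_max"] and not m["precip_recorded"]
            and m["peers_failing_spike"] <= 1):
        return "spike/step anomaly"
    if m["failed_range"]:
        # Report a range anomaly only when a spatial test confirms it.
        if m["failed_horizontal"] or m["vertical_failures"] >= 1:
            return "range anomaly"
        return "valid"
    if m["failed_horizontal"]:
        return "horizontal anomaly"
    if m["vertical_failures"] >= 2:  # two failed levels needed
        return "vertical anomaly"
    return "valid"
```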

d. Examples

In this section examples of various anomalies detected by the automated QC algorithms are presented for wind speed, wind direction and its standard deviation, and solar irradiance.

1) Wind speed

In Fig. 10, 10-m wind speeds at BGAD are presented along with their data quality flags for 8–11 February 2007. The flags indicate that persistence anomalies occurred each night during this period for tower 3. Because the maximum allowable persistent period for wind speed is 2 h (Table 5), an anomaly is not reported until after the allowable period has elapsed. A failing bearing was diagnosed as the cause of these anomalies and the wind speed sensor was replaced. If more than one other wind speed sensor had failed the persistence test, the decision algorithm would have considered this an unusually long calm period and would not have reported an anomaly.

2) Wind direction

All BLM-maintained towers are equipped with the Handar (now Vaisala) model 555B data collection platform (DCP) datalogger. Problems with the U.S. EPA–recommended Mitsuta algorithm (Mori 1986), used until mid-2008 by the 555B, have adversely affected measurements of the mean and standard deviation of wind direction on the BLM-maintained towers. The Mitsuta algorithm can produce incorrect mean wind directions for northerly winds accompanied by gusts that cause a full 360° rotation of the sensor, a problem noted by Mori (1987). The effect on the data is illustrated in Fig. 11, which compares 10-m mean wind speed and direction from two towers 416 m apart at PCD on 7 April 2006. Tower 3 is equipped with a 555B datalogger, whereas tower 2 is equipped with a CR10X datalogger (Campbell Scientific, Inc.). The CR10X datalogger employs a U.S. EPA–recommended unit-vector algorithm to calculate mean wind direction. Although the mean wind directions were mostly in agreement, at times the 555B reported wind directions that appeared almost random, whereas the towers equipped with the CR10X generated no wind direction anomalies during this period.

The σθ has also been adversely affected by the algorithmic problems of the 555B. Figure 11 shows an example of the difference in behavior of σθ for towers equipped with CR10X and with 555B dataloggers at PCD. The values of σθ computed by the 555B frequently exceeded 104°, the standard deviation of a uniformly distributed random wind direction. During this period, no anomalies were identified for σθ on tower 2, which is equipped with the CR10X datalogger that computes σθ with a single-pass U.S. EPA–recommended algorithm described by Yamartino (1984).
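
For reference, minimal single-pass implementations of the two U.S. EPA–recommended algorithms named above, the unit-vector mean direction and the Yamartino (1984) σθ estimator, are sketched below; the samples are assumed to be wind directions in degrees.

```python
import math

def unit_vector_mean_direction(dirs_deg):
    """Unit-vector mean wind direction (degrees, 0-360)."""
    sa = sum(math.sin(math.radians(d)) for d in dirs_deg) / len(dirs_deg)
    ca = sum(math.cos(math.radians(d)) for d in dirs_deg) / len(dirs_deg)
    return math.degrees(math.atan2(sa, ca)) % 360.0

def yamartino_sigma_theta(dirs_deg):
    """Yamartino (1984) single-pass estimator of sigma-theta (degrees)."""
    sa = sum(math.sin(math.radians(d)) for d in dirs_deg) / len(dirs_deg)
    ca = sum(math.cos(math.radians(d)) for d in dirs_deg) / len(dirs_deg)
    eps = math.sqrt(max(0.0, 1.0 - (sa * sa + ca * ca)))
    return math.degrees(math.asin(eps)) * (1.0 + 0.1547 * eps**3)
```

For a uniformly distributed wind direction, sa = ca = 0 and eps = 1, so the estimator returns 90° × 1.1547 ≈ 104°, the reference value marked by the dashed line in Fig. 11.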

Separate distributions of probable range, horizontal difference, and vertical difference of wind direction and σθ have been determined for CR10X- and 555B-equipped towers to account for this difference in behavior. For illustration purposes, the data in Fig. 11 were checked using quality test parameters and distributions derived for the CR10X-equipped towers.

To identify the source of the observed problems with wind direction and σθ, Argonne personnel established a collocated test system at the Pine Bluff Arsenal depot. Wind direction data were obtained at 1 Hz, then postprocessed using different averaging algorithms and compared with average wind direction data from the 555B. This revealed that the algorithms, not the sensors or the 555B hardware, were the source of the problems. Argonne personnel then worked closely with BLM and Vaisala to implement the unit-vector and Yamartino algorithms in the firmware of the 555B datalogger to correct the problem at the storage depots.

3) Solar irradiance

To account for the variable nature of the solar irradiance, the measured values are normalized by the maximum for the time and location, Smax, given by
$$ S_{\max} = \frac{S_0 \cos\theta}{d^2}, $$
where S0 is the solar constant (=1367 W m−2), θ is the solar zenith angle, and d is the Earth–Sun distance in astronomical units (mean Earth–Sun distance = 1 A.U.). For most of the storage depots, the solar irradiance rarely exceeds 90% of the maximum value; yet, for solar zenith angles greater than 60° (early morning), the irradiance at PCD routinely exceeds the maximum irradiance during the wintertime, thereby failing the range test.
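
A sketch of this normalization follows; the solar-position inputs (cos θ and the Earth–Sun distance d) are assumed to come from a separate ephemeris routine, which is not shown.

```python
import math

S0 = 1367.0  # solar constant (W m^-2)

def max_irradiance(cos_zenith, d_au):
    """Maximum solar irradiance S_max = S0 cos(theta) / d^2 for the
    current solar zenith angle and Earth-Sun distance (AU)."""
    return S0 * max(0.0, cos_zenith) / (d_au * d_au)

def normalized_irradiance(measured, cos_zenith, d_au):
    """Measured irradiance normalized by S_max (NaN at night)."""
    smax = max_irradiance(cos_zenith, d_au)
    return measured / smax if smax > 0.0 else math.nan
```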

Figure 12 presents solar irradiances from PCD for which this phenomenon occurred on the mornings of 13 and 17 January 2005. These range anomalies always occur on mornings when the temperature is at or near the frost point and the winds are light, which suggests that frost may be forming on the pyranometer dome during the night. (The pyranometers are not equipped with ventilators.) If the dome is partially covered with frost opposite to the direction of the sunrise, it can act as a reflector to focus additional light on the pyranometer when the sun is low in the sky. In the case shown in Fig. 12, the conditions for frost were more severe on 14–16 January, when the anomalies did not occur. On these days, it appears the dome may have been entirely covered by frost.

6. Problem reporting and tracking

The problem-reporting system has two major functions: 1) to inform CSEPP personnel of problems and track their resolution and 2) to log all problems and permit CSEPP personnel to query the system for the types of issues that are most frequent or troublesome.

To achieve the first goal, the system has a user interface that permits authorized personnel to log problems as they occur. Each entry, or “ticket,” is tagged with information that permits problems to be identified and categorized, including a brief description of the problem. The system automatically sends e-mail notification to key personnel, including those assigned to respond to the task, those affected by the task, and program managers. The system also permits tickets to be updated following investigation and correction of the issue and the ticket status to be modified accordingly to indicate that the issue has been resolved (“closed”) or that corrective action has been taken but review of the data is necessary to ensure the issue has been resolved (“pending”). Figure 13 shows the system’s display of open, pending, and recently closed tickets.

The problem-reporting system distinguishes between two classes of reports: 1) hotline calls from depot personnel at the EOC that report equipment malfunctions and require immediate response and 2) data quality issues detected by Argonne personnel, which are normally of a less urgent nature. Most hotline calls report a data delivery problem: data from one or more towers that did not update at the scheduled 15-min interval. Argonne personnel usually resolve these problems quickly by remotely diagnosing the situation and, typically, restarting the appropriate software or computer. Data quality issues usually reflect a problem with an instrument or other equipment that may require sensor replacement or additional on-site investigation by a technician to resolve.

To achieve the second goal, the system permits query by site, tower, and instrument position on a tower (e.g., 10-m wind speed). This capability enables identification of similar problem trends across depots.

7. Conclusions

The CSEPP Meteorological Support Project ensures the accuracy and reliability of meteorological data from 73 towers at seven U.S. Army chemical weapons storage depots across the continental United States. These data are used for real-time emergency response modeling as well as for training exercises. Quality control procedures developed at Argonne and implemented in collaboration with each depot ensure that the instrumentation continues to perform as specified and that the data continue to be accurate and reliable.

These processes have resulted in the continuous improvement of the CSEPP meteorological system. For example,

  • Common failure modes in the instrumentation have been identified along with their precursor trends in the data. This permits problems to be detected and instruments to be replaced prior to failure.
  • Anomalous means and standard deviations of wind direction have been traced to problematic algorithms used by the model 555B DCP dataloggers, which permitted the datalogger firmware to be revised.
  • Anomalous σθ at the 10-m level of the BGAD DEMIL tower were attributed to the adverse influence of a stand of trees upwind of the tower. The trees were removed to ensure that the measurements were representative of the local wind field.
  • Anomalous wind directions and their standard deviations from the DEMIL tower at Umatilla Chemical Depot were traced to aging lightning surge arrestors installed by the previous owner of the tower.
These examples illustrate the advantages of combining automated data screening to identify comparatively subtle anomalies with human inspection and analysis of the data and sensors to validate the findings and identify the root cause. The resulting improvements in the accuracy and reliability of the CSEPP meteorological data have, in turn, resulted in improved emergency response modeling and public safety.

This system may have application to other emergency preparedness–related needs. Local weather conditions affect protective action decisions for many hazards. The capability to acquire and display accurate, reliable real-time meteorological data on demand for optimizing protective action calculations can help to protect lives and property. Thus, the comprehensive QC system developed for the CSEPP Meteorological Support Project offers an opportunity for more effective public protection during emergencies.

Acknowledgments

This work was supported by the U.S. Army Chemical Materials Agency, Chemical Stockpile Emergency Preparedness Program, under Contract MIPR 7D22 CM7007 Amend 01 Rev 12. The submitted manuscript has been created by UChicago Argonne, LLC, Operator of Argonne National Laboratory (“Argonne”). Argonne, a U.S. Department of Energy Office of Science laboratory, is operated under Contract DE-AC02-06CH11357. The U.S. government retains for itself, and others acting on its behalf, a paid-up nonexclusive, irrevocable worldwide license in said article to reproduce, prepare derivative works, distribute copies to the public, and perform publicly and display publicly, by or on behalf of the government.

REFERENCES

  • ANSI–ASQC, 1994a: Quality management and quality assurance standards—Guidelines for selection and use. American Society for Quality Control American National Standard Q9000-1-1994, 6 pp.
  • ANSI–ASQC, 1994b: Quality systems—Model for quality assurance in production, installation, and servicing. American Society for Quality Control American National Standard Q9002-1-1994, 8 pp.
  • DeGaetano, A. T., 1997: A quality-control routine for hourly wind observations. J. Atmos. Oceanic Technol., 14, 308–317.
  • Fiebrich, C. A., and K. C. Crawford, 2001: The impact of unique meteorological phenomena detected by the Oklahoma Mesonet and ARS Micronet on automated quality control. Bull. Amer. Meteor. Soc., 82, 2173–2187.
  • Fiebrich, C. A., D. L. Grimsley, R. A. McPherson, K. A. Kesler, and G. R. Essenberg, 2006: The value of routine site visits in managing and maintaining quality data from the Oklahoma Mesonet. J. Atmos. Oceanic Technol., 23, 406–416.
  • Gandin, L. S., 1988: Complex quality control of meteorological observations. Mon. Wea. Rev., 116, 1137–1156.
  • Graybeal, D. Y., A. T. DeGaetano, and K. L. Eggleston, 2004: Complex quality assurance of historical hourly surface airways meteorological data. J. Atmos. Oceanic Technol., 21, 1156–1169.
  • Liljegren, J. C., R. A. Carhart, P. Lawday, S. Tschopp, and R. Sharp, 2008: Modeling the wet bulb globe temperature using standard meteorological measurements. J. Occup. Environ. Hyg., 5, 645–655.
  • Lockhart, T. J., 1989: Comments on “A quality control program for surface mesonetwork data.” J. Atmos. Oceanic Technol., 6, 525–526.
  • Michalsky, J. J., and Coauthors, 2003: Results from the first ARM diffuse horizontal shortwave irradiance comparison. J. Geophys. Res., 108, 4108, doi:10.1029/2002JD002825.
  • Mori, Y., 1986: Evaluation of several single-pass estimators of the mean and the standard deviation of wind direction. J. Climate Appl. Meteor., 25, 1387–1397.
  • Mori, Y., 1987: Methods for estimating the mean and the standard deviation of wind direction. J. Climate Appl. Meteor., 26, 1282–1284.
  • NWCG Fire Weather Committee, 2008: NWCG fire weather station standards. National Wildfire Coordinating Group Publication PMS 426-3, 49 pp. [Available online at http://www.fs.fed.us/raws/standards/FireWxStds_final_revMay08.pdf.]
  • Shafer, M. A., C. A. Fiebrich, D. S. Arndt, S. E. Fredrickson, and T. W. Hughes, 2000: Quality assurance procedures in the Oklahoma Mesonetwork. J. Atmos. Oceanic Technol., 17, 474–494.
  • Steinhart, J. S., and S. R. Hart, 1968: Calibration curves for thermistors. Deep-Sea Res., 15, 497–503.
  • U.S. Army, 2002: Toxic chemical agent safety standards. Department of the Army Pamphlet 385-61, 82 pp.
  • U.S. Code, 1986: Destruction of existing stockpile of lethal chemical weapons. Chemical and Biological Warfare Program, Title 50 U.S. Code 1521.
  • U.S. EPA, 2000: Meteorological monitoring guidance for regulatory modeling applications. U.S. Environmental Protection Agency Tech. Rep. EPA-454/R-99-005, 171 pp.
  • Yamartino, R. J., 1984: A comparison of several “single-pass” estimators of the standard deviation of wind direction. J. Climate Appl. Meteor., 23, 1362–1366.
Fig. 1. Locations of CSEPP sites.

Fig. 2. A typical MetView display. (top left) Current data with indications that atmospheric stability, wind speed, and direction are acceptable (“go”) for transporting chemical munitions from their storage igloos to the demilitarization facility. (top right) Map showing wind vectors with highlighted sectors indicating regions with lightning activity. (bottom left) A wind rose for the previous 72 h. (bottom right) A time series plot of stability class, solar irradiance, and vertical temperature difference.

Fig. 3. The expected error in the YSI 703 temperature probe using the manufacturer’s standard calibration (dashed line) and after recalibration (solid line).

Fig. 4. Probable range of 10-m wind speed and 10-m wind direction for all towers at BGAD. Values outside the 99.9th percentile contour are considered improbable.

Fig. 5. Probable range of 10-m air temperature as a function of the day of year for all towers at BGAD.

Fig. 6. Probable spatial (horizontal) difference in 10-m wind speed as a function of wind speed for all towers at BGAD.

Fig. 7. Probable vertical σθ difference (60 − 10 m) as a function of wind speed for the DEMIL tower at BGAD.

Fig. 8. Probable vertical temperature difference (60 − 2 m) as a function of the cosine of the solar zenith angle for the DEMIL tower at BGAD.

Fig. 9. Flowchart of the decision algorithm logic.

Fig. 10. The 10-m wind speeds and quality flags on 8–11 Feb 2007 at BGAD. The anomalous periods are circled. Open circles mark the wind speed measurements for tower 3 (dark line) that failed the persistence test.

Fig. 11. (a) Mean 10-m wind speed, (b) 10-m wind direction, and (c) standard deviation of 10-m wind direction at PCD on 7 Apr 2006 for tower 2 with a CR10X datalogger (dark gray lines, symbols) and tower 3 with a 555B datalogger (light gray lines, symbols). Values flagged by the automatic screening algorithms (open circles) indicate either horizontal or horizontal and range anomalies. The dashed line at 104° indicates the value of σθ for a uniformly distributed random wind direction.

Fig. 12. (a) Solar irradiance (light gray) and maximum solar irradiance (dark gray); (b) normalized solar irradiance. Data values that fail the range test are marked with open circles in (a).

Fig. 13. The CSEPP problem-reporting and tracking system summary of open, pending, and recently closed issues.

Table 1. Types of towers in the CSEPP network. Argonne quality control procedures are applied to all towers except the independent non-CSEPP towers.

Table 2. Specifications for Argonne-supplied sensors and U.S. EPA recommendations.

Table 3. Valid limits.

Table 4. Probable range test comparisons.

Table 5. Maximum allowable persistent periods.
¹ The Newport Chemical Depot completed its mission of demilitarization on 8 August 2008 and has shifted to “closure.” Of the six remaining sites, four conduct ongoing demilitarization activities and two currently conduct storage-only activities.
