
Strategies for Evaluating Quality Assurance Procedures

  • 1 NOAA/National Climatic Data Center, Asheville, North Carolina

Abstract

The evaluation strategies outlined in this paper constitute a set of tools beneficial to the development and documentation of robust automated quality assurance (QA) procedures. Traditionally, thresholds for the QA of climate data have been based on target flag rates or statistical confidence limits. However, these approaches do not necessarily quantify a procedure’s effectiveness at detecting true errors in the data. Rather, as illustrated by way of an “extremes check” for daily precipitation totals, information on the performance of a QA test is best obtained through a systematic manual inspection of samples of flagged values combined with a careful analysis of geographical and seasonal patterns of flagged observations. Such an evaluation process not only helps to document the effectiveness of each individual test, but, when applied repeatedly throughout the development process, it also aids in choosing the optimal combination of QA procedures and associated thresholds. In addition, the approach described here constitutes a mechanism for reassessing system performance whenever revisions are made following initial development.
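The "extremes check" and flag-rate evaluation described above can be illustrated with a minimal sketch. This is a hypothetical example, not the paper's implementation: the threshold value, function names, and sample data are all assumptions chosen for illustration. The idea is simply that a QA test flags values exceeding some climatological limit, and the resulting flag rate is one quantity a developer would monitor while tuning that limit.

```python
# Hypothetical sketch of an "extremes check" for daily precipitation totals.
# The threshold, function names, and data below are illustrative assumptions,
# not the procedure used operationally.

def extremes_check(values, threshold):
    """Flag each observation that exceeds the climatological threshold."""
    return [v > threshold for v in values]

def flag_rate(flags):
    """Fraction of observations flagged by the check."""
    return sum(flags) / len(flags) if flags else 0.0

# Example: daily precipitation totals (mm) containing one suspicious spike.
daily_precip = [0.0, 2.5, 11.0, 0.0, 430.0, 7.2]
flags = extremes_check(daily_precip, threshold=300.0)
print(flags)             # [False, False, False, False, True, False]
print(flag_rate(flags))  # 0.1666...
```

As the abstract argues, a flag rate alone does not establish that the flagged value (here, 430.0 mm) is a true error; that judgment requires manual inspection of flagged samples and analysis of their geographic and seasonal patterns.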

Corresponding author address: Dr. Imke Durre, NOAA/National Climatic Data Center, 151 Patton Ave., Asheville, NC 28801. Email: imke.durre@noaa.gov
