Search Results

You are looking at 21–30 of 14,205 items for:

  • Data quality control
  • All content

Kenneth G. Hubbard, Nathaniel B. Guttman, Jinsheng You, and Zhirong Chen

data is scant. General testing approaches such as threshold and step-change criteria have been designed for single-station review of data to detect potential outliers (Wade 1987; Reek et al. 1992; Meek and Hatfield 1994; Eischeid et al. 1995). Recently, the use of multiple stations in quality assurance procedures has proven to provide valuable information for quality control (QC) compared with single-station checking. Spatial tests compare a station’s data against the data
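The two single-station criteria named in this excerpt translate directly into screening rules. A minimal sketch follows, assuming a daily maximum temperature series; the function names, bounds, and step limit are illustrative choices, not values from the cited studies.

```python
import numpy as np

def threshold_test(values, lower, upper):
    """Flag observations falling outside assumed climatological bounds."""
    values = np.asarray(values, dtype=float)
    return (values < lower) | (values > upper)

def step_change_test(values, max_step):
    """Flag observations whose jump from the previous value exceeds max_step."""
    values = np.asarray(values, dtype=float)
    flags = np.zeros(values.shape, dtype=bool)
    flags[1:] = np.abs(np.diff(values)) > max_step
    return flags

# Example: daily maximum temperature (deg C) with an obvious spike.
tmax = [21.3, 22.1, 21.8, 45.0, 22.4]
print(threshold_test(tmax, lower=-30.0, upper=40.0))  # [False False False  True False]
print(step_change_test(tmax, max_step=15.0))          # [False False False  True  True]
```

The spatial tests mentioned at the end of the excerpt go a step further, comparing each station against an estimate formed from its neighbors, which single-series checks like these cannot do.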

Full access
Paul E. Ciesielski, Wen-Ming Chang, Shao-Chin Huang, Richard H. Johnson, Ben Jong-Dao Jou, Wen-Chau Lee, Po-Hsiung Lin, Ching-Hwang Liu, and Junhong Wang

Environment (ASPEN) software, which performs some limited quality control on the data and produces level-2 (L2) files in NCAR Earth Observing Laboratory (EOL) sounding format. Details of the EOL format and the ASPEN QC procedure are described in the L2 sonde documentation available from the TiMREX data archive. In level-3 (L3) processing, problems that are deemed correctable are resolved. Sections 3–6 of this paper describe the identification of the problems in the L2 dataset and their correction to

Full access
Joey J. Voermans, Alexander V. Babanin, Cagil Kirezci, Jonas T. Carvalho, Marcelo F. Santini, Bruna F. Pavani, and Luciano P. Pezzi

1. Introduction Quality control (QC) of ocean wave measurements provides confidence in the accuracy of wave observations (e.g., IOOS 2019) for the purpose of model calibration and validation, and the assimilation of real-time data in operational forecasting models (e.g., Gilhousen 1994; Komen et al. 1996). QC procedures are applied at different stages of the data pipeline, where the initial stages target the functioning of the sensors and the last stages commonly focus on the

Restricted access
Valliappa Lakshmanan, Madison Miller, and Travis Smith

is, without having to use input data from the future. In an accumulation product that consists of N frames, it is possible for the temporal association to look ahead L frames and still remain causal as long as the look-ahead is limited to the first N − L frames. In other words, when the accumulation is being generated in real time, the quality control is applied in a lagging sense: temporal continuity measures can be employed to remove spurious echoes in the first N − L frames, whereas
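Read concretely: when frame t arrives in real time, a continuity test that needs L future frames can be finalized only for frame t − L, so exactly the first N − L frames of an N-frame accumulation ever get screened. Below is a minimal sketch of this lagging bookkeeping; `is_spurious` is a hypothetical placeholder for whatever temporal-continuity measure the product applies.

```python
from collections import deque

def causal_qc_stream(frames, L, is_spurious):
    """Apply an L-frame look-ahead echo test in a lagging (causal) sense."""
    buffer = deque()
    for frame in frames:
        buffer.append(frame)
        if len(buffer) > L:
            # The oldest buffered frame now has exactly L successors,
            # all of which have already arrived: checking it is causal.
            candidate = buffer.popleft()
            if not is_spurious(candidate, list(buffer)):
                yield candidate
    # The trailing L frames never accumulate L successors; emit unchecked.
    while buffer:
        yield buffer.popleft()
```

Each frame is emitted only after its L successors have arrived, so no future data is needed at emission time; the last L frames pass through unscreened, matching the causality constraint described in the excerpt.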

Full access
Simone Cosoli, Giorgio Bolzon, and Andrea Mazzoldi

(Oke et al. 2002; Breivik and Sætra 2001; Shulman et al. 2002; Paduan and Shulman 2004). Though limited to the surface, HF radars in fact provide high-resolution real-time data on large observational grids at a relatively low cost and have the advantage of resolving rapidly varying current features that would require significant computational cost in ocean circulation models. 2. Quality control approaches a. General approaches While there is general agreement on the reliability of radar data for

Full access
Fabienne Gaillard, Emmanuelle Autret, Virginie Thierry, Philippe Galaup, Christine Coatanoan, and Thomas Loubrieu

the float data in real time and apply a series of standard automatic quality control (QC) tests defined by an international data management group (Argo Data Management 2005) to set the QC flag values. These data are then transmitted to the global data assembly centers (GDACs) in charge of distribution to Argo users and to the Global Telecommunication System (GTS). A second level of processing, called “delayed mode,” is performed by the principal investigators (PIs). At the present time
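The real-time tests referenced here each assign a per-observation QC flag, and the flags from all tests are then combined. A minimal sketch of that pattern is below, loosely shaped after two of the standard Argo real-time tests (global range and spike); the flag codes, limits, and function names are illustrative assumptions, not the official Argo tables.

```python
import numpy as np

GOOD, BAD = 1, 4  # illustrative flag codes in the spirit of Argo QC flags

def global_range_test(temp, lo=-2.5, hi=40.0):
    """Flag temperatures outside a gross physical range (assumed limits)."""
    temp = np.asarray(temp, dtype=float)
    return np.where((temp < lo) | (temp > hi), BAD, GOOD)

def spike_test(temp, threshold=6.0):
    """Flag points departing sharply from the mean of their two neighbors."""
    temp = np.asarray(temp, dtype=float)
    flags = np.full(temp.shape, GOOD)
    test = (np.abs(temp[1:-1] - 0.5 * (temp[:-2] + temp[2:]))
            - 0.5 * np.abs(temp[2:] - temp[:-2]))
    flags[1:-1][test > threshold] = BAD
    return flags

def combine(*flag_arrays):
    """Keep the worst flag assigned to each observation by any test."""
    return np.maximum.reduce(flag_arrays)

profile = [12.1, 12.0, 25.0, 11.8, 11.7]
print(combine(global_range_test(profile), spike_test(profile)))  # [1 1 4 1 1]
```

Delayed-mode processing by the PIs, mentioned at the end of the excerpt, revisits these automatic flags with more careful, expert-supervised analysis.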

Full access
Sung Yong Kim

enhanced awareness of building and sustaining regional coastal ocean observing programs (e.g., Malone and Cole 2000; Ocean.US 2002; Stokstad 2006). In this paper, detailed and technical descriptions of HFR data analysis are presented in terms of the quality assurance and quality control (QAQC) of radial velocity data based on the expected geophysical signals and dynamic relationships between driving forces and responses. This work will be beneficial and instructive not only for HFR operators and

Full access
Giuseppe M. R. Manzella and Marco Gambetta

1. Introduction Forecast, analysis, and reanalysis in ocean science all need data that are delivered in real time, as well as high-quality archived data. Real-time data are assimilated into numerical models, providing analysis of the state of the ocean and forecasts of future ocean characteristics. Quality control (QC) of such data normally involves fewer working procedures than QC for archived data. In operational oceanography (as well as in meteorology), from time to time a

Full access
Etor E. Lucio-Eceiza, J. Fidel González-Rouco, Jorge Navarro, Hugo Beltrami, and Jorge Conte

1. Introduction Performing meteorological measurements, together with data storage and management, is a delicate process that is never exempt from errors, despite the efforts and care invested in the task. For any meaningful use of these meteorological data, it is important to ensure, as much as possible, the validity of observations. The procedures used for this purpose constitute the so-called quality control (QC; e.g., Wade 1987; Gandin 1988; DeGaetano 1997; Shafer et al. 2000; Fiebrich et al. 2010

Full access
Donald V. Hansen and Pierre-Marie Poulain

the surface currents of the World Ocean. To expedite completion of research-quality datasets for archival and dissemination, a data acquisition activity is being conducted at the NOAA Atlantic Oceanographic and Meteorological Laboratory (AOML), Miami, Florida. At AOML, data from drifting buoys of cooperating operators are quality controlled and optimally interpolated to uniform 6-h interval trajectories for archival at the Marine Environmental Data Service (Canada). This report describes in detail
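Optimal interpolation of the kind described here is a Gauss–Markov (objective analysis) estimate of position at uniform 6-h times from irregular buoy fixes. Below is a minimal one-dimensional sketch, assuming an exponential time covariance; the timescale, noise ratio, and function name are illustrative assumptions rather than the AOML-tuned values, and each trajectory coordinate would be handled analogously.

```python
import numpy as np

def oi_to_uniform(t_obs, x_obs, t_out, timescale=24.0, noise=0.01):
    """Gauss-Markov interpolation of irregular fixes onto a uniform time grid.

    Signal covariance between times is modeled as exp(-|dt| / timescale);
    `noise` is an assumed observation-error-to-signal variance ratio.
    """
    t_obs = np.asarray(t_obs, dtype=float)
    anom = np.asarray(x_obs, dtype=float) - np.mean(x_obs)
    # Observation-observation covariance, with noise on the diagonal.
    C = np.exp(-np.abs(t_obs[:, None] - t_obs[None, :]) / timescale)
    C += noise * np.eye(len(t_obs))
    # Output-observation covariance.
    c = np.exp(-np.abs(np.asarray(t_out, dtype=float)[:, None] - t_obs[None, :])
               / timescale)
    return np.mean(x_obs) + c @ np.linalg.solve(C, anom)

# Irregular longitude fixes (hours, degrees) regridded to 6-h intervals.
t_fix = [0.0, 5.0, 13.0, 17.5, 26.0]
lon = [-45.00, -45.10, -45.32, -45.41, -45.60]
print(np.round(oi_to_uniform(t_fix, lon, np.arange(0.0, 25.0, 6.0)), 3))
```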

Full access