Search Results

You are looking at 51–60 of 14,320 items for:

  • Data quality control
  • All content
Susan Rennie, Peter Steinle, Alan Seed, Mark Curtis, and Yi Xiao

single-polarization S-band or C-band Doppler radars, with 1° or ~1.8° beamwidth, and 250- or 500-m range resolution. Each radar scans over 14 elevations every 6 or 10 min. The radar data undergo on-site quality control including zero-velocity ground clutter filtering and dual-PRF dealiasing before being transmitted offsite. Offsite processing includes further velocity dealiasing and echo classification, including clutter not removed by on-site filtering. The data (Australian Bureau of Meteorology

Full access
Jay H. Lawrimore, David Wuertz, Anna Wilson, Scott Stevens, Matthew Menne, Bryant Korzeniewski, Michael A. Palecki, Ronald D. Leeper, and Thomas Trunk

&P Rebuild (FPR), aimed to improve the quality and completeness of hourly precipitation observations while reducing maintenance costs. As a replacement for paper tape, which is subject to tearing, deterioration, and being expended between site visits, the upgrade introduced digital recording via a datalogger. Accompanying this transition to digital recording are new data acquisition, integration, and quality control processes developed at NCEI. This new approach features a change from a largely manual

Restricted access
Jérôme Gourrion, Tanguy Szekely, Rachel Killick, Breck Owens, Gilles Reverdin, and Bertrand Chapron

Europe, within the Copernicus Marine Environment Monitoring Service, these activities are conducted at global and regional scales, both in real time and delayed time. At the global scale, Mercator-Océan carries out the modeling and assimilation activities while Coriolis is involved in the observational ones. For these complementary activities to succeed, an essential and critical activity is the data quality control (QC). This paper focuses on QC procedures. For meteorological data, Gandin (1988

Restricted access
Li Bi, James A. Jung, Michael C. Morgan, and John F. Le Marshall

: 15 September–30 October 2006 and 15 February–30 March 2007. Our first choice would have been the more extreme months of January and July. However, due to data collection problems, including satellite outages, October and March were the two months with the best data coverage. The first two weeks of each time period were removed from these results to allow the assimilation system and forecast model to adjust to the new data. a. Quality control and data thinning The WindSat data in this experiment

Full access
Dean Vickers and L. Mahrt

1. Introduction Frequently data are analyzed without tedious inspection of individual records for isolated instrumentation problems. Some investigators have developed automatic checks for frequently occurring problems. Smith et al. (1996) have recently constructed automated quality control procedures for slow response surface data that flag questionable data for visual inspection. Foken and Wichura (1996) apply criteria to fast-response turbulence data to test for nonstationarity and

Full access
Youlong Xia, Trent W. Ford, Yihua Wu, Steven M. Quiring, and Michael B. Ek

in the United States lack harmonization through standardized quality control methods and protocols. The absence of consistent calibration and measurement standards among observation networks makes comparison of data collected by different networks very difficult. To overcome this limitation, the International Soil Moisture Network (ISMN) was initiated in 2010 to serve as a centralized data hosting facility where globally available in situ soil moisture

Full access
S. J. Rennie, M. Curtis, J. Peter, A. W. Seed, P. J. Steinle, and G. Wen

1. Introduction The use of radar observations for data assimilation (DA) in NWP is growing with the development of high-resolution NWP. Quality control is vital for data assimilation because the impact of a few bad observations can be substantial ( Rabier et al. 1996 ), damaging a forecast. For DA and other quantitative applications of radar data, quality control (QC) that provides flexibility depending on the application is desirable. Many echo identification algorithms have been developed in

Full access
Ann Gronell and Susan E. Wijffels

Exchange program. However, the resulting archives often contain many duplicates of the same profile as well as data of varying quality. Because the archives are so large (millions of profiles), hand quality controlling every profile is not affordable. Here we describe methods that use a combination of expert manual quality control techniques and automated statistical tests that allow the quality of these historical archives to be dramatically increased, thereby increasing the value of these important

Full access
J. Fidel González-Rouco, J. Luis Jiménez, Vicente Quesada, and Francisco Valero

were identified as those values exceeding a maximum threshold for each time series ( Trenberth and Paolino 1980 ; Peterson et al. 1998a ) defined by P_out = q_0.75 + 3 IQR, (1) where q_0.75 is the third quartile and IQR the interquartile range. The IQR has been used in quality control of climate data ( Eischeid et al. 1995 ) because it is resistant to outliers. Values exceeding P_out were replaced by this limit. This way of proceeding reduces the bias caused by outliers and yet keeps the
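The capping rule in Eq. (1) of this excerpt can be sketched in a few lines. This is an illustrative reconstruction, not code from the cited paper; the function name `cap_outliers_iqr` and the multiplier argument `k` (set to 3, matching the excerpt) are assumptions for the example.

```python
import numpy as np

def cap_outliers_iqr(series, k=3.0):
    """Cap values above P_out = q_0.75 + k * IQR.

    Values exceeding the threshold are replaced by the threshold
    itself, reducing outlier bias while keeping the record length.
    """
    q1, q3 = np.percentile(series, [25, 75])
    iqr = q3 - q1            # interquartile range, resistant to outliers
    p_out = q3 + k * iqr     # maximum-threshold definition from Eq. (1)
    return np.minimum(series, p_out)
```

Because the threshold is built from quartiles rather than the mean and standard deviation, a single extreme value cannot inflate the threshold used to detect it, which is the stated reason the IQR is preferred for climate-data quality control.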

Full access
Anthony R. Kirincich, Tony de Paolo, and Eric Terrill

, direction-finding algorithm performance, and velocity-based quality control schemes, the role of external data quality indicators (i.e., those independent of the estimated velocity itself) has received less attention. This work intends to show that using nonvelocity-based metrics of the signal quality and the direction-of-arrival (DOA) function to both implement additional data quality controls and alter the typical spatial averaging process can lead to surface currents with reduced scatter and biases

Full access