Search Results

Showing 1 - 10 of 817 items for:

  • Data quality control
  • Journal of Physical Oceanography
  • All content
Marios Christou and Kevin Ewans

cheap computational storage, the time history of the raw wave measurements is also saved. This step change has permitted the detailed analysis of rogue waves, without which the present study could not have been undertaken. As field measurements are associated with instrument errors, a vast amount of quality control (QC) is required. To produce a reliable database, a strict QC procedure is necessary; however, this has the undesired effect of reducing the amount of data available for analysis
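A strict QC procedure of the kind described above typically combines a gross range check with a spike test, and flagging a spike also tends to flag its neighbours, which illustrates why strict QC reduces the data available for analysis. A minimal sketch (the thresholds and the `qc_flags` helper are hypothetical, not from the study):

```python
import numpy as np

def qc_flags(series, valid_min, valid_max, spike_threshold):
    """Flag samples failing a gross range check or a simple spike test.

    Returns a boolean array: True where the sample passes QC.
    (Illustrative thresholds only; real QC is instrument-specific.)
    """
    series = np.asarray(series, dtype=float)
    # Gross range check against physically plausible bounds
    ok = (series >= valid_min) & (series <= valid_max)
    # Spike test: compare each interior point with the mean of its neighbours
    deviation = np.abs(series[1:-1] - 0.5 * (series[:-2] + series[2:]))
    ok[1:-1] &= deviation < spike_threshold
    return ok

# Surface elevation record (m) with one obvious spike at index 3
eta = [0.1, -0.2, 0.3, 9.9, 0.2, -0.1]
flags = qc_flags(eta, valid_min=-5.0, valid_max=5.0, spike_threshold=2.0)
```

Note that the spike contaminates the spike test at the two adjacent points as well, so three samples are rejected for one bad measurement.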

Full access
Daniel L. Rudnick, Ganesh Gopalakrishnan, and Bruce D. Cornuelle

forward simulation with the optimized controls so that the ocean state exactly obeys the model dynamics. Three assimilation experiments are performed: experiment 1 uses only along-track satellite SSH, experiment 2 uses temperature and salinity from the gliders in addition to SSH, and a control experiment assimilates no data. Forecasts of the GoM circulation are initialized from the optimized solutions for each of these hindcasts. Some relevant details of the MITgcm-IAS model are as follows: The model

Full access
M. C. Gregg

made possible the computation of quantities such as the stability parameter N2 and the Richardson number Ri over scales of less than a meter. Quality control of the data has been done on an ad hoc basis by individual investigators who have used varying degrees of restraint in the claims made for the final results. As the amount of high resolution data increases and as the parameters computed from it are used in statistical studies a more objective approach is required. Profiling instruments are
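The two quantities named in the excerpt follow from standard definitions: the buoyancy frequency squared N² = (g/ρ₀)·dρ/dz (with z positive downward) and the gradient Richardson number Ri = N²/(du/dz)². A sketch with made-up profile data (the specific values are illustrative, not from the paper):

```python
import numpy as np

g = 9.81        # gravitational acceleration (m/s^2)
rho0 = 1025.0   # reference density (kg/m^3)

# Hypothetical 1-m-resolution profile, z positive downward (m)
z = np.arange(0.0, 10.0, 1.0)
rho = rho0 + 0.01 * z          # density increasing with depth (stable)
u = 0.1 * np.exp(-z / 5.0)     # horizontal velocity (m/s)

# Buoyancy frequency squared: N^2 = (g / rho0) * d(rho)/dz
N2 = (g / rho0) * np.gradient(rho, z)

# Vertical shear squared of the horizontal velocity
S2 = np.gradient(u, z) ** 2

# Gradient Richardson number; Ri < 1/4 suggests shear instability
Ri = N2 / S2
```

At sub-meter scales the finite-difference gradients are dominated by instrument noise, which is exactly why the excerpt argues for a more objective QC approach as such data accumulate.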

Full access
Jerome A. Smith, Paola Cessi, Ilker Fer, Gregory Foltz, Baylor Fox-Kemper, Karen Heywood, Nicole Jones, Jody Klymak, and Joseph LaCasce

that this editorial will help to prod the discussion of exactly what data should be shared, and in what form (format, metadata, level of quality control, etc.). The motivation is that science requires evidence. Making data available allows other scientists to confirm results, uncover errors, or find new insights. Moreover, gathering good data is expensive and time consuming. Since the same data can often be used for a range of purposes, making data available can be an efficient use of limited

Open access
John Derber and Anthony Rosati

These newer and more complex techniques, however, appear to be currently impractical for application to a high-resolution global oceanic GCM. Thus in this study we have chosen a rather straightforward technique to perform the data assimilation. Any data assimilation system can be described in terms of three components: the dynamical model, the data, and quality control

Full access
Weiwei Fu

1. Introduction Coupled physical–biogeochemical (CPB) models are important tools to understand the role of external forcing such as climate change and nutrient loads in controlling biogeochemical processes in the Baltic Sea ( Neumann 2010 ; Maar et al. 2011 ). The simulation and prediction of marine ecosystems with CPB models have also triggered growing needs to develop data assimilation capability ( George et al. 2005 ; Nerger and Gregg 2007 ; Siddorn et al. 2007 ; Brasseur et al. 2009

Full access
Carl Wunsch and Patrick Heimbach

, improved data quality, and new data sources, the results have continued to improve. Searches must continue for better and more realistic models, for better continuous global coverage by observations—particularly in the abyss—and for improved understanding of the skill in such combined estimates as in our own. The degree to which model-data misfits in one region of the ocean degrade the solution everywhere else for all time must be determined. Production of accurate, but never perfect estimates of the

Full access
Jiwei Tian, Lei Zhou, and Xiaoqian Zhang

energy dissipation balances net internal tide flux in the inner ocean, we calculate the mixing rate caused by internal tides in a suitably chosen control volume. Therefore, it is a meaningful attempt to estimate the distribution of the mixing rate in the upper ocean using the altimeter data and dynamical relations of internal waves. This paper is organized into four topical sections. After the introductory section, section 2 will describe the method to calculate global energy flux of M 2

Full access
Bernadette M. Sloyan, Ken R. Ridgway, and Rebecca Cowley

OceanSITES ( http://www.oceansites.org ). Here, we provide a brief summary of the data quality control and procedures applied. The reader is referred to Cowley (2015 ; http://data.aodn.org.au/IMOS/public/ABOS/reports/ ) for further information on mooring design and quality control procedures. We use the quality controlled time series to produce the gridded velocity and temperature data used in this study. Processing and quality control of the data were completed using MATLAB routines ( Cowley 2015

Full access
M. Dolores Pérez-Hernández, Alonso Hernández-Guerra, Terrence M. Joyce, and Pedro Vélez-Belchí

://www.argo.net ) consists of a fleet of more than 3000 profilers across the World Ocean. These profilers drift at pressures of 1500 or 1000 dbar, and every 10 days they descend to 2000 dbar to measure salinity and temperature at discrete levels on the way up to the surface, where the recorded data are telemetered ( Roemmich et al. 2009 ). All available good-quality Argo data in the equatorial Indian Ocean (6°S–6°N, 38°–105°E) from January 2003 to March 2010 were downloaded in April 2010. Besides the real-time quality control

Full access