Browse: Journal of Atmospheric and Oceanic Technology (all content), items 11–20 of 5,057
J.C. Hubbert, G. Meymaris, U. Romatschke, and M. Dixon

Abstract

Ground clutter filtering is an important and necessary step for quality control of ground-based weather radars. In this two-part paper, ground clutter mitigation is addressed using a time-domain regression filter. Clutter filtering is now widely accomplished with spectral processing, where the time series of data corresponding to a radar resolution volume is transformed with a discrete Fourier transform, after which the zero and near-zero velocity clutter components are eliminated by setting them to zero. Subsequently, for reflectivity, velocity, and spectrum width estimates, interpolation techniques are used to recover some of the power lost to the clutter filter, which has been shown to reduce bias. The spectral technique requires that the I (in-phase) and Q (quadrature) time series be windowed in order to reduce clutter power leakage away from zero and near-zero velocities. Unfortunately, window functions such as the Hamming, Hann, and Blackman attenuate the time series signal by 4.01, 4.19, and 5.23 dB for 64-point time series, respectively, and thereby effectively reduce the number of independent samples available for estimating the radar parameters of any underlying weather echo. Here in Part 1, a regression filtering technique is investigated, via simulated data, that does not require the use of such window functions and thus provides better weather signal statistics. In Part 2 (Hubbert et al. 2021) the technique is demonstrated using both S-Pol and NEXRAD data. It is shown that the regression filter rejects clutter as effectively as the spectral technique but has the distinct advantage that estimates of the radar variables are greatly improved. The technique is straightforward and can be executed in real time.
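
To make the contrast concrete, the sketch below (not the authors' implementation) illustrates a time-domain regression clutter filter on a simulated I/Q sequence: a low-order polynomial is fitted to the in-phase and quadrature components and subtracted, suppressing the near-zero-velocity clutter without applying any window function. The 64-sample length, polynomial order, and signal parameters are illustrative assumptions only.

```python
import numpy as np

# Minimal sketch (not the authors' implementation): remove ground clutter from a
# simulated I/Q time series by fitting and subtracting a low-order polynomial,
# i.e., a time-domain regression filter. The polynomial captures the slowly
# varying (zero / near-zero velocity) clutter; the residual retains the weather
# signal. All parameters below are illustrative assumptions.

rng = np.random.default_rng(0)
M = 64                                   # number of pulses (samples)
t = np.arange(M)

# Simulated clutter (strong, slowly varying) plus weather echo (Doppler shifted)
clutter = 50.0 * np.exp(1j * 0.01 * t)               # near-zero velocity
weather = 1.0 * np.exp(1j * 2 * np.pi * 0.2 * t)     # nonzero Doppler frequency
noise = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
iq = clutter + weather + 0.1 * noise

# Regression filter: fit a low-order polynomial to I and Q separately and
# subtract the fit. No window function is applied to the time series.
order = 4
coeffs_i = np.polyfit(t, iq.real, order)
coeffs_q = np.polyfit(t, iq.imag, order)
clutter_fit = np.polyval(coeffs_i, t) + 1j * np.polyval(coeffs_q, t)
filtered = iq - clutter_fit

print(f"power before filtering: {10 * np.log10(np.mean(np.abs(iq) ** 2)):.1f} dB")
print(f"power after filtering:  {10 * np.log10(np.mean(np.abs(filtered) ** 2)):.1f} dB")
```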

Restricted access
Travis Miles, Wayne Slade, and Scott Glenn

Abstract

Suspended particle size and concentration are critical parameters necessary to understand water quality, sediment dynamics, carbon flux, and ecosystem dynamics, among other ocean processes. In this study we detail the integration of a Sequoia Scientific, Inc., Laser In Situ Scattering and Transmissometry (LISST) sensor into a Teledyne Webb Research Slocum autonomous underwater glider. These sensors measure particle size, concentration, and beam attenuation by particles over size ranges from 1.00 to 500 μm at a sampling rate of 1 Hz. The combination of these two technologies provides the unique opportunity to measure particle characteristics persistently at specific locations or to survey regional domains with a single profiling sensor. In this study we present the sensor integration framework, detail quality assurance and quality control (QAQC) procedures, and provide a case study of storm-driven sediment resuspension and transport. Specifically, Rutgers glider RU28 was deployed with an integrated LISST-Glider for 18 days in September 2017. During this period it sampled the nearshore environment off coastal New Jersey, capturing full-water-column sediment resuspension during a coastal storm event. A novel method for in situ background corrections is demonstrated and used to mitigate long-term biofouling of the sensor windows. Additionally, we present a method for removing Schlieren-contaminated time periods using coincident conductivity–temperature–depth (CTD), fluorometer, and optical backscatter data. The combination of LISST sensors and autonomous platforms has the potential to revolutionize our ability to capture suspended particle characteristics throughout the world’s oceans.
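
The abstract does not spell out the Schlieren screening criteria; the snippet below is one plausible heuristic, not the paper's QAQC method. It flags samples where the CTD shows strong density stratification (a condition favoring Schlieren, i.e., refractive-index gradients) while optical backscatter shows no corresponding particle signal. The thresholds and the synthetic profiles are assumptions.

```python
import numpy as np

# Hypothetical Schlieren screen (not the paper's QAQC method): flag LISST samples
# where strong density stratification coincides with low optical backscatter,
# suggesting scattering by refractive-index gradients rather than particles.
# All thresholds below are illustrative assumptions.

def flag_schlieren(depth, density, backscatter,
                   grad_thresh=0.05,   # kg m^-3 per m, assumed threshold
                   bb_thresh=1e-3):    # backscatter units, assumed threshold
    """Return a boolean mask of Schlieren-suspect samples."""
    drho_dz = np.abs(np.gradient(density, depth))
    return (drho_dz > grad_thresh) & (backscatter < bb_thresh)

# Example with synthetic profiles: a sharp pycnocline and a bottom nepheloid layer
depth = np.linspace(0, 40, 200)
density = 1021 + 0.08 * depth + 0.5 * np.tanh((depth - 20) / 2)
backscatter = 5e-4 + 2e-3 * np.exp(-((depth - 35) / 3) ** 2)
mask = flag_schlieren(depth, density, backscatter)
print(f"{mask.sum()} of {mask.size} samples flagged as Schlieren-suspect")
```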

Restricted access
Dion Häfner, Johannes Gemmrich, and Markus Jochum

Abstract

The occurrence of extreme (rogue) waves in the ocean is for the most part still shrouded in mystery, as the rare nature of these events makes them difficult to analyze with traditional methods. Modern data mining and machine learning methods provide a promising way out, but they typically rely on the availability of massive amounts of well-cleaned data.

To facilitate the application of such data-hungry methods to surface ocean waves, we developed FOWD, a freely available wave dataset and processing framework. FOWD describes the conversion of raw observations into a catalogue that maps characteristic sea state parameters to observed wave quantities. Specifically, we employ a running window approach that respects the non-stationary nature of the oceans, and extensive quality control to reduce bias in the resulting dataset.

We also supply a reference Python implementation of the FOWD processing toolkit, which we use to process the entire CDIP buoy data catalogue, containing over 4 billion waves. In a first experiment, we find that, when the full elevation time series is available, surface elevation kurtosis and maximum wave height are the strongest univariate predictors of rogue wave activity. When only a spectrum is given, crest-trough correlation, spectral bandwidth, and mean period fill this role.
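
As a concrete illustration of the kind of mapping FOWD performs (a simplified sketch, not the FOWD toolkit itself), the snippet below computes two of the quantities named above, surface elevation kurtosis and the maximum wave height normalized by significant wave height, from a synthetic elevation record using a running window. The 30-minute window, the sampling rate, and the zero-downcrossing wave definition are assumptions.

```python
import numpy as np
from scipy.stats import kurtosis

# Simplified sketch (not the FOWD toolkit): compute surface-elevation kurtosis
# and the maximum zero-crossing wave height, normalized by significant wave
# height, over a running window. Window length and sampling rate are assumed.

fs = 1.28                       # sampling frequency (Hz), assumed
window = int(30 * 60 * fs)      # 30-minute running window

rng = np.random.default_rng(1)
eta = rng.standard_normal(int(3 * 3600 * fs))   # synthetic surface elevation

def wave_heights(eta_win):
    """Zero-downcrossing wave heights within one window."""
    crossings = np.where((eta_win[:-1] >= 0) & (eta_win[1:] < 0))[0]
    return np.array([eta_win[a:b].max() - eta_win[a:b].min()
                     for a, b in zip(crossings[:-1], crossings[1:])])

for start in range(0, len(eta) - window + 1, window):
    seg = eta[start:start + window]
    hs = 4 * seg.std()                      # significant wave height estimate
    heights = wave_heights(seg)
    print(f"kurtosis={kurtosis(seg, fisher=False):.2f}  "
          f"Hmax/Hs={heights.max() / hs:.2f}")
```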

Restricted access
Haifeng Zhang, Alexander Ignatov, and Dean Hinshaw

Abstract

In situ sea surface temperature (SST) measurements play a critical role in the calibration/validation (Cal/Val) of satellite SST retrievals and in ocean data assimilation. However, their quality is not always optimal, and proper quality control (QC) is required before they can be used with confidence. The in situ SST Quality Monitor (iQuam) system was established at the National Oceanic and Atmospheric Administration (NOAA) in 2009, initially to support the Cal/Val of NOAA satellite SST products. It collects in situ SST data from multiple sources, performs uniform QC, monitors the QC’ed data online, and distributes them to users. In this study, the iQuam QC is compared with other QC methods available for some of the in situ data ingested in iQuam. Overall, the iQuam QC performs well on daily-to-monthly time scales over most of the global ocean and under a wide variety of environmental conditions. However, it may be less accurate in the daytime when a pronounced diurnal cycle is present, and in dynamic regions, due to its strong reliance on the “reference SST check”, which employs daily low-resolution level 4 (L4) analyses with no diurnal cycle resolved. The iQuam “performance history check”, applied to all in situ platforms, is an effective alternative to the customary “black/gray” lists, which are available only for some platforms (e.g., drifters and Argo floats). In the future, the iQuam QC will be upgraded (e.g., using improved reference fields with enhanced temporal and spatial resolution), and more comparisons with external QC methods will be performed to learn and adopt the best QC practices.
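
To illustrate the kind of test referred to above as a reference SST check (a sketch only, not iQuam's actual implementation or thresholds), the snippet below flags in situ SSTs that deviate from a daily L4 reference analysis by more than a chosen tolerance. The 3 K threshold and the sample values are assumptions.

```python
import numpy as np

# Illustrative "reference SST check" (not iQuam's implementation or thresholds):
# flag in situ SSTs that deviate from a daily level-4 (L4) reference analysis
# by more than a chosen tolerance. The 3 K threshold is an assumption.

def reference_sst_check(sst_insitu, sst_l4_reference, max_dev_kelvin=3.0):
    """Return True for observations that pass the check."""
    deviation = np.abs(np.asarray(sst_insitu) - np.asarray(sst_l4_reference))
    return deviation <= max_dev_kelvin

# Example: the third observation deviates too far from the reference (values in K)
obs = [293.2, 290.8, 299.5]
ref = [293.0, 291.1, 294.0]
print(reference_sst_check(obs, ref))   # [ True  True False]
```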

Restricted access
Tara Howatt, Stephanie Waterman, and Tetjana Ross

Abstract

Turbulence plays a key role in many oceanic processes, but a shortage of turbulence observations impedes its exploration. Parameterizations of turbulence applied to readily available CTD data can be useful in expanding our understanding of the space-time variability of turbulence. Typically tested and applied to shipboard data, these parameterizations have not been rigorously tested on data collected by underwater gliders, which show potential to observe turbulence in conditions that ships cannot. Using data from a 10-day glider survey in a coastal shelf environment, we compare estimates of turbulent dissipation from the finescale parameterization and the Thorpe scale method to those estimated from microstructure observations collected on the same glider platform. We find that the finescale parameterization captures the magnitude and statistical distribution of dissipation but cannot resolve spatiotemporal features at this relatively shallow water depth. In contrast, the Thorpe scale method more successfully characterizes the spatiotemporal distribution of turbulence; however, it overestimates the magnitude of dissipation, largely because CTD noise and the typical glider CTD sampling frequency of 0.5 Hz limit the detectable density overturn size. Despite these limitations, turbulence parameterizations provide a viable opportunity to use CTD data collected by the multitude of gliders sampling the ocean to develop greater insight into the space-time variability of ocean turbulence and the role of turbulence in oceanic processes.
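
For reference, the Thorpe scale method mentioned above can be outlined as follows (a schematic sketch, not the authors' processing code): sort a measured density profile into its stable configuration, compute the Thorpe displacements and their rms (the Thorpe scale L_T), and estimate dissipation from the Ozmidov-Thorpe relation epsilon ~ 0.64 L_T^2 N^3. The proportionality constant and the synthetic profile are assumptions.

```python
import numpy as np

# Schematic Thorpe-scale dissipation estimate (not the authors' processing code).
# Steps: sort the density profile, compute Thorpe displacements, take their rms
# as the Thorpe scale L_T, and estimate eps ~ 0.64 * L_T**2 * N**3 (Ozmidov-
# Thorpe relation with a commonly used constant; an assumption here).

g = 9.81          # m s^-2
rho0 = 1025.0     # reference density, kg m^-3

def thorpe_dissipation(depth, density):
    order = np.argsort(density)               # indices of the stably sorted profile
    displacements = depth[order] - depth      # Thorpe displacements (m)
    L_T = np.sqrt(np.mean(displacements**2))  # Thorpe scale (m)

    # Buoyancy frequency N from the sorted (stable) profile
    drho_dz = np.gradient(density[order], depth)
    N2 = np.maximum(g / rho0 * drho_dz, 0.0)
    N = np.sqrt(np.mean(N2))

    return 0.64 * L_T**2 * N**3               # dissipation (W kg^-1)

# Synthetic profile with an overturn between 18 and 22 m depth
depth = np.linspace(0, 40, 400)
density = 1024 + 0.01 * depth
overturn = (depth > 18) & (depth < 22)
density[overturn] = density[overturn][::-1]
print(f"epsilon ~ {thorpe_dissipation(depth, density):.2e} W/kg")
```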

Restricted access
Ganesh Gopalakrishnan, Bruce D. Cornuelle, Matthew R. Mazloff, Peter F. Worcester, and Matthew A. Dzieciuch

Abstract

The 2010–2011 North Pacific Acoustic Laboratory (NPAL) Philippine Sea experiment measured travel times between six acoustic transceiver moorings in a 660-km-diameter ocean acoustic tomography array in the Northern Philippine Sea (NPS). The travel-time series compare favorably with travel times computed for a yearlong series of state estimates produced for this region using the Massachusetts Institute of Technology general circulation model–Estimating the Circulation and Climate of the Ocean four-dimensional variational (MITgcm-ECCO 4DVAR) assimilation system, constrained by satellite sea surface height (SSH) and sea surface temperature (SST) observations and by Argo temperature and salinity profiles. Fluctuations in the computed travel times largely match the fluctuations in the measurements caused by the intense mesoscale eddy field in the NPS, providing a powerful test of both the observations and the state estimates. The computed travel times tend to be shorter than the measured travel times, however, reflecting a warm bias in the state estimates. After the travel times were processed to remove tidal signals and extract the low-frequency variability, the differences between the measured and computed travel times were used, in addition to the SSH, SST, and Argo temperature and salinity observations, to further constrain the model and generate improved state estimates. Assimilating the travel times reduced the misfit between the measured and computed travel times without increasing the misfits with the other assimilated observations. The state estimates that used the travel times are more consistent with temperature measurements from an independent oceanographic mooring than those that did not.
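
As a schematic illustration of why acoustic travel times constrain the ocean state (not the NPAL processing itself), the snippet below uses the standard first-order relation dt = -integral of dc / c^2 along a fixed ray path: a warm bias raises the sound speed and therefore shortens the computed travel time, as noted above. The background profile and perturbation are illustrative assumptions.

```python
import numpy as np

# Schematic travel-time sensitivity (not the NPAL processing): along a fixed ray
# path, a sound-speed perturbation dc changes the travel time by approximately
#   dt = -integral( dc / c^2 ) ds.
# The background sound speed and the +0.5 m/s perturbation are assumptions.

path_length = 660e3                        # m, roughly the array diameter
s = np.linspace(0.0, path_length, 2000)    # distance along the ray path
c0 = 1500.0 + 20.0 * np.sin(np.pi * s / path_length)   # background sound speed (m/s)
dc = 0.5 * np.ones_like(s)                 # warm-bias-like perturbation (m/s)

t0 = np.trapz(1.0 / c0, s)                 # reference travel time
dt = -np.trapz(dc / c0**2, s)              # first-order travel-time change
print(f"reference travel time: {t0:.2f} s, perturbation: {dt * 1e3:.1f} ms")
```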

Restricted access
Xia Pengfei, Ye Shirong, Xu Caijun, and Jiang Weiping

Abstract

Tropospheric hydrostatic delay is one of the major sources of error in Global Navigation Satellite System (GNSS) navigation and positioning, and an important parameter in GNSS meteorology. This work first proposes a new method of computing the zenith hydrostatic delay (ZHD) based on precipitable water vapor (PWV), using radiosonde data. Next, using these calculations as benchmarks, the performance of three empirical ZHD models and three ZHD integral models in China is assessed with 8 years (2010–2017) of radiosonde data recorded at 75 stations across China. Finally, we provide a new, revised ZHD model applicable to China and validate its performance using radiosonde data collected in China in 2018. The statistical results indicate that the new model estimates the ZHD with an accuracy of several millimeters. Owing to its performance and simplicity, this new model is shown to be the optimal ZHD model for use in China.
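
For context (this is not the revised model proposed in the paper), a widely used empirical ZHD formula is the Saastamoinen model, which maps surface pressure, latitude, and station height to ZHD; the sketch below implements it. The sample inputs are illustrative.

```python
import numpy as np

# Saastamoinen zenith hydrostatic delay (ZHD), a standard empirical model shown
# for context only -- it is not the revised model proposed in the paper.
#   ZHD [m] = 0.0022768 * P / (1 - 0.00266*cos(2*lat) - 0.00028*h_km)
# with surface pressure P in hPa, latitude lat, and station height h in km.

def saastamoinen_zhd(pressure_hpa, lat_deg, height_m):
    lat = np.radians(lat_deg)
    f = 1.0 - 0.00266 * np.cos(2.0 * lat) - 0.00028 * (height_m / 1000.0)
    return 0.0022768 * pressure_hpa / f   # metres

# Example: a midlatitude station at 100 m elevation (illustrative values)
print(f"ZHD = {saastamoinen_zhd(1013.25, 35.0, 100.0):.4f} m")   # roughly 2.3 m
```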

Restricted access
Naoki Hirose, Tianran Liu, Katsumi Takayama, Katsuto Uehara, Takeshi Taneda, and Young Ho Kim

Abstract

This study clarifies the necessity of an extraordinarily large coefficient of vertical viscosity for dynamical ocean modeling in a shallow and narrow strait with complex bathymetry. Sensitivity experiments and objective analyses imply that the background momentum viscosity is on the order of 100 cm²/s, while the tracer diffusivity estimates are on the order of 0.1 cm²/s. The physical interpretation of these estimates is also discussed in the last part of this paper. To obtain reliable solutions, this study introduces the cyclic application of the dynamical response to each parameter, minimizing the number of long-term sensitivity experiments required. The recycling Green’s function method yields weaker bottom friction and an enhanced latent heat flux simultaneously with the increased viscosity in high-resolution modeling of the Tsushima/Korea Strait.
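
The Green's function approach referred to above can be outlined as follows (a generic sketch, not the authors' recycling scheme): run one perturbation experiment per parameter, assemble the model responses into a sensitivity matrix G, and solve a weighted least-squares problem for the parameter adjustments that best fit the observations. The matrices below are purely illustrative.

```python
import numpy as np

# Generic Green's function parameter estimation (a sketch, not the authors'
# "recycling" scheme). Each column of G is the model response to a unit
# perturbation of one parameter (e.g., vertical viscosity, bottom friction,
# a latent-heat-flux coefficient). The optimal adjustments eta minimize the
# observation misfit in a weighted least-squares sense:
#   eta = (G^T R^-1 G)^-1 G^T R^-1 (y_obs - y_ctrl)

rng = np.random.default_rng(0)
n_obs, n_param = 50, 3

G = rng.standard_normal((n_obs, n_param))          # sensitivities (illustrative)
eta_true = np.array([1.5, -0.7, 0.3])              # "true" parameter adjustments
y_ctrl = rng.standard_normal(n_obs)                # control-run output
y_obs = y_ctrl + G @ eta_true + 0.05 * rng.standard_normal(n_obs)

R_inv = np.eye(n_obs)                              # observation-error weights
A = G.T @ R_inv @ G
b = G.T @ R_inv @ (y_obs - y_ctrl)
eta = np.linalg.solve(A, b)
print("estimated adjustments:", np.round(eta, 2))
```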

Restricted access
Stephen S. Leroy, Chi O. Ao, Olga P. Verkhoglyadova, and Mayra I. Oyola

Abstract

Bayesian interpolation has previously been proposed as a strategy for constructing maps of radio occultation (RO) data, but that proposition did not consider the diurnal dimension of RO data. In this work, the basis functions of Bayesian interpolation are extended into the domain of the diurnal cycle, thus enabling monthly mapping of radio occultation data in synoptic time and analysis of the atmospheric tides. The basis functions are spherical harmonics multiplied by sinusoids in the diurnal cycle, up to arbitrary spherical harmonic degree and diurnal cycle harmonic. Bayesian interpolation requires a regularizer to impose smoothness on the fits it produces, thereby preventing the overfitting of data. In this work, a formulation for the regularizer is proposed, and the most probable values of its parameters are determined. Special care is required when known gaps in the sampling of the diurnal cycle occur, in order to prevent the false detection of statistically significant high-degree harmonics of the diurnal cycle in the atmosphere. Finally, this work probes the ability of Bayesian interpolation to generate a valid uncertainty analysis of the fit. The postfit residuals of Bayesian interpolation are dominated not by measurement noise but by unresolved variability in the atmosphere, which is statistically nonuniform across the globe, thus violating the central assumption of Bayesian interpolation. The problem is ameliorated by using Bayesian interpolation to construct maps that partially resolve the temporal variability of the atmosphere, one map for approximately every 3 days of RO data.
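
To illustrate the basis construction described above (an illustration only, not the authors' code), the snippet below builds a design matrix of spherical harmonics in longitude and latitude multiplied by sinusoids in local solar time and fits synthetic observations with a simple ridge penalty standing in for the Bayesian regularizer. The truncation degrees and penalty strength are assumptions.

```python
import numpy as np
from scipy.special import sph_harm

# Sketch of the basis construction described above (not the authors' code):
# basis functions are spherical harmonics multiplied by diurnal sinusoids, and
# the fit uses a ridge penalty as a stand-in for the Bayesian regularizer.
# Truncations L, K and the penalty lam are illustrative assumptions.

L, K, lam = 4, 2, 1e-2

def design_matrix(lon_deg, lat_deg, hour):
    theta = np.radians(lon_deg)             # azimuth
    phi = np.radians(90.0 - lat_deg)        # colatitude
    cols = []
    for n in range(L + 1):
        for m in range(-n, n + 1):
            Y = np.real(sph_harm(m, n, theta, phi))
            for k in range(K + 1):
                cols.append(Y * np.cos(2 * np.pi * k * hour / 24.0))
                if k > 0:
                    cols.append(Y * np.sin(2 * np.pi * k * hour / 24.0))
    return np.column_stack(cols)

# Synthetic "RO observations": random locations/times, a smooth field plus noise
rng = np.random.default_rng(0)
lon, lat, hr = (rng.uniform(0, 360, 3000), rng.uniform(-90, 90, 3000),
                rng.uniform(0, 24, 3000))
y = (np.cos(np.radians(lat)) * (1 + 0.3 * np.cos(2 * np.pi * hr / 24))
     + 0.1 * rng.standard_normal(3000))

A = design_matrix(lon, lat, hr)
coef = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)
rms = float(np.sqrt(np.mean((A @ coef - y) ** 2)))
print(f"fitted {A.shape[1]} coefficients; rms residual = {rms:.3f}")
```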

Restricted access
Guangyao Dai, Xiaoye Wang, Kangwen Sun, Songhua Wu, Xiaoquan Song, Rongzhong Li, Jiaping Yin, and Xitao Wang

Abstract

A practical method for instrument calibration and aerosol optical property retrieval based on a coherent Doppler lidar (CDL) and a sun photometer is presented in this paper. To verify its feasibility and accuracy, the method is applied in three field experiments in 2019 and 2020. In this method, multiwavelength (440, 670, 870, and 1020 nm) aerosol optical depths (AODs) from sun-photometer measurements are used to estimate the AOD at 1550 nm and to calibrate the integrated CDL backscatter signal. The method is then validated by comparing the calibrated AOD at 1550 nm retrieved from the CDL signal with that from the sun-photometer measurements. Good agreement is found, with a correlation of 0.96, an RMSE of 0.0085, and a mean relative error of 22%. The results of these three experiments verify that the sun photometer is an effective reference instrument for calibrating the CDL return signal and that aerosol optical property measurement with a CDL is feasible. This approach is expected to promote studies of aerosol flux and transport mechanisms in the planetary boundary layer using the widely deployed CDLs.
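
The abstract does not give the extrapolation details; one common way to estimate AOD at 1550 nm from multiwavelength sun-photometer AODs (shown below as a sketch, not necessarily the authors' procedure) is to fit an Ångström exponent to the measured wavelengths and extrapolate the power law to 1550 nm. The sample AOD values are illustrative.

```python
import numpy as np

# Sketch (not necessarily the authors' procedure): estimate AOD at 1550 nm by
# fitting the Angstrom power law  tau(lambda) = beta * lambda**(-alpha)  to the
# sun-photometer AODs at 440, 670, 870, and 1020 nm, then extrapolating.
# The sample AOD values below are illustrative assumptions.

wavelengths_nm = np.array([440.0, 670.0, 870.0, 1020.0])
aod = np.array([0.32, 0.21, 0.16, 0.13])          # example sun-photometer AODs

# Linear fit in log-log space: ln(tau) = ln(beta) - alpha * ln(lambda)
slope, intercept = np.polyfit(np.log(wavelengths_nm), np.log(aod), 1)
alpha, beta = -slope, np.exp(intercept)

aod_1550 = beta * 1550.0 ** (-alpha)
print(f"Angstrom exponent alpha = {alpha:.2f}, "
      f"estimated AOD(1550 nm) = {aod_1550:.3f}")
```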

Open access