Duncan C. Wheeler and Sarah N. Giddings

Abstract

This manuscript presents several improvements to methods for despiking and measuring turbulent dissipation with acoustic Doppler velocimeters (ADVs). These include an improved inertial subrange fitting algorithm relevant to all experimental conditions, as well as other modifications designed to address failures of existing methods in the presence of large infragravity (IG) frequency bores and other intermittent, nonlinear processes. We provide a modified despiking algorithm, wavenumber spectrum calculation algorithm, and inertial subrange fitting algorithm that together produce reliable dissipation measurements, representative of turbulence over a 30-min interval, in the presence of IG frequency bores. We use a semi-idealized model to show that our spectrum calculation approach works substantially better than existing wave correction equations that rely on Gaussian-based velocity distributions. We also find that our inertial subrange fitting algorithm provides more robust results than existing approaches that rely on identifying a single best fit, and that this improvement is independent of environmental conditions. Finally, we perform a detailed error analysis to assist in future use of these algorithms and identify areas that need careful consideration. This error analysis uses error distribution widths to find, with 95% confidence, an average systematic uncertainty of ±15.2% and statistical uncertainty of ±7.8% for our final dissipation measurements. In addition, we find that small changes to ADV despiking approaches can lead to large uncertainties in turbulent dissipation and that further work is needed to ensure more reliable despiking algorithms.
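As a rough illustration of the multi-fit idea the abstract describes (a sketch, not the authors' published algorithm), the Python snippet below fits Kolmogorov's k^(-5/3) inertial-subrange form over every candidate wavenumber sub-range and combines the acceptable fits, rather than picking a single best fit. The constant C1d, the flatness tolerance, and the minimum segment length nmin are illustrative assumptions.

```python
import numpy as np

def dissipation_from_spectrum(k, Pk, C1d=0.49, nmin=8, flat_tol=0.3):
    """Estimate turbulent dissipation (eps) from a 1D wavenumber spectrum
    by fitting Kolmogorov's inertial-subrange form,
        P(k) = C1d * eps**(2/3) * k**(-5/3).
    Instead of a single best fit, every contiguous sub-range of at least
    nmin points is fitted and the median of acceptable fits is returned.
    """
    k, Pk = np.asarray(k), np.asarray(Pk)
    candidates = []
    for i in range(len(k) - nmin):
        for j in range(i + nmin, len(k) + 1):
            comp = Pk[i:j] * k[i:j] ** (5.0 / 3.0)  # compensated spectrum
            # In the inertial subrange the compensated spectrum is flat.
            if comp.std() / comp.mean() < flat_tol:
                candidates.append((comp.mean() / C1d) ** 1.5)
    return np.median(candidates) if candidates else np.nan
```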

Significance Statement

Turbulent mixing is a process in which random water motions irreversibly mix waters with different properties. This process is important to understand in estuaries because the extent of mixing of freshwater and saltwater inside an estuary alters its overall circulation and thus affects ecosystem health and the distribution of pollution or larvae in an estuary, among other things. Existing approaches to measuring turbulent dissipation, an important parameter for evaluating turbulent mixing, make assumptions that fail in the presence of certain processes, such as long-period breaking waves in shallow estuaries. We evaluate and improve data analysis techniques to account for such processes and accurately measure turbulent dissipation in shallow estuaries. Some of our improvements are also relevant to a broad array of coastal and oceanic conditions.

Steven M. Martinaitis, Scott Lincoln, David Schlotzhauer, Stephen B. Cocks, and Jian Zhang

Abstract

There are multiple reasons why a precipitation gauge might report erroneous observations. Systematic errors relating to the measuring apparatus or resulting from observational limitations due to environmental factors (e.g., wind-induced undercatch or wetting losses) can be quantified and potentially corrected within a gauge dataset. Other challenges can arise from instrumentation malfunctions, such as clogging, poor siting, and software issues. Instrumentation malfunctions are challenging to quantify, as most gauge quality control (QC) schemes focus on the current observation and not on whether the gauge has an inherent issue that would likely require maintenance. This study focuses on the development of a temporal QC scheme that identifies the likelihood of an instrumentation malfunction through the examination of hourly gauge observations and associated QC designations. The scheme classifies analyzed gauge performance into one of three categories: GOOD, SUSP, and BAD. It also accounts for, and provides an additional designation when, a significant percentage of gauge observations and associated hourly QC designations were influenced by meteorological factors (e.g., the inability to properly measure winter precipitation). Findings showed a consistent percentage of gauges classified as BAD through the running 7-day (2.9%) and 30-day (4.4%) analyses. Verification of select gauges demonstrated how the temporal QC algorithm captured different forms of instrumentation-based systematic errors that influenced gauge observations. Results from this study can benefit the identification of degraded performance at gauge sites prior to scheduled routine maintenance.
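A minimal sketch of what such a running temporal classification might look like. The GOOD/SUSP/BAD labels come from the abstract; the window length and the SUSP_FRAC/BAD_FRAC thresholds are hypothetical placeholders, not the study's actual criteria.

```python
import numpy as np

# Hypothetical thresholds; the study's actual criteria are more involved.
SUSP_FRAC, BAD_FRAC = 0.10, 0.25

def temporal_qc(hourly_flagged, window_hours=7 * 24):
    """Classify a gauge as GOOD, SUSP, or BAD from the fraction of its
    recent hourly observations that failed hourly QC.

    hourly_flagged: sequence of booleans, True where the hourly QC
    scheme flagged that observation as erroneous.
    """
    recent = np.asarray(hourly_flagged[-window_hours:], dtype=bool)
    frac = recent.mean() if recent.size else 0.0
    if frac >= BAD_FRAC:
        return "BAD"
    if frac >= SUSP_FRAC:
        return "SUSP"
    return "GOOD"
```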

Significance Statement

This study proposes a scheme that quality controls rain gauges based on their performance over a running history of hourly observational data and quality control flags, in order to identify gauges that likely have an instrumentation malfunction. Findings from this study show the potential for identifying gauges that are impacted by issues such as clogging, software errors, and poor gauge siting. This study also highlights the challenges of distinguishing between erroneous gauge observations caused by an instrumentation malfunction and erroneous observations resulting from an environmental factor that influences the gauge observation or its quality control classification, such as winter precipitation or virga.

Konstantin G. Rubinshtein and Inna M. Gubenko

Abstract

The article compares four lightning detection networks, provides a brief overview of the assimilation of lightning observations in numerical weather forecasting, and describes and illustrates the procedure used to assimilate lightning location and time in numerical weather forecasts. Absolute errors in 2-m air temperature, 2-m humidity, near-surface air pressure, 10-m wind speed, and precipitation are evaluated for 10 forecasts made in 2020 on days when intense thunderstorms were observed in the Krasnodar region of Russia. Average errors over the forecast area at 24, 48, and 72 h of the forecast decreased for all parameters when observed lightning data were assimilated. The predicted precipitation field configuration and intensity also became closer to the reference both in areas where thunderstorms were observed and in areas where none occurred.
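For concreteness, a trivial sketch of the verification metric involved: the domain-averaged absolute error at each lead time. All variable names in the usage comment are hypothetical.

```python
import numpy as np

def mean_absolute_error(forecast, reference):
    """Domain-averaged absolute error for one field at one lead time."""
    return float(np.mean(np.abs(np.asarray(forecast) - np.asarray(reference))))

# Hypothetical usage: compare control and lightning-assimilation runs of,
# say, 2-m temperature against reference analyses at each lead time.
# for lead in (24, 48, 72):
#     print(lead,
#           mean_absolute_error(t2m_control[lead], t2m_reference[lead]),
#           mean_absolute_error(t2m_assim[lead], t2m_reference[lead]))
```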

Katrina S. Virts and William J. Koshak

Abstract

Performance assessments of the Geostationary Lightning Mapper (GLM) are conducted via comparisons with independent observations from both satellite-based sensors and ground-based lightning detection (reference) networks. A key limitation of this evaluation is that the performance of the reference networks is both imperfect and imperfectly known, such that the true performance of GLM can only be estimated. Key GLM performance metrics such as detection efficiency (DE) and false alarm rate (FAR) retrieved through comparison with reference networks are affected by those networks’ own DE, FAR, and spatiotemporal accuracy, as well as the flash matching criteria applied in the analysis. This study presents a Monte Carlo simulation–based inversion technique that is used to quantify how accurately the reference networks can assess GLM performance, as well as suggest the optimal matching criteria for estimating GLM performance. This is accomplished by running simulations that clarify the specific effect of reference network quality (i.e., DE, FAR, spatiotemporal accuracy, and the geographical patterns of these attributes) on the retrieved GLM performance metrics. Baseline reference network statistics are derived from the Earth Networks Global Lightning Network (ENGLN) and the Global Lightning Dataset (GLD360). Geographic simulations indicate that the retrieved GLM DE is underestimated, with absolute errors ranging from 11% to 32%, while the retrieved GLM FAR is overestimated, with absolute errors of approximately 16% to 44%. GLM performance is most severely underestimated in the South Pacific. These results help quantify and bound the actual performance of GLM and the attendant uncertainties when comparing GLM to imperfect reference networks.
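A hedged Monte Carlo toy model of the inversion idea, greatly simplified relative to the paper's simulations: detections are treated as independent, spatiotemporal matching is assumed perfect, and all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def retrieved_glm_de(true_de=0.9, ref_de=0.8, ref_far=0.1, n=100_000):
    """Toy model of how an imperfect reference network biases the
    retrieved GLM detection efficiency (DE).

    n ground-truth flashes are detected independently by GLM (true_de)
    and by the reference network (ref_de); the reference additionally
    reports false flashes so that a fraction ref_far of its reports are
    false, and these by construction go unmatched by GLM.
    """
    glm = rng.random(n) < true_de
    ref = rng.random(n) < ref_de
    matched = np.sum(glm & ref)                     # GLM-reference matches
    n_false = int(ref_far / (1.0 - ref_far) * ref.sum())
    return matched / (ref.sum() + n_false)

# retrieved_glm_de() -> roughly 0.81, an underestimate of the true 0.9,
# consistent in sign with the bias described in the abstract.
```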

Ibrahim Ibrahim, Gregory A. Kopp, and David M. L. Sills

Abstract

The current study develops a variant of the velocity–azimuth display (VAD) method to retrieve thunderstorm peak event velocities using low-elevation WSR-88D radar scans. The main challenge pertains to the localized nature of thunderstorm winds, which complicates single-Doppler retrievals because it dictates the use of a limited spatial scale. Since VAD methods assume constant velocity in the fitted section, it is important that retrieved sections do not contain background flow. Accordingly, the current study proposes an image processing method to partition scans into regions, representing events and the background flow, that can be retrieved independently. The study compares the retrieved peak velocities to retrievals using another VAD method. The proposed technique is found to estimate peak event velocities that are closer to measured Automated Surface Observing System (ASOS) readings, making it more suitable for historical analysis. The study also compares the results of retrievals from over 2600 thunderstorm events from 19 radar–ASOS station combinations in which the station is less than 10 km from the radar. Comparisons of probability distributions of peak event velocities for ASOS readings and radar retrievals showed good agreement for stations within 4 km of the radar, while retrievals at more distant stations were biased high relative to ASOS velocities. The mean absolute error for velocity magnitude increases with height, ranging between 1.5 and 4.5 m s−1. A proposed correction based on the exponential trend of mean errors was shown to improve the probability distribution comparisons, especially for higher velocity magnitudes.
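For background, a minimal sketch of the classic VAD fit that such variants build on: a least-squares first harmonic fitted to radial velocities around (part of) a range ring. It assumes horizontally uniform flow over the fitted section, which is exactly why the paper partitions out localized thunderstorm winds first. The elevation default is an assumption.

```python
import numpy as np

def vad_first_harmonic(azimuth_deg, vr, elev_deg=0.5):
    """Least-squares fit of a first harmonic to radial velocities:
        vr(theta) ~= a0 + a1*cos(theta) + b1*sin(theta).
    Assumes horizontally uniform flow over the fitted section.
    """
    th = np.deg2rad(np.asarray(azimuth_deg))
    A = np.column_stack([np.ones_like(th), np.cos(th), np.sin(th)])
    (a0, a1, b1), *_ = np.linalg.lstsq(A, np.asarray(vr), rcond=None)
    # Horizontal speed follows from the harmonic amplitude, corrected for
    # beam elevation; direction follows from (a1, b1) under one's chosen
    # wind convention.
    speed = np.hypot(a1, b1) / np.cos(np.deg2rad(elev_deg))
    return speed, (a0, a1, b1)
```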

Douglas Cahl, George Voulgaris, and Lynn Leonard

Abstract

We assess the performance of three different algorithms for estimating surface ocean currents from two linear array HF radar systems. The delay-and-sum beamforming algorithm, commonly used with beamforming systems, is compared with two direction-finding algorithms: Multiple Signal Classification (MUSIC) and direction finding using beamforming (Beamscan). A 7-month dataset from two HF radar sites (CSW and GTN) on Long Bay, South Carolina (United States), is used to compare the different methods. The comparison is carried out at three locations (the midpoint along the baseline and two locations with in situ Eulerian current data available) representing different steering angles. Beamforming produces surface current data that show high correlation near the radar boresight (R² ≥ 0.79). At partially sheltered locations far from the radar boresight directions (59° and 48° for radar sites CSW and GTN, respectively), there is no correlation for CSW (R² = 0) and the correlation is reduced significantly for GTN (R² = 0.29). Beamscan performs similarly near the radar boresight (R² = 0.8 and 0.85 for CSW and GTN, respectively) and better than beamforming far from the radar boresight (R² = 0.52 and 0.32 for CSW and GTN, respectively). MUSIC's performance, after significant tuning, is similar near the boresight (R² = 0.78 and 0.84 for CSW and GTN, respectively) and, far from the boresight, worse than Beamscan but better than beamforming (R² = 0.42 and 0.27 for CSW and GTN, respectively). Comparisons at the midpoint (baseline comparison) show the largest performance difference between methods: beamforming (R² = 0.01) is the worst performer, followed by MUSIC (R² = 0.37), while Beamscan (R² = 0.76) performs best.
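A self-contained sketch of conventional (delay-and-sum) beam power for a uniform linear array; scanning this power over steering angles and picking peaks is the essence of the Beamscan approach compared above. The half-wavelength element spacing and the angle grid are assumptions, not the CSW/GTN configurations.

```python
import numpy as np

def beamscan_power(snapshots, d_over_lambda=0.5,
                   angles_deg=np.arange(-90, 91)):
    """Conventional beam power versus steering angle for a uniform
    linear array.

    snapshots: (n_antennas, n_snapshots) complex array of Doppler-bin
    data from the radar cross spectra.
    """
    n_ant = snapshots.shape[0]
    # Sample spatial covariance matrix across antennas.
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]
    power = []
    for ang in np.deg2rad(angles_deg):
        # Steering vector for a plane wave arriving from angle `ang`.
        a = np.exp(2j * np.pi * d_over_lambda * np.arange(n_ant) * np.sin(ang))
        power.append(np.real(a.conj() @ R @ a) / n_ant**2)
    return angles_deg, np.asarray(power)
```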

Xiaobo Wu, Guijun Han, Wei Li, Qi Shao, and Lige Cao

Abstract

Variation of the Kuroshio path south of Japan has an important impact on weather, climate, and ecosystems due to its distinct features. Motivated by the growing use of deep learning methods built on neural network architectures in areas where accurate oceanographic observations and reanalyses are available as reference data, we build four deep learning models based on the long short-term memory (LSTM) neural network, combined with the empirical orthogonal function (EOF) and complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN): the LSTM, EOF–LSTM, CEEMDAN–LSTM, and EOF–CEEMDAN–LSTM. Using these models, we conduct long-range predictions (120 days) of the Kuroshio path south of Japan based on a 50-yr ocean reanalysis and nearly 15 years of satellite altimeter data. We show that the EOF–CEEMDAN–LSTM performs best among the four models, attaining an anomaly correlation coefficient of approximately 0.739 and a root-mean-square error of 0.399° for the 120-day prediction of the Kuroshio path south of Japan. The hindcasts of the EOF–CEEMDAN–LSTM successfully reproduce the observed formation and decay of the Kuroshio large meander during 2004/05, as well as the formation of the latest large meander in 2017. Finally, we present predictions of the Kuroshio path south of Japan at 120-day lead time, which suggest that the Kuroshio will remain in the large meander state until November 2022.
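A skeletal outline of the hybrid pipeline, with only the EOF stage implemented (via SVD); the CEEMDAN and LSTM stages, and all variable names, are placeholders rather than the authors' configuration.

```python
import numpy as np

def eof_decompose(field, n_modes=10):
    """EOF decomposition via SVD. field: (time, space) array of the
    quantity describing the Kuroshio path; anomalies are taken here.
    Returns principal components (time, n_modes) and EOF patterns."""
    mean = field.mean(axis=0)
    U, s, Vt = np.linalg.svd(field - mean, full_matrices=False)
    pcs = U[:, :n_modes] * s[:n_modes]   # PC time series
    eofs = Vt[:n_modes]                  # spatial patterns
    return pcs, eofs, mean

def reconstruct(pcs, eofs, mean):
    """Rebuild the field from (possibly forecast) PCs and EOFs."""
    return pcs @ eofs + mean

# Sketch of the EOF-CEEMDAN-LSTM pipeline:
# 1. pcs, eofs, mean = eof_decompose(path_positions)
# 2. optionally decompose each PC with CEEMDAN into intrinsic mode functions
# 3. train an LSTM to extend each PC (or mode) 120 days ahead
# 4. reconstruct the predicted Kuroshio path with reconstruct(...)
```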

Yukio Kurihara

Abstract

Stripe noise is a common issue in sea surface temperatures (SSTs) retrieved from thermal infrared data obtained by satellite-based multidetector radiometers. We developed a bispectral filter (BSF) to reduce this stripe noise. The BSF combines a Gaussian filter with an optimal estimation method applied to the differences between the data obtained in the split window. A kernel function based on the physical processes of radiative transfer makes it possible to reduce stripe and random noise in retrieved SSTs without degrading the spatial resolution or introducing bias. The Second-Generation Global Imager (SGLI) is an optical sensor on board the Global Change Observation Mission–Climate (GCOM-C) satellite. We applied the BSF to SGLI data and validated the retrieved SSTs. The validation results demonstrate the effectiveness of the BSF, which reduced stripe noise in the retrieved SGLI SSTs without blurring SST fronts. It also improved the accuracy of the SSTs by about 0.04 K (about 13%) in terms of the robust standard deviation.
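A deliberately crude, Gaussian-only sketch of the destriping idea: smooth the split-window brightness-temperature difference across scan lines, where stripe noise is coherent, and feed the smoothed difference to the SST retrieval. The actual BSF couples the Gaussian filter with optimal estimation and a physically based kernel; the axis convention and sigma here are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def smooth_split_window_difference(bt11, bt12, sigma_lines=2.0):
    """Smooth the split-window difference BT11 - BT12 across detector
    lines (axis=0 assumed to be the along-track/line axis), reducing
    stripe noise in the difference term of a split-window SST equation
    while leaving the front-carrying BT11 field untouched.
    """
    diff = bt11 - bt12
    return gaussian_filter1d(diff, sigma=sigma_lines, axis=0)
```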

Significance Statement

This method reduces stripe noise and improves the accuracy of SST data with minimal compromise of spatial resolution. The method assumes a relationship between the brightness temperature and the split-window brightness temperature difference, based on the physics of atmospheric radiative transfer. This physical grounding provides an easy solution to a complex problem: although destriping generally requires a complex algorithm, our approach is based on a simple Gaussian filter and is easy to implement.
