Kaya Kanemaru, Toshio Iguchi, Takeshi Masaki, Naofumi Yoshida, and Takuji Kubota

Abstract

We analyze the calibration stability of the 17-yr precipitation radar (PR) data on board the Tropical Rainfall Measuring Mission (TRMM) satellite in order to develop a precipitation climate record from the spaceborne precipitation radar data of TRMM and subsequent satellite missions. Since the PR measures the normalized radar cross section (NRCS) over the ocean surface, the temporal change in the NRCS, whose variability is insensitive to the sea surface wind, is regarded as the temporal change of the PR calibration. The temporal change of the PR calibration in TRMM, version 7, is found to be 0.19 dB decade⁻¹ from 1998 to 2013. A simple adjustment for this calibration change is applied to evaluate the NRCS time series and the near-surface precipitation trend within the latitudinal band between 35°S and 35°N. The NRCS time series at nadir and off-nadir are uncorrelated before the calibration adjustment but correlated after it. The 0.19 dB decade⁻¹ change of the PR calibration causes an overestimation of 0.08 mm day⁻¹ decade⁻¹ (2.9% decade⁻¹) in the linear trend of the near-surface precipitation. Even after the adjustment, agreement of the results among the satellite products depends on the analysis period; the temporal stability of the data quality is therefore also important for a plausible trend analysis. The reprocessing of the PR data in TRMM, version 8 (or later), takes into account the temporal adjustment of the calibration change based upon the results of this study, which can provide more credible data for long-term precipitation analysis.
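
As a rough illustration of the adjustment described above, the sketch below removes an assumed linear calibration drift from a monthly NRCS anomaly series before fitting a trend. Only the 0.19 dB decade⁻¹ value comes from the abstract; the synthetic data and helper names are illustrative, not the authors' code.

```python
# Minimal sketch (not the authors' code): remove an assumed linear calibration
# drift from a monthly NRCS anomaly time series before computing a trend.
# The 0.19 dB per decade value comes from the abstract; the synthetic data
# below are purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1998, 2014, 1 / 12)             # monthly samples, 1998-2013
drift_db_per_decade = 0.19                        # apparent calibration change
true_nrcs = rng.normal(0.0, 0.1, years.size)      # hypothetical stable NRCS (dB)
measured = true_nrcs + drift_db_per_decade * (years - years[0]) / 10.0

def linear_trend_per_decade(t, y):
    """Least-squares slope expressed per decade."""
    slope, _ = np.polyfit(t, y, 1)
    return slope * 10.0

print("trend before adjustment:", linear_trend_per_decade(years, measured))
adjusted = measured - drift_db_per_decade * (years - years[0]) / 10.0
print("trend after adjustment: ", linear_trend_per_decade(years, adjusted))
```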

Significance Statement

The stability of long-term data is very important for climate research, so temporal calibration changes in the sensor must be accounted for. In this study, we investigate the calibration stability of the TRMM PR data and evaluate its impact on precipitation trend analysis. The temporal change of the PR calibration is estimated to be 0.19 dB decade⁻¹. Compensating for this change improves the consistency of precipitation trend analysis between the PR and other precipitation datasets. The reprocessed PR data provide more reliable data for long-term precipitation analysis.

Open access
Kazuya Takami, Rimpei Kamamoto, Kenji Suzuki, Kosei Yamaguchi, and Eiichi Nakakita

Abstract

The density of newly fallen snow ρ_N is an important parameter for assessing accumulated snowfall depth. We examined the relationships between polarimetric parameters of X-band radar and ρ_N in dry snow cases with ground temperatures below 0°C. Our study was based on observations in Niigata Prefecture, Japan, along the coastal region of the Sea of Japan. This region is subject primarily to sea-effect snow during the winter monsoon season, and convective clouds and rimed snow are common. We assumed that snow particles that accumulated on the ground originated from above the altitude at which the temperature is −15°C, and we focused on the ratio of the specific differential phase K_DP to radar reflectivity Z_h, which is influenced by both aspect ratio and inverse particle size. We found that K_DP/Z_h at the altitude with a temperature of −15°C exhibited a greater magnitude for lower ρ_N values, and its correlation coefficient was the best among the polarimetric parameters that we examined. The difference in ice crystal flatness is highlighted rather than the difference in size because aggregation growth has not progressed at this altitude. On the basis of this result, we propose an empirical relationship between K_DP/Z_h at the −15°C altitude and ρ_N on the ground, thereby facilitating the estimation of snowfall depth by combining the estimated ρ_N with the liquid-equivalent snowfall rate from, for example, Z_h or K_DP.
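
The sketch below shows the general form such an estimator could take: evaluate K_DP/Z_h with reflectivity in linear units and map the ratio to ρ_N through a power law. The coefficients and the function names are placeholders for illustration only; they are not the empirical relationship derived in the paper.

```python
# Illustrative sketch only: estimate new-snow density from the ratio K_DP/Z_h
# at the -15 degC level.  The power-law coefficients a and b are placeholders,
# not the empirical fit derived in the paper.
import numpy as np

def kdp_zh_ratio(kdp_deg_per_km, zh_dbz):
    """Ratio of specific differential phase to reflectivity in linear units."""
    zh_linear = 10.0 ** (zh_dbz / 10.0)          # dBZ -> mm^6 m^-3
    return kdp_deg_per_km / zh_linear

def new_snow_density(kdp_deg_per_km, zh_dbz, a=10.0, b=-0.3):
    """Hypothetical rho_N = a * (K_DP/Z_h)^b [kg m^-3]; a and b are placeholders."""
    return a * kdp_zh_ratio(kdp_deg_per_km, zh_dbz) ** b

# Example: values taken at the altitude where the temperature is -15 degC.
print(new_snow_density(kdp_deg_per_km=0.2, zh_dbz=20.0))
```

With b < 0, a larger K_DP/Z_h gives a lower ρ_N, consistent with the dependence reported in the abstract.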

Significance Statement

This study aims to estimate the density of newly fallen (just-accumulated) snow from polarimetric radar observations. Understanding the newly fallen snow density will help to determine the exact snowfall depth. Focusing on polarimetric parameters at an altitude with a temperature of −15°C, we conducted radar and ground-based observations of snow particles and found that the newly fallen snow density of dry snow can be estimated. We were able to highlight the difference in ice crystal flatness before aggregation growth progressed by focusing on higher altitudes.

Open access
Aaron C. McCutchan, John D. Horel, and Sebastian W. Hoch

Abstract

Of the 45 radars composing the Terminal Doppler Weather Radar (TDWR) network, 21 are located in areas of complex terrain. Their mission to observe low-level wind shear at major airports prone to strong shear-induced accidents puts them in an ideal position to fill critical boundary layer observation gaps within the NEXRAD network in these regions. Retrievals such as velocity azimuth display and velocity volume processing (VVP) are used to create time–height profiles of the boundary layer from radar conical scans, but they assume that a wide area around the radar is horizontally homogeneous, an assumption rarely met in regions of complex terrain. This paper introduces a VVP retrieval with limited radius to make these profiling techniques informative for flows affected by topography. These retrievals can be applied to any operational radar to help examine critical boundary layer processes. VVP retrievals were derived from the TDWR serving Salt Lake City International Airport (TSLC) during a summertime high-ozone period. These observations highlighted thermally driven circulations and variations in boundary layer depth at high vertical and temporal resolution and provided insight into their influence on air quality.
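
To make the retrieval concrete, the sketch below fits the standard zeroth-order VVP model (a uniform wind vector) to radial velocities from one conical scan, restricted to gates within a limited radius of the radar. The 10-km radius and the input arrays are assumptions for illustration; this is not the authors' implementation.

```python
# Minimal sketch of a zeroth-order VVP fit (uniform wind u, v, w) to radial
# velocities from one conical scan, keeping only gates within a limited radius.
import numpy as np

def vvp_uniform_wind(vr, az_deg, elev_deg, rng_km, max_radius_km=10.0):
    """Least-squares (speed, direction, w) from 1D per-gate arrays."""
    vr, az_deg, elev_deg, rng_km = map(np.asarray, (vr, az_deg, elev_deg, rng_km))
    horiz_range = rng_km * np.cos(np.radians(elev_deg))
    keep = np.isfinite(vr) & (horiz_range <= max_radius_km)
    az = np.radians(az_deg[keep])
    el = np.radians(elev_deg[keep])
    # Radial-velocity model: vr = u sin(az) cos(el) + v cos(az) cos(el) + w sin(el)
    design = np.column_stack([np.sin(az) * np.cos(el),
                              np.cos(az) * np.cos(el),
                              np.sin(el)])
    (u, v, w), *_ = np.linalg.lstsq(design, vr[keep], rcond=None)
    speed = np.hypot(u, v)
    direction = np.degrees(np.arctan2(-u, -v)) % 360.0   # meteorological convention
    return speed, direction, w
```

Repeating this fit scan by scan and elevation by elevation yields the time-height profiles discussed above.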

Significance Statement

Residents in many urban areas of the United States are exposed to elevated ozone concentrations during the summer months. In complex terrain, thermally driven circulations and terrain-forced flows affect chemical processes by modulating mixing and transport. A novel technique to monitor local boundary layer conditions on small horizontal length scales was applied to data from the Terminal Doppler Weather Radar located near Salt Lake City International Airport during a multiday high ozone event, and effects of these flows on ozone concentrations are illustrated. This technique can be applied to other operational weather radars to create long-term and real-time records of near-surface processes at high vertical and temporal resolution.

Open access
Benjamin C. Trabing, K. Hilburn, S. Stevenson, K. D. Musgrave, and M. DeMaria

Abstract

The Geostationary Lightning Mapper (GLM) has provided unprecedented observations of total lightning since becoming operational in 2017. The potential for GLM observations to be used for forecasting and analyzing tropical cyclone (TC) structure and intensity has been complicated by inconsistencies in the GLM data arising from a number of artifacts. The algorithm that processes raw GLM data has improved with time; however, the need for a consistent long-term dataset has motivated the development of quality control (QC) techniques to help remove clear artifacts such as blooming events, spurious false lightning, “bar” effects, and sun glint. Simple QC methods are applied that include scaled maximum energy thresholds and minima in the variance of lightning group area and group energy. QC and anomaly detection methods based on machine learning (ML) are also explored. Each QC method successfully removes artifacts in the GLM observations while maintaining the fidelity of the GLM observations within TCs. As the GLM processing algorithm has improved with time, the amount of QC-flagged lightning within 100 km of Atlantic TCs is reduced from 70% during 2017 to 10% in 2018 and 2% during 2021. These QC methods are relevant to the design of ML-based forecasting techniques, which could pick up on artifacts rather than the signal of interest in TCs if QC were not applied beforehand.
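
The sketch below shows what threshold-style QC of this kind can look like: a scaled maximum-energy check plus minimum-variance checks on group area and group energy. All threshold values, column names, and window lengths are placeholders, not the values used in the study.

```python
# Sketch of simple threshold-style QC flags for GLM group data, in the spirit
# of the screening described above.  Thresholds and column names are
# placeholders, not those used in the study.
import numpy as np
import pandas as pd

def qc_flag_groups(groups: pd.DataFrame,
                   max_energy_scale=5.0,
                   min_area_var=1e-3,
                   min_energy_var=1e-3) -> pd.Series:
    """Return True for groups that look like artifacts (to be removed)."""
    energy = groups["group_energy"]
    scaled_max = max_energy_scale * energy.median()
    too_bright = energy > scaled_max                       # blooming / sun-glint proxy
    # Windows of near-constant area or energy suggest "bar"-type artifacts.
    area_var = groups["group_area"].rolling(50, min_periods=10).var()
    energy_var = energy.rolling(50, min_periods=10).var()
    too_uniform = (area_var < min_area_var) | (energy_var < min_energy_var)
    return too_bright | too_uniform
```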

Significance Statement

The Geostationary Lightning Mapper (GLM) provides total lightning observations in tropical cyclones that can benefit forecasts of intensity change. However, nonlightning artifacts in GLM observations make interpreting lightning observations challenging for automated techniques to predict intensity change. Quality control procedures have been developed to aid the TC community in using GLM observations for statistical and pattern-matching techniques.

Open access
Dudley B. Chelton

Abstract

The Ka-band radar interferometer (KaRIn) on the Surface Water and Ocean Topography (SWOT) satellite that was launched in December 2022 is providing the first two-dimensional altimetric views of sea surface height (SSH). Measurements are made across two parallel swaths of 50-km width separated by a 20-km gap. In the data product that will be used for most oceanographic applications, SSH estimates with a footprint diameter of about 3 km are provided on a 2 km × 2 km grid. Early analyses of in-flight KaRIn data conclude that the instrumental noise for this footprint diameter has a standard deviation less than σ_3km = 0.40 cm for conditions of 2-m significant wave height. This is a factor of 2.3 better than the prelaunch expectation based on the science requirement specification. The SSH fields measured by KaRIn allow the first satellite estimates of essentially instantaneous surface current velocity and vorticity computed geostrophically from SSH. The effects of instrumental noise on smoothed estimates of velocity and vorticity based on early postlaunch assessments are quantified here as functions of the half-power filter cutoff wavelength of the smoothing. Signal-to-noise ratios for smoothed estimates of velocity and vorticity are determined from simulated noisy KaRIn data derived from a high-resolution numerical model of the California Current System. The wavelength resolution capabilities for σ_3km = 0.40 cm are found to be about 17 and 35 km for velocity and vorticity, respectively, which correspond to feature diameters of about 8.5 and 17.5 km, and are better than the prelaunch expectations by about 45% and 35%.
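
The geostrophic relations behind these velocity and vorticity estimates are standard, u = −(g/f) ∂SSH/∂y, v = (g/f) ∂SSH/∂x, ζ = ∂v/∂x − ∂u/∂y. The sketch below evaluates them on a 2-km grid with centered differences; the smoothing, noise simulation, and resolution analysis of the paper are not reproduced.

```python
# Minimal sketch: geostrophic velocity and relative vorticity from a gridded
# SSH field via centered differences.  The 2-km grid spacing matches the KaRIn
# product described above; the filtering analysis in the paper is not shown.
import numpy as np

def geostrophy(ssh_m, dx_m=2000.0, dy_m=2000.0, lat_deg=35.0, g=9.81):
    """Return (u, v, zeta) on the SSH grid; axis 0 is y, axis 1 is x."""
    f = 2 * 7.2921e-5 * np.sin(np.radians(lat_deg))   # Coriolis parameter
    deta_dy, deta_dx = np.gradient(ssh_m, dy_m, dx_m)
    u = -(g / f) * deta_dy
    v = (g / f) * deta_dx
    dv_dx = np.gradient(v, dx_m, axis=1)
    du_dy = np.gradient(u, dy_m, axis=0)
    zeta = dv_dx - du_dy
    return u, v, zeta
```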

Open access
Jianhua Qu, Ping Qin, Weichu Yu, Junjie Yan, and Mingge Yuan

Abstract

In remote sensing imaging systems, stripe noise is a pervasive issue primarily caused by the inconsistent response of multiple detectors. Stripe noise not only degrades image quality but also severely hinders subsequent quantitative products and applications. Therefore, it is crucial to eliminate stripe noise while preserving detailed structural information in order to enhance image quality. Although existing destriping methods have achieved some success, they still suffer from problems such as loss of image detail, image blur, and ringing artifacts. To address these issues, this study proposes an image stripe correction algorithm based on weighted block sparse representation. The algorithm applies techniques such as a differential low-rank constraint and an edge weight factor to remove stripe noise while retaining image detail. It also uses the alternating direction method of multipliers (ADMM) to solve the minimax concave penalty (MCP)-regularized least squares optimization problem, improving processing efficiency. The results have been applied and validated on imager data from the Medium Resolution Spectral Imager (MERSI-II) onboard the Fengyun-3D satellite, the multichannel scanning radiometer [Advanced Geosynchronous Radiation Imager (AGRI)] onboard the Fengyun-4A satellite, and the precipitation microwave radiometer [Microwave Radiation Imager-Rainfall Mission (MWRI-RM)] onboard the Fengyun-3G satellite. Compared to typical stripe correction methods, the proposed method achieves better stripe removal while preserving image detail. The destriped image data can be used to generate high-quality quantitative products for various applications. Overall, by combining insights from prior research with innovative techniques, this study provides a more effective and robust solution to the stripe noise problem in remote sensing and weather forecasting.
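
Two of the named building blocks can be written down compactly: the minimax concave penalty and its proximal operator (firm thresholding), which is the kind of update an ADMM solver applies to the sparse stripe component at each iteration. The full weighted-block-sparse destriping model of the paper is not reproduced here; lam and gamma are illustrative values.

```python
# Sketch of the MCP penalty and its proximal operator, as they might appear
# inside an ADMM iteration for an MCP-regularized least-squares problem.
import numpy as np

def mcp_penalty(x, lam=1.0, gamma=3.0):
    """MCP(x) = lam|x| - x^2/(2 gamma) for |x| <= gamma*lam, else gamma*lam^2/2."""
    ax = np.abs(x)
    return np.where(ax <= gamma * lam,
                    lam * ax - ax ** 2 / (2.0 * gamma),
                    0.5 * gamma * lam ** 2)

def mcp_prox(z, lam=1.0, gamma=3.0):
    """Proximal operator of MCP (firm thresholding), valid for gamma > 1."""
    az = np.abs(z)
    shrunk = np.sign(z) * gamma * (az - lam) / (gamma - 1.0)
    out = np.where(az <= lam, 0.0, shrunk)       # kill small coefficients
    return np.where(az > gamma * lam, z, out)    # leave large ones untouched
```

Unlike soft thresholding, the MCP prox leaves large coefficients unbiased, which is one reason MCP regularization tends to preserve image detail better than an L1 penalty.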

Significance Statement

Stripe noise is a persistent problem in remote sensing imaging systems, hindering image quality and subsequent analysis. This study introduces a novel algorithm based on weighted block sparse representation to remove stripe noise while preserving image details. By incorporating techniques such as a differential low-rank constraint and an edge weight factor, our method achieves superior stripe removal. The proposed approach was validated using data from the MERSI-II and AGRI instruments, demonstrating its effectiveness in enhancing image quality. This research provides a more robust solution to the stripe noise issue, benefiting various applications in remote sensing and weather forecasting.

Open access
Daniel Peláez-Zapata, Vikram Pakrashi, and Frédéric Dias

Abstract

Knowledge of the directional distribution of a wave field is crucial for a better understanding of complex air–sea interactions. However, the dynamic and unpredictable nature of ocean waves, combined with the limitations of existing measurement technologies and analysis techniques, makes it difficult to obtain precise directional information, leading to a poor understanding of this important quantity. This study investigates the potential of a wavelet-based method applied to GPS buoy observations as an alternative to conventional methods for estimating the directional distribution of ocean waves. The results indicate that the wavelet-based estimates are consistently in good agreement with widely used parameterizations of the directional distribution. The wavelet-based method has advantages over conventional methods, including being purely data driven and not requiring any assumptions about the shape of the distribution. In addition, it was found that the wave directional distribution is narrowest at the spectral peak and broadens asymmetrically at higher and lower scales, particularly sharply for frequencies below the peak. The directional spreading appears to be independent of wave age across the entire range of frequencies, implying that the angular width of the directional spectrum is primarily controlled by nonlinear wave–wave interactions rather than by wind forcing. These results support the use of the wavelet-based method as a practical alternative for estimating the wave directional distribution. In addition, this study highlights the need for continued innovation in ocean wave measurement technologies and analysis techniques to improve our understanding of air–sea interactions.
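
For context on the comparison mentioned above, one widely used parameterization of the directional distribution is the cos-2s model, D(θ) ∝ cos^{2s}((θ − θ_m)/2). The sketch below evaluates that model and the standard circular-moment definition of directional spreading; it is background for the comparison, not the wavelet method itself, and the parameter values are illustrative.

```python
# Sketch: cos-2s directional distribution and directional spreading from its
# first circular moment.  Parameter values are illustrative.
import numpy as np

theta = np.linspace(-np.pi, np.pi, 720, endpoint=False)
dtheta = theta[1] - theta[0]

def cos2s(theta, theta_m=0.0, s=10.0):
    """cos-2s model D(theta) ~ cos^{2s}((theta - theta_m)/2), unit integral."""
    half = 0.5 * ((theta - theta_m + np.pi) % (2.0 * np.pi) - np.pi)
    d = np.cos(half) ** (2.0 * s)
    return d / (d.sum() * dtheta)

def directional_spreading(d):
    """sigma_theta = sqrt(2 (1 - |m1|)), with m1 the first circular moment."""
    m1 = np.sum(d * np.exp(1j * theta)) * dtheta
    return np.sqrt(2.0 * (1.0 - np.abs(m1)))

print("spreading (rad):", directional_spreading(cos2s(theta, s=10.0)))
```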

Significance Statement

This study presents a wavelet-based technique for obtaining the directional distribution of ocean waves, applied to GPS buoy data. The method serves as an alternative to conventional methods and is relatively easy to implement, making it a practical option for researchers and engineers. The study was conducted in a highly energetic environment characterized by high wind speeds and large waves, providing a valuable dataset for understanding the dynamics of the marine environment under extreme conditions. This research has implications for improving our understanding of the directional characteristics of ocean waves, which is crucial for navigation, offshore engineering, weather forecasting, and coastal hazard mitigation. The study also highlights the challenges associated with understanding wave directionality and emphasizes the need for further observations.

Open access
Jakob Boventer, Matteo Bramati, Vasileios Savvakis, Frank Beyrich, Markus Kayser, Andreas Platis, and Jens Bange

Abstract

One of the most widely used systems for wind speed and direction observations at meteorological sites is based on Doppler wind lidar (DWL) technology. The wind vector derivation strategies of these instruments rely on the assumption of a stationary and horizontally homogeneous wind field, which is often not the case over heterogeneous terrain. This study focuses on the validation of two DWL systems, operated by the German Weather Service [Deutscher Wetterdienst (DWD)] and installed at the boundary layer field site Falkenberg (Lindenberg, Germany), against measurements from a small, fixed-wing uncrewed aircraft system (UAS) of the type Multi-Purpose Airborne Sensor Carrier (MASC-3). A wind vector intercomparison between the DWL and the UAS is performed over an altitude range from 100 to 500 m, after quality control of the aircraft data against a cup anemometer and wind vane mounted on a meteorological mast also operating at the site. Both DWL systems exhibit an overall root-mean-square difference in the wind vector retrieval of less than 22% for wind speed and less than 18° for wind direction. The enhancement or deterioration of these statistics is analyzed with respect to scanning height and atmospheric stability. The limitations of this type of validation approach are highlighted and accounted for in the analysis.
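
The comparison statistics quoted above can be computed as sketched below: a relative root-mean-square difference for wind speed and an RMS difference for wind direction with angular wrap-around handled. The input arrays (collocated lidar and UAS samples) are assumptions for illustration.

```python
# Sketch of the intercomparison statistics: relative RMS speed difference and
# RMS direction difference between collocated lidar and UAS samples.
import numpy as np

def rms_speed_diff_percent(ws_lidar, ws_uas):
    """RMS wind-speed difference, expressed as % of the mean UAS speed."""
    rms = np.sqrt(np.mean((np.asarray(ws_lidar) - np.asarray(ws_uas)) ** 2))
    return 100.0 * rms / np.mean(ws_uas)

def rms_direction_diff_deg(wd_lidar, wd_uas):
    """RMS direction difference in degrees, using the smallest angular separation."""
    diff = (np.asarray(wd_lidar) - np.asarray(wd_uas) + 180.0) % 360.0 - 180.0
    return np.sqrt(np.mean(diff ** 2))
```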

Open access
Ryan D. Patmore, David Ferreira, David P. Marshall, Marcel D. du Plessis, J. Alexander Brearley, and Sebastiaan Swart

Abstract

Mixing in the upper ocean is important for biological production and the transfer of heat and carbon between the atmosphere and deep ocean, properties commonly targeted by observational campaigns using ocean gliders. We assess the reliability of ocean gliders to obtain a robust statistical representation of submesoscale variability in the ocean mixed layer of the Weddell Sea. A 1/48° regional simulation of the Southern Ocean is sampled with virtual “bow-tie” glider deployments, which are then compared against the reference model output. Sampling biases of lateral buoyancy gradients associated with the arbitrary alignment between glider paths and fronts are formally quantified, and the magnitude of the biases is comparable to observational estimates, with a mean error of 52%. The sampling bias leaves errors in the retrieved distribution of buoyancy gradients largely insensitive to deployment length and the deployment of additional gliders. Notable sensitivity to these choices emerges when the biases are removed by sampling perpendicular to fronts at all times. Detecting seasonal change in the magnitude of buoyancy gradients is sensitive to the glider-orientation sampling bias, but the change in variance is not. We evaluate the impact of reducing the number of dives and climbs in an observational campaign and find that small reductions in the number of dive–climb pairs have a limited effect on the results. Lastly, examining the sensitivity of the sampling bias to path orientation indicates that the bias is not dependent on the direction of travel in our deep ocean study site.
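
The geometric origin of the orientation bias can be illustrated simply: a glider track crossing a front at angle α from the front-normal direction measures only the projection |∇b| cos α of the true gradient. The toy Monte Carlo below estimates the mean underestimation for uniformly random track orientations; it is purely illustrative and is not the virtual-deployment experiment of the paper (whose reported mean error is 52%).

```python
# Toy sketch of the glider-orientation sampling bias for lateral buoyancy
# gradients: along-track sampling recovers |grad b| * cos(alpha).
import numpy as np

rng = np.random.default_rng(1)
true_gradient = 1.0                                   # arbitrary units
alpha = rng.uniform(-np.pi / 2, np.pi / 2, 100_000)   # track angle from front-normal
sampled = true_gradient * np.abs(np.cos(alpha))       # along-track projection
bias_percent = 100.0 * (1.0 - sampled.mean() / true_gradient)
print(f"mean underestimation: {bias_percent:.0f}%")   # ~36% for this toy geometry
```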

Significance Statement

Recent observational campaigns have focused on using autonomous vehicles to better understand the processes responsible for mixing in the surface region of the ocean. There is uncertainty about how effective these missions are at returning reliable and representative information. This study quantifies the performance of existing strategies for observing mixing processes, and we confirm that these strategies are biased toward underestimating indicators of mixing. Furthermore, compensating for the bias by increasing the number of resources or changing the manner in which resources are used yields limited benefit. Our findings are important for decision-making during the planning phase of an observational campaign and show that further innovations are required to account for the sampling bias.

Open access
Viktor Gouretski, Fabien Roquet, and Lijing Cheng

Abstract

The study focuses on biases in ocean temperature profiles obtained by means of Satellite Relay Data Loggers (SRDLs) and time–depth recorders (TDRs) attached to marine mammals. Quasi-collocated profiles from Argo floats and from ship-based conductivity–temperature–depth (CTD) profilers are used as a reference. SRDL temperature biases depend on the sensor type and vary with depth. For the most numerous group, Valeport 3 (VP3) and conductivity–temperature–fluorescence (CTF) sensors, the bias is negative except in the 100–200-m layer. The vertical bias structure suggests a link to the upper-ocean thermal structure within the upper 200-m layer. Accounting for a time lag that might remain in the postprocessed data reduces the bias variability throughout the water column. Below 200-m depth, the bias remains negative, with an overall mean of −0.027° ± 0.07°C. The suggested depth and thermal corrections for biases in the SRDL data are within the uncertainty limits declared by the manufacturer. TDR recorders exhibit a different bias pattern, showing a predominantly positive bias of 0.08°–0.14°C below 100 m, primarily due to a systematic error in pressure.
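
The sketch below shows the general shape such corrections can take: removing a constant offset below 200 m (using the −0.027°C mean bias quoted above) and a first-order time-lag correction for residual sensor response, T_corr = T + τ dT/dt. The lag value τ and the profile arrays are placeholders, not the study's correction scheme.

```python
# Illustrative sketch of two correction types: a constant-offset removal below
# 200 m and a first-order time-lag (sensor response) correction.
import numpy as np

def correct_offset(temp_c, depth_m, bias_c=-0.027, below_m=200.0):
    """Subtract the estimated mean bias from all samples deeper than below_m."""
    corrected = np.asarray(temp_c, dtype=float).copy()
    corrected[np.asarray(depth_m) > below_m] -= bias_c
    return corrected

def correct_time_lag(temp_c, time_s, tau_s=2.0):
    """First-order response inversion: T_corr(t) = T(t) + tau * dT/dt."""
    temp = np.asarray(temp_c, dtype=float)
    dTdt = np.gradient(temp, np.asarray(time_s, dtype=float))
    return temp + tau_s * dTdt
```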

Significance Statement

The purpose of this work is to improve the consistency of data from a specific instrumentation type used to measure ocean water temperature, namely, miniature temperature sensors attached to marine mammals. As the mammals dive on their way to and from their feeding areas, these sensors measure water temperature, and data loggers send the measurements to oceanographic data centers via satellite as soon as the mammals return to the sea surface. We show that these data exhibit small systematic instrumental errors and suggest the respective corrections. Taking these corrections into account is important for the assessment of ocean climate change.

Open access