Journal of Atmospheric and Oceanic Technology
Keitaro Asai, Hiroshi Kikuchi, Tomoo Ushio, and Yasuhide Hobara

Abstract

The multi-parameter phased array weather radar (MP-PAWR) was the first dual-polarized phased array weather radar to be commissioned in Japan (2017). When conducting a volume scan, the MP-PAWR uses electronic scanning in elevation and mechanical scanning in azimuth to achieve rapid scanning and high-density observations. Although the effectiveness of the MP-PAWR has been demonstrated in case studies, its observation accuracy is yet to be quantitatively analyzed. Therefore, this study compared MP-PAWR data with those of an operational dual-polarized weather radar with a parabolic-type antenna (X-MP radar) using 2,347,097 data samples obtained over 14 h. The results showed that the observation accuracy of the MP-PAWR was approximately the same as that of the X-MP radar at low elevations. The correlations of the observational parameters (radar reflectivity factor, differential reflectivity, specific differential phase, and Doppler velocity) between the MP-PAWR and X-MP radar ranged from 0.77 to 0.99 when MP-PAWR data were recorded within 15 s of the X-MP radar observations. The correlation between the observational parameters of the two radars decreased as the observation time difference between the X-MP radar and MP-PAWR increased. In particular, the correlation coefficients for the specific differential phase and the differential reflectivity were considerably lower than those for the single-polarization parameters at observation time differences of 240–300 s. By providing high-frequency and high-density dual-polarization observations, the MP-PAWR can contribute to rainfall prediction in Japan and reduce the damage caused by localized, rapidly developing cumulonimbus clouds.
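The comparison described above reduces to pairing time-matched samples from the two radars and computing a correlation coefficient per parameter within each time-difference bin. A minimal sketch of that bookkeeping (the function name, array names, and bin edges are illustrative, not the authors' pipeline):

```python
import numpy as np

def binned_correlation(x_pawr, x_xmp, dt, bins):
    """Correlate paired radar samples, grouped by observation time difference.

    x_pawr, x_xmp : matched samples of one parameter (e.g., reflectivity, dBZ)
    dt            : absolute time difference of each pair (s)
    bins          : bin edges in seconds, e.g., [0, 15, 60, 120, 240, 300]
    Returns {(lo, hi): correlation} for each populated bin.
    """
    out = {}
    for lo, hi in zip(bins[:-1], bins[1:]):
        m = (dt >= lo) & (dt < hi)
        if m.sum() > 1:
            out[(lo, hi)] = np.corrcoef(x_pawr[m], x_xmp[m])[0, 1]
    return out
```

Running this per parameter and inspecting how the coefficients fall off with increasing bin edges reproduces the kind of time-difference dependence the abstract reports.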

Restricted access
Hyun Mee Kim and Dae-Hui Kim

Abstract

In this study, the effect of boundary-condition configurations in the regional Weather Research and Forecasting (WRF) Model on the adjoint-based forecast sensitivity observation impact (FSOI) for 24-h forecast error reduction was evaluated. The FSOI has been used to diagnose the impact of observations on forecast performance in several global and regional models. Unlike in a global model, in a regional model the lateral boundaries affect both the forecasts and the FSOI results. Several experiments with different lateral boundary conditions were conducted. The experimental period was from 1 to 14 June 2015. With or without data assimilation, the larger the buffer size in the lateral boundary conditions, the smaller the forecast error. The nonlinear and linear forecast error reduction (i.e., observation impact) decreased as the buffer size increased, implying a larger impact of the lateral boundaries and a smaller observation impact on the forecast error. In most experiments, in terms of observation types (variables), upper-air radiosonde observations (brightness temperature) exhibited the greatest observation impact. The ranking of observation impacts was consistent across observation types and variables among experiments with a constraint in the response function at the upper boundary. The fractions of beneficial observations were approximately 60% and did not vary considerably with the boundary conditions specified when calculating the FSOI in the regional modeling framework.

Open access
Haifeng Zhang, Alexander Ignatov, and Dean Hinshaw

Abstract

In situ sea surface temperature (SST) measurements play a critical role in the calibration/validation (Cal/Val) of satellite SST retrievals and in ocean data assimilation. However, their quality is not always optimal, and proper quality control (QC) is required before they can be used with confidence. The in situ SST Quality Monitor (iQuam) system was established at the National Oceanic and Atmospheric Administration (NOAA) in 2009, initially to support the Cal/Val of NOAA satellite SST products. It collects in situ SST data from multiple sources, performs uniform QC, monitors the QCed data online, and distributes them to users. In this study, the iQuam QC is compared with other QC methods available for some of the in situ data ingested into iQuam. Overall, the iQuam QC performs well on daily to monthly time scales over most global oceans and under a wide variety of environmental conditions. However, it may be less accurate in the daytime, when a pronounced diurnal cycle is present, and in dynamic regions, because of the strong reliance on the “reference SST check,” which employs daily low-resolution level-4 analyses with no diurnal cycle resolved. The iQuam “performance history check,” applied to all in situ platforms, is an effective alternative to the customary “black/gray” lists, available only for some platforms (e.g., drifters and Argo floats). In the future, iQuam QC will be upgraded [e.g., using improved reference field(s), with enhanced temporal and spatial resolutions]. More comparisons with external QC methods will be performed to learn and employ the best QC practices.
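The "reference SST check" mentioned above is conceptually simple: each in situ report is compared against a reference level-4 analysis and flagged when it deviates too far. A toy version (the function name and threshold are illustrative, not iQuam's actual values):

```python
import numpy as np

def reference_sst_check(sst_obs, sst_ref, max_dev=3.0):
    """Flag in situ SSTs that deviate from a reference L4 analysis.

    sst_obs : observed SSTs (deg C)
    sst_ref : reference (L4 analysis) SSTs interpolated to the obs points
    max_dev : allowed deviation (deg C); illustrative, not iQuam's setting
    Returns a boolean mask: True = passes the check.
    """
    return np.abs(np.asarray(sst_obs) - np.asarray(sst_ref)) <= max_dev
```

Because the reference field resolves no diurnal cycle, a genuine daytime warm-layer signal can push a good observation away from the reference, which is exactly why the abstract notes reduced daytime accuracy.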

Restricted access
Rodriguez Yombo Phaka, Alexis Merlaud, Gaia Pinardi, Emmanuel Mahieu, François Hendrick, Martina M. Friedrich, Caroline Fayt, Michel Van Roozendael, Buenimio Lomami Djibi, Richard Bopili Mbotia Lepiba, Edmond Phuku Phuati, and Jean-Pierre Mbungu Tsumbu

Abstract

We present the first ground-based remote sensing measurements of NO2 made in Kinshasa, performed from 2017 to 2019. The motivation for observing air pollution in Kinshasa comes from its geographical location, its demography, its climatic conditions, and the many different sources of NO2 in its surroundings. A method for retrieving the tropospheric NO2 vertical column density (VCDtropo), based on differential optical absorption spectroscopy (DOAS) applied to observations at zenith and at a 35° elevation angle, is described. The mean value of VCDtropo observed in Kinshasa is 3 × 1015 molecules cm−2. We further present first comparisons with Ozone Monitoring Instrument (OMI) and TROPOspheric Monitoring Instrument (TROPOMI) satellite observations. When comparing OMI data with our observations using a linear regression analysis, we find a slope of 0.34 and a correlation coefficient of 0.50 for 51 days of coincidences over 2017–19. Similar comparisons with TROPOMI for 44 days show a slope of 0.41 and a correlation coefficient of 0.72. This study opens up perspectives for further air quality–related studies in Kinshasa and central Africa.

Restricted access
Dion Häfner, Johannes Gemmrich, and Markus Jochum

Abstract

The occurrence of extreme (rogue) waves in the ocean is for the most part still shrouded in mystery, because the rare nature of these events makes them difficult to analyze with traditional methods. Modern data-mining and machine-learning methods provide a promising way out, but they typically rely on the availability of massive amounts of well-cleaned data. To facilitate the application of such data-hungry methods to surface ocean waves, we developed the Free Ocean Wave Dataset (FOWD), a freely available wave dataset and processing framework. FOWD describes the conversion of raw observations into a catalog that maps characteristic sea state parameters to observed wave quantities. Specifically, we employ a running-window approach that respects the nonstationary nature of the oceans, and extensive quality control to reduce bias in the resulting dataset. We also supply a reference Python implementation of the FOWD processing toolkit, which we use to process the entire Coastal Data Information Program (CDIP) buoy data catalog containing over 4 billion waves. In a first experiment, we find that, when the full elevation time series is available, surface elevation kurtosis and maximum wave height are the strongest univariate predictors for rogue wave activity. When just a spectrum is given, crest–trough correlation, spectral bandwidth, and mean period fill this role.
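The running-window approach can be illustrated with the two strongest time-series predictors the experiment highlights: surface-elevation kurtosis and a wave-height scale. The window lengths and the Hs ≈ 4σ height estimate below are illustrative choices, not FOWD's exact definitions:

```python
import numpy as np

def sea_state_features(eta, fs, window_s=1800.0, step_s=300.0):
    """Running-window surface-elevation features in the spirit of FOWD.

    eta      : surface elevation time series (m)
    fs       : sampling frequency (Hz)
    window_s : window length in seconds (30 min here; illustrative)
    step_s   : hop between successive windows in seconds
    Returns an array of (kurtosis, Hs) per window, with Hs ~ 4 * std(eta).
    """
    n, hop = int(window_s * fs), int(step_s * fs)
    feats = []
    for start in range(0, len(eta) - n + 1, hop):
        w = eta[start:start + n]
        mu, sig = w.mean(), w.std()
        kurt = np.mean((w - mu) ** 4) / sig ** 4  # Pearson kurtosis
        feats.append((kurt, 4.0 * sig))           # Hs ~ 4 * std(eta)
    return np.array(feats)
```

The overlapping windows respect nonstationarity: each catalog entry maps the local sea state (the features above) to the waves observed in that window, rather than assuming one stationary record.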

Open access
Jiali Zhang, Liang Zhang, Anmin Zhang, Lianxin Zhang, Dong Li, and Xuefeng Zhang

Abstract

The sound speed profile (SSP), which governs underwater acoustic propagation, is closely related to the temperature and salinity fields, so high-precision sound speed profiles are of great value for recovering temperature and salinity information. In this paper, a data assimilation scheme that introduces sound speed profiles as a new constraint is proposed within the framework of 3DVAR data assimilation [referred to as SSP-constraint 3DVAR (SSPC-3DVAR)], with the aim of improving the analysis accuracy of the initial temperature and salinity fields in coastal sea areas. To validate the performance of the new assimilation scheme, idealized experiments are first carried out to show the advantages of the proposed SSPC-3DVAR. Then temperature, salinity, and SSP observations from field experiments in a coastal area are assimilated into the Princeton Ocean Model to evaluate short-time forecasts using the SSPC-3DVAR scheme. Results show that the estimation accuracy improves by as much as 14.6% for temperature and 11.1% for salinity compared with the standard 3DVAR. This demonstrates that the proposed SSPC-3DVAR approach works better in practice than the standard 3DVAR and will benefit further from more varied and widely distributed observations in the future.
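The abstract does not give the cost function, but the general shape of adding an SSP constraint to a standard 3DVAR cost can be sketched as follows. Everything here (the function name, the linear observation operator, the form of the sound-speed term and its error covariance) is a generic 3DVAR ingredient assumed for illustration, not the paper's actual formulation:

```python
import numpy as np

def sspc_3dvar_cost(x, xb, y, B_inv, R_inv, H, c_obs, Rc_inv, sound_speed):
    """Toy 3DVAR cost with an added sound-speed-profile constraint term.

    x, xb       : analysis and background state (e.g., stacked T and S)
    y, H        : observations and (linear) observation operator
    c_obs       : observed sound speed profile
    sound_speed : function mapping the state to a sound speed profile
    B_inv, R_inv, Rc_inv : inverse error covariances of the three terms
    """
    d_b = x - xb                  # background departure
    d_o = H @ x - y               # observation departure
    d_c = sound_speed(x) - c_obs  # SSP-constraint departure
    return 0.5 * (d_b @ B_inv @ d_b + d_o @ R_inv @ d_o + d_c @ Rc_inv @ d_c)
```

Because sound speed depends on both temperature and salinity, the extra term couples the two analysis variables through a single observed quantity, which is the intuition behind constraining 3DVAR with SSPs.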

Restricted access
Lucas M. Merckelbach and Jeffrey R. Carpenter

Abstract

Autonomous, buoyancy-driven ocean gliders are increasingly used as a platform for the measurement of turbulence microstructure. In the processing of such measurements, there is a sensitive (quartic) dependence of the turbulence dissipation rate ϵ on the speed of flow past the sensors, or alternatively, the speed of the glider through the ocean water column. The mechanics of glider flight is therefore examined by extending previous flight models to account for the effects of ocean surface waves. It is found that, owing to the relatively small buoyancy changes used to drive gliders, the surface wave-induced motion, superimposed onto the steady-state motion, follows the motion of the wave orbitals to a good approximation. Errors expected in measuring ϵ near the ocean surface due to wave-induced relative velocities are generally less than 10%. However, pressure perturbations associated with the wave motion can cause significant perturbations in the glider-measured pressure signal and consequently also in the measured vertical glider velocity signal. This effect of surface waves is present only in the shallow-water regime. It arises from an incomplete cancellation of the wave-induced pressure perturbation with the hydrostatic component due to vertical glider displacements, whereas for deep-water waves this cancellation is complete.

Open access
Vincent T. Wood, Robert P. Davies-Jones, and Alan Shapiro

Abstract

Single-Doppler radar data are often missing in important regions of a severe storm due to low return power, low signal-to-noise ratio, ground clutter associated with normal and anomalous propagation, and missing radials associated with partial or total beam blockage. Missing data impact the ability of WSR-88D algorithms to detect severe weather. To aid the algorithms, we develop a variational technique that fills in Doppler velocity data voids smoothly by minimizing Doppler velocity gradients while leaving good data unmodified. This method provides estimates of the analyzed variable in data voids without creating extrema.

Actual single-Doppler radar data of four tornadoes are used to demonstrate the variational algorithm. In two cases, data are missing in the original data, and in the other two, data are voided artificially. The filled-in data match the voided data well in smoothly varying Doppler velocity fields. Near singularities such as tornadic vortex signatures, the match is poor as anticipated. The algorithm does not create any velocity peaks in the former data voids, thus preventing false triggering of tornado warnings. Doppler circulation is used herein as a far-field tornado detection and advance-warning parameter. In almost all cases, the measured circulation is quite insensitive to the data that have been voided and then filled. The tornado threat is still apparent.
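The fill-in idea above, minimizing velocity gradients in voids while holding good data fixed, is the discrete Laplace (harmonic-fill) problem, and its maximum principle is what guarantees no new extrema in the filled regions. A bare-bones relaxation sketch, not the authors' solver (the periodic edge handling via np.roll is a simplification that only matters for voids touching the domain boundary):

```python
import numpy as np

def fill_voids(field, mask, n_iter=500):
    """Fill missing values by minimizing gradients (discrete Laplace equation).

    field : 2D array of Doppler velocities; values in voids are ignored
    mask  : boolean 2D array, True where data are good (left untouched)
    Good data act as Dirichlet boundary conditions, so by the discrete
    maximum principle no new extrema appear in the filled regions.
    """
    f = np.where(mask, field, 0.0)
    # initialize voids with the mean of the good data for faster convergence
    f[~mask] = field[mask].mean()
    for _ in range(n_iter):
        avg = 0.25 * (np.roll(f, 1, 0) + np.roll(f, -1, 0)
                      + np.roll(f, 1, 1) + np.roll(f, -1, 1))
        f[~mask] = avg[~mask]  # update only the void cells
    return f
```

On a smoothly varying velocity field this recovers voided values well; near a sharp feature such as a tornadic vortex signature the harmonic fill is necessarily smoother than the truth, matching the behavior reported above.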

Restricted access
Matthew E. Gropp and Casey E. Davenport

Abstract

Deep convective thunderstorm tracking methodologies and software have become useful and necessary tools across many applications, from nowcasting to model verification. Despite the many available options, these preexisting methods lack a customizable, fast, and flexible methodology that can track supercell thunderstorms within convection-allowing climate datasets with coarse temporal and spatial resolution. This project offers one solution via an all-in-one tracking methodology, built upon several open-source Python libraries and designed to work with various temporal resolutions, including hourly. Unique to this approach is accounting for the varying availability of different model variables while still sufficiently and accurately tracking specific convective features; in this case, supercells were the focus. To help distinguish supercells from ordinary cells, updraft helicity and other three-dimensional atmospheric data were incorporated into the tracking algorithm to confirm a storm's supercellular status. Deviant motion from the mean wind was also used to identify supercells. The tracking algorithm was tested on a dynamically downscaled regional climate model dataset with 4-km horizontal grid spacing. Each supercell was tracked for its entire lifetime over the course of 26 years of model output, resulting in a supercell climatology over the central United States. Due to the tracking configuration and dataset used, the tracking performs most consistently for long-lived and strong supercells compared with weak and short-lived supercells. This tracking methodology allows for customizable open-source tracking of supercells in any downscaled convection-allowing dataset, even with coarse temporal resolution.
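The object-identification step described above, finding contiguous storms and then checking them for rotation, can be sketched with standard open-source pieces (scipy.ndimage connected-component labeling). The function name and thresholds are illustrative, not the study's tuned values:

```python
import numpy as np
from scipy import ndimage

def label_supercell_candidates(refl, uh, refl_thresh=40.0, uh_thresh=75.0):
    """Label contiguous storm objects and keep those showing rotation.

    refl : 2D composite reflectivity (dBZ)
    uh   : 2D updraft helicity (m2 s-2) on the same grid
    Returns the labeled storm array and the labels flagged as
    supercell candidates (storms whose peak UH exceeds the threshold).
    """
    storms, n = ndimage.label(refl >= refl_thresh)
    candidates = [lab for lab in range(1, n + 1)
                  if uh[storms == lab].max() >= uh_thresh]
    return storms, candidates
```

Linking the labeled objects between successive (possibly hourly) output times, e.g., by projecting each object forward with the storm motion and matching overlaps, is where the coarse temporal resolution makes the problem hard, and is the part the all-in-one methodology addresses.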

Restricted access
Christopher D. Curtis and Sebastián M. Torres

Abstract

Range-oversampling processing is a technique that can be used to lower the variance of radar-variable estimates, reduce radar update times, or a mixture of both. There are two main assumptions for using range-oversampling processing: accurate knowledge of the range correlation and uniform reflectivity in the radar resolution volume. The first assumption has been addressed in previous research; this work focuses on the uniform reflectivity assumption. Earlier research shows that significant reflectivity gradients can occur in storms; we utilized those results to develop realistic simulations of radar returns that include effects of reflectivity gradients in range. An important consideration when using range-oversampling processing is the resulting change in the range weighting function. The range weighting function can change for different types of range-oversampling processing and some techniques, such as adaptive pseudowhitening, can lead to different range weighting functions at each range gate. To quantify the possible effects of differing range weighting functions in the presence of reflectivity gradients, we developed simulations to examine varying types of range-oversampling processing with two receiver filters: a matched receiver filter and a wider-bandwidth receiver filter (as recommended for use with range oversampling). Simulation results show that differences in range weighting functions are the only contributor to differences in radar reflectivity measurements. Results from real weather data demonstrate that the reflectivity gradients that occur in typical severe storms do not cause significant changes in reflectivity measurements, and the benefits from range-oversampling processing outweigh the possible isolated effects from large reflectivity gradients.
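The variance reduction in range-oversampling processing hinges on accurate knowledge of the range correlation: given the correlation matrix C of the L oversampled samples per gate, a whitening transform decorrelates them before averaging. A minimal sketch of that linear-algebra core (the exponential correlation model in the usage below is illustrative, not a measured range correlation):

```python
import numpy as np

def whitening_transform(C):
    """Whitening matrix from a known range correlation matrix C.

    If x has covariance C = H H^T (Cholesky factorization), then
    W = H^{-1} gives Cov(W x) = I: the transformed oversampled range
    samples are decorrelated, so averaging their estimates reduces
    variance by close to the oversampling factor L.
    """
    H = np.linalg.cholesky(C)
    return np.linalg.inv(H)
```

Full whitening is one end of a family of transforms; adaptive pseudowhitening, mentioned above, varies the transform (and hence the effective range weighting function) from gate to gate, which is precisely why its interaction with reflectivity gradients needed study.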

Restricted access