Browse

You are looking at 51 - 60 of 5,057 items for :

  • Journal of Atmospheric and Oceanic Technology
  • All content
Zhen Shen, Kefei Zhang, Qimin He, Moufeng Wan, Longjiang Li, and Suqin Wu

Abstract

The sampling error caused by the uneven distribution of radio occultation (RO) profiles in both space and time domains is an important error source of RO climatologies. In this paper, the sampling error of RO temperature climatologies is investigated using the 4-yr (2007–10) data from the Constellation Observing System for Meteorology, Ionosphere, and Climate mission. The error is divided into three parts: a local time component (LTC), a temporal component (TC), and a spatial component (SC). The characteristics of the three components are investigated. Results show the following: 1) The LTC part of the total sampling error is characterized by a pattern of periodic positive and negative deviations, with a full cycle of about four months. The most significant LTC values are found in the area around 60°N/S and the polar regions. 2) The TC part is mainly associated with the extent of day-to-day temperature variability and the daily number of RO profiles observed in each month. The most pronounced TC part is found in high-latitude areas in wintertime, where the day-to-day temperature variability is high. 3) The SC part shows distinct features in different altitude ranges. It is characterized by a systematic error in the lower troposphere (2–8 km) but exhibits a seasonal trend in the altitude range from 8 to 40 km. 4) The total sampling error is dominated by the TC and SC parts in the troposphere and lower stratosphere, whereas in the upper stratosphere it is dominated by the LTC part.
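
A minimal illustration of how such a sampling error can be quantified: sub-sample a reference temperature field (e.g., a reanalysis) at the RO observation times and locations and compare the resulting mean with the full mean. The array sizes, values, and random sampling pattern below are synthetic placeholders, and the paper's decomposition into LTC, TC, and SC parts is not reproduced.

```python
# Hypothetical sketch: estimate the sampling error of a monthly-mean
# temperature climatology as the difference between the mean of a
# reference field sub-sampled at the RO observation times/locations
# and the full mean of that reference field.
import numpy as np

rng = np.random.default_rng(0)

# Reference temperature field for one latitude band and one month:
# shape (n_days, n_longitudes), e.g. from a reanalysis.
n_days, n_lons = 30, 72
ref_temp = 220.0 + 5.0 * rng.standard_normal((n_days, n_lons))

# RO sampling mask: True where an occultation profile exists.
# Uneven coverage (~10% of day/longitude cells sampled).
sample_mask = rng.random((n_days, n_lons)) < 0.10

full_mean = ref_temp.mean()                  # "true" monthly mean
sampled_mean = ref_temp[sample_mask].mean()  # mean over sampled cells only

sampling_error = sampled_mean - full_mean
print(f"total sampling error: {sampling_error:+.3f} K")
```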

Open access
Martin J. Murphy, John A. Cramer, and Ryan K. Said

Abstract

The U.S. National Lightning Detection Network (NLDN) underwent a complete sensor upgrade in 2013 followed by a central processor upgrade in 2015. These upgrades produced about a factor-of-5 improvement in the detection efficiency of cloud lightning flashes and about one additional cloud pulse geolocated per flash. However, they also reaggravated a historical tendency to misclassify a population of low-current positive discharges as cloud-to-ground strokes when, in fact, most are probably cloud pulses. Furthermore, less than 0.1% of events were poorly geolocated because the contributing sensor data were either improperly associated or simply underutilized by the geolocation algorithm. To address these issues, Vaisala developed additional improvements to the central processing system, which became operational on 7 November 2018. This paper describes updates to the NLDN between 2013 and 2018 and then focuses on the effects of classification algorithm changes and a simple means to normalize classification across upgrades.

Open access
R. Meneghini, L. Liao, and G. M. Heymsfield

Abstract

The High-Altitude Imaging Wind and Rain Airborne Profiler (HIWRAP) dual-frequency conically scanning airborne radar provides estimates of the range-profiled mean Doppler and backscattered power from the precipitation and surface. A velocity–azimuth display analysis yields near-surface estimates of the mean horizontal wind vector υ_h in cases in which precipitation is present throughout the scan. From the surface return, the normalized radar cross section (NRCS) is obtained, which, by a method previously described, can be corrected for path attenuation. Comparisons between υ_h and the attenuation-corrected NRCS are used to derive transfer functions that provide estimates of the wind vector from the NRCS data under both rain and rain-free conditions. A reasonably robust transfer function is found by using the mean NRCS (⟨NRCS⟩) over the scan along with a filtering of the data based on a Fourier series analysis of υ_h and the NRCS. The approach gives good correlation coefficients between υ_h and ⟨NRCS⟩ at Ku band at incidence angles of 30° and 40°. The correlation degrades if the Ka-band data are used rather than the Ku band.
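
The velocity–azimuth display step can be sketched as a first-harmonic Fourier fit of the radial Doppler velocity versus scan azimuth, with the harmonic coefficients giving the horizontal wind components. This simplified example ignores vertical motion, attenuation correction, and the transfer-function step described in the abstract; all values and the azimuth convention are illustrative assumptions.

```python
# Illustrative velocity-azimuth display (VAD) fit: for a conical scan at
# fixed incidence angle, the radial Doppler velocity varies roughly as
#   v_r(az) = c0 + c_cos*cos(az) + c_sin*sin(az),
# and the first harmonic gives the horizontal wind components after
# dividing by sin(incidence).  Simplified sketch with synthetic data.
import numpy as np

incidence = np.deg2rad(30.0)          # scan incidence angle
u_true, v_true = 8.0, -3.0            # assumed "true" horizontal wind (m/s)

az = np.deg2rad(np.arange(0, 360, 5))               # azimuths over one scan
rng = np.random.default_rng(1)
# Assumed convention: u (eastward) projects onto sin(az), v onto cos(az).
v_r = (u_true * np.sin(az) + v_true * np.cos(az)) * np.sin(incidence)
v_r += 0.3 * rng.standard_normal(az.size)           # measurement noise

# Least-squares fit of the first-harmonic Fourier series.
A = np.column_stack([np.ones_like(az), np.cos(az), np.sin(az)])
c0, c_cos, c_sin = np.linalg.lstsq(A, v_r, rcond=None)[0]

u_est = c_sin / np.sin(incidence)     # eastward component from sin(az) term
v_est = c_cos / np.sin(incidence)     # northward component from cos(az) term
print(f"retrieved wind: u = {u_est:.2f} m/s, v = {v_est:.2f} m/s")
```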

Open access
Ruiyang Ma, Dong Zheng, Yijun Zhang, Wen Yao, Wenjuan Zhang, and Deqing Cuomu

Abstract

Herein, we compared data on the spatiotemporal distribution of lightning activity obtained from the World Wide Lightning Location Network (WWLLN) with that from the Lightning Imaging Sensor (LIS). The WWLLN and LIS both suggest intense lightning activity over the central and southeastern Tibetan Plateau (TP) during May–September. Meanwhile, the WWLLN indicates relatively weak lightning activity over the northeastern TP, where the LIS suggests very intense lightning activity, and it also indicates a high-density lightning center over the southwestern TP that is not suggested by the LIS. Furthermore, the WWLLN lightning peaks in August in terms of monthly variation and in late August in terms of 10-day variation, unlike the corresponding LIS lightning peaks of July and late June, respectively. Other observational data were also included in the comparison. The blackbody temperature (TBB) data from the Fengyun-2E geostationary satellite (a proxy for deep convection) and thunderstorm-day data are more consistent with the spatial distribution of the WWLLN lightning. Meanwhile, for seasonal variation, the TBB data are more analogous to the LIS data, whereas the cloud-to-ground (CG) lightning data from a local CG lightning location system are closer to the WWLLN data. It is speculated that the different WWLLN and LIS observation modes may cause their data to represent different dominant types of lightning, thereby leading to differences in the spatiotemporal distributions of their data. The results may further imply that there exist regional differences and seasonal variations in the electrical properties of thunderstorms over the TP.

Restricted access
Jeremiah P. Sjoberg, Richard A. Anthes, and Therese Rieckh

Abstract

The three-cornered hat (3CH) method, which was originally developed to assess the random errors of atomic clocks, is a means for estimating the error variances of three different datasets. Here we give an overview of the historical development of the 3CH method and of selected other methods for estimating error variances that use either two or three datasets. We discuss similarities and differences between these methods and the 3CH method. This study assesses the sensitivity of the 3CH method to the factors that limit its accuracy, including sample size, outliers, different magnitudes of errors between the datasets, biases, and unknown error correlations. Using simulated datasets for which the errors and their correlations among the datasets are known, this analysis shows the conditions under which the 3CH method provides the most and least accurate estimates. The effect of representativeness errors caused by differences in vertical resolution of the datasets is also investigated. These representativeness errors are generally small relative to the magnitude of the random errors in the datasets, and their impact can be reduced by appropriate filtering.
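
The core 3CH estimate can be written directly from the three pairwise difference variances, under the standard assumption of unbiased and mutually uncorrelated errors. A minimal sketch with simulated data (all values synthetic):

```python
# Minimal sketch of the three-cornered hat (3CH) estimate: with three
# collocated datasets X, Y, Z whose errors are assumed unbiased and
# mutually uncorrelated, each error variance follows from the pairwise
# difference variances, e.g.
#   var_err(X) = 0.5 * [ var(X - Y) + var(X - Z) - var(Y - Z) ].
import numpy as np

rng = np.random.default_rng(42)
truth = rng.standard_normal(100_000)            # common "true" signal

# Simulated datasets with known, independent error standard deviations.
x = truth + 0.50 * rng.standard_normal(truth.size)
y = truth + 0.80 * rng.standard_normal(truth.size)
z = truth + 1.20 * rng.standard_normal(truth.size)

def three_cornered_hat(a, b, c):
    """Estimated error variance of dataset `a` given companions `b`, `c`."""
    return 0.5 * (np.var(a - b) + np.var(a - c) - np.var(b - c))

for name, est, true_sd in [("X", three_cornered_hat(x, y, z), 0.50),
                           ("Y", three_cornered_hat(y, x, z), 0.80),
                           ("Z", three_cornered_hat(z, x, y), 1.20)]:
    print(f"{name}: estimated error variance {est:.3f}  (true {true_sd**2:.3f})")
```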

Open access
Joey J. Voermans, Alexander V. Babanin, Cagil Kirezci, Jonas T. Carvalho, Marcelo F. Santini, Bruna F. Pavani, and Luciano P. Pezzi

Abstract

Quality control measures for ocean wave observations are necessary to give confidence in their accuracy. It is common practice to detect anomalies or outliers in surface displacement observations by applying a standard deviation threshold. Besides being a purely statistical method, this quality control procedure is likely to flag extreme wave events erroneously, thereby impacting higher-order descriptions of the wave field. In this paper we extend the use of the statistical phase-space threshold, an established outlier detection method in the field of turbulence, to detect anomalies in a wave record. We show that a wave record in phase space (here defined as a diagram of displacement against acceleration) can be enclosed by a predictable ellipse whose major and minor axes are defined by the spectral properties of the wave field. Using the parameterized ellipse in phase space as a threshold to identify wave anomalies makes this a semiphysical filtering method. Wave buoy data obtained from a mooring deployed near King George Island, Antarctica [as part of the Antarctic Modeling Observation System (ATMOS)], and laser altimeter data obtained at the Northwest Shelf of Australia are used to demonstrate the performance of the filtering methodology in identifying wave anomalies. Synthetic data obtained using a high-order spectral model are used to identify how extreme waves are positioned in phase space.
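
A rough sketch of the phase-space idea: plot surface displacement against its acceleration and flag samples falling outside an ellipse whose semi-axes scale with the standard deviations of the two variables. The scale factor used below is a generic universal-threshold-style choice, not the spectrum-based parameterization derived in the paper, and the record is synthetic.

```python
# Sketch of a phase-space (displacement vs. acceleration) outlier check
# for a surface-elevation record: samples are flagged when they fall
# outside an ellipse whose semi-axes are the standard deviations of
# displacement and acceleration scaled by a threshold factor lam.
import numpy as np

fs = 2.0                                   # sampling frequency (Hz), assumed
t = np.arange(0, 1800, 1 / fs)             # 30-minute record
rng = np.random.default_rng(7)
eta = sum(a * np.sin(2 * np.pi * f * t + rng.uniform(0, 2 * np.pi))
          for a, f in [(1.0, 0.08), (0.5, 0.12), (0.3, 0.20)])
eta[1000] += 6.0                           # inject an artificial spike

acc = np.gradient(np.gradient(eta, 1 / fs), 1 / fs)   # finite-difference acceleration

lam = np.sqrt(2 * np.log(eta.size))        # universal-threshold-style factor
a_axis = lam * eta.std()                   # semi-axis along displacement
b_axis = lam * acc.std()                   # semi-axis along acceleration

outliers = (eta / a_axis) ** 2 + (acc / b_axis) ** 2 > 1.0
print(f"flagged {outliers.sum()} of {eta.size} samples")
```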

Restricted access
Philippe Keckhut, Alain Hauchecorne, Mustapha Meftah, Sergey Khaykin, Chantal Claud, and Pierre Simoneau

Abstract

While meteorological numerical models now extend upward to the mesopause, mesospheric observations are required to guide simulations, numerical weather forecasts, and climate projections. This work reviews some of the challenges regarding temperature observation requirements and the factors, associated with atmospheric tides, that limit current measurements. A new strategy is described here using the limb-scattering technique, based on previous experiments in space. Such observations can be made on board CubeSats. The main technical issues are the large dynamic range (four orders of magnitude) required for the measurements, the accuracy of the limb pointing, and the level of stray light. The technique described here is expected to achieve an accuracy of 1–2 K with a vertical resolution of 1–2 km. A constellation of 100 platforms could provide temperature observations at the spatial (100 km) and temporal (3 h) resolution recommended by the World Meteorological Organization, while tidal issues could be resolved with a minimum of 3–5 platforms on specific orbits maintained to avoid drift.

Restricted access
Yuichiro Takeshita, Brent D. Jones, Kenneth S. Johnson, Francisco P. Chavez, Daniel L. Rudnick, Marguerite Blum, Kyle Conner, Scott Jensen, Jacqueline S. Long, Thom Maughan, Keaton L. Mertz, Jeffrey T. Sherman, and Joseph K. Warren

Abstract

The California Current System is thought to be particularly vulnerable to ocean acidification, yet pH remains chronically undersampled along this coast, limiting our ability to assess the impacts of ocean acidification. To address this observational gap, we integrated the Deep-Sea-DuraFET, a solid-state pH sensor, onto a Spray underwater glider. Over the course of a year starting in April 2019, we conducted seven missions in central California that spanned 161 glider days and >1600 dives to a maximum depth of 1000 m. The sensor accuracy was estimated to be ±0.01 based on comparisons to discrete samples taken alongside the glider (n = 105), and the precision was ±0.0016. CO2 partial pressure, dissolved inorganic carbon, and aragonite saturation state could be estimated from the pH data with uncertainties better than ±2.5%, ±8 μmol kg⁻¹, and ±2%, respectively. The sensor was stable to ±0.01 for the first 9 months but exhibited a drift of 0.015 during the last mission. The drift was correctable using a piecewise linear regression based on a reference pH field at 450 m estimated from published global empirical algorithms. These algorithms require accurate O2 as input; thus, protocols for a simple predeployment air calibration that achieved an accuracy of better than 1% were implemented. The glider observations revealed upwelling of waters undersaturated with respect to aragonite to within 5 m below the surface near Monterey Bay. These observations highlight the importance of persistent observations from autonomous platforms in highly dynamic coastal environments.
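
A simplified sketch of the drift-correction idea: compare the sensor pH at the 450 m reference level with an independently estimated reference value on each dive, fit the offsets as a function of time, and subtract the fitted drift from the record. A single linear segment is used here instead of the paper's piecewise fit, and all numbers are synthetic placeholders.

```python
# Hedged sketch of a drift correction against a deep reference level:
# per-dive offsets between sensor pH at ~450 m and a reference pH
# (in the paper, estimated from global empirical algorithms driven by
# O2) are fit linearly in time and removed from the whole record.
import numpy as np

rng = np.random.default_rng(3)
days = np.arange(0, 120)                       # deployment day of each dive

ref_ph_450m = 7.60 + 0.002 * rng.standard_normal(days.size)   # reference field
true_drift = 0.00012 * days                                    # slow sensor drift
sensor_ph_450m = ref_ph_450m + true_drift + 0.003 * rng.standard_normal(days.size)

offset = sensor_ph_450m - ref_ph_450m          # per-dive offset at 450 m
slope, intercept = np.polyfit(days, offset, 1) # linear drift model

def correct(ph, day):
    """Remove the fitted drift from any pH sample taken on `day`."""
    return ph - (slope * day + intercept)

corrected = correct(sensor_ph_450m, days)
print(f"fitted drift: {slope * 365:.4f} pH yr^-1")
print(f"residual std at 450 m after correction: {np.std(corrected - ref_ph_450m):.4f}")
```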

Open access
Kenneth R. Knapp, Alisa H. Young, Hilawe Semunegus, Anand K. Inamdar, and William Hankins

Abstract

The International Satellite Cloud Climatology Project (ISCCP) began collecting data in the 1980s to help understand the distribution of clouds. Since then, it has provided important information on clouds in time and space and on their radiative characteristics. However, it is apparent from long-term time series of the data that there are latent artifacts related to the changing satellite coverage over the more than 30 years of the record. Changes in satellite coverage effectively create secular changes in the time series of view zenith angle (VZA) for a given location. There is an inconsistency in the current ISCCP cloud detection algorithm related to VZA: two satellites viewing the same location from different VZAs can produce vastly different estimates of cloud amount. Research is presented showing that a simple change to the cloud detection algorithm can greatly increase this consistency. This is accomplished by making the cloud–no cloud threshold VZA dependent. The resulting cloud amounts are more consistent between different satellites, and the distributions are shown to be more spatially homogeneous. Likewise, the more consistent spatial data lead to more consistent temporal statistics.
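
A toy illustration of a VZA-dependent cloud–no cloud test: the brightness-temperature threshold is allowed to grow with the slant-path length so that two satellites viewing the same scene at different view zenith angles apply a consistent test. The functional form and coefficients below are placeholders, not the ISCCP values.

```python
# Illustrative sketch of a view-zenith-angle (VZA) dependent cloud test:
# the clear-sky minus observed brightness-temperature threshold is made a
# function of VZA.  The form and coefficients are assumptions, not ISCCP's.
import numpy as np

def cloud_threshold_k(vza_deg, base_k=3.0, slope_k=2.0):
    """Threshold (K) that grows with the relative slant-path length."""
    airmass = 1.0 / np.cos(np.deg2rad(vza_deg))       # relative slant path
    return base_k + slope_k * (airmass - 1.0)

def is_cloudy(bt_obs_k, bt_clear_k, vza_deg):
    """Flag a pixel cloudy when it is colder than clear sky by more than
    the VZA-dependent threshold."""
    return (bt_clear_k - bt_obs_k) > cloud_threshold_k(vza_deg)

# Same scene seen by two satellites at different view zenith angles.
bt_clear, bt_obs = 290.0, 285.5
for vza in (10.0, 60.0):
    print(f"VZA {vza:4.1f} deg: cloudy = {is_cloudy(bt_obs, bt_clear, vza)}")
```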

Open access
Sarah Ringerud, Christa Peters-Lidard, Joe Munchak, and Yalei You

Abstract

Accurate, physically based precipitation retrieval over global land surfaces is an important goal of the NASA/JAXA Global Precipitation Measurement (GPM) mission. This is a difficult problem for the passive microwave constellation, as the signal over radiometrically warm land surfaces at microwave frequencies is indirect and typically requires inferring some type of relationship between an observed scattering signal and precipitation at the surface. GPM, with collocated radiometer and dual-frequency radar, is an excellent tool for tackling this problem and improving global retrievals. In the years following the launch of the GPM Core Observatory satellite, physically based passive microwave retrieval of precipitation over land continues to be challenging. Validation efforts suggest that the operational GPM passive microwave algorithm, the Goddard profiling algorithm (GPROF), tends to overestimate precipitation at the low (<5 mm h⁻¹) end of the distribution over land. In this work, retrieval sensitivities to dynamic surface conditions are explored by enhancing the algorithm with dynamic, retrieved information from a GPM-derived optimal estimation scheme. The retrieved parameters describing surface and background characteristics replace current static or ancillary GPROF information, including emissivity, water vapor, and snow cover. Results show that adding this information decreases the probability of false detection by 50% and, most importantly, that the enhancements with retrieved parameters move the retrieval away from dependence on ancillary datasets and lead to improved physical consistency.

Restricted access