Search Results
You are looking at 1–10 of 13 items for Author or Editor: Ray Wang
Abstract
Radiosonde data are a valuable resource in the detection of climate change in the upper atmosphere. Long time series of stratospheric temperature data, carefully screened and corrected to remove errors, are available for this purpose. Normal reporting practice usually ascribes a fixed time and position (the station location) to all data reported in the ascent. In reality, the ascent may take around 90 min to complete and the spatial drift of the radiosonde may exceed 200 km. This note examines the magnitude of the errors associated with this practice using simulated radiosonde data generated from the ECMWF reanalysis archive. The results suggest that the temperature errors, while generally small in the troposphere, are locally significant in the stratosphere, particularly in the jet stream areas. However, the impact of the drift errors on global climate statistics is very small. Errors in the wind and humidity data are also examined.
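The drift error described above can be illustrated with a minimal sketch: integrate an assumed horizontal wind profile over the ascent to see how far a balloon rising at a typical rate travels from its launch site. This is not the paper's ERA-based simulation; the wind profile, ascent rate, and jet height below are hypothetical stand-ins.

```python
import numpy as np

def drift_distance_km(ascent_rate=5.0, top_height=30000.0, dt=10.0):
    """Horizontal drift (km) and ascent time (min) for a radiosonde.

    The wind profile is a crude, assumed stand-in: speed increasing
    linearly to 40 m/s at a 10-km 'jet' level, then decaying above it.
    """
    t, z, x = 0.0, 0.0, 0.0
    while z < top_height:
        # assumed wind speed (m/s) as a function of height z (m)
        if z < 10000.0:
            u = 40.0 * z / 10000.0
        else:
            u = 40.0 * np.exp(-(z - 10000.0) / 15000.0)
        x += u * dt            # horizontal displacement this step
        z += ascent_rate * dt  # balloon rises at a constant rate
        t += dt
    return x / 1000.0, t / 60.0

dist, minutes = drift_distance_km()
```

Even this toy profile gives an ascent of roughly 100 min and a drift on the order of 100 km, consistent with the magnitudes the note quotes.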
Abstract
Airborne Doppler radar can collect data on target storms that are quite widely dispersed. However, the relatively long time required to sample an individual storm in detail, particularly with a single aircraft, and the amplification of the statistical uncertainty in the radial velocity estimates when Cartesian wind components are derived, suggest that errors in wind fields derived from airborne Doppler radar measurements would exceed those from a ground-based radar network that was better located to observe the same storm. Error distributions for two analysis methods (termed Overdetermined and Direct methods) are given and discussed for various flight configurations. Both methods are applied to data collected on a sea-breeze-induced storm that occurred in western Florida on 28 July 1982. Application of the direct solution, which does not use the continuity equation, and the overdetermined dual-Doppler method, which requires the use of the continuity equation, resulted in similar fields. Since the magnitudes of all errors are unknown and the response of each method to errors is different, it is difficult to assess overall which analysis performs better; indeed, each might be expected to perform best in different parts of the analysis domain. A flexible collection strategy can be followed with different analysis methods to optimize the quality of the resulting synthesized wind fields.
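The core of the overdetermined idea can be sketched in a few lines: each beam measures only the radial wind component, vr = u·ax + v·ay + w·az, where (ax, ay, az) is the unit look vector, so with three or more independent viewing angles the Cartesian wind follows from least squares. The viewing geometry and noise level below are made up for illustration and are not the paper's flight configurations.

```python
import numpy as np

rng = np.random.default_rng(0)

true_wind = np.array([12.0, -5.0, 0.5])   # (u, v, w) in m/s, hypothetical

# assumed unit look vectors for several viewing azimuths at one elevation
azimuths = np.deg2rad([10, 60, 120, 200, 280])
elev = np.deg2rad(15)
A = np.column_stack([np.cos(elev) * np.cos(azimuths),
                     np.cos(elev) * np.sin(azimuths),
                     np.full_like(azimuths, np.sin(elev))])

# noisy radial-velocity measurements, one per look direction
vr = A @ true_wind + rng.normal(0, 0.2, size=len(azimuths))

# overdetermined least-squares solution for the Cartesian wind
est, *_ = np.linalg.lstsq(A, vr, rcond=None)
```

Note how the small vertical projection (sin of a low elevation angle) amplifies radial-velocity noise in the estimate of w relative to u and v, which is the error-amplification effect the abstract refers to.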
Abstract
Data assimilation in the field of predictive land surface modeling is generally limited to using observational data to estimate optimal model states or restrict model parameter ranges. To date, very little work has attempted to systematically define and quantify error resulting from a model's inherent inability to simulate the natural system. This paper introduces a data assimilation technique that moves toward this goal by accounting for those deficiencies in the model itself that lead to systematic errors in model output. This is done using a supervised artificial neural network to “learn” and simulate systematic trends in the model output error. These simulations in turn are used to correct the model's output each time step. The technique is applied in two case studies, using latent heat flux at one site and net ecosystem exchange (NEE) of carbon dioxide at another. Root-mean-square error (rmse) in latent heat flux per time step was reduced from 27.5 to 18.6 W m−2 (32%) and monthly from 9.91 to 3.08 W m−2 (68%). For NEE, rmse per time step was reduced from 3.71 to 2.70 μmol m−2 s−1 (27%) and annually from 2.24 to 0.11 μmol m−2 s−1 (95%). In both cases the correction provided significantly greater gains than single-criterion parameter estimation on the same flux.
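A minimal sketch of the correction idea follows, with a tiny one-hidden-layer network standing in for the paper's supervised artificial neural network: the net is trained on the model-minus-observation error as a function of forcing variables, and its prediction is subtracted from the model output at each time step. All data, dimensions, and the bias structure here are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 400
forcing = rng.uniform(-1, 1, size=(n, 3))          # e.g. radiation, T, wind
truth = 100 * forcing[:, 0] + 20 * forcing[:, 1]   # "observed" flux
model = truth + 15 * forcing[:, 0] - 5 + rng.normal(0, 2, n)  # biased model
error = model - truth                               # supervised target

h = 8                                               # hidden units
W1 = rng.normal(0, 0.5, (3, h)); b1 = np.zeros(h)
W2 = rng.normal(0, 0.5, h);      b2 = 0.0
lr = 0.05
for _ in range(2000):                               # plain gradient descent
    a = np.tanh(forcing @ W1 + b1)                  # hidden activations
    pred = a @ W2 + b2                              # predicted error
    d = 2 * (pred - error) / n                      # dL/dpred for MSE loss
    W2 -= lr * (a.T @ d); b2 -= lr * d.sum()
    da = np.outer(d, W2) * (1 - a ** 2)             # backprop through tanh
    W1 -= lr * (forcing.T @ da); b1 -= lr * da.sum(axis=0)

# correct the model output by subtracting the learned systematic error
corrected = model - (np.tanh(forcing @ W1 + b1) @ W2 + b2)
rmse = lambda x: np.sqrt(np.mean((x - truth) ** 2))
```

Because the injected bias is systematic (a function of the forcing) while the residual noise is not, the corrected rmse drops toward the noise level, mirroring the kind of reduction the abstract reports.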
Abstract
This work investigates the impact of tropical sea surface temperature (SST) biases on the Subseasonal to Seasonal Prediction project (S2S) precipitation forecast skill over the contiguous United States (CONUS) in the Unified Forecast System (UFS) coupled model Prototype 6. Boreal summer (June–September) and winter (December–March) for 2011–18 were analyzed. The impact of tropical west Pacific (WP) and tropical North Atlantic (TNA) warm SST biases is evaluated using multivariate linear regression analysis. A warm SST bias over the WP influences CONUS precipitation remotely through a Rossby wave train in both seasons. During boreal winter, a warm SST bias over the TNA partly affects the strength of the North Atlantic subtropical high (NASH) center, which in the reforecasts is weaker than in reanalysis. The weaker NASH favors enhanced moisture transport from the Gulf of Mexico, leading to increased precipitation over the Southeast United States. During boreal summer, the NASH center is also weaker than in reanalysis and, in addition, its position is displaced to the northeast. The displacement further affects CONUS summer precipitation. The SST biases over the two tropical regions and their impacts become stronger as the forecast lead increases from week 1 to week 4. These tropical biases explain up to 10% of the CONUS precipitation biases on the S2S time scale.
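The multivariate regression step can be sketched as follows: a precipitation-bias series is regressed on the two SST-bias indices jointly, so each coefficient reflects the influence of one region with the other held fixed, and the explained-variance fraction bounds how much of the precipitation bias the tropical biases account for. All series and coefficients below are synthetic, not the UFS Prototype 6 data.

```python
import numpy as np

rng = np.random.default_rng(2)

n = 200
wp_bias = rng.normal(0, 1, n)    # tropical west Pacific SST-bias index
tna_bias = rng.normal(0, 1, n)   # tropical North Atlantic SST-bias index
# synthetic precipitation bias driven partly by both indices
precip_bias = 0.6 * wp_bias + 0.3 * tna_bias + rng.normal(0, 0.5, n)

# joint (multivariate) least-squares regression with an intercept
X = np.column_stack([wp_bias, tna_bias, np.ones(n)])
beta, *_ = np.linalg.lstsq(X, precip_bias, rcond=None)

explained = X @ beta
frac = 1 - np.var(precip_bias - explained) / np.var(precip_bias)
```

The recovered coefficients separate the two regions' contributions, and `frac` plays the role of the "explains up to 10%" statement in the abstract (here larger, since the synthetic signal is stronger).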
Abstract
Satellite and gridded meteorological data can be used to estimate evaporation (E) from land surfaces using simple diagnostic models. Two satellite datasets indicate a positive trend (first time derivative) in global available energy from 1983 to 2006, suggesting that positive trends in evaporation may occur in “wet” regions where energy supply limits evaporation. However, decadal trends in evaporation estimated from water balances of 110 wet catchments
Abstract
The future Surface Water and Ocean Topography (SWOT) mission aims to map sea surface height (SSH) in wide swaths with an unprecedented spatial resolution and subcentimeter accuracy. The instrument performance needs to be verified using independent measurements in a process known as calibration and validation (Cal/Val). The SWOT Cal/Val needs in situ measurements that can make synoptic observations of the SSH field over an O(100) km distance with an accuracy matching the SWOT requirements, specified in terms of the along-track wavenumber spectrum of SSH error. No existing in situ observing system has been demonstrated to meet this challenge. A field campaign was conducted during September 2019–January 2020 to assess the potential of various instruments and platforms to meet the SWOT Cal/Val requirement. These instruments include two GPS buoys, two bottom pressure recorders (BPR), three moorings with fixed conductivity–temperature–depth (CTD) and CTD profilers, and a glider. The observations demonstrated that 1) the SSH (hydrostatic) equation can be closed with 1–3 cm RMS residual using BPR, CTD mooring, and GPS SSH data, and 2) the upper-ocean steric height derived from the CTD moorings enables subcentimeter accuracy in the California Current region during the 2019/20 winter. Given that the three moorings are separated by distances of 10, 20, and 30 km, the observations provide valuable information about the small-scale SSH variability associated with the ocean circulation at frequencies ranging from hourly to monthly in the region. The combined analysis sheds light on the design of the SWOT mission postlaunch Cal/Val field campaign.
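The hydrostatic budget being closed here can be sketched numerically: an SSH variation splits into a mass part, seen by the bottom pressure recorder, and a steric part, computed from the density anomaly profile measured by the CTD mooring. The profile shape, mooring depth, and pressure anomaly below are illustrative assumptions, not campaign values.

```python
import numpy as np

g, rho0 = 9.81, 1025.0           # gravity (m/s^2), reference density (kg/m^3)

z = np.linspace(-500, 0, 101)    # assumed mooring depth grid (m)
dz = z[1] - z[0]

# synthetic density anomaly profile (kg/m^3), surface-intensified warming
rho_anom = -0.05 * np.exp(z / 100.0)

# steric height: eta_steric = -(1/rho0) * integral of rho' over depth
eta_steric = -(rho_anom.sum() * dz) / rho0

# mass contribution: a bottom-pressure anomaly of 50 Pa (assumed)
p_bottom_anom = 50.0
eta_mass = p_bottom_anom / (rho0 * g)

# total SSH anomaly the GPS buoy should observe, in meters
ssh = eta_steric + eta_mass
```

With these numbers both terms are roughly half a centimeter, so a residual of 1–3 cm RMS between this sum and an independent GPS SSH measurement is exactly the kind of closure the campaign demonstrates.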
EC-Earth
A Seamless Earth-System Prediction Approach in Action
Abstract
This article provides an overview of the NASA Atmospheric Tomography (ATom) mission and a summary of selected scientific findings to date. ATom was an airborne measurement and modeling campaign aimed at characterizing the composition and chemistry of the troposphere over the most remote regions of the Pacific, Southern, Atlantic, and Arctic Oceans, and examining the impact of anthropogenic and natural emissions on a global scale. These remote regions dominate global chemical reactivity and are exceptionally important for global air quality and climate. ATom data provide the in situ measurements needed to understand the range of chemical species and their reactions, and to test satellite remote sensing observations and global models over large regions of the remote atmosphere. Lack of data in these regions, particularly over the oceans, has limited our understanding of how atmospheric composition is changing in response to shifting anthropogenic emissions and physical climate change. ATom was designed as a global-scale tomographic sampling mission with extensive geographic and seasonal coverage, tropospheric vertical profiling, and detailed speciation of reactive compounds and pollution tracers. ATom flew the NASA DC-8 research aircraft over four seasons to collect a comprehensive suite of measurements of gases, aerosols, and radical species from the remote troposphere and lower stratosphere on four global circuits from 2016 to 2018. Flights maintained near-continuous vertical profiling of 0.15–13-km altitudes on long meridional transects of the Pacific and Atlantic Ocean basins. Analysis and modeling of ATom data have led to the significant early findings highlighted here.