Search Results
Showing 1–10 of 10 items for Author or Editor: D. E. Shaw
Abstract
The CSIRO Division of Cloud Physics has designed and built 103 automatic recording raingages, at a cost of about $US600 each, for use in a cloud-seeding experiment. Each unit consists of a siphoned tipping bucket interfaced to a monophonic cassette tape recorder. The raingages have a resolution of 0.2 mm, and this can be recorded to an accuracy of ∼2 s. Each tape can store 3000 tip events, and with a battery drain of 500 μA the units can be left unattended in the field for many months. Experience with the network has shown that ∼6% of the units have failed when left unattended for 4½ months. Some examples of the type and quality of the data that can be obtained from such a network are presented.
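The tip-event record described above lends itself to simple post-processing. As a hypothetical sketch (not the CSIRO decoding software), the following converts a list of tip timestamps, at 0.2 mm of rain per tip, into a mean rain rate; the function name and sample times are invented.

```python
# Hypothetical illustration: converting tipping-bucket tip times to a
# mean rain rate. Each tip represents 0.2 mm of rain (the resolution
# quoted in the abstract); timestamps are in seconds.

TIP_DEPTH_MM = 0.2  # rainfall depth per bucket tip

def rain_rate_mm_per_h(tip_times_s):
    """Mean rain rate (mm/h) between the first and last tip."""
    if len(tip_times_s) < 2:
        return 0.0
    duration_h = (tip_times_s[-1] - tip_times_s[0]) / 3600.0
    depth_mm = (len(tip_times_s) - 1) * TIP_DEPTH_MM
    return depth_mm / duration_h

tips = [0, 120, 200, 260, 300, 330]  # six tips over 330 s -> 1.0 mm
print(round(rain_rate_mm_per_h(tips), 2))  # 10.91
```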
Abstract
Ice crystals were grown in a supercooled cloud at temperatures ranging from −3°C to −21°C for periods from 30–40 s to 150–180 s. When the axial dimensions at a given time were examined as a function of temperature, there was a marked maximum along the a axis at −15°C and a secondary broader maximum along the c axis at −6°C. The growth of the axial dimensions can be adequately represented by a linear function of time.
A power function of time was fitted to the crystal mass growth measurements; these show a sharp maximum at −15°C and a secondary broader maximum at −7°C.
Crystal bulk densities estimated from the masses and axial dimensions vary with temperature in a complicated way, with a minimum of about 0.4 Mg m−3 at −5 and −17°C and a maximum of 0.92 Mg m−3 (pure ice), and appear to be independent of time.
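The power-function fit mentioned above can be illustrated with a minimal sketch (not the authors' fitting code): ordinary least squares in log-log coordinates recovers the parameters of a mass growth law m = a·t^b. The data here are synthetic and noise-free, with invented coefficients.

```python
import math

# Illustrative power-law fit: linear regression on log(t), log(m)
# recovers a and b in m = a * t**b.

def fit_power_law(times, masses):
    """Return (a, b) for m = a * t**b via least squares on logs."""
    xs = [math.log(t) for t in times]
    ys = [math.log(m) for m in masses]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = math.exp(mean_y - b * mean_x)
    return a, b

# Synthetic, noise-free data with a = 0.01, b = 1.5
times = [30, 60, 90, 120, 150, 180]
masses = [0.01 * t ** 1.5 for t in times]
a, b = fit_power_law(times, masses)
print(round(a, 4), round(b, 4))  # 0.01 1.5
```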
Abstract
A Monte-Carlo model by Lapidus and Shafrir for the temporal development of cloud droplet spectra is critically examined. Modifications are suggested to improve the statistical validity of the model.
Both the original model and the modified model only approximately simulate the coalescence process. It is demonstrated that when large collection kernels, such as the Golovin collection kernel, are used both models are inadequate. However, for smaller and more realistic collection kernels the models do not produce results that differ substantially from the standard solutions given by the coalescence equations.
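A toy version of such a Monte Carlo coalescence scheme can be sketched as follows. This is not the Lapidus–Shafrir model or its proposed modification, only a minimal pairwise scheme with the Golovin kernel K(x, y) = b(x + y); the parameter values are invented, and the time step is assumed small enough that the pairwise coalescence probability stays below 1.

```python
import random

# Minimal Monte Carlo coalescence sketch: at each step a random pair
# of droplets is merged with probability proportional to the Golovin
# kernel K(x, y) = b * (x + y). Total mass is conserved by construction.

def simulate(masses, b, dt, steps, rng):
    masses = list(masses)
    for _ in range(steps):
        if len(masses) < 2:
            break
        i, j = rng.sample(range(len(masses)), 2)
        p = b * (masses[i] + masses[j]) * dt  # coalescence probability over dt
        if rng.random() < p:
            masses[i] += masses[j]
            masses.pop(j)
    return masses

rng = random.Random(0)
final = simulate([1.0] * 200, b=0.01, dt=0.1, steps=5000, rng=rng)
print(sum(final))  # 200.0 -- mass is conserved; droplet count decreases
```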
Abstract
A cloud-seeding experiment was conducted in Tasmania using a target area and three control areas. Seeding was on a random basis using silver-iodide smoke released from an aircraft. Evidence is presented that seeding increased rainfall in the eastern half of the target area during autumn.
Abstract
Previous studies of the low-level jet (LLJ) over the central Great Plains of the United States have been unable to determine the role that mesoscale and smaller circulations play in the transport of moisture. To address this issue, two aircraft missions during the International H2O Project (IHOP_2002) were designed to observe closely a well-developed LLJ over the Great Plains (primarily Oklahoma and Kansas) with multiple observation platforms. In addition to standard operational platforms (most important, radiosondes and profilers) to provide the large-scale setting, dropsondes released from the aircraft at 55-km intervals and a pair of onboard lidar instruments—High Resolution Doppler Lidar (HRDL) for wind and differential absorption lidar (DIAL) for moisture—observed the moisture transport in the LLJ at greater resolution. Using these observations, the authors describe the multiscalar structure of the LLJ and then focus attention on the bulk properties and effects of scales of motion by computing moisture fluxes through cross sections that bracket the LLJ. From these computations, the Reynolds averages within the cross sections can be computed. This allows an integrated estimate to be made of the contribution of small-scale (mesoscale to convective scale) circulations to the overall transport. The performance of the Weather Research and Forecasting (WRF) Model in forecasting the intensity and evolution of the LLJ for this case is briefly examined.
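The Reynolds-averaging step referred to above can be illustrated with a minimal sketch (not the authors' analysis code): the cross-section average of the product q·v splits into a mean part and an eddy covariance part, the latter capturing the smaller-scale contribution. The sample values are invented.

```python
# Illustrative Reynolds decomposition of a moisture flux:
# <q v> = <q><v> + <q'v'>, where primes denote deviations from
# the cross-section mean.

def mean(xs):
    return sum(xs) / len(xs)

def flux_decomposition(q, v):
    """Return (total, mean_part, eddy_part) with total = mean + eddy."""
    qbar, vbar = mean(q), mean(v)
    total = mean([qi * vi for qi, vi in zip(q, v)])
    eddy = mean([(qi - qbar) * (vi - vbar) for qi, vi in zip(q, v)])
    return total, qbar * vbar, eddy

q = [10.0, 12.0, 11.0, 13.0]   # specific humidity samples (g/kg), invented
v = [8.0, 10.0, 9.0, 11.0]     # wind component samples (m/s), invented
total, mean_part, eddy = flux_decomposition(q, v)
print(total, mean_part, eddy)  # 110.5 109.25 1.25
```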
During the summer of 1989, the Forecast Systems Laboratory of the National Oceanic and Atmospheric Administration sponsored an evaluation of artificial-intelligence-based systems that forecast severe convective storms. The evaluation experiment, called Shootout-89, took place in Boulder, Colorado, and focused on storms over the northeastern Colorado foothills and plains.
Six systems participated in Shootout-89: three traditional expert systems, a hybrid system including a linear model augmented by a small expert system, an analogue-based system, and a system developed using methods from the cognitive science/judgment analysis tradition.
Each day of the exercise, the systems generated 2–9-h forecasts of the probabilities of occurrence of nonsignificant weather, significant weather, and severe weather in each of four regions in northeastern Colorado. A verification coordinator working at the Denver Weather Service Forecast Office gathered ground-truth data from a network of observers.
The systems were evaluated on several measures of forecast skill, on timeliness, on ease of learning, and on ease of use. They were generally easy to operate; however, they required substantially different levels of meteorological expertise on the part of their users, reflecting the various operational environments for which they had been designed. The systems varied in their statistical behavior, but on this difficult forecast problem, they generally showed a skill approximately equal to that of persistence forecasts and climatological forecasts.
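One standard way to quantify "skill approximately equal to climatology," as described above, is the Brier score and the skill score derived from it. The following is a hypothetical illustration, not data from Shootout-89; all forecast probabilities and outcomes are invented.

```python
# Brier score of probabilistic forecasts versus a climatological
# baseline; the skill score is 1 - BS_system / BS_reference, with
# values near zero meaning "no better than climatology".

def brier(probs, outcomes):
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

outcomes = [1, 0, 0, 1, 0, 0, 0, 1]                     # event occurred?
system = [0.7, 0.2, 0.4, 0.6, 0.1, 0.3, 0.2, 0.5]       # invented forecasts
climo = [0.375] * len(outcomes)                          # base rate = 3/8

bs_sys, bs_ref = brier(system, outcomes), brier(climo, outcomes)
skill = 1.0 - bs_sys / bs_ref
print(round(bs_sys, 4), round(bs_ref, 4), round(skill, 3))  # 0.105 0.2344 0.552
```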
A field campaign was carried out near Boardman, Oregon, to study the effects of subgrid-scale variability of sensible- and latent-heat fluxes on surface boundary-layer properties. The experiment involved three U.S. Department of Energy laboratories, one National Oceanic and Atmospheric Administration laboratory, and several universities. The experiment was conducted in a region of severe contrasts in adjacent surface types that accentuated the response of the atmosphere to variable surface forcing. Large values of sensible-heat flux and low values of latent-heat flux characterized a sagebrush steppe area; significantly smaller sensible-heat fluxes and much larger latent-heat fluxes were associated with extensive tracts of irrigated farmland to the north, east, and west of the steppe. Data were obtained from an array of surface flux stations, remote-sensing devices, an instrumented aircraft, and soil and vegetation measurements. The data will be used to address the problem of extrapolating from a limited number of local measurements to area-averaged values of fluxes suitable for use in global climate models.
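One simple approach to the extrapolation problem described above is tile (mosaic) aggregation: area-average the flux as a land-cover-weighted mean of per-surface-type measurements. This sketch is not the campaign's method; the surface fractions and flux values are invented.

```python
# Illustrative tile aggregation: the grid-cell flux is the mean of
# per-surface-type fluxes weighted by their areal fractions.

def area_average(fluxes, fractions):
    """Weighted mean flux; fractions must sum to 1."""
    assert abs(sum(fractions) - 1.0) < 1e-9
    return sum(f * w for f, w in zip(fluxes, fractions))

# sensible-heat flux (W m^-2): steppe vs irrigated farmland (invented)
h_flux = area_average([300.0, 60.0], [0.4, 0.6])
print(h_flux)  # 156.0
```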
Abstract
Since 2007, meteorologists of the U.S. Army Test and Evaluation Command (ATEC) at Dugway Proving Ground (DPG), Utah, have relied on a mesoscale ensemble prediction system (EPS) known as the Ensemble Four-Dimensional Weather System (E-4DWX). This article describes E-4DWX and the innovative way in which it is calibrated, how it performs, why it was developed, and how meteorologists at DPG use it. E-4DWX has 30 operational members, each configured to produce 48-h forecasts every 6 h on a 272-processor high-performance computer (HPC) at DPG. The ensemble’s members differ from one another in initial-, lateral-, and lower-boundary conditions; in methods of data assimilation; and in physical parameterizations. The predictive core of all members is the Advanced Research core of the Weather Research and Forecasting (WRF) Model. Numerical predictions of the most useful near-surface variables are dynamically calibrated through algorithms that combine logistic regression and quantile regression, generating statistically realistic probabilistic depictions of the atmosphere’s future state at DPG’s observing sites. Army meteorologists view E-4DWX’s output via customized figures posted to a restricted website. Some of these figures summarize collective results—for example, through means, standard deviations, or fractions of the ensemble exceeding thresholds. Other figures show each forecast, individually or grouped—for example, through spaghetti diagrams and time series. This article presents examples of each type of figure.
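The quantile-regression half of the calibration described above rests on a simple idea that can be sketched in a few lines (this is not E-4DWX's algorithm): the pinball loss at level τ is minimized by the τ-quantile, so fitting it to past forecast errors yields calibrated percentile bounds. The error sample below is invented.

```python
# Pinball (quantile) loss for a constant prediction `value` at level
# tau; minimizing it over past forecast errors picks out the
# tau-quantile of those errors.

def pinball(tau, value, observations):
    """Mean pinball loss of predicting `value` at quantile level tau."""
    loss = 0.0
    for y in observations:
        diff = y - value
        loss += tau * diff if diff >= 0 else (tau - 1) * diff
    return loss / len(observations)

errors = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 3.0]  # invented errors
# The 0.8-quantile of the sample minimizes the tau = 0.8 pinball loss:
best = min(errors, key=lambda v: pinball(0.8, v, errors))
print(best)  # 1.5
```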
Abstract
The Lake Michigan Ozone Study 2017 (LMOS 2017) was a collaborative multiagency field study targeting ozone chemistry, meteorology, and air quality observations in the southern Lake Michigan area. The primary objective of LMOS 2017 was to provide measurements to improve air quality modeling of the complex meteorological and chemical environment in the region. LMOS 2017 science questions included spatiotemporal assessment of nitrogen oxides (NOx = NO + NO2) and volatile organic compounds (VOC) emission sources and their influence on ozone episodes; the role of lake breezes; contribution of new remote sensing tools such as GeoTASO, Pandora, and TEMPO to air quality management; and evaluation of photochemical grid models. The observing strategy included GeoTASO on board the NASA UC-12 aircraft capturing NO2 and formaldehyde columns, an in situ profiling aircraft, two ground-based coastal enhanced monitoring locations, continuous NO2 columns from coastal Pandora instruments, and an instrumented research vessel. Local photochemical ozone production was observed on 2 June, 9–12 June, and 14–16 June, providing insights on the processes relevant to state and federal air quality management. The LMOS 2017 aircraft mapped significant spatial and temporal variation of NO2 emissions as well as polluted layers with rapid ozone formation occurring in a shallow layer near the Lake Michigan surface. Meteorological characteristics of the lake breeze were observed in detail and measurements of ozone, NOx, nitric acid, hydrogen peroxide, VOC, oxygenated VOC (OVOC), and fine particulate matter (PM2.5) composition were conducted. This article summarizes the study design, directs readers to the campaign data repository, and presents a summary of findings.
Abstract
The Southeast Atmosphere Studies (SAS), which included the Southern Oxidant and Aerosol Study (SOAS); the Southeast Nexus (SENEX) study; and the Nitrogen, Oxidants, Mercury and Aerosols: Distributions, Sources and Sinks (NOMADSS) study, was deployed in the field from 1 June to 15 July 2013 in the central and eastern United States, and it overlapped with and was complemented by the Studies of Emissions, Atmospheric Composition, Clouds and Climate Coupling by Regional Surveys (SEAC4RS) campaign. SAS investigated atmospheric chemistry and the associated air quality and climate-relevant particle properties. Coordinated measurements from six ground sites, four aircraft, tall towers, balloon-borne sondes, existing surface networks, and satellites provide in situ and remotely sensed data on trace-gas composition, aerosol physicochemical properties, and local and synoptic meteorology. Selected SAS findings indicate 1) dramatically reduced NOx concentrations have altered ozone production regimes; 2) indicators of “biogenic” secondary organic aerosol (SOA), once considered part of the natural background, were positively correlated with one or more indicators of anthropogenic pollution; and 3) liquid water dramatically impacted particle scattering while biogenic SOA did not. SAS findings suggest that atmosphere–biosphere interactions modulate ambient pollutant concentrations through complex mechanisms and feedbacks not yet adequately captured in atmospheric models. The SAS dataset, now publicly available, is a powerful constraint to develop predictive capability that enhances model representation of the response and subsequent impacts of changes in atmospheric composition to changes in emissions, chemistry, and meteorology.