Search Results

You are looking at 31 - 40 of 66 items for

  • Author or Editor: Roy M. Rasmussen
  • Refine by Access: All Content
Jeffrey K. Lew, Derek C. Montague, Hans R. Pruppacher, and Roy M. Rasmussen

Abstract

The effects of porosity on the accretional growth characteristics of ice crystal aggregates (snowflakes) are investigated by riming circular disks of ice in a cloud tunnel. Twelve disk models were used, sized 5 to 6 mm and 10 to 11 mm in diameter, with various hole sizes and numbers, resulting in porosities ranging between 15% and 50%. As the tunnel airflow speed was increased, the onset of riming occurred much earlier for porous disks than for similarly sized nonporous disks. For porosities in excess of 15%, the rime growth rates were found to be relatively insensitive to the extent of porosity. However, these rates were an order of magnitude greater than those for nonporous disks of the same size rimed at the same flow velocity. The appearance of the rime was similar to that observed in the atmosphere under similar conditions. A large stellar model was rimed using the same techniques, and its riming rate was found to be in fair agreement with previous experiments.

Full access
Jeffrey K. Lew, Derek C. Montague, Hans R. Pruppacher, and Roy M. Rasmussen

Abstract

Natural and artificial snowflakes have been rimed both in free fall and while suspended on a thin flexible fiber in the UCLA cloud tunnel. The results of these experiments show that during the early stage of riming, the motions exhibited by a riming aggregate do not affect the distribution of the rime accretion, in agreement with the observations of the riming behavior of porous ice disks reported in Part I of this study. It was also found that the collection kernel of a 10-mm-diameter porous aggregate increased with porosity at the same rate as that found in Part I of this study.

A discussion is presented of the free-fall behavior and the time evolution of the terminal velocities of riming aggregates.

Full access
Matteo Colli, Mattia Stagnaro, Luca G. Lanza, Roy Rasmussen, and Julie M. Thériault

Abstract

Adjustments for the wind-induced undercatch of snowfall measurements use transfer functions to account for the expected reduction in collection efficiency with increasing wind speed for a particular catching-type gauge. Collection efficiency curves, derived from field experiments or numerical simulation, express efficiency as a function of wind speed and may involve further explanatory variables such as surface air temperature and/or precipitation type. However, while the wind speed (or wind speed and temperature) approach is generally effective at reducing the measurement bias, it does not significantly reduce the root-mean-square error (RMSE) of the residuals, implying that part of the variance is still unexplained. In this study, we show that using precipitation intensity as the explanatory variable significantly reduces the scatter of the residuals. This is achieved by optimized curve fitting of field measurements from the Marshall Field Site (Colorado, United States), using a nongradient optimization algorithm to ensure optimal binning of the experimental data. The analysis of a recent quality-controlled dataset from the Solid Precipitation Intercomparison Experiment (SPICE) campaign of the World Meteorological Organization confirms the scatter reduction, showing that this approach is suitable for a variety of locations and catching-type gauges. Using computational fluid dynamics simulations, we demonstrate that the physical basis of the reduction in RMSE is the correlation of precipitation intensity with the particle size distribution. Overall, these findings could be relevant in operational conditions, since the proposed adjustment of precipitation measurements requires only wind sensor and precipitation gauge data.
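As a hedged illustration of the two-predictor idea the abstract describes, the sketch below adjusts a gauge measurement using both wind speed and precipitation intensity. The functional form and the coefficients a, b, and c are invented for illustration only; they are not the fitted transfer function from the study.

```python
import math

def collection_efficiency(wind_speed, intensity, a=0.16, b=0.3, c=0.2):
    """Hypothetical transfer function: efficiency decays with wind speed,
    but less steeply at high intensity (larger hydrometeors are deflected
    less by the flow distortion). Coefficients a, b, c are illustrative
    placeholders. Assumes intensity > 0 (mm/h) and wind_speed >= 0 (m/s)."""
    ce = (1.0 - c) * math.exp(-a * wind_speed * intensity ** (-b)) + c
    return min(ce, 1.0)

def adjusted_precipitation(measured, wind_speed, intensity):
    """Undercatch-corrected amount: measured value divided by efficiency."""
    return measured / collection_efficiency(wind_speed, intensity)
```

At the same wind speed, a heavier (larger-particle) event yields a higher efficiency and hence a smaller correction, which is the scatter-reducing behavior the study attributes to the intensity predictor.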

Free access
Matthew R. Kumjian, Steven A. Rutledge, Roy M. Rasmussen, Patrick C. Kennedy, and Mike Dixon

Abstract

High-resolution X-band polarimetric radar data were collected in 19 snowstorms over northern Colorado in early 2013 as part of the Front Range Orographic Storms (FROST) project. In each case, small, vertically erect convective turrets were observed near the echo top. These “generating cells” are similar to those reported in the literature and are characterized by ~1-km horizontal and vertical dimensions, vertical velocities of 1-2 m s^-1, and lifetimes of at least 10 min. In some cases, these generating cells are enshrouded by enhanced differential reflectivity Z_DR, indicating a “shroud” of pristine crystals enveloping the larger, more isotropic particles. The anticorrelation of the radar reflectivity factor at horizontal polarization Z_H and Z_DR suggests ongoing aggregation or riming of particles in the core of generating cells. For cases in which radiosonde data were collected, potential instability was found within the layer in which generating cells were observed. The persistence of these layers suggests that radiative effects are important, perhaps by some combination of cloud-top cooling and release of latent enthalpy through depositional and riming growth of particles within the cloud. The implications for the ubiquity of generating cells and their role as a mechanism for ice crystal initiation and growth are discussed.

Full access
Julie M. Thériault, Roy Rasmussen, Eddy Petro, Jean-Yves Trépanier, Matteo Colli, and Luca G. Lanza

Abstract

The accurate measurement of snowfall is important in various fields of study, such as climate variability, transportation, and water resources. A major concern is that snowfall measurements are difficult and can contain significant errors. For example, the collection efficiency of most gauge–shield configurations generally decreases with increasing wind speed. In addition, much scatter is observed for a given wind speed, which is thought to be caused by the type of snowflake. Furthermore, the collection efficiency depends strongly on the reference used to correct the data, which is often the Double Fence Intercomparison Reference (DFIR) recommended by the World Meteorological Organization. The goal of this study is to assess the impact of weather conditions on the collection efficiency of the DFIR. The DFIR is formally defined as a manual gauge placed within a double fence; in this study, however, only the double fence is investigated, though it is still referred to as the DFIR. To address this issue, a detailed analysis of the flow field in the vicinity of the DFIR is conducted using computational fluid dynamics. Particle trajectories are obtained to compute the collection efficiency associated with different precipitation types for varying wind speeds. The results show that the precipitation reaching the center of the DFIR can exceed 100% of the actual precipitation, depending on the snowflake type, wind speed, and wind direction. Overall, this study contributes to a better understanding of the sources of uncertainty associated with the use of the DFIR as a reference gauge to measure snowfall.
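The trajectory-counting idea can be sketched in a highly simplified form: seed particles aloft, integrate their paths to the ground, and count the fraction arriving within the orifice. The sketch below assumes a ballistic particle model in a uniform horizontal wind, whereas the study interpolates full CFD flow fields around the fence; all dimensions and speeds are placeholders.

```python
def landing_x(x0, height, wind_speed, fall_speed, dt=0.01):
    """Integrate a particle falling at its terminal speed while advected
    by a uniform horizontal wind; returns the x coordinate at the ground.
    A real simulation would sample the local CFD velocity field here."""
    x, z = x0, height
    while z > 0.0:
        x += wind_speed * dt
        z -= fall_speed * dt
    return x

def collection_ratio(orifice_center, orifice_radius, height,
                     wind_speed, fall_speed, n=200):
    """Fraction of uniformly seeded particles landing inside the orifice.
    Seeds are placed over a span four radii wide, shifted upwind by the
    mean drift so the plume is centered on the orifice."""
    span = 4.0 * orifice_radius
    drift = wind_speed * height / fall_speed
    hits = 0
    for i in range(n):
        x0 = orifice_center - span / 2.0 + span * i / (n - 1)
        x_land = landing_x(x0 - drift, height, wind_speed, fall_speed)
        if abs(x_land - orifice_center) <= orifice_radius:
            hits += 1
    return hits / n
```

In a uniform wind this recovers the seeded geometry exactly; the interesting effects in the study come from the non-uniform, fence-distorted flow that this sketch deliberately omits.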

Full access
Matteo Colli, Roy Rasmussen, Julie M. Thériault, Luca G. Lanza, C. Bruce Baker, and John Kochendorfer

Abstract

Recent studies have used numerical models to estimate the collection efficiency of solid precipitation gauges when exposed to the wind, in both shielded and unshielded configurations. The models used computational fluid dynamics (CFD) simulations of the airflow pattern generated by the aerodynamic response to the gauge–shield geometry; these serve as initial conditions for Lagrangian tracking of solid precipitation particles. Validation of the results against field observations yielded similarities in the overall behavior, but the model output only approximately reproduced the dependence of the experimental collection efficiency on wind speed. This paper presents an improved snowflake trajectory modeling scheme that incorporates a dynamically determined drag coefficient. The drag coefficient is estimated from the local Reynolds number derived from CFD simulations within a time-independent Reynolds-averaged Navier–Stokes approach. The proposed dynamic model greatly improves the consistency of the results with the field observations recently obtained at the Marshall Field winter precipitation test bed in Boulder, Colorado.
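The Reynolds-dependent drag idea can be sketched with the standard Schiller–Naumann sphere correlation standing in for the paper's CFD-derived local values (an assumption: the actual scheme models snowflakes, not smooth spheres, and evaluates the slip velocity against the simulated flow field):

```python
def drag_coefficient(reynolds):
    """Schiller-Naumann drag correlation for a sphere, a common stand-in
    valid for Re below roughly 1000; it recovers Stokes drag (24/Re)
    as Re approaches zero."""
    return 24.0 / reynolds * (1.0 + 0.15 * reynolds ** 0.687)

def reynolds_number(diameter, relative_speed, kinematic_viscosity=1.5e-5):
    """Particle Reynolds number from the local slip speed; the default
    kinematic viscosity is for near-surface air (m^2/s)."""
    return relative_speed * diameter / kinematic_viscosity
```

Evaluating the drag dynamically at each step, rather than with a constant coefficient, is what lets the trajectory model respond to the locally varying flow around the gauge and shield.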

Full access
Roy M. Rasmussen, John Hallett, Rick Purcell, Scott D. Landolt, and Jeff Cole

Abstract

A new instrument designed to measure precipitation, the “hotplate precipitation gauge,” is described. The instrument consists of a heated thin disk that provides a reliable, low-maintenance method to measure precipitation rate every minute without the use of a wind shield. The disk consists of two heated, thermally isolated, identical aluminum plates, one facing upward and the other downward. The two plates are heated independently, and both are maintained at a constant temperature above 75°C by electronic circuitry that heats the plates according to the deviation from the set temperature. Precipitation rate is estimated by calculating the power required either to melt or evaporate snow or to evaporate rain on the upward-facing plate, compensated for wind effects by subtracting the power on the lower, downward-facing plate. Data from the World Meteorological Organization reference standard for liquid-equivalent snowfall rate measurements, the Double Fence Intercomparison Reference (DFIR) shield system, were used as the truth to develop the hotplate algorithm. The hotplate measures the liquid-equivalent precipitation rate from 0.25 to 35 mm h^-1 within the National Weather Service standard for solid precipitation measurement. The hotplate was also shown to measure wind speed during severe icing conditions and during vibration. The high update rate (precipitation rate, wind speed, and temperature every 1 min) makes this an ideal gauge for real-time applications, such as aircraft deicing and road weather monitoring. It serves as an accumulation gauge by integrating the 1-min rates over time. It can also be used as a rain gauge for rainfall rates up to 35 mm h^-1.
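The power-balance principle behind the hotplate can be sketched as follows. This is purely illustrative, not the instrument's calibrated algorithm (which also handles wind and radiative corrections); the latent heats are textbook constants and the plate area is an arbitrary example value.

```python
L_FUSION = 3.34e5   # J/kg, latent heat of fusion (melt snow)
L_VAPOR = 2.26e6    # J/kg, latent heat of vaporization (evaporate water)
RHO_WATER = 1000.0  # kg/m^3

def precip_rate_mm_per_h(p_top, p_bottom, plate_area, snow=True):
    """Liquid-equivalent rate from the net plate power. The power on the
    downward-facing plate is subtracted as a wind-effect compensation;
    the remainder is assumed to melt and evaporate snow (or evaporate
    rain) on the upward-facing plate. plate_area in m^2, powers in W."""
    p_net = max(p_top - p_bottom, 0.0)
    latent = (L_FUSION + L_VAPOR) if snow else L_VAPOR
    mass_rate = p_net / latent                       # kg/s of water
    depth_rate = mass_rate / (RHO_WATER * plate_area)  # m/s liquid equiv.
    return depth_rate * 1000.0 * 3600.0              # mm/h
```

For the same net power, rain yields a higher inferred rate than snow because no fusion energy is required, which is why the algorithm must distinguish precipitation phase.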

Full access
Kevin E. Trenberth, Aiguo Dai, Roy M. Rasmussen, and David B. Parsons

From a societal, weather, and climate perspective, precipitation intensity, duration, frequency, and phase are as much of concern as total amounts, as these factors determine the disposition of precipitation once it hits the ground and how much runs off. At the extremes of precipitation incidence are the events that give rise to floods and droughts, whose changes in occurrence and severity have an enormous impact on the environment and society. Hence, advancing understanding and the ability to model and predict the character of precipitation is vital but requires new approaches to examining data and models. A variety of mechanisms, storm systems among them, act to bring about precipitation. Because the rate of precipitation, conditional on when it falls, greatly exceeds the rate of replenishment of moisture by surface evaporation, most precipitation comes from moisture already in the atmosphere at the time the storm begins, and transport of moisture by the storm-scale circulation into the storm is vital. Hence, the intensity of precipitation depends on available moisture, especially for heavy events. As climate warms, the amount of moisture in the atmosphere, which is governed by the Clausius–Clapeyron equation, is expected to rise much faster than the total precipitation amount, which is governed by the surface heat budget through evaporation. This implies that the main changes to be experienced are in the character of precipitation: increases in intensity must be offset by decreases in duration or frequency of events. The timing, duration, and intensity of precipitation can be systematically explored via the diurnal cycle, whose correct simulation in models remains an unsolved challenge of vital importance in global climate change. Typical problems include the premature initiation of convection, and precipitation events that are too light and too frequent.
These challenges in observations, modeling, and understanding precipitation changes are being taken up in the NCAR “Water Cycle Across Scales” initiative, which will exploit the diurnal cycle as a test bed for a hierarchy of models to promote improvements in models.
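The Clausius–Clapeyron scaling invoked above can be made concrete with the Magnus approximation for saturation vapor pressure: near 15°C it gives roughly a 6-7% increase in the atmosphere's moisture-holding capacity per kelvin of warming, well above typical projected increases in total precipitation.

```python
import math

def saturation_vapor_pressure(t_celsius):
    """Magnus (Bolton 1980) approximation to the saturation vapor
    pressure over liquid water, in hPa."""
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

def moisture_increase_per_kelvin(t_celsius):
    """Fractional increase in saturation vapor pressure for +1 K of
    warming at the given temperature (the Clausius-Clapeyron rate)."""
    return (saturation_vapor_pressure(t_celsius + 1.0)
            / saturation_vapor_pressure(t_celsius)) - 1.0
```

The rate is itself temperature dependent, which is one reason changes in precipitation character differ between warm and cold regions.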

Full access
Gregory Thompson, Paul R. Field, Roy M. Rasmussen, and William D. Hall

Abstract

A new bulk microphysical parameterization (BMP) has been developed for use with the Weather Research and Forecasting (WRF) Model or other mesoscale models. As compared with earlier single-moment BMPs, the new scheme incorporates a large number of improvements to both physical processes and computer coding, and it employs many techniques found in far more sophisticated spectral/bin schemes using lookup tables. Unlike any other BMP, the assumed snow size distribution depends on both ice water content and temperature and is represented as a sum of exponential and gamma distributions. Furthermore, snow assumes a nonspherical shape with a bulk density that varies inversely with diameter as found in observations and in contrast to nearly all other BMPs that assume spherical snow with constant density. The new scheme’s snow category was readily modified to match previous research in sensitivity experiments designed to test the sphericity and distribution shape characteristics. From analysis of four idealized sensitivity experiments, it was determined that the sphericity and constant density assumptions play a major role in producing supercooled liquid water whereas the assumed distribution shape plays a lesser, but nonnegligible, role. Further testing using numerous case studies and comparing model results with in situ and other observations confirmed the results of the idealized experiments and are briefly mentioned herein, but more detailed, microphysical comparisons with observations are found in a companion paper in this series (Part III, forthcoming).
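The snow size distribution described in the abstract, a sum of exponential and gamma terms, can be sketched as below. In the actual scheme the coefficients are diagnosed from ice water content and temperature; the values here are arbitrary placeholders for illustration.

```python
import math

def snow_size_distribution(d, n0_exp=5.0e6, lam_exp=2.0e3,
                           n0_gam=1.0e12, mu=2.0, lam_gam=5.0e3):
    """Number concentration N(D) (m^-4) at diameter d (m), as the sum of
    an exponential term and a gamma term. The coefficient values are
    illustrative only; the scheme ties them to ice water content and
    temperature rather than fixing them."""
    return (n0_exp * math.exp(-lam_exp * d)
            + n0_gam * d ** mu * math.exp(-lam_gam * d))
```

The two-term form lets the distribution carry both the many small crystals (exponential term) and a mid-size mode of aggregates (gamma term) that a single exponential cannot represent simultaneously.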

Full access
Pablo A. Mendoza, Balaji Rajagopalan, Martyn P. Clark, Kyoko Ikeda, and Roy M. Rasmussen

Abstract

Statistical postprocessing techniques have become essential tools for downscaling large-scale information to the point scale and for providing a better probabilistic characterization of hydrometeorological variables in simulation and forecasting applications at both short and long time scales. In this paper, the authors assess the utility of statistical postprocessing methods for generating probabilistic estimates of daily precipitation totals, using deterministic high-resolution outputs obtained with the Weather Research and Forecasting (WRF) Model. After a preliminary assessment of WRF simulations over a historical period, the performance of three postprocessing techniques is compared: multinomial logistic regression (MnLR), quantile regression (QR), and Bayesian model averaging (BMA), all of which use WRF outputs as potential predictors. Results demonstrate that the WRF Model has skill in reproducing observed precipitation events, especially during fall/winter. Furthermore, it is shown that the spatial distribution of skill obtained from statistical postprocessing is closely linked with the quality of the WRF precipitation outputs. A detailed comparison of the statistical precipitation postprocessing approaches reveals that, although the poorest performance was obtained using MnLR, there is no overall best technique. While QR should be preferred if skill (i.e., small probability forecast errors) and reliability (i.e., a match between forecast probabilities and observed frequencies) are the target properties, BMA is recommended when discrimination (i.e., prediction of occurrence versus nonoccurrence) and statistical consistency (i.e., equiprobability of the observations within their ensemble distributions) are desired. Based on the results obtained here, the authors believe that future research should explore frameworks reconciling hierarchical Bayesian models with extreme value theory for high precipitation events.
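To make the BMA idea concrete, here is a minimal sketch of how a weighted mixture of member predictive distributions yields an exceedance probability. Real precipitation BMA typically uses a point mass at zero plus a skewed (e.g., gamma) component per member, so the normal kernels and the example weights here are simplifying assumptions.

```python
import math

def normal_cdf(x, mean, sd):
    """CDF of a normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mean) / (sd * math.sqrt(2.0))))

def bma_exceedance_probability(threshold, means, sds, weights):
    """P(precipitation > threshold) under a weighted mixture of member
    predictive distributions; weights are assumed to sum to 1. Each
    member contributes its own exceedance probability, scaled by its
    posterior weight."""
    return sum(w * (1.0 - normal_cdf(threshold, m, s))
               for w, m, s in zip(weights, means, sds))
```

The mixture keeps each member's spread, which is what gives BMA its edge in statistical consistency, whereas quantile regression fits each forecast quantile directly, which tends to favor reliability.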

Full access