Search Results

Showing items 31–40 of 56 for Author or Editor: Roy M. Rasmussen
Pablo A. Mendoza, Balaji Rajagopalan, Martyn P. Clark, Kyoko Ikeda, and Roy M. Rasmussen

Abstract

Statistical postprocessing techniques have become essential tools for downscaling large-scale information to the point scale, and also for providing a better probabilistic characterization of hydrometeorological variables in simulation and forecasting applications at both short and long time scales. In this paper, the authors assess the utility of statistical postprocessing methods for generating probabilistic estimates of daily precipitation totals, using deterministic high-resolution outputs obtained with the Weather Research and Forecasting (WRF) Model. After a preliminary assessment of WRF simulations over a historical period, the performance of three postprocessing techniques is compared: multinomial logistic regression (MnLR), quantile regression (QR), and Bayesian model averaging (BMA)—all of which use WRF outputs as potential predictors. Results demonstrate that the WRF Model has skill in reproducing observed precipitation events, especially during fall/winter. Furthermore, it is shown that the spatial distribution of skill obtained from statistical postprocessing is closely linked with the quality of WRF precipitation outputs. A detailed comparison of statistical precipitation postprocessing approaches reveals that, although the poorest performance was obtained using MnLR, there is no overall best technique. While QR should be preferred if skill (i.e., small probability forecast errors) and reliability (i.e., match between forecast probabilities and observed frequencies) are target properties, BMA is recommended in cases when discrimination (i.e., prediction of occurrence versus nonoccurrence) and statistical consistency (i.e., equiprobability of the observations within their ensemble distributions) are desired. Based on the results obtained here, the authors believe that future research should explore frameworks reconciling hierarchical Bayesian models with extreme value theory for high precipitation events.
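
As an illustrative aside, the quantile regression (QR) idea above can be sketched by fitting linear conditional quantiles of observed precipitation given a deterministic forecast through minimization of the pinball (check) loss. The synthetic data and the linear predictor below are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of quantile regression (QR) postprocessing of a deterministic
# precipitation forecast. Synthetic data and a linear predictor are assumed;
# this is not the authors' implementation.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic "WRF" daily precipitation totals (mm) and matching observations.
wrf_precip = rng.gamma(shape=2.0, scale=5.0, size=500)
obs_precip = 0.8 * wrf_precip + rng.gamma(shape=2.0, scale=2.0, size=500)

def pinball_loss(beta, x, y, tau):
    """Mean pinball (check) loss of the linear quantile model y ~ beta0 + beta1*x."""
    resid = y - (beta[0] + beta[1] * x)
    return np.mean(np.where(resid >= 0.0, tau * resid, (tau - 1.0) * resid))

def fit_conditional_quantile(x, y, tau):
    """Estimate the tau-th conditional quantile by minimizing the pinball loss."""
    result = minimize(pinball_loss, x0=np.array([0.0, 1.0]),
                      args=(x, y, tau), method="Nelder-Mead")
    return result.x

# A probabilistic forecast expressed as conditional quantiles given the WRF output.
for tau in (0.1, 0.5, 0.9):
    intercept, slope = fit_conditional_quantile(wrf_precip, obs_precip, tau)
    print(f"tau = {tau:.1f}: obs ~= {intercept:.2f} + {slope:.2f} * WRF")
```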

Full access
Matteo Colli, Roy Rasmussen, Julie M. Thériault, Luca G. Lanza, C. Bruce Baker, and John Kochendorfer

Abstract

Recent studies have used numerical models to estimate the collection efficiency of solid precipitation gauges when exposed to the wind in both shielded and unshielded configurations. The models used computational fluid dynamics (CFD) simulations of the airflow pattern generated by the aerodynamic response to the gauge–shield geometry. These are used as initial conditions to perform Lagrangian tracking of solid precipitation particles. Validation of the results against field observations yielded similarities in the overall behavior, but the model output only approximately reproduced the dependence of the experimental collection efficiency on wind speed. This paper presents an improved snowflake trajectory modeling scheme that incorporates a dynamically determined drag coefficient. The drag coefficient was estimated using the local Reynolds number as derived from CFD simulations within a time-independent Reynolds-averaged Navier–Stokes approach. The proposed dynamic model greatly improves the consistency of results with the field observations recently obtained at the Marshall Field winter precipitation test bed in Boulder, Colorado.
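
The effect of a locally determined, Reynolds-number-dependent drag coefficient on particle trajectories can be sketched as below. The uniform wind field and the Schiller–Naumann-type Cd(Re) correlation stand in for the paper's CFD-derived airflow and drag formulation, and the particle size and density are assumed values.

```python
# Minimal sketch of Lagrangian tracking with a Reynolds-number-dependent drag
# coefficient. The uniform wind and the Schiller-Naumann-type correlation are
# illustrative stand-ins for the CFD-derived flow and drag model of the paper.
import numpy as np

AIR_DENSITY = 1.2         # kg m^-3
AIR_VISCOSITY = 1.7e-5    # Pa s
PARTICLE_DIAMETER = 2e-3  # m, nominal snowflake size (assumed)
PARTICLE_DENSITY = 100.0  # kg m^-3, low bulk density typical of dry snow (assumed)
GRAVITY = np.array([0.0, -9.81])

def drag_coefficient(reynolds):
    """Schiller-Naumann-style Cd(Re); roughly constant Cd at high Re."""
    reynolds = max(reynolds, 1e-6)
    if reynolds < 1000.0:
        return 24.0 / reynolds * (1.0 + 0.15 * reynolds**0.687)
    return 0.44

def step(pos, vel, wind, dt):
    """One explicit-Euler step of the particle momentum equation."""
    rel = wind - vel                          # air velocity relative to the particle
    speed = np.linalg.norm(rel)
    reynolds = AIR_DENSITY * speed * PARTICLE_DIAMETER / AIR_VISCOSITY
    cd = drag_coefficient(reynolds)
    area = np.pi * (PARTICLE_DIAMETER / 2.0) ** 2
    mass = PARTICLE_DENSITY * np.pi * PARTICLE_DIAMETER**3 / 6.0
    drag_accel = 0.5 * AIR_DENSITY * cd * area * speed * rel / mass
    vel_new = vel + (drag_accel + GRAVITY) * dt
    return pos + vel * dt, vel_new

# Release a particle in a 5 m/s horizontal wind and track it for 2 seconds.
pos, vel = np.array([-2.0, 2.0]), np.array([0.0, 0.0])
wind = np.array([5.0, 0.0])
for _ in range(2000):
    pos, vel = step(pos, vel, wind, dt=1e-3)
print("position after 2 s:", pos, " velocity:", vel)
```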

Full access
Gregory Thompson, Paul R. Field, Roy M. Rasmussen, and William D. Hall

Abstract

A new bulk microphysical parameterization (BMP) has been developed for use with the Weather Research and Forecasting (WRF) Model or other mesoscale models. As compared with earlier single-moment BMPs, the new scheme incorporates a large number of improvements to both physical processes and computer coding, and it employs many techniques found in far more sophisticated spectral/bin schemes using lookup tables. Unlike any other BMP, the assumed snow size distribution depends on both ice water content and temperature and is represented as a sum of exponential and gamma distributions. Furthermore, snow assumes a nonspherical shape with a bulk density that varies inversely with diameter as found in observations and in contrast to nearly all other BMPs that assume spherical snow with constant density. The new scheme’s snow category was readily modified to match previous research in sensitivity experiments designed to test the sphericity and distribution shape characteristics. From analysis of four idealized sensitivity experiments, it was determined that the sphericity and constant density assumptions play a major role in producing supercooled liquid water whereas the assumed distribution shape plays a lesser, but nonnegligible, role. Further testing using numerous case studies and comparing model results with in situ and other observations confirmed the results of the idealized experiments and are briefly mentioned herein, but more detailed, microphysical comparisons with observations are found in a companion paper in this series (Part III, forthcoming).
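
The snow assumptions described above can be illustrated with a short sketch: a constant-density spherical snow category is contrasted with a nonspherical m(D) ∝ D² relation, whose effective bulk density falls off inversely with diameter, and a size distribution written as the sum of an exponential and a gamma term is evaluated. All coefficients below are illustrative assumptions, not the values used in the scheme.

```python
# Minimal sketch of two snow assumptions: constant-density spheres versus a
# nonspherical m(D) ~ D^2 relation (effective density ~ 1/D), plus a size
# distribution written as exponential + gamma. Coefficients are illustrative.
import numpy as np

def mass_spherical(d, rho_snow=100.0):
    """Constant-density sphere: m = rho * (pi/6) * D^3 (SI units)."""
    return rho_snow * np.pi / 6.0 * d**3

def mass_nonspherical(d, a_m=0.069):
    """Nonspherical snow: m = a_m * D^2 (illustrative prefactor, SI units)."""
    return a_m * d**2

def effective_density(mass, d):
    """Bulk density of the equivalent sphere of diameter D."""
    return mass / (np.pi / 6.0 * d**3)

def snow_psd(d, n01=1.0e7, lam1=2.0e3, n02=1.0e11, mu=2.0, lam2=4.0e3):
    """Number density N(D) as the sum of an exponential and a gamma term."""
    return n01 * np.exp(-lam1 * d) + n02 * d**mu * np.exp(-lam2 * d)

for d_mm in (0.5, 1.0, 2.0, 5.0, 10.0):
    d = d_mm * 1.0e-3
    rho_eff = effective_density(mass_nonspherical(d), d)
    ratio = mass_nonspherical(d) / mass_spherical(d)
    print(f"D = {d_mm:4.1f} mm  rho_eff = {rho_eff:6.1f} kg m^-3  "
          f"m_nonsph/m_sph = {ratio:5.2f}  N(D) = {snow_psd(d):.3e} m^-4")
# rho_eff scales as 1/D: large aggregates are far less dense than small crystals,
# unlike the constant-density spherical assumption.
```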

Full access
Roy M. Rasmussen, John Hallett, Rick Purcell, Scott D. Landolt, and Jeff Cole

Abstract

A new instrument designed to measure precipitation, the “hotplate precipitation gauge,” is described. The instrument consists of a heated thin disk that provides a reliable, low-maintenance method to measure precipitation rate every minute without the use of a wind shield. The disk consists of two heated, thermally isolated identical aluminum plates—one facing upward and the other downward. The two plates are heated independently, and both are maintained at a constant temperature above 75°C by electronic circuitry that heats the plates depending on the deviation from the set temperature. Precipitation rate is estimated by calculating the power required to either melt or evaporate snow or to evaporate rain on the upward-facing plate, compensated for wind effects by subtracting out the power on the lower, downward-facing plate. Data from the World Meteorological Organization reference standard for liquid-equivalent snowfall rate measurements, the Double Fence Intercomparison Reference (DFIR) shield system, were used as the truth to develop the hotplate algorithm. The hotplate measures the liquid-equivalent precipitation rate from 0.25 to 35 mm h⁻¹ within the National Weather Service standard for solid precipitation measurement. The hotplate was also shown to measure wind speed during severe icing conditions and during vibration. The high update rate (precipitation rate, wind speed, and temperature every 1 min) makes this an ideal gauge for real-time applications, such as aircraft deicing and road weather conditions. It serves as an accumulation gauge by integrating the 1-min rates over time. It can also be used as a rain gauge for rainfall rates up to 35 mm h⁻¹.
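
The energy-balance idea behind the hotplate can be sketched as follows: the wind-compensated excess power on the upward-facing plate is divided by the latent heat needed to melt and/or evaporate the catch and by the plate area, yielding a liquid-equivalent rate. The plate area, the latent-heat choice, and the example powers below are illustrative assumptions, not the instrument's calibrated algorithm.

```python
# Minimal sketch of the hotplate energy balance: convert the wind-compensated
# excess power on the upward-facing plate into a liquid-equivalent rate.
# Plate area, latent-heat choice, and example powers are illustrative
# assumptions, not the instrument's calibrated algorithm.

WATER_DENSITY = 1000.0             # kg m^-3
LATENT_HEAT_FUSION = 3.34e5        # J kg^-1, melting snow
LATENT_HEAT_VAPORIZATION = 2.50e6  # J kg^-1, evaporating melt water or rain (approx.)
PLATE_AREA = 0.013                 # m^2, ~13-cm-diameter plate (assumed)

def liquid_equivalent_rate_mm_h(power_up_w, power_down_w, frozen=True):
    """Precipitation rate from the power difference between the two plates."""
    # Assume the catch is fully melted and evaporated when frozen, and only
    # evaporated when liquid (a simplification of the operating regimes).
    latent = LATENT_HEAT_VAPORIZATION + (LATENT_HEAT_FUSION if frozen else 0.0)
    excess_power = max(power_up_w - power_down_w, 0.0)  # wind compensation
    mass_flux = excess_power / (latent * PLATE_AREA)    # kg m^-2 s^-1
    depth_rate = mass_flux / WATER_DENSITY              # m s^-1 of liquid water
    return depth_rate * 1000.0 * 3600.0                 # mm h^-1

# Example: 15 W more power on the upper plate than on the lower plate during snow.
print(f"{liquid_equivalent_rate_mm_h(40.0, 25.0, frozen=True):.2f} mm/h")
```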

Full access
Roy M. Rasmussen, Jothiram Vivekanandan, Jeffrey Cole, Barry Myers, and Charles Masters

Abstract

The relationship between liquid equivalent snowfall rate and visibility is investigated using data collected at the National Center for Atmospheric Research Marshall Snowfall Test Site during two winter field seasons and using theoretical relationships. The observational data include simultaneous liquid equivalent snowfall rate, crystal types, and both automated and manual visibility measurements. Theoretical relationships between liquid equivalent snowfall rate and visibility are derived for 27 crystal types, and for “dry” and “wet” aggregated snowflakes. Both the observations and theory show that the relationship between liquid equivalent snowfall rate and visibility depends on the crystal type, the degree of riming, the degree of aggregation, and the degree of wetness of the crystals, leading to a large variation in the relationship between visibility and snowfall rate. Typical variations in visibility for a given liquid equivalent snowfall rate ranged from a factor of 3 to a factor of 10, depending on the storm. This relationship is shown to have a wide degree of scatter from storm to storm and also during a given storm. The main cause for this scatter is the large variation in cross-sectional area to mass ratio and terminal velocity for natural snow particles.

It is also shown that the visibility at night can be over a factor of 2 greater than the visibility during the day for the same atmospheric extinction coefficient. Since snowfall intensity is defined by the U.S. National Weather Service using visibility, this day/night difference in visibility results in a change in snowfall intensity category based solely on whether it is day or night. For instance, a moderate snowfall intensity during the day will change to a light snowfall intensity at night, and a heavy snowfall intensity during the day will change to a moderate snowfall intensity at night, for the same atmospheric extinction coefficient.

Thus, the standard relationship between snowfall intensity and visibility used by many national weather services (1/4 mile or less visibility corresponds to heavy snowfall intensity, between 5/16 and 5/8 mile corresponds to moderate intensity, and greater than 5/8 mile corresponds to light intensity) does not always provide the correct indication of actual liquid equivalent snowfall rate because of the variations in snow type and the differences in the nature of visibility targets during day and night. This false indication may have been a factor in previous ground-deicing accidents in which light snow intensity was reported based on visibility, when in fact the actual measured liquid equivalent snowfall rate was moderate to heavy.
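
The dependence of the visibility–snowfall rate relation on crystal properties can be illustrated with a simple daytime calculation: the extinction coefficient is estimated from the airborne snow mass concentration (rate divided by fall speed) and the particles' cross-sectional-area-to-mass ratio, and converted to visibility with Koschmieder's law. The two parameter sets below are assumed, illustrative crystal types rather than values from the paper, and the nighttime (point light) case is omitted.

```python
# Minimal daytime sketch: liquid-equivalent snowfall rate -> extinction
# coefficient -> visibility via Koschmieder's law. The area-to-mass ratios and
# fall speeds are assumed, illustrative crystal types, not the paper's values.
import math

def daytime_visibility_km(rate_mm_h, area_to_mass_m2_kg, fall_speed_m_s,
                          contrast_threshold=0.05):
    """Visibility for a given liquid-equivalent snowfall rate and snow type."""
    mass_flux = rate_mm_h / 1000.0 / 3600.0 * 1000.0   # mm/h -> kg m^-2 s^-1
    mass_conc = mass_flux / fall_speed_m_s              # airborne snow mass, kg m^-3
    extinction = 2.0 * area_to_mass_m2_kg * mass_conc   # geometric optics, Q_ext ~ 2
    return -math.log(contrast_threshold) / extinction / 1000.0

RATE = 1.0  # mm/h liquid equivalent
print(f"heavily rimed, fast-falling: "
      f"{daytime_visibility_km(RATE, area_to_mass_m2_kg=4.0, fall_speed_m_s=2.0):.2f} km")
print(f"dry, slow-falling aggregates: "
      f"{daytime_visibility_km(RATE, area_to_mass_m2_kg=10.0, fall_speed_m_s=1.0):.2f} km")
# Same rate, several-fold different visibility; a different nighttime detection
# threshold would shift the reported visibility further for the same extinction.
```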

Full access
Jeffrey K. Lew, Derek C. Montague, Hans R. Pruppacher, and Roy M. Rasmussen

Abstract

The effects of porosity on the accretional growth characteristics of ice crystal aggregates (snowflakes) are investigated by riming circular disks of ice in a cloud tunnel. Twelve disk models were used, sized 5 to 6 mm and 10 to 11 mm in diameter, with various hole sizes and numbers, resulting in porosities ranging between 15% and 50%. The onset of riming occurred much earlier for porous disks than for similarly sized nonporous disks as the tunnel airflow speed was increased. For porosities in excess of 15%, the rime growth rates were found to be relatively insensitive to the extent of porosity. However, these rates were an order of magnitude greater than those for nonporous disks of the same size and rimed at the same flow velocity. The appearance of the rime was similar to that observed in the atmosphere for similar conditions. A large stellar model was rimed using the same techniques, and its riming rate was found to be in fair agreement with previous experiments.

Full access
Julie M. Thériault, Roy Rasmussen, Eddy Petro, Jean-Yves Trépanier, Matteo Colli, and Luca G. Lanza

Abstract

The accurate measurement of snowfall is important in various fields of study such as climate variability, transportation, and water resources. A major concern is that snowfall measurements are difficult and can result in significant errors. For example, collection efficiency of most gauge–shield configurations generally decreases with increasing wind speed. In addition, much scatter is observed for a given wind speed, which is thought to be caused by the type of snowflake. Furthermore, the collection efficiency depends strongly on the reference used to correct the data, which is often the Double Fence Intercomparison Reference (DFIR) recommended by the World Meteorological Organization. The goal of this study is to assess the impact of weather conditions on the collection efficiency of the DFIR. Note that the DFIR is defined as a manual gauge placed in a double fence; in this study, however, only the double fence is investigated, although it is still referred to as the DFIR. To address this issue, a detailed analysis of the flow field in the vicinity of the DFIR is conducted using computational fluid dynamics. Particle trajectories are obtained to compute the collection efficiency associated with different precipitation types for varying wind speed. The results show that the precipitation reaching the center of the DFIR can exceed 100% of the actual precipitation, and it depends on the snowflake type, wind speed, and direction. Overall, this study contributes to a better understanding of the sources of uncertainty associated with the use of the DFIR as a reference gauge to measure snowfall.
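
The trajectory-counting step behind a collection efficiency estimate can be sketched with a toy kinematic model: particles are advected by the wind, fall at a fixed terminal velocity, and are deflected by an analytic updraft placed over the orifice. The updraft, gauge size, wind speed, and fall speeds below are assumptions standing in for the study's CFD flow field and particle types; the toy only illustrates the counting method, not the DFIR's actual behavior.

```python
# Minimal sketch of turning particle trajectories into a collection efficiency.
# The analytic updraft "bump" over the orifice is a toy stand-in for the CFD
# flow around the DFIR; sizes, wind, and fall speeds are assumed values.
import numpy as np

ORIFICE_HALF_WIDTH = 0.1   # m, illustrative

def updraft(x, z, u_wind):
    """Toy flow distortion: a Gaussian updraft over the orifice below z = 0.5 m."""
    return 0.6 * (u_wind / 5.0) * np.exp(-(x / 0.3) ** 2) * (z < 0.5)

def landing_positions(x0, z0, u_wind, v_term, dt=2e-3):
    """Kinematic tracking: dx/dt = U, dz/dt = w(x, z) - v_t, until z reaches 0."""
    x, z = np.array(x0, float), np.full(len(x0), z0)
    landing = np.full(len(x0), np.nan)
    while np.isnan(landing).any():
        z = z + (updraft(x, z, u_wind) - v_term) * dt
        x = x + u_wind * dt
        hit = np.isnan(landing) & (z <= 0.0)
        landing[hit] = x[hit]
    return landing

def collection_efficiency(u_wind, v_term, n=600):
    """Catch with the distorted flow relative to the undisturbed straight-fall catch."""
    targets = np.linspace(-1.0, 0.5, n)       # undisturbed landing positions
    spacing = targets[1] - targets[0]
    x0 = targets - u_wind / v_term * 1.0      # release 1 m above the orifice plane
    landed = landing_positions(x0, 1.0, u_wind, v_term)
    caught = np.sum(np.abs(landed) <= ORIFICE_HALF_WIDTH)
    return caught * spacing / (2.0 * ORIFICE_HALF_WIDTH)

# Slow-falling dry snow is deflected more than faster wet snow at the same wind.
print("CE dry snow (v_t = 1.0 m/s):", round(collection_efficiency(5.0, 1.0), 2))
print("CE wet snow (v_t = 2.5 m/s):", round(collection_efficiency(5.0, 2.5), 2))
```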

Full access
Matthew R. Kumjian, Steven A. Rutledge, Roy M. Rasmussen, Patrick C. Kennedy, and Mike Dixon

Abstract

High-resolution X-band polarimetric radar data were collected in 19 snowstorms over northern Colorado in early 2013 as part of the Front Range Orographic Storms (FROST) project. In each case, small, vertically erect convective turrets were observed near the echo top. These “generating cells” are similar to those reported in the literature and are characterized by ~1-km horizontal and vertical dimensions, vertical velocities of 1–2 m s⁻¹, and lifetimes of at least 10 min. In some cases, these generating cells are enshrouded by enhanced differential reflectivity Z_DR, indicating a “shroud” of pristine crystals enveloping the larger, more isotropic particles. The anticorrelation of radar reflectivity factor at horizontal polarization Z_H and Z_DR suggests ongoing aggregation or riming of particles in the core of generating cells. For cases in which radiosonde data were collected, potential instability was found within the layer in which generating cells were observed. The persistence of these layers suggests that radiative effects are important, perhaps by some combination of cloud-top cooling and release of latent enthalpy through depositional and riming growth of particles within the cloud. The implications for the ubiquity of generating cells and their role as a mechanism for ice crystal initiation and growth are discussed.

Full access
Roy M. Rasmussen, István Geresdi, Greg Thompson, Kevin Manning, and Eli Karplus

Abstract

This study evaluates the roles of 1) low cloud condensation nuclei (CCN) conditions and 2) preferential radiative cooling of large cloud drops relative to small cloud drops in cloud droplet spectral broadening and subsequent freezing drizzle formation in stably stratified layer clouds. In addition, the sensitivity of freezing drizzle formation to ice initiation is evaluated. The evaluation is performed by simulating cloud formation over a two-dimensional idealized mountain using a detailed microphysical scheme implemented into the National Center for Atmospheric Research–Pennsylvania State University Mesoscale Model version 5. The height and width of the two-dimensional mountain were designed to produce an updraft pattern with extent and magnitude similar to documented freezing drizzle cases. The results of the model simulations were compared to observations, and good agreement was found.

The key results of this study are as follows: 1) Low CCN concentrations lead to rapid formation of freezing drizzle. This occurs due to the broad cloud droplet size distribution formed throughout the cloud in this situation, allowing for rapid broadening of the spectra to the point at which the collision–coalescence process is initiated. 2) Continental clouds can produce freezing drizzle given sufficient depth and time. 3) Radiative cooling of the cloud droplets near cloud top can be effective in broadening an initially continental droplet spectrum toward that of a maritime cloud droplet size distribution. 4) Any mechanism that only broadens the cloud droplet spectra near cloud top, such as radiative cooling, may not act over a sufficiently broad volume of the cloud to produce significant amounts of freezing drizzle. 5) Low ice-crystal concentrations (<0.08 L⁻¹) in the region of freezing drizzle formation are a necessary condition for drizzle formation (from both model and observations). 6) Ice nuclei depletion is a necessary requirement for the formation of freezing drizzle. 7) The maximum cloud water mixing ratio and threshold amount for the onset of drizzle in stably stratified clouds were shown to depend strongly on the CCN concentration. 8) A key factor controlling the formation of freezing drizzle in stratified clouds is the lifetime of the mesoscale and synoptic conditions and the thickness and length of the cloud.

Full access
Matteo Colli, Mattia Stagnaro, Luca G. Lanza, Roy Rasmussen, and Julie M. Thériault

Abstract

Adjustments for the wind-induced undercatch of snowfall measurements use transfer functions to account for the expected reduction of collection efficiency with increasing wind speed for a particular catching-type gauge. Collection efficiency curves, derived as a function of wind speed from field experiments or numerical simulation, may also involve further explanatory variables such as surface air temperature and/or precipitation type. However, while the wind speed or wind speed and temperature approach is generally effective at reducing the measurement bias, it does not significantly reduce the root-mean-square error (RMSE) of the residuals, implying that part of the variance is still unexplained. In this study, we show that using precipitation intensity as the explanatory variable significantly reduces the scatter of the residuals. This is achieved by optimized curve fitting of field measurements from the Marshall Field Site (Colorado, United States), using a nongradient optimization algorithm to ensure optimal binning of experimental data. The analysis of a recent quality-controlled dataset from the Solid Precipitation Intercomparison Experiment (SPICE) campaign of the World Meteorological Organization confirms the scatter reduction, showing that this approach is suitable for a variety of locations and catching-type gauges. Using computational fluid dynamics simulations, we demonstrate that the physical basis of the reduction in RMSE is the correlation of precipitation intensity with the particle size distribution. Overall, these findings could be relevant in operational conditions since the proposed adjustment of precipitation measurements only requires wind sensor and precipitation gauge data.
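
The kind of adjustment described above can be sketched as a two-predictor transfer function fit. The exponential functional form, its parameters, and the synthetic data below are illustrative assumptions, not the curves or the optimized-binning, nongradient procedure used in the study.

```python
# Minimal sketch of fitting a collection-efficiency (CE) transfer function that
# uses both wind speed and precipitation intensity as explanatory variables.
# The exponential form and the synthetic "gauge" data are assumptions.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)

def transfer_function(predictors, a, b):
    """CE = exp(-a * intensity^(-b) * wind): CE drops with wind, less so at high intensity."""
    wind, intensity = predictors
    return np.exp(-a * intensity ** (-b) * wind)

# Synthetic gauge/reference catch ratios versus wind speed and intensity.
n = 300
wind = rng.uniform(0.0, 8.0, n)        # m/s
intensity = rng.uniform(0.5, 5.0, n)   # mm/h liquid equivalent
ce_observed = transfer_function((wind, intensity), 0.15, 0.4)
ce_observed = ce_observed + rng.normal(0.0, 0.03, n)

popt, _ = curve_fit(transfer_function, (wind, intensity), ce_observed, p0=[0.1, 0.5])
a_fit, b_fit = popt
print(f"fitted a = {a_fit:.3f}, b = {b_fit:.3f}")
print(f"CE at 5 m/s, 1 mm/h: {transfer_function((5.0, 1.0), a_fit, b_fit):.2f}")
print(f"CE at 5 m/s, 5 mm/h: {transfer_function((5.0, 5.0), a_fit, b_fit):.2f}")
```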

Free access