Search Results

You are looking at 1 - 10 of 17 items for

  • Author or Editor: David A. Olson
  • Refine by Access: All Content
Ronald D. McPherson and David A. Olson

Abstract

No abstract available

Full access
Frederick Sanders and David A. Olson

Abstract

Full access
Frederick Sanders and David A. Olson

Abstract

A method is proposed for deriving a useful quantitative precipitation forecast from a physical model in which no account is taken of the release of latent heat of condensation. The approach is to calculate, on the basis of a simple theory, the ratio between the large-scale updraft speed when the latent heat is incorporated and the smaller updraft speed when it is not. The resulting ratio is then assumed to be equivalent to the ratio of the storm-average precipitation to the comparable smaller average computed from the thermodynamically dry model. An empirical test supports the assumption reasonably well for large winter storms in the central and eastern United States. For operational purposes it is suggested that the theoretical ratio be estimated statistically from the mean temperature of the layer from 1000 mb to 500 mb. Evidence is provided that this estimate can be made fairly successfully. Finally, some inferences are drawn concerning the role of cumulus convection in this type of storm.
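
In code form, the proposed correction amounts to scaling the dry-model precipitation by an estimate of the moist-to-dry updraft ratio, with that ratio estimated statistically from the 1000–500-mb mean temperature. The Python sketch below illustrates that bookkeeping only; the function names and regression coefficients are illustrative placeholders, not values from the paper.

```python
# Minimal sketch of the ratio adjustment outlined in the abstract above.
# The regression coefficients and function names are illustrative assumptions,
# not values from the paper.

def moist_to_dry_ratio(mean_temp_k, a=0.08, b=-18.0):
    """Statistically estimated ratio of moist to dry large-scale updraft speed.

    Warmer 1000-500 mb columns release more latent heat, so the ratio grows
    with the layer-mean temperature; a and b are placeholder coefficients.
    """
    return max(1.0, a * mean_temp_k + b)

def adjusted_precip(dry_model_precip_mm, mean_temp_k):
    """Scale the dry-model storm-average precipitation by the updraft ratio."""
    return dry_model_precip_mm * moist_to_dry_ratio(mean_temp_k)

# Example: a dry-model storm total of 6 mm with a 265 K layer-mean temperature.
print(adjusted_precip(6.0, 265.0))   # -> 19.2 mm after the moist adjustment
```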

Full access
Harry E. Brown and David A. Olson

An intense winter storm deepened rapidly to maturity off the Mid-Atlantic coast and brought record amounts of snow to southeastern New England. The development, which did not follow the typical evolution of a Cape Hatteras storm into a “northeaster,” was predicted 3½ days in advance by the National Meteorological Center's seven-layer primitive equation (7LPE) model. The NMC's limited area fine mesh (LFM) model consistently forecast details of the cyclogenesis 24 h ahead. The success of the models in forecasting the cyclogenesis resulted in excellent forecasts of precipitation amounts and heavy snow. The overall performance of NMC was probably the best possible with the models then in operation.

Full access
David L. Williamson, Jerry G. Olson, and Byron A. Boville

Abstract

At the modest vertical resolutions typical of climate models, simulations produced by models based on semi-Lagrangian approximations tend to develop a colder tropical tropopause than matching simulations from models with Eulerian approximations, all other components of the model being the same. The authors examine the source of this relative cold bias in the context of the NCAR CCM3 and show that it is primarily due to insufficient vertical resolution in the standard 18-level model, which has 3-km spacing near the tropopause. The difference is first diagnosed with the Held and Suarez idealized forcing to eliminate the complex radiative–convective feedback that affects tropopause formation in the complete model. In the Held and Suarez case, the tropical simulations converge as the vertical grid spacing is halved to produce 36 layers and halved again to produce 72 layers. The semi-Lagrangian approximations require extra resolution above the original 18 levels to capture the converged tropical tropopause. The Eulerian approximations also need the increased resolution to capture the single-level tropopause implied by the 36- and 72-level simulations, although with 18 layers they do not produce a colder tropopause, just a thicker multilevel tropopause. The authors establish a minimal grid of around 25 levels needed to capture the structure of the converged simulation with the Held and Suarez forcing. The additional resolution is added between 200 and 50 mb, giving a grid spacing of about 1.3 km near the tropopause. With this grid the semi-Lagrangian and Eulerian approximations also create the same tropical structure in the complete model. With both approximations the convective parameterization is better behaved with the extra upper-tropospheric resolution. A benefit to both approximations of the additional vertical resolution is a reduction of the tropical temperature bias compared to the NCEP reanalysis. The authors also show that the Eulerian approximations are prone to stationary grid-scale noise if the vertical grid is not carefully defined. The semi-Lagrangian approximations show no indication of stationary vertical-grid-scale noise. In addition, the Eulerian simulation exhibits significantly greater transient vertical-grid-scale noise than the semi-Lagrangian.

Full access
T. Scott Rupp, Xi Chen, Mark Olson, and A. David McGuire

Abstract

Projected climatic warming has direct implications for future disturbance regimes, particularly fire-dominated ecosystems at high latitudes, where climate warming is expected to be most dramatic. It is important to ascertain the potential range of climate change impacts on terrestrial ecosystems, which is relevant both to making projections of the response of the Earth system and to decisions by policymakers and land managers. Computer simulation models that explicitly represent climate–fire relationships are an important research tool for understanding and projecting these future relationships. Retrospective analyses of ecological models are important for evaluating how to effectively couple ecological models of fire dynamics with climate system models. This paper uses a transient landscape-level model of vegetation dynamics, the Alaskan Frame-based Ecosystem Code (ALFRESCO), to evaluate the influence of different driving climate datasets on simulation results. Our analysis included the use of climate data based on first-order weather station observations from the Climate Research Unit (CRU), a statistical reanalysis from the NCEP–NCAR reanalysis project (NCEP), and the fifth-generation Pennsylvania State University–NCAR Mesoscale Model (MM5). Model simulations of annual area burned for Alaska and western Canada were compared to historical fire activity (1950–2000). ALFRESCO was only able to generate reasonable simulation results when driven by the CRU climate data. Simulations driven by the NCEP and MM5 climate data produced almost no annual area burned because of substantially colder and wetter growing seasons (May–September) in comparison with the CRU climate data. The results of this study identify the importance of conducting retrospective analyses before coupling ecological models of fire dynamics with climate system models. The authors suggest developing coupling methodologies in which anomalies from future climate model simulations are used to perturb more trusted historical climate datasets.
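
The anomaly-coupling idea in the last sentence can be stated compactly: perturb the trusted historical dataset with the model-projected change rather than driving the ecological model with raw model output. The Python sketch below is a generic illustration of that "delta" approach for assumed monthly temperature arrays; it is not code from ALFRESCO or the paper.

```python
import numpy as np

def anomaly_coupled_climate(cru_baseline, gcm_historical, gcm_future):
    """Trusted baseline plus the model-projected change (additive anomalies).

    All inputs are arrays of monthly means on the same grid; a multiplicative
    form is often preferred for precipitation instead of this additive one.
    """
    delta = gcm_future - gcm_historical   # change signal from the climate model
    return cru_baseline + delta           # applied to the trusted CRU baseline

# Example with synthetic 12-month temperature series for one grid cell (degrees C).
cru = np.linspace(-20.0, 15.0, 12)                   # observation-based baseline
gcm_hist = cru + np.random.normal(0.0, 2.0, 12)      # biased model historical climate
gcm_fut = gcm_hist + 3.0                             # model projects ~3 C of warming
driving_climate = anomaly_coupled_climate(cru, gcm_hist, gcm_fut)
```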

Full access
David A. Olson, Norman W. Junker, and Brian Korty

Abstract

The National Meteorological Center (NMC) initiated Quantitative Precipitation Forecasts (QPF) and an intensive QPF verification program in 1960. These forecast products have evolved from a manual effort that relied on extensive forecaster experience to one that places much greater reliance on the interpretation and modification of numerical models.

Verification graphs show steady improvements in forecast accuracy, especially for the longer-range forecasts, which in this context are those in the 24–60-h range. During the 1960s the Threat Score (TS) for day-2 forecasts of 1 in. or more of precipitation averaged approximately 0.07. During recent years that score has nearly doubled, and the forecast for the 36–60-h period in 1993 had a TS comparable to that for the 12–36-h period during the 1960s. Improvement in accuracy is probably related to a number of diverse factors, including improved numerical models, increased forecaster knowledge of the strengths and weaknesses of the operational models, and an increased understanding of precipitation processes. The verification results have been used to track individual and group progress.
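
For reference, the Threat Score cited in these verification statistics is the critical success index: hits divided by the sum of hits, misses, and false alarms at a given precipitation threshold. The Python sketch below shows the computation for an assumed set of forecast and observed amounts; the arrays and the 1-in. threshold are illustrative.

```python
import numpy as np

def threat_score(forecast, observed, threshold=1.0):
    """Threat Score (CSI) for exceedance of `threshold` (same units as the fields)."""
    f = forecast >= threshold
    o = observed >= threshold
    hits = np.sum(f & o)
    misses = np.sum(~f & o)
    false_alarms = np.sum(f & ~o)
    denom = hits + misses + false_alarms
    return hits / denom if denom else np.nan

# Example: 1-in. threshold at a few verification points (amounts in inches).
fcst = np.array([1.2, 0.4, 1.6, 0.0, 0.9])
obs = np.array([1.0, 1.1, 0.2, 0.0, 1.3])
print(threat_score(fcst, obs))   # 1 hit, 2 misses, 1 false alarm -> 0.25
```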

Full access
Paul J. Kocin, David A. Olson, Arthur C. Wick, and Robert D. Harner

Abstract

The preparation of surface weather analyses at the National Meteorological Center (NMC) is currently under review. The availability of advanced graphics workstations and consideration of revisions to conceptual models of cyclogenesis and frontal analysis present challenges and opportunities for improving surface analysis at NMC. In this paper, current procedures and surface analysis products are reviewed. The adaptation of workstation technology to one surface weather analysis product, the Daily Weather Maps, Weekly Series, is described and presented as a preliminary experiment for assessing the utility of performing surface analyses on interactive workstations. Finally, issues that will impact the future of surface analysis at NMC, such as workstation development, utilization of gridded datasets and their manipulation for improving objective analyses, possible revisions to frontal symbology, incorporation of mesoscale symbology, and changes to sea-level pressure computations, are discussed.

Full access
Sean R. Scott, Jason P. Dunion, Mark L. Olson, and David A. Gay

Abstract

Atmospheric dust transport is an important mass transfer and nutrient supply process for Earth surface ecosystems. For decades, Saharan dust has been hypothesized to supply nutrients to the Amazon rainforest and eastern North America. However, isotope studies aimed at detecting Saharan dust in the American sedimentary record have been ambiguous. A large Saharan dust storm emerged off the coast of Africa in June 2020 and extended into the southeastern United States. This storm, whose arrival was confirmed by independent satellite and ground observations, provided a means to evaluate the influence of Saharan dust in North America. Precipitation samples from 17 sites within the National Atmospheric Deposition Program (NADP) were obtained throughout the southeastern United States prior to, during, and after the arrival of the Saharan dust. Precipitation samples were measured for their lead (Pb) isotopic composition, total Pb content, and 210Pb activity using multicollector inductively coupled plasma mass spectrometry. We measured a significant isotopic shift (approximately 0.7% in the 208Pb/206Pb ratio relative to the 207Pb/206Pb ratio) in precipitation that peaked in late June 2020, when the dust blanketed the southeastern United States. However, the magnitude and short duration of the isotopic shift would make it difficult to detect in sedimentary records.

Full access
G. David Alexander, James A. Weinman, V. Mohan Karyampudi, William S. Olson, and A. C. L. Lee

Abstract

Inadequate specification of divergence and moisture in the initial conditions of numerical models results in the well-documented “spinup” problem. Observational studies indicate that latent heat release can be a key ingredient in the intensification of extratropical cyclones. As a result, the assimilation of rain rates during the early stages of a numerical simulation results in improved forecasts of the intensity and precipitation patterns associated with extratropical cyclones. It is challenging, however, particularly over data-sparse regions, to obtain complete and reliable estimates of instantaneous rain rate. Here, a technique is described in which data from a variety of sources—passive microwave sensors, infrared sensors, and lightning flash observations—are combined with a classic image processing technique (digital image morphing) to yield a continuous time series of rain rates, which may then be assimilated into a mesoscale model. The technique is tested on simulations of the notorious 1993 Superstorm. In this case, a fortuitous confluence of several factors—rapid cyclogenesis over an oceanic region, the occurrence of this cyclogenesis at a time inconveniently placed between Special Sensor Microwave/Imager overpasses, intense lightning during this time, and a poor forecast in the control simulation—leads to a dramatic improvement in forecasts of precipitation patterns, sea level pressure fields, and geopotential height fields when information from all of the sources is combined to determine the rain rates. Lightning data, in particular, have a greater positive impact on the forecasts than the other data sources.
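
The morphing step exists to fill the temporal gaps between microwave overpasses so that a continuous rain-rate series can be assimilated. The Python sketch below uses a plain linear cross-dissolve between two snapshots as a simplified stand-in for full digital image morphing (which also warps rain features along their motion); the field names and overpass spacing are assumptions for illustration.

```python
import numpy as np

def cross_dissolve(rain_t0, rain_t1, frac):
    """Blend two rain-rate fields; frac=0 returns rain_t0, frac=1 returns rain_t1."""
    return (1.0 - frac) * rain_t0 + frac * rain_t1

# Example: hourly fields between two SSM/I overpasses assumed to be 6 h apart.
rain_t0 = np.random.gamma(2.0, 1.0, size=(50, 50))   # mm/h at the first overpass
rain_t1 = np.random.gamma(2.0, 1.0, size=(50, 50))   # mm/h at the second overpass
hourly_series = [cross_dissolve(rain_t0, rain_t1, h / 6.0) for h in range(7)]
```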

Full access