Search Results

Items 81–90 of 704 for:

  • Forecasting techniques
  • Bulletin of the American Meteorological Society
Susan F. Zevin

In the 23 years since Hurricane Camille devastated Virginia with 27 inches of rain in 24 hours, a major area targeted for hydrometeorological forecast service improvements has been flood and flash flood forecasting. The first attempts to tackle the problems were event driven. Numerous poststorm analyses led to the definition of meteorological criteria often associated with various types of major flash flood–producing rainfall situations. Individual forecast offices attempted to use these techniques with inconsistent success. Additionally, verification was not carried out on a routine or systematic basis.

In 1979, the National Weather Service (NWS) Eastern Region began to encourage its offices to use precipitation forecasts routinely to anticipate critical flood conditions, rather than awaiting observations of rainfall. However, the implementation of a broadscale programmatic approach to the routine operational use of quantitative precipitation forecasting faced numerous hurdles. Complexities ran the gamut of operational problems, and broadscale efforts to implement the program foundered. At the same time, public and private sector users continued to request more accurate information with better lead time for response. Academic studies showed that, in order to gain enough lead time for effective decision making and response, it is essential to incorporate the uncertainty of the precipitation forecast into flood forecast operations.

Within the last five years, the NWS once again introduced the possibility of a disciplined, systematic, scientific application of these ideas in the field of operational forecasting. The NWS modernization has afforded the vehicle to implement these concepts operationally. In parallel, NWS forecasters and university researchers have collaborated on probabilistic approaches to the rainfall forecast problem, integrating theory, method, process, and operations.

Based on 20 years of progressive learning and operational experience, the NWS now has the tools, the understanding, and the scientific and operational capabilities to expand the efforts nationally.

Full access
Z. Zhang and T. N. Krishnamurti

Because of uncertainties in the initial data, it is inevitable that operational hurricane track forecasting will, in the future, follow an ensemble forecast approach. The ensemble technique is becoming increasingly popular for middle-latitude weather forecasts. This paper focuses on an ensemble forecast methodology for the hurricane track forecast procedure.

In this study, an ensemble perturbation method is applied for hurricane track predictions using the Florida State University's Global Spectral Model with horizontal spectral resolution of T63 and 14 vertical levels.

The method is based on the premise that (a) model perturbations grow linearly during the first few days of model integration, and (b) to build a complete set of ensemble perturbations for hurricane forecasts, both the hurricane's initial position and its structure and environment need to be perturbed. The initial position of the hurricane is perturbed by displacing its original position 50 km toward each of the north, south, east, and west. The hurricane environment and structure perturbations are generated by applying EOF analysis to the differences between forecasts starting from the regular analysis and from a randomly perturbed analysis. Only the temperature and wind fields are perturbed, with amplitudes proportional to the respective observational errors. The method generates 15 ensemble members for each hurricane.
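The two perturbation steps described above can be sketched compactly. Everything below is an illustrative assumption rather than the authors' code: the function names, the flat-earth 50-km displacement helper, and the SVD route to the EOFs are choices made for the sketch.

```python
import numpy as np

def position_perturbations(lat, lon, dist_km=50.0):
    """Displace a storm center 50 km toward each of N, S, E, W.

    Hypothetical helper mirroring the paper's description of the
    initial-position perturbations (flat-earth approximation).
    """
    km_per_deg = 111.0  # approximate length of one degree of latitude
    dlat = dist_km / km_per_deg
    dlon = dist_km / (km_per_deg * np.cos(np.radians(lat)))
    return [
        (lat + dlat, lon),  # north
        (lat - dlat, lon),  # south
        (lat, lon + dlon),  # east
        (lat, lon - dlon),  # west
    ]

def eof_perturbations(diff_fields, obs_error, n_modes=5):
    """Structure/environment perturbations from EOFs of forecast
    differences (regular analysis minus randomly perturbed analysis).

    diff_fields: (n_samples, n_gridpoints) array of difference fields.
    obs_error:   observational error used to scale each mode's amplitude.
    """
    anoms = diff_fields - diff_fields.mean(axis=0)
    # EOFs via SVD: rows of vt are the spatial patterns
    _, _, vt = np.linalg.svd(anoms, full_matrices=False)
    eofs = vt[:n_modes]
    # scale each EOF so its peak amplitude is of the order of obs_error
    scales = obs_error / np.abs(eofs).max(axis=1)
    return eofs * scales[:, None]
```

Adding and subtracting each scaled EOF to the analyzed temperature and wind fields, combined with the four displaced positions, yields an ensemble of the size the paper describes.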

The results show that this ensemble prediction method leads to an improvement in hurricane track forecasts. Track position errors are substantially reduced by the ensemble prediction for most of the hurricane cases tested, and these forecasts are superior to the results from single-model control experiments. It is also noted that the spread of the ensemble track forecasts is useful for assessing the reliability of the predictions.

William R. Bergen and Allan H. Murphy

Severe downslope windstorms are an outstanding feature of the winter weather in Boulder, Colo., and property damage associated with these storms averages about $1 million each year. Recently, efforts to develop a numerical model capable of forecasting downslope windstorms have yielded encouraging results. The possibility that short-term forecasts of these storms might become available on an operational basis led to a study of the societal impact of improved windstorm forecasts in the Boulder area, and this paper describes the results of that study.

Surveys were conducted of selected samples of Boulder residents and businesses concerning the potential economic and social benefits and disbenefits of improvements in downslope windstorm forecasts. The survey questions concerned five basic topics: 1) perception of the windstorm hazard; 2) the desire for improved windstorm forecasts; 3) the use of windstorm forecasts; 4) the value of improved forecasts; and 5) possible forecast dissemination techniques. Personal interviews were conducted with local businesses and public service agencies to supplement and extend the results of the surveys.

All segments of the community were found to be concerned about the windstorms because of the possibility of serious injury and/or major property damage. The responses also revealed a strong desire for improved windstorm forecasts, although the level of desire was found to depend upon the accuracy of the forecasts. Moreover, significant increases in the use of a variety of protective actions would occur if accurate (i.e., 80% accurate) windstorm forecasts were available. The results of the surveys and interviews indicated that accurate forecasts could reduce residential property damage by approximately $200 000 annually, and the potential savings to local businesses were estimated to be an additional $150 000. These benefits appear to greatly exceed any incremental costs associated with formulating and disseminating the forecasts and any economic losses suffered by local businesses due to decreased windstorm damage. In addition, the residents expressed a willingness to support a local windstorm forecasting system if governmental funding was not available. Finally, while no completely effective procedure for alerting a significant fraction of the community to an approaching windstorm was identified, it was recognized that this problem is not unique to forecasts of downslope windstorms and requires further study.

Peter Lynch

To elucidate his numerical technique and to examine the effectiveness of geostrophic initial winds, Lewis Fry Richardson carried out an idealized forecast using the linear shallow-water equations and simple analytical pressure and velocity fields. This barotropic forecast has been repeated and extended using a global numerical model, and the results are presented in this paper. Richardson's conclusions regarding the use of geostrophic winds as initial data are reconsidered.

An analysis of Richardson's data into normal modes shows that almost 85% of the energy is accounted for by a single eigenmode, the gravest symmetric rotational Hough mode, which travels westward with a period of about five days. This five-day wave has been detected in analyses of stratospheric data. It is striking that the fields chosen by Richardson on considerations of smoothness should so closely resemble a natural oscillation of the atmosphere.

The numerical model employed in this study uses an implicit differencing technique, which is stable for large time steps. The numerical instability that would have destroyed Richardson's barotropic forecast, had it been extended, is thereby circumvented. It is sometimes said that computational instability was the cause of the failure of Richardson's baroclinic forecast, for which he obtained a pressure tendency value two orders of magnitude too large. However, the initial tendency is independent of the time step (at least for the explicit scheme used by Richardson). In fact, the spurious tendency resulted from the presence of unrealistically large high-frequency gravity-wave components in the initial fields.

High-frequency oscillations are also found in the evolution starting from the idealized data in the barotropic forecast. They are shown to be due to the gravity-wave components of the initial data. These oscillations may be removed by a slight modification of the initial fields. This initialization is effected by means of a simple digital filtering technique, which is applicable not only to the linear equations used here but also to a general nonlinear system.
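The digital filtering initialization described above amounts to forming a weighted average of model states, chosen so that high-frequency gravity-wave components are damped while the slow, meteorological signal passes through. A minimal sketch follows, assuming a sinc low-pass filter tapered by a Lanczos window, one common choice; the paper's exact weights are not reproduced here, and both function names are hypothetical.

```python
import numpy as np

def dfi_weights(n_steps, dt, cutoff_period):
    """Low-pass weights for digital filtering initialization over
    2*n_steps + 1 model states centered on the initial time.

    cutoff_period: shortest period (same units as dt) retained.
    """
    k = np.arange(-n_steps, n_steps + 1)
    theta_c = 2.0 * np.pi * dt / cutoff_period       # cutoff frequency x dt
    h = np.sinc(k * theta_c / np.pi) * theta_c / np.pi  # ideal low-pass response
    window = np.sinc(k / (n_steps + 1))              # Lanczos taper
    w = h * window
    return w / w.sum()  # normalize so a constant state passes unchanged

def filter_states(states, weights):
    """Weighted sum of model states, shape (time, ...) -> filtered state."""
    return np.tensordot(weights, states, axes=(0, 0))
```

In practice the states would come from short backward and forward model integrations about the initial time; the filtered combination then replaces the original initial fields.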

Kerry Emanuel, Eugenia Kalnay, Craig Bishop, Russell Elsberry, Ronald Gelaro, Daniel Keyser, Steven Lord, David Rogers, Melvyn Shapiro, Christopher Snyder, and Christopher Velden

One of the most significant impediments to progress in forecasting weather over North America is the relative paucity of routine observations over data-sparse regions adjacent to the United States. Prospectus Development Team Seven was convened to consider ways to promote research that seeks to determine implementations of observing systems that are optimal for weather prediction in the United States. An “optimal” measurement system is considered to be one that maximizes the ratio of societal benefit to overall cost. The thrust of the conclusions is that existing means of estimating the value of current observing systems and the potential benefits of new or proposed observing systems are underutilized. At the same time, no rational way exists for comparing the cost of observations across the spectrum of federal agencies responsible for measuring the atmosphere and ocean. The authors suggest that a rational procedure for configuring an observation system that is optimal for weather prediction would consist of the following steps.

The authors believe that a rational approach to atmospheric measurement is critical to further improvements in weather prediction and that such improvements might very well be made within the current budget of routine observations, integrated across all of the responsible federal agencies. This document outlines a proposed strategy for rationalizing atmosphere observations in aid of weather prediction in the United States. The paper begins with a summary of recommendations.

William H. Bauman III and Steven Businger

Space shuttle launches and landings at Kennedy Space Center (KSC) are subject to strict weather-related launch commit criteria and landing weather flight rules. Complex launch commit criteria and end-of-mission landing weather flight rules demand very accurate forecasts and nowcasts (short-term forecasts of less than 2 h) of cloud, wind, visibility, precipitation, turbulence, and thunderstorms prior to shuttle launches and landings.

The challenges to the National Weather Service Spaceflight Meteorology Group forecasters at Johnson Space Center to nowcast and forecast for space shuttle landings and evaluate the landing weather flight rules are discussed. This paper focuses on the forecasts and nowcasts required for a normal end-of-mission and three scenarios for abort landings of a space shuttle at KSC. Specific weather requirements for a potential emergency landing are the dominant cause of weather-related delays to space shuttle launches. Some examples of meteorological techniques and technologies in support of space shuttle landing operations are reviewed. Research to improve nowcasting convective activity in the Cape Canaveral vicinity is discussed, and the particular forecast problem associated with landing a space shuttle during easterly flow regimes is addressed.

Jerome Namias

Harry Wexler was a close student of developing techniques in the field of long range forecasting, and a contributor to them, realizing that the improvement of long range forecasts represents one of the most important goals of meteorology. His scientifically oriented administrative ability played a large part in enlarging the role of this subject in the United States. Indeed, one of the principal reasons he pioneered in inaugurating the World Weather Watch program was to lay a firmer foundation for long-range forecasting. I was one of many who gained benefit and stimulation from a long friendship with him, and my lecture will recall some of those instances.

First, I want to describe the nature of the long range prediction problem as seen through the eyes of a pragmatist, and to present a balanced picture of what is now possible and what may become possible in the next decade or so. One might hope that this 10-year forecast of “weather predictability” will turn out better than a 10-year forecast of the weather itself!

Next, I will speak on the history of long range forecasting over the past century, as traced through the work of the inventors of synoptic, statistical, and physical approaches to the problem. In spite of decades of frustration imposed by the lack of adequate data, ignorance of large-scale physical processes, and the absence of intensive and large-scale effort, meteorologists working on long range problems have made encouraging progress.

Carl W. Kreitzberg

Effective reasoning, analysis and communication regarding natural phenomena require the use of models to render tractable the complexities of nature. This paper attempts to put into perspective the proper roles of different types of models to maximize the effectiveness of their utilization. The advances in short term forecasting envisioned for the 1970's from full implementation of new knowledge, models and technology will materialize only if the managers and researchers join in an interagency effort to provide the operational meteorologists with the education, techniques, tools and, particularly, the challenging working environment needed to fully develop man's role in forecasting. A program to meet these requirements is outlined.

The types of models discussed include: descriptive or synoptic, dynamic or analytic, numerical or physical, statistical or optimized. The uses of models discussed include: education (basic concepts), research (experimental), operations (customized). Since the operational meteorologist is responsible for the intelligent use of these types of models, he must continually update his training and properly understand the potential contributions of the models.

It is anticipated that during the 1970's routine computer models will become more refined, and specialized data such as trajectories and probabilities will become more common. Highly specialized products will be available from special-purpose models on a special-request basis as field forecasters gain access to remote terminals. Also, forecasters will have access to specialized consultants when unusual events or unusual forecast requirements arise. Background materials will be provided to the applied meteorologist so that he may gain physical understanding from educational and research models, including systematic numerical experiments. Communication advances will provide for dynamic (motion picture) displays of radar, synchronous satellite, weather map, and weather forecast data.

Only if the operational forecasters do receive the necessary management and scientific support, will their jobs be challenging and attractive to highly motivated and qualified students; only then will the customers of specialized short term forecasts receive the benefits made feasible by science and technology.

R. H. Langland, Z. Toth, R. Gelaro, I. Szunyogh, M. A. Shapiro, S. J. Majumdar, R. E. Morss, G. D. Rohaly, C. Velden, N. Bond, and C. H. Bishop

The objectives and preliminary results of an interagency field program, the North Pacific Experiment (NORPEX), which took place between 14 January and 27 February 1998, are described. NORPEX represents an effort to directly address the issue of observational sparsity over the North Pacific basin, which is a major contributing factor in short-range (less than 4 days) forecast failures for land-falling Pacific winter-season storms that affect the United States, Canada, and Mexico. The special observations collected in NORPEX include approximately 700 targeted tropospheric soundings of temperature, wind, and moisture from Global Positioning System (GPS) dropsondes obtained in 38 storm reconnaissance missions using aircraft based primarily in Hawaii and Alaska. In addition, wind data were provided every 6 h over the entire North Pacific during NORPEX, using advanced and experimental techniques to extract information from multispectral geostationary satellite imagery. Preliminary results of NORPEX data impact studies using the U.S. Navy and National Weather Service forecast models include reductions of approximately 10% in mean 2-day forecast error over western North America (30°–60°N, 100°–130°W) from assimilation of targeted dropsonde and satellite wind data (when measured against control forecasts that contain no special NORPEX observations). There are local reductions of up to 50% in 2-day forecast error for individual cases, although some forecasts are degraded by the addition of the special dropsonde or satellite wind data. In most cases, the positive impact of the targeted dropsonde data on short-range forecast skill is reduced when the full set of advanced satellite wind data is already included in the model analyses. 
The NORPEX dataset is being used in research to improve objective methods for targeting observations, to study the “mix” of in situ and space-based observations, and to understand the structure and dynamics of fast-growing errors that limit our ability to provide more accurate forecasts of Pacific winter storms.

Shawn R. Smith and James J. O'Brien

Regional changes in early, middle, and late winter total snowfall distributions are identified over the continental United States in association with warm and cold phases of the El Niño-Southern Oscillation (ENSO). The analysis is primarily motivated by a desire to improve winter season climate forecasts. Original interest in snowfall associated with ENSO was provided by requests for skiing forecasts during the 1997 ENSO warm phase. Geographic regions with internally similar ENSO warm, cold, and neutral phase snowfall distributions are identified using a composite technique. The composites reveal three early winter, five midwinter, and three late winter regions with shifts in the upper-, middle-, and lower-quartile seasonal snowfall. The quartile shifts revealed by the composite technique are important for forecasting applications; however, snowfall impact studies rely more on the absolute magnitude of the change in snowfall at individual stations. Potential impacts of the shifts in snowfall distributions associated with ENSO are discussed using the quartile snowfall magnitudes for the stations in the composites. Shifts in regional snowfall distributions are compared to published ENSO winter climate studies, and hypotheses are presented to relate physical processes to the warm, cold, and neutral phase snowfall distributions.
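Per region and sub-season, the composite technique described above reduces to pooling seasonal snowfall totals by ENSO phase and comparing the resulting quartiles. A minimal sketch, with the function name and phase labels chosen for illustration rather than taken from the paper:

```python
import numpy as np

def phase_quartiles(snowfall, phases):
    """Composite seasonal snowfall totals by ENSO phase and return
    the lower-quartile, median, and upper-quartile values per phase.

    snowfall: sequence of seasonal totals, one per winter.
    phases:   matching sequence of labels, e.g. 'warm', 'cold', 'neutral'.
    """
    snowfall = np.asarray(snowfall, dtype=float)
    phases = np.asarray(phases)
    out = {}
    for phase in np.unique(phases):
        vals = snowfall[phases == phase]
        q1, q2, q3 = np.percentile(vals, [25, 50, 75])
        out[phase] = {"lower": q1, "median": q2, "upper": q3}
    return out
```

Comparing, say, the cold-phase quartiles against the neutral-phase quartiles for a station group then gives the kind of distribution shift the composites are designed to reveal.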

Principal findings include increased snowfall during an ENSO cold phase relative to warm and neutral phases in the northwestern states from early through midwinter, less (more) snowfall during a cold (warm) phase relative to neutral years in the Northeast, and less snowfall (relative to neutral winters) in both warm and cold phases in the Ohio Valley (early winter) and Midwest (midwinter). Combining these snowfall regions with an ever-improving ability to forecast ENSO warm and cold phases will improve seasonal snowfall forecasts. The results should improve mitigation strategies for agencies adversely impacted by ENSO-induced snowfall anomalies.
