Search Results

You are looking at 1 - 10 of 39 items for:

  • Author or Editor: Charles A. Doswell III
  • Weather and Forecasting
  • All content
Charles A. Doswell III

Abstract

Using a case study of a relatively modest severe weather event as an example, a framework for understanding the large-scale-mesoscale interaction is developed and discussed. Large-scale processes are limited, by definition, to those which are quasi-geostrophic. Mesoscale processes are defined to be those which are linked in essence to processes occurring on both larger and smaller scales. It is proposed that convective systems depend primarily on large-scale processes for developing a suitable thermodynamic structure, while mesoscale processes act mainly to initiate convection. The case study is presented not as a “typical” event in its particulars, but rather to suggest the complex ways in which large-scale and mesoscale processes can interact. Implications for forecasting are an essential part of the discussion, since mesoscale systems are so difficult to predict with the present knowledge and technology available in operations.

Full access
Charles A. Doswell III

Abstract

No abstract available

Full access
Charles A. Doswell III

Abstract

No abstract available

Full access
Charles A. Doswell III

Abstract

Some basic ideas about designing a meteorological workstation for operational weather forecasting are presented, in part as a complement to the recently published discussion of workstation design by R. R. Hoffman. Scientific weather forecasting is defined and used as a basis for developing a set of necessary structural capabilities in a workstation. These capabilities include built-in excess capacity for flexibility, user-defined product menus, interactivity at the level of changing the data as well as the analyzed fields, and a software suite of operators by which virtually any product can be custom built through concatenation of mathematically defined operations on any of the data resident within the workstation.

The need for user involvement is stressed by showing an example of a real forecaster “workstation” that successfully provided most of these capabilities and, in contrast, by pointing out the flaws in the current National Weather Service operational workstation's development. In order to provide a system of maximum value, the users must be intimately involved in the process of system design, which virtually precludes the standard federal procurement process. A process of hardware and software purchases “off the shelf” is advocated, in combination with the establishment of on-site expertise to craft locally tailored workstations. The implications for the future of operational weather forecasting are discussed.
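
The operator-suite idea lends itself to a concrete illustration. Below is a minimal sketch, not drawn from the paper, of how mathematically defined operations on resident gridded data might be concatenated into custom products; all function names and the sample field are hypothetical.

```python
# Minimal sketch (hypothetical, not the paper's design) of an "operator
# suite": composable operations applied to any gridded field resident in
# the workstation, chained by the forecaster into custom products.
import numpy as np

def gradient_magnitude(field, dx=1.0):
    """|grad f| on a 2D grid via centered differences."""
    gy, gx = np.gradient(field, dx)
    return np.hypot(gx, gy)

def smooth(field, passes=1):
    """Simple five-point smoother, applied repeatedly."""
    f = field.copy()
    for _ in range(passes):
        f[1:-1, 1:-1] = 0.2 * (f[1:-1, 1:-1] + f[:-2, 1:-1] +
                               f[2:, 1:-1] + f[1:-1, :-2] + f[1:-1, 2:])
    return f

def chain(*operators):
    """Concatenate operators so virtually any product can be custom built."""
    def product(field):
        for op in operators:
            field = op(field)
        return field
    return product

# Example custom product: smoothed magnitude of the temperature gradient.
temperature = np.random.rand(50, 50)          # stand-in for a resident field
frontal_locator = chain(lambda f: smooth(f, passes=2), gradient_magnitude)
print(frontal_locator(temperature).shape)
```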

Full access
Charles A. Doswell III

Abstract

The decision-making literature contains considerable information about how humans use heuristics to approach tasks involving uncertainty. Although there is some reason to believe that weather forecasters are not identical in all respects to the typical subjects of judgment and decision-making studies, there is also evidence that they are not so different that the existing understanding of human cognition in decision making is entirely inapplicable to them. Accordingly, some aspects of cognition and decision making, including the biases introduced by heuristics, are reviewed and considered in terms of how they apply to human weather forecasters. Considerable insight into human forecasting could be gained by applying the available studies of the cognitive psychology of decision making. The few studies that have used weather forecasters as subjects suggest that further work could be productive in guiding the improvement of weather forecasts by humans. It is concluded that a multidisciplinary approach, involving disciplines outside of meteorology, needs to be developed and supported if there is to be a future role for humans in forecasting the weather.

Full access
Charles A. Doswell III

Abstract

No abstract available

Full access
Charles A. Doswell III and John A. Flueck

Abstract

Verification of forecasts during research field experiments is discussed and exemplified using the DOPLIGHT '87 experiment. We stress the importance of forecast verification if forecasting is to be a serious component of the research. Forecasting for field research is compared and contrasted directly with forecasting in the operational sense, highlighting the differences between them. The verification of field research program forecasting also differs from that done in operations, as a result of those forecasting differences.

DOPLIGHT '87, a field project conducted jointly by the National Severe Storms Laboratory and the Oklahoma City National Weather Service Forecast Office, is described in detail. During the experimental design, special attention was given to forecast design, to ensure that verification would be unambiguous and that the data collected would be appropriate for validating the forecasts. This a priori design of the forecasts with proper objective verification in mind is, we believe, unique among research field programs. The forecast evaluation focuses on the contingency table and the summary statistics derived from it, as treated in a companion paper by Flueck (1989; hereafter referred to as Flu89).

Results are interpreted in terms of their implications for future field research experiments and for operational forecasting. For example, it is noted that DOPLIGHT '87 forecasts of convective potential were nearly constant from the evening before an anticipated operational day to about local noon on that day. This suggests that convective storm field research operational decisions could be made as early as the evening before an anticipated operational day with negligible loss of skill. Summary measures of the forecast verification suggest that the DOPLIGHT '87 forecasters demonstrated skill roughly comparable to the forecasters at the National Severe Storms Forecast Center in issuing outlooks of convective potential. The requirement for time to assimilate the most recent data is noted both for field experiments and for operations, and some discussion of the potential impact of new data acquisition and processing systems is offered.
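
As a rough illustration of the contingency-table evaluation the abstract refers to, the sketch below computes several standard 2 × 2 summary statistics. It does not reproduce the specific measures treated in Flu89, and the counts are hypothetical.

```python
# Standard 2x2 contingency-table verification statistics (illustrative,
# not Flu89's specific treatment): a = hits, b = false alarms,
# c = misses, d = correct nulls.
def contingency_stats(a, b, c, d):
    n = a + b + c + d
    pod = a / (a + c)                      # probability of detection
    far = b / (a + b)                      # false-alarm ratio
    csi = a / (a + b + c)                  # critical success index
    bias = (a + b) / (a + c)               # frequency bias
    # Heidke skill score: accuracy relative to random chance.
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n
    hss = (a + d - expected) / (n - expected)
    return {"POD": pod, "FAR": far, "CSI": csi, "bias": bias, "HSS": hss}

# Hypothetical counts for a season of yes/no convective-potential outlooks.
print(contingency_stats(a=42, b=18, c=11, d=129))
```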

Full access
Hamish A. Ramsay and Charles A. Doswell III

Abstract

Four supercell motion forecast algorithms are investigated with respect to their hodograph-analysis parameters. Another method derived from the data presented herein, the so-called offset method, is used to develop a baseline standard for the aforementioned schemes; because it is based on knowing the observed storm motions and the mean wind, it is not itself a forecast scheme. This work explores the sensitivity of these algorithms to their arbitrary parameters by systematically varying those parameters, using a dataset of 394 right-moving supercells and associated proximity soundings. The parameters used in these algorithms to define the layer depths for advection and/or propagation of supercells have not been shown to be optimum for this purpose. These arbitrary parameters comprise the top and bottom levels of the mean wind layer and a deviation vector from the mean wind defined through that layer. Two of the most recently developed algorithms also incorporate the vertical wind shear vector over an arbitrary layer depth. It is found that, among other results, the scheme using both the mean wind and the vertical wind shear is more sensitive to the depth of the mean wind layer than to the depth of the vertical wind shear layer. It is also shown that, for the simplest schemes, the most accurate forecasts, on average, are obtained by using deep mean wind layers (i.e., deeper than 0–10 km). Indeed, all the forecast schemes show a strong tendency for the u component of the predicted storm motion to be regulated by the depth of the mean wind layer. The v component of the predicted storm motion, on the other hand, appears to be controlled by the deviation vector from the layer-mean wind. Although the schemes using vertical shear perform somewhat better on average than schemes based on the mean wind alone, there are cases in which they, too, result in large forecast errors. The results demonstrate the inherent difficulty of using an observed hodograph to predict supercell motion.
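
The following sketch illustrates the general class of predictors discussed above, in which a mean wind over one arbitrary layer is combined with a deviation vector oriented relative to the vertical shear over another. It is a schematic under assumed parameter values (0–6-km layers, a 7.5 m/s deviation), not a reproduction of any of the four published algorithms.

```python
# Schematic hodograph-based right-mover predictor: layer-mean wind plus
# a fixed-magnitude deviation to the right of the layer shear vector.
# Layer depths and the deviation magnitude are exactly the kind of
# arbitrary parameters whose sensitivity the paper examines.
import numpy as np

def predict_right_mover(heights, u, v, mean_layer=(0.0, 6000.0),
                        shear_layer=(0.0, 6000.0), deviation=7.5):
    """Predict right-moving supercell motion (m/s) from a hodograph."""
    heights, u, v = map(np.asarray, (heights, u, v))
    in_mean = (heights >= mean_layer[0]) & (heights <= mean_layer[1])
    u_mean, v_mean = u[in_mean].mean(), v[in_mean].mean()

    # Shear vector: wind at layer top minus wind at layer bottom.
    bot = np.argmin(np.abs(heights - shear_layer[0]))
    top = np.argmin(np.abs(heights - shear_layer[1]))
    du, dv = u[top] - u[bot], v[top] - v[bot]
    mag = np.hypot(du, dv)

    # Deviate to the right of the shear vector (unit normal (dv, -du)/|S|).
    return (u_mean + deviation * dv / mag, v_mean - deviation * du / mag)

# Hypothetical hodograph: winds every 1 km from 0 to 8 km AGL.
z = np.arange(0.0, 9000.0, 1000.0)
u = np.linspace(2.0, 28.0, z.size)    # westerly component increasing aloft
v = np.linspace(6.0, 2.0, z.size)     # weakly veering profile
print(predict_right_mover(z, u, v))
```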

Full access
Harold E. Brooks and Charles A. Doswell III

Abstract

The 3 May 1999 Oklahoma City tornado was the deadliest in the United States in over 20 years, with 36 direct fatalities. To understand how this event fits into the historical context, the record of tornado deaths in the United States has been examined. Almost 20 000 deaths have been reported in association with more than 3600 tornadoes in the United States since 1680. A cursory examination of the record shows a break in 1875. Prior to then, it is likely that many killer tornadoes failed to be reported. When the death toll is normalized by population, a near-constant rate of death is apparent until about 1925, when a sharp fall begins. The rate was about 1.8 people per million population in 1925 and was less than 0.12 people per million by 2000. The decrease in fatalities has resulted from two primary causes: a decrease in the number of killer tornadoes and a decrease in the number of fatalities in the most deadly tornadoes. Current death rates for mobile home residents, however, are still nearly what the overall national rate was prior to 1925 and are about 20 times the rate of site-built home residents. The increase in the fraction of the U.S. population living in mobile homes has important implications for future reductions in the death toll.
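
The population normalization used above is straightforward: annual deaths divided by population, expressed per million residents. The sketch below shows the arithmetic with illustrative numbers; the 36 fatalities come from the abstract, while the roughly 280 million population figure for 2000 is an assumption for illustration, not the paper's dataset.

```python
# Death-rate normalization: deaths per million residents.
def deaths_per_million(deaths, population):
    return deaths / (population / 1_000_000)

# E.g., 36 direct fatalities against an assumed population of ~280 million:
print(round(deaths_per_million(36, 280_000_000), 3))  # ~0.129 per million
```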

Full access