Search Results

You are looking at items 91–100 of 704 for:

  • Forecasting techniques
  • Bulletin of the American Meteorological Society
John N. McHenry, William F. Ryan, Nelson L. Seaman, Carlie J. Coats Jr., Janusz Pudykiewicz, Sarav Arunachalam, and Jeffery M. Vukovich

This article reports on the first implementation of a real-time Eulerian photochemical model forecast system in the United States. The forecast system consists of a tripartite set of one-way coupled models that run routinely on a parallel microprocessor supercomputer. The component models are the fifth-generation Pennsylvania State University (PSU)–NCAR Mesoscale Model (MM5), the Sparse-Matrix Operator Kernel for Emissions (SMOKE) model, and the Multiscale Air Quality Simulation Platform—Real Time (MAQSIP-RT) photochemical model. Though the system has been run in real time since the summer of 1998, forecast results obtained during August of 2001 at 15-km grid spacing over New England and the northern mid-Atlantic—conducted as part of an “early start” NOAA air quality forecasting initiative—are described in this article.

The development and deployment of a real-time numerical air quality prediction (NAQP) system is technically challenging. MAQSIP-RT contains a full photochemical oxidant gas-phase chemical mechanism together with transport, dry deposition, and sophisticated cloud treatment. To enable the NAQP system to run fast enough to meet operational forecast deadlines, significant work was devoted to data flow design and software engineering of the models and control codes. The result is a turnkey system now in use by a number of agencies concerned with operational ozone forecasting.

Results for the chosen episode are compared against three other models or modeling techniques: a traditional statistical model used routinely in the metropolitan Philadelphia, Pennsylvania, area; a set of publicly issued forecasts in the northeastern United States; and the operational Canadian Hemispheric and Regional Ozone and NOx System (CHRONOS) model. For the test period it is shown that the NAQP system performs as well as or better than all of these operational approaches. Implications for the impending development of an operational U.S. ozone forecasting capability are discussed in light of these results.

Full access
Christopher Davis and Frederick Carr

The major results and discussion items presented at the 1998 Workshop on mesoscale model verification, held 18–19 June in Boulder, Colorado, are summarized. This forum represents perhaps the first attempt to bring together the mesoscale modeling and statistical communities to discuss the most challenging issues related to verifying mesoscale forecasts. Pervading the discussion was the issue of uncertainty in predictions and observations and how to account for it when performing verification. This article discusses techniques to verify both deterministic and probabilistic predictions and provides recommendations for approaches to future endeavors in mesoscale model verification.
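
As a concrete illustration of the deterministic/probabilistic distinction drawn above, the sketch below computes a root-mean-square error for a deterministic forecast and a Brier score for a probability forecast. These two measures are an illustrative assumption; the workshop summary does not prescribe particular scores, and the sample values are hypothetical.

    # Illustrative only: one standard measure for deterministic forecasts
    # (root-mean-square error) and one for probabilistic forecasts (the
    # Brier score). Neither is prescribed by the workshop summary; the
    # sample forecasts and observations below are hypothetical.
    import numpy as np

    def rmse(forecast, observed):
        """Root-mean-square error of a deterministic forecast."""
        f, o = np.asarray(forecast, dtype=float), np.asarray(observed, dtype=float)
        return float(np.sqrt(np.mean((f - o) ** 2)))

    def brier_score(prob_forecast, event_occurred):
        """Mean squared difference between forecast probabilities and 0/1 outcomes."""
        p = np.asarray(prob_forecast, dtype=float)
        o = np.asarray(event_occurred, dtype=float)
        return float(np.mean((p - o) ** 2))

    # Hypothetical 6-h precipitation amounts (mm) and rain/no-rain probabilities
    print(rmse([2.0, 0.5, 1.2], [1.5, 0.0, 1.0]))       # deterministic error, mm
    print(brier_score([0.9, 0.2, 0.6], [1, 0, 1]))      # 0 is a perfect probability forecast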

Full access
Robert J. Dumont, Cynthia A. Nelson, Donald G. Caviness, Carl D. Thormeyer, David L. Martin, and John J. Pereira

The United States has several meteorological, oceanographic, and satellite operational processing centers (OPCs) in the military and civilian sectors. Separate cooperative and complementary military and civilian OPCs provide sufficient redundancy for backup purposes; permit the development of state-of-the-art forecasting schemes, such as the ensemble technique; and ensure the diverse environmental needs of military and civilian users are met with the most efficient use of resources. The effective collaboration of the military and civilian OPCs has resulted in the development of a truly national meteorological and oceanographic resource not attainable within any single agency.

Full access
Roy Lee

Progress toward an organizational solution of the weather forecasting problem depends directly on the nature of the problem itself, that is, how it is viewed and formulated. The real problem may be identified through an enquiry into the nature of prediction and the physical properties of the atmosphere whose future state we wish to know.

An analysis of prediction shows it to be generically similar to problem-solving or decision-making. There are five known prediction techniques or methods for computing future events: persistence, trend, cyclic, associative, and analogue. Regarding the second aspect of the enquiry, the atmosphere has certain distinguishable scales of motion or eddy size having different time and space characteristics, commonly called the macro or planetary scale, synoptic scale, meso-scale, and micro-scale.
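
A minimal sketch of the first two of these techniques, persistence and trend, applied to a short series of hypothetical observations follows; the cyclic, associative, and analogue methods require reference datasets and are not shown. The variable and values are illustrative assumptions, not taken from the article.

    # Minimal sketch of two of the prediction techniques named above:
    # persistence (the next value equals the latest observation) and
    # trend (linear extrapolation of the latest change). The temperature
    # series is hypothetical.

    def persistence_forecast(series):
        """Forecast the next value as the most recent observation."""
        return series[-1]

    def trend_forecast(series):
        """Extrapolate the most recent change one step ahead."""
        return series[-1] + (series[-1] - series[-2])

    temps_c = [14.0, 15.5, 17.0]          # hypothetical successive observations
    print(persistence_forecast(temps_c))  # 17.0
    print(trend_forecast(temps_c))        # 18.5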

Weather forecasting is then considered as a special case of prediction applied to the atmosphere, leading to the formulation of a general schematic solution to the complete weather forecasting problem. The characteristic properties of the different scales of atmospheric phenomena are shown to impose certain natural limitations on the ultimate accuracy of weather prediction.

The application of these principles to the design of the new organizational structure known as the Canadian Weather Service Forecasting System is discussed. Its three main components, the Central Analysis Office, the Weather Centrals, and the Weather Offices, are assigned primary responsibility for extended-, medium-, and short-range forecasting, respectively, corresponding to the natural scales of atmospheric phenomena. Special units handle special problems such as ice forecasting. This in turn leads to a functionally integrated system for the provision of meteorological service, a rational method for selecting and communicating meteorological data, a compatible set of operational prediction procedures for use in each component geared to current knowledge of the atmosphere, and means of internal support and communication between components.

It is envisaged that the Canadian Weather Service Forecasting System will ensure the continual modernization of operational procedures to keep pace with changing observational data forms, new analysis techniques, advances in computer capability, and the results of research, thereby realizing the optimum standard of weather service that the science can provide. Finally, this approach can be seen to have applications in World Weather System planning.

Full access

REFRACTT 2006

Real-Time Retrieval of High-Resolution, Low-Level Moisture Fields from Operational NEXRAD and Research Radars

Rita D. Roberts, Frédéric Fabry, Patrick C. Kennedy, Eric Nelson, James W. Wilson, Nancy Rehak, Jason Fritz, V. Chandrasekar, John Braun, Juanzhen Sun, Scott Ellis, Steven Reising, Timothy Crum, Larry Mooney, Robert Palmer, Tammy Weckwerth, and Sharmila Padmanabhan

The Refractivity Experiment for H2O Research and Collaborative Operational Technology Transfer (REFRACTT), conducted in northeast Colorado during the summer of 2006, provided a unique opportunity to obtain high-resolution gridded moisture fields from the operational Denver Next Generation Weather Radar (NEXRAD) and three research radars using a radar-based index of refraction (refractivity) technique. Until now, it has not been possible to observe and monitor moisture variability in the near-surface boundary layer at such high spatial (4-km horizontal gridpoint spacing) and temporal (4–10-min update rate) resolution using an operational NEXRAD, or to provide these moisture fields to researchers and National Weather Service (NWS) forecasters in real time. The overarching goals of REFRACTT were to 1) access and mosaic the refractivity data from the operational NEXRAD and research radars over a large domain for use by NWS forecasters in real time for short-term forecasting, 2) improve our understanding of near-surface water vapor variability and the role it plays in the initiation of convection and thunderstorms, and 3) improve the accuracy of quantitative precipitation forecasts (QPF) through improved observations and assimilation of low-level moisture fields. This paper presents examples of refractivity-derived moisture fields from REFRACTT in 2006 and of the moisture variability observed in the near-surface boundary layer in association with thunderstorm initiation and with a cold frontal passage.
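
Such retrievals rest on the standard radio refractivity relation N = 77.6(P/T) + 3.73 × 10^5 (e/T^2), with pressure P in hPa, temperature T in K, and water vapor pressure e in hPa (the widely used Bean and Dutton constants). The sketch below inverts that relation for e, assuming P and T are known from nearby surface observations; the example values are hypothetical, and the actual REFRACTT processing derives N from radar phase measurements to fixed ground targets rather than taking N as given.

    # Minimal sketch: invert the standard radio refractivity relation
    #   N = 77.6*(P/T) + 3.73e5*(e/T**2)
    # for water vapor pressure e (hPa), assuming pressure P (hPa) and
    # temperature T (K) are known from nearby surface observations.
    # Illustrative only; the example values are hypothetical.

    def vapor_pressure_from_refractivity(N, P_hpa, T_kelvin):
        """Water vapor pressure (hPa) implied by refractivity N (N-units)."""
        dry_term = 77.6 * P_hpa / T_kelvin
        return (N - dry_term) * T_kelvin**2 / 3.73e5

    # Example: N = 280 N-units, P = 850 hPa, T = 300 K
    e = vapor_pressure_from_refractivity(280.0, 850.0, 300.0)
    print(f"vapor pressure ~ {e:.1f} hPa")   # about 14.5 hPa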

Full access
Colby V. Ardis Jr.

Thunderstorm activity at Madison is frontal, that is, found in association with fronts. The objective method derived to forecast thunderstorms determines, first, whether the air mass is favorable for the occurrence of thunderstorms and, second, whether there is a front within 24 hr of Madison to release the latent instability needed to produce a thunderstorm.

The air-mass predictors used are (1) the Showalter Stability Index, (2) the freezing level and (3) the surface dew-point temperature at Madison. The synoptic (frontal) predictors used are (1) the difference in surface temperatures and surface dew-point temperatures, Madison minus LaCrosse, (2) the surface wind direction at Madison and (3) the three-hour pressure tendency at Madison.

The method is derived separately for each of the summer months June, July, and August. Five years of data were used for the development of each month's method, and two years of data were set aside for its test. Four scatter diagrams were developed for each month from which, within minutes each morning, a forecaster can obtain a dependable “YES” or “NO” forecast of thunderstorm activity at Madison without reference to his normal techniques and procedures used to forecast thunderstorms.

The results for June, July, and August, based on two years of data each, are the following per cent correct/skill scores: 87/0.59, 91/0.62, and 91/0.76, respectively.
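
The per cent correct and skill scores quoted above are the kind of categorical measures that follow from a 2 × 2 contingency table of YES/NO forecasts against observed thunderstorm days. The sketch below pairs percent correct with the Heidke skill score; the abstract does not state which skill score was used, and the counts in the example are hypothetical.

    # Percent correct and a categorical skill score from a 2x2 contingency
    # table: a = hits, b = false alarms, c = misses, d = correct nulls.
    # The Heidke skill score shown here is a common choice but is an
    # assumption; the counts below are hypothetical.

    def percent_correct(a, b, c, d):
        """Percentage of all forecasts (YES and NO) that verified."""
        return 100.0 * (a + d) / (a + b + c + d)

    def heidke_skill_score(a, b, c, d):
        """Fraction of correct forecasts beyond those expected by chance."""
        n = a + b + c + d
        expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n
        return (a + d - expected) / (n - expected)

    a, b, c, d = 8, 2, 3, 48   # hypothetical counts for one summer month
    print(round(percent_correct(a, b, c, d), 1),
          round(heidke_skill_score(a, b, c, d), 2))   # 91.8 0.71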

Full access
Thomas T. Warner and Nelson L. Seaman

A mesoscale modeling system is being applied on an experimental basis at The Pennsylvania State University (Penn State) for production of real-time, high-resolution numerical weather forecasts for the northeastern United States. The initial model experimentation is being supported by Penn State. It is believed to be the first time that a real-time, three-dimensional mesoscale model has been run routinely at an American university, although mesoscale models have been run in real time in government laboratories. A version of the Penn State/NCAR mesoscale model is employed, using a two-way interacting nested grid with a fine-grid increment of 30 km, a coarse-grid increment of 90 km, and 15 computational levels. The forecast cycle is initiated automatically by the Department of Meteorology's Digital Equipment Corporation VAX 8350 system when all the required 0000 UTC surface and upper-air National Weather Service (NWS) data have been received, quality checked, and archived. Lateral boundary conditions are extracted from the current or previous NWS nested-grid model forecast. The dataset constructed on the VAX system is then transmitted by a fiber-optic data network to an IBM 3090 located on the Penn State campus, where the model is initialized and run for a 24- to 36-h forecast. By about 0600 UTC, well before the beginning of the work day, a short-range mesoscale forecast is available in the Meteorology Department's weather station. These forecasts can be performed routinely on a daily basis, or they can be initiated when large-scale numerical guidance from the NWS indicates the possible development of significant mesoscale disturbances. Regular inspection of the fine-mesh model forecasts is serving as a catalyst for further improvements in the model and is stimulating the development of techniques for evaluation of mesoscale-model forecast skill and/or utilization of mesoscale numerical guidance in an operational setting. We are also finding that this real-time forecast capability is making significant contributions to the mesoscale-meteorology research program as well as to the teaching and public-service responsibilities of the Department of Meteorology at Penn State.

Full access
Joseph P. Koval and George S. Young

Computer applications of increasing diversity form a growing part of the undergraduate education of meteorologists in the early twenty-first century. The advent of the Internet economy, as well as a waning demand for traditional forecasters brought about by better numerical models and statistical forecasting techniques, has greatly increased the need for operational and commercial meteorologists to acquire computer skills beyond the traditional techniques of numerical analysis and applied statistics. Specifically, students with the skills to develop data distribution products are in high demand in the private-sector job market. Meeting these demands requires greater breadth, depth, and efficiency in computer instruction. The authors suggest that computer instruction for undergraduate meteorologists should include three key elements: a data distribution focus, emphasis on the techniques required to learn computer programming on an as-needed basis, and a project orientation to promote management skills and support student morale. In an exploration of this approach, the authors have reinvented the Applications of Computers to Meteorology course in the Department of Meteorology at The Pennsylvania State University to teach computer programming within the framework of an Internet product development cycle. Because the computer skills required for data distribution programming change rapidly, specific languages are valuable for only a limited time. A key goal of this course was therefore to help students learn how to retrain efficiently as technologies evolve. The crux of the course was a semester-long project during which students developed an Internet data distribution product. As project management skills are also important in the job market, the course teamed students in groups of four for this product development project. The successes, failures, and lessons learned from this experiment are discussed, and conclusions are drawn concerning undergraduate instructional methods for computer applications in meteorology.

Full access
Richard H. Thuillier and James S. Sandberg

As an extension of its strong policy against open burning, the Bay Area Air Pollution Control District (BAAPCD), in 1967, placed the previously unregulated burning of deciduous fruit and nut tree prunings under meteorological control pending the development of alternative methods of disposal. In applying the control program, temperature inversion criteria and all other factors involving vertical mixing, horizontal transport, and contaminant buildup were weighed by District meteorologists in arriving at a daily burn/no-burn decision.

As an evaluation of program effectiveness, three years of forecasts for the December–April agricultural burning season have been verified in terms of various indices of air quality. The BAAPCD's quantitative Combined Pollutant Index (CPI) was found to be a reasonable basis for forecast verification. A critical CPI value of 30 appeared as the 80th percentile in the burn frequency distribution and as the 20th percentile in the no-burn distribution. An objective technique was developed incorporating minimum temperature at San Jose, near the principal burning sites, with 3000-ft winds from the rawinsonde at Oakland. A test of this method on independent data during the 1970–1971 burning season established its usefulness as a burning forecast tool.
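
To make the percentile check concrete, a minimal sketch follows. The daily CPI samples are hypothetical, constructed so that the critical value lands near the 80th and 20th percentiles quoted above; only the critical value of 30 comes from the abstract.

    # Illustrative only: locate the critical Combined Pollutant Index (CPI)
    # value within the "burn" and "no-burn" day CPI distributions. The daily
    # samples are hypothetical; only the critical value of 30 is from the text.
    import numpy as np

    CPI_CRITICAL = 30.0
    burn_day_cpi = np.array([12, 18, 20, 22, 25, 26, 27, 29, 31, 34])      # hypothetical
    no_burn_day_cpi = np.array([28, 29, 31, 33, 35, 36, 38, 40, 42, 45])   # hypothetical

    def percentile_of(value, sample):
        """Percentage of the sample at or below the given value."""
        return 100.0 * np.mean(sample <= value)

    print(f"CPI = {CPI_CRITICAL:.0f} sits at the "
          f"{percentile_of(CPI_CRITICAL, burn_day_cpi):.0f}th percentile of burn days and the "
          f"{percentile_of(CPI_CRITICAL, no_burn_day_cpi):.0f}th percentile of no-burn days")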

Full access
Jeff Kingwell, Junichiro Shimizu, Kaneaki Narita, Hirofumi Kawabata, and Itsuro Shimizu

Many of the techniques employed for rocket meteorology—“rocket-casting”—have been adapted from aviation. However, the unique characteristics and requirements of rocketry demand special meteorological procedures and instrumentation, which are only recently becoming satisfactorily defined.

The influence of weather parameters on operational rocketry is examined, with special emphasis on the Tanegashima Space Center, Japan. It is concluded that the fundamental requirement for efficient launch operations is a highly sophisticated nowcasting facility, backed by an effective research and development program.

On 13 August 1986, the National Space Development Agency of Japan (NASDA) launched three payloads from the Osaki rocket range in Tanegashima on the inaugural flight of the H-1 launch vehicle.

The launch weather was expected to be fine at the range. In the event, a thunderstorm commenced close to the launch area during the last few seconds before launch, which nevertheless proceeded successfully. This incident highlights the uncertainties of rocket operations, particularly in the critical area of the provision of reliable weather information and forecasts.

The synoptic conditions at the time of the 13 August launch incident are examined, and a qualitative forecast checklist is suggested to assist in forecasting similar summertime early-morning maritime storms in the future.

Full access