Search Results

You are looking at 1–8 of 8 items for

  • Author or Editor: John A. McGinley
Yoshi K. Sasaki and John A. McGinley

Abstract

Within variational calculus there are problems that employ inequalities as constraints. Adjustment of absolutely unstable atmospheric layers is a suitable problem on which to apply an appropriate solution technique. The Valentine-Berkovitz method introduces a new unknown dependent variable, the slack function, which converts the inequality condition into an equality. An example is presented that shows how the adjustment may be controlled depending on the source of the unstable layer.
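The slack-function device described above can be sketched as follows (the notation here is illustrative, not the paper's own symbols):

```latex
% Illustrative notation; the paper's own symbols may differ.
% An inequality constraint
\phi(x) \;\ge\; 0
% is converted to an equality by introducing a new unknown
% dependent variable, the slack function s(x):
\phi(x) - s(x)^{2} \;=\; 0 ,
% after which the augmented functional
J[x,s,\lambda] \;=\; \int \Big[ F(x) + \lambda \big( \phi(x) - s(x)^{2} \big) \Big]\, dt
% can be treated with the standard Euler--Lagrange machinery
% for equality-constrained variational problems.
```

Because the square of a real-valued function is nonnegative by construction, any solution of the equality-constrained problem automatically satisfies the original inequality.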

Full access
John A. McGinley, Steven C. Albers, and Peter A. Stamus

Abstract

Advances in remote sensing from earth- and spaceborne systems, expanded in situ observation networks, and increased low-cost computer capability will allow an unprecedented view of mesoscale weather systems from the local weather office. However, the volume of data from these new instruments, the nonconventional quantities measured, and the need for a frequent operational cycle require development of systems to translate this information into products aimed specifically at aiding the forecaster in 0- to 6-h prediction. In northeast Colorado an observing network now exists that is similar to those a local weather office may see within 5–7 years. With GOES and TIROS satellites, Doppler radar, wind profilers, and surface mesonet stations, a unique opportunity exists to explore the use of such data in nowcasting weather phenomena. The scheme, called LAPS (the Local Analysis and Prediction System), objectively analyzes data on a high-resolution, three-dimensional grid. The analyzed fields are used to generate mesoscale forecast products aimed at specific local forecast problems. An experiment conducted in the summer of 1989 tested the use of a preconvective index on the difficult problem of convective rain forecasting. The index was configured from the surface-based lifted index and kinematically diagnosed vertical motion. Because the index drew on a number of LAPS-derived meteorological fields, the test results also reflect, in part, the quality of those fields. Using radar reflectivity to verify the occurrence or nonoccurrence of convective precipitation, forecasts were issued for three time periods on each of 62 exercise days. The results indicated that the index was significantly better than persistence over a range of echo intensities. Skill scores computed from contingency tables indicated that the index had substantial skill in forecasting light convective precipitation with 1- to 3-h lead time. Less skill was shown for heavier convective showers. The skill of the index did not depend strongly on the density of surface data, but was degraded over mountainous terrain.
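The skill scores mentioned above are computed from 2x2 contingency tables of forecast versus observed convective rain. The abstract does not name the specific score; the sketch below uses the Heidke skill score (HSS), a common choice for this kind of yes/no verification, with hypothetical counts:

```python
# Sketch of a skill-score computation from a 2x2 forecast/observation
# contingency table, as used to verify yes/no convective-rain forecasts.
# The abstract does not name the score; the Heidke skill score (HSS) is
# shown here as a common choice for this kind of verification.

def heidke_skill_score(hits, misses, false_alarms, correct_negatives):
    """HSS = (observed correct - expected correct) / (total - expected correct)."""
    n = hits + misses + false_alarms + correct_negatives
    correct = hits + correct_negatives
    # Number of correct forecasts expected by chance, from the marginal totals.
    expected = ((hits + misses) * (hits + false_alarms)
                + (correct_negatives + misses) * (correct_negatives + false_alarms)) / n
    return (correct - expected) / (n - expected)

# Hypothetical counts for one echo-intensity threshold:
hss = heidke_skill_score(hits=30, misses=10, false_alarms=12, correct_negatives=134)
print(round(hss, 3))  # → 0.656
```

HSS is 1 for a perfect forecast and 0 for a forecast no better than chance, which makes it convenient for comparing the index against persistence across echo-intensity thresholds.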

Full access
John S. Snook, Peter A. Stamus, James Edwards, Zaphiris Christidis, and John A. McGinley

Abstract

The National Weather Service (NWS) developed the Olympic Weather Support System (OWSS) to provide specialized operational weather support for the 1996 Centennial Olympic Games in Atlanta. Operational implementation of the National Oceanic and Atmospheric Administration Forecast Systems Laboratory’s Local Analysis and Prediction System (LAPS) was a key element of the OWSS. LAPS is a complete, three-dimensional data assimilation system that produced subhourly atmospheric analyses on an 8-km grid covering all the Olympic venues. The LAPS analyses also provided initial conditions to the Regional Atmospheric Modeling System (RAMS) mesoscale forecast model. RAMS forecasts were generated at least every 3 h using 8- or 2-km grids. For the first time, a comprehensive operational analysis and forecast system operated in a local NWS forecast office to support meso-β-scale forecasts and warnings. Numerous benefits of LAPS–RAMS to the local forecast office were demonstrated. The OWSS, with LAPS–RAMS included, provided a precursory view of the enhanced operational mesoscale forecast capabilities that can be available to the NWS and other forecast offices in the near future.

Full access
Edward J. Szoke, John M. Brown, John A. McGinley, and Dennis Rodgers

Abstract

The Stormscale Operational and Research Meteorology-Fronts Experimental Systems Test (STORM-FEST) was held from 1 February to 15 March 1992 in the central United States as a preliminary field systems test for an eventual larger-scale program. One of the systems tested was a remote operations center, located in Boulder, Colorado, significantly displaced from the main field concentration of scientists and research aircraft. In concert with the remote operations center test was a test of remote forecasting support, also centered in Boulder. The remote forecasting for STORM-FEST was the first major project undertaken by the Boulder-Denver Experimental Forecast Facility (EFF), a cooperative effort between operations and research aimed at finding more effective ways of addressing applied meteorological problems. Two other newly formed EFFs, at Norman, Oklahoma, and Kansas City, Missouri, also played key roles in the forecasting/nowcasting support. A description of the design and function of this remote forecasting and nowcasting support is given, followed by an assessment of its utility during STORM-FEST. Although remote forecasting support was deemed plausible based on the STORM-FEST experience, a number of suggestions are given for conducting forecasting experiments and providing forecasting support more effectively during a field program.

Full access
Steven C. Albers, John A. McGinley, Daniel L. Birkenheuer, and John R. Smart

Abstract

The Local Analysis and Prediction System combines numerous data sources into a set of analyses and forecasts on a 10-km grid with high temporal resolution. To arrive at an analysis of cloud cover, several input analyses are combined with surface aviation observations and pilot reports of cloud layers. These input analyses are a skin temperature analysis (used to solve for cloud layer heights and coverage) derived from Geostationary Operational Environmental Satellite IR 11.24-µm data, other visible and multispectral imagery, a three-dimensional temperature analysis, and a three-dimensional radar reflectivity analysis derived from full volumetric radar data. Use of a model first guess for clouds is currently being phased in. The goal is to combine the data sources to take advantage of their strengths, thereby automating a synthesis similar to that performed by a human forecaster.

The design of the analysis procedures and output displays focuses on forecaster utility. A number of derived fields are calculated including cloud type, liquid water content, ice content, and icing severity, as well as precipitation type, concentration, and accumulation. Results from validating the cloud fields against independent data obtained during the Winter Icing and Storms Project are presented.

Forecasters can now use these analyses in a variety of situations, such as depicting sky cover and radiation characteristics over a region, delineating visibility and icing conditions in three dimensions for aviation, and depicting precipitation type and rain and snow accumulation.
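The multisource synthesis described above can be illustrated with a toy decision at a single grid point (this is an illustrative sketch, not the actual LAPS algorithm; the thresholds are invented placeholders): a radar echo confirms cloud directly, while an 11-µm brightness temperature well below the analyzed skin temperature implies a cloud top colder, and therefore higher, than the ground.

```python
# Illustrative sketch (NOT the actual LAPS algorithm) of combining data
# sources when deciding cloud cover at one grid point. Thresholds are
# invented placeholders.
def cloudy(reflectivity_dbz, tb11_k, skin_temp_k,
           dbz_min=5.0, tb_deficit_k=10.0):
    if reflectivity_dbz is not None and reflectivity_dbz >= dbz_min:
        return True  # precipitation echo implies cloud
    # Brightness temperature much colder than the surface -> cloud top aloft.
    return (skin_temp_k - tb11_k) >= tb_deficit_k

print(cloudy(reflectivity_dbz=None, tb11_k=260.0, skin_temp_k=285.0))  # True
print(cloudy(reflectivity_dbz=2.0, tb11_k=283.0, skin_temp_k=285.0))   # False
```

Each source covers the other's blind spots: radar sees precipitating cloud that low-level IR contrast may miss, while satellite IR sees nonprecipitating cloud invisible to radar.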

Full access
Huiling Yuan, John A. McGinley, Paul J. Schultz, Christopher J. Anderson, and Chungu Lu

Abstract

High-resolution (3 km) time-lagged (initialized every 3 h) multimodel ensembles were produced in support of the Hydrometeorological Testbed (HMT)-West-2006 campaign in northern California, covering the American River basin (ARB). Multiple mesoscale models were used, including the Weather Research and Forecasting (WRF) model, Regional Atmospheric Modeling System (RAMS), and fifth-generation Pennsylvania State University–National Center for Atmospheric Research Mesoscale Model (MM5). Short-range (6 h) quantitative precipitation forecasts (QPFs) and probabilistic QPFs (PQPFs) were compared to the 4-km NCEP stage IV precipitation analyses for archived intensive operation periods (IOPs). The two sets of ensemble runs (operational and rerun forecasts) were examined to evaluate the quality of high-resolution QPFs produced by time-lagged multimodel ensembles and to investigate the impacts of ensemble configurations on forecast skill. Uncertainties in precipitation forecasts were associated with different models, model physics, and initial and boundary conditions. The diabatic initialization by the Local Analysis and Prediction System (LAPS) helped precipitation forecasts, while the selection of microphysics was critical in ensemble design. Probability biases in the ensemble products were addressed by calibrating PQPFs. Using artificial neural network (ANN) and linear regression (LR) methods, the bias correction of PQPFs and a cross-validation procedure were applied to three operational IOPs and four rerun IOPs. Both the ANN and LR methods effectively improved PQPFs, especially for lower thresholds. The LR method outperformed the ANN method in bias correction, in particular for a smaller training data size. More training data (e.g., one-season forecasts) are desirable to test the robustness of both calibration methods.
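The linear-regression (LR) bias correction of PQPFs mentioned above can be sketched as fitting the observed outcome frequency as a linear function of the raw ensemble probability on a training set, then applying the fit to new forecasts (a minimal sketch with hypothetical data; NumPy's least squares stands in for whatever regression implementation was actually used):

```python
# Sketch of LR calibration of PQPFs: regress 0/1 observed exceedance of a
# precipitation threshold on the raw ensemble probability, then map raw
# probabilities through the fitted line. Data values are hypothetical.
import numpy as np

def fit_lr_calibration(raw_prob, observed):
    """Return (slope, intercept) of observed ~ a*raw_prob + b."""
    A = np.column_stack([raw_prob, np.ones_like(raw_prob)])
    (a, b), *_ = np.linalg.lstsq(A, observed, rcond=None)
    return a, b

def calibrate(raw_prob, a, b):
    # Clip so the corrected value is still a valid probability.
    return np.clip(a * raw_prob + b, 0.0, 1.0)

# Training pairs: raw PQPF vs. 0/1 observation of exceeding the threshold.
raw = np.array([0.1, 0.2, 0.4, 0.6, 0.8, 0.9])
obs = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
a, b = fit_lr_calibration(raw, obs)
print(calibrate(np.array([0.5]), a, b))
```

In the cross-validation procedure the abstract describes, the fit would be trained on some IOPs and applied to held-out ones; the abstract's finding that LR beats the ANN for small training sets is consistent with this model having only two free parameters.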

Full access
Huiling Yuan, Chungu Lu, John A. McGinley, Paul J. Schultz, Brian D. Jamison, Linda Wharton, and Christopher J. Anderson

Abstract

Short-range quantitative precipitation forecasts (QPFs) and probabilistic QPFs (PQPFs) are investigated for a time-lagged multimodel ensemble forecast system. One advantage of such an ensemble forecast system is the low cost of generating ensemble members. In conjunction with a frequently cycling data assimilation system using a diabatic initialization [such as the Local Analysis and Prediction System (LAPS)], the time-lagged multimodel ensemble system offers a particularly appealing approach for QPF and PQPF applications. Using the NCEP stage IV precipitation analyses for verification, 6-h QPFs and PQPFs from this system are assessed during the period March–May 2005 over the west-central United States. The ensemble system was initialized by hourly LAPS runs at a horizontal resolution of 12 km using two mesoscale models: the fifth-generation Pennsylvania State University–National Center for Atmospheric Research Mesoscale Model (MM5) and the Weather Research and Forecasting (WRF) model with the Advanced Research WRF (ARW) dynamic core. The 6-h PQPFs from this system outperform the NCEP operational North American Mesoscale (NAM) deterministic runs at 12-km resolution, even though individual MM5 or WRF members perform worse than the NAM forecasts at higher thresholds and longer lead times. Recalibration was conducted to reduce the intensity errors in time-lagged members. In spite of large biases and spatial displacement errors in the MM5 and WRF forecasts, statistical verification of QPFs and PQPFs shows more skill at longer lead times when more members from earlier initialization cycles are added. Combining the two models only reduced the forecast biases. The results suggest that further studies of time-lagged multimodel ensembles for operational forecasts are warranted.
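The time-lagged ensemble idea above can be sketched concretely: forecasts from successive initialization cycles that are all valid at the same time are pooled as ensemble members, and the PQPF for a threshold is the fraction of members exceeding it (member values here are hypothetical 6-h accumulations at one grid point):

```python
# Sketch of a time-lagged multimodel PQPF at one grid point: pool forecasts
# from successive initialization cycles valid at the same time, and take
# the exceedance fraction. Member values are hypothetical 6-h totals (mm).
import numpy as np

def pqpf(members, threshold_mm):
    """Exceedance probability from pooled time-lagged members."""
    members = np.asarray(members, dtype=float)
    return float(np.mean(members > threshold_mm))

# Same valid time; members come from progressively older initializations
# of two models (a 2-model x 3-cycle time-lagged design):
members = [4.2, 5.1, 3.8, 6.0, 2.9, 5.5]
print(pqpf(members, threshold_mm=5.0))  # → 0.5
```

Adding members from earlier cycles enlarges the ensemble at no extra model-run cost, which is why the verification above gains skill at longer lead times as more lagged members are pooled.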

Full access
Steven V. Vasiloff, Dong-Jun Seo, Kenneth W. Howard, Jian Zhang, David H. Kitzmiller, Mary G. Mullusky, Witold F. Krajewski, Edward A. Brandes, Robert M. Rabin, Daniel S. Berkowitz, Harold E. Brooks, John A. McGinley, Robert J. Kuligowski, and Barbara G. Brown

Accurate quantitative precipitation estimates (QPE) and very short term quantitative precipitation forecasts (VSTQPF) are critical to accurate monitoring and prediction of water-related hazards and water resources. While tremendous progress has been made in the last quarter-century in many areas of QPE and VSTQPF, significant gaps remain in both the knowledge and the capabilities needed to produce accurate, high-resolution precipitation estimates at the national scale for a wide spectrum of users. To address these gaps, a national next-generation QPE and VSTQPF (Q2) workshop was held in Norman, Oklahoma, on 28–30 June 2005. Scientists, operational forecasters, water managers, and stakeholders from the public and private sectors, including academia, presented and discussed a broad range of precipitation and forecasting topics and issues, and developed a list of science focus areas. To meet the nation's needs for precipitation information effectively, the authors herein propose a community-wide integrated approach that fully capitalizes on recent advances in science and technology and leverages the wide range of expertise and experience in the research and operational communities. The concepts and recommendations from the workshop form the Q2 science plan and a suggested path to operations. Implementation of these concepts is expected to improve river forecasts and flood and flash flood watches and warnings, and to enhance various hydrologic and hydrometeorological services for a wide range of users and customers. In support of this initiative, the National Mosaic and Q2 (NMQ) system is being developed at the National Severe Storms Laboratory to serve as a community test bed for QPE and VSTQPF research and to facilitate the transition of research applications to operations. The NMQ system provides a real-time, around-the-clock data infusion and applications development and evaluation environment, and thus offers a community-wide platform for developing and testing advances in the focus areas.

Full access