Search Results

1–7 of 7 items for Author or Editor: Eric Rogers in Weather and Forecasting
Eric Rogers, Dennis G. Deaven, and Geoffrey J. DiMego

Abstract

The analysis component of the National Centers for Environmental Prediction (NCEP) operational “early” 80-km eta model, as implemented in July 1993, is described. This optimum interpolation (OI) analysis is fully multivariate for wind and geopotential height (univariate for specific humidity) and is performed directly on the eta model's vertical coordinate. Although the eta OI analysis and model performance has been generally favorable when compared to the Limited-Area Fine Mesh Model (LFM) and the Nested Grid Model (NGM), deficiencies in the eta OI analysis fields have been observed, especially near the surface.
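The OI increment described above has the standard best-linear-unbiased-estimate form. A minimal single-step sketch in Python (toy matrices for illustration, not the operational eta analysis):

```python
import numpy as np

def oi_update(xb, y, H, B, R):
    """One optimum interpolation (OI) update step:
    xa = xb + K (y - H xb), with gain K = B H^T (H B H^T + R)^-1.
    xb: background state, y: observations, H: observation operator,
    B: background error covariance, R: observation error covariance."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (y - H @ xb)

# Toy example: two-element state, one observation of the first element.
xb = np.array([0.0, 0.0])
y = np.array([1.0])
H = np.array([[1.0, 0.0]])
xa = oi_update(xb, y, H, B=np.eye(2), R=np.eye(1))
```

With equal background and observation error variances, the analysis splits the difference between background and observation at the observed point, as the gain formula implies.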

A series of improvements to the eta OI analysis is described. A refinement to the eta model orography, which created a more realistic depiction of the model terrain, is also discussed along with the impact of these changes on analysis and model performance. These changes were implemented in the early eta system in September 1994.

The operational configuration of the new mesoscale (29 km) eta model system is introduced, consisting of a mesoscale eta-based data assimilation system (EDAS) and the mesoscale forecast. An example of an analysis produced by the mesoscale EDAS is presented for comparison with the operational 80-km eta OI analysis. More recent changes to the early eta system are also briefly described.

Wan-Shu Wu, David F. Parrish, Eric Rogers, and Ying Lin

Abstract

At the National Centers for Environmental Prediction, the global ensemble forecasts from the ensemble Kalman filter scheme in the Global Forecast System are applied in a regional three-dimensional (3D) and a four-dimensional (4D) ensemble–variational (EnVar) data assimilation system. The application is a one-way variational method using hybrid static and ensemble error covariances. To enhance impact, three new features have been added to the existing EnVar system in the Gridpoint Statistical Interpolation (GSI). First, the constant coefficients that assign relative weight between the ensemble and static background error are now allowed to vary in the vertical. Second, a new formulation is introduced for the ensemble contribution to the analysis surface pressure. Finally, in order to make use of the information in the ensemble mean that is disregarded in the existing EnVar in GSI, the trajectory correction, a novel approach, is introduced. Relative to the application of a 3D variational data assimilation algorithm, a clear positive impact on 1–3-day forecasts is realized when applying 3DEnVar analyses in the North American Mesoscale Forecast System (NAM). The 3DEnVar DA system was operationally implemented in the NAM Data Assimilation System in August 2014. Application of a 4DEnVar algorithm is shown to further improve forecast accuracy relative to the 3DEnVar. The approach described in this paper effectively combines contributions from both the regional and the global forecast systems to produce the initial conditions for the regional NAM system.
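The first new feature, vertically varying hybrid weights, amounts to blending the static and ensemble covariances with a level-dependent coefficient. A minimal sketch (illustrative only, not the GSI implementation; array shapes and the weight profile are assumptions):

```python
import numpy as np

def hybrid_B(B_static, B_ens, beta_ens):
    """Level-dependent hybrid background error covariance: at vertical level k
    the ensemble and static covariance blocks are blended with weights
    beta_ens[k] and 1 - beta_ens[k].
    B_static, B_ens: arrays of shape (nlev, n, n); beta_ens: shape (nlev,)."""
    w = np.asarray(beta_ens)[:, None, None]
    return (1.0 - w) * B_static + w * B_ens

# Made-up profile: more ensemble weight near the surface, less aloft.
nlev, n = 3, 2
Bs = np.tile(np.eye(n), (nlev, 1, 1))
Be = 3.0 * np.tile(np.eye(n), (nlev, 1, 1))
Bh = hybrid_B(Bs, Be, beta_ens=[0.8, 0.5, 0.2])
```

Allowing `beta_ens` to vary with level lets the analysis lean on the ensemble where it is most reliable while retaining the static covariance elsewhere, which is the point of the vertical-variation upgrade described above.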

Dusanka Zupanski, Milija Zupanski, Eric Rogers, David F. Parrish, and Geoffrey J. DiMego

Abstract

The National Centers for Environmental Prediction fine-resolution four-dimensional variational (4DVAR) data assimilation system is used to study the Great Plains tornado outbreak of 3 May 1999. It was found that the 4DVAR method was able to capture very well the important precursors for the tornadic activity, such as upper- and low-level jet streaks, wind shear, humidity field, surface CAPE, and so on. It was also demonstrated that, in this particular synoptic case, characterized by fast-changing mesoscale systems, the model error adjustment played a substantial role. The experimental results suggest that the common practice of neglecting the model error in data assimilation systems may not be justified in synoptic situations similar to this one.
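The role of the model error adjustment can be seen in the weak-constraint 4DVAR cost function, sketched here for a scalar toy model (illustrative only; the NCEP system is far more elaborate, and all parameter names here are assumptions):

```python
def wc4dvar_cost(x0, eta, xb, ys, M=1.0, H=1.0, B=1.0, R=1.0, Q=1.0):
    """Weak-constraint 4DVAR cost for a scalar linear model x_{k+1} = M x_k + eta_k.
    Terms: background misfit, observation misfits over the window, and a
    model-error penalty weighted by Q. Setting eta = 0 recovers the
    strong-constraint (perfect model) formulation."""
    J = 0.5 * (x0 - xb) ** 2 / B          # background term
    x = x0
    for k, y in enumerate(ys):
        J += 0.5 * (y - H * x) ** 2 / R   # observation term at time k
        if k < len(eta):
            x = M * x + eta[k]            # imperfect-model propagation
            J += 0.5 * eta[k] ** 2 / Q    # model-error penalty
    return J
```

In fast-changing mesoscale situations like this case, the optimal `eta` can be far from zero, which is why neglecting the model-error term (forcing `eta = 0`) can degrade the fit to observations.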

David J. Stensrud, Geoffrey S. Manikin, Eric Rogers, and Kenneth E. Mitchell

Abstract

The cold pool, a pool of evaporatively cooled downdraft air that spreads out horizontally along the ground beneath a precipitating cloud, is often a factor in severe weather and heavy precipitation events. Unfortunately, cold pools are not well sampled by the present observational network and are rarely depicted in numerical model initial conditions. A procedure to identify and insert cold pools into the 29-km Eta Model is developed and tested on seven cases during 1995. Results suggest that when the large-scale forcing is strong, the inclusion of cold pools produces only slight changes in the forecasts. However, for the one case in which the large-scale forcing is relatively weak, the inclusion of cold pools produces significant changes in many of the model fields. These initial results, while not conclusive, suggest that the incorporation of cold pools, and other mesoscale features, may be important to the improvement of numerical guidance for severe weather and heavy precipitation forecasting.
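A cold pool identification step of the kind described can be sketched as a threshold on the surface temperature depression relative to the environment (the criterion and the 2-K threshold here are hypothetical, not the paper's procedure):

```python
import numpy as np

def detect_cold_pool(t_sfc, t_env, threshold=-2.0):
    """Flag grid points where the surface temperature is depressed relative to
    a smoothed environmental value by more than |threshold| kelvin."""
    return (np.asarray(t_sfc) - np.asarray(t_env)) < threshold

# Toy 1D row of grid points with one evaporatively cooled region.
mask = detect_cold_pool([298.0, 294.5, 300.0], 300.0)
```

Grid points flagged by such a mask would then have the corresponding cool, moist downdraft air inserted into the model initial conditions.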

Eric Rogers, Thomas L. Black, Dennis G. Deaven, Geoffrey J. DiMego, Qingyun Zhao, Michael Baldwin, Norman W. Junker, and Ying Lin

Abstract

This note describes changes that have been made to the National Centers for Environmental Prediction (NCEP) operational “early” eta model. The changes are 1) a decrease in horizontal grid spacing from 80 to 48 km, 2) incorporation of a cloud prediction scheme, 3) replacement of the original static analysis system with a 12-h intermittent data assimilation system using the eta model, and 4) the use of satellite-sensed total column water data in the eta optimum interpolation analysis. When tested separately, each of the four changes improved model performance. A quantitative and subjective evaluation of the full upgrade package during March and April 1995 indicated that the 48-km eta model was more skillful than the operational 80-km model in predicting the intensity and movement of large-scale weather systems. In addition, the 48-km eta model was more skillful in predicting severe mesoscale precipitation events than the 80-km eta model, the Nested Grid Model, or the NCEP global spectral model during the March–April 1995 period. This new version of the operational early eta system was implemented in October 1995.

Pius Lee, Jeffery McQueen, Ivanka Stajner, Jianping Huang, Li Pan, Daniel Tong, Hyuncheol Kim, Youhua Tang, Shobha Kondragunta, Mark Ruminski, Sarah Lu, Eric Rogers, Rick Saylor, Perry Shafran, Ho-Chun Huang, Jerry Gorline, Sikchya Upadhayay, and Richard Artz

Abstract

The National Air Quality Forecasting Capability (NAQFC) upgraded its modeling system that provides developmental numerical predictions of particulate matter smaller than 2.5 μm in diameter (PM2.5) in January 2015. The issuance of PM2.5 forecast guidance has become more punctual and reliable because developmental PM2.5 predictions are provided from the same system that produces operational ozone predictions on the National Centers for Environmental Prediction (NCEP) supercomputers.

There were three major upgrades in January 2015: 1) incorporation of real-time intermittent sources for particles emitted from wildfires and windblown dust originating within the NAQFC domain, 2) suppression of fugitive dust emissions from snow- and/or ice-covered terrain, and 3) a shorter life cycle for organic nitrate in the gaseous-phase chemical mechanism. In May 2015 a further upgrade for emission sources was included using the U.S. Environmental Protection Agency’s (EPA) 2011 National Emission Inventory (NEI). Emissions for ocean-going ships and on-road mobile sources will continue to rely on NEI 2005.
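The snow/ice suppression in upgrade 2 amounts to masking fugitive dust emissions wherever the surface is covered. A minimal sketch (field names and the 50% cover-fraction threshold are assumptions, not the NAQFC code):

```python
import numpy as np

def suppress_fugitive_dust(emissions, snow_frac, ice_frac, cover_threshold=0.5):
    """Zero fugitive dust emissions at grid cells whose snow or ice cover
    fraction exceeds the threshold; other cells are left unchanged."""
    covered = (np.asarray(snow_frac) > cover_threshold) | (np.asarray(ice_frac) > cover_threshold)
    return np.where(covered, 0.0, emissions)
```

Applying such a mask prevents the model from lofting dust off terrain that is physically sealed by snow or ice, which is the wintertime bias the upgrade targets.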

Incremental tests and evaluations of these upgrades were performed over multiple seasons. They were verified against the EPA’s AIRNow surface monitoring network for air pollutants. Impacts of the three upgrades on the prediction of surface PM2.5 concentrations show large regional variability: the inclusion of windblown dust emissions in May 2014 improved PM2.5 predictions over the western states and the suppression of fugitive dust in January 2015 reduced PM2.5 bias by 52%, from 6.5 to 3.1 μg m−3 against a monthly average of 9.4 μg m−3 for the north-central United States.

Tom H. Zapotocny, Steven J. Nieman, W. Paul Menzel, James P. Nelson III, James A. Jung, Eric Rogers, David F. Parrish, Geoffrey J. DiMego, Michael Baldwin, and Timothy J. Schmit

Abstract

A case study is utilized to determine the sensitivity of the Eta Data Assimilation System (EDAS) to all operational observational data types used within it. The work described in this paper should be of interest to Eta Model users trying to identify the impact of each data type and could benefit other modelers trying to use EDAS analyses and forecasts as initial conditions for other models.

The case study chosen is one characterized by strong Atlantic and Pacific maritime cyclogenesis, and is shortly after the EDAS began using three-dimensional variational analysis. The control run of the EDAS utilizes all 34 of the operational data types. One of these data types is then denied for each of the subsequent experimental runs. Differences between the experimental and control runs are analyzed to demonstrate the sensitivity of the EDAS to each data type for the analysis and subsequent 48-h forecasts. Results show the necessity of various nonconventional observation types, such as aircraft data, satellite precipitable water, and cloud drift winds. These data types are demonstrated to have a significant impact, especially observations in maritime regions.
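The sensitivity measure in such data-denial experiments is some norm of the control-minus-denial difference; for instance, an RMS difference over a forecast field (a generic sketch, not the paper's specific diagnostics):

```python
import numpy as np

def denial_impact(control, denial):
    """Root-mean-square difference between a control forecast and a forecast
    from which one observation type was withheld; larger values indicate
    greater sensitivity to that data type."""
    d = np.asarray(control) - np.asarray(denial)
    return float(np.sqrt(np.mean(d ** 2)))
```

Ranking the 34 denial experiments by such a metric is one simple way to compare the impact of each data type on the analysis and forecasts.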
