Search Results

You are looking at 1–10 of 24 items for

  • Author or Editor: Eric Rogers

Eric Rogers and Lance F. Bosart

Abstract

A diagnostic study of two intense oceanic cyclones (16–18 November 1972 and 9–11 January 1982) along the east coast of North America is presented. Both cyclones formed along a strong low-level baroclinic zone and deepened in response to an approaching tropospheric short-wave trough aloft.

The 16–18 November 1972 storm developed along a Carolina coastal front in a region of enhanced low-level convergence and cyclonic vorticity generation favorable for incipient cyclogenesis. Significant oceanic heat and moisture fluxes northeast of the cyclone helped to enhance the low-level baroclinicity and to destabilize the warm, moist air mass along the cyclone's path. Explosive deepening [35 mb (12 h)−1] commenced when a rather ordinary midtropospheric trough was about 300 km upstream of the surface low. A noteworthy aspect of the storm was the existence of deep convection near the center throughout the deepening phase and the exceptionally tight inner pressure gradient, suggestive of the role of diabatic processes in the extreme development.

The 9–11 January 1982 cyclone formed and deepened in response to a vigorous mid- and upper-tropospheric trough/jet-streak system of greater intensity and large-scale baroclinicity than seen for the November 1972 cyclone. Oceanic heat fluxes were small except in the cyclone's wake. No widespread areas of convection or convective instability were observed as the cyclone explosively deepened. The 1982 storm achieved peak intensity over cold water, while the 1972 storm reached maximum strength over the warm waters of the Gulf Stream. Upper-level frontogenesis may have contributed to the intensity of the upper-tropospheric trough and the ensuing downstream development in the 1982 case.

Full access
Eric Rogers and Lance F. Bosart

Abstract

The explosively deepening oceanic cyclone or "bomb," defined as one with central pressure falls of 12 mb (12 h)−1 or greater, has been studied using composites constructed from North Atlantic and Pacific weather ship rawinsonde data for the period October 1965 to May 1974, with June through September excluded.

The composites revealed that the oceanic bomb evolved in a low-level baroclinic environment where the incipient circulation was confined to the lower troposphere. The cyclone subsequently developed into a deep vortex characterized by strong baroclinicity, strong low- and midlevel ascent to the north and east of the cyclone, and lower-tropospheric conditional instability found near and to the southeast of the cyclone center. A case study of a very intense cyclone that passed near ship 4YP revealed deep layers of conditional instability near the low center. It appears that explosively deepening cyclones are baroclinic phenomena whose development may be enhanced in some cases by the bulk effects of cumulus convection.
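
As a concrete reading of the threshold quoted above, the deepening-rate criterion reduces to a one-line check. The sketch below is illustrative only; the function name and the sample pressures are hypothetical and not taken from the paper.

```python
# Illustrative sketch (not from the paper): flag "bomb" cyclogenesis using the
# 12 mb (12 h)^-1 central-pressure-fall threshold quoted in the abstract.

def is_bomb(p_start_mb: float, p_end_mb: float, hours: float = 12.0) -> bool:
    """Return True if the central pressure fall, normalized to a 12-h interval,
    meets or exceeds 12 mb (12 h)^-1."""
    fall_per_12h = (p_start_mb - p_end_mb) * 12.0 / hours
    return fall_per_12h >= 12.0

# Example: a 35 mb fall in 12 h, like the November 1972 storm, easily qualifies.
print(is_bomb(990.0, 955.0, hours=12.0))  # True
```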

Full access
Wan-Shu Wu, David F. Parrish, Eric Rogers, and Ying Lin

Abstract

At the National Centers for Environmental Prediction, the global ensemble forecasts from the ensemble Kalman filter scheme in the Global Forecast System are applied in a regional three-dimensional (3D) and a four-dimensional (4D) ensemble–variational (EnVar) data assimilation system. The application is a one-way variational method using hybrid static and ensemble error covariances. To enhance impact, three new features have been added to the existing EnVar system in the Gridpoint Statistical Interpolation (GSI). First, the constant coefficients that assign relative weight between the ensemble and static background error are now allowed to vary in the vertical. Second, a new formulation is introduced for the ensemble contribution to the analysis surface pressure. Finally, to make use of the information in the ensemble mean that is disregarded in the existing EnVar in GSI, a novel approach, the trajectory correction, is introduced. Relative to the application of a 3D variational data assimilation algorithm, a clear positive impact on 1–3-day forecasts is realized when applying 3DEnVar analyses in the North American Mesoscale Forecast System (NAM). The 3DEnVar DA system was operationally implemented in the NAM Data Assimilation System in August 2014. Application of a 4DEnVar algorithm is shown to further improve forecast accuracy relative to the 3DEnVar. The approach described in this paper effectively combines contributions from both the regional and the global forecast systems to produce the initial conditions for the regional NAM system.
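
As a rough sketch of the hybrid weighting described above (the notation below is generic, not taken from the paper), the EnVar analysis increment combines a static-covariance part with a localized sum over ensemble perturbations, and the two parts are weighted in the cost function:

$$
\delta\mathbf{x} \;=\; \delta\mathbf{x}_{s} + \sum_{n=1}^{N} \boldsymbol{\alpha}_{n} \circ \mathbf{x}^{e}_{n},
$$
$$
J(\delta\mathbf{x}_{s},\boldsymbol{\alpha}) \;=\;
\frac{\beta_{s}}{2}\,\delta\mathbf{x}_{s}^{\mathrm{T}}\mathbf{B}^{-1}\delta\mathbf{x}_{s}
\;+\; \frac{\beta_{e}}{2}\sum_{n=1}^{N}\boldsymbol{\alpha}_{n}^{\mathrm{T}}\mathbf{L}^{-1}\boldsymbol{\alpha}_{n}
\;+\; \frac{1}{2}\,(\mathbf{H}\,\delta\mathbf{x}-\mathbf{d})^{\mathrm{T}}\mathbf{R}^{-1}(\mathbf{H}\,\delta\mathbf{x}-\mathbf{d}),
$$

where the x^e_n are ensemble perturbations, the α_n are extended control variables with localization L, d is the innovation vector, and the weights β_s and β_e (commonly constrained so that β_s^{-1} + β_e^{-1} = 1) set the relative contributions of the static and ensemble background errors. In this generic notation, the first new feature described in the abstract amounts to letting β_s and β_e vary with model level rather than holding them constant.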

Full access
Eric Rogers, Dennis G. Deaven, and Geoffrey S. DiMego

Abstract

The analysis component of the National Centers for Environmental Prediction (NCEP) operational "early" 80-km eta model, as implemented in July 1993, is described. This optimum interpolation (OI) analysis is fully multivariate for wind and geopotential height (univariate for specific humidity) and is performed directly on the eta model's vertical coordinate. Although the eta OI analysis and model performance have been generally favorable when compared to the Limited-Area Fine Mesh Model (LFM) and the Nested Grid Model (NGM), deficiencies in the eta OI analysis fields have been observed, especially near the surface.
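
For readers unfamiliar with OI, the update step itself is the standard statistical-interpolation formula. The following is a generic textbook sketch in NumPy, not the NCEP eta OI code; all variable names and the toy numbers are assumed for illustration.

```python
# Generic optimum interpolation (OI) update: a textbook sketch, not the NCEP eta
# OI implementation. x_b is the background, y the observations, H the (linear)
# observation operator, B and R the background- and observation-error covariances.
import numpy as np

def oi_update(x_b, y, H, B, R):
    """Return the OI analysis x_a = x_b + K (y - H x_b)."""
    innovation = y - H @ x_b            # observation-minus-background residual
    S = H @ B @ H.T + R                 # innovation covariance
    K = B @ H.T @ np.linalg.inv(S)      # gain matrix K = B H^T S^{-1}
    return x_b + K @ innovation

# Tiny example: two correlated state variables, one observation of the first.
x_b = np.array([1.0, 0.0])
H = np.array([[1.0, 0.0]])
B = np.array([[1.0, 0.5],
              [0.5, 1.0]])
R = np.array([[0.25]])
y = np.array([2.0])
print(oi_update(x_b, y, H, B, R))       # -> [1.8 0.4]; both variables adjusted
```

Because B carries cross-covariances between variables, a single observation adjusts the unobserved variable as well; in the multivariate eta OI this is what couples the wind and height analyses.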

A series of improvements to the eta OI analysis is described. A refinement to the eta model orography, which created a more realistic depiction of the model terrain, is also discussed along with the impact of these changes on analysis and model performance. These changes were implemented in the early eta system in September 1994.

The operational configuration of the new mesoscale (29 km) eta model system is introduced, consisting of a mesoscale eta-based data assimilation system (EDAS) and the mesoscale eta forecast. An example of an analysis produced by the mesoscale EDAS is presented for comparison with the operational 80-km eta OI analysis. More recent changes to the early eta system are also briefly described.

Full access
Jean Thiébaux, Eric Rogers, Wanqiu Wang, and Bert Katz

A new blended high-resolution real-time global sea surface temperature analysis (RTG_SST), developed specifically for use in operational numerical weather forecasting models, was implemented in NCEP's operational job stream on 30 January 2001, immediately following investigations of misforecast precipitation events in the mid-Atlantic states. Each daily analysis uses the most recent 24-h receipts of in situ and satellite-derived surface temperature data and provides a global SST field on a 0.5° × 0.5° (latitude–longitude) grid. The RTG_SST provides the sea surface temperature fields for the regional Meso Eta Model, replacing the previously used National Environmental Satellite, Data, and Information Service (NESDIS) 50-km satellite-only SST analysis.

Forecast events leading to the implementation of the RTG_SST are described; comparison is made of the properties used in this new analysis with those of the Reynolds-Smith (RS) analysis and the NESDIS 50-km analysis; data ingestion, analysis, and verification components of the RTG_SST are reviewed; and analysis-related products and data that are available via the NCEP Web site are referenced.

Full access
Robert J. Zamora, Edward P. Clark, Eric Rogers, Michael B. Ek, and Timothy M. Lahmers

Abstract

The NOAA Hydrometeorology Testbed (HMT) program has deployed a soil moisture observing network in the Babocomari River basin located in southeastern Arizona. The Babocomari River is a major tributary of the San Pedro River. At 0000 UTC 23 July 2008, the second-highest flow during the period of record was measured just upstream of the location where the Babocomari River joins the main channel of the San Pedro River.

Upper-air and surface meteorological observations and Special Sensor Microwave Imager (SSM/I) satellite images of integrated water vapor were used to establish the synoptic and mesoscale conditions that existed before the flood occurred. The analysis indicates that a weak Gulf of California surge initiated by Hurricane Fausto transported a warm moist tropical air mass into the lower troposphere over southern Arizona, setting the stage for the intense, deep convection that initiated the flooding on the Babocomari River. Observations of soil moisture and precipitation at five locations in the basin and streamflow measured at two river gauging stations enabled the documentation of the hydrometeorological conditions that existed before the flooding occurred. The observations suggest that soil moisture conditions as a function of depth, the location of semi-impermeable layers of sedimentary rock known as caliche, and the spatial distribution of convective precipitation in the basin confined the flooding to the lower part of the basin. Finally, the HMT soil moisture observations are compared with soil moisture products from the NOAA/NWS/NCEP Noah land surface model.

Full access
Milija Zupanski, Dusanka Zupanski, David F. Parrish, Eric Rogers, and Geoffrey DiMego

Abstract

Four-dimensional variational (4DVAR) data assimilation experiments for the East Coast winter storm of 25 January 2000 (i.e., the "blizzard of 2000") were performed. This storm received wide attention in the United States because it was one of the major failures of the operational forecast system. All operational models of the U.S. National Weather Service (NWS) failed to produce heavy precipitation over the Carolina–New Jersey corridor, especially during the early stage of the storm's development. The analysis cycle considered in this study is that of 0000 to 1200 UTC 24 January. This period was chosen because the forecast from 1200 UTC 24 January provided the most damaging guidance for forecasters at National Weather Service offices and elsewhere.

In the first set of experiments, the assimilation and forecast results of the 4DVAR and the operational three-dimensional variational (3DVAR) data assimilation methods are compared. The most striking difference is in the accumulated precipitation amounts. The 4DVAR experiment produced almost perfect 24-h accumulated precipitation during the first 24 h of the forecast (after data assimilation), with accurate heavy precipitation over North and South Carolina. The operational 3DVAR-based forecast badly underforecast precipitation. The reason for the difference is traced back to the initial conditions. Apparently, the 4DVAR data assimilation was able to create strong surface convergence and an excess of precipitable water over Georgia. This initial convection was strengthened by a low-level jet in the next 6–12 h, finally resulting in deep convection throughout the troposphere.

In the second set of experiments, the impact of model error adjustment and precipitation assimilation is examined by comparing the forecasts initiated from various 4DVAR experiments. The results strongly indicate the need for the model error adjustment in the 4DVAR algorithm, as well as the clear benefit of assimilation of the hourly accumulated precipitation.
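
The role of the model error adjustment examined here can be sketched with a generic weak-constraint 4DVAR cost function; the notation below is illustrative and is not taken from the authors' formulation:

$$
\mathbf{x}_{t} \;=\; M_{t}(\mathbf{x}_{t-1}) + \boldsymbol{\phi}_{t},
$$
$$
J(\mathbf{x}_{0},\boldsymbol{\phi}) \;=\;
\frac{1}{2}(\mathbf{x}_{0}-\mathbf{x}_{b})^{\mathrm{T}}\mathbf{B}^{-1}(\mathbf{x}_{0}-\mathbf{x}_{b})
\;+\; \frac{1}{2}\sum_{t}\boldsymbol{\phi}_{t}^{\mathrm{T}}\mathbf{Q}^{-1}\boldsymbol{\phi}_{t}
\;+\; \frac{1}{2}\sum_{t}\bigl[H_{t}(\mathbf{x}_{t})-\mathbf{y}_{t}\bigr]^{\mathrm{T}}\mathbf{R}^{-1}\bigl[H_{t}(\mathbf{x}_{t})-\mathbf{y}_{t}\bigr],
$$

where φ_t is a model error forcing term with assumed covariance Q. Dropping φ_t recovers strong-constraint 4DVAR with no model error adjustment, while observations such as hourly accumulated precipitation enter through the H_t and y_t terms; this is the sense in which the second set of experiments separates the two contributions.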

Full access
David J. Stensrud, Harold E. Brooks, Jun Du, M. Steven Tracton, and Eric Rogers
Full access
Dusanka Zupanski, Milija Zupanski, Eric Rogers, David F. Parrish, and Geoffrey J. DiMego

Abstract

The National Centers for Environmental Prediction fine-resolution four-dimensional variational (4DVAR) data assimilation system is used to study the Great Plains tornado outbreak of 3 May 1999. It was found that the 4DVAR method was able to capture very well the important precursors for the tornadic activity, such as upper- and low-level jet streaks, wind shear, humidity field, surface CAPE, and so on. It was also demonstrated that, in this particular synoptic case, characterized by fast-changing mesoscale systems, the model error adjustment played a substantial role. The experimental results suggest that the common practice of neglecting the model error in data assimilation systems may not be justified in synoptic situations similar to this one.

Full access
Eric W. Uhlhorn, Bradley W. Klotz, Tomislava Vukicevic, Paul D. Reasor, and Robert F. Rogers

Abstract

Wavenumber-1 wind speed asymmetries in 35 hurricanes are quantified in terms of their amplitude and phase, based on aircraft observations from 128 individual flights between 1998 and 2011. The impacts of motion and 850–200-mb environmental vertical shear are examined separately to estimate the resulting asymmetric structures at the sea surface and the standard 700-mb reconnaissance flight level. The surface asymmetry amplitude is on average around 50% smaller than found at flight level, and while the asymmetry amplitude grows in proportion to storm translation speed at flight level, no significant growth at the surface is observed, contrary to conventional assumption. However, a significant upwind storm-motion-relative phase rotation is found at the surface as translation speed increases, while the flight-level phase remains fairly constant. After removing the estimated impact of storm motion on the asymmetry, a significant residual shear-direction-relative asymmetry is found, particularly at the surface, and on average it is located downshear and to the left of the shear vector. Furthermore, the shear-relative phase shows a significant downwind rotation as the shear magnitude increases, such that the maximum rotates from the downshear to the left-of-shear azimuthal location. By stratifying observations according to shear-relative motion, this general pattern of a left-of-shear residual wind speed maximum is found regardless of the orientation between the storm's heading and the shear direction. These results are quite consistent with recent observational studies relating western Pacific typhoon wind asymmetries to environmental shear. Finally, changes in wind asymmetry over a 5-day period during Hurricane Earl (2010) are analyzed to understand the combined impacts of motion and the evolving shear.
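
For orientation, "quantified in terms of amplitude and phase" corresponds to fitting a wavenumber-1 harmonic to azimuthal samples of wind speed. The sketch below is illustrative only: the data are synthetic, the function name is hypothetical, and it assumes roughly uniform azimuthal coverage rather than the authors' actual flight sampling.

```python
# Illustrative wavenumber-1 fit: amplitude and phase of the azimuthal variation
# of wind speed, v(az) ~ v0 + A*cos(az - az_max). Synthetic data; not the
# authors' analysis code. Assumes roughly uniform azimuthal coverage.
import numpy as np

def wavenumber1_fit(azimuth_deg, wind_speed):
    """Return the wavenumber-1 amplitude and the azimuth (deg) of its maximum."""
    lam = np.radians(azimuth_deg)
    a = 2.0 * np.mean(wind_speed * np.cos(lam))   # cosine Fourier coefficient
    b = 2.0 * np.mean(wind_speed * np.sin(lam))   # sine Fourier coefficient
    amplitude = np.hypot(a, b)
    phase_deg = np.degrees(np.arctan2(b, a)) % 360.0
    return amplitude, phase_deg

# Synthetic example: a 6 m/s asymmetry with its maximum at 90 deg azimuth.
az = np.arange(0.0, 360.0, 10.0)
v = 50.0 + 6.0 * np.cos(np.radians(az - 90.0)) \
    + np.random.default_rng(0).normal(0.0, 0.5, az.size)
amp, phase = wavenumber1_fit(az, v)
print(round(amp, 1), round(phase, 1))  # approximately 6.0 and 90.0
```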

Full access