Search Results

You are looking at 1–5 of 5 items for Author or Editor: Dennis G. Deaven
Dennis G. Deaven

Abstract

A numerical model is designed to integrate the primitive equations in two dimensions in order to evaluate a new vertical coordinate scheme for hydrostatic flow. Because potential temperature is nearly conserved in large-scale atmospheric flow, it is used as the vertical coordinate in the layer extending from the lower troposphere to the top of the model atmosphere, which is near the middle stratosphere. The scheme circumvents some of the problems that have appeared in previous isentropic coordinate models by utilizing a variable-depth layer near the earth's surface in which a function of pressure is the vertical coordinate. As a result, no coordinate surface intersects the surface of the earth, and complicated topographic features can easily be constructed. In addition, adiabatic and superadiabatic lapse rates can develop near the surface of the earth because the vertical coordinate in this region is a quasi-horizontal function of pressure.
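
The abstract does not give the coordinate's explicit form. Purely as an illustrative sketch of a hybrid coordinate of this general type (the symbols below, such as the interface pressure p_I and the surface value ζ_s, are placeholders, not quantities from the paper), one could write

\[
\zeta =
\begin{cases}
\theta, & p \le p_I \quad \text{(isentropic layer aloft)} \\[4pt]
\zeta_s + (\theta_I - \zeta_s)\,\dfrac{p_s - p}{p_s - p_I}, & p_I < p \le p_s \quad \text{(pressure-based layer below)},
\end{cases}
\]

where $p_s$ is the surface pressure, $p_I$ the (variable) pressure at the top of the near-surface layer, $\theta_I$ the potential temperature at that interface, and $\zeta_s$ a fixed value assigned to the ground. In a formulation of this kind the ground ($p = p_s$) is itself the coordinate surface $\zeta = \zeta_s$, so no coordinate surface intersects the terrain, and the lower-layer coordinate varies monotonically with pressure regardless of the lapse rate, which is what permits adiabatic and superadiabatic layers near the surface.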

Two experiments are performed to test the proposed vertical coordinate scheme. The first simulates the waves that form in the atmosphere when the flow is normal to a mountain ridge. In the second, a temperature excess is introduced at the earth's surface to simulate conditions analogous to a large industrial power park.

The results of the experiments illustrate that the layer developed here can be used as the interface between an isentropic coordinate layer and the earth's surface in numerical prediction models.

Eric Rogers, Dennis G. Deaven, and Geoffrey S. DiMego

Abstract

The analysis component of the National Centers for Environmental Prediction (NCEP) operational “early” 80-km eta model, as implemented in July 1993, is described. This optimum interpolation (OI) analysis is fully multivariate for wind and geopotential height (univariate for specific humidity) and is performed directly on the eta model's vertical coordinate. Although the performance of the eta OI analysis and model has been generally favorable compared to that of the Limited-Area Fine Mesh Model (LFM) and the Nested Grid Model (NGM), deficiencies in the eta OI analysis fields have been observed, especially near the surface.
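
The analysis equations are not reproduced in the abstract. For orientation only, the standard optimum interpolation (statistical interpolation) update on which schemes of this kind are based can be sketched as

\[
\mathbf{x}_a = \mathbf{x}_b + \mathbf{B}\mathbf{H}^{\mathsf T}\left(\mathbf{H}\mathbf{B}\mathbf{H}^{\mathsf T} + \mathbf{R}\right)^{-1}\left(\mathbf{y}_o - \mathbf{H}\mathbf{x}_b\right),
\]

where $\mathbf{x}_b$ is the background state (here a short eta model forecast), $\mathbf{y}_o$ the observation vector, $\mathbf{H}$ the observation operator, and $\mathbf{B}$ and $\mathbf{R}$ the background- and observation-error covariance matrices; in practice OI applies this update locally, selecting only a limited set of nearby observations for each analysis volume. The multivariate character mentioned above enters through the cross covariances between wind and geopotential height in $\mathbf{B}$, while specific humidity is analyzed with a separate, univariate version of the same update.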

A series of improvements to the eta OI analysis is described. A refinement to the eta model orography, which created a more realistic depiction of the model terrain, is also discussed along with the impact of these changes on analysis and model performance. These changes were implemented in the early eta system in September 1994.

The operational configuration of the new mesoscale (29 km) eta model system is introduced, consisting of a mesoscale eta-based data assimilation system (EDAS) and the mesoscale eta forecast. An example of an analysis produced by the mesoscale EDAS is presented for comparison with the operational 80-km eta OI analysis. More recent changes to the early eta system are also briefly described.

Fedor Mesinger, Zaviša I. Janjić, Slobodan Ničković, Dušanka Gavrilov, and Dennis G. Deaven

Abstract

The problem of the pressure gradient force error in the case of the terrain-following (sigma) coordinate does not appear to have a solution. The problem is not one of truncation error in the calculation of the space derivatives involved. Thus, with temperature profiles resulting in large errors, an increase in vertical resolution may not reduce the error and is even likely to increase it. Therefore, an approach abandoning the sigma system has been proposed. It involves the use of “step” mountains, with coordinate surfaces prescribed to remain at fixed elevations at places where they touch (and define) or intersect the ground surface. Thus, the coordinate surfaces are quasi-horizontal, and the sigma system problem is not present. At the same time, the simplicity of the sigma system is maintained.
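
The abstract does not state the coordinate definition itself. In the form usually cited in the literature, the step-mountain (eta) coordinate is

\[
\eta = \frac{p - p_T}{p_S - p_T}\,\eta_S,
\qquad
\eta_S = \frac{p_{\mathrm{rf}}(z_S) - p_T}{p_{\mathrm{rf}}(0) - p_T},
\]

where $p_T$ is the pressure at the model top, $p_S$ the surface pressure, $z_S$ the height of the (step-shaped) model terrain, and $p_{\mathrm{rf}}(z)$ a fixed reference pressure profile. Because $\eta_S$ is determined by the reference profile rather than by the actual surface pressure, the coordinate surfaces remain quasi-horizontal and the tops of the steps coincide with coordinate surfaces at prescribed elevations, which removes the sigma-coordinate pressure gradient force problem while retaining a sigma-like formalism.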

In this paper, the design of the model (“silhouette” averaged) mountains, the properties of the wall boundary condition, and the scheme for calculating the conversion between potential and kinetic energy are presented. For an advection scheme achieving strict control of the nonlinear energy cascade on the semistaggered grid, it is demonstrated that a straightforward no-slip wall boundary condition maintains the conservation properties that the scheme has in the absence of vertical walls, properties that are important for controlling this energy cascade from large to small scales. However, with this simple boundary condition, momentum is not conserved. The scheme conserving energy in the conversion between potential and kinetic energy, given earlier for the one-dimensional case, is extended to two dimensions.

Results of real-data experiments testing the performance of the resulting “step-mountain” model are described. An attractive feature of a step-mountain (“eta”) model is that it can easily be run as a sigma system model, the only difference being the definition of the ground surface grid point values of the vertical coordinate. This permits a comparison of the sigma and the eta formulations. Two experiments of this kind have been made with a model version including realistic steep mountains (steps at 290, 1112, and 2433 m). Both revealed a substantial amount of noise resulting from the sigma, as compared to the eta, formulation. One of these experiments, particularly in its step-mountain version, gave a rather successful simulation of the difficult “historic” Buzzi–Tibaldi case of Genoa lee cyclogenesis. A parallel experiment showed that, starting from the same initial data, no cyclogenesis is obtained without mountains. Moreover, the experiment with mountains also simulated the accompanying midtropospheric cutoff, a phenomenon that apparently has not been reproduced in previous simulations of mountain-induced Genoa lee cyclogenesis.

For a North American limited-area region, experimental step-mountain simulations were performed for a case of March 1984 involving the development of a secondary storm southeast of the Appalachians. Neither the then-operational U.S. National Meteorological Center Limited-Area Fine Mesh Model (LFM) nor the recently introduced Nested Grid Model (NGM) was successful in simulating the redevelopment. On the other hand, the step-mountain model, with a space resolution set up to mimic that of the NGM, successfully simulated the ridging that indicates the redevelopment.

Eric Rogers, Thomas L. Black, Dennis G. Deaven, Geoffrey J. DiMego, Qingyun Zhao, Michael Baldwin, Norman W. Junker, and Ying Lin

Abstract

This note describes changes that have been made to the National Centers for Environmental Prediction (NCEP) operational “early” eta model. The changes are 1) a decrease in horizontal grid spacing from 80 to 48 km, 2) incorporation of a cloud prediction scheme, 3) replacement of the original static analysis system with a 12-h intermittent data assimilation system using the eta model, and 4) the use of satellite-sensed total column water data in the eta optimum interpolation analysis. When tested separately, each of the four changes improved model performance. A quantitative and subjective evaluation of the full upgrade package during March and April 1995 indicated that the 48-km eta model was more skillful than the operational 80-km model in predicting the intensity and movement of large-scale weather systems. In addition, the 48-km eta model was more skillful in predicting severe mesoscale precipitation events during the March–April 1995 period than the 80-km eta model, the Nested Grid Model, or the NCEP global spectral model. This new version of the operational early eta system was implemented in October 1995.

E. Kalnay, M. Kanamitsu, R. Kistler, W. Collins, D. Deaven, L. Gandin, M. Iredell, S. Saha, G. White, J. Woollen, Y. Zhu, M. Chelliah, W. Ebisuzaki, W. Higgins, J. Janowiak, K. C. Mo, C. Ropelewski, J. Wang, A. Leetmaa, R. Reynolds, Roy Jenne, and Dennis Joseph

The NCEP and NCAR are cooperating in a project (denoted “reanalysis”) to produce a 40-year record of global analyses of atmospheric fields in support of the needs of the research and climate monitoring communities. This effort involves the recovery of land surface, ship, rawinsonde, pibal, aircraft, satellite, and other data, and the quality control and assimilation of these data with a data assimilation system that is kept unchanged over the reanalysis period 1957–96. This eliminates perceived climate jumps associated with changes in the data assimilation system.

The NCEP/NCAR 40-yr reanalysis uses a frozen state-of-the-art global data assimilation system and a database as complete as possible. The data assimilation and the model used are identical to the global system implemented operationally at NCEP on 11 January 1995, except that the horizontal resolution is T62 (about 210 km). The database has been enhanced with many sources of observations not available in real time for operations, provided by different countries and organizations. The system has been designed with advanced quality control and monitoring components and can produce 1 month of reanalysis per day on a Cray YMP/8 supercomputer. Different types of output archives are being created to satisfy different user needs, including a “quick look” CD-ROM (one per year) with six tropospheric and stratospheric fields available twice daily, as well as surface, top-of-the-atmosphere, and isentropic fields. Reanalysis information and selected output are also available online via the Internet (http://nic.fb4.noaa.gov:8000). A special CD-ROM, containing 13 years of selected observed, daily, monthly, and climatological data from the NCEP/NCAR Reanalysis, is included with this issue. Output variables are classified into four classes, depending on the degree to which they are influenced by the observations and/or the model. For example, “C” variables (such as precipitation and surface fluxes) are completely determined by the model during the data assimilation and should be used with caution. Nevertheless, a comparison of these variables with observations and with several climatologies shows that they generally contain considerable useful information. Eight-day forecasts, produced every 5 days, should be useful for predictability studies and for monitoring the quality of the observing systems.
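
As a rough consistency check on the quoted resolution (the grid figures here are standard values for this truncation, not taken from the article): T62 spectral truncation is conventionally paired with a 192 × 94 Gaussian grid, so the zonal spacing is 360°/192 = 1.875°, and 1.875° × 111 km per degree ≈ 208 km, in line with the stated “about 210 km.”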

The 40 years of reanalysis (1957–96) should be completed in early 1997. A continuation into the future through an identical Climate Data Assimilation System will allow researchers to reliably compare recent anomalies with those in earlier decades. Since changes in the observing systems will inevitably produce perceived changes in the climate, parallel reanalyses (at least 1 year long) will be generated for the periods immediately after the introduction of new observing systems, such as new types of satellite data.

NCEP plans currently call for an updated reanalysis using a state-of-the-art system every five years or so. The successive reanalyses will be greatly facilitated by the generation of the comprehensive database in the present reanalysis.
