Search Results

T-W. Yu, M. Iredell, and D. Keyser


A neural network algorithm used in this study to derive Special Sensor Microwave/Imager (SSM/I) wind speeds from Defense Meteorological Satellite Program satellite-observed brightness temperatures is briefly reviewed. The SSM/I winds derived from the neural network algorithm are not only of better quality but also cover a larger area than those generated by the currently operational Goodberlet algorithm. The areas of increased coverage occur mainly over regions of active weather development, where the operational Goodberlet algorithm fails to produce good-quality wind data because of the atmosphere's high moisture content. These two main characteristics of the SSM/I winds derived from the neural network algorithm are discussed.
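The retrieval described above is, in essence, a nonlinear regression from a handful of SSM/I channel brightness temperatures to a scalar wind speed. A minimal sketch of such a single-hidden-layer network is shown below; the channel set, layer width, and weights are illustrative assumptions, not the operational coefficients.

```python
import numpy as np

def ssmi_wind_speed(tb, w1, b1, w2, b2):
    """Map SSM/I channel brightness temperatures (K) to a wind speed (m/s).

    tb : array of channel brightness temperatures, e.g. 19V, 22V, 37V, 37H.
    A single hidden tanh layer followed by a linear output, as in a generic
    neural-network retrieval; the weights here are placeholders.
    """
    hidden = np.tanh(w1 @ tb + b1)   # nonlinear hidden layer
    return float(w2 @ hidden + b2)   # linear output: wind speed

# Illustrative call with placeholder (untrained) weights.
rng = np.random.default_rng(0)
w1, b1 = rng.normal(size=(5, 4)), rng.normal(size=5)
w2, b2 = rng.normal(size=5), 0.0
speed = ssmi_wind_speed(np.array([210.0, 235.0, 215.0, 160.0]), w1, b1, w2, b2)
```

The contrast with a purely linear regression algorithm is the tanh hidden layer, which lets the fit bend in moist, high-signal regimes where a linear retrieval breaks down.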

SSM/I wind speed data derived from both the neural network algorithm and the operational Goodberlet algorithm are tested in parallel global data assimilation and forecast experiments over a period of about three weeks. The results show that use of the neural-network-derived SSM/I wind speed data leads to a greater improvement in the first-guess wind fields than use of the wind data generated by the operational algorithm. Similarly, comparison of the forecast results shows that use of the neural-network-derived SSM/I wind speed data in the data assimilation and forecast experiment yields better forecasts than the operational run that uses the SSM/I winds from the Goodberlet algorithm. These comparisons between the two parallel analyses and forecasts from the global data assimilation experiments are discussed.

Full access
Hyun-Sook Kim, Carlos Lozano, Vijay Tallapragada, Dan Iredell, Dmitry Sheinin, Hendrik L. Tolman, Vera M. Gerald, and Jamese Sims


This paper introduces a next-generation operational Hurricane Weather Research and Forecasting (HWRF) system developed at the U.S. National Centers for Environmental Prediction. The new system, HWRF–Hybrid Coordinate Ocean Model (HYCOM), retains the atmospheric component of the operational HWRF but replaces the feature-model-based Princeton Ocean Model (POM) with the eddy-resolving HYCOM. The primary motivation is to improve enthalpy fluxes at the air–sea interface by providing the best possible estimates of the balanced oceanic state, using data-assimilated Real-Time Ocean Forecast System products as oceanic initial conditions (IC) and boundary conditions.

A proof-of-concept exercise of HWRF–HYCOM is conducted by validating ocean simulations, followed by verification of hurricane forecasts. The ocean validation employs airborne expendable bathythermograph profiles sampled during Hurricane Gustav (2008). Storm-driven sea surface temperature changes agree with observations to within 0.1°C in the mean and 0.5°C in the root-mean-square difference. The in-storm deepening of the mixed layer and shoaling of the 26°C isotherm depth are similar to observations, but both are overpredicted by magnitudes comparable to those of their ICs. The forecast verification for 10 Atlantic hurricanes in 2008 and 2009 shows that HWRF–HYCOM improves intensity forecasts by 13.8% and reduces positive bias by 43.9% relative to HWRF–POM. The HWRF–HYCOM track forecast is statistically indistinguishable from that of HWRF–POM, except for days 4 and 5, when it shows 8% better skill. While this study proves the concept and yields more skillful hurricane forecasts, one clear conclusion is that the estimates of the IC, particularly of the oceanic upper layer, need to be improved.

M. Kanamitsu, J.C. Alpert, K.A. Campana, P.M. Caplan, D.G. Deaven, M. Iredell, B. Katz, H.-L. Pan, J. Sela, and G.H. White


A number of improvements were implemented on 6 March 1991 into the National Meteorological Center's global model, which is used in the global data assimilation system (GDAS), the aviation (AVN) forecast, and the medium-range forecast (MRF):

  • The horizontal resolution of the forecast model was increased from triangular truncation T80 to T126, which corresponds to an equivalent increase in grid resolution from 160 km to 105 km.
  • The use of enhanced orography has been discontinued and replaced by mean orography.
  • A new marine-stratus parameterization was introduced.
  • A new mass-conservation constraint was implemented.
  • The horizontal diffusion in the medium scales was reduced by adopting the Leith formulation.
  • A new, more accurate sea-surface temperature analysis is now used.

In this note, we discuss each of the changes and briefly review the new model performance.
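The stated equivalence between spectral truncation and grid resolution can be checked with simple arithmetic: a triangular truncation TN is typically paired with a quadratic ("alias-free") Gaussian grid of at least 3N + 1 longitudes, and the equatorial grid spacing follows from Earth's circumference. The sketch below assumes FFT-friendly grid sizes (256 and 384 longitudes), which is a convention of particular implementations rather than something stated in the text.

```python
EARTH_CIRCUMFERENCE_KM = 40075.0

def equiv_grid_spacing_km(n_trunc, n_lon=None):
    """Approximate equatorial grid spacing for triangular truncation TN.

    A quadratic Gaussian grid needs at least 3*N + 1 longitudes;
    operational models usually round this up to an FFT-friendly count
    (e.g. 256 for T80, 384 for T126).
    """
    if n_lon is None:
        n_lon = 3 * n_trunc + 1
    return EARTH_CIRCUMFERENCE_KM / n_lon

# With the FFT-friendly grid sizes, T80 gives ~157 km and T126 ~104 km,
# consistent with the quoted 160-km and 105-km equivalent resolutions.
print(round(equiv_grid_spacing_km(80, n_lon=256)))
print(round(equiv_grid_spacing_km(126, n_lon=384)))
```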

Gerhard Theurich, C. DeLuca, T. Campbell, F. Liu, K. Saint, M. Vertenstein, J. Chen, R. Oehmke, J. Doyle, T. Whitcomb, A. Wallcraft, M. Iredell, T. Black, A. M. Da Silva, T. Clune, R. Ferraro, P. Li, M. Kelley, I. Aleinov, V. Balaji, N. Zadeh, R. Jacob, B. Kirtman, F. Giraldo, D. McCarren, S. Sandgathe, S. Peckham, and R. Dunlap IV


The Earth System Prediction Suite (ESPS) is a collection of flagship U.S. weather and climate models and model components that are being instrumented to conform to interoperability conventions, documented to follow metadata standards, and made available either under open-source terms or to credentialed users.

The ESPS represents a culmination of efforts to create a common Earth system model architecture, and the advent of increasingly coordinated model development activities in the United States. ESPS component interfaces are based on the Earth System Modeling Framework (ESMF), community-developed software for building and coupling models, and the National Unified Operational Prediction Capability (NUOPC) Layer, a set of ESMF-based component templates and interoperability conventions. This shared infrastructure simplifies the process of model coupling by guaranteeing that components conform to a set of technical and semantic behaviors. The ESPS encourages distributed, multiagency development of coupled modeling systems; controlled experimentation and testing; and exploration of novel model configurations, such as those motivated by research involving managed and interactive ensembles. ESPS codes include the Navy Global Environmental Model (NAVGEM), the Hybrid Coordinate Ocean Model (HYCOM), and the Coupled Ocean–Atmosphere Mesoscale Prediction System (COAMPS); the NOAA Environmental Modeling System (NEMS) and the Modular Ocean Model (MOM); the Community Earth System Model (CESM); and the NASA ModelE climate model and the Goddard Earth Observing System Model, version 5 (GEOS-5), atmospheric general circulation model.
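The interoperability idea described above — components that conform to a common set of technical behaviors so that a generic driver can couple them without model-specific glue — can be illustrated abstractly. The sketch below mimics an initialize/run/finalize phase convention in plain Python; it illustrates the pattern only and is not the actual ESMF/NUOPC API, and the toy components and exchanged fields are invented for the example.

```python
from abc import ABC, abstractmethod

class Component(ABC):
    """A model component conforming to a shared phase convention."""

    @abstractmethod
    def initialize(self, state: dict) -> None: ...

    @abstractmethod
    def run(self, state: dict, dt: float) -> None: ...

    @abstractmethod
    def finalize(self, state: dict) -> None: ...

class ToyOcean(Component):
    def initialize(self, state):
        state["sst"] = 300.0                       # export: sea surface temp (K)
    def run(self, state, dt):
        # Cool slightly under the imported wind stress.
        state["sst"] -= dt * state.get("wind_stress", 0.0)
    def finalize(self, state):
        pass

class ToyAtmosphere(Component):
    def initialize(self, state):
        state["sst_seen"] = None
    def run(self, state, dt):
        state["sst_seen"] = state.get("sst")       # import the ocean's export
        state["wind_stress"] = 0.1                 # export our own field
    def finalize(self, state):
        pass

def drive(components, steps, dt):
    """Generic driver: works for any components obeying the convention."""
    state = {}
    for c in components:
        c.initialize(state)
    for _ in range(steps):
        for c in components:
            c.run(state, dt)
    for c in components:
        c.finalize(state)
    return state

state = drive([ToyOcean(), ToyAtmosphere()], steps=3, dt=1.0)
```

Because the driver depends only on the shared phase interface, any component honoring the convention can be swapped in — which is the guarantee the shared infrastructure provides for the real modeling systems listed above.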

E. Kalnay, M. Kanamitsu, R. Kistler, W. Collins, D. Deaven, L. Gandin, M. Iredell, S. Saha, G. White, J. Woollen, Y. Zhu, M. Chelliah, W. Ebisuzaki, W. Higgins, J. Janowiak, K. C. Mo, C. Ropelewski, J. Wang, A. Leetmaa, R. Reynolds, Roy Jenne, and Dennis Joseph

The NCEP and NCAR are cooperating in a project (denoted "reanalysis") to produce a 40-year record of global analyses of atmospheric fields in support of the needs of the research and climate monitoring communities. This effort involves the recovery of land surface, ship, rawinsonde, pibal, aircraft, satellite, and other data, and the quality control and assimilation of these data with a data assimilation system that is kept unchanged over the reanalysis period 1957–96. This eliminates perceived climate jumps associated with changes in the data assimilation system.

The NCEP/NCAR 40-yr reanalysis uses a frozen state-of-the-art global data assimilation system and a database as complete as possible. The data assimilation system and the model used are identical to the global system implemented operationally at NCEP on 11 January 1995, except that the horizontal resolution is T62 (about 210 km). The database has been enhanced with many sources of observations not available in real time for operations, provided by different countries and organizations. The system has been designed with advanced quality control and monitoring components, and can produce 1 month of reanalysis per day on a Cray YMP/8 supercomputer. Different types of output archives are being created to satisfy different user needs, including a "quick look" CD-ROM (one per year) with six tropospheric and stratospheric fields available twice daily, as well as surface, top-of-the-atmosphere, and isentropic fields. Reanalysis information and selected output are also available online via the Internet (http//…). A special CD-ROM, containing 13 years of selected observed, daily, monthly, and climatological data from the NCEP/NCAR reanalysis, is included with this issue. Output variables are classified into four classes, depending on the degree to which they are influenced by the observations and/or the model. For example, "C" variables (such as precipitation and surface fluxes) are completely determined by the model during the data assimilation and should be used with caution. Nevertheless, a comparison of these variables with observations and with several climatologies shows that they generally contain considerable useful information. Eight-day forecasts, produced every 5 days, should be useful for predictability studies and for monitoring the quality of the observing systems.
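The four-class scheme for output variables lends itself to a simple lookup when deciding how much to trust a field. In the sketch below, only the class-C examples (precipitation and surface fluxes) come from the text; the other variable-to-class assignments are placeholder assumptions for illustration.

```python
# Illustrative lookup of reanalysis output-variable classes.
VARIABLE_CLASS = {
    "geopotential_height": "A",  # assumption: strongly tied to observations
    "humidity": "B",             # assumption: mixed observation/model influence
    "precipitation": "C",        # model-determined during assimilation (per the text)
    "surface_fluxes": "C",       # model-determined during assimilation (per the text)
}

def needs_caution(var_name):
    """Return True for class-C fields, which should be used with caution."""
    cls = VARIABLE_CLASS.get(var_name)
    if cls is None:
        raise KeyError(f"unknown variable: {var_name}")
    return cls == "C"

print(needs_caution("precipitation"))  # True: a model-determined field
```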

The 40 years of reanalysis (1957–96) should be completed in early 1997. A continuation into the future through an identical Climate Data Assimilation System will allow researchers to reliably compare recent anomalies with those in earlier decades. Since changes in the observing systems will inevitably produce perceived changes in the climate, parallel reanalyses (at least 1 year long) will be generated for the periods immediately after the introduction of new observing systems, such as new types of satellite data.

NCEP plans currently call for an updated reanalysis using a state-of-the-art system every five years or so. The successive reanalyses will be greatly facilitated by the generation of the comprehensive database in the present reanalysis.
