The Operational Mesogamma-Scale Analysis and Forecast System of the U.S. Army Test and Evaluation Command. Part I: Overview of the Modeling System, the Forecast Products, and How the Products Are Used

Yubao Liu, National Center for Atmospheric Research, Boulder, Colorado
Thomas T. Warner, National Center for Atmospheric Research, Boulder, Colorado, and Department of Atmospheric and Oceanic Sciences, University of Colorado, Boulder, Colorado
James F. Bowers, U.S. Army Dugway Proving Ground, Dugway, Utah
Laurie P. Carson, National Center for Atmospheric Research, Boulder, Colorado
Fei Chen, National Center for Atmospheric Research, Boulder, Colorado
Charles A. Clough, U.S. Army Aberdeen Proving Ground, Aberdeen, Maryland
Christopher A. Davis, National Center for Atmospheric Research, Boulder, Colorado
Craig H. Egeland, U.S. Army Cold Regions Test Center, Fort Greely, Alaska
Scott F. Halvorson, U.S. Army Dugway Proving Ground, Dugway, Utah
Terrence W. Huck Jr., U.S. Army White Sands Missile Range, White Sands, New Mexico
Leo Lachapelle, U.S. Army Redstone Technical Test Center, Redstone Arsenal, Alabama
Robert E. Malone, U.S. Army Electronics Proving Ground, Fort Huachuca, Arizona
Daran L. Rife, National Center for Atmospheric Research, Boulder, Colorado
Rong-Shyang Sheu, National Center for Atmospheric Research, Boulder, Colorado
Scott P. Swerdlin, National Center for Atmospheric Research, Boulder, Colorado
Dean S. Weingarten, U.S. Army Yuma Proving Ground, Yuma, Arizona

Abstract

Given the rapid increase in the use of operational mesoscale models to satisfy different specialized needs, it is important for the community to share ideas and solutions for meeting the many associated challenges that encompass science, technology, education, and training. As a contribution toward this objective, this paper begins a series that reports on the characteristics and performance of an operational mesogamma-scale weather analysis and forecasting system that has been developed for use by the U.S. Army Test and Evaluation Command. During the more than five years that this four-dimensional weather system has been in use at seven U.S. Army test ranges, valuable experience has been gained about the production and effective use of high-resolution model products for satisfying a variety of needs. This paper serves as a foundation for the rest of the papers in the series by describing the operational requirements for the system, the data assimilation and forecasting system characteristics, and the forecaster training that is required for the finescale products to be used effectively.

Corresponding author address: Yubao Liu, NCAR/RAL, P.O. Box 3000, Boulder, CO 80307-3000. Email: yliu@ucar.edu

1. Introduction

The mission of the U.S. Army proving grounds and test ranges is to provide facilities and other support for evaluating the performance of matériel that is being considered for procurement by the U.S. Department of Defense. Because most of the tests have weather-related environmental and safety constraints, forecasts are required for test scheduling and nowcasts are required for test conduct. After tests have been completed, estimates are needed of the meteorological conditions that prevailed at the time and location of the test. These posttest analyses of meteorological conditions affecting test results require a model-based data assimilation system to dynamically interpolate between observations when it is not possible to place sensors at the test location (e.g., at a blast site or along a missile trajectory). There are also numerous range safety needs for accurate test-related weather forecasts. For example, fueling and munitions-handling operations cannot take place during convective weather events with lightning, surface-to-air missiles cannot be tested when upper-tropospheric winds will carry impact debris off the range, and high explosives cannot be tested when low-level winds and temperatures will allow high-intensity sound waves to cause damage in populated areas near the range.

In 1996, the U.S. Army Test and Evaluation Command (ATEC, then known as “TECOM”) tasked the National Center for Atmospheric Research (NCAR) with evaluating the capabilities of its units that provide forecasting and other meteorological support services at the U.S. Army proving grounds and test ranges. This evaluation identified a number of factors that made it difficult for the meteorological units to perform their mission: 1) data quality control, archival, and retrieval capabilities required improvement; 2) display systems for integrated interpretation of meteorological fields were not available; 3) there were no operational modeling systems for forecasting the mesogamma- and mesobeta-scale motions that sometimes dominate the range meteorological conditions; and 4) special-applications models for defining transport and diffusion, sound propagation, and so on, were driven by observations (often displaced substantially from the test in time or space) rather than model output. In response to the results of this needs assessment, NCAR and ATEC collaborated on the development and implementation of a completely new meteorological support infrastructure called the Four-Dimensional Weather (4DWX) system. The 4DWX system at each test range is tailored to meet the specific needs of that range. Previous papers based on experience with this system include Davis et al. (1999) on mesoscale predictability, Warner and Hsu (2000) on cloud-resolving modeling, Rife et al. (2002) on thermally forced circulations in the Great Basin Desert, Rife et al. (2004) on near-surface wind predictability, and Warner et al. (2004) on emergency-response applications. This paper is the first in a series of papers that provides a unified and comprehensive description of the modeling technology, the challenges associated with the operational use of the forecasting system, and the scientific insights gained by its use.

The U.S. Army test ranges are typically located where there is strong local forcing from complex orography or coastlines, resulting in myriad mesoscale processes (Rife et al. 2002). These processes include coastal breezes; orographic effects such as terrain-slope flows and lee waves; and circulations resulting from inland differences in landscape type, such as the “salt-flat breezes” that are forced by thermal contrasts between salt flats and their surroundings. In addition, orography that prevails near the ranges in the western United States initiates convection, and the associated lightning must be forecast. Thus, these mesogamma-scale processes, in addition to the traditional processes on the synoptic scale, need to be forecast accurately. Each of the following ATEC ranges is served by its own version of this operational system: Dugway Proving Ground (DPG), Dugway, Utah; White Sands Missile Range (WSMR), White Sands, New Mexico; Aberdeen Proving Ground (APG), Aberdeen, Maryland; Yuma Proving Ground (YPG), Yuma, Arizona; Cold Regions Test Center (CRTC), Fort Greely, Alaska; Electronics Proving Ground (EPG), Fort Huachuca, Arizona; and Redstone Technical Test Center (RTTC), Redstone Arsenal, Alabama. The meteorological units at these ranges each employ from one to seven civilian forecasters.

Three types of range weather forecasts are needed. For long-horizon test scheduling, range climatological data are required to determine the season, time of day, and location on the range (the microclimate) that will have the highest probability of providing the necessary test conditions. This need is satisfied through the use of a software interface to a long period of archived observations, and to gridded range climatologies that are based on the high-resolution analyses produced by the model data assimilation system. The second type of forecast is of the range conditions that will prevail during the next 24–48 h. This information, provided by forecasters using the mesoscale model, is employed in decisions about whether to set up test equipment and schedule support staff. Last, very short-range forecasts, or analyses of current conditions, are used as the basis for final go/no-go decisions. Such analyses are a product of the model-based data assimilation system.

In addition to weather forecasting services for test support, most ranges provide meteorological data, obtained from instruments or models, directly to those conducting tests or to the range safety unit. These data are often used as input to special-applications models for calculating transport and diffusion of smoke or simulants, parachute drift, sound propagation, ballistic trajectories, and missile debris drift. In addition to test-related forecasts, all range meteorological units provide forecasts of local hazardous weather, wind chill, and heat indices that relate to the safety of personnel working outdoors in extreme environments.

It is very difficult to quantify the efficiency and safety benefits of this modernized forecasting capability. However, it is possible to show that the impacts of inaccurate weather forecasts on test operations are large. For example, DPG tests that involve chemical–biological simulant releases are among the most dependent on meteorological conditions, with go/no-go decisions typically made 12–36 h in advance, and decisions to reconfigure the sensor grid for a different wind direction made 24–36 h in advance. The cost per test day to deploy all DPG test personnel and systems (e.g., support aircraft) to the field can range from $10,000 to well over $100,000, depending on the scope of the test. Thus, unproductive time spent in the field awaiting favorable meteorological conditions can represent a significant fraction of overall field-test costs. Over a typical test period of 1–2 months, even a modest improvement in weather-based test go/no-go decisions can save many thousands of dollars. At WSMR, missile launches are dependent on winds aloft to ensure that any debris from aborted launches falls on government-controlled property. The WSMR starts charging for range time 24 h before a launch, and the hourly charges can become very expensive as t − 0 h is approached. If a launch is scrubbed at t − 0 h because of unforecast unfavorable winds, the lost costs for range support can exceed $1,000,000. Thus, test programs hope to make accurate weather-based go/no-go launch decisions 24 h in advance. In addition to the desire to avoid the expense of unproductive range time at WSMR, there are other scheduling concerns. Some types of launches require increased airspace restrictions and/or evacuations of private property that are only allowed several times per year. An erroneous go decision for one of these launches could result in a significant delay in the next launch attempt if it uses up that year’s quota of airspace restrictions or private property evacuations. Mission-specific dependencies on weather forecasts similarly affect the costs of field tests at the other test ranges.

Given the rapid growth in the use of operational mesoscale models to satisfy different specialized needs, it is especially important for the community to share ideas and solutions for meeting the many unmet challenges, which encompass science, technology, education, and training. As a contribution to this goal, this paper is the first in a series of four papers that reports on the characteristics and performance of the ATEC 4DWX analysis and forecasting system, and the experiences of the developers and system users. Liu et al. (2008, hereinafter Part II) addresses verification of the forecasts for each range, where the objective is to use conventional metrics to characterize the degree to which forecast accuracy varies from range to range, within the diurnal cycle, with elapsed forecast time, and among the seasons. Another rapidly evolving aspect of mesoscale modeling is the coupling of the meteorological forecast model with special-applications models. Sharman et al. (2008, hereinafter Part III) of this series describes examples of how the meteorological forecast variables have been used for the calculation of transport and diffusion, parachute trajectories, sound propagation, and missile trajectories. Last, although there are many requirements for very short-range forecasts of convective rainfall and lightning, full-physics convection-resolving models often have inadequate skill in such situations. To satisfy these needs, an automated algorithmic system has been developed by NCAR to serve the National Weather Service, the Federal Aviation Administration, and ATEC. Saxen et al. (2008, hereinafter Part IV) describe how this convection auto-nowcast system meets specialized needs at WSMR.

This paper provides an overall summary of the analysis and forecasting system and its use at the ranges. Section 2 describes the specific operational requirements for the system at each range, section 3 summarizes the analysis and forecasting system itself, section 4 reviews some of the experiences and methods associated with training forecasters in the use of mesogamma-scale products, and section 5 provides a summary.

2. Operational requirements and forecasting challenges at ATEC ranges

Table 1 summarizes the environment, critical forecast variables, and required forecast periods at each ATEC range. As shown by the table, weather must be forecast for a wide range of climate regimes. The APG is in a temperate, humid, coastal setting; CRTC has a subarctic climate; DPG is in a midlatitude “cold” desert; YPG, EPG, and WSMR are in subtropical hot deserts; and RTTC is in the humid subtropics. Model-based analyses, as well as forecasts of 24–36-h duration, are required especially for boundary layer variables. Thus, surface-forced circulations associated with coastlines, elevated terrain, and substrate–vegetation contrasts must be forecast well. Some range operations also need forecast winds in the free troposphere and lower stratosphere. The following paragraphs describe a few of the many specific forecast requirements and challenges for each range.

At the Dugway Proving Ground, most field tests involve the transport and dispersion of simulants for chemical and biological agents, usually within the nocturnal boundary layer. Because light winds are common for these tests, locally forced circulations dominate the calm synoptic-scale environment. Thus, upslope and drainage flows, lake and land breezes, and salt-flat breezes must be forecast. Although wind direction is especially challenging to forecast in these light-wind situations, it is nevertheless an important variable because sensor-grid configurations are wind direction–dependent. Also, because low-level static stability is important to dispersion, the mesoscale model must reasonably represent the surface energy budget.

At the White Sands Missile Range, wind analyses and forecasts throughout the free troposphere and stratosphere, and within the boundary layer near launch and impact points, are required for guided and unguided missile tests. In addition to anticipating wind effects on missile trajectories, winds are needed for calculating the drift of debris resulting from aborted launches and missile impacts with airborne targets. Mountain effects make the wind forecasts especially challenging. Atmospheric electrical activity must be forecast when munitions are exposed and fueling operations are taking place. The existence of clouds must also be predicted because some missiles must be tracked visually.

At the Aberdeen Proving Ground, low-level winds and temperatures must be analyzed and forecast for use in models that simulate the propagation of sound from explosions to determine when the intensity in residential areas exceeds a specified threshold. Similar variables must be predicted for calculating projectile trajectories. Because the range is adjacent to the complex coastline of the Chesapeake Bay, coastal breezes are important and can be challenging to forecast accurately.

At the Yuma Proving Ground, precision air drops by parachute require tropospheric wind forecasts, as do calculations of anticipated ballistic-projectile trajectories. Near-surface temperature in this extreme thermal environment is also an important test-scheduling criterion and forecast variable. Nearby complex orography produces local mesoscale effects in the wind field that must be predicted.

At the Cold Regions Test Center, the onset, durations, and locations of low-temperature extrema (−40°F) must be predicted for test scheduling. Models must therefore be capable of predicting cloud cover and the absence of the typical drainage flows on the north slope of the Alaska Range.

At the Electronics Proving Ground, convection must be forecast because some electronics equipment cannot be tested during periods of atmospheric electrical activity. Also, winds need to be predicted to support the use of a large surveillance blimp.

At the Redstone Technical Test Center, lightning and low relative humidity (because of static electricity buildup) pose a hazard when munitions are exposed and being handled, and therefore these conditions must be forecast. Winds aloft, icing conditions, turbulence, and cloud height impact the operation of unmanned aerial vehicles and must be predicted. For many activities, rain and visibility must be forecast.

3. Description of the data assimilation and forecasting system

Before describing the general properties of the modeling system, it is worth emphasizing that many system capabilities are routinely being updated. This ongoing enhancement is a result of a relatively unusual working relationship between the system developers and ATEC users. In particular, there is a continual process in which a prototype system is fielded, developers work side by side with the users to identify necessary modifications, and the next generation in the system’s evolutionary cycle is completed and fielded. Another benefit of this ongoing and close developer–user relationship is the relatively short time to deployment of new capabilities. That is, there is a steady stream of new tools that flows from the developer’s laboratory to the desks of the range forecasters. This approach is in stark contrast to the alternative acquisition process wherein a “frozen” system is delivered, and the next capability upgrade takes place years later when a completely new and different system is delivered. There are special challenges associated with such a rapid refresh of the technology. One is that any changes to the meteorological products, and to the interface through which they are made available, require that forecasters be retrained (see section 4). Also, there are trade-offs between the rate at which systems are updated and their stability (i.e., there is less time for prerelease testing).

Figure 1 depicts the general configuration of the ATEC forecasting system, which utilizes two personal computer (PC) clusters. One is the Model Applications Cluster (MAC) on which the mesoscale model runs. The other is the Data Applications Cluster (DAC) that is responsible for all processing of system input and output data. Data are input through multiple mechanisms. Standard observations and National Centers for Environmental Prediction (NCEP) model products are obtained through a National Oceanic and Atmospheric Administration system (NOAAPort) at each range, range observations are obtained directly from local networks, and other data are obtained through ftp transfer. The configuration of the 4DWX system differs somewhat from range to range, depending on the particular needs. For example, because a convection nowcasting capability is required at WSMR, the NCAR Auto-Nowcast System (described in Part IV) is part of the suite of capabilities at this particular range.

a. Data assimilation system and forecast system design

The forecast and data assimilation components of the 4DWX system are based on the same mesoscale model. The data assimilation engine ingests data as they become available, producing model-assimilated datasets that both define the current conditions on the ranges and serve as initial conditions for the model forecasts. The other component produces forecasts, initiated every 3 h, of 18–39-h duration, depending on the range.

1) The configuration of the modeling systems

The 4DWX modeling system is currently based on the fifth-generation Pennsylvania State University–National Center for Atmospheric Research Mesoscale Model (MM5, version 5; Grell et al. 1995), although the transition to the Weather Research and Forecast (WRF) model is under way. The general model configuration is summarized as follows, and details about specific model characteristics can be found in the NCAR technical note on MM5 (Grell et al. 1995). The model has nonhydrostatic dynamics, a two-way interactive nesting procedure with coarse grids that provide boundary conditions for fine grids running at smaller time steps and with feedback from fine grids to coarse grids, and a radiative upper-boundary condition that mitigates noise resulting from the reflection of vertically propagating waves. It also has time-dependent lateral-boundary conditions, relaxed toward large-scale model forecasts (Eta; Global Forecast System). A nudging zone of five rows and five columns is specified at the model lateral boundaries, with a nudging weight that allows the model-variable tendencies to relax gradually to the larger-scale model forecasts along the boundary. The Grell (1993) cumulus parameterization on 10-km grid increment, or larger, grids is used, along with the Reisner et al. (1998) mixed-phase microphysics parameterization, which includes explicit prognostic equations for cloud water, rainwater, ice particles, and snow processes. The model uses the Modified Medium-Range Forecast (MRF) model (Hong and Pan 1996; Liu et al. 2006a) boundary layer parameterization. The MRF parameterization is a nonlocal mixing scheme. The Richardson number is used to determine the depth of the boundary layer. Cloud effects on radiation (Dudhia 1989) are allowed for shortwave radiation, and the Rapid Radiative Transfer Model (Mlawer et al. 1997) is used for longwave radiation. The “Noah” land surface model (a modified Oregon State University land surface model; Chen and Dudhia 2001a, b) with four soil layers is used. Soil moisture and soil temperature are predicted at each grid point based on substrate and atmospheric properties. The model has a land surface data assimilation system (Chen et al. 2004) that diagnoses current substrate moisture and temperature using in situ and remotely sensed data. The model has 36 computational levels, with approximately 12 levels within the lowest 1 km.
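For reference, the bulk Richardson number that the MRF scheme uses to diagnose the boundary layer top can be written schematically (after Hong and Pan 1996; the notation here is illustrative rather than the exact 4DWX implementation) as

    \mathrm{Ri_B}(z) = \frac{g\,z\,[\theta_v(z) - \theta_s]}{\theta_{va}\,|U(z)|^{2}},

where θ_v(z) is the virtual potential temperature at height z, θ_s is the appropriate near-surface virtual potential temperature (including a thermal excess under unstable conditions), θ_va is a reference virtual potential temperature in the lowest model layer, and U(z) is the wind speed. The boundary layer depth is taken as the lowest height at which Ri_B(z) reaches a prescribed critical value.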

Table 2 summarizes how the grid structure and forecast length vary among the ranges (shown in Fig. 2) to address range-dependent factors, such as the forecast length required for test support, the geographic area of the range that needs to be encompassed by the finest model grid, and the processor speed and number of processors of the PC cluster used. Some nonstandard adaptations to the MM5 system are used. For example, land surface categories for the salt flats around DPG and the white gypsum sands and lava flows near WSMR have been added to the standard MM5 table of physical properties.

2) The data assimilation technique employed

There are a number of different three- and four-dimensional data assimilation approaches that can be employed for producing analyses. After assessing the alternatives, the Newtonian relaxation method was selected. Four-dimensional variational techniques are presently computationally prohibitive for operational applications at mesogamma-scale resolutions, while three-dimensional variational approaches cannot take good advantage of high-frequency data that are available from various sources.

Data assimilation by Newtonian relaxation is accomplished by adding nonphysical nudging terms to the model predictive equations. These terms force the model solution at each grid point to observations, or analyses of observations, in proportion to the difference between the model solution and the data or analysis. This approach was chosen because it is relatively efficient computationally, it is robust, it allows the model to ingest data continuously rather than intermittently, the full model dynamics are part of the assimilation system so that analyses contain all locally forced mesoscale features, and it does not unduly complicate the structure of the model code. The implementation of Newtonian relaxation in the 4DWX system forces the model solution toward observations (observation or station nudging; Stauffer and Seaman 1994) rather than toward analyses of the data. This approach was chosen because observations on the mesoscale are sometimes sparse and typically are not very uniformly distributed in space, making objective analysis difficult. With station nudging, each observation is ingested into the model at its observed time and location, with proper space and time weights, and the model spreads the information in time and space according to the model dynamics.
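For concreteness, the observation-nudging term can be written schematically (following Stauffer and Seaman 1994; this is a generic form rather than the exact 4DWX formulation):

    \frac{\partial \alpha}{\partial t} = F(\alpha, \mathbf{x}, t) + G_{\alpha}\, \frac{\sum_{i=1}^{N} W_i^{2}(\mathbf{x}, t)\,[\alpha_{o,i} - \hat{\alpha}(\mathbf{x}_i, t)]}{\sum_{i=1}^{N} W_i(\mathbf{x}, t)},

where α is a prognostic variable, F represents all of the physical model tendencies, G_α is the nudging coefficient, N is the number of observations within the radius of influence, W_i is the space–time weight of observation i, α_{o,i} is the observed value, and \hat{α}(x_i, t) is the model solution interpolated to the observation location.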

Studies using Newtonian relaxation include Stauffer and Seaman (1994), Stauffer et al. (1991), Seaman et al. (1995), and Fast (1995). A common finding of these studies is that analysis nudging works better than intermittent assimilation on synoptic scales. Stauffer and Seaman (1994) and Seaman et al. (1995) showed that nudging toward observations was more successful on the mesoscale than nudging toward analyses. Leslie et al. (1998) found that the impact of observation nudging was similar to that of assimilating the same data in a four-dimensional variational data assimilation system, noting that the former was practicable while the latter was too computationally expensive.

Even though the basic nudging approach used in the ATEC system is based on Stauffer and Seaman (1994), several extensions have been added to enhance its performance. All near-surface observations are extrapolated to the lowest model computational layer using similarity theory. The influence of these data is then distributed throughout the model-simulated boundary layer using a vertical weighting function. See Liu et al. (2006b) for more details. To enable the model to retain the observation information, single-level upper-air observations, such as commercial aircraft data and satellite cloud-track winds, have their influence spread over a few model layers, with appropriate weights, rather than being applied to a single layer. Because of the small correlation between weather variables below and above the mixed-layer top, the vertical influence of single-level observations is not allowed to spread across this boundary. When assigning the weight of an observation to the nudging term at each grid point, the horizontal-influence functions are constructed to account for the blocking effect of elevated terrain. For each grid point, the terrain–elevation difference between the grid point and observation stations is calculated, and if this is larger than a specified amount the observation will not be allowed to affect the grid point. This is especially important because most of the inner grids of the forecast models have complex orography. More details can be found in Xu et al. (2002).
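A minimal sketch of the terrain-blocking check described above follows; the function name, the Cressman-style distance weighting, and the 500-m elevation threshold are illustrative assumptions, not the operational settings.

```python
def horizontal_weight(dist_km, radius_km, grid_elev_m, obs_elev_m,
                      max_elev_diff_m=500.0):
    """Horizontal weight of one observation at one grid point, zeroed when
    elevated terrain separates the two.  The elevation-difference threshold
    and the Cressman-style distance weighting are illustrative only."""
    if abs(grid_elev_m - obs_elev_m) > max_elev_diff_m:
        return 0.0        # terrain blocks the observation's influence
    if dist_km >= radius_km:
        return 0.0        # outside the radius of influence
    return (radius_km**2 - dist_km**2) / (radius_km**2 + dist_km**2)
```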

Rawinsonde and profiler soundings are not treated as columns of independent point observations, each of which would influence adjacent model levels with a weight related to its vertical displacement. Rather, the observation innovations are interpolated directly to the model levels, with full weighting applied, as would be the case if the observation level corresponded with the model level. In effect, the vertical coherence of the data points is recognized. Sensitivity experiments were conducted to compare the model performance when soundings were treated as individual point observations and when they were used in the way described here, and the method that took advantage of the sounding’s vertical coherence generated better forecasts.
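A minimal sketch of forming innovations on the model levels directly from a sounding is given below; the array names are hypothetical and the operational code differs in detail.

```python
import numpy as np

def sounding_innovations(model_z, model_val, obs_z, obs_val):
    """Interpolate a sounding (heights obs_z, values obs_val, ordered bottom
    to top) onto the model levels and form innovations there, rather than
    spreading each sounding level to neighboring model levels.  Model levels
    outside the sounding's vertical range get no innovation (NaN)."""
    obs_on_model = np.interp(model_z, obs_z, obs_val, left=np.nan, right=np.nan)
    return obs_on_model - np.asarray(model_val, dtype=float)
```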

In determining the influence of an observation on the nudging term at grid points, multiple scans with different influence radii are used to account for different scales of motion. This multiple scan approach is consistent with the concept of the classical successive-correction objective-analysis method (Cressman 1959). A data quality estimate is used as a factor in defining the influence of each data point on the nudging coefficient. This factor is estimated by comparing the statistical forecast error and the difference between the model solution and the observation for the instance in question. This is consistent with the fact that Newtonian relaxation is intended to be a weak constraint. More details about the data quality estimate and its relation to the nudging coefficient can be found in Liu et al. (2004).
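A minimal sketch combining the multiple-scan idea with a data-quality factor follows; the radii and the simple averaging of scans are illustrative assumptions, not the operational configuration.

```python
def multiscan_weight(dist_km, radii_km=(240.0, 120.0, 60.0), quality=1.0):
    """Average Cressman-style weights computed with several influence radii,
    so that an observation contributes on more than one scale of motion, and
    scale the result by a data-quality factor in [0, 1].  Radii are
    illustrative only."""
    w = 0.0
    for r in radii_km:                 # coarse-to-fine scans
        if dist_km < r:
            w += (r**2 - dist_km**2) / (r**2 + dist_km**2)
    return quality * w / len(radii_km)
```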

3) The data assimilation and forecast cycle

Figure 3 illustrates the data assimilation and forecast cycle. The data assimilation process is complicated by the fact that some observations made during the assimilation period arrive too late to be used to initialize the forecasts. However, it is still desirable to have these data included in model-assimilated datasets that define retrospective conditions at test sites or are used to construct gridded range climatologies. Thus, there are two types of data assimilation cycles. The first (the preliminary analysis), which is used to initialize the model, does not wait for late data. The second (the final analysis) assimilates data over the same period, but is delayed until all data are in. Not only does this more complete assimilation produce datasets for the retrospective uses noted above, it creates a restart point for the next assimilation–forecast cycle. For example, Fig. 3 shows that shortly after t h on the real-time axis, the data assimilation system has gridded data for t − 1 h available from a final analysis cycle (black arrow). The system begins integrating forward and assimilating available data (gray arrow) until the forecast initiation time of t + 2 h. At some time during this period, there will be no more data available to ingest because of normal delays in data transmission and because the simulated time will exceed the real time. Thus, by the end of this assimilation period, a short forecast will have been produced. This 3-h period of assimilated data is called the preliminary analysis because some data applicable during the period have not been incorporated. The model forecast is then initiated based on this analysis. That is, the relaxation term is turned off; there is no further forcing of the model toward observations. At real time t + 3 h, after all the data for the period t − 1 h to t + 2 h have come in, the final analysis is produced for that period. This cycling continues, except that, once per week, a conventional objective analysis is performed and the system is cold started with updated sea surface temperatures. Even though the rationale for the weekly cold start is that it prevents the accumulation of error in data-sparse areas, there has been no evidence of this problem.
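The timing of one cycle can be summarized with a small sketch (hours are relative to the cycle's nominal time t; the function name and the 36-h forecast length are illustrative, since the operational forecast length is range dependent):

```python
def cycle_windows(t_h, forecast_length_h=36):
    """Model-time windows (h) for one 3-hourly 4DWX cycle, following Fig. 3."""
    return {
        # Re-assimilation of the previous cycle's window after late data have
        # arrived; this run also writes the restart file for the next cycle.
        "final_analysis": (t_h - 4, t_h - 1),
        # Assimilation of whatever data are already in hand; late data are skipped.
        "preliminary_analysis": (t_h - 1, t_h + 2),
        # Free forecast (nudging terms switched off) from the preliminary analysis.
        "forecast": (t_h + 2, t_h + 2 + forecast_length_h),
    }
```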

4) Data sources employed

The ranges have a rich array of data that can be used for scientific purposes, directly for mission support, and for ingest by the data assimilation system. Table 3 lists the instrument platforms at each range, some of which provide data on a regular basis while others are used only during certain tests. The data sources utilized by the data assimilation system include the traditional hourly surface reports [aviation routine weather report (METAR), ship, buoy, and special] and the twice-daily upper-air rawinsondes. Also used are high-frequency measurements from various special networks. These data include those from the University of Utah’s MesoWest system, which integrates and disseminates data from many public and private networks (Horel et al. 2002); data from various other surface mesonets assembled by NOAA/Earth System Research Laboratory (ESRL); wind profiler data from NOAA/ESRL’s National Profiler Network and Cooperative Agency Profilers network (Neiman et al. 1992); hourly cloud-track winds derived from infrared, visible, and water vapor imagery (from the Geostationary Operational Environmental Satellite; Gray et al. 1996; Nieman et al. 1997); aircraft reports (from Aircraft Communications Addressing and Reporting System and Aircraft Meteorological Data Reporting System) processed and disseminated by NOAA/ESRL (Moninger et al. 2003); Quick Scatterometer (QuikSCAT) SeaWinds sea surface winds (Ebuchi et al. 2002) from the National Aeronautics and Space Administration (NASA), disseminated by the NOAA/National Environmental Satellite, Data, and Information Service (NESDIS); Tropospheric Airborne Meteorological Data Reporting data disseminated by NASA and AirDat, LLC; and data from various observation platforms at the test ranges, including surface meteorological stations, boundary layer profilers, and rawinsondes released routinely during test periods. Table 4 shows an example of the numbers of observations from various platforms that were ingested by the data assimilation system at one range during the indicated, typical, 1-h period.

b. Forecast products and user interface

There are two complementary graphical interfaces to meteorological products that are used by forecasters. The first graphical user interface is a Web-based system that allows forecasters to access static and animated imagery when they are away from the weather station. This Web interface enables them to provide test support while collocated with testers at sites on the range, or anywhere worldwide. Also, because commute times are long to some of the ranges, forecasts can be prepared from home on weekends. Bandwidth, however, limits the complexity of these Web-based products. Figure 4 shows an example of the Web interface as used by WSMR meteorologists to provide test site support for a missile launch from California. Conditions over a range of scales can be viewed by selecting the different computational grids in this nested system. A variety of combinations of different overlaid fields, and model soundings, can be selected and viewed as static or animated images.

The second graphical user interface, which is Java based, is much more flexible than the Web-based interface, allowing the user to customize displays with a variety of overlaid model-generated fields, observed images (e.g., satellite), and measurements. Figure 5 shows an example of a display for North America, where model output from DPG and APG has been composited into a single image. The plan-view image is of 890-hPa temperature from the model final analysis, the window over the mid-Atlantic states displays an infrared satellite image, and the east–west-oriented solid line defines the axis of a vertical cross section of temperature that is inset in the lower right. The slide bar on the time axis that is exposed to the left of the cross section is used to control the time of the display, which can be either a model-based analysis of current conditions or a forecast. Below that, the user can choose the ATEC ranges to use in the display and the variables to be depicted. The dots to the right of the chosen variables are color coded to indicate the times for which the data are available. The mouse can be used to define the boundaries of any display area, and the highest-resolution model data available from any range will automatically be used to define the current or forecast conditions.

Other miscellaneous, nongraphical products include tabulations of analyzed and forecast values of the standard meteorological variables that have been interpolated to preselected locations, such as standard test sites. This makes it easy for forecasters to define point values of the variables, and it is easy for the system to apply statistical corrections to the model output. The model also produces datasets in formats needed for input to special-applications models that calculate noise propagation, transport and diffusion, parachute drift, etc.
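As an illustration of interpolating gridded output to a preselected test site, a minimal bilinear-interpolation sketch is shown below; the array layout and names are assumptions, and the operational product generator is more elaborate.

```python
import numpy as np

def interp_to_site(field, x_frac, y_frac):
    """Bilinear interpolation of a 2-D gridded field to a point at fractional
    grid coordinates (x_frac, y_frac), e.g. a fixed test site location."""
    i, j = int(np.floor(x_frac)), int(np.floor(y_frac))
    dx, dy = x_frac - i, y_frac - j
    return ((1 - dx) * (1 - dy) * field[j, i] + dx * (1 - dy) * field[j, i + 1]
            + (1 - dx) * dy * field[j + 1, i] + dx * dy * field[j + 1, i + 1])
```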

c. Additional software tools

The ATEC system includes several software subsystems that produce products and services that allow forecasters and system administrators to perform their jobs more efficiently. For example, the Custom-Query Tool (CQT) allows forecasters to interrogate a relational database that consists of archives of model-based analyses, as well as observations. The CQT can be used to reconstruct meteorological conditions for previous tests, or it can define the season and location on the range where particular test requirements are most likely to be met. Conditional queries also can be constructed. For example, the database can be used to compute the mean temperature in July at a particular test site when the 10-m wind speed is less than 2 m s−1, between the hours of 0400 and 0800 UTC.
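A minimal sketch of the conditional query quoted above, written against a hypothetical archive table, follows; the column names are assumptions, not the CQT schema.

```python
import pandas as pd

def mean_july_temp_light_wind(df: pd.DataFrame) -> float:
    """Mean 2-m temperature in July at one test site when the 10-m wind speed
    is below 2 m/s between 0400 and 0800 UTC.  `df` has one row per archived
    analysis hour, with hypothetical columns "time" (UTC datetimes), "t2m_c",
    and "wspd10m_ms"."""
    sel = ((df["time"].dt.month == 7)
           & (df["wspd10m_ms"] < 2.0)
           & df["time"].dt.hour.between(4, 8))
    return df.loc[sel, "t2m_c"].mean()
```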

The Meteorological-Condition Alert Tool (MAT) automatically notifies forecasters by e-mail, cell phone, or pager when a preset condition is detected in the observations or a forecast. For example, a WSMR forecaster may be responsible for supporting a missile launch where surface wind speeds cannot exceed 10 m s−1 at the launch site and time. This criterion could be set within the MAT, which would notify the forecaster if a forecast for the test time contained a wind near the test site that exceeded the threshold. Similarly, the forecaster could be notified if a lightning strike was detected within some preset distance from high explosives that are being used in a test.
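A minimal sketch of the kind of threshold check the MAT applies, using the launch-wind example above, is given here; the data layout, names, and notification step are hypothetical.

```python
def launch_wind_alerts(forecast_rows, site, test_time, max_speed_ms=10.0):
    """Return alert strings for forecast points matching `site` and `test_time`
    whose 10-m wind speed exceeds the preset limit.  `forecast_rows` is an
    iterable of dicts with hypothetical keys "site", "valid_time", "wspd10m_ms"."""
    alerts = []
    for row in forecast_rows:
        if (row["site"] == site and row["valid_time"] == test_time
                and row["wspd10m_ms"] > max_speed_ms):
            alerts.append(f"{site} {row['valid_time']}: forecast 10-m wind "
                          f"{row['wspd10m_ms']:.1f} m/s exceeds "
                          f"{max_speed_ms:.1f} m/s limit")
    return alerts  # in operations these would go out by e-mail, phone, or pager
```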

The SysView system monitor graphically displays the health of all aspects of the system, including sensors, computing hardware, storage devices, and displays. The major components are shown in a system diagram with a red, yellow, or green color code assigned to each component to provide information on operational status. Figure 6 shows the high-level SysView display for the APG system. To look in more detail at a component, a click on the icon produces a color-coded display of all the subsystems. For example, if one clicks on the MAC icon, the current display will be replaced by one that shows the status of all the components of the MAC, including each node.

d. Hardware and system software solutions

As noted above, the computing hardware that supports the modeling at each army test range consists of two clusters of PC processors running the Linux operating system, the MAC and the DAC. Because the systems are located at the range meteorological units, a significant infrastructure investment is required to provide stable power, sufficient cooling, and system administration support. The MAC hardware consists of 16–32 nodes, with two processors per node. There are two master nodes that manage the processes on the other slave nodes. The DACs typically have six nodes.

Table 5 shows the percent of possible forecast cycles (eight per day) that were available at each of the test ranges during the randomly chosen month of January 2004. To illustrate a typical sequence of available forecast cycles, Fig. 7 shows the forecast cycles per day for the YPG system from December 2003 to February 2004. Recall that new forecasts are initiated on a 3-h cycle, and thus there are 8 cycles per day. Figure 7 indicates good system reliability. In general, system downtime can be assigned to four causes. Approximately 5% of downtime is for planned shutdowns for system maintenance and upgrades during periods when forecasts are not needed. Less than 1% of outages are a result of model software crashes. Another 5% of the downtime is caused by system environment problems, such as the unavailability of Eta Model data from NCEP (for lateral boundary conditions), power outages, etc. The remaining roughly 90% of failures result from more MAC nodes being simultaneously lost than can be replaced with available spares. Power fluctuations, common at some of the more remote ranges, which are not controlled well by uninterruptible power supply units, can cause large numbers of nodes to fail. As with any operational system, these nodes are computing forecasts for over 22 h a day, 7 days a week, drawing full power and with a full memory load. This usage causes a shorter-than-average lifespan for system components.

For massively parallel, distributed-memory modeling applications, such as the 4DWX systems, the scaling of model performance with respect to the number of processors is an extremely important issue. The parallelization of the model is accomplished by decomposing the model domain into grid tiles, and running each tile on a separate processor. Because of overhead associated with interprocessor communication and redundant computations where tiles overlap, the scaling is not perfect. For example, if many processors are required to complete a forecast quickly, the computational grid must be partitioned into many tiles, and a significant fraction of the total grid area will be in the overlap region. If fewer processors and tiles are used, there will be less overlap area between tiles, and the scaling will be more linear. Table 6 shows the model execution time for two to sixty-four 2.4-GHz processors, where the model grids and physics parameterizations are similar to those used in the three-domain YPG operational system (see Table 2 for grid information). Myrinet communication technology is used. The computing time is based on 4-h model forecasts (without data assimilation).
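To make the overlap argument concrete, a rough sketch of the redundant-computation fraction for a two-dimensional tile decomposition is shown below; the halo width and grid size are illustrative assumptions, unrelated to the YPG configuration.

```python
def halo_fraction(nx, ny, px, py, halo=2):
    """Approximate fraction of computed grid points that are redundant halo
    points when an nx-by-ny grid is split into px-by-py tiles, each padded by
    `halo` rows and columns (domain-edge effects are ignored for simplicity)."""
    tile_x, tile_y = nx / px, ny / py
    padded = (tile_x + 2 * halo) * (tile_y + 2 * halo)
    return 1.0 - (tile_x * tile_y) / padded

print(halo_fraction(200, 200, 2, 2))   # ~0.08 redundant on 4 processors
print(halo_fraction(200, 200, 8, 8))   # ~0.26 redundant on 64 processors
```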

4. Training forecasters in the use of the model products

When a new operational model is introduced to any group of forecasters, there is typically a significant adjustment period during which they will determine the strengths and weaknesses of the system; essentially they learn when and if to trust it. This was certainly the case when the new model products were introduced to the ATEC forecasters, who were accustomed only to the NCEP products. However, an even larger challenge for them resulted from the fact that the new model output had greater temporal resolution and approximately an order of magnitude greater horizontal resolution than the usual NCEP products. Instead of utilizing maps of only synoptic-scale processes, forecasters were confronted with hourly displays of locally forced mesogamma-scale processes that they had previously only seen in textbooks or had glimpsed from local data. Obviously quasigeostrophic reasoning had to be set aside, and new ways of looking at model output and interpreting model skill had to be developed. To further complicate the adjustment process, forecasters had to become proficient in the use of a completely new graphics system for displaying meteorological fields.

To help in this transition, online courses in mesoscale meteorology, the interpretation of mesoscale model output, and the utilization of the new graphics interface were set up for forecasters. Training teams periodically visit the range meteorological units to work with forecasters in their operational environment and to help with the interpretation of model products. Conferences for the ATEC forecasters are held regularly, in which a fraction of the range forecasters gather to discuss common forecasting problems and new technologies being introduced. There also are e-mail aliases for forecasters to use in querying their peers and system developers about common problems. It should be kept in mind that because the system is under continuous development and is upgraded at regular intervals, the above training process is an ongoing one that requires significant time resources from both forecasters and system developers.

5. Summary and discussion

It is increasingly common for mesoscale meteorological models to be used operationally for various purposes by private forecasting services, universities, and civilian and military branches of the government. This global trend has been facilitated by the rapid decrease in the cost of computing resources and the availability of increasingly robust and well-verified models. This paper documents the technologies that have been employed in the operational mesogamma-scale modeling systems that have been developed for ATEC and installed at seven army test ranges. Also described are the experiences of the system developers and users. Later papers in this series will document the objective verification of the model products, the coupling of the mesoscale model with special-applications models that simulate sound propagation, transport and diffusion, etc., and the application of a convection nowcasting system at WSMR. The lessons learned by the developers and the users of these operational very high-resolution forecasting systems should make it easier for the modeling community to implement similar systems.

The benefits to science that are being derived from these ATEC modeling systems are many. The ranges are rich with surface and upper-air data for use in mesoscale process diagnostics and for model verification. Because five of the ranges are located in areas of very complex orography and one is located on a complex coastline, these long-term (typically more than a decade) mesoscale datasets are especially valuable for studies of a number of interesting processes. The long record of high temporal- and spatial-resolution model-assimilated gridded datasets from the final analyses is especially useful for diagnostic physical process studies. The fact that the ranges are located in a variety of different climate zones—subarctic, high-elevation cold desert, hot desert, humid coastal—means that explicit and parameterized physics representations can be stressed over a wide range of climate conditions. The long record of operational forecasts and companion mesoscale verification datasets for such a variety of climates is also valuable for model verification. In many cases, long-term parallel runs have been conducted to test new parameterizations or to evaluate data impacts. Last, coupling of the mesoscale model with special-applications models can expand the usefulness of the meteorological products, and this system has provided many opportunities for subjectively and objectively verifying products from such coupled systems.

Of equal importance to the scientific knowledge gained, there have been many practical lessons learned in the process of developing the operational system and working with the forecasters to ensure that it meets their needs. In particular, it was determined that PC clusters provide over a factor of 10 greater computing power per dollar than the multiprocessor, shared-memory systems that were employed initially. System administration costs are relatively high for the large clusters because of the large number of individual PC processors, but the cost per cluster is still less than the alternatives. Another lesson from the operational implementation of the ATEC system was that the training of forecasters in the use of the mesogamma-scale model products was essential. Conventional experience and education were not sufficient. Last, the customization of the model configuration and products to the particular and sometimes specialized needs of each test range represented a significant benefit to the ranges. This range-specific customization included the use of the native-resolution output from the model grids, the provision of high–temporal frequency output, the generation of special output files for specific test sites or for use in coupled models, and the customization of graphics to meet special range needs.

Acknowledgments

The development of this system was funded by the U.S. Army Test and Evaluation Command through an Interagency Agreement with the National Science Foundation. Jennifer Cram developed a prototype of the data assimilation cycling strategy, Cindy Halley-Gotway assisted with the preparation of the graphic art, Karen Arp computed the system reliability statistics, and Carol Park provided editorial assistance.

REFERENCES

  • Chen, F., and J. Dudhia, 2001a: Coupling an advanced land surface–hydrology model with the Penn State–NCAR MM5 modeling system. Part I: Model implementation and sensitivity. Mon. Wea. Rev., 129, 569–585.
  • Chen, F., and J. Dudhia, 2001b: Coupling an advanced land surface–hydrology model with the Penn State–NCAR MM5 modeling system. Part II: Model validation. Mon. Wea. Rev., 129, 587–604.
  • Chen, F., K. Manning, D. Yates, M. A. LeMone, S. B. Trier, R. Cuenca, and D. Niyogi, 2004: Development of high-resolution land data assimilation system and its application to WRF. Preprints, 20th Conf. on Weather Analysis and Forecasting and 16th Conf. on Numerical Weather Prediction, Seattle, WA, Amer. Meteor. Soc., 22.3.
  • Cressman, G. P., 1959: An operational objective analysis system. Mon. Wea. Rev., 87, 367–374.
  • Davis, C., T. Warner, J. Bowers, and E. Astling, 1999: Development and application of an operational, relocatable, mesogamma-scale weather analysis and forecasting system. Tellus, 51A, 710–727.
  • Dudhia, J., 1989: Numerical study of convection observed during the winter monsoon experiment using a mesoscale two-dimensional model. J. Atmos. Sci., 46, 3077–3107.
  • Ebuchi, N., H. C. Graber, and M. J. Michael, 2002: Evaluation of wind vectors observed by QuikSCAT/SeaWinds using ocean buoy data. J. Atmos. Oceanic Technol., 19, 2049–2062.
  • Fast, J. D., 1995: Mesoscale modeling and four-dimensional data assimilation in areas of highly complex terrain. J. Appl. Meteor., 34, 2762–2782.
  • Gray, D., J. Daniels, S. Nieman, S. Lord, and G. Dimego, 1996: NESDIS and NWS assessment of GOES 8/9 operational satellite motion vectors. Proc. Third Int. Winds Workshop, Pub. EUM P18, Ascona, Switzerland, EUMETSAT, 175–183.
  • Grell, G., 1993: Prognostic evaluation of assumptions used by cumulus parameterizations. Mon. Wea. Rev., 121, 764–787.
  • Grell, G. A., J. Dudhia, and D. R. Stauffer, 1995: A description of the fifth-generation Penn State/NCAR Mesoscale Model (MM5). NCAR Tech. Note NCAR/TN-398+STR, 122 pp.
  • Hong, S-Y., and H-L. Pan, 1996: Nonlocal boundary layer vertical diffusion in a medium-range forecast model. Mon. Wea. Rev., 124, 2322–2339.
  • Horel, J., M. Splitt, and B. White, 2002: MesoWest: Cooperative mesonets in the western United States. Bull. Amer. Meteor. Soc., 83, 211–225.
  • Leslie, L. M., J. F. LeMarshall, R. P. Morrison, C. Spinoso, R. J. Purser, N. Pescod, and R. Seecamp, 1998: Improved hurricane track forecasting from the continuous assimilation of high-quality satellite wind data. Mon. Wea. Rev., 126, 1248–1258.
  • Liu, Y., F. Vandenberghe, S. Low-Nam, T. T. Warner, and S. Swerdlin, 2004: Observation-quality estimation and its application in the NCAR/ATEC real-time FDDA and forecast (RTFDDA) system. Preprints, 20th Conf. on Weather Analysis and Forecasting and 16th Conf. on Numerical Weather Prediction, Seattle, WA, Amer. Meteor. Soc., J1.7.
  • Liu, Y., F. Chen, T. Warner, and J. Basara, 2006a: Verification of a mesoscale data-assimilation and forecasting system for the Oklahoma City area during the Joint Urban 2003 Field Project. J. Appl. Meteor. Climatol., 45, 912–929.
  • Liu, Y., W. Yu, F. Vandenberghe, A. Hahmann, T. Warner, and S. Swerdlin, 2006b: Assimilation of diverse meteorological datasets with a four-dimensional mesoscale analysis and forecast system. Preprints, 10th Symp. on Integrated Observing and Assimilation System for the Atmosphere, Oceans, and Land Surface, Atlanta, GA, Amer. Meteor. Soc., 2.8.
  • Liu, Y., and Coauthors, 2008: The operational mesogamma-scale analysis and forecast system of the U.S. Army Test and Evaluation Command. Part II: Interrange comparison of the accuracy of model analyses and forecasts. J. Appl. Meteor. Climatol., 47, 1093–1104.
  • Mlawer, E. J., S. J. Taubman, P. D. Brown, M. J. Iacono, and S. A. Clough, 1997: Radiative transfer for inhomogeneous atmospheres: RRTM, a validated correlated-k model for the longwave. J. Geophys. Res., 102 (D14), 16663–16682.
  • Moninger, W. R., R. D. Mamrosh, and P. M. Pauley, 2003: Automated meteorological reports from commercial aircraft. Bull. Amer. Meteor. Soc., 84, 203–216.
  • Neiman, P. J., P. T. May, and M. A. Shapiro, 1992: Radio Acoustic Sounding System (RASS) and wind profiler observations of lower and mid-troposphere. Preprints, Seventh Symp. on Meteorological Observation and Instrumentation, New Orleans, LA, Amer. Meteor. Soc., 61–66.
  • Nieman, S. J., W. P. Menzel, C. M. Hayden, D. Gray, S. T. Wanzong, C. S. Velden, and J. Daniels, 1997: Fully automated cloud-drift winds in NESDIS operations. Bull. Amer. Meteor. Soc., 78, 1121–1133.
  • Reisner, J., R. J. Rasmussen, and R. T. Bruintjes, 1998: Explicit forecasting of supercooled liquid water in winter storms using the MM5 mesoscale model. Quart. J. Roy. Meteor. Soc., 124B, 1071–1107.
  • Rife, D. L., T. T. Warner, F. Chen, and E. G. Astling, 2002: Mechanisms for diurnal boundary layer circulations in the Great Basin Desert. Mon. Wea. Rev., 130, 921–938.
  • Rife, D. L., C. A. Davis, Y. Liu, and T. T. Warner, 2004: Predictability of low-level winds by mesoscale meteorological models. Mon. Wea. Rev., 132, 2553–2569.
  • Saxen, T. R., and Coauthors, 2008: The operational mesogamma-scale analysis and forecast system of the U.S. Army Test and Evaluation Command. Part IV: The White Sands Missile Range Auto-Nowcast System. J. Appl. Meteor. Climatol., 47, 1123–1139.
  • Seaman, N. L., D. R. Stauffer, and A. M. Lario-Gibbs, 1995: A multiscale four-dimensional data assimilation system applied in the San Joaquin Valley during SARMAP. Part I: Modeling design and basic performance characteristics. J. Appl. Meteor., 34, 1739–1761.
  • Sharman, R. D., Y. Liu, R-S. Sheu, T. T. Warner, D. L. Rife, J. F. Bowers, C. A. Clough, and E. E. Ellison, 2008: The operational mesogamma-scale analysis and forecast system of the U.S. Army Test and Evaluation Command. Part III: Forecasting with secondary-applications models. J. Appl. Meteor. Climatol., 47, 1105–1122.
  • Stauffer, D. R., and N. L. Seaman, 1994: Multiscale four-dimensional data assimilation. J. Appl. Meteor., 33, 416–434.
  • Stauffer, D. R., N. L. Seaman, and F. S. Binkowski, 1991: Use of four-dimensional data assimilation in a limited-area mesoscale model. Part II: Effects of data assimilation within the planetary boundary layer. Mon. Wea. Rev., 119, 734–754.
  • Warner, T. T., and H-M. Hsu, 2000: Nested-model simulation of moist convection: The impact of coarse-grid parameterized convection on fine-grid resolved convection. Mon. Wea. Rev., 128, 2211–2231.
  • Warner, T. T., J. F. Bowers, S. P. Swerdlin, and B. A. Beitler, 2004: A rapidly deployable, operational, mesoscale modeling system for emergency-response applications. Bull. Amer. Meteor. Soc., 85, 709–716.
  • Xu, M., Y. Liu, C. A. Davis, and T. T. Warner, 2002: Sensitivity study on nudging parameters for a mesoscale FDDA system. Preprints, 19th Conf. on Weather Analysis and Forecasting and 15th Conf. on Numerical Weather Prediction, San Antonio, TX, Amer. Meteor. Soc., 4B.4.

Fig. 1. Schematic of the overall structure of the modeling system. Data sources are shown at the top. The MAC and DAC clusters each have a Redundant Array of Inexpensive Disks (RAID). The auto-nowcast system operates from the same data input stream as the rest of the system.


Fig. 2. Example model grid configuration for WSMR. The expanded grid 3 shows (top right) terrain elevation in meters, with shaded color increments of 100 m, and (bottom right) land use distribution. The boundaries of the WSMR are also shown (in magenta). Horizontal grid increments are 30, 10, and 3.3 km, from the outer to the inner grids, respectively.


Fig. 3. The data assimilation and forecast cycle. The ordinate is wall-clock time, and the abscissa is model time (i.e., the time being simulated during the data assimilation or forecast process). The slope of the arrows indicates the elapsed time associated with the model's execution, but the particular slope shown here is not meant to quantitatively indicate model efficiency. As an example, consider the 0800 UTC cycle (real time t on the ordinate = 0800 UTC). The cycle starts at 0820 UTC and generates, in 20–30 min, a 3-h final analysis (horizontal black arrow) spanning 0400–0700 UTC. The model then writes a restart file (vertical black arrow) from which the next cycle (1100 UTC, t + 3 on the ordinate) starts. The model then continues to produce a preliminary analysis and a forecast for the current cycle; the real time is about 1110 UTC by the time the forecast (white arrow) terminates at t + 36 h.

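To make the cycling arithmetic in the Fig. 3 caption concrete, the following sketch computes the key times of one cycle. It is a hypothetical illustration, not part of the operational code: the 20-min trigger delay, 3-h cycle interval, 3-h final-analysis span, and 36-h forecast length are taken from the caption, while all function and variable names are invented.

    from datetime import datetime, timedelta

    # Hypothetical sketch of the 3-hourly cycle timing described in the Fig. 3 caption.
    CYCLE_INTERVAL = timedelta(hours=3)     # cycles nominally every 3 h
    TRIGGER_DELAY  = timedelta(minutes=20)  # the cycle labeled t actually starts at t + 20 min
    ANALYSIS_SPAN  = timedelta(hours=3)     # final analysis covers a 3-h window
    FORECAST_LEN   = timedelta(hours=36)    # forecast extends to t + 36 h

    def cycle_times(t):
        """Key wall-clock and model times for the cycle nominally starting at model time t."""
        analysis_from = t - timedelta(hours=4)          # final analysis spans t - 4 h to t - 1 h
        return {
            "wall_clock_start":    t + TRIGGER_DELAY,
            "final_analysis_from": analysis_from,
            "final_analysis_to":   analysis_from + ANALYSIS_SPAN,
            "restart_for_cycle":   t + CYCLE_INTERVAL,  # next cycle (t + 3 h) restarts from here
            "forecast_end":        t + FORECAST_LEN,
        }

    # Example: the 0800 UTC cycle discussed in the caption (date chosen arbitrarily).
    for key, value in cycle_times(datetime(2003, 7, 28, 8, 0)).items():
        print(f"{key:>20s}: {value:%H%M UTC %d %b}")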

Fig. 4. Image from the Web-based graphical interface showing 10-m AGL winds (lines with barbs) and 2-m AGL temperature (color bands). This system was used by WSMR to support a missile launch from California.


Fig. 5. A display for North America in which model output from DPG and APG has been composited on a single map. The plan-view image shows model-analyzed 890-hPa temperature, the window over the mid-Atlantic states displays an infrared satellite image, and the east–west-oriented solid line defines the axis of the vertical cross section of temperature that is inset at the lower right. The mouse can be used to define any display area, and the highest-resolution model data available from any range are automatically used to define the current or forecast conditions. See text for details.

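The "use the finest data available" behavior described in the Fig. 5 caption amounts to a simple selection rule: of the grids whose domains cover the requested display area, choose the one with the smallest grid increment. The sketch below is a hypothetical illustration of that rule only; the grid list, bounding boxes, and function names are invented and do not represent the actual display software.

    from dataclasses import dataclass

    @dataclass
    class GridProduct:
        range_name: str
        dx_km: float        # horizontal grid increment
        bounds: tuple       # (lon_min, lon_max, lat_min, lat_max)

    def covers(grid, area):
        """True if the grid's domain fully contains the requested display area."""
        glon0, glon1, glat0, glat1 = grid.bounds
        alon0, alon1, alat0, alat1 = area
        return glon0 <= alon0 and glon1 >= alon1 and glat0 <= alat0 and glat1 >= alat1

    def finest_available(grids, area):
        """Pick the highest-resolution (smallest dx) grid that covers the display area."""
        candidates = [g for g in grids if covers(g, area)]
        return min(candidates, key=lambda g: g.dx_km) if candidates else None

    # Hypothetical example: a coarse continental grid plus a fine range-scale nest.
    grids = [
        GridProduct("CONUS", 30.0, (-130.0, -60.0, 20.0, 55.0)),
        GridProduct("DPG",    3.3, (-114.0, -112.0, 39.5, 41.0)),
    ]
    print(finest_available(grids, (-113.5, -112.5, 40.0, 40.7)).range_name)  # -> DPG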

Fig. 6.
Fig. 6.

Example of the system viewer (SysView) for the main components of the system. These components encompass the data input, MAC, and DAC. The status colors indicate that the NOAAPort system is down (gray boxes and background), and that the model (RT-FDDA) and the output products delivery system are not fully functioning (yellow boxes). If the RT-FDDA icon were selected, the resulting display would show an icon for each node, and a subcritical number of them would have color codes that indicate nonfunctioning.

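The color logic implied by the Fig. 6 caption (gray for a component that is down, yellow for one that is degraded but still running) can be sketched as a simple status rollup over the component's nodes. Everything here is an assumption for illustration, including the green state, the threshold, and all names; the actual SysView logic is not documented in this paper.

    def component_status(node_ok_flags, critical_fraction=0.5):
        """Roll per-node health flags up into a single SysView-style color.

        Assumed convention: gray = component down, yellow = degraded but running,
        green = fully functioning.  critical_fraction is a made-up threshold.
        """
        if not any(node_ok_flags):
            return "gray"                       # every node down -> component is down
        failed = node_ok_flags.count(False) / len(node_ok_flags)
        if failed == 0:
            return "green"
        return "yellow" if failed < critical_fraction else "gray"

    # A subcritical number of failed nodes yields "yellow", as described for RT-FDDA in Fig. 6.
    print(component_status([True, True, False, True]))  # -> yellow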

Fig. 7. Cycles per day available from December 2003 through February 2004, inclusive, for the Yuma Proving Ground system. A full bar represents completion of all 8 cycles for that day. The abscissa is labeled beginning with 1 Dec 2003, and each tick along the abscissa represents one day.


Table 1. The forecast requirements and challenges at each ATEC range. The forecast periods correspond to those most critical for range operations and differ from the actual forecast lengths listed in Table 3.
Table 2. Characteristics of the models at the various ranges. D3 and D4 refer to computational domains 3 and 4, the innermost two domains of the nested-grid system. Two ranges (CRTC and DPG) have four grids; the other ranges have only three.
Table 3. Measurement platforms used at the different ranges.
Table 4. Number of observations used in the WSMR data assimilation system for the 60 min centered on 0000 UTC 28 Jul 2003.
Table 5. Percentage of possible forecast cycles completed during January 2004 for each range at which a 4DWX system was operating at that time. The first row excludes days on which the system was not operating for the entire day, thus eliminating periods during which the system was intentionally down for maintenance. The second row includes all days.
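The two completion statistics defined in the Table 5 caption are straightforward to compute from a per-day cycle log. The sketch below is a hypothetical illustration only; the log format, the function name, and the example numbers are invented, and eight scheduled cycles per day are assumed, as in Fig. 7.

    def completion_stats(daily_cycles, maintenance_days, cycles_per_day=8):
        """Percent of possible cycles completed, with and without maintenance days.

        daily_cycles: {day_index: cycles completed that day}; maintenance_days: set of
        day indices on which the system was intentionally down (excluded from row 1).
        """
        def pct(days):
            possible = len(days) * cycles_per_day
            completed = sum(daily_cycles[d] for d in days)
            return 100.0 * completed / possible if possible else float("nan")

        all_days = sorted(daily_cycles)
        operating_days = [d for d in all_days if d not in maintenance_days]
        return pct(operating_days), pct(all_days)   # (row 1, row 2) of Table 5

    # Hypothetical 5-day example in which day 3 was a scheduled maintenance day.
    print(completion_stats({1: 8, 2: 7, 3: 0, 4: 8, 5: 8}, maintenance_days={3}))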
Table 6. Model execution time and efficiency (defined as the percent of perfect-scaling speed) for different numbers of processors. See text for details.
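The efficiency defined in the Table 6 caption, the percent of perfect-scaling speed, can be computed from execution times at different processor counts. The sketch below is a hypothetical illustration (the timing numbers and function name are invented), assuming efficiency is measured relative to the smallest processor count tested.

    def scaling_efficiency(timings):
        """timings: {num_processors: execution_time}.  Returns {num_processors: percent
        of perfect-scaling speed}, relative to the smallest processor count as baseline."""
        n0 = min(timings)
        t0 = timings[n0]
        return {
            n: 100.0 * (t0 / t) / (n / n0)   # achieved speedup divided by ideal speedup
            for n, t in sorted(timings.items())
        }

    # Hypothetical timings (minutes) for one forecast cycle on 8, 16, and 32 processors.
    print(scaling_efficiency({8: 120.0, 16: 66.0, 32: 40.0}))
    # -> {8: 100.0, 16: ~90.9, 32: ~75.0}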

The National Center for Atmospheric Research is sponsored by the National Science Foundation.

1. Mesogamma-scale (mesobeta-scale) motions have horizontal length scales of 2–20 km (20–200 km).

2. Intermittent data assimilation involves restarting the model at regular intervals; the initial conditions are typically obtained by using observations within a defined time window to correct a first-guess field, which is the forecast from the previous cycle. Restart intervals are typically 1–6 h.
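As a minimal sketch of this intermittent cycling, the toy example below uses a scalar state and a simple relaxation of the first guess toward the windowed observations; the weight, function names, and numbers are assumptions for illustration and do not correspond to the actual 4DWX code.

    def intermittent_cycle(first_guess, obs_in_window, forecast_model, weight=0.5):
        """One intermittent data assimilation cycle for a toy scalar state.

        first_guess: previous cycle's forecast valid at the analysis time.
        obs_in_window: observations falling inside the assimilation time window.
        forecast_model: function advancing the analysis to the next restart time.
        weight: fraction of the observation-minus-guess increment applied (assumed).
        """
        if obs_in_window:                       # correct the first guess toward the obs mean
            obs_mean = sum(obs_in_window) / len(obs_in_window)
            analysis = first_guess + weight * (obs_mean - first_guess)
        else:                                   # no data: the analysis is the first guess
            analysis = first_guess
        return forecast_model(analysis)         # forecast becomes the next cycle's first guess

    # Toy example: a "model" that decays the state slightly between restarts.
    state = 10.0
    for obs in ([11.0, 12.0], [], [9.0]):       # obs available in each cycle's window
        state = intermittent_cycle(state, obs, lambda x: 0.9 * x)
        print(round(state, 3))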

