As arguably the world’s most widely used numerical weather prediction model, the Weather Research and Forecasting Model offers a spectrum of capabilities for an extensive range of applications.
The Weather Research and Forecasting (WRF) Model (Skamarock et al. 2008) is an atmospheric model designed, as its name indicates, for both research and numerical weather prediction (NWP). While it is officially supported by the National Center for Atmospheric Research (NCAR), WRF has become a true community model through its long-term development driven by the interests and contributions of a worldwide user base. From these contributions, WRF has grown to provide specialty capabilities for a range of Earth system prediction applications, such as air chemistry, hydrology, wildland fires, hurricanes, and regional climate. The software framework of WRF has facilitated such extensions and supports efficient, massively parallel computation across a broad range of computing platforms. As detailed below, this paper aims to provide a review of the WRF system and to convey to the meteorological community its significance via its contributions to atmospheric science and weather prediction.
Since its initial public release in 2000, WRF has become arguably the world’s most-used atmospheric model. This is evidenced in metrics of registered users and publications. For example, the cumulative number of WRF registrations is now over 36,000, distributed across 162 countries. Figure 1 (left) shows WRF’s steady growth in cumulative registrations since its initial release, while Fig. 2 maps the countries that have logged registered users (as well as those that have had operational forecasting users). WRF’s widespread acceptance is in part due to its being provided without cost, copyright encumbrance, or restrictions on modification. A measure of the ongoing interest in WRF and the influx of users is the number of annual registrations (Fig. 1, left). These averaged over 3,600 per year in the five years from 2011 to 2015. Meanwhile, 8,200 users subscribe to the wrf-news e-mail list for model information and updates.
The catalog of WRF-related publications reflects the model’s impact on science. The number of peer-reviewed journal publications involving WRF is currently over 3,500, and the annual average for the 2011–15 period is 510 per year (Fig. 1, right). The number of unique institutions represented in peer-reviewed WRF publications is over 1,340, and the number of unique authors exceeds 11,700. To date, the number of citations to WRF papers is over 26,500, with an average of over 10 citations per publication.
Though WRF is mature, it continues to advance. The system is being vigorously applied in new research directions, real-time settings, and marketplace opportunities. In light of WRF’s continuing growth, its prominent role in research, and its extensive use in forecasting, this article seeks to review this major NWP capability in order to inform and update the meteorological community on the current scope of the system, how it is being applied, and where it is going.
BACKGROUND.
During the 1990s the fifth-generation Pennsylvania State University–NCAR Mesoscale Model (MM5; Grell et al. 1994) saw widespread use in the research community. This stemmed largely from its abilities to address increasingly smaller atmospheric scales and to operate on workstation-level computers. However, while the MM5 was nonhydrostatic, it was not an optimal tool to pursue those scales: it was nonconservative, it had low-order numerics (meaning less accurate solutions for finer scales), and it lacked a framework that facilitated the addition of advanced physics or that supported many desirable software attributes [portability, parallelism, extensibility, software layers, and application programming interfaces (APIs)]. Meanwhile, the National Centers for Environmental Prediction (NCEP) had interest in developing a nonhydrostatic model for operational forecasting on finer scales. In this setting, circa 1995, the idea of WRF took shape on the premise that there could be a beneficial synergy in an NWP model shared by the research and operational communities, one that would be a next-generation capability moving past known limitations (such as those of the MM5). The model could be a common platform for an extensive research community to develop capabilities that operations could readily exploit. Furthermore, an understanding of model performance and needed improvements could be hastened in the crucible of operational use, guiding practical development in return. Thus, the capability would facilitate “research to operations” (R2O) developments while leading to sharpened research and development efforts on identified needs [the operations to research (O2R) path].
Seeing these complementary possibilities, a partnership formed to build WRF. Its original members were NCAR, the National Oceanic and Atmospheric Administration (NOAA) [represented by NCEP and what has become NOAA’s Earth System Research Laboratory (ESRL)], the U.S. Air Force, the Naval Research Laboratory, the University of Oklahoma, and the Federal Aviation Administration. In addition to planning the initial efforts, the partners provided in-kind and other resources to create the system.
The model’s dynamical solver (or dynamical core) and related numerics were the initial development foci. Compared to previous models (such as the MM5), what emerged was superior in terms of higher-order numerical accuracy and scalar conservation properties. As these pieces took shape, an innovative software framework (Michalakes et al. 1999) for the model’s dynamics, physics, and input/output (I/O) components was designed. The architecture united the modeling components logically and efficiently while looking ahead to ensure system extensibility, ease of development, and scalable massively parallel operation. Another development thrust addressed the preprocessors for domain and input preparation while a separate effort tackled data assimilation. The original partners built these capabilities through the efforts of a number of working groups, and the first model release was in December 2000.
The initial physics packages for WRF were ported from the MM5, but community contributions have since delivered a host of schemes providing multiple options for atmospheric physical processes. Other WRF capabilities have been developed through the resources and efforts of interested agencies and universities. Taken as a whole, WRF’s growth reflects a collaborative and communal path: the system’s development has never been solely funded or directed by a single entity.
WRF’s dynamics are represented in its atmospheric fluid flow solvers, or cores. The two original cores had an Eulerian height–based vertical coordinate and a mass-based vertical coordinate (Klemp et al. 2007). WRF, version 2, saw the removal of the height-based version because the limited benefit of having both cores did not justify the extra complexity. In the early 2000s, another solver, NCEP’s Nonhydrostatic Mesoscale Model (NMM) core (Janjić et al. 2001; Janjić 2003), was added to provide an alternative option for NCEP. The two WRF variants were called the Advanced Research version of WRF (ARW; WRF-ARW) and WRF-NMM.
Oversight of the WRF enterprise has evolved over time. Through the early years, the partners coordinated the various efforts. The WRF Executive Oversight Board, which represented each partner organization, handled top-level management. Under this were the Research Applications Board and the Operations Requirements Board, addressing the interests of the research and operational stakeholders, respectively. At the developmental level, various working groups focused on narrower areas, such as numerics, data assimilation, and physics. From the late 2000s, the original top-down direction of WRF has transitioned to a mode of community-driven input, with the responsibility for basic system and community support led by NCAR (see “The WRF community support effort” section).
WRF AND ITS APPLICATIONS.
Primary WRF components.
WRF produces atmospheric simulations. The process has two phases: the first configures the model domain(s), ingests the input data, and prepares the initial conditions, and the second runs the forecast model. The forecast model components operate within WRF’s software framework, which handles I/O and parallel-computing communications. WRF is written primarily in Fortran, can be built with a number of compilers, and runs predominantly on platforms with UNIX-like operating systems, from laptops to supercomputers. WRF’s architecture has allowed it to be ported to virtually every type of platform in the world’s top 500 supercomputers (see www.top500.org).
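To make this concrete, the model run itself is controlled through a Fortran namelist file (namelist.input). The abridged single-domain, real-data sketch below uses illustrative values rather than recommended settings; a complete file carries additional required records (e.g., the time step and vertical level count):

    &time_control
     run_hours        = 24,      ! forecast length
     start_year       = 2016,    ! simulation initial time
     start_month      = 06,
     start_day        = 01,
     interval_seconds = 21600,   ! lateral boundary update interval (6 h)
     history_interval = 60,      ! write model history every 60 min
    /
    &domains
     max_dom = 1,                ! number of (possibly nested) domains
     e_we    = 425,              ! west-east grid dimension
     e_sn    = 300,              ! south-north grid dimension
     dx      = 12000.,           ! horizontal grid spacing (m)
     dy      = 12000.,
    /
    &physics
     mp_physics     = 8,         ! microphysics option (here, Thompson)
     ra_lw_physics  = 4,         ! longwave radiation (here, RRTMG)
     ra_sw_physics  = 4,         ! shortwave radiation (here, RRTMG)
     bl_pbl_physics = 1,         ! PBL scheme (here, YSU)
     cu_physics     = 1,         ! cumulus scheme (here, Kain–Fritsch)
    /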
WRF simulations begin with the WRF Preprocessing System (WPS). A series of utilities, the WPS first pulls in geographical information (e.g., topography, land use) to set up the user’s model domains. Next, it ingests, reformats, and interpolates the requisite first-guess atmospheric data (e.g., a global analysis or model forecast) to the user’s domains. Finally, the input fields are interpolated to the model’s vertical levels, and the lateral boundary conditions are generated. WRF is then ready to run. This is done by the forecast component that contains the dynamical solver and physics packages for atmospheric processes (e.g., microphysics, radiation, planetary boundary layer).
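The preprocessing phase is driven by a companion namelist (namelist.wps). In the abridged sketch below (values and the geographic data path are illustrative), the records correspond to the utilities run in sequence: geogrid (domain setup), ungrib (first-guess decoding), and metgrid (horizontal interpolation):

    &share
     wrf_core   = 'ARW',                  ! target dynamical core
     max_dom    = 1,
     start_date = '2016-06-01_00:00:00',  ! period of first-guess data to process
     end_date   = '2016-06-02_00:00:00',
    /
    &geogrid
     e_we           = 425,                ! domain dimensions (match namelist.input)
     e_sn           = 300,
     dx             = 12000,              ! grid spacing (m)
     dy             = 12000,
     map_proj       = 'lambert',          ! map projection
     geog_data_path = '/data/WPS_GEOG',   ! static geographic data (illustrative path)
    /
    &ungrib
     out_format = 'WPS',                  ! intermediate file format
     prefix     = 'FILE',                 ! prefix for decoded first-guess files
    /
    &metgrid
     fg_name = 'FILE',                    ! first-guess files to interpolate
    /

The real-data initialization program then performs the vertical interpolation and writes the model’s initial and lateral boundary condition files.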
Per its design as a research tool, WRF can also be configured to conduct idealized simulations. This capability allows users to study processes in a simplified setting (e.g., reflecting a single sounding or idealized topography) by varying parameters and initial conditions while using limited physics. WRF currently provides 12 idealized scenarios, including baroclinic waves, supercell convection, flow over topography, large-eddy flows, and tropical cyclones. In addition, individual users can readily construct other idealized configurations.
WRF may also be run as a global model on a latitude–longitude grid. While Global WRF was originally built to study planetary atmospheres (Richardson et al. 2007), it has come to be used for terrestrial forecasting, chemistry, and climate applications (Zhang et al. 2012; Karamchandani et al. 2012; Jin et al. 2013; Hutchinson 2015).
Under both real-data and idealized configurations, WRF has been extensively used for research. As WRF is fundamentally a mesoscale model, WRF research applications over the years have run the gamut of topics in mesoscale meteorology: synoptic and mesoscale processes associated with extratropical cyclones, fronts, and jets (Zhang et al. 2009; Schultz and Sienkiewicz 2013; Thompson and Eidhammer 2014; Rostom and Lin 2015; Ganetis and Colle 2015; Lu and Deng 2015); mesoscale weather events and phenomena (Powers 2007; Shi et al. 2010; Brewer et al. 2013; Mass et al. 2014; Parish et al. 2015; DuVivier and Cassano 2013); organized convection (Trier et al. 2006; Xu et al. 2015; Meng et al. 2012; Morrison et al. 2012; Akter 2015); and hurricanes (Davis et al. 2008a; Khain et al. 2010; H. Chen et al. 2011; Moon and Nolan 2015). In recent years WRF’s use for regional climate research has surged (see, e.g., Leung and Qian 2009; Done et al. 2015; Bruyère et al. 2014), and, in this, WRF’s forte is resolving smaller-scale atmospheric and land surface processes better than the global models traditionally used for climate projections.
The WRF Data Assimilation System (WRFDA) is the primary data assimilation system for WRF (Barker et al. 2004, 2012; Huang et al. 2009). It features three-dimensional and four-dimensional variational (3DVAR, 4DVAR) approaches, as well as a hybrid variational–ensemble approach [ensemble transform Kalman filter (ETKF); ETKF-3DVAR; Wang et al. 2008; Schwartz et al. 2015c]. These techniques can assimilate a wide range of direct and indirect observation types, from traditional in situ surface and upper-air data to satellite-based measurements. Table 1 lists the observations that WRFDA can currently ingest. In addition to WRFDA, the Gridpoint Statistical Interpolation analysis system (GSI; Wu et al. 2002; Kleist et al. 2009) and the Data Assimilation Research Testbed (DART; Anderson et al. 2009) are data assimilation capabilities that can be used to prepare WRF initial conditions.
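At the heart of the variational approaches is the minimization of a cost function that measures the misfit of a candidate analysis to the background and to the observations; in standard notation,

    J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_b)^{\mathrm{T}}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
                  + \tfrac{1}{2}[\mathbf{y}_o - H(\mathbf{x})]^{\mathrm{T}}\mathbf{R}^{-1}[\mathbf{y}_o - H(\mathbf{x})],

where x is the model state, x_b the background (first guess), y_o the observation vector, H the observation operator mapping model variables to observed quantities, and B and R the background and observation error covariance matrices, respectively. 4DVAR extends the fit to observations distributed over a time window by folding the forecast model into H, and the hybrid methods replace part of the static B with flow-dependent, ensemble-derived covariances.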
Table 1. WRFDA observation catalog. Data formats vary with observation type but include ASCII, NCEP Prepared BUFR (PrepBUFR), binary, and Hierarchical Data Format (HDF). SYNOP = synoptic. METAR = aviation routine weather report. Pibal = pilot balloon. HIRS = High Resolution Infrared Radiation Sounder. AMSU = Advanced Microwave Sounding Unit. MHS = Microwave Humidity Sounder. SATEM = temperature profile. SSMIS = Special Sensor Microwave Imager/Sounder. DMSP = Defense Meteorological Satellite Program. ATMS = Advanced Technology Microwave Sounder. SNPP = Suomi National Polar-Orbiting Partnership. AIRREP = air report. ACARS = Aircraft Communication, Addressing, and Reporting System. AMDAR = Aircraft Meteorological Data Relay. TAMDAR = Tropospheric Airborne Meteorological Data Reporting. AIRS = Atmospheric Infrared Sounder. GPS = global positioning system. IASI = Infrared Atmospheric Sounding Interferometer. MetOp = Meteorological Operational satellite. SEVIRI = Spinning Enhanced Visible and Infrared Imager. Meteosat = Meteorological Satellite. MWTS = Microwave Temperature Sounder. MWHS = Microwave Humidity Sounder. FY-3 = Fengyun-3. AMSR-2 = Advanced Microwave Scanning Radiometer-2. GCOM = Global Change Observation Mission–Water 1.
New approaches to data assimilation and the impact of new observation types are fertile areas of research using WRFDA (e.g., Jung et al. 2013; Xu et al. 2013; Romine et al. 2016). Additionally, ensemble Kalman filter data assimilation has been used to initialize ensembles of WRF with convection-permitting resolutions (e.g., 3-km grids) for field campaigns [such as the Mesoscale Predictability Experiment (MPEX) in 2013 (Schwartz et al. 2015a) and the Deep Convective Clouds and Chemistry (DC3) field campaign of 2012 (Romine et al. 2014)], for NOAA’s Hazardous Weather Testbed experiment (Clark et al. 2012), and for ongoing real-time forecasting (e.g., at NCAR; see Schwartz et al. 2015b). Individual convection-permitting WRF runs have demonstrated the capability to capture the observed evolution and structure of organized convective storms. Figure 3 provides a good example, from the NCAR ensemble, for a long-lived central-U.S. squall line event of July 2015. Apart from a timing offset of a few hours, the WRF prediction, issued over 24 hours in advance, matches well the system progression seen in the composite radar analysis.
Real-time NWP.
Envisioned for NWP, WRF is used operationally at governmental centers around the world (see, e.g., Fig. 2; Dudhia 2014) as well as by private companies. The configurability of high-resolution domains, variety of possible input data, and computational flexibility (particularly in limited-resource settings), along with the ability to leverage model advancements from a global research community, have made it particularly attractive for real-time forecasting.
In the United States, NCEP employs WRF in support of the National Weather Service in a number of systems. WRF (ARW) is run in the Rapid Refresh (RAP) and High-Resolution Rapid Refresh (HRRR) systems developed by the NOAA ESRL and NCEP (Benjamin et al. 2016; Peckham et al. 2016). Providing the benefits of an hourly update cycle, the RAP and HRRR systems feature 13- and 3-km grids across North America and the conterminous United States (CONUS), respectively. Their frequent initializations use the GSI data assimilation system, employing a hybrid variational–ensemble approach, as well as digital filter initialization (Peckham et al. 2016). Figure 4 provides an example of an HRRR forecast and verification over the 3-km CONUS domain. HRRR and RAP applications include energy, hydrology, severe weather, aviation weather, and air quality, and are partially linked to several of the WRF specialty systems noted below (see the “Tailored WRF systems and model applications” section). A recent example is HRRR-Smoke, which runs the WRF Model coupled with Chemistry (WRF-Chem) in real time within the HRRR configuration to simulate the emissions and transport of smoke from wildfires (see https://rapidrefresh.noaa.gov/hrrr/HRRRsmoke/).
NCEP also uses WRF in its Short-Range Ensemble Forecast (SREF) system, which includes 13 ARW forecast members (on 16-km grids), and in the High-Resolution Window Forecast System (HIRESW), with domains of 3.5–4.2-km grid spacing for the CONUS, Alaska, Hawaii, Puerto Rico, and Guam. As described below, NCEP also runs a version of the model called Hurricane WRF (HWRF; Tallapragada et al. 2014) operationally for tropical cyclone prediction.
From WRF’s inception, the U.S. Air Force has been both a key user of, and contributor to, WRF for real-time NWP. WRF has supported its forecasting needs in theaters across the globe since the mid-2000s. Outside of the United States, Taiwan’s Central Weather Bureau (CWB) runs WRF for operational forecasting, which includes a model configuration for its western Pacific typhoon prediction needs (Typhoon WRF; Hsiao et al. 2012). In a very different setting (namely, polar) the Antarctic Mesoscale Prediction System (AMPS; Powers et al. 2012) runs WRF over Antarctica to support the U.S. Antarctic Program (USAP). The primary users are the weather forecasters for the USAP, who provide the forecasts for crucial air operations and scientific and logistical activities across the continent.
Apart from its operational real-time users, WRF is run for real-time forecasting by scores of universities worldwide. Its output provides a focus for weather analysis and forecasting by students and faculty, as well as a vehicle for classroom explorations of NWP and course topics. WRF’s portability to workstation and limited-compute environments has facilitated this, and the university application has introduced many students to the practice of NWP.
Real-time WRF also supports science via NWP for field campaigns. In recent years WRF-assisted experiments have included DC3 (2012) (Barth et al. 2015); Tropical Ocean Troposphere Exchange of Reactive Halogen Species and Oxygenated Volatile Organic Compounds (TORERO; 2012) (Volkamer et al. 2015); Studies of Emissions and Atmospheric Composition, Clouds, and Climate Coupling by Regional Surveys (SEAC4RS; 2013) (Wagner et al. 2015); and MPEX (2013) (Weisman et al. 2015), and more recently the Olympic Mountains Experiment (OLYMPEX; http://olympex.atmos.washington.edu/) and the O2/N2 Ratio and CO2 Airborne Southern Ocean Study (ORCAS; www.eol.ucar.edu/field_projects/orcas). OLYMPEX (November 2015–February 2016) studied precipitation in midlatitude oceanic frontal systems interacting with the coast and mountains (specifically the Olympic Peninsula and Olympic Mountains in Washington) and addressed the validation of data from the NASA Global Precipitation Measurement (GPM) program satellite constellation (www.nasa.gov/mission_pages/GPM/main/index.html). The experiment included sampling in fronts, extratropical cyclones, and atmospheric rivers, with the University of Washington running real-time WRF down to 4-km grid spacing in support of the missions of the NASA DC-8 and ER-2 and University of North Dakota Citation aircraft. ORCAS (January–February 2016) sought to improve the understanding of the physical and biological controls on air–sea exchange of O2 and CO2 in the Southern Ocean. Within the framework of AMPS, WRF provided weather guidance for the NCAR High-Performance Instrumented Airborne Platform for Environmental Research (HIAPER; Gulfstream V) aircraft chemical measurement flights over the Southern Ocean and the Antarctic Peninsula.
Private sector real-time use of WRF is significant. Two large concerns that employ WRF for regular NWP services are The Weather Company and Vaisala. The Weather Company (www.theweathercompany.com) runs a modified version of WRF under the name Rapid Precision Mesoscale (RPM) to provide a range of services and products, including flight forecasting, which focuses on convection, turbulence, and icing. The Weather Company also has pushed the delivery of WRF-based weather information into the medium of television production and applications for smartphones, tablets, and desktop systems. Vaisala (www.vaisala.com), known in the meteorology community for its atmospheric measurement systems and information services, runs WRF for renewable energy efforts, both to assess the potential for wind and solar energy projects and to improve return on existing facility investments. Its services include probabilistic forecasting to address wind energy scheduling issues and to maximize opportunities to boost energy delivery to the electric power grid. As another innovative direction for WRF NWP, Ignitia (www.ignitia.se) has targeted agriculture and is running a model configuration in aid of farmers in tropical West Africa, focusing on Ghana (Smith 2015). Here, subscribers receive text messages with WRF forecast information that has been customized for farm decision-making (e.g., deployment of workers or machinery), improving harvest and investment returns.
Tailored WRF systems and model applications.
To address Earth system prediction beyond weather, WRF supports a host of tailored capabilities. Most of these are compiled with WRF and run with it, as opposed to being stand-alone models run offline using WRF output. Thus, the capabilities have strong two-way interaction with the atmospheric component.
Air chemistry.
WRF-Chem is a WRF-based in-line atmospheric chemistry model (Grell et al. 2005; Fast et al. 2006). It is applied in a wide range of research on air quality and chemistry and has a spectrum of options to handle gas-phase and aqueous chemistry and aerosols. Table 2 lists the main process capabilities and applications of WRF-Chem. WRF-Chem integrates chemistry and dynamics at every time step, which is essential not only for investigating aerosol, weather, and climate interactions, but also for air quality research (Grell et al. 2004; Grell and Baklanov 2011; Baklanov et al. 2014). WRF-Chem can be used for dispersion forecasts (e.g., volcanic ash, dust, smoke, or other hazardous constituents), as well as the prediction and investigation of air quality (also with aerosol–radiation and aerosol–microphysics interactions) and complex interactions between chemistry, aerosols, and physics (such as aerosol indirect effects and aqueous-phase chemistry).
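Because the chemistry is compiled into WRF, it is activated through an additional namelist record. In the hedged sketch below, the variable names are actual WRF-Chem namelist entries, but the option values are placeholders; the mapping of numbers to specific mechanisms is documented in the WRF-Chem User’s Guide.

    &chem
     chem_opt        = 2,    ! gas-phase/aerosol mechanism selection
     chemdt          = 2.,   ! chemistry time step (min)
     emiss_opt       = 3,    ! anthropogenic emissions treatment
     bio_emiss_opt   = 1,    ! biogenic emissions treatment
     aer_ra_feedback = 1,    ! allow aerosol feedback on radiation (direct effect)
    /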
Table 2. Main capabilities and applications of WRF-Chem, version 3.8. Descriptions of the parameterizations/schemes for the various processes can be found in the WRF-Chem User’s Guide (NOAA 2016) and in WRF-Chem-related publications (see http://ruc.noaa.gov/wrf/wrf-chem/References/WRF-Chem.references.htm).
The NOAA ESRL provides WRF-Chem community support, including tutorials, and its WRF-Chem help desk provides guidance in model use, access to the code repository, and news on developments. The WRF-Chem development community convenes annually at the WRF Users’ Workshop, and in 2016 a special issue devoted to WRF-Chem was jointly organized between the open-access journals Geoscientific Model Development and Atmospheric Chemistry and Physics (see www.atmos-chem-phys.net/special_issue365_33.html).
Hydrology.
The WRF Hydrological Modeling Extension Package (WRF-Hydro) is a hydrology modeling capability providing both fully coupled two-way interactions with WRF and a stand-alone capability (Gochis et al. 2015). First released to the community in 2013, WRF-Hydro is “multiscale” in that it can represent physical processes such as precipitation, infiltration, snowmelt, hillslope overland flow, and channel flow on varying grids. In addition, it is “multiphysics” in that it offers a number of options for representing hydrologic processes. Figure 5 depicts the WRF-Hydro forecasting elements.
WRF-Hydro includes a real-time streamflow data assimilation system, and outputs include snowpack (snow depth and snow water equivalent), soil moisture, evapotranspiration, standing/ponded water, shallow groundwater, and flow through rivers and reservoirs. The model code has been used in a variety of process and forecasting studies (e.g., David et al. 2009; Yucel et al. 2015; Senatore et al. 2015; Givati et al. 2016). Through a collaboration between NCAR and the NOAA National Water Center, WRF-Hydro was chosen as the framework for the national hydrologic prediction system and has begun operational forecasting.
Fire weather.
WRF-Fire (Coen et al. 2013) couples WRF with a wildland fire-behavior physics package. The system has two-way coupling with the atmospheric dynamics such that at each time step near-surface WRF winds direct the fire spread rate and direction. In turn, the sensible and latent heat fluxes from the combustion force the atmosphere, producing fire-induced winds. The physics package keeps track of the subgrid-scale interface (the “flaming front”) between burning and unignited areas; calculates the fire spread rate as a function of local wind, slope, and fuel properties (e.g., amount, structure, and moisture); calculates the consumption rate of fuel; and releases heat fluxes into the boundary layer. WRF-Fire has been applied to reproduce fire phenomena (Simpson et al. 2013, 2014), investigate the effects of fuel moisture content and type (Coen et al. 2013), interpret wildfire case studies (Peace et al. 2015), and predict smoke and air quality (Kochanski et al. 2015). As an illustration of WRF-Fire, Fig. 6 shows output from a simulation of a 2015 Colorado grassland fire on a 200-m grid.
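The spread-rate computation noted above follows a Rothermel-type semi-empirical formulation. Schematically (the notation here is generic rather than WRF-Fire’s exact coding),

    R = R_0\,(1 + \phi_W + \phi_S),

where R_0 is the no-wind, no-slope rate of spread determined by the fuel properties, \phi_W is a dimensionless wind factor computed from the near-surface WRF wind component normal to the fire line, and \phi_S is a slope factor computed from the terrain gradient.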
Tropical cyclones.
HWRF (Tallapragada et al. 2014) is a version of WRF tailored for operational hurricane forecasting. NCEP, the primary developer, deploys HWRF worldwide for real-time prediction of tropical cyclones for which the National Hurricane Center (NHC), the Joint Typhoon Warning Center (JTWC), and the Central Pacific Hurricane Center (CPHC) have responsibility. HWRF employs the NMM core and, to better capture air–sea interaction, is coupled to the Princeton Ocean Model (POM; Mellor 2004) in the Atlantic and northeast Pacific basins. Plans include the coupling of a surface wave model (NOAA WAVEWATCH III; Tolman 2009) as well.
WRF has been applied extensively for tropical cyclone (TC) research, and the Advanced Hurricane WRF (Davis et al. 2008b, 2010) is an ARW configuration tuned for this purpose. TC studies have exploited WRF’s menu of physics options and moving nest capabilities to investigate storm genesis, intensification, structure, and evolution (Davis et al. 2008a; Fierro et al. 2009; Fang and Zhang 2010; Wang et al. 2010; H. Chen et al. 2011; Chen and Zhang 2013; Miller et al. 2015). Furthermore, coupled systems of WRF and ocean and wave models have been built to improve TC forecasts, capture air–sea interactions, and better represent ocean and storm impacts/responses (Warner et al. 2010; B. Liu et al. 2011; Gopalakrishnan et al. 2012; Kim et al. 2014; Liu et al. 2015; Chen and Curcic 2016).
Also in the vein of better simulating atmosphere–ocean interactions, the Coupled Ocean–Atmosphere–Wave–Sediment Transport (COAWST) system (Warner et al. 2010) was created and applied for hurricane simulations (Zambon et al. 2014). Supported by the Woods Hole Oceanographic Institution, COAWST links WRF to the ocean model Regional Ocean Modeling System (ROMS; Shchepetkin and McWilliams 2005), the wave model Simulating Waves Nearshore (SWAN; Booij et al. 1999), and the Community Sediment Transport Modeling System (Warner et al. 2008). Similarly, the University of Miami has coupled WRF to the Hybrid Coordinate Ocean Model (HYCOM; Bleck 2002) and the University of Miami Wave Model (UMWM; Donelan et al. 2012) for hurricane studies (Chen et al. 2013; Chen and Curcic 2016).
Urban meteorology.
WRF-Urban (F. Chen et al. 2011) couples an integrated, multiscale, multiphysics urban modeling system to WRF to represent the impacts of urbanization on regional weather and climate, public health, and water resources. The system integrates localized city morphology datasets and multiple urban-modeling capabilities (e.g., building/structure effect parameterizations) to capture interactions among heat islands, the city canopy boundary layer, and mesoscale atmospheric conditions. WRF-Urban is being applied by more than 100 groups in 25 countries for forecasting weather and air quality for cities (e.g., at the Beijing Institute of Urban Meteorology; Barlage et al. 2016), for investigating the impacts of urbanization on regional meteorology and water resources, and for exploring adaptation strategies of urban planners.
Solar and wind energy.
WRF-Solar (Jimenez et al. 2016) is a configuration of WRF tailored for solar energy forecasting and applications. Among other modifications, it includes improved solar tracking and parameterizations, allowing either the interpolation of irradiances between expensive radiation scheme calls or the use of a fast radiative transfer code for irradiance calculations. WRF-Solar has improved representations of aerosol–radiation, cloud–aerosol, and cloud–radiation interactions and has been found to yield more accurate aerosol, shortwave irradiance, and solar parameter predictions (Jimenez et al. 2016).
WRF use in wind energy settings (highlighted above as a real-time use) is also growing. Applications in this area are pushing WRF to large-eddy resolutions to predict winds at turbine hub heights (e.g., 80 m AGL; Y. Liu et al. 2011). An award-winning WRF-based capability is the NCAR Wind and Solar Power Forecasting System (see Mahoney et al. 2012; Haupt and Mahoney 2015). This system provides forecasts over wind farms serving the regional utility Xcel Energy, which has the largest wind energy generation capacity in the United States (Haupt and Mahoney 2015). The U.S. Department of Energy’s National Renewable Energy Laboratory has collaborated in the development of the power production algorithms using the WRF data, which target turbine heights and which the system translates into energy generation predictions provided to Xcel.
Large-eddy-scale modeling.
WRF with a turbulence-resolving capability based on a large-eddy simulation (LES) approach is called WRF-LES. Through very fine (e.g., 50–100 m) grids it can simulate flows in idealized, canonical atmospheric boundary layers as well as realistic, evolving boundary layers driven by large-scale flows. WRF-LES has been verified in planetary boundary layer (PBL) simulations under a spectrum of stability conditions (Mirocha et al. 2010; Muñoz-Esparza et al. 2014, 2015). To develop a multiscale capability that bridges the mesoscale–microscale gap, ongoing developments include a scale-aware PBL scheme for the 100-m to 1-km range, a turbulence-triggering LES approach, and a surface layer parameterization accounting for surface heterogeneity (see, e.g., Mirocha et al. 2014; Aitken et al. 2014; Xiao et al. 2015).
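In practice, configuring ARW for LES amounts to disabling the PBL parameterization and selecting a three-dimensional subgrid turbulence closure. A minimal namelist sketch (values illustrative; a real case also sets surface flux and damping options):

    &physics
     bl_pbl_physics = 0,   ! no PBL scheme: boundary layer turbulence is resolved
    /
    &dynamics
     diff_opt = 2,         ! full three-dimensional diffusion in physical space
     km_opt   = 2,         ! eddy viscosity from a prognostic 3D TKE closure
    /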
WRF-LES is probing the large-eddy scale (Moeng et al. 2007; Mirocha et al. 2010) and aims to represent the scales, energies, and transport associated with atmospheric turbulence. It can simulate flows in hurricane boundary layers (Zhu 2008) and across wind farms (Y. Liu et al. 2011), and very high-resolution (e.g., Δx ≈ 50 m) simulations are being applied to explore both idealized settings (Kirkil et al. 2012) and real-data cases (Talbot et al. 2012). Recent work has significantly improved WRF for large-eddy applications by changing the model’s prognostic formulation of potential temperature (Xiao et al. 2015), with the benefits of avoiding spurious simulated motions and reducing the computational cost of LES runs. Pushing WRF to explore the microscale is helping to shape one of the model’s future roles in atmospheric modeling: addressing scales out of the practical reach of global models while spanning domains beyond which classical large-eddy modeling or direct numerical simulation can be afforded computationally.
Polar environments.
Polar WRF (Hines and Bromwich 2008; Bromwich et al. 2009; Hines et al. 2011; Bromwich et al. 2013) provides options to more accurately represent conditions over the high latitudes and ice sheets. These include fractional sea ice representation, adjustments to the thermal/radiative properties of ice and snow surfaces, and specification of sea ice albedo and snow depth on sea ice. While the released version of WRF incorporates most of these modifications, the full Polar WRF code with the latest updates is provided by The Ohio State University (http://polarmet.osu.edu/PWRF). Supporting both climate and meteorological studies, the recently prepared Arctic System Reanalysis (ASR) dataset (Bromwich et al. 2016) has been generated from Polar WRF. This regional dataset targeting the Arctic is of higher resolution than the widely used European Centre for Medium-Range Weather Forecasts (ECMWF) interim reanalysis (ERA-Interim) and has shown improvements over it (Bromwich et al. 2015).
Teaching and scientific training.
WRF is used at universities around the world as a vehicle for teaching and thesis work. Based on a survey of professors at University Corporation for Atmospheric Research (UCAR) member institutions, the ways in which the system is used include instruction on NWP, numerical methods, and atmospheric processes; research projects; real-time runs for forecasting and weather classes; and model output for lecture and class materials. For studies, both graduate and undergraduate students can readily apply WRF to explore a spectrum of atmospheric research problems, using its idealized, real-data, or specialty capabilities. Furthermore, they can quickly begin to address scientific questions without having to reinvent and build a complex model as a study tool. That WRF can readily be installed and run on common computer hardware (i.e., workstations, clusters, and laptops) has been a big factor in its use in education.
THE WRF COMMUNITY SUPPORT EFFORT.
In the early years, the original WRF partners directed the development and support of WRF. As the system became established, the broader community increasingly undertook development. Model support, however, has remained centralized in NCAR, and today that effort includes user assistance, system oversight, and integration of code contributions.
Oversight.
Part of NCAR’s mission is to support the university community’s atmospheric research through facilities. Thus, as it had originally supported the MM5, NCAR transitioned that role to WRF, and NCAR currently manages and oversees the system. Of the original WRF management structure, the Research Applications Board (RAB) and certain working groups continue. The RAB, composed of scientists from the community, convenes during the annual WRF Users’ Workshop and considers WRF scientific issues and research community interests.
NCAR, and specifically its Mesoscale and Microscale Meteorology (MMM) Laboratory, is responsible for the management of the WRF code, for the oversight of releases, and for providing community support. To coordinate code and release management with community participation, the WRF Developers’ Committee and the WRF Release Committee were established. The Developers’ Committee (DC) has responsibility for maintaining the WRF system software by approving and overseeing code contributions, code testing, and repository upkeep. Members of the DC are active in creating code and in maintaining the WRF system, and they shepherd new code into the WRF repository by being liaisons with external contributors.
The WRF Release Committee (RC) oversees the model major release process. It sets the release schedule, compiles a list of features proposed by developers, and reviews the progress of candidate items. The committee makeup reflects the active areas of WRF code development (e.g., software, physics, data assimilation, chemistry), and its members serve as points of contact with contributors in these areas. Major releases are issued annually (in April), typically with subsequent minor (primarily bug fix) releases by the following September.
Since WRF’s initial development, new components and capabilities have come from directed funding by agencies and from contributions from those in the research and operational communities. Anyone may offer code to WRF: submissions are not restricted to prescribed groups. Among the requirements for contributions, however, developers are responsible for the testing of their code and for providing the results to the DC (see NCAR 2016).
The WRF Physics Review Panel reviews new physics packages that are submitted by developers for inclusion in the model repository. Functioning like a journal editor, the panel communicates with the prospective contributors and taps relevant experts from the scientific community to anonymously review the proposed packages. The review materials required from developers include testing results and code documentation. The panel considers the reviewers’ input, with criteria that include scientific soundness, novelty of approach, and projected community interest, and makes a recommendation to the Developers’ Committee on whether the package should be accepted.
Community support.
Furnishing user support to the worldwide WRF community is a major effort. For the core WRF system (WRF and WRFDA), the MMM Laboratory provides this mainly under its base budget from the National Science Foundation. For WRF specialty systems, the primary development groups provide support (e.g., NOAA/ESRL for WRF-Chem). Fundamental to core WRF support is user assistance, provided chiefly through the wrfhelp service. Wrfhelp receives 350–475 inquiries a month, ranging from questions on input data, to model configuration and compilation, to run-time problems. Model support also includes maintaining the user pages and downloadable materials, such as the WRF code, utilities, and documentation, and an important function is working with developers to integrate code contributions for releases. Users can also get assistance informally from other, experienced users through an online WRF forum (http://forum.wrfforum.com).
Tutorials have been a pillar of community support. The primary tutorials are hosted by NCAR in January and July, are a week long, and consist of lectures and practice sessions. Class sizes are typically 60. There are also regular tutorials for WRFDA and WRF-Chem. The NCAR support team also periodically delivers tutorials abroad.
WRF support includes organizing and hosting the annual WRF Users’ Workshop. Held at NCAR (Boulder, Colorado) each June, the workshop brings together the WRF community to discuss model developments, results, and issues. Popular workshop sessions include lectures on a selected modeling topic (e.g., cumulus parameterizations, data assimilation), group discussions on the main WRF-related areas (e.g., physics, software), and mini tutorials on model-related utilities (e.g., visualization tools). The workshop averages about 200 registrants per year.
FUTURE DIRECTIONS.
Though mature, WRF continues to advance and to effectively serve the needs of its user groups through the creative developments and applications of the community. Sketched below are future directions for WRF. It is emphasized that the examples are not exhaustive and that the varied interests of the user community will ultimately draw the picture.
While we have presented many positives of the model, WRF is not without issues and challenges. As one example, although WRF computes efficiently in parallel, it can exhibit relatively slow I/O performance in certain massively parallel environments. Currently, to reduce I/O time, WRF has an option to produce separate output files for each processor used. However, in settings with very high CPU counts, the number of resultant files can become a problem. Thus, to improve WRF’s parallel outputting, a future release will offer parallel I/O (PIO), a library for handling output that can scale in parallel with very large WRF domains.
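The split-output option noted above is chosen through the history output format setting in the namelist; a sketch, assuming the standard io_form conventions:

    &time_control
     io_form_history = 2,      ! 2 = one netCDF history file per domain
     ! io_form_history = 102,  ! 102 = split output, one file per MPI task;
     !                         ! faster to write, but files must be joined afterward
    /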
A related challenge is the processing of code contributions. A first issue is ensuring the value of new physics packages, in light of the accumulation of options in the model; in response, the WRF Physics Review Panel was set up. A second is that the testing and integration of new code require the resources of the WRF community support team; to reduce this demand, as well as to improve the quality of submitted code, more stringent requirements on contributors for code testing and documentation have been established. A further notable challenge is the pressure on the community support effort, in light of some erosion of the funding for this work. The community should note, however, that the WRF effort will continue to provide basic support (e.g., user help) and will prioritize that over other activities as necessary.
Considering future demand, as long as WRF continues to meet users’ needs as an accurate, extensible, and scalable regional modeling system, it should remain widely employed. WRF is an established, understood system with a long track record. It has physics and specialty systems for capturing a range of Earth system processes, and the needs for these types of capabilities will continue. Furthermore, WRF can focus on regional domains with resolutions beyond what most users can attain using global models. One future role for WRF is thus in operating in the spectral range between the applied scales of global and LES models: the space from the convective to large-eddy scales.
In terms of WRF development, one direction is that of connected physics packages. With this, one scheme sends prognostic quantities to another, instead of the latter relying on its own estimates (e.g., from climatologies). An example is the coupling of the Thompson microphysics scheme and Rapid Radiative Transfer Model for general circulation models (RRTMG) radiation scheme, as described in Thompson et al. (2016). Whereas the radiation scheme would assume sizes of water and ice species, here the microphysics scheme supplies the predicted values of effective radii of cloud water, ice, and snow (Thompson and Eidhammer 2014; Thompson et al. 2016). Similar work is addressing aerosol information for microphysics and radiation schemes, and cumulus and radiation scheme coupling has also been done (Alapaty et al. 2012). Leveraging the prognostic output from physics packages and increasing the consistency of assumptions and calculations within the model are benefits from the connected physics approach.
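The pattern can be sketched in Fortran. The fragment below is purely illustrative (WRF’s actual driver interfaces and Registry-managed arrays differ): a microphysics-style routine diagnoses a cloud droplet effective radius from its predicted mass and number mixing ratios, and a radiation-style routine consumes that radius instead of assuming one.

    ! Illustrative sketch of connected physics; not WRF's actual interfaces.
    module connected_physics_sketch
      implicit none
      real, parameter :: pi = 3.14159265, rho_w = 1000.0  ! liquid water density (kg m-3)
    contains
      ! "Microphysics": diagnose cloud droplet effective radius (m) from
      ! predicted mass (kg/kg) and number (1/kg) mixing ratios.
      subroutine mp_effective_radius(qc, nc, rho_air, re_cloud)
        real, intent(in)  :: qc, nc, rho_air
        real, intent(out) :: re_cloud
        real :: lwc, num
        lwc = qc*rho_air              ! liquid water content (kg m-3)
        num = max(nc*rho_air, 1.0e3)  ! droplet number (m-3), guarded against zero
        ! Mean-volume radius of the population; a real scheme applies a
        ! size-distribution-dependent factor.
        re_cloud = (3.0*lwc/(4.0*pi*rho_w*num))**(1.0/3.0)
      end subroutine mp_effective_radius

      ! "Radiation": use the supplied radius rather than an internal assumption.
      ! Cloud optical depth scales approximately as LWP/re, so smaller droplets
      ! imply an optically thicker cloud for the same water path.
      subroutine ra_cloud_optics(lwp, re_cloud, tau)
        real, intent(in)  :: lwp, re_cloud  ! liquid water path (kg m-2), radius (m)
        real, intent(out) :: tau            ! cloud optical depth (dimensionless)
        tau = 1.5*lwp/(rho_w*max(re_cloud, 1.0e-6))
      end subroutine ra_cloud_optics
    end module connected_physics_sketch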
Related to coupled physics is the idea of defining physics suites in WRF. Physics suites are combinations of schemes (e.g., microphysics, radiation, cumulus, PBL) that are run together. In WRF these can reflect schemes that have been coupled or package combinations that are known, or may be tuned, to perform well together. Motivations for designating suites are to improve the understanding of model behavior and ultimately improve model performance: concentrated use of a suite could focus attention on a scheme set, accelerate improvements within that set, and yield a better-documented model configuration. Having suites known for their suitability for specific grid sizes and applications (e.g., regional climate or hurricane prediction) can guide users in configuring the model, and the more consistent use and verification of a given combination can promote its further development.
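In recent WRF releases this idea appears as a single namelist switch; the sketch below assumes the physics_suite option (introduced after the bulk of the development described here), which expands to a documented set of scheme choices:

    &physics
     physics_suite = 'CONUS',  ! select a named, tested scheme combination
     ! Individual scheme settings, if also present, override suite members.
    /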
The implementation of scale-aware physics schemes is another goal for WRF. These are packages that aim to perform accurately over a range of horizontal grid spacings without the problems arising from the breakdown of assumptions tied to specific scales (see, e.g., Molinari and Dudek 1992). Such schemes would clearly benefit applications with nesting that enters the “gray zone” between parameterized and explicit scales of convection (see, e.g., Wyngaard 2004; Yu and Lee 2010; Hong and Dudhia 2012; Shin and Hong 2013; Ching et al. 2014). Ideally, a scale-aware physics scheme would run accurately from hydrostatic (e.g., ≥10 km), to convection-permitting (e.g., 1–3 km), to large-eddy (e.g., tens/hundreds of meters) scales. Recently, Shin and Hong (2015) presented an algorithm for vertical transport in convective boundary layers aimed to bridge the subgrid- to resolved-scale gap, and such approaches may aid in the development of scale-aware schemes for WRF.
The future will see continued work to advance WRF computational performance. First, efforts are underway to speed up popular physics packages. For example, software engineers at the University of Wisconsin have worked on the Goddard longwave and shortwave radiation packages, the Yonsei University (YSU) PBL scheme, and the Thompson microphysics scheme, with plans to target other schemes as well. Their recoding improvements have been based on analyses from benchmarking and instrumentation programs. Preliminary results are showing substantive speedups (e.g., 2–4 times) in the selected schemes while maintaining bit-for-bit identical results (Mielikainen et al. 2014; Huang et al. 2014).
Second, to improve overall code performance and to more effectively use high-performance computing (HPC) platforms, Open Multiprocessing (OpenMP) compute environments for WRF will be exploited. This is being done through hybrid parallelism, in which there is shared-memory parallel operation (i.e., using OpenMP) within a compute node and a distributed-memory parallel approach [using message passing interface (MPI)] for communications between nodes. Testing in WRF has shown that hybrid parallelism can be better than either pure MPI or OpenMP. In addition, the WRF framework’s emphasis on MPI tasks and OpenMP threads aligns well with next-generation computers adopting the Many Integrated Core (MIC) approach (Satish et al. 2010). This strategy uses large numbers of lower-performance processors on a single chip to increase hardware parallelism, computations per watt, and overall performance.
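As a minimal sketch of the hybrid pattern (illustrative, not WRF code): each MPI task owns a patch of the domain, and OpenMP threads work over tiles within the patch.

    ! Hybrid MPI + OpenMP sketch; compile with an MPI Fortran compiler and
    ! OpenMP enabled (analogous to WRF's "dm+sm" build option).
    program hybrid_sketch
      use mpi
      use omp_lib
      implicit none
      integer :: ierr, rank, nranks
      call MPI_Init(ierr)                    ! distributed memory: tasks over patches
      call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
      call MPI_Comm_size(MPI_COMM_WORLD, nranks, ierr)
      !$omp parallel                         ! shared memory: threads over tiles
      print '(a,i0,a,i0,a,i0,a,i0)', 'task ', rank, ' of ', nranks, &
            ': thread ', omp_get_thread_num(), ' of ', omp_get_num_threads()
      !$omp end parallel
      call MPI_Finalize(ierr)
    end program hybrid_sketch

Run with, say, a few MPI tasks per node and several threads per task, this reproduces in miniature the decomposition WRF applies to its domains.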
As suggested above, WRF will continue to adapt to, and exploit, new computing trends and designs. Supercomputing is moving toward architectures with legions of cheaper, lower-clock-speed, and more energy-efficient processors. For instance, the last three main community machines at NCAR have featured increased processor counts and decreased clock speeds: Bluefire (2008–13) with 4,064 processors at 4.7 GHz; Yellowstone (2012–17) with 72,576 processors at 2.6 GHz; and Cheyenne (2017 and beyond) with 145,152 processors at 2.3 GHz. WRF runs well on these architectures. However, an opposite paradigm to that of supercomputer mainframes is also emerging: cloud computing. This is the use of the Internet to enlist a large number of remote servers for job processing, and WRF can take advantage of it. As an example, Amazon Web Services (https://aws.amazon.com) offers software and tools for users to create a compute cluster for doing WRF simulations via the cloud. Cloud computing for WRF promises users the ability to do modeling requiring compute resources otherwise beyond their hardware means.
One final aspect of the future of WRF is its relationship to other models, in particular global models and the Model for Prediction Across Scales (MPAS) project (Skamarock et al. 2012). Variable-resolution global models, such as MPAS, may provide clear benefits over regional models that rely on imposed lateral boundary conditions derived from those larger-scale models (see, e.g., Park et al. 2014). We expect, however, that WRF will continue to be a preferred modeling platform for increasingly higher-resolution, shorter-time-scale applications (LES, wind energy, urban meteorology, cloud and storm dynamics, etc.) for the foreseeable future, and so there will be substantial benefits in the sharing of components and capabilities across system types.
The likely areas of interface for WRF will be physics, couplers, and postprocessing. The ability of WRF to share specific physics scheme sets with other models (e.g., MPAS) would support the testing of packages across scales and comparisons of model platforms. It would also allow WRF to use the same physics as the model supplying its first-guess field and boundary conditions, giving a measure of consistency between the initial conditions (ICs)/boundary conditions (BCs) and the simulation. As an example of WRF’s move toward shared physics, the WRF and MPAS efforts are currently considering a common physics repository.
A specific coupler or coupling approach is another capability that WRF could adopt in common with other atmospheric modeling systems (CESM, MPAS, etc.). This would allow WRF to join readily with other types of environmental models (e.g., ocean, wave), using the same developed and supported connectivity, and it could allow for coupled WRF systems incorporating the same Earth process components as global systems. Similarly, a postprocessing capability that WRF shared with MPAS could provide an improved tool for output analysis (e.g., diagnostics and graphics) and the multimodel user interest in the common capability could promote its development and facilitate its support.
CLOSING REMARKS.
In retrospect, WRF’s significance for meteorology and atmospheric modeling in large part rests on the fact that over the years it has not simply supported—but stimulated—a productive and evolving community by providing solid common ground on which to pursue ideas and build on results. WRF has pioneered in, and expanded the boundaries of, high-resolution and coupled atmospheric prediction. Over its tenure, it has supported and served research, education, public needs, and private interests.
Looking forward, WRF will continue to enable the research and NWP communities as a powerful resource. Though computing and modeling advancements are allowing global models to address the mesoscale with higher resolution, WRF will in turn be able to capture regional and smaller scales more effectively with even finer grids, reaching down to large-eddy grid spacing. WRF will continue to be an effective platform for developing improved representations of physical processes and for serving as the foundation of an array of tailored and coupled systems for integrated Earth system modeling. WRF is truly a community model, and its populous, diverse user base will keep driving it to meet an evolving range of scientific, operational, educational, and commercial needs. It is envisioned that the community’s innovative applications and contributions will advance a vital WRF Model for years to come.
ACKNOWLEDGMENTS.
On behalf of itself and the community, NCAR thanks the National Science Foundation for its funding of WRF community support over the years.
REFERENCES
Aitken, M. L., B. Kosović, J. D. Mirocha, and J. K. Lundquist, 2014: Large eddy simulation of wind turbine wake dynamics in the stable boundary layer using the Weather Research and Forecasting Model. J. Renewable Sustainable Energy, 6, 033137, doi:10.1063/1.4885111.
Akter, N., 2015: Mesoscale convection and bimodal cyclogenesis over the Bay of Bengal. Mon. Wea. Rev., 143, 3495–3517, doi:10.1175/MWR-D-14-00260.1.
Alapaty, K., J. A. Herwehe, T. L. Otte, C. G. Nolte, O. R. Bullock, M. S. Mallard, J. S. Kain, and J. Dudhia, 2012: Introducing subgrid-scale cloud feedbacks to radiation for regional meteorological and climate modeling. Geophys. Res. Lett., 39, L24809, doi:10.1029/2012GL054031.
Anderson, J., T. Hoar, K. Raeder, H. Liu, and N. Collins, 2009: The Data Assimilation Research Testbed: A community facility. Bull. Amer. Meteor. Soc., 90, 1283–1296, doi:10.1175/2009BAMS2618.1.
Baklanov, A., and Coauthors, 2014: Online coupled regional meteorology chemistry models in Europe: Current status and prospects. Atmos. Chem. Phys., 14, 317–398, doi:10.5194/acp-14-317-2014.
Barker, D. M., W. Huang, Y.-R. Guo, A. J. Bourgeois, and Q.-N. Xiao, 2004: A three-dimensional (3DVAR) data assimilation system for use with MM5: Implementation and initial results. Mon. Wea. Rev., 132, 897–914, doi:10.1175/1520-0493(2004)132<0897:ATVDAS>2.0.CO;2.
Barker, D. M., and Coauthors, 2012: The Weather Research and Forecasting Model’s Community Variational/Ensemble Data Assimilation System. Bull. Amer. Meteor. Soc., 93, 831–843, doi:10.1175/BAMS-D-11-00167.1.
Barlage, M., S. Miao, and F. Chen, 2016: Impact of physics parameterizations on high-resolution weather prediction over two Chinese megacities. J. Geophys. Res. Atmos., 121, 4487–4498, doi:10.1002/2015JD024450.
Barth, M. C., and Coauthors, 2015: The Deep Convective Clouds and Chemistry (DC3) field campaign. Bull. Amer. Meteor. Soc., 96, 1281–1309, doi:10.1175/BAMS-D-13-00290.1.
Benjamin, S. G., and Coauthors, 2016: A North American hourly assimilation and model forecast cycle: The Rapid Refresh. Mon. Wea. Rev., 144, 1669–1694, doi:10.1175/MWR-D-15-0242.1.
Bleck, R., 2002: An oceanic general circulation model framed in hybrid isopycnic-Cartesian coordinates. Ocean Modell., 4, 55–88, doi:10.1016/S1463-5003(01)00012-9.
Booij, N., R. C. Ris, and L. H. Holthuijsen, 1999: A third-generation wave model for coastal regions. Part I: Model description and validation. J. Geophys. Res., 104, 7649–7666, doi:10.1029/98JC02622.
Brewer, M. C., C. F. Mass, and B. E. Potter, 2013: The West Coast thermal trough: Mesoscale evolution and sensitivity to terrain and surface fluxes. Mon. Wea. Rev., 141, 2869–2896, doi:10.1175/MWR-D-12-00305.1.
Bromwich, D. H., K. M. Hines, and L.-S. Bai, 2009: Development and testing of Polar Weather Research and Forecasting model: 2. Arctic Ocean. J. Geophys. Res., 114, D08122, doi:10.1029/2008JD010300.
Bromwich, D. H., F. O. Otieno, K. M. Hines, K. W. Manning, and E. Shilo, 2013: Comprehensive evaluation of polar weather research and forecasting model performance in the Antarctic. J. Geophys. Res. Atmos., 118, 274–292, doi:10.1029/2012JD018139.
Bromwich, D. H., A. B. Wilson, L.-S. Bai, G. W. K. Moore, and P. Bauer, 2016: A comparison of the regional Arctic System Reanalysis and the global ERA-Interim Reanalysis for the Arctic. Quart. J. Roy. Meteor. Soc., 142, 644–658, doi:10.1002/qj.2527.
Bruyère, C. L., J. M. Done, G. J. Holland, and S. Fredrick, 2014: Bias corrections of global models for regional climate simulations of high-impact weather. Climate Dyn., 43, 1847–1856, doi:10.1007/s00382-013-2011-6.
Chen, F., and Coauthors, 2011: The integrated WRF/urban modelling system: Development, evaluation, and applications to urban environmental problems. Int. J. Climatol., 31, 273–288, doi:10.1002/joc.2158.
Chen, H., and D.-L. Zhang, 2013: On the rapid intensification of Hurricane Wilma (2005). Part II: Convective bursts and the upper-level warm core. J. Atmos. Sci., 70, 146–162, doi:10.1175/JAS-D-12-062.1.
Chen, H., D.-L. Zhang, J. Carton, and R. Atlas, 2011: On the rapid intensification of Hurricane Wilma (2005). Part I: Model prediction and structural changes. Wea. Forecasting, 26, 885–901, doi:10.1175/WAF-D-11-00001.1.
Chen, S. S., and M. Curcic, 2016: Ocean surface waves in Hurricane Ike (2008) and Superstorm Sandy (2012): Coupled model predictions and observations. Ocean Modell., 103, 161–176, doi:10.1016/j.ocemod.2015.08.005.
Chen, S. S., W. Zhao, M. A. Donelan, and H. L. Tolman, 2013: Directional wind–wave coupling in fully coupled atmosphere–wave–ocean models: Results from CBLAST-Hurricane. J. Atmos. Sci., 70, 3198–3215, doi:10.1175/JAS-D-12-0157.1.
Ching, J., R. Rotunno, M. A. LeMone, A. Martilli, B. Kosović, P. A. Jimenez, and J. Dudhia, 2014: Convectively induced secondary circulations in fine-grid mesoscale numerical weather prediction models. Mon. Wea. Rev., 142, 3284–3302, doi:10.1175/MWR-D-13-00318.1.
Clark, A. J., and Coauthors, 2012: An overview of the 2010 Hazardous Weather Testbed Experimental Forecast Program Spring Experiment. Bull. Amer. Meteor. Soc., 93, 55–74, doi:10.1175/BAMS-D-11-00040.1.
Coen, J. L., M. Cameron, J. Michalakes, E. G. Patton, P. J. Riggan, and K. M. Yedinak, 2013: WRF-Fire: Coupled weather–wildland fire modeling with the Weather Research and Forecasting model. J. Appl. Meteor. Climatol., 52, 16–38, doi:10.1175/JAMC-D-12-023.1.
David, C. H., D. J. Gochis, D. R. Maidment, W. Yu, D. N. Yates, and Z.-L. Yang, 2009: Using NHDPlus as the land base for the Noah-distributed model. Trans. GIS, 13, 363–377, doi:10.1111/j.1467-9671.2009.01169.x.
Davis, C. A., S. C. Jones, and M. Riemer, 2008a: Hurricane vortex dynamics during Atlantic extratropical transition. J. Atmos. Sci., 65, 714–736, doi:10.1175/2007JAS2488.1.
Davis, C. A., and Coauthors, 2008b: Prediction of landfalling hurricanes with the Advanced Hurricane WRF Model. Mon. Wea. Rev., 136, 1990–2005, doi:10.1175/2007MWR2085.1.
Davis, C. A., W. Wang, J. Dudhia, and R. Torn, 2010: Does increased horizontal resolution improve hurricane wind forecasts? Wea. Forecasting, 25, 1826–1841, doi:10.1175/2010WAF2222423.1.
Done, J. M., G. J. Holland, C. L. Bruyère, L. R. Leung, and A. Suzuki-Parker, 2015: Modeling high-impact weather and climate: Lessons from a tropical cyclone perspective. Climatic Change, 129, 381–395, doi:10.1007/s10584-013-0954-6.
Donelan, M. A., M. Curcic, S. S. Chen, and A. K. Magnusson, 2012: Modeling waves and wind stress. J. Geophys. Res., 117, C00J23, doi:10.1029/2011JC007787.
DTC/NCEP, 2014: NMM version 3 modeling system user’s guide. Developmental Testbed Center, chap. 1–7. [Available online at www.dtcenter.org/wrf-nmm/users/docs/user_guide/V3/users_guide_nmm_chap1-7.pdf.]
Dudhia, J., 2014: A history of mesoscale model development. Asia-Pac. J. Atmos. Sci., 50, 121–131, doi:10.1007/s13143-014-0031-8.
DuVivier, A. K., and J. J. Cassano, 2013: Evaluation of WRF Model resolution on simulated mesoscale winds and surface fluxes near Greenland. Mon. Wea. Rev., 141, 941–963, doi:10.1175/MWR-D-12-00091.1.
Fang, J., and F. Zhang, 2010: Initial development and genesis of Hurricane Dolly (2008). J. Atmos. Sci., 67, 655–672, doi:10.1175/2009JAS3115.1.
Fast, J. D., W. I. Gustafson, E. G. Chapman, R. C. Easter, J. Rishel, R. A. Zaveri, G. A. Grell, and M. Barth, 2006: Evolution of ozone, particulates, and aerosol direct forcing in an urban area using a new fully-coupled meteorology, chemistry, and aerosol model. J. Geophys. Res., 111, D21305, doi:10.1029/2005JD006721.
Fierro, A. O., R. F. Rogers, F. D. Marks, and D. S. Nolan, 2009: The impact of horizontal grid spacing on the microphysical and kinematic structures of strong tropical cyclones simulated with the WRF-ARW model. Mon. Wea. Rev., 137, 3717–3743, doi:10.1175/2009MWR2946.1.
Ganetis, S. A., and B. A. Colle, 2015: The thermodynamic and microphysical evolution of an intense snowband during the Northeast U.S. blizzard of 8–9 February 2013. Mon. Wea. Rev., 143, 4104–4125, doi:10.1175/MWR-D-14-00407.1.
Givati, A., D. J. Gochis, T. Rummler, and H. Kunstmann, 2016: Comparing one-way and two-way coupled hydrometeorological forecasting systems for flood forecasting in the Mediterranean region. Hydrology, 3, 19, doi:10.3390/hydrology3020019.
Gochis, D. J., W. Yu, and D. N. Yates, 2015: The WRF-Hydro Model technical description and user’s guide, version 3.0. NCAR Tech. Doc., 123 pp. [Available online at https://ral.ucar.edu/sites/default/files/public/images/project/WRF_Hydro_User_Guide_v3.0.pdf.]
Gopalakrishnan, S. G., S. Goldenberg, T. Quirino, X. Zhang, F. Marks Jr., K.-S. Yeh, R. Atlas, and V. Tallapragada, 2012: Toward improving high-resolution numerical hurricane forecasting: Influence of model horizontal grid resolution, initialization, and physics. Wea. Forecasting, 27, 647–666, doi:10.1175/WAF-D-11-00055.1.
Grell, G. A., and A. Baklanov, 2011: Integrated modeling for forecasting weather and air quality: A call for fully coupled approaches. Atmos. Environ., 45