• Bauer, P., A. Thorpe, and G. Brunet, 2015: The quiet revolution of numerical weather prediction. Nature, 525, 47–55, doi:10.1038/nature14956.

  • Candille, G., 2009: The multiensemble approach: The NAEFS example. Mon. Wea. Rev., 137, 1655–1665, doi:10.1175/2008MWR2682.1.

  • Dutton, J. A., 2002: Opportunities and priorities in a new era for weather and climate services. Bull. Amer. Meteor. Soc., 83, 1303–1311, doi:10.1175/1520-0477(2002)083<1303:OAPIAN>2.3.CO;2.

  • Gall, R., J. Franklin, F. Marks, E. N. Rappaport, and F. Toepfer, 2013: The Hurricane Forecast Improvement Project. Bull. Amer. Meteor. Soc., 94, 329–343, doi:10.1175/BAMS-D-12-00071.1.

  • Hooke, W. H., and R. A. Pielke Jr., 2000: Short-term weather prediction: An orchestra in search of a conductor. Prediction: Science, Decision Making, and the Future of Nature, D. Sarewitz, R. A. Pielke Jr., and R. Byerly, Eds., Island Press, 61–84.

  • Hurrell, J., G. A. Meehl, D. Bader, T. L. Delworth, B. Kirtman, and B. Wielicki, 2009: A unified modeling approach to climate system prediction. Bull. Amer. Meteor. Soc., 90, 1819–1832, doi:10.1175/2009BAMS2752.1.

  • Hurrell, J., and Coauthors, 2013: The Community Earth System Model: A framework for collaborative research. Bull. Amer. Meteor. Soc., 94, 1339–1360, doi:10.1175/BAMS-D-12-00121.1.

  • Kirtman, B. P., and Coauthors, 2014: The North American Multimodel Ensemble: Phase-1 seasonal-to-interannual prediction; Phase-2 toward developing intraseasonal prediction. Bull. Amer. Meteor. Soc., 95, 585–601, doi:10.1175/BAMS-D-12-00050.1.

  • Le Marshall, J., and Coauthors, 2007: The Joint Center for Satellite Data Assimilation. Bull. Amer. Meteor. Soc., 88, 329–340, doi:10.1175/BAMS-88-3-329.

  • Mass, C., 2006: The uncoordinated giant: Why U.S. weather research and prediction are not achieving their potential. Bull. Amer. Meteor. Soc., 87, 573–584, doi:10.1175/BAMS-87-5-573.

  • National Research Council, 1998: The Atmospheric Sciences: Entering the Twenty-First Century. National Academies Press, 384 pp.

  • National Research Council, 1999: A Vision for the National Weather Service: Road Map for the Future. National Academies Press, 88 pp.

  • National Research Council, 2000: From Research to Operations in Weather Satellites and Numerical Weather Prediction: Crossing the Valley of Death. National Academies Press, 96 pp.

  • National Research Council, 2005: Strategic Guidance for the National Science Foundation’s Support of the Atmospheric Sciences: An Interim Report. National Academies Press, 104 pp.

  • National Research Council, 2010: Assessment of Intraseasonal to Interannual Climate Prediction and Predictability. National Academies Press, 192 pp.

  • National Research Council, 2012: A National Strategy for Advancing Climate Modeling. National Academies Press, 300 pp.

  • National Research Council, 2016: Developing a U.S. Research Agenda to Advance Subseasonal to Seasonal Forecasting. National Academies Press, 372 pp.

  • Pielke, R. A., Jr., and R. E. Carbone, 2002: Weather impacts, forecasts, and policy: An integrated perspective. Bull. Amer. Meteor. Soc., 83, 393–403, doi:10.1175/1520-0477(2002)083<0393:WIFAP>2.3.CO;2.

  • Sandgathe, S., W. O’Connor, N. Lett, D. McCarren, and F. Toepfer, 2011: National Unified Operational Prediction Capability Initiative. Bull. Amer. Meteor. Soc., 92, 1347–1351, doi:10.1175/2011BAMS3212.1.

  • Shapiro, M., and A. Thorpe, 2004: THORPEX International science plan. WMO/TD-1246, WWRP/THORPEX 2, 55 pp. [Available online at www.wmo.int/pages/prog/arep/wwrp/new/documents/CD_ROM_international_science_plan_v3.pdf.]

  • Shuman, F. G., 1989: History of numerical weather prediction at the National Meteorological Center. Wea. Forecasting, 4, 286–296, doi:10.1175/1520-0434(1989)004<0286:HONWPA>2.0.CO;2.

  • Stensrud, D. J., H. E. Brooks, J. Du, M. S. Tracton, and E. Rogers, 1999: Using ensembles for short-range forecasting. Mon. Wea. Rev., 127, 433–446, doi:10.1175/1520-0493(1999)127<0433:UEFSRF>2.0.CO;2.

  • Theurich, G., and Coauthors, 2016: The Earth System Prediction Suite: Toward a coordinated U.S. modeling capability. Bull. Amer. Meteor. Soc., 97, 1229–1247, doi:10.1175/BAMS-D-14-00164.1.

  • Uccellini, L. W., 2012: The move is on to the new NOAA Center for Weather and Climate Prediction. Bull. Amer. Meteor. Soc., 93, 1631–1632.

  • UNISDR/CRED, 2015: The human cost of weather related disasters, 1995–2015. U.N. Office for Disaster Risk Reduction/Centre for Research on the Epidemiology of Disasters Rep., 27 pp. [Available online at www.unisdr.org/2015/docs/climatechange/COP21_WeatherDisastersReport_2015_FINAL.pdf.]

  • U.S. Department of Commerce, 2014: Fostering innovation, creating jobs, driving better decisions: The value of government data. Economics and Statistics Administration Rep., U.S. Department of Commerce, 54 pp. [Available online at www.esa.doc.gov/sites/default/files/revisedfosteringinnovationcreatingjobsdrivingbetterdecisions-thevalueofgovernmentdata.pdf.]

  • WMO, 2015: Seamless prediction of the earth system: From minutes to months. WMO-No. 1156, 481 pp. [Available online at http://library.wmo.int/pmb_ged/wmo_1156_en.pdf.]


The National Earth System Prediction Capability: Coordinating the Giant

Jessie C. Carman, NOAA/Office of Weather and Air Quality, Silver Spring, Maryland
Daniel P. Eleuterio, Office of Naval Research, Arlington, Virginia
Timothy C. Gallaudet, U.S. Navy, Washington, D.C., and Naval Meteorology and Oceanography Command, Stennis Space Center, Mississippi
Gerald L. Geernaert, Climate and Environmental Sciences Division, U.S. Department of Energy, Germantown, Maryland
Patrick A. Harr, Division of Atmospheric and Geospace Sciences, National Science Foundation, Arlington, Virginia
Jack A. Kaye, Earth Sciences Division, NASA, Washington, D.C.
David H. McCarren, Naval Meteorology and Oceanography Command, Silver Spring, Maryland
Craig N. McLean, NOAA/Office of Oceanic and Atmospheric Research, Silver Spring, Maryland
Scott A. Sandgathe, Applied Physics Laboratory, University of Washington, Seattle, Washington
Frederick Toepfer, NOAA/National Weather Service/Office of Science Technology Integration, Silver Spring, Maryland
Louis W. Uccellini, NOAA/National Weather Service, Silver Spring, Maryland

Abstract

The United States has had three operational numerical weather prediction centers since the Joint Numerical Weather Prediction Unit was closed in 1958. This led to separate paths for U.S. numerical weather prediction, research, technology, and operations, resulting in multiple community calls for better coordination. Since 2006, the three operational organizations—the U.S. Air Force, the U.S. Navy, and the National Weather Service—and, more recently, the Department of Energy, the National Aeronautics and Space Administration, the National Science Foundation, and the National Oceanic and Atmospheric Administration/Office of Oceanic and Atmospheric Research, have been working to increase coordination. This increasingly successful effort has resulted in the establishment of a National Earth System Prediction Capability (National ESPC) office with responsibility to further interagency coordination and collaboration. It has also resulted in sharing of data through an operational global ensemble, common software standards, and model components among the agencies. This article discusses the drivers, the progress, and the future of interagency collaboration.

CORRESPONDING AUTHOR E-MAIL: Scott A. Sandgathe, sandgathe@apl.washington.edu

A five-agency strategy to coordinate and accelerate the national numerical environmental prediction capability is discussed.

In 1954 the Joint Numerical Weather Prediction Unit was organized, funded, and staffed jointly by the U.S. Weather Bureau, the U.S. Air Force Air Weather Service, and the U.S. Navy Naval Weather Service. In 1958, however, the Joint Numerical Weather Prediction project ended, and the U.S. Weather Bureau, U.S. Navy, and U.S. Air Force proceeded independently to develop their own NWP facilities (Shuman 1989; see the appendix for expansions of acronyms). For many years, numerous congressional committees and panels questioned the need for multiple, independent government weather agencies; however, with the exception of establishing the OFCM in 1964 via Public Law 87-843 (www.gpo.gov/fdsys/pkg/STATUTE-76/pdf/STATUTE-76-Pg1080.pdf), all ended in confirmation of the status quo because of the significantly different agency missions, the difficulty of passing timely information, and the requirement for secure and reliable mission support. OFCM’s mission was confined to coordination of weather services and supporting research. Although the OFCM succeeded in certain areas such as weather radar, space weather, and tropical cyclone activities, it was never staffed for, nor did it find, a role in coordinating general NWP services and supporting research.

By the turn of the century, the numerical prediction enterprise had grown exponentially, with massive satellite data streams, highly complex numerical prediction systems, and huge data volumes with ever-growing requirements for larger and more reliable computing capabilities. Simple NWP systems evolved into high-resolution mesoscale, global, and full climate–Earth system numerical prediction capabilities. Deterministic numerical model capabilities had expanded to include ensembles of single systems. Limited by statute and treaty, the NWS provided domestic services on which the U.S. economy had grown ever more dependent for protection of life and property, economic growth, and stability. Numerous private activities arose to tailor NWS model output and products to individual customer requirements. At the same time, U.S. defense forces operating around the world on humanitarian, training, and military missions developed and relied on similar yet separate weather organizations tailored for overseas mission needs.

The parallel but differing infrastructures produced by disparate hardware, distribution, and display technologies complicated communication and collaboration, which generally remained ad hoc. While the U.S. Navy was able to address internal coordination between research and operations through the creation of its AMOP, NOAA coordination among NESDIS, OAR, and the NWS was difficult, leading to competition for scarce funding among NOAA developmental laboratories. Hence, activities remained isolated and duplicative, despite advances in Internet and communications technology that made collaboration with a scientist 10,000 miles away almost as easy as with one in the next office.

Numerous reports (NRC 1998, 1999, 2000; Hooke and Pielke 2000) addressed issues within the U.S. weather enterprise, citing lack of priorities and agenda, poor communications, inadequate resources, lack of effective transition policies, and the resource requirements of maintaining competitive numerical prediction technologies. Pielke and Carbone (2002, p. 393) argued that “weather research is unlikely to more effectively meet society’s needs—or receive greater resources—if the community proceeds in balkanized fashion; integration is an imperative.” The authors called for weather community leaders to develop a vision for collaboration toward common goals. A later NRC report (NRC 2005) observed that the NSF Atmospheric Sciences Program and other agencies dealing with atmospheric research operate on an ad hoc basis without sufficient strategic planning.

The papers and reports identifying shortcomings of the weather enterprise culminated in Mass (2006), which likened the U.S. weather enterprise to an uncoordinated giant. Mass compared the predictive skill of the NWS GFS unfavorably with that of ECMWF, despite the larger U.S. weather enterprise and its greater funding for weather-related research. Mass (2006) identified further issues within the weather enterprise and called for increased collaboration between the federal agencies and the broader weather community, including increased strategic planning. While these comments were primarily directed at the operational weather enterprise, change requires the participation of the broader Earth system research and technology enterprise.

This paper addresses the federal agency efforts to coordinate U.S. weather enterprise research, operations, facilities, and capabilities. Legislative and executive efforts as well as individual agency efforts at collaboration are defined in the “Coordinating the giant” section. The “Moving to a national Earth system prediction capability” section discusses the overarching strategy and initiatives leading to a National ESPC. Finally, the “Challenges and the path forward” section presents the significant challenges still remaining and proposes a path forward.

COORDINATING THE GIANT.

Legislative and executive actions.

Congress has taken actions to increase collaboration on topics of interest to multiple agencies and established several programs fostering agency partnerships.

For example, CENRS (www.whitehouse.gov/sites/default/files/microsites/ostp/nstc-cenrs-charter.pdf) was established by the National Science and Technology Council in 2010 to increase the overall productivity and application of federal R&D efforts in the areas of environment, natural resources, and sustainability. CENRS also provides a formal mechanism for coordination of domestic, international policy, and R&D issues relevant to those areas. Within CENRS, several subcommittees oversee a variety of communities such as air quality, Arctic research, disaster reduction, water availability, ocean science, and Earth observations.

In conjunction with CENRS and its subcommittees, the USGCRP (www.globalchange.gov) was established by presidential initiative in 1989 and mandated by Congress in the GCRA of 1990. The USGCRP mission is to inform human responses to climate and global change through coordinated and integrated federal programs of research, education, communication, and decision support. USGCRP achieves its mission through IWGs; the Interagency Group on Integrative Modeling coordinates global change-related modeling activities across the federal government and provides guidance to USGCRP on modeling priorities.

The U.S. CLIVAR program (www.usclivar.org) is a national research program investigating the variability and predictability of the global climate system on seasonal, interannual, decadal, and centennial time scales, with a particular emphasis on the role the ocean plays in climate variability. Created in 2000, the U.S. CLIVAR program contributes directly to the broader USGCRP and provides U.S. research and organizational contributions to the International CLIVAR program and the WCRP under the United Nations.

Another example is the NOPP (www.nopp.org), established by Congress in 1996 under the National Oceanographic Partnership Act (Public Law 104-201, 10 U.S.C. 7901–7903; www.congress.gov/bill/104th-congress/house-bill/3303). The NOPP’s goal is to improve knowledge of the ocean in support of national security, economic development, quality of life, and science education/communication. An important NOPP contribution was sponsorship of the U.S. Navy’s HYCOM, through a multi-institutional effort as part of the U.S. GODAE. The resulting ocean model is used operationally by both the U.S. Navy and the NWS as a component of their modeling systems.

Unfortunately, these programs do not adequately address the need for collaboration on the weather problem in research, development, or operations.

Agency actions.

The federal agencies recognized the issues discussed in the first section and responded to the community literature and NRC reports through several agreements and improved practices. The agencies additionally noted decision-makers’ increasing needs for skillful, reliable, actionable, internally consistent weather and ocean forecasts across synoptic and longer lead times to enhance the civil safety, economic health, and national security of the nation. Figure 1 is an overview of forecast timelines versus environmentally sensitive decision processes.

Fig. 1. Sampling of federal decision needs across time scales. Decisions span weather to climate prediction/projection capabilities, with responsibilities for actions falling throughout the federal and commercial sectors. Adapted from NRC (2016).

Among the federal agencies and academic and private sectors, multiple partnerships have been formed in response to case-by-case needs or collaboration opportunities, with some overlaps and gaps. Among these collaborative efforts are the HFIP (Gall et al. 2013), SREF (Stensrud et al. 1999), and the NMME (Kirtman et al. 2014), which uses global coupled climate models. Additionally, the NUOPC (Sandgathe et al. 2011) established a multimodel global atmosphere and wave ensemble, using the computational power of the separate prediction centers for products that bring increased skill to all the partners, and a common modeling architecture to accelerate the transition of new technologies into operations. The JCSDA (www.jcsda.noaa.gov/; Le Marshall et al. 2007) focuses on data assimilation techniques for new satellite data types and phenomena such as ocean surface salinity and land surface states, to reduce the time from satellite launch to data use in operational NWP.
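To make the payoff of pooling members from separate centers concrete, here is a minimal sketch of forming a multimodel ensemble mean and spread. The arrays are synthetic; the member counts, the 500-hPa height variable, and the 1° grid are illustrative assumptions, not the operational NUOPC configuration.

```python
import numpy as np

# Minimal illustration of pooling ensemble members from two centers
# (synthetic data; array names, sizes, and member counts are hypothetical).
n_lat, n_lon = 181, 360                     # nominal 1-degree global grid
rng = np.random.default_rng(0)

# Synthetic 500-hPa height members (m) from two hypothetical centers
center_a = 5500.0 + 50.0 * rng.standard_normal((20, n_lat, n_lon))
center_b = 5500.0 + 60.0 * rng.standard_normal((26, n_lat, n_lon))

# Pool the members into one multimodel ensemble
pooled = np.concatenate([center_a, center_b], axis=0)

# Ensemble mean and spread (standard deviation across members)
ens_mean = pooled.mean(axis=0)
ens_spread = pooled.std(axis=0, ddof=1)

print(pooled.shape, ens_mean.shape, float(ens_spread.mean()))
```

Pooling members drawn from independently developed systems samples model error as well as initial-condition error, which is the basic argument for a multicenter ensemble.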

To improve transitions to operations, the NWS, the NSF, and the U.S. Air Force organized the DTC (www.dtcenter.org/) to facilitate the use of operational models by the general federal and academic community. The DTC supports development, maintenance, and sharing of operational codes (data assimilation, forecasting, postprocessing, etc.) and organizes workshops and tutorials. This information technology environment facilitates modeling experiments with a single system available to both NCEP and collaborators, ensuring that results will be reproducible and relevant for R2O efforts.

NOAA actions.

With many of the criticisms directed at NOAA, NWS has taken multiple actions, including a thorough reorganization effective April 2015. To address poor connections with other NOAA laboratories and academia, NWS formed OSTI. OSTI analyzes requirements for service improvements and develops potential scientific and technological solutions through coordination with partners within NOAA (laboratories and cooperative institutes) and with the external research community. To support NWS’s outreach and transition effort, NOAA/OAR is increasing emphasis on transitions to operations or applications by requiring formal, coordinated transition plans for technologies nearing readiness.

To address computing capacity deficits, congressional authorizations allowed NWS to increase computer power, with sequential upgrades starting in 2013 (Fig. 2). These upgrades permitted establishment of a redundant backup system and additional computer resources for development and testing. They also permitted operation of an upgraded, higher-resolution version of the GFS out to longer forecast times (Fig. 2), an improved version of the HWRF model, and operational implementation of the HRRR model. Additional capacity upgrades remain in the pipeline.

Fig. 2. Historical (solid) and projected (dashed) computing capacity (teraflops) and global model resolution (km) for NOAA (red/purple) and the U.S. Navy (gold/blue). Data courtesy of NCEP Office of Central Processing (www.emc.ncep.noaa.gov/gmb/STATS/html/model_changes.html) and the NRL/MRY.

To enhance external oversight and advice, NCEP requested UCAR to conduct a thorough review of its centers as well as the Office of the Director. UCAR formed the UCACN, which reviews operations and provides periodic reports of findings and recommendations (available to the public at www.vsp.ucar.edu/UCACN/). Most recently, the UCACN formed the UMAC to review the NCEP prediction suite strategy for development and to streamline NCEP’s production suite to most effectively use its increased computing power.

NWS and the international community have increased collaboration, forming the NAEFS (Candille 2009) in collaboration with Canada and Mexico, and increased participation in WMO activities such as the WWRP. Specifically, WWRP’s THORPEX (Shapiro and Thorpe 2004) has provided a 10-yr vehicle for collaboration to improve predictions from one day to two weeks and provided valuable research and experience with multimodel ensemble prediction. NWS has also replaced the old World Weather Building, which did not readily attract visiting scientists and was generally not supportive of a vigorous research to operations infrastructure and process, with the new state-of-the-art NCWCP at the University of Maryland, College Park (Uccellini 2012). The NCWCP is already becoming a focal point for collaborative interactions between NOAA scientists, information technology specialists, forecasters, and the larger national and international research community across a wide spectrum of observation and modeling activities.

DON actions.

The DOD has faced severe budget declines in NWP starting in the early 1990s and continuing today, driving efforts to consolidate and coordinate. A 1996 NWP agreement ceded global DOD NWP (as distinct from regional models) to the U.S. Navy, with the U.S. Air Force initially relying on the U.S. Navy global NWP for operations and later partnering with NOAA for additional global NWP support and space weather collaboration. A partnership for Battlespace Environments R&D was also formed between the U.S. Navy and the U.S. Air Force to address issues such as display hardware and applications software. Within the U.S. Navy, the AMOP was formed to ensure coordination on software development and transition among its basic research organizations, applied laboratories, systems developers, and operational prediction centers. The U.S. Navy also revitalized its global NWP program in 2010, introducing the NAVGEM global prediction system and moving a major portion of its global NWP supercomputing production to the Defense Shared Resource Center in Mississippi, which provides significantly more computational power and routine hardware upgrades (Fig. 2).

DOE actions.

The DOE has had a long history of investment in climate research, including atmosphere, ocean, land, and cryosphere modeling; integrative assessment research involving energy and other sectors of the economy; integrative modeling of the Earth–human system; software applications; uncertainty quantification; model diagnostics; and supporting field experiments. For most of its history, DOE’s climate modeling research centered around a collaborative investment with NSF on the CESM. CESM model diagnostics and validation were provided to the international modeling enterprise through investments at LLNL in the ESGF and supporting diagnostics and validation research at LLNL’s PCMDI.

NASA actions.

NASA continually provides new types of global observations that contribute to the initialization, process representation, and verification of Earth system models used by the broader community. As a primary member, NASA was instrumental in the formation of the JCSDA, working toward common satellite data assimilation techniques for U.S. operational modeling activities. Recognizing, with the rest of the community, the need for a common modeling infrastructure to improve collaboration, NASA sponsored the initial prototype of the ESMF and has been a significant participant in the continued development of a common modeling architecture. NASA has a separate agreement with NWS to collaborate on the development of their data assimilation system.

NSF actions.

NSF’s support of NCAR has significantly contributed to the national modeling enterprise. The NCAR strategic plan identifies two grand challenges. The first emphasizes improvement in understanding and prediction of atmospheric, chemical, and space weather and the impacts on ecosystems, people, and society; the second calls for improving understanding and prediction of climate variability and change at regional and global scales. Both grand challenges require developing and deploying observing systems, advancing numerical techniques and model components, and developing applications to address needs of operational communities and society. NSF and NCAR initiated a major consolidation in regional and mesoscale modeling, leading the WRF Model development effort, and are continuing to support university WRF usage. NSF and NCAR, with DOE, have also been instrumental in the development and continuing support of the Community Earth System Model (CESM; Hurrell et al. 2013). In support of the need for a common modeling framework, NCAR and NSF have initiated CIME for the collective construction and maintenance of software infrastructure required for Earth system model development and application. NCAR and NSF additionally provide workshops and visits for broad community and interdisciplinary exchange of ideas and basic research results. The NSF Geosciences Directorate has also participated in a visiting scientist program that funds university researchers to visit and contribute to NCEP operational systems, accelerating the transition of basic research into operations.

MOVING TO A NATIONAL EARTH SYSTEM PREDICTION CAPABILITY.

These case-by-case partnerships did not meet the need for more strategic federal coordination between activities. In 2005, Vice Admiral (ret.) Conrad Lautenbacher, NOAA administrator; Brigadier General Lawrence Stutzriem, Air Force Director of Weather; and Rear Admiral Fred Byus, Oceanographer of the Navy, initiated a series of agreements for better coordination among the operational weather agencies, and later, research agencies with Earth system prediction missions. The goal was a more strategically coordinated, unified federal response to decision needs shown in Fig. 1, resting on a common foundational science.

After significant dialog among the three organizations, NUOPC was formed in 2008 as an agreement to coordinate activities between NWS, the U.S. Navy, and the U.S. Air Force to develop and implement the next-generation National Operational Global Ensemble modeling system. The NUOPC plan consists of the following elements:

  • a national operational NWP system with a commitment to address common requirements,

  • a multicomponent system with interoperable components built upon common standards and a common framework,

  • managed ensemble diversity to quantify and bound forecast uncertainty,

  • ensemble products used to drive high-resolution regional/local prediction and other downstream models,

  • a national research agenda for global NWP to accelerate development and transition to operations, and

  • increased leverage of partner agencies to avoid independent/duplicative operating costs.

NUOPC achieved its coordination successes via a series of interagency committees addressing mutual problems across agency and office lines; an example is shown in Fig. 3 with sidebar discussion. These committees have flexible membership and frequently involve academic and international participants to increase collaborative value.
Fig. 3. Committee membership for the CMA committee by organization as an example of National ESPC committee structure.

CMA COMMITTEE

NUOPC created the CMA committee in 2008 to develop a common architecture to accelerate R2O transitions and to encourage use of operational codes in research. Specific goals are determined according to National ESPC strategies, with membership drawn from the National ESPC participating agencies, academia, and international partners, as shown in Fig. 3.

The committee currently focuses on three items:

  • managing national modeling infrastructure and coding standards based on the ESMF;

  • creating an organizational structure that supports design and implementation of software and conventions; and

  • entering new community model components into an organized governance structure.

To carry out these priorities, the CMA committee has two subgroups.

Physics interoperability: Working closely with the Global Modeling Testbed, the PI group focuses on building a working physics driver prototype with a common physics interface to further sharing of technology between academia and the participating agencies. The objectives are

  • to collect requirements from modeling centers,

  • to provide a requirements reference document for developing a universal physics driver, and

  • to prioritize introduction of physics components.

Content Standards Committee: The CSC addresses coupling of system components (atmosphere, ocean, land, etc.), including developing and maintaining a standard NUOPC coupling interface. Current actions are

  • to maintain and improve NUOPC Layer software;

  • to develop and promote other community conventions for coupled modeling;

  • to share expertise in implementing the NUOPC Layer and other community conventions in CESM, GEOS-5, NEMS, U.S. Navy modeling systems, and other modeling systems;

  • to unify the NASA MAPL system and NUOPC Layer standards; and

  • to evolve the ESPS by adding new components and improving the implementation and documentation associated with existing components.

CMA committee accomplishments include the following:

  • developing and deploying the NUOPC Layer software across major U.S. coupled models, thereby initiating the ESPS;

  • updating the Kalnay protocols for physics parameterizations and defining and building a “universal” physics driver; and

  • establishing a governance model that coordinates U.S. modeling activities both at the agency executive level, through the CMA, and at an implementation level, through actions carried out by its CSC and PI subgroups.

Recognizing that prediction efforts over a longer time scale require more emphasis on research, much of which occurs at agencies not participating in NUOPC, the ESPC interagency effort was established in 2010. Initially, ESPC efforts encompassed the original NUOPC partners, but this was updated in 2013 to include environmental research activities from NASA, DOE, and NSF. This expansion of the ESPC acknowledged the need to improve coordination and collaboration across the entire federally sponsored environmental research and operational prediction community to improve global prediction at the weather-to-climate interface. The partnership pursues the goal of building a seamless prediction capability, to support internally consistent decision products across time scales and agency missions (Hurrell et al. 2009; WMO 2015).

While each agency retains its separate mission needs, the ESPC partnership recognizes that these missions rest on a central core national environmental modeling need for global integrated atmospheric, oceanic, terrestrial, cryospheric, and near-Earth space environment models. While prediction at longer time scales is generally estimated to be beyond the limits of deterministic predictability, multimodel ensemble-based probabilistic techniques provide a means for making meaningful forecasts at longer time scales (NRC 2016).
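As an illustration of the ensemble-based probabilistic approach at longer leads, the sketch below pools synthetic seasonal anomalies from three hypothetical models and estimates the probability of an above-normal outcome. The variable, member counts, and tercile threshold are assumptions for illustration only, not a National ESPC product definition.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic seasonal-mean temperature anomalies (K) at one location
# from three hypothetical models, 10 members each.
members = np.concatenate([
    0.3 + 0.8 * rng.standard_normal(10),   # model A
    0.1 + 1.0 * rng.standard_normal(10),   # model B
    0.5 + 0.7 * rng.standard_normal(10),   # model C
])

# Upper-tercile ("above normal") threshold from a synthetic climatology
climatology = rng.standard_normal(300)
upper_tercile = np.percentile(climatology, 100 * 2 / 3)

# Probabilistic forecast: fraction of pooled members above the threshold
p_above_normal = np.mean(members > upper_tercile)
print(f"P(above normal) = {p_above_normal:.2f}")
```

The forecast is expressed as a probability rather than a single deterministic value, which is the form in which seasonal and longer-range guidance is typically usable by decision-makers.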

Rather than implementing a new air–land–sea–ice forecast system, the ESPC partnership leverages existing programs to bridge the gap from synoptic to multidecadal (∼30 yr) time scales, with S2S and ISI prediction (NRC 2010) as its most immediate priority. It also addresses the need for increased attention to coupled Earth system models (Bauer et al. 2015).

Additionally, the ESPC partnership advocates for and supports basic architectural foundations such as common coupled modeling architectures, data and archive standards, computational efficiency, and standardized forecast skill metrics. These supporting technologies, echoing NUOPC efforts, expand the scope of collaborative model development, common case studies, and evaluation datasets to aid improved understanding of underlying physical processes.

In 2012, volunteer working groups across the weather and climate prediction communities established plans to coordinate ongoing and future research where appropriate. Goals include developing a common modeling environment, establishing a community model repository of common datasets and test cases, and pursuing focus topics to assess forecast skill against potential ESPC stakeholder information needs. Ultimately, these efforts will identify where sources of extended-range predictability are sufficiently understood and reliable for use in future operational prediction with quantifiable uncertainty (NRC 2012, 2016). Critical path science and technology issues will be identified as future research challenges.

Between NUOPC and ESPC, the obvious overlap of participants and efforts over part of the time scale led the participating agencies to combine these two partnerships into the National ESPC interagency effort (operational and research oriented; Fig. 4). This merged effort benefits from the demonstrated NUOPC success leading interagency committees, explicitly working across agency funding lines to achieve goals beneficial to the project and the weather prediction community at large. NUOPC’s topical committees, such as the example in Fig. 3, are being expanded to meet the larger effort of longer time scales and inclusion of ocean, land, and cryosphere modeling and coupling. The merged effort will additionally benefit from the focus topics’ attention to predictability and prediction skill, forming a process-based skill assessment communicating in both directions between research and operations.

Fig. 4. Overview of National ESPC goals.

The National ESPC strategy.

The long-time-scale prediction/projection problem requires greater integration of research with operational user perspectives (NRC 2016). Broadly, the national and societal needs shown in Fig. 1 range from nowcasts of a few minutes for severe weather to decadal projections of sea level rise and of a changing climate affecting water resources and agriculture, which in turn drive infrastructure loss and replacement and political/demographic instability.

Accurate forecasting of these phenomena is one of the grand challenges of applied physics, and progress has been made toward producing societally useful forecasts for certain applications. However, current forecasts lack the fidelity needed to support decisions for many applications, and they should additionally convey prediction uncertainty. Predictions and projections should be internally consistent for phenomena within NOAA’s responsibility (the United States and treaty regions) and DOD responsibility (overseas), across domains, and across time scales, providing appropriate uncertainty assessments. These needs are representative rather than complete, and forecast requirements may change as the capability develops.

The vision for a National ESPC end state consists of a seamless operational suite of multiple numerical prediction systems spanning from 0 days to 30 years, covering the physical Earth system—atmosphere, ocean, wave, land, sea ice, and near space. This capability will provide the U.S. federal agencies and the public and private sectors the best available information on current/predicted physical Earth system conditions to support resource investment, national security, and protection of life and property. This national system will conform to a community-based, open-source, common modeling architecture to allow the use of modular components. The system should meet individual agency mission needs while enabling cross-agency benefits from coordinated modeling and research and more efficient technology transfer from research to operations. The National ESPC will coordinate data assimilation, dynamic and physical simulation, postprocessing of numerical output, and product generation across a distributed network of providers with common reliability, skill, and timeliness criteria.

Rather than creating a new model with the expectation that it will be valid over this extended time scale, the national system will initially consist of a suite of coupled ensembles, taken from existing independently developed operational and research agency models. Each individual model will participate in the system over the time scale for which it has been developed and tested, with model resolutions, assimilation/cycling procedures, and output requirements varying. Consistent with the original NUOPC effort, this national system will be capable of driving high-resolution regional/local prediction and other downstream models to meet individual agency needs and hence will be a global system.

In leveraging existing operational and research efforts, the National ESPC will provide an overarching coordination of such work, especially the results of those USGCRP research efforts addressing its time range. National ESPC maintains continued coordination with USGCRP through representatives participating in both activities.

Major National ESPC initiatives.

Software infrastructure collaboration.

A critical component of the National ESPC is the ability to accelerate software development and enhance transition through adoption of software infrastructure standards and protocols as described in the sidebar. The National ESPC framework builds on the NUOPC Layer of ESMF, which sets common implementation conventions for developers, to provide improved interoperability of their code with other groups operating within these standards. ESMF and the NUOPC Layer allow for coupling models from different domains, nesting models at different resolutions, and operating models in ensemble modes. All users of ESMF components are encouraged to employ the NUOPC Layer or its NASA counterpart, the MAPL. The most significant architectural task over the next three years will be to ensure that the NUOPC Layer and MAPL are interoperable. The full suite of codes, the ESPS, consists of NUOPC-compliant, documented components and modeling systems from National ESPC centers (Theurich et al. 2016). These ESPS codes will serve as the basis of the National ESPC system.
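The component/driver pattern that ESMF and the NUOPC Layer formalize can be illustrated with a deliberately simplified sketch. The classes and method names below are hypothetical Python stand-ins, not the ESMF/NUOPC API (which is Fortran and far richer, with clocks, grids, and regridding); the sketch only shows the idea of components exposing a common initialize/run/finalize interface and exchanging fields through a driver.

```python
# Conceptual sketch of the component/driver coupling pattern that ESMF and
# the NUOPC Layer formalize. All names and interfaces are illustrative
# stand-ins, not the ESMF/NUOPC API.
from dataclasses import dataclass, field


@dataclass
class State:
    """A bundle of named fields exchanged between components."""
    fields: dict = field(default_factory=dict)


class Component:
    """Common interface every component exposes to the driver."""
    def initialize(self, imports: "State", exports: "State") -> None: ...
    def run(self, imports: "State", exports: "State", dt: float) -> None: ...
    def finalize(self) -> None: ...


class ToyAtmosphere(Component):
    def initialize(self, imports, exports):
        exports.fields["surface_wind_stress"] = 0.1       # N m-2

    def run(self, imports, exports, dt):
        sst = imports.fields.get("sea_surface_temperature", 288.0)
        # Crude feedback: slightly stronger stress over warmer water
        exports.fields["surface_wind_stress"] = 0.1 + 0.001 * (sst - 288.0)


class ToyOcean(Component):
    def initialize(self, imports, exports):
        exports.fields["sea_surface_temperature"] = 288.0  # K

    def run(self, imports, exports, dt):
        stress = imports.fields.get("surface_wind_stress", 0.1)
        sst = exports.fields["sea_surface_temperature"]
        # Crude response: wind stress mixes and slightly cools the surface
        exports.fields["sea_surface_temperature"] = sst - 1e-5 * stress * dt


def drive(atm: Component, ocn: Component, n_steps: int, dt: float):
    """Driver: call each component and exchange states every coupling step."""
    atm_exports, ocn_exports = State(), State()
    atm.initialize(ocn_exports, atm_exports)
    ocn.initialize(atm_exports, ocn_exports)
    for _ in range(n_steps):
        atm.run(ocn_exports, atm_exports, dt)   # atmosphere imports ocean fields
        ocn.run(atm_exports, ocn_exports, dt)   # ocean imports atmosphere fields
    atm.finalize()
    ocn.finalize()
    return atm_exports, ocn_exports


if __name__ == "__main__":
    _, ocn_state = drive(ToyAtmosphere(), ToyOcean(), n_steps=4, dt=3600.0)
    print(ocn_state.fields["sea_surface_temperature"])
```

The value of standardizing this pattern is that any component written to the common interface can be swapped, nested, or run as an ensemble member without rewriting the driver, which is the interoperability goal the ESPS codes pursue.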

Common metrics and coordinated testing.

Under NUOPC guidance, the operational agencies have adopted a common set of metrics for evaluation of global atmospheric models and ensembles. This has allowed better comparison of predictive skill among agencies and also reduced the incestuous influence of interdependent data assimilation and model forecast systems. These metrics agreements will be expanded to include metrics for the ocean, sea ice, wave, and land fields in the Earth system, as well as expanded in time to include seasonal and interannual time scales. Prediction at ISI time scales and beyond presents a particularly difficult situation for developmental and operational test and validation. Common metrics, in an easily accessible development and test environment, are a necessary step for accelerating both multiagency collaboration and technology transition (NRC 2016). NMME metrics used at ISI time scales represent a starting point, but probably will not demonstrate forecast fidelity for all phenomena or needs. National ESPC must provide a coordinated infrastructure for both developmental and operational test and evaluation in order to ensure operational standards for accuracy and reliability are maintained.
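For concreteness, the sketch below computes two verification measures commonly used for global atmospheric models, root-mean-square error and the anomaly correlation coefficient, against a verifying analysis. The fields are synthetic, and the specific metrics adopted under the NUOPC agreements are not enumerated here.

```python
import numpy as np


def rmse(forecast, analysis):
    """Root-mean-square error of a gridded forecast against an analysis."""
    return float(np.sqrt(np.mean((forecast - analysis) ** 2)))


def anomaly_correlation(forecast, analysis, climatology):
    """Centered anomaly correlation coefficient relative to a climatology."""
    fa = forecast - climatology
    aa = analysis - climatology
    fa = fa - fa.mean()
    aa = aa - aa.mean()
    return float(np.sum(fa * aa) / np.sqrt(np.sum(fa**2) * np.sum(aa**2)))


# Synthetic 500-hPa height fields (m); shapes and magnitudes are illustrative.
rng = np.random.default_rng(2)
climo = 5500.0 + 30.0 * rng.standard_normal((181, 360))
truth = climo + 40.0 * rng.standard_normal((181, 360))
fcst = truth + 15.0 * rng.standard_normal((181, 360))

print("RMSE:", rmse(fcst, truth))
print("ACC: ", anomaly_correlation(fcst, truth, climo))
```

Agreeing on a shared implementation of such measures, and on the verifying analyses they use, is what allows skill to be compared across centers without each center's own analysis biasing the result.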

Initially, technology transition will be through each agency’s internal transition process. Developmental collaboration and common software infrastructure should lead to enhanced transition both internally and between agencies. Ultimately, the National ESPC will provide a coherent system in which each agency’s technology transition impacts the system in a coordinated manner.

The research agencies and National ESPC.

While the primary motivations leading to the formation of the National ESPC strategy were directed at the operational agencies, success requires the full, coordinated participation of the primary research mission agencies.

NSF contributions.

NSF will contribute to the National ESPC as a lead agency for the NSCI (www.whitehouse.gov/the-press-office/2015/07/29/executive-order-creating-national-strategic-computing-initiative). The NSCI advances five strategic objectives:

  • accelerate delivery of an exascale computing system;

  • increase the technology base used for modeling and simulation in tandem with that used for data analytics;

  • define a path toward a viable HPC capability beyond the limits of current semiconductor technology;

  • increase the capacity and capability of an HPC ecosystem that addresses network technology, workflow, algorithm and software development, accessibility, and workforce development; and

  • build enduring public–private partnerships.

The Geosciences Directorate leads NSF’s science effort to advance data assimilation and predictability science, which relate to a majority of the NSCI objectives and are fundamental to increased capability in Earth system simulation and forecasting. As the NSCI efforts grow, NSF is expected to lead advances in data assimilation and predictability science.

Additionally, NSF can assist National ESPC objectives by overcoming barriers to distributed production, storage, and analysis of multimodel ensemble forecasts in universities.

NASA contributions.

NASA’s broad set of observational data and its capabilities in computational modeling, data assimilation, and reanalysis can play an important role in the quantitative evaluation of global Earth system models. In particular, Obs4MIPs (http://climatesciences.jpl.nasa.gov/projects/obs4mips), organized by NASA together with DOE’s PCMDI, facilitates NASA data use in model–measurement intercomparisons.

NASA’s independent data assimilation efforts support the development and evaluation of new satellite sensors and speed their transition into operational systems. NASA has produced the consistent, long-term (1979 to present) MERRA and MERRA-2 reanalyses that place the current suite of research satellites in a climate context, taking particular care to provide a broad suite of hydrologic variables (https://gmao.gsfc.nasa.gov/research/merra/; http://disc.sci.gsfc.nasa.gov/datacollection/M2I1NXLFO_V5.12.4.shtml).
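As a sketch of how a reanalysis can place current conditions in a climate context, the snippet below computes monthly anomalies relative to a multiyear climatology with xarray. The file path, variable name, and coordinate names are hypothetical placeholders rather than the actual MERRA-2 product layout.

```python
import xarray as xr

# Hypothetical local file of monthly-mean 2-m temperature from a reanalysis;
# the path, variable name, and dimension names are placeholders.
ds = xr.open_dataset("reanalysis_monthly_t2m.nc")
t2m = ds["t2m"]

# Multiyear monthly climatology, and each month's anomaly relative to it
climatology = t2m.groupby("time.month").mean("time")
anomalies = t2m.groupby("time.month") - climatology

# Simple unweighted area-mean anomaly time series, for illustration only
print(anomalies.mean(dim=["lat", "lon"]))
```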

NASA also maintains a robust climate modeling activity, focusing on sensitivity to parameterizations of clouds and moist convection, ground hydrology, and ocean–atmosphere–ice interactions.

NOAA contributions.

NOAA’s NGGPS represents the NWS’s synoptic-scale contribution to the National ESPC effort and will increase the accuracy of weather forecasts through accelerated development and implementation of current global weather prediction models, improved data assimilation techniques, and improved software architecture and system engineering.

The NMME, funded by NOAA/OAR and other agency, international, and academic partners, represents a new paradigm for leveraging distributed computing and research to provide an improved seasonal-to-interannual prediction capability. Under this paradigm, research models become dual-use systems, improving scientific understanding and, with a relaxed reliability requirement suitable to longer time scales, contributing to vetted official predictions or projections. Other NOAA research clarifying predictability and improving physical process representation in coupled modeling systems will feed into this effort via those modeling systems.

DOE contributions.

In 2012, the DOE and its national laboratories conducted a series of workshops that resulted in a branch of the CESM, the ACME model. ACME was officially launched in 2014, with a goal of rapidly achieving very high spatial resolution by utilizing DOE’s increasingly sophisticated supercomputers as they become available to the community. ACME represents a major software and applied mathematics activity; its science challenges focus on the water cycle, biogeochemistry, cryospheric systems, and extreme weather as the climate evolves.

Unlike historical DOE investments in the climate sciences that focused on subdecadal to centennial predictability, ACME remains in the “climate time scales” yet considers seasonal to subdecadal scales as part of its scope by focusing primarily on time horizons spanning the past 40 years to the next 40 years. ACME prioritizes technical developments, including adaptation of climate codes and libraries to DOE Leadership Class Facility computers, minimizing power consumption, efficient code engineering, testing, and computational workflow. The ACME climate and technical codes will be released frequently to the larger community and should contribute to other groups requiring efficient use on advanced computers.

DOE’s ACME project will be an important member of the suite of numerical prediction systems over the 0-day to 30-yr time frame. Its high resolution (15–25 km) and its treatment of cryosphere–ocean and hydrological interactions will provide projections to aid decision-making. Because it is based on the CESM and is focused on advancing computational methods, its developmental improvements will benefit other modeling systems.

DOE has been instrumental in forging a new generation of impact assessment models that complement and may be integrated into ACME. Efforts are underway to bridge the divide between classical IA models, which use annual to multiannual time steps, and IAV models that describe and predict infrastructure, socioeconomic, and behavioral responses to extreme events. The eventual unification of IA and IAV capabilities with ACME, CESM, and other climate models is anticipated to give DOE, and the partnership, a better understanding of the physical, socioeconomic, and behavioral aspects of a complex system.

DON contributions.

The U.S. Navy’s directly identified ESPC contribution focuses on advancing coupled global Earth system prediction, funding research in coupling, advanced computer technologies, and coupled data assimilation, and developing the next generation of Earth system prediction models.

The goal of the National ESPC strategy is to orchestrate these individual agency contributions to more directly support a future national capability. As a first step, the research agency principals have initiated recurring planning meetings, and all agencies are participating in the National ESPC coordinating committees.

THE CHALLENGES AND THE PATH FORWARD.

While progress has been made over the past 10 years, the challenges remain significant and will require community focus. The computational hardware environment is dramatically evolving and will require massive rewriting and restructuring of environmental prediction software amounting to tens of millions of lines of code. This is both a daunting challenge and an important opportunity for the community to coalesce. If hardware is procured piecemeal by different agencies, then code must also be developed piecemeal, increasing barriers to collaboration. It is imperative, and extremely challenging, that the next generation of hardware and associated software upgrades be coordinated (NRC 2016). National ESPC must bring the agencies and the external community together to discuss and develop a credible, collaborative path forward.

Compounding the coordination challenge, climate modeling efforts have been reaching down to finer spatial scales and back to shorter time scales, overlapping weather scales and increasing the apparent duplication in numerical environmental prediction. Climate models exhibit the same agency-by-agency variety as NWP models; a recent study (NRC 2012) recommended a collaborative synoptic-to-climate prediction system in a modeling framework shared between the climate and weather-scale operations communities.

At the weather end of the time scale, the operational and research agencies face increasing requests for decision support at longer time scales to meet needs ranging from infrastructure management and planning to military planning and training to international assessments and humanitarian relief efforts. Just as the weather community found a need for more integrated modeling systems and establishment of a federal concept of operations to prevent internally contradictory predictions, these longer-range predictions and projections need internal consistency for appropriate response planning. In addition, projections and predictions must be consistent across time scales (seamless), allowing for proper planning leading smoothly to execution or “ready, set, go” (NRC 2016).

Developing an effective National ESPC requires basic and applied research efforts focused on critical path technologies such as coupled data assimilation, informed by key process studies, and applying increasing coordination within and between operational prediction centers. The basic research efforts build on more than two decades of work by the national and international weather and climate research community (programs such as WWRP and WCRP) to understand synoptic and climate predictability and variability and to improve the simulation of basic processes and the systems used for prediction. Future progress will need to build on those agency programs/partners, and computational resources will need to be made available to support current and future research needs.

The National ESPC program must not only encourage R2O but also establish improved communication from end users to operational prediction centers and to the research community on the successes and shortfalls of new modeling systems and products, with respect to the weather services’ computational and personnel resources. This O2R focus should shape the path of scientific research by informing proposal calls and awards, generating scientific interest, and providing a preoperational setting for testing new technology.

The National ESPC is a voluntary partnership among the five coordinating agencies with assigned staff; its strength comes from regular engagement and dialog with leaders and stakeholders at various levels across the participant agencies. It faces the challenge of having to achieve progress on collaboration while dealing with the realities of unique mission requirements and varying budget environments. Frequently, DOD, NOAA, and NSF environmental budgets vary independently according to political climate and external demands. In the coming months, National ESPC will be aligned under the federal meteorological committee structure administered through OFCM, with a coordinating linkage to CENRS. This arrangement is intended to provide the operational interagency coordination inherent in the OFCM structure while institutionalizing relationships with the policy and higher-level advocacy contacts of CENRS. As this arrangement is evaluated, the eventual possibility of elevating National ESPC to the level of CENRS subcommittee should also be considered.

SUMMARY.

The federal agencies have responded positively and aggressively to community calls for better coordination on research and operational environmental prediction. The National ESPC is the most recent step to address these issues and was created to accelerate efforts to meet national forecast, management, and planning needs and to better serve the community on time scales from days to decades. Ongoing coordination includes scientific development, model interoperability, and output coordination. The National ESPC leverages the existing interagency coordination efforts and organizes them into a national capability to improve mission support, resource management planning, and protection of life and property.

Significant challenges remain both in environmental research and in operational coordination. Only broad agency and community acceptance of common standards, common architecture, and common goals and significant, strategically guided collaborations will allow the United States to regain preeminence in environmental prediction.

ACKNOWLEDGMENTS.

The authors wish to thank all the government and academic members of the ESPC and NUOPC committees who have devoted their valuable time to enhancing collaboration among the U.S. developmental and operational numerical prediction communities and making the National ESPC a reality. The authors are indebted to Kyrstin Fornace for assistance with graphics design.

APPENDIX: ACRONYMS.

557th WW

557th Weather Wing (U.S. Air Force)

ACME

Accelerated Climate Modeling for Energy

AMOP

Administrative Model Oversight Panel (U.S. Navy)

ANL

Argonne National Laboratory (DOE)

CENRS

Committee on Environment, Natural Resources, and Sustainability

CESM

Community Earth System Model

CIME

Common Infrastructure for Modeling Earth

CLIVAR

Climate and Ocean: Variability, Predictability and Change

CMA

Common Model Architecture

CNMOC

Commander, Naval Meteorology and Oceanography Command (U. S. Navy)

CSC

Content Standards Committee

DOD

Department of Defense

DOE

Department of Energy

DON

Department of the Navy

DTC

Developmental Testbed Center

ECMWF

European Centre for Medium-Range Weather Forecasts

EMC

Environmental Modeling Center (NOAA/NWS)

ESGF

Earth System Grid Federation

ESMF

Earth System Modeling Framework

ESPC

Earth System Prediction Capability

ESPS

Earth System Prediction Suite

ESRL

Earth System Research Laboratory (NOAA/OAR)

FNMOC

Fleet Numerical Meteorology and Oceanography Center

GEOS-5

Goddard Earth Observing System Model, version 5 (NASA)

GFDL

Geophysical Fluid Dynamics Laboratory (NOAA/OAR)

GFS

Global Forecast System (NOAA/NWS)

GCRA

Global Change Research Act

GODAE

Global Ocean Data Assimilation Experiment

GSFC/GMAO

Goddard Space Flight Center Global Modeling and Assimilation Office (NASA)

HFIP

Hurricane Forecast Improvement Program

HPC

High-performance computing

HRRR

High-Resolution Rapid Refresh

HWRF

Hurricane Weather Research and Forecasting Model

HYCOM

Hybrid Coordinate Ocean Model

IA

Integrated assessment

IAV

Impact, adaptation, and vulnerability

ISI

Intraseasonal to interannual

IWG

Interagency working group

JCSDA

Joint Center for Satellite Data Assimilation

LLNL

Lawrence Livermore National Laboratory (DOE)

MAPL

Modeling Analysis and Prediction Layer

MERRA

Modern-Era Retrospective Analysis for Research and Applications

MMM

Mesoscale and Microscale Meteorology Laboratory (NCAR)

NAEFS

North American Ensemble Forecast System

NASA

National Aeronautics and Space Administration

NAVGEM

Navy Global Environmental Model

NAVO

Naval Oceanographic Office (U.S. Navy)

NCAR

National Center for Atmospheric Research

NCAS

National Centre for Atmospheric Science

NCEP

National Centers for Environmental Prediction (NOAA/NWS)

NCWCP

NOAA Center for Climate and Weather Prediction

NEMS

NOAA Environmental Modeling System

NESDIS

National Environmental Satellite, Data, and Information Service (NOAA)

NGGPS

Next-Generation Global Prediction System

NMME

North American Multi-Model Ensemble

NOAA

National Oceanic and Atmospheric Administration

NOPP

National Oceanographic Partnership Program

NPS

Naval Postgraduate School (U.S. Navy)

NRC

National Research Council

NRL/MRY

Naval Research Laboratory, Monterey (U.S. Navy)

NRL/SSC

Naval Research Laboratory, Stennis Space Center (U. S. Navy)

NSCI

National Strategic Computing Initiative

NSF

National Science Foundation

NUOPC

National Unified Operational Prediction Capability

NWP

Numerical weather prediction

NWS

National Weather Service (NOAA)

O2R

Operations to research

OAR

Office of Oceanic and Atmospheric Research (NOAA)

Obs4MIPS

Observations for Model Intercomparisons Project

OFCM

Office of the Federal Coordinator for Meteorology

OSTI

Office of Science and Technology Integration (NOAA/NWS)

PCMDI

Program for Climate Model Diagnosis and Intercomparison

PI

Physics interoperability

R&D

Research and development

R2O

Research to operations

RAP

Research Applications Program (NCAR)

S2S

Subseasonal to seasonal

SREF

Short-Range Ensemble Forecast

THORPEX

The Observing System Research and Predictability Experiment (WMO)

UCACN

UCAR Community Advisory Committee for NCEP

UCAR

University Corporation for Atmospheric Research

UMAC

UCACN Modeling Advisory Committee

USGCRP

U.S. Global Change Research Program

UW-APL

University of Washington Applied Physics Laboratory

WCRP

World Climate Research Programme (WMO)

WRF

Weather Research and Forecasting Model

WMO

World Meteorological Organization

WWRP

World Weather Research Programme (WMO)

REFERENCES

  • Bauer, P., A. Thorpe, and G. Brunet, 2015: The quiet revolution of numerical weather prediction. Nature, 525, 47–55, doi:10.1038/nature14956.

  • Candille, G., 2009: The multiensemble approach: The NAEFS example. Mon. Wea. Rev., 137, 1655–1665, doi:10.1175/2008MWR2682.1.

  • Dutton, J. A., 2002: Opportunities and priorities in a new era for weather and climate services. Bull. Amer. Meteor. Soc., 83, 1303–1311, doi:10.1175/1520-0477(2002)083<1303:OAPIAN>2.3.CO;2.

  • Gall, R., J. Franklin, F. Marks, E. N. Rappaport, and F. Toepfer, 2013: The Hurricane Forecast Improvement Project. Bull. Amer. Meteor. Soc., 94, 329–343, doi:10.1175/BAMS-D-12-00071.1.

  • Hooke, W. H., and R. A. Pielke Jr., 2000: Short-term weather prediction: An orchestra in search of a conductor. Prediction: Science Decision Making and the Future of Nature, D. Sarewitz, R. A. Pielke Jr., and R. Byerly, Eds., Island Press, 61–84.

  • Hurrell, J., G. A. Meehl, D. Bader, T. L. Delworth, B. Kirtman, and B. Wielicki, 2009: A unified modeling approach to climate system prediction. Bull. Amer. Meteor. Soc., 90, 1819–1832, doi:10.1175/2009BAMS2752.1.

  • Hurrell, J., and Coauthors, 2013: The Community Earth System Model: A framework for collaborative research. Bull. Amer. Meteor. Soc., 94, 1339–1360, doi:10.1175/BAMS-D-12-00121.1.

  • Kirtman, B. P., and Coauthors, 2014: The North American Multimodel Ensemble: Phase-1 seasonal-to-interannual prediction; Phase-2 toward developing intraseasonal prediction. Bull. Amer. Meteor. Soc., 95, 585–601, doi:10.1175/BAMS-D-12-00050.1.

  • Le Marshall, J., and Coauthors, 2007: The Joint Center for Satellite Data Assimilation. Bull. Amer. Meteor. Soc., 88, 329–340, doi:10.1175/BAMS-88-3-329.

  • Mass, C., 2006: The uncoordinated giant: Why U.S. weather research and prediction are not achieving their potential. Bull. Amer. Meteor. Soc., 87, 573–584, doi:10.1175/BAMS-87-5-573.

  • National Research Council, 1998: The Atmospheric Sciences: Entering the Twenty-First Century. National Academies Press, 384 pp.

  • National Research Council, 1999: A Vision for the National Weather Service: Road Map for the Future. National Academies Press, 88 pp.

  • National Research Council, 2000: From Research to Operations in Weather Satellites and Numerical Weather Prediction: Crossing the Valley of Death. National Academies Press, 96 pp.

  • National Research Council, 2005: Strategic Guidance for the National Science Foundation’s Support of the Atmospheric Sciences: An Interim Report. National Academies Press, 104 pp.

  • National Research Council, 2010: Assessment of Intraseasonal to Interannual Climate Prediction and Predictability. National Academies Press, 192 pp.

  • National Research Council, 2012: A National Strategy for Advancing Climate Modeling. National Academies Press, 300 pp.

  • National Research Council, 2016: Developing a U.S. Research Agenda to Advance Subseasonal to Seasonal Forecasting. National Academies Press, 372 pp.

  • Pielke, R. A., Jr., and R. E. Carbone, 2002: Weather impacts, forecasts, and policy: An integrated perspective. Bull. Amer. Meteor. Soc., 83, 393–403, doi:10.1175/1520-0477(2002)083<0393:WIFAP>2.3.CO;2.

  • Sandgathe, S., W. O’Connor, N. Lett, D. McCarren, and F. Toepfer, 2011: National Unified Operational Prediction Capability Initiative. Bull. Amer. Meteor. Soc., 92, 1347–1351, doi:10.1175/2011BAMS3212.1.

  • Shapiro, M., and A. Thorpe, 2004: THORPEX International science plan. WMO/TD-1246, WWRP/THORPEX 2, 55 pp. [Available online at www.wmo.int/pages/prog/arep/wwrp/new/documents/CD_ROM_international_science_plan_v3.pdf.]

  • Shuman, F. G., 1989: History of numerical weather prediction at the National Meteorological Center. Wea. Forecasting, 4, 286–296, doi:10.1175/1520-0434(1989)004<0286:HONWPA>2.0.CO;2.

  • Stensrud, D. J., H. E. Brooks, J. Du, M. S. Tracton, and E. Rogers, 1999: Using ensembles for short-range forecasting. Mon. Wea. Rev., 127, 433–446, doi:10.1175/1520-0493(1999)127<0433:UEFSRF>2.0.CO;2.

  • Theurich, G., and Coauthors, 2016: The Earth System Prediction Suite: Toward a coordinated U.S. modeling capability. Bull. Amer. Meteor. Soc., 97, 1229–1247, doi:10.1175/BAMS-D-14-00164.1.

  • Uccellini, L. W., 2012: The move is on to the new NOAA Center for Weather and Climate Prediction. Bull. Amer. Meteor. Soc., 93, 1631–1632.

  • UNISDR/CRED, 2015: The human cost of weather related disasters, 1995–2015. U.N. Office for Disaster Risk Reduction/Centre for Research on the Epidemiology of Disasters Rep., 27 pp. [Available online at www.unisdr.org/2015/docs/climatechange/COP21_WeatherDisastersReport_2015_FINAL.pdf.]

  • U.S. Department of Commerce, 2014: Fostering innovation, creating jobs, driving better decisions: The value of government data. Economics and Statistics Administration Rep., U.S. Department of Commerce, 54 pp. [Available online at www.esa.doc.gov/sites/default/files/revisedfosteringinnovationcreatingjobsdrivingbetterdecisions-thevalueofgovernmentdata.pdf.]

  • WMO, 2015: Seamless prediction of the earth system: From minutes to months. WMO-No. 1156, 481 pp. [Available online at http://library.wmo.int/pmb_ged/wmo_1156_en.pdf.]

1. It has been estimated that as much as one-third of U.S. GDP is weather sensitive (Dutton 2002; U.S. Department of Commerce 2014).

2. Flooding accounted for 47% of weather-related disasters during 1995–2015, primarily in Asia. Storms caused 40% of the total deaths from all weather-related disasters globally, with the vast majority of these deaths (89%) occurring in lower-income countries, even though those countries experienced only 26% of all storms (UNISDR/CRED 2015).
