• Anderson, M. C., C. R. Hain, B. D. Wardlow, A. Pimstein, J. R. Mecikalski, and W. P. Kustas, 2011: Evaluation of drought indices based on thermal remote sensing of evapotranspiration over the continental United States. J. Climate, 24, 2025–2044.
• Atlas, R., 1997: Atmospheric observations and experiments to assess their usefulness in data assimilation. J. Meteor. Soc. Japan, 75, 111–130.
• Auciello, E. P., and R. L. Lavoie, 1993: Collaborative research activities between National Weather Service operational offices and universities. Bull. Amer. Meteor. Soc., 74, 625–629.
• Bernardet, L., and Coauthors, 2008: The Developmental Testbed Center and its Winter Forecasting Experiment. Bull. Amer. Meteor. Soc., 89, 611–627.
• Brooks, H. E., C. A. Doswell III, and L. J. Wicker, 1993: STORMTIPE: A forecasting experiment using a three-dimensional cloud model. Wea. Forecasting, 8, 352–362.
• Brotzge, J., K. Hondl, B. Philips, L. Lemon, E. J. Bass, D. Rude, and D. L. Andra, 2010: Evaluation of distributed collaborative adaptive sensing for detection of low-level circulations and implications for severe weather warning operations. Wea. Forecasting, 25, 173–189.
• Case, J. L., S. V. Kumar, J. Srikishen, and G. J. Jedlovec, 2011: Improving numerical weather predictions of summertime precipitation over the southeastern United States through a high-resolution initialization of the surface state. Wea. Forecasting, 26, 785–807.
• Chou, S.-H., B. Zavodsky, and G. J. Jedlovec, 2009: Data assimilation and regional weather forecast using Atmospheric Infrared Sounder (AIRS) profiles. Preprints, 16th Conf. on Satellite Meteorology and Oceanography, Phoenix, AZ, Amer. Meteor. Soc., JP6.11. [Available online at https://ams.confex.com/ams/89annual/techprogram/paper_147745.htm.]
• Chung, D., and J. Teixeira, 2012: A simple model for stratocumulus to shallow cumulus cloud transitions. J. Climate, 25, 2547–2554.
• Clark, A. J., and Coauthors, 2012: An overview of the 2010 Hazardous Weather Testbed Experimental Forecast Program Spring Experiment. Bull. Amer. Meteor. Soc., 93, 55–74.
• Corfidi, S. F., 1999: The birth and early years of the Storm Prediction Center. Wea. Forecasting, 14, 507–525.
• Dabbert, W. F., and Coauthors, 2005: Multifunctional mesoscale observation networks. Bull. Amer. Meteor. Soc., 86, 961–982.
• Davidson, P., J. Tuell, L. Uccellini, J. Gaynor, S. Koch, R. Pierce, and M. Ralph, cited 2012: Recommended guidelines for testbeds and proving grounds. [Available online at www.testbeds.noaa.gov/pdf/Guidelines%20051911_v7.pdf.]
• DelSole, T., and M. K. Tippett, 2008: Predictable components and singular vectors. J. Atmos. Sci., 65, 1666–1678.
• DeMaria, M., J. A. Knaff, R. Knabb, C. Lauer, C. R. Sampson, and R. T. DeMaria, 2009: A new method for estimating tropical cyclone wind speed probabilities. Wea. Forecasting, 24, 1573–1591.
• DOC/NOAA/NESDIS, 2004: Third GOES users conference. DOC/NOAA/NESDIS Conf. Rep., 90 pp. [Available online at www.goes-r.gov.]
• Doswell, C. A., and J. A. Flueck, 1989: Forecasting and verifying in a field research project: DOPLIGHT '87. Wea. Forecasting, 4, 97–109.
• Goodman, S. J., and Coauthors, 2012: The GOES-R Proving Ground: Accelerating user readiness for the next-generation geostationary environmental satellite system. Bull. Amer. Meteor. Soc., 93, 1029–1040.
• Goodman, S. J., and Coauthors, 2013: The GOES-R Geostationary Lightning Mapper (GLM). Atmos. Res., 125–126, 34–49.
• Grasso, L. D., M. Sengupta, J. F. Dostalek, R. Brummer, and M. DeMaria, 2008: Synthetic satellite imagery for current and future environmental satellites. Int. J. Remote Sens., 29, 4373–4384.
• Haines, S. L., G. J. Jedlovec, and S. M. Lazarus, 2007: A MODIS sea surface temperature composite for regional applications. IEEE Trans. Geosci. Remote Sens., 45, 2919–2927.
• Heinselman, P. L., D. L. Priegnitz, K. L. Manross, T. M. Smith, and R. W. Adams, 2008: Rapid sampling of severe storms by the National Weather Radar Testbed Phased Array Radar. Wea. Forecasting, 23, 808–824.
• Hillger, D., L. Grasso, S. Miller, R. Brummer, and R. DeMaria, 2011: Synthetic Advanced Baseline Imager (ABI) true-color imagery. J. Appl. Remote Sens., 5, 592–597.
• Jedlovec, G. J., J. Vazquez, E. Armstrong, and S. Haines, 2009: Combined MODIS/AMSR-E composite SST data for regional weather applications. Preprints, 16th Conf. on Satellite Meteorology and Oceanography, Phoenix, AZ, Amer. Meteor. Soc., JP8.6. [Available online at https://ams.confex.com/ams/89annual/techprogram/paper_145839.htm.]
• Junker, N. W., R. H. Grumm, R. Hart, L. F. Bosart, K. M. Bell, and F. J. Pereira, 2008: Use of normalized anomaly fields to anticipate extreme rainfall in the mountains of Northern California. Wea. Forecasting, 23, 336–356.
• Kain, J. S., 2004: The Kain–Fritsch convective parameterization: An update. J. Appl. Meteor., 43, 170–181.
• Kain, J. S., P. R. Janish, S. J. Weiss, M. E. Baldwin, R. S. Schneider, and H. E. Brooks, 2003: Collaboration between forecasters and research scientists at the NSSL and SPC: The Spring Program. Bull. Amer. Meteor. Soc., 84, 1797–1806.
• Kain, J. S., S. J. Weiss, J. J. Levit, M. E. Baldwin, and D. R. Bright, 2006: Examination of convection-allowing configurations of the WRF model for the prediction of severe convective weather: The SPC/NSSL Spring Program 2004. Wea. Forecasting, 21, 167–181.
• Kain, J. S., S. R. Dembek, S. J. Weiss, J. L. Case, J. J. Levit, and R. A. Sobash, 2010: Extracting unique information from high-resolution forecast models: Monitoring selected fields and phenomena every time step. Wea. Forecasting, 25, 1536–1542.
• Kirtman, B. P., cited 2011: Toward a National Multi-Model Ensemble (NMME) system for operational intra-seasonal to interannual (ISI) climate forecasts. [Available online at www.cpc.ncep.noaa.gov/products/ctb/MMEWhitePaperCPO_revised.pdf.]
• Kirtman, B. P., and D. Min, 2009: Multimodel ensemble ENSO prediction with CCSM and CFS. Mon. Wea. Rev., 137, 2908–2930.
• Lakshmanan, V., T. Smith, G. J. Stumpf, and K. Hondl, 2007: The warning decision support system–integrated information. Wea. Forecasting, 22, 596–612.
• Lazo, J. K., R. E. Morss, and J. L. Demuth, 2009: 300 billion served: Sources, perceptions, uses, and values of weather forecasts. Bull. Amer. Meteor. Soc., 90, 785–798.
• Lee, T., and Coauthors, 2010: NPOESS: Next generation operational global Earth observations. Bull. Amer. Meteor. Soc., 91, 727–740.
• Le Marshall, J., and Coauthors, 2007: The Joint Center for Satellite Data Assimilation. Bull. Amer. Meteor. Soc., 88, 329–340.
• Levit, J. J., B. Entwistle, and C. Wallace, 2011: The Aviation Weather Testbed: Infusion of new science and technology for aviation operations. Preprints, Second Aviation, Range, and Aerospace Meteorology Special Symp. on Weather–Air Traffic Management Integration, Seattle, WA, Amer. Meteor. Soc., 3.3. [Available online at https://ams.confex.com/ams/91Annual/webprogram/Paper185616.html.]
• Loss, G., D. Bernhardt, K. K. Fuell, and G. T. Stano, 2009: An operational assessment of the MODIS false color composite with the Great Falls, Montana National Weather Service. Preprints, 23rd Conf. on Hydrology, Phoenix, AZ, Amer. Meteor. Soc., P4.2. [Available online at https://ams.confex.com/ams/89annual/techprogram/paper_147478.htm.]
• McCarty, W., G. Jedlovec, and T. L. Miller, 2009: Impact of the assimilation of Atmospheric Infrared Sounder radiance measurements on short-term weather forecasts. J. Geophys. Res., 114, D18122, doi:10.1029/2008JD011626.
• McPherson, R. D., 1994: The National Centers for Environmental Prediction: Operational climate, ocean, and weather prediction for the 21st century. Bull. Amer. Meteor. Soc., 75, 363–373.
• Morss, R. E., and F. M. Ralph, 2007: Use of information by National Weather Service forecasters and emergency managers during the CALJET and PACJET-2001. Wea. Forecasting, 22, 539–555.
• NAS, 2000: From research to operations in weather satellites and numerical weather prediction: Crossing the valley of death. The National Academies Board on Atmospheric Sciences and Climate Rep., 80 pp. [Available online at http://dels.nas.edu/Report/From-Research-Operations-Weather/9948.]
• Neiman, P. J., F. M. Ralph, G. A. Wick, Y.-H. Kuo, T.-K. Wee, Z. Ma, G. H. Taylor, and M. D. Dettinger, 2008: Diagnosis of an intense atmospheric river impacting the Pacific Northwest: Storm summary and offshore vertical structure observed with COSMIC satellite retrievals. Mon. Wea. Rev., 136, 4398–4420.
• NOAA Science Workshop Program Committee, cited 2010: Strengthening NOAA Science: Findings from the NOAA Science Workshop. [Available online at http://nrc.noaa.gov/sites/nrc/Documents/Workshops/Science_Workshop_2010_WP_FINAL.pdf.]
• Otkin, J. A., and T. J. Greenwald, 2008: Comparison of WRF model-simulated and MODIS-derived cloud data. Mon. Wea. Rev., 136, 1957–1970.
• Paolino, D. A., J. L. Kinter III, B. P. Kirtman, D. Min, and D. M. Straus, 2012: The impact of land surface and atmospheric initialization on seasonal forecasts with CCSM. J. Climate, 25, 1007–1021.
• Petersen, R., and R. Aune, 2009: Optimizing the impact of GOES sounder products in very-short-range forecasts—Recent results and future plans. [Available online at https://ams.confex.com/ams/89annual/techprogram/paper_149174.htm.]
• Petrescu, E., and T. Hall, 2009: IC4D—A new tool for producing four-dimensional aviation forecasts. Preprints, Aviation, Range, and Aerospace Meteorology Special Symp. on Weather–Air Traffic Management Integration, San Diego, CA, Amer. Meteor. Soc., P1.11. [Available online at https://ams.confex.com/ams/89annual/techprogram/paper_148046.htm.]
• Ralph, F. M., and M. D. Dettinger, 2011: Storms, floods and the science of atmospheric rivers. Eos, Trans. Amer. Geophys. Union, 92, 265–266.
• Ralph, F. M., and M. D. Dettinger, 2012: Historical and national perspectives on extreme West Coast precipitation associated with atmospheric rivers during December 2010. Bull. Amer. Meteor. Soc., 93, 783–790.
• Ralph, F. M., and Coauthors, 2005: Improving short-term (0–48 h) cool-season quantitative precipitation forecasting: Recommendations from a USWRP workshop. Bull. Amer. Meteor. Soc., 86, 1619–1632.
• Ralph, F. M., P. J. Neiman, G. A. Wick, S. I. Gutman, M. D. Dettinger, D. R. Cayan, and A. B. White, 2006: Flooding on California's Russian River: Role of atmospheric rivers. Geophys. Res. Lett., 33, L13801, doi:10.1029/2006GL026689.
• Ralph, F. M., E. Sukovich, D. Reynolds, M. Dettinger, S. Weagle, W. Clark, and P. J. Neiman, 2010: Assessment of extreme quantitative precipitation forecasts and development of regional extreme event thresholds using data from HMT-2006 and COOP observers. J. Hydrometeor., 11, 1288–1306.
• Ralph, F. M., E. Sukovich, G. N. Kiladis, and K. Weickmann, 2011: A multiscale observational case study of a Pacific atmospheric river exhibiting tropical–extratropical connections and a mesoscale frontal wave. Mon. Wea. Rev., 139, 1169–1189.
• Rappaport, E. N., J.-G. Jiing, C. W. Landsea, S. T. Murillo, and J. L. Franklin, 2012: The Joint Hurricane Test Bed: Its first decade of tropical cyclone research-to-operations activities reviewed. Bull. Amer. Meteor. Soc., 93, 371–380.
• Reynolds, D., 2003: Value-added quantitative precipitation forecasts: How valuable is the forecaster? Bull. Amer. Meteor. Soc., 84, 876–878.
• Rozumalski, R. A., 2007: WRF Environmental Modeling System User's Guide: Demystifying the process of installing, configuring, and running the Weather Research and Forecasting model. NOAA/NWS Forecast Decision Training Branch, COMET/UCAR, 95 pp. [Available online at http://strc.comet.ucar.edu/wrf/wrfems_userguide.htm.]
• Scharfenberg, K. A., and Coauthors, 2005: The Joint Polarization Experiment: Polarimetric radar in forecasting and warning decision making. Wea. Forecasting, 20, 775–788.
• Schmit, T. J., M. M. Gunshor, W. P. Menzel, J. Li, S. Bachmeier, and J. J. Gurka, 2005: Introducing the next-generation Advanced Baseline Imager on GOES-R. Bull. Amer. Meteor. Soc., 86, 1079–1096.
• Shaw, T. A., and J. Perlwitz, 2010: The impact of stratospheric model configuration on planetary-scale waves in Northern Hemisphere winter. J. Climate, 23, 3369–3389.
• Shaw, T. A., J. Perlwitz, and N. Harnik, 2010: Downward wave coupling between the stratosphere and troposphere: The importance of meridional wave guiding and comparison with zonal-mean coupling. J. Climate, 23, 6365–6381.
• Stano, G. T., K. K. Fuell, and G. J. Jedlovec, 2010: NASA SPoRT GOES-R Proving Ground activities. Preprints, Sixth Annual Symp. on Future National Operational Environmental Satellite Systems: NPOESS and GOES-R, Atlanta, GA, Amer. Meteor. Soc., 8.2. [Available online at https://ams.confex.com/ams/90annual/techprogram/paper_163879.htm.]
• Suselj, K., J. Teixeira, and G. Matheou, 2012: Eddy diffusivity/mass flux and shallow cumulus boundary layer: An updraft PDF multiple mass flux scheme. J. Atmos. Sci., 69, 1513–1533.
• Teixeira, J., and Coauthors, 2011: Tropical and subtropical cloud transitions in weather and climate prediction models: The GCSS/WGNE Pacific Cross-Section Intercomparison (GPCI). J. Climate, 24, 5223–5256.
• Tippett, M. K., T. DelSole, S. J. Mason, and A. G. Barnston, 2008: Regression-based methods for finding coupled patterns. J. Climate, 21, 4384–4398.
• Tollerud, E. I., and Coauthors, 2013: The DTC ensembles task: A new testing and evaluation facility for mesoscale ensembles. Bull. Amer. Meteor. Soc., 94, 321–327.
• White, A. B., D. J. Gottas, A. F. Henkel, P. J. Neiman, F. M. Ralph, and S. I. Gutman, 2010: Developing a performance measure for snow-level forecasts. J. Hydrometeor., 11, 739–753.
• White, A. B., and Coauthors, 2012: NOAA's rapid response to the Howard A. Hanson Dam flood risk management crisis. Bull. Amer. Meteor. Soc., 93, 189–207.
• Wolff, J. K., B. S. Ferrier, and C. F. Mass, 2012: Establishing closer collaboration to improve model physics for short-range forecasts. Bull. Amer. Meteor. Soc., 93, ES51–ES53.
• Wolfson, M. M., W. J. Dupree, R. Rasmussen, M. Steiner, S. Benjamin, and S. Weygandt, 2008: Consolidated Storm Prediction for Aviation (CoSPA). Preprints, 13th Conf. on Aviation, Range, and Aerospace Meteorology, New Orleans, LA, Amer. Meteor. Soc., J6.5. [Available online at https://ams.confex.com/ams/88Annual/techprogram/paper_132981.htm.]
• Xiao, H., C.-M. Wu, and C. R. Mechoso, 2012: A treatment for the stratocumulus-to-cumulus transition in GCMs. Climate Dyn., 39, 3075–3089.
• Zavodsky, B. T., S.-H. Chou, and G. J. Jedlovec, 2012: Improved regional analyses and heavy precipitation forecasts with assimilation of Atmospheric Infrared Sounder retrieved thermodynamic profiles. IEEE Trans. Geosci. Remote Sens., 50, 4243–4251.
• Conceptual schematic of the test bed process for a hypothetical project, tool, or concept—including innovation, demonstration, evaluation, and, where suitable, a transition to operations within a federal, state, or local organization. NOS = National Ocean Service; USBR = United States Bureau of Reclamation; and USACE = U.S. Army Corps of Engineers.
• NOAA Testbed newsletters published since fall 2009.
• NHC's wind speed probability product for 1-min average tropical-storm-force winds (34 kt, where 1 kt = 0.51 m s−1) for (left) Hurricane Leslie and (right) Hurricane Michael, issued at 1200 UTC 6 Sep 2012. The shading represents the probability (percent) that sustained tropical-storm-force surface winds will occur during the forecast period in the shaded area on the map.
• Examples of atmospheric river events (from Ralph et al. 2011).
• HMT-West Legacy mesonet being installed in California by NOAA and partners as part of the CA-DWR's Enhanced Flood Response and Emergency Preparedness Observing Network.
• JCSDA enhances the usefulness of current satellite observations and accelerates the assimilation of data from new instruments, including infrared, microwave, active, passive, and geo- and polar-based measurements.
• Experimental warning exercises during the 2011 HWT Spring Experiment. Shown are (left) Steve Keighton (Blacksburg, VA, WFO) and (right) Kevin Brown (Norman, OK, WFO). In the background is the Norman WFO forecast operations center.
• Schematic diagram of activity areas of the DTC.
• CLIMAS–CPC collaborative development of an interactive web tool for the CPC 3-month Climate Outlook.
• GOES-R Proving Ground partners and sample products demonstrated to forecasters.
• Participants collaborate in the AWT during the 2011 Summer Experiment.
• (a) Example of the aviation weather impact graphic forecast. Contours highlight a high, medium, or low potential threat of convection impacts to the golden triangle area of the National Airspace System. (b) Example of the "probability exceedance" graphic. Contours indicate either a 30% or 60% probability of convection reaching a combined reflectivity value of 40 dBZ and a radar echo height of 37,000 ft or greater.
• Today's predictive services exist on a foundation of prior science and technology innovation.
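The wind speed probability caption above quotes the tropical-storm-force threshold in knots (34 kt, where 1 kt = 0.51 m s−1) and describes a probability of exceedance. A minimal sketch of those two calculations follows; the function names are invented for illustration, and this is not the NHC Monte Carlo method of DeMaria et al. (2009):

```python
# Illustrative sketch only (not NHC's method): converts the 34-kt threshold
# to m/s using the caption's stated factor, and computes a toy exceedance
# probability from a list of hypothetical ensemble wind speeds.

KT_TO_MS = 0.51  # conversion factor stated in the figure caption


def kt_to_ms(speed_kt: float) -> float:
    """Convert a wind speed from knots to meters per second."""
    return speed_kt * KT_TO_MS


def exceedance_probability(samples_kt, threshold_kt=34.0):
    """Fraction of hypothetical ensemble members at or above the threshold."""
    return sum(s >= threshold_kt for s in samples_kt) / len(samples_kt)
```

For example, `kt_to_ms(34.0)` gives about 17.3 m s−1, consistent with the usual tropical-storm-force threshold.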


The Emergence of Weather-Related Test Beds Linking Research and Forecasting Operations

  • 1 NOAA/Earth System Research Laboratory, Boulder, Colorado
  • | 2 NOAA/National Weather Service, Norman, Oklahoma
  • | 3 NOAA/Atlantic Oceanographic and Meteorological Laboratory, Miami, Florida
  • | 4 Joint Center for Satellite Data Assimilation, Camp Springs, Maryland
  • | 5 NOAA/National Weather Service, National Centers for Environmental Prediction, Kansas City, Missouri
  • | 6 Office of Science and Technology, NOAA/National Weather Service, Silver Spring, Maryland
  • | 7 NOAA/National Weather Service, National Centers for Environmental Prediction, Kansas City, Missouri
  • | 8 NOAA/Office of Policy, Planning, and Evaluation, Silver Spring, Maryland
  • | 9 NOAA/National Environmental Satellite, Data, and Information Service, and NASA Goddard Space Flight Center, Greenbelt, Maryland
  • | 10 NOAA/National Hurricane Center, Miami, Florida
  • | 11 NOAA/National Weather Service, National Centers for Environmental Prediction, Kansas City, Missouri
  • | 12 Climate Prediction Center, NOAA/National Weather Service, Camp Springs, Maryland
  • | 13 NASA Marshall Space Flight Center, Huntsville, Alabama
  • | 14 NOAA/National Severe Storms Laboratory, Norman, Oklahoma, and Cooperative Institute for Mesoscale Meteorological Studies, University of Oklahoma, Norman, Oklahoma
  • | 15 National Center for Atmospheric Research, Boulder, Colorado
  • | 16 NOAA/National Weather Service, National Centers for Environmental Prediction, Kansas City, Missouri
  • | 17 NOAA/Atlantic Oceanographic and Meteorological Laboratory, Miami, Florida
  • | 18 Joint Center for Satellite Data Assimilation, Camp Springs, Maryland
  • | 19 Office of Hydrologic Development, NOAA/National Weather Service, Boulder, Colorado
  • | 20 Storm Prediction Center, NOAA/National Weather Service, Norman, Oklahoma
  • | 21 NOAA/National Severe Storms Laboratory, Norman, Oklahoma, and Cooperative Institute for Mesoscale Meteorological Studies, University of Oklahoma, Norman, Oklahoma
  • | 22 Storm Prediction Center, NOAA/National Weather Service, Norman, Oklahoma

Test beds have emerged as a critical mechanism linking weather research with forecasting operations. The U.S. Weather Research Program (USWRP) was formed in the 1990s to help identify key gaps in research related to major weather prediction problems and the role of observations and numerical models. This planning effort ultimately revealed the need for greater capacity and new approaches to improve the connectivity between the research and forecasting enterprise.

Out of this developed the seeds for what is now termed “test beds.” While many individual projects, and even more broadly the NOAA/National Weather Service (NWS) Modernization, were successful in advancing weather prediction services, it was recognized that specific forecast problems warranted a more focused and elevated level of effort. The USWRP helped develop these concepts with science teams and provided seed funding for several of the test beds described.

Based on the varying NOAA mission requirements for forecasting, differences in the organizational structure and methods used to provide those services, and differences in the state of the science related to those forecast challenges, test beds have taken on differing characteristics, strategies, and priorities. Current test bed efforts described have all emerged between 2000 and 2011 and focus on hurricanes (Joint Hurricane Testbed), precipitation (Hydrometeorology Testbed), satellite data assimilation (Joint Center for Satellite Data Assimilation), severe weather (Hazardous Weather Testbed), satellite data support for severe weather prediction (Short-Term Prediction Research and Transition Center), mesoscale modeling (Developmental Testbed Center), climate forecast products (Climate Testbed), testing and evaluation of satellite capabilities [Geostationary Operational Environmental Satellite-R Series (GOES-R) Proving Ground], aviation applications (Aviation Weather Testbed), and observing system experiments (OSSE Testbed).

CORRESPONDING AUTHOR: Marty Ralph, NOAA/Earth System Research Laboratory, R/E/PSD, 325 Broadway, Boulder, CO 80305, E-mail: marty.ralph@noaa.gov

A supplement to this article is available online (10.1175/BAMS-D-12-00080.2)


Test beds have become an integral part of the weather enterprise, bridging research and forecast services by transitioning innovative tools and tested methods that impact forecasts and forecast users.

Over roughly the last decade, a variety of "test beds" have come into existence focused on high-impact weather and the core tools of meteorology—observations, models, and fundamental understanding of the underlying physical processes. They have entered the proverbial "valley of death" between research and forecast operations (NAS 2000), and have survived. This paper provides a brief background on how this happened; summarizes test bed origins, methods, and selected accomplishments; and offers a perspective on the future of test beds in our field. Dabbert et al. (2005) provide a useful description of test beds from early in their development, and Fig. 1 summarizes their role.

Fig. 1.

Conceptual schematic of the test bed process for a hypothetical project, tool, or concept—including innovation, demonstration, evaluation, and, where suitable, a transition to operations within a federal, state, or local organization. NOS = National Ocean Service; USBR = United States Bureau of Reclamation; and USACE = U.S. Army Corps of Engineers.

Citation: Bulletin of the American Meteorological Society 94, 8; 10.1175/BAMS-D-12-00080.1

Many test beds trace their origins to the U.S. Weather Research Program (USWRP)'s goal of linking weather research and forecasting operations more effectively. Although USWRP leadership initially envisioned that the associated gaps in capabilities and funding could be filled through major new federal appropriations (on the order of $100 million per year), no single appropriation of that scale was achieved. Instead, the National Oceanic and Atmospheric Administration (NOAA) received roughly $3 million per year in core funding for USWRP, which has seeded the development of several test beds, some of which now receive core, long-term funding from their host agencies and are no longer supported directly by NOAA/USWRP. Today, test beds involve multiple agencies—including NOAA, the National Aeronautics and Space Administration (NASA), the Federal Aviation Administration (FAA), and the Department of Defense (DoD)—and represent a major focus of effort in meteorology; although exact numbers are difficult to pin down, current investments are easily in the tens of millions of dollars per year. Individual test beds often have a scope of effort of roughly $1–$5 million per year, including "core" funding and "project" funding. The core funding establishes a long-term foundation and capability, while project funding leverages that foundation to deliver on specific tasks for project sponsors, often in a highly synergistic manner. Creating a test bed has typically involved ramping up over 2–4 years, starting with $100–$500K of funding. Table 1 lists the 10 test beds covered in this paper and briefly summarizes their key attributes.1

Table 1.

Overview of test bed descriptions and information (details at NOAA Testbed portal website: www.testbeds.noaa.gov).

Test bed accomplishments cover a wide range of applications and techniques, from new scientific understanding to better modeling and predictive tools, greater awareness of how weather information is used, and improved outcomes for society. These are achieved through a diverse set of technical and organizational approaches that have emerged organically to fill individual gaps in existing predictive or scientific capabilities. In spite of this diversity, test beds share some notable commonalities. They often include a core research laboratory that contributes scientific staff, tools, and administrative infrastructure. There is usually a specific National Centers for Environmental Prediction (NCEP) center integrated into test bed activities, including planning, testing, and adoption of suitable new methods and tools. Weather Forecast Offices (WFOs) and River Forecast Centers (RFCs) are often engaged, as are key users of forecasts. University investigators and students have been involved, which has led to employment opportunities for recent graduates and has infused the National Weather Service (NWS) and NOAA laboratories with people experienced in, and oriented toward, bridging research and forecasting operations. Test bed results include advances in science and prediction, but also "intangibles": exposure to operational forecasting challenges can spawn a new research direction, and a forecaster exposed to new science or tools may realize that a valuable new way of using existing observations, models, or forecast tools can be achieved with minimal effort.

As it became clear, by 2008, that several test beds had been created and were producing important results (publications, demonstrations of new tools/methods, transitions into forecasting operations, etc.), it was decided to hold a “NOAA Testbeds Workshop,” which was carried out in April 2009. Roughly 70 participants gathered for two days to share experiences and lessons learned. Two additional workshops have been held thus far—in 2010 and 2012. One of the outcomes of the first workshop was the creation of a NOAA Testbeds website (www.testbeds.noaa.gov) and a NOAA Testbed newsletter (Fig. 2). The second workshop revealed a need for greater coordination regarding recommendations from multiple test beds for major new NOAA observational or modeling infrastructure. Additionally, gaps were identified in capabilities across test beds, and the need for advocacy of test beds as a strategy for NOAA was recognized. In response, NOAA formed a Testbeds and Proving Ground Coordinating Committee, which was approved formally by NOAA leadership. This committee, which includes a manager for each test bed/proving ground and representatives from relevant NOAA line offices, organized the third workshop including identification of extreme precipitation as an integrating theme that engaged several test beds.

Fig. 2.

NOAA Testbed newsletters published since fall 2009.

Citation: Bulletin of the American Meteorological Society 94, 8; 10.1175/BAMS-D-12-00080.1

The test bed summaries herein were prepared by their respective leadership and are presented roughly in the order each was created (Table 1 and Table ES1 of the online supplement contain a listing and brief descriptions of each test bed). Each section includes information regarding the primary focus, objectives, tools used, organizational approach, selected accomplishments, and links to further information. The report concludes with a brief synopsis and description of potential future directions.

JOINT HURRICANE TESTBED (JHT).

The USWRP formed the JHT in late 2000 in response to the need articulated by the National Research Council's workshop report to bridge advances in research to the operational environment (NAS 2000). The JHT's mission is to smoothly and rapidly transfer new technology, research results, and observational advances into improved tropical cyclone analysis and prediction at operational centers. This mission is accomplished by identifying promising techniques, applications, or systems being developed by external scientists, and by supporting their testing, evaluation, and modification in a quasi-operational environment.

The JHT is located at the National Hurricane Center (NHC) and is governed by a terms of reference document (www.nhc.noaa.gov/jht/JHTTOR.13Sep2002.pdf) summarizing its organization and operation. Federal assistance through NOAA allows scientists to tailor their techniques for the operational environment. The total annual JHT budget has varied between about $1 million and $1.5 million, all of which has been provided by the USWRP to fund proposals submitted by the research community. Although NOAA/USWRP provides funding for some of JHT's infrastructure, JHT relies on NHC for critical forecaster, administrative, technical, and logistical support. NHC forecasters serve as scientific “points of contact,” providing guidance throughout the project cycle. The NHC also maintains JHT computer equipment, provides real-time data, and collaborates with project investigators to facilitate testing and evaluation. Researchers also work with other national centers [e.g., NCEP/Environmental Modeling Center (EMC)] with facilitation provided by the JHT. The NOAA/Atlantic Oceanographic and Meteorological Laboratory (AOML)'s Hurricane Research Division (HRD) is a primary research partner with JHT and has contributed staff members to two important JHT positions for almost 10 years. One HRD senior scientist sits on the JHT's Steering Committee (SC), and has served as the SC's research colead. HRD also staffs one of the two administrative assistant positions supporting the JHT director. HRD scientists have submitted numerous test bed project proposals that have been awarded funding. Some of these have benefited NHC's forecast operations.

Administration of the JHT comprises a director, two administrative assistants (each devoting a quarter of their time), and a full-time information technology (IT) facilitator. The JHT IT environment closely mimics the basic NHC IT environment (data flow and formats, communications, hardware platforms, software applications, etc.) in order to test each technique and best prepare it for possible operational implementation at the conclusion of the project. NHC provides the JHT environment with real-time access to the operational data stream. The JHT Steering Committee advises the JHT director on all JHT activities; its primary responsibility is to review proposals submitted to the JHT by the research community. The committee comprises seven members who broadly represent the tropical cyclone community, including representatives from NOAA and DoD tropical cyclone operations and research, as well as academia.

JHT's proposal-driven transitions begin with a biennial Announcement of Opportunity (AO), which states the program objectives and priorities and contains a list of NHC, Central Pacific Hurricane Center (CPHC), Joint Typhoon Warning Center (JTWC), and EMC analysis and forecast improvement needs that have been identified and prioritized by these centers. Researchers submit proposals as part of a competitive process. Proposals are evaluated on the scientific merit of the technique; its readiness for real-time testing, based on factors such as research maturity; the priority of the analysis–forecast issue addressed; technical compatibility with operational systems; and cost.

After 1–2 years of testing, the conclusion of a JHT project is followed by the submission of a final JHT report to NHC's director and/or other operational center(s) if applicable. This report comes from the JHT staff and is based on its evaluation and input from the project scientist(s) and NHC's points of contact. NHC's director makes the decision on whether to begin the process of operational implementation of the techniques resulting from the project—decisions on model changes are made at EMC, with NHC input. The NHC director's decisions are based on an analysis of forecast or analysis benefit, efficiency, IT compatibility, and sustainability.

DEFINITION AND OBJECTIVES FOR NOAA TEST BEDS AND PROVING GROUNDS

TEST BEDS

  • i) Definition and purpose: A NOAA test bed is a working relationship for developmental testing in a quasi-operational framework among researchers and operational scientists/experts (such as measurement specialists, forecasters, and IT specialists) including partners in academia, the private sector, and government agencies, aimed at solving operational problems or enhancing operations in the context of user needs. A successful test bed involves physical assets as well as substantial commitments and partnerships.
  • ii) What is tested: Advances to be considered include candidates for more effective observing systems, better use of data in forecasts, improved forecast models, and applications for improved services and information with demonstrated economic/public safety benefits.
  • iii) Objectives: Test beds accelerate the translation of research and development (R&D) findings into better operations, services, and decision making. Outcomes from a test bed are capabilities that have been shown to work with operational systems and could include more effective observing systems, better use of data in forecasts, improved forecast models, and applications for improved services and information with demonstrated economic/public safety benefits. Successfully demonstrated test bed capabilities are ready for advanced predeployment testing, in a full simulation of real-time operational conditions, leading to “go/no go” deployment decisions.

OPERATIONS AND SERVICES PROVING GROUNDS

  • i) Definition and purpose: Operations and services proving grounds are a framework for NOAA/NWS to conduct testing of advanced operations, services, and science and technology capabilities that address the needs of both internal and external users. Successful testing demonstrates readiness to implement into operations.
  • ii) What is tested: Capabilities to be tested in operational proving grounds have already passed developmental testing. Such capabilities include advanced observing systems, better use of data in forecasts, improved forecast models, and applications for improved services and information with demonstrated economic/public safety benefits.
  • iii) Objectives: Capabilities are tested in real time, in an operations-like setting, to demonstrate achievement of performance metrics, including testing of any workflow changes needed for implementation in operations as well as end-to-end delivery of services. Performance metrics are defined for each candidate capability in categories of objective performance (e.g., accuracy/skill), subjective evaluations of utility (e.g., user feedback that is on balance positive), and production/engineering readiness (e.g., systems and communications reliability/security/backup, data retention). Performance criteria for objective and subjective evaluations by users internal to NWS include expected impacts on workflow and workload, except when advanced capabilities have no such impact (e.g., improvements to the numeric quality of current operational guidance and tools). Successful predeployment testing is necessary for approval to implement into operations. (Excerpted from Davidson et al. 2012.)

Since the JHT's inception, NHC and other operational centers (e.g., CPHC and JTWC) have interacted with scientists on 74 projects, with over half of them implemented into operations. Rappaport et al. (2012) examined the first 10 years of the JHT, its impact on operations, and JHT's contributions to NHC's forecast operations. One project of note developed a way to describe the probability of tropical cyclone wind speed thresholds (DeMaria et al. 2009), which is now a routine operational product (Fig. 3 shows an example of two hurricanes). Improvements in tropical cyclone monitoring and prediction in recent years can be credited to the successful implementation of JHT projects.
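The wind speed probability guidance of DeMaria et al. (2009) is based on Monte Carlo sampling of forecast errors. As a rough illustration only—the operational product samples track, intensity, and wind-radii errors from historical distributions, whereas the single Gaussian intensity error, error magnitude, and sample count below are all hypothetical—the core exceedance calculation can be sketched as:

```python
# Hedged sketch of the Monte Carlo idea behind wind speed probability
# guidance (cf. DeMaria et al. 2009): perturb a deterministic intensity
# forecast with random errors and report the fraction of realizations
# reaching tropical-storm force (34 kt). The Gaussian error model and
# all numeric values are illustrative assumptions, not operational ones.
import random

def prob_ts_force(forecast_kt, error_sd_kt, n=10_000, threshold=34.0, seed=1):
    """Estimate P(wind >= threshold) given a forecast and an error spread."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    hits = sum(1 for _ in range(n)
               if forecast_kt + rng.gauss(0.0, error_sd_kt) >= threshold)
    return hits / n

# Hypothetical 48-h forecast of 40 kt with a 15-kt error standard deviation:
print(round(prob_ts_force(40.0, 15.0), 2))
```

With the forecast well above the threshold relative to the error spread, the estimated probability approaches 1; well below it, the probability approaches 0, which is the qualitative behavior shown by the shading in products like Fig. 3.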

Fig. 3.

NHC's wind speed probability product for 1-min average tropical-storm-force wind (34 kt, where 1 kt = 0.51 m s−1) for (left) Hurricane Leslie and (right) Hurricane Michael, issued at 1200 UTC 6 Sep 2012. The shading represents the probability (percent) that sustained tropical-storm-force surface winds will occur during the forecast period in the shaded area on the map.


NOAA'S HYDROMETEOROLOGY TESTBED (HMT).

Extreme precipitation and the related hydrometeorological "forcings" that contribute to flooding, such as soil moisture and snowpack, are the focus of HMT (Ralph et al. 2005). Flooding has triggered more presidential disaster declarations than any other single natural hazard and has contributed, on average, more than $3 billion per year in damages nationally. Additionally, a recent study of public use and perceptions of weather forecasts determined that precipitation forecasts were the single most used component of weather forecasts (Lazo et al. 2009).

In spite of its crucial role in both extreme and day-to-day events, quantitative precipitation forecasting (QPF) has remained one of the great challenges in meteorology, especially for extreme events (e.g., Reynolds 2003; Ralph et al. 2010). Currently, NOAA's QPF performance is measured in terms of the "threat score" for forecasts of 1 in. or greater rainfall in 24 h with 1-day lead time. These forecasts are issued by NWS's Hydrometeorological Prediction Center (HPC). Threat scores typically range between 0.25 and 0.35 (1.0 would be a perfect forecast). However, this verification metric does not address the highest-impact events, which can often exceed 3–5 in. of rain in 1 day, or >8 in. in 3 days (Ralph and Dettinger 2012), and are even more difficult to predict.
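For concreteness, the threat score (also known as the critical success index) is computed from a 2 × 2 contingency table as hits divided by the sum of hits, misses, and false alarms. A minimal sketch with made-up counts (not actual HPC verification statistics):

```python
# Threat score (critical success index) for a binary precipitation
# forecast, e.g., "1 in. or greater rainfall in 24 h" at each grid point.
# The counts below are hypothetical, chosen only to land in the
# typical 0.25-0.35 range cited in the text.

def threat_score(hits, misses, false_alarms):
    """TS = hits / (hits + misses + false_alarms); 1.0 is a perfect forecast.

    Correct negatives (event neither forecast nor observed) do not
    enter the score, which is why TS suits rare events like heavy rain.
    """
    denom = hits + misses + false_alarms
    return hits / denom if denom else 0.0

print(round(threat_score(hits=30, misses=40, false_alarms=30), 2))  # 0.3
```

Because correct negatives are excluded, a forecaster cannot inflate the score simply by predicting "no heavy rain" everywhere; only correctly placed heavy-rain forecasts raise it.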

To address these gaps, HMT conducts research on precipitation and weather conditions that can lead to flooding, fosters transition of scientific advances and new tools into forecasting operations, and supports the broad needs for twenty-first-century precipitation information for flood control, water management, and other applications. Guided by NWS operational requirements, emerging scientific questions, and new technologies, HMT directly engages forecasters and scientists in research and development. New ideas, technologies, and predictive models are developed, demonstrated, evaluated, and refined through the test bed before being transitioned to operations. HMT will provide prototypes for state-of-the-art forcings for hydrologic prediction systems at NOAA's National Water Center.

A key driver of HMT was the desire expressed by the NWS forecast community and NOAA stakeholders for more continuous engagement with researchers following two field experiments—"CalJet" and "PacJet"—associated with extreme precipitation in West Coast storms in 1997/98 and 2001/02 (Morss and Ralph 2007). In response, the Physical Sciences Division (PSD) of NOAA/Earth System Research Laboratory (ESRL) sponsored HMT pilot studies in 2003/04 in Northern California's flood-prone Russian River region (Ralph et al. 2006). These studies addressed QPF, which had been identified by USWRP as a priority topic. Next steps for HMT were informed by an interagency planning workshop on cool-season QPF (Ralph et al. 2005). This workshop, plus stakeholder interest driven by the near-catastrophic flood of 1997 that put downtown Sacramento, California, at risk of up to 10 ft of inundation, led HMT to focus next on the American River basin above Sacramento, starting in the winter of 2005/06.

HMT is led by ESRL/PSD, the core sponsor, and includes the following key partners: ESRL/Global Systems Division (GSD); NCEP/HPC; the Office of Hydrologic Development (OHD); the National Environmental Satellite, Data, and Information Service (NESDIS); NWS Western Region RFCs and WFOs; and the California Department of Water Resources (CA-DWR). HMT has a program director; five "major activity areas," each with two coleads; two regional field implementations; two transition coordinators, for weather and water forecasting; and a field operations coordinator. HMT has collaborated with DTC on mesoscale modeling focused on precipitation. Several grants to universities address HMT's quantitative precipitation estimation (QPE) and QPF activity areas. HMT's purpose, organization, and foci are summarized in its charter, which identifies five major activity areas representing the primary service performance gaps being addressed: i) QPE, ii) QPF, iii) snow information, iv) hydrologic applications and surface processes, and v) decision support tools (DST). For each area, a team of researchers and forecast experts has defined a 5-yr implementation plan that includes key technical tasks with milestones and deliverables aligned with their funding sources. These tasks are addressed using observations, modeling, diagnostics, DST development, training, and transition, and are represented in each year's annual operating plan.

Extreme precipitation and flooding have diverse origins meteorologically and vary greatly by region, from land-falling extratropical cyclones on the West Coast to hurricanes in the east and south to deep convection in the interior and the Southwest. This requires regionally distinct research and development (Ralph et al. 2005). HMT-West is the first regional demonstration, which established that the bulk of heavy precipitation associated with land-falling winter storms is often triggered by “atmospheric rivers” (ARs) (Fig. 4; Ralph et al. 2011; Ralph and Dettinger 2011, 2012). As a consequence of HMT-West research, the NWS began training sessions focused on ARs to improve situational awareness for forecasters and water resource managers. This included creating a COMET training module on ARs (https://www.meted.ucar.edu/training_module.php?id=904). HMT-developed tools that focus on water vapor transport and ARs (Neiman et al. 2008; Junker et al. 2008; Ralph and Dettinger 2012; White et al. 2012) are used in NWS operations, and HPC and ESRL/PSD led an AR Retrospective Forecasting Experiment to advance AR predictions.

Fig. 4.

Examples of atmospheric river events (from Ralph et al. 2011).


In summary, HMT-West has fostered innovative research to improve understanding, monitoring, and prediction of extreme precipitation (evidenced by >60 peer-reviewed publications), and is now active in several regions outside of California (Table 2). HMT will soon complete a 93-station observing network in California and associated decision support tools, including an early warning system for extreme atmospheric river events (Fig. 5). HMT innovations were key in NOAA's rapid response to the Howard Hanson Dam flood risk management crisis near Seattle, Washington (White et al. 2012). HMT-Southeast will begin in 2013 in North Carolina, in partnership with the NASA Global Precipitation Measurement (GPM) mission. Finally, HMT represents a core NOAA capability to address “understanding and predicting the water cycle,” which is one of the Grand Science Challenges identified by NOAA in its 2010 report “Strengthening NOAA Science” (NOAA Science Workshop Program Committee 2010).

Fig. 5.

HMT-West Legacy mesonet being installed in California by NOAA and partners as part of the CA-DWR's Enhanced Flood Response and Emergency Preparedness Observing Network.


Table 2.

Select accomplishment highlights and impacts from HMT.


JOINT CENTER FOR SATELLITE DATA ASSIMILATION (JCSDA).

The JCSDA was established in 2001 to improve and accelerate the use of research and operational satellite data in numerical weather, ocean, climate, and environmental analysis and prediction. NOAA and NASA were the founding partners, and DoD (U.S. Navy and Air Force) joined later. USWRP provided seed funding to initiate JCSDA prior to the creation of new core funding. It is a distributed and collaborative effort that provides a focal point for the development of common software and infrastructure for the partner agencies (Le Marshall et al. 2007). The partnership allows these agencies to enhance the usefulness of the billions of satellite observations currently available daily and to fully prepare for the flood of data from the advanced satellite instruments to be launched during this decade. This is a challenging task given that satellite data volume has been increasing roughly 100,000-fold per decade—in the last decade alone, 50 new instruments were introduced.

The day-to-day activities of the JCSDA are managed by an executive team composed of the director, the deputy director, and associate directors representing all the JCSDA partner agencies [NOAA/NWS, Oceanic and Atmospheric Research (OAR), and NESDIS; NASA Goddard Space Flight Center (GSFC); and Air Force Weather Agency (AFWA) and U. S. Navy]. The executive team is overseen by and receives high-level guidance from a management oversight board with members from all the JCSDA partners. The JCSDA receives regular independent reviews of its scientific priorities and strategic directions from an external science steering committee and advisory panel.

The JCSDA supports scientific development work in priority areas including radiative transfer, clouds and precipitation, advanced instruments, land data assimilation, ocean data assimilation, and atmospheric chemistry and aerosols (Fig. 6). Examples of success include advances in formulating the Community Radiative Transfer Model (CRTM), assessing the impact of assimilating Atmospheric Infrared Sounder (AIRS) and Moderate Resolution Imaging Spectroradiometer (MODIS) data, and provision of AIRS data to operational centers worldwide after the data have been "thinned" appropriately (Le Marshall et al. 2007).
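The "thinning" mentioned above reduces the density of satellite observations before assimilation, both to cut data volume and to limit spatially correlated observation errors. A minimal sketch, assuming a simple keep-first-observation-per-grid-box rule (operational thinning schemes are considerably more sophisticated, e.g., selecting the highest-quality observation per box):

```python
# Illustrative satellite-observation thinning: keep at most one
# observation per coarse lat/lon box. The 1-degree box size and the
# first-in-box selection rule are assumptions for illustration only.
import math

def thin(obs, box_deg=1.0):
    """obs: iterable of (lat, lon, value); returns one obs per box."""
    kept, seen = [], set()
    for lat, lon, val in obs:
        key = (math.floor(lat / box_deg), math.floor(lon / box_deg))
        if key not in seen:       # first observation claims the box
            seen.add(key)
            kept.append((lat, lon, val))
    return kept

# Two of these three brightness temperatures fall in the same 1-degree box:
dense = [(10.1, 20.1, 250.0), (10.2, 20.3, 251.0), (11.5, 20.1, 248.0)]
print(len(thin(dense)))  # 2
```

The box size trades off data volume against spatial detail: larger boxes discard more observations but further reduce the correlation between the errors of neighboring retained observations.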

Fig. 6.

JCSDA enhances the usefulness of current satellite observations and accelerates the assimilation of data from new instruments, including infrared, microwave, active, passive, and geo- and polar-based measurements.


The JCSDA research and preoperational implementation experiments are conducted by JCSDA-affiliated scientists with proposal-based funds (internal research) or through external grants and contracts awarded via a competitive process open to the broader scientific community (external research). There are also core projects that are regulated by an agreement between the funding agency and the project principal investigators (directed research). In addition, the JCSDA partners conduct their own internal projects, some of which are directly related to the JCSDA activities. These projects are considered by the JCSDA as in-kind support of JCSDA objectives.

JCSDA activities center on improving the assimilation of satellite data from research and operational sensors on national and foreign satellites and leveraging the efforts of all JCSDA partners. All kinds of satellite data are considered: direct measurements of radiances and brightness temperatures and derived products; observations from both polar-orbiting and geostationary satellites; measurements of instruments sensing in the ultraviolet, visible, infrared, and microwave spectral regions; and data from passive and active sensors, including radio occultation measurements. Recent achievements are listed in Table 3.

Table 3.

Select accomplishment highlights and impacts from JCSDA.


The JCSDA organizes annual scientific workshops on satellite data assimilation that are crucial for the technical coordination of the efforts between the different JCSDA partners. It also organizes a data assimilation summer colloquium, every 2–3 years, engaging graduate students and researchers with early postdoctoral appointments in the science of satellite data assimilation for the atmosphere, land, and oceans. The program includes lectures by international experts in data assimilation, and allows students to interact with the lecturers in an informal setting. The objective of the program is to foster the development of the next generation of data assimilation scientists to support environmental modeling. The JCSDA also publishes a quarterly newsletter highlighting recent research and implementation accomplishments, and conducts a monthly seminar series that is webcast nationally and internationally.

HAZARDOUS WEATHER TESTBED (HWT).

The HWT has its roots in a culture of collaboration established decades ago among severe weather enthusiasts with a commitment to excellence in both forecasting and research. This collaboration can be traced back to the 1950s, when forecasters from the Severe Local Storms Warning Service (SELS) and research scientists with the National Severe Storms Project (NSSP) conducted pioneering forecast and research activities out of Kansas City, Missouri (Corfidi 1999). Interaction between these two groups waned somewhat when NSSP became the National Severe Storms Laboratory (NSSL) and moved to Norman, Oklahoma, in the early 1960s, but NSSL scientists forged new partnerships with the local WFO in Oklahoma City in the 1960s (the WFO is now located in Norman).

Proximity and passion for severe weather were key ingredients in these partnerships. One element of the collaboration revolved around development and field testing of Doppler weather radar and dual polarization improvements (Scharfenberg et al. 2005). NSSL researchers made significant efforts to transition this science and technology to forecasting operations. Specifically, they engaged in month-long visits to more than a dozen WFOs nationwide to provide training, solicit direct feedback from a wide variety of operational forecasters, and facilitate operational implementation (e.g., Lakshmanan et al. 2007). The Norman-based collaboration also focused on forecast improvements (e.g., Doswell and Flueck 1989; Brooks et al. 1993), which led to the creation of an experimental forecast facility (EFF) in the mid-1990s, staffed by both researchers and forecasters and located adjacent to the operational forecast floor in the Norman WFO (Auciello and Lavoie 1993).

When the blueprint for NCEP was presented in the early 1990s (McPherson 1994), it reflected a strong desire to collocate each new operational center with a complementary research and/or academic institution. One of these new operational centers was the Storm Prediction Center (SPC)—formerly SELS. Given the historical linkage between the SPC and NSSL and the preexisting collaborative framework in central Oklahoma, the SPC was relocated to Norman where space for SPC operations was created within existing NSSL facilities. Additionally, a separate room was reserved for an EFF-like arrangement with dataflow, visualization, and computational resources that mirrored SPC operations. Leaders from NSSL and SPC identified a small group of researchers and forecasters with mutual interests in specific operationally relevant research topics and encouraged these individuals to use the new facilities and develop a framework for a long-term working relationship. This eventually gelled around the topic of more effective use of numerical weather prediction (NWP) models for severe weather forecasting, focusing on educating forecasters about the models, informing researchers about the needs and constraints of operational forecasters, and a two-way transfer of knowledge, tools, and insight between research and operations (Kain et al. 2003).

Concentrating on these themes, the first "spring program" was conducted in the spring of 2000 and became the basis for similar initiatives each spring thereafter. The focus on springtime ensured that compelling real-time convective weather forecasts would be presented nearly every day. The experiments were designed to challenge both model developers and forecasters. About half of each day was devoted to preparing and issuing severe weather forecasts and the other half to critical interrogation of experimental numerical-model guidance. Activities were conducted by small groups containing at least one representative from forecast operations and one model developer or researcher, allowing model developers to gain a broader understanding of how frontline forecasters use model output and allowing forecasters to develop insight that helped dramatically with interpretation of model guidance for severe weather. The process laid the foundation for new long-term working relationships.

This paradigm—challenging forecasters and researchers to work side by side in small groups to tackle difficult meteorological problems in real time—proved to be very effective (Fig. 7). It galvanized collaborative activities in the Norman meteorological community and inspired the formation of the HWT, even though no funding was available for such a test bed. When NSSL, SPC, and the Norman WFO all joined the University of Oklahoma (OU) School of Meteorology in the National Weather Center building in 2006, a physical space for the HWT was created between the SPC and the WFO and the test bed was formally created (Fig. 7).

Fig. 7.

Experimental warning exercises during the 2011 HWT Spring Experiment. Shown are (left) Steve Keighton (Blacksburg, VA, WFO) and (right) Kevin Brown (Norman, OK, WFO). In the background is the Norman WFO forecast operations center.


The original HWT framework included two programs: 1) the Experimental Forecast Program (EFP), anchored by SPC-related forecasting research (Kain et al. 2006); and 2) the Experimental Warning Program (EWP), focusing on the development and testing of new science, applications, and remote sensing tools to assist the short-term (0–2 hours) nowcasting and warning decision-making process. In recent years the Geostationary Operational Environmental Satellite-R Series (GOES-R) Proving Ground has become part of the HWT, and other partners, most notably the Center for Analysis and Prediction of Storms (OU-CAPS), have become core contributors. Within these major programs, multiple experiments are conducted each year—the EFP conducts the Spring Forecasting Experiment (e.g., Clark et al. 2012) and the EWP conducts multiple experiments during this same spring time frame. For example, recent EWP experiments include the evaluation of phased array radar (Heinselman et al. 2008), a network of 3-cm wavelength radars (Brotzge et al. 2010), and multiradar/multisensor-blended algorithms (Lakshmanan et al. 2007). Individual initiatives emanating from the GOES-R Proving Ground have been intertwined within many of these experiments and have been exceptionally productive, both in terms of scientific publications and contributions to forecast and warning operations (e.g., Kain 2004; Lakshmanan et al. 2007; Kain et al. 2010), yet the HWT remains largely unfunded, except for internal support from the NSSL, SPC, and WFO. At the 2012 American Meteorological Society (AMS) Annual Meeting, the Hazardous Weather Testbed team was awarded the Kenneth C. Spengler Award “for bringing the government, academic, and private sectors together in a visionary, proactive, and exemplary manner to deal with the challenges posed by hazardous weather.”

SHORT-TERM PREDICTION RESEARCH AND TRANSITION (SPORT).

The SPoRT program transitions unique NASA, NOAA, and DoD satellite data and research capabilities to the operational weather community to improve short-term weather forecasts on a regional and local scale. NASA established the test bed in 2002, drawing on real-time MODIS, AIRS, and Advanced Microwave Scanning Radiometer for Earth Observing System (EOS) data from direct broadcast ground stations to address forecast problems common to WFOs in the Southeast United States. It is based at a NASA facility in Huntsville, Alabama, and is collocated with a NWS WFO. SPoRT management receives advice from an interagency science advisory committee of experts across disciplines who serve for 4-yr terms.

Since its establishment, SPoRT has expanded its collaborations to WFOs in all six NWS regions and to several national centers. SPoRT focuses on problems such as the timing and location of severe weather; changing weather conditions influenced by terrain and other local features; reduced surface visibility due to smoke, fog, and low clouds; predicting weather variations due to land–sea breeze circulations; and monitoring weather conditions in data-void regions. SPoRT involves forecasters in the entire process—matching forecast problems to data and research capabilities, testing solutions in a quasi-operational environment, and then transitioning proven solutions into the forecaster's decision support system. SPoRT also develops product training and involves forecasters in assessing the utility of the products on the relevant forecast challenges. The suite of SPoRT and collaborative partner products transitioned to the operational weather community is presented in the online supplement (Table ES2).

A suite of real-time high-resolution MODIS imagery has been successfully used to improve situational awareness for a variety of nowcasting applications. A notable impact on hydrologic forecasting in the upper plains states has been documented by Loss et al. (2009). Atmospheric information from AIRS has been assimilated into weather forecast models and shown to improve the initial conditions and subsequent forecasts of sensible weather elements with the Weather Research and Forecasting (WRF) model (Zavodsky et al. 2012; Chou et al. 2009; McCarty et al. 2009; Lee et al. 2010). The improved initial fields are also being used in a diagnostic mode at various WFOs.

SPoRT scientists work collaboratively on forecast problems and product transitions with several other NOAA test beds. A high-resolution enhanced MODIS/Advanced Microwave Scanning Radiometer for EOS (AMSR-E) sea surface temperature (SST) composite product (e.g., Jedlovec et al. 2009; Haines et al. 2007), land surface information from NASA's Land Information System (LIS) as implemented by Case et al. (2011), and atmospheric sounding information from AIRS were all used in deterministic real-time WRF forecasts that were evaluated at HWT's 2011 EFP. Near-real-time LIS runs and the SST composite product are also linked to the WRF Environmental Modeling System (Rozumalski 2007) to provide forecasters with unique tools for regional forecast applications. SPoRT has also partnered with the GOES-R Proving Ground to develop and transition proxy data and products from the Advanced Baseline Imager (ABI) and Geostationary Lightning Mapper (GLM) instruments in advance of launch to prepare forecasters for these new observational capabilities (Stano et al. 2010). Total lightning measurements from ground-based networks have been used to provide additional lead time in severe weather warnings issued by southern region WFOs. Similar measurement capabilities from the GLM on GOES-R will contribute to improved warnings in the future. Other applications of SPoRT data have been documented by forecasters on the Wide World of SPoRT blog (weather.msfc.nasa.gov/sportblog).

SPoRT is extending its transition activities to include new satellite observations integrated into advanced decision support systems in WFOs around the country over the next few years. Data from the Visible Infrared Imaging Radiometer Suite (VIIRS) and Cross-track Infrared Sounder (CrIS) instruments on the Joint Polar Satellite System (JPSS) will provide follow-on capabilities to those of the NASA MODIS and AIRS instruments. The existing and new data streams from JPSS and NASA Decadal Survey missions will be integrated into the NWS's Advanced Weather Interactive Processing System (AWIPS-II) to extend the use of unique high-resolution data in WFOs.

DEVELOPMENTAL TESTBED CENTER (DTC).

The mission of the DTC is to facilitate research-to-operations (R2O) transition in numerical weather prediction (Bernardet et al. 2008). To accomplish this objective, the DTC supports operational systems, performs testing and evaluation of promising NWP techniques, organizes workshops on important NWP areas, and hosts a DTC visitor program. The DTC was officially established in July 2003, at which time it was funded principally by the National Center for Atmospheric Research (NCAR) and USWRP. During 2011, DTC's budget reached ~$5.4 million, which included newly created core funding from NOAA/OAR that provides the majority of support. Additional sponsorship is provided by the U.S. Air Force, NCAR, National Science Foundation (NSF), and USWRP. DTC is based primarily at NCAR and at NOAA/ESRL/GSD, operates under a charter, and receives advice from an executive committee (agency executives), a management board (primarily sponsors), and a science advisory board. Execution is organized around five activities (Fig. 8), all of which include both testing-and-evaluation and community-support components: mesoscale modeling, hurricanes, data assimilation (DA), ensembles, and verification. Additional collaborations exist with other test beds (principally HMT and HWT) and with the NWS Hurricane Forecast Improvement Project (HFIP).

Fig. 8.

Schematic diagram of activity areas of the DTC.

Citation: Bulletin of the American Meteorological Society 94, 8; 10.1175/BAMS-D-12-00080.1

Mesoscale modeling (MM).

The MM team has focused on testing and evaluation of potential R2O code transitions. In addition to direct model-to-model intercomparisons, the MM team has provided baseline configuration results to the NWP community (both operational and research) as designated WRF reference configurations (www.dtcenter.org/config). These carefully controlled, rigorous tests and accompanying verification statistics provide the research community with baselines against which the impacts of new techniques can be evaluated, and the operational community with guidance for selecting configurations of potential value for operational implementation. In addition, the MM team has helped NOAA's EMC identify appropriate configurations for the next implementation of the operational Short Range Ensemble Forecast (SREF) system. In 2011, DTC collaborated with EMC and universities to organize a workshop, which provided valuable recommendations and guidance for the NWP community (www.dtcenter.org/events/workshops11/mm_phys_11).

Hurricanes.

The focus of the hurricane team is the transfer of new research and development to operations to improve tropical cyclone NWP. The work currently focuses on the Hurricane Weather Research and Forecasting (HWRF) model—a NOAA operational model. First, a solid code management capability was established in collaboration with NCEP/EMC that allows all HWRF developers (from AOML, ESRL, and other close collaborators) to use a single code base. Second, HWRF was expanded into a well-documented, supported community code, with over 400 registered users. The use of HWRF by a large community on a variety of computational platforms led to a more robust model. Finally, the DTC conducts extensive testing and evaluation of HWRF.

Data assimilation (DA).

The DA team bridges the data assimilation research and operational communities by providing the current operational Gridpoint Statistical Interpolation (GSI) capability to researchers [operations to research (O2R)], by enabling the research community to contribute to operational GSI development (R2O), and by facilitating collaboration among distributed GSI developers through the GSI review committee and the community GSI repository. The DA team provides the research community with an annual GSI release containing the latest GSI capabilities, as well as updated documentation. In addition, the DA team actively works with community researchers to help them merge their new DA innovations with the GSI software and provides assistance with the process of committing innovations to the GSI repository. Significant R2O activities have included the assimilation of surface observations of PM2.5 (air pollutants with diameters of 2.5 μm or less) for the Community Multiscale Air Quality (CMAQ) regional model and the WRF with Chemistry (WRF-Chem) model, the addition of control and state variables for cloud analysis, and GSI enhancements for Rapid Refresh model applications.

Ensembles.

The DTC Ensembles Team (DET) brings the latest ensemble developments from the community into operations. These developments often come from experimental real-time ensemble forecast systems. Because they are usually run at a horizontal resolution higher than that available in operations, evaluation of these systems provides an opportunity to influence future operational ensembles. To build on this opportunity for enhanced R2O potential, the DET collaborates with the EMC and other test beds, particularly HMT and HWT. Both have applied convection-allowing (3- or 4-km horizontal resolution) ensembles to the forecast process and offered lessons learned. Focused verification of QPF by the HMT and of reflectivity forecasts by the HWT has provided important guidance as the EMC approaches decisions about the ultimate membership of next-generation operational ensemble forecast systems. Tollerud et al. (2013) provide further details of the infrastructure and objectives of the DET.

Verification.

Statistical verification of numerical forecasts is beneficial to both forecasters and end users because it supplies objective information about the quality and accuracy of forecasts. These findings can feed back into decision processes, including R2O decisions about which model elements to transition to operations. Furthermore, routine, continuing verification of operational observations, models, analyses, and forecasts helps NOAA meet its obligations under the Information Quality Act. The DTC verification team primarily develops, tests, and demonstrates tools and methods for verification, including the Model Evaluation Tools (MET; www.dtcenter.org/met/users/). Although the primary application for MET is the WRF model, the tools can also be applied to most other forecast models. In addition to providing MET to the community, the software package has become instrumental in collaborative efforts between the DTC and other test beds, including HMT, HWT, and HFIP (e.g., the development of atmospheric river–focused verification methods with HMT that have been implemented in MET). Most recently, the focus has been on implementing new tools and methods for verification of hurricane forecasts.
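
As an illustration of the kind of categorical statistic such verification produces, the sketch below computes the equitable threat score (ETS, also known as the Gilbert skill score) for threshold exceedance on gridded forecast and observed fields. This is a minimal stand-alone example, not MET code; the function name and toy fields are our own.

```python
import numpy as np

def equitable_threat_score(forecast, observed, threshold):
    """ETS (Gilbert skill score) for exceedance of a threshold: the fraction
    of correctly forecast events, adjusted for hits expected by chance."""
    f = np.asarray(forecast) >= threshold
    o = np.asarray(observed) >= threshold
    hits = np.sum(f & o)
    misses = np.sum(~f & o)
    false_alarms = np.sum(f & ~o)
    # Hits expected from a random forecast with the same event frequencies
    hits_random = (hits + misses) * (hits + false_alarms) / f.size
    denom = hits + misses + false_alarms - hits_random
    return float((hits - hits_random) / denom) if denom else 0.0
```

A perfect forecast scores 1, a no-skill forecast scores 0, and worse-than-chance forecasts score below 0; MET reports this among its standard categorical statistics for fields such as QPF.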

CLIMATE TESTBED (CTB).

NOAA's NWS/NCEP is the lead agency with responsibility for improving our nation's operational climate predictions on time scales from weeks to years. These predictions enhance our collective ability to understand and predict the state and evolution of the climate system, including linkages between climate and weather (including extremes) on all time scales. In 2004, NCEP and the OAR/Climate Program Office (CPO) jointly established the Climate Testbed. The mission of the CTB is to accelerate the transition of research and development into improved NOAA operational climate forecasts, products, and applications. The CTB objectives are

  • to accelerate implementation of advances in model improvements, multimodel techniques, forecaster tools, datasets, and observing systems into NOAA climate forecast operations;
  • to provide the climate research community with access to operational models, forecast tools, and datasets to enable collaborative research that accelerates additional improvements of NOAA climate forecast products; and
  • to develop new and improved operational climate forecast products for use in planning and decision making.

The CTB facility is located at NCEP/Climate Prediction Center (CPC) in College Park, Maryland. CTB projects are carried out jointly by scientists from NCEP, other NOAA organizations, and the broader research community through competitive projects funded using annual AOs and resourced and managed by the CPO. The CTB facility at NCEP provides an operational infrastructure (computing support and scientists at NCEP centers). The CTB has a science steering board to provide independent scientific advice, broad direction, and endorsement of ongoing and planned activities.

The CTB has made significant progress toward its objectives and major contributions to the NCEP operational forecasts and products, including a multimodel ensemble (MME) climate prediction system, improvements to the Climate Forecast System (CFS), and development of climate forecast products.

MME climate prediction system.

The CTB and the broader community have conducted extensive experimental multimodel prediction research and provided evidence that the MME prediction approach yields superior forecasts compared to any single model. CTB developed a prototype MME prediction system as a proof of concept to demonstrate the potential benefits of an MME system using an NCAR model and the NCEP CFS (Kirtman and Min 2009; Paolino et al. 2012). CTB scientists also explored recalibration and consolidation methodologies in multimodel ensembling (Tippett et al. 2008; DelSole and Tippett 2008).

In 2011, CTB organized a team effort to develop a national multimodel ensemble (NMME) strategy (Kirtman 2011) and implemented the experimental NMME prediction system to produce real-time forecasts for the CPC operational monthly/seasonal forecasts. The current NMME system contributors include NOAA's NCEP and Geophysical Fluid Dynamics Laboratory, University of Miami, Center for Ocean–Land–Atmosphere Studies, International Research Institute, NASA, and NCAR with others expected in the next two years. This NMME prediction system directly transfers the modeling advances from other U.S. modeling centers to CPC forecast operations.
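
The benefit that motivates the MME approach can be seen in a toy calculation: averaging forecasts whose errors are largely independent reduces the error of the combined forecast. The sketch below uses synthetic fields and equal weights; it is illustrative only and is not the NMME system.

```python
import numpy as np

# Toy illustration (not the NMME system): three hypothetical model forecasts
# of an anomaly field, each with independent errors of similar size, are
# combined with equal weights into a multimodel ensemble (MME) mean.
rng = np.random.default_rng(0)
truth = rng.normal(size=(20, 20))            # stand-in verifying analysis
models = [truth + rng.normal(scale=1.0, size=truth.shape) for _ in range(3)]

mme_mean = np.mean(models, axis=0)           # equal-weight MME mean

def rmse(field):
    return float(np.sqrt(np.mean((field - truth) ** 2)))

single_rmse = [rmse(m) for m in models]
# Averaging independent errors shrinks them by roughly 1/sqrt(3)
print("single-model RMSE:", [round(r, 2) for r in single_rmse])
print("MME-mean RMSE:", round(rmse(mme_mean), 2))
```

In practice, real model errors are correlated and unequal, which is why the recalibration and consolidation methodologies cited above go beyond the simple equal-weight mean.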

NCEP Climate Forecast System improvements.

The CTB strategy to improve the CFS involves joint team efforts, with participation from the external community and NCEP scientists, and uses the NCEP operational model as a research tool. For example, scientists from NOAA/ESRL and NCEP identified polar vortex issues and improved the troposphere–stratosphere coupling in the current version, CFSv2 (Shaw and Perlwitz 2010; Shaw et al. 2010). CTB also funded an NCEP Climate Process Team (CPT) to evaluate and improve the representation of the stratocumulus-to-cumulus transition in NCEP and NCAR climate models (e.g., Chung and Teixeira 2012; Suselj et al. 2012; Teixeira et al. 2011; Xiao et al. 2012).

CTB has made progress improving two-way communication between NCEP and the external community. The CFSv3 planning workshop provided a more cooperative, multilateral environment for identifying the needs for CFS improvement and future development strategies. CTB is currently working with NCEP and the external community to develop a NCEP climate modeling strategy.

Climate forecast products.

To improve both the skill and the usefulness of NCEP operational climate forecasts, CTB works with the user community to improve access to and understanding of climate forecast products. A CTB team from CPC and the Regional Integrated Sciences and Assessments/Climate Assessment for the Southwest (RISA/CLIMAS) developed and implemented a web-based service (Fig. 9) that allows dynamic interaction between users and CPC products, supports user-centric forecast evaluations, and develops user-customized forecast products.

Fig. 9.

CLIMAS–CPC collaborative development of an interactive web tool for CPC 3-month Climate Outlook.

Citation: Bulletin of the American Meteorological Society 94, 8; 10.1175/BAMS-D-12-00080.1

CTB also funded focused research to develop and improve drought monitoring and prediction products in support of the National Integrated Drought Information System. A CTB team with scientists from the U.S. Department of Agriculture (USDA), NESDIS, NCEP, and universities produced satellite-based evapotranspiration and soil moisture indices for drought monitoring (Anderson et al. 2011).

In the future, CTB will continue to focus on transition of research to NCEP climate operations and enhancing collaborations between NCEP, other test beds, and the external community. CTB will continue to improve the NMME capability and facilitate the planning and implementation of the NCEP climate modeling strategy. CTB will work directly with the RISA and Regional Climate Centers to improve NCEP's regional climate services.

GOES-R PROVING GROUND.

The GOES-R Proving Ground is an initiative that began in 2008 to accelerate user readiness for the next generation of U.S. geostationary environmental satellites, beginning with the launch of the GOES-R satellite in late 2015 (Goodman et al. 2012). The origin of the GOES-R Proving Ground was a recommendation from the third GOES Users Conference in 2004 (DOC/NOAA/NESDIS 2004) to bridge the gap between research and operations by engaging the NWS forecast, watch, and warning community and other-agency users in preoperational demonstrations of the new and advanced capabilities that GOES-R will provide relative to the current GOES constellation. To ensure user readiness, forecasters and other users must have access to prototype advanced products within their operational environment, through AWIPS and later AWIPS-II, well before launch.

Prototypes of the future GOES-R capabilities can be emulated from current satellite and terrestrial observing systems having higher spatial, spectral, or temporal resolution than the current operational GOES imager, or through synthetic cloud and moisture imagery that can be derived from weather forecast models such as the WRF model. Products being demonstrated in the Proving Ground include (Fig. 10) improved volcanic ash detection, lightning detection, 1-min-interval rapid-scan imagery, dust and aerosol detection, and synthetic cloud and moisture imagery (Grasso et al. 2008; Otkin and Greenwald 2008). These new or enhanced product capabilities will be made possible by the ABI, a 16-channel imager with 2 visible, 4 near-infrared, and 10 infrared channels that will provide three times more spectral information, four times the spatial coverage, and more than five times the temporal resolution of the current imager (Schmit et al. 2005). Other advancements over current GOES capabilities include total lightning detection and mapping of in-cloud and cloud-to-ground flashes from the GLM, never before available to forecasters (Goodman et al. 2013), and increased dynamic range, resolution, and sensitivity in monitoring solar X-ray flux with the Solar UV Imager.

Fig. 10.

GOES-R Proving Ground partners and sample products demonstrated to forecasters.

Citation: Bulletin of the American Meteorological Society 94, 8; 10.1175/BAMS-D-12-00080.1

A key component of the GOES-R Proving Ground is the two-way interaction between the researchers who introduce new products and techniques and the forecasters who then provide feedback and ideas for improvements that can best be incorporated into NOAA's integrated observing and analysis operations. At the HWT, for example, the GOES-R Program provides funding for 10–15 forecasters from across the nation, chosen by the HWT management, to participate in the evaluation of forecast and warning products enabled by GOES-R capabilities (e.g., WRF-simulated cloud and moisture imagery, convective initiation, overshooting top detection, and total lightning) relevant to severe and high-impact weather. Long-term on-site Proving Ground visiting scientist technical liaisons are also collocated at select NWS national centers, NOAA test beds, and the NWS Alaska and Pacific Region headquarters; these subject matter experts in satellite applications aid in the transition from research to operations by actively participating in product demonstrations, interpreting the added value of the satellite-derived information, and conducting training. Developers work with the satellite liaisons and forecasters to build capacity within the forecast office or national center. Summary reports of the product demonstrations conducted in the operational environment of the Proving Ground, as well as near-real-time blog postings for recent high-impact weather events, are posted at the Proving Ground website (www.goes-r.gov/users/proving-ground.html) and at the websites of the NOAA Cooperative Institute partners.

Administration of the Proving Ground is led by the GOES-R Program Office, with new product planning, development, and demonstrations directed toward operational needs and overseen by a Science and Demonstration Executive Board (SDEB). The SDEB is advised by 1) the NWS Operational Advisory Team, which is composed of the NWS region Scientific Services Division chiefs; 2) a technical advisory group representing NOAA line offices; and 3) an independent advisory committee composed of senior-level scientists from other government agencies, universities, international satellite organizations, and other national meteorological services. These advisory groups provide guidance, technical assistance, and subject matter expertise about the proposed activities to the executive board. A program review is held during the annual NOAA Satellite Science Week, where the researchers, forecasters, advisory committees, and program managers meet to evaluate progress toward meeting the program goals and objectives and determine priorities for the coming year. Performance measures include the number of products demonstrated, the number of products transitioned into operations, and the forecaster evaluations of the science and applicability of the products documented in the demonstration test reports. Funding for the Proving Ground and its various program elements is ~$2 million per year.

The next-generation GOES will continue providing valuable data to support high-impact weather warnings as well as key inputs for global and regional NWP models. The large quantities of GOES-R data will present new challenges and opportunities that require more intelligent integration of information derived from blended satellite products (e.g., geostationary and polar satellite observations); multidimensional classification of severe storm potential by combining satellite, radar, in situ data, and models; and new ways of visualizing GOES-R data within the AWIPS-II forecaster workstation. Algorithm developers at NESDIS, NASA SPoRT, and the NOAA Cooperative Institutes are already creating Java-based satellite application plug-ins for AWIPS-II, which will quickly accelerate the R2O transitions at NWS. During the GOES-14 out-of-storage period from 16 August to 31 October 2012, special 1-min rapid-scan imager datasets are being collected (sometimes concurrently with 3D total lightning and 1-min radar data) to showcase the benefit of GOES-R products and high-temporal-resolution geostationary measurements. These include, but are not limited to, imagery, convective initiation, cloud-top cooling, cloud microphysical properties, and atmospheric motion vectors (http://cimss.ssec.wisc.edu/goes/srsor/GOES-14_SRSOR.html). NHC forecasters find the rapid-scan imagery especially useful for center fixing tropical cyclones and hurricanes at sunrise. In 2012 and beyond, the GOES-R Proving Ground will continue to test and validate display and visualization techniques (Hillger et al. 2011), decision aids, future capabilities, training materials (e.g., COMET; www.meted.ucar.edu/), and the data processing and product distribution systems to enable greater use of these products in operational settings.

AVIATION WEATHER TESTBED (AWT).

The AWT, located at the Aviation Weather Center (AWC) in Kansas City, Missouri, creates an environment for the transfer of new and innovative aviation weather forecast technology into real-time AWC operations for safe, efficient, and environmentally friendly flight, and to engage in the strategic implementation of the FAA's Next Generation Air Transportation System (NextGen) requirements for aviation weather. AWT's primary objective is to test, evaluate, and refine promising aviation weather research in partnership with the AWC's government, academic, and private sector stakeholders, with the eventual goal of implementing new ideas into a robust, secure, and real-time operational forecast system (Levit et al. 2011).

Prior to its reorganization in 2009, the AWT existed primarily to transfer research concepts from the Aviation Weather Research Program into AWC operations and occupied a small area on the AWC forecast floor. Now, the AWT is housed in a new state-of-the-art room (completed in 2010) with computer workstations that replicate the operational workstations used by AWC meteorologists, as well as advanced video teleconferencing capability that allows for broadcasting output from one workstation to one of several large overhead flat-panel monitors (Fig. 11). This room was designed to foster maximum interaction between teams located in different areas, so evaluations could be achieved in a team-oriented environment. The test bed reorganization also launched new collaborations between the AWC and other research groups, such as NCAR, AFWA, ESRL/GSD, the GOES-R satellite program, NWS's Office of Science and Technology, Massachusetts Institute of Technology (MIT) Lincoln Laboratory, and NCEP/Meteorological Development Laboratory (MDL). Several of these groups have provided or are providing funding for AWT projects, either directly or through joint support within collaborative projects. The test bed is now organized to be a leading entity for the transfer of aviation weather research to operations and to serve as a conduit providing research personnel with an opportunity to interact with an operational aviation weather center.

Fig. 11.

Participants collaborate in the AWT during the 2011 Summer Experiment.

Citation: Bulletin of the American Meteorological Society 94, 8; 10.1175/BAMS-D-12-00080.1

The AWT was used extensively during the 2011 Summer Experiment from 27 June to 22 July 2011. The experiment focused on testing new and emerging weather datasets for forecasting convection in the "golden triangle" (Chicago, Illinois–New York, New York–Atlanta, Georgia) high-air-traffic area of the United States. Approximately 40 people, from nearly 15 organizations, visited the AWT and collaborated to produce two daily forecast products outlining the impact of convection on the National Airspace System: the "aviation weather impact" product (Fig. 12a), which depicts important convective weather features for the golden triangle, and the "probability exceedance" product (Fig. 12b), which contours regions with a 30% or 60% probability of composite reflectivity exceeding 40 dBZ and radar echo tops at or above 37,000 ft.
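
A probability-of-exceedance field of this kind can be derived by counting, at each grid point, the ensemble members that satisfy the joint criterion. The sketch below is our own illustration with synthetic member fields, not AWT code; the operational product combined composite reflectivity (≥40 dBZ) with echo tops (≥37,000 ft) and contoured the 30% and 60% levels.

```python
import numpy as np

# Hedged sketch of an ensemble exceedance-probability calculation. The
# function name and inputs are illustrative; member arrays are assumed to
# have shape (n_members, ny, nx).
def exceedance_probability(refl_members, etop_members,
                           refl_thresh=40.0, etop_thresh=37000.0):
    """Fraction of members with composite reflectivity (dBZ) and echo tops
    (ft) both at or above their thresholds, at each grid point."""
    refl = np.asarray(refl_members)
    etop = np.asarray(etop_members)
    joint = (refl >= refl_thresh) & (etop >= etop_thresh)
    return joint.mean(axis=0)          # probability in [0, 1]

# Contouring the result at 0.3 and 0.6 reproduces the 30%/60% thresholds
# used in the experimental graphic.
```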

Fig. 12.

(a) Example of the aviation weather impact graphic forecast. Contours highlight a high, medium, or low potential threat of convection impacts to the golden triangle area of the National Airspace System. (b) Example of the “probability exceedance” graphic. Contours indicate either a 30% or 60% probability of convection reaching a combined reflectivity value of 40 dBZ and a radar echo height of 37,000 ft or greater.

Citation: Bulletin of the American Meteorological Society 94, 8; 10.1175/BAMS-D-12-00080.1

Numerous new and existing datasets were tested during the experiment, and each was used to create the graphics already noted. High-resolution ensemble and deterministic numerical weather prediction models were tested for their ability to correctly resolve the timing, location, morphology, mode, and porosity of convection. The deterministic 3-km High-Resolution Rapid Refresh (HRRR) and the Consolidated Storm Prediction for Aviation (Wolfson et al. 2008), along with a 4-km 12-member AFWA ensemble and NCEP's SREF system, were used in combination with derived air-traffic-impact forecasts from NCAR to produce the forecast graphics. In addition, the GOES-R program supplied "nearcast" forecasts (Petersen and Aune 2009), a short-term forecast of convective initiation derived from satellite and Rapid Update Cycle (RUC) model data. As a result of the experiment, the 3-km HRRR model and hourly SREF data were integrated into AWC operations. The AWT held another summer experiment in June 2012 to test similar datasets and concepts with experimental forecasts.

Beyond the planned annual summer experiment, the test bed is also evaluating new interactive weather data display software: AWIPS-II, the next-generation data display system for the NWS. Also, the Interactive Calibration of Grids in Four Dimensions (IC4D; Petrescu and Hall 2009) software, an extension of the Graphical Forecast Editor in AWIPS, is undergoing evaluation by the AWC forecast staff within the AWT. The IC4D system can be used to combine observations, model data, and algorithms to create a gridded forecast, a concept for the "4-D Weather Cube" envisioned by NextGen. Many new concepts for the future forecast process and support of NextGen exist, and the AWT will be an important resource in helping to decide which ideas have meaningful and demonstrated benefits, are efficient and reliable to implement, have long-term sustainability, and are compatible with information technology infrastructures.

OBSERVING SYSTEM SIMULATION EXPERIMENT TESTBED (OSSE).

The most recent test bed effort is the Observing System Simulation Experiment Testbed. OSSEs are an important tool for evaluating the potential impact of proposed new observing systems, for evaluating trade-offs in observing system design, and for developing and assessing improved methodologies for assimilating new observations into numerical weather prediction (Atlas 1997). The test bed development is being led and managed through NOAA/AOML for use by USWRP partners and academia in collaboration with NESDIS/Center for Satellite Applications and Research (STAR), NOAA/ESRL, and the JCSDA. The OSSE test bed will be applicable to analysis-/forecast-impact studies, observing system design, instrument trade studies, future instrument constellation planning, and data utility investigations. Through the OSSE test bed concept, the goal is to generate an OSSE process that invites participation by the broad community of agency planners, research scientists, and operational centers. Establishing this numerical test bed will enable a hierarchy of experiments to

  • determine the potential impact of proposed space-based, suborbital, and in situ observing systems on analyses and forecasts;
  • evaluate trade-offs in observing system design;
  • assess proposed methodology for assimilating new observations in coordination with JCSDA; and
  • define both the advantages and limitations of a hierarchy of OSSEs that includes rapid prototyping of instrument or data assimilation concepts, as well as the more rigorous “full” OSSEs.
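
The logic of an OSSE can be illustrated with a toy example: a chaotic model serves as the nature run, synthetic noisy observations are drawn from it, and a deliberately simple nudging scheme stands in for data assimilation. Every choice here (the Lorenz-63 model, the nudging gain, the observation error) is our own illustrative assumption, not the test bed's actual system; the point is only the comparison between runs with and without the simulated observations.

```python
import numpy as np

# Toy OSSE (illustrative only): a Lorenz-63 "nature run" stands in for the
# true atmosphere, noisy observations are simulated from it, and a simple
# nudging step plays the role of data assimilation.

def lorenz63_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

rng = np.random.default_rng(1)
obs_every, obs_err, gain = 10, 0.5, 0.5

truth = np.array([1.0, 1.0, 1.0])             # nature run state
free = truth + np.array([0.1, -0.1, 0.1])     # run with no observations
nudged = free.copy()                          # run that assimilates obs

err_free, err_nudged = [], []
for step in range(2000):
    truth = lorenz63_step(truth)
    free = lorenz63_step(free)
    nudged = lorenz63_step(nudged)
    if step % obs_every == 0:
        obs = truth + rng.normal(scale=obs_err, size=3)  # simulated obs
        nudged += gain * (obs - nudged)                  # crude assimilation
    err_free.append(np.linalg.norm(free - truth))
    err_nudged.append(np.linalg.norm(nudged - truth))

print("mean error, no obs:", round(float(np.mean(err_free)), 2))
print("mean error, with obs:", round(float(np.mean(err_nudged)), 2))
```

A real OSSE replaces each piece with an operational-scale counterpart (a high-resolution nature run, instrument-specific simulated observations, and an operational assimilation system such as GSI), but the experimental design is the same comparison of forecast error with and without the candidate observing system.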

Although it was only started in 2010 through seed funding from the NOAA USWRP, the OSSE test bed has had several key accomplishments: it has provided expertise on OSSEs to NOAA and JCSDA partners and academia and evaluated the global OSSE system and the experiments being performed; finalized regional OSSE nature runs at 3- and 1-km resolution, which required an exhaustive number of iterations of the WRF model embedded within an ECMWF global nature run; confirmed the validity (strong points and weaknesses) of both the 3- and 1-km nature runs over a 13-day period; completed the first phase of a global OSSE for the Unmanned Aircraft System (UAS), along with a report and one refereed article from this OSSE; and established an external advisory committee for the OSSE test bed.

During the next several years, test bed activities include a survey across NOAA line offices to take stock of existing Observing System Experiment (OSE) and OSSE capabilities. This will include capturing the capabilities and expertise of each organization and the ability of each organization to perform and/or analyze experiments. Through the NOAA Observing System Council, the OSSE test bed will determine the most critical observing system questions to be addressed and their priority. In addition to providing expertise on OSSEs to NOAA and JCSDA partners and academia, the test bed will coordinate information on global and regional OSEs and OSSEs to be performed, the needed resources, and the role of each organization. Specifically, the test bed will conduct global and regional OSSEs for NOAA's UAS program and HFIP and perform OSSEs relating to the polar-orbiting satellite program and wind lidar. Efforts continue to develop the framework for the full OSSE test bed.

CONCLUSIONS AND FUTURE DIRECTIONS.

Test beds have become an integral part of the meteorological community. They have helped foster new forecast innovations and their transition into operations. These developments have created opportunities for businesses and agencies to improve their products and services. Along the way, a community of subject matter experts has been created who have in-depth experience bridging research and operations. Not surprisingly, as key forecast challenges and gaps are identified, new regionally focused test bed ideas have been proposed. Lining up support, connecting key research and NWS center "champions," establishing other-agency partners, and identifying resources are all part of developing new test bed concepts.

A major risk for test beds stems from their inherent nature as a "bridging" entity: they tend to be "outsiders" relative to both the core mission of forecasting and the core mission of research. In spite of this, they enable more rapid improvements in forecast services and demonstrate the tangible relevance of research centers to forecast services while not being entirely beholden to them.

A FRAMEWORK FOR PERFORMANCE MEASURES FOR TEST BEDS

With the advent of the Government Performance and Results Act (GPRA), agencies are held strictly accountable for performance. For NOAA, several of its "GPRA measures" represent forecasting skill (e.g., hurricane track forecast error, flash flood warning lead time, quantitative precipitation forecast skill, and tornado warning lead time). These measures, and their improvement, have become a major focus of current forecasting and represent the "requirements pull" of today's services. They are calculated by NOAA/NWS, and NOAA reports them to the Department of Commerce, the White House, and congressional committees.

While quite useful, these GPRA measures are difficult to change, and it is difficult to add new ones, even when well justified by forecast user needs. Understandably, it is risky for NOAA to promise too rapid an improvement in these challenging forecast topics. This inhibits setting ambitious goals that can drive innovation in the research community. Analogously, the science and technology communities have well-established measures of research and development performance (e.g., publications, citations, patents). Such measures tend not to reward focusing on the implementation of the new findings beyond the research community, thus inhibiting efforts to “take the next step” beyond publications and grants (NAS 2000). While NOAA laboratories help fill some of this gap, the differences between the standard measures used for science and those used for forecasting represent part of the divide between research and operations.

Several constraints have inhibited progress both in innovation and in transition to daily forecast operations. Here are some key examples:

  • science and technology (S&T) advances are a foundation of NOAA's service improvements, yet are often not initially measurable in the “service” GPRA scores;
  • improving the service GPRA scores requires service programs to adopt new methods, yet this may have a cost and require services to let go of existing methods; and
  • while research suggests fast improvements in GPRA scores may be possible, operational goals must be reasonably achievable or the risk of “failure” is increased.

Because the GPRA measures focus on products issued by NWS, and improvements in these products are often the result of a combination of many inseparable individual advances, a traceable connection between specific S&T advances and formal NWS service improvements is often not very tangible. This creates an underlying issue for the research community and for related test beds—that is, how to measure research and test bed performance in ways that reasonably represent both the underlying advances needed in S&T to enable transformative improvement in forecast services, as well as the near-term incremental improvements that typically build on existing operational tools.

Test beds have the potential to help by developing and monitoring what could be called demonstration performance measures (DPMs), which would be used internally by the agency and the test bed. These could be "stretch" versions of current measures (i.e., a faster rate of improvement) or entirely new measures that address major societal needs [e.g., rapid hurricane intensity change; QPF for extreme precipitation; river flood warning lead time; snow level aloft (White et al. 2010)]. The concept, illustrated in Fig. SB1, conveys the following:

  • goals for GPRA-like DPM scores can be set higher in test beds than in full operations;
  • adoption of new methods for full operations requires proof of concept;
  • proof of concept can be demonstrated by limiting tests to small areas, times, and tools;
  • by limiting the scope of tests, the costs can be kept within reasonable bounds;
  • researchers and forecasters jointly define strategies to demonstrate impacts on the suitable DPM goal during the tests; and
  • if regional testing demonstrates improvement, extend results nationally (as appropriate) with follow-up testing.

Fig. SB1. Today's predictive services exist on a foundation of prior science and technology innovation.

Citation: Bulletin of the American Meteorological Society 94, 8; 10.1175/BAMS-D-12-00080.1

This demonstration concept has been the de facto approach to date, but it has not been codified and adopted in a transparent manner usable by test beds. NCEP uses it to evaluate whether model changes should be adopted operationally. JHT uses the approach extensively and is a model of how to apply it to a specific, well-defined forecast problem with one NCEP center. Warning decision support tools, similarly, turn new data into information usable in forecasts.

For NWS, implementation into operations to meet service requirements includes successful demonstration of key criteria (defined for the specific model/phenomenon/capability), such as objective performance (e.g., model accuracy or sensor accuracy), subjective performance (e.g., utility of the capability and impact on workflow/workforce), and production readiness (analogous to technology performance measures, but including the necessary IT infrastructure and backups, maintenance procedures, archiving, and an in-place verification approach to ensure timely and reliable operational production). These criteria are demonstrated in proving grounds; in some cases test beds also perform these functions—for example, JHT does so for tools that are implemented directly in NHC systems. Given that the level of effort to carry out these "transition oriented" steps could rapidly consume test bed investments in innovation and demonstration at stages prior to transition, it is vital that management and oversight of these key steps be primarily the responsibility of the operations organization rather than the research organization. The sidebar "A framework for performance measures for test beds" describes issues and perspectives on measuring the performance of test beds and forecasting. Possible approaches for measuring performance that are adoptable by test beds and forecast centers include

  • internal measures suitable for state-of-the-art science and technology development (i.e., measure the innovation that underpins future breakthrough advances—the S&T “push”);
  • “infusion”-oriented measures, including test bed demonstration performance measures (DPMs);
  • internal measures in “forecast service” programs tracking implementation of infusion (i.e., measure the services' “pull” for S&T);
  • internal measures tracking the rate at which innovation is assimilated into forecast operations and the rate at which outdated forecast tools are discontinued; and
  • use of technical readiness levels to help define the status of key transition activities.
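
As a concrete (and purely hypothetical) illustration of the DPM idea sketched above, a test bed could track a stretch version of an existing lead-time measure against its own internal goal; the times, goal values, and function below are invented for this sketch and do not come from the article:

```python
from datetime import datetime

def mean_lead_time_minutes(warnings):
    """Average minutes between warning issuance and event onset.

    `warnings` is a list of (issued, event) datetime pairs.
    All values here are illustrative, not real verification data.
    """
    leads = [(event - issued).total_seconds() / 60.0
             for issued, event in warnings]
    return sum(leads) / len(leads)

# Invented (issued, event) pairs for the sketch.
warnings = [
    (datetime(2012, 5, 1, 14, 0), datetime(2012, 5, 1, 14, 18)),
    (datetime(2012, 5, 1, 16, 5), datetime(2012, 5, 1, 16, 27)),
    (datetime(2012, 5, 2, 11, 30), datetime(2012, 5, 2, 11, 44)),
]

operational_goal = 13.0   # hypothetical GPRA-like operational target (min)
stretch_dpm_goal = 18.0   # higher "stretch" DPM goal used inside the test bed

lead = mean_lead_time_minutes(warnings)
print(f"mean lead time: {lead:.1f} min")
print("meets stretch DPM goal:", lead >= stretch_dpm_goal)
```

The point of the sketch is only the structure: the internal DPM goal is deliberately set higher than the operational one, so the test bed can pursue an ambitious target without putting the formal GPRA measure at risk.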

Carrying this out requires adequate capacity and investment in the test beds and a commitment from forecast centers and laboratories. The recent creation by NWS of the “Operations Proving Ground” in Kansas City, which focuses on testing full integration of new tools and methods in a quasi-operational environment, is an example of progress in this regard. It also requires a vibrant research community following the well-established path for exploratory research and development—that is, transformational research today that can enable breakthrough advances in forecast services in the future. Major components of today's core forecast service capabilities are the result of past innovations, some of which were not “programmed” into detailed road maps of their eras. While it is clear that the “requirements driven” road map is critical, it should also be recognized that many of today's requirements emerged as it became apparent that new science and technology could enable meeting them (recall the parable that if Henry Ford had followed a typical requirements-driven approach, he would likely have focused on inventing a better horse, rather than the automobile).

It is recommended that each test bed work with its research and forecasting experts and stakeholders to identify possible DPMs—for example, stretch goals for current forecast measures, new forecast variables, measures of prototyping, and scientific advances (peer-reviewed papers). Also, from a NOAA perspective, the Testbed and Proving Ground Coordinating Committee has the potential to collect these measures from each test bed and to help coordinate across test beds on key measures. Recent successes in cross-test-bed coordination include the establishment of an annual NOAA Testbeds Workshop, identification of an integrating theme on intense precipitation for the most recent workshop in 2012, creation of the NOAA Testbed News, and development of a parameterization assessment and improvement effort with NCEP/EMC that heavily engaged DTC, HMT, and HWT (Wolff et al. 2012).

In closing, test beds have become an integral part of the weather enterprise. They have developed, tested, and transitioned innovative tools and methods that are impacting forecasts and forecast users. A key future direction is to find commonalities among the major gaps identified across multiple test beds (i.e., observations, modeling, and physical understanding) and to coordinate requests for agencies to fill them. The need to bridge research and forecast services represents a grand challenge to meteorology—a challenge that test beds have emerged over the last 10 years to address.

ACKNOWLEDGMENTS

The U.S. Weather Research Program in NOAA/OAR's Office of Weather and Air Quality provides full support for JHT, partial support for HMT and DTC, and seed funding for the OSSE Testbed. The HWT is jointly funded and managed by NSSL, NWS/SPC, and NWS/Norman WFO, with NSSL providing funding for the majority of the infrastructure. Funding is also provided by NOAA/OAR under the NOAA–University of Oklahoma Cooperative Agreement NA08OAR4320904, U.S. Department of Commerce. The DTC is jointly sponsored by the NOAA, U.S. Air Force, NSF, and NCAR. NCEP and the NOAA Climate Program Office jointly support the CTB. SPoRT is funded by the Earth Science Division at NASA headquarters and the NOAA GOES-R Program Science Office. The NOAA GOES-R Program Science Office supports the GOES-R Proving Ground.

REFERENCES

  • Anderson, M. C., C. R. Hain, B. D. Wardlow, A. Pimstein, J. R. Mecikalski, and W. P. Kustas, 2011: Evaluation of drought indices based on thermal remote sensing of evapotranspiration over the continental United States. J. Climate, 24, 2025–2044.

  • Atlas, R., 1997: Atmospheric observations and experiments to assess their usefulness in data assimilation. J. Meteor. Soc. Japan, 75, 111–130.

  • Auciello, E. P., and R. L. Lavoie, 1993: Collaborative research activities between National Weather Service operational offices and universities. Bull. Amer. Meteor. Soc., 74, 625–629.

  • Bernardet, L., and Coauthors, 2008: The Developmental Testbed Center and its Winter Forecasting Experiment. Bull. Amer. Meteor. Soc., 89, 611–627.

  • Brooks, H. E., C. A. Doswell III, and L. J. Wicker, 1993: STORMTIPE: A forecasting experiment using a three-dimensional cloud model. Wea. Forecasting, 8, 352–362.

  • Brotzge, J., K. Hondl, B. Philips, L. Lemon, E. J. Bass, D. Rude, and D. L. Andra, 2010: Evaluation of distributed collaborative adaptive sensing for detection of low-level circulations and implications for severe weather warning operations. Wea. Forecasting, 25, 173–189.

  • Case, J. L., S. V. Kumar, J. Srikishen, and G. J. Jedlovec, 2011: Improving numerical weather predictions of summertime precipitation over the southeastern United States through a high-resolution initialization of the surface state. Wea. Forecasting, 26, 785–807.

  • Chou, S.-H., B. Zavodsky, and G. J. Jedlovec, 2009: Data assimilation and regional weather forecast using Atmospheric Infrared Sounder (AIRS) profiles. Preprints, 16th Conf. on Satellite Meteorology and Oceanography, Phoenix, AZ, Amer. Meteor. Soc., JP6.11. [Available online at https://ams.confex.com/ams/89annual/techprogram/paper_147745.htm.]

  • Chung, D., and J. Teixeira, 2012: A simple model for stratocumulus to shallow cumulus cloud transitions. J. Climate, 25, 2547–2554.

  • Clark, A. J., and Coauthors, 2012: An overview of the 2010 Hazardous Weather Testbed Experimental Forecast Program Spring Experiment. Bull. Amer. Meteor. Soc., 93, 55–74.

  • Corfidi, S. F., 1999: The birth and early years of the Storm Prediction Center. Wea. Forecasting, 14, 507–525.

  • Dabbert, W. F., and Coauthors, 2005: Multifunctional mesoscale observation networks. Bull. Amer. Meteor. Soc., 86, 961–982.

  • Davidson, P., J. Tuell, L. Uccellini, J. Gaynor, S. Koch, R. Pierce, and M. Ralph, cited 2012: Recommended guidelines for testbeds and proving grounds. [Available online at www.testbeds.noaa.gov/pdf/Guidelines%20051911_v7.pdf.]

  • DelSole, T., and M. K. Tippett, 2008: Predictable components and singular vectors. J. Atmos. Sci., 65, 1666–1678.

  • DeMaria, M., J. A. Knaff, R. Knabb, C. Lauer, C. R. Sampson, and R. T. DeMaria, 2009: A new method for estimating tropical cyclone wind speed probabilities. Wea. Forecasting, 24, 1573–1591.

  • DOC/NOAA/NESDIS, 2004: Third GOES users conference. DOC/NOAA/NESDIS Conf. Rep., 90 pp. [Available online at www.goes-r.gov.]

  • Doswell, C. A., and J. A. Flueck, 1989: Forecasting and verifying in a field research project: DOPLIGHT '87. Wea. Forecasting, 4, 97–109.

  • Goodman, S. J., and Coauthors, 2012: The GOES-R Proving Ground: Accelerating user readiness for the next-generation geostationary environmental satellite system. Bull. Amer. Meteor. Soc., 93, 1029–1040.

  • Goodman, S. J., and Coauthors, 2013: The GOES-R Geostationary Lightning Mapper (GLM). Atmos. Res., 125–126, 34–49.

  • Grasso, L. D., M. Sengupta, J. F. Dostalek, R. Brummer, and M. DeMaria, 2008: Synthetic satellite imagery for current and future environmental satellites. Int. J. Remote Sens., 29, 4373–4384.

  • Haines, S. L., G. J. Jedlovec, and S. M. Lazarus, 2007: A MODIS sea surface temperature composite for regional applications. Trans. Geosci. Remote Sens., 45, 2919–2927.

  • Heinselman, P. L., D. L. Priegnitz, K. L. Manross, T. M. Smith, and R. W. Adams, 2008: Rapid sampling of severe storms by the National Weather Radar Testbed Phased Array Radar. Wea. Forecasting, 23, 808–824.

  • Hillger, D., L. Grasso, S. Miller, R. Brummer, and R. DeMaria, 2011: Synthetic Advanced Baseline Imager (ABI) true-color imagery. J. Appl. Remote Sens., 5, 592–597.

  • Jedlovec, G. J., J. Vazquez, E. Armstrong, and S. Haines, 2009: Combined MODIS/AMSR-E composite SST data for regional weather applications. Preprints, 16th Conf. on Satellite Meteorology and Oceanography, Phoenix, AZ, Amer. Meteor. Soc., JP8.6. [Available online at https://ams.confex.com/ams/89annual/techprogram/paper_145839.htm.]

  • Junker, N. W., R. H. Grumm, R. Hart, L. F. Bosart, K. M. Bell, and F. J. Pereira, 2008: Use of normalized anomaly fields to anticipate extreme rainfall in the mountains of Northern California. Wea. Forecasting, 23, 336–356.

  • Kain, J. S., 2004: The Kain–Fritsch convective parameterization: An update. J. Appl. Meteor., 43, 170–181.

  • Kain, J. S., P. R. Janish, S. J. Weiss, M. E. Baldwin, R. S. Schneider, and H. E. Brooks, 2003: Collaboration between forecasters and research scientists at the NSSL and SPC: The Spring Program. Bull. Amer. Meteor. Soc., 84, 1797–1806.

  • Kain, J. S., S. J. Weiss, J. J. Levit, M. E. Baldwin, and D. R. Bright, 2006: Examination of convection-allowing configurations of the WRF model for the prediction of severe convective weather: The SPC/NSSL Spring Program 2004. Wea. Forecasting, 21, 167–181.

  • Kain, J. S., S. R. Dembek, S. J. Weiss, J. L. Case, J. J. Levit, and R. A. Sobash, 2010: Extracting unique information from high-resolution forecast models: Monitoring selected fields and phenomena every time step. Wea. Forecasting, 25, 1536–1542.

  • Kirtman, B. P., cited 2011: Toward a National Multi-Model Ensemble (NMME) system for operational intra-seasonal to interannual (ISI) climate forecasts. [Available online at www.cpc.ncep.noaa.gov/products/ctb/MMEWhitePaperCPO_revised.pdf.]

  • Kirtman, B. P., and D. Min, 2009: Multimodel ensemble ENSO prediction with CCSM and CFS. Mon. Wea. Rev., 137, 2908–2930.

  • Lakshmanan, V., T. Smith, G. J. Stumpf, and K. Hondl, 2007: The warning decision support system–integrated information. Wea. Forecasting, 22, 596–612.

  • Lazo, J. K., R. E. Morss, and J. L. Demuth, 2009: 300 billion served: Sources, perceptions, uses, and values of weather forecasts. Bull. Amer. Meteor. Soc., 90, 785–798.

  • Lee, T., and Coauthors, 2010: NPOESS: Next generation operational global Earth observations. Bull. Amer. Meteor. Soc., 91, 727–740.

  • Le Marshall, J., and Coauthors, 2007: The Joint Center for Satellite Data Assimilation. Bull. Amer. Meteor. Soc., 88, 329–340.

  • Levit, J. J., B. Entwistle, and C. Wallace, 2011: The Aviation Weather Testbed: Infusion of new science and technology for aviation operations. Preprints, Second Aviation, Range, and Aerospace Meteorology Special Symp. on Weather–Air Traffic Management Integration, Seattle, WA, Amer. Meteor. Soc., 3.3. [Available online at https://ams.confex.com/ams/91Annual/webprogram/Paper185616.html.]

  • Loss, G., D. Bernhardt, K. K. Fuell, and G. T. Stano, 2009: An operational assessment of the MODIS false color composite with the Great Falls, Montana National Weather Service. Preprints, 23rd Conf. on Hydrology, Phoenix, AZ, Amer. Meteor. Soc., P4.2. [Available online at https://ams.confex.com/ams/89annual/techprogram/paper_147478.htm.]

  • McCarty, W., G. Jedlovec, and T. L. Miller, 2009: Impact of the assimilation of Atmospheric Infrared Sounder radiance measurements on short-term weather forecasts. J. Geophys. Res., 114, D18122, doi:10.1029/2008JD011626.

  • McPherson, R. D., 1994: The National Centers for Environmental Prediction: Operational climate, ocean, and weather prediction for the 21st century. Bull. Amer. Meteor. Soc., 75, 363–373.

  • Morss, R. E., and F. M. Ralph, 2007: Use of information by National Weather Service forecasters and emergency managers during the CALJET and PACJET-2001. Wea. Forecasting, 22, 539–555.

  • NAS, 2000: From research to operations in weather satellites and numerical weather prediction: Crossing the valley of death. The National Academies Board on Atmospheric Sciences and Climate Rep., 80 pp. [Available online at http://dels.nas.edu/Report/From-Research-Operations-Weather/9948.]

  • Neiman, P. J., F. M. Ralph, G. A. Wick, Y.-H. Kuo, T.-K. Wee, Z. Ma, G. H. Taylor, and M. D. Dettinger, 2008: Diagnosis of an intense atmospheric river impacting the Pacific Northwest: Storm summary and offshore vertical structure observed with COSMIC satellite retrievals. Mon. Wea. Rev., 136, 4398–4420.

  • NOAA Science Workshop Program Committee, cited 2010: Strengthening NOAA Science: Findings from the NOAA Science Workshop. [Available online at http://nrc.noaa.gov/sites/nrc/Documents/Workshops/Science_Workshop_2010_WP_FINAL.pdf.]

  • Otkin, J. A., and T. J. Greenwald, 2008: Comparison of WRF model-simulated and MODIS-derived cloud data. Mon. Wea. Rev., 136, 1957–1970.

  • Paolino, D. A., J. L. Kinter III, B. P. Kirtman, D. Min, and D. M. Straus, 2012: The impact of land surface and atmospheric initialization on seasonal forecasts with CCSM. J. Climate, 25, 1007–1021.

  • Petersen, R., and R. Aune, 2009: Optimizing the impact of GOES sounder products in very-short-range forecasts—Recent results and future plans. [Available online at https://ams.confex.com/ams/89annual/techprogram/paper_149174.htm.]

  • Petrescu, E., and T. Hall, 2009: IC4D—A new tool for producing four dimensional aviation forecasts. Preprints, Aviation, Range, and Aerospace Meteorology Special Symp. on Weather–Air Traffic Management Integration, San Diego, CA, Amer. Meteor. Soc., P1.11. [Available online at https://ams.confex.com/ams/89annual/techprogram/paper_148046.htm.]

  • Ralph, F. M., and M. D. Dettinger, 2011: Storms, floods and the science of atmospheric rivers. Eos, Trans. Amer. Geophys. Union, 92, 265–266.

  • Ralph, F. M., and M. D. Dettinger, 2012: Historical and national perspectives on extreme West Coast precipitation associated with atmospheric rivers during December 2010. Bull. Amer. Meteor. Soc., 93, 783–790.

  • Ralph, F. M., and Coauthors, 2005: Improving short-term (0–48 h) cool-season quantitative precipitation forecasting: Recommendations from a USWRP workshop. Bull. Amer. Meteor. Soc., 86, 1619–1632.

  • Ralph, F. M., P. J. Neiman, G. A. Wick, S. I. Gutman, M. D. Dettinger, D. R. Cayan, and A. B. White, 2006: Flooding on California's Russian River: Role of atmospheric rivers. Geophys. Res. Lett., 33, L13801, doi:10.1029/2006GL026689.

  • Ralph, F. M., E. Sukovich, D. Reynolds, M. Dettinger, S. Weagle, W. Clark, and P. J. Neiman, 2010: Assessment of extreme quantitative precipitation forecasts and development of regional extreme event thresholds using data from HMT-2006 and COOP observers. J. Hydrometeor., 11, 1288–1306.

  • Ralph, F. M., E. Sukovich, G. N. Kiladis, and K. Weickmann, 2011: A multiscale observational case study of a Pacific atmospheric river exhibiting tropical–extratropical connections and a mesoscale frontal wave. Mon. Wea. Rev., 139, 1169–1189.

  • Rappaport, E. N., J.-G. Jiing, C. W. Landsea, S. T. Murillo, and J. L. Franklin, 2012: The Joint Hurricane Test Bed: Its first decade of tropical cyclone research-to-operations activities reviewed. Bull. Amer. Meteor. Soc., 93, 371–380.

  • Reynolds, D., 2003: Value-added quantitative precipitation forecasts: How valuable is the forecaster? Bull. Amer. Meteor. Soc., 84, 876–878.

  • Rozumalski, R. A., 2007: WRF Environmental Modeling System User's Guide: Demystifying the process of installing, configuring, and running the Weather Research and Forecasting model. NOAA/NWS Forecast Decision Training Branch, COMET/UCAR, 95 pp. [Available online at http://strc.comet.ucar.edu/wrf/wrfems_userguide.htm.]

  • Scharfenberg, K. A., and Coauthors, 2005: The Joint Polarization Experiment: Polarimetric radar in forecasting and warning decision making. Wea. Forecasting, 20, 775–788.

  • Schmit, T. J., M. M. Gunshor, W. P. Menzel, J. Li, S. Bachmeier, and J. J. Gurka, 2005: Introducing the next-generation Advanced Baseline Imager on GOES-R. Bull. Amer. Meteor. Soc., 86, 1079–1096.

  • Shaw, T. A., and J. Perlwitz, 2010: The impact of stratospheric model configuration on planetary-scale waves in Northern Hemisphere winter. J. Climate, 23, 3369–3389.

  • Shaw, T. A., J. Perlwitz, and N. Harnik, 2010: Downward wave coupling between the stratosphere and troposphere: The importance of meridional wave guiding and comparison with zonal-mean coupling. J. Climate, 23, 6365–6381.

  • Stano, G. T., K. K. Fuell, and G. J. Jedlovec, 2010: NASA SPoRT GOES-R Proving Ground activities. Preprints, Sixth Annual Symp. on Future National Operational Environmental Satellite Systems: NPOESS and GOES-R, Atlanta, GA, Amer. Meteor. Soc., 8.2. [Available online at https://ams.confex.com/ams/90annual/techprogram/paper_163879.htm.]

  • Suselj, K., J. Teixeira, and G. Matheou, 2012: Eddy diffusivity/mass flux and shallow cumulus boundary layer: An updraft PDF multiple mass flux scheme. J. Atmos. Sci., 69, 1513–1533.

  • Teixeira, J., and Coauthors, 2011: Tropical and subtropical cloud transitions in weather and climate prediction models: The GCSS/WGNE Pacific Cross-Section Intercomparison (GPCI). J. Climate, 24, 5223–5256.

  • Tippett, M. K., T. DelSole, S. J. Mason, and A. G. Barnston, 2008: Regression-based methods for finding coupled patterns. J. Climate, 21, 4384–4398.

  • Tollerud, E. I., and Coauthors, 2013: The DTC ensembles task: A new testing and evaluation facility for mesoscale ensembles. Bull. Amer. Meteor. Soc., 94, 321–327.

  • White, A. B., D. J. Gottas, A. F. Henkel, P. J. Neiman, F. M. Ralph, and S. I. Gutman, 2010: Developing a performance measure for snow-level forecasts. J. Hydrometeor., 11, 739–753.

  • White, A. B., and Coauthors, 2012: NOAA's rapid response to the Howard A. Hanson Dam flood risk management crisis. Bull. Amer. Meteor. Soc., 93, 189–207.

  • Wolff, J. K., B. S. Ferrier, and C. F. Mass, 2012: Establishing closer collaboration to improve model physics for short-range forecasts. Bull. Amer. Meteor. Soc., 93, ES51–ES53.

  • Wolfson, M. M., W. J. Dupree, R. Rasmussen, M. Steiner, S. Benjamin, and S. Weygandt, 2008: Consolidated Storm Prediction for Aviation (CoSPA). Preprints, 13th Conf. on Aviation, Range, and Aerospace Meteorology, New Orleans, LA, Amer. Meteor. Soc., J6.5. [Available online at https://ams.confex.com/ams/88Annual/techprogram/paper_132981.htm.]

  • Xiao, H., C.-M. Wu, and C. R. Mechoso, 2012: A treatment for the stratocumulus-to-cumulus transition in GCMs. Climate Dyn., 39, 3075–3089.

  • Zavodsky, B. T., S.-H. Chou, and G. J. Jedlovec, 2012: Improved regional analyses and heavy precipitation forecasts with assimilation of Atmospheric Infrared Sounder retrieved thermodynamic profiles. IEEE Trans. Geophys. Remote Sens., 50, 4243–4251.

1Each of the test beds described here was represented at a NOAA Testbed Workshop, where a brainstorming meeting was held to discuss developing this article. Most can also trace their roots to support from USWRP and are focused on NOAA mission requirements. Although there are now many other test beds (in the United States and Europe) that could have been included here (e.g., Helsinki Testbed, European Severe Storms Laboratory Testbed, National Weather Radar Testbed), it was not practicable to do so. It is envisioned that this article will increase awareness of this emerging type of activity, which is helping our field better link research and forecasting operations (including through those test beds not described herein).
