The authors thank NCEP and the anonymous reviewers for their comments, R. Treadon for providing Fig. 2, and P. Caplan for originating the processing of the GFS skill distributions. Partial support for this work was provided by the JPSS and Next Generation Global Prediction System (NGGPS) Programs via NOAA Grants 1312M41460 and NA14NES4320003, respectively.
APPENDIX A: LIST OF OBSERVING SYSTEMS USED IN NCEP OPERATIONAL GLOBAL DATA ASSIMILATION SYSTEM IN 2012–13.
The GOS is an ever-changing collection of instruments and systems that provides observations to international NWP centers and also serves local government, industry, and public needs. It is important to track all GOS changes in instrument type, number, and quality, since operational forecast quality depends on these factors. The failure of a particular satellite instrument, for example, is predictable only statistically: any instrument can exceed its design lifetime or fail at launch or soon thereafter, often with serious consequences given the expense of replacement.
Table A1 lists the types of observations, platforms, and instruments (or quantities and measurements) used operationally at NCEP during the period of this OSE.
Observing systems used by the NCEP operational GDAS in 2012–13. Pibals = pilot balloons. MDCRS = Meteorological Data Collection and Reporting System. AMDAR = Aircraft Meteorological Data Relay. ASOS = Automated Surface Observing System. AWOS = Automated Weather Observing System. AVHRR = Advanced Very High Resolution Radiometer. SBUV/2 = Solar Backscatter Ultraviolet radiometer, version 2. MODIS = Moderate Resolution Imaging Spectroradiometer. COSMIC-1 = Constellation Observing System for Meteorology, Ionosphere and Climate–1. GRACE-A = Gravity Recovery and Climate Experiment–A. SAC-C = Satélite de Aplicaciones Científicas–C. C/NOFS = Communication/Navigation Outage Forecasting System. Aura = Earth Observing System Aura. OMI = Ozone Monitoring Instrument. GOES = Geostationary Operational Environmental Satellite. Meteosat = Meteorological Satellite. SEVIRI = Spinning Enhanced Visible and Infrared Imager. JMA = Japan Meteorological Agency. MTSAT = Multifunctional Transport Satellite.
APPENDIX B: CONTEXT FOR OBSERVING SYSTEM IMPACTS.
It is informative to place observing system impacts, as demonstrated by OSEs, in context with the long-term skill improvements to operational global forecast systems, such as the NCEP GFS, as documented by a standard and representative score such as Z500AC. The Z500AC is representative because it measures the skill of forecasting the locations of high and low pressure centers and the vertically averaged atmospheric state, and it has a long history as a performance metric. Other scores (such as root-mean-square error) tend to move in tandem with Z500AC, while scores for precipitation and hurricane track and intensity are more specialized and tend to measure less representative aspects of atmospheric behavior.
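The Z500AC is the centered anomaly correlation between forecast and verifying 500-hPa height anomalies about climatology. The standard centered-AC formula can be sketched as follows; this is a minimal illustration, not code from this study: the array names are assumptions, and an operational score would additionally apply latitude weighting over the verification domain.

```python
import numpy as np

def anomaly_correlation(forecast, verifying_analysis, climatology):
    """Centered anomaly correlation (AC) over a grid of 500-hPa heights.

    A sketch of the standard centered-AC formula on an unweighted grid;
    operational verification would weight grid points by latitude.
    """
    f = forecast - climatology             # forecast anomaly
    a = verifying_analysis - climatology   # analyzed ("true") anomaly
    f = f - f.mean()                       # remove area-mean anomaly (centering)
    a = a - a.mean()
    return np.sum(f * a) / np.sqrt(np.sum(f**2) * np.sum(a**2))
```

A perfect forecast yields AC = 1; values decrease toward zero (and can go negative) as the forecast anomaly pattern decorrelates from the analyzed one, which is why AC traces the loss of skill with lead time so cleanly.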
Operational NWP centers are constantly improving their analysis and forecast systems. System improvements can result from scientific development of their many complex components. Development areas include but are not limited to increased quantity and quality of ingested observations from the GOS; improvements to the data assimilation and quality control algorithms and procedures; and improvements to various aspects of the forecast model, such as the representation of physical processes, increasing horizontal and vertical resolution, and increasing computational efficiency. Increased computational efficiency is important because it enables more sophisticated science to be added while maintaining the same computational cost in operations.
The average improvement rate for operational global forecast systems is approximately one day of skill per decade (Simmons and Hollingsworth 2002); that is, the average skill of today’s 5-day forecast is as good as that of a 4-day forecast produced a decade ago. The skill of the NCEP GFS has improved at the same rate, with average annual increases in Z500AC of 0.007 (NH) and 0.010 (SH) as shown in Fig. B1. These increases in skill were due to the accumulated value of system improvements such as those noted above. For example, GFS horizontal resolution increases occurred in 1998 (100–70 km), 2002 (55 km), 2005 (38 km), 2010 (23 km), and 2015 (13 km), and all were enabled by increases in operational high-performance computing (HPC) capacity and by enhanced computational efficiency. Most of these horizontal-resolution changes resulted in higher annual scores the next year in one or both hemispheres (Fig. B1), even though other system changes undoubtedly contributed.
Distributions of GFS forecast skill for each year over the period 1996–2014 (Fig. B2) provide even more information on the impact of improvements. Despite some minor year-to-year variability in forecast skill due to different weather patterns and despite the fact that GFS upgrades occurred irregularly over this period, it is generally apparent that each annual skill distribution is unique to the GFS of that particular year. Notably, as annual-mean scores have increased, their skill distributions are characterized by a reduced frequency of low scores and an increased frequency of high scores. Contrast, for example, the distributions for NH over 1997–99 and 2012–14: Scores in the range 0.525–0.625 constituted 16%–18% of the total in the earlier period but 1% in the most recent years. From 1997 to 1999, the GFS scores did not reach 0.925 but, in each year of 2012–14, 30%–35% of the NH scores did so.
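The binned frequencies behind such annual distributions are straightforward to tally. The sketch below is illustrative only: the daily scores are synthetic placeholders drawn from a normal distribution rather than GFS verification data, and the 0.025 bin width and 0.925 threshold are assumptions chosen to mirror the ranges quoted above.

```python
import numpy as np

# Hypothetical daily Z500AC scores for one year; real values would come
# from a verification archive, so these numbers are placeholders.
rng = np.random.default_rng(42)
scores = np.clip(rng.normal(loc=0.88, scale=0.05, size=365), 0.0, 1.0)

# Bin the scores into fixed-width AC bins, as in an annual distribution.
bins = np.arange(0.50, 1.0001, 0.025)        # edges 0.500, 0.525, ..., 1.000
counts, _ = np.histogram(scores, bins=bins)
frequency_pct = 100.0 * counts / counts.sum()

# Fraction of days at or above a high-skill threshold (e.g., 0.925).
high_skill_pct = 100.0 * np.mean(scores >= 0.925)
```

Comparing `frequency_pct` arrays from two different years shows exactly the kind of shift described above: mass draining out of the low-score bins and accumulating in the high-score bins as the system improves.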
As a forecast system improves its ability to extract observational information through its DAS and increases its forecast skill through a better model, it becomes more resilient to changes in the observing system and less likely to produce forecasts in the lower range of scores.
Andersson, E., and Y. Sato, Eds., 2012: WIGOS: WMO Integrated Global Observing System; Final report of the Fifth WMO Workshop on the Impact of Various Observing Systems on Numerical Weather Prediction. WMO Tech. Rep. 2012-1, 25 pp. [Available online at www.wmo.int/pages/prog/www/OSY/Reports/NWP-5_Sedona2012.html.]
Böttger, H., P. Menzel, and J. Pailleux, Eds., 2004: Proceedings of the Third WMO Workshop on the Impact of Various Observing Systems on Numerical Weather Prediction. WMO/TD-1228, 324 pp. [Available online at www.wmo.int/pages/prog/www/documents.html.]
Cardinali, C., 2009: Monitoring the observation impact on the short-range forecast. Quart. J. Roy. Meteor. Soc., 135, 239–250, doi:10.1002/qj.366.
Chen, Y., F. Weng, Y. Han, and Q. Liu, 2008: Validation of the Community Radiative Transfer Model by using CloudSat data. J. Geophys. Res., 113, D00A03, doi:10.1029/2007JD009561.
Chen, Y., Y. Han, P. Van Delst, and F. Weng, 2010: On water vapor Jacobian in fast radiative transfer model. J. Geophys. Res., 115, D12303, doi:10.1029/2009JD013379.
Cucurull, L., 2010: Improvement in the use of an operational constellation of GPS radio occultation receivers in weather forecasting. Wea. Forecasting, 25, 749–767, doi:10.1175/2009WAF2222302.1.
Cucurull, L., and J. C. Derber, 2008: Operational implementation of COSMIC observations into the NCEP’s Global Data Assimilation System. Wea. Forecasting, 23, 702–711, doi:10.1175/2008WAF2007070.1.
Cucurull, L., and R. A. Anthes, 2015: Impact of loss of microwave and radio occultation observations in operational numerical weather prediction in support of the U.S. data gap mitigation activities. Wea. Forecasting, 30, 255–269, doi:10.1175/WAF-D-14-00077.1.
Cucurull, L., J. C. Derber, and R. J. Purser, 2013: A bending angle forward operator for global positioning system radio occultation measurements. J. Geophys. Res. Atmos., 118, 14–28, doi:10.1029/2012JD017782.
Derber, J. C., and W.-S. Wu, 1998: The use of TOVS cloud-cleared radiances in the NCEP SSI analysis system. Mon. Wea. Rev., 126, 2287–2299, doi:10.1175/1520-0493(1998)126<2287:TUOTCC>2.0.CO;2.
EUMETSAT, 2011: The case for EPS/MetOP Second Generation: Cost benefit analysis. 38 pp. [Available online at www.wmo.int/pages/prog/sat/meetings/documents/PSTG-3_Doc_11-04_MetOP-SG.pdf.]
Eyre, J. R., and W. P. Menzel, 1989: Retrieval of cloud parameters from satellite sounder data: A simulation study. J. Appl. Meteor., 28, 267–275, doi:10.1175/1520-0450(1989)028<0267:ROCPFS>2.0.CO;2.
Garrett, K., 2013: Forecast impact assessments of SNPP ATMS. JCSDA Quarterly, No. 43, Joint Center for Satellite Data Assimilation, College Park, MD, 3–4.
Gelaro, R., and Y. Zhu, 2009: Examination of observation impacts derived from observing system experiments (OSEs) and adjoint models. Tellus, 61A, 179–193, doi:10.1111/j.1600-0870.2008.00388.x.
Goldberg, M. D., H. Kilcoyne, H. Cikanek, and A. Mehta, 2013: Joint Polar Satellite System: The United States next generation civilian polar-orbiting environmental satellite system. J. Geophys. Res. Atmos., 118, 13 463–13 475, doi:10.1002/2013JD020389.
Grody, N., J. Zhao, R. Ferraro, F. Weng, and R. Boers, 2001: Determination of precipitable water and cloud liquid water over oceans from the NOAA 15 advanced microwave sounding unit. J. Geophys. Res., 106, 2943–2953, doi:10.1029/2000JD900616.
Hammersley, J. M., and D. C. Handscomb, 1975: Monte Carlo Methods. Methuen and Co., 178 pp.
Han, Y., and Coauthors, 2013: Suomi NPP CrIS measurements, sensor data record algorithm, calibration and validation activities, and record data quality. J. Geophys. Res. Atmos., 118, 12 734–12 748, doi:10.1002/2013JD020344.
Hogg, R. V., and A. T. Craig, 1978: Introduction to Mathematical Statistics. 4th ed. Macmillan, 225 pp.
Joo, S., J. Eyre, and R. Marriott, 2012: The impact of MetOp and other satellite data within the Met Office global NWP system using an adjoint-based sensitivity method. Met Office Forecasting Research Tech. Rep. 562, 20 pp.
Kalnay, E., 2003: Atmospheric Modeling, Data Assimilation and Predictability. Cambridge University Press, 341 pp.
Kelly, G., A. P. McNally, J.-N. Thepaut, and M. Szyndel, 2004: OSEs of all main data types in the ECMWF operational system. Proceedings of the Third WMO Workshop on the Impact of Various Observing Systems on Numerical Weather Prediction, WMO/TD-1228, 63–94. [Available online at www.wmo.int/pages/prog/www/documents.html.]
Kim, E., C.-H. J. Lyu, K. Anderson, R. V. Leslie, and W. J. Blackwell, 2014: S-NPP ATMS instrument prelaunch and on-orbit performance evaluation. J. Geophys. Res. Atmos., 119, 5653–5670, doi:10.1002/2013JD020483.
Kleist, D. T., and K. Ide, 2015: An OSSE-based evaluation of hybrid variational–ensemble data assimilation for the NCEP GFS. Part I: System description and 3D-hybrid results. Mon. Wea. Rev., 143, 433–451, doi:10.1175/MWR-D-13-00351.1.
Kleist, D. T., D. F. Parrish, J. C. Derber, R. Treadon, W.-S. Wu, and S. Lord, 2009: Introduction of the GSI into the NCEP Global Data Assimilation System. Wea. Forecasting, 24, 1691–1705, doi:10.1175/2009WAF2222201.1.
Langland, R. H., and N. L. Baker, 2004: Estimation of observation impact using the NRL atmospheric variational data assimilation adjoint system. Tellus, 56A, 189–201, doi:10.1111/j.1600-0870.2004.00056.x.
Liu, J., and E. Kalnay, 2008: Estimating observation impact without adjoint model in an ensemble Kalman filter. Quart. J. Roy. Meteor. Soc., 134, 1327–1335, doi:10.1002/qj.280.
Lorenc, A. C., and R. Marriott, 2014: Forecast sensitivity to observations in the Met Office global numerical weather prediction system. Quart. J. Roy. Meteor. Soc., 140, 209–224, doi:10.1002/qj.2122.
McNally, A. P., 2012: Observing system experiments to assess the impact of possible future degradation of the Global Satellite Observing Network. ECMWF Tech. Memo. 672, 20 pp. [Available online at www.ecmwf.int/sites/default/files/elibrary/2012/11085-observing-system-experiments-assess-impact-possible-future-degradation-global-satellite.pdf.]
McNally, A. P., J. C. Derber, W.-S. Wu, and B. B. Katz, 2000: The use of TOVS level-1B radiances in the NCEP SSI analysis system. Quart. J. Roy. Meteor. Soc., 126, 689–724, doi:10.1002/qj.49712656315.
McNally, A. P., M. Bonavita, and J. Thepaut, 2014: The role of satellite data in the forecasting of Hurricane Sandy. Mon. Wea. Rev., 142, 634–646, doi:10.1175/MWR-D-13-00170.1.
Ota, Y., J. C. Derber, E. Kalnay, and T. Miyoshi, 2013: Ensemble-based observation impact estimates using the NCEP GFS. Tellus, 65A, 20038, doi:10.3402/tellusa.v65i0.20038.
Pailleux, J., E. Andersson, and M. Ondráš, Eds., 2008: Proceedings of the Fourth WMO Workshop on the Impact of Various Observing Systems on Numerical Weather Prediction. WMO/TD-1450, 214 pp. [Available online at www.wmo.int/pages/prog/www/documents.html.]
Parrish, D. F., and J. C. Derber, 1992: The National Meteorological Center’s spectral statistical-interpolation analysis system. Mon. Wea. Rev., 120, 1747–1763, doi:10.1175/1520-0493(1992)120<1747:TNMCSS>2.0.CO;2.
Simmons, A. J., and A. Hollingsworth, 2002: Some aspects of the improvement in skill of numerical weather prediction. Quart. J. Roy. Meteor. Soc., 128, 647–677, doi:10.1256/003590002321042135.
Wang, X., D. Parrish, D. Kleist, and J. Whitaker, 2013: GSI 3DVar-based ensemble–variational hybrid data assimilation for NCEP Global Forecast System: Single-resolution experiments. Mon. Wea. Rev., 141, 4098–4117, doi:10.1175/MWR-D-12-00141.1.
Wilks, D. S., 1995: Statistical Methods in the Atmospheric Sciences: An Introduction. Academic Press, 467 pp.
WMO, 2010: Global aspects. Vol. 1, Manual on the Global Data-Processing and Forecasting System, WMO-485, 196 pp. [Available online at www.wmo.int/pages/prog/www/DPFS/Manual/GDPFS-Manual.html.]
Wobus, R. L., and E. Kalnay, 1995: Three years of operational prediction of forecast skill at NMC. Mon. Wea. Rev., 123, 2132–2148, doi:10.1175/1520-0493(1995)123<2132:TYOOPO>2.0.CO;2.
Zapotocny, T. H., J. Jung, J. LeMarshall, and R. Treadon, 2008: A two-season impact study of four satellite data types and rawinsonde data in the NCEP Global Data Assimilation System. Wea. Forecasting, 23, 80–100, doi:10.1175/2007WAF2007010.1.
Zavyalov, V., and Coauthors, 2013: Noise performance of the CrIS instrument. J. Geophys. Res. Atmos., 118, 13 108–13 120, doi:10.1002/2013JD020457.
For the MW instrument, we used NOAA-19 AMSU-A and MHS instead of ATMS because we originally planned to run an additional experiment substituting ATMS for the NOAA-19 MW instruments. That experiment was never executed, however, because computing and personnel resources were unavailable.
While the GFS forecast is run to 16 days in operations, the 10-day forecasts in this OSE cover the most skillful part of the forecast, which is most sensitive to, and therefore most appropriate for showing, the impact of observations and initial conditions on forecast accuracy.
The appellation “worst” is relative and may, in fact, be a very good score when compared to other or past forecast systems.