Improving Best Track Verification of Tropical Cyclones: A New Metric to Identify Forecast Consistency

Sarah D. Ditchek (a,b) (https://orcid.org/0000-0002-0316-9069), Jason A. Sippel (b), Peter J. Marinescu (c), and Ghassan J. Alaka Jr. (b)

a Cooperative Institute for Marine and Atmospheric Studies, Rosenstiel School for Marine and Atmospheric Science, University of Miami, Miami, Florida
b NOAA/AOML/Hurricane Research Division, Miami, Florida
c Cooperative Institute for Research in the Atmosphere, Colorado State University, Fort Collins, Colorado

Abstract

This paper introduces a new tool for verifying tropical cyclone (TC) forecasts. Tropical cyclone forecasts made by operational centers and by numerical weather prediction (NWP) models have been objectively verified for decades. Typically, the mean absolute error (MAE) and/or MAE skill are calculated relative to values within the operational center’s best track. Yet, the MAE can be strongly influenced by outliers and yield misleading results. Thus, this paper introduces an assessment of consistency between the MAE skill and two other measures of forecast performance. This “consistency metric” objectively evaluates the forecast-error evolution as a function of lead time based on thresholds applied to 1) the MAE skill; 2) the frequency of superior performance (FSP), which indicates how often one forecast outperforms another; and 3) the median absolute error (MDAE) skill. The utility and applicability of the consistency metric are validated by applying it to four research and forecasting applications. Overall, this consistency metric is a helpful tool to guide analysis and increase confidence in results in a straightforward way. By augmenting the commonly used MAE and MAE skill with this consistency metric and creating a single scorecard with consistency metric results for TC track, intensity, and significant-wind-radii forecasts, the impact of observing systems, new modeling systems, or model upgrades on TC-forecast performance can be evaluated both holistically and succinctly. This could in turn help forecasters learn from challenging cases and accelerate and optimize developments and upgrades in NWP models.
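For illustration only, the sketch below shows one way the three measures could be combined for a set of paired absolute errors at a single lead time. The function names, threshold values, and the improved/degraded/inconclusive labels are placeholder assumptions for this sketch, not the definitions or thresholds used in the paper.

```python
import numpy as np

def skill(err_ctl, err_exp, stat):
    """Percentage skill of an experiment relative to a control for a chosen
    statistic of the absolute errors (np.mean for MAE, np.median for MDAE)."""
    return 100.0 * (stat(err_ctl) - stat(err_exp)) / stat(err_ctl)

def consistency(err_exp, err_ctl, skill_thresh=2.0, fsp_thresh=55.0):
    """Label one lead time as 'improved', 'degraded', or 'inconclusive'.

    err_exp and err_ctl are paired absolute errors (same verification times);
    the thresholds here are placeholders, not the values used in the paper.
    """
    err_exp = np.asarray(err_exp, dtype=float)
    err_ctl = np.asarray(err_ctl, dtype=float)

    mae_skill = skill(err_ctl, err_exp, np.mean)     # sensitive to outliers
    mdae_skill = skill(err_ctl, err_exp, np.median)  # robust to outliers

    # Frequency of superior performance: how often the experiment beats the
    # control, expressed as a percentage of the non-tied pairs.
    wins = np.count_nonzero(err_exp < err_ctl)
    losses = np.count_nonzero(err_exp > err_ctl)
    fsp = 100.0 * wins / (wins + losses) if (wins + losses) else 50.0

    # The measures are "consistent" only when they all point the same way
    # beyond the chosen thresholds.
    if min(mae_skill, mdae_skill) >= skill_thresh and fsp >= fsp_thresh:
        return "improved"
    if max(mae_skill, mdae_skill) <= -skill_thresh and fsp <= 100.0 - fsp_thresh:
        return "degraded"
    return "inconclusive"
```

Applying such a check at every lead time, and separately for track, intensity, and significant-wind-radii errors, yields the kind of single scorecard described above.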

Significance Statement

Evaluating the impact of observing systems, new modeling systems, or model upgrades on TC forecasts is vital to ensure more rapid and accurate implementations and optimizations. To do so, errors between model forecasts and observed TC parameters are calculated. Historically, analysis of these errors has relied heavily on one or two metrics: the mean absolute error (MAE) and/or MAE skill. Yet, doing so can lead to misleading conclusions if the error distributions are skewed, which often occurs (e.g., for a poorly forecast TC). This paper presents a new, straightforward way to combine useful information from several different metrics, enabling a more holistic assessment of forecast errors alongside the MAE and MAE skill.
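As a concrete, hypothetical illustration of that skew problem, consider ten paired absolute intensity errors in which an experiment beats its control in nine cases but badly misses one poorly forecast TC; the numbers below are invented for illustration and are not from the paper.

```python
import numpy as np

# Hypothetical absolute intensity errors (kt) for ten paired forecasts.
err_ctl = np.array([10, 12, 8, 11, 9, 10, 13, 9, 12, 10], dtype=float)
err_exp = np.array([8, 10, 7, 10, 8, 9, 11, 8, 10, 45], dtype=float)  # one large miss

print(err_ctl.mean(), err_exp.mean())          # 10.4 vs 12.6: MAE alone says "worse"
print(np.median(err_ctl), np.median(err_exp))  # 10.0 vs 9.5: MDAE says "better"
print(np.mean(err_exp < err_ctl))              # 0.9: the experiment wins 9 of 10 cases
```

A verification based only on the MAE would penalize the experiment for a single outlier, whereas the MDAE and the frequency of superior performance show that it is better most of the time; the consistency metric is designed to surface exactly this kind of disagreement.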

© 2023 American Meteorological Society. For information regarding reuse of this content and general copyright information, consult the AMS Copyright Policy (www.ametsoc.org/PUBSReuseLicenses).

Corresponding author: Sarah D. Ditchek, sarah.d.ditchek@noaa.gov
