Fig. 1. Outermost rectangles (Ref d01, d02, and d03) represent the WRF observation-simulation domain configuration, where observations are gathered strictly from the innermost domain (Ref d03) for cycled data assimilation. The innermost domains (Ens d01, d02, and d03) are used in the OSSE control ensemble-forecast simulations.

Fig. 2. Count of surface observing stations within 100 km of a point (shading). Radiosonde observing sites (red stars) correspond to locations at which observations are taken from the SOS in the vertical at all model levels. Observations in the vertical at each radiosonde location are assumed not to drift from the station’s surface latitude and longitude. Locations for the surface stations and radiosondes were obtained online (https://madis-data.ncep.noaa.gov/ and http://www.raob.com/, respectively).

Fig. 3. Ensemble mean forecasts of (a) composite reflectivity (dBZ) and (b) 2-m dewpoint temperature (°C) for case 9 at 2300 UTC 20 May 2013. Barbs represent the 10-m wind speed and direction, and contours in (b) are the mean sea level pressure (hPa, contoured every 2 hPa). The black-outlined rectangle in (a) identifies the response region for this case.

Fig. 4. Target fields valid at 1800 UTC 20 May 2013 in case 9 of (a),(b) surface altimeter (ALT) and (c),(d) 2-m temperature (T2) observations; the control-forecast ensemble-mean altimeter (hPa) and temperature (°C) are contoured. Target fields are calculated with respect to (left) MDBZ (dBZ²) and (right) RAIN (mm²) responses valid at 2300 UTC 20 May 2013. Green circles in (a) and (b) are locations at which targeted observations are gathered; blue circles are locations for nontargeted observations. Similarly, green stars in (c) and (d) identify locations for targeted 2-m temperature observations, whereas blue stars denote nontargeted observation locations. Green boxes outline the response regions where MDBZ and RAIN are calculated.

Fig. 5. Predicted changes vs actual changes of the response mean for experiments of targeted observations with EnKF assimilation in all cases for (left) composite reflectivity (dBZ) and (right) accumulated rainfall (mm) responses at forecast hours (a),(b) 6; (c),(d) 12; and (e),(f) 18. The colored lines and the numbers in the bottom-right corners represent the linear regressions and regression correlation coefficients for the sample populations, respectively. Linear-regression slopes that are statistically significant (p value < 0.001) are denoted with an asterisk.

Fig. 6. As in Fig. 5, but for samples of predicted changes vs actual changes of response variance (dBZ² and mm²).

Fig. 7. As in Fig. 5, but observations are assimilated with localization.

Fig. 8. As in Fig. 6, but observations are assimilated with localization.

Fig. 9. Correlation coefficients between predicted and actual changes of MDBZ (left) mean and (right) variance as a consequence of increasing the noise ratios CM and CV (see the text for a description of each ratio) at assimilation hours (a),(b) 6; (c),(d) 12; and (e),(f) 18. Black lines denote the sample size used to compute the correlation coefficient.

Fig. 10. As in Fig. 9, but the response variable is RAIN.

Fig. 11. Predicted changes vs actual changes of (left) response mean (m) and (right) response variance (m²) for (a),(b) targeted observations and (c),(d) nontargeted observations in the nonconvection case referenced in the text. Red diamonds, blue triangles, green circles, and purple stars correspond to observations assimilated at forecast hours 6, 12, 18, and 24, respectively. Colored lines represent the linear regressions for their colored sample populations.
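Several of the experiments shown above assimilate observations with covariance localization, which tapers ensemble covariances toward zero with distance from the observation. A widely used taper is the fifth-order piecewise-rational correlation function of Gaspari and Cohn. The sketch below is a standard implementation of that function, offered for illustration; it is not code from the study, and the function and argument names are chosen for readability.

```python
import numpy as np

def gaspari_cohn(dist, c):
    """Gaspari-Cohn fifth-order piecewise-rational localization function.

    dist : distance(s) from the observation (array-like).
    c    : localization half-width; weights reach exactly zero at dist = 2c.
    Returns localization weights in [0, 1].
    """
    z = np.abs(np.asarray(dist, dtype=float)) / c
    w = np.zeros_like(z)
    near = z <= 1.0                    # inner branch of the piecewise function
    far = (z > 1.0) & (z < 2.0)        # outer branch; zero beyond 2c
    zn = z[near]
    w[near] = -0.25*zn**5 + 0.5*zn**4 + 0.625*zn**3 - (5.0/3.0)*zn**2 + 1.0
    zf = z[far]
    w[far] = ((zf**5)/12.0 - 0.5*zf**4 + 0.625*zf**3 + (5.0/3.0)*zf**2
              - 5.0*zf + 4.0 - (2.0/3.0)/zf)
    return w
```

Multiplying raw ensemble covariances by such a weight is what restricts the influence of observation increments on distant response regions, one of the factors the study identifies as degrading ESA-based impact predictions.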


Factors Influencing Ensemble Sensitivity–Based Targeted Observing Predictions at Convection-Allowing Resolutions

1 Department of Geosciences, Texas Tech University, Lubbock, Texas

Abstract

Ensemble sensitivity analysis (ESA) is applied to select types of observations, in various locations and in advance of forecast convection, to systematically evaluate the effectiveness of ESA-based observation targeting for 10 convection forecasts. To facilitate the analysis, observing system simulation experiments and perfect models are utilized to generate synthetic targeted observations of temperature and pressure for future assimilation with an ensemble prediction system. Various observation assimilation experiments are carried out to assess the impacts of nonlinearity, covariance localization, and numerical noise on ESA-based observation-impact predictions. Localization applied during data assimilation restricts the projection of targeted-observation increments onto the forecast responses of composite reflectivity and 3-hourly accumulated precipitation, degrading impact predictions. In addition, numerical noise introduced by nonlinear perturbation evolution tends to reduce the correlations between observed and predicted impacts; small random-perturbation experiments often yield forecast impacts similar to those of targeted observations. Nonlinearity also manifests in the observation impacts when targeted observations are compared with nontargeted, randomly chosen observations: random observations have seemingly the same impact on forecasts as targeted observations. The results, obtained under idealized conditions and simplified ensemble configurations, demonstrate that ESA-based targeting for nonlinear convection forecasts may be most applicable at short time scales. Important implications for ESA-based targeting methods employed with real-time ensemble systems are also discussed.
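The ESA-based impact predictions evaluated in the study rest on linear ensemble statistics: the sensitivity of a scalar response J to a state variable x is the regression cov(J, x)/var(x), and Kalman-filter update relations give the change in the response mean and variance expected from assimilating a single observation of x. The following is a minimal sketch of that calculation under those assumptions; the function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def predicted_impact(J, x, obs_err_var, innovation):
    """Predict how assimilating one observation of state variable x would
    change the mean and variance of the forecast response J.

    J, x        : 1-D arrays of ensemble-member values (length = ensemble size).
    obs_err_var : assumed observation-error variance.
    innovation  : observation minus the ensemble-mean of x.
    Returns (sensitivity dJ/dx, predicted mean change, predicted variance change).
    """
    cov_jx = np.cov(J, x, ddof=1)[0, 1]     # ensemble covariance between J and x
    var_x = np.var(x, ddof=1)               # ensemble variance of x
    sensitivity = cov_jx / var_x            # linear-regression sensitivity dJ/dx
    denom = var_x + obs_err_var
    d_mean = (cov_jx / denom) * innovation  # expected shift of the response mean
    d_var = -(cov_jx ** 2) / denom          # expected (nonpositive) variance change
    return sensitivity, d_mean, d_var
```

Comparing such predicted changes against the changes actually realized after assimilation is the predicted-vs-actual comparison summarized in the study's regression figures; nonlinear perturbation growth and covariance localization both degrade the agreement.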

Current affiliation: Department of Atmospheric Science, Colorado State University, Fort Collins, Colorado.

© 2020 American Meteorological Society. For information regarding reuse of this content and general copyright information, consult the AMS Copyright Policy (www.ametsoc.org/PUBSReuseLicenses).

Corresponding author: Aaron Hill, aaron.hill@colostate.edu


1. Introduction

Since the 1940s, targeted observing has been employed to improve numerical weather prediction (NWP) forecasts, particularly for weather phenomena that NWP models struggle to forecast accurately and for which errors may grow quickly. Over the last few decades, sophisticated algorithms have been developed and tested that objectively target new observations, such as adjoint sensitivity (e.g., Errico 1997; Bergot 1999) and singular vectors (e.g., Palmer et al. 1998; Buizza and Montani 1999). Adjoint sensitivity linearly relates changes in forecast states to initial perturbations by utilizing an adjoint model (Le Dimet and Talagrand 1986), which requires linearization of the nonlinear model dynamics and therefore differentiable dynamics. Adjoint sensitivity has been applied extensively for predictability studies of tropical cyclones (e.g., Doyle et al. 2012), extratropical cyclones (e.g., Ancell and Hakim 2007; Doyle et al. 2014, 2019), and atmospheric rivers (e.g., Reynolds et al. 2019). Singular vectors describe the fastest-growing initial perturbations relative to a forecast metric (e.g., Hoskins and Coutinho 2005; Kim and Jung 2009) and can be computed from a decomposition of the tangent linear model (Diaconescu and Laprise 2012).

These two methods intelligently account for forecast-error growth and uncertainties in the forecast and analysis, which make them ideal to determine where new observations should be sampled to reduce forecast and analysis errors. However, they are inherently limited to primarily linear dynamic regimes; only recently were moist dynamics included within an adjoint model (e.g., Doyle et al. 2014, 2019). In general, these aforementioned techniques and others have largely been applied in targeting programs for large-scale systems in which linearity is more valid, including extratropical (e.g., Joly et al. 1997, 1999; Langland et al. 1999; Szunyogh and Toth 2002; Langland 2005) and tropical cyclones (e.g., Aberson 2003; Reynolds et al. 2010). Langland (2005) summarized the results from a variety of targeting field programs and noted average short-range forecast error reductions of 10%, with maximum reductions in error as much as 50%, and improved forecast skill in 70% of cases.

Exploration of targeting methods for smaller-scale forecasts (e.g., isolated deep convection) has been less extensive. The lack of exploration is particularly important given the large economic impact from severe storms and associated hazards on society, with insured losses exceeding $14 billion (U.S. dollars) in recent years (Insurance Information Institute 2019). Majumdar (2016) reviewed recent targeting campaigns and additionally noted the lack of focus on mesoscale phenomena, suggesting targeted observing for high-impact phenomena should be the focus of future targeting programs. However, neither adjoint sensitivity nor singular vectors are particularly useful for high-impact applications when models approach convection-allowing resolutions (i.e., <4 km) and linear approximations over long time windows are strained (Ancell and Mass 2006). Ensemble sensitivity analysis (ESA; Ancell and Hakim 2007; Hakim and Torn 2008; Torn and Hakim 2008) is a targeting method easily applied to an ensemble of forecasts, making it appealing for high-impact weather applications. ESA dynamically and statistically relates a forecast metric (e.g., area-averaged sea level pressure) to initial-condition perturbations, estimating how changes to prior states may influence a forecast. ESA has been utilized primarily to study dynamic forecast sensitivities of midlatitude cyclones (e.g., Torn and Hakim 2009; Garcies and Homar 2009, 2010; Chang et al. 2013; Zheng et al. 2013; McMurdie and Ancell 2014; Ancell 2016; Berman and Torn 2019) and tropical cyclones (e.g., Torn 2010; Torn and Cook 2013; Xie et al. 2013; Brown and Hakim 2015), but more recently has been applied to forecasts of convection (e.g., Bednarczyk and Ancell 2015; Torn and Romine 2015; Hill et al. 2016; Berman et al. 2017; Torn et al. 2017; Kerr et al. 2019; Kerr and Wang 2020) and wind (e.g., Zack et al. 2010a,b,c; Wile et al. 2015; Smith and Ancell 2017) with high-resolution forecast systems. 
ESA is capable of highlighting identifiable features (e.g., antecedent precipitation, midtropospheric troughs) that are dynamically related to forecast metrics (Hill et al. 2016). Coupling sensitivity with uncertainty estimates from an ensemble of forecasts, ESA can also predict how ensemble distributions will change from assimilated, targeted observations.

The ESA-based targeting method, derived from ensemble Kalman filter theory (EnKF; Kalman 1960; Evensen 1994), takes into account uncertainty of an ensemble forecast system analysis, targeted-observation error variance, perturbation evolution (i.e., sensitivity), and the data assimilation system that will be used to assimilate targeted observations (Ancell and Hakim 2007), all critical components to effective adaptive observing (Berliner et al. 1999; Langland 2005). Therefore, ESA-based targeting is capable of relating initial-condition uncertainty and potential error growth to a specific forecast response related to high-impact hazards (e.g., hail, precipitation, severe wind). Majumdar (2016) suggested that other targeting algorithms do not take into account data assimilation characteristics, lack error-variance information from the targeted observation, and do not estimate how assimilation increments will evolve with time; ESA takes into account all three factors.

ESA-based targeting procedures have been applied to tropical cyclone (e.g., Xie et al. 2013; Torn 2014) and extratropical cyclone (e.g., Ancell and Hakim 2007; Torn and Hakim 2008) forecasts. ESA-based targeting was retrospectively examined for extratropical cyclones in the Pacific Northwest by Torn and Hakim (2008), who found that for large-scale synoptic systems, ESA-based predictions of changes to forecast sea level pressure from a single assimilated buoy observation were accurate. Torn (2014) noted that for tropical cyclone applications, targeted dropsonde observations tended to improve forecasts relative to nontargeted observations. More recently, ESA was used to target observations for convection forecasts in the Great Plains of the United States (e.g., Torn and Romine 2015; Romine et al. 2016; Coniglio et al. 2016) during the Mesoscale Predictability Experiment (MPEX; Weisman et al. 2015). In contrast to the synoptic-based evaluation, Romine et al. (2016) determined that a number of 3-h accumulated precipitation forecasts during intensive observing periods from MPEX verified worse when targeted dropsonde observations were included in the assimilation process.

Various factors have been hypothesized that could influence targeted observation impacts and associated targeting predictions for all scales of flow, including nonlinear forecast evolution (Xie et al. 2013), localizing observations during assimilation and inflating ensemble distributions to reduce underdispersion (Hill et al. 2018), sampling error from limited ensemble members (Torn and Hakim 2008; Romine et al. 2016; Coniglio et al. 2019), insufficient sampling of dynamic target regions (Bergot 2001; Majumdar 2016), and suboptimal data assimilation systems (Romine et al. 2016). For example, the localization of observations—critical to reduce noise during assimilation—is not directly accounted for in the ESA-based method, thus its relative impact is not estimated prior to sampling and assimilation. Given the mixed performance of ESA-based targeting for convection forecasts in MPEX (Romine et al. 2016), and good algorithmic performance at larger scales (e.g., Torn and Hakim 2008), the efficacy of ESA-based targeting at convection-allowing resolutions warrants additional investigation.

The primary goal of this work is to examine ESA-based targeting for convection-allowing forecasts in an effort to understand what elements of the forecasts and assimilation procedure contribute to targeting success or failure. ESA-based targeting would accurately estimate observation impacts if applied to a linear model, with no localization of observations during EnKF data assimilation, and no model error or numerical noise. Real-time convection-allowing NWP models are highly nonlinear, and model error and noise impacts are not easily quantified in real-time targeting applications. However, observing system simulation experiments (OSSEs) provide a framework to account for and remove model error; further discussion of OSSEs used in this work is provided in section 2. Various forecast and assimilation configurations to examine ESA-based targeting are also detailed in section 2. Section 3 evaluates ESA-based predictions after targeted observations are assimilated. Influences of numerical noise and nonlinearity are characterized in section 4. Section 5 offers a summary of the experiments and discussion of convective-scale targeted observing and implications for future applications.

2. Methodology

a. Ensemble sensitivity analysis–based targeting

ESA predicts optimal locations of targeted observations and their impact on forecasts by statistically linking forecast distributions to perturbations in the model state at earlier times. First, a scalar forecast metric Rf at time f is regressed back to the model state xp at time p to describe how Rf will change due to a perturbation in xp (i.e., sensitivity). Sensitivity is easily calculated given an ensemble forecast distribution and vector samples of Rf and xp:
∂Rf/∂xp = covariance(Rf, xp)/variance(xp).   (1)
Sensitivity is simply the slope of the linear regression between Rf and xp.
ESA-based targeting estimates are derived by considering how an update to the analysis error-covariance matrix from a new observation will project onto Rf; a formal derivation is presented by Ancell and Hakim (2007). Simply put, the ESA-based targeting algorithm estimates the change in variance of Rf (δσ²) by

δσ² = −[variance(xp) × sensitivity(Rf, xp)]²/[variance(xp) + variance(ob)]   (2)

    = −covariance(Rf, xp)²/[variance(xp) + variance(ob)],   (3)
where “sensitivity” represents ensemble sensitivity as in (1). The only input value to (3), besides an ensemble of forecast output, is an estimate of targeted observation error variance [variance (ob)], typically prescribed from instrument error characteristics. Note that (3) is negative definite; mathematically, any new observation assimilated will always reduce forecast metric variance. The estimated variance reduction from a new observation is trivial to calculate over a gridded domain from an ensemble of forecasts. ESA-based estimates of forecast metric-variance change rely on the linear evolution of initial perturbations, and in particular how those perturbations evolve with the tangent linear model (Ancell and Hakim 2007). It should be expected this particular linear dependence will be challenged in the proposed convection-allowing forecast experiments and analysis described below.
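The estimates in (1) and (3) reduce to a few lines of ensemble statistics. The following is a minimal sketch, not the authors' code; function and variable names are our own, and an N-member ensemble with scalar response samples R and state samples x on an arbitrary grid is assumed:

```python
import numpy as np

def ensemble_sensitivity(R, x):
    """Ensemble sensitivity, Eq. (1): regression slope of R onto x.

    R : (N,) ensemble samples of the scalar forecast metric Rf
    x : (N, ...) ensemble samples of the state variable xp (any grid shape)
    """
    Rp = R - R.mean()
    xp = x - x.mean(axis=0)
    cov = np.einsum('n,n...->...', Rp, xp) / (len(R) - 1)
    return cov / xp.var(axis=0, ddof=1)

def predicted_variance_change(R, x, ob_var):
    """ESA-based targeting estimate, Eq. (3): change in variance of Rf.

    Negative definite by construction -- any assimilated observation is
    predicted to reduce forecast-metric variance.
    """
    Rp = R - R.mean()
    xp = x - x.mean(axis=0)
    cov = np.einsum('n,n...->...', Rp, xp) / (len(R) - 1)
    return -cov**2 / (xp.var(axis=0, ddof=1) + ob_var)
```

Evaluated at every grid point of an ensemble forecast, the second function yields the target field from which observation locations are selected.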

Also note that collinearity can be introduced between variance-reduction estimates when independent calculations of (3) are made for different observation types. Collinearity is particularly problematic when observations are considered independently at the same point in space and time: good or bad impact predictions can then appear for both variables simply because the two variables covary with one another. Collinearity becomes less of a factor (but is not absent) when observations are selected independently in different locations, as is done in this work; more details on observation selection are presented later.

b. Cases

Ten dryline convection events in the southern plains are selected to evaluate ESA at convection-allowing resolutions. These 10 cases are subjectively chosen from the 2011–13 spring seasons (i.e., March–June). Each case produces convection within the southern Great Plains, which focuses the results and analyses on Texas and adjacent states. Each case is listed in Table 1, with corresponding date, location of primary convection, and approximate time of initiation, as determined by simulations discussed later in this section.

Table 1. Dryline convection cases.

The cases are selected on the basis of cold-started simulations (i.e., no data assimilation) from 1 March to 30 June of each year, carried out with the Advanced Research core of the Weather Research and Forecasting (WRF) Model (Skamarock et al. 2008). Deterministic simulations were performed twice daily at 0000 and 1200 UTC with 4-km grid spacing from National Centers for Environmental Prediction Global Forecast System (GFS) initial conditions. Forecast simulated composite reflectivity and 2-m dewpoint temperature from each WRF simulation were visually scrutinized to select cases that produced convection along the dryline in the southern Great Plains. Dryline convection is exclusively chosen due to its relationship with the frequency of deep convection in this geographical region, but the described methodology can be applied to any convective regime. Forecasts of 2-m temperature were also inspected to verify that simulated convection was not initiated by a cold front (i.e., synoptically forced ascent).

c. Observing system simulation experiments

To examine individual aspects that influence ESA-based targeting performance, an observing system simulation experiment (OSSE; Arnold and Dey 1986; Atlas 1997) configuration is used. OSSEs are valuable tools that have traditionally been used in the analysis of new observing systems before their deployment (e.g., Yussouf and Stensrud 2010), new data assimilation techniques (e.g., Liu et al. 2009; Sobash and Stensrud 2013; Sobash et al. 2015; Gao et al. 2016), observing network design (e.g., Khare and Anderson 2006), and observation-impact studies (e.g., Yussouf and Stensrud 2010; Lange and Craig 2014; Madaus and Hakim 2017). A perfect-model OSSE configuration is used herein, which synthetically generates observations from a model simulation rather than meteorological instrumentation (e.g., Zhang et al. 2004, 2006), and assimilates those observations with an ensemble prediction system to generate forecasts; the same dynamical model is used for both modeling systems. While the perfect model configuration often leads to overly positive results (Masutani et al. 2010), it also removes the influence of model error on the forecast; observations generated from a model simulation will sample the model-error characteristics that are identical to the forecast model used for data assimilation and extended forecasts. It is advantageous for the proposed experiments to exploit this methodology and generate perturbed forecasts from targeted observations that are free of model error in order to isolate other system aspects that play a role in the impacts of targeted observations (e.g., localization) on convection forecasts.

The ensemble forecast system forms the basis for all experiment forecasts and targeted-observation evaluations; hereinafter, this ensemble is referred to as the control. The control ensemble uses version 3.8.1 of the WRF Model during data assimilation and the free-forecast period to advance the control ensemble members. In total, 50 ensemble members are generated through random perturbations to carry out cycled data assimilation and execute free forecasts. Three one-way nested domains are used with 27-, 9-, and 3-km horizontal grid spacing (Ens d01, d02, and d03 in Fig. 1) and 38 vertical levels; the 3-km domain is centered over the southern Great Plains (Ens d03 in Fig. 1). Initial and boundary conditions from the GFS are perturbed with the WRF three-dimensional variational data assimilation (Barker et al. 2004) and fixed-covariance boundary perturbation technique (Torn et al. 2006) for each ensemble member on the outermost domain. The innermost-domain members receive downscaled initial and boundary conditions from their respective parent members.

Fig. 1.

Outermost rectangles (Ref d01, d02, and d03) represent the WRF observation-simulation domain configuration, where observations are gathered strictly from the innermost domain for cycled data assimilation (Ref d03). The innermost domains (Ens d01, d02, and d03) are used in the OSSE control ensemble-forecast simulations.

Citation: Monthly Weather Review 148, 11; 10.1175/MWR-D-20-0015.1

An ensemble adjustment Kalman filter (Anderson 2001) method within the Data Assimilation Research Testbed (DART; Anderson 2009) is used to develop flow-dependent covariances between model state variables during cycling, which is critical for ESA application. Cycling of synthetic observations begins 48 h prior to forecast initialization on the outermost domain, assimilating observations every 6 h. After 24 h of cycling, the inner domains are initialized by downscaling the outer domain. Following downscaling, another 24 h of cycled data assimilation is conducted on all three domains, with 6-hourly assimilation. Subgrid-scale processes are represented with model parameterizations: the Rapid Radiative Transfer Model (Mlawer et al. 1997) and Dudhia (Dudhia 1989) radiation schemes, the Yonsei University boundary layer scheme (Hong et al. 2006), the Kain–Fritsch cumulus scheme (Kain 2004), the Noah Land Surface Model (Chen and Dudhia 2001), and the Thompson microphysics scheme (Thompson et al. 2004). All listed schemes except the cumulus parameterization are applied to all three domains; convection is explicitly simulated at 3-km grid spacing (Bryan et al. 2003; Kain et al. 2013) on the innermost domain. After cycling, ensemble members are integrated through a 36-h free-forecast period that spans the convective event.

To promote spread within the ensemble during cycling, a spatially and temporally adapting inflation mechanism (Anderson 2007, 2009) is applied. Inflation is used only on the posterior ensemble to inflate areas across the domain with too little spread after assimilation. Inflation counteracts the impact of undersampling the degrees of freedom, as a result of using only 50 ensemble members. Furthermore, to reduce spurious covariances across the domain that arise when a finite ensemble is used to approximate the true error-covariance matrix, covariance localization is employed during cycling to reduce the impact of observations far from the observing location (Houtekamer and Mitchell 1998; Hamill 2001). A Gaspari–Cohn localization function (Gaspari and Cohn 1999) is employed on each observation with horizontal half-widths of 950 and 320 km, and vertical half-widths of 1.5 and 0.5 km, for the outermost and nested domains, respectively. At two times the half-width, the assimilated observation has no impact on state variables.
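The Gaspari–Cohn taper is a fifth-order piecewise polynomial that equals 1 at the observation location and falls to exactly 0 at twice the half-width. A sketch of the standard correlation function follows (our implementation of Gaspari and Cohn 1999, not DART's; it expects array-valued distances):

```python
import numpy as np

def gaspari_cohn(dist, half_width):
    """Gaspari and Cohn (1999) compactly supported correlation function.

    dist       : array of distances from the observation location
    half_width : localization half-width c; the taper reaches 0 at 2c
    """
    z = np.abs(np.asarray(dist, dtype=float)) / half_width
    taper = np.zeros_like(z)
    inner = z <= 1.0
    outer = (z > 1.0) & (z < 2.0)
    zi, zo = z[inner], z[outer]
    taper[inner] = (-0.25 * zi**5 + 0.5 * zi**4 + 0.625 * zi**3
                    - (5.0 / 3.0) * zi**2 + 1.0)
    taper[outer] = ((1.0 / 12.0) * zo**5 - 0.5 * zo**4 + 0.625 * zo**3
                    + (5.0 / 3.0) * zo**2 - 5.0 * zo + 4.0
                    - (2.0 / 3.0) / zo)
    return taper
```

For the nested domain above, gaspari_cohn(dist_km, 320.0) would scale each observation increment; beyond 640 km (twice the half-width) the increment is zero, as stated in the text.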

Synthetic observations used during cycling are gathered from a separate WRF simulation [i.e., synthetic-observation simulation (SOS)] by interpolating model state variables to observation locations. The SOS is integrated for 54 h with version 3.8.1 of WRF, beginning 6 h prior to data assimilation, in order to sample the atmosphere over the cycled data assimilation window. The simulation is cold-started from GFS boundary and initial conditions, with three one-way nested domains at 27-, 9-, and 3-km grid spacing (Ref d01, d02, and d03 in Fig. 1). The innermost domain covers the same geographical area as the outermost control ensemble domain; all synthetic observations are gathered from this large, high-resolution domain. The same parameterization schemes are used in the SOS as in the control ensemble, and no convection parameterization is used on the 3-km grid spacing domain.

Surface observation locations are obtained from the Meteorological Data Ingest System (MADIS). Observations of temperature, specific humidity, altimeter, and both horizontal components of wind are gathered from the SOS at surface station locations across the United States (e.g., Automated Surface Observing System network, Mesonets, and Citizen Weather Observer Program) and a handful of buoys (Fig. 2). Additionally, observations are gathered at active and inactive radiosonde locations across the United States (stars in Fig. 2), retrieving temperature, specific humidity, and wind components from all 38 vertical model levels, as well as station altimeter, from the SOS. Error is added to each observation to reduce effects of underdispersion within the ensemble during data assimilation. The error characteristics of each observation type are obtained from DART and are described in Table 2. Errors are added to observations by randomly selecting a value from a normal distribution with mean 0 and variance equal to the specified error statistic for each individual observation (Table 2). Combined, these two observation sets produce approximately 150 000 observations at any particular assimilation time during cycling.
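Drawing the observation errors described above can be sketched as follows (a sketch with illustrative names; the actual per-type error variances come from DART, per Table 2):

```python
import numpy as np

def perturb_observations(true_values, error_variance, seed=None):
    """Add Gaussian error (mean 0, prescribed variance) to synthetic obs.

    true_values    : (n_obs,) values interpolated from the SOS
    error_variance : scalar or (n_obs,) error variance per observation type
    """
    rng = np.random.default_rng(seed)
    sd = np.sqrt(np.broadcast_to(error_variance, np.shape(true_values)))
    return np.asarray(true_values) + rng.normal(0.0, sd)

# Hypothetical example: 2-m temperature obs with a 1.0 K**2 error variance
t2m_obs = perturb_observations(np.full(1000, 288.0), 1.0, seed=42)
```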

Fig. 2.

Count of surface observing stations within 100 km of a point (shading). Radiosonde observing sites (red stars) correspond to locations at which observations are taken from the SOS in the vertical at all model levels. Observations in the vertical at each radiosonde location are assumed to not drift from the station’s surface latitude and longitude. Locations for the surface stations and radiosondes were obtained online (https://madis-data.ncep.noaa.gov/ and http://www.raob.com/, respectively).


Table 2. Observation errors.

d. Targeted observing

Area-averaged composite reflectivity (MDBZ) and 3-hourly accumulated precipitation (RAIN) are chosen as the forecast metrics (Rf) to represent deep convection along drylines (e.g., Fig. 3b) within the ensemble forecasts. They are calculated at prescribed times over regional areas (response regions) on the innermost control ensemble domain (e.g., Fig. 3a). Using the ESA-based targeting formula (described in section 2), the ensemble forecast estimates of Rf are regressed back to state variables at earlier times (e.g., forecast hour 6) and combined with initial-condition error variance to determine where targeted observations should be gathered (i.e., target fields). Note that ESA-based targeting estimates are only computed on the innermost convection-allowing control ensemble domain, and therefore targeted observations are only assimilated on this high-resolution domain. A two-sided t test is performed on the sample distribution regression slope (i.e., ensemble sensitivity) with a 90% confidence interval to subset sensitivities that are statistically significant and account for sampling errors with a limited ensemble size. Significant sensitivity rejects the null hypothesis that changes to the initial conditions will not affect the forecast metric. Statistical significance of sensitivity is then used to spatially subset potential targeted observations. Target fields are calculated at forecast hours 6, 12, and 18, providing 6-hourly spaced lead times (i.e., time between assimilation and calculated Rf) to examine how observation impacts vary with integration length and if nonlinearity influences are more prevalent with longer lead times. Initial conditions at each assimilation time are obtained from control ensemble member forecasts valid at that time.
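The significance screen on the regression slope can be sketched with a two-sided t test on the sample correlation. This is an illustrative implementation, not the operational code; SciPy is assumed for the t distribution, and the 90% confidence level follows the text:

```python
import numpy as np
from scipy import stats

def significant_sensitivity(R, x, alpha=0.10):
    """Ensemble sensitivity, masked (NaN) where the regression slope is
    not significantly different from zero (two-sided t test).

    R : (N,) forecast-metric samples; x : (N, M) state samples.
    """
    n = len(R)
    Rp = R - R.mean()
    xp = x - x.mean(axis=0)
    cov = np.einsum('n,n...->...', Rp, xp) / (n - 1)
    var_x = xp.var(axis=0, ddof=1)
    r = cov / np.sqrt(var_x * Rp.var(ddof=1))   # sample correlation
    t = r * np.sqrt((n - 2) / (1.0 - r**2))     # t statistic for the slope
    t_crit = stats.t.ppf(1.0 - alpha / 2.0, df=n - 2)
    return np.where(np.abs(t) > t_crit, cov / var_x, np.nan)
```

Only grid points passing the test contribute to the target fields from which observations are drawn.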

Fig. 3.

Ensemble mean forecasts of (a) composite reflectivity (dBZ) and (b) 2-m dewpoint temperature (°C) for case 9 at 2300 UTC 20 May 2013. Barbs represent the 10-m wind speed and direction, and contours in (b) are the mean sea level pressure (hPa, contoured every 2 hPa). The black-outlined rectangle in (a) identifies the response region for this case.


For concision, only temperature and altimeter are used as the initial-condition variables against which Rf is regressed, and these serve as the only targeted-observation types. Given prior ESA results (e.g., Bednarczyk and Ancell 2015; Hill et al. 2016), it is expected that forecasts of reflectivity and precipitation are sensitive to low-level and midlevel temperatures given their relationship to buoyancy. Moreover, Madaus et al. (2014) detailed the value of dense pressure observations for convection forecasts, demonstrating improved mesoscale analyses and forecasts when they were assimilated. To critically evaluate the ESA methodology, 10 gridpoint locations are chosen from the generated target fields at a single time in areas of high estimated MDBZ and RAIN variance reduction. The selected target locations are not necessarily at the maximum in variance reduction, but rather are chosen in areas of statistically significant sensitivity, which typically coincide with areas of high variance reduction. Five of the identified locations are used to select targeted temperature observations: one surface 2-m temperature observation (e.g., Figs. 4c,d) and four temperature observations aloft at 850, 700, 500, and 300 hPa. The other five locations are reserved for targeting altimeter observations using the same selection procedure (e.g., Figs. 4a,b). In addition to these 10 target locations, 10 other locations are chosen randomly for observing, at the same vertical levels, in areas designated as “nontargets” (i.e., areas that do not exhibit statistically significant sensitivity). In this manner, the ability of ESA to diagnose the best sampling regions can be judged. While this procedure could be automated, observation locations are selected manually to guarantee no observations are selected in mountainous regions and to simulate a real-time application of ESA with human decision makers.

Fig. 4.

Target fields valid at 1800 UTC 20 May 2013 in case 9 of (a),(b) surface altimeter (ALT) and (c),(d) 2-m temperature (T2) observations; control forecast ensemble mean altimeter (hPa) and temperature (°C) are contoured. Target fields are calculated with respect to (left) MDBZ (dBZ2) and (right) RAIN (mm2) responses valid at 2300 UTC 20 May 2013. Green circles in (a) and (b) are locations at which targeted observations are gathered; blue circles are locations for nontargeted observations. Similarly, green stars in (c) and (d) identify locations for targeted 2-m temperature observations, whereas blue stars demonstrate nontargeted observation locations. Green boxes outline the response regions where MDBZ and RAIN are calculated.


Once all 20 locations are determined, the respective temperature and pressure observations at the appropriate observing time are retrieved from a single ensemble member, which guarantees the assimilated target and nontarget observations fall within the ensemble distribution. Typically, the SOS would be used to generate targeted observations in an OSSE. However, it is critical that the chosen observations are not discarded by the filtering procedure, which could happen if the independent SOS and control ensemble forecasts significantly diverge. The only significant configuration difference between the SOS and a single ensemble member is the domain size; the model parameterizations and grid spacing are identical.

All observations selected are assimilated twice (in parallel assimilation systems) to evaluate the impacts of localization on ESA predictions. The ESA-based targeting methodology assumes no localization for targeting prediction and EnKF assimilation. Every targeted and nontargeted observation is assimilated with localization employed in one instance (320 km horizontally and 0.5 km vertically) and localization turned off in a parallel and identical second EnKF assimilation. Brief tests were conducted on the magnitude of vertical and horizontal localization radii, with little sensitivity noted (not shown). It should also be mentioned that no posterior inflation is used in experiment assimilations. Forecasts are carried out using these two parallel analyses with boundary conditions for each ensemble member from their respective parent domain control ensemble forecasts. These side-by-side-experiment forecasts provide the basis for determining how localizing observations may alter the observation’s impact on the forecast distributions of MDBZ and RAIN.

All targeting experiment permutations amount to 120 observations individually assimilated per case and 1200 in total: (10 target observations and 10 nontarget observations) × 3 assimilation times × 2 assimilation configurations × 10 cases. Therefore, while the response functions and initial condition variables are limited, there are a sufficient number of simulations to reach conclusions regarding the effectiveness of ESA-based targeting and what factors are playing a role in its success or failure within convection-allowing forecast applications.

e. Analysis

To assess the performance of ESA predictions, comparisons are made between forecast impacts predicted by the ESA methodology and realized impacts after targeted and nontargeted observations are assimilated and analyses are integrated forward. Comparisons are made between the control ensemble forecasts and forecasts from targeting experiments that vary assimilation times, assimilation configurations, response types, and observation types (i.e., targeted versus nontargeted observations).

The predicted changes of the response mean (δR¯P) and variance (δσP2) are compared with the actual changes of response mean (δR¯A) and variance (δσA2), where the subscripts P and A correspond to predicted and actual changes, respectively. The estimated change of the response mean,
δR¯P = (∂R/∂x)δx¯,
is simply the product of the sensitivity (∂R/∂x) and the ensemble-mean analysis increment due to observation assimilation (δx¯). The predicted change in response variance (δσP2) is simply the negative-definite ESA-estimated target value from (3). Actual changes of the response distribution statistics are calculated as the difference between experiment (subscript E) and control ensemble (subscript C) forecasts:
δR¯A = R¯E − R¯C and
δσA2 = σE2 − σC2.
The agreement between predicted and actual changes to the response distributions is examined through linear regressions of the samples and the associated correlation coefficients. Statistical significance testing is applied to the linear regressions when sample sizes are sufficiently large (>30) to determine whether regression slopes are significant (i.e., different from zero). A Wald test (Oliphant 2007) is used to determine significance, with a two-sided p value < 0.001.
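A minimal sketch of this evaluation pipeline, assuming NumPy/SciPy arrays of per-member response values; `scipy.stats.linregress` reports exactly the two-sided Wald-test p value on the slope (Oliphant 2007 is the SciPy reference), and the function names here are ours.

```python
import numpy as np
from scipy.stats import linregress  # two-sided Wald-test p value on the slope

def predicted_mean_change(sensitivity, mean_increment):
    """delta R_P = (dR/dx) times the ensemble-mean analysis increment."""
    return sensitivity * mean_increment

def actual_changes(r_experiment, r_control):
    """Actual mean and variance changes: experiment minus control response ensembles."""
    d_mean = r_experiment.mean() - r_control.mean()
    d_var = r_experiment.var(ddof=1) - r_control.var(ddof=1)
    return d_mean, d_var

def evaluate_predictions(predicted, actual, n_min=30, alpha=0.001):
    """Regress actual on predicted changes; report slope, CC, and significance."""
    fit = linregress(predicted, actual)
    significant = len(predicted) > n_min and fit.pvalue < alpha
    return fit.slope, fit.rvalue, significant
```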

To examine how numerical noise may influence the targeted-observation evaluation, experiments are designed that randomly perturb forecast-generated analyses in remote locations and integrate the new analyses with WRF over the forecast window. The resulting changes in response distributions are quantified and compared against the targeted- and nontargeted-observation experiments. New, perturbed analyses are generated from the forecast distribution at each experiment assimilation time (i.e., forecast hours 6, 12, and 18) and integrated forward, resulting in one noise experiment at each assimilation time per convective case. To prescribe a random perturbation that would reasonably have no influence on convection other than through numerical noise, the perturbation is made to a WRF prognostic variable (i.e., perturbation potential temperature) well removed from the convection of interest on each ensemble member. Perturbations are made at a single grid point at the topmost vertical level in the northeastern-most corner of the domain. The perturbation magnitudes are drawn separately for each ensemble member from a normal distribution with mean 0 and variance 0.5 K2. Other perturbation variances were tested and produced similar results (not shown).
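The remote-perturbation construction can be sketched as follows. The array layout, function name, and the treatment of the stated 0.5 value as a variance (hence a standard deviation of sqrt(0.5) K) are our assumptions for illustration.

```python
import numpy as np

def seed_remote_noise(theta, variance=0.5, rng=None):
    """Perturb each member's perturbation potential temperature at a single grid
    point far removed from the convection: top model level, northeast corner.
    theta: (n_members, n_levels, n_lat, n_lon), ordered south-to-north, west-to-east."""
    rng = np.random.default_rng() if rng is None else rng
    pert = rng.normal(0.0, np.sqrt(variance), size=theta.shape[0])  # one draw per member
    out = theta.copy()
    out[:, -1, -1, -1] += pert  # top level, northeastern-most corner, single point
    return out
```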

To quantify the impact of numerical noise on ESA predictions, each targeting experiment is assigned ratios representing the magnitude of its response change relative to that of the corresponding noise experiment; the larger the ratio, the smaller the proportion of the observation impact that is explainable by noise:
|δR¯E|/|δR¯CS| = CM and
|δσE2|/|δσCS2| = CV,
where δR¯CS and δσCS2 represent the change in response mean and variance in the noise experiments, respectively, and CM and CV are the ratios for changes in response mean and variance, respectively. For example, a CM or CV of 1 would indicate that a targeted observation induces the same magnitude of change on the response distribution as an innocuous, random perturbation. CM and CV can then be assessed as a function of the sample-regression correlation coefficients (CCs) to determine the influence of noise on ESA predictions across a spectrum of ratios.
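Computationally these ratios are simple elementwise quantities; a sketch under our own naming, applicable to either the mean-change or variance-change ratio:

```python
import numpy as np

def noise_ratio(delta_experiment, delta_noise):
    """|observation-induced response change| / |noise-experiment response change|.
    Ratios > 1 indicate the observation impact exceeds the chaos-seeding baseline."""
    return np.abs(delta_experiment) / np.abs(delta_noise)

def subset_by_noise(predicted, actual, ratios, threshold=1.0):
    """Retain only experiments whose signal-to-noise ratio exceeds the threshold."""
    keep = ratios > threshold
    return predicted[keep], actual[keep]
```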

Additional analysis examining the spatial distribution of observation increments and the resulting forecast increments is beyond the scope of this paper and will be carried out in a companion paper. Moreover, while the analysis herein focuses on ESA-based predictions of single-observation impacts on forecast distributions, conclusions regarding forecast skill improvements (i.e., forecast error) will not be discussed, given that the number of targeted observations, response metrics, and cases is too limited to reach robust conclusions about changes in forecast skill from ESA-based targeted observations.

3. Predicted versus actual impacts

a. Targeted and nontargeted observations

From an aggregate perspective, ESA has stronger predictive skill for targeted observations assimilated at shorter lead times (i.e., there is generally better association between δR¯P and δR¯A). There is little association between δR¯P and δR¯A for targeted observations assimilated at forecast hour 6 (the longest lead time), as measured by CCs of 0.28 and 0.29 for the MDBZ (Table 3) and RAIN (Table 4) responses, respectively; the association is similarly weak at forecast hour 12, where predicted response-mean impacts relate to actual impacts with CCs of 0.1 and 0.06 for the MDBZ and RAIN response variables, respectively. Prediction errors of δR¯A for MDBZ are largest for observations assimilated at hours 6 and 12, with linear regression slopes of 0.27 and 0.13, respectively. At forecast hour 18 (the shortest lead time), δR¯P and δR¯A correlate more strongly, though the sample linear regressions suggest only a 0.44 CC for MDBZ and a 0.7 CC for RAIN. Generally, despite weak correlations, response-mean change predictions improve as lead time decreases (i.e., linear regression slopes approach one). At forecast hours 6 and 18, the associated linear regressions of δR¯ are statistically significant (boldface values in Tables 3 and 4) for both response variables despite poor predictions. Conversely, nontargeted observations have almost zero correlation between δR¯P and δR¯A at forecast hours 18 and 12 (Tables 3 and 4). This result arises in large part because mean-change predictions for MDBZ and RAIN are relatively small compared to actual changes (not shown); nontargeted observations impact the forecast at the same magnitude as targeted observations but are not predicted by ESA to do so.

Table 3.

Sample linear regression slope and correlation coefficient between predicted and actual changes of MDBZ response mean and variance at forecast hours 6, 12, and 18 for targeted and nontargeted observations. Sample size is 200 in each forecast-hour sample. Linear regression slopes that are statistically significant (p value < 0.001) are shown with boldface type.

Table 4.

As in Table 3, but slopes and correlation coefficients are calculated with respect to the RAIN response variable.


Targeted observations assimilated at forecast hour 18 act to reduce MDBZ response variance in 83% of experiments, compared to 58% and 62% for forecast hours 12 and 6, respectively; RAIN response variance is decreased in 63%, 41%, and 56% of experiments at forecast hours 18, 12, and 6, respectively (not shown). This result suggests that nonlinear perturbation evolution may be more prevalent as lead time increases, causing ESA-based predictions to degrade in skill. Linear regressions of δσA2 and δσP2 for targeted observations at all assimilation times are statistically significant for both response variables, with forecast-hour-18 assimilation impacts correlating best with ESA-based predictions for RAIN (Table 4); hour-12 impacts correlate best to predictions for MDBZ (Table 3). As was the case when comparing targeted- and nontargeted observation impacts on R¯, nontargeted observations tend to produce a similar δσA2 as targeted observations (not shown). Moreover, nontargeted observations have statistically significant correlations at assimilation hours 6 and 12, but the linear regression slopes themselves are nearly vertical for the MDBZ response (large slopes); no regression slopes of the nontargeted observation impacts on response variance for RAIN are significant.

b. Lead time

ESA prediction-skill dependence on lead time is explored by further partitioning experiments by the assimilation configuration. For targeted observations assimilated with the EnKF and no localization, correlations between δR¯P and δR¯A increase and prediction errors largely decrease with decreasing lead time (Fig. 5). At forecast hour 6, the correlations between δR¯P and δR¯A for both responses are weak, and slightly negative (Figs. 5a,b). The MDBZ response has a 0.3 CC between δR¯P and δR¯A at forecast hour 12, which increases to 0.85 at forecast hour 18. The RAIN response has a similar increase in CCs, from 0.2 at forecast hour 12 to 0.98 at hour 18. At forecast hour 18, the sample-regression slopes for MDBZ and RAIN are statistically significant and more closely align with the perfect-prediction 1–1 line (Figs. 5e,f).

Fig. 5.

Predicted changes vs actual changes of response mean for experiments of targeted observations with EnKF assimilation in all cases for (left) composite reflectivity (dBZ) and (right) accumulated rainfall (mm) responses at forecast hours (a),(b) 6; (c),(d) 12; and (e),(f) 18. The colored lines and the numbers in the bottom-right corners represent the linear regressions and regression correlation coefficients for the sample populations, respectively. Linear-regression slopes that are statistically significant (p value < 0.001) are denoted with an asterisk.

Citation: Monthly Weather Review 148, 11; 10.1175/MWR-D-20-0015.1

In comparison with the response-mean correlations, predicted and actual changes of response variance are more strongly correlated at forecast hours 6 and 12 (>0.5), although changes to the RAIN response at hour 12 exhibit a negative correlation (Fig. 6d), influenced by numerous positive changes to the variance of the rainfall response when predictions were negative. The targeted observations act to both increase and decrease response variance by as much as 30 dBZ2 and 0.75 mm2 at forecast hours 6 and 12, respectively. In contrast, at forecast hour 18, targeted observations primarily decrease variance for both response variables (Figs. 6e,f), correlating best with δσP2 for the RAIN response (0.83 CC). Predicted impacts of targeted observations on response variance generally span from −20 to 0 dBZ2 and from −0.5 to 0 mm2, whereas actual impacts span from −40 to 30 dBZ2 and from −0.5 to 0.75 mm2 for the MDBZ and RAIN responses, respectively (Fig. 6). It is hypothesized that the increases in MDBZ and RAIN variance are largely a result of nonlinear perturbation evolution that ESA cannot properly predict; nonlinearity is explored in section 4.

Fig. 6.

As in Fig. 5, but for samples of predicted changes vs actual changes of response variance (dBZ2 and mm2).


c. Localization

The assimilation configuration is also altered in the experiments to examine the influence of localization on ESA predictions of observation impacts. Localization negatively impacts the assimilation and the subsequent correlation between predicted and actual response impacts (Figs. 7 and 8). Localization narrows the impact of the targeted observation at assimilation and subsequently reduces the impact on the response mean (cf. Figs. 7 and 5). This result is particularly evident at forecast hour 18, when changes in the MDBZ and RAIN means hover near zero while predicted changes span from −4 to 4 dBZ and from −0.75 to 0.5 mm, respectively (Figs. 7e,f). Additionally, observations assimilated with localization employed have little impact on response variance (Fig. 8). While some observations induce a nonzero change in response variance, the majority of observations at all forecast times, and in particular hour 18 (Figs. 8e,f), produce little to no change in variance. Linear regressions applied to these sample subsets can be misleading, yielding high CCs because the samples cluster (e.g., Figs. 7e and 8c,e). These results demonstrate that localization negatively impacts the prediction of response changes as determined by ESA.

Fig. 7.

As in Fig. 5, but observations are assimilated with localization.


Fig. 8.

As in Fig. 6 but observations are assimilated with localization.


4. Other sources of error

While the presented experiments were designed to control for a number of error sources, two main error sources remain: numerical noise and nonlinearity. Numerical noise may be contributing to poor correlation results through chaos seeding (Ancell et al. 2018). Additionally, nonlinear perturbation evolution through convective processes is unaccounted for in the presented experiments and may be contributing to the unfavorable experimental results related to ESA-based observation predictions. To some extent, this signal of nonlinearity is evident in the comparison between targeted and nontargeted observations, as well as in the prediction-skill dependence on lead time. The ESA-based method fits a linear relationship between response and initial-condition variables, which may poorly describe the complex and potentially nonlinear relationships in the simulated cases, such as the relationship between surface temperatures and the attainment of the level of free convection by parcels near the dryline. The impacts of numerical noise due to chaos seeding (i.e., perturbation evolution) and nonlinearity will be investigated further to enhance understanding of how targeted observations influence convection forecasts and of the utility of ESA to target these observations.

a. Numerical noise

First, only the experiments that exhibit a CM or CV greater than 1 are considered; those experiments with a noise ratio below 1 are withheld from the samples. The correlations between δR¯P and δR¯A for both response variables increase as a result of removing the "noisy" experiments, particularly at forecast hour 18. Observations assimilated at forecast hours 18 and 12 have stronger response-mean change correlations when noise is used to subset the sample (cf. Tables 5 and 3), perhaps indicating that numerical noise is a contributing factor to poor predictions of response-mean change. Forecast hour 18 remains the best prediction time, with a correlation between δR¯P and δR¯A above 0.6 for both response variables, whereas forecast hours 12 and 6 have CCs < 0.2; the forecast-hour-6 sample still has a negative correlation for both response variables (Tables 5 and 6). Similar improvements in correlation are seen for nontargeted-observation impacts on response mean when they are assimilated at forecast hour 18; the correlations remain small but do slightly improve, from 0.02 to 0.19 (cf. Tables 5 and 3) and from 0.12 to 0.39 (cf. Tables 6 and 4) for the MDBZ and RAIN responses, respectively. Correlations of predicted and actual response-variance changes by targeted observations appear minimally impacted by noise (cf. Tables 5 and 3). Sample regression slopes are largely unchanged, except for the MDBZ regressions between δR¯P and δR¯A (Table 5), which have all progressed closer to one.

Table 5.

As in Table 3, but only considering observations that induce changes in response variance that exceed the chaos-induced response. Sample sizes are included for reference.

Table 6.

As in Table 5, but for the RAIN response.


As CM and CV are increased, the sample correlations between predicted and actual changes of MDBZ response mean and variance increase and decrease, respectively, regardless of assimilation time (Fig. 9). The correlation coefficients that describe changes to response mean are a function of CM and increase by as much as 0.25 as the sample size of experiments drops from 200 to < 75 for all times (Figs. 9a,c,e). Interestingly, the ESA-based predictions of response-variance changes are negatively impacted by removing experiments that have a large component of noise in the experiment signal (Figs. 9b,d,f), which could hint at the rather complex variance distribution for a highly nonlinear response metric like MDBZ. The same correlation changes as a function of noise ratio are found for the RAIN response (Fig. 10), but the RAIN variance correlations are not affected as much as those for MDBZ; they remain nearly constant despite removal of more than three quarters of the experiments (Figs. 10b,d,f). Furthermore, the correlation coefficient of predicted and actual impacts on response mean at assimilation hour 18 is nearly perfect when only the largest noise ratios are considered (i.e., a small proportion of the experiment impact is explained by noise). These results suggest that noisy observation increments play a role in ESA's ability to predict observation impacts on response distributions.
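The threshold sweep behind Figs. 9 and 10 can be sketched as follows, assuming arrays of per-experiment predicted changes, actual changes, and noise ratios; the function name is illustrative.

```python
import numpy as np

def cc_versus_noise_threshold(predicted, actual, ratios, thresholds):
    """Correlation between predicted and actual response changes after discarding
    experiments whose noise ratio falls below each threshold.
    Returns a list of (threshold, sample_size, correlation) tuples."""
    results = []
    for t in thresholds:
        keep = ratios >= t
        n = int(keep.sum())
        # Correlation is undefined for tiny samples; report NaN there.
        cc = float(np.corrcoef(predicted[keep], actual[keep])[0, 1]) if n >= 3 else float("nan")
        results.append((t, n, cc))
    return results
```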

Fig. 9.

Correlation coefficients between predicted and actual changes of MDBZ (left) mean and (right) variance as a consequence of increasing the noise ratios CM and CV (see the text for a description of each ratio) at assimilation hours (a),(b) 6; (c),(d) 12; and (e),(f) 18. Black lines denote the sample size used to compute the correlation coefficient.


Fig. 10.

As in Fig. 9, but the response variable is RAIN.


b. Nonlinearity influences

Additional experiments are conducted to further elucidate the impact of nonlinearity on the assessment of ESA-based predictions. As previously discussed, the ESA method is developed using linear assumptions to describe the relationship between response and initial-condition variables. This assumption may not be valid at longer lead times when convection is present but may approach validity at shorter lead times, as seen through the analysis of the targeted-observation experiments above. However, if convection were removed from the forecast, then perturbations introduced by observation assimilation would evolve more linearly through the forecast period, even for high-resolution forecasts. Therefore, a series of targeted- and nontargeted-observation experiments is conducted for a separate forecast case (23 January 2013) in which little convection is present within the verification domain. Similar targeting procedures are followed as outlined in section 2, but the response variable is geopotential height at sigma level 25 in an area of high forecast spread. Temperature and altimeter observations are selected in target and nontarget locations as before and assimilated at forecast hours 6, 12, 18, and 24. In this particular case, the response function is valid at hour 24, and an assimilation evaluation is attempted at the response time to understand how the EnKF assimilation spreads observational information directly onto the response function through state-variable covariances, without forecast integration. Such an evaluation is not possible for the MDBZ and RAIN responses, since a number of the variables used to derive these responses are not contained within the WRF state vector.

Targeted observations demonstrate a strong relationship between predicted and actual changes of response mean at all forecast times (Fig. 11a), with sample regressions that deviate more from the one-to-one predicted relationship as lead time increases. The linear regressions of the δR¯ samples suggest that the one-to-one predicted relationship is more valid for observations assimilated at shorter lead times, as the 24- and 18-h regressions have the largest slopes. Targeted observations assimilated at hour 6 produce no meaningful changes to geopotential height at hour 24 (Fig. 11a). Nontargeted observations also have strong correlations at forecast hours 24 and 18 but exhibit smaller impacts on the response than targeted observations (Fig. 11c). In addition, the changes to response variance for targeted observations assimilated at hours 24 and 18 are all below zero (Fig. 11b). Nontargeted observations have poorer correlations between predicted and actual changes of response variance, as response variance is increased for experiments at a number of assimilation hours (Fig. 11d). Nontargeted observations assimilated at hour 24 do not produce impacts on the response variance that are well predicted by ESA; the impacts appear both positively and negatively biased. Moreover, the predicted change of response mean should best align with actual changes at forecast hour 24, since no model integration follows assimilation. This is indeed realized: both targeted and nontargeted observations have nearly perfect prediction of observation impacts on the response mean when no model integration is done (Figs. 11a,c, purple stars and lines).

Fig. 11.

Predicted changes vs actual changes of (left) response mean (m) and (right) response variance (m2) for (a),(b) targeted observations and (c),(d) nontargeted observations in the nonconvection case referenced in the text. Red diamonds correspond to observations assimilated at forecast hour 6, blue triangles correspond to observations assimilated at forecast hour 12, green circles correspond to observations assimilated at forecast hour 18, and purple stars correspond to observations assimilated at forecast hour 24. Colored lines represent the linear regressions for their colored sample populations.


The nonconvection case presented illustrates the complexities of nonlinearity and its influence on targeted observing for convection. Observation impacts on a nonconvective response are more predictable than those of observations assimilated for the convection forecasts because the initial perturbations from observation assimilation evolve more linearly through the forecast period. Subjectively, the changes in response mean are strongly correlated with actual impacts, while changes to response variance are nearly always negative (except for nontargeted observations). This indicates that, unsurprisingly, nonlinearity is negatively impacting the performance of targeted observations in the convection cases presented, contributing to unpredictable changes to the response distributions. Additionally, it should be noted that the GPH response is vastly different from the MDBZ and RAIN responses used in the convection cases. For one, GPH is a continuous field across the domain, whereas MDBZ and RAIN are inherently localized and have sharp horizontal gradients where convection occurs. Moreover, GPH is a simple conversion of a model state variable (perturbation geopotential), whereas the convection responses are derived from complex functions of hydrometeor variables. However, our nonconvection-case results are largely independent of the response variable chosen; when GPH is used to target observations in a convection case (not shown), the observation impacts are poorly correlated with the predicted impacts, particularly when observations are assimilated at hours 6 and 12, as they were for the MDBZ and RAIN responses in the majority of the convection experiments explored. These results further support prior evidence that ESA-based predictions are most accurate at short lead times, and therefore ESA-based targeting for nonlinear forecasts is most applicable at shorter time scales.

5. Summary and discussion

Ensemble sensitivity analysis (ESA) is applied to numerous convection cases across three spring seasons to critically evaluate the performance of an observation-targeting algorithm at convection-allowing scales. ESA linearly relates chosen forecast metrics (composite reflectivity and 3-hourly accumulated rainfall) at various forecast times to prior model state variables, or functions of state variables, to evaluate how small initial-condition perturbations will project onto the forecast. Combined with this sensitivity information is an estimate of prior uncertainty, which yields optimal locations to target new observations that will reduce the variance of the chosen forecast metrics, along with estimates of the variance reduction. Observing system simulation experiments with perfect models are utilized to control for model error and simulate truth, which additionally allows for four-dimensional sampling to generate new observations. Targeted and nontargeted observations are selected in locations determined by the ESA-based algorithm and assimilated into ensemble forecast analyses. The assimilation configuration, assimilation time, observation type, and response variable are varied to evaluate the different factors that lead to successful or failed predictions of impacts on the response distributions.

Targeted- and nontargeted-observation impacts are compared to determine whether the ESA methodology can effectively discriminate between good and bad sampling regions. It is initially determined that targeted and nontargeted observations provide similar-magnitude impacts on the response distributions. However, when comparing the convective cases to a case in which convection is effectively removed, it is clear that the similarities in targeted- and nontargeted-observation impacts in the convective cases are due in large part to nonlinear perturbation evolution. Both targeted and nontargeted observations spread increments into the moist dynamics; those increments then grow nonlinearly and filter into MDBZ and RAIN, which are inherently tied to the moist dynamic processes of the model.

Whereas changes to the response mean aligned with ESA-based predictions under certain experiment configurations, changes to response variance were less predictable for most experiment permutations. It was found that observations increased and decreased response variance in roughly equal measure, except when observations were assimilated at short lead times. These results further support the conclusion that the assimilation increments propagate nonlinearly onto the response variables, and in particular onto their variance. A comparison of the 10 convective cases with a nonconvective case supported these claims.

The data assimilation configuration that did not include localization yielded the best predictions of response-distribution changes, particularly with respect to the response mean. This result is not too surprising, given that the derivation of the ESA-targeting methodology, and of the estimated impacts, follows that of an EnKF assimilation with no localization. The impact of localization is dramatic, as it truncates the observation's impact on the response distribution. It is plausible that allowing a single observation to update the entire state space allows the observational information to spread to the response metrics dynamically through the adjoint sensitivity. Given that ensemble sensitivity is the projection of adjoint sensitivity onto the ensemble statistics (Ancell and Hakim 2007), allowing the observation to update the entire domain connects the observation, no matter its location, to the true dynamic sensitivity. When the analysis increment is restricted to a certain localization radius, this dynamic connection may be lost. Furthermore, it is entirely possible that the prescribed localization radius may be optimal for the assimilation of a normal observation set (i.e., millions of observations) but too severe for small observation sets or single-observation assimilation experiments such as those in this study.

To investigate any influence from numerical noise, additional experiments were conducted to provide a baseline change in response mean and variance from noise (i.e., random perturbations) and to evaluate whether targeted observations were perturbing the response significantly more than noise. When experiments were stratified by the ratio of targeting-experiment response change to noise-experiment response change, the response-mean correlations increased as the ratio increased, suggesting that when observation impacts are large relative to numerical noise, they are more predictable. As the ratios increased, changes to MDBZ variance became less predictable, whereas predictability of RAIN-variance changes remained the same, highlighting the nonlinear and linear characteristics of the MDBZ and RAIN response types, respectively. Overall, while targeted- and nontargeted-observation assimilation and the associated impacts were not completely predictable under any configuration, it was apparent that the best predictability in this EnKF assimilation framework arises under a very idealized modeling configuration with no model error, no localization, and short lead times, supporting the conclusion that targeted-observing estimates are more accurate for nonlinear convection forecasts at short time scales. Madaus and Hakim (2017) similarly showed that convection-initiation prediction in the absence of strong forcing only benefited from observations within 1–2 h of initiation.

A number of caveats to these conclusions exist, however, given the idealized nature of the experimental design, with significant implications for real-time targeting applications. First, these OSSEs only consider single-observation sampling. Real-data experiments would ideally target a location and sample either at multiple levels or at recurring times, providing more numerous observations that could constrain the analysis toward reality and produce large, meaningful impacts on the forecast. Ancell and Hakim (2007) describe the process of targeting two observations and suggest that calculating the variance reduction for a few observations would be trivial. However, they also stress the computational requirement of iteratively selecting numerous observations and the potential diminishing returns on variance reduction after a few observations are assimilated. Additionally, the single-observation assimilation experiments presented may artificially introduce gravity wave noise through the forecast period because of initial dynamic imbalances, with inconsistencies between the analysis increment and the boundary conditions reflecting off the model domain boundaries (not shown). Supplementary analysis suggests that this noise gradually dampens with time before the response variables are calculated (not shown).

Second, these OSSEs disregard regularly available observations from satellite platforms, radiosondes, etc., that would potentially influence the predicted value of targeted observations, particularly if there are redundant observations in target locations. It should be expected that targeted mesoscale observations would have even smaller impacts if considered within a real forecasting system since the background analysis would theoretically be improved with routine observations. In fact, this complication was mentioned specifically by Romine et al. (2016) as a limiting factor for targeting operations during MPEX, when routine analysis cycles were not accounted for between target selection and sampling, potentially removing the need for targeted observations. Additionally, the forecast metrics used are inherently tied to processes particularly difficult to model accurately (e.g., microphysics). Perhaps a metric tied to dryline position, for example, would yield better ESA predictions of observation impacts.

It is important to compare these idealized results with those obtained previously for synoptic-scale systems (e.g., Ancell and Hakim 2007; Torn and Hakim 2008). Torn and Hakim (2008) documented that both response-mean and response-variance changes were predictable as the result of denying a single pressure observation, with much higher correlations than observed in these experiments. Particularly interesting, they found that changes to response variance were better predicted by ESA than changes to response mean, which they attributed to sampling error under the pretense of no significant nonlinearity or model error. Moreover, Torn and Hakim (2008) noted the ability of ESA to predict changes to response error (sea level pressure error) as well. This analysis step was not completed in this work because of the complexity of verifying rainfall and reflectivity in a gridpoint manner; object-based verification would be the proper method, which removes the applicability of gridpoint-based ESA error-reduction analyses. Additional tests could validate ESA on scales of < 3 km with analyses of continuous forecast parameters (e.g., surface pressure), which can be assessed appropriately on a grid.

The results of this work motivate additional research that could elucidate the validity of targeted observing for convection forecasts. Additional OSSEs could be implemented, either under perfect-model conditions or not, that assimilate multiple targeted observations and take into account additional analysis cycles between ensemble initiation and target sampling. These experiments would allow for a more robust examination of target observation types (e.g., moisture) and response variables, sampling all vertical levels of a radiosonde versus continuous sampling at the surface, or mobile observations at the surface (e.g., StickNet; Hill et al. 2020; mobile mesonets) and aloft (e.g., uncrewed aircraft systems). One of the more important avenues of exploration that needs to be considered is the application of multivariate regressions. Specifically, a multivariate approach that selects the most important state variable to sample (e.g., ANOVA test) could yield more impactful adjustments to the response distribution than subjectively chosen observation types. As targeted observation studies go to smaller and smaller scales (e.g., Limpert and Houston 2018; Kerr et al. 2019; Kerr and Wang 2020), the utility of other statistical techniques may be necessary to capture the relationships of state variables and how they covary with one another. In particular, this work does not consider collinearity between targeted-observation variables, which would undoubtedly need to be addressed when observations are sampled at the same point in space. Although this work does not close the door on real-time applications of targeted observing for convection forecasts, it certainly brings to light some of the complexities and limitations for its use that motivate continued evaluation.

Acknowledgments

The authors acknowledge high-performance computing support from Cheyenne (doi:10.5065/D6RX99HX) provided by NCAR’s Computational and Information Systems Laboratory, sponsored by the National Science Foundation. We also acknowledge Dr. Glen Romine for thoughtful and insightful input into the methodology of this work during the first author’s Graduate Visitor stint at the National Center for Atmospheric Research. We appreciate the thoughtful comments and suggestions from Dr. Luke Madaus and two anonymous reviewers during the review process. This work was supported by NOAA Award NA17NWS4680003 and NSF Grant IIS-1527183.

REFERENCES

  • Aberson, S. D., 2003: Targeted observations to improve operational tropical cyclone track forecast guidance. Mon. Wea. Rev., 131, 1613–1628, https://doi.org/10.1175//2550.1.

  • Ancell, B. C., 2016: Improving high-impact forecasts through sensitivity-based ensemble subsets: Demonstration and initial tests. Wea. Forecasting, 31, 1019–1036, https://doi.org/10.1175/WAF-D-15-0121.1.

  • Ancell, B. C., and C. F. Mass, 2006: Structure, growth rates, and tangent linear accuracy of adjoint sensitivities with respect to horizontal and vertical resolution. Mon. Wea. Rev., 134, 2971–2988, https://doi.org/10.1175/MWR3227.1.

  • Ancell, B. C., and G. J. Hakim, 2007: Comparing adjoint- and ensemble-sensitivity analysis with applications to observation targeting. Mon. Wea. Rev., 135, 4117–4134, https://doi.org/10.1175/2007MWR1904.1.

  • Ancell, B. C., A. Bogusz, M. J. Lauridsen, and C. J. Nauert, 2018: Seeding chaos: The dire consequences of numerical noise in NWP perturbation experiments. Bull. Amer. Meteor. Soc., 99, 615–628, https://doi.org/10.1175/BAMS-D-17-0129.1.

  • Anderson, J. L., 2001: An ensemble adjustment Kalman filter for data assimilation. Mon. Wea. Rev., 129, 2884–2903, https://doi.org/10.1175/1520-0493(2001)129<2884:AEAKFF>2.0.CO;2.

  • Anderson, J. L., 2007: An adaptive covariance inflation error correction algorithm for ensemble filters. Tellus, 59A, 210–224, https://doi.org/10.1111/j.1600-0870.2006.00216.x.

  • Anderson, J. L., 2009: Spatially and temporally varying adaptive covariance inflation for ensemble filters. Tellus, 61A, 72–83, https://doi.org/10.1111/j.1600-0870.2008.00361.x.

  • Arnold, C., and C. Dey, 1986: Observing-systems simulation experiments: Past, present, and future. Bull. Amer. Meteor. Soc., 67, 687–695, https://doi.org/10.1175/1520-0477(1986)067<0687:OSSEPP>2.0.CO;2.

  • Atlas, R., 1997: Atmospheric observations and experiments to assess their usefulness in data assimilation (Special Issue: Data assimilation in meteorology and oceanography: Theory and practice). J. Meteor. Soc. Japan, 75, 111–130, https://doi.org/10.2151/jmsj1965.75.1B_111.

  • Barker, D. M., W. Huang, Y.-R. Guo, J. Bourgeois, and Q. N. Xiao, 2004: A three-dimensional variational data assimilation system for MM5: Implementation and initial results. Mon. Wea. Rev., 132, 897–914, https://doi.org/10.1175/1520-0493(2004)132<0897:ATVDAS>2.0.CO;2.

  • Bednarczyk, C. N., and B. C. Ancell, 2015: Ensemble sensitivity analysis applied to a southern plains convective event. Mon. Wea. Rev., 143, 230–249, https://doi.org/10.1175/MWR-D-13-00321.1.

  • Bergot, T., 1999: Adaptive observations during FASTEX: A systematic survey of upstream flights. Quart. J. Roy. Meteor. Soc., 125, 3271–3298, https://doi.org/10.1002/qj.49712556108.

  • Bergot, T., 2001: Influence of the assimilation scheme on the efficiency of adaptive observations. Quart. J. Roy. Meteor. Soc., 127, 635–660, https://doi.org/10.1002/qj.49712757219.

  • Berliner, L., Z. Lu, and C. Snyder, 1999: Statistical design for adaptive weather observations. J. Atmos. Sci., 56, 2536–2552, https://doi.org/10.1175/1520-0469(1999)056<2536:SDFAWO>2.0.CO;2.

  • Berman, J. D., and R. D. Torn, 2019: The impact of initial condition and warm conveyor belt forecast uncertainty on variability in the downstream waveguide in an ECMWF case study. Mon. Wea. Rev., 147, 4071–4089, https://doi.org/10.1175/MWR-D-18-0333.1.

  • Berman, J. D., R. D. Torn, G. S. Romine, and M. L. Weisman, 2017: Sensitivity of Northern Great Plains convection forecasts to upstream and downstream forecast errors. Mon. Wea. Rev., 145, 2141–2163, https://doi.org/10.1175/MWR-D-16-0353.1.

  • Brown, B. R., and G. J. Hakim, 2015: Sensitivity of intensifying Atlantic hurricanes to vortex structure. Quart. J. Roy. Meteor. Soc., 141, 2538–2551, https://doi.org/10.1002/qj.2540.

  • Bryan, G. H., J. C. Wyngaard, and J. M. Fritsch, 2003: Resolution requirements for the simulation of deep moist convection. Mon. Wea. Rev., 131, 2394–2416, https://doi.org/10.1175/1520-0493(2003)131<2394:RRFTSO>2.0.CO;2.

  • Buizza, R., and A. Montani, 1999: Targeting observations using singular vectors. J. Atmos. Sci., 56, 2965–2985, https://doi.org/10.1175/1520-0469(1999)056<2965:TOUSV>2.0.CO;2.

  • Chang, E. K. M., M. Zheng, and K. Raeder, 2013: Medium-range ensemble sensitivity analysis of two extreme Pacific extratropical cyclones. Mon. Wea. Rev., 141, 211–231, https://doi.org/10.1175/MWR-D-11-00304.1.

  • Chen, F., and J. Dudhia, 2001: Coupling an advanced land surface–hydrology model with the Penn State–NCAR MM5 modeling system. Part I: Model implementation and sensitivity. Mon. Wea. Rev., 129, 569–585, https://doi.org/10.1175/1520-0493(2001)129<0569:CAALSH>2.0.CO;2.

  • Coniglio, M. C., S. M. Hitchcock, and K. H. Knopfmeier, 2016: Impact of assimilating pre-convective upsonde observations on short-term forecasts of convection observed during MPEX. Mon. Wea. Rev., 144, 4301–4325, https://doi.org/10.1175/MWR-D-16-0091.1.

  • Coniglio, M. C., G. S. Romine, D. D. Turner, and R. D. Torn, 2019: Impacts of targeted AERI and Doppler lidar wind retrievals on short-term forecasts of the initiation and early evolution of thunderstorms. Mon. Wea. Rev., 147, 1149–1170, https://doi.org/10.1175/MWR-D-18-0351.1.

  • Diaconescu, E. P., and R. Laprise, 2012: Singular vectors in atmospheric sciences: A review. Earth-Sci. Rev., 113, 161–175, https://doi.org/10.1016/j.earscirev.2012.05.005.

  • Doyle, J. D., C. A. Reynolds, C. Amerault, and J. Moskaitis, 2012: Adjoint sensitivity and predictability of tropical cyclogenesis. J. Atmos. Sci., 69, 3535–3557, https://doi.org/10.1175/JAS-D-12-0110.1.

  • Doyle, J. D., C. Amerault, C. A. Reynolds, and P. A. Reinecke, 2014: Initial condition sensitivity and predictability of a severe extratropical cyclone using a moist adjoint. Mon. Wea. Rev., 142, 320–342, https://doi.org/10.1175/MWR-D-13-00201.1.

  • Doyle, J. D., C. A. Reynolds, and C. Amerault, 2019: Adjoint sensitivity analysis of high-impact extratropical cyclones. Mon. Wea. Rev., 147, 4511–4532, https://doi.org/10.1175/MWR-D-19-0055.1.

  • Dudhia, J., 1989: Numerical study of convection observed during the Winter Monsoon Experiment using a mesoscale two-dimensional model. J. Atmos. Sci., 46, 3077–3107, https://doi.org/10.1175/1520-0469(1989)046<3077:NSOCOD>2.0.CO;2.

  • Errico, R. M., 1997: What is an adjoint model? Bull. Amer. Meteor. Soc., 78, 2577–2591, https://doi.org/10.1175/1520-0477(1997)078<2577:WIAAM>2.0.CO;2.

  • Evensen, G., 1994: Sequential data assimilation with a nonlinear quasi-geostrophic model using Monte Carlo methods to forecast error statistics. J. Geophys. Res., 99, 10 143–10 162, https://doi.org/10.1029/94JC00572.

  • Gao, J., C. Fu, D. J. Stensrud, and J. S. Kain, 2016: OSSEs for an ensemble-3DVAR data assimilation system with radar observations of convective storms. J. Atmos. Sci., 73, 2403–2426, https://doi.org/10.1175/JAS-D-15-0311.1.

  • Garcies, L., and V. Homar, 2009: Ensemble sensitivities of the real atmosphere: Application to Mediterranean intense cyclones. Tellus, 61A, 394–406, https://doi.org/10.1111/j.1600-0870.2009.00392.x.

  • Garcies, L., and V. Homar, 2010: An optimized ensemble sensitivity climatology of Mediterranean intense cyclones. Nat. Hazards Earth Syst. Sci., 10, 2441–2450, https://doi.org/10.5194/nhess-10-2441-2010.

  • Gaspari, G., and S. E. Cohn, 1999: Construction of correlation functions in two and three dimensions. Quart. J. Roy. Meteor. Soc., 125, 723–757, https://doi.org/10.1002/qj.49712555417.

  • Hakim, G. J., and R. D. Torn, 2008: Ensemble synoptic analysis. Synoptic-Dynamic Meteorology and Weather Analysis and Forecasting: A Tribute to Fred Sanders, Meteor. Monogr., No. 55, Amer. Meteor. Soc., 147–161.

  • Hamill, T. M., 2001: Interpretation of rank histograms for verifying ensemble forecasts. Mon. Wea. Rev., 129, 550–560, https://doi.org/10.1175/1520-0493(2001)129<0550:IORHFV>2.0.CO;2.

  • Hill, A. J., C. C. Weiss, and B. C. Ancell, 2016: Ensemble sensitivity analysis for mesoscale forecasts of dryline convection initiation. Mon. Wea. Rev., 144, 4161–4182, https://doi.org/10.1175/MWR-D-15-0338.1.

  • Hill, A. J., C. C. Weiss, and B. C. Ancell, 2018: Towards improving forecasts of severe convection along the dryline through targeted observing with ensemble sensitivity analysis. 29th Conf. on Severe Local Storms, Stowe, VT, Amer. Meteor. Soc., 14.2, https://ams.confex.com/ams/29SLS/webprogram/Paper348727.html.

  • Hill, A. J., C. C. Weiss, and D. C. Dowell, 2020: Assimilating near-surface observations from a portable mesoscale network of StickNet platforms during VORTEX-SE with the High-Resolution Rapid Refresh ensemble. Severe Local Storms Symp., Boston, MA, Amer. Meteor. Soc., 369006, https://ams.confex.com/ams/2020Annual/webprogram/Paper369006.html.

  • Hong, S., Y. Noh, and J. Dudhia, 2006: A new vertical diffusion package with an explicit treatment of entrainment processes. Mon. Wea. Rev., 134, 2318–2341, https://doi.org/10.1175/MWR3199.1.

  • Hoskins, B. J., and M. M. Coutinho, 2005: Moist singular vectors and the predictability of some high impact European cyclones. Quart. J. Roy. Meteor. Soc., 131, 581–601, https://doi.org/10.1256/qj.04.48.

  • Houtekamer, P., and H. Mitchell, 1998: Data assimilation using an ensemble Kalman filter technique. Mon. Wea. Rev., 126, 796–811, https://doi.org/10.1175/1520-0493(1998)126<0796:DAUAEK>2.0.CO;2.

  • Insurance Information Institute, 2019: Facts + statistics: Tornadoes and thunderstorms. Accessed 13 October 2019, https://www.iii.org/fact-statistic/tornadoes-and-thunderstorms.

  • Joly, A., and Coauthors, 1997: The Fronts and Atlantic Storm-Track Experiment (FASTEX): Scientific objectives and experimental design. Bull. Amer. Meteor. Soc., 78, 1917–1940, https://doi.org/10.1175/1520-0477(1997)078<1917:TFAAST>2.0.CO;2.

  • Joly, A., and Coauthors, 1999: Overview of the field phase of the Fronts and Atlantic Storm-Track Experiment (FASTEX) project. Quart. J. Roy. Meteor. Soc., 125, 3131–3163, https://doi.org/10.1002/qj.49712556103.

  • Kain, J. S., 2004: The Kain–Fritsch convective parameterization: An update. J. Appl. Meteor., 43, 170–181, https://doi.org/10.1175/1520-0450(2004)043<0170:TKCPAU>2.0.CO;2.

  • Kain, J. S., and Coauthors, 2013: A feasibility study for probabilistic convection initiation forecasts based on explicit numerical guidance. Bull. Amer. Meteor. Soc., 94, 1213–1225, https://doi.org/10.1175/BAMS-D-11-00264.1.

  • Kalman, R., 1960: A new approach to linear filtering and prediction problems. J. Basic Eng., 82, 35–45, https://doi.org/10.1115/1.3662552.

  • Kerr, C. A., and X. Wang, 2020: Ensemble-based targeted observation method applied to radar radial velocity observations on idealized supercell low-level rotation forecasts: A proof of concept. Mon. Wea. Rev., 148, 877–890, https://doi.org/10.1175/MWR-D-19-0197.1.

  • Kerr, C. A., D. J. Stensrud, and X. Wang, 2019: Diagnosing convective dependencies on near-storm environments using ensemble sensitivity analyses. Mon. Wea. Rev., 147, 495–517, https://doi.org/10.1175/MWR-D-18-0140.1.

  • Khare, S. P., and J. L. Anderson, 2006: An examination of ensemble filter based adaptive observation methodologies. Tellus, 58A, 179–195, https://doi.org/10.1111/j.1600-0870.2006.00163.x.

  • Kim, H. M., and B. J. Jung, 2009: Singular vector structure and evolution of a recurving tropical cyclone. Mon. Wea. Rev., 137, 505–524, https://doi.org/10.1175/2008MWR2643.1.

  • Lange, H., and G. C. Craig, 2014: The impact of data assimilation length scales on analysis and prediction of convective storms. Mon. Wea. Rev., 142, 3781–3808, https://doi.org/10.1175/MWR-D-13-00304.1.

  • Langland, R. H., 2005: Issues in targeted observing. Quart. J. Roy. Meteor. Soc., 131, 3409–3425, https://doi.org/10.1256/qj.05.130.

  • Langland, R. H., and Coauthors, 1999: The North Pacific Experiment (NORPEX-98): Targeted observations for improved North American weather forecasts. Bull. Amer. Meteor. Soc., 80, 1363–1384, https://doi.org/10.1175/1520-0477(1999)080<1363:TNPENT>2.0.CO;2.

  • LeDimet, F., and O. Talagrand, 1986: Variational algorithms for analysis and assimilation of meteorological observations: Theoretical aspects. Tellus, 38A, 97–110, https://doi.org/10.1111/j.1600-0870.1986.tb00459.x.

  • Limpert, G. L., and A. L. Houston, 2018: Ensemble sensitivity analysis for targeted observations of supercell thunderstorms. Mon. Wea. Rev., 146, 1705–1721, https://doi.org/10.1175/MWR-D-17-0029.1.

  • Liu, C., Q. Xiao, and B. Wang, 2009: An ensemble-based four-dimensional variational data assimilation scheme. Part II: Observing system simulation experiments with Advanced Research WRF (ARW). Mon. Wea. Rev., 137, 1687–1704, https://doi.org/10.1175/2008MWR2699.1.

  • Madaus, L. E., and G. J. Hakim, 2017: Constraining ensemble forecasts of discrete convective initiation with surface observations. Mon. Wea. Rev., 145, 2597–2610, https://doi.org/10.1175/MWR-D-16-0395.1.

  • Madaus, L. E., G. J. Hakim, and C. F. Mass, 2014: Utility of dense pressure observations for improving mesoscale analyses and forecasts. Mon. Wea. Rev., 142, 2398–2413, https://doi.org/10.1175/MWR-D-13-00269.1.

  • Majumdar, S. J., 2016: A review of targeted observations. Bull. Amer. Meteor. Soc., 97, 2287–2303, https://doi.org/10.1175/BAMS-D-14-00259.1.

  • Masutani, M., and Coauthors, 2010: Observing system simulation experiments at the National Centers for Environmental Prediction. J. Geophys. Res., 115, D07101, https://doi.org/10.1029/2009JD012528.

  • McMurdie, L. A., and B. Ancell, 2014: Predictability characteristics of land-falling cyclones along the North American west coast. Mon. Wea. Rev., 142, 301–319, https://doi.org/10.1175/MWR-D-13-00141.1.

  • Mlawer, E. J., S. J. Taubman, P. D. Brown, M. J. Iacono, and S. Clough, 1997: Radiative transfer for inhomogeneous atmospheres: RRTM, a validated correlated-k model for the longwave. J. Geophys. Res., 102, 16 663–16 682, https://doi.org/10.1029/97JD00237.

  • Oliphant, T. E., 2007: Python for scientific computing. Comput. Sci. Eng., 9, 10–20, https://doi.org/10.1109/MCSE.2007.58.

  • Palmer, T. N., R. Gelaro, J. Barkmeijer, and R. Buizza, 1998: Singular vectors, metrics, and adaptive observations. J. Atmos. Sci., 55, 633–653, https://doi.org/10.1175/1520-0469(1998)055<0633:SVMAAO>2.0.CO;2.

  • Reynolds, C. A., J. D. Doyle, R. M. Hodur, and H. Jin, 2010: Naval Research Laboratory multiscale targeting guidance for T-PARC and TCS-08. Wea. Forecasting, 25, 526–544, https://doi.org/10.1175/2009WAF2222292.1.

  • Reynolds, C. A., J. D. Doyle, F. M. Ralph, and R. Demirdjian, 2019: Adjoint sensitivity of North Pacific atmospheric river forecasts. Mon. Wea. Rev., 147, 1871–1897, https://doi.org/10.1175/MWR-D-18-0347.1.

  • Romine, G. S., C. S. Schwartz, R. D. Torn, and M. L. Weisman, 2016: Impact of assimilating dropsonde observations from MPEX on ensemble forecasts of severe weather events. Mon. Wea. Rev., 144, 3799–3823, https://doi.org/10.1175/MWR-D-15-0407.1.

  • Skamarock, W. C., and Coauthors, 2008: A description of the Advanced Research WRF version 3. NCAR Tech. Note NCAR/TN-475+STR, 113 pp., https://doi.org/10.5065/D68S4MVH.

  • Smith, N. H., and B. C. Ancell, 2017: Ensemble sensitivity analysis of wind ramp events with applications to observation targeting. Mon. Wea. Rev., 145, 2505–2522, https://doi.org/10.1175/MWR-D-16-0306.1.

  • Sobash, R., and D. Stensrud, 2013: The impact of covariance localization for radar data on EnKF analyses of a developing MCS: Observing system simulation experiments. Mon. Wea. Rev., 141, 3691–3709, https://doi.org/10.1175/MWR-D-12-00203.1.

  • Sobash, R., C. S. Schwartz, G. S. Romine, K. R. Fossell, and M. L. Weisman, 2015: Severe weather prediction using storm surrogates from an ensemble forecasting system. Wea. Forecasting, 31, 255–271, https://doi.org/10.1175/WAF-D-15-0138.1.

  • Szunyogh, I., and Z. Toth, 2002: Propagation of the effect of targeted observations: The 2000 winter storm reconnaissance program. Mon. Wea. Rev., 130, 1144–1165, https://doi.org/10.1175/1520-0493(2002)130<1144:POTEOT>2.0.CO;2.

  • Thompson, G., R. M. Rasmussen, and K. Manning, 2004: Explicit forecasts of winter precipitation using an improved bulk microphysics scheme. Part I: Description and sensitivity analysis. Mon. Wea. Rev., 132, 519–543, https://doi.org/10.1175/1520-0493(2004)132<0519:EFOWPU>2.0.CO;2.

  • Torn, R. D., 2010: Ensemble-based sensitivity analysis applied to African easterly waves. Wea. Forecasting, 25, 61–78, https://doi.org/10.1175/2009WAF2222255.1.

  • Torn, R. D., 2014: The impact of targeted dropwindsonde observations on tropical cyclone intensity forecasts of four weak systems during PREDICT. Mon. Wea. Rev., 142, 2860–2878, https://doi.org/10.1175/MWR-D-13-00284.1.

  • Torn, R. D., and G. J. Hakim, 2008: Ensemble-based sensitivity analysis. Mon. Wea. Rev., 136, 663–677, https://doi.org/10.1175/2007MWR2132.1.

  • Torn, R. D., and G. J. Hakim, 2009: Initial condition sensitivity of western Pacific extratropical transitions determined using ensemble-based sensitivity analysis. Mon. Wea. Rev., 137, 3388–3406, https://doi.org/10.1175/2009MWR2879.1.

  • Torn, R. D., and D. Cook, 2013: The role of vortex and environment errors in genesis forecasts of Hurricanes Danielle and Karl (2010). Mon. Wea. Rev., 141, 232–251, https://doi.org/10.1175/MWR-D-12-00086.1.

  • Torn, R. D., and G. S. Romine, 2015: Sensitivity of central Oklahoma convection forecasts to upstream potential vorticity anomalies during two strongly forced cases during MPEX. Mon. Wea. Rev., 143, 4064–4087, https://doi.org/10.1175/MWR-D-15-0085.1.

  • Torn, R. D., G. J. Hakim, and C. Snyder, 2006: Boundary conditions for limited-area ensemble Kalman filters. Mon. Wea. Rev., 134, 2490–2502, https://doi.org/10.1175/MWR3187.1.

  • Torn, R. D., G. S. Romine, and T. J. Galarneau, 2017: Sensitivity of dryline convection forecasts to upstream forecast errors for two weakly forced MPEX cases. Mon. Wea. Rev., 145, 1831–1852, https://doi.org/10.1175/MWR-D-16-0457.1.

  • Weisman, M. L., and Coauthors, 2015: The Mesoscale Predictability Experiment (MPEX). Bull. Amer. Meteor. Soc., 96, 2127–2149, https://doi.org/10.1175/BAMS-D-13-00281.1.

  • Wile, S. M., J. P. Hacker, and K. H. Chilcoat, 2015: The potential utility of high-resolution ensemble sensitivity analysis for observation placement during weak flow in complex terrain. Wea. Forecasting, 30, 1521–1536, https://doi.org/10.1175/WAF-D-14-00066.1.

  • Xie, B., F. Zhang, Q. Zhang, J. Poterjoy, and Y. Weng, 2013: Observing strategy and observation targeting for tropical cyclones using ensemble-based sensitivity analysis and data assimilation. Mon. Wea. Rev., 141, 1437–1453, https://doi.org/10.1175/MWR-D-12-00188.1.

  • Yussouf, N., and D. J. Stensrud, 2010: Impact of phased-array radar observations over a short assimilation period: Observing system simulation experiments using an ensemble Kalman filter. Mon. Wea. Rev., 138, 517–538, https://doi.org/10.1175/2009MWR2925.1.

  • Zack, J., E. Natenberg, S. Young, G. V. Knowe, K. Waight, J. Manobianco, and C. Kamath, 2010a: Application of ensemble sensitivity analysis to observation targeting for short-term wind speed forecasting in the Tehachapi region winter season. Lawrence Livermore National Laboratory Tech. Rep. LLNL-TR-460956, 60 pp., https://computing.llnl.gov/projects/starsapphire-data-driven-modeling-analysis/LLNL-TR-460956.pdf.