• Abuelgasim, A. A., S. Gopal, and A. H. Strahler, 1998: Forward and inverse modeling of canopy directional reflectance using a neural network. Int. J. Remote Sens., 19, 453–471, https://doi.org/10.1080/014311698216099.

  • Aires, F., W. B. Rossow, N. A. Scott, and A. Chédin, 2002: Remote sensing from the infrared atmospheric sounding interferometer instrument: 2. Simultaneous retrieval of temperature, water vapor, and ozone atmospheric profiles. J. Geophys. Res., 107, 4620, https://doi.org/10.1029/2001JD001591.

  • Ball, J. E., D. T. Anderson, and C. S. Chan, 2017: Comprehensive survey of deep learning in remote sensing: Theories, tools, and challenges for the community. J. Appl. Remote Sens., 11, 042609, https://doi.org/10.1117/1.jrs.11.042609.

  • Ban, G.-Y., N. El Karoui, and A. E. B. Lim, 2018: Machine learning and portfolio optimization. Manage. Sci., 64, 1136–1154, https://doi.org/10.1287/mnsc.2016.2644.

  • Beucler, T., M. Pritchard, S. Rasp, P. Gentine, J. Ott, and P. Baldi, 2019: Enforcing analytic constraints in neural-networks emulating physical systems. arXiv, 11 pp., http://arxiv.org/abs/1909.00912.

  • Blackwell, W. J., and Coauthors, 2019: Microwave atmospheric sounding CubeSats: From MicroMAS-2 to TROPICS and beyond. Ninth Conf. on Transition of Research to Operations, Phoenix, AZ, Amer. Meteor. Soc., J3.5, https://ams.confex.com/ams/2019Annual/webprogram/Paper352453.html.

  • Boukabara, S.-A., E. Maddy, A. Neiss, K. Garrett, E. Jones, K. Ide, N. Shahroudi, and K. Kumar, 2017: Exploring using artificial intelligence (AI) for NWP and situational awareness applications. Int. TOVS Study Conf. (ITSC) XXI, Darmstadt, Germany, EUMETSAT, 12.05, https://cimss.ssec.wisc.edu/itwg/itsc/itsc21/program/4december/0830_12.05_AI4DataAssimilAndFusion_BoukabaraEtAl_v7.pdf.

  • Boukabara, S.-A., E. Maddy, K. Ide, K. Garrett, E. Jones, K. Kumar, N. Shahroudi, and A. Neiss, 2018: Exploring using artificial intelligence (AI) for NWP and situational awareness applications. 17th Conf. on Artificial and Computational Intelligence and its Applications to the Environmental Sciences, Austin, TX, Amer. Meteor. Soc., 5.3, https://ams.confex.com/ams/98Annual/webprogram/Paper330911.html.

  • Boukabara, S.-A., V. Krasnopolsky, and J. Q. Stewart, 2019a: Overview of NOAA AI activities in satellite observations and NWP: Status and perspectives. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Tuesday/S1-2_NOAAai2019_Boukabara.pptx.

  • Boukabara, S.-A., V. Krasnopolsky, J. Q. Stewart, S. G. Penny, R. N. Hoffman, and E. Maddy, 2019b: Artificial intelligence may be key to better weather forecasts. Eos, Trans. Amer. Geophys. Union, 100, https://doi.org/10.1029/2019EO129967.

  • Brenowitz, N. D., and C. S. Bretherton, 2018: Prognostic validation of a neural network unified physics parameterization. Geophys. Res. Lett., 45, 6289–6298, https://doi.org/10.1029/2018GL078510.

  • Brunetti, A., D. Buongiorno, G. F. Trotta, and V. Bevilacqua, 2018: Computer vision and deep learning techniques for pedestrian detection and tracking: A survey. Neurocomputing, 300, 17–33, https://doi.org/10.1016/j.neucom.2018.01.092.

  • Campos, R. M., V. Krasnopolsky, J.-H. G. M. Alves, and S. G. Penny, 2019: Nonlinear wave ensemble averaging in the Gulf of Mexico using neural network. J. Atmos. Oceanic Technol., 36, 113–127, https://doi.org/10.1175/JTECH-D-18-0099.1.

  • Cao, W., X. Wang, Z. Ming, and J. Gao, 2018: A review on neural networks with random weights. Neurocomputing, 275, 278–287, https://doi.org/10.1016/j.neucom.2017.08.040.

  • Chang, N.-B., K. Bai, and C.-F. Chen, 2015: Smart information reconstruction via time-space-spectrum continuum for cloud removal in satellite images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens., 8, 1898–1912, https://doi.org/10.1109/JSTARS.2015.2400636.

  • Chen, Y., E. Argentinis, and G. Weber, 2016: IBM Watson: How cognitive computing can be applied to Big Data challenges in life sciences research. Clin. Ther., 38, 688–701, https://doi.org/10.1016/j.clinthera.2015.12.001.

  • Chevallier, F., and J.-F. Mahfouf, 2001: Evaluation of the Jacobians of infrared radiation models for variational data assimilation. J. Appl. Meteor., 40, 1445–1461, https://doi.org/10.1175/1520-0450(2001)040<1445:EOTJOI>2.0.CO;2.

  • Chevallier, F., J.-J. Morcrette, F. Chéruy, and N. A. Scott, 2000: Use of a neural-network-based long-wave radiative-transfer scheme in the ECMWF atmospheric model. Quart. J. Roy. Meteor. Soc., 126, 761–776, https://doi.org/10.1002/qj.49712656318.

  • Chong, E., C. Han, and F. C. Park, 2017: Deep learning networks for stock market analysis and prediction: Methodology, data representations, and case studies. Expert Syst. Appl., 83, 187–205, https://doi.org/10.1016/j.eswa.2017.04.030.

  • Cintineo, J. L., M. J. Pavolonis, J. M. Sieglaff, and D. T. Lindsey, 2014: An empirical model for assessing the severe weather potential of developing convection. Wea. Forecasting, 29, 639–653, https://doi.org/10.1175/WAF-D-13-00113.1.

  • Clough, S. A., M. W. Shephard, E. J. Mlawer, J. S. Delamere, M. J. Iacono, K. Cady-Pereira, S. Boukabara, and P. D. Brown, 2005: Atmospheric radiative transfer modeling: A summary of the AER codes. J. Quant. Spectrosc. Radiat. Transfer, 91, 233–244, https://doi.org/10.1016/j.jqsrt.2004.05.058.

  • Collins, W. D., J. K. Hackney, and D. P. Edwards, 2002: An updated parameterization for infrared emission and absorption by water vapor in the National Center for Atmospheric Research Community Atmosphere Model. J. Geophys. Res., 107, 4664, https://doi.org/10.1029/2001JD001365.

  • Creswell, A., T. White, V. Dumoulin, K. Arulkumaran, B. Sengupta, and A. A. Bharath, 2018: Generative adversarial networks: An overview. IEEE Signal Process. Mag., 35, 53–65, https://doi.org/10.1109/MSP.2017.2765202.

  • Ding, L., 2018: Human knowledge in constructing AI systems—Neural logic networks approach towards an explainable AI. Procedia Comput. Sci., 126, 1561–1570, https://doi.org/10.1016/j.procs.2018.08.129.

  • Droste, A. M., J. J. Pape, A. Overeem, H. Leijnse, G. J. Steeneveld, A. J. V. Delden, and R. Uijlenhoet, 2017: Crowdsourcing urban air temperatures through smartphone battery temperatures in São Paulo, Brazil. J. Atmos. Oceanic Technol., 34, 1853–1866, https://doi.org/10.1175/JTECH-D-16-0150.1.

  • Fan, Y., C.-Y. Wu, J. Gottschalck, and V. Krasnopolsky, 2019: Using artificial neural networks to improve CFS week 3–4 precipitation and 2m temperature forecast. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Thursday/S5-6_NOAAai2019_Fan.pptx.

  • Gagne, D. J., II, A. McGovern, S. E. Haupt, R. A. Sobash, J. K. Williams, and M. Xue, 2017: Storm-based probabilistic hail forecasting with machine learning applied to convection-allowing ensembles. Wea. Forecasting, 32, 1819–1840, https://doi.org/10.1175/WAF-D-17-0010.1.

  • Gagne, D. J., S. E. Haupt, D. W. Nychka, and G. Thompson, 2019: Interpretable deep learning for spatial analysis of severe hailstorms. Mon. Wea. Rev., 147, 2827–2845, https://doi.org/10.1175/MWR-D-18-0316.1.

  • Geer, A. J., and Coauthors, 2018: All-sky satellite data assimilation at operational weather forecasting centres. Quart. J. Roy. Meteor. Soc., 144, 1191–1217, https://doi.org/10.1002/qj.3202.

  • Gentine, P., M. Pritchard, S. Rasp, G. Reinaudi, and G. Yacalis, 2018: Could machine learning break the convection parameterization deadlock? Geophys. Res. Lett., 45, 5742–5751, https://doi.org/10.1029/2018GL078202.

  • Ghahramani, Z., 2015: Probabilistic machine learning and artificial intelligence. Nature, 521, 452–459, https://doi.org/10.1038/nature14541.

  • Golestan, K., R. Soua, F. Karray, and M. S. Kamel, 2016: Situation awareness within the context of connected cars: A comprehensive review and recent trends. Inf. Fusion, 29, 68–83, https://doi.org/10.1016/j.inffus.2015.08.001.

  • Goodfellow, I., Y. Bengio, and A. Courville, 2019: Deep Learning. Adaptive Computation and Machine Learning Series, MIT Press, 800 pp.

  • Gu, J., and Coauthors, 2018: Recent advances in convolutional neural networks. Pattern Recognit., 77, 354–377, https://doi.org/10.1016/j.patcog.2017.10.013.

  • Han, J., D. Zhang, G. Cheng, N. Liu, and D. Xu, 2018: Advanced deep-learning techniques for salient and category-specific object detection: A survey. IEEE Signal Process. Mag., 35, 84–100, https://doi.org/10.1109/MSP.2017.2749125.

  • Haupt, S. E., A. Pasini, and C. Marzban, Eds., 2008: Artificial Intelligence Methods in the Environmental Sciences. Springer, 418 pp.

  • Hengl, T., and Coauthors, 2017: SoilGrids250m: Global gridded soil information based on machine learning. PLOS ONE, 12, e0169748, https://doi.org/10.1371/journal.pone.0169748.

  • Hernández-Ceballos, M. A., and Coauthors, 2019: UDINEE: Evaluation of multiple models with data from the JU2003 puff releases in Oklahoma City. Part I: Comparison of observed and predicted concentrations. Bound.-Layer Meteor., 171, 323–349, https://doi.org/10.1007/s10546-019-00433-8.

  • Hochreiter, S., and J. Schmidhuber, 1997: Long short-term memory. Neural Comput., 9, 1735–1780, https://doi.org/10.1162/neco.1997.9.8.1735.

  • Hoffman, R. N., and S. M. Leidner, 2010: Some characteristics of time interpolation errors for fluid flows. J. Atmos. Oceanic Technol., 27, 1255–1262, https://doi.org/10.1175/2010JTECHA1429.1.

  • Hopkins, M., G. Pineda-García, P. A. Bogdan, and S. B. Furber, 2018: Spiking neural networks for computer vision. Interface Focus, 8, 20180007, https://doi.org/10.1098/rsfs.2018.0007.

  • Hsieh, W. W., 2009: Machine Learning Methods in the Environmental Sciences: Neural Networks and Kernels. Cambridge University Press, 364 pp.

  • Huang, B., Y. Huan, L. D. Xu, L. Zheng, and Z. Zou, 2019: Automated trading systems statistical and machine learning methods and hardware implementation: A survey. Enterprise Inf. Syst., 13, 132–144, https://doi.org/10.1080/17517575.2018.1493145.

  • Iskenderian, H., and Coauthors, 2019: Global synthetic weather radar capability in support of the U.S. Air Force. 19th Conf. on Aviation, Range, and Aerospace Meteorology, Phoenix, AZ, Amer. Meteor. Soc., 7.1, https://ams.confex.com/ams/2019Annual/webprogram/Paper355542.html.

  • Jones, E., E. Maddy, K. Garrett, and S.-A. Boukabara, 2019: The MIIDAPS algorithm for retrieval and quality control for microwave and infrared observations: Applications in data assimilation. 23rd Conf. on Integrated Observing and Assimilation Systems for the Atmosphere, Oceans, and Land Surface, Phoenix, AZ, Amer. Meteor. Soc., 10.4, https://ams.confex.com/ams/2019Annual/webprogram/Paper352855.html.

  • Karpatne, A., W. Watkins, J. Read, and V. Kumar, 2018: Physics-guided neural networks (PGNN): An application in lake temperature modeling. arXiv, 11 pp., http://arxiv.org/abs/1710.11431.

  • Karstens, C. D., and Coauthors, 2018: Development of a human–machine mix for forecasting severe convective events. Wea. Forecasting, 33, 715–737, https://doi.org/10.1175/WAF-D-17-0188.1.

  • Khain, P., and Coauthors, 2019: Parameterization of vertical profiles of governing microphysical parameters of shallow cumulus cloud ensembles using LES with bin microphysics. J. Atmos. Sci., 76, 533–560, https://doi.org/10.1175/JAS-D-18-0046.1.

  • Kohonen, T., 2001: Self-Organizing Maps. 3rd ed. Springer Series in Information Sciences, Vol. 30, Springer, 522 pp., https://doi.org/10.1007/978-3-642-56927-2.

  • Krasnopolsky, V. M., 2013: The Application of Neural Networks in the Earth System Sciences: Neural Network Emulations for Complex Multidimensional Mappings. Atmospheric and Oceanographic Sciences Library, Vol. 46, Springer, 212 pp., https://doi.org/10.1007/978-94-007-6073-8.

  • Krasnopolsky, V. M., and M. S. Fox-Rabinovitz, 2006: A new synergetic paradigm in environmental numerical modeling: Hybrid models combining deterministic and machine learning components. Ecol. Modell., 191, 5–18, https://doi.org/10.1016/j.ecolmodel.2005.08.009.

  • Krasnopolsky, V. M., and Y. Lin, 2012: A neural network nonlinear multimodel ensemble to improve precipitation forecasts over continental US. Adv. Meteor., 2012, 649450, https://doi.org/10.1155/2012/649450.

  • Krasnopolsky, V. M., L. C. Breaker, and W. H. Gemmill, 1995: A neural network as a nonlinear transfer function model for retrieving surface wind speeds from the special sensor microwave imager. J. Geophys. Res., 100, 11 033–11 045, https://doi.org/10.1029/95JC00857.

  • Krasnopolsky, V. M., M. S. Fox-Rabinovitz, and A. A. Belochitski, 2008: Decadal climate simulations using accurate and fast neural network emulation of full, longwave and shortwave, radiation. Mon. Wea. Rev., 136, 3683–3695, https://doi.org/10.1175/2008MWR2385.1.

  • Krasnopolsky, V. M., M. S. Fox-Rabinovitz, Y. T. Hou, S. J. Lord, and A. A. Belochitski, 2010: Accurate and fast neural network emulations of model radiation for the NCEP coupled climate forecast system: Climate simulations and seasonal predictions. Mon. Wea. Rev., 138, 1822–1842, https://doi.org/10.1175/2009MWR3149.1.

  • Krasnopolsky, V. M., M. S. Fox-Rabinovitz, and A. A. Belochitski, 2013: Using ensemble of neural networks to learn stochastic convection parameterizations for climate and numerical weather prediction models from data simulated by a cloud resolving model. Adv. Artif. Neural Syst., 2013, 485913, https://doi.org/10.1155/2013/485913.

  • Krasnopolsky, V. M., S. Nadiga, A. Mehra, E. Bayler, and D. Behringer, 2016: Neural networks technique for filling gaps in satellite measurements: Application to ocean color observations. Comput. Intell. Neurosci., 2016, 6156513, https://doi.org/10.1155/2016/6156513.

  • Krasnopolsky, V. M., S. Nadiga, A. Mehra, and E. Bayler, 2018: Adjusting neural network to a particular problem: Neural network-based empirical biological model for chlorophyll concentration in the upper ocean. Appl. Comput. Intell. Soft Comput., 2018, 7057363, https://doi.org/10.1155/2018/7057363.

  • Landschützer, P., N. Gruber, D. C. E. Bakker, U. Schuster, S. Nakaoka, M. R. Payne, T. P. Sasse, and J. Zeng, 2013: A neural network-based estimate of the seasonal to inter-annual variability of the Atlantic Ocean carbon sink. Biogeosciences, 10, 7793–7815, https://doi.org/10.5194/bg-10-7793-2013.

  • LeCun, Y., Y. Bengio, and G. Hinton, 2015: Deep learning. Nature, 521, 436–444, https://doi.org/10.1038/nature14539.

  • Li, J., and Coauthors, 2019: Potential numerical techniques and challenges for atmospheric modeling. Bull. Amer. Meteor. Soc., 100, ES239–ES242, https://doi.org/10.1175/BAMS-D-19-0031.1.

  • Li, Y., S. Ye, and I. Bartoli, 2018: Semisupervised classification of hurricane damage from postevent aerial imagery using deep learning. J. Appl. Remote Sens., 12, 045008, https://doi.org/10.1117/1.jrs.12.045008.

  • Madaus, L. E., and C. F. Mass, 2017: Evaluating smartphone pressure observations for mesoscale analyses and forecasts. Wea. Forecasting, 32, 511–531, https://doi.org/10.1175/WAF-D-16-0135.1.

  • Manogaran, G., V. Vijayakumar, R. Varatharajan, P. M. Kumar, R. Sundarasekar, and C.-H. Hsu, 2018: Machine learning based Big Data processing framework for cancer diagnosis using hidden Markov model and GM clustering. Wireless Pers. Commun., 102, 2099–2116, https://doi.org/10.1007/s11277-017-5044-z.

  • Mao, H. H., T. Shin, and G. Cottrell, 2018: DeepJ: Style-specific music generation. 12th Int. Conf. on Semantic Computing, Laguna Hills, CA, IEEE, https://doi.org/10.1109/icsc.2018.00077.

  • Mass, C. F., and Y.-H. Kuo, 1998: Regional real-time numerical weather prediction: Current status and future potential. Bull. Amer. Meteor. Soc., 79, 253–264, https://doi.org/10.1175/1520-0477(1998)079<0253:RRTNWP>2.0.CO;2.

  • McGovern, A., K. L. Elmore, D. J. Gagne II, S. E. Haupt, C. D. Karstens, R. Lagerquist, T. Smith, and J. K. Williams, 2017: Using artificial intelligence to improve real-time decision-making for high-impact weather. Bull. Amer. Meteor. Soc., 98, 2073–2090, https://doi.org/10.1175/BAMS-D-16-0123.1.

  • McGovern, A., R. A. Lagerquist, D. J. Gagne, E. Jergensen, K. L. Elmore, C. R. Homeyer, and T. Smith, 2019: Making the black box more transparent: Understanding the physical implications of machine learning. Bull. Amer. Meteor. Soc., 100, 2175–2199, https://doi.org/10.1175/BAMS-D-18-0195.1.

  • Millard, K., and M. Richardson, 2015: On the importance of training data sample selection in random forest image classification: A case study in peatland ecosystem mapping. Remote Sens., 7, 8489–8515, https://doi.org/10.3390/rs70708489.

  • Mincholé, A., and B. Rodriguez, 2019: Artificial intelligence for the electrocardiogram. Nat. Med., 25, 22–23, https://doi.org/10.1038/s41591-018-0306-1.

  • O’Gorman, P. A., and J. G. Dwyer, 2018: Using machine learning to parameterize moist convection: Potential for modeling of climate, climate change, and extreme events. J. Adv. Model. Earth Syst., 10, 2548–2563, https://doi.org/10.1029/2018MS001351.

  • O’Mahony, N., S. Campbell, L. Krpalkova, D. Riordan, J. Walsh, A. Murphy, and C. Ryan, 2019: Computer vision for 3D perception. IntelliSys 2018: Intelligent Systems and Applications, K. Arai, S. Kapoor, and R. Bhatia, Eds., Springer, 788–804, https://doi.org/10.1007/978-3-030-01057-7_59.

  • Pagano, T. S., and Coauthors, 2017: Design and development of the CubeSat Infrared Atmospheric Sounder (CIRAS). Proc. SPIE, 10402, 1040209, https://doi.org/10.1117/12.2272839.

  • Pasolli, L., F. Melgani, and E. Blanzieri, 2010: Gaussian process regression for estimating chlorophyll concentration in subsurface waters from remote sensing data. IEEE Geosci. Remote Sens. Lett., 7, 464–468, https://doi.org/10.1109/LGRS.2009.2039191.

  • Pavolonis, M., J. Cintineo, J. Sieglaff, and C. Calvert, 2019: Machine learning based applications for environmental hazard detection and prediction. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Wednesday/S3-3_NOAAai2019_Pavolonis.pptx.

  • Penny, S. G., and T. M. Hamill, 2017: Coupled data assimilation for integrated earth system analysis and prediction. Bull. Amer. Meteor. Soc., 98, ES169–ES172, https://doi.org/10.1175/BAMS-D-17-0036.1.

  • Price, J. D., and Coauthors, 2018: LANFEX: A field and modeling study to improve our understanding and forecasting of radiation fog. Bull. Amer. Meteor. Soc., 99, 2061–2077, https://doi.org/10.1175/BAMS-D-16-0299.1.

  • Racah, E., C. Beckham, T. Maharaj, S. Ebrahimi Kahou, Prabhat, and C. Pal, 2017: ExtremeWeather: A large-scale climate dataset for semi-supervised detection, localization, and understanding of extreme weather events. Advances in Neural Information Processing Systems 30, I. Guyon et al., Eds., Curran Associates, Inc., 3402–3413, http://papers.nips.cc/paper/6932-extremeweather-a-large-scale-climate-dataset-for-semi-supervised-detection-localization-and-understanding-of-extreme-weather-events.

  • Rasp, S., and S. Lerch, 2018: Neural networks for postprocessing ensemble weather forecasts. Mon. Wea. Rev., 146, 3885–3900, https://doi.org/10.1175/MWR-D-18-0187.1.

  • Rasp, S., M. S. Pritchard, and P. Gentine, 2018: Deep learning to represent subgrid processes in climate models. Proc. Natl. Acad. Sci. USA, 115, 9684–9689, https://doi.org/10.1073/pnas.1810286115.

  • Razzak, M. I., S. Naz, and A. Zaib, 2018: Deep learning for medical image processing: Overview, challenges and the future. Classification in BioApps, N. Dey, A. Ashour, and S. Borra, Eds., Lecture Notes in Computational Vision and Biomechanics, Vol. 26, Springer, 323–350, https://doi.org/10.1007/978-3-319-65981-7_12.

  • Reichstein, M., G. Camps-Valls, B. Stevens, M. Jung, J. Denzler, and N. Carvalhais, 2019: Deep learning and process understanding for data-driven Earth system science. Nature, 566, 195–204, https://doi.org/10.1038/s41586-019-0912-1.

  • Roebber, P. J., D. M. Schultz, B. A. Colle, and D. J. Stensrud, 2004: Toward improved prediction: High-resolution and ensemble modeling systems in operations. Wea. Forecasting, 19, 936–949, https://doi.org/10.1175/1520-0434(2004)019<0936:TIPHAE>2.0.CO;2.

  • Samek, W., T. Wiegand, and K.-R. Müller, 2017: Explainable artificial intelligence: Understanding, visualizing and interpreting deep learning models. arXiv, 8 pp., http://arxiv.org/abs/1708.08296.

  • Scher, S., 2018: Toward data-driven weather and climate forecasting: Approximating a simple general circulation model with deep learning. Geophys. Res. Lett., 45, 12 616–12 622, https://doi.org/10.1029/2018GL080704.

  • Schlef, K. E., H. Moradkhani, and U. Lall, 2019: Atmospheric circulation patterns associated with extreme United States floods identified via machine learning. Sci. Rep., 9, https://doi.org/10.1038/s41598-019-43496-w.

  • Schmetz, J., and W. P. Menzel, 2015: A look at the evolution of meteorological satellites: Advancing capabilities and meeting user requirements. Wea. Climate Soc., 7, 309–320, https://doi.org/10.1175/WCAS-D-15-0017.1.

  • Schmidhuber, J., 2015: Deep learning in neural networks: An overview. Neural Networks, 61, 85–117, https://doi.org/10.1016/j.neunet.2014.09.003.

  • Schneider, T., S. Lan, A. Stuart, and J. Teixeira, 2017: Earth System Modeling 2.0: A blueprint for models that learn from observations and targeted high-resolution simulations. Geophys. Res. Lett., 44, 12 396–12 417, https://doi.org/10.1002/2017GL076101.

  • Sejnowski, T. J., 2018: The Deep Learning Revolution. MIT Press, 352 pp.

  • Shahroudi, N., E. Maddy, S. Boukabara, and V. Krasnopolsky, 2019: Improvement to hurricane track and intensity forecast by exploiting satellite data and machine learning. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Wednesday/S3-2_NOAAai2019_Shahroudi.pptx.

  • Shi, X., Z. Chen, H. Wang, D.-Y. Yeung, W. Wong, and W. Woo, 2015: Convolutional LSTM network: A machine learning approach for precipitation nowcasting. Advances in Neural Information Processing Systems 28, C. Cortes et al., Eds., Curran Associates, Inc., 802–810, https://papers.nips.cc/paper/5955-convolutional-lstm-network-a-machine-learning-approach-for-precipitation-nowcasting.

  • Shi, X., Z. Gao, L. Lausen, H. Wang, D.-Y. Yeung, W. Wong, and W. Woo, 2017: Deep learning for precipitation nowcasting: A benchmark and a new model. Advances in Neural Information Processing Systems 30, I. Guyon et al., Eds., Curran Associates, Inc., 5617–5627, http://papers.nips.cc/paper/7145-deep-learning-for-precipitation-nowcasting-a-benchmark-and-a-new-model.

  • Stensrud, D. J., and Coauthors, 2009: Convective-scale Warn-on-Forecast system: A vision for 2020. Bull. Amer. Meteor. Soc., 90, 1487–1499, https://doi.org/10.1175/2009BAMS2795.1.

  • Stensrud, D. J., and Coauthors, 2013: Progress and challenges with Warn-on-Forecast. Atmos. Res., 123, 2–16, https://doi.org/10.1016/j.atmosres.2012.04.004.

  • Sun, J., and Coauthors, 2014: Use of NWP for nowcasting convective precipitation: Recent progress and challenges. Bull. Amer. Meteor. Soc., 95, 409–426, https://doi.org/10.1175/BAMS-D-11-00263.1.

  • Tan, C., F. Sun, T. Kong, W. Zhang, C. Yang, and C. Liu, 2018: A survey on deep transfer learning. Artificial Neural Networks and Machine Learning—ICANN 2018, V. Kůrková et al., Eds., Lecture Notes in Computer Science, Vol. 11141, Springer, 270–279, https://doi.org/10.1007/978-3-030-01424-7_27.

  • Thorpe, A., and D. Rogers, 2018: The future of the global weather enterprise: Opportunities and risks. Bull. Amer. Meteor. Soc., 99, 2003–2008, https://doi.org/10.1175/BAMS-D-17-0194.1.

  • Tompson, J., K. Schlachter, P. Sprechmann, and K. Perlin, 2017: Accelerating Eulerian fluid simulation with convolutional networks. Proceedings of the 34th International Conference on Machine Learning, D. Precup and Y. W. Teh, Eds., Proceedings of Machine Learning Research, Vol. 70, PMLR, 3424–3433, http://proceedings.mlr.press/v70/tompson17a.html.

  • Toms, B. A., K. Kashinath, Prabhat, and D. Yang, 2019: Deep learning for scientific inference from geophysical data: The Madden-Julian oscillation as a test case. arXiv, 14 pp., http://arxiv.org/abs/1902.04621.

  • van Straaten, C., K. Whan, and M. Schmeits, 2018: Statistical postprocessing and multivariate structuring of high-resolution ensemble precipitation forecasts. J. Hydrometeor., 19, 1815–1833, https://doi.org/10.1175/JHM-D-18-0105.1.

  • Wang, D., and J. Chen, 2018: Supervised speech separation based on deep learning: An overview. IEEE/ACM Trans. Audio Speech Lang. Process., 26, 1702–1726, https://doi.org/10.1109/TASLP.2018.2842159.

  • Wimmers, A., C. Velden, and J. H. Cossuth, 2019: Using deep learning to estimate tropical cyclone intensity from satellite passive microwave imagery. Mon. Wea. Rev., 147, 2261–2282, https://doi.org/10.1175/MWR-D-18-0391.1.

  • Yamashita, R., M. Nishio, R. K. G. Do, and K. Togashi, 2018: Convolutional neural networks: An overview and application in radiology. Insights Imaging, 9, 611–629, https://doi.org/10.1007/s13244-018-0639-9.

  • Yano, J.-I., and Coauthors, 2018: Scientific challenges of convective-scale numerical weather prediction. Bull. Amer. Meteor. Soc., 99, 699–710, https://doi.org/10.1175/BAMS-D-17-0125.1.

  • Zhang, R., B. Di, Y. Luo, X. Deng, M. L. Grieneisen, Z. Wang, G. Yao, and Y. Zhan, 2018: A nonparametric approach to filling gaps in satellite-retrieved aerosol optical depth for estimating ambient PM2.5 levels. Environ. Pollut., 243, 998–1007, https://doi.org/10.1016/j.envpol.2018.09.052.

    • Crossref
    • Search Google Scholar
    • Export Citation
  • Zhang, Z., J. Geiger, J. Pohjalainen, A. E.-D. Mousa, W. Jin, and B. Schuller, 2018: Deep learning for environmentally robust speech recognition: An overview of recent developments. ACM Trans. Intell. Syst. Technol., 9, 49, https://doi.org/10.1145/3178115.

    • Crossref
    • Search Google Scholar
    • Export Citation
  • Fig. 2. Synopsis of the NWP processing chain, from secure ingest of satellite data to the generation of forecasts at different time scales. Color coding indicates which components are discussed in the text.

  • Fig. 3. NCEP CFS precipitation rates in mm day−1, 17-yr average for (a) the control run and (b) the run using ANN emulations of radiative heating rates, and differences of precipitation rates in mm day−1 for (c) the difference ANN run minus control run [(b) minus (a)] and (d) the difference between two control runs: one before and another after changing the version of the FORTRAN compiler. Results are from the experiments described by Krasnopolsky et al. (2010).

  • Fig. 4. Comparison of (a) the CPC analysis of precipitation for the 24-h period ending at 1200 UTC 24 Oct 2010 to three forecasts, including (b) the arithmetic mean of the precipitation forecasts provided by eight models, (c) the nonlinear ANN average of the same models, and (d) the prediction by a human analyst who used satellite images and ground observations in addition to model forecasts. The color bar shows the precipitation rate from 0 to 175 mm day−1. Different shades of green correspond to precipitation rates from 1 to 10 mm day−1 and different shades of red to precipitation rates from 25 to 75 mm day−1. After Krasnopolsky and Lin (2012, their Fig. 6).

Leveraging Modern Artificial Intelligence for Remote Sensing and NWP: Benefits and Challenges

  • 1 NOAA/NESDIS/Center for Satellite Applications and Research, College Park, Maryland
  • 2 NOAA/Environmental Modeling Center, College Park, Maryland
  • 3 Cooperative Institute for Research in the Atmosphere, Colorado State University, at NOAA/Earth System Research Laboratory, Boulder, Colorado
  • 4 Riverside Technology Inc. at NOAA/NESDIS/Center for Satellite Applications and Research, College Park, Maryland
  • 5 Cooperative Institute for Satellite Earth System Studies, University of Maryland, College Park, at NOAA/NESDIS/Center for Satellite Applications and Research, College Park, Maryland

Abstract

Artificial intelligence (AI) techniques have had significant recent successes in multiple fields. These fields and the fields of satellite remote sensing and NWP share the same fundamental underlying needs, including signal and image processing, quality control mechanisms, pattern recognition, data fusion, forward and inverse problems, and prediction. Thus, modern AI in general and machine learning (ML) in particular can be positively disruptive and transformational change agents in the fields of satellite remote sensing and NWP by augmenting, and in some cases replacing, elements of the traditional remote sensing, assimilation, and modeling tools. And change is needed to meet the increasing challenges of Big Data, advanced models and applications, and user demands. Future developments, for example, SmallSats and the Internet of Things, will continue the explosion of new environmental data. ML models are highly efficient and in some cases more accurate because of their flexibility to accommodate nonlinearity and/or non-Gaussianity. With that efficiency, ML can help to address the demands put on environmental products for higher accuracy, for higher resolution—spatial, temporal, and vertical, for enhanced conventional medium-range forecasts, for outlooks and predictions on subseasonal to seasonal time scales, and for improvements in the process of issuing advisories and warnings. Using examples from satellite remote sensing and NWP, it is illustrated how ML can accelerate the pace of improvement in environmental data exploitation and weather prediction—first, by complementing existing systems, and second, where appropriate, as an alternative to some components of the NWP processing chain from observations to forecasts.

© 2019 American Meteorological Society. For information regarding reuse of this content and general copyright information, consult the AMS Copyright Policy (www.ametsoc.org/PUBSReuseLicenses).

CORRESPONDING AUTHOR: Dr. Sid-Ahmed Boukabara, sid.boukabara@noaa.gov


Machine learning can improve the handling of large volumes of observations and the modeling, analysis, and forecasting of the environment by increasing the speed and accuracy of computations, but success requires great care in designing and training the machine learning models.

The purpose of this article is to provide evidence, based on specific examples, mostly remote sensing and NWP examples, that AI has tremendous potential for being used successfully in meteorology and for transforming the exploitation of environmental data in the future. (See appendix for a list of acronyms.) The authors believe that AI will fundamentally change how we do business in a wide range of our activities, including research, operations, and communication with users. This article expands on the discussion initiated by Boukabara et al. (2019b) by providing an introduction to those readers not yet conversant with AI and also a partial review of AI activities in meteorology for those who want to investigate more deeply. A complete review of the topic would be lengthy, and this article focuses on remote sensing and NWP. For an overview of cutting-edge research in AI for remote sensing and NWP, the interested reader is referred to the proceedings of a recent NOAA workshop (www.star.nesdis.noaa.gov/star/meeting_2019AIWorkshop.php).

AI has already proven to be a truly transformational and value-enhancing, disruptive technology in a variety of applications: autonomous vehicles, music generation, forecasting financial markets, speech recognition, smart assistance, quantum physics, medical diagnosis, and more (Sejnowski 2018). Already, various forms of AI, including ML (see sidebar “Machine learning and artificial neural networks”), have been applied with varying levels of success to many satellite remote sensing and NWP problems (Haupt et al. 2008; Hsieh 2009; Krasnopolsky 2013), ranging from remote sensing (Ball et al. 2017) to severe weather prediction (McGovern et al. 2017). In operational applications to date, AI has occupied only what might be termed “niche” applications, and these limited successes have been hard fought (e.g., Chevallier and Mahfouf 2001; Cintineo et al. 2014). However, AI demonstration projects have shown much promise for a wide range of geophysical problems, from identifying critical situations to aid human interpretation (e.g., Pavolonis et al. 2019), to discovering new relationships in large datasets (e.g., Schlef et al. 2019), and to correcting model forecasts (e.g., Gagne et al. 2017; Campos et al. 2019). It is now clear (see references cited as examples in the body of this essay) that AI approaches, including recent advances in ML technology, such as transfer learning, long short-term memory networks (LSTMs; Hochreiter and Schmidhuber 1997), deep and extreme learning (Schmidhuber 2015; Goodfellow et al. 2019), and computer vision, have the potential to meet increasing requirements for and by nowcast and forecast products, including numerical and statistical weather forecasts and climate projections. Our discussion focuses on and our examples are generally drawn from global NWP, ranging from satellite data preprocessing to forecast postprocessing. 
This focus reflects the authors’ background and expertise, but of greater importance is that there are significant challenges and opportunities in global NWP due to its relative technical maturity, severe latency (i.e., timeliness) requirements, and the abundance of unexploited environmental observations.

MACHINE LEARNING AND ARTIFICIAL NEURAL NETWORKS

The artificial intelligence (AI) technique of machine learning (ML) estimates an output (scalar or not) from a set of inputs. The “machine” may be an artificial neural network (ANN), a decision tree, a support vector machine, a Bayesian network, a genetic algorithm, etc. ANNs are the most common form of ML used in the examples and citations of this article. In a trained ANN, the strengths of the interactions between neurons are optimized (i.e., trained) to best fit data from a training set of inputs and outputs. ANNs are analogs of biological structures such as the eye and brain. Common ANN architectures (i.e., structures) are often composed of an input layer connected to the input data, an output layer that provides the desired estimate, and one or more “hidden” intermediate layers. The range of ML architectures is large, and different specialized architectures are appropriate for different tasks. Training an ANN, like training most ML architectures, is a nonlinear optimization problem. It is time consuming, and success depends on having a large and representative training dataset, on the choice of optimization method, and on properly setting the “meta-parameters” that control the optimization process. Once trained, the ML model is fast. Specialized computer hardware (e.g., GPUs), some of it developed specifically for particular ML architectures, provides very significant acceleration of ML optimization and of ML model computations.

For the foreseeable future, artificial general intelligence will not be available, and ML will require some degree of human expertise, intuition, and intervention to succeed. The steps in applying the general ML approach—identifying the problem, designing or selecting the ML architecture, selecting and normalizing inputs and outputs, preparing training sets, selecting a training algorithm and its parameters, making decisions about sufficient approximation accuracy, and validating the resulting ML model—currently all require active human participation and a disciplined approach based on modern software practices (version control, containerization, etc.) to ensure reproducibility, maintainability, and traceability. Therefore, close collaboration among computer scientists, geophysicists, data scientists, remote sensing experts, and modelers is essential when developing ML models for satellite remote sensing and NWP applications.
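The training loop sketched in this sidebar can be made concrete with a toy example, assuming only NumPy. A single-hidden-layer network is fit by gradient descent to a synthetic nonlinear target; the layer sizes, learning rate, and target function are illustrative choices, not taken from any operational system.

```python
# Minimal sketch of ANN training: one tanh hidden layer fit by
# full-batch gradient descent to a toy nonlinear target.
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: inputs x in [0, 1], target y = sin(2*pi*x).
x = rng.uniform(0.0, 1.0, size=(256, 1))
y = np.sin(2.0 * np.pi * x)

# Weights are the quantities optimized ("trained") to fit the data.
n_hidden = 16
W1 = rng.normal(0.0, 3.0, size=(1, n_hidden))
b1 = rng.uniform(-3.0, 3.0, size=n_hidden)
W2 = rng.normal(0.0, 0.1, size=(n_hidden, 1))
b2 = np.zeros(1)

lr = 0.05
for step in range(20000):
    # Forward pass through the network.
    h = np.tanh(x @ W1 + b1)          # hidden layer, shape (256, 16)
    pred = h @ W2 + b2                # output layer, shape (256, 1)
    err = pred - y
    # Backward pass: gradients of the (half) mean-squared error.
    n = x.shape[0]
    gW2 = h.T @ err / n
    gb2 = err.mean(axis=0)
    gh = (err @ W2.T) * (1.0 - h ** 2)
    gW1 = x.T @ gh / n
    gb1 = gh.mean(axis=0)
    # Gradient-descent update of all weights.
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

rmse = float(np.sqrt(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2)))
print(f"training RMSE: {rmse:.3f}")
```

Note that once training is finished, a prediction is just two small matrix products, which is the source of the speed advantage emphasized above.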

The following sections assess and illustrate the use of AI and specifically ML to augment or replace many components of the NWP processing chain—the processing chain that adds value at each of the series of steps from collecting and preprocessing Earth observations to the postprocessing and issuance of forecasts and warnings. As will be shown in the examples provided below, ML can 1) speed up and improve the processing of satellite data (quality control, gap filling, retrievals, etc.), 2) facilitate data assimilation and initialization of numerical weather and climate models, 3) speed up and improve model physics in numerical models, and 4) improve postprocessing of numerical model outputs. The “Satellite remote sensing and NWP challenges and ML” section highlights some of the major challenges facing satellite remote sensing and NWP and, in particular, the challenge of Big Data, and makes it clear that new approaches, such as AI, are needed not only to accelerate the exploitation of environmental observations but also to enhance the quality of the outcomes. The “Advantages of ML to meet the challenges in satellite remote sensing and NWP” section discusses the applicability, appropriateness, and complementarity of ML for geophysics, considering at both abstract and practical levels to what extent and in what situations ML is actually applicable to satellite remote sensing and NWP, and whether ML provides an entirely new alternative or, instead, is complementary to more traditional approaches. Here, we note that the way forward to advance the use of ML in satellite remote sensing and NWP is to leverage the enormous recent advances in ML in other fields. The “Caveats in using ML to meet the challenges in satellite remote sensing and NWP” section then addresses a number of issues and caveats in the use of ML. 
In the “Examples of modern ML applications to satellite remote sensing and NWP” section, a closer look at examples of some of the elements of the NWP processing chain demonstrates the usefulness and applicability of ML-based techniques. The final section concludes with a summary and outlook.

SATELLITE REMOTE SENSING AND NWP CHALLENGES AND ML.

The many challenges to satellite remote sensing and NWP are grouped here by forcing mechanism—Big Data, advanced models and applications, and user demands. In the sections that follow, we show that recent advances in ML in terms of efficiency, capability, and ease of implementation can help to meet these challenges.

First, NWP is failing to exploit the growing diversity and volume of observations. As seen in Fig. 1, the volume of environmental data available, especially from satellites, has increased significantly in the recent past (Schmetz and Menzel 2015) and now presents a major challenge in terms of hardware, power, and latency constraints for real-time applications. Additionally, as the diversity of observations grows, new procedures are needed to exploit novel types of observations. In part for these reasons, global NWP uses only about 1%–3% of currently available satellite data, and the processing time for traditional approaches is already crippling. The rapid increase in data volume (and diversity) will continue for several reasons. On one hand, technological advances result in sensor designs with higher spatial, temporal, and spectral resolution. In addition, miniaturization permits deployment on less expensive SmallSat and CubeSat platforms [e.g., CIRAS described by Pagano et al. (2017) and MicroMAS-2 described by Blackwell et al. (2019)]. As a result, we are now seeing the emergence of commercial space-based data from flotillas of small satellites being marketed by multiple industry players. Other nontraditional platforms at near-space altitudes (such as the constellations of balloons being deployed by the private sector) can host complementary observing systems in the stratosphere, where there is a lack of accurate data. On the other hand, there is an emergence of new sources of data, such as the Internet of Things (IoT), which complement traditional environmental data sources. The IoT will include observations from personal electronic appliances, from automobiles, and from other mobile platforms [e.g., smartphone pressure observations described by Madaus and Mass (2017) and smartphone battery temperature as a proxy for air temperature described by Droste et al. (2017)]. 
The number and diversity of sensors, volume, and quality of data from the universe of the IoT is increasing, and will continue to increase, as technological advances lower cost, and mass and power requirements. This will improve analyses close to the surface (in the lower atmospheric boundary layer) where existing observations are not optimal.

Fig. 1.

Growth in annual mean number of satellite observations (millions) per 0000 UTC cycle (a) available and (b) used by the NCEP DA system for different data types (colors). The data are grouped into the following types: atmospheric motion vector, ocean surface wind, solar backscatter ozone, radio occultation, and radiance. The radiance type is subdivided into MW, IR, and hyperspectral sounder types. Note the 100-fold differences in vertical axes between the two panels. Available radiances are so predominant that the other types are not visible in (a), even though only subsets of hyperspectral sounder channels are received. For a similar reason, the solar backscatter ozone type is not visible in (b). Graphic created by Krishna Kumar, Eric Zimmerman, and Ross Hoffman.

Citation: Bulletin of the American Meteorological Society 100, 12; 10.1175/BAMS-D-18-0324.1

Second, weather forecasting has unmet and increasing requirements for improved computation resources, initialization of models, description of subgrid physical processes, and postprocessing of model outputs (e.g., Sun et al. 2014; Yano et al. 2018; Li et al. 2019; Roebber et al. 2004; Mass and Kuo 1998). Currently, NWP is hobbled by the combination of inadequate computer power and stringent latency requirements. Some extrapolations suggest that future improvements in NWP will soon be limited by power usage (www.noaa.gov/big-data-project). The latency requirement is particularly extreme for short-term forecasting of hazardous weather. Yet, improvements in NWP are driven by computationally intensive advances in all aforementioned areas. Examples of specific improvements for global medium-range NWP will include:

  • enhanced assimilation of satellite measurements, including radiances affected by clouds, precipitation, and surface properties [requiring more complete radiative transfer (RT) models accounting for these effects], and using improved or more efficient thinning, quality control, RT, observation bias correction, and cloud clearing procedures (e.g., Geer et al. 2018);

  • more accurate initial conditions that take advantage of an increasing volume of available real-time observations from satellites and the IoT (e.g., Madaus and Mass 2017);

  • enhanced modeling and data assimilation (DA) systems coupling multiple geophysical domains (atmosphere, ocean, etc.) that enable better use of observations affected by state variables in more than one domain (e.g., Penny and Hamill 2017);

  • improved parameterizations—the modeling of subscale geophysical phenomena, such as clouds and radiative heating rates (e.g., Khain et al. 2019; Price et al. 2018); and

  • improved postprocessing of model outputs, including better forecast bias correction and nonlinear ensemble averaging (e.g., van Straaten et al. 2018).

All these areas of improvement can benefit from faster and/or more accurate methods of calculation. These are also areas of active research for severe weather nowcasting and forecasting (e.g., the NOAA Warn-on-Forecast project, https://wof.nssl.noaa.gov; Stensrud et al. 2009, 2013). In what follows, the use of ML to provide such methods will be illustrated.
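The last bullet above, improved postprocessing through trained (nonlinear) ensemble averaging, can be illustrated with a toy sketch, assuming NumPy and entirely synthetic data. Three biased, noisy "member forecasts" are combined; the trained weighted consensus used here (the linear special case, fit by least squares) already beats the plain arithmetic mean, and an ANN consensus generalizes this to nonlinear combinations.

```python
# Toy ensemble postprocessing: trained consensus vs. arithmetic mean.
import numpy as np

rng = np.random.default_rng(1)
truth = rng.normal(0.0, 1.0, size=2000)

# Three synthetic members with different biases, scalings, and noise.
members = np.stack([
    1.2 * truth + 0.5 + rng.normal(0.0, 0.3, truth.size),
    0.8 * truth - 0.4 + rng.normal(0.0, 0.2, truth.size),
    1.0 * truth + 0.1 + rng.normal(0.0, 0.6, truth.size),
], axis=1)                                   # shape (2000, 3)

train, test = slice(0, 1500), slice(1500, None)

# Baseline: plain arithmetic ensemble mean.
mean_fc = members.mean(axis=1)

# Trained consensus: member weights plus intercept, fit on the
# training period only, then applied to the independent test period.
X = np.column_stack([members[train], np.ones(1500)])
coef, *_ = np.linalg.lstsq(X, truth[train], rcond=None)
cons_fc = np.column_stack([members[test], np.ones(truth[test].size)]) @ coef

rmse_mean = float(np.sqrt(np.mean((mean_fc[test] - truth[test]) ** 2)))
rmse_cons = float(np.sqrt(np.mean((cons_fc - truth[test]) ** 2)))
print(f"arithmetic mean RMSE: {rmse_mean:.3f}; trained consensus RMSE: {rmse_cons:.3f}")
```

The trained weights automatically downweight the noisiest member and remove the shared bias, which the arithmetic mean cannot do.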

Third, different users of weather data—from airline pilots to emergency planners to farmers—increasingly demand higher accuracy and greater resolutions for an ever-expanding array of applications (Thorpe and Rogers 2018). This higher user expectation is partly due to the increasing societal impact of weather. As a result, there is a demand for consistent, comprehensive, and consolidated warnings, nowcasts, and forecasts. Weather products must combine space-based, air-based, and surface-based data sources at increasing spatial, vertical, and temporal resolutions and with improved timeliness, especially for nowcasting and short-range forecasting applications. Operational forecasting space and time horizons are expanding to range from very short-term localized street level (or urban) forecasting to global subseasonal to seasonal (S2S) forecasting. Both ends of this spectrum can benefit from even more detailed computations. For urban dispersion modeling, discretization scales of meters and seconds are already used (e.g., Hernández-Ceballos et al. 2019). For S2S, physical processes must be parameterized more accurately and additional physical processes must be modeled. This includes the use of complex coupled Earth system models (ESMs) that incorporate components with longer predictability times than the atmosphere—the cryosphere, the land surface, the ocean, the biosphere, the hydrosphere, etc., and the couplings between these components. Note that the need to integrate multiple types of environmental parameters into consolidated, data fused/blended observation products exists at both ends of the forecast time scale—both for nowcasts and for future coupled climate DA systems.

In summary, new approaches will be needed to take full advantage of all the observations, allowing more sources of observations to be ingested by increasingly more accurate (and ever more computationally demanding) DA and forecast systems, and to do so within an ever-shrinking time window allowed for processing and dissemination. In the next section, we discuss the different ways that ML can contribute to solve this problem/opportunity.

ADVANTAGES OF ML TO MEET THE CHALLENGES IN SATELLITE REMOTE SENSING AND NWP.

Advantages of ML methods include but are not limited to the following:

  • Computational efficiency. ML models are often a few orders of magnitude faster than the original physically based (deterministic) models (e.g., see the “Fast and accurate emulations of model physics” section).

  • Accuracy. With representative and very accurate training datasets, ML models can be more accurate than conventional efficient parameterizations (e.g., see the discussion later in this section on the use of satellite radiance observations).

  • Transferability. Advances in ML methods from a variety of allied fields can be leveraged for geophysical problems (e.g., Tan et al. 2018; see the “Transfer learning” sidebar).

  • Synergy. With different strengths and weaknesses, ML and traditional approaches can be synergistic (e.g., Krasnopolsky and Fox-Rabinovitz 2006), and best practices are being developed to optimize their combination (Reichstein et al. 2019). In this setting ML model efficiency can improve the quality and range of model results (e.g., to improve the model resolution, to extend the forecast horizon, and to increase ensemble size).

  • Flexibility. ML techniques can accommodate (i) variables that have not been (and sometimes cannot be) included in physically based models, (ii) physical constraints (like conservation laws or balance equations), (iii) processes that are nonlinear, (iv) non-Gaussian observation errors, and (v) empirical data for processes for which the true physics is poorly understood (Krasnopolsky 2013).

  • Ease of use. Modern ML techniques are supported by powerful libraries in modern coding languages, notably the Python-based TensorFlow (www.tensorflow.org/learn) and Keras (https://keras.io), making current tools much more powerful and easier to use than previous generations.

Based on these advantages, we will see in the examples presented in this and following sections that ML offers new paradigms that enable the use of a large fraction of available data, produce highly efficient and accurate parameterizations, provide a wide range of tailored products to users, and meet the challenges of NWP in other ways.

TRANSFER LEARNING

Transfer learning techniques are useful in classification problems where the filters and weights of the high-level pretrained network layers are “fixed” (frozen) while output layers are added, modified, or adjusted for classification or regression in another application. During inference, the network keeps those pretrained layers as part of its architecture.

Transfer learning techniques are also used to optimize the weights of completely different network architectures in applications such as neural style transfer, GANs, image in-painting, and ML morphing/super slow motion. In these cases, during training, images reconstructed by the network are projected through the high-level feature spaces of pretrained networks and compared with the true images projected through the same feature spaces. That is, the differences between the projections of the true and reconstructed images through these high-level pretrained layers serve as penalty terms in the optimization of the weights for these different purposes, and the pretrained layers are completely bypassed during model inference after training.
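The head-retraining mechanics described in this sidebar can be sketched, assuming NumPy and entirely synthetic data. A frozen hidden layer supplies features (random tanh features here stand in for a genuinely pretrained base trained on a large source dataset), and only a new output layer is fit for the new task, which reduces to a cheap linear solve.

```python
# Toy transfer learning: freeze the feature extractor, refit the head.
import numpy as np

rng = np.random.default_rng(2)

def hidden(x, W1, b1):
    """Frozen feature extractor: one tanh hidden layer."""
    return np.tanh(x @ W1 + b1)

# "Pretrained" base weights (random features as a stand-in; a real base
# would come from training on a large source dataset). These are frozen.
W1 = rng.normal(0.0, 3.0, size=(1, 64))
b1 = rng.uniform(-3.0, 3.0, size=64)

# New task with a modest labeled sample: y = cos(2*pi*x).
x_new = rng.uniform(0.0, 1.0, size=(200, 1))
y_new = np.cos(2.0 * np.pi * x_new[:, 0])

# Retrain only the output layer: linear least squares on frozen features.
H = np.column_stack([hidden(x_new, W1, b1), np.ones(200)])
head, *_ = np.linalg.lstsq(H, y_new, rcond=None)

# Evaluate on held-out points.
x_test = np.linspace(0.0, 1.0, 200)[:, None]
H_test = np.column_stack([hidden(x_test, W1, b1), np.ones(200)])
pred = H_test @ head
rmse = float(np.sqrt(np.mean((pred - np.cos(2.0 * np.pi * x_test[:, 0])) ** 2)))
print(f"held-out RMSE after retraining only the head: {rmse:.3f}")
```

Because the frozen layers never change, the expensive nonlinear optimization of the base is paid only once, by whoever pretrained it.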

As a first example, for the use of satellite radiance or brightness temperature (BT) observations, the advantages of ML for computation acceleration (specialized processors) and improved algorithms (e.g., by using training datasets derived from highly accurate physically based models that are too expensive to use in applications for remote sensing and parameterizations) can improve overall accuracy and quality within tightening latency requirements. DA systems employ forward models to simulate observations in order to calculate the observation innovations (observed value minus that simulated from the prior estimate of the state) that are key to calculating the analysis increment in the DA analysis update step. For a satellite BT observation, the forward model simulates RT through the atmosphere, including the effect of the land or ocean boundary. Purely first-principles calculations are theoretically possible but are inefficient and stymied by our lack of knowledge of the atmospheric scatterers (aerosols and hydrometeors) and land surface properties. Clear-sky atmospheric absorption can be calculated very accurately using line-by-line RT models, but these are too slow for practical use. Further, there are numerous mechanisms that are poorly understood and poorly modeled (i.e., parameterized). For example, the emissivity of the land surface must be estimated or specified for RT calculations, but depends in a complicated way on the soil type, soil moisture, vegetation type, vegetation health, and vegetative phenology. Consequently, in DA and forecast systems, we make approximations and tune our estimates and parameterizations, often in ad hoc ways that involve some assumptions. For real-time DA and forecast applications, fast versions of RT are generally used, and these rely on parameterizations that are based on approximations and require tuning, both of which introduce errors. 
In this situation, ML can be a game changer: orders-of-magnitude execution acceleration allows a larger set of observations to be exploited and provides an opportunity for improved accuracy/quality (e.g., Chevallier and Mahfouf 2001). This improved accuracy/quality is obtained by training the ML networks using data simulated by the most accurate RT calculations—once trained, the ML networks will be exceedingly fast, regardless of how computationally expensive the reference calculations used to generate the training set were. All of this comes with several caveats—for example, the ML models must be accurate for all situations, including very rare cases. The next section discusses the caveats, concerns, and mitigations for the ML approach. Furthermore, it must be stressed that continued research is needed to create highly accurate and detailed model physics, which can aid understanding of physical processes and which can serve as a source of simulated data to improve fast parameterizations, whether these are conventional model parameterizations or very fast and accurate ML emulations.
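The train-once-offline, run-fast-online pattern described above can be illustrated with a toy sketch, assuming NumPy. An expensive reference forward model (a fine-grained numerical integral standing in for a line-by-line RT code) generates a training set once; a cheap fitted emulator (a simple polynomial standing in for an ANN) then reproduces it online. The functions and tolerances are arbitrary illustrations.

```python
# Toy emulation of an expensive forward model by a cheap fitted surrogate.
import time
import numpy as np

def reference_model(x):
    """Expensive stand-in for a line-by-line calculation: a 20,000-point
    trapezoidal integral evaluated separately for each input value."""
    t = np.linspace(0.0, 1.0, 20000)
    dt = t[1] - t[0]
    out = []
    for xi in np.atleast_1d(x):
        vals = np.exp(-xi * t) * np.cos(3.0 * t)
        out.append(np.sum(0.5 * (vals[1:] + vals[:-1])) * dt)
    return np.array(out)

# Offline: generate a training set once with the accurate (slow) model.
x_train = np.linspace(0.0, 2.0, 50)
y_train = reference_model(x_train)

# "Train" the emulator (a polynomial fit stands in for ANN training).
coeffs = np.polyfit(x_train, y_train, deg=8)

# Online: compare accuracy and speed on fresh inputs.
x_new = np.linspace(0.05, 1.95, 400)
t0 = time.perf_counter(); y_ref = reference_model(x_new); t_ref = time.perf_counter() - t0
t0 = time.perf_counter(); y_emu = np.polyval(coeffs, x_new); t_emu = time.perf_counter() - t0

max_err = float(np.max(np.abs(y_emu - y_ref)))
print(f"max emulation error: {max_err:.2e}; speedup: ~{t_ref / max(t_emu, 1e-9):.0f}x")
```

The emulator's online cost is independent of how expensive the reference calculations were, which is exactly the property exploited when emulating line-by-line RT.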

A particularly powerful advantage of ML to make rapid advances in geophysics is to leverage existing knowledge from applications in other fields (hereafter the allied fields), through transfer learning. That is, geophysical applications, including characterization of environmental systems, NWP, climate projections, and situational awareness (SA; e.g., providing the right information at the right time to decision-makers), can benefit from mature, successful applications with similar traits in the allied fields. In particular, we note a few examples of how current AI systems combine information from multiple sources to predict outcomes. These examples are in medicine: the Watson Project (Chen et al. 2016), image processing (Razzak et al. 2018), cancer detection (e.g., Manogaran et al. 2018), electrocardiogram (EKG) analysis (Mincholé and Rodriguez 2019); in finance: algorithmic trading (Huang et al. 2019), stock market analysis and prediction (Chong et al. 2017), and portfolio management (Ban et al. 2018); in natural language processing: speech separation (Wang and Chen 2018), signal extraction in noisy environments (Z. Zhang et al. 2018); in music: automatic composition in any desired style (Mao et al. 2018); and in autonomous vehicles: the real-time fusion of multiple observations for SA (Golestan et al. 2016). These applications are dependent on recently improved and expanded range of ML architectures, improved ML training algorithms, and Big Data for training. There are strong connections between problems in both satellite remote sensing and NWP and the allied fields through shared fundamental problems—such as forward and inverse problems, morphing, mapping, and pattern recognition. For example, the approaches useful in facial recognition are also useful in identifying meteorological features such as hurricanes (Racah et al. 2017; Wimmers et al. 2019; Gagne et al. 2019). So far, we have discussed the reuse of methods, which might be termed meta-transfer learning. 
In fact, it is possible to reuse actual tools: the ML technique of transfer learning (see sidebar “Transfer learning”) is a powerful leveraging mechanism that takes an existing network trained for one task and refines it for another task by retraining only the final stages of that network for the new task (Tan et al. 2018).

The modern ML techniques most directly applicable to satellite remote sensing and NWP problems include convolutional neural networks (CNNs; Gu et al. 2018; Yamashita et al. 2018), deep-learning neural networks (DNNs; LeCun et al. 2015), and computer vision (O’Mahony et al. 2019), with its subelements of motion estimation (e.g., Hopkins et al. 2018), object recognition (Han et al. 2018), and video tracking (e.g., Brunetti et al. 2018). CNNs are a specialized form of DNNs that are often useful for image processing and computer vision applications. Identifying existing ML approaches and techniques that have been successfully applied to these issues in the allied fields, and then adapting them to related geophysical problems, is an effective way to make significant progress quickly.

CAVEATS IN USING ML TO MEET THE CHALLENGES IN SATELLITE REMOTE SENSING AND NWP.

The key concerns and caveats when applying ML are as follows:

  • Will the ML model be reliable, or will it be less accurate or even fail for rare or unusual cases? ML models can only learn what is in their training datasets.

  • Can ML models satisfy constraints based on physical principles (e.g., conservation of mass)?

  • How can AI overcome the trust barrier, that is, the reluctance of some to accept ML model output if they cannot understand the action of the network hidden layers?

  • Can ML models be extended to produce uncertainty estimates (i.e., error bars)?

  • Can ML development be disciplined enough to produce reproducible results?

  • Can ML models be easily integrated into operational procedures?

With regard to reliability, that is, the ability to handle unusual situations, this concern can be mitigated by representative training. Since ML models are unreliable when extrapolating far beyond the domain covered by the training set, the training set should be representative, that is, large and diverse. Without a representative training set, one could accidentally introduce a bias whereby a specific feature (or signature) limited or absent in the training set would become associated by the trained model with other features in the training set that have enough similarities to the specific feature. Thus, the choice of training dataset and specific input variables can be critical (e.g., Millard and Richardson 2015). When simulations are available, for example, from a realistic nature run, the quality of the simulations is critical because the ML model trained on simulated data will be applied in reality. Since network inputs often have high dimensionality, measured in hundreds or more, it is important to employ techniques that maximize the training set applicability. For instance, modern ML transfer learning techniques can benefit by taking an existing network trained on millions of images and tailoring the last layer of that network to recognize, for example, hurricanes in satellite imagery. In addition, ensemble regularization is required for reliability in some situations. Nonlinear extrapolation is an ill-posed problem that requires regularization to provide meaningful results. Problems can occur for rare inputs, when the environment is nonstationary, or for systems that change with time, for example, under a future climate change scenario. One approach to avoid errors in these situations is for the ML model to include dynamical adjustments.
Alternatively, an ensemble of ML models based on different architectures and training sets can regularize such extrapolation and deliver ML results that remain stable when the inputs approach or cross the boundary of the training domain.
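A minimal sketch of this ensemble idea, with invented one-dimensional data: members differing in architecture (here, polynomial degree) and in bootstrap training samples agree inside the training domain, and their spread flags extrapolation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Training domain: x in [0, 1]; predictions will be requested both
# inside this interval and beyond it (extrapolation).
x_train = rng.uniform(0.0, 1.0, size=80)
y_train = np.sin(2 * np.pi * x_train) + 0.05 * rng.normal(size=80)

# Ensemble members differ in architecture (polynomial degree) and in
# bootstrap training subsets.
def fit_member(deg, seed):
    idx = np.random.default_rng(seed).integers(0, 80, size=80)
    return np.polyfit(x_train[idx], y_train[idx], deg)

members = [fit_member(deg, seed) for deg in (3, 5, 7) for seed in (0, 1, 2)]

def ensemble_predict(x):
    preds = np.array([np.polyval(c, x) for c in members])
    return preds.mean(axis=0), preds.std(axis=0)   # mean and spread

_, spread_inside = ensemble_predict(np.array([0.5]))
_, spread_outside = ensemble_predict(np.array([1.5]))  # beyond the domain
```

The ensemble mean regularizes the prediction, while a large spread warns that the inputs have approached or crossed the boundary of the training domain.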

To enforce conservation laws, ML training can include cost functions that measure the discrepancy from the conservation laws either as a strong constraint using the method of Lagrange multipliers during the minimization or as a weak constraint included in the overall cost function (e.g., Tompson et al. 2017).
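A weak constraint of this kind amounts to adding a penalty term to the training cost. The sketch below (synthetic data; a linear model trained by gradient descent, for brevity) penalizes the squared departure of the summed outputs from the conserved total:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic task: split a total mass x0 among three layers; the true
# outputs conserve mass (they sum to x0) apart from small noise.
X = rng.uniform(0.5, 1.5, size=(256, 2))
fracs = np.array([0.5, 0.3, 0.2])
Y = X[:, :1] * fracs + 0.01 * rng.normal(size=(256, 3))

W = np.zeros((2, 3))       # linear model: Y_hat = X @ W
lam, lr = 5.0, 0.02        # constraint weight and learning rate
for _ in range(3000):
    pred = X @ W
    err = pred - Y                           # data misfit
    cons = pred.sum(axis=1) - X[:, 0]        # conservation residual
    # Gradient of  mean|err|^2 + lam * mean(cons^2); the constraint
    # term contributes identically to every output column.
    grad = (2 / len(X)) * (X.T @ err) + (2 * lam / len(X)) * (X.T @ cons[:, None])
    W -= lr * grad

residual = np.abs((X @ W).sum(axis=1) - X[:, 0]).mean()
fit_rmse = np.sqrt(np.mean((X @ W - Y) ** 2))
```

The weight lam controls the trade-off between fitting the data and honoring the conservation law; taking lam to infinity recovers the strong-constraint limit.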

With regard to trustworthiness, nonlinear statistical models of any type are difficult to understand and interpret. However, in response to the concern that ML models are black boxes that cannot be interpreted or understood, efforts are underway to develop explainable AI (McGovern et al. 2019; Samek et al. 2017; Toms et al. 2019) and physics guided neural networks (Ding 2018; Karpatne et al. 2018; Beucler et al. 2019). These developments have the potential to explain the connections in Big Data that ML has extracted, and to uncover new physical phenomena.

With regard to uncertainty estimation, Ghahramani (2015) states that ML must be able to represent and manipulate uncertainty about models and predictions. Gaussian process regression (GPR; Pasolli et al. 2010) is, roughly speaking, the ML version of variational assimilation. A natural output of the GPR algorithm is the uncertainty of its estimate, a quantity that is difficult to obtain from ordinary variational analysis approaches.
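The point about uncertainty can be made concrete with a few lines of GPR, implemented directly from the standard posterior formulas with a squared-exponential covariance (toy data; hyperparameters chosen by hand rather than fit):

```python
import numpy as np

rng = np.random.default_rng(3)

# Noisy observations of a smooth curve, with a data gap near x = 0.55.
x_obs = np.array([0.1, 0.3, 0.4, 0.7, 0.9])
y_obs = np.sin(2 * np.pi * x_obs) + 0.05 * rng.normal(size=5)

def rbf(a, b, ell=0.15):
    """Squared-exponential covariance between point sets a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

noise_var = 0.05 ** 2
K = rbf(x_obs, x_obs) + noise_var * np.eye(len(x_obs))

x_new = np.linspace(0.0, 1.0, 101)
Ks = rbf(x_new, x_obs)                       # cross-covariance
mean = Ks @ np.linalg.solve(K, y_obs)        # posterior mean
var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)  # posterior variance

sigma_at_obs = np.sqrt(var[30])   # x = 0.30, next to an observation
sigma_in_gap = np.sqrt(var[55])   # x = 0.55, inside the data gap
```

The posterior standard deviation comes out alongside the mean at no extra conceptual cost: it is small near observations and grows in data gaps, exactly the error bar a variational analysis struggles to supply.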

With regard to reproducibility, note that AI techniques eliminate some forms of human error inherent in many traditional DA techniques, in the sense that those techniques require manual coding of human estimates, based on intuition and approximation, to implement a solution to what is fundamentally an ML problem: to numerically (as opposed to analytically or symbolically) minimize an objective metric of errors. While ML is not prone to the errors that can occur in traditional techniques, the ML design and training processes require some experimentation and tuning.

With regard to ease of integration, this concern places additional demands on the ML software system used for development. For example, developing an ML parameterization of model physics sometimes requires a very specific normalization of outputs or a rearrangement of the loss function (e.g., in the case of missed outputs or physical constraints). Note that in some operational cases, the trained tool must be converted to another programming language to be consistent with the parent application.

EXAMPLES OF MODERN ML APPLICATIONS TO SATELLITE REMOTE SENSING AND NWP.

Figure 2 presents the value-adding processing chain from observation preparation to nowcasts and NWP forecast postprocessing. The examples in this section (as well as the example of the forward problem described in the “Advantages of ML to meet the challenges in satellite remote sensing and NWP” section) explore the use of ML to enhance several of the different steps in this chain (color coded in the figure). Boukabara et al. (2017, 2018, 2019a) described other examples showing how ML techniques can be applied to many components of the NWP processing chain shown in Fig. 2, including quality control, geophysical information extraction, RT modeling (even in precipitating conditions), and other components.

Fig. 2.

Synopsis of the NWP processing chain, from secure ingest of satellite data to the generation of forecasts at different time scales. Color coding indicates which components are discussed in the text.

Citation: Bulletin of the American Meteorological Society 100, 12; 10.1175/BAMS-D-18-0324.1

Secure data ingest: Satellite observation gap filling.

Gap filling for satellite imagery is often required due to incomplete coverage or sensor problems. For example, Chang et al. (2015) use the extreme learning machine (ELM) approach to fill in gaps due to cloudiness in Moderate Resolution Imaging Spectroradiometer (MODIS) reflectance data. [ELMs are fast to tune because all weights except the output layer weights are random, but ELMs are typically less accurate than other architectures (Cao et al. 2018).] For ocean biology, Krasnopolsky et al. (2016) used a shallow artificial neural network (ANN) to fill gaps in global ocean color imagery due to incomplete 24-h coverage from polar-orbiting sensors. For air quality, R. Zhang et al. (2018) used random forests to fill gaps in satellite-retrieved aerosol optical depth.
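Covariate-based gap filling of this kind can be sketched with a synthetic scene, with ordinary least squares standing in for the random forest or ELM (all fields, masks, and covariates below are invented for illustration): a regression from co-located covariates to the observed field is trained on clear pixels and applied at cloudy ones:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic scene: a smooth field on a 50x50 grid plus two co-located,
# imperfect covariate fields (standing in for, e.g., meteorological and
# surface covariates of aerosol optical depth).
yy, xx = np.mgrid[0:50, 0:50] / 50.0
field = np.sin(3 * xx) * np.cos(2 * yy)
cov1 = np.sin(3 * xx) + 0.1 * rng.normal(size=(50, 50))
cov2 = np.cos(2 * yy) + 0.1 * rng.normal(size=(50, 50))

# Mask 30% of pixels as "cloudy" (missing).
mask = rng.uniform(size=(50, 50)) < 0.3

# Train on clear pixels: features are the covariates and their product.
feats = np.stack([np.ones_like(cov1), cov1, cov2, cov1 * cov2], axis=-1)
w, *_ = np.linalg.lstsq(feats[~mask], field[~mask], rcond=None)

# Fill the gaps with the regression prediction.
filled = field.copy()
filled[mask] = feats[mask] @ w
gap_rmse = np.sqrt(np.mean((filled[mask] - field[mask]) ** 2))
```

A random forest or ANN would replace the least-squares step when the covariate-field relationship is strongly nonlinear.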

For temporal gaps, the obvious solution, linear-in-time interpolation, can fail severely for features of interest that propagate (Hoffman and Leidner 2010). Computer vision algorithms can effectively interpolate nonlinearly to fill in temporal gaps, for example, to generate smooth and realistic video visualizations from infrequent model archives.
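The failure of linear-in-time interpolation for propagating features, and the motion-estimation remedy, can be seen in a one-dimensional sketch (a Gaussian blob moving at constant speed; the displacement is estimated here by simple cross-correlation rather than by a trained model, for brevity):

```python
import numpy as np

# Two frames of a propagating feature (a Gaussian blob moving from
# x = 3 to x = 5); the frame at the midpoint time is missing.
x = np.linspace(0.0, 10.0, 256)
blob = lambda c: np.exp(-0.5 * ((x - c) / 0.4) ** 2)
f0, f1, f_true = blob(3.0), blob(5.0), blob(4.0)

# Linear-in-time interpolation smears the feature into two weak copies.
f_lin = 0.5 * (f0 + f1)

# Motion-aware interpolation: estimate the displacement by
# cross-correlation, then advect each frame halfway and blend.
lags = np.arange(-128, 128)
scores = [np.sum(f0 * np.roll(f1, -l)) for l in lags]
shift = int(lags[int(np.argmax(scores))])      # displacement in grid points
f_mot = 0.5 * (np.roll(f0, shift // 2) + np.roll(f1, -(shift - shift // 2)))

err_lin = np.abs(f_lin - f_true).max()
err_mot = np.abs(f_mot - f_true).max()
```

Advecting each frame halfway before blending preserves the feature, whereas the linear blend leaves two weak copies and nothing at the true midpoint position.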

Preprocessing and inversion: Remote sensing retrievals.

Using AI in remote sensing has a long history (e.g., Krasnopolsky et al. 1995; Abuelgasim et al. 1998; Aires et al. 2002) and is considered mature. It can, however, benefit further from modern ML tools that were developed for other fields. An example of the use of modern ML techniques, in this case DNNs, is the AI-based pilot project that parallels the Multi-Instrument Inversion and Data Assimilation Preprocessing System (MIIDAPS) enterprise algorithm (Jones et al. 2019). Tests of applications such as MIIDAPS-AI, in simulation as well as with real data, compare well with conventional approaches in terms of data fit, spatial coherence, and interparameter correlations, but with execution speeds that are often two orders of magnitude faster.

Quality control.

Current quality control (QC) procedures have been built up over time and adjusted to account for multiple sensors and instances of sensors. These procedures encapsulate accumulated, sometimes ad hoc, knowledge in what are often opaque, undocumented (or incorrectly documented) computer programs. Machine learning can be effective in detecting and correcting observation problems, and it offers an opportunity to rethink both QC and preprocessing by mining the substantial archives of observation innovations (i.e., observation minus background differences) from operational DA systems.
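As a sketch of mining innovation archives for QC, the toy example below (synthetic innovations; logistic regression as a stand-in for more capable ML classifiers) learns to flag observations whose innovations are statistically inconsistent with the good population:

```python
import numpy as np

rng = np.random.default_rng(6)

# Innovations (observation minus background): good observations have
# small, unbiased innovations; bad ones are large and biased.
n = 1000
good = rng.normal(0.0, 1.0, size=n)
bad = rng.normal(4.0, 2.0, size=n // 5)
d = np.concatenate([good, bad])
labels = np.concatenate([np.zeros(n), np.ones(n // 5)])   # 1 = reject

# Features: the innovation and its square, so the learned decision
# boundary need not be one-sided.
F = np.stack([np.ones_like(d), d, d ** 2], axis=1)

w = np.zeros(3)
for _ in range(5000):              # logistic regression by gradient descent
    p = 1.0 / (1.0 + np.exp(-F @ w))
    w -= 0.02 * F.T @ (p - labels) / len(d)

flagged = 1.0 / (1.0 + np.exp(-F @ w)) > 0.5
accuracy = np.mean(flagged == (labels == 1))
```

In an operational setting the labels would come from the archived decisions of the existing QC system (or from forecast verification), and the feature set would include channel, scan position, surface type, and similar metadata.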

Data assimilation.

Analysis of observations, whether in the context of DA or data fusion, can benefit from modern ML techniques. Current DA systems typically use only a small fraction of the available observations to limit processing time and to avoid estimating and making use of observation correlations. ML can enhance this process. For example, fast ML emulations of forward models (both RT and empirical forward models) can be used for direct assimilation of satellite measurements (e.g., Chevallier and Mahfouf 2001). Also ML observation operators can be used to instantaneously propagate surface observations vertically (Krasnopolsky 2013).

Data fusion.

Data fusion combines or converts observations and imagery into new information, as in the following diverse examples. Li et al. (2018) identify hurricane damage to buildings in imagery using the results of unsupervised pretraining based on convolutional autoencoders (CAEs) to fine-tune CNNs. Hengl et al. (2017) trained an ensemble of ML networks to produce high-resolution (250 m) operational global maps of six soil properties at seven vertical levels; the inputs were 150,000 soil profiles and 158 remotely sensed soil covariates. Landschützer et al. (2013) created 1°-resolution maps of the partial pressure of carbon dioxide (pCO2) in a two-step process based on observations and estimates of sea surface temperature, sea surface salinity, mixed layer depth, chlorophyll a, and atmospheric CO2 concentration. First, 16 biogeochemical provinces were identified using the self-organizing map (SOM) method (Kohonen 2001). Then a feedforward network for each province estimated the pCO2, essentially parameterizing pCO2 in terms of the input variables.
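The two-step strategy of Landschützer et al., cluster first and then regress within each cluster, can be sketched with synthetic data, using a small k-means loop as a stand-in for the SOM and per-cluster linear fits in place of the feedforward networks:

```python
import numpy as np

rng = np.random.default_rng(8)

# Two "provinces" with different driver-target relationships (think
# SST vs pCO2); the second input coordinate separates the provinces.
n = 600
province = rng.integers(0, 2, size=n)
x = np.stack([rng.uniform(0.0, 1.0, size=n),
              province + 0.1 * rng.normal(size=n)], axis=1)
y = np.where(province == 0, 2.0 * x[:, 0], -1.5 * x[:, 0] + 2.0)
y = y + 0.05 * rng.normal(size=n)

# Step 1: cluster the input space (k-means with a far-apart start,
# standing in for the SOM).
c0 = x[0]
c1 = x[np.argmax(((x - c0) ** 2).sum(axis=1))]
centers = np.stack([c0, c1])
for _ in range(20):
    lab = ((x[:, None, :] - centers[None]) ** 2).sum(-1).argmin(axis=1)
    centers = np.array([x[lab == k].mean(axis=0) for k in (0, 1)])

# Step 2: fit a separate regression within each cluster.
preds = np.empty(n)
for k in (0, 1):
    sel = lab == k
    A = np.stack([np.ones(sel.sum()), x[sel, 0]], axis=1)
    wk, *_ = np.linalg.lstsq(A, y[sel], rcond=None)
    preds[sel] = A @ wk

rmse = np.sqrt(np.mean((preds - y) ** 2))
```

A single global regression would average the two opposing relationships away; clustering first lets each province keep its own parameterization.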

Fast and accurate emulations of model physics.

Model physics calculations are a computational bottleneck in numerical models. For example, accurate radiative transfer calculations are time consuming for retrievals, DA, and model parameterizations. For some parts of the spectrum (e.g., the shortwave infrared) and for some atmospheric conditions (e.g., in the presence of cloud), radiative transfer calculations become more complex, time consuming, and error prone. We already discussed the forward problem used in retrievals and DA in the “Advantages of ML to meet the challenges in satellite remote sensing and NWP” section. Here we discuss the parameterization of radiative heating (and cooling) in NWP models. Other physics parameterizations for NWP are discussed below (in the “Short- and medium-range forecasting: Subgrid-scale model physics” section).

Often in an NWP model we can only afford to calculate radiative heating every hour; the ghost of the initial cloud field then haunts the radiative heating as the model evolves during each hour. ML models can speed up these calculations significantly and can be made more accurate at the sole cost of developing larger and more accurate training samples. Examples of accurate and fast ANN emulations of longwave and shortwave radiation parameterizations have been developed for the ECMWF, NCAR, and NCEP global models (Chevallier et al. 2000; Krasnopolsky et al. 2008, 2010). Table 1 and Fig. 3 show the high accuracy and speed of ML emulations in the experiments of Krasnopolsky et al. (2008, 2010). Table 1 shows error statistics and the speedup achieved by these ANN emulations. Figure 3 compares the 17-yr time-averaged precipitation of two parallel runs of the NCEP CFS [the first with the original Rapid Radiative Transfer Model for general circulation models (GCMs) (RRTMG) radiation parameterization and the second with ANN emulations that use a single hidden layer]. The two difference plots (lower panels) compare the impact of the ANN to the impact of routine changes in the computational environment and confirm that the control minus ANN run differences are indeed small, of the order of magnitude that occurs when a new version of the FORTRAN compiler is introduced.
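The emulation recipe is simple to sketch: run the original (expensive) parameterization to build a training set, then fit a cheap network to it. Below, a deliberately loop-heavy toy column calculation stands in for the radiation code, and a single-hidden-layer network with random hidden weights and a least-squares output layer (an ELM-style fit, chosen for brevity; the operational emulations cited above use fully trained ANNs) serves as the emulator:

```python
import numpy as np

rng = np.random.default_rng(5)

def slow_heating_rate(profile):
    """Toy stand-in for an expensive radiation parameterization:
    a deliberately loop-heavy column calculation."""
    out = 0.0
    for k, t in enumerate(profile):
        out += np.exp(-0.1 * k) * np.sin(t) / (1.0 + k)
    return out

# Step 1: build a training set by running the original code.
X = rng.uniform(-1, 1, size=(2000, 40))        # 40-level "profiles"
y = np.array([slow_heating_rate(p) for p in X])

# Step 2: fit the emulator -- one hidden layer with random weights and
# a least-squares output layer.
W_h = rng.normal(scale=0.1, size=(40, 200))
b_h = rng.normal(scale=0.1, size=200)
H = np.tanh(X @ W_h + b_h)
w_out, *_ = np.linalg.lstsq(H, y, rcond=None)

emulate = lambda P: np.tanh(P @ W_h + b_h) @ w_out
rmse = np.sqrt(np.mean((emulate(X) - y) ** 2))
```

The emulator reduces the per-column cost to two matrix products, which is the source of the order-of-magnitude speedups reported in Table 1.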

Table 1.

Statistics estimating the accuracy of the heating rate calculations (in K day−1) and the computational performance (speedup) of the ANN emulation vs the original parameterization for the NCAR CAM (T42L26) (Collins et al. 2002) and the NCEP CFS (T126L64) longwave radiation (LWR) and shortwave radiation (SWR) parameterizations. RRTMG is the Rapid Radiative Transfer Model for GCMs (Clough et al. 2005). Here, the speedup is the ratio, averaged over an independent global dataset, of the timing of the original parameterization to that of the ANN emulation in a sequential, single-processor, code-by-code comparison. Speedups for deep convection cases are given in parentheses. Results are from the experiments described by Krasnopolsky et al. (2008, 2010).

Fig. 3.

NCEP CFS precipitation rates in mm day−1, 17-yr average for (a) the control run and (b) the run using ANN emulations of radiative heating rates, and differences of precipitation rates in mm day−1 for (c) the difference ANN run minus control run [(b) minus (a)] and (d) the difference between two control runs: one before and another after changing the version of the FORTRAN compiler. Results are from the experiments described by Krasnopolsky et al. (2010).


Nowcasting.

Nowcasting is the process of extrapolating current conditions, often imagery, into the near future (minutes to hours). Iskenderian et al. (2019) combine data from lightning sensors, satellite imagery, and NWP model output in a CNN framework to create seamless weather radar mosaics (data fusion) and forecasts out to 12 h (nowcasting). Shahroudi et al. (2019) combined imagery and NWP outputs in a CNN to forecast 18-h tropical cyclone track and intensity. Shi et al. (2015) developed a combination of CNN and LSTM networks for ML precipitation nowcasts (short-term extrapolations). Shi et al. (2017) extended this approach to a model that can actively learn the location-variant structure of the recurrent connections.

Short- and medium-range forecasting: Subgrid-scale model physics.

ML techniques have been used to fully replace some model parameterizations (Schneider et al. 2017; Rasp et al. 2018). ML emulation, ML enhancement, and observational ML are based on different types of training. In parameterization emulation, the training set comes from the inputs and outputs of a traditional parameterization saved from a model forecast. ML emulation techniques have been used to replace traditional subgrid-scale NWP model physics parameterizations, such as the radiative transfer (discussed in the “Fast and accurate emulations of model physics” section), convection (e.g., Krasnopolsky et al. 2013; Gentine et al. 2018; O’Gorman and Dwyer 2018), microphysics, and superparameterization (Rasp et al. 2018). To exceed the accuracy of emulation, enhanced training datasets can be created using data simulated by higher resolution models like large-eddy simulations (LESs) and cloud-resolving models (CRMs; Krasnopolsky et al. 2013; Gentine et al. 2018; Rasp et al. 2018). Training datasets are also based on observations. For example, an ANN-based empirical biological model, which provides biological feedback to the ocean model, was developed (Krasnopolsky et al. 2018) using satellite-derived ocean color data and a combination of satellite and in situ measurements of upper-ocean physical parameters. In yet another approach, Brenowitz and Bretherton (2018) replaced the entire model physics with an ANN trained to estimate the apparent source terms of heat and moisture by minimizing the prediction errors of the model.

When appropriate, the hybridization of physics-based and ML-based techniques offers a direct and powerful path to advance the fields of satellite remote sensing and NWP (e.g., Krasnopolsky and Fox-Rabinovitz 2006). However, in cases such as forecasting the weather a week in advance, that is, in complex and chaotic problems with huge state vectors, proper training of ML networks will likely face significant challenges due to the size of the state space and the need for huge training sets [although this has been done for simple atmospheric models, e.g., Scher (2018)]. ML techniques are, however, well suited to correcting traditional models (see the “Post-forecast processing and correction” section) or to working in conjunction with traditional physically based systems (Reichstein et al. 2019). In a novel example of the hybrid approach, Tompson et al. (2017) accelerate the solution of the incompressible Navier–Stokes equations with a CNN that solves the Poisson equation for the pressure at the forward time step. This is also an example of enforcing a physical constraint, since in this case the ML model was trained to minimize the forward time step divergence.

Post-forecast processing and correction.

Post-forecast processing and correction is an important method of improving forecasts, either on the original forecast grid or by adapting to local conditions [as in the model output statistics (MOS) method]. ML is well equipped to detect, evaluate, and therefore correct errors made by physics-based models, to adapt forecasts from large-scale models to local conditions, and to nonlinearly average ensembles of forecasts. For ensemble forecast systems, ML methods to correct both bias and spread are being recognized as powerful tools (Krasnopolsky and Lin 2012; Campos et al. 2019; Rasp and Lerch 2018; Fan et al. 2019). Rasp and Lerch (2018) applied this technique to 2-m temperature ensemble forecasts over Germany, obtaining improvements similar to those of the standard (linear regression) technique. This is important since, compared to MOS, the ML approaches are easier to maintain and adapt to changing conditions. For example, ML approaches are easy to implement in a sequential mode that automatically adapts to model and climate changes.

Figure 4 shows the results of nonlinear ANN averaging of 24-h forecasts of precipitation over the continental United States from an eight-member multimodel ensemble (Krasnopolsky and Lin 2012). Nonlinear ANN averaging of the multimodel ensemble (Fig. 4c) removes a significant part of the false low-intensity precipitation produced by simple arithmetic averaging (Fig. 4b) and sharpens the precipitation features, enhancing fronts and maxima. ANN averaging produces a forecast that is very close to that produced by a human analyst (Fig. 4d).
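A toy version of nonlinear ensemble averaging (synthetic precipitation "truth" and biased members; a quadratic-feature regression standing in for the ANN of Krasnopolsky and Lin) shows the mechanism by which a trained combination beats the arithmetic mean:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic 24-h precipitation truth and an eight-member multimodel
# ensemble: each member is damped, biased, and noisy, producing the
# spurious light precipitation that plain averaging retains.
n = 5000
truth = np.maximum(0.0, rng.gamma(0.6, 8.0, size=n) - 3.0)
members = np.stack(
    [np.maximum(0.0, 0.8 * truth + b + rng.normal(1.0, 2.0, size=n))
     for b in np.linspace(-1.0, 2.0, 8)], axis=1)

mean_fc = members.mean(axis=1)                 # simple arithmetic average

# Nonlinear averaging: regression on the members and their squares,
# trained on the first half of the cases, verified on the second half.
F = np.concatenate([np.ones((n, 1)), members, members ** 2], axis=1)
half = n // 2
w, *_ = np.linalg.lstsq(F[:half], truth[:half], rcond=None)
nn_fc = np.maximum(0.0, F @ w)

rmse_mean = np.sqrt(np.mean((mean_fc[half:] - truth[half:]) ** 2))
rmse_nn = np.sqrt(np.mean((nn_fc[half:] - truth[half:]) ** 2))
```

The trained combination learns both the member biases and the nonlinearity introduced by the zero lower bound, which is why it suppresses the false drizzle that survives the arithmetic mean.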

Fig. 4.

Comparison of (a) the CPC analysis of precipitation for the 24-h period ending at 1200 UTC 24 Oct 2010 to three forecasts: (b) the arithmetic mean of the precipitation forecasts provided by eight models, (c) the nonlinear ANN average of the same models, and (d) the prediction by a human analyst who used satellite images and ground observations in addition to model forecasts. The color bar shows the precipitation rate from 0 to 175 mm day−1. Different shades of green correspond to precipitation rates from 1 to 10 mm day−1 and different shades of red to rates from 25 to 75 mm day−1. After Krasnopolsky and Lin (2012, their Fig. 6).


The paradigm of correcting errors can be applied in many situations, including corrections of observation, retrieval, and analysis biases. Error correction has been relatively successful using simple ANNs. In the future there will be opportunities to leverage new approaches, including U-Nets, CNNs, and generative adversarial networks (GANs; Creswell et al. 2018). In particular, for forecast correction, ML techniques that recognize patterns are applicable in cases where forecast errors include a displacement component.

CONCLUDING REMARKS.

ML approaches have been shown to be useful in many aspects of environmental prediction, including statistical weather forecasting, numerical weather prediction, and climate projection. Further, the use of ML by the satellite remote sensing and NWP community is ripe for more rapid advances due to recent progress and successes in applications in the allied fields (video gaming, medicine, finance, autonomous vehicles, natural language, etc.). The process of ML adoption will be accelerated by using lessons learned from previous studies of how humans adopt and adapt to new computer processes. For example, in a study of extreme convective event nowcasting, Karstens et al. (2018) conclude that human–ML systems should provide clear presentations, allow the human flexibility to choose and combine different tools, allow the human the ability to correct the ML results, and provide an explanation of how the ML results were obtained.

Without AI or some other transformative approach, difficulties due to the increasing amount of data are on track to be exacerbated in the future by the rapidly enlarging IoT, by the commoditization of space technology and the accompanying increase in the number of satellites observing the Earth, and by the increase in the capabilities of sensors in terms of higher spatial and spectral density and resolution. The efficiency of ML, on the other hand, will allow assimilating high spatial-, temporal-, and spectral-resolution data that are now either aggressively thinned or not considered at all, thereby substantially increasing the spatiotemporal resolution and accuracy of ML-based data processing algorithms.

While AI sometimes provides novel capabilities, in many cases AI methods replace or enhance existing methods. This will blur the distinction between AI and physical science and increase collaborative hybrid approaches combining physical and data science perspectives. For example, in the future, we expect to see a cohesive and synergistic coexistence of physical models with ML enhancements. Perhaps the most direct example of this is to train ML techniques on previous forecast errors. The ML technique can then provide a post-forecast correction for the human-induced uncertainties introduced when modeling, when implementing systems, and even when issuing advisories. That is, the AI system, having learned from all past mistakes, can provide consultation and support to human practitioners by highlighting current anomalies and providing evidence for reconsideration of the current forecast (in the weather domain, or the current diagnosis in the medical realm).

A concern is that ML techniques may not be as reliably accurate as methods already in use. That is, how can ML improve on NWP models that are based on the laws of physics (conservation of mass, momentum, etc.) and DA systems that are based on optimal Bayesian estimation? In fact, it can, and both physics-based models (parameterizations, etc.) and optimal estimation techniques (variational methods, minimization, morphing, etc.) as implemented in operational systems have a lot in common with ML techniques. Already, many DA techniques overlap with ML techniques. Since DA techniques are prone to errors of implementation, misunderstandings of physical processes, errors of approximation, etc., they might be considered “hand-crafted” or “artisanal” ML models. For example, 3D- and 4D-variational data assimilation are based on clear Bayesian principles, but their implementation is rife with approximations and assumptions (e.g., that errors are Gaussian, that linearization errors are small, that solutions are unique). Modern ML provides the opportunity to build on existing techniques, uncover their limitations, and possibly correct their implementation flaws.

Finally, in order for AI to be adopted by geophysicists, weather forecasters, and emergency responders, the ML tools must be trustworthy, reliable, and accurate: the uncertainty of their output needs to be quantified, their results need to be reproducible and explainable, and their architecture and configuration need to be mathematically understandable. All these issues are critical for AI in general and are active fields of study in the AI community at large. Successes from such studies will lead to more widespread acceptance and adoption of AI in the fields of remote sensing and NWP and in environmental science in general.

ACKNOWLEDGMENTS

The authors thank their many colleagues who contributed by their interactions, peer reviews, and suggestions, including David M. Hall (Nvidia) and others. We also thank the journal editor, Dr. Robert G. Fovell, and the three peer reviewers for their thoughtful and thorough comments and suggestions, which greatly improved the manuscript. We gratefully acknowledge support for this work provided by NOAA, including support under the Scientific and Technical Services II Federal Contracting Vehicle (DOCDG133E12CQ0020) and, under the auspices of the cooperative institutes given in the author affiliations on the title page, through Cooperative Agreements NA14NES4320003 (CISESS) and NA14OAR4320125 (CIRA), as well as support providing access to specialized computational processing units.

APPENDIX: LIST OF ACRONYMS.

Acronyms used in the text are listed here. Common acronyms (e.g., UTC and RMSE) and proper names (e.g., names of specific institutions and systems such as NASA and MicroMAS) are not expanded in the text when first used.

AE

Autoencoder

AI

Artificial intelligence

ANN

Artificial neural network

BT

Brightness temperature

CAE

Convolutional autoencoder

CAM

Community Atmosphere Model

CFS

Climate Forecast System

CIRAS

CubeSat Infrared Atmospheric Sounder

CNN

Convolutional neural network

CPC

Climate Prediction Center

CRM

Cloud-resolving model

DA

Data assimilation

DNN

Deep-learning neural network

ECMWF

European Centre for Medium-Range Weather Forecasts

EKG

Electrocardiogram

ELM

Extreme learning machine

ESM

Earth system model

FF

Feedforward

FORTRAN

Formula Translating System

GAN

Generative adversarial network

GCM

General circulation model

GPR

Gaussian process regression

GPU

Graphical processing unit

LES

Large-eddy simulation

LSTM

Long short-term memory network

LWR

Longwave radiation

MicroMAS

Micro-sized Microwave Atmospheric Satellite

MIIDAPS

Multi-Instrument Inversion and Data Assimilation Preprocessing System

ML

Machine learning

MODIS

Moderate Resolution Imaging Spectroradiometer

MOS

Model output statistics

NCAR

National Center for Atmospheric Research

NCEP

National Centers for Environmental Prediction

NWP

Numerical weather prediction

pCO2

Partial pressure of carbon dioxide (CO2)

QC

Quality control

RRTMG

Rapid Radiative Transfer Model for GCMs

RT

Radiative transfer

S2S

Subseasonal to seasonal

SA

Situational awareness

SOM

Self-organizing map

SWR

Shortwave radiation

UTC

Coordinated universal time

REFERENCES

  • Abuelgasim, A. A., S. Gopal, and A. H. Strahler, 1998: Forward and inverse modeling of canopy directional reflectance using a neural network. Int. J. Remote Sens., 19, 453471, https://doi.org/10.1080/014311698216099.

    • Crossref
    • Search Google Scholar
    • Export Citation
  • Aires, F., W. B. Rossow, N. A. Scott, and A. Chédin, 2002: Remote sensing from the infrared atmospheric sounding interferometer instrument: 2. Simultaneous retrieval of temperature, water vapor, and ozone atmospheric profiles. J. Geophys. Res., 107, 4620, https://doi.org/10.1029/2001JD001591.

    • Crossref
    • Search Google Scholar
    • Export Citation
  • Ball, J. E., D. T. Anderson, and C. S. Chan, 2017: Comprehensive survey of deep learning in remote sensing: Theories, tools, and challenges for the community. J. Appl. Remote Sens., 11, 042609, https://doi.org/10.1117/1.jrs.11.042609.

    • Crossref
    • Search Google Scholar
    • Export Citation
  • Ban, G.-Y., N. El Karoui, and A. E. B. Lim, 2018: Machine learning and portfolio optimization. Manage. Sci., 64, 11361154, https://doi.org/10.1287/mnsc.2016.2644.

    • Crossref
    • Search Google Scholar
    • Export Citation
  • Beucler, T., M. Pritchard, S. Rasp, P. Gentine, J. Ott, and P. Baldi, 2019: Enforcing analytic constraints in neural-networks emulating physical systems. arXiv, 11 pp., http://arxiv.org/abs/1909.00912.

    • Search Google Scholar
    • Export Citation
  • Blackwell, W. J., and Coauthors, 2019: Microwave atmospheric sounding CubeSats: From MicroMAS-2 to TROPICS and beyond. Ninth Conf. on Transition of Research to Operations, Phoenix, AZ, Amer. Meteor. Soc., J3.5, https://ams.confex.com/ams/2019Annual/webprogram/Paper352453.html.

    • Search Google Scholar
    • Export Citation
  • Boukabara, S.-A., E. Maddy, A. Neiss, K. Garrett, E. Jones, K. Ide, N. Shahroudi, and K. Kumar, 2017: Exploring using artificial intelligence (AI) for NWP and situational awareness applications. Int. TOVS Study Conf. (ITSC) XXI, Darmstadt, Germany, EUMETSAT, 12.05, https://cimss.ssec.wisc.edu/itwg/itsc/itsc21/program/4december/0830_12.05_AI4DataAssimilAndFusion_BoukabaraEtAl_v7.pdf.

    • Search Google Scholar
    • Export Citation
  • Boukabara, S.-A., E. Maddy, K. Ide, K. Garrett, E. Jones, K. Kumar, N. Shahroudi, and A. Neiss, 2018: Exploring using artificial intelligence (AI) for NWP and situational awareness applications. 17th Conf. on Artificial and Computational Intelligence and its Applications to the Environmental Sciences, Austin, Texas, Amer. Meteor. Soc., 5.3, https://ams.confex.com/ams/98Annual/webprogram/Paper330911.html.

    • Search Google Scholar
    • Export Citation
  • Boukabara, S.-A., V. Krasnopolsky, and J. Q. Stewart, 2019a: Overview of NOAA AI activities in satellite observations and NWP: Status and perspectives. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, Maryland, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Tuesday/S1-2_NOAAai2019_Boukabara.pptx.

    • Search Google Scholar
    • Export Citation
  • Boukabara, S.-A., V. Krasnopolsky, and J. Q. Stewart, S. G. Penny, R. N. Hoffman, and E. Maddy, 2019b: Artificial Intelligence may be key to better weather forecasts. Eos, Trans. Amer. Geophys. Union, 100, https://doi.org/10.1029/2019EO129967.

    • Crossref
    • Search Google Scholar
    • Export Citation
  • Brenowitz, N. D., and C. S. Bretherton, 2018: Prognostic validation of a neural network unified physics parameterization. Geophys. Res. Lett., 45, 62896298, https://doi.org/10.1029/2018GL078510.

  • Brunetti, A., D. Buongiorno, G. F. Trotta, and V. Bevilacqua, 2018: Computer vision and deep learning techniques for pedestrian detection and tracking: A survey. Neurocomputing, 300, 17–33, https://doi.org/10.1016/j.neucom.2018.01.092.

  • Campos, R. M., V. Krasnopolsky, J.-H. G. M. Alves, and S. G. Penny, 2019: Nonlinear wave ensemble averaging in the Gulf of Mexico using neural network. J. Atmos. Oceanic Technol., 36, 113–127, https://doi.org/10.1175/JTECH-D-18-0099.1.

  • Cao, W., X. Wang, Z. Ming, and J. Gao, 2018: A review on neural networks with random weights. Neurocomputing, 275, 278–287, https://doi.org/10.1016/j.neucom.2017.08.040.

  • Chang, N.-B., K. Bai, and C.-F. Chen, 2015: Smart information reconstruction via time-space-spectrum continuum for cloud removal in satellite images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens., 8, 1898–1912, https://doi.org/10.1109/JSTARS.2015.2400636.

  • Chen, Y., E. Argentinis, and G. Weber, 2016: IBM Watson: How cognitive computing can be applied to Big Data challenges in life sciences research. Clin. Ther., 38, 688–701, https://doi.org/10.1016/j.clinthera.2015.12.001.

  • Chevallier, F., and J.-F. Mahfouf, 2001: Evaluation of the Jacobians of infrared radiation models for variational data assimilation. J. Appl. Meteor., 40, 1445–1461, https://doi.org/10.1175/1520-0450(2001)040<1445:EOTJOI>2.0.CO;2.

  • Chevallier, F., J.-J. Morcrette, F. Chéruy, and N. A. Scott, 2000: Use of a neural-network-based long-wave radiative-transfer scheme in the ECMWF atmospheric model. Quart. J. Roy. Meteor. Soc., 126, 761–776, https://doi.org/10.1002/qj.49712656318.

  • Chong, E., C. Han, and F. C. Park, 2017: Deep learning networks for stock market analysis and prediction: Methodology, data representations, and case studies. Expert Syst. Appl., 83, 187–205, https://doi.org/10.1016/j.eswa.2017.04.030.

  • Cintineo, J. L., M. J. Pavolonis, J. M. Sieglaff, and D. T. Lindsey, 2014: An empirical model for assessing the severe weather potential of developing convection. Wea. Forecasting, 29, 639–653, https://doi.org/10.1175/WAF-D-13-00113.1.

  • Clough, S. A., M. W. Shephard, E. J. Mlawer, J. S. Delamere, M. J. Iacono, K. Cady-Pereira, S. Boukabara, and P. D. Brown, 2005: Atmospheric radiative transfer modeling: A summary of the AER codes. J. Quant. Spectrosc. Radiat. Transfer, 91, 233–244, https://doi.org/10.1016/j.jqsrt.2004.05.058.

  • Collins, W. D., J. K. Hackney, and D. P. Edwards, 2002: An updated parameterization for infrared emission and absorption by water vapor in the National Center for Atmospheric Research Community Atmosphere Model. J. Geophys. Res., 107, 4664, https://doi.org/10.1029/2001JD001365.

  • Creswell, A., T. White, V. Dumoulin, K. Arulkumaran, B. Sengupta, and A. A. Bharath, 2018: Generative adversarial networks: An overview. IEEE Signal Process. Mag., 35, 53–65, https://doi.org/10.1109/MSP.2017.2765202.

  • Ding, L., 2018: Human knowledge in constructing AI systems—Neural logic networks approach towards an explainable AI. Procedia Comput. Sci., 126, 1561–1570, https://doi.org/10.1016/j.procs.2018.08.129.

  • Droste, A. M., J. J. Pape, A. Overeem, H. Leijnse, G. J. Steeneveld, A. J. V. Delden, and R. Uijlenhoet, 2017: Crowdsourcing urban air temperatures through smartphone battery temperatures in São Paulo, Brazil. J. Atmos. Oceanic Technol., 34, 1853–1866, https://doi.org/10.1175/JTECH-D-16-0150.1.

  • Fan, Y., C.-Y. Wu, J. Gottschalck, and V. Krasnopolsky, 2019: Using artificial neural networks to improve CFS week 3–4 precipitation and 2m temperature forecast. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Thursday/S5-6_NOAAai2019_Fan.pptx.

  • Gagne, D. J., II, A. McGovern, S. E. Haupt, R. A. Sobash, J. K. Williams, and M. Xue, 2017: Storm-based probabilistic hail forecasting with machine learning applied to convection-allowing ensembles. Wea. Forecasting, 32, 1819–1840, https://doi.org/10.1175/WAF-D-17-0010.1.

  • Gagne, D. J., S. E. Haupt, D. W. Nychka, and G. Thompson, 2019: Interpretable deep learning for spatial analysis of severe hailstorms. Mon. Wea. Rev., 147, 2827–2845, https://doi.org/10.1175/MWR-D-18-0316.1.

  • Geer, A. J., and Coauthors, 2018: All-sky satellite data assimilation at operational weather forecasting centres. Quart. J. Roy. Meteor. Soc., 144, 1191–1217, https://doi.org/10.1002/qj.3202.

  • Gentine, P., M. Pritchard, S. Rasp, G. Reinaudi, and G. Yacalis, 2018: Could machine learning break the convection parameterization deadlock? Geophys. Res. Lett., 45, 5742–5751, https://doi.org/10.1029/2018GL078202.

  • Ghahramani, Z., 2015: Probabilistic machine learning and artificial intelligence. Nature, 521, 452–459, https://doi.org/10.1038/nature14541.

  • Golestan, K., R. Soua, F. Karray, and M. S. Kamel, 2016: Situation awareness within the context of connected cars: A comprehensive review and recent trends. Inf. Fusion, 29, 68–83, https://doi.org/10.1016/j.inffus.2015.08.001.

  • Goodfellow, I., Y. Bengio, and A. Courville, 2016: Deep Learning. Adaptive Computation and Machine Learning Series, MIT Press, 800 pp.

  • Gu, J., and Coauthors, 2018: Recent advances in convolutional neural networks. Pattern Recognit., 77, 354–377, https://doi.org/10.1016/j.patcog.2017.10.013.

  • Han, J., D. Zhang, G. Cheng, N. Liu, and D. Xu, 2018: Advanced deep-learning techniques for salient and category-specific object detection: A survey. IEEE Signal Process. Mag., 35, 84–100, https://doi.org/10.1109/MSP.2017.2749125.

  • Haupt, S. E., A. Pasini, and C. Marzban, Eds., 2008: Artificial Intelligence Methods in the Environmental Sciences. Springer, 418 pp.

  • Hengl, T., and Coauthors, 2017: SoilGrids250m: Global gridded soil information based on machine learning. PLOS ONE, 12, e0169748, https://doi.org/10.1371/journal.pone.0169748.

  • Hernández-Ceballos, M. A., and Coauthors, 2019: UDINEE: Evaluation of multiple models with data from the JU2003 puff releases in Oklahoma City. Part I: Comparison of observed and predicted concentrations. Bound.-Layer Meteor., 171, 323–349, https://doi.org/10.1007/s10546-019-00433-8.

  • Hochreiter, S., and J. Schmidhuber, 1997: Long short-term memory. Neural Comput., 9, 1735–1780, https://doi.org/10.1162/neco.1997.9.8.1735.

  • Hoffman, R. N., and S. M. Leidner, 2010: Some characteristics of time interpolation errors for fluid flows. J. Atmos. Oceanic Technol., 27, 1255–1262, https://doi.org/10.1175/2010JTECHA1429.1.

  • Hopkins, M., G. Pineda-García, P. A. Bogdan, and S. B. Furber, 2018: Spiking neural networks for computer vision. Interface Focus, 8, 20180007, https://doi.org/10.1098/rsfs.2018.0007.

  • Hsieh, W. W., 2009: Machine Learning Methods in the Environmental Sciences: Neural Networks and Kernels. Cambridge University Press, 364 pp.

  • Huang, B., Y. Huan, L. D. Xu, L. Zheng, and Z. Zou, 2019: Automated trading systems statistical and machine learning methods and hardware implementation: A survey. Enterprise Inf. Syst., 13, 132–144, https://doi.org/10.1080/17517575.2018.1493145.

  • Iskenderian, H., and Coauthors, 2019: Global synthetic weather radar capability in support of the U.S. Air Force. 19th Conf. on Aviation, Range, and Aerospace Meteorology, Phoenix, AZ, Amer. Meteor. Soc., 7.1, https://ams.confex.com/ams/2019Annual/webprogram/Paper355542.html.

  • Jones, E., E. Maddy, K. Garrett, and S.-A. Boukabara, 2019: The MIIDAPS algorithm for retrieval and quality control for microwave and infrared observations: Applications in data assimilation. 23rd Conf. on Integrated Observing and Assimilation Systems for the Atmosphere, Oceans, and Land Surface, Phoenix, AZ, Amer. Meteor. Soc., 10.4, https://ams.confex.com/ams/2019Annual/webprogram/Paper352855.html.

  • Karpatne, A., W. Watkins, J. Read, and V. Kumar, 2018: Physics-guided neural networks (PGNN): An application in lake temperature modeling. arXiv, 11 pp., http://arxiv.org/abs/1710.11431.

  • Karstens, C. D., and Coauthors, 2018: Development of a human–machine mix for forecasting severe convective events. Wea. Forecasting, 33, 715–737, https://doi.org/10.1175/WAF-D-17-0188.1.

  • Khain, P., and Coauthors, 2019: Parameterization of vertical profiles of governing microphysical parameters of shallow cumulus cloud ensembles using LES with bin microphysics. J. Atmos. Sci., 76, 533–560, https://doi.org/10.1175/JAS-D-18-0046.1.

  • Kohonen, T., 2001: Self-Organizing Maps. 3rd ed. Springer Series in Information Sciences, Vol. 30, Springer, 522 pp., https://doi.org/10.1007/978-3-642-56927-2.

  • Krasnopolsky, V. M., 2013: The Application of Neural Networks in the Earth System Sciences: Neural Network Emulations for Complex Multidimensional Mappings. Atmospheric and Oceanographic Sciences Library, Vol. 46, Springer, 212 pp., https://doi.org/10.1007/978-94-007-6073-8.

  • Krasnopolsky, V. M., and M. S. Fox-Rabinovitz, 2006: A new synergetic paradigm in environmental numerical modeling: Hybrid models combining deterministic and machine learning components. Ecol. Modell., 191, 5–18, https://doi.org/10.1016/j.ecolmodel.2005.08.009.

  • Krasnopolsky, V. M., and Y. Lin, 2012: A neural network nonlinear multimodel ensemble to improve precipitation forecasts over continental US. Adv. Meteor., 2012, 649450, https://doi.org/10.1155/2012/649450.

  • Krasnopolsky, V. M., L. C. Breaker, and W. H. Gemmill, 1995: A neural network as a nonlinear transfer function model for retrieving surface wind speeds from the special sensor microwave imager. J. Geophys. Res., 100, 11 033–11 045, https://doi.org/10.1029/95JC00857.

  • Krasnopolsky, V. M., M. S. Fox-Rabinovitz, and A. A. Belochitski, 2008: Decadal climate simulations using accurate and fast neural network emulation of full, longwave and shortwave, radiation. Mon. Wea. Rev., 136, 3683–3695, https://doi.org/10.1175/2008MWR2385.1.

  • Krasnopolsky, V. M., M. S. Fox-Rabinovitz, Y. T. Hou, S. J. Lord, and A. A. Belochitski, 2010: Accurate and fast neural network emulations of model radiation for the NCEP coupled climate forecast system: Climate simulations and seasonal predictions. Mon. Wea. Rev., 138, 1822–1842, https://doi.org/10.1175/2009MWR3149.1.

  • Krasnopolsky, V. M., M. S. Fox-Rabinovitz, and A. A. Belochitski, 2013: Using ensemble of neural networks to learn stochastic convection parameterizations for climate and numerical weather prediction models from data simulated by a cloud resolving model. Adv. Artif. Neural Syst., 2013, 485913, https://doi.org/10.1155/2013/485913.

  • Krasnopolsky, V. M., S. Nadiga, A. Mehra, E. Bayler, and D. Behringer, 2016: Neural networks technique for filling gaps in satellite measurements: Application to ocean color observations. Comput. Intell. Neurosci., 2016, 6156513, https://doi.org/10.1155/2016/6156513.

  • Krasnopolsky, V. M., S. Nadiga, A. Mehra, and E. Bayler, 2018: Adjusting neural network to a particular problem: Neural network-based empirical biological model for chlorophyll concentration in the upper ocean. Appl. Comput. Intell. Soft Comput., 2018, 7057363, https://doi.org/10.1155/2018/7057363.

  • Landschützer, P., N. Gruber, D. C. E. Bakker, U. Schuster, S. Nakaoka, M. R. Payne, T. P. Sasse, and J. Zeng, 2013: A neural network-based estimate of the seasonal to inter-annual variability of the Atlantic Ocean carbon sink. Biogeosciences, 10, 7793–7815, https://doi.org/10.5194/bg-10-7793-2013.

  • LeCun, Y., Y. Bengio, and G. Hinton, 2015: Deep learning. Nature, 521, 436–444, https://doi.org/10.1038/nature14539.

  • Li, J., and Coauthors, 2019: Potential numerical techniques and challenges for atmospheric modeling. Bull. Amer. Meteor. Soc., 100, ES239–ES242, https://doi.org/10.1175/BAMS-D-19-0031.1.

  • Li, Y., S. Ye, and I. Bartoli, 2018: Semisupervised classification of hurricane damage from postevent aerial imagery using deep learning. J. Appl. Remote Sens., 12, 045008, https://doi.org/10.1117/1.JRS.12.045008.

  • Madaus, L. E., and C. F. Mass, 2017: Evaluating smartphone pressure observations for mesoscale analyses and forecasts. Wea. Forecasting, 32, 511–531, https://doi.org/10.1175/WAF-D-16-0135.1.

  • Manogaran, G., V. Vijayakumar, R. Varatharajan, P. M. Kumar, R. Sundarasekar, and C.-H. Hsu, 2018: Machine learning based Big Data processing framework for cancer diagnosis using hidden Markov model and GM clustering. Wireless Pers. Commun., 102, 2099–2116, https://doi.org/10.1007/s11277-017-5044-z.

  • Mao, H. H., T. Shin, and G. Cottrell, 2018: DeepJ: Style-specific music generation. 12th Int. Conf. on Semantic Computing, Laguna Hills, CA, IEEE, https://doi.org/10.1109/icsc.2018.00077.

  • Mass, C. F., and Y.-H. Kuo, 1998: Regional real-time numerical weather prediction: Current status and future potential. Bull. Amer. Meteor. Soc., 79, 253–264, https://doi.org/10.1175/1520-0477(1998)079<0253:RRTNWP>2.0.CO;2.

  • McGovern, A., K. L. Elmore, D. J. Gagne II, S. E. Haupt, C. D. Karstens, R. Lagerquist, T. Smith, and J. K. Williams, 2017: Using artificial intelligence to improve real-time decision-making for high-impact weather. Bull. Amer. Meteor. Soc., 98, 2073–2090, https://doi.org/10.1175/BAMS-D-16-0123.1.

  • McGovern, A., R. A. Lagerquist, D. J. Gagne, E. Jergensen, K. L. Elmore, C. R. Homeyer, and T. Smith, 2019: Making the black box more transparent: Understanding the physical implications of machine learning. Bull. Amer. Meteor. Soc., 100, 2175–2199, https://doi.org/10.1175/BAMS-D-18-0195.1.

  • Millard, K., and M. Richardson, 2015: On the importance of training data sample selection in random forest image classification: A case study in peatland ecosystem mapping. Remote Sens., 7, 8489–8515, https://doi.org/10.3390/rs70708489.

  • Mincholé, A., and B. Rodriguez, 2019: Artificial intelligence for the electrocardiogram. Nat. Med., 25, 22–23, https://doi.org/10.1038/s41591-018-0306-1.

  • O’Gorman, P. A., and J. G. Dwyer, 2018: Using machine learning to parameterize moist convection: Potential for modeling of climate, climate change, and extreme events. J. Adv. Model. Earth Syst., 10, 2548–2563, https://doi.org/10.1029/2018MS001351.

  • O’Mahony, N., S. Campbell, L. Krpalkova, D. Riordan, J. Walsh, A. Murphy, and C. Ryan, 2019: Computer vision for 3D perception. IntelliSys 2018: Intelligent Systems and Applications, K. Arai, S. Kapoor, and R. Bhatia, Eds., Springer, 788–804, https://doi.org/10.1007/978-3-030-01057-7_59.

  • Pagano, T. S., and Coauthors, 2017: Design and development of the CubeSat Infrared Atmospheric Sounder (CIRAS). Proc. SPIE, 10402, 1040209, https://doi.org/10.1117/12.2272839.

  • Pasolli, L., F. Melgani, and E. Blanzieri, 2010: Gaussian process regression for estimating chlorophyll concentration in subsurface waters from remote sensing data. IEEE Geosci. Remote Sens. Lett., 7, 464–468, https://doi.org/10.1109/LGRS.2009.2039191.

  • Pavolonis, M., J. Cintineo, J. Sieglaff, and C. Calvert, 2019: Machine learning based applications for environmental hazard detection and prediction. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Wednesday/S3-3_NOAAai2019_Pavolonis.pptx.

  • Penny, S. G., and T. M. Hamill, 2017: Coupled data assimilation for integrated earth system analysis and prediction. Bull. Amer. Meteor. Soc., 98, ES169–ES172, https://doi.org/10.1175/BAMS-D-17-0036.1.

  • Price, J. D., and Coauthors, 2018: LANFEX: A field and modeling study to improve our understanding and forecasting of radiation fog. Bull. Amer. Meteor. Soc., 99, 2061–2077, https://doi.org/10.1175/BAMS-D-16-0299.1.

  • Racah, E., C. Beckham, T. Maharaj, S. Ebrahimi Kahou, Prabhat, and C. Pal, 2017: ExtremeWeather: A large-scale climate dataset for semi-supervised detection, localization, and understanding of extreme weather events. Advances in Neural Information Processing Systems 30, I. Guyon et al., Eds., Curran Associates, Inc., 3402–3413, http://papers.nips.cc/paper/6932-extremeweather-a-large-scale-climate-dataset-for-semi-supervised-detection-localization-and-understanding-of-extreme-weather-events.