Outlook for Exploiting Artificial Intelligence in the Earth and Environmental Sciences

Sid-Ahmed Boukabara, NOAA/NESDIS/Center for Satellite Applications and Research, College Park, Maryland
Vladimir Krasnopolsky, NOAA/Environmental Modeling Center, College Park, Maryland
Stephen G. Penny, Cooperative Institute for Research in Environmental Sciences, University of Colorado Boulder, and Physical Sciences Division, NOAA/Earth System Research Laboratories, Boulder, Colorado
Jebb Q. Stewart, NOAA/Earth System Research Laboratories, Boulder, Colorado
Amy McGovern, School of Computer Science, University of Oklahoma, Norman, Oklahoma
David Hall, NVIDIA Corporation, Lafayette, Colorado
John E. Ten Hoeve, NOAA/National Weather Service, Silver Spring, Maryland
Jason Hickey, Google Research, Kirkland, Washington
Hung-Lung Allen Huang, Space Science and Engineering Center, University of Wisconsin at Madison, Madison, Wisconsin
John K. Williams, The Weather Company, an IBM Business, Andover, Massachusetts
Kayo Ide, University of Maryland, College Park, College Park, Maryland
Philippe Tissot, Conrad Blucher Institute, Texas A&M University–Corpus Christi, Corpus Christi, Texas
Sue Ellen Haupt, Research Applications Laboratory, NCAR, Boulder, Colorado
Kenneth S. Casey, NOAA/NESDIS/National Centers for Environmental Information, Silver Spring, Maryland
Nikunj Oza, Data Sciences Group, NASA Ames Research Center, Moffett Field, California
Alan J. Geer, ECMWF, Reading, United Kingdom
Eric S. Maddy, Riverside Technology Inc. at NOAA/NESDIS/Center for Satellite Applications and Research, College Park, Maryland
Ross N. Hoffman, Cooperative Institute for Satellite Earth System Studies, University of Maryland, College Park, at NOAA/NESDIS/Center for Satellite Applications and Research, College Park, Maryland

Abstract

Promising new opportunities to apply artificial intelligence (AI) to the Earth and environmental sciences are identified, informed by an overview of current efforts in the community. Community input was collected at the first National Oceanic and Atmospheric Administration (NOAA) workshop on “Leveraging AI in the Exploitation of Satellite Earth Observations and Numerical Weather Prediction” held in April 2019. This workshop brought together over 400 scientists, program managers, and leaders from the public, academic, and private sectors in order to enable experts involved in the development and adaptation of AI tools and applications to meet and exchange experiences with NOAA experts. Paths are described to actualize the potential of AI to better exploit the massive volumes of environmental data from satellite and in situ sources that are critical for numerical weather prediction (NWP) and other Earth and environmental science applications. The main lessons communicated from community input via active workshop discussions and polling are reported. Finally, recommendations are presented for both scientists and decision-makers to address some of the challenges facing the adoption of AI across all Earth science.


© 2021 American Meteorological Society.

Corresponding author: Dr. Sid-Ahmed Boukabara, sid.boukabara@noaa.gov

This article is included in the 2019 NOAA Workshop on AI for Earth Observations and NWP Special Collection.

The Earth and environmental sciences (collectively Earth science in what follows) stand to benefit from leveraging rapid advances in artificial intelligence (AI) from diverse applied science fields due to the combination of fast-paced increases in data availability and computational capabilities. Leveraging algorithms used in other fields—what might be called meta-transfer learning—is accelerating the use of AI for environmental data and Earth system applications. We summarize here the main areas where significant progress has been made recently in the science of numerical weather prediction, including forecasting extreme weather events, and in exploiting satellite data. We then present a few potential directions that AI applications in Earth science may take in the future. We extend and update the perspective of Boukabara et al. (2019b) to include current activities and expected future trends, based on presentations and discussion from the first National Oceanic and Atmospheric Administration (NOAA) workshop on “Leveraging AI in the Exploitation of Satellite Earth Observations and Numerical Weather Prediction” held in April 2019 in College Park, Maryland.1 While the overall perspective of this combined review and meeting summary focuses on addressing NOAA’s mission, the science has wide-ranging relevance and applications. For reference, a number of the AI techniques and their interrelationships are summarized in Fig. 1.

Fig. 1. A Venn diagram showing the interrelationships of several popular subdisciplines of AI. Credit: David John Gagne and Amy McGovern.

NOAA identified AI as a strategic opportunity for the overall enhancement of NOAA’s mission of science, services, and environmental data stewardship.2 In particular, the NOAA Artificial Intelligence Strategy3 (issued February 2020) identified AI as a strong candidate to allow NOAA to address the “Big Data” challenge of collecting, archiving, and making useful the enormous data streams available now and in the near future, and to help NOAA achieve its mission objectives and improve its performance.4 NOAA has therefore begun reaching out to partners, experts, and practitioners of AI who have interests in weather and climate prediction, many of whom participated in the first NOAA AI workshop. A summary of key ideas gathered from community input is presented to address the latest advances, major challenges, and potential applications that can best serve the NOAA mission. The overall layout of this paper follows the approximate structure of the workshop itself. In the next section, the overview session is summarized, describing current and planned activities using AI for satellite Earth observations and numerical weather prediction (NWP), including contributions from NOAA and its partners from academia (e.g., NCAR), the private sector (e.g., Google, NVIDIA, IBM), and international collaborators (e.g., ECMWF). Then, in the following section, similarities between AI and conventional/physically based approaches are discussed. Subsequent sections are structured to show the relevance of AI at each step of the “value chain” that exploits observations from data ingest to decision-making (Fig. 2).

Fig. 2. Information from (top) Earth observations and remote sensing flows through the value chain that exploits these observations via data assimilation, environmental numerical modeling, extreme weather monitoring and prediction, and (bottom) postprocessing of forecasts, and then onward to other applications and products across the public and private sectors that use environmental intelligence for decision making. The five colored blocks and large-type labels correspond to sections in this paper, and the small-type entries correspond to topics related to the different sections.

Motivations for considering AI for satellite Earth observations and NWP

Two of the main challenges in NWP are 1) to take advantage of the ever-increasing volume of environmental data collected from satellite and other sources, and 2) to satisfy the increasing societal reliance on forecasts with continually improving accuracy and reliability, which in turn implies a need for increased temporal and spatial resolutions in the underlying numerical forecast models. Boukabara et al. (2019a) noted the growing potential for AI in weather prediction. Significant research advances have already been made in the application of AI to different areas of meteorology and oceanography (Haupt et al. 2008; Hsieh 2009; Krasnopolsky 2013), ranging from remote sensing (Ball et al. 2017) to severe weather prediction (McGovern et al. 2017). However, until recently, far fewer AI applications were developed to operationally exploit environmental satellite data, or to enhance other operational activities such as NWP, data assimilation, nowcasting, forecasting, and extreme weather prediction. AI is increasingly being considered for these applications, with promising results. However, as outlined in the section “Similarities between AI and conventional/physically based approaches,” the inverse methods conventionally used for NWP, particularly the methods used in the field of data assimilation, already share many similarities with machine learning. The increase in data volume comes from higher-resolution satellites and sensors, from a growing list of new sensors (traditional as well as SmallSats and CubeSats; Stephens et al. 2020), and from an explosion of new observing systems that are beneficial byproducts of the Internet of Things (IoT; e.g., Madaus and Mass 2017) and unmanned systems. These data sources should help provide more accurate and detailed forecasts, but their exploitation is expected to be a major challenge for any future computing infrastructure, not least in the area of data transfer and storage. AI can provide part of the solution, for example as described in the subsection “Fast and accurate ML model physics” of the section “Highlights of AI activities in environmental numerical modeling.” In this case, although ML training requires substantial computation, these costs are insignificant compared to the savings from the speedup of the resulting ML model implemented within an operational NWP model.

It is worth noting that exascale computing capability, expected in the near future, will undoubtedly increase our ability to run NWP models at higher resolutions and assimilate more data, but will come at a high cost in (and with limitations due to) energy consumption, with the potential for adverse environmental impacts. It is therefore imperative that, as a community of Earth scientists, we look at innovative software solutions alongside, or in combination with, hardware enhancements. According to Hall (2019), effective use of graphics processing units (GPUs) is needed in order to keep up with Moore’s law and permit NWP models to run faster and at higher resolution on modern supercomputers. In this regard, Williams (2019) discussed a collaboration between IBM and NCAR that accelerated the community Model for Prediction Across Scales (MPAS) using GPUs, paving the way to global, hourly-updating, convection-allowing NWP.

Workforce, training, collaboration, and outreach

A diverse community of experts participated in the NOAA AI workshop, representing a wide range of experience with AI (see participants’ affiliations in Table 1). Using AI in Earth science is new for many, but there already exists a core group of researchers with expertise in this area. An informal survey conducted by one of the authors (P. Tissot) indicated that while about 50% of the audience had come to AI only within the last 2 years, around a third had already been using AI for Earth science for over 10 years. Many projects focused on leveraging the efficiency and the skill enhancement that AI could bring to the NOAA mission, and some individual projects have already been incorporated into NOAA systems (e.g., Krasnopolsky et al. 1999). Still others have already been implemented by the private sector, thereby proving their feasibility.

Table 1. Counts for workshop and WebEx (italicized) components and participation in various categories. Note that affiliation counts are the numbers of unique organizations identified by the individual attendees.

The NOAA AI strategy emphasizes partnerships, outreach, and workforce training as essential ways to allow rapid progress in the infusion of AI into our systems. For these reasons, the workshop included three tutorial sessions, a participant real-time survey, and two interactive panels. The tutorial sessions provided hands-on training with actual AI tools, techniques, and coding scripts. The panel discussions, entitled “How can scientists and engineers embrace AI technology to enhance their work?” and “Where do we go from here?,” as well as the survey results, are summarized in the sections “Emerging trends in AI with potential benefits for Earth observations and NWP” and “Main conclusions and challenges identified,” with relevant concepts integrated into the remainder of the text.

General overview of satellite Earth observations and NWP AI activities

Applying AI to Earth science has a rich and varied history across government, academia, and the private sector. The AMS AI Committee5 has a history of more than 25 years of organizing AI efforts within the weather community. Enormous progress using AI in meteorological applications has been presented at AMS conferences, particularly in the last 5 years (Tissot 2019). NOAA has been using AI for a variety of Earth observations and satellite applications, including the use of neural networks for NWP model parameterization (Krasnopolsky et al. 2010) and the use of deep learning to infer missing data (Boukabara et al. 2019a). NCAR, a federally funded research and development center, has a long history of developing AI techniques for weather applications. Haupt et al. (2019) highlighted the Dynamic Integrated Forecast (DICast) system, a 20-year effort at NCAR that forms the “weather engine” of many applications, as well as more recent efforts to improve wildfire prediction with machine learning (ML). At The Weather Company, an IBM Business (TWC/IBM), AI has been used for years in numerous ways, from improving observations to creating personalized forecasts for end users (Williams et al. 2016; Williams 2019). Geer (2019) summarized opportunities for AI in NWP being explored at ECMWF. These include replacing numerical model parameterizations with ML models and using AI for data monitoring and for augmenting data assimilation. Hall (2019) demonstrated examples of using AI for applications ranging from sunspot detection to automated detection of large-scale weather phenomena such as tropical cyclones to parameterization. Google’s Alphabet AI has recently added a focus on AI for weather (Hickey 2019) and is working to make large Earth science datasets available to the general public and researchers.

Similarities between AI and conventional/physically based approaches

AI subdisciplines such as deep learning use mathematical methods that are closely related to data assimilation (DA), statistical modeling, and data fusion methods already used by forecasters and Earth science researchers. These methods have the common foundation of being essentially based on optimal estimation theory (Geer 2019, 2021). For example, Hsieh and Tang (1998) showed close similarities between neural networks (NNs) and variational DA. However, there are differences in how the techniques are currently applied. A key characteristic of DA is the acknowledgment of the presence of dynamics, which is typically realized via the use of a numerical model, and cycling to follow a “true state” observed by sparse and uncertain observations. The paramount aim of DA is to estimate this true state. In contrast, typically ML uses a fixed set of inputs and outputs to train a model. However, a number of recent physical science ML approaches have employed dynamics as a constraint (e.g., Beucler et al. 2019; Jiang et al. 2020). The input–output pairs in ML (often referred to as “features” and “labels”) are analogous to the background state and the observations in data assimilation. ML approaches usually ignore input and output errors and the aim is to learn the parameters of a model. In contrast, when using DA for weather forecasting, model errors are usually ignored in practice (with some important exceptions; e.g., Fisher et al. 2005; Lindskog et al. 2009; Ngodock et al. 2017), while the errors in the input (the background state) and in the output (the observations) are carefully estimated. This difference reflects the different nature of problems that these methods have typically been used to address.

Table 2 highlights many of the similarities between ML and DA. Both typically optimize a cost function based on the misfit between the model and the observations. Regularization, the process of adding information in order to solve ill-posed problems and prevent overfitting, is an important element of high-dimensional nonlinear optimization. As with general nonlinear optimization, ML and DA both often apply regularization via a term in the loss/cost function. Ensemble averaging is another tool used by both. DA also frequently uses ensembles of short-term forecasts in order to estimate the temporally varying “errors of the day”—incorporating this information into a dynamically weighted regularization term via the background error covariance matrix. Additionally, both ML and DA have attempted to reduce model bias and variability in solutions by averaging over multiple realizations of a model (for DA, this is called a multimodel ensemble). A gradient descent method is often used to find the most accurate model or state. For example, to estimate the gradient, the technique of back propagation (used in NNs) and the adjoint method (used in 4D-Var; Bannister 2017) both apply the chain rule in the reverse direction, starting from the cost function and ending with the linear sensitivity of the cost function to the network weights or state space variables, respectively.
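To make the correspondence concrete, the two cost functions can be written side by side in standard notation. The sketch below is illustrative rather than a statement of either community's canonical formulation; exact forms vary by implementation.

```latex
% Variational DA: find the state x closest to both the background x_b and the
% observations y, weighted by the background and observation error covariances B and R.
J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_b)^{\mathrm{T}}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
              + \tfrac{1}{2}\,[\mathbf{y}-H(\mathbf{x})]^{\mathrm{T}}\mathbf{R}^{-1}[\mathbf{y}-H(\mathbf{x})]

% ML training: find the weights w that minimize the misfit between labels y_i and
% predictions f(x_i; w), with an L2 regularization term playing a role analogous
% to the background term above.
L(\mathbf{w}) = \frac{1}{N}\sum_{i=1}^{N}\lVert \mathbf{y}_i - f(\mathbf{x}_i;\mathbf{w})\rVert^2
              + \lambda\,\lVert \mathbf{w}\rVert^2
```

In both cases the gradient of a scalar cost with respect to the unknowns (the state x or the weights w) is obtained by applying the chain rule backward through the operators, which is the adjoint method in DA and back propagation in NNs.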

Table 2. Comparison between typical machine learning (e.g., a deep neural network in TensorFlow) and data assimilation, which underpins most global weather forecasting. To highlight the similarities, NN concepts have been written in a linear algebra style close to typical DA notation. Superscript T denotes the transpose operator, bold lowercase letters are vectors, and bold uppercase letters are matrices. Adapted from Geer (2019).

DA methods are not limited to estimating a state; they can also simultaneously estimate parameters of the model. These techniques, including parameter estimation and weak-constraint DA, have very close links with ML. Abarbanel et al. (2018) showed a mathematical equivalence between deep learning and DA for model parameter estimation, and extended this connection further to define the concept of “deepest learning.” Bocquet et al. (2019) used DA itself as an ML tool to infer an ordinary differential equations (ODEs) representation of model dynamics from observations.

From the other direction, ML can become much closer to typical DA. ML optimization may be limited to an initial training phase, or it may be applied progressively over time as new features and labels are added to the training set. Pathak et al. (2018a) have demonstrated cycled NNs in which the output of one training cycle is used as input to the next, which comes close to the cycling used in typical DA. Bocquet et al. (2019) highlighted the notion that the residual deep learning architectures of NNs can roughly be interpreted as dynamical systems (e.g., Weinan 2017; Chang et al. 2018), and further noted that Wang and Lin (1998) and Fablet et al. (2018) showed the architecture of the NN can follow that of an integration scheme.

DA and ML each offer unique advantages, suggesting possible synergies. Hsieh and Tang (1998) suggested the potential of a new class of hybrid neural–dynamical models. More recently, efforts such as Pathak et al. (2018b) have explored the possibility of creating hybrid “data assisted” dynamical models to correct systematic errors that may be present in the dynamical model. Brajard et al. (2020) developed an alternating strategy between ML and DA in order to take advantage of the state estimation of DA to fill in sparse observations, thus providing a full-field estimate for training the ML method. Bocquet et al. (2020) unified the DA and ML approaches from a Bayesian perspective using expectation–maximization and coordinate descents. In doing so, the state trajectory and model error statistics are estimated simultaneously.

In summary, AI and traditional approaches based on optimal estimation share many mathematical similarities. The major difference is that a regular cycled optimization process is central to DA, while this is not generally used when training an ML model.

Highlights of AI activities in satellite Earth observations and remote sensing

Activities applying AI to satellite remote sensing data range from detecting floods and ice in synthetic-aperture radar (SAR) images (e.g., Wang et al. 2017), to estimating tropical cyclone intensity from satellite microwave imagery (Wimmers et al. 2019), to a variety of uses of AI techniques in satellite data calibration, bias correction, and remote sensing of atmospheric and surface parameters (Reichstein et al. 2019). Preparing the data (e.g., labeling) for AI exploitation is a notable challenge in some applications. This critical but often overlooked step has attracted some recent attention (e.g., Bonfanti et al. 2018; Lee et al. 2019; Prabhat et al. 2020) and should be an emphasis of future efforts by prediction centers. This will not only provide more readily available datasets for AI exploitation but should also, in principle, allow more creative ways to exploit satellite data.

The products generated by AI approaches have characteristics (i.e., accuracy, levels of misfit to observations, coherence of spatial features, and interparameter correlations) similar to those generated by traditional physical approaches. For example, Boukabara et al. (2019a) showed that the total precipitable water vapor (TPW) retrieved from microwave brightness temperatures by AI captures all the main features of the NWP analyses. The most striking advantage of many AI approaches is efficiency. For example, while it takes about 2 h to process a full day of Advanced Technology Microwave Sounder (ATMS) data with traditional iterative systems, the Multi-Instrument Inversion and Data Assimilation Preprocessing System–AI (MIIDAPS-AI) approach (excluding I/O and training time) requires only 5 s of CPU time.
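As a minimal illustration of this kind of AI retrieval (not the actual MIIDAPS-AI implementation), the sketch below fits a small neural network that maps a vector of microwave brightness temperatures to TPW. The channel count, network size, and the synthetic data standing in for collocated ATMS observations and NWP analyses are assumptions for illustration only.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for collocated training data: 22 microwave channels (ATMS-like)
# and a "true" TPW taken here from a made-up nonlinear relation plus noise.
n_samples, n_channels = 20_000, 22
tb = 180.0 + 100.0 * rng.random((n_samples, n_channels))          # brightness temperatures (K)
tpw = 0.3 * tb[:, 0] - 0.2 * tb[:, 5] + 0.01 * tb[:, 10] ** 1.5   # pseudo TPW (mm)
tpw += rng.normal(scale=1.0, size=n_samples)

x_train, x_test, y_train, y_test = train_test_split(tb, tpw, test_size=0.2, random_state=0)

# Shallow NN retrieval: brightness temperatures in, TPW out.
retrieval = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=300, random_state=0),
)
retrieval.fit(x_train, y_train)

rmse = np.sqrt(np.mean((retrieval.predict(x_test) - y_test) ** 2))
print(f"retrieval RMSE on held-out data: {rmse:.2f} mm")
```

Once trained, inference is a handful of matrix multiplications, which is the source of the large speedups reported above relative to iterative physical inversion.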

Highlights of activities to leverage AI in data assimilation

AI has the potential to benefit DA in all stages of the analysis–forecast cycle. The process steps in a typical DA cycle include producing a forecast with a large-scale nonlinear numerical model, preprocessing vast quantities of observational data, subsetting or aggregating observational data, correcting systematic biases in observations, computing linear approximations of the state for use by minimization algorithms, performing a statistical analysis combining forecast and observations, rebalancing the analysis to ensure the forecast model numerical time integrations are stable, and correcting systematic biases in the forecast.

A well-recognized opportunity for ML within the DA cycle is the replacement of the observation operator. Verrelst et al. (2015) and Rivera et al. (2015) applied ML to estimate the forward model (i.e., the radiative transfer observation operator). These ML models provided greater computational efficiency while maintaining accuracy and flexibility for extrapolation. Moreover, where the physical equations of the forward model are unknown or difficult to implement, the ML observation operator can be trained using the observations and a DA analysis. An example is the training of an NN retrieval between Soil Moisture and Ocean Salinity (SMOS) radiances and soil moisture taken directly from NWP analyses (Rodríguez-Fernández et al. 2019). These retrievals were then successfully assimilated; this approach eliminated the need for separate soil moisture observations in the training and automatically corrected any biases between the radiances and the NWP system.
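The complementary forward-operator use can be sketched as follows: a small network emulates a toy, analytic observation operator, and the trained emulator is then used in place of the expensive physical operator to compute observation-space departures y − H(x_b) of the kind evaluated inside a DA cost function. The toy operator, dimensions, and data here are illustrative assumptions, not any operational system.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

def toy_forward_operator(state):
    """Toy nonlinear 'radiative transfer': maps a 20-level state profile to 5 'radiances'."""
    weights = np.linspace(0.2, 1.0, state.shape[-1])
    weighted = state * weights
    return np.stack([weighted[..., i::5].sum(axis=-1) ** 0.9 for i in range(5)], axis=-1)

# Training pairs: model states and the corresponding simulated observations.
states = rng.normal(loc=250.0, scale=10.0, size=(20_000, 20))
obs_sim = toy_forward_operator(states)

emulator = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64,), max_iter=300, random_state=0),
)
emulator.fit(states, obs_sim)

# Use the emulator as H(x) to compute innovations for a background state.
x_b = rng.normal(loc=250.0, scale=10.0, size=(1, 20))
y_obs = toy_forward_operator(x_b) + rng.normal(scale=0.1, size=(1, 5))   # "observed" radiances
innovation = y_obs - emulator.predict(x_b)
print("innovations y - H(x_b):", np.round(innovation, 3))
```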

Cintra and de Campos Velho (2018) demonstrated an emulation of the entire DA analysis by implementing a multilayer perceptron (MLP) model of the local ensemble transform Kalman filter (LETKF; Hunt et al. 2007) applied to the SPEEDY atmospheric model. The MLP and LETKF analyses are very similar—for example, surface pressure differences were within a ±5-hPa bound. However, there was a significant reduction in computational cost, so this approach could provide benefits to applications such as reanalysis, where the DA analysis is computed many times, and operational forecasting, which is typically tightly scheduled on limited computing resources.

Highlights of AI activities in environmental numerical modeling

AI applications targeting numerical modeling focus either on enhancing the numerical models, such as by replacing subgrid-scale parameterizations, or on replacing the numerical models altogether. In some cases, numerical models may be an essential component for training an ML-based model, while in others the ML model may be driven by observational data alone.

Fast and accurate ML model physics

Applications addressing model physics consist of three different but closely related types. The first is fast emulation or “surrogate modeling” of existing model parameterizations, which applies an emulation technique to accelerate the calculation of previously developed parameterizations based on an approximate description of the underlying physical processes (e.g., radiation parameterizations; Krasnopolsky et al. 2010). The second is enhanced parameterization, based on data simulated by high-resolution models, used in situations where the underlying physical processes are very complicated and not well understood (e.g., Krasnopolsky et al. 2013; Brenowitz and Bretherton 2018). The third is data-driven parameterization, in which new empirical parameterizations are built from observed data (e.g., Haupt et al. 2019). The great flexibility of ML tools allows these three approaches to be combined, and they can also be employed to speed up calculations within a partly physical framework (e.g., Chevallier et al. 2000).

Fast emulation of existing parameterizations

ML may be used to provide a functional approximation of a model parameterization with relatively small approximation error (Chevallier et al. 1998; Veerman et al. 2020). With sufficient training data, this can produce a relatively smooth interpolation within the training set domain. When an ML emulation is developed, several criteria beyond small approximation error must be met (Krasnopolsky 2013), the most important being to achieve high performance within the host NWP model.

Fast emulations of existing model physics parameterizations are usually developed for complex parameterizations that are computational bottlenecks, such as atmospheric radiation and planetary boundary layer parameterizations (e.g., Wang et al. 2019). Krasnopolsky (2019) demonstrated that an RMS accuracy of 0.1 K per day can be obtained for varied individual instantaneous profiles with shallow NN emulators with O(100) neurons. Even in moderate-resolution climate models, the calculation of atmospheric radiation can consume more than 50% of the computational load. ML emulations of atmospheric radiation parameterizations accelerate calculation of the longwave radiation by about 16 times and the shortwave radiation by about 60 times (Krasnopolsky 2019).
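Conceptually, a shallow emulator of this kind is a one- or two-layer regression from an input column state (e.g., temperature and humidity profiles plus a few surface scalars) to a heating-rate profile. The sketch below, with invented layer sizes and a synthetic function standing in for the real radiation code, shows the pattern; it is not the operational emulator of Krasnopolsky (2019).

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(2)

n_levels = 60                      # model levels in the column
n_inputs = 2 * n_levels + 3        # e.g., T and q profiles plus a few surface scalars

# Synthetic stand-in for (input column, heating-rate profile) pairs that would
# normally be generated by running the original radiation parameterization.
x = rng.normal(size=(40_000, n_inputs)).astype("float32")
true_weights = rng.normal(scale=0.05, size=(n_inputs, n_levels)).astype("float32")
y = np.tanh(x @ true_weights).astype("float32")   # pseudo heating rates

# Shallow emulator: a single hidden layer with O(100) neurons.
emulator = tf.keras.Sequential([
    tf.keras.Input(shape=(n_inputs,)),
    tf.keras.layers.Dense(128, activation="tanh"),
    tf.keras.layers.Dense(n_levels),
])
emulator.compile(optimizer="adam", loss="mse")
emulator.fit(x, y, epochs=5, batch_size=256, validation_split=0.1, verbose=0)

# Inference replaces the radiation call inside the host model's time loop.
column = rng.normal(size=(1, n_inputs)).astype("float32")
heating_rates = emulator.predict(column, verbose=0)
print("emulated heating-rate profile shape:", heating_rates.shape)
```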

Enhanced parameterization by training on an advanced model

ML techniques can be used not only to emulate existing physics parameterizations but also to improve the representation of these subgrid-scale processes. Because of the approximations made in the parameterized physics that atmospheric general circulation models (AGCMs) use, these models cannot accurately simulate many important finescale processes like cloudiness and convective precipitation (e.g., Rasch et al. 2000; Brenowitz and Bretherton 2018; Rasp et al. 2018; Chen et al. 2019). Cloud-resolving models (CRMs) can represent many of the phenomena that lower-resolution global and regional models do not (i.e., higher-resolution fluid dynamic motions supporting updrafts and downdrafts, convective organization, mesoscale circulations, and stratiform and convective components that interact with each other). In this setting, the aim is to use ML to develop parameterizations by training on CRM data, allowing the low-resolution models to emulate the behavior of a CRM while maintaining a low computational cost. The resulting emulation can be used as an enhanced, computationally viable parameterization in an AGCM (Krasnopolsky et al. 2013; Schneider et al. 2017; Brenowitz and Bretherton 2018; Gentine et al. 2018; O’Gorman and Dwyer 2018; Bretherton et al. 2019; Brenowitz and Bretherton 2019a,b; Pal et al. 2019; Yuval and O’Gorman 2020).

Data-driven parameterization by training on observational data

In many cases, such as in Monin–Obukhov similarity theory (MOST; Monin and Obukhov 1954), the original theoretical formulations for model parameterizations were based on measured data. Traditional surface layer schemes are based on MOST and predict surface fluxes of temperature, momentum, and moisture based on physical relationships between wind speed, air and ground temperature, and air and ground specific humidity (Jiménez et al. 2012). Empirical coefficients within these relationships have traditionally been derived from experimental results; however, there is significant variation between the empirical coefficients determined by different field programs. Parameterizations can also be built directly from observational data. For example, researchers at NCAR are using data from field sites (Scoville, Idaho, and Cabauw, The Netherlands) to build new land surface layer models using neural networks and random forests (Haupt et al. 2019; Gagne et al. 2019a). A particular hurdle for boundary layer emulation is that some NWP models would require it to be incorporated within an implicit solver because the atmospheric and surface boundary layers are so tightly coupled.
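A minimal sketch of such a data-driven surface layer parameterization follows: a random forest is fit to map bulk predictors (wind speed, air-ground temperature difference, air-ground humidity difference) to a surface sensible heat flux. The data here are synthetic and the predictor set is a simplification of what Gagne et al. (2019a) actually use.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)

# Synthetic stand-in for tower observations: bulk predictors and a sensible heat flux.
n = 20_000
wind_speed = 0.5 + 14.5 * rng.random(n)            # m s-1
dtemp = rng.normal(scale=4.0, size=n)              # air minus ground temperature (K)
dq = rng.normal(scale=2.0e-3, size=n)              # air minus ground specific humidity (kg kg-1)

# Pseudo "observed" flux with a bulk-formula-like dependence plus noise.
heat_flux = -8.0 * wind_speed * dtemp + rng.normal(scale=10.0, size=n)   # W m-2

features = np.column_stack([wind_speed, dtemp, dq])
model = RandomForestRegressor(n_estimators=200, min_samples_leaf=5, random_state=0)
model.fit(features, heat_flux)

# The fitted model can then be queried inside a host model's surface scheme.
sample = np.array([[5.0, -2.0, -1.0e-3]])           # one grid point's predictors
print("predicted sensible heat flux (W m-2):", model.predict(sample)[0])
print("feature importances [wind, dT, dq]:", np.round(model.feature_importances_, 3))
```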

ML predictive models

Some recent studies have taken a more extreme approach by completely replacing the dynamical model with ML-based surrogates. Scher (2018) emulated the dynamics of a low-resolution AGCM using a deep learning NN that can predict the complete model state several time steps ahead and, through cycling, can produce a climate similar to the reference AGCM without explicitly imposing conservation properties. Dueben and Bauer (2018) used a toy model for global weather predictions to identify challenges and fundamental design choices for a forecast system based on neural networks. James et al. (2018) and O’Donncha et al. (2018) applied ML to emulate a numerical wave model, achieving a speedup of approximately 1,000-fold compared to the source numerical model. Weyn et al. (2019, 2020) applied a convolutional neural network architecture to a cubed-sphere representation of atmospheric reanalysis fields and produced realistic weather forecasts at lead times of several weeks and longer. Keller and Evans (2019a,b) applied ML (random forest regression) to replace the gas-phase chemistry in an atmospheric chemistry transport model, demonstrating best results by predicting the change in concentration for long-lived species and the concentration itself at the end of the time step for short-lived species. Other closely related studies have used “analog” methods to build a statistical representation of the numerical model that can be sampled to make new forecasts or forecast distributions (Hamill and Whitaker 2006; Delle Monache et al. 2013).
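The core idea shared by these surrogate-model studies is to learn a map from the model state at one time to the state one step later and then apply that map recursively. The toy sketch below does this for the 40-variable Lorenz-96 system rather than an AGCM, which is an enormous simplification but exhibits the same train-then-cycle pattern; the network size and step counts are arbitrary choices for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def lorenz96_rhs(x, forcing=8.0):
    """Lorenz-96 tendencies: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt=0.05):
    k1 = lorenz96_rhs(x)
    k2 = lorenz96_rhs(x + 0.5 * dt * k1)
    k3 = lorenz96_rhs(x + 0.5 * dt * k2)
    k4 = lorenz96_rhs(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

# Generate a long "nature run" from the numerical model to serve as training data.
rng = np.random.default_rng(4)
state = rng.normal(size=40)
for _ in range(500):                       # spin up onto the attractor
    state = rk4_step(state)
trajectory = [state]
for _ in range(5000):
    trajectory.append(rk4_step(trajectory[-1]))
trajectory = np.array(trajectory)

# Train an NN surrogate to map state(t) -> state(t + dt).
surrogate = MLPRegressor(hidden_layer_sizes=(256, 256), max_iter=300, random_state=0)
surrogate.fit(trajectory[:-1], trajectory[1:])

# Cycled (autoregressive) forecast: feed each prediction back in as the next input.
x_ml = trajectory[-1].copy()
x_true = trajectory[-1].copy()
for step in range(20):
    x_ml = surrogate.predict(x_ml.reshape(1, -1))[0]
    x_true = rk4_step(x_true)
print("RMSE after 20 cycled steps:", np.sqrt(np.mean((x_ml - x_true) ** 2)))
```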

Activities to leverage AI in extreme weather monitoring and prediction

Extreme weather events can have severe impacts on life and property. From nowcasting (forecasts up to about 2 h) tornado and lightning activity, to predicting longer-horizon events like heat waves and periods of prolonged precipitation, the ability to accurately predict the likelihood of extreme events can help mitigate their damage. Multiple techniques, ranging from traditional linear regression to random forests and modern NN approaches, have been demonstrated to enhance the skill of existing methods of nowcasting and predicting extreme weather. Stevenson et al. (2019) provide a historical overview of how statistical techniques were used by the National Hurricane Center (NHC) in the 1950s. Since that time, AI techniques such as regression, random forests, and neural networks have improved probabilistic hazard guidance from multimodel ensembles and forecasts of rapid intensification. Eslami et al. (2019) continued this theme, using a deep learning ensemble approach with a regressive deep convolutional neural network to predict hurricane intensity, which performed better than the individual ensemble members and better than existing NHC forecasts. McGovern (2019) and Lagerquist et al. (2020) showed that ML can be used to improve forecasts of extreme weather, including hail 24–48 h in advance, as well as nowcasting for tornadoes. Lakshmanan et al. (2019) used Geostationary Lightning Mapper (GLM) data to develop an ML-based nowcasting application. Sønderby et al. (2020) used an NN to extrapolate radar and satellite data to produce probabilistic precipitation maps out to several hours. Other research in extreme weather has focused on longer-term prediction horizons. Fan et al. (2019) applied an NN-based ensemble averaging approach to improve forecasts of week-3–4 precipitation and 2-m air temperature produced by NOAA’s Climate Forecast System (CFS). In this work, the NN corrects erroneous patterns in the model output and nearly doubles the skill compared to multiple linear regression. While these studies do not constitute a complete summary of work in this area, they highlight ongoing efforts and opportunities for future advancement.
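Many of the studies above share a common architectural pattern: a convolutional network ingests gridded fields (radar, satellite imagery, or NWP output) and outputs a scalar hazard quantity or probability. The sketch below shows that pattern for a made-up task of regressing a storm intensity value from a single-channel image patch; the patch size, channels, and network depth are illustrative assumptions, not any particular published model.

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(5)

# Synthetic gridded predictor fields (e.g., 64 x 64 satellite/radar patches, 1 channel)
# and a synthetic scalar target (e.g., storm intensity).
x = rng.normal(size=(2_000, 64, 64, 1)).astype("float32")
y = x.mean(axis=(1, 2, 3)) + 0.1 * rng.normal(size=2_000).astype("float32")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 1)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),        # scalar intensity; a sigmoid output would give probabilities
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=3, batch_size=64, verbose=0)

print("predicted intensity for one patch:", float(model.predict(x[:1], verbose=0)[0, 0]))
```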

Postprocessing of forecasts

Statisticians recognized the value of postprocessing NWP forecasts in the 1970s and developed the Model Output Statistics (MOS) technique (Glahn and Lowry 1972). As a natural extension of traditional MOS techniques, AI has been found to be quite effective at model postprocessing. NCAR began implementing the DICast system to correct and blend multiple NWP model forecasts (Myers et al. 2011) in the late 1990s and transitioned it to a gridded system in the following decade (Haupt et al. 2019). Since then, a plethora of techniques have been developed for model postprocessing. Regime-dependent postprocessing is also becoming important for improving forecasts. Greybush et al. (2008) showed that using empirical orthogonal functions could improve temperature forecasts. McCandless et al. (2016) demonstrated the use of regimes for solar power forecasting.
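At its simplest, MOS-style postprocessing is a regression from NWP-forecast predictors to the observed quantity at a station, refit from a historical archive. The sketch below uses ordinary least squares on synthetic data; operational MOS uses far richer predictor sets and predictor screening, so this is only the skeleton of the idea.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic historical archive: raw NWP 2-m temperature, forecast dewpoint, and wind speed
# as predictors; the verifying observed 2-m temperature as the predictand.
n = 5_000
t2m_fcst = 280.0 + 15.0 * rng.random(n)
td_fcst = t2m_fcst - 5.0 * rng.random(n)
wind_fcst = 10.0 * rng.random(n)
t2m_obs = 0.9 * t2m_fcst + 0.05 * td_fcst + 0.1 * wind_fcst + 12.0 + rng.normal(scale=1.0, size=n)

# Fit the MOS equation by least squares (predictors plus an intercept column).
predictors = np.column_stack([t2m_fcst, td_fcst, wind_fcst, np.ones(n)])
coeffs, *_ = np.linalg.lstsq(predictors, t2m_obs, rcond=None)

# Apply to a new forecast.
new_fcst = np.array([288.0, 284.0, 4.0, 1.0])
print("raw model T2m: 288.0 K, MOS-corrected T2m:", round(float(new_fcst @ coeffs), 2), "K")
```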

Alternative AI methods have been developed for probabilistic forecasting based on numerical model output. An example is the analog ensemble (AnEn) technique (Eckel and Delle Monache 2016), which examines a library of historical forecasts to identify those cases that best match present conditions. Combined with the verifying observations, this provides an empirical probability density function (pdf) that can be used to quantify the uncertainty of the forecast. The mean of that pdf can also help to reduce biases in the deterministic forecast (Delle Monache et al. 2013). This work leverages a single deterministic model integration to produce calibrated probabilistic information. Hamill and Whitaker (2006) and Hamill et al. (2015) used a similar analog approach to calibrate ensembles of precipitation forecasts.
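The analog ensemble idea can be written in a few lines: find the k historical forecasts most similar to the current forecast and use their verifying observations as an empirical ensemble. The sketch below uses a plain Euclidean distance over standardized predictors; Delle Monache et al. (2013) use a more carefully weighted metric, so treat this as schematic.

```python
import numpy as np

rng = np.random.default_rng(7)

# Historical archive of forecasts (each row a small predictor vector, e.g., wind speed,
# wind direction components, temperature) and the matching verifying observations.
n_hist = 10_000
hist_forecasts = rng.normal(size=(n_hist, 4))
hist_obs = hist_forecasts[:, 0] + 0.3 * rng.normal(size=n_hist)

def analog_ensemble(current_forecast, forecasts, observations, k=25):
    """Return the verifying observations of the k closest historical forecasts."""
    # Standardize each predictor so no single one dominates the distance.
    mean, std = forecasts.mean(axis=0), forecasts.std(axis=0)
    dist = np.linalg.norm((forecasts - mean) / std - (current_forecast - mean) / std, axis=1)
    return observations[np.argsort(dist)[:k]]

today = rng.normal(size=4)
members = analog_ensemble(today, hist_forecasts, hist_obs)

# The analog members form an empirical pdf: its mean is a bias-corrected deterministic
# forecast and its spread quantifies the forecast uncertainty.
print("AnEn mean:", round(float(members.mean()), 3), " AnEn spread:", round(float(members.std()), 3))
```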

The NN-based bias correction of wave forecasts by Campos et al. (2018, 2019) showed significant benefits, particularly at time horizons where errors begin to grow nonlinearly (e.g., beyond 5–7 days). Such forecast bias correction, applied to forecast models with greater nonlinear error growth on shorter time scales, has the potential to be integrated in the forecast component of the DA analysis cycle. Bolton and Zanna (2019) applied deep learning to combine observations and model data to predict unresolved processes and flow fields in a simulation study.

TWC/IBM has developed AI methods for creating probabilistic forecasts of meteorological variables and calibrated ensembles of equally likely scenarios intended to support decision services for various industries (Williams 2019). Some examples include electrical utilities mobilizing crews in preparation for potential outages and energy traders anticipating fluctuations in demand. ML has been used to postprocess NWP output to predict severe events, such as hail (Gagne et al. 2017) and tornadoes (McGovern et al. 2017). Haupt et al. (2019) and Gagne (2019) showed how application of convolutional neural networks to NWP data can not only identify storms that are most likely to develop severe hail, but can also identify the features of those storms that make them hail producing (Gagne et al. 2019b). Additionally, the NHC has implemented multiple ML methods on model output to improve forecasts of hurricane intensity (Stevenson et al. 2019).

Emerging trends in AI with potential benefits for Earth observations and NWP

Physical scientists often view AI methods as a “black box” that gives little insight into the actual underlying properties of the system. Greater understanding of how these methods behave is needed before AI methods can be readily adopted in an operational forecasting environment. Efforts are underway to develop explainable AI (McGovern et al. 2019; Samek et al. 2017; Toms et al. 2020) and physics-guided ML (Ding 2018; Karpatne et al. 2018). McGovern et al. (2019) and Toms et al. (2020) demonstrated a variety of AI interpretation methods for techniques ranging from traditional ML (including decision trees and regression) to deep learning. Ghahramani (2015) states that ML must be able to represent and manipulate uncertainty about models and predictions. Probabilistic modeling and reasoning (Pearl 1988) is effective at training models that incorporate uncertainty estimation/quantification. Explainable AI is in its infancy, particularly within Earth science, and will be an important emerging field as AI methods continue to grow in popularity.
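One widely used interpretation method applicable to almost any trained model is permutation importance: shuffle one input at a time and measure how much skill is lost. The short sketch below implements it by hand for a generic fitted scikit-learn regressor on a placeholder dataset; the model choice and data are assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(8)

# Placeholder dataset: the first two features matter, the third is pure noise.
x = rng.normal(size=(5_000, 3))
y = 2.0 * x[:, 0] - 1.0 * x[:, 1] + 0.1 * rng.normal(size=5_000)

model = GradientBoostingRegressor(random_state=0).fit(x, y)

def permutation_importance(model, x, y, n_repeats=5):
    """Increase in MSE when each feature is shuffled, averaged over repeats."""
    base_mse = np.mean((model.predict(x) - y) ** 2)
    importances = np.zeros(x.shape[1])
    for j in range(x.shape[1]):
        for _ in range(n_repeats):
            x_perm = x.copy()
            x_perm[:, j] = rng.permutation(x_perm[:, j])
            importances[j] += np.mean((model.predict(x_perm) - y) ** 2) - base_mse
    return importances / n_repeats

print("permutation importances:", np.round(permutation_importance(model, x, y), 3))
```

A large loss of skill when a feature is shuffled indicates the model relies heavily on it, giving forecasters a first, model-agnostic view inside the “black box.”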

Explicit inclusion of physics constraints into AI has the potential to improve the physical consistency, performance, interpretability, transparency, and explainability of AI-driven models. This area of research is very active and includes many connections to explainable AI. Reviews and references for different methods of combining data and knowledge in ML are given by von Rueden et al. (2019), while Roscher et al. (2020) survey how these concepts are applied in science. Physical insight is often included in the design of ML models through selection of features, predictors, activation functions, and topologies. However, new efforts are underway to explore the explicit specification of known physical constraints as part of the AI model, either by incorporating the output of a physical model or by including explicit physical constraints during training such as via a penalty term in the loss function. Implementation of such constraints resulted in better generalizability and physically meaningful insights when applied to the prediction of lake temperature profiles (Karpatne et al. 2018). To enforce conservation laws, ML training can also include functions that measure the discrepancy from the conservation laws either as a strong constraint using the method of Lagrange multipliers during the minimization or as a weak constraint included in the overall cost function (e.g., Tompson et al. 2017).
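The weak-constraint route amounts to adding a penalty term to the training loss that measures the violation of a known physical relation. The sketch below adds a hypothetical column-budget penalty, requiring the predicted tendencies to sum to approximately zero, to a standard mean-squared-error loss in TensorFlow; the constraint itself is illustrative and not taken from any of the cited studies.

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(9)

n_levels = 30
x = rng.normal(size=(20_000, n_levels)).astype("float32")
# Synthetic "true" tendencies constructed so that each column sums to zero.
y = (x - x.mean(axis=1, keepdims=True)).astype("float32")

def physics_constrained_loss(weight=0.1):
    """MSE plus a weak-constraint penalty on the column-integrated tendency."""
    def loss(y_true, y_pred):
        mse = tf.reduce_mean(tf.square(y_true - y_pred))
        budget_residual = tf.reduce_mean(tf.square(tf.reduce_sum(y_pred, axis=-1)))
        return mse + weight * budget_residual
    return loss

model = tf.keras.Sequential([
    tf.keras.Input(shape=(n_levels,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(n_levels),
])
model.compile(optimizer="adam", loss=physics_constrained_loss(weight=0.1))
model.fit(x, y, epochs=3, batch_size=256, verbose=0)

pred = model.predict(x[:5], verbose=0)
print("column sums of predicted tendencies:", np.round(pred.sum(axis=1), 3))
```

Raising the penalty weight trades a small amount of fit for tighter adherence to the constraint, which is exactly the weak-constraint behavior described above.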

Other related methods include physics-informed generative adversarial networks (PI-GAN; Wu et al. 2020). PI-GANs can account for unresolved physics by learning statistics from precomputed training data, for example by ensuring that the climatological covariance of generated samples matches that of the training dataset. This provides an alternative to the explicit development of closures or parameterizations, such as for turbulence. Adding physics can reduce the requirements for large datasets (e.g., Yang et al. 2020).

Main conclusions and challenges identified

Our objectives included 1) reviewing AI-enabling technology and tools, 2) reviewing scientific objectives for better utilization of current and future Earth observations, 3) examining ideas to improve NWP skill and the efficiency of environmental data processing, and 4) identifying innovative ways to use satellite data and other environmental data to create new products and services. An overarching goal is to gather information to help inform the NOAA Artificial Intelligence Strategy and to establish a roadmap to fully leverage AI in NOAA’s Earth science portfolio. Benefits of AI to operational forecast systems sometimes take the form of a complement (e.g., by correcting tropical cyclone forecasts) and sometimes of a full alternative to heritage systems (e.g., for image correction and forward problem emulation for remote sensing). Because ML is particularly efficient for modeling nonlinear relationships, there is potential for emulating or replacing NWP parameterizations such as radiation and cumulus convection. AI also has potential to improve long-term data stewardship, for example by using natural language processing for data rescue and by mining metadata to make the data more available and discoverable.

While some technologists believe AI will be disruptive in other industries, affecting everything from workplace roles to organizational structures (Bloomberg 2018), workshop participants surveyed largely agreed that AI would supplement current tools (Fig. 3). A panel of AI experts expressed broad concern that, because AI tools are becoming easier to use, the larger NWP community may lose trust in AI if the application of AI is not managed well. The panel agreed that the lack of formal partnership and mentoring mechanisms across the sectors poses a challenge, yet encouraged NOAA to expand the use of cooperative research and development agreements (CRADAs), which allow a government agency to team with a private company or university, and to consider innovative mechanisms to improve collaboration, such as prize challenges. The panel noted that much of NOAA’s data are highly structured, making them ideal for AI applications. If more of NOAA’s data were made available in a cloud sandbox, if these data were labeled, when appropriate, and if domain scientists were available to explain nuances within the data, then others across the AI community could more easily contribute to solving NOAA’s challenges. Finally, the panel discussed the need to take a whole-systems approach to AI development, so that anticipated advancements in AI will inform NOAA’s future requirements in related areas such as data assimilation, parameterization development, and computing, resulting in the development of future forecast and postprocessing systems that combine both AI and traditional physics-based approaches, leveraging their relative strengths.

Fig. 3. Real-time survey results from workshop participants presented as (a)–(d) bar plots and (e) a word cloud; N indicates sample size.

There are challenges associated with greater adoption of AI. Training requirements for multivariate problems can be enormous, both in terms of training datasets and high-performance computing (HPC). Since deep learning models are unreliable when extrapolating beyond the domain covered by the training set, or when the dataset is nonstationary due to changing conditions such as those caused by climate change, the training set should be representative and comprehensive. The fastest progress is expected to involve working on smaller subproblems or estimating corrections to existing conventional applications. The transition from research to operations is a critical step, but trust in AI techniques is an obstacle. There is a knowledge gap related to AI in the weather, water, and climate workforce that contributes to a lack of trust. If scientists and forecasters do not understand a technique, they will be less likely to trust it, and thus less likely to use it. While the growing availability of GPUs and of new software tools (e.g., TensorFlow and PyTorch) that can exploit GPUs has made efficient and scalable ML more accessible to Earth science researchers, concerns remain about the acceptance and implementation of the Python language in high-performance computing and operational use, and about the availability of translation methods and/or the compatibility of the new software tools with FORTRAN [but see recent efforts in this area by Ott et al. (2020)].

Despite the challenges in leveraging AI for Earth science, we expect greatly expanded use of AI for environmental data and forecasting applications. The drive to simultaneously improve forecast skill (by accounting for unknown or difficult-to-model phenomena) and increase efficiency (thereby reducing cost and meeting latency requirements) will continue to make AI attractive to operational centers like NOAA. The promotion of AI by the U.S. Government6 to help the economy and society in a variety of applications will be a major additional strategic driver for the increased use of AI techniques in Earth science.

Acknowledgments

The authors are members of the Scientific and/or Local Organizing Committees of the NOAA workshop on “Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction.” Grateful acknowledgement is made for funding and support provided by NOAA for the AI workshop and for several of the authors. NCAR is sponsored by the National Science Foundation. SGP acknowledges support from NOAA NGGPS (NA18NWS4680048), ONR (N00014-19-1-2522), and NOAA NESDIS. RNH and KI acknowledge support from NOAA (NA14NES4320003 and NA19NES4320002) through CICS and CISESS at the University of Maryland/ESSIC. The scientific results and conclusions, as well as any views or opinions expressed herein, are those of the author(s) and do not necessarily reflect those of NOAA or the U.S. Department of Commerce.

References

  • Abarbanel, H. D. I., P. J. Rozdeba, and S. Shirman, 2018: Machine learning: Deepest learning as statistical data assimilation problems. Neural Comput ., 30, 20252055, https://doi.org/10.1162/neco_a_01094.

    • Search Google Scholar
    • Export Citation
  • Ball, J. E., D. T. Anderson, and C. S. Chan, 2017: Comprehensive survey of deep learning in remote sensing: Theories, tools, and challenges for the community. J. Appl. Remote Sens., 11, 042609, https://doi.org/10.1117/1.JRS.11.042609.

    • Search Google Scholar
    • Export Citation
  • Bannister, R. N., 2017: A review of operational methods of variational and ensemble-variational data assimilation. Quart. J. Roy. Meteor. Soc., 143, 607633, https://doi.org/10.1002/qj.2982.

    • Search Google Scholar
    • Export Citation
  • Beucler, T., S. Rasp, M. Pritchard, and P. Gentine, 2019: Achieving conservation of energy in neural network emulators for climate modeling. arXiv, 5 pp., http://arxiv.org/abs/1906.06622.

    • Search Google Scholar
    • Export Citation
  • Bloomberg, J., 2018: Think You Know How Disruptive Artificial Intelligence Is? Think Again. Forbes, 7 July, www.forbes.com/sites/jasonbloomberg/2018/07/07/think-you-know-how-disruptive-artificial-intelligence-is-think-again/.

    • Search Google Scholar
    • Export Citation
  • Bocquet, M., J. Brajard, A. Carrassi, and L. Bertino, 2019: Data assimilation as a learning tool to infer ordinary differential equation representations of dynamical models. Nonlinear Processes Geophys ., 26, 143162, https://doi.org/10.5194/npg-26-143-2019.

    • Search Google Scholar
    • Export Citation
  • Bocquet, M., J. Brajard, A. Carrassi, and L. Bertino, 2020: Bayesian inference of chaotic dynamics by merging data assimilation, machine learning and expectation-maximization. Found. Data Sci., 5580, https://doi.org/10.3934/fods.2020004.

    • Search Google Scholar
    • Export Citation
  • Bolton, T., and L. Zanna, 2019: Applications of deep learning to ocean data inference and subgrid parameterization. J. Adv. Model. Earth Syst., 11, 376399, https://doi.org/10.1029/2018MS001472.

    • Search Google Scholar
    • Export Citation
  • Bonfanti, C., L. Trailovic, J. Stewart, and M. Govett, 2018: Machine learning: Defining worldwide cyclone labels for training. 21st Int. Conf. on Information Fusion (FUSION), Cambridge, United Kingdom, IEEE, 753760, https://doi.org/10.23919/ICIF.2018.8455276.

    • Search Google Scholar
    • Export Citation
  • Boukabara, S.-A., V. Krasnopolsky, and J. Q. Stewart, 2019a: Overview of NOAA AI activities in satellite observations and NWP: Status and perspectives. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Tuesday/S1-2_NOAAai2019_Boukabara.pptx.

    • Search Google Scholar
    • Export Citation
  • Boukabara, S.-A., V. Krasnopolsky, J. Q. Stewart, E. S. Maddy, N. Shahroudi, and R. N. Hoffman, 2019b: Leveraging modern artificial intelligence for remote sensing and NWP: Benefits and challenges. Bull. Amer. Meteor. Soc., 100, ES473ES491, https://doi.org/10.1175/BAMS-D-18-0324.1.

    • Search Google Scholar
    • Export Citation
  • Brajard, J., A. Carrassi, M. Bocquet, and L. Bertino, 2020: Combining data assimilation and machine learning to emulate a dynamical model from sparse and noisy observations: A case study with the Lorenz 96 model. J. Comput. Sci., 44, 101171, https://doi.org/10.5194/GMD-2019-136-RC1.

    • Search Google Scholar
    • Export Citation
  • Brenowitz, N. D., and C. S. Bretherton, 2018: Prognostic validation of a neural network unified physics parameterization. Geophys. Res. Lett., 45, 62896298, https://doi.org/10.1029/2018GL078510.

    • Search Google Scholar
    • Export Citation
  • Brenowitz, N. D., and C. S. Bretherton, 2019a: Training neural network parameterizations with near-global cloud-resolving models. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Wednesday/S4_1-4_NOAAai2019_Brenowitz.pptx.

    • Search Google Scholar
    • Export Citation
  • Brenowitz, N. D., and C. S. Bretherton, 2019b: Spatially extended tests of a neural network parametrization trained by coarse-graining. J. Adv. Model. Earth Syst., 11, 27282744, https://doi.org/10.1029/2019MS001711.

    • Search Google Scholar
    • Export Citation
  • Bretherton, C., J. McGovern, and N. Brenowitz, 2019: Machine learning for moist physics parameterizations in weather and climate models. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Wednesday/S4_1-2_NOAAai2019_Bretherton.pptx.

    • Search Google Scholar
    • Export Citation
  • Campos, R. M., J.-H. G. M. Alves, S. G. Penny, and V. Krasnopolsky, 2018: Assessments of surface winds and waves from the NCEP ensemble forecast system. Wea. Forecasting, 33, 15331546, https://doi.org/10.1175/WAF-D-18-0086.1.

    • Search Google Scholar
    • Export Citation
  • Campos, R. M., V. Krasnopolsky, J.-H. G. M. Alves, and S. G. Penny, 2019: Nonlinear wave ensemble averaging in the Gulf of Mexico using neural network. J. Atmos. Oceanic Technol., 36, 113127, https://doi.org/10.1175/JTECH-D-18-0099.1.

    • Search Google Scholar
    • Export Citation
  • Chang, B., L. Meng, E. Haber, F. Tung, and D. Begert, 2018: Multi-level residual networks from dynamical systems view. arXiv, 14 pp., https://arxiv.org/abs/1710.10348.

    • Search Google Scholar
    • Export Citation
  • Chen, C.-C., A. Gettelman, and D. J. Gagne, 2019: Emulating critical cloud processes using machine learning. 2019 Fall Meeting, San Francisco, CA, Amer. Geophys. Union, Abstract A53H-05, https://agu.confex.com/agu/fm19/meetingapp.cgi/Paper/506720.

    • Search Google Scholar
    • Export Citation
  • Chevallier, F., F. Chéruy, N. A. Scott, and A. Chédin, 1998: A neural network approach for a fast and accurate computation of a longwave radiative budget. J. Appl. Meteor., 37, 13851397, https://doi.org/10.1175/1520-0450(1998)037<1385:ANNAFA>2.0.CO;2.

    • Search Google Scholar
    • Export Citation
  • Chevallier, F., J.-J. Morcrette, F. Chéruy, and N. A. Scott, 2000: Use of a neural-network-based long-wave radiative-transfer scheme in the ECMWF atmospheric model. Quart. J. Roy. Meteor. Soc., 126, 761776, https://doi.org/10.1002/qj.49712656318.

    • Search Google Scholar
    • Export Citation
  • Cintra, R. S., and H. F. de Campos Velho, 2018: Data assimilation by artificial neural networks for an atmospheric general circulation model. Advanced Applications for Artificial Neural Networks, A. El-Shahat, Ed., IntechOpen, 265–286, https://doi.org/10.5772/intechopen.70791.

    • Search Google Scholar
    • Export Citation
  • Delle Monache, L., F. A. Eckel, D. L. Rife, B. Nagarajan, and K. Searight, 2013: Probabilistic weather prediction with an analog ensemble. Mon. Wea. Rev., 141, 34983516, https://doi.org/10.1175/MWR-D-12-00281.1.

    • Search Google Scholar
    • Export Citation
  • Ding, L., 2018: Human knowledge in constructing AI systems—Neural logic networks approach towards an explainable AI. Procedia Comput. Sci., 126, 15611570, https://doi.org/10.1016/j.procs.2018.08.129.

  • Dueben, P. D., and P. Bauer, 2018: Challenges and design choices for global weather and climate models based on machine learning. Geosci. Model Dev., 11, 3999–4009, https://doi.org/10.5194/gmd-11-3999-2018.

  • Eckel, F. A., and L. Delle Monache, 2016: A hybrid NWP-analog ensemble. Mon. Wea. Rev., 144, 897–911, https://doi.org/10.1175/MWR-D-15-0096.1.

  • Eslami, E., and Coauthors, 2019: Hybrid AI hurricane forecasting system: Deep learning ensemble approach and Kalman filter. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Thursday/S5-3_NOAAai2019_Choi.pdf.

  • Fablet, R., S. Ouala, and C. Herzet, 2017: Bilinear residual neural network for the identification and forecasting of geophysical dynamics. 26th European Signal Processing Conf. (EUSIPCO), Rome, Italy, IEEE, 1477–1481, https://doi.org/10.23919/EUSIPCO.2018.8553492.

  • Fan, Y., C.-Y. Wu, J. Gottschalck, and V. Krasnopolsky, 2019: Using artificial neural networks to improve CFS week 3-4 precipitation and 2 meter air temperature forecasts. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Thursday/S5-6_NOAAai2019_Fan.pptx.

  • Fisher, M., M. Leutbecher, and G. Kelly, 2005: On the equivalence between Kalman smoothing and weak-constraint four-dimensional variational data assimilation. Quart. J. Roy. Meteor. Soc., 131, 3235–3246, https://doi.org/10.1256/qj.04.142.

  • Gagne, D. J., 2019: Machine learning parameterizations from the surface to the clouds. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/posters/P2.21_Gagne.pdf.

  • Gagne, D. J., A. McGovern, S. E. Haupt, R. A. Sobash, J. K. Williams, and M. Xue, 2017: Storm-based probabilistic hail forecasting with machine learning applied to convection-allowing ensembles. Wea. Forecasting, 32, 1819–1840, https://doi.org/10.1175/WAF-D-17-0010.1.

  • Gagne, D. J., T. C. McCandless, T. Brummet, B. Kosovic, and S. E. Haupt, 2019a: Surface layer flux machine learning parameterizations. 18th Conf. on Artificial and Computational Intelligence and its Applications to the Environmental Sciences, Boston, MA, Amer. Meteor. Soc., 5B.1, https://ams.confex.com/ams/2019Annual/meetingapp.cgi/Paper/352862.

  • Gagne, D. J., S. E. Haupt, D. W. Nychka, and G. Thompson, 2019b: Interpretable deep learning for spatial analysis of severe hailstorms. Mon. Wea. Rev., 147, 2827–2845, https://doi.org/10.1175/MWR-D-18-0316.1.

  • Geer, A., 2019: Opportunities for using AI methods in weather forecasting at ECMWF. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Tuesday/S1-5_NOAAai2019_Geer.pptx.

  • Geer, A., 2021: Learning earth system models from observations: Machine learning or data assimilation? Philos. Trans. Roy. Soc. A, 379, 1–21, https://doi.org/10.1098/rsta.2020.0089.

  • Gentine, P., M. Pritchard, S. Rasp, G. Reinaudi, and G. Yacalis, 2018: Could machine learning break the convection parameterization deadlock? Geophys. Res. Lett., 45, 5742–5751, https://doi.org/10.1029/2018GL078202.

  • Ghahramani, Z., 2015: Probabilistic machine learning and artificial intelligence. Nature, 521, 452–459, https://doi.org/10.1038/nature14541.

  • Glahn, H. R., and D. A. Lowry, 1972: The use of model output statistics (MOS) in objective weather forecasting. J. Appl. Meteor., 11, 1203–1211, https://doi.org/10.1175/1520-0450(1972)011<1203:TUOMOS>2.0.CO;2.

  • Greybush, S. J., S. E. Haupt, and G. S. Young, 2008: The regime dependence of optimally weighted ensemble model consensus forecasts of surface temperature. Wea. Forecasting, 23, 1146–1161, https://doi.org/10.1175/2008WAF2007078.1.

  • Hall, D., 2019: AI for science: Applications of NVIDIA GPUs for numerical weather prediction. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Tuesday/S1-7_NOAAai2019_Hall.pptx.

  • Hamill, T. M., and J. S. Whitaker, 2006: Probabilistic quantitative precipitation forecasts based on reforecast analogs: Theory and application. Mon. Wea. Rev., 134, 3209–3229, https://doi.org/10.1175/MWR3237.1.

  • Hamill, T. M., M. Scheuerer, and G. T. Bates, 2015: Analog probabilistic precipitation forecasts using GEFS reforecasts and climatology-calibrated precipitation analyses. Mon. Wea. Rev., 143, 3300–3309, https://doi.org/10.1175/MWR-D-15-0004.1.

  • Haupt, S. E., A. Pasini, and C. Marzban, Eds., 2008: Artificial Intelligence Methods in the Environmental Sciences. Springer, 424 pp., www.springer.com/us/book/9781402091179.

  • Haupt, S. E., D. J. Gagne, T. McCandless, J. Cowie, and B. Petzke, 2019: Artificial intelligence applications at NCAR. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Tuesday/S1-4_NOAAai2019_Haupt.pptx.

  • Hickey, J., 2019: Alphabet AI and weather. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Tuesday/S1-6_NOAAai2019_Hickey.pptx.

  • Hsieh, W. W., 2009: Machine Learning Methods in the Environmental Sciences: Neural Networks and Kernels. Cambridge University Press, 349 pp., https://doi.org/10.1017/CBO9780511627217.

  • Hsieh, W. W., and B. Tang, 1998: Applying neural network models to prediction and data analysis in meteorology and oceanography. Bull. Amer. Meteor. Soc., 79, 1855–1870, https://doi.org/10.1175/1520-0477(1998)079<1855:ANNMTP>2.0.CO;2.

  • Hunt, B., E. J. Kostelich, and I. Szunyogh, 2007: Efficient data assimilation for spatiotemporal chaos: A local ensemble transform Kalman filter. Physica D, 230, 112–126, https://doi.org/10.1016/j.physd.2006.11.008.

  • James, S. C., Y. Zhang, and F. O’Donncha, 2018: A machine learning framework to forecast wave conditions. Coast. Eng., 137, 1–10, https://doi.org/10.1016/j.coastaleng.2018.03.004.

  • Jiang, C. M., and Coauthors, 2020: MeshfreeFlowNet: A physics-constrained deep continuous space-time super-resolution framework. SC20: Int. Conf. for High Performance Computing, Networking, Storage and Analysis, Atlanta, GA, IEEE, 1–15, https://doi.org/10.1109/sc41405.2020.00013.

  • Jiménez, P. A., J. Dudhia, J. F. González-Rouco, J. Navarro, J. P. Montávez, and E. García-Bustamante, 2012: A revised scheme for the WRF surface layer formulation. Mon. Wea. Rev., 140, 898–918, https://doi.org/10.1175/MWR-D-11-00056.1.

  • Karpatne, A., W. Watkins, J. Read, and V. Kumar, 2018: Physics-guided neural networks (PGNN): An application in lake temperature modeling. arXiv, 11 pp., https://arxiv.org/abs/1710.11431.

  • Keller, C. A., and M. J. Evans, 2019a: Application of random forest regression to the calculation of gas-phase chemistry within the GEOS-Chem chemistry model v10. Geosci. Model Dev., 12, 1209–1225, https://doi.org/10.5194/gmd-12-1209-2019.

  • Keller, C. A., and M. J. Evans, 2019b: Atmospheric chemistry modeling and air quality forecasting using machine learning. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Thursday/S4_2-2_NOAAai2019_Keller.pptx.

  • Krasnopolsky, V., 2013: The Application of Neural Networks in the Earth System Sciences: Neural Networks Emulations for Complex Multidimensional Mappings. Atmospheric and Oceanographic Sciences Library, Vol. 46, Springer, 205 pp., https://doi.org/10.1007/978-94-007-6073-8.

  • Krasnopolsky, V., 2019: Neural network applications in numerical modeling. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Thursday/S4_2-1_NOAAai2019_Krasnopolsky.pptx.

  • Krasnopolsky, V., W. H. Gemmill, and L. C. Breaker, 1999: A multi-parameter empirical ocean algorithm for SSM/I retrievals. Can. J. Rem. Sens., 25, 486–503, https://doi.org/10.1080/07038992.1999.10874747.

  • Krasnopolsky, V., M. S. Fox-Rabinovitz, Y. T. Hou, S. J. Lord, and A. A. Belochitski, 2010: Accurate and fast neural network emulations of model radiation for the NCEP coupled climate forecast system: Climate simulations and seasonal predictions. Mon. Wea. Rev., 138, 1822–1842, https://doi.org/10.1175/2009MWR3149.1.

  • Krasnopolsky, V., M. S. Fox-Rabinovitz, and A. A. Belochitski, 2013: Using ensemble of neural networks to learn stochastic convection parameterizations for climate and numerical weather prediction models from data simulated by a cloud resolving model. Adv. Artif. Neural Syst., 2013, 485913, https://doi.org/10.1155/2013/485913.

  • Lagerquist, R., A. McGovern, C. R. Homeyer, D. J. Gagne II, and T. Smith, 2020: Deep learning on three-dimensional multiscale data for next-hour tornado prediction. Mon. Wea. Rev., 148, 2837–2861, https://doi.org/10.1175/MWR-D-19-0372.1.

  • Lakshmanan, V., J. Hickey, C. Gazen, and W. A. Zaki, 2019: Nowcasting lightning events with a cloud-based deep learning approach. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Thursday/S5-5_NOAAai2019_Lak.pptx.

  • Lee, Y.-J., D. Hall, J. Stewart, and M. Govett, 2019: Machine learning for targeted assimilation of satellite data. Machine Learning and Knowledge Discovery in Databases: ECML PKDD 2018, Lecture Notes in Computer Science, Vol. 11053, Springer, 53–68, https://doi.org/10.1007/978-3-030-10997-4_4.

  • Lindskog, M., D. Dee, Y. Trémolet, E. Andersson, G. Radnóti, and M. Fisher, 2009: A weak-constraint four-dimensional variational analysis system in the stratosphere. Quart. J. Roy. Meteor. Soc., 135, 695–706, https://doi.org/10.1002/qj.392.

  • Madaus, L. E., and C. F. Mass, 2017: Evaluating smartphone pressure observations for mesoscale analyses and forecasts. Wea. Forecasting, 32, 511–531, https://doi.org/10.1175/WAF-D-16-0135.1.

  • McCandless, T. C., G. S. Young, S. E. Haupt, and L. M. Hinkelman, 2016: Regime-dependent short-range solar irradiance forecasting. J. Appl. Meteor. Climatol., 55, 1599–1613, https://doi.org/10.1175/JAMC-D-15-0354.1.

  • McGovern, A., 2019: Using machine learning to improve prediction and understanding of convective hazards. First NOAA Workshop on Leveraging AI in the Exploitation of Satellite Earth Observations & Numerical Weather Prediction, College Park, MD, NOAA, www.star.nesdis.noaa.gov/star/documents/meetings/2019AI/Thursday/S5-1_NOAAai2019_McGovern.pptx.

  • McGovern, A., K. L. Elmore, D. J. Gagne II, S. E. Haupt, C. D. Karstens, R. Lagerquist, T. Smith, and J. K. Williams, 2017: Using artificial intelligence to improve real-time decision-making for high-impact weather. Bull. Amer. Meteor. Soc., 98, 2073–2090, https://doi.org/10.1175/BAMS-D-16-0123.1.

  • McGovern, A., R. A. Lagerquist, D. J. Gagne, E. Jergensen, K. L. Elmore, C. R. Homeyer, and T. Smith, 2019: Making the black box more transparent: Understanding the physical implications of machine learning. Bull. Amer. Meteor. Soc., 100, 2175–2199, https://doi.org/10.1175/BAMS-D-18-0195.1.

  • Monin, A. S., and A. M. Obukhov, 1954: Basic regularity in turbulent mixing in the surface layer of the atmosphere. Tr. Geofiz. Inst., Akad. Nauk SSSR, 24, 163–187.

  • Myers, W., G. Wiener, S. Linden, and S. E. Haupt, 2011: A consensus forecasting approach for improved turbine hub height wind speed predictions. AWEA Windpower Conf. & Exhibition, Anaheim, CA, American Wind Energy Association, 1–36, https://opensky.ucar.edu/islandora/object/conference:3296.

  • Ngodock, H., M. Carrier, S. Smith, and I. Souopgui, 2017: Weak and strong constraints variational data assimilation with the NCOM-4DVAR in the Agulhas Region using the representer method. Mon. Wea. Rev., 145, 1755–1764, https://doi.org/10.1175/MWR-D-16-0264.1.

  • O’Donncha, F., Y. Zhang, B. Chen, and S. C. James, 2018: An integrated framework that combines machine learning and numerical models to improve wave-condition forecasts. J. Mar. Syst., 186, 29–36, https://doi.org/10.1016/j.jmarsys.2018.05.006.

  • O’Gorman, P. A., and J. G. Dwyer, 2018: Using machine learning to parameterize moist convection: Potential for modeling of climate, climate change, and extreme events. J. Adv. Model. Earth Syst., 10, 2548–2563, https://doi.org/10.1029/2018MS001351.

  • Ott, J., M. Pritchard, N. Best, E. Linstead, M. Curcic, and P. Baldi, 2020: A Fortran-Keras deep learning bridge for scientific computing. arXiv, 14 pp., https://arxiv.org/abs/2004.10652.

  • Pal, A., S. Mahajan, and M. R. Norman, 2019: Using deep neural networks as cost-effective surrogate models for super-parameterized E3SM radiative transfer. Geophys. Res. Lett., 46, 6069–6079, https://doi.org/10.1029/2018GL081646.

  • Pathak, J., B. Hunt, M. Girvan, Z. Lu, and E. Ott, 2018a: Model-free prediction of large spatiotemporally chaotic systems from data: A reservoir computing approach. Phys. Rev. Lett., 120, 024102, https://doi.org/10.1103/PhysRevLett.120.024102.

  • Pathak, J., A. Wikner, R. Fussell, S. Chandra, B. R. Hunt, M. Girvan, and E. Ott, 2018b: Hybrid forecasting of chaotic processes: Using machine learning in conjunction with a knowledge-based model. Chaos, 28, 041101, https://doi.org/10.1063/1.5028373.

  • Pearl, J., 1988: Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann, 552 pp.

  • Prabhat, and Coauthors, 2020: ClimateNet: An expert-labelled open dataset and deep learning architecture for enabling high-precision analyses of extreme weather. Geosci. Model Dev., 14, 107–124, https://doi.org/10.5194/gmd-2020-72.

  • Rasch, P. J., and Coauthors, 2000: A comparison of scavenging and deposition processes in global models: Results from the WCRP Cambridge workshop of 1995. Tellus, 52B, 1025–1056, https://doi.org/10.3402/tellusb.v52i4.17091.

  • Rasp, S., M. S. Pritchard, and P. Gentine, 2018: Deep learning to represent subgrid processes in climate models. Proc. Natl. Acad. Sci. USA, 115, 9684–9689, https://doi.org/10.1073/pnas.1810286115.

  • Reichstein, M., G. Camps-Valls, B. Stevens, M. Jung, J. Denzler, N. Carvalhais, and Prabhat, 2019: Deep learning and process understanding for data-driven Earth system science. Nature, 566, 195–204, https://doi.org/10.1038/s41586-019-0912-1.

  • Rivera, J. P., J. Verrelst, J. Gómez-Dans, J. Muñoz-Marí, J. Moreno, and G. Camps-Valls, 2015: An emulator toolbox to approximate radiative transfer models with statistical learning. Remote Sens., 7, 9347–9370, https://doi.org/10.3390/rs70709347.

  • Rodríguez-Fernández, N., P. de Rosnay, C. Albergel, P. Richaume, F. Aires, C. Prigent, and Y. Kerr, 2019: SMOS neural network soil moisture data assimilation in a land surface model and atmospheric impact. Remote Sens., 11, 1334, https://doi.org/10.3390/rs11111334.