Abstract
The Global Positioning System dropwindsonde has provided thousands of high-resolution kinematic and thermodynamic soundings in and around tropical cyclones (TCs) since 1997. These data have revolutionized the understanding of TC structure, improved forecasts, and validated observations from remote-sensing platforms. About 400 peer-reviewed studies on TCs using these data have been published to date. This paper reviews the history of dropwindsonde observations, changes to dropwindsonde technology since it was first used in TCs in 1982, and how the data have improved forecasting and changed our understanding of TCs.
Abstract
To address critical gaps identified by the National Academies of Sciences, Engineering, and Medicine in the current Earth system observation strategy, the 2017-2027 Decadal Survey for Earth Science and Applications from Space recommended incubating concepts for future targeted observables, including the atmospheric planetary boundary layer (PBL). A subsequent NASA PBL Incubation Study Team Report identified measurement requirements and activities for advancing the maturity of the technologies applicable to the PBL targeted observables and their associated science and applications priorities. While the PBL is the critical layer where humans live and where surface energy, moisture, and mass exchanges drive the Earth system, it is also the farthest and most inaccessible layer for spaceborne instruments. Here we document a PBL retrieval Observing System Simulation Experiment (OSSE) framework suitable for assessing existing and new measurement techniques and determining their accuracy and the improvements needed to address the elevated Decadal Survey requirements. In particular, the benefits of Large-Eddy Simulation (LES) are emphasized as a key source of high-resolution synthetic observations for key PBL regimes: from the tropics, through the subtropics and midlatitudes, to subpolar and polar regions. The potential of LES-based PBL retrieval OSSEs is explored using six instrument simulators: global navigation satellite system-radio occultation, differential absorption radar, visible to shortwave infrared spectrometer, infrared sounder, multi-angle imaging radio spectrometer, and microwave sounder. The crucial role of LES in PBL retrieval OSSEs and some perspectives for instrument developments are discussed.
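As a rough illustration of the OSSE concept described above (not the framework or the instrument simulators of this study), the sketch below degrades a synthetic "truth" humidity profile with a hypothetical instrument's vertical averaging kernel and noise, then compares the resulting error against a purely illustrative accuracy target; every parameter here is invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative "nature run": a synthetic specific-humidity profile (g/kg)
# standing in for an LES column; 50 levels from the surface to 3 km.
z = np.linspace(0.0, 3000.0, 50)                      # height (m)
q_truth = 12.0 * np.exp(-z / 1500.0)                  # idealized moist-PBL decay

def simulate_observation(profile, z, fwhm_m, noise_gkg):
    """Hypothetical instrument: Gaussian vertical averaging plus random noise."""
    sigma = fwhm_m / 2.355
    weights = np.exp(-0.5 * ((z[:, None] - z[None, :]) / sigma) ** 2)
    weights /= weights.sum(axis=1, keepdims=True)     # each kernel row sums to 1
    smoothed = weights @ profile
    return smoothed + rng.normal(0.0, noise_gkg, size=profile.shape)

# Two notional instruments with different vertical resolution and noise.
obs_coarse = simulate_observation(q_truth, z, fwhm_m=1000.0, noise_gkg=0.3)
obs_fine = simulate_observation(q_truth, z, fwhm_m=300.0, noise_gkg=0.5)

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

# The "retrieval" here is trivially the observation itself; a real OSSE would
# insert the candidate retrieval algorithm at this step.
requirement_gkg = 0.5   # nominal accuracy target, purely illustrative
for name, obs in [("coarse", obs_coarse), ("fine", obs_fine)]:
    err = rmse(obs, q_truth)
    verdict = "meets" if err <= requirement_gkg else "misses"
    print(f"{name:6s}: RMSE = {err:.2f} g/kg -> {verdict} the illustrative requirement")
```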
Abstract
We applied social science research principles to develop a suite of probabilistic winter weather forecasting visualizations for High-Resolution Ensemble Forecast (HREF) system output. This was achieved through an iterative, dialogic process with U.S. National Weather Service (NWS) forecasters to design nine new web-based, interactive products aimed at improving visualizations of winter weather event magnitudes, characteristics, and timing. These products were influenced by feedback from a preliminary focus group, which emphasized the importance of product credibility, contextualization, and scalability. In a follow-up discussion, winter weather forecasting experts found the event timing products to have the greatest utility due to their association with impact-based decision support services (IDSS). Furthermore, forecasters assessed snowfall rates, rather than snowfall totals or radar reflectivity, as the most impactful variable. The timing products include plots of probabilistic snowfall onset time and duration, rush hour intersection probabilities, and a combination meteogram. The onset and duration plots visualize the ensemble-average onset time and duration of a specified snowfall rate, as demonstrated in previous works, but with the addition of uncertainty information by visualizing the earliest, most likely, and latest potential onset times, as well as the shortest, most likely, and longest potential durations. The rush hour product visualizes the probability of exceeding a specified snowfall rate during local commutes, and the combination meteogram allows rapid identification of high-impact periods by encoding probabilities of precipitation, precipitation-type probabilities, and average rates into one graphical tool. Examples of these interactive products are maintained on our companion website: www.visweather.com/bams2023.
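As a hedged sketch of how onset, duration, and rush-hour statistics of this kind might be computed from ensemble output (not the authors' implementation), the example below applies an assumed snowfall-rate threshold to synthetic member time series.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic hourly snowfall rates (in/hr) for 10 ensemble members over 24 h,
# standing in for HREF member output at one location.
hours = np.arange(24)
members = np.clip(rng.normal(0.6, 0.4, size=(10, 24)) *
                  np.exp(-((hours - 14) / 5.0) ** 2), 0.0, None)

rate_threshold = 0.5   # in/hr, the "specified snowfall rate" (illustrative)

onsets, durations = [], []
for m in members:
    hit = np.where(m >= rate_threshold)[0]
    if hit.size:
        onsets.append(hit[0])               # first hour at or above the threshold
        durations.append(hit.size)          # hours at or above the threshold

onsets, durations = np.array(onsets), np.array(durations)

# Earliest / "most likely" (median used here) / latest onset, and the analogous
# shortest / most likely / longest duration across members.
print("onset    (earliest, median, latest):", onsets.min(), np.median(onsets), onsets.max())
print("duration (shortest, median, longest):", durations.min(), np.median(durations), durations.max())

# Rush-hour product: fraction of members exceeding the threshold at any point
# during a local commute window (here 16-19 local time, illustrative).
rush = slice(16, 19)
p_rush = np.mean((members[:, rush] >= rate_threshold).any(axis=1))
print(f"P(rate >= {rate_threshold} in/hr during evening rush): {p_rush:.0%}")
```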
Abstract
We developed five prototype convection-allowing model ensemble visualization products with the goal of improving depictions of the timing of winter weather hazards. These products are interactive, web-based plots visualizing probabilistic onset times and durations of intense snowfall rates, probabilities of heavy snow at rush hour, periods of heightened impacts, and mesoscale snowband probabilities. Prototypes were evaluated in three experimental groups coordinated by the Weather Prediction Center (WPC) Hydrometeorological Testbed (HMT), with a total of 53 National Weather Service (NWS) forecasters. Forecasters were asked to complete a simple forecast exercise for a snowfall event, with a control group using the Storm Prediction Center’s (SPC) High-Resolution Ensemble Forecast (HREF) system viewer and an experimental group using both the HREF viewer and the five experimental graphics. Forecast accuracy was similar between the groups, but the experimental group exhibited smaller mean absolute error for snowfall duration forecasts. In a series of Likert-scale questions, participants responded favorably to all of the products and indicated that they would use them in operational forecasts and in communicating information to core partners. Forecasters also felt that the new products improved their comprehension of ensemble spread and reduced the time required to complete the forecasting exercise. Follow-up plenary discussions reiterated that there is high demand for ensemble products of this type, though a number of potential improvements, such as greater customizability, were suggested. Ultimately, we demonstrated that social science methods can be effectively employed in the atmospheric sciences to yield improved visualization products.
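For orientation only, the duration-error comparison mentioned above reduces to a mean absolute error calculation; the forecasts and verifying observation below are invented.

```python
import numpy as np

# Hypothetical snowfall-duration forecasts (hours) from a control and an
# experimental group for the same event, plus the verifying duration.
observed = 9.0
control = np.array([6.0, 12.0, 14.0, 5.0, 11.0])
experimental = np.array([8.0, 10.0, 9.5, 7.5, 11.0])

def mae(forecasts, obs):
    """Mean absolute error of a set of duration forecasts against one observation."""
    return float(np.mean(np.abs(forecasts - obs)))

print("control MAE:     ", mae(control, observed), "h")
print("experimental MAE:", mae(experimental, observed), "h")
```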
Abstract
Ice formation and growth processes play a crucial role in the evolution of cloud systems and the formation of precipitation. However, the initial formation and growth of ice crystals are challenging to study in the real atmosphere, resulting in uncertainties in weather forecasts and climate projections. The CLOUDLAB project tackles this problem by using supercooled stratus clouds as a natural laboratory for targeted glaciogenic cloud seeding to advance the understanding of ice processes: ice nucleating particles are injected from an uncrewed aerial vehicle (UAV) into supercooled stratus clouds to induce ice crystal formation and subsequent growth processes. Microphysical changes induced by seeding are measured 3-15 minutes downstream of the seeding location using in situ and ground-based remote sensing instrumentation. The novel application of seeding with a multirotor UAV, combined with the persistent nature of stratus clouds, enables repeated seeding experiments under similar and well-constrained initial conditions. This article describes the scientific goals, experimental design, and first results of CLOUDLAB. First, the seeding plume is characterized using measurements from a UAV equipped with an optical particle counter. Second, the seeding-induced microphysical changes observed by cloud radars and a tethered balloon system are presented. The seeding signatures were detected as regions of increased radar reflectivity (> -20 dBZ) that were 10 to 20 dBZ higher than the natural background. Simultaneously, high concentrations of seeding particles and ice crystals (up to 2000 L⁻¹) were observed. A cloud seeding case was simulated with the numerical weather model ICON to contextualize the findings.
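A minimal sketch of the detection criterion implied by those numbers, flagging radar gates above -20 dBZ that also exceed an estimated background by at least 10 dBZ, applied to synthetic data (this is not the project's processing chain):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic vertical profile of cloud-radar reflectivity (dBZ) over 200 range
# gates: a weak natural stratus background with an embedded "seeding plume".
gates = np.arange(200)
background = rng.normal(-32.0, 1.5, size=gates.size)      # natural cloud, ~-32 dBZ
reflectivity = background.copy()
reflectivity[90:110] += 15.0                               # seeding-induced enhancement

# Detection criterion following the numbers quoted in the abstract:
# reflectivity above -20 dBZ *and* at least 10 dBZ above the background.
background_estimate = np.median(background)                # simple background proxy
signature = (reflectivity > -20.0) & (reflectivity - background_estimate >= 10.0)

print("gates flagged as seeding signature:", np.flatnonzero(signature))
```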
Abstract
The modeling of weather and climate has been a success story. The skill of forecasts continues to improve, and model biases continue to decrease. Combining the output of multiple models has further improved forecast skill and reduced biases. But are we exploiting the full capacity of state-of-the-art models in making forecasts and projections? Supermodeling is a recent step forward in the multimodel ensemble approach. Instead of combining model output after the simulations are completed, in a supermodel the individual models exchange state information as they run, influencing each other’s behavior. By learning, from past observations, the optimal parameters that determine how the models influence each other, model errors are reduced at an early stage, before they propagate into larger scales and affect other regions and variables. The models synchronize on a common solution that, through learning, remains closer to the observed evolution. Effectively, a new dynamical system has been created: a supermodel that optimally combines the strengths of the constituent models. The supermodel approach has the potential to rapidly improve current state-of-the-art weather forecasts and climate predictions. In this paper we introduce supermodeling, demonstrate its potential in examples of various complexity, and discuss learning strategies. We conclude with a discussion of the remaining challenges for a successful application of supermodeling in the context of state-of-the-art models. The supermodeling approach is not limited to the modeling of weather and climate; it can be applied to improve the prediction capabilities of any complex system for which a set of different models exists.
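A toy illustration of this idea, assuming two imperfect Lorenz-63 models with opposite parameter biases and a fixed, hand-chosen connection coefficient (in the approach described above the connection parameters would be learned from past observations), is sketched below; it compares short-range forecast errors with and without the state exchange.

```python
import numpy as np

def lorenz(state, sigma, rho, beta):
    """Lorenz-63 tendencies; the 'truth' uses the canonical parameters."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def mean_forecast_error(model_params, connect, truth_params, lead=1.0, dt=0.001, n_cases=20):
    """Average forecast error at a fixed lead time, starting each forecast from the
    truth state. 'connect' couples the imperfect models to one another as they run
    (connect=0 recovers independent models)."""
    rng = np.random.default_rng(3)
    errors = []
    for _ in range(n_cases):
        # Spin a random initial state onto the truth attractor.
        truth = rng.normal(0.0, 8.0, size=3) + np.array([0.0, 0.0, 25.0])
        for _ in range(5000):
            truth = truth + dt * lorenz(truth, *truth_params)
        states = [truth.copy() for _ in model_params]   # perfect initial conditions
        target = truth.copy()
        for _ in range(int(round(lead / dt))):
            target = target + dt * lorenz(target, *truth_params)
            new_states = []
            for i, (params, s) in enumerate(zip(model_params, states)):
                # State exchange: nudge each member toward the other members.
                nudge = sum(connect * (states[j] - s)
                            for j in range(len(model_params)) if j != i)
                new_states.append(s + dt * (lorenz(s, *params) + nudge))
            states = new_states
        errors.append(np.linalg.norm(np.mean(states, axis=0) - target))
    return float(np.mean(errors))

truth_params = (10.0, 28.0, 8.0 / 3.0)                  # canonical Lorenz-63
imperfect = [(13.0, 28.0, 8.0 / 3.0),                   # member 1: sigma biased high
             (7.0, 28.0, 8.0 / 3.0)]                    # member 2: sigma biased low

for connect in (0.0, 50.0):                             # independent members vs. supermodel
    err = mean_forecast_error(imperfect, connect, truth_params)
    print(f"connection coefficient {connect:>5}: mean forecast error at lead 1.0 = {err:.2f}")
```

With the state exchange switched on, the two members stay nearly synchronized and their combined dynamics largely cancel the opposing parameter biases, mirroring the mechanism the abstract describes; training would replace the hand-chosen coefficient with values tuned to observations.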
Abstract
Dynamical downscaling is a crucial process for providing regional climate information for broad uses, using coarser-resolution global models to drive higher-resolution regional climate simulations. The pool of global climate models (GCMs) providing the fields needed for dynamical downscaling has increased from previous generations of the Coupled Model Intercomparison Project (CMIP). However, with limited computational resources, the need to prioritize GCMs for subsequent downscaling studies remains. GCM selection for dynamical downscaling should focus on evaluating processes relevant to providing boundary conditions to the regional models and should be informed by regional uses, such as the response of extremes to changes in the boundary conditions. This leads to the need for metrics representing processes of relevance to diverse stakeholders and subregions of a domain. Procedures to account for metric redundancy and the statistical distinguishability of GCM rankings are required. Further, procedures for selecting realizations from ensembles of top-performing GCM simulations can be used to span the range of climate change signals in multiple ways. As a result, distinct weightings of metrics and prioritization of particular realizations may depend on user needs. We provide high-level guidelines for such region-specific evaluations and address how CMIP7 might enable dynamical downscaling of a representative sample of high-quality models across representative Shared Socioeconomic Pathways (SSPs).
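As a hedged sketch of the kind of metric-weighted prioritization discussed above (model names, metric values, and weights are all invented), one might rank candidate GCMs as follows; handling metric redundancy and the distinguishability of rankings, as noted in the abstract, would require additional steps.

```python
import numpy as np

# Hypothetical evaluation errors (lower is better) for four GCMs on three
# downscaling-relevant metrics; names and numbers are purely illustrative.
models = ["GCM-A", "GCM-B", "GCM-C", "GCM-D"]
metrics = ["jet position bias", "SST bias", "moisture flux error"]
errors = np.array([[1.2, 0.8, 0.9],
                   [0.6, 1.1, 1.3],
                   [0.9, 0.7, 0.6],
                   [1.5, 1.4, 1.0]])

# Standardize each metric across models so metrics with different units are
# comparable, then combine with user-chosen weights (e.g., a stakeholder who
# cares most about moisture transport into the regional domain).
z = (errors - errors.mean(axis=0)) / errors.std(axis=0)
weights = np.array([0.25, 0.25, 0.5])
score = z @ weights                      # lower (more negative) is better

for rank, i in enumerate(np.argsort(score), start=1):
    print(f"{rank}. {models[i]:6s} weighted score = {score[i]:+.2f}")
```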