Abstract
It has been over 75 years since the concept of directly suppressing lightning by modifying thunderstorm cloud processes was first proposed as a technique for preventing wildfire ignitions. Subsequent decades produced a series of successful field campaigns that demonstrated several techniques for interrupting storm electrification, motivated in part by the prospect of protecting Apollo-era rocket launches from lightning strikes. Despite the technical success of these research programs, funding and interest diminished, and the final field experiment in 1975 marked the last large-scale activity toward developing lightning prevention technology. Having lost widespread awareness over the ensuing 50 years, these pioneering efforts in experimental cloud physics have largely been forgotten, and this approach to mitigating lightning hazards has fallen into obscurity. Today, risks from lightning-ignited wildfires to lives, property, and infrastructure are once again a major concern. Similarly, the rapid development of an emerging commercial space sector is placing new demands on airspace management and launch scheduling. These modern challenges may potentially be addressed by a seemingly antiquated concept—lightning suppression—but careful consideration must be given to the consequences of deploying this technology. Nonetheless, the possible economic, environmental, and societal benefits of this approach merit a critical reevaluation of this hazard mitigation technology in the current era.
Abstract
Marine heatwaves (MHWs) are prolonged events of extremely high sea surface temperature (SST). In the summer of 2021, an intense MHW occurred over the central North Pacific; the SST in September 2021 was the highest for any September since 1900, and the warming signal extended not only near the sea surface but also below the ocean mixed layer (to ∼300 m depth). Atmospheric reanalysis data showed a westward expansion of the North Pacific Subtropical High (NPSH) in the summer of 2021, but neither the increase in downward shortwave radiation nor the decrease in upward latent heat flux was large, and an ocean mixed layer heat budget analysis, which also used ocean reanalysis data, revealed that atmosphere-induced heating was insufficient to form the record-breaking MHW. Argo profiling floats indicated that in the summer of 2021 the Central Mode Water (CMW) – a huge water mass characterized by vertically uniform properties at depths of 100–500 m – was extremely reduced, its thickness being less than 20% of normal. Statistical analysis showed that, from the sea surface down to the upper boundary of the CMW, the heavier isopycnal surfaces deepened in association with the decrease in CMW, weakening the seasonal pycnocline. This in turn weakened the cooling heat flux associated with the entrainment of subsurface waters into the mixed layer, warming the surface ocean and thereby contributing to the formation of the MHW in the summer of 2021.
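The role of entrainment in the mixed-layer heat budget can be sketched with a minimal bulk mixed-layer temperature equation. This is an illustrative textbook-style formulation, not the study's own budget; all constants, function names, and values below are assumed for demonstration:

```python
# Illustrative bulk mixed-layer temperature budget (hypothetical values).
RHO = 1025.0  # seawater density, kg m^-3
CP = 3990.0   # seawater specific heat, J kg^-1 K^-1

def mixed_layer_tendency(q_net, h, w_e, t_ml, t_below):
    """Mixed-layer temperature tendency (K s^-1).

    q_net   : net surface heat flux into the ocean (W m^-2)
    h       : mixed-layer depth (m)
    w_e     : entrainment velocity (m s^-1, positive when deepening)
    t_ml    : mixed-layer temperature (K)
    t_below : temperature just below the mixed-layer base (K)
    """
    surface_heating = q_net / (RHO * CP * h)
    entrainment_cooling = -w_e * (t_ml - t_below) / h
    return surface_heating + entrainment_cooling

# A weaker seasonal pycnocline means a smaller temperature jump at the
# mixed-layer base, hence weaker entrainment cooling for the same w_e:
strong_jump = mixed_layer_tendency(150.0, 30.0, 1e-5, 298.0, 293.0)
weak_jump = mixed_layer_tendency(150.0, 30.0, 1e-5, 298.0, 297.0)
```

With these hypothetical numbers, the same surface heating produces net mixed-layer warming when the subsurface temperature jump shrinks, mirroring the mechanism the abstract describes.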
Abstract
Societies in much of the Horn of Africa are affected by variability in two distinct rainy seasons: the March-May (MAM) “long” rains and the October-December (OND) “short” rains. A recent 5-season, La Niña-forced drought has renewed concerns about possible anthropogenic drying trends in the long rains, which had partially recovered after a multidecadal drying trend in the 1980s through 2000s. Despite observed drying, previous generations of global climate models (GCMs) have consistently projected long-term wetting due to increased greenhouse gas concentrations, an East African “Paradox” which complicates the interpretation of East African rainfall projections. We investigate the Paradox in new CMIP6 and seasonal forecast models, leveraging an improved observational record and Large Ensembles to better differentiate internal and forced trends. We find that observed drying trends lie at the limits of the GCM spread during the peak Paradox period, though the recent recovery is comfortably within the model spread. We find that the apparent Paradox is largely removed by prescribing sea surface temperatures (SSTs) and is likely caused by GCM difficulties in simulating observed tropical Pacific SST trends in recent decades. In line with arguments that these SST trends are at least partially anthropogenically forced, we recommend that users of future rainfall projections in East Africa consider the possibility of long-term MAM drying despite GCM wetting, and we call for future model simulations that better sample the expected spread of SSTs.
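Comparing an observed trend against a Large Ensemble spread amounts to asking where the observed trend falls among the member trends. A minimal sketch with entirely synthetic numbers (the forced slope, noise level, and observed trend below are hypothetical, not the study's data):

```python
import numpy as np

def trend(series, years):
    """Least-squares linear trend (units per year)."""
    return np.polyfit(years, series, 1)[0]

def trend_percentile(obs_trend, member_trends):
    """Percentile rank of the observed trend within the ensemble spread."""
    member_trends = np.asarray(member_trends)
    return 100.0 * np.mean(member_trends < obs_trend)

rng = np.random.default_rng(0)
years = np.arange(1985, 2015)
# Hypothetical ensemble: forced wetting (+0.2 units/yr) plus internal noise.
members = [0.2 * (years - years[0]) + rng.normal(0.0, 3.0, years.size)
           for _ in range(40)]
member_trends = [trend(m, years) for m in members]

obs = -0.5 * (years - years[0])  # hypothetical observed drying
pct = trend_percentile(trend(obs, years), member_trends)
```

A percentile near 0 or 100 indicates the observed trend sits at or beyond the limits of the modeled spread, the situation the abstract describes for the peak Paradox period.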
Abstract
Difficulty in using observations to isolate the impacts of aerosols from meteorology on deep convection often stems from the inability to resolve the spatiotemporal variations in the environment serving as the storm’s inflow region. During the U.S. Department of Energy (DOE) Tracking Aerosol Convection interactions Experiment (TRACER) in June–September 2022, a Texas A&M University (TAMU) team conducted a mobile field campaign to characterize the meteorological and aerosol variability in air masses that serve as inflow to convection across the ubiquitous mesoscale boundaries associated with the sea and bay breezes in the Houston, Texas, region. These boundaries propagate inland over the fixed DOE Atmospheric Radiation Measurement (ARM) sites. However, convection may occur on the continental side, the maritime side, or both, or along the boundary itself. The maritime and continental air masses serving as convection inflow may be quite distinct, with different meteorological and aerosol characteristics that fixed-site measurements cannot simultaneously sample. Thus, a primary objective of TAMU TRACER was to provide mobile measurements similar to those at the fixed sites, but in the opposite air mass across these moving mesoscale boundaries. TAMU TRACER collected radiosonde, lidar, aerosol, cloud condensation nuclei (CCN), and ice nucleating particle (INP) measurements on 29 enhanced operations days covering a variety of maritime, continental, outflow, and prefrontal air masses. This paper summarizes the TAMU TRACER deployment and measurement strategy, instruments, and available datasets and provides sample cases highlighting differences between these mobile measurements and those made at the ARM sites. We also highlight the exceptional TAMU TRACER undergraduate student participation in high-impact learning activities through forecasting and field deployment opportunities.
Abstract
This study explores gulf-breeze circulations (GBCs) and bay-breeze circulations (BBCs) in Houston–Galveston, investigating their characteristics, large-scale weather influences, and impacts on surface properties, boundary layer updrafts, and convective clouds. The results are derived from a combination of datasets, including satellite observations, ground-based measurements, and reanalysis datasets, using machine learning, a changepoint detection method, and Lagrangian cell tracking. We find that anticyclonic synoptic patterns during the summer months (June–September) favor GBC/BBC formation and the associated convective cloud development, representing 74% of cases. The main Tracking Aerosol Convection Interactions Experiment (TRACER) site, located close to Galveston Bay, is influenced by both GBCs and BBCs, with nearly half of the cases showing evident BBC features. The site experiences early frontal passages ranging from 1040 to 1630 local time (LT), with 1300 LT being the most frequent. These fronts are stronger than those observed at the ancillary site, which is located farther inland from Galveston Bay, including larger changes in surface temperature, moisture, and wind speed. Furthermore, these fronts trigger boundary layer updrafts, likely promoting isolated convective precipitating cores that are short lived (average convective lifetime of 63 min) and slow moving (average propagation speed of 5 m s−1), primarily within 20–40 km from the coast.
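Detecting a breeze frontal passage in a surface time series is, at heart, a changepoint problem: find where the series mean shifts abruptly. The sketch below is a minimal single-changepoint search on synthetic temperatures, not the study's algorithm or data (the jump size, noise level, and series length are hypothetical):

```python
import numpy as np

def single_changepoint(series):
    """Index that best splits a series into two constant-mean segments,
    minimizing the total within-segment sum of squared deviations."""
    x = np.asarray(series, dtype=float)
    best_k, best_cost = None, np.inf
    for k in range(1, x.size):
        left, right = x[:k], x[k:]
        cost = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

# Synthetic surface temperature: a ~3 K drop at the frontal passage (index 60).
rng = np.random.default_rng(1)
temps = np.concatenate([32.0 + rng.normal(0.0, 0.2, 60),
                        29.0 + rng.normal(0.0, 0.2, 60)])
k = single_changepoint(temps)  # recovers an index near 60
```

Applied jointly to temperature, moisture, and wind, the detected changepoint times yield the frontal-passage statistics of the kind reported in the abstract.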
Abstract
Atmospheric state analysis that leverages state-of-the-art data assimilation achieves high accuracy and can provide initial conditions for numerical weather prediction (NWP) and climatological reanalysis. However, interactions between the atmosphere and the ocean have been inadequately addressed, with sea surface temperature (SST) treated only as a boundary condition for the atmosphere. This limitation impacts the accuracy of atmospheric state analyses and the utilization of SST-sensitive observations. To address this, we developed a partially coupled data assimilation (PCDA) system for the atmosphere and SST by extending the operational global NWP system of the Japan Meteorological Agency. The PCDA system extends the analysis variables and background error covariance matrices to include SST components and enables the use of microwave radiance observations sensitive to SST, particularly at low frequencies (6 to 11 GHz), which have previously been unused or absent in most NWP systems. Our numerical experiments demonstrate several key findings: (1) The PCDA system identified colder SSTs in regions with significant SST gradients, including SST fronts in the mid-latitudes, and produced zonally alternating positive and negative increments in tropical instability wave regions; (2) The SST analysis produced by PCDA was consistent with independent SST analyses; (3) The system yielded a moist and warm low-level troposphere, leading to an increase in the first 24-h rain forecast near the intertropical convergence zone; and (4) PCDA globally improved the forecast accuracy of near-surface temperatures, with notable improvements in the tropics for most variables, except for mid-tropospheric temperature. In the extra-tropics, forecast accuracy improvements were observed for height and humidity, although some degradation occurred mainly in the southern hemisphere.
Abstract
Ensemble sensitivity analysis (ESA) offers a computationally inexpensive way to diagnose sources of high-impact forecast feature uncertainty by relating a localized forecast phenomenon of interest (response function) back to early or initial forecast conditions (sensitivity variables). These information-rich diagnostic fields allow us to quantify the predictability characteristics of a specific forecast event. This work harnesses insights from a month-long dataset of ESA applied to convection-allowing model precipitation forecasts in the Central Plains of the US. Temporally and spatially averaged sensitivity statistics are correlated with a variety of other metrics, such as skill, spread, and mean forecast precipitation accumulation. A high, but imperfect, correlation (0.81) between forecast precipitation and sensitivity is discovered. This quantity confirms the qualitatively known notion that while there is a connection between predictability and event magnitude, a high-end event does not necessarily entail a low-predictability (high-sensitivity) forecast. Flow regimes within this dataset are analyzed to see which patterns lend themselves to high- and low-predictability forecast scenarios. Finally, a novel metric known as the Error Growth Realization (EGR) Ratio is introduced. Derived by dividing the two mathematical formulations of ESA, this metric shows preliminary promise as a predictor of forecast skill prior to the onset of a high-impact convective event. In essence, this research exemplifies the potential of ESA beyond its traditional use in case studies. By applying ESA to a broader dataset, we can glean valuable insight into the predictability of high-impact weather events, and in turn, work towards a collective baseline on what constitutes a high- or low-predictability event in the first place.
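In its standard regression form, ESA estimates the sensitivity of a response function J to an earlier state variable x as cov(J, x)/var(x) across ensemble members. A minimal sketch on synthetic data follows (the EGR Ratio itself is specific to this work and is not reproduced here; all numbers below are hypothetical):

```python
import numpy as np

def ensemble_sensitivity(response, state):
    """Regression-form ESA sensitivity: cov(J, x) / var(x) across members."""
    j = np.asarray(response, dtype=float)
    x = np.asarray(state, dtype=float)
    return np.cov(j, x)[0, 1] / np.var(x, ddof=1)

# Synthetic 50-member ensemble: response depends on the state with slope 2.
rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, 50)             # e.g. an upstream moisture anomaly
j = 2.0 * x + rng.normal(0.0, 0.1, 50)   # e.g. area-mean precipitation
s = ensemble_sensitivity(j, x)           # recovers a slope near 2
```

In practice this regression is evaluated at every grid point of the sensitivity field, producing the kinds of spatial diagnostic maps the abstract averages into its statistics.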
Abstract
Earth and planetary radiometry requires spectrally dependent observations spanning an expansive range in signal flux due to variability in celestial illumination, spectral albedo, and attenuation. Insufficient dynamic range inhibits contemporaneous measurements of dissimilar signal levels and restricts potential environments, time periods, target types, or spectral ranges that instruments observe. Next-generation (NG) advances in temporal, spectral, and spatial resolution also require further increases in detector sensitivity and dynamic range corresponding to increased sampling rate and decreased field-of-view (FOV), both of which capture greater intrapixel variability (i.e., variability within the spatial and temporal integration of a pixel observation). Optical detectors typically must support expansive linear radiometric responsivity, while simultaneously enduring the inherent stressors of field, airborne, or satellite deployment. Rationales for significantly improving radiometric observations of nominally dark targets are described herein, along with demonstrations of state-of-the-art (SOTA) capabilities and NG strategies for advancing SOTA. An evaluation of linear dynamic range and efficacy of optical data products is presented based on representative sampling scenarios. Low-illumination (twilight or total lunar eclipse) observations are demonstrated using a SOTA prototype. Finally, a ruggedized and miniaturized commercial-off-the-shelf (COTS) NG capability to obtain absolute radiometric observations spanning an expanded range in target brightness and illumination is presented. The presented NG technology combines a Multi-Pixel Photon Counter (MPPC) with a silicon photodetector (SiPD) to form a dyad optical sensing component supporting expansive dynamic range sensing, i.e., exceeding a nominal 10 decades in usable dynamic range documented for SOTA instruments.
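A "decade" of dynamic range is one order of magnitude between the largest and smallest usable signals, so the figure quoted above reduces to a simple logarithm. A trivial sketch with illustrative values (the signal levels below are hypothetical, not instrument specifications):

```python
import math

def dynamic_range_decades(max_signal, min_signal):
    """Usable dynamic range in decades (orders of magnitude)."""
    return math.log10(max_signal / min_signal)

# Hypothetical bright vs. dark radiance levels spanning 10 orders of magnitude:
decades = dynamic_range_decades(1e3, 1e-7)
```

A detector pairing such as the MPPC-plus-SiPD dyad described above is one route to covering such a span, with each element handling the low- and high-flux ends respectively.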