Theresa Diefenbach
,
Leonhard Scheck
,
Martin Weissmann
, and
George Craig

Abstract

The analyses produced by a data assimilation system may be unbalanced, that is, dynamically inconsistent with the forecasting model, leading to noisy forecasts and reduced skill. While there are effective procedures to reduce synoptic-scale imbalance, the situation on the convective scale is less clear because the flow on this scale is strongly divergent and non-hydrostatic. In this study we compare three measures of imbalance relevant to convective-scale data assimilation: (i) surface pressure tendencies, (ii) vertical velocity variance in the vicinity of convective clouds, and (iii) departures from the vertical velocity prescribed by the weak temperature gradient (WTG) approximation. These are applied in a numerical weather prediction system with three different data assimilation algorithms: 1. Latent Heat Nudging (LHN), 2. Local Ensemble Transform Kalman Filter (LETKF), and 3. LETKF in combination with incremental analysis updates (IAU). Results indicate that the surface pressure tendency diagnoses a different type of imbalance than the vertical velocity variance and the WTG departure. The LETKF induces a spike in surface pressure tendencies, with a large-scale spatial pattern that is not clearly related to the precipitation pattern. This anomaly is notably reduced by the IAU. LHN does not generate a pronounced signal in the surface pressure, but produces the most imbalance in the other two measures. The imbalances measured by the partitioned vertical velocity variance and WTG departures are similar, and closely coupled to the convective precipitation. Between these two measures, the WTG departure has the advantage of being simpler and more economical to compute.
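The first of these measures can be computed directly from model output. A minimal sketch of the surface pressure tendency diagnostic follows; the function name, array layout, and units are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def pressure_tendency_noise(p_surf, dt):
    """Domain-mean absolute surface pressure tendency, |dps/dt|.

    Large values immediately after an analysis are a classic sign of
    imbalance.  p_surf: array of shape (time, lat, lon) in Pa;
    dt: output interval in seconds.  Returns one value per interval.
    """
    dps_dt = np.diff(p_surf, axis=0) / dt
    return np.abs(dps_dt).mean(axis=(1, 2))
```

A balanced forecast yields a tendency series that stays flat, while an unbalanced analysis produces a spike of the kind described above for the LETKF.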

Restricted access
Gregory J. Hakim
and
Sanjit Masanam

Abstract

Global deep-learning weather prediction models have recently been shown to produce forecasts that rival those from physics-based models run at operational centers. It is unclear whether these models have encoded atmospheric dynamics or simply perform pattern matching that produces the smallest forecast error. Answering this question is crucial to establishing the utility of these models as tools for basic science. Here we subject one such model, Pangu-Weather, to a set of four classical dynamical experiments that do not resemble the model training data. Localized perturbations to the model output and the initial conditions are added to steady time-averaged conditions, to assess the propagation speed and structural evolution of signals away from the local source. Perturbing the model physics by adding a steady tropical heat source results in a classical Matsuno–Gill response near the heating, and planetary waves that radiate into the extratropics. A localized disturbance on the winter-averaged North Pacific jet stream produces realistic extratropical cyclones and fronts, including the spontaneous emergence of polar lows. Perturbing the 500-hPa height field alone yields adjustment from a state of rest to one of wind–pressure balance over ∼6 hours. Localized subtropical low pressure systems produce Atlantic hurricanes, provided the initial amplitude exceeds about 4 hPa, and setting the initial humidity to zero eliminates hurricane development. We conclude that the model encodes realistic physics in all experiments, and suggest it can be used as a tool for rapidly testing a wide range of hypotheses.
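Experiments of this kind hinge on constructing a localized initial perturbation. A minimal sketch of such a perturbation field follows; the Gaussian shape, function name, and parameters are illustrative assumptions, not the authors' exact setup:

```python
import numpy as np

def gaussian_perturbation(lats, lons, lat0, lon0, amp, width_deg):
    """Localized 2D Gaussian bump on a lat-lon grid, e.g. to add to a
    500-hPa height field or a subtropical surface pressure field
    (amp < 0 for a low).  width_deg sets the spread in degrees."""
    la, lo = np.meshgrid(lats, lons, indexing="ij")
    r2 = (la - lat0) ** 2 + (lo - lon0) ** 2
    return amp * np.exp(-r2 / (2.0 * width_deg ** 2))
```

Adding such a bump to a steady time-averaged state and integrating the model forward exposes how the signal propagates away from the source.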

Open access
Connell S. Miller
,
Gregory A. Kopp
,
David M.L. Sills
, and
Daniel G. Butt

Abstract

Currently, the Enhanced Fujita scale does not consider the wind-induced movement of various large compact objects such as vehicles, construction equipment, farming equipment / haybales, etc. that are often found in post-event damage surveys. One reason for this is that modelling debris in tornadoes comes with considerable uncertainties since there are many parameters to determine, leading to difficulties in using trajectories to analyze wind speeds of tornadoes. This paper aims to develop a forensic tool using analytical tornado models to estimate lofting wind speeds based on trajectories of large compact objects. This is accomplished by implementing a Monte Carlo simulation to randomly select the parameters and plotting cumulative distribution functions showing the likelihood of lofting at each wind speed. After analyzing the debris lofting from several documented tornadoes in Canada, the results indicate that the method provides threshold lofting wind speeds that are similar to the estimated speeds given by other methods. However, the introduction of trajectories produces estimated lofting wind speeds that are higher than the EF-scale rating given from the ground survey assessment based on structural damage. Further studies will be required to better understand these differences.
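The Monte Carlo step (drawing uncertain object parameters at random and accumulating a cumulative distribution function of lofting wind speed) can be sketched as follows. The simple drag-equals-weight lofting criterion and all parameter ranges below are illustrative assumptions, not the paper's debris model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed parameter ranges for a generic "large compact object":
n = 100_000
rho = 1.225                              # air density, kg/m^3
g = 9.81                                 # gravity, m/s^2
mass = rng.uniform(800.0, 1500.0, n)     # kg
area = rng.uniform(2.0, 6.0, n)          # projected area, m^2
cd = rng.uniform(1.0, 2.0, n)            # drag coefficient

# Simplified lofting threshold: drag force equals weight.
v_loft = np.sqrt(2.0 * mass * g / (rho * cd * area))

# Empirical CDF: likelihood that an object with randomly drawn
# parameters lofts at or below each candidate wind speed.
speeds = np.arange(30.0, 151.0, 1.0)     # m/s
cdf = np.array([(v_loft <= v).mean() for v in speeds])
```

Plotting `cdf` against `speeds` gives the kind of lofting-likelihood curve described above, from which a threshold wind speed can be read off at a chosen probability level.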

Restricted access
Joël Stein
and
Fabien Stoop

Abstract

A procedure for evaluating the quality of probabilistic forecasts of binary events has been developed. It is based on two steps: forecasts on the one hand and observations on the other are pooled over all the points of a neighborhood to obtain frequencies at the neighborhood length scale, and the Brier divergence is then calculated for these neighborhood frequencies. This score allows the comparison of a probabilistic forecast and observations at the neighborhood length scale, and therefore rewards event forecasts shifted from the location of the observed event by a distance smaller than the neighborhood size. A new decomposition of this score generalizes that of the Brier score and allows the separation of the generalized resolution, reliability, and uncertainty terms. The neighborhood Brier divergence skill score (BDnSS) measures the performance of the probabilistic forecast against the sample climatology. BDnSS and its decomposition have been used for idealized and real cases to show the utility of neighborhoods when comparing, at different scales, the performance of ensemble forecasts with one another, with deterministic forecasts, or of deterministic forecasts with one another.

Significance Statement

A pooling of forecasts on the one hand and observations on the other hand, on all the points of a neighborhood, is performed in order to obtain frequencies at the neighborhood scale. The Brier divergence is then calculated for these neighborhood frequencies to compare a probabilistic forecast and observations at the neighborhood scale. A new decomposition of this score generalizes that of the Brier score and allows the separation of the generalized resolution, reliability, and uncertainty terms. This uncertainty term is used to define the neighborhood Brier divergence skill score which is an alternative to the popular fractions skill score, with a more appropriate denominator.
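The two-step procedure can be sketched in a few lines. The uniform square neighborhood, the cropped edges, and the function names below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def neighborhood_mean(field, n):
    """Average a 2D field over (2n+1) x (2n+1) square neighborhoods
    using a cumulative-sum trick; edges are cropped for simplicity."""
    k = 2 * n + 1
    c = np.cumsum(np.cumsum(field, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))
    s = c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]
    return s / (k * k)

def brier_divergence(prob_forecast, binary_obs, n):
    """Neighborhood Brier divergence: pool forecast probabilities and
    observed occurrences separately, then score pooled frequencies."""
    f = neighborhood_mean(prob_forecast, n)
    o = neighborhood_mean(binary_obs, n)
    return float(np.mean((f - o) ** 2))
```

An event forecast displaced by less than the neighborhood size is penalized less at larger `n`, which is precisely the rewarding of near-miss forecasts described above.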

Open access
Nicolas G. Alonso-De-Linaje
,
Andrea N. Hahmann
,
Ioanna Karagali
,
Krystallia Dimitriadou
, and
Merete Badger

Abstract

The paper aims to demonstrate how to enhance the accuracy of offshore wind resource estimation, specifically by incorporating near-surface satellite-derived wind observations into mesoscale models. We utilized the Weather Research and Forecasting (WRF) model and applied observational nudging by integrating ASCAT data over offshore areas to achieve this. We then evaluated the accuracy of the nudged WRF model simulations by comparing them with data from ocean oil platforms, tall masts, and a wind lidar mounted on a commercial ferry crossing the southern Baltic Sea. Our findings indicate that including satellite-derived ASCAT wind speeds through nudging enhances the correlation and reduces the error of the mesoscale simulations across all validation platforms. Moreover, it consistently outperforms the control and previously published WRF-based wind atlases. Using satellite-derived winds directly in the model simulations also addresses the long-standing challenge of extrapolating near-surface winds to wind turbine heights when estimating wind resources. The comparison of the one-year-long simulations with and without nudging reveals intriguing differences in the sign and magnitude of the nudging impact between the Baltic and North Seas, which vary seasonally. These differences form a distinct regional pattern attributed to regional dynamics, sea surface temperature, atmospheric stability, and the number of available ASCAT samples.
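Conceptually, observational nudging relaxes the model state toward observations over a chosen timescale. A stripped-down sketch of a single Newtonian-relaxation step follows; this is not WRF's actual obs-nudging code, and the names and signature are assumptions:

```python
import numpy as np

def nudging_step(model_field, obs_field, dt, tau, obs_mask):
    """One Newtonian-relaxation step toward observations.

    model_field, obs_field: 2D arrays (e.g. a 10-m wind component);
    dt: model time step (s); tau: relaxation timescale (s);
    obs_mask: True where an observation (e.g. an ASCAT cell) exists.
    """
    increment = np.where(obs_mask, (obs_field - model_field) * dt / tau, 0.0)
    return model_field + increment
```

Each model step moves the field a fraction dt/tau of the way toward the observed value wherever an observation is available, leaving the rest of the domain to evolve freely.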

Restricted access
Tyler Cox
,
Aaron Donohoe
,
Kyle C. Armour
,
Gerard H. Roe
, and
Dargan M.W. Frierson

Abstract

Atmospheric heat transport (AHT) is an important piece of our climate system, but has primarily been studied at monthly or longer time scales. We introduce a new method for calculating zonal-mean meridional AHT using instantaneous atmospheric fields. When time averaged, our calculations closely reproduce the climatological AHT used elsewhere in the literature to understand AHT and its trends on long timescales. In the extratropics, AHT convergence and atmospheric heating are strongly temporally correlated, suggesting that AHT drives the vast majority of zonal-mean atmospheric temperature variability. Our AHT methodology separates AHT into two components, eddies and the mean-meridional circulation, which we find are negatively correlated throughout most of the mid- to high latitudes. This negative correlation reduces the variance of total AHT compared to eddy AHT. Lastly, we find that the temporal distribution of total AHT at any given latitude is approximately symmetric.
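The variance-reduction statement follows from Var(eddy + MMC) = Var(eddy) + Var(MMC) + 2 Cov(eddy, MMC): a sufficiently negative covariance makes the total variance smaller than the eddy variance alone. A tiny synthetic check (random series in arbitrary units, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic daily time series: an eddy AHT component, and a
# mean-meridional-circulation (MMC) component anticorrelated with it.
eddy = rng.normal(0.0, 1.0, 10_000)
mmc = -0.6 * eddy + rng.normal(0.0, 0.5, 10_000)
total = eddy + mmc

cov = np.cov(eddy, mmc)[0, 1]   # negative by construction
```

Here the total (eddy + MMC) series has smaller variance than the eddy series alone, mirroring the compensation described in the abstract.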

Restricted access
Yun Chang
and
Alberto Scotti

Abstract

This paper provides a framework that unifies the characteristics of Langmuir turbulence, including the vortex force effect, velocity scalings, vertical flow structure, and crosswind spacing between surface streaks. The widely accepted CL2 mechanism is extended to explain the observed maximum alongwind velocity and downwelling velocity below the surface. Balancing the extended mechanism in the Craik-Leibovich equations, the scalings for the along-wind velocity u, cross-wind velocity v, and vertical velocity w are formulated as
U = Uf La^(2/3),   V = W = (Uf^2 Us)^(1/3).
Here, Uf is the friction velocity, Us is the Stokes drift at the surface, and La = (Uf/Us)^(1/2) is the Langmuir number. Simulations using the Stratified Ocean Model with Adaptive Refinement in Large Eddy Simulation mode (LES-SOMAR) validate the scalings and reveal physical similarity for velocity and crosswind spacing. The horizontally averaged velocity along the wind, ū/U, on the surface grows with time, whereas v/V and w/W are confined. The root mean square (rms) of w peaks at wrms/W ≈ 0.85 at a depth of 1.3Zs, where Zs is the e-folding scale of the Stokes drift. The crosswind spacing L grows linearly with time but is ultimately limited by the depth of the water H, with maximum L/H = 3.3. This framework agrees with measurements collected in six different field campaigns.

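Evaluating the scalings numerically is straightforward. In the sketch below the function name and the sample values of Uf and Us are assumptions chosen only to illustrate typical magnitudes:

```python
def langmuir_scalings(u_f, u_s):
    """Velocity scales U = Uf*La**(2/3) and V = W = (Uf**2*Us)**(1/3),
    with Langmuir number La = (Uf/Us)**(1/2)."""
    la = (u_f / u_s) ** 0.5
    u = u_f * la ** (2.0 / 3.0)               # along-wind scale
    v = w = (u_f ** 2 * u_s) ** (1.0 / 3.0)   # cross-wind and vertical scales
    return u, v, w, la

# Example: friction velocity 1 cm/s, surface Stokes drift 10 cm/s.
U, V, W, La = langmuir_scalings(u_f=0.01, u_s=0.1)
```

Note that U = Uf*La^(2/3) is algebraically identical to (Uf^4/Us)^(1/3), so both velocity scales depend only on the pair (Uf, Us).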
Restricted access
Fan Wu
and
Kelly Lombardo

Abstract

This study employs 3D idealized numerical experiments to investigate the physical processes associated with coastal convection initiation (CI) as an offshore-moving squall line traverses a mountainous coastal region. A squall line can propagate discretely as convection initiates over the lee slope downstream of the primary storm when the cold pool collides with a sea breeze. The intensity of the initiating convection, and thus of the downstream squall line, is sensitive to the sea breeze numerical initialization method, since it influences sea breeze and cold pool characteristics, instability and vertical wind shear in the sea breeze environment, and ultimately the vertical acceleration of air parcels during CI. Here, sea breezes are generated through four commonly used numerical methods: a cold-block marine atmospheric boundary layer (MABL), a prescribed surface sensible heat flux function, prescribed surface sensible plus latent heat flux functions, and radiation plus surface-layer parameterization schemes. For MABL-initialized sea breezes, shallow, weak sea breeze flow in a relatively low-instability environment results in weak CI. For the remainder, deeper, stronger sea breeze flow in an environment of enhanced instability supports more robust CI. In a subset of experiments, however, the vertical trajectory of air parcels is suppressed, leading to weaker convection; downward acceleration forms due to the horizontal rotation of the sea breeze flow. Accurate simulations of coastal convective storms rely on an accurate representation of sea breezes. For idealized experiments such as the present simulations, a combination of initialization methods likely produces a more realistic representation of the sea breeze and the associated physical processes.

Restricted access
James R. Ledwell

Abstract

Lightening of bottom water is required to close the abyssal overturning circulation, believed to play an important role in the climate system. A tracer release experiment and turbulence measurement programs have revealed how bottom water is lightened, and illuminated the associated circulation in the deep Brazil Basin, a representative region of the global ocean. Tracer was released on an isopycnal surface about 4000 m deep, over one of the fracture zones emanating from the Mid-Atlantic Ridge (MAR). Tracer that mixed toward the bottom moved toward the MAR across isopycnal surfaces that bend down to intersect the bottom at a rate implying a near-bottom buoyancy flux of 1.5 × 10−9 m2 s−3, somewhat larger than inferred from dissipation measurements. The diffusivity at the level of the tracer release is estimated at 4.4 ± 1 × 10−4 m2 s−1, again larger than inferred from dissipation rates. The main patch moved southwest at about 2 cm s−1 while sinking due to the divergence of buoyancy flux above the bottom layer. The isopycnal eddy diffusivity was about 100 m2 s−1. Westward flow away from the MAR is the return flow balancing the eastward near-bottom upslope flow. The southward component of the flow is roughly consistent with conservation of potential vorticity. The circulation as well as the pattern of diapycnal flux are qualitatively the same as in but are more robust. The results indicate that diapycnal diffusivity is about twice that invoked by in calculating the basinwide buoyancy budget.
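Diapycnal diffusivity estimates of this kind typically rest on the growth of the tracer patch's second moment about the target surface, kappa ≈ (1/2) d(variance)/dt. The sketch below shows that textbook relation, not the paper's full analysis; the sample numbers are assumptions chosen to land near the order of magnitude quoted above:

```python
def diffusivity_from_spread(var_t0, var_t1, elapsed_seconds):
    """Diffusivity from the growth of a tracer distribution's second
    moment (variance), using kappa ~ 0.5 * d(variance)/dt."""
    return 0.5 * (var_t1 - var_t0) / elapsed_seconds

# Example: vertical variance growing by 2.8e4 m^2 over one year
# yields a diffusivity of roughly 4.4e-4 m^2/s.
kappa = diffusivity_from_spread(0.0, 2.8e4, 365.0 * 86400.0)
```

In practice the moment budget must also account for advection and the divergence of the buoyancy flux, which is why the tracer-based estimate can differ from dissipation-based inferences.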

Significance Statement

Buoyancy flux into the abyssal waters is required to close the overturning circulation of those waters, an important part of the climate system. This contribution presents a robust view of the strength of that buoyancy flux and the associated circulation.

Restricted access
Clark Weaver
,
Dong L. Wu
,
P. K. Bhartia
,
Gordon Labow
,
David P. Haffner
,
Lauren Borgia
,
Laura McBride
, and
Ross Salawitch

Abstract

We construct a long-term record of top of atmosphere (TOA) shortwave (SW) albedo of clouds and aerosols from 340-nm radiances observed by NASA and NOAA satellite instruments from 1980 to 2013. We compare our SW cloud+aerosol albedo with simulated cloud albedo from both AMIP and historical CMIP6 simulations from 47 climate models. While most historical runs did not simulate our observed spatial pattern of the trends in albedo over the Pacific Ocean, four models qualitatively simulate our observed patterns. Those historical models and the AMIP models collectively estimate an equilibrium climate sensitivity (ECS) of ∼3.5°C, with an uncertainty from 2.7° to 5.1°C. Our ECS estimates are sensitive to the instrument calibration, which drives the wide range in ECS uncertainty. We use instrument calibrations that assume a neutral change in reflectivity over the Antarctic ice sheet. Our observations show increasing cloudiness over the eastern equatorial Pacific and off the coast of Peru as well as neutral cloud trends off the coast of Namibia and California. To produce our SW cloud+aerosol albedo, we first retrieve a black-sky cloud albedo (BCA) and empirically correct the sampling bias from diurnal variations. Then, we estimate the broadband proxy albedo using multiple nonlinear regression along with several years of CERES cloud albedo to obtain the regression coefficients. We validate our product against CERES data from the years not used in the regression. Zonal mean trends of our SW cloud+aerosol albedo show reasonable agreement with CERES as well as the Pathfinder Atmospheres–Extended (PATMOS-x) observational dataset.

Significance Statement

Equilibrium climate sensitivity is a measure of the rise in global temperature over hundreds of years after a doubling of atmospheric CO2 concentration. Current state-of-the-art climate models forecast a wide range of equilibrium climate sensitivities (1.5°–6°C), due mainly to how clouds, aerosols, and sea surface temperatures are simulated within these models. Using data from NASA and NOAA satellite instruments from 1980 to 2013, we first construct a dataset that describes how much sunlight has been reflected by clouds over the 34 years and then we compare this data record to output from 47 climate models. Based on these comparisons, we conclude the best estimate of equilibrium climate sensitivity is about 3.5°C, with an uncertainty range of 2.7°–5.1°C.

Open access