Browse

Zhangqi Dai, Bin Wang, Ling Zhu, Jian Liu, Weiyi Sun, Longhui Li, Guonian Lü, Liang Ning, Mi Yan, and Kefan Chen

Abstract

Atlantic multidecadal variability (AMV) is a cornerstone for decadal prediction and profoundly influences regional and global climate variability, yet its fundamental drivers remain a matter of debate. Studies suggest that external forcing may have affected AMV during the Little Ice Age (AD 1400–1860). However, the detailed mechanism remains elusive, and the AMV’s centennial to millennial variations over the past 2000 years have not yet been explored. We first show that proxy-data reconstructions and paleo-data assimilations suggest a significant 60-yr AMV during AD 1250–1860 but not during AD 1–1249. We then conducted a suite of experiments with the Community Earth System Model (CESM) to unravel the causes of the changing AMV properties. The simulation results under all external forcings match the reconstructions reasonably well. We find that the significant 60-yr AMV during 1250–1860 arises predominantly from volcanic forcing variability. During the period 1–1249, the average volcanic eruption intensity is about half of the 1250–1860 intensity, and a 20–40-yr internal variability dominates the AMV. The volcanic radiative forcing during 1250–1860 amplifies the AMV and shifts the internal variability peak from 20–40 years to 60 years. The volcanic forcing prolongs the AMV periodicity by sustaining Arctic cooling, which delays subpolar sea ice melting and, through atmospheric feedbacks, reduces surface evaporation. These slow-response processes over the subpolar North Atlantic result in a persistent reduction of sea surface salinity, weakening the Atlantic overturning circulation and the warm-water transport from the subtropical North Atlantic. The results reveal the cause of the nonstationary AMV over the past two millennia and shed light on the AMV’s response to external forcing.

Significance Statement

AMV plays an important role in regional and global climate variability. The purpose of this study is to better understand the secular change of AMV during the past 2000 years and its response to external forcing. Proxy data and model simulations consistently show a significant 60-yr AMV during AD 1250–1860 that is absent during AD 1–1249. Active volcanic eruptions during 1250–1860 amplify the AMV and shift its intrinsic 20–40-yr variance peak to a prominent 60-yr peak. Volcanoes prolong the AMV periodicity by sustaining Arctic cooling, delaying subpolar sea ice melting, reducing evaporation, and thereby lowering surface salinity. These results help us better understand the nonstationary AMV and highlight the role of external forcing over the past two millennia.
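For readers who want to reproduce the kind of spectral diagnosis described above, a periodogram of an annual AMV index is the simplest way to see a multidecadal peak. The sketch below uses a synthetic index in place of the study’s reconstructions and assimilation products, and omits the red-noise significance testing a real analysis would require.

```python
# Minimal sketch: detecting a multidecadal spectral peak in an annual AMV-like
# index with a periodogram. The synthetic index below is a placeholder for the
# reconstructed/assimilated AMV series discussed in the abstract.
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(0)
years = np.arange(1250, 1861)                      # AD 1250-1860, annual steps
amv = (0.15 * np.sin(2 * np.pi * years / 60.0)     # imposed 60-yr oscillation
       + 0.1 * rng.standard_normal(years.size))    # plus white noise

# Periodogram of the detrended index (sampling interval = 1 yr)
freq, power = periodogram(amv - amv.mean(), fs=1.0, detrend="linear")
period = 1.0 / freq[1:]                            # skip the zero frequency
dominant = period[np.argmax(power[1:])]
print(f"Dominant period: {dominant:.1f} yr")       # ~60 yr for this toy series
```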

Open access
Luke Grant, Lukas Gudmundsson, Edouard L. Davin, David M. Lawrence, Nicolas Vuichard, Eddy Robertson, Roland Séférian, Aurélien Ribes, Annette L. Hirsch, and Wim Thiery

Abstract

Land-use and land-cover changes (land use) alter climates biogeophysically by affecting surface fluxes of energy and water. Yet, near-surface temperature responses to land use across observational versus model-based studies and spatial-temporal scales can be inconsistent. Here we assess the prevalence of the historical land use signal of daily maximum temperatures averaged over the warmest month of the year (tLU) using regularized optimal fingerprinting for detection and attribution. We use observations from the Climatic Research Unit and Berkeley Earth alongside historical simulations with and without land use from the Coupled Model Intercomparison Project Phase 6 to reconstruct an experiment representing the effects of land use on climate, LU. To assess the signal of land use at spatially resolved continental and global scales, we aggregate all input data across reference regions and continents, respectively. At both scales, land use does not comprise a significantly detectable set of forcings for two of four Earth system models and their multi-model mean. Furthermore, using a principal component analysis, we find that tLU is mostly composed of the non-local effects of land use rather than its local effects. These findings show that, at scales relevant for climate attribution, uncertainties in Earth system model representations of land use are too high relative to the effects of internal variability to confidently assess land use.
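For orientation, regularized optimal fingerprinting reduces to a regression of observed anomalies onto model-simulated fingerprints, with detection declared when the scaling factor’s confidence interval excludes zero. The sketch below shows only an ordinary-least-squares version of that scaling-factor estimate on made-up arrays; the regularized covariance estimation and the CRU/Berkeley Earth and CMIP6 inputs of the study are not represented.

```python
# Minimal sketch of the fingerprinting regression behind detection and
# attribution: observations y are regressed onto a model fingerprint X to
# obtain a scaling factor beta. A forcing is "detected" when the confidence
# interval of beta excludes zero. This OLS version omits the regularized
# noise-covariance estimate used in regularized optimal fingerprinting.
import numpy as np

rng = np.random.default_rng(1)
n = 50                                   # number of space-time aggregates
fingerprint = rng.standard_normal(n)     # model-simulated land-use response (placeholder)
noise = 0.5 * rng.standard_normal(n)     # internal variability (placeholder)
obs = 0.8 * fingerprint + noise          # synthetic "observed" anomalies

X = fingerprint[:, None]
beta, res, *_ = np.linalg.lstsq(X, obs, rcond=None)
dof = n - X.shape[1]
sigma2 = res[0] / dof
se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[0, 0])
lo, hi = beta[0] - 2 * se, beta[0] + 2 * se
print(f"scaling factor = {beta[0]:.2f}, ~95% CI = [{lo:.2f}, {hi:.2f}]")
print("detected" if lo > 0 else "not detected")
```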

Restricted access
Alejandro Cáceres-Euse, Anne Molcard, Natacha Bourg, Dylan Dumas, Charles-Antoine Guérin, and Giovanni Besio

Abstract

To assess the contribution of wind drag and Stokes drift to the near-surface circulation, a methodology to isolate the geostrophic surface current from high-frequency radar data is developed. The methodology performs a joint analysis of the wind field and in situ surface currents using an unsupervised neural network. The isolation method appears robust in light of comparisons with satellite altimeter data, showing similar time variability while providing more spatial detail of the currents in the coastal region. Results show that the wind-induced current is around 2.1% of the wind speed and deflected from the wind direction in the range [18°, 23°], whereas the classical literature suggests higher values. The wave-induced currents can represent more than 13% of the ageostrophic current component, depending on wind speed, suggesting that the Stokes drift needs to be analyzed as an independent term when studying surface currents in coastal zones. The methodology and results presented here could be extended worldwide, as complementary information to improve satellite-derived surface currents in coastal regions by including the local physical processes recorded by high-frequency radar systems. The assessment of the wave- and wind-induced currents has important applications in Lagrangian transport studies.
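One way to picture how a speed ratio near 2.1% and a deflection of 18°–23° can be extracted is a complex least-squares fit of the ageostrophic current to the wind vector. The sketch below uses synthetic vectors and is only a stand-in for the paper’s neural-network-based isolation method.

```python
# Minimal sketch: estimating the wind-induced surface current as a complex
# transfer coefficient a, where u_ageo ~ a * u_wind (vectors written as
# complex numbers u + i*v). |a| is the speed ratio and angle(a) the deflection.
# Synthetic data stand in for HF-radar ageostrophic currents and the wind field.
import numpy as np

rng = np.random.default_rng(2)
n = 1000
wind = rng.normal(5, 2, n) + 1j * rng.normal(0, 2, n)        # wind vector (m/s)
true_a = 0.021 * np.exp(1j * np.deg2rad(-20))                 # 2.1%, 20 deg deflection
ageo = true_a * wind + 0.01 * (rng.standard_normal(n)
                               + 1j * rng.standard_normal(n)) # current + noise

# Least-squares complex regression: a = <wind, ageo> / <wind, wind>
a_hat = np.vdot(wind, ageo) / np.vdot(wind, wind)
print(f"speed ratio = {abs(a_hat) * 100:.1f}% of wind speed")
print(f"deflection  = {np.degrees(np.angle(a_hat)):.1f} deg from wind direction")
```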

Restricted access
Yongquan Qu and Xiaoming Shi

Abstract

The development of machine learning (ML) techniques enables data-driven parameterizations, which have been investigated in many recent studies. Some investigations suggest that a priori trained ML models exhibit satisfying accuracy during training but poor performance when coupled to dynamical cores and tested. Here we use the evolution of the barotropic vorticity equation (BVE) with periodically reinforced shear instability as a prototype problem to develop and evaluate a model-consistent training strategy, which employs a numerical solver supporting automatic differentiation and includes the solver in the loss function for training ML-based subgrid-scale (SGS) turbulence models. This approach enables the interaction between the dynamical core and the ML-based parameterization during the model training phase. The BVE model was run at low, high, and ultra-high (truth) resolutions. Our training dataset contains only a short period of coarsened high-resolution simulations. However, given initial conditions long after the training dataset time, the trained SGS model can still significantly increase the effective lead time of the BVE model running at the low resolution by up to 50% compared to the BVE simulation without an SGS model. We also tested using a covariance matrix to normalize the loss function and found it can notably boost the performance of the ML parameterization. The SGS model’s performance is further improved by conducting transfer learning using a limited number of discontinuous observations, increasing the forecast lead time improvement to 73%. This study demonstrates a potential pathway to using machine learning to enhance the prediction skills of our climate and weather models.
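The model-consistent (solver-in-the-loss) training strategy can be sketched with a toy differentiable step standing in for the barotropic vorticity solver: the coarse step plus an ML subgrid term is rolled out inside the loss, so gradients flow through the dynamical core. The one-dimensional diffusion stand-in, layer sizes, and step counts below are illustrative assumptions, not the study’s configuration.

```python
# Sketch of model-consistent (solver-in-the-loop) training: the loss rolls out
# the differentiable coarse solver plus an ML subgrid-scale term and compares
# the trajectory to coarsened high-resolution truth. A 1D diffusion step stands
# in for the barotropic vorticity solver.
import jax
import jax.numpy as jnp

def coarse_step(u, dt=0.01, nu=0.05):
    # toy differentiable dynamical core: explicit diffusion on a periodic grid
    lap = jnp.roll(u, 1) - 2.0 * u + jnp.roll(u, -1)
    return u + dt * nu * lap

def sgs_term(params, u):
    # tiny fully connected net acting pointwise on local stencils
    x = jnp.stack([jnp.roll(u, 1), u, jnp.roll(u, -1)], axis=-1)
    h = jnp.tanh(x @ params["w1"] + params["b1"])
    return (h @ params["w2"] + params["b2"]).squeeze(-1)

def rollout(params, u0, n_steps):
    def body(u, _):
        u = coarse_step(u) + sgs_term(params, u)
        return u, u
    _, traj = jax.lax.scan(body, u0, None, length=n_steps)
    return traj

def loss(params, u0, truth):
    traj = rollout(params, u0, truth.shape[0])
    return jnp.mean((traj - truth) ** 2)   # trajectory-level (a posteriori) loss

key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
params = {"w1": 0.1 * jax.random.normal(k1, (3, 16)), "b1": jnp.zeros(16),
          "w2": 0.1 * jax.random.normal(k2, (16, 1)), "b2": jnp.zeros(1)}
u0 = jax.random.normal(k3, (64,))
truth = jnp.stack([u0 * jnp.exp(-0.02 * (i + 1)) for i in range(10)])  # fake coarsened truth

grads = jax.grad(loss)(params, u0, truth)   # gradients flow through the solver
params = jax.tree_util.tree_map(lambda p, g: p - 0.1 * g, params, grads)
print(float(loss(params, u0, truth)))
```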

Free access
Richard M. Schulte and Christian D. Kummerow

Abstract

Satellite-based oceanic precipitation estimates, particularly those derived from the Global Precipitation Measurement (GPM) satellite and CloudSat, suffer from significant disagreement over regions of the globe where warm rain processes are dominant. GPM estimates of average rain rate tend to be lower than CloudSat estimates, due in part to GPM being less sensitive to shallow and/or light precipitation. Using coincident observations between GPM and CloudSat, we find that the GPM_2BCMB product misses about two-thirds of total accumulated warm rain compared to the CloudSat 2C-RAIN-PROFILE product. This difference becomes much smaller when products are compared at 1000 m above the surface (mitigating surface clutter issues) and when forcing the frequency of rain from CloudSat to match the frequency from GPM (mitigating sensitivity issues). However, even then a gap of about 25% remains. Using an optimal estimation retrieval algorithm on the underlying data, we retrieve a similar result, but find that the remaining difference between the GPM and CloudSat retrieved rain rates can be almost entirely accounted for by inconsistent assumptions about the shape of the drop size distribution (DSD) that are made in the two retrievals. We conclude that DSD assumptions contribute significantly to the relative underestimation of warm rain by GPM compared to CloudSat. Because the choice of DSD model has such a large effect on retrieved rain rates, more work is needed to determine whether the DSD models assumed by either the GPM_2BCMB or 2C-RAIN-PROFILE algorithms are actually appropriate for warm rain.
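For readers unfamiliar with optimal estimation, the retrieval referred to above iterates a Gauss–Newton update that weighs prior and measurement information by their covariances. The sketch below shows that update for a toy forward model with made-up covariances; it is not the GPM_2BCMB or 2C-RAIN-PROFILE algorithm.

```python
# Minimal sketch of one Gauss-Newton step of an optimal-estimation retrieval:
# x_{i+1} = x_a + (Sa^-1 + K^T Se^-1 K)^-1 K^T Se^-1 (y - F(x_i) + K (x_i - x_a)).
# The forward model F, Jacobian K, and covariances are placeholders.
import numpy as np

def forward(x):
    # toy forward model mapping state (e.g., DSD/rain parameters) to observations
    return np.array([2.0 * x[0] + 0.5 * x[1], 0.3 * x[0] + 1.5 * x[1]])

K = np.array([[2.0, 0.5],
              [0.3, 1.5]])          # Jacobian dF/dx (constant for this toy model)
x_a = np.array([1.0, 1.0])          # prior (a priori) state
S_a = np.diag([0.5, 0.5])           # prior covariance
S_e = np.diag([0.1, 0.1])           # observation-error covariance
y = np.array([3.1, 2.2])            # observations

x_i = x_a.copy()
for _ in range(5):
    S_a_inv, S_e_inv = np.linalg.inv(S_a), np.linalg.inv(S_e)
    gain = np.linalg.inv(S_a_inv + K.T @ S_e_inv @ K) @ K.T @ S_e_inv
    x_i = x_a + gain @ (y - forward(x_i) + K @ (x_i - x_a))
print("retrieved state:", x_i)
```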

Restricted access
Antonios Mamalakis, Elizabeth A. Barnes, and Imme Ebert-Uphoff

Abstract

Methods of eXplainable Artificial Intelligence (XAI) are used in geoscientific applications to gain insights into the decision-making strategy of Neural Networks (NNs), highlighting which features in the input contribute the most to an NN prediction. Here, we discuss our “lesson learned” that the task of attributing a prediction to the input does not have a single solution. Instead, the attribution results depend greatly on the baseline that the XAI method utilizes, a fact that has been overlooked in the geoscientific literature. The baseline is a reference point to which the prediction is compared so that the prediction can be understood. This baseline can be chosen by the user or is set by construction in the method’s algorithm, often without the user being aware of that choice. We highlight that different baselines can lead to different insights for different science questions and, thus, should be chosen accordingly. To illustrate the impact of the baseline, we use a large ensemble of historical and future climate simulations forced with the SSP3-7.0 scenario and train a fully connected NN to predict the ensemble- and global-mean temperature (i.e., the forced global warming signal) given an annual temperature map from an individual ensemble member. We then use various XAI methods and different baselines to attribute the network predictions to the input. We show that attributions differ substantially when considering different baselines, as they correspond to answering different science questions. We conclude by discussing important implications and considerations about the use of baselines in XAI research.
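The baseline dependence is easiest to see in a path-attribution method such as integrated gradients, where the attribution is defined relative to the chosen reference. The toy linear network and the two baselines below are illustrative stand-ins for the temperature-map NN and the zero versus climatological references a user might pick.

```python
# Minimal sketch: integrated-gradients attributions depend on the baseline.
# For a toy linear "network" f(x) = w . x the path-averaged gradient is just w,
# so the attribution is (x - baseline) * w and changes with the baseline choice.
# The weights, input, and baselines below are illustrative placeholders.
import numpy as np

w = np.array([0.5, -1.0, 2.0])              # toy network weights
x = np.array([1.0, 2.0, 0.5])               # input "temperature map" (flattened)

def integrated_gradients(x, baseline, grad_fn, steps=50):
    alphas = np.linspace(0.0, 1.0, steps)[:, None]
    path = baseline + alphas * (x - baseline)          # straight-line path
    avg_grad = np.mean([grad_fn(p) for p in path], axis=0)
    return (x - baseline) * avg_grad

grad_fn = lambda p: w                        # gradient of f(x) = w . x is w everywhere

zero_baseline = np.zeros_like(x)             # "no signal" reference
clim_baseline = np.array([0.8, 1.5, 0.2])    # e.g., a climatological reference

print("zero baseline       :", integrated_gradients(x, zero_baseline, grad_fn))
print("climatology baseline:", integrated_gradients(x, clim_baseline, grad_fn))
```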

Free access
Nicole K. Neumann and Nicholas J. Lutsko

Abstract

The factors controlling the present-day pattern of temperature variance are poorly understood. In particular, it is unclear why the variance of wintertime near-surface temperatures on daily and synoptic time scales is roughly twice as high over North America as over Eurasia. In this study, continental geometry’s role in shaping regional wintertime temperature variance is investigated using idealized climate model simulations run with midlatitude continents of different shapes. A simulation with an isolated, rectangular midlatitude continent suggests that, in the absence of other geographic features, the highest temperature variance will be located in the northwest of the continent, roughly collocated with the region of largest meridional temperature gradients, and just north of the maximum near-surface wind speeds. Simulations with other geometries, mimicking key features of North America and Eurasia, investigate the impacts of continental length and width, sloping coastlines, and inland bodies of water on regional temperature variance. The largest effect comes from tapering the northwest corner of the continent, similar to Eurasia, which substantially reduces the maximum temperature variance. Narrower continents have smaller temperature variance in isolation, implying that the high variances over North America must be due to the nonlocal influence of stationary waves. Support for this hypothesis is provided by simulations with two midlatitude continents, which show how continental geometry and stationary waves can combine to shape regional temperature variance.

Significance Statement

Wintertime temperature variance over North America is roughly twice as high as over Eurasia, but the reasons for this are unknown. Here we use idealized climate model simulations to investigate how continental geometry shapes regional temperature variance. We find that the smaller variance over Eurasia is largely due to the tapering of its northwest coast, which weakens temperature gradients in the continental interior. Our simulations also suggest that in isolation a narrow continent, like North America, should have weak temperature variance, implying that stationary waves are responsible for the high variance over North America. Understanding the controls on regional temperature variance is important for interpreting present-day winter climates and how these will change in the future.

Open access
Li Zhao, Tao Xie, William Perrie, Ming Ma, Jingsong Yang, Chengzu Bai, and Rick Danielson

Abstract

Sea surface temperature (SST) fronts are important for fisheries and marine ecology, as well as upper-ocean dynamics, weather forecasting, and climate monitoring. In this paper, we propose a new approach to detect SST fronts from RADARSAT-2 ScanSAR images, based on the correlation of SAR-derived wind speeds using the gray-level co-occurrence matrix (GLCM) approach. Because the correlation of wind speeds differs markedly between SST fronts and other areas, SST fronts can be detected by a threshold method. To eliminate small-scale features (or noise), a 30 km scale is used as the length threshold for the detection of the SST fronts. The proposed method is effective when wind speeds are between 3 and 13 m s−1. The overall accuracy of our method is about 93.6%, which is sufficient for operational applications.
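To make the GLCM-correlation idea concrete, the sketch below computes the correlation texture of a synthetic wind-speed image in two windows, one straddling a front-like wind-speed change and one homogeneous; the contrast between them is what a threshold exploits. The field, quantization, and window size are assumptions, not the RADARSAT-2 processing chain.

```python
# Minimal sketch: GLCM "correlation" texture of SAR-derived wind speeds differs
# between windows containing a front-like wind-speed change and homogeneous
# windows, which is what makes a simple threshold usable. Synthetic data only.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(3)
wind = 8.0 + rng.normal(0.0, 0.3, (128, 128))   # SAR-derived wind speed (m/s)
wind[:, 64:] -= 2.0                             # SST-front-like step at column 64

levels = 32
q = ((wind - wind.min()) / (np.ptp(wind) + 1e-9) * (levels - 1)).astype(np.uint8)

def glcm_correlation(window):
    glcm = graycomatrix(window, distances=[1], angles=[0],
                        levels=levels, symmetric=True, normed=True)
    return graycoprops(glcm, "correlation")[0, 0]

front_window = q[56:88, 48:80]       # straddles the wind-speed step
smooth_window = q[20:52, 10:42]      # homogeneous area
print("GLCM correlation, front window :", round(glcm_correlation(front_window), 3))
print("GLCM correlation, smooth window:", round(glcm_correlation(smooth_window), 3))
# A threshold separating these two regimes (plus the ~30 km length criterion
# described in the abstract) would yield the detected front pixels.
```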

Restricted access
Chao He and Tianjun Zhou

Abstract

The subtropical North Pacific and North Atlantic are controlled by basin-scale anticyclones in boreal summer. Based on a novel metric regarding the strengths of the rotational and the divergent circulation of anticyclones, we investigated the possible future responses in the intensity of these two subtropical anticyclones to global warming. While the North Atlantic subtropical anticyclone (NASA) is projected to strengthen, the North Pacific subtropical anticyclone (NPSA) is projected to weaken, in terms of both the rotational and the divergent circulation. The distinct responses of the NPSA and NASA are corroborated by the models participating in the fifth and sixth phases of the Coupled Model Intercomparison Project (CMIP), under both intermediate and high emission scenarios. We further investigated the possible mechanism for their distinct responses by decomposing the effect of greenhouse gas forcing into the direct effect of increased CO2 concentration and the indirect effect through sea surface temperature (SST). The intensified NASA results from the CO2 direct forcing while the weakened NPSA is dominated by the SST warming. The CO2 direct forcing enhances the NASA by increasing land–ocean thermal contrast anchored by the largest subtropical continental area, the Eurasian–African continent. Both the uniform SST warming and the change in SST pattern act to weaken the NPSA by increasing the latent heating over the subtropical North Pacific basin, as suggested by atmospheric component model simulations. The distinct responses of the NPSA and the NASA may lead to zonal asymmetry of the subtropical climate change.
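The split into rotational and divergent circulation underlying the anticyclone metric is a Helmholtz decomposition of the horizontal wind. The sketch below performs that decomposition spectrally on a doubly periodic plane with a synthetic wind field; a spherical-harmonic treatment would be needed for real CMIP or reanalysis winds, and the paper’s specific metric is not reproduced here.

```python
# Minimal sketch of a Helmholtz decomposition: split a horizontal wind field
# into rotational (streamfunction) and divergent (velocity potential) parts by
# solving Poisson equations spectrally on a doubly periodic plane.
import numpy as np

n, L = 128, 2 * np.pi
x = np.linspace(0, L, n, endpoint=False)
X, Y = np.meshgrid(x, x)
u = -np.sin(Y) + 0.3 * np.cos(X)        # synthetic zonal wind
v = np.sin(X) + 0.3 * np.cos(Y)         # synthetic meridional wind

k = np.fft.fftfreq(n, d=L / n) * 2 * np.pi
KX, KY = np.meshgrid(k, k)
K2 = KX**2 + KY**2
K2[0, 0] = 1.0                           # avoid division by zero for the mean mode

def poisson_inverse(field):
    # solve lap(phi) = field on the periodic domain
    return np.real(np.fft.ifft2(-np.fft.fft2(field) / K2))

vort = np.gradient(v, x, axis=1) - np.gradient(u, x, axis=0)   # dv/dx - du/dy
div = np.gradient(u, x, axis=1) + np.gradient(v, x, axis=0)    # du/dx + dv/dy
psi = poisson_inverse(vort)              # streamfunction
chi = poisson_inverse(div)               # velocity potential

u_rot = -np.gradient(psi, x, axis=0)     # rotational wind (-dpsi/dy, dpsi/dx)
v_rot = np.gradient(psi, x, axis=1)
u_div = np.gradient(chi, x, axis=1)      # divergent wind (dchi/dx, dchi/dy)
v_div = np.gradient(chi, x, axis=0)

print("rotational wind rms:", np.sqrt(np.mean(u_rot**2 + v_rot**2)))
print("divergent wind rms :", np.sqrt(np.mean(u_div**2 + v_div**2)))
```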

Restricted access
Morven Muilwijk, Aleksi Nummelin, Céline Heuzé, Igor V. Polyakov, Hannah Zanowski, and Lars H. Smedsrud

Abstract

The Arctic Ocean is strongly stratified by salinity in the uppermost layers. This stratification is a key attribute of the region as it acts as an effective barrier for the vertical exchanges of Atlantic Water heat, nutrients, and CO2 between intermediate depths and the surface of the Eurasian and Amerasian basins (EB and AB). Observations show that from 1970 to 2017, the stratification in the AB has strengthened, whereas, in parts of the EB, the stratification has weakened. The strengthening in the AB is linked to freshening and deepening of the halocline. In the EB, the weakened stratification is associated with salinification and shoaling of the halocline (Atlantification). Simulations from a suite of CMIP6 models project that, under a strong greenhouse-gas forcing scenario (ssp585), the overall surface freshening and warming continue in both basins, but there is a divergence in hydrographic trends in certain regions. Within the AB, there is agreement among the models that the upper layers will become more stratified. However, within the EB, models diverge regarding future stratification. This is due to different balances between trends at the surface and trends at depth, related to Fram Strait fluxes. The divergence affects projections of the future state of Arctic sea ice, as models with the strongest Atlantification project the strongest decline in sea ice volume in the EB. From these simulations, one could conclude that Atlantification will not spread eastward into the AB; however, models must be improved to simulate changes in a more intricately stratified EB correctly.
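As a concrete reading of “stratification” here, one common index is the potential-density difference across the upper ocean computed from a temperature and salinity profile. The sketch below applies the TEOS-10 gsw package to a made-up Arctic-like profile; the depths, values, and the 50 m reference level are assumptions, not the study’s diagnostics.

```python
# Minimal sketch: an upper-ocean stratification index as the potential-density
# difference between ~50 m and the near-surface, computed with TEOS-10 (gsw)
# from a made-up Arctic-like temperature/salinity profile.
import numpy as np
import gsw

depth = np.array([5.0, 50.0])                  # m
practical_salinity = np.array([30.5, 33.0])    # fresher at the surface
in_situ_temp = np.array([-1.0, 0.5])           # deg C
lon, lat = 0.0, 80.0

pressure = gsw.p_from_z(-depth, lat)           # dbar
SA = gsw.SA_from_SP(practical_salinity, pressure, lon, lat)
CT = gsw.CT_from_t(SA, in_situ_temp, pressure)
sigma0 = gsw.sigma0(SA, CT)                    # potential density anomaly

stratification = sigma0[1] - sigma0[0]         # kg m-3 across the upper 50 m
print(f"upper-ocean stratification index: {stratification:.2f} kg m-3")
```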

Restricted access