Abstract
The optimal fingerprinting approach is central to the detection and attribution of climate change. It relies on a regression model in which the covariates are measured with error, and the covariance matrix of the measurement errors is proportional to that of the regression error up to a known scale. Inference about the regression coefficients is vital for making reliable detection and attribution statements and for quantifying uncertainties in outcomes such as attributable warming. Traditionally, this inference has used the total least squares (TLS) method, which depends on an accurate estimate of the covariance matrix of the regression error. Inaccuracies in estimating this matrix can bias the scaling-factor estimators and produce overly optimistic confidence intervals, misrepresenting the accuracy of detection and attribution statements. The recent advent of an estimating-equations approach, which offers more efficient point estimation with smaller variance and accurate uncertainty quantification, prompts a critical reassessment of past climate change detection and attribution analyses. Applying this method to HadCRUT5 observational data and CMIP6 multimodel simulations, our study reevaluates temperature detection and attribution at global and regional scales, strengthens existing detection and attribution conclusions at the global scale, and provides evidence of the effect of anthropogenic forcings in various regions.
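To make the regression model concrete, the following is a minimal sketch (not the authors' code) of the prewhitened TLS estimator commonly used in optimal fingerprinting. It assumes the covariate measurement-error covariance equals the regression-error covariance divided by known ensemble sizes; the function name and arguments are illustrative.

```python
import numpy as np

def tls_fingerprint(y, X, Sigma_hat, m=None):
    """TLS scaling-factor estimate for an optimal fingerprinting regression.

    y         : (n,) vector of observed climate responses
    X         : (n, p) model-simulated signal patterns (noisy covariates)
    Sigma_hat : (n, n) estimate of the internal-variability covariance
    m         : (p,) ensemble sizes; covariate noise covariance is Sigma/m[j]
    """
    n, p = X.shape
    if m is None:
        m = np.ones(p)
    # Prewhiten so the regression error has (approximately) identity covariance.
    L = np.linalg.cholesky(Sigma_hat)
    yw = np.linalg.solve(L, y)
    # Rescale column j by sqrt(m[j]) so all covariate errors share covariance Sigma.
    Xw = np.linalg.solve(L, X * np.sqrt(m))
    # TLS solution via the SVD of the augmented matrix [Xw, yw]: take the right
    # singular vector associated with the smallest singular value.
    Z = np.column_stack([Xw, yw])
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    v = Vt[-1]
    beta_w = -v[:p] / v[p]
    # Undo the column rescaling to recover the scaling factors.
    return np.sqrt(m) * beta_w
```

In practice, Sigma_hat is estimated from a limited number of control-run segments; the sampling error of this step is precisely what can distort the TLS scaling factors and their confidence intervals, motivating the estimating-equations alternative examined in this study.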
© 2025 American Meteorological Society. This is an Author Accepted Manuscript distributed under the terms of the default AMS reuse license. For information regarding reuse and general copyright information, consult the AMS Copyright Policy (www.ametsoc.org/PUBSReuseLicenses).