Search Results

You are looking at 1 - 5 of 5 items for

  • Author or Editor: Martin P. Tingley
Martin P. Tingley

Abstract

Climate datasets with both spatial and temporal components are often studied after removing from each time series a temporal mean calculated over a common reference interval, which is generally shorter than the overall length of the dataset. The use of a short reference interval affects the temporal properties of the variability across the records by reducing the standard deviation within the reference interval and inflating it elsewhere. For an annually averaged version of the Climatic Research Unit’s (CRU) temperature anomaly product, the mean standard deviation is 0.67°C within the 1961–90 reference interval, and 0.81°C elsewhere.

The calculation of anomalies can be interpreted in terms of a two-factor analysis of variance model. Within a Bayesian inference framework, any missing values are viewed as additional parameters, and the reference interval is specified as the full length of the dataset. This Bayesian scheme is used to re-express the CRU dataset as anomalies with respect to means calculated over the entire 1850–2009 interval spanned by the dataset. The mean standard deviation is increased to 0.69°C within the original 1961–90 reference interval, and reduced to 0.76°C elsewhere. The choice of reference interval thus has a predictable and demonstrable effect on the second spatial moment time series of the CRU dataset. The spatial mean time series is in this case largely unaffected: the amplitude of spatial mean temperature change is reduced by 0.1°C when using the 1850–2009 reference interval, while the 90% uncertainty interval of (−0.03, 0.23) indicates that the reduction is not statistically significant.
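The variance distortion described in this abstract is easy to reproduce. The sketch below uses synthetic white-noise records (purely illustrative, not the CRU product) to show that anomalies taken with respect to a short reference window have deflated standard deviation inside that window and inflated standard deviation outside it:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "station" records: 160 years x 500 sites of unit-variance noise
# (a stand-in for the CRU anomaly product; the values are illustrative only).
years = np.arange(1850, 2010)
data = rng.normal(0.0, 1.0, size=(years.size, 500))

# Anomalies relative to a short 1961-90 reference interval, as in the
# conventional calculation described above.
ref = (years >= 1961) & (years <= 1990)
anom_short = data - data[ref].mean(axis=0)

# Anomalies relative to the full 1850-2009 record.
anom_full = data - data.mean(axis=0)

# Mean standard deviation inside vs. outside the reference window: the short
# window deflates variability inside it and inflates it elsewhere.
sd_in = anom_short[ref].std(axis=0, ddof=1).mean()
sd_out = anom_short[~ref].std(axis=0, ddof=1).mean()
print(f"short reference: sd inside = {sd_in:.3f}, sd outside = {sd_out:.3f}")
```

For unit-variance noise and a 30-year window, the expected standard deviations are √(1 − 1/30) ≈ 0.98 inside the window and √(1 + 1/30) ≈ 1.02 outside, the same qualitative effect the abstract quantifies for the CRU data.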

Martin P. Tingley and Peter Huybers

Abstract

Reconstructing the spatial pattern of a climate field through time from a dataset of overlapping instrumental and climate proxy time series is a nontrivial statistical problem. The need to transform the proxy observations into estimates of the climate field, and the fact that the observed time series are not uniformly distributed in space, further complicate the analysis. Current leading approaches to this problem are based on estimating the full covariance matrix between the proxy time series and instrumental time series over a “calibration” interval and then using this covariance matrix in the context of a linear regression to predict the missing instrumental values from the proxy observations for years prior to instrumental coverage.

A fundamentally different approach to this problem is formulated by specifying parametric forms for the spatial covariance and temporal evolution of the climate field, as well as “observation equations” describing the relationship between the data types and the corresponding true values of the climate field. A hierarchical Bayesian model is used to assimilate both proxy and instrumental datasets and to estimate the probability distribution of all model parameters and the climate field through time on a regular spatial grid. The output from this approach includes an estimate of the full covariance structure of the climate field and model parameters as well as diagnostics that estimate the utility of the different proxy time series.
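As a rough illustration of the model structure described above (this is not the authors' BARCAST implementation; the parametric forms and all parameter values here are invented for the sketch), the following forward simulation draws a climate field with an exponential spatial covariance and AR(1) temporal evolution, then generates instrumental and proxy series through simple observation equations:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 20 sites observed over 100 years.
n_sites, n_years = 20, 100
sites = rng.uniform(0, 10, size=(n_sites, 2))  # site coordinates

# Parametric spatial covariance: variance sigma2, exponential decay with distance.
dist = np.linalg.norm(sites[:, None, :] - sites[None, :, :], axis=-1)
sigma2, phi = 1.0, 3.0
spatial_cov = sigma2 * np.exp(-dist / phi)
chol = np.linalg.cholesky(spatial_cov)

# Temporal evolution: AR(1) in time with spatially correlated innovations.
alpha = 0.7
field = np.zeros((n_years, n_sites))
for t in range(1, n_years):
    field[t] = alpha * field[t - 1] + chol @ rng.normal(size=n_sites)

# Observation equations: instruments see the field plus white noise; proxies
# see a shifted, scaled, and noisier version of it.
tau_inst, beta0, beta1, tau_proxy = 0.2, 0.1, 0.8, 0.6
inst = field + tau_inst * rng.normal(size=field.shape)
proxy = beta0 + beta1 * field + tau_proxy * rng.normal(size=field.shape)
```

In the hierarchical Bayesian approach, inference runs this generative structure in reverse: given `inst` and `proxy` (with missing entries), the posterior distribution of `field` and of the parameters (`sigma2`, `phi`, `alpha`, the observation-equation coefficients) is estimated jointly.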

This methodology is demonstrated using an instrumental surface temperature dataset after corrupting a number of the time series to mimic proxy observations. The results are compared to those achieved using the regularized expectation–maximization algorithm, and in these experiments the Bayesian algorithm produces reconstructions with greater skill. The assumptions underlying these two methodologies and the results of applying each to simple surrogate datasets are explored in greater detail in a companion paper.

Martin P. Tingley and Peter Huybers

Abstract

A companion paper presented a Bayesian algorithm for reconstructing climate anomalies in space and time (BARCAST). This method involves specifying simple parametric forms for the spatial covariance and temporal evolution of the climate field as well as “observation equations” describing the relationships between the data types and the corresponding true values of the climate field. As this Bayesian approach to reconstructing climate fields is new and different, it is worthwhile to compare it in detail to the more established regularized expectation–maximization (RegEM) algorithm, which is based on an empirical estimate of the joint data covariance matrix and a multivariate regression of the instrumental time series onto the proxy time series. The differing assumptions made by BARCAST and RegEM are detailed, and the impacts of these differences on the analysis are discussed. Key distinctions between BARCAST and RegEM include their treatment of spatial and temporal covariance, the prior information that enters into each analysis, the quantities they seek to impute, the end product of each analysis, the temporal variance of the reconstructed field, and the treatment of uncertainty in both the imputed values and functions of these imputations. Differences between BARCAST and RegEM are illustrated by applying the two approaches to various surrogate datasets. If the assumptions inherent to BARCAST are not strongly violated, then in scenarios comparable to practical applications BARCAST results in reconstructions of both the field and the spatial mean that are more skillful than those produced by RegEM, as measured by the coefficient of efficiency. In addition, the uncertainty intervals produced by BARCAST are narrower than those estimated using RegEM and contain the true values with higher probability.
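The coefficient of efficiency (CE) used here as the skill measure compares the squared reconstruction error to the variance of the verification data about its own mean: CE = 1 for a perfect reconstruction, and CE ≤ 0 indicates no more skill than the climatological mean. A minimal implementation of this standard definition:

```python
import numpy as np

def coefficient_of_efficiency(truth, recon):
    """CE = 1 - SSE / SST, where SST is the variance of the verification
    data about its own mean. CE = 1 for a perfect reconstruction; CE <= 0
    means no more skill than predicting the climatological mean."""
    truth = np.asarray(truth, dtype=float)
    recon = np.asarray(recon, dtype=float)
    sse = np.sum((truth - recon) ** 2)
    sst = np.sum((truth - truth.mean()) ** 2)
    return 1.0 - sse / sst

truth = np.array([0.0, 1.0, 2.0, 3.0])
print(coefficient_of_efficiency(truth, truth))                      # 1.0
print(coefficient_of_efficiency(truth, np.full(4, truth.mean())))   # 0.0
```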

Andrew Rhines, Karen A. McKinnon, Martin P. Tingley, and Peter Huybers

Abstract

There is considerable interest in determining whether recent changes in the temperature distribution extend beyond simple shifts in the mean. The authors present a framework based on quantile regression, wherein trends are estimated across percentiles. Pointwise trends from surface station observations are mapped into continuous spatial fields using thin-plate spline regression. This procedure allows for resolving spatial dependence of distributional trends, providing uncertainty estimates that account for spatial covariance and varying station density. The method is applied to seasonal near-surface temperatures between 1979 and 2014 to unambiguously assess distributional changes in the densely sampled North American region. Strong seasonal differences are found, with summer trends exhibiting significant warming throughout the domain with little distributional dependence, while the spatial distribution of spring and fall trends shows a dipole structure. In contrast, the spread between the 95th and 5th percentiles in winter has decreased, with trends of −0.71°C and −0.85°C decade⁻¹, respectively, for daily maximum and minimum temperature, a contraction that is statistically significant over 84% of the domain. This decrease in variability is dominated by warming of the coldest days, which has outpaced the median trend by approximately a factor of 4. Identical analyses using ERA-Interim and NCEP-2 yield consistent estimates for winter (though not for other seasons), suggesting that reanalyses can be reliably used for relating winter trends to circulation anomalies. These results are consistent with Arctic-amplified warming being strongest in winter and with the influence of synoptic-scale advection on winter temperatures. Maps for all percentiles, seasons, and datasets are provided via an online tool.
