1. Introduction
In a recent paper, Bo Christiansen presents and discusses “LOC,” a methodology for reconstructing past climate that is based on local regressions between climate proxy time series and instrumental time series (Christiansen 2011, hereafter C11). LOC respects two important scientific facts about proxy data that are often overlooked: many proxies are likely influenced by strictly local temperature, and, to reflect causality, the proxies should be written as functions of climate, not vice versa. There are, however, several weaknesses to the LOC method: uncertainty is not propagated through the multiple stages of the analysis, the effects of observational errors in the instrumental record are not considered, and, as the proxies become uninformative of climate, the variance of a reconstruction produced by LOC becomes unbounded, a result that is clearly unphysical. These shortcomings can be overcome by interpreting the LOC method in the context of recently proposed Bayesian hierarchical reconstruction methods.
Section 2 reviews the basic modeling assumptions underlying LOC and details the shortcomings of this approach. To illustrate one possible solution to the shortcomings of LOC, section 3 presents a Bayesian interpretation of LOC and briefly describes the connections between LOC and two recently published Bayesian reconstruction methods. Section 4 discusses the variance of the reconstructed series in both the original and Bayesian LOC frameworks, and section 5 provides a few concluding remarks.
2. LOC: Description and shortcomings

The most obvious shortcoming of the LOC method is the behavior of the local reconstructions in the limiting case of the proxy series being independent of the corresponding true temperature series. As the proxy series becomes less informative of the temperature series, the estimated regression slope approaches zero, and the variance of the local reconstruction, which is formed by inverting the fitted regression, grows without bound. A reconstruction whose variance becomes arbitrarily large as the proxy signal vanishes is clearly unphysical.
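To make this limiting behavior concrete, the following Python sketch (our own illustration with arbitrary parameter choices, not code from C11) fits a regression of a simulated proxy on temperature and then predicts temperature by inverting the fit; as the proxy signal weakens, the variance of the inverted prediction explodes.

import numpy as np

rng = np.random.default_rng(0)
n_cal = 100                                   # calibration-period length (assumed)
T = rng.normal(0.0, 1.0, n_cal)               # "true" local temperatures, unit variance

for beta in [1.0, 0.3, 0.05]:                 # decreasing proxy signal
    P = 0.5 + beta * T + rng.normal(0.0, 1.0, n_cal)       # Eq. (1)-type proxy model
    beta_hat, alpha_hat = np.polyfit(T, P, 1)               # regress proxy on temperature
    P_new = 0.5 + beta * rng.normal(0.0, 1.0, 5000) + rng.normal(0.0, 1.0, 5000)
    T_hat = (P_new - alpha_hat) / beta_hat                  # LOC-style inverted prediction
    print(f"beta = {beta:4.2f}:  var(T_hat) = {np.var(T_hat):10.2f}")
# As beta shrinks toward zero, var(T_hat) vastly exceeds the target variance of 1.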
Another shortcoming of the LOC method concerns the treatment of measurement errors. It is well known that when both the predictor and response variables in a regression are subject to measurement errors, inference on the model parameters is ill defined without additional information on the variance of the error terms (e.g., Fuller 1987). If such information is unavailable, it is possible to establish bounds on model parameters, but precise parameter estimation is not possible. The indirect regression underlying LOC bypasses the estimation of the measurement errors in the proxy and instrumental observations, and the resulting local reconstructions therefore take no account of observational errors in the instrumental record (e.g., Brohan et al. 2006).
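As a concrete illustration of this identifiability problem, the short Python sketch below (our own example, with assumed parameter values) simulates a proxy and an error-contaminated instrumental series from a common latent temperature; without knowledge of the error variances, the direct and reverse regression slopes only bracket the true slope (cf. Fuller 1987).

import numpy as np

rng = np.random.default_rng(1)
n = 20000
beta_true = 0.8
T_true = rng.normal(0.0, 1.0, n)                     # latent local temperature
P = beta_true * T_true + rng.normal(0.0, 0.5, n)     # proxy observations with noise
T_obs = T_true + rng.normal(0.0, 0.3, n)             # instrumental observations with error

b_direct = np.polyfit(T_obs, P, 1)[0]      # regress proxy on observed temperature (attenuated)
b_reverse = np.polyfit(P, T_obs, 1)[0]     # regress observed temperature on proxy
print(f"{b_direct:.2f} <= true slope {beta_true} <= {1.0 / b_reverse:.2f}")
# Only this interval for the slope is identified unless the error variances are known.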
The methodology used to combine the local reconstructions into a regional or hemispheric average is likewise unsatisfying. The fact that certain proxy series may have weaker relationships with local temperatures than others, and thus that the inferred temperatures at those locations are less reliable, is not taken into account. In addition, the spatial covariance of the underlying temperature process is not exploited (cf. Tingley and Huybers 2010a). These two issues can both be resolved via hierarchical modeling, as discussed below. Given that C11 does not consider the spatial covariance between the local temperature reconstructions, and that the goal of the analysis is to infer Northern Hemisphere (NH) mean temperature, the advantage of first reconstructing local temperatures is unclear. As an alternative, C11 could use each proxy series to directly reconstruct the hemispheric average temperature, and then take the average over these reconstructions (cf. Li et al. 2010).
A final concern with LOC is that the uncertainty introduced at the various stages of the analysis, such as in the estimation of the regression parameters in Eq. (1), is not propagated into the final reconstruction, which is consequently presented without any estimate of its uncertainty.
3. LOC in a Bayesian framework



The key advantage of putting LOC into a Bayesian framework is that a draw from the posterior of the parameters in Eq. (4) can be plugged into Eq. (3), which results in a draw of the temperature at each time and location for which there is no instrumental observation. The end result is an estimate of the posterior distribution of the temperature at these times and locations. Likewise, draws from various locations can be averaged at each year to produce a posterior distribution of the global or regional mean time series. Posterior draws of the local temperature time series are less variable at locations where the corresponding proxy series have strong relationships with local temperature (a small error variance relative to the magnitude of the regression slope), and more variable at locations where these relationships are weak.
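The composition step described above can be sketched in a few lines of Python. The sketch below is our own illustration under the assumption that posterior draws of the regression parameters for one proxy series are already available (e.g., from an MCMC sampler, not shown); the function name and arguments are not notation from C11.

import numpy as np

def draw_temperatures(P, alpha_draws, beta_draws, sigma_draws, rng=None):
    """Return an array of shape (n_parameter_draws, n_years) of temperature draws.

    Rewriting the data model P = alpha + beta*T + N(0, sigma^2) for T gives,
    conditional on one parameter draw, T ~ N((P - alpha)/beta, (sigma/beta)^2).
    """
    rng = np.random.default_rng() if rng is None else rng
    P = np.asarray(P, dtype=float)
    draws = [rng.normal((P - a) / b, np.abs(s / b))
             for a, b, s in zip(alpha_draws, beta_draws, sigma_draws)]
    return np.vstack(draws)

# Stacking such draws across locations and averaging over the location axis, year by
# year, yields a posterior distribution for the regional or hemispheric mean series.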
In contrast, uncertainty introduced at numerous stages of the LOC method is not propagated, and the resulting reconstructions (see Figs. 3, 5, 7, 9, 10, and 11 from C11) include no estimates of uncertainty. As the calibration model [Eq. (1)] that underpins LOC attributes all errors to the proxies and treats the target climate as a fixed quantity, estimates of residual variance after fitting the model in Eq. (1) cannot be readily used to estimate the uncertainty in predictions of T. Methods based on direct regression, such as the various implementations of the regularized EM algorithm (RegEM; Schneider 2001), instead attribute the errors to the target climate quantity, and uncertainty estimates can then be obtained from the residual analysis.
The Bayesian interpretation of LOC is closely related to two other published methods for reconstructing past climate.1 Li et al. (2010) describe a method for reconstructing the NH mean temperature time series using proxy time series that reflect temperatures at different temporal scales, and estimates of solar, volcanic, and greenhouse gas forcings. Forward models for each type of proxy link the true temperatures to the proxy observations, and a second-order autoregressive [AR(2)] model is assumed for the errors. The indirect regression model in LOC is essentially a special case of the data models used in Li et al. (2010). In contrast to LOC, the Bayesian model of Li et al. (2010) propagates uncertainty introduced at each level of the hierarchy and quantifies the uncertainty in the final estimate of the NH mean temperature time series. In addition, the process-level model used in Li et al. (2010), as well as the vague priors on parameters, provide regularization when the proxy signal-to-noise ratio is low.
Tingley and Huybers (2010a) propose a Bayesian Algorithm for Reconstructing Climate Anomalies in Space and Time (BARCAST), which is, like LOC, based on the data-level assumption that each proxy observation reflects strictly local (in space and time) information about the target process. Each type of proxy observation is assumed to share a common, linear relation of the form in Eq. (1) with the target climate field, while the instrumental observations are modeled as the true underlying value of the field perturbed by additive white noise with constant variance.
Temporally averaged climate processes, such as annual mean surface temperature anomalies, generally display persistence in both space and time, and one of the key features that differentiates BARCAST from LOC is the inclusion of a parametric space–time model. At the process level, the target field is assumed to be a multivariate AR(1) process, with a common AR(1) parameter for all locations. Spatial structure enters through the innovations that drive the AR(1) process, which are assumed to be draws from a mean-zero multivariate normal distribution, with an exponential spatial covariance function (see, e.g., Banerjee et al. 2004). This process-level model allows information to be shared between locations and through time, and the temperature inferred for a given location and year reflects information from neighboring times and places. In particular, explicitly modeling the space–time covariance structure allows for imputation of the temperature field at locations where there are no observations. Inference can then be made on a regular latitude–longitude grid, in which case weighting the inferred temperatures by the cosine of latitude, as done in C11, is a defensible spatial averaging procedure. Finally, Bayesian inference ensures that uncertainty in the estimation of model parameters is included in the final estimate of uncertainty in the spatial mean time series.
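As a rough illustration of this kind of process-level model, the following Python sketch simulates a field on a small grid from a multivariate AR(1) process with exponentially decaying spatial covariance in the innovations. The grid, parameter values, and variable names are our own assumptions and are not taken from Tingley and Huybers (2010a).

import numpy as np

rng = np.random.default_rng(3)
coords = np.array([(x, y) for x in range(5) for y in range(5)], dtype=float)  # toy 5x5 grid
dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)

sigma2, phi, ar_coef, n_years = 1.0, 2.0, 0.6, 200         # assumed parameter values
cov = sigma2 * np.exp(-dists / phi)                        # exponential spatial covariance
chol = np.linalg.cholesky(cov)

field = np.zeros((n_years, len(coords)))
for t in range(1, n_years):
    innovation = chol @ rng.standard_normal(len(coords))   # spatially correlated shock
    field[t] = ar_coef * field[t - 1] + innovation         # common AR(1) parameter at all sites

# Because nearby sites and adjacent years are correlated, an observation at one location
# and time carries information about the field at unobserved neighboring locations.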
An important feature of Bayesian hierarchical models is the simultaneous modeling of measurement errors in both the proxy and the instrumental observations (Tingley and Huybers 2010a,b; Li et al. 2010). The process level of a hierarchical model specifies the statistical structure of the target quantity, be it temporal (Li et al. 2010) or spatiotemporal (Tingley and Huybers 2010a), while the data level describes the errors in each type of observation of the target field. In contrast, LOC only models the errors in the proxies [Eq. (1)]. A measurement error model that explicitly takes into account the errors in the observations of both P_ij and T_ij seems more appropriate, as measurement error models allow for the estimation of the true relationship between the temperature process and the proxies. While appendix B of C11 notes that such models suffer from identifiability issues without additional information or constraints on the parameters,2 Ammann et al. (2010) construct an identifiable measurement error model by imposing the constraint of minimizing the bias of the reconstruction. Other methods that account for measurement errors in both predictor and response variables are discussed in Hegerl et al. (2007) and Mann et al. (2007).
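The data level of such a hierarchical measurement error model can be sketched as a simple generative simulation; the Python lines below are an illustration under assumed parameter values, not the specification used in any of the cited papers.

import numpy as np

rng = np.random.default_rng(4)
n_years = 150
T_true = rng.normal(0.0, 1.0, n_years)        # latent local temperature (simplified process level)

alpha, beta, sigma_proxy, sigma_inst = 0.2, 0.7, 0.8, 0.2                # assumed values
P_obs = alpha + beta * T_true + rng.normal(0.0, sigma_proxy, n_years)    # proxy data level
T_obs = T_true + rng.normal(0.0, sigma_inst, n_years)                    # instrumental data level

# When the variance of the latent temperature is unknown, the slope and the two error
# variances are not jointly identifiable from (P_obs, T_obs) alone (cf. Fuller 1987);
# additional information comes from the process level and priors of a hierarchical model,
# or from constraints such as the one imposed by Ammann et al. (2010).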
4. Variance of the reconstruction
C11 argues that one key advantage of LOC is that the variability of the reconstruction accurately reflects the variability of the target process. While C11 states and illustrates that the variability of LOC reconstructions is larger than that for other methods (see in particular Fig. 10 of C11), no validation results are presented that speak to the method’s ability to capture the true variability of the inferred mean time series. Given Eq. (A6) from C11, LOC will drastically overestimate the variability if a proxy series has a weak relationship with local temperature (λ is close to zero) or if the instrumental observations are contaminated by a substantial amount of noise. In addition, C11 does not investigate what must be an inevitable trade-off between minimizing bias and preserving variance, and doing so would lend insight into the properties of the resulting reconstructions. See Tingley and Huybers (2010b) for a comparison of the variability in various (pseudoproxy based) reconstructions to the target variability.
There is strong prior scientific knowledge about the plausible amplitude of temperature variability; for example, Jansen et al. (2007) estimate that, during the Last Glacial Maximum, the mean global surface temperature was about 4°–7°C colder than at present. Using such scientific knowledge to specify a weakly informative prior for the unknown temperature T_ij regularizes the inference and avoids the problem of unbounded variance in the presence of noninformative proxies and error-prone instrumental temperatures.
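A small numerical sketch, under the simple Gaussian assumptions stated in the comments below (our own illustrative choices), shows how a weakly informative prior on the unknown temperature bounds the variance of the inference even when the proxy carries essentially no signal.

# Under the data model P = alpha + beta*T + N(0, sigma^2) and a N(0, tau^2) prior on T,
# the posterior variance of T given P is 1 / (beta^2 / sigma^2 + 1 / tau^2), which can
# never exceed tau^2; the LOC-style inversion variance (sigma / beta)^2 is unbounded.
import numpy as np

sigma, tau = 1.0, 2.0                        # proxy noise scale and prior sd (assumed)
for beta in [1.0, 0.3, 0.05, 0.0]:           # proxy signal shrinking to zero
    var_inversion = np.inf if beta == 0 else (sigma / beta) ** 2
    var_posterior = 1.0 / (beta ** 2 / sigma ** 2 + 1.0 / tau ** 2)
    print(f"beta = {beta:4.2f}:  inversion variance = {var_inversion:8.2f},  "
          f"posterior variance = {var_posterior:5.2f}")
# The inversion variance diverges as beta -> 0, while the posterior variance approaches
# the prior variance tau^2 = 4.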










Finally, we emphasize that within the hierarchical framework, it is possible to incorporate prior information about the space–time behavior of the target process. Li et al. (2010) incorporate time series of climate forcings and specify a parametric temporal process for the NH mean time series based on a simple energy balance model, while Tingley and Huybers (2010a) specify a spatiotemporal process for the surface temperature anomalies. In each case, weakly informative prior distributions are then placed on the parameters of the temporal or spatiotemporal process, so that the estimates of the parameters are dominated by the data. Conditional on the process level being correctly specified, the posterior draws of the temperature time series or space–time process will have, on average, a reasonable variability, regardless of the signal-to-noise ratio for the proxies. If the proxies are uninformative, however, the variability of the mean across these draws will be attenuated (see Fig. 2 of Tingley and Huybers 2010b).
5. Conclusions
Hierarchical statistical modeling combined with Bayesian inference provides a flexible framework for analyzing data subject to numerous sources of uncertainty. In the paleoclimate context, hierarchical modeling allows the proxies to be linked to the target climate process via a probabilistic data-level model, and a parametric model of the target climate process to be incorporated at the process level. The indirect regression basis of LOC, which specifies the proxy observations as the dependent variable, can be understood as the data-level specification of a hierarchical model. Casting LOC into a Bayesian hierarchical modeling framework allows weakly informative priors for the target temperature process to eliminate the variance inflation that occurs in LOC if the proxies are uninformative of the climate process or if the observations of the target climate are subject to errors. Furthermore, because Bayesian hierarchical models allow for uncertainty propagation through the multiple steps of an analysis, framing LOC as a Bayesian model allows for internally consistent uncertainty quantification, which is absent from the original description of LOC.
LOC is closely related to the data level of the model presented in either Li et al. (2010) or Tingley and Huybers (2010a). In turn, both of these Bayesian models make unrealistic but simplifying assumptions in order to facilitate the inference. The model of Li et al. (2010) allows the proxies to reflect the target climate process at different levels of temporal averaging, but is nonspatial, while that of Tingley and Huybers (2010a) includes a spatial component but assumes that each proxy observation reflects spatially and temporally local information about the climate field. Despite the simplicity of current implementations, Bayesian hierarchical models are more flexible and hold the potential to be much more comprehensive than LOC. Bayesian hierarchical modeling is a promising approach to inferring past climate, and the specification of more realistic data-level and process-level models remains an area of active research.
Acknowledgments
This work was conducted while MPT was part of The Institute for Mathematics Applied to Geosciences at The National Center for Atmospheric Research. MPT is supported in part by the NSF under Grants ATM-0902374 and ATM-0724828, and BL by the NSF under Grant DMS-1007686.
REFERENCES
Ammann, C., M. Genton, and B. Li, 2010: Technical note: Correcting for signal attenuation from noisy proxy data in climate reconstructions. Climate Past, 6, 273–279.
Banerjee, S., B. P. Carlin, and A. E. Gelfand, 2004: Hierarchical Modeling and Analysis for Spatial Data. Chapman & Hall/CRC, 472 pp.
Brohan, P., J. J. Kennedy, I. Harris, S. F. B. Tett, and P. D. Jones, 2006: Uncertainty estimates in regional and global observed temperature changes: A new data set from 1850. J. Geophys. Res., 111, D12106, doi:10.1029/2005JD006548.
Christiansen, B., 2011: Reconstructing the NH mean temperature: Can underestimation of trends and variability be avoided? J. Climate, 24, 674–692.
Fuller, W. A., 1987: Measurement Error Models. Wiley, 440 pp.
Gelman, A., J. B. Carlin, H. S. Stern, and D. B. Rubin, 2003: Bayesian Data Analysis. 2nd ed. Chapman & Hall/CRC, 696 pp.
Hegerl, G., T. Crowley, M. Allen, W. Hyde, H. Pollack, J. Smerdon, and E. Zorita, 2007: Detection of human influence on a new, validated 1500-year temperature reconstruction. J. Climate, 20, 650–666.
Jansen, E., and Coauthors, 2007: Palaeoclimate. Climate Change 2007: The Physical Science Basis, S. Solomon et al., Eds., Cambridge University Press, 433–497.
Li, B., D. Nychka, and C. Ammann, 2010: The value of multi-proxy reconstruction of past climate. J. Amer. Stat. Assoc., 105 (491), 883–911.
Mann, M., S. Rutherford, E. Wahl, and C. Ammann, 2007: Robustness of proxy-based climate field reconstruction methods. J. Geophys. Res., 112, D12109, doi:10.1029/2006JD008272.
Schneider, T., 2001: Analysis of incomplete climate data: Estimation of mean values and covariance matrices and imputation of missing values. J. Climate, 14, 853–871.
Solomon, S., and Coauthors, 2007: Technical summary. Climate Change 2007: The Physical Science Basis, S. Solomon et al., Eds., Cambridge University Press, 19–91.
Tingley, M., and P. Huybers, 2010a: A Bayesian algorithm for reconstructing climate anomalies in space and time. Part I: Development and applications to paleoclimate reconstruction problems. J. Climate, 23, 2759–2781.
Tingley, M., and P. Huybers, 2010b: A Bayesian algorithm for reconstructing climate anomalies in space and time. Part II: Comparison with the regularized expectation–maximization algorithm. J. Climate, 23, 2782–2800.
Tingley, M., P. Craigmile, M. Haran, B. Li, E. Mannshardt-Shamseldin, and B. Rajaratnam, 2012: Piecing together the past: Statistical insights into paleoclimatic reconstructions. Quat. Sci. Rev., 35, 1–22.
1 For more detail on the links between various reconstruction methods, see Tingley et al. (2012).
2 For a more thorough discussion of the identifiability issues brought up in appendix B of C11, see Fuller (1987, chapter 1.1.3).