Search Results

You are looking at 1–10 of 23 items for

  • Author or Editor: Peter Thorne
Chunlüe Zhou, Junhong Wang, Aiguo Dai, and Peter W. Thorne

Abstract

This study develops an innovative approach to homogenize discontinuities in both mean and variance in global subdaily radiosonde temperature data from 1958 to 2018. First, natural temperature variations and changes are estimated using reanalyses and removed from the radiosonde data to construct monthly and daily difference series. A penalized maximal F test and an improved Kolmogorov–Smirnov test are then applied to the monthly and daily difference series to detect spurious shifts in the mean and variance, respectively. About 60% (40%) of the changepoints appear in the mean (variance), and ~56% of them are confirmed by available metadata. The changepoints display a country-dependent pattern likely due to changes in national radiosonde networks. Mean segment length is 7.2 (14.6) years for the mean (variance)-based detection. A mean (quantile)-matching method using up to 5 years of data from two adjacent mean (variance)-based segments is used to adjust the earlier segments relative to the latest segment. The homogenized series is obtained by adding the two homogenized difference series back to the subtracted reference series. The homogenized data exhibit more spatially coherent trends and temporally consistent variations than the raw data, and lack the spurious tropospheric cooling over North China and Mongolia seen in several reanalyses and raw datasets. The homogenized data clearly show a warming maximum around 300 hPa over 30°S–30°N, consistent with model simulations, in contrast to the raw data. The results suggest that spurious changes are numerous and significant in the radiosonde records and that our method can greatly improve their homogeneity.
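The quantile-matching step lends itself to a compact illustration. The Python sketch below maps values from an earlier (biased) segment of a difference series onto the quantiles of the latest segment; the function, toy data, and segment lengths are illustrative assumptions, not the authors' code.

```python
import numpy as np

def quantile_match(earlier, latest, n_quantiles=20):
    """Remap `earlier` so its empirical distribution matches `latest`.

    Sketch of quantile matching across a detected changepoint: each value
    in the earlier segment is mapped onto the corresponding quantile of
    the latest (reference) segment via piecewise-linear interpolation.
    """
    probs = np.linspace(0.0, 1.0, n_quantiles + 1)
    q_early = np.quantile(earlier, probs)  # quantiles of the biased segment
    q_late = np.quantile(latest, probs)    # quantiles of the reference segment
    return np.interp(earlier, q_early, q_late)

# Toy example: the earlier segment carries a spurious shift in mean and variance.
rng = np.random.default_rng(0)
latest = rng.normal(0.0, 1.0, 1825)    # ~5 years of daily differences
earlier = rng.normal(0.8, 1.5, 1825)   # shifted mean, inflated variance
adjusted = quantile_match(earlier, latest)
print(round(adjusted.mean(), 2), round(adjusted.std(), 2))  # ~0.0 and ~1.0
```

The mean-based adjustment described above is the simpler analogue: subtract the difference of the two segment means rather than remapping the full distribution.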

Open access
Kyle F. E. Betteridge, Peter D. Thorne, and Paul S. Bell

Abstract

The simultaneous measurement of current flow and suspended sediment concentration in the marine environment is central to the study of sediment transport processes. In view of this, two acoustic approaches for measuring flow were tested in a tidal estuary to assess their capabilities in this environment. A coherent Doppler velocity profiler and a cross-correlation velocity profiler were assessed using conventional current meters and a commercially available acoustic Doppler velocimeter. Mean velocity profiles were obtained up to a range of 1.47 m in 0.046-m range bins over a number of flood tides. The measurements compared well with the reference instruments and regression analysis produced gradients close to unity. Turbulent velocities measured with the coherent Doppler profiler were comparable with turbulent fluctuations measured with the acoustic Doppler velocimeter. The cross-correlation velocity profiler was shown to be unable to measure turbulent velocities. The backscattered signals received on the cross-correlation transducers were also used to compute the sediment concentration profiles using an explicit solution to the acoustic backscatter equation. Combining the concentration and flow measurements enabled sediment flux profiles to be obtained, the results of which compared favorably with flux measurements obtained from the conventional current meters and pumped sampling.
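The final step, turning co-located concentration and velocity profiles into sediment flux profiles, is simple in outline: the flux in each range bin is the product of velocity and concentration. A minimal Python sketch follows; the bin size and maximum range are taken from the abstract, while the profile shapes and values are purely illustrative.

```python
import numpy as np

# 32 range bins of 0.046 m, reaching ~1.47 m as in the study.
bin_size = 0.046                              # m
ranges = bin_size * np.arange(1, 33)          # bin centers (m)

u = 0.4 * (ranges / ranges.max()) ** (1 / 7)  # mock velocity profile (m/s)
c = 0.05 * np.exp(-2.0 * ranges)              # mock concentration (kg/m^3)

flux = u * c                                  # flux per bin (kg/m^2/s)
transport = flux.sum() * bin_size             # integrated transport (kg/m/s)
print(transport)
```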

Full access
Aiguo Dai, Junhong Wang, Peter W. Thorne, David E. Parker, Leopold Haimberger, and Xiaolan L. Wang

Abstract

Radiosonde humidity records represent the only in situ observations of tropospheric water vapor content with multidecadal length and quasi-global coverage. However, their use has been hampered by ubiquitous and large discontinuities resulting from changes to instrumentation and observing practices. Here a new approach is developed to homogenize historical records of tropospheric (up to 100 hPa) dewpoint depression (DPD), the archived radiosonde humidity parameter. Two statistical tests are used to detect changepoints, which are most apparent in histograms and occurrence frequencies of the daily DPD: a variant of the Kolmogorov–Smirnov (K–S) test for changes in distributions and the penalized maximal F test (PMFred) for mean shifts in the occurrence frequency for different bins of DPD. These tests capture most of the apparent discontinuities in the daily DPD data, with an average of 8.6 changepoints (∼1 changepoint per 5 yr) in each of the analyzed radiosonde records, which begin as early as the 1950s and end in March 2009. Before applying breakpoint adjustments, artificial sampling effects are first adjusted by estimating missing DPD reports for cold (T < −30°C) and dry (DPD artificially set to 30°C) conditions using empirical relationships at each station between the anomalies of air temperature and vapor pressure, derived from recent observations when DPD reports are available under these conditions. Next, the sampling-adjusted DPD is detrended separately for each of the 4–10 quantile categories and then adjusted using a quantile-matching algorithm so that the earlier segments have histograms comparable to that of the latest segment. Neither the changepoint detection nor the adjustment uses a reference series, given the stability of the DPD series.
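The distribution-change detection can be illustrated with a standard two-sample Kolmogorov–Smirnov test; the paper uses a K–S variant, so this Python sketch is a simplified stand-in with synthetic data.

```python
import numpy as np
from scipy.stats import ks_2samp

def ks_changepoint_check(series, candidate, alpha=0.01):
    """Test whether the daily DPD distribution differs across `candidate`.

    Simplified stand-in for the paper's K-S variant: a standard two-sample
    test on the segments before and after the candidate changepoint.
    """
    before, after = series[:candidate], series[candidate:]
    stat, pvalue = ks_2samp(before, after)
    return pvalue < alpha, stat, pvalue

# Synthetic record: an instrument change at day 3000 alters the distribution.
rng = np.random.default_rng(1)
dpd = np.concatenate([rng.gamma(2.0, 4.0, 3000),   # older sensor
                      rng.gamma(2.0, 5.5, 3000)])  # newer, drier-reporting sensor
print(ks_changepoint_check(dpd, 3000))             # (True, ...) here
```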

Using this new approach, a homogenized global, twice-daily DPD dataset (available online at www.cgd.ucar.edu/cas/catalog/) is created for climate and other applications based on the Integrated Global Radiosonde Archive (IGRA) and two other data sources. The adjusted daily DPD has much smaller and spatially more coherent trends during 1973–2008 than the raw data. It implies only small changes in relative humidity in the lower and middle troposphere. When combined with homogenized radiosonde temperature, other atmospheric humidity variables can be calculated, and these exhibit spatially more coherent trends than without the DPD homogenization. The DPD adjustment yields a different pattern of change in humidity parameters compared to the apparent trends from the raw data. The adjusted estimates show an increase in tropospheric water vapor globally.
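Given homogenized temperature and DPD, the other humidity variables follow from standard formulas. A minimal sketch using the Magnus approximation for saturation vapor pressure, a common choice that is not necessarily the one used in producing this dataset:

```python
import numpy as np

def humidity_from_t_dpd(t_c, dpd_c, pressure_hpa):
    """Vapor pressure (hPa), relative humidity (%), and specific humidity
    (g/kg) from temperature and dewpoint depression, via the Magnus
    approximation for saturation vapor pressure."""
    td_c = t_c - dpd_c                                 # dewpoint (deg C)
    e = 6.112 * np.exp(17.67 * td_c / (td_c + 243.5))  # vapor pressure
    es = 6.112 * np.exp(17.67 * t_c / (t_c + 243.5))   # saturation value
    rh = 100.0 * e / es
    q = 1000.0 * 0.622 * e / (pressure_hpa - 0.378 * e)
    return e, rh, q

print(humidity_from_t_dpd(20.0, 5.0, 1000.0))  # ~ (17.0 hPa, 73%, 10.7 g/kg)
```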

Full access
Katharine M. Willett, Philip D. Jones, Nathan P. Gillett, and Peter W. Thorne

Abstract

Water vapor constitutes the most significant greenhouse gas, is a key driver of many atmospheric processes, and hence is fundamental to understanding the climate system. It is a major factor in human “heat stress,” whereby increasing humidity reduces the ability to stay cool. Until now no truly global homogenized surface humidity dataset has existed with which to assess recent changes. The Met Office Hadley Centre and Climatic Research Unit Global Surface Humidity dataset (HadCRUH), described herein, provides a homogenized, quality-controlled, near-global 5° by 5° gridded monthly mean anomaly dataset of surface specific and relative humidity from 1973 to 2003. It consists of land and marine data and is geographically quasi-complete over the region 60°N–40°S.
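At its core, the gridding behind such a product averages station anomalies into 5° boxes. A minimal Python sketch of that step alone; HadCRUH's quality control and homogenization are not represented here, and the function name and data are illustrative.

```python
import numpy as np

def grid_anomalies(lats, lons, anomalies, res=5.0):
    """Average station anomalies into a res x res degree grid (sketch only)."""
    nlat, nlon = int(180 / res), int(360 / res)
    total = np.zeros((nlat, nlon))
    count = np.zeros((nlat, nlon))
    ilat = ((lats + 90.0) // res).astype(int).clip(0, nlat - 1)
    ilon = ((lons + 180.0) // res).astype(int).clip(0, nlon - 1)
    np.add.at(total, (ilat, ilon), anomalies)  # accumulate per box
    np.add.at(count, (ilat, ilon), 1)          # station count per box
    with np.errstate(invalid="ignore"):
        return np.where(count > 0, total / count, np.nan)

# Three illustrative stations for one month.
grid = grid_anomalies(np.array([51.5, 53.3, -33.9]),
                      np.array([-0.1, -6.3, 18.4]),
                      np.array([0.4, 0.3, -0.2]))
print(np.nansum(grid))  # 0.5: each station falls in its own box
```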

Between 1973 and 2003 surface specific humidity has increased significantly over the globe, tropics, and Northern Hemisphere. Global trends are 0.11 and 0.07 g kg⁻¹ (10 yr)⁻¹ for land and marine components, respectively. Trends are consistently larger in the tropics and in the Northern Hemisphere during summer, as expected: warmer regions exhibit larger increases in specific humidity for a given temperature change under conditions of constant relative humidity, based on the Clausius–Clapeyron equation. Relative humidity trends are not significant when averaged over the landmass of the globe, tropics, and Northern Hemisphere, although some seasonal changes are significant.
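The Clausius–Clapeyron point can be made concrete with a short calculation: at fixed relative humidity, the same 1 K of warming adds several times more water vapor in a warm tropical climate than in a cool one. A sketch using the Magnus approximation, with illustrative temperatures:

```python
import numpy as np

def q_sat(t_c, p_hpa=1000.0):
    """Saturation specific humidity (g/kg) via the Magnus approximation."""
    es = 6.112 * np.exp(17.67 * t_c / (t_c + 243.5))
    return 1000.0 * 0.622 * es / (p_hpa - 0.378 * es)

# Moisture added by 1 K of warming at 80% relative humidity.
for t in (5.0, 28.0):
    dq = 0.8 * (q_sat(t + 1.0) - q_sat(t))
    print(f"{t:4.1f} degC: +{dq:.2f} g/kg per K")  # larger in the warm case
```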

A strong positive bias is apparent in marine humidity data prior to 1982, likely owing to a known change in reporting practice for dewpoint temperature at this time. Consequently, trends in both specific and relative humidity are likely underestimated over the oceans.

Full access
Peter W. Thorne, David E. Parker, John R. Christy, and Carl A. Mears

Historically, meteorological observations have been made for operational forecasting rather than long-term monitoring purposes, so there have been numerous changes in instrumentation and procedures. Hence, creating climate-quality datasets requires the identification, estimation, and removal of many nonclimatic biases from the historical data. Construction of a number of new tropospheric temperature climate datasets has highlighted previously unrecognized uncertainty in multidecadal temperature trends aloft. The choice of dataset can even change the sign of upper-air trends relative to those reported at the surface. Structural uncertainty introduced unintentionally through dataset construction choices is therefore important and needs to be understood and mitigated. A number of ways this could be addressed for historical records are discussed, as is how it could be reduced in future through coordinated observing systems designed with long-term monitoring as a driver, enabling explicit calculation and removal of nonclimatic biases. Although upper-air temperature records are used to illustrate the arguments, it is strongly believed that the findings are applicable to all long-term climate datasets and variables. A full characterization of observational uncertainty is as vitally important as recent intensive efforts to understand climate model uncertainties if the goal of rigorously reducing the uncertainty in both past and future climate changes is to be achieved.

Full access
Peter A. Stott, Gareth S. Jones, Jason A. Lowe, Peter Thorne, Chris Durman, Timothy C. Johns, and Jean-Claude Thelen

Abstract

The ability of climate models to simulate large-scale temperature changes during the twentieth century when they include both anthropogenic and natural forcings, and their inability to account for warming over the last 50 yr when they exclude increasing greenhouse gas concentrations, have been used as evidence for an anthropogenic influence on global warming. One criticism of the models used in many of these studies is that they exclude some forcings of potential importance, notably from fossil fuel black carbon, biomass smoke, and land use changes. Herein, transient simulations with a new model, the Hadley Centre Global Environmental Model version 1 (HadGEM1), are described; these include the above forcings in addition to other anthropogenic and natural forcings, and a fully interactive treatment of atmospheric sulfur and its effects on clouds. These new simulations support previous work by showing that there was a significant anthropogenic influence on near-surface temperature change over the last century. They demonstrate that black carbon and land use changes are relatively unimportant for explaining global mean near-surface temperature changes.

The pattern of warming in the troposphere and cooling in the stratosphere that has been observed in radiosonde data since 1958 can only be reproduced when the model includes anthropogenic forcings. However, there are some discrepancies between the model simulations and radiosonde data, which are largest where observational uncertainty is greatest: in the Tropics and at high latitudes.

Predictions of future warming have also been made using the new model. Twenty-first-century warming rates, following policy-relevant emissions scenarios, are slightly greater in HadGEM1 than in the Third Hadley Centre Coupled Ocean–Atmosphere General Circulation Model (HadCM3) as a result of the extra forcing in HadGEM1. An experiment in which greenhouse gases and other anthropogenic forcings are stabilized at 2100 levels and held constant until 2200 predicts a committed twenty-second-century warming of less than 1 K, whose spatial distribution resembles that of warming during the twenty-first century, implying that the local feedbacks that determine the pattern of warming do not change significantly.

Full access
Ciara Ryan, Catriona Duffy, Ciaran Broderick, Peter W. Thorne, Mary Curley, Séamus Walsh, Conor Daly, Mairéad Treanor, and Conor Murphy

Abstract

Over much of the globe, the temporal extent of meteorological records is limited, yet a wealth of data remains in paper or image form in numerous archives. To date, little attention has been given to the role that students might play in efforts to rescue these data. Here we summarize an ambitious research-led, accredited teaching experiment in which undergraduate students successfully transcribed more than 1,300 station years of daily precipitation data and associated metadata across Ireland over the period 1860–1939. We explore i) the potential for integrating data rescue activities into the classroom, ii) the ability of students to produce reliable transcriptions, and iii) the learning outcomes for students. Data previously transcribed by Met Éireann (Ireland’s National Meteorological Service) were used as a benchmark, against which it was ascertained that students were as accurate as the professionals. Details on the assignment, its planning and execution, and the student aids used are provided. The experience highlights the benefits that can accrue for data rescue through innovative collaboration between national meteorological services and academic institutions. At the same time, students have gained valuable learning outcomes and a firsthand understanding of the processes that underpin data rescue and analysis. The success of the project demonstrates the potential to extend data rescue in the classroom to other universities, thus providing both an enriched learning experience for students and a lasting legacy for the scientific community.

Open access
Xiaoyan Wei, Henk M. Schuttelaars, Megan E. Williams, Jennifer M. Brown, Peter D. Thorne, and Laurent O. Amoudry

Abstract

Asymmetric tidal turbulence (ATT) strongly influences estuarine health and functioning. However, its impact on three-dimensional estuarine dynamics, and the feedback of water motion and salinity distribution on ATT, remain poorly understood, especially for short estuaries (estuarine length ≪ tidal wavelength). This study systematically investigates these interactions in a short estuary for the first time, considering periodically weakly stratified conditions. This is done by developing a three-dimensional semi-analytical model (combining a perturbation method with the finite element method) that allows a dissection of the contributions of different processes to ATT, estuarine circulation, and salt transport. The generation of ATT is dominated by (i) strain-induced periodic stratification and (ii) asymmetric bottom-shear-generated turbulence, and their contributions to ATT differ in both amplitude and phase. The magnitude of the residual circulation related to ATT and the eddy viscosity–shear covariance (ESCO) is about half that of the gravitational circulation (GC) and shows a “reversed” pattern compared to GC. ATT generated by strain-induced periodic stratification contributes to an ESCO circulation with a spatial structure similar to GC. This circulation reduces the longitudinal salinity gradients and thus weakens GC. In contrast, the ESCO circulation due to asymmetric bottom-shear-generated turbulence shows patterns opposite to GC and acts to enhance GC. Concerning the salinity dynamics at steady state, GC and tidal pumping are equally important for salt import, whereas ESCO circulation yields a significant seaward salt transport. These findings highlight the importance of identifying the sources of ATT to understand its impact on estuarine circulation and salt distribution.
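The ESCO mechanism itself is a covariance effect: if the tidal fluctuations of eddy viscosity and vertical shear are correlated, their product retains a nonzero tidal mean even though each fluctuation averages to zero over a cycle. A toy Python illustration with assumed amplitudes and phases, not parameters from the study:

```python
import numpy as np

t = np.linspace(0.0, 2 * np.pi, 1000, endpoint=False)  # one tidal cycle
A = 0.01 + 0.004 * np.cos(t)          # eddy viscosity (m^2/s), fluctuating
shear = 0.05 * np.cos(t - np.pi / 6)  # vertical shear (1/s), phase-lagged

stress = np.mean(A * shear)           # tidally averaged A * du/dz
covariance = np.mean((A - A.mean()) * (shear - shear.mean()))
print(stress, covariance)             # equal: the mean stress is pure ESCO here
```

Here the mean shear is zero, so the entire residual stress comes from the covariance term; shifting the phase lag changes its sign, which is the sense in which ESCO circulation can either oppose or reinforce GC.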

Open access
Boyin Huang, Peter W. Thorne, Viva F. Banzon, Tim Boyer, Gennady Chepurin, Jay H. Lawrimore, Matthew J. Menne, Thomas M. Smith, Russell S. Vose, and Huai-Min Zhang

Abstract

The monthly global 2° × 2° Extended Reconstructed Sea Surface Temperature (ERSST) has been revised and updated from version 4 to version 5. This update incorporates the new ICOADS release 3.0 (R3.0), a decade of near-surface data from Argo floats, and a new estimate of centennial sea ice from HadISST2. A number of choices in aspects of quality control, bias adjustment, and interpolation have been substantively revised. The resulting ERSST estimates have more realistic spatiotemporal variations and better representation of high-latitude SSTs, and ship SST biases are now calculated relative to more accurate buoy measurements, while the global long-term trend remains about the same. Progressive experiments have been undertaken to highlight the effects of each change in data source and analysis technique upon the final product. The reconstructed SST is systematically decreased by 0.077°C as the reference data source is switched from ship SST in ERSSTv4 to modern buoy SST in ERSSTv5. Furthermore, high-latitude SSTs are decreased by 0.1°–0.2°C by using sea ice concentration from HadISST2 rather than HadISST1. Changes arising from the remaining innovations are mostly important at small space and time scales, primarily having an impact where and when input observations are sparse. Cross validations and verifications with independent modern observations show that the updates incorporated in ERSSTv5 have improved the representation of spatial variability over the global oceans, the magnitude of El Niño and La Niña events, and the decadal nature of SST changes over the 1930s–40s, when observation instruments changed rapidly. Both long-term (1900–2015) and short-term (2000–15) SST trends in ERSSTv5 remain significant, as in ERSSTv4.
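Referencing ship SSTs to buoys reduces, in its simplest form, to estimating a mean ship-minus-buoy offset from collocated pairs and removing it. The Python sketch below is a deliberately crude stand-in for the much more careful ERSSTv5 procedure; the 0.077°C figure is borrowed from the abstract as the illustrative bias.

```python
import numpy as np

def adjust_ship_to_buoy(ship_collocated, buoy_collocated, ship_all):
    """Remove the mean collocated ship-minus-buoy offset (sketch only)."""
    offset = np.mean(ship_collocated - buoy_collocated)
    return ship_all - offset

# Toy collocated pairs: ships reading ~0.077 degC warm relative to buoys.
rng = np.random.default_rng(2)
buoy = 20.0 + rng.normal(0.0, 0.3, 500)
ship = buoy + 0.077 + rng.normal(0.0, 0.5, 500)
adjusted = adjust_ship_to_buoy(ship, buoy, ship)
print(np.mean(adjusted - buoy))  # ~0 after adjustment
```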

Full access
Ed Hawkins, Pablo Ortega, Emma Suckling, Andrew Schurer, Gabi Hegerl, Phil Jones, Manoj Joshi, Timothy J. Osborn, Valérie Masson-Delmotte, Juliette Mignot, Peter Thorne, and Geert Jan van Oldenborgh

Abstract

The United Nations Framework Convention on Climate Change (UNFCCC) process agreed in Paris to limit global surface temperature rise to “well below 2°C above pre-industrial levels.” But what period is preindustrial? Somewhat remarkably, this is not defined within the UNFCCC’s many agreements and protocols. Nor is it defined in the IPCC’s Fifth Assessment Report (AR5) in the evaluation of when particular temperature levels might be reached because no robust definition of the period exists. Here we discuss the important factors to consider when defining a preindustrial period, based on estimates of historical radiative forcings and the availability of climate observations. There is no perfect period, but we suggest that 1720–1800 is the most suitable choice when discussing global temperature limits. We then estimate the change in global average temperature since preindustrial using a range of approaches based on observations, radiative forcings, global climate model simulations, and proxy evidence. Our assessment is that this preindustrial period was likely 0.55°–0.80°C cooler than 1986–2005 and that 2015 was likely the first year in which global average temperature was more than 1°C above preindustrial levels. We provide some recommendations for how this assessment might be improved in the future and suggest that reframing temperature limits with a modern baseline would be inherently less uncertain and more policy relevant.
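The closing suggestion, that a modern baseline is inherently less uncertain, comes down to simple anomaly arithmetic: re-expressing a series against preindustrial means adding an offset that is itself only known to within a range. A Python sketch using the abstract's assessed 0.55°–0.80°C offset and illustrative anomaly values:

```python
import numpy as np

# Illustrative annual anomalies relative to the 1986-2005 baseline.
anom_modern = np.array([0.45, 0.55, 0.62])

# Re-express against preindustrial (1720-1800) using the assessed offset
# range: preindustrial was likely 0.55-0.80 degC cooler than 1986-2005.
for offset in (0.55, 0.80):
    print(anom_modern + offset)  # same years, preindustrial baseline
```

The spread between the two printed series is exactly the baseline uncertainty, which is the argument for framing temperature limits against a modern, well-observed period.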

Open access