Search Results
Showing 1–7 of 7 items for:
- Author or Editor: Russell S. Vose
- Journal of Climate
Abstract
Set cover models are used to develop two reference station networks that can serve as near-term substitutes (as well as long-term backups) for the recently established Climate Reference Network (CRN) in the United States. The first network contains 135 stations distributed in a relatively uniform fashion in order to match the recommended spatial density for CRN. The second network contains 157 well-distributed stations that are generally not in urban areas in order to minimize the impact of future changes in land use. Both networks accurately reproduce the historical temperature and precipitation variations of the twentieth century.
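The set cover formulation can be illustrated with a small greedy sketch: choose stations one at a time so that every grid target ends up within a coverage radius of some chosen station. All coordinates, names, and the radius below are hypothetical inputs, not the paper's actual data or model.

```python
import math
import random

def greedy_station_cover(stations, targets, radius):
    """Greedy set cover: pick stations until every target point lies
    within `radius` of at least one chosen station."""
    uncovered = set(range(len(targets)))
    chosen = []
    while uncovered:
        # Pick the station covering the most still-uncovered targets.
        best, best_cov = None, set()
        for si, (sx, sy) in enumerate(stations):
            cov = {t for t in uncovered
                   if math.hypot(sx - targets[t][0], sy - targets[t][1]) <= radius}
            if len(cov) > len(best_cov):
                best, best_cov = si, cov
        if best is None:  # remaining targets unreachable at this radius
            break
        chosen.append(best)
        uncovered -= best_cov
    return chosen

# Synthetic candidate stations and a uniform grid of target points.
random.seed(0)
stations = [(random.random(), random.random()) for _ in range(50)]
targets = [(x / 9, y / 9) for x in range(10) for y in range(10)]
picked = greedy_station_cover(stations, targets, radius=0.25)
```

The greedy heuristic is a standard stand-in for exact set cover solvers; it trades optimality for simplicity but conveys how "well-distributed" subsets emerge from a coverage criterion.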
Abstract
A procedure is described that provides guidance in determining the number of stations required in a climate observing system deployed to capture temporal variability in the spatial mean of a climate parameter. The method entails reducing the density of an existing station network in a step-by-step fashion and quantifying subnetwork performance at each iteration. Under the assumption that the full network for the study area provides a reasonable estimate of the true spatial mean, this degradation process can be used to quantify the relationship between station density and network performance. The result is a systematic “cost–benefit” relationship that can be used in conjunction with practical constraints to determine the number of stations to deploy.
The approach is demonstrated using temperature and precipitation anomaly data from 4012 stations in the conterminous United States over the period 1971–2000. Results indicate that a U.S. climate observing system should consist of at least 25 quasi-uniformly distributed stations in order to reproduce interannual variability in temperature and precipitation because gains in the calculated performance measures begin to level off with higher station numbers. If trend detection is a high priority, then a higher density network of 135 evenly spaced stations is recommended. Through an analysis of long-term observations from the U.S. Historical Climatology Network, the 135-station solution is shown to exceed the climate monitoring goals of the U.S. Climate Reference Network.
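The step-by-step degradation can be sketched as follows: draw progressively smaller random subnetworks and score each one by correlating its spatial-mean anomaly series against the full-network mean. The data here are synthetic, and Pearson correlation is only one plausible performance measure, not necessarily the paper's.

```python
import random
import statistics

def pearson(a, b):
    """Pearson correlation between two equal-length series."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

def degrade(anomalies, steps):
    """anomalies: station name -> yearly anomaly series.
    For each subnetwork size in `steps`, sample that many stations and
    score the subnetwork spatial mean against the full-network mean."""
    full_mean = [statistics.mean(col) for col in zip(*anomalies.values())]
    stations = list(anomalies)
    scores = []
    for n in steps:
        subset = random.sample(stations, n)
        sub_mean = [statistics.mean(col)
                    for col in zip(*(anomalies[s] for s in subset))]
        scores.append((n, pearson(sub_mean, full_mean)))
    return scores

# Synthetic 100-station network with 30 years of anomalies.
random.seed(1)
net = {f"stn{i}": [random.gauss(0, 1) for _ in range(30)] for i in range(100)}
curve = degrade(net, steps=[5, 10, 25, 50, 100])
```

Plotting `curve` would yield the "cost–benefit" relationship the abstract describes: performance rises with station count and levels off, suggesting a point of diminishing returns.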
Abstract
Studies that utilize a long-term temperature record in determining the possibility of a global warming have led to conflicting results. We suggest that a time-series evaluation of mean annual temperatures is not sufficiently robust to determine the existence of a long-term warming. We propose the utilization of an air mass-based synoptic climatological approach, as it is possible that local changes within particular air masses have been obscured by the gross scale of temperature time-series evaluations used in previous studies of this type. An automated synoptic index was constructed for the winter months in four western North American Arctic locations to determine if the frequency of occurrence of the coldest and mildest air masses has changed and if the physical character of these air masses has shown signs of modification over the past 40 years. It appears that the frequencies of the majority of the coldest air masses have tended to decrease, while those of the warmest air masses have increased. In addition, the very coldest air masses at each site have warmed between 1°C and almost 4°C over the same time interval. A technique is suggested to determine whether these changes are possibly attributable to anthropogenic influences.
Abstract
The effect of the Luers–Eskridge adjustments on the homogeneity of archived radiosonde temperature observations is evaluated. Using unadjusted and adjusted radiosonde data from the Comprehensive Aerological Reference Dataset (CARDS) as well as microwave sounding unit (MSU) version-d monthly temperature anomalies, the discontinuities in differences between radiosonde and MSU temperature anomalies across times of documented changes in radiosonde are computed for the lower to midtroposphere, mid- to upper troposphere, and lower stratosphere. For this purpose, a discontinuity is defined as a statistically significant difference between means of radiosonde–MSU differences for the 30-month periods immediately prior to and following a documented change in radiosonde type. The magnitude and number of discontinuities based on unadjusted and adjusted radiosonde data are then compared. Since the Luers–Eskridge adjustments have been designed to remove radiation and lag errors from radiosonde temperature measurements, the homogeneity of the data should improve whenever these types of errors dominate.
It is found that even though stratospheric radiosonde temperatures appear to be somewhat more homogeneous after the Luers–Eskridge adjustments have been applied, transition-related discontinuities in the troposphere are frequently amplified by the adjustments. Significant discontinuities remain in the adjusted data in all three atmospheric layers. Based on the findings of this study, it appears that the Luers–Eskridge adjustments do not render upper-air temperature records sufficiently homogeneous for climate change analyses. Given that the method was designed to adjust only for radiation and lag errors in radiosonde temperature measurements, its relative ineffectiveness at producing homogeneous time series is likely to be caused by 1) an inaccurate calculation of the radiation or lag errors and/or 2) the presence of other errors in the data that contribute significantly to observed discontinuities in the time series.
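The discontinuity test described above reduces to a two-sample comparison of the radiosonde-minus-MSU difference series over the 30-month windows on either side of a documented sonde change. A minimal sketch on a synthetic series (the 30-month window comes from the abstract; the shift size, noise level, and names are illustrative):

```python
import math
import random

def discontinuity(diffs, change_idx, window=30):
    """Welch-style t statistic for the mean radiosonde-MSU difference in
    the `window` months before vs. after a documented sonde change."""
    before = diffs[change_idx - window:change_idx]
    after = diffs[change_idx:change_idx + window]
    m1 = sum(before) / window
    m2 = sum(after) / window
    v1 = sum((x - m1) ** 2 for x in before) / (window - 1)
    v2 = sum((x - m2) ** 2 for x in after) / (window - 1)
    t = (m2 - m1) / math.sqrt(v1 / window + v2 / window)
    return m2 - m1, t

# Synthetic difference series with a 0.5-K shift at month 60.
random.seed(2)
series = ([random.gauss(0.0, 0.3) for _ in range(60)] +
          [random.gauss(0.5, 0.3) for _ in range(60)])
shift, t_stat = discontinuity(series, change_idx=60)
```

A large |t| flags a statistically significant step; applying the same test to adjusted data shows whether an adjustment shrank or amplified the step.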
Abstract
This paper provides a general description of the Integrated Global Radiosonde Archive (IGRA), a new radiosonde dataset from the National Climatic Data Center (NCDC). IGRA consists of radiosonde and pilot balloon observations at more than 1500 globally distributed stations with varying periods of record, many of which extend from the 1960s to present. Observations include pressure, temperature, geopotential height, dewpoint depression, wind direction, and wind speed at standard, surface, tropopause, and significant levels.
IGRA contains quality-assured data from 11 different sources. Rigorous procedures are employed to ensure proper station identification, eliminate duplicate levels within soundings, and select one sounding for every station, date, and time. The quality assurance algorithms check for format problems, physically implausible values, internal inconsistencies among variables, runs of values across soundings and levels, climatological outliers, and temporal and vertical inconsistencies in temperature. The performance of the various checks was evaluated by careful inspection of selected soundings and time series.
In its final form, IGRA is the largest and most comprehensive dataset of quality-assured radiosonde observations freely available. Its temporal and spatial coverage is most complete over the United States, western Europe, Russia, and Australia. The vertical resolution and extent of soundings improve significantly over time, with nearly three-quarters of all soundings reaching up to at least 100 hPa by 2003. IGRA data are updated on a daily basis and are available online from NCDC as both individual soundings and monthly means.
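A plausibility check of the kind mentioned above can be sketched as a range filter on individual sounding levels. The thresholds and field names below are hypothetical, not IGRA's actual limits:

```python
def plausible_level(level):
    """Flag fields in a sounding level whose values fall outside broad
    physical limits (illustrative thresholds, not the archive's own)."""
    checks = {
        "pressure_hpa": (1, 1100),
        "temperature_c": (-120, 60),
        "wind_speed_ms": (0, 150),
    }
    problems = []
    for field, (lo, hi) in checks.items():
        value = level.get(field)
        if value is not None and not lo <= value <= hi:
            problems.append(field)
    return problems

# A two-level synthetic sounding; the second level has a bad temperature.
sounding = [
    {"pressure_hpa": 1013, "temperature_c": 15.2, "wind_speed_ms": 3.1},
    {"pressure_hpa": 500, "temperature_c": -210.0, "wind_speed_ms": 12.0},
]
flags = [plausible_level(lvl) for lvl in sounding]
```

The archive's full suite layers further checks (internal consistency, climatological outliers, vertical structure) on top of this kind of simple screen.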
Abstract
The recent dryness in California was unprecedented in the instrumental record. This article employs spatially explicit precipitation reconstructions for California in combination with instrumental data to provide perspective on this event since 1571. The period 2012–15 stands out as particularly extreme in the southern Central Valley and south coast regions, which likely experienced unprecedented precipitation deficits over this time, apart from considerations of increasing temperatures and drought metrics that combine temperature and moisture information. Some areas lost more than two years’ average moisture delivery during these four years, and full recovery to long-term average moisture delivery could typically take up to several decades in the hardest-hit areas. These results highlight the value of the additional centuries of information provided by the paleo record, which indicates the shorter instrumental record may underestimate the statewide recovery time by over 30%. The extreme El Niño that occurred in 2015/16 ameliorated recovery in much of the northern half of the state, and since 1571 very-strong-to-extreme El Niños during the first year after a 2012–15-type event reduce statewide recovery times by approximately half. The southern part of California did not experience the high precipitation anticipated, and the multicentury analysis suggests the north-wet–south-dry pattern for such an El Niño was a low-likelihood anomaly. Recent wetness in California motivated evaluation of recovery times when the first two years are relatively wet, suggesting the state is benefiting from a one-in-five (or lower) likelihood situation: the likelihood of full recovery within two years is ~1% in the instrumental data and even lower in the reconstruction data.
Abstract
The monthly global 2° × 2° Extended Reconstructed Sea Surface Temperature (ERSST) has been revised and updated from version 4 to version 5. This update incorporates the new ICOADS release 3.0 (R3.0), a decade of near-surface data from Argo floats, and a new estimate of centennial sea ice from HadISST2. A number of choices in aspects of quality control, bias adjustment, and interpolation have been substantively revised. The resulting ERSST estimates have more realistic spatiotemporal variations and better representation of high-latitude SSTs, and ship SST biases are now calculated relative to more accurate buoy measurements, while the global long-term trend remains about the same. Progressive experiments have been undertaken to highlight the effects of each change in data source and analysis technique upon the final product. The reconstructed SST is systematically decreased by 0.077°C, as the reference data source is switched from ship SST in ERSSTv4 to modern buoy SST in ERSSTv5. Furthermore, high-latitude SSTs are decreased by 0.1°–0.2°C by using sea ice concentration from HadISST2 over HadISST1. Changes arising from remaining innovations are mostly important at small space and time scales, primarily having an impact where and when input observations are sparse. Cross validations and verifications with independent modern observations show that the updates incorporated in ERSSTv5 have improved the representation of spatial variability over the global oceans, the magnitude of El Niño and La Niña events, and the decadal nature of SST changes over the 1930s–40s when observation instruments changed rapidly. Both long- (1900–2015) and short-term (2000–15) SST trends in ERSSTv5 remain significant as in ERSSTv4.
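The switch of reference from ship to buoy SST can be sketched as estimating a mean ship-minus-buoy offset from collocated observations and removing it from the ship record so buoys serve as the reference. The numbers and function name below are illustrative, not ERSSTv5's actual procedure:

```python
import statistics

def adjust_ship_to_buoy(ship_obs, collocated_pairs):
    """Estimate the mean ship-minus-buoy offset (degrees C) from
    collocated observations and remove it from the ship record."""
    offset = statistics.mean(s - b for s, b in collocated_pairs)
    return [s - offset for s in ship_obs], offset

# Synthetic collocated (ship, buoy) SST pairs and a short ship series.
pairs = [(20.15, 20.05), (18.42, 18.31), (22.90, 22.83), (19.55, 19.47)]
ship_series = [20.1, 19.8, 21.2]
adjusted, offset = adjust_ship_to_buoy(ship_series, pairs)
```

Because the offset is subtracted uniformly, the adjusted series is shifted cooler by a constant, which is consistent with the abstract's point that the reference switch lowers the level of the reconstruction without materially changing the long-term trend.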