Search Results
You are looking at 1–10 of 10 items for
- Author or Editor: S. F. Parker
Abstract
A practical model of atmospheric dispersion of a passive tracer based on systematic reduction of the second-order closure transport equations using Gaussian shape assumptions is presented. The model is comparable in complexity to conventional Gaussian plume models, but retains the capability to predict concentration fluctuation variance and to use direct measurements of turbulent velocity variances in a consistent manner. Comparison with laboratory data demonstrates the model's ability to produce reasonable predictions for the concentration field.
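For orientation, the conventional Gaussian plume solution to which the reduced model is compared in complexity has the standard textbook form (this equation is not reproduced from the paper; Q is the source strength, U the mean wind speed, H the effective release height, and σ_y, σ_z the lateral and vertical spread parameters):

$$
\bar{c}(x,y,z) \;=\; \frac{Q}{2\pi U \sigma_y \sigma_z}
\exp\!\left(-\frac{y^2}{2\sigma_y^2}\right)
\left[\exp\!\left(-\frac{(z-H)^2}{2\sigma_z^2}\right)
+ \exp\!\left(-\frac{(z+H)^2}{2\sigma_z^2}\right)\right]
$$

The reduced second-order closure model described above adds, at comparable cost, a prediction of the concentration fluctuation variance that this expression does not provide.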
Abstract
No abstract available.
Abstract
No abstract available.
Abstract
Detailed statistics of the fluctuating concentration field produced by large-eddy simulations (LES) of the chemically reactive mixing of two species in a convectively driven mixed layer are presented. The effect of the turbulent mixing on the effective reaction rate between the species is analyzed. The segregation between the species is shown to be significant for fast reactions, and therefore correct model predictions of the evolution of the species concentrations require an estimate of the segregation coefficient. Some simple modeling concepts for one-point second-order turbulence closure schemes are examined and compared with the LES results. The results are a promising indication that second-order closure schemes can be extended to provide a practical calculation of the turbulent mixing effects on fast chemical reactions.
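A standard way to write the effect described above (a textbook relation, not an equation taken from the paper): for a second-order reaction A + B → products, the mean reaction rate can be expressed as

$$
\overline{w} \;=\; k\,\overline{c_A}\,\overline{c_B}\,(1 + I_s),
\qquad
I_s \;=\; \frac{\overline{c_A' c_B'}}{\overline{c_A}\,\overline{c_B}},
$$

where the segregation coefficient I_s is negative when the reactants are poorly mixed, so the effective rate constant k(1 + I_s) falls below its well-mixed value. A closure scheme therefore needs an estimate of I_s (equivalently, of the covariance of the concentration fluctuations) to predict the evolution of the mean concentrations for fast reactions.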
Abstract
A long-range transport model based on turbulence closure concepts is described. The model extends the description of planetary boundary layer turbulent diffusion to the larger scales and uses statistical wind information to predict contaminant dispersion. The model also contains a prediction of the statistical fluctuations in the tracer concentration resulting from the unresolved velocity fluctuations. The dispersion calculation is made by means of a Lagrangian puff representation, allowing the use of time-dependent three-dimensional flow fields. Predictions of the ANATEX (Across North America Tracer Experiment) releases are compared with observations. Both 24-h average surface and short-term aircraft sampler concentrations are calculated using the high-resolution wind fields from the NMC Nested Grid Model. The statistical prediction is also tested using long-term average wind data.
Statistical uncertainty in the predictions, due to the unresolved wind fluctuations, is found to be small for the 24-h average surface concentrations obtained with the high-resolution winds but is very significant for the short-term aircraft sampler concentrations. A clipped normal probability distribution provides a reasonably good description of the overall cumulative distribution of the aircraft sampler concentrations. A reasonably good description of the 24-h surface concentrations is also obtained using only the long-term average wind statistics and a lognormal probability distribution for the concentration values.
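As a minimal sketch of the clipped-normal description mentioned above (an assumed illustration: the function and parameter values are hypothetical, not taken from the ANATEX analysis), the cumulative distribution of sampler concentrations can be modeled as a normal CDF with all sub-zero probability mass lumped at zero:

# Sketch: cumulative distribution of tracer concentration under a clipped-normal
# model. Negative values of an underlying normal variate are clipped to zero, so
# P(C = 0) = Phi(-mu/sigma) and P(C <= c) = Phi((c - mu)/sigma) for c >= 0.
import numpy as np
from scipy.stats import norm

def clipped_normal_cdf(c, mu, sigma):
    c = np.asarray(c, dtype=float)
    return np.where(c < 0.0, 0.0, norm.cdf((c - mu) / sigma))

# Hypothetical parameters, for illustration only
mu, sigma = 0.8, 1.2                        # mean and std of the underlying (unclipped) normal
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
print(clipped_normal_cdf(conc, mu, sigma))
print("intermittency P(C = 0) =", norm.cdf(-mu / sigma))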
Abstract
A realistic hindcast simulation of the Salish Sea, which encompasses the estuarine systems of Puget Sound, the Strait of Juan de Fuca, and the Strait of Georgia, is described for the year 2006. The model shows moderate skill when compared against hydrographic, velocity, and sea surface height observations over tidal and subtidal time scales. Analysis of the velocity and salinity fields allows the structure and variability of the exchange flow to be estimated for the first time from the shelf into the farthest reaches of Puget Sound. This study utilizes the total exchange flow formalism, which calculates volume transports and salt fluxes in an isohaline framework, and compares the results to previous estimates of exchange flow in the region. From this analysis, residence time distributions are estimated for Puget Sound and its major basins and are found to be markedly shorter than previous estimates. The difference arises because the model, together with the isohaline flux calculation, estimates the exchange flow more accurately. In addition, evidence is found to support the previously observed spring–neap modulation of stratification at the Admiralty Inlet sill. However, the calculated exchange flow increases at spring tides, exactly opposite to the conclusion reached from an Eulerian average of observations.
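A minimal sketch of a total-exchange-flow style calculation, assuming time-averaged per-cell transports and salinities on a cross section are already in hand (variable names and all numbers are illustrative, not values from the simulation):

# Sketch: bin transport through a section by salinity class and split the result
# into inflowing and outflowing exchange components (simplified TEF-style bookkeeping).
import numpy as np

def tef_inflow_outflow(transport, salinity, sbins):
    q, _ = np.histogram(salinity, bins=sbins, weights=transport)   # net transport per salinity class
    q_in = q[q > 0].sum()      # transport entering in its salinity classes
    q_out = q[q < 0].sum()     # transport leaving in its salinity classes
    return q_in, q_out

# Hypothetical per-cell values (m^3/s and psu), for illustration only
transport = np.array([12e3, 9e3, -8e3, -11e3])
salinity = np.array([33.5, 33.0, 30.5, 30.0])
q_in, q_out = tef_inflow_outflow(transport, salinity, sbins=np.arange(28.0, 35.0, 0.5))
volume = 1.7e11                                                    # m^3, illustrative basin volume
print("Q_in =", q_in, "m^3/s; flushing-time proxy ~", round(volume / q_in / 86400), "days")

Dividing a basin volume by the inflow gives only a crude flushing-time proxy for the residence time distributions discussed above, but it shows how the isohaline bookkeeping feeds directly into such estimates.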
Abstract
A new flexible gridded dataset of sea surface temperature (SST) since 1850 is presented and its uncertainties are quantified. This analysis [the Second Hadley Centre Sea Surface Temperature dataset (HadSST2)] is based on data contained within the recently created International Comprehensive Ocean–Atmosphere Data Set (ICOADS) database and so is superior in geographical coverage to previous datasets and has smaller uncertainties. Issues arising when analyzing a database of observations measured from very different platforms and drawn from many different countries with different measurement practices are introduced. Improved bias corrections are applied to the data to account for changes in measurement conditions through time. A detailed analysis of uncertainties in these corrections is included by exploring assumptions made in their construction and producing multiple versions using a Monte Carlo method. An assessment of total uncertainty in each gridded average is obtained by combining these bias-correction-related uncertainties with those arising from measurement errors and undersampling of intragrid box variability. These are calculated by partitioning the variance in grid box averages between real and spurious variability. From month to month in individual grid boxes, sampling uncertainties tend to be most important (except in certain regions), but on large-scale averages bias-correction uncertainties are more dominant owing to their correlation between grid boxes. Changes in large-scale SST through time are assessed by two methods. The linear warming between 1850 and 2004 was 0.52° ± 0.19°C (95% confidence interval) for the globe, 0.59° ± 0.20°C for the Northern Hemisphere, and 0.46° ± 0.29°C for the Southern Hemisphere. Decadally filtered differences for these regions over this period were 0.67° ± 0.04°C, 0.71° ± 0.06°C, and 0.64° ± 0.07°C.
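A minimal sketch of the kind of uncertainty combination described above, assuming independent measurement, sampling, and bias-correction components added in quadrature for one grid box (the actual HadSST2 procedure, including the Monte Carlo treatment of the bias corrections, is more involved; all numbers are illustrative):

# Sketch: total uncertainty of a grid-box average from three independent components.
# Measurement error averages down with the number of observations in the box;
# the sampling and bias-correction terms (already per grid-box average) do not.
import numpy as np

def total_uncertainty(sigma_measurement, n_obs, sigma_sampling, sigma_bias):
    return np.sqrt(sigma_measurement**2 / n_obs + sigma_sampling**2 + sigma_bias**2)

# Hypothetical values in degrees C, for illustration only
print(total_uncertainty(sigma_measurement=1.0, n_obs=25, sigma_sampling=0.3, sigma_bias=0.15))

When many grid boxes are averaged, the measurement and sampling terms shrink further, whereas the bias-correction term does not because it is correlated between boxes, which is why it dominates the large-scale averages, as noted above.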
Abstract
Uncertainties in observed records of atmospheric temperature aloft remain poorly quantified. This has resulted in considerable controversy regarding signals of climate change over recent decades from temperature records of radiosondes and satellites. This work revisits the problems associated with the removal of inhomogeneities from the historical radiosonde temperature records, and provides a method for quantifying uncertainty in an adjusted radiosonde climate record due to the subjective choices made during the data homogenization.
This paper presents an automated homogenization method designed to replicate the decisions made by manual judgment in the generation of an earlier radiosonde dataset [i.e., the Hadley Centre radiosonde temperature dataset (HadAT)]. A number of validation experiments have been conducted to test the system performance and impact on linear trends.
Using climate model data to simulate biased radiosonde data, the authors show that limitations in the homogenization method are sufficiently large to explain much of the tropical trend discrepancy between HadAT and estimates from satellite platforms and climate models. This situation arises from the combination of systematic (unknown magnitude) and random uncertainties (of order 0.05 K decade⁻¹) in the radiosonde data. Previous assessment of trends and uncertainty in HadAT is likely to have underestimated the systematic bias in tropical mean temperature trends. This objective assessment of radiosonde homogenization supports the conclusions of the synthesis report of the U.S. Climate Change Science Program (CCSP), and associated research, regarding potential bias in tropospheric temperature records from radiosondes.
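A deliberately simplified sketch of the neighbor-based breakpoint adjustment that such an automated system performs (illustrative only: this is not the HadAT code, and the series, noise levels, and break location are synthetic):

# Sketch: remove a step change in a station series using a neighbour-composite
# difference series, the basic idea behind automated radiosonde homogenization.
import numpy as np

def adjust_breakpoint(target, neighbour_composite, break_index):
    diff = target - neighbour_composite              # shared regional signal cancels, the jump remains
    step = diff[break_index:].mean() - diff[:break_index].mean()
    adjusted = target.copy()
    adjusted[:break_index] += step                   # align the early segment with the late segment
    return adjusted

# Synthetic monthly anomalies: a shared regional trend plus station noise,
# with an artificial -0.5 K jump in the target before month 120.
rng = np.random.default_rng(0)
regional = 0.001 * np.arange(240)
target = regional + rng.normal(0.0, 0.3, 240)
neighbours = regional + rng.normal(0.0, 0.1, 240)
target[:120] -= 0.5
adjusted = adjust_breakpoint(target, neighbours, break_index=120)
print("adjustment applied to early segment:", round(float(adjusted[:120].mean() - target[:120].mean()), 2), "K")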
Abstract
Biases and uncertainties in large-scale radiosonde temperature trends in the troposphere are critically reassessed. Realistic validation experiments are performed on an automatic radiosonde homogenization system by applying it to climate model data with four distinct sets of simulated breakpoint profiles. Knowledge of the “truth” permits a critical assessment of the ability of the system to recover the large-scale trends and a reinterpretation of the results when applied to the real observations.
The homogenization system consistently reduces the bias in the daytime tropical, global, and Northern Hemisphere (NH) extratropical trends but underestimates the full magnitude of the bias. Southern Hemisphere (SH) extratropical and all nighttime trends were less well adjusted owing to the sparsity of stations. The ability to recover the trends is dependent on the underlying error structure, and the true trend does not necessarily lie within the range of estimates. The implications are that tropical tropospheric trends in the unadjusted daytime radiosonde observations, and in many current upper-air datasets, are biased cold, but the degree of this bias cannot be robustly quantified. Therefore, remaining biases in the radiosonde temperature record may account for the apparent tropical lapse rate discrepancy between radiosonde data and climate models. Furthermore, the authors find that the unadjusted global and NH extratropical tropospheric trends are biased cold in the daytime radiosonde observations.
Finally, observing system experiments show that, if the Global Climate Observing System (GCOS) Upper Air Network (GUAN) were to make climate quality observations adhering to the GCOS monitoring principles, then one would be able to constrain the uncertainties in trends at a more comprehensive set of stations. This reaffirms the importance of running GUAN under the GCOS monitoring principles.
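The validation logic can be sketched as follows, assuming a synthetic "truth" series with a known trend and a known imposed offset (illustrative numbers only, not the paper's experimental design). The toy below only measures how far the fitted trend is pulled from the known truth; in the full experiments, the same comparison is repeated after homogenization to assess how much of the bias the system recovers:

# Sketch: impose a known spurious offset on a series with a known trend and
# measure how badly the fitted trend is biased before any adjustment.
import numpy as np

def linear_trend_per_decade(series):
    months = np.arange(series.size)
    return np.polyfit(months, series, 1)[0] * 120    # K per month -> K per decade

rng = np.random.default_rng(1)
truth = 0.00125 * np.arange(360) + rng.normal(0.0, 0.2, 360)   # ~0.15 K/decade over 30 years
biased = truth.copy()
biased[:180] += 0.3                                            # known spurious warm offset early in the record
print("true trend  :", round(linear_trend_per_decade(truth), 3), "K/decade")
print("biased trend:", round(linear_trend_per_decade(biased), 3), "K/decade")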
Abstract
There is no single reference dataset of long-term global upper-air temperature observations, although several groups have developed datasets from radiosonde and satellite observations for climate-monitoring purposes. The existence of multiple data products allows for exploration of the uncertainty in signals of climate variations and change. This paper examines eight upper-air temperature datasets and quantifies the magnitude and uncertainty of various climate signals, including stratospheric quasi-biennial oscillation (QBO) and tropospheric ENSO signals, stratospheric warming following three major volcanic eruptions, the abrupt tropospheric warming of 1976–77, and multidecadal temperature trends. Uncertainty estimates are based both on the spread of signal estimates from the different observational datasets and on the inherent statistical uncertainties of the signal in any individual dataset.
The large spread among trend estimates suggests that using multiple datasets to characterize large-scale upper-air temperature trends gives a more complete characterization of their uncertainty than reliance on a single dataset. For other climate signals, there is value in using more than one dataset, because signal strengths vary. However, the purely statistical uncertainty of the signal in individual datasets is large enough to effectively encompass the spread among datasets. This result supports the notion of an 11th climate-monitoring principle, augmenting the 10 principles that have now been generally accepted (although not generally implemented) by the climate community. This 11th principle calls for monitoring key climate variables with multiple, independent observing systems for measuring the variable, and multiple, independent groups analyzing the data.
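A minimal sketch of the comparison behind those statements, assuming several synthetic realizations of the same underlying trend that differ only by independent noise (illustrative: these are not the eight datasets examined in the paper):

# Sketch: compare the spread of trend estimates across datasets with the purely
# statistical standard error of the trend in any single dataset.
import numpy as np

def trend_and_stderr(series):
    t = np.arange(series.size)
    coeffs, cov = np.polyfit(t, series, 1, cov=True)
    return coeffs[0], float(np.sqrt(cov[0, 0]))

rng = np.random.default_rng(2)
datasets = [0.001 * np.arange(480) + rng.normal(0.0, 0.25, 480) for _ in range(8)]
trends = [trend_and_stderr(d)[0] for d in datasets]
print("spread of trends across datasets:", round(float(np.std(trends)), 5))
print("statistical stderr, one dataset :", round(trend_and_stderr(datasets[0])[1], 5))

With purely independent noise, as in this toy case, the two numbers are of the same order; structural differences between real observing systems and analysis choices are what can make the spread of trend estimates larger, which is the argument for maintaining multiple independent datasets and analysis groups.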