Search Results
You are looking at 1–10 of 34 items for
- Author or Editor: J.C. Wilson
Abstract
As part of the U.K. Hydrographic Office (UKHO)-sponsored Vertical Offshore Reference Frames (VORF) project, a high-resolution model of lowest astronomical tide (LAT) with respect to mean sea level has been developed for U.K.–Irish waters. In offshore areas the model relies on data from satellite altimetry, while in coastal areas data from a 3.5-km-resolution hydrodynamic tide-surge model and tide gauges have been included. To provide a smooth surface and predict tidal levels in unobserved areas, the data have been merged and interpolated using the thin plate spline method, which has been tuned by an empirical prediction test whereby the observed value at each tide gauge was removed from the solution space and the surrounding data were used to predict it. To allow for the complex coastal morphology, a sea distance function has been implemented within the data weighting, which is shown to significantly enhance the solution. The tuning process allows for independent validation, giving a standard error of the resulting surface of 0.2 m in areas with no tidal observations.
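The leave-one-out prediction test described above can be illustrated with a minimal sketch (Python with SciPy). The gauge locations and values below are synthetic, and the paper's sea-distance weighting is not reproduced; the thin plate spline here uses ordinary Euclidean distances.

```python
# Minimal sketch of a leave-one-out prediction test for a thin plate
# spline fit. Gauge data are synthetic; the sea-distance weighting used
# in the paper is omitted.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
pts = rng.uniform([-8.0, 49.0], [2.0, 61.0], size=(40, 2))  # lon, lat of gauges
lat_msl = rng.uniform(-4.0, -1.0, size=40)                  # LAT w.r.t. MSL (m)

errors = []
for i in range(len(pts)):
    keep = np.arange(len(pts)) != i
    # Fit the thin plate spline to every gauge except the i-th ...
    tps = RBFInterpolator(pts[keep], lat_msl[keep], kernel="thin_plate_spline")
    # ... then predict the withheld gauge and record the misfit.
    errors.append(tps(pts[i : i + 1])[0] - lat_msl[i])

print(f"empirical standard error: {np.std(errors):.2f} m")
```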
Abstract
The United Kingdom Meteorological Office (UKMO) has developed an airborne interferometer to act as a simulator for future satellite-based infrared meteorological sounders. The Airborne Research Interferometer Evaluation System (ARIES) consists of a modified commercial interferometer mounted on the UKMO C-130 aircraft. The instrument is sensitive to the wavelength range 3.3–18 μm and has a maximum optical path difference of ±1.037 cm. This paper describes the design and performance of ARIES, discusses instrument calibration, and presents some preliminary results. An important problem associated with the use of the new generation of high-spectral-resolution infrared meteorological sounders is that improvements are needed in knowledge of atmospheric spectroscopy and radiative transfer. These improvements are necessary to extract the promised vertical and absolute resolution in temperature and humidity retrievals from these new high-spectral-resolution sounders. By virtue of the extensive instrumentation available on the C-130 aircraft for observing and measuring basic meteorological and atmospheric parameters (e.g., in situ temperature, humidity, and ozone), it is hoped that ARIES will be an important tool for studying this issue.
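The quoted maximum optical path difference sets the instrument's unapodized spectral resolution via the common Fourier transform spectrometer rule of thumb δν ≈ 1/(2L); this is a generic approximation, not a figure quoted from the paper, and exact values depend on apodization:

```python
# Rule-of-thumb unapodized spectral resolution of a Fourier transform
# spectrometer: delta_nu ~ 1 / (2 * L), with L the maximum optical path
# difference. Approximate only; apodization degrades this figure.
L_cm = 1.037                     # maximum optical path difference (cm)
delta_nu = 1.0 / (2.0 * L_cm)    # spectral resolution (cm^-1)
print(f"~{delta_nu:.2f} cm^-1")  # ~0.48 cm^-1
```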
Abstract
Joint observations by radar and high-frequency sferics detectors at Georgia Institute of Technology provided unique data on the Atlanta tornado of 24 March 1975. The classic hook echo was detected by radar at a range of about 26 km, 15 min before the tornado touched down. While the tornado was on the ground the sferics burst rate was very low, despite very high values recorded immediately before and after this interval. This observation, together with visual reports of a strong cloud-to-ground discharge at the time of tornado touchdown, suggests an interaction of the tornado with the electric field of the storm.
Abstract
The Advanced Scatterometer (ASCAT) on the Meteorological Operational (MetOp) series of satellites is designed to provide data for the retrieval of ocean wind fields. Three transponders were used to give an absolute calibration, and the worst-case calibration error is estimated to be 0.15–0.25 dB.
In this paper the calibrated data are validated by comparing the backscatter from a range of naturally distributed targets against models developed from European Remote Sensing Satellite (ERS) scatterometer data.
For the Amazon rainforest it is found that the isotropic backscatter decreases from −6.2 to −6.8 dB over the incidence angle range. The ERS value is around −6.5 dB. All ASCAT beams are within 0.1 dB of each other. Rainforest backscatter over a 3-yr period is found to be very stable with annual changes of approximately 0.02 dB.
ASCAT ocean backscatter is compared against values from the C-band geophysical model function (CMOD-5) using ECMWF wind fields. A difference of approximately 0.2 dB below 55° incidence is found. Differences of over 1 dB above 55° are likely due to inaccuracies in CMOD-5, which has not been fully validated at large incidence angles. All beams are within 0.1 dB of each other.
Backscatter from regions of stable Antarctic sea ice is found to be consistent with model backscatter except at large incidence angles where the model has not been validated. The noise in the ice backscatter indicates that the normalized standard deviation of the backscatter values Kp is around 4.5%, which is consistent with the expected value.
These results agree well with the expected calibration accuracy and give confidence that the calibration has been successful and that ASCAT products are of high quality.
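The Kp statistic quoted above is simply the normalized standard deviation of the backscatter samples, computed in linear power units rather than decibels. A minimal sketch with synthetic sea ice backscatter (the sigma0 values below are made up):

```python
# Kp = normalized standard deviation of backscatter, computed in linear
# power units (not dB). The sigma0 samples here are synthetic.
import numpy as np

rng = np.random.default_rng(1)
sigma0_db = rng.normal(-15.0, 0.2, size=1000)  # synthetic sea ice sigma0 (dB)
sigma0_lin = 10.0 ** (sigma0_db / 10.0)        # convert dB -> linear power

kp = np.std(sigma0_lin) / np.mean(sigma0_lin)  # Kp = std / mean
print(f"Kp ~ {100 * kp:.1f}%")                 # ~4.6% for 0.2 dB of noise
```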
Abstract
The Auto-Nowcast System (ANC), a software system that produces time- and space-specific, routine (every 5 min) short-term (0–1 h) nowcasts of storm location, is presented. A primary component of ANC is its ability to identify and characterize boundary layer convergence lines. Boundary layer information is used along with storm and cloud characteristics to augment extrapolation with nowcasts of storm initiation, growth, and dissipation. A fuzzy logic routine is used to combine predictor fields that are based on observations (radar, satellite, sounding, mesonet, and profiler), a numerical boundary layer model and its adjoint, forecaster input, and feature detection algorithms. The ANC methodology is illustrated using nowcasts of storm initiation, growth, and dissipation. Statistical verification shows that ANC is able to routinely improve over extrapolation and persistence.
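The fuzzy-logic combination can be sketched generically: each predictor field is mapped through a membership function onto a common "interest" scale, the interest fields are combined as a weighted sum, and a threshold flags likely initiation regions. The field names, membership functions, weights, and threshold below are illustrative assumptions, not the ANC implementation:

```python
# Generic fuzzy-logic combination of predictor fields, in the spirit of
# (but not identical to) the ANC approach. All numbers are illustrative.
import numpy as np

def ramp(x, lo, hi):
    """Piecewise-linear membership function: 0 below lo, 1 above hi."""
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

rng = np.random.default_rng(4)
ny, nx = 4, 4
convergence = rng.random((ny, nx)) * 2e-3           # BL convergence (s^-1)
cloud_depth = rng.random((ny, nx)) * 8.0            # cloud depth (km)
refl_trend = rng.standard_normal((ny, nx)) * 10.0   # dBZ trend per 30 min

# Weighted sum of interest fields; weights sum to 1.
interest = (0.5 * ramp(convergence, 2e-4, 1e-3)
            + 0.3 * ramp(cloud_depth, 2.0, 6.0)
            + 0.2 * ramp(refl_trend, 0.0, 10.0))

nowcast_initiation = interest > 0.7  # boolean initiation nowcast field
```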
Abstract
We develop a stochastic North Atlantic hurricane track model whose climate inputs are Atlantic main development region (MDR) and Indo-Pacific (IP) sea surface temperatures and produce extremely long model simulations for 58 different climates, each one conditioned on 5 yr of observed SSTs from 1950 to 2011—hereafter referred to as medium-term (MT) views.
Stringent tests are then performed to demonstrate that MT simulations are better predictors of hurricane landfalls than a long-term view conditioned on the entire SST time series from 1950 to 2011.
In this analysis we extrapolate beyond the historical record, but not in terms of a forecast of future conditions. Rather, we attempt to define, within the limitations of the modeling approach, the magnitude of extreme events that could have materialized in the past at fixed probability thresholds, and to assess the likelihood of the observed landfalls given such estimates.
Finally, a loss proxy is built, and the value of the analysis is demonstrated from a simplified property and casualty insurance perspective. Medium-term simulations of hurricane activity are used to set the strategy of reinsurance coverage purchased by a hypothetical primary insurer, leading to improved solvency margins.
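One simple way to score a medium-term view against observations is to treat the simulated landfall rate as a Poisson mean and evaluate the probability of the observed count. This is an illustrative scoring sketch, not the paper's actual test procedure, and all rates and counts below are hypothetical:

```python
# Score competing views of landfall risk by Poisson likelihood of the
# observed annual landfall count. All numbers are hypothetical.
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(K = k) for a Poisson distribution with mean lam."""
    return lam**k * exp(-lam) / factorial(k)

mt_rate = 1.8   # landfalls/yr from a medium-term simulation (hypothetical)
lt_rate = 1.5   # landfalls/yr from the long-term view (hypothetical)
observed = 2    # observed landfalls in one year (hypothetical)

print("MT likelihood:", poisson_pmf(observed, mt_rate))
print("LT likelihood:", poisson_pmf(observed, lt_rate))
```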
Abstract
The dynamics of lateral circulation in the Passaic River estuary are examined in this modeling study. The pattern of lateral circulation varies significantly over a tidal cycle as a result of the temporal variation of stratification induced by tidal straining. During highly stratified ebb tides, the lateral circulation exhibits a vertical two-cell structure. Strong stratification suppresses vertical mixing in the deep channel, whereas the shoal above the halocline remains relatively well mixed. As a result, in the upper layer, the lateral asymmetry of vertical mixing produces denser water on the shoal and fresher water over the thalweg. This density gradient drives a circulation with surface currents directed toward the shoal, while the currents at the base of the pycnocline are directed toward the thalweg. In the lower layer, the lateral circulation tends to reduce the tilting of isopycnals and gradually diminishes at the end of the ebb tide. A lateral baroclinic pressure gradient, generated by differential diffusion (a lateral asymmetry in vertical mixing), is the dominant driving force for lateral circulation during stratified ebb tides. Over the thalweg, vertical mixing is strong during the flood and weak during the ebb; over the shoal, the tidally periodic stratification follows the opposite cycle. Lateral straining tends to enhance stratification during flood tides, and vertical diffusion maintains the relatively well-mixed water column over the shoal during the stratified ebb tides.
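For reference, the lateral baroclinic pressure gradient invoked above takes the standard Boussinesq form (this is the textbook expression, not a formula quoted from the paper), with $y$ the cross-channel coordinate, $\rho_0$ a reference density, $\eta$ the free surface, and $z$ the depth of evaluation:

$$
-\frac{1}{\rho_0}\frac{\partial p_b}{\partial y} \;=\; -\frac{g}{\rho_0}\int_{z}^{\eta}\frac{\partial \rho}{\partial y}\,dz' .
$$

A cross-channel density contrast set up by differential mixing thus drives the lateral flow directly, which is the mechanism the abstract describes.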
Abstract
The Edmonton monthly mean temperature record has been examined using the concept of the cumulative high-frequency monthly mean temperature anomaly, I. The time sequence of I is shown to exhibit bounded, oscillatory, nonperiodic behavior.
At times, features such as annual and quasi-triennial cycles and sudden reversals appear. Some implications of these observations for interannual climate modeling and forecasting are discussed.
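A minimal sketch of a cumulative high-frequency monthly anomaly of this kind: remove the climatological mean for each calendar month, then accumulate the residuals. The temperature series below is synthetic, and the paper's exact definition of "high frequency" may differ:

```python
# Cumulative monthly anomaly: subtract each calendar month's
# climatological mean, then take the running sum of the residuals.
# The temperature series is synthetic.
import numpy as np

rng = np.random.default_rng(2)
n_years = 30
months = np.tile(np.arange(12), n_years)
temps = 10 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 2, 12 * n_years)

clim = np.array([temps[months == m].mean() for m in range(12)])
anomaly = temps - clim[months]  # high-frequency monthly anomaly
I = np.cumsum(anomaly)          # cumulative anomaly sequence
```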
Abstract
Quantifying past climate variation and attributing its causes improves our understanding of the natural variability of the climate system. Tree-ring-based proxies have provided skillful and highly resolved reconstructions of temperature and hydroclimate of the last millennium. However, like all proxies, they are subject to uncertainties arising from varying data quality, coverage, and reconstruction methodology. Previous studies have suggested that biologically based memory processes could cause spectral biases in climate reconstructions. This study determines the effects of such biases on reconstructed temperature variability and the resultant implications for detection and attribution studies. We find that introducing persistent memory, reflecting the spectral properties of tree-ring data, can change the variability of pseudoproxy reconstructions compared to the surrogate climate and resolve certain model–proxy discrepancies. This is especially the case for proxies based on ring-width data. Such memory inflates the difference between the Medieval Climate Anomaly and the Little Ice Age and suppresses and extends the cooling in response to volcanic eruptions. When accounting for memory effects, climate model data can reproduce long-term cooling after volcanic eruptions, as seen in proxy reconstructions. Results of detection and attribution studies show that signals in reconstructions as well as residual unforced variability are consistent with those in climate models when the model fingerprints are adjusted to reflect autoregressive memory as found in tree rings.
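A sketch of imposing autoregressive "biological memory" on a pseudoproxy, in the spirit of (but not identical to) the study's procedure: an AR(1) filter reddens the climate signal before white proxy noise is added. The AR coefficient and noise level are illustrative assumptions:

```python
# Impose AR(1) persistence on a surrogate climate signal, then add white
# proxy noise. phi and the noise scale are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
climate = rng.normal(size=1000)  # surrogate climate signal (arbitrary units)

phi = 0.6                        # assumed AR(1) memory coefficient
proxy = np.empty_like(climate)
proxy[0] = climate[0]
for t in range(1, len(climate)):
    # Each year's ring width carries over a fraction of last year's value.
    proxy[t] = phi * proxy[t - 1] + (1 - phi) * climate[t]

pseudoproxy = proxy + rng.normal(scale=0.5, size=proxy.size)  # add proxy noise
```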
Abstract
Statistical and case study–oriented comparisons of the quantitative precipitation nowcasting (QPN) schemes demonstrated during the first World Weather Research Programme (WWRP) Forecast Demonstration Project (FDP), held in Sydney, Australia, during 2000, served to confirm many of the earlier reported findings regarding QPN algorithm design and performance. With a few notable exceptions, nowcasting algorithms based upon the linear extrapolation of observed precipitation motion (Lagrangian persistence) were generally superior to more sophisticated, nonlinear nowcasting methods. Centroid trackers [Thunderstorm Identification, Tracking, Analysis and Nowcasting System (TITAN)] and pattern matching extrapolators using multiple vectors (Auto-nowcaster and Nimrod) were most reliable in convective scenarios. During widespread, stratiform rain events, the pattern matching extrapolators were superior to centroid trackers and wind advection techniques (Gandolf, Nimrod).
There is some limited case study and statistical evidence from the FDP to support the use of more sophisticated, nonlinear QPN algorithms. In a companion paper in this issue, Wilson et al. demonstrate the advantages of combining linear extrapolation with algorithms designed to predict convective initiation, growth, and decay in the Auto-nowcaster. Ebert et al. show that the application of a nonlinear scheme [Spectral Prognosis (S-PROG)] designed to smooth precipitation features at a rate consistent with their observed temporal persistence tends to produce a nowcast that is superior to Lagrangian persistence in terms of rms error. However, the value of this approach in severe weather forecasting is called into question due to the rapid smoothing of high-intensity precipitation features.
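Lagrangian persistence, the baseline that most schemes above were judged against, can be sketched very simply: advect the latest rainfall field along an estimated motion vector and present the displaced field as the nowcast. Real QPN schemes estimate spatially varying motion (e.g., by pattern matching); the single whole-field shift below is purely illustrative:

```python
# Minimal Lagrangian persistence nowcast: shift the latest rain field by
# a single motion vector. Real schemes use spatially varying motion.
import numpy as np

def lagrangian_persistence(field, motion_xy, lead_steps):
    """Shift `field` by motion_xy (pixels per step) over `lead_steps` steps."""
    dx, dy = (int(round(lead_steps * m)) for m in motion_xy)
    nowcast = np.roll(field, shift=(dy, dx), axis=(0, 1))
    # Zero the wrapped-around edges so rain does not re-enter the domain.
    if dy > 0: nowcast[:dy, :] = 0
    elif dy < 0: nowcast[dy:, :] = 0
    if dx > 0: nowcast[:, :dx] = 0
    elif dx < 0: nowcast[:, dx:] = 0
    return nowcast

rain = np.zeros((50, 50))
rain[20:25, 10:15] = 5.0  # synthetic rain cell (mm/h)
forecast_30min = lagrangian_persistence(rain, motion_xy=(2.0, 1.0), lead_steps=6)
```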