Search Results
Showing 1–4 of 4 items for:
- Author or Editor: JOHN HENDERSON
- Journal of Atmospheric and Oceanic Technology
Abstract
The Dines pressure tube anemometer was the primary wind speed recording instrument used in Australia until it was replaced by Synchrotac cup anemometers in the 1990s. Simultaneous observations of the gust wind speeds recorded using both types of anemometers during tropical cyclones have, however, raised questions about the equivalency of the gust wind speeds recorded using the two instruments. An experimental study of the response of both versions of the Dines anemometer used in Australia shows that the response of the anemometer is dominated by the motion of the float manometer used to record the wind speed. The amplitude response function shows the presence of two resonant peaks, with the amplitude and frequency of the peaks depending on the instrument version and the mean wind speed. Comparison of the gust wind speeds recorded using Dines and Synchrotac anemometers using random process and linear system theory shows that, on average, the low-speed Dines anemometer records values 2%–5% higher than those recorded using a Synchrotac anemometer under the same conditions, while the high-speed Dines anemometer records values 3%–7% higher, depending on the mean wind speed and turbulence intensity. These differences are exacerbated with the adoption of the WMO-recommended 3-s moving average gust wind speed when reporting the Synchrotac anemometer gust wind speeds, rising to 6%–12% and 11%–19% for low- and high-speed Dines anemometers, respectively. These results are consistent with both field observations and an independent extreme value analysis of simultaneously observed gust wind speeds at seven sites in northern Australia.
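The effect of the WMO-recommended 3-s moving average on reported gusts can be illustrated with a short sketch. This is not the authors' analysis code: the synthetic wind record, sample rate, mean speed, and turbulence intensity below are all hypothetical, chosen only to show why a moving-average gust is systematically lower than a shorter-duration peak, which widens the apparent Dines–Synchrotac difference.

```python
import numpy as np

# Hypothetical 10-minute anemometer trace: mean speed with Gaussian
# turbulence. Values are illustrative, not from the study.
rng = np.random.default_rng(0)
fs = 10                        # sample rate in Hz (assumed)
t = np.arange(0, 600, 1 / fs)  # 10-minute record
mean_speed, turb_intensity = 25.0, 0.15
u = mean_speed * (1 + turb_intensity * rng.standard_normal(t.size))

def moving_average_gust(u, fs, window_s):
    """Peak of a moving average taken over `window_s` seconds."""
    n = int(window_s * fs)
    kernel = np.ones(n) / n
    return np.convolve(u, kernel, mode="valid").max()

raw_gust = u.max()                         # single-sample peak
gust_3s = moving_average_gust(u, fs, 3.0)  # WMO-style 3-s gust
# Averaging over 3 s necessarily lowers the reported peak gust.
```

Because the 3-s average smooths out the highest-frequency fluctuations, reporting Synchrotac gusts this way lowers them relative to the resonance-amplified Dines readings, consistent with the larger percentage differences quoted above.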
Abstract
As part of its mandate to oversee the design of measurement networks for future weather and climate observing needs, the North American Atmospheric Observing System (NAOS) program hypothesized that replacing some of the existing radiosonde stations in the continental United States (CONUS) with another observing system would have little impact on weather forecast accuracy. The consequences of this hypothesis for climate monitoring over North America (NA) are considered here by comparing estimates of multidecadal trends in seasonal mean 500-mb temperature (T) integrated regionally over CONUS or NA, made with and without the 14 upper-air stations initially targeted for replacement. The trend estimates are obtained by subsampling gridded reanalysis fields at points nearest the 78 (126) existing CONUS (NA) radiosonde stations and at these points less the 14 stations. Trends in T for CONUS and NA during each season are also estimated based on the full reanalysis grid, but regardless of the sampling strategy, differences in trends are small and statistically insignificant. A more extreme reduction of the existing radiosonde network is also considered here, namely, one associated with the Global Climate Observing System (GCOS), which includes only 6 (14) stations in CONUS (NA). Again, however, trends for CONUS or NA based on the GCOS sampling strategy are not significantly different from those based on the current network, despite the large difference in station coverage. Estimates of continental-scale trends in 500-mb temperature therefore appear to be robust, whether based on the existing North American radiosonde network or on a range of potential changes thereto. This result depends on the large spatial scale of the underlying tropospheric temperature trend field; other quantities of interest for climate monitoring may be considerably more sensitive to the number and distribution of upper-air stations.
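The subsampling comparison described above can be sketched in a few lines. The synthetic field, grid size, trend magnitude, and the random choice of 14 "station" points below are all illustrative assumptions, not the NAOS study's data; the sketch only shows why a large-scale trend field yields nearly identical trend estimates from the full grid and from a sparse station subsample.

```python
import numpy as np

# Synthetic gridded field with a known, spatially uniform trend of
# 0.02 units per year plus grid-point noise (all values hypothetical).
rng = np.random.default_rng(1)
years = np.arange(1958, 2000)
nlat, nlon = 20, 30
true_trend = 0.02
field = (true_trend * (years - years[0])[:, None, None]
         + 0.5 * rng.standard_normal((years.size, nlat, nlon)))

def regional_trend(series, years):
    """Least-squares slope of an area-mean time series."""
    return np.polyfit(years, series, 1)[0]

# Full-grid estimate: area mean over every grid point.
t_full = regional_trend(field.mean(axis=(1, 2)), years)

# "Station" estimate: mean over 14 randomly chosen grid points,
# standing in for sampling at the nearest points to radiosonde sites.
idx = rng.choice(nlat * nlon, size=14, replace=False)
sub = field.reshape(years.size, -1)[:, idx].mean(axis=1)
t_sub = regional_trend(sub, years)
# Both estimates recover the underlying large-scale trend.
```

Because the trend is coherent across the domain, the sparse subsample averages down the grid-point noise almost as effectively as the full grid does, mirroring the paper's finding that even the small GCOS network reproduces the continental-scale trend.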
Abstract
The National Polar-Orbiting Operational Environmental Satellite System (NPOESS) requires improved accuracy in the retrieval of sea surface skin temperature (SSTS) from its Visible Infrared Imager Radiometer Suite (VIIRS) sensor over the capability to retrieve bulk sea surface temperature (SSTB) that has been demonstrated with currently operational National Oceanic and Atmospheric Administration (NOAA) satellites carrying the Advanced Very High Resolution Radiometer (AVHRR) sensor. Statistics show an existing capability to retrieve SSTB with a 1σ accuracy of about 0.8 K in the daytime and 0.6 K with nighttime data. During the NPOESS era, a minimum 1σ SSTS measurement uncertainty of 0.5 K is required during daytime and nighttime conditions, while 0.1 K is desired. Simulations have been performed, using PACEOS™ scene generation software and the multichannel sea surface temperature (MCSST) algorithms developed by NOAA, to better understand the implications of this more stringent requirement on algorithm retrieval methodologies and system design concepts. The results suggest that minimum NPOESS SSTS accuracy requirements may be satisfied with sensor NEΔT values of approximately 0.12 K, which are similar to the AVHRR sensor design specifications. However, error analyses of retrieved SSTB from AVHRR imagery suggest that these more stringent NPOESS requirements may be difficult to meet with existing MCSST algorithms. Thus, a more robust algorithm, a new retrieval methodology, or more stringent system characteristics may be needed to satisfy SSTS measurement uncertainty requirements during the NPOESS era. It is concluded that system-level simulations must accurately model all relevant phenomenology and any new algorithm development should be referenced against in situ observations of ocean surface skin temperatures.
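A commonly published form of the MCSST split-window retrieval can be sketched as follows. The coefficients below are placeholders, not NOAA's operational values, and the brightness temperatures are hypothetical; the sketch only shows the structure of the algorithm family discussed in the abstract, where small channel-differencing errors (set by sensor NEΔT) propagate directly into the retrieved SST.

```python
import numpy as np

def mcsst_split_window(t11, t12, sat_zenith_deg,
                       a=1.0, b=2.7, c=0.7, d=-270.0):
    """Split-window MCSST form: brightness temperatures (K) in the
    ~11 and ~12 micron channels, with a zenith-angle path-length
    correction. Coefficients a-d are illustrative placeholders."""
    sec_theta = 1.0 / np.cos(np.radians(sat_zenith_deg))
    return (a * t11
            + b * (t11 - t12)
            + c * (t11 - t12) * (sec_theta - 1.0)
            + d)

# Hypothetical nadir brightness temperatures of 292.0 K and 290.5 K:
sst = mcsst_split_window(292.0, 290.5, 0.0)
```

The (t11 - t12) difference term corrects for water vapor absorption, so channel noise enters the retrieval amplified by the coefficients b and c; this is why the 0.12 K NEΔT figure quoted above is so consequential for meeting the 0.5 K uncertainty requirement.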
Abstract
Data assimilation approaches that use ensembles to approximate a Kalman filter have many potential advantages for oceanographic applications. To explore the extent to which this holds, the Estuarine and Coastal Ocean Model (ECOM) is coupled with a modern data assimilation method based on the local ensemble transform Kalman filter (LETKF), and a series of simulation experiments is conducted. In these experiments, a long ECOM “nature” run is taken to be the “truth.” Observations are generated at analysis times by perturbing the nature run at randomly chosen model grid points with errors of known statistics. A diverse collection of model states is used for the initial ensemble. All experiments use the same lateral boundary conditions and external forcing fields as in the nature run. In the data assimilation, the analysis step combines the observations and the ECOM forecasts using the Kalman filter equations. As a control, a free-running forecast (FRF) is made from the initial ensemble mean to check the relative importance of external forcing versus data assimilation on the analysis skill. Results of the assimilation cycle and the FRF are compared to truth to quantify the skill of each.
The LETKF performs well for the cases studied here. After just a few assimilation cycles, the analysis errors are smaller than the observation errors and are much smaller than the errors in the FRF. The assimilation quickly eliminates the domain-averaged bias of the initial ensemble. The filter accurately tracks the truth at all data densities examined, from observations at 50% of the model grid points down to 2% of the model grid points. As the data density increases, the ensemble spread, bias, and error standard deviation decrease. As the ensemble size increases, the ensemble spread increases and the error standard deviation decreases. Increases in the size of the observation error lead to a larger ensemble spread but have a small impact on the analysis accuracy.
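The analysis step described above can be sketched as a toy ensemble-space Kalman update. This is the standard ETKF form at the core of an LETKF, shown here without localization; the state size, ensemble size, error variances, and the initial-ensemble bias are illustrative assumptions, not values from the ECOM experiments.

```python
import numpy as np

# Toy single analysis step: a biased forecast ensemble is corrected
# toward observations via the Kalman filter equations in ensemble space.
rng = np.random.default_rng(2)
n, k = 10, 20                  # state size, ensemble size (assumed)
r = 0.05                       # observation error variance, R = r * I

truth = np.sin(np.linspace(0.0, 2.0 * np.pi, n))
# Forecast ensemble with a domain-wide bias of 1.0, mimicking the
# initial-ensemble bias that the assimilation must remove.
X = (truth + 1.0)[:, None] + 0.5 * rng.standard_normal((n, k))
y_obs = truth + np.sqrt(r) * rng.standard_normal(n)  # observe every point

x_mean = X.mean(axis=1)
Xp = X - x_mean[:, None]               # ensemble perturbations
Yp = Xp                                # obs-space perturbations (H = I here)
innov = y_obs - x_mean                 # innovation vector

# Kalman filter equations solved in the k-dimensional ensemble space.
C = Yp.T / r                           # Yp^T R^{-1}
Pa = np.linalg.inv((k - 1) * np.eye(k) + C @ Yp)
w_mean = Pa @ (C @ innov)              # weights for the mean update
vals, vecs = np.linalg.eigh((k - 1) * Pa)
Wa = vecs @ np.diag(np.sqrt(vals)) @ vecs.T   # symmetric square root
Xa = x_mean[:, None] + Xp @ (w_mean[:, None] + Wa)  # analysis ensemble

fg_err = np.abs(x_mean - truth).mean()
an_err = np.abs(Xa.mean(axis=1) - truth).mean()
# One analysis step pulls the biased forecast mean toward the truth.
```

Even this single global update removes most of the imposed bias, which is the small-scale analogue of the paper's observation that the assimilation quickly eliminates the domain-averaged bias of the initial ensemble; the LETKF additionally performs this update locally around each grid point.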