Search Results
Showing items 1–10 of 33 for Author or Editor: J. S. Hall
Abstract
No abstract.
Abstract
Side-scan sonars operating at 80–250 kHz have been deployed to produce narrow beams directed parallel and normal to shore on a gently sloping beach. These provide measurements of processes (such as wave propagation) seaward of the edge of the surf zone. Shoreward propagation of sound into the surf zone, and hence useful information retrieval from this zone, is prevented, however, by high bubble or suspended sediment absorption at its outer edge, as found in earlier Doppler sonar studies at 195 kHz by J. A. Smith. The shoreward limit of acoustic propagation has a variable structure related to incident wave groups, to the position at which waves break, and to dynamical processes within the surf zone that determine the position of the bubble or suspended sediment boundary.
Abstract
Observations of Langmuir circulation obtained using upward-pointing, bottom-mounted sonars are described, and a methodology for using the data to estimate the dispersion of floating particles is suggested. Observations of linear bands of acoustic scatterers separated by 2–20 m and detected using side-scan sonar in Loch Ness, Scotland, and in the southern North Sea are ascribed to subsurface bubbles in the convergence zones produced by Langmuir circulation. Data from the two observation sites are compared. The sonar is able to monitor the variability of the patterns over many hours. When the currents are sufficiently small, as in Loch Ness, individual bubble clouds produced by breaking waves remain in the beam long enough for their speed to be resolved, and the rate of convergence into the bands can be estimated; it increases linearly with wind speed. The acoustic data and direct measurements using current meters are used to derive estimates of the response time of bubble bands to changes in wind, and of their mean separation, length, and persistence time. The bands in Loch Ness are shorter, but persist longer, than those in similar wind conditions in the relatively shallow and well-mixed North Sea. It is suggested that these differences may be ascribed to turbulence generated by the shear stress of the strong tidal currents on the seabed in the North Sea, a factor absent in Loch Ness. Models driven by the sonar data are devised to simulate the dispersion of plumes of floating particles released from a fixed position in a field of Langmuir circulation advected by tidal currents. The estimates of diffusivities increase with wind speed but are sensitive to the choice of some underdetermined parameters. The resulting estimates of the lateral dispersion of floating particles overlap the range of those of Faller and Auer.
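As a rough illustration of the kind of dispersion model described above, the sketch below advects floating particles with a tidal current plus a sinusoidal Langmuir convergence field, renewing the band pattern at a fixed cell lifetime; the band spacing, convergence speed, and lifetime are hypothetical placeholders, not values from the paper.

```python
# Minimal sketch (not the authors' model) of floating-particle dispersion
# in a Langmuir-circulation field advected by a tidal current.
import numpy as np

rng = np.random.default_rng(0)

L_BAND = 10.0      # assumed band separation (m); the paper reports 2-20 m
W_CONV = 0.02      # assumed surface convergence speed (m/s)
U_TIDE = 0.3       # assumed cross-band tidal advection (m/s)
T_CELL = 600.0     # assumed cell persistence time (s)
DT, N_STEPS, N_PART = 5.0, 2000, 500

x = np.zeros(N_PART)               # cross-band positions of floaters (m)
phase = rng.uniform(0, L_BAND)     # random offset of the band pattern

for n in range(N_STEPS):
    t = n * DT
    # renew the band pattern when cells die and reform
    if t % T_CELL < DT:
        phase = rng.uniform(0, L_BAND)
    # sinusoidal surface velocity converging toward the band lines
    u_lc = -W_CONV * np.sin(2 * np.pi * (x - phase) / L_BAND)
    x += (U_TIDE + u_lc) * DT

# crude lateral diffusivity from the ensemble spread: K ~ var(x) / (2 t)
t_total = N_STEPS * DT
K = np.var(x - x.mean()) / (2 * t_total)
print(f"apparent lateral diffusivity ~ {K:.3f} m^2/s")
```

Dispersion in this toy picture comes entirely from the random renewal of the band pattern; particles trapped in one convergence line are released and re-collected by the next, giving a random walk with step size set by the band spacing.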
Abstract
A self-contained instrument, the Autonomously Recording Inverted Echo Sounder (ARIES II), has been constructed and tested in a 25-day mooring deployment near the edge of the continental shelf west of Scotland. The instrument carries two 250-kHz side-scan sonars and can record sonar data sampled at 3.2 kHz for 168 h with the sonars operating at a pulse repetition rate of 2 Hz. The mean water depth was 146 m. ARIES II was positioned at a mean transducer depth of 34.6 m with the sonars directed upward at 20° to the horizontal to obtain acoustic returns from targets at or near the sea surface. The instrument was preprogrammed to record continuously over periods of 2, 4, and 13 h, the last to cover the M2 tidal cycle dominant in the area.
Sonographs are presented to illustrate observations of surface waves and wave groups, internal solitons, rain showers, and Langmuir circulation. An analysis is made of the effects of surface waves, currents, and internal waves on the instrument. The potential use of the instrument is demonstrated in providing estimates of the propagation direction and speed of internal waves, as well as in estimating the drift, orientation, and mean separation of Langmuir bands. The separation is found to increase with wind speed.
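The recording figures quoted above fix a simple data budget. A back-of-envelope sketch, assuming a nominal 1500 m s⁻¹ sound speed (not stated in the abstract):

```python
# Recording budget implied by the ARIES II figures quoted above
# (3.2 kHz sampling, 2 Hz ping rate, 168 h capacity).
C = 1500.0          # assumed sound speed in seawater (m/s)
FS = 3200.0         # sample rate (Hz), from the abstract
PRR = 2.0           # pulse repetition rate (Hz), from the abstract
HOURS = 168.0       # recording capacity (h), from the abstract

range_max = C / (2 * PRR)          # maximum unambiguous range per ping
range_bin = C / (2 * FS)           # range covered by one sample
samples_per_ping = FS / PRR        # samples recorded between pings
pings = HOURS * 3600 * PRR
total_samples = pings * samples_per_ping

print(f"unambiguous range : {range_max:.0f} m")     # 375 m
print(f"range bin         : {range_bin:.3f} m")     # ~0.23 m
print(f"samples per ping  : {samples_per_ping:.0f}")  # 1600
print(f"total samples     : {total_samples:.3e} per sonar channel")
```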
Abstract
The advances in our understanding of extratropical atmosphere–ocean interaction over the past decade and a half are examined, focusing on the atmospheric response to sea surface temperature (SST) anomalies. The main goal of the paper is to assess what has been learned from general circulation model (GCM) experiments over roughly the past two decades. Observational evidence regarding the nature of the interaction and dynamical theory of atmospheric anomalies forced by surface thermal anomalies is reviewed. Three types of GCM experiments used to address this problem are then examined: models with fixed climatological conditions and idealized, stationary SST anomalies; models with seasonally evolving climatology forced with realistic, time-varying SST anomalies; and models coupled to an interactive ocean. From representative recent studies, it is argued that the extratropical atmosphere does respond to changes in underlying SST, although the response is small compared to internal (unforced) variability. Two types of interactions govern the response. One is an eddy-mediated process, in which a baroclinic response to thermal forcing induces and combines with changes in the position or strength of the storm tracks. This process can lead to an equivalent barotropic response that feeds back positively on the ocean mixed layer temperature. The other is a linear, thermodynamic interaction in which an equivalent-barotropic low-frequency atmospheric anomaly forces a change in SST and then experiences reduced surface thermal damping due to the SST adjustment. Both processes contribute to an increase in the variance and persistence of low-frequency atmospheric anomalies and, in fact, may act together in the natural system.
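The second, thermodynamic interaction is often illustrated with a two-variable linear stochastic model in the spirit of simple coupled energy-balance studies (e.g., Barsugli and Battisti). The sketch below uses illustrative damping and coupling rates, not values from the paper, to show the qualitative effect: letting SST adjust increases the variance and persistence of the atmospheric anomaly.

```python
# Toy stochastic model: atmospheric anomaly a forces SST T, and the
# adjusted SST reduces the effective thermal damping on a.
import numpy as np

rng = np.random.default_rng(1)
DT, N = 1.0, 200_000                    # time step (days), record length
LAM_A, LAM_O, C = 1/8, 1/90, 1/40       # damping and coupling rates (1/day)

def run(coupled: bool) -> np.ndarray:
    a = T = 0.0
    out = np.empty(N)
    for i in range(N):
        noise = rng.standard_normal()   # white-noise weather forcing
        a += DT * (-LAM_A * a + (C * T if coupled else 0.0)) + noise
        T += DT * (-LAM_O * T + C * a)
        out[i] = a
    return out

for coupled in (False, True):
    a = run(coupled)
    # lag-10-day autocorrelation as a crude persistence measure
    r = np.corrcoef(a[:-10], a[10:])[0, 1]
    print(f"coupled={coupled}: var={a.var():.1f}, lag-10 autocorr={r:.2f}")
```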
Abstract
The rate of dissipation of turbulent kinetic energy has been measured with airfoil probes mounted on an autonomous vehicle, Autosub, on constant-depth legs at 2–10 m below the surface in winds up to 14 m s⁻¹. The observations are mostly in an area limited by fetch to 26 km where the pycnocline depth is about 20 m. At the operational depths of 1.55–15.9 times the significant wave height Hs, and in steady winds of about 11.6 m s⁻¹ when the wave age is 11.7–17.2, dissipation is found to be lognormally distributed with a law-of-the-wall variation with depth and friction velocity. Breaking waves, leaving clouds of bubbles in the water, are detected ahead of the Autosub by a forward-pointing sidescan sonar, and the dissipation is measured when the clouds are subsequently reached. Bands of bubbles resulting from the presence of Langmuir circulation are identified by a semiobjective method that seeks continuity of band structure recognized by both forward- and sideways-pointing sidescan sonars. The times at which bands are crossed are determined and are used to relate dissipation rates and other measured parameters to the location of Langmuir bands. Shear-induced "temperature ramps" are identified with large horizontal temperature gradients. The turbulence measurements are consequently related to breaking waves, the bubble clouds, Langmuir circulation, and temperature ramps, and therefore to the principal processes of mixing in the near-surface layer of the ocean, all of which are found to have associated patterns of turbulent dissipation rates. A large proportion of the highest values of dissipation rate occur within bubble clouds. Dissipation is enhanced in the convergence region of Langmuir circulation at depths to about 10 m, and on the colder, bubble-containing side of temperature ramps associated with water advected downward from near the surface. Near the sea surface, turbulence is dominated by breaking waves; below a depth of about 6Hs, the local vertical mixing in stronger Langmuir circulation cells exceeds that produced on average by the shear-induced eddies that form temperature ramps.
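The law-of-the-wall scaling mentioned above corresponds to eps(z) ~ u*³/(kappa z). A minimal sketch of that profile for the quoted 11.6 m s⁻¹ wind, using a textbook drag coefficient and air/water densities that are assumptions rather than values from the paper:

```python
# Law-of-the-wall dissipation profile eps(z) = u*^3 / (kappa * z)
# for the 11.6 m/s wind quoted in the abstract.
import numpy as np

KAPPA = 0.4                      # von Karman constant
CD = 1.3e-3                      # assumed 10-m drag coefficient
RHO_AIR, RHO_SEA = 1.2, 1025.0   # assumed densities (kg/m^3)

U10 = 11.6                                        # wind speed (m/s)
ustar_air = np.sqrt(CD) * U10                     # air-side friction velocity
ustar_w = ustar_air * np.sqrt(RHO_AIR / RHO_SEA)  # water-side friction velocity

for z in (2.0, 5.0, 10.0):                        # measurement depths (m)
    eps = ustar_w**3 / (KAPPA * z)
    print(f"z = {z:4.1f} m : eps ~ {eps:.2e} m^2 s^-3")
```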
Abstract
The Operational Multiscale Environment Model with Grid Adaptivity (OMEGA) is a multiscale nonhydrostatic atmospheric simulation system based on an adaptive unstructured grid. The basic philosophy behind the OMEGA development has been the creation of an operational tool for real-time aerosol and gas hazard prediction. To meet the operational requirements, the model development has been guided by two basic design considerations: 1) the application of an unstructured, dynamically adaptive mesh numerical technique to atmospheric simulation, and 2) the use of embedded atmospheric dispersion algorithms. An important step in proving the utility and accuracy of OMEGA is full-scale testing of the model using simulations of real-world atmospheric events and qualitative as well as quantitative comparison of the model results with observations. The main objective of this paper is to provide a comprehensive evaluation of OMEGA against a major dispersion experiment in operational mode. OMEGA was therefore run to create a 72-h forecast for the first release period (23–26 October 1994) of the European Tracer Experiment (ETEX). The predicted meteorological and dispersion fields were then evaluated against both the atmospheric observations and the ETEX dispersion measurements up to 60 h after the start of the release. In general, the evaluation showed that the OMEGA dispersion results agreed well with the measured position, shape, and extent of the tracer cloud. The predictions, however, showed limited scatter about the measurements, with a slight tendency to underestimate the concentration values.
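Dispersion evaluations of this kind commonly use metrics such as the figure of merit in space (overlap of predicted and measured plumes) and the fractional bias of concentrations. The sketch below computes both on hypothetical paired model/measurement values; it is not the evaluation protocol of the paper.

```python
# Two common dispersion-verification metrics, applied to hypothetical
# paired model/measurement values at sampler locations.
import numpy as np

def figure_of_merit_in_space(pred, obs, threshold):
    """Overlap / union of the two plumes above a concentration threshold."""
    p, o = pred >= threshold, obs >= threshold
    union = np.logical_or(p, o).sum()
    return np.logical_and(p, o).sum() / union if union else np.nan

def fractional_bias(pred, obs):
    """2(mean_p - mean_o)/(mean_p + mean_o); negative = underestimate."""
    mp, mo = pred.mean(), obs.mean()
    return 2 * (mp - mo) / (mp + mo)

rng = np.random.default_rng(2)
obs = rng.lognormal(mean=0.0, sigma=1.0, size=500)    # hypothetical ng/m^3
pred = 0.9 * obs * rng.lognormal(0.0, 0.3, size=500)  # slight underestimate

print("FMS :", round(figure_of_merit_in_space(pred, obs, 0.5), 2))
print("FB  :", round(fractional_bias(pred, obs), 2))  # < 0, underestimate
```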
A comprehensive series of global datasets for land-atmosphere models has been collected, formatted to a common grid, and released on a set of CD-ROMs. This paper describes the motivation for, and the contents of, these datasets.
In June of 1992, an interdisciplinary earth science workshop was convened in Columbia, Maryland, to assess progress in land-atmosphere research, specifically in the areas of models, satellite data algorithms, and field experiments. At the workshop, representatives of the land-atmosphere modeling community defined a need for global datasets to prescribe boundary conditions, initialize state variables, and provide near-surface meteorological and radiative forcings for their models. The International Satellite Land Surface Climatology Project (ISLSCP), a part of the Global Energy and Water Cycle Experiment, worked with the Distributed Active Archive Center of the National Aeronautics and Space Administration Goddard Space Flight Center to bring the required datasets together in a usable format. The data have since been released on a collection of CD-ROMs.
The datasets on the CD-ROMs are grouped under the following headings: vegetation; hydrology and soils; snow, ice, and oceans; radiation and clouds; and near-surface meteorology. All datasets cover the period 1987–88, and all but a few are spatially continuous over the earth's land surface. All have been mapped to a common 1° × 1° equal-angle grid. The temporal frequency for most of the datasets is monthly. A few of the near-surface meteorological parameters are available both as six-hourly values and as monthly means.
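As a rough illustration of mapping data onto a common 1° × 1° equal-angle grid, the sketch below bin-averages a finer latitude/longitude field into 1° cells. The actual ISLSCP processing (masks, units, land/sea handling) was more involved; this only shows the gridding idea.

```python
# Bin-average a finer equal-angle field into a 180 x 360 one-degree grid.
import numpy as np

def to_one_degree(field, lats, lons):
    """Average values into 1-degree cells indexed by (lat+90, lon+180)."""
    out = np.zeros((180, 360))
    cnt = np.zeros((180, 360))
    ii = np.clip((lats + 90).astype(int), 0, 179)   # 1-degree row index
    jj = np.clip((lons + 180).astype(int), 0, 359)  # 1-degree column index
    for i, j, v in zip(ii.ravel(), jj.ravel(), field.ravel()):
        if np.isfinite(v):
            out[i, j] += v
            cnt[i, j] += 1
    return np.where(cnt > 0, out / np.maximum(cnt, 1), np.nan)

# usage with a hypothetical 0.5-degree field
lat = np.repeat(np.arange(-89.75, 90, 0.5)[:, None], 720, axis=1)
lon = np.repeat(np.arange(-179.75, 180, 0.5)[None, :], 360, axis=0)
coarse = to_one_degree(np.cos(np.radians(lat)), lat, lon)
print(coarse.shape)  # (180, 360)
```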
Abstract
Two approaches are used to characterize how accurately the north Alabama Lightning Mapping Array (LMA) is able to locate lightning VHF sources in space and time. The first method uses a Monte Carlo computer simulation to estimate source retrieval errors. The simulation applies a VHF source retrieval algorithm that was recently developed at the NASA Marshall Space Flight Center (MSFC) and that is similar, but not identical, to the standard New Mexico Tech retrieval algorithm. The second method uses a purely theoretical technique (i.e., chi-squared curvature matrix theory) to estimate retrieval errors. Both methods assume that the LMA system has an overall rms timing error of 50 ns, but all other possible errors (e.g., anomalous VHF noise sources) are neglected. Detailed spatial distributions of the retrieval errors are provided. Even though the two methods are independent of one another, they provide remarkably similar results. However, altitude error estimates derived from the two methods differ (the Monte Carlo result being taken as more accurate). Additionally, this study clarifies the mathematical retrieval process. In particular, the mathematical difference between the first-guess linear solution and the Marquardt-iterated solution is rigorously established, thereby explaining why Marquardt iterations improve upon the linear solution.
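A minimal sketch of the Monte Carlo idea, not the MSFC algorithm itself: perturb station arrival times with 50-ns rms noise, re-solve the time-of-arrival least-squares problem, and take the spread of the retrieved positions as the error. The station layout and source position below are hypothetical.

```python
# Monte Carlo error estimate for a time-of-arrival source retrieval.
import numpy as np
from scipy.optimize import least_squares

C = 2.998e8            # speed of light (m/s)
SIGMA_T = 50e-9        # rms timing error (s), as in the abstract
rng = np.random.default_rng(3)

# hypothetical 7-station network (x, y, z in meters)
stations = np.array([[0, 0, 0], [20e3, 0, 0], [0, 20e3, 0], [-20e3, 0, 0],
                     [0, -20e3, 0], [12e3, 12e3, 0], [-12e3, 12e3, 0.2e3]])
true_src = np.array([5e3, 8e3, 7e3])          # hypothetical source position
t_true = np.linalg.norm(stations - true_src, axis=1) / C

def residuals(p, t_obs):
    """Arrival-time residuals for source position p[:3], emission time p[3]."""
    return (np.linalg.norm(stations - p[:3], axis=1) / C + p[3]) - t_obs

sols = []
for _ in range(500):
    t_obs = t_true + rng.normal(0.0, SIGMA_T, len(stations))
    fit = least_squares(residuals, x0=[1e3, 1e3, 5e3, 0.0], args=(t_obs,))
    sols.append(fit.x[:3])

err = np.array(sols) - true_src
print("rms error (x, y, z) in m:", np.sqrt((err**2).mean(axis=0)).round(1))
```

The altitude (z) component typically shows the largest spread for a flat network like this one, consistent with the abstract's note that the two methods disagree most on altitude errors.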
Abstract
The Operational Multiscale Environment model with Grid Adaptivity (OMEGA) is an atmospheric simulation system that links the latest methods in computational fluid dynamics and high-resolution gridding technologies with numerical weather prediction. In the fall of 1999, OMEGA was used for the first time to examine the structure and evolution of a hurricane (Floyd, 1999). The first simulation of Floyd was conducted in an operational forecast mode; additional simulations exploiting both the static and the dynamic grid adaptation options in OMEGA were performed later as part of a sensitivity–capability study. While a horizontal grid resolution ranging from about 120 km down to about 40 km was employed in the operational run, resolutions down to about 15 km were used in the sensitivity study to explicitly model the structure of the inner core. All the simulations produced very similar storm tracks and reproduced the salient features of the observed storm, such as the recurvature off the Florida coast, with an average 48-h position error of 65 km. In addition, OMEGA predicted the landfall near Cape Fear, North Carolina, with an error of less than 100 km up to 96 h in advance. It was found that a higher resolution in the eyewall region of the hurricane, provided by dynamic adaptation, was capable of generating better-organized cloud and flow fields and a well-defined eye with a central pressure lower than the environment by roughly 50 mb. Since that time, forecasts have been performed for a number of other storms, including Georges (1998) and six 2000 storms (Tropical Storms Beryl and Chris, Hurricanes Debby and Florence, Tropical Storm Helene, and Typhoon Xangsane). The OMEGA mean track errors for all of these forecasts (101, 140, and 298 km at 24, 48, and 72 h, respectively) represent a significant improvement over the National Hurricane Center (NHC) 1998 averages of 156, 268, and 374 km, respectively. In a direct comparison with the GFDL model, OMEGA started with a considerably larger position error yet came within 5% of the GFDL 72-h track error. This paper details the simulations produced and documents the results, including a comparison of the OMEGA forecasts against satellite data, observed tracks, reported pressure lows and maximum wind speeds, and the rainfall distribution over land.
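Track errors of the kind quoted above are great-circle distances between forecast and observed storm centers at fixed lead times, averaged over cases. A minimal sketch with hypothetical positions:

```python
# Great-circle (haversine) track error at fixed forecast lead times.
import numpy as np

R_EARTH = 6371.0  # mean Earth radius (km)

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine distance between two (lat, lon) points in degrees."""
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dp, dl = p2 - p1, np.radians(lon2 - lon1)
    a = np.sin(dp / 2)**2 + np.cos(p1) * np.cos(p2) * np.sin(dl / 2)**2
    return 2 * R_EARTH * np.arcsin(np.sqrt(a))

# hypothetical (lat, lon) forecast vs best-track positions at 24/48/72 h
forecast = {24: (25.0, -75.0), 48: (28.5, -77.5), 72: (32.0, -78.0)}
observed = {24: (25.4, -74.2), 48: (29.5, -76.6), 72: (33.5, -77.2)}

for lead in (24, 48, 72):
    err = great_circle_km(*forecast[lead], *observed[lead])
    print(f"{lead:2d} h track error: {err:6.1f} km")
```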