Search Results
You are looking at 1 - 10 of 11 items for
- Author or Editor: J. Vogt
Abstract
This study applied remotely sensed cloud-to-ground (CG) lightning strike location data, a digital elevation model (DEM), and a geographic information system (GIS) to characterize negative polarity peak current CG lightning Earth attachment behavior. It explored the propensity for (i) flashes to favor topographic highpoint attachment and (ii) striking distance (a near-Earth attachment force) to increase with peak current. On a 16 000 km² 10-m DEM covering a section of southeast and south-central Colorado, a GIS extraction method identified approximately 5000 hilltop and outcrop highpoints containing at least 15 m of vertical gain in a 300-m radius neighborhood with a minimum horizontal separation of 600 m. Flashes with peak currents ranging from −20 to −119 kiloamps (kA), collected between February 2005 and May 2009, were subdivided into 10-kA classes and mapped on this modified DEM. Buffers of 100-, 200-, and 300-m radii created around each highpoint were used to assess the hypothesis that striking distance increases with higher negative peak current. Point-in-polygon counts compared actual CG strike totals to random point totals received inside buffers. CG strikes favored topographic highpoints by as much as 5.0% when compared to random points. Chi-square goodness-of-fit tests further corroborated that actual CG strikes at highpoints were generated by a nonrandom process. A positive trend between striking distance and peak current was also observed. Although this correlation has been characterized in controlled settings, this study is the first to document this physical process at real-world landscape scales over multiple years.
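As a rough illustration of the buffer-and-count analysis described above, the Python sketch below tallies points falling inside highpoint buffers of 100, 200, and 300 m and compares observed counts against a random baseline with a chi-square goodness-of-fit test. All coordinates, counts, and the domain size are synthetic placeholders, not the study's data or GIS procedure.

```python
# Illustrative sketch only: a minimal buffer/point-in-polygon style count and a
# chi-square goodness-of-fit comparison loosely following the workflow described
# in the abstract. Coordinates and the random baseline are synthetic placeholders.
import numpy as np
from scipy.spatial import cKDTree
from scipy.stats import chisquare

rng = np.random.default_rng(0)

# Hypothetical projected coordinates (metres) for highpoints, CG strikes, and random points.
highpoints = rng.uniform(0, 40_000, size=(500, 2))
cg_strikes = rng.uniform(0, 40_000, size=(20_000, 2))
random_pts = rng.uniform(0, 40_000, size=(20_000, 2))

tree = cKDTree(highpoints)

def count_within(points, radius):
    """Number of points falling inside any highpoint buffer of the given radius."""
    hits = tree.query_ball_point(points, r=radius)
    return sum(1 for h in hits if h)  # each point counted once even if in several buffers

for radius in (100.0, 200.0, 300.0):
    observed = count_within(cg_strikes, radius)
    expected = count_within(random_pts, radius)
    # Goodness of fit: observed in-buffer vs. out-of-buffer counts against the
    # proportions produced by the random points.
    chi2, p = chisquare(
        f_obs=[observed, len(cg_strikes) - observed],
        f_exp=[expected, len(random_pts) - expected],
    )
    print(f"r={radius:.0f} m: CG in buffers={observed}, random={expected}, "
          f"chi2={chi2:.1f}, p={p:.3g}")
```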
Abstract
For the state of Colorado, 10 years (2003–12) of 1 April–31 October cloud-to-ground (CG) lightning stroke data are mapped at 500-m spatial resolution over a 10-m spatial resolution U.S. Geological Survey (USGS) digital elevation model (DEM). Spatially, the 12.5 million strokes that are analyzed represent ground contacts, but translate to density values that are about twice the number of ground contacts. Visual interpretation of the mapped data reveals the general lightning climatology of the state, while geospatial analyses that quantify lightning activity by elevation identify certain topographic influences of Colorado’s physical landscape. Elevations lower than 1829 m (6000 ft) and above 3200 m (10 500 ft) show a positive relationship between lightning activity and elevation, while the variegated topography that lies between these two elevations is characterized by a fluctuating relationship. Though many topographic controls are elucidated through the mappings and analyses, the major finding of this paper is the sharp increase in stroke density observed above 3200 m (10 500 ft). Topography’s role in this rapid surge in stroke density, which peaks in the highest mountain summits, is not well known, and until now, was not well documented in the refereed literature at such high resolution from a long-duration dataset.
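The elevation-band analysis can be sketched as a simple binning of ground contacts by DEM elevation class, as in the hedged example below. The DEM, stroke locations, cell size, and band edges are synthetic stand-ins chosen only to echo the 1829 m and 3200 m thresholds mentioned in the abstract.

```python
# Minimal sketch (not the paper's code): bin lightning ground contacts by DEM
# elevation class and report a density per band. All inputs are synthetic.
import numpy as np

rng = np.random.default_rng(1)

dem = rng.uniform(1000, 4400, size=(500, 500))            # elevations in metres
cell_area_km2 = 0.25                                       # e.g. 500 m x 500 m cells
strike_rows = rng.integers(0, dem.shape[0], size=100_000)
strike_cols = rng.integers(0, dem.shape[1], size=100_000)

# Elevation bands (m); 1829 m and 3200 m follow the thresholds discussed above.
edges = np.array([1000, 1829, 2200, 2600, 3200, 3700, 4400])

strike_elev = dem[strike_rows, strike_cols]
counts, _ = np.histogram(strike_elev, bins=edges)
cells_per_band, _ = np.histogram(dem.ravel(), bins=edges)
density = counts / (cells_per_band * cell_area_km2)        # strokes per km^2

for lo, hi, d in zip(edges[:-1], edges[1:], density):
    print(f"{lo:5d}-{hi:5d} m: {d:.2f} strokes km^-2")
```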
Abstract
The main objective of this study is to evaluate the uncertainties due to sample size associated with the estimation of the standardized precipitation index (SPI) and their impact on the level of confidence in drought monitoring in Africa using high-spatial-resolution data from short time series. To do this, two different rainfall datasets, each available on a monthly basis, were analyzed over four river basins in Africa—Oum er-Rbia, Limpopo, Niger, and eastern Nile—as well as at the continental level. The two precipitation datasets used were the Tropical Rainfall Measuring Mission (TRMM) satellite monthly rainfall product 3B43 and the Global Precipitation Climatology Centre full-reanalysis gridded precipitation dataset. A nonparametric resampling bootstrap approach was used to compute the confidence bands associated with the SPI estimation, which are essential for making a qualified assessment of drought events. The comparative analysis of different datasets suggests that for reliable drought monitoring over Africa it is feasible to use short time series of remote sensing precipitation data, such as those from TRMM, that have a higher spatial resolution than other gridded precipitation data. The proposed approach for drought monitoring has the potential to be used in support of decision making at both continental and subcontinental scales over Africa or over other regions that have a sparse distribution of rainfall measurement instruments.
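A minimal, hedged sketch of the approach: fit a gamma distribution to a precipitation sample, transform to a standard normal deviate to obtain the SPI, and bootstrap the calibration sample to form a confidence band. Zero-precipitation handling, accumulation windows, and the actual TRMM/GPCC inputs are omitted; the series below is synthetic.

```python
# Hedged sketch of an SPI estimate with nonparametric bootstrap confidence bands,
# assuming a simple gamma fit. This is not the authors' implementation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
monthly_precip = rng.gamma(shape=2.0, scale=30.0, size=180)  # 15 yr of monthly totals, mm

def spi(values, target):
    """SPI of `target` given a calibration sample `values` (gamma fit, loc fixed at 0)."""
    a, loc, scale = stats.gamma.fit(values, floc=0)
    return stats.norm.ppf(stats.gamma.cdf(target, a, loc=loc, scale=scale))

target = monthly_precip[-1]
point_estimate = spi(monthly_precip, target)

# Nonparametric bootstrap: resample the calibration sample with replacement,
# refit, and collect the SPI values to form a confidence band.
boot = np.array([
    spi(rng.choice(monthly_precip, size=monthly_precip.size, replace=True), target)
    for _ in range(1000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"SPI = {point_estimate:.2f}, 95% bootstrap band [{lo:.2f}, {hi:.2f}]")
```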
Weather radars with conventional antennas cannot provide the desired volume scan updates at intervals of one minute or less, which are essential for significant improvement in the warning lead time for impending storm hazards. The agile-beam multimission phased array radar (MPAR) discussed herein is one potential candidate that can provide faster scanning. It also offers a unique potential for multipurpose use to not only sample weather but also support air traffic needs and track noncooperative airplanes, thus making it an affordable option. After the basic idea behind electronic beam steering is introduced, the need for frequent observations of convective weather is explained. Then, advantages of the phased array radar (PAR) for weather monitoring and improving data quality are examined. To explore and develop weather-related applications of the PAR, a National Weather Radar Testbed (NWRT) has been established in Norman, Oklahoma. The NWRT's main purpose is to address the advanced capabilities anticipated within the next decade so that these could be projected to a possible network of future weather radars. Examples of data illustrating advantages of this advanced radar are shown, and forthcoming plans are discussed.
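For readers unfamiliar with electronic beam steering, the short sketch below computes the progressive phase shift applied across a uniform linear array to point the main beam at a chosen angle. The frequency, spacing, and element count are generic textbook values, not MPAR or NWRT specifications.

```python
# Minimal illustration of electronic beam steering for a uniform linear array:
# the progressive phase shift applied to each element to point the main beam
# at a chosen angle from broadside. Generic values, not radar hardware specs.
import numpy as np

c = 3.0e8                 # speed of light, m/s
freq = 3.0e9              # S band, Hz
lam = c / freq            # wavelength, m
d = lam / 2.0             # half-wavelength element spacing
n_elements = 16
steer_deg = 20.0          # desired beam direction from broadside

n = np.arange(n_elements)
phase = -2.0 * np.pi * d * n * np.sin(np.radians(steer_deg)) / lam  # radians

for i, p in zip(n, np.degrees(phase) % 360.0):
    print(f"element {i:2d}: phase {p:7.1f} deg")
```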
Abstract
The wind power industry has seen tremendous growth over the past decade and with it has come the need for clutter mitigation techniques for nearby radar systems. Wind turbines can impart upon these radars a unique type of interference that is not removed with conventional clutter-filtering methods. Time series data from Weather Surveillance Radar-1988 Doppler (WSR-88D) stations near wind farms were collected and spectral analysis was used to investigate the detailed characteristics of wind turbine clutter. Techniques to mask wind turbine clutter were developed that utilize multiquadric interpolation in two and three dimensions and can be applied to both the spectral moments and spectral components. In an effort to improve performance, a nowcasting algorithm was incorporated into the interpolation scheme via a least mean squares criterion. The masking techniques described in this paper will be shown to reduce the impact of wind turbine clutter on weather radar systems at the expense of spatial resolution.
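A simplified example of the interpolation technique named above: a two-dimensional multiquadric radial-basis-function interpolant fitted to surrounding, uncontaminated samples and evaluated at gates hypothetically flagged as wind turbine clutter. The data, shape parameter, and masking step are placeholders and do not reproduce the paper's spectral-domain algorithm.

```python
# Simplified 2D multiquadric interpolation over reflectivity-like samples,
# illustrating how gaps flagged as wind turbine clutter could be filled from
# surrounding gates. Sample data and the shape parameter are placeholders.
import numpy as np

rng = np.random.default_rng(3)

# Known (uncontaminated) sample locations and values, e.g. reflectivity in dBZ.
xy = rng.uniform(0, 10, size=(60, 2))
z = np.sin(xy[:, 0]) + 0.5 * np.cos(xy[:, 1])

eps = 1.0  # multiquadric shape parameter (assumed; would need tuning)

def multiquadric(r):
    return np.sqrt(r**2 + eps**2)

# Solve for RBF weights: Phi @ w = z, with Phi_ij = phi(|x_i - x_j|).
dists = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
weights = np.linalg.solve(multiquadric(dists), z)

def interpolate(points):
    """Evaluate the multiquadric interpolant at new (x, y) points."""
    d = np.linalg.norm(points[:, None, :] - xy[None, :, :], axis=-1)
    return multiquadric(d) @ weights

# Fill hypothetical clutter-masked gates.
masked_gates = np.array([[2.5, 3.5], [7.0, 1.0]])
print(interpolate(masked_gates))
```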
Abstract
The Namib Turbulence Experiment (NamTEX) was a multinational micrometeorological campaign conducted in the central Namib Desert to investigate three-dimensional surface layer turbulence and the spatiotemporal patterns of heat transfer between the subsurface, surface, and atmosphere. The Namib provides an ideal location for fundamental research that revisits some key assumptions in micrometeorology that are implicitly included in the parameterizations describing energy exchange in weather forecasting and climate models: homogeneous flat surfaces, no vegetation, little moisture, and cloud-free skies create a strong and consistent diurnal forcing, resulting in a wide range of atmospheric stabilities. A novel combination of instruments was used to simultaneously measure variables and processes relevant to heat transfer: a 3-km fiber-optic distributed temperature sensor (DTS) was suspended in a pseudo-three-dimensional array within a 300 m × 300 m domain to provide vertical cross sections of air temperature fluctuations. Aerial and ground-based thermal imagers recorded high-resolution surface temperature fluctuations within the domain and revealed the spatial thermal imprint of atmospheric structures responsible for heat exchange. High-resolution soil temperature and moisture profiles together with heat flux plates provided information on near-surface soil dynamics. Turbulent heat exchange was measured with a vertical array of five eddy-covariance point measurements on a 21-m mast, as well as by collocated small- and large-aperture scintillometers. This contribution first details the scientific goals and experimental setup of the NamTEX campaign. Then, using a typical day, we demonstrate (i) the coupling of surface layer, surface, and soil temperatures using high-frequency temperature measurements, (ii) differences in spatial and temporal standard deviations of the horizontal temperature field using spatially distributed measurements, and (iii) horizontal anisotropy of the turbulent temperature field.
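As context for the eddy-covariance measurements mentioned above, the minimal sketch below estimates a sensible heat flux from the covariance of vertical-wind and temperature fluctuations, H = ρ c_p ⟨w′T′⟩. The 20-Hz series are synthetic, and standard processing steps (despiking, coordinate rotation, density corrections) are omitted.

```python
# Minimal eddy-covariance sketch: sensible heat flux from the covariance of
# vertical wind and temperature fluctuations, H = rho * cp * <w'T'>.
# The high-frequency series below are synthetic and pre-processing is omitted.
import numpy as np

rng = np.random.default_rng(4)
n = 20 * 60 * 30                                     # 30 min of 20 Hz data
w = 0.3 * rng.standard_normal(n)                     # vertical wind fluctuations, m/s
T = 305.0 + 0.5 * rng.standard_normal(n) + 0.4 * w   # air temperature, K (correlated with w)

rho = 1.1    # air density, kg m-3 (assumed)
cp = 1005.0  # specific heat of air, J kg-1 K-1

w_prime = w - w.mean()
T_prime = T - T.mean()
H = rho * cp * np.mean(w_prime * T_prime)            # W m-2
print(f"Sensible heat flux H = {H:.1f} W m^-2")
```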
Abstract
A methodology combining Bayesian inference with Markov chain Monte Carlo (MCMC) sampling is applied to a real accidental radioactive release that occurred on a continental scale at the end of May 1998 near Algeciras, Spain. The source parameters (i.e., source location and strength) are reconstructed from a limited set of measurements of the release. Annealing and adaptive procedures are implemented to ensure a robust and effective parameter-space exploration. The simulation setup is similar to an emergency response scenario, with the simplifying assumptions that the source geometry and release time are known. The Bayesian stochastic algorithm provides likely source locations within 100 km from the true source, after exploring a domain covering an area of approximately 1800 km × 3600 km. The source strength is reconstructed with a distribution of values of the same order of magnitude as the upper end of the range reported by the Spanish Nuclear Security Agency. By running the Bayesian MCMC algorithm on a large parallel cluster, the inversion results could be obtained in a few hours, as required for emergency response to continental-scale releases. With additional testing and refinement of the methodology (e.g., tests that also include the source geometry and release time among the unknown source parameters), as well as with the continuous and rapid growth of computational power, the approach can potentially be used for real-world emergency response in the near future.
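To indicate the flavor of the method, the toy example below reconstructs a two-dimensional source location and strength from a few noisy "measurements" with a random-walk Metropolis sampler. The forward model is a crude inverse-distance dilution stand-in for the dispersion simulations used in the study, and no annealing or adaptive tuning is included.

```python
# Toy Metropolis-Hastings reconstruction of a 2D source location and strength
# from synthetic measurements, in the spirit of the Bayesian MCMC approach
# described above. The forward model and all numbers are stand-ins.
import numpy as np

rng = np.random.default_rng(5)

def forward(src_xy, strength, receptors):
    """Crude dilution model: concentration ~ strength / (1 + distance^2)."""
    d2 = np.sum((receptors - src_xy)**2, axis=1)
    return strength / (1.0 + d2)

receptors = rng.uniform(-50, 50, size=(8, 2))           # sensor locations, km
true_obs = forward(np.array([10.0, -20.0]), 50.0, receptors)
obs = true_obs + 0.05 * true_obs * rng.standard_normal(8)

def log_post(theta):
    x, y, q = theta
    if q <= 0:
        return -np.inf                                  # flat prior, positive strength only
    resid = obs - forward(np.array([x, y]), q, receptors)
    return -0.5 * np.sum((resid / (0.05 * obs))**2)     # Gaussian likelihood

theta = np.array([0.0, 0.0, 10.0])                      # initial guess
lp = log_post(theta)
samples = []
for _ in range(20_000):
    prop = theta + rng.normal(scale=[2.0, 2.0, 1.0])    # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:             # Metropolis acceptance test
        theta, lp = prop, lp_prop
    samples.append(theta.copy())

samples = np.array(samples[5000:])                      # discard burn-in
print("posterior mean (x, y, strength):", samples.mean(axis=0).round(2))
```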
Abstract
The second Meteor Crater Experiment (METCRAX II) was conducted in October 2013 at Arizona’s Meteor Crater. The experiment was designed to investigate nighttime downslope windstorm-type flows that form regularly above the inner southwest sidewall of the 1.2-km-diameter crater as a southwesterly mesoscale katabatic flow cascades over the crater rim. The objective of METCRAX II is to determine the causes of these strong, intermittent, and turbulent inflows that bring warm-air intrusions into the southwest part of the crater. This article provides an overview of the scientific goals of the experiment; summarizes the measurements, the crater topography, and the synoptic meteorology of the study period; and presents initial analysis results.
During a special observing period (SOP) of the Mesoscale Alpine Programme (MAP), boundary layer processes in highly complex topography were investigated in the Riviera Valley in southern Switzerland. The main focus was on the turbulence structure and turbulent exchange processes near the valley surfaces and in the free troposphere. Due to the anticipated spatial inhomogeneity, a number of different turbulence probes were deployed on a cross section through the valley. Together with a suite of more conventional instrumentation to observe the mean meteorological structure in the valley, this effort yielded a highly valuable dataset. The latter is presently being exploited to yield insight into the turbulence structure in very complex terrain and its relation to flow regimes and associated mean flow characteristics. Specific questions, such as a detailed investigation of turbulent exchange processes over complex topography and the validity of surface exchange parameterizations in numerical models for such surfaces, the closure of the surface energy balance, or the definition and meaning of the “boundary layer height,” are investigated using the MAP-Riviera dataset. In the present paper, we provide details on sites and their characteristics, on measurements and observational strategies, and on efforts to guarantee comparability between different instrumentation at different sites, and we include an overview of the available instrumentation. On the basis of preliminary data and first results, the main research goals of the project are outlined.
Drought is a global problem that has far-reaching impacts, especially on vulnerable populations in developing regions. This paper highlights the need for a Global Drought Early Warning System (GDEWS), the elements that constitute its underlying framework (GDEWF), and the recent progress made toward its development. Many countries lack drought monitoring systems, as well as the capacity to respond via appropriate political, institutional, and technological frameworks, and these have inhibited the development of integrated drought management plans or early warning systems. The GDEWS will provide a source of drought tools and products via the GDEWF for countries and regions to develop tailored drought early warning systems for their own users. A key goal of a GDEWS is to maximize the lead time for early warning, allowing drought managers and disaster coordinators more time to put mitigation measures in place to reduce the vulnerability to drought. To address this, the GDEWF will take both a top-down approach to provide global real-time drought monitoring and seasonal forecasting, and a bottom-up approach that builds upon existing national and regional systems to provide continental-to-global coverage. A number of challenges must be overcome, however, before a GDEWS can become a reality, including the lack of in situ measurement networks and modest seasonal forecast skill in many regions, and the lack of infrastructure to translate data into usable information. A set of international partners, through a series of recent workshops and evolving collaborations, has made progress toward meeting these challenges and developing a global system.