Search Results
You are looking at 11–20 of 23 items for
- Author or Editor: Robert N. Miller
Abstract
A linearized baroclinic, spectral-in-time tidal inverse model has been developed for assimilation of surface currents from coast-based high-frequency (HF) radars. Representer functions obtained as part of the generalized inverse solution show that for superinertial flows, information from the surface velocity measurements propagates to depth along wave characteristics, allowing internal tidal flows to be mapped throughout the water column. Application of the inverse model to a 38 km × 57 km domain off the mid-Oregon coast, where data from two HF radar systems are available, provides a uniquely detailed picture of the spatial and temporal variability of the M2 internal tide in a coastal environment. Most of the baroclinic signal contained in the data comes from outside the computational domain, so data assimilation (DA) is used to restore baroclinic currents at the open boundary (OB). Experiments with synthetic data demonstrate that the choice of the error covariance for the OB condition affects model performance. A covariance consistent with the assumed dynamics is obtained by nesting, using representers computed in a larger domain. Harmonic analysis of currents from HF radars and an acoustic Doppler profiler (ADP) mooring off Oregon for May–July 1998 reveals substantial intermittence of the internal tide, in both amplitude and phase. Assimilation of the surface current measurements captures this temporal variability and improves the rms difference between the ADP data and the solution. Despite the significant temporal variability, persistent features are found for the studied period; for instance, the dominant direction of baroclinic wave phase and energy propagation is always from the northwest. At the surface, baroclinic tidal currents (deviations from the depth-averaged current) can reach 10 cm s−1, twice as large as the depth-averaged current. Barotropic-to-baroclinic energy conversion is generally weak within the model domain over the shelf but reaches 5 mW m−2 at times over the slopes of Stonewall Bank.
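The harmonic analysis mentioned above extracts the amplitude and phase of the M2 constituent from a current record. A minimal least-squares sketch of that step, on synthetic data, might look like the following (the M2 period is the only physical constant used; the data and amplitude are made up for illustration):

```python
import numpy as np

# M2 tidal period in hours (~12.4206 h)
M2_PERIOD_H = 12.4206012
omega = 2.0 * np.pi / M2_PERIOD_H  # angular frequency, rad/h

def m2_harmonic_fit(t_hours, u):
    """Least-squares fit of u(t) = a*cos(wt) + b*sin(wt) + mean.

    Returns (amplitude, phase in radians) of the M2 constituent.
    """
    A = np.column_stack([
        np.cos(omega * t_hours),
        np.sin(omega * t_hours),
        np.ones_like(t_hours),
    ])
    (a, b, _), *_ = np.linalg.lstsq(A, u, rcond=None)
    return np.hypot(a, b), np.arctan2(b, a)

# synthetic check: 15 days of hourly "currents" with a known M2 signal
t = np.arange(0.0, 15 * 24.0, 1.0)
u = 0.10 * np.cos(omega * t - 1.0) + 0.02  # 10 cm/s amplitude, 1 rad phase
amp, ph = m2_harmonic_fit(t, u)
```

With noise-free input the fit recovers the prescribed amplitude (0.10 m s−1) and phase (1.0 rad) essentially exactly; in practice the same regression is applied per frequency to noisy records.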
We briefly but systematically review major sources of aerosol data, emphasizing suites of measurements that seem most likely to contribute to assessments of global aerosol climate forcing. The strengths and limitations of existing satellite, surface, and aircraft remote sensing systems are described, along with those of direct sampling networks and ship-based stations. It is evident that an enormous number of aerosol-related observations have been made, on a wide range of spatial and temporal sampling scales, and that many of the key gaps in this collection of data could be filled by technologies that either exist or are expected to be available in the near future. Emphasis must be given to combining remote sensing and in situ active and passive observations and integrating them with aerosol chemical transport models, in order to create a more complete environmental picture, having sufficient detail to address current climate forcing questions. The Progressive Aerosol Retrieval and Assimilation Global Observing Network (PARAGON) initiative would provide an organizational framework to meet this goal.
Abstract
Visible satellite imagery is widely used by operational weather forecast centers for tropical and extratropical cyclone analysis and marine forecasting. The absence of visible imagery at night can significantly degrade forecast capabilities, such as determining tropical cyclone center locations or tracking warm-topped convective clusters. This paper documents ProxyVis imagery, an infrared-based proxy for daytime visible imagery developed to address the lack of visible satellite imagery at night and the limitations of existing nighttime visible options. ProxyVis was trained on VIIRS day/night band imagery at times close to the full moon, using VIIRS IR channels that closely match GOES-16/17/18, Himawari-8/9, and Meteosat-9/10/11 channels. The final operational product applies the ProxyVis algorithms to geostationary satellite data and combines daytime visible and nighttime ProxyVis data to create full-disk animated GeoProxyVis imagery. The simple versions of the ProxyVis algorithm enable its generation from earlier GOES and Meteosat satellite imagery. ProxyVis offers significant improvement over existing operational products for tracking nighttime oceanic low-level clouds. Further, it is qualitatively similar to visible imagery for a wide range of backgrounds and synoptic conditions and phenomena, enabling forecasters to use it without special training. ProxyVis was first introduced to National Hurricane Center (NHC) operations in 2018 and was found to be extremely useful by forecasters, becoming part of their standard operational satellite product suite in 2019. Currently, ProxyVis as implemented for GOES-16/18, Himawari-9, and Meteosat-9/10/11 is being used in operational settings and evaluated for transition to operations at multiple NWS offices and the Joint Typhoon Warning Center.
Significance Statement
This paper describes ProxyVis imagery, a new method for combining infrared channels to qualitatively mimic daytime visible imagery at nighttime. ProxyVis demonstrates that a simple linear regression can combine just a few commonly available infrared channels to develop a nighttime proxy for visible imagery that significantly improves a forecaster’s ability to track low-level oceanic clouds and circulation features at night, works for all current geostationary satellites, and is useful across a wide range of backgrounds and meteorological scenarios. Animated ProxyVis geostationary imagery has been operational at the National Hurricane Center since 2019 and is also currently being transitioned to operations at other NWS offices and the Joint Typhoon Warning Center.
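The significance statement notes that a simple linear regression over a few IR channels suffices for a nighttime visible proxy. A hedged sketch of that kind of fit, on entirely synthetic stand-ins for IR brightness temperatures and a day/night-band target (the channel count, weights, and data are made up, not the paper's), could look like this:

```python
import numpy as np

# Synthetic stand-ins for three IR channels and a lunar-illuminated
# day/night-band-like target; all values are illustrative.
rng = np.random.default_rng(0)
n = 5000
ir_channels = rng.normal(size=(n, 3))
target = ir_channels @ np.array([0.5, -1.2, 0.3]) + 0.1

# Ordinary least squares with an intercept column.
X = np.column_stack([ir_channels, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, target, rcond=None)

# The fitted coefficients can then be applied to nighttime IR imagery
# to produce proxy-visible values.
proxy = X @ coef
```

The point of the sketch is only that the mapping is a fixed linear combination: once trained, it can be applied cheaply to any geostationary scene with matching channels.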
Abstract
Thunderstorm mode strongly impacts the likelihood and predictability of tornadoes and other hazards, and thus is of great interest to severe weather forecasters and researchers. It is often impossible for a forecaster to manually classify all the storms within convection-allowing model (CAM) output during a severe weather outbreak, or for a scientist to manually classify all storms in a large CAM or radar dataset in a timely manner. Automated storm classification techniques facilitate these tasks and provide objective inputs to operational tools, including machine learning models for predicting thunderstorm hazards. Accurate storm classification, however, requires accurate storm segmentation. Many storm segmentation techniques fail to distinguish between clustered storms, thereby missing intense cells, or to identify cells embedded within quasi-linear convective systems that can produce tornadoes and damaging winds. Therefore, we have developed an iterative technique that identifies these constituent storms in addition to traditionally identified storms. Identified storms are classified according to a seven-mode scheme designed for severe weather operations and research. The classification model is a hand-developed decision tree that operates on storm properties computed from composite reflectivity and midlevel rotation fields. These properties include geometrical attributes, whether the storm contains smaller storms or resides within a larger-scale complex, and whether strong rotation exists near the storm centroid. We evaluate the classification algorithm using expert labels of 400 storms simulated by the NSSL Warn-on-Forecast System or analyzed by the NSSL Multi-Radar/Multi-Sensor product suite. The classification algorithm emulates expert opinion reasonably well (e.g., 76% accuracy for supercells), and therefore could facilitate a wide range of operational and research applications.
Significance Statement
We have developed a new technique for automatically identifying intense thunderstorms in model and radar data and classifying storm mode, which informs forecasters about the risks of tornadoes and other high-impact weather. The technique identifies storms that are often missed by other methods, including cells embedded within storm clusters, and successfully classifies important storm modes that are generally not included in other schemes, such as rotating cells embedded within quasi-linear convective systems. We hope the technique will facilitate a variety of forecasting and research efforts.
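The classification step described above is a hand-built decision tree over storm-object properties. A toy sketch in that spirit (the thresholds, property names, and mode labels here are hypothetical illustrations, not the paper's seven-mode scheme) might be:

```python
# Hypothetical decision tree over storm-object properties; thresholds
# and labels are illustrative only.
def classify_storm(area_km2: float, aspect_ratio: float,
                   has_strong_rotation: bool, in_larger_complex: bool) -> str:
    # long, large objects are treated as quasi-linear convective systems
    if aspect_ratio > 3.0 and area_km2 > 500.0:
        return "QLCS"
    # cells identified inside a larger complex keep an "embedded" label
    if in_larger_complex:
        return "embedded rotating cell" if has_strong_rotation else "embedded cell"
    # isolated rotating cells map to supercells
    if has_strong_rotation:
        return "supercell"
    return "ordinary cell"
```

In the paper's scheme the inputs are computed from composite reflectivity and midlevel rotation fields; the sketch only shows the decision-tree shape of the classifier.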
A comprehensive and cohesive aerosol measurement record with consistent, well-understood uncertainties is a prerequisite to understanding aerosol impacts on long-term climate and environmental variability. Objectives toward attaining such an understanding include improving upon current state-of-the-art sensor calibration and developing systematic validation methods for remotely sensed microphysical properties. While advances in active and passive remote sensors will lead to needed improvements in retrieval accuracies and capabilities, ongoing validation is essential so that changing sensor characteristics do not mask atmospheric trends. Surface-based radiometer, chemical, and lidar networks have critical roles within an integrated observing system, yet they currently undersample key geographic regions, have limitations in certain measurement capabilities, and lack stable funding. In situ aircraft observations of size-resolved aerosol chemical composition are necessary to provide important linkages between active and passive remote sensing. A planned, systematic approach toward a global aerosol observing network, involving multiple sponsoring agencies and surface-based, suborbital, and spaceborne sensors, is required to prioritize trade-offs regarding capabilities and costs. This strategy is a key ingredient of the Progressive Aerosol Retrieval and Assimilation Global Observing Network (PARAGON) framework. A set of recommendations is presented.
Abstract
The chemical species emitted by forests create complex atmospheric oxidation chemistry and influence global atmospheric oxidation capacity and climate. The Southern Oxidant and Aerosol Study (SOAS) provided an opportunity to test the oxidation chemistry in a forest where isoprene is the dominant biogenic volatile organic compound. Hydroxyl (OH) and hydroperoxyl (HO2) radicals were two of the hundreds of atmospheric chemical species measured, as was OH reactivity (the inverse of the OH lifetime). OH was measured by laser-induced fluorescence (LIF) and by taking the difference in signals without and with an OH scavenger that was added just outside the instrument's pinhole inlet. To test whether the chemistry at SOAS can be simulated by current model mechanisms, OH and HO2 were evaluated with a box model using two chemical mechanisms: Master Chemical Mechanism, version 3.2 (MCMv3.2), augmented with explicit isoprene chemistry, and MCMv3.3.1. Measured and modeled OH peak at about 10^6 cm−3 and agree well within combined uncertainties. Measured and modeled HO2 peak at about 27 pptv and also agree well within combined uncertainties. Median OH reactivity cycled between about 11 s−1 at dawn and about 26 s−1 during midafternoon. A good test of the oxidation chemistry is the balance between OH production and loss rates computed from measurements; this balance was observed to within uncertainties. These SOAS results provide strong evidence that the current isoprene mechanisms are consistent with measured OH and HO2 and, thus, capture significant aspects of the atmospheric oxidation chemistry in this isoprene-rich forest.
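The quantities in this abstract are connected by two simple relations: total OH reactivity is the sum of rate constants times reactant concentrations (and the inverse of the OH lifetime), and at steady state OH production balances loss. A minimal numerical sketch, with illustrative rate constants and concentrations (not SOAS values), is:

```python
# OH reactivity:  k_OH [s^-1] = sum_i k_i * [X_i]
# Steady state:   P_OH = k_OH * [OH]
# All numbers below are hypothetical, for illustration only.

# reactant: (concentration [molecules cm^-3], rate constant [cm^3 molecule^-1 s^-1])
reactants = {
    "isoprene": (2.5e10, 1.0e-10),
    "CO":       (3.0e12, 2.4e-13),
}

k_oh = sum(conc * k for conc, k in reactants.values())  # total OH reactivity, s^-1
lifetime = 1.0 / k_oh                                   # OH lifetime, s

oh = 1.0e6               # OH concentration, cm^-3 (order of magnitude from the text)
loss_rate = k_oh * oh    # OH loss rate, cm^-3 s^-1; at steady state this equals P_OH
```

With these made-up inputs the reactivity comes out a few s−1, the same order as the dawn value quoted above, and the budget test amounts to checking that an independently measured production rate matches `loss_rate`.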
Aerosols exert myriad influences on the earth's environment and climate, and on human health. The complexity of aerosol-related processes requires that information gathered to improve our understanding of climate change originate from multiple sources, and that effective strategies for data integration be established. While a vast array of observed and modeled data are becoming available, the aerosol research community currently lacks the necessary tools and infrastructure to reap maximum scientific benefit from these data. Spatial and temporal sampling differences among a diverse set of sensors, nonuniform data qualities, aerosol mesoscale variabilities, and difficulties in separating cloud effects are some of the challenges that need to be addressed. Maximizing the long-term benefit from these data also requires maintaining consistently well-understood accuracies as measurement approaches evolve and improve. A comprehensive understanding of how aerosol physical, chemical, and radiative processes impact the earth system can be achieved only through a multidisciplinary, interagency, and international initiative capable of dealing with these issues. A systematic approach, capitalizing on modern measurement and modeling techniques, geospatial statistics methodologies, and high-performance information technologies, can provide the necessary machinery to support this objective. We outline a framework for integrating and interpreting observations and models, and establishing an accurate, consistent, and cohesive long-term record, following a strategy whereby information and tools of progressively greater sophistication are incorporated as problems of increasing complexity are tackled. This concept is named the Progressive Aerosol Retrieval and Assimilation Global Observing Network (PARAGON).
To encompass the breadth of the effort required, we present a set of recommendations dealing with data interoperability; measurement and model integration; multisensor synergy; data summarization and mining; model evaluation; calibration and validation; augmentation of surface and in situ measurements; advances in passive and active remote sensing; and design of satellite missions. Without an initiative of this nature, the scientific and policy communities will continue to struggle with understanding the quantitative impact of complex aerosol processes on regional and global climate change and air quality.