Search Results
You are looking at 1–10 of 17 items for
- Author or Editor: John R. Adam
- Refine by Access: All Content
Abstract
Streams of uniformly sized and spaced droplets, produced by launching an ultrasonic wave onto a jet of water to disintegrate it, have recently been used in a number of cloud physics experiments. It has been found that a stream of droplets is longitudinally unstable and that droplets in the stream tend to bunch and collide. Experimental and theoretical results show that this longitudinal bunching may be inhibited by electrically charging the droplets. However, if the stream is charged too highly, it will spread laterally into an exponential cone.
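The trade-off the abstract describes, enough charge to keep neighboring droplets apart along the stream but not so much that mutual repulsion spreads the stream laterally, is governed by Coulomb repulsion and bounded by how much charge a droplet can carry at all. The following Python sketch estimates both quantities for illustrative droplet sizes and spacings (the values are assumptions, not taken from the paper):

```python
import math

EPS0 = 8.854e-12      # vacuum permittivity, F/m
GAMMA_WATER = 0.072   # surface tension of water near 20 degC, N/m

def coulomb_force(q, d):
    """Repulsive force (N) between two droplets of charge q (C) at separation d (m)."""
    return q**2 / (4 * math.pi * EPS0 * d**2)

def rayleigh_limit(radius):
    """Maximum charge (C) a droplet of the given radius (m) can carry before
    electrostatic stress overcomes surface tension (the Rayleigh limit)."""
    return 8 * math.pi * math.sqrt(EPS0 * GAMMA_WATER * radius**3)

# Assumed values for illustration: 50-um-radius droplets spaced 500 um apart,
# charged to 1% of the Rayleigh limit.
r, d = 50e-6, 500e-6
q = 0.01 * rayleigh_limit(r)
print(f"charge per droplet: {q:.2e} C")
print(f"repulsion between neighbors: {coulomb_force(q, d):.2e} N")
```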
Abstract
In the development of raindrops from cloud droplets in warm rain, the collision–coalescence process is considered the main growth mechanism for droplets of unequal size greater than 20 μm in diameter. However, because of the wake effect, equal-sized droplets can also collide, provided their vertical separation does not exceed some maximum. An empirical study was performed to determine, as a function of droplet size, the maximum vertical separation at which equal-sized droplets are still influenced by the wake effect.
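The reason the wake effect is the only route to collision here is worth making explicit: equal-sized droplets have equal terminal fall speeds, so differential sedimentation cannot close the gap between them; only the reduced drag felt by a droplet falling in its leader's wake can. A minimal sketch of the equal-fall-speed point, using Stokes drag (valid for roughly d < 80 μm; the constants are standard values, not from the paper):

```python
RHO_WATER = 1000.0   # density of water, kg/m^3
G = 9.81             # gravitational acceleration, m/s^2
MU_AIR = 1.8e-5      # dynamic viscosity of air, Pa*s

def stokes_terminal_velocity(diameter_m):
    """Terminal fall speed (m/s) of a small water droplet in still air under
    Stokes drag; equal diameters give equal fall speeds."""
    return RHO_WATER * G * diameter_m**2 / (18 * MU_AIR)

for d_um in (20, 40, 80):
    v = stokes_terminal_velocity(d_um * 1e-6)
    print(f"d = {d_um:2d} um -> terminal speed {100 * v:5.2f} cm/s")
```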
Abstract
The western North Pacific Ocean is the most active tropical cyclone (TC) basin. However, recent studies are not conclusive on whether TC activity is increasing or decreasing, at least when calculations are based on maximum sustained winds. For this study, TC minimum central pressure data are analyzed in an effort to better understand historical typhoons. Best-track pressure reports are compared with aircraft reconnaissance observations; little bias is observed. An analysis of wind and pressure relationships suggests changes in data and practices at numerous agencies over the historical record. New estimates of maximum sustained winds are calculated using recent wind–pressure relationships and parameters from International Best Track Archive for Climate Stewardship (IBTrACS) data. The result suggests potential reclassification of numerous typhoons based on these pressure-based lifetime maximum intensities. Historical documentation supports these new intensities in many cases. In short, wind reports in older best-track data are likely of low quality. The annual activity based on pressure estimates is found to be consistent with aircraft reconnaissance and between agencies; however, reconnaissance ended in the western Pacific in 1987. Since then, interagency differences in maximum wind estimates noted here and by others also exist in the minimum central pressure reports. Reconciling these recent interagency differences is further hampered by the lack of adequate ground truth. This study recommends efforts to intercalibrate the agencies' intensity estimation methods. An independent and homogeneous reanalysis of past typhoon activity is likely necessary to resolve the remaining discrepancies in typhoon intensity records.
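For a concrete sense of how pressure-based intensities are derived, the classic Atkinson–Holliday (1977) wind–pressure relationship for the western North Pacific can serve as an illustration; note that the relationships actually applied in the study are more recent ones that also use IBTrACS parameters such as latitude, storm size, and motion:

```python
def atkinson_holliday_vmax(p_min_hpa, p_env_hpa=1010.0):
    """Atkinson-Holliday (1977) wind-pressure relationship for the western
    North Pacific: Vmax (kt) = 6.7 * (Penv - Pc)**0.644. Shown only as a
    classic illustration; the study applies more recent relationships."""
    dp = max(p_env_hpa - p_min_hpa, 0.0)
    return 6.7 * dp**0.644

for pc in (980, 950, 920, 890):
    print(f"Pc = {pc} hPa -> Vmax ~ {atkinson_holliday_vmax(pc):3.0f} kt")
```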
Abstract
Results are presented from two 60-yr integrations of the troposphere–stratosphere configuration of the U.K. Met. Office’s Unified Model. The integrations were set up identically, apart from different initial conditions, both of which were representative of the early 1990s. Radiative heating rates were calculated using the IS92A projected concentrations of the well-mixed greenhouse gases (GHGs) given by the Intergovernmental Panel on Climate Change, but changes in stratospheric ozone and water vapor were not included. Sea surface conditions were taken from a separate coupled ocean–atmosphere experiment. Both integrations reproduced the familiar pattern of tropospheric warming and stratospheric cooling increasing with height, to about −1.4 K per decade at 1 mb. The trends agreed well except in the polar upper stratosphere and, to a greater extent, the polar lower-to-middle stratosphere, where interannual variability is significant during the winter months. Even after decadal smoothing, the trends in the northern winter were still overshadowed by the variability resulting from planetary wave forcing from the troposphere. In general, the decadal variability of the Northern Hemisphere stratosphere was not a manifestation of a uniform change throughout each winter; rather, as in other models, there was a change in the frequency of occurrence of sudden stratospheric warmings. Unlike in previous studies, the differing results of the two simulations confirm that the change in the frequency of warmings was due to internal atmospheric variability and not to the prescribed changes in GHG concentrations or sea surface conditions. In the southern winter stratosphere the flux of wave activity from the troposphere increased, but any additional dynamical heating was more than offset by the extra radiative cooling from the growing total GHG concentration. Consequently the polar vortex became more stable, with the spring breakdown delayed by 1–2 weeks by the 2050s. Polar stratospheric cloud (PSC) amounts inferred from the predicted temperatures increased in both hemispheres, especially in early winter. In the Southern Hemisphere, the region of PSC formation expanded both upward and equatorward in response to the temperature trend.
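Inferring PSC amounts from predicted temperatures typically reduces to a threshold test: type I (nitric acid trihydrate) PSCs can form where lower-stratospheric temperatures fall below roughly 195 K. A minimal sketch of such a proxy (the threshold and sample values are illustrative, not the paper's):

```python
import numpy as np

T_NAT = 195.0  # approximate type I (NAT) PSC formation threshold, K

def psc_prone_fraction(temps_k):
    """Fraction of grid points colder than the NAT threshold; a simple
    proxy for PSC-prone area."""
    return float((np.asarray(temps_k) < T_NAT).mean())

# Illustrative polar-cap temperatures (K) near 50 hPa on one winter day
sample = np.array([192.5, 194.0, 196.5, 191.0, 198.2, 193.7])
print(f"PSC-prone fraction: {psc_prone_fraction(sample):.2f}")
```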
Abstract
Empirical orthogonal function (EOF) analysis is a powerful tool for data compression and dimensionality reduction used broadly in meteorology and oceanography. Often in the literature, EOF modes are interpreted individually, independent of other modes. In fact, it can be shown that no such attribution can generally be made. This review demonstrates that in general individual EOF modes (i) will not correspond to individual dynamical modes, (ii) will not correspond to individual kinematic degrees of freedom, (iii) will not be statistically independent of other EOF modes, and (iv) will be strongly influenced by the nonlocal requirement that modes maximize variance over the entire domain. The goal of this review is not to argue against the use of EOF analysis in meteorology and oceanography; rather, it is to demonstrate the care that must be taken in the interpretation of individual modes in order to distinguish the medium from the message.
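To make points (iii) and (iv) concrete: EOFs are obtained from the singular value decomposition of the anomaly data matrix, the modes are ordered by variance explained over the whole domain, and the resulting principal components are uncorrelated by construction, which is a much weaker property than statistical independence. A minimal sketch on synthetic data:

```python
import numpy as np

def eof_analysis(X):
    """EOF analysis of a data matrix X with shape (time, space). Returns the
    EOF spatial patterns, the principal-component (PC) time series, and the
    fraction of total variance explained by each mode."""
    Xa = X - X.mean(axis=0)            # anomalies: remove the time mean
    U, s, Vt = np.linalg.svd(Xa, full_matrices=False)
    eofs = Vt                          # rows are spatial patterns
    pcs = U * s                        # columns are PC time series
    var_frac = s**2 / np.sum(s**2)     # variance explained over the whole domain
    return eofs, pcs, var_frac

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))     # synthetic: 200 times, 50 grid points
eofs, pcs, var_frac = eof_analysis(X)

# PCs are uncorrelated by construction (off-diagonal covariances are zero),
# but uncorrelated is weaker than statistically independent (point iii).
print(np.round(np.cov(pcs[:, :3].T), 2))
print("variance explained by first 3 modes:", np.round(var_frac[:3], 3))
```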
Abstract
Tropical cloud clusters (TCCs) are traditionally defined as synoptic-scale areas of deep convection and associated cirrus outflow. They play a critical role in the energy balance of the tropics, releasing large amounts of latent heat high in the troposphere. If conditions are favorable, TCCs can develop into tropical cyclones (TCs), which put coastal populations at risk. Previous work, usually connected with large field campaigns, has investigated TCC characteristics over small areas and short time periods. Recently, developments in satellite reanalysis and global best-track assimilation have allowed for the creation of a much more extensive database of TCC activity. The authors use the TCC database to produce an extensive global analysis of TCCs, focusing on TCC climatology, variability, and genesis productivity (GP) over a 28-yr period (1982–2009). While global TCC frequency was fairly consistent over the period, with relatively small interannual variability and no noticeable trend, regional analyses show a high degree of interannual variability with clear trends in some regions. Approximately 1600 TCCs develop around the globe each year; about 6.4% of those develop into TCs. The eastern North Pacific Ocean (EPAC) basin produces the highest number of TCCs (per unit area) in a given year, but the western North Pacific Ocean (WPAC) basin has the highest GP (~12%). Annual TCC frequency in some basins correlates strongly with sea surface temperatures (SSTs), particularly in the EPAC, North Atlantic Ocean, and WPAC. However, GP is not as sensitive to SST, supporting the hypothesis that the tropical cyclogenesis process is most sensitive to atmospheric dynamical considerations such as vertical wind shear and large-scale vorticity.
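Genesis productivity is simply the fraction of TCCs that become TCs, so the abstract's global figures imply roughly 0.064 × 1600 ≈ 100 TC geneses per year. A trivial sketch (the per-basin counts below are placeholders, chosen only to match the quoted ~12% WPAC GP, not the paper's values):

```python
def genesis_productivity(n_tc, n_tcc):
    """Fraction of tropical cloud clusters that develop into tropical cyclones."""
    return n_tc / n_tcc if n_tcc else float("nan")

# Global figures from the abstract: ~1600 TCCs/yr, ~6.4% becoming TCs
print(f"implied global TC count: {0.064 * 1600:.0f} per year")

basins = {"WPAC": (26, 220), "EPAC": (16, 300)}  # (TCs/yr, TCCs/yr), assumed
for name, (tc, tcc) in basins.items():
    print(f"{name}: GP = {genesis_productivity(tc, tcc):.1%}")
```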
Abstract
This paper describes the Stratospheric Aerosol Geoengineering Large Ensemble (GLENS) project, which promotes the use of a unique model dataset, produced with the Community Earth System Model with the Whole Atmosphere Community Climate Model as its atmospheric component [CESM1(WACCM)], to investigate global and regional impacts of geoengineering. The simulations were designed to achieve multiple simultaneous climate goals by strategically placing sulfur injections at four different locations in the stratosphere, unlike many earlier studies that targeted globally averaged surface temperature by placing injections in regions at or around the equator. This approach reduces some of the previously found adverse effects of stratospheric aerosol geoengineering, including uneven cooling between the poles and the equator and shifts in tropical precipitation. The 20-member ensemble increases the ability to distinguish between forced changes and changes due to climate variability in global and regional climate variables in the coupled atmosphere, land, sea ice, and ocean system. We invite the broader community to perform in-depth analyses of climate-related impacts and to identify processes that lead to changes in the climate system as the result of a strategic application of stratospheric aerosol geoengineering.
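GLENS adjusts the injection rates from year to year with a feedback control algorithm so that the simulated climate tracks its temperature targets. A minimal proportional–integral sketch conveys the idea; the gains, target, and single-variable setup are illustrative, not the controller actually used in the project:

```python
def pi_update(target_k, observed_k, err_integral, kp=0.5, ki=0.1):
    """One annual proportional-integral update choosing next year's injection
    rate (Tg SO2/yr) from the temperature error (K). Gains and structure are
    illustrative only."""
    error = observed_k - target_k
    err_integral += error
    rate = max(kp * error + ki * err_integral, 0.0)  # injection cannot go negative
    return rate, err_integral

target = 288.0          # assumed global-mean surface temperature target, K
err_integral = 0.0
for year, observed in zip(range(2020, 2023), (288.4, 288.3, 288.2)):
    rate, err_integral = pi_update(target, observed, err_integral)
    print(f"{year}: inject {rate:.2f} Tg SO2/yr")
```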
Abstract
We compare the performance of several modes of variability across six U.S. climate modeling groups, with a focus on identifying robust improvements in recent models [including those participating in phase 6 of the Coupled Model Intercomparison Project (CMIP)] relative to previous versions. In particular, we examine the representation of the Madden–Julian oscillation (MJO), El Niño–Southern Oscillation (ENSO), the Pacific decadal oscillation (PDO), the quasi-biennial oscillation (QBO) in the tropical stratosphere, and the dominant modes of extratropical variability, including the southern annular mode (SAM), the northern annular mode (NAM) [and the closely related North Atlantic Oscillation (NAO)], and the Pacific–North American pattern (PNA). Where feasible, we explore the processes driving these improvements through “intermediary” experiments that use model versions between CMIP3/5 and CMIP6, as well as targeted sensitivity experiments in which individual modeling parameters are altered. We find clear and systematic improvements in the MJO and QBO and in the teleconnection patterns associated with the PDO and ENSO. Some gains arise from better process representation, while others (e.g., for the QBO) arise from higher resolution that allows for a greater range of interactions. Our results demonstrate that the incremental development processes at multiple climate modeling groups lead to more realistic simulations over time.
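As one example of the diagnostics involved, the ENSO state is commonly summarized by the Niño-3.4 index, the area-weighted mean SST anomaly over 5°S–5°N, 170°–120°W. A sketch of that calculation on a generic gridded anomaly field (the array layout and grid are assumptions, not a specific model's output format):

```python
import numpy as np

def nino34_index(sst_anom, lats, lons):
    """Area-weighted mean SST anomaly over the Nino-3.4 box (5S-5N, 170W-120W).
    sst_anom has shape (time, lat, lon); lons use the 0-360 convention."""
    lat_mask = (lats >= -5.0) & (lats <= 5.0)
    lon_mask = (lons >= 190.0) & (lons <= 240.0)   # 170W-120W
    box = sst_anom[:, lat_mask][:, :, lon_mask]
    w = np.cos(np.deg2rad(lats[lat_mask]))         # latitude area weights
    num = (box * w[None, :, None]).sum(axis=(1, 2))
    den = w.sum() * lon_mask.sum()
    return num / den

# Synthetic anomalies on a 2-degree grid, 12 monthly fields
lats = np.arange(-89.0, 90.0, 2.0)
lons = np.arange(0.0, 360.0, 2.0)
sst = np.random.default_rng(1).standard_normal((12, lats.size, lons.size))
print(np.round(nino34_index(sst, lats, lons), 2))
```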
Abstract
The 2011 Spring Forecasting Experiment in the NOAA Hazardous Weather Testbed (HWT) featured a significant component on convection initiation (CI). As in previous HWT experiments, the CI study was a collaborative effort between forecasters and researchers, with equal emphasis on experimental forecasting strategies and evaluation of prototype model guidance products. The overarching goal of the CI effort was to identify the primary challenges of the CI forecasting problem and to establish a framework for additional studies and possible routine forecasting of CI. This study confirms that convection-allowing models with grid spacing of ~4 km represent many aspects of the formation and development of deep convective clouds explicitly and with predictive utility. Further, it shows that automated algorithms can skillfully identify the CI process during model integration. However, it also reveals that automated detection of individual convective cells, by itself, provides inadequate guidance on the disruptive potential of deep convective activity. Thus, future work on the CI forecasting problem should be couched in terms of convection-event prediction rather than detection and prediction of individual convective cells.
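A common automated definition of CI in convection-allowing model output flags the first time a grid column's composite reflectivity exceeds a threshold, with 35 dBZ a widely used choice in this line of work. A minimal sketch of such a detector (the threshold, array layout, and sample data are illustrative):

```python
import numpy as np

def detect_ci(refl_dbz, threshold=35.0):
    """First-exceedance CI detection on simulated composite reflectivity.
    refl_dbz has shape (time, y, x). Returns, per column, the first time index
    at which reflectivity reaches the threshold, or -1 if it never does."""
    exceed = refl_dbz >= threshold
    first = np.argmax(exceed, axis=0)   # argmax returns 0 when never True...
    first[~exceed.any(axis=0)] = -1     # ...so mask those columns explicitly
    return first

refl = np.random.default_rng(2).uniform(0.0, 60.0, size=(24, 10, 10))
ci_time = detect_ci(refl)
print(f"fraction of columns initiating: {(ci_time >= 0).mean():.2f}")
```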
Abstract
One primary goal of the annual Spring Forecasting Experiments (SFEs), which are coorganized by the National Oceanic and Atmospheric Administration’s (NOAA) National Severe Storms Laboratory and Storm Prediction Center and conducted in NOAA’s Hazardous Weather Testbed, is documenting the performance characteristics of experimental, convection-allowing modeling systems (CAMs). Since 2007, the number of CAMs (including CAM ensembles) examined in the SFEs has increased dramatically, peaking at six different CAM ensembles in 2015. Meanwhile, major advances have been made in creating, importing, processing, and verifying these large and complex datasets, and in developing tools for analyzing and visualizing them. However, progress toward identifying optimal CAM ensemble configurations has been inhibited because the different CAM systems were independently designed, making it difficult to attribute differences in performance characteristics. Thus, for the 2016 SFE, a much more coordinated effort was made among many collaborators to agree on a set of model specifications (e.g., model version, grid spacing, domain size, and physics) so that the simulations contributed by each collaborator could be combined to form one large, carefully designed ensemble known as the Community Leveraged Unified Ensemble (CLUE). The 2016 CLUE comprised 65 members contributed by five research institutions and represents an unprecedented effort to enable an evidence-driven decision process to help guide NOAA’s operational modeling efforts. Eight unique experiments were designed within the CLUE framework to examine issues directly relevant to the design of NOAA’s future operational CAM-based ensembles. This article highlights the CLUE design and presents results from one of the experiments, which examined the impact of single-core versus multicore CAM ensemble configurations.
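The design move at the heart of CLUE is that member specifications are controlled, so any performance difference between two members can be attributed to the one factor that differs. A small sketch of that idea (the field names are illustrative, not the experiment's actual configuration schema):

```python
from dataclasses import dataclass, fields

@dataclass(frozen=True)
class MemberSpec:
    """Specifications held fixed across contributors (illustrative fields)."""
    model: str
    grid_spacing_km: float
    domain: str
    microphysics: str

def differing_fields(a: MemberSpec, b: MemberSpec) -> list:
    """Spec fields on which two members differ; a controlled design keeps
    this list to exactly the factor under test."""
    return [f.name for f in fields(a) if getattr(a, f.name) != getattr(b, f.name)]

m1 = MemberSpec("WRF-ARW", 3.0, "CONUS", "Thompson")
m2 = MemberSpec("WRF-ARW", 3.0, "CONUS", "Morrison")
print(differing_fields(m1, m2))  # ['microphysics'] -> attributable comparison
```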