Search Results
Showing 1 - 10 of 85 items for Author or Editor: Eric A. Smith
Abstract
An investigation of the structure and likely role of the Arabian heat low is presented in two parts. In the first paper the surface energy budget of the Arabian heat low is examined. The investigation focuses on a site within the interior of the Saudi Arabian Empty Quarter during June 1981. Automated surface stations are used to collect continuous measurements of radiative fluxes, state parameters, and the subsurface thermal profiles. These data are synthesized in order to estimate the radiation properties of the desert surface within the vortex of the Arabian heat low and to obtain an estimate of sensible heat exchange that would characterize the lower boundary of the heat low during the spring/summer transition season coinciding with the onset period of the Southwest Summer Monsoon.
Results of the analysis demonstrate how radiative exchange both controls the mean properties of the desert surface and responds to perturbations in the heat low environment. The foremost characteristic of surface energy exchange is its well-balanced diurnal regularity. It is shown how the radiation budget of the surface is modulated by basic differences between the visible (VIS) and near-infrared (NIR) portions of the solar spectrum; more than 2:1 differences are noted between the NIR and VIS surface albedos. Diurnal averages of the surface state parameters illustrate significant day-night differences associated with the diurnal pulsation of the heat low vortex. Day-night differences in surface temperature are extreme, close to 50°C. It is shown that the diurnal amplitude of surface skin temperature is poorly correlated with the bulk Richardson number, suggesting that surface heat exchange is largely controlled by direct radiative exchange through a modulating optical path rather than by heat diffusion. It is shown how the phase lag in subsurface heating imparts a skew to the diurnal sensible heat cycle. The amplitude of the sensible heating cycle is 220 W m−2, peaking approximately 40 minutes past local noon. In a daily averaged sense, subsurface heat storage is approximately zero; thus, a first-order approximation for the mean heat low budget at that time scale equates sensible heating to the negative of net radiation. Finally, it is shown how the surface energy budget responds to an intermittent intensification of the heat low that perturbs boundary layer moisture. In Part II, the results of this investigation are incorporated with other data sources in order to examine the bulk tropospheric heat exchange process within the overall heat low system.
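The daily-mean balance stated above lends itself to a simple worked example. The following Python sketch uses entirely synthetic hourly fluxes (not the paper's measurements) and a standard micrometeorological sign convention (net radiation positive toward the surface, sensible and ground heat fluxes positive away from it); it shows how the sensible heat flux obtained as a budget residual collapses to the daily-mean net radiation once subsurface storage averages to zero over the day and latent heating is neglected for the hyper-arid surface.

```python
# Hypothetical sketch: close the desert surface energy budget described above,
# treating sensible heat flux as the residual once latent heating is neglected.
# Sign convention: fluxes in W m^-2, net radiation positive toward the surface,
# sensible (H) and ground (G) fluxes positive away from the surface.
import numpy as np

def sensible_heat_residual(net_radiation, ground_heat_flux, latent_heat_flux=0.0):
    """H = Rn - G - LE; over a hyper-arid surface LE is taken as ~0."""
    return net_radiation - ground_heat_flux - latent_heat_flux

# Illustrative hourly series for one day (synthetic, not measured values).
hours = np.arange(24)
rn = 450.0 * np.sin(np.pi * (hours - 6) / 12).clip(min=0) - 80.0   # day heating, night cooling
g = 0.35 * rn - np.mean(0.35 * rn)                                  # storage that averages to ~0 daily

h = sensible_heat_residual(rn, g)

# With daily-mean storage near zero, the daily-mean sensible heating
# collapses to the daily-mean net radiation (the first-order balance above).
print(f"daily-mean G  = {g.mean():6.1f} W m^-2")
print(f"daily-mean H  = {h.mean():6.1f} W m^-2")
print(f"daily-mean Rn = {rn.mean():6.1f} W m^-2")
```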
Abstract
An investigation of the Arabian heat low is carried out based on observations from various satellites, an experimental aircraft, and a surface energy budget monitoring station. The observations suggest that during the spring period the Arabian heat low is nearly radiatively neutral and lacks the properties of an energy sink characteristic of conventional desert heat lows. Satellite-derived top-of-atmosphere radiation budget analyses illustrate the high-contrast properties of the radiative exchange fields over the southern Arabian Peninsula with respect to its surroundings. However, an examination of a four-month time series of daily averaged net radiative exchange over the Arabian Empty Quarter, derived from Nimbus-7 Earth Radiation Budget (ERB) measurements, indicates that the heat low region is in slight relative excess.
Combining these results with estimates of the surface energy budget inside the Arabian Empty Quarter (described in Part I) and previously estimated tropospheric radiative heating rate profiles provides a closed set of flux terms used to evaluate the energy exchange process within the heat low region. A synthesis of these results indicates that the heat low is a total energy source region. A conceptual structure of the heat low is offered based on a three-layer stratification of the heating mechanisms. The possible role of the Arabian heat low in controlling thermodynamic conditions and forcing baroclinicity in the western Arabian Sea is discussed. It is concluded that the surplus energy properties of the heat low may serve as an important mechanism in controlling moisture transport into the southwest monsoon rainfall regions.
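The closed set of flux terms described above amounts to a simple piece of column bookkeeping: the heat low is a total energy source if the radiative flux convergence of the atmospheric column plus the surface turbulent heating is positive, so that the excess must be exported by the circulation. The sketch below illustrates that bookkeeping with placeholder numbers; none of the values are the paper's estimates.

```python
# Hypothetical sketch of the column energy bookkeeping implied above. The
# atmospheric column over the heat low is a "total energy source" if radiative
# flux convergence plus surface sensible heating is positive.
def column_energy_source(toa_net_radiation, sfc_net_radiation, sfc_sensible_heat,
                         sfc_latent_heat=0.0):
    """All terms in W m^-2; radiation positive downward, turbulent fluxes
    positive upward. Returns the net energy gained by the atmospheric column."""
    radiative_convergence = toa_net_radiation - sfc_net_radiation
    return radiative_convergence + sfc_sensible_heat + sfc_latent_heat

# Placeholder numbers (not the paper's values), for illustration only.
source = column_energy_source(toa_net_radiation=10.0,
                              sfc_net_radiation=60.0,
                              sfc_sensible_heat=60.0)
print(f"column energy source: {source:.1f} W m^-2  (positive => net source region)")
```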
Abstract
The tropical radiation balance is investigated on an interannual time scale using a five-year (1979–83) dataset obtained from the Nimbus-7 Earth Radiation Budget (ERB) experiment. The study emphasizes the separate contributions to interannual fluctuations in the global radiation balance by the tropics and extratropics. An attempt is made to identify source regions within the tropics that give rise to the fluctuations and to quantify the effect of the fluctuations on zonal heat transport.
Superimposed on the five-year global trend pattern of net radiation are large amplitude nonseasonal variations largely confined to tropical latitudes. The significant regions are the Southwest–East Asian (SW–EA) monsoon and two regions associated with the ascent and descent branches of the Pacific Walker Cell. A “cloud reciprocity index” is formulated in order to examine the degree to which extended cloud systems over the oceanic tropics can induce these interannual fluctuations in the radiation balance. The SW–EA monsoon and the eastern Pacific exhibit low-index patterns, suggesting that these are the two dominant sources of the anomalies.
The impact of the fluctuations is examined in terms of external entropy exchange (EEE). Paltridge's theory that climate fluctuations are controlled by a minimum EEE constraint is partially supported. The impact of tropical fluctuations on zonal heat transport is examined. The amplitudes in the year-to-year tropical transport residuals are found to be as high as 50% of, and generally out of phase with, the total global residual. The SW–EA monsoon and the eastern Pacific can explain a large portion of the total tropical residual during specific years.
Simultaneous and lagged spatial correlation analyses are used to determine the degree to which the radiative anomalies associated with the SW–EA monsoon region are coupled to other centers of variability. The simultaneous correlations with net radiation are dissimilar to those found with the albedo and outgoing longwave radiation, particularly in terms of seasonal forcing. The organization of lagged albedo anomaly correlation patterns suggests that predictive indicators of SW–EA monsoon behavior may be found in the tropical ocean basins.
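As a rough illustration of the lagged spatial correlation analysis mentioned above, the following sketch correlates a single anomaly index against every grid point of an anomaly field at a specified lag, using synthetic data; the function name and lag convention are illustrative assumptions, not the paper's.

```python
# Minimal sketch (synthetic data) of a lagged anomaly correlation map: correlate
# a regional radiation-anomaly index against every grid point of an anomaly field.
import numpy as np

def lagged_correlation_map(index_series, anomaly_field, lag):
    """index_series: (t,), anomaly_field: (t, lat, lon); positive lag means the
    field leads the index by `lag` time steps. Returns a (lat, lon) map of r."""
    if lag > 0:
        x, y = index_series[lag:], anomaly_field[:-lag]
    elif lag < 0:
        x, y = index_series[:lag], anomaly_field[-lag:]
    else:
        x, y = index_series, anomaly_field
    x = (x - x.mean()) / x.std()
    y = (y - y.mean(axis=0)) / y.std(axis=0)
    return np.einsum("t,tij->ij", x, y) / len(x)

# Synthetic example: 60 months, 10 x 20 grid, with a buried signal that leads
# the index by 3 months at one grid point.
rng = np.random.default_rng(0)
index = rng.standard_normal(60)
field = rng.standard_normal((60, 10, 20))
field[:, 4, 9] += np.roll(index, -3)
rmap = lagged_correlation_map(index, field, lag=3)
print(f"max |r| at lag 3: {np.abs(rmap).max():.2f}")
```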
Abstract
GOES-8 thermal infrared split window measurements have been used with a simultaneous land surface temperature (LST)–spectral emissivity retrieval algorithm to examine the potential of a combined retrieval methodology cast into a variational solution for temperatures at multiple but short-term 6- to 24-h time intervals and emissivities at multiple spectral bands assumed to be invariant over the selected time intervals. Retrieved LST and emissivity quantities under differing atmospheric conditions over an annual cycle are validated and analyzed in regard to their underlying diurnal and seasonal variations over the Department of Energy’s Atmospheric Radiation Measurement–Cloud and Radiation Test Bed (ARM–CART) site in Kansas and Oklahoma.
It is shown that the accuracy of the retrieval algorithm depends primarily on GOES infrared channel detector noise and uncertainties in the columnar water vapor path, with retrieval accuracy increasing as the pathlength decreases. A detailed analysis is given of the characteristic temporal–spatial gradient structures of LSTs and emissivities over the ARM–CART domain at point-to-area space scales and diurnally to seasonally varying timescales. Emphasis is given to explaining the relationship of heterogeneous features in the retrievals to physical attributes of the landscape, that is, ecotones and phenology, and to the effects of prior cloudiness on subsequent LSTs.
Abstract
A combined land surface temperature–emissivity retrieval algorithm is developed and tested for Geostationary Operational Environmental Satellite (GOES)-Imager and National Oceanic and Atmospheric Administration Advanced Very High Resolution Radiometer (AVHRR) split-window channels. By assuming that the spectral emissivities are constant over a short time period (12–24 h), two sets of split-window radiance measurements taken at two different times are used to retrieve two spectral emissivities and two land surface temperatures (LSTs) simultaneously. The algorithm employs an optimization scheme rather than a direct solver for a system of equations because of constraint requirements. The retrieved variables minimize the rms differences between measured satellite radiances and those predicted by a spectrally detailed radiative transfer model.
A GOES-8 version of the algorithm is validated with in situ radiometer measurements from the Department of Energy’s Atmospheric Radiation Measurement Program Cloud and Radiation Testbed (ARM CART) site. In addition, an AVHRR version is validated with in situ measurements from the First ISLSCP Field Experiment (FIFE) site, the Hydrological Atmospheric Pilot Experiment–Sahel (HAPEX–Sahel) site, and an LST validation site operated near Melbourne, Australia. The biases of the retrieved LSTs for the validation sites in the Australian, FIFE, and ARM CART study areas are approximately 0.08°, 1.7°, and 1.4°C, respectively, yielding an overall bias error of better than half the current expected accuracy limit of some ±3°C. The associated bias-adjusted rmse differences are approximately 0.78°, 4.8°, and 4.5°C, respectively, mostly driven by intercomparing in situ point measurements to area-integrated satellite pixel retrievals. The bias-adjusted rmse differences for HAPEX–Sahel are larger (5° and 11°C), resulting from incomplete characterization of site heterogeneity, insufficient radiosonde launch frequency, and poor data quality of the temperature–moisture soundings, rather than intrinsic algorithm problems. Notably, the averaged retrieved emissivities for the trouble-free sites are within the expected range of emissivities for vegetated surfaces.
The GOES-8 retrieved LSTs exhibit small-amplitude, high-frequency noise and a daily error cycle when compared to in situ measurements. The noise is attributed to random detector errors in the satellite observations, for which the channel 4 noise-equivalent temperature difference is larger than that of channel 5. The systematic differences between validation measurements and retrievals are near zero during nighttime but exhibit a small semidiurnal oscillation during daytime. Notwithstanding a possible semidiurnal bias in the pyrgeometer validation measurements associated with imperfect solar dome heating corrections, plus unaccounted-for attenuation between the surface and pyrgeometer, the latter error cycle is attributed to a too-coarse sampling of the nonlinear diurnal evolution of the thermodynamic structure of the atmospheric boundary layer, particularly near the sunrise and sunset transition times. Thus, sounding frequency determines the error characteristics of the nonlinearly evolving split-window weighting functions.
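The core of the two-time retrieval described in this abstract can be sketched compactly: holding the two channel emissivities fixed across two observation times yields four measurements for four unknowns, which can be solved by least squares against a forward model. The Python sketch below uses a deliberately simplified forward model with placeholder transmittances and path radiances (not the paper's detailed radiative transfer model), so it only illustrates the structure of the inversion, not its accuracy.

```python
# Hedged sketch of a two-time, two-channel split-window retrieval of
# (eps1, eps2, LST_t1, LST_t2) by least squares. Atmospheric terms are placeholders.
import numpy as np
from scipy.optimize import least_squares

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance (W m^-2 sr^-1 m^-1) at a single wavelength."""
    a = 2.0 * H * C**2 / wavelength_m**5
    return a / (np.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0)

WAVELENGTHS = np.array([10.7e-6, 12.0e-6])          # split-window channel centres
TAU = np.array([[0.85, 0.80], [0.88, 0.83]])         # transmittance, shape (time, channel)
L_UP = np.array([[0.9e6, 1.2e6], [0.7e6, 1.0e6]])    # upwelling path radiance (placeholder)
L_DOWN = np.array([[1.1e6, 1.4e6], [0.9e6, 1.2e6]])  # downwelling sky radiance (placeholder)

def forward(eps, lst):
    """Top-of-atmosphere radiance for each (time, channel)."""
    surf = eps * planck_radiance(WAVELENGTHS, lst[:, None])
    refl = (1.0 - eps) * L_DOWN
    return TAU * (surf + refl) + L_UP

def retrieve(obs):
    def residual(x):
        eps, lst = x[:2], x[2:]
        return (forward(eps, lst) - obs).ravel()
    x0 = np.array([0.96, 0.96, 300.0, 300.0])
    sol = least_squares(residual, x0, bounds=([0.8, 0.8, 250, 250], [1.0, 1.0, 340, 340]))
    return sol.x

# Simulate "observations" from a known truth, then recover it.
truth_eps, truth_lst = np.array([0.955, 0.970]), np.array([310.0, 295.0])
obs = forward(truth_eps, truth_lst)
print(retrieve(obs))   # approximately [0.955, 0.970, 310.0, 295.0]
```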
Abstract
In a two-part study we investigate the impact of time-dependent cloud microphysical structure on the transfer to space of passive microwave radiation at several frequencies across the EHF and lower SHF portions of the microwave spectrum in order to explore the feasibility of using multichannel passive-microwave retrieval techniques for the estimation of precipitation from space-based platforms.
A series of numerical sensitivity experiments have been conducted that were designed to quantify the impact of an evolving cumulus cloud in conjunction with a superimposed rain layer on the transfer to space of microwave radiation emitted and scattered from the cloud layers, rain layer and the underlying surface. The specification of cloud microphysics has been based on the results of a time-dependent two-dimensional numerical cumulus model developed by Hall (1980). An assortment of vertically homogeneous rain layers, described by the Marshall-Palmer rain drop distribution, has been inserted in the model atmosphere to simulate the evolution of rainfall in a precipitating cumulus cell. The effects of ice hydrometeors on upwelling brightness temperatures have been studied by placing several types of ice canopies over the cloud and rain layers. Both rough ocean and land backgrounds have been considered. The top-of-atmosphere brightness temperatures have been computed by means of a vertically and angularly detailed plane-parallel radiative transfer model for unpolarized microwave radiation.
Part I describes the modeling framework. In addition, it provides a detailed description of the single-scattering properties of the hydrometeors (model-cloud water drops, ice crystals, and rain drops) in order to evaluate each component's role in influencing the upwelling radiation to space. We demonstrate that cloud water can have a major impact on the upwelling microwave radiation originating from both the surface and a rain layer placed below cloud base. The radiative properties of the model cloud are shown to be significantly different from those of an equivalent Marshall-Palmer treatment. It is the appearance of the large-drop mode (r > 100 μm) of the cumulus cloud drop distribution function that marks the point at which cloud drops begin to attenuate the microwave signals, even at the lower frequencies, which are normally considered to be largely unaffected by cloud processes alone. It is shown that at the early stages of cloud evolution the model cloud acts mainly through absorption/emission processes. As the cloud develops, however, scattering plays an ever-increasing role. It is also demonstrated that the relative contribution to absorption/emission by the small-drop mode (r < 100 μm) of the cloud is always significant. It is concluded that the vertical variation of the microphysical structure of the rain cloud plays an important role in the interpretation of passive microwave rainfall signatures and thus should be considered in precipitation retrieval algorithms.
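For reference, the Marshall-Palmer raindrop spectrum invoked above has the exponential form N(D) = N0 exp(−ΛD), with N0 = 8 × 10^6 m^−4 and Λ = 4.1 R^−0.21 mm^−1 for a rain rate R in mm h−1. The sketch below evaluates the distribution and its analytic liquid water content for a few rain rates; it is included only as a worked example of the assumed drop spectrum, not as part of the paper's radiative transfer calculations.

```python
# Worked example of the Marshall-Palmer raindrop spectrum and its liquid water
# content (third moment of the exponential distribution).
import numpy as np

N0 = 8.0e6          # intercept parameter, m^-4 (Marshall & Palmer 1948)
RHO_W = 1000.0      # density of liquid water, kg m^-3

def mp_lambda(rain_rate_mm_per_h):
    """Slope parameter in m^-1: 4.1 * R^-0.21 mm^-1 converted to m^-1."""
    return 4.1 * rain_rate_mm_per_h ** (-0.21) * 1.0e3

def mp_number_density(diameter_m, rain_rate_mm_per_h):
    """Drops per m^3 per m of diameter."""
    return N0 * np.exp(-mp_lambda(rain_rate_mm_per_h) * diameter_m)

def mp_liquid_water_content(rain_rate_mm_per_h):
    """kg m^-3, from the analytic third moment: W = pi * rho_w * N0 / Lambda^4."""
    lam = mp_lambda(rain_rate_mm_per_h)
    return np.pi * RHO_W * N0 / lam ** 4

for rr in (1.0, 4.0, 10.0):
    print(f"R = {rr:4.1f} mm/h  ->  LWC = {1e3 * mp_liquid_water_content(rr):.2f} g m^-3")
```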
Abstract
The time-dependent influence of cloud liquid water, in conjunction with its vertical heterogeneities, on top-of-atmosphere (TOA) passive microwave brightness temperatures is investigated. A cloud simulation is used to specify the microphysical structure of an evolving cumulus cloud growing toward the rain stage. A one-dimensional multistream solution to the radiative transfer equation is used to study the upwelling radiation at the top of the atmosphere arising from the combined effect of cloud, rain, and ice hydrometeors. Calculations are provided at six window frequencies and one H2O resonance band within the EHF/SHF microwave spectrum. Vertically detailed transmission functions are used to help delineate the principal radiative interactions that control TOA brightness temperatures. Brightness temperatures are then associated with a selection of microphysical situations that reveal how an evolving cloud medium attenuates rainfall and surface radiation. The investigation is primarily designed to study the impact of cloud microphysics on space-based measurements of passive microwave signals, specifically as they pertain to the retrieval of precipitation over water and land backgrounds.
Results demonstrate the large degree to which the relationship between microwave brightness temperature (BT) and rainrate (RR) can be altered purely by cloud water processes. The relative roles of the cloud and rain drop spectra in emissive contributions to the upwelling radiation are assessed with a normalized absorption index, which removes effects due purely to differences in the magnitudes of the cloud and rain liquid water contents. This index is used to help explain why the amplitudes of the BT-RR functions decrease with respect to cloud evolution time and why below-cloud precipitation is virtually masked from detection at the TOA.
Although cloud water tends to obscure BT-RR relationships, it does so in a differential manner with respect to frequency, suggesting that the overall impact of cloud water is not necessarily debilitating to precipitation retrieval schemes. Furthermore, it is shown how a “surface of probability” can be defined that contains an optimal time-dependent BT-RR function associated with an evolving cloud at a given frequency and removes ambiguities within the BT-RR functions at the critical retrieval frequencies. The influence of a land surface having varying emissivity characteristics is also examined in the context of an evolving cloud to show how the time-dependent cloud microphysics modulates the sign and magnitude of brightness temperature differences between various frequencies.
Model results are assessed in conjunction with a Nimbus-7 SMMR case study of precipitation within an intense tropical Pacific storm. It is concluded that in order to obtain a realistic estimation and distribution of rainrates, the effects of cloud liquid water content must be considered.
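To make the masking effect described above concrete, the following sketch evaluates TOA brightness temperature in the simplest absorption/emission-only (non-scattering) limit, with reflected downwelling neglected and with placeholder layer temperatures and optical depths. It is not the multistream model used in the paper, but it shows how increasing cloud/rain optical depth progressively replaces the radiometrically cold ocean surface signal with warmer cloud emission.

```python
# Deliberately simplified, absorption/emission-only sketch (no scattering,
# reflected downwelling neglected) of how intervening cloud and rain layers
# modulate TOA brightness temperature over a low-emissivity ocean surface.
# Layer temperatures and optical depths are placeholders, not model output.
import numpy as np

def toa_brightness_temperature(surface_temp, surface_emissivity,
                               layer_temps, layer_optical_depths):
    """Layers ordered from the surface upward; Rayleigh-Jeans / non-scattering limit."""
    tau_above = np.cumsum(layer_optical_depths[::-1])[::-1] - layer_optical_depths
    transmittance_above = np.exp(-tau_above)
    layer_emission = (1.0 - np.exp(-layer_optical_depths)) * layer_temps
    tb_surface = surface_emissivity * surface_temp * np.exp(-layer_optical_depths.sum())
    return tb_surface + np.sum(layer_emission * transmittance_above)

temps = np.array([285.0, 270.0, 255.0])            # rain layer, cloud, ice canopy (K)
# Growing cloud optical depth masks the (radiometrically cold) ocean surface.
for tau_cloud in (0.05, 0.3, 1.0):
    taus = np.array([0.2, tau_cloud, 0.02])
    tb = toa_brightness_temperature(300.0, 0.5, temps, taus)
    print(f"cloud tau = {tau_cloud:4.2f}  ->  TB = {tb:6.1f} K")
```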
Abstract
A quantitative investigation of the relationship between satellite-derived cloud-top temperature parameters and the detection of intense convective rainfall is described. The area of study is that of the Cooperative Convective Precipitation Experiment (CCOPE), which was held near Miles City, Montana, during the summer of 1981. Cloud-top temperatures, derived from the GOES-West operational satellite, were used to calculate a variety of parameters for objectively quantifying the convective intensity of a storm. A dense network of rain gauges provided verification of surface rainfall. The cloud-top temperature field and surface rainfall data were processed into equally sized grid domains in order to best depict the individual samples of instantaneous precipitation.
The technique of statistical discriminant analysis was used to determine which combinations of cloud-top temperature parameters best classify rain versus no-rain occurrence using three different rain-rate cutoffs: 1, 4, and 10 mm h−1. Time lags within the 30 min rainfall verification were tested to determine the optimum time delay associated with rainfall reaching the ground.
A total of six storm cases were used to develop and test the statistical models. Discrimination of rain events was found to be most accurate when using a 10 mm h−1 rain-rate cutoff. The parameters designated as the coldest cloud-top temperature, the spatial mean of the coldest cloud-top temperatures, and the change over time of the mean coldest cloud-top temperature were found to be the best classifiers of rainfall in this study. Combining a 10-min time lag (in terms of surface verification) with a 10 mm h−1 rain-rate threshold resulted in classifying over 60% of all rain and no-rain cases correctly.
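As an illustration of the classification approach, the sketch below applies linear discriminant analysis to synthetic cloud-top temperature features loosely patterned on the three parameters named above; the data, feature values, and accuracy it prints are fabricated for demonstration and bear no relation to the study's results.

```python
# Compact sketch (synthetic data, hypothetical feature values) of rain vs.
# no-rain classification from cloud-top temperature parameters via
# linear discriminant analysis.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(42)
n = 400

# Features: coldest cloud-top temperature (K), spatial mean of the coldest
# tops (K), and its change over time (K per 30 min) -- illustrative values only.
rain = rng.integers(0, 2, n)                       # 1 = rain rate above threshold
coldest = np.where(rain, rng.normal(210, 8, n), rng.normal(235, 10, n))
mean_coldest = coldest + rng.normal(8, 3, n)
cooling_rate = np.where(rain, rng.normal(-6, 3, n), rng.normal(-1, 3, n))
X = np.column_stack([coldest, mean_coldest, cooling_rate])

clf = LinearDiscriminantAnalysis().fit(X[:300], rain[:300])
accuracy = clf.score(X[300:], rain[300:])
print(f"held-out rain/no-rain classification accuracy: {accuracy:.2f}")
```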