Search Results

You are looking at 1 - 10 of 15 items for

  • Author or Editor: H. Lee Kyle
Harbans L. Dhuria and H. Lee Kyle

Abstract

Nimbus-7 cloud and earth radiation budget data are compared in a study of the effects of clouds on the tropical radiation budget. The data consist of daily averages over fixed (500 km)² target areas, and the months of July 1979 and January 1980 were chosen to show the effect of seasonal changes. Six climate regions, consisting of 14 to 24 target areas each, were picked for intensive analysis because they exemplify the range in the tropical cloud/net radiation interactions. The normal analysis was to consider net radiation as the independent variable and examine how cloud cover, cloud type, albedo, and emitted radiation vary with the net radiation. Two themes recur on a local, regional, and zonal basis: the net radiation is strongly influenced by the average cloud type and amount present, but most net radiation values could be produced by several combinations of cloud types and amounts.

The regions of highest net radiation (greater than 125 W m−2) tend to have medium to heavy cloud cover. In these cases, thin medium-altitude clouds predominate. Their cloud tops are normally too warm to be classified as cirrus by the Nimbus cloud algorithm. A common feature of the tropical oceans is large regions where the total regional cloud cover varies from 20% to 90% but with little regional difference in the net radiation. The monsoon and rain areas are high net radiation regions. The deep convective storm centers tend to have low, often highly negative, net radiation, but these are surrounded by large areas of high net radiation covered by thin medium- and high-level clouds. Large regional differences in the net radiation caused by varying cloud cover and type do, however, occur over the tropical oceans. But the most noticeable difference is between continental and ocean regions. The net radiation is considerably higher over the oceans. The largest longitudinal variations in net radiation in July and January occur in regions of high solar insolation somewhat poleward of the subsolar point. Over the oceans, net radiation maxima are associated with an average cloud cover of low-albedo clouds. Some or most of these clouds may actually be high-altitude cirrus clouds with emissivity less than one, although in the Nimbus-7 cloud cover dataset most are identified as midaltitude clouds. The ocean net radiation minima are associated with bright low-altitude clouds. The largest differences, over 100 W m−2, are between the ocean maxima and the deep minima over the continental deserts. Over the deserts during the summer, however, cloud variations appear less important than regional variations in the surface albedo.
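The net radiation discussed throughout is the top-of-the-atmosphere (TOA) balance between absorbed solar and emitted longwave flux. A minimal sketch of this bookkeeping, with illustrative values (the fluxes and albedos below are hypothetical, not from the paper), shows how different cloud combinations can yield very different, or very similar, net radiation under the same insolation:

    # TOA net radiation: R_net = S * (1 - albedo) - OLR, where S is the
    # daily mean incident solar flux, albedo the broadband reflectance,
    # and OLR the outgoing longwave radiation. Illustrative values only.
    def net_radiation(solar_in, albedo, olr):
        """TOA net radiation in W m-2."""
        return solar_in * (1.0 - albedo) - olr

    print(net_radiation(400.0, 0.45, 230.0))  # bright low cloud: -10 W m-2
    print(net_radiation(400.0, 0.25, 210.0))  # thin mid/high cloud: +90 W m-2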

Full access
Philip E. Ardanuy and H. Lee Kyle

Abstract

The Earth Radiation Budget experiment, launched aboard the Nimbus-7 polar-orbiting spacecraft in late 1978, has now taken over seven years of measurements. The dataset, which is global in coverage, consists of the individual components of the Earth's radiation budget, including longwave emission, net radiation, and both total and near-infrared albedos. Starting some six months after the 1982 eruption of the El Chichón volcano, substantial long-lived positive shortwave irradiance anomalies were observed by the experiment in both the northern and southern polar regions. Analysis of the morphology of this phenomenon indicates that the cause is the global stratospheric aerosol layer that formed from the cloud of volcanic effluents. There was little change in the emitted longwave in the polar regions. At the north pole the largest anomaly was in the near-infrared, but at the south pole the near UV-visible anomaly was larger. Assuming an exponential decay, the time constant for the north polar, near-infrared anomaly was 1.2 years. At mid- and low latitudes the effect of the El Chichón aerosol layer could not be separated from the strong reflected-shortwave and emitted-longwave perturbations issuing from the El Niño/Southern Oscillation event of 1982–83.
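The quoted time constant corresponds to the simple exponential model named in the abstract. A minimal sketch, assuming A(t) = A0 exp(−t/τ) with the abstract's τ = 1.2 years (the amplitude A0 is an arbitrary placeholder):

    import math

    tau = 1.2   # e-folding time in years (from the abstract)
    A0 = 1.0    # initial anomaly amplitude (arbitrary units)

    for t in (0.0, 1.2, 2.4, 3.6):
        print(t, A0 * math.exp(-t / tau))
    # The anomaly falls to ~37% of its initial value after one time
    # constant and to ~5% after three.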

Full access
H. Lee Kyle and K. L. Vasanth

Abstract

Broad spectral band Nimbus-7 Earth Radiation Budget (ERB) experiment data are analyzed for top-of-the-atmosphere regional variations in near-ultraviolet visible and near-infrared reflected solar radiation. Regional differences in the noon versus midnight outgoing longwave flux are also studied, as is the difference in land and ocean net radiation budgets. Temporal sampling problems are discussed. The annual behavior is examined for a year (June 1979 through May 1980) on a global scale and for five selected study areas, and reasonable agreement is found with the results of previous investigators. The studies show a marked difference in behavior between oceanic and continental regions. The annual global total, near-ultraviolet visible, and near-infrared albedo values obtained were, respectively, 30.2%, 34.6%, and 25.9%. Over the continents, however, the near-IR albedo was often the largest, although clouds and snow sharply decrease the near-IR albedo over land. Over the oceans, the average noon and midnight outgoing longwave-flux densities are nearly identical but with regional and seasonal differences of several watts per square meter. Over the continents, the noon-emitted flux density averages were 15–25 W m−2 larger than the midnight values but with large regional and seasonal variations. The annually averaged global net radiation derived from the ERB scanner is −23.8 W m−2 for the land areas, +6.5 W m−2 for ocean areas, and −3.2 W m−2 total.
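The reported total albedo is roughly the insolation-weighted mean of the two spectral albedos. As a rough consistency check, assuming (this split is our assumption, not the abstract's) that about half the solar energy falls in each band:

    # Insolation-weighted mean of the spectral albedos; the 50/50 split
    # of solar energy between the two bands is an assumed approximation.
    w_uvvis, w_nir = 0.5, 0.5
    albedo_uvvis, albedo_nir = 34.6, 25.9               # percent, from the abstract
    print(w_uvvis * albedo_uvvis + w_nir * albedo_nir)  # ~30.3%, near the 30.2% total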

Full access
Philip E. Ardanuy and H. Lee Kyle

Abstract

Five years of broad-band earth radiation budget measurements taken by the Nimbus-7 ERB experiment have been archived. This period encompasses the 1982/83 El Niño/Southern Oscillation event, which reached a peak near the beginning of the fifth data year (January 1983). A 41-month outgoing longwave radiation subset of this data set, extending from June 1980 through October 1983, has been further processed to enhance the spatial resolution.

Analysis of the resultant fields and the anomalies from the pre-El Niño climatology provides the first broad-band glimpse of the terrestrial outgoing longwave radiative response to the El Niño event throughout its life cycle. Of particular interest are the quasi-stationary planetary-scale tropical and midlatitude patterns that emerge as the El Niño reaches its peak intensity. Important new implications for the vertical motion field are addressed.
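A minimal sketch of the anomaly computation described here: build a monthly climatology from the pre-El Niño portion of the record and subtract the matching calendar-month mean from every month. Array names, shapes, and the 24-month baseline are assumptions for illustration:

    import numpy as np

    def monthly_anomalies(olr, n_baseline=24):
        """olr: monthly mean OLR fields, shape (n_months, lat, lon)."""
        clim = np.empty((12,) + olr.shape[1:])
        for m in range(12):
            # Mean over the baseline years for this calendar month.
            clim[m] = olr[m:n_baseline:12].mean(axis=0)
        months = np.arange(olr.shape[0]) % 12
        return olr - clim[months]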

Full access
H. Lee Kyle, Mitchell Weiss, and Philip Ardanuy

Abstract

Quasi-biennial global, midlatitude, and tropical oscillations were observed using top-of-the-atmosphere outgoing longwave radiation (OLR), surface air temperature (SAT), and cloud amount for the period from 1979 to 1989. The in-phase quasi-biennial variations of OLR and SAT were strongest in the Tropics. Two prominent peaks in these two data fields were observed after the end of the main phases of the 1982–83 and 1986–87 El Niño-Southern Oscillation (ENSO) events, which were also accompanied by a decrease in the mean tropical cloud cover. The quasi-biennial signal was less noticeable in the midlatitudes during the two ENSO events but was strong during two non-ENSO peaks occurring in 1980–81 and 1989–90. In this study, the authors used two SAT datasets composed of departures estimated from a specific base period, with records that predate the start of this century. The OLR dataset was obtained by concatenating Nimbus-7 (1979–87) and Earth Radiation Budget Satellite (ERBS) (1985–89) measurements. The cloud dataset was generated by concatenating Nimbus-7 (1979–84) estimates with those from the International Satellite Cloud Climatology Program (ISCCP) (1983–90). In the concatenation procedure, adjustments were made for previously identified, long-term nonphysical data trends in the Nimbus-7 datasets; some additional experiments were made in which detrending was applied to all the datasets. As a consequence, decade-long trends were not considered. In the detrended datasets, OLR and SAT were strongly positively correlated, with explained variances of 74.5% or larger in the Tropics and midlatitudes and of 96.4% on a global scale. OLR and cloud were negatively correlated; however, the results were less definitive, with explained variances of 36.7% and 79.4%, respectively, for the globe and midlatitudes but only 17.4% in the Tropics. These observations imply a clear identification of a quasi-biennial signal and a relationship between OLR and SAT, but the results are less certain when OLR is compared to cloud cover.
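The explained variances quoted are the squared correlations between the detrended series. A minimal sketch, assuming simple linear least-squares detrending (the paper's detrending details may differ):

    import numpy as np

    def explained_variance(x, y):
        """Percent of variance shared by x and y after linear detrending."""
        t = np.arange(len(x))
        # Remove a linear least-squares trend from each series.
        x_d = x - np.polyval(np.polyfit(t, x, 1), t)
        y_d = y - np.polyval(np.polyfit(t, y, 1), t)
        r = np.corrcoef(x_d, y_d)[0, 1]
        return 100.0 * r**2   # e.g., r = 0.86 gives ~74%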

Full access
Philip E. Ardanuy, H. Lee Kyle, and Hyo-Duck Chang

Abstract

The Nimbus-7 satellite has been in a 955-km, sun-synchronous orbit since October 1978. The Earth Radiation Budget (ERB) experiment has taken approximately 8 years of high-quality data during this time, of which 7 complete years have been archived at the National Space Science Data Center. A final reprocessing of the wide-field-of-view channel dataset is underway. Error analyses indicate a long-term stability of 1% or better over the length of the data record.

As part of the validation of the ERB measurements, the archived 7-year Nimbus-7 ERB dataset is examined for the presence and accuracy of interannual variations, including the Southern Oscillation signal. Zonal averages of broadband outgoing longwave radiation indicate a terrestrial response of more than 2 years to the oceanic and atmospheric manifestations of the 1982–83 El Niño/Southern Oscillation (ENSO) event, especially in the tropics. This signal is present in monthly and seasonal averages and is shown here to derive primarily from atmospheric responses to adjustments in the Pacific Ocean. The calibration stability of this dataset thus provides a powerful new tool to examine the physics of the ENSO phenomenon.
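The zonal averages referred to here are means over longitude, and any global figure built from them weights each latitude band by its area. A minimal sketch with assumed array conventions (a regular latitude-longitude grid):

    import numpy as np

    def zonal_means(field):
        """Mean over longitude for each latitude; field has shape (lat, lon)."""
        return field.mean(axis=1)

    def global_mean(field, lats_deg):
        """Area-weighted global mean: weight each band by cos(latitude)."""
        w = np.cos(np.radians(lats_deg))
        return np.sum(zonal_means(field) * w) / np.sum(w)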

Full access
Philip E. Ardanuy, H. Lee Kyle, and Douglas Hoyt

Abstract

The analyses of Cess are extended to consider global relationships among the earth's radiation budget (including solar insolation and changes in optically active gases), cloudiness, solar constant, volcanic aerosols, and surface temperature. Interannual variability and correlations between Nimbus-7 THIR/TOMS cloud amount; ERB WFOV longwave, shortwave, and net radiation; and SAM II aerosol optical depths, along with Hansen and Lebedeff's surface temperature analyses, are assessed.

Solar luminosity is apparently related to the global surface temperature in the 1979–1990 time period based on the Nimbus-7 observations and an extended Hansen and Lebedeff temperature dataset. The 0.40°C range in observed global temperatures may be partitioned into a 0.15°C component due to a 2 W m−2 change in the solar constant and a 0.22°C component due to the increasing concentration of CO2 and other greenhouse gases. A relatively large component of the variance in the global temperature, cloudiness, and radiation budget signals is due to interannual earth system variability over time periods much shorter than a solar cycle (e.g., 2–4 years), for which the solar luminosity experiences no comparable fluctuation.
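The solar-constant component implies a climate sensitivity that can be checked with simple radiative bookkeeping. The planetary albedo of 0.3 and the spherical-averaging factor of 4 below are standard assumptions, not values from the abstract:

    # Forcing from a 2 W m-2 solar-constant change, spread over the
    # sphere and reduced by the reflected fraction.
    d_solar = 2.0                        # W m-2 (from the abstract)
    forcing = d_solar * (1 - 0.3) / 4.0  # ~0.35 W m-2 at the TOA
    d_temp = 0.15                        # deg C component (from the abstract)
    print(d_temp / forcing)              # implied sensitivity ~0.43 deg C per W m-2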

The Nimbus-7 observations indicate that the global, annual cloud amount varies by +0.3% to −0.5% with a pronounced quasi-biennial periodicity and is inversely proportional to the outgoing longwave flux and surface temperature. The time dependence of aerosols injected into the stratosphere by the explosive 1982 eruption of El Chichón is found to be important, along with the global cloud amount, in describing the time dependence of the earth's albedo during the period.

The sign of the relationship between the earth's surface temperature and the net radiation is of fundamental importance. The Nimbus-7 ERB net radiation observations compared to surface temperature analyses imply a stable climate (at least about some set point that is dictated by other conditions, such as the concentration of CO2 and other greenhouse gases, that do not apply over the relatively short time interval considered here).

When considering future missions, we conclude that reliable and well-characterized satellite datasets of ideally one to two decades or more are required to perform quantitative analyses of the relationships among different elements of the earth's climate system. To accomplish this, the instruments’ calibration should be maintained and validated to a stability that permits the analysis of interannual global fluctuations at the 0.2% level.

Full access
Richard R. Hucek, Philip Ardanuy, and H. Lee Kyle

Abstract

The results of a constrained, wide field-of-view (WFOV) radiometer measurement deconvolution are presented and compared against higher resolution results obtained from the Earth Radiation Budget (ERB) Experiment on the Nimbus-7 satellite and from the Earth Radiation Budget Experiment (ERBE). The method is applicable to both longwave and shortwave observations and is specifically designed to treat the problem of anisotropic reflection and emission at the top of the atmosphere (TOA), and low signal-to-noise ratios that arise regionally within the observation field. The latter occur, for example, near the earth's terminator where measured WFOV shortwave signals contain increasing percentages of instrument and modeling errors. Ridge regression and meridional smoothing are used to quell the resulting “local” instability and permit the recovery of a global solution. An optimized retrieval is obtained by tuning the constraints until the recovered solution matches, as well as possible, a known higher resolution product or, lacking that, until unacceptable features in the recovered field no longer appear. The latter approach leads to a set of weight factors that depend on the length of the sampling period and on the desired parameter field, but not on the calendar date. A 1-year study dataset, July 1983 through June 1984, as well as data for the individual months of April 1980 and 1985 have been processed using a preliminary version of these algorithms. Representative deconvolved fields of mean daily longwave flux and albedo are shown for monthly and 8-day inversion periods. When compared to ERB scanner data (April 1980) within 63° of the equator, the WFOV deconvolved solution reduces the RMS error of the WFOV archived results by 31% for longwave flux and 10% for shortwave flux. When compared to the ERBE data of April 1985 over the same domain, error reductions of 25% and 5% are obtained, respectively, for the longwave and shortwave fluxes.
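At the core of the method is a ridge-regularized linear inversion. A minimal sketch under the usual model y = A x + noise, where y holds the WFOV measurements, x the TOA flux field, and A the sensor's angular response; the construction of A and the meridional-smoothing constraint are omitted, so this illustrates the technique rather than the authors' production algorithm:

    import numpy as np

    def ridge_deconvolve(A, y, lam):
        """Solve min ||A x - y||^2 + lam * ||x||^2 via the normal equations."""
        n = A.shape[1]
        return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)

    # As in the paper, lam is tuned until the recovered field matches a
    # higher-resolution product or is free of unacceptable artifacts.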

Full access
H. Lee Kyle, Richard Hucek, Philip Ardanuy, Lanning Penn, John Hickey, and Brian Groveman

Abstract

Much of the early record of spectrally broadband earth radiation budget (ERB) measurements was taken by the ERB instrument launched on the Nimbus-7 spacecraft in October 1978. The wide-field-of-view (WFOV) sensors measured the emitted and reflected radiation from November 1978 through January 1993, and the first nine years have been processed into a stable, long-term dataset. However, heating and cooling of the ERB experiment introduced thermal perturbations in the original measurements that were significant only in the shortwave (SW) channels. These sensors were covered by spherical filter domes to absorb incident longwave (LW) radiation. In this paper, a thermal regression model—the thermal calibration adjustment table (CAT)—is developed to track and remove these thermal signals from the SW data. The model relies on instrument temperatures within and near the surface of the ERB instrument, and the observed nonzero nighttime sensor readings represent the thermal signals. Confidence that the model is stable for daytime applications was gained by smoothing the solution using ridge regression and noting the effect on the solution coefficient vector. The bias signal produced by the thermal CAT portrays the balance of instrument heating and cooling within the Nimbus-7 variable external radiation environment. Cooling occurs over about two-thirds of an orbit, including satellite night. During the nighttime, the sensor bias change is about 17 W m−2 (compared with a mean daytime SW flux of about 200 W m−2) with little seasonal or annual fluctuation. Strong warming takes place during morning and evening twilight when direct solar radiation illuminates the WFOV sensors. This warming effectively compensates for nighttime cooling when the opposite thermal signature is found. Additional daytime warming occurs for satellite positions near the solar declination when the effects of combined LW and SW terrestrial fluxes exceed thermal cooling to space. However, this heating is influenced by the terrestrial scene and so it varies seasonally.
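A minimal sketch of the thermal-regression idea: fit the nonzero nighttime SW readings, which are purely thermal, against instrument temperatures, then subtract the predicted thermal signal from the daytime data. The linear form, variable names, and shapes are assumptions for illustration (the paper's model also applies ridge smoothing):

    import numpy as np

    def thermal_correction(temps_night, sw_night, temps_day, sw_day):
        """temps_*: (n_samples, n_thermistors); sw_*: (n_samples,)."""
        # Least-squares fit of the nighttime thermal signal to temperatures.
        X = np.column_stack([temps_night, np.ones(len(temps_night))])
        coef, *_ = np.linalg.lstsq(X, sw_night, rcond=None)
        # Predict and remove the thermal signal from daytime readings.
        X_day = np.column_stack([temps_day, np.ones(len(temps_day))])
        return sw_day - X_day @ coef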

The thermal CAT was one of two semi-independent procedures, each of equal mean accuracy, developed to validate and correct for thermally induced sensor signals. The other, called the global CAT, is described in the second paper in this series. Although the thermal CAT was considered heuristically superior, the global CAT was chosen for the basic calibration work since it was thought to be potentially more stable for the production of a consistent long-term ERB dataset.

Full access
H. Lee Kyle, Richard Hucek, Philip Ardanuy, Lanning Penn, and Brian Groveman

Abstract

Sensitivity changes in the four wide-field-of-view (WFOV) Nimbus-7 earth radiation budget (ERB) sensors were monitored over a 9-yr period (November 1978–October 1987) by use of a number of reference sources. The sun was the primary reference and was used to check the shortwave (SW; about 0.2–4 μm) sensitivities on the twin total channels 11 and 12. The longwave (LW; greater than 4 μm) sensitivity in channel 12 was checked by a time series analysis of the nighttime mean global terrestrial signal, but the method could not be usefully applied to channel 11 because it was shuttered too much of the time. The accuracy of this type of analysis was verified by comparing a similar shortwave time series analysis with the solar calibration results. It was also checked by comparing channel 12 nighttime measurements with those from the companion scanner. The scanner had a built-in blackbody for calibration, but the scanner failed after 20 months. As a result of this comparison, a bias adjustment of 12.6 W m−2 was made in the channel 12 measurements. In addition, channels 11 and 12 were compared to each other. The shortwave channels 13 (0.2–4 μm) and 14 (0.7–2.8 μm) were covered by Suprasil-W domes that blocked radiation greater than 4 μm. A piece of red glass in channel 14 further restricted its spectral range to the near infrared. After launch, these domes fogged asymmetrically. For this reason, the effective sensitivity changes in these channels were monitored by comparison with channel 12 using the whole earth as a transfer target. The shortwave range mentioned above for channels 11 and 12 really refers to channel 13 and not to channels 11 and 12. By October 1987, the following sensitivity decreases had occurred: channel 11 (no observable change), channel 12 (LW 2.5%, SW 1.5%), channel 13 (13.3%), and channel 14 (6%). Corrections for these changes kept the calibrated signals stable to better than 0.5% over the 9-yr period. Year-to-year annual global mean longwave shifts of 0.1%–0.4% have been related to climate perturbations and appear real.
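Applying the calibration amounts to dividing each channel's raw signal by its time-dependent relative sensitivity. A minimal sketch using the end-of-period losses quoted above; the linear-in-time drift is an illustrative assumption, not the paper's tracking method:

    # Sensitivity decreases after 9 years, as fractions (from the abstract).
    LOSS_9YR = {"ch11": 0.0, "ch12_lw": 0.025, "ch12_sw": 0.015,
                "ch13": 0.133, "ch14": 0.06}

    def corrected_signal(raw, channel, years_since_launch, period=9.0):
        # Assume (for illustration) the loss accrues linearly in time.
        gain = 1.0 - LOSS_9YR[channel] * (years_since_launch / period)
        return raw / gain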

Full access