Abstract
Computer simulations of satellite-derived Earth radiation parameters are examined to determine the source and size of errors arising from averaging parameters over 1 month on a 2.5° × 2.5° longitude-latitude grid. November 1978 data from the Geostationary Operational Environmental Satellite (GOES) are used as the source of the radiation parameter fields within each region. The regions are sampled according to various combinations of satellite orbits chosen on the basis of their applicability to the Earth Radiation Budget Experiment. A mathematical model is given for the data-processing algorithms used to produce daily, monthly, and monthly-hourly estimates of shortwave, longwave, and net radiant exitance. Because satellite sampling of each region is sparse during any one day, and because the meteorological behavior between measurements is unknown, the retrieved diurnal cycle of shortwave radiant exitance is especially sensitive to the temporal distribution of the measurements. The resulting retrieval errors are shown to arise from insufficient knowledge of the temporal distribution of both cloud fraction and albedo. These errors, combined with similar sampling errors caused by diurnal variations in longwave radiant exitance (especially over land), produce biases in monthly net radiant exitance that are complex, regionally dependent functions of the local time of the measurements. The regions studied show standard errors of estimate for monthly net radiant exitance ranging from about 20 W m⁻² for the worst single-satellite sample to ∼2 W m⁻² for the three-satellite sampling assumed to be available.
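
The central sampling issue can be illustrated with a minimal sketch, which is not the paper's data-processing algorithm: a synthetic diurnal cycle of shortwave radiant exitance (with a hypothetical afternoon cloud maximum) is averaged over a 30-day month, once from dense half-hourly sampling and once from two fixed-local-time overpasses of a single satellite. The diurnal shape, overpass times, and use of Python are all illustrative assumptions, not values from the study.

```python
import numpy as np

# Illustrative sketch only: how sparse, fixed-local-time sampling of a
# diurnal shortwave cycle biases a monthly-mean estimate. The functional
# form and overpass times below are hypothetical, not from the paper.

hours = np.arange(0.0, 24.0, 0.5)            # local time grid (hours)

def shortwave_exitance(hour, day):
    """Toy shortwave radiant exitance (W m^-2) with a diurnal cycle.

    Solar forcing peaks at local noon; cloud fraction (hence albedo) is
    assumed to peak in mid-afternoon, with random day-to-day variability.
    """
    rng = np.random.default_rng(day)
    solar = np.maximum(np.cos((hour - 12.0) * np.pi / 12.0), 0.0)   # daylight shape
    cloud = 0.3 + 0.2 * np.maximum(np.cos((hour - 15.0) * np.pi / 12.0), 0.0)
    cloud = cloud + rng.normal(0.0, 0.05)                           # daily variability
    albedo = 0.1 + 0.5 * np.clip(cloud, 0.0, 1.0)
    return 1365.0 * solar * albedo

days = range(1, 31)                           # a 30-day "month"

# "Truth": dense half-hourly sampling averaged over the month.
truth = np.mean([shortwave_exitance(hours, d).mean() for d in days])

# Single satellite: two overpasses per day at fixed local times (hypothetical).
overpasses = np.array([7.5, 19.5])
sampled = np.mean([shortwave_exitance(overpasses, d).mean() for d in days])

print(f"monthly mean (dense sampling) : {truth:6.1f} W m^-2")
print(f"monthly mean (2 overpasses)   : {sampled:6.1f} W m^-2")
print(f"sampling bias                 : {sampled - truth:+6.1f} W m^-2")
```

Because the naive time average never sees the local-noon peak (or the afternoon cloud maximum), its monthly mean is biased by an amount that depends on the overpass local times, which is the qualitative behavior the abstract describes for the single-satellite cases.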