Abstract
The authors have empirically examined the dependence of the outgoing longwave radiation (OLR) on sea surface temperature (Ts), precipitable water (W), and height-mean relative humidity (RH¯). The OLR is obtained from 4 yr of data from the Earth Radiation Budget Experiment (ERBE), while Ts, W, and RH¯ are obtained from objective analyses of rawinsonde and ship data. It is found that in the midlatitudes, the surface temperature explains over 80% of the variability in the clear-sky OLR (Fcs) and almost half of the variability in the total OLR (Ftot). It fails badly in the tropics and subtropics, however, where Ts explains only about 20% of the variability in Fcs and is largely decoupled from Ftot. The two-dimensional contour plot of the OLR binned with respect to Ts and RH¯ is marked by distinct changes in gradient that are consistent with inferences from earlier investigations. For low values of Ts (<10°C), the OLR depends mainly on Ts. For values of Ts above 10°C, the OLR depends increasingly on RH¯. Specifically, in the tropics (Ts ≈ 25°C), the total and clear-sky OLR depend significantly on both Ts and RH¯. The well-known drop in OLR in the tropics with increasing Ts correlates directly with an increase in RH¯, not with changes in Ts. The authors suggest that the observed dependence of the OLR on Ts and RH¯ be used as a minimum performance standard for climate models. This approach is illustrated by comparing the observed dependence with the results of a radiative transfer model and an R15 general circulation model, and by discussing the strengths and limitations of using RH¯ to parameterize the OLR.
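A minimal sketch of the kind of analysis the abstract describes: binning OLR with respect to Ts and RH¯ and computing the fraction of OLR variance explained by Ts alone. This is not the authors' code; the function names, bin edges, and the synthetic data are illustrative assumptions only.

```python
import numpy as np

def variance_explained_by_ts(ts, olr):
    """Fraction of OLR variance explained by Ts (squared linear correlation)."""
    r = np.corrcoef(ts, olr)[0, 1]
    return r ** 2

def bin_olr_by_ts_rh(ts, rh, olr, ts_edges, rh_edges):
    """Mean OLR in each (Ts, RH) bin, as in a 2D contour plot of binned OLR."""
    mean_olr = np.full((len(ts_edges) - 1, len(rh_edges) - 1), np.nan)
    i = np.digitize(ts, ts_edges) - 1   # Ts bin index for each sample
    j = np.digitize(rh, rh_edges) - 1   # RH bin index for each sample
    for a in range(len(ts_edges) - 1):
        for b in range(len(rh_edges) - 1):
            mask = (i == a) & (j == b)
            if mask.any():
                mean_olr[a, b] = olr[mask].mean()
    return mean_olr

# Purely synthetic example data, not ERBE or rawinsonde values.
rng = np.random.default_rng(0)
ts = rng.uniform(-5, 30, 5000)                             # Ts, deg C
rh = rng.uniform(20, 90, 5000)                             # RH, %
olr = 230 + 1.8 * ts - 0.5 * rh + rng.normal(0, 5, 5000)   # toy OLR, W m^-2

print(variance_explained_by_ts(ts, olr))
print(bin_olr_by_ts_rh(ts, rh, olr,
                       np.arange(-5, 31, 5), np.arange(20, 91, 10)))
```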