Abstract
A critical review is given of the experimental and theoretical results concerning the measurement of rainfall using optical extinction, i.e., the attenuation of radiation with wavelength less than or equal to that of the infrared band. It is shown that rainfall rates found from an empirical relation involving optical extinction generally display average deviations without regard for sign of only 25% when compared with those measured by raingages directly beneath the optical beam. It is also shown that the differences between experimental and theoretical results can be explained in terms of variations of the shape of the raindrop size distribution, i.e., deviations from exponentiality.