Abstract
We compare a series of 85 dustsonde measurements and 84 lidar measurements made in midlatitude North America during 1974–80. This period includes two major volcanic increases (Fuego in 1974 and St. Helens in 1980), as well as an unusually clean, or background, period in 1978–79. An optical modeling technique is used to relate the dustsonde number data to the lidar backscatter data. The model spans a range of refractive indices and size distribution functional forms, to show its sensitivity to these factors. Moreover, two parameters of each size distribution function are adjustable, so that each distribution can be matched to any two-channel dustsonde measurement.
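The two-parameter matching step can be sketched as follows. This is a minimal illustration, not the paper's actual optical model: it assumes a lognormal size distribution with a fixed geometric standard deviation (here 1.8, an arbitrary choice), and adjusts the total number N0 and mode radius rm so the distribution reproduces both dustsonde channels, Nr>0.15 and Nr>0.25.

```python
import math

def cum_lognormal(x, n0, rm, sigma):
    """Cumulative number N(r > x) for a lognormal size distribution
    with total number n0, mode radius rm (um), geometric std. dev. sigma."""
    z = math.log(x / rm) / (math.sqrt(2.0) * math.log(sigma))
    return 0.5 * n0 * math.erfc(z)

def fit_two_channel(n_015, n_025, sigma=1.8):
    """Solve for (n0, rm) so the lognormal matches both dustsonde channels.
    sigma is held fixed -- an illustrative assumption, leaving exactly the
    two adjustable parameters the abstract describes."""
    target = n_015 / n_025
    lo, hi = 0.01, 1.0  # bisection bounds on rm (um)
    for _ in range(80):
        rm = 0.5 * (lo + hi)
        ratio = (cum_lognormal(0.15, 1.0, rm, sigma) /
                 cum_lognormal(0.25, 1.0, rm, sigma))
        # the channel ratio decreases monotonically as rm grows
        if ratio > target:
            lo = rm
        else:
            hi = rm
    n0 = n_015 / cum_lognormal(0.15, 1.0, rm, sigma)
    return n0, rm
```

With both parameters pinned down by the two channels, any backscatter-weighted quantity (such as the mean radius for backscatter) follows from the fitted distribution.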
We show how the mean particle radius for backscatter, rB, changes in response to size distribution changes revealed by the dustsonde channel ratio, Nr>0.15/Nr>0.25. (Nr>x is the number of particles with radius larger than x microns.) In early 1975, just after the Fuego injection, Nr>0.15/Nr>0.25 was ∼3, and the corresponding rB was ∼0.5 μm; by early 1980, when Nr>0.15/Nr>0.25 had increased to ∼8 or more, rB had correspondingly decreased to ∼0.25 μm. Throughout the 1975–76 Fuego decay, rB always exceeded 0.3 μm; thus, lidar backscatter was influenced primarily by particles larger than those that contribute most to Nr>0.15 and Nr>0.25. This is in accord with the shorter background-corrected 1/e decay time for lidar backscatter: 7.4 months, versus 10.4 and 7.9 months for Nr>0.15 and Nr>0.25, respectively.
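A background-corrected 1/e decay time of the kind quoted above can be estimated by a log-linear least-squares fit to the decaying series, assuming it relaxes exponentially toward a constant background level. This sketch is illustrative only; the paper's fitting procedure is not specified in the abstract.

```python
import math

def decay_time(months, values, background):
    """1/e decay time (months) from a least-squares fit of
    ln(value - background) versus time, assuming
    value(t) = background + A * exp(-t / tau)."""
    xs, ys = [], []
    for t, v in zip(months, values):
        excess = v - background
        if excess > 0:  # keep only points above the background level
            xs.append(t)
            ys.append(math.log(excess))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
             sum((x - mx) ** 2 for x in xs))
    return -1.0 / slope  # slope of ln(excess) vs t is -1/tau
```

Applying the same fit to the lidar backscatter series and to each dustsonde channel gives directly comparable decay times, which is how the 7.4-, 10.4-, and 7.9-month figures can be contrasted.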
The modeling technique is used to derive a time series of dustsonde-inferred peak backscatter mixing ratio, which agrees very well with the lidar-measured series. The best overall agreement for 1974–80 is achieved with a mixture of refractive indices corresponding to aqueous sulfuric acid at about 210 K, with acid weight fractions between 0.6 and 0.85.