In this paper we explain the general theory of the excess bias measurement as a radar parameter suitable for deriving meteorologically relevant information from the fluctuations of weather echoes.
Since the excess bias is the difference between the estimates of the radar reflectivity factor within the radar measurement cell obtained with the logarithmic and simulated receivers, the integral and differential characteristics of this measurement are analyzed in detail.
It is shown that, for specific reflectivity models, the excess bias is a monotonic function of the reflectivity variation within the measurement cell, and the experimental data confirm a close correlation between these two parameters.
Finally, we assess the influence of the simulated receiver on the detection of reflectivity variations through excess bias measurements. The study refers to the class of receiver responses for which the output y is related to the input power P by y = aP^b, where a and b are characteristic of the receiver. The optimum receiver response varies with the reflectivity model considered. Whenever the reflectivity field varies linearly on a logarithmic scale, the optimum response approaches that of a quadratic receiver (b = 1) as the reflectivity variations increase. For the models that describe reflectivity steps at the edge of precipitation cells, the optimum b is smaller than that of the linear model; the greater the fraction of the measurement cell filled with precipitation, the smaller the optimum b becomes.
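The excess bias described above can be illustrated with a minimal numerical sketch. The snippet below is not the paper's method; it only assumes Rayleigh-fading echo samples (exponentially distributed powers) over a uniformly filled measurement cell, a logarithmic receiver that averages the log of the power, and a power-law receiver y = aP^b whose average is inverted through the same response. The mean power `P0`, sample size, and function names are illustrative choices, not quantities from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
P0 = 100.0  # hypothetical mean echo power within the measurement cell
P = rng.exponential(P0, size=200_000)  # Rayleigh-fading echo power samples

def power_law_estimate_db(P, b):
    """Reflectivity estimate (dB) from a receiver with response y = a*P**b.

    The average of y is inverted through the response; the gain a cancels.
    """
    return 10.0 * np.log10(np.mean(P ** b) ** (1.0 / b))

def log_estimate_db(P):
    """Reflectivity estimate (dB) from a logarithmic receiver (mean of log power)."""
    return np.mean(10.0 * np.log10(P))

# Excess bias: power-law (here quadratic, b = 1) minus logarithmic estimate.
b = 1.0
excess_bias_db = power_law_estimate_db(P, b) - log_estimate_db(P)
print(f"excess bias: {excess_bias_db:.2f} dB")  # ~2.5 dB for a uniform cell
```

For exponentially distributed powers the logarithmic receiver underestimates the mean by the Euler constant (about 2.5 dB), which the quadratic receiver does not, so even a uniform cell shows a nonzero excess bias; reflectivity variation within the cell then modulates this value, which is the effect the paper exploits.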