Precise and continuous calibration of a radar is essential for maintaining the accuracy of radar measurements. The slope of the radar receiver transfer function can be calibrated using bias measurements of meteorological echoes, where the bias is obtained as the difference between the mean-power estimates given by a logarithmic receiver and by a power-law receiver. This paper studies the estimate of the receiver transfer function slope obtained from such bias measurements, and its accuracy, taking into account the correlation structure of the radar time samples. Using asymptotic theory and computer simulation, the slope estimate and its variance are evaluated for a range of sample sizes and Doppler spectra. The theoretical results are verified using data collected by the Polar 55C C-band radar in Italy.
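As an illustrative sketch only (not the paper's estimator), the bias between the two receivers can be seen in the simplest case: for Rayleigh-distributed echoes the received power is exponentially distributed, and the expected difference between the logarithm of the mean power (power-law receiver) and the mean of the log power (logarithmic receiver) is (10/ln 10)·γ ≈ 2.51 dB, where γ is the Euler-Mascheroni constant. A minimal simulation, assuming independent unit-mean exponential power samples (the paper's analysis additionally accounts for sample correlation):

```python
import numpy as np

# Illustrative sketch, assuming independent Rayleigh echoes: the received
# power P is exponentially distributed, and the bias between the
# power-law (linear) receiver's mean-power estimate and the logarithmic
# receiver's estimate is (10 / ln 10) * gamma ~= 2.507 dB, where gamma
# is the Euler-Mascheroni constant.
rng = np.random.default_rng(0)
p = rng.exponential(scale=1.0, size=1_000_000)  # unit-mean power samples

linear_db = 10.0 * np.log10(p.mean())   # power-law receiver estimate (dB)
log_db = np.mean(10.0 * np.log10(p))    # logarithmic receiver estimate (dB)
bias_db = linear_db - log_db            # measured bias (dB)

expected_db = 10.0 / np.log(10.0) * np.euler_gamma
print(f"measured bias: {bias_db:.3f} dB, expected: {expected_db:.3f} dB")
```

A departure of the measured bias from its expected value would indicate an error in the logarithmic receiver's transfer function slope, which is the basis of the calibration idea summarized above.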