Abstract
In cases of modest correlation, parameters calculated from a standard least squares linear regression can vary depending on which variate is chosen as dependent and which as independent. A neutral regression that addresses this problem is proposed. The eigenvector corresponding to the smallest eigenvalue of the cross-correlation matrix of the two variates is used as the set of regression coefficients. Error bars are calculated for the eigenvalues and eigenvectors by means of a perturbation expansion of the cross-correlation matrix and are then verified by Monte Carlo simulation. A procedure is suggested for extending the technique to the multivariate case. Examples are given of a linear fit for a low-correlation case and a quadratic fit for a high-correlation case. Conclusions are presented regarding the strengths and weaknesses of both the least squares and the neutral regressions.
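The central step named in the abstract (taking the eigenvector associated with the smallest eigenvalue of the 2 × 2 cross-correlation matrix as the regression coefficients) can be illustrated with a minimal sketch. The following Python code is an assumption-laden illustration, not the paper's implementation: the function name, the conversion of the normal-vector coefficients back to a slope and intercept in unstandardized units, and the use of NumPy's eigendecomposition are all choices made here for clarity.

```python
import numpy as np

def neutral_regression(x, y):
    """Sketch of a neutral (symmetric) regression fit y = a + b*x.

    The coefficients come from the eigenvector belonging to the smallest
    eigenvalue of the cross-correlation matrix of the two variates, so the
    result does not depend on which variate is labelled dependent.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)

    # Standardize both variates; their 2x2 correlation matrix is the
    # cross-correlation matrix referred to in the abstract.
    xs = (x - x.mean()) / x.std()
    ys = (y - y.mean()) / y.std()
    R = np.corrcoef(xs, ys)

    # Eigenvector of the smallest eigenvalue gives the coefficients
    # (n_x, n_y) of the line n_x*xs + n_y*ys = 0 in standardized units.
    vals, vecs = np.linalg.eigh(R)
    n = vecs[:, np.argmin(vals)]

    # Convert back to slope/intercept in the original units
    # (illustrative convention, not part of the source text).
    slope = -(n[0] / n[1]) * (y.std() / x.std())
    intercept = y.mean() - slope * x.mean()
    return slope, intercept
```

For perfectly standardized data with correlation r, the smallest eigenvalue is 1 − |r| and the recovered standardized slope has magnitude one, which is the symmetric compromise between the two ordinary least squares fits.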
Corresponding author address: Dr. Richard F. Marsden, Physics Department, Royal Military College of Canada, P.O. Box 17000 STN FORCES, Kingston, ON K7K 7B4, Canada.
Email: marsden-r@rmc.ca