Thursday, January 20, 2011

Least Squares Fitting

The term least squares describes a frequently used approach to solving overdetermined or inexactly specified systems of equations in an approximate sense. Instead of solving the equations exactly, we seek only to minimize the sum of the squares of the residuals.
The least squares criterion has important statistical interpretations. If appropriate probabilistic assumptions about underlying error distributions are made, least squares produces what is known as the maximum-likelihood estimate of the parameters. Even if the probabilistic assumptions are not satisfied, years of experience
have shown that least squares produces useful results.
The computational techniques for linear least squares problems make use of orthogonal matrix factorizations.
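To illustrate the idea (this is a NumPy sketch, not part of the original post): a QR factorization A = QR with Q orthogonal reduces the least squares problem A·x ≈ y to the small triangular system R·x = Qᵀ·y, avoiding the explicitly formed normal equations.

```python
import numpy as np

# Overdetermined system for fitting y ≈ a + b*x.
# The sample points lie exactly on y = 1 + 2x, so the
# solution should come out close to [1, 2].
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])   # columns: intercept term, x term
y = np.array([1.0, 3.0, 5.0])

# Reduced QR factorization: A = Q R, Q has orthonormal columns.
Q, R = np.linalg.qr(A)

# Solve the triangular system R x = Q^T y.
x = np.linalg.solve(R, Q.T @ y)
print(x)  # approximately [1, 2]
```

The same result could be obtained with `np.linalg.lstsq(A, y, rcond=None)`; the QR route is spelled out here only to show the role of the orthogonal factor.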
Let's consider the linear function:

y = a + b*x

To estimate the coefficients a and b from n samples (x_i, y_i), we can use the closed-form least squares solution:

a = (Σy * Σx² - Σx * Σxy) / (n * Σx² - (Σx)²)
b = (n * Σxy - Σx * Σy) / (n * Σx² - (Σx)²)

In C#, with a list of samples, this becomes:

double sumY = 0;
double sumX = 0;
double sumXY = 0;
double sumX2 = 0;

// Accumulate the sums over all sample points.
// The braces are required here: without them, only the first
// statement would belong to the loop body.
foreach (Sample s in samples)
{
    sumY += s.y;
    sumX += s.x;
    sumX2 += s.x * s.x;
    sumXY += s.x * s.y;
}

double denominator = samples.Count * sumX2 - sumX * sumX;
double a = (sumY * sumX2 - sumX * sumXY) / denominator;
double b = (samples.Count * sumXY - sumX * sumY) / denominator;
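As a quick sanity check of the formulas (a Python sketch mirroring the C# snippet above, not part of the original post), we can fit points that lie exactly on y = 1 + 2x and confirm the sums recover a = 1, b = 2:

```python
# Least squares fit of y ≈ a + b*x via the closed-form sums,
# using points that lie exactly on y = 1 + 2x.
samples = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]
n = len(samples)

sum_x = sum(x for x, _ in samples)
sum_y = sum(y for _, y in samples)
sum_xy = sum(x * y for x, y in samples)
sum_x2 = sum(x * x for x, _ in samples)

denominator = n * sum_x2 - sum_x * sum_x
a = (sum_y * sum_x2 - sum_x * sum_xy) / denominator
b = (n * sum_xy - sum_x * sum_y) / denominator
print(a, b)  # 1.0 2.0
```

For noisy data the recovered line will not pass through every point, but it minimizes the sum of squared vertical residuals by construction.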
