By far the simplest and quickest method for determining a straight line is to draw, by eye, the line that appears to fit the data best. Although this method often gives a reasonably good picture, it is subjective and imprecise, and it is worthless for statistical inference. We now consider two analytical approaches for finding the best-fitting straight line.

1. The Least-squares Method

The least-squares method determines the best-fitting straight line as the line that minimizes the sum of squares of the lengths of the vertical-line segments drawn from the observed data points on the scatter diagram to the fitted line. The idea here is that the smaller the deviations of observed values from this line (and consequently the smaller the sum of squares of these deviations), the closer or "snugger" the best-fitting line will be to the data.
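The least-squares criterion can be sketched in code. The closed-form solution used below is the standard one (the text states the solution later without proof); the function names are illustrative, not from the text.

```python
# Least-squares sketch: choose b0 (intercept) and b1 (slope) to minimize
# the sum of squared vertical deviations sum((y_i - (b0 + b1*x_i))**2).

def least_squares_fit(x, y):
    """Return (b0, b1) minimizing the sum of squared vertical deviations."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # Slope: sum of cross-deviations divided by sum of squared x-deviations.
    b1 = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
          / sum((xi - x_bar) ** 2 for xi in x))
    # Intercept: forces the fitted line through the point (x_bar, y_bar).
    b0 = y_bar - b1 * x_bar
    return b0, b1

def sum_squared_deviations(x, y, b0, b1):
    """The quantity the least-squares method minimizes."""
    return sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
```

For points lying exactly on a line, the minimized sum of squared deviations is zero; for any other choice of intercept and slope it would be larger.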

2. The Minimum-Variance Method

The minimum-variance method is more classically statistical than the method of least squares, which can be viewed as a purely mathematical algorithm. In this second approach, determining the best fit becomes a statistical estimation problem. The goal is to find point estimators of β₀ and β₁ with good statistical properties. In this regard, under the previous assumptions, the best line is determined by the estimators β̂₀ and β̂₁ that are unbiased for their unknown population counterparts β₀ and β₁, respectively, and have minimum variance among all unbiased (linear) estimators of β₀ and β₁.
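The unbiasedness property can be illustrated with a small simulation (not from the text): under a model with zero-mean errors, the least-squares slope estimate averaged over many simulated samples comes out close to the true slope. The parameter values and sample sizes below are arbitrary choices for illustration.

```python
import random

def fit_slope(x, y):
    """Least-squares slope estimate for one sample."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    return (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
            / sum((xi - x_bar) ** 2 for xi in x))

random.seed(0)
beta0, beta1 = 1.0, 2.0          # true (normally unknown) parameters
x = [float(i) for i in range(10)]  # fixed design points

# Repeatedly generate y-values from the true line plus random error,
# and record the slope estimate from each simulated sample.
slopes = []
for _ in range(2000):
    y = [beta0 + beta1 * xi + random.gauss(0, 1) for xi in x]
    slopes.append(fit_slope(x, y))

avg = sum(slopes) / len(slopes)  # should be close to beta1 (unbiasedness)
```

Individual estimates vary from sample to sample, but their long-run average tracks the true slope; minimum variance says that, among unbiased linear estimators, this one varies the least.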

Fortunately, both the least-squares method and the minimum-variance method yield exactly the same solution, which we will state without proof.