The simple linear regression model is denoted by

$$y = \beta_0 + \beta_1 x + \varepsilon$$

- Ordinary Least Squares (OLS) is the most common method for estimating the parameters of a linear regression model, regardless of the distribution of the error term $\varepsilon$.
- Least squares stands for minimizing the sum of squared errors, $SSE$ (Sum of Squared Errors). A lower error means the regression model explains the data better.
- Also, least squares produces the best linear unbiased estimators of $\beta_0$ and $\beta_1$.
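As a concrete sketch of the estimation step, the closed-form least-squares estimates $b_1 = S_{xy}/S_{xx}$ and $b_0 = \bar{y} - b_1\bar{x}$ can be computed in pure Python (the `ols_fit` helper name is hypothetical, not from the text):

```python
def ols_fit(x, y):
    """Closed-form OLS estimates for simple linear regression."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    # Sxx: sum of squared deviations of x; Sxy: cross-deviations of x and y
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    b1 = sxy / sxx          # slope estimate
    b0 = ybar - b1 * xbar   # intercept estimate
    return b0, b1

# Data generated exactly by y = 1 + 2x, so the fit recovers those parameters.
x = [1, 2, 3, 4, 5]
y = [3, 5, 7, 9, 11]
b0, b1 = ols_fit(x, y)
print(b0, b1)  # prints: 1.0 2.0
```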

### Properties of least square estimators and the fitted regression model

- The sum of the residuals in any regression model that contains an intercept $\beta_0$ is always zero, that is $\sum_{i=1}^{n} e_i = 0$.
- The sum of the observed values $y_i$ equals the sum of the fitted values $\hat{y}_i$, that is $\sum_{i=1}^{n} y_i = \sum_{i=1}^{n} \hat{y}_i$.
- The least squares regression line always passes through the centroid $(\bar{x}, \bar{y})$ of the data.
- The sum of the residuals weighted by the corresponding value of the regressor variable always equals zero, that is $\sum_{i=1}^{n} x_i e_i = 0$.
- The sum of the residuals weighted by the corresponding fitted value always equals zero, that is $\sum_{i=1}^{n} \hat{y}_i e_i = 0$.
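These properties can be checked numerically. A minimal pure-Python sketch (the `fit_and_residuals` helper is hypothetical; it repeats the standard closed-form OLS estimates):

```python
def fit_and_residuals(x, y):
    """Fit simple OLS regression and return fitted values and residuals."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    b1 = sxy / sxx
    b0 = ybar - b1 * xbar
    yhat = [b0 + b1 * xi for xi in x]
    e = [yi - yh for yi, yh in zip(y, yhat)]
    return yhat, e

x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]  # roughly linear with noise
yhat, e = fit_and_residuals(x, y)

print(sum(e))                                   # ~0: residuals sum to zero
print(sum(y) - sum(yhat))                       # ~0: observed sum equals fitted sum
print(sum(xi * ei for xi, ei in zip(x, e)))     # ~0: residuals weighted by regressor
print(sum(yh * ei for yh, ei in zip(yhat, e)))  # ~0: residuals weighted by fitted values
```

Each printed value is zero up to floating-point round-off, matching the four algebraic identities above.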