# Properties of OLS estimators and the fitted regression model

The simple linear regression model is denoted by

$$y = \beta_0 + \beta_1 x + \varepsilon$$

where $\beta_0$ is the intercept, $\beta_1$ is the slope, and $\varepsilon$ is the random error term.

• Ordinary least squares (OLS) is the most common method for estimating the parameters of a linear regression model, regardless of the distribution of the error $\varepsilon$.
• Least squares refers to minimizing the sum of squared errors, $SSE$ (Sum of Squared Errors). A lower $SSE$ means the regression model has greater explanatory power.
• Under the Gauss–Markov assumptions, least squares also produces the best linear unbiased estimators of $\beta_0$ and $\beta_1$.
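As a minimal sketch of how OLS estimates are obtained, the closed-form solution for the simple linear model can be computed directly with numpy; the data below are made up purely for illustration:

```python
import numpy as np

# Hypothetical sample data, assumed for illustration only.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Closed-form OLS estimates that minimize SSE = sum((y - b0 - b1*x)**2):
# slope = S_xy / S_xx, intercept = ybar - slope * xbar.
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

print(b0, b1)  # estimated intercept and slope
```

In practice a library routine such as `numpy.polyfit(x, y, 1)` returns the same estimates; the explicit formulas are shown here to make the minimization target visible.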

### Properties of least square estimators and the fitted regression model

1. The sum of the residuals in any regression model that contains an intercept $\beta_0$ is always zero, that is $\sum_{i=1}^{n} e_i = 0$.
2. The sum of the observed values $y_i$ equals the sum of the fitted values $\hat{y}_i$, that is $\sum_{i=1}^{n} y_i = \sum_{i=1}^{n} \hat{y}_i$.
3. The least squares regression line always passes through the centroid $(\bar{x}, \bar{y})$ of the data.
4. The sum of the residuals weighted by the corresponding value of the regressor variable always equals zero, that is $\sum_{i=1}^{n} x_i e_i = 0$.
5. The sum of the residuals weighted by the corresponding fitted value always equals zero, that is $\sum_{i=1}^{n} \hat{y}_i e_i = 0$.
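The five properties above can be checked numerically on any fitted line. The sketch below fits OLS to a small made-up data set (the values are assumptions for illustration) and asserts each property:

```python
import numpy as np

# Hypothetical sample data, assumed for illustration only.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Closed-form OLS estimates for the simple linear model.
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

y_hat = b0 + b1 * x   # fitted values
e = y - y_hat         # residuals

# 1. The residuals sum to zero.
assert np.isclose(e.sum(), 0.0)
# 2. The sum of the observed values equals the sum of the fitted values.
assert np.isclose(y.sum(), y_hat.sum())
# 3. The fitted line passes through the centroid (xbar, ybar).
assert np.isclose(b0 + b1 * x.mean(), y.mean())
# 4. Residuals weighted by the regressor sum to zero.
assert np.isclose((x * e).sum(), 0.0)
# 5. Residuals weighted by the fitted values sum to zero.
assert np.isclose((y_hat * e).sum(), 0.0)
```

Up to floating-point rounding, every assertion holds for any data set, since these identities follow from the normal equations rather than from the particular sample.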