The simple linear regression model is denoted by 𝑦 = 𝑏0 + 𝑏1𝑥 + 𝑒, where 𝑏0 is the intercept, 𝑏1 is the slope, and 𝑒 is the random error term.
- Ordinary Least Squares (OLS) is the most common method for estimating the parameters of a linear regression model, regardless of the distribution of the error term 𝑒.
- "Least squares" refers to minimizing the 𝑆𝑆𝐸 (𝑆𝑢𝑚 𝑜𝑓 𝑆𝑞𝑢𝑎𝑟𝑒𝑑 𝐸𝑟𝑟𝑜𝑟𝑠), 𝑆𝑆𝐸 = Σ (𝑦ᵢ − ŷᵢ)². A smaller error gives the regression model better explanatory power.
- Also, least squares produces the best linear unbiased estimators of 𝑏0 and 𝑏1.
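The closed-form least-squares estimates can be sketched as follows; the data below are illustrative, not taken from the text:

```python
# Minimal sketch of OLS for y = b0 + b1*x + e using the closed-form
# estimators b1 = Sxy / Sxx and b0 = y_bar - b1 * x_bar.
# The data points here are made up for illustration.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n

# Sxy = sum of cross-deviations, Sxx = sum of squared deviations of x
s_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
s_xx = sum((xi - x_bar) ** 2 for xi in x)
b1 = s_xy / s_xx
b0 = y_bar - b1 * x_bar

# SSE is the quantity least squares minimizes
y_hat = [b0 + b1 * xi for xi in x]
sse = sum((yi - yhi) ** 2 for yi, yhi in zip(y, y_hat))
print(b0, b1, sse)  # b0 ≈ 0.05, b1 ≈ 1.99
```

Any other choice of 𝑏0 and 𝑏1 on this data would yield a larger SSE; that is the defining property of the least-squares fit.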
Properties of the least-squares estimators and the fitted regression model
- The sum of the residuals in any regression model that contains an intercept 𝑏0 is always zero, that is, Σ 𝑒ᵢ = Σ (𝑦ᵢ − ŷᵢ) = 0.
- The sum of the observed values 𝑦ᵢ equals the sum of the fitted values ŷᵢ, that is, Σ 𝑦ᵢ = Σ ŷᵢ.
- The least-squares regression line always passes through the centroid (x̄, ȳ) of the data.
- The sum of the residuals weighted by the corresponding value of the regressor variable always equals zero, that is, Σ 𝑥ᵢ𝑒ᵢ = 0.
- The sum of the residuals weighted by the corresponding fitted value always equals zero, that is, Σ ŷᵢ𝑒ᵢ = 0.