Properties of OLS estimators and the fitted regression model

The simple linear regression model is denoted by

    𝑦 = 𝑏0 + 𝑏1𝑥 + 𝑒

where 𝑏0 is the intercept, 𝑏1 is the slope, and 𝑒 is the random error.
  • Ordinary Least Squares (OLS) is the most common method for estimating the parameters of a linear regression model, regardless of the distribution of the error 𝑒.
  • "Least squares" refers to minimizing the sum of squared errors, 𝑆𝑆𝐸 (Sum of Squared Errors). A lower 𝑆𝑆𝐸 means the regression model has greater explanatory power.
  • Under the Gauss–Markov assumptions, least squares also produces the best linear unbiased estimators of 𝑏0 and 𝑏1.
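As an illustrative sketch (not part of the original notes), the closed-form OLS estimates 𝑏1 = 𝑆𝑥𝑦/𝑆𝑥𝑥 and 𝑏0 = ȳ − 𝑏1x̄ can be computed directly with NumPy; the data values below are hypothetical:

```python
import numpy as np

# Hypothetical sample data: x is the regressor, y is the response
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Closed-form OLS estimates: b1 = Sxy / Sxx, b0 = ybar - b1 * xbar
xbar, ybar = x.mean(), y.mean()
b1 = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
b0 = ybar - b1 * xbar

y_hat = b0 + b1 * x           # fitted values
residuals = y - y_hat         # e_i = y_i - yhat_i
sse = np.sum(residuals ** 2)  # the quantity OLS minimizes
```

Any other choice of intercept and slope on the same data yields a larger `sse`, which is what "least squares" means.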

Properties of least squares estimators and the fitted regression model

  1. The sum of the residuals in any regression model that contains an intercept 𝑏0 is always zero, that is

     ∑ 𝑒𝑖 = ∑ (𝑦𝑖 − ŷ𝑖) = 0
  2. The sum of the observed values 𝑦𝑖 equals the sum of the fitted values ŷ𝑖, that is

     ∑ 𝑦𝑖 = ∑ ŷ𝑖
  3. The least squares regression line always passes through the centroid (x̄, ȳ) of the data.
  4. The sum of the residuals weighted by the corresponding value of the regressor variable always equals zero, that is

     ∑ 𝑥𝑖𝑒𝑖 = 0
  5. The sum of the residuals weighted by the corresponding fitted value always equals zero, that is

     ∑ ŷ𝑖𝑒𝑖 = 0
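The five properties above hold exactly (up to floating-point rounding) for any data set fitted by OLS with an intercept. A minimal numerical check, using a small hypothetical data set, might look like this:

```python
import numpy as np

# Hypothetical data; the properties hold for any data set fitted by OLS
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

xbar, ybar = x.mean(), y.mean()
b1 = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
b0 = ybar - b1 * xbar
y_hat = b0 + b1 * x
e = y - y_hat  # residuals

print(np.isclose(e.sum(), 0.0))           # 1. residuals sum to zero
print(np.isclose(y.sum(), y_hat.sum()))   # 2. sum of y_i equals sum of yhat_i
print(np.isclose(b0 + b1 * xbar, ybar))   # 3. line passes through (xbar, ybar)
print(np.isclose(np.sum(x * e), 0.0))     # 4. residuals weighted by x_i sum to zero
print(np.isclose(np.sum(y_hat * e), 0.0)) # 5. residuals weighted by yhat_i sum to zero
```

Properties 4 and 5 say the residual vector is orthogonal to both the regressor and the fitted values, which is why adding the fitted line back cannot reduce the error further.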

Comments