The properties of the OLS estimator

Summary

Setup

The linear regression model with random sampling,

\[y_i=β_1+β_2x_{i,2}+β_3x_{i,3}+…+β_kx_{i,k}+ε_i, i=1,…,n\]

\(b_1,b_2,…,b_k\) are the OLS estimators of \(β_1,β_2,…,β_k\).
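As a quick numerical illustration (a minimal sketch with invented parameter values, not part of the source), the OLS estimates for this model can be computed directly from the normal equations \(b=(X'X)^{-1}X'y\):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated regressors and errors (hypothetical values, for illustration only)
x2 = rng.normal(size=n)
x3 = rng.normal(size=n)
eps = rng.normal(size=n)

beta = np.array([1.0, 2.0, -0.5])          # assumed true (β1, β2, β3)
X = np.column_stack([np.ones(n), x2, x3])  # design matrix with intercept column
y = X @ beta + eps

# OLS: solve the normal equations (X'X) b = X'y
b = np.linalg.solve(X.T @ X, X.T @ y)
print(b)  # should be close to (1.0, 2.0, -0.5)
```

With \(n=500\) observations the estimates typically land within a few hundredths of the true coefficients.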

Gauss-Markov assumptions

The following assumptions are known as the Gauss-Markov (GM) assumptions:

  • All the \(x\)-variables are exogenous

\[E\left( ε_i|x_i \right)=0, i=1,…,n\]

  • The error terms are homoscedastic with a common variance \(σ^2\)

\[Var\left( ε_i|x_i \right)=σ^2, i=1,…,n\]

OLS unbiased

  • The OLS estimators \(b_1,b_2,…,b_k\) are unbiased if all the \(x\)-variables are exogenous.
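A small Monte Carlo sketch of this property (simulated data with hypothetical true values, not from the source): averaging the OLS estimates over many independent random samples should recover the true coefficients.

```python
import numpy as np

# Monte Carlo check of unbiasedness: E(b) = β when errors are exogenous.
rng = np.random.default_rng(1)
n, reps = 50, 2000
beta = np.array([1.0, 2.0])  # assumed true (β1, β2)

estimates = np.empty((reps, 2))
for r in range(reps):
    x = rng.normal(size=n)
    X = np.column_stack([np.ones(n), x])
    y = X @ beta + rng.normal(size=n)   # exogenous errors: E(ε|x) = 0
    estimates[r] = np.linalg.solve(X.T @ X, X.T @ y)

# The sample average of the b's over many samples should be close to β
print(estimates.mean(axis=0))
```

Note that each individual estimate varies around \(β\); unbiasedness is a statement about the average over repeated samples, not about any single sample.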

OLS consistent

  • The OLS estimators \(b_1,b_2,…,b_k\) are consistent under mild conditions if all the \(x\)-variables are exogenous.
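Consistency can be illustrated by the same kind of simulation (again with invented true values): a single OLS slope estimate concentrates around the true \(β_2\) as the sample size \(n\) grows.

```python
import numpy as np

# Sketch of consistency: b2 → β2 in probability as n grows.
rng = np.random.default_rng(2)
beta1, beta2 = 1.0, 2.0  # assumed true parameters

def ols_slope(n):
    """Draw one sample of size n and return the OLS slope estimate b2."""
    x = rng.normal(size=n)
    X = np.column_stack([np.ones(n), x])
    y = beta1 + beta2 * x + rng.normal(size=n)
    return np.linalg.solve(X.T @ X, X.T @ y)[1]

slopes = {n: ols_slope(n) for n in (100, 10_000, 1_000_000)}
for n, b2 in slopes.items():
    print(n, b2)  # estimates move toward β2 = 2 as n grows
```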

Under GM assumptions,

  • It is possible to derive the variance of the OLS estimators (the OLS GM variance formulas)

\[Var\left( b_j|x \right) , j=1,…,k\]

  • This variance depends on all the \(x\)-data as well as on \(σ^2\), but the formulas are messy when \(k>2\) (without using matrices).
  • We define

\[s^2= \frac{RSS}{n-k}= \frac{1}{n-k}\sum_{i=1}^{n}{ e_i^2 }\]

  • Under GM, \(s^2\) (the OLS estimator of \(σ^2\)) is an unbiased and, under mild conditions, consistent estimator of \(σ^2\).
  • \(s\), the square root of \(s^2\), is called the standard error of the regression.
  • If we replace \(σ^2\) with \(s^2\) in the OLS GM variance formulas, we get the estimated variance of \(b_1,b_2,…,b_k\):

\[\widehat{Var}\left( b_j \right) , j=1,…,k\]

  • The OLS standard errors \(SE\left( b_j \right)=\sqrt{\widehat{Var}\left( b_j \right)},\; j=1,…,k\) are reported in all econometric packages.
  • The OLS estimators are BLUE (Best Linear Unbiased Estimators); this is the Gauss-Markov theorem.
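Putting the pieces together, here is a sketch (simulated data, invented true parameters) that computes \(s^2=RSS/(n-k)\) and the OLS standard errors from the diagonal of \(s^2(X'X)^{-1}\), the same quantities an econometric package would report.

```python
import numpy as np

# Compute s^2 = RSS/(n-k) and SE(b_j) = sqrt(s^2 * [(X'X)^{-1}]_{jj}).
rng = np.random.default_rng(3)
n, k = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(size=n)  # true σ^2 = 1

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y
e = y - X @ b                        # OLS residuals e_i
s2 = e @ e / (n - k)                 # unbiased estimator of σ^2
se = np.sqrt(s2 * np.diag(XtX_inv))  # OLS standard errors SE(b_j)

print("s^2:", s2)
print("SE(b_j):", se)
```

Since the simulated errors have \(σ^2=1\), the computed \(s^2\) should land near 1, and the standard errors shrink roughly like \(1/\sqrt{n}\).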