Estimating the variance of the OLS estimators
Summary
Setup
The LRM with random sampling
\[y_i=\beta_1+\beta_2 x_i+\varepsilon_i, \quad i=1,\ldots,n\]
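To fix ideas, here is a minimal NumPy sketch that simulates one sample from this model; the parameter values, the sample size, and the uniform draw of the regressor are illustrative assumptions, not part of the notes.

```python
import numpy as np

# Simulate one sample from y_i = beta_1 + beta_2 * x_i + eps_i, i = 1, ..., n.
rng = np.random.default_rng(0)

n = 100
beta1, beta2 = 1.0, 0.5      # true intercept and slope (assumed values)
sigma = 2.0                  # true error standard deviation (assumed value)

x = rng.uniform(0.0, 10.0, size=n)      # regressor values (assumed uniform draw)
eps = rng.normal(0.0, sigma, size=n)    # i.i.d. errors with variance sigma^2
y = beta1 + beta2 * x + eps
```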
Estimated variance of \(b_1\) and \(b_2\)
If we replace \(\sigma^2\) with \(s^2\) in the GM variance formulas for the OLS estimators, we obtain the estimated variances of \(b_1\) and \(b_2\):
\[\widehat{Var}\left( b_1 \right)=s^2\left( \frac{1}{n}+ \frac{{\bar{x}}^2}{\sum_{i=1}^{n}{ {\left( x_i-\bar{x} \right)}^2 }} \right)\]
\[\widehat{Var}\left( b_2 \right)= \frac{s^2}{\sum_{i=1}^{n}{ {\left( x_i-\bar{x} \right)}^2 }}\]
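A short continuation of the simulation sketch above makes the plug-in step concrete: it computes the OLS estimates, the residual variance estimator (assumed here to be \(s^2=\mathrm{SSR}/(n-2)\), the usual degrees-of-freedom-corrected estimator), and the two estimated variances.

```python
# Continues the simulation sketch above (x, y, n defined there).

# OLS slope and intercept
b2 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b1 = y.mean() - b2 * x.mean()

# Residual variance estimator s^2 = SSR / (n - 2)  (assumed definition of s^2)
resid = y - (b1 + b2 * x)
s2 = np.sum(resid ** 2) / (n - 2)

# Plug s^2 into the GM variance formulas
Sxx = np.sum((x - x.mean()) ** 2)
var_b1_hat = s2 * (1.0 / n + x.mean() ** 2 / Sxx)
var_b2_hat = s2 / Sxx
```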
Standard errors
- Under GM, the estimated variances are unbiased and consistent estimators of the true variances.
- The square roots of the estimated variances are called the OLS standard errors of the regression parameters; they are denoted by \(SE\left( b_1 \right)\) and \(SE\left( b_2 \right)\).
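Continuing the sketch, the standard errors are simply the square roots of the estimated variances; the commented-out statsmodels cross-check is optional and assumes that package is available.

```python
# Continues the sketch above (var_b1_hat, var_b2_hat defined there).

# OLS standard errors: square roots of the estimated variances
se_b1 = np.sqrt(var_b1_hat)
se_b2 = np.sqrt(var_b2_hat)

# Optional cross-check (assumes statsmodels is installed):
# import statsmodels.api as sm
# results = sm.OLS(y, sm.add_constant(x)).fit()
# results.bse  # should match [se_b1, se_b2]
```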