Properties of the OLS estimator

Problem

Setup: a linear regression model with a random sample

\[y_i=β_1+β_2x_{i,2}+β_3x_{i,3}+ \ldots +β_kx_{i,k}+ε_i\]

What is true?

  1. If the \(x\)-variables are exogenous then the error terms are homoscedastic
  2. If the error terms are not homoscedastic then the OLS estimator cannot be unbiased
  3. If all the explanatory variables are independent of the error terms and \(E\left( ε_i \right)=0\) then the OLS estimator is unbiased
  4. The OLS estimator is unbiased if and only if it is consistent
  5. Under the Gauss–Markov (GM) assumptions, \(s^2\) is an unbiased estimator of \(σ^2\).

Solution

  1. This is false. Exogeneity and homoscedasticity are two different assumptions. Exogeneity is about the conditional mean of the error terms being zero, \(E\left( ε_i \mid x_i \right)=0\), while homoscedasticity is about the conditional variance being constant, \(Var\left( ε_i \mid x_i \right)=σ^2\). Neither implies the other.
  2. This is false. Even if the error terms are not homoscedastic, exogeneity can still hold, and exogeneity is all we need for the OLS estimator to be unbiased.
  3. True. Independence of the explanatory variables and the error terms together with \(E\left( ε_i \right)=0\) implies \(E\left( ε_i \mid x_i \right)=0\). That is, exogeneity holds, and exogeneity implies that the OLS estimator is unbiased.
  4. False. These are two different desirable properties that an estimator may or may not have. Unbiasedness means that the estimator is correct "on average", \(E\left( \hat{β} \right)=β\), for any sample size. Consistency means that the estimator converges in probability to the true \(β\) as \(n\) goes to infinity.
  5. This is true. We estimate the variance of the error terms using \(s^2=\frac{1}{n-k}\sum_{i=1}^{n}\hat{ε}_i^2\), and under the Gauss–Markov assumptions \(E\left( s^2 \right)=σ^2\).
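Points 2, 3 and 5 above can be checked numerically. The following is a minimal Monte Carlo sketch (illustrative only; the true coefficient values and sample sizes are made up): it simulates a model with \(k=3\) regressors, first with heteroscedastic but exogenous errors, to show that the average OLS estimate is still close to the true \(β\), and then with homoscedastic GM errors, to show that the average of \(s^2\) is close to the true \(σ^2\).

```python
# Monte Carlo check: OLS is unbiased under exogeneity even with
# heteroscedastic errors, and s^2 is unbiased under the GM assumptions.
# True parameter values below are hypothetical, chosen for illustration.
import numpy as np

rng = np.random.default_rng(0)
beta = np.array([1.0, 2.0, -0.5])   # true (β1, β2, β3)
n, reps = 200, 2000                 # sample size, number of replications
sigma2 = 4.0                        # true error variance (homoscedastic case)

b_het = np.empty((reps, 3))         # OLS estimates, heteroscedastic errors
s2_hom = np.empty(reps)             # s^2 values, homoscedastic errors
for r in range(reps):
    # design matrix: intercept plus two random regressors
    X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])

    # heteroscedastic errors: Var(ε|x) = 1 + x2^2, but E(ε|x) = 0
    # so exogeneity holds and OLS should still be unbiased
    eps_het = rng.normal(size=n) * np.sqrt(1.0 + X[:, 1] ** 2)
    y = X @ beta + eps_het
    b_het[r] = np.linalg.lstsq(X, y, rcond=None)[0]

    # homoscedastic (GM) errors for the s^2 check
    eps_hom = rng.normal(scale=np.sqrt(sigma2), size=n)
    y2 = X @ beta + eps_hom
    b = np.linalg.lstsq(X, y2, rcond=None)[0]
    resid = y2 - X @ b
    s2_hom[r] = resid @ resid / (n - 3)   # s^2 = SSR / (n - k), k = 3

print(b_het.mean(axis=0))   # close to (1.0, 2.0, -0.5) despite heteroscedasticity
print(s2_hom.mean())        # close to 4.0 under GM
```

Averaging over many replications approximates the expected value of the estimator, so the two printed averages illustrate \(E(\hat{β})=β\) (point 3, even when point 2's premise fails) and \(E(s^2)=σ^2\) (point 5).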