The OLS estimator

Problem

Setup: a linear regression model with a random sample

\[y_i=β_1+β_2x_{i,2}+β_3x_{i,3}+ \ldots +β_kx_{i,k}+ε_i\]

What is true about the OLS estimator \(b_3\)?

  1. \(b_3\) is an estimator of \(β_3\)
  2. \(b_3\) is equal to \(β_3\)
  3. \(b_3\) is the only estimator of \(β_3\)
  4. The formula for \(b_3\) is

\[b_3= \frac{\sum_{i=1}^{n}{ \left( x_{i,3}-{\bar{x}}_3 \right)\left( y_i-\bar{y} \right) }}{\sum_{i=1}^{n}{ {\left( x_{i,3}-{\bar{x}}_3 \right)}^2 }}\]

  5. The formula for \(b_3\), as well as for the other OLS estimators, is derived by minimizing \(RSS\)

Solution

  1. True. \(b_3\) is an estimator of \(β_3\), just as \(b_1\) is an estimator of \(β_1\), and so on. We can calculate the \(b\)’s from our data, while the \(β\)’s are unknown population parameters.
  2. False. \(b_3\) is only an estimator of \(β_3\). Because of the unobserved error terms, we cannot recover \(β_3\) exactly from a sample.
  3. False. We can always construct many estimators of the same parameter. The task is to find an estimator with good properties, and the OLS estimator is often such an estimator.
  4. False. The formula shown applies only to a regression with a single explanatory variable. With more than one explanatory variable there is no such scalar formula, but there is a compact formula in matrix notation: \(b={(X'X)}^{-1}X'y\).
  5. True. We still rely on the least squares principle of minimizing \(RSS\). The minimization problem is now over \(k\) variables, and setting the \(k\) partial derivatives to zero gives a system of \(k\) equations (the normal equations).
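The points in 4 and 5 can be illustrated numerically. The sketch below (using NumPy, with simulated data whose coefficient values and sample size are chosen here purely for illustration) computes the OLS estimates by solving the normal equations \(X'Xb=X'y\), and then applies the single-regressor formula from statement 4 to \(x_3\) alone. Because \(x_2\) and \(x_3\) are correlated in this simulation, the simple formula does not reproduce the multiple-regression estimate \(b_3\):

```python
import numpy as np

# Simulated data (illustrative values): n observations, k = 3
# regressors (intercept, x2, x3), with x3 correlated with x2.
rng = np.random.default_rng(0)
n = 200
x2 = rng.normal(size=n)
x3 = 0.5 * x2 + rng.normal(size=n)
y = 1.0 + 2.0 * x2 + 3.0 * x3 + rng.normal(size=n)

# OLS for all k coefficients: solve the normal equations X'X b = X'y,
# equivalent to b = (X'X)^{-1} X'y.
X = np.column_stack([np.ones(n), x2, x3])
b = np.linalg.solve(X.T @ X, X.T @ y)

# The single-regressor formula from statement 4, applied to x3 alone.
b3_simple = np.sum((x3 - x3.mean()) * (y - y.mean())) / np.sum((x3 - x3.mean()) ** 2)

print(b)          # estimates of (β1, β2, β3) from the full regression
print(b3_simple)  # differs from b[2]: x2 is omitted, so the formula is biased here
```

The gap between `b3_simple` and `b[2]` is the familiar omitted-variable effect: the simple formula attributes part of the influence of \(x_2\) to \(x_3\) whenever the two regressors are correlated.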