The OLS estimator
Problem
Setup: a linear regression model with a random sample
\[y_i=β_1+β_2x_{i,2}+β_3x_{i,3}+ \ldots +β_kx_{i,k}+ε_i\]
Which of the following statements are true of the OLS estimator \(b_3\)?
- \(b_3\) is an estimator of \(β_3\)
- \(b_3\) is equal to \(β_3\)
- \(b_3\) is the only estimator of \(β_3\)
- The formula for \(b_3\) is
\[b_3= \frac{\sum_{i=1}^{n}{ \left( x_{i,3}-{\bar{x}}_3 \right)\left( y_i-\bar{y} \right) }}{\sum_{i=1}^{n}{ {\left( x_{i,3}-{\bar{x}}_3 \right)}^2 }}\]
- The formula for \(b_3\), as well as for the other OLS estimators, is derived by minimizing \(RSS\)
Solution
- True. \(b_3\) is an estimator of \(β_3\) (and likewise \(b_1\) estimates \(β_1\), and so on). We can calculate the \(b\)'s from our data, while the \(β\)'s are unknown.
- False. \(b_3\) is only an estimator of \(β_3\). Because of the error terms, we cannot recover \(β_3\) exactly from the data.
- False. Many estimators of the same parameter can be constructed. The task is to find one with good properties, and the OLS estimator is often a good choice.
- False. The formula shown is the OLS estimator only in the simple regression case with a single explanatory variable. With more than one explanatory variable there is no simple scalar formula of this kind; there is, however, a compact formula, \(b=(X'X)^{-1}X'y\), but it requires matrix notation.
- True. We still rely on the least squares principle of minimizing \(RSS\). The minimization is now over \(k\) coefficients, and setting the \(k\) partial derivatives to zero yields a system of \(k\) equations (the normal equations), whose solution is the OLS estimator.
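The point about the simple formula failing with several regressors can be seen numerically. The sketch below (simulated data with illustrative coefficients, not from the exercise) computes \(b=(X'X)^{-1}X'y\) and compares \(b_3\) with the one-regressor formula from the question applied to \(x_3\) alone:

```python
import numpy as np

# Simulated sample: y = 1 + 2*x2 + 3*x3 + noise, with x2 and x3 correlated
# (all numbers are illustrative, chosen only to make the contrast visible).
rng = np.random.default_rng(0)
n = 500
x2 = rng.normal(size=n)
x3 = 0.6 * x2 + rng.normal(size=n)   # correlation with x2 is what matters
y = 1 + 2 * x2 + 3 * x3 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x2, x3])   # design matrix with intercept
b = np.linalg.solve(X.T @ X, X.T @ y)       # solves (X'X) b = X'y
b3_matrix = b[2]

# The simple one-regressor formula from the question, applied to x3 alone:
b3_simple = ((x3 - x3.mean()) * (y - y.mean())).sum() / ((x3 - x3.mean()) ** 2).sum()

# b3_matrix is close to 3; b3_simple is not, because it ignores x2.
print(b3_matrix, b3_simple)
```

Because \(x_2\) is omitted from the simple formula, its effect is partly attributed to \(x_3\), so the two numbers differ noticeably.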
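The last point can also be checked directly: minimizing \(RSS\) numerically over all \(k\) coefficients gives the same answer as solving the \(k\) normal equations. A minimal sketch, again with simulated illustrative data (\(k=3\)):

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative sample with k = 3 coefficients (intercept plus two regressors).
rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0, 3.0]) + rng.normal(size=n)

def rss(b):
    """Residual sum of squares as a function of the coefficient vector."""
    resid = y - X @ b
    return resid @ resid

# Numerical minimization of RSS over all k coefficients at once.
res = minimize(rss, x0=np.zeros(3), method="BFGS")

# Closed-form solution of the k normal equations: (X'X) b = X'y.
b_closed = np.linalg.solve(X.T @ X, X.T @ y)

print(res.x, b_closed)   # the two coefficient vectors agree
```

Since RSS is a well-behaved quadratic in the coefficients, the numerical minimizer and the normal-equation solution coincide up to numerical tolerance.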