An alternative estimator of \(β_2\)

Problem

This problem is at a higher level. You may want to skip it for now (or try to understand as much as you can) and return to it later in the course.

Given the LRM with random sampling \(y_i=β_1+β_2x_i+ε_i\), I use two different estimators for \(β_2\): \(b_2\) is the OLS estimator, and \(c_2\) is an estimator calculated as the slope of a straight line through the leftmost and the rightmost observation,

\[c_2= \frac{y_B-y_A}{x_B-x_A}\]

where \(x_A\) is the smallest x-value (leftmost), \(x_B\) the largest x-value (rightmost), and \(y_A=β_1+β_2x_A+ε_A\) and \(y_B=β_1+β_2x_B+ε_B\).
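As a concrete illustration (not part of the exercise itself), here is a minimal Python sketch of computing \(c_2\) on a toy dataset; the data values are made up and only meant to show the mechanics of the endpoint-slope formula:

```python
# Toy data (made up for illustration): roughly y = 0 + 2x + noise
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.0, 10.1]

# Indices of the leftmost (smallest x) and rightmost (largest x) observations
A = min(range(len(x)), key=lambda i: x[i])
B = max(range(len(x)), key=lambda i: x[i])

# Endpoint-slope estimator: c2 = (y_B - y_A) / (x_B - x_A)
c2 = (y[B] - y[A]) / (x[B] - x[A])
print(c2)  # → 2.0 (up to floating-point rounding)
```

Note that \(c_2\) uses only the two extreme observations and discards everything in between, which is what the later parts of the exercise probe.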

a) Show that

\[c_2=β_2+ \frac{ε_B-ε_A}{x_B-x_A}\]

b) Show that

\[E\left( c_2 \mid x \right)=β_2\]

c) Show that \(c_2\) is an unbiased estimator of \(β_2\).

d) Is \(c_2\) a linear estimator?

e) What does the Gauss–Markov (GM) theorem tell us about the relationship between \(Var\left( b_2 \mid x \right)\) and \(Var\left( c_2 \mid x \right)\) ?
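Before working through the algebra, the claims in parts c) and e) can be checked by simulation. The following is a hedged sketch of my own (not part of the exercise): \(β_1\), \(β_2\), the fixed x-values, and the error variance are all chosen arbitrarily for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary "true" parameters and fixed regressor values (assumptions for this demo)
beta1, beta2 = 1.0, 2.0
x = np.linspace(0.0, 10.0, 20)   # x_A = 0 (leftmost), x_B = 10 (rightmost)
n_reps = 5000

b2_draws, c2_draws = [], []
for _ in range(n_reps):
    eps = rng.normal(0.0, 1.0, size=x.size)
    y = beta1 + beta2 * x + eps
    # OLS slope estimator b2
    b2 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    # Endpoint-slope estimator c2
    c2 = (y[-1] - y[0]) / (x[-1] - x[0])
    b2_draws.append(b2)
    c2_draws.append(c2)

print(np.mean(b2_draws), np.mean(c2_draws))  # both close to beta2 = 2 (unbiasedness)
print(np.var(b2_draws), np.var(c2_draws))    # Var(b2|x) < Var(c2|x), as GM predicts
```

Both estimators average out near \(β_2\), but the OLS slope has the smaller sampling variance, consistent with b) and e).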

Solution