The OLS estimator

Summary

Key idea 1 from lecture.

  • The linear regression model is an example of a data generating process (DGP) as it allows us to simulate data on $y_i$.
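
A minimal Python sketch of this idea, using assumed illustrative values ($\beta_1 = 2$, $\beta_2 = 0.5$, $n = 100$, uniformly drawn $x_i$, standard normal errors) that are not taken from the lecture:

```python
import numpy as np

# Assumed, illustrative DGP parameters (not from the lecture)
beta1, beta2 = 2.0, 0.5
n = 100

rng = np.random.default_rng(seed=42)
x = rng.uniform(0, 10, size=n)      # regressor values
eps = rng.normal(0, 1, size=n)      # error terms
y = beta1 + beta2 * x + eps         # simulated data on y_i from the DGP
```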

Key idea 2 from lecture.

  • In the linear regression model

$$y_i = \beta_1 + \beta_2 x_i + \varepsilon_i, \qquad i = 1, \ldots, n$$

  • the OLS estimator

$$b_2 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}$$

  • is an estimator of $\beta_2$ and the OLS estimator

$$b_1 = \bar{y} - b_2 \bar{x}$$

  • is an estimator of $\beta_1$ (both are computed in the sketch after this list).
  • Many other estimators for $\beta_1, \beta_2$ exist in the LRM.
  • The OLS fitted values

$$\hat{y}_i = b_1 + b_2 x_i$$

  • are estimates of the conditional expectations

$$\beta_1 + \beta_2 x_i = E(y_i \mid x_i)$$

  • Similarly

$$\hat{y} = b_1 + b_2 x$$

  • is an estimate of the conditional expectation

$$\beta_1 + \beta_2 x = E(y \mid x)$$

  • for arbitrary values of $x$, not necessarily in the sample.
  • Since the residuals are given by

$$e_i = y_i - b_1 - b_2 x_i$$

  • and the error terms are given by

$$\varepsilon_i = y_i - \beta_1 - \beta_2 x_i$$

  • if $b_1$ is close to $\beta_1$ and $b_2$ is close to $\beta_2$, then $e_i$ is close to $\varepsilon_i$.
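
A minimal numerical sketch of these points, continuing the simulated sample from the snippet above (so the parameter values and error distribution remain illustrative assumptions). It computes $b_2$ and $b_1$ from the formulas, forms the fitted values and residuals, and compares the residuals with the errors, which are visible here only because the data were simulated:

```python
# OLS slope and intercept from the formulas above
b2 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b1 = y.mean() - b2 * x.mean()

# Fitted values and residuals
y_hat = b1 + b2 * x        # estimates of E(y_i | x_i) = beta1 + beta2 * x_i
e = y - b1 - b2 * x        # residuals e_i

# Prediction at an arbitrary (illustrative) x value not necessarily in the sample
y_hat_new = b1 + b2 * 7.5  # estimate of E(y | x = 7.5)

# In this sample b1, b2 are close to beta1, beta2,
# so the residuals are close to the (normally unobserved) error terms
print(b1, b2)                    # compare with beta1, beta2
print(np.max(np.abs(e - eps)))   # residuals vs. errors
```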

Key idea 3 from lecture.

  • The OLS estimators $b_1, b_2$ are random variables (so are $\hat{y}_i$ and $e_i$).
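
A minimal Monte Carlo sketch of this point, reusing the assumed DGP and the $x_i$ values from the snippets above: every new sample of errors yields a different value of $b_2$, so the estimator has a sampling distribution.

```python
# Draw many samples from the same DGP and re-estimate each time
reps = 1000
b2_draws = np.empty(reps)
for r in range(reps):
    eps_r = rng.normal(0, 1, size=n)
    y_r = beta1 + beta2 * x + eps_r
    b2_draws[r] = (np.sum((x - x.mean()) * (y_r - y_r.mean()))
                   / np.sum((x - x.mean()) ** 2))

# b2 varies from sample to sample: it is a random variable
print(b2_draws.mean(), b2_draws.std())
```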