The linear regression model: the method of moments estimator
Summary
- Setup
- Random sample \(\left( y_i,x_i \right)\) for \(i=1, \ldots ,n\)
- Correctly specified linear regression model, \(y_i=x'_iβ+ε_i\) where \(E\left( ε_i \mid x_i \right)=0\), so that \(E\left( y_i \mid x_i \right)=x'_iβ\).
- Result: For \(i=1, \ldots ,n\), by the law of iterated expectations,
\[E\left( x_iε_i \right)=E\left( x_iE\left( ε_i \mid x_i \right) \right)=0, \qquad \text{i.e.} \qquad E\left( x_i\left( y_i-x'_iβ \right) \right)=0\]
- These are called true moments or true means, and there are \(k\) of them, one per regressor (they are unknown as \(β\) is unknown).
- Given the true moments \(E\left( x_i\left( y_i-x'_iβ \right) \right)\), the quantities
\[ \frac{1}{n}\sum_{i=1}^{n}{ x_i\left( y_i-x'_iβ \right) }\]
- are called the corresponding sample moments or sample means, and again there are \(k\) of them.
- The value of \(β\) which makes the sample moments equal to the true moments (here, zero) is called the method of moments estimator of \(β\), denoted by \(b_{MM}\).
- For our model, \(b_{MM}\) is defined by
\[ \frac{1}{n}\sum_{i=1}^{n}{ x_i\left( y_i-x'_ib_{MM} \right) }=0\]
- Solve for \(b_{MM}\) :
\[b_{MM}={\left( \sum_{i=1}^{n}{ x_ix'_i } \right)}^{-1}\sum_{i=1}^{n}{ x_iy_i }={\left( X'X \right)}^{-1}X'y=b_{OLS}\]
- In the linear regression model with exogenous explanatory variables, the MM-estimator is the same as the OLS-estimator.
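The equivalence above can be checked numerically. The sketch below (with an illustrative, assumed data-generating process: a chosen true \(β\) and standard normal errors independent of the regressors) solves the sample moment condition directly and compares the result to OLS computed by least squares.

```python
import numpy as np

# Simulate a linear model y = X b + eps with eps independent of X,
# so that E(eps | x) = 0 holds by construction (illustrative setup).
rng = np.random.default_rng(0)
n, k = 500, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])  # intercept + 2 regressors
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(size=n)

# Method of moments: solve the sample moment condition
#   (1/n) * sum_i x_i (y_i - x_i' b) = 0,
# which after multiplying by n is the linear system (X'X) b = X'y.
b_mm = np.linalg.solve(X.T @ X, X.T @ y)

# OLS via least squares for comparison
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.allclose(b_mm, b_ols))                          # the two estimators coincide
print(np.allclose(X.T @ (y - X @ b_mm) / n, 0.0))        # sample moments at b_mm are zero
```

Solving \((X'X)b = X'y\) with `np.linalg.solve` mirrors the closed-form \(b_{MM}=(X'X)^{-1}X'y\) without explicitly forming the inverse, which is the standard numerically stable choice.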