Generalized method of moments estimator
Summary
- True moments:
\[E\left( f\left( w_i,θ \right) \right)=0\]
- where
- \(w_i\) is all data for individual \(i\) (dependent variable, explanatory variable, instruments)
- \(θ\) is a \(k×1\) vector of unknown parameters
- \(f\left( w_i,θ \right)\) is now \(r×1\) .
- Corresponding sample moments:
\[s\left( θ \right)= \frac{1}{n}\sum_{i=1}^{n}{ f\left( w_i,θ \right) }\]
- If \(r=k\) then the solution to \(s\left( θ \right)=0\) will give us the MM estimator of \(θ\) .
- If \(r>k\) then the system \(s\left( θ \right)=0\) has \(r\) equations in only \(k\) unknowns, so in general no solution exists and the method of moments estimator is not defined.
- If \(r<k\) then there is too little information to pin down \(θ\) (the parameters are not identified), so there is no general procedure for estimating \(θ\) .
- If \(r≥k\) then we define the quadratic form (scalar)
\[Q\left( θ \right)=s{\left( θ \right)}'Ws\left( θ \right)\]
- where \(W\) is an arbitrary \(r×r\) symmetric positive definite matrix called the weighting matrix .
- The generalized method of moments estimator \({\hat{θ}}_{GMM}\) , for a given \(W\) , is defined as
\[{\hat{θ}}_{GMM}=\arg\min_{θ} Q\left( θ \right)\]
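The minimization above can be sketched numerically. The following is a minimal illustration, assuming a hypothetical linear IV model \(y_i = x_i θ + u_i\) with two instruments \(z_i\) (so \(r=2\), \(k=1\)), simulated data, and an arbitrary identity weighting matrix:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical simulated data: y = x*theta + u with true theta = 2,
# instruments z correlated with x but not with u (r = 2 moments, k = 1 parameter).
rng = np.random.default_rng(0)
n = 500
z = rng.normal(size=(n, 2))                  # two instruments
x = z @ np.array([1.0, 0.5]) + rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)

def s(theta):
    """Sample moments s(theta) = (1/n) sum_i z_i * (y_i - x_i*theta), an r-vector."""
    return z.T @ (y - x * theta) / n

def Q(theta, W):
    """Scalar quadratic form Q(theta) = s(theta)' W s(theta)."""
    m = s(theta[0])
    return m @ W @ m

W = np.eye(2)                                # arbitrary symmetric positive definite W
res = minimize(Q, x0=np.array([0.0]), args=(W,))
print(res.x)                                 # close to the true theta = 2
```

Any symmetric positive definite \(W\) yields a consistent estimate here; only the asymptotic variance depends on the choice of \(W\).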
- Result: \({\hat{θ}}_{GMM}\) is a consistent estimator of \(θ\) (under regularity conditions)
- The asymptotic variance of \({\hat{θ}}_{GMM}\) will depend on \(W\) .
- We define
\[W^{opt}={\left( E\left( f\left( w_i,θ \right)f{\left( w_i,θ \right)}' \right) \right)}^{-1}\]
- as the optimal weighting matrix.
- The asymptotic variance of \({\hat{θ}}_{GMM}\) is minimized when \(W=W^{opt}\) .
- \(W^{opt}\) is, in general, unknown because it depends on the true \(θ\) . However, it can be estimated by replacing the expectation with a sample average evaluated at a preliminary consistent estimate of \(θ\) .
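This suggests the familiar two-step procedure, sketched below under the same hypothetical linear IV setup as before: step 1 obtains a consistent estimate with \(W = I\); step 2 plugs in the sample analogue of \(W^{opt}\) and re-minimizes.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical simulated data (true theta = 2), as in the earlier sketch.
rng = np.random.default_rng(1)
n = 500
z = rng.normal(size=(n, 2))
x = z @ np.array([1.0, 0.5]) + rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)

def f(theta):
    """Individual moments: row i is f(w_i, theta) = z_i * (y_i - x_i*theta)."""
    return z * (y - x * theta)[:, None]      # n x r matrix

def Q(theta, W):
    m = f(theta[0]).mean(axis=0)             # s(theta)
    return m @ W @ m

# Step 1: consistent but inefficient estimate using W = I.
step1 = minimize(Q, x0=np.array([0.0]), args=(np.eye(2),))

# Step 2: estimate W^opt = (E[f f'])^{-1} by its sample analogue at step1.x,
# then minimize Q again with the estimated optimal weighting matrix.
F = f(step1.x[0])
W_opt = np.linalg.inv(F.T @ F / n)
step2 = minimize(Q, x0=step1.x, args=(W_opt,))
print(step2.x)                               # efficient two-step GMM estimate
```

Under the regularity conditions mentioned above, the second-step estimator attains the minimal asymptotic variance among GMM estimators based on these moments.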