Consistency

Summary

Given:

  • A sample \(x_1,x_2,...,x_n\)
  • A statistical model where \(θ\) is an unknown parameter
  • An estimator \({\hat{θ}}_n=g(x_1,…,x_n)\)

Definition: consistency

  • We may view \({\hat{θ}}_1,{\hat{θ}}_2,{\hat{θ}}_3,…\) as a sequence of random variables.
  • We say that \({\hat{θ}}_n\) is a consistent estimator of \(θ\) if \({\hat{θ}}_n\) converges in probability to \(θ\),

\[\textrm{plim } {\hat{θ}}_n=θ\]

  • or

\[P\left( \left| {\hat{θ}}_n-θ \right|<ε \right)→1 \textrm{ as } n→∞\]

  • for all \(ε>0\).
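The definition can be checked numerically. The sketch below (an illustration, not part of the notes) uses the sample mean as the estimator \({\hat{θ}}_n\), drawing from an Exponential(1) distribution so that the true parameter is \(θ=μ=1\), and estimates \(P\left( \left| {\hat{θ}}_n-θ \right|<ε \right)\) by Monte Carlo replication; the probability should climb toward 1 as \(n\) grows.

```python
import random

def coverage(n, eps=0.1, reps=2000, seed=0):
    """Monte Carlo estimate of P(|θ̂_n − θ| < ε), with θ̂_n the sample
    mean of n Exponential(1) draws and θ = 1 the true mean."""
    rng = random.Random(seed)
    theta = 1.0
    hits = 0
    for _ in range(reps):
        xbar = sum(rng.expovariate(1.0) for _ in range(n)) / n
        hits += abs(xbar - theta) < eps
    return hits / reps

# The estimated probability increases toward 1 as n grows.
for n in (10, 100, 1000):
    print(n, coverage(n))
```

With \(ε=0.1\), the standard deviation of \({\bar{x}}_n\) is \(1/\sqrt{n}\), so the estimated probability rises from roughly 0.25 at \(n=10\) to nearly 1 at \(n=1000\).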

Example

  • \(x_1,x_2,...,x_n\) is an IID random sample where \(E\left( x_i \right)=μ\) and \(Var\left( x_i \right)=σ^2\). Then \({\bar{x}}_n\) is a consistent estimator of \(μ\) (the weak law of large numbers).
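The example can be sketched along a single growing sample. The snippet below (an illustration; the Uniform(0, 2) distribution, with \(μ=1\), is an arbitrary choice) tracks the running mean \({\bar{x}}_n\) as \(n\) increases and shows it settling near \(μ\).

```python
import random

# One growing IID sample from Uniform(0, 2), so μ = 1; the running
# mean x̄_n is recorded at a few sample sizes.
rng = random.Random(42)
total = 0.0
means = {}
for n in range(1, 100001):
    total += rng.uniform(0.0, 2.0)
    if n in (10, 1000, 100000):
        means[n] = total / n

for n, m in sorted(means.items()):
    print(f"n = {n:6d}  x̄_n = {m:.4f}")
```

Note this is one sample path; consistency is a statement about probabilities across samples, which is why the Monte Carlo coverage check above the example is the more direct illustration of the definition.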