Convergence in probability
Summary
- Given a sequence of random variables \(z_1,z_2,…\), we investigate conditions under which this sequence converges to a constant \(c\).
Definition: convergence in probability
- Consider an open interval around \(c\): \(\left( c-ε,c+ε \right)\), where \(ε\) is a small positive number.
- \(z_n\) is in the interval if \(z_n∈\left( c-ε,c+ε \right)\), which can only be checked after the experiment is performed.
- Equivalently, \(z_n\) is in the interval if \(\left| z_n-c \right|<ε\).
- Before the experiment, we can only evaluate the probability that \(z_n\) ends up in the interval:
\[P\left( \left| z_n-c \right|<ε \right)\]
- For a given \(ε\), the larger \(n\) is, the higher this probability should be.
- We say \(z_n\) converges in probability to a constant \(c\) if for all \(ε>0\) ,
\[P\left( \left| z_n-c \right|<ε \right)→1 \textrm{ as } n→∞\]
- We write this as
\[\textrm{plim } z_n=c\]
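The definition can be checked numerically. As a hypothetical illustration (not from the notes above), take \(z_n\) to be the maximum of \(n\) independent Uniform(0,1) draws, which converges in probability to \(c=1\); the sketch below estimates \(P\left( \left| z_n-c \right|<ε \right)\) by Monte Carlo and shows it rising toward 1 as \(n\) grows.

```python
import random

# Hypothetical example: z_n = max of n Uniform(0,1) draws, plim z_n = 1.
def prob_in_interval(n, c=1.0, eps=0.05, reps=20_000, seed=0):
    """Monte Carlo estimate of P(|z_n - c| < eps)."""
    rng = random.Random(seed)
    hits = sum(
        abs(max(rng.random() for _ in range(n)) - c) < eps
        for _ in range(reps)
    )
    return hits / reps

for n in (5, 20, 100):
    # The estimated probability increases with n, as the definition requires.
    print(n, prob_in_interval(n))
```

For this particular sequence the probability is known exactly, \(P\left( \left| z_n-1 \right|<ε \right)=1-\left( 1-ε \right)^n\), so the simulated values can be checked against it.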
Result
- If \(E\left( z_n \right)→c\) and \(Var\left( z_n \right)→0\) as \(n→∞\), then \(\textrm{plim } z_n=c\). The converse is not necessarily true: a sequence may converge in probability even when its mean or variance does not converge (or does not exist).
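This result follows from Chebyshev's inequality (a standard step, sketched here for completeness): applying it to \(z_n-c\) gives, for any \(ε>0\),
\[P\left( \left| z_n-c \right|\geq ε \right)\leq \frac{E\left[ \left( z_n-c \right)^2 \right]}{ε^2}=\frac{Var\left( z_n \right)+\left( E\left( z_n \right)-c \right)^2}{ε^2}\]
Both terms in the numerator tend to 0 under the stated conditions, so \(P\left( \left| z_n-c \right|<ε \right)→1\), which is exactly the definition of \(\textrm{plim } z_n=c\).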
Example
- Let \(x_1,x_2,…,x_n\) be an IID random sample with \(E\left( x_i \right)=μ\) and \(Var\left( x_i \right)=σ^2\). Then
\[\textrm{plim } {\bar{x}}_n=μ\]
- This is an important result called the (weak) law of large numbers (LLN).
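The LLN can be illustrated the same way. The sketch below assumes IID Uniform(0,1) draws, so \(μ=0.5\), and estimates \(P\left( \left| {\bar{x}}_n-μ \right|<ε \right)\) by repeating the experiment many times for each \(n\); the function name and parameter values are illustrative choices, not part of the notes above.

```python
import random

# Illustration of the weak LLN with IID Uniform(0,1) draws (mu = 0.5).
def coverage(n, mu=0.5, eps=0.02, reps=2_000, seed=1):
    """Monte Carlo estimate of P(|x_bar_n - mu| < eps)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        x_bar = sum(rng.random() for _ in range(n)) / n
        hits += abs(x_bar - mu) < eps
    return hits / reps

for n in (10, 100, 10_000):
    # The estimated probability approaches 1 as n grows.
    print(n, coverage(n))
```

Note that the preceding Result gives a quick proof here: \(E\left( {\bar{x}}_n \right)=μ\) for every \(n\) and \(Var\left( {\bar{x}}_n \right)=σ^2/n→0\), so \(\textrm{plim } {\bar{x}}_n=μ\).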