Maximum likelihood, discrete random variables

Problem

We have a random sample in which each \(y_i\) follows a Bernoulli distribution with parameter \(p\),

\[y_i \sim Ber(p)\]

for \(i=1, \ldots ,n\). This means that \(y_i\) takes the value 0 with probability \(1-p\) and the value 1 with probability \(p\).

  1. Show that \(f\left( y_i;p \right)\), the probability mass function of \(y_i\), can be written as \(f\left( y_i;p \right)=p^{y_i}{\left( 1-p \right)}^{1-y_i}\)
  2. What is \(L_i\left( p \right)\) , the likelihood function for observation \(i\) ?
  3. What is \(l_i(p)\) , the loglikelihood function for observation \(i\) ?
  4. Find the joint density \(f_J(y)\)
  5. Find the likelihood function \(L(p)\)
  6. Find the loglikelihood function \(l(p)\) in two ways: by using \(l\left( p \right)=log L(p)\) and by using \(l\left( p \right)=\sum l_i\left( p \right)\). Use the symbol \(n_1=\sum y_i\).
  7. Find \(s_i(p)\)
  8. Find \(s(p)\) in two ways, by using \(s\left( p \right)=dl/dp\) and by using \(s\left( p \right)=\sum s_i\left( p \right)\).
  9. Find the stationary point of \(l(p)\) , the solution to \(s\left( p \right)=0\) (the maximum likelihood estimate)
  10. Find the second derivative \(d^2l_i/dp^2\)
  11. Find \(I\left( p \right)\) as \(-E\left( d^2l_i/dp^2 \right)\)
  12. Find \(I\left( p \right)\) as \(E\left( s_i{\left( p \right)}^2 \right)\) and confirm that the answer is the same as in 11.
  13. Suppose that \(n=100\) and \(n_1=50\) . Find   \({\hat{p}}_{ML}\) and \(\hat{V}\) .
  14. Suppose that \(n=100\) and \(n_1=50\) . Find a 95% confidence interval for \(p\) assuming that \(n\) is large enough for \(\sqrt{n}\left( {\hat{p}}_{ML}-p \right)\) to be approximately \(N(0,V)\) .
  15. In this problem, one can show that \(I\left( {\hat{p}}_{ML} \right)=I_G\left( {\hat{p}}_{ML} \right)=I_H\left( {\hat{p}}_{ML} \right)\) (this is a nice algebra problem that you can do if you like). Confirm this using the data \(y_1=1, y_2=1,y_3=1,y_4=0,y_5=0\) .

Solution

a. Plugging \(y_i=0\) and \(y_i=1\) into the proposed formula \(p^{y_i}{\left( 1-p \right)}^{1-y_i}\) gives

\[P\left( y_i=0 \right)=f\left( 0;p \right)=p^0{\left( 1-p \right)}^{1-0}=1-p\]

and

\[P\left( y_i=1 \right)=f\left( 1;p \right)=p^1{\left( 1-p \right)}^{1-1}=p\]

b.

\[L_i\left( p \right)=f\left( y_i;p \right)=p^{y_i}{\left( 1-p \right)}^{1-y_i}\]

c.

\[l_i\left( p \right)=log L_i\left( p \right)=log p^{y_i}{\left( 1-p \right)}^{1-y_i}=y_ilog p+\left( 1-y_i \right)log \left( 1-p \right)\]

d.

\[f_J\left( y;p \right)=\prod_{i=1}^{n}{ f\left( y_i;p \right) }=\prod_{i=1}^{n}{ p^{y_i}{\left( 1-p \right)}^{1-y_i} }=\]

\[\prod_{i=1}^{n}{ p^{y_i} }\prod_{i=1}^{n}{ {\left( 1-p \right)}^{1-y_i} }=p^{\sum_{i=1}^{n}{ y_i }}{\left( 1-p \right)}^{\sum_{i=1}^{n}{ \left( 1-y_i \right) }}=p^{\sum_{i=1}^{n}{ y_i }}{\left( 1-p \right)}^{n-\sum_{i=1}^{n}{ y_i }}\]
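
As a quick sanity check, the product form and the collapsed form can be compared numerically. A minimal Python sketch, using a made-up 0/1 sample `y` and an arbitrary value of `p` (both are illustration choices, not data from the problem):

```python
import numpy as np

y = np.array([1, 0, 1, 1, 0])    # hypothetical 0/1 sample
p = 0.4                          # arbitrary parameter value
n, n1 = len(y), y.sum()

f_product = np.prod(p**y * (1 - p)**(1 - y))   # prod_i f(y_i; p)
f_collapsed = p**n1 * (1 - p)**(n - n1)        # simplified form

print(np.isclose(f_product, f_collapsed))      # True
```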

e.

\[L\left( p \right)=f_J\left( y;p \right)=p^{\sum_{i=1}^{n}{ y_i }}{\left( 1-p \right)}^{n-\sum_{i=1}^{n}{ y_i }}\]

f. First:

\[l\left( p \right)=log L\left( p \right)=log p^{n_1}{\left( 1-p \right)}^{n-n_1}=n_1log p +\left( n-n_1 \right)log \left( 1-p \right)\]

Second:

\[l\left( p \right)=\sum_{i=1}^{n}{ l_i\left( p \right) }=\sum_{i=1}^{n}{ \left( y_ilog p+\left( 1-y_i \right)log \left( 1-p \right) \right) }=\]

\[=log p\sum_{i=1}^{n}{ y_i }+log \left( 1-p \right)\sum_{i=1}^{n}{ \left( 1-y_i \right) }=n_1log p+\left( n-n_1 \right)log \left( 1-p \right)\]
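
The two routes to \(l(p)\) can be checked numerically in the same spirit. A minimal Python sketch (the sample `y` and the values of `p` are arbitrary choices for illustration):

```python
import numpy as np

y = np.array([1, 0, 1, 1, 0])    # hypothetical sample
n, n1 = len(y), y.sum()

def loglik_direct(p):
    # l(p) = n1*log(p) + (n - n1)*log(1 - p)
    return n1 * np.log(p) + (n - n1) * np.log(1 - p)

def loglik_sum(p):
    # l(p) = sum_i [ y_i*log(p) + (1 - y_i)*log(1 - p) ]
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

for p in (0.2, 0.5, 0.8):
    print(np.isclose(loglik_direct(p), loglik_sum(p)))   # True each time
```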

g.

\[s_i\left( p \right)= \frac{dl_i\left( p \right)}{dp}= \frac{y_i}{p}- \frac{1-y_i}{1-p}= \frac{y_i-p}{p\left( 1-p \right)}\]

h.

First:

\[s\left( p \right)= \frac{dl}{dp}= \frac{n_1}{p}- \frac{n-n_1}{1-p}= \frac{n_1-np}{p\left( 1-p \right)}\]

Second:

\[s\left( p \right)=\sum_{i=1}^{n}{ s_i\left( p \right) }=\sum_{i=1}^{n}{ \frac{y_i-p}{p\left( 1-p \right)} }= \frac{n_1-np}{p\left( 1-p \right)}\]

i. Setting \(s\left( p \right)=0\):

\[ \frac{n_1-np}{p\left( 1-p \right)}=0\]

implies \(n_1-np=0\) or

\[{\hat{p}}_{ML}= \frac{n_1}{n}= \frac{1}{n}\sum_{i=1}^{n}{ y_i }=\bar{y}\]
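
A numerical check of this result, sketched in Python: the score should vanish at \(\bar{y}\), and maximizing \(l(p)\) directly (here with `scipy.optimize.minimize_scalar`) should return the same value. The sample `y` below is hypothetical:

```python
import numpy as np
from scipy.optimize import minimize_scalar

y = np.array([1, 0, 1, 1, 0])            # hypothetical sample
n, n1 = len(y), y.sum()

def score(p):
    # s(p) = (n1 - n*p) / (p*(1 - p))
    return (n1 - n * p) / (p * (1 - p))

def neg_loglik(p):
    # minus l(p), so a minimizer gives the ML estimate
    return -(n1 * np.log(p) + (n - n1) * np.log(1 - p))

p_hat = n1 / n                           # closed-form estimate, ybar
res = minimize_scalar(neg_loglik, bounds=(1e-6, 1 - 1e-6), method="bounded")

print(score(p_hat))                      # 0: stationary point
print(p_hat, res.x)                      # both approximately 0.6
```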

j.

\[ \frac{d^2l_i}{dp^2}= \frac{ds_i}{dp}= \frac{d}{dp}\left( \frac{y_i}{p}- \frac{1-y_i}{1-p} \right)=- \frac{y_i}{p^2}- \frac{1-y_i}{{\left( 1-p \right)}^2}\]

k.

\[I\left( p \right)=-E\left( \frac{d^2l_i}{dp^2} \right)=-E\left( - \frac{y_i}{p^2}- \frac{1-y_i}{{\left( 1-p \right)}^2} \right)= \frac{E\left( y_i \right)}{p^2}+ \frac{1-E\left( y_i \right)}{{\left( 1-p \right)}^2}\]

Now,

\[E\left( y_i \right)=1⋅P\left( y_i=1 \right)+0⋅P\left( y_i=0 \right)=1⋅p+0⋅\left( 1-p \right)=p\]

and

\[I\left( p \right)= \frac{p}{p^2}+ \frac{1-p}{{\left( 1-p \right)}^2}= \frac{1}{p}+ \frac{1}{1-p }= \frac{1}{p\left( 1-p \right)}\]

l.

\[I\left( p \right)=E\left( s_i{\left( p \right)}^2 \right)=E\left( {\left( \frac{y_i-p}{p\left( 1-p \right)} \right)}^2 \right)= \frac{E\left( {\left( y_i-p \right)}^2 \right)}{p^2{\left( 1-p \right)}^2}\]

Now, \(E\left( {\left( y_i-p \right)}^2 \right)\) is the variance of \(y_i\), which equals \(p\left( 1-p \right)\) for a Bernoulli random variable. We can show this:

\[E\left( {\left( y_i-p \right)}^2 \right)=E\left( y_i^2-2py_i+p^2 \right)=E\left( y_i^2 \right)-2pE\left( y_i \right)+p^2\]

\(y_i^2\) is the same random variable as \(y_i\) (since \(0^2=0\) and \(1^2=1\) ) so \(E\left( y_i^2 \right)=p\) . We have

\[E\left( {\left( y_i-p \right)}^2 \right)=p-2p^2+p^2=p-p^2=p\left( 1-p \right)\]

Therefore,

\[I\left( p \right)= \frac{p\left( 1-p \right)}{p^2{\left( 1-p \right)}^2}= \frac{1}{p\left( 1-p \right)}\]

Same as k.
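
Both expectations can also be approximated by simulation. A Python sketch, with an arbitrary true value of \(p\) and a large simulated sample (illustration choices, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.3                                     # arbitrary true parameter
y = rng.binomial(1, p, size=200_000)        # simulated Bernoulli draws

score_i = (y - p) / (p * (1 - p))           # s_i(p)
hess_i = -y / p**2 - (1 - y) / (1 - p)**2   # d^2 l_i / dp^2

print(np.mean(score_i**2))                  # approx 1/(p*(1-p))
print(-np.mean(hess_i))                     # approx the same value
print(1 / (p * (1 - p)))                    # 4.7619...
```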

m.

\[{\hat{p}}_{ML}= \frac{n_1}{n}= \frac{50}{100}= \frac{1}{2}\]

Since \(I\left( p \right)\) is known in closed form, we use

\[\hat{V}=I{\left( {\hat{p}}_{ML} \right)}^{-1}={\hat{p}}_{ML}\left( 1-{\hat{p}}_{ML} \right)= \frac{1}{4}\]

n. The estimated \(Var\left( {\hat{p}}_{ML} \right)\) is \(n^{-1}\hat{V}=1/400\), so \(SE\left( {\hat{p}}_{ML} \right)=1/20=0.05\). The approximate 95% confidence interval for \(p\) is

\[{\hat{p}}_{ML}±1.96⋅0.05\]

or

\[0.5±0.098,\]

that is, \(\left( 0.402,\ 0.598 \right)\).
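
The same numbers in a short Python sketch (1.96 is the usual critical value from the normal approximation used above):

```python
import numpy as np

n, n1 = 100, 50
p_hat = n1 / n                        # 0.5
V_hat = p_hat * (1 - p_hat)           # I(p_hat)^{-1} = 1/4
se = np.sqrt(V_hat / n)               # 1/20 = 0.05

ci = (p_hat - 1.96 * se, p_hat + 1.96 * se)
print(ci)                             # (0.402, 0.598)
```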

o. With these data, \(n=5\) and \(n_1=3\), so \({\hat{p}}_{ML}=3/5=0.6\).

\[I\left( {\hat{p}}_{ML} \right)= \frac{1}{0.6⋅0.4}=4.17\]

\[I_H\left( {\hat{p}}_{ML} \right)= \frac{1}{5}\sum_{i=1}^{5}{ \left( \frac{y_i}{{0.6}^2}+ \frac{1-y_i}{{0.4}^2} \right) }=4.17\]

\[I_G\left( {\hat{p}}_{ML} \right)= \frac{1}{5}\sum_{i=1}^{5}{ {\left( \frac{y_i-0.6}{0.6⋅0.4} \right)}^2 }=4.17\]
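
These three evaluations are easy to reproduce; a short Python sketch using the five observations given in the problem:

```python
import numpy as np

y = np.array([1, 1, 1, 0, 0])         # the data from part o
p_hat = y.mean()                      # 0.6

I   = 1 / (p_hat * (1 - p_hat))
I_H = np.mean(y / p_hat**2 + (1 - y) / (1 - p_hat)**2)
I_G = np.mean(((y - p_hat) / (p_hat * (1 - p_hat)))**2)

print(I, I_H, I_G)                    # all approximately 4.17
```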

Algebra

\[I_H\left( p \right)=- \frac{1}{n}\sum_{i=1}^{n}{ \frac{d^2l_i\left( p \right)}{dp^2} }=- \frac{1}{n}\sum_{i=1}^{n}{ \left( - \frac{y_i}{p^2}- \frac{1-y_i}{{\left( 1-p \right)}^2} \right) }=\]

\[= \frac{1}{np^2}\sum_{i=1}^{n}{ y_i }+ \frac{1}{n{\left( 1-p \right)}^2}\sum_{i=1}^{n}{ \left( 1-y_i \right) }= \frac{n_1}{np^2}+ \frac{n-n_1}{n{\left( 1-p \right)}^2}\]

Evaluate this at \(p={\hat{p}}_{ML}=n_1/n\) :

\[I_H\left( {\hat{p}}_{ML} \right)= \frac{1}{{\hat{p}}_{ML}}+ \frac{1}{1-{\hat{p}}_{ML}}= \frac{1}{{\hat{p}}_{ML}\left( 1-{\hat{p}}_{ML} \right)}=I\left( {\hat{p}}_{ML} \right)\]

A similar calculation, using \(\sum_{i=1}^{n}{ {\left( y_i-{\hat{p}}_{ML} \right)}^2 }=n{\hat{p}}_{ML}\left( 1-{\hat{p}}_{ML} \right)\), shows that \(I_G\left( {\hat{p}}_{ML} \right)\) equals the same expression.
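
The simplification of \(I_H\) at the ML estimate can also be checked symbolically; a sketch with `sympy` (the symbol declarations are illustration choices):

```python
import sympy as sp

n, n1, p = sp.symbols("n n_1 p", positive=True)

# I_H(p) = n1/(n*p**2) + (n - n1)/(n*(1 - p)**2)
I_H = n1 / (n * p**2) + (n - n1) / (n * (1 - p)**2)

p_hat = n1 / n
diff = sp.simplify(I_H.subs(p, p_hat) - 1 / (p_hat * (1 - p_hat)))
print(diff)   # 0
```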