t-test and confidence intervals

Problem

Setup: a linear regression model with a random sample, the Gauss-Markov assumptions hold and the errors are normally distributed,

\[y_i=β_1+β_2x_i+ε_i,\qquad i=1,\ldots,n\]

Consider \(H_0: β_2=0\) against the two-sided alternative \(H_1: β_2\neq 0\). Show that if \(H_0\) is rejected at significance level \(α\), then 0 is not in the \(\left(1-α\right)\times 100\%\) confidence interval for \(β_2\).
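For reference, the argument below uses the standard test statistic and interval estimator for \(β_2\) under the stated assumptions (OLS estimator \(b_2\), critical value from the \(t\) distribution with \(n-2\) degrees of freedom):

\[t=\frac{b_2}{SE\left(b_2\right)},\qquad b_2 \pm t_{α/2,n-2}\cdot SE\left(b_2\right).\]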

Solution

If \(H_0\) is rejected at level \(α\), then it must be the case that \(\left| t \right|>t_{α/2,n-2}\), where \(t_{α/2,n-2}\) is the upper \(α/2\) critical value of the \(t\) distribution with \(n-2\) degrees of freedom.

This means that either \(t>t_{α/2,n-2}\) or \(t<-t_{α/2,n-2}\) .

Since \(t=b_2/SE\left( b_2 \right)\) under \(H_0\), either \(b_2/SE\left( b_2 \right)>t_{α/2,n-2}\) or \(b_2/SE\left( b_2 \right)<-t_{α/2,n-2}\).

Multiply both sides by \(SE\left( b_2 \right)\); since \(SE\left( b_2 \right)>0\), the direction of each inequality is preserved.

Then either \(b_2>SE\left( b_2 \right)⋅t_{α/2,n-2}\) or \(b_2<-SE\left( b_2 \right)⋅t_{α/2,n-2}\).

For 0 to lie in the \(\left(1-α\right)\times 100\%\) confidence interval \(b_2 \pm t_{α/2,n-2}\cdot SE\left( b_2 \right)\), \(b_2\) must satisfy

\[-SE\left( b_2 \right)⋅t_{α/2,n-2}<b_2<SE\left( b_2 \right)⋅t_{α/2,n-2}.\]

Either inequality from the rejection step violates this condition. Thus, 0 is not in the confidence interval.
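The equivalence can be checked numerically. The sketch below (assuming simulated data with hypothetical parameter values, not from the source) fits the regression by OLS, forms the t statistic and the confidence interval, and verifies that rejection of \(H_0\) coincides exactly with 0 falling outside the interval:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 30
x = rng.normal(size=n)
y = 1.0 + 0.5 * x + rng.normal(size=n)  # hypothetical: true beta2 = 0.5

# OLS fit of y = b1 + b2*x
X = np.column_stack([np.ones(n), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b
s2 = resid @ resid / (n - 2)                      # estimated error variance
se_b2 = np.sqrt(s2 / np.sum((x - x.mean()) ** 2))  # SE(b2)

alpha = 0.05
tcrit = stats.t.ppf(1 - alpha / 2, df=n - 2)       # t_{alpha/2, n-2}
t_stat = b[1] / se_b2                              # t statistic for H0: beta2 = 0
ci = (b[1] - tcrit * se_b2, b[1] + tcrit * se_b2)  # (1-alpha)*100% CI

reject = abs(t_stat) > tcrit
zero_in_ci = ci[0] < 0 < ci[1]
# The duality shown above: reject H0  <=>  0 is not in the CI
assert reject == (not zero_in_ci)
```

The final assertion holds for any draw of the data, since the rejection region and the interval are built from the same statistic and critical value.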