Standardizing a multivariate normal random vector
Problem
\(X=\left( X_1, \ldots ,X_k \right)\) is a \(k×1\) random vector following a multivariate normal distribution, \(X \sim N\left( μ, Σ \right)\), where \(μ=E\left( X \right)\) is \(k×1\) and \(Σ\) is the \(k×k\) variance (or variance-covariance) matrix of \(X\). Each random variable \(X_i\) follows a normal distribution with expected value \(μ_i\) and variance \(Σ_{i,i}\), and the covariances \(Cov\left( X_i,X_j \right)\) are equal to \(Σ_{i,j}\).
We define \(Σ^{-1/2}\) as the inverse of \(Σ^{1/2}\), which in turn is defined by \(Σ^{1/2}Σ^{1/2}=Σ\). \(Σ^{-1/2}\) is \(k×k\) and symmetric and always exists if \(Σ\) is positive definite. The standard power rules apply:
- \(Σ^{-1/2}Σ^{-1/2}=Σ^{-1}\)
- \(Σ^{-1/2}Σ^{1/2}=I\)
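The power rules can be checked numerically. The sketch below builds the symmetric square root of an illustrative \(2×2\) positive-definite matrix via the eigendecomposition \(Σ=Q\,diag(λ)\,Q^T\), so that \(Σ^{1/2}=Q\,diag(\sqrt{λ})\,Q^T\); the matrix values are assumptions chosen for the example.

```python
import numpy as np

# Illustrative 2x2 positive-definite covariance matrix (values are assumptions).
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])

# Symmetric square root via eigendecomposition:
# Sigma = Q diag(lam) Q^T  =>  Sigma^{1/2} = Q diag(sqrt(lam)) Q^T
lam, Q = np.linalg.eigh(Sigma)
Sigma_half = Q @ np.diag(np.sqrt(lam)) @ Q.T
Sigma_neg_half = Q @ np.diag(1.0 / np.sqrt(lam)) @ Q.T

# Check the defining property and the power rules stated above.
assert np.allclose(Sigma_half @ Sigma_half, Sigma)                     # S^{1/2} S^{1/2} = Sigma
assert np.allclose(Sigma_neg_half @ Sigma_neg_half, np.linalg.inv(Sigma))  # S^{-1/2} S^{-1/2} = Sigma^{-1}
assert np.allclose(Sigma_neg_half @ Sigma_half, np.eye(2))             # S^{-1/2} S^{1/2} = I
```

Because \(Σ\) is symmetric positive definite, all eigenvalues are positive, so the square roots and reciprocals above are well defined.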
Show that
\[Σ^{-1/2}(X-μ) \sim N(0,I)\]
Learning point:
- For a random variable \(X\) following normal distribution, \(X \sim N\left( μ,σ^2 \right)\) , we have the important result that we can normalize it:
\[ \frac{X-μ}{σ} \sim N\left( 0,1 \right)\]
- The multivariate result corresponding to this is
\[Σ^{-1/2}(X-μ) \sim N(0,I)\]
- We can write the single variable result as
\[{\left( σ^2 \right)}^{-1/2}(X-μ) \sim N\left( 0,1 \right)\]
- which makes the multivariate result a more natural extension.
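In the \(k=1\) case the matrix formula collapses to the familiar scalar one, since \({(σ^2)}^{-1/2}=1/σ\). A minimal check, with illustrative values for \(μ\), \(σ^2\), and \(x\):

```python
import numpy as np

mu, sigma2 = 5.0, 4.0   # illustrative mean and variance (assumptions)
x = 7.0

# Classic standardization: (x - mu) / sigma
z_classic = (x - mu) / np.sqrt(sigma2)

# Same thing written as (sigma^2)^{-1/2} (x - mu), the k = 1 matrix form
z_matrix = sigma2 ** (-0.5) * (x - mu)

assert np.isclose(z_classic, z_matrix)
```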
Solution
Each entry of \(Σ^{-1/2}(X-μ)\) is a linear function of jointly normally distributed random variables, so \(Σ^{-1/2}(X-μ)\) must follow a multivariate normal distribution. All that remains is to find its expected value and variance.
\[E\left( Σ^{-1/2}(X-μ) \right)=Σ^{-1/2}E\left( X-μ \right)\]
which is zero since \(E\left( X-μ \right)=E\left( X \right)-μ=μ-μ=0\)
Using \(Var\left( AX \right)=A\,Var\left( X \right)A'\) with \(A=Σ^{-1/2}\), and noting that \(Σ^{-1/2}\) is symmetric so that \(A'=A\),
\[Var\left( Σ^{-1/2}(X-μ) \right)=Σ^{-1/2}Var\left( X-μ \right)Σ^{-1/2}\]
Since \(Var\left( X-μ \right)=Var\left( X \right)=Σ\) (subtracting a constant does not change the variance) we have
\[Var\left( Σ^{-1/2}(X-μ) \right)=Σ^{-1/2}ΣΣ^{-1/2}\]
We can write \(Σ=Σ^{1/2}Σ^{1/2}\). Then
\[Σ^{-1/2}ΣΣ^{-1/2}=Σ^{-1/2}Σ^{1/2}Σ^{1/2}Σ^{-1/2}=\left( Σ^{-1/2}Σ^{1/2} \right)\left( Σ^{1/2}Σ^{-1/2} \right)=I⋅I=I\]
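The whole result can be verified empirically: draw many samples of \(X \sim N(μ,Σ)\), standardize each one, and check that the sample mean is close to \(0\) and the sample covariance close to \(I\). The mean vector and covariance matrix below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative mean vector and positive-definite covariance matrix (assumptions).
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])

# Symmetric inverse square root Sigma^{-1/2} via eigendecomposition.
lam, Q = np.linalg.eigh(Sigma)
Sigma_neg_half = Q @ np.diag(1.0 / np.sqrt(lam)) @ Q.T

# Draw many samples of X ~ N(mu, Sigma) and standardize each row.
X = rng.multivariate_normal(mu, Sigma, size=200_000)   # shape (n, 2)
# Row-wise Sigma^{-1/2}(x - mu); valid because Sigma^{-1/2} is symmetric.
Z = (X - mu) @ Sigma_neg_half

# Standardized sample: mean approximately 0, covariance approximately I.
print(np.round(Z.mean(axis=0), 3))
print(np.round(np.cov(Z.T), 3))
```

With 200,000 draws the sample mean and covariance should match \(0\) and \(I\) to about two decimal places; sampling noise shrinks as the sample size grows.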