The least squares solution

Summary

  • Result for \(i=1,\ldots,n\):

\[x_ix'_i=\begin{bmatrix}x_{i,1}^2 & \cdots & x_{i,1}x_{i,k} \\ \vdots & \ddots & \vdots \\ x_{i,k}x_{i,1} & \cdots & x_{i,k}^2\end{bmatrix}\]

  • If \(x_{i,1}=1\):

\[x_ix'_i=\begin{bmatrix}1 & x_{i,2} & \cdots & x_{i,k} \\ x_{i,2} & x_{i,2}^2 & \cdots & x_{i,2}x_{i,k} \\ \vdots & \vdots & \ddots & \vdots \\ x_{i,k} & x_{i,k}x_{i,2} & \cdots & x_{i,k}^2\end{bmatrix}\]

  • \(x_ix'_i\) is \(k\times k\)
  • Result:

\[\sum_{i=1}^{n}{ x_ix'_i }=X'X\]

  • \(X'X\) is \(k\times k\).
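The identity \(\sum_{i=1}^{n} x_ix'_i = X'X\) can be checked numerically. A minimal sketch using NumPy, with an arbitrary simulated design matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 3
X = rng.normal(size=(n, k))  # rows of X are the x_i'

# Sum of the n outer products x_i x_i', each of which is k x k
outer_sum = sum(np.outer(X[i], X[i]) for i in range(n))

# Matrix form X'X
XtX = X.T @ X

print(outer_sum.shape)            # (3, 3), i.e. k x k
print(np.allclose(outer_sum, XtX))
```

The two computations agree because stacking the \(x'_i\) as the rows of \(X\) makes \(X'X\) exactly the sum of the observation-level outer products.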
  • Result: The least squares problem has a unique solution if \(X'X\) is invertible (equivalently, if \(X\) has full column rank). If \(X'X\) is not invertible, we say that we have perfect multicollinearity.
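Perfect multicollinearity arises whenever one regressor is an exact linear combination of the others. A small illustration (the specific columns are made up for the example): with an intercept, a regressor, and a third column that is an exact affine function of that regressor, \(X'X\) is singular.

```python
import numpy as np

n = 10
x2 = np.arange(1.0, n + 1)

# Third column = 2*x2 + 3 is an exact linear combination of the
# intercept column and x2: perfect multicollinearity.
X = np.column_stack([np.ones(n), x2, 2.0 * x2 + 3.0])

XtX = X.T @ X
print(np.linalg.matrix_rank(XtX))  # less than k = 3, so X'X is not invertible
```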
  • Result: If \(X'X\) is invertible, then the solution to the least squares problem is, in matrix form:

\[b={\left( X'X \right)}^{-1}X'y\]

  • Solution in vector form:

\[b={\left( \sum_{i=1}^{n}{ x_ix'_i } \right)}^{-1}\sum_{i=1}^{n}{ x_iy_i }\]
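The matrix form and the vector form give the same \(b\), since \(X'X=\sum_i x_ix'_i\) and \(X'y=\sum_i x_iy_i\). A sketch on simulated data (the coefficients and sample size are arbitrary); solving the normal equations with `np.linalg.solve` avoids forming the inverse explicitly:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(size=n)

# Matrix form: b = (X'X)^{-1} X'y
b_matrix = np.linalg.solve(X.T @ X, X.T @ y)

# Vector form: the same solution built from sums over observations
XtX = sum(np.outer(X[i], X[i]) for i in range(n))
Xty = sum(X[i] * y[i] for i in range(n))
b_vector = np.linalg.solve(XtX, Xty)

print(np.allclose(b_matrix, b_vector))
```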

  • The solution is called the ordinary least squares estimator, or the OLS estimator.
  • If \(b\) is the OLS estimator, then the fitted values \(\hat{y}_i=x'_ib\) are called the OLS fitted values or the least squares fitted values.
  • If \(b\) is the OLS estimator, then the residuals \(e_i=y_i-x'_ib\) are called the OLS residuals or the least squares residuals.
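Fitted values and residuals follow directly from the estimator. A minimal sketch on simulated data (the true coefficients are arbitrary); it also checks the standard property that, because \(b\) solves the normal equations, the OLS residuals are orthogonal to the columns of \(X\):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 60
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = 0.5 + 1.5 * X[:, 1] + rng.normal(size=n)

b = np.linalg.solve(X.T @ X, X.T @ y)  # OLS estimator
fitted = X @ b                         # OLS fitted values
residuals = y - fitted                 # OLS residuals

# X'e = 0 up to floating-point error; with an intercept this
# implies the residuals sum to zero.
print(np.allclose(X.T @ residuals, 0.0))
```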