Estimating the parameters of a statistical model
Summary
- Given a random sample \(\left( y_i, x_i \right)\) and a statistical model \(E(y_i \mid x_i) = g(x_i, \beta)\) for \(i = 1, \ldots, n\)
- The goal is to estimate \(\beta\), the unknown part of \(g(x_i, \beta)\)
- An estimator of \(\beta\) will typically be denoted by \(b\) (\(b\) is a function of the random sample).
- Given an estimator \(b\) of \(\beta\), we can estimate \(E(y_i \mid x_i)\) by
\[{\hat{y}}_i=g(x_i,b)\]
- An estimate of \(E(y_i \mid x_i)\) is called a fitted value. For a linear model we have
\[{\hat{y}}_i = x_i'b\]
- Residuals are defined as \(e_i = y_i - {\hat{y}}_i\).
- For a linear model \(E(y_i \mid x_i) = x_i'\beta\), the obvious candidate for an estimator of \(\beta\) is the OLS estimator
\[b={\left( X'X \right)}^{-1}X'y\]
- In this case, the \({\hat{y}}_i\) are called the OLS fitted values and the \(e_i\) are called the OLS residuals.
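The steps above can be sketched numerically. The following is a minimal illustration in Python, where the data-generating process (the true \(\beta\), the regressor, and the noise) is simulated purely for demonstration: it computes the OLS estimator \(b = (X'X)^{-1}X'y\), the fitted values \({\hat{y}}_i = x_i'b\), and the residuals \(e_i = y_i - {\hat{y}}_i\).

```python
import numpy as np

# Simulated linear model y_i = x_i' beta + u_i (beta and the sample
# here are illustrative, not from the notes).
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one regressor
beta = np.array([1.0, 2.0])                            # "true" (unknown) parameter
y = X @ beta + rng.normal(size=n)                      # random sample

# OLS estimator b = (X'X)^{-1} X'y.  Solving the normal equations
# X'X b = X'y is preferable to forming the inverse explicitly.
b = np.linalg.solve(X.T @ X, X.T @ y)

y_hat = X @ b      # fitted values  \hat{y}_i = x_i' b
e = y - y_hat      # residuals      e_i = y_i - \hat{y}_i

# By construction the OLS residuals are orthogonal to the columns of X.
print(np.allclose(X.T @ e, 0.0, atol=1e-8))
```

With a reasonably large sample, \(b\) should land close to the simulated \(\beta\), and the orthogonality condition \(X'e = 0\) holds exactly (up to floating-point error) because it is the first-order condition defining the OLS estimator.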