Advanced econometrics
Chapter 5: The statistics of the linear regression model, small samples
By Lund University
The linear regression model
We begin by looking at what we mean by a random sample. Given a random sample, we formulate a statistical model, or a linear statistical model, by modeling the conditional expectation of y given x. We then consider b as an estimator of the unknown parameter beta. From the conditional expectations, we can introduce the error terms, which lead us to the fundamental regression model. A small simulation sketch follows the list of topics below.
Random sample, statistical model and linear statistical model
Estimating the parameters of a statistical model
Conditional expectations in matrix form
Error terms and the regression model
Misspecified models
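To make the setup concrete, here is a minimal simulation sketch. It is not taken from the course material: the sample size, parameter values and variable names are all illustrative, and it assumes NumPy is available. It generates a small sample from a linear regression model with E[y | x] equal to a linear function of x, and computes the OLS estimator b = (X'X)^(-1) X'y.

import numpy as np

rng = np.random.default_rng(0)

n, k = 50, 3                       # sample size and number of regressors (incl. a constant)
beta = np.array([1.0, 0.5, -2.0])  # parameter vector, known here only because we simulate

# Design matrix: a constant and two random regressors
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])

# Error terms with E[e | X] = 0, so that E[y | X] = X @ beta
e = rng.normal(size=n)
y = X @ beta + e

# OLS estimator of beta
b = np.linalg.inv(X.T @ X) @ (X.T @ y)
print("OLS estimate b:", b)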
Properties of the OLS estimator
We have now created the setup that we need to analyze the properties of the OLS estimator. We begin by looking at the OLS statistical formula, which relates the OLS estimator directly to the error terms. Using this formula, we can show that the OLS estimator is unbiased under the assumption of exogeneity. In order to say anything more about the OLS estimator, we introduce the concept of homoscedasticity (constant variance) and the Gauss-Markov assumptions. Under these new assumptions, we can find the variance of the OLS estimator. We also look at the Gauss-Markov theorem, which tells us that the OLS estimator is the Best Linear Unbiased Estimator under the GM assumptions. We end this section by looking at how to estimate the variance of the error terms and, from that, the variance of the OLS estimator. A numerical sketch of these formulas follows the list of topics below.
The OLS statistical formula
Unbiasedness of the OLS estimator
Homoscedasticity and Gauss-Markov assumptions
The variance of the OLS estimator
The Gauss-Markov theorem
Estimating the variance of the error terms
Estimating the variance of the OLS estimator
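As a rough illustration of these formulas (again an illustrative sketch with made-up numbers, assuming NumPy), the following code estimates the error variance by s^2 = e'e/(n - k), where e are the OLS residuals, and uses it to estimate the covariance matrix of the OLS estimator, s^2 (X'X)^(-1), which is its conditional variance under the Gauss-Markov assumptions.

import numpy as np

rng = np.random.default_rng(1)
n, k = 50, 3
beta = np.array([1.0, 0.5, -2.0])
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ beta + rng.normal(size=n)

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ (X.T @ y)            # OLS estimator
e_hat = y - X @ b                  # OLS residuals
s2 = (e_hat @ e_hat) / (n - k)     # unbiased estimator of the error variance
var_b = s2 * XtX_inv               # estimated covariance matrix of the OLS estimator
se_b = np.sqrt(np.diag(var_b))     # standard errors of the individual coefficients
print("s^2:", s2)
print("standard errors:", se_b)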
Inference in the linear regression model
Next, we look at hypothesis testing and confidence intervals in the linear regression model. For this we need the distribution of the OLS estimator, which requires additional assumptions on the error terms. In this section, we will assume that the error terms follow IID normal distributions. Using this assumption, we can derive the distribution of the OLS estimator. Once we have this distribution, we can do hypothesis testing. We begin with the simplest case, testing whether an individual beta-parameter is zero. We then build on this and learn how to test general linear restrictions using t, F and chi-square tests. This section also has a page on confidence intervals. A worked sketch of these tests follows the list of topics below.
The distribution of the OLS estimator
More on the distribution of the OLS estimator, the t-distribution
Hypothesis testing: simple t-test
Simple t-test: extensions
Confidence interval for a beta-parameter
Testing a single linear restriction
Testing several linear restrictions jointly
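The following sketch is illustrative only (the data, the tested coefficient and the restrictions are made up, and it assumes NumPy and SciPy). Under IID normal errors it carries out a simple t-test of the hypothesis that an individual beta-parameter is zero, builds the corresponding 95% confidence interval, and tests several linear restrictions R beta = q jointly with an F-test.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, k = 50, 3
beta = np.array([1.0, 0.5, -2.0])
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ beta + rng.normal(size=n)

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ (X.T @ y)            # OLS estimator
e_hat = y - X @ b
s2 = (e_hat @ e_hat) / (n - k)
var_b = s2 * XtX_inv
se_b = np.sqrt(np.diag(var_b))

# Simple t-test of H0: beta_j = 0 and a 95% confidence interval for beta_j
j = 1                              # coefficient on the first non-constant regressor
t_stat = b[j] / se_b[j]
p_val = 2 * (1 - stats.t.cdf(abs(t_stat), df=n - k))
t_crit = stats.t.ppf(0.975, df=n - k)
ci = (b[j] - t_crit * se_b[j], b[j] + t_crit * se_b[j])
print("t:", t_stat, "p-value:", p_val, "95% CI:", ci)

# F-test of the joint restriction R @ beta = q
# (here: the coefficients on both non-constant regressors are zero)
R = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
q = np.zeros(2)
m = R.shape[0]                     # number of restrictions
d = R @ b - q
F_stat = (d @ np.linalg.inv(R @ var_b @ R.T) @ d) / m
p_val_F = 1 - stats.f.cdf(F_stat, dfn=m, dfd=n - k)
print("F:", F_stat, "p-value:", p_val_F)

For a single linear restriction (m = 1), the F statistic is simply the square of the corresponding t statistic, which connects the simple t-test to the general framework above.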
Problems