Introduction to Econometrics

Chapter 2: Introduction to probability theory

By Lund University

In order to make more sense of the concepts introduced in chapter 1, we need some probability theory and statistics. We want to be able to explain observed deviations from the trend line, and we will do that with random variables called error terms. This chapter covers the absolute minimum of probability theory: random variables, distribution functions, expected value, variance and covariance. It also introduces conditional moments, which will turn out to be of great importance in econometrics, as the fundamental assumption on the error terms will be stated as a conditional expectation.
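As a preview, and assuming the simple linear regression setup from chapter 1 (y the dependent variable, x the explanatory variable, u the error term), the assumption will take the form of a conditional expectation:

    y = \beta_0 + \beta_1 x + u, \qquad E(u \mid x) = 0,

that is, given any value of x, the error term is zero on average.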

Random variables and distributions

The most fundamental concept in probability theory is the random variable. We will not go through the formal definition of a random variable, as it is very technical. However, we will be able to develop an intuitive understanding of a random variable, and this is all we need. It will turn out to be useful to distinguish between discrete random variables and continuous random variables. Random variables are intimately connected to their distribution functions, and this will be discussed in detail. Finally, we look at a specific random variable, the standard normal random variable. The standard normal has the well-known bell-shaped density function.
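For reference, and using standard notation (F for the cumulative distribution function and f for the density of a continuous random variable X), the central objects of this section are

    F(x) = P(X \le x), \qquad f(x) = F'(x), \qquad P(a < X \le b) = \int_a^b f(t)\,dt,

and the density of the standard normal,

    \phi(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}.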

Random variable (page is broken; use https://www.youtube.com/watch?v=LXHYs40tZfo)

Distribution functions

Standard normal

Problem: The cdf function

Problem: The cdf function

Problem: The pdf function

Problem: The cdf function

Problem: The pdf function

Problem: Find pdf from cdf

Problem: integrating the pdf function

Problem: Find probabilities from the pdf

Problem: The standard normal

Moments of a random variable

Once we know what a random variable is, we look at the important properties of a random variable. Most important are its expected value and its variance. We will look at the definitions as well as the intuition behind these properties. Next, we can create a new random variable from an old one. If the new one is a linear function of the old one, then figuring out its expected value and variance is particularly simple. We end this section with the normal random variables, which may have any expected value and any positive variance.
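As a sketch, and assuming the notation E for expected value and Var for variance, the definitions used in this section are

    E(X) = \sum_x x \, P(X = x) \quad (discrete case), \qquad E(X) = \int_{-\infty}^{\infty} x \, f(x)\,dx \quad (continuous case),

    Var(X) = E\big[(X - E(X))^2\big],

and for a linear function a + bX of X,

    E(a + bX) = a + b\,E(X), \qquad Var(a + bX) = b^2\,Var(X).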

Expected value of a discrete random variable

Expected value of a continuous random variable

The variance of a random variable

The expected value and variance of a linear function of a random variable

The normal distribution

Moments of two or more random variables

In the previous section we looked at a single random variable and its moments. In this section we look at several random variables and the joint moments of two of them. First, we look at covariance, correlation and independence. Then, we look at the conditional expectation and the conditional variance of one random variable given another. We end this section by viewing a sample as a sequence of random variables, introducing the concept of a random sample, meaning that all the random variables in the sequence are independent and have the same distribution.
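In standard notation, and assuming X and Y are two random variables with finite variances, the first two concepts can be written as

    Cov(X, Y) = E\big[(X - E(X))(Y - E(Y))\big], \qquad Corr(X, Y) = \frac{Cov(X, Y)}{\sqrt{Var(X)\,Var(Y)}},

and a random sample X_1, \ldots, X_n is a sequence of random variables that are independent and all have the same distribution (i.i.d.).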

Covariance, correlation and independence (intro)

Conditional expectation and conditional variance, introduction

Sample as a sequence of random variables