Probability theory and statistics

Chapter 4: Several random variables

By Lund University

Our setup in probability theory is always an experiment, a collection of possible outcomes, a collection of events and probabilities assigned to each of these events. On top of this we have so far defined a single random variable. What we will do now is to define several random variables on top of the same experiment. Defining several random variables is no more mysterious than defining several functions on the same domain. They will simply map a given outcome to different numbers. More on this in the first section. The second section then introduces joint distribution functions for these random variables. We then introduce conditional distribution functions in section three, which will simplify calculations of the conditional probabilities introduced in chapter 1. Given several random variables, we can create a new one by taking a function of them. Section four looks at this as well as how to find moments of a random variable defined in this way. Several random variables will also have associated joint moments, and section five looks at the most important joint moment, the covariance. Sections six and seven look at another type of moments, namely conditional moments. Section six focuses on conditional expectations while section seven looks at conditional variances and the important law of iterated expectations.

Several random variables

Given several random variables, we can define events involving some or all of them and assign probabilities to these events. This allows us to calculate probabilities involving several random variables at once, such as the probability that X = 2 and Y = 1.
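As a brief sketch of the notation (the particular values 2 and 1 are just illustrative), such an event is the intersection of two single-variable events, and its probability is written

P(X = 2, Y = 1) = P(\{X = 2\} \cap \{Y = 1\}).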

Several random variables

Problem: Two random variables

Distribution functions for several random variables

With two random variables, the probability mass function, which was a function of one variable, will be replaced with the joint probability mass function, a function of two variables. The probability density function will similarly be replaced with the joint probability density function. Joint distribution functions can be used to calculate probabilities involving both of the random variables. In many cases, we are given a joint distribution function but want to calculate a probability involving only one of the random variables. In this case, we must find the marginal PDF/PMF, which is then a function of only one variable. Finally, we will define what we mean by two random variables being independent.
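As a sketch of the standard definitions for two random variables X and Y, the joint PMF, the marginal PMF of X, and independence can be written as

p_{X,Y}(x, y) = P(X = x, Y = y),
p_X(x) = \sum_y p_{X,Y}(x, y),
X, Y \text{ independent} \iff p_{X,Y}(x, y) = p_X(x)\, p_Y(y) \text{ for all } x, y,

with the analogous continuous versions using the joint PDF, e.g. the marginal PDF f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dy.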

The joint probability mass function

The marginal probability mass function

The joint probability density function

The marginal probability density function

Independent random variables

Conditional distribution functions

Early in the course we talked about conditional probabilities. With two random variables I can define the conditional probability mass function as the probability that the first random variable takes a particular value given that the second random variable takes another given value. For continuous random variables, we will define the conditional probability density function or the conditional PDF. In a problem we will illustrate that if the random variables are independent then the conditional PDF/PMF will be equal to the marginal.
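As a sketch, the standard definitions are

p_{X|Y}(x \mid y) = \frac{p_{X,Y}(x, y)}{p_Y(y)}, \qquad f_{X|Y}(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)},

defined whenever the denominator is positive. If X and Y are independent, the joint PMF/PDF factorizes and these expressions reduce to the marginals.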

Conditional probability mass function

Conditional probability density function

Problem: Conditional PDF is marginal if X, Y are independent

Function of several random variables and its moments

In chapter 3 we talked about the importance of being able to define a new random variable as a function of an existing random variable. If I have two random variables, I will be able to define a third random variable as a function of my two existing ones. More generally, I can create a new random variable as a function of an arbitrary number of existing random variables. As was the case when we talked about a function of a single random variable, finding the moments of a random variable defined as a function of existing ones is generally a simpler problem than finding its PDF/PMF, particularly if the function is linear.
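As a sketch, for a new random variable Z = g(X, Y) the expected value can be computed directly from the joint PMF/PDF, and for a linear function the moments follow without finding the distribution of Z at all:

E[g(X, Y)] = \sum_x \sum_y g(x, y)\, p_{X,Y}(x, y) \quad \text{or} \quad \iint g(x, y)\, f_{X,Y}(x, y)\, dx\, dy,
E[aX + bY] = a\,E[X] + b\,E[Y],
\mathrm{Var}(aX + bY) = a^2\,\mathrm{Var}(X) + b^2\,\mathrm{Var}(Y) + 2ab\,\mathrm{Cov}(X, Y),

where the covariance term is defined in the next section and vanishes when X and Y are independent.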

Function of several random variables

Expected value of a function of several random variables

Linear function of several random variables

Problem: Expected value and variance of a difference

Covariance and correlation

With two or more random variables, I can define what are called joint moments. These are moments that the random variables carry as a group, and they can be calculated from the joint PDF/PMF. The most important joint moment of two random variables is the covariance. We will begin by introducing covariance, correlation and independence from a nontechnical point of view. We will then look at the expected value of the product of two random variables, which will help us understand the definition of covariance. We will end this section by looking at some important results related to the covariance between two random variables.
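As a sketch of the key definitions:

\mathrm{Cov}(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]\,E[Y],
\rho_{X,Y} = \frac{\mathrm{Cov}(X, Y)}{\sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y)}}.

If X and Y are independent then Cov(X, Y) = 0, while a zero covariance does not in general imply independence.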

Covariance, correlation and independence (intro)

The expected value of the product of two random variables

Covariance

Covariance, results

Conditional expectation

We have talked about the expected value of a random variable and we have talked about the conditional PDF/PMF. Given two random variables, we can define the conditional expectation of one of them given the other. Intuitively, the conditional expectation of Y given X is the value that I expect for Y if I know the value that the random variable X took. The conditional expectation can be calculated from the conditional PDF/PMF.
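As a sketch, the conditional expectation is computed from the conditional PMF or PDF as

E[Y \mid X = x] = \sum_y y\, p_{Y|X}(y \mid x) \quad \text{(discrete case)},
E[Y \mid X = x] = \int_{-\infty}^{\infty} y\, f_{Y|X}(y \mid x)\, dy \quad \text{(continuous case)}.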

Conditional expectations, discrete random variables

Conditional expectations, continuous random variables

Problem: Conditional expectations, independent random variables

Law of iterated expectation and conditional moments

We begin this section with the important law of iterated expectations, which connects the regular unconditional expected value to the conditional expected value. Just like it is possible to find the expected value of a function of a random variable, we can find the conditional expected value of a function of a random variable by evaluating a sum or an integral. Finally, we define the conditional variance of a random variable Y given another random variable X.
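As a sketch, the law of iterated expectations and the conditional variance can be written as

E[Y] = E\big[\,E[Y \mid X]\,\big],
\mathrm{Var}(Y \mid X = x) = E\big[(Y - E[Y \mid X = x])^2 \mid X = x\big] = E[Y^2 \mid X = x] - \big(E[Y \mid X = x]\big)^2.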

Law of iterated expectations

Conditional expectation of a function of a random variable

Problem: Conditional expectations

Problem: Conditional expectation, uncorrelatedness and independence

Conditional variance