1.2. The multivariate probability distribution function

Until now we have been looking at univariate probability distribution functions, that is, probability functions related to a single variable. Often, however, we are interested in probability statements for several random variables jointly. In those cases it is necessary to introduce the concept of a multivariate probability function, or a joint distribution function.

In the discrete case we talk about the joint probability mass function, expressed as

f(x, y) = P(X = x, Y = y)

Example 1.11

Two persons, A and B, each flip a coin twice. We form the random variables X = "number of heads obtained by A" and Y = "number of heads obtained by B". We start by deriving the corresponding joint probability mass function using the classical definition of probability. The sample space is the same for A and B and equals {(H,H), (H,T), (T,H), (T,T)} for each of them, so the joint sample space consists of 16 (4 x 4) equally likely sample points. Counting the different combinations, we end up with the results presented in Table 1.5.

Table 1.5 Joint probability mass function, f(X, Y)

         Y = 0   Y = 1   Y = 2
X = 0    1/16    2/16    1/16
X = 1    2/16    4/16    2/16
X = 2    1/16    2/16    1/16

As an example, we can read that P(X = 0, Y = 1) = 2/16 = 1/8. Using this table we can, for instance, also determine probabilities such as P(X = 1, Y = 1) = 4/16 = 1/4 and P(X ≥ 1, Y = 0) = 2/16 + 1/16 = 3/16.
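To see where these counts come from, here is a minimal Python sketch (an illustration only; the names flips and joint_pmf are chosen for convenience) that assumes a fair coin and independent flips, enumerates all 16 sample points, and tallies the joint probabilities of Table 1.5:

    from fractions import Fraction
    from itertools import product

    # Outcomes of two coin flips for one person; both persons share this sample space.
    flips = list(product("HT", repeat=2))  # [('H','H'), ('H','T'), ('T','H'), ('T','T')]

    # Joint sample space for A and B: 4 x 4 = 16 equally likely points.
    joint_pmf = {}
    for a, b in product(flips, flips):
        x = a.count("H")  # X = number of heads obtained by A
        y = b.count("H")  # Y = number of heads obtained by B
        joint_pmf[(x, y)] = joint_pmf.get((x, y), Fraction(0)) + Fraction(1, 16)

    print(joint_pmf[(0, 1)])  # 1/8, matching P(X = 0, Y = 1) = 2/16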

Using the joint probability mass function we may derive the corresponding univariate probability mass function. When that is done using a joint distribution function we call it the marginal probability function. It is possible to derive a marginal probability function for each variable in the joint probability function. The marginal probability functions for X and Y are

f_X(x) = Σ_y f(x, y)    and    f_Y(y) = Σ_x f(x, y),

where each sum is taken over all values of the other variable.

Example 1.12

Find the marginal probability function for the random variable X given in Table 1.5. Summing over the columns in each row of Table 1.5 gives f_X(0) = 4/16 = 1/4, f_X(1) = 8/16 = 1/2, and f_X(2) = 4/16 = 1/4.
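The same row sums can be sketched in Python; the dictionary joint_pmf below simply encodes Table 1.5 (the encoding formula exploits the fact that the middle row and column carry twice the weight of the outer ones):

    from fractions import Fraction

    # Table 1.5 as a dictionary: (x, y) -> f(x, y), all entries in 16ths.
    joint_pmf = {(x, y): Fraction((2 if x == 1 else 1) * (2 if y == 1 else 1), 16)
                 for x in range(3) for y in range(3)}

    # Marginal of X: sum the joint pmf over all values of Y (row sums of Table 1.5).
    marginal_x = {x: sum(joint_pmf[(x, y)] for y in range(3)) for x in range(3)}
    for x, p in marginal_x.items():
        print(x, p)  # 0 1/4, 1 1/2, 2 1/4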

Another concept that is very important in regression analysis is that of statistically independent random variables. Two random variables X and Y are said to be statistically independent if and only if their joint probability mass function equals the product of their marginal probability functions for all combinations of X and Y:

f(x, y) = f_X(x)·f_Y(y)
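A short Python check of this condition on Table 1.5 (variable names are again illustrative) confirms that our two coin-flippers are indeed independent:

    from fractions import Fraction

    joint_pmf = {(x, y): Fraction((2 if x == 1 else 1) * (2 if y == 1 else 1), 16)
                 for x in range(3) for y in range(3)}
    marginal_x = {x: sum(joint_pmf[(x, y)] for y in range(3)) for x in range(3)}
    marginal_y = {y: sum(joint_pmf[(x, y)] for x in range(3)) for y in range(3)}

    # X and Y are independent iff f(x, y) = f_X(x) * f_Y(y) for every cell.
    print(all(joint_pmf[(x, y)] == marginal_x[x] * marginal_y[y]
              for x in range(3) for y in range(3)))  # True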

1.3. Characteristics of probability distributions

Even though the probability function of a random variable is informative and gives you all the information you need about the random variable, it is sometimes too detailed. It is therefore convenient to summarize the distribution of the random variable by some basic statistics. Below we briefly describe the most basic summary statistics for random variables and their probability distributions.

1.3.1. Measures of central tendency

There are several statistics that measure the central tendency of a distribution, but the single most important one is the expected value. The expected value of a discrete random variable is denoted E[X] and is defined as follows:

E[X] = Σ_x x·f(x),

where the sum is taken over all values x that the random variable can take.

It is interpreted as the mean and refers to the mean of the population. It is simply a weighted average of all X-values that the random variable can take, where the corresponding probabilities serve as weights.

Example 1.13

Use the marginal probability function in Example 1.12 to calculate the expected value of X:

E[X] = 0·(1/4) + 1·(1/2) + 2·(1/4) = 1
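In Python, the same probability-weighted average can be computed with exact fractions:

    from fractions import Fraction

    # Marginal probability function of X from Example 1.12.
    marginal_x = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

    # E[X] = sum of x * f(x): each value weighted by its probability.
    print(sum(x * p for x, p in marginal_x.items()))  # 1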

When working with the expectation operator it is important to know some of its basic properties:

1) The expected value of a constant equals the constant: E[c] = c

2) If c is a constant and X is a random variable, then E[cX] = cE[X]

3) If a, b, and c are constants and X and Y are random variables, then E[aX + bY + c] = aE[X] + bE[Y] + c

4) If X and Y are statistically independent, then E[XY] = E[X]E[Y], as verified numerically in the sketch below
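The following Python sketch verifies all four properties on the joint distribution of Table 1.5 (the helper expect and the constants a, b, c are chosen here purely for illustration):

    from fractions import Fraction
    from itertools import product

    joint_pmf = {(x, y): Fraction((2 if x == 1 else 1) * (2 if y == 1 else 1), 16)
                 for x, y in product(range(3), range(3))}

    def expect(g):
        """E[g(X, Y)] under the joint pmf of Table 1.5."""
        return sum(g(x, y) * p for (x, y), p in joint_pmf.items())

    a, b, c = 2, 3, 5  # arbitrary constants for the check
    e_x, e_y = expect(lambda x, y: x), expect(lambda x, y: y)

    print(expect(lambda x, y: c) == c)                                      # property 1
    print(expect(lambda x, y: c * x) == c * e_x)                            # property 2
    print(expect(lambda x, y: a * x + b * y + c) == a * e_x + b * e_y + c)  # property 3
    print(expect(lambda x, y: x * y) == e_x * e_y)                          # property 4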

The concept of expectation can easily be extended to the multivariate case. For the bivariate case we have

E[XY] = Σ_x Σ_y x·y·f(x, y)    (1.10)

Example 1.14

Calculate E[XY] using the information in Table 1.5. Using (1.10), and noting that all terms with x = 0 or y = 0 vanish, we obtain

E[XY] = 1·1·(4/16) + 1·2·(2/16) + 2·1·(2/16) + 2·2·(1/16) = 16/16 = 1

Since X and Y are independent, this agrees with E[XY] = E[X]E[Y] = 1·1 = 1.
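The same sum can be checked in Python:

    from fractions import Fraction

    joint_pmf = {(x, y): Fraction((2 if x == 1 else 1) * (2 if y == 1 else 1), 16)
                 for x in range(3) for y in range(3)}

    # E[XY] as in (1.10): sum x * y * f(x, y) over all cells of Table 1.5.
    print(sum(x * y * p for (x, y), p in joint_pmf.items()))  # 1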

 