Joint Distribution Notes
Practice Questions for Examination
These practice questions are for practice purposes only, but they cover the syllabus. This session (number 10) continues the discussion of joint distributions that began in the last class.
Joint Distribution
When dealing with more than one random variable, joint distribution is used. This session covers three main parts:
Joint distribution and marginal distribution
Dependence and independence of random variables
Expectation, mean, and variance
Questions similar to the practice ones might appear in the exam. The session examines three cases:
Discrete bivariate
Continuous bivariate
Multivariate
Discrete Bivariate Random Variables
For two discrete random variables $X$ and $Y$, the probability that $X$ takes the value $x$ and $Y$ takes the value $y$ is written as $P(X = x, Y = y)$.
This is an intersection, and the joint density function is defined as:
$$f(x, y) = P(X = x \cap Y = y) = P(X = x, Y = y)$$
This is the joint probability mass function (for discrete variables) or joint probability density function.
Example: Rolling two dice. Let $X$ be the number on the first die and $Y$ be the number on the second die. Both $X$ and $Y$ vary from 1 to 6, and their joint density function is $f(x, y) = 1/36$ for all $x$ and $y$.
Joint Probability Table
| | y=1 | y=2 | y=3 | y=4 | y=5 | y=6 |
|---|---|---|---|---|---|---|
| x=1 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 |
| x=2 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 |
| x=3 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 |
| x=4 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 |
| x=5 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 |
| x=6 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 |
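The table above can be built and checked programmatically. A minimal sketch (not part of the original notes) using exact fractions:

```python
from fractions import Fraction

# Joint pmf for two fair dice: f(x, y) = 1/36 for every pair (x, y).
f = {(x, y): Fraction(1, 36) for x in range(1, 7) for y in range(1, 7)}

# Every entry lies in [0, 1], and all entries sum to one.
assert all(0 <= p <= 1 for p in f.values())
assert sum(f.values()) == 1
```

Using `Fraction` instead of floats keeps the probabilities exact, so the sum-to-one check holds with no rounding error.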
Each entry is the value of the joint density function at the corresponding $x$ and $y$. Every probability satisfies $0 \le f(x, y) \le 1$, and the summation over all values must equal one:
$$\sum_x \sum_y f(x, y) = 1$$
Example: Let $X$ be the value on the first die and $T$ be the total on both dice. $X$ varies from 1 to 6, while $T$ varies from 2 to 12.
Table for X and T
| | t=2 | t=3 | t=4 | t=5 | t=6 | t=7 | t=8 | t=9 | t=10 | t=11 | t=12 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| x=1 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 | 0 | 0 | 0 | 0 | 0 |
| x=2 | 0 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 | 0 | 0 | 0 | 0 |
| x=3 | 0 | 0 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 | 0 | 0 | 0 |
| x=4 | 0 | 0 | 0 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 | 0 | 0 |
| x=5 | 0 | 0 | 0 | 0 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 | 0 |
| x=6 | 0 | 0 | 0 | 0 | 0 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 | 1/36 |
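The nonzero entries can be derived by enumerating all 36 equally likely outcomes; an illustrative sketch:

```python
from collections import Counter
from fractions import Fraction

# Joint pmf of X (first die) and T (total) by enumerating the 36 outcomes.
counts = Counter()
for x in range(1, 7):
    for y in range(1, 7):
        counts[(x, x + y)] += 1

f = {xt: Fraction(n, 36) for xt, n in counts.items()}

# Each reachable pair (t - x between 1 and 6) has probability 1/36.
assert f[(1, 2)] == Fraction(1, 36)
assert (3, 2) not in f   # total 2 is impossible when the first die shows 3
assert sum(f.values()) == 1
```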
Example: Given an event (a set of $(x, t)$ pairs), find its probability by adding the joint density function over all pairs in the event.
Marginal Distribution
Marginal distribution involves taking a projection from two dimensions to one dimension.
For example, given two random variables and , transform the two-dimensional random variable into one-dimensional random variables, alone and alone. This allows the application of concepts from previous chapters (3 & 4) to find PDFs, CDFs, means, etc.
Marginal density function for $X$:
$$f_X(x) = \sum_y f(x, y)$$
This is a function of $x$ alone, as the effect of $y$ is accumulated.
Marginal density function for $Y$:
$$f_Y(y) = \sum_x f(x, y)$$
This is a function of $y$ alone, accumulating the effect of $x$.
Notation
$f(x, y)$ or $f_{XY}(x, y)$: Joint density function
$f_X(x)$: Marginal density function for $X$
$f_Y(y)$: Marginal density function for $Y$
Example: Calculating marginal density functions from a joint distribution table. Add row-wise to find $f_X(x)$ and column-wise to find $f_Y(y)$. For the two-dice table:
$$
f_X(x) = \begin{cases}
1/6, & x = 1, 2, 3, 4, 5, 6 \\
0, & \text{otherwise}
\end{cases}
$$
$$
f_Y(y) = \begin{cases}
1/6, & y = 1, 2, 3, 4, 5, 6 \\
0, & \text{otherwise}
\end{cases}
$$
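The row-wise and column-wise summations can be sketched in a few lines, assuming the two-dice joint table:

```python
from fractions import Fraction

# Joint pmf for two fair dice.
f = {(x, y): Fraction(1, 36) for x in range(1, 7) for y in range(1, 7)}

# Row-wise sums give f_X; column-wise sums give f_Y.
f_X = {x: sum(f[(x, y)] for y in range(1, 7)) for x in range(1, 7)}
f_Y = {y: sum(f[(x, y)] for x in range(1, 7)) for y in range(1, 7)}

# Each marginal is uniform: 1/6 for every face.
assert all(p == Fraction(1, 6) for p in f_X.values())
assert all(p == Fraction(1, 6) for p in f_Y.values())
```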
Cumulative Distribution Function (CDF)
For $X$ alone:
$$
F_X(x) = \begin{cases}
0, & x < 1 \\
1/6, & 1 \leq x < 2 \\
2/6, & 2 \leq x < 3 \\
3/6, & 3 \leq x < 4 \\
4/6, & 4 \leq x < 5 \\
5/6, & 5 \leq x < 6 \\
1, & x \geq 6
\end{cases}
$$
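The step-function CDF follows directly from the marginal pmf by accumulating probability up to $x$; a short illustrative sketch:

```python
from fractions import Fraction

# Marginal pmf of one fair die.
f_X = {x: Fraction(1, 6) for x in range(1, 7)}

def F_X(x):
    """CDF of one fair die: sum of f_X(k) over all support points k <= x."""
    return sum(p for k, p in f_X.items() if k <= x)

assert F_X(0.5) == 0              # below the support
assert F_X(3) == Fraction(1, 2)   # 3/6, matching the table above
assert F_X(10) == 1               # at or beyond the top of the support
```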
Dependence and Independence of Random Variables
Two random variables are independent if the joint density function is equal to the product of the marginal density functions:
$$f(x, y) = f_X(x) \cdot f_Y(y) \quad \text{for all } x, y$$
Otherwise, they are dependent.
Example: In the two-dice example, $f(x, y) = 1/36 = (1/6)(1/6) = f_X(x) \cdot f_Y(y)$ for every pair, so $X$ and $Y$ are independent.
If the variables are not independent, the concept of conditional probability applies. Recall the conditional probability formula from chapter 3:
$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}$$
Conditional Density Function
The conditional density function can be written as:
$$f(y \mid x) = \frac{f(x, y)}{f_X(x)}$$
This is the probability that $Y = y$ given that $X = x$.
Example: Find the probability that the total is 4, given the first die shows 3: $P(T = 4 \mid X = 3) = f(3, 4)/f_X(3) = (1/36)/(1/6) = 1/6$.
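This conditional calculation can be checked with exact fractions; a minimal sketch:

```python
from fractions import Fraction

# Joint pmf of X (first die) and T (total); marginal of X is uniform.
f = {(x, x + y): Fraction(1, 36) for x in range(1, 7) for y in range(1, 7)}
f_X = {x: Fraction(1, 6) for x in range(1, 7)}

# Conditional pmf: f(t | x) = f(x, t) / f_X(x).
p = f.get((3, 4), Fraction(0)) / f_X[3]

# Given the first die shows 3, the total is 4 iff the second die shows 1.
assert p == Fraction(1, 6)
```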
Independence and Conditional Probability
If two random variables are independent, the conditional density reduces to the marginal:
$$f(y \mid x) = \frac{f(x, y)}{f_X(x)} = \frac{f_X(x)\, f_Y(y)}{f_X(x)} = f_Y(y)$$
Expectation
Expectation involves adding over the entire sample space. With two random variables, it involves summing over all $x$ and all $y$.
The expectation of any function $g(X, Y)$ is:
$$E[g(X, Y)] = \sum_x \sum_y g(x, y)\, f(x, y)$$
Mean
The mean of $X$ ($\mu_X$) is:
$$\mu_X = E[X] = \sum_x \sum_y x\, f(x, y) = \sum_x x\, f_X(x)$$
The mean of $Y$ ($\mu_Y$) is:
$$\mu_Y = E[Y] = \sum_x \sum_y y\, f(x, y) = \sum_y y\, f_Y(y)$$
Expectation of XY
To find the expected value of $XY$:
$$E[XY] = \sum_x \sum_y x y\, f(x, y)$$
Marginal density functions cannot be used here; the joint density is required, because $xy$ depends on both variables together.
Covariance
Covariance is defined as:
$$\operatorname{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)]$$
A direct formula is:
$$\operatorname{Cov}(X, Y) = E[XY] - \mu_X \mu_Y$$
Variance of X
The variance is the covariance of $X$ with itself: instead of a second variable, $X$ is used twice.
$$\operatorname{Var}(X) = \operatorname{Cov}(X, X) = E[X^2] - \mu_X^2$$
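These formulas can be verified on the dice example ($X$ = first die, $T$ = total); a sketch with exact fractions:

```python
from fractions import Fraction

# Joint pmf of X (first die) and T (total over both dice).
f = {}
for x in range(1, 7):
    for y in range(1, 7):
        f[(x, x + y)] = f.get((x, x + y), Fraction(0)) + Fraction(1, 36)

mu_X = sum(x * p for (x, t), p in f.items())
mu_T = sum(t * p for (x, t), p in f.items())
E_XT = sum(x * t * p for (x, t), p in f.items())

cov = E_XT - mu_X * mu_T                                      # Cov(X, T) = E[XT] - mu_X * mu_T
var_X = sum(x * x * p for (x, t), p in f.items()) - mu_X**2   # Var(X) = Cov(X, X)

assert mu_X == Fraction(7, 2) and mu_T == 7
assert cov == Fraction(35, 12)
assert var_X == cov   # here Cov(X, T) = Var(X), since T = X + Y with Y independent of X
```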
Exercise: Toss two fair coins. Let $X$ be the number of heads and $Y$ the number of tails. Find the joint density function, the CDF, the marginal density functions, and check whether $X$ and $Y$ are independent. If they are dependent, find $\operatorname{Cov}(X, Y)$.
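A worked sketch of this exercise (the joint pmf is derived by enumerating the four equally likely outcomes; computing the covariance here is one way to finish the exercise):

```python
from fractions import Fraction

# Two fair coins: X = number of heads, Y = number of tails (so Y = 2 - X).
f = {}
for c1 in ("H", "T"):
    for c2 in ("H", "T"):
        x = (c1 == "H") + (c2 == "H")
        f[(x, 2 - x)] = f.get((x, 2 - x), Fraction(0)) + Fraction(1, 4)

f_X = {x: sum(p for (a, b), p in f.items() if a == x) for x in (0, 1, 2)}
f_Y = {y: sum(p for (a, b), p in f.items() if b == y) for y in (0, 1, 2)}

# Independence would require f(x, y) = f_X(x) * f_Y(y) for EVERY pair.
independent = all(
    f.get((x, y), Fraction(0)) == f_X[x] * f_Y[y]
    for x in (0, 1, 2) for y in (0, 1, 2)
)
assert independent is False   # e.g. f(0, 0) = 0 but f_X(0) * f_Y(0) = 1/16

mu_X = sum(x * p for (x, y), p in f.items())
mu_Y = sum(y * p for (x, y), p in f.items())
E_XY = sum(x * y * p for (x, y), p in f.items())
cov = E_XY - mu_X * mu_Y
assert cov == Fraction(-1, 2)   # negative: more heads forces fewer tails
```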
Covariance and Independence
Covariance measures the joint variability of two random variables. If and are independent, then . However, if , it does not necessarily mean that and are independent.
Zero covariance means no linear relationship.
Important
Covariance formula: $\operatorname{Cov}(X, Y) = E[XY] - E[X]\,E[Y]$.
Continuous Joint Distribution
For continuous random variables, the concepts are similar to the discrete case, but summations are replaced with integrations.
For two continuous random variables $X$ and $Y$, a function $f(x, y)$ such that $f(x, y) \ge 0$ and $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\, dx\, dy = 1$ is the joint probability density function.
The CDF is:
$$F(x, y) = P(X \le x, Y \le y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f(u, v)\, dv\, du$$
Marginal density functions are:
$$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy, \qquad f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\, dx$$
Independence: $f(x, y) = f_X(x) \cdot f_Y(y)$ for all $x, y$.
Expectation:
$$E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y)\, f(x, y)\, dx\, dy$$
Example
A bank operates a drive-up window and a walk-up window. Let $X$ be the proportion of time the drive-up window is in use, and $Y$ the proportion of time the walk-up window is in use. The joint density function is given. Calculate the probability that neither facility is busy more than one quarter of the time:
$$P(0 \le X \le 1/4,\; 0 \le Y \le 1/4) = \int_0^{1/4} \int_0^{1/4} f(x, y)\, dy\, dx$$
Also, the marginal pdf of $X$ gives the probability distribution of busy time for the drive-up facility without reference to the walk-up window; the same holds for $Y$.
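The specific density from class is not reproduced in these notes. Purely for illustration, the sketch below ASSUMES the common textbook density $f(x, y) = 1.2\,(x + y^2)$ on $[0, 1]^2$ and approximates the double integral numerically:

```python
def f(x, y):
    # ASSUMED density for illustration only; substitute the one given in class.
    return 1.2 * (x + y * y)

def double_integral(g, x0, x1, y0, y1, n=400):
    """Midpoint-rule approximation of the double integral of g over a rectangle."""
    hx, hy = (x1 - x0) / n, (y1 - y0) / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += g(x0 + (i + 0.5) * hx, y0 + (j + 0.5) * hy)
    return total * hx * hy

# Density integrates to 1 over the unit square, so it is a valid joint pdf.
assert abs(double_integral(f, 0, 1, 0, 1) - 1.0) < 1e-6

# P(0 <= X <= 1/4, 0 <= Y <= 1/4): integrate the joint density over the square.
p = double_integral(f, 0, 0.25, 0, 0.25)
```

For this assumed density the exact answer is $7/640 \approx 0.0109$; the midpoint rule with $n = 400$ matches it to several decimal places.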
Conditional PDF
To find the conditional PDF of $Y$ given that $X = x$, use:
$$f_{Y \mid X}(y \mid x) = \frac{f(x, y)}{f_X(x)}$$
Important Results on Expectation
If $X$ and $Y$ are independent:
$$E[XY] = E[X]\,E[Y] \quad \text{and} \quad \operatorname{Cov}(X, Y) = 0$$