Section 6.1 & 6.2
Random variable: A random variable takes numerical values that describe
the outcomes of a random process.
Probability distribution: The probability distribution of a random variable gives
its possible values and their probabilities.
Discrete random variable: A discrete random variable X takes a fixed set of
possible values with gaps between them.
Mean (expected value) of a discrete random variable: The mean (expected value) of a discrete random variable is its average value over many, many trials of the same random process. Suppose that X is a discrete random variable with probability distribution:
Value:        x1   x2   x3   ...
Probability:  p1   p2   p3   ...
To find the mean (expected value) of X, multiply each possible value of X by its probability, then add all the products:
μX = E(X) = x1p1 + x2p2 + x3p3 + ... = Σxipi
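As a quick sketch of this calculation in Python; the values and probabilities below are hypothetical, chosen only for illustration:

```python
# Hypothetical distribution for X: these values and probabilities are
# made up for illustration (probabilities must sum to 1).
values = [1, 2, 3, 4]
probs = [0.1, 0.2, 0.3, 0.4]

# Multiply each possible value by its probability, then add the products.
mean_X = sum(x * p for x, p in zip(values, probs))
print(round(mean_X, 6))  # 3.0
```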
Standard deviation of a discrete random variable: The standard deviation of a discrete random variable measures how much the values of the variable typically vary from the mean in many, many trials of the random process. Suppose that X is a discrete random variable with probability distribution
Value:        x1   x2   x3   ...
Probability:  p1   p2   p3   ...
and that μX is the mean of X.
Variance: The variance of X is
σ²X = (x1 − μX)²p1 + (x2 − μX)²p2 + (x3 − μX)²p3 + ... = Σ(xi − μX)²pi
The standard deviation of X is the square root of the variance:
σX = √σ²X
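A minimal Python sketch of the variance and standard deviation calculations, reusing the same hypothetical distribution:

```python
import math

# Same hypothetical distribution as before (values/probabilities are made up).
values = [1, 2, 3, 4]
probs = [0.1, 0.2, 0.3, 0.4]

mean_X = sum(x * p for x, p in zip(values, probs))

# Variance: weight each squared deviation from the mean by its probability.
var_X = sum((x - mean_X) ** 2 * p for x, p in zip(values, probs))

# Standard deviation: square root of the variance.
sd_X = math.sqrt(var_X)
print(round(var_X, 6), round(sd_X, 6))  # 1.0 1.0
```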
Continuous random variable: A continuous random variable can take any value in an interval on the number line.
Adding a positive constant a to (subtracting a from) a random variable increases (decreases) measures of center and location by a, but does not affect measures of variability (range, IQR, standard deviation) or the shape of its probability distribution.
Multiplying (dividing) a random variable by a positive constant b multiplies (divides) measures of center and location by b and multiplies (divides) measures of variability (range, IQR, standard deviation) by b, but does not change the shape of its probability distribution.
If Y = a + bX is a linear transformation of the random variable X with b > 0,
The probability distribution of Y has the same shape as the probability distribution of X.
μY = a + bμX
σY = bσX
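These two facts can be checked numerically. The sketch below uses a hypothetical distribution and hypothetical constants a and b:

```python
import math

# Hypothetical distribution for X (made-up values and probabilities).
values = [1, 2, 3, 4]
probs = [0.1, 0.2, 0.3, 0.4]

def mean(vals, ps):
    return sum(v * p for v, p in zip(vals, ps))

def sd(vals, ps):
    m = mean(vals, ps)
    return math.sqrt(sum((v - m) ** 2 * p for v, p in zip(vals, ps)))

a, b = 5, 2  # made-up constants, with b > 0
# Y = a + bX: transform each value; the probabilities stay the same.
y_values = [a + b * x for x in values]

# Check: mean(Y) = a + b*mean(X), and sd(Y) = b*sd(X).
```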
If X and Y are any two random variables,
μX+Y = μX + μY : The mean of the sum of two random variables is the sum of their means.
μX−Y = μX − μY : The mean of the difference of two random variables is the difference of their means.
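A short Python check that means add (and subtract) for any two random variables, even when X and Y are not independent; the joint distribution below is made up for illustration:

```python
# Hypothetical joint distribution of (X, Y), deliberately NOT independent,
# since means add for ANY two random variables. All numbers are made up.
joint = {  # (x, y): P(X = x and Y = y)
    (0, 10): 0.3,
    (0, 20): 0.1,
    (1, 10): 0.1,
    (1, 20): 0.5,
}

mean_X = sum(x * p for (x, y), p in joint.items())
mean_Y = sum(y * p for (x, y), p in joint.items())
mean_sum = sum((x + y) * p for (x, y), p in joint.items())
mean_diff = sum((x - y) * p for (x, y), p in joint.items())
# mean_sum equals mean_X + mean_Y, and mean_diff equals mean_X - mean_Y.
```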
If X and Y are independent random variables, then knowing the value of one variable does not change the probability distribution of the other variable. In that case, variances add:
σ²X+Y = σ²X + σ²Y : The variance of the sum of two independent random variables is the sum of their variances.
σ²X−Y = σ²X + σ²Y : The variance of the difference of two independent random variables is the sum of their variances.
To get the standard deviation of the sum or difference of two independent random variables, calculate the variance and then take the square root:
σX+Y = σX−Y = √(σ²X + σ²Y)
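The sketch below builds the exact distribution of X + Y from two hypothetical independent distributions and confirms that the variances add:

```python
import math
from itertools import product

# Two hypothetical independent random variables (made-up distributions).
x_vals, x_probs = [0, 1, 2], [0.2, 0.5, 0.3]
y_vals, y_probs = [10, 20], [0.6, 0.4]

def mean(vals, ps):
    return sum(v * p for v, p in zip(vals, ps))

def variance(vals, ps):
    m = mean(vals, ps)
    return sum((v - m) ** 2 * p for v, p in zip(vals, ps))

# Independence means P(X = x and Y = y) = P(X = x) * P(Y = y).
sum_vals = [x + y for x, y in product(x_vals, y_vals)]
sum_probs = [px * py for px, py in product(x_probs, y_probs)]

var_sum = variance(sum_vals, sum_probs)  # equals variance(X) + variance(Y)
sd_sum = math.sqrt(var_sum)              # square root of the summed variances
```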
If aX + bY is a linear combination of the random variables X and Y,
Its mean is aμX + bμY .
Its standard deviation is √(a²σ²X + b²σ²Y) if X and Y are independent.
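A numeric check of both facts, again with hypothetical independent distributions and made-up coefficients:

```python
import math
from itertools import product

# Hypothetical independent random variables X and Y and made-up coefficients.
x_vals, x_probs = [0, 1, 2], [0.2, 0.5, 0.3]
y_vals, y_probs = [10, 20], [0.6, 0.4]
a, b = 3, -2

def mean(vals, ps):
    return sum(v * p for v, p in zip(vals, ps))

def variance(vals, ps):
    m = mean(vals, ps)
    return sum((v - m) ** 2 * p for v, p in zip(vals, ps))

# Exact distribution of aX + bY; independence gives the joint probabilities.
combo_vals = [a * x + b * y for x, y in product(x_vals, y_vals)]
combo_probs = [px * py for px, py in product(x_probs, y_probs)]

mean_combo = mean(combo_vals, combo_probs)               # a*mean(X) + b*mean(Y)
sd_combo = math.sqrt(variance(combo_vals, combo_probs))  # sqrt(a²σ²X + b²σ²Y)
```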
A linear combination of independent Normal random variables is a Normal random variable.
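A simulation sketch (with made-up Normal parameters) illustrating this fact for a sum of two independent Normals; the sample statistics should land near the theoretical values:

```python
import math
import random

random.seed(42)  # reproducible sketch

# Hypothetical independent Normals (parameters made up): X ~ N(70, 3), Y ~ N(50, 4).
# Then T = X + Y is Normal with mean 70 + 50 = 120 and sd sqrt(3² + 4²) = 5.
n = 100_000
samples = [random.gauss(70, 3) + random.gauss(50, 4) for _ in range(n)]

sample_mean = sum(samples) / n
sample_sd = math.sqrt(sum((t - sample_mean) ** 2 for t in samples) / n)
# sample_mean should be close to 120 and sample_sd close to 5.
```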