Section 2.4

Independent Random Variables

Mutually Independent

A finite or infinite sequence of random variables X_1, X_2, ... defined on the same probability space are mutually independent if for every choice of subsets B_1, B_2, ... ⊆ ℝ, the events {X_1 ∈ B_1}, {X_2 ∈ B_2}, ... are mutually independent events.
i.e. P(X_1 ∈ B_1, X_2 ∈ B_2, ..., X_n ∈ B_n) = P(X_1 ∈ B_1) P(X_2 ∈ B_2) ⋯ P(X_n ∈ B_n) for every finite n.

Example. A fair coin is tossed twice. Let X be the total number of heads on the 1st toss and Y the total number of heads among both tosses. Show that X, Y are not mutually independent random variables.

P(X = 0) = 1/2 and P(Y = 2) = P(HH) = 1/4, so P(X = 0) P(Y = 2) = 1/8.

But P(X = 0, Y = 2) = P(1st toss is tails and both tosses are heads) = 0 ≠ 1/8.

So X, Y are not independent.
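The check above can be confirmed by brute-force enumeration of the sample space. This is a sketch, not part of the notes; the helpers X and Y are hypothetical names mirroring the variables in the example, and exact fractions avoid floating-point noise.

```python
from itertools import product
from fractions import Fraction

# Enumerate the 4 equally likely outcomes of two fair coin tosses.
outcomes = list(product(["H", "T"], repeat=2))
prob = Fraction(1, 4)  # each outcome has probability 1/4

def X(w):  # number of heads on the 1st toss
    return 1 if w[0] == "H" else 0

def Y(w):  # total number of heads among both tosses
    return sum(1 for t in w if t == "H")

p_x0 = sum(prob for w in outcomes if X(w) == 0)                   # P(X = 0)
p_y2 = sum(prob for w in outcomes if Y(w) == 2)                   # P(Y = 2)
p_joint = sum(prob for w in outcomes if X(w) == 0 and Y(w) == 2)  # P(X = 0, Y = 2)

print(p_x0, p_y2, p_joint)     # 1/2 1/4 0
print(p_joint == p_x0 * p_y2)  # False -> X and Y are not independent
```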

Bernoulli Random Variables

Suppose we perform an experiment and designate certain outcomes as “successes” and the rest as “failures.” Then we can define a random variable X by

X = 1 if the outcome is a success, and X = 0 if the outcome is a failure.
Examples.

  1. Toss a coin; “heads” is a success. Then X = 1 if the toss comes up heads, X = 0 otherwise.
  2. Roll a die; “rolls a six” is a success. Then X = 1 if the roll is a six, X = 0 otherwise.

Suppose P(success) = p. Then X is a discrete random variable with pmf given by

P(X = 1) = p,  P(X = 0) = 1 − p.

In this case we call X a Bernoulli random variable with success probability p and write X ~ Ber(p). Its distribution is called the Bernoulli distribution.

Examples.

  1. Flip a fair coin, heads is a success. Then X ~ Ber(1/2).
  2. Roll a fair die, rolling a six is a success. Then X ~ Ber(1/6).
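A Bernoulli random variable is easy to simulate; the sketch below (the helper bernoulli is an assumed name, not a standard library function) draws many Ber(1/6) samples and checks that the empirical success frequency is close to p.

```python
import random

def bernoulli(p):
    """One Bernoulli(p) trial: return 1 with probability p, else 0."""
    return 1 if random.random() < p else 0

random.seed(0)  # for reproducibility
p = 1 / 6       # e.g. "roll a six" on a fair die
n = 100_000
samples = [bernoulli(p) for _ in range(n)]
print(sum(samples) / n)  # empirical frequency, close to 1/6 ≈ 0.1667
```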

Binomial Random Variables

Suppose we perform 4 independent trials, each w/ success probability p. Let X_i = 1 if the ith trial is a success and X_i = 0 if the ith trial is a failure. Then X_1, X_2, X_3, X_4 are mutually independent random variables.

  • Calculate P(X_1 = 1, X_2 = 1, X_3 = 0, X_4 = 0):
    P(X_1 = 1, X_2 = 1, X_3 = 0, X_4 = 0) = P(X_1 = 1) P(X_2 = 1) P(X_3 = 0) P(X_4 = 0) = p^2 (1 − p)^2, b/c the X_i's are independent.
  • Calculate P(X_1 = 0, X_2 = 1, X_3 = 0, X_4 = 1):
    By the same reasoning this is (1 − p) p (1 − p) p = p^2 (1 − p)^2. In fact, any fixed pattern with exactly 2 successes among the 4 trials has probability p^2 (1 − p)^2.

Let S = X_1 + X_2 + X_3 + X_4, the total number of successes.
S is a discrete random variable. What is its pmf? For example,

P(S = 2) = P(exactly 2 of the 4 trials succeed)
         = P(X_1 = 1, X_2 = 1, X_3 = 0, X_4 = 0) + P(X_1 = 1, X_2 = 0, X_3 = 1, X_4 = 0) + ⋯ + P(X_1 = 0, X_2 = 0, X_3 = 1, X_4 = 1)
         = (4 choose 2) p^2 (1 − p)^2,

b/c there are (4 choose 2) = 6 terms in the sum, one for each way to choose which 2 of the 4 trials succeed. The same argument gives

P(S = k) = (4 choose k) p^k (1 − p)^{4−k},  k = 0, 1, 2, 3, 4.

In general, we say that S is a binomial random variable w/ parameters n and p if

P(S = k) = (n choose k) p^k (1 − p)^{n−k}  for all k = 0, 1, ..., n,

and we write S ~ Bin(n, p). Its distribution is called the binomial distribution.

Takeaway: A random variable counting the total # of successes of n independent trials, each having success probability p, is a Bin(n, p) random variable.
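The binomial pmf can be checked numerically; here is a minimal sketch (the helper name binom_pmf and the value p = 0.3 are illustrative assumptions) using Python's math.comb for the binomial coefficient.

```python
from math import comb

def binom_pmf(k, n, p):
    """P(S = k) for S ~ Bin(n, p): (n choose k) p^k (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# The 4-trial example above: P(S = 2) = (4 choose 2) p^2 (1-p)^2 = 6 p^2 (1-p)^2.
p = 0.3
print(binom_pmf(2, 4, p))                          # 6 * 0.09 * 0.49 ≈ 0.2646
print(sum(binom_pmf(k, 4, p) for k in range(5)))  # the pmf sums to 1
```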

Example. A fair die is rolled 10 times. What is the probability that a six is rolled at least 3 times?

Let S = total number of 6's. Then S ~ Bin(10, 1/6), and by the complement rule

P(S ≥ 3) = 1 − P(S = 0) − P(S = 1) − P(S = 2)
         = 1 − (5/6)^10 − 10 (1/6) (5/6)^9 − (10 choose 2) (1/6)^2 (5/6)^8
         ≈ 0.2248.
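The complement-rule computation above can be verified in a couple of lines (a sketch, using Python's math.comb):

```python
from math import comb

# P(at least 3 sixes in 10 rolls of a fair die), with S ~ Bin(10, 1/6):
# complement rule: P(S >= 3) = 1 - P(S = 0) - P(S = 1) - P(S = 2).
p = 1 / 6
answer = 1 - sum(comb(10, k) * p**k * (1 - p)**(10 - k) for k in range(3))
print(answer)  # ≈ 0.2248
```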
Geometric Random Variables


I roll a fair die repeatedly. What is the probability that a six is rolled
for the first time on the 2nd roll?  (5/6)(1/6), since the 1st roll must not be a six and the 2nd roll must be a six.
for the first time on the 3rd roll?  (5/6)^2 (1/6).
for the first time on the nth roll?  (5/6)^{n−1} (1/6).

In general, we say that N is a geometric random variable w/ success parameter p if P(N = n) = (1 − p)^{n−1} p for all n = 1, 2, 3, ... and we write N ~ Geom(p). Its distribution is called the geometric distribution.
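The die probabilities above are exactly the Geom(1/6) pmf; this sketch (the helper name geom_pmf is an assumption) evaluates them:

```python
def geom_pmf(n, p):
    """P(N = n) for N ~ Geom(p): the first success occurs on trial n."""
    return (1 - p) ** (n - 1) * p

# First six when rolling a fair die repeatedly:
p = 1 / 6
print(geom_pmf(2, p))  # (5/6)(1/6) = 5/36 ≈ 0.1389
print(geom_pmf(3, p))  # (5/6)^2 (1/6) = 25/216 ≈ 0.1157
```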

Takeaway: A random variable N counting the number of trials until the first success is a Geom(p) random variable (if each trial has success probability p).
We expect that P(N = n) = (1 − p)^{n−1} p defines a pmf. Let's check that its sum is 1:

sum_{n=1}^∞ (1 − p)^{n−1} p

Let q = 1 − p, so that 0 ≤ q < 1. Then

sum_{n=1}^∞ (1 − p)^{n−1} p = p sum_{k=0}^∞ q^k = p · 1/(1 − q) = p/p = 1.

How we could simplify the sum: Series magic — for |q| < 1, the geometric series sums to sum_{k=0}^∞ q^k = 1/(1 − q).
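The series computation can also be verified numerically (a sketch, truncating the infinite sum at a large index, since the tail vanishes geometrically):

```python
# Numerical check that the Geom(p) pmf sums to 1: truncate the series at
# n = 999; the remaining tail is at most (1 - p)^999, which is negligible.
p = 1 / 6
total = sum((1 - p) ** (n - 1) * p for n in range(1, 1000))
print(total)  # ≈ 1.0
```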