SOME SPECIAL DISCRETE RANDOM VARIABLES
If an experiment has two possible outcomes, "success" and
"failure", and their probabilities are, respectively, p and 1 − p, then
the number of successes, 0 or 1, has a Bernoulli distribution;
symbolically, we have the following definition:
2. The Bernoulli Distribution:
A random variable X has a Bernoulli distribution, and it is
referred to as a Bernoulli random variable, if and only if its
probability mass function is given by
f(x; p) = p^x (1 − p)^(1 − x), for x = 0, 1.
• Observe that we used the notation f (x; p) to indicate explicitly
that the Bernoulli distribution has one parameter p.
• We denote the Bernoulli random variable by writing
X ∼ BER(p).
• We refer to an experiment to which the Bernoulli distribution
applies as a Bernoulli trial, or simply a trial, and to sequences of
such experiments as repeated trials.
EXAMPLES:
1. Tossing a fair coin.
2. What is the probability of getting a score of not less than 5 in a
throw of a six-sided die?
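The two examples above can be checked with a minimal sketch of the Bernoulli pmf; the function name `bernoulli_pmf` is illustrative, not from any library. For the die example, "success" means a score of 5 or 6, so p = 2/6 = 1/3.

```python
# Sketch of the Bernoulli pmf f(x; p) = p^x (1 - p)^(1 - x), x in {0, 1}.
def bernoulli_pmf(x, p):
    """Probability that a single trial yields x successes (x = 0 or 1)."""
    if x not in (0, 1):
        raise ValueError("a Bernoulli variable only takes the values 0 and 1")
    return p**x * (1 - p)**(1 - x)

# Die example: "success" = score of at least 5, so p = 2/6 = 1/3.
p = 2 / 6
print(bernoulli_pmf(1, p))  # probability of success, 1/3
print(bernoulli_pmf(0, p))  # probability of failure, 2/3
```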
THEOREM: If X is a Bernoulli random variable with parameter p,
then its mean and variance are respectively given by
E(X) = μ_X = p,
Var(X) = σ_X² = p(1 − p).
Consider a fixed number n of mutually independent Bernoulli trials.
Suppose these trials have the same probability of success, say p.
A random variable X is called a binomial random variable if it
represents the total number of successes in n independent
Bernoulli trials.
3. Binomial Distribution:
The random variable X is called the binomial random variable
with parameters p and n if its probability mass function is of
the form
f(x) = C(n, x) p^x (1 − p)^(n − x), x = 0, 1, . . . , n,
where 0 < p < 1 is the probability of success and C(n, x) denotes
the binomial coefficient.
We will denote a binomial random variable with parameters p and
n as X ∼ BIN(n, p).
Examples. 1. Tossing a fair coin twice, where X denotes the
number of heads.
2. On a five-question multiple-choice test there are five possible
answers, of which one is correct. If a student guesses randomly
and independently, what is the probability that she is correct only
on two questions?
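The multiple-choice example works out numerically with a minimal sketch of the binomial pmf (the function name `binomial_pmf` is illustrative): with n = 5 questions and success probability p = 1/5 per guess, the chance of exactly two correct answers is C(5, 2)(1/5)²(4/5)³.

```python
from math import comb

# Binomial pmf f(x) = C(n, x) p^x (1 - p)^(n - x).
def binomial_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Five questions, five choices each, one correct: n = 5, p = 1/5.
# Probability of exactly two correct guesses:
print(binomial_pmf(2, 5, 1 / 5))  # C(5,2)(1/5)^2(4/5)^3 ≈ 0.2048
```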
Exercise. On a five-question multiple-choice test there are five
possible answers, of which one is correct. If a student guesses
randomly and independently, what is the probability that she is
correct only on questions 1 and 4?
Theorem: If X is a binomial random variable with parameters n
and p, then its mean and variance are respectively given by
E(X) = μ_X = np,
Var(X) = σ_X² = np(1 − p).
The geometric distribution is also constructed from independent
Bernoulli trials, but from an infinite sequence. Let X denote the
trial number on which the first success occurs.
4. Geometric Distribution:
The random variable X is called the geometric random variable
with parameter p if its probability mass function is of the form
f(x) = (1 − p)^(x − 1) p, x = 1, 2, . . . ,
where 0 < p < 1 is the probability of success in a single
Bernoulli trial.
If X has a geometric distribution we denote it as X ∼ GEO(p).
EXAMPLES: 1. X is the number of tosses needed until the first
head when tossing a coin.
2. The probability of winning in a certain lottery is said to be
about 1/9. If it is exactly 1/9, the distribution of the number of
tickets a person must purchase up to and including the first
winning ticket is a geometric random variable with p = 1/9.
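The lottery example can be sketched directly from the geometric pmf; `geometric_pmf` is an illustrative name, not a library function. With p = 1/9, the probability that, say, the third ticket is the first winner is (8/9)²(1/9).

```python
# Geometric pmf f(x) = (1 - p)^(x - 1) p: first success on trial x.
def geometric_pmf(x, p):
    return (1 - p)**(x - 1) * p

p = 1 / 9  # probability a single lottery ticket wins
# Probability the first winning ticket is the third one purchased:
print(geometric_pmf(3, p))  # (8/9)^2 * (1/9)
```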
Theorem: If X is a geometric random variable with parameter p,
then its mean and variance are respectively given by
E(X) = μ_X = 1/p,
Var(X) = σ_X² = (1 − p)/p².
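A quick numerical sanity check of these formulas: truncate the infinite sums defining E(X) and Var(X) at a point where the geometric tail is negligible (a sketch, with p = 1/9 from the lottery example and an illustrative `geometric_pmf` helper).

```python
# Verify E(X) = 1/p and Var(X) = (1 - p)/p^2 by truncated summation.
def geometric_pmf(x, p):
    return (1 - p)**(x - 1) * p

p = 1 / 9
N = 2000  # (8/9)^2000 is astronomically small, so the tail is negligible
mean = sum(x * geometric_pmf(x, p) for x in range(1, N + 1))
second_moment = sum(x**2 * geometric_pmf(x, p) for x in range(1, N + 1))
var = second_moment - mean**2

print(mean)  # close to 1/p = 9
print(var)   # close to (1 - p)/p^2 = 72
```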
Let X denote the trial number on which the rth success occurs.
Here r is a positive integer greater than or equal to one. This is
equivalent to saying that the random variable X denotes the
number of trials needed to observe the rth success.
5. Negative Binomial (or Pascal) Distribution:
The random variable X is called the negative binomial random
variable with parameters p and r if its probability mass function
is of the form
f(x) = C(x − 1, r − 1) (1 − p)^(x − r) p^r, x = r, r + 1, . . . ,
where 0 < p < 1 is the probability of success in a single
Bernoulli trial.
If X has a negative binomial distribution we denote it as
X ∼ NBIN(r, p).
EXAMPLE: What is the probability that the second head is
observed on the 3rd independent flip of a coin?
In this case p = 1/2, and
P(X = 3) = f(3) = C(2, 1) p² (1 − p) = 2 · (1/2)² · (1/2) = 1/4.
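The coin example can be confirmed with a minimal sketch of the negative binomial pmf (the name `neg_binomial_pmf` is illustrative): the second head on the third flip of a fair coin corresponds to x = 3, r = 2, p = 1/2.

```python
from math import comb

# Negative binomial pmf f(x) = C(x-1, r-1) (1-p)^(x-r) p^r:
# the rth success occurs on trial x, for x = r, r + 1, ...
def neg_binomial_pmf(x, r, p):
    return comb(x - 1, r - 1) * (1 - p)**(x - r) * p**r

# Second head observed on the third flip of a fair coin:
print(neg_binomial_pmf(3, 2, 0.5))  # C(2,1)(1/2)^2(1/2) = 1/4
```

Equivalently, only the sequences HTH and THH (2 of the 8 equally likely length-3 sequences) put the second head on flip 3, giving 2/8 = 1/4.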
We shall now investigate the limiting form of the binomial
distribution when n → ∞ and p → 0 while np remains constant.
Letting this constant be λ, that is, np = λ and, hence, p = λ/n,
we can write
f(x; n, p) = C(n, x) (λ/n)^x (1 − λ/n)^(n − x)
= [n(n − 1)(n − 2) · · · (n − x + 1) / x!] (λ/n)^x (1 − λ/n)^(n − x)
= [(1 − 1/n)(1 − 2/n) · · · (1 − (x − 1)/n) / x!] λ^x [(1 − λ/n)^(−n/λ)]^(−λ) (1 − λ/n)^(−x).
Finally, if we let n → ∞ while x and λ remain fixed, we find that
(1 − 1/n)(1 − 2/n) · · · (1 − (x − 1)/n) → 1,
(1 − λ/n)^(−x) → 1,
(1 − λ/n)^(−n/λ) → e,
and, hence, that the limiting distribution becomes
f(x; λ) = λ^x e^(−λ) / x!, x = 0, 1, 2, . . . .
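The limit above can be seen numerically: for fixed λ = np, the binomial probabilities approach the Poisson ones as n grows. A sketch (helper names `binomial_pmf` and `poisson_pmf` are illustrative):

```python
from math import comb, exp, factorial

def binomial_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

def poisson_pmf(x, lam):
    # Limiting pmf f(x; lambda) = lambda^x e^(-lambda) / x!
    return lam**x * exp(-lam) / factorial(x)

lam = 2.0
for n in (10, 100, 10000):
    p = lam / n  # keep np = lambda constant while n grows
    print(n, binomial_pmf(3, n, p), poisson_pmf(3, lam))
```

As n increases, the two printed probabilities agree to more and more digits.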
6. Poisson Distribution:
A random variable X is said to have a Poisson distribution if
its probability mass function is given by
f(x; λ) = λ^x e^(−λ) / x!, x = 0, 1, 2, . . . ,
where λ > 0 is a parameter.
We denote such a random variable by X ∼ POI(λ).
The example given below is taken from the book "Mathematical
Statistics and Data Analysis" by John A. Rice.
THEOREM: If X is a Poisson random variable with parameter λ,
then its mean and variance are respectively given by
E(X) = μ_X = λ,
Var(X) = σ_X² = λ.
EXAMPLE: The distribution of the number of tickets purchased
up to and including the second winning ticket is negative binomial:
P(X = k) = (k − 1) p² (1 − p)^(k − 2).
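This closing pmf is the negative binomial with r = 2, since C(k − 1, 1) = k − 1. A sketch with the lottery value p = 1/9 (the helper name `second_win_pmf` is illustrative):

```python
# P(X = k) = (k - 1) p^2 (1 - p)^(k - 2): second winning ticket is the kth,
# i.e. the negative binomial pmf with r = 2, because C(k-1, 1) = k - 1.
def second_win_pmf(k, p):
    return (k - 1) * p**2 * (1 - p)**(k - 2)

p = 1 / 9
print(second_win_pmf(2, p))  # both of the first two tickets win: p^2

# The probabilities over k = 2, 3, ... sum to 1 (truncated, tiny tail).
total = sum(second_win_pmf(k, p) for k in range(2, 5000))
print(total)
```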