Lecture 3 - Probability - BMSLec02
Contents
1 Discrete Probability
1.1 Random Events
1.2 Axioms of Probability
1.3 Probability Mass Function (PMF)
2 Important Discrete Probability Distributions
2.1 Discrete Uniform Distribution
3 Continuous Probability
3.1 Probability Density Function (PDF)
3.2 Cumulative Distribution Function (CDF)
1 Discrete Probability
1.1 Random Events
An event (generally denoted by A) is a collection of outcomes in a random
experiment.
Example 1 (An event). (i) Getting an even number in a die-rolling experiment. (ii)
Getting "Head" in a coin-toss experiment.
A sure event (generally denoted by S) is one that will always happen.
Example 2 (A sure event). (i) Getting a number between "1" and "6" in a die-rolling experiment. (ii) Getting "Head or Tail" in a coin-toss experiment.
p(S) = 1 (2)
Example 3 (Rolling a fair die). Find the probability of (i) obtaining any particular face,
(ii) obtaining an even-numbered face.
From the axioms of probability, the six faces are equally likely and their probabilities sum to one, so the probability of any particular face is 1/6. The even-numbered faces are {2, 4, 6}, so the probability of an even-numbered face is 3/6 = 1/2.
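As a quick numerical check (not part of the original notes), the following minimal Matlab sketch estimates these probabilities by simulating a large number of fair-die rolls; the sample size of 100,000 is an arbitrary choice.

% Minimal sketch (assumed example): estimate die-roll probabilities by simulation.
n = 100000;                          % number of simulated rolls
rolls = randi(6, n, 1);              % fair die: faces 1..6, equally likely
p_face1 = mean(rolls == 1)           % should be close to 1/6
p_even  = mean(mod(rolls, 2) == 0)   % should be close to 3/6 = 1/2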
Definition 1 (Bernoulli trials). A Bernoulli trial is an experiment where
• There are two possible outcomes: {success, failure}
• The probability of success is p and the probability of failure is q such that
p = 1 − q.
The following theorem helps us to obtain the answer for any number of
successes:
Theorem 1 (Probability of a number of successes in Bernoulli trials).
Consider n Bernoulli trials, each with probability of success p; the probability of getting exactly j successes is
\[
b(j, n, p) = \binom{n}{j} p^{j} q^{\,n-j} = \frac{n!}{(n-j)!\,j!}\, p^{j} q^{\,n-j} \tag{8}
\]
where δ(·) denotes the discrete (Kronecker) delta function, i.e., the unit impulse, which has the following properties:
\[
\delta(x) = \begin{cases} 1 & x = 0 \\ 0 & x \neq 0 \end{cases} \tag{14}
\]
\[
\int_{-\infty}^{\infty} \delta(x)\, dx = 0 \tag{15}
\]
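As an illustration (not part of the original notes), the following minimal Matlab sketch evaluates b(j, n, p) from Theorem 1, Eq. (8), and cross-checks it against a simulation; the values n = 5 and p = 0.5 are assumptions.

% Minimal sketch (assumed n = 5, p = 0.5): probability of exactly j successes.
n = 5; p = 0.5; q = 1 - p;
j = 0:n;
b = arrayfun(@(jj) nchoosek(n, jj), j) .* p.^j .* q.^(n - j);   % Theorem 1, Eq. (8)
% Cross-check by simulating 100,000 runs of n Bernoulli trials:
successes = sum(rand(100000, n) < p, 2);
b_sim = arrayfun(@(jj) mean(successes == jj), j);
disp([b; b_sim])    % the two rows should be close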
2 Important Discrete Probability Distributions
2.1 Discrete Uniform Distribution
The probability mass function (PMF) of a discrete uniform distribution is given
by
\[
U\{n; a, b\} = \frac{1}{b-a+1} \tag{16}
\]
where a and b are integers and n ∈ {a, a + 1, . . . , b}. Figure 2 illustrates the
PMF of a discrete uniform distribution.
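No code accompanies this distribution in the notes; as an illustration, the minimal Matlab sketch below plots the PMF in Eq. (16), with a = 1 and b = 6 assumed to match the die example that follows.

% Minimal sketch (assumed a = 1, b = 6): PMF of the discrete uniform distribution.
a = 1; b = 6;
n = a:b;
pmf = ones(size(n)) / (b - a + 1);   % Eq. (16): each value has probability 1/(b-a+1)
stem(n, pmf);
xlabel('n'); ylabel('Probability');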
Consider tossing a six-faced die whose faces are numbered from a = 1 to b = 6, and let n denote the number of times the die is thrown until face 1 appears. The x-axis in Figure 3 denotes n and the y-axis denotes the probability of n, where p is the probability of obtaining face 1 in a single trial.
Figure 3: PMF of geometric distribution.
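The formula behind Figure 3 is not included in this extract; the sketch below is a minimal Matlab illustration assuming the standard geometric PMF P(n) = (1 − p)^(n−1) p for the number of trials up to and including the first success, with p = 1/6 for the die example.

% Minimal sketch: geometric PMF for the first appearance of face 1 (p = 1/6 assumed).
p = 1/6;
n = 1:20;
pmf = (1 - p).^(n - 1) * p;   % first success occurs on trial n
stem(n, pmf);
xlabel('n'); ylabel('Probability');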
3 Continuous Probability
3.1 Probability Density Function (PDF)
The probability density function (pdf) of a scalar continuous-valued random
variable x at x = ξ is
\[
p_x(\xi) = \lim_{\delta\xi \to 0} \frac{P\{\xi - \delta\xi < x < \xi\}}{\delta\xi} \tag{20}
\]
3.2 Cumulative Distribution Function (CDF)
The cumulative distribution function (CDF) P_x(ξ) = P{x ≤ ξ} gives the probability that x does not exceed ξ. The pdf is its derivative:
\[
p_x(\xi) = \frac{\partial P_x(\xi)}{\partial \xi} \tag{24}
\]
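To connect Eqs. (20) and (24) numerically, here is a minimal Matlab sketch (an illustration, not from the notes) that approximates the pdf of a uniform random variable as a finite-difference derivative of its empirical CDF; the choice x ∼ U(0, 1) is an assumption.

% Minimal sketch (assumed example x ~ U(0,1)): pdf as the derivative of the CDF.
xs = sort(rand(100000, 1));              % sorted samples of x ~ U(0,1)
F  = (1:numel(xs))' / numel(xs);         % empirical CDF evaluated at the samples
xi = linspace(0.05, 0.95, 50);           % evaluation grid (away from the edges)
Fi = interp1(xs, F, xi);                 % CDF interpolated onto the grid
pdf_est = diff(Fi) ./ diff(xi);          % Eq. (24): p(x) ~ dP/dx
plot(xi(1:end-1), pdf_est);              % should hover around 1 on (0,1)
xlabel('\xi'); ylabel('p_x(\xi)');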
[Figure: Probability of n successes from 5 Bernoulli trials; x-axis: number of successes n, y-axis: probability.]
Figure 5: PMF of Poisson distribution.
4.1 Continuous Uniform Distribution
When x is uniformly distributed between a and b, written formally as x ∼
U(a, b), its probability density function (PDF) is written as
\[
p(x) = U(x; a, b) = \begin{cases} \dfrac{1}{b-a} & x \in [a, b] \\ 0 & \text{otherwise} \end{cases} \tag{25}
\]
Example 12. Use a Monte-Carlo simulation to verify that the number generated in Example 11 is indeed x ∼ U(10, 20) through a histogram visualization.
Solution: The following commands can be used to simulate n = 100,000 samples and to visualize their distribution through a histogram.
n = 100000;                  % number of samples
x = 10 + (20-10)*rand(n,1);  % samples of x ~ U(10, 20)
hist(x);                     % histogram of the samples
Figure 7 shows the resulting histogram, which confirms (visually) that the generated data x is indeed distributed as x ∼ U(10, 20).
Figure 8: PDF of Exponential distribution.
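Only the figure caption survives in this extract; as a rough illustration of what Figure 8 depicts, the minimal Matlab sketch below plots the exponential PDF in its standard form p(x) = λ e^(−λx) for x ≥ 0, with an assumed rate λ = 1.

% Minimal sketch (assumed form and rate): exponential PDF p(x) = lambda*exp(-lambda*x).
lambda = 1;
x = linspace(0, 8, 200);
p = lambda * exp(-lambda * x);
plot(x, p);
xlabel('x'); ylabel('p(x)');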
Example 15. Use Matlab to generate a random number x ∼ N (5, 3); then, use
a Monte-Carlo approach, similar to the one in Example 12 to visually verify the
simulated data.
Solution: The following command in Matlab generates a sample of x ∼ N (µ, σ)
x = µ + σ * randn (29)
Similar to before, the following commands can be used to simulate n = 100, 000
samples and to visualize their distribution through a histogram.
n = 100000;               % number of samples
x = 5 + 3*randn(n,1);     % samples of x ~ N(5, 3), cf. Eq. (29)
histogram(x, -10:.5:20);  % histogram of the samples
is said to be chi-square distributed with n degrees of freedom, i.e., q ∼ χ²_n.
Formally, the PDF of a chi-square distribution with n degrees of freedom can be written as
\[
p(x) = \frac{1}{2^{n/2}\,\Gamma(n/2)}\, x^{n/2-1} e^{-x/2} \tag{32}
\]
Figure 11 illustrates the PDF of a chi-square distribution.
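As a numerical illustration (not part of the original notes), the following minimal Matlab sketch builds q as a sum of squared standard-normal variables and compares its histogram with the PDF in Eq. (32); the choice n = 3 is an assumption.

% Minimal sketch (assumed n = 3): sum of n squared standard normals is chi-square.
n = 3;                                   % degrees of freedom
q = sum(randn(100000, n).^2, 2);         % 100,000 realizations of q
histogram(q, 'Normalization', 'pdf');    % empirical density of q
hold on;
x = linspace(0.01, 15, 200);
p = x.^(n/2 - 1) .* exp(-x/2) ./ (2^(n/2) * gamma(n/2));   % Eq. (32)
plot(x, p, 'LineWidth', 2);
hold off;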
• The first moment is also known as the mean, i.e.,
\[
\bar{x} = E\{x\} = \int_{-\infty}^{\infty} x\, p(x)\, dx \tag{38}
\]
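For instance, the sample mean of a large number of draws approximates the first moment; the sketch below is an assumed Matlab example, reusing x ∼ N(5, 3) from Example 15.

% Minimal sketch: the sample mean approximates E{x} (here x ~ N(5, 3), so E{x} = 5).
x = 5 + 3*randn(100000, 1);
xbar = mean(x)    % should be close to 5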
[Figure: PDF as a surface (top) and PDF as a contour (bottom).]