
ELEC-8900 Special Topics:

Advanced Energy Storage Systems


Lecture 02: Review of Probability
Instructor: Dr. Balakumar Balasingam
(Link to video from last year: https://tinyurl.com/y65l2uhj)
May 20, 2022

Contents

1 Discrete Probability
  1.1 Random Events
  1.2 Axioms of Probability
  1.3 Probability Mass Function (PMF)

2 Important Discrete Probability Distributions
  2.1 Discrete Uniform Distribution
  2.2 Geometric Distribution
  2.3 Binomial Distribution
  2.4 Poisson Distribution

3 Continuous Probability
  3.1 Probability Density Function (PDF)
  3.2 Cumulative Distribution Function (CDF)

4 Important Continuous Probability Distributions
  4.1 Continuous Uniform Distribution
  4.2 Exponential Distribution
  4.3 Gaussian Distribution
  4.4 Chi-Squared Distribution

5 Expectations and Moments

6 Joint PDF of Two Random Variables

7 Vector Probability Distribution
1 Discrete Probability
1.1 Random Events
An event (generally denoted by A) is a collection of outcomes in a random
experiment.
Example 1 (An event). (i) Getting an even number in a die rolling experiment. (ii) Getting "Head" in a coin toss experiment.
A sure event (generally denoted by S) will always happen.
Example 2 (A sure event). (i) Getting a number between "1" and "6" in a die rolling experiment. (ii) Getting "Head or Tail" in a coin toss experiment.

1.2 Axioms of Probability


1. Probability of an event is nonnegative:

P(A) ≥ 0    (1)

2. Probability of a sure event is unity:

P(S) = 1    (2)

3. Probability is additive for mutually exclusive events: If the events A and B have no common elements (their set intersection is ∅, the empty set), i.e.,

A ∩ B ≜ {A and B} ≜ {A, B} = ∅    (3)

then their union (logical "or") has probability

P{A ∪ B} ≜ P{A or B} ≜ P{A + B} = P(A) + P(B)    (4)

Example 3 (Rolling a fair die). Find the probability of (i) any of the faces; (ii) an even-numbered face.
From the axioms of probability,

P(S) = P{F1 + F2 + F3 + F4 + F5 + F6} = 6 P(Fi) = 1    (5)

From the above, the probability of getting any of the 6 faces is

P(Fi) = 1/6    (6)

The probability of getting an even-numbered face is

P(F2) + P(F4) + P(F6) = 1/6 + 1/6 + 1/6 = 1/2    (7)
Definition 1 (Bernoulli trials). A Bernoulli trial is an experiment where
• There are two possible outcomes: {success, failure}
• The probability of success is p and the probability of failure is q such that
p = 1 − q.

• The outcome of one trial is not affected by the preceding outcomes.


Example 4. A fair coin is tossed four times. What is the probability of getting
three heads?
Answer: As Figure 1 illustrates, the answer is 1/4.

Figure 1: Probability of getting three heads from four coin tosses.

The following theorem helps us to obtain the answer for any number of
successes:

Theorem 1 (Probability of a number of successes in Bernoulli trials).
Consider n Bernoulli trials, each with probability of success p; the probability of getting exactly j successes is

b(j, n, p) = (n choose j) p^j q^(n−j) = [n! / ((n−j)! j!)] p^j q^(n−j)    (8)

Using Theorem 1, the answer to Example 4 can be easily obtained as follows:


Since it is a fair coin, we have p = 1/2. The required probability is then

b(3, 4, 1/2) = [4! / (3! 1!)] (1/2)^3 (1/2)^1 = 0.25    (9)
Example 5. A biased coin, where the probability of getting a head is ph = 0.1,
is tossed four times. What is the probability of getting three heads?
Answer: We will simply use Theorem 1 to find the answer:
b(3, 4, 0.1) = [4! / (3! 1!)] (0.1)^3 (0.9)^1 = 0.0036    (10)
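Theorem 1 is straightforward to evaluate programmatically. The following is a minimal sketch in Python (the notes use Matlab elsewhere; Python is used here only so the snippet is self-contained), reproducing Examples 4 and 5:

```python
from math import comb

def bernoulli_successes(j, n, p):
    """Probability of exactly j successes in n Bernoulli trials, Eq. (8)."""
    q = 1.0 - p
    return comb(n, j) * (p ** j) * (q ** (n - j))

# Example 4: fair coin (p = 1/2), three heads in four tosses
print(bernoulli_successes(3, 4, 0.5))   # 0.25

# Example 5: biased coin with p = 0.1
print(bernoulli_successes(3, 4, 0.1))   # ~0.0036
```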

1.3 Probability Mass Function (PMF)

Let us assume that a random variable x takes only discrete values ξi, i = 1, 2, . . . , n. The probability of each observation ξi is given by

µx(ξi) = P{x = ξi} = µi,   i = 1, . . . , n    (11)

which is called the probability mass function (PMF).
Similar to the PDF for continuous random variables, the PMF has the property

∑_{i=1}^{n} µi = 1    (12)

The PMF in (11) can also be written as

p(x) = ∑_{i=1}^{n} µi δ(x − ξi)    (13)

where δ(·) is the Dirac delta function, which satisfies

δ(x) = 0 for x ≠ 0    (14)

∫_{−∞}^{∞} δ(x) dx = 1    (15)
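As a quick sketch (in Python, only so the snippet is self-contained), a PMF can be stored as a lookup table, and property (12) checked directly for the fair-die PMF of Example 3:

```python
# PMF of a fair six-sided die: each face carries mass 1/6
pmf = {face: 1 / 6 for face in range(1, 7)}

# Property (12): the masses sum to one
total_mass = sum(pmf.values())
print(abs(total_mass - 1.0) < 1e-12)        # True

# Probability of an even-numbered face, as in Eq. (7)
p_even = sum(mass for face, mass in pmf.items() if face % 2 == 0)
print(abs(p_even - 0.5) < 1e-12)            # True
```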
2 Important Discrete Probability Distributions
2.1 Discrete Uniform Distribution
The probability mass function (PMF) of a discrete uniform distribution is given by

U{n; a, b} = 1 / (b − a + 1)    (16)

where a and b are integers and n ∈ {a, a + 1, . . . , b}. Figure 2 illustrates the PMF of a discrete uniform distribution.

Figure 2: PMF of discrete uniform distribution.

Example 6 (Discrete uniform distribution).

Tossing a six-sided die whose faces are numbered from a = 1 to b = 6.

2.2 Geometric Distribution


The PMF of a geometric distribution is given by

G(n; p) = (1 − p)n−1 p (17)

Figure 3 illustrates the PMF of a geometric distribution.


Example 7 (Geometric distribution).

Consider tossing a six-sided die whose faces are numbered from a = 1 to b = 6. Let n denote the number of times the die is thrown until face 1 appears. The x-axis in Figure 3 denotes n and the y-axis denotes the probability of n, where p is the probability of obtaining face 1 in a single trial.

Figure 3: PMF of geometric distribution.
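The geometric PMF in (17) can be checked numerically. A short Python sketch for the die example of Example 7, with p = 1/6:

```python
def geometric_pmf(n, p):
    """G(n; p) = (1 - p)^(n-1) * p, Eq. (17): first success occurs on trial n."""
    return (1.0 - p) ** (n - 1) * p

p = 1 / 6   # probability of rolling face 1 on a single throw
probs = [geometric_pmf(n, p) for n in range(1, 500)]

print(abs(probs[0] - 1 / 6) < 1e-12)   # True: n = 1 is the most likely outcome
print(abs(sum(probs) - 1.0) < 1e-6)    # True: the PMF sums to (almost) one
```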

2.3 Binomial Distribution


The PMF of a binomial distribution is given by

B(n; k, p) = (k choose n) p^n q^(k−n) = [k! / ((k−n)! n!)] p^n q^(k−n)    (18)

where n is the number of successes obtained out of k consecutive Bernoulli trials with success probability p. Figure 4 illustrates the PMF of a binomial distribution for two sets of parameters.
Example 8.
Consider an unbiased coin tossed k times and let n denote the number of times a head is obtained. The binomial PMF in Figure 4 shows the probability of n for all possible values of n.

2.4 Poisson Distribution


The Poisson PMF is written as

P{n; λ, T} = e^(−λT) (λT)^n / n!    (19)

The Poisson PMF above describes the number of random points in an interval T. Figure 5 illustrates the PMF of a Poisson distribution.
Example 9.
Consider a radar that is designed to detect the number of airplanes flying in a certain surveillance region. The radar sometimes falsely detects an aircraft that is not really present (due to noise and other disturbances). The number of false alarms n per fixed unit (defined in terms of time, area, volume, etc.) is found to follow a Poisson distribution.
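The Poisson PMF in (19) is easy to evaluate numerically. A Python sketch with hypothetical numbers for the radar example (λ = 2 false alarms per hour over T = 1 hour is an assumption made here for illustration):

```python
from math import exp, factorial

def poisson_pmf(n, lam, T):
    """P{n; lambda, T} = exp(-lam*T) * (lam*T)^n / n!, Eq. (19)."""
    mu = lam * T
    return exp(-mu) * mu ** n / factorial(n)

# Hypothetical rate: an average of 2 false alarms per hour, over T = 1 hour
probs = [poisson_pmf(n, 2.0, 1.0) for n in range(100)]

print(abs(probs[0] - exp(-2.0)) < 1e-12)   # True: P(no false alarm) = e^(-2)
print(abs(sum(probs) - 1.0) < 1e-9)        # True: the PMF sums to one
```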

3 Continuous Probability
3.1 Probability Density Function (PDF)
The probability density function (PDF) of a scalar continuous-valued random variable x at x = ξ is

p_x(ξ) = lim_{dξ→0} P{ξ − dξ < x ≤ ξ} / dξ    (20)

where P{·} is the probability of the event {·}.

More common notation for the PDF:

p_x(ξ) = p_x(x) = p(x)    (21)

From (20) and Axiom 3, one can write

P(η < x ≤ ξ) = ∫_η^ξ p(x) dx    (22)

3.2 Cumulative Distribution Function (CDF)

The function

P_x(ξ) = P(x ≤ ξ) = ∫_{−∞}^{ξ} p(x) dx    (23)

is called the cumulative distribution function (CDF) of x at ξ.

The relationship between the density and the cumulative distribution is

p(ξ) = ∂P_x(ξ) / ∂ξ    (24)
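Relationship (24) can be sanity-checked numerically with a finite difference. The sketch below (Python, for self-containedness) uses the exponential distribution introduced later in Section 4.2, with the arbitrary choice λ = 1, whose CDF has the closed form F(x) = 1 − e^(−x):

```python
from math import exp

def cdf(x):
    return 1.0 - exp(-x)   # exponential CDF, lambda = 1

def pdf(x):
    return exp(-x)         # exponential PDF, lambda = 1

x, h = 1.5, 1e-6
# Central finite difference approximating dF/dx, cf. Eq. (24)
slope = (cdf(x + h) - cdf(x - h)) / (2 * h)
print(abs(slope - pdf(x)) < 1e-8)   # True: the derivative of the CDF is the PDF
```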

4 Important Continuous Probability Distributions
Figure 4: PMF of binomial distribution. (a) B(n; 5, 0.5): probability of n successes from 5 Bernoulli trials. (b) B(n; 20, 0.5): probability of n successes from 20 Bernoulli trials. In both panels the x-axis is the number of successes n and the y-axis is the probability.
Figure 5: PMF of Poisson distribution.

4.1 Continuous Uniform Distribution
When x is uniformly distributed between a and b, written formally as x ∼
U(a, b), its probability density function (PDF) is written as
p(x) = U(x; a, b) = 1/(b − a) for x ∈ [a, b], and 0 otherwise    (25)

Figure 6 illustrates the PDF of the above defined uniform distribution.

Figure 6: PDF of uniform distribution.

Example 10 (Continuous Uniform distributed variable).


Consider a Ferris wheel after all the buckets are removed. Let us give it a spin. The point of rest has a uniform probability along the circumference, which may be considered to be between a = 0 and b = 2πr, where r is the radius of the wheel.
Example 11. Use Matlab to generate a random number x ∼ U(10, 20)
Solution: The Matlab command 'rand' generates a number that is U(0, 1). When written as 'a*rand', the resulting number is U(0, a). Hence, the required random number can be generated using the following command in Matlab:

x = 10 + (20 - 10)*rand;    (26)

Example 12. Use a Monte-Carlo simulation to verify that the number gener-
ated in Example 11 is indeed x ∼ U(10, 20) through a histogram visualization.

Solution: The following commands can be used to simulate n = 100,000 samples and to visualize their distribution through a histogram.

n = 100000;
x = 10 + (20-10)*rand(n,1);
hist(x);
Figure 7 shows the resulting histogram, which visually confirms that the generated data x is indeed distributed x ∼ U(10, 20).

Figure 7: Histogram of randomly generated values. Monte-Carlo simulation to verify the Matlab 'rand' command.

Remark 1. Visualization is a useful tool to understand the nature of the data; however, it must be noted that the visualization in Example 12 does not prove that the data was generated according to U(10, 20). There are rigorous statistical tools to prove that, but they are not the focus of this course.
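For readers working outside Matlab, an equivalent Monte-Carlo check can be sketched in Python; instead of a histogram, this version compares sample statistics against the theoretical values for U(10, 20):

```python
import random

random.seed(0)   # fixed seed so the check is reproducible
n = 100_000
# Python analogue of "x = 10 + (20 - 10)*rand": random.random() is U(0, 1)
samples = [10 + (20 - 10) * random.random() for _ in range(n)]

sample_mean = sum(samples) / n
print(10 <= min(samples) and max(samples) <= 20)   # True: support is [10, 20]
print(abs(sample_mean - 15.0) < 0.1)               # True: theoretical mean is 15
```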

4.2 Exponential Distribution


The PDF of an exponentially distributed scalar variable, x ∼ E(λ), is given by

p(x) = E(x; λ) = λ e^(−λx)    (27)

Figure 8 illustrates the PDF of an exponential distribution.

Example 13 (Exponentially distributed variable).


The amount of time a clerk spends with a customer is generally modelled as
exponential.

Figure 8: PDF of Exponential distribution.
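A common way to simulate an exponential variable from a uniform one is inverse-transform sampling (a standard technique, though not covered in these notes). A Python sketch with a hypothetical rate λ = 2, verified against the known mean 1/λ:

```python
import random
from math import log

random.seed(1)
lam = 2.0        # hypothetical rate parameter, chosen for illustration
n = 100_000

# Inverse-transform sampling: if u ~ U(0,1), then -ln(1 - u)/lambda ~ E(lambda).
# (1 - u) lies in (0, 1], so the logarithm is always defined.
samples = [-log(1.0 - random.random()) / lam for _ in range(n)]

sample_mean = sum(samples) / n
print(abs(sample_mean - 1.0 / lam) < 0.01)   # True: the mean of E(lambda) is 1/lambda
```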

4.3 Gaussian Distribution


The PDF of a scalar variable x that is distributed Gaussian with mean µ and variance σ², i.e., x ∼ N(µ, σ²), is given by

p(x) = N(x; µ, σ²) ≜ [1/√(2πσ²)] e^(−(x−µ)²/(2σ²))    (28)
Figure 9 illustrates the PDF of a Gaussian distribution.

Figure 9: PDF of Gaussian distribution.

Example 14 (Gaussian distributed variable).


There are numerous real-world examples of Gaussian distributed variables, e.g., measurement noise, heights of a population, birth weights of babies, etc.

Example 15. Use Matlab to generate a random number x ∼ N(5, 3²), i.e., µ = 5 and σ = 3; then, use a Monte-Carlo approach, similar to the one in Example 12, to visually verify the simulated data.
Solution: The following command in Matlab generates a sample of x ∼ N(µ, σ²):

x = µ + σ*randn;    (29)
Similar to before, the following commands can be used to simulate n = 100, 000
samples and to visualize their distribution through a histogram.
n = 100000;
x = 5 + 3*randn(n,1);
histogram(x, -10:.5:20);

Figure 10: Histogram of randomly generated values. Monte-Carlo simulation to verify the Matlab 'randn' command.

4.4 Chi-Squared Distribution


Given n random variables, each Gaussian with zero mean and unit variance, i.e.,

ui ∼ N(0, 1)    (30)

the new variable

q = ∑_{i=1}^{n} ui²    (31)

is said to be chi-square distributed with n degrees of freedom, i.e., q ∼ χ²_n.
Formally, the PDF of a chi-square distribution with k degrees of freedom can be written as

p(x) = [1 / (2^(k/2) Γ(k/2))] x^(k/2 − 1) e^(−x/2)    (32)
Figure 11 illustrates the PDF of a chi-square distribution.

Figure 11: PDF of Chi square distribution.

Example 16 (Chi-square distributed variable).

Squared error is chi-square distributed; hence, the chi-square distribution has several applications in evaluating the performance of estimation algorithms.
Property 1 (Addition of two independent chi-square distributed random variables). Let q1 and q2 be independent with

q1 ∼ χ²_{n1}    (33)
q2 ∼ χ²_{n2}    (34)

then

q1 + q2 ∼ χ²_{n1+n2}    (35)
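The construction in (31) can be verified by simulation: sum the squares of n standard Gaussians and check that the sample mean matches the degrees of freedom (the mean of a χ²_n variable is n). A Python sketch, with n = 4 chosen arbitrarily:

```python
import random

random.seed(2)
n_dof = 4          # degrees of freedom, an arbitrary choice for illustration
trials = 50_000

# Eq. (31): the sum of squares of n_dof standard Gaussians is chi-square
# with n_dof degrees of freedom; its mean equals n_dof.
qs = [sum(random.gauss(0.0, 1.0) ** 2 for _ in range(n_dof))
      for _ in range(trials)]

sample_mean = sum(qs) / trials
print(abs(sample_mean - n_dof) < 0.1)   # True
```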

5 Expectations and Moments


The nth moment is

E{x^n} = ∫_{−∞}^{∞} x^n p(x) dx    (36)

The nth central moment is

E{(x − x̄)^n} = ∫_{−∞}^{∞} (x − x̄)^n p(x) dx    (37)

where x̄ is the mean.

• The first moment is also known as the mean, i.e.,

x̄ = E{x} = ∫_{−∞}^{∞} x p(x) dx    (38)

• The second central moment is also known as the variance, i.e.,

σ²_x = E{(x − x̄)²} = ∫_{−∞}^{∞} (x − x̄)² p(x) dx    (39)
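Equations (38) and (39) can be evaluated numerically for a distribution with known moments. A Python sketch using midpoint-rule integration for U(0, 2) (an arbitrary choice), whose mean is (a + b)/2 = 1 and whose variance is (b − a)²/12 = 1/3:

```python
# Midpoint-rule evaluation of Eqs. (38) and (39) for U(0, 2)
a, b = 0.0, 2.0
N = 100_000
dx = (b - a) / N

def p(x):
    return 1.0 / (b - a)   # uniform PDF on [a, b]

xs = [a + (i + 0.5) * dx for i in range(N)]
mean = sum(x * p(x) * dx for x in xs)                 # first moment, Eq. (38)
var = sum((x - mean) ** 2 * p(x) * dx for x in xs)    # second central moment, Eq. (39)

print(abs(mean - 1.0) < 1e-6)        # True
print(abs(var - 1.0 / 3.0) < 1e-6)   # True
```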

6 Joint PDF of Two Random Variables


The joint PDF of two random variables x and y is defined in terms of the probability of the following joint event:

p_{x,y}(ξ, η) = lim_{dξ→0, dη→0} P{ξ − dξ < x ≤ ξ, η − dη < y ≤ η} / (dξ dη)    (40)

Similar to before, we will use the simpler notation

p_{x,y}(ξ, η) → p(x, y)    (41)

The joint CDF is expressed as

P_{x,y}(ξ, η) = P{x ≤ ξ, y ≤ η} = ∫_{−∞}^{ξ} ∫_{−∞}^{η} p(x, y) dy dx    (42)
It must be noted that the joint PDF satisfies the following properties:

∫_{−∞}^{∞} p(x, y) dx = p(y)    (43)

∫_{−∞}^{∞} p(x, y) dy = p(x)    (44)

where the resulting PDFs are called the marginal PDFs.

7 Vector Probability Distribution


The definition of the joint probability of two random variables can be extended to a vector of random variables.
The PDF of a vector variable x that is distributed Gaussian with mean x̄ and covariance matrix P, i.e., x ∼ N(x̄, P), is given by

p(x) = N(x; x̄, P) ≜ |2πP|^(−1/2) e^(−(1/2)(x − x̄)^T P^(−1) (x − x̄))    (45)
Example 17.
Consider x ∼ N (x̄, P) where
x̄ = [2 1]^T    (46)

P = [4 1; 1 4]    (47)
Figure 12 shows the PDF in two different ways.

(Two panels: the PDF as a surface, and the PDF as a contour.)
Figure 12: Two dimensional Gaussian distribution.
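Equation (45) can be evaluated directly for the parameters of Example 17. A Python sketch (with a hand-coded 2×2 inverse and determinant, to stay dependency-free), checking the peak value at x = x̄, where the exponent vanishes:

```python
from math import exp, pi, sqrt

# Parameters of Example 17
x_bar = [2.0, 1.0]
P = [[4.0, 1.0],
     [1.0, 4.0]]

def gauss2d(x, mean, P):
    """Evaluate the 2-D Gaussian PDF of Eq. (45)."""
    det = P[0][0] * P[1][1] - P[0][1] * P[1][0]
    # closed-form inverse of a 2x2 matrix
    Pinv = [[ P[1][1] / det, -P[0][1] / det],
            [-P[1][0] / det,  P[0][0] / det]]
    d = [x[0] - mean[0], x[1] - mean[1]]
    quad = (d[0] * (Pinv[0][0] * d[0] + Pinv[0][1] * d[1])
            + d[1] * (Pinv[1][0] * d[0] + Pinv[1][1] * d[1]))
    # |2*pi*P| = (2*pi)^2 * det(P) for a 2x2 covariance matrix
    return exp(-0.5 * quad) / sqrt((2 * pi) ** 2 * det)

# At x = x_bar the exponent vanishes, so the peak is 1/sqrt(|2*pi*P|);
# here det(P) = 15, giving 1/(2*pi*sqrt(15))
peak = gauss2d(x_bar, x_bar, P)
print(abs(peak - 1.0 / (2 * pi * sqrt(15.0))) < 1e-12)   # True
```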
