EEE 6542 - Lecture 3 Notes - Complete - F2024
Pr(N = n) = p,       n = 1
            1 − p,   n = 0
Examples:
1. Tossing a coin (success=H, failure=T, and p = P(H))
2. Inspecting an item (success = defective, failure = non-defective, and p = P(defective))
But what if we look at multiple coin flips? For example, for a biased coin with P(T) = 2/3, if you flip it 5 times, what is the probability that you get 2 tails?
Bernoulli Trials → Binomial Distribution
Bernoulli trials consist of repeated independent and identical experiments, each of which has only two outcomes.
Suppose n independent coin flips with p = Pr(success), and let N = the number of successes:
Pr(N = k) = C(n, k) p^k (1 − p)^(n − k), k = 0, 1, …, n

where C(n, k) = n! / ((n − k)! k!) is the number of ways to select k items from n choices.
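The formula can be checked directly on the earlier question (biased coin with P(T) = 2/3, flipped 5 times, probability of exactly 2 tails) — a quick sketch using the standard library:

```python
from math import comb

# Binomial probability: Pr(N = k) = C(n, k) * p**k * (1 - p)**(n - k)
def binom_pmf(n, k, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Biased coin with P(T) = 2/3, flipped 5 times; count tails as "success".
prob = binom_pmf(5, 2, 2/3)
print(prob)  # 10 * (2/3)**2 * (1/3)**3 = 40/243 ≈ 0.1646
```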
Binomial Distribution - Graph
[Figure: theoretical plot of the binomial PMF Pr(n) versus n, showing three cases labeled A, B, and C.]
Note: this is a theoretical graph – how would an experimental one be different?
Example: Bernoulli Trials
If you flip a coin 5 times, find the probability that you get fewer than 3 tails.
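This example can be worked by summing the binomial PMF over k = 0, 1, 2 — a sketch assuming a fair coin (p = 1/2):

```python
from math import comb

# Pr(fewer than 3 tails in 5 fair flips) = sum of the binomial PMF over k = 0, 1, 2
p = 0.5
prob = sum(comb(5, k) * p**k * (1 - p)**(5 - k) for k in range(3))
print(prob)  # (1 + 5 + 10) / 32 = 0.5
```

The answer is exactly 1/2 — by symmetry, "fewer than 3 tails" and "3 or more tails" are equally likely for a fair coin.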
Random Variables
Let’s first start with a simple example…
Consider the following – you roll a die and define X(·) to be the number of dots that come up.
OK – now let's do a formal definition…
Random Variable (RV)
A random variable X is a function that assigns a real
number, X(ξ), to each outcome ξ of a chance
experiment. X can be discrete or continuous (or
complex-valued).
[Figure: X maps each outcome ξ in the sample space S to a point x = X(ξ) on the real line; the set of values taken is SX.]
• The domain of X is the sample space S.
• The range of X is the set SX of all values taken on by X.
• SX is a subset of the set of all real numbers.
• The function X is deterministic, but randomness in the experiment induces randomness in X(ξ).
• A complex random variable z = x + jy is the sum of two real RVs.
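The definition above can be sketched in code — the die example, with the RV written as an ordinary (deterministic) function on the sample space:

```python
import random

# A random variable is a deterministic function on the sample space S;
# randomness enters only through which outcome xi the experiment produces.
S = [1, 2, 3, 4, 5, 6]          # domain: sample space of a die roll

def X(xi):
    return xi                    # X(xi) = number of dots that come up

SX = {X(xi) for xi in S}         # range of X, a subset of the reals
xi = random.choice(S)            # the chance experiment selects xi
print(X(xi))                     # X itself involves no randomness
```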
Let's think about our previous example with a slight change… now consider X(·) to be twice the number of dots that come up.
Consider the coin flipping scenario:
[Figure: the set of outcomes of a coin toss maps into ℝ, with heads → 1 and tails → −1.]
How to generate random variables: outcomes of
random experiments
Two step process:
1. Perform experiment to obtain outcome ξ
2. Do measurement to obtain X(ξ)
Examples:
• Toss coin 3 times, count number of heads
• Select person at random, measure height
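The two-step process above can be sketched for the first example (toss a coin 3 times, count the number of heads):

```python
import random

# Step 1: perform the experiment to obtain the outcome xi.
def experiment():
    return tuple(random.choice("HT") for _ in range(3))  # e.g. ('H', 'T', 'H')

# Step 2: do the measurement to obtain X(xi).
def X(xi):
    return xi.count("H")  # number of heads in the outcome

xi = experiment()
x = X(xi)
print(xi, "->", x)
```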
F(∞) = 1
F(−∞) = 0
P{ x1 < X ≤ x2 } = F(x2) − F(x1)
P[X > x] = 1 − F(x)
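These CDF properties can be checked numerically — a sketch assuming a fair die, where F(x) = Pr(X ≤ x):

```python
from fractions import Fraction

# CDF of a fair die: F(x) = Pr(X <= x)
A = [1, 2, 3, 4, 5, 6]

def F(x):
    return sum(Fraction(1, 6) for a in A if a <= x)

assert F(100) == 1                     # F tends to 1 as x -> infinity
assert F(-100) == 0                    # F tends to 0 as x -> -infinity
assert F(4) - F(2) == Fraction(2, 6)   # Pr(2 < X <= 4) = Pr({3, 4})
assert 1 - F(4) == Fraction(2, 6)      # Pr(X > 4) = Pr({5, 6})
```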
Continuous, Discrete and Mixed Random
Variables
[Figure: the CDF FX(x) of a continuous random variable rises smoothly from 0 to 1; the CDF of a discrete random variable is a staircase, with a jump (of size q) at each possible value.]
We'll begin with discrete random variables…
• X is a discrete random variable if it assumes values at a countable set of points x0, x1, x2, ….
• Remember the coin flipping example. Another simple example: roll a die and define X as the number that comes up.
Pr(X = k) = C(n, k) p^k (1 − p)^(n − k), k = 0, 1, …, n
Do you guys remember these plots from earlier today?
Cumulative distribution function (let's fill the blanks together)
[Figure: staircase CDF plotted against x = 0, 1, 2, … ???]
Expected Value of Discrete Random Variables
• The expected value (the average value) summarizes information contained in the CDF and PMF.
• E(cX) = cE(X)
• E(c1X1 + c2X2) = c1E(X1) + c2E(X2)
• More generally: let X be a discrete random variable with a set of possible values A and probability mass function p(x); then E(X) = Σ_{x∈A} x p(x).
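The sum Σ x p(x) can be evaluated directly for the running die example — exact arithmetic via `fractions` avoids float noise:

```python
from fractions import Fraction

# E(X) = sum over x in A of x * p(x), for a fair die
A = [1, 2, 3, 4, 5, 6]
p = {x: Fraction(1, 6) for x in A}
EX = sum(x * p[x] for x in A)
print(EX)  # 7/2
```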
Variance provides information about the spread of the distribution about the mean.
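For the same fair-die example, the variance follows from Var(X) = E(X²) − E(X)² — a quick check in exact arithmetic:

```python
from fractions import Fraction

# Var(X) = E[(X - E(X))^2] = E(X^2) - E(X)^2, for a fair die
A = [1, 2, 3, 4, 5, 6]
p = Fraction(1, 6)
EX = sum(x * p for x in A)        # 7/2
EX2 = sum(x * x * p for x in A)   # 91/6
var = EX2 - EX**2
print(var)  # 35/12, about 2.92
```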
A simple but nice example