
EEL 6542 – Random Processes in Electrical Engineering

Lecture 3 – Bayes’ Rule, Repeated Trials & Random Variables

Ismail Uysal, Ph.D.


Associate Professor
Undergraduate Director
Department of Electrical Engineering
iuysal@usf.edu
Tampa Office Location: ENB II Building, 366
Office Hours: Bi-weekly 4:00PM – 5:00PM Thursdays (Microsoft Teams)

Stay in touch – Canvas is your main communications tool!


Announcements
• Remember – recitation hours are every Thursday
starting at 5 PM in this very classroom (CMC 141).
• Thank you, Rifat, for hosting our first session last
Thursday – we will stick to the same setup going
forward…
Office Hours (Me and Rifat)
• I have my office hours starting this Thursday before
the recitation session – between 4 PM and 5 PM.
• These are on Teams and should be on your calendar.
• These will be bi-weekly with more hours set around
exam times.
• Rifat will have his own office hours, every week.
• Mondays 1PM – 2PM
• These are on Teams and should be on your calendar
already.
• You can always reach out to us if you need
additional help.
Second quiz this week
• On today’s material as always.
• Due next Tuesday (September 17) before the start of
class.
• Will become available after 7 PM today.
What did we learn so far?
• We learned about different definitions of
probability.
• We learned about axioms of probability and how to
define sample space, events, etc. using Venn
diagrams.
• We learned how to use the 3 axioms of probability to
prove some observations.
• We learned about conditional probability.
Today
• Bayes’ theorem (one of the most practical
outcomes of probability theory).
• We look at repeated trials and…
• We start our discussion on random variables!!!
BAYES’ THEOREM:
Here is a fun fact… You use Bayes’ rule in your everyday
life, all the time!
Although simple enough, Bayes’ theorem has an
interesting interpretation:

$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}$$

P(A) represents the a priori probability of the event A.
Suppose B has occurred, and assume that A and B are
not independent. How can this new information be used
to update our knowledge about A? Bayes’ rule takes into
account the new information (“B has occurred”) and gives
the a posteriori probability of A given B.
A more general version of Bayes’ theorem involves a
partition B1, B2, …, Bn of the sample space S. Remember:

$$P(B_i \mid A) = \frac{P(A \mid B_i)\,P(B_i)}{\sum_{j=1}^{n} P(A \mid B_j)\,P(B_j)}$$
Example: Two boxes B1 and B2 contain 100 light bulbs
each. The first box (B1) has 20 defective bulbs and the
second 10. Suppose a box is selected at random and one
bulb is picked out.
(a) What is the probability that it is defective?
(b) Suppose we test the bulb and it is found to be defective.
What is the probability that it came from box 1? P(B1 | D) = ?
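A minimal sketch of the computation (variable names are illustrative; it assumes each box is equally likely to be selected, as “selected at random” suggests):

```python
# Two boxes, equally likely to be picked; 20/100 defective bulbs
# in B1 and 10/100 in B2 (numbers from the slide).
p_b1, p_b2 = 0.5, 0.5
p_d_b1, p_d_b2 = 20 / 100, 10 / 100

# (a) Total probability: P(D) = P(D|B1)P(B1) + P(D|B2)P(B2)
p_d = p_d_b1 * p_b1 + p_d_b2 * p_b2
print(p_d)                  # 0.15

# (b) Bayes' rule: P(B1|D) = P(D|B1)P(B1) / P(D)
print(p_d_b1 * p_b1 / p_d)  # 0.666... (= 2/3)
```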
Repeated Trials
J. Bernoulli (1654-1705)
analyzed the idea of
repeated independent trials
that had two possible
outcomes: success or failure.
In his notation, the probability of success is
denoted by p and the probability of failure is
denoted by q = 1 − p.
So what is the Bernoulli process?
A Bernoulli process is an experiment that must satisfy the
following properties:
• The experiment consists of n repeated Bernoulli trials.
• The probability of success, P(s) = p, remains constant
from trial to trial.
• The repeated trials are independent; that is, the
outcome of one trial has no effect on the outcome of
any other trial.
How can you tell if something is a
Bernoulli process?
There are many “real world” problems that are
Bernoulli processes, and the test is answering the
following three questions:
• Is the event outcome random and discrete?
• Is each event independent and with the same
probability of success?
• How would you define a “success” and a “failure”?
Bernoulli Distribution
Bernoulli Trial: a Bernoulli trial is an experiment with only two possible
outcomes, generally labeled as success (s) and failure (f).

Single coin flip: p = Pr(success)
N = 1 if success, 0 otherwise

$$\Pr(N = n) = \begin{cases} p, & n = 1 \\ 1 - p, & n = 0 \end{cases}$$

Examples:
1. Tossing a coin (success = H, failure = T, and p = P(H))
2. Inspecting an item (success = defective, failure = non-defective,
and p = P(defective))
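A quick simulation sketch of a Bernoulli trial sequence (names illustrative; assumes a fair coin, p = 0.5):

```python
import random

# Repeat a Bernoulli trial many times and check that the fraction
# of successes approaches p (law of large numbers).
p = 0.5
n_trials = 100_000
successes = sum(1 for _ in range(n_trials) if random.random() < p)
print(successes / n_trials)  # close to 0.5
```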
But what if we look at multiple coin flips? For
example, for a biased coin with P(T) = 2/3, if you flip it
5 times, what is the probability that you get 2 tails?
Bernoulli Trials → Binomial Distribution
Bernoulli trials consist of repeated independent and
identical experiments, each of which has only two outcomes.
Suppose n independent coin flips with p = Pr(success) and N =
the number of successes:

$$\Pr(N = k) = \binom{n}{k} p^k (1 - p)^{n-k}$$

where $\binom{n}{k} = \frac{n!}{(n-k)!\,k!}$ is the number of ways to select k
items from n choices.
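Returning to the question above (biased coin with P(T) = 2/3, flipped 5 times, treating “tails” as the success), the formula gives:

$$\Pr(2 \text{ tails}) = \binom{5}{2}\left(\tfrac{2}{3}\right)^{2}\left(\tfrac{1}{3}\right)^{3} = 10 \cdot \tfrac{4}{9} \cdot \tfrac{1}{27} = \tfrac{40}{243} \approx 0.165$$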
Binomial Distribution - Graph

Typical shape of a binomial distribution:
– Symmetric (for p = 1/2), with the probabilities Pr(n)
summing to 1.

[Figure: theoretical plot of the binomial PMF, Pr(n) versus n.]

Note: this is a theoretical graph – how would an
experimental one be different?
Example: Bernoulli Trials
If you flip a coin 5 times, find the probability that you
get less than 3 tails.
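Assuming a fair coin (the problem doesn’t state otherwise) and letting X = the number of tails:

$$\Pr(X < 3) = \sum_{k=0}^{2} \binom{5}{k}\left(\tfrac{1}{2}\right)^{5} = \frac{1 + 5 + 10}{32} = \frac{16}{32} = \frac{1}{2}$$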
Random Variables
Let’s first start with a simple example…
Consider the following – you roll a die and define
X(.) to be the number of dots that come up.
Ok – now let’s do a formal
definition…
Random Variable (RV)
A random variable X is a function that assigns a real
number, X(ξ), to each outcome ξ of a chance
experiment. X can be discrete or continuous (or a
complex number).
[Figure: X maps each outcome ξ ∈ S to a point x = X(ξ)
on the real line; the image of S is SX.]
• The domain of X is the sample space S.
• The range of X is the set SX of all values taken on by X.
• SX is a subset of the set of all real numbers.
• The function X is deterministic, but randomness in the experiment
induces randomness in X(ξ).
• A complex random variable z = x + jy is the sum of two real RVs.
Let’s think about our previous
example with a slight change…
You can consider X(.) to be twice the number of
dots that come up.
Consider a coin-flipping scenario:

[Figure: a coin toss maps the outcome “heads” to X = 1
and “tails” to X = −1.]
How to generate random variables: outcomes of
random experiments
Two-step process:
1. Perform experiment to obtain outcome ξ
2. Do measurement to obtain X(ξ)

Examples:
• Toss coin 3 times, count number of heads
• Select person at random, measure height

The question is: how do we calculate probabilities
involving the measurement?
For example…
Consider flipping a coin two times and counting the
number of heads…
Let’s plot this.

• This plot is called the probability mass function (PMF)
of the random variable X. We’ll get back to this.
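A short sketch that tabulates this PMF by enumeration (names illustrative):

```python
from itertools import product
from collections import Counter

# Enumerate the four equally likely outcomes of two fair coin flips
# and tabulate the PMF of X = number of heads.
outcomes = list(product("HT", repeat=2))
counts = Counter(o.count("H") for o in outcomes)
pmf = {x: c / len(outcomes) for x, c in sorted(counts.items())}
print(pmf)  # {0: 0.25, 1: 0.5, 2: 0.25}
```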
Probability Distribution Function
The probability P{X ≤ x } is denoted by FX (x) and is called
the (cumulative) distribution function (CDF) of the
random variable X.

That is, FX(x) = P{X ≤ x}.

• The CDF is defined for both discrete and
continuous-valued RVs.
• We’ll drop the subscript and write F(x) to denote the
CDF of the random variable X.
Some properties of F(x)
F(x) is a non-decreasing function of x and 0 ≤ F(x) ≤ 1.

F(∞) = 1

F(−∞) = 0

P{x1 < X ≤ x2} = F(x2) − F(x1)

P[X > x] = 1 − F(x)
Continuous, Discrete and Mixed Random
Variables

• The random variable X is said to be continuous if its
distribution function F(x) is continuous.

• If F(x) is constant except for a countable number of jump
discontinuities (i.e., piecewise constant with steps), then X is
said to be a discrete RV.

• X is a mixed random variable if F(x) has jumps at a
countable number of points x0, x1, x2, … but also increases
continuously between them (neither purely discrete nor
purely continuous).
Continuous vs. Discrete

[Figure: left, a continuous CDF FX(x) rising smoothly from
0 to 1 between x1 and x2 – a continuous random variable;
right, a staircase CDF with a jump of height q – a discrete
random variable.]
We’ll begin with discrete random
variables…
• X is a discrete random variable if it assumes values at a
countable set of points x0, x1, x2, … .
• Remember the coin-flipping example. Another simple
example: if you roll a die and define X as the number
that comes up.

Probability Mass Function (PMF) of X:

pX(x) = P[X = x] = P[{ζ: X(ζ) = x}], for x a real number
Some properties of the PMF:
• pX(x) ≥ 0 for all x
• The probabilities sum to 1: Σx pX(x) = 1, summing over all x in SX
• P(X ∈ B) = Σx∈B pX(x) for any set B of values
A quick example
• Rolling a 3-sided die (can you physically make it?)
• X is defined as 2 × (number of dots). Find and plot the
CDF and PMF (worked out below).
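For reference, here is the worked answer (assuming each face of the 3-sided die is equally likely):

$$p_X(2) = p_X(4) = p_X(6) = \tfrac{1}{3}$$

$$F_X(x) = \begin{cases} 0, & x < 2 \\ 1/3, & 2 \le x < 4 \\ 2/3, & 4 \le x < 6 \\ 1, & x \ge 6 \end{cases}$$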
Let’s look at some famous
discrete random variables!
Bernoulli Random Variable
A Bernoulli random variable describes
the outcome of a single event as a 0
(“failure”) or 1 (“success”).

The indicator function for an event A is

$$I_A(\xi) = \begin{cases} 1, & \xi \in A \\ 0, & \xi \notin A \end{cases}$$

IA is called the Bernoulli random variable.

Can you sketch the PMF and CDF for a
Bernoulli random variable?
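For reference, the sketch should show:

$$p_X(0) = 1 - p, \qquad p_X(1) = p$$

$$F_X(x) = \begin{cases} 0, & x < 0 \\ 1 - p, & 0 \le x < 1 \\ 1, & x \ge 1 \end{cases}$$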
Binomial Random Variable
• X = # successes in n independent Bernoulli trials

$$\Pr(X = k) = \binom{n}{k} p^k (1 - p)^{n-k}, \quad k = 0, 1, \ldots, n$$
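If you want to check these values numerically, one option is SciPy’s binom (this assumes SciPy is installed; it is not part of the lecture material):

```python
from scipy.stats import binom  # assumes SciPy is available

# PMF and CDF of a binomial random variable: n = 5 fair coin flips.
n, p = 5, 0.5
print([float(binom.pmf(k, n, p)) for k in range(n + 1)])
print(float(binom.cdf(2, n, p)))  # P(X <= 2) = 0.5
```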
Do you guys remember these plots from earlier today?
Cumulative distribution function (let’s
fill the blanks together)

[Figure: staircase CDF of a binomial random variable,
with jumps of height Pr(X = k) at x = 0, 1, 2, …, n.]
Expected Value of Discrete Random Variables
• The expected value (the average value) summarizes the
information contained in the CDF and PMF.

• The expected value is denoted by E(X). It is found by:

$$E(X) = \sum_{x \in S_X} x\, p_X(x)$$

• Other symbols for the mean are mX and, most
commonly, μX.
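As a quick worked example, for a fair six-sided die with X = the number rolled:

$$E(X) = \sum_{k=1}^{6} k \cdot \tfrac{1}{6} = \tfrac{21}{6} = 3.5$$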
Properties of Expected Value
• Expectation is a linear operation.
• Linearity implies that (c = a constant):

• E(cX) = cE(X)

• E(c1X1 + c2X2) = c1E(X1) + c2E(X2)

• More generally: E(Σk ckXk) = Σk ckE(Xk)
Variance of a Discrete Random Variable
It is useful to know the spread about the mean.
Let X be a discrete random variable with a set of possible
values A, probability mass function p(x), and E(X) = μ.

Var(X) and σX, called the variance and the standard
deviation of X, respectively, are defined by:

$$\mathrm{Var}(X) = E\big[(X - \mu)^2\big] = \sum_{x \in A} (x - \mu)^2\, p(x) = E[X^2] - \mu^2$$

$$\sigma_X = \sqrt{\mathrm{Var}(X)}$$
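Continuing the fair-die example from the expected value slide:

$$E[X^2] = \tfrac{1 + 4 + 9 + 16 + 25 + 36}{6} = \tfrac{91}{6}, \qquad \mathrm{Var}(X) = \tfrac{91}{6} - (3.5)^2 = \tfrac{35}{12} \approx 2.92, \qquad \sigma_X \approx 1.71$$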
The expected value E[X]
provides the center of
mass of the distribution
of a random variable.

The variance provides
information about the
spread of the distribution
about the mean.
A simple but nice example

Flip a coin three times and define the
RV X as “the number of tails minus
the number of heads”. Let’s do the
same practice of finding and plotting
the CDF, PMF, expected value, variance,
and standard deviation.
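A brute-force sketch of this exercise, enumerating the 8 equally likely outcomes (names illustrative; assumes a fair coin):

```python
from itertools import product
from math import sqrt

# Three fair coin flips; X = (number of tails) - (number of heads).
outcomes = list(product("HT", repeat=3))
values = [o.count("T") - o.count("H") for o in outcomes]

pmf = {x: values.count(x) / len(outcomes) for x in sorted(set(values))}
mean = sum(x * p for x, p in pmf.items())
var = sum((x - mean) ** 2 * p for x, p in pmf.items())
print(pmf)                   # {-3: 0.125, -1: 0.375, 1: 0.375, 3: 0.125}
print(mean, var, sqrt(var))  # 0.0 3.0 1.732...
```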
