
Discrete Random Variables; Expectation

18.05 Spring 2014

Jeremy Orloff and Jonathan Bloom

This image is in the public domain.

http://www.mathsisfun.com/data/quincunx.html
http://www.youtube.com/watch?v=9xUBhhM4vbM

Board Question: Evil Squirrels


One million squirrels, of which 100 are pure evil!
Events: E = evil, G = good, A = alarm sounds
Accuracy: P(A|E ) = .99, P(A|G ) = .01.
a) A squirrel sets off the alarm; what is the probability that it is evil?
b) Is the system of practical use?
answer: a) Let E be the event that a squirrel is evil. Let A be the
event that the alarm goes off. By Bayes' theorem, we have:

P(E | A) = P(A | E)P(E) / [P(A | E)P(E) + P(A | E^c)P(E^c)]
         = (.99)(100/1000000) / [(.99)(100/1000000) + (.01)(999900/1000000)]
         ≈ .01.
b) No. The alarm would be more trouble than it's worth, since for
every true positive there are about 100 false positives.
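
A quick numerical check of part (a) (a minimal Python sketch; the variable names are our own, not from the slides):

# Posterior probability that a squirrel is evil given that the alarm sounds,
# computed with Bayes' theorem and the numbers from the problem.
p_evil = 100 / 1_000_000            # P(E)
p_good = 1 - p_evil                 # P(G) = P(E^c)
p_alarm_given_evil = 0.99           # P(A | E)
p_alarm_given_good = 0.01           # P(A | G)

p_alarm = p_alarm_given_evil * p_evil + p_alarm_given_good * p_good   # P(A)
p_evil_given_alarm = p_alarm_given_evil * p_evil / p_alarm            # P(E | A)

print(p_evil_given_alarm)   # roughly 0.0098, i.e. about 1%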

Big point

           Evil    Nice      Total
Alarm        99    9999      10098
No alarm      1    989901    989902
Total       100    999900    1000000

Summary:
Probability a random test is correct = (99 + 989901)/1000000 = .99
Probability a positive test is correct = 99/10098 ≈ .01

These probabilities are not the same!


Table Question: Dice Game

The Randomizer holds the 6-sided die in one fist and
the 8-sided die in the other.

The Roller selects one of the Randomizer's fists and

covertly takes the die.

The Roller rolls the die in secret and reports the result

to the table.

Given the reported number, what is the probability that
the 6-sided die was chosen?
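
Before doing the Bayes' theorem calculation, you can estimate the answer by simulation; here is a minimal sketch (our own code, not from the slides):

import random
from collections import Counter

# Pick the 6-sided or 8-sided die with equal probability, roll it,
# and record (number of sides, reported roll).
trials = 100_000
counts = Counter()
for _ in range(trials):
    sides = random.choice([6, 8])
    roll = random.randint(1, sides)
    counts[(sides, roll)] += 1

# Estimate P(6-sided die | reported number = r) for each possible report r.
for r in range(1, 9):
    six, eight = counts[(6, r)], counts[(8, r)]
    if six + eight > 0:
        print(r, six / (six + eight))
# Reports of 7 or 8 rule out the 6-sided die; reports of 1-6 favor it
# (the exact posterior there is 4/7).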


Reading Review
Random variable X assigns a number to each outcome:
X : Ω → R
X = a denotes the event {ω | X(ω) = a}.
Probability mass function (pmf) of X is given by
p(a) = P(X = a).
Cumulative distribution function (cdf) of X is given by
F(a) = P(X ≤ a).
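
As a concrete illustration (our own sketch, not part of the reading), here is a small pmf, chosen to match the cdf in the concept question that follows, and the cdf built from it:

# A pmf taking values 1, 3, 5, 7.
pmf = {1: 0.5, 3: 0.25, 5: 0.15, 7: 0.10}

def cdf(a):
    """F(a) = P(X <= a): add up the pmf over all values <= a."""
    return sum(p for x, p in pmf.items() if x <= a)

print(cdf(3))   # 0.75
print(cdf(4))   # also 0.75 -- the cdf is a step function, flat between values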

Concept Question: cdf and pmf


X is a random variable.

values of X:   1    3    5    7
cdf F(a):     .5   .75  .9    1

1. What is P(X ≤ 3)?
a) .15

b) .25

c) .5

d) .75

2. What is P(X = 3)?
a) .15 b) .25 c) .5

d) .75


CDF and PMF

(Figure: graphs of the cdf F(a), a step function rising through .5, .75, .9, 1 at the values 1, 3, 5, 7, and the pmf p(a), with masses .5, .25, .15, .1 at those values.)

Deluge of discrete distributions


Bernoulli(p) = 1 (success) with probability p,
               0 (failure) with probability 1 − p.

In more neutral language:

Bernoulli(p) = 1 (heads) with probability p,
               0 (tails) with probability 1 − p.

Binomial(n, p) = # of successes in n independent
Bernoulli(p) trials.

Geometric(p) = # of tails before the first heads in a
sequence of indep. Bernoulli(p) trials.

(Neutral language avoids confusing whether we want the number of
successes before the first failure or vice versa.)
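
To make these definitions concrete, here is a minimal sampling sketch (our own code; the parameter values are arbitrary choices):

import random

def bernoulli(p):
    """1 (heads) with probability p, 0 (tails) with probability 1 - p."""
    return 1 if random.random() < p else 0

def binomial(n, p):
    """Number of successes in n independent Bernoulli(p) trials."""
    return sum(bernoulli(p) for _ in range(n))

def geometric(p):
    """Number of tails before the first heads in independent Bernoulli(p) trials."""
    tails = 0
    while bernoulli(p) == 0:
        tails += 1
    return tails

p = 0.3   # an arbitrary value, just for illustration
print(bernoulli(p), binomial(10, p), geometric(p))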


Concept Question

1. Let X ∼ binom(n, p) and Y ∼ binom(m, p) be
independent. Then X + Y follows:
a) binom(n + m, p)
b) binom(nm, p)
c) binom(n + m, 2p)
d) other
2. Let X ∼ binom(n, p) and Z ∼ binom(n, q) be
independent. Then X + Z follows:
a) binom(n, p + q)
b) binom(n, pq)
c) binom(2n, p + q)
d) other


Board Question: Find the pmf

X = # of successes before the second failure of a
sequence of independent Bernoulli(p) trials.

Describe the pmf of X.
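
A simulation is a useful check on whatever formula you derive; here is a minimal sketch (our own code, with an arbitrary choice of p):

import random
from collections import Counter

def successes_before_second_failure(p):
    """Run Bernoulli(p) trials until the second failure; return # of successes."""
    successes = failures = 0
    while failures < 2:
        if random.random() < p:
            successes += 1
        else:
            failures += 1
    return successes

p, trials = 0.6, 100_000      # p is an arbitrary choice for illustration
counts = Counter(successes_before_second_failure(p) for _ in range(trials))
for k in range(8):
    print(k, counts[k] / trials)   # compare with the pmf you derive for P(X = k)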


Dice simulation: geometric(1/4)

Roll the 4-sided die repeatedly until you roll a 1.

Click in X = # of rolls BEFORE the 1.

(If X is 9 or more click 9.)

Example: If you roll (3, 4, 2, 3, 1) then click in 4.

Example: If you roll (1) then click 0.


Fiction

Gambler's fallacy: [roulette] if black comes up several
times in a row, then the next spin is more likely to be red.
Hot hand: NBA players get hot.


Fact

P(red) remains the same.

The roulette wheel has no memory. (Monte Carlo, 1913).

The data show that a player who has made 5 shots in a row
is no more likely than usual to make the next shot.


Memory

Show that Geometric(p) is memoryless, i.e.,

P(X = n + k | X ≥ n) = P(X = k)
Explain why we call this memoryless.
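
A quick numerical check of this identity (our own sketch), using the pmf P(X = k) = (1 − p)^k p that follows from counting tails before the first heads:

p, n, k = 0.25, 3, 2   # arbitrary values for illustration

def geom_pmf(j, p):
    """P(X = j) for Geometric(p), counting tails before the first heads."""
    return (1 - p) ** j * p

# {X = n + k} is a subset of {X >= n}, so the conditional probability is a ratio.
p_x_at_least_n = (1 - p) ** n              # first n trials are all tails
lhs = geom_pmf(n + k, p) / p_x_at_least_n  # P(X = n + k | X >= n)
rhs = geom_pmf(k, p)                       # P(X = k)
print(lhs, rhs)                            # both equal (1 - p)**k * p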


Computing expected value

Definition:

E(X) = Σ xi p(xi)

1. E(aX + b) = aE(X) + b
2. E(X + Y) = E(X) + E(Y)
3. E(h(X)) = Σ h(xi) p(xi)
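
These rules are easy to check numerically; here is a minimal sketch (our own code, not from the slides) that computes E(X) and E(h(X)) directly from the pmf of the earlier concept question:

# The pmf from the earlier cdf/pmf concept question.
pmf = {1: 0.5, 3: 0.25, 5: 0.15, 7: 0.10}

def expectation(pmf, h=lambda x: x):
    """E(h(X)) = sum of h(x) * p(x) over the values of X."""
    return sum(h(x) * p for x, p in pmf.items())

E_X = expectation(pmf)
print(E_X)                                        # 2.7
print(expectation(pmf, h=lambda x: 2 * x + 1))    # 6.4, equal to 2*E(X) + 1 (rule 1)
print(expectation(pmf, h=lambda x: x ** 2))       # E(X^2), computed via rule 3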


Board Question: Interpreting Expectation

a) Would you accept a gamble that offers a 10% chance
to win $95 and a 90% chance of losing $5?

b) Would you pay $5 to participate in a lottery that offers
a 10% chance to win $100 and a 90% chance to
win nothing?

Find the expected value of your change in assets in each
case.
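
A quick computation of the expected change in assets in each case (a minimal sketch, our own code; the expected value alone does not settle the "would you?" questions):

# (a) gamble: 10% chance to win $95, 90% chance to lose $5
e_a = 0.10 * 95 + 0.90 * (-5)

# (b) pay $5, then 10% chance to win $100, 90% chance to win nothing
e_b = 0.10 * 100 + 0.90 * 0 - 5

print(e_a, e_b)   # both come out to +$5; the two offers differ only in framing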


Board Question

Suppose (hypothetically!) that everyone at your table got
up, ran around the room, and sat back down randomly
(i.e., all seating arrangements are equally likely).
What is the expected value of the number of people
sitting in their original seat?
(We will explore this with simulations in Friday Studio.)
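
Since the slide mentions simulation, here is a minimal sketch (our own code; the table size n is an arbitrary choice) that estimates the expected number of people who return to their original seat:

import random

def people_in_original_seat(n):
    """Shuffle seats 0..n-1 and count how many people land back in their own seat."""
    seats = list(range(n))
    random.shuffle(seats)
    return sum(1 for i, s in enumerate(seats) if i == s)

n, trials = 6, 100_000     # a table of 6 people, an arbitrary choice
average = sum(people_in_original_seat(n) for _ in range(trials)) / trials
print(average)             # close to 1, whatever the table size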


MIT OpenCourseWare
http://ocw.mit.edu

18.05 Introduction to Probability and Statistics


Spring 2014

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.
