MA-PT-3
Discrete Probability
* The theory of probability was first developed
more than 300 years ago, when certain gambling
games were analyzed. Although probability
theory was originally invented to study gambling,
it now plays an essential role in a wide variety of
disciplines.
* For example, probability theory is extensively
applied in the study of genetics, where it can be
used to help understand the inheritance of traits.
* In computer science, probability theory plays an
important role in the study of the complexity of
algorithms. In particular, ideas and techniques
from probability theory are used to determine
the average-case complexity of algorithms.
* Probability theory can help us answer questions
that involve uncertainty, such as determining
whether we should reject an incoming mail
message as spam based on the words that
appear in the message.

+ Probability theory dates back to 1526 when the Italian
mathematician and physician Girolamo Cardano wrote the
Book on Games of Chance.
+ In the seventeenth century the French mathematician
Blaise Pascal determined the odds of winning some
popular bets based on the outcome when a pair of dice is
repeatedly rolled.
+ In the eighteenth century, the French mathematician
Laplace, who also studied gambling, defined the
probability of an event as the number of successful
outcomes divided by the number of possible outcomes.

Discrete Probability
* Everything you have learned about counting
constitutes the basis for computing the
probability of events to happen.
* In the following, we will use the notion experiment
for a procedure that yields one of a
given set of possible outcomes.
* This set of possible outcomes is called the
sample space of the experiment.
* An event is a subset of the sample space.

Discrete Probability
* If all outcomes in the sample space are equally
likely, the following definition of probability
applies:
The probability of an event E, which is a subset
of a finite sample space S of equally likely
outcomes, is given by p(E) = |E|/|S|.
Probability values range from 0 (for an event that
will never happen) to 1 (for an event that will
happen whenever the experiment is
carried out).

Discrete Probability
An urn contains 4 blue balls and 5 red balls.
What is the probability that a ball chosen at
random from the urn is blue?
There are 9 possible outcomes, and the event
“blue ball is chosen” comprises four of these
outcomes. Therefore, the probability of this event
is 4/9, or approximately 44.44%.
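To make the counting concrete, here is a minimal Python sketch, applying p(E) = |E|/|S| to the urn; the helper name laplace_probability is my own, not from the slides:

    from fractions import Fraction

    def laplace_probability(event, sample_space):
        # p(E) = |E| / |S| for a finite sample space of equally likely outcomes
        return Fraction(len(event), len(sample_space))

    # Label the nine balls individually so each outcome is equally likely.
    urn = [("blue", i) for i in range(4)] + [("red", i) for i in range(5)]
    blue = [ball for ball in urn if ball[0] == "blue"]

    print(laplace_probability(blue, urn))  # 4/9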
Discrete Probability

What is the probability of winning the lottery 6/49,
that is, picking the correct set of six numbers out
of 49?
There are C(49, 6) possible outcomes. Only one
of these outcomes will actually make us win the
lottery.
p(E) = 1/C(49, 6) = 1/13,983,816
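This value is easy to verify with Python's built-in math.comb; a quick check, not part of the original slides:

    from math import comb

    outcomes = comb(49, 6)   # C(49, 6): all possible six-number picks
    print(outcomes)          # 13983816
    print(1 / outcomes)      # probability of winning, about 7.15e-08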
Complementary Events

Let E be an event in a sample space S. The
probability of the event ¬E, the complement
of E, is given by
p(¬E) = 1 − p(E).
This can easily be shown:
p(¬E) = (|S| − |E|)/|S| = 1 − |E|/|S| = 1 − p(E).
This rule is useful if it is easier to determine the
probability of the complementary event than the
probability of the event itself.

Complementary Events
A sequence of 10 bits is randomly
generated. What is the probability that at least one of
these bits is zero?
There are 2¹⁰ = 1024 possible outcomes of
generating such a sequence. The event ¬E,
“none of these bits is zero”, includes only one of these
outcomes, namely the sequence 1111111111.
Therefore, p(¬E) = 1/1024.
Now p(E) can easily be computed as
p(E) = 1 − p(¬E) = 1 − 1/1024 = 1023/1024.
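A brute-force check, not from the slides, enumerates all 2¹⁰ sequences and confirms the complement-rule result:

    from itertools import product

    sequences = list(product("01", repeat=10))   # all 2**10 = 1024 bit sequences
    with_zero = sum(1 for s in sequences if "0" in s)

    print(with_zero, "/", len(sequences))        # 1023 / 1024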
Discrete Probability

Let E₁ and E₂ be events in the sample space S.
Then we have:
p(E₁ ∪ E₂) = p(E₁) + p(E₂) − p(E₁ ∩ E₂)
Does this remind you of something?
Of course, the principle of inclusion-exclusion.

Discrete Probability
What is the probability that a positive integer
selected at random from the set of positive
integers not exceeding 100 is divisible by 2 or 5?
E₂: “integer is divisible by 2”
E₅: “integer is divisible by 5”
E₂ = {2, 4, 6, ..., 100}
|E₂| = 50
p(E₂) = 0.5
E₅ = {5, 10, 15, ..., 100}
|E₅| = 20
p(E₅) = 0.2
E₂ ∩ E₅ = {10, 20, 30, ..., 100}
|E₂ ∩ E₅| = 10
p(E₂ ∩ E₅) = 0.1
p(E₂ ∪ E₅) = p(E₂) + p(E₅) − p(E₂ ∩ E₅)
p(E₂ ∪ E₅) = 0.5 + 0.2 − 0.1 = 0.6
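The same result follows from a direct count; this is an illustrative sketch, with variable names of my own choosing:

    integers = range(1, 101)

    E2 = {n for n in integers if n % 2 == 0}  # divisible by 2
    E5 = {n for n in integers if n % 5 == 0}  # divisible by 5

    # Direct count of the union agrees with inclusion-exclusion.
    print(len(E2 | E5) / 100)                                  # 0.6
    print(len(E2) / 100 + len(E5) / 100 - len(E2 & E5) / 100)  # 0.6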
Discrete Probability
What happens if the outcomes of an experiment are not
equally likely?
In that case, we assign a probability p(s) to each
outcome s ∈ S, where S is the sample space.
Two conditions have to be met:
(1): 0 ≤ p(s) ≤ 1 for each s ∈ S
(2): Σ p(s) = 1, summed over all s ∈ S
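As an aside not on the original slides, here is a minimal sketch of such a non-uniform distribution, a hypothetical biased coin, checking both conditions:

    # A hypothetical biased coin: heads is twice as likely as tails.
    p = {"H": 2/3, "T": 1/3}

    # Condition (1): every outcome probability lies in [0, 1].
    assert all(0 <= prob <= 1 for prob in p.values())

    # Condition (2): the outcome probabilities sum to 1.
    assert abs(sum(p.values()) - 1) < 1e-9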
Conditional Probability

Let E and F be events with p(F) > 0.
The conditional probability of E given F, denoted by
p(E | F), is defined as
p(E | F) = p(E ∩ F)/p(F)

Conditional Probability
What is the probability that a random bit string of
length four contains at least two consecutive 0s, given that its
first bit is a 0?
E: “bit string contains at least two consecutive 0s”
F: “first bit of the string is a 0”
We know the formula p(E | F) = p(E ∩ F)/p(F).
E ∩ F = {0000, 0001, 0010, 0011, 0100}
p(E ∩ F) = 5/16
p(F) = 8/16 = 1/2
p(E | F) = (5/16)/(1/2) = 10/16 = 5/8 = 0.625
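The same numbers can be reproduced by enumeration; a sketch under my own naming, not from the slides:

    from itertools import product

    strings = ["".join(bits) for bits in product("01", repeat=4)]

    F = [s for s in strings if s[0] == "0"]     # first bit is a 0
    E_and_F = [s for s in F if "00" in s]       # ...and has two consecutive 0s

    # p(E | F) = p(E ∩ F) / p(F)
    print((len(E_and_F) / 16) / (len(F) / 16))  # 0.625 = 5/8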
Independence
Let us return to the example of tossing a coin three
times.
Does the probability of event E (odd number of
tails) depend on the occurrence of event F (first
toss is a tail)?
In other words, is it the case that
p(E | F) ≠ p(E)?
We actually find that p(E | F) = 0.5 and p(E) = 0.5,
so we say that E and F are independent.
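A quick enumeration of the eight equally likely outcomes confirms both values; this is a sketch, and the names are mine:

    from itertools import product

    tosses = list(product("HT", repeat=3))            # 8 equally likely outcomes

    E = [t for t in tosses if t.count("T") % 2 == 1]  # odd number of tails
    F = [t for t in tosses if t[0] == "T"]            # first toss is a tail
    E_and_F = [t for t in E if t[0] == "T"]

    print(len(E) / len(tosses))                       # p(E)   = 0.5
    print(len(E_and_F) / len(F))                      # p(E|F) = 0.5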
Independence
Because we have p(E | F) = p(E ∩ F)/p(F),
p(E | F) = p(E) if and only if p(E ∩ F) = p(E)p(F).
The events E and F are said to be
independent if and only if p(E ∩ F) = p(E)p(F).
Obviously, this definition is symmetric for E and F.
If we have p(E ∩ F) = p(E)p(F), then it is also true that
p(F | E) = p(F).

Independence
Suppose E is the event that a randomly generated
bit string of length four begins with a 1, and F is the event that
a randomly generated bit string contains an even number of 0s.
Are E and F independent?
Obviously, p(E) = p(F) = 0.5.
E ∩ F = {1111, 1001, 1010, 1100}
p(E ∩ F) = 0.25
p(E ∩ F) = p(E)p(F)
Conclusion: E and F are independent.
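As with the earlier examples, a short enumeration verifies the product condition; a sketch with names of my own choosing:

    from itertools import product

    strings = ["".join(bits) for bits in product("01", repeat=4)]

    E = {s for s in strings if s[0] == "1"}            # begins with a 1
    F = {s for s in strings if s.count("0") % 2 == 0}  # even number of 0s

    p_E, p_F = len(E) / 16, len(F) / 16
    p_EF = len(E & F) / 16

    print(p_EF == p_E * p_F)   # True: E and F are independent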