
Discrete Random Variables and Probability Distributions

Eng’r. Ma Liezl C. Gallardo, Ph.D., P.I.E.


Learning Objectives

Determine Probabilities
Calculate probabilities from probability mass functions and cumulative distribution functions, and vice versa.

Calculate Statistical Measures
Compute means and variances for discrete random variables to understand their central tendency and dispersion.

Apply Common Distributions
Understand the assumptions and applications of common discrete probability distributions in real-world scenarios.

Solve Probability Problems
Calculate probabilities and determine means and variances for specific discrete probability distributions.

Chapter Outline

1 Probability Distributions and Probability Mass Functions
Understanding how to describe the probabilities associated with discrete random variables.

2 Cumulative Distribution Functions
Using cumulative probabilities to describe random variables and find probability mass functions.

3 Mean and Variance of a Discrete Random Variable
Calculating and interpreting measures of central tendency and dispersion.

4 Common Discrete Distributions
Exploring discrete uniform, binomial, geometric, negative binomial, hypergeometric, and Poisson distributions.

RAID Systems: A Real-World Application
A redundant array of independent disks (RAID) uses multiple physical disk drives as one logical unit in a computer system. The array can increase performance and robustness to disk failures.

Data copies can be written simultaneously to multiple drives (mirroring) to provide immediate backup, or data can be distributed among multiple disks (striping) to increase performance.

RAID systems use different designs to balance performance, availability, and cost. The number of failed drives can be modeled as a discrete random variable, making it an excellent application of the topics in this chapter.

Probability Distributions and Probability Mass Functions
Many physical systems can be modeled by the same or similar random experiments and random variables. The distribution of the random variables involved in these common systems can be analyzed, with the results applied across different applications.

Probability Distribution
A description of the probabilities associated with the possible values of a random variable X. For discrete variables, this is often specified by a list of the possible values and their probabilities.

Probability Mass Function
A function f(x) that specifies the probability at each possible discrete value for X, where f(x) ≥ 0 and the sum of all f(x) equals 1.

Practical Application
Random variables simplify the description and analysis of random experiments, allowing us to focus on the outcomes of interest rather than the entire sample space.

Example: Flash Recharge Time
The time to recharge the flash is tested in three cell-phone cameras. The probability that a camera meets the recharge specification is 0.8, and the cameras perform independently.

Let X denote the number of cameras that pass the test. The sample space consists of all possible pass/fail outcomes for the three cameras, with probabilities determined by independence.

Camera 1   Camera 2   Camera 3   Probability   X
Pass       Pass       Pass       0.512         3
Fail       Pass       Pass       0.128         2
Pass       Fail       Pass       0.128         2
Fail       Fail       Pass       0.032         1
Pass       Pass       Fail       0.128         2
Fail       Pass       Fail       0.032         1
Pass       Fail       Fail       0.032         1
Fail       Fail       Fail       0.008         0

Cumulative Distribution Functions
Definition
The cumulative distribution function (CDF) of a discrete random variable X, denoted as F(x), is:

F(x) = P(X ≤ x) = ∑(xi ≤ x) f(xi)

Properties of F(x):
• 0 ≤ F(x) ≤ 1
• If x ≤ y, then F(x) ≤ F(y)
• F(x) is piecewise constant between values

Finding PMF from CDF
The probability mass function can be determined from the "jumps" in the cumulative distribution function:

P(X = xi) = F(xi) - lim(x↑xi) F(x)

This is the difference between F(xi) and the limit of F(x) as x increases to xi.
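
A short Python sketch of moving between a PMF and its CDF:

def cdf_from_pmf(pmf):
    """Cumulative probabilities F(x) = P(X <= x) from a discrete PMF."""
    total, cdf = 0.0, {}
    for x in sorted(pmf):
        total += pmf[x]
        cdf[x] = total
    return cdf

def pmf_from_cdf(cdf):
    """Recover f(x) from the jump in F at each possible value x."""
    prev, pmf = 0.0, {}
    for x in sorted(cdf):
        pmf[x] = cdf[x] - prev  # size of the jump at x
        prev = cdf[x]
    return pmf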

Example: Digital Channel Error Analysis

Problem
A bit transmitted through a digital channel has probability 0.1 of being received in error. Let X equal the number of bits in error in the next four bits transmitted.

Probability Mass Function
P(X = 0) = 0.6561

P(X = 1) = 0.2916

P(X = 2) = 0.0486

P(X = 3) = 0.0036

P(X = 4) = 0.0001

Cumulative Probability
P(X ≤ 3) = P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3) = 0.6561 + 0.2916 + 0.0486 + 0.0036
= 0.9999
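
These probabilities match a binomial model with n = 4 and p = 0.1 (covered later in the chapter); a quick Python check, offered as a sketch:

from math import comb

n, p = 4, 0.1
pmf = {x: comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)}

print(pmf[0], pmf[1], pmf[2], pmf[3], pmf[4])  # ≈ 0.6561, 0.2916, 0.0486, 0.0036, 0.0001
print(sum(pmf[x] for x in range(4)))           # P(X <= 3) ≈ 0.9999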

Mean and Variance of a Discrete Random Variable

Mean (Expected Value)
The mean of a discrete random variable X, denoted as μ or E(X), is:

μ = E(X) = ∑x x·f(x)

This is a weighted average of the possible values of X with weights equal to the probabilities.

Variance
The variance of X, denoted as σ² or V(X), is:

σ² = V(X) = E(X - μ)² = ∑x (x - μ)²·f(x) = ∑x x²·f(x) - μ²

The standard deviation is σ = √σ².
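
A minimal Python sketch of these formulas, using the flash-recharge PMF from earlier:

def mean_variance(pmf):
    mu = sum(x * f for x, f in pmf.items())
    var = sum((x - mu) ** 2 * f for x, f in pmf.items())
    return mu, var

pmf = {0: 0.008, 1: 0.096, 2: 0.384, 3: 0.512}
print(mean_variance(pmf))  # μ = 2.4 cameras, σ² = 0.48 (up to floating-point rounding)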

Expected Value of a Function

Definition
If X is a discrete random variable with probability mass function f(x), the expected value of any function h(X) is:

E[h(X)] = ∑x h(x)·f(x)

Linear Functions
For linear functions h(X) = aX + b (for any constants a and b):

E(aX + b) = aE(X) + b
V(aX + b) = a²V(X)

Application
This allows us to find expected values of transformations of random variables, such as squared values, without having to derive new probability distributions.
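
As a sketch in Python, E[h(X)] can be computed for any h, and the linear rules checked numerically (the values of a and b below are arbitrary):

def expect(pmf, h=lambda x: x):
    return sum(h(x) * f for x, f in pmf.items())

pmf = {0: 0.008, 1: 0.096, 2: 0.384, 3: 0.512}
mu = expect(pmf)
var = expect(pmf, lambda x: (x - mu) ** 2)

a, b = 2, 5
# E(aX + b) equals aE(X) + b
print(expect(pmf, lambda x: a * x + b), a * mu + b)
# V(aX + b) equals a²V(X)
print(expect(pmf, lambda x: (a * x + b - (a * mu + b)) ** 2), a ** 2 * var)
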
Discrete Uniform Distribution
Definition
A random variable X has a discrete uniform distribution if each of the n values in its range, x₁, x₂, ..., xₙ, has equal probability 1/n.

For consecutive integers a, a+1, a+2, ..., b:
• Mean: μ = (b+a)/2
• Variance: σ² = ((b-a+1)² - 1)/12

Example: Voice Lines
Let X denote the number of the 48 voice lines in use at a particular time, with a discrete uniform distribution over 0 to 48.

E(X) = (48+0)/2 = 24
σ = √[(49² - 1)/12] = 14.14

The proportion of lines in use, Y = X/48, has:
E(Y) = E(X)/48 = 0.5
V(Y) = V(X)/48² = 0.087
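
A Python sketch of the voice-line calculation (with a = 0 and b = 48, so n = 49 equally likely values):

from math import sqrt

def discrete_uniform_stats(a, b):
    mu = (a + b) / 2
    var = ((b - a + 1) ** 2 - 1) / 12
    return mu, var

mu, var = discrete_uniform_stats(0, 48)
print(mu, sqrt(var))            # 24.0 and about 14.14
print(mu / 48, var / 48 ** 2)   # E(Y) = 0.5 and V(Y) ≈ 0.087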


Binomial Distribution

Binomial Random Variable
Counts successes in n independent Bernoulli trials.

Bernoulli Trials
Independent trials with constant probability p of success.

Probability Mass Function
f(x) = (n choose x) p^x (1-p)^(n-x) for x = 0, 1, ..., n

Mean and Variance
μ = np and σ² = np(1-p)
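
A minimal Python sketch of the binomial PMF, applied to the manufacturing example on the next slide (n = 25 parts, p = 0.01):

from math import comb

def binomial_pmf(x, n, p):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

n, p = 25, 0.01
print(binomial_pmf(0, n, p))    # P(no defective parts) ≈ 0.778
print(binomial_pmf(1, n, p))    # P(exactly one defective part) ≈ 0.196
print(n * p, n * p * (1 - p))   # mean 0.25, variance 0.2475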

Binomial Distribution Examples

1 Coin Flips
Flipping a coin 10 times and counting the number of heads obtained.

2 Manufacturing Defects
A worn machine tool produces 1% defective parts. Counting defective parts in
the next 25 produced.

3 Digital Transmission
10% of bits transmitted through a channel are received in error. Counting errors
in the next five bits.

4 Multiple-Choice Test
A test with 10 questions, each with four choices, and you guess at each
question. Counting correct answers.

Geometric and Negative Binomial Distributions

Geometric Distribution
Counts the number of trials until the first success in a series of Bernoulli trials.

f(x) = (1-p)^(x-1) p for x = 1, 2, ...
Mean: μ = 1/p
Variance: σ² = (1-p)/p²

Key property: Lack of memory - the system does not "wear out".

Negative Binomial Distribution
Counts the number of trials until r successes occur.

f(x) = (x-1 choose r-1)(1-p)^(x-r) p^r for x = r, r+1, ...
Mean: μ = r/p
Variance: σ² = r(1-p)/p²

Special case: When r = 1, it reduces to the geometric distribution.
Hypergeometric Distribution

Sampling without replacement
Used when selecting n objects without replacement from N total objects, of which K are classified as successes.

Probability mass function
f(x) = [(K choose x)(N-K choose n-x)] / [N choose n]

Mean and variance
μ = np and σ² = np(1-p)(N-n)/(N-1), where p = K/N

The term (N-n)/(N-1) is called the finite population correction factor. When n is small relative to N, the hypergeometric
distribution is similar to the binomial distribution.
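
A minimal Python sketch of the hypergeometric PMF and its moments:

from math import comb

def hypergeom_pmf(x, N, K, n):
    """P(x successes in a sample of n drawn without replacement
    from N objects, K of which are successes)."""
    return comb(K, x) * comb(N - K, n - x) / comb(N, n)

def hypergeom_stats(N, K, n):
    p = K / N
    return n * p, n * p * (1 - p) * (N - n) / (N - 1)
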
Example: Parts from Suppliers

• Total parts: 300 (100 from a local supplier, 200 from an out-of-state supplier)
• Sample size: 4 parts selected randomly without replacement
• Probability that all four parts are from the local supplier: 0.0119
• Probability that two or more parts are from the local supplier: 0.407

This example demonstrates how the hypergeometric distribution is used to calculate probabilities when sampling without replacement from a finite population with two categories of
objects.
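
A Python sketch of these two probabilities (N = 300, K = 100 local parts, n = 4):

from math import comb

def hypergeom_pmf(x, N, K, n):
    return comb(K, x) * comb(N - K, n - x) / comb(N, n)

N, K, n = 300, 100, 4
print(hypergeom_pmf(4, N, K, n))                                  # ≈ 0.0119, all four local
print(1 - hypergeom_pmf(0, N, K, n) - hypergeom_pmf(1, N, K, n))  # ≈ 0.407, two or more local
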
Poisson Distribution

Random Occurrences
Events occur randomly in an interval or region.

Poisson Process
Subintervals are independent, each with probability λΔt of one event.

Probability Mass Function
f(x) = e^(-λT)(λT)^x/x! for x = 0, 1, 2, ...

Mean = Variance
μ = σ² = λT

Poisson Distribution Applications

The Poisson distribution is widely used to model random occurrences in intervals of time, length, area, or volume. Examples include flaws in wires or textiles, contamination particles in
manufacturing, calls to a telephone exchange, power outages, and atomic particles emitted from a specimen.

Example: Wire Flaws

Scenario
Flaws occur at random along the length of a thin copper wire with a mean of 2.3 flaws per millimeter. Let X denote the number of flaws in a given length of wire.

Calculations
Probability of 10 flaws in 5 mm:
λT = 2.3 flaws/mm × 5 mm = 11.5 flaws
P(X = 10) = e^(-11.5)(11.5)^10/10! = 0.113

Probability of at least one flaw in 2 mm:
λT = 2.3 flaws/mm × 2 mm = 4.6 flaws
P(X ≥ 1) = 1 - P(X = 0) = 1 - e^(-4.6) = 0.99
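
A Python sketch reproducing these two results:

from math import exp, factorial

def poisson_pmf(x, lam_T):
    return exp(-lam_T) * lam_T ** x / factorial(x)

print(poisson_pmf(10, 2.3 * 5))      # P(10 flaws in 5 mm) ≈ 0.113
print(1 - poisson_pmf(0, 2.3 * 2))   # P(at least one flaw in 2 mm) ≈ 0.99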


Summary of Discrete Probability Distributions
Distribution         Parameters     Mean      Variance

Discrete Uniform     a, b (range)   (b+a)/2   ((b-a+1)² - 1)/12
Binomial             n, p           np        np(1-p)
Geometric            p              1/p       (1-p)/p²
Negative Binomial    r, p           r/p       r(1-p)/p²
Hypergeometric       N, K, n        np        np(1-p)(N-n)/(N-1), with p = K/N
Poisson              λ, T           λT        λT

These distributions provide mathematical models for a wide range of real-world phenomena involving discrete random variables.
Understanding their properties and applications is essential for statistical analysis and probability calculations.
