Random Variables

• Random Variable

• A random variable is a mathematical concept that assigns numerical
values to the outcomes of a sample space. Random variables can describe the outcomes of
objective randomness (like tossing a coin) or subjective
randomness (like the result of a cricket game).
For example: if you roll a die, you can assign a number to each possible outcome.
• There are two basic types of random variables:
• Discrete Random Variables (which take on specific values)
• Continuous Random Variables (which assume any value within a given range)
• A random variable is considered a discrete random variable when it takes
specific, distinct values within an interval. Conversely, if it takes a continuous
range of values, it is classified as a continuous random variable.
• Example 1
• If two unbiased coins are tossed, then find the
random variable associated with that event.
• Solution:
• Suppose two unbiased coins are tossed and let
• X = number of heads. [X is a random variable, i.e., a function on the sample space]
• Here, the sample space S = {HH, HT, TH, TT}
• Example 2
• Suppose a random variable X takes m different values, i.e. the sample space is
• X = {x1, x2, x3, ……, xm} with probabilities
• P(X = xi) = pi
• where 1 ≤ i ≤ m
• The probabilities must satisfy the following conditions:
• 0 ≤ pi ≤ 1, where 1 ≤ i ≤ m
• p1 + p2 + p3 + …… + pm = 1. Or we can say 0 ≤ pi ≤ 1 and ∑pi = 1
• Returning to the two-coin example, the possible values for the random variable X are 0, 1, 2.
• X = {0, 1, 2}, so m = 3
• P(X = 0) = (Probability that number of heads is 0) = P(TT) = 1/2 × 1/2 = 1/4
• P(X = 1) = (Probability that number of heads is 1) = P(HT or TH) = 1/2 × 1/2 + 1/2 × 1/2 = 1/2
• P(X = 2) = (Probability that number of heads is 2) = P(HH) = 1/2 × 1/2 = 1/4
• Here, you can observe that 0 ≤ p1, p2, p3 ≤ 1
• p1 + p2 + p3 = 1/4 + 2/4 + 1/4 = 1
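• To make the computation above concrete, here is a minimal Python sketch (illustrative only, not part of the original notes) that enumerates the sample space of two unbiased coins and derives the PMF of X = number of heads:

```python
from itertools import product
from collections import Counter

# Sample space for two unbiased coins: HH, HT, TH, TT
sample_space = list(product("HT", repeat=2))

# X = number of heads in each outcome
head_counts = Counter(outcome.count("H") for outcome in sample_space)

# Every outcome is equally likely (probability 1/4), so P(X = x) = count / 4
pmf = {x: head_counts[x] / len(sample_space) for x in sorted(head_counts)}
print(pmf)                # {0: 0.25, 1: 0.5, 2: 0.25}
print(sum(pmf.values()))  # 1.0 -- the probabilities sum to 1
```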
• Variate
• A variate is a generalization of the concept of a random
variable that is defined without reference to a particular type
of probabilistic experiment.
• It has the same properties as random variables and is
denoted by capital letters (commonly X).
• The possible values a random variable X can take form
its range, denoted R_X. Individual values within this range
are called quantiles, and the probability of X taking a
specific value x is written as P(X = x).
• Discrete Random Variable
• A Discrete Random Variable takes on a finite (or countably infinite) number
of values. The probability function associated with it is
called the PMF.
• PMF (Probability Mass Function)
• If X is a discrete random variable and the PMF of X assigns probability pi to the value xi, then
• 0 ≤ pi ≤ 1
• ∑pi = 1, where the sum is taken over all possible values of x
• Discrete Random Variables Example
• Example: Let S = {0, 1, 2} with the PMF given by the table:

  xi          0     1     2
  P(X = xi)   p1    0.3   0.5

• P(X = 0) = ? Since the probabilities must sum to 1, p1 = 1 − (0.3 + 0.5) = 0.2.
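• As a quick check (an illustrative sketch, assuming the table above), the missing probability follows directly from the PMF conditions:

```python
# Known PMF entries from the table above; P(X = 0) = p1 is unknown
known = {1: 0.3, 2: 0.5}

# All probabilities must sum to 1, so p1 = 1 - (0.3 + 0.5)
p1 = 1 - sum(known.values())
print(p1)  # 0.2 (up to floating-point rounding)
```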
• PDF (Probability Density Function)
• If X is a continuous random variable with P(x < X < x + dx) = f(x) dx, then
• f(x) ≥ 0 for all x
• ∫f(x) dx = 1, where the integral is taken over all values of x
• Then f(x) is said to be the PDF of the distribution.

• Continuous Random Variables Example

• Find the value of P(1 < X < 2)
• Such that
• f(x) = kx³ for 0 ≤ x ≤ 3, and f(x) = 0 otherwise
• If a function f is a density function, then the total
probability is equal to 1.
• Since X is a continuous random variable, the integral of f over the
whole sample space S is 1:
• ∫f(x) dx = 1
• ∫₀³ kx³ dx = 1
• k[x⁴/4]₀³ = 1
• Over the given interval, 0 ≤ x ≤ 3:
• k(3⁴ − 0⁴)/4 = 1
• k(81/4) = 1
• k = 4/81
• Thus,
• P(1 < X < 2) = k[x⁴/4]₁²
• P = (4/81) × (16 − 1)/4
• P = 15/81 = 5/27
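• The same numbers can be checked numerically. A small sketch (illustrative only) using scipy.integrate.quad:

```python
from scipy.integrate import quad

k = 4 / 81
f = lambda x: k * x**3  # density on [0, 3]; zero outside this interval

total, _ = quad(f, 0, 3)  # normalization check, should be 1
prob, _ = quad(f, 1, 2)   # P(1 < X < 2), should be 15/81

print(total)          # ~1.0
print(prob, 15 / 81)  # ~0.18518..., 0.18518...
```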
• Mean of Random Variable
• For any random variable X where P is its respective probability,
we define its mean as
• Mean (μ) = ∑X·P
• where
• X ranges over all possible values of the random variable, and
• P is the probability of the respective value.
• Variance of Random Variable
• The variance of a random variable tells us how the random
variable is spread about its mean value. The variance of a random
variable is calculated using the formula
• Var(X) = σ² = E[X²] − {E[X]}²
• where
• E[X²] = ∑X²P
• E[X] = ∑XP
• Variance and Expectations: Variance can be expressed in terms of expectations as
Var(X) = E[(X − E[X])²]
or equivalently
Var(X) = E[X²] − (E[X])²,
where E[X] is the expected value (mean) of X and E[X²] is the expected value of X².
• Variance is a measure of the dispersion or spread of a
set of values. Here are some key properties of
variance:
1. Non-Negativity:
• Variance is always non-negative. Mathematically,
Var(X) ≥ 0 for any random variable X. Variance
equals zero only if all values of X are identical
(i.e., there is no spread).
2. Variance of a Constant:
• If c is a constant, then the variance of c is zero:
Var(c) = 0
3. Variance of a Linear Transformation:
• If a and b are constants and X is a random variable,
then:
Var(aX + b) = a² Var(X)
• The variance scales with the square of the coefficient a
but is unaffected by the constant b.
4. Variance of the Sum of Independent Variables:
• If X and Y are independent random variables, then:
Var(X+Y)=Var(X)+Var(Y)
• Independence simplifies the calculation of variance for
the sum of variables.
5. Variance of the Sum of Dependent Variables:
• For dependent random variables X and Y, the variance of
their sum is: Var(X+Y)=Var(X)+Var(Y)+2Cov(X,Y)
• Cov(X,Y) is the covariance between X and Y.

6. Variance of a Sample Mean:
• For a sample mean X̄ of n independent observations
X1, X2, …, Xn, each with variance σ², the variance of the sample mean is Var(X̄) = σ²/n.
• (A quick simulation check of properties 3, 4, and 6 is sketched below.)
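• The following simulation is a minimal sketch (with arbitrary illustrative parameters a = 3, b = 5, and normal samples) that checks the linear-transformation, independent-sum, and sample-mean properties empirically:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent samples: Var(X) ~ 4, Var(Y) ~ 9
x = rng.normal(loc=10, scale=2, size=n)
y = rng.normal(loc=0, scale=3, size=n)

a, b = 3, 5
print(np.var(a * x + b), a**2 * np.var(x))   # Var(aX + b) ~ a^2 Var(X)
print(np.var(x + y), np.var(x) + np.var(y))  # Var(X + Y) ~ Var(X) + Var(Y) for independent X, Y

# Sample mean of 500 i.i.d. observations has variance sigma^2 / 500
sample_means = rng.normal(loc=10, scale=2, size=(2000, 500)).mean(axis=1)
print(np.var(sample_means), 2**2 / 500)      # both ~ 0.008
```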
• Covariance:
• Cov(X, Y) = E[(X − E[X])(Y − E[Y])]
• = E[XY] − E[X]E[Y]
• For discrete variables, E[XY] = ∑x ∑y (x·y)·P(X = x, Y = y); when X and Y are independent,
P(X = x, Y = y) = P(X = x)·P(Y = y), so E[XY] = ∑x x·P(X = x) · ∑y y·P(Y = y) and Cov(X, Y) = 0.


• Scenario: Suppose we have a discrete random variable X with the following
values and their corresponding probabilities:
• Values: x1 = 2, x2 = 4, x3 = 6; Probabilities: P(X = x1) = 0.2, P(X = x2) = 0.5,
P(X = x3) = 0.3
• Step 1: Calculate the Expected Value (Mean)
• The expected value E[X] is calculated as:
• E[X] = ∑xi·P(X = xi). Substituting the values and probabilities:
• E[X] = (2·0.2) + (4·0.5) + (6·0.3)
• E[X] = 0.4 + 2.0 + 1.8 = 4.2
• Step 2: Calculate E[X²]
• To find the variance, we first need E[X²]:
• E[X²] = ∑xi²·P(X = xi). Substituting the values:
• E[X²] = (2²·0.2) + (4²·0.5) + (6²·0.3)
• E[X²] = (4·0.2) + (16·0.5) + (36·0.3) = 19.6
• Step 3: Calculate the Variance
• The variance Var(X) is given by:
• Var(X) = E[X²] − (E[X])²
• Substituting the values:
• Var(X) = 19.6 − (4.2)² = 1.96
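• For reference, the three steps can be wrapped in a small helper function. This is an illustrative sketch; the name discrete_mean_var is an assumption, not from the notes:

```python
def discrete_mean_var(values, probs):
    """Return (E[X], Var(X)) for a discrete random variable given its PMF."""
    mean = sum(x * p for x, p in zip(values, probs))    # Step 1: E[X]
    ex2 = sum(x**2 * p for x, p in zip(values, probs))  # Step 2: E[X^2]
    return mean, ex2 - mean**2                          # Step 3: Var(X)

mean, var = discrete_mean_var([2, 4, 6], [0.2, 0.5, 0.3])
print(mean, var)  # 4.2, 1.96 (up to floating-point rounding)
```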
• A small retail store tracks its daily sales revenue (in dollars) for a week, with the following outcomes and
probabilities:
• Values: $100, $200, $300
• Probabilities: $100 with probability 0.4, $200 with probability 0.3, $300 with probability 0.3
• Step 1: Calculate the Expected Value (Mean)
• E[X] = ∑xi·P(X = xi)
• E[X] = (100·0.4) + (200·0.3) + (300·0.3)
• E[X] = 40 + 60 + 90 = 190
• Step 2: Calculate E[X²]
• E[X²] = ∑xi²·P(X = xi)
• E[X²] = (100²·0.4) + (200²·0.3) + (300²·0.3)
• E[X²] = (10000·0.4) + (40000·0.3) + (90000·0.3)
• E[X²] = 4000 + 12000 + 27000 = 43000
• Step 3: Calculate the Variance
• Var(X) = E[X²] − (E[X])²
• Var(X) = 43000 − (190)²
• Var(X) = 43000 − 36100 = 6900
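• Reusing the hypothetical discrete_mean_var helper sketched above reproduces these figures:

```python
mean, var = discrete_mean_var([100, 200, 300], [0.4, 0.3, 0.3])
print(mean, var)  # 190.0, 6900.0
```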
• Let's analyze the covariance between two random variables X and Y when
throwing two dice, where:
• X represents the outcome of the red die, but only even numbers are
considered (i.e., X takes values in {2, 4, 6}).
• Y represents the outcome of the blue die, but only outcomes greater than
3 are considered (i.e., Y takes values in {4, 5, 6}).
• Steps to Calculate Covariance
1. Define Joint Probability Distribution
• Since X and Y are outcomes of two independent dice rolls, the probability
of each combination is:
• P(X=x,Y=y)=P(X=x)⋅P(Y=y), where each die roll outcome is equally
probable.
2. Calculate Marginal Probabilities
• Each die has 6 faces, so:
• Probability of any specific outcome for X (even faces) is 1/3 (since there
are 3 possible outcomes: 2, 4, 6).
• Probability of any specific outcome for Y (greater than 3) is 1/3 (since
there are 3 possible outcomes: 4, 5, 6).
• 3. Calculate Expected Values
• Expected Value of X:
• E[X] = ∑x∈{2,4,6} x·P(X = x)
• Since each even-number outcome has probability 1/3:
• E[X] = 2·(1/3) + 4·(1/3) + 6·(1/3) = (2 + 4 + 6)/3 = 12/3 = 4
• Expected Value of Y:
• E[Y] = ∑y∈{4,5,6} y·P(Y = y)
• Since each outcome greater than 3 has probability 1/3:
• E[Y] = 4·(1/3) + 5·(1/3) + 6·(1/3) = (4 + 5 + 6)/3 = 15/3 = 5
• Expected Value of XY:
• E[XY] = ∑x∈{2,4,6} ∑y∈{4,5,6} (x·y)·P(X = x)·P(Y = y)
• Since X and Y are independent:
• E[XY] = ∑x∈{2,4,6} ∑y∈{4,5,6} (x·y)·(1/3)·(1/3) = (1/9) · ∑x∈{2,4,6} ∑y∈{4,5,6} (x·y)
• Calculate the inner sum:
• ∑y∈{4,5,6} (x·y) = x·(4 + 5 + 6) = x·15
Then:
• E[XY] = (1/9) ∑x∈{2,4,6} (x·15) = (15/9) · ∑x∈{2,4,6} x = (15/9)·12 = 180/9 = 20
• 4. Compute Covariance
• Cov(X, Y) = E[XY] − E[X]·E[Y]
Substitute the values:
• Cov(X, Y) = 20 − (4·5) = 20 − 20 = 0, as expected since X and Y are independent.
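• A short enumeration sketch (illustrative only) that builds the expectations for the restricted dice and confirms the covariance is zero:

```python
from itertools import product

xs, ys = [2, 4, 6], [4, 5, 6]
p = 1 / 3  # each allowed face of either die is equally likely

e_x = sum(x * p for x in xs)                           # (2 + 4 + 6) / 3 = 4
e_y = sum(y * p for y in ys)                           # (4 + 5 + 6) / 3 = 5
e_xy = sum(x * y * p * p for x, y in product(xs, ys))  # 180 / 9 = 20

print(e_x, e_y, e_xy, e_xy - e_x * e_y)  # 4.0 5.0 20.0 ~0.0 (covariance)
```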
• Let’s explore the case where we are interested in the
covariance between the outcomes of two dice rolls given
that their sum is 7. Specifically, we'll look at:
• X: The outcome of the first die.
• Y: The outcome of the second die.
• Given the condition that the sum of X and Y is always 7, X
and Y are dependent random variables. We want to find
the covariance between X and Y under this condition.
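• Since Y = 7 − X here, one can check that Cov(X, Y) = Cov(X, 7 − X) = −Var(X) = −35/12 ≈ −2.92 for a fair die. A minimal enumeration sketch (illustrative only) over the six equally likely pairs with X + Y = 7:

```python
# Pairs (x, y) with x + y = 7; given the condition, each pair is equally likely
pairs = [(x, 7 - x) for x in range(1, 7)]
p = 1 / len(pairs)

e_x = sum(x * p for x, _ in pairs)       # 3.5
e_y = sum(y * p for _, y in pairs)       # 3.5
e_xy = sum(x * y * p for x, y in pairs)  # 56/6 ~ 9.33

print(e_xy - e_x * e_y)  # -35/12 ~ -2.9167: a large X forces a small Y
```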
• Notes on the accompanying figures:
• The die outcome is a discrete random variable because we cannot roll 3.5 or 2.5.
• In the CDF plot, the final bar at 6 shows the probability of getting 6 or less, which is 1.
• How much is the distribution spread around 165? The gradient of the CDF curve at the
165 line gives the density there, and the spread itself is summarised by the standard
deviation and variance of the density.
• The PDF is the gradient (derivative) of the CDF.
