Random variables
S. Devi Yamini
Module - II
Overview
1 Probability
2 Random Variables
Discrete Random Variable
Continuous Random Variable
3 Joint Distribution
4 Expectation
5 Moment Generating Function
Probability - Introduction
Consider a random experiment.
Outcome - result of the experiment
Sample space S - the collection of all possible outcomes in the experiment.
Event E - a subset of the sample space
Example
Toss a coin
S = {H, T }
Let E be the event that a head occurs.
Example
Roll a die
S = {1, 2, 3, 4, 5, 6}
Let E be the event that the number on the top face is even. E = {2, 4, 6}
Introduction
Example
Toss a coin twice
S = {HH, HT , TH, TT }
Let E be the event that exactly one head occurs.
E = {HT , TH}
Example
Toss a coin until we get a head.
S = {H, TH, TTH, TTTH, . . .}
Axioms of Probability
A real-valued function P on the collection of all events of an experiment is
called a probability measure if
(i) 0 ≤ P(E ) ≤ 1 ∀ events E
(ii) P(S) = 1
(iii) For any mutually exclusive events E1, E2, . . . , En, . . .,
P(∪_{i=1}^{∞} Ei) = ∑_{i=1}^{∞} P(Ei)
Mutually Exclusive events
Two events A and B are mutually exclusive if the occurrence of A excludes
the occurrence of B. For example, in tossing a coin,
Let A be the event of getting a head and B be the event of getting a tail.
For mutually exclusive events, P(A ∪ B) = P(A) + P(B). If the events A and B
are not mutually exclusive, then
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
Introduction
Independent Events
Events A and B are said to be independent if the occurrence of A does not
depend on the occurrence of B. For example, consider tossing a coin twice.
Let A be the occurrence of head in the first toss
Let B be the occurrence of head in the second toss
Note that A and B are independent.
For independent events
P(A ∩ B) = P(A)P(B)
For dependent events, the conditional probability is defined as
P(A/B) = P(A ∩ B) / P(B)
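The conditional-probability formula above can be checked numerically. Below is a minimal Monte Carlo sketch (assuming numpy is available; the die events A and B are illustrative choices, not from the slides) that estimates P(A/B) both directly and via P(A ∩ B)/P(B).

```python
import numpy as np

# Illustrative events for one roll of a fair die (not from the slides):
# A = "top face is even", B = "top face is at least 4".
rng = np.random.default_rng(42)
rolls = rng.integers(1, 7, size=200_000)

A = (rolls % 2 == 0)
B = (rolls >= 4)

p_B = B.mean()
p_A_and_B = (A & B).mean()

# Estimate P(A/B) two ways: by restricting to outcomes where B occurred,
# and by the formula P(A/B) = P(A ∩ B) / P(B).
direct = A[B].mean()
formula = p_A_and_B / p_B

print(direct, formula)   # both should be close to 2/3
```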
Random Variables
Random Variable - a number associated with each outcome of an
experiment
Example
Toss a coin
S = {H, T }
Let X = 1 if a head appears, and X = 0 if a tail appears.
P(X = 0) = P(X = 1) = 1/2
Random Variables
Example
Toss a coin twice
S = {HH, HT , TH, TT }
Let X denote the number of heads. X = {0, 1, 2}
P(X = 0) = P(X = 2) = 1/4
P(X = 1) = 1/2
Probability mass function
If X takes the values x1, x2, . . ., then P(X = xi) = pi is called the probability
mass function of X if it satisfies
(i) pi ≥ 0 ∀i
(ii) ∑_{i=1}^{∞} pi = 1
Here {xi, pi} is the probability mass function of the random variable X.
Cumulative Distribution Function
F(x) = P(X ≤ x) = ∑_{xi ≤ x} P(X = xi)
if a < b, then P(a < X ≤ b) = F (b) − F (a)
0 ≤ F (x) ≤ 1
F (x) ≤ F (y ) if x < y
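As a quick illustration of these CDF properties, here is a small Python sketch (an illustrative check, not part of the original slides) that builds F(x) for the two-coin-toss random variable X = number of heads and verifies P(a < X ≤ b) = F(b) − F(a).

```python
from itertools import accumulate

# PMF of X = number of heads in two tosses of a fair coin (from the earlier example)
values = [0, 1, 2]
pmf = [0.25, 0.5, 0.25]

assert all(p >= 0 for p in pmf) and abs(sum(pmf) - 1) < 1e-12  # PMF conditions (i), (ii)

# CDF F(x) = P(X <= x), built as a running sum of the pmf
cdf = dict(zip(values, accumulate(pmf)))

def F(x):
    """Step CDF: accumulated probability of all values <= x."""
    return max((cdf[v] for v in values if v <= x), default=0.0)

a, b = 0, 2
prob_direct = sum(p for v, p in zip(values, pmf) if a < v <= b)  # P(a < X <= b)
print(prob_direct, F(b) - F(a))   # both 0.75
```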
Discrete Random Variables
Mean
E(X) = ∑ x P(x)
Variance
V(X) = E(X²) − [E(X)]² = ∑ x² P(x) − [∑ x P(x)]²
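A short sketch of these two formulas on the same two-toss PMF (illustrative; it uses nothing beyond plain Python):

```python
# Mean and variance of a discrete random variable from its PMF
values = [0, 1, 2]          # X = number of heads in two tosses
pmf = [0.25, 0.5, 0.25]

mean = sum(x * p for x, p in zip(values, pmf))                 # E(X) = sum x P(x)
second_moment = sum(x**2 * p for x, p in zip(values, pmf))     # E(X^2)
variance = second_moment - mean**2                             # V(X) = E(X^2) - [E(X)]^2

print(mean, variance)   # 1.0 and 0.5
```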
Problems
1. A random variable X has the following probability distribution:
x      0   1    2    3    4    5    6     7
p(x)   0   k   2k   2k   3k   k²   2k²   7k² + k
(i) Find k
(ii) Evaluate P(X < 6), P(X ≥ 6), P(0 < X < 5)
(iii) Determine the distribution function of X
(iv) If P(X ≤ c) > 1/2, find the minimum value of c
2. From a lot of 10 items containing 3 defectives, a sample of 4 items is
drawn at random. Let the random variable X denote the number of
defective items in the sample. (i) Find the probability distribution of X (ii)
Evaluate P(X ≤ 1), P(X < 1), P(0 < X < 2)
Problems
3. Let X be a random variable such that
P(X = −2) = P(X = −1), P(X = 2) = P(X = 1), and
P(X > 0) = P(X < 0) = P(X = 0). Obtain the PMF of X , mean,
variance, and its distribution function.
4. If P(x) = x/15 for x = 1, 2, 3, 4, 5, and P(x) = 0 otherwise, find
(i) P(X = 1 or 2)
(ii) P(1/2 < X < 5/2 | X > 1)
5. A discrete random variable X has the cumulative distribution function
F(x) = 0, x < 0
= 1/10, 0 ≤ x < 1
= 3/10, 1 ≤ x < 2
= 5/10, 2 ≤ x < 4
= 8/10, 4 ≤ x < 5
= 1, x ≥ 5
Determine the probability mass function of X.
Continuous random variable
If X is a continuous random variable, its probability law is described by a
probability density function (pdf). The pdf f(x) satisfies the following:
(i) f(x) ≥ 0, −∞ < x < ∞
(ii) ∫_{−∞}^{∞} f(x) dx = 1
(iii) P(a ≤ X ≤ b) = ∫_a^b f(x) dx
For a pdf defined on a ≤ x ≤ b,
Mean = X̄ = E[X] = ∫_a^b x f(x) dx
Variance = σ_x² = ∫_a^b x² f(x) dx − (∫_a^b x f(x) dx)²
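These integral conditions can be checked symbolically. A minimal sympy sketch (the pdf f(x) = 3x² on (0, 1) is an illustrative choice, not one of the exercises that follow):

```python
import sympy as sp

x = sp.symbols('x')
f = 3*x**2          # illustrative pdf on (0, 1)
a, b = 0, 1

total = sp.integrate(f, (x, a, b))                     # must equal 1
mean = sp.integrate(x*f, (x, a, b))                    # E[X]
variance = sp.integrate(x**2*f, (x, a, b)) - mean**2   # E[X^2] - (E[X])^2

print(total, mean, variance)   # 1, 3/4, 3/80
```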
Cumulative Distribution Function F(x)
F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt
Properties of CDF
d/dx F(x) = f(x)
P(a ≤ X ≤ b) = F(b) − F(a)
Problems:
1. A random variable X has the pdf f(x) = 2x for 0 < x < 1, and f(x) = 0 otherwise. Find
(i) P(X < 1/2)
(ii) P(1/4 < X < 1/2)
(iii) P(X > 3/4 / X > 1/2)
(iv) P(X < 3/4 / X > 1/2)
Problems
2. Let X be a continuous random variable with pdf
f(x) = kx, 0 ≤ x ≤ 1
= k, 1 ≤ x ≤ 2
= −kx + 3k, 2 ≤ x ≤ 3
= 0, otherwise
(i) Determine k
(ii) Determine F(x)
3. Find the value of a for which the following is a pdf
f(x) = a(x² + 1), 1 ≤ x ≤ 4
= 0, otherwise
Calculate
(i) P(X = 3)
(ii) P(2 < X ≤ 3)
(iii) P(2X − 3 < 2)
Problems
4. The length of time (in mins) that a certain lady speaks over the telephone
is found to be a random phenomenon, with probability density function
f(x) = A e^{−x/5}, x ≥ 0
= 0, otherwise
(i) Find the value of A
(ii) What is the probability that the number of minutes that she will talk
over the phone is (a) more than 10 mins, (b) less than 5 mins, (c)
between 5 and 10 mins?
5. A continuous random variable X has the distribution function
F(x) = 0, x ≤ 1
= k(x − 1)⁴, 1 < x ≤ 3
= 1, x > 3
Find (i) k, (ii) the pdf f(x), (iii) the mean
Joint Distribution
Joint Probability mass function
If X and Y are discrete random variables, then P(X = x, Y = y) is called the
joint probability mass function if it satisfies:
(i) P(x, y) ≥ 0 ∀x, y
(ii) ∑_x ∑_y P(X = x, Y = y) = 1
Marginal distribution of X
P_X(x) = ∑_y P(X = x, Y = y)
Marginal distribution of Y
P_Y(y) = ∑_x P(X = x, Y = y)
Conditional distribution for X given Y
P(X/Y) = P(X = x, Y = y) / P_Y(y)
Conditional distribution for Y given X
P(Y/X) = P(X = x, Y = y) / P_X(x)
Cumulative Distribution Function
F(x, y) = P(X ≤ x, Y ≤ y) = ∑_{xi ≤ x} ∑_{yj ≤ y} P(X = xi, Y = yj)
Two random variables X and Y are independent if
P(X = x, Y = y ) = PX (x)PY (y ) ∀x, y
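A small numpy sketch of these definitions on a made-up 2×3 joint table (the numbers are illustrative, not one of the problems that follow): marginals come from row and column sums, and independence is checked by comparing P(x, y) with P_X(x)P_Y(y).

```python
import numpy as np

# Rows index x-values, columns index y-values; entries are P(X = x, Y = y).
P = np.array([[0.10, 0.20, 0.10],
              [0.15, 0.30, 0.15]])
assert np.isclose(P.sum(), 1.0)

P_X = P.sum(axis=1)          # marginal of X: sum over y
P_Y = P.sum(axis=0)          # marginal of Y: sum over x

cond_X_given_y0 = P[:, 0] / P_Y[0]   # conditional distribution of X given the first y-value

independent = np.allclose(P, np.outer(P_X, P_Y))   # P(x, y) = P_X(x) P_Y(y) for all x, y?
print(P_X, P_Y, cond_X_given_y0, independent)
```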
Problems
1. For the following bivariate probability distribution of X and Y , find (i)
the marginal distribution of X , Y , (ii) Conditional distribution of X given
Y = 3, (iii) P(X ≤ 1, Y = 2), (iv) P(X ≤ 1), (v) P(Y = 3), (vi)
P(Y ≤ 3), (vii) P(X < 3, Y ≤ 4), (viii) P(X + Y ≤ 3), (ix) Are these
variables independent?
X \ Y    1      2      3      4      5      6
0        0      0      1/32   2/32   2/32   3/32
1        1/16   1/16   1/8    1/8    1/8    1/8
2        1/32   1/32   1/64   1/64   0      2/64
Problems
2. Let the joint pmf of X1 and X2 be
P(x1, x2) = (x1 + x2)/21, x1 = 1, 2, 3; x2 = 1, 2
= 0, otherwise
Show that the marginal pmfs of X1 and X2 are
P1(x1) = (2x1 + 3)/21, x1 = 1, 2, 3 and P2(x2) = (6 + 3x2)/21, x2 = 1, 2.
Are these two random variables independent?
3. Given the following bivariate probability distribution, obtain the (i)
marginal distribution of X and Y , (ii) Conditional distribution of X given
Y = 2.
Y \ X    −1     0      1
0        1/15   2/15   1/15
1        3/15   2/15   1/15
2        2/15   1/15   2/15
Joint Distribution-Continuous
Joint pdf
If X and Y are two continuous random variables, then f(x, y) is called the
joint probability density function (joint pdf) if it satisfies
(i) f(x, y) ≥ 0 ∀x, y
(ii) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1
Marginal distribution of X
f_X(x) = ∫_Y f(x, y) dy
Marginal distribution of Y
f_Y(y) = ∫_X f(x, y) dx
Conditional distribution for X given Y
f(X/Y) = f(x, y) / f_Y(y)
Conditional distribution for Y given X
f(Y/X) = f(x, y) / f_X(x)
Cumulative Distribution Function
F(x, y) = P(X ≤ x, Y ≤ y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f(x, y) dy dx
Two random variables X and Y are independent if
f (x, y ) = fX (x)fY (y ) ∀x, y
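The same checks can be done symbolically for a joint pdf. A sympy sketch with the illustrative density f(x, y) = 4xy on the unit square (not one of the exercises below):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = 4*x*y                                          # illustrative joint pdf on 0 <= x, y <= 1

total = sp.integrate(f, (x, 0, 1), (y, 0, 1))      # should be 1
f_X = sp.integrate(f, (y, 0, 1))                   # marginal of X: 2x
f_Y = sp.integrate(f, (x, 0, 1))                   # marginal of Y: 2y

independent = sp.simplify(f - f_X*f_Y) == 0        # f(x, y) = f_X(x) f_Y(y)?
print(total, f_X, f_Y, independent)                # 1, 2*x, 2*y, True
```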
Problems
1. Given that the joint pdf of the random variables X and Y is
f(x, y) = k(x + 2y), 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
= 0, otherwise
(a) Determine k, (b) Find the marginal distribution of X and Y, (c) Find
P(X ≤ 1/2, Y ≤ 1/2), (d) Check for independence of X and Y.
2. The joint pdf of a two-dimensional random variable (X, Y) is
f(x, y) = 2, 0 < x < 1, 0 < y < x
= 0, otherwise
(i) Find the marginal density functions of X and Y , (ii) Find the
conditional density function of Y given X = x, (iii) Check for
independence of X and Y .
Problems
3. If X and Y are two random variables such that
f(x, y) = (1/8)(6 − x − y), 0 < x < 2, 2 < y < 4
= 0, otherwise
Find (i) P(X < 1, Y < 3), (ii) P(X + Y < 3), (iii) P(X < 1 | Y < 3)
Expectation
For a univariate distribution,
E(X) = ∑ x P(x), if X is discrete
E(X) = ∫ x f(x) dx, if X is continuous
Var(X) = σ_x² = ∑ x² P(x) − (∑ x P(x))², if X is discrete
Var(X) = ∫ x² f(x) dx − (∫ x f(x) dx)², if X is continuous
For a bivariate distribution,
E(X) = ∑ x P_X(x), if X is discrete
E(X) = ∫ x f_X(x) dx, if X is continuous
E(Y) = ∑ y P_Y(y), if Y is discrete
E(Y) = ∫ y f_Y(y) dy, if Y is continuous
E(XY) = ∑_x ∑_y xy P(x, y), if X, Y are discrete
E(XY) = ∫∫ xy f(x, y) dx dy, if X, Y are continuous
Covariance
Cov (X , Y ) = E [XY ] − E [X ]E [Y ]
If X and Y are independent, then E [XY ] = E [X ]E [Y ] and hence
Cov (X , Y ) = 0
Properties
1 E [c] = c where c is a constant
2 E [aX + b] = aE [X ] + b
3 E [X + Y ] = E [X ] + E [Y ]
4 Var (c) = 0 where c is a constant
5 Var(aX + b) = a² Var(X)
6 E[X] = µ′1, E[X²] = µ′2, . . . , E[X^r] = µ′r
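A quick Monte Carlo illustration of properties 2 and 5 above (assuming numpy; the exponential distribution and the constants a, b are arbitrary choices for the sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.exponential(scale=2.0, size=1_000_000)   # illustrative random variable: E[X] = 2, Var(X) = 4
a, b = 3.0, 5.0
Y = a*X + b

print(Y.mean(), a*X.mean() + b)        # E[aX + b]   vs  a E[X] + b
print(Y.var(), a**2 * X.var())         # Var(aX + b) vs  a^2 Var(X)
```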
Problem
Find E [X ] where X is a discrete random variable such that
P(X = x) = (3/4)^x, x = 0, 1, 2, . . .
1. Given the following bivariate probability distribution, calculate
cov(X , Y )
Y \ X    −1     0      1
0        1/15   2/15   1/15
1        3/15   2/15   1/15
2        2/15   1/15   2/15
Y \ X     −1     0      1      P_Y(y)
0         1/15   2/15   1/15   4/15
1         3/15   2/15   1/15   6/15
2         2/15   1/15   2/15   5/15
P_X(x)    6/15   5/15   4/15   1
E(X) = ∑ x P_X(x) = −2/15
E(Y) = ∑ y P_Y(y) = 16/15
E(XY) = ∑_x ∑_y xy P(x, y) = −2/15
cov(X, Y) = E(XY) − E(X)E(Y) = −2/15 − (−2/15)(16/15) = 2/225
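A numerical cross-check of the table above (a sketch assuming numpy, not part of the original slides); it reproduces E(X) = −2/15, E(Y) = 16/15, E(XY) = −2/15 and cov(X, Y) = 2/225.

```python
import numpy as np

x_vals = np.array([-1, 0, 1])
y_vals = np.array([0, 1, 2])
# Rows index y, columns index x, matching the table above.
P = np.array([[1, 2, 1],
              [3, 2, 1],
              [2, 1, 2]]) / 15.0

P_X = P.sum(axis=0)                           # 6/15, 5/15, 4/15
P_Y = P.sum(axis=1)                           # 4/15, 6/15, 5/15

E_X = (x_vals * P_X).sum()                    # -2/15
E_Y = (y_vals * P_Y).sum()                    # 16/15
E_XY = (np.outer(y_vals, x_vals) * P).sum()   # -2/15
cov = E_XY - E_X*E_Y                          # 2/225 ≈ 0.0089
print(E_X, E_Y, E_XY, cov)
```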
2. Given that the joint pdf of the random variables X and Y is
f(x, y) = k(x + 2y), 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
= 0, otherwise
Calculate cov(X, Y)
Solution:
k = 2/3
Recall the marginal distributions of X and Y:
f_X(x) = ∫_0^1 f(x, y) dy = (2/3)(x + 1)
f_Y(y) = ∫_0^1 f(x, y) dx = (2/3)(1/2 + 2y)
E(X) = ∫_0^1 x f_X(x) dx = 5/9
E(Y) = ∫_0^1 y f_Y(y) dy = 11/18
E(XY) = ∫_0^1 ∫_0^1 xy f(x, y) dx dy = 1/3
cov(X, Y) = E(XY) − E(X)E(Y) = 1/3 − (5/9)(11/18) = −1/162 ≈ −0.00617
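The same numbers can be reproduced symbolically; a sympy sketch of the computation above:

```python
import sympy as sp

x, y = sp.symbols('x y')
k = sp.Rational(2, 3)
f = k*(x + 2*y)                                   # joint pdf on the unit square

f_X = sp.integrate(f, (y, 0, 1))                  # (2/3)(x + 1)
f_Y = sp.integrate(f, (x, 0, 1))                  # (2/3)(1/2 + 2y)

E_X = sp.integrate(x*f_X, (x, 0, 1))              # 5/9
E_Y = sp.integrate(y*f_Y, (y, 0, 1))              # 11/18
E_XY = sp.integrate(x*y*f, (x, 0, 1), (y, 0, 1))  # 1/3

cov = sp.simplify(E_XY - E_X*E_Y)
print(E_X, E_Y, E_XY, cov)                        # 5/9, 11/18, 1/3, -1/162
```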
Moment Generating Function
MGF
M_X(t) = E[e^{tX}]
= ∑ e^{tx} P(x), if X is discrete
= ∫ e^{tx} f(x) dx, if X is continuous
Properties of MGF
1 M_cX(t) = M_X(ct)
2 If a random variable Y = aX + b where X is also a random variable,
a, b are constants, then M_Y(t) = e^{bt} M_X(at)
3 If X1 , X2 , . . . , Xn are independent random variables with MGF’s
MXi (t), then for Y = X1 + X2 + . . . + Xn ,
MY (t) = MX1 (t)MX2 (t) . . . MXn (t)
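Property 2 can be sanity-checked by simulation. A Monte Carlo sketch (assuming numpy; the exponential distribution with MGF M_X(t) = 3/(3 − t) and the constants a, b, t are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.exponential(scale=1/3, size=1_000_000)   # Exp(3), with M_X(t) = 3/(3 - t) for t < 3
a, b, t = 2.0, 1.0, 0.1
Y = a*X + b

lhs = np.mean(np.exp(t*Y))                 # Monte Carlo estimate of M_Y(t) = E[e^{tY}]
rhs = np.exp(b*t) * 3/(3 - a*t)            # e^{bt} M_X(at)
print(lhs, rhs)                            # should agree up to sampling error
```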
Relation between MGF and moments
To find the MGF using moments,
M_X(t) = ∑_{r=0}^{∞} (t^r / r!) µ′r
To find moments using the MGF,
µ′r = (d^r/dt^r) M_X(t) at t = 0
Mean = E(X) = µ′1 = (d/dt) M_X(t) at t = 0
µ′2 = (d²/dt²) M_X(t) at t = 0
Variance = µ2 = µ′2 − (µ′1)²
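A sympy sketch of the differentiation rule (the MGF M_X(t) = 3/(3 − t), i.e. an exponential distribution with mean 1/3, is an illustrative choice, not one of the problems that follow):

```python
import sympy as sp

t = sp.symbols('t')
M = 3/(3 - t)                                # illustrative MGF (exponential with rate 3)

mu1 = sp.diff(M, t).subs(t, 0)               # first raw moment  = mean = 1/3
mu2 = sp.diff(M, t, 2).subs(t, 0)            # second raw moment = 2/9
variance = mu2 - mu1**2                      # 1/9
print(mu1, mu2, variance)
```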
Problems
1. A random variable X has the probability distribution P(x) = (1/2)^x, x = 1, 2, . . ..
Find MGF, mean(X ), and Var(X ).
2. Let X be a random variable with pdf
f(x) = x, 0 < x < 1
= 2 − x, 1 < x < 2
= 0, otherwise
Find the MGF of X, and the mean and variance of X.
M_X(t) = ((e^t − 1)/t)²
3. Find the MGF and the rth moment for the distribution whose pdf is
f(x) = ke^{−x}, 0 < x < ∞. Also, calculate the mean of X.
M_X(t) = k/(1 − t)
4. Find the MGF of a random variable whose moments are µ′r = (r + 1)! 2^r.
M_X(t) = ∑_{r=0}^{∞} (t^r / r!) µ′r = ∑_{r=0}^{∞} (r + 1)(2t)^r = (1 − 2t)^{−2}
5. Calculate the standard deviation of a random variable which has MGF
M_X(t) = 5/(5 − t).
µ′1 = 1/5, µ′2 = 2/25, Standard deviation = 1/5
Characteristic Function
The characteristic function is defined as
φ_X(t) = E[e^{itX}]
= ∑ e^{itx} P(x), if X is discrete
= ∫ e^{itx} f(x) dx, if X is continuous
Note that the MGF may not always exist, but the characteristic function
always exists.
Problem
Find the characteristic function of X whose pdf is f(x) = e^{−3x}, x ≥ 0.
Also, compute the mean and variance of X .
To find moments using the CF,
µ′r = (−i)^r (d^r/dt^r) φ_X(t) at t = 0
Mean = E(X) = µ′1 = (−i) (d/dt) φ_X(t) at t = 0
µ′2 = (−i)² (d²/dt²) φ_X(t) at t = 0
Variance = µ2 = µ′2 − (µ′1)²
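A sympy sketch of the same rule for a characteristic function (φ_X(t) = 2/(2 − it), the CF of an exponential distribution with rate 2, is an illustrative choice, not the exercise on the previous slide):

```python
import sympy as sp

t = sp.symbols('t', real=True)
phi = 2/(2 - sp.I*t)                                           # illustrative CF (exponential, rate 2)

mu1 = sp.simplify((-sp.I) * sp.diff(phi, t)).subs(t, 0)        # mean = 1/2
mu2 = sp.simplify((-sp.I)**2 * sp.diff(phi, t, 2)).subs(t, 0)  # second raw moment = 1/2
variance = sp.simplify(mu2 - mu1**2)                           # 1/4
print(mu1, mu2, variance)
```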