Random Variables
Example 1 Let X and Y be random variables that take on values from the set {-1, 0, 1}.
(a) Find a joint probability mass assignment for which X and Y are independent, and confirm that $X^2$ and $Y^2$ are then also independent.
(b) Find a joint pmf assignment for which X and Y are not independent, but for which $X^2$ and $Y^2$ are independent.
Solution
(a) We assign a joint probability mass function for X and Y as shown in the table below. The values are designed to satisfy the relations $P_{XY}(x_k, y_j) = P_X(x_k)P_Y(y_j)$ for all $k, j$. Hence, the independence property of X and Y is enforced in the assignment.
 P_XY(x_k, y_j) | x_1 = -1 | x_2 = 0 | x_3 = 1 | P_Y(y_j)
 ---------------+----------+---------+---------+---------
 y_1 = -1       |   1/12   |   1/6   |   1/4   |   1/2
 y_2 = 0        |   1/18   |   1/9   |   1/6   |   1/3
 y_3 = 1        |   1/36   |   1/18  |   1/12  |   1/6
 P_X(x_k)       |   1/6    |   1/3   |   1/2   |
Given the above assignment for X and Y, the corresponding joint probability mass function for the pair $X^2$ and $Y^2$ is seen to be

 P_{X^2 Y^2}(x_k, y_j) | x_1 = 1                        | x_2 = 0          | P_{Y^2}(y_j)
 ----------------------+--------------------------------+------------------+-------------
 y_1 = 1               | 1/12 + 1/4 + 1/36 + 1/12 = 4/9 | 1/6 + 1/18 = 2/9 |     2/3
 y_2 = 0               | 1/18 + 1/6 = 2/9               | 1/9              |     1/3
 P_{X^2}(x_k)          | 2/3                            | 1/3              |

Note that $P_{X^2 Y^2}(x_k, y_j) = P_{X^2}(x_k)P_{Y^2}(y_j)$ for all k and j, so $X^2$ and $Y^2$ are also independent.
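As a numerical sanity check on the tables above, the following sketch (not part of the original notes) uses exact rational arithmetic to verify that the assignment makes X, Y independent and that the induced pmf of $(X^2, Y^2)$ factorizes as well.

```python
# Verify Example 1(a): X, Y independent, and hence X^2, Y^2 independent.
from fractions import Fraction as F

# Joint pmf P_XY(x, y) from the first table.
pmf = {(-1, -1): F(1, 12), (0, -1): F(1, 6),  (1, -1): F(1, 4),
       (-1,  0): F(1, 18), (0,  0): F(1, 9),  (1,  0): F(1, 6),
       (-1,  1): F(1, 36), (0,  1): F(1, 18), (1,  1): F(1, 12)}

# Marginals of X and Y.
px = {x: sum(p for (a, _), p in pmf.items() if a == x) for x in (-1, 0, 1)}
py = {y: sum(p for (_, b), p in pmf.items() if b == y) for y in (-1, 0, 1)}

# X and Y are independent: P_XY = P_X * P_Y at every point.
xy_independent = all(pmf[(x, y)] == px[x] * py[y] for x in px for y in py)

# Induced joint pmf of (X^2, Y^2) and its marginals.
pmf2 = {}
for (x, y), p in pmf.items():
    pmf2[(x * x, y * y)] = pmf2.get((x * x, y * y), F(0)) + p
px2 = {u: sum(p for (a, _), p in pmf2.items() if a == u) for u in (0, 1)}
py2 = {v: sum(p for (_, b), p in pmf2.items() if b == v) for v in (0, 1)}
sq_independent = all(pmf2[(u, v)] == px2[u] * py2[v] for u in px2 for v in py2)
```

Using `fractions.Fraction` avoids floating-point round-off, so the factorization checks are exact.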
(b) Suppose we take the same joint pmf assignment for $X^2$ and $Y^2$ as in the second table, but modify the joint pmf for X and Y as shown in the following table.

 P_XY(x_k, y_j) | x_1 = -1 | x_2 = 0 | x_3 = 1 | P_Y(y_j)
 ---------------+----------+---------+---------+---------
 y_1 = -1       |   1/4    |   1/6   |   1/12  |   1/2
 y_2 = 0        |   1/18   |   1/9   |   1/6   |   1/3
 y_3 = 1        |   1/12   |   1/18  |   1/36  |   1/6
 P_X(x_k)       |   7/18   |   1/3   |   5/18  |   1
This new joint pmf assignment for X and Y can be seen to give rise to the same joint pmf assignment for $X^2$ and $Y^2$ as in the second table. However, in this new assignment, we observe that
\[ \frac{1}{4} = P_{XY}(x_1, y_1) \ne P_X(x_1)P_Y(y_1) = \frac{7}{18} \cdot \frac{1}{2} = \frac{7}{36}, \]
and the inequality of values can also be observed for $P_{XY}(x_1, y_3)$, $P_{XY}(x_3, y_1)$ and $P_{XY}(x_3, y_3)$, etc. Hence, X and Y are not independent.
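The same exact-arithmetic check can be run on the modified table; this sketch (not part of the original notes) confirms that X and Y fail to be independent while $X^2$ and $Y^2$ remain independent.

```python
# Verify Example 1(b): X, Y NOT independent, yet X^2, Y^2 independent.
from fractions import Fraction as F

# Modified joint pmf P_XY(x, y) from the third table.
pmf = {(-1, -1): F(1, 4),  (0, -1): F(1, 6),  (1, -1): F(1, 12),
       (-1,  0): F(1, 18), (0,  0): F(1, 9),  (1,  0): F(1, 6),
       (-1,  1): F(1, 12), (0,  1): F(1, 18), (1,  1): F(1, 36)}

px = {x: sum(p for (a, _), p in pmf.items() if a == x) for x in (-1, 0, 1)}
py = {y: sum(p for (_, b), p in pmf.items() if b == y) for y in (-1, 0, 1)}
xy_independent = all(pmf[(x, y)] == px[x] * py[y] for x in px for y in py)

# Induced joint pmf of (X^2, Y^2): identical to the one in part (a).
pmf2 = {}
for (x, y), p in pmf.items():
    pmf2[(x * x, y * y)] = pmf2.get((x * x, y * y), F(0)) + p
px2 = {u: sum(p for (a, _), p in pmf2.items() if a == u) for u in (0, 1)}
py2 = {v: sum(p for (_, b), p in pmf2.items() if b == v) for v in (0, 1)}
sq_independent = all(pmf2[(u, v)] == px2[u] * py2[v] for u in px2 for v in py2)
```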
Remark
1. Since $-1$ and $1$ are the two square roots of 1, we have
\[ P_X(-1) + P_X(1) = P_{X^2}(1) \quad \text{and} \quad P_Y(-1) + P_Y(1) = P_{Y^2}(1), \]
therefore
\begin{align*}
P_{X^2}(1)P_{Y^2}(1) &= [P_X(-1) + P_X(1)][P_Y(-1) + P_Y(1)] \\
&= P_X(-1)P_Y(-1) + P_X(-1)P_Y(1) + P_X(1)P_Y(-1) + P_X(1)P_Y(1).
\end{align*}
On the other hand, $P_{X^2 Y^2}(1, 1) = P_{XY}(-1, -1) + P_{XY}(-1, 1) + P_{XY}(1, -1) + P_{XY}(1, 1)$. Given that $X^2$ and $Y^2$ are independent, we have $P_{X^2 Y^2}(1, 1) = P_{X^2}(1)P_{Y^2}(1)$, that is,
\begin{align*}
&P_{XY}(-1, -1) + P_{XY}(-1, 1) + P_{XY}(1, -1) + P_{XY}(1, 1) \\
&\quad = P_X(-1)P_Y(-1) + P_X(-1)P_Y(1) + P_X(1)P_Y(-1) + P_X(1)P_Y(1).
\end{align*}
However, there is no guarantee that $P_{XY}(-1, -1) = P_X(-1)P_Y(-1)$, $P_{XY}(-1, 1) = P_X(-1)P_Y(1)$, etc., even though the sums are equal.
2. Suppose $X^3$ and $Y^3$ are considered instead of $X^2$ and $Y^2$. Can we construct a pmf assignment where $X^3$ and $Y^3$ are independent but X and Y are not?
3. If the set of values assumed by X and Y is {0, 1, 2} instead of {-1, 0, 1}, can we construct a pmf assignment for which $X^2$ and $Y^2$ are independent but X and Y are not?
Example 2 Suppose the random variables X and Y have the joint density function defined by
\[ f(x, y) = \begin{cases} c(2x + y) & 2 < x < 6,\ 0 < y < 5 \\ 0 & \text{otherwise} \end{cases}. \]
(a) To find the constant c, we use
\[ 1 = \text{total probability} = \int_2^6 \int_0^5 c(2x + y)\,dy\,dx = \int_2^6 c\left[2xy + \frac{y^2}{2}\right]_0^5 dx = \int_2^6 c\left(10x + \frac{25}{2}\right) dx = 210c, \]
so $c = \frac{1}{210}$.
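A quick numerical confirmation of the normalizing constant (a sketch, not part of the original notes): a midpoint Riemann sum of $2x + y$ over the support should give 210, so that $c = 1/210$.

```python
# Midpoint Riemann sum of (2x + y) over 2 < x < 6, 0 < y < 5.
# The integrand is linear, so the midpoint rule is essentially exact.
def total_mass(n=400):
    hx, hy = 4.0 / n, 5.0 / n
    s = 0.0
    for i in range(n):
        x = 2.0 + (i + 0.5) * hx          # midpoint in x
        for j in range(n):
            y = (j + 0.5) * hy            # midpoint in y
            s += (2 * x + y) * hx * hy
    return s

mass = total_mass()                        # should be 210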
(b) The marginal cdfs for X and Y are given by
\[ F_X(x) = P(X \le x) = \begin{cases} 0 & x < 2 \\ \displaystyle\int_2^x \int_0^5 \frac{2u + y}{210}\,dy\,du = \frac{2x^2 + 5x - 18}{84} & 2 \le x < 6 \\ 1 & x \ge 6 \end{cases}; \]
\[ F_Y(y) = P(Y \le y) = \begin{cases} 0 & y < 0 \\ \displaystyle\int_2^6 \int_0^y \frac{2x + v}{210}\,dv\,dx = \frac{y^2 + 16y}{105} & 0 \le y < 5 \\ 1 & y \ge 5 \end{cases}. \]
(c) Marginal pdf for X: $f_X(x) = \dfrac{d}{dx}F_X(x) = \begin{cases} \frac{4x + 5}{84} & 2 < x < 6 \\ 0 & \text{otherwise} \end{cases}$.
Marginal pdf for Y: $f_Y(y) = \dfrac{d}{dy}F_Y(y) = \begin{cases} \frac{2y + 16}{105} & 0 < y < 5 \\ 0 & \text{otherwise} \end{cases}$.
(d)
\[ P(X > 3, Y > 2) = \frac{1}{210} \int_3^6 \int_2^5 (2x + y)\,dy\,dx = \frac{15}{28} \]
\[ P(X > 3) = \frac{1}{210} \int_3^6 \int_0^5 (2x + y)\,dy\,dx = \frac{23}{28} \]
\[ P(X + Y < 4) = \frac{1}{210} \int_2^4 \int_0^{4-x} (2x + y)\,dy\,dx = \frac{2}{35} \]
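The three probabilities in part (d) can be evaluated numerically from the density directly; this sketch (not part of the original notes) integrates $f(x, y) = (2x + y)/210$ over each event by a midpoint Riemann sum.

```python
# Evaluate the part (d) probabilities from f(x, y) = (2x + y)/210.
def prob(region, n=400):
    """Midpoint Riemann sum of f over the part of the support
    where region(x, y) is True."""
    hx, hy = 4.0 / n, 5.0 / n
    s = 0.0
    for i in range(n):
        x = 2.0 + (i + 0.5) * hx
        for j in range(n):
            y = (j + 0.5) * hy
            if region(x, y):
                s += (2 * x + y) / 210.0 * hx * hy
    return s

p1 = prob(lambda x, y: x > 3 and y > 2)    # P(X > 3, Y > 2)
p2 = prob(lambda x, y: x > 3)              # P(X > 3)
p3 = prob(lambda x, y: x + y < 4)          # P(X + Y < 4)
```

The first two regions are grid-aligned, so `p1` and `p2` are exact up to rounding; `p3` has a small O(h) error along the line $x + y = 4$.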
(e) Joint distribution function. (Figure: the $xy$-plane divided by the lines $x = 2$, $x = 6$, $y = 0$ and $y = 5$, with the value of $F(x, y)$ marked on each region.) Region by region,
\[ F(x, y) = \begin{cases} 0 & x \le 2 \text{ or } y \le 0 \\ \dfrac{2x^2y + xy^2 - 2y^2 - 8y}{420} & 2 < x < 6,\ 0 < y < 5 \\ \dfrac{2x^2 + 5x - 18}{84} & 2 < x < 6,\ y \ge 5 \\ \dfrac{y^2 + 16y}{105} & x \ge 6,\ 0 < y < 5 \\ 1 & x \ge 6,\ y \ge 5 \end{cases}. \]
For example, suppose $(x, y)$ is located in $\{(x, y) : x \ge 6,\ 0 < y < 5\}$; then
\[ F(x, y) = \int_2^6 \int_0^y \frac{2u + v}{210}\,dv\,du = \frac{y^2 + 16y}{105}, \]
and $\dfrac{\partial}{\partial y}F(x, y) = \dfrac{2y + 16}{105} = f_Y(y)$ there.
Note that for this density $f(x, y)$, we have
\[ f(x, y) \ne f_X(x)f_Y(y), \]
so X and Y are not independent.
Example 3 The joint density of X and Y is given by
\[ f(x, y) = \begin{cases} \frac{12}{5}x(2 - x - y) & 0 < x < 1,\ 0 < y < 1 \\ 0 & \text{otherwise} \end{cases}. \]
Compute the conditional density of X, given that Y = y, where 0 < y < 1.
Solution For 0 < x < 1, 0 < y < 1, we have
\[ f_X(x|y) = \frac{f(x, y)}{f_Y(y)} = \frac{f(x, y)}{\int_{-\infty}^{\infty} f(x, y)\,dx} = \frac{x(2 - x - y)}{\int_0^1 x(2 - x - y)\,dx} = \frac{x(2 - x - y)}{\frac{2}{3} - \frac{y}{2}} = \frac{6x(2 - x - y)}{4 - 3y}. \]
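As a check on Example 3 (a sketch, not part of the original notes): for any fixed $y$, the conditional density $6x(2 - x - y)/(4 - 3y)$ should integrate to 1 over $0 < x < 1$.

```python
# Midpoint Riemann sum of the conditional density f(x|y) over 0 < x < 1,
# for a few fixed values of y; each result should be close to 1.
def integral_over_x(y, n=2000):
    h = 1.0 / n
    return sum(6 * (i + 0.5) * h * (2 - (i + 0.5) * h - y) / (4 - 3 * y) * h
               for i in range(n))

checks = [integral_over_x(y) for y in (0.1, 0.5, 0.9)]
```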
Example 4 If X and Y have the joint density function
\[ f_{XY}(x, y) = \begin{cases} \frac{3}{4} + xy & 0 < x < 1,\ 0 < y < 1 \\ 0 & \text{otherwise} \end{cases}, \]
find (a) $f_Y(y|x)$, (b) $P\left(Y > \frac{1}{2} \,\middle|\, \frac{1}{2} < X < \frac{1}{2} + dx\right)$.
Solution
(a) For 0 < x < 1,
\[ f_X(x) = \int_0^1 \left(\frac{3}{4} + xy\right) dy = \frac{3}{4} + \frac{x}{2} \]
and
\[ f_Y(y|x) = \frac{f_{XY}(x, y)}{f_X(x)} = \begin{cases} \dfrac{3 + 4xy}{3 + 2x} & 0 < y < 1 \\ 0 & \text{otherwise} \end{cases}. \]
For other values of x, $f_Y(y|x)$ is not defined.
(b)
\[ P\left(Y > \frac{1}{2} \,\middle|\, \frac{1}{2} < X < \frac{1}{2} + dx\right) = \int_{1/2}^{1} f_Y\left(y \,\middle|\, \frac{1}{2}\right) dy = \int_{1/2}^{1} \frac{3 + 2y}{4}\,dy = \frac{9}{16}. \]
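A numerical check on Example 4(b) (a sketch, not part of the original notes): integrating the conditional density $f_Y(y \mid x = 1/2) = (3 + 2y)/4$ over $1/2 < y < 1$ should give $9/16$.

```python
# Midpoint Riemann sum of (3 + 2y)/4 over 1/2 < y < 1.
# The integrand is linear, so the midpoint rule is essentially exact.
def p_y_above_half(n=2000):
    h = 0.5 / n
    return sum((3 + 2 * (0.5 + (i + 0.5) * h)) / 4 * h for i in range(n))

p = p_y_above_half()                       # should be 9/16 = 0.5625
```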
Example 5 Let X and Y be independent exponential random variables with parameters $\alpha$ and $\beta$, respectively. Consider the square with corners (0, 0), (0, a), (a, a) and (a, 0); that is, the length of each side is a.
(a) Find the value of a for which the probability that (X, Y) falls inside the square of side a is 1/2.
(b) Find the conditional pdf of (X, Y) given that $X \ge a$ and $Y \ge a$.
Solution
(a) The density functions of X and Y are given by
\[ f_X(x) = \begin{cases} \alpha e^{-\alpha x} & x \ge 0 \\ 0 & x < 0 \end{cases}, \qquad f_Y(y) = \begin{cases} \beta e^{-\beta y} & y \ge 0 \\ 0 & y < 0 \end{cases}. \]
Since X and Y are independent, $f_{XY}(x, y) = f_X(x)f_Y(y)$. Next, we compute
\[ P[0 \le X \le a,\ 0 \le Y \le a] = \int_0^a \int_0^a \alpha\beta e^{-\alpha x} e^{-\beta y}\,dx\,dy = (1 - e^{-\alpha a})(1 - e^{-\beta a}), \]
and solve for a such that $(1 - e^{-\alpha a})(1 - e^{-\beta a}) = 1/2$.
(b) Consider the conditional cdf of (X, Y):
\begin{align*}
F_{XY}(x, y \mid X \ge a, Y \ge a) &= P[X \le x, Y \le y \mid X \ge a, Y \ge a] \\
&= \frac{P[a \le X \le x,\ a \le Y \le y]}{P[X \ge a,\ Y \ge a]} \\
&= \frac{P[a \le X \le x]\,P[a \le Y \le y]}{P[X \ge a]\,P[Y \ge a]} \quad \text{since X and Y are independent}
\end{align*}
\[ = \begin{cases} \dfrac{\int_a^y \int_a^x \alpha\beta e^{-\alpha u} e^{-\beta v}\,du\,dv}{\int_a^\infty \int_a^\infty \alpha\beta e^{-\alpha u} e^{-\beta v}\,du\,dv} = \dfrac{(e^{-\alpha a} - e^{-\alpha x})(e^{-\beta a} - e^{-\beta y})}{e^{-\alpha a}\,e^{-\beta a}} & x > a,\ y > a \\ 0 & \text{otherwise} \end{cases}. \]
Hence
\[ f_{XY}(x, y \mid X \ge a, Y \ge a) = \frac{\partial^2}{\partial x\,\partial y} F_{XY}(x, y \mid X \ge a, Y \ge a) = \begin{cases} \alpha\beta e^{-\alpha x} e^{-\beta y}\big/\!\left(e^{-\alpha a} e^{-\beta a}\right) & x > a,\ y > a \\ 0 & \text{otherwise} \end{cases}. \]
Example 6 A point is chosen uniformly at random from the triangle that is formed by joining the three points (0, 0), (0, 1) and (1, 0) (units measured in centimetres). Let X and Y be the coordinates of the randomly chosen point.
(i) What is the joint density of X and Y?
(ii) Calculate the expected values of X and Y, i.e., the expected coordinates of a randomly chosen point.
(iii) Find the correlation between X and Y. Would the correlation change if the units are measured in inches?
Solution
(i) $f_{X,Y}(x, y) = \dfrac{1}{\text{Area}} = 2$ for $(x, y)$ in the triangle, and 0 otherwise.
(ii) $f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y')\,dy' = \int_0^{1-x} 2\,dy' = 2(1 - x)$, for $0 < x < 1$.
$f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x', y)\,dx' = \int_0^{1-y} 2\,dx' = 2(1 - y)$, for $0 < y < 1$.
Hence,
\[ E[X] = 2\int_0^1 x(1 - x)\,dx = 2\left[\frac{x^2}{2} - \frac{x^3}{3}\right]_0^1 = \frac{1}{3} \]
and $E[Y] = 2\int_0^1 y(1 - y)\,dy = \frac{1}{3}$.
(iii) To find the correlation between X and Y, we consider
\begin{align*}
E[XY] &= 2\int_0^1 \int_0^{1-y} xy\,dx\,dy = 2\int_0^1 y\left[\frac{x^2}{2}\right]_0^{1-y} dy \\
&= \int_0^1 y(1 - 2y + y^2)\,dy = \left[\frac{y^2}{2} - \frac{2}{3}y^3 + \frac{y^4}{4}\right]_0^1 = \frac{1}{12}.
\end{align*}
\[ \text{COV}(X, Y) = E[XY] - E[X]E[Y] = \frac{1}{12} - \left(\frac{1}{3}\right)^2 = -\frac{1}{36}. \]
\[ E[X^2] = 2\int_0^1 x^2(1 - x)\,dx = 2\left[\frac{x^3}{3} - \frac{x^4}{4}\right]_0^1 = \frac{1}{6}, \]
so
\[ \text{VAR}(X) = E[X^2] - [E[X]]^2 = \frac{1}{6} - \left(\frac{1}{3}\right)^2 = \frac{1}{18}. \]
Similarly, we obtain $\text{VAR}(Y) = \frac{1}{18}$.
\[ \rho_{XY} = \frac{\text{COV}(X, Y)}{\sqrt{\text{VAR}(X)}\sqrt{\text{VAR}(Y)}} = \frac{-\frac{1}{36}}{\frac{1}{18}} = -\frac{1}{2}. \]
Since $\rho(aX, bY) = \dfrac{\text{COV}(aX, bY)}{\sigma(aX)\,\sigma(bY)} = \dfrac{ab\,\text{COV}(X, Y)}{a\sigma(X)\,b\sigma(Y)} = \rho(X, Y)$ for any positive scalars a and b, the correlation would not change if the units are measured in inches.
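A Monte Carlo check on Example 6 (a sketch, not part of the original notes): sample points uniformly from the triangle by rejection from the unit square and estimate the means and the correlation, which should be near $1/3$, $1/3$ and $-1/2$.

```python
import random

random.seed(0)                 # fixed seed for reproducibility
xs, ys = [], []
while len(xs) < 200_000:
    x, y = random.random(), random.random()
    if x + y < 1:              # keep only points inside the triangle
        xs.append(x)
        ys.append(y)

n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n
cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
vx = sum((a - mx) ** 2 for a in xs) / n
vy = sum((b - my) ** 2 for b in ys) / n
rho = cov / (vx * vy) ** 0.5   # sample correlation, expect about -0.5
```

Note the negative correlation: on the triangle, a large X forces Y to be small.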
Example 7 Let X, Y, Z be independent and uniformly distributed over (0, 1). Compute $P\{X \ge YZ\}$.
Solution Since X, Y, Z are independent, we have
\[ f_{X,Y,Z}(x, y, z) = f_X(x)f_Y(y)f_Z(z) = 1, \quad 0 \le x \le 1,\ 0 \le y \le 1,\ 0 \le z \le 1. \]
Therefore,
\begin{align*}
P[X \ge YZ] &= \iiint_{x \ge yz} f_{X,Y,Z}(x, y, z)\,dx\,dy\,dz \\
&= \int_0^1 \int_0^1 \int_{yz}^1 dx\,dy\,dz = \int_0^1 \int_0^1 (1 - yz)\,dy\,dz \\
&= \int_0^1 \left(1 - \frac{z}{2}\right) dz = \frac{3}{4}.
\end{align*}
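Example 7 is easy to confirm by simulation; this sketch (not part of the original notes) estimates $P[X \ge YZ]$ directly.

```python
import random

random.seed(1)                 # fixed seed for reproducibility
trials = 200_000
hits = sum(random.random() >= random.random() * random.random()
           for _ in range(trials))
estimate = hits / trials       # expect about 3/4
```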
Example 8 The joint density of X and Y is given by
\[ f(x, y) = \begin{cases} e^{-(x+y)} & 0 < x < \infty,\ 0 < y < \infty \\ 0 & \text{otherwise} \end{cases}. \]
Find the density function of the random variable X/Y.
Solution We start by computing the distribution function of X/Y. For a > 0,
\begin{align*}
F_{X/Y}(a) = P\left[\frac{X}{Y} \le a\right] &= \iint_{x/y \le a} e^{-(x+y)}\,dx\,dy = \int_0^\infty \int_0^{ay} e^{-(x+y)}\,dx\,dy \\
&= \int_0^\infty (1 - e^{-ay})e^{-y}\,dy = \left[-e^{-y} + \frac{e^{-(a+1)y}}{a + 1}\right]_0^\infty \\
&= 1 - \frac{1}{a + 1} = \frac{a}{a + 1}.
\end{align*}
By differentiating $F_{X/Y}(a)$ with respect to a, the density function of X/Y is given by
\[ f_{X/Y}(a) = \frac{1}{(a + 1)^2}, \quad 0 < a < \infty. \]
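A simulation check on Example 8 (a sketch, not part of the original notes): for independent unit-rate exponentials, the empirical cdf of X/Y should match $a/(a + 1)$.

```python
import random

random.seed(2)                 # fixed seed for reproducibility
trials = 200_000
samples = [random.expovariate(1.0) / random.expovariate(1.0)
           for _ in range(trials)]
f_at_1 = sum(s <= 1.0 for s in samples) / trials   # expect 1/(1+1) = 1/2
f_at_3 = sum(s <= 3.0 for s in samples) / trials   # expect 3/(3+1) = 3/4
```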
Example 9 Let X and Y be a pair of independent random variables, where X is uniformly distributed in the interval (-1, 1) and Y is uniformly distributed in the interval (-4, -1). Find the pdf of Z = XY.
Solution Given Y = y, Z = XY is a scaled version of X. Recall that if $U = \alpha W + \beta$ with $\alpha \ne 0$, then $f_U(u) = \frac{1}{|\alpha|} f_W\!\left(\frac{u - \beta}{\alpha}\right)$. Hence $f_Z(z|y) = \frac{1}{|y|} f_X\!\left(\frac{z}{y}\right)$, and the pdf of Z is given by
\[ f_Z(z) = \int_{-\infty}^{\infty} \frac{1}{|y|} f_X\!\left(\frac{z}{y}\right) f_Y(y)\,dy = \int_{-\infty}^{\infty} \frac{1}{|y|} f_{XY}\!\left(\frac{z}{y}, y\right) dy. \]
Since X is uniformly distributed over (-1, 1), $f_X(x) = \begin{cases} \frac{1}{2} & -1 < x < 1 \\ 0 & \text{otherwise} \end{cases}$. Similarly, Y is uniformly distributed over (-4, -1), so $f_Y(y) = \begin{cases} \frac{1}{3} & -4 < y < -1 \\ 0 & \text{otherwise} \end{cases}$. As X and Y are independent,
\[ f_{XY}\!\left(\frac{z}{y}, y\right) = f_X\!\left(\frac{z}{y}\right) f_Y(y) = \begin{cases} \frac{1}{6} & -1 < \frac{z}{y} < 1 \text{ and } -4 < y < -1 \\ 0 & \text{otherwise} \end{cases}. \]
We need $-1 < z/y < 1$, which is equivalent to $|z| < |y|$. Consider the following cases:
(i) $|z| > 4$: now $-1 < z/y < 1$ is never satisfied, so $f_{XY}\!\left(\frac{z}{y}, y\right) = 0$ and $f_Z(z) = 0$.
(ii) $|z| < 1$: in this case, $-1 < z/y < 1$ is automatically satisfied for all $-4 < y < -1$, so
\[ f_Z(z) = \int_{-4}^{-1} \frac{1}{|y|} \cdot \frac{1}{6}\,dy = \int_{-4}^{-1} -\frac{1}{6y}\,dy = -\frac{1}{6}\ln|y|\,\Big|_{-4}^{-1} = \frac{\ln 4}{6}. \]
(iii) $1 < |z| < 4$: note that $f_{XY}\!\left(\frac{z}{y}, y\right) = \frac{1}{6}$ only for $-4 < y < -|z|$, so
\[ f_Z(z) = \int_{-4}^{-|z|} \frac{1}{|y|} \cdot \frac{1}{6}\,dy = -\frac{1}{6}\ln|y|\,\Big|_{-4}^{-|z|} = \frac{1}{6}\left[\ln 4 - \ln|z|\right]. \]
In summary, $f_Z(z) = \begin{cases} \frac{\ln 4}{6} & |z| < 1 \\ \frac{1}{6}\left[\ln 4 - \ln|z|\right] & 1 < |z| < 4 \\ 0 & \text{otherwise} \end{cases}$.
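A simulation check on Example 9 (a sketch, not part of the original notes): sampling Z = XY directly, the probability of the flat middle piece, $P(|Z| < 1) = \int_{-1}^{1} \frac{\ln 4}{6}\,dz = \frac{\ln 4}{3}$, should match the empirical frequency, and no sample should fall outside $(-4, 4)$.

```python
import math
import random

random.seed(3)                 # fixed seed for reproducibility
trials = 200_000
zs = [random.uniform(-1, 1) * random.uniform(-4, -1) for _ in range(trials)]

p_inner = sum(abs(z) < 1 for z in zs) / trials
expected = math.log(4) / 3     # about 0.462, from the derived pdf
```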
Remark Check that
\begin{align*}
\int_{-\infty}^{\infty} f_Z(z)\,dz &= \int_{-4}^{-1} \frac{1}{6}\left[\ln 4 - \ln|z|\right] dz + \int_{-1}^{1} \frac{\ln 4}{6}\,dz + \int_{1}^{4} \frac{1}{6}\left[\ln 4 - \ln|z|\right] dz \\
&= \int_{-4}^{4} \frac{\ln 4}{6}\,dz - 2\int_1^4 \frac{\ln z}{6}\,dz \\
&= \frac{8}{6}\ln 4 - \frac{1}{3}\left[z\ln z - z\right]_1^4 = 1.
\end{align*}
Example 10 Let X and Y be two independent Gaussian random variables with zero mean and unit variance. Find the pdf of Z = |X - Y|.
Solution We try to find $F_Z(z) = P[Z \le z]$. Note that $z \ge 0$ since Z is a non-negative random variable.
(Figure: the line $y = x$ divides the plane into the region $y > x$, where $Z = Y - X$, and the region $x > y$, where $Z = X - Y$.)
Consider the two separate cases x > y and x < y. When X = Y, Z is identically zero.
(i) x > y: here $Z \le z \iff x - y \le z$, $z \ge 0$; that is, $x - z \le y < x$ (a strip of width z below the line $y = x$). The contribution to $F_Z(z)$ from this region is
\[ \int_{-\infty}^{\infty} \int_{x-z}^{x} f_{XY}(x, y)\,dy\,dx, \]
and its derivative with respect to z is
\begin{align*}
\int_{-\infty}^{\infty} f_{XY}(x, x - z)\,dx &= \int_{-\infty}^{\infty} \frac{1}{2\pi} e^{-[x^2 + (x-z)^2]/2}\,dx \\
&= \frac{1}{2\pi} e^{-z^2/4} \int_{-\infty}^{\infty} e^{-\left(x - \frac{z}{2}\right)^2}\,dx = \frac{1}{2\sqrt{\pi}}\,e^{-z^2/4}.
\end{align*}
(ii) x < y: here $Z \le z \iff y - x \le z$, $z \ge 0$; that is, $x < y \le x + z$. The contribution to $F_Z(z)$ from this region is
\[ \int_{-\infty}^{\infty} \int_{x}^{x+z} f_{XY}(x, y)\,dy\,dx, \]
and its derivative with respect to z is
\[ \int_{-\infty}^{\infty} f_{XY}(x, x + z)\,dx = \frac{1}{2\pi} e^{-z^2/4} \int_{-\infty}^{\infty} e^{-\left(x + \frac{z}{2}\right)^2}\,dx = \frac{1}{2\sqrt{\pi}}\,e^{-z^2/4}. \]
Adding the contributions from the two cases gives
\[ f_Z(z) = \frac{1}{\sqrt{\pi}}\,e^{-z^2/4}, \quad z \ge 0. \]
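A simulation check on Example 10 (a sketch, not part of the original notes): integrating the density $\frac{1}{\sqrt{\pi}}e^{-z^2/4}$ from 0 to $z$ gives $\operatorname{erf}(z/2)$, so the empirical cdf of $|X - Y|$ should match `math.erf(z / 2)`.

```python
import math
import random

random.seed(4)                 # fixed seed for reproducibility
trials = 200_000
p_hat = sum(abs(random.gauss(0, 1) - random.gauss(0, 1)) <= 1.0
            for _ in range(trials)) / trials
p_exact = math.erf(0.5)        # integral of the derived pdf from 0 to 1
```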
Example 11 Suppose two persons A and B come to two separate counters for service. Let their service times be independent exponential random variables with parameters $\lambda_A$ and $\lambda_B$, respectively. Find the probability that B leaves before A.
Solution Let $T_A$ and $T_B$ denote the continuous random service times of A and B, respectively. Recall that the expected values of the service times are $E[T_A] = \frac{1}{\lambda_A}$ and $E[T_B] = \frac{1}{\lambda_B}$; that is, a higher value of $\lambda$ implies a shorter average service time. One would expect
\[ P[T_A > T_B] : P[T_B > T_A] = \frac{1}{\lambda_A} : \frac{1}{\lambda_B}, \]
and together with $P[T_A > T_B] + P[T_B > T_A] = 1$, we obtain
\[ P[T_A > T_B] = \frac{\lambda_B}{\lambda_A + \lambda_B} \quad \text{and} \quad P[T_B > T_A] = \frac{\lambda_A}{\lambda_A + \lambda_B}. \]
Justification: Since $T_A$ and $T_B$ are independent exponential random variables, their joint density $f_{T_A,T_B}(t_A, t_B)$ satisfies
\begin{align*}
f_{T_A,T_B}(t_A, t_B)\,dt_A\,dt_B &= P[t_A < T_A < t_A + dt_A,\ t_B < T_B < t_B + dt_B] \\
&= P[t_A < T_A < t_A + dt_A]\,P[t_B < T_B < t_B + dt_B] \\
&= \left(\lambda_A e^{-\lambda_A t_A}\,dt_A\right)\left(\lambda_B e^{-\lambda_B t_B}\,dt_B\right).
\end{align*}
Hence
\begin{align*}
P[T_A > T_B] &= \int_0^\infty \int_0^{t_A} \lambda_A \lambda_B e^{-\lambda_A t_A} e^{-\lambda_B t_B}\,dt_B\,dt_A \\
&= \int_0^\infty \lambda_A e^{-\lambda_A t_A}\left(1 - e^{-\lambda_B t_A}\right) dt_A \\
&= \int_0^\infty \lambda_A e^{-\lambda_A t_A}\,dt_A - \int_0^\infty \lambda_A e^{-(\lambda_A + \lambda_B)t_A}\,dt_A \\
&= 1 - \frac{\lambda_A}{\lambda_A + \lambda_B} = \frac{\lambda_B}{\lambda_A + \lambda_B}.
\end{align*}
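A simulation check on Example 11 (a sketch, not part of the original notes; the rates $\lambda_A = 1$ and $\lambda_B = 3$ are illustrative choices): the fraction of trials in which B finishes first should approach $\lambda_B/(\lambda_A + \lambda_B) = 3/4$.

```python
import random

random.seed(5)                 # fixed seed for reproducibility
lam_a, lam_b = 1.0, 3.0        # illustrative service rates
trials = 200_000
wins_b = sum(random.expovariate(lam_b) < random.expovariate(lam_a)
             for _ in range(trials))
estimate = wins_b / trials     # expect about lam_b / (lam_a + lam_b) = 0.75
```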