
Week 9: Exercises

Date: November 10, 2024

Exercise 1. Let the continuous random variables X and Y have joint probability density function



\[
f_{X,Y}(x, y) =
\begin{cases}
1/x & \text{if } 0 < y < x < 1,\\
0 & \text{otherwise.}
\end{cases}
\]
(a). Compute E(X) and E(Y);
(b). Compute the conditional pdf of Y given X = x, for all 0 < x < 1;
(c). Compute Cov(X, Y).

Solution 1.
(a).
\begin{align*}
E(X) &= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x \, f_{X,Y}(x, y)\,dy\,dx
      = \int_0^1 \int_0^x x \cdot \frac{1}{x}\,dy\,dx
      = \int_0^1 x\,dx = \frac{1}{2}, \\
E(Y) &= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} y \, f_{X,Y}(x, y)\,dy\,dx
      = \int_0^1 \int_0^x y \cdot \frac{1}{x}\,dy\,dx
      = \int_0^1 \frac{x}{2}\,dx = \frac{1}{4}.
\end{align*}
(b). The marginal pdf of X is $f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dy = \int_0^x \frac{1}{x}\,dy = 1$ for $0 < x < 1$ (and equals zero otherwise). That is, X is uniform on the interval from 0 to 1. So the conditional pdf of Y given X = x is
\[
f_{Y|X}(y \mid x) = \frac{f_{X,Y}(x, y)}{f_X(x)} = \frac{1}{x}, \quad \text{for } 0 < y < x,
\]
and 0 otherwise.
(c).
\[
E(XY) = \int_0^1 \int_0^x xy \cdot \frac{1}{x}\,dy\,dx = \int_0^1 \frac{x^2}{2}\,dx = \frac{1}{6}.
\]
So,
\[
\operatorname{Cov}(X, Y) = E(XY) - E(X)E(Y) = \frac{1}{6} - \frac{1}{2}\cdot\frac{1}{4} = \frac{1}{24}.
\]
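(As a quick numerical sanity check, not part of the original solution, the sketch below samples from the joint density via the factorization from part (b): X is uniform on (0, 1) and, given X = x, Y is uniform on (0, x). It assumes NumPy is available; variable names such as `rng` and `n` are illustrative.)

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Sample (X, Y) from f_{X,Y}: X ~ Uniform(0, 1), then Y | X = x ~ Uniform(0, x).
x = rng.uniform(0.0, 1.0, size=n)
y = rng.uniform(0.0, x)

print(x.mean())            # close to 1/2  = E(X)
print(y.mean())            # close to 1/4  = E(Y)
print(np.cov(x, y)[0, 1])  # close to 1/24 ~ 0.0417 = Cov(X, Y)
```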

Exercise 2. For any two random variables X and Y, show that
\[
\operatorname{Var}(X) = E\big(\operatorname{Var}(X \mid Y)\big) + \operatorname{Var}\big(E(X \mid Y)\big).
\]

Solution 2. Note the fact that
\[
E(X) = E\big(E(X \mid Y)\big). \tag{1}
\]
By definition, we have
\begin{align*}
\operatorname{Var}(X) &= E\big[(X - E(X))^2\big] = E\Big[\big(X - E(X \mid Y) + E(X \mid Y) - E(X)\big)^2\Big] \\
&= E\Big[\big(X - E(X \mid Y)\big)^2\Big] + E\Big[\big(E(X \mid Y) - E(X)\big)^2\Big]
   + 2\,E\Big[\big(X - E(X \mid Y)\big)\big(E(X \mid Y) - E(X)\big)\Big].
\end{align*}
The last term is equal to 0, which can be derived from (1):
\[
E\Big[\big(X - E(X \mid Y)\big)\big(E(X \mid Y) - E(X)\big)\Big]
= E\Big[\, E\Big[\big(X - E(X \mid Y)\big)\big(E(X \mid Y) - E(X)\big) \,\Big|\, Y\Big] \Big].
\]
In the conditional distribution of X given Y, X is the random variable, so in the inner expectation
\[
E\Big[\big(X - E(X \mid Y)\big)\big(E(X \mid Y) - E(X)\big) \,\Big|\, Y\Big],
\]
both E(X | Y) and E(X) are constants. Thus,
\begin{align*}
E\Big[\big(X - E(X \mid Y)\big)\big(E(X \mid Y) - E(X)\big) \,\Big|\, Y\Big]
&= \big(E(X \mid Y) - E(X)\big)\, E\big(X - E(X \mid Y) \,\big|\, Y\big) \\
&= \big(E(X \mid Y) - E(X)\big)\big(E(X \mid Y) - E(X \mid Y)\big) \\
&= 0,
\end{align*}
so the cross term equals E(0) = 0. Moreover, we see that
\[
E\Big[\big(X - E(X \mid Y)\big)^2\Big] = E\Big[\, E\Big(\big(X - E(X \mid Y)\big)^2 \,\Big|\, Y\Big) \Big] = E\big(\operatorname{Var}(X \mid Y)\big)
\]
and, since E(E(X | Y)) = E(X) by (1),
\[
E\Big[\big(E(X \mid Y) - E(X)\big)^2\Big] = \operatorname{Var}\big(E(X \mid Y)\big),
\]
which establishes the result.
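(To make the identity concrete, here is a small numerical illustration, not part of the original solution: a two-component mixture in which both sides of the decomposition can be computed exactly and checked by simulation. It assumes NumPy; the specific parameters are illustrative.)

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Y ~ Bernoulli(0.3); given Y = 0, X ~ N(0, 1); given Y = 1, X ~ N(2, 4).
y = rng.binomial(1, 0.3, size=n)
x = np.where(y == 0, rng.normal(0.0, 1.0, n), rng.normal(2.0, 2.0, n))

e_var_x_given_y = 0.7 * 1.0 + 0.3 * 4.0        # E(Var(X|Y)) = 1.9
var_e_x_given_y = 0.3 * 0.7 * (2.0 - 0.0)**2   # Var(E(X|Y)) = 0.84

print(x.var())                             # close to 2.74
print(e_var_x_given_y + var_e_x_given_y)   # exactly 2.74
```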

Exercise 3. Let X be a random variable. Define the moment generating function of X, denoted by $M_X(t) = E(e^{tX})$, provided that the expectation exists for $t$ in some neighborhood of 0. If the moment generating functions of X and Y exist and $M_X(t) = M_Y(t)$ for all $t$ in the same neighborhood of 0, then $F_X(u) = F_Y(u)$ for all $u$.
(a). Compute $M_X(t)$ if $X \sim N(\mu_1, \sigma_1^2)$;
(b). Compute $M_{X+Y}(t)$ if $X \sim N(\mu_1, \sigma_1^2)$, $Y \sim N(\mu_2, \sigma_2^2)$ and X, Y are independent;
(c). Show that $X + Y \sim N(\mu_1 + \mu_2, \sigma_1^2 + \sigma_2^2)$ if $X \sim N(\mu_1, \sigma_1^2)$, $Y \sim N(\mu_2, \sigma_2^2)$ and X, Y are independent.

Solution 3.

(a).
\begin{align*}
M_X(t) = E(e^{tX})
&= \int_{-\infty}^{\infty} e^{tx}\,\frac{1}{\sqrt{2\pi\sigma_1^2}} \exp\!\Big(-\frac{(x-\mu_1)^2}{2\sigma_1^2}\Big)\,dx \\
&= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi\sigma_1^2}} \exp\!\Big(-\frac{x^2 - 2(\mu_1+\sigma_1^2 t)x + (\mu_1+\sigma_1^2 t)^2 - \sigma_1^4 t^2 - 2\mu_1\sigma_1^2 t}{2\sigma_1^2}\Big)\,dx \\
&= \exp\!\Big(\mu_1 t + \frac{\sigma_1^2 t^2}{2}\Big) \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi\sigma_1^2}} \exp\!\Big(-\frac{\big(x-(\mu_1+\sigma_1^2 t)\big)^2}{2\sigma_1^2}\Big)\,dx \\
&= \exp\!\Big(\mu_1 t + \frac{\sigma_1^2 t^2}{2}\Big).
\end{align*}
(The integral $\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi\sigma_1^2}} \exp\!\big(-\frac{(x-(\mu_1+\sigma_1^2 t))^2}{2\sigma_1^2}\big)\,dx = 1$ since it is the integral of the density function of $N(\mu_1+\sigma_1^2 t, \sigma_1^2)$.)
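(A brief numerical check of this closed form, not part of the original solution: it assumes NumPy, and the values of $\mu_1$, $\sigma_1$ and $t$ below are illustrative.)

```python
import numpy as np

rng = np.random.default_rng(0)
mu1, sigma1 = 1.0, 0.5                       # illustrative parameters
x = rng.normal(mu1, sigma1, size=1_000_000)  # X ~ N(mu1, sigma1^2)

for t in (-1.0, 0.5, 2.0):
    empirical = np.exp(t * x).mean()                      # sample estimate of E(e^{tX})
    closed_form = np.exp(mu1 * t + sigma1**2 * t**2 / 2)  # exp(mu1*t + sigma1^2 t^2 / 2)
    print(t, empirical, closed_form)                      # the two values should agree closely
```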


(b).
\begin{align*}
M_{X+Y}(t) = E\big(e^{t(X+Y)}\big) = E\big(e^{tX} \cdot e^{tY}\big) &= E(e^{tX})\,E(e^{tY}) \\
&= \exp\!\Big(\mu_1 t + \frac{\sigma_1^2 t^2}{2}\Big) \cdot \exp\!\Big(\mu_2 t + \frac{\sigma_2^2 t^2}{2}\Big) \\
&= \exp\!\Big(\mu_1 t + \frac{\sigma_1^2 t^2}{2} + \mu_2 t + \frac{\sigma_2^2 t^2}{2}\Big).
\end{align*}
(c). To show $X + Y \sim N(\mu_1+\mu_2, \sigma_1^2+\sigma_2^2)$, we just need to show $M_Z(t) = M_{X+Y}(t)$ if $Z \sim N(\mu_1+\mu_2, \sigma_1^2+\sigma_2^2)$.
\begin{align*}
M_Z(t) = E(e^{tZ})
&= \int_{-\infty}^{\infty} e^{tz}\, \frac{1}{\sqrt{2\pi(\sigma_1^2+\sigma_2^2)}} \exp\!\Big(-\frac{(z-\mu_1-\mu_2)^2}{2(\sigma_1^2+\sigma_2^2)}\Big)\,dz \\
&= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi(\sigma_1^2+\sigma_2^2)}} \exp\!\Big(-\frac{z^2 - 2(\mu_1+\mu_2+\sigma_1^2 t+\sigma_2^2 t)z + (\mu_1+\mu_2+\sigma_1^2 t+\sigma_2^2 t)^2}{2(\sigma_1^2+\sigma_2^2)}\Big) \\
&\qquad\qquad \cdot \exp\!\Big(\mu_1 t + \frac{\sigma_1^2 t^2}{2} + \mu_2 t + \frac{\sigma_2^2 t^2}{2}\Big)\,dz \\
&= \exp\!\Big(\mu_1 t + \frac{\sigma_1^2 t^2}{2} + \mu_2 t + \frac{\sigma_2^2 t^2}{2}\Big)
   \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi(\sigma_1^2+\sigma_2^2)}} \exp\!\Big(-\frac{\big(z-(\mu_1+\mu_2+\sigma_1^2 t+\sigma_2^2 t)\big)^2}{2(\sigma_1^2+\sigma_2^2)}\Big)\,dz \\
&= \exp\!\Big(\mu_1 t + \frac{\sigma_1^2 t^2}{2} + \mu_2 t + \frac{\sigma_2^2 t^2}{2}\Big) \\
&= M_{X+Y}(t),
\end{align*}
which proves the result.
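(A final numerical sanity check, not part of the original solution: simulating independent normals and comparing the moments and the empirical MGF of $X+Y$ with those of $N(\mu_1+\mu_2, \sigma_1^2+\sigma_2^2)$. It assumes NumPy; the parameter values are illustrative.)

```python
import numpy as np

rng = np.random.default_rng(0)
mu1, s1, mu2, s2 = 1.0, 0.5, -2.0, 1.5   # illustrative parameters
n = 1_000_000

# Independent X ~ N(mu1, s1^2) and Y ~ N(mu2, s2^2), summed elementwise.
z = rng.normal(mu1, s1, n) + rng.normal(mu2, s2, n)

print(z.mean(), mu1 + mu2)        # both close to -1.0
print(z.var(), s1**2 + s2**2)     # both close to 2.5
for t in (-0.5, 0.5):
    print(np.exp(t * z).mean(),                                  # empirical M_{X+Y}(t)
          np.exp((mu1 + mu2) * t + (s1**2 + s2**2) * t**2 / 2))  # M_Z(t) from part (c)
```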
