Transformations of Two Random Variables
Problem: (X, Y) is a bivariate rv. Find the distribution of Z = g(X, Y).
• The very 1st step: specify the support of Z.
• X, Y are discrete – straightforward; see Example 0(a)(b) from
Transformation of Several Random Variables.pdf.
• X, Y are continuous
– The CDF approach (the basic, off-the-shelf method)
– Special formula (convolution) for Z = X + Y
– MGF approach for sums of multiple independent rvs.
Examples of the CDF Approach
Example 3(f) from Note Mixture Joint 0830.pdf
Example (p83, Exercise 2.1.6) Let f(x, y) = e^{−x−y}, 0 < x, y < ∞, zero
elsewhere, be the joint pdf of (X, Y).

a) Find the pdf of Z = X + Y.

F_Z(z) = P(X + Y ≤ z) = ∫_0^z dx ∫_0^{z−x} e^{−x−y} dy
       = 1 − e^{−z} − z e^{−z},  z > 0
f_Z(z) = z e^{−z},  z > 0,  i.e., Z ∼ Ga(2, 1)
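As a quick numerical sanity check (not part of the original exercise), the following minimal Python sketch, assuming NumPy and SciPy are available, compares a histogram of simulated X + Y values with the Ga(2, 1) density z e^{−z}:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    x = rng.exponential(scale=1.0, size=100_000)   # X ~ Exp(1)
    y = rng.exponential(scale=1.0, size=100_000)   # Y ~ Exp(1), independent of X
    z = x + y

    # Histogram density of Z versus the Ga(2, 1) pdf z*exp(-z) at the bin midpoints.
    hist, edges = np.histogram(z, bins=60, range=(0, 10), density=True)
    mid = 0.5 * (edges[1:] + edges[:-1])
    print(np.max(np.abs(hist - stats.gamma.pdf(mid, a=2, scale=1))))   # small (~0.01)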
b) Find the pdf of W = 2X + Y.
F_W(w) = P(2X + Y ≤ w) = ∫_0^{w/2} dx ∫_0^{w−2x} e^{−x−y} dy
       = 1 + e^{−w} − 2e^{−w/2},  w > 0
f_W(w) = e^{−w/2} − e^{−w},  w > 0.
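A numerical check (illustration only, assuming SciPy): evaluate the double integral directly with scipy.integrate.dblquad and compare with the closed form 1 + e^{−w} − 2e^{−w/2}:

    import numpy as np
    from scipy import integrate

    def cdf_numeric(w):
        # P(2X + Y <= w) = int_0^{w/2} dx int_0^{w-2x} e^{-x-y} dy
        val, _ = integrate.dblquad(lambda y, x: np.exp(-x - y),
                                   0, w / 2,                          # outer variable x
                                   lambda x: 0, lambda x: w - 2 * x)  # inner variable y
        return val

    for w in (0.5, 1.0, 3.0):
        print(w, cdf_numeric(w), 1 + np.exp(-w) - 2 * np.exp(-w / 2))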
c) Find the pdf of V = Y/X.

F_V(v) = P(Y/X ≤ v) = P(Y ≤ vX)   (since X > 0)
       = ∫_0^∞ dx ∫_0^{vx} e^{−x−y} dy
       = 1 − 1/(v + 1),  v > 0
f_V(v) = 1/(v + 1)²,  v > 0
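As an illustration (assuming NumPy), the empirical CDF of simulated V = Y/X can be compared with F_V(v) = 1 − 1/(v + 1):

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.exponential(size=200_000)
    y = rng.exponential(size=200_000)
    v = y / x

    for q in (0.5, 1.0, 2.0, 5.0):
        print(q, (v <= q).mean(), 1 - 1 / (q + 1))   # empirical vs. F_V(q)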
d) Find the pdf of U = Y/(X + Y).

For 0 < u < 1, the event {Y/(X + Y) ≤ u} equals {Y ≤ uX/(1 − u)}, so
F_U(u) = P(Y ≤ uX/(1 − u)) = ∫_0^∞ dx ∫_0^{ux/(1−u)} e^{−x−y} dy
       = u,  0 < u < 1
f_U(u) = 1,  0 < u < 1,  i.e., U ∼ Unif(0, 1).
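Since f_U(u) = 1 on (0, 1) says U is Uniform(0, 1), a quick check (a sketch, assuming SciPy) is a Kolmogorov–Smirnov test of simulated U values against the uniform distribution:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    x = rng.exponential(size=100_000)
    y = rng.exponential(size=100_000)
    u = y / (x + y)

    # If U ~ Unif(0, 1), the KS statistic is tiny and the p-value is not small.
    print(stats.kstest(u, "uniform"))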
e) Find the pdf of T = X − Y.

For t > 0:
F_T(t) = P(X ≤ t + Y) = ∫_0^∞ dy ∫_0^{t+y} e^{−x−y} dx = 1 − e^{−t}/2,  t > 0
f_T(t) = e^{−t}/2,  t > 0;
For t < 0 (the inner integral is non-empty only when y > −t):
F_T(t) = P(X ≤ t + Y) = ∫_{−t}^∞ dy ∫_0^{t+y} e^{−x−y} dx = e^{t}/2,  t < 0
f_T(t) = e^{t}/2,  t < 0.
That is, f_T(t) = e^{−|t|}/2 (Double Exponential).
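The result can be compared with SciPy's Laplace distribution (illustration only; scipy.stats.laplace with loc = 0, scale = 1 has pdf e^{−|t|}/2):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    t = rng.exponential(size=200_000) - rng.exponential(size=200_000)   # T = X - Y

    # Compare a few sample quantiles of T with the Laplace(0, 1) quantiles.
    for p in (0.1, 0.5, 0.9):
        print(p, np.quantile(t, p), stats.laplace.ppf(p))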
The Convolution Formula
• Suppose X, Y are discrete; the pmf of W = X + Y is given by
  p_W(w) = Σ_{(x,y): x+y=w} p(x, y) = Σ_x p(x, w − x).   (1)
• What if X, Y are continuous with pdf f(x, y)? Here is the guess:
  follow eq (1) but replace the sum with an integral,
  f_W(w) = ∫_{−∞}^{∞} f(x, w − x) dx.
Our guess turns out to be correct; the rigorous proof is given on
the next slide.
Let X, Y be continuous random variables with joint pdf f(x, y). Then
the pdf of W = X + Y is given by
  f_W(w) = ∫_{−∞}^{∞} f(x, w − x) dx = ∫_{−∞}^{∞} f(w − y, y) dy.
Proof:
F_W(w) = P(X + Y ≤ w) = ∫_{−∞}^{∞} dx ∫_{−∞}^{w−x} f(x, y) dy
Change of variables: y = u − x, i.e., u = x + y.
       = ∫_{−∞}^{∞} dx ∫_{−∞}^{w} f(x, u − x) du
       = ∫_{−∞}^{w} du ∫_{−∞}^{∞} f(x, u − x) dx
f_W(w) = dF_W(w)/dw = ∫_{−∞}^{∞} f(x, w − x) dx
Don’t forget about the range of w and the range of (w − x)!
• Example 2.1.6 (a) (Revisit): X, Y ∼ Exp(1) and they are
  independent. Find the pdf of Z = X + Y.
  f_Z(z) = ∫_{−∞}^{∞} f(x, z − x) dx = ∫_0^z e^{−x} e^{−(z−x)} dx = z e^{−z},  so Z ∼ Ga(2, 1).
  Note that the joint pdf f(x, y) = f_X(x) f_Y(y) is non-zero only when
  x, y > 0; that is, f(x, z − x) > 0 only if 0 < x < z. (A numerical check
  of this integral is sketched below.)
• Example 2 from Convolution2.pdf.
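As an illustration of the convolution formula (not from the textbook), the integral for the Exp(1) example can also be evaluated numerically, keeping the support restriction 0 < x < z, and compared with the Ga(2, 1) pdf:

    import numpy as np
    from scipy import integrate, stats

    def conv_pdf(z):
        # f_Z(z) = int f_X(x) f_Y(z - x) dx; the integrand is non-zero only for 0 < x < z.
        val, _ = integrate.quad(lambda x: np.exp(-x) * np.exp(-(z - x)), 0, z)
        return val

    for z in (0.5, 1.0, 2.0, 4.0):
        print(z, conv_pdf(z), stats.gamma.pdf(z, a=2, scale=1))   # both equal z*exp(-z)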
The MGF Approach
If X, Y are independent, then
  M_{X+Y}(t) = E e^{t(X+Y)} = E(e^{tX} · e^{tY}) = (E e^{tX})(E e^{tY}) = M_X(t) M_Y(t).
• Example 3 from Convolution2.pdf.
• Example 2.1.6 (a)(Revisit): X, Y ∼ Exp(1) and they are
independent. Find the pdf of Z = X + Y.
Note: don't confuse the equality above with M_{X,Y}(t_1, t_2) = M_X(t_1) M_Y(t_2) for
independent X, Y.
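For the revisited example, a small symbolic check (a sketch, assuming SymPy is available) confirms that M_X(t) M_Y(t) = (1 − t)^{−2} for t < 1, which is the MGF of Ga(2, 1):

    import sympy as sp

    t, x = sp.symbols("t x", real=True)

    # MGF of Exp(1): E e^{tX} = int_0^oo e^{tx} e^{-x} dx, which equals 1/(1 - t) for t < 1.
    M = sp.simplify(sp.integrate(sp.exp(t * x) * sp.exp(-x), (x, 0, sp.oo), conds="none"))
    print(M)                   # equals 1/(1 - t)
    print(sp.simplify(M * M))  # equals 1/(1 - t)**2, the MGF of Ga(2, 1)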
Additivity of Random Variables
If X, Y are independent,
• X ∼ Bin(n1, p), Y ∼ Bin(n2, p) ⟹ X + Y ∼ Bin(n1 + n2, p)
• X ∼ Po(λ1), Y ∼ Po(λ2) ⟹ X + Y ∼ Po(λ1 + λ2)
• X ∼ NB(r1, p), Y ∼ NB(r2, p) ⟹ X + Y ∼ NB(r1 + r2, p)  (NB = Negative Binomial)
• X ∼ Geo(p), Y ∼ Geo(p) ⟹ X + Y ∼ NB(r = 2, p)
• X ∼ Ga(α1, β), Y ∼ Ga(α2, β) ⟹ X + Y ∼ Ga(α1 + α2, β)
• X ∼ Ex(λ), Y ∼ Ex(λ) ⟹ X + Y ∼ Ga(2, 1/λ)
• X ∼ χ²(r1), Y ∼ χ²(r2) ⟹ X + Y ∼ χ²(r1 + r2)
• X ∼ No(µ1, σ1²), Y ∼ No(µ2, σ2²) ⟹ X + Y ∼ No(µ1 + µ2, σ1² + σ2²)
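As an illustration only (not in the original list), one entry can be checked by simulation; the sketch below, assuming NumPy and SciPy, compares the empirical pmf of a sum of independent Po(2) and Po(3) samples with the Po(5) pmf:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    s = rng.poisson(2, size=100_000) + rng.poisson(3, size=100_000)

    # Empirical frequencies of the sum versus the Po(5) pmf at k = 0, ..., 8.
    for k in range(9):
        print(k, (s == k).mean(), stats.poisson.pmf(k, mu=5))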