Homework # 7, #8

14.7. Assume that $\varphi_X(u)$ is real. Then
\[
\varphi_{-X}(u) = \varphi_X(-u) = \overline{\varphi_X(u)} = \varphi_X(u).
\]
Therefore $-X \stackrel{d}{=} X$, i.e., $X$ is symmetric.

Assume $X \stackrel{d}{=} -X$. Then
\[
\varphi_X(u) = \varphi_{-X}(u) = \overline{\varphi_X(u)}.
\]
So $\varphi_X(u)$ is real.

14.8. Notice that
\[
\varphi_{X-Y}(u) = \varphi_X(u)\,\varphi_{-Y}(u) = \varphi_X(u)\,\overline{\varphi_X(u)} = |\varphi_X(u)|^2
\]
is a real function. So $X - Y$ is symmetric.
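
As a quick numerical sanity check, the short Python sketch below estimates $\varphi_X(u)$ and $\varphi_{X-Y}(u)$ by Monte Carlo for i.i.d. $X, Y$; the exponential distribution, the seed, and the sample size are arbitrary illustrative choices. The estimate of $\varphi_{X-Y}(u)$ should be (numerically) real and close to $|\varphi_X(u)|^2$, while $\varphi_X(u)$ itself has a nonzero imaginary part.

# Monte Carlo estimate of phi_X(u) and phi_{X-Y}(u) for i.i.d. X, Y
# (exponential: an arbitrary non-symmetric choice).
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
X = rng.exponential(scale=1.0, size=n)
Y = rng.exponential(scale=1.0, size=n)

for u in (0.5, 1.0, 2.0):
    phi_X = np.mean(np.exp(1j * u * X))          # complex in general
    phi_XmY = np.mean(np.exp(1j * u * (X - Y)))  # imaginary part ~ 0
    print(u, phi_X, phi_XmY, abs(phi_X) ** 2)    # Re(phi_XmY) ~ |phi_X|^2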

15.5. Write
\[
S_N = \sum_{n=0}^{\infty} S_n 1_{\{N=n\}}.
\]
We have
\[
E S_N = E\Big[\sum_{n=0}^{\infty} S_n 1_{\{N=n\}}\Big]
= E\Big[\lim_{m\to\infty}\sum_{n=0}^{m} S_n 1_{\{N=n\}}\Big]
= \lim_{m\to\infty} E\Big[\sum_{n=0}^{m} S_n 1_{\{N=n\}}\Big]
\]
\[
= \lim_{m\to\infty}\sum_{n=0}^{m} E\big[S_n 1_{\{N=n\}}\big]
= \lim_{m\to\infty}\sum_{n=0}^{m} E(S_n)\,P\{N=n\}
= \sum_{n=0}^{\infty} E(S_n)\,P\{N=n\},
\tag{1}
\]
where the third equality follows from dominated convergence with the controlling random variable
\[
Y = \sum_{n=1}^{\infty} |S_n|\,1_{\{N=n\}}.
\]
This is because the events $\{N=n\}$ are disjoint, so
\[
\Big|\sum_{n=0}^{m} S_n 1_{\{N=n\}}\Big| \le Y, \qquad m = 1, 2, \dots,
\]
and
\[
EY = \sum_{n=1}^{\infty} E\big[|S_n|\,1_{\{N=n\}}\big]
\le \sum_{n=1}^{\infty}\sum_{j=1}^{n} E\big[|X_j|\,1_{\{N=n\}}\big]
= \sum_{n=1}^{\infty}\Big(\sum_{j=1}^{n} E|X_j|\Big) P\{N=n\}
= E|X_1|\sum_{n=1}^{\infty} n\,P\{N=n\} = E|X_1|\,EN < \infty.
\tag{2}
\]
In addition, the fifth equality in (1) and the third step in (2) follow from the independence between $\{X_j\}$ and $N$.

Further, from (1) we have
\[
E S_N = \sum_{n=0}^{\infty} n\,EX_1\,P\{N=n\} = EX_1 \cdot EN.
\]
This is a special form of Wald's first identity.
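
As a numerical illustration, the sketch below estimates $ES_N$ by simulation with $N$ Poisson and the $X_j$ exponential (arbitrary illustrative choices, with $N$ independent of the $X_j$, as the identity requires) and compares it with $EX_1 \cdot EN$.

# Monte Carlo check of E S_N = E X_1 * E N with N independent of the X_j.
import numpy as np

rng = np.random.default_rng(1)
trials = 100_000
lam, mean_x = 3.0, 2.0                     # E N = 3, E X_1 = 2

N = rng.poisson(lam, size=trials)
S_N = np.array([rng.exponential(mean_x, size=k).sum() for k in N])

print(S_N.mean(), lam * mean_x)            # both should be close to 6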

15.13. Write $Z = (\bar X, Y_1, \dots, Y_n)$, where $\bar X = \frac1n\sum_{k=1}^n X_k$ and $Y_k = X_k - \bar X$. Then
\[
\varphi_Z(u_0, u_1, \dots, u_n) = E\exp\Big\{ i u_0 \bar X + i\sum_{k=1}^{n} u_k Y_k \Big\}.
\]
Notice that
\[
u_0 \bar X + \sum_{k=1}^{n} u_k Y_k
= \Big(u_0 - \sum_{k=1}^{n} u_k\Big)\bar X + \sum_{k=1}^{n} u_k X_k
= \frac1n\Big(u_0 - \sum_{k=1}^{n} u_k\Big)\sum_{k=1}^{n} X_k + \sum_{k=1}^{n} u_k X_k
= \sum_{k=1}^{n}\Big[\frac1n\Big(u_0 - \sum_{j=1}^{n} u_j\Big) + u_k\Big] X_k.
\]
Hence, by the independence of $X_1, \dots, X_n$ (each $N(\mu, \sigma^2)$),
\[
\varphi_Z(u_0, u_1, \dots, u_n)
= \prod_{k=1}^{n}\exp\Big\{-\frac{\sigma^2}{2}\Big[\frac1n\Big(u_0 - \sum_{j=1}^{n} u_j\Big) + u_k\Big]^2
+ i\mu\Big[\frac1n\Big(u_0 - \sum_{j=1}^{n} u_j\Big) + u_k\Big]\Big\}
\]
\[
= \exp\Big\{-\frac{\sigma^2}{2}\sum_{k=1}^{n}\Big[\frac1n\Big(u_0 - \sum_{j=1}^{n} u_j\Big) + u_k\Big]^2
+ i\mu\sum_{k=1}^{n}\Big[\frac1n\Big(u_0 - \sum_{j=1}^{n} u_j\Big) + u_k\Big]\Big\}.
\]
Now
\[
\sum_{k=1}^{n}\Big[\frac1n\Big(u_0 - \sum_{j=1}^{n} u_j\Big) + u_k\Big] = u_0
\]
and
\[
\sum_{k=1}^{n}\Big[\frac1n\Big(u_0 - \sum_{j=1}^{n} u_j\Big) + u_k\Big]^2
= \sum_{k=1}^{n}\Big[\frac{1}{n^2}\Big(u_0 - \sum_{j=1}^{n} u_j\Big)^2
+ \frac{2}{n}\Big(u_0 - \sum_{j=1}^{n} u_j\Big) u_k + u_k^2\Big]
\]
\[
= \frac1n\Big(u_0 - \sum_{k=1}^{n} u_k\Big)^2
+ \frac{2}{n}\Big(u_0 - \sum_{k=1}^{n} u_k\Big)\sum_{k=1}^{n} u_k + \sum_{k=1}^{n} u_k^2
\]
\[
= \frac1n\Big[\Big(u_0 - \sum_{k=1}^{n} u_k\Big) + \sum_{k=1}^{n} u_k\Big]^2
- \frac1n\Big(\sum_{k=1}^{n} u_k\Big)^2 + \sum_{k=1}^{n} u_k^2
= \frac1n u_0^2 + \sum_{k=1}^{n} u_k^2 - \frac1n\Big(\sum_{k=1}^{n} u_k\Big)^2.
\]
Therefore,
\[
\varphi_Z(u_0, u_1, \dots, u_n)
= \exp\Big\{-\frac{\sigma^2}{2n} u_0^2 + i\mu u_0\Big\}
\exp\Big\{-\frac{\sigma^2}{2}\Big[\sum_{k=1}^{n} u_k^2 - \frac1n\Big(\sum_{k=1}^{n} u_k\Big)^2\Big]\Big\}.
\]
Letting $u_0 = 0$, we obtain the characteristic function of $Y = (Y_1, \dots, Y_n)$:
\[
\varphi_Y(u_1, \dots, u_n) = \varphi_Z(0, u_1, \dots, u_n)
= \exp\Big\{-\frac{\sigma^2}{2}\Big[\sum_{k=1}^{n} u_k^2 - \frac1n\Big(\sum_{k=1}^{n} u_k\Big)^2\Big]\Big\}.
\]
Similarly, letting $u_1 = \dots = u_n = 0$, we obtain the characteristic function of $\bar X$:
\[
\varphi_{\bar X}(u_0) = \varphi_Z(u_0, 0, \dots, 0) = \exp\Big\{-\frac{\sigma^2}{2n} u_0^2 + i\mu u_0\Big\}.
\]
In particular, $\bar X \sim N(\mu, \sigma^2/n)$.
In addition,
\[
\varphi_Z(u_0, u_1, \dots, u_n) = \varphi_{\bar X}(u_0)\,\varphi_Y(u_1, \dots, u_n).
\]
This implies that $\bar X$ and $(Y_1, \dots, Y_n)$ are independent. Since
\[
S_n^2 = \frac1n\sum_{j=1}^{n} Y_j^2
\]
is a function of $(Y_1, \dots, Y_n)$, $\bar X$ and $S_n^2$ are independent.
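
As a numerical illustration, the sketch below draws many samples of size $n$ from $N(\mu, \sigma^2)$ and checks that $\bar X$ and $S_n^2$ are empirically uncorrelated; the parameters are arbitrary, and uncorrelatedness is of course weaker than the independence proved above.

# Empirical correlation of the sample mean and S_n^2 for N(mu, sigma^2) samples.
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n, reps = 1.0, 2.0, 10, 50_000
X = rng.normal(mu, sigma, size=(reps, n))

xbar = X.mean(axis=1)
S2 = ((X - xbar[:, None]) ** 2).mean(axis=1)   # S_n^2 = (1/n) sum (X_k - xbar)^2

print(np.corrcoef(xbar, S2)[0, 1])             # should be close to 0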

Alternative solution. We may also use the Gaussian property to give a proof. First, we claim that $(\bar X, Y_1, \dots, Y_n)$ is Gaussian. Indeed, any linear combination of $\bar X, Y_1, \dots, Y_n$ is one-dimensional normal, as it can be written as a linear combination of $X_1, \dots, X_n$.

Second,
\[
\operatorname{Cov}(\bar X, Y_j)
= E\big[(\bar X - \mu)(X_j - \mu)\big] - E(\bar X - \mu)^2
= \frac1n\operatorname{Var}(X_j) - \frac{1}{n^2}\sum_{k=1}^{n}\operatorname{Var}(X_k) = 0
\]
for $j = 1, \dots, n$. By Theorem 16.4, $\bar X$ and $(Y_1, \dots, Y_n)$ are independent. Consequently, $\bar X$ and $S_n^2$ are independent.
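
The covariance computation can be checked numerically in the same spirit; the short sketch below estimates $\operatorname{Cov}(\bar X, Y_1)$ for simulated normal samples (parameters arbitrary).

# Empirical covariance of xbar and Y_1 = X_1 - xbar.
import numpy as np

rng = np.random.default_rng(3)
n, reps = 5, 200_000
X = rng.normal(1.0, 2.0, size=(reps, n))
xbar = X.mean(axis=1)
Y1 = X[:, 0] - xbar

print(np.cov(xbar, Y1)[0, 1])                  # should be close to 0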

16.2. Let $A \subset (-\infty, \infty)$ be Borel-measurable. Then
\[
P\{Z \in A\} = P\{Y \in A,\ |Y| \le a\} + P\{-Y \in A,\ |Y| > a\}.
\]
By the symmetry of $Y$, we can replace $Y$ by $-Y$ in the second term on the right-hand side:
\[
P\{-Y \in A,\ |Y| > a\} = P\{-(-Y) \in A,\ |-Y| > a\} = P\{Y \in A,\ |Y| > a\}.
\]
Thus,
\[
P\{Z \in A\} = P\{Y \in A,\ |Y| \le a\} + P\{Y \in A,\ |Y| > a\} = P\{Y \in A\}.
\]
Hence, $Z \stackrel{d}{=} Y$.

Remark. What we really need here is not the normality, but the symmetry of $Y$.
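
As a simulation sketch, assuming (as the decomposition above suggests) that $Z = Y$ on $\{|Y| \le a\}$ and $Z = -Y$ on $\{|Y| > a\}$ with $Y \sim N(0,1)$, the empirical quantiles of $Z$ and $Y$ should agree; the value of $a$, the seed, and the sample size are arbitrary.

# Compare empirical quantiles of Y ~ N(0,1) and Z = Y*1{|Y|<=a} - Y*1{|Y|>a}.
import numpy as np

rng = np.random.default_rng(4)
a = 0.8
Y = rng.normal(size=500_000)
Z = np.where(np.abs(Y) <= a, Y, -Y)

qs = [0.05, 0.25, 0.5, 0.75, 0.95]
print(np.quantile(Y, qs))
print(np.quantile(Z, qs))                      # should match the line above closely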

16.7. By the definition of a Gaussian random variable, $Y \sim N(\mu, \sigma^2)$, where
\[
\mu = EY = \sum_{j=1}^{n} a_j E(X_j)
\]
and
\[
\sigma^2 = \operatorname{Var}(Y) = E\big[Y - EY\big]^2
= E\Big[\sum_{j=1}^{n} a_j\big(X_j - E(X_j)\big)\Big]^2
\]
\[
= E\Big[\sum_{j=1}^{n} a_j^2\big(X_j - E(X_j)\big)^2
+ 2\sum_{j<k} a_j a_k\big(X_j - E(X_j)\big)\big(X_k - E(X_k)\big)\Big]
\]
\[
= \sum_{j=1}^{n} a_j^2\operatorname{Var}(X_j) + 2\sum_{j<k} a_j a_k\operatorname{Cov}(X_j, X_k).
\]
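
As a numerical check of the variance formula, the sketch below simulates $Y = \sum_j a_j X_j$ for a Gaussian vector with an arbitrary covariance matrix $\Sigma$ and compares $\operatorname{Var}(Y)$ with $a^{\mathsf T}\Sigma a$, which is the same expression written in matrix form; the coefficients and matrix are illustrative choices.

# Check Var(sum_j a_j X_j) = sum_j a_j^2 Var(X_j) + 2 sum_{j<k} a_j a_k Cov(X_j, X_k),
# i.e. a^T Sigma a, for a Gaussian vector with covariance matrix Sigma.
import numpy as np

rng = np.random.default_rng(5)
a = np.array([1.0, -2.0, 0.5])
Sigma = np.array([[2.0, 0.3, 0.1],
                  [0.3, 1.0, -0.2],
                  [0.1, -0.2, 1.5]])
X = rng.multivariate_normal(mean=[0.0, 1.0, 2.0], cov=Sigma, size=300_000)
Y = X @ a

print(Y.var(), a @ Sigma @ a)                  # the two numbers should be close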

16.16. We only need to examine two things: first, $(X, Y - \rho X)$ is two-dimensional Gaussian; second, $\operatorname{Cov}(X, Y - \rho X) = 0$.

For any constants $a_1, a_2$,
\[
a_1 X + a_2 (Y - \rho X) = (a_1 - \rho a_2) X + a_2 Y.
\]
Since $(X, Y)$ is Gaussian, $a_1 X + a_2(Y - \rho X)$ is normal. This shows that $(X, Y - \rho X)$ is Gaussian.

By linearity,
\[
\operatorname{Cov}(X, Y - \rho X) = \operatorname{Cov}(X, Y) - \rho\operatorname{Cov}(X, X)
= \operatorname{Cov}(X, Y) - \rho\,\sigma_X^2.
\]
By the fact $\sigma_X^2 = \sigma_Y^2$,
\[
\rho\,\sigma_X^2 = \rho\sqrt{\sigma_X^2}\sqrt{\sigma_Y^2} = \rho\,\sigma_X\sigma_Y = \operatorname{Cov}(X, Y).
\]
Hence,
\[
\operatorname{Cov}(X, Y - \rho X) = \operatorname{Cov}(X, Y) - \operatorname{Cov}(X, Y) = 0.
\]
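
As a simulation sketch, with $\rho$ the correlation coefficient of $(X, Y)$ and equal variances as used above (the numerical values being arbitrary), $\operatorname{Cov}(X, Y - \rho X)$ should come out near zero.

# Cov(X, Y - rho*X) for a bivariate normal pair with equal variances and
# correlation coefficient rho.
import numpy as np

rng = np.random.default_rng(6)
sigma2, rho = 2.0, 0.6
Sigma = sigma2 * np.array([[1.0, rho], [rho, 1.0]])
XY = rng.multivariate_normal([0.0, 0.0], Sigma, size=300_000)
X, Y = XY[:, 0], XY[:, 1]

print(np.cov(X, Y - rho * X)[0, 1])            # should be close to 0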

17.3. By Chebyshev's inequality, for any $\varepsilon > 0$,
\[
P\Big\{\Big|\frac1n\sum_{j=1}^{n} X_j - \mu\Big| \ge \varepsilon\Big\}
\le \frac{1}{\varepsilon^2}\operatorname{Var}\Big(\frac1n\sum_{j=1}^{n} X_j\Big)
= \frac{1}{n^2\varepsilon^2}\sum_{j=1}^{n}\operatorname{Var}(X_j)
= \frac{\sigma^2}{n\varepsilon^2}.
\]
The right-hand side tends to $0$ as $n \to \infty$, so we have
\[
\lim_{n\to\infty} P\Big\{\Big|\frac1n\sum_{j=1}^{n} X_j - \mu\Big| \ge \varepsilon\Big\} = 0.
\]
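
As a quick illustration, the sketch below estimates $P\{|\frac1n\sum_{j=1}^n X_j - \mu| \ge \varepsilon\}$ for increasing $n$ and compares it with the Chebyshev bound $\sigma^2/(n\varepsilon^2)$; the $X_j$ are simulated here as normal, which is an arbitrary choice, and all parameters are illustrative.

# Empirical P{|mean - mu| >= eps} versus the Chebyshev bound sigma^2/(n*eps^2).
import numpy as np

rng = np.random.default_rng(7)
mu, sigma, eps, reps = 0.5, 1.0, 0.1, 1_000

for n in (100, 1_000, 10_000):
    means = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
    emp = np.mean(np.abs(means - mu) >= eps)
    print(n, emp, sigma**2 / (n * eps**2))     # both columns shrink toward 0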

17.4. Replace $n$ by $n^2$ in the above estimate:
\[
P\Big\{\Big|\frac{1}{n^2}\sum_{j=1}^{n^2} X_j - \mu\Big| \ge \varepsilon\Big\} \le \frac{\sigma^2}{n^2\varepsilon^2}.
\]
Hence,
\[
\sum_{n=1}^{\infty} P\Big\{\Big|\frac{1}{n^2}\sum_{j=1}^{n^2} X_j - \mu\Big| \ge \varepsilon\Big\}
\le \sum_{n=1}^{\infty}\frac{\sigma^2}{n^2\varepsilon^2} < \infty.
\]
By the Borel-Cantelli lemma,
\[
P\Big(\bigcap_{m=1}^{\infty}\bigcup_{n=m}^{\infty}\Big\{\Big|\frac{1}{n^2}\sum_{j=1}^{n^2} X_j - \mu\Big| \ge \varepsilon\Big\}\Big) = 0.
\]
Notice that
\[
\Big\{\limsup_{n\to\infty}\Big|\frac{1}{n^2}\sum_{j=1}^{n^2} X_j - \mu\Big| > \varepsilon\Big\}
\subset \bigcap_{m=1}^{\infty}\bigcup_{n=m}^{\infty}\Big\{\Big|\frac{1}{n^2}\sum_{j=1}^{n^2} X_j - \mu\Big| \ge \varepsilon\Big\}.
\]
Thus,
\[
\limsup_{n\to\infty}\Big|\frac{1}{n^2}\sum_{j=1}^{n^2} X_j - \mu\Big| \le \varepsilon \quad \text{a.s.}
\]
Letting $\varepsilon \to 0^+$ on the right-hand side we have
\[
\lim_{n\to\infty}\Big|\frac{1}{n^2}\sum_{j=1}^{n^2} X_j - \mu\Big| = 0 \quad \text{a.s.},
\]
i.e., $\frac{1}{n^2}\sum_{j=1}^{n^2} X_j \to \mu$ almost surely.
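
As an illustration of the almost sure convergence along the subsequence $n^2$, the sketch below follows the running averages on a single simulated sample path; the exponential distribution and the parameters are arbitrary choices.

# Running averages along the subsequence n^2 on a single sample path.
import numpy as np

rng = np.random.default_rng(8)
mu, n_max = 2.0, 100
X = rng.exponential(mu, size=n_max**2)         # i.i.d. with mean mu (arbitrary choice)
csum = np.cumsum(X)

for n in (10, 30, 100):
    print(n, csum[n**2 - 1] / n**2)            # should approach mu = 2.0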
