RANDOM PROCESSES

INTRODUCTION
Models for the random message signals and noise encountered in communication systems are developed in this chapter. Random signals cannot be explicitly described prior to their occurrence, and noises cannot be described by deterministic functions of time. However, when observed over a long period, a random signal or noise may exhibit certain regularities that can be described in terms of probabilities and statistical averages. Such a model, in the form of a probabilistic description of a collection of functions of time, is called a random process.

DEFINITIONS AND NOTATIONS OF RANDOM PROCESSES
Consider a random experiment with outcomes λ and a sample space S. If to every outcome λ ∈ S we assign a real-valued time function X(t, λ), we create a random (or stochastic) process. A random process X(t, λ) is therefore a function of two parameters, the time t and the outcome λ. For a specific λ, say λ_i, we have a single time function X(t, λ_i) = x_i(t). This time function is called a sample function. The totality of all sample functions is called an ensemble. For a specific time t_j, X(t_j, λ) = X_j denotes a random variable. For fixed t (= t_j) and fixed λ (= λ_i), X(t_j, λ_i) = x_i(t_j) is a number.
A random process is therefore sometimes defined as a family of random variables indexed by the parameter t ∈ T, where T is called the index set.
Figure 7-1 illustrates the concepts of the sample space of the random experiment, outcomes of the experiment, associated sample functions, and random variables resulting from taking two measurements of the sample functions.
In the following we use the notation X(t) to represent X(t, λ).
STATISTICS OF RANDOM PROCESSES
A. Probabilistic Expressions:
Consider a random process X(t). For a particular time t₁, X(t₁) = X₁ is a random variable, and its distribution function F_X(x₁; t₁) is defined as
F_X(x₁; t₁) = P{X(t₁) ≤ x₁}   (7.1)
where x₁ is any real number.
Fig. 7-1 Random process
F_X(x₁; t₁) is called the first-order distribution of X(t). The corresponding first-order density function is obtained by
f_X(x₁; t₁) = ∂F_X(x₁; t₁)/∂x₁   (7.2)
Similarly, given t₁ and t₂, X(t₁) = X₁ and X(t₂) = X₂ represent two random variables. Their joint distribution is called the second-order distribution and is given by
F_X(x₁, x₂; t₁, t₂) = P{X(t₁) ≤ x₁, X(t₂) ≤ x₂}   (7.3)

E. Power Spectral Density:
The power spectral density (or power spectrum) S_XX(ω) of a WSS process X(t) is the Fourier transform of its autocorrelation R_XX(τ):
S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) e^(−jωτ) dτ   (7.36)
R_XX(τ) = (1/2π) ∫_{−∞}^{∞} S_XX(ω) e^(jωτ) dω   (7.37)
Properties of S_XX(ω):
1. S_XX(ω) is real and S_XX(ω) ≥ 0   (7.38)
2. S_XX(−ω) = S_XX(ω)   (7.39)
3. (1/2π) ∫_{−∞}^{∞} S_XX(ω) dω = R_XX(0) = E[X²(t)]   (7.40)

F. Cross Spectral Densities:
The cross spectral densities of two jointly WSS processes X(t) and Y(t) are defined by
S_XY(ω) = ∫_{−∞}^{∞} R_XY(τ) e^(−jωτ) dτ
S_YX(ω) = ∫_{−∞}^{∞} R_YX(τ) e^(−jωτ) dτ
C. Band-Limited White Noise:
A WSS process X(t) is called band-limited white noise if its power spectral density is constant over a finite band of frequencies and zero elsewhere, that is,
S_XX(ω) = η/2 for |ω| ≤ ω_B, and S_XX(ω) = 0 for |ω| > ω_B
Then
R_XX(τ) = (1/2π) ∫_{−ω_B}^{ω_B} (η/2) e^(jωτ) dω = (η ω_B / 2π) (sin ω_B τ)/(ω_B τ)
S_XX(ω) and R_XX(τ) of band-limited white noise are shown in Fig. 7-4.
Note that the terms white and band-limited white refer to the spectral shape of the process X(t) only; they do not imply that the distribution associated with X(t) is gaussian.
Fig. 7-4 Band-limited white noise
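As a quick numerical illustration of the band-limited white-noise pair above (a sketch, not part of the original text; the values η = 2 and ω_B = 50 rad/s are arbitrary choices), the inverse-transform integral can be evaluated on a grid and compared with the closed form (η ω_B/2π) sin(ω_B τ)/(ω_B τ):

```python
import numpy as np

eta, w_B = 2.0, 50.0                       # assumed values: spectral level eta/2, band edge w_B (rad/s)
tau = np.linspace(-0.5, 0.5, 201)
w = np.linspace(-w_B, w_B, 4001)

# Numerical inverse transform of S_XX(w) = eta/2 over |w| <= w_B
R_num = np.trapz((eta / 2) * np.exp(1j * np.outer(tau, w)), w, axis=1).real / (2 * np.pi)

# Closed form (eta*w_B/2pi) * sin(w_B tau)/(w_B tau); note np.sinc(x) = sin(pi x)/(pi x)
R_closed = (eta * w_B / (2 * np.pi)) * np.sinc(w_B * tau / np.pi)

print(np.max(np.abs(R_num - R_closed)))    # should be very small
```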
D. Narrowband Random Process:
Suppose that X(t) is a WSS process with zero mean and its power spectral density S_XX(ω) is nonzero only in some narrow frequency band of width 2W that is very small compared to a center frequency ω_c, as shown in Fig. 7-5. Then the process X(t) is called a narrowband random process.
In many communication systems, a narrowband process (or noise) is produced when white noise (or broadband noise) is passed through a narrowband linear filter. When a sample function of the narrowband process is viewed on an oscilloscope, the observed waveform appears as a sinusoid of random amplitude and phase. For this reason, the narrowband noise X(t) is conveniently represented by the expression
X(t) = V(t) cos [ω_c t + φ(t)]   (7.66)
Fig. 7-5 Narrowband random process
where V(t) and φ(t) are random processes that are called the envelope function and phase function, respectively.
Equation (7.66) can be rewritten as
X(t) = V(t) cos φ(t) cos ω_c t − V(t) sin φ(t) sin ω_c t
     = X_c(t) cos ω_c t − X_s(t) sin ω_c t   (7.67)
where X_c(t) = V(t) cos φ(t)   (in-phase component)   (7.68a)
      X_s(t) = V(t) sin φ(t)   (quadrature component)   (7.68b)
      V(t) = √(X_c²(t) + X_s²(t))   (7.69a)
      φ(t) = tan⁻¹ [X_s(t)/X_c(t)]   (7.69b)
Equation (7.67) is called the quadrature representation of X(t). Given X(t), its in-phase component X_c(t) and quadrature component X_s(t) can be obtained by using the arrangement shown in Fig. 7-6.
Fig. 7-6 Extraction of X_c(t) and X_s(t): X(t) is multiplied by 2 cos ω_c t and −2 sin ω_c t, respectively, and each product is passed through a low-pass filter.
Properties of X_c(t) and X_s(t):
1. X_c(t) and X_s(t) have identical power spectral densities, related to that of X(t) by
   S_XcXc(ω) = S_XsXs(ω) = S_XX(ω − ω_c) + S_XX(ω + ω_c) for |ω| ≤ W, and 0 otherwise   (7.70)
2. X_c(t) and X_s(t) have the same means and variances as X(t):
   μ_Xc = μ_Xs = μ_X = 0   (7.71)
   σ²_Xc = σ²_Xs = σ²_X   (7.72)
3. X_c(t) and X_s(t) are uncorrelated with each other:
   E[X_c(t)X_s(t)] = 0   (7.73)
4. If X(t) is gaussian, then X_c(t) and X_s(t) are also gaussian.
5. If X(t) is gaussian, then for fixed but arbitrary t, V(t) is a random variable with a Rayleigh distribution and φ(t) is a random variable uniformly distributed over [0, 2π]. (See Prob. 6.39.)
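Properties 1–5 and the arrangement of Fig. 7-6 can be checked numerically. The sketch below is our own illustration (the sample rate, carrier frequency, bandwidth, and ideal low-pass filter are arbitrary assumptions, not values from the text): it builds a narrowband gaussian noise, extracts X_c(t) and X_s(t) by multiplying by 2 cos ω_c t and −2 sin ω_c t followed by low-pass filtering, and then compares variances, cross-correlation, and the Rayleigh envelope mean.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, fc, W = 10_000.0, 1_000.0, 50.0      # assumed sample rate, carrier (Hz), half-bandwidth (Hz)
N = 2 ** 18
t = np.arange(N) / fs

# Narrowband gaussian noise: shape white gaussian noise in the frequency domain
f = np.fft.rfftfreq(N, 1 / fs)
spec = np.fft.rfft(rng.standard_normal(N))
spec[np.abs(f - fc) > W] = 0.0           # keep only |f - fc| <= W
x = np.fft.irfft(spec, N)
x /= x.std()                             # unit variance for convenience

def lowpass(sig, cutoff):
    """Crude ideal low-pass filter implemented in the frequency domain."""
    S = np.fft.rfft(sig)
    S[np.fft.rfftfreq(N, 1 / fs) > cutoff] = 0.0
    return np.fft.irfft(S, N)

xc = lowpass(2 * x * np.cos(2 * np.pi * fc * t), 2 * W)   # in-phase component
xs = lowpass(-2 * x * np.sin(2 * np.pi * fc * t), 2 * W)  # quadrature component

print(x.var(), xc.var(), xs.var())        # property 2: variances should be close
print(np.mean(xc * xs))                   # property 3: should be near 0
v = np.hypot(xc, xs)                      # envelope; property 5: Rayleigh-distributed
print(v.mean(), np.sqrt(np.pi / 2) * x.std())  # Rayleigh mean = sigma*sqrt(pi/2)
```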
Solved Problems
RANDOM PROCESSES AND STATISTICS OF RANDOM PROCESSES
7.1. Consider a random process X(t) given by
X(t) = A cos(ωt + Θ)   (7.74)
where A and ω are constants and Θ is a uniform random variable over [−π, π]. Show that X(t) is WSS.
From Eq. (6.105) (Prob. 6.23), we have
f_Θ(θ) = 1/(2π) for −π ≤ θ ≤ π, and 0 otherwise
Thus,
μ_X(t) = E[X(t)] = ∫_{−π}^{π} A cos(ωt + θ) (1/2π) dθ = 0   (7.75)
R_XX(t, t + τ) = E[X(t)X(t + τ)]
= (A²/2π) ∫_{−π}^{π} cos(ωt + θ) cos[ω(t + τ) + θ] dθ
= (A²/4π) ∫_{−π}^{π} [cos ωτ + cos(2ωt + 2θ + ωτ)] dθ
= (A²/2) cos ωτ   (7.76)
Since the mean of X(t) is a constant and the autocorrelation of X(t) is a function of the time difference τ only, we conclude that X(t) is WSS.
Note that R_XX(τ) is periodic with period T₀ = 2π/ω. A WSS random process is called periodic if its autocorrelation is periodic.
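A small Monte Carlo check of Prob. 7.1 (a sketch of our own; A = 2 and ω = 2π are arbitrary choices): the ensemble mean should be near 0 at every t, and E[X(t)X(t + τ)] should track (A²/2) cos ωτ regardless of t.

```python
import numpy as np

rng = np.random.default_rng(1)
A, w = 2.0, 2 * np.pi                        # assumed constants
theta = rng.uniform(-np.pi, np.pi, size=100_000)

def X(t):
    return A * np.cos(w * t + theta)         # one sample per theta, evaluated at time t

tau = 0.25
for t in (0.0, 0.3, 1.7):
    print(t,
          X(t).mean(),                        # ~ 0
          np.mean(X(t) * X(t + tau)),         # ~ (A^2/2) cos(w*tau), independent of t
          (A**2 / 2) * np.cos(w * tau))
```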
7.2. Consider a random process X(t) given by
X(t) = A cos(ωt + θ)   (7.77)
where ω and θ are constants and A is a random variable. Determine whether X(t) is WSS.
μ_X(t) = E[X(t)] = E[A cos(ωt + θ)]
= cos(ωt + θ) E[A]   (7.78)
which indicates that the mean of X(t) is not constant unless E[A] = 0.
R_XX(t, t + τ) = E[X(t)X(t + τ)]
= E[A² cos(ωt + θ) cos(ω(t + τ) + θ)]
= (1/2)[cos ωτ + cos(2ωt + 2θ + ωτ)] E[A²]   (7.79)
Thus, we see that the autocorrelation of X(t) is not a function of the time difference τ only, and the process X(t) is not WSS.
7.3. Consider a random process Y(t) defined by
Y(t) = ∫₀ᵗ X(τ) dτ   (7.80)
where X(t) is given by
X(t) = A cos ωt   (7.81)
where ω is constant and A = N[0; σ²].
(a) Determine the pdf of Y(t) at t = t_k.
(b) Is Y(t) WSS?
(a) Y(t_k) = ∫₀^{t_k} A cos ωτ dτ = (A/ω) sin ωt_k   (7.82)
Then from the result of Prob. 6.22, we see that Y(t_k) is a gaussian random variable with
E[Y(t_k)] = (sin ωt_k / ω) E[A] = 0   (7.83)
and Var[Y(t_k)] = σ²_{Y(t_k)} = (sin ωt_k / ω)² σ²   (7.84)
Hence, by Eq. (6.91), the pdf of Y(t_k) is
f_{Y(t_k)}(y) = (1/√(2π σ²_{Y(t_k)})) e^(−y²/(2σ²_{Y(t_k)}))   (7.85)
(b) From Eqs. (7.83) and (7.84), the mean and variance of Y(t) depend on time t_k, so Y(t) is not WSS.
7.4. Consider a random process X(t) given by
X(t) = A cos ωt + B sin ωt   (7.86)
where ω is constant and A and B are random variables.
(a) Show that the condition
E[A] = E[B] = 0   (7.87)
is necessary for X(t) to be stationary.
(b) Show that X(t) is WSS if and only if the random variables A and B are uncorrelated with equal variance, that is,
E[AB] = 0   (7.88)
and E[A²] = E[B²] = σ²   (7.89)
(a) μ_X(t) = E[X(t)] = E[A] cos ωt + E[B] sin ωt must be independent of t for X(t) to be stationary. This is possible only if μ_X(t) = 0, that is,
E[A] = E[B] = 0
(b) If X(t) is WSS, then from Eq. (7.17)
E[X²(0)] = E[X²(π/(2ω))] = R_XX(0) = σ²_X
But X(0) = A and X(π/(2ω)) = B
Thus, E[A²] = E[B²] = σ²_X = σ²
Using the preceding result, we obtain
R_XX(t, t + τ) = E[X(t)X(t + τ)]
= E{[A cos ωt + B sin ωt][A cos ω(t + τ) + B sin ω(t + τ)]}
= σ² cos ωτ + E[AB] sin(2ωt + ωτ)   (7.90)
which will be a function of τ only if E[AB] = 0.
Conversely, if E[AB] = 0 and E[A²] = E[B²] = σ², then from the result of part (a) and Eq. (7.90), we have
μ_X(t) = 0
R_XX(t, t + τ) = σ² cos ωτ = R_XX(τ)
Hence, X(t) is WSS.
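The if-and-only-if condition of Prob. 7.4 lends itself to a quick numerical illustration (a sketch; the frequency, lags, and sample counts below are arbitrary assumptions): with uncorrelated A and B of equal variance, the estimated E[X(t)X(t + τ)] does not depend on t, while unequal variances break this.

```python
import numpy as np

rng = np.random.default_rng(2)
w, n = 2 * np.pi, 200_000                     # assumed frequency and Monte Carlo sample count

def R(t, tau, A, B):
    """Monte Carlo estimate of E[X(t)X(t+tau)] for X(t) = A cos wt + B sin wt."""
    X1 = A * np.cos(w * t) + B * np.sin(w * t)
    X2 = A * np.cos(w * (t + tau)) + B * np.sin(w * (t + tau))
    return np.mean(X1 * X2)

A = rng.standard_normal(n)                    # E[A] = 0, variance 1
B = rng.standard_normal(n)                    # independent of A, same variance
print(R(0.1, 0.2, A, B), R(0.9, 0.2, A, B))   # nearly equal: consistent with WSS

B_bad = 2 * rng.standard_normal(n)            # unequal variance -> not WSS
print(R(0.1, 0.2, A, B_bad), R(0.9, 0.2, A, B_bad))  # noticeably different
```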
7.5. A random process X(t) is said to be covariance-stationary if the covariance of X(t) depends only on the time difference τ = t₂ − t₁, that is,
C_XX(t, t + τ) = C_XX(τ)   (7.91)
Let X(t) be given by
X(t) = (A + 1) cos t + B sin t
where A and B are independent random variables for which
E[A] = E[B] = 0 and E[A²] = E[B²] = 1
Show that X(t) is not WSS, but it is covariance-stationary.
μ_X(t) = E[X(t)] = E[(A + 1) cos t + B sin t]
= E[A + 1] cos t + E[B] sin t = cos t
which depends on t. Thus, X(t) cannot be WSS.
R_XX(t₁, t₂) = E[X(t₁)X(t₂)]
= E{[(A + 1) cos t₁ + B sin t₁][(A + 1) cos t₂ + B sin t₂]}
= E[(A + 1)²] cos t₁ cos t₂ + E[B²] sin t₁ sin t₂
+ E[(A + 1)B](cos t₁ sin t₂ + sin t₁ cos t₂)
Now E[(A + 1)²] = E[A² + 2A + 1] = E[A²] + 2E[A] + 1 = 2
E[(A + 1)B] = E[AB] + E[B] = E[A]E[B] + E[B] = 0
E[B²] = 1
Substituting these values into the expression of R_XX(t₁, t₂), we obtain
R_XX(t₁, t₂) = 2 cos t₁ cos t₂ + sin t₁ sin t₂
= cos(t₂ − t₁) + cos t₁ cos t₂
From Eq. (7.9), we have
C_XX(t₁, t₂) = R_XX(t₁, t₂) − μ_X(t₁)μ_X(t₂)
= cos(t₂ − t₁) + cos t₁ cos t₂ − cos t₁ cos t₂
= cos(t₂ − t₁)
Thus, X(t) is covariance-stationary.
7.6. Show that the process X(t) defined in Eq. (7.74) (Prob. 7.1) is ergodic in both the mean and the autocorrelation.
From Eq. (7.20), we have
⟨x(t)⟩ = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} A cos(ωt + θ) dt
= (1/T₀) ∫_{0}^{T₀} A cos(ωt + θ) dt = 0   (7.92)
where T₀ = 2π/ω.
From Eq. (7.21), we have
⟨x(t)x(t + τ)⟩ = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} A² cos(ωt + θ) cos[ω(t + τ) + θ] dt
= (1/T₀) ∫_{0}^{T₀} (A²/2)[cos ωτ + cos(2ωt + 2θ + ωτ)] dt
= (A²/2) cos ωτ   (7.93)
Thus, we have
μ_X(t) = E[X(t)] = ⟨x(t)⟩ = 0
R_XX(τ) = E[X(t)X(t + τ)] = ⟨x(t)x(t + τ)⟩ = (A²/2) cos ωτ
Hence, by definitions (7.24) and (7.25), we conclude that X(t) is ergodic in both the mean and the autocorrelation.
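Ergodicity can be seen numerically from a single long sample function (a sketch; A = 1, ω = 2π, the observation length, and the lag are arbitrary assumptions): its time averages approach the ensemble values 0 and (A²/2) cos ωτ found in Prob. 7.1.

```python
import numpy as np

rng = np.random.default_rng(3)
A, w = 1.0, 2 * np.pi                        # assumed constants
theta = rng.uniform(-np.pi, np.pi)           # a single outcome -> one sample function
t = np.linspace(0, 500, 2_000_001)           # long observation interval
x = A * np.cos(w * t + theta)

tau, dt = 0.3, t[1] - t[0]
k = int(round(tau / dt))
print(x.mean())                              # time average of x(t): ~ 0 = E[X(t)]
print(np.mean(x[:-k] * x[k:]),               # time autocorrelation at lag tau
      (A**2 / 2) * np.cos(w * tau))          # ensemble value (A^2/2) cos(w tau)
```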
CORRELATIONS AND POWER SPECTRA
7.7. Show that if X(t) is WSS, then
E{[X(t + τ) − X(t)]²} = 2[R_XX(0) − R_XX(τ)]   (7.94)
where R_XX(τ) is the autocorrelation of X(t).
Using the linearity of E (the expectation operator) and Eqs. (7.15) and (7.17), we have
E{[X(t + τ) − X(t)]²} = E[X²(t + τ) − 2X(t + τ)X(t) + X²(t)]
= E[X²(t + τ)] − 2E[X(t + τ)X(t)] + E[X²(t)]
= R_XX(0) − 2R_XX(τ) + R_XX(0)
= 2[R_XX(0) − R_XX(τ)]
7.8. Let X(t) be a WSS random process. Verify Eqs. (7.26) and (7.27), that is,
(a) R_XX(−τ) = R_XX(τ)
(b) |R_XX(τ)| ≤ R_XX(0)
(a) From Eq. (7.15),
R_XX(τ) = E[X(t)X(t + τ)]
Setting t + τ = t′, we have
R_XX(τ) = E[X(t′ − τ)X(t′)] = E[X(t′)X(t′ − τ)] = R_XX(−τ)
(b) E{[X(t + τ) ± X(t)]²} ≥ 0
or E[X²(t + τ) ± 2X(t)X(t + τ) + X²(t)] ≥ 0
or 2R_XX(0) ± 2R_XX(τ) ≥ 0
Hence, R_XX(0) ≥ |R_XX(τ)|
7.9. Show that the power spectrum of a (real) random process X(t) is real, and verify Eq. (7.39); that is,
S_XX(−ω) = S_XX(ω)
From Eq. (7.36) and by expanding the exponential, we have
S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) e^(−jωτ) dτ
= ∫_{−∞}^{∞} R_XX(τ)(cos ωτ − j sin ωτ) dτ   (7.95)
= ∫_{−∞}^{∞} R_XX(τ) cos ωτ dτ − j ∫_{−∞}^{∞} R_XX(τ) sin ωτ dτ
Since R_XX(−τ) = R_XX(τ) [Eq. (7.26)], the imaginary term in Eq. (7.95) vanishes and we obtain
S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) cos ωτ dτ   (7.96)
which indicates that S_XX(ω) is real.
Since the cosine is an even function of its arguments, that is, cos(−ωτ) = cos ωτ, it follows that
S_XX(−ω) = S_XX(ω)
which indicates that the power spectrum of X(t) is an even function of frequency.
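The three spectral properties (real, even, nonnegative) can be checked numerically for a concrete autocorrelation (a sketch of our own; R_XX(τ) = e^(−a|τ|) with a = 1.5 is an arbitrary test case whose transform is 2a/(ω² + a²)):

```python
import numpy as np

a = 1.5                                        # assumed decay rate of R_XX(tau) = e^(-a|tau|)
tau = np.linspace(-40, 40, 16001)
R = np.exp(-a * np.abs(tau))                   # even, valid autocorrelation

w = np.linspace(-10, 10, 201)
# S(w) = integral of R(tau) e^{-j w tau} d tau (trapezoid rule)
S = np.trapz(R * np.exp(-1j * np.outer(w, tau)), tau, axis=1)

print(np.max(np.abs(S.imag)))                          # ~ 0: spectrum is real
print(np.max(np.abs(S.real - S.real[::-1])))           # ~ 0: spectrum is even
print(S.real.min() >= 0,                               # nonnegative
      np.max(np.abs(S.real - 2 * a / (w**2 + a**2))))  # matches the closed form 2a/(w^2 + a^2)
```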
7.10. A class of modulated random signals Y(t) is defined by
Y(t) = A X(t) cos(ω_c t + Θ)   (7.97)
where X(t) is the random message signal and A cos(ω_c t + Θ) is the carrier.

When t₂ − t₁ > T_b, then t₁ and t₂ must fall in different pulse intervals [Fig. 7-10(a)] and the random variables X(t₁) and X(t₂) are therefore independent. We thus have
R_XX(t₁, t₂) = E[X(t₁)X(t₂)] = E[X(t₁)]E[X(t₂)] = 0   (7.106)
When t₂ − t₁ < T_b, then depending on the value of T_d, t₁ and t₂ may or may not be in the same pulse interval [Fig. 7-10(b) and (c)]. If we let B denote the random event "t₁ and t₂ are in adjacent pulse intervals," then we have
R_XX(t₁, t₂) = E[X(t₁)X(t₂)|B]P(B) + E[X(t₁)X(t₂)|B̄]P(B̄)
Now E[X(t₁)X(t₂)|B] = E[X(t₁)]E[X(t₂)] = 0
and E[X(t₁)X(t₂)|B̄] = A²
Since P(B) will be the same when t₁ and t₂ fall in any time range of length T_b, it suffices to consider the case 0 < t₁ < T_b, as shown in Fig. 7-10(c). From Fig. 7-10(c),
P(B) = P(t₁ < T_d < t₂) = ∫_{t₁}^{t₂} f_{T_d}(t) dt = (t₂ − t₁)/T_b
From Eq. (6.4), we have
P(B̄) = 1 − P(B)
Thus, R_XX(t₁, t₂) = A²[1 − (t₂ − t₁)/T_b]   (7.107)
where τ = t₂ − t₁.
Since R_XX(−τ) = R_XX(τ), we conclude that
Fig. 7-10
R_XX(τ) = A²(1 − |τ|/T_b) for |τ| ≤ T_b, and 0 for |τ| > T_b   (7.108)
which is plotted in Fig. 7-11(a).
From Eqs. (7.105) and (7.108), we see that X(t) is WSS. Thus, from Eq. (7.36), the power spectrum of X(t) is
S_XX(ω) = A² T_b [sin(ωT_b/2) / (ωT_b/2)]²   (7.109)
which is plotted in Fig. 7-11(b).
Fig. 7-11
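The triangular autocorrelation (7.108) is easy to confirm by simulation (a sketch; amplitude, bit duration, time step, and run counts below are arbitrary assumptions): generate binary waves taking the values ±A with a random delay T_d uniform over [0, T_b), estimate the autocorrelation by time averaging, and compare with A²(1 − |τ|/T_b).

```python
import numpy as np

rng = np.random.default_rng(4)
A, Tb, dt = 1.0, 1.0, 0.01                   # assumed amplitude, bit duration, time step
n_bits, runs = 200, 50
lags = np.arange(0, int(2 * Tb / dt))
acc = np.zeros(len(lags))

for _ in range(runs):
    bits = rng.choice([-A, A], size=n_bits)  # equally likely +/-A levels
    Td = rng.uniform(0, Tb)                  # random delay, uniform over [0, Tb)
    t = np.arange(0, (n_bits - 1) * Tb, dt)
    x = bits[((t + Td) // Tb).astype(int)]
    for i, k in enumerate(lags):
        acc[i] += np.mean(x[: len(x) - k if k else None] * x[k:])

tau = lags * dt
R_est = acc / runs
R_theory = np.where(tau <= Tb, A**2 * (1 - tau / Tb), 0.0)
print(np.max(np.abs(R_est - R_theory)))      # small, up to Monte Carlo error
```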
7.13. Let X(t) and Y(t) be WSS random processes. Verify Eqs. (7.29) and (7.30), that is,
(a) R_XY(−τ) = R_YX(τ)
(b) |R_XY(τ)| ≤ √(R_XX(0) R_YY(0))
(a) By Eq. (7.18),
R_XY(−τ) = E[X(t)Y(t − τ)]
Setting t − τ = t′, we obtain
R_XY(−τ) = E[X(t′ + τ)Y(t′)] = E[Y(t′)X(t′ + τ)] = R_YX(τ)
(b) From the Cauchy-Schwarz inequality, Eq. (6.148) (Prob. 6.52), it follows that
{E[X(t)Y(t + τ)]}² ≤ E[X²(t)] E[Y²(t + τ)]
or [R_XY(τ)]² ≤ R_XX(0) R_YY(0)
Thus, |R_XY(τ)| ≤ √(R_XX(0) R_YY(0))
7.14. Let X(t) and Y(t) be both zero-mean and WSS random processes. Consider the random process Z(t) defined by
Z(t) = X(t) + Y(t)   (7.110)
(a) Determine the autocorrelation and the power spectrum of Z(t) if X(t) and Y(t) are jointly WSS.
(b) Repeat part (a) if X(t) and Y(t) are orthogonal.
(c) Show that if X(t) and Y(t) are orthogonal, then the mean square of Z(t) is equal to the sum of the mean squares of X(t) and Y(t).
(a) The autocorrelation of Z(t) is given by
R_ZZ(t₁, t₂) = E[Z(t₁)Z(t₂)]
= E{[X(t₁) + Y(t₁)][X(t₂) + Y(t₂)]}
= E[X(t₁)X(t₂)] + E[X(t₁)Y(t₂)] + E[Y(t₁)X(t₂)] + E[Y(t₁)Y(t₂)]
= R_XX(t₁, t₂) + R_XY(t₁, t₂) + R_YX(t₁, t₂) + R_YY(t₁, t₂)   (7.111)
If X(t) and Y(t) are jointly WSS, then we have
R_ZZ(τ) = R_XX(τ) + R_XY(τ) + R_YX(τ) + R_YY(τ)   (7.112)
where τ = t₂ − t₁.
Taking the Fourier transform of both sides of Eq. (7.112), we obtain
S_ZZ(ω) = S_XX(ω) + S_XY(ω) + S_YX(ω) + S_YY(ω)   (7.113)
(b) If X(t) and Y(t) are orthogonal [Eq. (7.34)],
R_XY(τ) = R_YX(τ) = 0
Then Eqs. (7.112) and (7.113) become
R_ZZ(τ) = R_XX(τ) + R_YY(τ)   (7.114)
and S_ZZ(ω) = S_XX(ω) + S_YY(ω)   (7.115)
(c) From Eqs. (7.114) and (7.17),
R_ZZ(0) = R_XX(0) + R_YY(0)
or E[Z²(t)] = E[X²(t)] + E[Y²(t)]   (7.116)
which indicates that the mean square of Z(t) is equal to the sum of the mean squares of X(t) and Y(t).
7.15. Two random processes X(t) and Y(t) are given by
X(t) = A cos(ωt + Θ)   (7.117a)
Y(t) = A sin(ωt + Θ)   (7.117b)
where A and ω are constants and Θ is a uniform random variable over [0, 2π]. Find the cross-correlation of X(t) and Y(t), and verify Eq. (7.29).
From Eq. (7.18), the cross-correlation of X(t) and Y(t) is
R_XY(t, t + τ) = E[X(t)Y(t + τ)]
= E{A² cos(ωt + Θ) sin[ω(t + τ) + Θ]}
= (A²/2) E[sin(2ωt + ωτ + 2Θ) − sin(−ωτ)]
= (A²/2) sin ωτ = R_XY(τ)   (7.118a)
Similarly, R_YX(t, t + τ) = E[Y(t)X(t + τ)]
= E{A² sin(ωt + Θ) cos[ω(t + τ) + Θ]}
= (A²/2) E[sin(2ωt + ωτ + 2Θ) + sin(−ωτ)]
= −(A²/2) sin ωτ = R_YX(τ)   (7.118b)
Thus, R_XY(−τ) = −(A²/2) sin ωτ = R_YX(τ)
which verifies Eq. (7.29).
7.16. Let X(t) and Y(t) be defined by
X(t) = A cos ωt + B sin ωt   (7.119a)
Y(t) = B cos ωt − A sin ωt   (7.119b)
where ω is constant and A and B are independent random variables both having zero mean and variance σ². Find the cross-correlation of X(t) and Y(t).
The cross-correlation of X(t) and Y(t) is
R_XY(t₁, t₂) = E[X(t₁)Y(t₂)]
= E[(A cos ωt₁ + B sin ωt₁)(B cos ωt₂ − A sin ωt₂)]
= E[AB](cos ωt₁ cos ωt₂ − sin ωt₁ sin ωt₂)
− E[A²] cos ωt₁ sin ωt₂ + E[B²] sin ωt₁ cos ωt₂
Since E[AB] = E[A]E[B] = 0 and E[A²] = E[B²] = σ²,
we have R_XY(t₁, t₂) = σ²(sin ωt₁ cos ωt₂ − cos ωt₁ sin ωt₂)
= −σ² sin ω(t₂ − t₁) = −σ² sin ωτ = R_XY(τ)   (7.120)
where τ = t₂ − t₁.
‘TRANSMISSION OF RANDOM PROCESSES THROUGH LINEAR SYSTEMS.
7.17. A WSS random process X(t) is applied to the input of an LTI system with impulse response h(t) = 3e^(−2t)u(t). Find the mean value of the output Y(t) of the system if E[X(t)] = 2.
By Eq. (1.53), the frequency response H(ω) of the system is
H(ω) = ℱ[h(t)] = 3/(2 + jω)
Then, by Eq. (7.50), the mean value of Y(t) is
μ_Y(t) = E[Y(t)] = μ_X H(0) = 2(3/2) = 3
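A discrete-time check of the relation μ_Y = μ_X H(0) (a sketch; the time step, truncation of h(t), and the white fluctuation added around the mean are our own assumptions): convolving a constant-mean input with a sampled version of h(t) = 3e^(−2t)u(t) should give an output mean near 2 · (3/2) = 3.

```python
import numpy as np

rng = np.random.default_rng(5)
dt = 0.01
th = np.arange(0, 5, dt)
h = 3 * np.exp(-2 * th)                      # sampled impulse response 3 e^{-2t} u(t)

x = 2 + rng.standard_normal(100_000)         # discrete stand-in for a WSS input with E[X] = 2
y = np.convolve(x, h)[: len(x)] * dt         # approximate continuous-time convolution

H0 = np.trapz(h, dx=dt)                      # H(0) = area under h(t) = 3/2
print(H0, y[len(h):].mean())                 # output mean ~ mu_X * H(0) = 3
```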
7.18. Let Y(t) be the output of an LTI system with impulse response h(t), when X(t) is applied as input. Show that
(a) R_XY(t₁, t₂) = ∫_{−∞}^{∞} h(β) R_XX(t₁, t₂ − β) dβ   (7.121)
(b) R_YY(t₁, t₂) = ∫_{−∞}^{∞} h(α) R_XY(t₁ − α, t₂) dα   (7.122)
(a) Using Eq. (7.47), we have
R_XY(t₁, t₂) = E[X(t₁)Y(t₂)]
= E[ X(t₁) ∫_{−∞}^{∞} h(β) X(t₂ − β) dβ ]
= ∫_{−∞}^{∞} h(β) E[X(t₁)X(t₂ − β)] dβ
= ∫_{−∞}^{∞} h(β) R_XX(t₁, t₂ − β) dβ
(b) Similarly,
R_YY(t₁, t₂) = E[Y(t₁)Y(t₂)]
= E[ ∫_{−∞}^{∞} h(α) X(t₁ − α) dα Y(t₂) ]
= ∫_{−∞}^{∞} h(α) E[X(t₁ − α)Y(t₂)] dα
= ∫_{−∞}^{∞} h(α) R_XY(t₁ − α, t₂) dα
7.19. Let X(t) be a WSS random input process to an LTI system with impulse response h(t), and let Y(t) be the corresponding output process. Show that
(a) R_XY(τ) = h(τ) * R_XX(τ)   (7.123)
(b) R_YY(τ) = h(−τ) * R_XY(τ)   (7.124)
(c) S_XY(ω) = H(ω) S_XX(ω)   (7.125)
(d) S_YY(ω) = H*(ω) S_XY(ω)   (7.126)
where * denotes the convolution and H*(ω) is the complex conjugate of H(ω).
(a) If X(t) is WSS, then Eq. (7.121) of Prob. 7.18 becomes
R_XY(t₁, t₂) = ∫_{−∞}^{∞} h(β) R_XX(t₂ − t₁ − β) dβ   (7.127)
which indicates that R_XY(t₁, t₂) is a function of the time difference τ = t₂ − t₁ only. Hence, Eq. (7.127) yields
R_XY(τ) = ∫_{−∞}^{∞} h(β) R_XX(τ − β) dβ = h(τ) * R_XX(τ)
(b) Similarly, if X(t) is WSS, then Eq. (7.122) becomes
R_YY(t₁, t₂) = ∫_{−∞}^{∞} h(α) R_XY(t₂ − t₁ + α) dα
or R_YY(τ) = ∫_{−∞}^{∞} h(α) R_XY(τ + α) dα = h(−τ) * R_XY(τ)
(c) Taking the Fourier transform of both sides of Eq. (7.123) and using Eq. (1.28), we obtain
S_XY(ω) = H(ω) S_XX(ω)
(d) Similarly, taking the Fourier transform of both sides of Eq. (7.124) and using Eqs. (7.36), (1.28), and (1.21), we obtain
S_YY(ω) = H*(ω) S_XY(ω)
Note that by combining Eqs. (7.125) and (7.126), we obtain Eq. (7.53); that is,
S_YY(ω) = H*(ω)H(ω) S_XX(ω) = |H(ω)|² S_XX(ω)
Let X() and ¥(@ be the wide-sense stationary random input process and random output process,
respectively, of a quadrature phase-shifting filter (—n/2 rad phase shifter of Sec. 2.6). Show that
(a) Raxlt) = Rw) (7.128)
& Ray(o) = Rex) (7.129)
where Ryx(1) is the Hilbert transform of Rxx(7).CHAP. 71 RANDOM PROCESSES 189
TU.
7.22.
(a) The Hilbert transform X(t) of X(0) was defined in See, 2.6 asthe output of a quadrature phase-shifting filter
with
n= 4 He)=-jseno)
Since |H()|? = 1, we conclude that if X() is a WSS random signal, then ¥(¢) = X(@) and by Eq. (7.53)
Srl) = IH(@)PS(@) = Sexo)
Hence, Ry) = FSO = F [Sy] = Ral)
(®) Using Eqs. (7.123) and (2.26), we have
Rey) =H) # Re) = 4 # Rete) = Rex
7.21. A WSS random process X(t) with autocorrelation
R_XX(τ) = A e^(−a|τ|)
where A and a are real positive constants, is applied to the input of an LTI system with impulse response
h(t) = e^(−bt) u(t)
where b is a real positive constant. Find the autocorrelation of the output Y(t) of the system.
Using Eq. (1.53), we see that the frequency response H(ω) of the system is
H(ω) = ℱ[h(t)] = 1/(b + jω)
So |H(ω)|² = 1/(b² + ω²)
Using Eq. (1.55), we see that the power spectral density of X(t) is
S_XX(ω) = ℱ[R_XX(τ)] = 2aA/(ω² + a²)
By Eq. (7.53), the power spectral density of Y(t) is
S_YY(ω) = |H(ω)|² S_XX(ω)
= 2aA / [(ω² + b²)(ω² + a²)]
= [A/(b² − a²)] (2a/(ω² + a²)) − [aA/(b(b² − a²))] (2b/(ω² + b²))
Taking the inverse Fourier transform of both sides of the above equation and using Eq. (1.55), we obtain
R_YY(τ) = [A/(b² − a²)] e^(−a|τ|) − [aA/(b(b² − a²))] e^(−b|τ|)
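The closed form for R_YY(τ) can be cross-checked by evaluating the inverse-transform integral of |H(ω)|² S_XX(ω) numerically (a sketch of our own; A = 1, a = 1, b = 3, and the test lag are arbitrary choices):

```python
import numpy as np

A, a, b = 1.0, 1.0, 3.0                      # assumed constants (a != b)
w = np.linspace(-200, 200, 2**18)

Sxx = 2 * a * A / (w**2 + a**2)              # Fourier transform of A e^(-a|tau|)
Syy = Sxx / (w**2 + b**2)                    # |H(w)|^2 S_XX(w), with |H(w)|^2 = 1/(b^2 + w^2)

tau = 0.7                                    # any test lag
R_num = np.trapz(Syy * np.cos(w * tau), w) / (2 * np.pi)
R_closed = (A / (b**2 - a**2)) * (np.exp(-a * abs(tau)) - (a / b) * np.exp(-b * abs(tau)))
print(R_num, R_closed)                       # should agree closely
```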
7.22. Verify Eq. (7.38); that is, the power spectrum of any WSS process X(t) is real and
S_XX(ω) ≥ 0
for every ω.
The realness of the power spectrum of X(t) was shown in Prob. 7.9. Consider an ideal bandpass filter with frequency response (Fig. 7-12)
H(ω) = 1 for ω₁ < |ω| < ω₂, and 0 otherwise

Substituting Eqs. (7.144) and (7.146) into Eq. (7.58), we obtain Eq. (7.141).
Let X₁ = X(t₁) and X₂ = X(t₂) be jointly gaussian random variables, each with a zero mean and a variance σ². Show that the joint bivariate gaussian density function is given by
f_{X₁X₂}(x₁, x₂) = [1/(2πσ²√(1 − ρ²))] exp[ −(x₁² − 2ρx₁x₂ + x₂²) / (2σ²(1 − ρ²)) ]   (7.147)
where ρ is the correlation coefficient of X₁ and X₂ given by ρ = C₁₂/σ² [Eq. (6.83)].
Substituting μ₁ = μ₂ = 0, C₁₁ = C₂₂ = σ², and C₁₂ = C₂₁ = ρσ² into Eq. (7.58), we have
C = [σ²  ρσ²; ρσ²  σ²]
det C = σ⁴(1 − ρ²)
C⁻¹ = [1/(σ²(1 − ρ²))] [1  −ρ; −ρ  1]
Since x̄ = 0, x − x̄ = x, and
(x − x̄)ᵀ C⁻¹ (x − x̄) = (x₁² − 2ρx₁x₂ + x₂²)/(σ²(1 − ρ²))
Substituting these results in Eq. (7.58), we obtain Eq. (7.147).
Note that Eq. (7.147) can be obtained from Eq. (6.107) (Prob. 6.28) by setting X = X₁, Y = X₂, μ_X = μ_Y = 0, and σ_X = σ_Y = σ.
The relation between the input X(t) and output Y(t) of a diode is expressed as
Y(t) = X²(t)   (7.148)
Let X(t) be a zero-mean stationary gaussian random process with autocorrelation
R_XX(τ) = e^(−a|τ|)   a > 0
Find the output mean μ_Y(t), the output autocorrelation R_YY(τ), and the output power spectral density S_YY(ω).
μ_Y(t) = E[Y(t)] = E[X²(t)] = R_XX(0)   (7.149)
R_YY(t₁, t₂) = E[Y(t₁)Y(t₂)] = E[X²(t₁)X²(t₂)]
Since X(t₁) and X(t₂) are zero-mean jointly gaussian random variables, by the result of Prob. 6.71,
E[X²(t₁)X²(t₂)] = E[X²(t₁)]E[X²(t₂)] + 2{E[X(t₁)X(t₂)]}²   (7.150)
Since X(t) is stationary,
E[X²(t₁)] = E[X²(t₂)] = R_XX(0)
and E[X(t₁)X(t₂)] = R_XX(t₂ − t₁) = R_XX(τ)
Hence, R_YY(t₁, t₂) = R_YY(τ) = [R_XX(0)]² + 2[R_XX(τ)]²   (7.151)
and, taking the Fourier transform, we have
S_YY(ω) = ℱ[R_YY(τ)] = 2π[R_XX(0)]²δ(ω) + (1/π) S_XX(ω) * S_XX(ω)   (7.152)
Now, for the given input autocorrelation, by Eqs. (7.149) and (7.151),
μ_Y = R_XX(0) = 1
and R_YY(τ) = 1 + 2e^(−2a|τ|)
Taking the Fourier transform and using Eq. (1.55), the output power spectral density is
S_YY(ω) = ℱ[R_YY(τ)] = 2πδ(ω) + 8a/(ω² + 4a²)
7.32. The input X(t) to the RC filter shown in Fig. 7-18 is a white noise process.
(a) Determine the power spectrum of the output process Y(t).
(b) Determine the autocorrelation and the mean-square value of Y(t).
Fig. 7-18 RC filter
From Prob. 2.6, the frequency response of the RC filter is
H(ω) = 1/(1 + jωRC)
(a) From Eqs. (7.62) and (7.53),
S_YY(ω) = |H(ω)|² S_XX(ω) = (η/2) / [1 + (ωRC)²]   (7.153)
(b) Rewriting Eq. (7.153) as
S_YY(ω) = [η/(4RC)] · (2/(RC)) / [ω² + (1/(RC))²]
and using the Fourier transform pair Eq. (1.55), we obtain
R_YY(τ) = [η/(4RC)] e^(−|τ|/(RC))   (7.154)
Finally, from Eq. (7.154),
E[Y²(t)] = R_YY(0) = η/(4RC)   (7.155)
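A short numerical check of Eq. (7.155) (a sketch; the noise level η = 2 and the component values are arbitrary assumptions): integrating S_YY(ω) of Eq. (7.153) over frequency and dividing by 2π should reproduce η/(4RC).

```python
import numpy as np

eta, R, C = 2.0, 1_000.0, 1e-6               # assumed white-noise level eta/2 and RC values
RC = R * C

w = np.linspace(-2e6, 2e6, 2**20)
Syy = (eta / 2) / (1 + (w * RC)**2)          # Eq. (7.153)
EY2 = np.trapz(Syy, w) / (2 * np.pi)         # mean-square value = (1/2pi) * integral of S_YY
print(EY2, eta / (4 * RC))                   # both ~ eta/(4RC) = 500
```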
7.33. The input X(t) to an ideal bandpass filter having the frequency response characteristic shown in Fig. 7-19 is a white noise process. Determine the total noise power at the output of the filter.
Fig. 7-19
S_XX(ω) = η/2
S_YY(ω) = |H(ω)|² S_XX(ω) = (η/2)|H(ω)|²
The total noise power at the output of the filter is
E[Y²(t)] = (1/2π) ∫_{−∞}^{∞} S_YY(ω) dω = (η/2)(1/2π) ∫_{−∞}^{∞} |H(ω)|² dω = η W_B/(2π) = ηB   (7.156)
where B = W_B/(2π) (in Hz).
7.34. The equivalent noise bandwidth of a system is defined as
B_eq = (1/(2π)) [∫₀^∞ |H(ω)|² dω] / max|H(ω)|²   Hz   (7.157)
where max|H(ω)|² is the maximum value of |H(ω)|².
(a) Determine the equivalent noise bandwidth of the ideal bandpass filter shown in Fig. 7-19.
(b) Determine the equivalent noise bandwidth of the low-pass RC filter shown in Fig. 7-18, and compare the result with the 3-dB bandwidth obtained in Prob. 2.9.
(a) For the ideal bandpass filter shown in Fig. 7-19, we have max|H(ω)|² = 1 and
B_eq = (1/(2π)) ∫₀^∞ |H(ω)|² dω = W_B/(2π) = B Hz   (7.158)
(b) For the low-pass RC filter shown in Fig. 7-18, we have
|H(ω)|² = 1/[1 + (ωRC)²]
and max|H(ω)|² = |H(0)|² = 1
Thus, B_eq = (1/(2π)) ∫₀^∞ dω/[1 + (ωRC)²] = (1/(2π))(1/(RC))(π/2) = 1/(4RC) Hz   (7.159)
From Prob. 2.9, the 3-dB bandwidth is
B_3dB = (1/(2π)) W_3dB = 1/(2πRC) Hz
Thus, B_eq = (π/2) B_3dB ≈ 1.57 B_3dB
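Eq. (7.159) is easy to confirm numerically from the definition (7.157) (a sketch; R = 1 kΩ and C = 1 μF are arbitrary values, and the infinite integral is truncated):

```python
import numpy as np

R, C = 1_000.0, 1e-6                         # assumed RC values
RC = R * C

w = np.linspace(0, 5e6, 2_000_001)           # truncation of the (0, inf) integral
H2 = 1 / (1 + (w * RC)**2)
B_eq = np.trapz(H2, w) / (2 * np.pi * H2.max())
print(B_eq, 1 / (4 * RC))                    # both ~ 250 Hz; the 3-dB bandwidth 1/(2*pi*RC) ~ 159 Hz
```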
7.35. A narrowband white noise X(t) has the power spectrum shown in Fig. 7-20(a). Represent X(t) in terms of quadrature components. Derive the power spectra of X_c(t) and X_s(t), and show that
E[X_c²(t)] = E[X_s²(t)] = E[X²(t)]   (7.160)
From Eq. (7.67),
X(t) = X_c(t) cos ω_c t − X_s(t) sin ω_c t
From Eq. (7.70) and Fig. 7-20(a), we have
S_XcXc(ω) = S_XsXs(ω) = η for |ω| ≤ 2πB, and 0 otherwise
Fig. 7-20
which is plotted in Fig. 7-20(b). From Fig. 7-20(a),
E[X²(t)] = (1/2π) ∫_{−∞}^{∞} S_XX(ω) dω = (1/2π)[2(η/2)(4πB)] = 2ηB
From Fig. 7-20(b),
E[X_c²(t)] = E[X_s²(t)] = (1/2π) ∫_{−∞}^{∞} S_XcXc(ω) dω = (1/2π)[η(4πB)] = 2ηB
Hence, E[X_c²(t)] = E[X_s²(t)] = E[X²(t)] = 2ηB   (7.161)
Supplementary Problems
7.36. Consider a random process X(t) defined by
X(t) = cos Ωt
where Ω is a random variable uniformly distributed over [0, ω₀]. Determine whether X(t) is stationary.
Ans. Nonstationary
Hint: Examine specific sample functions of X(t) for different frequencies, say, Ω = π/2, π, and 2π.
7.37. Consider the random process X(t) defined by
X(t) = A cos ωt
where ω is a constant and A is a random variable uniformly distributed over [0, 1]. Find the autocorrelation and autocovariance of X(t).
Ans. R_XX(t₁, t₂) = (1/3) cos ωt₁ cos ωt₂
C_XX(t₁, t₂) = (1/12) cos ωt₁ cos ωt₂
7.38. Let X(t) be a WSS random process with autocorrelation
R_XX(τ) = A e^(−a|τ|)
Find the second moment of the random variable Y = X(5) − X(2).
Ans. 2A(1 − e^(−3a))
7.39. Let X(t) be a zero-mean WSS random process with autocorrelation R_XX(τ). A random variable Y is formed by integrating X(t):
Y = (1/(2T)) ∫_{−T}^{T} X(t) dt
Find the mean and variance of Y.
Ans. μ_Y = 0,  σ_Y² = (1/(2T)) ∫_{−2T}^{2T} (1 − |τ|/(2T)) R_XX(τ) dτ
7.40. A sample function of a random telegraph signal X(t) is shown in Fig. 7-21. This signal makes independent random shifts between two equally likely values, A and 0. The number of shifts per unit time is governed by the Poisson distribution with parameter λ.
(a) Find the autocorrelation and the power spectrum of X(t).
(b) Find the rms value of X(t).
Fig. 7-21
Ans. (a) R_XX(τ) = (A²/4)(1 + e^(−2λ|τ|)),  S_XX(ω) = (πA²/2)δ(ω) + A²λ/(ω² + 4λ²)
(b) A/√2
7.41. Suppose that X(t) is a gaussian process with
μ_X = 2,  R_XX(τ) = 5e^(−0.2|τ|)
Find the probability that X(4) ≤ 1.
Ans. 0.159
7.42. The output of a filter is given by
Y(t) = X(t + T) − X(t − T)
where X(t) is a WSS process with power spectrum S_XX(ω) and T is a constant. Find the power spectrum of Y(t).
Ans. S_YY(ω) = 4 sin²(ωT) S_XX(ω)
7.43. Let X̂(t) be the Hilbert transform of a WSS process X(t). Show that
R_{X̂X}(0) = E[X̂(t)X(t)] = 0
Hint: Use relation (b) of Prob. 7.20 and definition (2.26).
7.45. When a metallic resistor R is at temperature T, random electron motion produces a noise voltage V(t) at the open-circuited terminals. This voltage V(t) is known as thermal noise. Its power spectrum S_VV(ω) is practically constant for f ≤ 10¹² Hz and is given by
S_VV(ω) = 2kTR
where k = Boltzmann constant = 1.37 × 10⁻²³ joule per kelvin (J/K)
T = absolute temperature, kelvins (K)
R = resistance, ohms (Ω)
Calculate the thermal noise voltage (rms value) across the simple RC circuit shown in Fig. 7-22, with R = 1 kilohm (kΩ), C = 1 microfarad (μF), at T = 27°C.
Fig. 7-22
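A sketch of the calculation for Prob. 7.45, following the same steps as Prob. 7.32 (the RC circuit acts as a low-pass filter on the 2kTR source spectrum); the closed form kT/C and the numbers below are our own working, not a value quoted from the text:

```python
import math

k = 1.37e-23            # Boltzmann constant as given in the problem statement (J/K)
T = 27 + 273            # 27 degrees C expressed in kelvins
R, C = 1e3, 1e-6        # 1 kilohm, 1 microfarad

# As in Prob. 7.32, with S_VV(w) = 2kTR and H(w) = 1/(1 + jwRC):
# E[V_o^2] = (1/2pi) * integral of 2kTR / (1 + (wRC)^2) dw = kT/C
v_rms = math.sqrt(k * T / C)
print(v_rms)            # roughly 6.4e-8 V, i.e. about 0.06 microvolt rms
```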