CHAPTER 6

PROBABILITY AND RANDOM VARIABLES

6.1 INTRODUCTION
Thus far we have discussed the transmission of deterministic signals over a channel, and we have emphasized the central role played by the concept of "randomness" in communication, where random means unpredictable. If the receiver at the end of a channel knew in advance the message output from the originating source, there would be no need for communication. So there is randomness in the message source. Moreover, transmitted signals are always accompanied by noise introduced in the system. These noise waveforms are also unpredictable. The objective of this chapter is to present the mathematical background essential for further study of communication systems.
6.2 PROBABILITY

A. Random Experiments:

In the study of probability, any process of observation is referred to as an experiment. The results of an observation are called the outcomes of the experiment. An experiment is called a random experiment if its outcome cannot be predicted. Typical examples of a random experiment are the roll of a die, the toss of a coin, drawing a card from a deck, or selecting a message signal for transmission from several messages.
B. Sample Space and Events:
The set of all possible outcomes of a random experiment is called the sample space S. An element in S is called a sample point. Each outcome of a random experiment corresponds to a sample point.
A set A is called a subset of B, denoted by A ⊂ B, if every element of A is also an element of B. Any subset of the sample space S is called an event. A sample point of S is often referred to as an elementary event. Note that the sample space S is a subset of itself, that is, S ⊂ S. Since S is the set of all possible outcomes, it is often called the certain event.
C. Algebra of Events:
1. The complement of event A, denoted Ā, is the event containing all sample points in S but not in A.
2. The union of events A and B, denoted A ∪ B, is the event containing all sample points in either A or B or both.
3. The intersection of events A and B, denoted A ∩ B, is the event containing all sample points in both A and B.
4. The event containing no sample point is called the null event, denoted ∅. Thus ∅ corresponds to an impossible event.
5. Two events A and B are called mutually exclusive or disjoint if they contain no common sample point, that is, A ∩ B = ∅.
By the preceding set of definitions, we obtain the following identities:
    S̄ = ∅        ∅̄ = S
    S ∪ A = S    S ∩ A = A
    A ∪ Ā = S    A ∩ Ā = ∅
D. Probabilities of Events:
An assignment of real numbers to the events defined on S is known as the probability measure. In the axiomatic definition, the probability P(A) of the event A is a real number assigned to A that satisfies the following three axioms:

    Axiom 1:    P(A) ≥ 0                                      (6.1)
    Axiom 2:    P(S) = 1                                      (6.2)
    Axiom 3:    P(A ∪ B) = P(A) + P(B)    if A ∩ B = ∅        (6.3)
With the preceding axioms, the following useful properties of probability can be obtained (Probs. 6.1-6.4):
1.  P(Ā) = 1 − P(A)                                 (6.4)
2.  P(∅) = 0                                        (6.5)
3.  P(A) ≤ P(B)    if A ⊂ B                         (6.6)
4.  P(A) ≤ 1                                        (6.7)
5.  P(A ∪ B) = P(A) + P(B) − P(A ∩ B)               (6.8)
Note that Property 4 can be easily derived from axiom 2 and property 3. Since A ⊂ S, we have

    P(A) ≤ P(S) = 1

Thus, combining with axiom 1, we obtain

    0 ≤ P(A) ≤ 1    (6.9)
Property 5 implies that

    P(A ∪ B) ≤ P(A) + P(B)    (6.10)

since P(A ∩ B) ≥ 0 by axiom 1.
One can also define P(A) intuitively, in terms of relative frequency. Suppose that a random experiment is repeated n times. If an event A occurs n_A times, then its probability P(A) is defined as

    P(A) = lim_{n→∞} n_A / n    (6.11)

Note that this limit may not exist.
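The relative-frequency definition is easy to visualize numerically. The following Python sketch (an illustration added here, not part of the original text) estimates the probability of rolling a six with a fair die for increasing n; the seed and sample sizes are arbitrary.

```python
import random

# Estimate P(A) by the relative frequency n_A / n of Eq. (6.11),
# where A = {die shows a six}. For a fair die, P(A) = 1/6 ≈ 0.1667.
random.seed(1)  # fixed seed so the run is repeatable

for n in (100, 10_000, 1_000_000):
    n_A = sum(1 for _ in range(n) if random.randint(1, 6) == 6)
    print(f"n = {n:>9,}: n_A/n = {n_A / n:.4f}")
```

As n grows, the estimates cluster around 1/6, in the sense made precise by Eq. (6.11).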
E. Equally Likely Events:
Consider a finite sample space S with n elements

    S = {λ₁, λ₂, ..., λₙ}

where the λᵢ's are elementary events. Let P(λᵢ) = pᵢ. Then

1.  0 ≤ pᵢ ≤ 1,    i = 1, 2, ..., n
2.  p₁ + p₂ + ··· + pₙ = 1    (6.12)
3.  If A = ⋃_{i∈I} λᵢ, where I is a collection of subscripts, then

        P(A) = ∑_{i∈I} P(λᵢ) = ∑_{i∈I} pᵢ

When all elementary events are equally likely, that is, p₁ = p₂ = ··· = pₙ = 1/n, then for any event A

    P(A) = n(A)/n

where n(A) is the number of sample points in A and n is the number of sample points in S.

F. Conditional Probability:

The conditional probability of an event A given event B, denoted P(A|B), is defined as

    P(A|B) = P(A ∩ B)/P(B),    P(B) > 0    (6.16)
where P(A ∩ B) is the joint probability of A and B. Similarly,
    P(B|A) = P(A ∩ B)/P(A),    P(A) > 0    (6.17)

is the conditional probability of an event B given event A. From Eqs. (6.16) and (6.17) we have

    P(A ∩ B) = P(A|B)P(B) = P(B|A)P(A)    (6.18)
Equation (6.18) is often quite useful in computing the joint probability of events.
From Eq. (6.18) we can obtain the following Bayes rule:

    P(A|B) = P(B|A)P(A) / P(B)    (6.19)
G. Independent Events:
Two events A and B are said to be (statistically) independent if
    P(A|B) = P(A)    and    P(B|A) = P(B)    (6.20)
This, together with Eq. (6.18), implies that for two statistically independent events
    P(A ∩ B) = P(A)P(B)    (6.21)
We may also extend the definition of independence to more than two events. The events A₁, A₂, ..., Aₙ are independent if and only if for every subset {A_{i₁}, A_{i₂}, ..., A_{iₖ}} (2 ≤ k ≤ n) of these events,

    P(A_{i₁} ∩ A_{i₂} ∩ ··· ∩ A_{iₖ}) = P(A_{i₁})P(A_{i₂}) ··· P(A_{iₖ})    (6.22)

H. Total Probability:

Let A₁, A₂, ..., Aₙ be n mutually exclusive and exhaustive events; that is,

    S = A₁ ∪ A₂ ∪ ··· ∪ Aₙ    and    Aᵢ ∩ Aⱼ = ∅ for i ≠ j    (6.23)

Let B be any event in S. Then

    P(B) = ∑_{k=1}^{n} P(B ∩ Aₖ) = ∑_{k=1}^{n} P(B|Aₖ)P(Aₖ)    (6.24)

which is known as the total probability of event B (Prob. 6.13). Let A = Aᵢ in Eq. (6.19); using Eq. (6.24), we obtain
POBIAY)
P(A IB) = => —
& PwlA)PLA),
a
(6.25)
Note that the terms on the right-hand side are all conditioned on events Aₖ, while the term on the left is conditioned on B. Equation (6.25) is sometimes referred to as Bayes' theorem.
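As a numerical illustration of Eqs. (6.24) and (6.25), the following Python sketch computes a posterior distribution over a three-event partition; the priors and likelihoods are arbitrary values chosen for the example.

```python
# Total probability (Eq. 6.24) and Bayes' theorem (Eq. 6.25) for a
# partition A1, A2, A3 of S. The numbers are illustrative only.
priors = [0.5, 0.3, 0.2]        # P(A_k); must sum to 1
likelihoods = [0.9, 0.5, 0.1]   # P(B | A_k)

p_B = sum(l * p for l, p in zip(likelihoods, priors))            # Eq. (6.24)
posteriors = [l * p / p_B for l, p in zip(likelihoods, priors)]  # Eq. (6.25)

print(f"P(B) = {p_B:.3f}")
for k, post in enumerate(posteriors, start=1):
    print(f"P(A{k}|B) = {post:.3f}")
```

Note that the posteriors sum to 1, as they must, since the Aₖ partition S.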
6.3 RANDOM VARIABLES
A. Random Variables:
Consider a random experiment with sample space S. A random variable X(λ) is a single-valued real function that assigns a real number, called the value of X(λ), to each sample point λ of S. Often we use a single letter X for this function in place of X(λ), and use r.v. to denote the random variable. A schematic diagram representing an r.v. is given in Fig. 6-1.
Fig. 6-1 Random variable X as a function
The sample space S is termed the domain of the r.v. X, and the collection of all numbers [values of X(λ)] is termed the range of the r.v. X. Thus, the range of X is a certain subset of the set of all real numbers, and it is usually denoted by R_X. Note that two or more different sample points might give the same value of X(λ), but two different numbers in the range cannot be assigned to the same sample point.
The r.v. X induces a probability measure on the real line as follows:

    P(X = x) = P{λ: X(λ) = x}
    P(X ≤ x) = P{λ: X(λ) ≤ x}
    P(x₁ < X ≤ x₂) = P{λ: x₁ < X(λ) ≤ x₂}

B. Poisson Distribution:

A r.v. X is called a Poisson r.v. with parameter α (> 0) if its pmf is given by
    p_X(k) = P(X = k) = e^{−α} αᵏ / k!,    k = 0, 1, 2, ...    (6.88)
The corresponding cdf of X is

    F_X(x) = e^{−α} ∑_{k=0}^{n} αᵏ / k!,    n ≤ x < n + 1    (6.89)
The mean and variance of the Poisson r.v. X are (Prob. 6.42)

    μ_X = α    σ_X² = α    (6.90)
The Poisson distribution arises in some problems involving counting, for example, monitoring the number of telephone calls arriving at a switching center during various intervals of time. In digital communication, the Poisson distribution is pertinent to the problem of the transmission of many data bits when the error rates are low. The binomial distribution becomes awkward to handle in such cases. However, if the mean value of the error rate remains finite and equal to α, we can approximate the binomial distribution by the Poisson distribution. (See Probs. 6.19 and 6.20.)
C. Normal (or Gaussian) Distribution:
A r.v. X is called a normal (or gaussian) r.v. if its pdf is of the form

    f_X(x) = [1/(√(2π) σ)] e^{−(x−μ)²/(2σ²)}    (6.91)
The corresponding cdf of X is

    F_X(x) = [1/(√(2π) σ)] ∫_{−∞}^{x} e^{−(ξ−μ)²/(2σ²)} dξ = (1/√(2π)) ∫_{−∞}^{(x−μ)/σ} e^{−ξ²/2} dξ    (6.92)
This integral cannot be evaluated in a closed form and must be evaluated numerically. It is convenient to use the function Q(z), defined as

    Q(z) = (1/√(2π)) ∫_{z}^{∞} e^{−ξ²/2} dξ    (6.93)
Then Eq. (6.92) can be written as

    F_X(x) = 1 − Q[(x − μ)/σ]    (6.94)
The function Q(z) is known as the complementary error function, or simply the Q function. The function Q(z) is tabulated in Table C-1 (App. C). Figure 6-3 illustrates a normal distribution.
Fig. 6-3 Normal distribution

The mean and variance of X are (Prob. 6.43)

    μ_X = μ    σ_X² = σ²    (6.95)
We shall use the notation N(μ; σ²) to denote that X is normal with mean μ and variance σ². In particular, X = N(0; 1), that is, X with zero mean and unit variance, is defined as a standard normal r.v.
The normal (or gaussian) distribution has played a significant role in the study of random phenomena in nature. Many naturally occurring random phenomena are approximately normal. Another reason for the importance of the normal distribution is a remarkable theorem called the central-limit theorem. This theorem states that the sum of a large number of independent random variables, under certain conditions, can be approximated by a normal distribution.
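Although Q(z) has no closed form, it is related to the standard complementary error function erfc, which most numerical libraries provide: Q(z) = ½ erfc(z/√2). A minimal Python sketch (added for illustration):

```python
import math

# Q function of Eq. (6.93), via Q(z) = 0.5 * erfc(z / sqrt(2)).
def Q(z):
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Normal cdf of Eq. (6.94): F_X(x) = 1 - Q((x - mu) / sigma)
def normal_cdf(x, mu, sigma):
    return 1.0 - Q((x - mu) / sigma)

for z in (0.0, 1.0, 2.0, 3.0):
    print(f"Q({z}) = {Q(z):.5f}")       # Q(0) = 0.5, Q(1) ≈ 0.15866, ...
print(f"F_X(0) for N(0;1) = {normal_cdf(0.0, 0.0, 1.0):.3f}")   # 0.500
```

Values computed this way can be checked against Table C-1.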
Solved Problems
PROBABILITY
6.1. Using the axioms of probability, prove Eq. (6.4).
Let A be any event in S. Then

    S = A ∪ Ā    and    A ∩ Ā = ∅

Then the use of axioms 2 and 3 yields

    P(S) = 1 = P(A) + P(Ā)

Thus,

    P(Ā) = 1 − P(A)
6.2. Verify Eq. (6.5).

    A = A ∪ ∅    and    A ∩ ∅ = ∅

Therefore, by axiom 3,

    P(A) = P(A ∪ ∅) = P(A) + P(∅)

and we conclude that

    P(∅) = 0
6.3. Verify Eq. (6.6).

Let A ⊂ B. Then from the Venn diagram shown in Fig. 6-4, we see that

    B = A ∪ (B ∩ Ā)    and    A ∩ (B ∩ Ā) = ∅

Hence, from axiom 3,

    P(B) = P(A) + P(B ∩ Ā) ≥ P(A)

because by axiom 1, P(B ∩ Ā) ≥ 0.

Fig. 6-4  Shaded region: B ∩ Ā
6.4. Verify Eq. (6.8).

From the Venn diagram of Fig. 6-5, each of the sets A ∪ B and B can be expressed, respectively, as a union of mutually exclusive sets as follows:

    A ∪ B = A ∪ (Ā ∩ B)    and    B = (A ∩ B) ∪ (Ā ∩ B)

Thus, by axiom 3,

    P(A ∪ B) = P(A) + P(Ā ∩ B)    (6.96)

and

    P(B) = P(A ∩ B) + P(Ā ∩ B)    (6.97)

From Eq. (6.97) we have

    P(Ā ∩ B) = P(B) − P(A ∩ B)    (6.98)

Substituting Eq. (6.98) into Eq. (6.96), we obtain

    P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

Fig. 6-5  Shaded regions: Ā ∩ B and A ∩ B
6.5. Let P(A) = 0.9 and P(B) = 0.8. Show that P(A ∩ B) ≥ 0.7.

From Eq. (6.8) we have

    P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

By Eq. (6.9), 0 ≤ P(A ∪ B) ≤ 1. Hence,

    P(A ∩ B) ≥ P(A) + P(B) − 1    (6.99)

Substituting the given values of P(A) and P(B) in Eq. (6.99), we get

    P(A ∩ B) ≥ 0.9 + 0.8 − 1 = 0.7

Equation (6.99) is known as Bonferroni's inequality.
6.6. Show that

    P(A) = P(A ∩ B) + P(A ∩ B̄)    (6.100)
From the Venn diagram of Fig. 6-6, we see that

    A = (A ∩ B) ∪ (A ∩ B̄)    and    (A ∩ B) ∩ (A ∩ B̄) = ∅    (6.101)

Thus, by axiom 3 we have

    P(A) = P(A ∩ B) + P(A ∩ B̄)

Fig. 6-6  A ∩ B and A ∩ B̄
6.7. Consider a telegraph source generating two symbols: dot and dash. We observed that the dots were twice as likely to occur as the dashes. Find the probabilities of a dot occurring and a dash occurring.
From the observation, we have
P(dot) = 2P(dash)
Then, by Eq. (6.12),

    P(dot) + P(dash) = 3P(dash) = 1
Thus, P(dash) = 1/3 and P(dot) = 2/3
6.8. Show that P(A|B) defined by Eq. (6.16) satisfies the three axioms of a probability; that is, (a) P(A|B) ≥ 0; (b) P(S|B) = 1; and (c) P(A ∪ C|B) = P(A|B) + P(C|B) if A ∩ C = ∅.

(a) By axiom 1, P(A ∩ B) ≥ 0. Thus,

    P(A|B) = P(A ∩ B)/P(B) ≥ 0

(b) P(S|B) = P(S ∩ B)/P(B) = P(B)/P(B) = 1

(c) Now (A ∪ C) ∩ B = (A ∩ B) ∪ (C ∩ B). If A ∩ C = ∅, then (Fig. 6-7)

    (A ∩ B) ∩ (C ∩ B) = ∅

Hence, by axiom 3,

    P(A ∪ C|B) = P[(A ∪ C) ∩ B]/P(B) = [P(A ∩ B) + P(C ∩ B)]/P(B) = P(A|B) + P(C|B)
6.9. Find P(A|B) if (a) A ∩ B = ∅; (b) A ⊂ B; and (c) B ⊂ A.

(a) If A ∩ B = ∅, then P(A ∩ B) = P(∅) = 0. Thus,
Fig. 6-7
    P(A|B) = P(A ∩ B)/P(B) = 0

(b) If A ⊂ B, then A ∩ B = A and

    P(A|B) = P(A ∩ B)/P(B) = P(A)/P(B) ≥ P(A)

(c) If B ⊂ A, then A ∩ B = B and

    P(A|B) = P(A ∩ B)/P(B) = P(B)/P(B) = 1
6.10. Show that if P(A|B) > P(A), then P(B|A) > P(B).

If P(A|B) = P(A ∩ B)/P(B) > P(A), then P(A ∩ B) > P(A)P(B). Thus,

    P(B|A) = P(A ∩ B)/P(A) > P(A)P(B)/P(A) = P(B)

that is, P(B|A) > P(B).
6.11. Let A and B be events in a sample space S. Show that if A and B are independent, then so are (a) A and B̄ and (b) Ā and B.

(a) From Eq. (6.100) (Prob. 6.6), we have

    P(A) = P(A ∩ B) + P(A ∩ B̄)

Since A and B are independent, using Eqs. (6.21) and (6.4), we obtain

    P(A ∩ B̄) = P(A) − P(A ∩ B) = P(A) − P(A)P(B)
              = P(A)[1 − P(B)] = P(A)P(B̄)    (6.102)

Thus, by Eq. (6.21), A and B̄ are independent.

(b) Interchanging A and B in Eq. (6.102), we obtain

    P(B ∩ Ā) = P(B)P(Ā)

which indicates that Ā and B are independent.
6.12. Let A and B be events defined in a sample space S. Show that if both P(A) and P(B) are nonzero, then events A and B cannot be both mutually exclusive and independent.

Let A and B be mutually exclusive events with P(A) ≠ 0 and P(B) ≠ 0. Then P(A ∩ B) = P(∅) = 0, but P(A)P(B) ≠ 0. Therefore,

    P(A ∩ B) ≠ P(A)P(B)

That is, A and B are not independent.
6.13. Verify Eq. (6.24).

Since B ∩ S = B [and using Eq. (6.23)], we have

    B = B ∩ S = B ∩ (A₁ ∪ A₂ ∪ ··· ∪ Aₙ)
      = (B ∩ A₁) ∪ (B ∩ A₂) ∪ ··· ∪ (B ∩ Aₙ)

Now the events B ∩ Aₖ (k = 1, 2, ..., n) are mutually exclusive, as seen from the Venn diagram of Fig. 6-8. Then by axiom 3 of the probability definition and Eq. (6.18), we obtain

    P(B) = P(B ∩ S) = ∑_{k=1}^{n} P(B ∩ Aₖ) = ∑_{k=1}^{n} P(B|Aₖ)P(Aₖ)

Fig. 6-8
6.14. In a binary communication system (Fig. 6-9), a 0 or a 1 is transmitted. Because of channel noise, a 0 can be received as a 1 and vice versa. Let m₀ and m₁ denote the events of transmitting 0 and 1, respectively. Let r₀ and r₁ denote the events of receiving 0 and 1, respectively. Let P(m₀) = 0.5, P(r₁|m₀) = p = 0.1, and P(r₀|m₁) = q = 0.2.

Fig. 6-9 Binary communication system

(a) Find P(r₀) and P(r₁).
(b) If a 0 was received, what is the probability that a 0 was sent?
(c) If a 1 was received, what is the probability that a 1 was sent?
(d) Calculate the probability of error Pₑ.
(e) Calculate the probability that the transmitted signal is correctly read at the receiver.

(a) From Fig. 6-9, we have

    P(m₁) = 1 − P(m₀) = 1 − 0.5 = 0.5
    P(r₀|m₀) = 1 − P(r₁|m₀) = 1 − p = 1 − 0.1 = 0.9
    P(r₁|m₁) = 1 − P(r₀|m₁) = 1 − q = 1 − 0.2 = 0.8
Using Eq. (6.24), we obtain

    P(r₀) = P(r₀|m₀)P(m₀) + P(r₀|m₁)P(m₁) = 0.9(0.5) + 0.2(0.5) = 0.55
    P(r₁) = P(r₁|m₀)P(m₀) + P(r₁|m₁)P(m₁) = 0.1(0.5) + 0.8(0.5) = 0.45

(b) Using Bayes' rule (6.19), we have

    P(m₀|r₀) = P(m₀)P(r₀|m₀)/P(r₀) = (0.5)(0.9)/0.55 = 0.818

(c) Similarly,

    P(m₁|r₁) = P(m₁)P(r₁|m₁)/P(r₁) = (0.5)(0.8)/0.45 = 0.889

(d) Pₑ = P(r₁|m₀)P(m₀) + P(r₀|m₁)P(m₁) = 0.1(0.5) + 0.2(0.5) = 0.15

(e) The probability that the transmitted signal is correctly read at the receiver is

    P_c = P(r₀|m₀)P(m₀) + P(r₁|m₁)P(m₁) = 0.9(0.5) + 0.8(0.5) = 0.85

Note that the probability of error Pₑ is

    Pₑ = 1 − P_c = 1 − 0.85 = 0.15
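The arithmetic of Prob. 6.14 is summarized in the following Python sketch (added for illustration; the variable names are ours):

```python
# Channel of Prob. 6.14: P(m0) = 0.5, p = P(r1|m0) = 0.1, q = P(r0|m1) = 0.2
p_m0, p, q = 0.5, 0.1, 0.2
p_m1 = 1.0 - p_m0

p_r0 = (1 - p) * p_m0 + q * p_m1        # total probability, Eq. (6.24)
p_r1 = p * p_m0 + (1 - q) * p_m1

p_m0_given_r0 = (1 - p) * p_m0 / p_r0   # Bayes' rule, Eq. (6.19)
p_m1_given_r1 = (1 - q) * p_m1 / p_r1
p_e = p * p_m0 + q * p_m1               # probability of error

print(f"P(r0) = {p_r0:.2f}, P(r1) = {p_r1:.2f}")      # 0.55, 0.45
print(f"P(m0|r0) = {p_m0_given_r0:.3f}")              # 0.818
print(f"P(m1|r1) = {p_m1_given_r1:.3f}")              # 0.889
print(f"Pe = {p_e:.2f}, Pc = {1 - p_e:.2f}")          # 0.15, 0.85
```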
6.15. Consider the binary communication system of Fig. 6-9 with P(r₀|m₀) = 0.9 and P(r₁|m₁) = 0.6. To decide which message was sent from an observed response r₀ or r₁, we utilize the following criterion:
If r₀ is received:
    Decide m₀ if P(m₀|r₀) > P(m₁|r₀)
    Decide m₁ if P(m₁|r₀) > P(m₀|r₀)
If r₁ is received:
    Decide m₀ if P(m₀|r₁) > P(m₁|r₁)
    Decide m₁ if P(m₁|r₁) > P(m₀|r₁)
This criterion is known as the maximum a posteriori probability (MAP) decision criterion (see
Prob. 9.1).
(a) Find the range of P(m₀) for which the MAP criterion prescribes that we decide m₀ if r₀ is received.
(b) Find the range of P(m₀) for which the MAP criterion prescribes that we decide m₁ if r₁ is received.
(c) Find the range of P(m₀) for which the MAP criterion prescribes that we decide m₀ no matter what is received.
(d) Find the range of P(m₀) for which the MAP criterion prescribes that we decide m₁ no matter what is received.
Since P(r₁|m₀) = 1 − P(r₀|m₀) and P(r₀|m₁) = 1 − P(r₁|m₁), we have

    P(r₁|m₀) = 0.1    P(r₀|m₁) = 0.4

(a) By Eq. (6.24), we obtain

    P(r₀) = P(r₀|m₀)P(m₀) + P(r₀|m₁)P(m₁) = 0.9P(m₀) + 0.4[1 − P(m₀)] = 0.5P(m₀) + 0.4

Using Bayes' rule (6.19), we have

    P(m₀|r₀) = P(r₀|m₀)P(m₀)/P(r₀) = 0.9P(m₀)/[0.5P(m₀) + 0.4]

    P(m₁|r₀) = P(r₀|m₁)P(m₁)/P(r₀) = 0.4[1 − P(m₀)]/[0.5P(m₀) + 0.4] = [0.4 − 0.4P(m₀)]/[0.5P(m₀) + 0.4]

Now by the MAP decision rule, we decide m₀ if r₀ is received when P(m₀|r₀) > P(m₁|r₀); that is,

    0.9P(m₀)/[0.5P(m₀) + 0.4] > [0.4 − 0.4P(m₀)]/[0.5P(m₀) + 0.4]

or 0.9P(m₀) > 0.4 − 0.4P(m₀), or 1.3P(m₀) > 0.4, or P(m₀) > 0.4/1.3 ≈ 0.31. Thus, the range of P(m₀) for which the MAP criterion prescribes that we decide m₀ if r₀ is received is

    0.31 < P(m₀) ≤ 1
(b) Similarly, we have

    P(r₁) = P(r₁|m₀)P(m₀) + P(r₁|m₁)P(m₁) = 0.1P(m₀) + 0.6[1 − P(m₀)] = −0.5P(m₀) + 0.6

    P(m₀|r₁) = P(r₁|m₀)P(m₀)/P(r₁) = 0.1P(m₀)/[−0.5P(m₀) + 0.6]

    P(m₁|r₁) = P(r₁|m₁)P(m₁)/P(r₁) = 0.6[1 − P(m₀)]/[−0.5P(m₀) + 0.6] = [0.6 − 0.6P(m₀)]/[−0.5P(m₀) + 0.6]

Now by the MAP decision rule, we decide m₁ if r₁ is received when P(m₁|r₁) > P(m₀|r₁); that is,

    [0.6 − 0.6P(m₀)]/[−0.5P(m₀) + 0.6] > 0.1P(m₀)/[−0.5P(m₀) + 0.6]

or 0.6 − 0.6P(m₀) > 0.1P(m₀), or 0.6 > 0.7P(m₀), or P(m₀) < 0.6/0.7 ≈ 0.86. Thus, the range of P(m₀) for which the MAP criterion prescribes that we decide m₁ if r₁ is received is

    0 ≤ P(m₀) < 0.86
(c) From the result of (b) we see that the range of P(m₀) for which we decide m₀ if r₁ is received is

    P(m₀) > 0.86

Combining with the result of (a), the range of P(m₀) for which we decide m₀ no matter what is received is given by

    0.86 < P(m₀) ≤ 1

(d) Similarly, from the result of (a) we see that the range of P(m₀) for which we decide m₁ if r₀ is received is

    P(m₀) < 0.31

Combining with the result of (b), the range of P(m₀) for which we decide m₁ no matter what is received is given by

    0 ≤ P(m₀) < 0.31
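The thresholds found above are easy to check numerically. The sketch below (illustrative, not part of the original solution) applies the MAP rule by comparing the unnormalized posteriors P(r|m)P(m), since the common denominator P(r) does not affect the comparison:

```python
# MAP decisions for Prob. 6.15: P(r0|m0) = 0.9, P(r1|m1) = 0.6,
# hence P(r1|m0) = 0.1 and P(r0|m1) = 0.4.
def map_decisions(p_m0):
    p_m1 = 1.0 - p_m0
    dec_r0 = "m0" if 0.9 * p_m0 > 0.4 * p_m1 else "m1"
    dec_r1 = "m0" if 0.1 * p_m0 > 0.6 * p_m1 else "m1"
    return dec_r0, dec_r1

for p_m0 in (0.2, 0.5, 0.9):
    print(p_m0, map_decisions(p_m0))
# Below 0.31 both decisions are m1; between 0.31 and 0.86 the decision
# follows the received symbol; above 6/7 ≈ 0.86 both decisions are m0.
```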
6.16. Consider an experiment consisting of the observation of six successive pulse positions on a communication link. Suppose that at each of the six possible pulse positions there can be a
positive pulse, a negative pulse, or no pulse. Suppose also that the individual experiments that
determine the kind of pulse at each possible position are independent. Let us denote the event that
the ith pulse is positive by {xᵢ = +1}, that it is negative by {xᵢ = −1}, and that it is zero by {xᵢ = 0}. Assume that

    P(xᵢ = +1) = p = 0.4    P(xᵢ = −1) = q = 0.3    for i = 1, 2, ..., 6

(a) Find the probability that all pulses are positive.
(b) Find the probability that the first three pulses are positive, the next two are zero, and the last is negative.
(a) Since the individual experiments are independent, by Eq. (6.22) the probability that all pulses are positive is given by

    P[(x₁ = +1) ∩ (x₂ = +1) ∩ ··· ∩ (x₆ = +1)] = P(x₁ = +1)P(x₂ = +1) ··· P(x₆ = +1)
                                                = (0.4)⁶ ≈ 0.0041

(b) From the given assumptions, we have

    P(xᵢ = 0) = 1 − p − q = 0.3

Thus, the probability that the first three pulses are positive, the next two are zero, and the last is negative is given by

    P[(x₁ = +1) ∩ (x₂ = +1) ∩ (x₃ = +1) ∩ (x₄ = 0) ∩ (x₅ = 0) ∩ (x₆ = −1)]
        = P(x₁ = +1)P(x₂ = +1)P(x₃ = +1)P(x₄ = 0)P(x₅ = 0)P(x₆ = −1)
        = p³(1 − p − q)²q = (0.4)³(0.3)²(0.3) ≈ 0.0017
RANDOM VARIABLES
6.17. A binary source generates digits 1 and 0 randomly with probabilities 0.6 and 0.4, respectively.
(@) What is the probability that two 1s and three 0s will occur in a five-digit sequence?
(b) What is the probability that at least three 1s will occur in a five-digit sequence?
(a) Let X be the random variable denoting the number of 1s generated in a five-digit sequence. Since there are only two possible outcomes (1 or 0), the probability of generating 1 is constant, and there are five digits, it is clear that X has a binomial distribution described by Eq. (6.85) with n = 5 and k = 2. Hence, the probability that two 1s and three 0s will occur in a five-digit sequence is

    P(X = 2) = C(5, 2)(0.6)²(0.4)³ ≈ 0.23

(b) The probability that at least three 1s will occur in a five-digit sequence is

    P(X ≥ 3) = 1 − P(X ≤ 2)

where

    P(X ≤ 2) = ∑_{k=0}^{2} C(5, k)(0.6)ᵏ(0.4)^{5−k} = 0.317

Hence,

    P(X ≥ 3) = 1 − 0.317 = 0.683
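These binomial probabilities can be verified with a few lines of Python (an added check, not part of the original solution):

```python
from math import comb

# Prob. 6.17: number of 1s in n = 5 digits with P(digit = 1) = 0.6.
n, p = 5, 0.6

def binom_pmf(k):
    return comb(n, k) * p**k * (1 - p)**(n - k)   # Eq. (6.85)

p_2 = binom_pmf(2)
p_at_most_2 = sum(binom_pmf(k) for k in range(3))
print(f"P(X = 2)  = {p_2:.4f}")              # 0.2304
print(f"P(X >= 3) = {1 - p_at_most_2:.4f}")  # 0.6826
```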
6.18. Let X be a binomial r.v. with parameters (n, p). Show that p_X(k) given by Eq. (6.85) satisfies Eq. (6.33a).

Recall that the binomial expansion formula is given by

    (a + b)ⁿ = ∑_{k=0}^{n} C(n, k) aᵏ bⁿ⁻ᵏ

Thus, by Eq. (6.85),

    ∑_{k=0}^{n} p_X(k) = ∑_{k=0}^{n} C(n, k) pᵏ(1 − p)ⁿ⁻ᵏ = [p + (1 − p)]ⁿ = 1
6.19. Show that when n is very large (n ≫ k) and p is very small (p ≪ 1), the binomial distribution [Eq. (6.85)] can be approximated by the Poisson distribution [Eq. (6.88)]:
    P(X = k) ≈ e^{−np} (np)ᵏ / k!    (6.103)
From Eq. (6.85),

    P(X = k) = C(n, k) pᵏ(1 − p)ⁿ⁻ᵏ = [n(n − 1)···(n − k + 1)/k!] pᵏ(1 − p)ⁿ⁻ᵏ    (6.104)

When n ≫ k and p ≪ 1, then

    n(n − 1)···(n − k + 1) ≈ nᵏ    (1 − p)ⁿ⁻ᵏ ≈ (1 − p)ⁿ ≈ e^{−np}

Substituting these relations into Eq. (6.104), we obtain

    P(X = k) ≈ e^{−np} (np)ᵏ / k!
6.20. A noisy transmission channel has a per-digit error probability pₑ = 0.01.

(a) Calculate the probability of more than one error in 10 received digits.
(b) Repeat (a), using the Poisson approximation, Eq. (6.103).

(a) Let X be a binomial random variable denoting the number of errors in 10 received digits. Then, using Eq. (6.85), we obtain

    P(X > 1) = 1 − P(X = 0) − P(X = 1)
             = 1 − (0.99)¹⁰ − 10(0.01)(0.99)⁹
             ≈ 0.0042

(b) Using Eq. (6.103) with α = npₑ = 10(0.01) = 0.1, we have

    P(X > 1) = 1 − e^{−0.1} − 0.1e^{−0.1} ≈ 0.0047
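A short comparison of the exact binomial value with the Poisson approximation of Eq. (6.103) (an illustrative check, not from the text):

```python
from math import comb, exp

# Prob. 6.20: probability of more than one error in n = 10 digits, pe = 0.01.
n, pe = 10, 0.01

exact = 1 - (1 - pe)**n - comb(n, 1) * pe * (1 - pe)**(n - 1)   # Eq. (6.85)

alpha = n * pe                                   # alpha = n*pe = 0.1
approx = 1 - exp(-alpha) - alpha * exp(-alpha)   # Eq. (6.103)

print(f"exact binomial        = {exact:.4f}")    # ≈ 0.0043
print(f"Poisson approximation = {approx:.4f}")   # ≈ 0.0047
```

The two values agree to within about 10 percent even for n as small as 10, because pₑ is small.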
6.21. Let X be a Poisson r.v. with parameter α. Show that p_X(k) given by Eq. (6.88) satisfies Eq. (6.33a).

By Eq. (6.88),

    ∑_{k=0}^{∞} p_X(k) = e^{−α} ∑_{k=0}^{∞} αᵏ/k! = e^{−α} e^{α} = 1
6.22. Verify Eq. (6.35).

From Eqs. (6.6) and (6.28), we have, for any Δx > 0,

    P(x < X ≤ x + Δx) = F_X(x + Δx) − F_X(x) ≥ 0

Dividing both sides by Δx and letting Δx → 0 yields f_X(x) = dF_X(x)/dx ≥ 0.
A random variable X with the pdf given by Eq. (6.106) is called an exponential random variable with parameter a.

6.25. All manufactured devices and machines fail to work sooner or later. If the failure rate is constant, the time to failure T is modeled as an exponential random variable. Suppose that a particular class of computer memory chips has been found to obey the exponential failure law of Eq. (6.106), with time measured in hours.

(a) Measurements show that the probability that the time to failure exceeds 10⁴ hours (h) for chips in the given class is e⁻¹ (≈ 0.368). Calculate the value of parameter a for this case.
(b) Using the value of parameter a determined in part (a), calculate the time t₀ such that the probability is 0.05 that the time to failure is less than t₀.
(a) Using Eqs. (6.38) and (6.106), we see that the distribution function of T is given by

    F_T(t) = ∫_0^t a e^{−ax} dx = 1 − e^{−at}

Now

    P(T > 10⁴) = 1 − P(T ≤ 10⁴) = 1 − F_T(10⁴) = e^{−a(10⁴)} = e⁻¹

from which we obtain a = 10⁻⁴.

(b) We want

    P(T ≤ t₀) = F_T(t₀) = 0.05

Hence,

    1 − e^{−at₀} = 0.05
or    e^{−at₀} = 0.95

from which we obtain

    t₀ = −10⁴ ln 0.95 ≈ 513 h
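The computation in Prob. 6.25 takes only a few lines of Python (an added illustration):

```python
from math import log

# Prob. 6.25: F_T(t) = 1 - exp(-a*t), with t in hours.
# (a) P(T > 1e4) = exp(-a * 1e4) = exp(-1)  =>  a = 1e-4 per hour
a = 1.0 / 1e4

# (b) P(T < t0) = 0.05  =>  1 - exp(-a*t0) = 0.05  =>  t0 = -ln(0.95)/a
t0 = -log(0.95) / a
print(f"a = {a:.0e} per hour, t0 = {t0:.0f} h")   # t0 ≈ 513 h
```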
6.26. The joint pdf of X and Y is given by

    f_{XY}(x, y) = k e^{−(ax+by)} u(x)u(y)

where a and b are positive constants. Determine the value of constant k.

The value of k is determined by Eq. (6.49b); that is,

    ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{XY}(x, y) dx dy = k ∫_0^∞ e^{−ax} dx ∫_0^∞ e^{−by} dy = k (1/a)(1/b) = k/(ab) = 1

Hence, k = ab.
6.27. The joint pdf of X and Y is given by

    f_{XY}(x, y) = xy e^{−(x²+y²)/2} u(x)u(y)

(a) Find the marginal pdf's f_X(x) and f_Y(y).
(b) Are X and Y independent?

(a) By Eqs. (6.50a) and (6.50b), we have

    f_X(x) = ∫_{−∞}^{∞} f_{XY}(x, y) dy = x e^{−x²/2} u(x) ∫_0^∞ y e^{−y²/2} dy = x e^{−x²/2} u(x)

Since f_{XY}(x, y) is symmetric with respect to x and y, interchanging x and y, we obtain

    f_Y(y) = y e^{−y²/2} u(y)

(b) Since f_{XY}(x, y) = f_X(x)f_Y(y), we conclude that X and Y are independent.
6.28. The random variables X and Y are said to be jointly normal random variables if their joint pdf is given by

    f_{XY}(x, y) = [1/(2πσ_xσ_y√(1 − ρ²))]
                   × exp{ −1/[2(1 − ρ²)] [ (x − μ_x)²/σ_x² − 2ρ(x − μ_x)(y − μ_y)/(σ_xσ_y) + (y − μ_y)²/σ_y² ] }    (6.107)

(a) Find the marginal pdf's of X and Y.
(b) Show that X and Y are independent when ρ = 0.

(a) By Eq. (6.50a), the marginal pdf of X is

    f_X(x) = ∫_{−∞}^{∞} f_{XY}(x, y) dy

By completing the square in the exponent of Eq. (6.107), we obtain

    f_X(x) = [1/(√(2π) σ_x)] exp[−(x − μ_x)²/(2σ_x²)]
             × ∫_{−∞}^{∞} [1/(√(2π) σ_y√(1 − ρ²))] exp{ −[y − μ_y − ρ(σ_y/σ_x)(x − μ_x)]² / [2σ_y²(1 − ρ²)] } dy

Comparing the integrand with Eq. (6.91), we see that the integrand is a normal pdf with mean

    μ_y + ρ(σ_y/σ_x)(x − μ_x)

and variance

    σ_y²(1 − ρ²)

Thus, the integral must be unity, and we obtain

    f_X(x) = [1/(√(2π) σ_x)] exp[−(x − μ_x)²/(2σ_x²)]    (6.108)

In a similar manner, the marginal pdf of Y is

    f_Y(y) = ∫_{−∞}^{∞} f_{XY}(x, y) dx = [1/(√(2π) σ_y)] exp[−(y − μ_y)²/(2σ_y²)]    (6.109)

(b) When ρ = 0, Eq. (6.107) reduces to

    f_{XY}(x, y) = [1/(√(2π) σ_x)] exp[−(x − μ_x)²/(2σ_x²)] [1/(√(2π) σ_y)] exp[−(y − μ_y)²/(2σ_y²)]
                 = f_X(x) f_Y(y)

Hence, X and Y are independent.
FUNCTIONS OF RANDOM VARIABLES
6.29. If X is N(μ; σ²), then show that Z = (X − μ)/σ is a standard normal r.v.; that is, N(0; 1).

The cdf of Z is

    F_Z(z) = P(Z ≤ z) = P[(X − μ)/σ ≤ z] = P(X ≤ σz + μ) = ∫_{−∞}^{σz+μ} [1/(√(2π) σ)] e^{−(x−μ)²/(2σ²)} dx

By the change of variable y = (x − μ)/σ (that is, x = σy + μ), we obtain

    F_Z(z) = ∫_{−∞}^{z} (1/√(2π)) e^{−y²/2} dy

and

    f_Z(z) = (d/dz)F_Z(z) = (1/√(2π)) e^{−z²/2}

which indicates that Z = N(0; 1).
6.30. Verify Eq. (6.57).

Assume that y = g(x) is a continuous, monotonically increasing function [Fig. 6-10(a)]. Then it has an inverse that we denote by x = g⁻¹(y) = h(y). Then

    F_Y(y) = P(Y ≤ y) = P[X ≤ h(y)] = F_X[h(y)]    (6.110)

and

    f_Y(y) = (d/dy)F_Y(y) = (d/dy)F_X[h(y)]

Applying the chain rule of differentiation to this expression yields

    f_Y(y) = f_X[h(y)] h′(y)

which can be written as

    f_Y(y) = f_X(x) (dx/dy),    x = h(y)    (6.111)

If y = g(x) is monotonically decreasing [Fig. 6-10(b)], then

    F_Y(y) = P(Y ≤ y) = P[X ≥ h(y)] = 1 − F_X[h(y)]    (6.112)

and

    f_Y(y) = −f_X(x) (dx/dy),    x = h(y)    (6.113)

Combining Eqs. (6.111) and (6.113), we obtain

    f_Y(y) = f_X(x) |dx/dy|,    x = h(y)

which is valid for any continuous monotonic (increasing or decreasing) function y = g(x).

Fig. 6-10
6.31. Let Y = 2X + 3. If a random variable X is uniformly distributed over [−1, 2], find f_Y(y).

From Eq. (6.105) (Prob. 6.23), we have

    f_X(x) = 1/3,  −1 < x < 2;    0, otherwise

The equation y = g(x) = 2x + 3 has a single solution x₁ = (y − 3)/2, with dx₁/dy = 1/2, and the range −1 < x₁ < 2 corresponds to 1 < y < 7. Thus, by Eq. (6.57),

    f_Y(y) = 1/6,  1 < y < 7;    0, otherwise

6.33. Let Y = X², where X = N(0; 1). Find f_Y(y).

If y > 0, then y = x² has two solutions,
    x₁ = √y    and    x₂ = −√y

Now, y = g(x) = x² and g′(x) = 2x. Hence, by Eq. (6.58),

    f_Y(y) = [1/(2√y)] [f_X(√y) + f_X(−√y)] u(y)    (6.117)

Since X = N(0; 1), from Eq. (6.91) we have

    f_X(x) = (1/√(2π)) e^{−x²/2}    (6.118)

Since f_X(x) is an even function, from Eq. (6.117) we have

    f_Y(y) = (1/√y) f_X(√y) u(y) = [1/√(2πy)] e^{−y/2} u(y)    (6.119)
6.34. The input to a noisy communication channel is a binary random variable X with P(X = 0) = P(X = 1) = ½. The output of the channel Z is given by Z = X + Y, where Y is the additive noise introduced by the channel. Assuming that X and Y are independent and Y = N(0; 1), find the density function of Z.

Using Eqs. (6.24) and (6.26), we have

    F_Z(z) = P(Z ≤ z) = P(Z ≤ z | X = 0)P(X = 0) + P(Z ≤ z | X = 1)P(X = 1)

Now P(Z ≤ z | X = 0) = P(Y ≤ z) = F_Y(z) and P(Z ≤ z | X = 1) = P(Y ≤ z − 1) = F_Y(z − 1), so that F_Z(z) = ½F_Y(z) + ½F_Y(z − 1). Differentiating and using Y = N(0; 1), we obtain

    f_Z(z) = ½[f_Y(z) + f_Y(z − 1)] = [1/(2√(2π))][e^{−z²/2} + e^{−(z−1)²/2}]

STATISTICAL AVERAGES

6.50. Let X be a r.v. that takes on only nonnegative values. Show that for any a > 0,

    P(X ≥ a) ≤ μ_X / a    (6.146)

where μ_X = E[X]. This is known as the Markov inequality.
From Eq. (6.37a),

    P(X ≥ a) = ∫_a^∞ f_X(x) dx

Since f_X(x) = 0 for x < 0,

    μ_X = E[X] = ∫_0^∞ x f_X(x) dx ≥ ∫_a^∞ x f_X(x) dx ≥ a ∫_a^∞ f_X(x) dx

Hence,

    ∫_a^∞ f_X(x) dx = P(X ≥ a) ≤ μ_X / a
6.51. For any ε > 0, show that

    P(|X − μ_X| ≥ ε) ≤ σ_X² / ε²    (6.147)

where μ_X = E[X] and σ_X² is the variance of X. This is known as the Chebyshev inequality.

From Eq. (6.37a),

    P(|X − μ_X| ≥ ε) = ∫_{−∞}^{μ_X−ε} f_X(x) dx + ∫_{μ_X+ε}^{∞} f_X(x) dx = ∫_{|x−μ_X|≥ε} f_X(x) dx

By Eq. (6.73),

    σ_X² = ∫_{−∞}^{∞} (x − μ_X)² f_X(x) dx ≥ ∫_{|x−μ_X|≥ε} (x − μ_X)² f_X(x) dx ≥ ε² ∫_{|x−μ_X|≥ε} f_X(x) dx

Hence,

    P(|X − μ_X| ≥ ε) = ∫_{|x−μ_X|≥ε} f_X(x) dx ≤ σ_X² / ε²
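To see how conservative the Chebyshev bound is, the following sketch (an added illustration) compares it with the exact tail probability of a standard normal r.v., computed from the Q function of Eq. (6.93):

```python
import math

# Exact gaussian tail P(|X| >= eps) versus the Chebyshev bound of Eq. (6.147).
def Q(z):
    return 0.5 * math.erfc(z / math.sqrt(2.0))

for eps in (1.0, 2.0, 3.0):
    exact = 2.0 * Q(eps)              # for N(0;1), P(|X| >= eps) = 2*Q(eps)
    bound = min(1.0, 1.0 / eps**2)    # sigma^2 = 1; a probability cannot exceed 1
    print(f"eps = {eps}: exact = {exact:.4f}, Chebyshev bound = {bound:.4f}")
```

The bound holds for every distribution with variance σ_X², which is why it is much looser than the gaussian-specific values.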
6.52. Let X and Y be real random variables with finite second moments. Show that

    (E[XY])² ≤ E[X²]E[Y²]    (6.148)

This is known as the Cauchy-Schwarz inequality.

Because the mean-square value of a random variable can never be negative,

    E[(X − αY)²] ≥ 0

for any value of α. Expanding this, we obtain

    E[X²] − 2αE[XY] + α²E[Y²] ≥ 0

Choose the value of α for which the left-hand side of this inequality is minimum:

    α = E[XY] / E[Y²]

which results in the inequality

    E[X²] − (E[XY])² / E[Y²] ≥ 0

or

    (E[XY])² ≤ E[X²]E[Y²]
6.53. Verify Eq. (6.84).

From the Cauchy-Schwarz inequality, Eq. (6.148), we have

    {E[(X − μ_X)(Y − μ_Y)]}² ≤ E[(X − μ_X)²] E[(Y − μ_Y)²]

Then

    σ_{XY}² ≤ σ_X² σ_Y²

from which it follows that

    |ρ_{XY}| = |σ_{XY}|/(σ_X σ_Y) ≤ 1
Supplementary Problems
6.54. For any three events A, B, and C, show that

    P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(B ∩ C) − P(C ∩ A) + P(A ∩ B ∩ C)

Hint: Write A ∪ B ∪ C = A ∪ (B ∪ C) and then apply Eq. (6.8).
6.55. Given that P(A) = 0.9, P(B) = 0.8, and P(A ∩ B) = 0.75, find (a) P(A ∪ B); (b) P(A ∩ B̄); and (c) P(Ā ∩ B̄).

Ans. (a) 0.95; (b) 0.15; (c) 0.05
6.56. Show that if events A and B are independent, then

    P(Ā ∩ B̄) = P(Ā)P(B̄)

Hint: Use Eq. (6.102) and the relation B̄ = (A ∩ B̄) ∪ (Ā ∩ B̄).
6.57. Let A and B be events defined in a sample space S. Show that if both P(A) and P(B) are nonzero, then the events A and B cannot be both mutually exclusive and independent.

Hint: Show that condition (6.21) will not hold.
6.58. A certain computer becomes inoperable if two components A and B both fail. The probability that A fails is 0.01, and the probability that B fails is 0.005. However, the probability that B fails increases by a factor of 3 if A has failed.

(a) Calculate the probability that the computer becomes inoperable.
(b) Find the probability that A will fail if B has failed.

Ans. (a) 0.00015; (b) 0.03
6.59. A certain binary PCM system transmits the two binary states X = +1, X = −1 with equal probability. However, because of channel noise, the receiver makes recognition errors. Also, as a result of path distortion, the receiver may lose necessary signal strength to make any decision. Thus, there are three possible receiver states: Y = +1, Y = 0, and Y = −1, where Y = 0 corresponds to "loss of signal." Assume that P(Y = −1|X = +1) = 0.1, P(Y = +1|X = −1) = 0.2, and P(Y = 0|X = +1) = P(Y = 0|X = −1) = 0.05.

(a) Find the probabilities P(Y = +1), P(Y = −1), and P(Y = 0).
(b) Find the probabilities P(X = +1|Y = +1) and P(X = −1|Y = −1).

Ans. (a) P(Y = +1) = 0.525, P(Y = −1) = 0.425, P(Y = 0) = 0.05
     (b) P(X = +1|Y = +1) = 0.81, P(X = −1|Y = −1) = 0.88
6.60. Suppose 10 000 digits are transmitted over a noisy channel having per-digit error probability p = 5 × 10⁻⁵. Find the probability that there will be no more than two digit errors.

Ans. 0.9856
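The quoted answer can be reproduced with the Poisson approximation, since α = np = 0.5 is small (an added check):

```python
from math import exp

# Prob. 6.60: n = 10_000 digits, p = 5e-5, so alpha = n*p = 0.5.
alpha = 10_000 * 5e-5
p_no_more_than_2 = exp(-alpha) * (1 + alpha + alpha**2 / 2)  # P(X <= 2), Eq. (6.88)
print(f"{p_no_more_than_2:.4f}")   # 0.9856
```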
6.61. Show that Eq. (6.92) does in fact define a true probability density; in particular, show that

    [1/(√(2π) σ)] ∫_{−∞}^{∞} e^{−(x−μ)²/(2σ²)} dx = 1

Hint: Make a change of variable [y = (x − μ)/σ] and show that

    I = ∫_{−∞}^{∞} e^{−y²/2} dy = √(2π)

which can be proved by evaluating I² using polar coordinates.
6.62. A noisy resistor produces a voltage V(t). At t = t₁, the noise level X = V(t₁) is known to be a gaussian random variable with density

    f_X(x) = [1/(√(2π) σ)] e^{−x²/(2σ²)}

Compute the probability that |X| > kσ for k = 1, 2, 3.

Ans. P(|X| > σ) = 0.3173, P(|X| > 2σ) = 0.0455, P(|X| > 3σ) = 0.0027
6.63. Consider the transformation Y = 1/X.

(a) Find f_Y(y) in terms of f_X(x).
(b) If f_X(x) = (a/π)/(x² + a²), find f_Y(y).

Ans. (a) f_Y(y) = (1/y²) f_X(1/y)
     (b) f_Y(y) = [(1/a)/π]/[y² + (1/a)²]

Note that X and Y are known as Cauchy random variables.
6.64. Let X and Y be two independent random variables with

    f_X(x) = αe^{−αx} u(x)    f_Y(y) = βe^{−βy} u(y)

Find the density function of Z = X + Y.

Ans. f_Z(z) = [αβ/(β − α)][e^{−αz} − e^{−βz}] u(z) for α ≠ β; f_Z(z) = α²z e^{−αz} u(z) for α = β
eeu) 8164
6.065.
6.6.
667.
6.68.
6.09.
60,
on.
PROBABILITY AND RANDOM VARIABLES (CHAP. 6
6.65. Let X be a random variable uniformly distributed over [a, b]. Find the mean and the variance of X.

Ans. μ_X = (a + b)/2, σ_X² = (b − a)²/12
6.66. Let (X, Y) be a bivariate r.v. If X and Y are independent, show that X and Y are uncorrelated.

Hint: Use Eqs. (6.78) and (6.51).
6.67. Let (X, Y) be a bivariate r.v. with the joint pdf

    f_{XY}(x, y) = [(x² + y²)/(4π)] e^{−(x²+y²)/2},    −∞ < x, y < ∞

Show that X and Y are not independent but are uncorrelated.

Hint: Use Eqs. (6.50) and (6.59).
6.68. Given the random variable X with mean μ_X and variance σ_X², find the linear transformation Y = aX + b such that μ_Y = 0 and σ_Y² = 1.

Ans. a = 1/σ_X, b = −μ_X/σ_X
6.69. Define the random variables Z and W by

    Z = X + aY    W = X − aY

where a is a real number. Determine a such that Z and W are orthogonal.

Ans. a = ±√(E[X²]/E[Y²])
6.70. The moment generating function of X is defined by

    M_X(λ) = E[e^{λX}] = ∫_{−∞}^{∞} f_X(x) e^{λx} dx

where λ is a real variable. Then

    mₖ = E[Xᵏ] = M_X^{(k)}(0),    k = 1, 2, ...

where M_X^{(k)}(0) = dᵏM_X(λ)/dλᵏ evaluated at λ = 0.

(a) Find the moment generating function of X uniformly distributed over (a, b).
(b) Using the result of (a), find E[X], E[X²], and E[X³].

Ans. (a) M_X(λ) = (e^{λb} − e^{λa})/[λ(b − a)]
     (b) E[X] = (a + b)/2, E[X²] = (a² + ab + b²)/3, E[X³] = (a³ + a²b + ab² + b³)/4
6.71. Show that if X and Y are zero-mean jointly normal random variables, then

    E[X²Y²] = E[X²]E[Y²] + 2(E[XY])²

Hint: Use the moment generating function of X and Y given by

    M_{XY}(λ₁, λ₂) = E[e^{λ₁X + λ₂Y}] = exp[½(σ₁²λ₁² + 2ρσ₁σ₂λ₁λ₂ + σ₂²λ₂²)]