10. RANDOM VARIABLE AND ITS PROBABILITY DISTRIBUTION
10.1 Random variable
Very often we can attach a real number to each elementary event in a sample space.
For instance, in tossing a pair of fair coins, we may attach 2 to the appearance of 2 heads
(i.e. HH), 1 to the appearance of one head (i.e. HT or TH) and 0 to the appearance
of 2 tails (i.e. TT). Similarly, in throwing a fair die, we can attach the numbers 1, 2, ..., 6
to the appearance of 1 point, 2 points, ..., 6 points on the uppermost face. In this way, we
may define a function over the sample space.
A random variable is a real-valued function defined over a sample space. It takes a definite
set of values with definite probabilities.

Thus, the number of heads obtained in 3 tosses of a fair coin is a random variable that
assumes the values 0, 1, 2 and 3 with respective probabilities 1/8, 3/8, 3/8 and 1/8.
A random variable which can assume a finite or a countably infinite number of values is
called a discrete random variable.
The sum of points obtained in two throws of a fair die, or the number of tosses of a fair
coin before a head appears, is a discrete random variable.
Again, a random variable that can take an uncountably infinite number of values is called
a continuous random variable.
The decay time for a radioactive particle or the weight of a new-born baby is a continuous
random variable.
10.2 Probability distribution, expectation and variance of a random variable
A statement of all possible values of a random variable together with the corresponding
probabilities gives the probability distribution of the random variable. It may be
represented by a function, in a table or by a graph.

Illustration 10.1
If a fair coin is tossed twice, the number of heads obtained (X) will have the following
probability distribution :

Value of X :   0     1     2
P(X = x)   :  1/4   1/2   1/4
An important property of a random variable is its expectation. The expectation of a
random variable X, denoted by E(X) or mu_X, serves as a measure of central tendency of
the probability distribution of X. If X assumes the values x_1, x_2, ... with respective
probabilities p_1, p_2, ..., where sum of p_i = 1, then

E(X) = Sum_i x_i p_i, provided the sum is finite.
Note : If g(X) be a function of X, then E{g(X)} = Sum_i g(x_i) p_i.
For example, if X takes the values +2^k and -2^k (k = 1, 2, ...), each with probability
1/2^(k+1), then the sum Sum_i x_i p_i is not absolutely convergent (Sum_i |x_i| p_i diverges),
although the positive and negative terms formally cancel to give 0.
So, here E(X) does not exist.
Some important results :
(i) If X = C, a constant, then E(X) = C
(ii) If Y = CX, C being a constant, E(Y) = C.E(X)
(iii) If Z = X + Y, where X and Y are two random variables, defined on the same sample
space, then E(Z) = E(X) + E(Y).
(iv) If Y = a + bX, then E(Y) = a + bE(X)
(v) If Z = XY, where X and Y are independent random variables, defined on the same
sample space, then
E(Z) = E(X).E(Y).
Results (iii) & (v) are respectively known as sum law and product law of expectation.
The proofs of these results are given later in this chapter.
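As an illustrative addition to the text (not part of the original), the short Python sketch below checks the sum law and the product law by exact enumeration for two independent throws of a fair die; the variable names are ours.

```python
from itertools import product
from fractions import Fraction

# Two independent fair dice: each ordered pair (x, y) has probability 1/36.
outcomes = [(x, y, Fraction(1, 36)) for x, y in product(range(1, 7), repeat=2)]

E = lambda f: sum(p * f(x, y) for x, y, p in outcomes)   # expectation of f(X, Y)

EX, EY = E(lambda x, y: x), E(lambda x, y: y)
print(E(lambda x, y: x + y) == EX + EY)   # sum law: True
print(E(lambda x, y: x * y) == EX * EY)   # product law (independence): True
```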
Variance is another important characteristic of a random variable. It is a measure of
dispersion of the random variable. The variance of X, denoted by Var(X) or sigma^2_X, is given by

Var(X) = E[X - E(X)]^2.
Some important results :
(i) If X = C, a constant, then Var(X) = 0; and Var(X) = 0 implies that X = C with probability 1.
(ii) If Y = bX, then Var(Y) = b^2 Var(X).
(iii) If Y = a + bX, then Var(Y) = b^2 Var(X), so that sigma_Y = |b| sigma_X.
For the probability distribution in Illustration 10.1,
E(X) = 0.(1/4) + 1.(1/2) + 2.(1/4) = 1,
E(X^2) = 0^2.(1/4) + 1^2.(1/2) + 2^2.(1/4) = 3/2,
so that Var(X) = E(X^2) - E^2(X) = 3/2 - 1 = 1/2.
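These figures can be reproduced by direct enumeration; the brief Python sketch below is an illustrative addition, not part of the original text.

```python
from fractions import Fraction as F

# Distribution of Illustration 10.1: number of heads in two tosses of a fair coin.
dist = {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)}

mean = sum(x * p for x, p in dist.items())                 # E(X)
var = sum((x - mean) ** 2 * p for x, p in dist.items())    # E[X - E(X)]^2
print(mean, var)   # 1 1/2
```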
If a person gains or loses an amount equal to the number appearing when a balanced
die is rolled once, according to whether the number is even or odd, how much money
can he expect per game in the long run ?

Loss may be regarded as negative gain. So, if X denotes the gain of the person per
game, then X assumes the values -1, 2, -3, 4, -5 and 6, each with probability 1/6.

E(X) = (-1).(1/6) + 2.(1/6) + (-3).(1/6) + 4.(1/6) + (-5).(1/6) + 6.(1/6) = 3/6 = 1/2.

Thus, his expected gain is 1/2 unit of money per game in the long run.
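A one-line enumeration in Python (an illustrative addition to the text) confirms the expected gain of 1/2.

```python
from fractions import Fraction as F

# Gain per game: +x if the face x is even, -x if it is odd, each face with probability 1/6.
gain = {x if x % 2 == 0 else -x: F(1, 6) for x in range(1, 7)}
print(sum(v * p for v, p in gain.items()))   # 1/2
```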
Illustration 10.4
Three balls are drawn at random from a bag containing 5 white and 3 black balls. If
X denotes the number of white balls drawn, write down the probability distribution of
X, and find E(X) and Var(X).

Here P(X = x) = (5 C x)(3 C 3-x) / (8 C 3), for x = 0, 1, 2, 3.
So, the probability distribution of X is as follows :

Value of X :   0      1      2      3
P(X = x)   :  1/56  15/56  30/56  10/56
E(X) = 0.(1/56) + 1.(15/56) + 2.(30/56) + 3.(10/56) = 105/56 = 15/8,
E(X^2) = 0^2.(1/56) + 1^2.(15/56) + 2^2.(30/56) + 3^2.(10/56) = 225/56.
Therefore Var(X) = E(X^2) - E^2(X) = 225/56 - 225/64 = 225/448.
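The distribution and its moments can be checked with a short Python sketch (an illustrative addition); it rebuilds the same probabilities from binomial coefficients.

```python
from math import comb
from fractions import Fraction as F

# X = number of white balls when 3 balls are drawn from 5 white + 3 black.
dist = {x: F(comb(5, x) * comb(3, 3 - x), comb(8, 3)) for x in range(4)}

mean = sum(x * p for x, p in dist.items())
var = sum(x * x * p for x, p in dist.items()) - mean ** 2
print(mean, var)   # 15/8 225/448
```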
X denotes the number of heads and Y the number of tails obtained in three tosses of
a fair coin. If Z = X - Y, write down the probability distribution of Z and find Var(Z).
The possible outcomes from three tosses of the fair coin and the corresponding values
of the variables are as follows :
Outcome   Probability    X    Y    Z = X - Y
HHH          1/8         3    0        3
HHT          1/8         2    1        1
HTH          1/8         2    1        1
HTT          1/8         1    2       -1
THH          1/8         2    1        1
THT          1/8         1    2       -1
TTH          1/8         1    2       -1
TTT          1/8         0    3       -3

So Z assumes the values 3, 1, -1 and -3 with respective probabilities 1/8, 3/8, 3/8 and 1/8.

E(Z) = 3.(1/8) + 1.(3/8) + (-1).(3/8) + (-3).(1/8) = 0,
E(Z^2) = 9.(1/8) + 1.(3/8) + 1.(3/8) + 9.(1/8) = 24/8 = 3.
Therefore Var(Z) = E(Z^2) - E^2(Z) = 3 - 0 = 3.
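As an added check (not part of the original text), enumerating the 8 outcomes in Python reproduces the distribution of Z and Var(Z) = 3.

```python
from itertools import product
from fractions import Fraction as F

# Enumerate the 8 equally likely outcomes of three tosses; Z = (#heads) - (#tails).
dist = {}
for seq in product("HT", repeat=3):
    z = seq.count("H") - seq.count("T")
    dist[z] = dist.get(z, F(0)) + F(1, 8)

mean = sum(z * p for z, p in dist.items())
var = sum(z * z * p for z, p in dist.items()) - mean ** 2
print(dist)        # {3: 1/8, 1: 3/8, -1: 3/8, -3: 1/8}
print(mean, var)   # 0 3
```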
Sunil and Kapil play a game under the conditions that Sunil will get from Kapil m rupees
if he wins and will pay Kapil n rupees if he loses; the probability of Sunil's winning is p.
Denote Sunil's gain by z. Show that the variance of z (i) increases with m, and (ii) decreases
as p increases, for p > 1/2.

The random variable z assumes the two values m and (-n) with respective probabilities p
and 1 - p = q, say.

E(z) = mp - nq.
So, Var(z) = E(z^2) - E^2(z) = m^2 p + n^2 q - (mp - nq)^2
= m^2 p(1 - p) + n^2 q(1 - q) + 2mnpq
= m^2 pq + n^2 pq + 2mnpq = (m + n)^2 pq.

(i) d Var(z)/dm = 2(m + n)pq > 0.
Therefore Var(z) increases with m.
(ii) d Var(z)/dp = (m + n)^2 (q - p) = (m + n)^2 (1 - 2p) < 0, for p > 1/2.
Therefore Var(z) decreases as p increases, for p > q.
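A small numerical sketch in Python (an illustrative addition, with arbitrarily chosen m, n and p) confirms the formula Var(z) = (m + n)^2 pq and the two monotonicity claims.

```python
from fractions import Fraction as F

def var_gain(m, n, p):
    """Variance of a gain that is +m with probability p and -n with probability 1 - p."""
    q = 1 - p
    ez, ez2 = m * p - n * q, m * m * p + n * n * q
    return ez2 - ez ** 2

m, n, p = 3, 2, F(2, 3)
print(var_gain(m, n, p) == (m + n) ** 2 * p * (1 - p))     # True
# Increasing m raises the variance; raising p beyond 1/2 lowers it.
print(var_gain(m + 1, n, p) > var_gain(m, n, p))           # True
print(var_gain(m, n, p + F(1, 12)) < var_gain(m, n, p))    # True
```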
Note : (i) If X be a negative random variable, i.e. if P(X < 0) = 1, then E(X) < 0, but
Var(X) >= 0, as usual.
(ii) For any random variable X with mean mu_X and variance sigma^2_X, the variable
Y = (X - mu_X)/sigma_X is called the standardised variable of X. For a standardised variable,

mu_Y = E(Y) = E{(X - mu_X)/sigma_X} = {E(X) - mu_X}/sigma_X = 0,
and sigma^2_Y = Var(Y) = E[Y - E(Y)]^2 = E(Y^2) = E{(X - mu_X)^2}/sigma^2_X = sigma^2_X/sigma^2_X = 1.
10.3 Some exceedingly useful tools
10.3.1 Chebyshev's inequality :
Let X be a random variable with mean mu and variance sigma^2. Then for any t > 0,

P{|X - mu| <= t sigma} >= 1 - 1/t^2.

This shows, for instance, that the probability for X to lie between mu +/- 2 sigma exceeds 3/4,
and that for X to lie between mu +/- 3 sigma exceeds 8/9, and so on. This inequality is taken as the basis
for using 3 sigma-limits in the construction of control charts, particularly when the concerned
statistic is non-normal.

It is pointed out that a small variance indicates that large deviations from the mean
are improbable. This statement is made more precise by Chebyshev's inequality.
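The Python sketch below (an illustrative addition; the exponential distribution, sample size and seed are our choices) checks the inequality empirically for a markedly non-normal variable with mu = sigma = 1.

```python
import random

# Empirical check of P{|X - mu| <= t*sigma} >= 1 - 1/t^2 for a skewed variable.
random.seed(1)
sample = [random.expovariate(1.0) for _ in range(100_000)]   # exponential: mu = sigma = 1
mu, sigma = 1.0, 1.0
for t in (2, 3):
    frac = sum(abs(x - mu) <= t * sigma for x in sample) / len(sample)
    print(t, round(frac, 3), ">=", 1 - 1 / t ** 2)
```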
10.3.2 Weak law of large numbers :
Let X_1, X_2, X_3, ... be a sequence of random variables with expectations mu_1, mu_2, mu_3, ....
Also let V_n = Var(X_1 + X_2 + ... + X_n). If V_n/n^2 -> 0 as n -> infinity, then, given any two positive
quantities epsilon and delta, however small, we can find an n_0, depending on epsilon and delta, such that

P{ |(X_1 + X_2 + ... + X_n)/n - (mu_1 + mu_2 + ... + mu_n)/n| < epsilon } > 1 - delta

for all n > n_0.

In short, the probability that an average (X_1 + X_2 + ... + X_n)/n will differ from its
expectation by less than an arbitrarily prescribed epsilon tends to 1, provided
Var(X_1 + X_2 + ... + X_n)/n^2 -> 0 as n -> infinity.
We now consider two important corollaries of the above.
Corollary 1
Let x_1, x_2, ... be independent random sample observations from a population with mean
mu and variance sigma^2. Then x_1, x_2, ... are independently distributed with identical marginal
distributions, their common expectation being mu and common variance sigma^2. Here
(x_1 + x_2 + ... + x_n)/n = x-bar is the mean of a random sample of size n.

Now, V_n = Var(x_1 + x_2 + ... + x_n) = n sigma^2, so that V_n/n^2 = sigma^2/n -> 0 as n -> infinity, provided
sigma^2 is finite. Hence, given epsilon > 0 and delta > 0, however small, we can find an n_0, depending
on epsilon and delta, such that

P{ |x-bar - mu| < epsilon } > 1 - delta

for all n >= n_0, if sigma^2 be finite.
Corollary 2
Consider a series of independent repetitions of an experiment for which the probability of an
event E is p. Let x_i be a random variable associated with the i-th repetition such that

x_i = 1 if E occurs in the i-th repetition,
    = 0 otherwise.

Then x_1, x_2, ... are independent and identically distributed random variables with

E(x_i) = 1.p + 0.(1 - p) = p,
and Var(x_i) = (1 - p)^2 . p + (0 - p)^2 . (1 - p) = p(1 - p).

Here (x_1 + x_2 + ... + x_n)/n = f_n/n, say, is the relative frequency of the event E in n
repetitions of the experiment.

Since p(1 - p) is necessarily finite (actually, 0 <= p(1 - p) <= 1/4), we get from the
previous result that, given epsilon > 0 and delta > 0, however small, we can find an n_0, depending
on epsilon and delta, such that

P{ |f_n/n - p| < epsilon } > 1 - delta

for all n > n_0.

This particular form of the weak law of large numbers is called Bernoulli's theorem
after James Bernoulli who, however, arrived at the result in a different way.
Bernoulli's theorem justifies and makes more precise the statement made in section
9.9 that probability of an event is its relative frequency in the long run.
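An illustrative simulation in Python (added here, with an arbitrary p = 0.3 and seed) shows the relative frequency settling near p as n grows, in the spirit of Bernoulli's theorem.

```python
import random

# Relative frequency of an event with probability p in n independent repetitions.
random.seed(1)
p = 0.3
for n in (100, 10_000, 1_000_000):
    freq = sum(random.random() < p for _ in range(n)) / n
    print(n, round(freq, 4))   # settles near 0.3 as n grows
```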
10.4 Bivariate probability distribution
Suppose we are studying two random variables X and Y simultaneously. A statement of
the possible pairs of values of X and Y together with the corresponding probabilities gives
the bivariate probability distribution of X and Y. It may be shown in a table like TABLE 10.1.

TABLE 10.1
BIVARIATE PROBABILITY DISTRIBUTION

X \ Y            y_1    y_2    ...   y_j    ...   Marginal total
x_1              p_11   p_12   ...   p_1j   ...   p_10
x_2              p_21   p_22   ...   p_2j   ...   p_20
...
x_i              p_i1   p_i2   ...   p_ij   ...   p_i0
...
Marginal total   p_01   p_02   ...   p_0j   ...   1
Here p_i0 = Sum_j p_ij = P(X = x_i) and p_0j = Sum_i p_ij = P(Y = y_j).

So, the first and last columns in TABLE 10.1 give the probability distribution of X,
which is called the marginal distribution of X in the present context. Similarly, first and
last rows give the marginal distribution of Y.
Again, the probabilities in different cells in a column, divided by the total of that
column, give the conditional distribution of X, given the value of Y as stated in the
column. Similarly, the cell probabilities in any row, when divided by the total of that
row, give the conditional distribution of Y, given the value of X as stated in that row.
The random variables X and Y are said to be independent if

P(X = x_i, Y = y_j) = P(X = x_i).P(Y = y_j), i.e. p_ij = p_i0 . p_0j, for all i, j;

otherwise, the variables are dependent and there is some association between them. As
a measure of linear association, we have the correlation coefficient
rho_XY = Cov(X, Y) / sqrt{Var(X).Var(Y)},

where Cov(X, Y) = E[{X - E(X)}{Y - E(Y)}] is the covariance of X and Y. Now,

E[{X - E(X)}{Y - E(Y)}] = E[XY - X.E(Y) - Y.E(X) + E(X).E(Y)] = E(XY) - E(X).E(Y).
We now prove the sum law and the product law of expectation.

Theorem 10.1 (Sum law of expectation)
If X and Y be two jointly distributed random variables, then
E(X + Y) = E(X) + E(Y).

Proof : Let X assume the values x_1, x_2, ..., x_k and Y the values y_1, y_2, ..., y_l. Further, let
P(X = x_i, Y = y_j) = p_ij, i = 1(1)k, j = 1(1)l,
so that
p_i0 = Sum_j p_ij = P(X = x_i) and p_0j = Sum_i p_ij = P(Y = y_j).

The sum X + Y is a random variable that assumes kl values x_i + y_j with respective
probabilities p_ij. So, by the definition of expectation,

E(X + Y) = Sum_i Sum_j (x_i + y_j) p_ij = Sum_i Sum_j x_i p_ij + Sum_i Sum_j y_j p_ij
         = Sum_i x_i (Sum_j p_ij) + Sum_j y_j (Sum_i p_ij) = Sum_i x_i p_i0 + Sum_j y_j p_0j
         = E(X) + E(Y).
This result can easily be extended to the case of more than two random variables.
Theorem 10.2 (Product law of expectation)
If X and Y are independent random variables, then
E(XY) = E(X).E(Y).
Proof : Suppose X assumes the values x_1, x_2, ..., x_k and Y assumes the values y_1, y_2, ...,
y_l. Further, suppose that
P(X = x_i, Y = y_j) = p_ij, i = 1(1)k, j = 1(1)l.
Since X and Y are independent,
p_ij = p_i0 . p_0j for all i, j.

Now, the product XY is a random variable that assumes kl values x_i y_j with respective
probabilities p_ij. So, by definition of expectation,

E(XY) = Sum_i Sum_j x_i y_j p_ij = Sum_i Sum_j x_i y_j p_i0 p_0j = (Sum_i x_i p_i0)(Sum_j y_j p_0j) = E(X).E(Y).

Note : If variables X and Y are independent, then rho_XY = 0, but the converse is not
necessarily true. If, however, each of X and Y assumes only two distinct values,
the converse is true.
An important result :
For k random variables X_1, X_2, ..., X_k, defined on the same sample space,

Var(X_1 + X_2 + ... + X_k) = Sum_{i=1}^{k} Var(X_i) + 2 Sum_{i<j} Cov(X_i, X_j).

Since
(X_1 + X_2 + ... + X_k) - E(X_1 + X_2 + ... + X_k) = Sum_{i=1}^{k} {X_i - E(X_i)},

Var(X_1 + X_2 + ... + X_k) = E[ Sum_{i=1}^{k} {X_i - E(X_i)} ]^2
= E[ Sum_i {X_i - E(X_i)}^2 + 2 Sum_{i<j} {X_i - E(X_i)}{X_j - E(X_j)} ]
= Sum_i E{X_i - E(X_i)}^2 + 2 Sum_{i<j} E[{X_i - E(X_i)}{X_j - E(X_j)}]
= Sum_i Var(X_i) + 2 Sum_{i<j} Cov(X_i, X_j).
In particular,
Var(X_1 + X_2) = Var(X_1) + Var(X_2) + 2 Cov(X_1, X_2).

We require
E(X + Y) = E(X) + E(Y) = 7,
and E(XY) = E(X).E(Y), since X and Y are independent.
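Returning to the variance-of-a-sum result above, the two-variable identity Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) can be checked on any small joint distribution; the Python sketch below (an illustrative addition, with a made-up dependent joint table) does so with exact fractions.

```python
from fractions import Fraction as F

# A small joint distribution with dependence: keys are (x, y), values are probabilities.
joint = {(0, 0): F(1, 4), (0, 1): F(1, 4), (1, 0): F(1, 8), (1, 1): F(3, 8)}

E = lambda f: sum(p * f(x, y) for (x, y), p in joint.items())
var = lambda f: E(lambda x, y: f(x, y) ** 2) - E(f) ** 2

cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
print(var(lambda x, y: x + y) == var(lambda x, y: x) + var(lambda x, y: y) + 2 * cov)  # True
```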
Let X denote the number obtained from the throw of a die. Let Y be another variable
such that

Y = +1 when X is odd,
  = -1 when X is even.

From the joint distribution of X and Y, calculate Var(Z = XY). Are X and Y independent?
X assumes the values 1, 2, ..., 6, each with probability 1/6, and Y assumes the two values
+1 and -1, each with probability 1/2. The joint distribution of X and Y is

Y \ X :    1     2     3     4     5     6   | Total
 -1   :    0    1/6    0    1/6    0    1/6  |  1/2
 +1   :   1/6    0    1/6    0    1/6    0   |  1/2
Total :   1/6   1/6   1/6   1/6   1/6   1/6  |   1
E(Z) = E(XY) = 1.1.(1/6) + (-1).2.(1/6) + 1.3.(1/6) + (-1).4.(1/6) + 1.5.(1/6) + (-1).6.(1/6)
     = (1 - 2 + 3 - 4 + 5 - 6)/6 = -1/2.

E(Z^2) = E(X^2 Y^2) = 1^2.(1/6) + 2^2.(1/6) + 3^2.(1/6) + 4^2.(1/6) + 5^2.(1/6) + 6^2.(1/6) = 91/6.

Therefore Var(Z) = E(Z^2) - E^2(Z) = 91/6 - 1/4 = 179/12.
X and Y are independent when
P(X = i, Y = j) = P(X = i).P(Y = j), for all i, j.
Here P(X = 1, Y = -1) = 0, which is not equal to P(X = 1).P(Y = -1) = 1/12.
So, X and Y are not independent.
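The same conclusions can be reproduced by enumeration; the short Python sketch below is an illustrative addition, not part of the original text.

```python
from fractions import Fraction as F

# X is the face of a fair die, Y = +1 if X is odd and -1 if X is even, Z = XY.
joint = {(x, 1 if x % 2 else -1): F(1, 6) for x in range(1, 7)}

ez = sum(p * x * y for (x, y), p in joint.items())
ez2 = sum(p * (x * y) ** 2 for (x, y), p in joint.items())
print(ez, ez2 - ez ** 2)                                 # -1/2 179/12

# P(X = 1, Y = -1) = 0, but P(X = 1) * P(Y = -1) = 1/12: not independent.
print(joint.get((1, -1), F(0)) == F(1, 6) * F(1, 2))     # False
```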
X and Y are two random variables having the joint probability distribution as given below :

X \ Y :   0     1     2
  0   :  0.1   0.1   0.1
  1   :  0.1   0.2   0.1
  2   :  0.1   0.1   0.1

Write down the marginal distribution of X and the conditional distribution of Y, given
X = 1. Also find mu_X, mu_Y, sigma_X, sigma_Y, rho_XY, P(X = 2 | Y = 0), P(X + Y >= 3) and P(X > Y). Are
X and Y independent?
The joint distribution is as follows :

X \ Y            0     1     2   | Marginal Total
  0             0.1   0.1   0.1  |     0.3
  1             0.1   0.2   0.1  |     0.4
  2             0.1   0.1   0.1  |     0.3
Marginal Total  0.3   0.4   0.3  |      1

Marginal distribution of X

Value of X (x)   P(X = x)
      0            0.3
      1            0.4
      2            0.3
    Total           1

Conditional distribution of Y, given X = 1

Value of Y (y)   P(Y = y | X = 1)
      0               1/4
      1               1/2
      2               1/4
    Total              1
E(X) = 0 x 0.3 + 1 x 0.4 + 2 x 0.3 = 1, so mu_X = 1.
E(X^2) = 0^2 x 0.3 + 1^2 x 0.4 + 2^2 x 0.3 = 1.6.
Therefore sigma^2_X = E(X^2) - E^2(X) = 1.6 - 1^2 = 0.6, so sigma_X = sqrt(0.6) = 0.77.

Since X and Y have the same marginal distribution,
mu_Y = mu_X = 1 and sigma_Y = sigma_X = 0.77.

Again, E(XY) = 1 x 1 x 0.2 + 1 x 2 x 0.1 + 2 x 1 x 0.1 + 2 x 2 x 0.1 (other terms vanish)
= 0.2 + 0.2 + 0.2 + 0.4 = 1.

Cov(X, Y) = E(XY) - E(X).E(Y) = 1 - 1 x 1 = 0.
So, rho_XY = 0.

P(X = 2 | Y = 0) = P(X = 2, Y = 0)/P(Y = 0) = 0.1/0.3 = 1/3.

P(X + Y >= 3) = P(X = 1, Y = 2) + P(X = 2, Y = 1) + P(X = 2, Y = 2)
= 0.1 + 0.1 + 0.1 = 0.3.

P(X > Y) = P(X = 1, Y = 0) + P(X = 2, Y = 0) + P(X = 2, Y = 1)
= 0.1 + 0.1 + 0.1 = 0.3.

X and Y are independent if
P(X = i, Y = j) = P(X = i).P(Y = j), for all i, j.
Here,
P(X = 0, Y = 0) = 0.1, which is not equal to P(X = 0).P(Y = 0) = 0.3 x 0.3 = 0.09.
So, X and Y are not independent.
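For readers who wish to verify the arithmetic, the following Python sketch (an illustrative addition) recomputes the marginals, moments and conditional distribution from the joint table.

```python
from fractions import Fraction as F

# Joint distribution of the illustration: rows are x, columns are y.
joint = {(x, y): F(1, 10) for x in range(3) for y in range(3)}
joint[(1, 1)] = F(2, 10)

px = {x: sum(p for (i, _), p in joint.items() if i == x) for x in range(3)}
E = lambda f: sum(p * f(x, y) for (x, y), p in joint.items())

mean_x = E(lambda x, y: x)
var_x = E(lambda x, y: x * x) - mean_x ** 2
cov = E(lambda x, y: x * y) - mean_x * E(lambda x, y: y)
print(px)                   # {0: 3/10, 1: 2/5, 2: 3/10}
print(mean_x, var_x, cov)   # 1 3/5 0
# Conditional distribution of Y given X = 1:
print({y: joint[(1, y)] / px[1] for y in range(3)})   # {0: 1/4, 1: 1/2, 2: 1/4}
```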
Prove that two uncorrelated random variables are independent, if each of the variables
assumes only two distinct values.

Let X assume the two values x_1, x_2 and Y the two values y_1, y_2. Let us define two
new variables U and V as

U = (X - x_1)/(x_2 - x_1) and V = (Y - y_1)/(y_2 - y_1).

So, each of the variables U and V assumes only the two values 0 and 1. Let the joint
probability distribution of U and V be as follows :

U \ V :   0     1    | Total
  0   :  p_11  p_12  | p_10
  1   :  p_21  p_22  | p_20
Total :  p_01  p_02  |  1

Since x_2 - x_1 and y_2 - y_1 are of the same sign,
rho_UV = rho_XY = 0.

Now, rho_UV = 0 implies Cov(U, V) = 0, i.e. E(UV) = E(U).E(V).
Here E(UV) = 0.0.p_11 + 0.1.p_12 + 1.0.p_21 + 1.1.p_22 = p_22,
E(U) = 0.p_10 + 1.p_20 = p_20 and E(V) = 0.p_01 + 1.p_02 = p_02.
Therefore p_22 = p_20 . p_02 ... (i)

Again, p_12 = p_02 - p_22 = p_02 - p_20 . p_02 = p_02(1 - p_20) = p_10 . p_02, using (i) ... (ii)
p_21 = p_20 - p_22 = p_20 - p_20 . p_02 = p_20(1 - p_02) = p_20 . p_01, using (i) ... (iii)
and p_11 = p_01 - p_21 = p_01 - p_20 . p_01 = p_01(1 - p_20) = p_10 . p_01, using (iii) ... (iv)

From (i) to (iv), we find that p_ij = p_i0 . p_0j for all i, j.
So, the random variables U and V, and hence X and Y, are independent.

(v) Cov(X + 3, Y + 5),
(vi) Cov(3X + 7, 2Y + 9).
(ii) We find Cov(X, Y) = rho_XY sqrt{Var(X).Var(Y)}
= (0.6)(6)(5) = 18.
Or, E(XY) - E(X).E(Y) = 18
Or, E(XY) - 8 x 4 = 18
Or, E(XY) = 18 + 32 = 50.

(iii) E(X^2) = Var(X) + E^2(X) = 36 + 8^2 = 100.

(iv) Var(3X - 2Y) = 9Var(X) + 4Var(Y) - 12Cov(X, Y)
= 9 x 36 + 4 x 25 - 12 x 18
= 324 + 100 - 216 = 208.

(v) Cov(X + 3, Y + 5) = Cov(X, Y) = 18.

(vi) Cov(3X + 7, 2Y + 9) = 3 x 2 x Cov(X, Y) = 6 x 18 = 108.
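As a quick arithmetic check (an illustrative addition; the given values E(X) = 8, E(Y) = 4, Var(X) = 36, Var(Y) = 25 and rho_XY = 0.6 are read off from the worked steps above, since the full problem statement is cut off in this copy):

```python
EX, EY, VX, VY, rho = 8, 4, 36, 25, 0.6

cov = rho * (VX ** 0.5) * (VY ** 0.5)
print(cov)                          # 18.0
print(cov + EX * EY)                # E(XY) = 50.0
print(VX + EX ** 2)                 # E(X^2) = 100
print(9 * VX + 4 * VY - 12 * cov)   # Var(3X - 2Y) = 208.0
print(3 * 2 * cov)                  # Cov(3X + 7, 2Y + 9) = 108.0
```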
EXERCISES
1. Examine each of the following statements and write whether it is True or False :
(i) Values of a random variable are always positive real numbers.
(ii) For a negative random variable X, Var(X) must be positive.
(iii) If E(X) is negative, X must be a negative random variable.
(iv) If E(XY) = E(X).E(Y), then X and Y are independent.
(v) If X and Y are independent, then E(XY) = E(X).E(Y).
(vi) If X and Y are independent random variables, then Var(X + Y) = Var(X) + Var(Y).
(vii) If correlation coefficient of X and Y is zero, then Var(X + Y) = Var(X - Y).
(iv) If X and Y are independent, then Cov(2X, ...) = .... 
(v) If E(X) = 5, E{X(X - 1)} = 44, then Var(1 - 2X) = .... 
(vi) If a random variable X assumes only two values -2 and 1 such that
2.P(X = -2) = P(X = 1), then E(X) = .... 
(vii) If E(X) = E(Y) = E(XY) = 1, then rho_XY = .... 
(viii) When X and Y are independent with Var(X) = 6 and Var(Y) = 10, then .... 
B. Choose the correct option :
(i) The expected value of the sum of points obtained in two throws of a fair die is
(a) 6, (b) 7, (c) 8, (d) none of these.
(ii) If X denotes the number of heads obtained in 3 tosses of a fair coin, then E(X)
equals
(a) ..., (b) ..., (c) ..., (d) ... .
(iii) If X be a negative random variable with variance k, then we must have
(a) k < 0, (b) k > 0, (c) k >= 0, (d) none of these.
(iv) A random variable Y assumes the values 0 and 1 with respective probabilities
p and q (= 1 - p). Then Var(Y) is
(a) 1, (b) ..., (c) q, (d) none of these.
(v) A random variable Z takes the value 2 with probability 1. Then Var(Z) equals
(a) 0, (b) 1, (c) 2, (d) none of these.
(vi) If random variables X and Y are independent and E(X) = 0, E(Y) = 5, Var(X)
= 10, then Cov(X, X - Y) is
(a) 0, (b) ..., (c) ..., (d) none of these.
(vii) E(XY) = E(X).E(Y) implies that the random variables X and Y are
(a) independent, (b) uncorrelated, (c) linearly related, (d) none of these.

3. (a) Define a random variable, its mean and its variance, with an example.
(b) For a random variable X, show that [E(X)]^2 <= E(X^2). Hence show that mean
deviation about mean cannot exceed standard deviation.

4. What do you mean by a discrete random variable and its probability distribution?

5. An experiment consists of three independent tosses of a fair coin. Write down the
sample space. If X denotes the number of heads obtained, obtain the probability
distribution of X, and calculate its expectation and variance.
Find the expected gain of Hari, if he gains Rs. 16 from Ram for getting at least one
head and loses Rs. 40 to Ram otherwise, when he tosses one unbiased coin thrice.

6. If X and Y are both negative random variables, mutually independent, E(XY) = ...,
and |E(X)| = 2, find E(Y).
7. An unbiased die is thrown twice. Write down the sample space of this experiment.
If X denotes the sum of points obtained in the two throws, obtain the probability
distribution of X.
8. A bag contains 5 white and 3 black balls. 3 balls are drawn randomly without
replacement. If X is a random variable, which takes value 1, if at least 2 white balls
are drawn, and value 0, otherwise, find E(X).
9. A perfect coin is tossed 3 times in succession. Given X = 1 if first toss gives head,
X =0 if first toss gives tail, and Y = number of heads obtained in 3 tosses, construct
the joint distribution of X and Y, and find correlation coefficient between them.
10. If a person gets Rs. 2x + 5, where x denotes the number appearing when a balanced
die is rolled once, how much money can he expect in the long run per game ?
11. If x, y and z are three random variables such that x and y are independent and z = xy:
x can take two values 10 and 20, and the probability that x is 10 is ...; y can take three
values 5, 6 and 7, the probability that y is 5 is ... and that it is 6 is ... . Find the
expectation of z.
12. Two random variables X and Y are jointly distributed such that
P(Y = i) = (2i - 1)/4, for i = 1, 2; P(X = i) = 1/2, for i = 0, 1; and P(X = 0, Y = 2) = p.
Find out the correlation coefficient between X and Y, and find the interval in which
possible values of p lie. For what value of p is the correlation coefficient zero? Are the
variables then independent?
13. If two random variables X and Y are such that E(X) + E(Y) = 0, Var(X) - Var(Y) =
0 and 1 + rho_XY = 0, what is the relationship between X and Y?
14. If X and Y are independent random variables such that expectations of X and Y are
mu_1 and mu_2 respectively, variances of X, Y and XY are sigma_1^2, sigma_2^2 and sigma_12^2 respectively,
and the correlation coefficient between X and XY is rho_1 and that between Y and XY is rho_2,
then show that
(i) sigma_12^2 = sigma_1^2 sigma_2^2 + mu_2^2 sigma_1^2 + mu_1^2 sigma_2^2, (ii) ..., (iii) rho_1 : rho_2 = mu_2 sigma_1 : mu_1 sigma_2.
15. For two random variables X and Y,
E(X) = 8, E(Y) = 6, Var(X) = 16, Var(Y) = 36 and rho_XY = 0.5.
Find (i) E(XY),
(ii) Cov(X, X + Y),
(iii) Var(2X - 5Y),
(iv) Correlation coefficient between (2X + 3Y) and (2X - 3Y).
16. Prove that two uncorrelated random variables are independent, if each of the variables
assumes only two distinct values.
17. If X and Y are independent random variables with E(X) = E(Y) = 0, then show that
V(XY) = V(X).V(Y).
18. What is the expectation of the number of failures preceding the first success in an infinite
series of independent trials with constant probability p of success in each trial ?
19. From a lot of 12 lamps which include 4 defectives, a sample of 2 lamps is drawn at
random. Obtain the probability distribution of the number of defective lamps.
20. Two cards are drawn, without replacement, from a well-shuffled pack of 52 cards.
Find E(X) and Var(X), where X is the number of black cards.
21. If P(X = i) = 1/n and P(Y = i) = 1/n (i = 1, 2, ..., n), and X and Y are two mutually
independent random variables, prove that
E(XY) = {(n + 1)/2}^2.
22. Let X be a discrete random variable assuming values 1, 2, 3 ... and suppose E(X)
exists. Show that
E(X) = Sum_{i>=1} P(X >= i).
23. A fair coin is tossed until a head appears. Find the expectation of the number of
tosses required.
24. A man draws 2 balls at random from an urn containing 4 white and 6 black balls. If
he is to receive Rs. 5 for each white ball and Rs. 4 for each black ball, then find his
expected gain.