Exercise 2
a) Calculate H(X).

[Channel diagram: P(X=0) = 0.2, P(X=1) = 0.8; transitions P(Y=0|X=0) = 0.7, P(Y=1|X=0) = 0.3, P(Y=0|X=1) = 0.1, P(Y=1|X=1) = 0.9.]
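For part (a) only the source distribution matters: H(X) = -0.2 log2 0.2 - 0.8 log2 0.8 ≈ 0.722 bits. A minimal numeric check (a Python sketch; the helper name H is ours):

    from math import log2

    def H(dist):
        # Shannon entropy in bits; zero-probability terms contribute 0
        return -sum(p * log2(p) for p in dist if p > 0)

    print(H([0.2, 0.8]))   # H(X) ~= 0.7219 bits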
7. Example of joint entropy. Let p(x,y) be given by
           Y = 0   Y = 1
    X = 0   1/3     1/3
    X = 1    0      1/3
Find
(a) H(X), H(Y).
(b) H(X|Y), H(Y|X).
(c) H(X,Y).
(d) H(Y) - H(Y|X).
(e) I(X;Y).
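All five quantities follow from the joint table, using H(X|Y) = H(X,Y) - H(Y) and I(X;Y) = H(X) + H(Y) - H(X,Y); note that (d) and (e) are the same number by definition. A numeric sketch (Python; helper names are ours):

    from math import log2

    def H(dist):
        # Shannon entropy in bits; zero-probability terms contribute 0
        return -sum(p * log2(p) for p in dist if p > 0)

    pxy = [[1/3, 1/3],
           [0.0, 1/3]]                     # joint p(x, y), rows indexed by x
    px = [sum(row) for row in pxy]         # marginal of X: (2/3, 1/3)
    py = [sum(col) for col in zip(*pxy)]   # marginal of Y: (1/3, 2/3)

    HX, HY = H(px), H(py)
    HXY = H([p for row in pxy for p in row])
    print(HX, HY)                # (a) both ~0.9183 bits
    print(HXY - HY, HXY - HX)    # (b) H(X|Y) = H(Y|X) ~= 0.6667 bits
    print(HXY)                   # (c) log2(3) ~= 1.5850 bits
    print(HX + HY - HXY)         # (d), (e) I(X;Y) ~= 0.2516 bits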
8. In a female population X, consisting of 1/4 blondes, 1/2 brunettes, and 1/4 redheads, blondes
are always on time for engagements, redheads are always late, and each brunette always flips
an unbiased coin for each engagement to decide whether to be prompt or tardy.
(a) How much information is given by the statement “x, a member of X, arrived on time”
about each of the following propositions:
(1) x is a blonde,
(2) x is a brunette,
(3) x is a redhead.
(b) How much information is given by the statement “x, a member of X, arrived on time for
three engagements in a row” about the proposition “x is a brunette”?
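One standard reading of "information given by statement a about proposition b" is the pointwise mutual information i(a; b) = log2 [P(b|a) / P(b)] = log2 [P(a|b) / P(a)]. A sketch under that reading (Python; the names are hypothetical):

    from math import log2

    prior = {"blonde": 1/4, "brunette": 1/2, "redhead": 1/4}
    on_time = {"blonde": 1.0, "brunette": 0.5, "redhead": 0.0}   # P(on time | color)

    def info(p_evidence_given_prop, p_evidence):
        # pointwise mutual information log2 P(e|b) / P(e), in bits
        return log2(p_evidence_given_prop / p_evidence)

    p_e = sum(prior[c] * on_time[c] for c in prior)        # P(on time) = 1/2
    print(info(on_time["blonde"], p_e))     # (1) +1 bit
    print(info(on_time["brunette"], p_e))   # (2) 0 bits
    # (3) P(on time | redhead) = 0, so the information is -infinity

    p_e3 = sum(prior[c] * on_time[c] ** 3 for c in prior)  # P(3 on time) = 5/16
    print(info(on_time["brunette"] ** 3, p_e3))            # (b) log2(2/5) ~= -1.32 bits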
9. A source X produces letters from a three-symbol alphabet with the probability assignment
PX(0)=1/4, PX(1)=1/4, PX(2)=1/2. Each source letter x is directly transmitted through two
channels simultaneously with outputs y and z and the transition probabilities indicated below:
[Channel diagrams: P(y|x): x = 0 → y = 0 and x = 1 → y = 1 with probability 1; x = 2 → y = 0 or y = 1 with probability 1/2 each. P(z|x): x = 0 → z = 0 and x = 1 → z = 0 with probability 1; x = 2 → z = 1 with probability 1.]
(Note that this could be considered as a single channel with output yz).
Calculate H(X), H(Y), H(Z), H(YZ), I(X;Y), I(X;Z), I(X;YZ). Interpret the mutual information
expressions.
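A numeric sketch of all seven quantities, assuming the transition probabilities read off the diagrams above (Python; helper names are ours). Under these transitions the interpretation is visible in the numbers: z alone resolves whether x = 2, y supplies the remaining half bit, and together they determine x, so I(X;YZ) = H(X):

    from math import log2
    from itertools import product

    def H(dist):
        # Shannon entropy in bits; zero-probability terms contribute 0
        return -sum(p * log2(p) for p in dist if p > 0)

    px = {0: 1/4, 1: 1/4, 2: 1/2}
    py_x = {0: {0: 1.0}, 1: {1: 1.0}, 2: {0: 0.5, 1: 0.5}}   # P(y|x)
    pz_x = {0: {0: 1.0}, 1: {0: 1.0}, 2: {1: 1.0}}           # P(z|x)

    # joint p(x, y, z); the two channels act independently given x
    pxyz = {(x, y, z): px[x] * py_x[x].get(y, 0) * pz_x[x].get(z, 0)
            for x, y, z in product(px, (0, 1), (0, 1))}

    def marg(keep):
        # marginal over the coordinates listed in keep (0 = x, 1 = y, 2 = z)
        m = {}
        for xyz, p in pxyz.items():
            k = tuple(xyz[i] for i in keep)
            m[k] = m.get(k, 0.0) + p
        return list(m.values())

    HX, HY, HZ = H(marg([0])), H(marg([1])), H(marg([2]))
    HYZ, HXY, HXZ = H(marg([1, 2])), H(marg([0, 1])), H(marg([0, 2]))
    HXYZ = H(list(pxyz.values()))
    print(HX, HY, HZ, HYZ)    # 1.5, 1.0, 1.0, 2.0 bits
    print(HX + HY - HXY)      # I(X;Y)  = 0.5 bit
    print(HX + HZ - HXZ)      # I(X;Z)  = 1.0 bit
    print(HX + HYZ - HXYZ)    # I(X;YZ) = 1.5 bits = H(X)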
10. Find I(X;Y) for the following channel:
[Channel diagram: P(x1) = p, P(x2) = 1 - p; transitions P(y1|x1) = 1 - a, P(y2|x1) = a, P(y1|x2) = b, P(y2|x2) = 1 - b.]
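With Hb(·) the binary entropy function, I(X;Y) = H(Y) - H(Y|X) = Hb(p(1-a) + (1-p)b) - p·Hb(a) - (1-p)·Hb(b). A sketch that evaluates this (Python; the sample values for p, a, b are arbitrary):

    from math import log2

    def Hb(q):
        # binary entropy function, in bits
        return 0.0 if q in (0.0, 1.0) else -q * log2(q) - (1 - q) * log2(1 - q)

    def I(p, a, b):
        # I(X;Y) = H(Y) - H(Y|X) for the channel above
        py1 = p * (1 - a) + (1 - p) * b          # P(Y = y1)
        return Hb(py1) - (p * Hb(a) + (1 - p) * Hb(b))

    print(I(0.5, 0.0, 0.0))   # noiseless channel: 1 bit
    print(I(0.5, 0.1, 0.2))   # arbitrary example values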