Information Theory and Coding - Unit 2
Detailed Notes
1. Introduction to Discrete Information Channels
A discrete channel is a communication system that carries symbols from a finite alphabet from
sender to receiver. It is modeled by the transition probabilities P(Y = y | X = x), where X is the
input symbol and Y is the output symbol. Example: the Binary Symmetric Channel (BSC), which flips
each transmitted bit with probability p.
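A minimal simulation sketch in Python (the helper name bsc and the values p = 0.1 and 10,000 bits are assumptions for illustration, not from the notes):

```python
import numpy as np

# Simulate a Binary Symmetric Channel: each input bit is flipped
# independently with probability p (illustrative sketch, not a library API).
rng = np.random.default_rng(0)

def bsc(bits, p):
    """Pass a 0/1 integer array through a BSC with flip probability p."""
    flips = rng.random(bits.shape) < p   # True where the channel flips
    return bits ^ flips                  # XOR applies the flips

x = rng.integers(0, 2, size=10_000)      # random input bits
y = bsc(x, p=0.1)
print("empirical flip rate:", np.mean(x != y))  # close to 0.1
```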
2. Equivocation and Mutual Information
Equivocation H(X|Y) is the uncertainty that remains about the input X after the output Y is
observed; the related quantity H(Y|X) is the noise entropy of the channel. Mutual information
I(X;Y) = H(Y) - H(Y|X) = H(X) - H(X|Y) measures the information shared between input and
output. High mutual information implies low noise.
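A worked check of the formula for a BSC with uniform input (the flip probability p = 0.1 is an assumed example value): here H(Y) = 1 bit and H(Y|X) equals the binary entropy of p.

```python
import numpy as np

def h2(p):
    """Binary entropy function in bits; h2(0) = h2(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

p = 0.1
I = 1.0 - h2(p)  # I(X;Y) = H(Y) - H(Y|X) = 1 - h2(p) for uniform input
print(f"I(X;Y) = {I:.4f} bits")  # ~0.531; smaller p gives larger I
```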
3. Properties of Different Information Channels
- BSC: two-symbol channel; each transmitted bit is flipped with probability p.
- BEC (Binary Erasure Channel): each symbol arrives correctly or is replaced by an erasure
symbol with probability e.
- Asymmetric channel: the two error probabilities differ, P(Y=1 | X=0) ≠ P(Y=0 | X=1)
(transition matrices for all three are sketched below).
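The three channels are easiest to compare through their transition matrices P(y|x); the sketch below uses assumed example probabilities (p = 0.1, e = 0.2, and an arbitrary asymmetric pair).

```python
import numpy as np

# Transition matrices: rows are inputs, columns are outputs, and each
# row must sum to 1 (it is a conditional probability distribution).
p, e = 0.1, 0.2

bsc = np.array([[1 - p, p],          # outputs: 0, 1
                [p, 1 - p]])

bec = np.array([[1 - e, e, 0.0],     # outputs: 0, erasure, 1
                [0.0, e, 1 - e]])

basc = np.array([[0.9, 0.1],         # asymmetric: P(1|0) != P(0|1)
                 [0.3, 0.7]])

for name, P in [("BSC", bsc), ("BEC", bec), ("Asymmetric", basc)]:
    assert np.allclose(P.sum(axis=1), 1.0)
    print(name, P, sep="\n")
```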
4. Reduction of Information Channels
Reduction simplifies a channel model by merging output symbols that carry the same information
about the input (for example, outputs whose columns of the transition matrix are proportional)
and by removing transitions of probability zero, making analysis easier.
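A small sketch of merging two informationally identical outputs (the matrix values are assumed for illustration):

```python
import numpy as np

# P(y|x) with 2 inputs and 4 outputs; columns 2 and 3 are identical,
# so the corresponding outputs can be merged without losing information.
P = np.array([[0.6, 0.2, 0.1, 0.1],
              [0.1, 0.3, 0.3, 0.3]])

reduced = np.column_stack([P[:, 0], P[:, 1], P[:, 2] + P[:, 3]])
print(reduced)  # rows still sum to 1; fewer outputs to analyze
```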
5. Noiseless Channel
Perfect transmission, where P(Y = X) = 1. The equivocation H(X|Y) is zero, so the mutual
information reaches its maximum, I(X;Y) = H(X).
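A quick numerical confirmation with an identity transition matrix and uniform input (values assumed for illustration):

```python
import numpy as np

P = np.eye(2)                     # noiseless: P(Y = X) = 1
px = np.array([0.5, 0.5])         # uniform input
py = px @ P                       # output distribution = input distribution
HY = -np.sum(py * np.log2(py))    # H(Y) = 1 bit
HY_given_X = 0.0                  # each row of P is deterministic
print("I(X;Y) =", HY - HY_given_X, "bits")  # 1.0 = H(X), the maximum
```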
6. Properties of Mutual Information
- Symmetric: I(X;Y) = I(Y;X)
- Always non-negative
- Upper-bounded: I(X;Y) ≤ min(H(X), H(Y)), with equality for a noiseless channel (see the
check after this list)
- Zero when X and Y are independent
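These properties can be checked numerically via the identity I(X;Y) = H(X) + H(Y) - H(X,Y), which is manifestly symmetric in X and Y; the joint distribution below is an arbitrary assumed example.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

joint = np.array([[0.3, 0.1],     # P(X=x, Y=y), rows = x, cols = y
                  [0.2, 0.4]])
px, py = joint.sum(axis=1), joint.sum(axis=0)
I = entropy(px) + entropy(py) - entropy(joint.ravel())
print(f"I(X;Y) = I(Y;X) = {I:.4f} bits (and >= 0)")
```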
7. Introduction to Channel Capacity
Channel capacity C = max over input distributions p(x) of I(X;Y). It is the highest rate at
which information can be transmitted with arbitrarily small error probability.
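For the BSC the maximizing input distribution is uniform, giving the closed form C = 1 - h2(p); the brute-force search over input distributions below is a sanity check (p = 0.1 assumed):

```python
import numpy as np

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p*np.log2(p) - (1-p)*np.log2(1-p)

p = 0.1
print("closed form C = 1 - h2(p):", 1 - h2(p))

best = 0.0
for q in np.linspace(0.001, 0.999, 999):   # q = P(X = 1)
    py1 = q * (1 - p) + (1 - q) * p        # P(Y = 1)
    best = max(best, h2(py1) - h2(p))      # I = H(Y) - H(Y|X)
print("max over input distributions:", best)  # both ~0.531 bits/use
```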
8. Shannon’s Channel Coding Theorem
If the transmission rate R < C, there exist codes that make the error probability arbitrarily
small; if R > C, reliable communication is impossible. This is the foundational result of
information theory.
9. Bandwidth – S/N Trade Off
Capacity C = B log2(1 + S/N) (the Shannon-Hartley theorem). Capacity grows linearly in the
bandwidth B but only logarithmically in the signal-to-noise ratio S/N, which is the essence of
the trade-off.
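A worked example with assumed telephone-line-style numbers (B = 3 kHz, S/N = 30 dB):

```python
import math

B = 3000                        # bandwidth in Hz
snr = 10 ** (30 / 10)           # 30 dB -> linear S/N = 1000
C = B * math.log2(1 + snr)
print(f"C = {C:.0f} bit/s")     # ~29,902 bit/s

# The trade-off: doubling B doubles C, doubling S/N adds only ~B bit/s.
print(2 * B * math.log2(1 + snr))      # double bandwidth
print(B * math.log2(1 + 2 * snr))      # double S/N
```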
10. Channel Capacity Theorem
General formula: C = max over input distributions p(x) of I(X;Y). For a band-limited AWGN
channel, C = B log2(1 + S/N).
11. Shannon Limit
Theoretical maximum data rate for reliable communication. If rate exceeds this, errors are
inevitable.
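One common way to state the limit quantitatively (an addition to these notes, not taken from them): letting B grow without bound in C = B log2(1 + S/(N0 B)) gives C -> (S/N0) log2 e, which implies a minimum Eb/N0 of ln 2, about -1.59 dB, for reliable communication.

```python
import math

# Minimum energy per bit over noise density in the infinite-bandwidth limit.
eb_n0_min = math.log(2)                        # ln 2 ~ 0.693 (linear)
print(f"{10 * math.log10(eb_n0_min):.2f} dB")  # ~ -1.59 dB
```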
12. Channel Capacity for MIMO System
Multiple transmit and receive antennas create parallel spatial streams that multiply capacity.
A commonly quoted approximation is C ≈ min(n, m) B log2(1 + S/N), where n = transmit antennas
and m = receive antennas; the exact capacity depends on the channel matrix.
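A worked example with assumed values (4x4 antennas, B = 20 MHz, 20 dB S/N), using the approximate formula from this section:

```python
import math

n, m = 4, 4                    # transmit / receive antennas
B = 20e6                       # bandwidth: 20 MHz
snr = 10 ** (20 / 10)          # 20 dB -> linear S/N = 100
C = min(n, m) * B * math.log2(1 + snr)
print(f"C ~ {C / 1e6:.0f} Mbit/s")  # ~4x the single-antenna capacity
```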