DCNIT-LDTalks-1
Here I have discussed the channel capacity of noiseless and noisy channels. How the Nyquist capacity and the Shannon capacity each play a key role is covered in detail. We will see several expressions for SNR_dB in terms of power and amplitude, and try to understand how the two capacities differ from each other. For extreme values of SNR, we will reduce the Shannon capacity formula to understand the bandwidth-limited and power-limited regions. I have used a few numerical examples to make the concepts clear. In the last section of the talk, I derive the Shannon capacity formula from scratch, to give better exposure to how these ideas contribute to its mathematical framework.
Course: CMS-A-CC-4-8
youtube: https://www.youtube.com/watch?v=1OjlMqHWq6o
Introduction to Channel Capacity | DCNIT-LDTalks-1
1. Channel Capacity
Noiseless and Noisy Channels
CMS-A-CC-4-8
Lockdown Talk Series: DCNIT-LDTalks-1
Arunabha Saha
Department of Computer Science, Vidyasagar College
University of Calcutta
April 2020
2. Outline
Communication
Basics of Channels
Noiseless Channels : Nyquist Capacity
Noisy Channels: SNR + Shannon Capacity
Approximations on Shannon Capacity
Numerical Examples
Deduce the Shannon Capacity expression
6. Communication
A communicates with B
The physical acts of A have induced a desired physical state in B1
Success: when receiver B and sender A both agree on what was sent.
1 Elements of Information Theory, T.M. Cover, J.A. Thomas
8. Channel Basics
Channel capacity: the maximum possible data rate under a given set of conditions.
Bandwidth: the range of frequencies where most of the signal energy is concentrated; B = fmax − fmin.
Channels fall into two classes:
Noiseless channels (Nyquist capacity)
Noisy channels (Shannon capacity)
10. Noiseless channel
Defn.: a communications channel in which the effects of random influences are negligible, and there is essentially no random error2.
No frames are lost, corrupted, or duplicated.
The Nyquist capacity gives the upper bound on the maximum data rate of such channels:
C_N = 2·B·log2(M)
C_N ≡ Nyquist capacity (in bps), B ≡ bandwidth (Hz), M ≡ number of signal levels
but this is unrealistic..!!
2 McGraw-Hill Dictionary of Scientific & Technical Terms, 6e
12. Nyquist capacity (1928)
assumes a channel entirely free of noise
for any given bandwidth B, the maximum signal rate is 2B symbols per second
a signal with M levels can carry log2(M) bits per symbol3
C_N = 2·B·log2(M) (1)
Tradeoffs:
increasing the bandwidth increases the data rate
increasing the number of signal levels increases the data rate
increasing the signal levels makes it harder for the receiver to distinguish the bits (so there is a practical limit to M)
3 "Certain topics in telegraph transmission theory", H. Nyquist, Trans. AIEE, vol. 47, pp. 617–644, Apr. 1928
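Eqn. (1) can be sketched in a few lines of Python (the 3 kHz voice-grade example values below are my own illustration, not from the slides):

```python
import math

def nyquist_capacity(bandwidth_hz: float, levels: int) -> float:
    """Noiseless-channel upper bound: C_N = 2 * B * log2(M), in bps."""
    return 2 * bandwidth_hz * math.log2(levels)

# Binary signalling (M = 2) on a 3 kHz line: 1 bit per symbol.
print(nyquist_capacity(3000, 2))   # 6000.0 bps
# Eight levels (M = 8) carry 3 bits per symbol, tripling the rate.
print(nyquist_capacity(3000, 8))   # 18000.0 bps
```

Note how the rate grows only logarithmically in M: each doubling of the level count adds just one bit per symbol.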
13. Noisy channel
Defn.: a communications channel in which the effects of random influences cannot be dismissed4.
This depicts real communication systems.
We will use the Shannon capacity formula for noisy systems..!!
4 McGraw-Hill Dictionary of Scientific & Technical Terms, 6e
15. SNR(1)
Defn.: the ratio of signal power to noise power; a unit-less quantity.
SNR = P_signal / P_noise, where P = average power
For a random signal S, SNR = E[S²] / E[N²],
where E[·] is the expectation value, i.e. the mean-square value of the quantity.
SNR = P_signal / P_noise = (A_signal / A_noise)² (2)
where A ≡ RMS amplitude (voltage)
If SNR < 1 the signal becomes unusable..!!
16. SNR(2)
Signals have a wide dynamic range, so SNR is often expressed in decibels (dB):
SNR_dB = 10·log10(SNR) (3)
⇒ SNR_dB = 10·log10(P_signal / P_noise) (4)
SNR_dB = P_signal,dB − P_noise,dB (5)
If we measure the signal or noise in volts (V), which is a measure of amplitude:
SNR_dB = 10·log10[(A_signal / A_noise)²] (6)
SNR_dB = A_signal,dB − A_noise,dB (7)
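A minimal helper for the dB conversions in eqns. (3)–(6) (function names are my own):

```python
import math

def snr_to_db(snr: float) -> float:
    """Eqn. (3): SNR_dB = 10 * log10(P_signal / P_noise)."""
    return 10 * math.log10(snr)

def db_to_snr(snr_db: float) -> float:
    """Inverse of eqn. (3): SNR = 10 ** (SNR_dB / 10)."""
    return 10 ** (snr_db / 10)

print(snr_to_db(1000))                 # 30.0 dB for a power ratio of 1000
# Doubling the amplitude quadruples the power, per eqn. (6) -> about +6 dB:
print(round(snr_to_db(2.0 ** 2), 2))   # 6.02
```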
18. Shannon capacity (1948)
assumes the presence of additive white Gaussian noise (AWGN)
provides the theoretical upper bound on the data transmission rate
increasing the signal power overcomes noise
Shannon's capacity of a channel5:
C_S = B·log2(1 + S/N) = B·log2(1 + SNR) (8)
Tradeoffs:
increasing the bandwidth or the signal power increases the data rate
increasing the noise reduces the data rate
increasing the bandwidth admits more noise
increasing the signal power results in more noise and errors
5 "Communication in the presence of noise", C.E. Shannon, Proceedings of the Institute of Radio Engineers, 37 (1): 10–21, Jan. 1949
19. Numerical example (1)
Say some spectrum is given with fmin = 3 MHz and fmax = 5 MHz, and SNR_dB = 30 dB.
Hence the bandwidth, B = 5 − 3 = 2 MHz
SNR_dB = 10·log10(SNR) ⇒ SNR = 1000
Shannon capacity:
C_S = B·log2(1 + SNR)
⇒ C_S ≈ 19.934 Mbps
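The numbers above can be checked with a short script (a sketch; variable names are my own):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr: float) -> float:
    """Eqn. (8): C_S = B * log2(1 + SNR), in bps."""
    return bandwidth_hz * math.log2(1 + snr)

B = 5e6 - 3e6           # bandwidth: fmax - fmin = 2 MHz
snr = 10 ** (30 / 10)   # 30 dB -> power ratio of 1000

print(round(shannon_capacity(B, snr) / 1e6, 3))   # 19.934 Mbps
```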
20. Numericals (2)
For the previous problem, how many signal levels do we need?
According to the Nyquist capacity, C_N = 2·B·log2(M). Setting C_N = 19.934 Mbps:
2·B·log2(M) = 19.934
⇒ M = 2^(19.934 / (2·B))
⇒ M ≈ 32
The number of levels must be an integer, but this value only works under the noiseless assumption of the Nyquist capacity.
In reality, very little accuracy would be obtained with 32 levels.
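Inverting the Nyquist formula for M reproduces the slide's arithmetic (the rounding-up step reflects that the level count must be an integer):

```python
import math

C = 19.934e6   # target rate from the Shannon calculation, bps
B = 2e6        # bandwidth, Hz

# Invert C = 2 * B * log2(M) to get the required number of levels:
M = 2 ** (C / (2 * B))
print(round(M, 1))    # about 31.6
print(math.ceil(M))   # 32: levels must be an integer
```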
21. A few points on Shannon capacity
It determines the upper bound on how fast we can communicate over a channel.
There is no way to exceed it; we try to get as close to it as possible.
Hard limit: C_S grows only linearly with B (for a fixed SNR).
No-transmission condition: if N >> S then S/N → 0 and log2(1 + S/N) → 0.
Increasing S enhances system performance, but not indefinitely; we need a CAP on the power of these devices.
22. Theoretical maximum data rate of LTE
We usually see in advertisements that an LTE operator provides hundreds of Mbps download speed. On the other hand, several reports suggest that the actual peak download speed we get may be a few tens of Mbps or less.
What's the true capacity..??!!
Max. bandwidth for LTE = 20 MHz.
Say there are 4 different SNR_dB values with equal probability: 3 dB, 6 dB, 9 dB, 12 dB.
Using Shannon's capacity C_S = B·log2(1 + SNR):
C1 = 31.6536 Mbps
C2 = 46.3291 Mbps
C3 = 63.2161 Mbps
C4 = 81.4917 Mbps
On average, 55.67 Mbps (with one transmit and one receive antenna).
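The four capacities and their average can be reproduced as follows (a sketch of the slide's arithmetic):

```python
import math

B = 20e6                 # max. LTE bandwidth, Hz
snr_dbs = (3, 6, 9, 12)  # four equally likely SNR values, dB

caps = [B * math.log2(1 + 10 ** (db / 10)) for db in snr_dbs]
print([round(c / 1e6, 2) for c in caps])       # [31.65, 46.33, 63.22, 81.49]
print(round(sum(caps) / len(caps) / 1e6, 2))   # 55.67 Mbps on average
```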
23. Approximations (1)
We will look at the capacity for large and small SNR.
Bandwidth-limited: when S/N >> 1
log2(1 + S/N) ≈ log2(S/N) ≈ 3.32·log10(S/N) = 0.332·SNR_dB
C_S ≈ 0.332·B·SNR_dB
Capacity is logarithmic in power and approximately linear in bandwidth: the bandwidth-limited region.
Power-limited: when S/N << 1
log2(1 + S/N) = (1/ln 2)·ln(1 + S/N) ≈ (1/ln 2)·(S/N) ≈ 1.44·(S/N)
C_S ≈ 1.44·B·(S/N)
Capacity is linear in power: the power-limited region.
In the low-SNR approximation, C_S is independent of B if white noise of spectral density N0 watts/Hz is considered. The total noise power is then N = B·N0, so
C_S ≈ 1.44·B·(S / (B·N0)) = 1.44·(S/N0)
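A quick numerical sanity check of both regimes against the exact formula (the example SNR values and bandwidth are my own choices):

```python
import math

def exact(B, snr):
    """Exact Shannon capacity, eqn. (8)."""
    return B * math.log2(1 + snr)

B = 1e6  # 1 MHz, an arbitrary example bandwidth

# Bandwidth-limited regime, S/N >> 1 (here 30 dB):
snr_db = 30
print(exact(B, 10 ** (snr_db / 10)))   # about 9.967e6 bps (exact)
print(0.332 * B * snr_db)              # 9.96e6 bps (approximation)

# Power-limited regime, S/N << 1:
snr = 0.01
print(exact(B, snr))                   # about 14355 bps (exact)
print(1.44 * B * snr)                  # 14400 bps (approximation)
```

In both regimes the approximation lands within about 1% of the exact value, which is why the two limiting forms are useful rules of thumb.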
27. Signal looks like noise
→ Let a successfully communicated signal be represented by an analog voltage V(t). At times t1 and t2 the voltage will be V(t1) and V(t2) respectively.
→ The two voltages are independent provided the samples are far enough apart: |t1 − t2| > 1/(2B).
The voltage levels are predetermined, but the receiver has no idea about them until they are received.
At the receiver's end the signal appears to vary from moment to moment; it is random in nature.
The statistical nature of a signal and of noise is the same; how can we distinguish them efficiently..??
Adding parity bits (redundancy) reduces system efficiency but makes the signal distinguishable from noise.
28. The amount of noise present in the system can be expressed as the mean noise power
N = V_n² / R (9)
Average signal power,
S = V_s² / R (10)
where R ≡ characteristic impedance and V_n, V_s ≡ RMS noise and signal voltages respectively.
As discussed before, both noise and signal need some upper bound:
±V_n,max = ±γ·V_n (11)
We try to maximize the signal as much as possible:
γ = V_s,max / V_s (12)
γ ≡ form factor
29. Total power
P_T = S + N (13)
RMS voltage:
V_T² = V_s² + V_n² (14)
As the signal and the noise are statistically similar, they share the same form factor, so V_T is confined within a range scaled by that same factor:
±V_T,max = ±γ·V_T (15)
Divide this range into 2^b bands of size
ΔV = 2γ·V_T / 2^b (16)
31. We need 2^b numbers to label all the bands.
We can use a b-bit binary number for each ΔV.
In effect, we are doing digital sampling with a b-bit A/D converter over the range 2γ·V_T.
Don't choose a b for which ΔV < 2γ·V_n.
The maximum number of bits is given by
2^b = V_T / V_n (17)
Using eqn. (14) and simplifying,
2^(2b) = V_T² / V_n² = 1 + S/N (18)
b = log2[(1 + S/N)^(1/2)] (19)
32. If M b-bit measurements of the level are taken in a time T, then the total number of bits of information collected is
H = M·b = M·log2[(1 + S/N)^(1/2)] (20)
Information transmission rate,
I = (M/T)·log2[(1 + S/N)^(1/2)] (21)
From the sampling theorem, the maximum sampling rate for a channel with bandwidth B is
M/T = 2B (22)
33. The maximum information transmission rate (from eqn. 21):
C_S = 2·B·log2[(1 + S/N)^(1/2)] (23)
⇒ C_S = B·log2(1 + S/N) (24)
Eqn. (24) gives us the expression for the Shannon capacity of a channel.
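As a final numerical check that eqns. (23) and (24) agree, the factor 2 from the sampling rate cancels the exponent 1/2 inside the logarithm:

```python
import math

B, snr = 2e6, 1000.0
eq23 = 2 * B * math.log2((1 + snr) ** 0.5)   # eqn. (23)
eq24 = B * math.log2(1 + snr)                # eqn. (24)
print(abs(eq23 - eq24) < 1e-6)               # True: the two forms are identical
```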
34. Thank You
Image source: Google Images
All materials and codes are copyright protected under the GNU GPL and Creative Commons (CC) licenses. Anyone can share or redistribute for non-commercial purposes only. Usage of any lecture materials or(and) codes by any individual or institution for commercial benefits or advancement without permission is strictly prohibited.