This section studies the channel capacity of analog and digital communication signals, along with data-rate limits, the noisy-channel coding theorem, and the Shannon capacity theorem.
3. INTRODUCTION
• The channel capacity is a key consideration in data communications: it determines
how fast we can send data, in bits per second, over a channel.
• In electrical engineering, computer science, and information theory, channel capacity
is the tight upper bound on the rate at which information can be reliably
transmitted over a communication channel.
• In other words, channel capacity is the maximum reliable data rate, in bits per second.
4. FORMAL DEFINITION
• The basic mathematical model for a communication system is the chain:
W → fn (encoder) → X → channel → Y → gn (decoder) → W^
• Where:
W is the message to be transmitted.
X is the channel input symbol.
Y is the channel output symbol.
W^ is the estimate of the transmitted message.
fn is the encoding function for a block of length n.
gn is the decoding function for a block of length n.
5. DATA RATE LIMITS
• The maximum data rate over a medium is determined by the following factors:
1. Bandwidth of channel.
2. Signal levels.
3. Channel quality (level of noise).
• Two theoretical formulas are used to calculate the data rate: one by Nyquist
for a noiseless channel, and one by Shannon for a noisy channel.
1. For noiseless channel – Nyquist bit rate
2. For noisy channel – Shannon capacity.
6. NYQUIST BIT RATE (NOISELESS CHANNEL)
• “If a noiseless channel has bandwidth B and carries a signal with L discrete levels,
then the maximum data rate is given by:
R = 2B log2 L bits/sec”
• The maximum error-free data rate is also called the channel capacity:
C = 2B log2 L bits/sec
OR
C = Rmax bits/sec
• In theory, capacity can be increased simply by increasing the number of signal levels; in practice, more levels make it harder for the receiver to distinguish them reliably.
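As an illustrative sketch (not from the original slides; the bandwidth and level counts below are assumed example values), the Nyquist bit rate can be computed directly:

```python
import math

def nyquist_bit_rate(bandwidth_hz: float, levels: int) -> float:
    """Maximum data rate of a noiseless channel: R = 2B log2(L)."""
    return 2 * bandwidth_hz * math.log2(levels)

# Example: a 3000 Hz telephone-style channel with 2 signal levels.
print(nyquist_bit_rate(3000, 2))   # 6000.0 bits/sec
# Doubling the levels to 4 doubles the rate: 2 * 3000 * log2(4) = 12000.
print(nyquist_bit_rate(3000, 4))   # 12000.0 bits/sec
```

Note how the rate grows only logarithmically in L: each doubling of the level count adds just one bit per symbol.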
7. SHANNON CAPACITY THEOREM
• “Given a source of M equally likely messages, with M >> 1, which generates
information at a rate R, and given a channel of capacity C:
• If R <= C, then there exists a coding technique such that the output of the source may be
transmitted over the channel with a probability of error in the received message
that can be made arbitrarily small.”
• Shannon's negative statement:
“Given a source of M equally likely messages, with M >> 1, which generates information
at a rate R: if R > C, then the probability of error is close to unity for every possible
set of M transmitted signals.”
• When R > C, no amount of coding complexity can reduce the probability of error.
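To make the positive statement concrete, here is a minimal sketch (not from the slides) using a repetition code over a binary symmetric channel with an assumed bit-flip probability p. Repetition coding is a deliberately simple stand-in: its rate shrinks toward zero, so it does not achieve capacity, but it does show coding driving the error probability down:

```python
from math import comb

def repetition_error_prob(n: int, p: float) -> float:
    """Block error probability of an n-bit repetition code (n odd)
    over a binary symmetric channel with flip probability p:
    the majority-vote decoder fails when more than n/2 bits flip."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

p = 0.1  # assumed channel flip probability for this example
for n in (1, 3, 5, 11):
    print(n, repetition_error_prob(n, p))
# Error probability falls as n grows: 0.1, 0.028, ~0.0086, ~0.0003
```

The error probability is computed analytically from the binomial distribution rather than by simulation, so the numbers are exact for the stated p.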
8. EXAMPLE APPLICATION
• An application of the channel capacity concept to an additive white Gaussian
noise (AWGN) channel with B Hz bandwidth and signal-to-noise ratio S/N is
the Shannon–Hartley theorem:
• C = B log2 ( 1 + S/N )
• C is measured in bits per second if the logarithm is taken in base 2, or nats per
second if the natural logarithm is used, assuming B is in hertz; the signal and noise
powers S and N are expressed in a linear power unit (such as watts or volts²).
Since S/N figures are often cited in dB, a conversion may be needed: S/N = 10^(dB/10).
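As an illustrative sketch (the channel numbers below are assumed, not from the slides), the Shannon–Hartley capacity can be computed with the dB-to-linear conversion applied first:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley capacity C = B log2(1 + S/N),
    converting the SNR from dB to a linear ratio first."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: 3000 Hz channel with 30 dB SNR (S/N = 1000 in linear terms).
print(shannon_capacity(3000, 30))   # ~29901.7 bits/sec
```

Unlike the Nyquist formula, this bound holds for a noisy channel and does not depend on the number of signal levels; noise, not signaling, is what caps the rate.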