MATRUSRI ENGINEERING COLLEGE
DEPARTMENT OF ELECTRONICS AND COMMUNICATION ENGINEERING
SUBJECT NAME: DIGITAL COMMUNICATION(PC601EC)-VI SEM
FACULTY NAME: Mr. A. Abhishek Reddy, Asst. Prof.
DIGITAL COMMUNICATION
COURSE OBJECTIVES:
1. Familiarize the students with elements of digital communication system and
waveform coding techniques like PCM, DPCM, DM and ADM.
2. Introduce the concepts of information theory and source coding.
3. Familiarize the students with channel coding techniques such as LBC, BCC and
convolutional codes.
4. Introduce the concepts of baseband digital data transmission and analyze the
error performance of different digital carrier modulation schemes like ASK, FSK,
PSK etc.
5. Familiarize the students with the concepts of spread spectrum communication
with emphasis on DSSS and FHSS.
COURSE OUTCOMES:
CO1: Classify the different types of digital modulation techniques PCM, DPCM, DM
and ADM and compare their performance by SNR.
CO2: Illustrate the classification of channels and Source coding methods.
CO3: Distinguish different types of Error control codes along with their
encoding/decoding algorithms.
CO4: Examine the Performance of different Digital Carrier Modulation schemes of
Coherent and Non-coherent type based on Probability of error.
CO5: Generate PN sequences using Spread Spectrum techniques and characterize the
Acquisition Schemes for Receivers to track the signals.
UNIT II - Information theory and source coding:
Uncertainty, Information and Entropy. Source coding:
Shannon-Fano and Huffman coding. Discrete memoryless
channel - probability relations in a channel, a priori and a posteriori
entropies, mutual information, channel capacity - binary
symmetric channel, binary erasure channel, cascaded channels,
information rate. Shannon-Hartley theorem - Shannon bound.
UNIT-II
OUTCOMES:
Introduce the concepts of Information Theory, Discrete Channels and
source coding Methods.
TEXT BOOKS /REFERENCES
TEXT BOOKS:
1. Simon Haykin, “Communication Systems”, 4/e, Wiley India, 2011.
2. K. Sam Shanmugam, “Digital and Analog Communication Systems”, Wiley, 1979.
3. B. P. Lathi, “Modern Digital and Analog Communication Systems”, 3/e,
Oxford University Press, 1998.
4. Leon W. Couch II, “Digital and Analog Communication Systems”, 6th Edn.,
Pearson Education Inc., New Delhi, 2001.
5. R. E. Ziemer & R. L. Peterson, “Introduction to Digital Communication”,
PHI, 2001.
REFERENCES:
1. P. Ramakrishna Rao, “Digital Communication”, TMH, 2011.
2. Dr. Sanjay Sharma, “Digital and Analog Communication”, McGraw Hill
Publication, 2009.
3. Bernard Sklar, “Digital Communications - Fundamentals and
Applications”, 2nd Edition, Prentice Hall.
4. John G. Proakis, “Digital Communications”, 4th Edition, McGraw Hill.
LESSON PLAN:
UNIT II- Information Theory and Source Coding
| S. No. | Topic(s) | No. of Hrs | Relevant COs | Text Book / Reference Book |
|--------|----------|------------|--------------|----------------------------|
| 1 | Uncertainty, Information and entropy | 2 | CO2 | T2, R1, R2 |
| 2 | Source coding, Shannon-Fano coding | 1 | CO2 | T2, R1, R2 |
| 3 | Huffman coding | 1 | CO2 | T2, R1, R2 |
| 4 | Discrete memoryless channel - probability relations in a channel | 2 | CO2 | T2, R1, R2 |
| 5 | Binary Symmetric Channel, Binary Erasure Channel | 1 | CO2 | T2, R1, R2 |
| 6 | A priori & a posteriori entropies, mutual information | 1 | CO2 | T2, R1, R2 |
| 7 | Channel capacity, cascaded channels | 1 | CO2 | T2, R1, R2 |
| 8 | Information rate, Shannon-Hartley Theorem - Shannon Bound | 1 | CO2 | T2, R1, R2 |
|   | TOTAL | 10 |  |  |
CONTENTS:
-UNCERTAINTY
- INFORMATION
- ENTROPY
OUTCOMES:
Estimate the amount of Uncertainty for a given Source.
MODULE-I
1. Communication systems are basically meant to transfer
information from one location to another.
2. Information theory is a branch of probability theory that
can be applied to the study of communication systems.
3. The communication of information is statistical in nature,
and the main aim of information theory is to study simple,
idealized statistical models of communication.
Measure of Information:
Consider the following three statements:
1. Brazil defeats India in football.
2. Japan defeats India in football.
3. India defeats Brazil in football.
Introduction
The higher the probability of an event, the less the amount of information associated
with it, and vice versa.
Average Information / Self-Information
$$I(x_i) = \log_b \frac{1}{p(x_i)} = -\log_b p(x_i)$$
where “b” is the base of the logarithm: if b = 2 the units are called “BITS”, if b = 10 the
units are called “HARTLEYS” or “DECITS”, and if b = e the units are called “NATS”.
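As a quick numeric illustration (an added sketch, not from the original slides), the code below evaluates the self-information of a symbol with probability 0.75 in bits, nats and hartleys, which also answers exercise 1 at the end of this module:

```python
import math

def self_information(p: float, base: float = 2.0) -> float:
    """Self-information I(x) = -log_b p(x) of a symbol with probability p."""
    return -math.log(p) / math.log(base)

p = 0.75
print(f"I = {self_information(p, 2):.4f} bits")       # base 2  -> bits
print(f"I = {self_information(p, math.e):.4f} nats")  # base e  -> nats
print(f"I = {self_information(p, 10):.4f} hartleys")  # base 10 -> hartleys
```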
A communication system is not meant to deal with a single message only, but with
all possible messages. Messages produced by an information source consist of
sequences of symbols corresponding to the message. We may describe the
source in terms of the average information (statistical average) per message,
known as the "entropy" of the source.
The quantity H(M) represents the average information per message emitted by a
DMS with source alphabet M and is called the "entropy" of the source. It should be read as
"entropy of the source M". It is a measure of uncertainty: the probability
distribution that generates maximum uncertainty will have maximum entropy.
Entropy
$$H(M) = \sum_{k=1}^{M} p_k \log_2 \frac{1}{p_k} = -\sum_{k=1}^{M} p_k \log_2 p_k \ \text{bits/message}$$
FOR A BINARY SYSTEM (M = 2), THE ENTROPY IS GIVEN BELOW. The condition for
maximum entropy can be found by differentiating this expression with respect
to p and equating it to zero, which yields:
$$H = p\log_2\frac{1}{p} + (1-p)\log_2\frac{1}{1-p}$$
$$\frac{dH}{dp} = \log_2\frac{1-p}{p} = 0 \;\Rightarrow\; p = 0.5$$
$$H_{max} = H\big|_{p=0.5} = 1 \ \text{bit/message}$$
LET US EXAMINE H UNDER DIFFERENT CASES FOR M=2
CASE I: P1=0.01, P2= 0.99, H= 0.08
CASE II: P1= 0.4, P2 = 0.6, H= 0.97
CASE III: P1 = 0.5, P2 = 0.5, H = 1.00
In case-I, either the message m1 with p1 = 0.01 will occur or the message m2 with
p2 = 0.99 will occur, but most of the time m2 will occur. Thus, the uncertainty is
low.
In case-II, it is difficult to guess because the probabilities are nearly equal. Thus,
the uncertainty is higher.
In case-III, it is extremely difficult to guess because the probabilities are equal. Thus,
the uncertainty is maximum. The entropy is low when uncertainty is low
and high when uncertainty is high.
Thus, we can say that entropy is a measure of uncertainty.
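The three cases above can be checked numerically. The short sketch below (an added illustration) computes H for a general source and evaluates the binary cases:

```python
import math

def entropy(probs) -> float:
    """H = -sum p_k log2 p_k, in bits/message (zero-probability terms skipped)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Binary source H(p) for the three cases discussed above
for p1 in (0.01, 0.4, 0.5):
    print(f"p1 = {p1}: H = {entropy([p1, 1 - p1]):.2f} bits/message")
# -> 0.08, 0.97, 1.00, confirming that H peaks at p = 0.5
```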
PROPERTIES OF ENTROPY
1. A binary symbol occurs with a probability of 0.75. Determine
the information associated with the symbol in bits, nats and
hartleys.
2. Differentiate between information and entropy.
3. A communication system consists of six messages with
probabilities 1/8, 1/8, 1/8, 1/8, 1/4 and 1/4 respectively. Determine
the entropy of the system.
4. What is meant by a Discrete Memoryless Source (DMS)?
ANSWER!
CONTENTS:
Rate of Information
Joint entropy
Conditional Entropy
OUTCOMES:
Apply Joint and Conditional Entropies to measure Information.
MODULE-II
RATE OF INFORMATION: If a message source generates messages at the rate of r
messages per second, the rate of information R is defined as the average
number of bits of information per second. Hence R = rH bits/second, where
r is the rate at which messages are generated per second and
H is the average information (or entropy) in bits/message.
Information Rate
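For instance (an added sketch, reusing the entropy helper from above, with the numbers of exercise 3 later in this module):

```python
import math

def entropy(probs) -> float:
    """H = -sum p_k log2 p_k, in bits/message."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]  # six outcomes (exercise 3 below)
r = 16                                      # outcomes per second
H = entropy(probs)                          # bits/outcome
R = r * H                                   # information rate in bits/second
print(f"H = {H:.4f} bits/outcome, R = {R:.1f} bits/second")
# H = 31/16 = 1.9375 bits -> R = 31 bits/second
```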
Suppose a DMS X emits symbols from an alphabet of size M. If we consider blocks,
each consisting of n such symbols, we may regard a
new source, called the extended source, as emitting such blocks as its
symbols. The alphabet size of the extended source is $M^n$, and its entropy is
$H(X^n) = n\,H(X)$.
JOINT ENTROPY
For a channel with input X and output Y, the joint entropy is
$$H(X,Y) = -\sum_{i}\sum_{k} p(x_i, y_k)\log_2 p(x_i, y_k) \ \text{bits per pair of symbols}$$
H(X ∣ Y) is a measure of our average uncertainty about the transmitted
symbol after a symbol has been received; it is sometimes called the
equivocation. The function H(Y ∣ X) is also called equivocation (i.e., it gives a
measure of error, or noise): it is the average uncertainty of the received
symbol given that X was transmitted.
Conditional Entropy
H(X): average information per character at the transmitter, or entropy of
the transmitter.
H(Y): average information per character at the receiver, or entropy of the
receiver.
H(X, Y): average information per pair of the transmitted and received
characters, or average uncertainty of the communication system as a
whole.
H(X/Y): a measure of information about the transmitter, given
that Y is received.
H(Y/X): a measure of information about the receiver, given
that X is transmitted.
CHAIN RULE: H(X, Y) = H(X) + H(Y/X) = H(Y) + H(X/Y)
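A small numeric sketch (added here for illustration, with an assumed joint probability matrix) verifies the chain rule:

```python
import numpy as np

# Assumed joint probability matrix p(x_i, y_k); rows = X, columns = Y
P = np.array([[0.30, 0.05],
              [0.10, 0.55]])

def H(p):
    """Entropy in bits of a probability array, ignoring zero entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

Hxy  = H(P)                # joint entropy H(X,Y)
Hx   = H(P.sum(axis=1))    # marginal entropy H(X)
Hy   = H(P.sum(axis=0))    # marginal entropy H(Y)
Hy_x = Hxy - Hx            # conditional entropy H(Y/X) via the chain rule
Hx_y = Hxy - Hy            # conditional entropy H(X/Y)
print(f"H(X,Y) = {Hxy:.4f}")
print(f"H(X) + H(Y/X) = {Hx + Hy_x:.4f}")   # equals H(X,Y)
print(f"H(Y) + H(X/Y) = {Hy + Hx_y:.4f}")   # equals H(X,Y)
```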
1. What are the properties of entropy?
2. Consider a DMS with source alphabet S = {S0, S1, S2} with
probabilities P(S0) = 1/4 = P(S1) = P(S2). Find the second-order
entropy.
3. An event has six possible outcomes with probabilities
p1 = 1/2, p2 = 1/4, p3 = 1/8, p4 = 1/16, p5 = 1/32 and p6 = 1/32.
Find the entropy of the system and the rate of information if there
are 16 outcomes per second.
4. A discrete source transmits messages x1, x2, x3
with probabilities 0.3, 0.4 and 0.3.
The source is connected to
the channel as given in the figure.
Calculate all the entropies.
ANSWER!
CONTENTS:
SOURCE CODING-CLASSIFICATION
SHANNON-FANO CODING
HUFFMAN CODING
OUTCOMES:
Understand the importance of Source Coding Methods.
MODULE-III
A discrete source produces symbols which are to be represented in such a
way that it is possible to transmit them over a given channel. The
source output, which is in the form of a sequence of source symbols, must
be converted into a sequence of what are called the 'code elements'. This
process is called 'encoding', and the device or system which performs this
encoding is called an encoder. The encoder assigns a unique sequence of
code elements, called a 'codeword', to represent each source symbol.
SOURCE CODING
The objective of source coding is to remove or reduce the redundancy in
the source output so as to give an efficient representation of the
message information produced by the source, using fewer bits. Codes are classified as:
1. Block codes
2. Distinct codes (non-singular)
3. Fixed-length codes
4. Variable-length codes
5. Instantaneous codes (prefix-free)
6. Uniquely decodable codes
Example:

| Symbol | Code 1 | Code 2 | Code 3 | Code 4 | Code 5 | Code 6 |
|--------|--------|--------|--------|--------|--------|--------|
| x1     | 00     | 00     | 0      | 0      | 0      | 1      |
| x2     | 01     | 01     | 1      | 10     | 01     | 01     |
| x3     | 00     | 10     | 00     | 110    | 011    | 001    |
| x4     | 11     | 11     | 11     | 111    | 0111   | 0001   |
AVERAGE CODEWORD LENGTH:
CODE EFFICIENCY: It is defined as the ratio of the minimum possible value of the
average codeword length to the actual average codeword length of the symbols used
in the source encoding process.
REDUNDANCY:
$$\bar{L} = \sum_{i=1}^{M} P(x_i)\,n_i \ \text{bits/message}$$
$$\eta = \frac{L_{min}}{\bar{L}}, \qquad L_{min} = \frac{H(X)}{\log_2 M}$$
$$\eta\,\% = \frac{L_{min}}{\bar{L}} \times 100 = \frac{H(X)}{\bar{L}\,\log_2 M} \times 100$$
For a binary code (M = 2): $\eta = \dfrac{H(X)}{\bar{L}}$
$$\text{Redundancy: } \gamma = 1 - \eta$$
KRAFT'S INEQUALITY: A necessary and sufficient condition for the existence of
an instantaneous binary code with codeword lengths $n_1, n_2, \ldots, n_M$ is
$$\sum_{i=1}^{M} D^{-n_i} \le 1$$
where D is the size of the code alphabet (D = 2 for a binary code).
It only assures the existence of an instantaneously decodable code with codeword lengths
that satisfy the inequality. It does not show how to obtain the codewords, nor
does it say that any code satisfying the inequality is automatically uniquely
decodable.
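As an added illustration (not from the slides), the sketch below computes the average codeword length, efficiency and redundancy for the source of exercise 1 further below, and checks Kraft's inequality for its codeword lengths:

```python
import math

probs   = [0.5, 0.25, 0.125, 0.125]  # P(A), P(B), P(C), P(D) from exercise 1 below
lengths = [1, 2, 3, 3]               # codeword lengths of an optimal binary code

H    = -sum(p * math.log2(p) for p in probs)       # source entropy, bits/symbol
Lbar = sum(p * n for p, n in zip(probs, lengths))  # average codeword length
eta  = H / Lbar                                    # efficiency of a binary code
print(f"H = {H} bits, L = {Lbar}, efficiency = {eta:.0%}, redundancy = {1 - eta:.0%}")

kraft = sum(2 ** -n for n in lengths)              # Kraft sum, must be <= 1
print(f"Kraft sum = {kraft} (<= 1, so an instantaneous code exists)")
```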
STEPS:
1. Write down the message or source symbols in the order of decreasing
probabilities.
2. Draw a line after, say, the k-th symbol such that the total probability of
the symbols above the line and below the line are approximately equal,
i.e., divide the source symbols into two groups of almost equal
probability.
3. Assign to each symbol above the line a binary value '0' and to each
symbol below the line a binary value '1'.
4. Repeat steps 2 and 3 until each subgroup has only one symbol left.
5. When that stage is reached, the coding is complete.
SHANNON-FANO CODING
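A compact recursive sketch of these steps (an added illustration; the splitting rule here greedily balances the two groups' probabilities) follows:

```python
def shannon_fano(symbols):
    """symbols: list of (name, probability); returns {name: codeword}."""
    symbols = sorted(symbols, key=lambda s: s[1], reverse=True)
    codes = {name: "" for name, _ in symbols}

    def split(group):
        if len(group) <= 1:
            return
        total, run, k = sum(p for _, p in group), 0.0, 0
        for i, (_, p) in enumerate(group):        # find the split point giving
            run += p                              # roughly equal halves
            if run >= total / 2:
                k = i + 1
                break
        for name, _ in group[:k]:
            codes[name] += "0"                    # '0' above the line
        for name, _ in group[k:]:
            codes[name] += "1"                    # '1' below the line
        split(group[:k])
        split(group[k:])

    split(symbols)
    return codes

print(shannon_fano([("A", 0.5), ("B", 0.25), ("C", 0.125), ("D", 0.125)]))
# -> {'A': '0', 'B': '10', 'C': '110', 'D': '111'}
```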
STEPS:
1. Write down the message or source probabilities in decreasing order.
2. Assign a binary value 0 and a binary value 1 to the last two symbols, which have
the lowest probabilities. This forms stage-I.
3. Combine the last two symbols into one new symbol with probability equal to the
sum of the probabilities of the two original symbols. List the probabilities of the
original symbols, except the last two, together with the new symbol in decreasing
order. This forms stage-II.
4. The process of step 3 is repeated until we are left with two symbols, to which 0
and 1 are assigned. This forms the last stage.
5. The code for each original symbol is then obtained by tracing out the sequence
of 0's and 1's backwards from the last stage to the original
source symbol.
HUFFMAN CODING
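An added sketch of binary Huffman coding using a priority queue (equivalent to the tabular stage-by-stage procedure above):

```python
import heapq
from itertools import count

def huffman(symbols):
    """symbols: list of (name, probability); returns {name: codeword}."""
    tick = count()  # tie-breaker so heapq never compares dicts
    # each heap entry: (probability, tie, {name: partial codeword})
    heap = [(p, next(tick), {name: ""}) for name, p in symbols]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)   # the two least probable entries
        p1, _, c1 = heapq.heappop(heap)
        for name in c0:
            c0[name] = "0" + c0[name]     # prepend a bit while retracing
        for name in c1:
            c1[name] = "1" + c1[name]
        heapq.heappush(heap, (p0 + p1, next(tick), {**c0, **c1}))
    return heap[0][2]

print(huffman([("S0", 0.4), ("S1", 0.2), ("S2", 0.2), ("S3", 0.1), ("S4", 0.1)]))
# average length = 2.2 bits/symbol for this source (exercise 2 below)
```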
HUFFMAN CODE CAN BE APPLIED TO AN M-ARY SOURCE AS WELL. THE
ALGORITHM IS:
1. Rearrange the symbols in the order of decreasing probability.
2. The last M symbols are combined into one symbol.
3. Steps 1 and 2 are repeated till the set reduces to M symbols.
4. Each of these reduced symbols is now assigned one of the numbers 0, 1, ..., M-1
as the first digit in its respective codeword.
5. Now retrace and assign the numbers 0, 1, ..., M-1 to the second digit for the M
symbols that were combined in the previous step.
6. This is repeated till the original symbol set is reached.
Disadvantages of Huffman coding
1. It depends heavily on source statistics; a priori knowledge of the
probabilities of occurrence of the source symbols is a must.
2. Most of the sources that we come across in practice are not memoryless
(i.e., the probability of occurrence of a symbol is not independent of which
symbols have preceded it), and since Huffman coding takes into account only the
individual symbol probabilities, its use in practical applications does
not yield good compression.
1. Consider a source that emits independent symbols A, B, C, D with probabilities
of occurrence P(A) = 0.5, P(B) = 0.25, P(C) = 0.125 and P(D) = 0.125. Apply
Shannon-Fano coding and find the average length, efficiency and
redundancy.
2. Compute the Huffman source code for the message symbols
S = {S0, S1, S2, S3, S4} with probabilities P(Si) = {0.4, 0.2, 0.2, 0.1, 0.1} in two
methods, compute the variance of the average codeword length in each case, and
comment on the result.
3. What are the advantages of Huffman source codes over Shannon-Fano codes?
4. What happens when source coding is applied to fixed-length codes
used for a non-equiprobable source?
ANSWER!
CONTENTS:
-MUTUAL INFORMATION
-DISCRETE MEMORYLESS CHANNELS
OUTCOMES:
Estimate the mutual information for a given channel.
MODULE-IV
None of the entropies discussed so far quantifies the information lost in the channel,
which is needed to characterize the complete communication system.
To incorporate this missing part, the concept of mutual information is
needed.
MUTUAL INFORMATION
Mutual information = initial uncertainty - final uncertainty:
$$I(x_i; y_k) = \log_2\frac{1}{p(x_i)} - \log_2\frac{1}{p(x_i \mid y_k)} = \log_2\frac{p(x_i \mid y_k)}{p(x_i)}$$
Similarly,
$$I(y_k; x_i) = \log_2\frac{p(y_k \mid x_i)}{p(y_k)}$$
i.e., when an average information H(X) or H(Y) is transmitted over the channel, an
average amount of information equal to H(X/Y) or H(Y/X) is lost in the channel due
to symbol transitions caused by noise. The balance amount of information
received at the receiver with respect to an observed output symbol is the mutual
information.
CONCLUSIONS:
1. I(X; Y) is the average mutual information; it indicates a measure of the information
transferred through the channel. It is also known as the transferred information or
transinformation of the channel.
2. The equation
$$I(X; Y) = H(X) - H(X/Y)$$
states that the transferred
information is equal to the average source information minus the average
uncertainty that still remains about the messages. In other words, H(X/Y) is the
average additional information needed at the receiver after reception in order to
completely specify the message sent. Thus, H(X/Y) gives the information lost in
the channel. This is also known as equivocation.
3. The equation
$$I(X; Y) = H(Y) - H(Y/X)$$
states that the transferred information
is equal to the receiver information minus that part of the receiver entropy which
is not information about the source. Thus, H(Y/X) is a measure of the noise or
error due to the channel.
Properties:
1. The mutual information of a channel is symmetric, i.e., I(X;Y) = I(Y;X).
2. The mutual information is non-negative, i.e., I(X;Y) ≥ 0:
even on a noisy channel, by observing the output of the channel, on the
average we cannot lose any information. At most, the mutual information
may be zero, i.e., we do not gain any information by observing the output;
this happens when the input and output symbols of the channel are statistically
independent.
3. The mutual information I(X;Y) of a channel is related to the joint entropy
H(X,Y) and the marginal entropies H(X), H(Y) as
$$I(X;Y) = H(X) + H(Y) - H(X,Y)$$
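These relations can be checked numerically. The added sketch below computes I(X;Y) for an assumed input distribution and channel matrix:

```python
import numpy as np

def entropy(p):
    """Entropy in bits of a probability array, ignoring zero entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

Px  = np.array([0.5, 0.5])          # assumed input distribution P(x_i)
Pyx = np.array([[0.9, 0.1],         # assumed channel matrix p(y_k | x_i)
                [0.1, 0.9]])        # (a BSC with p = 0.1)

Pxy = Px[:, None] * Pyx             # joint matrix p(x_i, y_k)
Py  = Pxy.sum(axis=0)               # output distribution P(y_k)

I = entropy(Px) + entropy(Py) - entropy(Pxy)   # I(X;Y) = H(X)+H(Y)-H(X,Y)
print(f"I(X;Y) = {I:.4f} bits/symbol")         # ~0.531 for this channel
```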
DISCRETE MEMORYLESS CHANNEL
The channel is described by the channel (transition probability) matrix, in which
each row sums to 1:
$$P(Y \mid X) = \begin{bmatrix} p(y_1 \mid x_1) & p(y_2 \mid x_1) & \cdots & p(y_L \mid x_1) \\ p(y_1 \mid x_2) & p(y_2 \mid x_2) & \cdots & p(y_L \mid x_2) \\ \vdots & \vdots & & \vdots \\ p(y_1 \mid x_M) & p(y_2 \mid x_M) & \cdots & p(y_L \mid x_M) \end{bmatrix}$$
SYMMETRIC/UNIFORM CHANNEL: A channel is said to be a symmetric (or uniform)
channel if the second and subsequent rows of the channel matrix contain the
same elements as the first row, but in a different order.
(2) LOSSLESS CHANNEL: A channel represented by a channel matrix with one and
only one non-zero element in every column is defined as a lossless channel.
CLASSIFICATION OF CHANNELS
Symmetric channel example:
$$P(Y \mid X) = \begin{bmatrix} 1/2 & 1/3 & 1/6 \\ 1/3 & 1/6 & 1/2 \\ 1/6 & 1/2 & 1/3 \end{bmatrix}$$
Lossless channel example:
$$P(Y \mid X) = \begin{bmatrix} 3/4 & 1/4 & 0 & 0 & 0 \\ 0 & 0 & 1/3 & 2/3 & 0 \\ 0 & 0 & 0 & 0 & 1 \end{bmatrix}$$
(3) DETERMINISTIC CHANNEL: A channel represented by a channel
matrix with one and only one non-zero element in every row is defined
as a deterministic channel.
1. Each row contains only one non-zero element (equal to 1), all other
elements in that row being zero.
2. The sum of all elements in any row is equal to unity.
Deterministic channel example:
$$P(Y \mid X) = \begin{bmatrix} 1 & 0 & 0 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$
BINARY SYMMETRIC CHANNEL (BSC): A BSC consists of two inputs
(x0 = 0 and x1 = 1) and two outputs (y0 = 0 and y1 = 1). The channel is
symmetric because the probability of receiving a 1 when a 0 is sent is the same as
the probability of receiving a 0 when a 1 is sent. This common transition probability
is denoted by p:
$$P(Y \mid X) = \begin{bmatrix} 1-p & p \\ p & 1-p \end{bmatrix}$$
BINARY ERASURE CHANNEL (BEC): A BEC consists of two inputs
(x0 = 0 and x1 = 1) and three outputs (y0 = 0, y1 = erasure, y2 = 1).
Due to noise, it may not be possible to identify the output symbol as one
or the other of the input symbols. In that case it is erased, i.e., ignored,
and a request is sent to the transmitter to retransmit. That is why it is
called a binary erasure channel:
$$P(Y \mid X) = \begin{bmatrix} 1-p & p & 0 \\ 0 & p & 1-p \end{bmatrix}$$
NOISE-FREE CHANNEL: In this channel there is a one-to-one
correspondence between input and output, i.e., each input symbol is
received as one and only one output symbol. In this channel there is no
loss of information in transmission. The number of source and destination
symbols is the same, n = m.
$$P(X,Y) = \begin{bmatrix} p(x_1,y_1) & 0 & \cdots & 0 \\ 0 & p(x_2,y_2) & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & p(x_m,y_m) \end{bmatrix}$$
$$P(Y/X) = P(X/Y) = \begin{bmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{bmatrix}$$
$$I(X;Y) = H(X) - H(X/Y) = H(X) = H(Y) = H(X,Y)$$
Channel with independent input and output: In these channels there is
no correlation between the input and output symbols.
The joint probability matrix has one of the forms
$$P(X,Y) = \begin{bmatrix} p_1 & p_2 & \cdots & p_n \\ p_1 & p_2 & \cdots & p_n \\ \vdots & \vdots & & \vdots \\ p_1 & p_2 & \cdots & p_n \end{bmatrix} \qquad \text{or} \qquad P(X,Y) = \begin{bmatrix} p_1 & p_1 & \cdots & p_1 \\ p_2 & p_2 & \cdots & p_2 \\ \vdots & \vdots & & \vdots \\ p_m & p_m & \cdots & p_m \end{bmatrix}$$
IN THE CASE OF A CHANNEL WITH INDEPENDENT INPUT AND OUTPUT,
NO INFORMATION IS TRANSMITTED THROUGH THE CHANNEL.
A channel with independent input and output has a JPM satisfying
at least one of the following conditions:
1. Each row consists of the same element.
2. Each column consists of the same element.
1. What is meant by a priori probability?
2. State the properties of mutual information.
3. What is the conditional entropy H(X/Y) for a lossless
channel?
4. What is meant by BSC and BEC?
ANSWER!
CONTENTS:
-CHANNEL CAPACITY
-BINARY CHANNEL
-CASCADED CHANNELS
OUTCOMES:
Analyze the maximum average mutual information for different
channels.
MODULE-V
CHANNEL CAPACITY PER SYMBOL (C): The channel capacity of a discrete
memoryless channel, commonly denoted by C, is defined as the
maximum mutual information I(X;Y) in any single use of the channel (i.e.,
signaling interval), where the maximization is over all possible input
probability distributions {P(x_i)}.
CHANNEL CAPACITY PER SECOND (C_s): If r symbols are being transmitted
per second, then the maximum rate of transmission of information per
second is rC. This is the channel capacity in binits per second and is denoted
by C_s (binits/sec).
CHANNEL CAPACITY
$$C = \max_{\{P(x_i)\}} I(X;Y) \ \text{bits/symbol}, \qquad C_s = rC \ \text{binits/sec}$$
The transmission efficiency or channel efficiency is defined as
$$\eta = \frac{\text{actual transinformation}}{\text{maximum transinformation}} = \frac{I(X;Y)}{\max I(X;Y)} = \frac{I(X;Y)}{C}$$
Redundancy:
$$R_\eta = 1 - \eta = \frac{C - I(X;Y)}{C}$$
LOSSLESS CHANNEL: $C = \max_{\{P(x_i)\}} H(X) = \log_2 m$
DETERMINISTIC CHANNEL: $C = \max_{\{P(x_i)\}} H(Y) = \log_2 n$
NOISE-FREE CHANNEL: $C = \max I(X;Y) = \log_2 m = \log_2 n$
SYMMETRIC CHANNEL: $I(X;Y) = H(Y) - A$, where $A = H(Y \mid x_j)$ is the entropy of
any one row of the channel matrix, so
$$C = \log_2 n - A$$
BINARY SYMMETRIC CHANNEL
With transition probability p and input probabilities $P(x_1) = \alpha$, $P(x_2) = 1-\alpha$:
$$P(Y \mid X) = \begin{bmatrix} 1-p & p \\ p & 1-p \end{bmatrix}$$
$$P(X,Y) = [P(X)]_d \, P(Y \mid X) = \begin{bmatrix} \alpha(1-p) & \alpha p \\ (1-\alpha)p & (1-\alpha)(1-p) \end{bmatrix}$$
$$I(X;Y) = H(Y) + p\log_2 p + (1-p)\log_2 (1-p)$$
$$C = \max I(X;Y) = \max H(Y) + p\log_2 p + (1-p)\log_2(1-p)$$
$$C_{BSC} = 1 + p\log_2 p + (1-p)\log_2(1-p)$$
1. When the channel is noise-free, i.e., if p = 0 or 1, the channel output is
completely determined by the channel input, and the capacity is 1 bit per
symbol. At these values of p, the entropy function H(p) attains its minimum
value of zero.
2. When the conditional probability of error p is equal to 0.5 due to
channel noise, an input symbol yields either output symbol with equal
probability, and the capacity is zero, while the entropy function H(p)
attains its maximum value of unity; in such a case, the channel is said to
be useless in the sense that the channel input and output assume
statistically independent structures.
$$C_{BSC} = 1 - H(p)$$
BINARY ERASURE CHANNEL
With input probabilities $P(x_0) = \alpha$, $P(x_1) = 1-\alpha$:
$$P(Y \mid X) = \begin{bmatrix} P(y_1|x_1) & P(y_2|x_1) & P(y_3|x_1) \\ P(y_1|x_2) & P(y_2|x_2) & P(y_3|x_2) \end{bmatrix} = \begin{bmatrix} 1-p & p & 0 \\ 0 & p & 1-p \end{bmatrix}$$
$$P(Y) = \begin{bmatrix} \alpha(1-p) & p & (1-\alpha)(1-p) \end{bmatrix}$$
$$I(X;Y) = H(Y) - H(Y \mid X) = (1-p)\,H(X)$$
$$C_{BEC} = (1-p)\,\max H(X)$$
$$C_{BEC} = 1-p$$
BINARY CHANNELS:
For finding the channel capacity of a general
binary channel, a method was suggested
by Dr. S. Muroga. With channel matrix
$$D = P(Y \mid X) = \begin{bmatrix} P_{11} & P_{12} \\ P_{21} & P_{22} \end{bmatrix}$$
the capacity is
$$C = \log_2\left(2^{Q_1} + 2^{Q_2}\right) \ \text{bits/message-symbol}$$
where $Q_1, Q_2$ (and in general $Q_1, \ldots, Q_m$) are obtained by solving
$$\begin{bmatrix} P_{11} & P_{12} & \cdots & P_{1m} \\ P_{21} & P_{22} & \cdots & P_{2m} \\ \vdots & \vdots & & \vdots \\ P_{m1} & P_{m2} & \cdots & P_{mm} \end{bmatrix} \begin{bmatrix} Q_1 \\ Q_2 \\ \vdots \\ Q_m \end{bmatrix} = \begin{bmatrix} P_{11}\log_2 P_{11} + P_{12}\log_2 P_{12} + \cdots + P_{1m}\log_2 P_{1m} \\ P_{21}\log_2 P_{21} + P_{22}\log_2 P_{22} + \cdots + P_{2m}\log_2 P_{2m} \\ \vdots \\ P_{m1}\log_2 P_{m1} + P_{m2}\log_2 P_{m2} + \cdots + P_{mm}\log_2 P_{mm} \end{bmatrix}$$
$$C = \log_2\left(2^{Q_1} + 2^{Q_2} + \cdots + 2^{Q_m}\right) \ \text{bits/message-symbol}$$
CASCADED CHANNELS: When information is transmitted from X to Y
through channel-I, there will be a loss of information due to the noise in
channel-I, and the mutual information at the output of channel-I is
$$I(X;Y) = H(Y) - H(Y/X)$$
When this is passed through channel-II, there will be a further loss of
information, and the mutual information at the output of channel-II is
$$I(X;Z) = H(Z) - H(Z/X)$$
For two identical BSCs in cascade (each with transition probability p, q = 1-p):
$$P(Z \mid X) = \begin{bmatrix} 1-2pq & 2pq \\ 2pq & 1-2pq \end{bmatrix}$$
$$C = 1 - H(2pq)$$
1. Derive the expression for the channel capacity of a binary
symmetric channel.
2. Derive the expression for the channel capacity of a binary
erasure channel.
3. Find the mutual information and channel capacity of the
channel shown in the figure, given P(x1) = 0.6 and P(x2) = 0.4.
4. Find the mutual information for the channel with channel matrix
$$P(Y/X) = \begin{bmatrix} 2/3 & 1/3 & 0 \\ 0 & 1/6 & 5/6 \end{bmatrix}$$
and input probabilities P(x1) = P(x2) = 0.5.
5. Define channel capacity.
ANSWER!
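A worked sketch for several of these exercises (an added illustration; the transition probability p below is an assumed value, the channel matrix is the one from question 4):

```python
import numpy as np

def entropy(p):
    """Entropy in bits of a probability array, ignoring zero entries."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(Px, Pyx):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for input Px and channel matrix Pyx."""
    Pxy = np.asarray(Px)[:, None] * np.asarray(Pyx)
    return entropy(Px) + entropy(Pxy.sum(axis=0)) - entropy(Pxy)

p = 0.1  # assumed transition/erasure probability for the examples below

# 1. BSC capacity: C = 1 - H(p)
print("C_BSC =", 1 - entropy([p, 1 - p]))

# 2. BEC capacity: C = 1 - p
print("C_BEC =", 1 - p)

# 4. Mutual information of the given channel with P(x1) = P(x2) = 0.5
Pyx = np.array([[2/3, 1/3, 0.0],
                [0.0, 1/6, 5/6]])
print("I(X;Y) =", mutual_information([0.5, 0.5], Pyx))

# Cascade of two BSCs: C = 1 - H(2pq)
q = 1 - p
print("C_cascade =", 1 - entropy([2*p*q, 1 - 2*p*q]))
```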
CONTENTS:
-SHANNON'S SECOND THEOREM
-SHANNON'S CHANNEL CODING THEOREM
-DIFFERENTIAL ENTROPIES & MUTUAL INFORMATION OF CONTINUOUS ENSEMBLES
-SHANNON-HARTLEY LAW
OUTCOMES:
Illustrate the impact of bandwidth and SNR on capacity.
MODULE-VI
It states that it is possible to devise a means whereby a communication
system will transmit information with an arbitrarily small probability of
error provided that the information rate does not exceed the channel capacity.
SHANNON'S CHANNEL CODING THEOREM
Given a discrete memoryless source with an entropy of H(S) bits
per symbol emitting symbols at the rate of (1/T_s) symbols per second, and
given a discrete memoryless channel with a capacity of C bits per symbol
through which the symbols are transmitted at the rate of (1/T_c)
symbols per second, it is possible to construct a channel code which
would make it possible to transmit the source symbols through the
channel and have them reconstructed with an arbitrarily small probability of error, if
and only if
$$R \le C, \qquad \text{i.e.,} \qquad \frac{H(S)}{T_s} \le \frac{C}{T_c}$$
SHANNON'S SECOND THEOREM
DIFFERENTIAL ENTROPY AND MUTUAL INFORMATION FOR CONTINUOUS
RANDOM ENSEMBLES
$$h(X) = -\int_{-\infty}^{\infty} f_X(x)\log_2 f_X(x)\,dx \ \text{bits/sample}$$
$$h(Y) = -\int_{-\infty}^{\infty} f_Y(y)\log_2 f_Y(y)\,dy \ \text{bits/sample}$$
$$I(X;Y) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X,Y}(x,y)\log_2\frac{f_X(x \mid y)}{f_X(x)}\,dx\,dy$$
$$h(X \mid Y) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X,Y}(x,y)\log_2\frac{1}{f_X(x \mid y)}\,dx\,dy$$
$$h(Y \mid X) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X,Y}(x,y)\log_2\frac{1}{f_Y(y \mid x)}\,dx\,dy$$
IF THE R.V. IS GAUSSIAN DISTRIBUTED, THE DIFFERENTIAL ENTROPY ATTAINS ITS
MAXIMUM VALUE:
SHANNON-HARTLEY LAW / SHANNON'S INFORMATION CAPACITY THEOREM
Shannon's information capacity theorem is also known as Shannon's third
theorem, the Shannon-Hartley theorem, or the Gaussian channel capacity
theorem. If the channel bandwidth B is fixed, the output is a band-limited
signal completely characterized by its periodic sample values taken at the
Nyquist rate of 2B samples/sec. The channel capacity C (bits/sec) of an
AWGN channel is then given by
$$h(X) = \frac{1}{2}\log_2\left(2\pi e \sigma^2\right) = \log_2\left(\sigma\sqrt{2\pi e}\right)$$
$$C = B\log_2\left(1 + \frac{S}{N}\right) \ \text{bits/second}$$
B = CHANNEL BANDWIDTH IN Hz
S = AVG. SIGNAL POWER IN WATTS
N = NOISE POWER IN WATTS
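For example (an added sketch; it also answers exercise 1 at the end of this module, assuming a 3.5 kHz telephone line and a 56 kbps target rate):

```python
import math

# Capacity of an AWGN channel: C = B log2(1 + S/N)
B, snr = 3500.0, 1000.0                     # assumed bandwidth (Hz) and S/N
C = B * math.log2(1 + snr)
print(f"C = {C:.0f} bits/second")           # ~34.9 kbps at 30 dB SNR

# Minimum SNR needed to support R = 56 kbps in B = 3.5 kHz:
R = 56_000.0
snr_min = 2 ** (R / B) - 1                  # invert the capacity formula
print(f"required S/N = {snr_min:.0f} ({10 * math.log10(snr_min):.1f} dB)")
```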
IDEAL SYSTEM: An ideal system is defined as one that transmits data at a
bit rate that is equal to the channel capacity C, in bits per second.
With noise power spectral density $\eta/2$, the noise power is $N = \eta B$, so
$$C = B\log_2\left(1 + \frac{S}{\eta B}\right) \ \text{bits/sec}$$
$$\lim_{B \to \infty} C = \frac{S}{\eta}\log_2 e = 1.44\,\frac{S}{\eta}$$
For the ideal system, $S = E_b C$, so
$$\frac{S}{N} = \frac{E_b}{\eta}\,\frac{C}{B}$$
$$\frac{E_b}{\eta} = \frac{2^{C/B} - 1}{C/B}$$
As $C/B \to 0$,
$$\frac{E_b}{\eta} \to \ln 2 = 0.693, \quad \text{i.e.,} \quad \left(\frac{E_b}{\eta}\right)_{min} = \ln 2 = -1.6 \ \text{dB}$$
This limit is known as the Shannon bound.
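A short added sketch confirming the Shannon bound numerically (Eb/η approaches ln 2, about -1.6 dB, as the bandwidth efficiency C/B goes to zero):

```python
import math

def eb_over_eta(r: float) -> float:
    """Eb/eta = (2**r - 1)/r for bandwidth efficiency r = C/B."""
    return (2 ** r - 1) / r

for r in (4.0, 1.0, 0.1, 0.001):
    val = eb_over_eta(r)
    print(f"C/B = {r:>6}: Eb/eta = {val:.4f} ({10 * math.log10(val):+.2f} dB)")
# As C/B -> 0 the values approach ln 2 = 0.6931, i.e. -1.59 dB
```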
1. One Internet service provider (ISP) gives dial-up connections at
56 kbps. Assume that the telephone connection provides a usable
bandwidth of 3.5 kHz. What is the minimum SNR required to support
this?
2. A Gaussian channel has 1 MHz bandwidth. Calculate the channel
capacity if the signal power to noise spectral density ratio is 10^5 Hz.
Also find the maximum information rate.
3. Why does downloading an audio or video file stored on the Internet
sometimes take much longer than it takes to play?
4. Is it possible to reduce the bandwidth required to transmit a
given amount of information?
5. Is it possible to achieve error-free transmission? Discuss.
ANSWER!
MATRUSRI
ENGINEERING COLLEGE

More Related Content

What's hot

Ec8352 signals and systems 2 marks with answers
Ec8352 signals and systems   2 marks with answersEc8352 signals and systems   2 marks with answers
Ec8352 signals and systems 2 marks with answersGayathri Krishnamoorthy
 
Mesosphere stratosphere and troposphere (mst) radars
Mesosphere    stratosphere and troposphere (mst) radarsMesosphere    stratosphere and troposphere (mst) radars
Mesosphere stratosphere and troposphere (mst) radarsmayuresh gotarne
 
Transmission lines
Transmission linesTransmission lines
Transmission linesSuneel Varma
 
M-ary Modulation, noise modelling, bandwidth, Bandpass Modulation
M-ary Modulation, noise modelling, bandwidth, Bandpass ModulationM-ary Modulation, noise modelling, bandwidth, Bandpass Modulation
M-ary Modulation, noise modelling, bandwidth, Bandpass ModulationDrAimalKhan
 
Dc Choppers
Dc ChoppersDc Choppers
Dc Choppersstooty s
 
8085 interfacing with memory chips
8085 interfacing with memory chips8085 interfacing with memory chips
8085 interfacing with memory chipsSrikrishna Thota
 
Principles of communication engineering
Principles of communication engineeringPrinciples of communication engineering
Principles of communication engineeringLochan Neupane
 
Current commutated chopper
Current commutated chopperCurrent commutated chopper
Current commutated chopperJyoti Singh
 
Power Electronics - Thyristor Commutation
Power Electronics - Thyristor CommutationPower Electronics - Thyristor Commutation
Power Electronics - Thyristor CommutationBurdwan University
 
Pulse width modulation (PWM)
Pulse width modulation (PWM)Pulse width modulation (PWM)
Pulse width modulation (PWM)amar pandey
 
EEP306: pulse width modulation
EEP306: pulse width modulation EEP306: pulse width modulation
EEP306: pulse width modulation Umang Gupta
 

What's hot (20)

Convolution Codes
Convolution CodesConvolution Codes
Convolution Codes
 
311 linear modulation
311 linear modulation311 linear modulation
311 linear modulation
 
Ec8352 signals and systems 2 marks with answers
Ec8352 signals and systems   2 marks with answersEc8352 signals and systems   2 marks with answers
Ec8352 signals and systems 2 marks with answers
 
Mesosphere stratosphere and troposphere (mst) radars
Mesosphere    stratosphere and troposphere (mst) radarsMesosphere    stratosphere and troposphere (mst) radars
Mesosphere stratosphere and troposphere (mst) radars
 
Transmission lines
Transmission linesTransmission lines
Transmission lines
 
M-ary Modulation, noise modelling, bandwidth, Bandpass Modulation
M-ary Modulation, noise modelling, bandwidth, Bandpass ModulationM-ary Modulation, noise modelling, bandwidth, Bandpass Modulation
M-ary Modulation, noise modelling, bandwidth, Bandpass Modulation
 
Dcs unit 2
Dcs unit 2Dcs unit 2
Dcs unit 2
 
Dc Choppers
Dc ChoppersDc Choppers
Dc Choppers
 
8085 interfacing with memory chips
8085 interfacing with memory chips8085 interfacing with memory chips
8085 interfacing with memory chips
 
Equalization
EqualizationEqualization
Equalization
 
Antenna
AntennaAntenna
Antenna
 
Information theory
Information theoryInformation theory
Information theory
 
Principles of communication engineering
Principles of communication engineeringPrinciples of communication engineering
Principles of communication engineering
 
Chapter03 fm modulation
Chapter03 fm modulationChapter03 fm modulation
Chapter03 fm modulation
 
Current commutated chopper
Current commutated chopperCurrent commutated chopper
Current commutated chopper
 
Propagation mechanisms
Propagation mechanismsPropagation mechanisms
Propagation mechanisms
 
Power Electronics - Thyristor Commutation
Power Electronics - Thyristor CommutationPower Electronics - Thyristor Commutation
Power Electronics - Thyristor Commutation
 
Pulse width modulation (PWM)
Pulse width modulation (PWM)Pulse width modulation (PWM)
Pulse width modulation (PWM)
 
Scr firing circuits
Scr firing circuitsScr firing circuits
Scr firing circuits
 
EEP306: pulse width modulation
EEP306: pulse width modulation EEP306: pulse width modulation
EEP306: pulse width modulation
 

Similar to DC@UNIT 2 ppt.ppt

Unit I.pptx INTRODUCTION TO DIGITAL COMMUNICATION
Unit I.pptx INTRODUCTION TO DIGITAL COMMUNICATIONUnit I.pptx INTRODUCTION TO DIGITAL COMMUNICATION
Unit I.pptx INTRODUCTION TO DIGITAL COMMUNICATIONrubini Rubini
 
Information Theory Final.pptx
Information Theory Final.pptxInformation Theory Final.pptx
Information Theory Final.pptxSkNick1
 
Information theory & coding PPT Full Syllabus.pptx
Information theory & coding PPT Full Syllabus.pptxInformation theory & coding PPT Full Syllabus.pptx
Information theory & coding PPT Full Syllabus.pptxprernaguptaec
 
INFORMATION_THEORY.pdf
INFORMATION_THEORY.pdfINFORMATION_THEORY.pdf
INFORMATION_THEORY.pdftemmy7
 
Unit-1_Digital_Communication-Information_Theory.pptx
Unit-1_Digital_Communication-Information_Theory.pptxUnit-1_Digital_Communication-Information_Theory.pptx
Unit-1_Digital_Communication-Information_Theory.pptxKIRUTHIKAAR2
 
Lecture1
Lecture1Lecture1
Lecture1ntpc08
 
UNIT-3 : CHANNEL CODING
UNIT-3 : CHANNEL CODINGUNIT-3 : CHANNEL CODING
UNIT-3 : CHANNEL CODINGabhishek reddy
 
Unit-1_Digital_Communication-Information_Theory.pptx
Unit-1_Digital_Communication-Information_Theory.pptxUnit-1_Digital_Communication-Information_Theory.pptx
Unit-1_Digital_Communication-Information_Theory.pptxKIRUTHIKAAR2
 
Towards a theory of semantic communication
Towards a theory of semantic communicationTowards a theory of semantic communication
Towards a theory of semantic communicationJie Bao
 
Information Theory and Coding
Information Theory and CodingInformation Theory and Coding
Information Theory and CodingVIT-AP University
 
3 models of communication
3 models of communication3 models of communication
3 models of communicationDiego Rodrigo
 
A Mathematical Theory of Communication
A Mathematical Theory of CommunicationA Mathematical Theory of Communication
A Mathematical Theory of CommunicationSergey Oboroc
 
Information Theory Coding 1
Information Theory Coding 1Information Theory Coding 1
Information Theory Coding 1Mahafuz Aveek
 

Similar to DC@UNIT 2 ppt.ppt (20)

Unit I.pptx INTRODUCTION TO DIGITAL COMMUNICATION
Unit I.pptx INTRODUCTION TO DIGITAL COMMUNICATIONUnit I.pptx INTRODUCTION TO DIGITAL COMMUNICATION
Unit I.pptx INTRODUCTION TO DIGITAL COMMUNICATION
 
Information Theory Final.pptx
Information Theory Final.pptxInformation Theory Final.pptx
Information Theory Final.pptx
 
UNIT-2.pdf
UNIT-2.pdfUNIT-2.pdf
UNIT-2.pdf
 
Information theory & coding PPT Full Syllabus.pptx
Information theory & coding PPT Full Syllabus.pptxInformation theory & coding PPT Full Syllabus.pptx
Information theory & coding PPT Full Syllabus.pptx
 
INFORMATION_THEORY.pdf
INFORMATION_THEORY.pdfINFORMATION_THEORY.pdf
INFORMATION_THEORY.pdf
 
Unit-1_Digital_Communication-Information_Theory.pptx
Unit-1_Digital_Communication-Information_Theory.pptxUnit-1_Digital_Communication-Information_Theory.pptx
Unit-1_Digital_Communication-Information_Theory.pptx
 
Lecture1
Lecture1Lecture1
Lecture1
 
DC@UNIT 3 ppt.ppt
DC@UNIT 3 ppt.pptDC@UNIT 3 ppt.ppt
DC@UNIT 3 ppt.ppt
 
UNIT-3 : CHANNEL CODING
UNIT-3 : CHANNEL CODINGUNIT-3 : CHANNEL CODING
UNIT-3 : CHANNEL CODING
 
Unit-1_Digital_Communication-Information_Theory.pptx
Unit-1_Digital_Communication-Information_Theory.pptxUnit-1_Digital_Communication-Information_Theory.pptx
Unit-1_Digital_Communication-Information_Theory.pptx
 
Towards a theory of semantic communication
Towards a theory of semantic communicationTowards a theory of semantic communication
Towards a theory of semantic communication
 
Information Theory and Coding
Information Theory and CodingInformation Theory and Coding
Information Theory and Coding
 
What_is_Information.pdf
What_is_Information.pdfWhat_is_Information.pdf
What_is_Information.pdf
 
Shannon1948
Shannon1948Shannon1948
Shannon1948
 
Cybernetics Tradition
Cybernetics TraditionCybernetics Tradition
Cybernetics Tradition
 
Digital Communication Pulse code modulation.ppt
Digital Communication Pulse code modulation.pptDigital Communication Pulse code modulation.ppt
Digital Communication Pulse code modulation.ppt
 
3 models of communication
3 models of communication3 models of communication
3 models of communication
 
Entropy
EntropyEntropy
Entropy
 
A Mathematical Theory of Communication
A Mathematical Theory of CommunicationA Mathematical Theory of Communication
A Mathematical Theory of Communication
 
Information Theory Coding 1
Information Theory Coding 1Information Theory Coding 1
Information Theory Coding 1
 

More from Matrusri Engineering College (6)

UNIT-5 Spread Spectrum Communication.pdf
UNIT-5 Spread Spectrum Communication.pdfUNIT-5 Spread Spectrum Communication.pdf
UNIT-5 Spread Spectrum Communication.pdf
 
DC@UNIT 5 ppt.ppt
DC@UNIT 5 ppt.pptDC@UNIT 5 ppt.ppt
DC@UNIT 5 ppt.ppt
 
UNIT-4 Baseband Digital Modulation.pdf
UNIT-4 Baseband Digital Modulation.pdfUNIT-4 Baseband Digital Modulation.pdf
UNIT-4 Baseband Digital Modulation.pdf
 
UNIT-3.pdf
UNIT-3.pdfUNIT-3.pdf
UNIT-3.pdf
 
UNIT-1.pdf
UNIT-1.pdfUNIT-1.pdf
UNIT-1.pdf
 
DC@UNIT 1 ppt.ppt
DC@UNIT 1 ppt.pptDC@UNIT 1 ppt.ppt
DC@UNIT 1 ppt.ppt
 

Recently uploaded

Call Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile serviceCall Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile servicerehmti665
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024Mark Billinghurst
 
Introduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxIntroduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxupamatechverse
 
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCollege Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCall Girls in Nagpur High Profile
 
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSAPPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSKurinjimalarL3
 
High Profile Call Girls Nashik Megha 7001305949 Independent Escort Service Na...
High Profile Call Girls Nashik Megha 7001305949 Independent Escort Service Na...High Profile Call Girls Nashik Megha 7001305949 Independent Escort Service Na...
High Profile Call Girls Nashik Megha 7001305949 Independent Escort Service Na...Call Girls in Nagpur High Profile
 
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024hassan khalil
 
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLSMANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLSSIVASHANKAR N
 
HARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IVHARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IVRajaP95
 
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Serviceranjana rawat
 
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerStudy on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerAnamika Sarkar
 
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130Suhani Kapoor
 
Biology for Computer Engineers Course Handout.pptx
Biology for Computer Engineers Course Handout.pptxBiology for Computer Engineers Course Handout.pptx
Biology for Computer Engineers Course Handout.pptxDeepakSakkari2
 
Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝
Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝
Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝soniya singh
 
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...ranjana rawat
 
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)Suman Mia
 
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur EscortsCall Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 

Recently uploaded (20)

Call Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile serviceCall Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile service
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
 
Introduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxIntroduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptx
 
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCollege Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
 
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSAPPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
 
High Profile Call Girls Nashik Megha 7001305949 Independent Escort Service Na...
High Profile Call Girls Nashik Megha 7001305949 Independent Escort Service Na...High Profile Call Girls Nashik Megha 7001305949 Independent Escort Service Na...
High Profile Call Girls Nashik Megha 7001305949 Independent Escort Service Na...
 
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
 
Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024
 
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLSMANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
 
HARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IVHARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IV
 
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
 
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerStudy on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
 
9953056974 Call Girls In South Ex, Escorts (Delhi) NCR.pdf
9953056974 Call Girls In South Ex, Escorts (Delhi) NCR.pdf9953056974 Call Girls In South Ex, Escorts (Delhi) NCR.pdf
9953056974 Call Girls In South Ex, Escorts (Delhi) NCR.pdf
 
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
 
Biology for Computer Engineers Course Handout.pptx
Biology for Computer Engineers Course Handout.pptxBiology for Computer Engineers Course Handout.pptx
Biology for Computer Engineers Course Handout.pptx
 
Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝
Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝
Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝
 
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
 
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
 
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
 
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur EscortsCall Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
 

DC@UNIT 2 ppt.ppt

  • 1. MATRUSRI ENGINEERING COLLEGE DEPARTMENT OF ELECTRONICS COMMUNICATION AND ENGINEERING SUBJECT NAME: DIGITAL COMMUNICATION(PC601EC)-VI SEM FACULTY NAME: Mr.A.ABHISHEK Reddy,Asst.Prof. MATRUSRI ENGINEERING COLLEGE
  • 2. DIGITAL COMMUNICATION COURSE OBJECTIVES: 1. Familiarize the students with elements of digital communication system and waveform coding techniques like PCM, DPCM, DM and ADM. 2. Introduce the concepts of information theory and source coding 3. Familiarize the students with channel coding techniques such as LBC, BCC and convolution codes 4. Introduce the concepts of baseband digital data transmission and analyze the error performance of different digital carrier modulation schemes like ASK, FSK, PSK etc. 5. Familiarize the students with the concepts of spread spectrum communication with emphasis on DSSS and FHSS. COURSE OUTCOMES: CO1: Classify the different types of digital modulation techniques PCM, DPCM, DM and ADM and compare their performance by SNR. CO2: Illustrate the classification of channels and Source coding methods. CO3:Distinguish different types of Error control codes along with their encoding/decoding algorithms. CO4: Examine the Performance of different Digital Carrier Modulation schemes of Coherent and Non-coherent type based on Probability of error. CO5:Generation of PN sequence using Spread Spectrum and characterize the Acquisition Schemes for Receivers to track the signals. MATRUSRI ENGINEERING COLLEGE
  • 3. UNIT II-Information theory and source coding: Uncertainty, Information and Entropy. Source coding, Shannon – Fano and Huffman coding discrete memory less channel – probability relations in a channel, priori & posteriori entropies, mutual information, channel capacity –binary symmetric channel, binary erasure channel, cascaded channels, information rate. Shannon-Hartley theorem – Shannon bound. UNIT-II OUTCOMES: Introduce the concepts of Information Theory, Discrete Channels and source coding Methods. MATRUSRI ENGINEERING COLLEGE
  • 4. TEXT BOOKS /REFERENCES TEXT BOOKS: 1. Simon Haykin, “Communication systems” 4/e, Wiley India 2011 2. Sam Shanmugam K, “Digital and Analog Communication systems”, Wiley 1979. 3. B.P.Lathi, “Modern digital and analog communication systems” 3/e, OxfordUniversityPress. 1998. 4. Leon W.Couch II., Digital and Analog Communication Systems, 6th Edn, Pearson Education inc., New Delhi, 2001. 5. R.E.Zimer&R.L.Peterson : Introduction to Digital Communication, PHI, 2001. REFERENCES: 1. P. Ramakrishna Rao, “Digital Communication”, TMH, 2011. 2. Dr. Sanjay Sharma, “Digital and Analog Communication”, Mc Graw Hill Publication, 2009. 3. Bernard Sklar “Digital Communications – Fundamentals and Applications” / 2nd Edition, Prentice Hall. 4. John G. Proakis” Digital Communications” Fourth Edition (textbook) McGraw Hill. MATRUSRI ENGINEERING COLLEGE
  • 5. LESSON PLAN: UNIT II- Information Theory and Source Coding MATRUSRI ENGINEERING COLLEGE S. No. Topic(S) No. of Hrs Relevant COs Text Book/ Reference Book 1. Uncertainty, Information and entropy 2 CO2 T2,R1,R2 2. Source coding, Shannon – Fano 1 CO2 T2,R1,R2 3. Huffman coding 1 CO2 T2,R1,R2 4. Discrete memory less channel – Probability relations in a channel 2 CO2 T2,R1,R2 5. Binary Symmetric Channel, Binary Erasure Channel 1 CO2 T2,R1,R2 6. priori & posteriori entropies, mutual information 1 CO2 T2,R1,R2 7. Channel capacity, cascaded channels 1 CO2 T2,R1,R2 8. Information rate. Shannon-Hartley Theorem – Shannon Bound. 1 CO2 T2,R1,R2 TOTAL 10
  • 6. CONTENTS: -UNCERTAINTY - INFORMATION - ENTROPY OUTCOMES: Estimate the amount of Uncertainty for a given Source. MODULE-I MATRUSRI ENGINEERING COLLEGE
  • 7. 1. Communication systems are basically meant to transfer information from one location to another. 2. Information theory is a branch of probability theory, which can be applied to study of communication systems. 3. The communication of information is statistical in nature and the main aim of information theory is to study the simple ideal statistical communication models. Measure of Information: Consider the following three statements: 1. Brazil defeats India in football. 2. Japan defeats India in football. 3. India defeats Brazil in football. Introduction MATRUSRI ENGINEERING COLLEGE
  • 8. The more the probability of an event, the less is the amount of information associated with it and vice-versa. Average Information /Self-Information where “b” is base of logarithm, if b=2 then the units are called “BITS”, if b = 10, the units are HARTLEY or DECITS and if b= e , the units are called NATS. MATRUSRI ENGINEERING COLLEGE 1 ( ) log log ( ) ( ) i b b i i I x p x p x   
  • 10. A communication system is not only meant to deal with a single message but with all possible messages. Messages produced by information sources consists of sequence of symbols that corresponds to the message. we may describe the source interms of average information (statistical average) per individual messages known as “entropy” of source. bits/message The quantity H(M) , represents the average information per message emitted by DMS with source alphabet M is called “entropy” of the source. It should be read as “entropy of the source M”. It is a measure of uncertainty, the probability distribution that generates maximum uncertainty will have maximum entropy. Entropy MATRUSRI ENGINEERING COLLEGE 2 2 1 1 1 ( ) log log M M k k k k k k H M p p p p        
  • 11. FOR A BINARY SYSTEM (M=2), THE ENTROPY IS The condition for maximum entropy can be found by differtiating above expression with respect to p and equating to zero yields, MATRUSRI ENGINEERING COLLEGE 1 1 log (1 )log (1 ) H p p p p     0 ln 2 log ln 2 log(1 ) dH p p dp        0.5 p  max 0.5 1 / p H H bit message   
  • 12. LET US EXAMINE H UNDER DIFFERENT CASES FOR M=2 CASE I: P1=0.01, P2= 0.99, H= 0.08 CASE II: P1= 0.4, P2 = 0.6, H= 0.97 CASE III: P1 = 0.5, P2 = 0.5, H = 1.00 In case-I, the message m1 with p1=0.01 will occur or the message m2 with p2=0.99 will occur, but most of times m2 will occur. Thus, the uncertainty is less. In case-II, difficult to guess bcz probabilities are nearly equal. Thus, uncertainty is more. In case-III, it is extremely difficult to guess bcz probabilities are equal. Thus, uncertainty is maximum. Thus, the entropy is less when uncertainty is less and is more when uncertainty is more. Thus, we can say that entropy is a measure of uncertainty. MATRUSRI ENGINEERING COLLEGE
  • 14. 1. A binary symbol occurs with probability of 0.75. determine the information associated with symbol in bits, nats and Hartleys. 2. Differentiate between Information and entropy? 3. A communication system consists of six messages with probabilities 1/8,1/8,1/8,1/4, and ¼ respectively. Determine the entropy of system? 4. What is meant by Discrete-Memory Less Source(DMS)? ANSWER! MATRUSRI ENGINEERING COLLEGE
  • 15. CONTENTS: Rate of Information Joint entropy Conditional Entropy OUTCOMES: Apply Joint and Conditional Entropies to measure Infromation. MODULE-II MATRUSRI ENGINEERING COLLEGE
  • 16. RATE OF INFORMATION: If a message source generates at the rate r messages per second, the rate of information R is defined as the average number of bits of information per second. Hence r- rate at which messages are generated/sec H- avg. Information (or) entropy Information Rate MATRUSRI ENGINEERING COLLEGE
  • 17. Suppose a DMS source, X emits symbols of Size M, if we consider a block of such symbols and each block consisting of n symbols, consider that a new source, called the extended source, is emitting such blocks as its symbols. The alphabet size for the extended source is MATRUSRI ENGINEERING COLLEGE n M
  • 21. H(𝑋 ∣ Y ) is a measure of our average uncertainty of the transmitted symbol after we have received a symbol, sometimes called the equivocation. The function H (𝑌 ∣ 𝑋) is also called equivocation (i.e gives a measure of error, or noise) is the average uncertainty of the received symbol given that 𝑋 was transmitted. Conditional Entropy MATRUSRI ENGINEERING COLLEGE
  • 22. H(X): average information per character at the transmitter, or entropy of the transmitter. H(Y): average information per character at the receiver, or entropy of the receiver. H(X, Y): average information per pair of the transmitted and received characters, or average uncertainty of the communication system as a whole. H(X/Y): A measure of information about the transmitter, where it is known that Y is received. H(Y/X): A measure of information about the receiver, where it is known that X is transmitted. CHAIN RULE: H(X, Y) = H(X) + H(Y / X) OR = H(Y) + H(X / Y) MATRUSRI ENGINEERING COLLEGE
  • 23. 1. What are the properties of entropy? 2. Consider a DMS with source alphabet S = {S0, S1, S2} with probabilities P(S0) = 1/4 = P(S1) = P(S2). Find the second-order entropy. 3. An event has six possible outcomes with probabilities p1 = 1/2, p2 = 1/4, p3 = 1/8, p4 = 1/16, p5 = 1/32 and p6 = 1/32. Find the entropy of the system and the rate of information if there are 16 outcomes per second. 4. A discrete source transmits messages x1, x2, x3 with probabilities 0.3, 0.4 and 0.3. The source is connected to the channel given in the figure (not reproduced here). Calculate all the entropies. ANSWER!
  • 24. CONTENTS: Source Coding: Classification, Shannon-Fano Coding, Huffman Coding. OUTCOMES: Understand the importance of source coding methods. MODULE-III
  • 25. SOURCE CODING: A discrete source produces symbols which are to be represented in such a way that it is possible to transmit them over a given channel. The source output, which is in the form of a sequence of source symbols, must be converted into a sequence of what are called 'code elements'. This process is called 'encoding', and the device or system which performs it is called an encoder. The encoder assigns a unique sequence of code elements, called a 'codeword', to each source symbol.
  • 26. The objective of source coding is to remove or reduce the redundancy in the source output so as to give an efficient representation of the message information, using a smaller number of bits. Classification of codes: 1. Block Codes 2. Distinct Codes (non-singular) 3. Fixed-Length Codes 4. Variable-Length Codes 5. Instantaneous Codes (prefix-free) 6. Uniquely Decodable Codes
  • 27. Example:

Symbol  Code1  Code2  Code3  Code4  Code5  Code6
x1      00     00     0      0      0      1
x2      01     01     1      10     01     01
x3      00     10     00     110    011    001
x4      11     11     11     111    0111   0001
  • 28. AVERAGE CODEWORD LENGTH:
$$\bar{L} = \sum_{i=0}^{M-1} P(x_i)\, n_i \quad \text{bits/message}$$
CODE EFFICIENCY: the ratio of the minimum possible average codeword length to the actual average codeword length of the symbols used in the source encoding process:
$$\eta = \frac{L_{min}}{\bar{L}}, \qquad L_{min} = \frac{H(X)}{\log_2 D} \;\Rightarrow\; \eta\% = \frac{H(X)}{\bar{L}\,\log_2 D} \times 100$$
where D is the size of the code alphabet; for a binary code (D = 2), η = H(X)/L̄.
REDUNDANCY:
$$\gamma = 1 - \eta$$
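A small Python sketch (illustrative, with a hypothetical source) computing the average length, efficiency and redundancy of a binary code:

```python
import numpy as np

def entropy(probs):
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Hypothetical source probabilities and binary codeword lengths n_i
probs   = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]

L_avg = sum(p * n for p, n in zip(probs, lengths))   # average codeword length
eta   = entropy(probs) / L_avg                       # binary code: L_min = H(X)
print(f"L = {L_avg} bits/message, efficiency = {eta:.0%}, redundancy = {1 - eta:.0%}")
```

Here H(X) = 1.75 bits and L̄ = 1.75 bits, so this code is 100% efficient.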
  • 29. KRAFT'S INEQUALITY: a necessary and sufficient condition for the existence of an instantaneous D-ary code with codeword lengths n_0, …, n_{M−1}:
$$\sum_{i=0}^{M-1} D^{-n_i} \le 1$$
It only assures the existence of an instantaneously decodable code with codeword lengths that satisfy the inequality. It does not show how to obtain the codewords, nor does it say that any code satisfying the inequality is automatically uniquely decodable.
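A one-function Python check of the inequality (illustrative):

```python
def kraft_sum(lengths, D=2):
    """Kraft sum; an instantaneous D-ary code with these lengths exists iff it is <= 1."""
    return sum(D ** -n for n in lengths)

print(kraft_sum([1, 2, 3, 3]))   # 1.0  -> an instantaneous binary code exists
print(kraft_sum([1, 1, 2]))      # 1.25 -> no instantaneous binary code is possible
```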
  • 30. SHANNON-FANO CODING. Steps:
1. Write down the message (source) symbols in order of decreasing probability.
2. Draw a line after, say, the k-th symbol such that the total probabilities of the symbols above the line and below the line are approximately equal, i.e., divide the source symbols into two groups of almost equal probability.
3. Assign a binary '0' to each symbol above the line and a binary '1' to each symbol below the line.
4. Repeat steps 2 and 3 on each subgroup until every subgroup contains only one symbol.
5. When that stage is reached, the coding is complete. (A Python sketch of this procedure follows.)
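The sketch below (not from the slides; symbols are assumed pre-sorted by decreasing probability) implements the procedure recursively:

```python
def shannon_fano(symbols):
    """Shannon-Fano coding. symbols: list of (symbol, probability) pairs,
    sorted by decreasing probability. Returns {symbol: codeword}."""
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total, running, split, best = sum(p for _, p in symbols), 0.0, 1, float("inf")
    for k in range(1, len(symbols)):          # find the most balanced split point
        running += symbols[k - 1][1]
        if abs(2 * running - total) < best:
            best, split = abs(2 * running - total), k
    codes = {}
    for prefix, group in (("0", symbols[:split]), ("1", symbols[split:])):
        for sym, code in shannon_fano(group).items():
            codes[sym] = prefix + code
    return codes

print(shannon_fano([("A", 0.5), ("B", 0.25), ("C", 0.125), ("D", 0.125)]))
# {'A': '0', 'B': '10', 'C': '110', 'D': '111'}
```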
  • 31. HUFFMAN CODING. Steps:
1. Write down the message (source) probabilities in decreasing order.
2. Assign a binary 0 and a binary 1 to the two symbols of lowest probability. This forms Stage I.
3. Combine those two symbols into one new symbol with probability equal to the sum of the probabilities of the two original symbols. List the probabilities of the remaining original symbols and the new symbol in decreasing order. This forms Stage II.
4. Repeat step 3 until only two symbols are left, to which 0 and 1 are assigned. This forms the last stage.
5. The codeword for each original source symbol is then obtained by tracing the sequence of 0s and 1s backwards from the last stage to that symbol. (A heap-based Python sketch follows.)
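A minimal Python sketch of binary Huffman coding using a min-heap (illustrative; the probabilities are those of the exercise on a later slide):

```python
import heapq
from itertools import count

def huffman(prob_map):
    """Binary Huffman coding. prob_map: {symbol: probability}.
    Returns {symbol: codeword}."""
    tiebreak = count()              # avoids comparing dicts when probabilities tie
    heap = [(p, next(tiebreak), {s: ""}) for s, p in prob_map.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)       # the two least-probable entries
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c0.items()}
        merged.update({s: "1" + c for s, c in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))
    return heap[0][2]

print(huffman({"S0": 0.4, "S1": 0.2, "S2": 0.2, "S3": 0.1, "S4": 0.1}))
```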
  • 32. HUFFMAN CODING CAN BE APPLIED TO AN M-ARY SOURCE AS WELL. The algorithm is:
1. Rearrange the symbols in order of decreasing probability.
2. Combine the last M symbols into one symbol.
3. Repeat steps 1 and 2 until the set reduces to M symbols.
4. Each of these reduced symbols is now assigned one of the numbers 0, 1, …, M−1 as the first digit of its codeword.
5. Now retrace and assign the numbers 0, 1, …, M−1 as the second digit for the M symbols that were combined in the previous step.
6. This is repeated until the original symbol set is reached.
Disadvantages of Huffman coding:
1. It depends heavily on the source statistics; a priori knowledge of the probabilities of occurrence of the source symbols is a must.
2. Most sources met in practice are not memoryless (i.e., the probability of occurrence of a symbol is not independent of the symbols that preceded it), and since Huffman coding takes into account only the individual symbol probabilities, its use in practical applications does not yield good compression.
  • 33. 1. A source emits independent symbols A, B, C, D with probabilities of occurrence P(A) = 0.5, P(B) = 0.25, P(C) = 0.125 and P(D) = 0.125. Apply Shannon-Fano coding and find the average length, efficiency and redundancy. 2. Compute the Huffman source coding for the message symbols S = {S0, S1, S2, S3, S4} with P(Si) = {0.4, 0.2, 0.2, 0.1, 0.1} by the two methods, compute the variance of the average codeword length, and comment on the result. 3. What are the advantages of Huffman source codes over Shannon-Fano codes? 4. What happens when source coding is applied to fixed-length codes used for a non-equiprobable source? ANSWER!
  • 34. CONTENTS: Mutual Information, Discrete Memoryless Channels. OUTCOMES: Estimate the mutual information for a given channel. MODULE-IV
  • 35. MUTUAL INFORMATION: To characterize the complete communication system, note that none of the entropies discussed so far quantifies the information lost in the channel. To capture this missing part, the concept of mutual information is needed.
  • 36. Mutual information of a symbol pair = initial uncertainty − final uncertainty:
$$I(x_i; y_k) = \big[-\log_2 p(x_i)\big] - \big[-\log_2 p(x_i \mid y_k)\big] = \log_2 \frac{p(x_i \mid y_k)}{p(x_i)}$$
Similarly,
$$I(y_k; x_i) = \log_2 \frac{p(y_k \mid x_i)}{p(y_k)}$$
  • 37. $$I(X; Y) = H(X) - H(X \mid Y)$$
i.e., when an average information H(X) (or H(Y)) is transmitted over the channel, an average amount of information equal to H(X|Y) (or H(Y|X)) is lost in the channel because of noise-induced symbol transitions. The balance of the information received at the receiver, with respect to an observed output symbol, is the mutual information.
CONCLUSIONS:
1. I(X; Y) is the average mutual information; it is a measure of the information transferred through the channel, and is also known as the transferred information, or transinformation, of the channel.
2. The equation states that the transferred information equals the average source information minus the average uncertainty that still remains about the messages. In other words, H(X|Y) is the average additional information needed at the receiver after reception in order to completely specify the message sent; it is the information lost in the channel, also known as the equivocation.
  • 38. $$I(X; Y) = H(Y) - H(Y \mid X)$$
3. This equation states that the transferred information equals the receiver information minus that part of the receiver entropy which is not information about the source. Thus H(Y|X) is a measure of the noise, or error, due to the channel.
Properties:
1. The mutual information of a channel is symmetric, i.e., I(X; Y) = I(Y; X).
2. The mutual information is non-negative, i.e., I(X; Y) ≥ 0. Even on a noisy channel, by observing the output of the channel we cannot, on average, lose information. At most the mutual information may be zero, i.e., we gain no information by observing the output; this happens when the input and output symbols of the channel are statistically independent.
3. The mutual information I(X; Y) of a channel is related to the marginal entropies H(X), H(Y) and the joint entropy H(X, Y) by
$$I(X; Y) = H(X) + H(Y) - H(X, Y)$$
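Continuing the earlier hypothetical joint matrix, a Python sketch that evaluates I(X;Y) via property 3 and checks properties 1 and 2:

```python
import numpy as np

def H(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

Pxy = np.array([[0.30, 0.10],      # hypothetical joint probability matrix P(X,Y)
                [0.10, 0.50]])
Px, Py = Pxy.sum(axis=1), Pxy.sum(axis=0)

I_xy = H(Px) + H(Py) - H(Pxy)      # I(X;Y) = H(X) + H(Y) - H(X,Y)
I_yx = H(Py) + H(Px) - H(Pxy.T)    # same value: symmetry
print(f"I(X;Y) = {I_xy:.4f} bits, I(Y;X) = {I_yx:.4f} bits, non-negative: {I_xy >= 0}")
```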
  • 40. A discrete memoryless channel with input alphabet X = {x_0, x_1, …, x_{M−1}} and output alphabet Y = {y_0, y_1, …, y_{L−1}} is described by the channel matrix
$$P(Y \mid X) = \begin{bmatrix} p(y_0 \mid x_0) & p(y_1 \mid x_0) & \cdots & p(y_{L-1} \mid x_0) \\ p(y_0 \mid x_1) & p(y_1 \mid x_1) & \cdots & p(y_{L-1} \mid x_1) \\ \vdots & \vdots & & \vdots \\ p(y_0 \mid x_{M-1}) & p(y_1 \mid x_{M-1}) & \cdots & p(y_{L-1} \mid x_{M-1}) \end{bmatrix}$$
  • 41. CLASSIFICATION OF CHANNELS
(1) SYMMETRIC/UNIFORM CHANNEL: a channel is said to be symmetric (or uniform) if the second and subsequent rows of the channel matrix contain the same elements as the first row, but in a different order. Example:
$$P(Y \mid X) = \begin{bmatrix} 1/2 & 1/3 & 1/6 \\ 1/3 & 1/6 & 1/2 \\ 1/6 & 1/2 & 1/3 \end{bmatrix}$$
(2) LOSSLESS CHANNEL: a channel whose matrix has one and only one non-zero element in every column. Example:
$$P(Y \mid X) = \begin{bmatrix} 3/4 & 1/4 & 0 & 0 & 0 \\ 0 & 0 & 1/3 & 2/3 & 0 \\ 0 & 0 & 0 & 0 & 1 \end{bmatrix}$$
  • 42. (3) DETERMINISTIC CHANNEL: a channel whose matrix has one and only one non-zero element in every row.
1. Each row contains exactly one non-zero element, and that element is 1.
2. The sum of the elements in any row is therefore unity. Example:
$$P(Y \mid X) = \begin{bmatrix} 1 & 0 & 0 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$
BINARY SYMMETRIC CHANNEL (BSC): a BSC consists of two inputs (x_0 = 0 and x_1 = 1) and two outputs (y_0 = 0 and y_1 = 1). The channel is symmetric because the probability of receiving 1 when 0 is sent is the same as the probability of receiving 0 when 1 is sent. This common transition probability is denoted by p:
$$P(Y \mid X) = \begin{bmatrix} 1-p & p \\ p & 1-p \end{bmatrix}$$
  • 43. BINARY ERASURE CHANNEL (BEC): a BEC consists of two inputs (x_0 = 0 and x_1 = 1) and three outputs (y_0 = 0, y_1 = erasure, y_2 = 1). Due to noise, it may not be possible to identify the output symbol as one or the other of the input symbols; in that case it is erased, i.e., ignored, and a request is sent to the transmitter to retransmit. That is why it is called a binary erasure channel.
$$P(Y \mid X) = \begin{bmatrix} 1-p & p & 0 \\ 0 & p & 1-p \end{bmatrix}$$
  • 44. NOISE-FREE CHANNEL: in this channel there is a one-to-one correspondence between input and output, i.e., each input symbol is received as one and only one output symbol. There is no loss of information in transmission, and the numbers of source and destination symbols are the same, n = m.
$$P(X, Y) = \begin{bmatrix} p(x_1, y_1) & 0 & \cdots & 0 \\ 0 & p(x_2, y_2) & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & p(x_m, y_m) \end{bmatrix}, \qquad P(Y \mid X) = P(X \mid Y) = \begin{bmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{bmatrix}$$
$$I(X; Y) = H(X) - H(X \mid Y) = H(X) = H(Y) = H(X, Y)$$
  • 45. CHANNEL WITH INDEPENDENT INPUT AND OUTPUT: in these channels there is no correlation between the input and output symbols. The joint probability matrix takes one of the forms
$$P(X, Y) = \begin{bmatrix} p_1 & p_2 & \cdots & p_n \\ p_1 & p_2 & \cdots & p_n \\ \vdots & & & \vdots \\ p_1 & p_2 & \cdots & p_n \end{bmatrix} \quad \text{or} \quad P(X, Y) = \begin{bmatrix} p_1 & p_1 & \cdots & p_1 \\ p_2 & p_2 & \cdots & p_2 \\ \vdots & & & \vdots \\ p_m & p_m & \cdots & p_m \end{bmatrix}$$
  • 46. In the case of a channel with independent input and output, no information is transmitted through the channel. A channel with independent input and output has a JPM (joint probability matrix) satisfying at least one of the following conditions: 1. Each row consists of the same elements. 2. Each column consists of the same element.
  • 47. 1. What is meant by a priori probability? 2. State the properties of mutual information. 3. What is the conditional entropy H(X|Y) for a lossless channel? 4. What is meant by BSC and BEC? ANSWER!
  • 48. CONTENTS: Channel Capacity, Binary Channel, Cascaded Channels. OUTCOMES: Analyze the maximum average mutual information for different channels. MODULE-V
  • 49. CHANNEL CAPACITY
CHANNEL CAPACITY PER SYMBOL (C): the channel capacity of a discrete memoryless channel, commonly denoted by C, is defined as the maximum mutual information I(X;Y) in any single use of the channel (i.e., signaling interval), where the maximization is over all possible input probability distributions:
$$C = \max_{\{P(x_i)\}} I(X; Y)$$
CHANNEL CAPACITY PER SECOND (C_s): if r symbols are transmitted per second, the maximum rate of transmission of information per second is rC. This is the channel capacity in binits per second:
$$C_s = rC \quad \text{binits/sec}$$
  • 50. The transmission (channel) efficiency is defined as
$$\eta = \frac{\text{actual transinformation}}{\text{maximum transinformation}} = \frac{I(X; Y)}{\max I(X; Y)} = \frac{I(X; Y)}{C}$$
Redundancy:
$$R = 1 - \eta = \frac{C - I(X; Y)}{C}$$
  • 51. LOSSLESS CHANNEL: $C = \max H(X) = \log_2 m$
DETERMINISTIC CHANNEL: $C = \max H(Y) = \log_2 n$
NOISE-FREE CHANNEL: $C = \max I(X; Y) = \log_2 m = \log_2 n$
SYMMETRIC CHANNEL: $I(X; Y) = H(Y) - A$, where $A = H(Y \mid x_j)$ is the same for every row, so $C = \log_2 n - A$
  • 52. BINARY SYMMETRIC CHANNEL
$$P(Y \mid X) = \begin{bmatrix} 1-p & p \\ p & 1-p \end{bmatrix}, \qquad P(X, Y) = [P(X)]_d\, P(Y \mid X) = \begin{bmatrix} \alpha(1-p) & \alpha p \\ (1-\alpha)p & (1-\alpha)(1-p) \end{bmatrix}$$
(taking the input probabilities as $P(x_1) = \alpha$, $P(x_2) = 1-\alpha$). Then
$$I(X; Y) = H(Y) + p \log_2 p + (1-p) \log_2 (1-p)$$
$$C = \max I(X; Y) = \max H(Y) + p \log_2 p + (1-p) \log_2 (1-p)$$
$$C_{BSC} = 1 + p \log_2 p + (1-p) \log_2 (1-p)$$
  • 53. $$C_{BSC} = 1 - H(p)$$
1. When the channel is noise-free, i.e., p = 0 or 1, the channel output is completely determined by the channel input and the capacity is 1 bit per symbol. At these values of p the entropy function H(p) attains its minimum value of zero.
2. When the conditional probability of error p equals 0.5 due to channel noise, an input symbol yields either output symbol with equal probability and the capacity is zero, while the entropy function H(p) attains its maximum value of unity. In this case the channel is said to be useless, in the sense that the channel input and output are statistically independent.
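A quick numerical check of these two observations (illustrative Python):

```python
import numpy as np

def Hb(p):
    """Binary entropy function H(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else float(-p*np.log2(p) - (1-p)*np.log2(1-p))

for p in (0.0, 0.1, 0.5, 1.0):
    print(f"p = {p}: C_BSC = {1 - Hb(p):.3f} bits/symbol")
# p = 0 or 1 -> C = 1 (noise-free); p = 0.5 -> C = 0 (useless channel)
```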
  • 54. BINARY ERASURE CHANNEL
$$P(Y \mid X) = \begin{bmatrix} P(y_1 \mid x_1) & P(y_2 \mid x_1) & P(y_3 \mid x_1) \\ P(y_1 \mid x_2) & P(y_2 \mid x_2) & P(y_3 \mid x_2) \end{bmatrix} = \begin{bmatrix} 1-p & p & 0 \\ 0 & p & 1-p \end{bmatrix}$$
With input probabilities $P(x_1) = \alpha$ and $P(x_2) = 1-\alpha$:
$$P(Y) = \begin{bmatrix} \alpha(1-p) & p & (1-\alpha)(1-p) \end{bmatrix}$$
  • 55. $$I(X; Y) = H(Y) - H(Y \mid X) = (1-p)\, H(X)$$
$$C_{BEC} = (1-p) \max H(X) \;\Rightarrow\; C_{BEC} = 1 - p$$
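An illustrative numerical confirmation that maximizing I(X;Y) = (1 − p) H(α) over the input distribution gives C = 1 − p:

```python
import numpy as np

def Hb(a):
    return 0.0 if a in (0.0, 1.0) else float(-a*np.log2(a) - (1-a)*np.log2(1-a))

p = 0.2                                          # erasure probability
alphas = np.linspace(0.01, 0.99, 99)             # candidate input distributions
best = max((1 - p) * Hb(a) for a in alphas)      # I(X;Y) = (1-p) H(alpha)
print(f"max I(X;Y) = {best:.4f} bits ~= 1 - p = {1 - p}")   # maximum at alpha = 0.5
```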
  • 56. BINARY CHANNELS: for a general binary channel, the capacity can be computed by a method suggested by Dr. S. Muroga. With
$$D = P(Y \mid X) = \begin{bmatrix} P_{11} & P_{12} \\ P_{21} & P_{22} \end{bmatrix}$$
the capacity is
$$C = \log_2 \left( 2^{Q_1} + 2^{Q_2} \right) \quad \text{bits/message-symbol}$$
  • 57. The auxiliary variables $Q_i$ are obtained by solving the linear system
$$\begin{aligned} P_{11} Q_1 + P_{12} Q_2 + \cdots + P_{1m} Q_m &= P_{11} \log_2 P_{11} + P_{12} \log_2 P_{12} + \cdots + P_{1m} \log_2 P_{1m} \\ P_{21} Q_1 + P_{22} Q_2 + \cdots + P_{2m} Q_m &= P_{21} \log_2 P_{21} + P_{22} \log_2 P_{22} + \cdots + P_{2m} \log_2 P_{2m} \\ &\;\;\vdots \\ P_{m1} Q_1 + P_{m2} Q_2 + \cdots + P_{mm} Q_m &= P_{m1} \log_2 P_{m1} + P_{m2} \log_2 P_{m2} + \cdots + P_{mm} \log_2 P_{mm} \end{aligned}$$
and then
$$C = \log_2 \left( 2^{Q_1} + 2^{Q_2} + \cdots + 2^{Q_m} \right) \quad \text{bits/message-symbol}$$
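A Python sketch of Muroga's method (assumes a square, nonsingular channel matrix; sanity-checked against the BSC result C = 1 − H(p)):

```python
import numpy as np

def muroga_capacity(P):
    """Capacity of a square channel matrix P(Y|X) by Muroga's method:
    solve sum_j P_ij Q_j = sum_j P_ij log2 P_ij, then C = log2(sum_i 2^Q_i)."""
    P = np.asarray(P, dtype=float)
    safe = np.where(P > 0, P, 1.0)               # avoid log2(0); those terms are 0
    g = np.sum(P * np.log2(safe), axis=1)
    Q = np.linalg.solve(P, g)
    return float(np.log2(np.sum(2.0 ** Q)))

print(muroga_capacity([[0.9, 0.1],
                       [0.1, 0.9]]))             # ~0.531 = 1 - H(0.1)
```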
  • 58. CASCADED CHANNELS: when information is transmitted from X to Y through channel I, there is a loss of information due to the noise in channel I, and the mutual information at the output of channel I is
$$I(X, Y) = H(Y) - H(Y \mid X)$$
When the output of channel I is passed through channel II, there is a further loss of information, and the mutual information at the output of channel II is
$$I(X, Z) = H(Z) - H(Z \mid X)$$
  • 59. For two identical BSCs in cascade, each with transition probability p (and q = 1 − p), the overall channel matrix is
$$P(Z \mid X) = \begin{bmatrix} p^2 + q^2 & 2pq \\ 2pq & p^2 + q^2 \end{bmatrix}$$
so the capacity of the cascade is
$$C = 1 - H(2pq)$$
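An illustrative Python check that cascading two BSCs amounts to a matrix product, and that the cascade capacity drops below that of a single BSC:

```python
import numpy as np

def Hb(x):
    return 0.0 if x in (0.0, 1.0) else float(-x*np.log2(x) - (1-x)*np.log2(1-x))

p = 0.1
q = 1 - p
bsc = np.array([[q, p],
                [p, q]])
cascade = bsc @ bsc                    # P(Z|X) = P(Y|X) P(Z|Y)
print(cascade)                         # [[p^2+q^2, 2pq], [2pq, p^2+q^2]]
print(f"single BSC: C = {1 - Hb(p):.4f}, cascade: C = {1 - Hb(2*p*q):.4f} bits/symbol")
```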
  • 60. 1. Derive the expression for the channel capacity of a binary symmetric channel. 2. Derive the expression for the channel capacity of a binary erasure channel. 3. Find the mutual information and channel capacity of the channel shown in the figure (not reproduced here), given P(x1) = 0.6 and P(x2) = 0.4. 4. Find the mutual information for the channel matrix
$$P(Y \mid X) = \begin{bmatrix} 2/3 & 1/3 & 0 \\ 0 & 1/6 & 5/6 \end{bmatrix}$$
with P(x1) = P(x2) = 0.5. 5. Define channel capacity. ANSWER!
  • 61. CONTENTS: Shannon's Second Theorem, Shannon's Channel Coding Theorem, Differential Entropy and Mutual Information of Continuous Ensembles, Shannon-Hartley Law. OUTCOMES: Illustrate the impact of bandwidth and SNR on capacity. MODULE-VI
  • 62. SHANNON'S SECOND THEOREM: it states that it is possible to devise a means whereby a communication system transmits information with an arbitrarily small probability of error, provided the information rate R does not exceed the channel capacity C:
$$R \le C$$
SHANNON'S CHANNEL CODING THEOREM: given a discrete memoryless source with an entropy of H(S) bits per symbol emitting symbols at the rate of 1/T_s symbols per second, and given a discrete memoryless channel with a capacity of C bits per symbol through which symbols are transmitted at the rate of 1/T_c symbols per second, it is possible to construct a channel code that allows the source symbols to be transmitted through the channel and reconstructed with an arbitrarily small probability of error if and only if
$$\frac{H(S)}{T_s} \le \frac{C}{T_c}$$
  • 63. DIFFERENTIAL ENTROPY AND MUTUAL INFORMATION FOR CONTINUOUS RANDOM ENSEMBLES
$$h(X) = -\int_{-\infty}^{\infty} f_X(x) \log_2 f_X(x)\, dx \quad \text{bits/sample}$$
$$h(Y) = -\int_{-\infty}^{\infty} f_Y(y) \log_2 f_Y(y)\, dy \quad \text{bits/sample}$$
$$I(X; Y) = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} f_{X,Y}(x, y) \log_2 \frac{f_X(x \mid y)}{f_X(x)}\, dx\, dy$$
$$h(X \mid Y) = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} f_{X,Y}(x, y) \log_2 \frac{1}{f_X(x \mid y)}\, dx\, dy, \qquad h(Y \mid X) = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} f_{X,Y}(x, y) \log_2 \frac{1}{f_Y(y \mid x)}\, dx\, dy$$
  • 64. IF THE RANDOM VARIABLE IS GAUSSIAN DISTRIBUTED, THE DIFFERENTIAL ENTROPY ATTAINS ITS MAXIMUM VALUE:
$$h(X) = \frac{1}{2} \log_2 (2\pi e \sigma^2) = \log_2 \left( \sigma \sqrt{2\pi e} \right)$$
SHANNON-HARTLEY LAW / SHANNON'S INFORMATION CAPACITY THEOREM: Shannon's information capacity theorem is also known as Shannon's third theorem, the Shannon-Hartley theorem, or the Gaussian channel capacity theorem. If the channel bandwidth B is fixed and the output is a band-limited signal completely characterized by its periodic sample values taken at the Nyquist rate of 2B samples/sec, then the channel capacity C (bits/sec) of the AWGN channel is given by
$$C = B \log_2 \left( 1 + \frac{S}{N} \right) \quad \text{bits/second}$$
  • 65. Here B is the channel bandwidth in Hz, S the average signal power in watts, and N the noise power in watts. With noise power spectral density N_0, N = N_0 B, so
$$C = B \log_2 \left( 1 + \frac{S}{N_0 B} \right) \quad \text{bits/sec}, \qquad \lim_{B \to \infty} C = \frac{S}{N_0} \log_2 e \approx 1.44\, \frac{S}{N_0}$$
IDEAL SYSTEM: an ideal system is one that transmits data at a bit rate equal to the channel capacity C, in bits per second. Then S = E_b C, where E_b is the energy per bit, and
$$\frac{C}{B} = \log_2 \left( 1 + \frac{E_b}{N_0} \cdot \frac{C}{B} \right) \;\Rightarrow\; \frac{E_b}{N_0} = \frac{2^{C/B} - 1}{C/B}$$
$$\lim_{C/B \to 0} \frac{E_b}{N_0} = \ln 2 \approx 0.693 = -1.6\ \text{dB} \quad \text{(the Shannon limit)}$$
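An illustrative Python sketch of the Shannon-Hartley law and the −1.6 dB Shannon limit (the 3.1 kHz bandwidth and 30 dB SNR are assumed example values, not from the slides):

```python
import numpy as np

def capacity(B, snr):
    """Shannon-Hartley: C = B log2(1 + S/N) in bits/second (snr is a linear ratio)."""
    return B * np.log2(1 + snr)

print(f"C = {capacity(3.1e3, 1000):.0f} bits/sec")   # 3.1 kHz bandwidth, 30 dB SNR

# Eb/N0 = (2^(C/B) - 1)/(C/B) approaches ln 2 (about -1.59 dB) as C/B -> 0
for r in (2.0, 1.0, 0.1, 0.001):                     # r = C/B (spectral efficiency)
    ebn0 = (2**r - 1) / r
    print(f"C/B = {r:>5}: Eb/N0 = {10*np.log10(ebn0):+.2f} dB")
```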
  • 67. 1. An internet service provider (ISP) offers dial-up connections at 56 kbps. Assuming the telephone connection provides a usable bandwidth of 3.5 kHz, what is the minimum SNR required to support this? 2. A Gaussian channel has a 1 MHz bandwidth. Calculate the channel capacity if the signal-power-to-noise-spectral-density ratio S/N0 is 10^5 Hz. Also find the maximum information rate. 3. Why does downloading a stored audio or video file from the Internet sometimes take much longer than the file takes to play? 4. Is it possible to reduce the bandwidth required to transmit a given amount of information? 5. Is it possible to achieve error-free transmission? Discuss. ANSWER!