Unit I.pptx INTRODUCTION TO DIGITAL COMMUNICATION
1. Course Code : EC 8501
Name : Digital Communication
Branch : ECE
Year/Semester : III/V
2. UNIT I INFORMATION THEORY 9
Discrete Memoryless source, Information, Entropy, Mutual Information - Discrete Memoryless channels – Binary Symmetric
Channel, Channel Capacity - Hartley-Shannon law - Source coding theorem - Shannon-Fano & Huffman codes.
UNIT II WAVEFORM CODING & REPRESENTATION 9
Prediction filtering and DPCM - Delta Modulation - ADPCM & ADM principles-Linear Predictive Coding - Properties of Line codes-
Power Spectral Density of Unipolar / Polar RZ & NRZ – Bipolar NRZ - Manchester
UNIT III BASEBAND TRANSMISSION & RECEPTION 9
ISI – Nyquist criterion for distortionless transmission – Pulse shaping – Correlative coding - Eye pattern - Receiving Filters: Matched
Filter, Correlation receiver, Adaptive Equalization
UNIT IV DIGITAL MODULATION SCHEME 9
Geometric Representation of signals - Generation, detection, PSD & BER of Coherent BPSK, BFSK & QPSK - QAM - Carrier
Synchronization - Structure of Non-coherent Receivers - Principle of DPSK.
UNIT V ERROR CONTROL CODING 9
Channel coding theorem - Linear Block codes - Hamming codes - Cyclic codes - Convolutional codes - Viterbi Decoder.
TEXT BOOK:
1. S. Haykin, "Digital Communications", John Wiley, 2005 (Units I–V)
REFERENCES
1. B. Sklar, "Digital Communication: Fundamentals and Applications", 2nd Edition, Pearson Education, 2009
2. B. P. Lathi, "Modern Digital and Analog Communication Systems", 3rd Edition, Oxford University Press, 2007
3. H. P. Hsu, "Analog and Digital Communications", Schaum's Outline Series, TMH, 2006
4. J. G. Proakis, "Digital Communication", 4th Edition, Tata McGraw Hill, 2001
6. • Information is what the source of a communication system produces.
• Information theory is a mathematical approach to the study of coding of
information along with the quantification, storage, and communication of
information.
• If we consider an event, there are three conditions of occurrence:
If the event has not occurred, there is a condition of uncertainty.
If the event has just occurred, there is a condition of surprise.
If the event occurred some time back, there is a condition of
having some information.
Information
7. • Consider a communication system which transmits messages m1, m2, m3
with probabilities of occurrence p1, p2, p3.
• The amount of information transmitted through the message mk with
probability pk is given as
Amount of Information Ik = log2(1/pk) = log10(1/pk) / log10(2)
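As a sketch, the formula above can be evaluated directly in Python (the function name `amount_of_information` is ours, chosen for illustration):

```python
import math

def amount_of_information(p_k):
    """I_k = log2(1/p_k): information in bits carried by a
    message that occurs with probability p_k."""
    return math.log2(1.0 / p_k)

# A message with probability 1/8 carries log2(8) = 3 bits.
print(amount_of_information(1 / 8))             # → 3.0

# The change-of-base form log10(1/pk)/log10(2) gives the same value:
print(round(math.log10(8) / math.log10(2), 6))  # → 3.0
```

Note that rarer messages (smaller pk) carry more bits, which matches property (i) below.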
8. • Properties of Information
i. The more uncertainty there is about a message, the more
information it carries.
ii. If the receiver already knows the message being transmitted, the
amount of information carried is zero.
iii. If I1 is the information carried by message m1 and I2 is the
information carried by an independent message m2, then the total
information carried by the two messages together is I1 + I2.
iv. If there are M = 2^N equally likely messages, then the amount of
information carried by each message is N bits.
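Properties (iii) and (iv) can be checked numerically; this is a small sketch, with the probabilities chosen by us as examples:

```python
import math

# Property iv: with M = 2**N equally likely messages, each has
# probability 1/M, so each carries log2(M) = N bits.
N = 4
M = 2 ** N
p = 1 / M
bits_per_message = math.log2(1 / p)
print(bits_per_message)  # → 4.0

# Property iii: for independent messages the joint probability is
# p1 * p2, so the information contents add.
p1, p2 = 0.5, 0.25
joint = p1 * p2
assert math.isclose(math.log2(1 / joint),
                    math.log2(1 / p1) + math.log2(1 / p2))
```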
9. Discrete Memoryless Source
A source that emits data at successive intervals, with each emitted value
independent of the previous values, is termed a discrete memoryless
source.
10. Entropy (Average Information)
Entropy can be defined as a measure of the average information content per
source symbol.
Claude Shannon, the "father of Information Theory", provided the formula

Entropy H = − Σ (k=1 to M) pk log2(pk)

Where pk is the probability of occurrence of character number k in a
given stream of characters and 2 is the base of the logarithm used. Hence,
this is also called Shannon's Entropy.
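A minimal sketch of the entropy formula (the symbol probabilities below are our own examples, not from the source material):

```python
import math

def entropy(probs):
    """H = -sum(p_k * log2(p_k)): average information in bits per
    source symbol. Symbols with p_k == 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equally likely symbols: H = log2(4) = 2 bits/symbol,
# consistent with property iv (M = 2^N messages carry N bits each).
print(entropy([0.25, 0.25, 0.25, 0.25]))   # → 2.0

# A skewed source carries less average information per symbol:
print(entropy([0.5, 0.25, 0.125, 0.125]))  # → 1.75
```

Equally likely symbols maximize the entropy; any skew in the probabilities reduces the average information per symbol.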
11. Information Rate
The information rate is represented by R = rH
Where R is Information rate, H is Entropy & r is rate at which messages are
generated.
R = (r in Messages/second) x (H in Information bits/message)
R = Information bits/second
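A worked example of R = rH, with the source probabilities and message rate assumed for illustration:

```python
import math

# R = r * H: information rate in bits/second, where r is the message
# rate (messages/second) and H the entropy (bits/message).
probs = [0.5, 0.25, 0.125, 0.125]          # example source alphabet (ours)
H = -sum(p * math.log2(p) for p in probs)  # 1.75 bits/message
r = 2000                                   # messages per second (assumed)
R = r * H
print(R)  # → 3500.0 bits/second
```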