ANALOG COMMUNICATION SYSTEMS
By Ms. Swati Shrrpal Halunde, Assistant Professor, Dept. of E.C.E., S.I.T.C.O.E., Yadrav-Ichalkaranji
RADIO RECEIVER MEASUREMENTS & INFORMATION & CHANNEL CAPACITY
Radio receiver measurements
The important characteristics of a superheterodyne radio receiver are:
• Sensitivity
• Selectivity
• Fidelity
Sensitivity:
• It is defined as the ability of a receiver to amplify weak signals.
• It is specified as the voltage that must be applied at the receiver input terminals to produce a standard output power at the receiver output.
• Sensitivity is expressed in millivolts.
• For practical receivers, sensitivity is expressed in terms of the signal power required to produce the minimum acceptable output with the minimum acceptable noise.
• The sensitivity of a superheterodyne radio receiver depends on:
• Gain of the RF amplifier
• Gain of the IF amplifier
• Noise figure of the receiver
Selectivity:
It is defined as the ability of a receiver to reject unwanted signals.
Selectivity depends on:
• Receiving frequency
• Response of the IF section
Fidelity:
It is the ability of a receiver to reproduce all the modulating frequencies equally well.
INFORMATION & CHANNEL CAPACITY
Information:
Information is defined as a sequence of letters, alphabets, or symbols which carries a message with a specific meaning.
Source of Information:
The sources of information can be divided into two types:
• Analog information sources
• Digital (discrete) information sources
Analog information sources produce continuous-amplitude, continuous-time electrical waveforms.
Discrete information sources produce messages consisting of discrete letters or symbols.
Information content of a message
• The information content of a message is represented by
the probability or uncertainty of the event associated
with the message.
• The probability of occurrence of a message is inversely related to the amount of information it carries.
• Therefore, the message with the least probability of occurrence carries the maximum amount of information.
• The relation between the information content of a message and its probability of occurrence is given by,
Ik = log(1/Pk)
• The unit of information is the bit (logarithm to base 2). Depending on the base used:
Ik = log2(1/Pk) bits, Ik = log10(1/Pk) decits, Ik = loge(1/Pk) nats.
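The definition above is easy to check numerically. A minimal Python sketch (the function name `self_information` is ours, not from the slides) evaluating Ik in bits, decits, and nats:

```python
import math

def self_information(p, base=2):
    """Ik = log_base(1/Pk): information content of a symbol with probability p."""
    return math.log(1.0 / p, base)

print(self_information(0.5))               # 1.0 bit
print(self_information(0.125))             # ~3 bits: a rarer symbol carries more information
print(self_information(0.5, base=10))      # ~0.301 decits
print(self_information(0.5, base=math.e))  # ~0.693 nats
```

Note how the same probability gives different numeric values depending on the logarithm base, matching the bits/decits/nats units above.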
Entropy (Average information content)
Entropy is defined as the average amount of information conveyed by a message. It is denoted by H.
Properties of Entropy:
1. Entropy is always non-negative, i.e. H(X) ≥ 0.
2. Entropy is zero when one symbol has probability one and all other symbols have probability zero.
3. Entropy is maximum when the probability of occurrence of all symbols is equal, i.e. H(X)max = log2 M, where M is the number of symbols.
Entropy of symbols in long independent
sequences
• In a statistically independent sequence, the occurrence of a particular symbol during a time interval is independent of the occurrence of symbols at other time intervals.
• If P1, P2, P3, ..., PM are the probabilities of occurrence of the M symbols, then the total information content of a message consisting of N symbols is given by,
Itotal = N [P1 log2(1/P1) + P2 log2(1/P2) + ... + PM log2(1/PM)] bits
• To obtain the entropy, or average information per symbol, the total information content is divided by the number of symbols in the message.
Therefore, H(X) = Itotal/N = Σk Pk log2(1/Pk) bits/symbol
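As a quick numerical check of these formulas, here is a short Python sketch (function names are ours) computing H(X) and the total information of an N-symbol message:

```python
import math

def entropy(probs):
    """H(X) = sum over k of Pk * log2(1/Pk), in bits/symbol."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

def total_information(probs, n):
    """Total information of a message of n independent symbols: n * H(X)."""
    return n * entropy(probs)

probs = [0.5, 0.25, 0.125, 0.125]
print(entropy(probs))                  # 1.75 bits/symbol
print(total_information(probs, 100))   # 175.0 bits
# Equiprobable symbols give the maximum entropy, log2(M):
print(entropy([0.25] * 4))             # 2.0 bits/symbol
```

The last line illustrates property 3 above: four equiprobable symbols give H = log2 4 = 2 bits/symbol, which no other distribution over four symbols can exceed.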
Entropy of symbols in long dependent
sequences
• In statistically dependent sequences, the occurrence of one message alters the probability of occurrence of other messages.
• Due to this dependency, the amount of information coming from the source is reduced.
• To determine the entropy and information rate of symbols in long, statistically dependent sequences, a special model is used: the Markoff statistical model.
Markoff statistical model for information
sources
A random process in which the probability of future values depends on the previous events is called a Markoff process.
The sequence generated by such a process is called a Markoff sequence.
Entropy of Markoff sources:
The entropy of a Markoff source is defined as the average of the entropies of its individual states (weighted by the state probabilities).
Information rate of Markoff sources
The information rate of a Markoff source is given by,
R = rH
Where, r = rate at which symbols are generated
H = entropy of the Markoff source
Information rate is measured in bits/sec.
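A sketch of computing R = rH for a hypothetical two-state Markoff source; the transition probabilities, stationary probabilities, and symbol rate below are invented purely for illustration:

```python
import math

# Invented two-state Markoff source (all numbers illustrative only).
# P[i][j] = probability of moving from state i to state j.
P = [[0.9, 0.1],
     [0.4, 0.6]]

# Stationary state probabilities pi satisfy pi = pi * P.
# For this P: pi0 * 0.1 = pi1 * 0.4, giving pi = [0.8, 0.2].
pi = [0.8, 0.2]

def state_entropy(row):
    """Entropy of the symbols emitted from one state, in bits."""
    return sum(p * math.log2(1.0 / p) for p in row if p > 0)

# Source entropy: per-state entropies averaged over the state probabilities.
H = sum(pi_i * state_entropy(row) for pi_i, row in zip(pi, P))

r = 1000   # assumed symbol rate, symbols/sec
R = r * H  # information rate, bits/sec
print(H, R)
```

Because state 0 is highly predictable (it stays put 90% of the time), H comes out well below the 1 bit/symbol a memoryless binary source could deliver, which is the "information is gradually decreased" effect described above.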
Different types of Entropies
Marginal entropies: H(X) = Σi P(xi) log2(1/P(xi)), H(Y) = Σj P(yj) log2(1/P(yj))
Joint entropy: H(X,Y) = Σi Σj P(xi, yj) log2(1/P(xi, yj))
Conditional entropy: H(X/Y) = Σi Σj P(xi, yj) log2(1/P(xi/yj))
Relation between entropies:
H(X,Y) = H(X) + H(Y/X) = H(Y) + H(X/Y)
Mutual Information
The mutual information I(X;Y) of a channel is equal to the difference between the initial uncertainty and the final uncertainty:
I(X;Y) = initial uncertainty − final uncertainty
I(X;Y) = H(X) − H(X/Y) bits/symbol
Where H(X) is the entropy of the source and H(X/Y) is the conditional entropy.
Properties of mutual information:
1. I(X;Y) = I(Y;X)
2. I(X;Y) ≥ 0
3. I(X;Y) = H(X) − H(X/Y)
4. I(X;Y) = H(X) + H(Y) − H(X,Y)
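These properties can be verified numerically. A sketch using an invented joint distribution for a noisy binary channel (the numbers are illustrative only, not from the slides):

```python
import math

def H(probs):
    """Entropy of a probability list, in bits."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Invented joint distribution P(x, y) of a noisy binary channel.
joint = {(0, 0): 0.4, (0, 1): 0.1,
         (1, 0): 0.1, (1, 1): 0.4}

# Marginals P(x) and P(y).
px = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
py = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

Hx, Hy, Hxy = H(px), H(py), H(list(joint.values()))

I1 = Hx + Hy - Hxy    # property 4: I(X;Y) = H(X) + H(Y) - H(X,Y)
I2 = Hx - (Hxy - Hy)  # property 3, using H(X/Y) = H(X,Y) - H(Y)
print(I1)             # ~0.278 bits/symbol; I1 == I2, and both are >= 0
```

Both routes give the same value, confirming the consistency of properties 3 and 4, and the result is non-negative as property 2 requires.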
Discrete communication channel
The communication channel in which both the input and the output are sequences of symbols is called a discrete communication channel, or coding channel.
A discrete channel is characterized by a set of transition probabilities Pij, which depend on the parameters of the modulator, the transmission medium (channel), the noise, and the demodulator.
The input to the discrete channel is any one of the M symbols of a given alphabet, and the output is a symbol belonging to the same alphabet.
[Figure: model of the discrete channel, not reproduced here]
Rate of information over a discrete
channel
• In discrete channels, the average rate of information transmission is taken to be the difference between the input data rate and the error rate.
• The average rate of information transmission over a discrete channel is defined as the amount of information transmitted over the channel minus the information lost in the channel.
• It is denoted by Dt and is given by,
Dt = [H(X) − H(X/Y)] r bits/sec, where r is the symbol rate.
Capacity of discrete memoryless channel
The maximum allowable rate of information that can be transmitted over a discrete channel is called the capacity of the memoryless channel.
The maximum rate of transmission takes place when the channel matches the source.
Therefore, the channel capacity is
C = max [I(X;Y)] = max [H(X) − H(X/Y)]
M-Ary discrete memoryless channel
The channel which transmits and receives one of M possible symbols, depending only on the present input and independently of previous inputs, is called an M-ary discrete memoryless channel.
The relation between conditional entropy and joint entropy can be written as,
H(X,Y) = H(X/Y) + H(Y) = H(Y/X) + H(X)
Capacity of Gaussian channel: Shannon-Hartley Theorem
The Shannon-Hartley theorem states that the capacity of a Gaussian channel having bandwidth W is given as,
C = W log2(1 + S/N) bits/sec
Where, W = channel bandwidth
S = average signal power
N = average noise power
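The theorem is straightforward to evaluate numerically. A minimal sketch (the 3 kHz bandwidth and 30 dB SNR figures are an illustrative example, not from the slides):

```python
import math

def channel_capacity(W, S, N):
    """Shannon-Hartley: C = W * log2(1 + S/N), in bits/sec."""
    return W * math.log2(1.0 + S / N)

# Illustrative: a 3 kHz channel with S/N = 1000 (i.e. 30 dB)
C = channel_capacity(3000, 1000, 1)
print(C)  # ~29,900 bits/sec
```

Doubling the bandwidth doubles the capacity, but doubling the signal power only adds about W bits/sec once S/N is large, since the SNR sits inside the logarithm.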
Shannon- Fano algorithm
Shannon-Fano coding is a source encoding technique used to remove redundancy (repeated information). The following steps are involved:
1. For a given list of symbols, develop a corresponding list of probabilities or frequency counts so that each symbol's relative frequency of occurrence is known.
2. Sort the list of symbols by frequency, with the most frequently occurring symbols at the left and the least common at the right.
3. Divide the list into two parts, with the total
frequency counts of the left part being as close to
the total of the right as possible.
4. The left part of the list is assigned the binary digit
0, and the right part is assigned the digit 1. This
means that the codes for the symbols in the first
part will all start with 0, and the codes in the
second part will all start with 1.
5. Recursively apply steps 3 and 4 to each of the two halves, subdividing groups and adding bits to the codes until each symbol has become a corresponding code leaf on the tree.
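The five steps above can be sketched directly in Python; this is a straightforward reading of the algorithm (the function name and the example frequencies are ours):

```python
def shannon_fano(symbols):
    """symbols: list of (symbol, frequency) pairs. Returns {symbol: code}."""
    # Steps 1-2: frequencies are given; sort most frequent first.
    items = sorted(symbols, key=lambda sf: sf[1], reverse=True)
    codes = {s: "" for s, _ in items}

    def split(group):
        if len(group) <= 1:
            return
        total = sum(f for _, f in group)
        # Step 3: find the split point that best balances the two parts.
        acc, best_i, best_diff = 0, 1, float("inf")
        for i in range(1, len(group)):
            acc += group[i - 1][1]
            diff = abs(total - 2 * acc)
            if diff < best_diff:
                best_diff, best_i = diff, i
        left, right = group[:best_i], group[best_i:]
        # Step 4: append 0 for the left part, 1 for the right part.
        for s, _ in left:
            codes[s] += "0"
        for s, _ in right:
            codes[s] += "1"
        # Step 5: recurse on each part until every symbol is a leaf.
        split(left)
        split(right)

    split(items)
    return codes

print(shannon_fano([("A", 15), ("B", 7), ("C", 6), ("D", 6), ("E", 5)]))
```

For these frequencies the first split separates {A, B} from {C, D, E}, yielding the codes A:00, B:01, C:10, D:110, E:111; frequent symbols receive shorter codes, which is how the redundancy is removed.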

More Related Content

Similar to Radio receiver and information coding .pptx

Noise infotheory1
Noise infotheory1Noise infotheory1
Noise infotheory1
vmspraneeth
 
Lecture1
Lecture1Lecture1
Lecture1
ntpc08
 
advance coding techniques - probability
advance coding techniques -  probabilityadvance coding techniques -  probability
advance coding techniques - probability
YaseenMo
 
Unit 5.pdf
Unit 5.pdfUnit 5.pdf
Communication System
Communication SystemCommunication System
Communication System
Ridwanul Hoque
 
DC Lecture Slides 1 - Information Theory.ppt
DC Lecture Slides 1 - Information Theory.pptDC Lecture Slides 1 - Information Theory.ppt
DC Lecture Slides 1 - Information Theory.ppt
shortstime400
 
Tele4653 l9
Tele4653 l9Tele4653 l9
Tele4653 l9
Vin Voro
 
Lecture7 channel capacity
Lecture7   channel capacityLecture7   channel capacity
Lecture7 channel capacity
Frank Katta
 
Lecture1: Introduction to signals
Lecture1: Introduction to signalsLecture1: Introduction to signals
Lecture1: Introduction to signals
Jawaher Abdulwahab Fadhil
 
Unit I DIGITAL COMMUNICATION-INFORMATION THEORY.pdf
Unit I DIGITAL COMMUNICATION-INFORMATION THEORY.pdfUnit I DIGITAL COMMUNICATION-INFORMATION THEORY.pdf
Unit I DIGITAL COMMUNICATION-INFORMATION THEORY.pdf
vani374987
 
UNIT-2.pdf
UNIT-2.pdfUNIT-2.pdf
communication system lec2
 communication system lec2 communication system lec2
communication system lec2
ZareenRauf1
 
Data communication
Data communicationData communication
Data communication
Saba Rathinam
 
Project_report_BSS
Project_report_BSSProject_report_BSS
Project_report_BSS
Kamal Bhagat
 
Data Communication & Computer network: Shanon fano coding
Data Communication & Computer network: Shanon fano codingData Communication & Computer network: Shanon fano coding
Data Communication & Computer network: Shanon fano coding
Dr Rajiv Srivastava
 
Information Theory and Coding
Information Theory and CodingInformation Theory and Coding
Information Theory and Coding
VIT-AP University
 
Optimum Receiver corrupted by AWGN Channel
Optimum Receiver corrupted by AWGN ChannelOptimum Receiver corrupted by AWGN Channel
Optimum Receiver corrupted by AWGN Channel
AWANISHKUMAR84
 
Communication theory
Communication theoryCommunication theory
Communication theory
Vijay Balaji
 
2013-1 Machine Learning Lecture 02 - Andrew Moore: Entropy
2013-1 Machine Learning Lecture 02 - Andrew Moore: Entropy2013-1 Machine Learning Lecture 02 - Andrew Moore: Entropy
2013-1 Machine Learning Lecture 02 - Andrew Moore: Entropy
Dongseo University
 
Information theory
Information theoryInformation theory
Information theory
Madhumita Tamhane
 

Similar to Radio receiver and information coding .pptx (20)

Noise infotheory1
Noise infotheory1Noise infotheory1
Noise infotheory1
 
Lecture1
Lecture1Lecture1
Lecture1
 
advance coding techniques - probability
advance coding techniques -  probabilityadvance coding techniques -  probability
advance coding techniques - probability
 
Unit 5.pdf
Unit 5.pdfUnit 5.pdf
Unit 5.pdf
 
Communication System
Communication SystemCommunication System
Communication System
 
DC Lecture Slides 1 - Information Theory.ppt
DC Lecture Slides 1 - Information Theory.pptDC Lecture Slides 1 - Information Theory.ppt
DC Lecture Slides 1 - Information Theory.ppt
 
Tele4653 l9
Tele4653 l9Tele4653 l9
Tele4653 l9
 
Lecture7 channel capacity
Lecture7   channel capacityLecture7   channel capacity
Lecture7 channel capacity
 
Lecture1: Introduction to signals
Lecture1: Introduction to signalsLecture1: Introduction to signals
Lecture1: Introduction to signals
 
Unit I DIGITAL COMMUNICATION-INFORMATION THEORY.pdf
Unit I DIGITAL COMMUNICATION-INFORMATION THEORY.pdfUnit I DIGITAL COMMUNICATION-INFORMATION THEORY.pdf
Unit I DIGITAL COMMUNICATION-INFORMATION THEORY.pdf
 
UNIT-2.pdf
UNIT-2.pdfUNIT-2.pdf
UNIT-2.pdf
 
communication system lec2
 communication system lec2 communication system lec2
communication system lec2
 
Data communication
Data communicationData communication
Data communication
 
Project_report_BSS
Project_report_BSSProject_report_BSS
Project_report_BSS
 
Data Communication & Computer network: Shanon fano coding
Data Communication & Computer network: Shanon fano codingData Communication & Computer network: Shanon fano coding
Data Communication & Computer network: Shanon fano coding
 
Information Theory and Coding
Information Theory and CodingInformation Theory and Coding
Information Theory and Coding
 
Optimum Receiver corrupted by AWGN Channel
Optimum Receiver corrupted by AWGN ChannelOptimum Receiver corrupted by AWGN Channel
Optimum Receiver corrupted by AWGN Channel
 
Communication theory
Communication theoryCommunication theory
Communication theory
 
2013-1 Machine Learning Lecture 02 - Andrew Moore: Entropy
2013-1 Machine Learning Lecture 02 - Andrew Moore: Entropy2013-1 Machine Learning Lecture 02 - Andrew Moore: Entropy
2013-1 Machine Learning Lecture 02 - Andrew Moore: Entropy
 
Information theory
Information theoryInformation theory
Information theory
 

More from swatihalunde

FM demodulation and superheterodyne receiver
FM demodulation and superheterodyne receiverFM demodulation and superheterodyne receiver
FM demodulation and superheterodyne receiver
swatihalunde
 
Frequency modulation and demodulation along with types
Frequency modulation and demodulation along with typesFrequency modulation and demodulation along with types
Frequency modulation and demodulation along with types
swatihalunde
 
Angel modulization in Frequency modulation and Phase modulation
Angel modulization in Frequency modulation and Phase modulationAngel modulization in Frequency modulation and Phase modulation
Angel modulization in Frequency modulation and Phase modulation
swatihalunde
 
Quadrature amplitude modulation and demodulation in analog communication
Quadrature amplitude modulation and demodulation in analog communicationQuadrature amplitude modulation and demodulation in analog communication
Quadrature amplitude modulation and demodulation in analog communication
swatihalunde
 
Double side band suppressed carrier AM generation
Double side band suppressed carrier AM generationDouble side band suppressed carrier AM generation
Double side band suppressed carrier AM generation
swatihalunde
 
modulation of analog communication system
modulation of analog communication systemmodulation of analog communication system
modulation of analog communication system
swatihalunde
 
Basic introduction to analog communication
Basic introduction to analog communicationBasic introduction to analog communication
Basic introduction to analog communication
swatihalunde
 
research methodology
research methodologyresearch methodology
research methodology
swatihalunde
 
Presentation1.pptx
Presentation1.pptxPresentation1.pptx
Presentation1.pptx
swatihalunde
 
DCOM_3_Random process1.pptx
DCOM_3_Random process1.pptxDCOM_3_Random process1.pptx
DCOM_3_Random process1.pptx
swatihalunde
 
Analog pulse modulation scheme.pptx
Analog pulse modulation scheme.pptxAnalog pulse modulation scheme.pptx
Analog pulse modulation scheme.pptx
swatihalunde
 
Noise.pptx
Noise.pptxNoise.pptx
Noise.pptx
swatihalunde
 
Pre-emphasis and De-emphasis.pptx
Pre-emphasis and De-emphasis.pptxPre-emphasis and De-emphasis.pptx
Pre-emphasis and De-emphasis.pptx
swatihalunde
 
Angle modulation .pptx
Angle modulation .pptxAngle modulation .pptx
Angle modulation .pptx
swatihalunde
 
Types of modulation .pptx
Types of modulation .pptxTypes of modulation .pptx
Types of modulation .pptx
swatihalunde
 
Basic of analog communication system.pptx
Basic of analog communication system.pptxBasic of analog communication system.pptx
Basic of analog communication system.pptx
swatihalunde
 
Error correction error detection in digital communication
Error correction error detection in digital communicationError correction error detection in digital communication
Error correction error detection in digital communication
swatihalunde
 
Basics of analog communication system
Basics of analog communication systemBasics of analog communication system
Basics of analog communication system
swatihalunde
 

More from swatihalunde (18)

FM demodulation and superheterodyne receiver
FM demodulation and superheterodyne receiverFM demodulation and superheterodyne receiver
FM demodulation and superheterodyne receiver
 
Frequency modulation and demodulation along with types
Frequency modulation and demodulation along with typesFrequency modulation and demodulation along with types
Frequency modulation and demodulation along with types
 
Angel modulization in Frequency modulation and Phase modulation
Angel modulization in Frequency modulation and Phase modulationAngel modulization in Frequency modulation and Phase modulation
Angel modulization in Frequency modulation and Phase modulation
 
Quadrature amplitude modulation and demodulation in analog communication
Quadrature amplitude modulation and demodulation in analog communicationQuadrature amplitude modulation and demodulation in analog communication
Quadrature amplitude modulation and demodulation in analog communication
 
Double side band suppressed carrier AM generation
Double side band suppressed carrier AM generationDouble side band suppressed carrier AM generation
Double side band suppressed carrier AM generation
 
modulation of analog communication system
modulation of analog communication systemmodulation of analog communication system
modulation of analog communication system
 
Basic introduction to analog communication
Basic introduction to analog communicationBasic introduction to analog communication
Basic introduction to analog communication
 
research methodology
research methodologyresearch methodology
research methodology
 
Presentation1.pptx
Presentation1.pptxPresentation1.pptx
Presentation1.pptx
 
DCOM_3_Random process1.pptx
DCOM_3_Random process1.pptxDCOM_3_Random process1.pptx
DCOM_3_Random process1.pptx
 
Analog pulse modulation scheme.pptx
Analog pulse modulation scheme.pptxAnalog pulse modulation scheme.pptx
Analog pulse modulation scheme.pptx
 
Noise.pptx
Noise.pptxNoise.pptx
Noise.pptx
 
Pre-emphasis and De-emphasis.pptx
Pre-emphasis and De-emphasis.pptxPre-emphasis and De-emphasis.pptx
Pre-emphasis and De-emphasis.pptx
 
Angle modulation .pptx
Angle modulation .pptxAngle modulation .pptx
Angle modulation .pptx
 
Types of modulation .pptx
Types of modulation .pptxTypes of modulation .pptx
Types of modulation .pptx
 
Basic of analog communication system.pptx
Basic of analog communication system.pptxBasic of analog communication system.pptx
Basic of analog communication system.pptx
 
Error correction error detection in digital communication
Error correction error detection in digital communicationError correction error detection in digital communication
Error correction error detection in digital communication
 
Basics of analog communication system
Basics of analog communication systemBasics of analog communication system
Basics of analog communication system
 

Recently uploaded

Electric vehicle and photovoltaic advanced roles in enhancing the financial p...
Electric vehicle and photovoltaic advanced roles in enhancing the financial p...Electric vehicle and photovoltaic advanced roles in enhancing the financial p...
Electric vehicle and photovoltaic advanced roles in enhancing the financial p...
IJECEIAES
 
Applications of artificial Intelligence in Mechanical Engineering.pdf
Applications of artificial Intelligence in Mechanical Engineering.pdfApplications of artificial Intelligence in Mechanical Engineering.pdf
Applications of artificial Intelligence in Mechanical Engineering.pdf
Atif Razi
 
Curve Fitting in Numerical Methods Regression
Curve Fitting in Numerical Methods RegressionCurve Fitting in Numerical Methods Regression
Curve Fitting in Numerical Methods Regression
Nada Hikmah
 
Data Driven Maintenance | UReason Webinar
Data Driven Maintenance | UReason WebinarData Driven Maintenance | UReason Webinar
Data Driven Maintenance | UReason Webinar
UReason
 
IEEE Aerospace and Electronic Systems Society as a Graduate Student Member
IEEE Aerospace and Electronic Systems Society as a Graduate Student MemberIEEE Aerospace and Electronic Systems Society as a Graduate Student Member
IEEE Aerospace and Electronic Systems Society as a Graduate Student Member
VICTOR MAESTRE RAMIREZ
 
Properties Railway Sleepers and Test.pptx
Properties Railway Sleepers and Test.pptxProperties Railway Sleepers and Test.pptx
Properties Railway Sleepers and Test.pptx
MDSABBIROJJAMANPAYEL
 
Mechanical Engineering on AAI Summer Training Report-003.pdf
Mechanical Engineering on AAI Summer Training Report-003.pdfMechanical Engineering on AAI Summer Training Report-003.pdf
Mechanical Engineering on AAI Summer Training Report-003.pdf
21UME003TUSHARDEB
 
哪里办理(csu毕业证书)查尔斯特大学毕业证硕士学历原版一模一样
哪里办理(csu毕业证书)查尔斯特大学毕业证硕士学历原版一模一样哪里办理(csu毕业证书)查尔斯特大学毕业证硕士学历原版一模一样
哪里办理(csu毕业证书)查尔斯特大学毕业证硕士学历原版一模一样
insn4465
 
学校原版美国波士顿大学毕业证学历学位证书原版一模一样
学校原版美国波士顿大学毕业证学历学位证书原版一模一样学校原版美国波士顿大学毕业证学历学位证书原版一模一样
学校原版美国波士顿大学毕业证学历学位证书原版一模一样
171ticu
 
Software Engineering and Project Management - Introduction, Modeling Concepts...
Software Engineering and Project Management - Introduction, Modeling Concepts...Software Engineering and Project Management - Introduction, Modeling Concepts...
Software Engineering and Project Management - Introduction, Modeling Concepts...
Prakhyath Rai
 
CEC 352 - SATELLITE COMMUNICATION UNIT 1
CEC 352 - SATELLITE COMMUNICATION UNIT 1CEC 352 - SATELLITE COMMUNICATION UNIT 1
CEC 352 - SATELLITE COMMUNICATION UNIT 1
PKavitha10
 
LLM Fine Tuning with QLoRA Cassandra Lunch 4, presented by Anant
LLM Fine Tuning with QLoRA Cassandra Lunch 4, presented by AnantLLM Fine Tuning with QLoRA Cassandra Lunch 4, presented by Anant
LLM Fine Tuning with QLoRA Cassandra Lunch 4, presented by Anant
Anant Corporation
 
132/33KV substation case study Presentation
132/33KV substation case study Presentation132/33KV substation case study Presentation
132/33KV substation case study Presentation
kandramariana6
 
22CYT12-Unit-V-E Waste and its Management.ppt
22CYT12-Unit-V-E Waste and its Management.ppt22CYT12-Unit-V-E Waste and its Management.ppt
22CYT12-Unit-V-E Waste and its Management.ppt
KrishnaveniKrishnara1
 
Data Control Language.pptx Data Control Language.pptx
Data Control Language.pptx Data Control Language.pptxData Control Language.pptx Data Control Language.pptx
Data Control Language.pptx Data Control Language.pptx
ramrag33
 
Manufacturing Process of molasses based distillery ppt.pptx
Manufacturing Process of molasses based distillery ppt.pptxManufacturing Process of molasses based distillery ppt.pptx
Manufacturing Process of molasses based distillery ppt.pptx
Madan Karki
 
Use PyCharm for remote debugging of WSL on a Windo cf5c162d672e4e58b4dde5d797...
Use PyCharm for remote debugging of WSL on a Windo cf5c162d672e4e58b4dde5d797...Use PyCharm for remote debugging of WSL on a Windo cf5c162d672e4e58b4dde5d797...
Use PyCharm for remote debugging of WSL on a Windo cf5c162d672e4e58b4dde5d797...
shadow0702a
 
Generative AI leverages algorithms to create various forms of content
Generative AI leverages algorithms to create various forms of contentGenerative AI leverages algorithms to create various forms of content
Generative AI leverages algorithms to create various forms of content
Hitesh Mohapatra
 
2008 BUILDING CONSTRUCTION Illustrated - Ching Chapter 02 The Building.pdf
2008 BUILDING CONSTRUCTION Illustrated - Ching Chapter 02 The Building.pdf2008 BUILDING CONSTRUCTION Illustrated - Ching Chapter 02 The Building.pdf
2008 BUILDING CONSTRUCTION Illustrated - Ching Chapter 02 The Building.pdf
Yasser Mahgoub
 
An improved modulation technique suitable for a three level flying capacitor ...
An improved modulation technique suitable for a three level flying capacitor ...An improved modulation technique suitable for a three level flying capacitor ...
An improved modulation technique suitable for a three level flying capacitor ...
IJECEIAES
 

Recently uploaded (20)

Electric vehicle and photovoltaic advanced roles in enhancing the financial p...
Electric vehicle and photovoltaic advanced roles in enhancing the financial p...Electric vehicle and photovoltaic advanced roles in enhancing the financial p...
Electric vehicle and photovoltaic advanced roles in enhancing the financial p...
 
Applications of artificial Intelligence in Mechanical Engineering.pdf
Applications of artificial Intelligence in Mechanical Engineering.pdfApplications of artificial Intelligence in Mechanical Engineering.pdf
Applications of artificial Intelligence in Mechanical Engineering.pdf
 
Curve Fitting in Numerical Methods Regression
Curve Fitting in Numerical Methods RegressionCurve Fitting in Numerical Methods Regression
Curve Fitting in Numerical Methods Regression
 
Data Driven Maintenance | UReason Webinar
Data Driven Maintenance | UReason WebinarData Driven Maintenance | UReason Webinar
Data Driven Maintenance | UReason Webinar
 
IEEE Aerospace and Electronic Systems Society as a Graduate Student Member
IEEE Aerospace and Electronic Systems Society as a Graduate Student MemberIEEE Aerospace and Electronic Systems Society as a Graduate Student Member
IEEE Aerospace and Electronic Systems Society as a Graduate Student Member
 
Properties Railway Sleepers and Test.pptx
Properties Railway Sleepers and Test.pptxProperties Railway Sleepers and Test.pptx
Properties Railway Sleepers and Test.pptx
 
Mechanical Engineering on AAI Summer Training Report-003.pdf
Mechanical Engineering on AAI Summer Training Report-003.pdfMechanical Engineering on AAI Summer Training Report-003.pdf
Mechanical Engineering on AAI Summer Training Report-003.pdf
 
哪里办理(csu毕业证书)查尔斯特大学毕业证硕士学历原版一模一样
哪里办理(csu毕业证书)查尔斯特大学毕业证硕士学历原版一模一样哪里办理(csu毕业证书)查尔斯特大学毕业证硕士学历原版一模一样
哪里办理(csu毕业证书)查尔斯特大学毕业证硕士学历原版一模一样
 
学校原版美国波士顿大学毕业证学历学位证书原版一模一样
学校原版美国波士顿大学毕业证学历学位证书原版一模一样学校原版美国波士顿大学毕业证学历学位证书原版一模一样
学校原版美国波士顿大学毕业证学历学位证书原版一模一样
 
Software Engineering and Project Management - Introduction, Modeling Concepts...
Software Engineering and Project Management - Introduction, Modeling Concepts...Software Engineering and Project Management - Introduction, Modeling Concepts...
Software Engineering and Project Management - Introduction, Modeling Concepts...
 
CEC 352 - SATELLITE COMMUNICATION UNIT 1
CEC 352 - SATELLITE COMMUNICATION UNIT 1CEC 352 - SATELLITE COMMUNICATION UNIT 1
CEC 352 - SATELLITE COMMUNICATION UNIT 1
 
LLM Fine Tuning with QLoRA Cassandra Lunch 4, presented by Anant
LLM Fine Tuning with QLoRA Cassandra Lunch 4, presented by AnantLLM Fine Tuning with QLoRA Cassandra Lunch 4, presented by Anant
LLM Fine Tuning with QLoRA Cassandra Lunch 4, presented by Anant
 
132/33KV substation case study Presentation
132/33KV substation case study Presentation132/33KV substation case study Presentation
132/33KV substation case study Presentation
 
22CYT12-Unit-V-E Waste and its Management.ppt
22CYT12-Unit-V-E Waste and its Management.ppt22CYT12-Unit-V-E Waste and its Management.ppt
22CYT12-Unit-V-E Waste and its Management.ppt
 
Data Control Language.pptx Data Control Language.pptx
Data Control Language.pptx Data Control Language.pptxData Control Language.pptx Data Control Language.pptx
Data Control Language.pptx Data Control Language.pptx
 
Manufacturing Process of molasses based distillery ppt.pptx
Manufacturing Process of molasses based distillery ppt.pptxManufacturing Process of molasses based distillery ppt.pptx
Manufacturing Process of molasses based distillery ppt.pptx
 
Use PyCharm for remote debugging of WSL on a Windo cf5c162d672e4e58b4dde5d797...
Use PyCharm for remote debugging of WSL on a Windo cf5c162d672e4e58b4dde5d797...Use PyCharm for remote debugging of WSL on a Windo cf5c162d672e4e58b4dde5d797...
Use PyCharm for remote debugging of WSL on a Windo cf5c162d672e4e58b4dde5d797...
 
Generative AI leverages algorithms to create various forms of content
Generative AI leverages algorithms to create various forms of contentGenerative AI leverages algorithms to create various forms of content
Generative AI leverages algorithms to create various forms of content
 
2008 BUILDING CONSTRUCTION Illustrated - Ching Chapter 02 The Building.pdf
2008 BUILDING CONSTRUCTION Illustrated - Ching Chapter 02 The Building.pdf2008 BUILDING CONSTRUCTION Illustrated - Ching Chapter 02 The Building.pdf
2008 BUILDING CONSTRUCTION Illustrated - Ching Chapter 02 The Building.pdf
 
An improved modulation technique suitable for a three level flying capacitor ...
An improved modulation technique suitable for a three level flying capacitor ...An improved modulation technique suitable for a three level flying capacitor ...
An improved modulation technique suitable for a three level flying capacitor ...
 

Radio receiver and information coding .pptx

  • 1. ANALOG COMMUNICATION SYSTEMS By Ms. Swati Shrrpal Halunde Assistant Professor DEPT. of E.C.E S.I.T.C.O.E. Yadrav- Ichalkaranji RADIO RECEIVER MEASUREMENTS &INFORMATION & CHANNEL CAPACITY
  • 2. Radio receiver measurements The important characteristics of superheterodyne radio receiver are, • Sensitivity • Selectivity • Fidelity Sensitivity: • It is defined as the ability of receiver to amplify weak signals • It is defined in terms of voltage which must be applied at the receiver input terminals to provide a standard output power at the receiver output.
  • 3. Radio receiver measurements • Sensitivity is expressed in milli volts • For practical receivers sensitivity is expressed in terms of signal power required to produce minimum acceptable output with minimum acceptable noise. • Sensitivity of superheterodyne radio receiver depends on • Gain of RF amplifier • Gain of IF amplifier • Noise figure of RX
  • 4. Radio receiver measurements Selectivity: It is defined as the ability of receiver to reject unwanted signals. Selectivity depends on • Receiving frequency • Response of IF section
  • 5. Radio receiver measurements Fidelity: It is the ability of a receiver to reproduce all the modulating frequencies equally.
  • 6. INFORMATION & CHANNEL CAPACITY Information: Information is defined as a sequence of letters, carries a message with alphabets, symbols which specific meaning. Source of Information: The sources of information can be divided into 2 types. •Analog Information sources •Digital information sources Analog information sources produce continuous amplitude continuous time electrical waveforms. Discrete information sources produces messages consisting of discrete letters or symbols.
  • 7. Information content of a message • The information content of a message is represented by the probability or uncertainty of the event associated with the message. • The probability of occurrence of a message is inversely related to amount of information. • Therefore, a message with least probability of occurrence will have maximum amount of information. • The relation between information content of message and its probability of occurrence is given by, Ik = log (1/Pk) • The unit of information is bit. • Ik = log 2(1/Pk) bits, Ik = log10 (1/Pk)Decits, Ik = loge(1/Pk) nats.
  • 8. Entropy (Average information content) Entropy is defined as information conveyed denoted by H. the average by a message. It amount of is Properties of Entropy: 1. Entropy is always non negative i.e H(x) ≥ 0. 2.Entropy is zero when probability of all symbols is zero except probability one symbol is one. 3. Entropy is maximum when probability occurrence of all symbols is equal i.e H(x) =
  • 9. Entropy of symbols in long independent sequences • In a statistically independent sequence, the occurrence of a particular symbol during a time interval is independent of occurrence of symbols at other time interval. • If P1, P2, P3, ……. PM are the probabilities of occurrences of M symbols, then the total information content of the message consisting N symbols is given by, • To obtain entropy or average information per symbol, the total information content is divided by number of symbols in a message. Therefore, H(X) =
  • 10. Entropy of symbols in long dependent sequences • In statistically dependent sequences, occurrence of one message alters the occurrence of other message. • Due to this type of dependency, amount of information coming from a source is gradually decreased. • To determine the entropy and information rate of symbols for dependent sequences a long statistically special model is developed which is Markoff statistical model.
• 11. Markoff statistical model for information sources A random process in which the probability of future values depends on previous events is called a Markoff process. The sequence generated by such a process is called a Markoff sequence. Entropy of Markoff sources: The entropy of a Markoff source is defined as the average entropy of each state.
  • 12. Information rate of Markoff sources The information rate of Markoff sources is given by, R = rH Where, r = Rate at which symbols are generated H= Entropy of Markoff sources Information rate is measured in bits/sec
• 13. Different types of Entropies Marginal entropies: H(X) = Σ P(xi) log2(1/P(xi)), H(Y) = Σ P(yj) log2(1/P(yj)) Joint entropy: H(X,Y) = Σ Σ P(xi, yj) log2(1/P(xi, yj)) Conditional entropy: H(Y/X) = Σ Σ P(xi, yj) log2(1/P(yj/xi)) Relation between entropies: H(X,Y) = H(X) + H(Y/X) = H(Y) + H(X/Y)
• 14. Mutual Information The mutual information I(X;Y) of a channel is equal to the difference between the initial uncertainty and the final uncertainty. I(X;Y) = Initial uncertainty – Final uncertainty I(X;Y) = H(X) – H(X/Y) bits/symbol Where, H(X) is the entropy of the source and H(X/Y) is the conditional entropy. Properties of mutual information: 1. I(X;Y) = I(Y;X) 2. I(X;Y) ≥ 0 3. I(X;Y) = H(X) – H(X/Y) 4. I(X;Y) = H(X) + H(Y) – H(X,Y).
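Property 4, I(X;Y) = H(X) + H(Y) − H(X,Y), can be checked numerically on a small joint distribution (the 2×2 probabilities below are an illustrative example, not from the slides):

```python
import math

def H(probs):
    # Entropy of a probability list, in bits
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Hypothetical joint distribution P(x, y) for a 2-input, 2-output channel
joint = [[0.4, 0.1],
         [0.1, 0.4]]

px = [sum(row) for row in joint]            # marginal P(x)
py = [sum(col) for col in zip(*joint)]      # marginal P(y)
Hx, Hy = H(px), H(py)
Hxy = H([p for row in joint for p in row])  # joint entropy H(X,Y)

I = Hx + Hy - Hxy                           # mutual information, property 4
print(round(I, 4))                          # ≈ 0.2781 bits/symbol
```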
• 15. Discrete communication channel The communication channel in which both input and output are sequences of symbols is called a discrete communication channel or coding channel. A discrete channel is characterized by a set of transition probabilities Pij, which depend on the parameters of the modulator, the transmission medium or channel, the noise, and the demodulator.
• 16. Discrete communication channel The input to the discrete channel is any one of the M symbols of a given alphabet, and the output is a symbol belonging to the same alphabet. The model of the discrete channel is shown below:
• 17. Rate of information over a discrete channel • In discrete channels, the average rate of information transmission is assumed to be the difference between the input data rate and the error rate. • The average rate of information transmission over a discrete channel is defined as the amount of information transmitted over the channel minus the information lost. • It is denoted by Dt and is given by, Dt = [H(X) – H(X/Y)] r bits/sec
• 18. Capacity of discrete memoryless channel The maximum allowable rate of information that can be transmitted over a discrete channel is called the capacity of the memoryless channel. When the channel matches the source, the maximum rate of transmission takes place. Therefore, channel capacity, C = max [I(X;Y)] = max [ H(X) – H(X/Y) ]
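The slides leave C = max[I(X;Y)] abstract; a standard worked instance (not from the slides) is the binary symmetric channel with crossover probability p, whose capacity works out to 1 − H(p) bits/symbol, reached for equiprobable inputs:

```python
import math

def h2(p):
    # Binary entropy function H(p) in bits
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    # Capacity of a binary symmetric channel with crossover probability p:
    # the maximum of I(X;Y) over input distributions is C = 1 - H(p)
    return 1 - h2(p)

print(bsc_capacity(0.0))  # 1.0  (noiseless channel: every bit gets through)
print(bsc_capacity(0.5))  # 0.0  (output independent of input: useless channel)
```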
• 19. M-ary discrete memoryless channel The channel which transmits and receives one of ‘M’ possible symbols, where the output depends on the present input and is independent of previous inputs, is called an M-ary discrete memoryless channel. The relation between conditional entropy and joint entropy can be written as, H(X,Y) = H(X/Y) + H(Y) = H(Y/X) + H(X)
• 20. Capacity of Gaussian channel- Shannon Hartley Theorem The Shannon-Hartley theorem states that the capacity of a Gaussian channel having bandwidth ‘W’ is given as, C = W log2(1 + S/N) bits/sec Where, W = Channel bandwidth S = Average signal power N = Average noise power
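A quick numerical sketch of C = W log2(1 + S/N), using a 3 kHz channel with S/N = 1000 (30 dB) as an illustrative choice of numbers:

```python
import math

def gaussian_capacity(W, S, N):
    # Shannon-Hartley: C = W * log2(1 + S/N) bits per second
    return W * math.log2(1 + S / N)

# 3 kHz bandwidth, signal-to-noise ratio 1000 (i.e. 30 dB)
C = gaussian_capacity(3000, 1000, 1)
print(round(C))  # roughly 29902 bits/s
```

Note that capacity grows only logarithmically with S/N but linearly with bandwidth W.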
• 21. Shannon- Fano algorithm Shannon-Fano coding is a source encoding technique used to remove redundancy (repeated information). The following steps are involved: 1. For a given list of symbols, develop a corresponding list of probabilities or frequency counts so that each symbol’s relative frequency of occurrence is known. 2. Sort the list of symbols according to frequency, with the most frequently occurring symbols at the left and the least common at the right.
• 22. Shannon- Fano algorithm 3. Divide the list into two parts, with the total frequency count of the left part being as close to the total of the right as possible. 4. The left part of the list is assigned the binary digit 0, and the right part is assigned the digit 1. This means that the codes for the symbols in the first part will all start with 0, and the codes in the second part will all start with 1. 5. Recursively apply steps 3 and 4 to each of the two halves, subdividing groups and adding bits to the codes until each symbol has become a corresponding code leaf on the tree.
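The five steps above can be sketched directly as a recursive routine (an illustrative Python implementation, not the slides' code; the symbol frequencies are a made-up example):

```python
def shannon_fano(symbols):
    # symbols: list of (symbol, frequency) pairs.
    # Returns a dict mapping each symbol to its binary code string.
    # Step 2: sort by frequency, most frequent first
    items = sorted(symbols, key=lambda sf: sf[1], reverse=True)
    codes = {s: "" for s, _ in items}

    def split(group):
        if len(group) <= 1:
            return
        total = sum(f for _, f in group)
        # Step 3: find the split point that best balances the two halves
        running, best_i, best_diff = 0, 1, float("inf")
        for i in range(1, len(group)):
            running += group[i - 1][1]
            diff = abs((total - running) - running)
            if diff < best_diff:
                best_diff, best_i = diff, i
        left, right = group[:best_i], group[best_i:]
        # Step 4: append 0 to codes in the left part, 1 to the right part
        for s, _ in left:
            codes[s] += "0"
        for s, _ in right:
            codes[s] += "1"
        # Step 5: recurse on each half until every group is a single symbol
        split(left)
        split(right)

    split(items)
    return codes

freqs = [("A", 15), ("B", 7), ("C", 6), ("D", 6), ("E", 5)]
print(shannon_fano(freqs))
# {'A': '00', 'B': '01', 'C': '10', 'D': '110', 'E': '111'}
```

Note that frequent symbols (A) receive shorter codes than rare ones (D, E), which is the source of the compression.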