Sanjivani College of Engineering, Kopargaon
Department of Electronics & Computer Engineering
(An Autonomous Institute)
Affiliated to Savitribai Phule Pune University
Accredited ‘A’ Grade by NAAC
________________________________________________________________________________________
Subject: Discrete Mathematics and Information Theory (EC 201)
UNIT-6
Topic: Information Sources and Entropy
Dipak Mahurkar
Assistant Professor, ECE Department
What is Information?
• Information is what the source of a communication system produces, whether the system is analog or digital.
• Information theory is a mathematical approach to the coding of information, along with its quantification, storage, and communication.
• Information is a measure of uncertainty.
• Can we really analyze it quantitatively?
• What do the numerical values mean?
• Is it tied to “knowledge”?
• Is it subjective?
Conditions of Occurrence of Events
• For any event, there are three conditions of occurrence.
• If the event has not yet occurred, there is a condition of uncertainty.
• If the event has just occurred, there is a condition of surprise.
• If the event occurred some time back, there is a condition of having some information.
• These three conditions arise at different times.
• The differences among these conditions help us reason about the probabilities of occurrence of events.
Uncertainty and Probability
Rate of Information
The information rate R is the average number of bits of information produced per second:
R = r·H bits/second
where r is the number of messages generated per second and H is the source entropy in bits/message.
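As a quick illustration (a minimal sketch with made-up numbers, not from the slides): a hypothetical source emitting r = 1000 messages per second with an entropy of H = 2 bits/message has an information rate of 2000 bits/second.

```python
# Information rate R = r * H (hypothetical example values)
r = 1000      # messages generated per second (assumed)
H = 2.0       # source entropy in bits/message (assumed; entropy is defined on the next slide)
R = r * H     # information rate in bits/second
print(f"R = {R} bits/second")   # -> R = 2000.0 bits/second
```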
Entropy
When we ask how surprising or uncertain the occurrence of an event would be, we are really asking for the average information content delivered by the source of that event.
Entropy is a measure of the average information content per source symbol:
H = Σ (i = 1 to m) pi·logb(1/pi)
where pi is the probability of occurrence of symbol i from a given stream of symbols and b is the base of the logarithm used. This is also called Shannon entropy.
The amount of uncertainty remaining about the channel input after observing the channel output is called conditional entropy. It is denoted by H(X∣Y).
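The sketch below (Python, not part of the original slides) computes Shannon entropy directly from this definition; the function name and the base-2 default are my own choices.

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H = sum(p * log_b(1/p)) of a probability distribution.

    Terms with p == 0 are skipped, since p * log(1/p) -> 0 as p -> 0.
    """
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return sum(p * math.log(1 / p, base) for p in probs if p > 0)

# Example: a binary source with P = {0.5, 0.5} carries 1 bit/symbol.
print(entropy([0.5, 0.5]))   # -> 1.0
```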
Entropy Contd…
Consider a source with M = {m1, m2, …, mm} different messages with probabilities P = {p1, p2, …, pm}.
Suppose a long sequence of L messages is transmitted. Then, on average:
p1·L messages of m1 are transmitted
p2·L messages of m2 are transmitted
…
pm·L messages of mm are transmitted
The information carried by one occurrence of m1 is I(m1) = log2(1/p1).
Since p1·L messages of m1 are transmitted, the total information contributed by each message type is
I1(total) = p1·L·log2(1/p1)
I2(total) = p2·L·log2(1/p2)
Entropy Contd…
If pk·L messages of mk are transmitted, then
I1(total) = p1·L·log2(1/p1)
I2(total) = p2·L·log2(1/p2)
…
Im(total) = pm·L·log2(1/pm)
I(total) = p1·L·log2(1/p1) + p2·L·log2(1/p2) + … + pm·L·log2(1/pm)
Average information per message = Total information / Number of messages
= I(total)/L
= [p1·L·log2(1/p1) + p2·L·log2(1/p2) + … + pm·L·log2(1/pm)] / L
= p1·log2(1/p1) + p2·log2(1/p2) + … + pm·log2(1/pm)
Hence, we can write,
Entropy H = Σ (k = 1 to m) pk·log2(1/pk)
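To make the derivation concrete, the following sketch (my own illustration, not from the slides) draws a long sequence of L messages from an assumed distribution and checks that the average self-information per message approaches the entropy formula above.

```python
import math
import random

probs = [0.5, 0.3, 0.2]   # assumed message probabilities p1, p2, p3
L = 100_000               # length of the transmitted sequence

# Average self-information measured over a simulated sequence of L messages
sequence = random.choices(range(len(probs)), weights=probs, k=L)
avg_info = sum(math.log2(1 / probs[m]) for m in sequence) / L

# Entropy from the closed-form expression H = sum(pk * log2(1/pk))
H = sum(p * math.log2(1 / p) for p in probs)

print(f"average info per message ~ {avg_info:.4f} bits, entropy H = {H:.4f} bits")
```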
Properties of Entropy
• Entropy is zero for a sure event: if some pk = 1, then H = 0.
• When pk = 1/m for all m symbols, the symbols are equally likely and H = log2 m.
• The upper bound on entropy is Hmax = log2 m (see the short check below).
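A short check of these properties using the entropy function sketched earlier (again an illustration, not from the slides):

```python
import math

def entropy(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

m = 4
print(entropy([1.0, 0.0, 0.0, 0.0]))        # sure event -> 0.0
print(entropy([1 / m] * m), math.log2(m))   # equally likely -> 2.0, which equals log2(4)
```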
Properties of Entropy Contd…
• H = 0 if pk = 1 or pk = 0
For pk = 1
H = Σ (k = 1 to m) pk·log2(1/pk)
  = Σ (k = 1 to m) 1·log2(1/1)
  = Σ (k = 1 to m) log10(1)/log10(2) = 0
Properties of Entropy Contd…
For pk = 0
H = Σ (k = 1 to m) pk·log2(1/pk) = 0
(each term pk·log2(1/pk) → 0 as pk → 0)
Properties of Entropy Contd…
For pk = 1/m
H = Σ (k = 1 to m) pk·log2(1/pk)
  = Σ (k = 1 to m) (1/m)·log2(m)
  = log2(m)
Source Efficiency and Redundancy
Source efficiency: ηsource = H/Hmax, where Hmax = log2 m.
Redundancy: γsource = 1 – ηsource.
Self Information and Mutual Information
• Self-information is always non-negative (illustrated in the sketch below).
• When the base of the logarithm is 2, the unit of measure of information, and of average mutual information, is bits.
• For a continuous random variable, the self-information (and hence the absolute entropy) is infinite.
• The smaller the code rate, the more redundant bits are transmitted.
• When the probability of error during transmission is 0.5, the channel is so noisy that no information is received: the output tells us nothing about the input.
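A small sketch of self-information I(p) = log2(1/p), illustrating the non-negativity bullet above (my own example values):

```python
import math

def self_information(p):
    """Self-information I = log2(1/p) of an outcome with probability p (0 < p <= 1)."""
    return math.log2(1 / p)

for p in (1.0, 0.5, 0.1):
    print(p, self_information(p))   # 1.0 -> 0.0 bits, 0.5 -> 1.0 bit, 0.1 -> ~3.32 bits
```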
Example 1
A discrete memoryless source emits three symbols with probabilities p1 = α and p2 = p3. Find the entropy of the source.
Solution:
Given p1 = α and p2 = p3, with
p1 + p2 + p3 = 1
Since p2 = p3:
p1 + 2p2 = 1  =>  α + 2p2 = 1
Hence, p2 = p3 = (1 − α)/2
Example 1 Contd…
H = Σ (k = 1 to m) pk·log2(1/pk)
  = p1·log2(1/p1) + p2·log2(1/p2) + p3·log2(1/p3)
  = α·log2(1/α) + 2·((1 − α)/2)·log2(2/(1 − α))
  = α·log2(1/α) + (1 − α)·log2(2/(1 − α))
Example 2
Example 2 Contd…
For a source with symbol probabilities 1/2, 1/4, 1/8, …, 1/2^n:
H = Σ (k = 1 to m) pk·log2(1/pk)
  = (1/2)·log2(2) + (1/4)·log2(4) + (1/8)·log2(8) + … + (1/2^n)·log2(2^n)
  = 1/2 + 2/4 + 3/8 + … + n/2^n
  = 2 − (n + 2)/2^n
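A quick check (my own sketch) that the partial sums of k/2^k match the closed form 2 − (n + 2)/2^n:

```python
def H_sum(n):
    # Direct sum: 1/2 + 2/4 + 3/8 + ... + n/2^n
    return sum(k / 2**k for k in range(1, n + 1))

def H_closed(n):
    # Closed form derived above
    return 2 - (n + 2) / 2**n

for n in (1, 3, 5, 10):
    print(n, H_sum(n), H_closed(n))   # the two columns agree for every n
```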
Example 3
A source emits three messages with probabilities p1 = 0.7, p2 = 0.2, p3 = 0.1.
Calculate:
1. Source entropy
2. Maximum entropy
3. Source efficiency
4. Redundancy
Example 3 Contd…
H = Σ (k = 1 to m) pk·log2(1/pk)
  = 0.7·log2(1/0.7) + 0.2·log2(1/0.2) + 0.1·log2(1/0.1)
  = 0.3602 + 0.4644 + 0.3322
  = 1.1568 bits/message
Hmax = log2 m = log2 3 = 1.5850 bits/message
ηsource = H/Hmax = 1.1568/1.5850 = 0.7299
γsource = 1 – ηsource = 1 – 0.7299 = 0.2701 ≈ 0.27
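The same numbers can be reproduced programmatically (my own sketch, using the definitions of η and γ above):

```python
import math

probs = [0.7, 0.2, 0.1]
H = sum(p * math.log2(1 / p) for p in probs)   # source entropy
H_max = math.log2(len(probs))                  # maximum entropy log2(3)
efficiency = H / H_max                         # eta_source
redundancy = 1 - efficiency                    # gamma_source

print(f"H = {H:.4f} bits/message")             # ~1.1568
print(f"Hmax = {H_max:.4f} bits/message")      # ~1.5850
print(f"efficiency = {efficiency:.4f}")        # ~0.7299
print(f"redundancy = {redundancy:.4f}")        # ~0.2701
```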
Example 4
A discrete source emits one of six symbols once every millisecond.
The symbol probabilities are 1/2, 1/4, 1/8, 1/16, 1/32 and 1/32 respectively.
Find the source entropy and the information rate.
Solution:
M = 6 symbols, r = 10³ symbols/second
H = (1/2)·1 + (1/4)·2 + (1/8)·3 + (1/16)·4 + (1/32)·5 + (1/32)·5 = 1.9375 bits/symbol
R = r·H = 10³ × 1.9375
R = 1937.5 bits/second
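The same result, computed with a short sketch (my own code, not from the slides):

```python
import math

probs = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]   # six symbol probabilities
r = 1000                                     # symbols per second (one every millisecond)

H = sum(p * math.log2(1 / p) for p in probs)  # source entropy in bits/symbol
R = r * H                                     # information rate in bits/second

print(H, R)   # -> 1.9375 1937.5
```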
Thank You!