Chapter 6
Information Theory
6.1 Mathematical models for information source
• Discrete source: emits symbols from a finite alphabet
$$X = \{x_1, x_2, \ldots, x_L\}, \qquad p_k = \mathrm{P}[X = x_k], \quad k = 1, 2, \ldots, L, \qquad \sum_{k=1}^{L} p_k = 1$$
6.1 Mathematical models for information source
• Discrete memoryless source (DMS)
Source outputs are independent random variables
• Discrete stationary source
– Source outputs $\{X_i\}$, $i = 1, 2, \ldots, N$, are statistically dependent
– Stationary: the joint probabilities of $(x_1, x_2, \ldots, x_n)$ and $(x_{1+m}, x_{2+m}, \ldots, x_{n+m})$ are identical for all shifts $m$
– Characterized by the joint PDF $p(x_1, x_2, \ldots, x_m)$
6.2 Measure of information
• Entropy of random variable X
– A measure of the uncertainty or ambiguity in X
– A measure of the information required to specify X, i.e. the information content of X per symbol
– Unit: bits (log base 2) or nats (log base e) per symbol
– We define $0 \log 0 = 0$
– Entropy depends on the probabilities of X, not on the values of X

$$H(X) = -\sum_{k=1}^{L} \mathrm{P}[X = x_k] \log \mathrm{P}[X = x_k], \qquad X \in \{x_1, x_2, \ldots, x_L\}$$
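The definition above is direct to compute; a minimal sketch in Python (the four-symbol distribution is an arbitrary illustrative choice):

```python
import math

def entropy(probs, base=2.0):
    """H(X) = -sum_k p_k log p_k, with the convention 0*log 0 = 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0.0)

# A four-symbol source; probabilities chosen only for illustration.
p = [0.5, 0.25, 0.125, 0.125]
print(entropy(p))   # 1.75 bits per symbol
```

Note that the entropy depends only on the list of probabilities, not on which symbols carry them, matching the last bullet above.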
Shannon’s fundamental paper in 1948
“A Mathematical Theory of Communication”
Can we define a quantity which will measure how much
information is “produced” by a process?
He wants this measure $H(p_1, p_2, \ldots, p_n)$ to satisfy:
1) H should be continuous in the $p_i$
2) If all $p_i$ are equal, H should be monotonically increasing with $n$
3) If a choice can be broken down into two successive choices, the original H should be the weighted sum of the individual values of H
Shannon’s fundamental paper in 1948
“A Mathematical Theory of Communication”
Example of the grouping property:
$$H\!\left(\tfrac{1}{2}, \tfrac{1}{3}, \tfrac{1}{6}\right) = H\!\left(\tfrac{1}{2}, \tfrac{1}{2}\right) + \tfrac{1}{2}\, H\!\left(\tfrac{2}{3}, \tfrac{1}{3}\right)$$
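This grouping identity is easy to verify numerically; a minimal check:

```python
import math

def H(*probs):
    """Shannon entropy of a probability list, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

lhs = H(1/2, 1/3, 1/6)
rhs = H(1/2, 1/2) + 0.5 * H(2/3, 1/3)
print(abs(lhs - rhs) < 1e-12)   # True
```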
Shannon’s fundamental paper in 1948
“A Mathematical Theory of Communication”
The only H satisfying the three assumptions is of
the form:
$$H = -K \sum_{i=1}^{n} p_i \log p_i$$
where K is a positive constant.
Binary entropy function
$$H(X) = -p \log p - (1 - p) \log(1 - p)$$
[Figure: H(p) plotted against the probability p]
H = 0: no uncertainty
H = 1: most uncertainty (1 bit for binary information)
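A small helper for the binary entropy function, handling the endpoints via the $0 \log 0 = 0$ convention:

```python
import math

def Hb(p):
    """Binary entropy in bits; Hb(0) = Hb(1) = 0 by the 0*log 0 = 0 convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(Hb(0.5))    # 1.0 : maximum uncertainty
print(Hb(0.11))   # ~0.5 : a biased coin is easier to predict
```

The function is symmetric about p = 1/2, as the plot on the slide shows.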
Mutual information
• Two discrete random variables: X and Y
• Measures the information that knowing either variable provides about the other
• What if X and Y are fully independent or fully dependent?

$$I(X;Y) = \sum_{x,y} \mathrm{P}[X{=}x, Y{=}y]\, I(x; y)
= \sum_{x,y} \mathrm{P}[X{=}x, Y{=}y] \log \frac{\mathrm{P}[X{=}x \mid Y{=}y]}{\mathrm{P}[X{=}x]}
= \sum_{x,y} \mathrm{P}[X{=}x, Y{=}y] \log \frac{\mathrm{P}[X{=}x, Y{=}y]}{\mathrm{P}[X{=}x]\, \mathrm{P}[Y{=}y]}$$
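A direct implementation of the definition answers the question above: for independent variables I(X;Y) = 0, while for fully dependent ones I(X;Y) = H(X). The two joint distributions below are illustrative:

```python
import math

def mutual_information(joint):
    """I(X;Y) = sum_{x,y} P[x,y] log2( P[x,y] / (P[x] P[y]) ).
    `joint` is a 2-D list: joint[x][y] = P[X=x, Y=y]."""
    px = [sum(row) for row in joint]              # marginal of X
    py = [sum(col) for col in zip(*joint)]        # marginal of Y
    return sum(pxy * math.log2(pxy / (px[x] * py[y]))
               for x, row in enumerate(joint)
               for y, pxy in enumerate(row) if pxy > 0)

indep = [[0.25, 0.25], [0.25, 0.25]]   # X, Y independent fair bits
dep   = [[0.5, 0.0], [0.0, 0.5]]       # Y = X, a fair bit
print(mutual_information(indep))        # 0.0
print(mutual_information(dep))          # 1.0  (= H(X) for a fair bit)
```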
Equivalently, in terms of entropies:
$$I(X;Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = H(X) + H(Y) - H(X, Y)$$
Some properties
$$I(X;Y) = I(Y;X)$$
$$I(X;Y) \ge 0, \qquad I(X;X) = H(X)$$
$$I(X;Y) \le \min\{H(X), H(Y)\}$$
$$0 \le H(X) \le \log L$$
Entropy is maximized when the probabilities are equal.
If $Y = g(X)$, then $H(Y) \le H(X)$.
Joint and conditional entropy
• Joint entropy
• Conditional entropy of Y given X
$$H(X, Y) = -\sum_{x,y} \mathrm{P}[X{=}x, Y{=}y] \log \mathrm{P}[X{=}x, Y{=}y]$$
$$H(Y \mid X) = \sum_{x} \mathrm{P}[X{=}x]\, H(Y \mid X{=}x) = -\sum_{x,y} \mathrm{P}[X{=}x, Y{=}y] \log \mathrm{P}[Y{=}y \mid X{=}x]$$
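These definitions can be checked against each other; the sketch below verifies the two-variable chain rule H(X,Y) = H(X) + H(Y|X) for a hypothetical joint pmf:

```python
import math

def H_joint(joint):
    """H(X,Y) from a 2-D joint pmf joint[x][y]."""
    return -sum(p * math.log2(p) for row in joint for p in row if p > 0)

def H_marginal(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def H_cond(joint):
    """H(Y|X) = -sum_{x,y} P[x,y] log2 P[y|x]."""
    px = [sum(row) for row in joint]
    return -sum(p * math.log2(p / px[x])
                for x, row in enumerate(joint)
                for p in row if p > 0)

joint = [[0.4, 0.1], [0.2, 0.3]]           # hypothetical P[X=x, Y=y]
hx = H_marginal([sum(r) for r in joint])
assert abs(H_joint(joint) - (hx + H_cond(joint))) < 1e-9   # H(X,Y) = H(X) + H(Y|X)
print(H_joint(joint), hx, H_cond(joint))
```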
Joint and conditional entropy
• Chain rule for entropies
• Therefore,
• If Xi are iid
$$H(X_1, X_2, \ldots, X_n) = H(X_1) + H(X_2 \mid X_1) + H(X_3 \mid X_1, X_2) + \cdots + H(X_n \mid X_1, \ldots, X_{n-1})$$
$$H(X_1, X_2, \ldots, X_n) \le \sum_{i=1}^{n} H(X_i)$$
$$H(X_1, X_2, \ldots, X_n) = nH(X) \quad \text{(i.i.d. case)}$$
6.3 Lossless coding of information source
• Source alphabet $X = \{x_1, x_2, \ldots, x_L\}$ with symbol probabilities $p_i = \mathrm{P}[X = x_i]$
• Source sequence with length n: $\mathbf{x} = [X_1, X_2, \ldots, X_n]$ (n is assumed to be large)
• Without any source coding we need $\log_2 L$ bits per symbol
Lossless source coding
• Typical sequence
– The number of occurrences of $x_i$ in the sequence is roughly $np_i$
– When $n \to \infty$, any $\mathbf{x}$ will be "typical"

$$\log \mathrm{P}[\mathbf{x}] \approx \log \prod_{i=1}^{L} p_i^{np_i} = n \sum_{i=1}^{L} p_i \log p_i = -nH(X)$$
$$\mathrm{P}[\mathbf{x}] \approx 2^{-nH(X)}$$

All typical sequences have the same probability, and the total probability of the typical sequences approaches 1 as $n \to \infty$.
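The concentration behind the typical-sequence argument can be seen empirically: for long i.i.d. sequences, $-\frac{1}{n}\log_2 \mathrm{P}[\mathbf{x}]$ clusters around H(X). A sketch with an assumed two-symbol source:

```python
import math, random

random.seed(1)
p = [0.8, 0.2]                          # assumed source probabilities
H = -sum(q * math.log2(q) for q in p)   # H(X), about 0.722 bits

n = 100_000
seq = random.choices([0, 1], weights=p, k=n)
# -(1/n) log2 P[x] for the drawn sequence
rate = -sum(math.log2(p[s]) for s in seq) / n
print(H, rate)   # the two numbers are close for large n
```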
Lossless source coding
• Typical sequence
• Since typical sequences are almost certain to
occur, for the source output it is sufficient to
consider only these typical sequences
• How many bits per symbol we need now?
Number of typical sequences $= \dfrac{1}{\mathrm{P}[\mathbf{x}]} = 2^{nH(X)}$
$$R = \frac{nH(X)}{n} = H(X) \le \log L \ \text{bits per symbol}$$
Lossless source coding
Shannon’s First Theorem - Lossless Source Coding
Let X denote a discrete memoryless source.
There exists a lossless source code at rate R if $R \ge H(X)$ bits per transmission.
Lossless source coding
For discrete stationary source…
$$R \ge H_\infty(X) = \lim_{k \to \infty} \frac{1}{k} H(X_1, X_2, \ldots, X_k) = \lim_{k \to \infty} H(X_k \mid X_1, X_2, \ldots, X_{k-1})$$
Lossless source coding algorithms
• Variable-length coding algorithm
– Symbols with higher probability are assigned
shorter code words
– E.g. Huffman coding
• Fixed-length coding algorithm
– E.g. Lempel-Ziv coding

The average code rate, minimized over the codeword-length assignments $\{n_k\}$:
$$\bar{R} = \min_{\{n_k\}} \sum_{k=1}^{L} n_k\, P(x_k) \ \text{bits per symbol}$$
Huffman coding algorithm
[Figure: Huffman tree constructed from P(x1), ..., P(x7)]
x1 00
x2 01
x3 10
x4 110
x5 1110
x6 11110
x7 11111
H(X) = 2.11, R = 2.21 bits per symbol
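The average rate above can be reproduced with a small Huffman implementation. The slide does not list the symbol probabilities; the values below are assumed for illustration (they are consistent with H(X) = 2.11 and R = 2.21):

```python
import heapq, itertools, math

def huffman_lengths(probs):
    """Return the Huffman codeword length of each symbol."""
    counter = itertools.count()            # tie-breaker for equal probabilities
    heap = [(p, next(counter), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)    # merge the two least probable nodes
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:                  # every symbol under the merged node
            lengths[i] += 1                # gains one bit of codeword length
        heapq.heappush(heap, (p1 + p2, next(counter), s1 + s2))
    return lengths

probs = [0.35, 0.30, 0.20, 0.10, 0.04, 0.005, 0.005]   # assumed values
lengths = huffman_lengths(probs)
H = -sum(p * math.log2(p) for p in probs)
R = sum(p * n for p, n in zip(probs, lengths))
print(round(H, 2), round(R, 2))   # 2.11 2.21
```

As the slide states, the higher-probability symbols receive the shorter codewords.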
6.5 Channel models and channel capacity
• Channel models
input sequence $\mathbf{x} = (x_1, x_2, \ldots, x_n)$
output sequence $\mathbf{y} = (y_1, y_2, \ldots, y_n)$
A channel is memoryless if
$$\mathrm{P}[\mathbf{y} \mid \mathbf{x}] = \prod_{i=1}^{n} \mathrm{P}[y_i \mid x_i]$$
Binary symmetric channel (BSC) model
[Figure: Source data → Channel encoder → Binary modulator → Channel → Demodulator and detector → Channel decoder → Output data. The modulator, channel, and demodulator/detector form a composite discrete-input, discrete-output channel.]
Binary symmetric channel (BSC) model
[Figure: BSC transition diagram; input 0 stays 0 and input 1 stays 1 with probability 1-p, crossovers occur with probability p]

$$\mathrm{P}[Y{=}0 \mid X{=}1] = \mathrm{P}[Y{=}1 \mid X{=}0] = p$$
$$\mathrm{P}[Y{=}1 \mid X{=}1] = \mathrm{P}[Y{=}0 \mid X{=}0] = 1 - p$$
Discrete memoryless channel (DMC)
[Figure: DMC with inputs $\{X\} = \{x_0, x_1, \ldots, x_{M-1}\}$ and outputs $\{Y\} = \{y_0, y_1, \ldots, y_{Q-1}\}$]
The transition probabilities $\mathrm{P}[y \mid x]$ can be arranged in a matrix.
Discrete-input continuous-output channel
$$Y = X + N$$
If N is additive white Gaussian noise,
$$p(y \mid x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(y - x)^2 / 2\sigma^2}$$
$$p(y_1, y_2, \ldots, y_n \mid x_1, x_2, \ldots, x_n) = \prod_{i=1}^{n} p(y_i \mid x_i)$$
Discrete-time AWGN channel
• Channel model: $y_i = x_i + n_i$
• Power constraint: $\mathrm{E}[X^2] \le P$
• For an input sequence $\mathbf{x} = (x_1, x_2, \ldots, x_n)$ with large n,
$$\frac{1}{n} \sum_{i=1}^{n} x_i^2 \le P$$
AWGN waveform channel
• Assume the channel has bandwidth W, with frequency response C(f) = 1 for $f \in [-W, +W]$

[Figure: Source data → Channel encoder → Modulator → Physical channel → Demodulator and detector → Channel decoder → Output data; input waveform x(t), output waveform y(t)]

$$y(t) = x(t) + n(t)$$
AWGN waveform channel
• Power constraint
$$\mathrm{E}[X^2(t)] = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x^2(t)\, dt \le P$$
AWGN waveform channel
• How to define probabilities that characterize the channel?
Expand the waveforms in an orthonormal basis $\{\phi_j(t),\ j = 1, 2, \ldots, 2WT\}$:
$$x(t) = \sum_j x_j \phi_j(t), \qquad n(t) = \sum_j n_j \phi_j(t), \qquad y(t) = \sum_j y_j \phi_j(t)$$
$$y_j = x_j + n_j$$
This is equivalent to 2W uses per second of a discrete-time channel.
AWGN waveform channel
• Power constraint becomes...
• Hence,
$$\lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x^2(t)\, dt = \lim_{T \to \infty} \frac{1}{T} \sum_{j=1}^{2WT} x_j^2 = 2W\, \mathrm{E}[X^2] \le P$$
Hence,
$$\mathrm{E}[X^2] \le \frac{P}{2W}$$
Channel capacity
• After source coding, we have a binary sequence of length n
• The channel causes bit errors with probability p
• When $n \to \infty$, the number of sequences with np errors is
$$\binom{n}{np} = \frac{n!}{(np)!\, (n(1-p))!} \approx 2^{nH_b(p)}$$
Channel capacity
• To reduce errors, we use a subset of all
possible sequences
• Information rate [bits per transmission]
32
$$M = \frac{2^n}{2^{nH_b(p)}} = 2^{n(1 - H_b(p))}$$
$$R = \frac{1}{n} \log_2 M = 1 - H_b(p) \quad \text{(capacity of the binary channel)}$$
Channel capacity
The channel encoder adds redundancy: $2^m$ information-bearing binary sequences of length m are mapped into $2^n$ possible binary sequences of length n for transmission.
$$0 \le R = 1 - H_b(p) \le 1$$
We cannot transmit more than 1 bit per channel use.
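Both steps of this argument, the count of np-error patterns growing as $2^{nH_b(p)}$ and the resulting rate $1 - H_b(p)$, can be checked numerically with exact binomial coefficients; a sketch:

```python
import math

def Hb(p):
    """Binary entropy function in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# (1/n) log2 of the number of length-n sequences with np errors
# approaches Hb(p) as n grows (a consequence of Stirling's approximation).
n, p = 10_000, 0.1
growth = math.log2(math.comb(n, int(n * p))) / n
print(growth, Hb(p))   # differ only by an O(log(n)/n) term

# Achievable rate of the binary channel with crossover probability p
R = 1.0 - Hb(p)
print(R)               # ~0.531 bits per channel use
```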
Channel capacity
• Capacity for an arbitrary discrete memoryless channel:
$$C = \max_{p} I(X; Y)$$
• Maximize the mutual information between input and output over all input distributions $p = (p_1, p_2, \ldots, p_L)$
• Shannon's Second Theorem – noisy channel coding
– R < C: reliable communication is possible
– R > C: reliable communication is impossible
Channel capacity
For binary symmetric channel
The maximum is achieved with $\mathrm{P}[X{=}0] = \mathrm{P}[X{=}1] = \tfrac{1}{2}$:
$$C = 1 + p \log_2 p + (1 - p) \log_2 (1 - p) = 1 - H_b(p)$$
Channel capacity
Discrete-time AWGN channel with an input
power constraint
For large n,
$$Y = X + N, \qquad \mathrm{E}[X^2] \le P, \qquad y_i = x_i + n_i$$
$$\frac{1}{n} \sum_{i=1}^{n} y_i^2 \approx \mathrm{E}[X^2] + \mathrm{E}[N^2] \le P + \sigma^2, \qquad \frac{1}{n} \sum_{i=1}^{n} n_i^2 \approx \sigma^2$$
Channel capacity
Discrete-time AWGN channel with an input power constraint: $Y = X + N$, $\mathrm{E}[X^2] \le P$.
Maximum number of symbols to transmit:
$$M = \left( \frac{\sqrt{n(P + \sigma^2)}}{\sqrt{n\sigma^2}} \right)^{n} = \left( 1 + \frac{P}{\sigma^2} \right)^{n/2}$$
Transmission rate:
$$R = \frac{1}{n} \log_2 M = \frac{1}{2} \log_2\!\left( 1 + \frac{P}{\sigma^2} \right)$$
The same result can be obtained by directly maximizing I(X;Y) subject to the power constraint.
Channel capacity
Band-limited waveform AWGN channel with input power constraint
– Equivalent to 2W uses per second of the discrete-time channel:
$$C = \frac{1}{2} \log_2\!\left( 1 + \frac{P}{2W \cdot N_0/2} \right) = \frac{1}{2} \log_2\!\left( 1 + \frac{P}{N_0 W} \right) \ \text{bits per channel use}$$
$$C = 2W \cdot \frac{1}{2} \log_2\!\left( 1 + \frac{P}{N_0 W} \right) = W \log_2\!\left( 1 + \frac{P}{N_0 W} \right) \ \text{bits/s}$$
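The band-limited capacity formula is straightforward to evaluate; a sketch with illustrative numbers (W in Hz, P and N0 in consistent units; the specific values below are made up so that the SNR comes out to 15):

```python
import math

def awgn_capacity(P, N0, W):
    """C = W log2(1 + P / (N0 W)) bits per second."""
    return W * math.log2(1.0 + P / (N0 * W))

# Illustrative values: 1 MHz bandwidth, SNR P/(N0 W) = 15, so C = W log2(16).
print(awgn_capacity(P=15e-6, N0=1e-12, W=1e6))   # 4000000.0 bits/s
```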
Channel capacity
$$C = W \log_2\!\left( 1 + \frac{P}{N_0 W} \right)$$
As $W \to \infty$, the capacity approaches a finite limit:
$$C \to \frac{P}{N_0} \log_2 e \approx 1.44\, \frac{P}{N_0}$$
Channel capacity
• Bandwidth efficiency
• Relation of bandwidth efficiency and power
efficiency
$$r = \frac{R}{W} \le \log_2\!\left( 1 + \frac{P}{N_0 W} \right)$$
Energy per bit: $\varepsilon_b = \dfrac{P T_s}{\log_2 M} = \dfrac{P}{R}$, so
$$r = \log_2\!\left( 1 + r\, \frac{\varepsilon_b}{N_0} \right) \quad\Longrightarrow\quad \frac{\varepsilon_b}{N_0} = \frac{2^r - 1}{r}$$
$$\frac{\varepsilon_b}{N_0} \to \ln 2 = -1.6\ \mathrm{dB} \quad \text{as } r \to 0$$
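The trade-off between bandwidth efficiency and power efficiency is easy to tabulate from $(2^r - 1)/r$; a short sketch:

```python
import math

def ebn0_db(r):
    """Required Eb/N0 in dB at bandwidth efficiency r = R/W: (2^r - 1)/r."""
    return 10.0 * math.log10((2.0 ** r - 1.0) / r)

for r in (4.0, 1.0, 0.1, 0.001):
    print(r, round(ebn0_db(r), 2))
# As r -> 0 the required Eb/N0 approaches 10*log10(ln 2), about -1.6 dB.
```

Higher bandwidth efficiency demands more energy per bit; the limiting value as r shrinks matches the -1.6 dB figure above.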
More Related Content

What's hot

Homomorphic encryption in_cloud
Homomorphic encryption in_cloudHomomorphic encryption in_cloud
Homomorphic encryption in_cloudShivam Singh
 
Algorithm Analyzing
Algorithm AnalyzingAlgorithm Analyzing
Algorithm AnalyzingHaluan Irsad
 
Algorithem complexity in data sructure
Algorithem complexity in data sructureAlgorithem complexity in data sructure
Algorithem complexity in data sructureKumar
 
Introduction to Information Channel
Introduction to Information ChannelIntroduction to Information Channel
Introduction to Information ChannelAkhil Nadh PC
 
Speaker Diarization
Speaker DiarizationSpeaker Diarization
Speaker DiarizationHONGJOO LEE
 
EuroPython 2017 - PyData - Deep Learning your Broadband Network @ HOME
EuroPython 2017 - PyData - Deep Learning your Broadband Network @ HOMEEuroPython 2017 - PyData - Deep Learning your Broadband Network @ HOME
EuroPython 2017 - PyData - Deep Learning your Broadband Network @ HOMEHONGJOO LEE
 
Introduction to datastructure and algorithm
Introduction to datastructure and algorithmIntroduction to datastructure and algorithm
Introduction to datastructure and algorithmPratik Mota
 
Homomorphic encryption and Private Machine Learning Classification
Homomorphic encryption and Private Machine Learning ClassificationHomomorphic encryption and Private Machine Learning Classification
Homomorphic encryption and Private Machine Learning ClassificationMohammed Ashour
 
Lattice Cryptography
Lattice CryptographyLattice Cryptography
Lattice CryptographyPriyanka Aash
 
Threshold and Proactive Pseudo-Random Permutations
Threshold and Proactive Pseudo-Random PermutationsThreshold and Proactive Pseudo-Random Permutations
Threshold and Proactive Pseudo-Random PermutationsAleksandr Yampolskiy
 
Computer Science Engineering : Data structure & algorithm, THE GATE ACADEMY
Computer Science Engineering : Data structure & algorithm, THE GATE ACADEMYComputer Science Engineering : Data structure & algorithm, THE GATE ACADEMY
Computer Science Engineering : Data structure & algorithm, THE GATE ACADEMYklirantga
 
Real Time Big Data Management
Real Time Big Data ManagementReal Time Big Data Management
Real Time Big Data ManagementAlbert Bifet
 
Detecting Bugs in Binaries Using Decompilation and Data Flow Analysis
Detecting Bugs in Binaries Using Decompilation and Data Flow AnalysisDetecting Bugs in Binaries Using Decompilation and Data Flow Analysis
Detecting Bugs in Binaries Using Decompilation and Data Flow AnalysisSilvio Cesare
 
Gradient Estimation Using Stochastic Computation Graphs
Gradient Estimation Using Stochastic Computation GraphsGradient Estimation Using Stochastic Computation Graphs
Gradient Estimation Using Stochastic Computation GraphsYoonho Lee
 

What's hot (18)

Unit ii algorithm
Unit   ii algorithmUnit   ii algorithm
Unit ii algorithm
 
Homomorphic encryption in_cloud
Homomorphic encryption in_cloudHomomorphic encryption in_cloud
Homomorphic encryption in_cloud
 
Algorithm Analyzing
Algorithm AnalyzingAlgorithm Analyzing
Algorithm Analyzing
 
Algorithem complexity in data sructure
Algorithem complexity in data sructureAlgorithem complexity in data sructure
Algorithem complexity in data sructure
 
Scribed lec8
Scribed lec8Scribed lec8
Scribed lec8
 
Big o
Big oBig o
Big o
 
Introduction to Information Channel
Introduction to Information ChannelIntroduction to Information Channel
Introduction to Information Channel
 
Speaker Diarization
Speaker DiarizationSpeaker Diarization
Speaker Diarization
 
EuroPython 2017 - PyData - Deep Learning your Broadband Network @ HOME
EuroPython 2017 - PyData - Deep Learning your Broadband Network @ HOMEEuroPython 2017 - PyData - Deep Learning your Broadband Network @ HOME
EuroPython 2017 - PyData - Deep Learning your Broadband Network @ HOME
 
Introduction to datastructure and algorithm
Introduction to datastructure and algorithmIntroduction to datastructure and algorithm
Introduction to datastructure and algorithm
 
Homomorphic encryption and Private Machine Learning Classification
Homomorphic encryption and Private Machine Learning ClassificationHomomorphic encryption and Private Machine Learning Classification
Homomorphic encryption and Private Machine Learning Classification
 
Lattice Cryptography
Lattice CryptographyLattice Cryptography
Lattice Cryptography
 
Threshold and Proactive Pseudo-Random Permutations
Threshold and Proactive Pseudo-Random PermutationsThreshold and Proactive Pseudo-Random Permutations
Threshold and Proactive Pseudo-Random Permutations
 
Computer Science Engineering : Data structure & algorithm, THE GATE ACADEMY
Computer Science Engineering : Data structure & algorithm, THE GATE ACADEMYComputer Science Engineering : Data structure & algorithm, THE GATE ACADEMY
Computer Science Engineering : Data structure & algorithm, THE GATE ACADEMY
 
Real Time Big Data Management
Real Time Big Data ManagementReal Time Big Data Management
Real Time Big Data Management
 
ECE 4490 Multimedia Communication Lec01
ECE 4490 Multimedia Communication Lec01ECE 4490 Multimedia Communication Lec01
ECE 4490 Multimedia Communication Lec01
 
Detecting Bugs in Binaries Using Decompilation and Data Flow Analysis
Detecting Bugs in Binaries Using Decompilation and Data Flow AnalysisDetecting Bugs in Binaries Using Decompilation and Data Flow Analysis
Detecting Bugs in Binaries Using Decompilation and Data Flow Analysis
 
Gradient Estimation Using Stochastic Computation Graphs
Gradient Estimation Using Stochastic Computation GraphsGradient Estimation Using Stochastic Computation Graphs
Gradient Estimation Using Stochastic Computation Graphs
 

Similar to Ch6 information theory

Unit I DIGITAL COMMUNICATION-INFORMATION THEORY.pdf
Unit I DIGITAL COMMUNICATION-INFORMATION THEORY.pdfUnit I DIGITAL COMMUNICATION-INFORMATION THEORY.pdf
Unit I DIGITAL COMMUNICATION-INFORMATION THEORY.pdfvani374987
 
Tele4653 l9
Tele4653 l9Tele4653 l9
Tele4653 l9Vin Voro
 
DC Lecture Slides 1 - Information Theory.ppt
DC Lecture Slides 1 - Information Theory.pptDC Lecture Slides 1 - Information Theory.ppt
DC Lecture Slides 1 - Information Theory.pptshortstime400
 
Itblock2 150209161919-conversion-gate01
Itblock2 150209161919-conversion-gate01Itblock2 150209161919-conversion-gate01
Itblock2 150209161919-conversion-gate01Xuan Phu Nguyen
 
basicsofcodingtheory-160202182933-converted.pptx
basicsofcodingtheory-160202182933-converted.pptxbasicsofcodingtheory-160202182933-converted.pptx
basicsofcodingtheory-160202182933-converted.pptxupendrabhatt13
 
Lecture1
Lecture1Lecture1
Lecture1ntpc08
 
Lecture7 channel capacity
Lecture7   channel capacityLecture7   channel capacity
Lecture7 channel capacityFrank Katta
 
Huffman&Shannon-multimedia algorithms.ppt
Huffman&Shannon-multimedia algorithms.pptHuffman&Shannon-multimedia algorithms.ppt
Huffman&Shannon-multimedia algorithms.pptPrincessSaro
 
Digital Communication: Information Theory
Digital Communication: Information TheoryDigital Communication: Information Theory
Digital Communication: Information TheoryDr. Sanjay M. Gulhane
 
INFORMATION_THEORY.pdf
INFORMATION_THEORY.pdfINFORMATION_THEORY.pdf
INFORMATION_THEORY.pdftemmy7
 
Static-talk
Static-talkStatic-talk
Static-talkgiunti
 
Information Theory - Introduction
Information Theory  -  IntroductionInformation Theory  -  Introduction
Information Theory - IntroductionBurdwan University
 
Noise infotheory1
Noise infotheory1Noise infotheory1
Noise infotheory1vmspraneeth
 

Similar to Ch6 information theory (20)

Unit 5.pdf
Unit 5.pdfUnit 5.pdf
Unit 5.pdf
 
Unit I DIGITAL COMMUNICATION-INFORMATION THEORY.pdf
Unit I DIGITAL COMMUNICATION-INFORMATION THEORY.pdfUnit I DIGITAL COMMUNICATION-INFORMATION THEORY.pdf
Unit I DIGITAL COMMUNICATION-INFORMATION THEORY.pdf
 
Tele4653 l9
Tele4653 l9Tele4653 l9
Tele4653 l9
 
DC Lecture Slides 1 - Information Theory.ppt
DC Lecture Slides 1 - Information Theory.pptDC Lecture Slides 1 - Information Theory.ppt
DC Lecture Slides 1 - Information Theory.ppt
 
Itblock2 150209161919-conversion-gate01
Itblock2 150209161919-conversion-gate01Itblock2 150209161919-conversion-gate01
Itblock2 150209161919-conversion-gate01
 
basicsofcodingtheory-160202182933-converted.pptx
basicsofcodingtheory-160202182933-converted.pptxbasicsofcodingtheory-160202182933-converted.pptx
basicsofcodingtheory-160202182933-converted.pptx
 
Arithmetic Coding
Arithmetic CodingArithmetic Coding
Arithmetic Coding
 
3320 cyclic codes.ppt
3320 cyclic codes.ppt3320 cyclic codes.ppt
3320 cyclic codes.ppt
 
Lecture1
Lecture1Lecture1
Lecture1
 
Information theory
Information theoryInformation theory
Information theory
 
Information theory
Information theoryInformation theory
Information theory
 
Lecture7 channel capacity
Lecture7   channel capacityLecture7   channel capacity
Lecture7 channel capacity
 
Huffman&Shannon-multimedia algorithms.ppt
Huffman&Shannon-multimedia algorithms.pptHuffman&Shannon-multimedia algorithms.ppt
Huffman&Shannon-multimedia algorithms.ppt
 
information theory
information theoryinformation theory
information theory
 
Digital Communication: Information Theory
Digital Communication: Information TheoryDigital Communication: Information Theory
Digital Communication: Information Theory
 
INFORMATION_THEORY.pdf
INFORMATION_THEORY.pdfINFORMATION_THEORY.pdf
INFORMATION_THEORY.pdf
 
Flat
FlatFlat
Flat
 
Static-talk
Static-talkStatic-talk
Static-talk
 
Information Theory - Introduction
Information Theory  -  IntroductionInformation Theory  -  Introduction
Information Theory - Introduction
 
Noise infotheory1
Noise infotheory1Noise infotheory1
Noise infotheory1
 

Recently uploaded

Call Girls In Bhikaji Cama Place 24/7✡️9711147426✡️ Escorts Service
Call Girls In Bhikaji Cama Place 24/7✡️9711147426✡️ Escorts ServiceCall Girls In Bhikaji Cama Place 24/7✡️9711147426✡️ Escorts Service
Call Girls In Bhikaji Cama Place 24/7✡️9711147426✡️ Escorts Servicejennyeacort
 
Call Girl in Low Price Delhi Punjabi Bagh 9711199012
Call Girl in Low Price Delhi Punjabi Bagh  9711199012Call Girl in Low Price Delhi Punjabi Bagh  9711199012
Call Girl in Low Price Delhi Punjabi Bagh 9711199012sapnasaifi408
 
女王大学硕士毕业证成绩单(加急办理)认证海外毕业证
女王大学硕士毕业证成绩单(加急办理)认证海外毕业证女王大学硕士毕业证成绩单(加急办理)认证海外毕业证
女王大学硕士毕业证成绩单(加急办理)认证海外毕业证obuhobo
 
VIP Russian Call Girls in Amravati Deepika 8250192130 Independent Escort Serv...
VIP Russian Call Girls in Amravati Deepika 8250192130 Independent Escort Serv...VIP Russian Call Girls in Amravati Deepika 8250192130 Independent Escort Serv...
VIP Russian Call Girls in Amravati Deepika 8250192130 Independent Escort Serv...Suhani Kapoor
 
Call Girls Mukherjee Nagar Delhi reach out to us at ☎ 9711199012
Call Girls Mukherjee Nagar Delhi reach out to us at ☎ 9711199012Call Girls Mukherjee Nagar Delhi reach out to us at ☎ 9711199012
Call Girls Mukherjee Nagar Delhi reach out to us at ☎ 9711199012rehmti665
 
Sonam +91-9537192988-Mind-blowing skills and techniques of Ahmedabad Call Girls
Sonam +91-9537192988-Mind-blowing skills and techniques of Ahmedabad Call GirlsSonam +91-9537192988-Mind-blowing skills and techniques of Ahmedabad Call Girls
Sonam +91-9537192988-Mind-blowing skills and techniques of Ahmedabad Call GirlsNiya Khan
 
Delhi Call Girls Patparganj 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Patparganj 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip CallDelhi Call Girls Patparganj 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Patparganj 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Callshivangimorya083
 
Internshala Student Partner 6.0 Jadavpur University Certificate
Internshala Student Partner 6.0 Jadavpur University CertificateInternshala Student Partner 6.0 Jadavpur University Certificate
Internshala Student Partner 6.0 Jadavpur University CertificateSoham Mondal
 
Business Development and Product Strategy for a SME named SARL based in Leban...
Business Development and Product Strategy for a SME named SARL based in Leban...Business Development and Product Strategy for a SME named SARL based in Leban...
Business Development and Product Strategy for a SME named SARL based in Leban...Soham Mondal
 
内布拉斯加大学林肯分校毕业证录取书( 退学 )学位证书硕士
内布拉斯加大学林肯分校毕业证录取书( 退学 )学位证书硕士内布拉斯加大学林肯分校毕业证录取书( 退学 )学位证书硕士
内布拉斯加大学林肯分校毕业证录取书( 退学 )学位证书硕士obuhobo
 
Production Day 1.pptxjvjbvbcbcb bj bvcbj
Production Day 1.pptxjvjbvbcbcb bj bvcbjProduction Day 1.pptxjvjbvbcbcb bj bvcbj
Production Day 1.pptxjvjbvbcbcb bj bvcbjLewisJB
 
VIP Russian Call Girls Amravati Chhaya 8250192130 Independent Escort Service ...
VIP Russian Call Girls Amravati Chhaya 8250192130 Independent Escort Service ...VIP Russian Call Girls Amravati Chhaya 8250192130 Independent Escort Service ...
VIP Russian Call Girls Amravati Chhaya 8250192130 Independent Escort Service ...Suhani Kapoor
 
VIP Call Girl Bhilai Aashi 8250192130 Independent Escort Service Bhilai
VIP Call Girl Bhilai Aashi 8250192130 Independent Escort Service BhilaiVIP Call Girl Bhilai Aashi 8250192130 Independent Escort Service Bhilai
VIP Call Girl Bhilai Aashi 8250192130 Independent Escort Service BhilaiSuhani Kapoor
 
VIP Call Girls in Jamshedpur Aarohi 8250192130 Independent Escort Service Jam...
VIP Call Girls in Jamshedpur Aarohi 8250192130 Independent Escort Service Jam...VIP Call Girls in Jamshedpur Aarohi 8250192130 Independent Escort Service Jam...
VIP Call Girls in Jamshedpur Aarohi 8250192130 Independent Escort Service Jam...Suhani Kapoor
 
PM Job Search Council Info Session - PMI Silver Spring Chapter
PM Job Search Council Info Session - PMI Silver Spring ChapterPM Job Search Council Info Session - PMI Silver Spring Chapter
PM Job Search Council Info Session - PMI Silver Spring ChapterHector Del Castillo, CPM, CPMM
 
Delhi Call Girls South Delhi 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls South Delhi 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip CallDelhi Call Girls South Delhi 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls South Delhi 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Callshivangimorya083
 
Preventing and ending sexual harassment in the workplace.pptx
Preventing and ending sexual harassment in the workplace.pptxPreventing and ending sexual harassment in the workplace.pptx
Preventing and ending sexual harassment in the workplace.pptxGry Tina Tinde
 
Dubai Call Girls Demons O525547819 Call Girls IN DUbai Natural Big Boody
Dubai Call Girls Demons O525547819 Call Girls IN DUbai Natural Big BoodyDubai Call Girls Demons O525547819 Call Girls IN DUbai Natural Big Boody
Dubai Call Girls Demons O525547819 Call Girls IN DUbai Natural Big Boodykojalkojal131
 
Delhi Call Girls Greater Noida 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Greater Noida 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip CallDelhi Call Girls Greater Noida 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Greater Noida 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Callshivangimorya083
 
Employee of the Month - Samsung Semiconductor India Research
Employee of the Month - Samsung Semiconductor India ResearchEmployee of the Month - Samsung Semiconductor India Research
Employee of the Month - Samsung Semiconductor India ResearchSoham Mondal
 

Recently uploaded (20)

Call Girls In Bhikaji Cama Place 24/7✡️9711147426✡️ Escorts Service
Call Girls In Bhikaji Cama Place 24/7✡️9711147426✡️ Escorts ServiceCall Girls In Bhikaji Cama Place 24/7✡️9711147426✡️ Escorts Service
Call Girls In Bhikaji Cama Place 24/7✡️9711147426✡️ Escorts Service
 
Call Girl in Low Price Delhi Punjabi Bagh 9711199012
Call Girl in Low Price Delhi Punjabi Bagh  9711199012Call Girl in Low Price Delhi Punjabi Bagh  9711199012
Call Girl in Low Price Delhi Punjabi Bagh 9711199012
 
女王大学硕士毕业证成绩单(加急办理)认证海外毕业证
女王大学硕士毕业证成绩单(加急办理)认证海外毕业证女王大学硕士毕业证成绩单(加急办理)认证海外毕业证
女王大学硕士毕业证成绩单(加急办理)认证海外毕业证
 
VIP Russian Call Girls in Amravati Deepika 8250192130 Independent Escort Serv...
VIP Russian Call Girls in Amravati Deepika 8250192130 Independent Escort Serv...VIP Russian Call Girls in Amravati Deepika 8250192130 Independent Escort Serv...
VIP Russian Call Girls in Amravati Deepika 8250192130 Independent Escort Serv...
 
Call Girls Mukherjee Nagar Delhi reach out to us at ☎ 9711199012
Call Girls Mukherjee Nagar Delhi reach out to us at ☎ 9711199012Call Girls Mukherjee Nagar Delhi reach out to us at ☎ 9711199012
Call Girls Mukherjee Nagar Delhi reach out to us at ☎ 9711199012
 
Sonam +91-9537192988-Mind-blowing skills and techniques of Ahmedabad Call Girls
Sonam +91-9537192988-Mind-blowing skills and techniques of Ahmedabad Call GirlsSonam +91-9537192988-Mind-blowing skills and techniques of Ahmedabad Call Girls
Sonam +91-9537192988-Mind-blowing skills and techniques of Ahmedabad Call Girls
 
Delhi Call Girls Patparganj 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Patparganj 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip CallDelhi Call Girls Patparganj 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Patparganj 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
 
Internshala Student Partner 6.0 Jadavpur University Certificate
Internshala Student Partner 6.0 Jadavpur University CertificateInternshala Student Partner 6.0 Jadavpur University Certificate
Internshala Student Partner 6.0 Jadavpur University Certificate
 
Business Development and Product Strategy for a SME named SARL based in Leban...
Business Development and Product Strategy for a SME named SARL based in Leban...Business Development and Product Strategy for a SME named SARL based in Leban...
Business Development and Product Strategy for a SME named SARL based in Leban...
 
内布拉斯加大学林肯分校毕业证录取书( 退学 )学位证书硕士
内布拉斯加大学林肯分校毕业证录取书( 退学 )学位证书硕士内布拉斯加大学林肯分校毕业证录取书( 退学 )学位证书硕士
内布拉斯加大学林肯分校毕业证录取书( 退学 )学位证书硕士
 
Production Day 1.pptxjvjbvbcbcb bj bvcbj
Production Day 1.pptxjvjbvbcbcb bj bvcbjProduction Day 1.pptxjvjbvbcbcb bj bvcbj
Production Day 1.pptxjvjbvbcbcb bj bvcbj
 
VIP Russian Call Girls Amravati Chhaya 8250192130 Independent Escort Service ...
VIP Russian Call Girls Amravati Chhaya 8250192130 Independent Escort Service ...VIP Russian Call Girls Amravati Chhaya 8250192130 Independent Escort Service ...
VIP Russian Call Girls Amravati Chhaya 8250192130 Independent Escort Service ...
 
VIP Call Girl Bhilai Aashi 8250192130 Independent Escort Service Bhilai
VIP Call Girl Bhilai Aashi 8250192130 Independent Escort Service BhilaiVIP Call Girl Bhilai Aashi 8250192130 Independent Escort Service Bhilai
VIP Call Girl Bhilai Aashi 8250192130 Independent Escort Service Bhilai
 
VIP Call Girls in Jamshedpur Aarohi 8250192130 Independent Escort Service Jam...
VIP Call Girls in Jamshedpur Aarohi 8250192130 Independent Escort Service Jam...VIP Call Girls in Jamshedpur Aarohi 8250192130 Independent Escort Service Jam...
VIP Call Girls in Jamshedpur Aarohi 8250192130 Independent Escort Service Jam...
 
PM Job Search Council Info Session - PMI Silver Spring Chapter
PM Job Search Council Info Session - PMI Silver Spring ChapterPM Job Search Council Info Session - PMI Silver Spring Chapter
PM Job Search Council Info Session - PMI Silver Spring Chapter
 
Delhi Call Girls South Delhi 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls South Delhi 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip CallDelhi Call Girls South Delhi 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls South Delhi 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
 
Preventing and ending sexual harassment in the workplace.pptx
Preventing and ending sexual harassment in the workplace.pptxPreventing and ending sexual harassment in the workplace.pptx
Preventing and ending sexual harassment in the workplace.pptx
 
Dubai Call Girls Demons O525547819 Call Girls IN DUbai Natural Big Boody
Dubai Call Girls Demons O525547819 Call Girls IN DUbai Natural Big BoodyDubai Call Girls Demons O525547819 Call Girls IN DUbai Natural Big Boody
Dubai Call Girls Demons O525547819 Call Girls IN DUbai Natural Big Boody
 
Delhi Call Girls Greater Noida 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Greater Noida 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip CallDelhi Call Girls Greater Noida 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls Greater Noida 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
 
Employee of the Month - Samsung Semiconductor India Research
Employee of the Month - Samsung Semiconductor India ResearchEmployee of the Month - Samsung Semiconductor India Research
Employee of the Month - Samsung Semiconductor India Research
 

Ch6 information theory

  • 2. 6.1 Mathematical models for information source • Discrete source 1 ] [ P } , , , { 1 2 1       L k k k k L p x X p x x x X  2
  • 3. 6.1 Mathematical models for information source • Discrete memoryless source (DMS) Source outputs are independent random variables • Discrete stationary source – Source outputs are statistically dependent – Stationary: joint probabilities of and are identical for all shifts m – Characterized by joint PDF N i Xi , , 2 , 1 } {   ) , , ( 2 1 m x x x p  n x x x  , , 2 1 m n m m x x x     , , 2 1 3
  • 4. 6.2 Measure of information • Entropy of random variable X – A measure of uncertainty or ambiguity in X – A measure of information that is required by knowledge of X, or information content of X per symbol – Unit: bits (log_2) or nats (log_e) per symbol – We define – Entropy depends on probabilities of X, not values of X        L k k k L x X x X X H x x x X 1 2 1 ] [ P log ] P[ ) ( } , , , {  0 0 log 0  4
  • 5. Shannon’s fundamental paper in 1948 “A Mathematical Theory of Communication” Can we define a quantity which will measure how much information is “produced” by a process? He wants this measure to satisfy: 1) H should be continuous in 2) If all are equal, H should be monotonically increasing with n 3) If a choice can be broken down into two successive choices, the original H should be the weighted sum of the individual values of H ) , , , ( 2 1 n p p p H  i p i p 5
  • 6. Shannon’s fundamental paper in 1948 “A Mathematical Theory of Communication” ) 3 1 , 3 2 ( 2 1 ) 2 1 , 2 1 ( ) 6 1 , 3 1 , 2 1 ( H H H   6
  • 7. Shannon’s fundamental paper in 1948 “A Mathematical Theory of Communication” The only H satisfying the three assumptions is of the form: K is a positive constant.     n i i i p p K H 1 log 7
  • 8. Binary entropy function ) 1 log( ) 1 ( log ) ( p p p p X H      H(p) Probability p H=0: no uncertainty H=1: most uncertainty 1 bit for binary information 8
  • 9. Mutual information • Two discrete random variables: X and Y • Measures the information knowing either variables provides about the other • What if X and Y are fully independent or dependent?             ] [ P ] [ P ] , [ P log ] , [ P ] [ P ] | [ P log ] , [ P ) , ( ] , [ P ) ; ( y x y x y Y x X x y x y Y x X y x I y Y x X Y X I 9
  • 12. Joint and conditional entropy • Joint entropy • Conditional entropy of Y given X 12 ] , [ P log ] , [ P ) , ( y Y x X y Y x X Y X H                  ] | [ log ] , [ P ) | ( ] [ P ) | ( x X y Y P y Y x X x X Y H x X X Y H
  • 13. Joint and conditional entropy • Chain rule for entropies: H(X₁, X₂, …, Xₙ) = H(X₁) + H(X₂ | X₁) + H(X₃ | X₁, X₂) + … + H(Xₙ | X₁, …, Xₙ₋₁) • Therefore H(X₁, X₂, …, Xₙ) ≤ Σ_{i=1…n} H(Xᵢ) • If the Xᵢ are iid: H(X₁, X₂, …, Xₙ) = nH(X) 13
  • 14. 6.3 Lossless coding of information source • Source sequence of length n: x = [X₁, X₂, …, Xₙ], X ∈ {x₁, x₂, …, x_L}, pᵢ = P[X = xᵢ] • n is assumed to be large • Without any source coding we need log₂ L bits per symbol 14
  • 15. Lossless source coding • Typical sequence x – The number of occurrences of xᵢ is roughly npᵢ – When n → ∞, any x will be “typical”: log P[x] = log Π_{i=1…L} pᵢ^{npᵢ} = Σ_{i=1…L} npᵢ log pᵢ = −nH(X), so P[x] ≈ 2^{−nH(X)} – All typical sequences have the same probability, and P[x is typical] → 1 as n → ∞ 15
  • 16. Lossless source coding • Typical sequence • Since typical sequences are almost certain to occur, for the source output it is sufficient to consider only these typical sequences • Number of typical sequences ≈ 1 / P[x] = 2^{nH(X)} • How many bits per symbol do we need now? R = nH(X)/n = H(X) ≤ log₂ L 16
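The approximation P[x] ≈ 2^(−nH(X)) behind these two slides can be illustrated empirically for a binary DMS; a sketch with hypothetical parameters (p = 0.2, n = 10000):

```python
import math
import random

random.seed(0)
p = 0.2  # P[X = 1] for a hypothetical binary DMS
H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Draw one long source sequence and evaluate its log-probability.
n = 10000
seq = [1 if random.random() < p else 0 for _ in range(n)]
logP = sum(math.log2(p) if s == 1 else math.log2(1 - p) for s in seq)

# For large n, -(1/n) log2 P[x] concentrates around H(X).
print(abs(-logP / n - H) < 0.05)
```

This is exactly the asymptotic equipartition idea: almost every long sequence the source emits has probability close to 2^(−nH(X)).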
  • 17. Lossless source coding Shannon’s First Theorem – Lossless Source Coding Let X denote a discrete memoryless source. There exists a lossless source code at rate R if R ≥ H(X) bits per transmission 17
  • 18. Lossless source coding For a discrete stationary source: R ≥ H∞(X) = lim_{k→∞} (1/k) H(X₁, X₂, …, X_k) = lim_{k→∞} H(X_k | X_{k−1}, X_{k−2}, …, X₁) 18
  • 19. Lossless source coding algorithms • Variable-length coding algorithm – Symbols with higher probability are assigned shorter code words – E.g. Huffman coding – R = min_{ {n_k} } Σ_{k=1…L} n_k P(x_k) • Fixed-length coding algorithm – E.g. Lempel-Ziv coding 19
  • 20. Huffman coding algorithm Example: source symbols x₁ … x₇ with probabilities P(x₁) … P(x₇) Code words: x₁ → 00, x₂ → 01, x₃ → 10, x₄ → 110, x₅ → 1110, x₆ → 11110, x₇ → 11111 H(X) = 2.11, R = 2.21 bits per symbol 20
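The probabilities P(x₁)…P(x₇) are not visible in the extracted slide text; the distribution below is an assumption, chosen because it reproduces the quoted H(X) = 2.11 and R = 2.21. A minimal Huffman sketch that computes only the code word lengths:

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    """Code word lengths of a binary Huffman code for the given pmf."""
    tiebreak = count()  # avoids comparing lists when probabilities tie
    heap = [(p, next(tiebreak), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1  # every merge adds one bit to each member symbol
        heapq.heappush(heap, (p1 + p2, next(tiebreak), s1 + s2))
    return lengths

probs = [0.35, 0.30, 0.20, 0.10, 0.04, 0.005, 0.005]  # assumed, not from the slide text
lengths = huffman_lengths(probs)
R = sum(p * L for p, L in zip(probs, lengths))
print(lengths)        # -> [2, 2, 2, 3, 4, 5, 5], matching the code word table
print(round(R, 2))    # -> 2.21
```

The lengths match the slide's code words (00, 01, 10, 110, 1110, 11110, 11111), and R = 2.21 > H(X) = 2.11 as the source coding theorem requires.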
  • 21. 6.5 Channel models and channel capacity • Channel models: input sequence x = (x₁, x₂, …, xₙ), output sequence y = (y₁, y₂, …, yₙ) • A channel is memoryless if P[y | x] = Π_{i=1…n} P[yᵢ | xᵢ] 21
  • 22. Binary symmetric channel (BSC) model Source data → Channel encoder → Binary modulator → Channel → Demodulator and detector → Channel decoder → Output data The blocks from modulator to detector form a composite discrete-input, discrete-output channel 22
  • 23. Binary symmetric channel (BSC) model Input 0 → Output 0 and Input 1 → Output 1, each with probability 1 − p; crossovers occur with probability p: P[Y=0 | X=1] = P[Y=1 | X=0] = p P[Y=1 | X=1] = P[Y=0 | X=0] = 1 − p 23
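A quick simulation of these transition probabilities (the crossover probability and sequence length are illustrative):

```python
import random

def bsc(bits, p, rng):
    """Pass a bit sequence through a BSC with crossover probability p."""
    return [b ^ (1 if rng.random() < p else 0) for b in bits]

rng = random.Random(1)
n = 100_000
tx = [rng.randint(0, 1) for _ in range(n)]
rx = bsc(tx, 0.1, rng)

# The empirical bit-flip rate should be close to p = 0.1.
error_rate = sum(t != r for t, r in zip(tx, rx)) / n
print(abs(error_rate - 0.1) < 0.01)
```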
  • 24. Discrete memoryless channel (DMC) Inputs {X}: x₀, x₁, …, x_{M−1}; outputs {Y}: y₀, y₁, …, y_{Q−1} The transition probabilities P[y | x] can be arranged in a matrix 24
  • 25. Discrete-input continuous-output channel Y = X + N If N is additive white Gaussian noise: p(y | x) = (1/√(2πσ²)) e^{−(y−x)²/(2σ²)} p(y₁, y₂, …, yₙ | x₁, x₂, …, xₙ) = Π_{i=1…n} p(yᵢ | xᵢ) 25
  • 26. Discrete-time AWGN channel yᵢ = xᵢ + nᵢ • Power constraint: E[X²] ≤ P • For an input sequence x = (x₁, x₂, …, xₙ) with large n: (1/n) ‖x‖² = (1/n) Σ_{i=1…n} xᵢ² ≤ P 26
  • 27. AWGN waveform channel • Assume the channel has bandwidth W, with frequency response C(f) = 1 for f ∈ [−W, +W] Source data → Channel encoder → Modulator → Physical channel → Demodulator and detector → Channel decoder → Output data Input waveform x(t), output waveform y(t) = x(t) + n(t) 27
  • 28. AWGN waveform channel • Power constraint: E[X(t)²] = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x²(t) dt ≤ P 28
  • 29. AWGN waveform channel • How to define probabilities that characterize the channel? Expand over an orthonormal basis {φⱼ(t), j = 1, 2, …, 2WT}: x(t) = Σⱼ xⱼ φⱼ(t), n(t) = Σⱼ nⱼ φⱼ(t), y(t) = Σⱼ yⱼ φⱼ(t), so that yⱼ = xⱼ + nⱼ • Equivalent to 2W uses per second of a discrete-time channel 29
  • 30. AWGN waveform channel • The power constraint becomes lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x²(t) dt = lim_{T→∞} (1/T) Σ_{j=1…2WT} E[Xⱼ²] = 2W E[X²] ≤ P • Hence E[X²] ≤ P / (2W) 30
  • 31. Channel capacity • After source coding, we have a binary sequence of length n • The channel causes bit errors with probability p • As n → ∞, the number of sequences that have np errors is C(n, np) = n! / ((np)! (n(1−p))!) ≈ 2^{nH_b(p)} 31
  • 32. Channel capacity • To reduce errors, we use a subset of all possible sequences: M = 2ⁿ / 2^{nH_b(p)} = 2^{n(1 − H_b(p))} • Information rate [bits per transmission]: R = (1/n) log₂ M = 1 − H_b(p) — the capacity of the binary channel 32
  • 33. Channel capacity We use 2^m different binary sequences of length m for transmission; 2ⁿ different binary sequences of length n carry the information 0 ≤ R = 1 − H_b(p) ≤ 1 We cannot transmit more than 1 bit per channel use Channel encoder: adds redundancy 33
  • 34. Channel capacity • Capacity for an arbitrary discrete memoryless channel: C = max_p I(X; Y) • Maximize the mutual information between input and output over all input distributions p = (p₁, p₂, …) • Shannon’s Second Theorem – noisy channel coding – If R < C, reliable communication is possible – If R > C, reliable communication is impossible 34
  • 35. Channel capacity For the binary symmetric channel, capacity is achieved with P[X=1] = P[X=0] = 1/2: C = p log₂(2p) + (1 − p) log₂(2(1 − p)) = 1 − H_b(p) 35
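A small sketch of C = 1 − H_b(p) and its extreme cases:

```python
import math

def bsc_capacity(p):
    """BSC capacity C = 1 - Hb(p) in bits per channel use."""
    if p in (0.0, 1.0):
        return 1.0  # a deterministic channel (even an inverting one) is perfect
    return 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

print(bsc_capacity(0.0))  # -> 1.0  (noiseless: one full bit per use)
print(bsc_capacity(0.5))  # -> 0.0  (output independent of input: useless)
```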
  • 36. Channel capacity Discrete-time AWGN channel with an input power constraint: Y = X + N, E[X²] ≤ P For large n: (1/n) E[‖y‖²] = E[X²] + E[N²] ≤ P + σ² (1/n) ‖y − x‖² = (1/n) ‖n‖² → σ² 36
  • 37. Channel capacity Discrete-time AWGN channel with an input power constraint: Y = X + N, E[X²] ≤ P Maximum number of symbols to transmit (sphere packing): M = ( √(n(P + σ²)) / √(nσ²) )ⁿ = (1 + P/σ²)^{n/2} Transmission rate: R = (1/n) log₂ M = (1/2) log₂(1 + P/σ²) The same result can be obtained by directly maximizing I(X; Y) subject to the power constraint 37
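A one-line check of R = (1/2) log₂(1 + P/σ²) at convenient signal-to-noise ratios:

```python
import math

def awgn_capacity(snr):
    """Discrete-time AWGN capacity, C = 0.5*log2(1 + P/sigma^2) bits per channel use."""
    return 0.5 * math.log2(1 + snr)

print(awgn_capacity(1))   # -> 0.5  (signal power equal to noise power)
print(awgn_capacity(15))  # -> 2.0
print(awgn_capacity(0))   # -> 0.0  (no signal power, no information)
```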
  • 38. Channel capacity Band-limited waveform AWGN channel with input power constraint – Equivalent to 2W uses per second of the discrete-time channel C = (1/2) log₂ (1 + (P/2W)/(N₀/2)) = (1/2) log₂ (1 + P/(N₀W)) bits per channel use C = 2W · (1/2) log₂ (1 + P/(N₀W)) = W log₂ (1 + P/(N₀W)) bits/s 38
  • 40. Channel capacity • Bandwidth efficiency: r = R/W = log₂(1 + P/(N₀W)) • Relation of bandwidth efficiency and power efficiency: with energy per bit ε_b = P·T / log₂ M = P/R, r = log₂(1 + R ε_b/(N₀ W)) = log₂(1 + r ε_b/N₀), so ε_b/N₀ = (2^r − 1)/r • As r → 0: ε_b/N₀ → ln 2 = −1.6 dB 40
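The relation ε_b/N₀ = (2^r − 1)/r and its −1.6 dB limit can be checked numerically; a sketch:

```python
import math

def ebn0_required(r):
    """Minimum eps_b/N0 for bandwidth efficiency r = R/W, from r = log2(1 + r*eps_b/N0)."""
    return (2**r - 1) / r

# The requirement grows with r; as r -> 0 it approaches ln 2 (about -1.6 dB),
# the minimum SNR per bit for reliable communication on any AWGN channel.
for r in (2.0, 1.0, 0.001):
    print(r, 10 * math.log10(ebn0_required(r)))
```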

Editor's Notes

  1. The model relates to probability: if everything is deterministic, no information is transmitted
  2. If the information source is deterministic, then for one value of X the probability equals 1 and the others equal 0; the entropy of X is 0 and it contains no information
  3. On 3), give example on board
  4. With equal probabilities we reach the maximum – 1 bit of information per symbol
  5. Fully independent: I=0; fully dependent: I=H(X)=H(Y)
  6. Intuitively, if entropy H(X) is regarded as a measure of uncertainty about a random variable, then H(X|Y) is a measure of what Y does not say about X. This is "the amount of uncertainty remaining about X after Y is known".
  7. This gives a fundamental limit on source coding, and entropy plays an important role in lossless compression of information sources
  8. Huffman coding is optimal in the sense that the average number of bits representing a source symbol is a minimum, subject to the code words satisfying the prefix condition
  9. Channel models are useful when we design codes
  10. A more general case of the BSC: M-ary modulation
  11. The output of the detector is unquantized. Probability density function…
  12. We can use the coefficients to characterize the channel...
  13. symmetric
  14. Minimum SNR required for any communication system