Noise, Information Theory,
and Entropy
CS414 – Spring 2007
By Roger Cheng
(Huffman coding slides courtesy
of Brian Bailey)
Why study noise?
It’s present in all systems of interest, and we have to deal with it
By knowing its characteristics, we can fight it better
Create models to evaluate analytically
Communication system abstraction
Sender side: Information source → Encoder → Modulator → Channel
Receiver side: Channel → Demodulator → Decoder → Output signal
The additive noise channel
Transmitted signal s(t) is corrupted by noise source n(t), and the resulting received signal is
r(t) = s(t) + n(t)
Noise could result from many sources, including electronic components and transmission interference
Random processes
A random variable is the result of a single measurement
A random process is an indexed collection of random variables, or equivalently a non-deterministic signal that can be described by a probability distribution
Noise can be modeled as a random process
WGN (White Gaussian Noise)
Properties
 At each time instant t = t0, the value of n(t) is normally distributed with mean 0 and variance σ² (i.e., E[n(t0)] = 0, E[n(t0)²] = σ²)
 At any two different time instants, the values of n(t) are uncorrelated (i.e., E[n(t0)n(tk)] = 0)
 The power spectral density of n(t) has equal power in all frequency bands
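The two moment properties above are easy to check numerically; a minimal sketch using NumPy (the sample count and σ = 2 are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0
# One million independent samples of a discrete-time white Gaussian noise process
n = rng.normal(0.0, sigma, size=1_000_000)

print(n.mean())                 # ≈ 0: zero mean
print(np.mean(n ** 2))          # ≈ sigma^2 = 4: power equals the variance
print(np.mean(n[:-1] * n[1:]))  # ≈ 0: values at distinct instants are uncorrelated
```

The empirical averages converge to the stated expectations as the sample count grows.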
WGN continued
When an additive noise channel has a white Gaussian noise source, we call it an AWGN channel
Most frequently used model in communications
Reasons why we use this model
 It’s easy to understand and compute
 It applies to a broad class of physical channels
Signal energy and power
Energy is defined as
ε_x = ∫ from -∞ to ∞ of |x(t)|² dt
Power is defined as
P_x = lim (T→∞) of (1/T) ∫ from -T/2 to T/2 of |x(t)|² dt
Most signals have either finite energy and zero power, or infinite energy and finite power
Noise power is hard to compute in the time domain
 Power of WGN is its variance σ²
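For a sampled signal, the power limit above becomes a plain average of |x[k]|² over the window; a small sketch (the amplitude-3 sine and window length are arbitrary illustrative choices):

```python
import numpy as np

# Discrete-time approximation of average power over a window of N samples:
# P ≈ (1/N) * sum |x[k]|^2
k = np.arange(100_000)
x = 3.0 * np.sin(2 * np.pi * k / 50)  # amplitude-3 sine: infinite energy, finite power

power = np.mean(np.abs(x) ** 2)
print(power)  # A^2 / 2 = 4.5 for a sine of amplitude A = 3
```

The window spans whole periods here, so the average matches the analytic value A²/2 exactly.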
Signal to Noise Ratio (SNR)
Defined as the ratio of signal power to the noise power corrupting the signal
Usually more practical to measure SNR on a dB scale: SNR(dB) = 10 log₁₀(P_signal / P_noise)
Obviously, want as high an SNR as possible
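A small sketch of the dB computation for a sampled signal (the unit sine and the 0.1 noise level are arbitrary illustrative choices):

```python
import numpy as np

signal = np.sin(2 * np.pi * np.arange(1000) / 100)          # unit-amplitude sine
noise = 0.1 * np.random.default_rng(1).standard_normal(1000)

p_signal = np.mean(signal ** 2)  # = 0.5 for a unit sine over whole periods
p_noise = np.mean(noise ** 2)    # ≈ 0.01
snr_db = 10 * np.log10(p_signal / p_noise)
print(snr_db)  # ≈ 17 dB
```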
SNR example - Audio
Original sound file is CD quality (16 bit, 44.1 kHz sampling rate)
(Audio clips: original, 40 dB SNR, 20 dB SNR, 10 dB SNR)
Analog vs. Digital
Analog system
 Any amount of noise will create distortion at the output
Digital system
 A relatively small amount of noise will cause no harm at all
 Too much noise will make decoding of the received signal impossible
Both - Goal is to limit effects of noise to a manageable/satisfactory amount
Information theory and entropy
Information theory tries to solve the problem of communicating as much data as possible over a noisy channel
Measure of data is entropy
Claude Shannon first demonstrated that reliable communication over a noisy channel is possible (jump-started the digital age)
Entropy definitions
Shannon entropy: H(X) = -Σ p(x) log p(x)
Binary entropy formula: H(p) = -p log₂ p - (1-p) log₂ (1-p)
Differential entropy: h(X) = -∫ f(x) log f(x) dx
Properties of entropy
Can be defined as the expectation of -log p(x) (i.e., H(X) = E[-log p(X)])
Is not a function of a variable’s values, but a function of the variable’s probabilities
Usually measured in “bits” (using logs of base 2) or “nats” (using logs of base e)
Maximized when all values are equally likely (i.e., uniform distribution)
Equal to 0 when only one value is possible
 Cannot be negative
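These properties are easy to verify with a small helper function (a sketch; the probabilities passed in are assumed to sum to 1):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum p * log2(p), with 0 * log 0 taken as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0: fair coin, maximal for two outcomes
print(entropy([1.0]))        # 0.0: only one value possible
print(entropy([0.25] * 4))   # 2.0: uniform over four values
print(entropy([0.9, 0.1]))   # < 1.0: skewed distribution, never negative
```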
Joint and conditional entropy
Joint entropy is the entropy of the pairing (X,Y)
Conditional entropy is the entropy of X if the value of Y is known
Relationship between the two (chain rule): H(X,Y) = H(Y) + H(X|Y)
Mutual information
Mutual information is how much information about X can be obtained by observing Y: I(X;Y) = H(X) - H(X|Y)
Mathematical model of a channel
Assume that our input to the channel is X, and the output is Y
Then the characteristics of the channel can be defined by its conditional probability distribution p(y|x)
Channel capacity and rate
Channel capacity is defined as the maximum possible value of the mutual information
 We choose the input distribution p(x) that maximizes I(X;Y), so C = max over p(x) of I(X;Y)
For any rate R < C, we can transmit information with arbitrarily small probability of error
Binary symmetric channel
Correct bit transmitted with probability 1-p
Wrong bit transmitted with probability p
 Sometimes called “cross-over probability”
Capacity C = 1 - H(p,1-p)
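Here H(p,1-p) is the binary entropy of the crossover probability, so the capacity can be computed directly (a small sketch):

```python
import math

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    if p in (0.0, 1.0):
        return 1.0  # deterministic channel: no uncertainty about the input
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)  # binary entropy H(p, 1-p)
    return 1 - h

print(bsc_capacity(0.0))   # 1.0: noiseless, one full bit per use
print(bsc_capacity(0.5))   # 0.0: output independent of input, channel useless
print(bsc_capacity(0.11))  # ≈ 0.5 bit per channel use
```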
Binary erasure channel
Correct bit transmitted with probability 1-p
“Erasure” transmitted with probability p
Capacity C = 1 - p
Coding theory
Information theory only gives us an upper bound on communication rate
Need to use coding theory to find a practical method to achieve a high rate
Two types
 Source coding - compresses source data to a smaller size
 Channel coding - adds redundancy bits to make transmission across a noisy channel more robust
Source-channel separation theorem
Shannon showed that when dealing with one transmitter and one receiver, we can break up source coding and channel coding into separate steps without loss of optimality
Does not apply when there are multiple transmitters and/or receivers
 Need to use network information theory principles in those cases
Huffman Encoding
Use probability distribution to determine how many bits to use for each symbol
 higher-frequency symbols assigned shorter codes
 entropy-based, block-variable coding scheme
Huffman Encoding
Produces a code which uses a minimum number of bits to represent each symbol
 no code that assigns a whole code word per symbol can represent the same sequence in fewer bits on average
 optimal among such per-symbol codes, but this may differ slightly from the theoretical lower limit (the entropy)
Build Huffman tree to assign codes
Informal Problem Description
Given a set of symbols from an alphabet and their probability distribution
 assumes distribution is known and stable
Find a prefix-free binary code with minimum weighted path length
 prefix-free means no codeword is a prefix of any other codeword
Huffman Algorithm
Construct a binary tree of codes
 leaf nodes represent symbols to encode
 interior nodes represent cumulative probability
 edges assigned 0 or 1 output code
Construct the tree bottom-up
 connect the two nodes with the lowest probability until no more nodes to connect
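The bottom-up merging can be sketched with a priority queue; this is one common implementation (not necessarily the in-class construction, and tie-breaking order is an arbitrary choice):

```python
import heapq

def huffman_codes(freqs):
    """Build a Huffman code by repeatedly merging the two lowest-probability nodes.
    freqs: dict symbol -> probability. Returns dict symbol -> bit string."""
    # Heap entries: (probability, tiebreak counter, {symbol: code-so-far})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # lowest probability
        p2, _, c2 = heapq.heappop(heap)  # second lowest
        # Merging prepends one more bit to every code under each subtree
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

codes = huffman_codes({"A": 0.25, "B": 0.30, "C": 0.12, "D": 0.15, "E": 0.18})
print({s: len(c) for s, c in sorted(codes.items())})  # C and D get 3 bits; A, B, E get 2
```

The exact bit patterns depend on the arbitrary 0/1 edge assignment, but the code lengths match the worked example below.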
Huffman Example
Construct the Huffman coding tree (in class)

Symbol (S)   P(S)
A            0.25
B            0.30
C            0.12
D            0.15
E            0.18
Characteristics of Solution
Lowest probability symbol is always furthest from root
Assignment of 0/1 to children edges arbitrary
 other solutions possible; lengths remain the same
 If two nodes have equal probability, can select any two
Notes
 prefix-free code
 O(n log n) complexity

Symbol (S)   Code
A            11
B            00
C            010
D            011
E            10
Example Encoding/Decoding
Encode “BEAD” ⇒ 001011011
Decode “0101100” ⇒ 010|11|00 = “CAB”

Symbol (S)   Code
A            11
B            00
C            010
D            011
E            10
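Using the code table above, encoding and decoding can be sketched as follows; the prefix-free property is what makes the greedy left-to-right decode unambiguous:

```python
code = {"A": "11", "B": "00", "C": "010", "D": "011", "E": "10"}

def encode(text):
    return "".join(code[ch] for ch in text)

def decode(bits):
    inverse = {v: k for k, v in code.items()}
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in inverse:       # prefix-free: the first match must be a full symbol
            out.append(inverse[buf])
            buf = ""
    return "".join(out)

print(encode("BEAD"))     # 001011011
print(decode("0101100"))  # CAB
```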
Entropy (Theoretical Limit)
H = -Σ (i = 1 to N) p(sᵢ) log₂ p(sᵢ)

H = -.25 log₂ .25 - .30 log₂ .30 - .12 log₂ .12 - .15 log₂ .15 - .18 log₂ .18
H = 2.24 bits

Symbol   P(S)   Code
A        0.25   11
B        0.30   00
C        0.12   010
D        0.15   011
E        0.18   10
Average Codeword Length
L = Σ (i = 1 to N) p(sᵢ) · codelength(sᵢ)

L = .25(2) + .30(2) + .12(3) + .15(3) + .18(2)
L = 2.27 bits

Symbol   P(S)   Code
A        0.25   11
B        0.30   00
C        0.12   010
D        0.15   011
E        0.18   10
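Both the entropy and the average codeword length for this table can be checked numerically:

```python
import math

# (probability, code word) for each symbol, from the table above
table = {"A": (0.25, "11"), "B": (0.30, "00"), "C": (0.12, "010"),
         "D": (0.15, "011"), "E": (0.18, "10")}

H = -sum(p * math.log2(p) for p, _ in table.values())  # theoretical limit
L = sum(p * len(code) for p, code in table.values())   # achieved average length

print(round(H, 2), round(L, 2))  # 2.24 2.27 — the code is within 1 bit of entropy
```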
Code Length Relative to Entropy
Huffman reaches the entropy limit when all probabilities are negative powers of 2
 i.e., 1/2, 1/4, 1/8, 1/16, etc.
H <= Code Length <= H + 1
where H = -Σ p(sᵢ) log₂ p(sᵢ) and L = Σ p(sᵢ) · codelength(sᵢ)
Example
H = -.01 log₂ .01 - .99 log₂ .99 = .08
L = .01(1) + .99(1) = 1

Symbol   P(S)   Code
A        0.01   1
B        0.99   0
Limitations
Diverges from lower limit when probability of a particular symbol becomes high
 always uses an integral number of bits
Must send code book with the data
 lowers overall efficiency
Must determine frequency distribution
 must remain stable over the data set