Information Theory
MSU, MARAWI CITY
EEE DEPARTMENT
Shannon Theorem
“If the rate of information from the
source does not exceed the capacity of
a communication channel, then there
exists a coding technique such that the information can be transmitted over the channel with an arbitrarily small
frequency of errors, despite the
presence of noise.”
Three Basic Concepts:
 Measure of source information
 Information capacity of a channel
 Coding for information transfer
 Information Source
 Measuring Information
 Entropy
 Source Coding
 Designing Codes
Information Source
 Four characteristics of an information source
 The no. of symbols, n
 The symbols, S1, S2, …, Sn
 The probability of occurrence of each symbol,
P(S1), P(S2), …, P(Sn)
 The correlation between successive symbols
 Memoryless source: a source in which each symbol is independent of the previous symbols
 A message: a stream of symbols from the sender to the receiver
Examples …
 Ex. 1.: A source that sends binary
information (streams of 0s and 1s)
with each symbol having equal
probability and no correlation can be
modeled as a memoryless source
 n = 2
 Symbols: 0 and 1
 Probabilities: P(0) = ½ and P(1) = ½
Measuring Information
 To measure the information contained in a
message
 How much information does a message
carry from the sender to the receiver?
 Examples
 Ex.2.: Imagine a person sitting in a room.
Looking out the window, she can clearly see
that the sun is shining. If at this moment she
receives a call from a neighbor saying “It is now
daytime”, does this message contain any
information?
 Ex. 3. : A person has bought a lottery ticket. A
friend calls to tell her that she has won first
prize. Does this message contain any
information?
Examples …
 Ex.2. It does not; the message contains no information. Why? Because she is already certain that it is daytime.
 Ex. 3. It does. The message contains a lot of
information, because the probability of winning
first prize is very small
 Conclusion
 The information content of a message is inversely related to the probability of occurrence of that message.
 If a message is very probable, it carries very little information. If it is very improbable, it carries a lot of information
Symbol Information
 To measure the information contained in a message, we need to measure the information contained in each symbol
 I(s) = log2 [1/P(s)] bits
 The "bit" here is a unit of information, different from the binary digit (bit) used to represent a 0 or 1
 Examples
 Ex.5. Find the information content of each symbol
when the source is binary (sending only 0 or 1 with
equal probability)
 Ex. 6. Find the information content of each symbol
when the source is sending four symbols with prob.
P(S1) = 1/8, P(S2) = 1/8, P(S3) = ¼ ; and P(S4) =
1/2
Examples …
 Ex. 5.
 P(0) = P(1) = ½ , the information content of
each symbol is
I(0) = log2 [1/P(0)] = log2 2 = 1 bit
I(1) = log2 [1/P(1)] = log2 2 = 1 bit
 Ex.6.
I(S1) = log2 [1/P(S1)] = log2 8 = 3 bits
I(S2) = log2 [1/P(S2)] = log2 8 = 3 bits
I(S3) = log2 [1/P(S3)] = log2 4 = 2 bits
I(S4) = log2 [1/P(S4)] = log2 2 = 1 bit
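As a quick check, a minimal Python sketch (not part of the original slides; the function name is illustrative) reproduces these values:

```python
from math import log2

# Illustrative helper (not from the slides): I(s) = log2(1/P(s))
def symbol_information(p):
    return log2(1 / p)

print(symbol_information(1/2))   # 1.0 bit  (Ex.5; also S4 in Ex.6)
print(symbol_information(1/4))   # 2.0 bits (S3)
print(symbol_information(1/8))   # 3.0 bits (S1 and S2)
```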
Examples …
 Ex.6.
 The symbols S1 and S2 are least probable.
At the receiver each carries more
information (3 bits) than S3 or S4. The
symbol S3 is less probable than S4, so S3
carries more information than S4
 Defining the relationships:
 If P(Si) = P(Sj), then I(Si) = I(Sj)
 If P(Si) < P(Sj), then I(Si) > I(Sj)
 If P(Si) = 1, then I(Si) = 0
Message Information
 If the message comes from a memoryless
source, each symbol is independent and the
probability of receiving a message with
symbols Si, Sj, Sk, … (where i, j, and k can
be the same) is:
 P(message) = P(Si)P(Sj)P(Sk) …
 Then the information content carried by the
message is
I(message) = log2 [1/P(message)]
           = log2 [1/(P(Si) P(Sj) P(Sk) …)]
           = log2 [1/P(Si)] + log2 [1/P(Sj)] + log2 [1/P(Sk)] + …
           = I(Si) + I(Sj) + I(Sk) + …
Example …
 Ex.7.
 An equal-probability binary source sends an 8-bit message. What is the amount of information received?
 The information content of the message is
 I(message) = I(first bit) + I(second bit) + … + I(eighth bit) = 8 bits
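A short sketch (illustrative, not from the slides) of the additivity used in Ex.7, assuming a memoryless source:

```python
from math import log2

# For a memoryless source, I(message) is the sum of the symbol informations.
def message_information(symbol_probs):
    return sum(log2(1 / p) for p in symbol_probs)

print(message_information([0.5] * 8))   # 8.0 bits, as in Ex.7
```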
Entropy
 Entropy (H) of the source
 The average amount of information
contained in the symbols
 H(Source) = P(S1)xI(S1) + P(S2)xI(S2) + …
+ P(Sn)xI(Sn)
 Example
 What is the entropy of an equal-probability
binary source?
 H(Source) = P(0)xI(0) + P(1)xI(1) = 0.5x1
+ 0.5x1 = 1 bit
 1 bit per symbol
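The entropy formula can be checked with a small sketch (hypothetical helper name, not from the slides):

```python
from math import log2

# H(Source) = sum of P(Si) * I(Si), skipping zero-probability symbols
def entropy(probs):
    return sum(p * log2(1 / p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))             # 1.0 bit per symbol (equal-probability binary source)
print(entropy([1/2, 1/4, 1/8, 1/8]))   # 1.75 bits (the four-symbol source of Ex.6)
```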
Maximum Entropy
 For a particular source with n symbols,
maximum entropy can be achieved only if all
the probabilities are the same. The value of
this max is
 In other words, the entropy of every source
has an upper limit defined by
 H(Source)≤log2n
H(Source)max = Σ P(Si) log2 [1/P(Si)] = n x (1/n) x log2 n = log2 n, when P(Si) = 1/n for every i
Example …
 What is the maximum entropy of a
binary source?
 Hmax = log2 2 = 1 bit
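A quick numerical check of the upper bound (illustrative snippet, not from the slides):

```python
from math import log2

# Entropy of a skewed binary source stays below the log2(n) bound,
# which is reached only when all symbols are equally probable.
def entropy(probs):
    return sum(p * log2(1 / p) for p in probs if p > 0)

print(log2(2))              # 1.0 = upper bound for n = 2
print(entropy([0.5, 0.5]))  # 1.0, bound reached
print(entropy([0.9, 0.1]))  # ~0.47, below the bound
```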
Source Coding
 To send a message from a source to a
destination, a symbol is normally coded
into a sequence of binary digits.
 The result is called a code word
 A code is a mapping from a set of symbols into a set of code words.
 For example, ASCII is a mapping of a set of 128 symbols into a set of 7-bit code words
 A ………………………..> 1000001
 B …………………………> 1000010
 Set of symbols ….> Set of binary streams
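The 7-bit ASCII mapping quoted above can be verified with a one-line check (illustrative, not from the slides):

```python
# 7-bit ASCII code words for 'A' (65) and 'B' (66)
print(format(ord("A"), "07b"), format(ord("B"), "07b"))  # 1000001 1000010
```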
Fixed- and Variable-Length Code
 A code can be designed with all the code words the same length (fixed-length code) or with different lengths (variable-length code)
 Examples
 A code with fixed-length code words:
 S1 -> 00; S2 -> 01; S3 -> 10; S4 -> 11
 A code with variable-length code words:
 S1 -> 0; S2 -> 10; S3 -> 11; S4 -> 110
Distinct Codes
 Each code word is different from every
other code word
 Example
 S1 -> 0; S2 -> 10; S3 -> 11; S4 -> 110
 Uniquely Decodable Codes
 A distinct code is uniquely decodable if each
code word can be decoded when inserted
between other code words.
 Example
 Not uniquely decodable:
 S1 -> 0; S2 -> 1; S3 -> 00; S4 -> 10, because
 0010 -> S3 S4 or S3 S2 S1 or S1 S1 S4
Instantaneous Codes
 A uniquely decodable code:
 S1 -> 0; S2 -> 01; S3 -> 011; S4 -> 0111
 A 0 uniquely defines the beginning of a code
word
 A uniquely decodable code is
instantaneously decodable if no code
word is the prefix of any other code
word
Examples …
 A code word and its prefixes (note that each
code word is also a prefix of itself)
 S -> 01001 ; prefixes: 0, 01, 010, 0100, 01001
 A uniquely decodable code that is instantaneously
decodable
 S1 -> 0; S2 -> 10; S3 -> 110; S4 -> 111
 When the receiver receives a 0, it immediately knows that it is S1; no other code word starts with 0. When it receives a 10, it immediately knows that it is S2; no other code word starts with 10, and so on
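The prefix condition is easy to test mechanically; a small sketch (illustrative helper name, not from the slides):

```python
# A code is instantaneous (prefix-free) iff no code word is a prefix of another.
def is_instantaneous(words):
    return not any(a != b and b.startswith(a) for a in words for b in words)

print(is_instantaneous(["0", "10", "110", "111"]))   # True: prefix-free
print(is_instantaneous(["0", "01", "011", "0111"]))  # False: uniquely decodable but not instantaneous
```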
Relationship between different
types of coding
 Every instantaneous code is uniquely decodable, every uniquely decodable code is distinct, and every distinct code is a code:
 Instantaneous codes ⊂ Uniquely decodable codes ⊂ Distinct codes ⊂ All codes
Code …
 Average code length
 L=L(S1)xP(S1) + L(S2)xP(S2) + …
 Example
 Find the average length of the following
code:
 S1 -> 0; S2 -> 10; S3 -> 110; S4 -> 111
 P(S1) = ½, P(S2) = ¼; P(S3) = 1/8; P(S4) =
1/8
 Solution
 L = 1 x ½ + 2 x ¼ + 3 x 1/8 + 3 x 1/8 = 1¾ bits
Code …
 Code efficiency
 η (code efficiency) is defined as the entropy of the source divided by the average length of the code
 Example
 Find the efficiency of the following code:
 S1 ->0; S2->10; S3 -> 110; S4 -> 111
 P(S1) = ½, P(S2) = ¼; P(S3) = 1/8; P(S4) = 1/8
 Solution
  %
100
)
(
L
source
H


  %
100
%
100
bits
1
)
8
(
log
)
8
(
log
)
4
(
log
)
2
(
log
)
(
bits
1
4
3
4
3
1
1
4
3
2
8
1
2
8
1
2
4
1
2
2
1
4
3









source
H
L
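The same calculation in a short sketch (illustrative, not from the slides):

```python
from math import log2

# Entropy, average length and efficiency of the code S1->0, S2->10, S3->110, S4->111
p    = {"S1": 1/2, "S2": 1/4, "S3": 1/8, "S4": 1/8}
code = {"S1": "0", "S2": "10", "S3": "110", "S4": "111"}

H   = sum(p[s] * log2(1 / p[s]) for s in p)   # source entropy: 1.75 bits
L   = sum(p[s] * len(code[s]) for s in p)     # average code length: 1.75 bits
eta = H / L * 100                             # code efficiency
print(H, L, eta)                              # 1.75 1.75 100.0
```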
Designing Codes
 Two examples of instantaneous codes
 Shannon – Fano code
 Huffman code
 Shannon – Fano code
 An instantaneous variable-length encoding method in which the more probable symbols are given shorter code words and the less probable are given longer code words
 The design builds a binary tree (top-down construction) following the steps below:
 1. List the symbols in descending order of probability
 2. Divide the list into two equal (or nearly equal)
probability sublists. Assign 0 to the first sublist and 1
to the second
 3. Repeat step 2 for each sublist until no further
division is possible
Example of Shannon – Fano
Encoding
 Find the Shannon – Fano code words for
the following source
 P(S1) = 0.3 ; P(S2) = 0.2 ; P(S3) = 0.15 ; P(S4)
= 0.1 ; P(S5) = 0.1 ; P(S6) = 0.05 ; P(S7) =
0.05 ; P(S8) = 0.05
 Solution
 Because each code word is assigned to a leaf of the tree, no code word is the prefix of any other: the code is instantaneous. Calculating the average length and the efficiency of this code gives:
 H(source) ≈ 2.7 bits
 L = 2.75 bits
 η ≈ 98%
Example of Shannon – Fano
Encoding
Resulting code words from the successive splits (first split: {S1, S2} = 0.5 vs {S3, …, S8} = 0.5, then each sublist is split again until every symbol sits on its own leaf):
S1 (0.30) -> 00
S2 (0.20) -> 01
S3 (0.15) -> 100
S4 (0.10) -> 101
S5 (0.10) -> 1100
S6 (0.05) -> 1101
S7 (0.05) -> 1110
S8 (0.05) -> 1111
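A recursive sketch of the construction (illustrative helper names, not from the slides; when two splits are equally balanced the tie-break is a free choice, so other equally efficient code assignments are possible):

```python
def shannon_fano(probs):
    """probs: dict symbol -> probability. Returns dict symbol -> code word."""
    symbols = sorted(probs, key=probs.get, reverse=True)   # descending probability
    codes = {s: "" for s in symbols}

    def split(group):
        if len(group) <= 1:
            return
        total = sum(probs[s] for s in group)
        running, best, best_key = 0.0, 1, None
        for i in range(1, len(group)):
            running += probs[group[i - 1]]
            # most even probability split; break ties by the most even symbol count
            key = (round(abs(2 * running - total), 9), abs(2 * i - len(group)))
            if best_key is None or key < best_key:
                best, best_key = i, key
        for s in group[:best]:
            codes[s] += "0"
        for s in group[best:]:
            codes[s] += "1"
        split(group[:best])
        split(group[best:])

    split(symbols)
    return codes

p = {"S1": 0.30, "S2": 0.20, "S3": 0.15, "S4": 0.10,
     "S5": 0.10, "S6": 0.05, "S7": 0.05, "S8": 0.05}
code = shannon_fano(p)
print(code)                                  # with this tie-break: S1 -> 00 ... S8 -> 1111
print(sum(p[s] * len(code[s]) for s in p))   # average length L = 2.75
```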
Huffman Encoding
 An instantaneous variable-length encoding method in which the more probable symbols are given shorter code words and the less probable are given longer code words
 The design builds a binary tree (bottom-up construction):
 1. Combine the two least probable symbols into a single node whose probability is their sum
 2. Repeat step 1 until no further combination is possible
Example Huffman encoding
 Find the Huffman code words for the
following source
 P(S1) = 0.3 ; P(S2) = 0.2 ; P(S3) = 0.15 ; P(S4)
= 0.1 ; P(S5) = 0.1 ; P(S6) = 0.05 ; P(S7) =
0.05 ; P(S8) = 0.05
 Solution
 Because each code word is assigned to a leaf of the tree, no code word is the prefix of any other: the code is instantaneous. Calculating the average length and the efficiency of this code gives:
 H(source) ≈ 2.70 bits ; L = 2.75 bits ; η ≈ 98%
Example Huffman encoding
0 1
0 1
0 1
0 1
0 1
0 1 0 1
0.30 0.20 0.15 0.10 0.10 0.05 0.05 0.05
S1
00
S2
10
S3
010
S4
110
S5
111
S6
0110
S7
01110
S8
01111
0.20 0.10
0.15
0.3
0.40
0.60
1.00
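A bottom-up sketch using a priority queue (illustrative, not from the slides; tie-breaking among equal probabilities can give different code words with the same average length):

```python
import heapq
import itertools

def huffman(probs):
    """probs: dict symbol -> probability. Returns dict symbol -> code word."""
    tie = itertools.count()                      # keeps heap comparisons well defined
    heap = [(p, next(tie), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)          # least probable node
        p1, _, c1 = heapq.heappop(heap)          # second least probable node
        for s in c0:
            c0[s] = "0" + c0[s]                  # prepend the branch labels
        for s in c1:
            c1[s] = "1" + c1[s]
        heapq.heappush(heap, (p0 + p1, next(tie), {**c0, **c1}))
    return heap[0][2]

p = {"S1": 0.30, "S2": 0.20, "S3": 0.15, "S4": 0.10,
     "S5": 0.10, "S6": 0.05, "S7": 0.05, "S8": 0.05}
code = huffman(p)
print(code)                                  # exact code words depend on tie-breaking
print(sum(p[s] * len(code[s]) for s in p))   # average length L = 2.75
```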