MEASURE OF INFORMATION
• The probability of occurrence of an event is a measure of its unexpectedness
and, hence, is related to the information content
• The amount of information received from a message is directly related to its uncertainty and, hence, inversely related to the probability of its occurrence
• The more unexpected the event, the greater the surprise, and hence the more
information
Average Information per Message:
Entropy of a Source
• A memoryless source implies that each message
emitted is independent of the previous
message(s)
• The probability of occurrence of message m_i is P_i
• The information content of message m_i is I_i, given by I_i = log_2(1/P_i) bits
• The average information per message of a source m is called its entropy, H(m) = Σ_i P_i log_2(1/P_i) bits per message (a numerical sketch follows this list)
• The entropy is a measure of uncertainty; the probability distribution that generates the maximum uncertainty has the maximum entropy (for M equally likely messages, H(m) = log_2 M)
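A minimal numerical sketch of these definitions (the four-message distribution below is an assumption for illustration, not taken from the slides):

```python
# Self-information and entropy of an assumed memoryless source.
import math

# Assumed probabilities P_i of messages m_1..m_4 (illustrative only)
probs = {"m1": 0.5, "m2": 0.25, "m3": 0.125, "m4": 0.125}

# Self-information of each message: I_i = log_2(1 / P_i) bits
for msg, p in probs.items():
    print(f"I({msg}) = {math.log2(1 / p):.3f} bits")

# Entropy: average information per message, H(m) = sum_i P_i * log_2(1 / P_i)
H = sum(p * math.log2(1 / p) for p in probs.values())
print(f"H(m) = {H:.3f} bits/message")   # 1.750 bits/message for this distribution
```

Note that the uniform distribution over the same four messages would give the maximum entropy, log_2 4 = 2 bits/message.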
SOURCE ENCODING
• Efficiency
• Average code-word length
• Redundancy
Huffman Code
• The source encoding theorem says that to encode a source with entropy H(m), we need, on the average, a minimum of H(m) binary digits per message
• The number of digits in a code word is the length of that code word
• The average word length of an optimum code approaches H(m); it can be brought arbitrarily close to H(m) by encoding long sequences (blocks) of messages, as the sketch below illustrates
• It is not desirable to use very long sequences, however, since they cause transmission delay and add to equipment complexity
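A hedged sketch of this block-coding tradeoff (the binary source with probabilities 0.9 and 0.1 is an assumed example): single messages cannot be encoded in fewer than 1 digit each even though H(m) is only about 0.469 bits, but the optimum code for blocks of n messages has a per-message length between H(m) and H(m) + 1/n, so longer blocks close the gap at the cost of delay and complexity.

```python
# Bounds on the optimum per-message code length when encoding blocks of n messages.
import math

probs = [0.9, 0.1]                                   # assumed binary source
H = sum(p * math.log2(1 / p) for p in probs)         # ~0.469 bits/message

for n in (1, 2, 4, 8):
    # Source encoding theorem for blocks: H(m) <= L_n / n < H(m) + 1/n
    print(f"n = {n}: {H:.3f} <= L/n < {H + 1 / n:.3f}")
```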
[Slide example: table of symbols with their probabilities, and the resulting entropy]
• The average length L of the compact code is L = Σ_i P_i l_i, where l_i is the number of digits in the code word for message m_i
• The merit of any code is measured by its average length L in comparison to H(m), the minimum possible average length
• Code efficiency: η = H(m)/L
• Redundancy: γ = 1 − η
• Huffman code is a variable-length code that assigns shorter code words to the more probable messages (see the construction sketch below)
• The entropy H(m) of the source is given by H(m) = Σ_i P_i log_2(1/P_i)
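A minimal sketch of the Huffman construction (the five-symbol source and the helper name huffman are assumptions for illustration): the two least probable entries are merged repeatedly, and each merge prepends a 0 or 1 to the code words in the merged group, so the less probable messages end up with the longer code words.

```python
# Huffman coding sketch: build a binary prefix code and check its efficiency.
import heapq
import itertools
import math

def huffman(probabilities):
    """Return a dict mapping each symbol to its binary code word."""
    counter = itertools.count()                  # tie-breaker so the heap never compares dicts
    heap = [(p, next(counter), {sym: ""}) for sym, p in probabilities.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, group0 = heapq.heappop(heap)      # least probable group
        p1, _, group1 = heapq.heappop(heap)      # next least probable group
        merged = {s: "0" + c for s, c in group0.items()}
        merged.update({s: "1" + c for s, c in group1.items()})
        heapq.heappush(heap, (p0 + p1, next(counter), merged))
    return heap[0][2]

# Assumed source probabilities (illustrative only)
probs = {"A": 0.4, "B": 0.2, "C": 0.2, "D": 0.1, "E": 0.1}
code = huffman(probs)

H = sum(p * math.log2(1 / p) for p in probs.values())        # entropy of the source
L = sum(p * len(code[s]) for s, p in probs.items())          # average code-word length
print(code)                                                  # resulting code words (a prefix code)
print(f"H(m) = {H:.3f} bits, L = {L:.2f} digits, "
      f"efficiency = {H / L:.3f}, redundancy = {1 - H / L:.3f}")
```

For this assumed source the average length works out to 2.2 digits against an entropy of about 2.12 bits, giving an efficiency of roughly 0.96 and a redundancy of about 0.04.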