CS8792 – CRYPTOGRAPHY AND NETWORK SECURITY
~ S. Janani, AP/CSE
KCET
Foundations of Modern Cryptography – Information Theory
Unit I - Introduction
Topic | Teaching Aid | Activity
Security trends; Legal, Ethical and Professional Aspects of Security | PPT | -
Need for Security at Multiple Levels, Security Policies | PPT | -
Model of network security; Security attacks, services and mechanisms; OSI security architecture | PPT | Quiz
Classical Encryption Techniques (symmetric cipher model, substitution techniques, transposition techniques, steganography) | WB | Worksheet
Foundations of modern cryptography: perfect security, information theory, product cryptosystem, cryptanalysis | PPT | Worksheet
Information Theory
• Information theory studies the quantification, storage, and communication of information
• Proposed by Claude Shannon in 1948
• Lies at the intersection of mathematics, statistics, computer science, physics, neurobiology, information engineering, and electrical engineering
Information Theory
• History
• Definition
• Shannon’s Theory
• Huffman Coding
• Application areas
Information Theory - History
Definition
• Can we measure information?
• Consider the following two sentences:
1. There is a traffic jam on I5
2. There is a traffic jam on I5 near Exit 234
Definition – Contd.
• It is hard to measure “semantic” information!
• Consider the following two sentences:
1. There is a traffic jam on I5 near Exit 160
2. There is a traffic jam on I5 near Exit 234
Definition – Contd.
• Let’s attempt a different definition of information.
• How about counting the number of letters in the two sentences?
1. There is a traffic jam on I5 (22 letters)
2. There is a traffic jam on I5 near Exit 234 (33 letters)
Definitely something we can measure and compare!
Why?
Shannon’s Theory
• Shannon’s measure of information is the number of bits needed to represent the amount of uncertainty (randomness) in a data source, and is defined as the entropy

H = \sum_{i=1}^{n} p_i \log \frac{1}{p_i}

where p_i is the probability of the i-th source symbol.
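A minimal sketch of this definition in Python (the function and the two example distributions are illustrative, not from the slides); entropy is largest for a uniform source and shrinks as the source becomes more predictable:

```python
import math

def entropy(probs):
    """H = sum over i of p_i * log2(1/p_i), in bits; zero-probability terms contribute 0."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: four equally likely outcomes
print(entropy([0.9, 0.05, 0.03, 0.02]))   # ~0.62 bits: highly predictable source
```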
Key Measures
• Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process.
  • For example, identifying the outcome of a fair coin flip provides less information (lower entropy) than specifying the outcome of a roll of a die (a quick numeric check follows this list).
• Mutual information
• Channel capacity
• Error exponents
• Relative entropy
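A quick numeric check of the coin-versus-die comparison above (a minimal sketch; for a uniform distribution over n outcomes the entropy reduces to log2 n bits):

```python
import math

coin_entropy = math.log2(2)  # fair coin: 2 equally likely outcomes -> 1 bit
die_entropy = math.log2(6)   # fair die: 6 equally likely outcomes -> ~2.585 bits
print(coin_entropy, die_entropy)  # 1.0  2.584962500721156
```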
Huffman coding algorithm
[Figure: Huffman coding tree built from source probabilities P(x1) through P(x7), producing the codewords below.]

Symbol | Codeword
x1 | 00
x2 | 01
x3 | 10
x4 | 110
x5 | 1110
x6 | 11110
x7 | 11111

H(X) = 2.11 bits
R = 2.21 bits per symbol
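The slide does not give the source probabilities P(x1) through P(x7), so the sketch below assumes a hypothetical distribution; the function itself is the standard greedy Huffman construction (repeatedly merge the two least-probable subtrees):

```python
import heapq
import itertools
import math

def huffman_code(probs):
    """Binary Huffman code for {symbol: probability}: repeatedly merge the two
    least-probable subtrees, prefixing '0'/'1' onto their codewords."""
    counter = itertools.count()  # tie-breaker so the heap never compares dicts
    heap = [(p, next(counter), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # least probable subtree
        p2, _, c2 = heapq.heappop(heap)  # second least probable subtree
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(counter), merged))
    return heap[0][2]

# Hypothetical 7-symbol source (assumed; the slide's probabilities were not given)
probs = {"x1": 0.30, "x2": 0.25, "x3": 0.20, "x4": 0.12,
         "x5": 0.08, "x6": 0.03, "x7": 0.02}
code = huffman_code(probs)
H = sum(p * math.log2(1 / p) for p in probs.values())  # source entropy
R = sum(probs[s] * len(code[s]) for s in probs)        # average codeword length
for s in sorted(code):
    print(s, code[s])
print(f"H(X) = {H:.2f} bits, R = {R:.2f} bits/symbol")  # H <= R < H + 1 holds
```

With this assumed distribution the codeword lengths come out as (2, 2, 2, 3, 4, 5, 5), matching the structure of the code on the slide.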
Information Theory - Applications
• Lossless data compression, lossy data compression, channel coding
• Natural language processing, cryptography, neurobiology, human vision, the evolution and function of molecular codes (bioinformatics), model selection in statistics, quantum computing, linguistics, plagiarism detection, pattern recognition, and anomaly detection
• Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, the development of the Internet, and the study of linguistics and of human perception
Editor's Notes

• #13 Huffman coding is optimal in the sense that the average number of bits per source symbol is a minimum, subject to the constraint that the codewords satisfy the prefix condition.
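Stated as the standard source-coding bound (added here for reference; it makes the optimality claim above concrete):

```latex
% Huffman's average codeword length R is within one bit of the source entropy
H(X) \le R < H(X) + 1
```

The figures on the Huffman slide are consistent with this bound: 2.11 ≤ 2.21 < 3.11.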