Information Theory
• Entropy
• Channel Capacity
Dr. T. Logeswari
Dept of CS
DRSNSRCAS
What is information theory?
• Information theory was invented by Claude Shannon in the late 1940s.
• The goal of information theory is to quantify the amount
of information contained in a signal, as well as the capacity
of a channel or communication medium for sending
information.
• Information theory is used by engineers to design and analyze communication systems: telephone networks, modems, radio communication, etc.
• In neuroscience, information theory is used to quantify the
amount of information conveyed by a neuron, or a
population of neurons, as well as the efficiency of neural
representation.
In information theory, entropy and channel capacity are concepts that relate to the transmission of information over a channel.
Key concepts in information theory include:
• Entropy
A measure of the uncertainty associated with a single bit. Entropy is maximized for a uniform distribution.
The amount of potential information contained in a signal is termed the entropy, usually denoted by H, which is defined as follows:
H(X) = − Σₓ P(x) log₂ P(x)
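To make the definition concrete, here is a minimal sketch in Python (the helper name `entropy` is illustrative, not from the slides) that evaluates H(X) for a given distribution and shows that the uniform distribution maximizes it:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_x P(x) log2 P(x), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin (uniform distribution over two outcomes) maximizes entropy: 1 bit.
print(entropy([0.5, 0.5]))  # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))  # ~0.469
```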
• The channel capacity
The amount of information that can be reliably transmitted per unit time over a noisy channel. In a channel, uncertainty is detrimental to information transfer; for a binary symmetric channel, the capacity equals 1 − H, where H is the entropy associated with a single bit.
Channel capacity, C, measures the maximum amount of
information that can be sent over a channel (e.g., a wire).
It is defined as follows:
C = max_{P(X)} I(X; Y)
where X is the input to the channel, Y is the output, and I(X; Y) is the mutual information between them; the maximum is taken over all input distributions P(X).
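For an arbitrary channel, the maximization over P(X) must be solved numerically, but for the binary symmetric channel described above it has a closed form: the uniform input is optimal and C = 1 − H(p). A minimal sketch, assuming a crossover probability p (function names are illustrative, not from the slides):

```python
import math

def binary_entropy(p):
    """H(p) = -p log2(p) - (1 - p) log2(1 - p), the entropy of one biased bit."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel that flips each bit with probability p.
    The uniform input distribution achieves the maximum, giving C = 1 - H(p)."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))  # 1.0: a noiseless channel carries one full bit per use
print(bsc_capacity(0.1))  # ~0.531
print(bsc_capacity(0.5))  # 0.0: the output is independent of the input
```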
• Information: Information is a reduction in uncertainty. If an event is certain, it provides no information. However, if the event is uncertain or surprising, it provides information. The amount of information can be measured in bits (see the numeric sketch after this list).
• Bit: The basic unit of information is the bit (binary digit), representing
a choice between two alternatives, such as 0 or 1 in digital systems.
• Coding Theory: This is a part of information theory that deals with the efficient representation of information. It includes techniques for error detection and correction, as well as data compression.
• Noise: In communication systems, noise refers to any unwanted or
random interference that can corrupt the transmitted signal.
Information theory helps in understanding how to design systems to
minimize the impact of noise.
• Source Coding (Compression): Information theory is used to develop
algorithms and techniques for compressing data without loss of
information. This is essential for efficient storage and transmission of
information.
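As a small numeric illustration of information as a reduction in uncertainty (referenced in the first bullet above): the self-information of an event with probability p is −log₂ p bits, so a certain event carries none. The helper name below is hypothetical:

```python
import math

def self_information(p):
    """Information content, in bits, of observing an event of probability p."""
    return -math.log2(p)

print(self_information(1.0))    # -0.0, i.e. zero bits: a certain event is no surprise
print(self_information(0.5))    # 1.0 bit: one fair coin flip
print(self_information(1 / 8))  # 3.0 bits: rarer events are more informative
```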
Important concepts in information theory are:
✓ Entropy
✓ Channel capacity
✓ Binary Symmetric Channel
✓ AWGN Channel
• Entropy, in the context of information theory, is a measure
of the uncertainty or randomness associated with a random
variable.
• It quantifies the average amount of information contained
in a message or the average uncertainty in predicting the
value of a random variable.
• Entropy is a fundamental concept used in information theory for various purposes, including data compression, coding theory, and understanding the limits of communication systems.
• Channel Capacity: the maximum rate at which information can be reliably transmitted over a communication channel, taking into account the channel's characteristics and potential sources of noise or interference.
• It is a fundamental concept in information theory, a branch
of applied mathematics and electrical engineering.
• Claude Shannon, often regarded as the father of
information theory, introduced the concept of channel
capacity in his landmark paper "A Mathematical Theory of
Communication" in 1948. The formula for the channel
capacity, known as the Shannon-Hartley theorem, is given
by:
• The formula indicates that the channel capacity is
proportional to the bandwidth and the logarithm of the
signal-to-noise ratio.
• This logarithmic term implies that the channel capacity grows with the signal-to-noise ratio, but at a diminishing rate (illustrated in the sketch below).
• The concept of channel capacity is crucial in designing communication systems to ensure efficient and reliable information transmission.
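To illustrate this diminishing return numerically, the sketch below evaluates C = B log₂(1 + S/N) for a fixed bandwidth while the signal-to-noise ratio grows tenfold at each step. The 3 kHz bandwidth is an assumed figure for illustration (roughly a telephone channel):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

B = 3000  # assumed bandwidth in Hz, roughly a telephone channel
for snr in (1, 10, 100, 1000):
    print(snr, round(shannon_capacity(B, snr)))
# SNR 1 -> 3000 bps; 10 -> ~10378; 100 -> ~19975; 1000 -> ~29902:
# each tenfold SNR increase adds roughly the same ~10 kbps, not ten times more.
```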
