Entropy
• What is Information? - It is facts provided or learned about something or someone.
• The definition above is qualitative. In a scientific theory, there is a need to quantify
it.
• Q. Does every piece of information convey something? What determines whether
one piece of information is more meaningful than another?
• Let’s consider an example:
• Consider a statement like “It is hot in Agra today”.
• Does it convey enough information? Is it hot compared to New Delhi? Is it hotter
than it was yesterday? – Imprecise information.
• Suppose we define a day to be hot if the temperature is in the range 36 ≤ T ≤ 41.
Then we have a little more knowledge.
• Our information is better, but we have an equal probability (1/6) of T being 36, 37,
38, 39, 40 or 41.
• Additional information – yesterday’s temperature was 39 and it is hotter today.
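• To make the gain concrete (using the entropy measure defined on the later slides): six equally likely temperatures carry log2 6 ≈ 2.58 bits of uncertainty, while knowing that yesterday was 39 and today is hotter leaves only two possibilities (40 or 41), i.e., about 1 bit if both remain equally likely.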
Entropy
• Measurement is very important in Information theory.
• Physicists use entropy to measure the amount of disorder
in a physical system.
• Entropy is the key concept of quantum information
theory.
• In information theory , entropy is the expected value
(average) of the information contained in each message
received. It measures how much uncertainty there is in
the state of a physical system.
• It can be considered as the degree of randomness in a
message.
• The entropy of a message is its amount of uncertainty:
it increases when the message is closer to random and
decreases when it is less random.
• The less likely an event is, the more information it
provides when it occurs.
• The message “aaaaa” appears to be very structured and
not random at all. It contains much less information
than the message “alphabet”, which is somewhat
structured but more random.
• The first message has low entropy and the second
one has higher entropy.
• The key concept of classical information theory
is the Shannon entropy. Claude E. Shannon
introduced it for measuring the entropy of a
classical system: it quantifies the amount of
uncertainty (disorder) in a classical system.
• Von Neumann entropy is used for measuring
entropy of a quantum system. It gauges the
uncertainty, or degree of mixedness, of a given
quantum state.
Shannon Entropy
The Shannon entropy can be used to define other measures of information which
capture the relationships between two random variables X and Y. Four such
measures are the following:
• Relative entropy: Relative entropy measures how different two
probability distributions are; it vanishes only when the two
distributions coincide.
• Joint entropy: Joint entropy measures the combined information in two
random variables.
• Conditional entropy: Conditional entropy measures the uncertainty
remaining about one random variable once the outcome of another
random variable is known.
• Mutual information: Mutual information measures the correlation between
two random variables, in terms of how much knowing one random
variable reduces the uncertainty about the other (see the sketch after
this list).
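A minimal numerical sketch of these quantities, assuming nothing beyond the definitions above (the joint distribution p_xy and the helper names below are invented for illustration): joint entropy is the Shannon entropy of the joint distribution, conditional entropy is H(X|Y) = H(X,Y) − H(Y), mutual information is I(X;Y) = H(X) + H(Y) − H(X,Y), and relative entropy is the Kullback–Leibler divergence.

```python
import numpy as np

def shannon_entropy(p):
    """H(p) = -sum_i p_i log2 p_i, ignoring zero-probability outcomes."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Hypothetical joint distribution p(X, Y) over two binary variables
# (rows index X, columns index Y); the entries sum to 1.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)                 # marginal distribution of X
p_y = p_xy.sum(axis=0)                 # marginal distribution of Y

H_x  = shannon_entropy(p_x)            # H(X)  = 1.0 bit
H_y  = shannon_entropy(p_y)            # H(Y)  = 1.0 bit
H_xy = shannon_entropy(p_xy)           # joint entropy H(X, Y) ≈ 1.72 bits
H_x_given_y = H_xy - H_y               # conditional entropy H(X|Y) ≈ 0.72 bits
I_xy = H_x + H_y - H_xy                # mutual information I(X;Y) ≈ 0.28 bits

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p||q); assumes q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

print(H_xy, H_x_given_y, I_xy)
print(relative_entropy(p_x, [0.5, 0.5]))   # 0: the two distributions coincide
```

For the distribution above X and Y are correlated, so the mutual information is positive; replacing p_xy with the product of its marginals would drive it to zero.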
Shannon’s entropy
• For a discrete random variable X corresponding to a
physical process with possible outcomes x1, x2, …, xn
occurring with probabilities p1, …, pn, where ∑ pi = 1.
• The Shannon entropy is the average uncertainty associated
with the outcome X = xi; for this probability
distribution it is defined by
• H(p1, …, pn) = − ∑ pi log2(pi) = ∑ pi log2(1/pi)
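As a small illustration of the formula (a sketch, not part of the original notes), the snippet below evaluates H for the uniform temperature distribution of the hot-day example and for the empirical letter frequencies of the messages “aaaaa” and “alphabet” discussed earlier:

```python
import numpy as np
from collections import Counter

def shannon_entropy(probs):
    """H = -sum_i p_i log2 p_i; zero-probability terms contribute nothing."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Uniform distribution over the six temperatures 36..41 (the hot-day example)
print(shannon_entropy([1/6] * 6))      # log2(6) ≈ 2.585 bits

def empirical_entropy(message):
    """Entropy of the letter-frequency distribution of a message."""
    counts = np.array(list(Counter(message).values()), dtype=float)
    return shannon_entropy(counts / counts.sum())

print(empirical_entropy("aaaaa"))      # 0.0 bits: completely structured
print(empirical_entropy("alphabet"))   # 2.75 bits: closer to random
```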
Von Neumann Entropy
• Quantum entropy refers to the measure of uncertainty or
information content within a quantum system.
• It quantifies the amount of information that is missing or unknown
about a quantum state.
• Unlike classical entropy, which deals with disorder or randomness in
classical systems, quantum entropy deals with the complexities
arising from superposition, entanglement, and the probabilistic
nature of quantum states.
• Quantum entropy plays a pivotal role in
understanding the behavior of quantum systems.
• It provides fundamental insights into the information content,
complexity, and predictability of quantum states.
• As quantum mechanics governs the behavior of particles at a
fundamental level, understanding and quantifying entropy in this
context are crucial for various applications, including quantum
computing, cryptography, and information theory.
• Named after mathematician John von Neumann, Von Neumann entropy is
a key concept in quantum information theory.
• It quantifies the amount of uncertainty or information content associated
with a quantum state.
• Von Neumann defined the entropy of a quantum state ρ by the formula
• S(ρ) = − tr(ρ log2 ρ), where ρ = ∑ pi |Ψi⟩⟨Ψi|.
• Suppose we have a mixture of quantum states |Ψi⟩, each occurring with probability pi.
Each |Ψi⟩ can be represented by a vector in a 2^n-dimensional space.
• If λi are the eigenvalues of the density matrix ρ, then von Neumann’s
definition can be re-expressed as
• S(ρ) = ∑ λi log2(1/λi) = − ∑ λi log2 λi
• Von Neumann entropy S(ρ) provides a
measure of the degree of mixedness of the
ensemble, as illustrated in the sketch below.
• S(ρ) = 0 for a pure state; it is maximal for the maximally mixed state.
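A minimal numerical sketch, assuming only the definition above (the example states are chosen here purely for illustration): S(ρ) is computed from the eigenvalues of ρ, giving S = 0 for a pure state and the maximal value of 1 bit for the maximally mixed qubit.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -tr(rho log2 rho), evaluated via the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)          # rho is Hermitian
    eigvals = eigvals[eigvals > 1e-12]         # drop numerically zero eigenvalues
    return float(-np.sum(eigvals * np.log2(eigvals)))

# Pure state |psi> = (|0> + |1>)/sqrt(2): rho = |psi><psi|, so S(rho) = 0
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(psi, psi.conj())
print(von_neumann_entropy(rho_pure))           # 0.0 — no mixedness

# Maximally mixed qubit: rho = I/2, the most disordered single-qubit state
rho_mixed = np.eye(2) / 2
print(von_neumann_entropy(rho_mixed))          # 1.0 bit — maximal for one qubit

# Equal classical mixture of |0> and |+>: partially mixed, 0 < S < 1
ket0 = np.array([1.0, 0.0])
rho_mix = 0.5 * np.outer(ket0, ket0) + 0.5 * rho_pure
print(von_neumann_entropy(rho_mix))            # ≈ 0.60 bits
```

Diagonalizing ρ numerically mirrors the eigenvalue form S(ρ) = ∑ λi log2(1/λi) given above.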
Classical versus Quantum Entropy
• Classical Information:
• Classical information is based on classical physics and follows classical
laws of physics.
• It is represented in bits (0 or 1) and operates on classical states that
are deterministic and definite.
• Information in classical systems can be copied without limitation; there
is no classical counterpart of the quantum no-cloning restriction.
• Classical information processing relies on classical computers that use
classical bits for computation.
• Classical information theory, developed by Claude Shannon, deals
with encoding, transmitting, and decoding information in classical
systems.
• Quantum Information:
• Quantum information is based on quantum physics and operates using
quantum states and qubits.
• It is represented in qubits, which can exist in superposition states (0, 1, or
both simultaneously) due to quantum superposition.
• Quantum information cannot be cloned perfectly due to the no-cloning
theorem.
• Quantum computers leverage quantum bits (qubits) and quantum gates
for computation, enabling quantum parallelism and, for certain problems,
exponential speedups.
• Quantum information theory extends classical information theory into the
quantum realm, dealing with quantum encoding, transmission, and
decoding.
