4. WELCOME
WE ARE NOT JUST ANOTHER BRICK IN THE WALL
5. CONTENT
History · Concept · DEE · ERL/LU · A/C
Part 1 · Part 2 · Part 3 · Part 4
KAZI EMAD, B.Sc. in CSE, ID: 191902025 (Slides 4-7)
IKHTIAR, B.Sc. in CSE, ID: 191902022 (Slides 8-11)
ASMA, B.Sc. in CSE, ID: 191902027 (Slides 12-14)
RAIHAN, B.Sc. in CSE, ID: 1919024 (Slides 15-18)
6. INTRODUCTION
The word "entropy" was created by the German physicist Rudolf Clausius in 1854. The word has a Greek origin: its first part reminds us of "energy", and the second part comes from "trope", which means turning point.
HISTORY
Why is this notion called entropy, anyway? From the American Heritage Book of English Usage (1996): "When the American scientist Claude Shannon found that the mathematical formula of Boltzmann defined a useful quantity in information theory, he hesitated to name this newly discovered quantity entropy because of its philosophical baggage. The mathematician John von Neumann encouraged Shannon to go ahead with the name entropy, however, since 'no one knows what entropy is, so in a debate you will always have the advantage.'"
Rudolf Clausius (1822-1888)
7. What is entropy?
The word entropy is sometimes confused with energy. Although they are related quantities, they are distinct.
Energy measures the capability of an object or system to do work.
Entropy, on the other hand, is a measure of the "disorder" of a system. What "disorder" refers to is really the number of different microscopic states a system can be in, given that the system has a particular fixed composition, volume, energy, pressure, and temperature. By "microscopic states", we mean the exact states of all the molecules making up the system.
Entropy = (Boltzmann's constant k) × logarithm of the number of possible states = k log(N).
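The formula above can be sketched in a few lines of Python. This is an illustrative example, not part of the original slides; it uses the natural logarithm, as in Boltzmann's S = k ln N, and the CODATA value of Boltzmann's constant.

```python
import math

# Boltzmann's constant in J/K (CODATA value)
K_B = 1.380649e-23

def boltzmann_entropy(n_microstates: int) -> float:
    """Entropy S = k ln(N) for a system with N equally likely microstates."""
    return K_B * math.log(n_microstates)

# A system with exactly one possible microstate is perfectly ordered: S = 0.
print(boltzmann_entropy(1))  # 0.0
# Doubling the number of microstates adds k ln 2 of entropy.
print(boltzmann_entropy(2) - boltzmann_entropy(1))
```

Note how slowly S grows: because of the logarithm, multiplying the number of microstates by a huge factor only adds a modest amount of entropy.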
8. CONCEPT
The idea of entropy comes from a principle of thermodynamics dealing with energy. It usually refers to the idea that everything in the universe eventually moves from order to disorder, and entropy is the measurement of that change.
• Physicists use entropy to measure the amount of disorder in a physical system.
• In information theory, entropy is the expected value (average) of the information contained in each message received.
• It can be considered the degree of randomness in a message.
• Entropy is a thermodynamic property: a quantitative measure of disorder.
• Entropy traces its origin to the molecular-movement interpretation of Rudolf Clausius in 1850.
• The concept of entropy follows from the thermodynamic laws (i.e. the 2nd law of thermodynamics).
• It can be visualised in the processes of expansion, heating, mixing and reaction.
• Entropy is associated with heat and temperature.
9. Definition and expression of entropy
• Entropy may be defined as the property of a system which measures the degree of disorder or randomness in the system.
• It comes from a Greek word which means transformation.
• It is denoted by the symbol 'S'.
• Clausius was convinced of the significance of the ratio of the heat delivered and the temperature at which it is delivered.
10. CLASSIFICATION
1. Entropy is the sum total of the entropy due to positional disorder, vibrational disorder and configurational disorder, i.e. randomness due to change of state: S = Sp + Sv + Sc.
2. When a system is undergoing change, the entropy change is equal to the heat absorbed by the system divided by the temperature at which the change takes place: ΔS = S2 - S1 = ∫ dq / T. If the process takes place at constant temperature, T ΔS = q, or T dS = dq; this is the second-law expression.
3. From the first law we know that ΔE = q - w, or dE = dq - dw = dq - P dV. At constant temperature ΔE = 0, therefore dq = P dV. From the second law we know that dq = T dS; substituting this in the above, we get T dS = P dV, so ΔS = P dV / T.
4. Suppose the process is undergoing change at constant pressure; then T dS = (q)p, and we know that (q)p = Cp dT, so T dS = Cp dT, or dS = Cp dT / T. Integrating from T1 to T2 gives S2 - S1 = Cp ln(T2 / T1). This is the entropy change of the system at constant pressure from room temperature to the reaction temperature.
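The constant-pressure result from step 4 can be checked numerically. This is a sketch added for illustration (not from the original slides), and it assumes Cp is constant over the temperature interval; the numbers are a hypothetical example.

```python
import math

def delta_s_constant_pressure(cp: float, t1: float, t2: float) -> float:
    """Entropy change S2 - S1 = Cp ln(T2 / T1) at constant pressure.

    Assumes Cp (in J/K, or J/(mol K)) is constant over [T1, T2];
    temperatures are in kelvin.
    """
    return cp * math.log(t2 / t1)

# Hypothetical example: heating 1 mol of a diatomic ideal gas
# (Cp ≈ 29.1 J/(mol K)) from 298 K to 598 K.
print(delta_s_constant_pressure(29.1, 298.0, 598.0))  # ≈ 20.3 J/(mol K)
```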
12. LAW OF ENTROPY
The second law of thermodynamics states that the entropy of an isolated system always increases; in other words, entropy can be created but not destroyed.
UNIT OF ENTROPY
The SI unit of entropy (S) is joules per kelvin (J/K). Clausius also gives the relation between the units: 1 Clausius (Cl) = 1 cal/°C = 4.1868 J/K.
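The unit relation above is a simple scaling, sketched here for illustration (not part of the original slides):

```python
# 1 Clausius (Cl) = 1 cal/°C = 4.1868 J/K, per the conversion on this slide.
CL_TO_JK = 4.1868

def clausius_to_joules_per_kelvin(s_cl: float) -> float:
    """Convert an entropy value from Clausius (cal/°C) to SI units (J/K)."""
    return s_cl * CL_TO_JK

print(clausius_to_joules_per_kelvin(1.0))  # 4.1868
```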
14. Thermodynamical Entropy
Entropy is defined using the Clausius inequality: ∮ δQ/T ≤ 0. The cyclic integral of δQ/T can be viewed as the sum of all the differential amounts of heat transfer divided by the temperature at the boundary:
∮ δQ/T = 0 for a reversible cycle;
∮ δQ/T < 0 for an irreversible cycle;
∮ δQ/T > 0 for an impossible cycle.
By definition, entropy (S) is the thermodynamic property such that dS = δQ/T for a reversible process, so S2 - S1 = ∫ δQ/T.
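When the temperature stays constant during a reversible heat transfer, the integral collapses to ΔS = Q/T. A minimal sketch (added for illustration, not from the original slides; the melting-of-ice figures are a standard textbook example used here as an assumption):

```python
def entropy_change_isothermal(q: float, t: float) -> float:
    """S2 - S1 = Q / T for reversible heat transfer Q (J) at constant T (K)."""
    return q / t

# Example: melting ice at 273.15 K absorbs roughly 6010 J per mole,
# so the entropy of the water increases by about 22 J/(mol K).
print(entropy_change_isothermal(6010.0, 273.15))  # ≈ 22.0 J/(mol K)
```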
Statistical Entropy
In 1877, Ludwig Boltzmann developed a statistical entropy S. This suggests the connection between entropy and thermodynamic probability. It may be written as S = F(Ω) = kB ln Ω, where kB is Boltzmann's constant and Ω is the thermodynamic probability.
15. APPLICATIONS
Entropy in the classical world
Claude E. Shannon introduced Shannon entropy, used for measuring the entropy of a classical system. Shannon's entropy is HS = Σi Pi log2(1/Pi), where Pi is the probability distribution function.
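Shannon's formula is short enough to sketch directly (an illustrative addition, not part of the original slides):

```python
import math

def shannon_entropy(probs) -> float:
    """H = sum_i p_i * log2(1 / p_i), in bits.

    Outcomes with zero probability contribute nothing, by convention.
    """
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: a fair coin flip
print(shannon_entropy([1.0]))        # 0.0 bits: a certain outcome
print(shannon_entropy([0.25] * 4))   # 2.0 bits: four equally likely messages
```

The fair coin gives exactly one bit, the certain outcome gives zero: entropy is largest when the message is least predictable.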
Entropy in the quantum world
Von Neumann entropy is used for measuring the entropy of a quantum system; it gauges the order in a given quantum system. The entropy of a quantum state was introduced by von Neumann. The entropy of a state ρ is defined by S(ρ) = Σi λi log2(1/λi), where λi are the eigenvalues of the density matrix.
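For a real symmetric 2×2 density matrix the eigenvalues have a closed form, so the von Neumann entropy of a single qubit can be sketched without any linear-algebra library. This example is an added illustration, not from the original slides.

```python
import math

def eigvals_2x2_symmetric(a: float, b: float, d: float):
    """Eigenvalues of the real symmetric matrix [[a, b], [b, d]]."""
    mean = (a + d) / 2.0
    r = math.sqrt(((a - d) / 2.0) ** 2 + b * b)
    return mean - r, mean + r

def von_neumann_entropy(a: float, b: float, d: float) -> float:
    """S(rho) = sum_i lambda_i * log2(1 / lambda_i) for rho = [[a, b], [b, d]].

    Zero eigenvalues contribute nothing, by the same convention as Shannon entropy.
    """
    return sum(l * math.log2(1.0 / l)
               for l in eigvals_2x2_symmetric(a, b, d) if l > 1e-12)

# Pure state |0><0|: completely ordered, entropy 0.
print(von_neumann_entropy(1.0, 0.0, 0.0))  # 0.0
# Maximally mixed qubit rho = I/2: maximal uncertainty, entropy 1 bit.
print(von_neumann_entropy(0.5, 0.0, 0.5))  # 1.0
```

Note that the von Neumann entropy reduces to the Shannon entropy of the eigenvalue distribution, which is exactly the connection the slide draws between the classical and quantum cases.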
16. APPLICATIONS
• We use Shannon's entropy in information theory.
• It helps identify an information-processing task: data compression, information transmission, teleportation.
• Quantum Shannon theory provides a general theory of interconvertibility between different types of communication resources: qubits, cbits, ebits, cobits, sbits, …
• It can store information so that the information can be reconstructed at a later time.
• Entropy serves as a measure of entanglement.
• Entropy is a measure of the uncertainty about a quantum system before we make a measurement of its state.
17. CONCLUSION
• Entropy is the thermodynamic property which measures the disorder in a system.
• It can be expressed as S = q/T.
• The term was coined by Rudolf Clausius.
• Entropy is mainly associated with heat and temperature.
• Disorder can be of 3 types: positional, vibrational and configurational.
• Thermobarometric models are an excellent case study in the application of thermodynamic parameters.
• The second law of thermodynamics implies that the entropy of the universe is increasing continuously, because energy conversion is not 100% efficient, i.e. some heat is always released.
• Entropy can be zero at absolute zero (0 K), where all atomic motion ceases and the disorder in a substance is zero; otherwise entropy is always positive.