# Presentation draft

### Transcript

• 1. Entropy
Introduction
When a system has more degrees of freedom and more constituents, there are more possible states for it to occupy.
Let's look at micro- and macrostates before proceeding further.
• 2. A microstate is a particular configuration of the individual constituents of the system.
A macrostate is a description of the system's condition from a macroscopic point of view.
It uses macroscopic variables such as pressure, density, and temperature (for gases).
For a given macrostate, a number of microstates are possible.
It is assumed that all microstates are equally probable.
When all possible macrostates are examined, it is found that macrostates associated with disorder have far more microstates than those associated with order.
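The claim that disordered macrostates have far more microstates can be illustrated with a minimal sketch: four coins, where each head/tail sequence is a microstate and the head count is the macrostate (the example and variable names are ours, not from the slides).

```python
from itertools import product
from collections import Counter

# Enumerate all microstates of 4 coins (H/T); the macrostate is the head count.
microstates = ["".join(m) for m in product("HT", repeat=4)]
by_macrostate = Counter(s.count("H") for s in microstates)

for heads in sorted(by_macrostate):
    print(f"{heads} heads: {by_macrostate[heads]} microstates")
# The mixed macrostate (2 heads) has 6 microstates;
# the perfectly ordered ones (0 or 4 heads) have only 1 each.
```

With equally probable microstates, the "mixed" macrostate is simply the most likely one, which is the statistical content of disorder.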
• 3. The connection between entropy and the number of microstates W of a given macrostate is S = kB ln W.
Over all macrostates, the total entropy is the sum of these contributions.
Entropy can also be seen as the amount of information needed to exactly specify the state of the system: the more information required, the higher the entropy.
For example, shuffling a deck of cards rarely returns it to its original "orderly" state (sorted by suit, each suit Ace through King), because there are only 4! = 24 such states out of 52! possible orderings.
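The Boltzmann relation and the card-deck odds above can be checked numerically; this is a sketch, and the helper name `boltzmann_entropy` is ours:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W: int) -> float:
    """S = k_B * ln(W) for a macrostate with W microstates."""
    return k_B * math.log(W)

ordered = math.factorial(4)   # 24 "orderly" arrangements (suits permuted, each suit Ace..King)
total = math.factorial(52)    # all orderings of a 52-card deck, 52! ~ 8e67

# The fully shuffled deck has vastly higher entropy than the ordered one.
print(boltzmann_entropy(total) > boltzmann_entropy(ordered))  # True
print(ordered / total)  # probability of landing in an ordered state: vanishingly small
```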
• 4. Every system tends toward equilibrium, which is the state of least energy.
In other words, the system proceeds in the direction of least energy, moving from an ordered state to a disordered state.
The probability of a system moving in time from an ordered macrostate to a disordered macrostate is far greater, and more information is required to specify the disordered state, which implies higher entropy.
Entropy is a measure of disorder or randomness.
• 5. Q. Where does this energy come from, which must be dissipated to attain stability?
A. It is the internal energy.
Q. And where does the internal energy come from?
This gives another way of viewing entropy:
“a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work”
• 6. Entropy increase, like a spontaneous process, is unidirectional: it proceeds in one direction only, toward disorder and lower energy.
• 7. Some features
It was stated earlier that exothermic processes are spontaneous.
• Consider the following reaction:
H2O(s) → H2O(l), ΔH°rxn = +6.02 kJ
• Endothermic… yet spontaneous!
• 8. Consider the following problem: mixing of a gas inside a bulb adiabatically (q = 0).
• q = 0, w = 0, ΔE = 0, and ΔH = 0
… but it still happens.
• 9. Entropy, S
One property common to spontaneous processes is that the final state is more DISORDERED or RANDOM than the original.
Spontaneity is related to an increase in randomness.
The thermodynamic property related to randomness is ENTROPY, S.
Reaction of K with water
• 10. The entropy of liquid water is greater than the entropy of solid water (ice) at 0 °C,
because the liquid state is more disordered than the solid state.
• 11. Entropy, S
S° (J/K·mol)
H2O(liq) 69.95
H2O(gas) 188.8
S (gases) > S (liquids) > S (solids)
• 12. Entropy and States of Matter
S˚(Br2 liq) < S˚(Br2 gas)
S˚(H2O sol) < S˚(H2O liq)
• 13. Entropy does not violate the second law
Consider a source at 800 K and a sink at 500 K, with Q = 2000 kJ of heat transferred between them.
Show that heat cannot be transferred from the low-temperature sink to the high-temperature source, based on the increase-of-entropy principle.
ΔS(source) = 2000/800 = 2.5 kJ/K
ΔS(sink) = −2000/500 = −4 kJ/K
Sgen = ΔS(source) + ΔS(sink) = −1.5 kJ/K < 0
This is impossible according to the entropy-increase principle.
Since Sgen < 0, heat cannot transfer from low temperature to high temperature without external work input.
• If the process is reversed, so that 2000 kJ of heat is transferred from the source to the sink, Sgen = 1.5 kJ/K > 0, and the process can occur according to the second law.
• 14. If the sink temperature is increased to 700 K, what is the entropy generation?
ΔS(source) = −2000/800 = −2.5 kJ/K
ΔS(sink) = 2000/700 = 2.86 kJ/K
Sgen = ΔS(source) + ΔS(sink) = 0.36 kJ/K < 1.5 kJ/K
The entropy generation is less than when the sink temperature is 500 K, i.e. less irreversibility. Heat transfer between objects with a large temperature difference generates a higher degree of irreversibility.
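The entropy-generation numbers on these two slides can be reproduced with a short helper (a sketch; the function name `s_gen` and its sign convention are ours):

```python
def s_gen(q_kj: float, t_source: float, t_sink: float) -> float:
    """Entropy generated (kJ/K) when q_kj of heat leaves the source
    at t_source (K) and enters the sink at t_sink (K)."""
    ds_source = -q_kj / t_source  # source loses heat
    ds_sink = q_kj / t_sink       # sink gains heat
    return ds_source + ds_sink

# Heat flowing the wrong way (sink -> source) gives S_gen < 0: impossible.
print(round(s_gen(-2000, 800, 500), 2))  # -1.5 kJ/K
# Source -> sink at 500 K: allowed by the second law, S_gen > 0.
print(round(s_gen(2000, 800, 500), 2))   # 1.5 kJ/K
# Sink raised to 700 K: smaller temperature difference, less irreversibility.
print(round(s_gen(2000, 800, 700), 2))   # 0.36 kJ/K
```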
• 15. Why does the ordered state have more energy than the disordered state?
• 16. Entropy, S
Entropy of a substance increases with temperature.
Molecular motions of heptane, C7H16, at different temperatures.
• 17. Entropy, S
Increase in molecular complexity generally leads to increase in S.
• 18. Entropy, S
Entropies of ionic solids depend on coulombic attractions.
S° (J/K·mol)
MgO (Mg2+ & O2−) 26.9
NaF (Na+ & F−) 51.5
• 19. Entropy, S
Entropy usually increases when a pure liquid or solid dissolves in a solvent.
• 20. Standard Molar Entropies
• 21. Origin of entropy
Carnot’s Principle:
1824:
A steam engine needs two sources of heat:
• a hot one, at temperature Th
• 22. a cold one, at temperature Tc
Th > Tc
Carnot stated that “no change occurs in the condition of the working body”.
• 23. Rudolf Clausius
1865:
He gave a mathematical interpretation by questioning the nature of the inherent loss of usable heat when work is done, e.g. heat produced by friction. Clausius described entropy as the transformation-content, i.e. the dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state.
Definition of entropy:
• dS = δQ/T
• 24. Ice-melting example
dS = δQ/T.
For the system (ice at its melting point), dS = δQ/273 K.
The entropy of the surroundings (a room at 298 K) changes by dS = −δQ/298 K.
So in this example the entropy of the system increases, while the entropy of the surroundings decreases.
But dS(system) > |dS(surroundings)|,
hence the total entropy change is positive.
• 25. 2nd Law of Thermodynamics
A reaction is spontaneous if ∆S for the universe is positive.
∆Suniverse = ∆Ssystem + ∆Ssurroundings
∆Suniverse > 0 for spontaneous process
First, calculate the entropy created by matter dispersal (∆Ssystem).
Next, calculate the entropy created by energy dispersal (∆Ssurroundings).
• 26. 2nd Law of Thermodynamics
2 H2(g) + O2(g) ---> 2 H2O(liq)
∆S°system = −326.9 J/K
We can calculate that ∆H°rxn = ∆H°system = −571.7 kJ
∆S°surroundings = +1917 J/K
• 27. 2nd Law of Thermodynamics
2 H2(g) + O2(g) ---> 2 H2O(liq)
∆S°system = −326.9 J/K
∆S°surroundings = +1917 J/K
∆S°universe = +1590 J/K
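These numbers for the hydrogen-combustion example can be verified in a few lines; the standard temperature of 298.15 K is an assumption (the slides do not state it):

```python
# 2 H2(g) + O2(g) -> 2 H2O(l): check dS_universe = dS_system + dS_surroundings
dH_system = -571.7e3  # J, exothermic reaction enthalpy
T = 298.15            # K, standard temperature (assumed)
dS_system = -326.9    # J/K, entropy change of the system

dS_surroundings = -dH_system / T  # heat released raises the surroundings' entropy
dS_universe = dS_system + dS_surroundings

print(dS_surroundings)  # ~ +1917 J/K
print(dS_universe)      # ~ +1590 J/K > 0, so the reaction is spontaneous
```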
• 28. Spontaneous or Not?
Remember that –∆H˚sys is proportional to ∆S˚surr
An exothermic process has ∆S˚surr > 0.
• 29. Conclusion from the statements of entropy
∆Stotal ≥ 0
This mathematical statement of the second law affirms that every process proceeds in such a direction that the total entropy change associated with it is positive, the limiting value of zero being attained only by a reversible process. No process is possible for which the total entropy decreases.
• 30. Some facts about entropy
• 31. Heat Death of the Universe
Ultimately, the entropy of the Universe should reach a maximum value
At this value, the Universe will be in a state of uniform temperature and density
All physical, chemical, and biological processes will cease
The state of perfect disorder implies that no energy is available for doing work
This state is called the heat death of the Universe.
• 32. Entropy and life
"living organisms preserve their internal order by taking from their surroundings free energy, in the form of nutrients or sunlight, and returning to their surroundings an equal amount of energy as heat and entropy."
I’d look for an entropy reduction, since this must be a general characteristic of life
• 33. Entropy also appears in many other fields:
Information and computer science
Mathematics
Other sciences and social sciences
Music
Modern culture, spirituality, and “low entropy” culture
• 34. Entropy is a vast topic, and defining it precisely remains very complex.