1.
Entropy<br />Introduction<br />When a system has more degrees of freedom and more constituents, there are more possible states for it to occupy.<br />Let's look at microstates and macrostates before proceeding further.<br />
2.
A microstate is a particular configuration of the individual constituents of the system.<br />A macrostate is a description of the conditions from a macroscopic point of view.<br />It makes use of macroscopic variables such as pressure, density, and temperature for gases.<br />For a given macrostate, a number of microstates are possible.<br />It is assumed that all microstates are equally probable.<br />When all possible macrostates are examined, it is found that macrostates associated with disorder have far more microstates than those associated with order.<br />
3.
The connection between entropy and the number of microstates W for a given macrostate is S = kB ln W.<br />The total entropy over all macrostates is the sum of these contributions.<br />More information is hence required to exactly specify a disordered system.<br />Entropy can be seen as the amount of information needed to exactly specify the state of the system: the more information required, the greater the entropy.<br />Example: shuffling a deck of cards rarely returns them to their original "orderly" state (by suit, each suit ordered Ace through King) because there are only 4! = 24 such states out of 52! possible states. <br />
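The Boltzmann relation and the card-deck example above can be checked numerically. A minimal sketch (the function name `boltzmann_entropy` is illustrative, not standard):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(w: int) -> float:
    """Entropy of a macrostate with w equally probable microstates: S = kB ln W."""
    return K_B * math.log(w)

# Card-deck example from the slide: 4! "orderly" arrangements out of 52! orderings.
ordered = math.factorial(4)    # 24 orderly states
total = math.factorial(52)     # all possible orderings of the deck
print(f"P(orderly) = {ordered / total:.3e}")
print(f"S for all {52}! deck microstates: {boltzmann_entropy(total):.3e} J/K")
```

A single macrostate with W = 1 gives S = 0, matching the idea that a perfectly ordered state needs no extra information to specify.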
4.
Every system tends toward equilibrium, which corresponds to least energy.<br />In other words, the system proceeds in the direction of least energy, and so it moves from an ordered state to a disordered state.<br />The probability of a system moving in time from an ordered macrostate to a disordered macrostate is far greater, and more information is required to specify the disordered state, which implies that the entropy is greater.<br />Entropy is a measure of disorder or randomness.<br />
5.
Q. Where does this energy come from, which must be dissipated to attain stability? <br />The answer is that it is internal energy.<br />Q. And where does the internal energy come from?<br />This gives another view of entropy:<br />“a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work”<br />
6.
Entropy change, like a spontaneous process, is unidirectional: it can proceed in one direction only, toward disorder or lower energy.<br />
7.
Some features<br />It was stated earlier that exothermic processes are spontaneous. <br />• Consider the following reaction:<br />H2O(s) → H2O(l)  ΔH°rxn = +6.02 kJ<br />• Endothermic… yet spontaneous!<br />
8.
Consider the following problem: mixing of a gas inside a bulb adiabatically (q = 0).<br />• q = 0, w = 0, ΔE = 0, and ΔH = 0<br /> …but it still happens<br />
9.
Entropy, S<br />One property common to spontaneous processes is that the final state is more DISORDERED or RANDOM than the original.<br />Spontaneity is related to an increase in randomness.<br />The thermodynamic property related to randomness is ENTROPY, S.<br />Reaction of K with water<br />
10.
The entropy of liquid water is greater than the entropy of solid water (ice) at 0 ˚C, because the liquid state is more disordered than the solid state.<br />
11.
Entropy, S<br /> S° (J/K·mol)<br />H2O(liq) 69.95<br />H2O(gas) 188.8 <br />S (gases) > S (liquids) > S (solids)<br />
12.
Entropy and States of Matter<br />S˚(Br2 liq) < S˚(Br2 gas)<br />S˚(H2O sol) < S˚(H2O liq)<br />
13.
Entropy does not violate the second law<br />Source<br />800 K<br />Q = 2000 kJ<br />Sink<br />500 K<br />Show that heat cannot be transferred from the low-temperature sink to the high-temperature source, based on the increase-of-entropy principle.<br />ΔS(source) = 2000/800 = 2.5 kJ/K<br />ΔS(sink) = −2000/500 = −4 kJ/K<br />Sgen = ΔS(source) + ΔS(sink) = −1.5 kJ/K < 0<br />This is impossible by the entropy increase principle.<br />Since Sgen < 0, heat cannot transfer from low temperature to high temperature without external work input.<br /><ul><li> If the process is reversed, 2000 kJ of heat is transferred from the source to the sink, Sgen = 1.5 kJ/K > 0, and the process can occur according to the second law
14.
If the sink temperature is increased to 700 K, what is the entropy generation? ΔS(source) = −2000/800 = −2.5 kJ/K</li></ul>ΔS(sink) = 2000/700 = 2.86 kJ/K<br />Sgen = ΔS(source) + ΔS(sink) = 0.36 kJ/K < 1.5 kJ/K<br />Entropy generation is less than when the sink temperature was 500 K, i.e. less irreversibility. Heat transfer between objects with a large temperature difference generates a higher degree of irreversibility.<br />
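The source/sink cases above all follow the same bookkeeping, which can be sketched as a small helper (the function name `entropy_generation` is illustrative):

```python
def entropy_generation(q_kj: float, t_source: float, t_sink: float) -> float:
    """Entropy generated (kJ/K) when q_kj of heat leaves t_source and enters t_sink (K)."""
    return -q_kj / t_source + q_kj / t_sink

# Cases from the slides:
print(entropy_generation(2000, 800, 500))   # source -> 500 K sink: +1.5 kJ/K, allowed
print(entropy_generation(2000, 800, 700))   # smaller temperature gap: less irreversibility
print(entropy_generation(-2000, 800, 500))  # sink -> source: negative, impossible
```

A positive result means the process can occur; a negative result violates the increase-of-entropy principle.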
15.
Why does the ordered state have more energy than the disordered state?<br />
16.
Entropy, S<br />Entropy of a substance increases with temperature.<br />Molecular motions of heptane at different temps.<br />Molecular motions of heptane, C7H16<br />
17.
Entropy, S<br />Increase in molecular complexity generally leads to increase in S.<br />
18.
Entropy, S<br />Entropies of ionic solids depend on coulombic attractions.<br /> S° (J/K·mol)<br />MgO (Mg2+ & O2-) 26.9<br />NaF (Na+ & F-) 51.5<br />The stronger 2+/2− attraction in MgO holds the lattice more rigidly, so its entropy is lower.<br />
19.
Entropy, S<br />Entropy usually increases when a pure liquid or solid dissolves in a solvent.<br />
21.
Origin of entropy<br />Carnot’s Principle:<br />Sadi CARNOT<br />1825: <br />A steam engine needs two<br />sources of heat:<br /><ul><li>a hot one, at temperature Th
22.
a cold one, at temperature Tc</li></ul>Th > Tc<br />Carnot stated that “no change occurs in the condition of the working body”.<br />
23.
Rudolf CLAUSIUS<br />1865:<br /> Gave a mathematical interpretation by questioning the nature of the inherent loss of usable heat when work is done, e.g. heat produced by friction. Clausius described entropy as the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state.<br />Definition of entropy:<br /><ul><li>dS = δQ/T</li></ul>Entropy: definition<br />
24.
Ice melting example<br />dS = δQ/T <br />For the system, dS = δQ/273 K. <br />The entropy of the surroundings changes by dS = −δQ/298 K. <br />So in this example the entropy of the system increases, whereas the entropy of the surroundings decreases.<br />But dS(system) > |dS(surroundings)|<br />Hence the total entropy change is positive.<br />
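The ice-melting bookkeeping above can be sketched numerically, taking the ice at 273 K and the surroundings at 298 K as on the slide (the 6.02 kJ figure is the heat of fusion quoted on an earlier slide):

```python
def ds_total(q_j: float, t_ice: float = 273.0, t_surr: float = 298.0) -> float:
    """Total entropy change (J/K) when q_j joules flow from surroundings into melting ice."""
    ds_system = q_j / t_ice          # ice absorbs heat at 273 K
    ds_surroundings = -q_j / t_surr  # surroundings give up the same heat at 298 K
    return ds_system + ds_surroundings

# Per mole of ice melted (ΔH = +6.02 kJ): the net change is positive.
print(ds_total(6020.0))
```

Because the same δQ is divided by a smaller temperature for the system than for the surroundings, the sum is always positive whenever heat flows from hot to cold.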
25.
2nd Law of Thermodynamics<br />A reaction is spontaneous if ∆S for the universe is positive.<br />∆Suniverse = ∆Ssystem + ∆Ssurroundings<br />∆Suniverse > 0 for spontaneous process<br />First calc. entropy created by matter dispersal (∆Ssystem)<br />Next, calc. entropy created by energy dispersal (∆Ssurround)<br />
26.
2nd Law of Thermodynamics<br />2 H2(g) + O2(g) ---> 2 H2O(liq)<br />∆S˚system = -326.9 J/K<br />Can calc. that ∆H˚rxn = ∆H˚system = -571.7 kJ<br />∆S˚surroundings = +1917 J/K<br />
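These numbers combine as the previous slide describes: the surroundings term comes from ∆S˚surr = −∆H˚sys/T, and the sign of ∆S˚universe decides spontaneity. A quick check (function name illustrative):

```python
def ds_surroundings(dh_sys_kj: float, t: float = 298.15) -> float:
    """Entropy change of the surroundings (J/K): dS_surr = -dH_sys / T."""
    return -dh_sys_kj * 1000.0 / t   # kJ -> J, divided by temperature in K

# 2 H2(g) + O2(g) -> 2 H2O(liq), values from the slide:
ds_surr = ds_surroundings(-571.7)   # approx +1917 J/K
ds_univ = -326.9 + ds_surr          # dS_univ = dS_sys + dS_surr; positive -> spontaneous
print(ds_surr, ds_univ)
```

Even though the system's entropy decreases, the heat released raises the surroundings' entropy by more, so the reaction is spontaneous.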
28.
Spontaneous or Not?<br />Remember that –∆H˚sys is proportional to ∆S˚surr<br />An exothermic process has ∆S˚surr > 0.<br />
29.
Conclusion from the statements of entropy<br />∆Stotal ≥ 0<br />This mathematical statement of the second law affirms that every process proceeds in such a direction that the total entropy change associated with it is positive, the limiting value of zero being attained only by a reversible process. No process is possible for which the total entropy decreases.<br />
31.
Heat Death of the Universe<br />Ultimately, the entropy of the Universe should reach a maximum value<br />At this value, the Universe will be in a state of uniform temperature and density<br />All physical, chemical, and biological processes will cease<br />The state of perfect disorder implies that no energy is available for doing work<br />This state is called the heat death of the Universe and the universe will….<br />
32.
Entropy and life<br />“living organisms preserve their internal order by taking from their surroundings free energy, in the form of nutrients or sunlight, and returning to their surroundings an equal amount of energy as heat and entropy.”<br />I’d look for an entropy reduction, since this must be a general characteristic of life<br />
33.
Information and computer science<br />Mathematics<br />Other sciences and social sciences<br />Music<br />Modern culture<br />Spirituality and low-entropy culture<br />
34.
Entropy is a vast topic, and defining it precisely remains complex. <br />