
Entropy: Statistical Approach


Published in: Science


  1. ENTROPY & LAW OF INCREASE OF ENTROPY. Khemendra Shukla, M.Sc. II Semester, Applied Physics, B.B.A. University, Lucknow
  2. Contents: Entropy: Statistical Interpretation; Principle of Entropy Increase; Equilibrium Conditions: i) Thermal Equilibrium, ii) Mechanical Equilibrium, iii) Concentration Equilibrium; Gibbs Paradox
  3. Entropy: Physically, the entropy of a system is a measure of the disorder of its molecular motion: the greater the disorder of the molecular motion, the greater the entropy. The connection between statistical mechanics and thermodynamics is provided by entropy. Natural processes tend to move toward a state of greater disorder.
  4. Entropy in Statistical Mechanics. From the principles of thermodynamics, the thermodynamic entropy S has the following important properties: dS is an exact differential and is equal to dQ/T for a reversible process, where dQ is the quantity of heat added to the system. Entropy is additive: S = S1 + S2. ΔS ≥ 0: if the state of a closed system is given macroscopically at any instant, the most probable state at any other instant is one of equal or greater entropy.
  5. State Function. This statement means that entropy is a state function: the value of the entropy does not depend on the past history of the system but only on its actual state. The entropy σ of a system (in classical statistical physics) in statistical equilibrium can be defined as σ = ln Γ, where Γ is the volume of phase space accessible to the system, i.e., the volume corresponding to energies between E and E + δE.
  6. Change in Entropy. Let us show first that changes in the entropy are independent of the system of units used to measure Γ. As Γ is a volume in the phase space of N point particles, it has dimensions (momentum × length)^(3N) = (action)^(3N). Let ω denote the unit of action; then Γ/ω^(3N) is dimensionless. If we were to define σ′ = ln(Γ/ω^(3N)) = ln Γ − 3N ln ω, we see that for changes Δσ′ = Δσ = Δ ln Γ,
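As a quick numeric sanity check (with illustrative values, not physical data), the 3N ln ω term cancels in any difference, so Δσ′ comes out the same for every choice of the unit of action ω:

```python
import math

N = 10                               # number of particles (arbitrary)
gamma_a, gamma_b = 2.0e5, 8.0e5     # phase-space volumes of two states (arbitrary units)

def sigma_prime(gamma, omega, n):
    """sigma' = ln(Gamma / omega^(3N)) = ln Gamma - 3N ln omega."""
    return math.log(gamma) - 3 * n * math.log(omega)

# Any unit of action gives the same *change* in entropy: Delta sigma' = ln(Gamma_b/Gamma_a)
for omega in (1.0, 6.6e-34, 12.3):
    delta = sigma_prime(gamma_b, omega, N) - sigma_prime(gamma_a, omega, N)
    assert abs(delta - math.log(gamma_b / gamma_a)) < 1e-9
```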
  7. independent of the system of units. ω = Planck's constant h is a natural unit of action in phase space. It is obvious that the entropy σ has a definite value for an ensemble in statistical equilibrium; thus the change in entropy is an exact differential. We see that if Γ is interpreted as a measure of the imprecision of our knowledge of a system, or as a measure of the "randomness" of a system, then the entropy is also to be interpreted as a measure of the imprecision or randomness.
  8. Entropy is Additive. It can easily be shown that σ is additive. Let us consider a system made up of two parts, one with N1 particles and the other with N2 particles. Then N = N1 + N2, and the phase space of the combined system is the product of the phase spaces of the individual parts: Γ = Γ1Γ2. The additive property of the entropy follows directly: σ = ln Γ = ln Γ1Γ2 = ln Γ1 + ln Γ2 = σ1 + σ2.
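The additivity is just the product rule for logarithms; a one-line check with illustrative phase-space volumes:

```python
import math

# If Gamma = Gamma1 * Gamma2 for the combined system, sigma = ln Gamma
# splits into sigma1 + sigma2 (values below are arbitrary, for illustration).
gamma1, gamma2 = 3.5e4, 1.2e6
sigma1, sigma2 = math.log(gamma1), math.log(gamma2)
sigma_total = math.log(gamma1 * gamma2)
assert abs(sigma_total - (sigma1 + sigma2)) < 1e-9
```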
  9. Principle of Entropy Increase. The entropy of a system remains constant in a reversible process but increases inevitably in all irreversible processes. Since a reversible process represents a limiting ideal case, all actual processes are inherently irreversible. That means that after each cycle of operation the entropy of the system increases and tends to a maximum. "The entropy of an isolated system either increases or remains constant according as the processes it undergoes are irreversible or reversible."
  10. Equilibrium Conditions. We have supposed that the condition of statistical equilibrium is given by the most probable condition of a closed system, and therefore we may also say that the entropy σ is a maximum when a closed system is in the equilibrium condition. The value of σ for a system in equilibrium will depend on the energy E (≡ ⟨E⟩) of the system; on the number Ni of each molecular species i in the system; and on external variables, such as volume, strain, magnetization, etc.
  11. [Fig. 3.1: subsystems 1 and 2 separated by a barrier inside an insulating enclosure.] Let us consider the condition for equilibrium in a system made up of two interconnected subsystems, as in Fig. 3.1. Initially a rigid, insulating, non-permeable barrier separates the subsystems from each other.
  12. Thermal Equilibrium. Thermal contact: the systems can exchange energy. In equilibrium there is no flow of energy between the systems. Let us suppose that the barrier is allowed to transmit energy, the other inhibitions remaining in effect. If the conditions of the two subsystems 1 and 2 do not change, we say they are in thermal equilibrium. In thermal equilibrium the entropy σ of the total system must be a maximum with respect to small transfers of energy from one subsystem to the other. Writing, by the additive property of the entropy,
  13. σ = σ1 + σ2, we have in equilibrium δσ = δσ1 + δσ2 = 0, i.e. (∂σ1/∂E1) δE1 + (∂σ2/∂E2) δE2 = 0. We know, however, that δE = δE1 + δE2 = 0, as the total system is thermally closed, the energy in a microcanonical ensemble being constant. Thus [(∂σ1/∂E1) − (∂σ2/∂E2)] δE1 = 0.
  14. 14. As E1 was an arbitrary variation we must have   1    2       E1   E 2 in thermal equilibrium. If we define a quantity by 1    Ethen in thermal equilibrium 1   2Here  is known as the temperature and will shown laterto be related to the absolute temperature T by =kT, where k is the Boltzmann constant, 1.38010-23 j/deg K. 14
  15. Mechanical Equilibrium. Mechanical contact: the systems are separated by a mobile barrier; equilibrium is reached in this case by equalization of the pressure on the two sides of the barrier. We now imagine that the wall is allowed to move and also passes energy, but does not pass particles. The volumes V1, V2 of the two systems can readjust to maximize the entropy. In mechanical equilibrium (∂σ1/∂V1) δV1 + (∂σ2/∂V2) δV2 + (∂σ1/∂E1) δE1 + (∂σ2/∂E2) δE2 = 0.
  16. After thermal equilibrium has been established, the last two terms on the right add up to zero, so we must have (∂σ1/∂V1) δV1 + (∂σ2/∂V2) δV2 = 0. Now the total volume V = V1 + V2 is constant, so that δV1 = −δV2. We then have [(∂σ1/∂V1) − (∂σ2/∂V2)] δV1 = 0.
  17. 17. As V1 was an arbitrary variation we must have   1    2       V1   V2 in mechanical equilibrium. If we define a quantity  by         V  E , Nwe see that for a system in thermal equilibrium thecondition for mechanical equilibrium is 1   2 We show now that  has the essential characteristics of the usual pressure p. 17
  18. Concentration Equilibrium. The systems can exchange particles. Let us suppose that the wall allows diffusion through it of molecules of the i-th chemical species. We have δNi1 = −δNi2. For equilibrium [(∂σ1/∂Ni1) − (∂σ2/∂Ni2)] δNi1 = 0, or ∂σ1/∂Ni1 = ∂σ2/∂Ni2.
  19. We define a quantity μi by the relation μi = −τ(∂σ/∂Ni), the derivative taken at constant E and V. The quantity μi is called the chemical potential of the i-th species. For equilibrium at constant temperature, μi1 = μi2.
  20. Gibbs Paradox. i) Mixing of Two Different Ideal Gases. Mixing can be regarded as the expansion of each gas, initially with N1 particles in volume V1 and N2 particles in volume V2, both at temperature T, into the common volume V = V1 + V2. The change in entropy is Δσ = σ12 − (σ1 + σ2) = N1 ln V + N2 ln V − N1 ln V1 − N2 ln V2 = N1 ln(V/V1) + N2 ln(V/V2) > 0.
  21. For the case V1 = V2 = ½V and N1 = N2 = N, we get Δσ = 2N ln 2. This gives the entropy of mixing for two different ideal gases and is in agreement with experiment.
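The mixing formula above is easy to check numerically; the equal-volume case recovers Δσ = 2N ln 2:

```python
import math

def mixing_entropy(n1, v1, n2, v2):
    """Delta sigma = N1 ln(V/V1) + N2 ln(V/V2) for two *different* ideal gases."""
    v = v1 + v2
    return n1 * math.log(v / v1) + n2 * math.log(v / v2)

# Equal volumes V1 = V2 = V/2, each side holding N particles:
N, V = 1000, 2.0
d_sigma = mixing_entropy(N, V / 2, N, V / 2)
assert abs(d_sigma - 2 * N * math.log(2)) < 1e-6
```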
  22. ii) Mixing of an Ideal Gas with the Same Ideal Gas. The removal of the partition should not affect the distribution of the system over the accessible states, so the change in entropy should be Δσ = σ12 − (σ1 + σ2) = 0. In particular, for the case V1 = V2 = ½V and N1 = N2 = N, the formula above nevertheless gives an unobserved, and therefore unaccountable, increase of entropy 2N ln 2 when a partition is removed from a box containing the same gas on both sides. This is the Gibbs paradox.
  23. Resolution of the Gibbs Paradox. In case (i), removal of the partition leads to diffusion of molecules, which is an irreversible process; therefore the increase of entropy makes sense. We can imagine that mixing of the gases interchanges the positions of molecules and creates new states; therefore the number of accessible states increases, or equivalently the entropy increases.
  24. Cont'd… In case (ii) no new state is created when the partition is removed. If there are N molecules, N! permutations are possible, but the molecules are indistinguishable, so interchanging them does not produce a new state. Therefore the number of accessible states is too large by a factor of N!: Γ → Γ/N!, and σ = ln(Γ/(N! h^(3N))) = ln(Γ/h^(3N)) − ln N! ≈ ln(Γ/h^(3N)) − N ln N + N. This extra term resolves the Gibbs paradox: it makes the entropy properly additive.
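The last step uses the Stirling approximation ln N! ≈ N ln N − N; a quick numeric comparison (using `math.lgamma`, where lgamma(N+1) = ln N!) shows the approximation improving as N grows:

```python
import math

def stirling(n):
    """Stirling approximation to ln N! used on the slide."""
    return n * math.log(n) - n

def rel_err(n):
    """Relative error of the approximation against the exact ln N!."""
    return abs(math.lgamma(n + 1) - stirling(n)) / math.lgamma(n + 1)

assert rel_err(10) > rel_err(100) > rel_err(10_000)   # improves with N
assert rel_err(10_000) < 1e-3                         # already very accurate
```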
