LECTURE ON THE CONCEPT OF THE SECOND LAW OF THERMODYNAMICS<br />BY<br />DR. NNABUK OKON EDDY<br />
Outline<br />Introduction<br />The need for the second law<br />Concept of entropy<br />Statements of the second law<br />Properties of entropy<br />Derivation of equation for entropy<br />Consequences of the second law<br />
Introduction<br />The second law explains the phenomenon of irreversibility in nature<br />The need for the second law arises because the first law is incomplete in some respects. For example,<br />It fails to explain why natural processes have a preferred direction<br />The first law fails to provide a thermodynamic function that can be used to predict the direction of a spontaneous reaction<br />The second law deals with entropy<br />
Entropy<br />The key concept for the explanation of phenomena through the second law is the definition of a physical property called entropy<br />Entropy is a measure of the degree of disorderliness of a system.<br />A change in entropy of a system is the infinitesimal transfer of heat to a closed system during a reversible process divided by the equilibrium temperature (T) of the system, i.e. dS = dq(rev)/T<br />
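As a minimal numeric sketch of the definition dS = dq(rev)/T: for reversible heating at constant pressure, dq(rev) = n Cp dT, and integrating dS gives ΔS = n Cp ln(T2/T1). The values below (1 mol of a monatomic ideal gas, Cp = 5/2 R, heated from 300 K to 600 K) are assumed for illustration only.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def entropy_change_heating(n_mol, cp_j_per_mol_k, t1_k, t2_k):
    """Integrate dS = dq_rev/T with dq_rev = n*Cp*dT (constant pressure):
    dS_total = n * Cp * ln(T2/T1)."""
    return n_mol * cp_j_per_mol_k * math.log(t2_k / t1_k)

# Assumed example: 1 mol monatomic ideal gas, Cp = 5/2 R, 300 K -> 600 K
dS = entropy_change_heating(1.0, 2.5 * R, 300.0, 600.0)
print(round(dS, 2))  # positive, as expected for heating (disorder increases)
```

Note that ΔS comes out positive, consistent with heating increasing the disorderliness of the system.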
Statements of the second law<br />No process is possible whose sole result is the transfer of heat from a body of lower temperature to a body of higher temperature (Clausius)<br />No process is possible in which the sole result is the absorption of heat from a reservoir and its complete conversion into work (Kelvin-Planck)<br />Equivalent ways of stating the law are<br />i. the total entropy increases in a spontaneous process and tends toward a maximum<br />ii. after any spontaneous process, work must be converted to heat in order to restore the system to its initial state<br />
Properties of entropy<br />Entropy is a state function: its value depends only on the initial and final states of the system<br />Entropy is additive, i.e. ST = S1 + S2 + S3 + ...<br />Entropy is a probability function<br />
Derivation of expression for entropy change<br />If the probabilities of finding a system in states 1 and 2 are W1 and W2, then the probability of finding the system in both parts is the product of the individual probabilities, W = W1 x W2. Since entropy is additive,<br />S(W1 x W2) = S(W1) + S(W2) (1)<br />The condition set by Eq 1 can only be fulfilled if entropy depends logarithmically on the probability, i.e.<br />S = log(W1 x W2) = logW1 + logW2 (2)<br />Consider an ideal gas expanding between two vessels joined together. The probability of finding the gas in each vessel is proportional to its volume, so W1 = aV1 and W2 = aV2. Since S is additive,<br />ΔS = S2 - S1 = log(aV2) - log(aV1) = log(V2/V1)<br />From the first law of thermodynamics it can be shown that, for a reversible isothermal expansion, the work done equals the heat absorbed: q(rev) = nRTln(V2/V1). Multiplying ΔS by the constant 2.303nR gives ΔS = nRln(V2/V1), so that q(rev) = TΔS. It therefore follows that ΔS can be expressed as<br />ΔS = q(rev)/T (3)<br />
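The result of the derivation above, ΔS = nR ln(V2/V1) with q(rev) = TΔS, can be checked numerically. The values (1 mol of gas doubling its volume at 298.15 K) are assumed for illustration.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def entropy_isothermal_expansion(n_mol, v1, v2):
    """Entropy change for reversible isothermal expansion of an ideal gas:
    dS = n*R*ln(V2/V1), i.e. Eq 3 with q_rev = n*R*T*ln(V2/V1)."""
    return n_mol * R * math.log(v2 / v1)

# Assumed example: 1 mol doubling its volume at 298.15 K
dS = entropy_isothermal_expansion(1.0, 1.0, 2.0)
q_rev = 298.15 * dS  # heat absorbed, from q_rev = T * dS
print(round(dS, 3), round(q_rev, 1))
```

Doubling the volume gives ΔS = R ln 2 ≈ 5.76 J/K per mole, independent of temperature, while the heat absorbed q(rev) scales with T as Eq 3 requires.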
Consequences of the second law of thermodynamics<br />We shall consider the following consequences of the 2nd law of thermodynamics:<br />Entropy change for an ideal gas<br />Entropy of mixing ideal gases<br />Carnot cycle<br />Free energy change<br />
ΔS and spontaneity of a reaction<br />When ΔS(total) is positive, the reaction is spontaneous<br />When ΔS(total) is zero, the reaction is at equilibrium<br />When ΔS(total) is negative, the reaction is non-spontaneous<br />The limitation is that these criteria apply to the total entropy of system plus surroundings, and we who measure the entropy are part of the surroundings. Therefore ΔS of the system alone is not a unique parameter for predicting the direction of a chemical reaction<br />
ΔG and spontaneity of a reaction<br />ΔG > 0, non-spontaneous (ΔH > TΔS)<br />ΔG < 0, spontaneous (ΔH < TΔS)<br />ΔG = 0, reaction at equilibrium (ΔH = TΔS)<br />ΔG is a state function applicable at constant pressure. At constant volume the corresponding state function is the work function (Helmholtz free energy), expressed as<br />ΔA = ΔE - TΔS<br />When ΔA < 0, spontaneous<br />When ΔA > 0, non-spontaneous<br />When ΔA = 0, at equilibrium<br />
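The spontaneity criteria above can be sketched as a small function computing ΔG = ΔH - TΔS and classifying the result. The example values (ΔH = -50 kJ, ΔS = -100 J/K at 298 K) are assumed for illustration.

```python
def gibbs_verdict(dH_j, dS_j_per_k, T_k):
    """Apply the slide's criteria: dG = dH - T*dS;
    dG < 0 spontaneous, dG > 0 non-spontaneous, dG = 0 equilibrium."""
    dG = dH_j - T_k * dS_j_per_k
    if dG < 0:
        return dG, "spontaneous"
    if dG > 0:
        return dG, "non-spontaneous"
    return dG, "at equilibrium"

# Assumed example: exothermic reaction with decreasing entropy at 298 K
dG, verdict = gibbs_verdict(-50_000.0, -100.0, 298.0)
print(dG, verdict)  # -20200.0 spontaneous
```

Even though ΔS of the system is negative here, the enthalpy term dominates at 298 K and the reaction is predicted to be spontaneous, which illustrates why entropy and enthalpy data must be combined.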
CONCLUSION<br />The thermodynamic function obtained from the second law is entropy<br />Entropy is a measure of disorderliness; enthalpy is a measure of heat content<br />Entropy data must be combined with enthalpy (or internal energy) data in order to predict the direction of a chemical reaction<br />