Entropy and Probability


A Keynote companion to a lecture designed to give introductory physics students a deeper understanding of the idea of entropy than simply "disorder"


Statistics

Total Views: 9,452
Views on SlideShare: 9,399
Embed Views: 53

Likes: 16
Downloads: 0
Comments: 4

Embeds (9 sites, 53 views):

http://www.slideshare.net 29
https://lms.kku.edu.sa 9
http://campus.psecademy.org 4
http://psecademy.mrooms3.net 4
https://lms.tp.edu.sg 3
http://blackboard.stonybrook.edu 1
https://bishopstate.blackboard.com 1
https://education.blackboard.com 1
https://blendedschools.blackboard.com 1

Upload Details

Uploaded as Adobe PDF

Usage Rights

© All Rights Reserved

4 comments

  • I like it. This keynote is very stimulating to thinking...
  • To think about entropy, a different and good way.
  • Very lucid, thanks.

    Entropy and Probability: Presentation Transcript

    • Entropy and Probability. David L. Morgan, Eugene Lang College, The New School
    • Is there any total which is more likely?
    • 1,1 1,2 1,3 1,4 1,5 1,6
      2,1 2,2 2,3 2,4 2,5 2,6
      3,1 3,2 3,3 3,4 3,5 3,6
      4,1 4,2 4,3 4,4 4,5 4,6
      5,1 5,2 5,3 5,4 5,5 5,6
      6,1 6,2 6,3 6,4 6,5 6,6
    • 2 3 4 5 6 7
      3 4 5 6 7 8
      4 5 6 7 8 9
      5 6 7 8 9 10
      6 7 8 9 10 11
      7 8 9 10 11 12
    • 2 3 4 5 6 7 8 9 10 11 12
      2 3 4 5 6 7
      3 4 5 6 7 8
      4 5 6 7 8 9
      5 6 7 8 9 10
      6 7 8 9 10 11
      7 8 9 10 11 12
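The counting in the tables above can be checked with a short script (a minimal sketch; the variable names are illustrative). Enumerating all 36 equally likely outcomes of two dice shows that 7 is the most likely total because it can be made in the most ways:

```python
from collections import Counter

# Enumerate all 36 equally likely outcomes for two six-sided dice
# and count how many of them give each possible total.
totals = Counter(a + b for a in range(1, 7) for b in range(1, 7))

for total in sorted(totals):
    print(f"total {total:2d}: {totals[total]} ways out of 36")
```

A total of 7 occurs 6 ways, while 2 and 12 each occur only once.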
    • Imagine we divide your bedroom into 100 squares of equal size... One square is occupied by a laundry basket...
    • Let’s imagine you have 10 pairs of socks... If we throw our 20 socks randomly around the room – what is the chance that they’ll all land in the laundry basket?
    • 1/100
    • 1/100 x 1/100 = 1/10,000
    • 1/100 x 1/100 x 1/100 = 1 in 1,000,000
    • 1/100 x 1/100 x 1/100 x 1/100 x 1/100 x 1/100............... = 1 in 10^40
    • With all twenty socks, the probability that they will all wind up in the basket purely by chance is one in 10^40!
    • Ordered arrangements like this one are very rare, and therefore very unlikely.
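The arithmetic in this sequence of slides can be verified exactly (a minimal sketch using Python's `fractions` module so no precision is lost):

```python
from fractions import Fraction

n_socks = 20
p_one = Fraction(1, 100)   # one basket square out of 100, per sock
p_all = p_one ** n_socks   # all 20 independent throws land in the basket

print(p_all)               # exactly 1 in 10^40
```

Multiplying twenty independent factors of 1/100 gives 1/10^40, matching the slide.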
    • There are many more of these “messy” arrangements than ordered ones.
    • LOW ENTROPY HIGH ENTROPY If there are only a few ways to get an arrangement, we say that state has a LOW ENTROPY... If there are LOTS of equivalent arrangements for a particular state, we say that state has a HIGH ENTROPY.
    • LOW ENTROPY HIGH ENTROPY The Second Law of Thermodynamics tells us that in nature, a closed system will always move from a state of low entropy to a state of higher entropy over time.
    • “Unlikely” “Likely” If we think of entropy in terms of probability, the reason for the 2nd Law is clear... what we call “disordered” states are more likely to occur over time, simply because there are more of them!
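The claim that there are vastly more "messy" arrangements than ordered ones can be made concrete with a count (a sketch under the slide's own assumptions: 20 distinguishable socks, each equally likely to land in any of the 100 squares; the helper name `arrangements` is illustrative):

```python
import math

n_socks, n_squares = 20, 100

def arrangements(k):
    # Number of distinct ways to scatter the socks so that exactly k
    # of them are in the one basket square: choose which k socks,
    # then place the remaining n_socks - k in any of the other squares.
    return math.comb(n_socks, k) * (n_squares - 1) ** (n_socks - k)

ordered = arrangements(n_socks)  # every sock in the basket: exactly 1 way
messy = arrangements(0)          # no sock in the basket: 99**20 ways

print(ordered, messy)
```

The single ordered arrangement is swamped by roughly 8 x 10^39 arrangements with no sock in the basket, which is why the high-entropy state is the one we observe.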