Entropy and its significance related to GIS

  1. ENTROPY - ITS SIGNIFICANCE. Submitted by: NAINA GUPTA
  2. ENTROPY • Entropy is a mathematical quantity describing disorder or discontinuity. • Boltzmann introduced the concept of entropy to measure the disorder in a thermodynamic system. • Shannon used the concept of informational entropy to measure the uncertainty associated with given information.
  3. • According to Wilson, four ways to view the concept of entropy can be defined: - Entropy as a measure of system properties (e.g. order and disorder, reversibility and irreversibility, complexity and simplicity); - Entropy as a measure of information, uncertainty, or probability;
  4. - Entropy as a statistic of a probability distribution, as a measure of information or uncertainty; - Entropy as the negative of a Bayesian log-likelihood function.
  5. Information Theory
  6. SHANNON'S ENTROPY • It is generally defined as the average amount of information needed to eliminate the uncertainty among a finite number of alternative events. • Shannon defines the entropy as:
  7. H = -∑_{i=1}^{n} P(si) log P(si) • Where S is a system with a finite number of possible events si, P(si) is the probability of event si, and the summation runs over i = 1, 2, 3, …, n. • Entropy is maximum when all the probabilities are equal, which is the most uncertain situation. • Entropy is minimum (H = 0) when a single event is certain.
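A minimal sketch of how the formula above can be evaluated (not part of the original slides; the function name and the example distributions are ours), in Python:

    import math

    def shannon_entropy(probs, base=2):
        # H = -sum(p_i * log(p_i)); events with zero probability contribute nothing
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    # Four equally likely events: the most uncertain case, H = log2(4) = 2 bits
    print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0

    # One certain event: H = 0
    print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))        # 0.0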
  8. Entropy in the case of two possibilities with probabilities p and (1 - p)
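For this two-outcome case the general formula reduces to H(p) = -p log2(p) - (1 - p) log2(1 - p); this specialization is implied rather than written out in the transcript. It reaches its maximum of 1 bit at p = 0.5 and falls to 0 at p = 0 or p = 1; for example, H(0.9) ≈ 0.469 bits.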
  9. • The entropy units are bits when the binary logarithm is used. • If the common logarithm is used, then the units are dits. • And the units are nats in case the natural logarithm is used.
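As a small illustrative check (our own example, not from the slides), the same distribution measured with different logarithm bases gives the corresponding units:

    import math

    p = [0.5, 0.25, 0.25]
    for base, unit in [(2, "bits"), (math.e, "nats"), (10, "dits")]:
        h = -sum(q * math.log(q, base) for q in p)
        print(f"H = {h:.4f} {unit}")   # 1.5 bits, ~1.0397 nats, ~0.4515 dits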
  10. ELIMINATING THE ENTROPY
  11. SPATIAL ENTROPY • This was defined by Batty, building on the information-theory basis presented by Shannon (the slide's equation is sketched below). • Here xi represents the spatial interval size. With this spatial component included, the equation is more applicable in spatial analysis, such as comparisons between different regions.
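The slide's equation is not reproduced in this transcript; Batty's spatial entropy is commonly written as H = -∑ p_i log(p_i / Δx_i), where Δx_i (the slide's xi) is the size of interval or zone i. A minimal sketch under that assumption, with illustrative zone shares and sizes of our own choosing:

    import math

    def spatial_entropy(probs, zone_sizes):
        # Batty's spatial entropy: H = -sum(p_i * log(p_i / dx_i))
        # probs      - share of the phenomenon in each zone (summing to 1)
        # zone_sizes - interval or zone size dx_i for each zone
        return -sum(p * math.log(p / dx) for p, dx in zip(probs, zone_sizes) if p > 0)

    # Equal shares spread over zones of unequal size
    print(spatial_entropy([0.25, 0.25, 0.25, 0.25], [1.0, 2.0, 3.0, 4.0]))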
  12. ENTROPY AS A USEFUL PART OF SPATIAL VISUALIZATION AND MODELLING • This figure shows the dependency between the number of observations and the resulting kriging estimates of the spatial phenomenon. Here, some of the observations give us no useful information for the resulting spatial model.
  13. • This figure shows how the resulting amount of information, measured with the entropy function, depends on the number of modelled points.
  14. • The last part is the spatial visualization of the natural phenomenon. In this figure, spatial entropy is a convex function, and therefore there exists a point beyond which each interval added to the visualization of the spatial phenomenon gives us much less information than the previous one.
  15. • Here there are 4 GRID layers of the same climatic phenomenon but with different numbers of intervals, increasing from the upper-left to the lower-right picture. Each layer contains a different amount of information, depending on various factors (see the sketch below).
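As a hedged sketch of the comparison the slide describes (the synthetic values, interval counts, and function name are ours; a real GRID layer would be read from a GIS raster instead), entropy can be computed for the same data reclassified into different numbers of intervals:

    import numpy as np

    def interval_entropy(values, n_intervals):
        # Entropy (bits) of the values classified into equal-width intervals
        counts, _ = np.histogram(values, bins=n_intervals)
        p = counts / counts.sum()
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    # Synthetic stand-in for a climatic surface
    rng = np.random.default_rng(0)
    values = rng.normal(loc=15.0, scale=3.0, size=10_000)

    for k in (2, 4, 8, 16, 32):
        print(k, "intervals ->", round(interval_entropy(values, k), 3), "bits")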
  16. CONCLUSION • Information theory, entropy, and its spatial form are widely used in geographical research. • Entropy concepts can be used to investigate channel networks; much of the work employing entropy concepts in hydrology has been done with reference to informational entropy.
  17. Thank You!
