Entropy and its significance related to GIS

Transcript

  • 1. ENTROPY: ITS SIGNIFICANCE. SUBMITTED BY: NAINA GUPTA
  • 2. ENTROPY • Entropy is a mathematical quantity describing the disorder or randomness of a system. • Boltzmann gave entropy its statistical interpretation, using it to measure the disorder of a thermodynamic system. • Shannon used the concept of informational entropy to measure the uncertainty associated with a given piece of information.
  • 3. • According to Wilson, four ways to view the concept of entropy can be defined: - Entropy as a measure of system properties (e.g. order and disorder, reversibility and irreversibility, complexity and simplicity); - Entropy as a measure of information, uncertainty, or probability;
  • 4. - Entropy as a statistic of a probability distribution, used as a measure of information or uncertainty; - Entropy as the negative of a Bayesian log-likelihood function.
  • 5. Information Theory
  • 6. SHANNON’S ENTROPY • It is generally defined as the average amount of information needed to eliminate the uncertainty associated with a finite number of alternative events. • Shannon defines the entropy as:
  • 7. H = − Σᵢ P(sᵢ) log P(sᵢ) • Here S is a system with a finite number of possible events sᵢ, P(sᵢ) is the probability of event sᵢ, and the summation runs over i = 1, 2, …, n. • Entropy is maximum when all the probabilities are equal, which is the most uncertain situation. • Entropy is minimum (H = 0) when one event is certain. (A short numerical sketch of this formula follows the transcript.)
  • 8. Entropy in the case of two possibilities with probabilities p and (1 − p), i.e. H = −p log p − (1 − p) log(1 − p). (See the two-outcome sketch after the transcript.)
  • 9. • When the binary logarithm is used, the entropy is measured in bits. • If the common (base-10) logarithm is used, the units are dits (hartleys). • If the natural logarithm is used, the units are nats. (A sketch comparing the three units follows the transcript.)
  • 10. ELIMINATING THE ENTROPY
  • 11. SPATIAL ENTROPY • Spatial entropy was defined by Batty, building on the information-theoretic basis presented by Shannon. • In Batty’s formulation, xᵢ represents the spatial interval (zone) size. With this spatial component included, the equation becomes more applicable to spatial analysis, such as comparisons between different regions. (A sketch of the calculation follows the transcript.)
  • 12. ENTROPY AS A USEFUL PART OF SPATIAL VISUALIZATION AND MODELLING • This figure shows the dependence of the resulting kriging estimators of the spatial phenomenon on the number of observations. Beyond a certain point, additional observations give us no useful information for the resulting spatial model.
  • 13. • This figure shows the dependence of the resulting amount of information, measured with the entropy function, on the number of modelled points.
  • 14. • The last part is the spatial visualization of the natural phenomenon. In this figure, the spatial entropy function is convex, and therefore there exist points beyond which each interval added to the visualization of the spatial phenomenon gives much less information than the previous one.
  • 15. • This figure shows four GRID layers of the same climatic phenomenon with different numbers of intervals, increasing from the upper-left to the lower-right picture. Each layer contains a different amount of information, depending on various factors. (A sketch of this effect follows the transcript.)
  • 16. CONCLUSION • Information theory, entropy, and its spatial form are widely used in geographical research. • Entropy concepts can be used to investigate channel networks. Much of the work employing entropy concepts in hydrology has been done with reference to informational entropy.
  • 17. Thank You!
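
The formula on slide 7 can be checked with a minimal Python sketch. The two distributions below are made-up illustrative values, not data from the presentation.

```python
# A minimal sketch of Shannon's entropy, H = -sum_i P(s_i) * log P(s_i).
import math

def shannon_entropy(probs, base=2):
    """Entropy of a finite event system; zero-probability events contribute nothing."""
    return sum(-p * math.log(p, base) for p in probs if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # all probabilities equal -> maximum entropy
certain = [1.0, 0.0, 0.0, 0.0]       # one certain event       -> minimum entropy

print(shannon_entropy(uniform))   # ~2.0 bits, i.e. log2(4), the maximum for four events
print(shannon_entropy(certain))   # 0.0 bits, the minimum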
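```

Slide 8 refers to the two-outcome case. A small sketch of that curve, H(p) = −p log₂ p − (1 − p) log₂(1 − p), which peaks at p = 0.5:

```python
# Entropy of a two-event system with probabilities p and (1 - p).
import math

def binary_entropy(p):
    if p in (0.0, 1.0):            # a certain outcome carries no uncertainty
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"p = {p:.1f}  ->  H = {binary_entropy(p):.3f} bits")   # maximum of 1 bit at p = 0.5
```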
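Slide 9's point about units amounts to a change of logarithm base: the same distribution gives values in bits, dits (hartleys), or nats that differ only by a constant factor. A short check, using an arbitrary example distribution:

```python
# The same entropy measured with three different logarithm bases.
import math

probs = [0.5, 0.25, 0.125, 0.125]          # an arbitrary example distribution

def entropy(probs, base):
    return sum(-p * math.log(p, base) for p in probs if p > 0)

print(entropy(probs, 2))        # bits            (binary logarithm)  -> 1.75
print(entropy(probs, 10))       # dits / hartleys (common logarithm)  -> 1.75 * log10(2)
print(entropy(probs, math.e))   # nats            (natural logarithm) -> 1.75 * ln(2)
```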
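Slide 11's formula is not reproduced in the transcript; the sketch below assumes the commonly cited form of Batty's spatial entropy, H = −Σ pᵢ ln(pᵢ / xᵢ), where pᵢ is the share of the phenomenon in zone i and xᵢ is the zone's size. The shares and zone areas are made-up illustrative numbers.

```python
# A sketch of spatial entropy in the form commonly attributed to Batty:
# H = -sum_i p_i * ln(p_i / x_i).  Larger values indicate a more even spread
# of the phenomenon per unit area.
import math

def spatial_entropy(shares, zone_sizes):
    return sum(-p * math.log(p / x) for p, x in zip(shares, zone_sizes) if p > 0)

shares     = [0.4, 0.3, 0.2, 0.1]     # p_i: proportion of the phenomenon in each zone
zone_sizes = [10.0, 5.0, 5.0, 2.5]    # x_i: zone areas (e.g. km^2), hypothetical values

print(spatial_entropy(shares, zone_sizes))
```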
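Slides 14 and 15 describe how adding intervals to a GRID layer yields progressively less extra information. A hedged sketch of that effect on a hypothetical, randomly generated raster (numpy is assumed to be available):

```python
# Reclassifying the same raster into more intervals raises its entropy,
# but each additional interval contributes less, which echoes slide 14.
import numpy as np

rng = np.random.default_rng(0)
raster = rng.normal(loc=15.0, scale=5.0, size=(200, 200))   # stand-in for a climatic GRID layer

def layer_entropy(values, n_intervals):
    counts, _ = np.histogram(values, bins=n_intervals)      # classify values into intervals
    p = counts / counts.sum()                                # interval probabilities
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())                    # entropy in bits

for n in (4, 8, 16, 32):
    print(f"{n:2d} intervals -> {layer_entropy(raster, n):.3f} bits")
```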
