# Fuzzy directed divergence and image segmentation

This presents an approach to obtaining fuzzy directed divergences and their use in binary image segmentation.


### Transcript

• 1. National Conference on Machine Intelligence Research and Advancement (NCMIRA '12), Shri Mata Vaishno Devi University, Katra, Jammu & Kashmir, 21-23rd Dec., 2012. "New Measure of Fuzzy Directed Divergence and its Application in Image Segmentation". Surender Singh, Asstt. Prof., School of Mathematics, Shri Mata Vaishno Devi University, Katra 182320 (J & K).
• 2. Outline: Shannon's measure; divergence measure; fuzzy sets; fuzzy directed divergence; aggregation operations; development of a new measure of fuzzy directed divergence; application in image segmentation; conclusion.
• 3. Shannon initially developed information theory to quantify the information lost in transmitting a message over a communication channel (Shannon, 1948). The measure of information was defined by Claude E. Shannon in his 1948 treatise:
$$H(P) = -\sum_{i=1}^{n} p_i \log p_i, \quad P \in \Delta_n \quad (1)$$
where $\Delta_n = \{P = (p_1, p_2, \ldots, p_n) : p_i \geq 0,\ \sum_{i=1}^{n} p_i = 1;\ n \geq 2\}$ is the set of all complete finite discrete probability distributions.
• 4. The relative entropy, or directed divergence, is a measure of the distance between two probability distributions. The relative entropy or Kullback-Leibler distance (Kullback and Leibler, 1951) between two probability distributions is defined as
$$D(P, Q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i} \quad (2)$$
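As a quick numerical sketch of Eq. (2): the function name and the convention that terms with $p_i = 0$ contribute zero are assumptions of this example, not part of the slides.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler directed divergence D(P, Q) = sum_i p_i log(p_i / q_i).

    Terms with p_i == 0 contribute 0 (convention 0 * log 0 = 0);
    q_i must be positive wherever p_i is positive.
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, p))  # 0.0: D(P, Q) = 0 iff P = Q
print(kl_divergence(p, q))  # positive for P != Q
```

Note that $D(P, Q) \neq D(Q, P)$ in general, which is why the measure is called a *directed* divergence.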
• 5. A valid measure of directed divergence must satisfy the following postulates:
a. $D(P, Q) \geq 0$;
b. $D(P, Q) = 0$ iff $P = Q$;
c. $D(P, Q)$ is a convex function of both $P$ and $Q$.
If, in addition, $D(P, Q)$ satisfies symmetry and the triangle inequality, it is called a distance measure.
• 6. Let $X$ be a universal set and $F(X)$ the set of all its fuzzy subsets. A mapping $D : F(X) \times F(X) \to \mathbb{R}$ is called a divergence between fuzzy subsets if and only if the following axioms hold:
a. $D(A, B) \geq 0$;
b. $D(A, B) = 0$ if $A = B$;
c. $\max\{D(A \cup C, B \cup C),\ D(A \cap C, B \cap C)\} \leq D(A, B)$
for any $A, B, C \in F(X)$.
• 7. Bhandari and Pal (1992) defined the measure of fuzzy directed divergence corresponding to (2) as follows:
$$D(A, B) = \sum_{i=1}^{n} \left[ \mu_A(x_i) \log \frac{\mu_A(x_i)}{\mu_B(x_i)} + (1 - \mu_A(x_i)) \log \frac{1 - \mu_A(x_i)}{1 - \mu_B(x_i)} \right] \quad (3)$$
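Eq. (3) can be sketched directly in code. Memberships are assumed here to lie strictly in (0, 1) so the logarithms stay finite; that restriction and the function name are assumptions of this sketch.

```python
import numpy as np

def bhandari_pal_divergence(mu_a, mu_b):
    """Fuzzy directed divergence of Eq. (3):
    sum_i [mu_A log(mu_A / mu_B) + (1 - mu_A) log((1 - mu_A) / (1 - mu_B))].

    Assumes all memberships are strictly inside (0, 1) to avoid log(0).
    """
    mu_a, mu_b = np.asarray(mu_a, float), np.asarray(mu_b, float)
    return float(np.sum(mu_a * np.log(mu_a / mu_b)
                        + (1 - mu_a) * np.log((1 - mu_a) / (1 - mu_b))))

a = np.array([0.2, 0.7, 0.9])
b = np.array([0.3, 0.6, 0.8])
print(bhandari_pal_divergence(a, a))  # 0.0 when A = B
print(bhandari_pal_divergence(a, b))  # positive otherwise
```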
• 8. An aggregation operation is a function $M : [0,1]^n \to [0,1]$ satisfying:
1. $M(0, 0, \ldots, 0) = 0$ and $M(1, 1, \ldots, 1) = 1$ (boundary conditions);
2. $M$ is monotonic in each argument (monotonicity).
If $n = 2$, then $M$ is called a binary aggregation operation.
• 9. Let $U(a, b)$ and $V(a, b)$ be two binary aggregation operators. Then
$$D(P, Q) = \sum_{i} \left| U(p_i, q_i) - V(p_i, q_i) \right| \quad (4)$$
where $P, Q \in \Delta_n$, is a divergence measure. We have $A^* : [0,1]^2 \to [0,1]$ such that
$$A^*(a, b) = \frac{a + b}{2}$$
and $H^* : [0,1]^2 \to [0,1]$ such that
$$H^*(a, b) = \frac{a^2 + b^2}{a + b}$$
• 10. $A^*$ and $H^*$ are aggregation operators. The following divergence measure can then be defined using the proposed method:
$$D_{H^*A^*}(P, Q) = \sum_{i=1}^{n} \left( \frac{p_i^2 + q_i^2}{p_i + q_i} - \frac{p_i + q_i}{2} \right) = \sum_{i=1}^{n} \frac{(p_i - q_i)^2}{2(p_i + q_i)} \quad (5)$$
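A minimal sketch of Eqs. (4) and (5) using the operators $A^*$ and $H^*$, checking numerically that the two forms of Eq. (5) agree (function names are illustrative, not from the slides):

```python
import numpy as np

def a_star(a, b):
    """Arithmetic-mean aggregation operator A*(a, b) = (a + b) / 2."""
    return (a + b) / 2

def h_star(a, b):
    """Aggregation operator H*(a, b) = (a^2 + b^2) / (a + b)."""
    return (a**2 + b**2) / (a + b)

def divergence_h_a(p, q):
    """Divergence of Eq. (5) built as in Eq. (4): since H* >= A* pointwise,
    sum_i [H*(p_i, q_i) - A*(p_i, q_i)]."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(h_star(p, q) - a_star(p, q)))

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.1, 0.4, 0.5])
# Closed form on the right-hand side of Eq. (5):
closed_form = float(np.sum((p - q)**2 / (2 * (p + q))))
print(divergence_h_a(p, q), closed_form)  # the two forms agree
print(divergence_h_a(p, p))               # vanishes (up to rounding) when P = Q
```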
• 11. The measure of fuzzy directed divergence between two fuzzy sets corresponding to (5) is defined as follows:
$$M^F_{H^*A^*}(A, B) = \sum_{i=1}^{n} \frac{(\mu_A(x_i) - \mu_B(x_i))^2}{2} \left[ \frac{1}{\mu_A(x_i) + \mu_B(x_i)} + \frac{1}{2 - \mu_A(x_i) - \mu_B(x_i)} \right] \quad (6)$$
• 12. Let $X = \{f_{ij}, \mu(f_{ij})\}$, $f_{ij} \in X$, be an image of size $M \times M$ having $L$ gray levels, where:
$f_{ij}$ = gray level of the $(i, j)$th pixel in the image $X$;
$0 \leq \mu(f_{ij}) \leq 1$ = membership value of the $(i, j)$th pixel in $X$;
$\mathrm{count}(f)$ = number of occurrences of the gray level $f$ in the image;
$t$ = given threshold value which separates the object and the background.
• 13.
$$\mu_0 = \frac{\sum_{f=0}^{t} f \cdot \mathrm{count}(f)}{\sum_{f=0}^{t} \mathrm{count}(f)} = \text{average gray level of the background region}$$
$$\mu_1 = \frac{\sum_{f=t+1}^{L-1} f \cdot \mathrm{count}(f)}{\sum_{f=t+1}^{L-1} \mathrm{count}(f)} = \text{average gray level of the object region}$$
• 14. For bilevel thresholding,
$$\mu(f_{ij}) = \begin{cases} \exp(-c\,|f_{ij} - \mu_0|) & \text{if } f_{ij} \leq t \quad \text{(background)} \\ \exp(-c\,|f_{ij} - \mu_1|) & \text{if } f_{ij} > t \quad \text{(object)} \end{cases}$$
where $t$ is the chosen threshold as stated and
$$c = \frac{1}{f_{\max} - f_{\min}}$$
with $f_{\min}$ and $f_{\max}$ the minimum and maximum gray levels in the image, respectively.
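The region means and this exponential membership construction can be sketched as follows. The guard against empty regions (`max(..., 1)`) and the function names are assumptions added for robustness, not part of the slides.

```python
import numpy as np

def region_means(image, t, L=256):
    """Average gray levels mu0 (background, f <= t) and mu1 (object, f > t),
    computed from the image histogram as on the preceding slide."""
    hist = np.bincount(image.ravel(), minlength=L)
    f = np.arange(L)
    mu0 = (f[:t+1] * hist[:t+1]).sum() / max(hist[:t+1].sum(), 1)
    mu1 = (f[t+1:] * hist[t+1:]).sum() / max(hist[t+1:].sum(), 1)
    return mu0, mu1

def membership(image, t, L=256):
    """Bilevel membership mu(f_ij) = exp(-c |f_ij - mu_r|), where mu_r is
    mu0 for background pixels and mu1 for object pixels, and
    c = 1 / (f_max - f_min)."""
    mu0, mu1 = region_means(image, t, L)
    c = 1.0 / (int(image.max()) - int(image.min()))
    return np.where(image <= t,
                    np.exp(-c * np.abs(image - mu0)),
                    np.exp(-c * np.abs(image - mu1)))

img = np.array([[10, 12, 200], [11, 198, 205]], dtype=np.uint8)
mu = membership(img, t=100)
print(mu)  # values in (0, 1]; closer to 1 near each region's mean
```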
• 15. Let $\mu_A(f_{ij})$ and $\mu_B(f_{ij})$ be the membership values of the $(i, j)$th pixel in the images $A$ and $B$. Then, in view of equation (6), the fuzzy divergence between $A$ and $B$ is given by
$$M^F_*(A, B) = \sum_{i=0}^{M-1} \sum_{j=0}^{M-1} \frac{(\mu_A(f_{ij}) - \mu_B(f_{ij}))^2}{2} \left[ \frac{1}{\mu_A(f_{ij}) + \mu_B(f_{ij})} + \frac{1}{2 - \mu_A(f_{ij}) - \mu_B(f_{ij})} \right] \quad (7)$$
• 16. Chaira and Ray (2005) proposed the following methodology for binary image thresholding. For bilevel or multilevel thresholding, a search based on the image histogram is employed. For each candidate threshold, the membership values of all pixels in the image are computed using the above procedure, and the membership values of the thresholded image are compared with those of an ideally thresholded image.
• 17. With $\mu_B(f_{ij}) = 1$ for the ideally thresholded image, equation (7) reduces to
$$M^F_*(A, B) = \sum_{i=0}^{M-1} \sum_{j=0}^{M-1} \frac{(\mu_A(f_{ij}) - 1)^2}{2} \left[ \frac{1}{\mu_A(f_{ij}) + 1} + \frac{1}{1 - \mu_A(f_{ij})} \right] = \sum_{i=0}^{M-1} \sum_{j=0}^{M-1} \frac{1 - \mu_A(f_{ij})}{1 + \mu_A(f_{ij})} \quad (8)$$
• 18. An ideally thresholded image is one that is precisely segmented, so that pixels in the object or background region belong totally to their respective regions. From the divergence value of each pixel between the ideally segmented image and the chosen thresholded image, the fuzzy divergence is found. In this way, for each threshold, the divergence of each pixel is determined according to Eq. (8), and the cumulative divergence is computed for the whole image.
• 19. The minimum divergence is selected, and the corresponding gray level is chosen as the optimum threshold. The resulting thresholded image is then close to the ideally thresholded image.
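The threshold search described on the last few slides can be sketched end to end. Here `membership_fn` is a hypothetical parameter standing in for the bilevel membership construction of slide 14, and the toy membership function in the demo is purely illustrative; only Eq. (8) itself comes from the slides.

```python
import numpy as np

def fuzzy_divergence(mu_a):
    """Eq. (8): fuzzy divergence between a thresholded image with pixel
    memberships mu_a and the ideally thresholded image (all memberships 1):
    sum_ij (1 - mu_a) / (1 + mu_a)."""
    return float(np.sum((1 - mu_a) / (1 + mu_a)))

def optimal_threshold(image, membership_fn, L=256):
    """Search the gray-level range and return the threshold minimizing Eq. (8).

    membership_fn(image, t, L) -> per-pixel membership array, e.g. the
    exponential bilevel membership of slide 14.
    """
    best_t, best_d = None, np.inf
    for t in range(int(image.min()), int(image.max())):
        d = fuzzy_divergence(membership_fn(image, t, L))
        if d < best_d:
            best_t, best_d = t, d
    return best_t

# Demo with a toy membership function: memberships are perfect (all 1)
# only at the "right" threshold 100, so the search must return 100.
def toy_membership(image, t, L=256):
    return np.full(image.shape, 1.0 if t == 100 else 0.5)

img = np.array([[0, 50, 200, 255]], dtype=np.uint8)
t_best = optimal_threshold(img, toy_membership)
print(t_best)  # 100
```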
• 20. In this communication, an approach to developing measures of fuzzy directed divergence using aggregation operators is proposed. The proposed class of fuzzy directed divergences can be generalized in terms of one, two, or three parameters. Fuzzy directed divergence is also useful for solving problems in decision making, pattern recognition, and related areas.
• 21.
Basseville, M., Divergence measures for statistical data processing. Publications Internes de l'IRISA, November 2010.
Bhandari, D., Pal, N.R., Some new information measures for fuzzy sets. Information Sciences, 1993, 67: 204-228.
Bhatia, P.K., Singh, S., On some divergence measures between fuzzy sets and aggregation operations (communicated).
Brink, A.D., Pendcock, N.E., Minimum cross-entropy threshold selection. Pattern Recognition, 1996, 29: 179-188.
Cheng, H.D., Chen, H.H., Image segmentation using fuzzy homogeneity criterion. Inform. Sci., 1997, 98: 237-262.
• 22.
Chaira, T., Ray, A.K., Segmentation using fuzzy divergence. Pattern Recognition Letters, 2005, 24: 1837-1844.
Couso, I., Janis, V., Montes, S., Fuzzy divergence measures. Acta Univ. M. Belii Math., 2004, 8: 21-26.
De Luca, A., Termini, S., A definition of non-probabilistic entropy in the setting of fuzzy set theory. Inform. and Control, 1971, 20: 301-312.
Esteban, M.D., Morales, D., A summary on entropy statistics. Kybernetika, 1995, 31(4): 337-346.
Huang, L.K., Wang, M.J., Image thresholding by minimizing the measure of fuzziness. Pattern Recognition, 1995, 28(1): 41-51.
• 23.
Kapur, J.N., Sahoo, P.K., Wong, A.K.C., A new method of gray level picture thresholding using the entropy of the histogram. Comput. Vision Graphics Image Process., 1985, 29: 273-285.
Kullback, S., Leibler, R.A., On information and sufficiency. Ann. Math. Statist., 1951, 22: 79-86.
Otsu, N., A threshold selection method from gray level histograms. IEEE Trans. Systems Man Cyber., 1979, 9: 62-66.
Pal, S.K., Dasgupta, A., Spectral fuzzy sets and soft thresholding. Inform. Sci., 1992, 65: 65-97.
Pal, N.R., Pal, S.K., Entropy: a new definition and its application. IEEE Trans. Systems Man Cyber., 1991, 21: 1260-1270.
• 24.
Qing, M., Li, T., Some properties and new formulae of fuzzy entropy. Proceedings of the 2004 IEEE International Conference on Networking, Sensing & Control, Taipei, Taiwan, March 21-23, 2004: 401-406.
Ramar, K., et al., Quantitative fuzzy measures for threshold selection. Pattern Recognition Lett., 2000, 21: 1-7.
Sahoo, P.K., Wilkins, C., Yager, Threshold selection using Renyi's entropy. Pattern Recognition, 1997, 30: 71-84.
Shannon, C.E., The mathematical theory of communications. Bell Syst. Tech. Journal, 1948, 27: 423-467.
Taneja, I.J., Generalized Information Measures and their Applications. Online book: http://www.mtm.ufsc.br/~taneja/book/book.html, 2001.
• 25.
Renyi, A., On measures of entropy and information. Proc. 4th Berkeley Symp. Math. Stat. Probab., 1961, 1: 547-561.
Sahoo, P.K., Wong, A.K.C., A gray level threshold selection method based on maximum entropy principle. IEEE Trans. Systems Man Cyber., 1989, 19: 866-871.
Sahoo, P.K., Soltani, S., Wong, A.K.C., Chen, Y.C., A survey of thresholding techniques. Comput. Vision Graphics Image Process., 1988: 233-260.
Taneja, I.J., On mean divergence measures. arXiv:math/0501297v1 [math.ST], 19 Jan 2005.
Taxt, T., Flynn, P.J., Jain, A.K., Segmentation of document images. IEEE Trans. Pattern Analysis Mach. Intell., 1989, 11(12): 1322-1329.
Zadeh, L.A., Fuzzy sets. Information and Control, 1965, 8: 338-353.
• 26. Thanks