Jensen–Shannon divergence for texture classification

Transcript

  • 1. Membrane Detector by Texture Analysis. Based on "An Analysis of Edge Detection by Using the Jensen-Shannon Divergence" by Gómez-Lopera, Juan Francisco; Martínez-Aroza, José; Robles-Pérez, Aureliano M.; and Román-Roldán, Ramón. Presented by Rodrigo Rojas Moraleda, July 4, 2012.
  • 2. Outline: 1 Introduction, 2 The system, 3 Conclusions.
  • 3. Outline: 1 Introduction, 2 The system, 3 Conclusions.
  • 4. Introduction: Texture analysis. Definition: texture is one of the most important visual cues for identifying homogeneous regions; assigning each region to a texture class is called texture classification. The goal of texture classification is to produce a classification map of the input image in which each uniformly textured region is labelled with the texture class it belongs to. Problem features: many machine vision and image processing algorithms make simplifying assumptions about the uniformity of intensities in local image regions, but images of real objects often do not exhibit such uniformity. The patterns in an image can result from physical surface properties such as roughness or oriented strands, which often have a tactile quality, or from reflectance differences such as the color of a surface. One immediate application of image texture is the recognition of image regions by their texture properties.
  • 5. Introduction: Jensen–Shannon divergence. The Jensen–Shannon divergence is a popular measure of the similarity between two probability distributions, also known as the information radius (IRad) or total divergence to the average:
        JSD(P ‖ Q) = (1/2) D(P ‖ M) + (1/2) D(Q ‖ M),  with M = (1/2)(P + Q)
        D(P ‖ Q) = D_KL(P ‖ Q) = Σ_i P(i) log( P(i) / Q(i) )
    The Kullback–Leibler divergence D(P ‖ Q) is the average of the logarithmic difference between the probabilities P and Q, where the average is taken using the probabilities P. The divergence grows as the differences between its arguments (the probability distributions involved) increase, and vanishes when all the probability distributions are identical.
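    A minimal sketch of the definition above in Python/NumPy (the function names are illustrative, not from the paper); it assumes P and Q are histograms already normalised to sum to 1, and uses the natural logarithm since the slides do not fix a base:

        import numpy as np

        def kl_divergence(p, q):
            # D(P||Q) = sum_i P(i) log(P(i)/Q(i)); bins with P(i) = 0 contribute nothing.
            p = np.asarray(p, dtype=float)
            q = np.asarray(q, dtype=float)
            mask = p > 0
            return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

        def js_divergence(p, q):
            # JSD(P||Q) = 1/2 D(P||M) + 1/2 D(Q||M), with M = (P + Q)/2.
            p = np.asarray(p, dtype=float)
            q = np.asarray(q, dtype=float)
            m = 0.5 * (p + q)
            return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)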
  • 6. Introduction: Jensen–Shannon divergence. Texture is one of the most important visual cues for identifying homogeneous regions; texture analysis aims to produce a classification map of the input image in which each uniformly textured region is identified. Considerations about texture analysis and the real world: in image processing it is possible to make assumptions about the uniformity of intensities in local regions, even though real objects often do not exhibit regions of uniform intensity; the patterns in an image can result from physical surface properties such as roughness or oriented strands, or from reflectance differences such as the color of a surface. Image intensities and probabilities: an image histogram represents how frequently the brightness levels 0 to 255 appear in the image, giving a visual impression of the distribution of the data. It is an estimate of the probability distribution of a continuous variable; when a histogram is used as a probability density, its total area is normalised to 1.
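    As a small illustration of the last point, a grey-level window can be turned into such a probability estimate by normalising its histogram. The helper below is a sketch, assuming 8-bit grey levels (0 to 255):

        import numpy as np

        def normalized_histogram(window, levels=256):
            # Count how often each grey level occurs, then normalise so the
            # histogram sums to 1 (an estimate of the probability distribution).
            counts = np.bincount(window.ravel().astype(np.int64), minlength=levels)
            return counts / counts.sum()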
  • 7. Introduction: Jensen–Shannon divergence. The Jensen–Shannon divergence is a popular measure of the cohesion of a finite set of probability distributions having the same number of possible events. Its value grows as the differences between its arguments (the probability distributions involved) increase, and vanishes when all the probability distributions are identical. If we consider a window W made up of two identical subwindows W1 and W2, sliding over a straight horizontal edge between two different homogeneous regions a and b, the Jensen–Shannon divergence between the normalised histograms of the subwindows reaches its maximum value just when each subwindow lies completely within one region. Figure: the compound window W = W1|W2 sliding across the edge.
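    A sketch of this behaviour, reusing the js_divergence and normalized_histogram helpers from the sketches above: the compound window W = W1|W2 slides along a row of the image and the divergence between its two halves is recorded, peaking where W1 and W2 lie on opposite sides of the edge:

        import numpy as np

        def js_profile_along_row(image, top_row, height, half_width):
            # At each horizontal position, W1 and W2 are the left and right
            # halves of the window, each half_width columns wide.
            rows = slice(top_row, top_row + height)
            profile = []
            for col in range(image.shape[1] - 2 * half_width + 1):
                w1 = image[rows, col:col + half_width]
                w2 = image[rows, col + half_width:col + 2 * half_width]
                profile.append(js_divergence(normalized_histogram(w1),
                                             normalized_histogram(w2)))
            return np.array(profile)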
  • 8. Introduction: Jensen–Shannon divergence. By trying several window orientations for each pixel, it is possible to obtain an estimate of the edge orientation that maximises the divergence value. Figure: the values JS1, JS2, JS3 and JS4 are calculated for the fixed window orientations 0, π/4, π/2 and 3π/4.
  • 9. Outline: 1 Introduction, 2 The system, 3 Conclusions.
  • 10. The system: texture analysis. Steps: Step 1, calculation of the divergence and direction matrices. Step 2, edge-pixel selection. Step 3, edge linking.
  • 11. Step 1: calculation of the divergence and direction matrices. Window sliding. Figure: behaviour of the Jensen–Shannon divergence as the window slides over a perfect edge.
  • 12. Step 1: calculation of the divergence and direction matrices. Window sliding. Figure: the values JS1, JS2, JS3 and JS4 are calculated for the fixed window orientations 0, π/4, π/2 and 3π/4. Problem: how to obtain, from these four values, an estimate of the direction that maximises the JS, and then the value of this maximum, JS_max. For a given pixel, the JS value is a π-periodic function of the window orientation over the image; it reaches its maximum value for a given orientation β and its minimum half a period away. Such a periodic function can be expressed as
        JS(x) = a + b cos(β + 2πx),  x ∈ [0, 1]
    where β ∈ [0, π) is the edge direction at the pixel and a, b are constants specifying the offset and amplitude, or equivalently as
        JS(x) = c + m sin(2πx) + n cos(2πx),  x ∈ [0, 1]
  • 13. Step 1: calculation of the divergence and direction matrices. Maximum JS. Take f(x) ≈ sin(2πx) and g(x) ≈ cos(2πx). A least-squares fit over the points JS1, JS2, JS3, JS4 gives
        JS(x) = (JS1 + JS2 + JS3 + JS4)/4 + ((JS2 − JS4)/2) f(x) + ((JS1 − JS3)/2) g(x)
    Maximum JS: the direction x having the maximum JS can be obtained as
        if JS1 − JS3 ≥ 0 and JS2 − JS4 ≥ 0:  x = (JS2 − JS4) / (4[(JS1 − JS3) + (JS2 − JS4)]) ∈ [0, 1/4]
        if JS1 − JS3 ≥ 0 and JS2 − JS4 ≤ 0:  x = (4(JS1 − JS3) − 3(JS2 − JS4)) / (4[(JS1 − JS3) − (JS2 − JS4)]) ∈ [3/4, 1]
  • 14. Step 1: calculation of the divergence and direction matrices. Maximum JS (continued).
        if JS1 − JS3 ≤ 0 and JS2 − JS4 ≥ 0:  x = (2(JS1 − JS3) − (JS2 − JS4)) / (4[(JS1 − JS3) − (JS2 − JS4)]) ∈ [1/4, 1/2]
        if JS1 − JS3 ≤ 0 and JS2 − JS4 ≤ 0:  x = (2(JS1 − JS3) + 3(JS2 − JS4)) / (4[(JS1 − JS3) + (JS2 − JS4)]) ∈ [1/2, 3/4]
    Finally, δ = πx ∈ [0, π) is the estimated edge direction; this x maximises the JS. Each pixel is now labelled with a pair of values: the estimated edge direction and the estimated JS_max.
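    The fit and the piecewise cases above can be put together in a single routine, sketched below; the handling of the degenerate case JS1 = JS3 and JS2 = JS4 (no preferred direction) is an assumption, since the slides do not cover it:

        import math

        def edge_direction(js1, js2, js3, js4):
            # Coefficients of the fitted sinusoid (up to the common factor 1/2).
            n = js1 - js3   # cosine coefficient
            m = js2 - js4   # sine coefficient
            if n == 0 and m == 0:
                return 0.0, (js1 + js2 + js3 + js4) / 4.0  # flat: no preferred direction
            if n >= 0 and m >= 0:
                x = m / (4.0 * (n + m))
            elif n >= 0 and m <= 0:
                x = (4.0 * n - 3.0 * m) / (4.0 * (n - m))
            elif n <= 0 and m >= 0:
                x = (2.0 * n - m) / (4.0 * (n - m))
            else:
                x = (2.0 * n + 3.0 * m) / (4.0 * (n + m))
            js_max = ((js1 + js2 + js3 + js4) / 4.0
                      + 0.5 * m * math.sin(2.0 * math.pi * x)
                      + 0.5 * n * math.cos(2.0 * math.pi * x))
            return math.pi * x, js_max   # (estimated edge direction delta, JS_max)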
  • 15. Step 1: calculation of the divergence and direction matrices. Attenuation factor. Because the JS is too sensitive to any change in grey level between regions, it is necessary to include extra information in the form of an attenuation factor:
        JS*_{i,j} = JS_{i,j} (1 − α + α W_{i,j}),  where W_{i,j} = |N_{W1} − N_{W2}| / N_W
    and N_{W1}, N_{W2} are the average grey levels of the subwindows W1 and W2.
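    A sketch of the attenuation, assuming W1 and W2 are the grey-level subwindows as NumPy arrays; the normalising constant N_W is taken here to be the grey-level range (255 for 8-bit images), which is an assumption since the slide does not define it:

        import numpy as np

        def attenuated_js(js_value, w1, w2, alpha, grey_range=255.0):
            # JS* = JS * (1 - alpha + alpha * W), with W = |mean(W1) - mean(W2)| / N_W.
            w = abs(float(np.mean(w1)) - float(np.mean(w2))) / grey_range
            return js_value * (1.0 - alpha + alpha * w)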
  • 16. Step 2: edge-pixel selection. In this step the procedure selects which pixels of the divergence matrix are edge pixels. Thresholding the divergence matrix is not always useful, since the maximum JS values depend on the composition of the adjacent textures and will thus vary from texture to texture. Consequently, it seems more appropriate to use a local criterion. Accordingly, each edge-pixel candidate is taken as the centre of an odd-length monodimensional window placed perpendicular to the estimated edge direction at that pixel. Figure: monodimensional window, showing the estimated edge direction and the pixel under study.
  • 17. Step 2: edge-pixel selection. The centre pixel is marked as an edge pixel when
        JS_centre − JS_j ≥ T_d
    for every other pixel j in that particular monodimensional window, where T_d is a threshold. Pixels marked as edge pixels are then outstanding local maxima of the divergence matrix. Obviously, detection results depend directly on the parameter T_d, which can be modified by the user if necessary. This local edge-pixel detection method requires a simple pre-processing of the divergence matrix, which is smoothed by repeatedly applying a 3 × 3 mean filter.
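    The selection rule can be sketched as follows; for brevity the one-dimensional window is scanned horizontally here, while orienting it perpendicular to the estimated edge direction, as the slides describe, is omitted:

        import numpy as np
        from scipy.ndimage import uniform_filter

        def select_edge_pixels(js_matrix, t_d, window_len=5, smoothing_passes=2):
            # Pre-processing: smooth the divergence matrix with repeated 3x3 mean filters.
            js = np.asarray(js_matrix, dtype=float)
            for _ in range(smoothing_passes):
                js = uniform_filter(js, size=3)
            # Local criterion: mark a pixel when its JS exceeds every other value
            # in the one-dimensional window by at least t_d.
            half = window_len // 2
            edges = np.zeros(js.shape, dtype=bool)
            for i in range(js.shape[0]):
                for j in range(half, js.shape[1] - half):
                    neighbours = np.concatenate((js[i, j - half:j],
                                                 js[i, j + 1:j + half + 1]))
                    edges[i, j] = bool(np.all(js[i, j] - neighbours >= t_d))
            return edges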
  • 18. Step 3: edge linking. This step attempts to join the various sets of edge pixels using information from the divergence matrix associated with the image, together with knowledge of the direction in which the maximum JS is produced. In broad terms, the linking procedure consists of recovering edge pixels that were left unmarked because they did not satisfy the selection condition, but nearly did. Not all pixels in the image are candidates for filling the gaps, only those classified as neighbour candidates of end pixels. Figure: end points and neighbour candidates for edge prolongation; E, end point; C, neighbour candidates; the remaining grey pixels are edge pixels.
  • 19. Step 3: edge linking. Joining end points. End-pixel criterion: an end pixel is a marked pixel having only one or two marked pixels joined to it. A neighbour candidate must have a reasonably high JS, and the estimated edge direction of the end pixel, Dir_end, the edge direction of the neighbour candidate, Dir_candidate, and the direction of the physical line joining them, Dir(end, candidate), must not differ by more than a specified amount:
        JS_end − JS_candidate ≤ τ_d
        (Dir(end, candidate) − Dir_end)² + (Dir(end, candidate) − Dir_candidate)² ≤ τ_θ
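    A sketch of the linking test for one end pixel and one neighbour candidate; the comparison senses (≤ τ_d, ≤ τ_θ) are assumptions read off the criteria above, and the directions are angles in [0, π):

        def can_link(js_end, js_candidate, dir_end, dir_candidate, dir_joining,
                     tau_d, tau_theta):
            # The candidate's JS must be reasonably high, i.e. close to the end pixel's JS...
            if js_end - js_candidate > tau_d:
                return False
            # ...and the end-pixel direction, the candidate direction and the direction
            # of the line joining them must be mutually consistent.
            direction_cost = ((dir_joining - dir_end) ** 2
                              + (dir_joining - dir_candidate) ** 2)
            return direction_cost <= tau_theta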
  • 20. Results: theoretical. [Figure]
  • 21. Results: theoretical. [Figure]
  • 22. Outline: 1 Introduction, 2 The system, 3 Conclusions.
  • 23. Discussion. Although this work is still at a preliminary stage, we have seen that the Monfroy framework is suitable for modelling and prototyping the dynamic composition of Web Services in the import-of-goods constraint problem. Solving the backtracking problem in a fully distributed environment is still an open problem, and it must be resolved before use in a real environment.
  • 24. Questions? Rodrigo Rojas Moraleda, rodrigo.rojas@postgrado.usm.cl