Irrera gold2010



  1. INTEGRATION OF SUPPORT VECTOR MACHINES AND MARKOV RANDOM FIELDS FOR REMOTE SENSING IMAGE CLASSIFICATION Paolo Irrera, Gabriele Moser, Sebastiano B. Serpico, University of Genoa, Dept. of Biophysical and Electronic Eng. (DIBE), Via Opera Pia 11a, I-16145 Genoa, Italy
  2. OUTLINE <ul><li>Introduction </li></ul><ul><ul><li>Remote sensing image classification </li></ul></ul><ul><ul><li>Objective of the paper </li></ul></ul><ul><ul><li>Support vector machines </li></ul></ul><ul><ul><li>Markov random fields </li></ul></ul><ul><li>Methodology </li></ul><ul><ul><li>Proposed Markovian method </li></ul></ul><ul><ul><li>Architecture of the method </li></ul></ul><ul><ul><li>Parameter optimization </li></ul></ul><ul><li>Experimental results </li></ul><ul><ul><li>Confusion matrices </li></ul></ul><ul><ul><li>Classification maps </li></ul></ul><ul><li>Conclusions </li></ul>
  3. REMOTE SENSING IMAGE CLASSIFICATION <ul><li>Techniques that aim at labeling each image pixel as belonging to a thematic class. </li></ul><ul><li>Examples of applications: </li></ul><ul><ul><li>land-use or land-cover mapping; </li></ul></ul><ul><ul><li>urban-area mapping; </li></ul></ul><ul><ul><li>forest inventory; </li></ul></ul><ul><ul><li>snow-cover mapping. </li></ul></ul><ul><li>Many approaches have been proposed for supervised classification: </li></ul><ul><ul><li>parametric and nonparametric Bayesian; </li></ul></ul><ul><ul><li>neural; </li></ul></ul><ul><ul><li>fuzzy; </li></ul></ul><ul><ul><li>support vector machines (SVMs); </li></ul></ul><ul><ul><li>… </li></ul></ul>
  4. OBJECTIVE OF THE PAPER <ul><li>Key idea of SVMs: </li></ul><ul><ul><li>identifying an optimal linear discriminant hypersurface in a suitable nonlinearly transformed feature space. </li></ul></ul><ul><li>Good analytical properties (generalization capability) and excellent performance in many applications (e.g., object recognition, hyperspectral image classification). </li></ul><ul><li>Limitation: </li></ul><ul><ul><li>SVMs focus on i.i.d. (independent and identically distributed) samples; </li></ul></ul><ul><ul><li>in image classification, this implies an intrinsically noncontextual approach. </li></ul></ul><ul><li>Objective of the paper: </li></ul><ul><ul><li>integration of the SVM and Markov random field (MRF) approaches to classification, aiming at a rigorous contextual generalization of SVMs. </li></ul></ul>
  5. SVM CLASSIFIER <ul><li>It exploits the information associated with the samples located at the interface between distinct classes (support vectors). </li></ul><ul><li>Training is expressed as a quadratic programming problem. </li></ul><ul><li>The nonlinear transformation of the feature space is implicitly defined by a kernel function K(x,y), which allows a nonlinear problem to be formalized as a linear one without a relevant increase in computational complexity. </li></ul><ul><li>Here, we use a Gaussian kernel. </li></ul>(Equations shown on slide: the quadratic programming problem; the discriminant function, nonlinear case, two classes.)
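The two-class kernel SVM decision rule mentioned above can be sketched in a few lines. This is a minimal illustration, not the authors' code: the function names and the toy support vectors are assumptions for the example, and only the evaluation of an already-trained discriminant is shown (training would require solving the quadratic program).

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Gaussian (RBF) kernel: K(x, y) = exp(-||x - y||^2 / (2 * sigma^2))
    return float(np.exp(-np.sum((np.asarray(x) - np.asarray(y)) ** 2)
                        / (2.0 * sigma ** 2)))

def svm_discriminant(x, support_vectors, alphas, labels, bias, sigma=1.0):
    # Standard kernel expansion f(x) = sum_i alpha_i * y_i * K(x_i, x) + b;
    # the sign of f(x) gives the predicted class.
    return sum(a * y * gaussian_kernel(sv, x, sigma)
               for a, y, sv in zip(alphas, labels, support_vectors)) + bias

# Toy example: one positive and one negative support vector (assumed values).
f = svm_discriminant([0.0], [[0.0], [2.0]], [1.0, 1.0], [1, -1], 0.0)
```

Only the support vectors (samples with nonzero alpha) contribute to the sum, which is what keeps the evaluation cheap even in a high-dimensional implicit feature space.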
  6. MARKOV RANDOM FIELDS <ul><li>MRFs constitute a general family of stochastic models for the contextual information associated with an image in Bayesian image-analysis problems. </li></ul><ul><li>They allow global stochastic models to be formalized according to the local statistical relationships among neighboring pixels (Hammersley-Clifford theorem). </li></ul><ul><li>When the random field of the thematic class labels is modeled as an MRF, the “maximum a posteriori” criterion can be formalized as the minimization of a suitable energy function (shown on slide). </li></ul>
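As an illustration of such an energy function, the sketch below evaluates a Potts-type MRF energy: a pixelwise data term plus a pairwise term that penalizes disagreeing 4-neighbor labels. This is a generic textbook formulation, assumed here for illustration; the paper's actual energy may differ.

```python
import numpy as np

def mrf_energy(labels, unary, beta):
    # Potts-type MRF energy: sum of unary data costs plus beta times
    # the number of 4-neighbor pairs with different labels
    # (local clique interactions, per the Hammersley-Clifford theorem).
    h, w = labels.shape
    # data term: unary[i, j, k] is the cost of assigning label k at (i, j)
    e = sum(unary[i, j, labels[i, j]] for i in range(h) for j in range(w))
    # pairwise term over horizontal and vertical cliques
    e += beta * np.sum(labels[:, 1:] != labels[:, :-1])
    e += beta * np.sum(labels[1:, :] != labels[:-1, :])
    return float(e)
```

Minimizing this energy over all label maps corresponds to the maximum a posteriori rule; in practice, approximate minimizers such as iterated conditional modes are used.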
  7. INTEGRATING MRF AND SVM <ul><li>Here, we prove that, under proper assumptions, the Markovian minimum-energy decision rule can be reformulated as the application of an SVM discriminant function in a transformed feature space, associated with a suitable “contextual kernel”. </li></ul><ul><li>Contextual information is formalized through an additional feature (“stacked vector”). </li></ul><ul><li>A modified kernel function fuses contextual and noncontextual information (the linear combination of two related contributions). </li></ul><ul><li>In this framework, a novel classifier is introduced by using the “iterated conditional modes” approach. </li></ul>(Equations shown on slide: the contextual kernel; the kernel-based expression of the discriminant function.)
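One plausible reading of the linear combination described above is sketched below: a kernel on the spectral (per-pixel) feature vector plus a weighted kernel on the stacked contextual feature. The function name, the use of Gaussian kernels for both terms, and the convex-combination form are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def contextual_kernel(x1, c1, x2, c2, lam, sigma=1.0):
    # Fuses noncontextual and contextual information:
    # (1 - lam) * K_spectral(x1, x2) + lam * K_spatial(c1, c2),
    # where c1, c2 are the stacked contextual feature vectors.
    k_spec = np.exp(-np.sum((np.asarray(x1) - np.asarray(x2)) ** 2)
                    / (2.0 * sigma ** 2))
    k_ctx = np.exp(-np.sum((np.asarray(c1) - np.asarray(c2)) ** 2)
                   / (2.0 * sigma ** 2))
    return float((1.0 - lam) * k_spec + lam * k_ctx)
```

Since a sum of valid kernels with nonnegative weights is itself a valid kernel, the combined function still defines an implicit feature space in which the SVM machinery applies unchanged.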
  8. PROPOSED CLASSIFIER I = image (n channels) to be classified. T = training map. m = classification map, updated at each iteration.
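The iterative map-update step can be illustrated with a plain iterated-conditional-modes (ICM) loop: each pixel is repeatedly reassigned the label that minimizes its local energy given its neighbors, until the map stops changing. This is a generic ICM sketch under a Potts-type pairwise term, not the authors' implementation (which updates the map through the contextual SVM).

```python
import numpy as np

def icm_relabel(labels, unary, beta, n_iter=5):
    # Iterated conditional modes: greedily set each pixel to the label
    # minimizing its local cost (unary data cost plus beta times the
    # number of disagreeing 4-neighbors), sweeping until convergence.
    labels = labels.copy()
    h, w, k = unary.shape
    for _ in range(n_iter):
        changed = False
        for i in range(h):
            for j in range(w):
                costs = unary[i, j].astype(float)
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w:
                        costs += beta * (np.arange(k) != labels[ni, nj])
                best = int(np.argmin(costs))
                if best != labels[i, j]:
                    labels[i, j] = best
                    changed = True
        if not changed:
            break
    return labels
```

Each sweep can only decrease the global energy, so the procedure converges to a local minimum; the quality of that minimum depends on the initial (noncontextual) classification.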
  9. PARAMETER OPTIMIZATION <ul><li>The method has the following parameters: </li></ul><ul><ul><li>SVM regularization parameter C; </li></ul></ul><ul><ul><li>variance of the Gaussian kernel; </li></ul></ul><ul><ul><li>weight parameter λ of the spatial kernel contribution. </li></ul></ul><ul><li>Algorithms used for parameter estimation: Powell, Ho-Kashyap. </li></ul><ul><li>Powell’s algorithm is a local unconstrained minimization method for multidimensional spaces. It does not involve derivatives and is applied here to the cross-validation error (a nondifferentiable function) to optimize C and the variance of the Gaussian kernel. </li></ul><ul><li>For the estimation of λ, a recently proposed approach, based on the Ho-Kashyap algorithm for the optimization of weight parameters in MRF models, has been used. </li></ul>
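The derivative-free search over (C, kernel variance) can be illustrated with a simplified scheme in the spirit of Powell's method: line searches along coordinate directions with a shrinking step, which handles nondifferentiable objectives such as the cross-validation error. Both the search routine and the toy objective below are assumptions for illustration; the paper uses the full Powell algorithm on the actual cross-validation error.

```python
def coordinate_search(f, x0, step=1.0, tol=1e-6, max_iter=200):
    # Simplified derivative-free minimization: try moves of +/- step
    # along each coordinate, keep improvements, halve the step when
    # no move helps, and stop when the step is below tol.
    x = list(x0)
    fx = f(x)
    for _ in range(max_iter):
        if step <= tol:
            break
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                ft = f(trial)
                if ft < fx:
                    x, fx = trial, ft
                    improved = True
        if not improved:
            step *= 0.5
    return x, fx

# Toy stand-in for the cross-validation error as a function of
# (log C, log sigma^2): nondifferentiable, with its minimum at (2, -1).
cv_error = lambda p: abs(p[0] - 2.0) + abs(p[1] + 1.0)
best, err = coordinate_search(cv_error, [0.0, 0.0])
```

In practice each objective evaluation is expensive (it requires retraining the SVM and computing the cross-validation error), which is why a method with few function evaluations and no derivatives is attractive here.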
  10. DATA SETS FOR EXPERIMENTS <ul><li>Data set “Pavia” </li></ul><ul><ul><li>SIR-C/XSAR </li></ul></ul><ul><ul><li>Rural area (near Pavia) </li></ul></ul><ul><ul><li>700 x 280 pixels </li></ul></ul><ul><ul><li>4 channels (XSAR channel is shown in the figure) </li></ul></ul><ul><ul><li>Medium resolution (25 m) </li></ul></ul><ul><ul><li>Main classes: “dry soil” and “wet soil”. </li></ul></ul><ul><li>Data set “Tanaro” </li></ul><ul><ul><li>COSMO/SkyMed </li></ul></ul><ul><ul><li>Flood of the Tanaro River near Alessandria </li></ul></ul><ul><ul><li>3155 x 1695 pixels </li></ul></ul><ul><ul><li>single-channel </li></ul></ul><ul><ul><li>Very high resolution (1 m) </li></ul></ul><ul><ul><li>Main classes: “dry soil” and “water or flooded soil”. </li></ul></ul><ul><li>Spatially disjoint training and test fields are available for both data sets. </li></ul>
  11. EXPERIMENTAL RESULTS CONFUSION MATRICES AND ACCURACIES Pavia. Confusion matrix, noncontextual SVM. Pavia. Confusion matrix, proposed method. Tanaro. Confusion matrix, noncontextual SVM. Tanaro. Confusion matrix, proposed method.
  12. EXPERIMENTAL RESULTS CLASSIFICATION MAPS Pavia: map generated by a noncontextual SVM. Pavia: map generated by the proposed method. Tanaro: map generated by a noncontextual SVM. Tanaro: map generated by the proposed method.
  13. EXPERIMENTAL RESULTS CONVERGENCE OF THE METHOD Tanaro: behavior of the accuracy (overall accuracy – OA, average accuracy – AA, and cross-validation accuracy – XVAL) as a function of the number of iterations of the proposed method.
  14. CONCLUSIONS <ul><li>A feasible Markovian extension of SVMs to contextual classification has been introduced. </li></ul><ul><li>Experiments with real data suggest that the proposed method allows a significant accuracy increase to be obtained, as compared to a standard (noncontextual) SVM. </li></ul><ul><li>Very accurate results were obtained on different types of remote-sensing data, including very high resolution COSMO/SkyMed SAR data. </li></ul><ul><li>Possible future extensions: </li></ul><ul><ul><li>theoretical analysis of convergence properties (even though no experimental evidence of critical convergence issues was observed); </li></ul></ul><ul><ul><li>testing the method with other types of remote-sensing data (in particular, optical and hyperspectral images) and with more sophisticated MRF models. </li></ul></ul>
  15. REFERENCES <ul><ul><li>[1] J. Besag. Spatial interaction and the statistical analysis of lattice systems. Journal of the Royal Statistical Society, Series B, 36(2):192–236, 1974. </li></ul></ul><ul><ul><li>[2] R. Brent. Algorithms for minimization without derivatives, chapter 5. Prentice-Hall, Englewood Cliffs, NJ, 1973. </li></ul></ul><ul><ul><li>[3] C. J. C. Burges. A tutorial on support vector machines for pattern recognition. Data Mining and Knowledge Discovery, 2(2):121–167, 1998. </li></ul></ul><ul><ul><li>[4] N. Cristianini and J. Shawe-Taylor. An introduction to support vector machines and other kernel-based learning methods. Cambridge University Press, 2000. </li></ul></ul><ul><ul><li>[5] M. Datcu, K. Seidel, and M. Walessa. Spatial information retrieval from remote sensing images: information theoretical perspective. IEEE Trans. Geosci. Remote Sensing, 36(5):1431–1445, 1998. </li></ul></ul><ul><ul><li>[6] R. Dubes and A. Jain. Random field models in image analysis. J. Appl. Stat., 16(2):131–163, 1989. </li></ul></ul><ul><ul><li>[7] R. O. Duda, P. E. Hart, and D. G. Stork. Pattern classification. Wiley Interscience, 2001. </li></ul></ul><ul><ul><li>[8] S. Geman and D. Geman. Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images. IEEE Trans. Pattern Anal. Mach. Intell., 6(6):721–741, 1984. </li></ul></ul><ul><ul><li>[9] D. A. Landgrebe. Signal theory methods in multispectral remote sensing. Wiley-InterScience, 2003. </li></ul></ul><ul><ul><li>[10] F. Melgani and S. B. Serpico. A Markov random field approach to spatio-temporal contextual image classification. IEEE Trans. Geosci. Remote Sensing, 41(11):2478–2487, 2003. </li></ul></ul><ul><ul><li>[11] G. Moser. Analisi di immagini telerilevate per osservazione della Terra, pages 7–48 and 140–197. ECIG, 2006. </li></ul></ul><ul><ul><li>[12] C. Oliver and S. Quegan. Understanding synthetic aperture radar images. SciTech Publishing, 2004. </li></ul></ul><ul><ul><li>[13] W. K. Pratt. Digital image processing. Wiley Interscience, 2007. </li></ul></ul><ul><ul><li>[14] W. Press, S. Teukolsky, W. Vetterling, and B. Flannery. Numerical recipes in C, pages 394–455. Cambridge University Press, New York, NY, U.S.A., 1992. </li></ul></ul><ul><ul><li>[15] J. Richards and X. Jia. Remote sensing digital image analysis. Springer, 2005. </li></ul></ul><ul><ul><li>[16] S. B. Serpico and G. Moser. Weight parameter optimization by the Ho-Kashyap algorithm in MRF models for supervised image classification. IEEE Trans. Geosci. Remote Sensing, 44(12):3695–3705, 2006. </li></ul></ul><ul><ul><li>[17] A. H. S. Solberg. Flexible nonlinear contextual classification. Pattern Recognit. Lett., 25(13):1501–1508, 2004. </li></ul></ul><ul><ul><li>[18] V. N. Vapnik. Statistical learning theory. Wiley Interscience, 1998. </li></ul></ul>