Kernel Entropy Component Analysis
      in Remote Sensing Data Clustering

Luis Gómez-Chova1          Robert Jenssen2    Gustavo Camps-Valls1

 1 Image Processing Laboratory (IPL), Universitat de ValÚncia, Spain.
     luis.gomez-chova@uv.es , http://www.valencia.edu/chovago
 2 Department of Physics and Technology, University of TromsÞ, Norway.
       robert.jenssen@uit.no , http://www.phys.uit.no/~robertj


                   IGARSS 2011 – Vancouver, Canada


Outline




        1   Introduction


        2   Entropy Component Analysis


        3   Kernel Entropy Component Analysis (KECA)


        4   KECA Spectral Clustering


        5   Experimental Results


        6   Conclusions and Open questions




L. Gómez-Chova et al.       Kernel Entropy Component Analysis   IGARSS 2011 – Vancouver    1/26

Motivation

        Feature Extraction
             Feature selection/extraction is essential before classification or regression
                  to discard redundant or noisy components
                  to reduce the dimensionality of the data
             Create a subset of new features by combining the existing ones

        Linear Feature Extraction
             Offers interpretability ~ knowledge discovery
                  PCA: projections maximizing the data set variance
                  PLS: projections maximally aligned with the labels
                  ICA: non-orthogonal projections with maximally independent axes
             Fails when data distributions are curved




            Nonlinear feature relations





Objectives




        Objectives
             Kernel-based nonlinear data transformation
                     Captures higher-order statistics of the data
                     Extracts features suited for clustering


        Method
             Kernel Entropy Component Analysis (KECA)               [Jenssen, 2010]

             Based on Information Theory:
                     Maximally preserves the entropy of the input data
                     Angular clustering maximizes cluster divergence
             Out-of-sample extension to deal with test data

        Experiments
             Cloud screening from ENVISAT/MERIS multispectral images




Information-Theoretic Learning




        Entropy Concept
            Entropy of a probability density function (pdf) is a measure of information




             Entropy ⇔ Shape
                 of the pdf





Information-Theoretic Learning



        Divergence Concept
            The entropy concept can be extended to obtain a measure of dissimilarity
            between distributions





            Divergence ⇔ Distance
                 between pdfs





Entropy Component Analysis


        Shannon entropy

                              H(p) = − ∫ p(x) log p(x) dx

            How to handle densities?                    How to compute integrals?

        Rényi's entropies

                              H(p) = (1 / (1 − α)) log ∫ p^α(x) dx

             Rényi's entropies contain the Shannon entropy as the special case α → 1
             We focus on Rényi's quadratic entropy, α = 2

        Rényi's quadratic entropy

                              H(p) = − log ∫ pÂČ(x) dx = − log V(p)


             It can be estimated directly from samples!


Entropy Component Analysis


        Rényi's quadratic entropy estimator

             Estimated from data D = {x₁, . . . , x_N} ∈ ℝ^d generated by the pdf p(x)
             Parzen window estimator with a Gaussian or Radial Basis Function (RBF) kernel:

                 p̂(x) = (1/N) ÎŁ_{x_t ∈ D} K(x, x_t | σ),   with   K(x, x_t) = exp( −‖x − x_t‖ÂČ / 2σÂČ )

             Idea: place a kernel over the samples and sum with proper normalization
             The estimator for the information potential V(p) = ∫ pÂČ(x) dx:

                 V̂(p) = ∫ p̂ÂČ(x) dx = ∫ [ (1/N) ÎŁ_{x_t ∈ D} K(x, x_t | σ) ] [ (1/N) ÎŁ_{x_tâ€Č ∈ D} K(x, x_tâ€Č | σ) ] dx

                       = (1/NÂČ) ÎŁ_{x_t ∈ D} ÎŁ_{x_tâ€Č ∈ D} ∫ K(x, x_t | σ) K(x, x_tâ€Č | σ) dx

                       = (1/NÂČ) ÎŁ_{x_t ∈ D} ÎŁ_{x_tâ€Č ∈ D} K(x_t, x_tâ€Č | √2 σ) = (1/NÂČ) 1áµ€ K 1
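The derivation above reduces entropy estimation to a single kernel-matrix sum. A minimal NumPy sketch (the function name and toy data are illustrative, not from the slides):

```python
import numpy as np

def renyi_quadratic_entropy(X, sigma):
    """H(p) estimate: -log V(p), with V(p) = (1/N^2) 1'K1 and K the
    RBF kernel of width sqrt(2)*sigma, as in the Parzen derivation."""
    N = X.shape[0]
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / (4 * sigma ** 2))   # 2 * (sqrt(2) sigma)^2 = 4 sigma^2
    V = K.sum() / N ** 2                       # information potential estimate
    return -np.log(V)

# A spread-out sample has higher entropy than a concentrated one:
rng = np.random.default_rng(0)
H_tight = renyi_quadratic_entropy(rng.normal(0.0, 0.1, (200, 2)), sigma=0.5)
H_wide = renyi_quadratic_entropy(rng.normal(0.0, 2.0, (200, 2)), sigma=0.5)
```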




Entropy Component Analysis

        Rényi's quadratic entropy estimator
             The empirical Rényi entropy estimate resides in the corresponding kernel matrix

                 V̂(p) = (1/NÂČ) 1áµ€ K 1

             It can be expressed in terms of the eigenvalues and eigenvectors of K

                 K = E D Eáµ€,   with  D the diagonal matrix of eigenvalues λ₁, . . . , λ_N
                               and   E the matrix of eigenvectors e₁, . . . , e_N

             Therefore we have

                 V̂(p) = (1/NÂČ) ÎŁ_{i=1}^{N} ( √λ_i e_iáµ€ 1 )ÂČ

             where each term √λ_i e_iáµ€ 1 contributes to the entropy estimate

        ECA dimensionality reduction
             Idea: find the smallest set of features that maximally preserves the
             entropy of the input data (the largest contributions √λ_i e_iáµ€ 1)
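The decomposition above can be checked numerically. A sketch (function name assumed) that ranks eigenpairs by their entropy contribution rather than by eigenvalue:

```python
import numpy as np

def entropy_contributions(K):
    """Split V(p) = (1/N^2) 1'K1 into per-eigenvector terms
    (sqrt(lambda_i) e_i'1)^2 / N^2 and return them with an ECA ranking."""
    N = K.shape[0]
    lam, E = np.linalg.eigh(K)                    # K = E diag(lam) E'
    lam = np.clip(lam, 0.0, None)                 # guard numerical negatives
    terms = (np.sqrt(lam) * (E.T @ np.ones(N))) ** 2 / N ** 2
    order = np.argsort(terms)[::-1]               # entropy order, not eigenvalue order
    return terms, order
```

Summing the per-eigenvector terms recovers exactly the kernel-matrix estimate 1áµ€K1/NÂČ, which is the identity the slide states.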

Entropy Component Analysis



            [Figure: example pdfs of increasing entropy, H(p) = 4.36, 4.74, and 5.05 (top);
             H(p) = 4.71, and H(p) = 4.81 with Parzen estimate Ĥ(p) = 4.44 (bottom)]





Kernel Principal Component Analysis (KPCA)

        Principal Component Analysis (PCA)

            Find projections of X = [x₁, . . . , x_N] maximizing the variance of the projected data XU

                  PCA:         maximize:      Trace{(XU)áµ€(XU)} = Trace{Uáµ€ C_xx U}
                               subject to:    Uáµ€U = I

            Including Lagrange multipliers λ, this is equivalent to the eigenproblem

                                       C_xx u_i = λ_i u_i   →   C_xx U = U D

            The u_i are the eigenvectors of C_xx and they are orthonormal, u_iáµ€ u_j = ÎŽ_ij
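As a quick illustration, the eigenproblem above can be solved directly on the sample covariance (a minimal sketch; the function name is assumed):

```python
import numpy as np

def pca(X, m):
    """Solve Cxx u_i = lambda_i u_i and return the top-m projections XU."""
    Xc = X - X.mean(axis=0)                     # center the data
    Cxx = Xc.T @ Xc / len(Xc)                   # sample covariance C_xx
    lam, U = np.linalg.eigh(Cxx)                # ascending eigenvalues
    Um = U[:, np.argsort(lam)[::-1][:m]]        # top-m variance directions
    return Xc @ Um
```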








Kernel Principal Component Analysis (KPCA)


        Kernel Principal Component Analysis (KPCA)

            Find projections maximizing the variance of the mapped data ÎŠ = [φ(x₁), . . . , φ(x_N)]

                   KPCA:         maximize:       Tr{(ΊU)áµ€(ΊU)} = Tr{Uáµ€ Ίáµ€ ÎŠ U}
                                 subject to:     Uáµ€U = I

            The covariance matrix Ίáµ€ÎŠ and the projection matrix U are d_H × d_H !!!

        KPCA through the kernel trick

            Apply the representer theorem: U = Ίáµ€ A, where A = [α₁, . . . , α_N]

                   KPCA:        maximize:       Tr{Aáµ€ ΊΊáµ€ ΊΊáµ€ A} = Tr{Aáµ€ K K A}
                                subject to:     Uáµ€U = Aáµ€ ΊΊáµ€ A = Aáµ€ K A = I

            Including Lagrange multipliers λ, this is equivalent to the eigenproblem

                                    K K α_i = λ_i K α_i   →   K α_i = λ_i α_i

            Now the matrix A is N × N !!! (obtained from the eigendecomposition K = E D Eáµ€)
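Thanks to the kernel trick, KPCA needs only the eigendecomposition of the N × N kernel matrix. A minimal sketch (the function name is assumed):

```python
import numpy as np

def kpca(K, m):
    """Return the N x m KPCA scores: column i is sqrt(lambda_i) * e_i,
    i.e. the data projected onto the i-th kernel principal axis."""
    lam, E = np.linalg.eigh(K)                  # eigendecomposition K = E D E'
    idx = np.argsort(lam)[::-1][:m]             # KPCA keeps the top eigenvalues
    lam_m, E_m = np.clip(lam[idx], 0.0, None), E[:, idx]
    return E_m * np.sqrt(lam_m)                 # = E_m D_m^{1/2}
```

Each score column has squared norm λ_i, i.e. the variance captured along that kernel principal axis.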


Kernel ECA Transformation



        Kernel Entropy Component Analysis (KECA)
            KECA: projection of ÎŠ onto those m feature-space principal axes
            contributing most to the Rényi entropy estimate of the input data

                                         ÎŠ_eca = ÎŠ U_m = E_m D_m^{1/2}

                 The projection onto a single principal axis u_i in H is given by Ί u_i = √λ_i e_i

                 The entropy associated with ÎŠ_eca is  V̂_m = (1/NÂČ) 1áµ€ K_eca 1 = (1/NÂČ) ÎŁ_{i=1}^{m} ( √λ_i e_iáµ€ 1 )ÂČ


            Note that ÎŠ_eca is not necessarily based on the top eigenvalues λ_i, since
            e_iáµ€ 1 also contributes to the entropy estimate

        Out-of-sample extension
            Projections for a collection of test data points:

                        ÎŠ_eca,test = ÎŠ_test U_m = ÎŠ_test Ίáµ€ E_m D_m^{-1/2} = K_test E_m D_m^{-1/2}
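Both the entropy-based axis selection and the out-of-sample rule fit in a few lines. A sketch under the formulas above (the function name is assumed):

```python
import numpy as np

def keca(K, K_test, m):
    """KECA: keep the m eigenpairs with the largest entropy terms
    (sqrt(lambda_i) e_i'1)^2. Train scores are E_m D_m^{1/2};
    out-of-sample scores are K_test E_m D_m^{-1/2}."""
    lam, E = np.linalg.eigh(K)
    lam = np.clip(lam, 0.0, None)
    contrib = lam * (E.T @ np.ones(K.shape[0])) ** 2   # entropy contributions
    idx = np.argsort(contrib)[::-1][:m]                # not the top-lambda rule
    lam_m, E_m = lam[idx], E[:, idx]
    return E_m * np.sqrt(lam_m), K_test @ E_m / np.sqrt(lam_m)
```

Feeding the training kernel back in as K_test reproduces the training scores, a quick sanity check of the out-of-sample formula.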



Kernel ECA Transformation




        KECA example

              Original             PCA                 KPCA           KECA




            KECA reveals cluster structure → underlying labels of the data
            Nonlinearly related clusters in X → different angular directions in H
            An angular clustering based on the kernel features Ίeca seems reasonable





KECA Spectral Clustering

        Cauchy-Schwarz divergence
            The Cauchy-Schwarz divergence between the pdfs of two clusters is

                DCS(p_i, p_j) = − log( VCS(p_i, p_j) ) = − log [ ∫ p_i(x) p_j(x) dx / √( ∫ p_iÂČ(x) dx ∫ p_jÂČ(x) dx ) ]

            Measuring dissimilarity in a probability space is a complex issue
            Entropy interpretation in the kernel space → mean vector ” = (1/N) ÎŁ_t φ(x_t):

                V̂(p) = ∫ p̂ÂČ(x) dx = (1/NÂČ) 1áµ€ K 1 = (1/NÂČ) 1áµ€ Ί Ίáµ€ 1 = ”áµ€ ” = ‖”‖ÂČ

            Divergence via Parzen windowing ⇒ V̂CS(p_i, p_j) = ”_iáµ€ ”_j / (‖”_i‖ ‖”_j‖) = cos ∠(”_i, ”_j)

        KECA Spectral Clustering
            Angular clustering of ÎŠ_eca maximizes the CS divergence between clusters:

                J(C₁, . . . , C_k) = ÎŁ_{i=1}^{k} ÎŁ_{x_t ∈ C_i} cos ∠(φ_eca(x_t), ”_i)
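Since V_CS reduces to a cosine between kernel-space mean vectors, it can be estimated from kernel blocks alone. A sketch (the function and the toy RBF construction are illustrative):

```python
import numpy as np

def cs_divergence(K_i, K_j, K_ij):
    """D_CS estimate from kernel blocks: K_i, K_j within clusters i and j,
    K_ij across them. Uses mu_i'mu_j = mean(K_ij), |mu_i|^2 = mean(K_i)."""
    cos_angle = K_ij.mean() / np.sqrt(K_i.mean() * K_j.mean())
    return -np.log(cos_angle)

# Two well-separated point clouds under an RBF kernel:
def rbf(A, B):
    return np.exp(-np.sum((A[:, None] - B[None]) ** 2, -1) / 2)

A = np.zeros((5, 2))
B = np.full((5, 2), 10.0)
```

A cluster compared with itself gives cos ∠ = 1, hence zero divergence; well-separated clusters give a small cross-kernel block and a large divergence.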

KECA Spectral Clustering


        KECA Spectral Clustering Algorithm




         1   Obtain ÎŠ_eca by Kernel ECA
         2   Initialize the means ”_i , i = 1, . . . , k
         3   Assign each training sample to a cluster:
             x_t → C_i maximizing cos ∠(φ_eca(x_t), ”_i)
         4   Update the mean vectors ”_i
         5   Repeat steps 3 and 4 until convergence


        Intuition
        A kernel feature space data point φ_eca(x_t) is assigned to the cluster represented
        by the closest mean vector ”_i in terms of angular distance
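Steps 2 to 5 amount to a k-means loop with cosine similarity in place of Euclidean distance. A minimal sketch (initializing from the first k rows is an arbitrary choice for illustration):

```python
import numpy as np

def angular_kmeans(Z, k, n_iter=100):
    """Cluster rows of Z (e.g. KECA features) by angular distance to the
    cluster mean vectors, as in steps 3-4 of the algorithm."""
    mu = Z[:k].copy()                                   # step 2: init means
    for _ in range(n_iter):
        Zn = Z / (np.linalg.norm(Z, axis=1, keepdims=True) + 1e-12)
        Mn = mu / (np.linalg.norm(mu, axis=1, keepdims=True) + 1e-12)
        labels = np.argmax(Zn @ Mn.T, axis=1)           # step 3: max cosine
        new_mu = np.array([Z[labels == i].mean(axis=0) if np.any(labels == i)
                           else mu[i] for i in range(k)])
        if np.allclose(new_mu, mu):                     # step 5: convergence
            break
        mu = new_mu                                     # step 4: update means
    return labels
```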


Experimental results: Data material



        Cloud masking from ENVISAT/MERIS multispectral images
            Pixel-wise binary decisions about the presence/absence of clouds
            MERIS images taken over Spain and France
            Input samples with 13 spectral bands and 6 physically inspired features




           Barrax (BR-2003-07-14)      Barrax (BR-2004-07-14)     France (FR-2005-03-19)




Experimental results: Numerical comparison

        Experimental setup
                                        KECA compared with k-means, KPCA + k-means, and Kernel k-means
                                        Number of clusters fixed to k = 2 (cloud-free and cloudy areas)
                                        Number of KPCA and KECA features fixed to m = 2 (to stress differences)
                                        RBF kernel width parameter selected by grid search for all methods

        Numerical results
                                        Validation on 10,000 manually labeled pixels per image
                                        Kappa (Îș) statistic over 10 realizations for all images
                                        [Figure: estimated Îș statistic vs. number of training samples (200 to 1000)
                                         for BR-2003-07-14, BR-2004-07-14, and FR-2005-03-19;
                                         curves for KECA, KPCA, Kernel k-means, and k-means]




Experimental results: Numerical comparison



        Average numerical results
                              [Figure: average estimated Îș statistic vs. number of training samples (200 to 1000);
                               curves for KECA, KPCA, Kernel k-means, and k-means]


             KECA outperforms k-means (+25%) as well as Kernel k-means and KPCA (+15%)
             In general, the number of training samples positively affects the results



Experimental results: ClassiïŹcation maps

               Test Site          |        k-means         |     Kernel k-means     |          KPCA          |          KECA
        Spain (BR-2003-07-14)     | OA=96.25% ; Îș=0.6112   | OA=96.22% ; Îș=0.7540   | OA=47.52% ; Îș=0.0966   | OA=99.41% ; Îș=0.9541
        Spain (BR-2004-07-14)     | OA=96.91% ; Îș=0.6018   | OA=62.03% ; Îș=0.0767   | OA=96.66% ; Îș=0.6493   | OA=97.54% ; Îș=0.7319
        France (FR-2005-03-19)    | OA=92.87% ; Îș=0.6142   | OA=92.64% ; Îș=0.6231   | OA=80.93% ; Îș=0.4051   | OA=92.91% ; Îș=0.6302





Conclusions and open questions




        Conclusions
            Kernel entropy component analysis for clustering remote sensing data
                 Nonlinear features preserving the entropy of the input data
                 Angular clustering reveals structure in terms of cluster divergence
            Out-of-sample extension for test data → mandatory in remote sensing
            Good results on cloud screening from MERIS images
            KECA code is available at http://www.phys.uit.no/~robertj/
            Simple feature extraction toolbox (SIMFEAT) soon at http://isp.uv.es

        Open questions and Future work
            Pre-images of the transformed data in the input space
            Learning the kernel parameters automatically
            Testing KECA in more remote sensing applications




Intro probability 4
 

Viewers also liked

fauvel_igarss.pdf
fauvel_igarss.pdffauvel_igarss.pdf
fauvel_igarss.pdfgrssieee
 
Nonlinear component analysis as a kernel eigenvalue problem
Nonlinear component analysis as a kernel eigenvalue problemNonlinear component analysis as a kernel eigenvalue problem
Nonlinear component analysis as a kernel eigenvalue problemMichele Filannino
 
Principal component analysis and matrix factorizations for learning (part 2) ...
Principal component analysis and matrix factorizations for learning (part 2) ...Principal component analysis and matrix factorizations for learning (part 2) ...
Principal component analysis and matrix factorizations for learning (part 2) ...zukun
 
Different kind of distance and Statistical Distance
Different kind of distance and Statistical DistanceDifferent kind of distance and Statistical Distance
Different kind of distance and Statistical DistanceKhulna University
 
KPCA_Survey_Report
KPCA_Survey_ReportKPCA_Survey_Report
KPCA_Survey_ReportRandy Salm
 
Principal Component Analysis For Novelty Detection
Principal Component Analysis For Novelty DetectionPrincipal Component Analysis For Novelty Detection
Principal Component Analysis For Novelty DetectionJordan McBain
 
Analyzing Kernel Security and Approaches for Improving it
Analyzing Kernel Security and Approaches for Improving itAnalyzing Kernel Security and Approaches for Improving it
Analyzing Kernel Security and Approaches for Improving itMilan Rajpara
 
Adaptive anomaly detection with kernel eigenspace splitting and merging
Adaptive anomaly detection with kernel eigenspace splitting and mergingAdaptive anomaly detection with kernel eigenspace splitting and merging
Adaptive anomaly detection with kernel eigenspace splitting and mergingieeepondy
 
Modeling and forecasting age-specific mortality: Lee-Carter method vs. Functi...
Modeling and forecasting age-specific mortality: Lee-Carter method vs. Functi...Modeling and forecasting age-specific mortality: Lee-Carter method vs. Functi...
Modeling and forecasting age-specific mortality: Lee-Carter method vs. Functi...hanshang
 
Explicit Signal to Noise Ratio in Reproducing Kernel Hilbert Spaces.pdf
Explicit Signal to Noise Ratio in Reproducing Kernel Hilbert Spaces.pdfExplicit Signal to Noise Ratio in Reproducing Kernel Hilbert Spaces.pdf
Explicit Signal to Noise Ratio in Reproducing Kernel Hilbert Spaces.pdfgrssieee
 
A Comparative Study between ICA (Independent Component Analysis) and PCA (Pri...
A Comparative Study between ICA (Independent Component Analysis) and PCA (Pri...A Comparative Study between ICA (Independent Component Analysis) and PCA (Pri...
A Comparative Study between ICA (Independent Component Analysis) and PCA (Pri...Sahidul Islam
 
Regularized Principal Component Analysis for Spatial Data
Regularized Principal Component Analysis for Spatial DataRegularized Principal Component Analysis for Spatial Data
Regularized Principal Component Analysis for Spatial DataWen-Ting Wang
 
Pca and kpca of ecg signal
Pca and kpca of ecg signalPca and kpca of ecg signal
Pca and kpca of ecg signales712
 
DataEngConf: Feature Extraction: Modern Questions and Challenges at Google
DataEngConf: Feature Extraction: Modern Questions and Challenges at GoogleDataEngConf: Feature Extraction: Modern Questions and Challenges at Google
DataEngConf: Feature Extraction: Modern Questions and Challenges at GoogleHakka Labs
 
Probabilistic PCA, EM, and more
Probabilistic PCA, EM, and moreProbabilistic PCA, EM, and more
Probabilistic PCA, EM, and morehsharmasshare
 
Principal component analysis and matrix factorizations for learning (part 1) ...
Principal component analysis and matrix factorizations for learning (part 1) ...Principal component analysis and matrix factorizations for learning (part 1) ...
Principal component analysis and matrix factorizations for learning (part 1) ...zukun
 
Principal Component Analysis and Clustering
Principal Component Analysis and ClusteringPrincipal Component Analysis and Clustering
Principal Component Analysis and ClusteringUsha Vijay
 
ECG: Indication and Interpretation
ECG: Indication and InterpretationECG: Indication and Interpretation
ECG: Indication and InterpretationRakesh Verma
 
Introduction to Statistical Machine Learning
Introduction to Statistical Machine LearningIntroduction to Statistical Machine Learning
Introduction to Statistical Machine Learningmahutte
 
Principal component analysis
Principal component analysisPrincipal component analysis
Principal component analysisFarah M. Altufaili
 

Viewers also liked (20)

fauvel_igarss.pdf
fauvel_igarss.pdffauvel_igarss.pdf
fauvel_igarss.pdf
 
Nonlinear component analysis as a kernel eigenvalue problem
Nonlinear component analysis as a kernel eigenvalue problemNonlinear component analysis as a kernel eigenvalue problem
Nonlinear component analysis as a kernel eigenvalue problem
 
Principal component analysis and matrix factorizations for learning (part 2) ...
Principal component analysis and matrix factorizations for learning (part 2) ...Principal component analysis and matrix factorizations for learning (part 2) ...
Principal component analysis and matrix factorizations for learning (part 2) ...
 
Different kind of distance and Statistical Distance
Different kind of distance and Statistical DistanceDifferent kind of distance and Statistical Distance
Different kind of distance and Statistical Distance
 
KPCA_Survey_Report
KPCA_Survey_ReportKPCA_Survey_Report
KPCA_Survey_Report
 
Principal Component Analysis For Novelty Detection
Principal Component Analysis For Novelty DetectionPrincipal Component Analysis For Novelty Detection
Principal Component Analysis For Novelty Detection
 
Analyzing Kernel Security and Approaches for Improving it
Analyzing Kernel Security and Approaches for Improving itAnalyzing Kernel Security and Approaches for Improving it
Analyzing Kernel Security and Approaches for Improving it
 
Adaptive anomaly detection with kernel eigenspace splitting and merging
Adaptive anomaly detection with kernel eigenspace splitting and mergingAdaptive anomaly detection with kernel eigenspace splitting and merging
Adaptive anomaly detection with kernel eigenspace splitting and merging
 
Modeling and forecasting age-specific mortality: Lee-Carter method vs. Functi...
Modeling and forecasting age-specific mortality: Lee-Carter method vs. Functi...Modeling and forecasting age-specific mortality: Lee-Carter method vs. Functi...
Modeling and forecasting age-specific mortality: Lee-Carter method vs. Functi...
 
Explicit Signal to Noise Ratio in Reproducing Kernel Hilbert Spaces.pdf
Explicit Signal to Noise Ratio in Reproducing Kernel Hilbert Spaces.pdfExplicit Signal to Noise Ratio in Reproducing Kernel Hilbert Spaces.pdf
Explicit Signal to Noise Ratio in Reproducing Kernel Hilbert Spaces.pdf
 
A Comparative Study between ICA (Independent Component Analysis) and PCA (Pri...
A Comparative Study between ICA (Independent Component Analysis) and PCA (Pri...A Comparative Study between ICA (Independent Component Analysis) and PCA (Pri...
A Comparative Study between ICA (Independent Component Analysis) and PCA (Pri...
 
Regularized Principal Component Analysis for Spatial Data
Regularized Principal Component Analysis for Spatial DataRegularized Principal Component Analysis for Spatial Data
Regularized Principal Component Analysis for Spatial Data
 
Pca and kpca of ecg signal
Pca and kpca of ecg signalPca and kpca of ecg signal
Pca and kpca of ecg signal
 
DataEngConf: Feature Extraction: Modern Questions and Challenges at Google
DataEngConf: Feature Extraction: Modern Questions and Challenges at GoogleDataEngConf: Feature Extraction: Modern Questions and Challenges at Google
DataEngConf: Feature Extraction: Modern Questions and Challenges at Google
 
Probabilistic PCA, EM, and more
Probabilistic PCA, EM, and moreProbabilistic PCA, EM, and more
Probabilistic PCA, EM, and more
 
Principal component analysis and matrix factorizations for learning (part 1) ...
Principal component analysis and matrix factorizations for learning (part 1) ...Principal component analysis and matrix factorizations for learning (part 1) ...
Principal component analysis and matrix factorizations for learning (part 1) ...
 
Principal Component Analysis and Clustering
Principal Component Analysis and ClusteringPrincipal Component Analysis and Clustering
Principal Component Analysis and Clustering
 
ECG: Indication and Interpretation
ECG: Indication and InterpretationECG: Indication and Interpretation
ECG: Indication and Interpretation
 
Introduction to Statistical Machine Learning
Introduction to Statistical Machine LearningIntroduction to Statistical Machine Learning
Introduction to Statistical Machine Learning
 
Principal component analysis
Principal component analysisPrincipal component analysis
Principal component analysis
 

Similar to Kernel Entropy Component Analysis in Remote Sensing Data Clustering.pdf

Approximate Tree Kernels
Approximate Tree KernelsApproximate Tree Kernels
Approximate Tree KernelsNiharjyoti Sarangi
 
Csr2011 june14 14_00_agrawal
Csr2011 june14 14_00_agrawalCsr2011 june14 14_00_agrawal
Csr2011 june14 14_00_agrawalCSR2011
 
Convolutional networks and graph networks through kernels
Convolutional networks and graph networks through kernelsConvolutional networks and graph networks through kernels
Convolutional networks and graph networks through kernelstuxette
 
Integration with kernel methods, Transported meshfree methods
Integration with kernel methods, Transported meshfree methodsIntegration with kernel methods, Transported meshfree methods
Integration with kernel methods, Transported meshfree methodsMercier Jean-Marc
 
Statistical Analysis of Neural Coding
Statistical Analysis of Neural CodingStatistical Analysis of Neural Coding
Statistical Analysis of Neural CodingYifei Shea, Ph.D.
 
simplex.pdf
simplex.pdfsimplex.pdf
simplex.pdfgrssieee
 
simplex.pdf
simplex.pdfsimplex.pdf
simplex.pdfgrssieee
 
simplex.pdf
simplex.pdfsimplex.pdf
simplex.pdfgrssieee
 
SIMPLEX VOLUME ANALYSIS BASED ON TRIANGULAR FACTORIZATION: A FRAMEWORK FOR HY...
SIMPLEX VOLUME ANALYSIS BASED ON TRIANGULAR FACTORIZATION: A FRAMEWORK FOR HY...SIMPLEX VOLUME ANALYSIS BASED ON TRIANGULAR FACTORIZATION: A FRAMEWORK FOR HY...
SIMPLEX VOLUME ANALYSIS BASED ON TRIANGULAR FACTORIZATION: A FRAMEWORK FOR HY...grssieee
 
Astaño 4
Astaño 4Astaño 4
Astaño 4Cire Oreja
 
Topological Inference via Meshing
Topological Inference via MeshingTopological Inference via Meshing
Topological Inference via MeshingDon Sheehy
 
Neural Processes Family
Neural Processes FamilyNeural Processes Family
Neural Processes FamilyKota Matsui
 
Non-parametric regressions & Neural Networks
Non-parametric regressions & Neural NetworksNon-parametric regressions & Neural Networks
Non-parametric regressions & Neural NetworksGiuseppe Broccolo
 
A formal ontology of sequences
A formal ontology of sequencesA formal ontology of sequences
A formal ontology of sequencesRobert Hoehndorf
 
Kernel-Based_Retrieval_of_Atmospheric_Profiles_from_IASI_Data.pdf
Kernel-Based_Retrieval_of_Atmospheric_Profiles_from_IASI_Data.pdfKernel-Based_Retrieval_of_Atmospheric_Profiles_from_IASI_Data.pdf
Kernel-Based_Retrieval_of_Atmospheric_Profiles_from_IASI_Data.pdfgrssieee
 
IROS 2011 talk 2 (Filippo's file)
IROS 2011 talk 2 (Filippo's file)IROS 2011 talk 2 (Filippo's file)
IROS 2011 talk 2 (Filippo's file)Gianluca Antonelli
 
JAISTă‚”ăƒžăƒŒă‚čă‚ŻăƒŒăƒ«2016ă€Œè„łă‚’çŸ„ă‚‹ăŸă‚ăźç†è«–ă€èŹ›çŸ©04 Neural Networks and Neuroscience
JAISTă‚”ăƒžăƒŒă‚čă‚ŻăƒŒăƒ«2016ă€Œè„łă‚’çŸ„ă‚‹ăŸă‚ăźç†è«–ă€èŹ›çŸ©04 Neural Networks and Neuroscience JAISTă‚”ăƒžăƒŒă‚čă‚ŻăƒŒăƒ«2016ă€Œè„łă‚’çŸ„ă‚‹ăŸă‚ăźç†è«–ă€èŹ›çŸ©04 Neural Networks and Neuroscience
JAISTă‚”ăƒžăƒŒă‚čă‚ŻăƒŒăƒ«2016ă€Œè„łă‚’çŸ„ă‚‹ăŸă‚ăźç†è«–ă€èŹ›çŸ©04 Neural Networks and Neuroscience hirokazutanaka
 
NIPS2010: optimization algorithms in machine learning
NIPS2010: optimization algorithms in machine learningNIPS2010: optimization algorithms in machine learning
NIPS2010: optimization algorithms in machine learningzukun
 
Basics of probability in statistical simulation and stochastic programming
Basics of probability in statistical simulation and stochastic programmingBasics of probability in statistical simulation and stochastic programming
Basics of probability in statistical simulation and stochastic programmingSSA KPI
 

Similar to Kernel Entropy Component Analysis in Remote Sensing Data Clustering.pdf (20)

Approximate Tree Kernels
Approximate Tree KernelsApproximate Tree Kernels
Approximate Tree Kernels
 
Csr2011 june14 14_00_agrawal
Csr2011 june14 14_00_agrawalCsr2011 june14 14_00_agrawal
Csr2011 june14 14_00_agrawal
 
Pres metabief2020jmm
Pres metabief2020jmmPres metabief2020jmm
Pres metabief2020jmm
 
Convolutional networks and graph networks through kernels
Convolutional networks and graph networks through kernelsConvolutional networks and graph networks through kernels
Convolutional networks and graph networks through kernels
 
Integration with kernel methods, Transported meshfree methods
Integration with kernel methods, Transported meshfree methodsIntegration with kernel methods, Transported meshfree methods
Integration with kernel methods, Transported meshfree methods
 
Statistical Analysis of Neural Coding
Statistical Analysis of Neural CodingStatistical Analysis of Neural Coding
Statistical Analysis of Neural Coding
 
simplex.pdf
simplex.pdfsimplex.pdf
simplex.pdf
 
simplex.pdf
simplex.pdfsimplex.pdf
simplex.pdf
 
simplex.pdf
simplex.pdfsimplex.pdf
simplex.pdf
 
SIMPLEX VOLUME ANALYSIS BASED ON TRIANGULAR FACTORIZATION: A FRAMEWORK FOR HY...
SIMPLEX VOLUME ANALYSIS BASED ON TRIANGULAR FACTORIZATION: A FRAMEWORK FOR HY...SIMPLEX VOLUME ANALYSIS BASED ON TRIANGULAR FACTORIZATION: A FRAMEWORK FOR HY...
SIMPLEX VOLUME ANALYSIS BASED ON TRIANGULAR FACTORIZATION: A FRAMEWORK FOR HY...
 
Astaño 4
Astaño 4Astaño 4
Astaño 4
 
Topological Inference via Meshing
Topological Inference via MeshingTopological Inference via Meshing
Topological Inference via Meshing
 
Neural Processes Family
Neural Processes FamilyNeural Processes Family
Neural Processes Family
 
Non-parametric regressions & Neural Networks
Non-parametric regressions & Neural NetworksNon-parametric regressions & Neural Networks
Non-parametric regressions & Neural Networks
 
A formal ontology of sequences
A formal ontology of sequencesA formal ontology of sequences
A formal ontology of sequences
 
Kernel-Based_Retrieval_of_Atmospheric_Profiles_from_IASI_Data.pdf
Kernel-Based_Retrieval_of_Atmospheric_Profiles_from_IASI_Data.pdfKernel-Based_Retrieval_of_Atmospheric_Profiles_from_IASI_Data.pdf
Kernel-Based_Retrieval_of_Atmospheric_Profiles_from_IASI_Data.pdf
 
IROS 2011 talk 2 (Filippo's file)
IROS 2011 talk 2 (Filippo's file)IROS 2011 talk 2 (Filippo's file)
IROS 2011 talk 2 (Filippo's file)
 
JAISTă‚”ăƒžăƒŒă‚čă‚ŻăƒŒăƒ«2016ă€Œè„łă‚’çŸ„ă‚‹ăŸă‚ăźç†è«–ă€èŹ›çŸ©04 Neural Networks and Neuroscience
JAISTă‚”ăƒžăƒŒă‚čă‚ŻăƒŒăƒ«2016ă€Œè„łă‚’çŸ„ă‚‹ăŸă‚ăźç†è«–ă€èŹ›çŸ©04 Neural Networks and Neuroscience JAISTă‚”ăƒžăƒŒă‚čă‚ŻăƒŒăƒ«2016ă€Œè„łă‚’çŸ„ă‚‹ăŸă‚ăźç†è«–ă€èŹ›çŸ©04 Neural Networks and Neuroscience
JAISTă‚”ăƒžăƒŒă‚čă‚ŻăƒŒăƒ«2016ă€Œè„łă‚’çŸ„ă‚‹ăŸă‚ăźç†è«–ă€èŹ›çŸ©04 Neural Networks and Neuroscience
 
NIPS2010: optimization algorithms in machine learning
NIPS2010: optimization algorithms in machine learningNIPS2010: optimization algorithms in machine learning
NIPS2010: optimization algorithms in machine learning
 
Basics of probability in statistical simulation and stochastic programming
Basics of probability in statistical simulation and stochastic programmingBasics of probability in statistical simulation and stochastic programming
Basics of probability in statistical simulation and stochastic programming
 

More from grssieee

Tangent height accuracy of Superconducting Submillimeter-Wave Limb-Emission S...
Tangent height accuracy of Superconducting Submillimeter-Wave Limb-Emission S...Tangent height accuracy of Superconducting Submillimeter-Wave Limb-Emission S...
Tangent height accuracy of Superconducting Submillimeter-Wave Limb-Emission S...grssieee
 
SEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODEL
SEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODELSEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODEL
SEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODELgrssieee
 
TWO-POINT STATISTIC OF POLARIMETRIC SAR DATA TWO-POINT STATISTIC OF POLARIMET...
TWO-POINT STATISTIC OF POLARIMETRIC SAR DATA TWO-POINT STATISTIC OF POLARIMET...TWO-POINT STATISTIC OF POLARIMETRIC SAR DATA TWO-POINT STATISTIC OF POLARIMET...
TWO-POINT STATISTIC OF POLARIMETRIC SAR DATA TWO-POINT STATISTIC OF POLARIMET...grssieee
 
THE SENTINEL-1 MISSION AND ITS APPLICATION CAPABILITIES
THE SENTINEL-1 MISSION AND ITS APPLICATION CAPABILITIESTHE SENTINEL-1 MISSION AND ITS APPLICATION CAPABILITIES
THE SENTINEL-1 MISSION AND ITS APPLICATION CAPABILITIESgrssieee
 
GMES SPACE COMPONENT:PROGRAMMATIC STATUS
GMES SPACE COMPONENT:PROGRAMMATIC STATUSGMES SPACE COMPONENT:PROGRAMMATIC STATUS
GMES SPACE COMPONENT:PROGRAMMATIC STATUSgrssieee
 
PROGRESSES OF DEVELOPMENT OF CFOSAT SCATTEROMETER
PROGRESSES OF DEVELOPMENT OF CFOSAT SCATTEROMETERPROGRESSES OF DEVELOPMENT OF CFOSAT SCATTEROMETER
PROGRESSES OF DEVELOPMENT OF CFOSAT SCATTEROMETERgrssieee
 
DEVELOPMENT OF ALGORITHMS AND PRODUCTS FOR SUPPORTING THE ITALIAN HYPERSPECTR...
DEVELOPMENT OF ALGORITHMS AND PRODUCTS FOR SUPPORTING THE ITALIAN HYPERSPECTR...DEVELOPMENT OF ALGORITHMS AND PRODUCTS FOR SUPPORTING THE ITALIAN HYPERSPECTR...
DEVELOPMENT OF ALGORITHMS AND PRODUCTS FOR SUPPORTING THE ITALIAN HYPERSPECTR...grssieee
 
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...grssieee
 
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...grssieee
 
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...grssieee
 
Test
TestTest
Testgrssieee
 
test 34mb wo animations
test  34mb wo animationstest  34mb wo animations
test 34mb wo animationsgrssieee
 
Test 70MB
Test 70MBTest 70MB
Test 70MBgrssieee
 
Test 70MB
Test 70MBTest 70MB
Test 70MBgrssieee
 
2011_Fox_Tax_Worksheets.pdf
2011_Fox_Tax_Worksheets.pdf2011_Fox_Tax_Worksheets.pdf
2011_Fox_Tax_Worksheets.pdfgrssieee
 
DLR open house
DLR open houseDLR open house
DLR open housegrssieee
 
DLR open house
DLR open houseDLR open house
DLR open housegrssieee
 
DLR open house
DLR open houseDLR open house
DLR open housegrssieee
 
Tana_IGARSS2011.ppt
Tana_IGARSS2011.pptTana_IGARSS2011.ppt
Tana_IGARSS2011.pptgrssieee
 
Solaro_IGARSS_2011.ppt
Solaro_IGARSS_2011.pptSolaro_IGARSS_2011.ppt
Solaro_IGARSS_2011.pptgrssieee
 

More from grssieee (20)

Tangent height accuracy of Superconducting Submillimeter-Wave Limb-Emission S...
Tangent height accuracy of Superconducting Submillimeter-Wave Limb-Emission S...Tangent height accuracy of Superconducting Submillimeter-Wave Limb-Emission S...
Tangent height accuracy of Superconducting Submillimeter-Wave Limb-Emission S...
 
SEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODEL
SEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODELSEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODEL
SEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODEL
 
TWO-POINT STATISTIC OF POLARIMETRIC SAR DATA TWO-POINT STATISTIC OF POLARIMET...
TWO-POINT STATISTIC OF POLARIMETRIC SAR DATA TWO-POINT STATISTIC OF POLARIMET...TWO-POINT STATISTIC OF POLARIMETRIC SAR DATA TWO-POINT STATISTIC OF POLARIMET...
TWO-POINT STATISTIC OF POLARIMETRIC SAR DATA TWO-POINT STATISTIC OF POLARIMET...
 
THE SENTINEL-1 MISSION AND ITS APPLICATION CAPABILITIES
THE SENTINEL-1 MISSION AND ITS APPLICATION CAPABILITIESTHE SENTINEL-1 MISSION AND ITS APPLICATION CAPABILITIES
THE SENTINEL-1 MISSION AND ITS APPLICATION CAPABILITIES
 
GMES SPACE COMPONENT:PROGRAMMATIC STATUS
GMES SPACE COMPONENT:PROGRAMMATIC STATUSGMES SPACE COMPONENT:PROGRAMMATIC STATUS
GMES SPACE COMPONENT:PROGRAMMATIC STATUS
 
PROGRESSES OF DEVELOPMENT OF CFOSAT SCATTEROMETER
PROGRESSES OF DEVELOPMENT OF CFOSAT SCATTEROMETERPROGRESSES OF DEVELOPMENT OF CFOSAT SCATTEROMETER
PROGRESSES OF DEVELOPMENT OF CFOSAT SCATTEROMETER
 
DEVELOPMENT OF ALGORITHMS AND PRODUCTS FOR SUPPORTING THE ITALIAN HYPERSPECTR...
DEVELOPMENT OF ALGORITHMS AND PRODUCTS FOR SUPPORTING THE ITALIAN HYPERSPECTR...DEVELOPMENT OF ALGORITHMS AND PRODUCTS FOR SUPPORTING THE ITALIAN HYPERSPECTR...
DEVELOPMENT OF ALGORITHMS AND PRODUCTS FOR SUPPORTING THE ITALIAN HYPERSPECTR...
 
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
 
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
 
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
 
Test
TestTest
Test
 
test 34mb wo animations
test  34mb wo animationstest  34mb wo animations
test 34mb wo animations
 
Test 70MB
Test 70MBTest 70MB
Test 70MB
 
Test 70MB
Test 70MBTest 70MB
Test 70MB
 
2011_Fox_Tax_Worksheets.pdf
2011_Fox_Tax_Worksheets.pdf2011_Fox_Tax_Worksheets.pdf
2011_Fox_Tax_Worksheets.pdf
 
DLR open house
DLR open houseDLR open house
DLR open house
 
DLR open house
DLR open houseDLR open house
DLR open house
 
DLR open house
DLR open houseDLR open house
DLR open house
 
Tana_IGARSS2011.ppt
Tana_IGARSS2011.pptTana_IGARSS2011.ppt
Tana_IGARSS2011.ppt
 
Solaro_IGARSS_2011.ppt
Solaro_IGARSS_2011.pptSolaro_IGARSS_2011.ppt
Solaro_IGARSS_2011.ppt
 

Recently uploaded

Finding Java's Hidden Performance Traps @ DevoxxUK 2024
Finding Java's Hidden Performance Traps @ DevoxxUK 2024Finding Java's Hidden Performance Traps @ DevoxxUK 2024
Finding Java's Hidden Performance Traps @ DevoxxUK 2024Victor Rentea
 
Biography Of Angeliki Cooney | Senior Vice President Life Sciences | Albany, ...
Biography Of Angeliki Cooney | Senior Vice President Life Sciences | Albany, ...Biography Of Angeliki Cooney | Senior Vice President Life Sciences | Albany, ...
Biography Of Angeliki Cooney | Senior Vice President Life Sciences | Albany, ...Angeliki Cooney
 
Vector Search -An Introduction in Oracle Database 23ai.pptx
Vector Search -An Introduction in Oracle Database 23ai.pptxVector Search -An Introduction in Oracle Database 23ai.pptx
Vector Search -An Introduction in Oracle Database 23ai.pptxRemote DBA Services
 
Platformless Horizons for Digital Adaptability
Platformless Horizons for Digital AdaptabilityPlatformless Horizons for Digital Adaptability
Platformless Horizons for Digital AdaptabilityWSO2
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfsudhanshuwaghmare1
 
Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...
Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...
Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...apidays
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerThousandEyes
 
DEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
DEV meet-up UiPath Document Understanding May 7 2024 AmsterdamDEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
DEV meet-up UiPath Document Understanding May 7 2024 AmsterdamUiPathCommunity
 
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers:  A Deep Dive into Serverless Spatial Data and FMECloud Frontiers:  A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FMESafe Software
 
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemke
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemkeProductAnonymous-April2024-WinProductDiscovery-MelissaKlemke
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemkeProduct Anonymous
 
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...DianaGray10
 
Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...
Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...
Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...apidays
 
Artificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : UncertaintyArtificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : UncertaintyKhushali Kathiriya
 
EMPOWERMENT TECHNOLOGY GRADE 11 QUARTER 2 REVIEWER
EMPOWERMENT TECHNOLOGY GRADE 11 QUARTER 2 REVIEWEREMPOWERMENT TECHNOLOGY GRADE 11 QUARTER 2 REVIEWER
EMPOWERMENT TECHNOLOGY GRADE 11 QUARTER 2 REVIEWERMadyBayot
 
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024Victor Rentea
 
AWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAndrey Devyatkin
 
Corporate and higher education May webinar.pptx
Corporate and higher education May webinar.pptxCorporate and higher education May webinar.pptx
Corporate and higher education May webinar.pptxRustici Software
 
Polkadot JAM Slides - Token2049 - By Dr. Gavin Wood
Polkadot JAM Slides - Token2049 - By Dr. Gavin WoodPolkadot JAM Slides - Token2049 - By Dr. Gavin Wood
Polkadot JAM Slides - Token2049 - By Dr. Gavin WoodJuan lago vĂĄzquez
 
presentation ICT roal in 21st century education
presentation ICT roal in 21st century educationpresentation ICT roal in 21st century education
presentation ICT roal in 21st century educationjfdjdjcjdnsjd
 
Kernel Entropy Component Analysis in Remote Sensing Data Clustering.pdf

  • 1. Kernel Entropy Component Analysis in Remote Sensing Data Clustering
    Luis GĂłmez-Chova (1), Robert Jenssen (2), Gustavo Camps-Valls (1)
    (1) Image Processing Laboratory (IPL), Universitat de ValĂšncia, Spain. luis.gomez-chova@uv.es , http://www.valencia.edu/chovago
    (2) Department of Physics and Technology, University of TromsĂž, Norway. robert.jenssen@uit.no , http://www.phys.uit.no/∌robertj
    IGARSS 2011 – Vancouver, Canada
  • 2. Outline
    1 Introduction
    2 Entropy Component Analysis
    3 Kernel Entropy Component Analysis (KECA)
    4 KECA Spectral Clustering
    5 Experimental Results
    6 Conclusions and Open questions
  • 3. Motivation
    Feature Extraction
    Feature selection/extraction is essential before classification or regression: it discards redundant or noisy components and reduces the dimensionality of the data by creating a subset of new features from combinations of the existing ones.
    Linear Feature Extraction
    Linear methods offer interpretability (knowledge discovery):
    PCA: projections maximizing the data set variance
    PLS: projections maximally aligned with the labels
    ICA: non-orthogonal projections with maximal independent axes
    However, they fail when data distributions are curved, i.e. when feature relations are nonlinear.
  • 4. Objectives
    Objectives: a kernel-based nonlinear data transformation that captures the higher-order statistics of the data and extracts features suited for clustering.
    Method: Kernel Entropy Component Analysis (KECA) [Jenssen, 2010], based on Information Theory: it maximally preserves the entropy of the input data, its angular clustering maximizes cluster divergence, and an out-of-sample extension deals with test data.
    Experiments: cloud screening from ENVISAT/MERIS multispectral images.
  • 5. [Section divider: Entropy Component Analysis]
  • 6. Information-Theoretic Learning: the Entropy Concept
    The entropy of a probability density function (pdf) is a measure of information: entropy ⇔ shape of the pdf.
  • 7. Information-Theoretic Learning: the Divergence Concept
    The entropy concept can be extended to obtain a measure of dissimilarity between distributions: divergence ⇔ distance between pdfs.
  • 8. Entropy Component Analysis
    Shannon entropy: H(p) = −∫ p(x) log p(x) dx
    How to handle densities? How to compute the integrals?
    RĂ©nyi's entropies: H_α(p) = (1/(1−α)) log ∫ p^α(x) dx
    RĂ©nyi's entropies contain Shannon's as the special case α → 1.
    We focus on RĂ©nyi's quadratic entropy, α = 2:
    H(p) = −log ∫ pÂČ(x) dx = −log V(p)
    It can be estimated directly from samples!
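As a side check (not on the slide), l'HÎpital's rule confirms that Shannon entropy is the α → 1 limit of the Rényi family:

```latex
\lim_{\alpha \to 1} \frac{1}{1-\alpha} \log \int p^{\alpha}(x)\,dx
  = \lim_{\alpha \to 1}
    \frac{\int p^{\alpha}(x) \log p(x)\,dx \,/\, \int p^{\alpha}(x)\,dx}{-1}
  = -\int p(x) \log p(x)\,dx = H(p)
```

using d/dα ∫ p^α dx = ∫ p^α log p dx and ∫ p dx = 1 in the last step.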
  • 9. Entropy Component Analysis: RĂ©nyi's quadratic entropy estimator
    Estimated from data D = {x1, . . . , xN} ∈ R^d generated by the pdf p(x), with a Parzen window estimator using a Gaussian or Radial Basis Function (RBF) kernel:
    p̂(x) = (1/N) ÎŁ_{xt∈D} K(x, xt | σ), with K(x, xt) = exp(−‖x − xt‖ÂČ / 2σÂČ)
    Idea: place a kernel over the samples and sum with proper normalization.
    The estimator for the information potential V(p) = ∫ pÂČ(x) dx:
    V̂(p) = ∫ p̂ÂČ(x) dx = ∫ [(1/N) ÎŁ_{xt∈D} K(x, xt | σ)] [(1/N) ÎŁ_{xt'∈D} K(x, xt' | σ)] dx
          = (1/NÂČ) ÎŁ_{xt∈D} ÎŁ_{xt'∈D} ∫ K(x, xt | σ) K(x, xt' | σ) dx
          = (1/NÂČ) ÎŁ_{xt∈D} ÎŁ_{xt'∈D} K(xt, xt' | √2 σ) = (1/NÂČ) 1á”€ K 1
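A minimal NumPy sketch of this estimator (my own illustration, not the authors' code): the information potential is the grand mean of an RBF kernel matrix whose bandwidth is √2 σ, because the convolution of two Gaussian windows of width σ is a Gaussian of width √2 σ.

```python
import numpy as np

def renyi_quadratic_entropy(X, sigma):
    """Estimate Renyi's quadratic entropy H(p) = -log V(p) from samples.

    V(p) is estimated as (1/N^2) 1' K 1, where K is an RBF kernel
    matrix evaluated at bandwidth sqrt(2)*sigma (convolution of two
    Parzen windows of width sigma)."""
    N = X.shape[0]
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / (2.0 * (np.sqrt(2.0) * sigma) ** 2))
    V = K.sum() / N ** 2          # information potential: (1/N^2) 1' K 1
    return -np.log(V)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
H = renyi_quadratic_entropy(X, sigma=1.0)
```

Since the kernel entries lie in (0, 1] and the diagonal is all ones, V̂ lies in [1/N, 1], so the estimate is always between 0 and log N.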
  • 10. Entropy Component Analysis: RĂ©nyi's quadratic entropy estimator
    The empirical RĂ©nyi entropy estimate resides in the corresponding kernel matrix:
    V̂(p) = (1/NÂČ) 1á”€ K 1
    It can be expressed in terms of the eigenvalues and eigenvectors of K = E D Eá”€, where D is the diagonal matrix of eigenvalues λ1, . . . , λN and E is the matrix with the eigenvectors e1, . . . , eN. Therefore we have
    V̂(p) = (1/NÂČ) ÎŁ_{i=1}^N ( √λi eiá”€ 1 )ÂČ
    where each term (√λi eiá”€ 1)ÂČ contributes to the entropy estimate.
    ECA dimensionality reduction
    Idea: find the smallest set of features that maximally preserves the entropy of the input data (the largest contributions √λi eiá”€ 1).
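The per-axis entropy terms can be computed directly from the eigendecomposition of K; this sketch (my own, under the slide's definitions) also verifies the identity ÎŁ_i (√λi eiᔀ1)ÂČ = 1ᔀK1 numerically.

```python
import numpy as np

def eca_entropy_terms(K):
    """Entropy contribution of each kernel eigen-pair.

    With K = E D E', term_i = (sqrt(lambda_i) e_i' 1)^2, so that
    V(p) = (1/N^2) sum_i term_i.  ECA keeps the axes with the
    largest terms, which need not be the top-eigenvalue axes."""
    N = K.shape[0]
    lam, E = np.linalg.eigh(K)              # ascending eigenvalues
    lam = np.clip(lam, 0.0, None)           # guard tiny negative values
    terms = (np.sqrt(lam) * (E.T @ np.ones(N))) ** 2
    order = np.argsort(terms)[::-1]         # most entropy first
    return terms, order

# toy positive semi-definite kernel matrix from random 2-D data
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
K = np.exp(-np.sum((X[:, None] - X[None, :]) ** 2, axis=-1) / 2.0)
terms, order = eca_entropy_terms(K)
V_hat = terms.sum() / K.shape[0] ** 2       # equals (1/N^2) 1' K 1
```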
  • 11. Entropy Component Analysis
    [Figure: example distributions and their entropy values: H(p) = 4.36, 4.74, 5.05, 4.71; Ĥ(p) = 4.81, H(p) = 4.44]
  • 12. [Section divider: Kernel Entropy Component Analysis (KECA)]
  • 13. Kernel Principal Component Analysis (KPCA)
    Principal Component Analysis (PCA): find projections of X = [x1, . . . , xN] maximizing the variance of the projected data XU:
    maximize: Trace{(XU)á”€(XU)} = Trace{Uá”€ Cxx U}
    subject to: Uá”€ U = I
    Including Lagrange multipliers λ, this is equivalent to the eigenproblem
    Cxx ui = λi ui → Cxx U = U D
    The ui are the eigenvectors of Cxx and they are orthonormal: uiá”€ uj = 0 for i ≠ j.
  • 14. Kernel Principal Component Analysis (KPCA)
    KPCA: find projections maximizing the variance of the mapped data Ί = [φ(x1), . . . , φ(xN)]á”€:
    maximize: Tr{(ΊU)á”€(ΊU)} = Tr{Uá”€ Ίá”€Ί U}
    subject to: Uá”€ U = I
    But the covariance matrix Ίá”€Ί and the projection matrix U are dH × dH !!!
    KPCA through the kernel trick: apply the representer theorem, U = Ίá”€ A with A = [α1, . . . , αN]:
    maximize: Tr{Aá”€ ΊΊá”€ΊΊá”€ A} = Tr{Aá”€ K K A}
    subject to: Uá”€ U = Aá”€ ΊΊá”€ A = Aá”€ K A = I
    Including Lagrange multipliers λ, this is equivalent to the eigenproblem
    K K αi = λi K αi → K αi = λi αi
    Now the matrix A is N × N !!! (eigendecomposition of K = E D Eá”€)
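Once the problem is reduced to an eigendecomposition of K, KPCA is a few lines of NumPy. This is my own sketch under the slide's notation (centring of K in feature space is omitted for brevity); with m = N the projections reproduce K exactly, since (E D^{1/2})(E D^{1/2})ᔀ = K.

```python
import numpy as np

def kpca_projections(K, m):
    """Kernel PCA from the kernel matrix K: solve K alpha_i = lambda_i alpha_i
    and return the data projected onto the top-m kernel principal axes,
    i.e. Phi U_m = E_m D_m^{1/2}."""
    lam, E = np.linalg.eigh(K)              # ascending order
    lam = np.clip(lam, 0.0, None)           # guard tiny negative eigenvalues
    idx = np.argsort(lam)[::-1][:m]         # top-m eigenvalues
    return E[:, idx] * np.sqrt(lam[idx])    # N x m projected data

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))
d2 = np.sum((X[:, None] - X[None, :]) ** 2, axis=-1)
K = np.exp(-d2 / 2.0)                       # RBF kernel, sigma = 1
P = kpca_projections(K, m=2)
```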
  • 15. Kernel ECA Transformation
    Kernel Entropy Component Analysis (KECA): the projection of Ί onto the m feature-space principal axes contributing most to the RĂ©nyi entropy estimate of the input data:
    Ίeca = Ί Um = Em Dm^{1/2}
    The projections onto a single principal axis ui in H are given by Ί ui = √λi ei.
    The entropy associated with Ίeca is V̂m = (1/NÂČ) 1á”€ Keca 1 = (1/NÂČ) ÎŁ_{i=1}^m (√λi eiá”€ 1)ÂČ
    Note that Ίeca is not necessarily based on the top eigenvalues λi, since eiá”€ 1 also contributes to the entropy estimate.
    Out-of-sample extension: projections for a collection of test data points:
    Ίeca,test = Ίtest Um = Ίtest Ίá”€ Em Dm^{−1/2} = Ktest Em Dm^{−1/2}
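Both the training transformation and the out-of-sample extension follow from the same eigendecomposition; a sketch (my own, not the toolbox mentioned later). Applying the out-of-sample formula to the training kernel itself must reproduce Ίeca, since K Em Dm^{−1/2} = E D Eᔀ Em Dm^{−1/2} = Em Dm^{1/2}.

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    """Gaussian RBF kernel matrix between the rows of A and B."""
    d2 = (np.sum(A ** 2, axis=1)[:, None] + np.sum(B ** 2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def keca_fit(K, m):
    """Keep the m eigen-axes with the largest entropy contributions
    (sqrt(lambda_i) e_i' 1)^2; return Phi_eca = E_m D_m^{1/2}."""
    N = K.shape[0]
    lam, E = np.linalg.eigh(K)
    lam = np.clip(lam, 0.0, None)
    terms = (np.sqrt(lam) * (E.T @ np.ones(N))) ** 2
    idx = np.argsort(terms)[::-1][:m]
    return E[:, idx] * np.sqrt(lam[idx]), E[:, idx], lam[idx]

def keca_transform(K_test, E_m, lam_m):
    """Out-of-sample projection: K_test E_m D_m^{-1/2}."""
    return K_test @ (E_m / np.sqrt(lam_m))

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 3))
K = rbf_kernel(X, X, sigma=1.5)
Phi_eca, E_m, lam_m = keca_fit(K, m=2)
Phi_again = keca_transform(rbf_kernel(X, X, 1.5), E_m, lam_m)
```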
  • 16. Kernel ECA Transformation: KECA example
    [Figure: toy data under the Original, PCA, KPCA and KECA representations]
    KECA reveals the cluster structure → the underlying labels of the data.
    Nonlinearly related clusters in X → different angular directions in H.
    An angular clustering based on the kernel features Ίeca therefore seems reasonable.
  • 17. [Section divider: KECA Spectral Clustering]
  • 18. KECA Spectral Clustering
    Cauchy-Schwarz divergence: the Cauchy-Schwarz divergence between the pdfs of two clusters is
    DCS(pi, pj) = −log(VCS(pi, pj)) = −log [ ∫ pi(x) pj(x) dx / √( ∫ piÂČ(x) dx ∫ pjÂČ(x) dx ) ]
    Measuring dissimilarity in a probability space is a complex issue.
    Entropy interpretation in the kernel space → mean vector ” = (1/N) ÎŁ_t φ(xt):
    V̂(p) = ∫ p̂ÂČ(x) dx = (1/NÂČ) 1á”€ K 1 = (1/NÂČ) 1á”€ ΊΊá”€ 1 = ”á”€ ” = ‖”‖ÂČ
    Divergence via Parzen windowing ⇒ V̂CS(pi, pj) = ”iá”€ ”j / (‖”i‖ ‖”j‖) = cos ∠(”i, ”j)
    KECA Spectral Clustering: an angular clustering of Ίeca maximizes the CS divergence between clusters:
    J(C1, . . . , Ck) = ÎŁ_{i=1}^k ÎŁ_{xt∈Ci} cos ∠(φeca(xt), ”i)
  • 19. KECA Spectral Clustering Algorithm
    1 Obtain Ίeca by Kernel ECA
    2 Initialize the means ”i, i = 1, . . . , k
    3 For all training samples, assign each xt to the cluster Ci maximizing cos ∠(φeca(xt), ”i)
    4 Update the mean vectors ”i
    5 Repeat steps 3 and 4 until convergence
    Intuition: a kernel feature space data point φeca(xt) is assigned to the cluster represented by the closest mean vector ”i in terms of angular distance.
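The five steps above amount to a k-means loop with cosine similarity in place of Euclidean distance. A minimal sketch under the slide's notation (my own illustration; the `mu_init` argument and the empty-cluster guard are my additions, not part of the slide):

```python
import numpy as np

def keca_spectral_clustering(Phi, k, mu_init=None, n_iter=100, seed=0):
    """Angular k-means on the KECA features Phi (N x m): each sample is
    assigned to the mean vector with maximal cosine similarity (step 3),
    then the means are updated (step 4) until the labels stabilise."""
    rng = np.random.default_rng(seed)
    N = Phi.shape[0]
    # step 2: initialise the means (here from k random samples)
    mu = (Phi[rng.choice(N, size=k, replace=False)].astype(float)
          if mu_init is None else np.array(mu_init, dtype=float))
    labels = np.full(N, -1)
    for _ in range(n_iter):
        # step 3: cosine similarity between every sample and every mean
        sim = (Phi @ mu.T) / (np.linalg.norm(Phi, axis=1, keepdims=True)
                              * np.linalg.norm(mu, axis=1)[None, :] + 1e-12)
        new = sim.argmax(axis=1)
        if np.array_equal(new, labels):     # step 5: converged
            break
        labels = new
        for i in range(k):                  # step 4: update non-empty clusters
            if np.any(labels == i):
                mu[i] = Phi[labels == i].mean(axis=0)
    return labels
```

With two groups of points near the directions (1, 0) and (0, 1) and means initialised on those axes, the loop separates them in a single pass.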
  • 20. [Section divider: Experimental Results]
  • 21. Experimental results: Data material
    Cloud masking from ENVISAT/MERIS multispectral images: pixel-wise binary decisions about the presence/absence of clouds.
    MERIS images taken over Spain and France; input samples with 13 spectral bands and 6 physically inspired features.
    [Images: Barrax (BR-2003-07-14), Barrax (BR-2004-07-14), France (FR-2005-03-19)]
  • 22. Experimental results: Numerical comparison
    Experimental setup
    KECA compared with k-means, KPCA + k-means, and Kernel k-means.
    Number of clusters fixed to k = 2 (cloud-free and cloudy areas).
    Number of KPCA and KECA features fixed to m = 2 (to stress differences).
    The RBF-kernel width parameter is selected by grid search for all methods.
    Numerical results
    Validation on 10000 manually labeled pixels per image; kappa statistic results over 10 realizations for all images.
    [Plots: estimated Îș statistic vs. number of training samples (200–1000) for KECA, KPCA, Kernel k-means and k-means on BR-2003-07-14, BR-2004-07-14 and FR-2005-03-19]
  • 23. Experimental results: Numerical comparison
    [Plot: average estimated Îș statistic vs. number of training samples (200–1000) for KECA, KPCA, Kernel k-means and k-means]
    KECA outperforms k-means (+25%) and Kernel k-means and KPCA (+15%).
    In general, the number of training samples positively affects the results.
  • 24. Experimental results: Classification maps
    Test Site                  k-means               Kernel k-means        KPCA                  KECA
    Spain (BR-2003-07-14)      OA=96.25%; Îș=0.6112   OA=96.22%; Îș=0.7540   OA=47.52%; Îș=0.0966   OA=99.41%; Îș=0.9541
    Spain (BR-2004-07-14)      OA=96.91%; Îș=0.6018   OA=62.03%; Îș=0.0767   OA=96.66%; Îș=0.6493   OA=97.54%; Îș=0.7319
    France (FR-2005-03-19)     OA=92.87%; Îș=0.6142   OA=92.64%; Îș=0.6231   OA=80.93%; Îș=0.4051   OA=92.91%; Îș=0.6302
  • 25. [Section divider: Conclusions and Open questions]
  • 26. Conclusions and open questions
    Conclusions
    Kernel entropy component analysis for clustering remote sensing data: nonlinear features preserving the entropy of the input data; angular clustering reveals structure in terms of cluster divergence; an out-of-sample extension for test data, which is mandatory in remote sensing.
    Good results on cloud screening from MERIS images.
    KECA code is available at http://www.phys.uit.no/∌robertj/
    A simple feature extraction toolbox (SIMFEAT) will soon be available at http://isp.uv.es
    Open questions and future work
    Pre-images of transformed data in the input space; learning kernel parameters automatically; testing KECA in more remote sensing applications.
  • 27. Kernel Entropy Component Analysis in Remote Sensing Data Clustering
    Luis GĂłmez-Chova (1), Robert Jenssen (2), Gustavo Camps-Valls (1)
    (1) Image Processing Laboratory (IPL), Universitat de ValĂšncia, Spain. luis.gomez-chova@uv.es , http://www.valencia.edu/chovago
    (2) Department of Physics and Technology, University of TromsĂž, Norway. robert.jenssen@uit.no , http://www.phys.uit.no/∌robertj
    IGARSS 2011 – Vancouver, Canada