Explicit Signal to Noise Ratio in
          Reproducing Kernel Hilbert Spaces

 Luis Gómez-Chova1         Allan A. Nielsen2     Gustavo Camps-Valls1

  1 Image Processing Laboratory (IPL), Universitat de València, Spain.
        luis.gomez-chova@uv.es , http://www.valencia.edu/chovago
  2 DTU Space - National Space Institute, Technical University of Denmark.


                    IGARSS 2011 – Vancouver, Canada

Intro                   SNR                   KMNF                    Results                Conclusions

Outline




        1   Introduction


        2   Signal-to-noise ratio transformation


        3   Kernel Minimum Noise Fraction


        4   Experimental Results


        5   Conclusions and Open questions




L. Gómez-Chova et al.         Explicit Kernel Signal to Noise Ratio     IGARSS 2011 – Vancouver    1/23

Motivation



        Feature Extraction
              Feature selection/extraction is essential before classification or regression
                  to discard redundant or noisy components
                  to reduce the dimensionality of the data
              Create a subset of new features by combinations of the existing ones

        Linear Feature Extraction
              Linear methods offer interpretability ∼ knowledge discovery
                  PCA: projections maximizing the data set variance
                  PLS: projections maximally aligned with the labels
                  ICA: non-orthogonal projections onto maximally independent axes


        Drawbacks
          1   Most feature extractors disregard the noise characteristics!
          2   Linear methods fail when data distributions are curved (nonlinear relations)



Objectives



        Objectives
             New nonlinear kernel feature extraction method for remote sensing data
             Extract features robust to data noise

        Method
             Based on the Minimum Noise Fraction (MNF) transformation
             Explicit Kernel MNF (KMNF)
                     Noise is explicitly estimated in the reproducing kernel Hilbert space
                     Deals with non-linear relations between the noise and signal features jointly
                     Reduces the number of free parameters in the formulation to one


        Experiments
             PCA, MNF, KPCA, and two versions of KMNF (implicit and explicit)
             Test feature extractors for real hyperspectral image classification




Signal and noise




        Signal vs noise
             Signal: magnitude generated by an inaccessible system, s_i
             Noise: magnitude generated by the medium corrupting the signal, n_i
             Observation: signal corrupted by noise, x_i

        Notation

             Observations: x_i ∈ R^N , i = 1, . . . , n
             Matrix notation: X = [x_1 , . . . , x_n]^T ∈ R^{n×N}
             Centered data sets: assume X has zero mean
             Empirical covariance matrix: C_xx = (1/n) X^T X
             Projection matrix: U (size N × n_p) → X' = XU (n_p extracted features)




Principal Component Analysis Transformation



        Principal Component Analysis (PCA)

              Find projections of X = [x_1 , . . . , x_n] maximizing the variance of the projected data XU

                    PCA:         maximize:        Trace{(XU)^T (XU)} = Trace{U^T C_xx U}
                                 subject to:      U^T U = I

              Including Lagrange multipliers λ, this is equivalent to the eigenproblem
                                         C_xx u_i = λ_i u_i  →  C_xx U = UD

              u_i are the eigenvectors of C_xx and they are orthonormal, u_i^T u_j = δ_ij

        PCA limitations
          1   Rotates the axes to the directions of maximum variance of the data
          2   It does not consider the noise characteristics:
                   Assumes noise variance is low → noise is relegated to the last eigenvectors (low eigenvalues)
                   Maximum-variance directions may be affected by noise
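As a minimal numerical sketch of the eigenproblem above (assuming NumPy; all names are illustrative, not from the talk):

```python
import numpy as np

def pca(X, n_components):
    """PCA via eigendecomposition of C_xx = (1/n) X^T X (X assumed centered)."""
    n = X.shape[0]
    Cxx = X.T @ X / n
    eigvals, eigvecs = np.linalg.eigh(Cxx)             # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:n_components]   # largest variance first
    U = eigvecs[:, order]
    return X @ U, U                                    # projected data X' = XU

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X -= X.mean(axis=0)          # center the data, as the notation slide assumes
Xp, U = pca(X, 2)            # U^T U = I: eigenvectors of C_xx are orthonormal
```

Since `eigh` returns orthonormal eigenvectors of a symmetric matrix, the constraint U^T U = I holds by construction.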



Minimum Noise Fraction Transformation

        The SNR transformation
            Find projections maximizing the ratio between signal and noise variances:

                             SNR:       maximize:   Tr{ (U^T C_ss U) / (U^T C_nn U) }
                                        subject to: U^T C_nn U = I

            Unknown signal and noise covariance matrices C_ss and C_nn

        The MNF transformation
            Assuming additive noise, X = S + N, with orthogonal signal and noise, S^T N = N^T S = 0
            Maximizing the SNR is equivalent to minimizing the noise fraction NF = 1/(SNR+1):

                            MNF:       maximize:   Tr{ (U^T C_xx U) / (U^T C_nn U) }
                                       subject to: U^T C_nn U = I

            This is equivalent to solving the generalized eigenproblem:
                                    C_xx u_i = λ_i C_nn u_i  →  C_xx U = C_nn UD
Minimum Noise Fraction Transformation


        The MNF transformation
             Minimum Noise Fraction is equivalent to solving the generalized eigenproblem:

                                    C_xx u_i = λ_i C_nn u_i  →  C_xx U = C_nn UD

             Since U^T C_nn U = I, the eigenvalues λ_i are the SNR+1 in the projected space
             Need estimates of the signal C_xx = X^T X and noise C_nn ≈ N^T N covariances

        The noise covariance estimation
             Noise estimate: difference between the actual value and a reference ‘clean’ value

                                                   N = X − X_r

             X_r from a neighborhood, assuming the signal is spatially smoother than the noise
             Assume wide-sense stationary processes:
                  Differentiation: n_i ≈ x_i − x_{i−1}
                  Smoothing filtering: n_i ≈ x_i − (1/M) Σ_{k=1}^{M} w_k x_{i−k}
                  Wiener estimates
                  Wavelet-domain estimates
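The MNF pipeline above (differencing noise estimate, then the generalized eigenproblem) can be sketched as follows; this is a toy illustration assuming NumPy/SciPy, not the authors' implementation:

```python
import numpy as np
from scipy.linalg import eigh

def mnf(X, n_components):
    """MNF sketch: estimate noise by differencing (n_i ≈ x_i − x_{i−1}),
    then solve the generalized eigenproblem C_xx u = λ C_nn u."""
    X = X - X.mean(axis=0)
    Nd = X[1:] - X[:-1]                    # crude noise estimate N = X − X_r
    Cxx = X.T @ X / X.shape[0]
    Cnn = Nd.T @ Nd / Nd.shape[0]
    lam, U = eigh(Cxx, Cnn)                # generalized; ascending eigenvalues
    order = np.argsort(lam)[::-1][:n_components]
    return lam[order], U[:, order]         # λ_i = SNR + 1; U^T C_nn U = I

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4)).cumsum(axis=0)   # smooth "signal" plus noise
lam, U = mnf(X, 2)
```

SciPy's generalized `eigh` normalizes the eigenvectors so that U^T C_nn U = I, exactly the MNF constraint.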


Kernel methods for non-linear feature extraction

        Kernel methods

             [Figure: mapping Φ from the input feature space to the kernel feature space]

          1   Map the data to a high-dimensional feature space, H (d_H → ∞)
          2   Solve a linear problem there

        Kernel trick
              No need to know the d_H → ∞ coordinates of each mapped sample φ(x_i)
              Kernel trick: “if an algorithm can be expressed in the form of dot
              products, its non-linear (kernel) version only needs the dot products
              among mapped samples, the so-called kernel function:”

                                             K(x_i, x_j) = ⟨φ(x_i), φ(x_j)⟩

              Using this trick, we can implement K-PCA, K-PLS, K-ICA, etc.

Kernel Principal Component Analysis (KPCA)

        Kernel Principal Component Analysis (KPCA)

            Find projections maximizing the variance of the mapped data [φ(x_1), . . . , φ(x_n)]

                   KPCA:            maximize:        Tr{(ΦU)^T (ΦU)} = Tr{U^T Φ^T ΦU}
                                    subject to:      U^T U = I

            The covariance matrix Φ^T Φ is d_H × d_H (and U is d_H × n_p)!!!

        KPCA through kernel trick

            Apply the representer theorem: U = Φ^T A where A = [α_1 , . . . , α_n ]

                   KPCA:           maximize:        Tr{A^T ΦΦ^T ΦΦ^T A} = Tr{A^T KKA}
                                   subject to:      U^T U = A^T ΦΦ^T A = A^T KA = I

            Including Lagrange multipliers λ, this is equivalent to the eigenproblem

                                       KKα_i = λ_i Kα_i  →  Kα_i = λ_i α_i

            Now matrix A is n × n !!! (eigendecomposition of K)
            Projections are obtained as ΦU = ΦΦ^T A = KA
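The KPCA recipe above reduces to an eigendecomposition of the kernel matrix; a minimal sketch (assuming NumPy; the RBF kernel, its width, and the centering step are illustrative choices not fixed by the slides):

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    """K(x_i, y_j) = exp(-||x_i − y_j||^2 / (2 σ^2))."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kpca(X, n_components, sigma=1.0):
    """KPCA sketch: eigendecompose the centered kernel matrix, K α = λ α;
    projections are ΦU = KA."""
    K = rbf_kernel(X, X, sigma)
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n      # centering in feature space
    Kc = H @ K @ H
    lam, A = np.linalg.eigh(Kc)
    order = np.argsort(lam)[::-1][:n_components]
    return Kc @ A[:, order]                  # projected data KA

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 3))
Z = kpca(X, 2)
```

Note that everything is n × n: the d_H-dimensional covariance never appears, which is the point of the kernel trick.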

Kernel MNF Transformation


        KMNF through kernel trick

            Find projections maximizing the SNR of the mapped data [φ(x_1), . . . , φ(x_n)]
                 Replace X ∈ R^{n×N} with Φ ∈ R^{n×N_H}
                 Replace N ∈ R^{n×N} with Φ_n ∈ R^{n×N_G}

                              C_xx U = C_nn UD  ⇒  Φ^T ΦU = Φ_n^T Φ_n UD

            Not solvable: the matrices Φ^T Φ and Φ_n^T Φ_n are N_H × N_H and N_G × N_G
            Left multiply both sides by Φ, and use the representer theorem, U = Φ^T A:

                        ΦΦ^T ΦΦ^T A = ΦΦ_n^T Φ_n Φ^T AD  →  K_xx K_xx A = K_xn K_xn^T AD

                 Now matrix A is n × n !!! (eigendecomposition of K_xx wrt K_xn)
                 K_xx = ΦΦ^T is symmetric with elements K(x_i, x_j)
                 K_xn = ΦΦ_n^T = K_nx^T is non-symmetric with elements K(x_i, n_j)
            Easy and simple to program!
            Potentially useful when signal and noise are nonlinearly related
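A sketch of solving K_xx K_xx A = K_xn K_xn^T A D (assuming NumPy/SciPy; the small ridge term is an added assumption, there only to keep the noise-side matrix positive definite for the generalized solver):

```python
import numpy as np
from scipy.linalg import eigh

def kmnf(Kxx, Kxn, n_components, reg=1e-6):
    """KMNF sketch: solve K_xx K_xx A = K_xn K_xn^T A D;
    return the projections K_xx A and the eigenvalues."""
    n = Kxx.shape[0]
    S = Kxx @ Kxx                          # signal side (symmetric)
    Nn = Kxn @ Kxn.T + reg * np.eye(n)     # noise side (+ small ridge)
    lam, A = eigh(S, Nn)                   # generalized eigendecomposition
    order = np.argsort(lam)[::-1][:n_components]
    return Kxx @ A[:, order], lam[order]   # projections ΦU = K_xx A

# toy usage with stand-in kernel matrices (symmetric Kxx, non-symmetric Kxn)
rng = np.random.default_rng(3)
B = rng.normal(size=(50, 50))
Kxx = B @ B.T                              # symmetric PSD stand-in
Kxn = rng.normal(size=(50, 50))            # non-symmetric stand-in
Z, lam = kmnf(Kxx, Kxn, 3)
```

As the slide says, it is essentially one generalized eigendecomposition, so it is indeed easy to program.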


SNR in Hilbert spaces


        Implicit KMNF: noise estimate in the input space
             Estimate the noise directly in the input space: N = X − X_r
             Signal-to-noise kernel:

                                           K_xn = ΦΦ_n^T   →   elements K(x_i, n_j)

             with Φ_n = [φ(n_1), . . . , φ(n_n)]
             Kernels K_xx and K_xn deal with objects of a different nature → 2 parameters
             Two different kernel spaces → the eigenvalues no longer have the meaning of SNR

        Explicit KMNF: noise estimate in the feature space
             Estimate the noise explicitly in the Hilbert space: Φ_n = Φ − Φ_r
             Signal-to-noise kernel:

                     K_xn = ΦΦ_n^T = Φ(Φ − Φ_r)^T = ΦΦ^T − ΦΦ_r^T = K_xx − K_xr

             Again it is not symmetric: K(x_i, r_j) ≠ K(r_i, x_j)
             Advantage: same kernel parameter for K_xx and K_xn


SNR in Hilbert spaces

        Explicit KMNF: nearest reference

            Differentiation in feature space: φ_{n_i} ≈ φ(x_i) − φ(x_{i,d})

               (K_xn)_{ij} ≈ ⟨φ(x_i), φ(x_j) − φ(x_{j,d})⟩ = K(x_i, x_j) − K(x_i, x_{j,d})

        Explicit KMNF: averaged reference

            Difference to a local average in feature space (e.g. the 4-connected
            neighboring pixels): φ_{n_i} ≈ φ(x_i) − (1/D) Σ_{d=1}^{D} φ(x_{i,d})

            (K_xn)_{ij} ≈ ⟨φ(x_i), φ(x_j) − (1/D) Σ_{d=1}^{D} φ(x_{j,d})⟩ = K(x_i, x_j) − (1/D) Σ_{d=1}^{D} K(x_i, x_{j,d})



        Explicit KMNF: autoregression reference
             Weight the relevance of each kernel in the summation:

               (K_xn)_{ij} ≈ ⟨φ(x_i), φ(x_j) − Σ_{d=1}^{D} w_d φ(x_{j,d})⟩ = K(x_i, x_j) − Σ_{d=1}^{D} w_d K(x_i, x_{j,d})
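The three reference choices share one form, K_xn = K_xx − Σ_d w_d K(X, X_d); a hedged sketch (assuming NumPy; `kernel` and the reference sets are caller-supplied, and the names are illustrative):

```python
import numpy as np

def explicit_kxn(kernel, X, refs, weights=None):
    """Explicit noise kernel K_xn = K_xx − Σ_d w_d K(X, X_d), where
    refs = [X_1, ..., X_D] are reference (e.g. neighboring-pixel) versions
    of the data. weights=None gives 1/D each (averaged reference); a single
    reference gives the nearest reference; arbitrary w_d the autoregressive one."""
    D = len(refs)
    if weights is None:
        weights = np.full(D, 1.0 / D)
    Kxx = kernel(X, X)
    Kxr = sum(w * kernel(X, Xr) for w, Xr in zip(weights, refs))
    return Kxx - Kxr

# toy usage: a linear kernel and one-sample-shift references
lin = lambda A, B: A @ B.T
rng = np.random.default_rng(4)
X = rng.normal(size=(30, 5))
refs = [np.roll(X, 1, axis=0), np.roll(X, -1, axis=0)]
Kxn = explicit_kxn(lin, X, refs)
```

As the slides note, the resulting K_xn is generally non-symmetric, since K(x_i, x_{j,d}) ≠ K(x_{i,d}, x_j).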

Experimental results


        Data material
            AVIRIS hyperspectral image (220 bands): Indian Pines test site
            145 × 145 pixels, 16 crop-type classes, 10366 labeled pixels
            The 20 noisy bands in the water absorption region are intentionally kept

        Experimental setup
            PCA, MNF, KPCA, and two versions of KMNF (implicit and explicit)
            The 220 bands transformed into a lower dimensional space of 18 features





Visual inspection: extracted features in descending order of relevance

          [Figure: extracted features 1–18, in groups of three, for PCA, MNF, KPCA,
          implicit KMNF, and explicit KMNF]

Analysis of the eigenvalues: signal variance and SNR of transformed data


        Analysis of the eigenvalues:
            Signal variance of the transformed data for PCA
            SNR of the transformed data for MNF and KMNF
                  [Figure: eigenvalue / SNR versus feature number (1–20) for PCA, MNF, and KMNF]

                        The proposed approach provides the highest SNR!


LDA classifier: land-cover classification accuracy




            [Figure: Kappa statistic (κ) versus number of features (2–18) for PCA, MNF,
            KPCA, KMNFi, and KMNF. Left: original hyperspectral image.
            Right: multiplicative random noise (10%)]

                               Best results: linear MNF and the proposed KMNF
                               The proposed KMNF method outperforms MNF when the image is
                               corrupted with non-additive noise




LDA classifier: land-cover classification maps




                      [Figure: land-cover classification maps for LDA-MNF and LDA-KMNF]





Conclusions and open questions



        Conclusions
            Kernel method for nonlinear feature extraction maximizing the SNR
            Good theoretical and practical properties for extracting noise-free features
                 Deals with non-linear relations between the noise and signal
                 The only parameter is the width of the kernel
                 Knowledge about noise can be encoded in the method
            Simple optimization problem → eigendecomposition of the kernel matrix
            Noise estimation in the kernel space with different levels of sophistication
            Simple feature extraction toolbox (SIMFEAT) soon at http://isp.uv.es

        Open questions and Future Work
            Pre-images of transformed data in the input space
            Learn kernel parameters in an automatic way
            Test KMNF in more remote sensing applications: denoising, unmixing, ...




More Related Content

What's hot

Software-defined white-space cognitive systems: implementation of the spectru...
Software-defined white-space cognitive systems: implementation of the spectru...Software-defined white-space cognitive systems: implementation of the spectru...
Software-defined white-space cognitive systems: implementation of the spectru...CSP Scarl
 
Pulse Code Modulation
Pulse Code Modulation Pulse Code Modulation
Pulse Code Modulation ZunAib Ali
 
Crosstalk characterization in gmap arrays
Crosstalk characterization in gmap arraysCrosstalk characterization in gmap arrays
Crosstalk characterization in gmap arraysSaverio Aurite
 
Spatial Fourier transform-based localized sound zone generation with loudspea...
Spatial Fourier transform-based localized sound zone generation with loudspea...Spatial Fourier transform-based localized sound zone generation with loudspea...
Spatial Fourier transform-based localized sound zone generation with loudspea...Takuma_OKAMOTO
 
Ee463 communications 2 - lab 1 - loren schwappach
Ee463   communications 2 - lab 1 - loren schwappachEe463   communications 2 - lab 1 - loren schwappach
Ee463 communications 2 - lab 1 - loren schwappachLoren Schwappach
 
Project_report_BSS
Project_report_BSSProject_report_BSS
Project_report_BSSKamal Bhagat
 
Introduction to compressive sensing
Introduction to compressive sensingIntroduction to compressive sensing
Introduction to compressive sensingAhmed Nasser Agag
 
Lect 11.regenerative repeaters
Lect 11.regenerative repeatersLect 11.regenerative repeaters
Lect 11.regenerative repeatersNebiye Slmn
 
Learning the Statistical Model of the NMF Using the Deep Multiplicative Updat...
Learning the Statistical Model of the NMF Using the Deep Multiplicative Updat...Learning the Statistical Model of the NMF Using the Deep Multiplicative Updat...
Learning the Statistical Model of the NMF Using the Deep Multiplicative Updat...Hiroki_Tanji
 
Smith et al. - Efficient auditory coding (Nature 2006)
Smith et al. - Efficient auditory coding (Nature 2006)Smith et al. - Efficient auditory coding (Nature 2006)
Smith et al. - Efficient auditory coding (Nature 2006)xrampino
 
Chapter7 circuits
Chapter7 circuitsChapter7 circuits
Chapter7 circuitsVin Voro
 
Models of neuronal populations
Models of neuronal populationsModels of neuronal populations
Models of neuronal populationsSSA KPI
 
DNN-based permutation solver for frequency-domain independent component analy...
DNN-based permutation solver for frequency-domain independent component analy...DNN-based permutation solver for frequency-domain independent component analy...
DNN-based permutation solver for frequency-domain independent component analy...Kitamura Laboratory
 
Differential pulse code modulation
Differential pulse code modulationDifferential pulse code modulation
Differential pulse code modulationRamraj Bhadu
 
Vis03 Workshop. DT-MRI Visualization
Vis03 Workshop. DT-MRI VisualizationVis03 Workshop. DT-MRI Visualization
Vis03 Workshop. DT-MRI VisualizationLeonid Zhukov
 
Missing Component Restoration for Masked Speech Signals based on Time-Domain ...
Missing Component Restoration for Masked Speech Signals based on Time-Domain ...Missing Component Restoration for Masked Speech Signals based on Time-Domain ...
Missing Component Restoration for Masked Speech Signals based on Time-Domain ...NU_I_TODALAB
 
DNN-based frequency component prediction for frequency-domain audio source se...
DNN-based frequency component prediction for frequency-domain audio source se...DNN-based frequency component prediction for frequency-domain audio source se...
DNN-based frequency component prediction for frequency-domain audio source se...Kitamura Laboratory
 

What's hot (20)

Software-defined white-space cognitive systems: implementation of the spectru...
Software-defined white-space cognitive systems: implementation of the spectru...Software-defined white-space cognitive systems: implementation of the spectru...
Software-defined white-space cognitive systems: implementation of the spectru...
 
Pulse Code Modulation
Pulse Code Modulation Pulse Code Modulation
Pulse Code Modulation
 
Crosstalk characterization in gmap arrays
Crosstalk characterization in gmap arrays
Spatial Fourier transform-based localized sound zone generation with loudspea...
Ee463 communications 2 - lab 1 - loren schwappach
DSP Lab 1-6.pdf
Project_report_BSS
Introduction to compressive sensing
Lect 11.regenerative repeaters
Learning the Statistical Model of the NMF Using the Deep Multiplicative Updat...
Smith et al. - Efficient auditory coding (Nature 2006)
Chapter7 circuits
Models of neuronal populations
DNN-based permutation solver for frequency-domain independent component analy...
W4 physics 2003
Differential pulse code modulation
Vis03 Workshop. DT-MRI Visualization
Missing Component Restoration for Masked Speech Signals based on Time-Domain ...
DNN-based frequency component prediction for frequency-domain audio source se...
Lecture13

Viewers also liked

72268096 non-linearity-in-structural-dynamics-detection-identification-and-mo... (hamdanraza)
Hilbert huang transform(hht) (Puneet Gupta)
Nonlinear Structural Dynamics: The Fundamentals Tutorial (VanderbiltLASIR)
fauvel_igarss.pdf (grssieee)
Nonlinear component analysis as a kernel eigenvalue problem (Michele Filannino)
Principal component analysis and matrix factorizations for learning (part 2) ... (zukun)
Kernel Entropy Component Analysis in Remote Sensing Data Clustering.pdf (grssieee)
Different kind of distance and Statistical Distance (Khulna University)
KPCA_Survey_Report (Randy Salm)
Principal Component Analysis For Novelty Detection (Jordan McBain)
Analyzing Kernel Security and Approaches for Improving it (Milan Rajpara)
Adaptive anomaly detection with kernel eigenspace splitting and merging (ieeepondy)
Modeling and forecasting age-specific mortality: Lee-Carter method vs. Functi... (hanshang)
A Comparative Study between ICA (Independent Component Analysis) and PCA (Pri... (Sahidul Islam)
Regularized Principal Component Analysis for Spatial Data (Wen-Ting Wang)
Pca and kpca of ecg signal (es712)
Hilbert
Probabilistic PCA, EM, and more (hsharmasshare)
DataEngConf: Feature Extraction: Modern Questions and Challenges at Google (Hakka Labs)
Principal component analysis and matrix factorizations for learning (part 1) ... (zukun)

Similar to Explicit Signal to Noise Ratio in Reproducing Kernel Hilbert Spaces.pdf

Analysis and Compression of Reflectance Data Using An Evolved Spectral Correl... (Peter Morovic)
2015-04 PhD defense (Nil Garcia)
Comparative Analysis of Distortive and Non-Distortive Techniques for PAPR Red... (IDES Editor)
Chebyshev Functional Link Artificial Neural Networks for Denoising of Image C... (IDES Editor)
PIMRC 2016 Presentation (Mohamed Seif)
Csss2010 20100803-kanevski-lecture2 (hasan_elektro)
Bayesian modelling and computation for Raman spectroscopy (Matt Moores)
Hr3114661470
Blind Audio Source Separation (Bass): An Unsuperwised Approach (IJEEE)
PhD_defense_Alla
Speech signal time frequency representation (Nikolay Karpov)
Single Electron Spin Detection Slides For Uno Interview (chenhm)
Lecture 3 sapienza 2017
EBDSS Max Research Report - Final (Max Robertson)
Sparse and Redundant Representations: Theory and Applications
7 227 2005
International Journal of Biometrics and Bioinformatics(IJBB) Volume (1) Issue... (CSCJournals)
Vibration source identification caused by bearing faults based on SVD-EMD-ICA (IJRES Journal)
Receive Antenna Diversity and Subset Selection in MIMO Communication Systems (IDES Editor)
Deep Learning Based Voice Activity Detection and Speech Enhancement (NAVER Engineering)

More from grssieee

Tangent height accuracy of Superconducting Submillimeter-Wave Limb-Emission S...
SEGMENTATION OF POLARIMETRIC SAR DATA WITH A MULTI-TEXTURE PRODUCT MODEL
TWO-POINT STATISTIC OF POLARIMETRIC SAR DATA TWO-POINT STATISTIC OF POLARIMET...
THE SENTINEL-1 MISSION AND ITS APPLICATION CAPABILITIES
GMES SPACE COMPONENT:PROGRAMMATIC STATUS
PROGRESSES OF DEVELOPMENT OF CFOSAT SCATTEROMETER
DEVELOPMENT OF ALGORITHMS AND PRODUCTS FOR SUPPORTING THE ITALIAN HYPERSPECTR...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
EO-1/HYPERION: NEARING TWELVE YEARS OF SUCCESSFUL MISSION SCIENCE OPERATION A...
Test
test 34mb wo animations
Test 70MB
Test 70MB
2011_Fox_Tax_Worksheets.pdf
DLR open house
DLR open house
DLR open house
Tana_IGARSS2011.ppt
Solaro_IGARSS_2011.ppt

Recently uploaded

🐬 The future of MySQL is Postgres 🐘 (RTylerCroy)
2024: Domino Containers - The Next Step. News from the Domino Container commu... (Martijn de Jong)
Injustice - Developers Among Us (SciFiDevCon 2024) (Allon Mureinik)
08448380779 Call Girls In Civil Lines Women Seeking Men (Delhi Call girls)
Top 5 Benefits OF Using Muvi Live Paywall For Live Streams (Roshan Dwivedi)
Unblocking The Main Thread Solving ANRs and Frozen Frames (Sinan KOZAK)
How to Troubleshoot Apps for the Modern Connected Worker (ThousandEyes)
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf (Enterprise Knowledge)
A Domino Admins Adventures (Engage 2024) (Gabriella Davis)
How to convert PDF to text with Nanonets (naman860154)
Axa Assurance Maroc - Insurer Innovation Award 2024 (The Digital Insurer)
Slack Application Development 101 Slides (praypatel2)
Factors to Consider When Choosing Accounts Payable Services Providers.pptx (Katpro Technologies)
CNv6 Instructor Chapter 6 Quality of Service (giselly40)
Histor y of HAM Radio presentation slide (vu2urc)
Scaling API-first – The story of a global engineering organization (Radu Cotescu)
Boost PC performance: How more available memory can improve productivity (Principled Technologies)
Tata AIG General Insurance Company - Insurer Innovation Award 2024 (The Digital Insurer)
Partners Life - Insurer Innovation Award 2024 (The Digital Insurer)
Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R... (Neo4j)
Tata AIG General Insurance Company - Insurer Innovation Award 2024
 
Partners Life - Insurer Innovation Award 2024
Partners Life - Insurer Innovation Award 2024Partners Life - Insurer Innovation Award 2024
Partners Life - Insurer Innovation Award 2024
 
Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...
Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...
Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...
 

Explicit Signal to Noise Ratio in Reproducing Kernel Hilbert Spaces.pdf

  • 1. Explicit Signal to Noise Ratio in Reproducing Kernel Hilbert Spaces
       Luis Gómez-Chova¹, Allan A. Nielsen², Gustavo Camps-Valls¹
       ¹ Image Processing Laboratory (IPL), Universitat de València, Spain.
         luis.gomez-chova@uv.es , http://www.valencia.edu/chovago
       ² DTU Space - National Space Institute, Technical University of Denmark.
       IGARSS 2011 – Vancouver, Canada
  • 2. Outline
       1 Introduction
       2 Signal-to-noise ratio transformation
       3 Kernel Minimum Noise Fraction
       4 Experimental Results
       5 Conclusions and Open questions
       L. Gómez-Chova et al., Explicit Kernel Signal to Noise Ratio, IGARSS 2011 – Vancouver, 1/23
  • 3. Motivation
       Feature Extraction
       - Feature selection/extraction is essential before classification or regression:
         to discard redundant or noisy components, and to reduce the dimensionality of the data
       - Creates a subset of new features by combinations of the existing ones
       Linear Feature Extraction
       - Linear methods offer interpretability ∼ knowledge discovery
         PCA: projections maximizing the data set variance
         PLS: projections maximally aligned with the labels
         ICA: non-orthogonal projections with maximally independent axes
  • 4. Motivation (cont.)
       Drawbacks
       1 Most feature extractors disregard the noise characteristics!
       2 Linear methods fail when data distributions are curved (nonlinear relations)
  • 5. Objectives
       - New nonlinear kernel feature extraction method for remote sensing data
       - Extract features robust to data noise
       Method
       - Based on the Minimum Noise Fraction (MNF) transformation
       - Explicit Kernel MNF (KMNF): the noise is explicitly estimated in the reproducing kernel Hilbert space
       - Deals jointly with nonlinear relations between the noise and the signal features
       - Reduces the number of free parameters in the formulation to one
       Experiments
       - PCA, MNF, KPCA, and two versions of KMNF (implicit and explicit)
       - Feature extractors tested on real hyperspectral image classification
  • 6. Outline: Signal-to-noise ratio transformation
  • 7. Signal and noise
       Signal vs noise
       - Signal: magnitude generated by an inaccessible system, si
       - Noise: magnitude generated by the medium corrupting the signal, ni
       - Observation: signal corrupted by noise, xi
       Notation
       - Observations: xi ∈ Rᴺ, i = 1, …, n
       - Matrix notation: X = [x1, …, xn]ᵀ ∈ Rⁿˣᴺ
       - Centered data sets: assume X has zero mean
       - Empirical covariance matrix: Cxx = (1/n) XᵀX
       - Projection matrix: U (size N × np) → X′ = XU (np extracted features)
  • 8. Principal Component Analysis transformation
       Principal Component Analysis (PCA): find projections of X maximizing the variance of the projected data XU
         maximize: Tr{(XU)ᵀ(XU)} = Tr{UᵀCxxU}   subject to: UᵀU = I
       Including Lagrange multipliers λ, this is equivalent to the eigenproblem
         Cxx ui = λi ui  →  Cxx U = UD
       The ui are the eigenvectors of Cxx and they are orthonormal: uiᵀuj = δij
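As a concrete illustration of the eigenproblem above, here is a minimal NumPy sketch of PCA (an illustration only, not the authors' code; the random test data is arbitrary):

```python
import numpy as np

def pca(X, n_components):
    """PCA via eigendecomposition of Cxx = (1/n) X^T X; assumes X is centered."""
    n = X.shape[0]
    Cxx = X.T @ X / n
    eigvals, U = np.linalg.eigh(Cxx)       # eigh returns ascending eigenvalues
    order = np.argsort(eigvals)[::-1]      # reorder by descending variance
    eigvals, U = eigvals[order], U[:, order]
    return X @ U[:, :n_components], eigvals[:n_components]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X -= X.mean(axis=0)                        # center the data, as the slides assume
Z, ev = pca(X, 2)                          # projected features and their variances
```

The variance of each extracted feature equals the corresponding eigenvalue, which is what the slide's trace objective maximizes.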
  • 9. PCA limitations
       1 Axes rotation to the directions of maximum variance of the data
       2 It does not consider the noise characteristics:
         - assumes the noise variance is low → relegated to the last eigenvectors with low eigenvalues
         - maximum variance directions may be affected by noise
  • 10. The SNR transformation
        Find projections maximizing the ratio between signal and noise variances:
          maximize: Tr{(UᵀCssU)(UᵀCnnU)⁻¹}   subject to: UᵀCnnU = I
        The signal and noise covariance matrices Css and Cnn are unknown
  • 11. The MNF transformation
        Assuming additive noise, X = S + N, with signal orthogonal to noise (SᵀN = NᵀS = 0),
        maximizing the SNR is equivalent to minimizing the noise fraction NF = 1/(SNR+1):
          maximize: Tr{(UᵀCxxU)(UᵀCnnU)⁻¹}   subject to: UᵀCnnU = I
        This is equivalent to solving the generalized eigenproblem:
          Cxx ui = λi Cnn ui  →  Cxx U = Cnn UD
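The generalized eigenproblem above can be sketched in NumPy (a minimal illustration, not the authors' implementation; the Cholesky-whitening helper, the small ridge on Cnn, and the synthetic signal/noise data are assumptions added here):

```python
import numpy as np

def gen_eigh(A, B):
    """Generalized symmetric eigenproblem A u = lam B u via Cholesky whitening.
    Returned eigenvectors satisfy U^T B U = I, as the MNF constraint requires."""
    Linv = np.linalg.inv(np.linalg.cholesky(B))
    lam, V = np.linalg.eigh(Linv @ A @ Linv.T)
    return lam, Linv.T @ V

def mnf(X, N_est, n_components):
    """MNF: Cxx u = lam Cnn u; the eigenvalues equal SNR + 1 in the projected space."""
    n = X.shape[0]
    Cxx = X.T @ X / n
    Cnn = N_est.T @ N_est / n + 1e-8 * np.eye(X.shape[1])  # ridge for stability (assumption)
    lam, U = gen_eigh(Cxx, Cnn)
    lam, U = lam[::-1], U[:, ::-1]         # descending SNR
    return X @ U[:, :n_components], lam[:n_components]

rng = np.random.default_rng(1)
S = rng.normal(size=(500, 1)) @ rng.normal(size=(1, 4))   # low-rank signal
N = 0.1 * rng.normal(size=(500, 4))                       # weak i.i.d. noise
X = S + N
X -= X.mean(axis=0)
Z, lam = mnf(X, N, 2)                     # first component should capture the signal
```

Because the eigenvectors are Cnn-orthonormal, sorting by eigenvalue sorts the extracted features by SNR rather than by raw variance, which is the key difference from PCA.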
  • 12. The MNF transformation (cont.)
        - Since UᵀCnnU = I, the eigenvalues λi are SNR + 1 in the projected space
        - Requires estimates of the signal covariance Cxx = XᵀX and the noise covariance Cnn ≈ NᵀN
  • 13. The noise covariance estimation
        Noise estimate: difference between the actual value and a reference ‘clean’ value,
        N = X − Xr, with Xr taken from the neighborhood assuming a spatially smoother
        signal than the noise. Assuming processes stationary in the wide sense:
        - Differentiation: ni ≈ xi − xi−1
        - Smoothing filtering: ni ≈ xi − (1/M) Σ_{k=1}^{M} wk xi−k
        - Wiener estimates
        - Wavelet domain estimates
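The differentiation estimate can be sketched as follows (a minimal sketch; the 1/√2 normalization, the edge padding, and the synthetic smooth signal are assumptions added here, not from the slides):

```python
import numpy as np

def noise_diff(X):
    """Noise estimate by differencing neighbouring samples: n_i ~ (x_i - x_{i-1}) / sqrt(2).

    Assumes the signal varies more smoothly than the noise between neighbours;
    the sqrt(2) restores the original noise variance, since the difference of two
    independent noise samples doubles it.
    """
    N = np.empty_like(X)
    N[1:] = (X[1:] - X[:-1]) / np.sqrt(2.0)
    N[0] = N[1]                           # replicate-pad the first sample (assumption)
    return N

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 2000)[:, None]
signal = np.sin(2 * np.pi * t)            # slowly varying signal
noise = 0.5 * rng.normal(size=(2000, 1))
N_est = noise_diff(signal + noise)        # recovers the noise scale, not the signal
```

On smooth signals the differencing almost cancels the signal term, so the estimate's standard deviation tracks the true noise level.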
  • 14. Outline: Kernel Minimum Noise Fraction
  • 15. Kernel methods for nonlinear feature extraction
        Kernel methods
        1 Map the data from the input space to a high-dimensional feature space H (dH → ∞)
        2 Solve a linear problem there
        Kernel trick
        - No need to know the dH → ∞ coordinates of each mapped sample φ(xi)
        - “If an algorithm can be expressed in the form of dot products, its nonlinear (kernel)
          version only needs the dot products among mapped samples, the so-called kernel function”:
          K(xi, xj) = ⟨φ(xi), φ(xj)⟩
        - Using this trick, we can implement K-PCA, K-PLS, K-ICA, etc.
  • 16. Kernel Principal Component Analysis (KPCA)
        Find projections maximizing the variance of the mapped data Φ = [φ(x1), …, φ(xn)]ᵀ:
          maximize: Tr{(ΦU)ᵀ(ΦU)} = Tr{UᵀΦᵀΦU}   subject to: UᵀU = I
        The covariance matrix ΦᵀΦ (dH × dH) and the projection matrix U live in the feature space!
        KPCA through the kernel trick
        - Apply the representer theorem, U = ΦᵀA with A = [α1, …, αn]:
          maximize: Tr{AᵀΦΦᵀΦΦᵀA} = Tr{AᵀKKA}   subject to: UᵀU = AᵀΦΦᵀA = AᵀKA = I
        - Including Lagrange multipliers λ, this is equivalent to the eigenproblem
          KKαi = λi Kαi  →  Kαi = λi αi
        - Now the matrix A is n × n (eigendecomposition of K)
        - Projections are obtained as ΦU = ΦΦᵀA = KA
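A minimal NumPy sketch of the KPCA eigenproblem above (an illustration, not the authors' code; the RBF kernel choice, the feature-space centering matrix H, and the 1/√λ scaling of the α vectors are standard additions assumed here):

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    """RBF kernel matrix K_ij = exp(-||a_i - b_j||^2 / (2 sigma^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def kpca(X, n_components, sigma=1.0):
    """KPCA: solve K alpha = lam alpha on the centered kernel; projections are K A."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n          # centers the mapped data in feature space
    K = H @ rbf(X, X, sigma) @ H
    lam, A = np.linalg.eigh(K)                   # ascending eigenvalues
    lam, A = lam[::-1], A[:, ::-1]
    A = A[:, :n_components] / np.sqrt(lam[:n_components])  # scale so features decorrelate
    return K @ A                                 # projections Phi U = K A

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 3))
Z = kpca(X, 2)
```

Note the n × n eigenproblem replaces the intractable dH × dH one, exactly as the representer-theorem step on the slide promises.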
  • 17. Kernel MNF transformation
        KMNF through the kernel trick: find projections maximizing the SNR of the mapped data
        - Replace X ∈ Rⁿˣᴺ with Φ ∈ Rⁿˣᵈᴴ and N ∈ Rⁿˣᴺ with Φn ∈ Rⁿˣᵈᴳ:
          Cxx U = Cnn UD  ⇒  ΦᵀΦU = ΦnᵀΦnUD
        - Not directly solvable: the matrices ΦᵀΦ and ΦnᵀΦn are dH × dH and dG × dG
        - Left-multiply both sides by Φ and use the representer theorem, U = ΦᵀA:
          ΦΦᵀΦΦᵀA = ΦΦnᵀΦnΦᵀAD  →  KxxKxxA = KxnKxnᵀAD
        - Now the matrix A is n × n (eigendecomposition of Kxx with respect to Kxn)
        - Kxx = ΦΦᵀ is symmetric, with elements K(xi, xj)
        - Kxn = ΦΦnᵀ ≠ Knx is non-symmetric, with elements K(xi, nj)
        - Easy and simple to program!
        - Potentially useful when signal and noise are nonlinearly related
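The generalized eigenproblem KxxKxxA = KxnKxnᵀAD can be sketched as follows (a minimal illustration under stated assumptions: the Cholesky-whitening helper, the ridge that keeps the noise-side matrix positive definite, and the random matrices standing in for real kernel matrices are all additions, not the authors' code):

```python
import numpy as np

def gen_eigh(A, B):
    """Generalized symmetric eigenproblem A u = lam B u via Cholesky whitening."""
    Linv = np.linalg.inv(np.linalg.cholesky(B))
    lam, V = np.linalg.eigh(Linv @ A @ Linv.T)
    return lam, Linv.T @ V

def kmnf(Kxx, Kxn, n_components, ridge=1e-6):
    """KMNF: Kxx Kxx A = Kxn Kxn^T A D.

    Kxx is symmetric but Kxn in general is not, so the noise side uses the
    symmetric PSD matrix Kxn Kxn^T; the ridge (an assumption) makes it
    positive definite so the Cholesky factorization exists.
    """
    n = Kxx.shape[0]
    lam, A = gen_eigh(Kxx @ Kxx, Kxn @ Kxn.T + ridge * np.eye(n))
    lam, A = lam[::-1], A[:, ::-1]         # descending SNR
    return Kxx @ A[:, :n_components], lam[:n_components]

rng = np.random.default_rng(4)
M = rng.normal(size=(50, 50))
Kxx = M @ M.T / 50                         # symmetric PSD stand-in for a kernel matrix
Kxn = Kxx + 0.01 * rng.normal(size=(50, 50))   # non-symmetric stand-in noise kernel
Z, lam = kmnf(Kxx, Kxn, 3)
```

As on the slide, everything reduces to an n × n eigendecomposition, so the method is as easy to program as KPCA.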
  • 18. SNR in Hilbert spaces
        Implicit KMNF: noise estimated in the input space
        - Estimate the noise directly in the input space: N = X − Xr
        - Signal-to-noise kernel: Kxn = ΦΦnᵀ → K(xi, nj), with Φn = [φ(n1), …, φ(nn)]ᵀ
        - Kernels Kxx and Kxn deal with objects of a different nature → 2 kernel parameters
        - Two different kernel spaces → the eigenvalues no longer have the meaning of SNR
        Explicit KMNF: noise estimated in the feature space
        - Estimate the noise explicitly in the Hilbert space: Φn = Φ − Φr
        - Signal-to-noise kernel: Kxn = ΦΦnᵀ = Φ(Φ − Φr)ᵀ = ΦΦᵀ − ΦΦrᵀ = Kxx − Kxr
        - Again it is not symmetric: K(xi, rj) ≠ K(ri, xj)
        - Advantage: the same kernel parameter for Kxx and Kxn
  • 19. SNR in Hilbert spaces (cont.)
        Explicit KMNF: nearest reference
        - Differentiation in feature space: φni ≈ φ(xi) − φ(xi,d)
          (Kxn)ij ≈ ⟨φ(xi), φ(xj) − φ(xj,d)⟩ = K(xi, xj) − K(xi, xj,d)
        Explicit KMNF: averaged reference
        - Difference to a local average in feature space (e.g. the 4-connected neighboring pixels):
          φni ≈ φ(xi) − (1/D) Σ_{d=1}^{D} φ(xi,d)
          (Kxn)ij ≈ K(xi, xj) − (1/D) Σ_{d=1}^{D} K(xi, xj,d)
        Explicit KMNF: autoregression reference
        - Weight the relevance of each kernel in the summation:
          (Kxn)ij ≈ K(xi, xj) − Σ_{d=1}^{D} wd K(xi, xj,d)
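The explicit signal-to-noise kernel Kxn = Kxx − Kxr with a nearest reference can be sketched as follows (an illustration with assumptions: the RBF kernel and the `np.roll` neighbour reference, which presumes spatially ordered samples, are choices made here):

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    """RBF kernel matrix K_ij = exp(-||a_i - b_j||^2 / (2 sigma^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def explicit_snr_kernels(X, Xr, sigma=1.0):
    """Noise estimated directly in feature space: Phi_n = Phi - Phi_r,
    so Kxn = Kxx - Kxr. A single kernel width sigma serves both matrices."""
    Kxx = rbf(X, X, sigma)
    Kxn = Kxx - rbf(X, Xr, sigma)          # non-symmetric in general
    return Kxx, Kxn

rng = np.random.default_rng(5)
X = rng.normal(size=(60, 4))
Xr = np.roll(X, 1, axis=0)                 # nearest-neighbour reference (assumption)
Kxx, Kxn = explicit_snr_kernels(X, Xr)
```

This is the one-parameter advantage claimed on slide 18: both kernel matrices come from the same kernel width, so only sigma needs tuning.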
  • 20. Outline: Experimental Results
  • 21. Experimental results
        Data material
        - AVIRIS hyperspectral image (220 bands): Indian Pines test site
        - 145 × 145 pixels, 16 crop-type classes, 10366 labeled pixels
        - The 20 noisy bands in the water absorption region are intentionally kept
        Experimental setup
        - PCA, MNF, KPCA, and two versions of KMNF (implicit and explicit)
        - The 220 bands are transformed into a lower-dimensional space of 18 features
  • 22. Visual inspection: extracted features shown in descending order of relevance
        (features 1–3, 4–6, 7–9, 10–12, 13–15, and 16–18 for PCA, MNF, KPCA,
        implicit KMNF, and explicit KMNF)
  • 23. Analysis of the eigenvalues: signal variance of the transformed data for PCA,
        and SNR of the transformed data for MNF and KMNF
        (plot: eigenvalue/SNR versus feature number for the first 20 features)
        The proposed approach provides the highest SNR!
  • 24. LDA classifier: land-cover classification accuracy
        (plots: kappa statistic κ versus number of features, 2–18, for PCA, MNF, KPCA,
        KMNFi, and KMNF; left: original hyperspectral image, right: multiplicative
        random noise (10%))
        - Best results: linear MNF and the proposed KMNF
        - The proposed KMNF method outperforms MNF when the image is corrupted with non-additive noise
  • 25. LDA classifier: land-cover classification maps (LDA-MNF vs. LDA-KMNF)
  • 26. Outline: Conclusions and Open questions
  • 27. Conclusions and open questions
        Conclusions
        - Kernel method for nonlinear feature extraction maximizing the SNR
        - Good theoretical and practical properties for extracting noise-free features
        - Deals with nonlinear relations between the noise and the signal
        - The only free parameter is the width of the kernel
        - Knowledge about the noise can be encoded in the method
        - Simple optimization problem → eigendecomposition of the kernel matrix
        - Noise estimation in the kernel space with different levels of sophistication
        - Simple feature extraction toolbox (SIMFEAT) soon at http://isp.uv.es
        Open questions and future work
        - Pre-images of the transformed data in the input space
        - Learn the kernel parameters in an automatic way
        - Test KMNF in more remote sensing applications: denoising, unmixing, ...
  • 28. Closing slide: title, authors, affiliations, and contact details as on slide 1.