Kernel Methods
   (Support Vector Machines)
               for
Environmental and Geo-Sciences

            Alexei Pozdnoukhov

                  Lecturer

    National Centre for Geocomputation
   National University of Ireland, Maynooth

            +353 (0)1 7086146
       Alexei.Pozdnoukhov@nuim.ie
Machine Learning
Learning From Data
• Environmental monitoring
  Current rate of data acquisition is about 0.5 Tb/day (increasing at 82% per year)

• Remote Sensing Data
  NASA holds more than 10 Pb of data, increasing by 10x every 5 years.
  ESA data stream is about 0.5 Tb/year, likely to increase by 20x in the next 5 years.

• GIS, DEM
• Sensor Networks
• Field Measurements
Clustering

[Figure: two example clusters, Cluster 1 and Cluster 2]
Dimensionality Reduction
Classification

[Figures: binary and multi-class classification]
Regression

[Figure: regression of output y against input x]
Curse of Dimensionality
Sensor Network

Need more data?

• Batteries recharged at WSN
• Human activity
• Remote sensing
• Wireless sensor network
• Geographical information
Detecting Events
Observed environment: high-dimensional input space
Events: very rare, extreme

• High-dimensional spaces: risk of overfitting
• Robust to noise in both inputs and outputs
• Non-linear and non-parametric
• Computationally efficient for real-time processing and LBS dissemination
Curse of Dimensionality
Statistical Learning Theory




   • Models that can generalise from data
   • Good predictive abilities
   • Complexity can be controlled
Statistical Learning Theory

• Occam’s Razor Principle (14th century)

    One should not increase, beyond what is necessary,
    the number of entities required to explain anything

• When many solutions are available for a given problem, we
  should select the simplest one.

• But what do we mean by simple?

• We will use prior knowledge of the problem to be solved to define what a
  simple solution is (example of a prior: smoothness).
Occam’s Razor and Classification




                 Model 1   Model 2   Model 3
Complexity        √√         √        ××
Training error    ××         √        √√
Overall            -         √         -
Structural Risk Minimization


       • Define a set of learning functions, {S}
       • Order it in terms of complexity, {S1, …, SN}
       • Select the optimal S*

                F = {f(x,α), α∈Λ}
Classification


Support Vector Machine (SVM)
Separating Hyperplane



                             x - input patterns
                             w - weight vector

                             b - threshold




     f_{w,b}(x) = sign(w ⋅ x + b)
How powerful are linear decision functions?
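To make the notation concrete, here is a minimal numerical sketch of this decision rule (numpy assumed; the weight vector, threshold and input below are hypothetical values, not from the slides):

```python
# Minimal sketch of the linear decision function f_{w,b}(x) = sign(w . x + b).
# The weight vector, threshold and input pattern are hypothetical values.
import numpy as np

w = np.array([0.8, -0.5])    # weight vector (normal to the separating hyperplane)
b = 0.1                      # threshold
x = np.array([1.0, 2.0])     # an input pattern

label = np.sign(w @ x + b)   # +1 on one side of the hyperplane, -1 on the other
print(label)                 # -> -1.0, since 0.8 - 1.0 + 0.1 = -0.1 < 0
```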
VC-dimension in classification
Shattering

A set of samples is shattered by a class of decision functions if it can be
discriminated correctly for every possible assignment of class memberships.
The VC-dimension h is the largest number of samples that can be shattered.

[Figure: 3 points in the plane can be shattered by a line; 4 points cannot.]

The VC-dimension h of linear decision functions in R^N equals N + 1.

That is, the power of linear decision functions is beyond our control…?
Support Vector Machine


Decision function is a margin hyperplane (*):

    f(x, {w, b}) =  +1,  if (w ⋅ x) − b ≥ 1
                    −1,  if (w ⋅ x) − b ≤ −1

Intuition: a large margin is good.

Lemma: Given that the N-dimensional data {x1, x2, …, xL} lie inside a finite
enclosing sphere of radius R, the VC-dimension h of the margin-based decision
functions (*) satisfies the inequality

    h ≤ min( R²‖w‖², N ) + 1

The complexity (VC-dimension) can be controlled with ‖w‖²!
Separating Hyperplane: Max Margin




The margin between the two hyperplanes (w ⋅ x) − b = ±1 is ρ = 2/‖w‖, so to
maximize the margin ρ one minimizes ‖w‖, or equivalently ‖w‖².

    f(x, {w, b}) =  +1,  if (w ⋅ x) − b ≥ 1          f(x) = sign((w ⋅ x) + b)
                    −1,  if (w ⋅ x) − b ≤ −1
Optimization Problem, Lagrangian



Primal problem:

    min  (1/2) ‖w‖²
    s.t. yi (w ⋅ xi + b) ≥ 1,   i = 1, …, L

Lagrangian:

    Lp = (1/2) ‖w‖² − Σi αi [ yi (w ⋅ xi + b) − 1 ]

Setting its derivatives with respect to w and b to zero gives

    Σi αi yi = 0,        w = Σi αi yi xi

KKT conditions:   αi [ yi (w ⋅ xi + b) − 1 ] = 0,  ∀i

    αi > 0  →  support vectors
    αi = 0  →  all other samples
Optimization Problem: Dual Variables

    LD = Σi αi − (1/2) Σi,j αi αj yi yj (xi ⋅ xj)

    subject to   Σi αi yi = 0,    αi ≥ 0,  i = 1, …, L

    f(x) = sign(w ⋅ x + b) = sign( Σi αi yi (x ⋅ xi) + b )

• Inputs enter only through dot products
• Quadratic Programming (QP)
• Convex problem with a well-developed theory
• Unique solution, good solvers available
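Because the dual is a small convex QP, it can be illustrated end-to-end with a generic solver. A minimal sketch on toy, linearly separable data (scipy assumed; real SVM implementations use dedicated QP/SMO solvers rather than SLSQP):

```python
# Sketch: solve the hard-margin SVM dual  max_a  sum(a) - 0.5 a^T G a,
# with G_ij = y_i y_j (x_i . x_j), subject to sum(a_i y_i) = 0 and a_i >= 0.
import numpy as np
from scipy.optimize import minimize

X = np.array([[1.0, 1.0], [2.0, 2.5], [0.5, 2.0],         # toy class +1
              [-1.0, -1.0], [-2.0, -1.5], [-0.5, -2.0]])  # toy class -1
y = np.array([1, 1, 1, -1, -1, -1], dtype=float)

G = (y[:, None] * X) @ (y[:, None] * X).T                # G_ij = y_i y_j (x_i . x_j)
neg_dual = lambda a: 0.5 * a @ G @ a - a.sum()           # minimize -L_D(alpha)

res = minimize(neg_dual, np.zeros(len(y)), method='SLSQP',
               bounds=[(0, None)] * len(y),                         # alpha_i >= 0
               constraints={'type': 'eq', 'fun': lambda a: a @ y})  # sum a_i y_i = 0

alpha = res.x
w = ((alpha * y)[:, None] * X).sum(axis=0)               # w = sum_i alpha_i y_i x_i
sv = alpha > 1e-6                                        # support vectors: alpha_i > 0
b = np.mean(y[sv] - X[sv] @ w)                           # KKT: y_i (w . x_i + b) = 1
print("support vectors:", np.where(sv)[0], " w =", w, " b =", b)
```

Only the points with αi > 0 enter the recovered w and b, which is exactly the sparsity the slide's KKT conditions describe.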
Soft margin hyperplane: allowing for training errors

Primal problem with slack variables ξi:

    min  (1/2) ‖w‖² + C Σi ξi
    s.t. yi (w ⋅ xi + b) ≥ 1 − ξi,   ξi ≥ 0,   i = 1, …, L

Dual problem:

    LD = Σi αi − (1/2) Σi,j αi αj yi yj (xi ⋅ xj)

    subject to   Σi αi yi = 0,    0 ≤ αi ≤ C,  i = 1, …, L

C is the regularization parameter: the trade-off between margin maximization
and training error.
Support Vector Terminology

    αi = 0          normal samples
    0 < αi < C      support vectors (on the margin)
    αi = C          support vectors: untypical or noisy samples

    f(x) = sign( Σi αi yi (x ⋅ xi) + b )

C is the regularization parameter: the trade-off between margin maximization
and training error.
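These three groups can be read directly from a fitted model. A hedged sketch using scikit-learn (an assumption, not the software used in the slides): `dual_coef_` stores yiαi for the support vectors and `support_` gives their indices.

```python
# Sketch: split training samples into normal samples, margin SVs (0 < alpha < C)
# and bound SVs (alpha = C) after fitting a soft-margin SVM. Toy data below.
import numpy as np
from sklearn.svm import SVC

X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [1.2, 0.9], [0.6, 0.5]])
y = np.array([-1, -1, 1, 1, -1])                 # hypothetical labels

C = 1.0
clf = SVC(kernel='linear', C=C).fit(X, y)

alpha = np.abs(clf.dual_coef_).ravel()           # alpha_i of the support vectors
sv = clf.support_                                # indices of samples with alpha_i > 0
bound_sv = sv[np.isclose(alpha, C)]              # alpha_i = C: untypical/noisy samples
margin_sv = sv[alpha < C - 1e-8]                 # 0 < alpha_i < C: on the margin
# every training index not in `sv` has alpha_i = 0 (a "normal" sample)
```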
Support Vector Algorithm
                                             Kernel Trick
If the data are not linearly separable, they can be projected into a
(sufficiently) high-dimensional space, where they are much easier to separate.

Example:   K(x, x′) = (x ⋅ x′)²

    x = (x1, x2)  →  Φ(x) = ( x1², √2 x1x2, x2² )

x → Φ(x)?   The algorithm was formulated in terms of dot products!

    x ⋅ x′ → Φ(x) ⋅ Φ(x′)   ⇔   x ⋅ x′ → K(x, x′)

• K is symmetric
• K is positive-definite
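The slide's example can be checked numerically: the explicit degree-2 feature map reproduces the kernel value computed in input space (numpy assumed; the two input points are arbitrary):

```python
# Check: phi(x) . phi(x') == (x . x')^2 for phi(x) = (x1^2, sqrt(2) x1 x2, x2^2).
import numpy as np

def phi(x):
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

x, xp = np.array([1.0, 2.0]), np.array([3.0, -1.0])   # arbitrary 2-D inputs
assert np.isclose(phi(x) @ phi(xp), (x @ xp) ** 2)    # both equal 1.0 here
```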
Nonlinear SVM. Kernel trick.


    f(x) = w ⋅ x + b    →    f(x) = Σi yi αi K(x, xi) + b

Any linear algorithm formulated in terms of dot products of the input data
can be turned into a non-linear one using the kernel trick:

    • Support Vector Machine
    • Kernel Ridge Regression
    • Kernel Principal Component Analysis
    • Kernel Fisher Discriminant Analysis
    • etc.
Nonlinear SVM. Kernel types.


• Polynomial kernel:               K(x, y) = (x ⋅ y + 1)^p

• Radial Basis Function kernel:    K(x, y) = exp( −‖x − y‖² / (2σ²) )

    f(x) = sign( Σ_{i∈SV} yi αi K(x, xi) + b )
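Written out directly, the two kernels are one-liners (numpy assumed; p and σ are the user-chosen hyper-parameters):

```python
import numpy as np

def poly_kernel(x, y, p=3):
    """Polynomial kernel K(x, y) = (x . y + 1)^p."""
    return (x @ y + 1.0) ** p

def rbf_kernel(x, y, sigma=1.0):
    """Gaussian RBF kernel K(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))
```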
Nonlinear SVM. Optimization problem.

    LD = Σi αi − (1/2) Σi,j αi αj yi yj K(xi, xj)

    subject to   Σi αi yi = 0,    0 ≤ αi ≤ C,  i = 1, …, L

    b = yj − Σi yi αi K(xi, xj)   for any support vector xj with 0 < αj < C

    f(x) = sign( Σ_{i∈SV} yi αi K(x, xi) + b )

K is positive-definite, so this is still a QP problem, hence a unique solution!
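Once the αi and b are known, prediction only touches the support vectors. A minimal sketch (the arrays of support vectors, their labels, αi and b are assumed to come from a trained model; `kernel` can be any function such as the rbf_kernel above):

```python
import numpy as np

def svm_predict(x, sv_x, sv_y, sv_alpha, b, kernel):
    """Evaluate f(x) = sign( sum_{i in SV} y_i alpha_i K(x, x_i) + b )."""
    s = sum(a * yi * kernel(x, xi) for a, yi, xi in zip(sv_alpha, sv_y, sv_x))
    return np.sign(s + b)
```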
Support Vector Machine




    http://www.geokernels.org/teaching/svm
SVM: Software.
Examples
SV Porosity Mapping

                                                                Data description

                                                          200 training samples
                                                          “+” 94 validation samples
                                                          minimum = 0.0
                                                          median = 0.515
                                                          max = 1.000
                                                          mean = 0.53
                                                          variance = 0.048




The original continuous data were transformed into 2-class data according to the
0.5 threshold:
                          If fpor ≥ 0.5, then y = +1
                          If fpor < 0.5, then y = -1
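The 0.5 thresholding above is a one-line transformation (numpy assumed; the porosity values are hypothetical):

```python
import numpy as np

f_por = np.array([0.12, 0.53, 0.50, 0.47])   # hypothetical continuous porosity values
y = np.where(f_por >= 0.5, 1, -1)            # -> [-1, 1, 1, -1]
```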
SV Porosity Mapping

                        Data: 2-class transformation




•   class “+1”, ≥ 0.5
o class “-1”,   < 0.5

+ validation data
SV Porosity Mapping

                                                    Data loading


150 training samples
                 50 testing samples




       Prediction Grid
SV Porosity Mapping
Hyper-parameters tuning

• Gaussian RBF kernel is selected:   K(x, x′) = exp( −‖x − x′‖² / (2σ²) )
• Two hyper-parameters: C and σ.
• Grid search: testing-error analysis for every pair of parameters.

The range of σ:
    min(σ) – minimum distance between data samples
    max(σ) – maximum distance between data samples

The range of log(C):
    min(C) – some small value, 1 or less
    max(C) – depends on the data, typically 1e3–1e6

Start the calculation using the testing data and save the results to file.
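The slides run this grid search in the course software (geokernels.org). A comparable sketch with scikit-learn (an assumption, not the original tool), where the RBF width σ maps to gamma = 1/(2σ²); the data here are random stand-ins for the 150 training samples:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X_train = rng.uniform(size=(150, 2))                  # stand-in for the 150 samples
y_train = np.where(X_train.sum(axis=1) > 1.0, 1, -1)  # hypothetical labels

sigmas = np.linspace(0.02, 0.5, 10)                   # from ~min to ~max sample distance
param_grid = {'C': np.logspace(0, 4, 9),              # min(C) ~ 1, max(C) up to 1e3-1e6
              'gamma': 1.0 / (2.0 * sigmas ** 2)}     # gamma = 1 / (2 sigma^2)

search = GridSearchCV(SVC(kernel='rbf'), param_grid, cv=5)
search.fit(X_train, y_train)
print(search.best_params_)                            # error surface: search.cv_results_
```

Note that GridSearchCV cross-validates rather than using the single 150/50 split from the slides; a fixed split can be reproduced with scikit-learn's PredefinedSplit.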
SV Porosity Mapping
Hyper-parameters tuning

Training error surface (axes: log(C) vs. Gaussian RBF kernel bandwidth)

• Training error increases with the kernel bandwidth
• Training error decreases with C
SV Porosity Mapping
Hyper-parameters tuning

Testing error surface (axes: log(C) vs. Gaussian RBF kernel bandwidth)

The surface has a complex structure but, generally, if the range is selected
reasonably and the data splitting is correct, there exists a region of minima
– the optimal values.
SV Porosity Mapping
Hyper-parameters tuning

Normalized number of Support Vectors (axes: log(C) vs. Gaussian RBF kernel bandwidth)

This reflects the complexity of the model: a more complex model has more
support vectors.
What are the parameters for the final model?

Hyper-parameters selection

[Surfaces: testing error, training error, normalized NSV]

C = 3, σ = 0.09
What are the parameters for the final model?

Hyper-parameters selection

[Surfaces: testing error, training error, normalized NSV]

C = 18, σ = 0.13
SV Porosity Mapping
Dependence on Parameters

[Predicted maps for fixed C = 10 and σ = 0.02, 0.06, 0.1, 0.2, 0.3, 0.4, 0.5]
SV Porosity Mapping
Dependence on Parameters

[Predicted maps for fixed σ = 0.1 and C = 100, 10, 1, 0.1]
SV Porosity Mapping
Predictive Mapping and Support Vectors

[Map: predictive mapping + margin + normal SVs (0 < α < C) + critical SVs (α = C)]
Applications for Natural Hazards

    • Topo-climatic mapping
    • Landslides
    • Snow avalanches prediction
Weather observations

                             •   110 meteo stations
                             •   Measurements, up to every
                                 10min
                             •   Altitude: 270m-3580m

                             •   Temperature
                             •   Precipitation
                             •   Humidity
                             •   Air Pressure
                             •   Wind Speed
                             •   Insolation
                             •   Etc.




Spatio-temporal prediction mapping?
Temperature Inversion




Temperature inversion can only be explained using terrain surface characteristics (convexity, slope, etc.)
Physical Models at local scales




 •   Terrain roughness is too high for physical models: computational speed,
     precision and uncertainty estimation all become problematic.

     PDE on smoothed terrain + empirical correction:
         vModel(x, y) = vPhysical + cRidges + cCanyons + cValleys + cFlatAreas + cSea + …

Can this information be extracted directly from data?
Modelling Scheme

Data (DEM, …) → Feature Selection/Extraction → Predictive Modelling with
Machine Learning → Spatio-Temporal Mapping, Analysis, Decision Support

Challenges: non-linear dependencies; noise and outliers.
Temperature vs. Elevation

• Mean monthly: linear
• Mean daily: locally linear, regionalized
• Mean hourly: non-linear, regionalized
• Mean hourly: explained temperature inversion
DEM Features

• Large-scale difference of Gaussians
• Short-scale difference of Gaussians
• Slope
• Local variance
Temperature Inversion Mapping

[Maps: probability of inversion; temperature]
Visual Validation
Operational setting




http://www.geokernels.org/services/meteo
Applications

• Topo-climatic mapping
• Landslides
• Snow avalanches prediction
• Remote Sensing
Landslide inventory




                      SFI (SRC-ID 07/SRC/I1168)
Method I

Probability density estimation in the space of landslide factors (Factor 1 vs. Factor 2)

                      SFI (SRC-ID 07/SRC/I1168)
Model vs. Training Data




                          SFI (SRC-ID 07/SRC/I1168)
What is wrong with this susceptibility map?




                                              SFI (SRC-ID 07/SRC/I1168)
Method II

Classification in the space of landslide factors (Factor 1 vs. Factor 2):
stable vs. unstable areas

                      SFI (SRC-ID 07/SRC/I1168)
Predictive models




                    SFI (SRC-ID 07/SRC/I1168)
A model should fit the observed landslides, and …




                                                    SFI (SRC-ID 07/SRC/I1168)
Applications

• Topo-climatic mapping
• Landslides
• Snow avalanches prediction
• Remote Sensing
Lochaber, Scotland
• 1842 days of recorded weather conditions (11 features), 1991–2007
• 1135 days with documented avalanche events
• 797 safe days, 245 with avalanches
• 260 days unknown (mainly bad weather)
Spatial Data
                                  Training data: 722 events,
                                    winters 1991-2005




                                Validation data: 72 events,
                                  winters 2006-2007

• 47 avalanche paths, x, y, z, slope, aspect, date
• DEM, 10m resolution, 5km x 5km
Lochaber weather observations

•   Snow index:          0–10
•   No-settle:           cumulative snow over a season
•   Rain at 900m:        binary [0, 1]
•   Snow drift:          binary [0, 1]
•   Air temperature:     −10 … +10
•   Wind speed:          0 … 25 m/s
•   Wind direction:      0°–360°
•   Cloudiness:          [25, 50, 75, 100]
•   Foot penetration:    0 … 50
•   Snow temperature:    0 … −10
•   Insolation:          cumulative over a season
Classification Problem

    [ Z, Slope, Aspect (S–N, W–E), Spatialized Weather Features ]  →  +1
        … 720 examples, over all the documented avalanche events …
    [ Z, Slope, Aspect (S–N, W–E), Spatialized Weather Features ]  →  +1

    [ Z, Slope, Aspect (S–N, W–E), Spatialized Weather Features ]  →  −1
        … 44000 examples, over all 47 gullies for documented days without avalanches …
    [ Z, Slope, Aspect (S–N, W–E), Spatialized Weather Features ]  →  −1

    4 terrain features + 22 spatialized weather features = 26 features in total
Wind Speed and Direction

• Wind speed weighting
• Correction for slope
• Correction for curvature
• Terrain-corrected wind direction
Snow accumulation

Simple heuristics based on wind speed gradients:

    If Snow index > 0 and Snow drift = 1:
        Snow accumulation = F(Wind Speed, Wind Direction)
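As a hedged sketch only: the actual accumulation function F(wind speed, wind direction) used in the study is not given on the slide, so a placeholder stands in for it.

```python
def snow_accumulation(snow_index, snow_drift, wind_speed, wind_dir,
                      F=lambda speed, direction: speed):   # placeholder for F
    """Heuristic from the slide: accumulate snow only when fresh snow is
    present (Snow index > 0) and drifting is reported (Snow drift = 1)."""
    if snow_index > 0 and snow_drift == 1:
        return F(wind_speed, wind_dir)
    return 0.0
```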
Results




[Maps: DEM; avalanche danger]
Results
[Maps: wind field; animation in 3D]
Applications


• Topo-climatic mapping
• Landslides
• Snow avalanches prediction
• Remote Sensing
Inhabited areas
      Ground truth is known: population census




[Map panels: testing and training areas]
Inhabited areas
Ground truth is known: population census
Inhabited areas: examples
Pre-processing and Features

• Mathematical morphology (image closing)
• SIFT
• Gaussian Mixture Model
Testing: inhabited areas
Summary and Conclusions



• Statistical Learning Theory
• Classification Problem
• Support Vector Machines and Kernel Methods

• GeoSpatial Data Classification with SVM
Open PhD positions at NCG



Thank you!




                           Alexei Pozdnoukhov
                  Alexei.Pozdnoukhov@nuim.ie




                            SFI (SRC-ID 07/SRC/I1168)

More Related Content

What's hot

The multilayer perceptron
The multilayer perceptronThe multilayer perceptron
The multilayer perceptronESCOM
 
Bayesian Methods for Machine Learning
Bayesian Methods for Machine LearningBayesian Methods for Machine Learning
Bayesian Methods for Machine Learningbutest
 
05 history of cv a machine learning (theory) perspective on computer vision
05  history of cv a machine learning (theory) perspective on computer vision05  history of cv a machine learning (theory) perspective on computer vision
05 history of cv a machine learning (theory) perspective on computer visionzukun
 
CVPR2010: Advanced ITinCVPR in a Nutshell: part 4: additional slides
CVPR2010: Advanced ITinCVPR in a Nutshell: part 4: additional slidesCVPR2010: Advanced ITinCVPR in a Nutshell: part 4: additional slides
CVPR2010: Advanced ITinCVPR in a Nutshell: part 4: additional slideszukun
 
Geodesic Method in Computer Vision and Graphics
Geodesic Method in Computer Vision and GraphicsGeodesic Method in Computer Vision and Graphics
Geodesic Method in Computer Vision and GraphicsGabriel Peyré
 
Expert Lecture on GPS at UIET, CSJM, Kanpur
Expert Lecture on GPS at UIET, CSJM, KanpurExpert Lecture on GPS at UIET, CSJM, Kanpur
Expert Lecture on GPS at UIET, CSJM, KanpurSuddhasheel GHOSH, PhD
 
Lecture 2: linear SVM in the dual
Lecture 2: linear SVM in the dualLecture 2: linear SVM in the dual
Lecture 2: linear SVM in the dualStéphane Canu
 
Andreas Eberle
Andreas EberleAndreas Eberle
Andreas EberleBigMC
 
Low Complexity Regularization of Inverse Problems
Low Complexity Regularization of Inverse ProblemsLow Complexity Regularization of Inverse Problems
Low Complexity Regularization of Inverse ProblemsGabriel Peyré
 
Signal Processing Course : Sparse Regularization of Inverse Problems
Signal Processing Course : Sparse Regularization of Inverse ProblemsSignal Processing Course : Sparse Regularization of Inverse Problems
Signal Processing Course : Sparse Regularization of Inverse ProblemsGabriel Peyré
 
Monte-Carlo method for Two-Stage SLP
Monte-Carlo method for Two-Stage SLPMonte-Carlo method for Two-Stage SLP
Monte-Carlo method for Two-Stage SLPSSA KPI
 
NIPS2009: Sparse Methods for Machine Learning: Theory and Algorithms
NIPS2009: Sparse Methods for Machine Learning: Theory and AlgorithmsNIPS2009: Sparse Methods for Machine Learning: Theory and Algorithms
NIPS2009: Sparse Methods for Machine Learning: Theory and Algorithmszukun
 
Robust Control of Uncertain Switched Linear Systems based on Stochastic Reach...
Robust Control of Uncertain Switched Linear Systems based on Stochastic Reach...Robust Control of Uncertain Switched Linear Systems based on Stochastic Reach...
Robust Control of Uncertain Switched Linear Systems based on Stochastic Reach...Leo Asselborn
 
IJCER (www.ijceronline.com) International Journal of computational Engineerin...
IJCER (www.ijceronline.com) International Journal of computational Engineerin...IJCER (www.ijceronline.com) International Journal of computational Engineerin...
IJCER (www.ijceronline.com) International Journal of computational Engineerin...ijceronline
 
Principal component analysis and matrix factorizations for learning (part 3) ...
Principal component analysis and matrix factorizations for learning (part 3) ...Principal component analysis and matrix factorizations for learning (part 3) ...
Principal component analysis and matrix factorizations for learning (part 3) ...zukun
 
Optimal Finite Difference Grids for Elliptic and Parabolic PDEs with Applicat...
Optimal Finite Difference Grids for Elliptic and Parabolic PDEs with Applicat...Optimal Finite Difference Grids for Elliptic and Parabolic PDEs with Applicat...
Optimal Finite Difference Grids for Elliptic and Parabolic PDEs with Applicat...Alex (Oleksiy) Varfolomiyev
 

What's hot (20)

YSC 2013
YSC 2013YSC 2013
YSC 2013
 
The multilayer perceptron
The multilayer perceptronThe multilayer perceptron
The multilayer perceptron
 
Bayesian Methods for Machine Learning
Bayesian Methods for Machine LearningBayesian Methods for Machine Learning
Bayesian Methods for Machine Learning
 
05 history of cv a machine learning (theory) perspective on computer vision
05  history of cv a machine learning (theory) perspective on computer vision05  history of cv a machine learning (theory) perspective on computer vision
05 history of cv a machine learning (theory) perspective on computer vision
 
BMC 2012
BMC 2012BMC 2012
BMC 2012
 
CVPR2010: Advanced ITinCVPR in a Nutshell: part 4: additional slides
CVPR2010: Advanced ITinCVPR in a Nutshell: part 4: additional slidesCVPR2010: Advanced ITinCVPR in a Nutshell: part 4: additional slides
CVPR2010: Advanced ITinCVPR in a Nutshell: part 4: additional slides
 
Geodesic Method in Computer Vision and Graphics
Geodesic Method in Computer Vision and GraphicsGeodesic Method in Computer Vision and Graphics
Geodesic Method in Computer Vision and Graphics
 
Expert Lecture on GPS at UIET, CSJM, Kanpur
Expert Lecture on GPS at UIET, CSJM, KanpurExpert Lecture on GPS at UIET, CSJM, Kanpur
Expert Lecture on GPS at UIET, CSJM, Kanpur
 
Lecture 2: linear SVM in the dual
Lecture 2: linear SVM in the dualLecture 2: linear SVM in the dual
Lecture 2: linear SVM in the dual
 
SSA slides
SSA slidesSSA slides
SSA slides
 
Andreas Eberle
Andreas EberleAndreas Eberle
Andreas Eberle
 
Low Complexity Regularization of Inverse Problems
Low Complexity Regularization of Inverse ProblemsLow Complexity Regularization of Inverse Problems
Low Complexity Regularization of Inverse Problems
 
Signal Processing Course : Sparse Regularization of Inverse Problems
Signal Processing Course : Sparse Regularization of Inverse ProblemsSignal Processing Course : Sparse Regularization of Inverse Problems
Signal Processing Course : Sparse Regularization of Inverse Problems
 
Monte-Carlo method for Two-Stage SLP
Monte-Carlo method for Two-Stage SLPMonte-Carlo method for Two-Stage SLP
Monte-Carlo method for Two-Stage SLP
 
NIPS2009: Sparse Methods for Machine Learning: Theory and Algorithms
NIPS2009: Sparse Methods for Machine Learning: Theory and AlgorithmsNIPS2009: Sparse Methods for Machine Learning: Theory and Algorithms
NIPS2009: Sparse Methods for Machine Learning: Theory and Algorithms
 
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applie...
 
Robust Control of Uncertain Switched Linear Systems based on Stochastic Reach...
Robust Control of Uncertain Switched Linear Systems based on Stochastic Reach...Robust Control of Uncertain Switched Linear Systems based on Stochastic Reach...
Robust Control of Uncertain Switched Linear Systems based on Stochastic Reach...
 
IJCER (www.ijceronline.com) International Journal of computational Engineerin...
IJCER (www.ijceronline.com) International Journal of computational Engineerin...IJCER (www.ijceronline.com) International Journal of computational Engineerin...
IJCER (www.ijceronline.com) International Journal of computational Engineerin...
 
Principal component analysis and matrix factorizations for learning (part 3) ...
Principal component analysis and matrix factorizations for learning (part 3) ...Principal component analysis and matrix factorizations for learning (part 3) ...
Principal component analysis and matrix factorizations for learning (part 3) ...
 
Optimal Finite Difference Grids for Elliptic and Parabolic PDEs with Applicat...
Optimal Finite Difference Grids for Elliptic and Parabolic PDEs with Applicat...Optimal Finite Difference Grids for Elliptic and Parabolic PDEs with Applicat...
Optimal Finite Difference Grids for Elliptic and Parabolic PDEs with Applicat...
 

Similar to Kernel Methods for Environmental and Geo-Sciences (SVM

Epsrcws08 campbell isvm_01
Epsrcws08 campbell isvm_01Epsrcws08 campbell isvm_01
Epsrcws08 campbell isvm_01Cheng Feng
 
lecture9-support vector machines algorithms_ML-1.ppt
lecture9-support vector machines algorithms_ML-1.pptlecture9-support vector machines algorithms_ML-1.ppt
lecture9-support vector machines algorithms_ML-1.pptNaglaaAbdelhady
 
Pydata Katya Vasilaky
Pydata Katya VasilakyPydata Katya Vasilaky
Pydata Katya Vasilakyknv4
 
23 industrial engineering
23 industrial engineering23 industrial engineering
23 industrial engineeringmloeb825
 
The world of loss function
The world of loss functionThe world of loss function
The world of loss function홍배 김
 
Support Vector Machines is the the the the the the the the the
Support Vector Machines is the the the the the the the the theSupport Vector Machines is the the the the the the the the the
Support Vector Machines is the the the the the the the the thesanjaibalajeessn
 
support vector machine algorithm in machine learning
support vector machine algorithm in machine learningsupport vector machine algorithm in machine learning
support vector machine algorithm in machine learningSamGuy7
 
super vector machines algorithms using deep
super vector machines algorithms using deepsuper vector machines algorithms using deep
super vector machines algorithms using deepKNaveenKumarECE
 
A Simple Review on SVM
A Simple Review on SVMA Simple Review on SVM
A Simple Review on SVMHonglin Yu
 
latest TYPES OF NEURAL NETWORKS (2).pptx
latest TYPES OF NEURAL NETWORKS (2).pptxlatest TYPES OF NEURAL NETWORKS (2).pptx
latest TYPES OF NEURAL NETWORKS (2).pptxMdMahfoozAlam5
 
Support vector machine
Support vector machineSupport vector machine
Support vector machinePrasenjit Dey
 
linear SVM.ppt
linear SVM.pptlinear SVM.ppt
linear SVM.pptMahimMajee
 

Similar to Kernel Methods for Environmental and Geo-Sciences (SVM (20)

QMC: Operator Splitting Workshop, A Splitting Method for Nonsmooth Nonconvex ...
QMC: Operator Splitting Workshop, A Splitting Method for Nonsmooth Nonconvex ...QMC: Operator Splitting Workshop, A Splitting Method for Nonsmooth Nonconvex ...
QMC: Operator Splitting Workshop, A Splitting Method for Nonsmooth Nonconvex ...
 
Epsrcws08 campbell isvm_01
Epsrcws08 campbell isvm_01Epsrcws08 campbell isvm_01
Epsrcws08 campbell isvm_01
 
lecture9-support vector machines algorithms_ML-1.ppt
lecture9-support vector machines algorithms_ML-1.pptlecture9-support vector machines algorithms_ML-1.ppt
lecture9-support vector machines algorithms_ML-1.ppt
 
Pydata Katya Vasilaky
Pydata Katya VasilakyPydata Katya Vasilaky
Pydata Katya Vasilaky
 
23 industrial engineering
23 industrial engineering23 industrial engineering
23 industrial engineering
 
The world of loss function
The world of loss functionThe world of loss function
The world of loss function
 
support vector machine
support vector machinesupport vector machine
support vector machine
 
Support Vector Machines is the the the the the the the the the
Support Vector Machines is the the the the the the the the theSupport Vector Machines is the the the the the the the the the
Support Vector Machines is the the the the the the the the the
 
Support Vector Machine.ppt
Support Vector Machine.pptSupport Vector Machine.ppt
Support Vector Machine.ppt
 
svm.ppt
svm.pptsvm.ppt
svm.ppt
 
support vector machine algorithm in machine learning
support vector machine algorithm in machine learningsupport vector machine algorithm in machine learning
support vector machine algorithm in machine learning
 
super vector machines algorithms using deep
super vector machines algorithms using deepsuper vector machines algorithms using deep
super vector machines algorithms using deep
 
A Simple Review on SVM
A Simple Review on SVMA Simple Review on SVM
A Simple Review on SVM
 
Svm my
Svm mySvm my
Svm my
 
Svm my
Svm mySvm my
Svm my
 
Gentle intro to SVM
Gentle intro to SVMGentle intro to SVM
Gentle intro to SVM
 
latest TYPES OF NEURAL NETWORKS (2).pptx
latest TYPES OF NEURAL NETWORKS (2).pptxlatest TYPES OF NEURAL NETWORKS (2).pptx
latest TYPES OF NEURAL NETWORKS (2).pptx
 
Support vector machine
Support vector machineSupport vector machine
Support vector machine
 
Lect3cg2011
Lect3cg2011Lect3cg2011
Lect3cg2011
 
linear SVM.ppt
linear SVM.pptlinear SVM.ppt
linear SVM.ppt
 

More from Beniamino Murgante

Analyzing and assessing ecological transition in building sustainable cities
Analyzing and assessing ecological transition in building sustainable citiesAnalyzing and assessing ecological transition in building sustainable cities
Analyzing and assessing ecological transition in building sustainable citiesBeniamino Murgante
 
Smart Cities: New Science for the Cities
Smart Cities: New Science for the CitiesSmart Cities: New Science for the Cities
Smart Cities: New Science for the CitiesBeniamino Murgante
 
The evolution of spatial analysis and modeling in decision processes
The evolution of spatial analysis and modeling in decision processesThe evolution of spatial analysis and modeling in decision processes
The evolution of spatial analysis and modeling in decision processesBeniamino Murgante
 
Involving citizens in smart energy approaches: the experience of an energy pa...
Involving citizens in smart energy approaches: the experience of an energy pa...Involving citizens in smart energy approaches: the experience of an energy pa...
Involving citizens in smart energy approaches: the experience of an energy pa...Beniamino Murgante
 
Programmazione per la governance territoriale in tema di tutela della biodive...
Programmazione per la governance territoriale in tema di tutela della biodive...Programmazione per la governance territoriale in tema di tutela della biodive...
Programmazione per la governance territoriale in tema di tutela della biodive...Beniamino Murgante
 
Involving Citizens in a Participation Process for Increasing Walkability
Involving Citizens in a Participation Process for Increasing WalkabilityInvolving Citizens in a Participation Process for Increasing Walkability
Involving Citizens in a Participation Process for Increasing WalkabilityBeniamino Murgante
 
Presentation of ICCSA 2019 at the University of Saint petersburg
Presentation of ICCSA 2019 at the University of Saint petersburg Presentation of ICCSA 2019 at the University of Saint petersburg
Presentation of ICCSA 2019 at the University of Saint petersburg Beniamino Murgante
 
RISCHIO TERRITORIALE NEL GOVERNO DEL TERRITORIO: Ricerca e formazione nelle s...
RISCHIO TERRITORIALE NEL GOVERNO DEL TERRITORIO: Ricerca e formazione nelle s...RISCHIO TERRITORIALE NEL GOVERNO DEL TERRITORIO: Ricerca e formazione nelle s...
RISCHIO TERRITORIALE NEL GOVERNO DEL TERRITORIO: Ricerca e formazione nelle s...Beniamino Murgante
 
Presentation of ICCSA 2017 at the University of trieste
Presentation of ICCSA 2017 at the University of triestePresentation of ICCSA 2017 at the University of trieste
Presentation of ICCSA 2017 at the University of triesteBeniamino Murgante
 
GEOGRAPHIC INFORMATION – NEED TO KNOW (GI-N2K) Towards a more demand-driven g...
GEOGRAPHIC INFORMATION – NEED TO KNOW (GI-N2K) Towards a more demand-driven g...GEOGRAPHIC INFORMATION – NEED TO KNOW (GI-N2K) Towards a more demand-driven g...
GEOGRAPHIC INFORMATION – NEED TO KNOW (GI-N2K) Towards a more demand-driven g...Beniamino Murgante
 
Focussing Energy Consumers’ Behaviour Change towards Energy Efficiency and Lo...
Focussing Energy Consumers’ Behaviour Change towards Energy Efficiency and Lo...Focussing Energy Consumers’ Behaviour Change towards Energy Efficiency and Lo...
Focussing Energy Consumers’ Behaviour Change towards Energy Efficiency and Lo...Beniamino Murgante
 
Socio-Economic Planning profiles: Sciences VS Daily activities in public sector 
Socio-Economic Planning profiles: Sciences VS Daily activities in public sector Socio-Economic Planning profiles: Sciences VS Daily activities in public sector 
Socio-Economic Planning profiles: Sciences VS Daily activities in public sector Beniamino Murgante
 
GEOGRAPHIC INFORMATION – NEED TO KNOW (GI-N2K) Towards a more demand-driven g...
GEOGRAPHIC INFORMATION – NEED TO KNOW (GI-N2K) Towards a more demand-driven g...GEOGRAPHIC INFORMATION – NEED TO KNOW (GI-N2K) Towards a more demand-driven g...
GEOGRAPHIC INFORMATION – NEED TO KNOW (GI-N2K) Towards a more demand-driven g...Beniamino Murgante
 
Garden in motion. An experience of citizens involvement in public space regen...
Garden in motion. An experience of citizens involvement in public space regen...Garden in motion. An experience of citizens involvement in public space regen...
Garden in motion. An experience of citizens involvement in public space regen...Beniamino Murgante
 
Planning and Smartness: the true challenge
Planning and Smartness: the true challengePlanning and Smartness: the true challenge
Planning and Smartness: the true challengeBeniamino Murgante
 
GeoSDI: una piattaforma social di dati geografici basata sui principi di INSP...
GeoSDI: una piattaforma social di dati geografici basata sui principi di INSP...GeoSDI: una piattaforma social di dati geografici basata sui principi di INSP...
GeoSDI: una piattaforma social di dati geografici basata sui principi di INSP...Beniamino Murgante
 
Informazione Geografica, Città, Smartness
Informazione Geografica, Città, Smartness Informazione Geografica, Città, Smartness
Informazione Geografica, Città, Smartness Beniamino Murgante
 
Tecnologie, Territorio, Smartness
Tecnologie, Territorio, SmartnessTecnologie, Territorio, Smartness
Tecnologie, Territorio, SmartnessBeniamino Murgante
 

More from Beniamino Murgante (20)

Analyzing and assessing ecological transition in building sustainable cities
Analyzing and assessing ecological transition in building sustainable citiesAnalyzing and assessing ecological transition in building sustainable cities
Analyzing and assessing ecological transition in building sustainable cities
 
Smart Cities: New Science for the Cities
Smart Cities: New Science for the CitiesSmart Cities: New Science for the Cities
Smart Cities: New Science for the Cities
 
The evolution of spatial analysis and modeling in decision processes
The evolution of spatial analysis and modeling in decision processesThe evolution of spatial analysis and modeling in decision processes
The evolution of spatial analysis and modeling in decision processes
 
Smart City or Urban Science?
Smart City or Urban Science?Smart City or Urban Science?
Smart City or Urban Science?
 
Involving citizens in smart energy approaches: the experience of an energy pa...
Involving citizens in smart energy approaches: the experience of an energy pa...Involving citizens in smart energy approaches: the experience of an energy pa...
Involving citizens in smart energy approaches: the experience of an energy pa...
 
Programmazione per la governance territoriale in tema di tutela della biodive...
Programmazione per la governance territoriale in tema di tutela della biodive...Programmazione per la governance territoriale in tema di tutela della biodive...
Programmazione per la governance territoriale in tema di tutela della biodive...
 
Involving Citizens in a Participation Process for Increasing Walkability
Involving Citizens in a Participation Process for Increasing WalkabilityInvolving Citizens in a Participation Process for Increasing Walkability
Involving Citizens in a Participation Process for Increasing Walkability
 
Presentation of ICCSA 2019 at the University of Saint petersburg
Presentation of ICCSA 2019 at the University of Saint petersburg Presentation of ICCSA 2019 at the University of Saint petersburg
Presentation of ICCSA 2019 at the University of Saint petersburg
 
RISCHIO TERRITORIALE NEL GOVERNO DEL TERRITORIO: Ricerca e formazione nelle s...
RISCHIO TERRITORIALE NEL GOVERNO DEL TERRITORIO: Ricerca e formazione nelle s...RISCHIO TERRITORIALE NEL GOVERNO DEL TERRITORIO: Ricerca e formazione nelle s...
RISCHIO TERRITORIALE NEL GOVERNO DEL TERRITORIO: Ricerca e formazione nelle s...
 
Presentation of ICCSA 2017 at the University of trieste
Presentation of ICCSA 2017 at the University of triestePresentation of ICCSA 2017 at the University of trieste
Presentation of ICCSA 2017 at the University of trieste
 
GEOGRAPHIC INFORMATION – NEED TO KNOW (GI-N2K) Towards a more demand-driven g...
GEOGRAPHIC INFORMATION – NEED TO KNOW (GI-N2K) Towards a more demand-driven g...GEOGRAPHIC INFORMATION – NEED TO KNOW (GI-N2K) Towards a more demand-driven g...
GEOGRAPHIC INFORMATION – NEED TO KNOW (GI-N2K) Towards a more demand-driven g...
 
Focussing Energy Consumers’ Behaviour Change towards Energy Efficiency and Lo...
Focussing Energy Consumers’ Behaviour Change towards Energy Efficiency and Lo...Focussing Energy Consumers’ Behaviour Change towards Energy Efficiency and Lo...
Focussing Energy Consumers’ Behaviour Change towards Energy Efficiency and Lo...
 
Socio-Economic Planning profiles: Sciences VS Daily activities in public sector 
Socio-Economic Planning profiles: Sciences VS Daily activities in public sector Socio-Economic Planning profiles: Sciences VS Daily activities in public sector 
Socio-Economic Planning profiles: Sciences VS Daily activities in public sector 
 
GEOGRAPHIC INFORMATION – NEED TO KNOW (GI-N2K) Towards a more demand-driven g...
GEOGRAPHIC INFORMATION – NEED TO KNOW (GI-N2K) Towards a more demand-driven g...GEOGRAPHIC INFORMATION – NEED TO KNOW (GI-N2K) Towards a more demand-driven g...
GEOGRAPHIC INFORMATION – NEED TO KNOW (GI-N2K) Towards a more demand-driven g...
 
Garden in motion. An experience of citizens involvement in public space regen...
Garden in motion. An experience of citizens involvement in public space regen...Garden in motion. An experience of citizens involvement in public space regen...
Garden in motion. An experience of citizens involvement in public space regen...
 
Planning and Smartness: the true challenge
Planning and Smartness: the true challengePlanning and Smartness: the true challenge
Planning and Smartness: the true challenge
 
GeoSDI: una piattaforma social di dati geografici basata sui principi di INSP...
GeoSDI: una piattaforma social di dati geografici basata sui principi di INSP...GeoSDI: una piattaforma social di dati geografici basata sui principi di INSP...
GeoSDI: una piattaforma social di dati geografici basata sui principi di INSP...
 
Murgante smart energy
Murgante smart energyMurgante smart energy
Murgante smart energy
 
Informazione Geografica, Città, Smartness
Informazione Geografica, Città, Smartness Informazione Geografica, Città, Smartness
Informazione Geografica, Città, Smartness
 
Tecnologie, Territorio, Smartness
Tecnologie, Territorio, SmartnessTecnologie, Territorio, Smartness
Tecnologie, Territorio, Smartness
 

Recently uploaded

Gen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdfGen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdfAddepto
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationSafe Software
 
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek SchlawackFwdays
 
Dev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebDev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebUiPathCommunity
 
Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitecturePixlogix Infotech
 
Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationSlibray Presentation
 
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)Wonjun Hwang
 
Powerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time ClashPowerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time Clashcharlottematthew16
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Scott Keck-Warren
 
Unleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubUnleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubKalema Edgar
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 3652toLead Limited
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfAlex Barbosa Coqueiro
 
Story boards and shot lists for my a level piece
Story boards and shot lists for my a level pieceStory boards and shot lists for my a level piece
Story boards and shot lists for my a level piececharlottematthew16
 
Vertex AI Gemini Prompt Engineering Tips
Vertex AI Gemini Prompt Engineering TipsVertex AI Gemini Prompt Engineering Tips
Vertex AI Gemini Prompt Engineering TipsMiki Katsuragi
 
"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr BaganFwdays
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsMark Billinghurst
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Patryk Bandurski
 

Recently uploaded (20)

Gen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdfGen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdf
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
 
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
 
Dev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebDev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio Web
 
Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC Architecture
 
Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck Presentation
 
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
 
Powerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time ClashPowerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time Clash
 
Hot Sexy call girls in Panjabi Bagh 🔝 9953056974 🔝 Delhi escort Service
Hot Sexy call girls in Panjabi Bagh 🔝 9953056974 🔝 Delhi escort ServiceHot Sexy call girls in Panjabi Bagh 🔝 9953056974 🔝 Delhi escort Service
Hot Sexy call girls in Panjabi Bagh 🔝 9953056974 🔝 Delhi escort Service
 
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptxE-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
 
DMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special EditionDMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special Edition
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024
 
Unleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubUnleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding Club
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdf
 
Story boards and shot lists for my a level piece
Story boards and shot lists for my a level pieceStory boards and shot lists for my a level piece
Story boards and shot lists for my a level piece
 
Vertex AI Gemini Prompt Engineering Tips
Vertex AI Gemini Prompt Engineering TipsVertex AI Gemini Prompt Engineering Tips
Vertex AI Gemini Prompt Engineering Tips
 
"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
 

Kernel Methods for Environmental and Geo-Sciences (SVM

  • 1. Kernel Methods (Support Vector Machines) for Environmental and Geo- Sciences Alexei Pozdnoukhov Lecturer National Centre for Geocomputation National University of Ireland, Maynooth +353 (0)1 7086146 Alexei.Pozdnoukhov@nuim.ie
  • 3. Learning From Data • Environmental monitoring Current rate of data acquisition is about 0.5Tb/day (increasing at 82% per year) • Remote Sensing Data NASA holds more than 10Pb of data, increasing by 10x every 5 years. ESA data stream is about 0.5Tb/year, likely to increase by 20x in next 5 years. • GIS, DEM • Sensor Networks • Field Measurements
  • 4. Clustering Cluster 1 Cluster 2
  • 6. Classification Binary Multi-Class
  • 7. Regression y Input, x
  • 8. Curse of Dimensionality Sensor Network Sensor Network Need more data? Batteries Recharged at WSN Human activity Remote Sensing Wireless Sensor Network Geographical Information
  • 9. Detecting Events Observed environment: Events: Very Rare, Extreme high-dimensional input space • High-dimensional spaces: risk of overfitting • Robust to noise in both inputs/outputs • Non-linear and non-parametric • Computationally effective for real-time processing and LBS dissemination
  • 11. Statistical Learning Theory • Models that can generalise from data • Good predictive abilities • Complexity can be controlled
  • 12. Statistical Learning Theory • Occam’s Razor Principle (14th century) One should not increase, beyond what is necessary, the number of entities required to explain anything • When many solutions are available for a given problem, we should select the simplest one. • But what do we mean by simple? • We will use prior knowledge of the problem to solve to define what is a simple solution (example of a prior: smoothness).
  • 13. Occam’s Razor and Classification Model 1 Model 2 Model 3 Complexity √√ √ ×× Training error ×× √ √√ Overall - √ -
  • 14. Structural Risk Minimization • Define a set of learning functions, {S} • Order it in terms of complexity, {S1, …, SN} • Select the optimal S* F = {f(x,α), α∈Λ}
  • 16. Separating Hyperplane x - input patterns w - weight vector b - threshold f w,b ( x ) = sign ( w ⋅ x + b) How powerful are linear decision functions?
  • 17. VC-dimension in classification Shattering • the number of samples which can be discriminated by the function for all possible class memberships – shattered. 3 samples: x x x 4 samples: x ? x VC-dimension h of the linear decision functions in RN equals N+1 That is, the power of linear decision functions is beyond our control…?
  • 18. Support Vector Machine Decision function is a margin hyperplane(*)  1, (w⋅ x) − b ≥ 1 f (x,{w, b}) =   −1, (w⋅ x) − b ≤ −1 Intuition: Large Margin is good. Lemma: Given that the N-dimensional data {xl, x2, …xL} lie inside a finite enclosing sphere of the radius R, the VC-dimension h of the margin-based decision functions (*) follows the inequality: h ≤ min R2 w , N  +1 2   The complexity (VC-dimension) can be controlled with ||w||2 !!
  • 19. Separating Hyperplane: Max Margin To maximize the margin ρ, one would like to minimize ||w||, or ||w||2.  1, (w ⋅ x) − b ≥ 1 fw,b ( x) =  f w,b ( x) = sign (( w ⋅ x) + b)  −1, (w ⋅ x) − b ≤ −1
  • 20. Optimization Problem, Lagrangian { 1 2 min w 2 ⇒ yi ( w ⋅ xi + b) ≥ 1, i = 1,..., L. { L w − ∑ α i ( yi ( w ⋅ xi + b) − 1) 2 Lp = 1 2 i =1 L ⇒ ∑α ⋅ y i =1 i i = 0, L w = ∑ αi ⋅ yi ⋅ xi i =1 KKT conditions: αi > 0 - Support Vectors α i ( yi ( w ⋅ xi + b) − 1) = 0, ∀i αi = 0
  • 21. Optimization Problem: Dual Variables. Dual: maximize L_D = Σ_i α_i − (1/2) Σ_{i,j} α_i α_j y_i y_j (x_i·x_j), subject to Σ_i α_i y_i = 0 and α_i ≥ 0, i = 1, …, L. Decision function: f(x) = sign(w·x + b) = sign(Σ_{i=1..L} α_i y_i (x·x_i) + b). • inputs enter only as dot products • Quadratic Programming • convex problem, nice theoretical field • unique solution, good solvers (see the sketch below).
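As an illustration of this dual formulation (not part of the original slides), the sketch below fits a nearly hard-margin linear SVM with scikit-learn and reads the dual solution back: `dual_coef_` stores the products y_i·α_i for the support vectors, and the manual kernel expansion reproduces the classifier's own prediction. The toy data and the large C value are assumptions made purely for the example.

```python
import numpy as np
from sklearn.svm import SVC

# Toy linearly separable data (assumed for illustration only).
X = np.array([[0.0, 0.0], [1.0, 0.5], [0.2, 0.8],
              [3.0, 3.0], [3.5, 2.5], [2.8, 3.6]])
y = np.array([-1, -1, -1, 1, 1, 1])

# A very large C approximates the hard-margin problem of the slide.
svm = SVC(kernel='linear', C=1e6).fit(X, y)

alpha_times_y = svm.dual_coef_.ravel()      # y_i * alpha_i, support vectors only
support_vectors = svm.support_vectors_      # the x_i with alpha_i > 0
b = svm.intercept_[0]

# f(x) = sign( sum_i alpha_i y_i (x . x_i) + b ), evaluated by hand:
x_new = np.array([1.5, 1.5])
f_manual = np.sign(alpha_times_y @ (support_vectors @ x_new) + b)
print(f_manual, svm.predict([x_new])[0])    # the two predictions should agree
```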
  • 22. Soft margin hyperplane: allowing for training error. Primal: min (1/2)||w||² + C Σ_{i=1..L} ξ_i, subject to y_i(w·x_i + b) ≥ 1 − ξ_i and ξ_i ≥ 0, i = 1, …, L. Dual: L_D = Σ_i α_i − (1/2) Σ_{i,j} α_i α_j y_i y_j (x_i·x_j), subject to Σ_i α_i y_i = 0 and 0 ≤ α_i ≤ C, i = 1, …, L. C is the regularization parameter: the trade-off between margin maximization and training error.
  • 23. Support Vector Terminology. α_i = 0: normal samples. 0 < α_i < C: Support Vectors. α_i = C: Support Vectors that are untypical or noisy. Decision function: f(x) = sign(Σ_{i=1..L} α_i y_i (x·x_i) + b). C is the regularization parameter: the trade-off between margin maximization and training error. (A sketch of how to inspect these categories follows below.)
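The same scikit-learn attributes used in the previous sketch can separate the support-vector categories listed above; a small sketch on assumed, overlapping toy data:

```python
import numpy as np
from sklearn.svm import SVC

C = 1.0
rng = np.random.default_rng(0)
# Overlapping toy classes so that some slack variables are active (assumption).
X = np.vstack([rng.normal(0.0, 1.0, (40, 2)), rng.normal(1.5, 1.0, (40, 2))])
y = np.hstack([-np.ones(40), np.ones(40)])

svm = SVC(kernel='linear', C=C).fit(X, y)
abs_alpha = np.abs(svm.dual_coef_.ravel())   # = alpha_i for each support vector

bounded = np.isclose(abs_alpha, C)           # alpha_i = C : untypical / noisy SVs
free = ~bounded                              # 0 < alpha_i < C : margin SVs
print(f"{free.sum()} free SVs, {bounded.sum()} bounded SVs, "
      f"{len(X) - len(abs_alpha)} normal samples (alpha_i = 0)")
```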
  • 24. Support Vector Algorithm: the Kernel Trick. If data are not linearly separable, they can be projected into a (sufficiently) high-dimensional space, where they are much easier to separate. Example: K(x, x′) = (x·x′)² corresponds to the mapping (x_1, x_2) → (x_1², √2·x_1·x_2, x_2²). Do we need to compute x → Φ(x) explicitly? No – the algorithm was formulated in terms of dot products, so x·x′ → Φ(x)·Φ(x′) ⇔ x·x′ → K(x, x′). Requirements: • K is symmetric • K is positive-definite. (A numerical check follows below.)
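A quick numerical check of the example kernel K(x, x′) = (x·x′)² against its explicit feature map; the two vectors are arbitrary.

```python
import numpy as np

def phi(x):
    """Explicit feature map for K(x, x') = (x . x')^2 in two dimensions."""
    x1, x2 = x
    return np.array([x1**2, np.sqrt(2) * x1 * x2, x2**2])

x = np.array([1.0, 2.0])
xp = np.array([3.0, -1.0])

k_direct = (x @ xp) ** 2          # kernel evaluated in the input space
k_mapped = phi(x) @ phi(xp)       # dot product in the feature space
print(k_direct, k_mapped)         # both equal 1.0 here
```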
  • 25. Nonlinear SVM. Kernel trick. f(x) = w·x + b → f(x) = Σ_{i=1..L} y_i α_i K(x, x_i) + b. Any linear algorithm formulated in terms of dot products of the input data can be turned into a non-linear one using the kernel trick: • Support Vector Machine • Kernel Ridge Regression • Kernel Principal Component Analysis • Kernel Fisher Discriminant Analysis • etc.
  • 26. Nonlinear SVM. Kernel types. • Polynomial kernel: K(x, y) = (x·y + 1)^p • Radial Basis Function (RBF) kernel: K(x, y) = exp(−||x − y||² / (2σ²)). Decision function: f(x) = sign(Σ_{i∈SV} y_i α_i K(x, x_i) + b).
  • 27. Nonlinear SVM. Optimization problem. L_D = Σ_i α_i − (1/2) Σ_{i,j} α_i α_j y_i y_j K(x_i, x_j), subject to Σ_i α_i y_i = 0 and 0 ≤ α_i ≤ C, i = 1, …, L. The threshold is recovered from any free support vector x_j (0 < α_j < C): b = y_j − Σ_i y_i α_i K(x_i, x_j). Decision function: f(x) = sign(Σ_{i∈SV} y_i α_i K(x, x_i) + b). K is positive-definite, so this is still a QP problem, hence a unique solution!
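A sketch, on assumed toy data, that recovers b from a free support vector of a fitted RBF SVM and evaluates the kernel expansion f(x) = sign(Σ_i y_i α_i K(x, x_i) + b) by hand, checking it against scikit-learn's own prediction; note that scikit-learn's gamma corresponds to 1/(2σ²) for the slide's RBF kernel.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 2))
y = np.where(X[:, 0] * X[:, 1] > 0, 1, -1)       # non-linear toy target (assumption)

C, sigma = 10.0, 0.5
gamma = 1.0 / (2.0 * sigma**2)                   # sklearn gamma vs. the slide's sigma
svm = SVC(kernel='rbf', C=C, gamma=gamma).fit(X, y)

ay = svm.dual_coef_.ravel()                      # y_i * alpha_i for each support vector
sv = svm.support_vectors_

# Recover b from a free support vector (0 < alpha_j < C), as on the slide.
j = int(np.argmax(np.abs(ay) < C))               # index of a free SV (assumes one exists)
y_j = np.sign(ay[j])                             # sign(y_j * alpha_j) = y_j since alpha_j > 0
b = y_j - ay @ rbf_kernel(sv, sv[j:j + 1], gamma=gamma).ravel()

# Evaluate f(x) = sign( sum_i y_i alpha_i K(x, x_i) + b ) by hand.
x_new = np.array([[0.3, -0.7]])
f_manual = np.sign(ay @ rbf_kernel(sv, x_new, gamma=gamma).ravel() + b)
print(f_manual, svm.predict(x_new)[0])           # the two predictions should agree
print(b, svm.intercept_[0])                      # recovered threshold vs. sklearn's
```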
  • 28. Support Vector Machine http://www.geokernels.org/teaching/svm
  • 31. SV Porosity Mapping: data description. 200 training samples and 94 validation samples (plotted as “+”). Statistics: minimum = 0.0, median = 0.515, max = 1.000, mean = 0.53, variance = 0.048. The original continuous data were transformed into 2-class data according to the 0.5 threshold: if f_por ≥ 0.5 then y = +1; if f_por < 0.5 then y = -1.
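The 2-class transformation described above is a one-liner; `porosity` below is a placeholder array standing in for the continuous porosity values.

```python
import numpy as np

porosity = np.array([0.12, 0.48, 0.50, 0.73, 0.515])   # placeholder values
y = np.where(porosity >= 0.5, 1, -1)                    # >= 0.5 -> +1, < 0.5 -> -1
print(y)                                                # [-1 -1  1  1  1]
```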
  • 32. SV Porosity Mapping: 2-class transformation. Map legend: class “+1” (≥ 0.5), class “-1” (< 0.5), “+” validation data.
  • 33. SV Porosity Mapping: data loading. 150 training samples, 50 testing samples, and a prediction grid.
  • 34. SV Porosity Mapping: hyper-parameter tuning. The Gaussian RBF kernel is selected: K(x, x′) = exp(−||x − x′||² / (2σ²)). There are two hyper-parameters, C and σ, tuned by grid search: the testing error is analysed for every pair of parameters. Range of σ: from min(σ) = the minimum distance between data samples to max(σ) = the maximum distance between data samples. Range of log(C): from min(C) = some small value (1 or less) to max(C), which depends on the data (1e3-1e6). Start the calculation using the testing data and save the results to file. (A sketch of such a grid search is given below.)
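A sketch of such a grid search with scikit-learn, using a σ range derived from the pairwise distances and a log-spaced C range as described on the slide; `X_train` and `y_train` are placeholders for the 150 training samples and their ±1 labels, and cross-validation stands in here for the slide's separate testing set.

```python
import numpy as np
from scipy.spatial.distance import pdist
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

# Placeholder training data; replace with the porosity samples and labels.
rng = np.random.default_rng(0)
X_train = rng.uniform(size=(150, 2))
y_train = np.where(rng.uniform(size=150) > 0.5, 1, -1)

# Sigma range: from the minimum to the maximum pairwise distance between samples.
d = pdist(X_train)
sigmas = np.geomspace(d[d > 0].min(), d.max(), num=15)

param_grid = {
    "C": np.geomspace(1.0, 1e6, num=13),    # log(C) range as on the slide
    "gamma": 1.0 / (2.0 * sigmas**2),       # sklearn gamma = 1 / (2 sigma^2)
}

search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X_train, y_train)
print(search.best_params_)                  # the (C, gamma) pair with lowest CV error
```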
  • 35. SV Porosity Mapping: hyper-parameter tuning. Training error surface over log(C) and the Gaussian RBF kernel bandwidth: the training error increases with the kernel bandwidth and decreases with C.
  • 36. SV Porosity Mapping: hyper-parameter tuning. Testing error surface over log(C) and the Gaussian RBF kernel bandwidth: the structure is complex, but generally, if the range is selected reasonably and the data splitting is correct, there exists a region of minima – the optimal values.
  • 37. SV Porosity Mapping: hyper-parameter tuning. Normalized number of Support Vectors over log(C) and the Gaussian RBF kernel bandwidth: it represents the complexity of the model – the more complex the model, the more SVs it has.
  • 38. What are the parameters for the final model? Hyper-parameter selection from the testing error, training error and normalized NSV surfaces: one candidate is C = 3, σ = 0.09.
  • 39. What are the parameters for the final model? Hyper-parameter selection, continued: another candidate is C = 18, σ = 0.13.
  • 40. SV Porosity Mapping: dependence on parameters. Prediction maps for fixed C = 10 and σ = 0.02, 0.06, 0.1, 0.2, 0.3, 0.4, 0.5.
  • 41. SV Porosity Mapping: dependence on parameters. Prediction maps for fixed σ = 0.1 and C = 100, 10, 1, 0.1.
  • 42. SV Porosity Mapping: predictive mapping and Support Vectors. Map legend: predictive mapping with the margin, normal SVs (0 < α < C), and critical SVs (α = C).
  • 43. Applications for Natural Hazards • Topo-climatic mapping • Landslides • Snow avalanches prediction
  • 44. Weather observations • 110 meteo stations • Measurements, up to every 10min • Altitude: 270m-3580m • Temperature • Precipitation • Humidity • Air Pressure • Wind Speed • Insolation • Etc. Spatio-temporal prediction mapping?
  • 45. Temperature Inversion Can only be explained using terrain surface characteristics (convexity, slope, etc.)
  • 46. Physical Models at local scales • Terrain roughness is too high for physical models; computational speed, precision, uncertainty estimation… A PDE on smoothed terrain plus empirical corrections: v_Model(x, y) = v_Physical + c_Ridges + c_Canyons + c_Valleys + c_FlatAreas + c_Sea + … Can this information be extracted directly from data?
  • 47. Modelling Scheme [diagram]: Data and DEM → Features (feature selection/extraction) → predictive modelling with Machine Learning (non-linear dependencies; noise, outliers) → spatio-temporal mapping, analysis and decision support.
  • 48. Temperature vs. Elevation [figure panels]: mean monthly – linear; mean daily – locally linear, regionalized; mean hourly – non-linear, regionalized; mean hourly – explained by temperature inversion.
  • 49. DEM Features: large-scale difference of Gaussians, short-scale difference of Gaussians, slope, local variance. (See the sketch below.)
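A sketch of how such DEM features could be computed with scipy; the elevation grid, filter scales and window size are placeholders, not values from the talk.

```python
import numpy as np
from scipy import ndimage

dem = np.random.default_rng(2).normal(size=(200, 200))   # placeholder elevation grid

def difference_of_gaussians(z, sigma_small, sigma_large):
    """Band-pass terrain feature: smooth at two scales and subtract."""
    return ndimage.gaussian_filter(z, sigma_small) - ndimage.gaussian_filter(z, sigma_large)

dog_short = difference_of_gaussians(dem, 1, 4)     # short-scale convexity/concavity
dog_large = difference_of_gaussians(dem, 8, 32)    # large-scale ridges and valleys

gy, gx = np.gradient(dem)                          # finite-difference gradients
slope = np.hypot(gx, gy)                           # slope magnitude

local_mean = ndimage.uniform_filter(dem, size=9)
local_var = ndimage.uniform_filter(dem**2, size=9) - local_mean**2   # local variance

features = np.stack([dog_short, dog_large, slope, local_var], axis=-1)
print(features.shape)                              # (200, 200, 4)
```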
  • 50. Temperature Inversion Mapping [maps]: probability of inversion; temperature.
  • 53. Applications • Topo-climatic mapping • Landslides • Snow avalanches prediction • Remote Sensing
  • 54. Landslide inventory
  • 55. Method I: probability density estimation in the factor space (Factor 1 vs. Factor 2)
  • 56. Model vs. Training Data
  • 57. What is wrong with this susceptibility map?
  • 58. Method II: classification into stable vs. unstable areas in the factor space (Factor 1 vs. Factor 2)
  • 59. Predictive models
  • 60. A model should fit the observed landslides, and …
  • 61. Applications • Topo-climatic mapping • Landslides • Snow avalanches prediction • Remote Sensing
  • 62. Lochaber, Scotland • 1842 days of weather conditions (11 features) recording, 1991-2007 • 1135 days with documented avalanche events • 797 safe days, 245 with avalanches • 260 days unknown (mainly bad weather)
  • 63. Spatial Data Training data: 722 events, winters 1991-2005 Validation data: 72 events, winters 2006-2007 • 47 avalanche paths, x, y, z, slope, aspect, date • DEM, 10m resolution, 5km x 5km
  • 64. Lochaber weather observations • Snow index 0-10 • No-settle, cumulative snow over a season • Rain at 900m, binary [0, 1] • Snow drift, binary [0, 1] • Air temperature −10 … +10 • Wind speed 0 … 25 m/s • Wind direction 0°-360° • Cloudiness [25, 50, 75, 100] • Foot penetration 0 … 50 • Snow temperature 0 … −10 • Insolation, cumulative over season
  • 65. Classification Problem. Each example is a vector [Z, Slope, Aspect (S-N, W-E), spatialized weather features]. Label +1: 720 examples, over all the documented avalanche events. Label −1: 44000 examples, over all the 47 gullies for documented days without avalanches. 4 + 22 = 26
  • 66. Wind Speed and Direction [equations on slide]: wind speed weighting; correction for slope; correction for curvature; terrain-corrected wind direction.
  • 67. Snow accumulation: a simple heuristic based on wind speed gradients – if Snow index > 0 and Snow drift = 1, then Snow accumulation = F(Wind Speed, Wind Direction).
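A literal transcription of that heuristic in code; the function F is left as a hypothetical placeholder since the slide does not specify it.

```python
def snow_accumulation(snow_index, snow_drift, wind_speed, wind_direction, F):
    """Heuristic from the slide: accumulation only when snow is present and drifting."""
    if snow_index > 0 and snow_drift == 1:
        return F(wind_speed, wind_direction)
    return 0.0

# Hypothetical F: more accumulation for stronger winds (illustration only).
example = snow_accumulation(3, 1, 12.0, 270.0, F=lambda ws, wd: 0.1 * ws)
print(example)   # 1.2
```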
  • 68. Results [maps]: DEM; avalanche danger
  • 69. Results [maps]: wind; animation in 3D
  • 70. Applications • Topo-climatic mapping • Landslides • Snow avalanches prediction • Remote Sensing
  • 71. Inhabited areas. Ground truth is known: population census (study area split into training and testing regions).
  • 72. Inhabited areas Ground truth is known: population census
  • 79. Pre-processing and Features Mathematical morphology (image closing)
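A sketch of the image-closing step with scipy; the binary mask and structuring-element size are placeholders rather than the actual remote-sensing pre-processing.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
built_up = rng.uniform(size=(100, 100)) > 0.7          # placeholder binary "built-up" mask

# Morphological closing = dilation followed by erosion: fills small gaps and holes.
structure = np.ones((5, 5), dtype=bool)                # structuring element (assumed size)
closed = ndimage.binary_closing(built_up, structure=structure)

print(built_up.sum(), closed.sum())                    # closing never removes pixels
```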
  • 81. Pre-processing and Features Gaussian Mixture Model
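And a minimal Gaussian Mixture Model sketch with scikit-learn, again on placeholder feature vectors rather than the actual image features.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
# Placeholder 2-D feature vectors (e.g. per-pixel texture/spectral features).
features = np.vstack([rng.normal(0, 1, (500, 2)), rng.normal(4, 1, (500, 2))])

gmm = GaussianMixture(n_components=2, covariance_type='full', random_state=0)
labels = gmm.fit_predict(features)          # hard component assignment from the fitted mixture
print(gmm.means_)                           # estimated component means
print(np.bincount(labels))                  # roughly 500 / 500
```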
  • 86. Summary and Conclusions • Statistical Learning Theory • Classification Problem • Support Vector Machines and Kernel Methods • GeoSpatial Data Classification with SVM
  • 87. Open PhD positions at NCG Thank you! Alexei Pozdnoukhov Alexei.Pozdnoukhov@nuim.ie SFI (SRC-ID 07/SRC/I1168)