Gaussian Process Regression
An intuitive introduction
Juan Pablo Carbajal
Siedlungswasserwirtschaft
Eawag - aquatic research
Dübendorf, Switzerland
juanpablo.carbajal@eawag.ch
November 24, 2017
Intro to GP
JuanPi Carbajal
Learning from data
Interpolation (polynomial)
The learning problem in a nutshell
[Figure: scatter plot of the data points (t_i, y_i); horizontal axis t, vertical axis y.]
Data given {(t_i, y_i)} = (t, y): what model to use?
The learning problem
Use a set of observations to uncover an underlying process, for prediction (and maybe for understanding).
Yaser Abu-Mostafa. Learning from data. https://work.caltech.edu/telecourse.html
The learning problem
Input: x, a position on a map
Output: y, the height of the terrain
Target function: f : X → Y, the height map
Data: (x1, y1), . . . , (xn, yn), field measurements
Hypothesis: g : X → Y, the formula to be used
The learning problem
Unknown target function: how legal-like is the text?
Training examples: available text snippets
Learning algorithm with a hypothesis set: possible text classification functions
Final hypothesis: the final classification function
Data set
[Figure: scatter plot of the data points (t_i, y_i); horizontal axis t, vertical axis y.]
Data given {(t_i, y_i)} = (t, y): what model to use?
Naive regression
[Figure: the data with a cubic polynomial fit; horizontal axis t, vertical axis y.]
Propose a cubic polynomial model,
$$w_0 + w_1 t + w_2 t^2 + w_3 t^3 = y(t),$$
which in vector form reads
$$\begin{pmatrix} 1 & t & t^2 & t^3 \end{pmatrix} \begin{pmatrix} w_0 \\ w_1 \\ w_2 \\ w_3 \end{pmatrix} = y(t) \quad\Longleftrightarrow\quad \phi(t)^{\top} w = y(t).$$
Evaluated at the n = 3 data points,
$$\begin{pmatrix} 1 & t_1 & t_1^2 & t_1^3 \\ 1 & t_2 & t_2^2 & t_2^3 \\ 1 & t_3 & t_3^2 & t_3^3 \end{pmatrix} \begin{pmatrix} w_0 \\ w_1 \\ w_2 \\ w_3 \end{pmatrix} = \begin{pmatrix} y_1 \\ y_2 \\ y_3 \end{pmatrix} \quad\Longleftrightarrow\quad \Phi(\mathbf{t})\, w = \begin{pmatrix} \phi(t_1)^{\top} \\ \phi(t_2)^{\top} \\ \phi(t_3)^{\top} \end{pmatrix} w = y.$$
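A minimal numerical sketch of this design-matrix construction. The data values below are assumed for illustration (the slide's actual numbers are not reproduced here):

```python
import numpy as np

# Hypothetical data: n = 3 observations (illustrative values only)
t = np.array([0.2, 0.5, 0.8])
y = np.array([-1.0, 1.5, -0.5])

# Feature map phi(t) = (1, t, t^2, t^3): N = 4 features for n = 3 points
Phi = np.vander(t, N=4, increasing=True)   # rows are phi(t_i)^T

# Minimum-norm solution of the underdetermined system Phi w = y
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# The cubic interpolates every observation exactly
assert np.allclose(Phi @ w, y)
```

Since N > n the system is underdetermined; `lstsq` picks one particular solution (the minimum-norm one), a point the later slides return to.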
Pseudo-inverse
$\Phi$ is an $n \times N$ ($3 \times 4$) matrix with $N \geq n$; hence $\operatorname{rank}\Phi \leq n$.
With a feature vector $\phi$ complex enough we have $\operatorname{rank}\Phi = n$, i.e. the $n$ row vectors of the matrix are linearly independent, so $\left(\Phi\Phi^{\top}\right)^{-1}$ exists.
$\Phi\Phi^{\top}$ is called the Gramian matrix: the matrix of all scalar products.
$$\Phi w = y \;\rightarrow\; \overbrace{\Phi\Phi^{\top}\left(\Phi\Phi^{\top}\right)^{-1}}^{I}\, \Phi w = \Phi\, \underbrace{\Phi^{\top}\left(\Phi\Phi^{\top}\right)^{-1}\, \Phi w}_{w^{*}} = y$$
$$\Phi^{\top}\left(\Phi\Phi^{\top}\right)^{-1}\, \Phi w = \underbrace{\Phi^{\top}\left(\Phi\Phi^{\top}\right)^{-1}}_{\text{Moore--Penrose pseudoinverse}}\, y = w^{*}$$
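The explicit pseudoinverse formula can be checked numerically; this sketch assumes the same illustrative data values as before:

```python
import numpy as np

t = np.array([0.2, 0.5, 0.8])             # hypothetical data points
y = np.array([-1.0, 1.5, -0.5])
Phi = np.vander(t, N=4, increasing=True)  # n x N = 3 x 4, full row rank

# Moore-Penrose pseudoinverse written out: Phi^T (Phi Phi^T)^{-1}
Gram = Phi @ Phi.T                        # n x n Gramian of scalar products
Phi_pinv = Phi.T @ np.linalg.inv(Gram)

# It agrees with NumPy's pseudoinverse and solves the interpolation
assert np.allclose(Phi_pinv, np.linalg.pinv(Phi))
w_star = Phi_pinv @ y
assert np.allclose(Phi @ w_star, y)
```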
A change of perspective
Instead of looking at the rows, look at the columns. These are linearly independent functions $\psi_i(t) = t^{i}$. The model looks like
$$y(t) = \sum_{i=0}^{N-1} \psi_i(t)\, w_i$$
and the regression problem now looks like
$$\Psi(\mathbf{t})\, w = \begin{pmatrix} \psi_0(\mathbf{t}) & \psi_1(\mathbf{t}) & \psi_2(\mathbf{t}) & \psi_3(\mathbf{t}) \end{pmatrix} w = y.$$
Note that, as matrices evaluated at the data, $\Psi = \Phi$; only the interpretation changes.
A change of perspective
$\Psi$ is an $n \times N$ ($3 \times 4$) matrix with $N \geq n$; hence $\operatorname{rank}\Psi \leq n$.
If the $N$ column vectors of the matrix span $\mathbb{R}^{n}$ (i.e. $\operatorname{rank}\Psi = n$), then $\left(\Psi\Psi^{\top}\right)^{-1}$ exists.
$K = \Psi\Psi^{\top}$ is called the covariance matrix: $K_{ij} = \sum_{k=0}^{N-1} \psi_k(t_i)\, \psi_k(t_j)$.
$$\Psi w = y \;\rightarrow\; \overbrace{\Psi\Psi^{\top}\left(\Psi\Psi^{\top}\right)^{-1}}^{I}\, \Psi w = \Psi\, \underbrace{\Psi^{\top}\left(\Psi\Psi^{\top}\right)^{-1}\, \Psi w}_{w^{*}} = y$$
$$\Psi^{\top}\left(\Psi\Psi^{\top}\right)^{-1}\, \Psi w = \underbrace{\Psi^{\top}\left(\Psi\Psi^{\top}\right)^{-1}}_{\text{Moore--Penrose pseudoinverse}}\, y = w^{*}$$
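The two readings of $K = \Psi\Psi^{\top}$ — a matrix product and an element-wise sum over basis functions — can be verified in a few lines (data values assumed for illustration):

```python
import numpy as np

t = np.array([0.2, 0.5, 0.8])               # hypothetical data points
Psi = np.vander(t, N=4, increasing=True)    # columns are psi_k(t) = t^k

# Covariance matrix as a matrix product ...
K = Psi @ Psi.T

# ... and element-wise as K_ij = sum_k psi_k(t_i) psi_k(t_j)
K_elem = np.array([[sum(ti**k * tj**k for k in range(4)) for tj in t]
                   for ti in t])
assert np.allclose(K, K_elem)
```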
Recapitulation: the problem
Given $n$ examples $\{(t_i, y_i)\}$, propose a model using $N \geq n$ linearly independent functions (a.k.a. features),
$$f(t) = \sum_{i} \psi_i(t)\, w_i,$$
and find some good $\{w_i\}$.
Hansen, Per Christian. Rank-Deficient and Discrete Ill-Posed Problems: Numerical Aspects of Linear Inversion. Vol. 4. SIAM, 1998.
Wendland, Holger. Scattered Data Approximation. Vol. 17. Cambridge University Press, 2004.
Recapitulation: the solution
Data view
Think in terms of $n$ feature vectors $\phi_j$ in a (high-dimensional) space $\mathbb{R}^{N}$,
$$\phi_j = \begin{pmatrix} \psi_0(t_j) & \ldots & \psi_{N-1}(t_j) \end{pmatrix}, \quad j = 1, \ldots, n.$$
The solution reads
$$f(t) = \phi(t)^{\top} w^{*} = \overbrace{\phi(t)^{\top}\Phi^{\top}}^{\text{scalar product}}\, \big(\underbrace{\Phi\Phi^{\top}}_{\text{scalar product}}\big)^{-1}\, y.$$
Function view
Think in terms of an $N$-dimensional function space $\mathcal{H}$ spanned by the $\psi_i(t)$. The solution reads
$$f(t) = \Psi(t)\, w^{*} = \overbrace{\Psi(t)\Psi^{\top}}^{\text{covariance}}\, \big(\underbrace{\Psi\Psi^{\top}}_{\text{covariance}}\big)^{-1}\, y = k(t, \mathbf{t})\, k(\mathbf{t}, \mathbf{t})^{-1}\, y.$$
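Both views produce the same interpolant, which a short sketch confirms (training data and query grid are assumed for illustration):

```python
import numpy as np

t = np.array([0.2, 0.5, 0.8])              # hypothetical training inputs
y = np.array([-1.0, 1.5, -0.5])            # hypothetical observations
ts = np.linspace(0.0, 1.0, 50)             # query points

def features(x):
    """Polynomial features psi_k(x) = x^k for k = 0..3 (one row per input)."""
    return np.vander(np.atleast_1d(x), N=4, increasing=True)

Psi = features(t)                          # n x N design matrix

# Function view: f(t) = k(t, T) K^{-1} y, everything via covariances
K = Psi @ Psi.T
k_star = features(ts) @ Psi.T              # covariances: queries vs. data
f_cov = k_star @ np.linalg.solve(K, y)

# Weight view: f(t) = psi(t)^T w*, with w* from the pseudoinverse
w_star = np.linalg.pinv(Psi) @ y
f_w = features(ts) @ w_star

# Same interpolant either way
assert np.allclose(f_cov, f_w)
```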
The Kernel trick
To calculate the solutions we only need scalar products or covariances: we never use the actual $\{\phi_i\}$ or $\{\psi_i\}$.
$$\operatorname{cov}_{\Psi}(t, t') = k(t, t') = \phi(t) \cdot \phi(t')$$
Infinite features
Now we can use $N = \infty$, i.e. infinite-dimensional feature vectors, or infinitely many basis functions!
By selecting valid covariance functions we implicitly select features for our model.
How to choose the covariance function? Prior knowledge about the solution.
Rasmussen, C., & Williams, C. (2006). Gaussian Processes for Machine Learning.
http://www.gaussianprocess.org/gpml/
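As one concrete choice, the squared-exponential covariance treated by Rasmussen & Williams corresponds to infinitely many implicit basis functions. A sketch of noise-free interpolation with it; the data values and the length-scale `ell` are assumptions, not values from the slides:

```python
import numpy as np

def k_se(a, b, ell=0.2):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

t = np.array([0.2, 0.5, 0.8])              # hypothetical data
y = np.array([-1.0, 1.5, -0.5])
ts = np.linspace(0.0, 1.0, 100)            # query points

# Same formula as before, f(t) = k(t, T) k(T, T)^{-1} y, but the
# (infinite) feature vectors are never constructed explicitly
f = k_se(ts, t) @ np.linalg.solve(k_se(t, t), y)

# The interpolant still passes through the data
f_train = k_se(t, t) @ np.linalg.solve(k_se(t, t), y)
assert np.allclose(f_train, y)
```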
Digression: back to the solution
Let's call the pseudoinverse $\Psi^{+}$. The proposed solution:
$$\Psi^{+} y = w^{*}, \qquad \Psi w^{*} = \Psi\Psi^{+} y = y \;\rightarrow\; y(t) = \Psi(t)\, w^{*}.$$
The LHS of the arrow is the interpolation; the RHS is the intra- or extrapolation.
But for any random vector $\xi$ we have
$$w^{*} + \overbrace{\left(I - \Psi^{+}\Psi\right)}^{\text{projector onto } \operatorname{null}\Psi}\, \xi = \hat{w}^{*}, \qquad \Psi\hat{w}^{*} = y + \big(\Psi - \underbrace{\Psi\Psi^{+}}_{I}\,\Psi\big)\,\xi = y.$$
$\hat{w}^{*}$ also solves the interpolation problem. There are many solutions! (Unless $\Psi^{+}\Psi = I$, i.e. $\operatorname{null}\Psi = \{0\}$, i.e. the matrix is invertible: not our case.)
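The multiplicity of solutions is easy to demonstrate: perturb $w^{*}$ along the null space of $\Psi$ and every perturbed weight vector still interpolates the data (illustrative data values assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.array([0.2, 0.5, 0.8])              # hypothetical data
y = np.array([-1.0, 1.5, -0.5])
Psi = np.vander(t, N=4, increasing=True)   # n x N with N > n: null space != {0}

Psi_pinv = np.linalg.pinv(Psi)
w_star = Psi_pinv @ y                      # minimum-norm solution

# (I - Psi^+ Psi) projects any vector xi onto null(Psi)
P_null = np.eye(4) - Psi_pinv @ Psi
for _ in range(5):
    xi = rng.standard_normal(4)
    w_hat = w_star + P_null @ xi
    assert np.allclose(Psi @ w_hat, y)     # every w_hat fits the data too
```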
Digression: back to the solution
[Figure: several curves through the same data points, each of the form $w^{*} + (I - \Psi^{+}\Psi)\,\xi$ for a different random $\xi$; horizontal axis t, vertical axis y.]
Gaussian Process
Thank you!