SUPERVISED
LEARNING
WHAT IS MACHINE LEARNING?
The subfield of computer science that "gives computers
the ability to learn without being explicitly programmed".
"A computer program is said to learn from experience E
with respect to some class of tasks T and performance
measure P if its performance at tasks in T, as measured
by P, improves with experience E."
Using data for answering questions: training, then predicting.
TYPES OF
LEARNING
• Supervised (inductive) learning: training data includes desired outputs
• Unsupervised learning: training data does not include desired outputs
• Semi-supervised learning: training data includes a few desired outputs
• Reinforcement learning: rewards from a sequence of actions
MACHINE
LEARNING
BASED
PREDICTIVE
MODELING
ANN (Artificial Neural Network)
ML (Machine Learning)
DL (Deep Learning)
DEFINITION
OF GAUSSIAN
PROCESS
REGRESSION
(GPR)
• A Gaussian process is defined as a probability distribution
over functions y(x), such that the set of values of y(x)
evaluated at an arbitrary set of points x_1, ..., x_N jointly have a
Gaussian distribution.
• A probability distribution indexed by an arbitrary set
• Any finite subset of indices defines a multivariate
Gaussian distribution
• For an input space X, the distribution of y(x) at each x is
Gaussian; what determines the GP is
  • The mean function µ(x) = E[y(x)]
  • The covariance function (kernel) k(x, x') = E[y(x) y(x')]
• In most applications we take µ(x) = 0, hence the prior
is represented by the kernel.
LINEAR REGRESSION UPDATED BY GPR
• Specific case of a Gaussian process
• It is defined by the linear regression model
  y(x) = w^T φ(x)
with a weight prior
  p(w) = N(w | 0, α^{-1} I)
• The kernel function is given by
  k(x_n, x_m) = (1/α) φ(x_n)^T φ(x_m)
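The kernel above follows directly from the prior: Cov[y(x_n), y(x_m)] = φ(x_n)^T E[w w^T] φ(x_m) = (1/α) φ(x_n)^T φ(x_m). This can be checked numerically without any sampling (the polynomial basis below is an arbitrary illustrative choice):

```python
import numpy as np

# Check: with y(x) = w^T phi(x) and prior w ~ N(0, alpha^{-1} I),
# Cov[y_n, y_m] = phi(x_n)^T E[w w^T] phi(x_m) = (1/alpha) phi(x_n)^T phi(x_m).
alpha = 2.0
x = np.linspace(-1, 1, 5)
Phi = np.stack([np.ones_like(x), x, x**2], axis=1)    # simple polynomial basis

K_kernel = (1.0 / alpha) * Phi @ Phi.T                # slide's kernel formula
K_prior = Phi @ ((1.0 / alpha) * np.eye(3)) @ Phi.T   # Phi E[w w^T] Phi^T

print(np.allclose(K_kernel, K_prior))  # True
```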
KERNEL
FUNCTION
• We can also define the
kernel function directly.
• The figure shows samples of
functions drawn from
Gaussian processes for two
different choices of kernel
function
GP FOR REGRESSION
Take account of the noise on the observed target values,
which are given by
  t_n = y_n + ε_n
where y_n = y(x_n) and ε_n is a random noise variable.
• Here we consider noise processes that have a Gaussian
distribution, so that
  p(t_n | y_n) = N(t_n | y_n, β^{-1})
where β is a hyperparameter representing the precision of the noise.
• Because the noise is independent, the joint distribution of
t = (t_1, ..., t_N)^T conditioned on y = (y_1, ..., y_N)^T is given by
  p(t | y) = N(t | y, β^{-1} I_N)
GP FOR REGRESSION
• From the definition of a GP, the marginal
distribution p(y) is given by
  p(y) = N(y | 0, K)
• The marginal distribution of t is given by
  p(t) = ∫ p(t | y) p(y) dy = N(t | 0, C)
• where the covariance matrix C has elements
  C(x_n, x_m) = k(x_n, x_m) + β^{-1} δ_{nm}
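Because the noise is independent of y(x), the two Gaussian covariances simply add: C differs from K only on the diagonal, by exactly β^{-1}. A quick numerical check (the kernel here is an arbitrary valid choice):

```python
import numpy as np

# C = K + beta^{-1} I: independent noise shifts only the diagonal, by 1/beta.
beta = 4.0
x = np.linspace(0.0, 1.0, 6)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)   # any valid kernel matrix
C = K + (1.0 / beta) * np.eye(len(x))

print(np.allclose(np.diag(C) - np.diag(K), 1.0 / beta))                     # True
print(np.allclose(C - np.diag(np.diag(C)), K - np.diag(np.diag(K))))        # True
```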
GP FOR REGRESSION
• We’ve used the GP to build a model of the joint
distribution over sets of data points
• Given training points t_N = (t_1, ..., t_N)^T at input values
x_1, ..., x_N, predict t_{N+1} for a new input x_{N+1}
• Goal: find the predictive distribution p(t_{N+1} | t_N)
• To find p(t_{N+1} | t_N), we begin by writing down the
joint distribution
  p(t_{N+1}) = N(t_{N+1} | 0, C_{N+1})
where t_{N+1} denotes (t_1, ..., t_{N+1})^T and C_{N+1} is the
(N+1) × (N+1) covariance matrix partitioned as
  C_{N+1} = [ C_N   k ]
            [ k^T   c ]
with C_N the N × N covariance matrix of the training targets,
k the vector with elements k(x_n, x_{N+1}) for n = 1, ..., N,
and the scalar c = k(x_{N+1}, x_{N+1}) + β^{-1}
GP FOR REGRESSION
• The conditional distribution p(t_{N+1} | t_N) is a Gaussian
distribution with mean and covariance given by
  m(x_{N+1}) = k^T C_N^{-1} t_N
  σ²(x_{N+1}) = c − k^T C_N^{-1} k
• These are the key results that define Gaussian process
regression.
• The predictive distribution is a Gaussian whose mean and
variance both depend on x_{N+1}.
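The two formulas m(x_{N+1}) = k^T C_N^{-1} t_N and σ²(x_{N+1}) = c − k^T C_N^{-1} k translate almost line for line into code. A minimal sketch (the function name `gp_predict`, the squared-exponential kernel, the length-scale `ell`, and the toy `sin` data are all assumptions made for the example):

```python
import numpy as np

def gp_predict(X, t, x_new, beta=25.0, ell=0.3):
    """GP regression prediction: mean k^T C^{-1} t, variance c - k^T C^{-1} k."""
    kern = lambda a, b: np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)
    C = kern(X, X) + (1.0 / beta) * np.eye(len(X))   # C_N = K + beta^{-1} I
    k = kern(X, x_new)                               # cross-covariances, shape (N, M)
    c = 1.0 + 1.0 / beta                             # k(x*, x*) + beta^{-1}
    Cinv_k = np.linalg.solve(C, k)                   # solve instead of explicit inverse
    mean = Cinv_k.T @ t
    var = c - np.sum(k * Cinv_k, axis=0)
    return mean, var

X = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
t = np.sin(X)                                        # noiseless toy targets
mean, var = gp_predict(X, t, np.array([1.0]))
print(mean, var)  # mean near sin(1.0); variance small and positive
```

Note the use of `np.linalg.solve` rather than forming C_N^{-1} explicitly, which is both cheaper and numerically safer.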
EPSILON SUPPORT VECTOR REGRESSION (ε-SVR)
• Given a data set {x_1, ..., x_n} with target values {u_1, ..., u_n}, we want to do ε-SVR
• The optimization problem is
• Similar to SVM, this can be solved as a quadratic programming problem
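The optimization problem itself was an image on the original slide and did not survive extraction. For reference, the standard soft-margin ε-SVR primal (a reconstruction of the usual formulation, with C the regularization constant and ξ, ξ* the slack variables, not necessarily the slide's exact notation) is:

```latex
\min_{\mathbf{w},\, b,\, \xi,\, \xi^*} \quad
  \frac{1}{2}\|\mathbf{w}\|^2 + C \sum_{i=1}^{n} \left(\xi_i + \xi_i^*\right)
\qquad \text{s.t.} \quad
\begin{cases}
u_i - \mathbf{w}^{\mathsf T}\phi(x_i) - b \le \epsilon + \xi_i, \\[2pt]
\mathbf{w}^{\mathsf T}\phi(x_i) + b - u_i \le \epsilon + \xi_i^*, \\[2pt]
\xi_i,\ \xi_i^* \ge 0, \qquad i = 1, \dots, n.
\end{cases}
```

Errors smaller than ε cost nothing (the ε-insensitive tube); larger deviations are penalized linearly through the slacks, which is what makes the dual a quadratic program like the SVM's.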
A DECISION TREE: CLASSIFICATION
(Decision-tree figure: internal nodes test attributes; the leaves are labeled Play or Don’t Play.)
INTRODUCTION
• Artificial Neural Networks (ANN)
• Information processing paradigm inspired by biological
nervous systems
• An ANN is composed of a system of neurons connected by
synapses
• ANNs learn by example
• Adjust synaptic connections between neurons
COMPARISON OF BRAINS AND
TRADITIONAL COMPUTERS
Brain:
• 200 billion neurons, 32 trillion synapses
• Element size: 10⁻⁶ m
• Energy use: 25 W
• Processing speed: 100 Hz
• Parallel, distributed
• Fault tolerant
• Learns: yes
• Intelligent/conscious: usually
Computer:
• 1 billion bytes RAM but trillions of bytes
on disk
• Element size: 10⁻⁹ m
• Energy use: 30-90 W (CPU)
• Processing speed: 10⁹ Hz
• Serial, centralized
• Generally not fault tolerant
• Learns: some
• Intelligent/conscious: generally no
BIOLOGICAL INSPIRATION
“My brain: It's my second favorite organ.”
- Woody Allen, from the movie Sleeper
Idea: to make computers more robust, intelligent, and able to learn,
let’s model our computer software (and/or hardware) after the brain
NEURONS IN THE BRAIN
• Although heterogeneous, at a low level the brain is
composed of neurons
• A neuron receives input from other neurons (generally
thousands) through its synapses
• Inputs are approximately summed
• When the input exceeds a threshold, the neuron sends an
electrical spike that travels from the cell body,
down the axon, to the next neuron(s)
LEARNING IN THE BRAIN
• Brains learn
• Altering strength between neurons
• Creating/deleting connections
• Hebb’s Postulate (Hebbian Learning)
• When an axon of cell A is near enough to excite a cell B and repeatedly
or persistently takes part in firing it, some growth process or metabolic
change takes place in one or both cells such that A's efficiency, as one of
the cells firing B, is increased.
• Long Term Potentiation (LTP)
• Cellular basis for learning and memory
• LTP is the long-lasting strengthening of the connection between two
nerve cells in response to stimulation
• Discovered in many regions of the cortex
PERCEPTRONS
• Initial proposal of connectionist networks
• Rosenblatt, 50’s and 60’s
• Essentially a linear discriminant composed of
nodes and weights
(Figure: inputs I1, I2, I3 with weights W1, W2, W3 feed a summation unit Σ, whose thresholded output is O.)
  O = 1 if Σ_i w_i I_i > 0, 0 otherwise
(threshold activation function)
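The thresholded weighted sum above is a few lines of code. The weights below are hand-picked for illustration to make the unit compute logical AND (using a constant bias input of 1); they are not from the slide:

```python
# A perceptron: O = 1 if sum_i w_i * I_i > 0, else 0.
def perceptron(inputs, weights):
    s = sum(w * i for w, i in zip(weights, inputs))
    return 1 if s > 0 else 0

# Hand-picked weights for I1, I2 and a constant bias input of 1:
# the weighted sum exceeds 0 only when both inputs are 1 (logical AND).
w_and = [1.0, 1.0, -1.5]
for i1 in (0, 1):
    for i2 in (0, 1):
        print(i1, i2, perceptron([i1, i2, 1], w_and))  # output 1 only for (1, 1)
```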
Multi-layer Networks and Perceptrons
- Have one or more
layers of hidden units.
- With two possibly
very large hidden
layers, it is possible to
implement any
function.
- Networks without a hidden
layer are called
perceptrons.
- Perceptrons are very
limited in what they can
represent, but this makes
their learning problem
much simpler.
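The classic illustration of both points above is XOR: no single perceptron can represent it, but one hidden layer of threshold units can. The weights below are hand-picked for the sketch, not learned:

```python
# XOR with one hidden layer of threshold units (hand-picked weights,
# shown to illustrate the representational power a hidden layer adds).
def step(s):
    return 1 if s > 0 else 0

def xor_net(i1, i2):
    h1 = step(i1 + i2 - 0.5)     # hidden unit 1 computes OR
    h2 = step(-i1 - i2 + 1.5)    # hidden unit 2 computes NAND
    return step(h1 + h2 - 1.5)   # output unit: AND of hidden units = XOR

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_net(a, b))  # XOR truth table: 0, 1, 1, 0
```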
ACTIVATION FUNCTION
• To apply the LMS learning rule, also known as the delta rule, we
need a differentiable activation function.
  w'_jk = w_jk + c I_j (T_k − O_k)
where O = f(Σ_j w_jk I_j) and f is the activation function.
Old (threshold):
  O = 1 if Σ_i w_i I_i > 0, 0 otherwise
New (sigmoid):
  O = 1 / (1 + e^(−Σ_i w_i I_i))
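A minimal sketch of the sigmoid unit and one delta-rule step (the learning rate `c`, target `T`, and the inclusion of the sigmoid derivative O(1 − O) in the update are the usual textbook choices, assumed here rather than read off the slide):

```python
import math

def sigmoid(s):
    # Differentiable replacement for the hard threshold.
    return 1.0 / (1.0 + math.exp(-s))

def delta_update(w, I, T, c=0.5):
    # One delta-rule step: nudge each weight by c * error * f'(net) * input,
    # where f'(net) = O * (1 - O) for the sigmoid.
    O = sigmoid(sum(wi * ii for wi, ii in zip(w, I)))
    w_new = [wi + c * (T - O) * O * (1 - O) * ii for wi, ii in zip(w, I)]
    return w_new, O

w = [0.0, 0.0]
w, O = delta_update(w, [1.0, 1.0], T=1.0)
print(O)  # 0.5 (sigmoid of 0) before the update
print(w)  # both weights nudged upward, toward the target
```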
NETWORK STRUCTURES
Feed-forward neural nets:
• Links can only go in one direction.
Recurrent neural nets:
• Links can go anywhere and form arbitrary topologies.
ANN ARCHITECTURE
(Figure: a feed-forward network with input layer x1, x2; hidden layer H1, H2, H3; and output/target layer y.)
Feed-forward Networks
• Arranged in layers.
• Arranged in layers.
• Each unit is linked only to units in the next layer.
• No units are linked within the same layer, back to
the previous layer, or skipping a layer.
• Computations can proceed uniformly from input
to output units.
• No internal state exists.
Feed-Forward Example
(Figure: inputs I1, I2 feed hidden units H3-H6, which feed output unit O7. Weights shown: W13 = -1, W24 = -1, W16 = 1, W25 = 1, W35 = 1, W46 = 1, W57 = 1, W67 = 1; unit thresholds t = -0.5, 1.5, and 0.5. Inputs skip a layer in this case.)
Recurrent Network
• The brain is not and cannot be a feed-forward
network.
• Allows activation to be fed back to the
previous unit.
• Internal state is stored in its activation level.
• Can become unstable
• Can oscillate.
• May take a long time to compute a stable output.
• Learning process is much more difficult.
• Can implement more complex designs.
• Can model certain systems with internal states.
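A one-unit toy makes the "internal state" and "can oscillate" points concrete: feeding a threshold unit's output back into itself with a negative weight produces a stable oscillation rather than a fixed point (the weights below are invented for the illustration):

```python
# A tiny recurrent unit: its own output is fed back as input, so it carries
# internal state; with a negative feedback weight, the state oscillates.
def step(s):
    return 1 if s > 0 else 0

w_in, w_back = 1.0, -2.0
state, trace = 0, []
for _ in range(6):
    state = step(1.0 * w_in + state * w_back)  # constant input of 1 plus feedback
    trace.append(state)

print(trace)  # [1, 0, 1, 0, 1, 0]: the unit never settles
```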
RECURRENT
NETWORK
Thank you