1
Machine learning:
Neural networks
2
Machine Learning
Machine learning involves adaptive mechanisms
that enable computers to learn from experience,
learn by example and learn by analogy. Learning
capabilities can improve the performance of an
intelligent system over time. The most popular
approaches to machine learning are artificial
neural networks and genetic algorithms. This
lecture is dedicated to neural networks.
3
• Heuristic Search
• Knowledge Based Systems (KBS)
• Neural networks (Perceptron)
• Genetic Algorithms (GAs)
In knowledge-based systems, the solution is a set of rules:
RULE Prescribes_Erythromycin
IF Diseases = strep_throat AND Allergy <> erythromycin
THEN Prescribes = erythromycin CNF 95
BECAUSE "Erythromycin is an effective treatment for a
strep throat if the patient is not allergic to it";
5
• The neuron as a simple computing element
• Perceptron
• Multilayer artificial neural networks
The artificial neuron
6
• Our brain can be considered as a highly complex, non-linear and parallel information-processing system.
• Information is stored and processed in a neural network simultaneously throughout the whole network, rather than at specific locations. In other words, in neural networks, both data and its processing are global rather than local.
• Learning is a fundamental and essential characteristic of biological neural networks. The ease with which they can learn led to attempts to emulate a biological neural network in a computer.
7
Biological neural network
[Figure: two biological neurons, each with a soma, dendrites, and an axon; synapses link the axon of one neuron to the dendrites of the next.]
8
Santiago Ramón y Cajal (1852-1934)
“Beautiful Brain: The Drawings of Santiago Ramon y Cajal”, L. Swanson, 2017
Ramón y Cajal refined Golgi's staining technique and, by obtaining the sharpest images yet, revolutionized neuroscience.
His theory that neurons were individual brain cells helped explain how these individual cells send and receive information directionally across the gap between them, passing signals from long appendages called axons to the branched dendrites.
9
• The neurons are connected by weighted links passing signals from one neuron to another.
• The human brain incorporates nearly 10 billion neurons and 60 trillion connections, synapses, between them. By using multiple neurons simultaneously, the brain can perform its functions much faster than the fastest computers in existence today.
Artistic visualization of brain activity:
https://www.improvisedlife.com/2017/04/12/beautiful-visualizations-of-your-brain-actively-perceiving-itself/
http://www.humanconnectomeproject.org/
12
• An artificial neural network can be defined as a model of reasoning based on the human brain. The brain consists of a densely interconnected set of nerve cells, or basic information-processing units, called neurons.
• An artificial neural network consists of a number of very simple processors, also called neurons, which are analogous to the biological neurons in the brain.
13
Architecture of a typical artificial neural network
[Figure: input signals flow through the input layer, the middle layer, and the output layer, and emerge as output signals.]
14
Analogy between biological and
artificial neural networks
[Figure: a biological neuron (soma, dendrites, axon, synapses) shown alongside a layered artificial network (input layer, middle layer, output layer; input signals in, output signals out).]
15
• Each neuron has a simple structure, but an army of such elements constitutes tremendous processing power.
• A neuron consists of a cell body, the soma, a number of fibers called dendrites, and a single long fiber called the axon.
16
The neuron as a simple computing element
Diagram of a neuron
[Figure: neuron Y receives input signals x1, x2, …, xn through weighted links w1, w2, …, wn and sends the same output signal Y along several outgoing branches.]
17
The Perceptron
• The operation of Rosenblatt’s perceptron is based on the McCulloch and Pitts neuron model. The model consists of a linear combiner followed by a hard limiter.
• The weighted sum of the inputs is applied to the hard limiter (a step or sign function).
18
[Figure: single-layer two-input perceptron. Inputs x1 and x2, scaled by weights w1 and w2, feed a linear combiner; the combiner's output, minus the threshold θ, passes through a hard limiter to produce the output Y.]
19
• The neuron computes the weighted sum of the input signals and compares the result with a threshold value, θ. If the net input is less than the threshold, the neuron output is −1; if the net input is greater than or equal to the threshold, the neuron becomes activated and its output attains the value +1.
• The neuron uses the following transfer, or activation, function:

X = x1w1 + x2w2 + … + xnwn

Y = +1 if X ≥ θ
Y = −1 if X < θ

• This type of activation function is called a sign function.
20
A simple neuron:

Y = sign(x1 * w1 + x2 * w2 - threshold);

static int sign(double x) {
    /* hard limiter: -1 for negative net input, +1 otherwise */
    return (x < 0.0) ? -1 : 1;
}
21
Possible Activation functions
22
Exercise
[Figure: single-layer two-input perceptron with linear combiner, threshold θ, and hard limiter, as before.]
A neuron uses a step function as its activation function, has threshold θ = 0.2, and has w1 = 0.1, w2 = 0.1.
What is the output for the following values of x1 and x2? Do you recognize this?

x1  x2  Y
0   0
0   1
1   0
1   1

Can you find weights that will give an OR function? An XOR function?
23
Can a single neuron learn a task?
• In 1958, Frank Rosenblatt introduced a training algorithm that provided the first procedure for training a simple ANN: a perceptron.
• The perceptron is the simplest form of a neural network. It consists of a single neuron with adjustable synaptic weights and a hard limiter.
24
How does the perceptron learn its classification tasks?
This is done by making small adjustments to the weights to reduce the difference between the actual and desired outputs of the perceptron. The initial weights are randomly assigned, usually in the range [−0.5, 0.5], and then updated to obtain an output consistent with the training examples.
25
• If at iteration p the actual output is Y(p) and the desired output is Yd(p), then the error is given by:

e(p) = Yd(p) − Y(p)    where p = 1, 2, 3, …

Iteration p here refers to the pth training example presented to the perceptron.
• If the error, e(p), is positive, we need to increase the perceptron output Y(p); if it is negative, we need to decrease Y(p).
26
The perceptron learning rule

wi(p + 1) = wi(p) + α · xi(p) · e(p)    where p = 1, 2, 3, …

α is the learning rate, a positive constant less than unity.
The perceptron learning rule was first proposed by Rosenblatt in 1960. Using this rule we can derive the perceptron training algorithm for classification tasks.
27
Perceptron's training algorithm
Step 1: Initialisation
Set initial weights w1, w2, …, wn and threshold θ to random numbers in the range [−0.5, 0.5].
28
Perceptron's training algorithm (continued)
Step 2: Activation
Activate the perceptron by applying inputs x1(p), x2(p), …, xn(p) and desired output Yd(p).
Calculate the actual output at iteration p = 1:

Y(p) = step[ x1(p)w1(p) + x2(p)w2(p) + … + xn(p)wn(p) − θ ]

where n is the number of perceptron inputs and step is a step activation function.
29
Perceptron's training algorithm (continued)
Step 3: Weight training
Update the weights of the perceptron:

wi(p + 1) = wi(p) + Δwi(p)

where Δwi(p) is the weight correction at iteration p.
The weight correction is computed by the delta rule:

Δwi(p) = α · xi(p) · e(p),    with e(p) = Yd(p) − Y(p)

Step 4: Iteration
Increase iteration p by one, go back to Step 2 and repeat the process until convergence.
30
Exercise: Complete the first two epochs of training the perceptron to learn the logical AND function. Assume θ = 0.2, α = 0.1.

Epoch  x1  x2  Yd  w1   w2    Y  err  w1   w2
1      0   0   0   0.3  -0.1
2
31
• The perceptron in effect classifies the inputs x1, x2, …, xn into one of two classes, say A1 and A2.
• In the case of an elementary perceptron, the n-dimensional space is divided by a hyperplane into two decision regions. The hyperplane is defined by the linearly separable function:

x1w1 + x2w2 + … + xnwn − θ = 0
32
• If you proceed with the algorithm described above, termination occurs with w1 = 0.1 and w2 = 0.1.
• This means that the decision line is
0.1x1 + 0.1x2 = 0.2
or
x1 + x2 − 2 = 0
• The region below the line is where
x1 + x2 − 2 < 0
and the region above the line is where
x1 + x2 − 2 ≥ 0
33
Linear separability in the perceptron
[Figure: (a) a two-input perceptron separates classes A1 and A2 with the decision line x1w1 + x2w2 − θ = 0; (b) a three-input perceptron separates them with the plane x1w1 + x2w2 + x3w3 − θ = 0.]
34
Perceptron Capability
• Independence of the activation function
(Shynk, 1990; Shynk and Bernard, 1992)
35
General Network Structure
• The output signal is transmitted through the
neuron’s outgoing connection. The outgoing
connection splits into a number of branches that
transmit the same signal. The outgoing branches
terminate at the incoming connections of other
neurons in the network.
• Trained by the back-propagation algorithm
• Overcomes the proven limitations of the single-layer perceptron
36
Architecture of a typical artificial neural network
[Figure: input signals flow through the input layer, the middle layer, and the output layer, and emerge as output signals.]
37
Training a neural network in nature:
38
https://www.youtube.com/watch?v=ehno85yI-sA
Starfish Self Modeling Robot
Training an artificial neural network:
39
https://www.youtube.com/watch?v=rVlhMGQgDkY
Exercise for Friday:
Try a simple implementation of the perceptron in C.
- Follow the README document and try the examples given at the site:
ftp://ftp.cs.brown.edu/u/tld/Supplement/05.Learning/C++Code/Perceptron
- Download all the files and compile them using the included Makefile.
- To compile the package with Dev C++, see the answer at:
https://stackoverflow.com/questions/13544684/running-a-project-in-dev-c