Perceptron
Compiled by : Dr. Kumud Kundu
Perceptron
A perceptron is a simple model of a biological neuron
in an artificial neural network.
A single layer perceptron (SLP)
• Introduced by Frank Rosenblatt in 1957
• Used as a supervised binary classifier
• A feed-forward network
• The decision function is a step function, so the output is binary.
• The SLP is the simplest type of artificial neural network and can only
classify linearly separable cases with a binary target (1, 0).
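The SLP's decision rule can be written as a short sketch (the weights and bias below are illustrative values chosen by hand, not part of the original slides):

```python
# Minimal sketch of a single-layer perceptron's decision rule.

def perceptron(inputs, weights, bias):
    """Return 1 if the weighted sum plus bias is positive, else 0."""
    weighted_sum = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if weighted_sum > 0 else 0

# Example: a perceptron computing logical AND, a linearly separable case.
# It fires only when both inputs are 1 (0.5 > 0); otherwise the sum is <= -0.5.
and_weights, and_bias = [1.0, 1.0], -1.5
print(perceptron([1, 1], and_weights, and_bias))
```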
Perceptron-Binary Classifier
A perceptron is a single-layer neural network,
usually used to classify data into two classes.
• A perceptron cannot handle tasks that are not linearly separable.
• Sets of points in 2-D space are linearly separable if the
sets can be separated by a straight line.
• A perceptron cannot find weights for classification problems
that are not linearly separable.
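A small sketch of what "separable by a straight line" means in 2-D, using AND as the separable case (the line x + y − 1.5 = 0 is an illustrative choice, not from the slides):

```python
# Two point sets in 2-D are linearly separable if some line
# w1*x + w2*y + b = 0 puts each class entirely on one side.

def side(point, w1, w2, b):
    x, y = point
    return w1 * x + w2 * y + b > 0

# For AND, the line x + y - 1.5 = 0 separates the classes:
positives = [(1, 1)]                    # AND outputs 1
negatives = [(0, 0), (0, 1), (1, 0)]    # AND outputs 0
assert all(side(p, 1, 1, -1.5) for p in positives)
assert not any(side(p, 1, 1, -1.5) for p in negatives)

# For XOR (positives (0,1),(1,0); negatives (0,0),(1,1)) no such line
# exists, so a single perceptron cannot classify it.
```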
Role of Bias
• Bias is like the intercept added in a linear equation.
• It is an additional parameter in the neural network, used to
adjust the output along with the weighted sum of the inputs to the
neuron.
• Bias is a constant that helps the model fit the given data best.
A bias value allows you to shift the activation function curve up or down.
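A quick illustration of that shift (the numbers are made-up values): with the weighted sum held fixed, changing the bias moves the point at which the step function activates.

```python
# Sketch: bias shifts the firing threshold of a step-activated neuron.

def fires(weighted_sum, bias):
    return 1 if weighted_sum + bias > 0 else 0

# Same weighted sum, different biases:
assert fires(1.0, bias=0.0) == 1   # fires
assert fires(1.0, bias=-2.0) == 0  # a more negative bias raises the threshold
```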
Role of Activation Function
• A function that decides the activation (firing) of a neuron
• It is attached to each neuron in the network
• It helps the network learn complex patterns in the data.
• Different layers can have different activation functions
• The higher the value of the activation function, the more information is passed
to the next connected neuron.
• For example, the perceptron uses the step rule as its activation function, which
converts the numerical output into +1 or -1.
The figure shows how the step activation function squashes the weighted sum of the inputs
to either +1 or -1.
Step activation function helps in the discrimination of two linearly separable classes.
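The bipolar step rule described above can be sketched in a couple of lines:

```python
# Sketch of the step activation used by the perceptron: it squashes
# the weighted sum to +1 or -1 (the bipolar variant shown in the figure).

def step(weighted_sum):
    return 1 if weighted_sum >= 0 else -1

assert step(2.7) == 1
assert step(-0.3) == -1
```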
https://www.simplilearn.com/what-is-perceptron-tutorial
http://hagan.okstate.edu/4_Perceptron.pdf
To determine the appropriate connection weights, the common procedure is to have the network learn them
from a representative set of training data.
“Input times weights, add bias, and activate”
Computation of final weights after processing the training set
(learning rate η = 1; New Weights = Old Weights + η·(Actual − Predicted)·Instance):

| Instance (X) | Weights (W) | Weighted Sum ∑W(i)X(i) | Actual Target | Predicted Target (1 if ∑W(i)X(i) > 0, else 0) | New Weights = W + η·(Actual − Predicted)·X |
|---|---|---|---|---|---|
| 0 0 1 1 | 0 0 0 0 | 0 | 0 | 0 | [0 0 0 0] + 1·0·[0 0 1 1] = [0 0 0 0] |
| 1 1 1 1 | 0 0 0 0 | 0 | 1 | 0 | [0 0 0 0] + 1·1·[1 1 1 1] = [1 1 1 1] |
| 1 0 1 1 | 1 1 1 1 | 3 | 1 | 1 | [1 1 1 1] + 1·0·[1 0 1 1] = [1 1 1 1] |
| 0 1 1 1 | 1 1 1 1 | 3 | 0 | 1 | [1 1 1 1] + 1·(−1)·[0 1 1 1] = [1 0 0 0] |
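The same single pass over the four training instances can be reproduced with a short training-loop sketch (instances, targets, and η = 1 are taken from the table; the function names are ours):

```python
# Perceptron learning rule: new_w = old_w + eta * (target - predicted) * x.

def predict(weights, inputs):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) > 0 else 0

def train_epoch(weights, samples, eta=1):
    for inputs, target in samples:
        error = target - predict(weights, inputs)
        weights = [w + eta * error * x for w, x in zip(weights, inputs)]
    return weights

samples = [([0, 0, 1, 1], 0),
           ([1, 1, 1, 1], 1),
           ([1, 0, 1, 1], 1),
           ([0, 1, 1, 1], 0)]

final = train_epoch([0, 0, 0, 0], samples)
print(final)  # [1, 0, 0, 0] after one pass, as in the table
```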
Multi Layer Perceptron or Feed-forward
Neural Network
There are two types of Perceptrons: Single layer and Multilayer.
Single layer Perceptrons can learn only linearly separable patterns.
Multilayer perceptrons, or feedforward neural networks with two or more layers, have greater
processing power.
Points of Difference

| # | Single Layer Perceptron | Multi Layer Perceptron |
|---|---|---|
| 1 | Does not contain any hidden layer | Contains one or more hidden layers |
| 2 | Can learn only linear functions | Can learn both linear and non-linear functions |
| 3 | Requires fewer training inputs | Requires more training inputs |
| 4 | Learning is faster | Learning is slower |
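As a sketch of the extra processing power, a two-layer perceptron with fixed, hand-chosen weights (our illustrative values) can compute XOR, which no single-layer perceptron can:

```python
# A tiny two-layer perceptron computing XOR. The hidden units compute
# OR and AND; the output unit computes "OR and not AND", i.e. XOR.

def step(s):
    return 1 if s > 0 else 0

def xor_mlp(x1, x2):
    h_or = step(x1 + x2 - 0.5)       # hidden unit 1: OR
    h_and = step(x1 + x2 - 1.5)      # hidden unit 2: AND
    return step(h_or - h_and - 0.5)  # output: OR and not AND

assert [xor_mlp(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 1, 1, 0]
```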
