3. A single layer perceptron (SLP)
• Introduced by Frank Rosenblatt in 1957
• Used as a supervised binary classifier
• A feed-forward network.
• The decision function is a step function and the output is binary.
• The SLP is the simplest type of artificial neural network and can only
classify linearly separable cases with a binary target (1, 0).
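The bullets above can be sketched as a few lines of code. This is a minimal illustration, not Rosenblatt's original formulation; the AND weights are chosen by hand to show a linearly separable case.

```python
import numpy as np

def slp_predict(x, w, b):
    """Single-layer perceptron: weighted sum of inputs plus bias, then a step function."""
    return 1 if np.dot(w, x) + b > 0 else 0

# Hand-picked weights that realise logical AND, a linearly separable
# function of two binary inputs (illustrative values, not learned).
w = np.array([1.0, 1.0])
b = -1.5
print([slp_predict(x, w, b) for x in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# → [0, 0, 0, 1]
```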
5. • The perceptron can’t handle tasks that are not linearly separable.
• Sets of points in 2-D space are linearly separable if the
sets can be separated by a straight line.
• Perceptron can’t find weights for classification problems
that are not linearly separable.
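A brute-force sketch can make this concrete: searching a small grid of candidate lines finds one that separates the AND pattern, but none that separates XOR. The grid and point set here are illustrative assumptions.

```python
import itertools

def separable(targets):
    """Brute-force search for a line w1*x + w2*y + b = 0 that splits the
    four corners of the unit square into the given 0/1 target labels."""
    points = [(0, 0), (0, 1), (1, 0), (1, 1)]
    grid = [v / 2 for v in range(-4, 5)]  # candidate coefficients -2.0 .. 2.0
    for w1, w2, b in itertools.product(grid, repeat=3):
        preds = [1 if w1 * px + w2 * py + b > 0 else 0 for px, py in points]
        if preds == targets:
            return True
    return False

print(separable([0, 0, 0, 1]))  # AND is linearly separable → True
print(separable([0, 1, 1, 0]))  # XOR is not → False
```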
6. Role of Bias
• Bias is like the intercept added in a linear equation.
• It is an additional parameter in the Neural Network which is used to
adjust the output along with the weighted sum of the inputs to the
neuron.
• Bias is a constant that helps the model fit the given data better.
A bias value allows you to shift the activation function curve up or down.
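A short sketch of this shifting effect, with illustrative weight and bias values: adding a positive bias moves the firing threshold of a step-activated neuron to the left, so inputs that previously stayed below threshold now fire.

```python
def step(z):
    """Step activation: the neuron fires (1) when its input is positive."""
    return 1 if z > 0 else 0

def neuron(x, w, b):
    """One neuron: weighted input plus bias, passed through the step function."""
    return step(w * x + b)

# With zero bias the neuron fires only for positive inputs;
# a bias of +1 shifts the firing threshold from 0 down to -1.
print([neuron(x, 1.0, 0.0) for x in (-2, -0.5, 0.5, 2)])  # → [0, 0, 1, 1]
print([neuron(x, 1.0, 1.0) for x in (-2, -0.5, 0.5, 2)])  # → [0, 1, 1, 1]
```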
7. Role of Activation Function
• A function which decides the activation(firing) of neuron
• It is attached to each neuron in the network
• It helps the network learn complex patterns in the data.
• Different layers can have different activation functions
• The higher the value of the activation function, the more information is
passed to the next connected neuron.
• For example: the perceptron uses the step rule as its activation function,
which converts the numerical output into +1 or -1.
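The step rule from the example can be written directly (a minimal sketch; the sample inputs are arbitrary):

```python
def step_sign(z):
    """Perceptron step rule: map the weighted sum to +1 or -1."""
    return 1 if z >= 0 else -1

# Negative sums map to -1, non-negative sums to +1.
print([step_sign(z) for z in (-3.2, -0.1, 0.0, 2.5)])  # → [-1, -1, 1, 1]
```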
8. The figure shows how the step activation function squashes the weighted sum of
the inputs to either +1 or -1.
Step activation function helps in the discrimination of two linearly separable classes.
https://www.simplilearn.com/what-is-perceptron-tutorial
10. To determine the appropriate connection weights, the common procedure is to have the network learn the appropriate
weights from a representative set of training data.
“Input times weights, add bias and activate”
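The mantra above, combined with learning the weights from training data, is the perceptron learning rule. A minimal sketch on the linearly separable OR function; the learning rate of 0.1 and the 10 epochs are illustrative choices, not from the slides.

```python
import numpy as np

# Training data for logical OR (linearly separable).
X = np.array([(0, 0), (0, 1), (1, 0), (1, 1)], dtype=float)
y = np.array([0, 1, 1, 1])

w = np.zeros(2)
b = 0.0
for _ in range(10):  # epochs
    for xi, target in zip(X, y):
        pred = 1 if np.dot(w, xi) + b > 0 else 0  # input times weights, add bias, activate
        error = target - pred
        w += 0.1 * error * xi                     # nudge weights toward the target
        b += 0.1 * error

print([1 if np.dot(w, xi) + b > 0 else 0 for xi in X])  # → [0, 1, 1, 1]
```

On a linearly separable set like OR, this update rule is guaranteed to converge to a separating line in a finite number of steps.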
13. Multi Layer Perceptron or Feed-forward
Neural Network
There are two types of Perceptrons: Single layer and Multilayer.
Single layer Perceptrons can learn only linearly separable patterns.
Multilayer Perceptrons, or feedforward neural networks with two or more layers, have greater
processing power.
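That extra processing power can be shown on XOR, the classic function a single-layer perceptron cannot learn. Below is a hand-wired two-layer sketch (weights chosen by hand for illustration, not learned): the hidden layer computes OR and AND, and the output fires for "OR and not AND".

```python
import numpy as np

def step(z):
    """Step activation: fires (1) when the input is positive."""
    return (z > 0).astype(int)

W1 = np.array([[1.0, 1.0],   # hidden unit 1: x1 OR x2
               [1.0, 1.0]])  # hidden unit 2: x1 AND x2
b1 = np.array([-0.5, -1.5])
W2 = np.array([1.0, -1.0])   # output: OR minus AND
b2 = -0.5

def mlp_xor(x):
    h = step(W1 @ np.asarray(x, dtype=float) + b1)  # hidden layer
    return 1 if W2 @ h + b2 > 0 else 0              # output layer

print([mlp_xor(x) for x in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# → [0, 1, 1, 0]
```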
14. Points of Difference: Single Layer Perceptron vs. Multi Layer Perceptron
1. Hidden layers: the single layer perceptron contains no hidden layer; the multi layer perceptron contains one or more hidden layers.
2. Functions learned: the single layer perceptron can learn only linear functions; the multi layer perceptron can learn both linear and non-linear functions.
3. Training data: the single layer perceptron requires less training data; the multi layer perceptron requires more.
4. Speed: learning is faster in the single layer perceptron and slower in the multi layer perceptron.