Classification by Backpropagation
DEEP NEURAL NETWORK (DNN)
March 29, 2019
Bineesh Jose
Research Scholar
School of Computer Science
M G University
Kottayam
Contents
Perceptron
Multi-Layer Perceptron
MLP Contd.
Training MLP
Neural Network as a Classifier
Backpropagation
Backpropagation with 3 hidden layers
Gradient Descent Backpropagation
Gradient Descent
Backpropagation Example
Backpropagation Algorithm
Efficiency and Interpretability
Challenges in DNN training
HDR Using Backpropagation
Backpropagation with 4 hidden layers
Perceptron
Multi-Layer Perceptron
When an MLP contains two or more hidden layers, it is called a Deep Neural Network (DNN).
MLP Contd.
Multi-Layer Perceptron
• The inputs to the network correspond to the
attributes measured for each training tuple
• Inputs are fed simultaneously into the units
making up the input layer
• They are then weighted and fed simultaneously to
a hidden layer
• The number of hidden layers is arbitrary,
although usually only one
• The weighted outputs of the last hidden layer are
input to units making up the output layer, which
emits the network's prediction
• The network is feed-forward in that none of the
weights cycles back to an input unit or to an
output unit of a previous layer
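The feed-forward behaviour described above can be sketched in a few lines of NumPy. This is an illustrative toy, not the deck's example: the layer sizes and random weights are arbitrary choices. Each layer weights its inputs, sums them, and applies an activation, and no value ever flows backwards.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mlp_forward(x, layers):
    """Feed forward: each layer weights its inputs, sums, and activates."""
    a = x
    for W, b in layers:
        a = sigmoid(a @ W + b)
    return a

rng = np.random.default_rng(0)
# 4 input attributes -> one hidden layer of 5 units -> 3 output units
layers = [(rng.normal(size=(4, 5)), np.zeros(5)),
          (rng.normal(size=(5, 3)), np.zeros(3))]
output = mlp_forward(rng.normal(size=4), layers)
```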
Training MLP
• Training an MLP was a major challenge, because derivatives had to be handled at multiple layers.
• In the original MLP, each unit computes a weighted sum of its inputs,
z = w0x0 + w1x1 + . . . + wnxn = w^T x (1)
and passes it through a step activation function.
• Gradients cannot move on a flat surface: the step function has zero slope everywhere it is differentiable.
• The backpropagation algorithm was developed to solve this problem.
• The step function is replaced with a logistic (sigmoid) function, σ(z) = 1/(1 + e^(−z)), which has a usable gradient everywhere.
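The difference between the two activation functions can be seen numerically. The sketch below is my own illustration, not from the slides: the step function is flat on both sides of zero, while the logistic function has a nonzero derivative that gradient descent can follow, peaking at 0.25 when z = 0.

```python
import numpy as np

def step(z):
    # flat on both sides of 0: the derivative is zero wherever it exists
    return np.where(z >= 0, 1.0, 0.0)

def logistic(z):
    # smooth everywhere, so gradients can propagate
    return 1.0 / (1.0 + np.exp(-z))

def logistic_grad(z):
    # derivative of the logistic: sigma(z) * (1 - sigma(z)), at most 0.25
    s = logistic(z)
    return s * (1.0 - s)
```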
Neural Network as a Classifier
Strengths
• High tolerance to noisy data, and the ability to classify patterns it was not trained on
• Well suited to continuous-valued inputs and outputs
• Successful on a wide array of real-world data
• Techniques have recently been developed for extracting rules from trained neural networks
Weaknesses
• Long training time
• Requires a number of parameters that are typically best determined empirically, e.g. the network topology or structure
• Poor interpretability: it is difficult to interpret the symbolic meaning behind the learned weights
Backpropagation
• Backpropagation: a neural network learning algorithm.
• Initiated by psychologists and neurobiologists seeking to develop and test computational analogues of neurons.
• During the learning phase, the network learns by adjusting the weights so as to predict the correct class label of the input tuples.
• Also referred to as connectionist learning, due to the connections between units.
Backpropagation Contd.
Backpropagation with 3 hidden layers
Gradient Descent Backpropagation
This is gradient descent using reverse-mode autodiff.
• The error gradient along every connection weight is measured by propagating the error backwards from the output layer.
• First, a forward pass is performed: the output of every neuron in every layer is computed.
• The output error is measured.
• The algorithm then computes how much each neuron in the last hidden layer contributed to the output error.
• This is repeated backwards, layer by layer, until the input layer is reached.
• The last step is a gradient descent update on all connection weights, using the error gradients estimated in the previous steps.
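The steps above can be sketched for a one-hidden-layer network in NumPy. This is a minimal illustration under assumed toy data, layer sizes, and learning rate (none of these come from the deck): forward pass, output error, backward error propagation, then a gradient descent update on all weights.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
X = rng.normal(size=(8, 2))                       # 8 training tuples, 2 attributes
y = (X[:, 0] > 0).astype(float).reshape(-1, 1)    # toy target

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)     # one hidden layer of 4 units
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
lr = 1.0

losses = []
for _ in range(200):
    # 1. forward pass: compute the output of every neuron in every layer
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # 2. measure the output error
    err = out - y
    losses.append(float(np.mean(err ** 2)))
    # 3. propagate the error backwards through the layers
    d_out = err * out * (1 - out) / len(X)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # 4. gradient descent step on all connection weights
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)
```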
Gradient Descent
Mean Squared Error (MSE) cost function: MSE = (1/m) Σᵢ (ŷ⁽ⁱ⁾ − y⁽ⁱ⁾)², the average of the squared differences between the network's predictions ŷ⁽ⁱ⁾ and the targets y⁽ⁱ⁾ over the m training tuples.
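A minimal implementation of the MSE cost, assuming its usual definition (the average squared difference between targets and predictions):

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean Squared Error: the average squared difference between
    the target values and the network's predictions."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.mean((y_true - y_pred) ** 2))
```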
Backpropagation Example
Backpropagation Algorithm
Efficiency and Interpretability
Challenges in DNN training
Vanishing Gradients
• The gradients often get smaller and smaller as the error propagates towards the lower layers.
• This is typical of deep feed-forward (directed acyclic graph) models.
• As a result, the weights of the lower layers are barely updated, and training never converges to a good solution.
Exploding Gradients
• The gradients often get bigger and bigger during backpropagation.
• This is typical of recurrent (directed cyclic graph) models.
• The weights receive very large updates, and training diverges.
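The vanishing-gradient effect can be demonstrated with simple arithmetic (a toy illustration of mine, not from the slides): the logistic derivative is at most 0.25, so the chain rule multiplies in one factor of at most 0.25 per layer, and the gradient shrinks geometrically even in the best case.

```python
import numpy as np

def sigmoid_grad(z):
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1.0 - s)

# Chain rule through 10 sigmoid layers, taking the derivative at its
# maximum (z = 0, where it equals 0.25) — the most optimistic case.
grad = 1.0
per_layer = []
for layer in range(10):
    grad *= sigmoid_grad(0.0)
    per_layer.append(float(grad))
# After 10 layers the gradient has shrunk to 0.25**10, under a millionth
# of its original size: the lower layers barely learn.
```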
Handwritten Digit Recognition Using the MNIST dataset
HDR Using Backpropagation with 4 hidden layers
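A forward-pass sketch of such a 4-hidden-layer network. The layer sizes and the softmax output layer for the 10 digit classes are my illustrative choices; the slides do not specify the architecture, and a real MNIST run would of course train these weights rather than leave them random.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # subtract the row max for numerical stability before exponentiating
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(42)
# 784 inputs (28x28 MNIST pixels) -> 4 hidden layers -> 10 digit classes
sizes = [784, 256, 128, 64, 32, 10]
params = [(rng.normal(scale=1.0 / np.sqrt(m), size=(m, n)), np.zeros(n))
          for m, n in zip(sizes[:-1], sizes[1:])]

def predict(x):
    a = x
    for W, b in params[:-1]:
        a = sigmoid(a @ W + b)        # logistic activations in the hidden layers
    W, b = params[-1]
    return softmax(a @ W + b)         # class probabilities for digits 0-9

probs = predict(rng.random((5, 784)))  # 5 fake "images"
```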
Thank you!
