BACKPROPAGATION ALGORITHM
BY:
Amit Kumar
CONTENTS:
1. Introduction
2. Example of Backpropagation
3. Algorithm
4. Advantages
5. Disadvantages
6. Applications
7. Conclusion
Blackcollar4/23/2015 2
INTRODUCTION
 Backpropagation, an abbreviation for "backward
propagation of errors", is a common method of
training artificial neural networks.
The method calculates the gradient of a loss
function with respect to all the weights in the network.
The gradient is fed to the optimization method, which
in turn uses it to update the weights in an attempt to
minimize the loss function.
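The gradient-based update described above can be sketched numerically. This is a minimal toy example, not from the slides: a single weight, an assumed squared-error loss, and an assumed learning rate eta.

```python
# Hypothetical sketch: one gradient-descent step on a single weight.
# The loss L(w) = (target - w * x)**2 and the learning rate eta are
# assumptions for illustration, not taken from the slides.
def gradient_step(w, x, target, eta=0.1):
    output = w * x
    # dL/dw for the squared-error loss above
    grad = -2 * (target - output) * x
    # Move against the gradient to reduce the loss
    return w - eta * grad

w = 0.0
for _ in range(100):
    w = gradient_step(w, x=1.0, target=0.5)
# w now approaches 0.5, where the loss is minimized
```

Repeating the step drives the weight toward the value that minimizes the loss, which is the behavior the optimizer relies on.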
Backpropagation requires a known, desired output for
each input value in order to calculate the loss function
gradient.
 The backpropagation learning algorithm can be
divided into two phases:
 Propagation
 Weight update
In the propagation phase, the network forward-propagates
a training pattern and then backward-propagates the
error against the target, generating the deltas of all
output and hidden neurons.
In the weight update phase, each weight's gradient is
obtained by multiplying its output delta by its input
activation.
EXAMPLE OF BACKPROPAGATION
 Inputs xi arrive
through pre-
connected paths.
 Each input is weighted
with a real weight wi.
 The response of the
neuron is a nonlinear
function f of its
weighted inputs.
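The neuron described above can be sketched as follows. The sigmoid is an assumed choice for the nonlinearity f (the slides do not name f at this point, though the sigmoid appears later in the algorithm):

```python
import math

# Sketch of the neuron above: response f(sum of wi * xi),
# assuming the sigmoid as the nonlinearity f.
def neuron(inputs, weights):
    weighted_sum = sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-weighted_sum))  # sigmoid f
```

With a weighted sum of zero the sigmoid returns 0.5, and its output stays in (0, 1) for any input.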
Learning is the process of modifying the weights
in order to produce a network that performs some
desired function.
ALGORITHM
The algorithm proceeds by the following steps:
Steps:-
1. Randomly choose the initial weights.
2. For each training pattern, apply the inputs to the
network.
3. Calculate the output of every neuron, from the input
layer, through the hidden layer(s), to the output layer.
4. Calculate the error at the outputs, and use the
output error to compute error signals for pre-output
layers:
ErrorB = OutputB (1 - OutputB) (TargetB - OutputB)
5. Use the error signals to compute the weight
adjustments:
W+AB = WAB + (ErrorB x OutputA)
6. Apply the weight adjustments.
Where
W+AB is the new weight, WAB is the initial weight, and
OutputB (1 - OutputB) is the derivative of the sigmoid
activation function.
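The steps above can be sketched for a single sigmoid neuron B fed by input neurons A. This is an illustration under stated assumptions: the learning rate eta is assumed (the slide's update formula omits it, i.e. eta = 1), and the two-input training pattern is invented for demonstration.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Sketch of steps 2-6 for one output neuron B with inputs A.
# eta is an assumed learning rate; eta=1 matches the slide's formula.
def train_step(weights, inputs, target, eta=1.0):
    # Step 3: forward pass - OutputB = f(weighted sum of inputs)
    output_b = sigmoid(sum(w * a for w, a in zip(weights, inputs)))
    # Step 4: ErrorB = OutputB (1 - OutputB) (TargetB - OutputB)
    error_b = output_b * (1 - output_b) * (target - output_b)
    # Steps 5-6: W+AB = WAB + (ErrorB x OutputA), applied to each weight
    return [w + eta * error_b * a for w, a in zip(weights, inputs)]

random.seed(0)
weights = [random.uniform(-1, 1) for _ in range(2)]   # Step 1
for _ in range(2000):                                 # Step 2, repeated
    weights = train_step(weights, inputs=[1.0, 0.5], target=0.8)
```

After repeated updates the neuron's output for the training pattern approaches the target, since each adjustment moves the weights in the direction that reduces (TargetB - OutputB).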
Advantages
Backpropagation has many advantages:-
 It is fast, simple, and easy to program.
 It has no parameters to tune (except for the number of
inputs).
 It shifts the mindset of the learning-system designer
away from trying to design a learning algorithm that is
accurate over the entire input space.
 It requires no prior knowledge about the underlying
learner and so can be flexible.
Disadvantages
Disadvantages are:-
The actual performance of backpropagation on
a particular problem clearly depends on the
input data.
Backpropagation can be sensitive to noisy data
and outliers.
A fully matrix-based approach to backpropagation
over a mini-batch is more complex to implement.
Application
 Mapping character strings into phonemes so they can
be pronounced by a computer: a neural network is
trained to pronounce each letter of a word in a
sentence, given the three letters before and the three
letters after it in a window.
 Speech recognition.
 Character recognition.
 Face recognition.
Conclusion
 The backpropagation algorithm normally
converges reasonably fast. However, the actual
speed depends very much on the simulation
parameters and on the initial weight values.
Thank you!