Back-Propagation Algorithm
AN INTRODUCTION TO LEARNING INTERNAL
REPRESENTATIONS BY ERROR PROPAGATION
Presented by:
Kunal Parmar
UHID: 1329834
Outline of the Presentation
• Introduction
• Historical Background
• Perceptron
• Back propagation algorithm
• Limitations and improvements
• Questions and Answers
Introduction
• Artificial Neural Networks are crude attempts to model the massively
parallel and distributed processing we believe takes place in the
brain.
• Back propagation, an abbreviation for "backward propagation of errors",
is a common method of training artificial neural networks used in
conjunction with an optimization method such as gradient descent.
• The method calculates the gradient of a loss function with respect to all
the weights in the network.
• The gradient is fed to the optimization method which in turn uses it to
update the weights, in an attempt to minimize the loss function.
Why do we need multi-layer neural networks?
• Some input and output patterns can be easily learned by single-layer
neural networks (i.e. perceptrons). However, a single-layer
perceptron cannot learn some relatively simple patterns, such as those
that are not linearly separable.
• A single-layer neural network can’t learn abstract features of the input,
since it is limited to having only one layer. A multi-layered network
overcomes this limitation: it can create internal representations and
learn different features in each layer.
• Each higher layer learns more and more abstract features that can be
used to classify the input. Each layer finds patterns in the layer below it,
and it is this ability to create internal representations that are
independent of outside input that gives multi-layered networks their
power.
Historical Background
• Early attempts to implement artificial neural networks: McCulloch
(Neuroscientist) and Pitts (Logician) (1943)
• Based on simple neurons (MCP neurons)
• Based on logical functions
• Donald Hebb (1949) proposed the following hypothesis in his book “The
Organization of Behavior”:
• “Neural pathways are strengthened every time they are used.”
• Frank Rosenblatt (1958) created the perceptron, an algorithm for
pattern recognition based on a two-layer computer learning
network using simple addition and subtraction.
[1]
Historical Background
• However, neural network research stagnated when Minsky and
Papert (1969) criticized the idea of the perceptron, identifying two
key issues in neural networks:
• A single-layer perceptron could not solve the XOR problem.
• Training time grows exponentially with the size of the input.
• Neural network research slowed until computers achieved greater
processing power. Another key advance that came later was
the back propagation algorithm which effectively solved the
exclusive-OR problem (Werbos 1975).
[1]
Perceptron
• It is a step function based on a linear combination of real-valued
inputs. If the combination is above a threshold it outputs a 1;
otherwise it outputs a –1.
• A perceptron can only learn examples that are linearly separable.
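The unit just described can be sketched in a few lines. The weights below are illustrative (they implement logical AND, a linearly separable function), not values from the slides:

```python
# A minimal sketch of the perceptron unit described above: a step function
# on a linear combination of inputs, with a bias input x0 = 1 weighted by w0.
# The example weights are hypothetical.

def perceptron(x, w, w0):
    """Return 1 if w0 + w·x is above the threshold 0, otherwise -1."""
    s = w0 + sum(wi * xi for wi, xi in zip(w, x))
    return 1 if s > 0 else -1

# Weights implementing logical AND over inputs in {0, 1}
# (AND is linearly separable, so a single perceptron can represent it):
w, w0 = [1.0, 1.0], -1.5
print(perceptron([1, 1], w, w0))  # 1
print(perceptron([1, 0], w, w0))  # -1
```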
(Figure: a perceptron unit with inputs x1, x2, …, xn and a bias input
x0 = 1, weights w0, w1, …, wn feeding a summation unit Σ, whose
thresholded output is 1 or –1.) [3]
Delta Rule
• The delta rule is used for calculating the gradient that is used for
updating the weights.
• We will try to minimize the following error:
• E = ½ Σi (ti – oi)²
• For a new training example X = (x1, x2, …, xn), update each weight
according to this rule:
• wi = wi + Δwi
where Δwi = –η ∂E/∂wi
Delta Rule
• Taking the derivative gives:
• ∂E/∂wi = Σi (ti – oi)(–xi)
• So that gives us the following equation:
• ∆wi = η Σi (ti – oi) xi
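The update rule above can be sketched for a single linear unit. The training data and learning rate here are made up for illustration:

```python
# Illustrative delta-rule batch update for one linear unit, following
# Δwi = η Σ (ti - oi) xi from the slide. Data and η are hypothetical.

def delta_rule_update(w, examples, eta):
    """One batch update; examples is a list of (x, t) pairs, o = w·x."""
    dw = [0.0] * len(w)
    for x, t in examples:
        o = sum(wi * xi for wi, xi in zip(w, x))  # linear output
        for i, xi in enumerate(x):
            dw[i] += eta * (t - o) * xi           # accumulate the step
    return [wi + dwi for wi, dwi in zip(w, dw)]

w = [0.0, 0.0]
examples = [([1.0, 0.0], 1.0), ([0.0, 1.0], -1.0)]
for _ in range(50):
    w = delta_rule_update(w, examples, eta=0.1)
# w approaches [1, -1], which fits both examples exactly
```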
Multi-layer neural networks
• In contrast to perceptrons, multilayer neural networks can not only
learn multiple decision boundaries, but the boundaries may be
nonlinear.
(Figure: a multi-layer network with input nodes, internal nodes, and
output nodes.) [3]
Learning non-linear boundaries
• To make nonlinear partitions on the space we need to define each
unit as a nonlinear function (unlike the perceptron). One solution is
to use the sigmoid (logistic) function. So,
• We use the sigmoid function because of the following property:
• d σ(y) / dy = σ(y) (1 – σ(y))
O(x1,x2,…,xn) = σ(WX)
where: σ(WX) = 1 / (1 + e^(–WX))
[3]
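The sigmoid and the derivative property quoted above can be checked numerically; the test point y = 0.5 below is arbitrary:

```python
import math

# The sigmoid σ(y) = 1 / (1 + e^(-y)) and its derivative, using the
# identity σ'(y) = σ(y)(1 - σ(y)) from the slide.

def sigmoid(y):
    return 1.0 / (1.0 + math.exp(-y))

def sigmoid_deriv(y):
    s = sigmoid(y)
    return s * (1.0 - s)

# Verify the identity against a finite-difference approximation at y = 0.5
h = 1e-6
numeric = (sigmoid(0.5 + h) - sigmoid(0.5 - h)) / (2 * h)
assert abs(numeric - sigmoid_deriv(0.5)) < 1e-6
```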
Back propagation Algorithm
• The back propagation learning algorithm can be divided into two phases:
• Phase 1: Propagation
• Forward propagation of a training pattern's input through the neural network in
order to generate the propagation's output activations.
• Backward propagation of the propagation's output activations through the neural
network using the training pattern target in order to generate the deltas (the
difference between the target and actual output values) of all output and hidden neurons.
• Phase 2: Weight update
• Multiply its output delta and input activation to get the gradient of the weight.
• Subtract a ratio (percentage) of the gradient from the weight.
Back propagation Algorithm (contd.)
[2]
What is the gradient descent algorithm?
• Back propagation calculates the gradient of the error of the
network with respect to the network's modifiable weights.
• This gradient is almost always used in a simple stochastic gradient
descent algorithm to find weights that minimize the error.
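Gradient descent itself is a simple iterated update. As a sketch, here is descent on a toy error surface E(w1, w2) = w1² + w2² (a stand-in for the bowl-shaped surface in the figure; the starting point and η are arbitrary):

```python
# Gradient descent on a toy error surface E(w1, w2) = w1^2 + w2^2.
# Each step moves the weights opposite the gradient, scaled by η.

def grad_E(w):
    return [2 * w[0], 2 * w[1]]  # ∂E/∂w1, ∂E/∂w2

w, eta = [2.0, -1.5], 0.1
for _ in range(100):
    g = grad_E(w)
    w = [wi - eta * gi for wi, gi in zip(w, g)]
# w approaches the minimum at (0, 0)
```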
(Figure: the error surface E(W) plotted over two weights w1 and w2.) [3]
Propagating forward
• Given example X, compute the output of every node until we reach
the output nodes:
(Figure: an example X is presented at the input nodes; each internal
node computes the sigmoid function of its inputs, layer by layer, until
the output nodes are reached.) [3]
Propagating Error Backward
• For each output node k compute the error:
δk = Ok (1-Ok)(tk – Ok)
• For each hidden unit h, calculate the error:
δh = Oh (1-Oh) Σk Wkh δk
• Update each network weight:
Wji = Wji + Δwji
where Δwji = η δj Xji (Wji is the weight from node i to
node j, and Xji is the corresponding input)
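The three rules above can be sketched for a tiny network with one hidden layer of sigmoid units. The network size, initial weights, training example, and η below are all illustrative:

```python
import math

# One training step for a network with a single hidden layer of sigmoid
# units, implementing the delta formulas from the slide. All concrete
# values (weights, input, target, η) are hypothetical.

def sigmoid(y):
    return 1.0 / (1.0 + math.exp(-y))

def train_step(x, t, W_hid, W_out, eta):
    """One forward/backward pass; weights are updated in place."""
    # Forward pass
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W_hid]
    o = [sigmoid(sum(w * hi for w, hi in zip(row, h))) for row in W_out]
    # Output deltas: δk = Ok (1 - Ok)(tk - Ok)
    d_out = [ok * (1 - ok) * (tk - ok) for ok, tk in zip(o, t)]
    # Hidden deltas: δh = Oh (1 - Oh) Σk Wkh δk
    d_hid = [hj * (1 - hj) * sum(W_out[k][j] * d_out[k]
                                 for k in range(len(o)))
             for j, hj in enumerate(h)]
    # Weight updates: ΔWji = η δj Xji
    for k in range(len(W_out)):
        for j in range(len(h)):
            W_out[k][j] += eta * d_out[k] * h[j]
    for j in range(len(W_hid)):
        for i in range(len(x)):
            W_hid[j][i] += eta * d_hid[j] * x[i]
    return o[0]

# Repeated steps on one example drive the output toward the target:
W_hid = [[0.5, -0.3], [0.2, 0.8]]
W_out = [[0.4, -0.6]]
for _ in range(1000):
    out = train_step([1.0, 0.0], [1.0], W_hid, W_out, eta=0.5)
# out is now close to the target 1.0
```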
Number of Hidden Units
• The number of hidden units is related to the complexity of the
decision boundary.
• If the examples are easy to discriminate, a few nodes are enough.
Conversely, complex problems require many internal nodes.
• A rule of thumb is to choose roughly m / 10 weights, where m is
the number of training examples.
Learning Rates
• Different learning rates affect the performance of a neural network
significantly.
• Optimal Learning Rate:
• Leads to the error minimum in one learning step.
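The one-step claim holds exactly for a quadratic error. For E(w) = (a/2)·w², the gradient is a·w, and the rate η = 1/a reaches the minimum in a single update (a and the starting point below are arbitrary):

```python
# For a quadratic error E(w) = a/2 * w^2, the gradient is a*w and the
# optimal learning rate η = 1/a reaches the minimum in exactly one step.
# The curvature a and start point are illustrative.

a, w = 4.0, 3.0
eta_opt = 1.0 / a
w = w - eta_opt * (a * w)   # single gradient-descent update
print(w)  # 0.0, the minimum
```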
(Figure: the effect of different learning rates on convergence.) [3]
Limitations of the back propagation algorithm
• It is not guaranteed to find the global minimum of the error function; it
may get trapped in a local minimum.
• Improvements:
• Add momentum.
• Use stochastic gradient descent.
• Use different networks with different initial values for the weights.
• Back propagation learning does not require normalization of input
vectors; however, normalization can improve performance.
• Standardize all features prior to training.
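One common way to standardize features, subtracting each feature's mean and dividing by its standard deviation (a sketch, not the slides' prescribed method):

```python
# Z-score standardization: each feature column gets zero mean and unit
# variance. The example data is illustrative.

def standardize(columns):
    """columns: list of feature columns; returns z-scored columns."""
    out = []
    for col in columns:
        mean = sum(col) / len(col)
        var = sum((v - mean) ** 2 for v in col) / len(col)
        std = var ** 0.5 or 1.0   # guard against constant features
        out.append([(v - mean) / std for v in col])
    return out

z = standardize([[1.0, 2.0, 3.0]])
# z[0] now has mean 0 and unit variance
```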
Generalization and Overfitting
• Solutions:
• Use a validation set and stop training when the error on this set stops decreasing.
• Use 10-fold cross validation.
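The validation-set idea can be sketched as a simple early-stopping rule: stop once validation error has not improved for a few consecutive checks. The error sequence and `patience` value below are hypothetical:

```python
# Early stopping on a validation error curve: report the step at which
# validation error was lowest, stopping after `patience` checks with no
# improvement. The error values are a made-up stand-in for real curves.

def early_stop(val_errors, patience=3):
    """Return the index of the best (lowest) validation error seen."""
    best, best_i, waited = float("inf"), 0, 0
    for i, e in enumerate(val_errors):
        if e < best:
            best, best_i, waited = e, i, 0
        else:
            waited += 1
            if waited >= patience:
                break   # overfitting: stop training here
    return best_i

# Validation error falls, then rises as the network overfits:
print(early_stop([0.9, 0.6, 0.4, 0.35, 0.4, 0.5, 0.6]))  # 3
```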
(Figure: as the number of weight updates grows, training set error
keeps falling while validation set error eventually rises.) [3]
References
• Artificial Neural Networks,
https://en.wikipedia.org/wiki/Artificial_neural_network
• Back propagation Algorithm,
https://en.wikipedia.org/wiki/Backpropagation
• Lecture Slides from Dr. Vilalta’s machine learning class
Questions?

Thank you!
