UNIT II
Neural Networks
Course: Machine Learning
By: Dr. P. Indira Priyadarsini, B.Tech., M.Tech., Ph.D.
3/4/2022
Department of Computer Science and Engineering
MULTI LAYERED FEED FORWARD
NEURAL NETWORK ARCHITECTURES
Multilayer networks solve classification problems for nonlinear sets by
employing hidden layers, whose neurons are not directly connected to the output.
The additional hidden layers can be interpreted geometrically as additional hyper-
planes, which enhance the separation capacity of the network. Figure 2.2 shows
typical multilayer network architectures.
This architecture raises a new question: how to train the hidden units, for
which the desired output is not known. The backpropagation algorithm offers a
solution to this problem.
Input Nodes – The Input nodes provide information from the outside world to the
network and are together referred to as the “Input Layer”. No computation is
performed in any of the Input nodes – they just pass on the information to the hidden
nodes.
Hidden Nodes – The Hidden nodes have no direct connection with the outside world
(hence the name “hidden”). They perform computations and transfer information
from the input nodes to the output nodes. A collection of hidden nodes forms
a “Hidden Layer”.
While a feedforward network will only have a single input layer and a single output
layer, it can have zero or multiple Hidden Layers.
Output Nodes – The Output nodes are collectively referred to as the “Output Layer”
and are responsible for computations and transferring information from the network
to the outside world.
Training occurs in a supervised manner. The basic idea is to present an input
vector to the network and calculate, in the forward direction, the output of
each layer and the final output of the network.
For the output layer the desired values are known, so the weights can be
adjusted as for a single-layer network; in the case of the BP algorithm,
according to the gradient descent rule.
To calculate the weight changes in the hidden layer the error in the output layer is
back-propagated to these layers according to the connecting weights.
This process is repeated for each sample in the training set. One cycle through the
training set is called an epoch.
The number of epochs needed to train the network depends on various parameters,
especially on the error calculated in the output layer.
The following description of the Back propagation algorithm is based on the
descriptions in [rume86],[faus94], and [patt96].
The assumed architecture is depicted in Figure 2.3. The input vector has n dimensions,
the output vector has m dimensions, the bias (the constant input used) is -1, and there
is one hidden layer with g neurons. The matrix V holds the weights of the neurons in the
hidden layer; the matrix W defines the weights of the neurons in the output layer. The
learning parameter is η, and the momentum is α.
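The architecture just described maps naturally onto two weight matrices. The following is a minimal sketch, assuming NumPy and small illustrative dimensions (n = 2, g = 3, m = 1); the names V and W follow the text, and the constant bias input of -1 is appended to each layer's input vector. Momentum is not used here.

```python
import numpy as np

n, g, m = 2, 3, 1                      # input, hidden, output dimensions (illustrative)
rng = np.random.default_rng(0)

# Each weight matrix gets one extra column for the bias input of -1.
V = rng.uniform(-0.5, 0.5, size=(g, n + 1))  # hidden-layer weights
W = rng.uniform(-0.5, 0.5, size=(m, g + 1))  # output-layer weights

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def forward(x):
    """Propagate one input vector through the hidden and output layers."""
    h = sigmoid(V @ np.append(x, -1.0))   # hidden activations (bias input = -1)
    y = sigmoid(W @ np.append(h, -1.0))   # network output
    return h, y

h, y = forward(np.array([0.0, 1.0]))
```

Since the sigmoid squashes its argument into (0, 1), every activation in `h` and `y` lies strictly between 0 and 1.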
BACKPROPAGATION
• In machine learning, gradient descent and backpropagation often appear
together, but they are not interchangeable: backpropagation computes the
gradients, while gradient descent uses them to update the weights.
• Backpropagation can be viewed as the component that makes gradient descent
practical in multi-layer neural networks.
• Backpropagation, also named the Generalized Delta Rule, is an algorithm used in
the training of ANNs for supervised learning (generalizations exist for other
artificial neural networks). It efficiently computes the gradient of the error
function with respect to the weights of the network for a single input-output
example.
• This makes it feasible to use gradient methods for training multi-layer networks,
updating the weights to minimize loss.
• Since the same training rule is applied recursively for each layer of the neural
network, we can calculate the contribution of each weight to the total error,
working backward from the output layer to the input layer.
Back Propagation Neural Networks (continued)
• Back Propagation Neural Network is a multilayer neural network consisting of
the input layer, at least one hidden layer and output layer.
• As its name suggests, back propagating will take place in this network.
• The error which is calculated at the output layer, by comparing the target output
and the actual output, will be propagated back towards the input layer.
Architecture
• As shown in the diagram, the architecture of a BPN has three interconnected
layers with weights on the connections between them.
• The hidden layer and the output layer also have a bias unit, whose output is
always 1.
• As is clear from the diagram, the working of BPN is in two phases.
• One phase sends the signal from the input layer to the output layer, and the
other phase back propagates the error from the output layer to the input layer.
BACKPROPAGATION (CONTINUED)
• Simply put, we propagate the total error backward through the connections
in the network layer by layer, calculate the contribution (gradient) of each
weight and bias to the total error in every layer, then use the gradient
descent algorithm to optimize the weights and biases, eventually minimizing
the total error of the neural network.
• The backpropagation algorithm has two phases:
• Forward pass phase: feed-forward propagation of input pattern signals
through the network, from the inputs towards the network outputs.
• Backward pass phase: computes the 'error signal' – propagation of the error
(difference between actual and desired output values) backwards through the
network, starting from the output units towards the input units.
• Visualizing this helps in understanding how the backpropagation algorithm
works step by step.
The back-propagation training algorithm
Step 1: Initialization
Set all the weights and threshold levels of the network to random numbers
uniformly distributed inside a small range:

\[
\left( -\frac{2.4}{F_i},\; +\frac{2.4}{F_i} \right)
\]

where \(F_i\) is the total number of inputs of neuron \(i\) in the network. The
weight initialization is done on a neuron-by-neuron basis.
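Step 1 can be sketched as follows; the helper name `init_weights` and the use of NumPy are illustrative assumptions, and every neuron in the layer is taken to have the same number of inputs \(F_i\).

```python
import numpy as np

def init_weights(n_inputs, n_neurons, rng=None):
    """Draw each neuron's weights uniformly from (-2.4/F_i, +2.4/F_i)."""
    if rng is None:
        rng = np.random.default_rng()
    bound = 2.4 / n_inputs                 # F_i = n_inputs for every neuron here
    return rng.uniform(-bound, bound, size=(n_neurons, n_inputs))

# Example: a hidden layer of 4 neurons, each receiving 3 inputs.
V = init_weights(n_inputs=3, n_neurons=4, rng=np.random.default_rng(42))
```

Every entry of `V` then has magnitude below 2.4/3 = 0.8, as the rule requires.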
Step 2: Activation
Activate the back-propagation neural network by applying inputs
\(x_1(p), x_2(p), \ldots, x_n(p)\) and desired outputs
\(y_{d,1}(p), y_{d,2}(p), \ldots, y_{d,n}(p)\).
(a) Calculate the actual outputs of the neurons in the hidden layer:

\[
y_j(p) = \mathrm{sigmoid}\!\left[ \sum_{i=1}^{n} x_i(p)\, w_{ij}(p) - \theta_j \right]
\]

where \(n\) is the number of inputs of neuron \(j\) in the hidden layer, and
sigmoid is the sigmoid activation function.
Step 2: Activation (continued)
(b) Calculate the actual outputs of the neurons in the output layer:

\[
y_k(p) = \mathrm{sigmoid}\!\left[ \sum_{j=1}^{m} x_{jk}(p)\, w_{jk}(p) - \theta_k \right]
\]

where \(m\) is the number of inputs of neuron \(k\) in the output layer.
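The two activation formulas of Step 2 can be sketched together. The function and variable names below are illustrative, as are the sample weights and thresholds; the thresholds θ are kept explicit, matching the formulas rather than the bias-input convention.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def activate(x, w_hidden, theta_hidden, w_output, theta_output):
    """Forward activation: Step 2(a) for the hidden layer, 2(b) for the output layer."""
    y_hidden = sigmoid(x @ w_hidden - theta_hidden)        # y_j = sigmoid(sum x_i w_ij - theta_j)
    y_output = sigmoid(y_hidden @ w_output - theta_output) # y_k = sigmoid(sum x_jk w_jk - theta_k)
    return y_hidden, y_output

x = np.array([1.0, 0.0])
w_h = np.array([[0.5, 0.9], [0.4, 1.0]])   # w_ij: 2 inputs -> 2 hidden neurons
th_h = np.array([0.8, -0.1])
w_o = np.array([[-1.2], [1.1]])            # w_jk: 2 hidden -> 1 output neuron
th_o = np.array([0.3])

y_h, y_o = activate(x, w_h, th_h, w_o, th_o)
```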
Step 3: Weight training
Update the weights in the back-propagation network, propagating backward the
errors associated with the output neurons.
(a) Calculate the error gradient for the neurons in the output layer:

\[
\delta_k(p) = y_k(p)\,\bigl[1 - y_k(p)\bigr]\, e_k(p)
\]

where

\[
e_k(p) = y_{d,k}(p) - y_k(p)
\]

Calculate the weight corrections:

\[
\Delta w_{jk}(p) = \eta \, y_j(p)\, \delta_k(p)
\]

Update the weights at the output neurons:

\[
w_{jk}(p+1) = w_{jk}(p) + \Delta w_{jk}(p)
\]
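Step 3(a) in code, as a hedged sketch: `eta` is an assumed learning-rate value, the momentum term is omitted for simplicity, and the sample arrays are illustrative.

```python
import numpy as np

def output_layer_update(y_hidden, y_output, y_desired, w_output, eta=0.1):
    """Step 3(a): gradient and weight update at the output layer (no momentum)."""
    e = y_desired - y_output                 # e_k(p) = y_dk(p) - y_k(p)
    delta = y_output * (1.0 - y_output) * e  # delta_k(p) = y_k (1 - y_k) e_k
    dw = eta * np.outer(y_hidden, delta)     # Δw_jk(p) = eta * y_j(p) * delta_k(p)
    return w_output + dw, delta              # w_jk(p+1) = w_jk(p) + Δw_jk(p)

w_o = np.array([[-1.2], [1.1]])              # hidden -> output weights
y_h = np.array([0.52, 0.88])                 # hidden activations
w_new, delta_k = output_layer_update(y_h, np.array([0.51]), np.array([0.0]), w_o)
```

Note the sign behavior: with a desired output of 0 and an actual output of 0.51, the error gradient is negative and every weight fed by a positive hidden activation is decreased.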
Step 3: Weight training (continued)
(b) Calculate the error gradient for the neurons in the hidden layer:

\[
\delta_j(p) = y_j(p)\,\bigl[1 - y_j(p)\bigr] \sum_{k=1}^{l} \delta_k(p)\, w_{jk}(p)
\]

Calculate the weight corrections:

\[
\Delta w_{ij}(p) = \eta \, x_i(p)\, \delta_j(p)
\]

Update the weights at the hidden neurons:

\[
w_{ij}(p+1) = w_{ij}(p) + \Delta w_{ij}(p)
\]
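Step 3(b) can be sketched the same way: the output-layer gradients are propagated back through the weights w_jk before the input-to-hidden weights are corrected. As before, `eta` is an assumed learning rate, momentum is omitted, and the sample arrays are illustrative.

```python
import numpy as np

def hidden_layer_update(x, y_hidden, delta_output, w_output, w_hidden, eta=0.1):
    """Step 3(b): gradient and weight update at the hidden layer (no momentum)."""
    back = w_output @ delta_output                # sum_k delta_k(p) w_jk(p)
    delta_h = y_hidden * (1.0 - y_hidden) * back  # delta_j(p)
    dw = eta * np.outer(x, delta_h)               # Δw_ij(p) = eta * x_i(p) * delta_j(p)
    return w_hidden + dw, delta_h                 # w_ij(p+1) = w_ij(p) + Δw_ij(p)

x = np.array([1.0, 0.0])                          # input vector
y_h = np.array([0.52, 0.88])                      # hidden activations
delta_k = np.array([-0.12])                       # gradient from the output layer
w_o = np.array([[-1.2], [1.1]])                   # hidden -> output weights
w_h = np.array([[0.5, 0.9], [0.4, 1.0]])          # input -> hidden weights
w_h_new, delta_j = hidden_layer_update(x, y_h, delta_k, w_o, w_h)
```

Because the second input is 0 here, the weights leaving that input receive no correction, exactly as Δw_ij = η x_i δ_j dictates.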
Step 4: Iteration
Increase iteration p by one, go
back to Step 2 and repeat the process until the
selected error criterion is satisfied.
As an example, we may consider the three-layer
back-propagation network. Suppose that the
network is required to perform the logical operation
Exclusive-OR. Recall that a single-layer
perceptron cannot perform this operation. Now we
will apply the three-layer net.
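Putting Steps 1 through 4 together on the Exclusive-OR task gives the sketch below. The 2-2-1 architecture, learning rate, random seed, and epoch count are illustrative choices, momentum is omitted, and whether training converges on XOR depends on the initial weights.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0.0], [1.0], [1.0], [0.0]])           # XOR targets

# Step 1: small random weights and thresholds.
w_h = rng.uniform(-1, 1, size=(2, 2)); th_h = rng.uniform(-1, 1, size=2)
w_o = rng.uniform(-1, 1, size=(2, 1)); th_o = rng.uniform(-1, 1, size=1)
eta = 0.5

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

for epoch in range(20000):                           # Step 4: iterate to convergence
    for x, t in zip(X, T):
        y_h = sigmoid(x @ w_h - th_h)                # Step 2(a)
        y_o = sigmoid(y_h @ w_o - th_o)              # Step 2(b)
        delta_o = y_o * (1 - y_o) * (t - y_o)        # Step 3(a)
        delta_h = y_h * (1 - y_h) * (w_o @ delta_o)  # Step 3(b)
        w_o += eta * np.outer(y_h, delta_o); th_o -= eta * delta_o
        w_h += eta * np.outer(x, delta_h);  th_h -= eta * delta_h

pred = sigmoid(sigmoid(X @ w_h - th_h) @ w_o - th_o)
```

After training, rounding `pred` typically recovers the XOR truth table, although a poor initialization can leave the network in a local minimum.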
Working Example
Back propagation (see notes)
APPLICATIONS OF FEEDFORWARD NEURAL
NETWORKS
Neural networks have a wide variety of applications to real-world problems.
1. Gene Expression Profiling for predicting Clinical Outcomes in cancer patients.
2. Steering an Autonomous Vehicle.
3. Call admission control in ATM Networks.
4. Robot Arm Control and Navigation.

More Related Content

What's hot

Nural network ER. Abhishek k. upadhyay
Nural network ER. Abhishek  k. upadhyayNural network ER. Abhishek  k. upadhyay
Nural network ER. Abhishek k. upadhyayabhishek upadhyay
 
2.5 backpropagation
2.5 backpropagation2.5 backpropagation
2.5 backpropagationKrish_ver2
 
Implementation Of Back-Propagation Neural Network For Isolated Bangla Speech ...
Implementation Of Back-Propagation Neural Network For Isolated Bangla Speech ...Implementation Of Back-Propagation Neural Network For Isolated Bangla Speech ...
Implementation Of Back-Propagation Neural Network For Isolated Bangla Speech ...ijistjournal
 
Back propagation
Back propagationBack propagation
Back propagationNagarajan
 
Implementation of Back-Propagation Neural Network using Scilab and its Conver...
Implementation of Back-Propagation Neural Network using Scilab and its Conver...Implementation of Back-Propagation Neural Network using Scilab and its Conver...
Implementation of Back-Propagation Neural Network using Scilab and its Conver...IJEEE
 
Artificial Neural Networks Lect3: Neural Network Learning rules
Artificial Neural Networks Lect3: Neural Network Learning rulesArtificial Neural Networks Lect3: Neural Network Learning rules
Artificial Neural Networks Lect3: Neural Network Learning rulesMohammed Bennamoun
 
ARTIFICIAL NEURAL NETWORKS
ARTIFICIAL NEURAL NETWORKSARTIFICIAL NEURAL NETWORKS
ARTIFICIAL NEURAL NETWORKSAIMS Education
 
03 neural network
03 neural network03 neural network
03 neural networkTianlu Wang
 
Principles of soft computing-Associative memory networks
Principles of soft computing-Associative memory networksPrinciples of soft computing-Associative memory networks
Principles of soft computing-Associative memory networksSivagowry Shathesh
 
Artificial Neural Networks
Artificial Neural NetworksArtificial Neural Networks
Artificial Neural NetworksArslan Zulfiqar
 
15 Machine Learning Multilayer Perceptron
15 Machine Learning Multilayer Perceptron15 Machine Learning Multilayer Perceptron
15 Machine Learning Multilayer PerceptronAndres Mendez-Vazquez
 
Adaline madaline
Adaline madalineAdaline madaline
Adaline madalineNagarajan
 
MPerceptron
MPerceptronMPerceptron
MPerceptronbutest
 
Character Recognition using Artificial Neural Networks
Character Recognition using Artificial Neural NetworksCharacter Recognition using Artificial Neural Networks
Character Recognition using Artificial Neural NetworksJaison Sabu
 
Fundamental, An Introduction to Neural Networks
Fundamental, An Introduction to Neural NetworksFundamental, An Introduction to Neural Networks
Fundamental, An Introduction to Neural NetworksNelson Piedra
 
Artificial neural network model & hidden layers in multilayer artificial neur...
Artificial neural network model & hidden layers in multilayer artificial neur...Artificial neural network model & hidden layers in multilayer artificial neur...
Artificial neural network model & hidden layers in multilayer artificial neur...Muhammad Ishaq
 

What's hot (20)

Nural network ER. Abhishek k. upadhyay
Nural network ER. Abhishek  k. upadhyayNural network ER. Abhishek  k. upadhyay
Nural network ER. Abhishek k. upadhyay
 
2.5 backpropagation
2.5 backpropagation2.5 backpropagation
2.5 backpropagation
 
04 Multi-layer Feedforward Networks
04 Multi-layer Feedforward Networks04 Multi-layer Feedforward Networks
04 Multi-layer Feedforward Networks
 
Implementation Of Back-Propagation Neural Network For Isolated Bangla Speech ...
Implementation Of Back-Propagation Neural Network For Isolated Bangla Speech ...Implementation Of Back-Propagation Neural Network For Isolated Bangla Speech ...
Implementation Of Back-Propagation Neural Network For Isolated Bangla Speech ...
 
Back propagation
Back propagationBack propagation
Back propagation
 
Implementation of Back-Propagation Neural Network using Scilab and its Conver...
Implementation of Back-Propagation Neural Network using Scilab and its Conver...Implementation of Back-Propagation Neural Network using Scilab and its Conver...
Implementation of Back-Propagation Neural Network using Scilab and its Conver...
 
Artificial Neural Networks Lect3: Neural Network Learning rules
Artificial Neural Networks Lect3: Neural Network Learning rulesArtificial Neural Networks Lect3: Neural Network Learning rules
Artificial Neural Networks Lect3: Neural Network Learning rules
 
ARTIFICIAL NEURAL NETWORKS
ARTIFICIAL NEURAL NETWORKSARTIFICIAL NEURAL NETWORKS
ARTIFICIAL NEURAL NETWORKS
 
Ann
Ann Ann
Ann
 
03 neural network
03 neural network03 neural network
03 neural network
 
Back propagation method
Back propagation methodBack propagation method
Back propagation method
 
Principles of soft computing-Associative memory networks
Principles of soft computing-Associative memory networksPrinciples of soft computing-Associative memory networks
Principles of soft computing-Associative memory networks
 
Artificial Neural Networks
Artificial Neural NetworksArtificial Neural Networks
Artificial Neural Networks
 
15 Machine Learning Multilayer Perceptron
15 Machine Learning Multilayer Perceptron15 Machine Learning Multilayer Perceptron
15 Machine Learning Multilayer Perceptron
 
Perceptron & Neural Networks
Perceptron & Neural NetworksPerceptron & Neural Networks
Perceptron & Neural Networks
 
Adaline madaline
Adaline madalineAdaline madaline
Adaline madaline
 
MPerceptron
MPerceptronMPerceptron
MPerceptron
 
Character Recognition using Artificial Neural Networks
Character Recognition using Artificial Neural NetworksCharacter Recognition using Artificial Neural Networks
Character Recognition using Artificial Neural Networks
 
Fundamental, An Introduction to Neural Networks
Fundamental, An Introduction to Neural NetworksFundamental, An Introduction to Neural Networks
Fundamental, An Introduction to Neural Networks
 
Artificial neural network model & hidden layers in multilayer artificial neur...
Artificial neural network model & hidden layers in multilayer artificial neur...Artificial neural network model & hidden layers in multilayer artificial neur...
Artificial neural network model & hidden layers in multilayer artificial neur...
 

Similar to Unit ii supervised ii

Web Spam Classification Using Supervised Artificial Neural Network Algorithms
Web Spam Classification Using Supervised Artificial Neural Network AlgorithmsWeb Spam Classification Using Supervised Artificial Neural Network Algorithms
Web Spam Classification Using Supervised Artificial Neural Network Algorithmsaciijournal
 
Web spam classification using supervised artificial neural network algorithms
Web spam classification using supervised artificial neural network algorithmsWeb spam classification using supervised artificial neural network algorithms
Web spam classification using supervised artificial neural network algorithmsaciijournal
 
A survey research summary on neural networks
A survey research summary on neural networksA survey research summary on neural networks
A survey research summary on neural networkseSAT Publishing House
 
Artificial neural networks in hydrology
Artificial neural networks in hydrology Artificial neural networks in hydrology
Artificial neural networks in hydrology Jonathan D'Cruz
 
Digital Implementation of Artificial Neural Network for Function Approximatio...
Digital Implementation of Artificial Neural Network for Function Approximatio...Digital Implementation of Artificial Neural Network for Function Approximatio...
Digital Implementation of Artificial Neural Network for Function Approximatio...IOSR Journals
 
Digital Implementation of Artificial Neural Network for Function Approximatio...
Digital Implementation of Artificial Neural Network for Function Approximatio...Digital Implementation of Artificial Neural Network for Function Approximatio...
Digital Implementation of Artificial Neural Network for Function Approximatio...IOSR Journals
 
Modeling of neural image compression using gradient decent technology
Modeling of neural image compression using gradient decent technologyModeling of neural image compression using gradient decent technology
Modeling of neural image compression using gradient decent technologytheijes
 
Review: “Implementation of Feedforward and Feedback Neural Network for Signal...
Review: “Implementation of Feedforward and Feedback Neural Network for Signal...Review: “Implementation of Feedforward and Feedback Neural Network for Signal...
Review: “Implementation of Feedforward and Feedback Neural Network for Signal...IJERA Editor
 
ML_ Unit 2_Part_B
ML_ Unit 2_Part_BML_ Unit 2_Part_B
ML_ Unit 2_Part_BSrimatre K
 
Electricity Demand Forecasting Using Fuzzy-Neural Network
Electricity Demand Forecasting Using Fuzzy-Neural NetworkElectricity Demand Forecasting Using Fuzzy-Neural Network
Electricity Demand Forecasting Using Fuzzy-Neural NetworkNaren Chandra Kattla
 
Simulation of Single and Multilayer of Artificial Neural Network using Verilog
Simulation of Single and Multilayer of Artificial Neural Network using VerilogSimulation of Single and Multilayer of Artificial Neural Network using Verilog
Simulation of Single and Multilayer of Artificial Neural Network using Verilogijsrd.com
 
Neural Networks on Steroids (Poster)
Neural Networks on Steroids (Poster)Neural Networks on Steroids (Poster)
Neural Networks on Steroids (Poster)Adam Blevins
 
Backpropagation
BackpropagationBackpropagation
Backpropagationariffast
 
Implementation of Feed Forward Neural Network for Classification by Education...
Implementation of Feed Forward Neural Network for Classification by Education...Implementation of Feed Forward Neural Network for Classification by Education...
Implementation of Feed Forward Neural Network for Classification by Education...ijsrd.com
 
Artificial neural network
Artificial neural networkArtificial neural network
Artificial neural networkmustafa aadel
 

Similar to Unit ii supervised ii (20)

Web Spam Classification Using Supervised Artificial Neural Network Algorithms
Web Spam Classification Using Supervised Artificial Neural Network AlgorithmsWeb Spam Classification Using Supervised Artificial Neural Network Algorithms
Web Spam Classification Using Supervised Artificial Neural Network Algorithms
 
Web spam classification using supervised artificial neural network algorithms
Web spam classification using supervised artificial neural network algorithmsWeb spam classification using supervised artificial neural network algorithms
Web spam classification using supervised artificial neural network algorithms
 
A survey research summary on neural networks
A survey research summary on neural networksA survey research summary on neural networks
A survey research summary on neural networks
 
MNN
MNNMNN
MNN
 
Artificial neural networks in hydrology
Artificial neural networks in hydrology Artificial neural networks in hydrology
Artificial neural networks in hydrology
 
Lec 6-bp
Lec 6-bpLec 6-bp
Lec 6-bp
 
Digital Implementation of Artificial Neural Network for Function Approximatio...
Digital Implementation of Artificial Neural Network for Function Approximatio...Digital Implementation of Artificial Neural Network for Function Approximatio...
Digital Implementation of Artificial Neural Network for Function Approximatio...
 
Digital Implementation of Artificial Neural Network for Function Approximatio...
Digital Implementation of Artificial Neural Network for Function Approximatio...Digital Implementation of Artificial Neural Network for Function Approximatio...
Digital Implementation of Artificial Neural Network for Function Approximatio...
 
Modeling of neural image compression using gradient decent technology
Modeling of neural image compression using gradient decent technologyModeling of neural image compression using gradient decent technology
Modeling of neural image compression using gradient decent technology
 
N ns 1
N ns 1N ns 1
N ns 1
 
Review: “Implementation of Feedforward and Feedback Neural Network for Signal...
Review: “Implementation of Feedforward and Feedback Neural Network for Signal...Review: “Implementation of Feedforward and Feedback Neural Network for Signal...
Review: “Implementation of Feedforward and Feedback Neural Network for Signal...
 
ML_ Unit 2_Part_B
ML_ Unit 2_Part_BML_ Unit 2_Part_B
ML_ Unit 2_Part_B
 
Electricity Demand Forecasting Using Fuzzy-Neural Network
Electricity Demand Forecasting Using Fuzzy-Neural NetworkElectricity Demand Forecasting Using Fuzzy-Neural Network
Electricity Demand Forecasting Using Fuzzy-Neural Network
 
Simulation of Single and Multilayer of Artificial Neural Network using Verilog
Simulation of Single and Multilayer of Artificial Neural Network using VerilogSimulation of Single and Multilayer of Artificial Neural Network using Verilog
Simulation of Single and Multilayer of Artificial Neural Network using Verilog
 
Backpropagation.pptx
Backpropagation.pptxBackpropagation.pptx
Backpropagation.pptx
 
Neural Networks on Steroids (Poster)
Neural Networks on Steroids (Poster)Neural Networks on Steroids (Poster)
Neural Networks on Steroids (Poster)
 
Unsupervised learning networks
Unsupervised learning networksUnsupervised learning networks
Unsupervised learning networks
 
Backpropagation
BackpropagationBackpropagation
Backpropagation
 
Implementation of Feed Forward Neural Network for Classification by Education...
Implementation of Feed Forward Neural Network for Classification by Education...Implementation of Feed Forward Neural Network for Classification by Education...
Implementation of Feed Forward Neural Network for Classification by Education...
 
Artificial neural network
Artificial neural networkArtificial neural network
Artificial neural network
 

Recently uploaded

ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTiammrhaywood
 
Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Jisc
 
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTSGRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTSJoshuaGantuangco2
 
Earth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatEarth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatYousafMalik24
 
Karra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptxKarra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptxAshokKarra1
 
ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4MiaBumagat1
 
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxBarangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxCarlos105
 
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdfInclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdfTechSoup
 
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITYISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITYKayeClaireEstoconing
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Celine George
 
ENGLISH6-Q4-W3.pptxqurter our high choom
ENGLISH6-Q4-W3.pptxqurter our high choomENGLISH6-Q4-W3.pptxqurter our high choom
ENGLISH6-Q4-W3.pptxqurter our high choomnelietumpap1
 
Grade 9 Q4-MELC1-Active and Passive Voice.pptx
Grade 9 Q4-MELC1-Active and Passive Voice.pptxGrade 9 Q4-MELC1-Active and Passive Voice.pptx
Grade 9 Q4-MELC1-Active and Passive Voice.pptxChelloAnnAsuncion2
 
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxMULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxAnupkumar Sharma
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...Nguyen Thanh Tu Collection
 
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...JhezDiaz1
 
Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Mark Reed
 
Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17Celine George
 

Recently uploaded (20)

ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
 
Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...
 
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTSGRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
 
Earth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatEarth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice great
 
Karra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptxKarra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptx
 
ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4
 
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxBarangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
 
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdfInclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
 
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptxYOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
 
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITYISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17
 
ENGLISH6-Q4-W3.pptxqurter our high choom
ENGLISH6-Q4-W3.pptxqurter our high choomENGLISH6-Q4-W3.pptxqurter our high choom
ENGLISH6-Q4-W3.pptxqurter our high choom
 
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
 
Grade 9 Q4-MELC1-Active and Passive Voice.pptx
Grade 9 Q4-MELC1-Active and Passive Voice.pptxGrade 9 Q4-MELC1-Active and Passive Voice.pptx
Grade 9 Q4-MELC1-Active and Passive Voice.pptx
 
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptxLEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
 
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxMULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
 
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
 
Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)
 
Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17
 

Unit ii supervised ii

  • 1. UNIT II Neural Networks Course :Machine Learning By: Dr P Indira priyadarsini B.Tech,M.Tech,Ph.D 3/4/2022 DEPARTMENT OF Computer science and Engineering 1
  • 2. 3/4/2022 DEPARTMENT OF INFORMATION TECHNOLOGY 2 MULTI LAYERED FEED FORWARD NEURAL NETWORK ARCHITECTURES Multilayer networks solve the classification problem for non linear sets by employing hidden layers, whose neurons are not directly connected to the output. The additional hidden layers can be interpreted geometrically as additional hyper- planes, which enhance the separation capacity of the network. Figure 2.2 shows typical multilayer network architectures. This new architecture introduces a new question: how to train the hidden units for which the desired output is not known. The Back propagation algorithm offers a solution to this problem.
  • 3. Input Nodes – The input nodes provide information from the outside world to the network and are together referred to as the “Input Layer”. No computation is performed in any of the input nodes – they simply pass the information on to the hidden nodes.
Hidden Nodes – The hidden nodes have no direct connection with the outside world (hence the name “hidden”). They perform computations and transfer information from the input nodes to the output nodes. A collection of hidden nodes forms a “Hidden Layer”. While a feedforward network has exactly one input layer and one output layer, it can have zero or more hidden layers.
Output Nodes – The output nodes are collectively referred to as the “Output Layer” and are responsible for computations and for transferring information from the network to the outside world.
  • 4. The training occurs in a supervised style. The basic idea is to present the input vector to the network, then calculate in the forward direction the output of each layer and the final output of the network. For the output layer the desired values are known, so the weights can be adjusted as for a single-layer network; in the case of the BP algorithm, according to the gradient descent rule. To calculate the weight changes in the hidden layers, the error in the output layer is back-propagated to those layers according to the connecting weights. This process is repeated for each sample in the training set. One cycle through the training set is called an epoch. The number of epochs needed to train the network depends on various parameters, especially on the error calculated in the output layer.
The following description of the back-propagation algorithm is based on the descriptions in [rume86], [faus94], and [patt96]. The assumed architecture is depicted in Figure 2.3. The input vector has n dimensions, the output vector has m dimensions, the bias (the constant input used) is −1, and there is one hidden layer with g neurons. The matrix V holds the weights of the neurons in the hidden layer. The matrix W defines the weights of the neurons in the output layer. The learning parameter is η, and the momentum is α.
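The forward calculation described above can be sketched in code. This is a minimal illustration, not the lecture's own code; the layer sizes are arbitrary, while the matrix names V and W and the constant −1 bias input follow the notation assumed in the text.

```python
import numpy as np

def sigmoid(z):
    # Logistic activation: squashes any real value into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, V, W):
    """One forward pass: n inputs -> g hidden neurons -> m outputs.

    V is (g, n+1): hidden-layer weights; the last column multiplies the -1 bias.
    W is (m, g+1): output-layer weights; the last column multiplies the -1 bias.
    """
    x_b = np.append(x, -1.0)   # append the constant bias input -1
    h = sigmoid(V @ x_b)       # hidden-layer outputs
    h_b = np.append(h, -1.0)   # bias input again for the output layer
    y = sigmoid(W @ h_b)       # final network outputs
    return h, y

# Tiny example: n = 2 inputs, g = 3 hidden neurons, m = 1 output.
rng = np.random.default_rng(0)
V = rng.uniform(-0.5, 0.5, size=(3, 3))   # 2 inputs + bias weight
W = rng.uniform(-0.5, 0.5, size=(1, 4))   # 3 hidden inputs + bias weight
h, y = forward(np.array([0.0, 1.0]), V, W)
```

Because every layer uses the sigmoid, all hidden and output activations lie strictly between 0 and 1.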
  • 5. BACKPROPAGATION
• In machine learning, gradient descent and back-propagation often appear at the same time, and the terms are sometimes used interchangeably.
• Back-propagation can be considered a specialization of gradient descent: it is the implementation of gradient descent in multi-layer neural networks.
• Back-propagation, also named the Generalized Delta Rule, is an algorithm used in the training of ANNs for supervised learning (generalizations exist for other artificial neural networks). It efficiently computes the gradient of the error function with respect to the weights of the network for a single input–output example.
• This makes it feasible to use gradient methods for training multi-layer networks, updating the weights to minimize loss.
• Since the same training rule is applied recursively for each layer of the neural network, we can calculate the contribution of each weight to the total error inversely, from the output layer back to the input layer.
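Since back-propagation is simply an efficient way to compute this gradient, its result can be verified against a numerical finite-difference estimate. The sketch below does this for a single weight of a one-neuron sigmoid network with squared error; the specific values of the weight, input, and target are illustrative, not from the lecture.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def loss(w, x, target):
    # Squared error of a single sigmoid neuron: E = 1/2 (y - target)^2
    y = sigmoid(w * x)
    return 0.5 * (y - target) ** 2

def analytic_grad(w, x, target):
    # The chain rule, as back-propagation applies it:
    # dE/dw = (y - target) * y * (1 - y) * x
    y = sigmoid(w * x)
    return (y - target) * y * (1.0 - y) * x

w, x, target = 0.7, 1.5, 0.2
eps = 1e-6
# Central-difference estimate of dE/dw.
numeric = (loss(w + eps, x, target) - loss(w - eps, x, target)) / (2 * eps)
analytic = analytic_grad(w, x, target)
```

The two estimates agree to within the finite-difference error, which is the standard sanity check for a hand-derived gradient.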
  • 6. Back-Propagation Neural Networks (continued)
• A back-propagation neural network is a multilayer neural network consisting of an input layer, at least one hidden layer, and an output layer.
• As its name suggests, back-propagation takes place in this network.
• The error, which is calculated at the output layer by comparing the target output and the actual output, is propagated back towards the input layer.
Architecture
• As shown in the diagram, the architecture of a BPN has three interconnected layers with weights on them.
• The hidden layer as well as the output layer also has a bias, whose weight is always 1, applied to them.
• As is clear from the diagram, the working of a BPN has two phases: one phase sends the signal from the input layer to the output layer, and the other phase back-propagates the error from the output layer to the input layer.
  • 7. BACKPROPAGATION (continued)
• Simply put, we propagate the total error backward through the connections in the network layer by layer, calculate the contribution (gradient) of each weight and bias to the total error in every layer, then use the gradient descent algorithm to optimize the weights and biases, eventually minimizing the total error of the neural network.
• The back-propagation algorithm has two phases:
• Forward pass phase: feed-forward propagation of input pattern signals through the network, from the inputs towards the network outputs.
• Backward pass phase: computes the ‘error signal’ – propagation of the error (the difference between actual and desired output values) backwards through the network, starting from the output units towards the input units.
• Visualizing this can help in understanding how the back-propagation algorithm works step by step.
  • 9. The back-propagation training algorithm
Step 1: Initialization
Set all the weights and threshold levels of the network to random numbers uniformly distributed inside a small range:
( −2.4/F_i , +2.4/F_i )
where F_i is the total number of inputs of neuron i in the network. The weight initialization is done on a neuron-by-neuron basis.
  • 10. Step 2: Activation
Activate the back-propagation neural network by applying inputs x_1(p), x_2(p), …, x_n(p) and desired outputs y_d,1(p), y_d,2(p), …, y_d,n(p).
(a) Calculate the actual outputs of the neurons in the hidden layer:
y_j(p) = sigmoid[ Σ_{i=1..n} x_i(p)·w_ij(p) − θ_j ]
where n is the number of inputs of neuron j in the hidden layer, and sigmoid is the sigmoid activation function.
  • 11. Step 2: Activation (continued)
(b) Calculate the actual outputs of the neurons in the output layer:
y_k(p) = sigmoid[ Σ_{j=1..m} x_jk(p)·w_jk(p) − θ_k ]
where m is the number of inputs of neuron k in the output layer.
  • 12. Step 3: Weight training
Update the weights in the back-propagation network, propagating backward the errors associated with the output neurons.
(a) Calculate the error gradient for the neurons in the output layer:
δ_k(p) = y_k(p)·[1 − y_k(p)]·e_k(p)
where
e_k(p) = y_d,k(p) − y_k(p)
Calculate the weight corrections:
Δw_jk(p) = η·y_j(p)·δ_k(p)
Update the weights at the output neurons:
w_jk(p+1) = w_jk(p) + Δw_jk(p)
  • 13. Step 3: Weight training (continued)
(b) Calculate the error gradient for the neurons in the hidden layer:
δ_j(p) = y_j(p)·[1 − y_j(p)]·Σ_{k=1..l} δ_k(p)·w_jk(p)
Calculate the weight corrections:
Δw_ij(p) = η·x_i(p)·δ_j(p)
Update the weights at the hidden neurons:
w_ij(p+1) = w_ij(p) + Δw_ij(p)
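Steps 1–3 above can be sketched as one training update. This is a hedged illustration of the algorithm as described (uniform ±2.4/F_i initialization, sigmoid activations, gradient-descent corrections with learning rate η); the thresholds θ are folded in as extra weights on a constant −1 input, the momentum term is omitted, and all sizes and inputs are illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_layer(n_out, n_in, rng):
    # Step 1: weights uniform in (-2.4/Fi, +2.4/Fi), Fi = inputs per neuron.
    bound = 2.4 / n_in
    return rng.uniform(-bound, bound, size=(n_out, n_in + 1))  # +1 threshold weight

def train_step(x, y_d, V, W, eta):
    """One back-propagation update for a single sample (Steps 2 and 3)."""
    # Step 2: activation (forward pass), with -1 as the constant bias input.
    x_b = np.append(x, -1.0)
    y_h = sigmoid(V @ x_b)             # hidden-layer outputs y_j
    h_b = np.append(y_h, -1.0)
    y_o = sigmoid(W @ h_b)             # output-layer outputs y_k

    # Step 3a: output-layer gradients  delta_k = y_k (1 - y_k) e_k
    e = y_d - y_o
    delta_o = y_o * (1.0 - y_o) * e
    W_new = W + eta * np.outer(delta_o, h_b)   # w_jk += eta * y_j * delta_k

    # Step 3b: hidden-layer gradients  delta_j = y_j (1 - y_j) sum_k delta_k w_jk
    delta_h = y_h * (1.0 - y_h) * (W[:, :-1].T @ delta_o)
    V_new = V + eta * np.outer(delta_h, x_b)   # w_ij += eta * x_i * delta_j
    return V_new, W_new, float(0.5 * np.sum(e ** 2))

rng = np.random.default_rng(1)
V = init_layer(2, 2, rng)   # hidden layer: 2 neurons, 2 inputs each
W = init_layer(1, 2, rng)   # output layer: 1 neuron, 2 hidden inputs
x, y_d = np.array([1.0, 0.0]), np.array([1.0])
V2, W2, err_before = train_step(x, y_d, V, W, eta=0.1)
_, _, err_after = train_step(x, y_d, V2, W2, eta=0.1)
```

A single small gradient step should reduce the squared error on the sample it was computed from, which is a quick check that the update directions are correct.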
  • 14. Step 4: Iteration
Increase iteration p by one, go back to Step 2, and repeat the process until the selected error criterion is satisfied.
As an example, we may consider a three-layer back-propagation network. Suppose that the network is required to perform the logical operation Exclusive-OR. Recall that a single-layer perceptron could not do this operation. Now we will apply the three-layer net.
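To make the XOR example concrete, the sketch below trains a 2–2–1 network with the update rules from Steps 2–4, iterating over epochs until a fixed count rather than an error criterion. The seed, learning rate, initialization range, and epoch count are arbitrary choices, not values from the lecture, and how far the error falls depends on the random initialization.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR truth table: inputs and desired outputs.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([0., 1., 1., 0.])

rng = np.random.default_rng(42)
V = rng.uniform(-1.2, 1.2, size=(2, 3))   # hidden layer: 2 neurons, 2 inputs + bias
W = rng.uniform(-1.2, 1.2, size=(1, 3))   # output layer: 1 neuron, 2 hidden + bias
eta = 0.5                                  # learning rate

def epoch_error(V, W):
    # Sum of squared errors over the four XOR patterns.
    err = 0.0
    for x, t in zip(X, T):
        h = sigmoid(V @ np.append(x, -1.0))
        y = sigmoid(W @ np.append(h, -1.0))[0]
        err += 0.5 * (t - y) ** 2
    return err

initial_error = epoch_error(V, W)
for _ in range(2000):                      # epochs (Step 4: iterate)
    for x, t in zip(X, T):
        x_b = np.append(x, -1.0)
        h = sigmoid(V @ x_b)               # Step 2a: hidden outputs
        h_b = np.append(h, -1.0)
        y = sigmoid(W @ h_b)[0]            # Step 2b: network output
        delta_o = y * (1 - y) * (t - y)                # Step 3a: output gradient
        delta_h = h * (1 - h) * (W[0, :2] * delta_o)   # Step 3b: hidden gradients
        W = W + eta * delta_o * h_b                    # update output weights
        V = V + eta * np.outer(delta_h, x_b)           # update hidden weights
final_error = epoch_error(V, W)
```

After training, the epoch error should be well below its initial value, reproducing the point of the example: a network with one hidden layer can learn XOR where a single-layer perceptron cannot.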
  • 15. Working Example: Back-propagation (mynotes)
  • 16. APPLICATIONS OF FEEDFORWARD NEURAL NETWORKS
There is a wide variety of applications of neural networks to real-world problems:
1. Gene expression profiling for predicting clinical outcomes in cancer patients.
2. Steering an autonomous vehicle.
3. Call admission control in ATM networks.
4. Robot arm control and navigation.

Editor's Notes

  1. https://adatis.co.uk/introduction-to-artificial-neural-networks-part-two-gradient-descent-backpropagation-supervised-unsupervised-learning/
  2. https://adatis.co.uk/introduction-to-artificial-neural-networks-part-two-gradient-descent-backpropagation-supervised-unsupervised-learning/
  3. from Han and Kamber (mynotes)