Artificial Neural Network and its Applications
Presented by: Sangeeta Tiwari
Guided by: Prof. N. G. Dharashive
Agenda
Introduction
What is ANN?
Motivation behind ANN
Comparison between Biological Neuron & Artificial Neuron
Perceptron
Feed-Forward Neural Network
Backpropagation Algorithm
Types of ANN
Pattern Recognition Working
Future of ANN: Deep Learning
Deep Learning
Applications
Advantages & Disadvantages
Conclusion
Introduction:
Can we code for such different patterns?
Human Vision
What is an Artificial Neural Network?
Neural Network = Neurons + the connections between them
An artificial neural network (ANN) is a computational model, i.e. an information-processing model, that is inspired by the way biological nervous systems (such as the brain) process information.
ANNs are considered nonlinear statistical data-modeling tools in which the complex relationships between inputs and outputs are modeled or patterns are found.
Motivation behind Artificial
Neural Network
Comparison between Biological Neuron & Artificial Neuron

Biological Neural Network (BNN)  |  Artificial Neural Network (ANN)
Soma                             |  Node
Dendrites                        |  Input
Synapse                          |  Weights or interconnections
Axon                             |  Output
Perceptron
A perceptron is the most fundamental unit of a neural network (an artificial neuron); it performs certain computations to detect features or business intelligence in the input data.
The perceptron is a linear model for supervised learning, used for binary classification.
A perceptron consists of five parts:
i. Input
ii. Weights & bias
iii. Summation function
iv. Activation function
v. Output
Perceptron Learning Rule
The perceptron learns the weights for the input signals in order to draw a linear decision boundary (see the sketch below).
Two types of perceptron:
a) Single-layer perceptron
b) Multi-layer perceptron
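A minimal sketch of a perceptron and its learning rule in Python with NumPy; the AND dataset, learning rate, and epoch count are illustrative choices, not from the slides:

```python
import numpy as np

def step(z):
    """Step activation: 1 if z >= 0, else 0."""
    return np.where(z >= 0, 1, 0)

def perceptron_train(X, y, lr=0.1, epochs=10):
    """Perceptron learning rule: nudge the weights toward a linear decision boundary."""
    w = np.zeros(X.shape[1])   # weights
    b = 0.0                    # bias
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = step(np.dot(w, xi) + b)   # summation + activation
            error = target - pred            # 0 if correct, +/-1 if wrong
            w += lr * error * xi             # update weights
            b += lr * error                  # update bias
    return w, b

# Example: learn the logical AND function (linearly separable)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = perceptron_train(X, y)
print(step(X @ w + b))  # expected: [0 0 0 1]
```

Because AND is linearly separable, the loop converges; the same rule cannot learn XOR, which motivates the multi-layer networks discussed later.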
Activation Functions
Sigmoid function
Step function
ReLU function
Tanh function
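For reference, these four activation functions can be written in a few lines of NumPy (a sketch using their standard textbook definitions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))   # squashes values into (0, 1)

def step(z):
    return np.where(z >= 0, 1, 0)     # hard threshold at 0

def relu(z):
    return np.maximum(0, z)           # passes positives, zeros out negatives

def tanh(z):
    return np.tanh(z)                 # squashes values into (-1, 1)
```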
Single-layer Perceptron
Example:
Limitation of Single-Layer Perceptron: it can only separate linearly separable data, so it cannot learn patterns such as XOR.
Feed-Forward Neural Network:
It is also known as a multi-layered neural network.
Information travels only forward in the network: through the input nodes, then through the hidden layers (one or more), and finally through the output nodes.
It is capable of handling non-linearly separable data.
The layers present between the input and output layers are called HIDDEN layers.
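A sketch of a single forward pass through such a network, assuming sigmoid activations and illustrative layer sizes (3 inputs, 4 hidden units, 2 outputs); the weights are random because no training data is specified here:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative layer sizes: 3 inputs -> 4 hidden units -> 2 outputs
V = rng.normal(size=(3, 4))   # input-to-hidden weights
W = rng.normal(size=(4, 2))   # hidden-to-output weights

def forward(x):
    z = sigmoid(x @ V)        # hidden layer: information flows forward only
    y = sigmoid(z @ W)        # output layer
    return z, y

x = np.array([0.5, -1.0, 2.0])
_, y = forward(x)
print(y)  # two output activations in (0, 1)
```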
Backpropagation Algorithm:
Backpropagation, an abbreviation for "backward propagation of errors", is a common method of training ANNs.
The method calculates the gradient of a loss function with respect to all the weights in the network.
The gradient is fed to the optimizer in order to minimize the loss function.
The backpropagation algorithm looks for the minimum value of the error function in weight space using a technique called the delta rule, or gradient descent.
The backpropagation learning algorithm can be divided into two phases:
i. Forward propagation (propagate)
ii. Backward propagation (update the weights)
Algorithm
Step 1: Initialization
Randomly set all the weights and threshold levels of the network.
Step 2: Forward computing
Compute the hidden vector z on the hidden layer: z_j = φ(Σ_i v_ij · x_i)
Compute the output vector ŷ on the output layer: ŷ_k = φ(Σ_j w_jk · z_j)
Step 3: Calculate the total error
Check the difference between y (actual output) and ŷ (predicted output): E = ½ (y − ŷ)²
Step 4: Backward computing
Calculate the partial derivative of the error with respect to each weight and update the weights:
Δw_jk = −η · ∂E/∂w_jk
w_jk = w_jk + Δw_jk
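A sketch of Steps 2–4 for one training example, assuming sigmoid activations on both layers and the squared error above; V holds the input-to-hidden weights v_ij, W the hidden-to-output weights w_jk, and the input, target, and learning rate η are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

eta = 0.5                          # learning rate (illustrative)
V = rng.normal(size=(3, 4))        # input-to-hidden weights v_ij
W = rng.normal(size=(4, 2))        # hidden-to-output weights w_jk

x = np.array([0.5, -1.0, 2.0])     # one training input (illustrative)
y = np.array([0.0, 1.0])           # its desired output

# Step 2: forward computing
z = sigmoid(x @ V)                 # hidden vector  z_j = phi(sum_i v_ij x_i)
y_hat = sigmoid(z @ W)             # output vector  y_hat_k = phi(sum_j w_jk z_j)

# Step 3: total error
E = 0.5 * np.sum((y - y_hat) ** 2)

# Step 4: backward computing (delta rule / gradient descent)
delta_out = (y_hat - y) * y_hat * (1 - y_hat)   # dE/dnet at the output layer
delta_hid = (W @ delta_out) * z * (1 - z)       # error propagated back to the hidden layer
W += -eta * np.outer(z, delta_out)              # Delta w_jk = -eta * dE/dw_jk
V += -eta * np.outer(x, delta_hid)              # Delta v_ij = -eta * dE/dv_ij
```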
Example:
Consider the table below. The model is ŷ = w·x, and the goal is to find the weight w that makes the squared error zero.

Input | Desired o/p
  0   |   0
  1   |   2
  2   |   4

First guess, w = 3:

Input | Desired o/p | Model o/p (w=3)
  0   |   0         |   0
  1   |   2         |   3
  2   |   4         |   6

Increasing the weight to w = 4 makes the error larger:

Input | Desired o/p | Model o/p (w=3) | Square Error (w=3) | Model o/p (w=4) | Square Error (w=4)
  0   |   0         |   0             |   0                |   0             |   0
  1   |   2         |   3             |   1                |   4             |   4
  2   |   4         |   6             |   4                |   8             |   16

Decreasing the weight to w = 2 drives the error to zero:

Input | Desired o/p | Model o/p (w=3) | Square Error (w=3) | Model o/p (w=2) | Square Error (w=2)
  0   |   0         |   0             |   0                |   0             |   0
  1   |   2         |   3             |   1                |   2             |   0
  2   |   4         |   6             |   4                |   4             |   0
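The same search over w can be automated with gradient descent on the one-weight model ŷ = w·x from the table; a sketch (the learning rate is an illustrative choice):

```python
import numpy as np

X = np.array([0.0, 1.0, 2.0])      # inputs from the table
Y = np.array([0.0, 2.0, 4.0])      # desired outputs from the table

w = 3.0                            # the table's first (wrong) guess
eta = 0.1                          # learning rate (illustrative)

for _ in range(50):
    y_hat = w * X                          # model output
    E = 0.5 * np.sum((Y - y_hat) ** 2)     # E = 1/2 * sum (y - y_hat)^2
    grad = np.sum((y_hat - Y) * X)         # dE/dw
    w -= eta * grad                        # move w downhill
print(round(w, 3))                         # approaches 2.0, the zero-error weight
```

Starting from w = 3, each step moves w along −∂E/∂w until it settles at w = 2, the zero-error weight found by hand in the table above.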
Use Case: Classify leaf images as either Diseased or Non-Diseased
1. Forward-compute with the current weights and check the probability of the desired output.
2. If the predicted output is wrong, train the neural network again using backpropagation learning: update the weights by propagating the error backward.
3. After updating the weights, forward-compute the output again; the network now classifies the leaf as Diseased or Non-Diseased.
Types of ANN
Use Case: Pattern Recognition
Data Acquisition & Pre-processing
Segmentation
Feature Extraction
Classification & Recognition
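A hedged skeleton of that pipeline in Python/NumPy; the thresholding, the two features, and the single linear unit standing in for the trained ANN are all placeholders for whatever a real system would use:

```python
import numpy as np

def acquire_and_preprocess(image):
    """Stage 1: scale pixel values to [0, 1] (assumes a grayscale array)."""
    return image.astype(float) / 255.0

def segment(image):
    """Stage 2: a crude threshold segmentation separating foreground from background."""
    return image > image.mean()

def extract_features(image, mask):
    """Stage 3: summarize the segmented region as a small feature vector."""
    return np.array([mask.mean(), image[mask].mean() if mask.any() else 0.0])

def classify(features, w, b):
    """Stage 4: a single linear unit standing in for the trained ANN classifier."""
    return int(features @ w + b >= 0)

# Toy run on a random 8x8 "image"; the weights are placeholders, not a trained model.
rng = np.random.default_rng(0)
img = acquire_and_preprocess(rng.integers(0, 256, size=(8, 8)))
mask = segment(img)
label = classify(extract_features(img, mask), w=np.array([1.0, -0.5]), b=0.0)
print(label)
```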
Example: a single sigmoid neuron with inputs 1 and 0 plus a bias term, producing the output σ(−10).
Future of ANN: Deep Learning
Artificial Intelligence, Machine Learning, and Deep Learning are interconnected fields.
Machine learning and deep learning aid AI by providing a set of algorithms and neural networks to solve data-driven problems.
Deep Learning
Deep learning is a subset of machine learning in artificial intelligence (AI) that uses networks capable of learning, without supervision, from data that is unstructured or unlabeled.
Applications:
Google Translate
Self-driving cars
Voice assistants
Advantages & Disadvantages of ANN

Advantages:
Parallel processing ability.
Information is stored on the entire network, not just in a database.
Fault tolerance: corruption of one or more cells of the ANN will not stop the generation of output.
Gradual corruption: the network degrades slowly over time instead of being destroyed instantly by a single problem.

Disadvantages:
There are no clear rules for determining the proper network structure.
The requirement of processors with parallel processing abilities makes ANNs hardware dependent.
There is a lack of explanation behind the solutions the network produces.
This opacity can generate a lack of trust in the network.
Any Queries?