ANN & EBPNN
Mrs. Swapna Devi
What is a neural network?
 A branch of "Artificial Intelligence".
 It can be considered a black box that is able to predict an output pattern when it recognizes a given input pattern. Once trained, the neural network is able to recognize similarities when presented with a new input pattern, resulting in a predicted output pattern.
Biological neuron
Applications
 Aerospace: High-performance aircraft autopilots, flight path simulations, aircraft control systems, autopilot enhancements, aircraft component fault detectors
 Automotive: Automobile automatic guidance systems
 Banking: Cheque & other document readers, credit application evaluators
 Defense: Weapon steering, target tracking, object discrimination, facial recognition, new kinds of sensors, sonar, radar and image signal processing (including data compression, feature extraction and noise suppression), signal/image identification
 Electronics: Code sequence prediction, integrated circuit chip layout, process control, chip failure analysis, machine vision, voice synthesis, nonlinear modeling
Applications
 Robotics: Trajectory control, forklift robot, manipulator controllers, vision systems
 Speech: Speech recognition, speech compression, vowel classification, text-to-speech synthesis
 Securities: Market analysis, stock trading advisory systems
 Telecommunications: Image and data compression, automated information services, real-time translation of spoken language, customer payment processing systems
 Transportation: Vehicle scheduling, routing systems
What is an ANN?
 It is a system loosely modeled on the human brain.
 The field goes by many names, such as connectionism, parallel distributed processing, neuro-computing, natural intelligent systems, machine learning algorithms, and artificial neural networks.
 It is an inherently multiprocessor-friendly architecture. It has the ability to model any functional dependency: the network discovers (learns, models) the nature of the dependency without needing to be prompted.
 It is a powerful technique for solving many real-world problems.
 ANNs have the ability to learn from experience in order to improve their performance and to adapt to changes in the environment.
 In addition, they are able to deal with incomplete information or noisy data, and can be very effective in situations where it is not possible to define the rules or steps that lead to the solution of a problem.
Classifications of NN
• Neural network applications can be grouped into four categories:
- Function approximation
- Clustering
- Classification/Pattern recognition
- Prediction/Dynamical systems
Clustering
• A clustering algorithm explores the similarity
between patterns and places similar patterns
in a cluster. Best known applications include
data compression and data mining.
Classification/Pattern recognition
• The task of pattern recognition is to assign an input pattern (such as a handwritten symbol) to one of many classes. This category includes algorithmic implementations such as associative memory.
Function approximation
• The task of function approximation is to find an estimate of an unknown function f() subject to noise. Various engineering and scientific disciplines require function approximation.
Prediction/Dynamical Systems
• The task is to forecast future values of time-sequenced data. Prediction has a significant impact on decision support systems. Prediction differs from function approximation by taking the time factor into account.
• Here the system is dynamic and may produce different results for the same input data depending on the system state (time).
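A minimal sketch (in Python with NumPy; the toy series, window size, and variable names are assumptions, not from the slides) of how a prediction task brings in the time factor: each input pattern is a window of past values and the target is the next value.

import numpy as np

# Toy time-sequenced data and a sliding window of past values.
series = np.sin(np.linspace(0, 20, 200))
window = 5

# Each input pattern is the previous `window` values; the target is the next
# value, so the "time factor" enters through the ordering of the samples.
X = np.array([series[t - window:t] for t in range(window, len(series))])
y = series[window:]

print(X.shape, y.shape)   # (195, 5) (195,)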
Connection Type
• Static (feed forward)
• Dynamic (feedback)
Topology
- Single layer
- Multilayer
- Recurrent
- Self-organized
- . . .
Learning Methods
- Supervised
- Unsupervised
Biological Neurons
 Neurons respond slowly
 The brain uses massively parallel computation
 ≈1011
neurons in the brain
 ≈104
connections per neuron
The Neuron
Consists of three sections:
 cell body
 dendrites
 axon
Computers and Human Brain
Similarities
• Both operate on electrical signals
• Both are a composition of a large
number of simple elements.
• Both perform functions that are
computational.
Computers and Human Brain
Differences
• Compared to µs or ns time scales of digital
computation, nerve impulses are astoundingly
slow.
• The brain’s huge computation rate is achieved
by a tremendous number of parallel
computational units, far beyond any proposed
for a computer system.
• A digital computer is inherently error-free, but the brain often produces best guesses and approximations from incomplete or partially incorrect inputs, and those guesses may themselves be wrong.
In mathematical terms, the neuron fires if and only if:
X1W1 + X2W2 + X3W3 + ... > T
The addition of input weights and of the threshold makes this neuron very flexible and powerful. The MCP neuron has the ability to adapt to a particular situation by changing its weights and/or threshold. Various algorithms exist that cause the neuron to 'adapt'; the most widely used are the Delta rule and error back-propagation. The former is used in single-layer feed-forward networks, while the latter is used to train multilayer feed-forward networks.
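A minimal sketch (in Python with NumPy; the example inputs, weights, threshold, and learning rate are assumptions, not part of the slide) of the firing rule above, together with a Delta-rule style weight update:

import numpy as np

def fires(x, w, T):
    # The neuron fires iff x1*w1 + x2*w2 + x3*w3 + ... > T
    return 1 if np.dot(x, w) > T else 0

def delta_rule_step(x, w, T, target, lr=0.1):
    # Adapt the weights in proportion to the error (target - output).
    output = fires(x, w, T)
    return w + lr * (target - output) * x

x = np.array([1.0, 0.0, 1.0])
w = np.array([0.2, -0.5, 0.3])
T = 0.4
print(fires(x, w, T))                      # 1, since 0.2 + 0.3 = 0.5 > 0.4
print(delta_rule_step(x, w, T, target=0))  # [0.1 -0.5 0.2]: weights pushed down because the neuron fired when it should not have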
Single-Input Neuron
Fundamentals of ANN
Propagation Function:
f = x1ω1 + x2ω2 + ... + xrωr
Activation Function:
Y = K(f), where K is a threshold function,
i.e. Y = 1 if f > T, Y = 0 otherwise,
where T is a constant threshold value.
[Figure: a neuron with inputs X1 ... Xn, weights w1 ... wn, a propagation function f = Σ(i=0..n) wi·xi plus a bias term, an activation function, and output Y]
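A minimal sketch (in Python with NumPy; the input values, weights, and threshold are illustrative assumptions) of the propagation and threshold activation functions defined above:

import numpy as np

def propagate(x, w, theta):
    # Propagation function: weighted sum of the inputs plus a bias/threshold term.
    return np.dot(w, x) - theta

def activate(f, T=0.0):
    # Hard-threshold activation K(f): Y = 1 if f > T, else 0.
    return 1 if f > T else 0

x = np.array([0.5, 1.0, -0.3])
w = np.array([0.4, 0.6, 0.2])
f = propagate(x, w, theta=0.1)   # 0.5*0.4 + 1.0*0.6 + (-0.3)*0.2 - 0.1 = 0.64
print(f, activate(f))            # 0.64 1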
Fundamentals of ANN
Squashing Function or Logistic Function or Sigmoid
Function.
Y = 1/(1 + e^(-f))
At f = 0, Y = 0.5
As f → +∞, Y → 1
As f → −∞, Y → 0
Fundamentals of ANN
Hyperbolic Tangent Function.
Y = tanh(f)
At f = 0, Y = 0
As f → +∞, Y → 1
As f → −∞, Y → −1
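A minimal sketch (in Python with NumPy; the sample values of f are assumptions) showing both squashing functions and their limiting behaviour:

import numpy as np

def sigmoid(f):
    # Logistic/sigmoid function: 0.5 at f = 0, -> 1 as f -> +inf, -> 0 as f -> -inf.
    return 1.0 / (1.0 + np.exp(-f))

f = np.array([-10.0, 0.0, 10.0])
print(sigmoid(f))    # approx [4.5e-05, 0.5, 0.99995]
print(np.tanh(f))    # approx [-1.0, 0.0, 1.0]; 0 at f = 0, -> ±1 as f -> ±inf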
Multiple Input Neuron
Single Layer Artificial Neural Networks
Layer of Neurons
Abbreviated Notation
Multilayer Network
Multilayer Artificial Neural Networks
• The output of the first layer is obtained by multiplying the input vector by the first weight matrix.
• If there is no nonlinear activation function, the network output is obtained by multiplying the resulting vector by the second weight matrix: Y = (XW1)W2. In that case the two layers collapse into a single equivalent weight matrix W1W2, as shown in the sketch below.
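A minimal sketch (in Python with NumPy; the matrix shapes and the names W1 and W2 are illustrative assumptions) of the point above: without a nonlinear activation between the layers, the two weight matrices collapse into one.

import numpy as np

rng = np.random.default_rng(0)
X  = rng.normal(size=(1, 3))    # input vector in row form, as on the slide
W1 = rng.normal(size=(3, 4))    # first-layer weight matrix
W2 = rng.normal(size=(4, 2))    # second-layer weight matrix

linear_two_layer = (X @ W1) @ W2          # Y = (X W1) W2
single_layer     = X @ (W1 @ W2)          # same output from one combined matrix
print(np.allclose(linear_two_layer, single_layer))   # True

# With a nonlinearity between the layers the equivalence no longer holds:
nonlinear_two_layer = np.tanh(X @ W1) @ W2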
Transfer/Threshold Functions
Banana & Apple Sorter
Prototype Vectors
Banana Apple Problem
Illustration of a Neural Network
Different networks
☻Perceptron
– Feedforward Network, Linear Decision Boundary, One Neuron for
Each Decision
☻Hamming Network
☻Hopfield Network
- Dynamic Associative Memory Network
☻Error Back Propagation network
☻Radial basis network
☻ART
☻Brain in a box neural network
☻Cellular neural Network
☻Neocognitron
☻Functional
Books :
• Artificial Neural Network – Simon Haykin
• Artificial Neural Network – Jacek Zurada
