NEURAL NETWORK
Introduction
• Machine learning mimics the human form of learning.
• Human learning, like every action of a human being, is controlled by the nervous system.
• The nervous system coordinates the different actions by transmitting signals to and from different parts of the body.
• The nervous system is constituted of a special type of cell called a neuron or nerve cell, which has a special structure allowing it to receive signals from, and send signals to, other neurons.
• This structure essentially forms a network of neurons, or a neural network.
• The biological neural network is a massively large and complex parallel computing network.
• It is because of this massive parallel computing network that the nervous system helps human beings perform actions and take decisions at a speed, and with an ease, that the fastest supercomputer in the world cannot match.
For example
✓ Superb flying catches taken by the fielders in the cricket ground.
✓ Swimming in the pool.
The fascinating capability of the biological neural network has inspired the
inception of artificial neural network (ANN).
An ANN is made of artificial neurons and is a machine designed to model the
functioning of the nervous system.
The biological form of neuron is replicated in the electronic or digital form
of neuron.
Human Brain
[Block diagram: Stimulus → Receptors → Neural Nets → Effectors → Response]
• The human brain may be viewed as a three-stage system. At the centre of the system is the brain itself, which
receives information and makes appropriate decisions.
• The receptors convert stimuli from the external environment into electrical impulses that convey information
to the brain.
• The effectors convert electrical impulses generated by the neural net into discernible responses as
system outputs.
• The structural constituents of the brain are neurons, and they are massively connected with each other.
• A neuron is able to receive, process and transmit information in the form of chemical and electrical
signals.
• It is estimated that there are approximately 10 billion neurons in the human cortex and 60 trillion
synapses or connections.
• Synapses are elementary structural and functional units that mediate the interactions between neurons.
• A synapse converts a presynaptic electrical signal into a chemical signal and then back into a postsynaptic
electrical signal.
Structure of a Biological Neuron
[Figure: cytoarchitectural map of the cerebral cortex]
• Dendrites: have an irregular surface and receive signals from neighbouring neurons.
• Soma: the main body of the neuron, which accumulates the signals coming from the different
dendrites. It fires when a sufficient amount of signal has accumulated.
• Axon: has a smoother surface, fewer branches and greater length. It is the last part of the neuron;
it receives the signal from the soma once the neuron fires, and passes it on to the neighbouring
neurons through the axon terminals.
Structural Organization of Levels in the Brain
Central Nervous System → Interregional circuits → Local circuits → Neurons → Dendritic trees → Neural microcircuits → Synapses → Molecules
A neural microcircuit refers to an assembly of synapses organized into patterns of connectivity to produce a functional operation of interest.
•Artificial neural network (ANN) is a machine learning approach that models the human
brain and consists of a number of artificial neurons.
•The brain is a highly complex, non-linear and parallel computer.
•Neurons in ANNs tend to have fewer connections than biological neurons.
•Each neuron in an ANN receives a number of inputs.
•An activation function is applied to these inputs, which results in the activation level of
the neuron (the output value of the neuron).
•Knowledge about the learning task is given in the form of examples called training
examples.
Plasticity permits the developing nervous system to adapt to its surrounding environment.
In its most general form, a neural network is a machine that is designed
to model the way in which the brain performs a particular task or function
of interest; the network is usually implemented by using electronic
components or is simulated in software on a digital computer.
A neural network is a massively parallel distributed processor made up of
simple processing units, which has a natural propensity for storing
experiential knowledge and making it available for use.
It resembles the brain in two respects:
(a) Knowledge is acquired by the network from its environment through a
learning process.
(b) Interneuron connection strengths, known as synaptic weights, are used to
store the acquired knowledge.
• The procedure used to perform the learning process is called a learning algorithm, the
function of which is to modify the synaptic weights of the network in an orderly fashion
to attain the desired design objective.
• It is also possible for a neural network to modify its own topology, which is motivated
by the fact that neurons in the human brain can die and that new synaptic connections can
grow.
• Neural networks are also referred to in the literature as neurocomputers, connectionist
networks, and parallel distributed processors.
• A neural network derives its computing power first from its massively parallel distributed
structure and second from its ability to learn and thereafter generalize.
• Generalization refers to the neural network producing reasonable outputs for inputs not
encountered during training (learning).
• An Artificial Neural Network is specified by:
➢neuron model: the information processing unit of the NN,
➢an architecture: a set of neurons and links connecting neurons. Each link
has a weight,
➢a learning algorithm: used for training the NN by modifying the
weights in order to model a particular learning task correctly on the
training examples.
• The aim is to obtain a NN that is trained and generalizes well.
• It should behave correctly on new instances of the learning task.
NEURON
• The neuron is the basic information processing unit of a NN. It consists of:
1 A set of links, describing the neuron inputs, with weights w_{k1}, w_{k2}, \dots, w_{km}
2 An adder function (linear combiner) for computing the weighted sum of the (real-valued) inputs:

    u_k = \sum_{j=1}^{m} w_{kj} x_j        (1)

3 An activation function \varphi for limiting the amplitude of the neuron output; here b_k denotes the bias:

    v_k = u_k + b_k, \qquad y_k = \varphi(u_k + b_k) = \varphi(v_k)        (2)
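The three components above can be sketched in a few lines of code. This is an illustrative fragment, not part of the slides: the input, weight and bias values are made up, and the sigmoid is just one possible choice of activation function.

```python
import math

def neuron_output(x, w, b, phi):
    """Implements y_k = phi(u_k + b_k) with u_k = sum_j w_kj * x_j."""
    u = sum(wj * xj for wj, xj in zip(w, x))  # adder / linear combiner, Eq. (1)
    v = u + b                                 # induced local field v_k
    return phi(v)                             # activation function, Eq. (2)

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

# Example with made-up values: u = 0.4*1.0 + (-0.2)*0.5 = 0.3, so v = 0.4
y = neuron_output(x=[1.0, 0.5], w=[0.4, -0.2], b=0.1, phi=sigmoid)
```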
[Figure: Non-linear model of a neuron. Inputs x_1, x_2, \dots, x_m are multiplied by the synaptic weights w_{k1}, w_{k2}, \dots, w_{km} and combined at the summing junction together with the bias b_k to give the induced local field v_k; the activation function \varphi(\cdot) then produces the output y_k.]
Affine Transformation produced by the presence of a bias
Another Non Linear model of a Neuron
Neuron Model
• The choice of activation function determines the neuron model.
Examples:
➢Threshold function:

    \varphi(v) = \begin{cases} 1 & \text{if } v \ge 0 \\ 0 & \text{if } v < 0 \end{cases}
➢ Piecewise Linear Function
• The piecewise linear function may be viewed as an approximation to a non-linear amplifier.
• It reduces to a threshold function if the amplification factor of the linear region is made infinitely large.
➢Sigmoid function:

    \varphi(v) = \frac{1}{1 + \exp(-av)}
➢Gaussian function:

    \varphi(v) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{1}{2}\left(\frac{v-\mu}{\sigma}\right)^{2}\right)
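The activation functions above can be written out directly. A minimal sketch: the saturation limits of the piecewise linear function and the Gaussian parameters mu and sigma are illustrative choices, not values from the slides.

```python
import math

def threshold(v):
    # Heaviside step: 1 if v >= 0, else 0
    return 1.0 if v >= 0 else 0.0

def piecewise_linear(v):
    # Unit slope between -0.5 and 0.5, saturating at 0 and 1;
    # tends to the threshold function as the slope grows large.
    return max(0.0, min(1.0, v + 0.5))

def sigmoid(v, a=1.0):
    # phi(v) = 1 / (1 + exp(-a*v)); a is the slope parameter
    return 1.0 / (1.0 + math.exp(-a * v))

def gaussian(v, mu=0.0, sigma=1.0):
    # phi(v) = exp(-((v - mu)/sigma)**2 / 2) / (sqrt(2*pi)*sigma)
    return math.exp(-0.5 * ((v - mu) / sigma) ** 2) / (math.sqrt(2 * math.pi) * sigma)
```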
NETWORK ARCHITECTURE
There are three different classes of network architectures:
✓ Single-layer feed-forward network
✓ Multi-layer feed-forward network
✓ Recurrent network
The manner in which the neurons of a neural network are structured is intimately
linked with the learning algorithm used to train the network.
Single Layer Feed-Forward Neural Network
[Figure: feedforward network with a single layer of neurons; an input layer of source nodes projects onto the output layer of neurons.]
In a layered neural network the neurons are organized in the form of layers. In the simplest form of a
layered network, we have an input layer of source nodes that projects onto an output layer of neurons,
but not vice versa. This is a feedforward or acyclic network.
Multi Layer Feed-Forward Neural Network
• MFFNN is a more general network architecture, where there are hidden
layers between input and output layers.
• Hidden nodes do not directly receive inputs nor send outputs to the
external environment.
• MFFNNs overcome the limitation of single-layer NN.
• They can handle non-linearly separable learning tasks.
[Figure: feedforward network with one hidden layer and one output layer.]
Deep Learning
• In a multi-layer neural network, the computation becomes very expensive as we keep
increasing the number of hidden layers.
• Going beyond two to three layers becomes quite difficult computationally.
Such intense computation is handled by graphics processing units (GPUs).
• When the number of layers is at most two to three, the network is
called a shallow neural network.
• When the number of layers increases beyond three, it is termed
a deep neural network.
Recurrent Neural Network
• A recurrent neural network distinguishes itself from a feedforward neural network in that it has at
least one feedback loop.
[Figures: recurrent neural network with no hidden neurons; recurrent neural network with hidden neurons.]
McCulloch-Pitts Model of a Neuron
• McCulloch-Pitts neuron model is one of the earliest ANN model, has only two
types of inputs excitatory and inhibitory.
• The excitatory inputs have weights of positive magnitude and the inhibitory
weights have weights of negative magnitude.
• The inputs of the McCulloch-Pitts model could be either 0 or 1.
• It has a threshold function as activation function and the output is 1 if the input
is greater than equal to a given threshold else 0.
• McCulloch-Pitts neuron model can be used to design logical operations. For
that purpose, the connection weights need to be correctly decided along with
the threshold function.
Example
• John carries an umbrella if it is sunny or if it is raining. There are four given
situations. We need to decide when John will carry the umbrella.
The situations are as follows:
• Situation 1: It is neither raining nor sunny
• Situation 2: It is not raining, but it is sunny
• Situation 3: It is raining, and it is not sunny
• Situation 4: It is raining, and it is sunny
• To analyze the situations using the McCulloch-Pitts neuron model, we can
consider the input signals as follows:
• X1=It is raining
• X2=It is Sunny
Situation | X1 | X2 | Ysum | Yout
1 | 0 | 0 | 0 | 0
2 | 0 | 1 | 1 | 1
3 | 1 | 0 | 1 | 1
4 | 1 | 1 | 2 | 1

[Figure: inputs X1 and X2, each with weight 1, feed a single neuron producing Ysum and the output Yout.]

    y_{sum} = \sum_{i=1}^{2} w_i x_i, \qquad y_{out} = f(y_{sum}) = \begin{cases} 1, & y_{sum} \ge 1 \\ 0, & y_{sum} < 1 \end{cases}
• From the truth table we can conclude that when Yout = 1, John needs to carry
an umbrella.
• Hence in situations 2, 3 and 4 John needs to carry an umbrella.
• This is really an implementation of the logical OR function using the McCulloch-Pitts
neuron model.
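The umbrella example amounts to the following few lines. As a further illustration (not from the slides), the same function realizes AND simply by raising the threshold to 2.

```python
def mcculloch_pitts(inputs, weights, theta):
    # Fires (outputs 1) when the weighted sum reaches the threshold theta.
    y_sum = sum(w * x for w, x in zip(weights, inputs))
    return 1 if y_sum >= theta else 0

# OR, as in the umbrella example: both weights 1, threshold 1.
truth_table = [(x1, x2, mcculloch_pitts((x1, x2), (1, 1), theta=1))
               for x1 in (0, 1) for x2 in (0, 1)]
```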
Perceptron
• The perceptron is the simplest form of a neural network, used for the classification of
patterns that are linearly separable.
• It consists of a single neuron with adjustable weights and bias.
• It is built around a single neuron, namely the McCulloch-Pitts model of a neuron.
• An algorithm is used to adjust the free parameters of this neural network.
• Rosenblatt proved that if the patterns are drawn from two linearly separable
classes, then the perceptron algorithm converges and positions the decision
surface in the form of a hyperplane between the two classes.
• It is limited to pattern classification with only two classes.
The neuron consists of a linear combiner followed by a hard limiter (signum
activation function).
The decision boundary, a hyperplane, is defined by

    \sum_{i=1}^{m} w_i x_i + b = 0

For the perceptron to function properly, the two
classes C1 and C2 must be linearly separable.
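A sketch of the error-correction learning rule that Rosenblatt's convergence theorem refers to. The learning rate, epoch count and the AND-style training set are illustrative choices, not values from the slides.

```python
def train_perceptron(samples, eta=0.1, epochs=20):
    # samples: list of (input vector, desired output d in {-1, +1})
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, d in samples:
            v = sum(wi * xi for wi, xi in zip(w, x)) + b
            y = 1 if v >= 0 else -1          # hard limiter (signum)
            if y != d:                       # error-correction update
                w = [wi + eta * d * xi for wi, xi in zip(w, x)]
                b += eta * d
    return w, b

# Linearly separable task (logical AND with +/-1 targets).
data = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
w, b = train_perceptron(data)
```

Because the two classes are linearly separable, the updates stop after a finite number of corrections and the learned hyperplane separates them.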
Major Aspects in ANN
• The number of layers in the network
• The direction of signal flow
• The number of nodes in each layer
• The value of weights attached with each interconnection between neurons
MULTI LAYER PERCEPTRON NEURAL NETWORK
• The multi layer perceptron neural network is an important class of feed forward neural network that consists
of an input layer, hidden layers and an output layer.
• The input signal propagates through the network in a forward direction on a layer-by-layer basis; such a
network is referred to as a multi layer perceptron neural network.
• It is a generalization of the single layer perceptron.
• Multi layer perceptrons have been successfully applied to solve difficult and diverse problems by training them
in a supervised manner with a highly popular algorithm known as the error back propagation (BP) algorithm.
• The BP algorithm is based on the error correction learning rule and may be viewed as a generalization of the
least mean square algorithm.
• The BP algorithm consists of two phases: a forward pass and a backward pass.
• In the forward pass the weights are fixed; in the backward pass the weights are adjusted in accordance
with an error correction rule.
• The error signal is propagated backward through the network, against the direction of the synaptic
connections, hence the name error back propagation algorithm.
• The synaptic weights are adjusted to make the actual response of the network move closer to the desired
response in a statistical sense.
Back Propagation Algorithm
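Since the slides describe the two passes only in words, here is a hedged sketch of BP for a tiny 2-2-1 sigmoid network. The layer sizes, learning rate, random seed and the logical-OR training task are all illustrative choices, not part of the slides.

```python
import math, random

random.seed(0)
sig = lambda v: 1.0 / (1.0 + math.exp(-v))

# W1[j] = weights (+ bias) of hidden neuron j; W2 = weights (+ bias) of the output neuron.
W1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
W2 = [random.uniform(-1, 1) for _ in range(3)]
eta = 0.5

def forward(x):
    # Forward pass: weights fixed, signal propagates layer by layer.
    h = [sig(w[0] * x[0] + w[1] * x[1] + w[2]) for w in W1]
    y = sig(W2[0] * h[0] + W2[1] * h[1] + W2[2])
    return h, y

def train_step(x, d):
    # Backward pass: propagate the error signal back and adjust the weights.
    h, y = forward(x)
    delta_o = (d - y) * y * (1 - y)                    # local gradient at the output
    for j in range(2):
        delta_h = delta_o * W2[j] * h[j] * (1 - h[j])  # local gradient at hidden j
        W2[j] += eta * delta_o * h[j]
        for i in range(2):
            W1[j][i] += eta * delta_h * x[i]
        W1[j][2] += eta * delta_h                      # hidden bias
    W2[2] += eta * delta_o                             # output bias

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]  # logical OR
for _ in range(5000):
    for x, d in data:
        train_step(x, d)
```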
Radial Basis Function Neural Network
• A function is said to be a radial basis function (RBF) if its output depends on the
distance of the input from a given stored vector.
➢The RBF neural network is a three layered feedforward network.
➢In such RBF networks, the hidden layer uses neurons with RBFs as activation
functions.
➢The outputs of all these hidden neurons are combined linearly at the output node.
• It has a faster learning speed and requires fewer iterations than an MLP trained with the
back propagation rule using the sigmoid activation function.
• These networks have a wide variety of applications such as
➢function approximation,
➢time series prediction,
➢control and regression,
➢pattern classification tasks for performing complex (non-linear) operations.
Radial Basis Function Architecture
[Figure: radial basis function neural network with inputs x_1, \dots, x_m, hidden RBF units \varphi_1, \dots, \varphi_{m_1}, and a linear output unit y with weights w_1, \dots, w_{m_1}.]
• It consists of one hidden layer with RBF activation functions.
• It consists of an output layer with a linear activation function.

    y = w_1 \varphi_1(\lVert x - t_1 \rVert) + \dots + w_{m_1} \varphi_{m_1}(\lVert x - t_{m_1} \rVert)

where \varphi_1, \dots, \varphi_{m_1} are the radial basis functions and \lVert x - t \rVert is the distance of x = (x_1, \dots, x_m) from the center t.
●Here we require weights w_i from the hidden layer to the output layer
only.
●The weights w_i can be determined with the help of any of the standard
iterative methods described earlier for neural networks.
●However, since the approximating function given below is linear w.r.t. w_i,
the weights can be calculated directly using the matrix methods of linear
least squares, without having to determine them iteratively.
●It should be noted that the approximating function f(X) is differentiable
with respect to w_i:

    Y = f(X) = \sum_{i=1}^{N} w_i \varphi(\lVert X - t_i \rVert)
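To illustrate the direct (non-iterative) solution of the output weights, the sketch below centers one Gaussian RBF on each training point, so the weights come from solving a square linear system exactly. The XOR data, sigma = 1 and the hand-rolled elimination routine are illustrative choices.

```python
import math

def rbf(x, t, sigma=1.0):
    # Gaussian RBF of the Euclidean distance ||x - t||
    d2 = sum((xi - ti) ** 2 for xi, ti in zip(x, t))
    return math.exp(-d2 / (2 * sigma ** 2))

def solve(A, b):
    # Gaussian elimination with partial pivoting for the square system A w = b.
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    w = [0.0] * n
    for k in range(n - 1, -1, -1):
        w[k] = (M[k][n] - sum(M[k][c] * w[c] for c in range(k + 1, n))) / M[k][k]
    return w

# XOR: not linearly separable, but interpolated exactly by an RBF net
# with one center per training point (the Gaussian matrix is invertible).
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
Y = [0.0, 1.0, 1.0, 0.0]
Phi = [[rbf(x, t) for t in X] for x in X]
w = solve(Phi, Y)

def f(x):
    # Y = f(X) = sum_i w_i * phi(||X - t_i||)
    return sum(wi * rbf(x, t) for wi, t in zip(w, X))
```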
Comparison of RBFNN and FFNN

RBF NN | FF NN
Non-linear layered feed-forward network. | Non-linear layered feed-forward network.
Hidden layer is non-linear; the output layer is linear. | Hidden and output layers are usually non-linear.
Has one single hidden layer. | May have more hidden layers.
Neuron model of the hidden neurons differs from that of the output nodes. | Hidden and output neurons share a common neuron model.
Activation function of each hidden neuron computes the Euclidean distance between the input vector and the center of that unit. | Activation function of each hidden neuron computes the inner product of the input vector and the synaptic weight vector of that neuron.
THANK YOU