3. SUPERVISED LEARNING
In this method, every input pattern that is used to train the network is associated with an output pattern.
The error between the actual and the desired output can then be used to change the network parameters, which results in an improvement in performance.
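The error-driven update described above can be sketched for a single linear neuron. This is an illustrative example, not from the notes: the function name, learning rate eta, epoch count, and training data are all assumptions.

```python
# Minimal sketch of supervised learning for one linear neuron y = w * x.
# eta, epochs, and the sample data are illustrative assumptions.

def train_supervised(samples, eta=0.1, epochs=50):
    """samples: list of (input, associated output) pattern pairs."""
    w = 0.0
    for _ in range(epochs):
        for x, target in samples:
            y = w * x             # network output for this input pattern
            error = target - y    # compare with the associated output pattern
            w += eta * error * x  # use the error to change the network parameter
    return w

# Learn the mapping y = 2x from input/output pairs.
w = train_supervised([(1.0, 2.0), (2.0, 4.0), (-1.0, -2.0)])
```

Each pass shrinks the error, so the weight settles near 2.0, the value that reproduces the target patterns.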
4. UNSUPERVISED LEARNING
In this learning method, the target output is not presented to the network.
It is as if there is no teacher to present the desired patterns; hence, the network learns on its own by discovering and adapting to features in the input patterns.
5. REINFORCED LEARNING
In this method, a teacher, though available, does not present the expected answer but only indicates whether the computed output is correct or incorrect.
Supervised and unsupervised learning remain the most popular forms of learning.
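A small sketch of the correct/incorrect feedback idea: the teacher never reveals the expected answer, only a yes/no signal. The specific update rule below (for a two-input neuron with a +/-1 output, where "incorrect" implies the opposite output was right) is an assumption chosen to keep the example simple, not a rule stated in the notes.

```python
# Sketch of reinforced learning: the teacher only says whether the output
# is correct, never what the correct output is. All names are illustrative.

def sign(v):
    return 1 if v >= 0 else -1

def train_reinforced(samples, teacher, eta=0.5, epochs=20):
    """samples: input vectors; teacher(x, y) -> True iff output y is correct."""
    w = [0.0, 0.0]
    for _ in range(epochs):
        for x in samples:
            y = sign(w[0] * x[0] + w[1] * x[1])
            if not teacher(x, y):  # only correct/incorrect is revealed
                # With a +/-1 output, the opposite of a wrong answer is right,
                # so move the weights away from the answer just given.
                w = [wi + eta * (-y) * xi for wi, xi in zip(w, x)]
    return w

# Teacher checks the output against sign(x0 - x1) without ever revealing it.
data = [(2, 1), (1, 2), (3, 0), (0, 3)]
w = train_reinforced(data, lambda x, y: y == sign(x[0] - x[1]))
```

After a few epochs the network answers correctly on all the training inputs, even though it never saw a target output.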
6. HEBBIAN LEARNING
This rule was proposed by Hebb
(1949) and is based on correlative
weight adjustment.
It is the oldest learning
mechanism inspired by biology.
In this method, the input-output pattern
pairs (xi, yi) are associated by the weight
matrix W, known as the correlation
matrix.
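The correlation matrix can be built directly from the pattern pairs: each entry accumulates the product of the co-active input and output units. This is a minimal sketch; the bipolar pattern values used in the example are illustrative assumptions.

```python
# Sketch of Hebbian learning: the weight matrix is the correlation
# (sum of element-wise products) over the input-output pattern pairs.

def hebbian_correlation_matrix(pairs):
    """pairs: list of (x, y) pattern vectors; returns W with W[j][i] = sum over pairs of y[j]*x[i]."""
    n_in, n_out = len(pairs[0][0]), len(pairs[0][1])
    W = [[0.0] * n_in for _ in range(n_out)]
    for x, y in pairs:
        for j in range(n_out):
            for i in range(n_in):
                W[j][i] += y[j] * x[i]  # Hebb's rule: co-active units strengthen
    return W

# Two bipolar pattern pairs (illustrative data).
W = hebbian_correlation_matrix([([1, -1], [1]), ([-1, 1], [-1])])
```

Note that the weights are computed in one pass from the stored pattern pairs; no error signal or teacher is involved, which is what makes the rule purely correlative.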
7. GRADIENT DESCENT LEARNING:
This is based on the minimization of error E
defined in terms of weights and the activation
function of the network.
It is also required that the activation function
employed by the network be differentiable, as
the weight update depends on the
gradient of the error E.
8. ΔWij = -η ∂E/∂Wij
Here, ΔWij is the weight update on the link
connecting the i-th and j-th neurons of the two
neighbouring layers, and η is the learning rate.
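The update rule above can be demonstrated for a single weight with a known error gradient. The error function E(w) = (w - 3)^2 and the step count are illustrative assumptions, not from the notes.

```python
# Sketch of gradient descent learning for one weight:
# each step applies delta_w = -eta * dE/dw, moving against the gradient of E.

def gradient_descent(dE_dw, w0=0.0, eta=0.1, steps=100):
    w = w0
    for _ in range(steps):
        w = w - eta * dE_dw(w)  # the weight update uses the gradient of E
    return w

# Illustrative error: E(w) = (w - 3)^2, so dE/dw = 2*(w - 3); minimum at w = 3.
w = gradient_descent(lambda w: 2 * (w - 3))
```

Because the update always points down the gradient, the weight converges to the minimum of E at w = 3; this also shows why E must be differentiable in w for the rule to apply.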