2. ASSOCIATIVE MEMORY
Associative memory is defined as the ability to learn and remember the
relationship between unrelated items; for example, remembering the name
of a person.
There are two types of associative memory:
I. Auto associative memory
II. Hetero associative memory
In auto associative memory, the training input and target output vectors are
identical.
Now we will discuss hetero associative memory in detail.
3. INTRODUCTION TO HETERO ASSOCIATIVE
MEMORY
A single-layer neural network.
The input training vector and the output target vectors are not the same.
The weights are determined so that the network stores a set of patterns.
The hetero associative network is static in nature; hence, there are no non-linear or
delay operations.
The weights may be found using the Hebb rule or the delta rule.
The input layer has ‘n’ units and the output layer has ‘m’ units, with weighted
interconnections between input and output.
4. A Hetero Associative Memory network associates input training vectors of ‘n’ components with
output target vectors of ‘m’ components.
5. TRAINING ALGORITHM
For training, this network uses the Hebb or delta learning rule.
Step 1 − Initialize all the weights to zero:
wij = 0 (i = 1 to n, j = 1 to m)
Step 2 − Perform steps 3-5 for each training pair (s : t).
Step 3 − Activate each input unit as follows −
xi = si (i = 1 to n)
Step 4 − Activate each output unit as follows −
yj = tj (j = 1 to m)
Step 5 − Adjust the weights as follows −
wij(new) = wij(old) + xi yj
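The training steps above can be sketched in Python (a minimal sketch; the function name and the list-based weight representation are my own, not part of the text):

```python
# Hebb-rule training for a hetero-associative memory.
# Pattern pairs are (input s, target t) tuples of +/-1 values.

def train_hebb(pairs, n, m):
    """Return the n x m weight matrix learned from (s, t) pattern pairs."""
    # Step 1: initialize all weights to zero.
    w = [[0] * m for _ in range(n)]
    for s, t in pairs:                  # Step 2: loop over training pairs
        for i in range(n):              # Step 3: x_i = s_i
            for j in range(m):          # Step 4: y_j = t_j
                w[i][j] += s[i] * t[j]  # Step 5: w_ij(new) = w_ij(old) + x_i*y_j
    return w
```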
6. TESTING ALGORITHM
Step 1 − Set the weights to those obtained during training with Hebb’s rule.
Step 2 − Perform steps 3-5 for each input vector.
Step 3 − Set the activation of the input units equal to that of the input vector.
Step 4 − Calculate the net input to each output unit (j = 1 to m):
y_inj = Σ (i = 1 to n) xi wij
Step 5 − Apply the following activation function to calculate the output:
             1 if y_inj > 0
yj = f(y_inj) =  0 if y_inj = 0
            -1 if y_inj < 0
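The recall steps can likewise be sketched in Python (a minimal sketch; `recall` is an assumed helper name, and `w` is the weight matrix produced by training):

```python
# Recall (testing) for a hetero-associative memory.
# w is the n x m weight matrix from training, x the input vector.

def recall(w, x):
    """Compute y_in_j = sum_i x_i * w_ij, then apply the sign-like activation."""
    n, m = len(w), len(w[0])
    out = []
    for j in range(m):
        y_in = sum(x[i] * w[i][j] for i in range(n))         # Step 4: net input
        out.append(1 if y_in > 0 else (0 if y_in == 0 else -1))  # Step 5
    return out
```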
7. Hetero Associative Architecture
[Architecture diagram: n input units x1 … xi … xn fully connected through weights wij to m = 3 output units]
Activation Functions:
For bipolar targets:
      1 if y_inj > 0
yj =  0 if y_inj = 0
     -1 if y_inj < 0
For binary targets:
      1 if y_inj > 0
yj =  0 if y_inj <= 0
8. EXAMPLE
GOAL: build a neural network which will associate the following two sets of patterns using Hebb’s Rule:
s1 = ( 1 -1 -1 -1) f1 = ( 1 -1 -1)
s2 = (-1 1 -1 -1) f2 = ( 1 -1 1)
s3 = (-1 -1 1 -1) f3 = (-1 1 -1)
s4 = (-1 -1 -1 1) f4 = (-1 1 1)
The process will involve 4 input neurons and 3 output neurons.
The algorithm involves finding the four outer products and adding them to obtain the weight matrix.
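The example can be checked by summing the four outer products and verifying that each stored input recalls its target (a sketch using the pattern pairs given above; the variable names are my own):

```python
# Weight matrix for the example: W = sum over k of the outer product s_k^T f_k.
pairs = [
    ((1, -1, -1, -1), (1, -1, -1)),
    ((-1, 1, -1, -1), (1, -1, 1)),
    ((-1, -1, 1, -1), (-1, 1, -1)),
    ((-1, -1, -1, 1), (-1, 1, 1)),
]
W = [[sum(s[i] * f[j] for s, f in pairs) for j in range(3)] for i in range(4)]

# Bipolar activation: y_j = sign of the net input y_in_j.
def sign(v):
    return 1 if v > 0 else (0 if v == 0 else -1)

# Every stored input should map back to its associated target.
recalled = [
    [sign(sum(s[i] * W[i][j] for i in range(4))) for j in range(3)]
    for s, _ in pairs
]
```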