2. The autocorrelator is better known by the title of Hopfield Associative Memory (HAM).
Autocorrelators were introduced as a theoretical notion by Donald Hebb (1949) and rigorously analysed by Amari (1972, 1977).
3. A first-order autocorrelator stores M bipolar patterns A1, A2, A3, …, AM by summing together M outer products:
T = Σ [Ai]^T [Ai], the sum taken over i = 1, 2, …, M.
Here, T = [tij] is a (p × p) connection matrix and each component of Ai ∈ {-1, 1}.
The autocorrelator's recall equation is a vector-matrix multiplication followed by a point-wise threshold:
aj(new) = f( Σ ai tij , aj ), j = 1, 2, …, p
where f(α, β) = 1 if α > 0, β if α = 0, and -1 if α < 0.
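The storage and recall steps above can be sketched in a few lines of Python (a minimal illustration; the function names are ours, not from the text):

```python
# Minimal sketch of a first-order autocorrelator over bipolar patterns.

def connection_matrix(patterns):
    """T[i][j] = sum over stored bipolar patterns A of A[i] * A[j]."""
    p = len(patterns[0])
    return [[sum(a[i] * a[j] for a in patterns) for j in range(p)]
            for i in range(p)]

def recall(a, T):
    """One application of the recall equation: aj_new = f(sum_i ai * tij, aj)."""
    p = len(a)
    new = []
    for j in range(p):
        s = sum(a[i] * T[i][j] for i in range(p))
        new.append(1 if s > 0 else -1 if s < 0 else a[j])  # keep old value when s == 0
    return new
```

Applied to the three patterns of the next slide, `connection_matrix` reproduces the connection matrix shown there.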
4. Consider the following patterns:
A1=(-1,1,-1,1)
A2=(1,1,1,-1)
A3=(-1,-1,-1,1)
The connection matrix is

T = Σ [Ai]^T [Ai] =  [  3   1   3  -3 ]
                     [  1   3   1  -1 ]
                     [  3   1   3  -3 ]
                     [ -3  -1  -3   3 ]
5. When the stored pattern A2 = (1, 1, 1, -1) is presented to the autocorrelator, the recall equation gives:
a1new = f(3 + 1 + 3 + 3, 1) = f(10, 1) = 1
a2new = f(1 + 3 + 1 + 1, 1) = f(6, 1) = 1
a3new = f(3 + 1 + 3 + 3, 1) = f(10, 1) = 1
a4new = f(-3 - 1 - 3 - 3, 1) = f(-10, 1) = -1
Thus (a1new, a2new, a3new, a4new) = (1, 1, 1, -1) = A2, i.e. the stored vector is recalled unchanged.
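The weighted sums and the thresholded result for A2 can be checked directly with the connection matrix from the text:

```python
# Reproducing the recall of A2 = (1, 1, 1, -1) with the connection matrix above.
T = [[ 3,  1,  3, -3],
     [ 1,  3,  1, -1],
     [ 3,  1,  3, -3],
     [-3, -1, -3,  3]]
A2 = [1, 1, 1, -1]

# Weighted sums sum_i ai * tij for each component j.
sums = [sum(A2[i] * T[i][j] for i in range(4)) for j in range(4)]
print(sums)  # -> [10, 6, 10, -10]

# Threshold, keeping the old value on a tie.
new = [1 if s > 0 else -1 if s < 0 else old for s, old in zip(sums, A2)]
print(new)   # -> [1, 1, 1, -1], i.e. A2 itself
```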
6. Consider a vector A' = (1, 1, 1, 1), which is a distorted presentation of one of the stored patterns.
The Hamming distance (HD) of a vector X from a vector Y, given X = (x1, x2, …, xn) and Y = (y1, y2, …, yn), is
HD(X, Y) = Σ |xi - yi|
Thus the HD of A' from each of the patterns in the stored set is:
HD(A', A1) = 4
HD(A', A2) = 2
HD(A', A3) = 6
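The distances can be verified with the definition above (note that with this definition each mismatching bipolar component contributes 2):

```python
# Hamming distance as defined in the text: HD(X, Y) = sum |xi - yi|.
def hamming(x, y):
    return sum(abs(a - b) for a, b in zip(x, y))

A_noisy = [1, 1, 1, 1]
print(hamming(A_noisy, [-1, 1, -1, 1]))   # HD(A', A1) -> 4
print(hamming(A_noisy, [1, 1, 1, -1]))    # HD(A', A2) -> 2
print(hamming(A_noisy, [-1, -1, -1, 1]))  # HD(A', A3) -> 6
```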
7. It is evident that the vector A' is closest to A2 and therefore resembles it; in other words, A' is a noisy version of A2.
The computations are:
A2 = (1, 1, 1, -1)
(a1, a2, a3, a4) = ( f(4, 1), f(4, 1), f(4, 1), f(-4, 1) )
= (1, 1, 1, -1)
= A2
Hence, in the case of partial or noisy vectors, an autocorrelator results in the refinement of the pattern, i.e. the removal of noise, to retrieve the closest matching stored pattern.
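The noise-removal step can be traced in code with the same connection matrix (a direct check of the computation above):

```python
# Recall of the noisy vector A' = (1, 1, 1, 1); thresholding the weighted sums
# retrieves the closest stored pattern A2.
T = [[ 3,  1,  3, -3],
     [ 1,  3,  1, -1],
     [ 3,  1,  3, -3],
     [-3, -1, -3,  3]]
A_noisy = [1, 1, 1, 1]

sums = [sum(A_noisy[i] * T[i][j] for i in range(4)) for j in range(4)]
print(sums)      # -> [4, 4, 4, -4]
recalled = [1 if s > 0 else -1 if s < 0 else old for s, old in zip(sums, A_noisy)]
print(recalled)  # -> [1, 1, 1, -1], which is A2
```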
8. As Kosko (1987a, 1987b) and Cruz Jr. and Stubberud (1987) have noted, the bidirectional associative memory (BAM) is a two-level non-linear neural network based on earlier studies and models of associative memory. Kosko extended the unidirectional autoassociators to bidirectional processes.
9. There are N training pairs,
{ (A1, B1), (A2, B2), …, (Ai, Bi), …, (AN, BN) }
where
Ai = (ai1, ai2, …, ain)
Bi = (bi1, bi2, …, bip)
Here, aij or bij is either in the ON or the OFF state. In the binary mode, ON = 1 and OFF = 0; in the bipolar mode, ON = 1 and OFF = -1.
We frame the correlation matrix
M = Σ [Xi]^T [Yi], the sum taken over i = 1, 2, …, N,
where Xi and Yi are the bipolar forms of Ai and Bi, respectively.
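Framing the correlation matrix can be sketched as follows (the two training pairs are an assumed toy example, not from the text):

```python
# Sketch: BAM correlation matrix M = sum_i Xi^T Yi from bipolar pattern pairs.
def correlation_matrix(pairs):
    n = len(pairs[0][0])  # length of the X patterns
    p = len(pairs[0][1])  # length of the Y patterns
    return [[sum(x[i] * y[j] for x, y in pairs) for j in range(p)]
            for i in range(n)]

# Assumed example pairs in bipolar form.
X1, Y1 = [1, -1, 1], [1, -1]
X2, Y2 = [-1, 1, 1], [-1, 1]
M = correlation_matrix([(X1, Y1), (X2, Y2)])
print(M)  # -> [[2, -2], [-2, 2], [0, 0]]
```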
10. To retrieve the nearest (Ai, Bi) pair given any pair (α, β), the recall equations are as follows:
Starting with (α, β) as the initial condition, we determine a finite sequence (α', β'), (α'', β''), … until an equilibrium point (αF, βF) is reached.
Here,
β' = Φ(αM)
α' = Φ(β'M^T)
where Φ(F) = G = (g1, g2, …, gn) for F = (f1, f2, …, fn).
11. gi = 1, if fi > 0
    gi = 0 (binary) or -1 (bipolar), if fi < 0
    gi = previous gi, if fi = 0
One important performance attribute of a discrete BAM is its ability to recall stored pairs, particularly in the presence of noise.
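The bidirectional recall loop of slides 10 and 11 can be sketched in Python (bipolar mode; the example matrix is an assumed two-pair toy case):

```python
# Sketch of BAM recall: beta' = Phi(alpha M), alpha' = Phi(beta' M^T),
# iterated until the pair (alpha, beta) stops changing.
def phi(f, prev):
    """Bipolar threshold of slide 11: keep the previous output when fi == 0."""
    return [1 if fi > 0 else -1 if fi < 0 else g for fi, g in zip(f, prev)]

def bam_recall(alpha, beta, M):
    while True:
        beta_new = phi([sum(alpha[i] * M[i][j] for i in range(len(alpha)))
                        for j in range(len(beta))], beta)
        alpha_new = phi([sum(beta_new[j] * M[i][j] for j in range(len(beta)))
                         for i in range(len(alpha))], alpha)
        if alpha_new == alpha and beta_new == beta:
            return alpha_new, beta_new  # equilibrium point (alpha_F, beta_F)
        alpha, beta = alpha_new, beta_new

# Assumed correlation matrix of a two-pair example; starting beta is arbitrary.
M = [[2, -2], [-2, 2], [0, 0]]
print(bam_recall([1, -1, 1], [1, 1], M))  # -> ([1, -1, 1], [1, -1])
```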
12. Given a set of pattern pairs (Xi, Yi), for i = 1, 2, …, n, and their correlation matrix M, a new pair (X', Y') can be added, or an existing pair (Xj, Yj) can be erased or deleted from the memory model.
In the case of addition, the new correlation matrix M becomes
M = X1^T Y1 + X2^T Y2 + … + Xn^T Yn + X'^T Y'
13. In the case of deletion, we subtract the matrix corresponding to (Xj, Yj) from the matrix M:
M(new) = M - Xj^T Yj
The addition and deletion of information contribute to the functioning of the system as a typical human memory, exhibiting learning and forgetfulness.
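Addition and deletion amount to adding or subtracting one outer product from M, which can be sketched as follows (illustrative helper names):

```python
# Sketch: learning (add) and forgetting (delete) a pair in a BAM correlation matrix.
def outer(x, y):
    """Outer product X^T Y of two bipolar row vectors."""
    return [[xi * yj for yj in y] for xi in x]

def add_pair(M, x, y):
    """New M after storing (x, y): M + x^T y."""
    return [[M[i][j] + x[i] * y[j] for j in range(len(y))] for i in range(len(x))]

def delete_pair(M, x, y):
    """New M after erasing (x, y): M - x^T y."""
    return [[M[i][j] - x[i] * y[j] for j in range(len(y))] for i in range(len(x))]

M0 = outer([1, -1], [1, 1])          # memory holding one pair
M1 = add_pair(M0, [-1, 1], [1, -1])  # learn a second pair
M2 = delete_pair(M1, [-1, 1], [1, -1])  # forget it again: back to M0
```
Deleting the pair that was just added restores the original matrix exactly, since the operations are inverses.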
14. The stability of a BAM can be proved by identifying a Lyapunov or energy function E with each state (A, B).
In the autoassociative case, Hopfield identified an appropriate E (actually, Hopfield defined half this quantity) as
E(A) = -A M A^T
Kosko proposed the energy function
E(A, B) = -A M B^T
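Kosko's energy is a single scalar per state, easily computed (the matrix and vectors below are an assumed example):

```python
# E(A, B) = -A M B^T for bipolar row vectors A, B and correlation matrix M.
def energy(a, M, b):
    return -sum(a[i] * M[i][j] * b[j]
                for i in range(len(a)) for j in range(len(b)))

# Assumed example: a stored pair sits at low energy, its complement at high energy.
M = [[2, -2], [-2, 2], [0, 0]]
print(energy([1, -1, 1], M, [1, -1]))  # -> -8 (stored pair)
print(energy([1, -1, 1], M, [-1, 1]))  # -> 8 (mismatched pair)
```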
15. Kosko proved that each cycle of decoding lowers the energy E if the energy function for any point (α, β) is given by
E = -α M β^T
However, if the energy E evaluated using the coordinates of the pair (Ai, Bi), i.e.
E = -Ai M Bi^T
does not constitute a local minimum, then the pair cannot be recalled, even though one starts with α = Ai. In this respect, Kosko's encoding method does not ensure that the stored pairs are at local minima.
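The energy-lowering property of a single decoding step can be illustrated on an assumed small example (the matrix and starting point are ours):

```python
# One decoding step beta' = Phi(alpha M) lowers E = -alpha M beta^T.
M = [[2, -2], [-2, 2], [0, 0]]       # assumed correlation matrix
alpha, beta = [1, -1, 1], [1, 1]     # assumed noisy starting point

def energy(a, b):
    return -sum(a[i] * M[i][j] * b[j] for i in range(3) for j in range(2))

e0 = energy(alpha, beta)             # energy before decoding
f = [sum(alpha[i] * M[i][j] for i in range(3)) for j in range(2)]
beta = [1 if fi > 0 else -1 if fi < 0 else b for fi, b in zip(f, beta)]
e1 = energy(alpha, beta)             # energy after updating beta
print(e0, e1)  # -> 0 -8, so the step strictly lowered E
```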