
Autocorrelators


Slide 1: Autocorrelators
A seminar presented by Ranjit R. Banshpal, M.Tech 1st Sem (CSE), Roll No. 18
Dept. of Computer Science & Engineering, G.H. Raisoni College of Engineering, Nagpur, 2013-2014
Slide 2: Autocorrelators
• Autocorrelators are more popularly known under the title of Hopfield Associative Memory (HAM).
• A first-order autocorrelator obtains its connection matrix by multiplying each element of a pattern with every other element of the same pattern.
• A first-order autocorrelator stores m bipolar patterns A1, A2, ..., Am by summing together m outer products:

      T = [tij] = Σ (i = 1 to m) Aiᵀ Ai
Slide 3: Recall equation
• Here, T = [tij] is a (p × p) connection matrix and p is the number of components in each pattern.
• The autocorrelator's recall equation is a vector-matrix multiplication followed by a pointwise threshold:

      aj_new = f( Σ (i = 1 to p) ai tij , aj_old ),  j = 1, 2, ..., p

  where Ai = (a1, a2, ..., ap) and the two-parameter bipolar threshold function is

                 1,  if α > 0
      f(α, β) =  β,  if α = 0
                -1,  if α < 0
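The two-parameter threshold function can be written out directly (a minimal Python sketch of the f(α, β) defined above):

```python
def f(alpha, beta):
    """Two-parameter bipolar threshold used in the autocorrelator recall."""
    if alpha > 0:
        return 1
    if alpha < 0:
        return -1
    return beta  # alpha == 0: keep the previous component value
```

Passing the previous component value as β means a component whose weighted sum is exactly zero simply retains its old state.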
Slide 4: Working of an autocorrelator
Consider the following patterns:

      A1 = (-1,  1, -1,  1)
      A2 = ( 1,  1,  1, -1)
      A3 = (-1, -1, -1,  1)

The connection matrix (each outer product is a 4×1 by 1×4 multiplication) is

      T =   3   1   3  -3
            1   3   1  -1
            3   1   3  -3
           -3  -1  -3   3
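The construction of T can be checked with a short NumPy sketch (the three patterns are the ones given above; `np.outer` computes each Aiᵀ Ai product):

```python
import numpy as np

# The three stored bipolar patterns from the slide
A1 = np.array([-1,  1, -1,  1])
A2 = np.array([ 1,  1,  1, -1])
A3 = np.array([-1, -1, -1,  1])

# Connection matrix: sum of the outer products Ai^T Ai
T = np.outer(A1, A1) + np.outer(A2, A2) + np.outer(A3, A3)
```

Printing `T` reproduces the 4×4 matrix shown on the slide.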
Slide 5: Recognition of stored patterns
• The autocorrelator is presented with the stored pattern A2 = (1, 1, 1, -1).
• Applying the recall equation component by component:

      a1_new = f( 3 + 1 + 3 + 3, 1 )   = f( 10, 1 )   =  1
      a2_new = f( 1 + 3 + 1 + 1, 1 )   = f( 6, 1 )    =  1
      a3_new = f( 3 + 1 + 3 + 3, 1 )   = f( 10, 1 )   =  1
      a4_new = f( -3 - 1 - 3 - 3, -1 ) = f( -10, -1 ) = -1

• The recalled vector is (1, 1, 1, -1) = A2, so the stored pattern is retrieved exactly.
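The full recall step can be sketched as follows (a minimal NumPy sketch; `recall` is a hypothetical helper name that applies the recall equation with the two-parameter threshold inlined):

```python
import numpy as np

def recall(T, A):
    """One recall step: aj_new = f(sum_i ai * tij, aj_old)."""
    raw = A @ T  # vector-matrix multiplication
    return np.array([1 if r > 0 else (-1 if r < 0 else a)
                     for r, a in zip(raw, A)])

A1 = np.array([-1,  1, -1,  1])
A2 = np.array([ 1,  1,  1, -1])
A3 = np.array([-1, -1, -1,  1])
T = sum(np.outer(A, A) for A in (A1, A2, A3))
```

Here `A2 @ T` evaluates to (10, 6, 10, -10), and thresholding recovers A2 itself, matching the slide's component-by-component calculation.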
Slide 6: Recognition of noisy patterns
• Consider a vector A' = (1, 1, 1, 1), which is a distorted presentation of one of the stored patterns.
• With the help of the Hamming distance measure we can find the stored pattern closest to the noisy vector.
• The Hamming distance (HD) of vector X from vector Y, for X = (x1, x2, ..., xn) and Y = (y1, y2, ..., yn), is given by

      HD(X, Y) = Σ (i = 1 to n) | xi - yi |
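Finding the stored pattern nearest to the noisy vector can be sketched as follows (the pattern values are from the earlier slides; `hamming_distance` is a hypothetical helper name):

```python
import numpy as np

def hamming_distance(x, y):
    # HD(X, Y) = sum_i |xi - yi|; bipolar components that differ contribute 2 each
    return int(np.sum(np.abs(np.asarray(x) - np.asarray(y))))

A_noisy = [1, 1, 1, 1]
stored = {"A1": [-1, 1, -1, 1], "A2": [1, 1, 1, -1], "A3": [-1, -1, -1, 1]}
distances = {name: hamming_distance(A_noisy, A) for name, A in stored.items()}
```

The distances come out as 4, 2, and 6 respectively, so A2 is the closest stored pattern to the distorted input.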
Slide 7: Heterocorrelators: Kosko's discrete BAM
• Bidirectional associative memory (BAM) is a two-layer nonlinear neural network.
• Kosko extended the unidirectional autoassociative process to a bidirectional heteroassociative one.
• Recall is tolerant of a moderate amount of noise in the input pattern.
• There are N training pairs {(A1, B1), (A2, B2), ..., (Ai, Bi), ..., (AN, BN)}, where

      Ai = (ai1, ai2, ..., ain)
      Bi = (bi1, bi2, ..., bip)
Slide 8: Correlation matrix and recall
• Here, aij or bij is either in the ON or OFF state: in binary mode, ON = 1 and OFF = 0; in bipolar mode, ON = 1 and OFF = -1.
• The correlation matrix is formed from the bipolar versions Xi, Yi of the training pairs:

      M = Σ (i = 1 to N) Xiᵀ Yi

• Recall equations: starting with (α, β) as the initial condition, we determine the finite sequence (α', β'), (α'', β''), ..., until an equilibrium point (αF, βF) is reached. Here,

      β' = φ(αM)
      α' = φ(β' Mᵀ)
Slide 9: The threshold function φ
The threshold function φ maps F = (f1, f2, ..., fn) to G = (g1, g2, ..., gn):

      φ(F) = G

             1,                          if fi > 0
      gi =   0 (binary) / -1 (bipolar),  if fi < 0
             previous gi,                if fi = 0
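The bidirectional recall procedure can be sketched end to end in bipolar mode (a minimal NumPy sketch; the training pair X, Y is a made-up example, not from the slides):

```python
import numpy as np

def phi(F, prev):
    """Kosko threshold (bipolar): 1 if fi > 0, -1 if fi < 0, else previous gi."""
    return np.array([1 if f > 0 else (-1 if f < 0 else p)
                     for f, p in zip(F, prev)])

def bam_recall(M, alpha, beta, max_iter=100):
    """Iterate beta' = phi(alpha M), alpha' = phi(beta' M^T) to equilibrium."""
    for _ in range(max_iter):
        beta_new = phi(alpha @ M, beta)
        alpha_new = phi(beta_new @ M.T, alpha)
        if np.array_equal(alpha_new, alpha) and np.array_equal(beta_new, beta):
            break
        alpha, beta = alpha_new, beta_new
    return alpha, beta

# Hypothetical training pairs, already in bipolar form
X = [np.array([1, -1, 1]), np.array([-1, 1, -1])]
Y = [np.array([1, 1, -1]), np.array([-1, -1, 1])]
M = sum(np.outer(x, y) for x, y in zip(X, Y))  # correlation matrix

alpha_F, beta_F = bam_recall(M, X[0], np.array([-1, -1, -1]))
```

Presenting X[0] on the α side converges in one pass to the associated pair (X[0], Y[0]), i.e. the equilibrium point (αF, βF).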
Slide 10: Addition and deletion of pattern pairs
• Given a set of pattern pairs (Xi, Yi), i = 1, 2, ..., n, encoded in M, a new pair (X', Y') can be added to, or an existing pair (Xj, Yj) deleted from, the memory model.
• In the case of addition:

      M_new = M + X'ᵀ Y'

• In the case of deletion:

      M_new = M - Xjᵀ Yj
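The addition and deletion updates are single outer-product corrections to M (a minimal NumPy sketch; the 2-component pair is a made-up example):

```python
import numpy as np

def add_pair(M, X_new, Y_new):
    # Addition: M_new = M + X'^T Y'
    return M + np.outer(X_new, Y_new)

def delete_pair(M, X_j, Y_j):
    # Deletion: M_new = M - Xj^T Yj
    return M - np.outer(X_j, Y_j)

# Adding a pair to an empty memory, then deleting it, restores the zero matrix
M1 = add_pair(np.zeros((2, 2), dtype=int), [1, -1], [1, 1])
```

Deletion is the exact inverse of addition, which is what makes incremental updates to the memory possible.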
Slide 11: Energy function for BAM
• The value of the energy function for a stored pattern pair has to occupy a minimum point in the energy landscape.
• Adding new patterns must not destroy previously stored patterns.
Slide 12: Energy functions
• Hopfield proposed an energy function for autoassociative memory:

      E(A) = -A M Aᵀ

• Kosko proposed an energy function for BAM:

      E(A, B) = -A M Bᵀ

• The energy of an arbitrary point (α, β) is given by E = -α M βᵀ.
• Evaluating E at the coordinates of a stored pair (Ai, Bi) yields a local minimum of the energy landscape.
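Both energy functions are one-line computations (a minimal NumPy sketch; T is the connection matrix from the worked autocorrelator example, and the BAM pair is made up):

```python
import numpy as np

def hopfield_energy(A, M):
    # E(A) = -A M A^T
    A = np.asarray(A)
    return -float(A @ M @ A)

def bam_energy(A, M, B):
    # E(A, B) = -A M B^T
    return -float(np.asarray(A) @ M @ np.asarray(B))

# Connection matrix from the worked autocorrelator example (slide 4)
T = np.array([[ 3,  1,  3, -3],
              [ 1,  3,  1, -1],
              [ 3,  1,  3, -3],
              [-3, -1, -3,  3]])

# Single-pair BAM correlation matrix for a made-up pair
M_bam = np.outer([1, -1], [1, 1])
```

For the stored pattern A2 = (1, 1, 1, -1), `hopfield_energy(A2, T)` evaluates to -36, a low point of the landscape as the slide requires.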
Slide 13: Working of Kosko's BAM
• Step 1: Convert the training pairs (Ai, Bi) to bipolar forms (Xi, Yi).
• Step 2: Calculate the correlation matrix M = Σ Xiᵀ Yi.
• Step 3: Retrieve the associated pair by iterating

      β' = φ(αM)
      α' = φ(β' Mᵀ)
Slide 14: THANK YOU…!!!