This document describes Bidirectional Associative Memory (BAM), a supervised learning model in artificial neural networks. BAM stores hetero-associative pattern pairs and can retrieve a stored pattern from an incomplete or noisy input. Its architecture accepts an n-dimensional input vector from set A and recalls the associated m-dimensional output vector from set B, and vice versa. The algorithm involves learning a weight matrix between pattern pairs and testing recall accuracy. An example demonstrates storing and retrieving 4 input-output pattern pairs using a BAM with 6 input and 3 output neurons. Applications include pattern recognition and image processing.
TABLE OF CONTENTS
• 1. INTRODUCTION
• 2. OBJECTIVE OF BAM
• 3. ARCHITECTURE
• 4. ALGORITHM
• 5. EXAMPLE BASED BAM
• 6. APPLICATION
• 7. LIMITATION
• 8. REFERENCES
1. INTRODUCTION
• Bidirectional Associative Memory (BAM) is a supervised learning model in Artificial Neural Networks.
• BAM is a hetero-associative memory: for an input pattern, it returns another, associated pattern, which may be of a different size.
• A recurrent neural network (RNN) structure is needed so that a pattern presented to one set of neurons generates a related, but different, output pattern on another set of neurons.
2. OBJECTIVE OF BAM
• The network model stores hetero-associative pattern pairs.
• It is used to retrieve a stored pattern when given a noisy or incomplete version of that pattern.
3. ARCHITECTURE
• When the BAM accepts an n-dimensional input vector X from set A, the model recalls the associated m-dimensional vector Y from set B. Similarly, when Y is presented as input, the BAM recalls X.
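The two recall directions can be sketched as follows; the dimensions match the example later in the document, but W here is only a random placeholder for a learned weight matrix:

```python
import numpy as np

n, m = 6, 3                       # dimensions of sets A and B
W = np.random.randn(n, m)         # placeholder for a learned n x m weight matrix

X = np.ones(n)                    # an n-dimensional vector from set A
Y = np.sign(W.T @ X)              # forward pass recalls an m-dimensional vector
X_recalled = np.sign(W @ Y)       # presenting Y recalls an n-dimensional vector

print(Y.shape, X_recalled.shape)  # (3,) (6,)
```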
4. ALGORITHM
1. Storage (learning): In this step, the weight matrix between the M pairs of patterns (fundamental memories) is computed and stored in the synaptic weights of the network following the equation

W = Σ_{m=1}^{M} X_m Y_m^T

where X_m and Y_m are the (bipolar) input and output vectors of the m-th pair.
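The storage step can be sketched in NumPy; the two bipolar pattern pairs below are made up for illustration and are not the document's example data:

```python
import numpy as np

# Hypothetical bipolar pattern pairs (X_m, Y_m), M = 2, n = 4, m = 3
X = np.array([[ 1, -1,  1, -1],
              [ 1,  1, -1, -1]])          # input patterns (set A)
Y = np.array([[ 1,  1, -1],
              [ 1, -1,  1]])              # output patterns (set B)

# Storage: W = sum over m of the outer product X_m Y_m^T (an n x m matrix)
W = sum(np.outer(x, y) for x, y in zip(X, Y))
print(W.shape)   # (4, 3)
```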
2. Testing: We check that the BAM recalls Y_m perfectly when presented with X_m, and recalls X_m when presented with Y_m, using

Y_m = sign(W^T X_m),  m = 1, 2, ..., M
X_m = sign(W Y_m),    m = 1, 2, ..., M

All pairs should be recalled correctly.
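A self-contained sketch of the testing step, again with made-up bipolar pattern pairs rather than the document's data:

```python
import numpy as np

# Stored pattern pairs (illustrative assumptions)
X = np.array([[ 1, -1,  1, -1],
              [ 1,  1, -1, -1]])          # input patterns (set A)
Y = np.array([[ 1,  1, -1],
              [ 1, -1,  1]])              # target patterns (set B)
W = sum(np.outer(x, y) for x, y in zip(X, Y))

# Forward recall: Y_m = sign(W^T X_m); backward recall: X_m = sign(W Y_m)
for x, y in zip(X, Y):
    assert np.array_equal(np.sign(W.T @ x), y)   # X_m recalls Y_m
    assert np.array_equal(np.sign(W @ y), x)     # Y_m recalls X_m
print("all pairs recalled correctly")
```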
3. Retrieval: Present an unknown vector X (a corrupted or incomplete version of a pattern from set A or B) to the BAM and retrieve a previously stored association:

o Initialize the BAM: X(0) = X, p = 0
o Calculate the BAM output at iteration p: Y(p) = sign(W^T X(p))
o Update the input vector: X(p+1) = sign(W Y(p))
o Repeat the iteration until convergence, when input and output remain unchanged.
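The retrieval loop above can be sketched as follows; the stored pairs and the corrupted probe are illustrative assumptions, not the document's data:

```python
import numpy as np

# Stored pattern pairs (illustrative, n = 6, m = 3)
X = np.array([[1, -1,  1, -1, 1, -1],
              [1,  1, -1, -1, 1,  1]])
Y = np.array([[1,  1, -1],
              [1, -1,  1]])
W = sum(np.outer(x, y) for x, y in zip(X, Y))

# Retrieval: start from a corrupted version of X_1 (first bit flipped)
x = X[0].copy()
x[0] = -x[0]                       # X(0): noisy probe

while True:
    y = np.sign(W.T @ x)           # Y(p)   = sign(W^T X(p))
    x_next = np.sign(W @ y)        # X(p+1) = sign(W Y(p))
    if np.array_equal(x_next, x):  # converged: input no longer changes
        break
    x = x_next

print(np.array_equal(x, X[0]), np.array_equal(y, Y[0]))   # True True
```

Despite the flipped bit, the iteration settles on the stored pair (X_1, Y_1).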
5. EXAMPLE BASED BAM
• Set A: Input Patterns
• Set B: Corresponding Target Patterns
Step 1: Here, the value of M (the number of pattern pairs) is 4.
Step 2: Assign the neurons in the input and output layers. Here, there are 6 neurons in the input layer and 3 in the output layer.
Step 3: Now, compute the weight matrix W = Σ_{m=1}^{4} X_m Y_m^T.
Step 4: Test the BAM learning algorithm: for each input pattern, the BAM will return the corresponding target pattern as output, and for each target pattern, the BAM will return the corresponding input pattern.
• Test on the input patterns (Set A) using Y_m = sign(W^T X_m).
• Test on the target patterns (Set B) using X_m = sign(W Y_m).
Here, for each input pattern, the BAM model gives the correct target pattern, and for each target pattern, the model gives the corresponding input pattern. This demonstrates the bidirectional association in the memory network.
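The slides' actual pattern vectors did not survive extraction, so the following is a self-contained sketch of the whole example with the same dimensions (M = 4 pairs, 6 input neurons, 3 output neurons) but hypothetical bipolar patterns:

```python
import numpy as np

# Hypothetical bipolar pattern pairs mirroring the example's dimensions;
# these are NOT the slides' original vectors.
A = np.array([[ 1,  1,  1, -1,  1,  1],
              [ 1, -1, -1, -1, -1, -1],
              [-1,  1, -1,  1,  1, -1],
              [-1, -1,  1,  1, -1,  1]])   # Set A: 4 input patterns, n = 6
B = np.array([[ 1,  1,  1],
              [ 1, -1, -1],
              [-1,  1, -1],
              [-1, -1,  1]])               # Set B: 4 target patterns, m = 3

# Steps 1-3: M = 4, n = 6, m = 3; W = sum of outer products X_m Y_m^T
W = sum(np.outer(x, y) for x, y in zip(A, B))

# Step 4: test recall in both directions
forward  = all(np.array_equal(np.sign(W.T @ x), y) for x, y in zip(A, B))
backward = all(np.array_equal(np.sign(W @ y), x) for x, y in zip(A, B))
print(forward, backward)   # True True
```

Every pair is recalled correctly in both directions, which is the bidirectional association the example illustrates.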
6. APPLICATION
• Pattern recognition
• Image processing

7. LIMITATION
• Storage capacity of the BAM: the number of stored associations should not exceed the number of neurons in the smaller layer.
• Incorrect convergence: the BAM may not always produce the closest stored association.