Adaptive Resonance Theory (ART): An abstract
by
L.G. Heins & D.R. Tauritz
Motivation
A problem area to which neural networks can readily be applied is that of dynamically self-organizing data. Self-organizing implies the use of an unsupervised clustering neural network, and because we want it to classify the data dynamically we need it to be an unsupervised incremental clustering neural network. Alas, many such neural networks do not resolve the stability-plasticity dilemma, which can be posed as follows:
• How can a learning system be designed to remain plastic, or adaptive, in response to
significant events and yet remain stable in response to irrelevant events?
• How does the system know how to switch between its stable and its plastic modes
to achieve stability without rigidity and plasticity without chaos?
• In particular, how can it preserve its previously learned knowledge while continuing
to learn new things?
• And what prevents new learning from washing away the memories of prior learning?
Most existing systems are either stable but not capable of forming new clusters, or
incremental but unstable. The ART-1 neural network was specifically designed to overcome this dilemma for binary input vectors; ART-2 extends it to continuous-valued inputs as well. In this abstract we confine ourselves to discussing ART-1.
Concepts
A neural network is a parallel implementation of a sequential algorithm. Thus, we can study
the properties of a neural network by examining the algorithm it implements without being
distracted by its architecture. To gain insight into what ART-1 does, as opposed to how it
does it, we will first present an algorithmic description.
Note: v ∧ w = bitwise AND of the vectors v and w; |u| = magnitude of u = # of 1's in u
Step 1 - Initialisation
• Initialise N to the total number of clusters
• Initialise the vigilance parameter ρ, with 0 < ρ ≤ 1
• Initialise β to a small positive number (the tie-breaker)
• Initialise the set P of prototype vectors to the empty set ∅
Step 2 - Apply new input vector
• Let I := [next input vector]
• Let P' := P be the set of candidate prototype vectors
Step 3 - Find the closest prototype vector from P'
• Find the P_i ∈ P' which maximizes |P_i ∧ I| / (β + |P_i|)
Step 4 - Check if P_i is too far from I
• If |P_i ∧ I| / |I| < ρ then remove P_i from P'; if P' is now empty, create a new prototype vector equal to I, add it to P, and goto step 2, otherwise goto step 3
Step 5 - Update the matched prototype vector
• Let P_i := P_i ∧ I and goto step 2
The β term acts as a tie-breaker, favouring larger-magnitude prototype vectors when multiple prototype vectors are subsets of the input vector. This compensates for the fact that prototype vectors can only move in one direction. The vigilance parameter ρ defines the class sizes. When ρ is small the network produces large classes. As ρ gets larger, the vigilance of the network increases, and finer classes are the result. When ρ equals one, the prototype vectors have to match the input vectors perfectly, so every distinct input vector produces a new class equal to itself.
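The algorithmic description above can be sketched in Python. This is a minimal illustrative sketch rather than the authors' implementation: the names (art1, rho, beta) are assumptions, input vectors are assumed to be non-empty binary lists containing at least one 1, and no cap on the number of clusters is enforced.

```python
# Minimal sketch of the simplified ART-1 clustering algorithm above.
# Assumptions: binary inputs as 0/1 lists, each containing at least one 1.

def magnitude(v):
    """|v| = number of 1's in the binary vector v."""
    return sum(v)

def bitwise_and(v, w):
    """v AND w, componentwise."""
    return [a & b for a, b in zip(v, w)]

def art1(inputs, rho=0.7, beta=1.0):
    """Incrementally cluster binary vectors; return (prototypes, assignments)."""
    prototypes = []                      # the set P
    assignments = []
    for I in inputs:
        candidates = list(range(len(prototypes)))   # the set P'
        matched = None
        while candidates:
            # Step 3: closest candidate maximizes |P_i AND I| / (beta + |P_i|)
            i = max(candidates,
                    key=lambda j: magnitude(bitwise_and(prototypes[j], I))
                                  / (beta + magnitude(prototypes[j])))
            # Step 4: vigilance test -- is P_i close enough to I?
            if magnitude(bitwise_and(prototypes[i], I)) / magnitude(I) >= rho:
                matched = i
                break
            candidates.remove(i)         # too far: drop from P', try the next
        if matched is None:
            prototypes.append(list(I))   # new class equal to the input itself
            assignments.append(len(prototypes) - 1)
        else:
            # Step 5: prototypes move one way only -- AND clears bits, never sets them
            prototypes[matched] = bitwise_and(prototypes[matched], I)
            assignments.append(matched)
    return prototypes, assignments
```

With rho close to one the vigilance test rarely passes, so nearly every distinct input founds its own class; with a small rho a few coarse classes absorb everything.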
Mechanics
Two layers, F1 and F2, of the attentional subsystem encode patterns of activation in short-
term memory (STM). Bottom-up and top-down pathways between F1 and F2 contain
adaptive long-term memory (LTM) traces which multiply the signals in these pathways.
The remainder of the circuit modulates these STM and LTM processes.
F1 nodes are supraliminally activated (that is, sufficiently activated to generate output) if
they receive a signal from at least two out of three possible input sources. The three are
bottom-up input, top-down input and attentional gain control input. If an F1 node receives input from only one of these sources it is subliminally activated. This is called the 2/3 rule.
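The 2/3 rule can be stated as a simple predicate. This is an illustrative sketch; the function name and the string labels are assumptions, not part of the model's notation.

```python
def f1_activation(bottom_up, top_down, gain_control):
    """Classify an F1 node's state under the 2/3 rule.

    Each argument is truthy when that input source is active.
    """
    active = sum(map(bool, (bottom_up, top_down, gain_control)))
    if active >= 2:
        return "supraliminal"   # enough input to generate output
    if active == 1:
        return "subliminal"     # activated, but below output threshold
    return "inactive"
```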
ART-1 hypothesis testing cycle:
1. Input pattern I generates the STM activity pattern X at F1 and
activates both F1's gain control and the orienting subsystem A. Pattern X both inhibits A
and generates the bottom-up signal pattern S which is transformed by the adaptive filter
into the input pattern T. F2 is designed as a competitive network: only the node which receives the largest total input is activated ("winner-take-all").
2. Pattern Y at F2 generates the top-down signal pattern U which is
transformed by the top-down adaptive filter into the expectation pattern V. Pattern Y also
inhibits F1's gain control, as a result of which only those F1 nodes that represent bits in
the intersection of the input pattern I and the expectation pattern V remain supraliminally
activated. If V mismatches I this results in a decrease in the total inhibition from F1 to A.
3. If the mismatch is severe enough A can no longer be prevented
from releasing a nonspecific arousal wave to F2, which resets the active node at F2. The
vigilance parameter determines how much mismatch will be tolerated.
4. After the F2 node is inhibited its top-down expectation is
eliminated and X can be reinstated at F1. The cycle then begins again. X once again
generates input pattern T to F2, but a different node is activated. The previously chosen F2
node remains inhibited until F2's gain control is disengaged by removal of the input
pattern.
The parallel search, or hypothesis testing, cycle repeats automatically at a very fast rate
until one of three possibilities occurs: (1) an F2 node is chosen whose top-down expectation approximately matches input I; (2) a previously uncommitted F2 node is selected; or (3)
the entire capacity of the system is used and input I cannot be accommodated. Until one of
these outcomes prevails, essentially no learning occurs because all the STM computations
of the hypothesis testing cycle proceed so quickly that the more slowly varying LTM traces
in the bottom-up and top-down adaptive filters cannot change in response to them.
Significant learning in response to an input pattern occurs only after the cycle that it
generates comes to an end and the system is in a resonant state.
References
Carpenter, Gail A. & Grossberg, Stephen (1987), "A Massively Parallel Architecture for a Self-Organizing Neural Pattern Recognition Machine", Computer Vision, Graphics, and Image Processing, Volume 37, pp. 54-115.
Freeman, James A. & Skapura, David M. (1991), Neural Networks: Algorithms, Applications, and Programming Techniques, Chapter 8.
Hertz, John, Krogh, Anders & Palmer, Richard G. (1991), Introduction to the Theory of Neural Computation, Section 9.3.
Moore, Barbara (1989), "ART 1 and Pattern Clustering", pp. 174-185.