Graded Patterns in Attractor Networks
Tristan Webb · Supervisor: Jianfeng Feng · Co-Supervisor: Edmund Rolls
  Complexity Science DTC, Computational Biology Research Group
  University of Warwick

Summary

We demonstrate how noise can persist in a neural network as large as the brain. Graded firing patterns allow noise levels to be tuned when engineering neural networks. The level of noise in the brain may change with age and may play a functional role in the retrieval of memory.

Attractor Neural Networks

Neural coding, and its relationship to behavior, is heavily researched in many areas of neuroscience. Attractor networks demonstrate how decisions, memories, and other cognitive representations can be encoded as a firing pattern (a set of active neurons) in a neural network.

An attractor network receives sensory information through connections known as synapses. The network is characterized by recurrent collateral synapses that provide feedback to neurons. This recurrent synaptic activity causes firing patterns in the network to persist even after the input is removed.

Learning occurs through the modification of synaptic strengths (w_ij, the weight of the jth synapse onto the ith neuron). An associative (Hebbian) learning rule can create the correct structure for the recall of information: this type of learning strengthens connections between neurons that are simultaneously active.
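As a minimal sketch of such a Hebbian update (the outer-product form, rate vector, and learning rate here are illustrative assumptions, not the poster's stated procedure):

```python
import numpy as np

def hebbian_update(w, y, lr=0.01):
    """Strengthen w[i, j] when neurons i and j are co-active
    (outer product of the firing-rate vector with itself)."""
    w += lr * np.outer(y, y)
    np.fill_diagonal(w, 0.0)  # no self-connections
    return w

# Example: two co-active neurons out of five strengthen their mutual weights.
w = hebbian_update(np.zeros((5, 5)), np.array([1.0, 1.0, 0.0, 0.0, 0.0]))
```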
The network dynamics can be thought of as gradient descent towards a local minimum in an energy landscape. When the network has reached this minimum, the learned pattern is recalled. The energy is defined as

    E = -\frac{1}{2} \sum_{ij} w_{ij} (y_i - \langle y \rangle)(y_j - \langle y \rangle),

where y_i is the firing of the ith neuron and \langle y \rangle is the population's mean firing rate. Fixed points in attractor networks can correspond to a spontaneous state (where all neurons have a low firing rate), or a persistent state in which a subset of neurons have a high firing rate.

[Figure: schematic of an attractor network. External inputs and recurrent firing y_j arrive at the dendrites through recurrent collateral synapses w_ij; the cell bodies emit output firing y_i.]
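A direct transcription of this energy function (a sketch; the weights and rates below are arbitrary illustration data):

```python
import numpy as np

def network_energy(w, y):
    """E = -1/2 * sum_ij w_ij (y_i - <y>)(y_j - <y>) for a rate vector y."""
    d = y - y.mean()
    return -0.5 * d @ w @ d

rng = np.random.default_rng(0)
w = rng.uniform(0.0, 2.1, size=(5, 5))         # illustrative weights
rates = np.array([40.0, 35.0, 2.0, 3.0, 1.0])  # Hz, illustrative
print(network_energy(w, rates))
```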
Network Dynamics

Neurons in the simulations use Integrate-and-Fire (IF) dynamics to describe the membrane potential. We chose biologically realistic constants to obtain firing rates that are comparable to experimental measurements of neural activity. IF neurons integrate synaptic current into a membrane potential, and then fire when the membrane potential reaches a voltage threshold.
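A minimal leaky IF neuron in discrete time (a sketch only; the time constant, threshold, reset, and drive below are illustrative values, not the constants used in these simulations):

```python
import numpy as np

def simulate_if(drive, dt=1e-4, tau=0.02, v_rest=-0.070,
                v_thresh=-0.050, v_reset=-0.055):
    """Leaky integrate-and-fire: integrate the input drive into the
    membrane potential; spike and reset on reaching threshold."""
    v, spike_times = v_rest, []
    for step, i_t in enumerate(drive):
        v += dt * ((v_rest - v) / tau + i_t)
        if v >= v_thresh:
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

spikes = simulate_if(np.full(10000, 1.5))  # 1 s of constant drive (V/s)
print(len(spikes), "spikes")
```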
The synaptic current flowing into each neuron is described in terms of neurotransmitter components. The four families of receptors used are GABA, NMDA, AMPA_rec, and AMPA_ext. The currents driven by a presynaptic excitatory neuron are AMPA_rec and NMDA, while inhibitory neurons transmit GABA currents. Each neuron also receives external input through a spike train modeled by a Poisson process with rate λ_i = 3.0 Hz.

The total synaptic current flowing into a neuron is given by the following equation, where each term on the right-hand side is the current from one class of neurotransmitter:

    I_syn(t) = I_GABA(t) + I_NMDA(t) + I_AMPA,rec(t) + I_AMPA,ext(t)
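A sketch of the external drive, assuming independent homogeneous Poisson spike trains (the four-second duration matches the simulations; the oversampling factor is an implementation detail):

```python
import numpy as np

def poisson_spike_train(rate_hz, duration_s, rng):
    """Spike times of a homogeneous Poisson process: cumulative sums of
    exponential inter-spike intervals with mean 1/rate."""
    n_draws = int(rate_hz * duration_s * 3) + 10  # generous oversampling
    isi = rng.exponential(1.0 / rate_hz, size=n_draws)
    t = np.cumsum(isi)
    return t[t < duration_s]

rng = np.random.default_rng(1)
train = poisson_spike_train(3.0, 4.0, rng)  # lambda_i = 3.0 Hz, 4 s run
print(len(train), "external spikes")
```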


Architecture

We structure the network by setting the strengths of interaction between two decision pools, D1 and D2, to values that could arise through associative learning.

[Figure: network architecture. Inhibitory neurons, excitatory neurons, and a non-specific pool; a blow-up shows the excitatory sub-populations D1 and D2, with weights w+ within each pool and w- between them.]

Neurons in the same decision pool are connected to each other with a strong average weight w+, and are connected to the other excitatory pool with a weak average weight w-.
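A sketch of how such a block-structured weight matrix could be assembled (the pool size and the w- value are illustrative assumptions; w+ = 2.1 is the uniform value used below):

```python
import numpy as np

def pool_weights(n_pool=40, w_plus=2.1, w_minus=0.9):
    """Recurrent excitatory weights for two decision pools: strong w+
    within each pool, weak w- between the pools."""
    n = 2 * n_pool
    w = np.full((n, n), w_minus)
    w[:n_pool, :n_pool] = w_plus  # D1 -> D1
    w[n_pool:, n_pool:] = w_plus  # D2 -> D2
    np.fill_diagonal(w, 0.0)      # no self-connections
    return w

w = pool_weights()
```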
Graded Patterns

The network was simulated numerically for a period of four seconds. We present the network with two successive periods of different external stimulus levels: first a base period, and later a cue period. During the base period the firing pattern in the network is sporadic and uneven. When the cues are applied, the firing rate of the neurons in the winning decision pool is raised through positive feedback, while the other pool is suppressed through increased inhibition.

[Figure: mean neuron firing rates (Hz) over the final second, plotted against neuron number for the winning and losing pools, under uniform (left) and graded (right) weights.]

We imposed uniform and graded firing patterns on the network by selecting the distribution of the recurrent weights within each decision pool. To achieve a uniform firing pattern, all weights were set to the same value w+ = 2.1. Graded firing patterns were achieved by drawing the weights from a discrete exponential-like distribution with mean value w+ ≈ 2.1, as sketched below.
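One way to realize such a distribution (a sketch; the number of discrete levels, the decay rate, and the rescaling to the target mean are all assumptions beyond "discrete exponential-like with mean ≈ 2.1"):

```python
import numpy as np

def graded_weights(n=40, w_mean=2.1, levels=8, decay=0.5, rng=None):
    """Draw n weights from a discrete exponential-like distribution,
    rescaled so the sample mean equals w_mean."""
    rng = rng or np.random.default_rng()
    values = decay ** np.arange(levels)  # geometric ladder of weight levels
    probs = values / values.sum()        # exponential-like level probabilities
    w = rng.choice(values, size=n, p=probs)
    return w * (w_mean / w.mean())       # enforce the target mean

w_graded = graded_weights()
print(round(w_graded.mean(), 2))  # 2.1 by construction
```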
Results

Graded simulations were more likely to jump to a decision early. This could be caused by decreased stability of the spontaneous state. The changes in the reaction time distributions are statistically significant, and the decrease in reaction time is robust across different firing rates of the winning pool.

[Figure: reaction time (msec) against the winning pool's final-second firing rate (Hz), for graded and uniform simulations.]

Variability in the system also increases when graded patterns are introduced. Here we use the Fano factor to compute the trial-to-trial variability of membrane potentials across simulations. The Fano factor is calculated from the variance of the potential measured in a window of temporal length T, and is expressed as a function of time:

    F(T) = \frac{\sum_{n}^{T_r} [V_{i,n}(T) - \langle V_i(T) \rangle]^2}{\langle V_i(T) \rangle},

where \langle V_i(T) \rangle is the average potential of neuron i in the time window and T_r is the number of trials.

[Figure: average Fano factor of the membrane potential over simulation time (0.5-4.0 s), for graded and uniform simulations.]
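A direct transcription of this computation (a sketch; the trial count, window count, and synthetic potentials are illustrative):

```python
import numpy as np

def fano_factor(v_windows):
    """Fano factor per time window, following the poster's expression:
    summed squared deviations across trials divided by the mean.
    v_windows has shape (n_trials, n_windows)."""
    mean = v_windows.mean(axis=0)                   # <V_i(T)> per window
    sq_dev = ((v_windows - mean) ** 2).sum(axis=0)  # sum over the T_r trials
    return sq_dev / mean

rng = np.random.default_rng(2)
v = 1.0 + 0.02 * rng.standard_normal((50, 35))  # 50 trials, 35 windows
print(fano_factor(v)[:5])
```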
Conclusion

The transition time to an attractor state, or reaction time, is decreased when neurons fire in a more biologically realistic pattern. There is greater variability in the system's states over time when graded patterns are introduced.

Increased variance in the synaptic input to each neuron can be thought of as increased noise in the system. Conceptually, graded patterns are noisier because the recurrent synaptic input varies across the population.

As neural networks become larger, noise will invariably become lower. However, when we consider the situation in the brain, even though the network is large, there is still significant noise in the system. We present the hypothesis that this noise is due in part to graded firing patterns. Further work will explore this analytically.

Complexity DTC - University of Warwick - Coventry, UK                               Mail: tristan.webb@warwick.ac.uk                                                                              WWW: http://warwick.ac.uk/go/tristanwebb
