
HTM Spatial Pooler

Numenta engineer Yuwei Cui walks through how the HTM Spatial Pooler works, explaining its desired properties and how the algorithm achieves them. Includes graphs of SP online-learning performance and discussion of topology and boosting.



  1. Yuwei Cui (ycui@numenta.com). HTM Spatial Pooler. 02/08/2017
  2. Spatial Pooler Resources
     • Hawkins J, Ahmad S, Dubinsky D (2011) Cortical learning algorithm and hierarchical temporal memory. Numenta Whitepaper. http://numenta.org/resources/HTM_CorticalLearningAlgorithms.pdf
     • Cui Y, Ahmad S, Hawkins J (under review) The HTM Spatial Pooler: A Neocortical Algorithm for Online Sparse Distributed Coding. bioRxiv. DOI: 10.1101/085035
     • HTM School: http://numenta.org/htm-school/
     • Spatial Pooler in NuPIC: https://github.com/numenta/nupic/blob/master/src/nupic/research/spatial_pooler.py
  3. Background
     • How do individual neurons learn to respond to specific input patterns?
     • How do populations of neurons represent input features?
     [Figure: retina, cochlea, and somatic data streams plus motor control: millions of spike trains]
  4. Online sequence learning with HTM
     [Pipeline diagram: Data → Encoder → SDR → Spatial Pooler → HTM High-Order Sequence Memory → Classifier → Predictions / Classification; anomaly likelihood → Anomalies]
  5. HTM Spatial Pooler
     [Pipeline diagram as on slide 4, highlighting the Spatial Pooler stage]
     The Spatial Pooler takes a large number of unlabeled inputs of variable sparsity (Input 1, Input 2, Input 3, …) and produces SDRs with fixed sparsity. What properties does the SP achieve?
  6. Outline
     • Background
     • HTM spatial pooler
     • Properties and metrics
     • Simulation results
  7. HTM spatial pooler
  8. HTM spatial pooler – winner take all
     Overlap of the ith MC: $o_i = \sum_j W_{ij} z_j$, where $z_j$ is the jth input bit and $W_{ij}$ is the synaptic connection from the jth input to the ith MC.
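A minimal sketch of this step in Python/NumPy. The deck links NuPIC's spatial_pooler.py for the real implementation; the names below (`spatial_pooler_winners`, `W`, `z`, `k`) are illustrative assumptions, not NuPIC's API.

```python
import numpy as np

def spatial_pooler_winners(W, z, k):
    """Overlap + winner-take-all step of the spatial pooler (sketch).

    W : (n_minicolumns, n_inputs) binary matrix; W[i, j] = 1 if the
        synapse from input j to minicolumn i is connected.
    z : (n_inputs,) binary input vector.
    k : number of winning minicolumns (the fixed sparsity target).
    """
    # Overlap of the ith MC: o_i = sum_j W_ij * z_j
    overlaps = W @ z
    # Winner-take-all: the k minicolumns with the largest overlap fire
    winners = np.argsort(overlaps)[-k:]
    active = np.zeros(W.shape[0], dtype=np.int8)
    active[winners] = 1
    return active, overlaps
```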
  9. HTM spatial pooler – boosting
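The boost equation itself did not survive extraction. In the companion paper (Cui, Ahmad & Hawkins), each minicolumn's overlap is multiplied by a boost factor that grows exponentially as its time-averaged duty cycle falls below the mean; the sketch below assumes that form, with an assumed strength parameter `beta`.

```python
import numpy as np

def update_boost_factors(duty_cycles, beta=2.0):
    """Boost factors b_i = exp(beta * (<duty> - duty_i)) (assumed form).

    duty_cycles : (n_minicolumns,) time-averaged activation of each MC.
    MCs that have fired less than average get b_i > 1, so their boosted
    overlap b_i * o_i competes better in the winner-take-all step.
    """
    return np.exp(beta * (duty_cycles.mean() - duty_cycles))
```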
  10. HTM spatial pooler – learning
      Hebbian learning rule: active (winner) MCs reinforce their connections from active inputs and depress their connections from inactive inputs.
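A sketch of this Hebbian rule on a permanence matrix, assuming real-valued permanences in [0, 1]; the parameter names `p_inc` and `p_dec` are assumptions for illustration.

```python
import numpy as np

def hebbian_update(P, z, active, p_inc=0.05, p_dec=0.01):
    """Winner MCs reinforce active inputs and depress inactive ones.

    P      : (n_minicolumns, n_inputs) permanence matrix in [0, 1];
             a synapse counts as connected above some threshold.
    z      : (n_inputs,) binary input vector.
    active : (n_minicolumns,) binary vector of winning MCs.
    """
    winners = active.astype(bool)
    # +p_inc on synapses from active inputs, -p_dec on the rest
    P[winners] += np.where(z.astype(bool), p_inc, -p_dec)
    np.clip(P, 0.0, 1.0, out=P)
    return P
```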
  11. Learning in HTM Spatial Pooler
      • Why do we need learning in the SP? If inputs are random, an untrained random SP does just as well as any trained SP.
      • Real inputs, however, are structured: input SDRs occur with unequal probabilities. After learning, the actual inputs should be represented "better" than random inputs.
  12. Properties of HTM spatial pooler
      • Fixed sparseness
      • Distributed coding
      • Preserving semantic similarity
      • Noise robustness / fault tolerance
      • Continuous learning
      • Stability
  13. Properties of SP – fixed sparseness
      Population sparseness: $s = \frac{1}{N}\sum_{i=1}^{N} a_i$, the fraction of the N MCs active for a given input.
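Made concrete as code (the `sdrs` matrix, one SP output per row, is an assumed layout):

```python
import numpy as np

def population_sparseness(sdrs):
    """Per-input population sparseness s = (1/N) * sum_i a_i.

    sdrs : (n_samples, n_minicolumns) binary SP outputs.
    For the SP this should be nearly constant across rows,
    whatever the sparsity of the raw inputs.
    """
    return sdrs.mean(axis=1)
```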
  14. Properties of SP – distributed coding
      Activation prob. of the ith MC: $p_i$. Entropy of the ith MC: $H_i = -p_i \log_2 p_i - (1 - p_i) \log_2 (1 - p_i)$.
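A sketch of the entropy metric under the same assumed `sdrs` layout; higher total entropy means activity is spread more evenly across MCs, i.e. a more distributed code.

```python
import numpy as np

def sp_entropy(sdrs):
    """Total binary entropy of the minicolumn activation probabilities.

    p_i = activation probability of the ith MC (mean over samples);
    H_i = -p_i*log2(p_i) - (1 - p_i)*log2(1 - p_i).
    """
    p = sdrs.mean(axis=0)
    p = np.clip(p, 1e-12, 1 - 1e-12)  # avoid log(0) for dead/saturated MCs
    return float(np.sum(-p * np.log2(p) - (1 - p) * np.log2(1 - p)))
```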
  15. Properties of SP – noise robustness
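One way to quantify noise robustness, sketched under assumptions (the flip-noise model and overlap score below are illustrative, not necessarily the deck's exact protocol): corrupt an input by relocating a fraction of its active bits, run both versions through the SP, and compare the output SDRs.

```python
import numpy as np

def flip_noise(z, fraction, rng):
    """Corrupt a binary input by relocating `fraction` of its active bits."""
    z = z.copy()
    on, off = np.flatnonzero(z), np.flatnonzero(z == 0)
    n_flip = int(fraction * on.size)
    z[rng.choice(on, n_flip, replace=False)] = 0   # drop some active bits
    z[rng.choice(off, n_flip, replace=False)] = 1  # add spurious ones
    return z

def sdr_overlap(a, b):
    """Fraction of active minicolumns shared by two SP outputs."""
    return (a & b).sum() / max(a.sum(), 1)

# Usage (sp_out is a stand-in for a trained SP's compute step):
#   rng = np.random.default_rng(0)
#   noisy = flip_noise(z, 0.2, rng)
#   robustness = sdr_overlap(sp_out(z), sp_out(noisy))
```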
  16. Properties of SP – continuous learning
  17. Properties of SP – continuous learning
  18. Properties of SP – fault tolerance
      Experiment 1: lesion of SP MCs. We monitor:
      • RF center of SP MCs
      • Coverage of input space
      • Avg boost factors
      • Growth & elimination of synapses
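A hedged sketch of how a minicolumn lesion might be simulated (the masking approach is an assumption, not the paper's code): dead MCs are removed from the competition and their synapses zeroed, after which boosting and synaptic growth let surviving MCs re-cover the input space.

```python
import numpy as np

def lesion_minicolumns(P, alive, start, stop):
    """Kill MCs in [start, stop): zero their synapses and mask them out.

    P     : (n_minicolumns, n_inputs) permanence matrix.
    alive : (n_minicolumns,) boolean mask; winner-take-all should only
            consider MCs with alive[i] == True after the lesion.
    """
    alive[start:stop] = False
    P[start:stop] = 0.0
    return P, alive
```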
  19. Properties of SP – fault tolerance. Experiment 1: lesion of SP MCs
  20. Properties of SP – fault tolerance. Experiment 1: lesion of SP MCs
  21. Properties of SP – fault tolerance
      Experiment 2: lesion of afferent inputs. We monitor:
      • RF center of SP MCs
      • Coverage of input space
      • Avg boost factors
      • Growth & elimination of synapses
  22. Properties of SP – fault tolerance. Experiment 2: lesion of afferent inputs
  23. Properties of SP – fault tolerance. Experiment 2: lesion of afferent inputs
  24. Properties of HTM spatial pooler
      • Fixed sparseness
      • Distributed coding
      • Preserving semantic similarity
      • Noise robustness / fault tolerance
      • Continuous learning
      • Stability
  25. Resources
      • Hawkins J, Ahmad S, Dubinsky D (2011) Cortical learning algorithm and hierarchical temporal memory. Numenta Whitepaper, pp. 1–68. http://numenta.org/resources/HTM_CorticalLearningAlgorithms.pdf
      • Cui Y, Ahmad S, Hawkins J (under review) The HTM Spatial Pooler: A Neocortical Algorithm for Online Sparse Distributed Coding. bioRxiv. DOI: 10.1101/085035
  26. Neural mechanisms of SP
