MIT
December 15, 2017
Jeff Hawkins
jhawkins@numenta.com
Have We Missed Half of What the Neocortex Does?
Allocentric Location as the Basis of Perception
1) Reverse engineer the neocortex
- an ambitious but realizable goal
- seek biologically accurate theories
- test empirically and via simulation
2) Enable technology based on cortical theory
- active open source community
- basis for Machine Intelligence
[Diagram: a cortical column with excitatory cellular layers L2, L3a, L3b, L4, L5 (tt, cc, cc-ns), and L6 (a, b, ip, mp, bp), grouped as L2/3, L4, L5, and L6, with its input]
The Cortical Column
1) Cortical columns are complex
- Twelve or more excitatory cellular layers
- Two parallel feedforward (FF) pathways
- Parallel feedback (FB) pathways (not shown)
- Numerous intra- and inter-column connections (not shown)
- Inhibitory neurons/circuits are equally complex
2) The function of a cortical column must also be complex.
3) Whatever a column does applies to everything the cortex does.
L5: Callaway et al., 2015
L6: Zhang and Deschenes, 1997
[Diagram: L5 output pathways: a direct output, and an output via the thalamus back to cortex (L5 CTC: Guillery, 1995; Constantinople and Bruno, 2013)]
A Couple of Thoughts
Observation:
The neocortex is constantly predicting its inputs.
How do networks of neurons, as seen in the neocortex,
learn predictive models of the world?
Research:
1) How does the cortex learn predictive models of extrinsic sequences?
2) How does the cortex learn predictive models of sensorimotor sequences?
Current research: How do columns compute allocentric location?
- Grid cells in entorhinal cortex solve a similar problem
- Big Idea: cortical columns contain analogs of grid cells and head direction cells
- Starting to understand the function of numerous layers and connections
“Why Neurons Have Thousands of Synapses, a Theory of Sequence Memory in the Neocortex”
Hawkins and Ahmad, Frontiers in Neural Circuits, 2016/03/30
- Big Idea: Pyramidal neuron model for prediction
- A single layer network model for sequence memory
- Properties of sparse activations
“A Theory of How Columns in the Neocortex Learn the Structure of the World”
Hawkins, Ahmad, and Cui, Frontiers in Neural Circuits, 2017/10/25
- Extension of sequence memory model
- Big Idea: Columns compute “allocentric” location of input
- By moving sensor, columns learn models of complete objects
Proximal synapses: Cause somatic spikes
Define classic receptive field of neuron
Distal synapses: Cause dendritic spikes
Put the cell into a depolarized, or “predictive” state
Depolarized neurons fire sooner, inhibiting nearby neurons.
A neuron can predict its activity in hundreds of unique contexts.
5K to 30K excitatory synapses
- 10% proximal
- 90% distal
Distal dendrites are pattern detectors
- 8-15 co-active, co-located synapses generate a dendritic spike
- sustained depolarization of soma
HTM Neuron Model
Prediction Starts in the Neuron
Pyramidal Neuron
Major, Larkum and Schiller 2013
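The HTM neuron model above can be sketched in a few lines. This is an illustrative caricature, not Numenta's implementation: the synapse indices, segment contents, and the threshold of 10 are invented for the example (chosen within the slide's 8-15 range).

```python
class HTMNeuron:
    """Point sketch of the HTM neuron: proximal synapses drive somatic
    spikes; distal segments can depolarize the cell into a predictive state."""

    def __init__(self, proximal_synapses, distal_segments, spike_threshold=10):
        self.proximal = set(proximal_synapses)   # classic receptive field
        self.segments = [set(s) for s in distal_segments]
        self.threshold = spike_threshold         # ~8-15 co-active synapses

    def drive(self, feedforward_input):
        """Proximal overlap with the receptive field (drives somatic spikes)."""
        return len(self.proximal & set(feedforward_input))

    def depolarized(self, active_context):
        """True if any distal segment sees enough co-active synapses to fire
        a dendritic spike, putting the cell in a 'predictive' state."""
        ctx = set(active_context)
        return any(len(seg & ctx) >= self.threshold for seg in self.segments)

# One distal segment per learned context; a real cell could hold hundreds.
neuron = HTMNeuron(proximal_synapses=range(40),
                   distal_segments=[range(100, 115), range(200, 215)])
print(neuron.depolarized(range(100, 112)))   # True: 12 co-active synapses
print(neuron.depolarized(range(100, 105)))   # False: only 5, below threshold
```

A depolarized neuron does not spike on distal input alone; it simply fires sooner when its proximal input arrives, which is what lets it inhibit its neighbors.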
Properties of Sparse Activations
Example: One layer of cells, 5,000 neurons, 2% (100) active
1) Representational capacity is virtually unlimited
(5,000 choose 100) ≈ 3 × 10^211
2) Randomly chosen representations have minimal overlap
3) A neuron can robustly recognize an activation pattern by forming 10 to 20 synapses
4) Unions of patterns do not cause errors in recognition
Hypothesis: Cellular layers use unions to represent uncertainty
Hawkins, Ahmad, 2016
Ahmad, Hawkins, 2015
[Diagram: Pattern 1 (100 active cells): a cell robustly recognizes pattern 1 by forming synapses to a small sub-sample of the active cells. Union of patterns 1-10 (1,000 active cells): the cell still robustly recognizes pattern 1]
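The numbers on this slide can be checked directly. A quick sketch (pattern contents are random; a fixed seed is used here for reproducibility):

```python
import math
import random

n, w = 5000, 100                 # 5,000 cells, 2% (100) active
capacity = math.comb(n, w)       # distinct sparse codes: ~3 x 10^211
print(len(str(capacity)))        # 212 digits

rng = random.Random(0)
a = set(rng.sample(range(n), w))
b = set(rng.sample(range(n), w))
print(len(a & b))                # random codes barely overlap (~2 cells expected)

# A cell subsamples 20 synapses from pattern a; a union of 10 patterns
# (about 1,000 active cells) still contains that subsample, so the cell
# still recognizes pattern a inside the union.
synapses = set(rng.sample(sorted(a), 20))
union = set(a)
for _ in range(9):
    union |= set(rng.sample(range(n), w))
print(synapses <= union)         # True
```

The union property is what lets a layer represent several hypotheses at once without confusing downstream cells.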
A Single Layer Network Model for Sequence Memory
- Neurons in a mini-column learn the same feedforward (FF) receptive field.
- Neurons form distal connections to nearby cells.
[Panels: no prediction vs. predicted input]
(Hawkins & Ahmad, 2016)
(Cui et al, 2016)
- High capacity (learns up to 1M transitions)
- Learns high-order sequences: “ABCD” vs “XBCY”
- Makes simultaneous predictions: “BC…” predicts “D” and “Y”
- Extremely robust (tolerant to 40% noise and faults)
- Learning is unsupervised, continuous, and local
- Satisfies many biological constraints
- Multiple open source implementations (some commercial)
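A toy rendition of this single-layer model follows. It is a simplification, not the published algorithm: the full model uses dendritic segments and synaptic permanences, which this sketch omits, so it demonstrates bursting, sparse context codes, and simultaneous predictions rather than the full high-order capacity. Symbols, sizes, and sequences are invented.

```python
# Each input symbol activates one mini-column of `cpc` cells; which cell
# fires within the column encodes the sequence context.
class SequenceMemory:
    def __init__(self, cells_per_column=4):
        self.cpc = cells_per_column
        self.links = {}         # presynaptic cell -> set of cells it predicts
        self.next_winner = {}   # per-column allocator for new context cells
        self.prev_active = set()

    def reset(self):
        self.prev_active = set()

    def step(self, column, learn=True):
        cells = [(column, i) for i in range(self.cpc)]
        predicted = {c for c in cells
                     if any(c in self.links.get(p, ()) for p in self.prev_active)}
        if predicted:
            active = predicted          # prediction confirmed: a sparse code
        else:
            active = set(cells)         # unexpected input: the column bursts
            if learn and self.prev_active:
                i = self.next_winner.get(column, 0)
                self.next_winner[column] = (i + 1) % self.cpc
                for p in self.prev_active:    # link context to a new cell
                    self.links.setdefault(p, set()).add((column, i))
        self.prev_active = active

    def predicted_columns(self):
        return {c[0] for p in self.prev_active for c in self.links.get(p, ())}

def train(sm, sequence, epochs):
    for _ in range(epochs):
        sm.reset()
        for symbol in sequence:
            sm.step(symbol)

def query(sm, sequence):
    sm.reset()
    for symbol in sequence:
        sm.step(symbol, learn=False)
    return sm.predicted_columns()

sm = SequenceMemory()
train(sm, "ABCD", 2)
first_q = query(sm, "ABC")    # {'D'}
train(sm, "XBCY", 2)
second_q = query(sm, "BC")    # {'D', 'Y'}: simultaneous predictions
print(first_q, second_q)
```

The second query shows the union property from the slide: after the ambiguous subsequence "BC", the layer predicts both "D" and "Y" at once.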
[Animation: at t=0 and t=1, predicted cells fire first and inhibit their neighbors; the next prediction follows at t=2]
1) How does the cortex learn predictive models of extrinsic sequences?
2) How does the cortex learn predictive models of sensorimotor sequences?
Current research: How do columns compute allocentric location?
- Grid cells in entorhinal cortex solve a similar problem
- Hypothesis: cortical columns contain analogs of grid cells and head direction cells
- Starting to understand the function of numerous layers and connections
How Could a Layer of Neurons Learn a Predictive Model of
Sensorimotor Sequences?
[Diagram: a sequence memory layer (sensor input) extended to sensorimotor sequences (sensor input plus motor-related context)]
Hypothesis:
By adding motor-related context, a cellular layer can predict
its input as the sensor moves.
What is the correct motor-related context?
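One way to make the hypothesis concrete: pair each sensation with the motor-derived context that preceded it. The sketch below uses location on the object as that context, anticipating the answer the talk builds toward; the object and feature names are invented.

```python
# The "world": features along a 1-D object, indexed by location.
object_features = ["edge", "flat", "flat", "curve"]

# Learn (current location, movement) -> next sensed feature.
transitions = {}
loc = 0
for movement in [1, 1, 1, -1, -1]:
    nxt = loc + movement
    transitions[(loc, movement)] = object_features[nxt]
    loc = nxt

# With motor-related context, the layer can predict its input before it arrives.
print(transitions[(0, 1)])    # 'flat': the feature one step to the right of 'edge'
print(transitions[(2, 1)])    # 'curve'
```

Without the motor context, the same sensory history would be ambiguous: "flat" can be followed by "flat" or "curve" depending on where the sensor moves next.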
[Diagram: cortical column layers as above, receiving a sensory feature input]
Two Layer Model of Sensorimotor Sequence Memory
With allocentric location input, a column can learn models of complete objects by sensing different locations on the object over time.
[Diagram: sensor feature plus allocentric location (changes with each movement) feed a sequence memory layer representing "feature @ location"; a pooling layer above it represents the object (stable over movement of the sensor)]
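The two-layer scheme can be caricatured with sets: the input layer represents the current feature at a location, and the output (pooling) layer maintains a union of candidate objects that narrows with each movement. The objects, locations, and features below are invented for the example.

```python
# Each object is a map from allocentric location to the feature found there.
objects = {
    "mug":  {(0, 0): "flat", (0, 1): "curved", (1, 1): "handle"},
    "bowl": {(0, 0): "flat", (0, 1): "curved", (1, 1): "curved"},
    "box":  {(0, 0): "flat", (0, 1): "edge",   (1, 1): "edge"},
}

def sense(candidates, location, feature):
    """Keep only the objects that have this feature at this location."""
    return {name for name in candidates if objects[name].get(location) == feature}

candidates = set(objects)                         # initial uncertainty: all objects
candidates = sense(candidates, (0, 0), "flat")    # all three objects still match
candidates = sense(candidates, (0, 1), "curved")  # narrows to mug and bowl
candidates = sense(candidates, (1, 1), "handle")  # only the mug remains
print(candidates)                                 # {'mug'}
```

The intermediate candidate sets play the role of the union representation from the sparse-coding slides: uncertainty is a superposition of object codes that movement resolves.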
[Diagram: Columns 1, 2, and 3, each receiving its own sensory feature]
Sensorimotor Inference With Multiple Columns
Each column has partial knowledge of object.
Long range connections in object layer allow columns to vote.
Inference is much faster with multiple columns.
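Voting can be caricatured the same way: each column forms its own candidate set from a single touch, and long-range connections intersect them, so the network converges in one step instead of several sequential touches. The object definitions are invented for the example.

```python
# Each object is a map from allocentric location to the feature found there.
objects = {
    "mug":  {(0, 0): "flat", (0, 1): "curved", (1, 1): "handle"},
    "bowl": {(0, 0): "flat", (0, 1): "curved", (1, 1): "curved"},
    "box":  {(0, 0): "flat", (0, 1): "edge",   (1, 1): "edge"},
}

def column_candidates(location, feature):
    """A single column's partial knowledge after one sensation."""
    return {n for n, feats in objects.items() if feats.get(location) == feature}

# Three fingers touch three locations on a mug at the same time; lateral
# connections in the object layer let the columns vote by intersection.
votes = [column_candidates((0, 0), "flat"),
         column_candidates((0, 1), "curved"),
         column_candidates((1, 1), "handle")]
consensus = set.intersection(*votes)
print(consensus)    # {'mug'}: one parallel step instead of three sequential touches
```

A single column touching only (0, 0) would still have three candidates; the vote collapses that ambiguity immediately.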
[Diagram: three columns, each with feature and location inputs and an object-layer output]
Objects Recognized By Integrating Inputs Over Time
[Diagram: the same three-column network, shown integrating inputs over time]
Recognition is Faster with Multiple Columns
Yale-CMU-Berkeley (YCB) Object Benchmark (Calli et al., 2017)
- 80 objects designed for robotic grasping tasks
- Includes high-resolution 3D CAD files
YCB Object Benchmark
We created a virtual hand using the Unity game engine
Curvature based sensor on each fingertip
4096 neurons per layer per column
98.7% recall accuracy (77/78 uniquely classified)
Convergence time depends on the object, the sequence of sensations, and the number of fingers.
Simulation using YCB Object Benchmark
[Plots: pairwise confusion between objects after 1, 2, 6, and 10 touches with one finger, showing convergence]
Convergence Time vs. Number of Columns
This is why we can infer complex objects in a single grasp or single visual fixation.
1) How does the cortex learn predictive models of extrinsic sequences?
2) How does the cortex learn predictive models of sensorimotor sequences?
Current research: How do columns compute allocentric location?
- Hypothesis: cortical columns contain analogs of grid cells and head direction cells
- Starting to understand the function of numerous layers and connections
Entorhinal Cortex (environments)
[Diagram: Rooms 1, 2, and 3, containing features A, B, C; X, Y, Z; and R, S, T at different locations]
Location
- Encoded by Grid Cells
- Unique to location in room AND room
- Location is updated by movement
Orientation (of head to room)
- Encoded by Head Direction Cells
- Anchored to room
- Orientation is updated by movement
Cortical Column (objects)
Location
- Unique to location on object AND object
- Location is updated by movement
Orientation (of sensor patch to object)
- Anchored to object
- Orientation is updated by movement
Hypothesis:
Cortical columns contain analogs of grid cells and head direction cells
Stensola, Solstad, Frøland, Moser, and Moser, 2012
Location and Orientation are both necessary
to learn the structure of rooms and predict
sensory input.
Location and Orientation are both necessary
to learn the structure of objects and predict
sensory input.
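The "updated by movement" property in both columns of this comparison is path integration. A minimal sketch, assuming a grid-cell-like code: location is a tuple of phases in modules with different periods, and every movement updates all phases at once. The periods and anchor values are illustrative, not measured.

```python
PERIODS = (4, 5, 7)      # module periods; coprime, so codes repeat only every 140 steps

def move(phases, delta):
    """Path integration: every module's phase advances with the movement."""
    return tuple((p + delta) % m for p, m in zip(phases, PERIODS))

anchor = (1, 3, 5)       # phases anchored when the room/object is recognized
loc = anchor
for step in (2, 2, -1):  # a sequence of self-generated movements
    loc = move(loc, step)
print(loc)               # (0, 1, 1): location tracked purely from movement
```

Because each room or object anchors the modules differently, the same combined code is unique to a location AND the room or object it belongs to, which is the property listed above.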
Mapping Orientation and Location to a Cortical Column (most complex slide)
[Diagram: layers L3, L4, L5a, L5b, L6a, L6b, with sensation and orientation inputs]
1) A column is a two-stage sensorimotor model for learning and inferring structure.
2) A column usually cannot infer a Feature or Object in one sensation.
- Integrate over time (sense, move, sense, move, sense..)
- Vote with neighboring columns
3) This system is most obvious for touch, but it applies to vision and other sensory modalities.
Because this architecture exists throughout the neocortex, it suggests we learn, infer,
and manipulate abstract concepts the same way we manipulate objects in the world.
Meaning                   Operation
Object                    Pooling
Feature @ Location        Seq mem
Feature                   Pooling
Sensation @ Orientation   Seq mem
Location                  Motor updated (grid cell-like)
Orientation               Motor updated (HD cell-like)
Rethinking Hierarchy
Every column learns complete models of objects. They operate in parallel.
Inputs project to multiple levels at once. Columns operate at different
scales of input.
[Diagram: Classic: a sensor array ("Sense") feeds Region 1 (simple features), then Region 2 (complex features), then Region 3 (objects). Proposed: Regions 1-3 each model objects, with inputs projecting to multiple levels at once]
Rethinking Hierarchy
Non-hierarchical connections allow columns to vote on shared elements such as “object” and “feature”.
[Diagram: as in the previous slide, but with separate vision and touch sensor arrays whose regions all model objects and vote]
Summary
Goal: Understand the function and operation of the laminar circuits in the neocortex.
Method: Study how cortical columns make predictions of their inputs.
Proposals
1) Pyramidal neurons are the substrate of prediction.
Each neuron predicts its activity in hundreds of contexts.
2) A single layer of neurons forms a predictive memory of high-order sequences.
(sparse activations, mini-columns, fast inhibition, and lateral connections)
3) A two-layer network forms a predictive memory of sensorimotor sequences.
(add motor-derived context and a pooling layer)
4) Columns need motor-derived representations of location and orientation, of the
sensor relative to the object. These are analogous to grid and head direction cells.
5) A framework for the cortical column.
- Columns learn complete models of objects as “features at locations”, using two
sensorimotor inference stages.
6) The neocortex contains thousands of parallel models that resolve uncertainty by associative linking and/or movement of the sensors.
Open Issues
Behaviors: how are they learned, encoded, and applied to objects?
Detailed model of hierarchy including thalamus
How can the model be applied to “Where” pathways, and how do “What” and “Where” pathways work together?
Collaborations
There are many testable predictions in this model, a “green field”. We welcome
collaborations and discussions.
We are always interested in hosting visiting scholars and interns.
Numenta Team
Subutai Ahmad
VP Research
Marcus Lewis
Thank You

FAIRSpectra - Towards a common data file format for SIMS imagesAlex Henderson
 
The importance of continents, oceans and plate tectonics for the evolution of...
The importance of continents, oceans and plate tectonics for the evolution of...The importance of continents, oceans and plate tectonics for the evolution of...
The importance of continents, oceans and plate tectonics for the evolution of...Sérgio Sacani
 
Aerodynamics. flippatterncn5tm5ttnj6nmnynyppt
Aerodynamics. flippatterncn5tm5ttnj6nmnynypptAerodynamics. flippatterncn5tm5ttnj6nmnynyppt
Aerodynamics. flippatterncn5tm5ttnj6nmnynypptsreddyrahul
 
Astronomy Update- Curiosity’s exploration of Mars _ Local Briefs _ leadertele...
Astronomy Update- Curiosity’s exploration of Mars _ Local Briefs _ leadertele...Astronomy Update- Curiosity’s exploration of Mars _ Local Briefs _ leadertele...
Astronomy Update- Curiosity’s exploration of Mars _ Local Briefs _ leadertele...NathanBaughman3
 
NuGOweek 2024 full programme - hosted by Ghent University
NuGOweek 2024 full programme - hosted by Ghent UniversityNuGOweek 2024 full programme - hosted by Ghent University
NuGOweek 2024 full programme - hosted by Ghent Universitypablovgd
 
The ASGCT Annual Meeting was packed with exciting progress in the field advan...
The ASGCT Annual Meeting was packed with exciting progress in the field advan...The ASGCT Annual Meeting was packed with exciting progress in the field advan...
The ASGCT Annual Meeting was packed with exciting progress in the field advan...Health Advances
 
NuGOweek 2024 Ghent - programme - final version
NuGOweek 2024 Ghent - programme - final versionNuGOweek 2024 Ghent - programme - final version
NuGOweek 2024 Ghent - programme - final versionpablovgd
 
GLOBAL AND LOCAL SCENARIO OF FOOD AND NUTRITION.pptx
GLOBAL AND LOCAL SCENARIO OF FOOD AND NUTRITION.pptxGLOBAL AND LOCAL SCENARIO OF FOOD AND NUTRITION.pptx
GLOBAL AND LOCAL SCENARIO OF FOOD AND NUTRITION.pptxSultanMuhammadGhauri
 
Shuaib Y-basedComprehensive mahmudj.pptx
Shuaib Y-basedComprehensive mahmudj.pptxShuaib Y-basedComprehensive mahmudj.pptx
Shuaib Y-basedComprehensive mahmudj.pptxMdAbuRayhan16
 
platelets- lifespan -Clot retraction-disorders.pptx
platelets- lifespan -Clot retraction-disorders.pptxplatelets- lifespan -Clot retraction-disorders.pptx
platelets- lifespan -Clot retraction-disorders.pptxmuralinath2
 
SAMPLING.pptx for analystical chemistry sample techniques
SAMPLING.pptx for analystical chemistry sample techniquesSAMPLING.pptx for analystical chemistry sample techniques
SAMPLING.pptx for analystical chemistry sample techniquesrodneykiptoo8
 
Richard's entangled aventures in wonderland
Richard's entangled aventures in wonderlandRichard's entangled aventures in wonderland
Richard's entangled aventures in wonderlandRichard Gill
 
electrochemical gas sensors and their uses.pptx
electrochemical gas sensors and their uses.pptxelectrochemical gas sensors and their uses.pptx
electrochemical gas sensors and their uses.pptxHusna Zaheer
 
National Biodiversity protection initiatives and Convention on Biological Di...
National Biodiversity protection initiatives and  Convention on Biological Di...National Biodiversity protection initiatives and  Convention on Biological Di...
National Biodiversity protection initiatives and Convention on Biological Di...PABOLU TEJASREE
 
insect taxonomy importance systematics and classification
insect taxonomy importance systematics and classificationinsect taxonomy importance systematics and classification
insect taxonomy importance systematics and classificationanitaento25
 
biotech-regenration of plants, pharmaceutical applications.pptx
biotech-regenration of plants, pharmaceutical applications.pptxbiotech-regenration of plants, pharmaceutical applications.pptx
biotech-regenration of plants, pharmaceutical applications.pptxANONYMOUS
 
Microbial Type Culture Collection (MTCC)
Microbial Type Culture Collection (MTCC)Microbial Type Culture Collection (MTCC)
Microbial Type Culture Collection (MTCC)abhishekdhamu51
 
Lab report on liquid viscosity of glycerin
Lab report on liquid viscosity of glycerinLab report on liquid viscosity of glycerin
Lab report on liquid viscosity of glycerinossaicprecious19
 
Transport in plants G1.pptx Cambridge IGCSE
Transport in plants G1.pptx Cambridge IGCSETransport in plants G1.pptx Cambridge IGCSE
Transport in plants G1.pptx Cambridge IGCSEjordanparish425
 
Structures and textures of metamorphic rocks
Structures and textures of metamorphic rocksStructures and textures of metamorphic rocks
Structures and textures of metamorphic rockskumarmathi863
 

Recently uploaded (20)

FAIRSpectra - Towards a common data file format for SIMS images
FAIRSpectra - Towards a common data file format for SIMS imagesFAIRSpectra - Towards a common data file format for SIMS images
FAIRSpectra - Towards a common data file format for SIMS images
 
The importance of continents, oceans and plate tectonics for the evolution of...
The importance of continents, oceans and plate tectonics for the evolution of...The importance of continents, oceans and plate tectonics for the evolution of...
The importance of continents, oceans and plate tectonics for the evolution of...
 
Aerodynamics. flippatterncn5tm5ttnj6nmnynyppt
Aerodynamics. flippatterncn5tm5ttnj6nmnynypptAerodynamics. flippatterncn5tm5ttnj6nmnynyppt
Aerodynamics. flippatterncn5tm5ttnj6nmnynyppt
 
Astronomy Update- Curiosity’s exploration of Mars _ Local Briefs _ leadertele...
Astronomy Update- Curiosity’s exploration of Mars _ Local Briefs _ leadertele...Astronomy Update- Curiosity’s exploration of Mars _ Local Briefs _ leadertele...
Astronomy Update- Curiosity’s exploration of Mars _ Local Briefs _ leadertele...
 
NuGOweek 2024 full programme - hosted by Ghent University
NuGOweek 2024 full programme - hosted by Ghent UniversityNuGOweek 2024 full programme - hosted by Ghent University
NuGOweek 2024 full programme - hosted by Ghent University
 
The ASGCT Annual Meeting was packed with exciting progress in the field advan...
The ASGCT Annual Meeting was packed with exciting progress in the field advan...The ASGCT Annual Meeting was packed with exciting progress in the field advan...
The ASGCT Annual Meeting was packed with exciting progress in the field advan...
 
NuGOweek 2024 Ghent - programme - final version
NuGOweek 2024 Ghent - programme - final versionNuGOweek 2024 Ghent - programme - final version
NuGOweek 2024 Ghent - programme - final version
 
GLOBAL AND LOCAL SCENARIO OF FOOD AND NUTRITION.pptx
GLOBAL AND LOCAL SCENARIO OF FOOD AND NUTRITION.pptxGLOBAL AND LOCAL SCENARIO OF FOOD AND NUTRITION.pptx
GLOBAL AND LOCAL SCENARIO OF FOOD AND NUTRITION.pptx
 
Shuaib Y-basedComprehensive mahmudj.pptx
Shuaib Y-basedComprehensive mahmudj.pptxShuaib Y-basedComprehensive mahmudj.pptx
Shuaib Y-basedComprehensive mahmudj.pptx
 
platelets- lifespan -Clot retraction-disorders.pptx
platelets- lifespan -Clot retraction-disorders.pptxplatelets- lifespan -Clot retraction-disorders.pptx
platelets- lifespan -Clot retraction-disorders.pptx
 
SAMPLING.pptx for analystical chemistry sample techniques
SAMPLING.pptx for analystical chemistry sample techniquesSAMPLING.pptx for analystical chemistry sample techniques
SAMPLING.pptx for analystical chemistry sample techniques
 
Richard's entangled aventures in wonderland
Richard's entangled aventures in wonderlandRichard's entangled aventures in wonderland
Richard's entangled aventures in wonderland
 
electrochemical gas sensors and their uses.pptx
electrochemical gas sensors and their uses.pptxelectrochemical gas sensors and their uses.pptx
electrochemical gas sensors and their uses.pptx
 
National Biodiversity protection initiatives and Convention on Biological Di...
National Biodiversity protection initiatives and  Convention on Biological Di...National Biodiversity protection initiatives and  Convention on Biological Di...
National Biodiversity protection initiatives and Convention on Biological Di...
 
insect taxonomy importance systematics and classification
insect taxonomy importance systematics and classificationinsect taxonomy importance systematics and classification
insect taxonomy importance systematics and classification
 
biotech-regenration of plants, pharmaceutical applications.pptx
biotech-regenration of plants, pharmaceutical applications.pptxbiotech-regenration of plants, pharmaceutical applications.pptx
biotech-regenration of plants, pharmaceutical applications.pptx
 
Microbial Type Culture Collection (MTCC)
Microbial Type Culture Collection (MTCC)Microbial Type Culture Collection (MTCC)
Microbial Type Culture Collection (MTCC)
 
Lab report on liquid viscosity of glycerin
Lab report on liquid viscosity of glycerinLab report on liquid viscosity of glycerin
Lab report on liquid viscosity of glycerin
 
Transport in plants G1.pptx Cambridge IGCSE
Transport in plants G1.pptx Cambridge IGCSETransport in plants G1.pptx Cambridge IGCSE
Transport in plants G1.pptx Cambridge IGCSE
 
Structures and textures of metamorphic rocks
Structures and textures of metamorphic rocksStructures and textures of metamorphic rocks
Structures and textures of metamorphic rocks
 

Have We Missed Half of What the Neocortex Does? by Jeff Hawkins (12/15/2017)

  • 1. MIT December 15, 2017 Jeff Hawkins jhawkins@numenta.com Have We Missed Half of What the Neocortex Does? Allocentric Location as the Basis of Perception
  • 2. 1) Reverse engineer the neocortex - an ambitious but realizable goal - seek biologically accurate theories - test empirically and via simulation 2) Enable technology based on cortical theory - active open source community - basis for Machine Intelligence
  • 3.
  • 4. A Couple of Thoughts: The Cortical Column
    1) Cortical columns are complex
       - Twelve or more excitatory cellular layers
       - Two parallel feedforward (FF) pathways
       - Parallel feedback (FB) pathways (not shown)
       - Numerous intra- and inter-column connections (not shown)
       - Inhibitory neurons/circuits are equally complex
    2) The function of a cortical column must also be complex.
    3) Whatever a column does applies to everything the cortex does.
    [Diagram: layers L2, L3a/b, L4, L5 tt/cc/cc-ns, L6a/b/ip/mp/bp; "simple" input; outputs direct and via thalamus]
    (L5: Callaway et al., 2015; L6: Zhang and Deschenes, 1997; L5 CTC: Guillery, 1995; Constantinople and Bruno, 2013)
  • 5. Observation: The neocortex is constantly predicting its inputs.
    Research question: How do networks of neurons, as seen in the neocortex, learn predictive models of the world?
  • 6. 1) How does the cortex learn predictive models of extrinsic sequences?
       "Why Neurons Have Thousands of Synapses, a Theory of Sequence Memory in the Neocortex"
       Hawkins and Ahmad, Frontiers in Neural Circuits, 2016/03/30
       - Big Idea: pyramidal neuron model for prediction
       - A single-layer network model for sequence memory
       - Properties of sparse activations
    2) How does the cortex learn predictive models of sensorimotor sequences?
       "A Theory of How Columns in the Neocortex Learn the Structure of the World"
       Hawkins, Ahmad, and Cui, Frontiers in Neural Circuits, 2017/10/25
       - Extension of the sequence memory model
       - Big Idea: columns compute the "allocentric" location of input
       - By moving its sensor, a column learns models of complete objects
    Current research: How do columns compute allocentric location?
       - Grid cells in entorhinal cortex solve a similar problem
       - Big Idea: cortical columns contain analogs of grid cells and head direction cells
       - Starting to understand the function of numerous layers and connections
  • 7. Prediction Starts in the Neuron: The HTM Neuron Model
    A pyramidal neuron has 5K to 30K excitatory synapses: ~10% proximal, ~90% distal.
    Proximal synapses: cause somatic spikes; define the classic receptive field of the neuron.
    Distal synapses: cause dendritic spikes; put the cell into a depolarized, or "predictive," state.
    Distal dendrites are pattern detectors: 8-15 co-active, co-located synapses generate a dendritic spike and sustained depolarization of the soma.
    Depolarized neurons fire sooner, inhibiting nearby neurons. A neuron can thus predict its activity in hundreds of unique contexts.
    (Major, Larkum, and Schiller, 2013)
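The neuron model above can be sketched in a few lines of Python. This is an illustration, not Numenta's implementation: the class name, the connectivity, and the specific cell indices are invented here, and the spike threshold is the slide's ballpark figure.

```python
# A minimal sketch of an HTM-style neuron: proximal synapses define the
# receptive field; each distal segment is a pattern detector that can
# depolarize ("predict") the cell when enough of its synapses are active.
DENDRITIC_SPIKE_THRESHOLD = 8          # 8-15 co-active synapses on one segment

class HTMNeuron:
    def __init__(self, proximal, distal_segments):
        self.proximal = set(proximal)                  # classic receptive field
        self.distal_segments = [set(s) for s in distal_segments]

    def depolarized(self, active_cells):
        """True if any distal segment has enough co-active synapses to
        fire a dendritic spike, putting the cell in the predictive state."""
        return any(len(seg & active_cells) >= DENDRITIC_SPIKE_THRESHOLD
                   for seg in self.distal_segments)

# One segment recognizes one context; hundreds of segments would let the
# cell predict its activity in hundreds of contexts.
context_a = set(range(0, 10))                          # 10 presynaptic cells
neuron = HTMNeuron(proximal=range(100, 110), distal_segments=[context_a])
print(neuron.depolarized(set(range(0, 9))))            # True: 9 of 10 active
print(neuron.depolarized(set(range(50, 60))))          # False: wrong context
```

Note the noise tolerance: only 8 of the segment's 10 synapses need to be active, so the prediction survives partial input.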
  • 8. Properties of Sparse Activations
    Example: one layer of cells, 5,000 neurons, 2% (100) active.
    1) Representational capacity is virtually unlimited: (5,000 choose 100) ≈ 3x10^211
    2) Randomly chosen representations have minimal overlap.
    3) A neuron can robustly recognize an activation pattern by forming 10 to 20 synapses to a small sub-sample of its active cells.
    4) Unions of patterns do not cause errors in recognition: a cell that recognizes pattern 1 (100 active cells) still recognizes it within a union of patterns 1-10 (1,000 active cells).
    Hypothesis: cellular layers use unions to represent uncertainty.
    (Hawkins & Ahmad, 2016; Ahmad & Hawkins, 2015)
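These four properties are easy to check numerically. The sketch below uses the slide's own numbers (5,000 cells, 2% active, 10-20 synapses); the threshold value and random seed are arbitrary choices for illustration.

```python
import math
import random

n, w = 5000, 100                       # layer size, active cells (2%)

# 1) Representational capacity: n choose w
capacity = math.comb(n, w)
print(f"capacity ~ 10^{len(str(capacity)) - 1}")       # ~10^211

# 2) Two random sparse patterns barely overlap (expected overlap = w*w/n = 2)
rng = random.Random(0)
a = set(rng.sample(range(n), w))
b = set(rng.sample(range(n), w))
print("overlap of two random patterns:", len(a & b))

# 3) A cell samples s of pattern a's cells and fires at threshold theta
s, theta = 20, 15
synapses = set(rng.sample(sorted(a), s))
print("match on pattern 1:", len(synapses & a))        # 20 >= theta

# 4) A union of patterns 1-10 still contains pattern 1, so it is still
#    recognized, while an unrelated union stays far below threshold
union = a.union(*(set(rng.sample(range(n), w)) for _ in range(9)))
noise = set().union(*(set(rng.sample(range(n), w)) for _ in range(10)))
print("match inside union:", len(synapses & union))    # still 20
print("match on unrelated union:", len(synapses & noise))  # expected ~4
```

The last two lines are the point of the union hypothesis: recognition inside a union is guaranteed, while false positives stay improbable because the subsampled synapses rarely coincide with unrelated activity.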
  • 9. A Single-Layer Network Model for Sequence Memory
    - Neurons in a mini-column learn the same FF receptive field.
    - Neurons form distal connections to nearby cells.
    - With no prediction, an entire mini-column becomes active; with a predicted input, the predicted cells fire first and inhibit their neighbors, setting up the next prediction.
    Properties:
    - High capacity (learns up to 1M transitions)
    - Learns high-order sequences: "ABCD" vs. "XBCY"
    - Makes simultaneous predictions: "BC..." predicts both "D" and "Y"
    - Extremely robust (tolerant to 40% noise and faults)
    - Learning is unsupervised, continuous, and local
    - Satisfies many biological constraints
    - Multiple open source implementations (some commercial)
    (Hawkins & Ahmad, 2016; Cui et al., 2016)
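The high-order property ("ABCD" vs. "XBCY") can be demonstrated with a toy version of this layer. This is not the HTM algorithm itself: the class, its parameters, and the least-used-cell winner rule are simplifications invented for this sketch, but the mini-column burst/predict mechanics follow the slide.

```python
class TinyTM:
    """Toy high-order sequence memory in the spirit of Hawkins & Ahmad
    (2016): one mini-column of cells per input symbol; lateral dendritic
    segments supply the sequence context."""

    def __init__(self, cells_per_column=4):
        self.n = cells_per_column
        self.segments = {}            # cell -> list of frozensets (contexts)
        self.prev_active = set()

    def _column(self, symbol):
        return [(symbol, i) for i in range(self.n)]

    def _predictive_cells(self):
        return {cell for cell, segs in self.segments.items()
                if any(seg <= self.prev_active for seg in segs)}

    def reset(self):
        self.prev_active = set()

    def step(self, symbol, learn=True):
        """Feed one input; return the set of columns predicted next."""
        col = set(self._column(symbol))
        matched = col & self._predictive_cells()
        if matched:
            active = matched          # predicted cells fire first, inhibit rest
        else:
            active = col              # unanticipated input: the column bursts
            if learn and self.prev_active:
                # grow a new context segment on the least-used cell
                winner = min(sorted(col),
                             key=lambda c: len(self.segments.get(c, [])))
                self.segments.setdefault(winner, []).append(
                    frozenset(self.prev_active))
        self.prev_active = active
        return {cell[0] for cell in self._predictive_cells()}

tm = TinyTM()
for _ in range(6):                    # interleave the two sequences
    tm.reset()
    for s in "ABCD": tm.step(s)
    tm.reset()
    for s in "XBCY": tm.step(s)

tm.reset()
tm.step("A", learn=False); tm.step("B", learn=False)
print(tm.step("C", learn=False))      # {'D'}: "C" in the ABC context
tm.reset()
tm.step("X", learn=False); tm.step("B", learn=False)
print(tm.step("C", learn=False))      # {'Y'}: same "BC", different context
```

Because different cells in the "B" and "C" columns fire in the two contexts, the same input "BC" yields different predictions, which is exactly the high-order property the slide claims.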
  • 10. 1) How does the cortex learn predictive models of extrinsic sequences?
       "Why Neurons Have Thousands of Synapses, a Theory of Sequence Memory in the Neocortex"
       Hawkins and Ahmad, Frontiers in Neural Circuits, 2016/03/30
       - Pyramidal neuron model
       - A single-layer network model for sequence memory
       - Properties of sparse activations
    2) How does the cortex learn predictive models of sensorimotor sequences?
       "A Theory of How Columns in the Neocortex Learn the Structure of the World"
       Hawkins, Ahmad, and Cui, Frontiers in Neural Circuits, 2017/10/25
       - Extension of the sequence memory model
       - Big Idea: columns compute the "allocentric" location of input
       - By moving its sensor, a column learns models of complete objects
    Current research: How do columns compute allocentric location?
       - Grid cells in entorhinal cortex solve a similar problem
       - Hypothesis: cortical columns contain analogs of grid cells and head direction cells
       - Starting to understand the function of numerous layers and connections
  • 11. How Could a Layer of Neurons Learn a Predictive Model of Sensorimotor Sequences?
    Hypothesis: By adding motor-related context to sequence memory, a cellular layer can predict its sensory input as the sensor moves.
    What is the correct motor-related context?
  • 12.
  • 13. Two-Layer Model of Sensorimotor Sequence Memory
    Lower layer (sequence memory): sensory feature @ allocentric location; changes with each movement of the sensor.
    Upper layer (pooling): object; stable over movement of the sensor.
    With an allocentric location input, a column can learn models of complete objects by sensing different locations on the object over time.
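The role of the pooling layer can be caricatured with sets: the stable "object" representation is whatever remains consistent with every (location, feature) pair sensed so far. The object names, locations, and feature labels below are invented for illustration.

```python
# Hypothetical objects as feature@location maps (all values invented).
objects = {
    "cup": {(0, 0): "curve", (0, 1): "curve", (1, 0): "flat",  (1, 1): "handle"},
    "can": {(0, 0): "curve", (0, 1): "curve", (1, 0): "curve", (1, 1): "curve"},
    "box": {(0, 0): "flat",  (0, 1): "edge",  (1, 0): "flat",  (1, 1): "edge"},
}

def sense_sequence(sensations):
    """Output-layer sketch: keep only the objects consistent with every
    (allocentric location, feature) pair sensed so far."""
    candidates = set(objects)
    for loc, feat in sensations:
        candidates = {o for o in candidates if objects[o].get(loc) == feat}
    return candidates

print(sense_sequence([((0, 0), "curve")]))
# {'cup', 'can'}: one sensation is ambiguous
print(sense_sequence([((0, 0), "curve"), ((1, 1), "handle")]))
# {'cup'}: sense, move, sense resolves it
```

This is the "integrate over movements" behavior: each new sensation at a new location prunes the candidate objects until one remains.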
  • 14. Sensorimotor Inference With Multiple Columns
    Each column (sensor feature -> feature @ location -> object) has partial knowledge of the object.
    Long-range connections in the object layer allow columns to vote.
    Inference is much faster with multiple columns.
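Voting can be sketched as intersecting the columns' candidate sets in a single step, rather than sensing sequentially. As before, the object maps, locations, and feature labels are invented for illustration.

```python
# Hypothetical objects as feature@location maps (all values invented).
object_maps = {
    "cup": {(0, 0): "curve", (0, 1): "curve", (1, 0): "flat",  (1, 1): "handle"},
    "can": {(0, 0): "curve", (0, 1): "curve", (1, 0): "curve", (1, 1): "curve"},
    "box": {(0, 0): "flat",  (0, 1): "edge",  (1, 0): "flat",  (1, 1): "edge"},
}

def column_candidates(loc, feat):
    """One column's partial knowledge: objects matching its single input."""
    return {o for o in object_maps if object_maps[o].get(loc) == feat}

# One simultaneous "grasp": three columns (fingertips) sense three
# locations at the same time.
touches = [((0, 0), "curve"), ((1, 0), "flat"), ((1, 1), "handle")]
votes = [column_candidates(loc, feat) for loc, feat in touches]
print(votes)                          # each column alone may be ambiguous
consensus = set.intersection(*votes)  # long-range lateral "voting"
print(consensus)                      # {'cup'} in a single sensation step
```

A single column would need several sense-move-sense steps to reach the same answer; three columns voting get there in one, which is the slide's claim that inference is much faster with multiple columns.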
  • 16. Recognition is Faster with Multiple Columns
    [Diagram: three columns, each with a feature/location input layer, sharing a voting output layer]
  • 17. Simulation Using the Yale-CMU-Berkeley (YCB) Object Benchmark (Calli et al., 2017)
    - 80 objects designed for robotic grasping tasks; includes high-resolution 3D CAD files
    - We created a virtual hand using the Unity game engine, with a curvature-based sensor on each fingertip
    - 4,096 neurons per layer per column
    - 98.7% recall accuracy (77/78 uniquely classified)
    - Convergence time depends on the object, the sequence of sensations, and the number of fingers.
  • 18. Convergence (1 finger): pairwise confusion between objects after 1 touch
  • 19. Convergence (1 finger): pairwise confusion between objects after 2 touches
  • 20. Convergence (1 finger): pairwise confusion between objects after 6 touches
  • 21. Convergence (1 finger): pairwise confusion between objects after 10 touches
  • 22. Convergence Time vs. Number of Columns
    This is why we can infer complex objects in a single grasp or a single visual fixation.
  • 23. 1) How does the cortex learn predictive models of extrinsic sequences?
       "Why Neurons Have Thousands of Synapses, a Theory of Sequence Memory in the Neocortex"
       Hawkins and Ahmad, Frontiers in Neural Circuits, 2016/03/30
       - Pyramidal neuron model
       - A single-layer network model for sequence memory
       - Properties of sparse activations
    2) How does the cortex learn predictive models of sensorimotor sequences?
       "A Theory of How Columns in the Neocortex Learn the Structure of the World"
       Hawkins, Ahmad, and Cui, Frontiers in Neural Circuits, 2017/10/25
       - Extension of the sequence memory model
       - Big Idea: columns compute the "allocentric" location of input
       - By moving its sensor, a column learns models of complete objects
    Current research: How do columns compute allocentric location?
       - Hypothesis: cortical columns contain analogs of grid cells and head direction cells
       - Starting to understand the function of numerous layers and connections
  • 24. Hypothesis: Cortical Columns Contain Analogs of Grid Cells and Head Direction Cells
    Entorhinal cortex (environments):
    - Location is encoded by grid cells; it is unique to the location in a room AND to the room, and is updated by movement.
    - Orientation (of head to room) is encoded by head direction cells; it is anchored to the room and updated by movement.
    - Location and orientation are both necessary to learn the structure of rooms and predict sensory input.
    Cortical column (objects):
    - Location is unique to the location on an object AND to the object, and is updated by movement.
    - Orientation (of sensor patch to object) is anchored to the object and updated by movement.
    - Location and orientation are both necessary to learn the structure of objects and predict sensory input.
    (Stensola, Solstad, Frøland, Moser, and Moser, 2012)
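The "location is updated by movement" idea can be sketched as path integration over grid-cell-like modules: each module tracks a phase that wraps at its own period, and the combined phases form a location code driven by movement alone, independent of what is being sensed. The module periods below are arbitrary illustration values, not measured scales.

```python
# Grid-cell-like path integration: each module is a 2D phase that wraps at
# its own period; the tuple of phases is the location code.
SCALES = [(3, 3), (4, 4), (5, 5)]       # (x-period, y-period) per module

def move(location, delta):
    """Update every module's phase by one movement vector (path integration)."""
    dx, dy = delta
    return tuple(((x + dx) % sx, (y + dy) % sy)
                 for (x, y), (sx, sy) in zip(location, SCALES))

origin = tuple((0, 0) for _ in SCALES)
loc = origin
for step in [(1, 0), (0, 2), (2, 1)]:   # wander over the object/room
    loc = move(loc, step)
print(loc)

# Moving back by the net displacement restores the original code, so the
# representation depends only on where you are, not on the path taken.
print(move(loc, (-3, -3)) == origin)    # True
```

Because the periods differ, the combined code repeats only at their least common multiple (60 steps here), so a handful of small modules can uniquely encode a much larger space, which is the appeal of the grid-cell scheme for columns.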
  • 25. Mapping Orientation and Location to a Cortical Column (most complex slide)
    Two stages, each a sequence-memory layer plus a pooling layer:
    - Stage 1: sensation @ orientation (seq mem) -> feature (pooling); orientation is motor-updated (head-direction-cell-like).
    - Stage 2: feature @ location (seq mem) -> object (pooling); location is motor-updated (grid-cell-like).
    1) A column is a two-stage sensorimotor model for learning and inferring structure.
    2) A column usually cannot infer a feature or object in one sensation. It must:
       - integrate over time (sense, move, sense, move, sense, ...), or
       - vote with neighboring columns.
    3) This system is most obvious for touch, but it applies to vision and other sensory modalities.
    Because this architecture exists throughout the neocortex, it suggests we learn, infer, and manipulate abstract concepts the same way we manipulate objects in the world.
  • 26. Rethinking Hierarchy
    Classic view: sensor array -> simple features -> complex features -> objects (regions 1, 2, 3 in series).
    Proposed: every column learns complete models of objects, and they operate in parallel.
    - Inputs project to multiple levels at once.
    - Columns operate at different scales of input.
  • 27. Rethinking Hierarchy (continued)
    Every column learns complete models of objects, and they operate in parallel.
    - Inputs project to multiple levels at once.
    - Columns operate at different scales of input.
    - Non-hierarchical connections allow columns, even across modalities such as vision and touch, to vote on shared elements such as "object" and "feature."
  • 28. Summary
    Goal: understand the function and operation of the laminar circuits in the neocortex.
    Method: study how cortical columns make predictions of their inputs.
    Proposals:
    1) Pyramidal neurons are the substrate of prediction. Each neuron can predict its activity in hundreds of contexts.
    2) A single layer of neurons forms a predictive memory of high-order sequences (sparse activations, mini-columns, fast inhibition, and lateral connections).
    3) A two-layer network forms a predictive memory of sensorimotor sequences (add motor-derived context and a pooling layer).
    4) Columns need motor-derived representations of the location and orientation of the sensor relative to the object. These are analogous to grid and head direction cells.
    5) A framework for the cortical column: columns learn complete models of objects as "features at locations," using two sensorimotor inference stages.
    6) The neocortex contains thousands of parallel models that resolve uncertainty by associative linking and/or movement of the sensors.
  • 29. Open Issues
    - Behaviors: how are they learned, encoded, and applied to objects?
    - A detailed model of hierarchy, including the thalamus.
    - How can the model be applied to "where" pathways, and how do "what" and "where" pathways work together?
    Collaborations: This model is a "green field" of testable predictions. We welcome collaborations and discussions, and we are always interested in hosting visiting scholars and interns.
  • 30. Numenta Team: Subutai Ahmad (VP Research), Marcus Lewis. Thank You!

Editor's Notes

  1. I am the outlier on the agenda
  2. Excellent progress, it is accelerating
  3. 2 min 30 seconds
  4. Larger axes