What the Brain Says About Machine Intelligence
Jeff Hawkins, jhawkins@Numenta.com
November 21, 2014
The Birth of Programmable Computing
1940’s: many approaches
- Dedicated vs. universal
- Analog vs. digital
- Decimal vs. binary
- Wired vs. memory-based programming
- Serial vs. random access memory
1950’s: one dominant paradigm
- Universal
- Digital
- Binary
- Memory-based programming
- Two-tier memory
Why Did One Paradigm Win?
- Network effects
Why Did This Paradigm Win?
- Most flexible
- Most scalable
The Birth of Machine Intelligence
2010’s: many approaches
- Specific vs. universal algorithms
- Mathematical vs. memory-based
- Batch vs. on-line learning
- Labeled vs. behavior-based learning
2020’s: one dominant paradigm
- Universal algorithms
- Memory-based
- On-line learning
- Behavior-based learning
Why Will One Paradigm Win?
- Network effects
Why Will This Paradigm Win?
- Most flexible
- Most scalable
How Do We Know This is Going to Happen?
- Brain is proof case
- We have made great progress
Numenta’s Mission
1) Discover operating principles of the neocortex.
2) Create machine intelligence technology based on neocortical principles.
Talk Topics
- Cortical facts
- Cortical theory
- Research roadmap
- Applications
- Thoughts on Machine Intelligence
What the Cortex Does
Learns a model of the world from changing sensory data (streams of patterns).
The model generates
- predictions
- anomalies
- actions
Most sensory changes are due to your own movement.
The neocortex learns a sensory-motor model of the world.
(Diagram: light → retina, sound → cochlea, touch → somatic senses; each delivers changing patterns to the cortex.)
Cortical Facts
Hierarchy
Cellular layers
Mini-columns
Neurons: 3-10K synapses
- 10% proximal
- 90% distal
Active dendrites
Learning = new synapses
Remarkably uniform
- anatomically
- functionally
2.5 mm thick sheet of cells (layers 2/3, 4, 5, 6)
Cortical Theory
Hierarchy
Cellular layers
Mini-columns
Neurons: 3-10K synapses
- 10% proximal
- 90% distal
Active dendrites
Learning = new synapses
Remarkably uniform
- anatomically
- functionally
Sheet of cells
HTM: Hierarchical Temporal Memory
1) Hierarchy of identical regions
2) Each region learns sequences
3) Stability increases going up the hierarchy if input is predictable
4) Sequences unfold going down
Questions
- What does a region do?
- What do the cellular layers do?
- How do neurons implement this?
- How does this work in hierarchy?
Cellular Layers (2/3, 4, 5, 6)
- Layer 2/3: Sequence memory: Inference (high-order)
- Layer 4: Sequence memory: Inference (sensory-motor)
- Layer 5: Sequence memory: Motor
- Layer 6: Sequence memory: Attention
(Diagram connections: sensor data, copy of motor commands, higher region, lower region, sub-cortical motor centers; feedforward and feedback.)
Each layer is a variation of a common sequence memory algorithm.
These are universal functions. They apply to:
- all cortical regions
- all sensory-motor modalities.
How Does Sequence Memory Work?
Each cellular layer (2/3, 4, 5, 6) runs its own sequence memory, but what is the algorithm?
HTM Temporal Memory
Learns sequences
Recognizes and recalls sequences
Predicts next inputs
- High capacity
- Distributed
- Local learning rules
- Fault tolerant
- No sensitive parameters
- Generalizes
HTM Temporal Memory: Not Just Another ANN
1) Cortical Anatomy
   Mini-columns
   Inhibitory cells
   Cell connectivity patterns
2) Sparse Distributed Representations
3) Realistic Neurons
   Active dendrites
   Thousands of synapses
   Learn via synapse formation
numenta.com/learn/
Research Roadmap (cortical layers 2/3, 4, 5, 6)
Functions: Sensory-motor Inference, High-order Inference, Motor Sequences, Attention/Feedback
Theory status: 98% (extensively tested, commercial), 80% (in development), 50%, 30%
Streaming Data
- Capabilities: Prediction, Anomaly detection, Classification
- Applications: Predictive maintenance, Security, Natural Language Processing
Streaming Data Applications
Data stream → Encoder → SDR → HTM → Predictions, Anomalies, Classification (sketch after this slide)
Encoder input types: Numbers, Categories, Date, Time, GPS, Words
Applications
Servers
Biometrics
Medical
Vehicles
Industrial equipment
Social media
Comm. networks
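A minimal sketch of the pipeline above, assuming a toy bucket encoder for numeric data and using the previous input as a stand-in for the HTM's prediction; ScalarEncoder, anomaly_score, and the sample metric values are illustrative, not the real NuPIC API.

```python
class ScalarEncoder:
    """Toy bucket encoder: a number becomes a set of active bit indices."""
    def __init__(self, size=400, active_bits=21, min_val=0.0, max_val=100.0):
        self.size, self.w = size, active_bits
        self.min, self.max = min_val, max_val

    def encode(self, value):
        span = self.size - self.w
        start = int(span * (value - self.min) / (self.max - self.min))
        start = max(0, min(span, start))
        return frozenset(range(start, start + self.w))  # indices of the 1 bits


def anomaly_score(predicted, actual):
    """Fraction of active bits that were not predicted (0 = expected, 1 = novel)."""
    if not actual:
        return 0.0
    return 1.0 - len(predicted & actual) / len(actual)


encoder = ScalarEncoder()
predicted = frozenset()                  # prediction carried over from the last step
for value in [22.0, 23.1, 22.8, 97.5]:   # toy "server metric" stream; 97.5 is the anomaly
    sdr = encoder.encode(value)
    print(f"{value:6.1f}  anomaly={anomaly_score(predicted, sdr):.2f}")
    predicted = sdr                      # stand-in for the HTM's real prediction
```

In the full system the prediction would come from the temporal memory rather than from the previous input, but the anomaly score is the same idea: how much of the current SDR was unexpected.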
Streaming Data Applications
Server metrics, Human metrics, Natural language, GPS data, EEG data, Financial data
Anomaly Detection in Server Metrics (Grok for AWS)
Server metric → Encoder → SDR → HTM → Anomaly score
Mobile and web dashboards: servers sorted by anomaly score, continuously updated
What Kind of Anomalies Can HTM Detect?
Sudden changes, slow changes, subtle changes in regular data, changes in noisy data
What Kind of Anomalies Can HTM Detect?
Changes that humans can’t see, e.g. an engineer manually starting a build on an automated build server
Anomaly Detection in Human Metrics
Keystrokes, file access, CPU usage, app access
Example anomaly: created a large Zip file
Anomaly Detection in Financial and Social Media Data
Stock volume, social media
Berkeley Cognitive Technology Group
Classification of EEG Data
GPS Data: SmartHarbors
Natural Language
Document corpus (e.g. Wikipedia) → 100K “Word SDRs” (128 x 128 bits each)
Word SDRs support semantic arithmetic, e.g. Apple − Fruit = Computer, Macintosh, Microsoft, Mac, Linux, Operating system, …
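A toy illustration of this word-SDR arithmetic, with small hand-made index sets standing in for real 128 x 128 word fingerprints (the sets and their contents are invented purely to show the set operations):

```python
# Word SDRs as sets of active bit indices (invented toy values).
apple    = {1, 2, 3, 10, 11, 20}   # fruit-like bits plus computer-company bits
fruit    = {1, 2, 3, 4, 5}         # generic fruit bits
computer = {10, 11, 12, 20, 21}    # generic computer bits

def overlap(a, b):
    return len(a & b)              # shared bits = shared meaning

# "Apple - Fruit": remove the fruit-like bits, keep what is left.
residue = apple - fruit            # {10, 11, 20}

# The residue now overlaps "computer" far more than "fruit".
print(overlap(residue, computer))  # 3
print(overlap(residue, fruit))     # 0
```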
Training set
frog eats flies
cow eats grain
elephant eats leaves
goat eats grass
wolf eats rabbit
cat likes ball
elephant likes water
sheep eats grass
cat eats salmon
wolf eats mice
lion eats cow
dog likes sleep
elephant likes water
cat likes ball
coyote eats rodent
coyote eats rabbit
wolf eats squirrel
dog likes sleep
cat likes ball
---- ---- -----
Word 1, Word 2, Word 3
Sequences of Word SDRs → HTM
Training set: same sentences as above.
Sequences of Word SDRs → HTM
Query: “fox” eats → ?
HTM answer: rodent
- Learning is unsupervised
- Semantic generalization (see the sketch below)
- Works across languages
- Many applications: intelligent search, sentiment analysis, semantic filtering
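A sketch of why an unseen word like “fox” can still produce a sensible answer, assuming word SDRs that encode semantics. The word_sdrs and learned_objects tables below are invented toys, not the actual word fingerprints or the HTM's learned state; they only illustrate generalization by SDR overlap.

```python
# Toy word SDRs: canine words share many bits, "cow" shares none with them.
word_sdrs = {
    "wolf":   {1, 2, 3, 4, 20},
    "coyote": {1, 2, 3, 5, 21},
    "cow":    {40, 41, 42, 43},
    "fox":    {1, 2, 3, 6, 22},    # never seen in training, but canine-like bits
}

# What "X eats" was observed to predict during training (from the sentences above).
learned_objects = {
    "wolf":   {"rabbit", "mice", "squirrel"},
    "coyote": {"rodent", "rabbit"},
    "cow":    {"grain"},
}

def predict_eats(word):
    # Generalize by overlap: borrow the prediction of the most similar trained word.
    best = max(learned_objects, key=lambda w: len(word_sdrs[w] & word_sdrs[word]))
    return best, learned_objects[best]

print(predict_eats("fox"))   # borrows the diet learned for a canine word, not for "cow"
```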
Server metrics, human metrics, natural language, GPS data, EEG data, financial data:
All these applications run on
the exact same HTM code.
Research Roadmap (cortical layers 2/3, 4, 5, 6)
Functions: Sensory-motor Inference, High-order Inference, Motor Sequences, Attention/Feedback
Theory status: 98% (extensively tested, commercial), 80% (in development), 50%, 30%
Streaming Data
- Capabilities: Prediction, Anomaly detection, Classification
- Applications: IT, Security, Natural Language Processing
Static Data (via active learning)
- Capabilities: Classification, Prediction
- Applications: Vision (image classification), Network classification, Classification of connected graphs
Static and/or streaming Data
- Capabilities: Goal-oriented behavior
- Applications: Robotics, Smart bots, Proactive defense
Enables: Multi-sensory modalities, Multi-behavioral modalities
Research Transparency
- Algorithms are documented
- Multiple independent implementations
- Numenta’s software is open source (GPLv3): NuPIC, www.Numenta.org
- Numenta’s daily research code is online
NuPIC Community
- Active discussion groups for theory and implementation
- Collaborative: IBM Almaden Research (San Jose, CA); DARPA (Washington, D.C.); Cortical.IO (Austria)
Machine Intelligence Landscape

                Cortical (e.g. HTM)            ANNs (e.g. Deep learning)   A.I. (e.g. Watson)
Premise         Biological                     Mathematical                Engineered
Data            Spatial-temporal, Language,    Spatial-temporal            Language, Documents
                Behavior
Capabilities    Classification, Prediction,    Classification              NL Query
                Goal-oriented Behavior
Path to M.I.?   Yes                            Probably not                Probably not
Learning Normal Behavior
Geospatial Anomalies
Deviation in path, change in direction
Learning Transitions
Time = 1
Learning Transitions
Time = 2
Learning Transitions
Form connections to previously active cells. Predict future activity.
Multiple predictions can occur at once: A-B, A-C, A-D.
- This is a first-order sequence memory (minimal sketch below).
- It cannot learn A-B-C-D vs. X-B-C-Y.
- Mini-columns turn this into a high-order sequence memory.
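A minimal sketch of the first-order transition memory described above, using plain Python dictionaries rather than the HTM data structures:

```python
from collections import defaultdict

transitions = defaultdict(set)      # previous element -> set of possible next elements

def learn(sequence):
    for prev, nxt in zip(sequence, sequence[1:]):
        transitions[prev].add(nxt)  # "form connections to previously active cells"

def predict(element):
    return transitions[element]     # multiple predictions can occur at once

learn("ABCD")
learn("XBCY")
print(predict("A"))   # {'B'}
print(predict("C"))   # {'D', 'Y'} -- a first-order memory cannot tell A-B-C-D
                      # from X-B-C-Y; mini-columns are what keep the context
```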
Forming High-Order Representations
Feedforward input causes sparse activation of columns.
Unpredicted input → burst of activity in the column; predicted input → highly sparse, unique pattern.
Representing High-Order Sequences
Before training: A-B-C-D and X-B-C-Y share the same representations for B and C.
After training: the sequences become A-B’-C’-D and X-B’’-C’’-Y. Same columns, but only one cell active per column, so B and C are represented differently in each context.
If 40 columns are active and there are 10 cells per column, then there are 10^40 ways to represent the same input in different contexts (see the calculation below).
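The capacity claim can be checked directly; each of the 40 active columns can represent its input with any one of its 10 cells:

```python
# Context capacity of one set of active columns, using the slide's numbers.
active_columns, cells_per_column = 40, 10
contexts = cells_per_column ** active_columns
print(contexts == 10 ** 40)   # True: 10^40 ways to represent the same input
```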
SDR Properties
1) Similarity: shared bits = semantic similarity
2) Store and Compare: store the indices of the active bits; subsampling is OK
3) Union membership: OR together many SDRs (e.g. 10 SDRs at 2% sparsity → ~20% of bits set) and ask “is this SDR a member?”
What Can Be Done With Software
1 layer: 2048 columns, 65,000 neurons, 300M synapses
30 msec per learning-inference-prediction step
About 10^-6 of human cortex
Challenges and Opportunities for Neuromorphic HW
Challenges
- Dendritic regions, active dendrites
- 1,000s of synapses, 10,000s of potential synapses
- Continuous learning
Opportunities
- Low precision memory (synapses)
- Fault tolerant: memory, connectivity, neurons, natural recovery
- Simple activation states (no spikes)
- Connectivity: very sparse, topological
Cellular Layers (2/3, 4, 5, 6)
- Layer 2/3: Sequence memory: Inference
- Layer 4: Sequence memory: Inference
- Layer 5: Sequence memory: Motor
- Layer 6: Sequence memory: Attention
(Diagram connections: sensor/lower cortex, higher cortex, lower cortex, motor center; feedforward and feedback.)
Each layer implements a variation of a common sequence memory algorithm.
Why Will Machine Intelligence be Based on Cortical Principles?
1) Cortex uses a common learning algorithm
vision
hearing
touch
behavior
2) Cortical algorithm is incredibly adaptable
languages
engineering
science
arts …
3) Network effects
Hardware and software efforts will focus on the most universal solution
Cellular Layers (2/3, 4, 5, 6)
- Layer 2/3: Sequence memory: Inference
- Layer 4: Sequence memory: Inference
- Layer 5: Sequence memory: Motor
- Layer 6: Sequence memory: Attention
(Diagram connections: sensor/lower cortex, higher cortex, lower cortex, sub-cortical motor center; feedforward and feedback.)
Each layer is a variation of a common sequence memory algorithm.
Inputs/outputs define the role of each layer.
Learning Transitions: feedforward activation
Learning Transitions: inhibition
Sparse Distributed Representations (SDRs)
- Sensory perception
- Planning
- Motor control
- Prediction
- Attention
Sparse Distributed Representations are used everywhere in the cortex.
Sparse Distributed Representations
What are they
• Many bits (thousands)
• Few 1’s, mostly 0’s
• Example: 2,000 bits, 2% active
• Each bit has semantic meaning
• No bit is essential
01000000000000000001000000000000000000000000000000000010000…………01000
Desirable attributes
• High capacity
• Robust to noise and deletion
• Efficient and fast
• Enable new operations
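A minimal sketch of these properties, assuming SDRs are stored as sets of active-bit indices and using the slide's numbers (2,000 bits, 2% active); random_sdr and similarity are illustrative helpers:

```python
import random

N_BITS, N_ACTIVE = 2000, 40          # 2% of 2,000 bits active

def random_sdr(rng):
    return frozenset(rng.sample(range(N_BITS), N_ACTIVE))

def similarity(a, b):
    """Shared active bits; with semantic encoders, overlap = shared meaning."""
    return len(a & b)

rng = random.Random(42)
a, b = random_sdr(rng), random_sdr(rng)

# A corrupted copy of `a`: keep 30 of its 40 bits, add 10 random bits.
noisy_a = frozenset(list(a)[:30]) | frozenset(rng.sample(range(N_BITS), 10))

print(similarity(a, b))        # near zero: random 2% SDRs barely overlap
print(similarity(a, noisy_a))  # >= 30: a 25%-corrupted copy is still obviously "a"
```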
SDR Operations
1) Similarity: shared bits = semantic similarity
2) Store and Compare: store the indices of the active bits; subsampling is OK
3) Union membership: OR together many SDRs (e.g. 10 SDRs at 2% sparsity → ~20% of bits set) and ask “is this SDR a member?” (sketch below)
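A sketch of the three operations, again with SDRs as sets of active indices. The subsample size of 15 echoes the deck's later “10 to 15 synapses” remark; the match threshold of 12 is an illustrative choice, not a prescribed value.

```python
import random
rng = random.Random(7)

def random_sdr(n=2000, w=40):
    return frozenset(rng.sample(range(n), w))

# 1) / 2) Store by keeping only the indices of the active bits.
stored = [random_sdr() for _ in range(10)]

# 2) Compare by subsampling: checking ~15 of the 40 stored bits is enough.
def matches(candidate, full_sdr, sample_size=15, threshold=12):
    sample = rng.sample(sorted(full_sdr), sample_size)
    return sum(bit in candidate for bit in sample) >= threshold

# 3) Union membership: OR together many SDRs (10 x 2% is roughly 20% of bits),
#    then test whether all of a new SDR's bits fall inside the union.
union = frozenset().union(*stored)
def in_union(candidate):
    return candidate <= union

print(matches(stored[3], stored[3]))    # True: an exact copy passes the subsample test
print(in_union(stored[5]))              # True: members of the union are recognized
print(in_union(random_sdr()))           # almost certainly False for a novel SDR
```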
SmartHarbors
GPS to SDR Encoder
Neurons: biological neuron vs. HTM neuron
- Feedforward (proximal) synapses activate the cell; they recognize dozens of unique patterns.
- Local and feedback (distal) dendrites act as coincidence detectors: non-linear dendritic APs depolarize the soma, putting the cell into a predictive state; together they recognize hundreds of unique patterns.
Biological synapses: learning is the formation of new synapses; synapses have low fidelity.
HTM synapses: the connection weight is binary (0 or 1); learning forms new connections by growing a scalar “permanence” on a 0.0–1.0 scale past a connection threshold (marked near 0.4 on the slide). See the sketch below.
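A sketch of an HTM-style synapse as described above; the 0.4 threshold follows the slide's scale, while the starting permanence and learning increments are illustrative values only.

```python
CONNECTED_THRESHOLD = 0.4   # permanence above this => synapse counts as connected

class Synapse:
    def __init__(self, presynaptic_cell, permanence=0.2):
        self.presynaptic_cell = presynaptic_cell
        self.permanence = permanence                 # scalar in [0.0, 1.0]

    @property
    def connected(self):                             # effective weight: 0 or 1
        return self.permanence >= CONNECTED_THRESHOLD

    def learn(self, was_active, inc=0.05, dec=0.03):
        # Reinforce synapses whose presynaptic cell was active, weaken the rest.
        delta = inc if was_active else -dec
        self.permanence = min(1.0, max(0.0, self.permanence + delta))

s = Synapse(presynaptic_cell=17)
print(s.connected)            # False: starts as a "potential" synapse
for _ in range(5):
    s.learn(was_active=True)  # repeated co-activity forms a new connection
print(s.connected)            # True once permanence crosses the threshold
```

Because the effective weight is binary, very low-precision synapse storage is sufficient, which is one of the neuromorphic-hardware opportunities listed earlier.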
SDRs are used everywhere in the cortex.
Sparse Distributed Representations (SDRs)
From: Prof. Hasan, Max-Planck-Institute for Research
x = 0100000000000000000100000000000110000000
Attributes
• Extremely high capacity
• Robust to noise and deletions
• Have many desirable properties
• Solve semantic representation problem
SDR Basics
• Large number of neurons
• Few active at once
• Every cell represents something
• Information is distributed
• SDRs are binary
10 to 15 synapses are sufficient to recognize patterns in thousands of cells. A single dendrite can recognize multiple unique patterns without confusion.
Example: SDR Classification Capacity in Presence of Noise
• n = number of bits in the SDR
• w = number of 1 bits
• $\Omega_x(n, w, b)$ = number of $w$-bit vectors that overlap vector $x$ in exactly $b$ bits
• $\theta$ = match threshold (minimum overlap)

$\Omega_x(n, w, b) = \binom{w_x}{b} \binom{n - w_x}{w - b}$

Probability of a false positive for one stored pattern:
$fp_w^n(\theta) = \dfrac{\sum_{b=\theta}^{w} \Omega_x(n, w, b)}{\binom{n}{w}}$

Probability of a false positive for $M$ stored patterns (union bound):
$fp_X(\theta) \le \sum_{i=0}^{M-1} fp_{w_{x_i}}^n(\theta)$

n = 2048, w = 40: with 50% noise, you can classify 10^15 patterns with an error < 10^-11.
n = 64, w = 12: with 33% noise, you can classify only 10 patterns with an error of 0.04%.
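The bound can be evaluated directly with exact binomial coefficients (a sketch assuming Python 3.8+ for math.comb; the thresholds are derived from the stated noise levels: 50% noise on 40 bits leaves 20 matching bits, 33% noise on 12 bits leaves 8).

```python
from math import comb

def fp_single(n, w, theta):
    """P(a random w-bit SDR overlaps a stored w-bit SDR in >= theta bits)."""
    hits = sum(comb(w, b) * comb(n - w, w - b) for b in range(theta, w + 1))
    return hits / comb(n, w)

# Large SDR: n = 2048, w = 40, require 20 of 40 bits to match.
p = fp_single(2048, 40, 20)
print(p)                       # roughly 1e-26 per stored pattern
print(min(1.0, 1e15 * p))      # union bound over 10^15 patterns stays vanishingly small

# Small SDR: n = 64, w = 12, require 8 of 12 bits to match.
print(10 * fp_single(64, 12, 8))  # ~4e-4, i.e. ~0.04% error with just 10 stored patterns
```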
Link.to.whitepaper.com