Sparse Distributed Representations:
Our Brain’s Data Structure
Numenta Workshop
October 17, 2014
Subutai Ahmad, VP Research
sahmad@numenta.com
The Role of Sparse Distributed Representations in Cortex
1) Sensory perception
2) Planning
3) Motor control
4) Prediction
5) Attention
Sparse Distributed Representations (SDRs) are the foundation for all of these
functions, across all sensory modalities
Analysis of this common cortical data structure can provide a rigorous
foundation for cortical computing
Talk Outline
1) Introduction to Sparse Distributed Representations (SDRs)
2) Fundamental properties of SDRs
– Error bounds
– Scaling laws
From: Prof. Hasan, Max-Planck-Institut for Research
Basic Attributes of SDRs
1) Only a small number of neurons are firing at any point in time
2) There are a very large number of neurons
3) Every cell represents something and has meaning
4) Information is distributed and no single neuron is critical
5) Every neuron only connects to a subset of other neurons
6) SDRs enable extremely fast computation
7) SDRs are binary
x = 0100000000000000000100000000000110000000
Multiple input SDRs → a single bit in an output SDR
How Does a Single Neuron Operate on SDRs?
Proximal segments
represent dozens of
separate patterns in a single
segment
How Does a Single Neuron Operate on SDRs?
Hundreds of distal segments each detect a
unique SDR using a threshold
Feedback SDR
Context SDR
Bottom-up input SDR
In both cases each synapse corresponds to one bit in
the incoming high dimensional SDR
• Extremely high capacity
• Recognize patterns in the presence of noise
• Robust to random deletions
• Represent a dynamic set of patterns in a single fixed structure
• Extremely efficient
Fundamental Properties of SDRs
Notation
• We represent an SDR as a vector of n binary values, where each bit
represents the activity of a single neuron:
x = [b0, …, bn−1]
• s = percent of ON bits, w = number of ON bits:
wx = s × n = ‖x‖1
Example
• n = 40, s = 0.1, w = 4
• Typical range of numbers in HTM implementations:
n = 2048 to 65,536, s = 0.05% to 2%, w = 40
y =1000000000000000000100000000000110000000
x = 0100000000000000000100000000000110000000
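The notation above can be sketched in a few lines of plain Python (an illustrative sketch, not NuPIC code):

```python
# Illustrative sketch of the SDR notation (plain Python, not NuPIC code).
x = [int(b) for b in "0100000000000000000100000000000110000000"]

n = len(x)      # total number of bits
w = sum(x)      # number of ON bits; for a binary vector this is the L1 norm
s = w / n       # sparsity (fraction of ON bits)

print(n, w, s)  # 40 4 0.1
```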
SDRs Have Extremely High Capacity
• The number of unique patterns that can be represented is:
C(n, w) = n! / (w! (n − w)!)
• This is far smaller than 2^n, but far larger than any reasonable need
• Example: with n = 2048 and w = 40,
the number of unique patterns is > 10^84 >> # atoms in the universe
• The chance that two random vectors are identical is essentially zero:
1 / C(n, w)
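The capacity claim is easy to check directly with Python's exact binomial coefficient (a standard-library sketch):

```python
import math

# Unique SDRs with exactly w of n bits ON: C(n, w) = n! / (w!(n-w)!)
n, w = 2048, 40
capacity = math.comb(n, w)

print(capacity > 10**84)   # True: more patterns than atoms in the universe
collision = 1 / capacity   # chance two random SDRs are identical
```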
• Extremely high capacity
• Recognize patterns in the presence of noise
• Robust to random deletions
• Represent multiple patterns in a single fixed structure
• Extremely efficient
Fundamental Properties of SDRs
Similarity Metric for Recognition of SDR Patterns
• We don’t use typical vector similarities
– Neurons cannot compute Euclidean or Hamming distance between SDRs
– Any p-norm requires full connectivity
• Compute similarity using an overlap metric
– The overlap is simply the number of bits in common
– Requires only minimal connectivity
– Mathematically, take the AND of two vectors and compute its length
• Detecting a “Match”
– Two SDR vectors “match” if their overlap meets a minimum threshold
overlap(x, y) ≡ |x ∧ y|
match(x, y) ≡ overlap(x, y) ≥ θ
Overlap example
• n = 40, s = 0.1, w = 4
• The two vectors have an overlap of 3, so they “match” if the
threshold is 3.
y =1000000000000000000100000000000110000000
x = 0100000000000000000100000000000110000000
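The overlap and match operations are a few lines each (plain-Python sketch; `theta` stands for the match threshold θ):

```python
def overlap(x, y):
    # Number of ON bits the two binary SDRs share: |x AND y|
    return sum(a & b for a, b in zip(x, y))

def match(x, y, theta):
    # Two SDRs "match" if their overlap meets the threshold theta
    return overlap(x, y) >= theta

x = [int(b) for b in "0100000000000000000100000000000110000000"]
y = [int(b) for b in "1000000000000000000100000000000110000000"]

print(overlap(x, y))   # 3
print(match(x, y, 3))  # True
```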
How Accurate is Matching With Noise?
• As you decrease the match threshold θ, you decrease sensitivity and
increase robustness to noise
• You also increase the chance of false positives
How Many Vectors Match When You Decrease the Threshold?
• Define the “overlap set of x” to be the set of
vectors with exactly b bits of overlap with x
• The number of such vectors is:
|Ωx(n, w, b)| = C(wx, b) × C(n − wx, w − b)

C(wx, b): the number of subsets of x with exactly b bits ON
C(n − wx, w − b): the number of patterns occupying the rest of the vector
with exactly w − b bits ON
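As a sanity check on the formula, the overlap sets for b = 0 … w partition all C(n, w) possible vectors (a small sketch using Python's `math.comb`):

```python
import math

def overlap_set_size(n, w, b, wx=None):
    # |Omega_x(n, w, b)|: vectors with w ON bits sharing exactly b with x
    wx = w if wx is None else wx
    return math.comb(wx, b) * math.comb(n - wx, w - b)

# Summing over every possible overlap b recovers all C(n, w) vectors
n, w = 40, 4
total = sum(overlap_set_size(n, w, b) for b in range(w + 1))
print(total == math.comb(n, w))  # True (an instance of Vandermonde's identity)
```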
Error Bound for Classification with Noise
• Given a single stored pattern, the probability of a false positive is:
fp_w^n(θ) = ( Σ_{b=θ}^{w} |Ωx(n, w, b)| ) / C(n, w)
• Given M stored patterns, the probability of a false positive is bounded by:
fp_X(θ) ≤ Σ_{i=0}^{M−1} fp_{wxi}^n(θ)
What Does This Mean in Practice?
• With SDRs you can classify a huge number of patterns with substantial noise
(if n and w are large enough)
Examples
• n = 2048, w = 40
With up to 14 bits of noise (33%), you can classify a quadrillion
patterns with an error rate of less than 10^-24
With up to 20 bits of noise (50%), you can classify a quadrillion
patterns with an error rate of less than 10^-11
• n = 64, w = 12
With up to 4 bits of noise (33%), you can classify 10 patterns
with an error rate of 0.04%
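The small example can be reproduced directly from the false-positive formula (a sketch; the threshold is assumed to be `theta = w - noise_bits`):

```python
import math

def fp_prob(n, w, theta):
    # P(a random SDR with w ON bits overlaps a stored pattern in >= theta bits)
    numer = sum(math.comb(w, b) * math.comb(n - w, w - b)
                for b in range(theta, w + 1))
    return numer / math.comb(n, w)

# n = 64, w = 12, up to 4 bits of noise -> threshold theta = 12 - 4 = 8,
# and 10 stored patterns (the union bound multiplies by 10)
p = 10 * fp_prob(64, 12, theta=8)
print(f"{p:.2%}")  # about 0.04%
```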
Neurons Are Highly Robust Pattern Recognizers
Hundreds of distal segments each detect a
unique SDR using a threshold
You can have tens of thousands of neurons examining a single input SDR, and very
robustly matching complex patterns
• Extremely high capacity
• Recognize patterns in the presence of noise
• Robust to random deletions
• Represent multiple patterns in a single fixed structure
• Extremely efficient
Fundamental Properties of SDRs
SDRs are Robust to Random Deletions
• In cortex, bits in an SDR can randomly disappear
– Synapses can be quite unreliable
– Individual neurons can die
– A patch of cortex can be damaged
• The analysis for random deletions is very similar to noise
• SDRs can naturally handle fairly significant random failures
– Failures are tolerated in any SDR and in any part of the system
• This is a great property for those building HTM-based hardware
– The probability of failures can be exactly characterized
• Extremely high capacity
• Recognize patterns in the presence of noise
• Robust to random deletions
• Represent multiple patterns in a single fixed structure
• Extremely efficient
Fundamental Properties of SDRs
Representing Multiple Patterns in a Single SDR
• There are situations where we want to store multiple patterns within a single SDR
and match them
• In temporal inference the system might make multiple predictions about the future
Example
Unions of SDRs
• We can store a set of patterns in a single fixed representation by taking the OR of
all the individual patterns
• The vector representing the union is also going to match a large number of other
patterns that were not one of the original 10
• How many such patterns can we store reliably, without a high chance of false
positives?
(Figure: the union of 10 SDRs, each 2% sparse, is a single SDR that is less
than 20% sparse; membership of a new SDR is tested by matching it against
the union.)
Error Bounds for Unions
• Expected number of ON bits in a union of M patterns, each with sparsity s:
E[wunion] = n × (1 − (1 − s)^M)
• Given a union of M patterns, the expected probability of a false positive
(with noise) follows from this expected sparsity
What Does This Mean in Practice?
• You can form reliable unions of a reasonable number of patterns (assuming
large enough n and w)
Examples
• n = 2048, w = 40
The union of 50 patterns leads to an error rate of 10^-9
• n = 512, w = 10
The union of 50 patterns leads to an error rate of 0.9%
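The union property itself can be sketched directly: OR together M random SDRs and verify that every stored pattern still matches the union exactly (hypothetical sizes for illustration; plain Python, not NuPIC code):

```python
import random

random.seed(42)
n, w, M = 1024, 20, 10   # hypothetical sizes chosen for illustration

def random_sdr():
    # A random binary vector with exactly w of n bits ON
    bits = [0] * n
    for i in random.sample(range(n), w):
        bits[i] = 1
    return bits

patterns = [random_sdr() for _ in range(M)]
union = [max(col) for col in zip(*patterns)]   # bitwise OR of all patterns

# Every stored pattern matches the union with full overlap w
print(all(sum(a & b for a, b in zip(p, union)) == w for p in patterns))  # True

# Expected ON bits in the union: n * (1 - (1 - w/n)^M)
print(sum(union), round(n * (1 - (1 - w / n) ** M)))
```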
• Extremely high capacity
• Recognize patterns in the presence of noise
• Robust to random deletions
• Represent multiple patterns in a single fixed structure
• Extremely efficient
Fundamental Properties of SDRs
SDRs Enable Highly Efficient Operations
• In cortex, complex operations are carried out rapidly
– Visual system can perform object recognition in 100-150 msecs
• SDR vectors are large, but all operations are O(w) and independent of
vector size
– No loops or optimization process required
• Matching a pattern against a dynamic list (unions) is O(w) and
independent of the number of items in the list
• Enables a tiny dendritic segment to perform robust pattern recognition
• We can simulate 200,000 neurons in software at about 25–50 Hz
Summary
• SDRs are the common data structure in the cortex
• SDRs enable flexible recognition systems that have very high capacity and
are robust to a large amount of noise
• The union property allows a fixed representation to encode a dynamically
changing set of patterns
• The analysis of SDRs provides a principled foundation for characterizing the
behavior of the HTM learning algorithms and all cognitive functions
Related work
• Sparse memory (Kanerva), sparse coding (Olshausen), Bloom filters (Broder)
Questions? Math jokes?
Follow us on Twitter @numenta
Sign up for our newsletter at www.numenta.com
Subutai Ahmad
sahmad@numenta.com
nupic-theory mailing list
numenta.org/lists
Deep learning at the edge: 100x Inference improvement on edge devicesDeep learning at the edge: 100x Inference improvement on edge devices
Deep learning at the edge: 100x Inference improvement on edge devices
 
Brains@Bay Meetup: A Primer on Neuromodulatory Systems - Srikanth Ramaswamy
Brains@Bay Meetup: A Primer on Neuromodulatory Systems - Srikanth RamaswamyBrains@Bay Meetup: A Primer on Neuromodulatory Systems - Srikanth Ramaswamy
Brains@Bay Meetup: A Primer on Neuromodulatory Systems - Srikanth Ramaswamy
 
Brains@Bay Meetup: How to Evolve Your Own Lab Rat - Thomas Miconi
Brains@Bay Meetup: How to Evolve Your Own Lab Rat - Thomas MiconiBrains@Bay Meetup: How to Evolve Your Own Lab Rat - Thomas Miconi
Brains@Bay Meetup: How to Evolve Your Own Lab Rat - Thomas Miconi
 
Brains@Bay Meetup: The Increasing Role of Sensorimotor Experience in Artifici...
Brains@Bay Meetup: The Increasing Role of Sensorimotor Experience in Artifici...Brains@Bay Meetup: The Increasing Role of Sensorimotor Experience in Artifici...
Brains@Bay Meetup: The Increasing Role of Sensorimotor Experience in Artifici...
 
Brains@Bay Meetup: Open-ended Skill Acquisition in Humans and Machines: An Ev...
Brains@Bay Meetup: Open-ended Skill Acquisition in Humans and Machines: An Ev...Brains@Bay Meetup: Open-ended Skill Acquisition in Humans and Machines: An Ev...
Brains@Bay Meetup: Open-ended Skill Acquisition in Humans and Machines: An Ev...
 
Brains@Bay Meetup: The Effect of Sensorimotor Learning on the Learned Represe...
Brains@Bay Meetup: The Effect of Sensorimotor Learning on the Learned Represe...Brains@Bay Meetup: The Effect of Sensorimotor Learning on the Learned Represe...
Brains@Bay Meetup: The Effect of Sensorimotor Learning on the Learned Represe...
 
SBMT 2021: Can Neuroscience Insights Transform AI? - Lawrence Spracklen
SBMT 2021: Can Neuroscience Insights Transform AI? - Lawrence SpracklenSBMT 2021: Can Neuroscience Insights Transform AI? - Lawrence Spracklen
SBMT 2021: Can Neuroscience Insights Transform AI? - Lawrence Spracklen
 
FPGA Conference 2021: Breaking the TOPS ceiling with sparse neural networks -...
FPGA Conference 2021: Breaking the TOPS ceiling with sparse neural networks -...FPGA Conference 2021: Breaking the TOPS ceiling with sparse neural networks -...
FPGA Conference 2021: Breaking the TOPS ceiling with sparse neural networks -...
 
OpenAI’s GPT 3 Language Model - guest Steve Omohundro
OpenAI’s GPT 3 Language Model - guest Steve OmohundroOpenAI’s GPT 3 Language Model - guest Steve Omohundro
OpenAI’s GPT 3 Language Model - guest Steve Omohundro
 
CVPR 2020 Workshop: Sparsity in the neocortex, and its implications for conti...
CVPR 2020 Workshop: Sparsity in the neocortex, and its implications for conti...CVPR 2020 Workshop: Sparsity in the neocortex, and its implications for conti...
CVPR 2020 Workshop: Sparsity in the neocortex, and its implications for conti...
 
The Thousand Brains Theory: A Framework for Understanding the Neocortex and B...
The Thousand Brains Theory: A Framework for Understanding the Neocortex and B...The Thousand Brains Theory: A Framework for Understanding the Neocortex and B...
The Thousand Brains Theory: A Framework for Understanding the Neocortex and B...
 
The Biological Path Toward Strong AI by Matt Taylor (05/17/18)
The Biological Path Toward Strong AI by Matt Taylor (05/17/18)The Biological Path Toward Strong AI by Matt Taylor (05/17/18)
The Biological Path Toward Strong AI by Matt Taylor (05/17/18)
 
Biological path toward strong AI
Biological path toward strong AIBiological path toward strong AI
Biological path toward strong AI
 

Recently uploaded

Malana- Gimlet Market Analysis (Portfolio 2)
Malana- Gimlet Market Analysis (Portfolio 2)Malana- Gimlet Market Analysis (Portfolio 2)
Malana- Gimlet Market Analysis (Portfolio 2)
TravisMalana
 
Q1’2024 Update: MYCI’s Leap Year Rebound
Q1’2024 Update: MYCI’s Leap Year ReboundQ1’2024 Update: MYCI’s Leap Year Rebound
Q1’2024 Update: MYCI’s Leap Year Rebound
Oppotus
 
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...
John Andrews
 
做(mqu毕业证书)麦考瑞大学毕业证硕士文凭证书学费发票原版一模一样
做(mqu毕业证书)麦考瑞大学毕业证硕士文凭证书学费发票原版一模一样做(mqu毕业证书)麦考瑞大学毕业证硕士文凭证书学费发票原版一模一样
做(mqu毕业证书)麦考瑞大学毕业证硕士文凭证书学费发票原版一模一样
axoqas
 
Criminal IP - Threat Hunting Webinar.pdf
Criminal IP - Threat Hunting Webinar.pdfCriminal IP - Threat Hunting Webinar.pdf
Criminal IP - Threat Hunting Webinar.pdf
Criminal IP
 
Ch03-Managing the Object-Oriented Information Systems Project a.pdf
Ch03-Managing the Object-Oriented Information Systems Project a.pdfCh03-Managing the Object-Oriented Information Systems Project a.pdf
Ch03-Managing the Object-Oriented Information Systems Project a.pdf
haila53
 
一比一原版(UniSA毕业证书)南澳大学毕业证如何办理
一比一原版(UniSA毕业证书)南澳大学毕业证如何办理一比一原版(UniSA毕业证书)南澳大学毕业证如何办理
一比一原版(UniSA毕业证书)南澳大学毕业证如何办理
slg6lamcq
 
Data_and_Analytics_Essentials_Architect_an_Analytics_Platform.pptx
Data_and_Analytics_Essentials_Architect_an_Analytics_Platform.pptxData_and_Analytics_Essentials_Architect_an_Analytics_Platform.pptx
Data_and_Analytics_Essentials_Architect_an_Analytics_Platform.pptx
AnirbanRoy608946
 
原版制作(swinburne毕业证书)斯威本科技大学毕业证毕业完成信一模一样
原版制作(swinburne毕业证书)斯威本科技大学毕业证毕业完成信一模一样原版制作(swinburne毕业证书)斯威本科技大学毕业证毕业完成信一模一样
原版制作(swinburne毕业证书)斯威本科技大学毕业证毕业完成信一模一样
u86oixdj
 
Levelwise PageRank with Loop-Based Dead End Handling Strategy : SHORT REPORT ...
Levelwise PageRank with Loop-Based Dead End Handling Strategy : SHORT REPORT ...Levelwise PageRank with Loop-Based Dead End Handling Strategy : SHORT REPORT ...
Levelwise PageRank with Loop-Based Dead End Handling Strategy : SHORT REPORT ...
Subhajit Sahu
 
一比一原版(BCU毕业证书)伯明翰城市大学毕业证如何办理
一比一原版(BCU毕业证书)伯明翰城市大学毕业证如何办理一比一原版(BCU毕业证书)伯明翰城市大学毕业证如何办理
一比一原版(BCU毕业证书)伯明翰城市大学毕业证如何办理
dwreak4tg
 
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Dat...
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Dat...06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Dat...
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Dat...
Timothy Spann
 
原版制作(Deakin毕业证书)迪肯大学毕业证学位证一模一样
原版制作(Deakin毕业证书)迪肯大学毕业证学位证一模一样原版制作(Deakin毕业证书)迪肯大学毕业证学位证一模一样
原版制作(Deakin毕业证书)迪肯大学毕业证学位证一模一样
u86oixdj
 
一比一原版(NYU毕业证)纽约大学毕业证成绩单
一比一原版(NYU毕业证)纽约大学毕业证成绩单一比一原版(NYU毕业证)纽约大学毕业证成绩单
一比一原版(NYU毕业证)纽约大学毕业证成绩单
ewymefz
 
Algorithmic optimizations for Dynamic Levelwise PageRank (from STICD) : SHORT...
Algorithmic optimizations for Dynamic Levelwise PageRank (from STICD) : SHORT...Algorithmic optimizations for Dynamic Levelwise PageRank (from STICD) : SHORT...
Algorithmic optimizations for Dynamic Levelwise PageRank (from STICD) : SHORT...
Subhajit Sahu
 
Quantitative Data AnalysisReliability Analysis (Cronbach Alpha) Common Method...
Quantitative Data AnalysisReliability Analysis (Cronbach Alpha) Common Method...Quantitative Data AnalysisReliability Analysis (Cronbach Alpha) Common Method...
Quantitative Data AnalysisReliability Analysis (Cronbach Alpha) Common Method...
2023240532
 
一比一原版(UofS毕业证书)萨省大学毕业证如何办理
一比一原版(UofS毕业证书)萨省大学毕业证如何办理一比一原版(UofS毕业证书)萨省大学毕业证如何办理
一比一原版(UofS毕业证书)萨省大学毕业证如何办理
v3tuleee
 
一比一原版(Bradford毕业证书)布拉德福德大学毕业证如何办理
一比一原版(Bradford毕业证书)布拉德福德大学毕业证如何办理一比一原版(Bradford毕业证书)布拉德福德大学毕业证如何办理
一比一原版(Bradford毕业证书)布拉德福德大学毕业证如何办理
mbawufebxi
 
【社内勉強会資料_Octo: An Open-Source Generalist Robot Policy】
【社内勉強会資料_Octo: An Open-Source Generalist Robot Policy】【社内勉強会資料_Octo: An Open-Source Generalist Robot Policy】
【社内勉強会資料_Octo: An Open-Source Generalist Robot Policy】
NABLAS株式会社
 
一比一原版(UofM毕业证)明尼苏达大学毕业证成绩单
一比一原版(UofM毕业证)明尼苏达大学毕业证成绩单一比一原版(UofM毕业证)明尼苏达大学毕业证成绩单
一比一原版(UofM毕业证)明尼苏达大学毕业证成绩单
ewymefz
 

Recently uploaded (20)

Malana- Gimlet Market Analysis (Portfolio 2)
Malana- Gimlet Market Analysis (Portfolio 2)Malana- Gimlet Market Analysis (Portfolio 2)
Malana- Gimlet Market Analysis (Portfolio 2)
 
Q1’2024 Update: MYCI’s Leap Year Rebound
Q1’2024 Update: MYCI’s Leap Year ReboundQ1’2024 Update: MYCI’s Leap Year Rebound
Q1’2024 Update: MYCI’s Leap Year Rebound
 
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...
 
做(mqu毕业证书)麦考瑞大学毕业证硕士文凭证书学费发票原版一模一样
做(mqu毕业证书)麦考瑞大学毕业证硕士文凭证书学费发票原版一模一样做(mqu毕业证书)麦考瑞大学毕业证硕士文凭证书学费发票原版一模一样
做(mqu毕业证书)麦考瑞大学毕业证硕士文凭证书学费发票原版一模一样
 
Criminal IP - Threat Hunting Webinar.pdf
Criminal IP - Threat Hunting Webinar.pdfCriminal IP - Threat Hunting Webinar.pdf
Criminal IP - Threat Hunting Webinar.pdf
 
Ch03-Managing the Object-Oriented Information Systems Project a.pdf
Ch03-Managing the Object-Oriented Information Systems Project a.pdfCh03-Managing the Object-Oriented Information Systems Project a.pdf
Ch03-Managing the Object-Oriented Information Systems Project a.pdf
 
一比一原版(UniSA毕业证书)南澳大学毕业证如何办理
一比一原版(UniSA毕业证书)南澳大学毕业证如何办理一比一原版(UniSA毕业证书)南澳大学毕业证如何办理
一比一原版(UniSA毕业证书)南澳大学毕业证如何办理
 
Data_and_Analytics_Essentials_Architect_an_Analytics_Platform.pptx
Data_and_Analytics_Essentials_Architect_an_Analytics_Platform.pptxData_and_Analytics_Essentials_Architect_an_Analytics_Platform.pptx
Data_and_Analytics_Essentials_Architect_an_Analytics_Platform.pptx
 
原版制作(swinburne毕业证书)斯威本科技大学毕业证毕业完成信一模一样
原版制作(swinburne毕业证书)斯威本科技大学毕业证毕业完成信一模一样原版制作(swinburne毕业证书)斯威本科技大学毕业证毕业完成信一模一样
原版制作(swinburne毕业证书)斯威本科技大学毕业证毕业完成信一模一样
 
Levelwise PageRank with Loop-Based Dead End Handling Strategy : SHORT REPORT ...
Levelwise PageRank with Loop-Based Dead End Handling Strategy : SHORT REPORT ...Levelwise PageRank with Loop-Based Dead End Handling Strategy : SHORT REPORT ...
Levelwise PageRank with Loop-Based Dead End Handling Strategy : SHORT REPORT ...
 
一比一原版(BCU毕业证书)伯明翰城市大学毕业证如何办理
一比一原版(BCU毕业证书)伯明翰城市大学毕业证如何办理一比一原版(BCU毕业证书)伯明翰城市大学毕业证如何办理
一比一原版(BCU毕业证书)伯明翰城市大学毕业证如何办理
 
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Dat...
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Dat...06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Dat...
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Dat...
 
原版制作(Deakin毕业证书)迪肯大学毕业证学位证一模一样
原版制作(Deakin毕业证书)迪肯大学毕业证学位证一模一样原版制作(Deakin毕业证书)迪肯大学毕业证学位证一模一样
原版制作(Deakin毕业证书)迪肯大学毕业证学位证一模一样
 
一比一原版(NYU毕业证)纽约大学毕业证成绩单
一比一原版(NYU毕业证)纽约大学毕业证成绩单一比一原版(NYU毕业证)纽约大学毕业证成绩单
一比一原版(NYU毕业证)纽约大学毕业证成绩单
 
Algorithmic optimizations for Dynamic Levelwise PageRank (from STICD) : SHORT...
Algorithmic optimizations for Dynamic Levelwise PageRank (from STICD) : SHORT...Algorithmic optimizations for Dynamic Levelwise PageRank (from STICD) : SHORT...
Algorithmic optimizations for Dynamic Levelwise PageRank (from STICD) : SHORT...
 
Quantitative Data AnalysisReliability Analysis (Cronbach Alpha) Common Method...
Quantitative Data AnalysisReliability Analysis (Cronbach Alpha) Common Method...Quantitative Data AnalysisReliability Analysis (Cronbach Alpha) Common Method...
Quantitative Data AnalysisReliability Analysis (Cronbach Alpha) Common Method...
 
一比一原版(UofS毕业证书)萨省大学毕业证如何办理
一比一原版(UofS毕业证书)萨省大学毕业证如何办理一比一原版(UofS毕业证书)萨省大学毕业证如何办理
一比一原版(UofS毕业证书)萨省大学毕业证如何办理
 
一比一原版(Bradford毕业证书)布拉德福德大学毕业证如何办理
一比一原版(Bradford毕业证书)布拉德福德大学毕业证如何办理一比一原版(Bradford毕业证书)布拉德福德大学毕业证如何办理
一比一原版(Bradford毕业证书)布拉德福德大学毕业证如何办理
 
【社内勉強会資料_Octo: An Open-Source Generalist Robot Policy】
【社内勉強会資料_Octo: An Open-Source Generalist Robot Policy】【社内勉強会資料_Octo: An Open-Source Generalist Robot Policy】
【社内勉強会資料_Octo: An Open-Source Generalist Robot Policy】
 
一比一原版(UofM毕业证)明尼苏达大学毕业证成绩单
一比一原版(UofM毕业证)明尼苏达大学毕业证成绩单一比一原版(UofM毕业证)明尼苏达大学毕业证成绩单
一比一原版(UofM毕业证)明尼苏达大学毕业证成绩单
 

Sparse Distributed Representations: Our Brain's Data Structure

  • 1. Sparse Distributed Representations: Our Brain’s Data Structure Numenta Workshop October 17, 2014 Subutai Ahmad, VP Research sahmad@numenta.com
  • 2.
  • 3.
  • 4. Sparse Distributed Representations: Our Brain’s Data Structure Numenta Workshop October 17, 2014 Subutai Ahmad, VP Research sahmad@numenta.com
  • 5. The Role of Sparse Distributed Representations in Cortex 1) Sensory perception 2) Planning 3) Motor control 4) Prediction 5) Attention Sparse Distributed Representations (SDRs) are the foundation for all these functions, across all sensory modalities. Analysis of this common cortical data structure can provide a rigorous foundation for cortical computing.
  • 6. Talk Outline 1) Introduction to Sparse Distributed Representations (SDRs) 2) Fundamental properties of SDRs – Error bounds – Scaling laws
  • 7. From: Prof. Hasan, Max-Planck-Institut for Research
  • 8. Basic Attributes of SDRs 1) Only a small number of neurons are firing at any point in time 2) There are a very large number of neurons 3) Every cell represents something and has meaning 4) Information is distributed and no single neuron is critical 5) Every neuron only connects to a subset of other neurons 6) SDRs enable extremely fast computation 7) SDRs are binary x = 0100000000000000000100000000000110000000
  • 9. How Does a Single Neuron Operate on SDRs? Multiple input SDRs converge on a single bit in an output SDR.
  • 10. How Does a Single Neuron Operate on SDRs? Proximal segments represent dozens of separate patterns in a single segment (bottom-up input SDR). Hundreds of distal segments each detect a unique SDR using a threshold (context and feedback SDRs). In both cases each synapse corresponds to one bit in the incoming high-dimensional SDR.
  • 11. Fundamental Properties of SDRs • Extremely high capacity • Recognize patterns in the presence of noise • Robust to random deletions • Represent a dynamic set of patterns in a single fixed structure • Extremely efficient
  • 12. Notation • We represent an SDR as a vector x = [b0, …, b(n−1)] of n binary values, where each bit represents the activity of a single neuron • s = percent of ON bits, w = number of ON bits, so w_x = s × n = ||x||_1 Example • n = 40, s = 0.1, w = 4: x = 0100000000000000000100000000000110000000 y = 1000000000000000000100000000000110000000 • Typical range of numbers in HTM implementations: n = 2048 to 65,536; s = 0.05% to 2%; w = 40
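The notation above can be sketched in a few lines of code (the helper name `random_sdr` is mine, not from the slides): an SDR is a length-n binary vector with exactly w ON bits, giving sparsity s = w / n.

```python
# Minimal SDR sketch: a binary vector of length n with w ON bits (s = w / n).
import random

def random_sdr(n, w, rng=random):
    """Return a random binary SDR of length n with exactly w ON bits."""
    on = set(rng.sample(range(n), w))
    return [1 if i in on else 0 for i in range(n)]

n, w = 40, 4
x = random_sdr(n, w)
s = sum(x) / n            # sparsity: fraction of ON bits
assert sum(x) == w and abs(s - 0.1) < 1e-9
```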
  • 13. SDRs Have Extremely High Capacity • The number of unique patterns that can be represented is: C(n, w) = n! / (w! (n − w)!) • This is far smaller than 2^n, but far larger than any reasonable need • Example: with n = 2048 and w = 40, the number of unique patterns is > 10^84 >> # atoms in the universe • The chance that two random vectors are identical, 1 / C(n, w), is essentially zero
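The capacity claim is easy to verify directly with the standard binomial coefficient:

```python
# Check of the slide-13 capacity example: with n = 2048 and w = 40,
# the number of unique SDRs C(n, w) exceeds 10^84.
import math

n, w = 2048, 40
capacity = math.comb(n, w)        # n! / (w! (n - w)!)
collision_prob = 1 / capacity     # chance two random SDRs are identical

assert capacity > 10**84
assert collision_prob < 10**-84
```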
  • 14. • Extremely high capacity • Recognize patterns in the presence of noise • Robust to random deletions • Represent multiple patterns in a single fixed structure • Extremely efficient Fundamental Properties of SDRs
  • 15. Similarity Metric for Recognition of SDR Patterns • We don’t use typical vector similarities – Neurons cannot compute Euclidean or Hamming distance between SDRs – Any p-norm requires full connectivity • Compute similarity using an overlap metric – The overlap is simply the number of bits in common – Requires only minimal connectivity – Mathematically, take the AND of two vectors and compute its length: overlap(x, y) ≡ ||x ∧ y||_1 • Detecting a “Match” – Two SDR vectors “match” if their overlap meets a minimum threshold θ: match(x, y) ≡ overlap(x, y) ≥ θ
  • 16. Overlap example • n = 40, s = 0.1, w = 4 x = 0100000000000000000100000000000110000000 y = 1000000000000000000100000000000110000000 • The two vectors have an overlap of 3, so they “match” if the threshold is θ = 3.
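The overlap and match operations from slides 15–16 translate directly to code, using the example vectors above:

```python
# Overlap and match on the slide-16 example: bitwise AND, then count ON bits.
x = [int(b) for b in "0100000000000000000100000000000110000000"]
y = [int(b) for b in "1000000000000000000100000000000110000000"]

def overlap(a, b):
    """Number of bits ON in both SDRs (length of a AND b)."""
    return sum(ai & bi for ai, bi in zip(a, b))

def match(a, b, theta):
    """True if the overlap meets the minimum threshold theta."""
    return overlap(a, b) >= theta

assert overlap(x, y) == 3
assert match(x, y, theta=3) and not match(x, y, theta=4)
```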
  • 17. How Accurate is Matching With Noise? • As you decrease the match threshold θ, you decrease sensitivity and increase robustness to noise • You also increase the chance of false positives
  • 18. How Many Vectors Match When You Decrease the Threshold? • Define the “overlap set of x”, Ω_x(n, w, b), to be the set of vectors with exactly b bits of overlap with x • The number of such vectors is: |Ω_x(n, w, b)| = C(w_x, b) × C(n − w_x, w − b), where C(w_x, b) counts the subsets of x with exactly b bits ON and C(n − w_x, w − b) counts the patterns occupying the rest of the vector with exactly w − b bits ON
  • 19. Error Bound for Classification with Noise • Given a single stored pattern, the probability of a false positive is: fp_w^n(θ) = Σ_{b=θ}^{w} |Ω_x(n, w, b)| / C(n, w) • Given M stored patterns, the probability of a false positive is bounded by: fp_X(θ) ≤ Σ_{i=0}^{M−1} fp_{w_xi}^n(θ)
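A small sketch of this bound, assuming (as slide 20 suggests) that tolerating "4 bits of noise" means lowering the threshold to θ = w − 4:

```python
# False-positive bound from slide 19, checked against the slide-20 example.
import math

def overlap_set_size(n, w, b):
    """|Omega_x(n, w, b)|: number of SDRs with exactly b bits of overlap."""
    return math.comb(w, b) * math.comb(n - w, w - b)

def fp_single(n, w, theta):
    """Probability that a random SDR matches one stored pattern at threshold theta."""
    total = sum(overlap_set_size(n, w, b) for b in range(theta, w + 1))
    return total / math.comb(n, w)

# n = 64, w = 12, up to 4 bits of noise (theta = 8), 10 stored patterns:
p = 10 * fp_single(64, 12, theta=8)
assert 0.0003 < p < 0.0005   # roughly the 0.04% error rate quoted on slide 20
```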
  • 20. What Does This Mean in Practice? • With SDRs you can classify a huge number of patterns with substantial noise (if n and w are large enough) Examples • n = 2048, w = 40: With up to 14 bits of noise (33%), you can classify a quadrillion patterns with an error rate of less than 10^−24. With up to 20 bits of noise (50%), you can classify a quadrillion patterns with an error rate of less than 10^−11. • n = 64, w = 12: With up to 4 bits of noise (33%), you can classify 10 patterns with an error rate of 0.04%
  • 21. Neurons Are Highly Robust Pattern Recognizers Hundreds of distal segments each detect a unique SDR using a threshold You can have tens of thousands of neurons examining a single input SDR, and very robustly matching complex patterns
  • 22. Fundamental Properties of SDRs • Extremely high capacity • Recognize patterns in the presence of noise • Robust to random deletions • Represent multiple patterns in a single fixed structure • Extremely efficient
  • 23. SDRs are Robust to Random Deletions • In cortex, bits in an SDR can randomly disappear – Synapses can be quite unreliable – Individual neurons can die – A patch of cortex can be damaged • The analysis for random deletions is very similar to that for noise • SDRs can naturally handle fairly significant random failures – Failures are tolerated in any SDR and in any part of the system • This is a great property for those building HTM-based hardware – The probability of failures can be exactly characterized
  • 24. Fundamental Properties of SDRs • Extremely high capacity • Recognize patterns in the presence of noise • Robust to random deletions • Represent multiple patterns in a single fixed structure • Extremely efficient
  • 25. Representing Multiple Patterns in a Single SDR • There are situations where we want to store multiple patterns within a single SDR and match against them • Example: in temporal inference the system might make multiple predictions about the future
  • 26. Unions of SDRs • We can store a set of patterns in a single fixed representation by taking the OR of all the individual patterns (e.g. 10 patterns at 2% sparsity yield a union with less than 20% of bits ON) • The vector representing the union is also going to match a large number of other patterns that were not one of the original 10 • How many such patterns can we store reliably, without a high chance of false positives?
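The union operation sketched above is just a bitwise OR, with membership tested by checking that a candidate's ON bits all fall inside the union (helper names here are illustrative):

```python
# Union of SDRs (slide 26): OR the patterns together, then test membership.
import random

def random_sdr(n, w, rng):
    on = set(rng.sample(range(n), w))
    return [1 if i in on else 0 for i in range(n)]

rng = random.Random(42)
n, w = 2048, 40
patterns = [random_sdr(n, w, rng) for _ in range(10)]

union = [0] * n
for p in patterns:
    union = [u | b for u, b in zip(union, p)]

def is_member(candidate, union):
    """True if every ON bit of the candidate is also ON in the union."""
    return all(u >= b for u, b in zip(union, candidate))

assert all(is_member(p, union) for p in patterns)
assert sum(union) <= 10 * w   # union has at most 400 of 2048 bits ON (< 20%)
```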
  • 27. Error Bounds for Unions • Expected number of ON bits: • Given a union of M patterns, the expected probability of a false positive (with noise) is:
  • 28. What Does This Mean in Practice? • You can form reliable unions of a reasonable number of patterns (assuming large enough n and w) Examples • n = 2048, w = 40: The union of 50 patterns leads to an error rate of 10^−9 • n = 512, w = 10: The union of 50 patterns leads to an error rate of 0.9%
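These error rates can be roughly reproduced with a standard approximation (an assumption here, since the exact slide-27 formulas are not in the transcript): a union of M patterns has expected ON fraction p̃ = 1 − (1 − w/n)^M, and a random pattern is a false positive when all w of its ON bits land inside the union, with probability ≈ p̃^w.

```python
# Rough sketch of the union false-positive rate, assuming the approximation
# fp ≈ p_tilde ** w with p_tilde = 1 - (1 - w/n) ** M (expected ON fraction).
def union_fp(n, w, M):
    p_tilde = 1 - (1 - w / n) ** M
    return p_tilde ** w

# The slide-28 examples come out close to the quoted rates:
assert union_fp(2048, 40, 50) < 1e-8          # quoted as ~10^-9
assert 0.005 < union_fp(512, 10, 50) < 0.015  # quoted as 0.9%
```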
  • 29. Fundamental Properties of SDRs • Extremely high capacity • Recognize patterns in the presence of noise • Robust to random deletions • Represent multiple patterns in a single fixed structure • Extremely efficient
  • 30. SDRs Enable Highly Efficient Operations • In cortex, complex operations are carried out rapidly – The visual system can perform object recognition in 100-150 msecs • SDR vectors are large, but all operations are O(w) and independent of vector size – No loops or optimization process required • Matching a pattern against a dynamic list (unions) is O(w) and independent of the number of items in the list • Enables a tiny dendritic segment to perform robust pattern recognition • We can simulate 200,000 neurons in software at about 25-50 Hz
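One common way to realize the O(w) claim (an implementation choice, not spelled out on the slide) is to store an SDR as the set of its w ON-bit indices, so overlap becomes a set intersection whose cost scales with w rather than n:

```python
# O(w) overlap: represent an SDR by the set of its ON-bit indices.
def overlap_sparse(a_on, b_on):
    """Overlap of two SDRs given as sets of ON indices."""
    return len(a_on & b_on)

# The slide-16 example as index sets (0-indexed positions of ON bits):
x_on = {1, 19, 31, 32}
y_on = {0, 19, 31, 32}
assert overlap_sparse(x_on, y_on) == 3
```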
  • 31. Summary • SDRs are the common data structure in the cortex • SDRs enable flexible recognition systems that have very high capacity, and are robust to a large amount of noise • The union property allows a fixed representation to encode a dynamically changing set of patterns • The analysis of SDRs provides a principled foundation for characterizing the behavior of the HTM learning algorithms and all cognitive functions Related work • Sparse memory (Kanerva), Sparse coding (Olshausen), Bloom filters (Broder)
  • 32. Questions? Math jokes? Follow us on Twitter @numenta Sign up for our newsletter at www.numenta.com Subutai Ahmad sahmad@numenta.com nupic-theory mailing list numenta.org/lists