Hierarchical
Temporal Memory
Numenta (2005)
Brain-Like Learning!
Cognitive Computing
Human Brain
• Neocortex
• 2D Sheet of repeating Modular
Assemblies of neurons with binary
inputs
• Not a processor but a Memory
System
• Stores pattern sequences
• Only takes SDR format data
• HTM by Numenta (2005) – Jeff Hawkins
• Semantic Folding – Francisco D.S.
Webber
SDRs (Sparse Distributed Representations)
• Compare – Semantic Similarity
• Store – Efficient (indices of 1s)
• Union – (OR) Maintains Member Info
• Overlap – AND Operation
• Very long binary vectors with at
most ~2% of bits active at any time t
• Every bit has a meaning
• Subsampling: a small subset of the
active bits still identifies the full
pattern (see the sketch below)
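
A minimal sketch of these operations, assuming SDRs are stored as sets of active-bit indices (the "indices of 1s" storage above). The vector size, sparsity, and subsample size are illustrative values, not Numenta's exact parameters:

```python
import random

N = 2048   # total bits (the slides describe very long binary vectors)
W = 40     # active bits, i.e. ~2% sparsity

def random_sdr():
    """A random SDR, stored as the set of indices of its 1-bits."""
    return set(random.sample(range(N), W))

def overlap(a, b):
    """Overlap (AND): count of shared active bits; the semantic-similarity score."""
    return len(a & b)

def union(sdrs):
    """Union (OR): a single SDR that keeps membership information from all inputs."""
    out = set()
    for s in sdrs:
        out |= s
    return out

def subsample(a, k=10):
    """Subsampling: a handful of active bits still identifies the pattern."""
    return set(random.sample(sorted(a), k))

a, b = random_sdr(), random_sdr()
print(overlap(a, a))   # 40: identical patterns overlap fully
print(overlap(a, b))   # ~0: unrelated random patterns barely overlap
print(overlap(union([a, b]), subsample(a)))   # 10: a member's subsample still matches the union
```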
Differences between Word2Vec & SDRs

  Word2Vec                          Sparse Distributed Representations (SDRs)
  --------------------------------  ------------------------------------------
  Dense                             Sparse
  Cannot capture ambiguity          Captures all possible meanings
  300-800 features                  16,384 features (currently used)
  Union of vectors does not exist   Union retains information from members
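
A toy check of the union row, using the same set-of-indices representation as the previous sketch: OR-ing sparse binary vectors keeps every member fully recoverable, which has no analogue when dense word2vec-style vectors are averaged together. Sizes are illustrative:

```python
import random

N, W = 2048, 40
members = [set(random.sample(range(N), W)) for _ in range(5)]

u = set().union(*members)          # OR of all member SDRs
for m in members:
    assert m <= u                  # every member is fully contained in the union
    print(len(m & u) / len(m))     # 1.0: a member overlaps the union perfectly

other = set(random.sample(range(N), W))
print(len(other & u) / len(other)) # low (~0.1): non-members overlap only by chance
```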
Semantic Fingerprint Generation
Word & Text Fingerprints
Semantic Folding
Spatial Pooling
Hebbian Learning
• When two connected neurons fire
simultaneously, the connection
between them strengthens (Hebb,
1949)
• Basic difference: most learning
schemes penalize loss/error and
update weights, whereas Hebbian
learning rewards a correct
response/output (also known as
reward learning)
• Spatial pooling creates some weak
columns; the strength these columns
contribute to the encoded sequence
can be multiplied by a boosting
factor (see the sketch below)
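
A minimal sketch of a Hebbian-style update with column boosting, in the spirit of the bullets above: winning columns that coincide with active input bits are rewarded (their synapse permanences grow), and columns that rarely win have their overlap scores multiplied by a growing boosting factor. All parameter values (connection threshold, increments, number of winners, boost rate) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_cols = 128, 32
perm = rng.uniform(0.0, 1.0, (n_cols, n_inputs))   # synapse permanences per column
boost = np.ones(n_cols)                            # boosting factor per column
CONNECTED, INC, DEC, K = 0.5, 0.05, 0.02, 4        # illustrative parameters

def spatial_pool(x):
    """One spatial-pooling step with Hebbian learning for a binary input x."""
    connected = perm >= CONNECTED                   # which synapses are connected
    scores = (connected * x).sum(axis=1) * boost    # boosted overlap per column
    winners = np.argsort(scores)[-K:]               # the K most active columns

    # Hebbian reward: a winner's synapses on active input bits strengthen,
    # while its synapses on inactive bits weaken.
    for c in winners:
        perm[c] += np.where(x == 1, INC, -DEC)
    np.clip(perm, 0.0, 1.0, out=perm)

    # Boosting: columns slowly gain boost over time, and a win resets it,
    # so rarely-winning (weak) columns get an amplified score next round.
    boost[:] = boost * 1.01
    boost[winners] = 1.0
    return winners

x = (rng.random(n_inputs) < 0.1).astype(int)        # a sparse binary input
print(spatial_pool(x))
```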
Hierarchical + Temporal Memory
• The segments/layers in context are
called "coincidence detectors",
i.e. "I guess I see something where
I think you are going to have a
part to play"
• Bursting occurs when a "new"
(unpredicted) sequence appears
(see the sketch below)
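
A toy sketch of the bursting rule, assuming a simplified column where `predicted_cells` holds the cells put into a predictive state by the previous time step's context. The names and column size are illustrative:

```python
CELLS_PER_COLUMN = 8

def activate_column(predicted_cells):
    """Return (cells that fire, whether the column bursts) for one active column.

    If some cells were predicted, only those fire: the sequence was anticipated.
    If none were predicted, every cell fires at once (a burst), which is the
    signal that a "new" sequence has started.
    """
    if predicted_cells:
        return sorted(predicted_cells), False
    return list(range(CELLS_PER_COLUMN)), True

print(activate_column({3}))     # ([3], False): the context predicted cell 3
print(activate_column(set()))   # ([0, 1, ..., 7], True): unexpected input, bursting
```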
Cortical.io
• Contract Intelligence
  • Automated data extraction from contracts
  • Learning driven by subject matter experts (SMEs)
• Support Intelligence
  • AI-based assistance for customers
  • Natural language understanding
  • Continuous learning
  • Easy updating with new vocabulary
Thank You


Editor's Notes

  • #3 1. The limbic system supports most of the emotion-linked functions, including behaviour, motivation, and emotional state. 2. The reptilian complex handles survival instincts like eating, sleeping, etc. 3. The neocortex is the part of the brain that gives us the power to reason, along with other higher-order functions like perception, cognition, spatial reasoning, language, and the generation of motor commands.
  • #4 Union is very important in sequence learning because you have to remember everything that happened in the past; thus the union carries features from the entire history.
  • #10 1. A neuron, once active, sends signals along all possible segments, and the neurons activated next get strengthened. 2. Once we have trained our brain enough, we know precisely what to do next, since the flow of signals from one neuron has been trained. 3. Therefore, the segment with the strongest union gets fired.
  • #11 Every context is a distal connection to a neuron, and if the union of any one of its segments reaches a particular threshold, that neuron fires. Every pyramidal neuron is in a constant state of prediction, trying to figure out when it has to fire; it assesses hundreds of unique patterns and contexts. Each active column represents the number of time steps it can remember, and it will have a winner neuron, bursting or not. How does the column decide after bursting? It computes the winner neuron, which is determined by the previous segments. If the pattern is totally new, the neuron with the fewest connections is chosen to initiate new learning (see the sketch below).
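
A sketch of the winner-neuron rule this note describes: a segment matches when its overlap with the previously active cells reaches a threshold, the cell with the strongest matching segment wins, and for a totally new pattern the cell with the fewest existing connections starts the new learning. The data layout and threshold are illustrative assumptions, not NuPIC's exact implementation:

```python
THRESHOLD = 3   # illustrative segment-activation threshold

def winner_cell(column_segments, prev_active):
    """Pick the winner cell of a bursting column.

    column_segments: one list of segments per cell; each segment is the set
    of presynaptic cell indices it connects to (a "context").
    prev_active: set of cells that were active at the previous time step.
    """
    best_cell, best_score = None, -1
    for cell, segments in enumerate(column_segments):
        for seg in segments:
            score = len(seg & prev_active)           # overlap with prior context
            if score >= THRESHOLD and score > best_score:
                best_cell, best_score = cell, score
    if best_cell is not None:
        return best_cell                             # strongest matching segment wins
    # Totally new pattern: the least-connected cell initiates new learning.
    return min(range(len(column_segments)),
               key=lambda c: sum(len(s) for s in column_segments[c]))

segs = [[{1, 2, 3, 4}], [{7, 8}], []]                # three cells, toy segments
print(winner_cell(segs, prev_active={1, 2, 3}))      # 0: overlap 3 reaches the threshold
print(winner_cell(segs, prev_active={9}))            # 2: nothing matches; fewest connections
```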
  • #12 Further ambitions of Numenta: layer 1 in the picture is like getting simple neural-network functionality with HTM (obviously with the added benefits of HTM); layer 2 is like getting convolutional-neural-network functionality with HTM; layer 3 is like getting reinforcement-learning functionality with HTM; layer 4 is like getting multiple CNNs to work with reinforcement learning and HTM.