Mental and neural representations of the
past and the future
Marc W. Howard
Center for Memory & Brain
Department of Psychological and Brain Sciences
Department of Physics
Boston University
March 25, 2019
Zoran Tiganj, Ian Bright, Nathan Cruzado, Yue Liu
Inder Singh, Karthik Shankar
Collaborators: Howard Eichenbaum, Michael Hasselmo, Earl
Miller (MIT), Miriam Meister and Beth Buffalo (Washington)
Current funding:
NIH: R01MH112169, R01EB022864, R01MH095297
NSF: IIS-1631460
ONR: N00014-16-1-2832 (Hasselmo, PI)
Private Sector: Google FRA, Facebook Reality Labs
http://sites.bu.edu/tcn/
Overview
• Compressed timeline of
the past in the brain
• Dual space for the past
(Laplace).
• Neural evidence for this
dual space.
• The future.
Compressed temporal memory
• Events are represented on a temporal axis.
• Resolution of this axis decreases as
events recede into the past.
• This compression should be logarithmic (a small sketch follows below).
William James, Karl Lashley, Robert Crowder, Gordon Brown,
Randy Gallistel, Nick Chater, John Anderson . . .
Cognitive models for judging the past
Data, Singh & Howard (2017, bioRxiv)
[Data figure: response time (s) as a function of recency, for two experiments.]
Model, Zoran’s poster
[Model figure: predicted response time as a function of recency for the same two experiments.]
Go to Zoran’s poster (E42)!
What would a timeline look like in the brain?
[Figure: firing rate as a function of time in the delay for three example cells.]
MacDonald, et al., 2011
• “Time cells” are compressed.
• Different stimuli trigger different
sequences.
• Hippocampus (CA1, CA3, DG), lPFC,
mPFC, striatum . . .
What would a timeline look like in the brain?
Bolkan, et al., 2017
• “Time cells” are compressed.
• Different stimuli trigger different
sequences.
• Hippocampus (CA1, CA3, DG), lPFC,
mPFC, striatum . . .
Compressed record of what happened when
Tiganj, Cromer, Roy, Miller, & Howard (2018, J Cog Neuro)
Monkey WM task; remember different stimuli during delay
[Figure: heat maps of cell # versus time in the delay (0-1.5 s) for Dog, Cat, and Car/Truck stimuli.]
Cognitive models of many “kinds” of memory
Howard, et al., (2015, Psych Rev)
• Quantitative description of the activity of many neurons.
• Write out solvable cognitive models of behavior.
• Modeled working memory, episodic memory, conditioning.
• Representation observed in PFC, hippocampus, striatum.
Make your own cognitive models!
https://github.com/zorant/WM_demo
A dual space using the Laplace transform
$F(s) = \int_0^{\infty} e^{-st}\, f(t)\, dt$
Invertible: $F(s) \Leftrightarrow f(t)$
Real-time: $\frac{dF(s)}{dt} = -sF(s) + f(t)$
Timelines in the Laplace domain and out
Shankar & Howard (2012, 2013)
$F(s) = \int_{-\infty}^{t} e^{-s(t-t')}\, f(t')\, dt'$
• The cells $F(s)$, with a spectrum of $s$ values, give the Laplace transform of $f(t - \tau)$.
• A set of weights $\mathbf{L}^{-1}_k$ approximates the inverse transform (Post, 1930).
• The cells $\tilde{f}(\overset{*}{\tau})$, with different values of $\overset{*}{\tau}$, approximate the function itself.
Timelines in the Laplace domain and out
Shankar & Howard (2012, 2013)
$\frac{dF(s)}{dt} = -sF(s) + f(t)$
• The cells $F(s)$, with a spectrum of $s$ values, give the Laplace transform of $f(t - \tau)$.
• A set of weights $\mathbf{L}^{-1}_k$ approximates the inverse transform (Post, 1930); a numerical sketch follows below.
• The cells $\tilde{f}(\overset{*}{\tau})$, with different values of $\overset{*}{\tau}$, approximate the function itself.
Inverse is just on-center/off-surround receptive fields
Liu, Tiganj, Hasselmo, & Howard (2019)
Inverse operator $\mathbf{L}^{-1}_k$
• Requires derivatives with respect to $s$
• A gradient of s . . .
• . . . implies on-center/off-surround receptive fields (a small stencil sketch follows below).
Inverse introduces error, but it’s scale-invariant.
Laplace transform of time in monkey EC
Bright, Meister, Cruzado, Tiganj, Howard, & Buffalo (submitted?)
See also Tsao, et al., (2018; Nature)
The future
Greetings, my friend. We are all
interested in the future, for that is
where you and I are going to spend
the rest of our lives. And
remember, my friend, future events
such as these will affect you in the
future.
Criswell
Beginning of the future
• Behavioral evidence for a
timeline of the future.
• Computational models for
scale-invariant future.
• Modeling is out front of the
data.
Human judgments of future events
Singh & Howard (2017, bioRxiv)
Judgment of recency (JOR)
• Rapid presentation of list.
• Choose which probe was
closer to the present.
• Scanning in STM
Judgment of imminence (JOI)
• Statistical learning
• Choose which probe was
closer to the present.
• ???
Remembering the past, predicting the future
Singh & Howard (2017, bioRxiv)
[Data figures: response time (s) as a function of recency (two experiments, judgments of recency) and as a function of imminence (two experiments, judgments of imminence).]
Past and future both appear to be compressed timelines.
Constructing the future
Tiganj, Gershman, Sederberg & Howard (2019)
• Timeline: what and when.
• Flexible responses.
• Average over possible
futures.
[Model figure: prediction as a function of future time for an event sequence (D, R, 2R), shown at two timeline scales (5 and 20).]
See also Momennejad & Howard (2018, bioRxiv);
Shankar, Singh & Howard (2016, Neural Comp).
Take-home
• The brain computes a compressed timeline of the past.
• Computationally, we can make sense of this using a
Laplace domain dual space.
• There’s new evidence the brain computes the Laplace
transform of time.
• I think we construct a logarithmically-compressed map of
the future.
Zoran Tiganj, Nathan Cruzado, Yue Liu, Andre Luzardo
Inder Singh, Karthik Shankar
Collaborators: Howard Eichenbaum, Michael Hasselmo, Earl
Miller (MIT), Miriam Meister and Beth Buffalo (Washington)
Current funding:
NIH: R01MH112169, R01EB022864, R01MH095297
NSF: IIS-1631460
ONR: N00014-16-1-2832 (Hasselmo, PI)
Private Sector: Google FRA, Facebook Reality Labs
http://sites.bu.edu/tcn/
