8. Advances in automating analysis of neural time series 2
Where does the data come from?
Electroencephalography (EEG) and Magnetoencephalography (MEG)
[Figure annotations: spontaneous brain activity; Earth's magnetic field; traffic, electrical disturbance; sensor noise]
19. Advances in automating analysis of neural time series 3
We will analyze multivariate time series
[Figure: multichannel recording over time; a trigger channel marks trials; averaging across trials yields the evoked response]
N170: faces, P300: surprise, N400: words, etc.
21. Manual analysis cannot be scaled and is not reproducible 4
“Quite often in the course of a project parameters are modified, list of
subjects are changed, and processing steps need to be rerun …
automating instead of manual interventions can really pay off”
—Russell Poldrack1
1A practical guide for improving transparency and reproducibility in neuroimaging research,
http://biorxiv.org/content/biorxiv/early/2016/02/12/039354.full.pdf
25. Advances in automating analysis of neural time series 5
Contributions
Reproducibility
• A tutorial paper for group analysis of M/EEG data
(submitted to Frontiers in Neuroscience)
• Brain Imaging Data Structure for MEG
(submitted to Nature Scientific Data)
Automation
• Autoreject to remove artifacts (NeuroImage, 2017)
• AlphaCSC to learn brain waveforms (NIPS, 2017)
28. Advances in automating analysis of neural time series 6
Contribution I: Brain Imaging Data Structure (BIDS) validator
- Automatic converter: MNE-BIDS
- BIDS-compatible dataset: ds000248
35. Advances in automating analysis of neural time series 7
Contribution II: Tutorial paper on group analysis
• Diagnostic plots
• Alternatives
• Statistics
Code: http://mne-tools.github.io/mne-biomag-group-demo/
[Jas, Larson, Engemann, Taulu, Hamalainen, Gramfort. 2017]
37. Advances in automating analysis of neural time series 9
Contribution III: Automatic rejection of artifacts
[Figure: trials × sensors data matrix with a bad trial highlighted]
[Jas, Engemann, Raimondo, Bekhti, Gramfort. NeuroImage. 2017]
38. Related work 10
• FASTER [Nolan, 2010]
• Riemannian Potato [Barachant, 2013]
• PREP (RANSAC) [Bigdely-Shamlo, 2015]
• Robust regression [Diedrichsen, 2005]
• Sensor Noise Suppression [Cheveigné, 2008]
45. Advances in automating analysis of neural time series 11
Rejecting based on peak-to-peak amplitudes
Peak-to-peak amplitude A
A < threshold τ ?
Yes: good data. No: bad data.
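The rejection rule on this slide can be sketched in a few lines of NumPy (a toy illustration, not the MNE or autoreject implementation; shapes and values are made up):

```python
import numpy as np

def reject_ptp(epochs, tau):
    """Keep trials whose peak-to-peak amplitude stays below tau.

    epochs : array, shape (n_trials, n_sensors, n_times)
    tau    : threshold in the data's units (e.g. volts for EEG)
    """
    # Peak-to-peak amplitude A per trial and sensor: max - min over time
    A = epochs.max(axis=2) - epochs.min(axis=2)
    # A trial is good only if A < tau on every sensor ("global" rule)
    good = (A < tau).all(axis=1)
    return epochs[good], good

# Toy data: 5 trials, 2 sensors; a spike artifact lands in trial 3
rng = np.random.default_rng(0)
epochs = rng.normal(scale=1e-6, size=(5, 2, 100))
epochs[3, 0, 50] += 50e-6
clean, good = reject_ptp(epochs, tau=20e-6)
```

The catch, as the next slides show, is that the whole procedure hinges on choosing τ well.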
50. Observation: The optimal threshold retains sufficient trials while rejecting outliers. 12
[Figure: evoked response at three candidate thresholds (too strict: too few trials; too loose: outliers not removed; optimal in between)]
58. How do we measure data quality? 13
Many trials for a single sensor are split into a training set and a validation set (the training set may contain an artifact)
RMSE = ‖ X̄_train(τ) − X̄_val ‖_Fro
where X̄ denotes the average across trials
(lower error = cleaner training set)
63. But what if my validation data also has artifacts? 14
Answer: use the median instead of the mean on the validation set
RMSE = ‖ X̄_train(τ) − X̃_val ‖_Fro
where X̃_val is the pointwise median across validation trials
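The quality metric from the last two slides can be sketched as follows (a minimal single-sensor NumPy version; the variable names and toy data are illustrative):

```python
import numpy as np

def rmse(X_train, X_val, tau, robust=True):
    """Compare the thresholded training mean to a validation summary.

    X_train, X_val : arrays of shape (n_trials, n_times), one sensor.
    With robust=True the validation trials are summarized by their
    pointwise median, which shrugs off artifacts in the validation set.
    """
    ptp = X_train.max(axis=1) - X_train.min(axis=1)
    kept = X_train[ptp < tau]
    if len(kept) == 0:
        return np.inf                       # tau rejected every trial
    X_bar_train = kept.mean(axis=0)
    X_val_summary = np.median(X_val, axis=0) if robust else X_val.mean(axis=0)
    return np.linalg.norm(X_bar_train - X_val_summary)

# Toy data: a sine "evoked response" plus noise; one validation trial
# carries a huge artifact that corrupts the mean but not the median.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
signal = np.sin(2 * np.pi * 3 * t)
X_train = signal + 0.1 * rng.normal(size=(20, 50))
X_val = signal + 0.1 * rng.normal(size=(10, 50))
X_val[0] += 30.0
```

With this setup the robust score stays near the noise floor while the mean-based score is dominated by the contaminated validation trial.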
69. Autoreject (global) vs. human threshold 15
Remove trial if data in any sensor > τ
[Figure: RMSE (μV) as a function of threshold (μV), averaged with 5-fold cross-validation; the autoreject (global) minimum and the manual threshold are marked]
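Putting the pieces together, the cross-validated threshold search can be sketched like this (a simplified single-sensor version of the "global" idea; the actual autoreject package differs in its details):

```python
import numpy as np

def autoreject_global(X, candidate_taus, n_folds=5, seed=0):
    """Pick the peak-to-peak threshold tau with lowest CV error.

    X : array, shape (n_trials, n_times) for one sensor.
    For each fold, trials passing tau are averaged and compared to the
    pointwise median of the held-out validation trials.
    """
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(X)), n_folds)
    ptp = X.max(axis=1) - X.min(axis=1)
    mean_errors = []
    for tau in candidate_taus:
        errs = []
        for k in range(n_folds):
            val = folds[k]
            train = np.concatenate([f for j, f in enumerate(folds) if j != k])
            kept = train[ptp[train] < tau]
            if len(kept) == 0:          # tau too strict: rejects all trials
                errs.append(np.inf)
                continue
            errs.append(np.linalg.norm(
                X[kept].mean(axis=0) - np.median(X[val], axis=0)))
        mean_errors.append(np.mean(errs))
    return candidate_taus[int(np.argmin(mean_errors))]

# Toy data: 100 clean trials plus 10 trials with large spike artifacts
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 60)
X = np.sin(2 * np.pi * 2 * t) + 0.3 * rng.normal(size=(110, 60))
X[100:, 30] += 200.0                    # artifact trials
best_tau = autoreject_global(X, candidate_taus=[1.0, 10.0, 1000.0])
```

A too-strict threshold rejects everything (infinite error), a too-loose one lets the spikes corrupt the average, and the middle candidate wins.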
100. Schematic of the complexity of the problem 19
[Schematic: trials × sensors grid marking bad trials, globally bad sensors, locally bad sensors, and local artifacts]
Naïve solution: drop bad trials and sensors
Proposed solution:
if #bad-sensors > κ (=4): drop trial
else: interpolate the ρ (=2) worst bad sensors per trial
Interpolation: EEG uses spherical splines [Perrin et al., 1989]; MEG uses MNE [Hamalainen et al., 1994]
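The per-trial decision rule above can be sketched as follows (an illustrative simplification: detecting bad sensors and performing the actual interpolation are assumed to happen elsewhere):

```python
import numpy as np

def repair_or_drop(bad_mask, ptp, kappa=4, rho=2):
    """Decide, per trial, whether to drop it or repair a few sensors.

    bad_mask : bool array (n_trials, n_sensors), sensor flagged bad per trial
    ptp      : float array (n_trials, n_sensors), peak-to-peak amplitudes
               used to rank which bad sensors are "worst"
    Returns (drop, fix): drop[i] is True when trial i has more than kappa
    bad sensors; fix[i] lists up to rho worst bad sensors to interpolate.
    """
    drop = bad_mask.sum(axis=1) > kappa
    fix = []
    for i in range(len(bad_mask)):
        if drop[i]:
            fix.append([])                 # trial is dropped, nothing to fix
            continue
        bad = np.flatnonzero(bad_mask[i])
        # interpolate only the rho worst offenders, ranked by amplitude
        worst_first = bad[np.argsort(ptp[i, bad])[::-1]]
        fix.append(worst_first[:rho].tolist())
    return drop, fix

bad_mask = np.array([[1, 1, 1, 1, 1, 0],    # 5 bad sensors -> drop
                     [1, 0, 1, 0, 1, 0],    # 3 bad -> fix 2 worst
                     [0, 0, 0, 1, 0, 0]],   # 1 bad -> fix it
                    dtype=bool)
ptp = np.arange(18, dtype=float).reshape(3, 6)
drop, fix = repair_or_drop(bad_mask, ptp)
```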
101. Autoreject in action: MNE sample data (MEG) 20
http://autoreject.github.io/auto_examples/plot_auto_repair.html
102. Autoreject in action (EEG data) 21
Before autoreject vs. after autoreject
19-subject Faces dataset [Wakeman and Henson, 2015]
107. Quantitative evaluation 22
[Figure: scatter plot, one point per subject, comparing ‖ X̄_method − X̄_human ‖ for autoreject against each competing method; points on one side of the diagonal mean autoreject is better, on the other that the competing method is better]
Competing methods: [Nolan, 2010] [Bigdely-Shamlo, 2015] [Cheveigné, 2008]
112. Transparency: an example diagnostic plot 23
[Figure annotations: sensors to be interpolated; bad sensors that will not be interpolated; bad trials]
http://autoreject.github.io/auto_examples/plot_visualize_bad_epochs.html
118. Advances in automating analysis of neural time series 25
Contribution IV: Learn representations from neural data
#1 The shape of brain rhythms matters: e.g. the asymmetric μ rhythm [Cole and Voytek, 2017]
#2 Filtering
[Jas, Dupré la Tour, Şimşekli, Gramfort. NIPS. 2017]
122. Some attempts in the neuroscience community 26
• Adaptive Waveform Learning [Hitziger, 2017]: template waveform, coefficient and waveform updates, different durations
• Sliding Window Method [Gips et al., 2017]
• MoTIF [Jost et al., 2006]
• Learning recurrent waveforms in EEG [Brockmeier & Principe, 2016]
• Multivariate temporal dictionary learning [Barthélemy et al., 2013]
130. The problem setup (simulation) 27
Two different atoms (can have more), with different amplitudes and different locations; atoms can even overlap; small Gaussian noise
139. Convolutional Sparse Coding (CSC) formulation 28
Each signal is a sum of atoms convolved with sparse activations: x_n = Σ_k d_k * z_n^k

min_{d, z} Σ_n ½ ‖ x_n − Σ_k d_k * z_n^k ‖₂² + λ Σ_n Σ_k ‖ z_n^k ‖₁
s.t. z_n^k ≥ 0 and ‖ d_k ‖₂² ≤ 1

The ℓ₁ penalty enforces sparsity of the activations.
[Kavukcuoglu et al., 2010] [Zeiler et al., 2010] [Gross et al., 2012] [Heide et al., 2015]
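The objective can be written down directly in NumPy (a minimal sketch of the cost function only, not of any solver; the shapes are illustrative):

```python
import numpy as np

def csc_cost(X, D, Z, reg):
    """Convolutional sparse coding cost.

    X : (n_trials, n_times)                 signals x_n
    D : (n_atoms, n_times_atom)             atoms d_k, ||d_k||_2 <= 1
    Z : (n_trials, n_atoms, n_times - n_times_atom + 1)
        nonnegative activations z_n^k
    """
    cost = 0.0
    for n in range(len(X)):
        # reconstruction sum_k d_k * z_n^k (full convolution, length n_times)
        rec = sum(np.convolve(Z[n, k], D[k]) for k in range(len(D)))
        cost += 0.5 * np.sum((X[n] - rec) ** 2)
    return cost + reg * np.abs(Z).sum()     # l1 penalty -> sparse z

# A signal built exactly from one atom leaves only the penalty term
D = np.array([[0.8, 0.6]])                  # unit-norm atom
Z = np.zeros((1, 1, 4)); Z[0, 0, 1] = 2.0
X = np.convolve(Z[0, 0], D[0])[None, :]
```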
147. Basic strategy: alternating minimization 29
min_{d, z} Σ_n ½ ‖ x_n − Σ_k d_k * z_n^k ‖₂² + λ Σ_n Σ_k ‖ z_n^k ‖₁
• Not jointly convex in both d and z
• Convex when d or z is fixed
• Alternating between the z-step and the d-step guarantees the cost function goes down at every step
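The alternation can be sketched on a tiny toy problem (finite-difference gradients via L-BFGS-B, so keep it small; for brevity this sketch omits the ‖d_k‖ ≤ 1 constraint of the full method):

```python
import numpy as np
from scipy.optimize import minimize

def alternating_min(X, n_atoms, n_times_atom, reg=0.05, n_iter=2, seed=0):
    """Alternating minimization for CSC (toy sketch, not the real solver)."""
    rng = np.random.default_rng(seed)
    n_trials, n_times = X.shape
    D = rng.normal(size=(n_atoms, n_times_atom))
    Z = np.zeros((n_trials, n_atoms, n_times - n_times_atom + 1))

    def cost(D, Z):
        c = reg * np.abs(Z).sum()
        for n in range(n_trials):
            rec = sum(np.convolve(Z[n, k], D[k]) for k in range(n_atoms))
            c += 0.5 * np.sum((X[n] - rec) ** 2)
        return c

    history = [cost(D, Z)]
    for _ in range(n_iter):
        # z-step: convex for fixed d; bounds keep activations nonnegative
        res = minimize(lambda z: cost(D, z.reshape(Z.shape)), Z.ravel(),
                       method="L-BFGS-B", bounds=[(0, None)] * Z.size)
        Z = res.x.reshape(Z.shape)
        # d-step: convex (least squares) for fixed z
        res = minimize(lambda d: cost(d.reshape(D.shape), Z), D.ravel(),
                       method="L-BFGS-B")
        D = res.x.reshape(D.shape)
        history.append(cost(D, Z))
    return D, Z, history

rng = np.random.default_rng(1)
d_true = np.array([1.0, -1.0, 0.5])
z_true = np.zeros(8); z_true[2] = 1.5
X = (np.convolve(z_true, d_true) + 0.01 * rng.normal(size=10))[None, :]
D, Z, history = alternating_min(X, n_atoms=1, n_times_atom=3)
```

Each step starts from the current iterate and cannot increase the cost, which is exactly the monotonicity argument on the slide.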
152. z-step: The Toeplitz matrix trick 30
Convolution with a fixed atom is linear: d_k * z_n^k = D_k z_n^k, where D_k is the Toeplitz matrix built from d_k
min_z Σ_n ½ ‖ x_n − Σ_k D_k z_n^k ‖₂² + λ Σ_n Σ_k ‖ z_n^k ‖₁
162. z-step: The Toeplitz matrix trick 31
Concatenate the blocks, D = [D_1 … D_K], and stack the activations, z_n = [z_n^1; …; z_n^K], so that Σ_k D_k z_n^k = D z_n
min_{z_n} ½ ‖ x_n − D z_n ‖₂² + λ ‖ z_n ‖₁
The gradient of the smooth part is Dᵀ (D z_n − x_n)
Now, we can feed the gradient to the L-BFGS-B optimization algorithm
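The trick is easy to verify numerically (a small sketch; `toeplitz_from_atom` is an illustrative helper, not a library function):

```python
import numpy as np

def toeplitz_from_atom(d, n_z):
    """Build D_k such that D_k @ z equals the full convolution d * z.

    d : atom of length L; n_z : length of the activation vector z.
    Column j of the (n_z + L - 1, n_z) result is d shifted down by j.
    """
    L = len(d)
    Dk = np.zeros((n_z + L - 1, n_z))
    for j in range(n_z):
        Dk[j:j + L, j] = d
    return Dk

# Convolution with fixed atoms becomes one matrix-vector product:
d1, d2 = np.array([1.0, -0.5, 0.25]), np.array([0.5, 0.5, 0.0])
z1, z2 = np.array([0.0, 2.0, 0.0, 1.0]), np.array([1.0, 0.0, 0.0, 0.0])
D = np.hstack([toeplitz_from_atom(d1, 4), toeplitz_from_atom(d2, 4)])
z = np.concatenate([z1, z2])
# D @ z == d1 * z1 + d2 * z2, so the z-step is a standard lasso-type problem
```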
171. d-step: Strided matrices 32
Convolution with a fixed activation is also linear: d_k * z_n^k = Z_n^k d_k, where Z_n^k is a strided matrix whose rows are shifted copies of z_n^k (e.g. z_n^k = [0 0 0 1 0 .5 0])
min_d Σ_n ½ ‖ x_n − Σ_k Z_n^k d_k ‖₂²  s.t. ‖ d_k ‖₂² ≤ 1
We remove the regularization term as it doesn't depend on d
Least squares problem under unit norm constraint
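The same symmetry can be checked numerically in the other direction (the values echo the slide's toy activation; `strided_from_activation` is an illustrative helper):

```python
import numpy as np

def strided_from_activation(z, L):
    """Build Z_n^k so that Z_n^k @ d equals the full convolution z * d.

    Row t holds the activation samples z[t - tau] that multiply d[tau].
    """
    n_out = len(z) + L - 1
    Z = np.zeros((n_out, L))
    for t_out in range(n_out):
        for tau in range(L):
            if 0 <= t_out - tau < len(z):
                Z[t_out, tau] = z[t_out - tau]
    return Z

z = np.array([0.0, 0.0, 0.0, 1.0, 0.0, 0.5, 0.0])  # slide's toy activation
d = np.array([0.0, 2.0, 4.0])
Z = strided_from_activation(z, len(d))
# z * d == Z @ d: for fixed z the d-step is least squares under ||d|| <= 1
```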
176. d-step: The atoms are updated using L-BFGS-B 33
min_d Σ_n ½ ‖ x_n − Σ_k Z_n^k d_k ‖₂²
Projected gradient descent:
1. Gradient step, using the gradient Σ_n Z_n^kᵀ ( x_n − Σ_k Z_n^k d_k )
2. Projection step onto the unit ball: d̂_k ← d̂_k / max(1, ‖ d̂_k ‖₂)
In practice, it is more complicated … L-BFGS-B in the dual
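The two steps translate directly to code (a toy projected-gradient sketch for a single atom; as the slide notes, the actual solver uses L-BFGS-B in the dual instead):

```python
import numpy as np

def d_step_pgd(X, Zs, d0, lr=0.05, n_iter=300):
    """min_d 0.5 * sum_n ||x_n - Z_n d||^2  s.t.  ||d||_2 <= 1.

    X  : (n_trials, n_out) signals
    Zs : (n_trials, n_out, L) strided activation matrices
    """
    d = d0.copy()
    for _ in range(n_iter):
        # 1. Gradient step
        grad = sum(-Zs[n].T @ (X[n] - Zs[n] @ d) for n in range(len(X)))
        d = d - lr * grad
        # 2. Projection onto the unit l2 ball
        d = d / max(1.0, np.linalg.norm(d))
    return d

# Toy check: with Z = identity the unconstrained optimum is x itself;
# the constraint projects the solution back onto the unit sphere.
X = np.array([[2.0, 0.0, 0.0]])
Zs = np.eye(3)[None]
d = d_step_pgd(X, Zs, d0=np.zeros(3))
```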
183. Putting it all together 34
Alternating minimization:
• z update: Toeplitz D (L-BFGS-B)
• d update: strided Z (L-BFGS-B)
190. The quasi-Newton solvers outperform ADMM solvers 35
K = 10 atoms, L = 32 samples per atom
[Figure: convergence curves with timing annotations at 30 minutes, 45 minutes, 1 hour, and 2 hours]
193. The quasi-Newton solver outperforms ADMM solvers 36
[Figure: convergence across different random seeds; the ADMM solvers are highlighted]
194. Cross-frequency coupling uncovered via CSC 37
[Figure annotation: ~80 Hz]
[Canolty, 2006] [Dupré la Tour, 2017]
195. Challenge 38
Neural data often contains transient artifacts
198. CSC in the presence of transient artifacts 39
[Figure: neural signals with transient artifacts]
206. A probabilistic interpretation
)(~, k
tnz
1,*~,|
1
,
K
k
k
n
k
tn zdNdzx
tn k
k
tntn
zd
zpzdxpzd
,
,,
,
)(log),|(logmaxarg*)*,(
Maximum a posteriori estimate
PriorData likelihood
40Advances in automating analysis of neural time series
n kn
k
n
k
k
n
k
n
zd
zzdx
,
2
2
,
*min
Sparser
activations
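The equivalence between the MAP estimate and the penalized objective can be made concrete in code. Below is a minimal numpy sketch of evaluating the ℓ1-regularized CSC objective; the variable names (`csc_objective`, `X`, `D`, `Z`, `reg`) are illustrative and not tied to any particular CSC library:

```python
import numpy as np

def csc_objective(X, D, Z, reg):
    """l1-regularized CSC objective:
    sum_n ||x_n - sum_k d_k * z_n^k||_2^2 + reg * sum_{n,k} ||z_n^k||_1.

    X: (n_trials, n_times), D: (n_atoms, n_times_atom),
    Z: (n_trials, n_atoms, n_times - n_times_atom + 1).
    """
    obj = 0.0
    for x_n, z_n in zip(X, Z):
        # Reconstruction: sum over atoms of the full convolutions d_k * z_n^k.
        x_hat = sum(np.convolve(z_nk, d_k) for z_nk, d_k in zip(z_n, D))
        obj += np.sum((x_n - x_hat) ** 2)
    # The l1 term corresponds to the exponential prior on the activations.
    return obj + reg * np.abs(Z).sum()

X = np.array([[1., 1., 0., 0.]])
D = np.array([[1., 1.]])        # one atom of length 2
Z = np.array([[[1., 0., 0.]]])  # a single activation spike at t = 0
obj = csc_objective(X, D, Z, reg=0.5)  # residual is zero, so obj = 0.5 * |Z|_1 = 0.5
```

With a perfectly reconstructed signal, only the ℓ1 penalty contributes, which makes the role of $\lambda$ as a sparsity knob easy to see.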
212. Alpha-stable distributions
$X \sim S(\alpha, \beta, \sigma, \mu)$
Characteristic exponent $\alpha \in (0, 2]$ (smaller $\alpha$: heavier tail)
Skewness parameter $\beta \in [-1, 1]$
Scale parameter $\sigma \in (0, \infty)$
Location parameter $\mu \in (-\infty, \infty)$
Special cases:
Normal distribution: $X \sim S(2, 0, \sigma, \mu)$
Cauchy distribution: $X \sim S(1, 0, \sigma, \mu)$
41
Advances in automating analysis of neural time series
[Samorodnitsky et al., 1996]
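These special cases can be checked numerically. The sketch below draws from the symmetric case $S(\alpha, 0, 1, 0)$ with the Chambers–Mallows–Stuck method (a standard sampler, assumed here with $\sigma = 1$, $\mu = 0$): at $\alpha = 2$ it reduces to a Gaussian with variance $2\sigma^2 = 2$, and at $\alpha = 1$ to a standard Cauchy.

```python
import numpy as np

def sample_sas(alpha, size, rng):
    """Symmetric alpha-stable S(alpha, 0, 1, 0) samples via the
    Chambers-Mallows-Stuck construction (valid for beta = 0)."""
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)  # uniform angle
    w = rng.exponential(1.0, size)                # unit-rate exponential
    return (np.sin(alpha * u) / np.cos(u) ** (1 / alpha)
            * (np.cos(u - alpha * u) / w) ** ((1 - alpha) / alpha))

rng = np.random.default_rng(0)
g = sample_sas(2.0, 100_000, rng)  # alpha = 2: Gaussian, variance 2
c = sample_sas(1.0, 100_000, rng)  # alpha = 1: Cauchy, heavy tails
```

At $\alpha = 1$ the exponent $(1 - \alpha)/\alpha$ vanishes and the formula collapses to $\tan(U)$, the textbook Cauchy sampler, which is a quick way to convince yourself the construction is consistent.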
217. Symmetric α-stable distribution is
conditionally Gaussian
$x_n[t] \mid d, z \sim S\big(\alpha,\, 0,\, 1/\sqrt{2},\; \textstyle\sum_{k=1}^{K} (d^k * z_n^k)[t]\big)$
Equivalently, with a latent variable $\phi_n[t]$ drawn from a positive stable distribution:
$x_n[t] \mid d, z, \phi_n[t] \sim \mathcal{N}\big(\textstyle\sum_{k=1}^{K} (d^k * z_n^k)[t],\; \phi_n[t]\big)$
The MAP estimate then becomes
$\min_{d, z} \sum_{n,t} \frac{1}{\phi_n[t]} \big(x_n[t] - \textstyle\sum_k (d^k * z_n^k)[t]\big)^2 + \lambda \sum_{n,k} \|z_n^k\|_1$
Weighted CSC! But $\phi_n[t]$ is not known …
43
Advances in automating analysis of neural time series
[Samorodnitsky et al., 1996]
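The scale-mixture identity is easy to demonstrate for $\alpha = 1$ (Cauchy), the one symmetric case where the positive-stable mixing distribution has a simple closed form: the Lévy distribution, which can be sampled as $1/Z^2$ with $Z$ standard normal. A hedged sketch (the scaling here follows the standard Cauchy, not the $1/\sqrt{2}$ parametrization of the slide):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Latent variances from a positive 1/2-stable (Levy) distribution:
phi = 1.0 / rng.standard_normal(n) ** 2

# Conditionally Gaussian draws with those per-sample variances:
x = rng.standard_normal(n) * np.sqrt(phi)

# Marginally, x is standard Cauchy, i.e. S(1, 0, 1, 0): its median
# absolute value is 1 and its tails are far heavier than any Gaussian's.
```

This is exactly the structure the slide exploits: each sample is Gaussian given its latent variance, so occasional huge variances (artifacts) are absorbed by $\phi_n[t]$ instead of corrupting the fit.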
223. EM algorithm is used to estimate MAP when
some variables are missing
Expectation step:
$B^{(i)}(d, z) = \mathbb{E}_{p(\phi \mid x, d^{(i)}, z^{(i)})}\big[\log p(x, \phi \mid d, z)\big] + \log p(z)$
Up to constants, maximizing $B^{(i)}$ amounts to minimizing the weighted objective
$\sum_{n,t} \mathbb{E}\big[\tfrac{1}{\phi_n[t]}\big] \big(x_n[t] - \textstyle\sum_k (d^k * z_n^k)[t]\big)^2 + \lambda \sum_{n,k} \|z_n^k\|_1$
where the expectations $\mathbb{E}[1/\phi_n[t]]$ act as weights.
Maximization step:
$(d^{(i+1)}, z^{(i+1)}) = \arg\max_{d, z} B^{(i)}(d, z)$
Iterate until convergence.
44
Advances in automating analysis of neural time series
[Dempster et al., 1977]
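The E/M alternation above can be shown end-to-end on a toy scale-mixture model that is *not* the αCSC model: estimating the location of Student-t data. The t distribution is also a Gaussian scale mixture, but its E-step weights $\mathbb{E}[1/\phi_i]$ have a closed form, so no MCMC is needed; everything here (`t_location_em`, `nu`) is illustrative:

```python
import numpy as np

def t_location_em(x, nu, n_iter=50):
    """EM for the location of Student-t data (scale fixed to 1).

    E-step: closed-form expected inverse variances
            w_i = E[1/phi_i] = (nu + 1) / (nu + r_i^2).
    M-step: weighted least squares, i.e. a weighted mean.
    """
    mu = np.median(x)  # robust initialization
    for _ in range(n_iter):
        r = x - mu
        w = (nu + 1.0) / (nu + r ** 2)  # E-step: down-weight outliers
        mu = np.sum(w * x) / np.sum(w)  # M-step: weighted mean
    return mu

rng = np.random.default_rng(0)
x = 2.0 + rng.standard_t(df=3, size=5000)  # heavy-tailed data centered at 2
mu_hat = t_location_em(x, nu=3.0)
```

The structure mirrors the slide exactly: samples that look like outliers get large latent variances, hence small weights, hence little influence on the M-step, which is why the scale-mixture view buys robustness to transient artifacts.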
229. 45
For the E-step, we compute the expectation
using sampling
$\mathbb{E}\big[\tfrac{1}{\phi_{n,t}}\big] = \int \tfrac{1}{\phi_{n,t}}\, p(\phi_{n,t} \mid x, d, z)\, d\phi_{n,t} \approx \tfrac{1}{J} \sum_{j=1}^{J} \tfrac{1}{\phi_{n,t}^{(j)}}$
where the $\phi_{n,t}^{(j)}$ are sampled from the posterior distribution using Markov chain Monte Carlo (MCMC)
Advances in automating analysis of neural time series
[Bishop, 2007]
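The Monte Carlo approximation itself is easy to sanity-check in isolation. In the sketch below the posterior over $\phi$ is replaced by a stand-in Gamma distribution with a known $\mathbb{E}[1/\phi]$ (the real E-step draws $\phi_{n,t}^{(j)}$ from the positive-stable posterior with MCMC; the Gamma is purely an assumption for this illustration). For Gamma with shape $a$ and scale $s$, $\mathbb{E}[1/\phi] = 1/(s(a-1))$:

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in for posterior samples phi^(j); the real algorithm uses MCMC.
J = 200_000
phi = rng.gamma(shape=3.0, scale=1.0, size=J)

# E[1/phi] ~= (1/J) * sum_j 1/phi^(j)
estimate = np.mean(1.0 / phi)
exact = 1.0 / (1.0 * (3.0 - 1.0))  # = 0.5 for Gamma(shape=3, scale=1)
```

The estimator converges at the usual $O(1/\sqrt{J})$ Monte Carlo rate, which is why a moderate number of MCMC samples per E-step suffices in practice.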
237. 46
Let’s recap
Data likelihood term: $x_n[t] \mid d, z, \phi_n[t] \sim \mathcal{N}\big(\textstyle\sum_{k=1}^{K} (d^k * z_n^k)[t],\; \phi_n[t]\big)$
Conditionally Gaussian, with latent variable $\phi_n[t]$
EM algorithm:
E step: MCMC to learn the weights $\mathbb{E}[1/\phi_{n,t}]$
M step: Weighted CSC
Advances in automating analysis of neural time series
242. 47
αCSC on simulated data
Advances in automating analysis of neural time series
[Jost et al., 2006]
243. 48
αCSC with α = 2 (reduces to Gaussian)
Advances in automating analysis of neural time series
248. Advances in automating analysis of neural time series 50
Conclusion
Data sharing and data standards
Autoreject to remove artifacts
αCSC to learn recurring waveforms