Neural Decoding
June 7th
Hiroaki Hamada
• Encoding: (neuron response) = f(stimuli)
• Decoding: (stimuli) = f⁻¹(neuron response)
Decoding is the inverse problem of encoding; Bayes' theorem is commonly used.
1. P[s]: the probability of stimulus s being presented, often called the prior probability.
2. P[r], r = (r1, …, rN): the probability of response r (a vector of spike-count firing rates) being recorded.
3. P[r, s]: the probability of stimulus s being presented and response r being recorded (the joint probability).
4. P[r|s]: the conditional probability of evoking response r, given that stimulus s was presented.
5. P[s|r]: the conditional probability that stimulus s was presented, given that response r was recorded.
Neural Decoding
P[r] can be calculated from P[r|s] by summing over all stimulus values weighted by their probabilities, and likewise P[s] from P[s|r]:

P[r] = Σs P[r|s] P[s]   and   P[s] = Σr P[s|r] P[r]

The joint probability can be written as P[r|s] times the probability of the stimulus, or as P[s|r] times the probability of the response:

P[r, s] = P[r|s] P[s] = P[s|r] P[r]

Bayes' theorem follows:

P[s|r] = P[r|s] P[s] / P[r]   or   P[r|s] = P[s|r] P[r] / P[s]
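As a toy numeric illustration of these formulas (all numbers below are made up, not from the lecture), with two stimuli and a binarized "high"/"low" response:

```python
# Toy Bayes-rule decoding with two stimuli (s0, s1) and a binary response.
# All probabilities are hypothetical, chosen only to illustrate the formulas.

# Priors P[s]
P_s = {"s0": 0.5, "s1": 0.5}

# Likelihoods P[r|s]
P_r_given_s = {
    ("high", "s0"): 0.2, ("low", "s0"): 0.8,
    ("high", "s1"): 0.7, ("low", "s1"): 0.3,
}

def P_r(r):
    # Marginal: P[r] = sum_s P[r|s] P[s]
    return sum(P_r_given_s[(r, s)] * P_s[s] for s in P_s)

def P_s_given_r(s, r):
    # Bayes' theorem: P[s|r] = P[r|s] P[s] / P[r]
    return P_r_given_s[(r, s)] * P_s[s] / P_r(r)

print(P_r("high"))                # 0.2*0.5 + 0.7*0.5 = 0.45
print(P_s_given_r("s1", "high"))  # 0.35 / 0.45 ≈ 0.778
```

Observing a "high" response shifts belief toward s1, exactly as the posterior formula prescribes.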
Probabilities in decoding:
Decoding of intention.
Probabilities in decoding
Two datasets are given, and we decode the intention of arm reaching.

neuronL (firing rate, left trial i)    neuronR (firing rate, right trial i)
53.9781                                58.3077
57.6395                                56.3932
56.1187                                38.9440
38.0109                                39.9052
67.3739                                50.7743
…                                      …

(Remember: capitalization matters in MATLAB.)
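From each dataset one can estimate a class-conditional firing-rate distribution (e.g. a Gaussian per neuron). A minimal sketch in Python, using only the five values shown above (the lecture itself uses MATLAB):

```python
import statistics

# Firing rates from the table (first five trials of each dataset)
neuronL = [53.9781, 57.6395, 56.1187, 38.0109, 67.3739]  # left trials
neuronR = [58.3077, 56.3932, 38.9440, 39.9052, 50.7743]  # right trials

# Fit a Gaussian to each class: sample mean and standard deviation
muL, sdL = statistics.mean(neuronL), statistics.stdev(neuronL)
muR, sdR = statistics.mean(neuronR), statistics.stdev(neuronR)

print(f"left trials:  mean={muL:.2f}, sd={sdL:.2f}")
print(f"right trials: mean={muR:.2f}, sd={sdR:.2f}")
```

These per-class statistics are the ingredients for the likelihoods P[r|s] used in the Bayesian decoder.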
Probabilities in decoding
Neuron 1, Neuron 2 → movement intention
Assumption (naïve Bayes): the responses are conditionally independent given the intention, P[r1, r2 | s] = P[r1 | s] P[r2 | s].
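Under the naïve Bayes assumption, the posterior over intentions is proportional to the product of per-neuron likelihoods and the prior. A minimal sketch with Gaussian likelihoods (all parameter values below are hypothetical, not estimates from the lecture's data):

```python
import math

def gauss_pdf(x, mu, sd):
    # Gaussian probability density
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# Hypothetical class-conditional Gaussians (mean, sd) for two neurons
params = {
    "left":  {"n1": (55.0, 8.0), "n2": (45.0, 8.0)},
    "right": {"n1": (45.0, 8.0), "n2": (55.0, 8.0)},
}
prior = {"left": 0.5, "right": 0.5}

def posterior(r1, r2):
    # Naive Bayes: P[s | r1, r2] ∝ P[r1 | s] P[r2 | s] P[s]
    unnorm = {s: gauss_pdf(r1, *p["n1"]) * gauss_pdf(r2, *p["n2"]) * prior[s]
              for s, p in params.items()}
    z = sum(unnorm.values())
    return {s: v / z for s, v in unnorm.items()}

post = posterior(60.0, 40.0)  # high neuron-1 rate, low neuron-2 rate
print(post)
```

With these parameters, a high neuron-1 rate and a low neuron-2 rate favors the "left" intention.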
Decoding by Linear Filter
Translation from observed data to predicted data.
Decoding by Linear Filter
Pinball Task
(Serruya et al., Nature, 2002)
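A linear filter maps observed firing rates to a predicted variable (e.g. cursor position in the pinball task) through weights fit by least squares. A minimal sketch on synthetic data (not the Serruya et al. dataset; the generating weights are invented for illustration):

```python
import random

random.seed(0)

# Synthetic example: a position signal generated from two neurons' rates
true_w = [0.5, -0.3]
rates = [[random.gauss(50, 10), random.gauss(50, 10)] for _ in range(200)]
pos = [true_w[0] * r[0] + true_w[1] * r[1] + random.gauss(0, 1) for r in rates]

# Least-squares fit: solve the 2x2 normal equations (R^T R) w = R^T pos
s11 = sum(r[0] * r[0] for r in rates)
s12 = sum(r[0] * r[1] for r in rates)
s22 = sum(r[1] * r[1] for r in rates)
b1 = sum(r[0] * p for r, p in zip(rates, pos))
b2 = sum(r[1] * p for r, p in zip(rates, pos))
det = s11 * s22 - s12 * s12
w = [(s22 * b1 - s12 * b2) / det, (s11 * b2 - s12 * b1) / det]

print(w)  # recovered weights, close to true_w
```

In practice one would use many neurons and several time lags per neuron, but the fitting step is the same normal-equations (or pseudoinverse) computation.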
Decoding by Particle Filter
Particle filter:
1. A method for approximating a probability density function with multiple samples ("particles").
2. Also called the Monte Carlo filter, bootstrap filter, or sampling/importance resampling (SIR) filter.
3. Invented independently by Kitagawa and by Gordon.
Because the observed data are expected to be noisy, the particles are used to concentrate on the expected state.
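The bootstrap (SIR) filter in its simplest form repeats three steps: propagate particles with the state model, weight them by the observation likelihood, and resample. A minimal 1-D random-walk sketch (the noise levels and target value are assumptions for illustration):

```python
import math
import random

random.seed(1)

def likelihood(obs, x, obs_sd=1.0):
    # Gaussian observation likelihood (unnormalized is fine for weighting)
    return math.exp(-0.5 * ((obs - x) / obs_sd) ** 2)

def particle_filter_step(particles, obs, proc_sd=0.5):
    # 1. Predict: propagate each particle with process noise
    particles = [x + random.gauss(0, proc_sd) for x in particles]
    # 2. Weight: importance weights from the observation likelihood
    weights = [likelihood(obs, x) for x in particles]
    # 3. Resample: draw particles in proportion to their weights
    return random.choices(particles, weights=weights, k=len(particles))

# Track a fixed latent state of 3.0 from noisy observations
particles = [random.uniform(-10, 10) for _ in range(1000)]
for _ in range(20):
    obs = 3.0 + random.gauss(0, 1.0)
    particles = particle_filter_step(particles, obs)

estimate = sum(particles) / len(particles)
print(estimate)  # particle mean, near the true state of 3.0
```

The particle cloud starts spread over [-10, 10] and collapses onto the region consistent with the noisy observations, which is exactly the "focus on the expected point" behavior described above.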
Reverse Correlation
A way of constructing the receptive field:
1. Present random input patterns to the receptive field.
2. Superimpose the response patterns of the neurons.
3. Reconstruct the receptive field.
The firing-rate modulation due to the random stimulus X(t).
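The recipe above amounts to computing a spike-triggered average: collect the stimulus segment preceding each spike and average them. A minimal 1-D sketch with a synthetic threshold neuron (the neuron model, lag, and threshold are assumptions for illustration):

```python
import random

random.seed(2)

# White-noise stimulus X(t)
T = 20000
stim = [random.gauss(0, 1) for _ in range(T)]

# Synthetic neuron: spikes when the stimulus 2 steps earlier exceeds a threshold
LAG = 2
spike_times = [t for t in range(LAG, T) if stim[t - LAG] > 1.5]

# Spike-triggered average of the stimulus over the preceding window
WIN = 5
used = [t for t in spike_times if t >= WIN]
sta = [sum(stim[t - k] for t in used) / len(used) for k in range(WIN)]

print(sta)  # large at index 2 (the neuron's lag), near zero elsewhere
```

The averaging cancels the stimulus values uncorrelated with spiking, so the recovered filter peaks at the lag the neuron actually responds to.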
Signal Detection Theory
Decision rule: is the signal present or absent in the noisy background?
The stimulus s1, no stimulus s0.
The likelihood ratio: l(r) = p[r|s1] / p[r|s0].
Two possible types of error:
1. Calling the stimulus present when it was not (a false alarm, or false positive).
2. Calling the stimulus absent when it was present (a miss, or false negative).
ROC curve (receiver operating characteristic): the relationship between the false-alarm rate and the hit (equivalently, miss) rate as the decision criterion varies.
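For Gaussian response distributions under s0 and s1, sweeping the decision threshold traces out the ROC curve: each threshold gives a false-alarm rate (responding "present" under s0) and a hit rate (responding "present" under s1). A minimal sketch (the means and common standard deviation are hypothetical):

```python
import math

def gauss_cdf(x, mu=0.0, sd=1.0):
    # Cumulative distribution function of a Gaussian
    return 0.5 * (1 + math.erf((x - mu) / (sd * math.sqrt(2))))

MU0, MU1, SD = 0.0, 2.0, 1.0  # noise mean, signal mean, common sd

def roc_point(threshold):
    # Decision rule: respond "present" when r > threshold
    false_alarm = 1 - gauss_cdf(threshold, MU0, SD)  # false-positive rate
    hit = 1 - gauss_cdf(threshold, MU1, SD)          # 1 - miss rate
    return false_alarm, hit

for z in [-1.0, 0.0, 1.0, 2.0, 3.0]:
    fa, hit = roc_point(z)
    print(f"threshold={z:+.1f}  FA={fa:.3f}  hit={hit:.3f}")
```

Because the signal distribution sits above the noise distribution, every threshold yields a hit rate above its false-alarm rate, so the curve bows above the diagonal; the area under it measures detectability.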
