Neural decoding

  1. Neural Decoding. June 7th. Hiroaki Hamada.
  2. Neural Decoding
     • Encoding: (neuron response) = f(stimuli)
     • Decoding: (stimuli) = f^-1(neuron response), the inverse problem of encoding; Bayes' theorem is commonly used.
     1. P[s]: the probability of stimulus s being presented, often called the prior probability.
     2. P[r], with r = (r1, ..., rN) the spike-count firing rates: the probability of response r being recorded.
     3. P[r, s]: the probability of stimulus s being presented and response r being recorded (the joint probability).
     4. P[r|s]: the probability of evoking response r, given that stimulus s was presented.
     5. P[s|r]: the conditional probability that stimulus s was presented, given that response r was recorded.
  3. Neural Decoding
     P[r] can be calculated from P[r|s] by summing over all stimulus values weighted by their probabilities:
     P[r] = Σs P[r|s] P[s]   and   P[s] = Σr P[s|r] P[r]
     Also, the joint probability P[r, s] can be written as P[r|s] times the probability of the stimulus, or as P[s|r] times the probability of the response:
     P[r, s] = P[r|s] P[s] = P[s|r] P[r]
     Bayes' theorem:
     P[s|r] = P[r|s] P[s] / P[r]   or   P[r|s] = P[s|r] P[r] / P[s]
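     As a minimal numerical sketch of these identities in MATLAB (the two-stimulus, three-response prior and likelihood values below are invented for illustration, not taken from the slides):

        % Hypothetical discrete example: 2 stimuli, 3 possible response values.
        Ps  = [0.7 0.3];                % prior P[s] over stimuli s1, s2
        Prs = [0.5 0.3 0.2;             % likelihood P[r|s]: row = stimulus, column = response
               0.1 0.3 0.6];

        Pr        = Ps * Prs;           % marginal P[r] = sum_s P[r|s] P[s]
        Prs_joint = Prs .* Ps';         % joint P[r,s] = P[r|s] P[s] (implicit expansion, R2016b+)
        Psr       = Prs_joint ./ Pr;    % Bayes: P[s|r] = P[r|s] P[s] / P[r], one column per response

        disp(Pr)                        % marginal response distribution
        disp(Psr)                       % posterior over the two stimuli for each response value

     Each column of Psr sums to one, and swapping the roles of r and s gives the second form of the theorem.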
  4. Neural Decoding
     Probabilities in decoding: decoding of intention.
  5. Probabilities in decoding
     Two datasets are given, and we decode the intention of an arm reach.

         neuronL                          neuronR
         (firing rate, left trial i)      (firing rate, right trial i)
         53.9781                          58.3077
         57.6395                          56.3932
         56.1187                          38.9440
         38.0109                          39.9052
         67.3739                          50.7743
         ...                              ...

     (Remember: capitalization is important for MATLAB.)
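     A sketch of how these two variables might be set up in MATLAB, using only the five trial values visible on the slide (the full datasets are not reproduced here), and interpreting neuronL and neuronR as the firing rates recorded on left- and right-reach trials:

        % First five trials of each dataset, as listed on the slide.
        neuronL = [53.9781; 57.6395; 56.1187; 38.0109; 67.3739];   % firing rates, left-reach trials
        neuronR = [58.3077; 56.3932; 38.9440; 39.9052; 50.7743];   % firing rates, right-reach trials

        % Summary statistics that a Gaussian naive-Bayes decoder would need.
        muL = mean(neuronL);  sdL = std(neuronL);
        muR = mean(neuronR);  sdR = std(neuronR);
        fprintf('left:  mean %.2f, sd %.2f\nright: mean %.2f, sd %.2f\n', muL, sdL, muR, sdR);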
  6. Probabilities in decoding (figure only)
  7. Probabilities in decoding (figure only)
  8. Probabilities in decoding
     (Figure: Neuron 1, Neuron 2, and the movement intention.)
     Assumption (naïve Bayes): the responses of Neuron 1 and Neuron 2 are conditionally independent given the movement intention.
  9. Probabilities in decoding (figure only)
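     A minimal sketch of the Bayesian intention decoder implied by slides 5-9, continuing from the variables defined in the sketch after slide 5. It assumes Gaussian class-conditional firing-rate distributions and a flat prior over the two intentions; only one neuron is shown, since under the naive-Bayes assumption the likelihoods of additional neurons are simply multiplied in:

        % Gaussian likelihood of a firing rate r under mean mu and standard deviation sd.
        gauss = @(r, mu, sd) exp(-(r - mu).^2 ./ (2*sd.^2)) ./ (sqrt(2*pi)*sd);

        prior = [0.5 0.5];                 % flat prior over [left, right] intention
        r_new = 45.0;                      % hypothetical firing rate observed on a new trial

        % muL, sdL, muR, sdR come from the sketch after slide 5.
        lik  = [gauss(r_new, muL, sdL), gauss(r_new, muR, sdR)];   % P[r|s] for each intention
        post = lik .* prior / sum(lik .* prior);                   % P[s|r] by Bayes' theorem

        fprintf('P(left|r) = %.3f, P(right|r) = %.3f\n', post(1), post(2));
        % With two neurons and the naive-Bayes assumption, P[r1, r2|s] = P[r1|s] P[r2|s],
        % so the likelihood vector would just be the product of the per-neuron likelihoods.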
  10. Decoding by Linear Filter
      Translation from observed data to predicted data.
  11. Decoding by Linear Filter
      Pinball task (Serruya et al., Nature, 2002).
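      A hedged sketch of a linear-filter decoder of the general kind used in such cursor-control tasks; this is not the exact method of Serruya et al., just an ordinary least-squares mapping from simulated firing rates to a simulated two-dimensional cursor velocity, with all sizes and noise levels invented:

        rng(0);
        T = 500;  N = 20;                     % time bins and number of recorded neurons (arbitrary)
        X = rand(T, N) * 50;                  % simulated firing rates (spikes/s)
        W_true = randn(N, 2);                 % hidden linear mapping to (x, y) cursor velocity
        Y = X * W_true + 5 * randn(T, 2);     % observed velocities with additive noise

        W_hat = X \ Y;                        % least-squares estimate of the linear filter
        Y_hat = X * W_hat;                    % velocity predicted from firing rates alone

        fprintf('mean squared velocity error: %.2f\n', mean(sum((Y - Y_hat).^2, 2)));
        % Real decoders usually stack several time lags of each neuron's rate into X,
        % so that the filter can exploit the temporal structure of the responses.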
  12. Decoding by Particle Filter
      Particle filter:
      1. A method for approximating a probability density function by multiple samples (particles).
      2. Also called the Monte Carlo filter, bootstrap filter, or sampling/importance resampling (SIR) filter.
      3. Invented independently by Kitagawa and Gordon.
      Given the observed data, which we assume to be noisy, the particles are used to home in on the expected state.
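      A minimal bootstrap (SIR) particle-filter sketch for a one-dimensional random-walk state observed through Gaussian noise; the model and all parameter values are invented for illustration and are not taken from the slides:

        rng(1);
        T  = 100;                 % number of time steps
        Np = 1000;                % number of particles
        q  = 0.5;  r = 1.0;       % process and observation noise standard deviations

        % Simulate a hidden random walk and noisy observations of it.
        x_true = cumsum(q * randn(T, 1));
        y_obs  = x_true + r * randn(T, 1);

        particles = zeros(Np, 1);             % initial particle cloud at the origin
        x_est = zeros(T, 1);
        for t = 1:T
            % Propagate every particle through the state-transition model.
            particles = particles + q * randn(Np, 1);

            % Weight each particle by the likelihood of the new observation.
            w = exp(-(y_obs(t) - particles).^2 / (2 * r^2));
            w = w / sum(w);

            x_est(t) = w' * particles;        % posterior-mean estimate at time t

            % Multinomial resampling (the sampling/importance-resampling step).
            c = cumsum(w);
            c(end) = 1;                       % guard against round-off
            idx = zeros(Np, 1);
            for i = 1:Np
                idx(i) = find(c >= rand, 1, 'first');
            end
            particles = particles(idx);
        end

        fprintf('RMS tracking error: %.3f\n', sqrt(mean((x_est - x_true).^2)));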
  13. Reverse Correlation
      A way of constructing the receptive field:
      1. Present random input patterns to the receptive field.
      2. Superimpose the neurons' response patterns.
      3. Reconstruct the receptive field.
      The firing-rate modulation due to the random stimulus X(t).
  14. Reverse Correlation (figure only)
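      A sketch of reverse correlation in its simplest spike-triggered-average form: a white-noise stimulus X(t) drives a hypothetical neuron whose firing depends on a hidden temporal filter, and averaging the stimulus segments that precede each spike recovers that filter (the filter shape, nonlinearity, and all parameters are invented):

        rng(2);
        T = 20000;                % number of stimulus samples
        L = 30;                   % length of the temporal filter (samples)
        X = randn(T, 1);          % white-noise stimulus X(t)

        k_true = exp(-(0:L-1)'/5) .* sin((0:L-1)'/3);   % hidden temporal receptive field
        drive  = filter(k_true, 1, X);                  % stimulus passed through the filter
        rate   = max(drive, 0);                         % simple rectifying nonlinearity
        spikes = rand(T, 1) < 0.1 * rate;               % Bernoulli spiking in each time bin

        % Spike-triggered average: mean stimulus segment looking back from each spike.
        spike_times = find(spikes);
        spike_times = spike_times(spike_times > L);     % skip spikes without a full history
        sta = zeros(L, 1);
        for s = spike_times'
            sta = sta + X(s:-1:s-L+1);
        end
        sta = sta / numel(spike_times);
        % Because X is white noise, sta is proportional to k_true (up to noise and scaling).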
  15. Signal Detection Theory
      Decision rule: is the signal present or absent in the noisy background?
      The stimulus s1, no stimulus s0; the likelihood ratio P[r|s1]/P[r|s0].
      Two possible types of error:
      1. Calling the stimulus present when it was not (a false alarm, or false positive).
      2. Calling the stimulus absent when it was present (a miss, or false negative).
      ROC curve (receiver operating characteristic): the relationship between the false-alarm rate and the hit rate (one minus the miss rate) as the decision threshold varies.
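      An empirical sketch of the likelihood-ratio and ROC idea: responses are simulated under s0 (noise only) and s1 (signal plus noise) as two Gaussians, and sweeping a decision threshold traces out the hit rate against the false-alarm rate (the distributions and parameters are invented):

        rng(3);
        n  = 5000;
        r0 = randn(n, 1);             % responses when no stimulus is present (s0)
        r1 = 1.5 + randn(n, 1);       % responses when the stimulus is present (s1)

        thresholds = linspace(-4, 6, 200);
        fa  = zeros(size(thresholds));    % false-alarm rate: "present" when it was not
        hit = zeros(size(thresholds));    % hit rate (one minus the miss rate)
        for i = 1:numel(thresholds)
            z      = thresholds(i);
            fa(i)  = mean(r0 > z);
            hit(i) = mean(r1 > z);
        end

        plot(fa, hit);  xlabel('false-alarm rate');  ylabel('hit rate');
        title('ROC for two hypothetical Gaussian response distributions');
        % For equal-variance Gaussians the likelihood ratio P[r|s1]/P[r|s0] is monotone
        % in r, so thresholding the raw response is equivalent to thresholding the ratio.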
