Neurally Driven Prosthetics

Final presentation for a capstone neuroscience class.


  1. Neurally Driven Prosthetics: Pt. II
     Study: Schwartz et al., primate control of a robotic arm
     http://news.nationalgeographic.com/news/2008/05/080530-monkey-video-ap.html
  2. Signal is derived from M1
     The primary motor cortex produces stereotyped neuron-ensemble firing patterns during a particular action.
     Neural activity is strongly coupled to behavior in this area, making it ideal for a decoding algorithm.
     Ensemble firing patterns are vectorial and encoded by firing rate.
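The "vectorial, rate-encoded" idea above is commonly formalized as cosine directional tuning: each neuron fires most when movement is along its preferred direction. A minimal sketch, with made-up baseline and modulation rates (the study does not give these numbers):

```python
import numpy as np

def firing_rate(move_dir, preferred_dir, baseline=10.0, modulation=8.0):
    """Cosine tuning: rate peaks when the movement direction aligns with
    the neuron's preferred direction (both given as unit vectors)."""
    return baseline + modulation * np.dot(move_dir, preferred_dir)

# A neuron preferring rightward movement fires most for rightward reaches
right = np.array([1.0, 0.0])
up = np.array([0.0, 1.0])
print(firing_rate(right, right))   # 18.0 (maximal)
print(firing_rate(up, right))      # 10.0 (baseline; orthogonal movement)
print(firing_rate(-right, right))  # 2.0 (suppressed below baseline)
```

Reading the rate back out is what makes decoding possible: the deviation from baseline tells you how well the movement matches the neuron's preferred direction.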
  3. Cortical ensemble activity and behavioral pairing is strong
     This allows engineers to be confident about designing a patterned output from a neural signal.
     The figure shows the high predictability of a neuron's activity from behavior: the machine can accurately predict what the monkey's arm will do.
  4. Macaque Brain
  5. Other motor association cortices
     These occur in parallel processing networks and work in concert to initiate, guide/modify, and extinguish movements.
     They include the supplementary motor cortex (fine/complex movements), cerebellum (coordination/error suppression), basal ganglia (initiation/extinction), and possibly parietal association cortices involved in the internal representations that govern hand manipulation and grasping.
  6. The interface
     Researchers found that microelectrode arrays with tens of electrodes are sufficient as a signal source.
     Hundreds are most likely needed to accommodate the full repertoire of limb movements.
  7. Interface is closed-loop
     Once the microelectrode array is implanted, it cannot be modified: a closed system.
     Manipulation of the robot arm is real-time, sampling neuronal recordings once every 30 ms.
     The time delay for arm movement is ~150 ms, comparable to the delay seen in a biological arm.
     Visual feedback substitutes for sensory feedback, allowing increased accuracy and agility through learning.
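The 30 ms sampling cycle amounts to a repeated decode-and-integrate step. One iteration can be sketched as below, assuming a simple population-vector-style decoder; the function name and all numbers are illustrative, not the study's actual code:

```python
import numpy as np

DT = 0.030  # 30 ms sampling interval, as described in the study

def control_step(spike_counts, preferred_dirs, position):
    """One closed-loop cycle: decode an endpoint velocity from the
    ensemble's spike counts, then integrate it into the arm position."""
    rates = spike_counts / DT             # counts per bin -> firing rates (Hz)
    weights = rates - rates.mean()        # modulation about the ensemble mean
    velocity = weights @ preferred_dirs   # weighted vector sum of preferred dirs
    return position + velocity * DT       # new endpoint after one 30 ms step

# Two opposed units: the more active one pulls the arm its way (+x here)
prefs = np.array([[1.0, 0.0], [-1.0, 0.0]])
new_pos = control_step(np.array([3.0, 1.0]), prefs, np.array([0.0, 0.0]))
```

Running this in a loop, with the monkey watching the arm move, is what closes the loop: the visual error drives learning, and the decoder's output improves trial over trial.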
  8. Predicted vs. Actual
  9. A little computation (don't be scared...)
     A mathematical algorithm turns electrical signal patterns from the electrodes into a meaningful set of commands for the robotic arm.
     It is a linear, integrative function.
  10. The Population Vector Algorithm (PVA) is used
     PVA "relies on the directional tuning of each unit, characterized by a single preferred direction in which the unit fires maximally. The real-time population vector is essentially a vector sum of the preferred directions of the units in the recorded population, weighted by the instantaneous firing rates of the units" (Schwartz et al.).
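The quoted definition translates almost directly into code: subtract each unit's baseline rate, then take the weighted sum of the preferred directions. A minimal sketch with three idealized, orthogonally tuned units (all numbers invented for illustration):

```python
import numpy as np

def population_vector(rates, baselines, preferred_dirs):
    """Vector sum of each unit's preferred direction, weighted by
    its instantaneous firing rate above baseline."""
    weights = rates - baselines       # how strongly each unit is driven
    return weights @ preferred_dirs   # (n_units,) @ (n_units, 3) -> (3,)

# Three units whose preferred directions span 3D space
prefs = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])
base = np.array([10.0, 10.0, 10.0])
rates = np.array([25.0, 10.0, 5.0])   # unit 0 excited, unit 2 suppressed
pv = population_vector(rates, base, prefs)
# pv is [15, 0, -5]: the decoded direction points along +x and slightly -z
```

Note that suppression below baseline contributes a *negative* weight, so a quiet unit pushes the vector away from its preferred direction; this is why both excitation and suppression carry information.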
  11. Visualizing PVA
     Neuronal assembly assignment: upon movement of the arm in one of eight directions, the excitability of each neuron determines its calibration in that direction. Tuning!
  12. Inverse Kinematics
     An additional algorithm is needed to calculate proper joint positioning.
     Because there are so many possible limb configurations (shoulder flexion/extension, adduction/abduction, rotation, and elbow flexion/extension), an algorithm computes the most probable joint positioning using inverse kinematics equations, avoiding computational overload.
  13. Inverse Kinematics
     To produce such an output from an inverse kinematics equation, only the starting and ending points are needed.
     The endpoint in 3D space is calculated in real time by integrating endpoint velocity to obtain endpoint position, which is then converted into a joint-angle command for the robot.
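For a concrete instance of such an endpoint-to-joint-angle conversion, a planar two-link arm has a closed-form inverse-kinematics solution. The sketch below is a textbook simplification with arbitrary link lengths, not the study's multi-joint arm:

```python
import numpy as np

def two_link_ik(x, y, l1=1.0, l2=1.0):
    """Closed-form inverse kinematics for a planar 2-link arm:
    given an endpoint (x, y), return shoulder and elbow angles (radians)."""
    # Law of cosines gives the elbow angle from the endpoint distance
    c2 = (x**2 + y**2 - l1**2 - l2**2) / (2 * l1 * l2)
    theta2 = np.arccos(np.clip(c2, -1.0, 1.0))
    # Shoulder angle: direction to target, minus the elbow's contribution
    theta1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(theta2),
                                           l1 + l2 * np.cos(theta2))
    return theta1, theta2

t1, t2 = two_link_ik(2.0, 0.0)  # fully extended reach along x
print(float(t1), float(t2))     # 0.0 0.0
```

The real arm has more joints than endpoint coordinates, so many joint configurations reach the same point; that redundancy is why the study's algorithm must pick the *most probable* configuration rather than a unique one.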
  14. Kinematic Data
  15. Kinematic Data
  16. Movement Quality
     Different colors represent the 4 different targets.
     Thin grey lines represent the average over all trials.
     Space-filling shapes represent the standard deviation.
     Grey balls represent areas where assistance was provided.
  17. Misc. Methodology
     It takes about 1,000 trials over a period of 1-2 weeks before high accuracy (80-100%) is achieved.
     A training period was necessary, during which small automated velocity vectors were added (in the direction of interest) as an assist.
     The kinematics of robotic arm control mirror the bell-shaped velocity profile of a natural arm, but are much slower (3-5 s robotic vs. 1-2 s biological).
  18. Hurdles
     Engineering: long-term, stable electrodes must be developed if this technology is to be used for prostheses.
     The sizable array of immobile recording, computing, and robotic-control hardware must be scaled down.
     The system is not autonomous: a trained specialist is needed to supervise.
     Neuroscientific: feedback is purely visual (clumsiness during training is typical for visual-only guidance); for optimal interaction with a physical environment, additional sensory feedback, mainly from pressure sensors, is needed.
     Velliste et al. recorded from the primary motor cortex, but as mentioned before, many areas of the brain with unique properties produce signals that may be useful in guiding a prosthetic device.
  19. Areas of great promise
     Patients with paralysis or locked-in syndrome could feasibly interact with their environment again.
     This work opens up the field of brain-machine interfaces (BMIs) beyond prostheses. Theoretically, as long as neural activity and behavior are coupled stereotypically, we can understand how the information is encoded; BMI engineers can decode it and feed the signal to a computer algorithm that produces an output.
  20. So maybe one day, we'll finally say BUH-BYE to this...
  21. And hello to this...
     Other applications:
     - Biomimetics (retinal and cochlear implants)
     - Neural signal feedback therapy
     - Gaming control
  22. But more likely this.
  23. Sources
     Alexander GE, Crutcher MD. Neural representations of the target (goal) of visually guided arm movements in three motor areas of the monkey. J Neurophysiol. 1990 Jul;64(1):164-78.
     Carmena JM, Lebedev MA, Crist RE, O'Doherty JE, Santucci DM, Dimitrov DF, Patil PG, Henriquez CS, Nicolelis MA. Learning to control a brain-machine interface for reaching and grasping by primates. PLoS Biol. 2003 Nov;1(2):E42. Epub 2003 Oct 13.
     Chapin JK, Moxon KA, Markowitz RS, Nicolelis MA. Real-time control of a robot arm using simultaneously recorded neurons in the motor cortex. Nat Neurosci. 1999 Jul;2(7):664-70.
     Crutcher MD, Russo GS, Ye S, Backus DA. Target-, limb-, and context-dependent neural activity in the cingulate and supplementary motor areas of the monkey. Exp Brain Res. 2004 Oct;158(3):278-88. Epub 2004 Jul 29.
     Dornay M, Sanger TD. Equilibrium point control of a monkey arm simulator by a fast learning tree structured artificial neural network. Biol Cybern. 1993;68(6):499-508.
     Helms Tillery SI, Taylor DM, Schwartz AB. Training in cortical control of neuroprosthetic devices improves signal extraction from small neuronal ensembles. Rev Neurosci. 2003;14(1-2):107-19.
     Liu X, Robertson E, Miall RC. Neuronal activity related to the visual representation of arm movements in the lateral cerebellar cortex. J Neurophysiol. 2003 Mar;89(3):1223-37. Epub 2002 Nov 20.
     Serruya M, Hatsopoulos N, Fellows M, Paninski L, Donoghue J. Robustness of neuroprosthetic decoding algorithms. Biol Cybern. 2003 Mar;88(3):219-28.
     Taylor DM, Tillery SI, Schwartz AB. Information conveyed through brain-control: cursor versus robot. IEEE Trans Neural Syst Rehabil Eng. 2003 Jun;11(2):195-9.
     Tillery SI, Taylor DM. Signal acquisition and analysis for cortical control of neuroprosthetics. Curr Opin Neurobiol. 2004 Dec;14(6):758-62. Review.
     Velliste M, Perel S, Spalding MC, Whitford AS, Schwartz AB. Cortical control of a prosthetic arm for self-feeding. Nature. 2008 May 28.
     Wahnoun R, He J, Helms Tillery SI. Selection and parameterization of cortical neurons for neuroprosthetic control. J Neural Eng. 2006 Jun;3(2):162-71. Epub 2006 May 16.
     Wessberg J, Nicolelis MA. Optimizing a linear algorithm for real-time robotic control using chronic cortical ensemble recordings in monkeys. J Cogn Neurosci. 2004 Jul-Aug;16(6):1022-35.
     Wessberg J, Stambaugh CR, Kralik JD, Beck PD, Laubach M, Chapin JK, Kim J, Biggs SJ, Srinivasan MA, Nicolelis MA. Real-time prediction of hand trajectory by ensembles of cortical neurons in primates. Nature. 2000 Nov 16;408(6810):361-5.
     Video: http://news.nationalgeographic.com/news/2008/05/080530-monkey-video-ap.html
     Electrode figure: http://www.crunchgear.com/wp-content/uploads/2008/04/windowslivewriterjapansfirstonbraininterfacebeingresearch-9723microelectrode-thumb.jpg
     Inverse kinematics figure: http://staff.aist.go.jp/eimei.oyama/IKPE.GIF
     BMI figure: http://www.sms.mavt.ethz.ch/flow_man_machine
     Cartoon: http://www.usabilitycorner.com/images/hci.JPG
     Terminator figure: http://www.igargoyle.com/archives/t800arm.jpg
     Macaque brain figure: http://www.nature.com/nrn/journal/v6/n3/thumbs/nrn1626-f6.jpg
     Keyboard figure: www.logitech.com
     "Predicted vs. Actual" kinematic trajectory figure: Wessberg et al.
     All other figures are from Velliste et al.
