Brain computer interfaces


  • At present, perhaps the most cumbersome factor is the need for scalp electrodes, which require an electrolyte gel for electrical conductivity and as little hair as possible. Users with normal hair have to deal with electrode preparation before use and hair cleaning after use. The scalp electrodes may always be the limiting factor in the resolution of an EEG-computer interface. There is probably much electrical activity concomitant with thought patterns and sensory images in the brain, but the fine resolution of this activity is not detectable with surface electrodes. Another difficulty is that the EP systems are quite slow. The EP must be derived by signal averaging; that is, multiple repetitions of the evoked response must be accumulated in order to see the EP signal above the noise. In the case of Dr. Sutter's system, 1.5 seconds is required to discriminate the selection of a particular letter from the alphabet array. The continuous EEG interface systems have faster switch functions because the change in alpha or mu wave amplitude can be detected more quickly.
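The signal-averaging step described above can be sketched numerically: many stimulus-locked repetitions are accumulated so the background EEG cancels while the EP remains. The signal shapes and parameters below are simulated assumptions, not data from Dr. Sutter's system.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 500                       # assumed sample rate (Hz)
n_trials, n_samples = 100, fs  # 100 one-second post-stimulus epochs

t = np.arange(n_samples) / fs
# simulated EP: a 5 µV Gaussian bump peaking 300 ms after the stimulus
ep = 5e-6 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))

# each trial is the same EP buried in ~20 µV RMS background EEG noise
trials = ep + 20e-6 * rng.standard_normal((n_trials, n_samples))

# averaging shrinks the noise by ~sqrt(n_trials), revealing the EP
average = trials.mean(axis=0)
```

With 100 trials the residual noise is roughly a tenth of the single-trial noise, which is why EP systems trade speed for reliability.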
  • The listener wore headphones to hear the music, and a cap with EEG sensors to record neural activity. The 26 sensor electrodes were arranged according to the 10-20 standard for EEG placement. The sensors are labelled by their proximity to regions of the brain (F=Frontal, T=Temporal, C=Central, P=Parietal, O=Occipital), followed by either a 'z' for the midline or a number that increases with distance from the midline. Odd numbers (1, 3, 5) are on the left hemisphere and even numbers (2, 4, 6) on the right; e.g. T4 is on the right temporal lobe, above the right ear. An additional 10 sensors were used to record heart rate, skin conductance, eye movements, breathing and other data. The sensors were recorded as interleaved channels of signed 32-bit integers at a rate of 500 samples per second. The channels were separated into individually named files and converted to ASCII format for simplicity of loading on different systems.
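A minimal sketch of de-interleaving the recording format described above, assuming a hypothetical buffer of 36 interleaved channels (26 EEG + 10 physiological) of signed 32-bit integers at 500 samples per second; the channel names and stand-in data are illustrative, not the original files.

```python
import numpy as np

n_channels, fs = 36, 500
# stand-in for one second of raw interleaved data:
# sample0[ch0..ch35], sample1[ch0..ch35], ...
raw = np.arange(n_channels * fs, dtype=np.int32)

frames = raw.reshape(-1, n_channels)  # rows = time points, columns = channels
channels = frames.T                   # one row per channel

# convert each channel to plain-ASCII text, one value per line
ascii_text = {f"ch{idx:02d}": "\n".join(str(v) for v in channels[idx])
              for idx in range(n_channels)}
```

Each entry of `ascii_text` could then be written to its own file, matching the per-channel ASCII layout the notes describe.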
  • The continuous or resting rhythms of the brain, "brain waves", are categorized by frequency bands. Different brain wave frequencies correspond to behavioral and attentional states of the brain, and a traditional classification system has long been used to characterize these different EEG rhythms: Alpha waves are between 8 and 13 Hz with amplitude in the range of 25-100 µV. They appear mainly over the occipital and parietal brain regions and show reduced amplitude with afferent stimulation, especially light, and also with intentional visual imagery or mental effort. Beta activity normally occurs in the range of 14 to 30 Hz and can reach 50 Hz during intense mental activity. Beta arises mainly from the parietal and frontal areas and is associated with the normal alert mental state. Theta waves occur in the 4 to 7 Hz range and arise from the temporal and parietal regions in children, but also occur in adults in response to emotional stress, especially frustration or disappointment. Delta activity includes all brain waves below 3.5 Hz. Delta occurs in deep sleep, during infancy, and in patients with severe organic brain disease. Mu waves, also known as the comb or wicket rhythm, appear in bursts at 9-11 Hz. This activity appears to be associated with the motor cortex and is diminished by movement or the intention to move. Lambda waves are large electropositive sharp or saw-toothed waves that appear mainly from the occipital region and are associated with visual attention. Vertex waves are electronegative waves of 100 µV amplitude which appear in normal individuals, especially children, in the absence of overt stimulation. These waves have been observed to have a higher incidence in patients with epilepsy or other encephalopathy.
  • Alpha waves can also be volitionally manipulated. Alpha activity appears on closing the eyes or with defocused attention, and is suppressed by light or normal attentive activity. Thus, most people can learn to produce bursts or "epochs" of alpha activity and then return to normal beta activity. This behavioral "switch" between beta and alpha activity can be used as the mental command for a brain wave controller. When the signal processor detects the alpha epoch by using an FFT to detect the change in the fundamental frequency of the brain rhythm, an instruction is sent to control an output device.
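The alpha "switch" described here could be sketched with an FFT-based band-power test; the band edges, threshold ratio, and test signals below are illustrative assumptions, not the original signal processor.

```python
import numpy as np

fs = 500  # assumed sample rate (Hz)

def alpha_epoch(window, fs=fs, band=(8.0, 13.0), ratio=2.0):
    """Return True if alpha-band power dominates the rest of the 1-30 Hz range."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    alpha = spectrum[(freqs >= band[0]) & (freqs <= band[1])].sum()
    rest = spectrum[(freqs >= 1.0) & (freqs <= 30.0)].sum() - alpha
    return bool(alpha > ratio * rest)

# one second of synthetic EEG for each state
t = np.arange(fs) / fs
eyes_closed = np.sin(2 * np.pi * 10 * t)      # strong 10 Hz "alpha"
eyes_open = 0.2 * np.sin(2 * np.pi * 20 * t)  # low-amplitude "beta"
```

In a controller loop, a transition of `alpha_epoch` from False to True on successive windows would trigger the output instruction.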
  • The mu wave has been studied since the 1930s and came to be referred to as the "wicket rhythm" since the rounded waves on the EEG record resembled a croquet wicket. In a study in the 1950s, Gian Emilio Chatrian and colleagues showed that the amplitude of this wave could be suppressed by physical movements, and later studies showed that simply the intent to move, or certain other efforts requiring visual or mental activity, would also suppress the amplitude of the mu wave. In Wolpaw and McFarland's lab, subjects can learn to control the amplitude of this waveform by trial and error while visualizing various motor activities, such as smiling, chewing, or swallowing. For different subjects, different images enhance or suppress the voltage of the mu waveform. Upon detection of the voltage change in the mu wave, the system sends output code to drive a cursor up or down on a computer screen. Thus, with a certain amount of feedback training, users can learn to move the cursor with the appropriate mental effort. The researchers hope that this system will eventually provide a communications link for profoundly disabled individuals.
  • In 1987, the authors (Lusted and Knapp) demonstrated an EEG controller configured to switch settings of a music synthesizer. Music was chosen for the controller's output because sound provided a good demonstration of the real-time capabilities of this technology. By wearing a headband that positioned electrodes on the back of the head to detect occipital alpha activity, users controlled a switch that responded to the transitions between beta and alpha epochs. More recently, composer Atau Tanaka of the Stanford Center for Computer Research in Music and Acoustics has used this EEG controller in his performance pieces to switch certain synthesizer functions while generating sounds using EMG signals. Another recent application of the EEG-alpha interface is as a controller for visual keyboard software. In Brazil, Roberto Santini is using a Biomuse system configured to provide him with the EEG switch, since he is immobilized with advanced ALS (amyotrophic lateral sclerosis) and cannot make use of his eye movements to use the EOG controller. With the EEG controller interfaced to the mouse port of his personal computer, Roberto can select letters from the visual keyboard on the screen. The selection process is somewhat laborious because each choice is binary: the word processing software allows him to zoom in on a given letter by dividing the screen in half, so, starting from the full keyboard, as many as 6 steps may be required to move down the branching pattern to select a desired letter. Roberto now writes complete letters and is pleased that he can again communicate with others. Currently, the authors and a few other researchers, notably a group headed by Akira Hiraiwa at the NTT Laboratories in Japan, are continuing development of EEG controllers by using pattern recognition algorithms in an attempt to detect signature patterns of EEG activity which correspond to volitional behaviors.
The eventual aim is to develop a vocabulary of EEG signals that are recognizable by the computer. The process of pattern recognition is similar to that used for EMG gesture recognition; in this case, the "gesture" is a thought pattern or type of visualization. For instance, attempts have been made to train a neural network to recognize subvocalized letters: subjects think a particular letter as though about to speak it, and over many repetitions the neural net is trained to recognize the brain wave pattern that occurs with this behavior. This is a promising technique, but the training period required to reach a high percentage of accuracy in matching letters with brain wave patterns is laborious. As mentioned earlier, another approach to the development of an EEG-computer interface involves the use of an evoked potential (EP) paradigm. Evoked potentials are produced by activating a sensory pathway with a particular type of stimulus, such as a flash of light or a noise burst, and then recording a characteristic waveform from the brain at a particular time interval after the stimulus presentation. Since the characteristic evoked waveform appears at a specific time after the stimulus, researchers can discriminate between the EP and the noise because they know its temporal location in the post-stimulus EEG recording. Other electrical activity occurring before and after the EP latency window can be ignored. Eric Sutter at the Smith-Kettlewell Institute in San Francisco has developed a visual EP controller system for physically handicapped users. The user can select words or phrases from a matrix of flashing squares on a computer screen.
The flashing square on which the user is fixating his or her gaze produces a characteristic EP from a particular portion of the visual cortex, and since the amplitude of the EP produced from the foveal portion (the point of maximal acuity) of the retina is much larger than the response from surrounding retinal areas, the computer can discriminate which word square the user is watching at any given time. Dr. Sutter has implanted electrodes under the scalp to improve the quality of the EEG signal in these patients. This also eliminates the need to put on scalp electrodes for each session, since the patients simply "plug in" their transdermal connection to interface with the computer.
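The latency-window discrimination described above can be sketched as follows: only samples inside an assumed post-stimulus window are examined, and the candidate target whose stimulus-locked average shows the largest EP amplitude in that window is taken as the one the user is fixating. The window bounds and data here are hypothetical, not Dr. Sutter's parameters.

```python
import numpy as np

fs = 500
# assumed latency window around 300 ms post-stimulus (samples 125-175)
ep_window = (int(0.25 * fs), int(0.35 * fs))

def ep_amplitude(post_stimulus, window=ep_window):
    """Mean absolute amplitude inside the EP latency window only."""
    lo, hi = window
    return float(np.abs(post_stimulus[lo:hi]).mean())

def pick_target(stimulus_locked_averages):
    """Index of the candidate whose average shows the largest windowed EP."""
    return int(np.argmax([ep_amplitude(a) for a in stimulus_locked_averages]))
```

Activity outside the window never enters the comparison, which is what lets the EP be separated from the ongoing EEG.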
    1. Brain Computer Interfaces, or Krang's Body
    2. What is an EEG?
       • An electroencephalogram is a measure of the brain's voltage fluctuations as detected from scalp electrodes.
       • It is an approximation of the cumulative electrical activity of neurons.
    3. What is it good for?
       • Neurofeedback
           • treating ADHD
           • guiding meditation
       • Brain Computer Interfaces
           • people with little muscle control (i.e. not enough control for EMG or gaze tracking)
           • people with ALS or spinal injuries
           • high precision
           • low bandwidth (bit rate)
    4. EEG Background
       • 1875 - Richard Caton discovered the electrical properties of the exposed cerebral hemispheres of rabbits and monkeys.
       • 1924 - German psychiatrist Hans Berger discovered alpha waves in humans and coined the term "electroencephalogram".
       • 1950s - W. Grey Walter developed "EEG topography" - mapping the electrical activity of the brain.
    5. Physical Mechanisms
       • EEGs require electrodes attached to the scalp with sticky gel
       • They require a physical connection to the machine
    6. Electrode Placement
       • Standard "10-20 System", with electrodes spaced 10-20% apart
       • Letter for region:
           • F - Frontal Lobe
           • T - Temporal Lobe
           • C - Center
           • O - Occipital Lobe
       • Number for exact position:
           • odd numbers - left
           • even numbers - right
    7. Electrode Placement
       • A more detailed view:
    8. Brain "Features"
       • The user must be able to control the output, either by:
           • using a feature of the continuous EEG output that the user can reliably modify (waves), or
           • evoking an EEG response with an external stimulus (evoked potential)
    9. Continuous Brain Waves
       • Generally grouped by frequency (amplitudes are about 100 µV max):

         Type   | Frequency     | Location               | Use
         -------|---------------|------------------------|------------------------------------------------------------------
         Alpha  | 8-12 Hz       | occipital and parietal | reduced amplitude with sensory stimulation or mental imagery
         Beta   | 12-36 Hz      | parietal and frontal   | can increase amplitude during intense mental activity
         Mu     | 9-11 Hz       | frontal (motor cortex) | diminishes with movement or intention of movement
         Theta  | 4-7 Hz        | temporal and parietal  | correlated with emotional stress (frustration and disappointment)
         Delta  | <4 Hz         | everywhere             | occurs during sleep and coma
         Lambda | sharp, jagged | occipital              | correlated with visual attention
         Vertex |               |                        | higher incidence in patients with epilepsy or encephalopathy
    10. Brain Wave Transformations
       • wave-form averaging over several trials
       • auto-adjustment with a known signal
       • Fourier transforms to detect relative amplitude at different frequencies
    11. Alpha and Beta Waves
       • Studied since the 1920s
       • Found in the parietal and frontal cortex
       • Relaxed - alpha has high amplitude
       • Excited - beta has high amplitude
       • So the transition Relaxed -> Excited appears as Alpha -> Beta
    12. Mu Waves
       • Studied since the 1930s
       • Found in the motor cortex
       • Amplitude suppressed by physical movements, or by the intent to move
       • (Wolpaw et al. 1991) trained subjects to control the mu rhythm by visualizing motor tasks, moving a cursor up and down (1D)
    13. Mu Waves
    14. Mu and Beta Waves
       • (Wolpaw and McFarland 2004) used a linear combination of mu and beta waves to control a 2D cursor.
       • Weights were learned from the users in real time.
       • The cursor moved every 50 ms (20 Hz)
       • 92% "hit rate" in an average of 1.9 s
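As a hedged sketch of the control law this slide describes (not the published parameters), each cursor dimension's velocity can be a learned linear combination of mu and beta band amplitudes, applied every 50 ms; the weights, band features, and layout below are invented for illustration.

```python
import numpy as np

# assumed feature vector: [mu_left, mu_right, beta_left, beta_right]
# one weight row per cursor dimension; in the real system these weights
# were adapted online from user performance
weights = np.array([
    [0.6, -0.6,  0.3, -0.3],   # vertical velocity
    [0.5,  0.5, -0.2, -0.2],   # horizontal velocity
])
bias = np.zeros(2)

def cursor_step(features, dt=0.05):
    """Map one 50 ms feature vector to a (dy, dx) cursor displacement."""
    velocity = weights @ features + bias
    return velocity * dt
```

Called 20 times per second on fresh band-amplitude estimates, this yields the continuous 2D cursor movement the slide refers to.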
    15. Mu and Beta Waves
       • Movie!
    16. Mu and Beta Waves
       • How do you handle more complex tasks?
       • Finite automata, such as this one from (Millán et al. 2004)
    17. P300 (Evoked Potentials)
       • Occurs in response to a significant but low-probability event
       • Appears 300 milliseconds after the onset of the target stimulus
       • Discovered in 1965 (Sutton et al., 1965; Walter, 1965)
       • Focus specific
    18. P300 Experiments
       • (Farwell and Donchin 1988)
       • 95% accuracy at 1 character per 26 s
    19. P300 (Evoked Potentials)
       • (Polikoff et al. 1995) let users control a cursor by flashing control points in 4 different directions
       • Each sample took 4 seconds
       • Samples masked by muscle movements (such as blinks) were discarded
    20. (Polikoff et al. 1995) Results
       • 50% accuracy at ~1/4 Hz
       • 80% accuracy at ~1/30 Hz
    21. VEP - Visual Evoked Potential
       • Detects changes in the visual cortex
       • Similar in use to P300
       • Close to the scalp
    22. Model Generalization (time)
       • EEG models so far haven't adjusted to the changing nature of the user.
       • (Curran et al. 2004) have proposed using adaptive filtering algorithms to deal with this.
    23. Model Generalization (users)
       • Many manual adjustments must still be made for each person (such as EEG placement)
       • Currently, users have to adapt to the system rather than the system adapting to the users.
       • Current techniques learn a separate model for each user.
    24. Model Generalization (users)
       • (Müller 2004) applied typical machine learning techniques to reduce the need for training data:
       • Support Vector Machines (SVM) and Regularized Linear Discriminant Analysis (RLDA)
       • This is only the beginning of applying machine learning to BCIs!
    25. BCI Examples - Communication
       • Farwell and Donchin (1988) let the user select a command by looking for P300 signals when the desired command flashed
    26. BCI Examples - Prostheses
       • (Wolpaw and McFarland 2004) allowed a user to move a cursor around a 2-dimensional screen
       • (Millán et al. 2004) allowed a user to move a robot around a room.
    27. BCI Examples - Music
       • 1987 - Lusted and Knapp demonstrated an EEG controlling a music synthesizer in real time.
       • Atau Tanaka (Stanford Center for Computer Research in Music and Acoustics) uses it in performances to switch synthesizer functions while generating sound using EMG.
    28. In Review…
       • Brain Computer Interfaces:
           • allow those with poor muscle control to communicate and control physical devices
           • offer high precision (can be used reliably)
           • require somewhat invasive sensors
           • require extensive training (poor generalization)
           • have low bandwidth (today 24 bits/minute, or at most 5 characters/minute)
    29. Future Work
       • Improving physical methods for gathering EEGs
       • Improving generalization
       • Improving knowledge of how to interpret waves (not just a "new phrenology")
    30. References
       • Toward a P300-based Computer Interface - James B. Polikoff, H. Timothy Bunnell, & Winslow J. Borkowski Jr., Applied Science and Engineering Laboratories, Alfred I. duPont Institute
       • Various papers from PASCAL 2004
       • Original paper on evoked potentials:
    31. Invasive BCIs
       • Have traditionally provided much finer control than non-invasive EEGs (no longer true?)
       • May have ethical/practical issues
       • (Chapin et al. 1999) trained rats to control a "robot arm" to fetch water
       • (Wessberg et al. 2000) allowed primates to accurately control a robot arm in 3 dimensions in real time.