BRAIN-COMPUTER INTERFACING TO DETECT STRESS
DURING MOTOR IMAGERY TASKS
Project Report submitted in partial fulfillment of
the requirements for the degree of
BACHELOR OF TECHNOLOGY
In
APPLIED ELECTRONICS AND INSTRUMENTATION ENGINEERING
Of
MAULANA ABUL KALAM AZAD UNIVERSITY OF TECHNOLOGY
By
ABHISEK SENGUPTA Roll No.-02 WBUT Roll No.-10905514002
ARNAB BAIN Roll No.-11 WBUT Roll No.-10905514011
DEBOSHRUTI BANERJI Roll No.-15 WBUT Roll No.-10905514015
MAHIM MALLICK Roll No.-21 WBUT Roll No.-10905514021
PARAMITA DEY Roll No.-27 WBUT Roll No.-10905514027
TETASH BASU Roll No.-48 WBUT Roll No.-10905514048
Under the guidance of
Prof. (Dr.) Anuradha Saha
DEPARTMENT OF APPLIED ELECTRONICS AND INSTRUMENTATION ENGINEERING
NETAJI SUBHASH ENGINEERING COLLEGE
TECHNO CITY, GARIA, KOLKATA –700152
Academic year of pass out 2017-18
CERTIFICATE
This is to certify that this project report titled Brain-Computer Interfacing To Detect Stress
During Motor Imagery Tasks submitted in partial fulfillment of requirements for award of the
degree Bachelor of Technology (B. Tech) in AEIE of Maulana Abul Kalam Azad University of
Technology is a faithful record of the original work carried out by,
ABHISEK SENGUPTA Roll no.10905514002 Regd. No. 141090110730 & 2014-15
ARNAB BAIN Roll no.10905514011 Regd. No. 141090110739 & 2014-15
DEBOSHRUTI BANERJI Roll no.10905514015 Regd. No. 141090110743 & 2014-15
MAHIM MALLICK Roll no.10905514021 Regd. No. 141090110749 & 2014-15
PARAMITA DEY Roll no.10905514027 Regd. No. 141090110755 & 2014-15
TETASH BASU Roll no.10905514048 Regd. No. 141090110778 & 2014-15
under my guidance and supervision.
It is further certified that the report contains no material which, to a substantial extent, has been submitted for
the award of any degree or diploma in any institute, or has been published in any form, except for
assistance drawn from other sources, for which due acknowledgement has been made.
___________
Date…... Guide’s signature
Prof.(Dr.) Anuradha Saha
Sd/__________________
HOD-Incharge: Prof. Sumitesh Majumder
APPLIED ELECTRONICS AND INSTRUMENTATION ENGINEERING
NETAJI SUBHASH ENGINEERING COLLEGE
TECHNO CITY, GARIA, KOLKATA – 700 152
DECLARATION
We hereby declare that this project report titled Brain-Computer Interfacing To Detect Stress
During Motor Imagery Tasks is our own original work, carried out as undergraduate students at
Netaji Subhash Engineering College, except to the extent that assistance from other sources is
duly acknowledged. All sources used for this project report have been fully and properly cited. It
contains no material which to a substantial extent has been submitted for the award of any
degree/diploma in any institute or has been published in any form, except where due
acknowledgement is made.
Student’s names: Signatures: Dates:
……………………….. ……………………….. ……………………
……………………….. ……………………….. ……………………
……………………….. ……………………….. ……………………
……………………….. ……………………….. ……………………
……………………….. ……………………….. ……………………
……………………….. ……………………….. ……………………
CERTIFICATE OF APPROVAL
We hereby approve this dissertation titled
BRAIN-COMPUTER INTERFACING TO DETECT STRESS DURING
MOTOR IMAGERY TASKS
carried out by
ABHISEK SENGUPTA Roll no.10905514002 Regd. No. 141090110730 & 2014-15
ARNAB BAIN Roll no.10905514011 Regd. No. 141090110739 & 2014-15
DEBOSHRUTI BANERJI Roll no.10905514015 Regd. No. 141090110743 & 2014-15
MAHIM MALLICK Roll no.10905514021 Regd. No. 141090110749 & 2014-15
PARAMITA DEY Roll no.10905514027 Regd. No. 141090110755 & 2014-15
TETASH BASU Roll no.10905514048 Regd. No. 141090110778 & 2014-15
under the guidance of
Prof.(Dr.) Anuradha Saha
of Netaji Subhash Engineering College, Kolkata in partial fulfillment of requirements for award
of the degree Bachelor of Technology (B. Tech) in Applied Electronics and Instrumentation
Engineering of Maulana Abul Kalam Azad University of Technology
Date…...
Examiners’ signatures:
1. ………………………………………….
2. ………………………………………….
3. ………………………………………….
4. ………………………………………….
5. ………………………………………….
ACKNOWLEDGEMENT
We would like to take this opportunity to acknowledge and thank those who made this work possible
and unforgettable to us. First and foremost, we would like to express our heartfelt gratitude to our
guide Prof. (Dr.) Anuradha Saha, for her constant encouragement, guidance and constructive feedback
throughout the research and the preparation of this report. Her support for our participation in the experiments
is deeply appreciated.
We would like to express our sincere gratitude to Dr. Hirikesh Mondal, the Director and Prof. A. K.
Ghosh, the Principal of Netaji Subhash Engineering College, for providing a highly supportive research
environment.
We would also like to thank our external co-supervisors of the Artificial Intelligence Laboratory in
the department of Electronics and Communications Engineering at Jadavpur University, Kolkata for
their helpful collaboration, facilities, equipment and assistance in the acquisition of EEG data from the
subjects.
Last but not least, we would also like to thank our friends for their support and encouragement;
without it, this work would not have been possible.
ABHISEK SENGUPTA
ARNAB BAIN
DEBOSHRUTI BANERJI
MAHIM MALLICK
PARAMITA DEY
TETASH BASU
Date…………………
ABSTRACT
Brain-Computer Interfacing to Detect Stress during Motor Imagery Tasks is an emerging
technology for the rehabilitation of motor deficits.
HIGHLIGHTS:
• BCIs permit reintegration of the sensory-motor loop by accessing brain information.
• Motor imagery based BCIs appear to be an effective approach for early rehabilitation.
• This technology does not require residual motor activity and promotes neuroplasticity.
• BCIs for rehabilitation tend towards implantable devices plus stimulation systems.
A malfunction of sensory-motor integration provokes a wide variety of neurological disorders, which in
many cases cannot be treated with conventional medication or with existing therapeutic technology. A
brain-computer interface (BCI) is a tool that permits reintegration of the sensory-motor loop by accessing
brain information directly. A promising and widely investigated application of BCI has been in the field of
motor rehabilitation. It is well known that motor deficits are among the major disabilities with which the
worldwide population lives. Therefore, this report presents the foundations of motor-rehabilitation BCIs and
reviews recent research (especially from 2007 to date) in order to evaluate the suitability and reliability of
this technology. Although BCI for post-stroke rehabilitation is still in its infancy, the tendency is towards the
development of implantable devices that encompass a BCI module plus a stimulation system.
TABLE OF CONTENTS
1. An Introduction to Brain-Computer Interfacing
1.1 An overview of Brain-Computer Interfacing (BCI)
1.2 Types of BCI
1.3 Brain Map and Brain-Imaging Techniques
1.4 EEG Device
1.5 Components of a BCI System
2. Stress Detection during Motor-Imagery Tasks
2.1 Defining Stress and Motor Imagery (MI)
2.2 Decoding of Stress and MI
2.3 Brain Signals for Decoding MI
2.4 Features Used for Decoding MI
2.5 Classifiers Used for Decoding MI
2.6 Performance Analysis in MI-Based BCI Research
3. Stress Detection of Vehicle Drivers during Driving-A Case Study
3.1 Problem Formulation
3.2 Experimental Framework
3.3 Real-Time Signal Acquisition
3.4 Feature Extraction and Selection
3.5 Classifier Validation and Performance
4. Conclusions and Future Scope
4.1 Self-Review of the Work
4.2 Future Research directions
LIST OF FIGURES
Fig. No. Description Page No.
1.1 Brain computer interfacing 11
1.2 Different brain lobes and their association with their cognitive abilities. 14
1.3 A typical EEG based BCI system 16
1.4 EEG acquisition device 18
1.5 Components of brain-computer interfacing 19
2.1 Various frequency components of EEG 32
3.1 Experimental Apparatus -Schematic Diagram 40
3.2 Experimental Apparatus - photographs of setup: a) subject 1 with car simulation, b) electrode placement EEG system, c) subject 1 with brake and accelerator 41
3.3 EEG signal acquisition during the experiment 42
3.4 Screenshot of EEG signal showing mild stress levels 42
3.5 15 channel baseline plots 43
3.6 Mean plot of the 15 channels 43
3.7 Power spectral density plots of 0.2 seconds EEG data 44
3.8 a) Raw EEG signal b) Fourier Transformed Signal 45
3.9 Area under different sub-bands of the frequency spectrum (Z set) 46
3.10 4 wavelets obtained for 1000x15 raw EEG data 47
3.11 Kalman filter analysis of EEG data 52
3.12 Gaussian membership curve 53
3.13 Gaussian curve from PSD average values 54
3.14 Hjorth Complexity Gaussian plot during alarmingly stressed video 56
3.15 Wavelet Coefficient Gaussian plot during alarmingly stressed video 56
3.16 Kalman filter Gaussian plot during alarmingly stressed video 57
3.17 Power Spectral Density Gaussian plot during alarmingly stressed video 57
3.18 Power Spectral Density Gaussian plot during moderately stressed video 58
3.19 Wavelet coefficient Gaussian plot during moderately stressed video 58
3.20 Kalman filter Gaussian plot during moderately stressed video 59
3.21 Hjorth Complexity Parameter Gaussian plot during moderately stressed video 59
3.22 Power Spectral Density Gaussian plot during relaxed video 60
3.23 Kalman filter Gaussian plot during relaxed video 60
3.24 Wavelet coefficient Gaussian plot during relaxed video 61
LIST OF TABLES
Table No. Description Page No.
2.1 Characteristics of EEG Bands 31
3.1 Peaks of Signals in Fig. 3.8(b) 45
3.2 Average Values of 10 PSD Graphs 54
3.3 Minimum and Maximum Values of Features for Relaxed Video 62
3.4 Minimum and Maximum Values of Features for Moderately Stressed Video 62
3.5 Minimum and Maximum Values of Features for Alarmingly Stressed Video 63
3.6 tnorm for different stress levels 63
Chapter 1
An Introduction to Brain-Computer Interfacing
This chapter provides a general introduction to brain-computer interfacing (BCI), with the aim of understanding the
relation between brain-signal processing and the cognitive tasks performed. The chapter begins with an
overview and the types of BCI, and gradually progresses through the brain map and the different modalities of
brain signaling/imaging techniques. The next part of the chapter includes a detailed description of the EEG
and fNIRS devices, which are used in the subsequent chapters. The latter part of the chapter explains
the major components of a BCI system, such as filtering, artifact removal, low-level feature extraction and
classification.
1.1 OVERVIEW OF BRAIN COMPUTER INTERFACING
The growing power of modern computers, together with our improving understanding of the human brain, is
moving us closer to turning some spectacular science fiction into reality. Imagine
transmitting signals directly to someone's brain that would allow them to see, hear or feel specific
sensory inputs. Consider the potential to manipulate computers or machinery with nothing more than
a thought. This is not only about convenience: for severely disabled people, the development of a brain-computer
interface (BCI) could be the most important technological breakthrough in decades.
The reason a BCI works at all is because of the way our brains function. Our brains are filled
with neurons, individual nerve cells connected to one another by dendrites and axons. Every time we
think, move, feel or remember something, our neurons are at work. That work is carried out by small
electric signals that zip from neuron to neuron as fast as 250 mph. The signals are generated by
differences in electric potential carried by ions on the membrane of each neuron.
Although the paths the signals take are insulated by something called myelin, some of the electric
signal escapes. Scientists can detect those signals, interpret what they mean and use them to direct a
device of some kind. It can also work the other way around. For example, researchers could figure
out what signals are sent to the brain by the optic nerve when someone sees the colour red. They could
rig a camera that would send those exact signals into someone's brain whenever the camera saw red,
allowing a blind person to "see" without eyes.
Fig. 1.1 Brain computer interfacing
1.2 TYPES OF BCI
BCI is of three types: i) invasive, ii) partially invasive and iii) non-invasive. In invasive BCI,
electrodes are implanted directly into the grey matter of the brain. Although this produces the highest-quality
signals among BCI devices, it is prone to scar-tissue build-up, causing the signal to
become weaker, or even non-existent, as the body reacts to the foreign object in the brain. Partially
invasive BCI involves electrodes implanted inside the skull but resting outside the brain. This method
produces better-resolution signals than non-invasive BCIs and also carries a lower risk of scar-tissue
formation than fully invasive BCIs. Lastly, in non-invasive BCI, electrodes are placed at
specified positions on the scalp. It has the disadvantage of relatively poor spatial resolution,
but offers good temporal resolution along with the merits of cost effectiveness, ease of wearing and
no requirement for surgery.
1.3 BRAIN MAP AND BRAIN IMAGING TECHNIQUES
BCIs measure brain activity, process it, and produce control signals that reflect the user’s intent. To
understand BCI operation better, one has to understand how brain activity can be measured and which
brain signals can be utilized. In this section, we focus on the brain map and the most important
recording methods and brain signals.
The human brain comprises on the order of a hundred billion nerve cells, called neurons, which
individually/in groups are responsible for executing complex mental tasks like interpretation of
stimuli, memory encoding and recall, motor planning /execution and coordination of multi-
sensory/sensory-motor interactions. Apart from this, the human brain is also involved in controlling most
of our biological activities, including respiration rate, cardiac activity, muscular activity, and many
others. The neurons in the brain and also in the rest of our nervous system act partly electrically and
partly chemically for stimuli processing, signal transduction and motor activity. A look inside the
neuron reveals that the cell-body of the neuron yields a linear combination of the received electrical
stimuli for transfer to the pre-synaptic region. The accumulated electrical stimuli next trigger the
synapse to synthesize the neurotransmitters for transfer of information from the pre-synaptic region
to the post-synaptic region. Thus, communication of information inside a neuron is performed by both
electrical and chemical means.
The brain is divided into three main modules: the cerebrum, the cerebellum and the pons. The cerebrum
is the largest part of the human brain and carries out the highest-level functions. The second part, the
cerebellum, is an area of the hindbrain. The third part, the pons, is the portion of the brain stem located
above the medulla oblongata and below the midbrain. The cerebrum is covered with a cortical layer having
a convoluted topography, called the cerebral cortex, a sheet of neural tissue that packs a large surface
area within the skull by folding in on itself. The cerebral cortex is divided
into almost symmetrical right and left hemispheres. Each hemisphere consists of different lobes such
as frontal, parietal, temporal and occipital lobes. Besides the four lobes, neocortical areas of the brain
including the primary motor and sensorimotor cortices, play a major role during motor planning/execution
and tactile perception, respectively. Figure 1.2 shows the different brain lobes and their association
with cognitive abilities, which are briefly described in this section.
1. The frontal lobe is one of the important lobes of the cerebral hemisphere. It is located in the frontal
part of the brain. The central sulcus separates the frontal lobe from the parietal lobe, whereas the Sylvian
sulcus separates the frontal lobe from the temporal lobe. The frontal lobe and its pre-frontal region
are responsible for problem solving tasks, physical reaction, abstract thinking, planning, short
term memory task and motivation [1]. The anterior portion of frontal lobe is known as pre-frontal
area, which is associated with olfaction recognition [2], [3] and emotion recognition [4], [5].
2. The parietal lobe extends from the central sulcus nearly to the occipital lobe and contains the
postcentral gyrus, which is responsible for processing tactile and proprioceptive sensory
information from the contralateral side of the body [6]. This lobe is also involved in
planning/navigation and spatial sense.
3. The temporal lobe, which is the largest brain lobe (containing approximately 17% of the cerebral
cortex) [7], is situated below the frontal lobe, and is separated from the frontal lobe by sylvian
sulcus [8]. The temporal lobe controls auditory and olfactory information processing, semantic
memory, and perception of spoken or written language [8].
Fig. 1.2 Different brain lobes and their association with their cognitive abilities
4. The occipital lobe is the smallest lobe in the brain. It is situated behind the parietal lobe. The main
function of this lobe is visual reception, colour recognition and visuo-spatial processing [9].
Acquisition of brain activity broadly falls under two categories: i) without surgery, and ii) with
surgery.
❖ Measuring Brain Activity (Without Surgery)
Brain activity produces electrical and magnetic activity. Therefore, sensors can detect different types
of changes in electrical or magnetic activity, at different times over different areas of the brain, to
study brain activity. Most BCIs rely on electrical measures of brain activity and rely on sensors placed
over the head to measure this activity. Electroencephalography (EEG) refers to recording electrical
activity from the scalp with electrodes. It is a very well-established method, which has been used in
clinical and research settings for decades. EEG equipment is inexpensive, lightweight, and
comparatively easy to apply. Temporal resolution, meaning the ability to detect changes within a
certain time interval, is very good. However, the EEG is not without disadvantages: The spatial
(topographic) resolution and the frequency range are limited. The EEG is susceptible to so-called
artifacts, which are contaminations in the EEG caused by other electrical activities. Examples are
bioelectrical activities caused by eye movements or eye blinks (electrooculographic activity, EOG)
and from muscles (electromyographic activity, EMG) close to the recording sites. External
electromagnetic sources such as the power line can also contaminate the EEG. Furthermore, although
the EEG is not very technically demanding, the setup procedure can be cumbersome. To achieve
adequate signal quality, the skin areas that are contacted by the electrodes have to be carefully
prepared with special abrasive electrode gel. Because gel is required, these electrodes are also called
wet electrodes.
The number of electrodes required by current BCI systems ranges from only a few to more than
100 electrodes. Most groups try to minimize the number of electrodes to reduce setup time and hassle.
Since electrode gel can dry out and wearing the EEG cap with electrodes is not convenient or
fashionable, the setting up procedure usually has to be repeated before each session of BCI use. From
a practical viewpoint, this is one of largest drawbacks of EEG-based BCIs. A possible solution is a
technology called dry electrodes. Dry electrodes do not require skin preparation nor electrode gel.
This technology is currently being researched, but a practical solution that can provide signal quality
comparable to wet electrodes is not in sight at the moment. A BCI analyzes ongoing brain activity for
brain patterns that originate from specific brain areas. To get consistent recordings from specific
regions of the head, scientists rely on a standard system for accurately placing electrodes, which is
called the International 10–20 System [10]. It is widely used in clinical EEG recording and EEG
research as well as BCI research. The name 10–20 indicates that the most commonly used electrodes
are positioned at intervals of 10, 20, 20, 20, 20, and 10% of the total nasion-inion distance. The other electrodes are
placed at similar fractional distances.
The inter-electrode distances are equal along any transverse (from left to right) and antero-posterior
(from front to back) line and the placement is symmetrical. The labels of the electrode positions are
usually also the labels of the recorded channels. For example, if an electrode is placed at site C3, the
recorded signal from this electrode is typically also denoted as C3. The first letters of the labels give
a hint of the brain region over which the electrode is located: Fp – pre-frontal, F – frontal, C – central, P
– parietal, O – occipital, T – temporal. Fig. 1.3 provides a typical EEG based BCI system that consists
of an electrode cap with electrodes, cables that transmit the signals from the electrodes to the bio-
signal amplifier, a device that converts the brain signals from analog to digital format, and a computer
that processes the data as well as controls and often even runs the BCI application.
Fig. 1.3 A typical EEG based BCI system
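As a small illustration of the electrode-label convention just described, the sketch below maps a few 10-20 channel names to their cortical regions using only the letter prefix of the label; the channel list itself is an arbitrary example.

```python
# Minimal sketch: inferring the cortical region from a 10-20 electrode label prefix.
REGION_BY_PREFIX = {"Fp": "pre-frontal", "F": "frontal", "C": "central",
                    "P": "parietal", "O": "occipital", "T": "temporal"}

def region_of(label):
    # Check the two-letter prefix (Fp) before the single-letter ones
    for prefix in sorted(REGION_BY_PREFIX, key=len, reverse=True):
        if label.startswith(prefix):
            return REGION_BY_PREFIX[prefix]
    return "unknown"

print([region_of(ch) for ch in ["Fp1", "F3", "C3", "P4", "O2", "T5"]])
# ['pre-frontal', 'frontal', 'central', 'parietal', 'occipital', 'temporal']
```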
❖ Measuring Brain Activity (With Surgery)
The techniques discussed in the last section are all non-invasive recording techniques. For example,
there is no need to perform surgery or even break the skin. In contrast, invasive recording methods
require surgery to implant the necessary sensors. This surgery includes opening the skull through a
surgical procedure called a craniotomy and cutting the membranes that cover the brain. When the
electrodes are placed on the surface of the cortex, the signal recorded from these electrodes is called
the electrocorticogram (ECoG). ECoG does not damage any neurons because no electrodes penetrate
the brain. The signal recorded from electrodes that penetrate brain tissue is called intra-cortical
recording. Invasive recording techniques combine excellent signal quality, very good spatial
resolution, and a higher frequency range. Artifacts are less problematic with invasive recordings.
Further, the cumbersome application and re-application of electrodes as described above is
unnecessary for invasive approaches. Intra-cortical electrodes can record the neural activity of a single
brain cell or small assemblies of brain cells. The ECoG records the integrated activity of a much larger
number of neurons that are in the proximity of the ECoG electrodes. However, any invasive technique
has better spatial resolution than the EEG. Clearly, invasive methods have some advantages over non-
invasive methods. However, these advantages come with the serious drawback of requiring surgery.
Ethical, financial, and other considerations make neurosurgery impractical except for some users who
need a BCI to communicate. Even then, some of these users may find that a noninvasive BCI meets
their needs. It is also unclear whether both ECoG and intracortical recordings can provide safe and
stable recording over years. Long term stability may be especially problematic in the case of intra-
cortical recordings. Electrodes implanted into the cortical tissue can cause tissue reactions that lead
to deteriorating signal quality or even complete electrode failure. Research on invasive BCIs is
difficult because of the cost and risk of neurosurgery. For ethical reasons, some invasive research
efforts rely on patients who undergo neurosurgery for other reasons, such as treatment of epilepsy.
Studies with these patients can be very informative, but it is impossible to study the effects of training
and long-term use because these patients typically have an ECoG system for only a few days before
it is removed.
1.4 EEG DEVICE
Electroencephalography (EEG) is an electrophysiological monitoring method to record the
spontaneous electrical activity of the brain over a period of time. EEG measures voltage fluctuations
resulting from ionic current within the neurons of the brain. It is typically noninvasive, with the
electrodes placed along the scalp, although invasive electrodes are sometimes used in specific
applications. The amplitude of the EEG is about 100 µV when measured on the scalp, and about 1-2
mV when measured on the surface of the brain. EEG signal is sub-divided into a number of specific
frequency bands including i) delta (~0.1-3Hz), ii) theta (~3-7Hz), iii) alpha (~7-13Hz), iv) mu (~7-
13Hz), v) beta (~13-30Hz) and vi) gamma bands (>30Hz).
The versatile and expandable EEG-1200 model from NIHON KOHDEN is equipped for all in-
patient/human-subject EEG diagnostics applications. Ranging from standard equipment for routine
EEG to the highest clinical discipline of intracranial long-term monitoring, EEG-1200 provides the
ideal basis for customized configuration. Fig. 1.4 shows the stand-alone EEG data-acquisition
device as presented on the official website. Our procured version consists of a 32-channel amplifier, of
which 21 channels are dedicated to measuring EEG signals and the remaining 11 channels are dedicated
to SpO2, EtCO2 and DC measurements.
Fig 1.4 EEG Acquisition Device
1.5 COMPONENTS OF BCI SYSTEM
The main aim of an EEG-based BCI system is to create a communication channel between the user’s
intention and an external device (e.g. computers, prosthesis) without any muscular intervention.
Unfortunately, while executing an assigned task, the human brain occasionally entertains parallel
thoughts, which may appear as cross-talk in the EEG signals acquired to examine the
targeted task. In case the frequency bands of the EEG signals for the non-targeted parallel tasks do not
overlap with those of the targeted task, the frequency band for the targeted task can be separated from
the parallel thoughts by filtering. Since the EEG signal is of very low frequency and the pass bands for
individual tasks are narrow, we opt for digital filtering rather than conventional analog filtering.
Fig. 1.5 presents all the steps of a BCI.
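As a concrete illustration of the digital filtering step (a minimal sketch, not the exact pipeline used in this work), the snippet below applies a zero-phase Butterworth band-pass filter to one EEG channel; the sampling rate and the alpha-band edges are assumed values.

```python
# Minimal sketch: digital band-pass filtering of one EEG channel (assumed parameters).
import numpy as np
from scipy.signal import butter, filtfilt

fs = 200.0                      # assumed sampling rate in Hz
low, high = 8.0, 13.0           # assumed band edges (alpha band) in Hz

def bandpass(signal, fs, low, high, order=4):
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    # filtfilt applies the filter forward and backward, giving zero phase distortion
    return filtfilt(b, a, signal)

raw = np.random.randn(10 * int(fs))     # placeholder for a 10-second raw EEG trace
alpha = bandpass(raw, fs, low, high)
```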
The next step that follows digital filtering is feature extraction. Feature extraction involves
determining the most appropriate features of the acquired EEG that best resembles the EEG signal for
a given task. In other words, true features of an EEG signal are those, which directly/indirectly can
help in reconstruction of the EEG signal. Unfortunately, there is no standard technique to extract the
true features of an EEG for a given task. The usual practice thus is to determine a set of standard
features that can capture one or more characteristics of the EEG signal. If the list of features is too
long, we need to select a subset of them. In fact, there is an extensive literature on feature
selection; a few approaches that deserve mention include forward search, backward search, and
evolutionary search algorithms [11], [12]. The motivation of these algorithms is to identify a subset
of features that best represents the EEG signals at the sampled time-points.
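The greedy forward-search idea mentioned above can be sketched as follows; the choice of classifier, the feature matrix X (trials x features) and the labels y are assumptions for illustration only.

```python
# Minimal sketch of greedy forward feature selection (assumed data X, y).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def forward_search(X, y, max_features=5):
    remaining = list(range(X.shape[1]))
    selected = []
    while remaining and len(selected) < max_features:
        # Score every candidate feature when added to the current subset
        scores = []
        for f in remaining:
            cols = selected + [f]
            acc = cross_val_score(LinearDiscriminantAnalysis(), X[:, cols], y, cv=5).mean()
            scores.append((acc, f))
        best_acc, best_f = max(scores)
        selected.append(best_f)
        remaining.remove(best_f)
    return selected
```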
Fig. 1.5 Components of a brain-computer interfacing system
Most BCI techniques terminate with a classification algorithm that aims at classifying the target
task/class from the rest. Usually, BCI problems are formulated as two-class classification
problems, unless the problem is by nature a multi-class classification problem. In a two-class
classification task, the classifier produces a binary output: one for the target class and zero for the rest.
A multi-class classification problem, such as classification of aroma from EEG signatures, is again
usually solved as a sequence of two-class classification problems. For example, suppose A, B and C are
three classes. We first use a binary classifier to separate A from non-A; the non-A samples are then
classified into classes B and C. Had there been more than three classes, the classification tree
would be deeper, but it would follow the same principle.
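A minimal sketch of this cascaded two-class scheme is given below, using an off-the-shelf SVM as the binary classifier purely for illustration; the class labels 'A', 'B' and 'C' follow the example above.

```python
# Minimal sketch of the cascaded two-class scheme described above (classes 'A', 'B', 'C').
import numpy as np
from sklearn.svm import SVC

def train_cascade(X, y):
    X, y = np.asarray(X), np.asarray(y)
    clf_a = SVC(kernel="rbf").fit(X, y == "A")                # stage 1: A vs non-A
    mask = y != "A"
    clf_bc = SVC(kernel="rbf").fit(X[mask], y[mask] == "B")   # stage 2: B vs C
    return clf_a, clf_bc

def predict_cascade(clf_a, clf_bc, x):
    x = np.atleast_2d(x)
    if clf_a.predict(x)[0]:
        return "A"
    return "B" if clf_bc.predict(x)[0] else "C"
```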
Occasionally, a few BCI systems require additional steps to realize a controller to execute specific
control tasks based on the results of classification. For example, if the classifier response is
class A, we may need to turn a motor on. If it is class B, we may turn it off. More sophisticated control
logic is also adopted in recent BCI systems [11], where the motor is activated based on the
classification of subjective motor imagery and stopped based on the occurrence of error when the
motor-shaft passes the fixed target position.
Chapter 2
Stress Detection during Motor Imagery Tasks
The chapter begins with the definition of stress and motor imagery, and gradually progresses through
different brain signals including P300 event-related potential, event-related de-
synchronization/synchronization, slow cortical potential, steady-state visual evoked potential and
error-related potential. This chapter provides a brief review of current research directions and the
scope of EEG signals in decoding of motor imagery. This chapter also includes feature extraction
techniques, such as discrete wavelet transforms, power-spectral density, adaptive autoregressive
parameters, Hjorth parameters, and common spatial patterns. The chapter provides a discussion on
EEG signal classification to decode cognitive activities. The list of classifiers includes Linear
Discriminant Analysis (LDA), Support Vector Machine (SVM), Multi-layer Perceptron (MLP),
Hidden Markov Model (HMM), the k-nearest neighbor (kNN) algorithm and the Naïve Bayes classifier. The
chapter ends with an outline of well-known performance-analysis metrics.
2.1 DEFINING STRESS AND MOTOR IMAGERY (MI)
Stress is primarily a physical response, which releases a complex mix of hormones and chemicals
such as adrenaline, cortisol and norepinephrine to prepare the body for physical action. This causes a
number of reactions, from blood being diverted to muscles to shutting down unnecessary bodily
functions such as digestion.
Motor imagery is a cognitive process in which a subject imagines that he or she performs a
movement without actually performing the movement and without even tensing the muscles. It is a
dynamic state during which the representation of a specific motor action is internally activated without
any motor output. In other words, motor imagery requires the conscious activation of brain regions
that are also involved in movement preparation and execution, accompanied by a voluntary inhibition
of the actual movement [13].
2.2 DECODING OF STRESS AND MI
❖ Decoding of Stress
Traffic accidents all over the world are increasing day-by-day, posing a serious danger to the driver’s
life and the lives of other people. Crashes are among the top three causes of death throughout a
person’s lifetime. This is mainly due to diminished driver vigilance, which causes a decline in
perception, recognition and vehicle-control abilities. Fatigue, stress, and our emotions have a
serious effect on driving, causing serious impairments that we may not even be aware of.
For this reason, developing systems that actively monitor the driver’s level of vigilance and
alert the driver to any unsafe driving condition is essential for accident prevention. Many efforts
have been made to develop active-safety automatic car-control systems to reduce the number
of automobile accidents due to stress, fatigue, drunkenness, sleepiness or health problems.
In [14], electroencephalogram (EEG) signals are used for a drowsiness-detection system which
identifies suitable driver-related and/or vehicle-related variables that are correlated with the driver’s
level of drowsiness. Electrodes are placed on the scalp and send signals to a computer, which records
the results. In essence, a brain-computer interface (BCI) is created which enables control of devices
through cerebral activity alone, without using muscles. In order to analyze drowsiness
automatically, the EEG power spectrum can be computed using the Fast Fourier Transform (FFT) or
the wavelet transform in MATLAB.
The circuit involves an EEG detection circuit, a microcontroller circuit and a processing circuit. The
microcontroller circuit receives the EEG signal and generates a control signal that is sent to the processing
unit. According to the control signal, the processing unit processes and analyses the EEG signal so as
to assess the fatigue level of the person.
In [15], early symptoms of fatigue in train drivers are detected by an image-processing method that
compares frames in a video and uses indirectly estimated human features. At the onset of fatigue due to
a severe medical problem or sleep deprivation, an immediate message is transferred to the control room
using the system’s GSM module, indicating the status of the driver. Heart-rate sensors are also added to
the fatigue-detection system. The technique focuses on the different states of the person while driving,
i.e. awake, drowsy and asleep. An alarm or buzzer, known as the Automatic Alarm System (AAS), is
sounded in the driver’s cabin if the train passes a caution or stop signal given by the control room.
In many scenarios the driver may be suffering from a heart attack or another emergency; such a
condition is also detected easily with the help of a heart-rate detection system using a heart-beat sensor.
An accelerometer is also used to detect the motion of the face, since a person in a drowsy
or fatigued state has a range of facial motion different from a person in a normal state. No
movement of the driver indicates that the driver is asleep or has fallen victim to some kind of unconsciousness.
Eye blinking and the degree of eyelid opening are also factors for detecting driver fatigue, since
the blinking rate of a person who is awake differs from that of a person in a sleepy state, and the
eyelids are wide open when a person is fully awake, as studied in [4].
In [16], remotely located charge-coupled-device cameras equipped with active infrared illuminators
are used to acquire video images of the driver. The levels of alertness of a person are extracted in real-
time. The visual cues employed characterize eyelid movement, gaze movement, head movement and
facial expression. A probabilistic model is developed to model human alertness and to accurately predict
drowsiness or fatigue based on the multiple visual cues obtained.
In [17], stress is detected using eye blinks and brain activity from EEG signals. While driving,
stressful emotions can be triggered in the participant, and eye-blink frequency can be correlated with
experienced stress. Longitudinal differences of two prefrontal-cortex sensors, in combination with
amplitude maps, are used to classify eye blinks. It is hard to generalize the interpretation of
these associations, since eye blinks can differ between persons. Eye blinks of one test subject
are detected, and the eye-blink frequency of subjects is correlated with the experienced level of stress.
Brain activity is significantly higher when doing mental calculations with eyes open as opposed
to with eyes closed. The results of this research, in combination with other stress detectors,
lead to applications that improve transport safety and support other areas where stress levels need to be
monitored.
❖ Decoding of MI
Over the past two decades, motor imagery (MI) has been used to design EEG-based BCI systems that
enable individuals with motor impairments to control various assistive devices, such as wheelchairs,
prosthetic devices, and computers. In fact, an MI task can be defined as a mental process in which an
individual imagines himself/herself performing a specific action without real activation of the
muscles. During MI tasks, various regions in the brain are activated such as primary motor cortex
(M1), primary and secondary sensory areas, pre-frontal areas, superior and inferior parietal lobules,
and dorsal and ventral pre-motor cortices. Therefore, the development of BCI systems that can
effectively analyze brain signals and discriminate between different MI tasks to control neural
prostheses devices has the potential to enhance the quality of life for people with severe motor
disabilities.
Literature reveals that the vast majority of the existing MI EEG-based BCI systems were focused
on differentiating between MI tasks that are associated with four different body parts, including feet,
left hand, right hand, and tongue. Despite the relatively high classification accuracies attained for
classifying MI tasks performed by different body parts, the discrimination between MI tasks within
the same hand is considered challenging. This can be attributed to three limitations associated with
the EEG signals. First, the low spatial resolution of the EEG signals constrains the ability to
discriminate between MI tasks of the same hand that activate similar and close areas in the brain. In
fact, this limitation becomes more pronounced when the MI tasks are associated with the same joint
in the hand, such as wrist movements. Second, due to the volume-conduction effect, EEG signals have
a limited signal-to-noise ratio. This in turn can drastically reduce the ability to discriminate between
EEG signals of different dexterous MI tasks within the same hand, such as fingers- and wrist-related
tasks. Third, the spectral characteristics of the EEG signals are time varying, or non-stationary. The
non-stationary characteristics of EEG signals introduce large intra-trial variations for each subject and
inter-personal variations between subjects, which increase the difficulty to discriminate between the
EEG signals of MI tasks within the same hand. Therefore, traditional time-domain and frequency-
domain representations, which employ the time-invariance assumption, are considered
inadequate to represent EEG signals.
Recently, a few studies have been reported to utilize EEG signals in order to discriminate between
flexion/extension movements of the fingers as well as several wrist movements, including flexion and
extension. The promising results reported in these studies demonstrate the possibility of utilizing EEG
signals to discriminate between MI tasks within the same hand. Nonetheless, these studies have been
conducted using EEG signals acquired from intact subjects, without exploring the capability of
classifying MI tasks within the same hand using EEG signals that are acquired from individuals with
hand amputations.
In addition, a fast-growing number of studies indicated that brain areas engaged in the actual
performance of movements are also active during motor imagery [18]-[25]. Multiple studies showed
the involvement of the premotor, supplementary motor, cingulate and parietal cortical areas, the basal
ganglia, and the cerebellum, not only during the actual execution of a movement but also during the
imagination of a movement [26], [27]. In [28], authors showed that the imagination of different
moving body parts (foot, hand and tongue) activated the precentral gyrus in a somatotopic manner.
Similar results were obtained in [29], where the authors showed that imagery of finger, tongue and toe
movements activated somatotopically organized areas of the primary motor cortex in a systematic
manner, which means that imagery of finger movement activated the finger area, imagery of toe
movements activated the foot zones of the posterior part of the contralateral supplementary motor
area and the contralateral primary motor cortex and imagery of tongue movements activated the
tongue region of the primary motor cortex. These data suggest that the imagined body part is reflected
more or less directly in the pattern of cortical activation. The results are in accordance with an earlier
study [30], where motor imagery influenced the corticospinal excitability in a very specific way. For
example, motor imagery of forearm flexion enhances the MEPs of the m. biceps brachialis, an agonist
during forearm flexion, whereas this was not the case during imagery of forearm extension, where the
m. biceps brachialis acts as an antagonist. Hence, motor imagery does not lead to a generalized
muscular arousal but to movement-specific central activation patterns.
In [31], motor imagery had an effect on the spinal segmental excitability. Nine healthy adult
participants had to perform a series of imagined flexion-extension movements of the fingers. The
results indicated a subthreshold activation of spinal motoneurons. Hence, at this moment there is
ample evidence that motor execution and motor imagery activate overlapping areas in the brain.
Although the majority of the studies are focused on hand/finger or mouth movements, it is, in the
context of the present text, relevant to note that the activation of brain cortical areas during motor
imagery is not limited to hand/finger or mouth movements but that also the imagination of gross
movements results in the activation of relevant areas. In [32], the activation of the pre-supplementary
motor area and the primary motor cortex is shown during imagery of locomotor movements.
Besides the overlap in neural activation between imagery and execution there are also similarities
in the behavioural domain. For instance, the time to complete an imagined movement is known to be
similar to the time needed for actual execution of that movement. This phenomenon is known as
mental isochrony. In [33], authors showed that the time needed to judge whether a rotated picture of
a hand represents a left or a right hand is related to the degree of rotation of that picture. Furthermore,
he showed that when the exposed hand positions were awkward or biomechanically difficult, the
imagined rotation time increased more than for equally rotated hands in biomechanically easy
positions and that the rotation time was similar to real hand rotation time for these positions. The fact
that motor imagery seems to respect the normal biomechanical constraints of real movements
indicates that these tasks are not accomplished by mere visual imagery but must be solved by
imagining the movement of one’s own arm and hand.
2.3 BRAIN SIGNALS FOR DECODING MI
During the execution of different cognitive tasks, the EEG signals produced by the brain exhibit certain
special characteristics, which can be detected from temporal changes in the signal wave shapes. An
EEG signal elicited in response to specific events or stimuli is referred to as an Event-Related Potential
(ERP) [34]. Certain ERPs liberated in response to sensory stimuli with relevant discrete phase-locked
events are referred to as Evoked potential (EP) [35]. EPs are best described by their polarity (positive
or negative) and latency counted from the onset of stimuli. Among the EPs, N100, P200, N200, P300
[36], [37], Slow cortical potential (SCP) [38] and Error-related potential (ErrP) [39] need special
mention. One special type of EP, which exhibits natural responses to visual stimulations at specific
frequencies, is referred to as Steady-state visual-evoked (SSVEP) [38] response. Besides, certain EEG
signals are induced spontaneously as a response to specific cognitive tasks without any stimuli. These
ERPs liberated in absence of any stimuli represent frequency-specific changes and are generally
referred to as non-phase locked ERPs [40]. A well-known example of such ERPs is Event-related de-
synchronization/synchronization (ERD/ERS) [41], where an event-related decrease in power is
noticed at the onset of motor imagery/execution. This phase of the signal is referred to as Event
Related De-synchronization (ERD). After the motor imagination/execution is over, the signal-power
continues increasing until the original signal power is restored. The latter phase of the signal is
referred to as ERS.
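As an illustration of how ERD/ERS is usually quantified (a sketch under assumed parameters, not the exact procedure used later in this report), band power during motor imagery can be compared with band power in a reference window preceding it; negative percentages indicate ERD and positive percentages indicate ERS.

```python
# Minimal sketch: ERD/ERS as percentage band-power change relative to a baseline window.
import numpy as np
from scipy.signal import butter, filtfilt

def band_power(x, fs, low, high, order=4):
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return np.mean(filtfilt(b, a, x) ** 2)

def erd_ers_percent(trial, fs, baseline_sec=2.0, band=(8.0, 13.0)):
    # baseline_sec and the mu/alpha band edges are assumed values
    n_base = int(baseline_sec * fs)
    p_ref = band_power(trial[:n_base], fs, *band)    # reference (pre-imagery) power
    p_act = band_power(trial[n_base:], fs, *band)    # power during motor imagery
    return 100.0 * (p_act - p_ref) / p_ref
```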
2.4 FEATURES USED FOR DECODING MI
A feature represents a distinguishing property, a recognizable measurement, or a functional or
structural component obtained from a section of a pattern. A variety of methods have been widely
used to extract features from EEG signals; among these are time-frequency distributions
(TFD), the Fast Fourier transform (FFT), eigenvector methods (EM), the wavelet transform (WT),
the autoregressive method (ARM), and so on.
Power spectral density (PSD) describes how the signal energy or power is distributed over frequency.
It is a useful concept that allows us to determine the bandwidth of the system. To understand how the
strength of the signal is distributed in the frequency domain, we take the help of filter-based analysis. As
the power of a signal is a measure of its strength, we have used the classical definition, derived
as the Fourier transform of the autocorrelation function, for our study. The strength of the Fourier
transform in signal analysis and pattern recognition is its ability to reveal spectral structures that may
be used to characterize a signal. For example, for a periodic signal, the power is concentrated in
extremely narrow bands of frequencies, indicating the existence of structure and the predictable
character of the signal whereas for a purely random signal the signal power is spread equally in the
frequency domain, indicating the lack of structure in the signal.
Hence, the more correlated or predictable a signal, the more concentrated its power spectrum, and
conversely the more random or unpredictable a signal, the more spread its power spectrum. Therefore,
the power spectrum of a signal can be used to deduce the existence of repetitive structures or correlated
patterns in the signal process. Such information is crucial in detection, decision making and estimation
problems, and in systems analysis.
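The classical definition mentioned above (the PSD as the Fourier transform of the autocorrelation function) can be sketched as follows; the sampling rate and the signal are placeholders, and Welch's averaged-periodogram estimate is included only for comparison.

```python
# Minimal sketch: PSD as the Fourier transform of the autocorrelation (Wiener-Khinchin relation).
import numpy as np
from scipy.signal import welch

fs = 200.0                             # assumed sampling rate in Hz
x = np.random.randn(2000)              # placeholder for one EEG channel
x = x - x.mean()

# Biased autocorrelation estimate, then its FFT gives the PSD (up to scaling)
acf = np.correlate(x, x, mode="full") / len(x)
psd_from_acf = np.abs(np.fft.rfft(acf))
freqs_acf = np.fft.rfftfreq(len(acf), d=1.0 / fs)

# Welch's estimate of the same PSD, for comparison
freqs_w, psd_w = welch(x, fs=fs, nperseg=256)
```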
The Fast Fourier Transform (FFT) is a useful scheme for extracting frequency-domain signal
features. Fourier analysis is extremely useful for data analysis, as it breaks down a signal into
constituent sinusoids of different frequencies. For sampled vector data, Fourier analysis is performed
using the discrete Fourier transform (DFT). The fast Fourier transform (FFT) is an efficient
algorithm for computing the DFT of a sequence.
Since the early days of automatic EEG processing, representations based on a Fourier transform
have been most commonly applied. This approach is based on earlier observations that the EEG
spectrum contains some characteristic waveforms that fall primarily within frequency bands—delta,
theta, alpha, beta and gamma. The oscillatory activity of the spontaneous EEG is typically categorized
into five different frequency bands: delta (0-4 Hz), theta (4-8), alpha (8-12), beta (12-30) and gamma
(30-100 Hz). These frequency bands are suggested to be a result of different cognitive functions.
• Delta (0 -4 Hz): Delta activity is characterized as high amplitude and low frequency. It is usually
associated with the slow-wave sleep. Delta waves represent the onset of deep sleep phases in healthy
adults. In addition, contamination of the eye activity is mostly represented in the delta frequency band.
• Theta (4-8Hz): The generation of theta power is associated with the hippocampus as well as
neocortex. The theta band is associated with deep relaxation or meditation and it has been observed
at the transition stage between wake and sleep. Theta rhythms are suggested to be important for
learning and memory functions, encoding and retrieval which involve high concentration. Theta
oscillations are also associated with the attentional control mechanism in the anterior cingulated
cortex and are often shown to increase with a higher cognitive task demand.
• Alpha (8-12Hz): Alpha band activity is found at the occipital lobe during periods of relaxation or
idling i.e. eyes closed but awake. It is characterized by high amplitude and regular oscillations with a
maximum over parietal and occipital electrodes in the continuous EEG. The modulation of alpha
activity is thought to be a result of resonation or oscillation of the neuron groups. High alpha power
has been assumed to reflect a state of relaxation. However, when the operator devotes more effort to
the task, different regions of the cortex may be recruited in the transient function network leading to
passive oscillation of the local alpha generators in synchrony with a reduction in alpha power. Recent
results suggested that alpha is involved in auditory attention processes and the inhibition of task
irrelevant areas to enhance signal-to-noise ratio. Additionally, some researchers divide the alpha
activity further into sub-bands to achieve a finer grained description of its functionality. For example,
the “mu” band (10-12 Hz) occurs with actual motor movement and intent to move with an associated
diminished activation of the motor cortex.
• Beta (13-30Hz): The beta wave is predominant when the human is wide awake. Spatially, it
predominates in the frontal and central areas of the brain. It has been described that high power in
the beta band is associated with increased arousal and activity, and it has been pointed out that the beta wave
represents cognitive consciousness and active, busy, or anxious thinking. Furthermore, it has been
revealed to reflect visual concentration and the orienting of attention. The beta band can be further
divided into several sub-bands: low beta wave (12.5-15 Hz); middle beta wave (15-18 Hz); high beta
wave (> 18 Hz). These three sub-bands are associated with separate physiologic processes.
• Gamma (>30Hz): The gamma band is the fastest activity in EEG and is thought to be infrequent
during waking states of consciousness. It is reported that gamma waves are associated with the perceptual
binding problem. More specifically, areas of the lateral occipital cortex and fusiform gyrus play an
important role in visual stimulus encoding and show large gamma oscillations differently affected by
attentional modulation. Recent studies reveal that gamma is linked with many other functions such as
attention, learning, memory, and language perception. Additionally, verbal memory formation led to
an increase in gamma oscillations when analyzing intracranial recording data from epilepsy patients.
Table 2.1 provides characteristics of EEG bands.
Table 2.1 Characteristics of EEG Bands
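A hedged sketch of how relative band power can be computed for the rhythms listed above follows; the Welch parameters and the 256 Hz sampling rate are assumptions made only for illustration.

```python
# Minimal sketch: relative band power per EEG rhythm, using the band edges listed above.
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (0, 4), "theta": (4, 8), "alpha": (8, 12),
         "beta": (12, 30), "gamma": (30, 100)}

def relative_band_powers(x, fs):
    # Welch PSD estimate; 2-second segments are an assumed choice
    freqs, psd = welch(x, fs=fs, nperseg=int(2 * fs))
    total = psd.sum()
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() / total
            for name, (lo, hi) in BANDS.items()}

fs = 256.0                                   # assumed sampling rate
eeg = np.random.randn(int(30 * fs))          # placeholder for 30 s of one channel
print(relative_band_powers(eeg, fs))
```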
Wavelet transform (WT) forms a general mathematical tool for signal processing with many
applications in EEG data analysis. Its basic use includes time-scale signal analysis, signal
decomposition and signal compression. Since EEG signal is non-stationary, a suitable way for feature
extraction from the raw data is the use of the time-frequency domain methods like WT which is a
spectral estimation technique in which any general function can be expressed as an infinite series of
wavelets. Since the WT allows the use of variable-sized windows, it gives a more flexible time-frequency
representation of a signal. In order to get finer low-frequency resolution, long time windows are used;
in contrast, in order to get high-frequency information, short time windows are used.
Furthermore, the WT involves a multi-scale structure rather than a single scale. Generally, wavelets are
purposefully crafted to have specific properties that make them useful for signal processing. Wavelets
can be combined, using a "reverse, shift, multiply and integrate" technique called convolution, with
portions of a known signal to extract information from the unknown signal. A related use is for
smoothing/denoising data based on wavelet coefficient thresholding, also called wavelet shrinkage.
By adaptively thresholding the wavelet coefficients that correspond to undesired frequency
components, smoothing and/or denoising operations can be performed. Fig. 2.1 shows various EEG
frequency components generated through the wavelet transform; the delta, theta, alpha, beta, and
gamma rhythms correspond to the wavelet decompositions.
Fig. 2.1 Various frequency Components of EEG
The wavelet transform provides a potentially powerful technique for pre-processing EEG signals
prior to classification. The WT plays an important role in the recognition and diagnostic field: it
compresses the time-varying biomedical signal, which comprises many data points, into a small
number of parameters that represent the signal. There are two categories of WT: the first is
continuous while the other is discrete.
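A minimal sketch of a discrete wavelet decomposition with PyWavelets is shown below; the Daubechies-4 mother wavelet, the 128 Hz sampling rate and the band-to-level mapping are assumptions chosen so that the detail levels roughly line up with the EEG rhythms of Fig. 2.1.

```python
# Minimal sketch: 4-level discrete wavelet decomposition of one EEG channel (PyWavelets).
# With an assumed 128 Hz sampling rate, the detail levels roughly cover:
# D1 ~ 32-64 Hz (gamma), D2 ~ 16-32 Hz (beta), D3 ~ 8-16 Hz (alpha),
# D4 ~ 4-8 Hz (theta), and the approximation A4 ~ 0-4 Hz (delta).
import numpy as np
import pywt

fs = 128.0
x = np.random.randn(int(10 * fs))                  # placeholder for 10 s of EEG

coeffs = pywt.wavedec(x, wavelet="db4", level=4)   # returns [A4, D4, D3, D2, D1]
a4, d4, d3, d2, d1 = coeffs

# Simple features: energy of each sub-band's coefficients
energies = [np.sum(c ** 2) for c in coeffs]
```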
The Hjorth parameters are one way of describing the statistical properties of a signal in the time domain,
and there are three of them: Activity, Mobility, and Complexity. They are also called
“normalized slope descriptors” because they can be defined by means of the first and second derivatives of the signal.
❖ HJORTH ACTIVITY:
The first parameter is a measure of the mean power representing the activity of the signal. The activity
parameter represents the signal power, the variance of a time function. This can indicate the surface
of power spectrum in the frequency domain. This is represented by Eq. 2.1.
Activity = var(x(t))        (the signal power)        (2.1)

where x(t) represents the signal.
❖ HJORTH MOBILITY:
The Mobility parameter represents the mean frequency, or the proportion of the standard deviation of the
power spectrum. It is defined as the square root of the variance of the first derivative of the signal x(t)
divided by the variance of x(t) itself. This is represented by Eq. 2.2.
Mobility = sqrt( var(dx(t)/dt) / var(x(t)) ) = sqrt( Activity(dx(t)/dt) / Activity(x(t)) )        (the mean frequency)        (2.2)
❖ HJORTH COMPLEXITY:
The last parameter gives an estimate of the bandwidth of the signal. The Complexity
parameter represents the change in frequency. The parameter compares the signal's similarity
to a pure sine wave, where the value converges to 1 if the signal is more similar. This is
represented by Eq. 2.3.
Complexity = Mobility(dx(t)/dt) / Mobility(x(t))        (2.3)
While these three parameters contain information about the frequency spectrum of a signal, they also
help analyze signals in the time domain. Furthermore, the time-domain orientation of the Hjorth
representation may prove suitable for situations where ongoing EEG analysis is required. Since the
calculation of Hjorth parameters is based on variance, the computational cost of this method is
considered low compared to other methods.
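Because Eqs. 2.1-2.3 reduce to variances of the signal and its derivatives, the three Hjorth parameters can be computed in a few lines; the sketch below approximates the derivatives with finite differences.

```python
# Minimal sketch: Hjorth Activity, Mobility and Complexity from Eqs. 2.1-2.3.
import numpy as np

def hjorth(x):
    dx = np.diff(x)            # first derivative (finite difference)
    ddx = np.diff(dx)          # second derivative
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / np.var(x))
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity
```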
The Kalman filter is an optimal estimator. It can estimate the past, present and future states of
a system from a set of uncertain observations. Basically, it works recursively on noisy input data and
infers parameters of interest, producing estimates that are better than those obtained from a single
measurement. The process of finding the best estimate from noisy data amounts to filtering out the noise.
Prediction is possible even when the exact nature of the system is not known.
The Kalman filter estimates a process by using a form of feedback control: it estimates the
process state at some time and then obtains feedback in the form of (noisy) measurements. Once the
outcome of the next measurement (including some amount of error) is observed, these estimates are
updated using a weighted average, with more weight being given to estimates that have higher
certainty. Since the algorithm is recursive, it can run in real time using only present input and
previously calculated values; no past information is required.
Kalman filter is widely used in real-time signal processing applications such as guiding, navigating
and controlling of vehicles, particularly aircraft and spacecraft, due to the following reasons:
▪ Good results in practice due to optimality and structure.
▪ Convenient form for online real time processing.
▪ Easy to formulate and implement given a basic understanding.
▪ Measurement equations need not be inverted.
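As a hedged illustration of the recursive predict/update cycle described above, here is a minimal one-dimensional Kalman filter sketch in Python, assuming a simple random-walk state model; the noise variances q and r are arbitrary example values, not taken from this project.

import numpy as np

def kalman_1d(z, q=1e-4, r=0.5, x0=0.0, p0=1.0):
    # Minimal 1-D Kalman filter over a measurement sequence z.
    # q = process-noise variance, r = measurement-noise variance (illustrative).
    # Only the present measurement and the previous estimate are needed per step.
    x, p = x0, p0
    estimates = []
    for zk in z:
        p = p + q                 # predict: propagate uncertainty
        k = p / (p + r)           # Kalman gain: weight of the new measurement
        x = x + k * (zk - x)      # update: weighted average of prediction and measurement
        p = (1.0 - k) * p         # updated (a posteriori) uncertainty
        estimates.append(x)
    return np.array(estimates)

# Example: smoothing a noisy constant level of 1.0.
smoothed = kalman_1d(1.0 + 0.5 * np.random.randn(200))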
2.5 CLASSIFIERS USED FOR DECODING MOTOR IMAGERY
Brain activity patterns are considered as dynamic stochastic processes due both to biological and to
technical factors. Biologically, they change due to user fatigue and attention, due to disease
progression, and with the process of training. Technically, they change due to amplifier noises,
ambient noises, and the variation of electrode impedances [42]. Therefore, the time course of the
generated time series signal, for example, EEG should be taken into account during feature extraction
[42]. To use this temporal information, three main approaches have been proposed [43]:
• concatenation of features from different time segments: extracting features from several time
segments and concatenating them into a single feature vector [44], [45];
• combination of classifications at different time segments: it consists in performing the feature
extraction and classification steps on several time segments and then combining the results of the
different classifiers [46], [47];
• dynamic classification: it consists in extracting features from several time segments to build a
temporal sequence of feature vectors. This sequence can be classified using a dynamic classifier [48],
[49].
The first approach is the most widely used, even though the resulting feature vectors are often
of high dimensionality. In order to choose the most appropriate classifier for a given set of
features, the properties of the available classifiers must be considered in light of the following
four-way classifier taxonomy described in [43].
1. Generative or Informative classifier - Discriminative classifier: Generative classifiers, for
example, Bayes quadratic, learn the class models. To classify a feature vector, generative classifiers
compute the likelihood of each class and choose the most likely. Discriminative ones, for example,
support vector machines (SVM), only learn the way of discriminating the classes or the class
membership in order to classify a feature vector directly [50], [51].
2. Static classifier - Dynamic classifier: Static classifiers, for example multilayer perceptrons (MLP),
cannot take into account temporal information during classification as they classify a single feature
vector. In contrast, dynamic classifiers such as the hidden Markov model (HMM) [52],
finite-impulse-response multilayer perceptrons (FIR-MLP) [53] and tree-based neural networks
(TBNN) [54] can classify a sequence of feature vectors and thus capture temporal dynamics.
3. Stable classifier - Unstable classifier: Stable classifiers, for example, linear discriminant analysis
(LDA), have a low complexity (or capacity) [55], [56]. They are said to be stable as small variations
in the training set do not considerably affect their performance. In contrast, unstable classifiers, for
example, MLP, have a high complexity. As for them, small variations of the training set may lead to
important changes in performance [57].
4. Regularized classifier: Regularization consists in carefully controlling the complexity of a
classifier in order to prevent overtraining. Regularization helps limit (a) the influence of outliers and
strong noise, (b) the complexity of the classifier and (c) the raggedness of the decision surface [58].
A regularized classifier has good generalization performances and is more robust with respect to
outliers [59], [60].
2.6 PERFORMANCE ANALYSIS IN MI BASED BCI RESEARCH
The performance of an EEG-BCI system is analyzed using a number of performance metrics. This
section discusses a few of them.
1. Confusion Matrix: The confusion matrix is a tabular representation of the relationship between
the desired class intended by the user and the actual class predicted by the classifier [61], [62].
2. Classification Accuracy: It is the most widely used evaluation criterion in BCI research because it
is easy to calculate and interpret. It is defined as the ratio of the number of correct observations made
by the classifier to the total number of observations [63].
3. Type-I and Type-II Error Rate: A type I error (α) represents the rate of incorrect rejection of a
true null hypothesis and is hence known as the false positive rate. The error of the second kind, i.e.,
a type II error (β), refers to the rate of failure to reject a false null hypothesis and is hence known
as the false negative rate [64].
4. Information Transfer Rate: The information transfer rate (Bt) represents the bit rate of the BCI
system [65]. Its value in bits/min is given in Eq. 2.4.

Bt = [ log2(N) + P log2(P) + (1 - P) log2( (1 - P) / (N - 1) ) ] x (60 / T)    (2.4)

where N is the number of possible states, P is the classification accuracy between 0 and 1, and T is
the time needed to convey each action in seconds/symbol, i.e., the time interval from the issue of a
command to the classified output of the same.
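Eq. 2.4 can be evaluated directly; the following small Python function is a sketch of that computation (the function name and the example numbers are illustrative, not results from this project).

import math

def information_transfer_rate(n_classes, accuracy, trial_time_s):
    # Bit rate Bt in bits/min for an N-class BCI, following Eq. 2.4.
    n, p = n_classes, accuracy
    bits_per_trial = math.log2(n)
    if 0.0 < p < 1.0:
        bits_per_trial += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits_per_trial * 60.0 / trial_time_s

# Example: 2 classes, 80% accuracy, 4 s per decision -> about 4.2 bits/min.
print(information_transfer_rate(2, 0.80, 4.0))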
5. Statistical Hypothesis Testing: Statistical hypothesis testing [64] is required to ensure that the
experimental data are correctly interpreted, i.e., that the apparent relationship between them is
significant or meaningful and does not occur by chance. There exist a number of well-known statistical
tests, which can be classified into four main categories, namely: i) correlational (such as Pearson
correlation [66] and Spearman correlation [67]); ii) comparison of means (such as the paired t-test
[67] and ANOVA [68]); iii) regression (such as simple regression and multiple regression) [67]; and
iv) non-parametric (such as McNemar's test [69], Friedman's test [70], the Wilcoxon rank-sum test [71]
and the Wilcoxon signed-rank test [72]). The selection of the right statistical test depends on the
type of data, the distribution of the data, and the number of data points and observations available.
Chapter 3
Stress Detection of Vehicle Drivers During
Driving-A Case Study
The chapter begins with the formulation of the stress-detection problem for vehicle drivers during
driving. To accomplish this, an experimental framework for EEG data acquisition is established. The
chapter then presents feature plots obtained using the feature extraction techniques explained in the
earlier chapter, and it ends with the performance of a fuzzy classifier used to decode the stress
level of a driver during driving.
3.1 PROBLEM FORMULATION
Driving is a common yet complex skill that requires constant attention and integration of different
simultaneous streams of information by a driver. It requires the coordinated use of both hands and
feet. In particular, the driving task is actually a combination of various cognitive processes such as
perception, attention, motor control, working memory, decision-making and driver’s mental
workload. The brain is very good at perceiving the whole environment around the car, making
decisions and controlling body movements. Unlike performance and subjective measurements,
psychophysiological measures offer continuous observation at high time resolution (e.g., in
milliseconds) and can be collected without intruding into the operator's task.
Suppose the car needs to take a turn at an intersection: the brain sends commands to the driver's
muscles to move the steering wheel, shift gears, or press the brake or accelerator pedal. Such tight
coordination is implemented by the brain in a matter of split seconds. It is also well known that
people recovering from traumatic brain injury are often considered unfit for driving because of
deficits in remembering, learning and planning.
Significantly increased activation in the left dorsolateral precentral gyrus and postcentral gyrus is
observed when starting to move the car. Turning activates an extended area from occipital cortex
dorsally to superior parietal cortex and laterally in the right hemisphere to the posterior middle
temporal gyrus. Reversing activation is prominent in the lateral precentral gyrus and anterior
insula/ventrolateral prefrontal cortex. Stopping involves a more restricted activation and focuses more
on the anterior part of the pre-SMA. Monitoring actions from other drivers shows extensive activation
in the precuneus and superior parietal cortices. Traffic rule related thoughts are associated with
significant activation of the right lateral PFC.
In this chapter, we attempt to detect the stress level of a person while driving a car. Stress
appears in the EEG spectrum as an increase of beta-band activity, predominantly in the parietal and
central regions of the brain; at the same time, a decrease of alpha-band activity can be observed,
since beta activity increases with cognitive tasks and active concentration. This has been shown in
several studies. EEG is so efficient in detecting drowsiness and stress that it is often used as a
reference indicator.
3.2 EXPERIMENTAL FRAMEWORK
The experimental setup is the part of the research through which the brain signals of the subject are
obtained. An EEG data acquisition unit with a multi-channel recording facility is essential for this
experiment.
The system setup for long term monitoring of brain signals is organized as follows:
▪ an electrode-placement EEG system
▪ a 15-channel recording unit (15 EEG electrodes are used)
▪ a video camera synchronized with the EEG recording
▪ a PC running a car racing game for visual and acoustic stimulation
▪ a steering wheel, brake and accelerator for the car simulator
A comfortable chair and a quiet room are also needed, and screening from electrical interference is a
bonus. The setup of the experiment must be kept consistent (i.e., room conditions, stimulation used and
length of data recorded), especially when recording data at different times. Aiming for better spatial
filtering and accuracy, five different subjects are used. The experiment leverages an existing
technology, electroencephalography (EEG), for noninvasively recording brain signals from the scalp.
Figure 3.1 illustrates the experimental paradigm.
Fig 3.1: Experimental apparatus - schematic diagram (EEG recording of brain activity coupled with a car simulation running on a PC)
3.3 REAL-TIME SIGNAL ACQUISITION
Electrical brain activity from Subject 1 (the ‘Driver’) is recorded using EEG (Figure 3.2 b) in the
Artificial Intelligence Laboratory in the department of Electronics and Telecommunications
Engineering at Jadavpur University, Kolkata, and is interpreted by a computer. The task that the
subjects must perform, via brain-muscle coordination, is to drive a driving simulator through an
emulated driving scenario. Other computerized cars and various obstacles in the street imitate
real-life conditions. The driving simulator includes a steering wheel and brake and accelerator
pedals, and hence gives the subjects the feel of driving a real car. Fig. 3.3 shows the
acquisition of EEG signals by EEG electrodes placed on the scalp of a subject during the driving
experiment, whereas Fig. 3.4 provides a screenshot of EEG signals corresponding to mild stress while
performing the experiment.
Fig 3.2: Experimental Apparatus -photographs of setup a) subject 1 with car simulation b) electrode
placement EEG system c) subject 1 with brake and accelerator
Fig 3.3: EEG signal acquisition during the experiment
Fig 3.4: Screenshot of EEG signal showing mild stress levels
3.4 FEATURE EXTRACTION AND SELECTION
Extracted features are meant to minimize the loss of important information embedded in the signal.
In addition, they reduce the amount of resources needed to describe a large data set accurately,
thereby reducing the dimension of the feature space and achieving better performance.
The EEG data obtained from the subject are processed and stored in a Word file for every 18 minutes
of observation time. Fifteen channels are recorded simultaneously, and every Word file is divided
into 1000x15 matrices stored in an Excel file. Using the Excel files, we obtain the 15-channel
baseline plot in MATLAB; the mean (across-channel average) plot is then generated in MATLAB for
further analysis. Figs. 3.5 and 3.6 provide the 15-channel EEG baseline plot and the mean plot of the
15-channel EEG signals, respectively.
Fig. 3.5 15-channel baseline plot (subject 1)
Fig. 3.6 Mean plot of the 15 channels (original signal)
Extracting Power Spectral Density Features
In order to select the correct features of the EEG signal related to the mental activity, the power
spectral density (PSD) of the original signal is computed using parametric methods as the frequency
response of the mean signal obtained from a sequence of time samples. The sampling frequency of the
data is 8192 Hz. Peaks in the PSD at particular frequencies reveal the periodicities present in the
data. The PSD is intended for continuous spectra; the integral of the PSD over a given frequency band
gives the average power of the signal in that band.
Using the FFT, a plot is produced (Fig. 3.7) that spans 0 to 70 Hz with a frequency resolution given
by the sampling rate divided by the number of samples (8192/8192 = 1 Hz). EEG signals are often
quantified based on their frequency-domain characteristics, and the spectrum is typically estimated
using the Fast Fourier Transform (FFT).
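As a sketch of this band-power idea (assuming the reported 8192 Hz sampling rate and using SciPy's Welch estimator rather than the parametric method mentioned above; the function name band_power is our own), the average power in a chosen band can be obtained by integrating the PSD over that band:

import numpy as np
from scipy.signal import welch

FS = 8192  # sampling frequency reported for the recordings (Hz)

def band_power(x, f_lo, f_hi, fs=FS):
    # Average power of x in the band [f_lo, f_hi] Hz: integrate a Welch PSD
    # estimate over that band.
    freqs, psd = welch(x, fs=fs, nperseg=2048)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return np.trapz(psd[mask], freqs[mask])

# Example: relative alpha-band (8-12 Hz) power of one second of (random) data.
channel = np.random.randn(FS)
print(band_power(channel, 8, 12) / band_power(channel, 0.5, 70))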
Fig 3.7 Power spectral density plots of 0.2 seconds of EEG data
Extracting Fourier Transform Features
In the experiment, to evaluate the Fourier transform of the EEG data, the path to the mean-channel
plot of the EEG data is given in MATLAB. This approach is based on earlier observations that the EEG
spectrum contains characteristic waveforms that fall primarily within five frequency bands, taken in
our experiment as Delta: 0-4 Hz, Theta: 4-8 Hz, Alpha: 8-12 Hz, Beta: 12-24 Hz and Gamma: 24-32 Hz.
Fig. 3.8(a) and (b) present the raw EEG signal and the Fourier-transformed signal respectively.
The chart below (Table 3.1) gives the values of the peaks of the signal displayed in fig 3.8(b).
Table 3.1 Peaks of Signals in Fig. 3.8(b)
Peak 1 Peak 2 Peak 3 Peak 4 Peak 5
4.9965e+03 420.0768 100.6872 264.6203 24.0696
Fig. 3.8 a) Raw EEG signal b) Fourier Transformed Signal
It is important to mention here that we often apply a 'window' to the data: we take only the segment
we want from the data stream, move the window along the data, and perform the FFT on each windowed
segment. There are more than 60 segments in each EEG data set collected, so we obtain a corresponding
Fourier-transformed signal and set of values for each of them. For illustration, only 10 of these
values are displayed in Fig. 3.9.
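A minimal sketch of this windowed analysis, assuming non-overlapping windows and the five band definitions given above (the window length and the random test signal are illustrative choices, not taken from the report):

import numpy as np

FS = 8192                                         # reported sampling rate (Hz)
BANDS = {"delta": (0, 4), "theta": (4, 8), "alpha": (8, 12),
         "beta": (12, 24), "gamma": (24, 32)}     # band edges used in the report

def windowed_band_areas(x, win_len=FS):
    # Slide a non-overlapping window along x; for each windowed segment return
    # the area under the FFT amplitude spectrum within each sub-band.
    results = []
    for start in range(0, len(x) - win_len + 1, win_len):
        seg = x[start:start + win_len]
        spectrum = np.abs(np.fft.rfft(seg))
        freqs = np.fft.rfftfreq(win_len, d=1.0 / FS)
        results.append({name: np.trapz(spectrum[(freqs >= lo) & (freqs < hi)])
                        for name, (lo, hi) in BANDS.items()})
    return results

# Example: band areas for 60 one-second segments of random data.
areas = windowed_band_areas(np.random.randn(60 * FS))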
Extracting Wavelet Coefficient
The wavelet transform (WT) is meant to handle non-stationary signals such as EEG. A family of wavelets
is derived from a mother wavelet through translation and dilation, that is, shifting along the time
axis and compression/stretching of the time scale, respectively.
The continuous wavelet transform (CWT) is applied to the mean-channel unprocessed EEG plot, where
wavelets are formed by dilation and different translation factors. Its major weakness, however, is
that the scaling and translation parameters of the CWT change continuously; computing the wavelet
coefficients for all available scales therefore takes a lot of effort and yields a lot of unused
information.
Fig. 3.9 Area under different sub-bands of the frequency spectrum (Z set)
Fig. 3.10 provides four plots depicting the wavelet coefficients over the 0-32 Hz frequency range.
This method is a continuation of the classical Fourier transform approach, and the 0-4 Hz, 4-8 Hz,
8-16 Hz and 16-32 Hz frequency bands are used. The extracted wavelet coefficients provide a compact
representation of the energy distribution of the EEG signal across frequency bands. The computed
detail and approximation wavelet coefficients of the EEG signals were therefore used as the feature
vectors representing the signals. Since there are a large number of wavelet coefficients, statistics
over each set of coefficients were used to reduce the dimensionality of the feature vectors. The
following statistical features were used to represent the time-frequency distribution of the EEG
signals (a computation sketch is given after this list):
(i) Maximum of the wavelet coefficients in each sub-band.
(ii) Minimum of the wavelet coefficients in each sub-band.
(iii) Mean of the wavelet coefficients in each sub-band.
(iv) Standard deviation of the wavelet coefficients in each sub-band.
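A minimal sketch of such a computation, assuming the PyWavelets package with a db4 mother wavelet and a four-level decomposition (choices not specified in the report):

import numpy as np
import pywt  # PyWavelets

def wavelet_features(x, wavelet="db4", level=4):
    # Multi-level DWT of a 1-D EEG signal; return the maximum, minimum, mean
    # and standard deviation of the coefficients in each sub-band
    # (approximation plus the detail levels), as listed above.
    coeffs = pywt.wavedec(x, wavelet, level=level)   # [cA4, cD4, cD3, cD2, cD1]
    feats = []
    for c in coeffs:
        feats.extend([np.max(c), np.min(c), np.mean(c), np.std(c)])
    return np.array(feats)                           # 4 statistics per sub-band

# Example: feature vector for one 1000-sample channel.
print(wavelet_features(np.random.randn(1000)).shape)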
Fig. 3.10 4 wavelets obtained for 1000x15 raw EEG data
Extracting Hjorth Parameters
In the experiment, to evaluate the Hjorth parameters, the path to the mean-channel plot of the EEG
data is given in MATLAB. The data for every stress level are divided into a number of parts; the
alarmingly stressed video data, for instance, are divided into 85 parts, and hence 85 values of
mobility and complexity are obtained.
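A brief sketch of this per-segment computation (the 85-part split follows the text, but the helper name and the use of NumPy are our own; the estimates mirror the Hjorth sketch given in Chapter 2):

import numpy as np

def segment_hjorth(x, n_parts=85):
    # Split a 1-D signal into n_parts roughly equal segments and collect the
    # per-segment Hjorth Mobility and Complexity values, as tabulated below.
    mobility, complexity = [], []
    for seg in np.array_split(np.asarray(x, dtype=float), n_parts):
        dx, ddx = np.diff(seg), np.diff(seg, n=2)
        m = np.sqrt(np.var(dx) / np.var(seg))
        complexity.append(np.sqrt(np.var(ddx) / np.var(dx)) / m)
        mobility.append(m)
    return np.array(mobility), np.array(complexity)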
ALARMINGLY STRESSED VIDEO:
Parts 1 2 3 4 5 6 7 8 9 10 11 12
Complexity 0.8233 0.7990 1.3415 0.9658 0.8758 0.7890 0.8517 0.9193 0.9904 0.8813 0.7858 0.9438
Parts 13 14 15 16 17 18 19 20 21 22 23 24
Complexity 1.1937 0.9628 0.8819 0.7460 0.8797 0.8785 0.7340 0.8205 0.8423 0.8584 0.7659 0.7202
Parts 25 26 27 28 29 30 31 32 33 34 35 36
Complexity 0.6726 0.6229 0.6699 0.6016 0.8073 0.7797 1.0006 0.8455 0.7389 0.8122 0.7531 0.7798
Parts 37 38 39 40 41 42 43 44 45 46 47 48
Complexity 0.7698 0.6884 0.6608 0.7427 0.7884 0.7325 0.7968 0.7828 0.8209 1.1210 0.7703 0.6876
Parts 49 50 51 52 53 54 55 56 57 58 59 60
Complexity 0.7055 0.6622 0.5681 0.7017 0.7740 0.8002 0.6624 0.7431 0.7291 0.8761 0.8230 0.7358
Parts 61 62 63 64 65 66 67 68 69 70 71 72
Complexity 0.9184 0.7741 0.7908 0.8372 0.8407 0.7972 0.6972 0.8139 0.6724 0.7521 0.7371 0.7606
Parts 73 74 75 76 77 78 79 80 81 82 83 84
Complexity 0.7121 0.6256 0.7584 0.7285 0.7433 0.7646 0.7699 0.6627 0.8354 0.6960 0.6996 0.6775
Parts 85
Complexity 0.6205
Parts 1 2 3 4 5 6 7 8 9 10 11 12
Mobility 0.1085 0.1046 0.1969 0.0810 0.1435 0.1719 0.2251 0.1730 0.1983 0.1704 0.1559 0.1636
Parts 13 14 15 16 17 18 19 20 21 22 23 24
Mobility 0.2938 0.1474 0.1781 0.1408 0.1590 0.1963 0.1446 0.1674 0.1926 0.1901 0.1029 0.1801
Parts 25 26 27 28 29 30 31 32 33 34 35 36
Mobility 0.1534 0.1408 0.1601 0.1292 0.1801 0.1925 0.1663 0.1965 0.1639 0.1501 0.1453 0.1792
Parts 37 38 39 40 41 42 43 44 45 46 47 48
Mobility 0.1173 0.1358 0.1240 0.1464 0.1641 0.1485 0.1484 0.1674 0.0981 0.2079 0.1249 0.1431
Parts 49 50 51 52 53 54 55 56 57 58 59 60
Mobility 0.1226 0.1455 0.1273 0.1360 0.1067 0.1489 0.1372 0.1685 0.1316 0.1646 0.1521 0.1453
Parts 61 62 63 64 65 66 67 68 69 70 71 72
Mobility 0.1267 0.1479 0.1459 0.1564 0.1382 0.1738 0.1161 0.1906 0.1401 0.1401 0.1768 0.1665
Parts 73 74 75 76 77 78 79 80 81 82 83 84
Mobility 0.1490 0.1567 0.1420 0.1745 0.1725 0.1533 0.1845 0.1309 0.1498 0.2164 0.1538 0.1377
Parts 85
Mobility 0.1222
MODERATELY STRESSED VIDEO:
Parts 1 2 3 4 5 6 7 8 9 10 11 12
Complexity 0.9837 0.8380 0.8231 0.9150 0.8009 0.8412 0.8051 0.9090 0.8529 0.8178 0.7643 0.8461
Parts 13 14 15 16 17 18 19 20 21 22 23 24
Complexity 0.7425 0.8440 0.8918 0.8483 0.8127 0.9197 0.9459 0.8894 0.8292 0.9988 0.8921 0.8095
Parts 25 26 27 28 29 30 31 32 33 34 35 36
Complexity 0.8227 0.7619 0.8642 0.9126 1.0119 0.8912 0.9607 0.8928 0.8623 0.7880 0.8313 0.8346
Parts 37 38 39 40 41 42 43 44 45 46 47 48
Complexity 0.7309 0.7557 0.7171 0.8515 0.6242 0.6310 0.7744 0.6805 0.4873 0.8457 0.7414 0.7073
Parts 49 50 51 52 53 54 55 56 57 58 59 60
Complexity 0.6800 0.6466 0.7295 1.1789 0.6583 0.8081 0.7434 0.7017 0.7175 0.6687 0.6898 0.7801
Parts 61 62 63 64 65 66 67 68 69 70 71 72
Complexity 0.7556 0.9429 0.7816 0.7050 0.7485 0.6516 0.8283 0.8263 0.7948 0.7414 0.8214 0.6898
Parts 73 74 75 76 77 78 79 80
Complexity 0.6549 0.6964 0.6159 0.7299 0.7433 0.7165 0.8384 0.8234
Parts 1 2 3 4 5 6 7 8 9 10 11 12
Mobility 0.1598 0.1767 0.1521 0.1355 0.1489 0.1645 0.1513 0.1761 0.1730 0.1402 0.1563 0.1892
Parts 13 14 15 16 17 18 19 20 21 22 23 24
Mobility 0.1457 0.1670 0.1655 0.1588 0.1497 0.1864 0.1837 0.1870 0.1784 0.2090 0.2037 0.1496
Parts 25 26 27 28 29 30 31 32 33 34 35 36
Mobility 0.1348 0.1705 0.1727 0.2247 0.2614 0.1406 0.1911 0.1631 0.1834 0.1679 0.1530 0.2354
Parts 37 38 39 40 41 42 43 44 45 46 47 48
Mobility 0.1647 0.1297 0.1104 0.1810 0.1487 0.1484 0.2037 0.1830 0.1344 0.1892 0.1526 0.1865
Parts 49 50 51 52 53 54 55 56 57 58 59 60
Mobility 0.1604 0.1546 0.1522 0.2461 0.1516 0.1232 0.1473 0.1520 0.1916 0.1364 0.1568 0.1758
Parts 61 62 63 64 65 66 67 68 69 70 71 72
Mobility 0.1410 0.1850 0.1737 0.1549 0.1699 0.1147 0.1583 0.1468 0.2189 0.1422 0.2060 0.1447
Parts 73 74 75 76 77 78 79 80
Mobility 0.1195 0.1753 0.1523 0.1016 0.1450 0.1423 0.1451 0.1447
RELAXED VIDEO:
Parts 1 2 3 4 5 6 7 8 9 10 11 12
Complexity 0.9468 0.8064 0.7582 0.7918 0.8362 0.8152 1.1945 0.6858 0.7623 0.7136 0.8947 0.7654
Parts 13 14 15 16 17 18 19 20 21 22 23 24
Complexity 0.7049 0.6264 0.7292 0.6397 0.7437 0.8271 0.7437 0.8047 0.9170 0.8746 0.7666 0.7416
Parts 25 26 27 28 29 30 31 32 33 34 35 36
Complexity 0.7500 0.7900 0.7503 0.8202 0.7802 0.8067 0.7517 0.8789 1.0304 0.8633 0.9247 0.8749
Parts 37 38 39 40 41 42 43 44 45 46 47 48
Complexity 0.8902 0.8316 0.7652 0.7504 0.7024 0.7576 0.8713 0.7516 0.8726 1.0002 0.7360 0.8142
Parts 49 50 51 52 53 54 55 56 57 58 59 60
Complexity 0.7616 1.1334 0.6938 0.7146 1.2038 0.6165 0.9173 0.7215 0.6791 0.7350 0.8522 0.6939
Parts 61
Complexity 1.0141
Parts 1 2 3 4 5 6 7 8 9 10 11 12
Mobility 0.1319 0.1719 0.1443 0.1628 0.1595 0.1290 0.1743 0.1299 0.1678 0.1353 0.1348 0.1613
Parts 13 14 15 16 17 18 19 20 21 22 23 24
Mobility 0.1331 0.1431 0.1399 0.1261 0.1461 0.1700 0.1546 0.1481 0.1902 0.1852 0.2089 0.1567
Parts 25 26 27 28 29 30 31 32 33 34 35 36
Mobility 0.1567 0.1486 0.1657 0.1559 0.1754 0.1531 0.1500 0.1688 0.1322 0.2089 0.1782 0.1689
Parts 37 38 39 40 41 42 43 44 45 46 47 48
Mobility 0.1820 0.1724 0.1644 0.1475 0.1397 0.1755 0.1774 0.2196 0.1522 0.1657 0.1789 0.1576
Parts 49 50 51 52 53 54 55 56 57 58 59 60
Mobility 0.1663 0.1402 0.2153 0.1476 0.1644 0.2023 0.1456 0.1144 0.1483 0.1501 0.1208 0.2497
Parts 61
Mobility 0.1436
Extracting Kalman Filter Coefficients
For Kalman feature extraction, the power spectral density figure is given as input. First, random
Gaussian noise is generated and added to the actual signal, and the noisy signal is filtered to obtain
the estimated signal. The a priori and a posteriori covariance matrices are calculated along with the
Kalman gain and Kalman coefficient. Then, using the original and the estimated signals, a mean-square
error is computed; this mean-square error is used for further calculations. Fig. 3.11 shows the Kalman
filter analysis of the EEG data.
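A hedged sketch of this feature, reusing the one-dimensional filter sketched earlier in Chapter 2; the added noise level and the filter constants are assumptions, not values from the report:

import numpy as np

def kalman_mse_feature(x, q=1e-4, r=0.5, noise_scale=0.1):
    # Add Gaussian noise to the signal, smooth the noisy copy with a 1-D Kalman
    # filter and return the mean-square error between the original and the
    # estimated signal, used here as a single feature.
    x = np.asarray(x, dtype=float)
    noisy = x + noise_scale * np.std(x) * np.random.randn(x.size)
    est = np.empty_like(x)
    xk, p = noisy[0], 1.0
    for i, z in enumerate(noisy):
        p += q
        k = p / (p + r)
        xk += k * (z - xk)
        p = (1.0 - k) * p
        est[i] = xk
    return np.mean((x - est) ** 2)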
Fig 3.11 Kalman filter analysis of EEG data
(Panels of Fig. 3.11: a combined plot of the original and estimated signals, and the mean-square error.)
3.5 CLASSIFIER VALIDATION AND PERFORMANCE
Fuzzy inference is a method that interprets the values in the input vector and, based on a set of
rules, assigns values to the output vector. It relies on fuzzy sets, which can be created in a number
of convenient ways.
A Membership function is a curve that defines how each point in the input space is mapped to a
membership value or degree of membership between 0 and 1. The only condition a membership
function must really satisfy is that it must vary between 0 and 1. The function itself can be an arbitrary
curve whose shape we can define as a function that suits us from the point of view of simplicity,
convenience, speed, and efficiency. Here we use a Gaussian membership function (gaussmf)
(see Fig. 3.12). It provides a smoothly varying continuous curve that is non-zero at all points, and
it returns a fuzzy set whose membership grades represent a normalized Gaussian function with a mean
of mu and a width of sigma.
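A minimal Python sketch of such a Gaussian membership function, mirroring MATLAB's gaussmf; here mu and sigma are simply taken as the mean and standard deviation of the ten average PSD values of Table 3.2 below, which is an assumption about how the width was chosen:

import numpy as np

def gaussmf(x, mu, sigma):
    # Gaussian membership function: grades in [0, 1], peaking at mu.
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Example: build one fuzzy set for the PSD feature from the ten average values
# of Table 3.2 and query the membership of an unknown PSD value.
psd_averages = np.array([496.50, 341.96, 565.59, 700.45, 786.25,
                         314.91, 530.90, 328.43, 356.17, 696.31])
mu, sigma = psd_averages.mean(), psd_averages.std()
print(gaussmf(520.0, mu, sigma))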
Fig. 3.12 Gaussian membership curve

After feature extraction, we find the Gaussian for each feature over a set of data. A Gaussian curve
is obtained from 10 data sets in the experiment: we take the average value of each plot obtained
earlier during feature extraction, and these 10 values make up a single Gaussian function. The
Gaussian for the PSD feature of one video of a subject's brain signals is obtained by the following
method. The video has 79 data sets of 1000x15 samples, hence 79 PSD figures are obtained. From each
PSD graph, we find the average value of the amplitude for that graph. Using the average values of 10
graphs, we obtain one Gaussian figure; so, for this video, we get 8 Gaussian figures for the PSD
feature. The average values for 10 PSD graphs are shown in Table 3.2, and the Gaussian curve obtained
from them is shown in Fig. 3.13.
Table 3.2 Average Values of 10 PSD Graphs
Avg1 Avg2 Avg3 Avg4 Avg5 Avg6 Avg7 Avg8 Avg9 Avg10
496.50 341.96 565.59 700.45 786.25 314.91 530.90 328.43 356.17 696.31
Fig. 3.13 Gaussian curve from PSD average values
The output axis gives the membership value, a number between 0 and 1. The curve is known as a
membership function and is often denoted by µ. This curve defines the transition from low values of
PSD to high values of PSD in the fuzzy space. Similarly, 7 more Gaussian curves are obtained for PSD
and drawn in the same plot, representing the different fuzzy sets. We assign a different colour to
each new fuzzy set, which lets us reference each individual fuzzy set and differentiate between them.
All videos representing the subject's EEG data carry stress to some degree, but some are significantly
less stressed than others. Using this Gaussian figure, we get the degree of membership of the unknown
data's PSD in each of these fuzzy sets. In the same way we obtain the Gaussian curves for the FFT,
wavelet coefficient, Hjorth parameter and Kalman filter features. Hence, we create a fuzzy space for
unknown data whose stress level needs to be detected. We take 5 video samples of a subject and assign
them to different stress-level anchors. The features are then extracted for each individual video as
described earlier, after which the fuzzy space is created. The experiment is carried out on five
different subjects so that more data are available and the results are more accurate.
At the beginning of the experiment, the subject is in a relaxed state. The stress level rises
gradually with the progression of time, so after some interval the driver becomes less relaxed. In
anticipation of the next hurdle, a slight anticipatory stress develops in the subject, which is not
harmful. Fatigue, drowsiness or any other situation may induce a kind of stress known as situational
stress. When the stress level becomes alarmingly high, it is known as alarming stress. A few plots of
EEG features in the fuzzy space, representing stress levels of different drivers in the presence of
(a) alarmingly stressed, (b) moderately stressed and (c) relaxed visual stimuli, are presented in
Figs. 3.14-3.24.
Fig 3.14 Hjorth Complexity Gaussian plot during alarmingly stressed video
Fig 3.15 Wavelet Coefficient Gaussian plot during alarmingly stressed video
Fig 3.16 Kalman filter Gaussian plot during alarmingly stressed video
Fig 3.17 Power Spectral Density Gaussian plot during alarmingly stressed video
Fig 3.18 Power Spectral Density Gaussian plot during moderately stressed video
Fig 3.19 Wavelet coefficient Gaussian plot during moderately stressed video
Fig 3.20 Kalman filter Gaussian plot during moderately stressed video
Fig 3.21 Hjorth Complexity Parameter Gaussian plot during moderately stressed video
Fig 3.22 Power Spectral Density Gaussian plot during relaxed video
Fig 3.23 Kalman filter Gaussian plot during relaxed video
Recognizing Stress Level
In the experiment, we collect and analyze EEG data during realistic driving tasks to determine a
driver's relative stress level among the 5 stress anchors. EEG data were recorded continuously while
drivers followed a set route through open roads in the emulated driving scenario. Data from 20 drives
of at least 50 min duration were collected for analysis to distinguish the three levels of driver
stress. After filtering and feature extraction, the mean values of the features are obtained and
projected onto the fuzzy space. The minimum and maximum membership values at which each feature's
fuzzy sets intersect the respective mean unknown feature value are observed and highlighted in
Tables 3.3-3.5. Next, the minimum of the minimum values [Lm] over all the features of a stress level
is noted; similarly, the minimum of the maximum values [Hm] over all the features of that stress
level is noted. The tnorm is then calculated for each of the stress levels.
Fig 3.24 Wavelet coefficient Gaussian plot during relaxed video
Table 3.3 Minimum and Maximum Values of Features for Relaxed Video
Table 3.4 Minimum and Maximum Values of Features for Moderately Stressed Video
                        1       2       3       4       5       6       7       8       9
Power Spectral Density  .98615  .1834   .4438   .3449   .6596   .5549   .6587   .0952   .8283
Fourier Transform       .9627   .99745  .48135  .54625  .53     .7      .7695   .7475   .7689
Wavelet Coefficient     .9151   .1927   .2607   .3337   .3616   .509    .58645  .0931   .582
Hjorth Complexity       .7891   .891    .926    .9466   .9603   .9049   .9262   0.9018  .9069
Hjorth Mobility         .917    .9345   .2439   .9978   .9359   .95085  .6773   .9512   .9983
Kalman                  .9999   .0702   .8461   .3601   .9253   .6534   .8594   .1563   .9682
                        1        2       3        4        5       6
Power Spectral Density  0.5313   0.4636  0.00239  0.39025  0.268   0.4901
Fourier Transform       0.7999   0.7556  0.6806   0.9688   0.7332  0.7336
Wavelet Coefficient     0.4931   0.4538  0.01     0.3166   0.3077  0.4642
Hjorth Complexity       0.9565   0.515   0.5379   0.7387   0.9328  0.97655
Hjorth Mobility         0.9498   0.9649  0.2703   0.9566   0.899   0.9957
Kalman                  0.3657   0.3117  0.00001  0.7834   0.2716  0.8538
Table 3.5 Minimum and Maximum Values of Features for Alarmingly Stressed Video

                        1       2       3       4       5       6        7       8
Power Spectral Density  0.4545  0.2994  0.3838  0.288   0.749   0.83785  0.7847  0.7878
Fourier Transform       0.8386  0.8057  0.8646  0.9981  0.94    0.984    0.9851  0.9954
Wavelet Coefficient     0.4153  0.4362  0.5006  0.3971  0.7315  0.8155   0.7635  0.867
Hjorth Complexity       0.8701  0.6018  0.7062  0.5851  0.6995  0.9106   0.342   0.784
Hjorth Mobility         0.9678  0.9979  0.9906  0.273   0.9924  0.97395  0.9292  0.915
Kalman                  0.9946  0.1603  0.0404  0.1679  0.6326  0.9592   0.7802  0.8927

The triangular norm (tnorm) is used to calculate the membership values of the intersection of the
fuzzy sets. t-norm fuzzy logics primarily aim at generalizing classical two-valued logic by admitting
intermediary truth values between 1 (truth) and 0 (falsity) that represent "degrees of truth" of
propositions; the degrees are assumed to be real numbers from the unit interval [0, 1]. Hence, the
tnorm obtained above for each set indicates the degree to which the unknown data lies in that stress
level. Table 3.6 provides the values of tnorm for the different stress levels.

Table 3.6 tnorm for different stress levels
Stress Level   Relax      Situational Stress   Alarming Stress
tnorm          0.246555   0.49265              0.4141

Since the maximum value of tnorm is in the alarming stress set, the subject is highly stressed and
the car needs to be stopped for safety.
Chapter 4
Conclusions and Future Scope
This is the concluding chapter of the project. It provides a self-review of the project, highlighting
the problems undertaken and the level and degree of accuracy to which they have been solved. It also
discusses open problems, in general, which may be taken up as future research.
4.1 SELF-REVIEW OF WORK
Data from 20 drives of at least 50-min duration were collected for analysis to distinguish the 5 levels
of driver stress with an accuracy of over 92% across multiple drivers and driving days. The results
show that for most of the drivers, brain signals are most closely correlated with driver stress level.
These findings indicate that brain signals can provide a metric of driver stress in future cars capable
of EEG monitoring. Such a metric could be used to help manage noncritical in-vehicle information
systems and could also provide a continuous measure of how different road and traffic conditions
affect drivers.
4.2 FUTURE RESEARCH DIRECTIONS
Experiments in this area have gradually shown promise. Scientists at a Swiss university are working
with the car manufacturer Nissan to find out whether brain signals could be used to improve the
driving experience. The idea is that an on-board computer could detect a driver's intentions a split
second before they act by reading their brain signals. The computer could then opt to intervene or
assist the driver, depending on external detection of other cars and objects around the car. Work on
the project began in earnest on 7 November 2011, with the researchers testing the concept by
monitoring the brain signals of people using a driving simulator similar to the one in our
experiment. The aim is to revolutionize the way the car interacts with the driver, the person
controlling it: brain measurements are used to understand what the driver intends to do. The brain is
very good at perceiving the whole environment around the car and making decisions, but the muscles
react much more slowly, and driving is complicated because we use both our hands and feet. In other
words, we are slow executors; the response of the body lags behind that of the brain. The
body-controlling signals in the brain are nevertheless present, and these brain signals can be used
by an automated car to make responses that are faster and safer than the driver's own would have
been. This does not mean that the driver is half asleep while the computer does everything; the
driver is still kept active and the driving experience is improved.
REFERENCES
1. S. M. Courtney, L. Petit, J. V. Haxby and L. G. Ungerleider, “The role of prefrontal cortex in
working memory: examining the contents of consciousness,” Philosophical Transactions of the
Royal Society B: Biological Sciences, vol. 353, no. 1377, pp. 1819-1828, 1998.
2. W. L. Zhou, P. Yan, J. P. Wuskell, L. M. Loew and S. D. Antic, “Dynamics of action
potential backpropagation in basal dendrites of prefrontal cortical pyramidal neurons,” European
Journal of Neuroscience, vol. 27, no. 4, pp. 923–936, 2008.
3. E. T. Rolls, “The orbitofrontal cortex and reward,” Cerebral cortex, vol. 10, no. 3, pp. 284-294,
2000.
4. J. M. Spielberg, J. L. Stewart, R. L. Levin, G. A. Miller and W. Heller, “Prefrontal cortex, emotion,
and approach/withdrawal motivation,” Social and Personality Psychology Compass, vol. 2, no.
1, pp. 135-153, 2008.
5. C. D. Salzman and S. Fusi, “Emotion, cognition, and mental state representation in amygdala
and prefrontal cortex,” Annual Review of Neuroscience, vol. 33, pp. 173-202, 2010.
6. M. Shoykhet and R. S. B. Clark, “Structure, Function, and Development of the Nervous System,”
ed. B. P. Fuhrman, J. J. Zimmerman, J. A. Carcillo, R. S. B. Clark, M. Relvas, A. T. Rotta, A. E.
Thompson and J. D. Tobias, in Pediatric Critical Care (Fourth Edition), Elsevier, pp. 783-804,
2011.
7. J. G. Mai, G. Paxinos, and T. Voss, Atlas of the Human Brain, Elsevier, Amsterdam, The
Netherlands, 3rd edition, 2008.
8. J. A. Kiernan, “Anatomy of the Temporal Lobe,” Epilepsy Research and Treatment, Hindawi
Publishing, pp. 1-12, 2012.
9. E. Lugaresi, F. Cirignotta and P. Montagna, “Occipital lobe epilepsy with scotosensitive seizures:
the role of central vision,” Epilepsia, vol. 25, no. 1, pp. 115-120, 1984.
10. E. Niedermeyer and F.L.D. Silva, Electroencephalography: Basic principles, clinical
applications, and related fields, Lippincott Williams & Wilkins, (2004).
11. J. G. Dy and C. E. Brodley, “Feature selection for unsupervised learning,” Journal of Machine
Learning Research, vol. 5, pp. 845-889, 2004.
12. Y. Kim, W. N. Street and F. Menczer, “Feature selection in unsupervised learning via evolutionary
search,” In Proceedings of the Sixth ACM SIGKDD International Conference on Knowledge
Discovery and Data Mining, pp. 365-369, 2000.
13. Lotze and Cohen, “Volition and imagery in neurorehabilitation,” vol. 19, no. 3, pp. 135-140, 2006.
14. M. Gulhane and P. S. Mohod, “Intelligent Fatigue Detection and Automatic Vehicle Control
System”, International Journal of Computer Science & Information Technology (IJCSIT), vol 6,
no. 3, June 2014.
15. A. R. Varma, S. V. Arote, C. Bharti and K. Singh, “Accident Prevention Using Eye Blinking and
Head Movement”, Emerging Trends in Computer Science and Information Technology
(ETCSIT2012), Proceedings published in International Journal of Computer Applications®
(IJCA), 2012.
16. M. Haak, S. Bos, S. Panic and L. J. M. Rothkrantz, “Detecting stress using eye blinks and brain
activity from EEG signals”, Faculty of Electrical Engineering, Mathematics and Computer
science, Delft University of Technology Faculty of Applied Sciences Netherlands Defence
Academy, 2010.
17. J. A. Horne and L. A. Reyner, “Sleep related vehicle accidents,” Brit. Med. J., vol. 310, pp. 565–
567, 1995.
18. Hallett M, Fieldman J, Cohen LG, Sadato N, Pascual-Leone A., “Involvement of primary motor
cortex in motor imagery and mental practice,” Behav Brain Sci. 1994.
19. Sirigu A, Cohen L, Duhamel JR, Pillon B, Dubois B, Agid Y, Pierrot-Deseilligny C. “Congruent
unilateral impairments for real and imagined hand movements,” Neuroreport. 1995.
20. Stephan KM, Fink GR, Passingham RE, Silbersweig D, Ceballos-Baum AO, Frith CD,
Frackowiak RS. Functional anatomy of the mental representation of upper extremity movements
in healthy subjects. J Neurophysiol, pp.373–386, 1995.
21. Lotze M, Montoya P, Erb M, Hulsmann E, Flor H, Klose U. “Activation of cortical and cerebellar
motor areas during executed and imagined hand movements: an fMRI study,” J Cogn Neurosci,
pp. 491–501, 1999.
22. Gerardin E, Sirigu A, Lehericy S, Poline JB, Gaymard B, Marsault C. Partially overlapping neural
networks for real and imagined hand movements. Cereb Cortex,1093–1104,2000.
23. Grezes J, Decety J. Functional anatomy of execution, mental simulation, observation and verb
generation of action: a meta-analysis. Hum Brain Mapp.12:1–19,2001.
24. Jeannerod M. Neural simulation of action: a unifying mechanism for motor
cognition. NeuroImage, pp 103–109, 2001.
25. Kimberley TJ, Khandekar G, Skraba LL, Spencer JA, Van Gorp EA, Walker SR. “Neural
substrates for motor imagery in severe hemiparesis,” Neurorehabil Neural Repair, vol:20
pp:268–277,2006.
26. Hanakawa T, Immisch I, Toma K, Dimyan M, Van Gelderen P, Hallett M. Functional properties
of brain areas associated with motor execution and imagery. J Neurophysiol,pp:989–1002,2003.
27. Dechent P, Merboldt KD, Frahm J. Is the human primary motor cortex involved in motor
imagery? Cogn Brain Res;19 pp:138–144,2004.
Human Factors of XR: Using Human Factors to Design XR SystemsMark Billinghurst
 

Recently uploaded (20)

WhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure service
WhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure serviceWhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure service
WhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure service
 
Swan(sea) Song – personal research during my six years at Swansea ... and bey...
Swan(sea) Song – personal research during my six years at Swansea ... and bey...Swan(sea) Song – personal research during my six years at Swansea ... and bey...
Swan(sea) Song – personal research during my six years at Swansea ... and bey...
 
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
 
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
 
Benefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other FrameworksBenefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other Frameworks
 
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
 
Pigging Solutions Piggable Sweeping Elbows
Pigging Solutions Piggable Sweeping ElbowsPigging Solutions Piggable Sweeping Elbows
Pigging Solutions Piggable Sweeping Elbows
 
Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)
 
Hyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your Budget
Hyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your BudgetHyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your Budget
Hyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your Budget
 
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxMaking_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
 
Snow Chain-Integrated Tire for a Safe Drive on Winter Roads
Snow Chain-Integrated Tire for a Safe Drive on Winter RoadsSnow Chain-Integrated Tire for a Safe Drive on Winter Roads
Snow Chain-Integrated Tire for a Safe Drive on Winter Roads
 
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial Buildings
 
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | DelhiFULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
 
Vulnerability_Management_GRC_by Sohang Sengupta.pptx
Vulnerability_Management_GRC_by Sohang Sengupta.pptxVulnerability_Management_GRC_by Sohang Sengupta.pptx
Vulnerability_Management_GRC_by Sohang Sengupta.pptx
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day Presentation
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen Frames
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 

BRAIN-COMPUTER INTERFACING TO DETECT STRESS DURING MOTOR IMAGERY TASKS

  • 5. Page 5 of 71 ACKNOWLEDGEMENT We would like to take this opportunity to acknowledge and thank those who made this work possible and memorable for us. First and foremost, we express our heartfelt gratitude to our guide, Prof. (Dr.) Anuradha Saha, for her constant encouragement, guidance and constructive feedback throughout the research and the preparation of this report. Her support towards our participation in the experiment is deeply appreciated. We express our sincere gratitude to Dr. Hirikesh Mondal, the Director, and Prof. A. K. Ghosh, the Principal, of Netaji Subhash Engineering College for providing a highly supportive research environment. We would also like to thank our external co-supervisors at the Artificial Intelligence Laboratory in the Department of Electronics and Communications Engineering at Jadavpur University, Kolkata for their helpful collaboration and for providing the facilities, equipment and assistance for the acquisition of EEG data from the subjects. Last but not least, we thank our friends for their support and encouragement; without it, this work would not have been possible. ABHISEK SENGUPTA ARNAB BAIN DEBOSHRUTI BANERJI MAHIM MALLICK PARAMITA DEY TETASH BASU Date…………………
  • 6. Page 6 of 71 ABSTRACT Brain-Computer Interfacing To Detect Stress During Motor Imagery Tasks is an emerging technology for rehabilitating motor deficits. HIGHLIGHTS: • BCIs permit reintegration of the sensory-motor loop by accessing brain information. • Motor imagery based BCIs appear to be effective systems for early rehabilitation. • This technology does not require residual motor activity and promotes neuroplasticity. • BCIs for rehabilitation are tending towards implantable devices combined with stimulation systems. When sensory-motor integration malfunctions, it provokes a wide variety of neurological disorders, which in many cases cannot be treated with conventional medication or with existing therapeutic technology. A brain-computer interface (BCI) is a tool that permits reintegration of the sensory-motor loop by accessing brain information directly. Motor rehabilitation has been a promising and widely investigated application of BCI. Motor deficits are among the most common disabilities that the worldwide population lives with. Therefore, this report specifies the foundations of motor-rehabilitation BCIs and reviews the recent research conducted in this area (especially from 2007 to date), in order to evaluate the suitability and reliability of this technology. Although BCI for post-stroke rehabilitation is still in its infancy, the tendency is towards the development of implantable devices that encompass a BCI module plus a stimulation system.
  • 7. Page 7 of 71 TABLE OF CONTENTS 1. An Introduction to Brain-Computer Interfacing 1.1 An overview of Brain-Computer Interfacing (BCI) 1.2 Types of BCI 1.3 Brain Map and Brain-Imaging Techniques 1.4 EEG Device 1.5 Components of a BCI System 2. Stress Detection during Motor-Imagery Tasks 2.1 Defining Stress and Motor Imagery (MI) 2.2 Decoding of Stress and MI 2.3 Brain Signals for Decoding MI 2.4 Features Used for Decoding MI 2.5 Classifiers Used for Decoding MI 2.6 Performance Analysis in MI-Based BCI Research 3. Stress Detection of Vehicle Drivers during Driving-A Case Study 3.1 Problem Formulation 3.2 Experimental Framework 3.3 Real-Time Signal Acquisition 3.4 Feature Extraction and Selection 3.5 Classifier Validation and Performance 4. Conclusions and Future Scope 4.1 Self-Review of the Work 4.2 Future Research directions
  • 8. Page 8 of 71 LIST OF FIGURES Fig. No. Description Page No. 1.1 Brain computer interfacing 11 1.2 Different brain lobes and their association with their cognitive abilities. 14 1.3 A typical EEG based BCI system 16 1.4 EEG acquisition device 18 1.5 Components of brain-computer interfacing 19 2.1 Various frequency components of EEG 32 3.1 Experimental Apparatus -Schematic Diagram 40 3.2 Experimental Apparatus -photographs of setup a) subject 1 with car simulation b) electrode placement EEG system c) subject 1 with brake and accelerator 41 3.3 EEG signal acquisition during the experiment 42 3.4 Screenshot of EEG signal showing mild stress levels 42 3.5 15 channel baseline plots 43 3.6 Mean plot of the 15 channels 43 3.7 Power spectral density plots of 0.2 seconds EEG data 44 3.8 a) Raw EEG signal b) Fourier Transformed Signal 45 3.9 Area under different sub-bands of the frequency spectrum (Z set) 46 3.10 4 wavelets obtained for 1000x15 raw EEG data 47 3.11 Kalman filter analysis of EEG data 52 3.12 Gaussian membership curve 53 3.13 Gaussian curve from PSD average values 54 3.14 Hjorth Complexity Gaussian plot during alarmingly stressed video 56 3.15 Wavelet Coefficient Gaussian plot during alarmingly stressed video 56 3.16 Kalman filter Gaussian plot during alarmingly stressed video 57 3.17 Power Spectral Density Gaussian plot during alarmingly stressed video 57 3.18 Power Spectral Density Gaussian plot during moderately stressed video 58 3.19 Wavelet coefficient Gaussian plot during moderately stressed video 58 3.20 Kalman filter Gaussian plot during moderately stressed video 59 3.21 Hjorth Complexity Parameter Gaussian plot during moderately stressed video 59 3.22 Power Spectral Density Gaussian plot during relaxed video 60 3.23 Kalman filter Gaussian plot during relaxed video 60 3.24 Wavelet coefficient Gaussian plot during relaxed video 61
  • 9. Page 9 of 71 LIST OF TABLES Table No. Description Page No. 2.1 Characteristics of EEG Bands 31 3.1 Peaks of Signals in Fig. 3.8(b) 45 3.2 Average Values of 10 PSD Graphs 54 3.3 Minimum and Maximum Values of Features for Relaxed Video 62 3.4 Minimum and Maximum Values of Features for Moderately Stressed Video 62 3.5 Minimum and Maximum Values of Features for Alarmingly Stressed Video 63 3.6 tnorm for different stress levels 63
  • 10. Page 10 of 71 Chapter 1 An Introduction to Brain-Computer Interfacing This chapter provides a general introduction to brain-computer interfacing (BCI) to explain the relation between brain signal processing and the cognitive tasks performed. The chapter begins with an overview and the types of BCI, and gradually progresses through the brain map and the different modalities of brain signaling/imaging techniques. The next part of the chapter includes a detailed description of the EEG and fNIRS devices, which will be used in the subsequent chapters. The later part of the chapter explains the major components of BCI, such as filtering, artifact removal, low-level feature extraction and classification.
  • 11. Page 11 of 71 1.1 OVERVIEW OF BRAIN COMPUTER INTERFACING The growing power of modern computers, together with our improving understanding of the human brain, is moving us closer to turning some spectacular science fiction into reality. Imagine transmitting signals directly to someone's brain that would allow them to see, hear or feel specific sensory inputs. Consider the potential to manipulate computers or machinery with nothing more than a thought. It isn't only about convenience: for severely disabled people, the development of a brain-computer interface (BCI) could be the most important technological breakthrough in decades. The reason a BCI works at all is because of the way our brains function. Our brains are filled with neurons, individual nerve cells connected to one another by dendrites and axons. Every time we think, move, feel or remember something, our neurons are at work. That work is carried out by small electric signals that zip from neuron to neuron as fast as 250 mph. The signals are generated by differences in electric potential carried by ions on the membrane of each neuron. Although the paths the signals take are insulated by a substance called myelin, some of the electric signal escapes. Scientists can detect those signals, interpret what they mean and use them to direct a device of some kind. It can also work the other way around. For example, researchers could determine what signals are sent to the brain by the optic nerve when someone sees the colour red. They could rig a camera that would send those exact signals into someone's brain whenever the camera saw red, allowing a blind person to "see" without eyes. Fig. 1.1 Brain computer interfacing
  • 12. Page 12 of 71 1.2 TYPES OF BCI BCI is of three types: i) invasive, ii) partially invasive and iii) non-invasive. In invasive BCI, electrodes are implanted directly into the grey matter of the brain. Although this produces the highest-quality signals among BCI devices, it is prone to scar-tissue build-up, causing the signal to become weaker, or even non-existent, as the body reacts to a foreign object in the brain. Partially invasive BCI involves electrodes implanted inside the skull but resting outside the brain. This method produces better-resolution signals than non-invasive BCIs and also carries a lower risk of scar-tissue formation in the brain than fully invasive BCIs. Lastly, in non-invasive BCI, electrodes are placed at specified positions on the scalp. It has the disadvantage of relatively poor spatial resolution, but it offers superior temporal resolution along with the merits of cost effectiveness, ease of wear and no requirement of surgery.
  • 13. Page 13 of 71 1.3 BRAIN MAP AND BRAIN IMAGING TECHNIQUES BCIs measure brain activity, process it, and produce control signals that reflect the user's intent. To understand BCI operation better, one has to understand how brain activity can be measured and which brain signals can be utilized. In this section, we focus on the brain map and the most important recording methods and brain signals. The human brain comprises on the order of 100 billion nerve cells, called neurons, which individually or in groups are responsible for executing complex mental tasks like interpretation of stimuli, memory encoding and recall, motor planning/execution and coordination of multi-sensory/sensory-motor interactions. Apart from this, the human brain is also involved in controlling most of our biological activities, including respiration rate, cardiac activity, muscular activity, and many others. The neurons in the brain, and also in the rest of our nervous system, act partly electrically and partly chemically for stimulus processing, signal transduction and motor activity. A look inside the neuron reveals that the cell body of the neuron yields a linear combination of the received electrical stimuli for transfer to the pre-synaptic region. The accumulated electrical stimuli then trigger the synapse to synthesize the neurotransmitters for transfer of information from the pre-synaptic region to the post-synaptic region. Thus, communication of information inside a neuron is performed by both electrical and chemical means. The brain is divided into three main modules, called the cerebrum, the cerebellum and the pons. The cerebrum is the largest part of the human brain and has the highest functionality. The second part, the cerebellum, is an area of the hindbrain. The third part, the pons, is the portion of the brain stem located above the medulla oblongata and below the midbrain. The cerebrum is covered with a cortical layer having a convoluted topography, called the cerebral cortex. It is a sheet of neural tissue that packs a large surface area within the skull by folding on itself. The cerebral cortex is divided into almost symmetrical right and left hemispheres. Each hemisphere consists of different lobes: the frontal, parietal, temporal and occipital lobes. Besides the four lobes, neocortical areas of the brain, including the primary motor and sensorimotor cortices, play major roles in motor planning/execution and tactile perception, respectively. Figure 1.2 shows the different brain lobes and their association with cognitive abilities, which are briefly described in this section.
  • 14. Page 14 of 71 1. The frontal lobe is one of the important lobes of cerebral hemisphere. It is located in the frontal part of the brain. Central sulcus separates the frontal lobe from the parietal lobe whereas sylvian sulcus separates the frontal lobe from the temporal lobe. The frontal lobe and its pre-frontal region are responsible for problem solving tasks, physical reaction, abstract thinking, planning, short term memory task and motivation [1]. The anterior portion of frontal lobe is known as pre-frontal area, which is associated with olfaction recognition [2], [3] and emotion recognition [4], [5]. 2. The parietal lobe extends from the central sulcus nearly to the occipital lobe and is situated on the postcentral gyrus, which is responsible for processing all tactile and proprioceptive sensory information from the contralateral side of the body [6]. This lobe is also used for planning/navigation and spatial sense. 3. The temporal lobe, which is the largest brain lobe (containing approximately 17% of the cerebral cortex) [7], is situated below the frontal lobe, and is separated from the frontal lobe by sylvian sulcus [8]. The temporal lobe controls auditory and olfactory information processing, semantic memory, and perception of spoken or written language [8]. Fig. 1.2 Different brain lobes and their association with their cognitive abilities
  • 15. Page 15 of 71 4. The occipital lobe is the smallest lobe in the brain. It is situated behind the parietal lobe. The main function of this lobe is visual reception, colour recognition and visuo-spatial processing [9]. Acquisition of brain activity broadly falls under two categories: i) without surgery, and ii) with surgery. ❖ Measuring Brain Activity (Without Surgery) Brain activity produces electrical and magnetic activity. Therefore, sensors can detect different types of changes in electrical or magnetic activity, at different times over different areas of the brain, to study brain activity. Most BCIs rely on electrical measures of brain activity and rely on sensors placed over the head to measure this activity. Electroencephalography (EEG) refers to recording electrical activity from the scalp with electrodes. It is a very well-established method, which has been used in clinical and research settings for decades. EEG equipment is inexpensive, lightweight, and comparatively easy to apply. Temporal resolution, meaning the ability to detect changes within a certain time interval, is very good. However, the EEG is not without disadvantages: The spatial (topographic) resolution and the frequency range are limited. The EEG is susceptible to so-called artifacts, which are contaminations in the EEG caused by other electrical activities. Examples are bioelectrical activities caused by eye movements or eye blinks (electrooculographic activity, EOG) and from muscles (electromyographic activity, EMG) close to the recording sites. External electromagnetic sources such as the power line can also contaminate the EEG. Furthermore, although the EEG is not very technically demanding, the setup procedure can be cumbersome. To achieve adequate signal quality, the skin areas that are contacted by the electrodes have to be carefully prepared with special abrasive electrode gel. Because gel is required, these electrodes are also called wet electrodes. The number of electrodes required by current BCI systems ranges from only a few to more than 100 electrodes. Most groups try to minimize the number of electrodes to reduce setup time and hassle. Since electrode gel can dry out and wearing the EEG cap with electrodes is not convenient or fashionable, the setting up procedure usually has to be repeated before each session of BCI use. From a practical viewpoint, this is one of largest drawbacks of EEG-based BCIs. A possible solution is a technology called dry electrodes. Dry electrodes do not require skin preparation nor electrode gel. This technology is currently being researched, but a practical solution that can provide signal quality comparable to wet electrodes is not in sight at the moment. A BCI analyzes ongoing brain activity for
  • 16. Page 16 of 71 brain patterns that originate from specific brain areas. To get consistent recordings from specific regions of the head, scientists rely on a standard system for accurately placing electrodes, which is called the International 10–20 System [10]. It is widely used in clinical EEG recording and EEG research as well as BCI research. The name 10–20 indicates that the most commonly used electrodes are positioned 10, 20, 20, 20, 20, and 10% of the total nasion-inion distance. The other electrodes are placed at similar fractional distances. The inter-electrode distances are equal along any transverse (from left to right) and antero-posterior (from front to back) line and the placement is symmetrical. The labels of the electrode positions are usually also the labels of the recorded channels. For example, if an electrode is placed at site C3, the recorded signal from this electrode is typically also denoted as C3. The first letters of the labels give a hint of the brain region over which the electrode is located: Fp – pre-frontal, F – frontal, C – central,P – parietal, O – occipital, T – temporal. Fig. 1.3 provides a typical EEG based BCI system that consists of an electrode cap with electrodes, cables that transmit the signals from the electrodes to the bio- signal amplifier, a device that converts the brain signals from analog to digital format, and a computer that processes the data as well as controls and often even runs the BCI application. Fig. 1.3 A typical EEG based BCI system
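As a small illustration of the 10–20 naming convention just described, the following Python snippet maps a channel label to the scalp region implied by its leading letters. It is a minimal sketch only: the channel labels listed are an example subset of the 10–20 system, and the helper function is hypothetical rather than part of any acquisition software.

```python
# Minimal sketch: infer the scalp region suggested by the first letter(s)
# of a 10-20 channel label (Fp - pre-frontal, F - frontal, C - central,
# P - parietal, O - occipital, T - temporal).

REGION_BY_PREFIX = {
    "Fp": "pre-frontal",
    "F": "frontal",
    "C": "central",
    "P": "parietal",
    "O": "occipital",
    "T": "temporal",
}

def region_of(label: str) -> str:
    """Return the region implied by a 10-20 channel label, e.g. 'C3' -> 'central'."""
    # Check the two-letter prefix first so 'Fp1' is not mistaken for frontal.
    for prefix in ("Fp", "F", "C", "P", "O", "T"):
        if label.startswith(prefix):
            return REGION_BY_PREFIX[prefix]
    return "unknown"

if __name__ == "__main__":
    for ch in ["Fp1", "F3", "C3", "P4", "O2", "T5"]:
        print(ch, "->", region_of(ch))
```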
  • 17. Page 17 of 71 ❖ Measuring Brain Activity (With Surgery) The techniques discussed in the last section are all non-invasive recording techniques. For example, there is no need to perform surgery or even break the skin. In contrast, invasive recording methods require surgery to implant the necessary sensors. This surgery includes opening the skull through a surgical procedure called a craniotomy and cutting the membranes that cover the brain. When the electrodes are placed on the surface of the cortex, the signal recorded from these electrodes is called the electrocorticogram (ECoG). ECoG does not damage any neurons because no electrodes penetrate the brain. The signal recorded from electrodes that penetrate brain tissue is called intra-cortical recording. Invasive recording techniques combine excellent signal quality, very good spatial resolution, and a higher frequency range. Artifacts are less problematic with invasive recordings. Further, the cumbersome application and re-application of electrodes as described above is unnecessary for invasive approaches. Intra-cortical electrodes can record the neural activity of a single brain cell or small assemblies of brain cells. The ECoG records the integrated activity of a much larger number of neurons that are in the proximity of the ECoG electrodes. However, any invasive technique has better spatial resolution than the EEG. Clearly, invasive methods have some advantages over non- invasive methods. However, these advantages come with the serious drawback of requiring surgery. Ethical, financial, and other considerations make neurosurgery impractical except for some users who need a BCI to communicate. Even then, some of these users may find that a noninvasive BCI meets their needs. It is also unclear whether both ECoG and intracortical recordings can provide safe and stable recording over years. Long term stability may be especially problematic in the case of intra- cortical recordings. Electrodes implanted into the cortical tissue can cause tissue reactions that lead to deteriorating signal quality or even complete electrode failure. Research on invasive BCIs is difficult because of the cost and risk of neurosurgery. For ethical reasons, some invasive research efforts rely on patients who undergo neurosurgery for other reasons, such as treatment of epilepsy. Studies with these patients can be very informative, but it is impossible to study the effects of training and long-term use because these patients typically have an ECoG system for only a few days before it is removed.
  • 18. Page 18 of 71 1.4 EEG DEVICE Electroencephalography (EEG) is an electrophysiological monitoring method used to record the spontaneous electrical activity of the brain over a period of time. EEG measures voltage fluctuations resulting from ionic currents within the neurons of the brain. It is typically noninvasive, with the electrodes placed along the scalp, although invasive electrodes are sometimes used in specific applications. The amplitude of the EEG is about 100 µV when measured on the scalp, and about 1-2 mV when measured on the surface of the brain. The EEG signal is sub-divided into a number of specific frequency bands, including i) delta (~0.1-3 Hz), ii) theta (~3-7 Hz), iii) alpha (~7-13 Hz), iv) mu (~7-13 Hz), v) beta (~13-30 Hz) and vi) gamma (>30 Hz). The versatile and expandable EEG-1200 model from NIHON KOHDEN is equipped for all in-patient/human-subject EEG diagnostic applications. Ranging from standard equipment for routine EEG to the highest clinical discipline of intracranial long-term monitoring, the EEG-1200 provides an ideal basis for customized configuration. Fig. 1.4 shows the stand-alone EEG data acquisition device as presented on the official website. Our procured version consists of a 32-channel amplifier, of which 21 channels are dedicated to measuring EEG signals and the remaining 11 channels are dedicated to SpO2, EtCO2 and DC measurements. Fig 1.4 EEG Acquisition Device
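To make the band definitions above concrete, the sketch below separates a signal into the listed bands with zero-phase Butterworth band-pass filters. It is an illustrative sketch only, not the processing chain of the EEG-1200 system: the 250 Hz sampling rate, the gamma upper edge and the synthetic test signal are assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 250.0  # assumed sampling rate (Hz), for illustration only

BANDS = {                  # approximate edges quoted in the text
    "delta": (0.5, 3.0),   # lower edge raised slightly from ~0.1 Hz for filter stability
    "theta": (3.0, 7.0),
    "alpha": (7.0, 13.0),
    "beta":  (13.0, 30.0),
    "gamma": (30.0, 45.0), # upper edge capped well below the Nyquist frequency
}

def bandpass(x, lo, hi, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter."""
    sos = butter(order, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

if __name__ == "__main__":
    t = np.arange(0, 10, 1 / FS)
    # Synthetic "EEG": a 10 Hz alpha rhythm plus broadband noise.
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
    for name, (lo, hi) in BANDS.items():
        comp = bandpass(eeg, lo, hi)
        print(f"{name:5s} band RMS: {np.sqrt(np.mean(comp ** 2)):.3f}")
```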
  • 19. Page 19 of 71 1.5 COMPONENTS OF A BCI SYSTEM The main aim of an EEG-based BCI system is to create a communication channel between the user's intention and an external device (e.g. a computer or a prosthesis) without any muscular intervention. Unfortunately, while executing an assigned task, the human brain occasionally entertains parallel thoughts, which may appear as cross-talk in the EEG signals recorded to examine the targeted task. If the frequency bands of the EEG signals for the non-targeted parallel tasks do not overlap with those of the targeted task, the frequency band for the targeted task can be separated from the parallel thoughts by filtering. Since the EEG signal is of very low frequency and the pass bands for individual tasks are very narrow, we use digital filtering rather than conventional analog filtering. Fig. 1.5 presents all steps of a BCI. The step that follows digital filtering is feature extraction. Feature extraction involves determining the most appropriate features of the acquired EEG that best represent the EEG signal for a given task. In other words, true features of an EEG signal are those which, directly or indirectly, can help in reconstructing the EEG signal. Unfortunately, there is no standard technique to extract the true features of an EEG for a given task. The usual practice is therefore to determine a set of standard features that can capture one or more characteristics of the EEG signal. If the list of features is too long, we need to select a smaller subset of them. In fact, there is an extensive literature on feature selection. A few approaches that deserve mention include forward search, backward search and evolutionary search algorithms [11], [12]. The motivation of these algorithms is to identify a subset of features from the full set so that they best represent the EEG signals at the sampled time-points. Fig. 1.5 Components of a brain-computer interfacing system
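As an illustration of the forward-search idea mentioned above, the sketch below greedily adds, one at a time, the feature that most improves cross-validated accuracy. It is a minimal example under stated assumptions: the random matrix stands in for extracted EEG features, and scikit-learn's logistic regression is used only as a convenient stand-in classifier, not the classifier used later in this report.

```python
# Greedy forward-search feature selection (illustrative sketch).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.standard_normal((120, 8))            # 120 trials x 8 candidate features
# Synthetic labels driven mostly by features 2 and 5.
y = (X[:, 2] + 0.5 * X[:, 5] + 0.3 * rng.standard_normal(120) > 0).astype(int)

def forward_select(X, y, max_features=3):
    """Repeatedly add the feature that most improves 5-fold CV accuracy."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < max_features:
        scores = {
            j: cross_val_score(LogisticRegression(), X[:, selected + [j]], y, cv=5).mean()
            for j in remaining
        }
        best = max(scores, key=scores.get)
        selected.append(best)
        remaining.remove(best)
    return selected

print("Selected feature indices:", forward_select(X, y))
```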
  • 20. Page 20 of 71 Most BCI techniques terminate with a classification algorithm that aims at separating the target task/class from the rest. Most BCI problems are formulated as two-class classification problems, unless the problem is by nature a multi-class one. In a two-class classification task, the classifier produces a binary output: one for the target class and zero for the rest. A multi-class classification problem, such as classification of aroma from EEG signatures, is again usually solved as a sequence of two-class classification problems. For example, let A, B and C be three classes. We use a binary classifier to classify the features into A and non-A; the non-A class is then further classified into classes B and C. Had there been more than three classes, the classification tree would be deeper, but it would follow the same principle. Occasionally, a few BCI systems require additional steps to realize a controller that executes specific control tasks based on the results of classification. For example, if the classifier response is class A, we may need to turn a motor on; if it is class B, we may turn it off. More sophisticated control logic is also adopted in recent BCI systems [11], where the motor is activated based on the classification of subjective motor imagery and stopped based on the occurrence of an error when the motor shaft passes the fixed target position.
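The two-stage scheme described above (A versus non-A, then B versus C within non-A) can be sketched as follows. The synthetic three-class data and the SVM classifier are illustrative assumptions only, not the classifiers evaluated in this work.

```python
# Hierarchical two-class decomposition of a three-class problem (sketch).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=6, n_informative=4,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

clf_a = SVC().fit(X, (y == 0).astype(int))                 # stage 1: A vs. non-A
mask = y != 0
clf_bc = SVC().fit(X[mask], (y[mask] == 1).astype(int))    # stage 2: B vs. C

def predict(x):
    x = x.reshape(1, -1)
    if clf_a.predict(x)[0] == 1:
        return 0                                  # class A
    return 1 if clf_bc.predict(x)[0] == 1 else 2  # class B or class C

preds = np.array([predict(x) for x in X])
print("Training accuracy of the two-stage tree:", (preds == y).mean())
```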
  • 21. Page 21 of 71 Chapter 2 Stress Detection Using Motor Imagery Tasks The chapter begins with the definitions of stress and motor imagery, and gradually progresses through different brain signals, including the P300 event-related potential, event-related de-synchronization/synchronization, the slow cortical potential, the steady-state visual evoked potential and the error-related potential. This chapter provides a brief review of current research directions and the scope of EEG signals in decoding motor imagery. It also covers feature extraction techniques, such as discrete wavelet transforms, power spectral density, adaptive autoregressive parameters, Hjorth parameters and common spatial patterns, and provides a discussion on EEG signal classification to decode cognitive activities. The list of classifiers includes Linear Discriminant Analysis (LDA), Support Vector Machine (SVM), Multi-layer Perceptron (MLP), Hidden Markov Model (HMM), the k-nearest neighbor (kNN) algorithm and the Naïve Bayes classifier. The chapter ends with an outline of well-known performance analysis metrics.
  • 22. Page 22 of 71 2.1 DEFINING STRESS AND MOTOR IMAGERY (MI) Stress is primarily a physical response, which releases a complex mix of hormones and chemicals such as adrenaline, cortisol and norepinephrine to prepare the body for physical action. This causes a number of reactions, from blood being diverted to muscles to shutting down unnecessary bodily functions such as digestion. Motor imagery is a cognitive process in which a subject imagines that he or she performs a movement without actually performing the movement and without even tensing the muscles. It is a dynamic state during which the representation of a specific motor action is internally activated without any motor output. In other words, motor imagery requires the conscious activation of brain regions that are also involved in movement preparation and execution, accompanied by a voluntary inhibition of the actual movement [13].
  • 23. Page 23 of 71 2.2 DECODING OF STRESS AND MI ❖ Decoding of Stress Traffic accidents all over the world are increasing day by day, posing a serious danger to the driver's life and the lives of other people. Crashes are among the top three causes of death throughout a person's lifetime. This is mainly due to diminished driver vigilance, which causes a decline in perception, recognition and vehicle-control abilities. Fatigue, stress and our emotions have a serious effect on driving, causing impairments that we may not even be aware of. For this reason, developing systems that actively monitor the driver's level of vigilance and alert the driver to any unsafe driving condition is essential for accident prevention. Many efforts have been made to develop active-safety automatic car control systems to reduce the number of automobile accidents due to stress, fatigue, drunkenness, sleepiness or health problems. In [14], electroencephalogram (EEG) signals are used in a drowsiness detection system which identifies suitable driver-related and/or vehicle-related variables that are correlated with the driver's level of drowsiness. Electrodes placed on the scalp send signals to a computer, which records the results. In effect, a brain-computer interface (BCI) is created that enables control of devices through cerebral activity alone, without using muscles. To analyze drowsiness automatically, the EEG power spectrum can be computed using the Fast Fourier Transform (FFT) or the wavelet transform in MATLAB. The circuit involves an EEG detection circuit, a microcontroller circuit and a processing circuit. The microcontroller circuit receives the EEG signal and generates a control signal that is sent to the processing unit. According to the control signal, the processing unit processes and analyses the EEG signal so as to assess the fatigue level of the person. In [15], early symptoms of fatigue in train drivers are detected by an image-processing method that compares frames in a video and uses indirectly estimated human features.
  • 24. Page 24 of 71 At the onset of fatigue due to any severe medical problem or sleep deprivation, an immediate message is transferred to the control room through the system's GSM module, indicating the status of the driver. Heart-rate sensors are also added to the fatigue-detection system. The technique focuses on the different states of a person while driving, i.e. awake, drowsy/sleepy and asleep. An alarm or buzzer, known as the Automatic Alarm System (AAS), is sounded in the driver's cabin if the train passes a caution or stop signal given by the control room. In many scenarios the driver may be suffering from a heart attack or another critical condition; such a situation is also detected easily with the help of the heart-rate detection system using a heartbeat sensor. An accelerometer is also used to detect the motion of the face, since a person in a drowsy or fatigued state has a range of facial motion different from that of a person in a normal state. No movement of the driver indicates that the driver is asleep or has lost consciousness. Eye blinking and the degree of eyelid opening are also factors for detecting driver fatigue, since the blinking rate of an awake person differs from that of a sleepy person, and the gap between the eyelids is wide when a person is fully awake, as studied in [4]. In [16], remotely located charge-coupled-device cameras equipped with active infrared illuminators are used to acquire video images of the driver. The level of alertness of the person is extracted in real time. The visual cues employed characterize eyelid movement, gaze movement, head movement and facial expression. A probabilistic model is developed to model human alertness and to accurately predict drowsiness or fatigue based on the multiple visual cues obtained. In [17], stress is detected using eye blinks and brain activity from EEG signals. While driving, stressful emotions can be triggered in the participant, and the eye-blink frequency can be correlated with the experienced stress. Longitudinal differences between two prefrontal-cortex sensors, in combination with amplitude maps, are used to classify eye blinks. It is hard to generalize the interpretation of these associations, since eye blinks can differ between persons. The eye blinks of one test subject are detected, and the eye-blink frequency of the subjects is correlated with the experienced level of stress. Brain activity is significantly higher when doing mental calculations with eyes open than with eyes closed. The results of this research, in combination with other stress detectors, lead to applications that improve transport safety and support other areas where stress levels need to be monitored.
  • 25. Page 25 of 71 ❖ Decoding of MI Over the past two decades, motor imagery (MI) has been used to design EEG-based BCI systems that enable individuals with motor impairments to control various assistive devices, such as wheelchairs, prosthetic devices, and computers. In fact, a MI task can be defined as a mental process in which an individual imagines himself/herself performing a specific action without real activation of the muscles. During MI tasks, various regions in the brain are activated such as primary motor cortex (M1), primary and secondary sensory areas, pre-frontal areas, superior and inferior parietal lobules, and dorsal and ventral pre-motor cortices. Therefore, the development of BCI systems that can effectively analyze brain signals and discriminate between different MI tasks to control neural prostheses devices has the potential to enhance the quality of life for people with severe motor disabilities. Literature reveals that the vast majority of the existing MI EEG-based BCI systems were focused on differentiating between MI tasks that are associated with four different body parts, including feet, left hand, right hand, and tongue. Despite the relatively high classification accuracies attained for classifying MI tasks performed by different body parts, the discrimination between MI tasks within the same hand is considered challenging. This can be attributed to three limitations associated with the EEG signals. First, the low spatial resolution of the EEG signals constrains the ability to discriminate between MI tasks of the same hand that activate similar and close areas in the brain. In fact, this limitation becomes more pronounced when the MI tasks are associated with the same joint in the hand, such as wrist movements. Second, due to the volume conducted effect, EEG signals have a limited signal-to-noise ratio. This in turn can drastically reduce the ability to discriminate between EEG signals of different dexterous MI tasks within the same hand, such as fingers- and wrist-related tasks. Third, the spectral characteristics of the EEG signals are time varying, or non-stationary. The non-stationary characteristics of EEG signals introduce large intra-trial variations for each subject and inter-personal variations between subjects, which increase the difficulty to discriminate between the EEG signals of MI tasks within the same hand. Therefore, traditional time-domain and frequency- domain representations, which are employing the time-invariance assumption, are considered inadequate to represent EEG signals.
  • 26. Page 26 of 71 Recently, a few studies have been reported to utilize EEG signals in order to discriminate between flexion/extension movements of the fingers as well as several wrist movements, including flexion and extension. The promising results reported in these studies demonstrate the possibility of utilizing EEG signals to discriminate between MI tasks within the same hand. Nonetheless, these studies have been conducted using EEG signals acquired from intact subjects, without exploring the capability of classifying MI tasks within the same hand using EEG signals that are acquired from individuals with hand amputations. In addition, a fast-growing number of studies indicated that brain areas engaged in the actual performance of movements are also active during motor imagery [18]-[25]. Multiple studies showed the involvement of the premotor, supplementary motor, cingulate and parietal cortical areas, the basal ganglia, and the cerebellum, not only during the actual execution of a movement but also during the imagination of a movement [26], [27]. In [28], authors showed that the imagination of different moving body parts (foot, hand and tongue) activated the precentral gyrus in a somatotopic manner. Similar results were obtained in [29], where authors showed that imagery of finger, tongue and toe movements activated the specified organized areas of the primary motor cortex in a systematic manner, which means that imagery of finger movement activated the finger area, imagery of toe movements activated the foot zones of the posterior part of the contralateral supplementary motor area and the contralateral primary motor cortex and imagery of tongue movements activated the tongue region of the primary motor cortex. These data suggest that the imagined body part is reflected more or less directly in the pattern of cortical activation. The results are in accordance with an earlier study [30], where motor imagery influenced the corticospinal excitability in a very specific way. For example, motor imagery of forearm flexion enhances the MEPs of the m. biceps brachialis, an agonist during forearm flexion, whereas this was not the case during imagery of forearm extension, where the m. biceps brachialis acts as an antagonist. Hence, motor imagery does not lead to a generalized muscular arousal but to movement-specific central activation patterns. In [31], motor imagery had an effect on the spinal segmental excitability. Nine healthy adult participants had to perform a series of imagined flexion-extension movements of the fingers. The results indicated a subthreshold activation of spinal motoneurons. Hence, at this moment there is ample evidence that motor execution and motor imagery activate overlapping areas in the brain. Although the majority of the studies are focused on hand/finger or mouth movements, it is in the
  • 27. Page 27 of 71 context of the present text, relevant to note that the activation of brain cortical areas during motor imagery is not limited to hand/finger or mouth movements but that also the imagination of gross movements results in the activation of relevant areas. In [32], the activation of the pre-supplemary motor area and the primary motor cortex is shown during imagery of locomotor movements. Besides the overlap in neural activation between imagery and execution there are also similarities in the behavioural domain. For instance, the time to complete an imagined movement is known to be similar to the time needed for actual execution of that movement. This phenomenon is known as mental isochrony. In [33], authors showed that the time needed to judge whether a rotated picture of a hand represents a left or a right hand is related to the degree of rotation of that picture. Furthermore, he showed that when the exposed hand positions were awkward or biomechanically difficult, the imagined rotation time increased more than for equally rotated hands in biomechanically easy positions and that the rotation time was similar to real hand rotation time for these positions. The fact that motor imagery seems to respect the normal biomechanical constraints of real movements indicates that these tasks are not accomplished by mere visual imagery but must be solved by imagining the movement of one’s own arm and hand.
  • 28. Page 28 of 71 2.3 BRAIN SIGNALS FOR DECODING MI During the execution of different cognitive tasks, the EEG signals released by the brain exhibit certain special characteristics, which can be detected from the temporal changes in signal wave shapes. An EEG signal elicited in response to specific events or stimuli is referred to as an Event-Related Potential (ERP) [34]. Certain ERPs liberated in response to sensory stimuli with relevant discrete phase-locked events are referred to as Evoked Potentials (EP) [35]. EPs are best described by their polarity (positive or negative) and latency counted from the onset of the stimulus. Among the EPs, the N100, P200, N200, P300 [36], [37], Slow Cortical Potential (SCP) [38] and Error-related Potential (ErrP) [39] need special mention. One special type of EP, which exhibits natural responses to visual stimulation at specific frequencies, is referred to as the Steady-State Visual-Evoked Potential (SSVEP) [38]. Besides these, certain EEG signals are induced spontaneously as a response to specific cognitive tasks without any stimuli. These ERPs, liberated in the absence of any stimuli, represent frequency-specific changes and are generally referred to as non-phase-locked ERPs [40]. A well-known example is Event-Related De-synchronization/Synchronization (ERD/ERS) [41], where an event-related decrease in power is noticed at the onset of motor imagery/execution. This phase of the signal is referred to as Event-Related De-synchronization (ERD). After the motor imagination/execution is over, the signal power increases until the original signal power is restored; the latter phase is referred to as ERS.
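A minimal sketch of how ERD/ERS can be quantified is given below: the band power during motor imagery is compared with the band power of a preceding baseline window, so that a power drop indicates ERD and a rebound above baseline indicates ERS. The 250 Hz sampling rate, the mu-band edges and the synthetic signals are assumptions for illustration, not the experimental settings of this work.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 250.0  # assumed sampling rate (Hz)

def band_power(x, lo=8.0, hi=13.0, fs=FS):
    """Mean power of x in a band (default: an assumed mu band)."""
    sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return np.mean(sosfiltfilt(sos, x) ** 2)

def erd_percent(baseline, event):
    """Percentage power change relative to baseline.
    Positive value = power decrease (ERD); negative = increase (ERS)."""
    p_ref, p_evt = band_power(baseline), band_power(event)
    return 100.0 * (p_ref - p_evt) / p_ref

if __name__ == "__main__":
    t = np.arange(0, 2, 1 / FS)
    rng = np.random.default_rng(1)
    # Synthetic example: the mu rhythm is attenuated during imagery.
    baseline = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)
    imagery = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)
    print(f"ERD: {erd_percent(baseline, imagery):.1f} %")
```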
  • 29. Page 29 of 71 2.4 FEATURES USED FOR DECODING MI A feature represents a distinguishing property, a recognizable measurement, or a functional or structural component obtained from a section of a pattern. A variety of methods have been widely used to extract features from EEG signals; among them are time-frequency distributions (TFD), the Fast Fourier Transform (FFT), eigenvector methods (EM), the wavelet transform (WT) and the autoregressive method (ARM). The power spectral density describes how the signal energy, or power, is distributed over frequency. It is a useful concept that allows us to determine the bandwidth of the system. To understand how the strength of the signal is distributed in the frequency domain, we take the help of filter-based analysis. Since the power of a signal is a measure of its strength, in this study we use the classical definition of the power spectral density, derived as the Fourier transform of the autocorrelation function. The strength of the Fourier transform in signal analysis and pattern recognition is its ability to reveal spectral structures that may be used to characterize a signal. For example, for a periodic signal the power is concentrated in extremely narrow bands of frequencies, indicating the existence of structure and the predictable character of the signal, whereas for a purely random signal the power is spread equally in the frequency domain, indicating the lack of structure in the signal. Hence, the more correlated or predictable a signal, the more concentrated its power spectrum; conversely, the more random or unpredictable a signal, the more spread its power spectrum. Therefore, the power spectrum of a signal can be used to deduce the existence of repetitive structures or correlated patterns in the signal process. Such information is crucial in detection, decision-making and estimation problems, and in systems analysis. The Fast Fourier Transform (FFT) is a useful scheme for extracting frequency-domain signal features. Fourier analysis is extremely useful for data analysis, as it breaks down a signal into constituent sinusoids of different frequencies. For sampled vector data, Fourier analysis is performed using the discrete Fourier transform (DFT); the FFT is an efficient algorithm for computing the DFT of a sequence. Since the early days of automatic EEG processing, representations based on a Fourier transform have been most commonly applied. This approach is based on earlier observations that the EEG
  • 30. Page 30 of 71 spectrum contains some characteristic waveforms that fall primarily within frequency bands—delta, theta, alpha, beta and gamma. The oscillatory activity of the spontaneous EEG is typically categorized into five different frequency bands: delta (0-4 Hz), theta (4-8), alpha (8-12), beta (12-30) and gamma (30-100 Hz). These frequency bands are suggested to be a result of different cognitive functions. • Delta (0 -4 Hz): Delta activity is characterized as high amplitude and low frequency. It is usually associated with the slow-wave sleep. Delta waves represent the onset of deep sleep phases in healthy adults. In addition, contamination of the eye activity is mostly represented in the delta frequency band. • Theta (4-8Hz): The generation of theta power is associated with the hippocampus as well as neocortex. The theta band is associated with deep relaxation or meditation and it has been observed at the transition stage between wake and sleep. Theta rhythms are suggested to be important for learning and memory functions, encoding and retrieval which involve high concentration. Theta oscillations are also associated with the attentional control mechanism in the anterior cingulated cortex and are often shown to increase with a higher cognitive task demand. • Alpha (8-12Hz): Alpha band activity is found at the occipital lobe during periods of relaxation or idling i.e. eyes closed but awake. It is characterized by high amplitude and regular oscillations with a maximum over parietal and occipital electrodes in the continuous EEG. The modulation of alpha activity is thought to be a result of resonation or oscillation of the neuron groups. High alpha power has been assumed to reflect a state of relaxation. However, when the operator devotes more effort to the task, different regions of the cortex may be recruited in the transient function network leading to passive oscillation of the local alpha generators in synchrony with a reduction in alpha power. Recent results suggested that alpha is involved in auditory attention processes and the inhibition of task irrelevant areas to enhance signal-to-noise ratio. Additionally, some researchers divide the alpha activity further into sub-bands to achieve a finer grained description of its functionality. For example, the “mu” band (10-12 Hz) occurs with actual motor movement and intent to move with an associated diminished activation of the motor cortex. • Beta (13-30Hz): The beta wave is predominant when the human is wide awake. Spatially, it predominates in the fontal and central area of the brain. It has been described that the high power in the beta band is associated with the increased arousal and activity pointed out that the beta wave represents cognitive consciousness and an active, busy, or anxious thinking. Furthermore, it has been revealed to reflect visual concentration and the orienting of attention. The beta band can be further
  • 31. Page 31 of 71 divided into several sub-bands: low beta (12.5-15 Hz); middle beta (15-18 Hz); high beta (>18 Hz). These three sub-bands are associated with separate physiological processes. • Gamma (>30 Hz): The gamma band is the fastest activity in the EEG and is thought to be infrequent during waking states of consciousness. Gamma waves are reported to be associated with the perceptual binding problem. More specifically, areas of the lateral occipital cortex and fusiform gyrus play an important role in visual stimulus encoding and show large gamma oscillations that are differently affected by attentional modulation. Recent studies reveal that gamma is linked with many other functions such as attention, learning, memory, and language perception. Additionally, verbal memory formation has been shown to lead to an increase in gamma oscillations in intracranial recordings from epilepsy patients. Table 2.1 provides the characteristics of the EEG bands. Table 2.1 Characteristics of EEG Bands
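As an illustration of how band-limited power can be extracted from a recorded EEG channel, the following MATLAB sketch computes the power in each of the five bands listed above from the FFT spectrum. It is a minimal sketch, not the project's exact code: the channel vector x, the sampling rate fs and the exact band edges are assumptions chosen to match the ranges quoted in this section.

```matlab
% Minimal sketch: band power of one EEG channel via the FFT.
fs = 256;                          % assumed sampling rate (Hz)
x  = randn(1, 4*fs);               % placeholder signal standing in for real EEG

N = length(x);
X = fft(x);
P = (abs(X).^2) / N;               % periodogram-style power estimate
f = (0:N-1) * fs / N;              % frequency axis of the DFT bins

bands = {'delta', 0, 4; 'theta', 4, 8; 'alpha', 8, 12; ...
         'beta', 12, 30; 'gamma', 30, 100};

for k = 1:size(bands, 1)
    idx = f >= bands{k,2} & f < bands{k,3};   % bins inside the band
    bandPower = sum(P(idx));                  % power in that band
    fprintf('%-6s band power: %.3f\n', bands{k,1}, bandPower);
end
```

With real data, x would be one of the recorded channels, and the absolute band powers can be normalized by the total power to obtain relative band powers.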
  • 32. Page 32 of 71 The wavelet transform (WT) forms a general mathematical tool for signal processing with many applications in EEG data analysis. Its basic uses include time-scale signal analysis, signal decomposition and signal compression. Since the EEG signal is non-stationary, a suitable way to extract features from the raw data is to use time-frequency domain methods such as the WT, a spectral estimation technique in which any general function can be expressed as an infinite series of wavelets. Since the WT allows the use of variable-sized windows, it gives a more flexible way of obtaining a time-frequency representation of a signal: long time windows are used to obtain finer low-frequency resolution, whereas short time windows are used to capture high-frequency information. Furthermore, the WT involves a multi-scale structure rather than a single scale. Generally, wavelets are purposefully crafted to have specific properties that make them useful for signal processing. Wavelets can be combined, using a "reverse, shift, multiply and integrate" technique called convolution, with portions of a known signal to extract information from the unknown signal. A related use is smoothing/denoising data based on wavelet coefficient thresholding, also called wavelet shrinkage. By adaptively thresholding the wavelet coefficients that correspond to undesired frequency components, smoothing and/or denoising operations can be performed. Fig. 2.1 shows various EEG frequency components, where the components are generated through the wavelet transform; the delta, theta, alpha, beta and gamma bands correspond to the wavelet decompositions. Fig. 2.1 Various frequency components of EEG
  • 33. Page 33 of 71 The wavelet transform provides a potentially powerful technique for pre-processing EEG signals prior to classification. The WT plays an important role in the recognition and diagnostic field: it compresses the time-varying biomedical signal, which comprises many data points, into a few parameters that represent the signal. There are two categories of WT: the first is continuous and the other is discrete. The Hjorth parameters are one way of describing the statistical properties of a signal in the time domain; there are three of them: Activity, Mobility, and Complexity. They are also called "normalized slope descriptors" because they can be defined by means of the first and second derivatives. ❖ HJORTH ACTIVITY: The first parameter is a measure of the mean power, representing the activity of the signal. The Activity parameter represents the signal power, i.e. the variance of the time function; it can be interpreted as the surface of the power spectrum in the frequency domain. This is represented by Eq. 2.1: $\mathrm{Activity} = \operatorname{var}\bigl(x(t)\bigr)$, the signal power, (2.1) where x(t) represents the signal. ❖ HJORTH MOBILITY: The Mobility parameter represents the mean frequency, or the proportion of the standard deviation of the power spectrum. It is defined as the square root of the variance of the first derivative of the signal x(t) divided by the variance of x(t), and is represented by Eq. 2.2: $\mathrm{Mobility} = \sqrt{\dfrac{\mathrm{Activity}\left(\frac{dx(t)}{dt}\right)}{\mathrm{Activity}\bigl(x(t)\bigr)}}$, the mean frequency. (2.2)
  • 34. Page 34 of 71 ❖ HJORTH COMPLEXITY: The last parameter gives an estimate of the bandwidth of the signal. The Complexity parameter represents the change in frequency: it compares the signal's similarity to a pure sine wave, and its value converges to 1 the more similar the signal is. This is represented by Eq. 2.3: $\mathrm{Complexity} = \dfrac{\mathrm{Mobility}\left(\frac{dx(t)}{dt}\right)}{\mathrm{Mobility}\bigl(x(t)\bigr)}$ (2.3) While these three parameters contain information about the frequency spectrum of a signal, they also help analyze signals in the time domain. Furthermore, the time-domain orientation of the Hjorth representation may prove suitable for situations where ongoing EEG analysis is required. Since the calculation of the Hjorth parameters is based on variance, the computational cost of this method is considered low compared to other methods. The Kalman filter is an optimal estimator. It can estimate the past, present and future states of a system from a set of uncertain observations. Basically, it works recursively on noisy input data and infers the parameters of interest with a better estimate than would be obtained from a single measurement. The process of finding the best estimate from noisy data amounts to filtering out the noise. Prediction is possible even when the exact nature of the system is not known. The Kalman filter estimates a process by using a form of feedback control: it estimates the process state at some time and then obtains feedback in the form of (noisy) measurements. Once the outcome of the next measurement (including some amount of error) is observed, these estimates are updated using a weighted average, with more weight given to estimates that have higher certainty. Since the algorithm is recursive, it can run in real time using only the present input and previously calculated values; no other past information is required. The Kalman filter is widely used in real-time signal processing applications such as guidance, navigation and control of vehicles, particularly aircraft and spacecraft, for the following reasons: ▪ Good results in practice due to its optimality and structure. ▪ Convenient form for online real-time processing. ▪ Easy to formulate and implement given a basic understanding. ▪ Measurement equations need not be inverted.
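The three Hjorth descriptors of Eqs. 2.1-2.3 reduce to variances of the signal and of its first and second differences, which is why they are inexpensive to compute. The following MATLAB sketch illustrates one possible implementation; the vector x is an assumption standing in for a real EEG segment, and the derivative is approximated by a finite difference.

```matlab
% Minimal sketch of the Hjorth parameters (Eqs. 2.1-2.3).
x   = randn(1, 1000);          % placeholder for one EEG segment

dx  = diff(x);                 % first derivative (finite difference)
ddx = diff(dx);                % second derivative

activity   = var(x);                                   % Eq. 2.1: signal power
mobility   = sqrt(var(dx)  / var(x));                  % Eq. 2.2: mean frequency
complexity = sqrt(var(ddx) / var(dx)) / mobility;      % Eq. 2.3: Mobility(dx)/Mobility(x)

fprintf('Activity = %.4f, Mobility = %.4f, Complexity = %.4f\n', ...
        activity, mobility, complexity);
```

Scaling the differences by the sampling interval converts Mobility to physical frequency units; Complexity, being a ratio of mobilities, is unaffected by that scaling.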
  • 35. Page 35 of 71 2.5 CLASSIFIERS USED FOR DECODING MOTOR IMAGERY Brain activity patterns are considered dynamic stochastic processes due to both biological and technical factors. Biologically, they change due to user fatigue and attention, due to disease progression, and with the process of training. Technically, they change due to amplifier noise, ambient noise, and the variation of electrode impedances [42]. Therefore, the time course of the generated time-series signal, for example the EEG, should be taken into account during feature extraction [42]. To use this temporal information, three main approaches have been proposed [43]: • concatenation of features from different time segments: extracting features from several time segments and concatenating them into a single feature vector [44], [45]; • combination of classifications at different time segments: performing the feature extraction and classification steps on several time segments and then combining the results of the different classifiers [46], [47]; • dynamic classification: extracting features from several time segments to build a temporal sequence of feature vectors, which can then be classified using a dynamic classifier [48], [49]. The first approach is usually the most widely used, despite the fact that the resulting feature vectors are often of high dimensionality. In order to choose the most appropriate classifier for a given set of features, the properties of the available classifiers must be considered according to the following four-way classifier taxonomy described in [43]. 1. Generative or Informative classifier - Discriminative classifier: Generative classifiers, for example Bayes quadratic, learn the class models. To classify a feature vector, generative classifiers compute the likelihood of each class and choose the most likely one. Discriminative ones, for example support vector machines (SVM), only learn how to discriminate between the classes or the class memberships in order to classify a feature vector directly [50], [51]. 2. Static classifier - Dynamic classifier: Static classifiers, for example multilayer perceptrons (MLP), cannot take temporal information into account during classification as they classify a single feature vector. In contrast, dynamic classifiers such as the hidden Markov model (HMM) [52], FIR filter
  • 36. Page 36 of 71 multilayer perceptrons (FIR-MLP) [53] and the tree-based neural network (TBNN) [54] can classify a sequence of feature vectors and thus capture temporal dynamics. 3. Stable classifier - Unstable classifier: Stable classifiers, for example linear discriminant analysis (LDA), have a low complexity (or capacity) [55], [56]. They are said to be stable because small variations in the training set do not considerably affect their performance. In contrast, unstable classifiers, for example the MLP, have a high complexity; for these, small variations in the training set may lead to large changes in performance [57]. 4. Regularized classifier: Regularization consists of carefully controlling the complexity of a classifier in order to prevent overtraining. Regularization helps limit (a) the influence of outliers and strong noise, (b) the complexity of the classifier and (c) the raggedness of the decision surface [58]. A regularized classifier has good generalization performance and is more robust with respect to outliers [59], [60].
  • 37. Page 37 of 71 2.6 PERFORMANCE ANALYSIS IN MI BASED BCI RESEARCH The performance of an EEG-BCI system is analyzed using a number of performance metrics. This section discusses a few of them. 1. Confusion Matrix: The confusion matrix is a tabular representation of the relationship between the desired class intended by the user and the actual classes predicted by the classifier [61], [62]. 2. Classification Accuracy: This is the most widely used evaluation criterion in BCI research because it is easy to calculate and interpret. It is defined as the ratio of the number of correct observations made by the classifier to the total number of observations [63]. 3. Type-I and Type-II Error Rate: A type I error (α) represents the rate of incorrect rejection of a true null hypothesis, and is hence known as the false positive rate. The error of the second kind, i.e., a type II error (β), refers to the rate of failure to reject a false null hypothesis, and is hence known as the false negative rate [64]. 4. Information Transfer Rate: The information transfer rate (Bt) represents the bit rate of the BCI system [65]. Its representation in bits/min is given in Eq. 2.4: $B_t = \dfrac{60}{T}\left[\log_2 N + P\log_2 P + (1-P)\log_2\!\left(\dfrac{1-P}{N-1}\right)\right]$ (2.4) where N represents the number of possible states, P represents the classification accuracy between 0 and 1, and T is the time needed to convey each action in seconds/symbol, i.e., the time interval from the issue of a command to the classified output of the same. 5. Statistical Hypothesis Testing: Statistical hypothesis testing [64] is required to ensure that the experimental data are correctly interpreted and that the apparent relationship between them is significant or meaningful and does not occur by chance. There exist a number of well-known statistical tests, which can be classified into four main categories, namely: i) correlational (such as Pearson correlation [66] and Spearman correlation [67]), ii) comparison of means (such as the paired t-test [67] and ANOVA [68]), iii) regression [67] (such as simple regression and multiple regression) and iv) non-parametric (such as McNemar's test [69], Friedman's test [70], the Wilcoxon rank-sum test [71] and the Wilcoxon signed-rank test [72]). The selection of the right statistical test depends on the type of data, the distribution of the data, and the number of data points and observations available.
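Eq. 2.4 is easy to evaluate numerically. The short MATLAB sketch below computes Bt for an assumed example of N, P and T; these numbers are illustrative and are not taken from the experiments in this report.

```matlab
% Minimal sketch: information transfer rate of Eq. 2.4 (bits/min).
N = 4;       % assumed number of possible states (classes)
P = 0.85;    % assumed classification accuracy (0..1)
T = 3;       % assumed time per decision, seconds/symbol

bitsPerSymbol = log2(N) + P*log2(P) + (1 - P)*log2((1 - P)/(N - 1));
Bt = (60 / T) * bitsPerSymbol;    % scale from bits/symbol to bits/min

fprintf('ITR = %.2f bits/min\n', Bt);
```

Note that the expression is only defined for 0 < P < 1; in the limiting case P = 1 it reduces to (60/T) log2 N.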
  • 38. Page 38 of 71 Chapter 3 Stress Detection of Vehicle Drivers During Driving - A Case Study The chapter begins with the formulation of the stress detection problem for vehicle drivers during driving. To accomplish this, an experimental framework for EEG data acquisition is established. The chapter then provides the feature plots obtained using the feature extraction techniques explained in the earlier chapter. It ends with the performance of a fuzzy classifier used to decode the stress level of a driver during driving.
  • 39. Page 39 of 71 3.1 PROBLEM FORMULATION Driving is a common yet complex skill that requires constant attention and the integration of several simultaneous streams of information by the driver. It requires the coordinated use of both hands and feet. In particular, the driving task is a combination of various cognitive processes such as perception, attention, motor control, working memory, decision-making and the driver's mental workload. The brain is very good at sensing the whole environment around the car, making decisions and controlling body movements. Unlike performance and subjective measurements, psychophysiological measures offer continuous observation at high time resolution (e.g. in milliseconds) and can be collected without intruding into the operator's task. Suppose the car needs to take a turn at an intersection: the brain sends commands to the driver's muscles to take charge and move the steering wheel, shift gears, or put the foot on the brake pedal or accelerator. Such high coordination is implemented by the brain in a matter of split seconds. It is also well known that humans who recover from traumatic brain injury are often thought to be unfit for driving due to deficits in remembering, learning and planning. Significantly increased activation in the left dorsolateral precentral gyrus and postcentral gyrus is observed when starting to move the car. Turning activates an extended area from the occipital cortex dorsally to the superior parietal cortex and laterally in the right hemisphere to the posterior middle temporal gyrus. Reversing activation is prominent in the lateral precentral gyrus and anterior insula/ventrolateral prefrontal cortex. Stopping involves a more restricted activation and focuses more on the anterior part of the pre-SMA. Monitoring actions from other drivers shows extensive activation in the precuneus and superior parietal cortices. Traffic-rule-related thoughts are associated with significant activation of the right lateral PFC. In this chapter, we have attempted to detect the stress level of a person while driving a car. Stress appears in the EEG spectrum as an increase of activity in the frequency bands predominantly in the parietal and central regions of the brain. At the same time, a decrease of activity in the band can also be observed, as beta activity increases with cognitive tasks and active concentration. This has been shown in several studies. EEG is so efficient in detecting drowsiness and stress that it is often used as a reference indicator.
  • 40. Page 40 of 71 3.2 EXPERIMENTAL FRAMEWORK The experimental setup is the part of the research through which the brain signals of the subject are obtained. An EEG data acquisition unit with a multi-channel recording facility is essential for this experiment. The system setup for long-term monitoring of brain signals is organized as follows: ▪ an electrode-placement EEG system ▪ a 15-channel recording unit (15 EEG electrodes are used) ▪ a video camera synchronized with the EEG recording ▪ a PC running a car racing game for visual and acoustic stimulation ▪ a steering wheel, brake and accelerator for the car simulator A comfortable chair and a quiet room are also needed, and screening from interference is a bonus. The setup of the experiment must be kept consistent (i.e. room condition, stimulation used and length of data recorded), especially when taking data at different times. Aiming for better spatial filtering and accuracy, five different subjects are recruited. The experiment leverages an existing technology, electroencephalography (EEG), for noninvasively recording brain signals from the scalp. Figure 3.1 illustrates the experimental paradigm. Fig 3.1: Experimental Apparatus - Schematic Diagram (EEG recording of brain activity; car simulation in PC)
  • 41. Page 41 of 71 3.3 REAL-TIME SIGNAL ACQUISITION Electrical brain activity from Subject 1 (the ‘Driver’) is recorded using EEG (Figure 3.2 b) in the Artificial Intelligence Laboratory in the department of Electronics and Telecommunications Engineering at Jadavpur University, Kolkata. This brain activity is interpreted by a computer. The task that subjects must cooperatively do via brain-muscle coordination is to drive a driving simulator in accordance with an emulated driving scenario. There are other computerized cars in the street with various obstacles imitating the real-life conditions. The driving simulator includes a steering wheel, brake and accelerator pedals and hence provides a feeling of a car to the subjects. Fig. 3.3 shows the acquisition of EEG signals by EEG electrodes placed on the scalp of a subject during the driving experiment, whereas Fig. 3.4 provides a screenshot of EEG signals corresponding to mild stress while performing the experiment. Fig 3.2: Experimental Apparatus -photographs of setup a) subject 1 with car simulation b) electrode placement EEG system c) subject 1 with brake and accelerator
  • 42. Page 42 of 71 Fig 3.3: EEG signal acquisition during the experiment Fig 3.4: Screenshot of EEG signal showing mild stress levels
  • 43. Page 43 of 71 3.4 FEATURE EXTRACTION AND SELECTION Extracted features are meant to minimize the loss of important information embedded in the signal. In addition, they reduce the amount of resources needed to describe a huge set of data accurately, thereby reducing the dimension of the feature space and achieving better performance. The EEG signal data obtained from the subject are processed and stored in a Word file for every 18-minute observation window. Fifteen-channel data are recorded simultaneously, and every Word file is divided into 1000x15 matrices in an Excel file. Using the Excel files, we obtain the 15-channel baseline plot in MATLAB; the mean plot is then computed in MATLAB for further analysis. Figs. 3.5 and 3.6 provide the 15-channel EEG baseline plot and the mean plot of the 15-channel EEG signals, respectively. Fig. 3.5 15-channel baseline plot of subject 1 Fig. 3.6 Mean plot of the 15 channels (original signal)
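To make this step concrete, the MATLAB sketch below shows one way a 1000x15 segment could be plotted and averaged across channels. It is only an illustration: the matrix X, generated randomly here, stands in for a 1000x15 block imported from one of the Excel files, and the variable names are assumptions rather than the project's actual code.

```matlab
% Minimal sketch: baseline and mean plots for one 1000x15 EEG segment.
X = randn(1000, 15);             % placeholder for a 1000x15 block from Excel

meanSig = mean(X, 2);            % average across the 15 channels, 1000x1

figure;
subplot(2, 1, 1); plot(X);       % 15-channel baseline plot (one line per channel)
title('15 channel plot');
subplot(2, 1, 2); plot(meanSig); % mean plot used for further feature extraction
title('Mean of the 15 channels');
```

With real recordings, readmatrix or xlsread could replace the random placeholder; mean(X, 2) is what yields the single mean channel used by the later feature extractors.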
  • 44. Page 44 of 71 Extracting Power Spectral Density Features In order to select the correct features of the EEG signal related to the mental activity, the power spectral density of the original signal is computed, using parametric methods, as the frequency response of the mean values of the signal over a sequence of time samples. The sampling frequency of the data is 8192 Hz. Peaks at the frequencies corresponding to the periodicities of the data are observed in the power spectral density. The power spectral density (PSD) is intended for continuous spectra; the integral of the PSD over a given frequency band gives the average power of the signal over that band. Using the FFT, a plot is produced (Fig. 3.7) that goes from 0 to 70 Hz with a frequency spacing of 10 Hz, based on the sampling rate divided by the number of time intervals (8192/8192). EEG signals are often quantified based on their frequency-domain characteristics, and the spectrum is typically estimated using the fast Fourier transform (FFT). Fig 3.7 Power spectral density plots of 0.2 seconds of EEG data
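The classical (Wiener-Khinchin) definition used here, the PSD as the Fourier transform of the autocorrelation function, can be sketched in a few lines of MATLAB. The signal x below is an assumption standing in for the mean-channel segment, and xcorr requires the Signal Processing Toolbox.

```matlab
% Minimal sketch: PSD as the Fourier transform of the autocorrelation.
fs = 8192;                          % sampling frequency used in this study (Hz)
x  = randn(1, 8192);                % placeholder for the mean-channel EEG segment

r   = xcorr(x, 'biased');           % autocorrelation, lags -(N-1)..(N-1)
Pxx = abs(fft(r));                  % PSD estimate (magnitude of FT of autocorr)
f   = (0:length(r)-1) * fs / length(r);

keep = f <= 70;                     % display range used for Fig. 3.7
plot(f(keep), Pxx(keep));
xlabel('Frequency (Hz)'); ylabel('Power');
```

For the biased autocorrelation estimate, this is essentially equivalent, up to scaling, to the periodogram |FFT(x)|^2/N.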
  • 45. Page 45 of 71 Extracting Fourier Transform Features In the experiment, to evaluate the Fourier transform of the EEG data, the path to the mean-channel plot of the EEG data is given in MATLAB. This approach is based on earlier observations that the EEG spectrum contains some characteristic waveforms that fall primarily within five frequency bands, taken in our experiment as Delta: 0-4 Hz, Theta: 4-8 Hz, Alpha: 8-12 Hz, Beta: 12-24 Hz and Gamma: 24-32 Hz. Fig. 3.8(a) and (b) present the raw EEG signal and the Fourier-transformed signal respectively. Table 3.1 gives the values of the peaks of the signal displayed in Fig. 3.8(b). Table 3.1 Peaks of Signals in Fig. 3.8(b) Peak 1 Peak 2 Peak 3 Peak 4 Peak 5 4.9965e+03 420.0768 100.6872 264.6203 24.0696 Fig. 3.8 a) Raw EEG signal b) Fourier Transformed Signal
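One way to obtain peak values like those in Table 3.1 is to take the magnitude spectrum of the mean-channel signal and record the largest value falling inside each of the five bands above. The MATLAB sketch below illustrates this; the signal, sampling rate and band edges are assumptions for illustration, not the report's exact script.

```matlab
% Minimal sketch: largest spectral magnitude inside each band (cf. Table 3.1).
fs = 8192;
x  = randn(1, 8192);                 % placeholder one-second mean-channel segment

X  = abs(fft(x));
f  = (0:length(x)-1) * fs / length(x);

edges = [0 4; 4 8; 8 12; 12 24; 24 32];   % delta..gamma band edges (Hz)
peaks = zeros(1, size(edges, 1));
for k = 1:size(edges, 1)
    idx      = f >= edges(k,1) & f < edges(k,2);
    peaks(k) = max(X(idx));          % dominant magnitude within the band
end
disp(peaks);                         % one peak value per band, as in Table 3.1
```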
  • 46. Page 46 of 71 It is important to mention here that we often apply a ‘window’ to the data. This simply means taking the portion we want from the data stream. The window is moved along the data, and the FFT is performed on each windowed segment. There are more than 60 segments in each EEG dataset collected, so we obtain a corresponding Fourier-transformed signal and set of values for each of these; for example, only 10 of these values are displayed in Fig. 3.9. Extracting Wavelet Coefficient The wavelet transform (WT) is meant to resolve the issues of non-stationary signals such as the EEG. The mother wavelet gives rise to a family of derived wavelets through translation and dilation, that is, shifting and compression/stretching operations along the time axis, respectively. The continuous wavelet transform (CWT) is applied to the unprocessed mean-channel EEG plot, where wavelets are formed by dilation and different translation factors. However, its major weakness is that the scaling and translation parameters of the CWT change continuously; thus, calculating the wavelet coefficients for all available scales consumes a lot of effort and yields a lot of unused information. Fig. 3.9 Area under different sub-bands of the frequency spectrum (Z set)
  • 47. Page 47 of 71 Fig. 3.10 provides four plots depicting the wavelet coefficients over the 0-32 Hz frequency range. This method is a continuation of the orthodox Fourier transform approach, and the 0-4 Hz, 4-8 Hz, 8-16 Hz, and 16-32 Hz frequency bands are used. The extracted wavelet coefficients provide a compact representation that shows the energy distribution of the EEG signal in each frequency band. Therefore, the computed detail and approximation wavelet coefficients of the EEG signals were used as the feature vectors representing the signals. There are a large number of wavelet coefficients, so in order to reduce the dimensionality of the feature vectors, statistics over the set of wavelet coefficients were used. The following statistical features were used to represent the time-frequency distribution of the EEG signals (a sketch of their computation is given after this list): (i) Maximum of the wavelet coefficients in each sub-band. (ii) Minimum of the wavelet coefficients in each sub-band. (iii) Mean of the wavelet coefficients in each sub-band. (iv) Standard deviation of the wavelet coefficients in each sub-band. Fig. 3.10 4 wavelets obtained for 1000x15 raw EEG data
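As a hedged illustration of these four statistics, the MATLAB sketch below applies a discrete wavelet decomposition (wavedec, Wavelet Toolbox) to an assumed mean-channel segment and summarizes each sub-band by its maximum, minimum, mean and standard deviation. The Daubechies-4 mother wavelet, the 3-level decomposition and the effective 64 Hz sampling rate are assumptions chosen so that the sub-bands roughly match the 0-4, 4-8, 8-16 and 16-32 Hz ranges quoted above; the report does not state which wavelet was actually used.

```matlab
% Minimal sketch: per-sub-band statistics of DWT coefficients.
% Assumes x has been down-sampled to 64 Hz so that, with 3 levels of 'db4',
% D1 ~ 16-32 Hz, D2 ~ 8-16 Hz, D3 ~ 4-8 Hz and A3 ~ 0-4 Hz.
x = randn(1, 1024);                      % placeholder mean-channel segment

[c, l] = wavedec(x, 3, 'db4');           % 3-level discrete wavelet decomposition
subbands = {detcoef(c, l, 1), detcoef(c, l, 2), ...
            detcoef(c, l, 3), appcoef(c, l, 'db4', 3)};

feat = zeros(numel(subbands), 4);        % rows: sub-bands, cols: max/min/mean/std
for k = 1:numel(subbands)
    w         = subbands{k};
    feat(k,:) = [max(w), min(w), mean(w), std(w)];
end
disp(feat);                              % the statistical feature vector
```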
  • 48. Page 48 of 71 Extracting Hjorth Parameters In the experiment, to evaluate Hjorth parameters, the path to the mean channel plot of the EEG data is given in MATALAB. Every stress level data is divided into number of parts. The alarmingly stressed video data is divided into 85 parts and hence 85 values of mobility and complexity are obtained. ALARMINGLY STRESSED VIDEO: Parts 1 2 3 4 5 6 7 8 9 10 11 12 Complexity 0.8233 0.7990 1.3415 0.9658 0.8758 0.7890 0.8517 0.9193 0.9904 0.8813 0.7858 0.9438 Parts 13 14 15 16 17 18 19 20 21 22 23 24 Complexity 1.1937 0.9628 0.8819 0.7460 0.8797 0.8785 0.7340 0.8205 0.8423 0.8584 0.7659 0.7202 Parts 25 26 27 28 29 30 31 32 33 34 35 36 Complexity 0.6726 0.6229 0.6699 0.6016 0.8073 0.7797 1.0006 0.8455 0.7389 0.8122 0.7531 0.7798 Parts 37 38 39 40 41 42 43 44 45 46 47 48 Complexity 0.7698 0.6884 0.6608 0.7427 0.7884 0.7325 0.7968 0.7828 0.8209 1.1210 0.7703 0.6876 Parts 49 50 51 52 53 54 55 56 57 58 59 60 Complexity 0.7055 0.6622 0.5681 0.7017 0.7740 0.8002 0.6624 0.7431 0.7291 0.8761 0.8230 0.7358 Parts 61 62 63 64 65 66 67 68 69 70 71 72 Complexity 0.9184 0.7741 0.7908 0.8372 0.8407 0.7972 0.6972 0.8139 0.6724 0.7521 0.7371 0.7606 Parts 73 74 75 76 77 78 79 80 81 82 83 84 Complexity 0.7121 0.6256 0.7584 0.7285 0.7433 0.7646 0.7699 0.6627 0.8354 0.6960 0.6996 0.6775 Parts 85 Complexity 0.6205 Parts 1 2 3 4 5 6 7 8 9 10 11 12 Mobility 0.1085 0.1046 0.1969 0.0810 0.1435 0.1719 0.2251 0.1730 0.1983 0.1704 0.1559 0.1636
  • 49. Page 49 of 71 Parts 13 14 15 16 17 18 19 20 21 22 23 24 Mobility 0.2938 0.1474 0.1781 0.1408 0.1590 0.1963 0.1446 0.1674 0.1926 0.1901 0.1029 0.1801 Parts 25 26 27 28 29 30 31 32 33 34 35 36 Mobility 0.1534 0.1408 0.1601 0.1292 0.1801 0.1925 0.1663 0.1965 0.1639 0.1501 0.1453 0.1792 Parts 37 38 39 40 41 42 43 44 45 46 47 48 Mobility 0.1173 0.1358 0.1240 0.1464 0.1641 0.1485 0.1484 0.1674 0.0981 0.2079 0.1249 0.1431 Parts 49 50 51 52 53 54 55 56 57 58 59 60 Mobility 0.1226 0.1455 0.1273 0.1360 0.1067 0.1489 0.1372 0.1685 0.1316 0.1646 0.1521 0.1453 Parts 61 62 63 64 65 66 67 68 69 70 71 72 Mobility 0.1267 0.1479 0.1459 0.1564 0.1382 0.1738 0.1161 0.1906 0.1401 0.1401 0.1768 0.1665 Parts 73 74 75 76 77 78 79 80 81 82 83 84 Mobility 0.1490 0.1567 0.1420 0.1745 0.1725 0.1533 0.1845 0.1309 0.1498 0.2164 0.1538 0.1377 Parts 85 Mobility 0.1222 MODERATELY STRESSED VIDEO: Parts 1 2 3 4 5 6 7 8 9 10 11 12 Complexity 0.9837 0.8380 0.8231 0.9150 0.8009 0.8412 0.8051 0.9090 0.8529 0.8178 0.7643 0.8461 Parts 13 14 15 16 17 18 19 20 21 22 23 24 Complexity 0.7425 0.8440 0.8918 0.8483 0.8127 0.9197 0.9459 0.8894 0.8292 0.9988 0.8921 0.8095 Parts 25 26 27 28 29 30 31 32 33 34 35 36 Complexity 0.8227 0.7619 0.8642 0.9126 1.0119 0.8912 0.9607 0.8928 0.8623 0.7880 0.8313 0.8346 Parts 37 38 39 40 41 42 43 44 45 46 47 48 Complexity 0.7309 0.7557 0.7171 0.8515 0.6242 0.6310 0.7744 0.6805 0.4873 0.8457 0.7414 0.7073
  • 50. Page 50 of 71 Parts 49 50 51 52 53 54 55 56 57 58 59 60 Complexity 0.6800 0.6466 0.7295 1.1789 0.6583 0.8081 0.7434 0.7017 0.7175 0.6687 0.6898 0.7801 Parts 61 62 63 64 65 66 67 68 69 70 71 72 Complexity 0.7556 0.9429 0.7816 0.7050 0.7485 0.6516 0.8283 0.8263 0.7948 0.7414 0.8214 0.6898 Parts 73 74 75 76 77 78 79 80 Complexity 0.6549 0.6964 0.6159 0.7299 0.7433 0.7165 0.8384 0.8234 Parts 1 2 3 4 5 6 7 8 9 10 11 12 Mobility 0.1598 0.1767 0.1521 0.1355 0.1489 0.1645 0.1513 0.1761 0.1730 0.1402 0.1563 0.1892 Parts 13 14 15 16 17 18 19 20 21 22 23 24 Mobility 0.1457 0.1670 0.1655 0.1588 0.1497 0.1864 0.1837 0.1870 0.1784 0.2090 0.2037 0.1496 Parts 25 26 27 28 29 30 31 32 33 34 35 36 Mobility 0.1348 0.1705 0.1727 0.2247 0.2614 0.1406 0.1911 0.1631 0.1834 0.1679 0.1530 0.2354 Parts 37 38 39 40 41 42 43 44 45 46 47 48 Mobility 0.1647 0.1297 0.1104 0.1810 0.1487 0.1484 0.2037 0.1830 0.1344 0.1892 0.1526 0.1865 Parts 49 50 51 52 53 54 55 56 57 58 59 60 Mobility 0.1604 0.1546 0.1522 0.2461 0.1516 0.1232 0.1473 0.1520 0.1916 0.1364 0.1568 0.1758 Parts 61 62 63 64 65 66 67 68 69 70 71 72 Mobility 0.1410 0.1850 0.1737 0.1549 0.1699 0.1147 0.1583 0.1468 0.2189 0.1422 0.2060 0.1447 Parts 73 74 75 76 77 78 79 80 Mobility 0.1195 0.1753 0.1523 0.1016 0.1450 0.1423 0.1451 0.1447
  • 51. Page 51 of 71 RELAXED VIDEO: Parts 1 2 3 4 5 6 7 8 9 10 11 12 Complexity 0.9468 0.8064 0.7582 0.7918 0.8362 0.8152 1.1945 0.6858 0.7623 0.7136 0.8947 0.7654 Parts 13 14 15 16 17 18 19 20 21 22 23 24 Complexity 0.7049 0.6264 0.7292 0.6397 0.7437 0.8271 0.7437 0.8047 0.9170 0.8746 0.7666 0.7416 Parts 25 26 27 28 29 30 31 32 33 34 35 36 Complexity 0.7500 0.7900 0.7503 0.8202 0.7802 0.8067 0.7517 0.8789 1.0304 0.8633 0.9247 0.8749 Parts 37 38 39 40 41 42 43 44 45 46 47 48 Complexity 0.8902 0.8316 0.7652 0.7504 0.7024 0.7576 0.8713 0.7516 0.8726 1.0002 0.7360 0.8142 Parts 49 50 51 52 53 54 55 56 57 58 59 60 Complexity 0.7616 1.1334 0.6938 0.7146 1.2038 0.6165 0.9173 0.7215 0.6791 0.7350 0.8522 0.6939 Parts 61 Complexity 1.0141 Parts 1 2 3 4 5 6 7 8 9 10 11 12 Mobility 0.1319 0.1719 0.1443 0.1628 0.1595 0.1290 0.1743 0.1299 0.1678 0.1353 0.1348 0.1613 Parts 13 14 15 16 17 18 19 20 21 22 23 24 Mobility 0.1331 0.1431 0.1399 0.1261 0.1461 0.1700 0.1546 0.1481 0.1902 0.1852 0.2089 0.1567 Parts 25 26 27 28 29 30 31 32 33 34 35 36 Mobility 0.1567 0.1486 0.1657 0.1559 0.1754 0.1531 0.1500 0.1688 0.1322 0.2089 0.1782 0.1689 Parts 37 38 39 40 41 42 43 44 45 46 47 48 Mobility 0.1820 0.1724 0.1644 0.1475 0.1397 0.1755 0.1774 0.2196 0.1522 0.1657 0.1789 0.1576 Parts 49 50 51 52 53 54 55 56 57 58 59 60 Mobility 0.1663 0.1402 0.2153 0.1476 0.1644 0.2023 0.1456 0.1144 0.1483 0.1501 0.1208 0.2497 Parts 61 Complexity 0.1436
  • 52. Page 52 of 71 Extracting Kalman Filter Coefficients For Kalman feature extraction, the power spectral density figure is given as input. First, random Gaussian noise is generated and added to the actual signal to obtain the noisy measurement; the filter then produces the estimated signal. The a priori and a posteriori covariances are calculated along with the Kalman gain and the Kalman coefficient. Then, using the original and the estimated signals, a mean square error is found. This mean square error is used for further calculations. Fig. 3.11 provides the Kalman filter analysis of the EEG data. Fig 3.11 Kalman filter analysis of EEG data (combined plot of original and estimated signals; mean square error)
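A minimal scalar Kalman filter along these lines is sketched below in MATLAB. It is an illustration under assumed settings (a random-walk state model and assumed process and measurement noise variances q and r), not the project's exact implementation; the placeholder signal x stands in for the PSD/feature trace described above, and the mean square error between the original and the filtered estimate is reported at the end.

```matlab
% Minimal sketch: scalar Kalman filter and mean square error of the estimate.
x = cumsum(randn(1, 500));        % placeholder "true" signal (e.g. a PSD trace)
z = x + 2*randn(size(x));         % noisy measurement: signal + Gaussian noise

q = 0.1;  r = 4;                  % assumed process / measurement noise variances
xhat = zeros(size(z));            % a posteriori state estimates
P    = 1;                         % a posteriori error covariance
xhat(1) = z(1);

for k = 2:length(z)
    % predict (a priori)
    xPred = xhat(k-1);
    PPred = P + q;
    % update (a posteriori)
    K       = PPred / (PPred + r);            % Kalman gain
    xhat(k) = xPred + K * (z(k) - xPred);
    P       = (1 - K) * PPred;
end

mse = mean((x - xhat).^2);        % mean square error used as the feature
fprintf('MSE between original and estimate: %.4f\n', mse);
```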
  • 53. Page 53 of 71 3.5 CLASSIFIER VALIDATION AND PERFORMANCE Fuzzy inference is a method that interprets the values in the input vector and, based on some set of rules, assigns values to the output vector; it also provides a number of convenient ways to create fuzzy sets. A membership function is a curve that defines how each point in the input space is mapped to a membership value, or degree of membership, between 0 and 1. The only condition a membership function must really satisfy is that it must vary between 0 and 1. The function itself can be an arbitrary curve whose shape we can define as a function that suits us from the point of view of simplicity, convenience, speed, and efficiency. Here we use a Gaussian membership function (gaussmf) (see Fig. 3.12). It provides a smoothly varying continuous curve that is non-zero at all points, and it returns a fuzzy set whose membership grades represent a normalized Gaussian function with a mean of mu and a width of sigma. After feature extraction we find the Gaussian for each feature of a set of data. A Gaussian curve is obtained by using 10 data sets in the experiment. Fig. 3.12 Gaussian membership curve Basically, we take the average value of each plot
  • 54. Page 54 of 71 obtained earlier in the feature extraction. These 10 values make up a single Gaussian function. The Gaussian for the PSD of a video of a subject's brain signals is obtained by the following method: the video has 79 data sets of 1000x15 data, hence 79 PSD figures have been obtained. From each PSD graph, we find the average value of the amplitude for that graph. Using the average values of 10 graphs, we obtain one Gaussian figure. So, for this video, we get 8 Gaussian figures for the PSD feature. The average values for 10 PSD graphs are shown in Table 3.2, and the Gaussian curve obtained with the help of these average values is shown in Fig. 3.13. Table 3.2 Average Values of 10 PSD Graphs Avg1 Avg2 Avg3 Avg4 Avg5 Avg6 Avg7 Avg8 Avg9 Avg10 496.50 341.96 565.59 700.45 786.25 314.91 530.90 328.43 356.17 696.31 Fig. 3.13 Gaussian curve from PSD average values
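The construction of such a curve can be sketched directly from the Table 3.2 averages using gaussmf (Fuzzy Logic Toolbox). The choice below of the sample mean and standard deviation of the 10 averages as the Gaussian's centre and width is an assumption for illustration; the report does not state exactly how mu and sigma were chosen.

```matlab
% Minimal sketch: Gaussian membership curve from the 10 PSD averages (Table 3.2).
avgPSD = [496.50 341.96 565.59 700.45 786.25 314.91 530.90 328.43 356.17 696.31];

mu    = mean(avgPSD);               % assumed centre of the fuzzy set
sigma = std(avgPSD);                % assumed width of the fuzzy set

xAxis = linspace(0, 1000, 500);     % PSD axis, matching the range of Fig. 3.13
mf    = gaussmf(xAxis, [sigma mu]); % membership values between 0 and 1

plot(xAxis, mf);
xlabel('Average PSD'); ylabel('Membership \mu');

% Membership of an unknown observation in this fuzzy set:
muUnknown = gaussmf(550, [sigma mu]);
```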
  • 55. Page 55 of 71 The output axis gives a number known as the membership value, between 0 and 1. The curve is known as a membership function and is often given the designation µ. This curve defines the transition from low values of PSD to high values of PSD in the fuzzy space. Similarly, 7 more Gaussian curves are obtained for the PSD and are drawn simultaneously in the above plot, representing the different fuzzy sets. We assign a different colour to each of the new fuzzy sets, which enables us to reference each individual fuzzy set and differentiate between them. All videos representing the subject's EEG data carry stress to some degree, but some are significantly less stressed than the others. Using this Gaussian figure, we get the degree to which the unknown data's PSD lies within these fuzzy sets. In the same way we obtain the Gaussian curves for the FFT, wavelet coefficient, Hjorth parameter and Kalman filter features. Hence, we create a fuzzy space for unknown data whose stress level needs to be detected. We take 5 video samples of a subject and assign them to different stress-level anchors. The features are then extracted for each individual video as described earlier, and after that the fuzzy space is created. The experiment is repeated on five different subjects to provide more data and more reliable results. At the beginning of the experiment, the subject is in a relaxed state. The stress level starts to rise gradually with the progression of time, so after a time interval the driver becomes less relaxed. In anticipation of the next hurdle, a slight anticipatory stress develops in the subject, which is not harmful. Fatigue, drowsiness or any other situation may induce a kind of stress known as situational stress. When the stress levels become alarmingly high, it is known as alarming stress. A few plots of EEG features in the fuzzy space, representing the stress levels of different drivers in the presence of (a) alarmingly stressed, (b) moderately stressed and (c) relaxed visual stimuli, are presented in Figs. 3.14-3.24.
Fig 3.14 Hjorth Complexity Gaussian plot during alarmingly stressed video
Fig 3.15 Wavelet Coefficient Gaussian plot during alarmingly stressed video
Fig 3.16 Kalman filter Gaussian plot during alarmingly stressed video
Fig 3.17 Power Spectral Density Gaussian plot during alarmingly stressed video
Fig 3.18 Power Spectral Density Gaussian plot during moderately stressed video
Fig 3.19 Wavelet coefficient Gaussian plot during moderately stressed video
Fig 3.20 Kalman filter Gaussian plot during moderately stressed video
Fig 3.21 Hjorth Complexity Parameter Gaussian plot during moderately stressed video
Fig 3.22 Power Spectral Density Gaussian plot during relaxed video
Fig 3.23 Kalman filter Gaussian plot during relaxed video
  • 61. Page 61 of 71 Recognizing Stress Level In the experiment, we collect and analyze EEG data during real-world driving tasks to determine a driver's relative stress level among the 5 stress anchors. EEG data were recorded continuously while drivers followed a set route through open roads in the emulated driving scenario. Data from 20 drives of at least 50-min duration were collected for analysis to distinguish the three levels of driver stress. After filtering and feature extraction, the mean values of the features are obtained, which are then projected onto the fuzzy space. The minimum and maximum values of the feature intersecting with the respective mean unknown feature value in the plot for a stress level are observed and highlighted in Tables 3.3-3.5. Next, the minimum of the minimum values [Lm] over all the features of a stress level is noted. Similarly, the minimum of the maximum values [Hm] over all the features of the stress level is also noted. The tnorm is then calculated for each of the stress levels. Fig 3.24 Wavelet coefficient Gaussian plot during relaxed video
  • 62. Page 62 of 71 Table 3.3 Minimum and Maximum Values of Features for Relaxed Video Table 3.4 Minimum and Maximum Values of Features for Moderately Stressed Video 1 2 3 4 5 6 7 8 9 Power Spectral Density .98615 .1834 .4438 .3449 .6596 .5549 .6587 .0952 .8283 Fourier Transform .9627 .99745 .48135 .54625 .53 .7 .7695 .7475 .7689 Wavelet Coefficient .9151 .1927 .2607 .3337 .3616 .509 .58645 .0931 .582 Hjorth Complexity .7891 .891 .926 .9466 .9603 .9049 .9262 0.9018 .9069 Hjorth Mobility .917 .9345 .2439 .9978 .9359 .95085 .6773 .9512 .9983 Kalman .9999 .0702 .8461 .3601 .9253 .6534 .8594 .1563 .9682 1 2 3 4 5 6 Power Spectral Density 0.5313 0.4636 0.00239 0.39025 0.268 0.4901 Fourier Transform 0.7999 0.7556 0.6806 0.9688 0.7332 0.7336 Wavelet Coefficient 0.4931 0.4538 0.01 0.3166 0.3077 0.4642 Hjorth Complexity 0.9565 0.515 0.5379 0.7387 0.9328 0.97655 Hjorth Mobility 0.9498 0.9649 0.2703 0.9566 0.899 0.9957 Kalman 0.3657 0.3117 0.00001 0.7834 0.2716 0.8538
  • 63. Page 63 of 71 Table 3.5 Minimum and Maximum Values of Features for Alarmingly Stressed Video 1 2 3 4 5 6 7 8 Power Spectral Density 0.4545 0.2994 0.3838 0.288 0.749 0.83785 0.7847 0.7878 Fourier Transform 0.8386 0.8057 0.8646 0.9981 0.94 0.984 0.9851 0.9954 Wavelet Coefficient 0.4153 0.4362 0.5006 0.3971 0.7315 0.8155 0.7635 0.867 Hjorth Complexity 0.8701 0.6018 0.7062 0.5851 0.6995 0.9106 0.342 0.784 Hjorth Mobility 0.9678 0.9979 0.9906 0.273 0.9924 0.97395 0.9292 0.915 Kalman 0.9946 0.1603 0.0404 0.1679 0.6326 0.9592 0.7802 0.8927 The triangular norm (tnorm) is used to calculate the membership values of the intersection of the fuzzy sets. tnorm fuzzy logics primarily aim at generalizing classical two-valued logic by admitting intermediary truth values between 1 (truth) and 0 (falsity), representing "degrees of truth" of propositions. The degrees are assumed to be real numbers from the unit interval [0, 1]. Hence, the tnorm obtained above for each set indicates the degree to which the data lie in that stress level. Table 3.6 provides the values of tnorm for the different stress levels. Since the maximum value of tnorm is in the alarming stress set, the subject is highly stressed and the car needs to be stopped for safety. Table 3.6 tnorm for different stress levels Stress Level Relax Situational Stress Alarming Stress tnorm 0.246555 0.49265 0.4141
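The combination step can be sketched as follows, assuming the classical minimum t-norm T(a, b) = min(a, b) is the operator used to fuse the per-feature membership intersections. The exact way Lm and Hm are combined into the tnorm values of Table 3.6 is not spelled out in the text, so the final line below is only one plausible reading, and the membership numbers are illustrative rather than taken from Tables 3.3-3.5.

```matlab
% Minimal sketch: min t-norm combination of membership values for one stress level.
% Each row is one feature's [min, max] membership at which the unknown mean
% feature value intersects the fuzzy sets of that stress level (illustrative).
memb = [0.45 0.78;    % Power Spectral Density
        0.84 0.99;    % Fourier Transform
        0.42 0.87;    % Wavelet Coefficient
        0.60 0.91;    % Hjorth Complexity
        0.27 0.99;    % Hjorth Mobility
        0.16 0.96];   % Kalman

Lm = min(memb(:, 1));                 % minimum of the per-feature minima
Hm = min(memb(:, 2));                 % minimum of the per-feature maxima

tnormMin   = @(a, b) min(a, b);       % classical min t-norm: T(a, b) = min(a, b)
confidence = tnormMin(Lm, Hm);        % assumed fusion of Lm and Hm

fprintf('Lm = %.3f, Hm = %.3f, t-norm = %.3f\n', Lm, Hm, confidence);
```

Repeating this for each of the stress levels and selecting the level with the largest combined value yields the decision reported in Table 3.6.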
  • 64. Page 64 of 71 Chapter 4 Conclusions and Future Scope This is the concluding chapter of the project. It provides a self-review of the project, highlighting the problems undertaken and the level and degree of accuracy to which they have been solved. It also provides a discussion of the open problems, in general, which may be taken up as future research.
  • 65. Page 65 of 71 4.1 SELF-REVIEW OF WORK Data from 20 drives of at least 50-min duration were collected for analysis to distinguish the 5 levels of driver stress with an accuracy of over 92% across multiple drivers and driving days. The results show that for most of the drivers, brain signals are most closely correlated with driver stress level. These findings indicate that brain signals can provide a metric of driver stress in future cars capable of EEG monitoring. Such a metric could be used to help manage noncritical in-vehicle information systems and could also provide a continuous measure of how different road and traffic conditions affect drivers. 4.2 FUTURE RESEARCH DIRECTIONS Experiments in this area have gradually shown promise. Scientists at a Swiss university are working with the car manufacturer Nissan to find out whether brain signals could be used to improve the driving experience. The idea is that an on-board computer could detect a driver's intentions split seconds before they act by reading their brain signals. The computer could then opt to intervene or assist the driver, depending on the external detection of other cars and objects around the car. Work on the project began in earnest on 7 November 2011, with the researchers testing the concept by monitoring the brain signals of people using a driving simulator similar to the one in our experiment. What they are trying to do is revolutionize the way the car interacts with the driver, the person who is controlling it, using brain measurements to understand what the driver is trying to do. The brain is very good at perceiving the whole environment around the car and making decisions, but the muscles react to situations much more slowly, since driving involves both hands and feet and is therefore complicated; we are poor executors with slow response times. The body-controlling signals in the brain are nevertheless present, and these brain signals can be used to make an automated car whose response is faster and safer than the driver's own. This does not mean that the driver is half-asleep and the computer does everything for him or her; the driver is still kept active and the driving experience is improved.
  • 66. Page 66 of 71 REFERENCES
1. S. M. Courtney, L. Petit, J. V. Haxby and L. G. Ungerleider, “The role of prefrontal cortex in working memory: examining the contents of consciousness,” Philosophical Transactions of the Royal Society B: Biological Sciences, vol. 353, no. 1377, pp. 1819-1828, 1998.
2. W. L. Zhou, P. Yan, J. P. Wuskell, L. M. Loew and S. D. Antic, “Dynamics of action potential backpropagation in basal dendrites of prefrontal cortical pyramidal neurons,” European Journal of Neuroscience, vol. 27, no. 4, pp. 923-936, 2008.
3. E. T. Rolls, “The orbitofrontal cortex and reward,” Cerebral Cortex, vol. 10, no. 3, pp. 284-294, 2000.
4. J. M. Spielberg, J. L. Stewart, R. L. Levin, G. A. Miller and W. Heller, “Prefrontal cortex, emotion, and approach/withdrawal motivation,” Social and Personality Psychology Compass, vol. 2, no. 1, pp. 135-153, 2008.
5. C. D. Salzman and S. Fusi, “Emotion, cognition, and mental state representation in amygdala and prefrontal cortex,” Annual Review of Neuroscience, vol. 33, pp. 173-202, 2010.
6. M. Shoykhet and R. S. B. Clark, “Structure, Function, and Development of the Nervous System,” in Pediatric Critical Care (Fourth Edition), ed. B. P. Fuhrman, J. J. Zimmerman, J. A. Carcillo, R. S. B. Clark, M. Relvas, A. T. Rotta, A. E. Thompson and J. D. Tobias, Elsevier, pp. 783-804, 2011.
7. J. G. Mai, G. Paxinos and T. Voss, Atlas of the Human Brain, 3rd edition, Elsevier, Amsterdam, The Netherlands, 2008.
8. J. A. Kiernan, “Anatomy of the Temporal Lobe,” Epilepsy Research and Treatment, Hindawi Publishing, pp. 1-12, 2012.
9. E. Lugaresi, F. Cirignotta and P. Montagna, “Occipital lobe epilepsy with scotosensitive seizures: the role of central vision,” Epilepsia, vol. 25, no. 1, pp. 115-120, 1984.
10. E. Niedermeyer and F. L. D. Silva, Electroencephalography: Basic Principles, Clinical Applications, and Related Fields, Lippincott Williams & Wilkins, 2004.
11. J. G. Dy and C. E. Brodley, “Feature selection for unsupervised learning,” Journal of Machine Learning Research, vol. 5, pp. 845-889, 2004.
12. Y. Kim, W. N. Street and F. Menczer, “Feature selection in unsupervised learning via evolutionary search,” in Proceedings of the Sixth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 365-369, 2000.
13. Lotze and Cohen, “Volition and imagery in neurorehabilitation,” vol. 19, no. 3, pp. 135-140, 2006.
  • 67. Page 67 of 71
14. M. Gulhane and P. S. Mohod, “Intelligent Fatigue Detection and Automatic Vehicle Control System,” International Journal of Computer Science & Information Technology (IJCSIT), vol. 6, no. 3, June 2014.
15. A. R. Varma, S. V. Arote, C. Bharti and K. Singh, “Accident Prevention Using Eye Blinking and Head Movement,” Emerging Trends in Computer Science and Information Technology (ETCSIT2012), proceedings published in International Journal of Computer Applications (IJCA), 2012.
16. M. Haak, S. Bos, S. Panic and L. J. M. Rothkrantz, “Detecting stress using eye blinks and brain activity from EEG signals,” Faculty of Electrical Engineering, Mathematics and Computer Science, Delft University of Technology / Faculty of Applied Sciences, Netherlands Defence Academy, 2010.
17. J. A. Horne and L. A. Reyner, “Sleep related vehicle accidents,” Brit. Med. J., vol. 310, pp. 565-567, 1995.
18. M. Hallett, J. Fieldman, L. G. Cohen, N. Sadato and A. Pascual-Leone, “Involvement of primary motor cortex in motor imagery and mental practice,” Behav. Brain Sci., 1994.
19. A. Sirigu, L. Cohen, J. R. Duhamel, B. Pillon, B. Dubois, Y. Agid and C. Pierrot-Deseilligny, “Congruent unilateral impairments for real and imagined hand movements,” Neuroreport, 1995.
20. K. M. Stephan, G. R. Fink, R. E. Passingham, D. Silbersweig, A. O. Ceballos-Baum, C. D. Frith and R. S. Frackowiak, “Functional anatomy of the mental representation of upper extremity movements in healthy subjects,” J. Neurophysiol., pp. 373-386, 1995.
21. M. Lotze, P. Montoya, M. Erb, E. Hulsmann, H. Flor and U. Klose, “Activation of cortical and cerebellar motor areas during executed and imagined hand movements: an fMRI study,” J. Cogn. Neurosci., pp. 491-501, 1999.
22. E. Gerardin, A. Sirigu, S. Lehericy, J. B. Poline, B. Gaymard and C. Marsault, “Partially overlapping neural networks for real and imagined hand movements,” Cereb. Cortex, pp. 1093-1104, 2000.
23. J. Grezes and J. Decety, “Functional anatomy of execution, mental simulation, observation and verb generation of action: a meta-analysis,” Hum. Brain Mapp., vol. 12, pp. 1-19, 2001.
24. M. Jeannerod, “Neural simulation of action: a unifying mechanism for motor cognition,” NeuroImage, pp. 103-109, 2001.
25. T. J. Kimberley, G. Khandekar, L. L. Skraba, J. A. Spencer, E. A. Van Gorp and S. R. Walker, “Neural substrates for motor imagery in severe hemiparesis,” Neurorehabil. Neural Repair, vol. 20, pp. 268-277, 2006.
26. T. Hanakawa, I. Immisch, K. Toma, M. Dimyan, P. Van Gelderen and M. Hallett, “Functional properties of brain areas associated with motor execution and imagery,” J. Neurophysiol., pp. 989-1002, 2003.
27. P. Dechent, K. D. Merboldt and J. Frahm, “Is the human primary motor cortex involved in motor imagery?” Cogn. Brain Res., vol. 19, pp. 138-144, 2004.