EMOTION BASED 
COMPUTING 
By, 
SHILPA MARY GEORGE 
Roll no : 81 
Reg no : 12120082 
Guide : Mrs. SHEENA S
INDEX 
1) What is Affective Computing? 
2) Objectives 
3) Psychological Theories of Emotion 
4) Classes of Expressions 
5) Components of Emotion 
6) A-V-S Emotion Model 
7) Electroencephalography (EEG) 
1) Principles of EEG 
2) Applications 
3) Major Components 
4) Limitations 
8) Conclusion 
9) References 
WHAT IS AFFECTIVE COMPUTING? 
Affective Computing : 
 field of research in AI dealing with emotions and 
machines. 
 the study and development of systems and devices 
that can 
* recognize, 
* interpret, 
* process, 
* and simulate human affects. 
 an interdisciplinary field spanning computer science, 
psychology, and cognitive science.
OBJECTIVES 
 To develop computing devices with the capacity to 
gather cues to user emotion from a variety of sources 
- i.e., to produce “emotion-aware” machines. 
 Can you quantify Fear? Can you tell whether I am afraid? 
 How often have you used Emoticons in chat messages? 
Did you feel hampered without them? 
PSYCHOLOGICAL THEORIES OF EMOTION 
[Figure: A wheel-of-emotions diagram (after Plutchik) relating primary emotions such as anger, anticipation, joy, acceptance, fear, surprise and sadness to combined emotions such as aggressiveness, optimism, love, submission, awe, disappointment and contempt.]
CLASSES OF EXPRESSIONS 
 Broadly classified into happy, sad, disgust, fear, anger, 
surprise and neutral. 
 The goal is to classify an unknown expression into one of 
these classes (a minimal classification sketch follows below). 
 Facial expression, posture, gesture, speech, the force or 
rhythm of keystrokes, and the temperature change of the hand on 
the mouse can all signify changes in the user’s emotional state, 
to be detected and interpreted by a computer. 
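As a purely illustrative sketch of that classification goal, the code below assigns a hypothetical feature vector to the nearest of the seven class centroids. The feature dimensions, centroid values and the nearest-centroid rule are all assumptions made for this example, not the method used in the cited work.

```python
# Minimal sketch (not the deck's method): nearest-centroid classification of an
# unknown expression into the seven classes, using hypothetical feature vectors.
import numpy as np

CLASSES = ["happy", "sad", "disgust", "fear", "anger", "surprise", "neutral"]

# Hypothetical per-class centroids "learned" from labelled data; here they are
# random placeholders standing in for 4 made-up features (e.g. mouth curvature,
# brow raise, typing force, hand-temperature change).
rng = np.random.default_rng(0)
centroids = {c: rng.normal(size=4) for c in CLASSES}

def classify(features: np.ndarray) -> str:
    """Return the class whose centroid is closest to the feature vector."""
    return min(CLASSES, key=lambda c: np.linalg.norm(features - centroids[c]))

print(classify(rng.normal(size=4)))  # prints one of the seven class names
```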
COMPONENTS OF EMOTIONS 
 Subjective experience (feeling of fear and so on). 
 Physiological changes in the Autonomic Nervous 
System (ANS) and Endocrine System (glands and the 
hormones released from them). 
- e.g., trembling with fear begins before we can consciously 
control it 
 Behavior evoked (such as running away or fainting due 
to fear) 
[A,V,S] EMOTION MODEL 
[Arousal, Valence, Stance] : a 3-tuple models an 
“emotion” (see the sketch below). 
 Arousal : surprise at high arousal, fatigue at low 
arousal 
- the intensity with which the emotion is experienced 
 Valence : contentment at high valence, unhappiness at 
low valence 
- the discrimination between positive and negative 
experiences 
 Stance : stern at closed stance, accepting at open 
stance 
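A minimal sketch of how such a 3-tuple could be represented in code; the normalised ranges, field comments and helper function are assumptions made only for illustration.

```python
# An emotion as an (arousal, valence, stance) 3-tuple, each value in [-1, 1]
# (the range is an assumption, not specified by the model itself).
from typing import NamedTuple

class Emotion(NamedTuple):
    arousal: float   # intensity: low = fatigue, high = surprise
    valence: float   # negative vs positive experience
    stance: float    # closed (stern) vs open (accepting)

def describe(e: Emotion) -> str:
    a = "high-arousal" if e.arousal > 0 else "low-arousal"
    v = "positive" if e.valence > 0 else "negative"
    s = "open" if e.stance > 0 else "closed"
    return f"{a}, {v}, {s}-stance emotion"

print(describe(Emotion(arousal=0.9, valence=0.8, stance=0.7)))
# -> "high-arousal, positive, open-stance emotion"
```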
ELECTROENCEPHALOGRAPHY (EEG) 
 A medical imaging technique 
 A measurement of the electrical activity of the brain 
 The recording of the brain’s spontaneous electrical 
activity over a short period of time, usually 20–40 minutes, 
as recorded from multiple electrodes placed on the scalp. 
PRINCIPLES OF EEG 
 The brain’s electrical charge is maintained by billions of 
neurons. 
 Neurons pass signals via action potentials, created by the 
exchange of sodium and potassium ions across the cell 
membrane; the resulting currents spread through the head 
by volume conduction. 
 When this wave of ions reaches the electrodes on the 
scalp, it pushes or pulls electrons on the metal of the 
electrodes. The difference in push, or voltage, between 
any two electrodes can be measured by a voltmeter, which 
over time gives us the EEG. 
 Scalp EEG activity shows oscillations at a variety of 
frequencies. Several of these oscillations have 
characteristic frequency ranges and spatial distributions, 
and are associated with different states of brain functioning 
(see the band-power sketch below). 
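To make the last point concrete, here is a small sketch that estimates power in the classic EEG frequency bands from a single-channel voltage trace using SciPy's Welch estimator. The sampling rate, band boundaries and the synthetic signal are assumptions chosen for illustration, not values from the slides.

```python
# Minimal sketch (synthetic data, assumed parameters): band power of one EEG channel.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

fs = 256                                     # sampling rate in Hz (assumption)
t = np.arange(0, 10, 1 / fs)
# Fake "EEG": a 10 Hz alpha rhythm plus noise, standing in for real scalp data.
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(t.size)

bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)       # power spectral density
for name, (lo, hi) in bands.items():
    mask = (freqs >= lo) & (freqs < hi)
    power = trapezoid(psd[mask], freqs[mask])        # integrate PSD over the band
    print(f"{name}: {power:.2e} V^2")                # alpha should dominate here
```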
APPLICATIONS 
 Monitor alertness, coma and brain death 
 Locate areas of damage following head injury, stroke, 
coma etc. 
 Test afferent pathways (by evoked potentials) 
 Monitor cognitive engagement (alpha rhythm) 
 Control anaesthesia depth 
 Investigate epilepsy and locate seizure origin 
 Test epilepsy drug effects 
 Monitor human and animal brain development 
 Test drugs for convulsive effects 
 Investigate sleep disorders and sleep physiology 
MAJOR COMPONENTS 
 Electrodes with conductive media 
 Amplifiers with filters 
 A/D converter 
 Recording device 
• Electrodes read signals from the head surface 
• Amplifiers bring the microvolt signals into a range where 
they can be digitized accurately 
• The A/D converter changes the signals from analog to digital 
• A personal computer stores and displays the obtained data 
(a toy model of this chain is sketched below) 
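The amplify-then-digitize step of that chain can be mocked in a few lines. The gain, reference voltage and ADC resolution below are arbitrary assumptions, included only to show the idea.

```python
# Toy signal chain (assumed values): microvolt signal -> amplifier -> A/D converter.
import numpy as np

GAIN = 10_000          # amplifier gain (assumed)
V_REF = 5.0            # ADC reference voltage in volts (assumed)
BITS = 16              # ADC resolution in bits (assumed)

def acquire(micro_volts: np.ndarray) -> np.ndarray:
    """Amplify a microvolt-level signal and quantise it to integer ADC counts."""
    amplified = micro_volts * 1e-6 * GAIN                     # now in volts
    counts = np.round(amplified / V_REF * (2 ** (BITS - 1)))
    return np.clip(counts, -(2 ** (BITS - 1)), 2 ** (BITS - 1) - 1).astype(int)

samples = acquire(np.array([30.0, -45.0, 80.0]))   # three raw readings in µV
print(samples)   # digitised values ready to be stored by the recording PC
```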
RECORDING ELECTRODES 
Types of electrodes : 
 Disposable (gel-less, and pre-gelled types) 
 Reusable disc electrodes (gold, silver or tin) 
 Headbands and electrode caps 
 Saline-based electrodes 
 Needle electrodes 
• Electrode caps, with a set number of electrodes installed 
on their surface, are generally preferred. 
• Needle electrodes are used for long recordings and are 
invasively inserted under the scalp. 
 Electrode locations and names are specified by the 
international 10-20 system. 
 The label “10-20” refers to the fact that electrode points are 
chosen at 10% and 20% of the measured distances between 
reference points on the skull (the nasion, inion and ears). 
 Electrode placements are labelled according to the adjacent 
brain areas: F (frontal), C (central), T (temporal), 
P (parietal), and O (occipital). 
 The letters are accompanied by odd numbers on the left side 
of the head and even numbers on the right side (a small 
decoding sketch follows below). 
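A tiny sketch of the naming convention; the decoding function is invented for illustration and covers only single-digit labels and the regions listed on this slide.

```python
# Decode a 10-20 electrode label (e.g. "F3", "Cz") into its region and side.
REGIONS = {"F": "frontal", "C": "central", "T": "temporal",
           "P": "parietal", "O": "occipital"}

def decode(label: str) -> str:
    if label[-1].lower() == "z":                 # 'z' marks the midline
        region_code, side = label[:-1], "midline"
    else:
        region_code = label[:-1]
        side = "left" if int(label[-1]) % 2 == 1 else "right"  # odd = left, even = right
    return f"{label}: {REGIONS.get(region_code, 'unknown')} area, {side}"

for lbl in ["F3", "C4", "O1", "Cz"]:
    print(decode(lbl))
# F3: frontal area, left
# C4: central area, right
# O1: occipital area, left
# Cz: central area, midline
```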
[Figure: Electrode cap and the labels for the electrode points of the 10-20 system.]
LIMITATIONS OF EEG 
 Poor spatial resolution 
 Most sensitive to a particular set of post-synaptic 
potentials: those generated in the superficial layers of the 
cortex. Dendrites deeper in the cortex, sources in deep 
structures, and currents that are tangential to the skull 
contribute far less to the signal. 
 It is mathematically impossible to reconstruct a unique 
intracranial current source for a given EEG signal, because 
some currents produce potentials that cancel each other out 
– the inverse problem (illustrated numerically below). 
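The inverse problem can be demonstrated numerically. The toy below is an assumption-laden illustration (a random matrix stands in for a real lead field): with more current sources than electrodes, two different source configurations produce identical scalp potentials, so the measurements cannot be uniquely inverted.

```python
# Toy illustration of the EEG inverse problem (not a real head model).
import numpy as np

rng = np.random.default_rng(1)
n_electrodes, n_sources = 4, 8
L = rng.normal(size=(n_electrodes, n_sources))   # toy "lead field" matrix

sources_a = rng.normal(size=n_sources)
# Add a component from the null space of L: it changes the sources
# but leaves the measured potentials untouched.
_, _, Vt = np.linalg.svd(L)
null_component = Vt[-1]                          # direction mapped to (near) zero
sources_b = sources_a + 3.0 * null_component

print(np.allclose(L @ sources_a, L @ sources_b))  # True: identical scalp potentials
```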
AREAS OF AFFECTIVE COMPUTING 
 AFFECTIVE WEARABLES 
Sensors and tools can be used to recognize affective patterns, 
but these tools require a lot of attention and maintenance 
(a toy sampling-and-transmission sketch follows below). 
Figure : Wearer’s Blood Volume Pressure measured using 
photoplethysmography 
Figure : A wearable samples and transmits biometric data to a 
larger computer for analysis 
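A hypothetical sketch of the wearable's job as depicted in the figures: sample a biometric signal and stream it to a larger computer. `read_bvp_sensor`, the port number and the packet format are all invented for illustration, and the snippet assumes a server is listening at the given address.

```python
# Hypothetical wearable-side streaming loop (all parameters are assumptions).
import json, random, socket, time

def read_bvp_sensor() -> float:
    """Placeholder for a real photoplethysmography (blood volume pulse) reading."""
    return 0.5 + 0.1 * random.random()

def stream(host: str = "127.0.0.1", port: int = 9000, n_samples: int = 5) -> None:
    """Send timestamped readings as newline-delimited JSON to an analysis machine."""
    with socket.create_connection((host, port)) as sock:
        for _ in range(n_samples):
            packet = {"t": time.time(), "bvp": read_bvp_sensor()}
            sock.sendall((json.dumps(packet) + "\n").encode())
            time.sleep(0.1)   # ~10 Hz sampling, an arbitrary choice
```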
AREAS OF AFFECTIVE COMPUTING 
 EXPRESSING EMOTION: 
Evolution over the years: 
Figure : MS Office Assistant 
Figure : Kismet Robot 
KISMET 
 An expressive robot at MIT, equipped with auditory and 
proprioceptive (touch) sensory inputs. 
 Can express emotion through 
* vocalization, 
* facial expression, and 
* adjustment of gaze direction and head orientation. 
 Recognises stimuli 
 Realistic 
CONCLUSION 
Affective Computing is a young field of research. 
•For interactive systems, something far better than the 
current crop of “intelligent” systems is needed. 
•Affective Computing has applications in improving the 
quality of life of impaired people (successfully 
demonstrated for autism). 
•Ethical questions need to be addressed as affective 
computers are developed. 
•This field can benefit greatly from research into the human 
brain/mind. 
REFERENCES 
1. R.W. Picard (1995), “Affective Computing”, MIT 
Media Lab 
2. R.W. Picard (1998), “Towards Agents that 
Recognize Emotions”, Actes Proceedings, 
IMAGINA 
3. http://www.ai.mit.edu/projects/humanoid-robotics-group/ 
kismet/kismet.html 
4. Ma and Wang (2005), “Automatic Facial Expression 
Recognition Using Linear and Non-Linear Holistic 
Spatial Analysis” 
5. Joost Broekens (2007), “Emotion and Reinforcement: 
Affective Facial Expressions Facilitate Robot Learning” 
6. Taciana Saad Rached and Angelo Perkusich, “Emotion 
Recognition Based on Brain-Computer Interface Systems”
QUERIES? 
