before we even consciously know what is happening to us. Conversely, consider joy-based pride after a significant accompli...
2.2 Emotion and Related Affective States
2.2.1 Primary versus Complex
Several emotion classes exist that are more basic than...
alone, act as a sort of symptom of emotional experience (Picard, 1997). The easiest of these sentic responses to recognize...
problems associated with relying on self-reported emotional experience. On one hand, limiting self-report of emotion to a s...
challenging than simply asking a person how he feels (Tetteroo, 2008). Still, this is an active area of research within the...
3 Existing Techniques for Conveying Emotion during Instant Messaging
3.1 Introduction
A review of the current literature on...
“happy,” that incoming messages are checked against. When a match is found, the general affective nature of the message can...
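The keyword-lookup technique described above can be sketched in a few lines. The dictionary and affect categories below are illustrative stand-ins, not those used by any of the systems reviewed in this chapter:

```python
# Illustrative keyword-based affect detection, as described for textual-cue
# systems: each affect category has a small keyword dictionary, and each
# incoming message is scanned for matches.
AFFECT_KEYWORDS = {
    "happy": {"happy", "glad", "great", "awesome", ":)"},
    "sad": {"sad", "unhappy", "miserable", ":("},
    "angry": {"angry", "furious", "annoyed"},
}

def detect_affect(message: str) -> list[str]:
    """Return the affect categories whose keywords appear in the message."""
    tokens = set(message.lower().split())
    return [cat for cat, words in AFFECT_KEYWORDS.items() if tokens & words]
```

For example, `detect_affect("I am so happy today")` matches the "happy" category, while an emotionally neutral message matches nothing, which is precisely the limitation the rest of this chapter discusses.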
3.2.3 Physiologic Data
Some physiologic data is known to encode levels of affect, including galvanic skin response, skin tem...
3.3 Output Techniques
Output techniques describe methods that can be used to display emotional content to a chat participant...
72% of North Americans (with an aggregate of 85% of respondents) used emoticons on a regular basis (Kayan, Fussell, & Setlo...
Figure 3.1 Examples of expressive avatars
(Ekman & Oster, 1979), by clicking on a corresponding icon in the interface. Significant results from the study indicated h...
purpose-built haptic devices (force-feedback joys...
lightly thrown ball as a playful flirtatious gesture, or a fast throw to indicate disagreement or anger. Emphasis is placed...
3.4 Conclusion
This chapter has separated the major components of emotional instant messaging into two categories: input tec...
4 Study 1: Validating the Emotiv Epoc Headset
4.1 Introduction
This study investigates the validity of the Epoc headset in t...
The Epoc device, however, replaces this conductive paste with saline-moistened felt pads, which reduces setup time and mak...
4.2.2 Affectiv Suite
The Affectiv suite monitors levels of basic affective states including instantaneous excitement, averag...
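The distinction between an instantaneous level and an averaged, slower-moving level can be illustrated with a simple exponential moving average. This is only an illustration: Emotiv does not document how its averaged scores are actually computed, and the smoothing factor here is arbitrary.

```python
def smooth(values, alpha=0.1):
    """Exponential moving average over a sequence of instantaneous readings.

    Illustrates how a slow-moving "averaged" affect level could be derived
    from an instantaneous signal. The smoothing factor alpha is arbitrary
    and is NOT documented Emotiv behavior.
    """
    avg = values[0]
    out = [avg]
    for v in values[1:]:
        avg = alpha * v + (1 - alpha) * avg
        out.append(avg)
    return out
```

A brief spike in the instantaneous signal barely moves the averaged curve, which is the qualitative behavior the suite's "averaged excitement" score exhibits.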
have evaluated the accuracy of its affective inference algorithms. Campbell and colleagues used raw EEG data from the heads...
The design used in the present study is adapted from similar work by Chanel and colleagues, during which physiologica...
Figure 4.1 Initialization screen for the TetrisClone application
4.4.2.2 Headset Data
The TetrisClone application receives affective state information from the Epoc...
4.5 Experimental Procedures
A total of seven subjects participated in the experiment; however, data from one subject was incom...
difficult that progressing further is not practical. After each game the subjects rested for 45 seconds to allow any height...
After the subjects finished all game play tasks, the facilitator removed the headset, and subjects completed the final Post-...
Subject     Trial  Condition  H_exc     H_eng     H_fru     S_exc  S_eng  S_fru
Subject_1   1      low        0.283975  0.488978  0.347653  ...
The non-parametric Spearman’s rho was selected as the correlation metric for determining statistical dependence between hea...
of the last 15, 30, 60, and first 60 seconds of trial data. Spearman’s rho was calculated for each new dataset by comparing...
Condition  H_exc  H_eng  H_fru  S_exc  S_eng  S_fru
low        0.29   0.54   ...
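The correlation measure used above is straightforward to reproduce. The sketch below implements Spearman's rho from first principles (rank each series, with ties given average ranks, then take the Pearson correlation of the ranks), so any pair of headset and self-report series can be checked with it; the numeric inputs are made-up stand-ins, not the study's data.

```python
def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks.

    Ties receive average ranks. Assumes len(x) == len(y) >= 2 and that
    neither series is constant.
    """
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            # Group tied values and give them their average rank.
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Because it operates on ranks, any monotonically increasing relationship between headset and self-report scores yields rho = 1, which is why it suits ordinal self-report scales better than Pearson's r.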
Figure 4.4 Comparison of grand mean headset and self-reported levels of excitement
The significant correlations (p = .01) between headset and self-reported levels of excitement and engagement are apparent in fig...
Q1. What types of events caused you to get excited during game play?
Extracted Themes    Participant Res...
4.7 Discussion
4.7.1 Consistency in the Present Study
The main goal of this study was to determine how accurately the Epoc h...
concentration, more planning, and more efficient decision making, all suggestive of increased cognitive load. With respect t...
be the reason that excitement increases from low to medium difficulty conditions, but then decreases in the high condition....
messaging sessions. This application is called EmoChat. In the next chapter, a study investigates how EmoChat can be used t...
5 Study 2: Emotional Instant Messaging with EmoChat
5.1 Introduction
Existing instant messaging environments generally fail ...
5.2 EmoChat System Development
5.2.1 Overview
EmoChat is a client/server application that...
Facial expression information that is captured by the Epoc headset is passed to EmoChat and used to animate a simple avatar...
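One way such expression and affect updates could travel between the EmoChat client and server is as small structured messages alongside the chat text. The JSON shape and field names below are assumptions for illustration, not EmoChat's actual wire format:

```python
import json

def encode_update(smile, clench, brow, excitement, engagement, frustration):
    """Serialize one headset reading as a JSON message.

    The message layout and field names are illustrative assumptions;
    EmoChat's real wire format is not documented here.
    """
    return json.dumps({
        "type": "affect_update",
        "expression": {"smile": smile, "clench": clench, "brow": brow},
        "affect": {
            "excitement": excitement,
            "engagement": engagement,
            "frustration": frustration,
        },
    })

def decode_update(raw):
    """Parse an affect_update message back into its two component dicts."""
    msg = json.loads(raw)
    assert msg["type"] == "affect_update"
    return msg["expression"], msg["affect"]
```

On receipt, the `expression` values would drive the avatar's animation while the `affect` values would update the displayed state levels, mirroring the split the text describes.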
Emochat:  Emotional instant messaging with the Epoc headset
ABSTRACT

Title of Document: EMOCHAT: EMOTIONAL INSTANT MESSAGING WITH THE EPOC HEADSET
Franklin Pierce Wright, Master of Science, 2010
Directed By: Dr. Ravi Kuber, Assistant Professor of Human-Centered Computing, Information Systems

Interpersonal communication benefits greatly from the emotional information encoded by facial expression, body language, and tone of voice; however, this information is noticeably missing from typical instant message communication. This work investigates how instant message communication can be made richer by including emotional information provided by the Epoc headset. First, a study establishes that the Epoc headset is capable of inferring some measures of affect with reasonable accuracy. Then, the novel EmoChat application is introduced, which uses the Epoc headset to convey facial expression and levels of basic affective states during instant messaging sessions. A study compares the emotionality of communication between EmoChat and a traditional instant messaging environment. Results suggest that EmoChat facilitates the communication of emotional information more readily than a traditional instant messaging environment.
EMOCHAT: EMOTIONAL INSTANT MESSAGING WITH THE EPOC HEADSET

By Franklin Pierce Wright

Thesis submitted to the Faculty of the Graduate School of the University of Maryland, Baltimore County, in partial fulfillment of the requirements for the degree of Master of Science, 2010

© Copyright by Franklin Pierce Wright, 2010
Table of Contents
Acknowledgements ..... ii
Table of Contents ..... iii
List of Tables ..... vi
List of Figures ..... vii
1 Introduction ..... 1
  1.1 The Importance of Emotion ..... 1
  1.2 Instant Messaging ..... 2
  1.3 The Emotional Problem with Instant Messaging ..... 2
  1.4 Purpose of this Work ..... 3
  1.5 Structure of this Document ..... 4
2 Emotion ..... 6
  2.1 What is Emotion? ..... 6
    2.1.1 The Jamesian Perspective ..... 6
    2.1.2 The Cognitive-Appraisal Approach ..... 7
    2.1.3 Component-Process Theory ..... 8
  2.2 Emotion and Related Affective States ..... 9
    2.2.1 Primary versus Complex ..... 9
  2.3 Expressing Emotion ..... 9
    2.3.1 Sentic Modulation ..... 9
  2.4 Measuring Emotion ..... 10
    2.4.1 Self-Report Methods ..... 10
    2.4.2 Concurrent Expression Methods ..... 11
  2.5 Conclusion ..... 12
3 Existing Techniques for Conveying Emotion during Instant Messaging ..... 13
  3.1 Introduction ..... 13
  3.2 Input Techniques ..... 13
    3.2.1 Textual Cues ..... 13
    3.2.2 Automated Expression Recognition ..... 14
    3.2.3 Physiologic Data ..... 15
    3.2.4 Manual Selection ..... 15
  3.3 Output Techniques ..... 16
    3.3.1 Emoticons ..... 16
    3.3.2 Expressive Avatars ..... 17
    3.3.3 Haptic Devices ..... 19
    3.3.4 Kinetic Typography ..... 21
  3.4 Conclusion ..... 22
4 Study 1: Validating the Emotiv Epoc Headset ..... 23
  4.1 Introduction ..... 23
  4.2 Overview of the Epoc Headset ..... 23
    4.2.1 Expressiv Suite ..... 24
    4.2.2 Affectiv Suite ..... 25
    4.2.3 Cognitiv Suite ..... 25
  4.3 The Need for Validation ..... 25
  4.4 Experimental Design ..... 26
    4.4.1 TetrisClone System Development ..... 27
    4.4.2 Measures ..... 28
  4.5 Experimental Procedures ..... 30
  4.6 Results and Analysis ..... 32
    4.6.1 Headset-Reported versus Self-Reported Levels of Affect ..... 32
    4.6.2 Subjective Causes of Affect during Gameplay ..... 38
  4.7 Discussion ..... 40
    4.7.1 Consistency in the Present Study ..... 40
    4.7.2 Future Direction ..... 42
  4.8 Conclusion ..... 42
5 Study 2: Emotional Instant Messaging with EmoChat ..... 44
  5.1 Introduction ..... 44
  5.2 EmoChat System Development ..... 45
    5.2.1 Overview ..... 45
    5.2.2 Traditional Environment ..... 47
    5.2.3 Application Architecture ..... 47
  5.3 Experimental Design ..... 48
    5.3.1 Measures ..... 49
  5.4 Experimental Setup ..... 53
  5.5 Experimental Procedures ..... 53
  5.6 Results and Analysis ..... 55
    5.6.1 Emotional Transfer Accuracy ..... 55
    5.6.2 Richness of Experience ..... 59
    5.6.3 Chat Transcripts ..... 63
    5.6.4 System Usability ..... 66
    5.6.5 Informal Interviews ..... 66
  5.7 Discussion ..... 73
    5.7.1 Summary of Results ..... 73
    5.7.2 Consistency with Related Work ..... 75
    5.7.3 Future Direction ..... 77
  5.8 Conclusion ..... 78
6 Conclusion ..... 79
  6.1 Limitations ..... 79
  6.2 Summary of Contributions ..... 82
  6.3 EmoChat Design Considerations ..... 84
  6.4 EmoChat Compared with Existing Methods of Emotional Instant Messaging ..... 85
    6.4.1 Input Technique ..... 85
    6.4.2 Output Technique ..... 86
  6.5 Future Direction ..... 87
  6.6 Closing ..... 88
Appendix A: Validation Study Demographics Questionnaire ..... 90
Appendix B: Validation Study Experiment Questionnaire ..... 91
Appendix C: Validation Study Post-Experiment Questionnaire ..... 92
Appendix D: EmoChat Demographics Questionnaire ..... 93
Appendix E: EmoChat Emotional Transfer Questionnaire ..... 94
Appendix F: EmoChat Richness Questionnaire ..... 95
Bibliography ..... 96
List of Tables
Table 3.1 Common emoticons in Western and Eastern cultures ..... 16
Table 3.2 Example of hapticons ..... 20
Table 4.1 Facial expression features measured by the Epoc headset ..... 24
Table 4.2 Headset and self-reported levels of affect per subject, per trial ..... 33
Table 4.3 Spearman correlation between headset and self-reported levels of affect (N=36) ..... 34
Table 4.4 Spearman correlation of headset and self-report data for varied time divisions (N=36) ..... 35
Table 4.5 Grand mean headset and self-reported levels of affect per difficulty level ..... 36
Table 4.6 Grand mean Spearman correlation between headset and self-reported levels of affect (N=3) ..... 36
Table 4.7 Major themes identified in subjective affect survey results ..... 39
Table 5.1 Facial movements and affective information used by EmoChat ..... 46
Table 5.2 EmoChat experimental groups ..... 48
Table 5.3 ETAQ scores for each subject-pair and both experimental conditions ..... 56
Table 5.4 Spearman correlation matrix between avatar features and perceived frequency of emotional states (N=5) ..... 58
Table 5.5 Wilcoxon's signed rank test for significant difference between score means (N=10) ..... 61
Table 5.6 Linguistic categories with significant differences between experimental conditions ..... 64
Table 5.7 LIWC affective processes hierarchy ..... 65
Table 5.8 LIWC relativity hierarchy ..... 66
List of Figures
Figure 3.1 Examples of expressive avatars ..... 18
Figure 4.1 Initialization screen for the TetrisClone application ..... 28
Figure 4.2 TetrisClone application during trials ..... 28
Figure 4.3 Example output from the TetrisClone application ..... 29
Figure 4.4 Comparison of grand mean headset and self-reported levels of excitement ..... 37
Figure 4.5 Comparison of grand mean headset and self-reported levels of engagement ..... 37
Figure 4.6 Comparison of grand mean headset and self-reported levels of frustration ..... 37
Figure 5.1 The EmoChat client application ..... 45
Figure 5.2 EmoChat server application ..... 47
Figure 5.3 Mean scores from the richness questionnaire, questions 1-5 ..... 59
Figure 5.4 Mean scores from the richness questionnaire, questions 5-10 ..... 60
Figure 5.5 Comparison of mean responses to REQ between subjects with headsets versus without headsets, in the EmoChat condition, Q1-5 (N=5) ..... 62
Figure 5.6 Comparison of mean responses to REQ between subjects with headsets versus without headsets, in the EmoChat condition, Q5-10 (N=5) ..... 62
1 Introduction

This chapter introduces the importance of emotion in interpersonal communication, and presents some of the challenges with including emotion in instant messages. The purpose of this thesis is then stated, followed by an overview of the document structure.

1.1 The Importance of Emotion

Consider the following statement: “The Yankees won again.”

Does the person who makes this remark intend to be perceived as pleased or disappointed? Enthusiastic or resentful? The remark is purposely emotionally ambiguous to illustrate just how powerful the inclusion or absence of emotion can be. If the same remark were said with a big grin on the face, or with the sound of excitement in the voice, we would certainly understand that this person was quite pleased that his team was victorious. If the speaker displayed slumped shoulders and a head tilted downward, we would assume that he was certainly less than jubilant.
It is clear that emotions play a very important role in interpersonal communication, and without them, communication would be significantly less efficient. A statement that contains emotion implies context, without the necessity of explicit clarification. In some cases what is said may be equally as important as how it is said.

1.2 Instant Messaging

Real-time text-based communication is still on the rise. Instant messaging, in one form or another, has infiltrated nearly all aspects of our digital lives, and shows no sign of retreat. From work, to school, to play, it’s becoming more and more difficult to shield ourselves from that popup, or that minimized window blinking in the task bar, or that characteristic sound our phones make when somebody wants to chat with us. We are stuck with this mode of communication for the foreseeable future.

1.3 The Emotional Problem with Instant Messaging

As convenient as it is, this text-based communication has inherent difficulties conveying emotional information. It generally lacks intonation and the subtle non-verbal cues that make face-to-face communication the rich medium that it is. Facial expression, posture, and tone of voice are among the highest bandwidth vehicles of emotional information transfer (Pantic, Sebe, Cohn, & Huang, 2005), but are noticeably absent from typical text-based communication. According to Kiesler and colleagues, computer-mediated communication (CMC) in general is “observably poor” for facilitating the exchange of affective information, and note that CMC
  11. 11. participants perceive the interaction as more impersonal, resulting in less favorableevaluations of partners (Kiesler, Zubrow, Moses, & Geller, 1985).The humble emoticon has done its best to remedy the situation by allowing textstatements to be qualified with the ASCII equivalent of a smile or frown. While thissuccessfully aids in conveying positive and negative affect (Rivera, Cooke, & Bauhs,1996), emoticons may have trouble communicating more subtle emotions. Othersolutions that have been proposed to address this problem are reviewed in chapter 3.Each solution is successful in its own right, and may be applicable in differentsituations. This work examines a novel method for conveying emotion in CMC,which is offered to the community as another potential solution.1.4 Purpose of this WorkThe main goal of this body of work is to investigate how instant messagecommunication is enriched by augmenting messages with emotional content, andwhether this can be achieved through the use of brain-computer interface (BCI)technology. The Emotiv Epoc headset is a relatively new BCI peripheral intendedfor use by consumers and is marketed as being capable of inferring levels of basicaffective states including excitement, engagement, frustration, and meditation. Astudy presented in this work attempts to validate those claims by comparing datareported by the headset with self-reported measures of affect during game play atvaried difficulty levels. The novel EmoChat application is then introduced, whichintegrates the Epoc headset into an instant messaging environment to control the 3
  12. 12. facial expressions of a basic animated avatar, and to report levels of basic affectivestates. A second study investigates differences between communication withEmoChat and a traditional instant messaging environment. It is posited that theEmoChat application, when integrated with the Epoc headset, facilitatescommunication that contains more emotional information, that can be described asricher, and that conveys emotional information more accurately than with traditionalIM environments.In the end, this complete work intends to provide, first, a starting point for otherresearchers interested in investigating applications that implement the Epoc headset,and second, results which may support the decision to apply the Epoc in computer-mediated communication settings.1.5 Structure of this DocumentThe remaining chapters of this work are structured as follows:Chapter 2 provides an overview of emotion, including historical perspectives, andhow emotions are related to affective computing. Chapter 3 reviews existingtechniques for conveying emotion in instant messaging environments. Chapter 4details a study to determine the accuracy of the Epoc headset. Chapter 5 introducesEmoChat, a novel instant messaging environment for exchanging emotionalinformation. A study compares EmoChat with a traditional instant messagingenvironment. Chapter 6 summarizes the contributions this work makes, and 4
compares the techniques for conveying emotion used in EmoChat with techniques described in the literature.
2 Emotion

This chapter describes some of the historical perspectives on emotion, and introduces its role in affective computing. It intends to provide a background helpful for the study of emotional instant messaging.

2.1 What is Emotion?

The problem of defining what constitutes human emotion has plagued psychologists and philosophers for centuries, and there is still no generally accepted description among researchers or laypersons. A complicating factor in attempting to define emotion is our incomplete understanding of the complexities of the human brain. Some theorists have argued that our perceptions enter the limbic system of the brain and trigger immediate action without consultation with the more developed cortex. Others argue that the cortex plays a very important role in assessing how we relate to any given emotionally relevant situation, and subsequently provides guidance about how to feel.

2.1.1 The Jamesian Perspective

In 1884 psychologist William James hypothesized that any emotional experience with which physiological changes may be associated requires that those same physiological changes be expressed before the experience of the emotion (James, 1884). In essence, James believed that humans feel afraid because we run from a bear, and not that we run from a bear because we feel afraid. James emphasized the
physical aspect of emotional experience causation over the cognitive aspect. This physical action preceding the subjective experience of an emotion has subsequently been labeled a "Jamesian" response. For historical accuracy, note that at about the same time that James was developing his theory, Carl Lange independently developed a very similar theory. Collectively, their school of thought is referred to as James-Lange (Picard, 1997).

2.1.2 The Cognitive-Appraisal Approach

In contrast to James' physical theory of emotion, a string of psychologists later developed several different cognitive-based theories. Notable among them is the cognitive-appraisal theory, developed by Magda Arnold and later extended by Richard Lazarus, which holds that emotional experience starts not with a physical response, but with a cognitive interpretation (appraisal) of an emotionally-inspiring situation (Reisenzein, 2006). Continuing with the bear example, Arnold and Lazarus would have us believe that we hold in our minds certain evaluations of the bear-object (it is dangerous and bad for us), we see that the bear is running toward us (is, or will soon be, present), we anticipate trouble if the bear reaches us (poor coping potential), and so we experience fear and run away.

Certainly, there are valid examples of situations that seem to trigger Jamesian responses. Consider the fear-based startle response when we catch a large object quickly approaching from the periphery. It is natural that we sometimes react to startling stimuli before the experience of the fear emotion, jumping out of the way
before we even consciously know what is happening to us. Conversely, consider joy-based pride after a significant accomplishment. It seems as though pride could only be elicited after a cognitive appraisal determines that (a) the accomplishment-object is positive, (b) it has been achieved despite numerous challenges, and (c) it will not be stripped away. If examples can be found that validate both the James-Lange approach and the cognitive-appraisal approach, is one theory more correct than the other?

2.1.3 Component-Process Theory

It is now suggested that emotional experience may result from a very complex interaction between the limbic system and cortex of the brain, and that emotions can be described as having both physical and cognitive aspects (Picard, 1997). Encompassing this point of view, that a comprehensive theory of emotion should consider both cognitive and physical aspects, is the component-process model supported by Klaus Scherer (Scherer, 2005). This model describes emotion as consisting of synchronized changes in several neurologically based subsystems, including cognitive (appraisal), neurophysiologic (bodily symptoms), motivational (action tendencies), motor expression (facial and vocal expression), and subjective feeling (emotional experience) components. Note that Scherer regards "subjective feeling" as a single element among many in what constitutes an "emotion."
2.2 Emotion and Related Affective States

2.2.1 Primary versus Complex

Several emotion classes exist that are more basic than others. These are the emotions that seem the most Jamesian in nature: hard-coded, almost reflex-like responses that, from an evolutionary perspective, contribute the most to our survival. Fear and anger are among these basic, primary, emotions. Picard labels these types of emotions as "fast primary," and suggests that they originate in the limbic system. This is in contrast to the "slow secondary," or cognitive-based emotions, which require time for introspection and appraisal, and therefore require some cortical processing. Scherer calls this slow type of emotion "utilitarian," in contrast with the fast, which he terms "aesthetic." An important distinction should be made between emotions and other related affective states such as moods, preferences, attitudes, and sentiments. A distinguishing factor of emotions is their comparatively short duration when considered among the other affective states.

2.3 Expressing Emotion

2.3.1 Sentic Modulation

If emotion has both physical and cognitive aspects, it seems natural that some emotions can be experienced without much, if any, outward expression. Interpersonal communication may benefit from those overt expressions of emotion that can be perceived by others. Picard discusses what she calls "sentic modulation," overt or covert changes in physiological features that, although they do not constitute emotion
alone, act as a sort of symptom of emotional experience (Picard, 1997). The easiest of these sentic responses to recognize are arguably facial expression, tone of voice, and posture or body language. Of these three, research suggests that facial expression is the highest-bandwidth channel with regard to the ability to convey emotional state (Pantic, et al., 2005). There are other, more covert, symptoms of emotional experience, including heart rate, blood pressure, skin conductance, pupil dilation, perspiration, respiration rate, and temperature (Picard, 1997). Recent research has also demonstrated that some degree of emotionality may be inferred from neurologic response as measured by electroencephalogram (Khalili & Moradi, 2009; Sloten, Verdonck, Nyssen, & Haueisen, 2008). Facial expression deserves additional attention, being one of the most widely studied forms of sentic modulation. Ekman and others have identified six basic emotion/facial expression combinations that appear to be universal across cultures: fear, anger, happiness, sadness, disgust, and surprise (Ekman & Oster, 1979). That these facial expressions are so widely understood suggests that it should be quite easy to infer emotion from them.

2.4 Measuring Emotion

2.4.1 Self-Report Methods

Perhaps the most widely used method for determining emotional state is through self-report. This technique asks a subject to describe an emotion he or she is experiencing, or to select one from a pre-made list. Scherer discusses some of the
problems associated with relying on self-reported emotional experience. On one hand, limiting self-report of emotion to a single list of words from which the subject must choose the most appropriate response may lead to emotional "priming," and/or a misrepresentation of true experience. On the other hand, allowing a subject to generate a freeform emotional word to describe experience adds significant difficulty to any type of analysis (Scherer, 2005). Another method for self-report measurement is to have a subject identify emotional state within the space of some dimension. Emotional response is often described as falling somewhere in the two-dimensional valence/arousal space proposed by Lang. This dimensional model of affect deconstructs specific emotions into some level of valence (positive feelings versus negative feelings) and some level of arousal (high intensity versus low intensity) (Lang, 1995). As an example, joyful exuberance is characterized by high valence and high arousal, while feelings of sadness demonstrate low valence and low arousal. Lang posits that all emotion falls somewhere in this two-dimensional space. Problems may arise when emotion is represented in this space without knowing the triggering event, since several distinct emotions may occupy similar locations in valence-arousal space, e.g., intense anger versus intense fear, both located in high-arousal, low-valence space (Scherer, 2005).

2.4.2 Concurrent Expression Methods

An objective way to measure emotional state is to infer user affect by monitoring sentic modulation. This method requires the use of algorithms and sensors in order to perceive the symptoms of emotion and infer state, and is significantly more
challenging than simply asking a person how he feels (Tetteroo, 2008). Still, this is an active area of research within the affective computing domain because user intervention is not required to measure the emotion, which may be beneficial in some cases. Techniques include using camera or video input along with classification algorithms to automatically detect emotion from facial expression (Kaliouby & Robinson, 2004), monitoring galvanic skin response to estimate level of arousal (Wang, Prendinger, & Igarashi, 2004), and using AI learning techniques to infer emotional state from electroencephalograph signals (Khalili & Moradi, 2009; Sloten, et al., 2008).

2.5 Conclusion

Picard defines affective computing as "computing that relates to, arises from, or influences emotion" (Picard, 1997). According to Picard, some research in this domain focuses on developing methods of inferring emotional state from sentic user characteristics (facial expression, physiologic arousal level, etc.), while other research focuses on methods that computers could use to convey emotional information (avatars, sound, color, etc.) (Picard, 1997). A study of emotional instant messaging is necessarily a study of affective computing. In the context of instant messaging, Tetteroo separates these two research areas of affective computing into the study of input techniques and output techniques (Tetteroo, 2008). The next chapter reviews how these techniques are used during instant message communication to convey emotional information.
3 Existing Techniques for Conveying Emotion during Instant Messaging

3.1 Introduction

A review of the current literature on emotional communication through instant message applications has identified several techniques for enriching text-based communication with affective content. These techniques can be broadly classified as either input techniques, inferring or otherwise reading in the emotion of the user, or output techniques, displaying or otherwise conveying the emotion to the partner (Tetteroo, 2008). These categories are reviewed in turn.

3.2 Input Techniques

Research concerning how emotions can be read into an instant messaging system generally implements one of several methods: inference from textual cues, inference through automated facial expression recognition, inference from physiologic data, or manual selection.

3.2.1 Textual Cues

Input methods that use text cues to infer the emotional content of a message generally implement algorithms that parse the text of a message and compare its contents with a database of phrases or keywords for which emotional content is known. Yeo implements a basic dictionary of emotional terms, such as "disappointed," or
"happy," that incoming messages are checked against. When a match is found, the general affective nature of the message can be inferred (Yeo, 2008). Others have used more complicated natural language processing algorithms to account for the subtleties of communicative language (Neviarouskaya, Prendinger, & Ishizuka, 2007).

Another example of using text cues to infer the emotion of a message involves simply parsing text for occurrences of standard emoticons. The presence of a smiley emoticon could indicate a positively valenced message, while a frowning emoticon could indicate negative valence, as implemented by Rovers & Essen (2004).

3.2.2 Automated Expression Recognition

Automated expression recognition involves using classification algorithms to infer emotion from camera or video images of a subject's face. Kaliouby and Robinson use an automated "facial affect analyzer" in this manner to infer happy, surprised, agreeing, disagreeing, confused, indecisive, and neutral states of affect (Kaliouby & Robinson, 2004). The classifier makes an evaluation about affective state based on information about the shape of the mouth, the presence of teeth, and head gestures such as nods.
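As an illustration of the dictionary- and emoticon-matching approach described in section 3.2.1, the following sketch tags a message with a coarse valence. The word lists and labels are hypothetical stand-ins for this example, not Yeo's actual dictionary or Rovers and Essen's parser:

```python
# Hypothetical sketch of dictionary-based affect inference from text cues.
# Word lists and emoticon sets are illustrative, not taken from the cited work.

POSITIVE = {"happy", "glad", "great", "wonderful"}
NEGATIVE = {"disappointed", "sad", "angry", "terrible"}
POSITIVE_EMOTICONS = {":)", ":-)", "(^_^)"}
NEGATIVE_EMOTICONS = {":(", ":-(", "(T_T)"}

def infer_valence(message: str) -> str:
    """Return a coarse valence label for an incoming message."""
    score = 0
    for token in message.lower().split():
        word = token.strip(".,!?")      # drop trailing punctuation from words
        if word in POSITIVE or token in POSITIVE_EMOTICONS:
            score += 1
        if word in NEGATIVE or token in NEGATIVE_EMOTICONS:
            score -= 1
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

A real system would, as noted above, need natural language processing to handle negation and subtlety ("not happy" defeats this sketch), which is precisely the limitation Neviarouskaya and colleagues address.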
3.2.3 Physiologic Data

Some physiologic data is known to encode levels of affect, including galvanic skin response, skin temperature, heartbeat and breathing rate, pupil dilation, and electrical activity measured from the surface of the scalp (Picard, 1997). Wang and colleagues used GSR data to estimate levels of arousal in an instant messaging application (Wang, et al., 2004). Specifically, spikes in the GSR data were used to infer high levels of arousal, and the return to lower amplitudes signaled a decreased level of arousal.

Output from electroencephalograph (EEG) has also been used by several researchers to classify emotional state into distinct categories within arousal/valence space (Khalili & Moradi, 2009; Sloten, et al., 2008). These studies use AI learning techniques to classify affective state into a small number of categories with moderate success.

3.2.4 Manual Selection

The most basic method of adding emotional content to an instant message is by simple manual selection or insertion. This method can take the form of a user selecting from a list of predefined emotions or emotional icons with a mouse click, or by explicitly inserting a marker, e.g., an emoticon, directly into the body of the message text. This type of input technique is widely used and is seen in research by Fabri & Moore (2005), Sanchez, Hernandez, Penagos, & Ostrovskaya (2006), and Wang, et al. (2004).
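Returning to the physiologic input technique of section 3.2.3: the spike-based arousal heuristic that Wang and colleagues describe could be approximated with a moving baseline, as sketched below. The window and margin values are illustrative assumptions, not parameters from the original study:

```python
# Sketch of inferring arousal from galvanic skin response samples.
# A "spike" is a sample exceeding a running baseline by a chosen margin;
# window and margin values are hypothetical.

def arousal_labels(gsr_samples, window=5, margin=0.5):
    """Label each GSR sample 'high' or 'low' arousal via a moving baseline."""
    labels = []
    for i, sample in enumerate(gsr_samples):
        recent = gsr_samples[max(0, i - window):i] or [sample]
        baseline = sum(recent) / len(recent)
        labels.append("high" if sample - baseline > margin else "low")
    return labels
```

When the signal returns to amplitudes near the baseline, the label falls back to "low," mirroring the decreased-arousal interpretation described above.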
3.3 Output Techniques

Output techniques describe methods that can be used to display emotional content to a chat participant after it has been input into the system. These techniques generally involve using emoticons, expressive avatars, haptic devices, and kinetic typography.

3.3.1 Emoticons

Emoticons are typically understood as small text-based or graphical representations of faces that characterize different affective states, and have been ever evolving in an attempt to remedy the lack of non-verbal cues during text chat (Lo, 2008). Examples of commonly used emoticons are presented in the table below.

    Meaning      Western Emoticon    Eastern Emoticon
    Happy        :-)                 (^_^)
    Sad          :-(                 (T_T)
    Surprised    :-o                 O_o
    Angry        >:-(                (>_<)
    Wink         ;-)                 (~_^)
    Annoyed      :-/                 (>_>)

    Table 3.1 Common emoticons in Western and Eastern cultures

Emoticons are perhaps the most widely used method for augmenting textual communication with affective information. A survey of 40,000 Yahoo Messenger users reported that 82% of respondents used emoticons to convey emotional information during chat (Yahoo, 2010). A separate study by Kayan and colleagues explored differences in IM behavior between Asian and North American users and reported that of 34 total respondents, 100% of Asian subjects used emoticons while
72% of North Americans (with an aggregate of 85% of respondents) used emoticons on a regular basis (Kayan, Fussell, & Setlock, 2006). These usage statistics underscore the prevalence of emoticons in IM communication.

Sanchez and colleagues introduced an IM application with a unique twist on the standard emoticon. Typical emoticons scroll with the text they are embedded in, and so lack the ability to convey anything more than brief, fleeting glimpses of emotion. This novel application has a persistent area for emoticons that can be updated as often as the user sees fit, and does not leave the screen as messages accumulate (Sanchez, et al., 2006). Building on Russell's model of affect (Russell, 1980), the team developed 18 different emoticons, each with three levels of intensity, to represent a significant portion of valence/arousal emotional space.

3.3.2 Expressive Avatars

Using expressive avatars to convey emotion during IM communication may be considered a close analog to the way emotion is encoded by facial expression during face-to-face conversation, considering that facial expression is among the highest bandwidth channels of sentic modulation (Pantic, et al., 2005).
Figure 3.1 Examples of expressive avatars

A study by Kaliouby and Robinson presents an instant messaging application called FAIM which uses automated facial expression recognition to infer affect, and displays an expressive avatar reflecting that affect to the chat partner (Kaliouby & Robinson, 2004). Affective states currently supported by FAIM include happy, surprised, agreeing, disagreeing, confused, indecisive, and neutral.

Fabri and Moore investigated the use of animated avatars capable of emotional facial expressions in an instant messaging environment (Fabri & Moore, 2005). They compared this with a condition in which the avatar was not animated and did not change facial expressions, except for minor random eyebrow movement. The hypothesis was that the condition in question would result in a higher level of "richness," comprised of high levels of task involvement, enjoyment, sense of presence, and sense of copresence. Participants interacted through the IM application during a classical survival exercise, in which both subjects were tasked with collectively ordering a list of survival items in terms of importance. An avatar representing each chat partner could be made to display one of Ekman's six universal facial expressions including happiness, surprise, anger, fear, sadness, and disgust
(Ekman & Oster, 1979), by clicking on a corresponding icon in the interface. Significant results from the study indicated higher levels of task involvement and copresence in the expressive avatar condition, equally high levels of presence in both conditions, and a higher level of enjoyment in the non-expressive avatar condition.

The AffectIM application developed by Neviarouskaya also uses expressive avatars to convey emotion during instant message communication (Neviarouskaya, et al., 2007). Rather than requiring a user to select an expression from a predefined set, AffectIM infers the emotional content of a message by analyzing the text of the message itself, and automatically updates an avatar with the inferred emotion. A comparison study identified differences between separate configurations of the AffectIM application: one in which emotions were automatically inferred, one that required manual selection of a desired emotion, and one that selected an emotion automatically in a pseudo-random fashion (Neviarouskaya, 2008). The study compared "richness" between conditions, comprised of interactivity, involvement, sense of copresence, enjoyment, affective intelligence, and overall satisfaction. Significant differences indicated a higher sense of copresence in the automatic condition than in the random condition, and higher levels of emotional intelligence in both the automatic and manual conditions than in the random condition.

3.3.3 Haptic Devices

Haptic instant messaging is described as instant messaging that employs waveforms of varying frequencies, amplitudes, and durations, transmitted and received by
purpose-built haptic devices (force-feedback joysticks, haptic touchpads, etc.), to which special emotional meaning can be attached (Rovers & Essen, 2004). Rovers and Essen introduce their idea of "hapticons," which are described as "small programmed force patterns that can be used to communicate a basic notion in a similar manner as ordinary icons are used in graphical user interfaces." Their preliminary application, HIM, parses instant message text for occurrences of traditional emoticons, e.g., :), etc., and sends a predefined waveform to any connected haptic devices. For example, a smiley face sends a waveform with moderate frequency that slowly grows in amplitude, while a frowny face is represented by several abrupt pulses with high frequency and amplitude.

    Emoticon    Meaning    Hapticon Waveform
    :-)         Happy      [waveform image]
    :-(         Sad        [waveform image]

    Table 3.2 Examples of hapticons

The ContactIM application developed by Oakley and O'Modhrain takes a different approach to integrating haptic information with an instant messaging environment. A plugin for the Miranda IM environment was created that mimics the effects of tossing a ball between partners by using a force-enabled haptic device such as the Phantom or a standard force-feedback joystick (Oakley, 2003). The application is designed to allow each user to impart a specific velocity and trajectory to the ball during a throw. The generated momentum of the ball is persistent until the chat partner picks it up. In this way, the act of tossing the ball may convey some degree of emotionality, e.g., a
lightly thrown ball as a playful, flirtatious gesture, or a fast throw to indicate disagreement or anger. Emphasis is placed on the asynchronous nature of typical instant message use, and the application has been designed to suit this mode of interaction by keeping the general characteristics of the tossed ball persistent until interaction by the receiver changes it.

3.3.4 Kinetic Typography

Kinetic typography is described as real-time modification of typographic characteristics including animation, color, font, and size, and may be used to convey affective information (Yeo, 2008). Yeo developed an IM client that inferred affective meaning through keyword pattern matching, and used kinetic typography to update the text of messages in real time (Yeo, 2008).

An instant messaging client developed by Wang and colleagues represents emotion in arousal/valence space by combining kinetic typography with galvanic skin response. Manually selected text animations are meant to represent valence, while GSR that is recorded and displayed to the chat partner represents level of arousal. Users were asked when they felt the most involved during the online communication, and answers typically corresponded to peaks in GSR level (Wang, et al., 2004). The study participants reported that the inclusion of arousal/valence information made the communication feel more engaging and that it was preferred over traditional text-only chat, although some users indicated that they would not always want their partner to be aware of their arousal level (Wang, et al., 2004).
3.4 Conclusion

This chapter has separated the major components of emotional instant messaging into two categories: input techniques and output techniques. Among input techniques, inference from textual cues, inference through automated facial expression recognition, inference from physiologic data, and manual selection have been reviewed. Output techniques that were discussed include emoticons, expressive avatars, haptic devices, and kinetic typography.

The next chapter introduces the Epoc headset and describes some of its capabilities. This headset is used as the emotional input device for the EmoChat system discussed in a subsequent chapter, and can be thought of as using automated facial expression recognition in combination with physiologic data to infer and convey emotion. The next chapter also presents a study that investigates the validity of the Epoc affect classifier.
4 Study 1: Validating the Emotiv Epoc Headset

4.1 Introduction

This study investigates the validity of the Epoc headset in terms of how accurately it measures levels of excitement, engagement, and frustration. Self-reported measures of excitement, engagement, and frustration are collected after games of Tetris are played at varied difficulty levels. The self-reported measures are compared with data from the headset to look for relationships.

4.2 Overview of the Epoc Headset

The EmoChat application makes use of the Epoc headset for measuring affective state and facial expression information. This headset, developed by Emotiv, was one of the first consumer-targeted BCI devices to become commercially available. Alternative BCI devices that were considered include the Neurosky Mindset and the OCZ Neural Impulse Actuator. The Epoc was selected because of the comparatively large number of electrodes (14) that it uses to sense electroencephalography (EEG) and electromyography (EMG) signals, and the resulting capabilities. Additionally, the Epoc has a growing community of active developers who form a support network for other people using the software development kit to integrate headset capabilities with custom applications.

Traditional EEG devices require the use of a conductive paste in order to reduce electrical impedance and improve conductivity between the electrode and the scalp.
The Epoc device, however, replaces this conductive paste with saline-moistened felt pads, which reduces setup time and makes cleanup much easier.

A software development kit provides an application programming interface to allow integration with homegrown applications, and a utility called EmoKey can be used to associate any detection with any series of keystrokes for integration with legacy applications. The developers have implemented three separate detection "suites" which monitor physiologic signals in different ways, and are reviewed below.

4.2.1 Expressiv Suite

This suite monitors EMG activity to detect facial expressions including left/right winks, blinks, brow furrowing/raising, left/right eye movement, jaw clenching, left/right smirks, smiles, and laughter. The detection sensitivity can be modified independently for each feature and for different users. Universal detection signatures are included for each feature, but signatures can also be trained to increase accuracy.

    Lower Face Movements    Upper Face Movements    Eye Movements
    Smirk Right             Brow Raise              Look Left
    Smirk Left              Brow Furrow             Look Right
    Smile                                           Wink Right
    Laugh                                           Wink Left
    Jaw Clench                                      Blink

    Table 4.1 Facial expression features measured by the Epoc headset
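The EmoKey idea of binding detections to keystroke sequences amounts to a small dispatch table, sketched below. The detection names, bindings, and `send_keys` callback are hypothetical stand-ins for illustration; this is not the Emotiv SDK or EmoKey's actual configuration format:

```python
# Hypothetical sketch of EmoKey-style behavior: map named detections to
# keystroke strings for a legacy application. Detection names, bindings,
# and the keystroke sink are illustrative; this is not the Emotiv API.

BINDINGS = {
    "smile": ":)",
    "brow_furrow": ">:(",
    "laugh": "lol",
}

def dispatch(detection, send_keys=lambda keys: keys):
    """Send the bound keystrokes for a detection; return None if unbound."""
    keys = BINDINGS.get(detection)
    if keys is None:
        return None
    return send_keys(keys)
```

In EmoKey's case the sink is simulated keyboard input to whatever application has focus, which is how Epoc detections can drive programs that know nothing about the headset.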
4.2.2 Affectiv Suite

The Affectiv suite monitors levels of basic affective states including instantaneous excitement, average excitement, engagement, frustration, and meditation. Detection algorithms for each state are proprietary and have not been released to the public; therefore the given labels may be somewhat arbitrary, and may or may not accurately reflect affective state. The goal of the present study is to determine the accuracy of these detections, with a longer-term goal of investigating whether this information can be used to augment the instant messaging experience through the presentation of emotional content.

4.2.3 Cognitiv Suite

This suite allows a user to train the software to recognize an arbitrary pattern of electrical activity measured by EEG/EMG that is associated with a specific, repeatable thought or visualization. The user may then reproduce this specific pattern to act as the trigger for a binary switch. Skilled users may train and be monitored for up to 4 different thought patterns at once.

4.3 The Need for Validation

The Epoc affectiv suite purports to measure levels of excitement, engagement, frustration, and meditation; however, the algorithms used to infer these states are proprietary and closed-source. There has been little research that references the Epoc headset, perhaps because it is still new and relatively unknown. No studies thus far
have evaluated the accuracy of its affective inference algorithms. Campbell and colleagues used raw EEG data from the headset as input to a P300-based selection engine (Campbell, et al., 2010), and several others have reviewed the device (Andrei, 2010; Sherstyuk, Vincent, & Treskunov, 2009). Methods are provided to retrieve each affectiv suite score at any given moment, but they do not reveal how each score is calculated. It is understandable that Emotiv has chosen to keep this part of their intellectual property out of the public domain, but if these affectiv measurements are to be used in any serious capacity by researchers or developers, evidence should be provided to support the claim that reported affectiv suite excitement levels are reasonable estimates of actual subject excitement levels, that affectiv suite engagement levels are reasonable estimates of actual subject engagement levels, and so on.

4.4 Experimental Design

A study was designed to determine the accuracy of the Epoc affectiv suite by presenting study participants with stimuli intended to elicit different levels of affective and physiologic responses (in the form of game play at varied levels), and measuring for correlation between output from the headset and self-reported affective experience. Since the overall goal of this thesis work is to investigate how the inclusion of affective information enriches instant message communication, the excitement, engagement, and frustration headset detections are validated here. It is thought that they are the most applicable to a study of affective communication.
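The planned comparison reduces, for each affective state, to correlating two paired series: headset-reported values versus self-reported ratings across trials. A minimal Pearson correlation sketch using only the standard library is shown below; the sample values are made up for illustration:

```python
import math

# Sketch of the planned headset vs. self-report comparison: a plain
# Pearson correlation over paired per-trial values.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical example: headset frustration scores vs. self-reported
# 5-point ratings for five trials.
headset = [0.2, 0.4, 0.5, 0.7, 0.9]
reported = [1, 2, 2, 4, 5]
r = pearson_r(headset, reported)
```

With a sample this small a nonparametric rank correlation would also be a defensible choice; the point here is only the shape of the analysis.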
The design of the present study is adapted from similar work by Chanel and colleagues, during which physiological metrics were monitored as participants played a Tetris game (Chanel, Rebetez, Betrancourt, & Pun, 2008). The difficulty level of the game was manipulated in order to elicit differing affective states. Self-report methods were also used to collect data about participants' subjective experience of affect. The goal of the study was to use these physiological metrics with machine learning techniques to classify affective experience into three categories: anxiety, engagement, and boredom. It is thought that the anxiety category of the Chanel study may be a close analog to the frustration component of the Epoc affectiv suite.

4.4.1 TetrisClone System Development

A small Tetris application (TetrisClone) was developed to automate the experiment and to aid with data collection. The application was written in C# using Microsoft Visual Studio 2008 and interfaces with the Emotiv Epoc headset through the supplied API.

The initialization screen for TetrisClone can be seen in fig. 4.1. This screen collects the test subject's name and is used to start logging data coming from the Epoc headset. After logging begins, the right panel can be hidden so that it does not distract the subject during the experiment. A screenshot of the TetrisClone application during one of the trials is presented in fig. 4.2.
Figure 4.1 Initialization screen for the TetrisClone application
Figure 4.2 TetrisClone application during trials

4.4.2 Measures

4.4.2.1 Questionnaires

All participants completed a total of 3 surveys to collect basic demographic information, self-reported levels of affect during the experiment, and open-ended opinions about the causes of affect during game play. These questionnaires are provided in appendices A-C. Demographics questions asked about age, gender, familiarity with Tetris, and skill at computer/console games. Self-reported affect questions asked subjects to rate their experiences of excitement, engagement, and frustration on a 5-point Likert scale between trial games. Open-ended questions asked subjects to describe what game events made them feel excited, engaged, and frustrated.
4.4.2.2 Headset Data
The TetrisClone application receives affective state information from the Epoc headset at approximately 2 Hz and writes it to a hidden text box control (visible in the left panel of fig. 4.1). As a participant completes one portion of the experiment and moves on to the next, the contents of the text box control are output to a CSV file. In this way, it becomes easier to determine which groupings of headset data records are associated with which points in the experiment.

Figure 4.3 Example output from the TetrisClone application

The output CSV files themselves contain time-stamped, headset-reported levels of excitement (short term), excitement (long term), engagement, frustration, and meditation; however, the present study is only concerned with the excitement (short term), engagement, and frustration components.
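The logging scheme described above can be sketched as follows. This is an illustrative Python sketch, not the thesis's C# code, and the column names (`timestamp`, `excitement_short`, and so on) are assumptions, since the exact CSV layout is not specified in the text:

```python
# Hypothetical reader for a TetrisClone CSV log. Assumes one row per 2 Hz
# sample with the six headset channels named in the text; only the three
# features used in the study are retained.
import csv
import io

def read_trial_log(csv_text):
    """Parse one trial's log, keeping excitement (short term),
    engagement, and frustration."""
    rows = []
    for rec in csv.DictReader(io.StringIO(csv_text)):
        rows.append({
            "exc": float(rec["excitement_short"]),
            "eng": float(rec["engagement"]),
            "fru": float(rec["frustration"]),
        })
    return rows

sample = (
    "timestamp,excitement_short,excitement_long,engagement,frustration,meditation\n"
    "0.0,0.28,0.30,0.49,0.35,0.40\n"
    "0.5,0.32,0.31,0.52,0.36,0.41\n"
)
trial = read_trial_log(sample)
```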
4.5 Experimental Procedures
A total of seven subjects participated in the experiment; however, data from one subject was incomplete due to problems with maintaining a consistently good signal quality from the headset. This incomplete data is not included in the analysis. The remaining subjects (N=6) were all male, aged 25-30 (mean=28.5), somewhat to very familiar with the Tetris game (mode=very familiar), and occasionally to very frequently played computer or console games (mode=very frequently), but never to rarely played Tetris (mode=rarely). Participants rated themselves as having average to far above average skill (mode=above average) at computer or console games.

Subjects arrived, were seated in front of a laptop, asked to review and sign consent forms, and then completed the Demographics Questionnaire (Appendix A). The subjects were then fitted with the Epoc headset. Care was taken to ensure that each headset sensor reported strong contact quality in the Control Panel software. The self-scaling feature of the Epoc headset required 15-20 minutes prior to data collection. During this time, the subjects were asked to play several leisurely games of Tetris as a warm-up activity.

Once the headset adjusted to the subjects, the Tetris game/headset data recorder was launched. Subjects played through a series of three Tetris games to determine "skill level," calculated by averaging the highest levels reached during each game. The TetrisClone application has no maximum level cap, although levels above 10 are so
difficult that progressing further is not practical. After each game the subjects rested for 45 seconds to allow any heightened emotional states to return to baseline.

Once skill level was determined, three experimental conditions were calculated automatically as follows:

    High Difficulty Level = Skill Level + 2
    Med Difficulty Level  = Skill Level
    Low Difficulty Level  = Skill Level – 2

The subjects then played through a randomly ordered set of 6 trials consisting of 2 games at each difficulty level, e.g., [8, 6, 6, 4, 8, 4]. Trials were randomized to account for order effects. During each trial, games lasted for 3 minutes at a constant difficulty/speed. If the subjects reached the typical game-over scenario, the playing field was immediately reset and the subjects continued playing until the 3 minutes were over. At the end of each round, the subjects completed a portion of the Experiment Questionnaire (Appendix B). The subjects rested for 45 seconds after completing the questionnaire, but before beginning the next round, to allow emotional state to return to baseline. Headset logging stopped after all 6 rounds had been played.
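The condition calculation and trial randomization described above can be sketched as follows. This is a Python illustration of the procedure only; the actual TetrisClone application was written in C#:

```python
import random

def difficulty_conditions(skill_level):
    """Derive the three difficulty conditions from the measured skill
    level, per section 4.5: skill level minus 2, skill level, plus 2."""
    return {"low": skill_level - 2, "med": skill_level, "high": skill_level + 2}

def trial_order(conditions, games_per_condition=2, seed=None):
    """Build the set of 6 trials (two games per condition) and shuffle
    it to counter order effects."""
    levels = [lvl for lvl in conditions.values() for _ in range(games_per_condition)]
    random.Random(seed).shuffle(levels)
    return levels

conds = difficulty_conditions(6)    # a skill level of 6 gives low=4, med=6, high=8
order = trial_order(conds, seed=0)  # some shuffled order, two games per level
```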
After the subjects finished all game play tasks, the facilitator removed the headset, and subjects completed the final Post-Experiment Questionnaire (Appendix C). The subject was paid for his time, signed a receipt, and was free to leave.

4.6 Results and Analysis
4.6.1 Headset-Reported versus Self-Reported Levels of Affect
The main goal of this study was to determine whether or not the Epoc headset reports data that is congruent with self-reported data of the same features, in order to establish the validity of the Epoc affectiv suite. Headset and self-report data were compared in a number of different ways. Each trial yielded 3 minutes' worth of headset data sampled at 2 Hz (approximately 360 samples × 3 affective features per sample = 1080 individual data elements), which were to be compared with a single self-reported level of affect for each of the 3 affective features in question (excitement, engagement, and frustration). Headset data from each trial was therefore reduced to 3 individual data elements by taking the mean of all sampled values for each of the three affective features. Headset means from each trial were then paired with the corresponding self-reported levels of affect for that trial. These data are reproduced in the table below.
Subject    Trial  Condition  H_exc     H_eng     H_fru     S_exc  S_eng  S_fru
Subject_1  1      low        0.283975  0.488978  0.347653  2      2      1
Subject_1  2      high       0.315356  0.559145  0.400994  4      5      3
Subject_1  3      med        0.316626  0.522680  0.430862  5      4      4
Subject_1  4      low        0.317267  0.573343  0.396643  4      5      1
Subject_1  5      med        0.389669  0.543603  0.371616  5      5      2
Subject_1  6      high       0.305706  0.596423  0.454396  5      5      3
Subject_2  1      high       0.301968  0.654293  0.580068  3      3      3
Subject_2  2      med        0.384194  0.660933  0.785552  3      3      3
Subject_2  3      high       0.323292  0.559863  0.368594  3      4      4
Subject_2  4      med        0.271679  0.505589  0.336272  2      3      2
Subject_2  5      low        0.289168  0.604071  0.505809  2      2      2
Subject_2  6      low        0.304198  0.530313  0.375890  3      2      2
Subject_3  1      high       0.302396  0.406588  0.345330  3      3      3
Subject_3  2      med        0.279739  0.409497  0.356766  3      3      2
Subject_3  3      low        0.353323  0.402969  0.413634  2      2      1
Subject_3  4      low        0.311233  0.425248  0.360592  2      3      1
Subject_3  5      med        0.365323  0.427358  0.429031  3      4      2
Subject_3  6      high       0.361752  0.543518  0.380738  4      4      4
Subject_4  1      low        0.233371  0.557115  0.391817  2      3      1
Subject_4  2      high       0.334173  0.485377  0.448109  3      4      4
Subject_4  3      med        0.265334  0.532051  0.463684  2      2      3
Subject_4  4      med        0.416719  0.519389  0.522566  2      2      1
Subject_4  5      low        0.281803  0.445416  0.424542  2      3      1
Subject_4  6      high       0.305135  0.508839  0.474708  3      3      3
Subject_5  1      low        0.271870  0.597617  0.362354  3      4      2
Subject_5  2      med        0.256403  0.744225  0.361584  4      4      2
Subject_5  3      high       0.292769  0.659934  0.377663  4      5      3
Subject_5  4      med        0.307077  0.650888  0.533751  4      5      1
Subject_5  5      high       0.256848  0.710124  0.460227  2      2      2
Subject_5  6      low        0.240970  0.594766  0.555613  2      4      1
Subject_6  1      high       0.250623  0.563784  0.455011  3      4      3
Subject_6  2      low        0.271444  0.590690  0.452902  5      5      1
Subject_6  3      high       0.282375  0.553440  0.458738  2      3      4
Subject_6  4      med        0.257751  0.573329  0.482535  4      4      1
Subject_6  5      med        0.235875  0.558884  0.460652  3      4      2
Subject_6  6      low        0.305082  0.622437  0.490638  4      4      1
Table 4.2 Headset and self-reported levels of affect per subject, per trial
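The reduction of each trial's headset samples to per-feature means, described above, amounts to a simple average over the roughly 360 samples of a 3-minute trial. A Python sketch for illustration (the original analysis used SPSS):

```python
def trial_means(samples):
    """Collapse one trial's 2 Hz headset samples into a single mean per
    affective feature, yielding the H_exc/H_eng/H_fru values of table 4.2."""
    n = len(samples)
    return {k: sum(s[k] for s in samples) / n for k in ("exc", "eng", "fru")}

# Two toy samples stand in for a full trial's worth of data.
samples = [
    {"exc": 0.28, "eng": 0.48, "fru": 0.34},
    {"exc": 0.32, "eng": 0.52, "fru": 0.38},
]
means = trial_means(samples)
```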
The non-parametric Spearman's rho was selected as the correlation metric for determining statistical dependence between headset and self-reported levels of affect because of the mixed ordinal-interval data. The resulting correlation coefficients and significances were calculated with SPSS and are presented in the table below (N=36).

    Correlation Pair   Coefficient   Sig. (2-tailed)
    Excitement          0.261         0.125
    Engagement          0.361         0.030
    Frustration        -0.033         0.849
Table 4.3 Spearman correlation between headset and self-reported levels of affect (N=36)

This analysis suggests that of the three features of affect examined, only engagement correlates significantly (p=.03) between what is reported by the headset and what is experienced by the subject. No significant correlation between headset and self-reported levels of excitement or frustration was found.

The levels of headset-reported excitement, engagement, and frustration presented in table 4.2 are average levels over entire 3-minute trials. It is possible that self-reported affect levels collected after the trials might better relate to headset levels averaged over smaller subdivisions of the trial time. For example, a subject who experienced high levels of frustration during the last 15 seconds of game play, but low levels of frustration at all other times, may have self-reported a high level of frustration after the trial, even though the subject generally experienced low levels. To investigate whether headset data averaged over smaller subdivisions of trial time better correlated with self-reported data, headset data was averaged from time slices
of the last 15, 30, and 60 seconds, and the first 60 seconds, of trial data. Spearman's rho was calculated for each new dataset against the same self-report data. These results are provided in the table below (N=36), along with the original correlation results from table 4.3.

    Correlation Pair   Time Division   Coefficient   Sig. (2-tailed)
    Excitement         all 3 min        0.261         0.125
                       last 60 s        0.133         0.439
                       last 30 s        0.210         0.219
                       last 15 s        0.174         0.310
                       first 60 s       0.285         0.092
    Engagement         all 3 min        0.361*        0.030
                       last 60 s        0.340*        0.042
                       last 30 s        0.291         0.085
                       last 15 s        0.316         0.061
                       first 60 s       0.229         0.179
    Frustration        all 3 min       -0.033         0.849
                       last 60 s       -0.102         0.554
                       last 30 s       -0.049         0.775
                       last 15 s        0.005         0.977
                       first 60 s       0.176         0.305
Table 4.4 Spearman correlation of headset and self-report data for varied time divisions (N=36)

This analysis suggests that no new significant relationships between headset and self-report data emerge when analyzing headset data from specific subdivisions of time (the last 60, 30, and 15 seconds, and the first 60 seconds); however, it does appear that self-reported excitement and frustration levels correlate best with averaged headset data from the first 60 seconds of each trial.

The data were further analyzed by calculating grand means for each difficulty condition, for both headset and self-reported levels of affect. Grand means are presented in the table below.
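The time-slice averaging and rank correlation used above can be sketched in Python. The original analysis was run in SPSS; the Spearman implementation below is a textbook rank-based version (Pearson correlation on tie-averaged ranks), shown only to make the computation concrete:

```python
def slice_values(values, span, rate_hz=2):
    """Select a trial time slice from 2 Hz samples.
    span: 'all', 'first60', 'last60', 'last30', or 'last15'."""
    if span == "all":
        return values
    secs = int(span.replace("first", "").replace("last", ""))
    n = secs * rate_hz
    return values[:n] if span.startswith("first") else values[-n:]

def ranks(xs):
    """1-based average ranks; tied values share their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    out = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            out[order[k]] = avg
        i = j + 1
    return out

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation of the rank-transformed data."""
    rx, ry = ranks(x), ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Note that this sketch omits the significance (p-value) computation, which SPSS reports alongside the coefficient.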
    Condition   H_exc   H_eng   H_fru   S_exc   S_eng   S_fru
    low         0.29    0.54    0.42    2.75    3.25    1.25
    med         0.31    0.55    0.46    3.33    3.58    2.08
    high        0.30    0.57    0.43    3.25    3.75    3.25
Table 4.5 Grand mean headset and self-reported levels of affect per difficulty level

Spearman's rho was again used as the metric for determining statistical dependence between headset and self-reported affective levels of the grand mean data. The resulting correlation coefficients and significances were calculated with SPSS and are presented in the table below (N=3).

    Correlation Pair   Coefficient   Sig. (2-tailed)
    Excitement          1.00          0.010
    Engagement          1.00          0.010
    Frustration         0.50          0.667
Table 4.6 Grand mean Spearman correlation between headset and self-reported levels of affect (N=3)

This analysis suggests a very high correlation of excitement and engagement between headset and self-reported levels of affect (p=.01); however, these results should be interpreted with caution, considering that this is a correlation between means and the number of values being compared is small (N=3). No significant correlation of frustration between headset and self-reported levels of affect was found.

To visualize the relationship between grand means of headset and self-reported affect features, line charts are presented below.
Figure 4.4 Comparison of grand mean headset and self-reported levels of excitement
Figure 4.5 Comparison of grand mean headset and self-reported levels of engagement
Figure 4.6 Comparison of grand mean headset and self-reported levels of frustration
The significant correlations (p=.01) between headset and self-reported levels of excitement and engagement are apparent in fig. 4.4 and fig. 4.5. Frustration is seen to correlate moderately well from the low to medium difficulty conditions; however, the degree and direction of change from the medium to high difficulty conditions are clearly in disagreement. As noted above, cautious interpretation of figures 4.4-4.6 is prudent considering the use of grand means and the small number of data points, but the emergence of relationships between headset and self-reported levels of excitement and engagement is suggested.

4.6.2 Subjective Causes of Affect during Gameplay
After the game play portion of the experiment concluded, subjects were asked to complete a brief survey to collect their subjective opinions about what caused their experiences of excitement, engagement, and frustration during game play. Participant responses were analyzed to identify general themes. The data were coded for these themes, which have been aggregated and are presented in the table below.
Q1. What types of events caused you to get excited during game play?

Competent game performance:
    (S3) "Doing well"
    (S1) "Clearing many lines at once"
    (S1) "Seeing a particularly good move open up"
Game speed/difficulty:
    (S5) "Speed increase of block decent"
    (S2) "When blocks got faster"
    (S4) "Game speed"
Poor game performance:
    (S5) "Block failing to land where desired"
    (S3) "A poor move"
    (S4) "A mistake during otherwise good game play"
    (S6) "End of game when doing bad"
    (S2) "When the blocks would get too high"
Positively perceived game variables:
    (S3) "A good sequence of pieces"
    (S1) "Getting pieces I was hoping for"
    (S6) "Seeing blocks I needed to make 4 rows"

Q2. What made the game engaging?

Game speed/difficulty:
    (S2) "The intensity"
    (S1) "Speed increases"
    (S4) "You have to think fast"
Cognitive load:
    (S6) "Have to think about what's happening on board"
    (S1) "Anticipating future moves"
    (S1) "Seeing game board get more full"
Game simplicity:
    (S3) "Small learning curve in general"
    (S1) "Simplicity"
    (S4) "Few input options, enough to stay interesting"

Q3. What happened during the game that made you feel frustrated?

Negatively perceived game variables:
    (S5) "Too many of same block in a row"
    (S3) "Bad sequence of blocks"
    (S1) "Getting the same piece over and over"
    (S6) "Not getting the blocks I needed"
    (S5) "Block did not fit pattern I was constructing"
    (S1) "Getting undesired pieces"
Poor game performance:
    (S5) "When a block would land out of place"
    (S3) "Poor move"
Game speed/difficulty:
    (S3) "Game moving too quickly"
    (S4) "Game speed increase"

Table 4.7 Major themes (with participant responses) identified in subjective affect survey results
4.7 Discussion
4.7.1 Consistency in the Present Study
The main goal of this study was to determine how accurately the Epoc headset measures levels of excitement, engagement, and frustration, in order to establish the validity of the Epoc affectiv suite. To this end, the TetrisClone application was developed and used to log headset output while subjects played games of Tetris at varied difficulty levels. During the study, subjects were asked to self-report how excited, engaged, and frustrated they felt for each game that they played.

The responses to the self-report excitement, engagement, and frustration questions were then statistically compared with the output from the headset. This analysis suggested that self-reported levels of engagement correlated well with levels reported by the headset. To a lesser degree, the analysis suggested that excitement levels measured by the headset correlated fairly well with self-reported levels. Frustration levels measured by the headset, however, did not appear to correlate with self-reported levels.

Subjective responses about what made the game engaging seem to corroborate the self-report and headset data. General trends in the data described engagement as increasing over the low, medium, and high difficulty levels. The two main themes identified in responses to "what makes the game engaging" were game speed/difficulty and cognitive load. As level increases, game speed inherently also increases. It makes sense that increased difficulty of the game should demand greater
concentration, more planning, and more efficient decision making—all suggestive of increased cognitive load. With respect to existing literature, the Chanel study, on which the experimental design of the present study was based, found a similar upward linear trend in participant arousal; however, the relationship between engagement and arousal has not been established.

Excitement trends in the self-report and headset data generally showed an increase from low to medium difficulty, then a slight decrease in the high difficulty condition. Responses to the question "what types of events caused you to get excited during game play" support this trend. General themes extracted from these responses include competent game performance, game speed/difficulty, poor game performance, and positively perceived game variables (such as getting a block type you were hoping for). Game speed increases as the difficulty condition increases, so its contribution to overall excitement level is always present in quantities that increase with difficulty level. It might be assumed that competent game performance and poor game performance have a balancing effect on one another, i.e., when one increases the other decreases, thereby creating a single contributing factor to excitement that is always present and arguably stable. The decisive contributor to excitement level may be the positively perceived game variables. It seems feasible that game variables that happen to be in the player's favor should occur at a similar frequency regardless of difficulty level. It may also be the case that these occurrences are less noticed at higher difficulty levels due to increased cognitive load: a mind occupied by game tasks of greater importance. This lack of recognition of positive game variables may
be the reason that excitement increases from the low to medium difficulty conditions, but then decreases in the high condition. A similar trend reported by the Chanel study occurs in the valence dimension: valence is shown to increase from the low to medium difficulty conditions, then decrease in the high condition, although the relationship between valence and excitement has not been established.

4.7.2 Future Direction
It might be beneficial to take a more granular approach to validating output from the Epoc headset, to determine whether specific game events, e.g., a mistake or an optimal placement, influence affect levels reported by the headset. The present study only looked at average headset output over large spans of time, but there is a great deal of variability in the data, some of which might have a relationship with game events. This more granular approach would require the ability to record specific game events, e.g., clearing a line, and cross-reference them with data from the headset. This could be accomplished by recording video of the game play. It might also yield interesting results if headset data were tested for correlation with other known physiological measures of affect, such as GSR or skin temperature.

4.8 Conclusion
With the accuracy of at least some of the headset-reported levels of affect established, an instant messaging application was developed that uses output from the headset to control a simple animated avatar and display data from the affectiv suite during
messaging sessions. This application is called EmoChat. In the next chapter, a study investigates how EmoChat can be used to enrich emotional content during IM sessions.
5 Study 2: Emotional Instant Messaging with EmoChat
5.1 Introduction
Existing instant messaging environments generally fail to capture the non-verbal cues that greatly increase information transfer bandwidth during face-to-face communication. The present study is a step toward investigating the use of a low-cost EEG device as a means to incorporate this lost non-verbal information during computer-mediated communication.

An instant messaging application (EmoChat) has been developed that integrates with the Emotiv Epoc headset to capture facial movements that are used to animate an expressive avatar. Output from the headset is also used to convey basic affective states of the user (levels of excitement, engagement, and frustration).

The present study examines how emotional information is transferred differently between users of the EmoChat application and a "traditional" instant messaging environment in terms of emotionality. This study addresses the following research questions:

1. Does the system facilitate communication that generally contains more emotional information?
2. Does the system provide a greater degree of richness (as defined in section 5.3.1)?
3. Is the emotional state of participants more accurately conveyed/interpreted?
4. How usable is a system that implements this technology?
5.2 EmoChat System Development
5.2.1 Overview
EmoChat is a client/server application that facilitates the exchange of emotional information during instant message communication. The application was developed with C# in Microsoft Visual Studio 2008.

Traditional instant messaging environments typically rely on manually generated emoticons in order to share the emotional meaning of a message. EmoChat introduces a novel way to capture and convey emotional meaning by integrating with the Emotiv Epoc headset, a low-cost, commercially available EEG device that is capable of inferring facial expression and basic affective information from raw EEG data.

Figure 5.1 The EmoChat client application
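The exchange of emotional information alongside chat text might be pictured with a sketch like the following. The thesis does not describe EmoChat's wire format (and the application itself is written in C#), so the JSON structure and field names here are purely hypothetical:

```python
# Hypothetical message payload: each chat line travels with the sender's
# current facial expression channels and headset affect levels.
import json

def make_message(text, expression, affect):
    """Bundle a chat line with the sender's expressive state for transport."""
    return json.dumps({
        "text": text,
        "expression": expression,  # e.g. {"brow": "neutral", ...}
        "affect": affect,          # normalized levels from the headset
    })

msg = make_message(
    "hello",
    {"brow": "weak raise", "eyes": "neutral", "mouth": "weak smile"},
    {"excitement": 0.31, "engagement": 0.56, "frustration": 0.40},
)
decoded = json.loads(msg)
```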
Facial expression information captured by the Epoc headset is passed to EmoChat and used to animate a simple avatar with brow, eye, and mouth movements. Affective information captured by the headset is used to modify the values of a series of progress bar style widgets. The previous validation study suggested that excitement and engagement are reasonably estimated by the headset; however, it was decided that the other affective measures from the headset (frustration and meditation) would also be presented in the EmoChat application to give users a chance to decide for themselves whether or not these measures are of any value.

Although the application has been specifically designed to integrate with the Epoc headset, a headset is not required. All facial movements and affective levels may be manually manipulated by the user at a very granular level, i.e., users may override brow control but leave eye, mouth, and affect control determined by the headset. Manual override of facial and affect control is permitted whether or not a headset is being used. A summary of the facial movements and affective information conveyed by EmoChat is presented below:

    Eyebrow         Eyes         Mouth            Affect
    Strong raise    Blink        Laugh            Excitement
    Weak raise      Left wink    Strong smile     Average excitement
    Neutral         Right wink   Weak smile       Engagement
    Weak furrow     Neutral      Neutral          Frustration
    Strong furrow   Left look    Clench (frown)   Meditation
                    Right look
Table 5.1 Facial movements and affective information used by EmoChat
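The per-channel manual override described above could be modeled as follows. This is a hypothetical Python sketch; EmoChat itself is written in C# and its internal design is not documented here, so the class and method names are assumptions:

```python
# Hypothetical model of EmoChat's per-channel control: each channel (brow,
# eyes, mouth, and the affect bars) independently follows either the
# headset or a manual user setting.
HEADSET, MANUAL = "headset", "manual"

class ExpressionState:
    def __init__(self):
        self.source = {c: HEADSET for c in ("brow", "eyes", "mouth", "affect")}
        self.manual_value = {}

    def override(self, channel, value):
        """User takes manual control of one channel."""
        self.source[channel] = MANUAL
        self.manual_value[channel] = value

    def release(self, channel):
        """Return a channel to headset control."""
        self.source[channel] = HEADSET

    def resolve(self, channel, headset_value):
        """Value actually displayed: a manual override wins if set."""
        if self.source[channel] == MANUAL:
            return self.manual_value[channel]
        return headset_value

state = ExpressionState()
state.override("brow", "strong furrow")  # manual brow; headset drives the rest
```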
