EMPATHIC COMPUTING: NEW
APPROACHES TO GAMING
Mark Billinghurst
mark.billinghurst@auckland.ac.nz
August 5th 2021
FDG 2021
Shared Space (1999)
• Face-to-face interaction, tangible AR metaphor
• Easy collaboration with strangers
• Users acted the same as if handling real objects
Shared Space Demo
AR Tennis (2005)
• First collaborative AR game on a mobile phone
Henrysson, A., Billinghurst, M., & Ollila, M. (2005, October). Face to face collaborative AR on mobile phones.
In Fourth IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR'05) (pp. 80-89). IEEE.
Lessons Learned
• Make the technology invisible
• Remove seam between real and virtual worlds
• Enable people to connect with each other
• Support/enhance natural communication
• People will create their own fun
• Provide a platform that encourages play
“Communication is not
only the essence of
being human, but also
a vital property of life.”
John R. Pierce
1. Experience Capture
• Move from sharing faces
to sharing places
2. Natural Collaboration
• Faster networks support
more natural collaboration
3. Implicit Understanding
• Systems that recognize
behaviour and emotion
[Diagram: Empathic Computing at the intersection of Experience Capture, Natural Collaboration, and Implicit Understanding]
Empathy
“Seeing with the Eyes of another,
Listening with the Ears of another,
and Feeling with the Heart of another.”
Alfred Adler
Empathic Computing
1. Understanding: Systems that can
understand your feelings and emotions
2. Experiencing: Systems that help you
better experience the world of others
3. Sharing: Systems that help you better
share the experience of others
[Diagram: Empathic Computing builds on sensors, VR, and AR]
Research Focus
Can we develop systems that allow
us to share what we are seeing,
hearing and feeling with others?
Empathy Glasses
• Combines eye tracking, a display, and face-expression sensing
• Implicit cues – eye gaze, face expression
Pupil Labs eye tracker + Epson BT-200 display + AffectiveWear face-expression sensor
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the
34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
Remote Collaboration
• Eye gaze pointer and remote pointing
• Face expression display
• Implicit cues for remote collaboration
https://www.youtube.com/watch?v=CdgWVDbMwp4
Sharing: Virtual Communication Cues (2019)
• Using AR/VR to share communication cues
• Gaze, gesture, head pose, body position
• Sharing same environment
• Virtual copy of real world
• Collaboration between AR/VR
• VR user appears in AR user’s space
Piumsomboon, T., Dey, A., Ens, B., Lee, G., & Billinghurst, M. (2019). The effects of sharing awareness cues
in collaborative mixed reality. Frontiers in Robotics and AI, 6, 5.
Sharing Virtual Communication Cues
• AR/VR displays
• Gesture input (Leap Motion)
• Room scale tracking
• Conditions
• Baseline, FoV, Head-gaze, Eye-gaze
https://www.youtube.com/watch?v=K_afCWZtExk
Results
• Predictions
• Eye/Head pointing better than no cues
• Eye/head pointing could reduce need for pointing
• Results
• No difference in task completion time
• Head-gaze/eye-gaze gave a greater mutual gaze rate
• Head-gaze rated higher in ease of use than baseline
• All cues provide higher co-presence than baseline
• Pointing gestures reduced in cue conditions
• But
• No difference between head-gaze and eye-gaze
Multi-Scale Collaboration
• Changing the user’s virtual body scale
Sharing: Separating Cues from Body
• What happens when you can’t see each other?
Piumsomboon, T., Lee, G. A., Hart, J. D., Ens, B., Lindeman, R. W., Thomas, B. H., & Billinghurst, M. (2018,
April). Mini-me: An adaptive avatar for mixed reality remote collaboration. In Proceedings of the 2018 CHI
conference on human factors in computing systems (pp. 1-13).
[Images: collaborating vs. collaborator out of view]
Mini-Me Communication Cues in MR
• When the collaborator goes out of sight, a Mini-Me avatar appears
• Miniature avatar in the real world
• Mini-Me points to shared objects and shows communication cues
• Redirected gaze, gestures
https://www.youtube.com/watch?v=YrdCg8zz57E
User Study (16 participants)
• Collaboration between user in AR, expert in VR
• Hololens, HTC Vive
• Two tasks:
• (1) asymmetric, (2) symmetric
• Key findings
• Mini-Me significantly improved performance time (task 1)
• Mini-Me significantly improved Social Presence scores
• 63% (task 2) – 75% (task 1) of users preferred Mini-Me
“The ability to see the small
avatar … enhanced the
speed of solving the task”
Sharing a View
Sharing VR Experiences
• HTC Vive HMD
• Empathic glove
• Empatica E4
Dey, A., Piumsomboon, T., Lee, Y., & Billinghurst, M. (2017). Effects of
sharing physiological states of players in a collaborative virtual reality
gameplay. In Proceedings of CHI 2017 (pp. 4045-4056).
VR Environments
• Butterfly World: calm scene, collect butterflies
• Zombie Attack: scary scene, fighting zombies
https://www.youtube.com/watch?v=SaiHxps-Ofw
Changing the Other Person’s Heartrate?
• Follow-on study: artificially changing and sharing heart rate (-20%, 0%, +20%) (see the sketch below)
• Key findings
• Manipulated heart rate affects perceived valence and arousal levels of another person
• No change in actual heart rate, but a trend towards significance (p = 0.08)
• Significant environment effect – active has higher HR than passive
A. Dey, H. Chen, A. Hayati, M. Billinghurst and R. W. Lindeman, "Sharing Manipulated Heart
Rate Feedback in Collaborative Virtual Environments," 2019 IEEE International Symposium
on Mixed and Augmented Reality (ISMAR), Beijing, China, 2019, pp. 248-257.
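The manipulation above can be read as a simple scaling of the displayed heart rate before it is shared with the partner. A minimal Python sketch, assuming a measured rate in bpm; the function and parameter names are illustrative, not from the paper:

```python
# Hypothetical sketch: scale a measured heart rate before sharing it with a partner,
# mirroring the -20% / 0% / +20% manipulation conditions described above.
def shared_heart_rate(measured_bpm: float, manipulation: float) -> float:
    """manipulation is one of -0.2, 0.0, or +0.2."""
    return measured_bpm * (1.0 + manipulation)

# Example: a measured 75 bpm is shown to the partner as 90 bpm in the +20% condition.
# shared_heart_rate(75.0, 0.2)  # -> 90.0
```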
Human Driven Avatars
Enhancing Emotion
• Using physiological and contextual cues to enhance emotion representation (see the sketch below)
• Show the user's real emotion, make it easier to understand the user's emotion, etc.
[Diagram: physiological cues (arousal/valence, positive/negative) and context cues drive the avatar representation of the real user]
Hart, J. D., Piumsomboon, T., Lee, G. A., Smith, R. T., & Billinghurst, M. (2021, May). Manipulating Avatars for Enhanced
Communication in Extended Reality. In 2021 IEEE International Conference on Intelligent Reality (ICIR) (pp. 9-16). IEEE.
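One way to picture the arousal/valence mapping above is a small lookup from normalized sensor values to a coarse avatar expression. This is a hypothetical sketch; the thresholds, labels, and function name are assumptions rather than the system described in the paper:

```python
# Hypothetical mapping from normalized valence/arousal to a coarse avatar expression label.
def avatar_expression(valence: float, arousal: float) -> str:
    """valence and arousal are assumed normalized to [-1, 1]."""
    if valence >= 0.0:
        return "excited" if arousal >= 0.0 else "content"
    return "distressed" if arousal >= 0.0 else "sad"

# Example: sensor-derived valence/arousal drive the avatar's displayed emotion.
# avatar_expression(valence=0.4, arousal=0.7)  # -> "excited"
```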
System Design
Early Results
[Images: face tracking, positive affect, avatar outcome]
Technology Trends
• Advanced displays
• Wide FOV, high resolution
• Real time space capture
• 3D scanning, stitching, segmentation
• Natural gesture interaction
• Hand tracking, pose recognition
• Robust eye-tracking
• Gaze points, focus depth
• Emotion sensing/sharing
• Physiological sensing, emotion mapping
HP Reverb G2 Omnicept
• Wide FOV, high resolution, best in class VR display
• Eye tracking, heart rate, pupillometry, and face camera
NextMind
• EEG attachment for AR/VR HMD
• 9 dry EEG electrodes
• https://www.next-mind.com/
• https://www.youtube.com/watch?v=yfzDcfQpdp0
Brain Synchronization
[Images: pre-training (finger pointing) at session start vs. post-training (finger pointing) at session end]
Brain Synchronization in VR
Gumilar, I., Sareen, E., Bell, R., Stone, A., Hayati, A., Mao, J., ... & Billinghurst, M. (2021). A comparative study on
inter-brain synchrony in real and virtual environments using hyperscanning. Computers & Graphics, 94, 62-75.
NeuralDrum
• Using brain synchronicity to increase connection
• Collaborative VR drumming experience
• Rhythm game like Beat Saber
• Measure brain activity using 3 EEG electrodes
• Use PLV (Phase Locking Value) to calculate synchronization (see the sketch below)
• More synchronization increases graphics effects/immersion
Pai, Y. S., Hajika, R., Gupta, K., Sasikumar, P., & Billinghurst, M. (2020). NeuralDrum: Perceiving Brain
Synchronicity in XR Drumming. In SIGGRAPH Asia 2020 Technical Communications (pp. 1-4).
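PLV measures how consistently the phase difference between two signals stays fixed over time; a value near 1 means strong synchrony. A minimal Python sketch, assuming two pre-recorded EEG channels sampled at 250 Hz; the frequency band, filter order, and variable names are assumptions rather than NeuralDrum's exact pipeline:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def plv(x, y, fs=250.0, band=(8.0, 13.0)):
    """Phase Locking Value between two equal-length EEG signals, in [0, 1]."""
    # Band-pass filter both signals to the band of interest (alpha here).
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    xf, yf = filtfilt(b, a, x), filtfilt(b, a, y)
    # Instantaneous phase from the analytic (Hilbert) signal.
    phase_x, phase_y = np.angle(hilbert(xf)), np.angle(hilbert(yf))
    # PLV is the magnitude of the mean phase-difference vector.
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Example use: drive a visual effect from inter-brain synchrony.
# effect_strength = plv(player1_eeg, player2_eeg)  # higher PLV -> stronger effect
```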
Set Up
• HTC Vive HMD
• OpenBCI
• 3 EEG electrodes
Results
"It’s quite interesting, I actually felt like my
body was exchanged with my partner."
[Images: poor player vs. good player]
Conclusions
• Trend towards Empathic Computing
• Understanding, Experiencing, Sharing
• Empathic Computing enables new games
• Changes perspective
• Shares space/experience
• Enhances communication
• Many directions for future research
• Emotion recognition, empathic agents, etc.
[Diagram of research threads: Mini-Me, virtual cues, enhanced emotion, brain synchronization, emotion recognition, scene capture, AI]
Empathic Social Agents
• Goal: use agents to create empathy between people
• Combine
• Scene capture
• Emotion recognition
• Shared tele-presence
• Enhanced communication cues
• Facilitating brain synchronization
• Separate cues from representation
Soul Machines
www.empathiccomputing.org
@marknb00
mark.billinghurst@auckland.ac.nz
