From Interaction to Empathy
Mark Billinghurst, Director at HIT Lab NZ
Apr. 14, 2016

  1. FROM INTERACTION TO EMPATHY: NEW DIRECTIONS IN INTERFACE TECHNOLOGY • Mark Billinghurst • mark.billinghurst@unisa.edu.au • April 14th, 2016 • CHIuXiD 2016 Conference, Jakarta, Indonesia
  2. How did that work? How could I get a whole room of people clapping together with no instruction? • Clear goal • Simple feedback • Well-connected network • Everyone understood each other • Result: successful crowd-sourced behaviour
  3. Computing Evolution •  Change in interface over time -> new challenges for design
  4. Interface Design for the Future • Key topics: Feedback, Connected Networks, Shared Understanding • From the single user to connected communities
  5. TRENDS IN TECHNOLOGY
  6. Interaction Technology • Timeline of input becoming more natural over time: Punch Card (1950), Keyboard (1960), Mouse (1980), Speech (1990), Gesture (2000), Emotion (2010), and eventually Thought
  7. Physiological Sensing Emotiv Empatica
  8. Interaction Technology • The same timeline, Punch Card (1950) through Emotion (2010) and Thought, also marks a shift from explicit to implicit interaction
  9. Content Capture • Timeline of capture becoming more realistic over time: Photo (1850), Film (1900), Live Video (1940), Panorama (1990), 360 Video (2000), 3D Space (2010)
  10. 3D Image/Space Capture • Google Project Tango • Samsung Project Beyond
  11. Content Capture • The same timeline, Photo (1850) through 3D Space (2010), marks a shift from 2D static images to immersive live experiences
  12. Networking Speeds • Log-scale plot of bandwidth over time: roughly 100 b/s in 1980, rising through 10 Kb/s and 1 Mb/s to 100 Mb/s by 2005-2010
  13. Network Innovation • Universal Connectivity
  14. Networking Speeds • The same bandwidth curve, annotated with what each level enables: Text, then Audio, then Video, approaching Natural communication
  15. Holoportation •  Augmented Reality + 3D capture + high bandwidth •  http://research.microsoft.com/en-us/projects/holoportation/
  16. Holoportation Demo https://www.youtube.com/watch?v=7d59O6cfaM0
  17. Natural Collaboration • Implicit Understanding • Experience Capture
  18. Natural Collaboration + Implicit Understanding + Experience Capture = Empathic Computing
  19. EMPATHIC COMPUTING
  20. Empathy • “Seeing with the Eyes of another, Listening with the Ears of another, and Feeling with the Heart of another.” (Alfred Adler)
  21. Empathic Computing 1. Understanding: Systems that can understand your feelings and emotions 2. Experiencing: Systems that help you better experience the world of others 3. Sharing: Systems that help you better share the feelings of others
  22. Understanding: Affective Computing • Rosalind Picard – MIT Media Lab • Systems that recognize emotion
  23. Appliances That Make You Happy • Jun Rekimoto – University of Tokyo/Sony CSL • Smile detection + smart appliances
  24. Happiness Counter Demo https://vimeo.com/29169237
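As a rough illustration of the smile-detection idea on slide 23, the sketch below wires OpenCV's stock Haar cascades to a hypothetical appliance trigger. The trigger_appliance() stub is an assumption for illustration; the actual Happiness Counter hardware is not described in these slides.

```python
# Minimal sketch of "smile detection + smart appliance", assuming a webcam
# and OpenCV's bundled Haar cascades. trigger_appliance() is a hypothetical
# stand-in for whatever actuator the appliance exposes.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

def trigger_appliance():
    print("Smile detected - appliance activated")  # placeholder for a real actuator

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[y:y + h, x:x + w]  # look for smiles inside the face region only
        if len(smile_cascade.detectMultiScale(roi, 1.7, 20)) > 0:
            trigger_appliance()
    if cv2.waitKey(30) == 27:  # Esc to quit
        break
cap.release()
```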
  25. Experiencing: Virtual Reality "Virtual reality offers a whole different medium to tell stories that really connect people and create an empathic connection." Nonny de la Peña http://www.emblematicgroup.com/
  26. Hunger •  Experience of the homeless waiting in a food line https://www.youtube.com/watch?v=wvXPP_0Ofzc
  27. CHILDHOOD •  Kenji Suzuki, University of Tsukuba •  What does it feel like to be a child? •  VR display + cameras lowered to a child's eye height + hand restrictors
  28. CHILDHOOD Demo https://vimeo.com/128641932
  29. Sharing • Can we develop systems that allow us to share what we are seeing, hearing and feeling with others?
  33. Using AR/Wearables for Empathy • Remove technology barriers • Enhance communication • Change perspective • Share experiences • Enhance interaction in real world
  34. Example: Google Glass • Camera + Processing + Display + Connectivity • Ego-Vision Collaboration (but with a fixed view)
  35. Current Collaboration on Wearables • First person remote conferencing/hangouts • Limitations •  Single POV, no spatial cues, no annotations, etc.
  36. Social Panoramas (ISMAR 2014) • Capture and share social spaces in real time • Supports independent views into Panorama Reichherzer, C., Nassani, A., & Billinghurst, M. (2014, September). [Poster] Social panoramas using wearable computers. In Mixed and Augmented Reality (ISMAR), 2014 IEEE International Symposium on (pp. 303-304). IEEE.
  37. Implementation • Google Glass • Capture live image panorama (compass + camera) • Remote device (tablet) • Immersive viewing, live annotation
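The slides don't include capture code; as a sketch of the panorama-assembly step, assuming overlapping frames ordered by compass heading have already been saved to disk (the file names below are hypothetical), OpenCV's stitching module can build the shared panorama:

```python
# Sketch of the panorama-building step, assuming overlapping camera frames
# ordered by compass heading were saved as frame_0.jpg .. frame_7.jpg
# (hypothetical names; the Glass-side capture code is not shown here).
import cv2

frames = [cv2.imread(f"frame_{i}.jpg") for i in range(8)]
frames = [f for f in frames if f is not None]  # drop any missing files

stitcher = cv2.Stitcher_create()
status, panorama = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("social_panorama.jpg", panorama)  # shared with the remote tablet
else:
    print(f"Stitching failed ({status}); frames may not overlap enough")
```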
  38. User Interfaces • Glass View • Tablet View
  39. Social Panorama https://www.youtube.com/watch?v=vdC0-UV3hmY
  40. Lessons Learned • Good • Communication easy and natural • Users enjoyed view independence • Sharing the panorama enhances the shared experience • Bad • Difficult to support equal input • Need to provide better awareness cues
  41. CoSense (CHI 2015) • Real time sharing - Emotion, video, and audio • Wearable (sends emotion) -> Desktop (remote view) • Hardware: Google Glass + e-Health 2.0 board • Ayyagari, S. S., Gupta, K., Tait, M., & Billinghurst, M. (2015, April). CoSense: Creating shared emotional experiences. In Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (pp. 2007-2012). ACM.
  42. Implementation Data Capture Feature Detection Emotion Recognition Emotion Representation Empathic User Interface Hardware User Interface
  43. Wearable Interface • Google Glass + e-Health + Spydroid + SSI • Measure GSR, pulse oxygen, ECG, voice pitch • Share video and audio remotely • Representative emotions sent back to Glass user
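To make the capture-to-representation pipeline on slides 42-43 concrete, here is a toy sketch of the feature-detection stage for a single signal (GSR). The sample rate, threshold, and synthetic data are illustrative assumptions only; CoSense itself used the SSI framework with the e-Health board, not this code.

```python
# Toy sketch of GSR-based arousal detection: flag samples that rise sharply
# above a moving-average baseline. All numbers are assumptions for
# illustration, not CoSense's actual SSI configuration.
import numpy as np

fs = 10                                                  # assumed GSR sample rate (Hz)
t = np.arange(0, 60, 1 / fs)
gsr = 2.0 + 0.01 * t + 0.02 * np.random.randn(t.size)    # synthetic signal with slow drift
gsr[300:320] += np.linspace(0, 0.6, 20)                  # inject a skin-conductance response

def arousal_events(signal, fs, win_s=5.0, jump=0.3):
    """Indices where the signal exceeds its moving-average baseline by `jump`."""
    win = int(win_s * fs)
    baseline = np.convolve(signal, np.ones(win) / win, mode="same")
    return np.where(signal - baseline > jump)[0]

events = arousal_events(gsr, fs)
if events.size:
    print(f"Arousal spike near t = {events[0] / fs:.1f} s -> send 'excited' cue to Glass")
```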
  44. Desktop Interface
  45. CoSense Demo
  46. Lessons Learned • Good • System was wearable • Sender and receiver mirrored emotion • Minimal cues provided best experience • Bad • System delays • Need for good stimulus • Difficult to represent emotion
  47. Empathy Glasses (CHI 2016) •  Combines eye-tracking, a see-through display, and facial expression sensing •  Implicit cues – eye gaze, facial expression •  Hardware: Pupil Labs eye tracker + Epson BT-200 + AffectiveWear • Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
  48. AffectiveWear – Emotion Glasses •  Photo sensors to recognize facial expression •  User calibration •  Machine learning •  Recognizes 8 facial expressions
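A hedged sketch of the AffectiveWear idea: per-user calibration frames from a small array of photo-reflective sensors train a simple classifier over the 8 expressions. The sensor count, expression labels, and k-nearest-neighbours model below are illustrative assumptions; the authors' exact method and data may differ.

```python
# Sketch: classify facial expression from a vector of photo-sensor readings
# after per-user calibration. Training data is synthetic, for illustration only.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

EXPRESSIONS = ["neutral", "smile", "frown", "surprise",
               "anger", "wink_l", "wink_r", "mouth_open"]  # 8 classes, per the slide
N_SENSORS = 8                                              # assumed sensor count

rng = np.random.default_rng(0)
# Per-user calibration: record 10 sensor frames while posing each expression.
centroids = rng.random((len(EXPRESSIONS), N_SENSORS))
X_cal = np.vstack([c + 0.05 * rng.standard_normal((10, N_SENSORS)) for c in centroids])
y_cal = np.repeat(EXPRESSIONS, 10)

model = KNeighborsClassifier(n_neighbors=3).fit(X_cal, y_cal)

frame = X_cal[15] + 0.02 * rng.standard_normal(N_SENSORS)  # a fresh sensor frame
print("Detected expression:", model.predict(frame.reshape(1, -1))[0])
```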
  49. Integrated System • Local User • Video camera • Eye-tracking • Face expression • Remote Helper • Remote pointing
  50. System Diagram •  Two monitors on Remote User side •  Scene + Emotion Display
  51. Empathy Glasses Demo
  52. Lessons Learned • Pointing really helps in remote collaboration •  Makes the remote user feel more connected • Gaze looks promising •  Shows the context of what a person is talking about • More work needed on emotion/expression cues • Limitations •  Limited implicit cues •  Two separate displays •  Task was a poor emotional trigger •  AffectiveWear needs improvement
  53. FUTURE RESEARCH
  54. Looking to the Future
  55. Scaling Up • Seeing actions of millions of users in the world • Augmentation on city/country level
  56. AR + Smart Sensors + Social Networks • Track population at city scale (mobile networks) • Match population data to external sensor data • Mine data for applications
  57. Example: MIT SENSEable City Lab http://senseable.mit.edu/wikicity/rome/
  58. Example: CSIRO WeFeel Tool • Emotionally mining global Twitter feeds • http://wefeel.csiro.au
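In the spirit of WeFeel, here is a toy sketch of lexicon-based emotion mining over tweet text: map words to emotion categories and aggregate the counts. The tiny lexicon and sample tweets are invented for illustration; WeFeel itself uses a much richer research lexicon over the live Twitter stream.

```python
# Toy emotion mining: count emotion-category hits per tweet using a tiny,
# illustrative word->emotion lexicon (not WeFeel's actual lexicon).
from collections import Counter

LEXICON = {
    "happy": "joy", "great": "joy", "love": "joy",
    "sad": "sadness", "miss": "sadness",
    "angry": "anger", "hate": "anger",
    "scared": "fear", "worried": "fear",
}

tweets = [
    "so happy with the great news today",
    "miss my friends, feeling sad",
    "worried about the storm tonight",
]

counts = Counter()
for tweet in tweets:
    for word in tweet.lower().split():
        if word in LEXICON:
            counts[LEXICON[word]] += 1

print(dict(counts))  # {'joy': 2, 'sadness': 2, 'fear': 1}
```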
  59. CONCLUSION
  60. Conclusions • Empathic Computing • Sharing what you see, hear and feel • AR/Wearables Enable Empathic Experiences • Removing technology barriers • Changing perspective • Sharing space/experience • Many directions for future research
  61. www.empathiccomputing.org @marknb00 mark.billinghurst@unisa.edu.au