Using Augmented Reality to Create Empathic Experiences
Keynote address by Mark Billinghurst at the IUI 2014 conference in Haifa, Israel, February 27th, 2014.

Usage Rights: CC Attribution License

Presentation Transcript

    • Using Augmented Reality to Create Empathic Experiences Mark Billinghurst mark.billinghurst@hitlabnz.org The HIT Lab NZ, University of Canterbury February 27th 2014
    • Courtesy Matt Rettig, CMU
    • Processing Power Adapt Experience Operate
    • Beyond the Desktop
    • Intelligent User Interfaces   AI + HCI: User Interface involving some elements of Artificial Intelligence   Computer having model of user/domain   First IUI Conference in 1997   Readings in IUI (Wahlster 1998)
    •   Microsoft Clippy (1997)   MS Office Intelligent User Interface
    • Intelligent User Interfaces
    • Multiple Intelligences   Frames of Mind: The Theory of Multiple Intelligences   Howard Gardner (1983)
    • Multiple Intelligences
    • Emotional Intelligence   Emotional Intelligence: Why It Can Matter More Than IQ   Goleman (1995)   Identify, assess, and control the emotions of oneself, of others, and of groups
    • Foundations of Emotional Intelligence
    • Empathy
    • Empathy vs. Intelligence   Intelligence: the power of one’s brain, divided into many categories and used in numerous ways.   Empathy: the power of one’s heart, expressing one’s true emotions to oneself, to others, and to the wider world.
    • IQ vs EQ?
    • Mirror Neurons   A neuron that fires both when an animal acts and when it observes the same action performed by another   Giacomo Rizzolatti, Univ. of Parma (1980s/90s)
    • Empathic Computing 1. Computing systems that can understand your feelings and emotions 2. Computing systems that help you better understand the feelings of others
    • Affective Computing   Rosalind Picard – MIT Media Lab   http://affect.media.mit.edu
    • Appliances That Make You Happy   Jun Rekimoto – Univ. Tokyo   Smile detection + smart devices
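As a rough illustration of the smile-detection idea behind such appliances, the sketch below watches a webcam with OpenCV's bundled Haar cascades and fires a placeholder hook when a smile appears inside a detected face. The trigger_appliance() hook and the detector thresholds are assumptions for illustration, not Rekimoto's implementation.

```python
# Minimal smile-detection sketch (assumes OpenCV installed with its bundled cascades).
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

def trigger_appliance():
    """Hypothetical smart-device hook: unlock, greet, log the user's mood, etc."""
    print("smile detected -> appliance responds")

cap = cv2.VideoCapture(0)                     # default webcam
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[y:y + h, x:x + w]          # look for smiles only inside faces
        if len(smile_cascade.detectMultiScale(roi, 1.7, 22)) > 0:
            trigger_appliance()
    if cv2.waitKey(1) & 0xFF == ord("q"):     # press q to quit
        break
cap.release()
```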
    • Can we develop interfaces that allow us to be more empathetic to others?
    • Empathy Computing Requirements  Basic Requirements  Making the technology transparent  Empathy Definition  Seeing with the eyes of another  Hearing with the ears of another  Feeling with the heart of another
    • Augmented Reality 1977 2008
    • Using AR for Empathy  Augmented Reality can:  Remove technology barriers  Enhance communication  Change perspective  Share experiences  Enhance interaction in real world
    • Communication Seams Communication Space Task Space   Technology introduces artificial seams in the communication (e.g. separate real and virtual spaces)
    • Removing Barriers: Shared Space   Face to Face interaction, Tangible AR metaphor -  ~3,000 users (Siggraph 1999)   Easy collaboration with strangers   Users acted same as if handling real objects Billinghurst, M., Poupyrev, I., Kato, H., & May, R. (2000). Mixing realities in shared space: An augmented reality interface for collaborative computing. In Multimedia and Expo, 2000. ICME 2000. 2000 IEEE International Conference on (Vol. 3, pp. 1641-1644).
    • Enhancing Face to Face Communication   AR Pad   Handheld AR device   AR shows viewpoints   Users collaborate easier   Show communication cues Virtual Viewpoint Visualization Mogilev, D., Kiyokawa, K., Billinghurst, M., & Pair, J. (2002, April). AR Pad: An interface for face-to-face AR collaboration. In CHI'02 extended abstracts on Human factors in computing systems (pp. 654-655).
    • Changing Perspective   CamNet (1992)   British Telecom   Wearable Teleconferencing   audio, video   Remote collaboration   Sends task space video   Similar CMU study (1996)   cut performance time in half
    • WACL: Remote Expert Collaboration   Wearable Camera/Laser Pointer   Independent pointer control   Remote panorama view
    • WACL: Remote Expert Collaboration   Remote Expert View   Panorama viewing, annotation, image capture Kurata, T., Sakata, N., Kourogi, M., Kuzuoka, H., & Billinghurst, M. (2004, October). Remote collaboration using a shoulder-worn active camera/laser. In Wearable Computers, 2004. ISWC 2004. Eighth International Symposium on (Vol. 1, pp. 62-69).
    • View Through Google Glass Always available peripheral information display Combining computing, communications and content capture
    • Ego-Vision Collaboration   Google Glass   camera + processing + display + connectivity
    • Current Collaboration on Glass   First person remote conferencing/hangouts   Limitations   Single POV, no spatial cues, no annotations, etc
    • Sharing Space: Social Panoramas   Capture and share social spaces in real time   Enable remote people to feel like they’re with you
    • Key Technology   Google Glass   Capture live image panorama (compass + camera)   Capture spatial audio, live video   Remote device (desktop, tablet)   Immersive viewing, live annotation
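A minimal sketch of the capture side, assuming a fixed panorama resolution and an approximate horizontal field of view for the head-worn camera: each compass-tagged frame is resized and pasted into a cylindrical panorama buffer, which is then streamed to the remote viewer.

```python
# Place compass-tagged camera frames into a shared cylindrical panorama (sketch).
import cv2
import numpy as np

PANO_W, PANO_H = 3600, 480       # 0.1 degree per column (assumed resolution)
CAM_FOV_DEG = 54.0               # assumed horizontal field of view of the camera
panorama = np.zeros((PANO_H, PANO_W, 3), dtype=np.uint8)

def place_frame(frame, heading_deg):
    """Paste one frame into the panorama, centred on its compass heading."""
    span = int(PANO_W * CAM_FOV_DEG / 360.0)            # columns this frame covers
    patch = cv2.resize(frame, (span, PANO_H))
    start = int((heading_deg % 360.0) / 360.0 * PANO_W) - span // 2
    for i in range(span):                                # wrap around the 0/360 seam
        panorama[:, (start + i) % PANO_W] = patch[:, i]

# Capture loop (per frame): place_frame(camera_frame, compass_heading)
# The panorama buffer, plus live audio/video, is then sent to the remote tablet or desktop.
```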
    • Capturing Space: Real World Capture   Hands free AR   Portable scene capture (color + depth)   Projector/Kinect combo, Remote controlled pan/tilt   Remote expert annotation interface
    • Remote Expert View
    • Capturing Behaviours   3 Gear Systems   Kinect/Primesense Sensor   Two hand tracking   http://www.threegear.com
    • Skeleton Interaction + AR   HMD AR View   Viewpoint tracking   Two hand input   Skeleton interaction, occlusion
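As a simplified illustration of the occlusion step, the sketch below hides a virtual object whenever a tracked hand joint is closer to the camera than the object. Real systems mask per pixel using the depth image; the Joint and VirtualObject types here are assumptions for illustration.

```python
# Per-object hand occlusion from tracked skeleton joints (simplified sketch).
from dataclasses import dataclass

@dataclass
class Joint:
    x: float          # metres, camera coordinates
    y: float
    z: float          # depth from the camera

@dataclass
class VirtualObject:
    name: str
    z: float          # depth at which the object is rendered
    visible: bool = True

def apply_hand_occlusion(hand_joints, obj, margin=0.02):
    """Hide the object if any hand joint sits in front of it (small margin in metres)."""
    obj.visible = not any(j.z + margin < obj.z for j in hand_joints)

# Example: a fingertip at 0.45 m occludes a virtual cube rendered at 0.60 m.
cube = VirtualObject("cube", z=0.60)
apply_hand_occlusion([Joint(0.0, 0.1, 0.45)], cube)
print(cube.visible)   # False -> the renderer skips (or masks) the cube this frame
```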
    • Ghostman   Use AR to capture and overlay your actions into a remote person’s space   E.g. remote therapy
    • Looking to the Future What’s Next?
    • Brain to Brain Control   Rajesh Rao, University of Washington   First Brain to Brain control
    • System Architecture
    • Scaling Up   Seeing actions of millions of users in the world   Augmentation on city/country level
    • AR + Smart Sensors + Social Networks   Track population at city scale (mobile networks)   Match population data to external sensor data   medical, environmental, etc   Mine data to improve social services
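A small sketch of the join-then-aggregate pattern this implies, assuming hourly people-per-cell counts from the mobile network and an air-quality feed keyed to the same cells; the file names and column names are hypothetical.

```python
# Fuse mobile-network population estimates with an external sensor feed (sketch).
import pandas as pd

# Hourly people-per-cell estimates from the network operator (assumed schema).
population = pd.read_csv("cell_population.csv", parse_dates=["hour"])   # cell_id, hour, people
# Hourly air-quality readings mapped to the same cells (assumed schema).
sensors = pd.read_csv("air_quality.csv", parse_dates=["hour"])          # cell_id, hour, pm25

merged = population.merge(sensors, on=["cell_id", "hour"])

# Estimate exposure (people x pollutant level) per cell and rank the hotspots,
# so health or social services can be targeted where the combined burden is highest.
merged["exposure"] = merged["people"] * merged["pm25"]
hotspots = (merged.groupby("cell_id")["exposure"]
                  .mean()
                  .sort_values(ascending=False)
                  .head(10))
print(hotspots)
```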
    • Research Challenges   How to convey emotion?   How to measure empathy?   Interface/interaction models?   How to communicate emotion?   Scaling up to city/country scale?
    • Conclusion
    • Harvard Grant Study   $20 million, 75-year study   268 Harvard graduates   456 disadvantaged people   Led by George Vaillant   What makes us happy?   Warmth of relationships throughout life has the greatest positive impact on "life satisfaction".
    • “The seventy-five years and twenty million dollars expended on the Grant Study points to a straightforward five-word conclusion: Happiness is love. Full stop.”   George Vaillant
    • Conclusions   Empathic Computing   Sharing what you see, hear and feel   AR Enables Empathic Experiences   Removing technology   Changing perspective   Sharing space/experience   Many directions for future research
    • More Information •  Mark Billinghurst –  Email: mark.billinghurst@hitlabnz.org –  Twitter: @marknb00 •  Website –  http://www.hitlabnz.org/