TALK TO ME
USING VIRTUAL AVATARS TO IMPROVE
REMOTE COLLABORATION
Mark Billinghurst
May 2022
Which is a better avatar for selling a business?
“Only through
communication
can Human Life
hold meaning.”
Paulo Freire
Philosopher
A wide variety of communication cues are used:
• Audio: Speech, Paralinguistic, Para-verbals, Prosodics, Intonation
• Visual: Gaze, Gesture, Facial Expression, Body Position
• Environmental: Object Manipulation, Writing/Drawing, Spatial Relationship, Object Presence
Face to Face Communication
Face to Face Collaboration
Task Space
Communication Space
Face to Face Communication
Audio Cues
Visual Cues
Environmental Cues
My Workplace in 2020, 2021…
Remote Conferencing
Remote Communication
Audio Cues
Visual Cues
Environmental Cues
Communication Seam
• Task Space / Communication Space
Limitations with Current Technology
•Lack of spatial cues
• Person blends with background
•Poor communication cues
• Limited gaze, gesture, non-verbal communication
•Separation of task/communication space
• Can’t see person and workspace at same time
Star Wars – Hologram (1977)
Connecting at a Distance
• Using AR/VR to connect at a distance
• Restore spatial cues
• Sharing non-verbal cues
• Creating shared spaces
Early Experiments (1994 - 2003)
Greenspace (1994)
AR conferencing (1999)
3D Live (2003)
AR Video Conferencing (2001)
• Bringing conferencing into real world
• Using AR video textures of remote people
• Attaching AR video to real objects
Billinghurst, M., & Kato, H. (2002). Collaborative augmented reality. Communications of the ACM, 45(7), 64-70.
2001
Multi-View AR Conferencing
Billinghurst, M., Cheok, A., Prince, S., & Kato, H. (2002). Real world
teleconferencing. Computer Graphics and Applications, IEEE, 22(6), 11-13.
Holoportation (2016)
• Augmented Reality + 3D capture + high bandwidth
• http://research.microsoft.com/en-us/projects/holoportation/
Collaboration in Augmented Reality
Magic Leap Avatar Chat
Collaboration in Virtual Reality
Meta Workrooms
RecRoom
Microsoft Mesh (2021)
Bringing 3D into 2D Video Conferencing
https://www.youtube.com/watch?v=OKNRMauZjFY
Remote Communication
Key Questions
• Do we need a virtual body?
• What should our virtual body look like?
• What enhancements could we provide to virtual bodies?
• What type of bodies do we need for different collaboration tasks?
• What other cues can we provide to enhance remote collaboration?
First Person Perspective Remote Collaboration
• View from remote user’s perspective
• Wearable Teleconferencing
• audio, video, pointing
• send task space video
• CamNet (1992)
• British Telecom
• Similar CMU study (1996)
• cut performance time in half
AR for Remote Collaboration
• Camera + Processing + AR Display + Connectivity
• First person Ego-Vision Collaboration
AR View Remote Expert View
Adding Gaze Cues - Empathy Glasses
• Combines eye-tracking, display, and facial expression sensing
• Implicit cues – eye gaze, face expression
Pupil Labs + Epson BT-200 + AffectiveWear
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the
34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
Remote Collaboration
• Eye gaze pointer and remote pointing
• Face expression display
• Implicit cues for remote collaboration
Shared Sphere – 360 Video Sharing
Host user shares live 360 video with guest user
Lee, G. A., Teo, T., Kim, S., & Billinghurst, M. (2017). Mixed reality collaboration through sharing a
live panorama. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (pp. 1-4).
Sharing a View
3D Live Scene Capture
• Use cluster of RGBD sensors
• Fuse together 3D point cloud
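The fusion step can be sketched as follows, assuming each RGBD sensor has a known extrinsic pose (rotation R and translation t from calibration); per-sensor clouds are transformed into a common world frame and concatenated. Names and shapes here are illustrative, not from the actual system:

```python
import numpy as np

def fuse_point_clouds(clouds, poses):
    """Fuse per-sensor point clouds into one world-frame cloud.

    clouds: list of (N_i, 3) arrays, points in each sensor's local frame
    poses:  list of (R, t) extrinsics mapping sensor frame -> world frame,
            R a 3x3 rotation matrix, t a 3-vector (from calibration)
    """
    world_points = []
    for points, (R, t) in zip(clouds, poses):
        # Apply rigid transform: p_world = R @ p_local + t
        world_points.append(points @ R.T + t)
    return np.vstack(world_points)

# Two toy "sensors" observing the same scene from different poses
cloud_a = np.array([[0.0, 0.0, 1.0], [0.1, 0.0, 1.0]])
cloud_b = np.array([[0.0, 0.0, 1.0]])
identity = (np.eye(3), np.zeros(3))
shifted = (np.eye(3), np.array([1.0, 0.0, 0.0]))
fused = fuse_point_clouds([cloud_a, cloud_b], [identity, shifted])
print(fused.shape)  # (3, 3)
```

A production pipeline would also need temporal alignment, filtering, and mesh fusion, but the core step is this rigid-transform merge.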
Live 3D Scene Capture
Scene Capture and Sharing
Scene Reconstruction Remote Expert Local Worker
AR View Remote Expert View
3D Mixed Reality Remote Collaboration (2022)
Tian, H., Lee, G. A., Bai, H., & Billinghurst, M. (2023). Using Virtual Replicas to Improve Mixed
Reality Remote Collaboration. IEEE Transactions on Visualization and Computer Graphics.
View Sharing Evolution
• Increased immersion
• Improved scene understanding
• Better collaboration
2D → 360 → 3D
Switching between 360 and 3D views
• 360 video
• High quality visuals
• Poor spatial representation
• 3D reconstruction
• Poor visual quality
• Good spatial representation
Swapping between 360 and 3D views
• Have pre-captured 3D model of real space
• Enable remote user to swap between live 360 video or 3D view
• Represent remote user as avatar
Teo, T., F. Hayati, A., A. Lee, G., Billinghurst, M., & Adcock, M. (2019). A technique for mixed reality remote collaboration using
360 panoramas in 3d reconstructed scenes. In 25th ACM Symposium on Virtual Reality Software and Technology (pp. 1-11).
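The swap can be modelled as a simple view-mode toggle between the live 360 stream and the pre-captured 3D model; a minimal sketch with illustrative names (not the actual system's API):

```python
from dataclasses import dataclass

@dataclass
class SharedSceneView:
    """Remote user's view source: live 360 video or a pre-captured 3D model."""
    mode: str = "360"  # "360" (live video) or "3d" (reconstructed model)

    def toggle(self):
        # Swap between the two representations on user request
        self.mode = "3d" if self.mode == "360" else "360"
        return self.mode

    def render_source(self):
        if self.mode == "360":
            return "live-360-stream"    # high visual quality, fixed viewpoint
        return "precaptured-3d-model"   # free viewpoint, lower visual quality

view = SharedSceneView()
print(view.render_source())  # live-360-stream
view.toggle()
print(view.render_source())  # precaptured-3d-model
```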
• Using AR/VR to share communication cues
• Gaze, gesture, head pose, body position
• Sharing same environment
• Virtual copy of real world
• Collaboration between AR/VR
• VR user appears in AR user’s space
Piumsomboon, T., Dey, A., Ens, B., Lee, G., & Billinghurst, M. (2019). The effects of sharing awareness cues
in collaborative mixed reality. Frontiers in Robotics and AI, 6, 5.
Sharing: Virtual Communication Cues (2019)
Sharing Virtual Communication Cues
• Collaboration between AR and VR
• Gaze Visualization Conditions
• Baseline, FoV, Head-gaze, Eye-gaze
Results
• Predictions
• Eye/Head pointing better than no cues
• Eye/head pointing could reduce need for pointing
• Results
• No difference in task completion time
• Head-gaze/eye-gaze gave a greater mutual gaze rate
• Head-gaze had greater ease of use than baseline
• All cues provide higher co-presence than baseline
• Pointing gestures reduced in cue conditions
• But
• No difference between head-gaze and eye-gaze
Multi-Scale Collaboration
• Changing the user’s virtual body scale
On the Shoulder of a Giant
Piumsomboon, T., Lee, G. A., Irlitti, A., Ens, B., Thomas, B. H., & Billinghurst, M. (2019, May). On the shoulder
of the giant: A multi-scale mixed reality collaboration with 360 video sharing and tangible interaction.
In Proceedings of the 2019 CHI conference on human factors in computing systems (pp. 1-17).
Role of Spatial Cues (2020)
• What is impact of spatial audio/visual cues
over large scale AR/VR collaboration?
Yang, J., Sasikumar, P., Bai, H., Barde, A., Sörös, G., & Billinghurst, M. (2020). The effects of spatial auditory
and visual cues on mixed reality remote collaboration. Journal on Multimodal User Interfaces, 14(4), 337-352.
Experiment Environment
• Local AR space and remote VR space (90 m²)
• Task: finding Lego blocks (2 cm³)
Project Set-up
• Spatialized auditory beacon emitted from the target object
• Local AR worker: AR HMD + controller, limited field of view
• Remote VR expert: VR HMD, teleports within the captured mesh
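The spatialized auditory beacon can be approximated with distance attenuation plus constant-power stereo panning from the angle between the listener's facing direction and the beacon; a simplified sketch, not the actual system's audio pipeline:

```python
import math

def beacon_gains(listener_pos, listener_yaw, source_pos, ref_dist=1.0):
    """Return (left_gain, right_gain) for a simple 2D spatialized beacon.

    listener_yaw: facing direction in radians (0 = +x axis).
    Uses inverse-distance attenuation and constant-power panning.
    """
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    dist = max(math.hypot(dx, dy), 1e-6)
    attenuation = min(1.0, ref_dist / dist)   # quieter when farther away
    # Angle of the source relative to where the listener is facing
    rel = math.atan2(dy, dx) - listener_yaw
    # Pan in [-1 (full left), +1 (full right)]; rel > 0 means source is left
    pan = -math.sin(rel)
    theta = (pan + 1.0) * math.pi / 4.0
    left = math.cos(theta) * attenuation
    right = math.sin(theta) * attenuation
    return left, right

# Source directly ahead at the reference distance: equal gains in both ears
l, r = beacon_gains((0.0, 0.0), 0.0, (1.0, 0.0))
print(round(l, 3), round(r, 3))  # 0.707 0.707
```

A real implementation would use an HRTF-based spatializer, but this captures the directional and distance cues the study relied on.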
User Study Results
• Two studies conducted
• (1) Audio only cues (spatial vs. non-spatial), (2) Different types of visual cues (head, hands)
• Key findings
• Spatial audio significantly increased Social Presence
• Users strongly preferred head and gesture cues with spatial audio over non-spatial voice with a spatial beacon
• Combining visual cues with the spatial auditory cues significantly improved task performance
• Adding the remote expert's head frustum to the spatial auditory cues provided significantly better social presence, spatial awareness, and system usability
Spatial Presence Scores
Task Completion Times
Sharing Gesture Cues
• What type of gesture cues should be shared in AR/VR collaboration?
Kim, S., Lee, G., Huang, W., Kim, H., Woo, W., & Billinghurst, M. (2019). Evaluating the combination of visual communication cues for HMD-
based mixed reality remote collaboration. In Proceedings of the 2019 CHI conference on human factors in computing systems (pp. 1-13).
Augmented Reality
Virtual Reality
Communication Cues
• Four different cues used
• (1) Hands Only (HO), (2) Hands + Pointer (HP)
• (3) Hands + Sketch (HS), (4) Hands + Pointer + Sketch (HPS)
• Three experimental tasks
• Lego assembly, Tangram puzzle, Origami folding
Key Results
• Task completion time
• Sketch cues enabled users to complete tasks significantly faster (task dependent)
• Adding pointing didn’t improve task completion time
• Co-Presence
• Adding pointing and sketch cues didn’t improve feeling of co-presence
• User Preference
• Users overwhelmingly preferred the Hands + Pointer + Sketch condition; Hands Only ranked last
Task completion time
“Sketch allowed drawings for accuracy and hands for general use”; “sketch is pretty useful for describing actions that were difficult to express verbally and could express more details”.
Changing Gaze Cues
How can sharing gaze behavioural cues improve remote collaboration in a Mixed Reality environment?
➔ Developed eyemR-Vis, a 360 panoramic Mixed Reality remote collaboration system
➔ Showed gaze behavioural cues as bi-directional spatial virtual visualisations shared
between a local host (AR) and a remote collaborator (VR).
Jing, A., May, K., Lee, G., & Billinghurst, M. (2021). Eye See What You See: Exploring How Bi-Directional Augmented
Reality Gaze Visualisation Influences Co-Located Symmetric Collaboration. Frontiers in Virtual Reality, 2, 79.
System Design
➔ 360 Panoramic Camera + Mixed Reality View
➔ Combination of HoloLens2 + Vive Pro Eye
➔ 4 gaze behavioural visualisations:
browse, focus, mutual, fixated circle
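These visualisations correspond to gaze states that can be classified from the dispersion and dwell time of recent gaze samples; a hedged sketch of one plausible classifier (thresholds and the I-DT-style dispersion measure are illustrative, not taken from the paper):

```python
import math

def classify_gaze(samples, partner_point=None,
                  disp_thresh=2.0, fix_time=1.0, mutual_thresh=3.0):
    """Classify gaze behaviour from timestamped 2D gaze points.

    samples: list of (t, x, y) tuples, a recent window of one user's gaze.
    partner_point: the other user's current gaze point, if known.
    Returns 'browse', 'focus', 'fixated', or 'mutual'.
    Thresholds (visual-angle units, seconds) are illustrative only.
    """
    xs = [x for _, x, _ in samples]
    ys = [y for _, _, y in samples]
    # Dispersion: bounding-box extent of the gaze window (I-DT style)
    dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
    duration = samples[-1][0] - samples[0][0]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)

    if partner_point is not None:
        # Mutual gaze: both users attending to (roughly) the same point
        if math.hypot(cx - partner_point[0], cy - partner_point[1]) < mutual_thresh:
            return "mutual"
    if dispersion > disp_thresh:
        return "browse"   # gaze moving around the scene
    return "fixated" if duration >= fix_time else "focus"

window = [(0.0, 5.0, 5.0), (0.5, 5.1, 5.0), (1.2, 5.0, 5.1)]
print(classify_gaze(window))                            # fixated
print(classify_gaze(window, partner_point=(5.2, 5.1)))  # mutual
```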
System Design
Visualisation examples: Browse, Focus, Mutual, Fixated, Circle-map
System Demo
https://www.youtube.com/watch?v=1liRqdghGN8
Experiment
• Participants
• 12 pairs of people (6 women), 60% AR/VR experience, 50% with gaze experience
• Conditions
• 2×2 design plus a no-gaze (NG) baseline
• Gaze behaviour (WB) / no gaze behaviour (NB)
• Uni-directional (U) / bi-directional (B) gaze
• Tasks
• Finding abstract symbols, guiding other person to the symbols
• Measures
• Quantitative measures: gaze behaviour metrics, completion time, questionnaires
• Qualitative measures: open-ended questions and behaviour analysis
Research Questions
• (1) Gaze vs no Gaze: compared to no gaze visualisation, gaze cues
would encourage joint focus by providing explicit visual feedback
• (2) Uni-directional gaze style vs bi-directional style: Bidirectional style is
more balanced visually, leading to less confusion of gaze identity
• (3) Gaze behaviours vs no behaviour: Integrating gaze behaviour
visualisations lowers communication cognitive load
Results
• RQ1: Behaviour visualisations stimulate frequent joint attention
• RQ2: Bidirectional gaze ensured gaze information was accurately delivered
• RQ3: Gaze made it easier to coordinate on target object location, and made
communication easier and more effective
Sharing: Separating Cues from Body
• What happens when you can’t see your colleague/agent?
Piumsomboon, T., Lee, G. A., Hart, J. D., Ens, B., Lindeman, R. W., Thomas, B. H., & Billinghurst, M.
(2018, April). Mini-me: An adaptive avatar for mixed reality remote collaboration. In Proceedings of the
2018 CHI conference on human factors in computing systems (pp. 1-13).
Collaborating vs. collaborator out of view
Mini-Me Communication Cues in MR
• When the user loses sight of their collaborator, a Mini-Me avatar appears
• Miniature avatar in the real world
• Mini-Me points to shared objects and shows communication cues
• Redirected gaze, gestures
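The trigger can be sketched as a simple view test: when the collaborator falls outside the local user's field of view, show the Mini-Me instead. A minimal 2D version, with illustrative angles and names (the actual system uses full 3D frustum checks):

```python
import math

def in_field_of_view(user_pos, user_yaw, target_pos, fov_deg=60.0):
    """True if target_pos lies within the user's horizontal FOV."""
    dx = target_pos[0] - user_pos[0]
    dy = target_pos[1] - user_pos[1]
    angle_to_target = math.atan2(dy, dx)
    # Smallest signed angle between facing direction and target
    diff = (angle_to_target - user_yaw + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= math.radians(fov_deg) / 2.0

def show_mini_me(user_pos, user_yaw, collaborator_pos):
    """Mini-Me appears only when the real collaborator is out of view."""
    return not in_field_of_view(user_pos, user_yaw, collaborator_pos)

# Collaborator directly ahead: no Mini-Me needed
print(show_mini_me((0, 0), 0.0, (2, 0)))   # False
# Collaborator behind the user: Mini-Me appears
print(show_mini_me((0, 0), 0.0, (-2, 0)))  # True
```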
Results from User Evaluation (16 subjects)
• Collaboration between user in AR, expert in VR
• HoloLens, HTC Vive
• Two tasks
• Asymmetric, symmetric collaboration
• Significant performance improvement
• 20% faster with Mini-Me
• Social Presence
• Higher sense of Presence
• Users preferred Mini-Me
• Felt the task was easier to complete
• 60-75% preference
“I feel like I am talking
to my partner”
“The ability to see the small
avatar … enhanced the
speed of solving the task”
Avatar Representation
• Pilot study with recorded avatar
• Motorcycle engine assembly
• Avatar types
• (A1) Annotation: Computer-generated lines drawn in 3D space.
• (A2) Hand Gesture: Real hand gestures captured using stereoscopic cameras
• (A3) Avatar: Virtual avatar reconstructed using inverse kinematics.
• (A4) Volumetric Playback: Using three Kinect cameras, the movements of an expert
are captured and played back as a virtual avatar via a see-through headset.
Avatar Representation
Remote pointer Realistic hands
Representing Remote Users
Virtual Avatar Volumetric Avatar
Experiment Design (30 participants)
Performing motorbike assembly task under guidance
- Easy, Medium, Hard task
Hypotheses
- H1. Volumetric playback would have a better sense of social presence in a
remote training system.
- H2. Volumetric playback would enable faster completion of tasks in a remote
training system
Measures
• NMM Social Presence Questionnaire, NASA TLX, SUS
Results
• Hands and Annotation conditions were significantly faster than the avatar
• Volumetric playback induced the highest sense of co-presence
• Users preferred Volumetric or Annotation interface
Performance Time
Average Ranking
Results
Volumetric instruction cues increased co-presence and system usability while reducing mental workload and frustration.
Mental Load (NASA TLX)
System Usability Scale
User Feedback
• Annotations easy to understand (faster performance)
• “Annotation is very clear and easy to spot in a 3D environment.”
• Volumetric creates high degree of social presence (working with a person)
• “Seeing a real person demonstrate the task feels like being next to a person.”
• Recommendations
• Use Volumetric Playback to improve Social Presence and system usability
• Using a full-bodied avatar representation in a remote training system is not
recommended unless it is well animated
• Using simple annotation can have significant improvement in performance if
social presence is not of importance.
Avatar Representation for Social Presence
• What should avatars look
like for social situations?
• Cartoon vs. realistic?
• Partial or full body?
• Impact on Social Presence?
Yoon, B., Kim, H. I., Lee, G. A., Billinghurst, M., & Woo, W. (2019, March). The effect of
avatar appearance on social presence in an augmented reality remote collaboration. In
2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 547-556). IEEE.
Avatar Representations
• Cartoon vs. Realistic, Part Body vs. Whole Body
• Realistic Head & Hands (RHH), Realistic Upper Body (RUB), Realistic Whole Body (RWB),
• Cartoon Head & Hands (CHH), Cartoon Upper Body (CUB), Cartoon Whole Body (CWB).
Experiment
• Within-subjects design (24 subjects)
• 6 conditions: RHH, RUB, RWB, CHH, CUB, CWB
• AR/VR interface
• Subject in AR interface, actor in VR
• Experiment measures
• Social Presence
• Networked Mind Measure of Social Presence survey
• Bailenson’s Social Presence survey
• Post Experiment Interview
• Tasks
• Study 1: Crossword puzzle (Face to Face discussion)
• Study 2: Furniture placement (virtual object placement)
AR user
VR user
Hypotheses
H1. Body Part Visibility will affect the user’s Social Presence in AR.
H2. The Whole-Body virtual avatars will have the highest Social
Presence among the three levels of visibility.
H3. Head & Hands virtual avatars will have the lowest Social
Presence among the three levels of visibility.
H4. The Character Style will affect the user’s Social Presence.
H5. Realistic avatars will have a higher Social Presence than
Cartoon Style avatars in an AR remote collaboration.
Results
• Aggregated Presence Scores
• 1: strongly disagree - 7: strongly agree
User Comments
• ‘Whole Body’ Avatar Expression to Users
• “Presence was high with full body parts, because I could notice joints’
movement, behaviour, and reaction.”
• “I didn’t get the avatar’s intention of the movement, because it had only
head and hands.”
• ‘Upper Body’ vs. ‘Whole Body’ Avatar
• “I preferred the one with whole body, but it didn’t really matter because I didn’t look at the legs much.”
• “I noticed head and hands model immediately, but I didn’t feel the
difference whether the avatar had a lower body or not.”
• ‘Realistic’ vs ‘Cartoon’ style Avatars
• "The character seemed more like a game than furniture placement in real. I
felt that realistic whole body was collaborating with me more.”
Hypotheses Outcome
H1. Body Part Visibility will affect the user’s Social Presence in AR.
H2. The Whole-Body virtual avatars will have the highest Social
Presence among the three levels of visibility.
H3. Head & Hands virtual avatars will have the lowest Social
Presence among the three levels of visibility.
H4. The Character Style will affect the user’s Social Presence.
H5. Realistic avatars will have a higher Social Presence than
Cartoon Style avatars in an AR remote collaboration.
Key Lessons Learned
• Avatar Body Part visibility should be considered first when designing for AR remote
collaboration since it significantly affects Social Presence
• Body Part Visibility
• Whole Body & Upper Body: Whole body is preferred, but upper body is okay in some cases
• Head & Hands: Should be avoided
• Character Style
• No difference in Social Presence between Realistic and Cartoon avatars
• However, the majority of participants had a positive response towards the Realistic avatar
• Cartoon character for fun, Realistic avatar for professional meetings
Technology Trends
• Advanced displays
• Wide FOV, high resolution
• Real time space capture
• 3D scanning, stitching, segmentation
• Natural gesture interaction
• Hand tracking, pose recognition
• Robust eye-tracking
• Gaze points, focus depth
• Emotion sensing/sharing
• Physiological sensing, emotion mapping
Sensor Enhanced HMDs
Eye tracking, heart rate,
pupillometry, and face camera
HP Omnicept Project Galea
EEG, EMG, EDA, PPG, EOG, eye gaze, etc.
Enhancing Emotion
• Using physiological and contextual cues to enhance emotion representation
• Show user’s real emotion, make it easier to understand user emotion, etc.
Real user → physiological cues (arousal/valence: positive/negative) + context cues → avatar
System Design
Early Results
Face tracking → positive affect → avatar outcome
Conversational agent
Intelligent Virtual Agents (IVAs)
Embodied in 2D screen vs. embodied in 3D space
Photorealistic Characters
• Synthesia
• AI + ML to create videos
• Speech + image synthesis
• Supports >60 languages
• Personalized characters
https://www.youtube.com/watch?v=vifHh4WjEFE
Empathic Mixed Reality Agents
Intelligent Digital Humans
• Soul Machines
• AI digital brain
• Expressive digital humans
• Autonomous animation
• Able to see and hear
• Learn from users
Towards Empathic Social Agents
• Goal: Using agents to create empathy between people
• Combine
• Scene capture
• Shared tele-presence
• Trust/emotion recognition
• Enhanced communication cues
• Separate cues from representation
• Facilitating brain synchronization
Trends..
Over time, increasing human touch: voice menus → chatbots → photo-realistic characters → digital humans → empathic agents
Summary
• Being able to share communication cues is vital
• Focus on the cues needed for task
• Need for body is task dependent
• Physical task - pointer/simple cues okay
• Social task – avatar/volumetric avatar better
• Simple representation may be okay for some tasks
• Legs optional, arms essential
• Using additional communication cues can be beneficial
• Gaze lines, view frustum, body copy, etc.
Remote Communication
MiniMe, virtual cues, enhanced emotion, brain synchronization, trust recognition, scene capture, AI
Opportunities for Research
• Adding sensors
• Physiological cues, sharing emotional/mental state
• Behavioral/sensor synchronization
• Novel communication cues
• Scene sharing, gaze, gesture cues
• AI + Virtual Avatars
• Realistic behaviors, visual representation, cultural translation, etc.
• Autonomous Mixed Reality Agents
• Awareness of real world, real user behavior
www.empathiccomputing.org
@marknb00
mark.billinghurst@auckland.ac.nz
Mark Billinghurst (University of South Australia ): Augmented Teleportation
 
3D Content Development and AR/VR Authoring
3D Content Development and AR/VR Authoring3D Content Development and AR/VR Authoring
3D Content Development and AR/VR Authoring
 

More from Mark Billinghurst

The Metaverse: Are We There Yet?
The  Metaverse:    Are   We  There  Yet?The  Metaverse:    Are   We  There  Yet?
The Metaverse: Are We There Yet?
Mark Billinghurst
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
Mark Billinghurst
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
Mark Billinghurst
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented Reality
Mark Billinghurst
 
Metaverse Learning
Metaverse LearningMetaverse Learning
Metaverse Learning
Mark Billinghurst
 
Comp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface DesignComp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface Design
Mark Billinghurst
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise AR
Mark Billinghurst
 

More from Mark Billinghurst (7)

The Metaverse: Are We There Yet?
The  Metaverse:    Are   We  There  Yet?The  Metaverse:    Are   We  There  Yet?
The Metaverse: Are We There Yet?
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented Reality
 
Metaverse Learning
Metaverse LearningMetaverse Learning
Metaverse Learning
 
Comp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface DesignComp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface Design
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise AR
 

Recently uploaded

From Siloed Products to Connected Ecosystem: Building a Sustainable and Scala...
From Siloed Products to Connected Ecosystem: Building a Sustainable and Scala...From Siloed Products to Connected Ecosystem: Building a Sustainable and Scala...
From Siloed Products to Connected Ecosystem: Building a Sustainable and Scala...
Product School
 
Assuring Contact Center Experiences for Your Customers With ThousandEyes
Assuring Contact Center Experiences for Your Customers With ThousandEyesAssuring Contact Center Experiences for Your Customers With ThousandEyes
Assuring Contact Center Experiences for Your Customers With ThousandEyes
ThousandEyes
 
Knowledge engineering: from people to machines and back
Knowledge engineering: from people to machines and backKnowledge engineering: from people to machines and back
Knowledge engineering: from people to machines and back
Elena Simperl
 
UiPath Test Automation using UiPath Test Suite series, part 4
UiPath Test Automation using UiPath Test Suite series, part 4UiPath Test Automation using UiPath Test Suite series, part 4
UiPath Test Automation using UiPath Test Suite series, part 4
DianaGray10
 
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
Jeffrey Haguewood
 
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
Product School
 
Epistemic Interaction - tuning interfaces to provide information for AI support
Epistemic Interaction - tuning interfaces to provide information for AI supportEpistemic Interaction - tuning interfaces to provide information for AI support
Epistemic Interaction - tuning interfaces to provide information for AI support
Alan Dix
 
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
DanBrown980551
 
Leading Change strategies and insights for effective change management pdf 1.pdf
Leading Change strategies and insights for effective change management pdf 1.pdfLeading Change strategies and insights for effective change management pdf 1.pdf
Leading Change strategies and insights for effective change management pdf 1.pdf
OnBoard
 
Securing your Kubernetes cluster_ a step-by-step guide to success !
Securing your Kubernetes cluster_ a step-by-step guide to success !Securing your Kubernetes cluster_ a step-by-step guide to success !
Securing your Kubernetes cluster_ a step-by-step guide to success !
KatiaHIMEUR1
 
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
UiPathCommunity
 
When stars align: studies in data quality, knowledge graphs, and machine lear...
When stars align: studies in data quality, knowledge graphs, and machine lear...When stars align: studies in data quality, knowledge graphs, and machine lear...
When stars align: studies in data quality, knowledge graphs, and machine lear...
Elena Simperl
 
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
Albert Hoitingh
 
Bits & Pixels using AI for Good.........
Bits & Pixels using AI for Good.........Bits & Pixels using AI for Good.........
Bits & Pixels using AI for Good.........
Alison B. Lowndes
 
The Future of Platform Engineering
The Future of Platform EngineeringThe Future of Platform Engineering
The Future of Platform Engineering
Jemma Hussein Allen
 
UiPath Test Automation using UiPath Test Suite series, part 3
UiPath Test Automation using UiPath Test Suite series, part 3UiPath Test Automation using UiPath Test Suite series, part 3
UiPath Test Automation using UiPath Test Suite series, part 3
DianaGray10
 
JMeter webinar - integration with InfluxDB and Grafana
JMeter webinar - integration with InfluxDB and GrafanaJMeter webinar - integration with InfluxDB and Grafana
JMeter webinar - integration with InfluxDB and Grafana
RTTS
 
Elevating Tactical DDD Patterns Through Object Calisthenics
Elevating Tactical DDD Patterns Through Object CalisthenicsElevating Tactical DDD Patterns Through Object Calisthenics
Elevating Tactical DDD Patterns Through Object Calisthenics
Dorra BARTAGUIZ
 
State of ICS and IoT Cyber Threat Landscape Report 2024 preview
State of ICS and IoT Cyber Threat Landscape Report 2024 previewState of ICS and IoT Cyber Threat Landscape Report 2024 preview
State of ICS and IoT Cyber Threat Landscape Report 2024 preview
Prayukth K V
 
Monitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR EventsMonitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR Events
Ana-Maria Mihalceanu
 

Recently uploaded (20)

From Siloed Products to Connected Ecosystem: Building a Sustainable and Scala...
From Siloed Products to Connected Ecosystem: Building a Sustainable and Scala...From Siloed Products to Connected Ecosystem: Building a Sustainable and Scala...
From Siloed Products to Connected Ecosystem: Building a Sustainable and Scala...
 
Assuring Contact Center Experiences for Your Customers With ThousandEyes
Assuring Contact Center Experiences for Your Customers With ThousandEyesAssuring Contact Center Experiences for Your Customers With ThousandEyes
Assuring Contact Center Experiences for Your Customers With ThousandEyes
 
Knowledge engineering: from people to machines and back
Knowledge engineering: from people to machines and backKnowledge engineering: from people to machines and back
Knowledge engineering: from people to machines and back
 
UiPath Test Automation using UiPath Test Suite series, part 4
UiPath Test Automation using UiPath Test Suite series, part 4UiPath Test Automation using UiPath Test Suite series, part 4
UiPath Test Automation using UiPath Test Suite series, part 4
 
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
 
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
 
Epistemic Interaction - tuning interfaces to provide information for AI support
Epistemic Interaction - tuning interfaces to provide information for AI supportEpistemic Interaction - tuning interfaces to provide information for AI support
Epistemic Interaction - tuning interfaces to provide information for AI support
 
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
 
Leading Change strategies and insights for effective change management pdf 1.pdf
Leading Change strategies and insights for effective change management pdf 1.pdfLeading Change strategies and insights for effective change management pdf 1.pdf
Leading Change strategies and insights for effective change management pdf 1.pdf
 
Securing your Kubernetes cluster_ a step-by-step guide to success !
Securing your Kubernetes cluster_ a step-by-step guide to success !Securing your Kubernetes cluster_ a step-by-step guide to success !
Securing your Kubernetes cluster_ a step-by-step guide to success !
 
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
 
When stars align: studies in data quality, knowledge graphs, and machine lear...
When stars align: studies in data quality, knowledge graphs, and machine lear...When stars align: studies in data quality, knowledge graphs, and machine lear...
When stars align: studies in data quality, knowledge graphs, and machine lear...
 
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
 
Bits & Pixels using AI for Good.........
Bits & Pixels using AI for Good.........Bits & Pixels using AI for Good.........
Bits & Pixels using AI for Good.........
 
The Future of Platform Engineering
The Future of Platform EngineeringThe Future of Platform Engineering
The Future of Platform Engineering
 
UiPath Test Automation using UiPath Test Suite series, part 3
UiPath Test Automation using UiPath Test Suite series, part 3UiPath Test Automation using UiPath Test Suite series, part 3
UiPath Test Automation using UiPath Test Suite series, part 3
 
JMeter webinar - integration with InfluxDB and Grafana
JMeter webinar - integration with InfluxDB and GrafanaJMeter webinar - integration with InfluxDB and Grafana
JMeter webinar - integration with InfluxDB and Grafana
 
Elevating Tactical DDD Patterns Through Object Calisthenics
Elevating Tactical DDD Patterns Through Object CalisthenicsElevating Tactical DDD Patterns Through Object Calisthenics
Elevating Tactical DDD Patterns Through Object Calisthenics
 
State of ICS and IoT Cyber Threat Landscape Report 2024 preview
State of ICS and IoT Cyber Threat Landscape Report 2024 previewState of ICS and IoT Cyber Threat Landscape Report 2024 preview
State of ICS and IoT Cyber Threat Landscape Report 2024 preview
 
Monitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR EventsMonitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR Events
 

Talk to Me: Using Virtual Avatars to Improve Remote Collaboration

  • 1. TALK TO ME USING VIRTUAL AVATARS TO IMPROVE REMOTE COLLABORATION Mark Billinghurst May 2022
  • 2. Which is a better avatar for selling a business?
  • 3.
  • 4. “Only through communication can Human Life hold meaning.” Paulo Freire Philosopher
  • 5. Face to Face Communication: a wide variety of communication cues are used. Audio: speech, paralinguistics, para-verbals, prosodics, intonation. Visual: gaze, gesture, facial expression, body position. Environmental: object manipulation, writing/drawing, spatial relationships, object presence.
  • 6. Face to Face Collaboration Task Space Communication Space
  • 7. Face to Face Communication Audio Cues Visual Cues Environmental Cues
  • 8. My Workplace in 2020, 2021…
  • 10. Remote Communication Audio Cues Visual Cues Environmental Cues
  • 11. Communication Seam: Task Space / Communication Space
  • 12. Limitations with Current Technology •Lack of spatial cues • Person blends with background •Poor communication cues • Limited gaze, gesture, non-verbal communication •Separation of task/communication space • Can’t see person and workspace at same time
  • 13. Star Wars – Hologram (1977)
  • 14. Connecting at a Distance • Using AR/VR to connect at a distance • Restore spatial cues • Sharing non-verbal cues • Creating shared spaces
  • 15. Early Experiments (1994 - 2003) Greenspace (1994) AR conferencing (1999) 3D Live (2003)
  • 16. AR Video Conferencing (2001) • Bringing conferencing into the real world • Using AR video textures of remote people • Attaching AR video to real objects Billinghurst, M., & Kato, H. (2002). Collaborative augmented reality. Communications of the ACM, 45(7), 64-70.
  • 17. 2001
  • 18. Multi-View AR Conferencing Billinghurst, M., Cheok, A., Prince, S., & Kato, H. (2002). Real world teleconferencing. Computer Graphics and Applications, IEEE, 22(6), 11-13.
  • 19.
  • 20. Holoportation (2016) • Augmented Reality + 3D capture + high bandwidth • http://research.microsoft.com/en-us/projects/holoportation/
  • 21. Collaboration in Augmented Reality Magic Leap Avatar Chat
  • 22. Collaboration in Virtual Reality Meta Workrooms RecRoom
  • 24. Bringing 3D into 2D Video Conferencing https://www.youtube.com/watch?v=OKNRMauZjFY
  • 26. Key Questions • Do we need a virtual body? • What should our virtual body look like? • What enhancements could we provide to virtual bodies? • What type of bodies do we need for different collaboration tasks? • What other cues can we provide to enhance remote collaboration?
  • 27. First Person Perspective Remote Collaboration • View from remote user’s perspective • Wearable Teleconferencing • audio, video, pointing • send task space video • CamNet (1992) • British Telecom • Similar CMU study (1996) • cut performance time in half
  • 28. AR for Remote Collaboration • Camera + Processing + AR Display + Connectivity • First person Ego-Vision Collaboration
  • 29. AR View Remote Expert View
  • 30. Adding Gaze Cues - Empathy Glasses • Combine together eye-tracking, display, face expression • Implicit cues – eye gaze, face expression + + Pupil Labs Epson BT-200 AffectiveWear Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
  • 31. Remote Collaboration • Eye gaze pointer and remote pointing • Face expression display • Implicit cues for remote collaboration
  • 32.
  • 33. Shared Sphere – 360 Video Sharing Shared Live 360 Video Host User Guest User Lee, G. A., Teo, T., Kim, S., & Billinghurst, M. (2017). Mixed reality collaboration through sharing a live panorama. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (pp. 1-4).
  • 34.
  • 36. 3D Live Scene Capture • Use cluster of RGBD sensors • Fuse together 3D point cloud
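The capture pipeline on this slide — back-projecting each sensor's depth map into 3D and merging the clouds in a shared world frame — can be sketched as follows. This is a minimal illustration, not the system's actual code; the intrinsics, the 4x4 poses, and all function names are assumptions.

```python
# Hypothetical sketch of fusing depth maps from two calibrated RGBD
# sensors into one point cloud. Intrinsics (fx, fy, cx, cy) and the
# 4x4 extrinsic poses below are illustrative values.

def backproject(depth, fx, fy, cx, cy):
    """Turn a dict {(u, v): z} of depth pixels into camera-space 3D points."""
    return [((u - cx) * z / fx, (v - cy) * z / fy, z)
            for (u, v), z in depth.items()]

def transform(points, pose):
    """Apply a 4x4 camera-to-world pose (row-major nested lists)."""
    out = []
    for x, y, z in points:
        out.append(tuple(pose[r][0] * x + pose[r][1] * y +
                         pose[r][2] * z + pose[r][3] for r in range(3)))
    return out

def fuse(views):
    """views: list of (depth_dict, intrinsics, pose), one per sensor."""
    cloud = []
    for depth, (fx, fy, cx, cy), pose in views:
        cloud.extend(transform(backproject(depth, fx, fy, cx, cy), pose))
    return cloud

# Two toy sensors observing the same scene from different poses.
identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
shifted  = [[1, 0, 0, 0.5], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
views = [({(320, 240): 1.0}, (500, 500, 320, 240), identity),
         ({(320, 240): 1.0}, (500, 500, 320, 240), shifted)]
cloud = fuse(views)  # one world-space point per depth pixel per sensor
```

A production system would add per-sensor calibration refinement, temporal filtering, and surface reconstruction on top of the raw fused cloud.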
  • 37. Live 3D Scene Capture
  • 38. Scene Capture and Sharing Scene Reconstruction Remote Expert Local Worker
  • 39. AR View Remote Expert View
  • 40. 3D Mixed Reality Remote Collaboration (2022) Tian, H., Lee, G. A., Bai, H., & Billinghurst, M. (2023). Using Virtual Replicas to Improve Mixed Reality Remote Collaboration. IEEE Transactions on Visualization and Computer Graphics.
  • 41. View Sharing Evolution • Increased immersion • Improved scene understanding • Better collaboration 2D 360 3D
  • 42. Switching between 360 and 3D views • 360 video • High quality visuals • Poor spatial representation • 3D reconstruction • Poor visual quality • Good spatial representation
  • 43. Swapping between 360 and 3D views • Have pre-captured 3D model of real space • Enable remote user to swap between live 360 video or 3D view • Represent remote user as avatar Teo, T., F. Hayati, A., A. Lee, G., Billinghurst, M., & Adcock, M. (2019). A technique for mixed reality remote collaboration using 360 panoramas in 3d reconstructed scenes. In 25th ACM Symposium on Virtual Reality Software and Technology (pp. 1-11).
  • 44.
  • 45. • Using AR/VR to share communication cues • Gaze, gesture, head pose, body position • Sharing same environment • Virtual copy of real world • Collaboration between AR/VR • VR user appears in AR user’s space Piumsomboon, T., Dey, A., Ens, B., Lee, G., & Billinghurst, M. (2019). The effects of sharing awareness cues in collaborative mixed reality. Frontiers in Robotics and AI, 6, 5. Sharing: Virtual Communication Cues (2019)
  • 46. Sharing Virtual Communication Cues • Collaboration between AR and VR • Gaze Visualization Conditions • Baseline, FoV, Head-gaze, Eye-gaze
  • 47.
  • 48. Results • Predictions • Eye/Head pointing better than no cues • Eye/head pointing could reduce need for pointing • Results • No difference in task completion time • Head-gaze/eye-gaze gave a greater mutual gaze rate • Head-gaze gave greater ease of use than baseline • All cues provide higher co-presence than baseline • Pointing gestures reduced in cue conditions • But • No difference between head-gaze and eye-gaze
  • 49. Multi-Scale Collaboration • Changing the user’s virtual body scale
  • 50.
  • 51. On the Shoulder of a Giant Piumsomboon, T., Lee, G. A., Irlitti, A., Ens, B., Thomas, B. H., & Billinghurst, M. (2019, May). On the shoulder of the giant: A multi-scale mixed reality collaboration with 360 video sharing and tangible interaction. In Proceedings of the 2019 CHI conference on human factors in computing systems (pp. 1-17).
  • 52. Role of Spatial Cues (2020) • What is impact of spatial audio/visual cues over large scale AR/VR collaboration? Yang, J., Sasikumar, P., Bai, H., Barde, A., SĂśrĂśs, G., & Billinghurst, M. (2020). The effects of spatial auditory and visual cues on mixed reality remote collaboration. Journal on Multimodal User Interfaces, 14(4), 337-352.
  • 53. Experiment Environment Starting point Local AR space Remote VR space Lego blocks 90 m²
  • 54. Finding Lego Blocks (2 cm³)
  • 55. Project Set-up Spatialized auditory beacon from the object AR HMD Controller Local AR worker’s field of view Remote VR expert teleports in mesh Local Remote VR HMD
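The spatialized auditory beacon described in this set-up can be approximated very simply: distance attenuation plus an interaural level difference derived from the object's azimuth relative to the listener's head. The sketch below is an illustrative assumption (constant-power panning, a 1 m attenuation clamp, and the function name are all mine), not the study's implementation.

```python
import math

# Illustrative rendering of a spatial audio beacon: distance
# attenuation plus a simple left/right level difference from the
# azimuth of the target object relative to the listener's facing.

def beacon_gains(listener_pos, listener_yaw, object_pos):
    """Return (left_gain, right_gain) in [0, 1] for a mono beacon."""
    dx = object_pos[0] - listener_pos[0]
    dz = object_pos[2] - listener_pos[2]
    dist = math.hypot(dx, dz)
    # Azimuth of the object relative to the listener's facing direction.
    azimuth = math.atan2(dx, dz) - listener_yaw
    # Inverse-distance attenuation, clamped so nearby sources stay audible.
    atten = 1.0 / max(dist, 1.0)
    # Constant-power pan: +90 deg -> all right ear, -90 deg -> all left.
    pan = max(-1.0, min(1.0, math.sin(azimuth)))
    left = atten * math.cos((pan + 1.0) * math.pi / 4.0)
    right = atten * math.sin((pan + 1.0) * math.pi / 4.0)
    return left, right

# Object directly to the listener's right, 2 m away:
# the right-ear gain dominates, guiding the local worker to turn.
l, r = beacon_gains((0.0, 0.0, 0.0), 0.0, (2.0, 0.0, 0.0))
```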
  • 56.
  • 57. User Study Results • Two studies conducted • (1) Audio only cues (spatial vs. non-spatial), (2) Different types of visual cues (head, hands) • Key findings • Spatial audio significantly increased Social Presence • Users strongly preferred head and gesture cues with spatial audio, or a non-spatial voice combined with a spatial beacon • Integrating visual cues with the spatial auditory cues significantly improved task performance • Integrating the remote expert’s head frustum with the spatial auditory cues gave significantly better social presence, spatial awareness, and system usability (charts: Spatial Presence scores, task completion times)
  • 58. Sharing Gesture Cues • What type of gesture cues should be shared in AR/VR collaboration? Kim, S., Lee, G., Huang, W., Kim, H., Woo, W., & Billinghurst, M. (2019). Evaluating the combination of visual communication cues for HMD- based mixed reality remote collaboration. In Proceedings of the 2019 CHI conference on human factors in computing systems (pp. 1-13). Augmented Reality Virtual Reality
  • 59. Communication Cues • Four different cues used • (1) Hands Only (HO), (2) Hands + Pointer (HP) • (3) Hands + Sketch (HS), (4) Hands + Pointer + Sketch (HPS) • Three experimental tasks • Lego assembly, Tangram puzzle, Origami folding
  • 60. Key Results • Task completion time • Sketch cues enabled users to complete tasks significantly faster (task dep.) • Adding pointing didn’t improve task completion time • Co-Presence • Adding pointing and sketch cues didn’t improve feeling of co-presence • User Preference • Users overwhelmingly preferred the Hands + Pointer + Sketch condition; Hands Only ranked last Task completion time “sketch allowed drawings for accuracy and hand for general use", “sketch is pretty useful for describing actions that was difficult by verbal words and could express more details".
  • 61. Changing Gaze Cues How sharing gaze behavioural cues can improve remote collaboration in a Mixed Reality environment. ➔ Developed eyemR-Vis, a 360 panoramic Mixed Reality remote collaboration system ➔ Showed gaze behavioural cues as bi-directional spatial virtual visualisations shared between a local host (AR) and a remote collaborator (VR). Jing, A., May, K., Lee, G., & Billinghurst, M. (2021). Eye See What You See: Exploring How Bi-Directional Augmented Reality Gaze Visualisation Influences Co-Located Symmetric Collaboration. Frontiers in Virtual Reality, 2, 79.
  • 62. System Design ➔ 360 Panoramic Camera + Mixed Reality View ➔ Combination of HoloLens 2 + Vive Pro Eye ➔ 4 gaze behavioural visualisations: browse, focus, mutual, fixated circle
  • 65. Experiment • Participants • 12 pairs of people (6 women), 60% AR/VR experience, 50% with gaze experience • Conditions • 2x2 study design, also no gaze (NG) • Gaze Behaviour (WB) /No Gaze Behaviour (NB) • Uni-directional (U) /Bi-directional gaze (B) • Tasks • Finding abstract symbols, guiding other person to the symbols • Measures • Quantitative measures: gaze behaviour metrics, completion time, questionnaires • Qualitative measures: open-ended questions and behaviour analysis
  • 66. Research Questions • (1) Gaze vs no Gaze: compared to no gaze visualisation, gaze cues would encourage joint focus by providing explicit visual feedback • (2) Uni-directional gaze style vs bi-directional style: Bidirectional style is more balanced visually, leading to less confusion of gaze identity • (3) Gaze behaviours vs no behaviour: Integrating gaze behaviour visualisations lowers communication cognitive load
  • 67. Results • RQ1: Behaviour visualisations stimulate frequent joint attention • RQ2: Bidirectional gaze ensured gaze information was accurately delivered • RQ3: Gaze made it easier to coordinate on target object location, and made communication easier and more effective
  • 68. Sharing: Separating Cues from Body • What happens when you can’t see your colleague/agent? Piumsomboon, T., Lee, G. A., Hart, J. D., Ens, B., Lindeman, R. W., Thomas, B. H., & Billinghurst, M. (2018, April). Mini-me: An adaptive avatar for mixed reality remote collaboration. In Proceedings of the 2018 CHI conference on human factors in computing systems (pp. 1-13). Collaborating Collaborator out of View
  • 69. Mini-Me Communication Cues in MR • When lose sight of collaborator a Mini-Me avatar appears • Miniature avatar in real world • Mini-Me points to shared objects, show communication cues • Redirected gaze, gestures
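The trigger condition on this slide — show the miniature avatar only while the full-size collaborator is out of view — reduces to a field-of-view test. A hedged sketch, assuming a simple viewing-cone check; the 60-degree half-angle, positions, and function names are illustrative, not taken from the Mini-Me system:

```python
import math

# When the collaborator's avatar leaves the user's viewing cone,
# a miniature stand-in (Mini-Me) is shown instead.

def in_field_of_view(head_pos, head_dir, target_pos, half_angle_deg=60.0):
    """True if target lies within the cone around the (unit) gaze direction."""
    to_target = [t - h for t, h in zip(target_pos, head_pos)]
    norm = math.sqrt(sum(c * c for c in to_target))
    if norm == 0.0:
        return True
    cos_angle = sum(d * c for d, c in zip(head_dir, to_target)) / norm
    return cos_angle >= math.cos(math.radians(half_angle_deg))

def show_mini_me(head_pos, head_dir, collaborator_pos):
    """Mini-Me appears only while the full-size avatar is out of view."""
    return not in_field_of_view(head_pos, head_dir, collaborator_pos)

# Collaborator behind the user -> Mini-Me shown; in front -> hidden.
behind = show_mini_me((0, 0, 0), (0, 0, 1), (0, 0, -2))
ahead = show_mini_me((0, 0, 0), (0, 0, 1), (0, 0, 2))
```

In practice the system would also redirect the Mini-Me's gaze and gestures toward the shared objects, as the slide describes.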
  • 70.
  • 71. Results from User Evaluation (16 subjects) • Collaboration between user in AR, expert in VR • Hololens, HTC Vive • Two tasks • Asymmetric, symmetric collaboration • Significant performance improvement • 20% faster with Mini-Me • Social Presence • Higher sense of Presence • Users preferred • People felt the task was easier to complete • 60-75% preference “I feel like I am talking to my partner” “The ability to see the small avatar … enhanced the speed of solving the task”
  • 72. Avatar Representation • Pilot study with recorded avatar • Motorcycle engine assembly • Avatar types • (A1) Annotation: Computer-generated lines drawn in 3D space. • (A2) Hand Gesture: Real hand gestures captured using stereoscopic cameras • (A3) Avatar: Virtual avatar reconstructed using inverse kinematics. • (A4) Volumetric Playback: Using three Kinect cameras, the movements of an expert are captured and played back as a virtual avatar via a see-through headset.
  • 74. Representing Remote Users Virtual Avatar Volumetric Avatar
  • 75. Experiment Design (30 participants) Performing motorbike assembly task under guidance - Easy, Medium, Hard task Hypotheses - H1. Volumetric playback would have a better sense of social presence in a remote training system. - H2. Volumetric playback would enable faster completion of tasks in a remote training system Measures • NMM Social Presence Questionnaire, NASA TLX, SUS
  • 76. Results • Hands, Annotation significantly faster than avatar • Volumetric playback induced the highest sense of co-presence • Users preferred Volumetric or Annotation interface Performance Time Average Ranking
  • 77. Results Volumetric instruction cues exhibits an increase in co-presence and system usability while reducing mental workload and frustration. Mental Load (NASA TLX) System Usability Scale
  • 78. User Feedback • Annotations easy to understand (faster performance) • “Annotation is very clear and easy to spot in a 3d environment”. • Volumetric creates high degree of social presence (working with person) • “Seeing a real person demonstrate the task, feels like being next to a person”. • Recommendations • Use Volumetric Playback to improve Social Presence and system usability • Using a full-bodied avatar representation in a remote training system is not recommended unless it is well animated • Using simple annotations can give a significant improvement in performance if social presence is not important.
  • 79. Avatar Representation for Social Presence • What should avatars look like for social situations? • Cartoon vs. realistic? • Partial or full body? • Impact on Social Presence? Yoon, B., Kim, H. I., Lee, G. A., Billinghurst, M., & Woo, W. (2019, March). The effect of avatar appearance on social presence in an augmented reality remote collaboration. In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 547-556). IEEE.
  • 80. Avatar Representations • Cartoon vs. Realistic, Part Body vs. Whole Body • Realistic Head & Hands (RHH), Realistic Upper Body (RUB), Realistic Whole Body (RWB), • Cartoon Head & Hands (CHH), Cartoon Upper Body (CUB), Cartoon Whole Body (CWB).
  • 81. Experiment • Within-subjects design (24 subjects) • 6 conditions: RHH, RUB, RWB, CHH, CUB, CWB • AR/VR interface • Subject in AR interface, actor in VR • Experiment measures • Social Presence • Networked Mind Measure of Social Presence survey • Bailenson’s Social Presence survey • Post Experiment Interview • Tasks • Study 1: Crossword puzzle (Face to Face discussion) • Study 2: Furniture placement (virtual object placement) AR user VR user
  • 82. Hypotheses H1. Body Part Visibility will affect the user’s Social Presence in AR. H2. The Whole-Body virtual avatars will have the highest Social Presence among the three levels of visibility. H3. Head & Hands virtual avatars will have the lowest Social Presence among the three levels of visibility. H4. The Character Style will affect the user’s Social Presence. H5. Realistic avatars will have a higher Social Presence than Cartoon Style avatars in an AR remote collaboration.
  • 83. Results • Aggregated Presence Scores • 1: strongly disagree - 7: strongly agree
  • 84. User Comments • ‘Whole Body’ Avatar Expression to Users • “Presence was high with full body parts, because I could notice joints’ movement, behaviour, and reaction.” • “I didn’t get the avatar’s intention of the movement, because it had only head and hands.” • ‘Upper Body’ vs. ‘Whole Body’ Avatar • “I preferred the one with whole body, but it didn’t really matter because I didn’t look at the legs much.”, • “I noticed head and hands model immediately, but I didn’t feel the difference whether the avatar had a lower body or not.” • ‘Realistic’ vs ‘Cartoon’ style Avatars • "The character seemed more like a game than furniture placement in real. I felt that realistic whole body was collaborating with me more.”
  • 85. Hypotheses Outcome H1. Body Part Visibility will affect the user’s Social Presence in AR. H2. The Whole-Body virtual avatars will have the highest Social Presence among the three levels of visibility. H3. Head & Hands virtual avatars will have the lowest Social Presence among the three levels of visibility. H4. The Character Style will affect the user’s Social Presence. H5. Realistic avatars will have a higher Social Presence than Cartoon Style avatars in an AR remote collaboration.
  • 86. Key Lessons Learned • Avatar Body Part visibility should be considered first when designing for AR remote collaboration since it significantly affects Social Presence • Body Part Visibility • Whole Body & Upper Body: Whole body is preferred, but upper body is okay in some cases • Head & Hands: Should be avoided • Character Style • No difference in Social Presence between Realistic and Cartoon avatars • However, the majority of participants had a positive response towards the Realistic avatar • Cartoon character for fun, Realistic avatar for professional meetings
  • 87. Technology Trends • Advanced displays • Wide FOV, high resolution • Real time space capture • 3D scanning, stitching, segmentation • Natural gesture interaction • Hand tracking, pose recognition • Robust eye-tracking • Gaze points, focus depth • Emotion sensing/sharing • Physiological sensing, emotion mapping
  • 88. Sensor Enhanced HMDs • HP Omnicept: eye tracking, heart rate, pupillometry, and face camera • Project Galea: EEG, EMG, EDA, PPG, EOG, eye gaze, etc.
  • 90. Enhancing Emotion • Using physiological and contextual cues to enhance emotion representation • Show the user’s real emotion, make it easier to understand the user’s emotion, etc. • Pipeline: real user → physiological cues + context cues → arousal/valence (positive/negative) → avatar
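The idea of driving an avatar's expression from physiological and contextual cues can be sketched as a simple arousal/valence mapping. Everything below is a hypothetical illustration: the signal names, normalization ranges, and thresholds are assumptions, not details from the talk.

```python
# Hypothetical sketch: mapping physiological signals to an
# arousal/valence estimate on the circumplex model of affect,
# then to a coarse avatar emotion label. Thresholds and signal
# ranges are illustrative assumptions.

def estimate_arousal(heart_rate_bpm, eda_microsiemens):
    """Arousal in [0, 1]: higher heart rate and skin conductance
    (EDA) are taken as indicators of higher arousal."""
    hr_norm = min(max((heart_rate_bpm - 60) / 60, 0.0), 1.0)
    eda_norm = min(max(eda_microsiemens / 20, 0.0), 1.0)
    return 0.5 * hr_norm + 0.5 * eda_norm

def estimate_valence(smile_score, frown_score):
    """Valence in [-1, 1] from face-camera expression scores."""
    return max(-1.0, min(1.0, smile_score - frown_score))

def avatar_emotion(arousal, valence):
    """Quadrant labels on the arousal/valence plane."""
    if valence >= 0:
        return "excited" if arousal >= 0.5 else "content"
    return "stressed" if arousal >= 0.5 else "sad"

label = avatar_emotion(estimate_arousal(95, 12), estimate_valence(0.8, 0.1))
print(label)  # "excited"
```

A real system would smooth these signals over time and calibrate per user before mapping them onto the avatar.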
  • 92. Early Results • Face tracking → positive affect → avatar outcome
  • 94. Intelligent Virtual Agents (IVAs) • Conversational agents • Embodied on a 2D screen or in 3D space
  • 95. Photorealistic Characters • Synthesia • AI + ML to create videos • Speech + image synthesis • Supports >60 languages • Personalized characters
  • 98. Intelligent Digital Humans • Soul Machines • AI digital brain • Expressive digital humans • Autonomous animation • Able to see and hear • Learn from users
  • 100. Towards Empathic Social Agents • Goal: Using agents to create empathy between people • Combine • Scene capture • Shared tele-presence • Trust/emotion recognition • Enhanced communication cues • Separate cues from representation • Facilitating brain synchronization
  • 102. Summary • Being able to share communication cues is vital • Focus on the cues needed for the task • Need for a body is task dependent • Physical task: pointer/simple cues okay • Social task: avatar/volumetric avatar better • Simple representation may be okay for some tasks • Legs optional, arms essential • Using additional communication cues can be beneficial • Gaze lines, view frustum, body copy, etc.
  • 104. MiniMe • Virtual Cues • Enhanced Emotion • Brain Synchronization • Trust Recognition • Scene Capture • AI
  • 105. Opportunities for Research • Adding sensors • Physiological cues, sharing emotional/mental state • Behavioral/sensor synchronization • Novel communication cues • Scene sharing, gaze, gesture cues • AI + Virtual Avatars • Realistic behaviors, visual representation, cultural translation, etc. • Autonomous Mixed Reality Agents • Awareness of real world, real user behavior