The document discusses using virtual avatars to improve remote collaboration. It provides background on the communication cues used in face-to-face interaction versus remote communication, then reviews early experiments in augmented reality remote conferencing dating back to the 1990s. It outlines key questions around designing effective virtual bodies for collaboration and surveys technologies developed for remote collaboration using augmented, virtual, and mixed reality. Finally, it summarizes several studies evaluating factors such as avatar representation, the sharing of different communication cues, and the effects of spatial audio and visual cues on collaboration tasks.
Empathic Computing: Delivering the Potential of the Metaverse
Mark Billinghurst

Invited guest lecture by Mark Billinghurst, given at the MIT Media Laboratory on November 21st, 2023, as part of Professor Hiroshi Ishii's class on Tangible Media.
5. Face to Face Communication: a wide variety of communication cues are used.
• Audio cues: speech, paralinguistic, para-verbals, prosodics, intonation
• Visual cues: gaze, gesture, face expression, body position
• Environmental cues: object manipulation, writing/drawing, spatial relationship, object presence
6. Face to Face Collaboration
• Task Space
• Communication Space
7. Face to Face Communication
• Audio Cues
• Visual Cues
• Environmental Cues
12. Limitations with Current Technology
• Lack of spatial cues
  • Person blends with background
• Poor communication cues
  • Limited gaze, gesture, non-verbal communication
• Separation of task/communication space
  • Can't see person and workspace at same time
16. AR Video Conferencing (2001)
• Bringing conferencing into the real world
• Using AR video textures of remote people
• Attaching AR video to real objects
Billinghurst, M., & Kato, H. (2002). Collaborative augmented reality. Communications of the ACM, 45(7), 64-70.
18. Multi-View AR Conferencing
Billinghurst, M., Cheok, A., Prince, S., & Kato, H. (2002). Real world teleconferencing. Computer Graphics and Applications, IEEE, 22(6), 11-13.
20. Holoportation (2016)
• Augmented Reality + 3D capture + high bandwidth
• http://research.microsoft.com/en-us/projects/holoportation/
26. Key Questions
• Do we need a virtual body?
• What should our virtual body look like?
• What enhancements could we provide to virtual bodies?
• What type of bodies do we need for different collaboration tasks?
• What other cues can we provide to enhance remote collaboration?
27. First Person Perspective Remote Collaboration
• View from remote user's perspective
• Wearable Teleconferencing
  • audio, video, pointing
  • send task space video
• CamNet (1992)
  • British Telecom
• Similar CMU study (1996)
  • cut performance time in half
28. AR for Remote Collaboration
• Camera + Processing + AR Display + Connectivity
• First person Ego-Vision Collaboration
30. Adding Gaze Cues - Empathy Glasses
• Combine together eye-tracking, display, face expression
• Implicit cues: eye gaze, face expression
• Hardware: Pupil Labs + Epson BT-200 + AffectiveWear
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
31. Remote Collaboration
• Eye gaze pointer and remote pointing
• Face expression display
• Implicit cues for remote collaboration
33. Shared Sphere: 360 Video Sharing
• Host user shares live 360 video with the guest user
Lee, G. A., Teo, T., Kim, S., & Billinghurst, M. (2017). Mixed reality collaboration through sharing a live panorama. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (pp. 1-4).
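Sharing a live 360 panorama means view directions and pointing cues must be mapped into the same equirectangular image space. A minimal sketch of that mapping is below; the coordinate convention (+z forward, +y up) and the function name are illustrative assumptions, not details of the Shared Sphere system.

```python
import math

def direction_to_equirect(x, y, z, width, height):
    """Map a 3D unit view direction to (u, v) pixel coordinates in an
    equirectangular 360 panorama. Convention (an assumption): +z forward,
    +x right, +y up; longitude 0 at +z, increasing to the right."""
    lon = math.atan2(x, z)                    # -pi .. pi
    lat = math.asin(max(-1.0, min(1.0, y)))   # -pi/2 .. pi/2
    u = (lon / (2 * math.pi) + 0.5) * width   # left edge = looking backward
    v = (0.5 - lat / math.pi) * height        # top edge = straight up
    return u, v

# Looking straight ahead lands in the centre of a 4096x2048 panorama.
print(direction_to_equirect(0, 0, 1, 4096, 2048))  # (2048.0, 1024.0)
```

A guest's gaze or pointer direction converted this way can be drawn directly onto the shared panorama frame.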
40. 3D Mixed Reality Remote Collaboration (2022)
Tian, H., Lee, G. A., Bai, H., & Billinghurst, M. (2023). Using Virtual Replicas to Improve Mixed Reality Remote Collaboration. IEEE Transactions on Visualization and Computer Graphics.
41. View Sharing Evolution: 2D → 360 → 3D
• Increased immersion
• Improved scene understanding
• Better collaboration
42. Switching between 360 and 3D views
• 360 video
  • High quality visuals
  • Poor spatial representation
• 3D reconstruction
  • Poor visual quality
  • High quality 3D reconstruction
43. Swapping between 360 and 3D views
• Have pre-captured 3D model of real space
• Enable remote user to swap between live 360 video or 3D view
• Represent remote user as avatar
Teo, T., Hayati, A. F., Lee, G. A., Billinghurst, M., & Adcock, M. (2019). A technique for mixed reality remote collaboration using 360 panoramas in 3D reconstructed scenes. In 25th ACM Symposium on Virtual Reality Software and Technology (pp. 1-11).
45. Sharing: Virtual Communication Cues (2019)
• Using AR/VR to share communication cues
  • Gaze, gesture, head pose, body position
• Sharing same environment
  • Virtual copy of real world
• Collaboration between AR/VR
  • VR user appears in AR user's space
Piumsomboon, T., Dey, A., Ens, B., Lee, G., & Billinghurst, M. (2019). The effects of sharing awareness cues in collaborative mixed reality. Frontiers in Robotics and AI, 6, 5.
46. Sharing Virtual Communication Cues
• Collaboration between AR and VR
• Gaze Visualization Conditions
  • Baseline, FoV, Head-gaze, Eye-gaze
48. Results
• Predictions
  • Eye/head pointing better than no cues
  • Eye/head pointing could reduce need for pointing
• Results
  • No difference in task completion time
  • Head-gaze/eye-gaze gave a greater mutual gaze rate
  • Head-gaze gave greater ease of use than baseline
  • All cues provided higher co-presence than baseline
  • Pointing gestures were reduced in cue conditions
• But
  • No difference between head-gaze and eye-gaze
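A mutual gaze rate such as the one measured above can be derived from aligned per-frame gaze-target logs of the two collaborators. The sketch below assumes a hypothetical log format (one target id per frame, None when gaze hits nothing); it is an illustration, not the study's actual analysis code.

```python
def mutual_gaze_rate(gaze_a, gaze_b):
    """Fraction of logged frames in which both collaborators' gaze landed
    on the same target. gaze_a/gaze_b are equal-length per-frame lists of
    target ids (None = no target); this log format is an assumption."""
    if len(gaze_a) != len(gaze_b) or not gaze_a:
        raise ValueError("gaze logs must be non-empty and aligned")
    mutual = sum(1 for a, b in zip(gaze_a, gaze_b)
                 if a is not None and a == b)
    return mutual / len(gaze_a)

# Both looked at "cube" in frame 0 and "ball" in frame 3: rate = 2/4.
print(mutual_gaze_rate(["cube", "cube", None, "ball"],
                       ["cube", "ball", None, "ball"]))  # 0.5
```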
51. On the Shoulder of a Giant
Piumsomboon, T., Lee, G. A., Irlitti, A., Ens, B., Thomas, B. H., & Billinghurst, M. (2019, May). On the shoulder of the giant: A multi-scale mixed reality collaboration with 360 video sharing and tangible interaction. In Proceedings of the 2019 CHI conference on human factors in computing systems (pp. 1-17).
52. Role of Spatial Cues (2020)
• What is the impact of spatial audio/visual cues on large scale AR/VR collaboration?
Yang, J., Sasikumar, P., Bai, H., Barde, A., Sörös, G., & Billinghurst, M. (2020). The effects of spatial auditory and visual cues on mixed reality remote collaboration. Journal on Multimodal User Interfaces, 14(4), 337-352.
57. User Study Results
• Two studies conducted
  • (1) Audio-only cues (spatial vs. non-spatial), (2) Different types of visual cues (head, hands)
• Key findings
  • Spatial audio significantly increased Social Presence
  • Users strongly preferred head, gesture, and audio cues over non-spatial voice or a spatial beacon
  • Integrating visual cues with the spatial auditory cues significantly improved task performance
  • Integrating the remote expert's head frustum into the spatial auditory cues provided significantly better social presence, spatial awareness, and system usability
[Charts: Spatial Presence Scores, Task Completion Times]
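As a rough illustration of why a spatialized voice helps a listener locate a collaborator, a constant-power stereo pan places a sound at an azimuth by trading level between the ears. A real system like the one studied would use full HRTF-based rendering; this sketch (function name and the pan law choice are my assumptions) captures only the level-difference cue.

```python
import math

def constant_power_pan(azimuth_deg):
    """Left/right channel gains for a source at the given azimuth
    (-90 = hard left, +90 = hard right), using the constant-power pan
    law so perceived loudness stays constant across positions.
    A minimal stand-in for HRTF-based spatialization."""
    az = max(-90.0, min(90.0, azimuth_deg))
    theta = (az + 90.0) / 180.0 * (math.pi / 2)  # 0 .. pi/2
    return math.cos(theta), math.sin(theta)      # (left_gain, right_gain)

# A source dead ahead is equally loud in both channels.
left, right = constant_power_pan(0.0)
```

Because the gains satisfy cos²θ + sin²θ = 1, total power is constant as the voice moves around the listener.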
58. Sharing Gesture Cues
⢠What type of gesture cues should be shared in AR/VR collaboration?
Kim, S., Lee, G., Huang, W., Kim, H., Woo, W., & Billinghurst, M. (2019). Evaluating the combination of visual communication cues for HMD-
based mixed reality remote collaboration. In Proceedings of the 2019 CHI conference on human factors in computing systems (pp. 1-13).
[Images: Augmented Reality view; Virtual Reality view]
59. Communication Cues
• Four different cues used
• (1) Hands Only (HO), (2) Hands + Pointer (HP)
• (3) Hands + Sketch (HS), (4) Hands + Pointer + Sketch (HPS)
• Three experimental tasks
• Lego assembly, Tangram puzzle, Origami folding
60. Key Results
• Task completion time
• Sketch cues enabled users to complete tasks significantly faster (task dependent)
• Adding pointing didn't improve task completion time
• Co-Presence
• Adding pointing and sketch cues didn't improve the feeling of co-presence
• User Preference
• Users overwhelmingly preferred the Hands + Pointer + Sketch condition; Hands Only ranked last
[Chart: task completion times]
Participant quotes: "sketch allowed drawings for accuracy
and hand for general use", "sketch is
pretty useful for describing actions that
was difficult by verbal words and could
express more details".
61. Changing Gaze Cues
How sharing gaze behavioural cues can improve remote collaboration in a Mixed Reality environment.
– Developed eyemR-Vis, a 360 panoramic Mixed Reality remote collaboration system
– Showed gaze behavioural cues as bi-directional spatial virtual visualisations shared
between a local host (AR) and a remote collaborator (VR).
Jing, A., May, K., Lee, G., & Billinghurst, M. (2021). Eye See What You See: Exploring How Bi-Directional Augmented
Reality Gaze Visualisation Influences Co-Located Symmetric Collaboration. Frontiers in Virtual Reality, 2, 79.
62. System Design
– 360 Panoramic Camera + Mixed Reality View
– Combination of HoloLens 2 + Vive Pro Eye
– 4 gaze behavioural visualisations:
browse, focus, mutual, fixated circle
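As a rough illustration of how such gaze states might be selected at runtime, here is a hypothetical classifier driven by fixation durations. The thresholds and logic are illustrative assumptions, not the published eyemR-Vis design.

```python
def classify_gaze(local_fix_ms, remote_fix_ms, same_target,
                  focus_ms=500, fixate_ms=1500):
    """Choose a gaze visualisation state from fixation durations.

    Thresholds are illustrative placeholders, not published values.
    """
    if same_target and local_fix_ms >= focus_ms and remote_fix_ms >= focus_ms:
        return "mutual"   # both collaborators dwell on the same target
    if local_fix_ms >= fixate_ms:
        return "fixated"  # long dwell: draw the fixated circle
    if local_fix_ms >= focus_ms:
        return "focus"    # short dwell on one spot
    return "browse"       # gaze still moving around the scene
```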
65. Experiment
• Participants
• 12 pairs of people (6 women), 60% with AR/VR experience, 50% with gaze-sharing experience
• Conditions
• 2x2 study design, plus a no-gaze (NG) baseline
• Gaze Behaviour (WB) / No Gaze Behaviour (NB)
• Uni-directional (U) / Bi-directional (B) gaze
• Tasks
• Finding abstract symbols, guiding the other person to the symbols
• Measures
• Quantitative: gaze behaviour metrics, completion time, questionnaires
• Qualitative: open-ended questions and behaviour analysis
66. Research Questions
• (1) Gaze vs. no gaze: compared to no gaze visualisation, gaze cues
would encourage joint focus by providing explicit visual feedback
• (2) Uni-directional vs. bi-directional style: the bi-directional style is
more balanced visually, leading to less confusion over gaze identity
• (3) Gaze behaviours vs. no behaviour: integrating gaze behaviour
visualisations lowers communication cognitive load
67. Results
• RQ1: Behaviour visualisations stimulate frequent joint attention
• RQ2: Bi-directional gaze ensured gaze information was accurately delivered
• RQ3: Gaze made it easier to coordinate on target object location, and made
communication easier and more effective
68. Sharing: Separating Cues from Body
• What happens when you can't see your colleague/agent?
Piumsomboon, T., Lee, G. A., Hart, J. D., Ens, B., Lindeman, R. W., Thomas, B. H., & Billinghurst, M.
(2018, April). Mini-me: An adaptive avatar for mixed reality remote collaboration. In Proceedings of the
2018 CHI conference on human factors in computing systems (pp. 1-13).
[Images: collaborating; collaborator out of view]
69. Mini-Me Communication Cues in MR
• When the user loses sight of a collaborator, a Mini-Me avatar appears
• Miniature avatar in the real world
• Mini-Me points to shared objects and shows communication cues
• Redirected gaze, gestures
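The triggering behaviour described above can be sketched as a simple horizontal field-of-view test. The function names and 2D geometry are hypothetical illustrations, not the actual Mini-Me implementation.

```python
import math

def in_field_of_view(viewer_pos, viewer_yaw, target_pos, fov_deg=90.0):
    """True if the target lies within the viewer's horizontal field of view."""
    dx = target_pos[0] - viewer_pos[0]
    dy = target_pos[1] - viewer_pos[1]
    angle = math.atan2(dx, dy) - viewer_yaw
    angle = (angle + math.pi) % (2 * math.pi) - math.pi  # wrap to (-pi, pi]
    return abs(angle) <= math.radians(fov_deg) / 2

def should_show_mini_me(viewer_pos, viewer_yaw, collaborator_pos):
    """Show the miniature avatar only when the real collaborator is out of view."""
    return not in_field_of_view(viewer_pos, viewer_yaw, collaborator_pos)
```

When the collaborator re-enters the view frustum the test flips back and the miniature avatar can be hidden again.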
71. Results from User Evaluation (16 subjects)
• Collaboration between a user in AR and an expert in VR
• HoloLens, HTC Vive
• Two tasks
• Asymmetric and symmetric collaboration
• Significant performance improvement
• 20% faster with Mini-Me
• Social Presence
• Higher sense of presence
• User preference
• People felt the task was easier to complete
• 60-75% preference
"I feel like I am talking to my partner"
"The ability to see the small avatar … enhanced the speed of solving the task"
72. Avatar Representation
• Pilot study with recorded avatar
• Motorcycle engine assembly
• Avatar types
• (A1) Annotation: computer-generated lines drawn in 3D space
• (A2) Hand Gesture: real hand gestures captured using stereoscopic cameras
• (A3) Avatar: virtual avatar reconstructed using inverse kinematics
• (A4) Volumetric Playback: the movements of an expert are captured by three
Kinect cameras and played back as a virtual avatar via a see-through headset
75. Experiment Design (30 participants)
Performing motorbike assembly task under guidance
- Easy, Medium, Hard task
Hypotheses
- H1. Volumetric playback would have a better sense of social presence in a
remote training system.
- H2. Volumetric playback would enable faster completion of tasks in a remote
training system
Measures
• NMM Social Presence Questionnaire, NASA TLX, SUS
76. Results
• Hands and Annotation conditions were significantly faster than the Avatar
• Volumetric playback induced the highest sense of co-presence
• Users preferred the Volumetric or Annotation interfaces
[Charts: performance time; average ranking]
77. Results
Volumetric instruction cues exhibit an increase in co-presence and
system usability while reducing mental workload and frustration.
[Charts: Mental Load (NASA TLX); System Usability Scale]
78. User Feedback
• Annotations were easy to understand (faster performance)
• "Annotation is very clear and easy to spot in a 3D environment".
• Volumetric playback created a high degree of social presence (like working with a real person)
• "Seeing a real person demonstrate the task feels like being next to a person".
• Recommendations
• Use Volumetric Playback to improve social presence and system usability
• A full-bodied avatar representation in a remote training system is not
recommended unless it is well animated
• Simple annotations can significantly improve performance if
social presence is not important
79. Avatar Representation for Social Presence
• What should avatars look
like for social situations?
• Cartoon vs. realistic?
• Partial or full body?
• Impact on Social Presence?
Yoon, B., Kim, H. I., Lee, G. A., Billinghurst, M., & Woo, W. (2019, March). The effect of
avatar appearance on social presence in an augmented reality remote collaboration. In
2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 547-556). IEEE.
80. Avatar Representations
• Cartoon vs. Realistic, Part Body vs. Whole Body
• Realistic Head & Hands (RHH), Realistic Upper Body (RUB), Realistic Whole Body (RWB),
• Cartoon Head & Hands (CHH), Cartoon Upper Body (CUB), Cartoon Whole Body (CWB)
81. Experiment
• Within-subjects design (24 subjects)
• 6 conditions: RHH, RUB, RWB, CHH, CUB, CWB
• AR/VR interface
• Subject in AR interface, actor in VR
• Experiment measures
• Social Presence
• Networked Minds Measure of Social Presence survey
• Bailenson's Social Presence survey
• Post-experiment interview
• Tasks
• Study 1: Crossword puzzle (face-to-face discussion)
• Study 2: Furniture placement (virtual object placement)
[Images: AR user; VR user]
82. Hypotheses
H1. Body Part Visibility will affect the user's Social Presence in AR.
H2. The Whole-Body virtual avatars will have the highest Social
Presence among the three levels of visibility.
H3. Head & Hands virtual avatars will have the lowest Social
Presence among the three levels of visibility.
H4. The Character Style will affect the user's Social Presence.
H5. Realistic avatars will have a higher Social Presence than
Cartoon Style avatars in an AR remote collaboration.
84. User Comments
• "Whole Body" avatar expression to users
• "Presence was high with full body parts, because I could notice joints'
movement, behaviour, and reaction."
• "I didn't get the avatar's intention of the movement, because it had only
head and hands."
• "Upper Body" vs. "Whole Body" avatar
• "I preferred the one with whole body, but it didn't really matter because I
didn't look at the legs much."
• "I noticed the head and hands model immediately, but I didn't feel the
difference whether the avatar had a lower body or not."
• "Realistic" vs. "Cartoon" style avatars
• "The character seemed more like a game than furniture placement in real. I
felt that realistic whole body was collaborating with me more."
85. Hypotheses Outcome
H1. Body Part Visibility will affect the user's Social Presence in AR.
H2. The Whole-Body virtual avatars will have the highest Social
Presence among the three levels of visibility.
H3. Head & Hands virtual avatars will have the lowest Social
Presence among the three levels of visibility.
H4. The Character Style will affect the user's Social Presence.
H5. Realistic avatars will have a higher Social Presence than
Cartoon Style avatars in an AR remote collaboration.
86. Key Lessons Learned
• Avatar body part visibility should be considered first when designing for AR remote
collaboration, since it significantly affects Social Presence
• Body Part Visibility
• Whole Body & Upper Body: whole body is preferred, but upper body is okay in some cases
• Head & Hands: should be avoided
• Character Style
• No difference in Social Presence between Realistic and Cartoon avatars
• However, the majority of participants responded positively to the Realistic avatar
• Cartoon character for fun, Realistic avatar for professional meetings
87. Technology Trends
• Advanced displays
• Wide FOV, high resolution
• Real-time space capture
• 3D scanning, stitching, segmentation
• Natural gesture interaction
• Hand tracking, pose recognition
• Robust eye-tracking
• Gaze points, focus depth
• Emotion sensing/sharing
• Physiological sensing, emotion mapping
88. Sensor Enhanced HMDs
• HP Omnicept: eye tracking, heart rate, pupillometry, and face camera
• Project Galea: EEG, EMG, EDA, PPG, EOG, eye gaze, etc.
90. Enhancing Emotion
• Using physiological and contextual cues to enhance emotion representation
• Show the user's real emotion, make it easier to understand the user's emotion, etc.
[Diagram: real user → physiological cues + context cues → arousal/valence (positive/negative) → avatar]
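A minimal sketch of mapping physiological cues to an arousal/valence point and an avatar expression. The sensor ranges, weights, and expression labels are entirely illustrative assumptions.

```python
def avatar_expression(heart_rate_bpm, eda_us, smile_score):
    """Map physiological cues to an arousal/valence point and an expression label.

    Sensor ranges, weights, and labels are illustrative placeholders.
    """
    # Normalise arousal from heart rate (60-120 bpm) and skin conductance (0-10 uS)
    hr = min(max((heart_rate_bpm - 60.0) / 60.0, 0.0), 1.0)
    eda = min(max(eda_us / 10.0, 0.0), 1.0)
    arousal = 0.5 * hr + 0.5 * eda
    # Valence here comes from a face-camera smile score in [-1, 1]
    valence = max(-1.0, min(1.0, smile_score))
    # Quadrants of the arousal/valence plane pick the avatar expression
    if valence >= 0:
        label = "excited" if arousal > 0.5 else "content"
    else:
        label = "stressed" if arousal > 0.5 else "sad"
    return arousal, valence, label
```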
98. Intelligent Digital Humans
• Soul Machines
• AI digital brain
• Expressive digital humans
• Autonomous animation
• Able to see and hear
• Learn from users
100. Towards Empathic Social Agents
• Goal: using agents to create
empathy between people
• Combine
• Scene capture
• Shared tele-presence
• Trust/emotion recognition
• Enhanced communication cues
• Separate cues from representation
• Facilitating brain synchronization
102. Summary
• Being able to share communication cues is vital
• Focus on the cues needed for the task
• Need for a body is task dependent
• Physical task: pointer/simple cues okay
• Social task: avatar/volumetric avatar better
• Simple representation may be okay for some tasks
• Legs optional, arms essential
• Using additional communication cues can be beneficial
• Gaze lines, view frustum, body copy, etc.
105. Opportunities for Research
• Adding sensors
• Physiological cues, sharing emotional/mental state
• Behavioural/sensor synchronization
• Novel communication cues
• Scene sharing, gaze, gesture cues
• AI + Virtual Avatars
• Realistic behaviours, visual representation, cultural translation, etc.
• Autonomous Mixed Reality Agents
• Awareness of the real world and real user behaviour