Mark Billinghurst
mark.billinghurst@unisa.edu.au
October 12th 2025
Current Research and Opportunities
Neuro-XR
FOUNDATIONS
1967 – IBM 1401 – half of the computers in the world, $10,000/month to run
Jacques Vidal (1973)
Vidal, J. J. (1973). Toward direct brain-computer communication. Annual Review of Biophysics and Bioengineering, 2(1), 157-180.
Coined the term Brain Computer Interface
UCLA
BCI Lab
BCI Publications Per Year (1977 – 2025)
~ 30,000 papers
Major Research Trends
• 1920s–1960s - Early Foundations
• 1924 invention of EEG (Hans Berger)
• 1970s – Establishing the Field
• 1977 – Vidal’s first non-invasive human study
• 1980s - 1990s - Development of Core Non-Invasive Paradigms
• 1988 - Farwell and Donchin introduce the P300 Speller paradigm
• 2000s–Present - Invasive BCI Breakthroughs and Clinical Translation
• 2004 - BrainGate – first invasive human trial
• 2016 onwards – rise of private companies (Neuralink, Synchron)
Current State of the Art
• Moderate-cost, reliable EEG hardware
• OpenBCI, Emotiv, Neurosity, Muse, etc.
• Excellent software tools
• OpenViBE, EEGLAB, MNE-Python, etc.
• Fast input performance
• ~60 wpm from BrainGate (2023)
• Implantable clinical trials underway
• BrainGate (2004), Synchron (2019), Neuralink (2024), etc.
BCI and XR Publications
~1,400 XR papers
BCI and AR/VR Publications
Roz Picard (1995)
• Coined term Affective Computing
Picard, R. W. (1995). Affective Computing. MIT Media Laboratory Perceptual Computing Section Technical Report No. 321. Cambridge, MA.
Affective Computing
• Roz Picard – MIT Media Lab
• Systems that recognize emotion
Affective Computing Publications (1998 – 2025)
~ 7,500 papers
Major Research Trends
• Mid 1990s - Early Foundations
• 1997 - Affective Computing book – Roz Picard
• Late 1990s – Early 2000s – Early Projects
• 1998 – IBM BlueEyes project – emotion sensing in a practical system
• 2000s – 2010s – Advancing Interaction and Modeling
• 2001 – USC-ICT – modeling emotions in virtual characters
• 2010s – Present – Real World Integration and Modern Deep learning
• 2010 – Establishment of IEEE Transactions on Affective Computing
• 2009 onwards – commercialization (Affectiva, Empatica)
Current State of the Art
• Widely available physiological sensors
• Shimmer, Emotibit, Plux, Empatica, etc.
• Excellent software tools
• PyAffecCT, EmoSense, EmotiEfflib, etc.
• Consumer wearable devices
• Apple, Fitbit, Garmin, Samsung, etc.
• Many companies operating
• Affectiva, nViso, Realeyes, Emteq, etc.
Affective Computing and XR Publications
~450 XR papers
Affective Computing and AR/VR Publications
Landscape Summary
• Long history in BCI/Affective computing research
• Large research communities, with papers dating from the 1980s/90s
• Many devices and tools available
• Relatively low cost, open source, commercially available
• But little XR research
• ~5% of current BCI/Affective Computing publications
• Especially in the AR space (~20% of BCI XR publications)
• Obstacles to Overcome
• Integrating multiple sensors
• Combining different raw data
• Need for multi-skilled researchers
Typical Neuro-XR setup
PhysioHMD (2018)
Toolkit for collecting physiological data from HMDs
• EEG, EMG, EOG, EDA
• Open-source software
Bernal, G., Yang, T., Jain, A., & Maes, P. (2018, October). PhysioHMD: A conformable, modular toolkit for collecting physiological data from head-mounted displays. In Proceedings of the 2018 ACM International Symposium on Wearable Computers (pp. 160-167).
OpenBCI Galea: Multiple Physiological Sensors in VR HMD
• Incorporate range of sensors on HMD faceplate and over the head
• EMG – muscle movement
• EOG – Eye movement
• EEG – Brain activity
• EDA – Skin conductance
• PPG – Heart rate
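As an aside on how PPG yields heart rate: a minimal sketch of peak-based beat detection, assuming a clean waveform and a nominal 100 Hz sampling rate (illustrative values, not Galea specifics).

```python
# Minimal sketch: heart rate from a PPG waveform via systolic-peak detection.
# The sampling rate and thresholds are illustrative assumptions, not Galea specifics.
import numpy as np
from scipy.signal import find_peaks

def heart_rate_bpm(ppg, fs=100.0):
    """Detect peaks at least 0.4 s apart (caps detection at ~150 bpm) and average the beat intervals."""
    peaks, _ = find_peaks(ppg, distance=int(0.4 * fs), prominence=np.std(ppg))
    if len(peaks) < 2:
        return None  # not enough beats to estimate a rate
    ibi = np.diff(peaks) / fs  # inter-beat intervals in seconds
    return 60.0 / ibi.mean()  # beats per minute
```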
Cognixion Axon-R
• See-through AR HMD
• Integrated 8 EEG sensors
• 8 additional channels (ECG, EMG, EOG)
• BCI studio software
RESEARCH PROJECTS
Example Research Projects
• Measuring Presence
• Towards an objective measure
• Adaptive VR Training
• Sensing and adapting to cognitive load
• Measuring Trust in AI Agents
• Using neurophysiological cues to measure trust
• Emotionally Responsive VR
• Adaptive VR experience based on emotional state
Neurophysiological Measures of Presence
• Measuring Presence using multiple neurophysiological measures
• Combining physiological and neurological signals
Dey, A., Phoon, J., Saha, S., Dobbins, C., & Billinghurst, M. (2020, November). A neurophysiological approach for measuring presence in immersive virtual environments. In 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) (pp. 474-485). IEEE.
Experiment Design
• Independent Variable
• Presence level of VE; High (HP), Low (LP)
• High quality
• Better visuals, interaction, realistic hand
• Low quality
• Reduced visuals, no interaction, cartoon hand
• Between-subjects design
• Reduce variability
Measures
• Physiological Measures
• raw electrocardiogram (ECG) (Shimmer)
• heart rate
• galvanic skin response (GSR) (Shimmer)
• phasic and tonic electrodermal activity (EDA)
• electroencephalogram (EEG) (Emotiv)
• brain activity
• Subjective Measures (Presence Surveys)
• Slater-Usoh-Steed (SUS) questionnaire
• Witmer & Singer survey
• Subjects
• 24 subjects, aged 20-30, 2 groups of 12
HTC Vive + Emotiv
Results – Subjective Surveys
• Significant difference in Presence on both the Witmer & Singer and SUS surveys
[Charts: Witmer & Singer and SUS presence scores]
Results – EEG Analysis
• 14 channels of EEG data
• Processing Alpha, Theta, Beta bands
• Multiple methods – Chirplet Transform, Power Spectral Density, Power Load Index
• Significant differences in brain activity between LP and HP environment
• Overall cognitive engagement is higher in the HP than the LP environment
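To make the band processing above concrete, here is a minimal sketch of Welch PSD band-power extraction; the 128 Hz sampling rate and 14-channel layout are assumptions in the spirit of the consumer EEG used, not the paper's exact pipeline.

```python
# Minimal sketch: mean Theta/Alpha/Beta band power per EEG channel via Welch PSD.
# Sampling rate and band edges are common choices, assumed rather than taken from the paper.
import numpy as np
from scipy.signal import welch

BANDS = {"theta": (4.0, 8.0), "alpha": (8.0, 13.0), "beta": (13.0, 30.0)}

def band_powers(eeg, fs=128.0):
    """eeg: (n_channels, n_samples) array -> {band: per-channel power}."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs), axis=-1)
    out = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        out[name] = np.trapz(psd[..., mask], freqs[mask], axis=-1)  # integrate PSD over the band
    return out
```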
Other Physiological Cues
• Significant difference in ECG
• No difference in EDA results
[Chart: heart rate values]
Lessons Learned
• Key Findings
• Higher presence in calm virtual environments can be characterised by increased heart rate and elevated beta, theta, and alpha activity in the brain.
• Approaching a neurophysiological measure of presence
• Limitations
• Simple Virtual Environment
• Consumer grade EEG
• Participants seated/limited movement
EEG-based Adaptive VR Training
Dey, A., Chatburn, A., & Billinghurst, M. (2019). Exploration of an EEG-based cognitively adaptive training system in virtual reality. In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 220-226). IEEE.
Goal: Create an adaptive VR training system using workload calculated from EEG
Our System
EEG electrodes: Oz, O1, O2, Pz, P3, and P4
Adaptation/Calibration
● Establish baseline (alpha power) – innate cognitive load capacity
● Two sets of n(1, 2)-back tasks to calibrate user’s own capacity
● Measured alpha activity (task load), calculate mean of two tasks
● Mean → CL Baseline
● In experimental task, adapt content
○ load > baseline → decrease difficulty level
○ load < baseline → increase difficulty level
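A minimal sketch of this calibrate-then-adapt rule; measure_alpha_power and the level bounds are hypothetical stand-ins for the real EEG pipeline and task, not the study's code.

```python
# Minimal sketch of the adaptation rule above; measure_alpha_power() is a
# hypothetical stand-in for the real EEG pipeline.
def calibrate_baseline(measure_alpha_power):
    """CL baseline = mean alpha-derived load over the 1-back and 2-back calibration tasks."""
    return (measure_alpha_power(task="1-back") + measure_alpha_power(task="2-back")) / 2.0

def adapt(load, baseline, level, min_level=0, max_level=20):
    """load > baseline -> decrease difficulty; load < baseline -> increase (levels 0-20)."""
    if load > baseline:
        return max(min_level, level - 1)
    if load < baseline:
        return min(max_level, level + 1)
    return level
```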
Experimental Task
• Target selection
• Varying numbers of objects, colors, shapes, and movement
Increasing levels (0 - 20)
Experimental Task
[Screenshots: low difficulty vs. high difficulty]
User Study
● Participants
● 14 subjects (6 women)
● 20 – 41 years old, 28 years average
● No experience with VR
● Measures
○ Response time
○ Brain activity (alpha power)
• 5-minute fixed trial time
Adaptation
Results – Response Time
[Chart: response time (sec.) across increasing levels]
• No difference between easiest and hardest levels
Results – Time Frequency Representation
• Task Load
• Significant alpha synchronisation in the hardest difficulty levels of
the task when compared to the easiest difficulty levels
[Time-frequency plots: easiest, hardest, and difference]
Key Finding + Limitations
• Findings
• Similar task time but increased brain activity
• Increased cognitive effort at higher levels to sustain performance
• Adaptive VR training can increase the user’s cognitive load without
affecting task performance
• But:
• Task: Should be similar to real-world tasks
• Behaviour: Difficulty levels could be designed differently
• EEG: Only alpha activity analysed (ignoring theta), and only 12 electrodes
Understanding: Trust and Agents
• Many Agents require trust
• Guidance, collaboration, etc.
• Would you trust an agent?
• How can you measure trust?
• Subjective/Objective measures
According to AAA, 71% of surveyed
Americans are afraid to ride in a fully
self-driving vehicle.
Measuring Trust
• How to reliably measure trust?
• Using physiological sensors (EEG, GSR, HRV)
• Subjective measures (STS, SMEQ, NASA-TLX)
• Relationship between cognitive load (CL) and trust?
• Novelty:
• Use EEG, GSR, HRV to evaluate trust at different CL
• Implemented custom VR environment with virtual agent
• Compare physiological, behavioral, subjective measures
Gupta, K., Hajika, R., Pai, Y. S., Duenser, A., Lochner, M., & Billinghurst, M. (2020, March).
Measuring human trust in a virtual assistant using physiological sensing in virtual reality.
In 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 756-765). IEEE.
Experimental Task
• Target selection + N-back memory task
• Agent voice guidance
Experiment Design
• Two factors
• Cognitive Load (Low, High)
• Low = N-Back with N = 1
• High = N-Back with N = 2
• Agent Accuracy (No, Low, High)
• No = No agent
• Low = 50% accurate
• High = 100% accurate
• Within Subject Design
• 24 subjects (12 Male), 23-35 years old
• All experienced with virtual assistants
2 x 3 Expt Design
Results
• Physiological Measures
• EEG sign. diff. in alpha band power level with CL
• GSR/HRV – sign. diff. in FFT mean/peak frequency (see the sketch below)
• Performance
• Better with more accurate agent, no effect of CL
• Subjective Measures
• Sign. diff. in STS scores with accuracy, and CL
• SMEQ had a significant effect of CL
• NASA-TLX significant effect of CL and accuracy
• Overall
• Trust for virtual agents can be measured using combo
of physiological, performance, and subjective measures
“I don’t trust you anymore!!”
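For the GSR/HRV frequency-domain features reported above, a minimal sketch of FFT mean and peak frequency; the 4 Hz sampling rate is an assumed, typical GSR rate, not the study's configuration.

```python
# Minimal sketch: FFT mean and peak frequency of a GSR signal.
# fs=4 Hz is a typical GSR sampling rate, assumed here for illustration.
import numpy as np

def gsr_fft_features(gsr, fs=4.0):
    """Return (mean_freq, peak_freq) in Hz from the detrended GSR spectrum."""
    x = np.asarray(gsr, dtype=float) - np.mean(gsr)
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    mean_freq = np.sum(freqs * spectrum) / np.sum(spectrum)  # amplitude-weighted mean frequency
    peak_freq = freqs[np.argmax(spectrum)]                   # dominant spectral component
    return mean_freq, peak_freq
```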
Context-Aware Empathic VR
• VR application that identifies and
responds to user’s emotional changes
• Emotion prediction model (EEG, EDA, HRV)
• Context aware empathic agent
• Emotion adaptive VR environment
Gupta, K., Zhang, Y., Gunasekaran, T. S., Krishna, N., Pai, Y. S., & Billinghurst, M. (2024). CAEVR: Biosignals-driven context-aware empathy in virtual reality. IEEE Transactions on Visualization and Computer Graphics, 30(5), 2671-2681.
Key Research Questions
• RQ1: How can physiological signals be used to predict emotions and
facilitate context-aware empathic interactions (CAEIxs) in VR?
• RQ2: What are the effects of CAEIxs on elicited emotions, cognitive
load, and empathy towards a virtual agent in VR?
• RQ3: How can the impact of CAEIxs on users’ emotional and
cognitive load during VR experiences be evaluated?
The CAEVR System
Empathic Appraisal
• BioEmVR: A generalized emotion recognition model
• Used EEG, HRV, and EDA Data
• Gradient boosting classifier recognized 4 emotional states (92% accuracy; see the sketch after this list)
• Happy, Stressed, Bored, Relaxed
• Self-Projected Appraisal: To assess context and user emotional state
• Determine situational emotions based on user actions and context
• E.g. moving slowly and running late might mean the person is stressed
• Empathic Emotion Features: Synthesizing contextual and emotion information
• Tailor the VR environment based on user’s emotional and situational needs
• E.g. a user who is happy but stressed might benefit from positive reinforcement
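A minimal sketch of a gradient boosting classifier in the spirit of BioEmVR; the placeholder features and labels below stand in for real EEG/HRV/EDA features, and the hyperparameters are assumptions, not the paper's model.

```python
# Minimal sketch: gradient boosting over physiological features for 4 emotion classes.
# X and y are random placeholders for real EEG/HRV/EDA features and labels.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

LABELS = ["happy", "stressed", "bored", "relaxed"]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 32))               # e.g. band powers, RMSSD, SCR counts, ...
y = rng.integers(0, len(LABELS), size=200)   # one of the four emotional states

clf = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05)
print("mean CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```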
Empathic Response
• Used to express affective states
• Changing lighting and colours, and the virtual agent’s verbal and non-verbal cues
• Emotion-Adaptive Responses
• VR environment colours change depending on user’s emotional state
• Happy = Yellow, Stress = Red
• Empathic Tone
• VR agent provides speech feedback with different tones
• Context-Aware Empathic Assistance
• Agent dialogue varies depending on user’s emotional state
AR glasses can add a colour filter based on the participant’s emotional state: low-saturation colours evoke a sense of depression, while high-saturation ones make people feel cheerful and pleasant.
Context-Aware Empathic VR Experience
https://www.youtube.com/watch?v=cLLNXl3c1mY
System Evaluation
• Photo taking task
• Take pictures of monuments in VR
• Independent Variables
• Emotion-Adaptive Environment (EA)
• Context-Aware Empathic Agent (CAE)
• 2×2 within-subjects design, 17 subjects (7 male)
• Hypotheses
• H1: EA and CAE will improve user’s emotional states, presence, empathy
• H2: EA and CAE will significantly affect cognitive aspects (cognitive load, flow)
Conditions No-CAE CAE
No-EA A: No-EA-No-CAE C: No-EA-CAE
EA B: EA-No-CAE D: EA-CAE
Measures
• Subjective Measures
• IPQ for Presence
• NASA-TLX for Cognitive Load
• SAM scale for Emotional State
• Flow Short State Questionnaire (FSSQ) for flow state
• Game Experience Questionnaire (GEQ) for affect and empathy
• Physiological Measures
• EEG, Electrodermal Activity (EDA), & Heart Rate Variability (HRV)
Key Results
• SAM: Significant effect of EA and CAE on Valence
• GEQ: Significant effect of CAE on Empathy
• EA and CAE influenced EDA and HRV metrics such as RMSSD (see the sketch below)
• EEG: Significant effects of CAE on FA-Theta and FB-Beta
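RMSSD, mentioned above, is simply the root mean square of successive differences between inter-beat (RR) intervals; a minimal sketch:

```python
# Minimal sketch: RMSSD from a list of RR intervals in milliseconds.
import numpy as np

def rmssd(rr_ms):
    diffs = np.diff(np.asarray(rr_ms, dtype=float))  # successive RR differences
    return float(np.sqrt(np.mean(diffs ** 2)))

print(rmssd([812, 790, 835, 801, 820]))  # ~32 ms
```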
Conclusions
• Hypotheses
• Use of EA and CAE improved emotional experience (H1)
• CAE interventions improved user’s empathy towards agent (H1)
• Integrating EA and CAE can impact cognitive load (H2)
• No impact on Flow state or Presence
• Research Questions
• RQ1: BioEmVR can predict emotions and facilitate CAEIxs (92% accuracy)
• RQ2: Using CAE virtual agents can enrich user experience (emotions, empathy)
• RQ3: Need to use a multi-faceted approach for evaluation (survey, physio. cues)
• Overall: the user’s VR experience can be enhanced by adapting to real-time emotional feedback – integrating EEG, EDA and HRV
EMPATHIC COMPUTING
An important research opportunity for Neuro-XR
Modern Communication Technology Trends
1. Improved Content Capture
• Move from sharing faces to sharing places
2. Increased Network Bandwidth
• Sharing natural communication cues
3. Implicit Understanding
• Recognizing behaviour and emotion
[Diagram: Experience Capture, Natural Collaboration, and Implicit Understanding combine into Empathic Computing]
“Empathy is Seeing with the
Eyes of another, Listening with
the Ears of another, and Feeling
with the Heart of another.”
Alfred Adler
Empathic Computing Research Focus
Can we develop systems that allow
us to share what we are seeing,
hearing and feeling with others?
Sharing Heart Rate in VR
• HTC Vive HMD
• Heart rate sensor
• Empatica E4
Dey, A., Piumsomboon, T., Lee, Y., & Billinghurst, M. (2017). Effects of
sharing physiological states of players in a collaborative virtual reality
gameplay. In Proceedings of CHI 2017 (pp. 4045-4056).
VR Environments
• Butterfly World: calm scene, collect butterflies
• Zombie Attack: scary scene, fighting zombies
Experiment Design
• Key Question
• What is the impact of sharing heart rate feedback?
• Two Independent Variables
• Game Experience (Zombies vs butterflies)
• Heart Rate Feedback (On/Off)
• Measures
• Heart rate (player)
• PANAS Scale (Emotion)
• Inclusion of Other in Self (IOS) scale (Connection)
Results
• Significant difference in Heart Rate
• Sharing HR improves positive affect (PANAS)
• Sharing HR created subjective connection between collaborators
[Charts: heart rate data and Likert ratings]
Using Neuro-XR for Empathic Computing
• Sharing cognitive state
• Enhancing remote collaboration in XR
• Measuring brain synchronisation
• Can real-world synchronisation also happen in VR?
• Responding to synchronisation
• Adaptive XR that encourages synchronisation
• Measure physiological cues
• Brain activity
• Heart rate
• Eye gaze
• Show user state
• Cognitive load
• Attention
Showing Cognitive Load in Collaboration
Sasikumar, P., ... & Billinghurst, M. (2024). A user study on sharing physiological cues in VR assembly tasks. In 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 765-773). IEEE.
Demo
User Study
• Aim
• How visual cues of physiological state
affect collaboration and awareness
• Task (28 people/ 14 pairs)
• Motorbike repair
• Different levels of complexity
• Found
• Users had a preference for monitoring their partner’s attentional state,
• but paid little attention to physiological cues and were unsure how to interpret them
[Charts: % of time looking at physiological cues; user preference ranking]
Brain Synchronization
[EEG synchrony plots: pre-training and post-training finger-pointing sessions]
Brain Synchronization in VR
Empathic Shared MR Experiences: NeuralDrum
• Using brain synchronicity to increase connection
• Collaborative VR drumming experience
• Measure brain activity using 3 EEG electrodes
• Use PLV to calculate synchronization (see the sketch below)
• More synchronization increases graphics effects/immersion
Pai, Y. S., Hajika, R., Gupta, K., Sasikumar, P., & Billinghurst, M. (2020). NeuralDrum: Perceiving Brain
Synchronicity in XR Drumming. In SIGGRAPH Asia 2020 Technical Communications (pp. 1-4).
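A minimal sketch of a PLV computation between two EEG channels; the alpha band and filter order are illustrative assumptions, not NeuralDrum's exact parameters.

```python
# Minimal sketch: phase-locking value (PLV) between two EEG signals.
# Band and filter settings are illustrative, not NeuralDrum's exact parameters.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def plv(x, y, fs=250.0, band=(8.0, 13.0)):
    """PLV in [0, 1]: 1 = perfectly phase-locked, 0 = no consistent phase relation."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    phase_x = np.angle(hilbert(filtfilt(b, a, x)))  # instantaneous phase of channel x
    phase_y = np.angle(hilbert(filtfilt(b, a, y)))
    return float(np.abs(np.mean(np.exp(1j * (phase_x - phase_y)))))
```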
Set Up
• HTC Vive HMD
• OpenBCI
• 3 EEG electrodes
Results
"It’s quite interesting, I actually felt like my
body was exchanged with my partner."
[Visualisations: poor player vs. good player]
CONCLUSIONS
Conclusions
• Significant increase in BCI, AC research
• Hardware and software becoming widely available
• However, little research in XR
• Many applications for Neuro-XR
• Measuring Presence
• Adaptive VR experiences
• Creating empathic VR systems
• Empathic Computing
• Focus on enhancing collaboration
• Opportunities to apply research in Neuro-XR
Empathic Computing Journal
• Looking for submissions
• Any topic relevant to Empathic Computing
• Open Access, currently free to publish
Submit intent at https://forms.gle/XXHkWh5UVQazbuTx7
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au
Tools for Neuro-XR: Octopus Sensing
● Octopus-sensing
○ Simple unified interface for
● Simultaneous data acquisition
● Simultaneous data recording
○ Study design components
● Octopus-sensing-monitoring
○ Real-time monitoring
● Octopus-sensing-visualizer
○ Offline synchronous data visualizer
● Octopus-sensing-processing
○ Real-time processing
Octopus Sensing Visualizer
● Visualizing raw or processed data using a config file
Octopus Sensing
● Multiplatform (Linux, Mac, Windows)
● Open-source (https://github.com/octopus-sensing)
● Supports various sensors
a. OpenBCI
b. Brainflow
c. Shimmer3
d. Camera
e. Audio
f. Network (Unity and Matlab)
Saffaryazdi, N., Gharibnavaz, A., & Billinghurst, M. (2022). Octopus Sensing: A Python library for human behavior studies. Journal of Open Source Software, 7(71), 4045.
