Lecture 2 in the 2022 COMP 4010 Lecture series on AR/VR and XR. This lecture is about human perception for AR/VR/XR experiences. This was taught by Mark Billinghurst at the University of South Australia in 2022.
3. The Incredible Disappearing Computer
• 1960-70’s: Room
• 1970-80’s: Desk
• 1980-90’s: Lap
• 1990-2000’s: Hand
• 2010 - : Head
4. Rekimoto, J. and Nagao, K. 1995. The world through the computer: computer augmented interaction with real world environments.
Making Interfaces Invisible
(c) Internet of Things
5. Internet of Things (IoT)..
• Embed computing and sensing in real world
• Smart objects, sensors, etc..
(c) Internet of Things
6. Virtual Reality (VR)
• Users immersed in Computer Generated environment
• HMD, gloves, 3D graphics, body tracking
7. Augmented Reality (AR)
• Virtual Images blended with the real world
• See-through HMD, handheld display, viewpoint tracking, etc..
9. Milgram’s Mixed Reality (MR) Continuum
Augmented Reality Virtual Reality
Real World Virtual World
Mixed Reality
"...anywhere between the extrema of the virtuality continuum."
P. Milgram and F. Kishino (1994). A Taxonomy of Mixed Reality Visual Displays
Internet of Things
10. Extended Reality (XR)
Augmented Reality Virtual Reality
Real World Virtual World
Mixed Reality
Extended Reality
Internet of Things
13. Super Cockpit (1965-80’s)
• US Air Force Research Program
• Wright Patterson Air Force Base
• Tom Furness III
• Multisensory
• Visual, auditory, tactile
• Head, eye, speech, and hand input
• Addressing pilot information overload
• Flight controls and tasks too complicated
• Research only
• Big system, not safe for ejection
14. VPL Research (1985 – 1999)
• First Commercial VR Company
• Jaron Lanier, Jean-Jacques Grimaud
• Provided complete systems
• Displays, software, gloves, etc
• DataGlove, EyePhone, AudioSphere
15. First Industrial Use of AR (1990’s)
• 1992: Tom Caudell at Boeing coined the term “AR”
• Wire harness assembly application developed
• Led by Tom Caudell and David Mizell
19. Mobile Phone AR (2005)
• Mobile Phones
• camera
• processor
• display
• AR on Mobile Phones
• Simple graphics
• Optimized computer vision
• Collaborative Interaction
20. 2008 - Browser Based AR
• Flash + camera + 3D graphics
• ARToolKit ported to Flash
• High impact
• High marketing value
• Large potential install base
• 1.6 Billion web users
• Ease of development
• Lots of developers, mature tools
• Low cost of entry
• Browser, web camera
25. Social Mobile Camera AR Apps (2015 - )
• SnapChat - Lenses, World Lenses
• Cinco de Mayo lens > 225 million views
• Facebook - Camera Effects
• Google – Word Lens/Translate
26. Hololens (2016)
• Integrated system – Windows
• Stereo see-through display
• Depth sensing tracking
• Voice and gesture interaction
• Note: Hololens2 coming September 2019
27. ARKit/ARCore (2017)
• Visual Inertial Odometry (VIO) systems
• Mobile phone pose tracked by
• Camera (Visual), Accelerometer & Gyroscope (Inertial)
• Features
• Plane detection, lighting detection, hardware optimisation
• Links
• https://developer.apple.com/arkit/
• https://developers.google.com/ar/
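The inertial half of a VIO pipeline can be illustrated with a minimal complementary filter that blends gyroscope integration with an accelerometer-derived angle. This is a toy sketch, not the actual ARKit/ARCore implementation; the blending weight, sample rate, and sensor values below are all invented for illustration.

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend a gyroscope rate with an accelerometer-derived angle.

    The gyro integrates smoothly but drifts over time; the accelerometer
    is noisy but drift-free. Weighting the two gives a stable estimate.
    """
    gyro_estimate = angle + gyro_rate * dt      # integrate the gyro (drifts)
    return alpha * gyro_estimate + (1 - alpha) * accel_angle

# Simulated readings: device held at a constant 10-degree pitch,
# gyro reporting a small bias, accelerometer reading the true angle.
pitch = 0.0
for _ in range(200):
    pitch = complementary_filter(pitch, gyro_rate=0.05, accel_angle=10.0, dt=0.01)
print(round(pitch, 1))
```

The estimate converges toward the accelerometer's angle while the gyro term keeps frame-to-frame motion smooth; real VIO systems fuse full 6-DOF pose with camera features, typically via a Kalman filter.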
28. History Summary
• 1960’s – 80’s: Early Experimentation
• 1980’s – 90’s: Basic Research
• Tracking, displays
• 1995 – 2005: Tools/Applications
• Interaction, usability, theory
• 2005 - : Commercial Applications
• Mobile, Games, Medical, Industry
30. Why 2022 won’t be like 1996
• It’s not just VR anymore
• Huge amount of investment
• Inexpensive hardware platforms
• Easy to use content creation tools
• New devices for input and output
• Proven use cases – no more Hype!
• Most important: Focus on User Experience
32. Pokemon GO Effect
• Fastest App to reach $500 million in Revenue
• Only 63 days after launch, > $1 Billion in 6 months
• Over 500 million downloads, > 25 million DAU
• Nintendo stock price up by 50% (gain of $9 Billion USD)
33. Augmented Reality in 2022
• Large growing market
• > $13Billion USD in 2021
• Many available devices
• HMD, phones, tablets, HUDs
• Robust developer tools
• Vuforia, ARToolKit, Unity, Wikitude, etc
• Large number of applications
• > 150K developers, > 100K mobile apps
• Strong research/business communities
• ISMAR, AWE conferences, AugmentedReality.org, etc
41. Conclusion
• AR/VR has a long history
• > 50 years of HMDs, simulators
• Key elements were in place by the early 1990’s
• Displays, tracking, input, graphics
• Strong support from military, government, universities
• First commercial wave failed in late 1990’s
• Too expensive, bad user experience, poor technology, etc
• We are now in second commercial wave
• Better experience, Affordable hardware
• Large commercial investment, Significant installed user base
• Will XR be a commercial success this time?
44. How do We Perceive Reality?
• We understand the world through
our senses:
• Sight, Hearing, Touch, Taste, Smell
(and others..)
• Two basic processes:
• Sensation – Gathering information
• Perception – Interpreting information
46. Goal of Virtual Reality
“.. to make it feel like you’re actually in a place that
you are not.”
Palmer Luckey
Co-founder, Oculus
47. Creating the Illusion of Reality
• Fooling human perception by using
technology to generate artificial sensations
• Computer generated sights, sounds, smell, etc
48. Reality vs. Virtual Reality
• In a VR system there are input and output devices
between human perception and action
49. Example Birdly - http://www.somniacs.co/
• Create illusion of flying like a bird
• Multisensory VR experience
• Visual, audio, wind, haptic
52. Presence ..
“The subjective experience of being in one place or
environment even when physically situated in another”
Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A presence
questionnaire. Presence: Teleoperators and virtual environments, 7(3), 225-240.
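Questionnaires like Witmer and Singer's measure presence by averaging Likert-scale item responses. A minimal scoring sketch follows; the items, ratings, and reverse-scoring choice are made up for illustration and are not the actual PQ items.

```python
# Toy presence-questionnaire scoring: each item is rated on a 1-7
# Likert scale and the overall score is the mean across items.
# Items and responses below are invented for illustration.
responses = {
    "How much did the visual aspects involve you?": 6,
    "How natural did your interactions seem?": 5,
    "How aware were you of the real world?": 3,  # high = less presence
}

# Reverse-score items where a high rating means LESS presence.
reverse = {"How aware were you of the real world?"}
scored = [8 - v if q in reverse else v for q, v in responses.items()]

presence_score = sum(scored) / len(scored)
print(presence_score)
```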
53. Immersion vs. Presence
• Immersion: describes the extent to which technology is capable of
delivering a vivid illusion of reality to the senses of a human participant.
• Presence: a state of consciousness, the (psychological) sense of being
in the virtual environment.
• So Immersion, defined in technical terms, is capable of producing a
sensation of Presence
• Goal of VR: Create a high degree of Presence
• Make people believe they are really in Virtual Environment
Slater, M., & Wilbur, S. (1997). A framework for immersive virtual environments (FIVE): Speculations on the role
of presence in virtual environments. Presence: Teleoperators and virtual environments, 6(6), 603-616.
54. How to Create Strong Presence?
• Use Multiple Dimensions of Presence
• Create rich multi-sensory VR experiences
• Include social actors/agents that interact with the user
• Have environment respond to the user
• What Influences Presence
• Vividness – ability to provide rich experience (Steuer 1992)
• Using Virtual Body – user can see themselves (Slater 1993)
• Internal factors – individual user differences (Sadowski 2002)
• Interactivity – how much users can interact (Steuer 1992)
• Sensory, Realism factors (Witmer 1998)
55. Five Key Technical Requirements for Presence
• Persistence
• > 90 Hz refresh, < 3 ms persistence, avoid retinal blur
• Optics
• Wide FOV > 90 degrees, comfortable eyebox, good calibration
• Tracking
• 6 DOF, 360 tracking, sub-mm accuracy, no jitter, good tracking volume
• Resolution
• Correct stereo, > 1K x 1K resolution, no visible pixels
• Latency
• < 20 ms latency, fuse optical tracking and IMU, minimize tracking loop
http://www.roadtovr.com/oculus-shares-5-key-ingredients-for-presence-in-virtual-reality/
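The five thresholds above can be treated as a simple checklist. Below is a minimal sketch, assuming a hypothetical spec dictionary (field names are my own), that flags which requirements a headset fails:

```python
# Sketch: checking a hypothetical HMD spec against the five presence
# thresholds listed above. Field names and the example spec are
# illustrative assumptions, not real product data.

PRESENCE_THRESHOLDS = {
    "refresh_hz":     lambda v: v >= 90,    # at least 90 Hz refresh
    "persistence_ms": lambda v: v <= 3,     # < 3 ms persistence
    "fov_deg":        lambda v: v >= 90,    # wide FOV, > 90 degrees
    "resolution_px":  lambda v: v >= 1000,  # > 1K x 1K per eye
    "latency_ms":     lambda v: v <= 20,    # < 20 ms motion-to-photon
}

def check_presence(spec: dict) -> list:
    """Return the spec fields that fail the presence thresholds."""
    return [name for name, ok in PRESENCE_THRESHOLDS.items()
            if not ok(spec[name])]

# Example: a spec resembling an early consumer HMD
hmd = {"refresh_hz": 90, "persistence_ms": 2, "fov_deg": 100,
       "resolution_px": 1080, "latency_ms": 25}
print(check_presence(hmd))  # -> ['latency_ms']
```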
56. Example: UNC Pit Room
• Key Features
• Training room and pit room
• Physical walking
• Fast, accurate, room scale tracking
• Haptic feedback – feel edge of pit, walls
• Strong visual and 3D audio cues
• Task
• Carry object across pit
• Walk across or walk around
• Dropping virtual balls at targets in pit
• http://wwwx.cs.unc.edu/Research/eve/walk_exp/
57. Typical Subject Behaviour
• Note – from another pit experiment
• https://www.youtube.com/watch?v=VVAO0DkoD-8
59. Why do people behave like this?
• Presence can be decomposed into two dimensions (Slater 2009):
• “Place Illusion” (PI): being in the place depicted in the VR environment
• perception in VR matches natural sensorimotor input
• Plausibility Illusion (Psi): the events in the VR environment are actually occurring
• VR environment responds to user actions
• When both PI and Psi are high, people respond realistically to events in the VR
Slater, M. (2009). Place illusion and plausibility can lead to realistic behaviour in immersive virtual
environments. Philosophical Transactions of the Royal Society B: Biological Sciences, 364(1535), 3549-3557.
Presence = PI + Psi + ??
60. Slater, M., Banakou, D., Beacco, A., Gallego, J., Macia-Varela, F., & Oliva, R. (2022). A Separate Reality: An
Update on Place Illusion and Plausibility in Virtual Reality. Frontiers in Virtual Reality, 81.
Four Illusions of Presence (Slater 2022)
• Place Illusion: being in the place
• Plausibility Illusion: events are real
• Body Ownership: seeing your body in VR
• Copresence/Social Presence: other people are in VR
61. Social Presence
• What makes a Person appear real?
• Interactivity
• Visual appearance
• Audio cues
• Touch
• Contextual cues
• Etc..
Oh, C. S., Bailenson, J. N., & Welch, G. F. (2018). A systematic review of social presence:
Definition, antecedents, and implications. Frontiers in Robotics and AI, 5, 114.
62.
63. Object Presence
• What makes an object appear real?
• Touch/Haptic feedback
• Appearance
• Lighting
• Audio cues
• Occlusion
• Etc..
64.
65. Benefits of High Presence
• Leads to greater engagement, excitement and satisfaction
• Increased reaction to actions in VR
• People more likely to behave like in the real world
• E.g. people scared of heights in real world will be scared in VR
• More natural communication (Social Presence)
• Use same cues as face-to-face conversation
• Note: The relationship between Presence and Performance is unclear
66. Measuring Presence
• Presence is very subjective so there is a lot of debate
among researchers about how to measure it
• Subjective Measures
• Self report questionnaire
• University College London Questionnaire (Slater 1999)
• Witmer and Singer Presence Questionnaire (Witmer 1998)
• ITC Sense Of Presence Inventory (Lessiter 2000)
• Continuous measure
• Person moves slider bar in VE depending on Presence felt
• Objective Measures
• Behavioural
• reflex/flinch measure, startle response
• Physiological measures
• change in heart rate, skin conductance, skin temperature
Image: Presence Slider
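A physiological measure can be as simple as comparing heart rate before and during a startling VR event. This sketch uses made-up sample data and an assumed 10 bpm threshold; it is not a validated measure from the lecture:

```python
# Sketch: an objective presence measure from heart-rate data.
# Sample values and the 10 bpm threshold are illustrative assumptions.

def mean(xs):
    return sum(xs) / len(xs)

def startle_response(baseline_bpm, event_bpm, threshold=10.0):
    """Return (delta_bpm, responded?) for a heart-rate startle measure."""
    delta = mean(event_bpm) - mean(baseline_bpm)
    return delta, delta >= threshold

baseline = [68, 70, 69, 71, 70]   # resting in the training room
event    = [82, 85, 88, 86, 84]   # e.g. stepping to the pit edge
delta, responded = startle_response(baseline, event)
print(round(delta, 1), responded)  # 15.4 True
```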
68. Motivation
• Understand: In order to create a strong sense of Presence
we need to understand the Human Perception system
• Stimulate: We need to be able to use technology to provide
real world sensory inputs, and create the VR illusion
VR Hardware Human Senses
69. Senses
• How an organism obtains information for perception:
• Sensation part of Somatic Division of Peripheral Nervous System
• Integration and perception requires the Central Nervous System
• Five major senses (but there are more..):
• Sight (Ophthalmoception)
• Hearing (Audioception)
• Taste (Gustaoception)
• Smell (Olfacoception)
• Touch (Tactioception)
70. Relative Importance of Each Sense
• Percentage of neurons in
brain devoted to each sense
• Sight – 30%
• Touch – 8%
• Hearing – 2%
• Smell - < 1%
• Over 60% of brain involved
with vision in some way
71. Other Lesser-Known Senses..
• Proprioception = sense of body position
• what is your body doing right now
• Equilibrium = balance
• Acceleration
• Nociception = sense of pain
• Temperature
• Satiety = state of being fed or gratified to or beyond capacity
• Thirst
• Micturition = sense of bladder fullness (urge to urinate)
• Chemoreception = levels of CO2 and Na in the blood
73. The Human Visual System
• Purpose is to convert visual input to signals in the brain
74. The Human Eye
• Light passes through cornea and lens onto retina
• Photoreceptors in retina convert light into electrochemical signals
75. Photoreceptors – Rods and Cones
• Retina photoreceptors come in two types, Rods and Cones
• Rods – 125 million, periphery of retina, no colour detection, night vision
• Cones – 4-6 million, center of retina, colour vision, day vision
76. Human Horizontal and Vertical FOV
• Humans can see ~135° vertical (60° above, 75° below)
• See up to ~210° horizontal FOV, ~115° stereo overlap
• Colour/stereo in centre, black and white/mono in periphery
79. Vergence-Accommodation Conflict
• Looking at real objects, vergence and focal distance match
• In VR, vergence and accommodation can mismatch
• Eyes accommodate (focus) on the HMD screen, but converge on a virtual object rendered at a different depth
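The mismatch can be quantified in diopters (1 / distance in metres). A minimal sketch; the ~0.5 D comfort limit is an assumption drawn from the stereo-display literature, not a figure from this lecture:

```python
# Sketch: vergence-accommodation conflict in diopters.
# Comfort-zone value (~0.5 D) is an assumed literature figure.

def va_conflict_diopters(focal_dist_m, vergence_dist_m):
    """Mismatch between where the eyes focus and where they converge."""
    return abs(1.0 / focal_dist_m - 1.0 / vergence_dist_m)

# HMD screen focused at 2 m, virtual object rendered at 0.5 m:
conflict = va_conflict_diopters(2.0, 0.5)
print(conflict)  # 1.5 D -- well outside a ~0.5 D comfort zone
```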
80. Visual Acuity
Visual Acuity Test Targets
• Ability to resolve details
• Several types of visual acuity
• detection, separation, etc
• Normal eyesight can see a 50 cent coin at 80m
• Corresponds to 1 arc min (1/60th of a degree)
• Max acuity = 0.4 arc min
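One-arc-minute acuity translates directly into a display requirement: a 20/20 eye resolves about 60 pixels per degree, so the horizontal resolution needed to look "retinal" is roughly FOV × 60. A small sketch; the 110° FOV is an illustrative assumption:

```python
# Sketch: what 1-arc-minute acuity implies for HMD resolution.
# 60 px/degree follows from 1 arc min = 1/60 degree; FOV is assumed.

ACUITY_PPD = 60  # pixels per degree for 1 arc min acuity

def required_pixels(fov_deg, ppd=ACUITY_PPD):
    """Horizontal pixels per eye to match a given acuity across the FOV."""
    return fov_deg * ppd

print(required_pixels(110))  # 6600 px across a 110-degree FOV
```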
81. Stereo Perception/Stereopsis
• Eyes separated by IPD
• Inter pupillary distance
• 5 – 7.5cm (avg. 6.5cm)
• Each eye sees diff. image
• Separated by image parallax
• Images fused to create 3D
stereo view
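The vergence angle between the two eyes' lines of sight follows from the IPD and object distance using simple triangle geometry. A sketch using the slide's average IPD of 6.5 cm (no eye-rotation model, object assumed straight ahead):

```python
import math

# Sketch: vergence angle for an object straight ahead at distance d.
# Uses the standard two-eye triangle; a simplification, not a full
# oculomotor model.

def vergence_angle_deg(ipd_m, dist_m):
    """Angle between the eyes' gaze directions, in degrees."""
    return math.degrees(2 * math.atan((ipd_m / 2) / dist_m))

print(round(vergence_angle_deg(0.065, 0.5), 2))  # near object: larger angle
print(round(vergence_angle_deg(0.065, 5.0), 2))  # far object: smaller angle
```

Nearby objects produce much larger vergence angles, which is why stereopsis is a strong depth cue at close range but weak beyond a few metres.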
82.
83. Depth Perception
• The visual system uses a range of different Stereoscopic
and Monocular cues for depth perception
• Stereoscopic cues:
• eye convergence angle
• disparity between left and right images
• diplopia
• Monocular cues:
• eye accommodation
• perspective
• atmospheric artifacts (fog)
• relative sizes
• image blur
• occlusion
• motion parallax
• shadows
• texture
Parallax can be more important for depth perception!
Stereoscopy is important for size and distance evaluation
86. Properties of the Human Visual System
• visual acuity: 20/20 is ~1 arc min
• field of view: ~200° monocular, ~120° binocular, ~135° vertical
• resolution of eye: ~576 megapixels
• temporal resolution: ~60 Hz (depends on contrast, luminance)
• dynamic range: instantaneous 6.5 f-stops, adapt to 46.5 f-stops
• colour: everything in CIE xy diagram
• depth cues in 3D displays: vergence, focus, (dis)comfort
• accommodation range: ~8cm to ∞, degrades with age
87. Creating the Perfect Illusion
Cuervo, E., Chintalapudi, K., & Kotaru, M. (2018,
February). Creating the perfect illusion: What will it
take to create life-like virtual reality headsets?.
In Proceedings of the 19th International Workshop on
Mobile Computing Systems & Applications (pp. 7-12).
• Technology to create life-like VR HMDs
• Compared to current HMDs
• 6 − 10× higher pixel density
• 20 − 30× higher frame rate
95. HRTF (Elevation Cue)
• Pinna and head shape affect frequency intensities
• Sound intensities measured with microphones in ear and
compared to intensities at sound source
• Difference is HRTF, gives clue as to sound source location
96. Accuracy of Sound Localization
• People can locate sound
• Most accurately in front of them
• 2-3° error in front of head
• Least accurately to sides and behind head
• Up to 20° error to side of head
• Largest errors occur above/below elevations and behind head
• Front/back confusion is an issue
• Up to 10% of sounds presented in the front are perceived
coming from behind and vice versa (more in headphones)
Butean, A., Bălan, O., Negoi, I., Moldoveanu, F., & Moldoveanu, A. (2015). Comparative research on sound localization
accuracy in the free-field and virtual auditory displays. In Conference Proceedings of eLearning and Software for
Education (eLSE) (No. 01, pp. 540-548). Universitatea Nationala de Aparare Carol I.
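Lateral localization rests largely on the interaural time difference (ITD). A common approximation is Woodworth's spherical-head model, ITD = (r/c)(θ + sin θ); the 8.75 cm head radius is an assumed average, not a figure from the slide:

```python
import math

# Sketch: interaural time difference via Woodworth's spherical-head
# approximation. Head radius is an assumed population average.

HEAD_RADIUS_M = 0.0875
SPEED_OF_SOUND = 343.0  # m/s

def itd_seconds(azimuth_deg):
    """ITD for a distant source at the given azimuth (0 = straight ahead)."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + math.sin(theta))

print(round(itd_seconds(90) * 1e6))  # source at the side: max ITD, ~656 us
```

The ITD is zero straight ahead and maximal to the side, consistent with localization being most accurate in front of the head.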
98. Haptic Sensation
• Somatosensory System
• complex system of nerve cells that responds to
changes to the surface or internal state of the body
• Skin is the largest organ
• 1.3-1.7 square m in adults
• Tactile: Surface properties
• Receptors not evenly spread
• Most densely populated area is the tongue
• Kinesthetic: Muscles, Tendons, etc.
• Also known as proprioception
99. Cutaneous System
• Skin – heaviest organ in the body
• Epidermis outer layer, dead skin cells
• Dermis inner layer, with four kinds of mechanoreceptors
100. Mechanoreceptors
• Cells that respond to pressure, stretching, and vibration
• Slow Acting (SA), Rapidly Acting (RA)
• Type I at surface – light discriminate touch
• Type II deep in dermis – heavy and continuous touch
Receptor | Type (Rate of Acting) | Stimulus Frequency | Receptive Field | Detection Function
Merkel discs | SA-I | 0 – 10 Hz | Small, well defined | Edges, intensity
Ruffini corpuscles | SA-II | 0 – 10 Hz | Large, indistinct | Static force, skin stretch
Meissner corpuscles | RA-I | 20 – 50 Hz | Small, well defined | Velocity, edges
Pacinian corpuscles | RA-II | 100 – 300 Hz | Large, indistinct | Acceleration, vibration
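For haptic design, the frequency bands in the table suggest which receptor a vibrotactile actuator mainly drives. A coarse lookup sketch; real receptor responses overlap, so this is a design guide, not a physiological model:

```python
# Sketch: mapping vibration frequency to the mechanoreceptor band
# from the table above. Band edges come from the table; gaps between
# bands return None.

RECEPTOR_BANDS = [
    (0, 10, "Merkel discs / Ruffini corpuscles (SA-I / SA-II)"),
    (20, 50, "Meissner corpuscles (RA-I)"),
    (100, 300, "Pacinian corpuscles (RA-II)"),
]

def receptor_for(freq_hz):
    """Return the receptor band covering a vibration frequency, if any."""
    for lo, hi, name in RECEPTOR_BANDS:
        if lo <= freq_hz <= hi:
            return name
    return None

print(receptor_for(250))  # a typical LRA motor frequency -> Pacinian band
```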
101. Spatial Resolution
• Sensitivity varies greatly
• Two-point discrimination
Body Site | Threshold Distance
Finger | 2-3mm
Cheek | 6mm
Nose | 7mm
Palm | 10mm
Forehead | 15mm
Foot | 20mm
Belly | 30mm
Forearm | 35mm
Upper Arm | 39mm
Back | 39mm
Shoulder | 41mm
Thigh | 42mm
Calf | 45mm
http://faculty.washington.edu/chudler/chsense.html
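These thresholds matter when placing multiple tactile actuators on the body: contacts closer together than the site's threshold feel like a single point. A sketch using the table's values (the finger's 2-3 mm threshold is taken as 2.5 mm, an assumption):

```python
# Sketch: will two tactile contacts on a body site be felt as separate
# points? Thresholds (mm) are from the two-point discrimination table;
# the finger value is an assumed midpoint of the 2-3 mm range.

TWO_POINT_MM = {
    "finger": 2.5, "cheek": 6, "nose": 7, "palm": 10, "forehead": 15,
    "foot": 20, "belly": 30, "forearm": 35, "upper arm": 39,
    "back": 39, "shoulder": 41, "thigh": 42, "calf": 45,
}

def distinguishable(site, spacing_mm):
    """True if two contacts spacing_mm apart exceed the site's threshold."""
    return spacing_mm >= TWO_POINT_MM[site]

print(distinguishable("palm", 12))     # True: 12 mm > 10 mm threshold
print(distinguishable("forearm", 12))  # False: below 35 mm threshold
```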
102. Proprioception/Kinaesthesia
• Proprioception (joint position sense)
• Awareness of movement and positions of body parts
• Due to nerve endings and Pacinian and Ruffini corpuscles at joints
• Enables us to touch nose with eyes closed
• Joints closer to body more accurately sensed
• Users can sense hand position to within ~8cm without looking
• Kinaesthesia (joint movement sense)
• Sensing muscle contraction or stretching
• Cutaneous mechanoreceptors measuring skin stretching
• Helps with force sensation
104. Augmented Reality Definition
•Combines Real and Virtual Images
•Both can be seen at the same time
•Interactive in real-time
•The virtual content can be interacted with
•Registered in 3D
•Virtual objects appear fixed in space
105. Augmented Reality technology
•Combines Real and Virtual Images
•Needs: Display technology
•Interactive in real-time
•Needs: Input and interaction technology
•Registered in 3D
•Needs: Viewpoint tracking technology
106. Example: MagicLeap ML-1 AR Display
•Display
• Multi-layered Waveguide display
•Tracking
• Inside out SLAM tracking
•Input
• 6DOF wand, gesture input
107. MagicLeap Display
• Optical see through AR display
• Overlay graphics directly on real world
• 40° x 30° FOV, 1280 x 960 pixels/eye
• Waveguide based display
• Holographic optical element
• Very thin physical display
• Two sets of waveguides
• Different focal planes
• Overcomes vergence/accommodation problem
• Eye tracking for selecting focal plane
• Separate CPU/GPU unit
108. AR Vergence and Accommodation
• Fixed focal distance for OST displays
• Accommodation conflict between real and virtual object
109.
110. Tracking
• Inside out tracking
• Sensors on the user’s head
• Using multiple sensors
• Time of Flight Depth Sensor
• IR dot projector
• Wide angle cameras
• Internal accelerometer (IMU)
• Creates 3D model of real world
• Tracks from model
135. Pros and Cons of Optical See-Through AR
• Pros
• Simpler design (cheaper)
• Direct view of real world
• No eye displacement
• Socially acceptable (glasses form factor)
• Cons
• Difficult to occlude real world
• Image washout outdoors/bright lights
• Wide field of view challenging
• Can’t delay the real world
142. Pros and Cons of Video See-Through AR
• Pros
• True occlusion
• Digitized image of real world
• Registration, calibration, matchable time delay
• Wide FOV is easier to support
• Cons
• Larger, bulkier hardware
• Can’t see real world with natural eyes
173. Olfactory Display
MetaCookie: An olfactory display is
combined with visual augmentation of a
plain cookie to provide the illusion of a
flavored cookie (chocolate, in the inset).
Image: Takuji Narumi