Natural Interfaces for Augmented Reality
1. Natural Interfaces for
Augmented Reality
Mark Billinghurst
HIT Lab NZ
University of Canterbury
3. Augmented Reality Definition
Defining Characteristics [Azuma 97]
Combines Real and Virtual Images
- Both can be seen at the same time
Interactive in real-time
- The virtual content can be interacted with
Registered in 3D
- Virtual objects appear fixed in space
4. AR Today
Most widely used AR is mobile or web based
Mobile AR
Outdoor AR (GPS + compass)
- Layar (10 million+ users), Junaio, etc
Indoor AR (image based tracking)
- QCAR, String etc
Web based (Flash)
Flartoolkit marker tracking
Markerless tracking
5. AR Interaction
You can see spatially registered AR...
how can you interact with it?
6. AR Interaction Today
Mostly simple interaction
Mobile
Outdoor (Junaio, Layar, Wikitude, etc)
- Viewing information in place, touch virtual tags
Indoor (Invizimals, Qualcomm demos)
- Change viewpoint, screen based (touch screen)
Web based
Change viewpoint, screen interaction (mouse)
8. 1. AR Information Viewing
Information is registered to
real-world context
Hand held AR displays
Interaction
Manipulation of a window
into information space
2D/3D virtual viewpoint control
Applications
Context-aware information displays
Examples: NaviCam (Rekimoto, et al. 1997), Cameleon, etc.
9. Current AR Information Browsers
Mobile AR
GPS + compass
Many Applications
Layar
Wikitude
Acrossair
PressLite
Yelp
AR Car Finder
…
10. 2. 3D AR Interfaces
Virtual objects displayed in 3D
physical space and manipulated
HMDs and 6DOF head-tracking
6DOF hand trackers for input
Interaction
Viewpoint control
Traditional 3D UI interaction: manipulation, selection, etc.
(Kiyokawa, et al. 2000)
Requires custom input devices
12. 3. Augmented Surfaces and
Tangible Interfaces
Basic principles
Virtual objects are projected
on a surface
Physical objects are used as
controls for virtual objects
Support for collaboration
15. Tangible User Interfaces (Ishii 97)
Create digital shadows
for physical objects
Foreground
graspable UI
Background
ambient interfaces
16. Tangible Interface: ARgroove
Collaborative Instrument
Exploring Physically Based Interaction
Move and track physical record
Map physical actions to MIDI output
- Translation, rotation
- Tilt, shake
Limitation
AR output shown on screen
Separation between input and output
18. Lessons from Tangible Interfaces
Benefits
Physical objects make us smart (affordances, constraints)
Objects aid collaboration (shared meaning)
Objects increase understanding (cognitive artifacts)
Limitations
Difficult to change object properties
Limited display capabilities (project onto surface)
Separation between object and display
19. 4: Tangible AR
AR overcomes limitation of TUIs
enhance display possibilities
merge task/display space
provide public and private views
TUI + AR = Tangible AR
Apply TUI methods to AR interface design
20. Example Tangible AR Applications
Use of natural physical object manipulations to control virtual objects
LevelHead (Oliver)
Physical cubes become rooms
VOMAR (Kato 2000)
Furniture catalog book:
- Turn over the page to see new models
Paddle interaction:
- Push, shake, incline, hit, scoop
22. Evolution of AR Interaction
1. Information Viewing Interfaces
simple (conceptually!), unobtrusive
2. 3D AR Interfaces
expressive, creative, require attention
3. Tangible Interfaces
embedded into conventional environments
4. Tangible AR
combines TUI input + AR display
23. Limitations
Typical limitations
Simple/No interaction (viewpoint control)
Require custom devices
Single mode interaction
2D input for 3D (screen based interaction)
No understanding of real world
Explicit vs. implicit interaction
Unintelligent interfaces (no learning)
29. AR MicroMachines
AR experience with environment awareness
and physically-based interaction
Based on MS Kinect RGB-D sensor
Augmented environment supports
occlusion, shadows
physically-based interaction between real and
virtual objects
32. System Flow
The system flow consists of three sections:
Image Processing and Marker Tracking
Physics Simulation
Rendering
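The three-stage flow above can be sketched as a simple per-frame loop. This is an illustrative sketch only: the stage functions and their signatures are hypothetical placeholders, not the actual system's code.

```python
# Hypothetical sketch of the per-frame system flow described above:
# image processing / marker tracking -> physics simulation -> rendering.
# All function names and data shapes are invented placeholders.

def track_markers(image):
    # Placeholder: a real marker tracker returns a 6DOF camera pose.
    return {"position": (0, 0, 0), "rotation": (0, 0, 0)}

def step_physics(world, dt):
    # Placeholder: apply simple gravity to each object's height,
    # clamping at the (flat) ground plane.
    return [{**obj, "z": max(0.0, obj["z"] - 9.8 * dt * dt)} for obj in world]

def render(image, world, pose):
    # Placeholder: report what would be composited over the video frame.
    return f"{len(world)} virtual objects over camera frame at pose {pose['position']}"

def process_frame(raw_image, physics_world):
    # 1. Image processing and marker tracking: estimate the camera pose.
    camera_pose = track_markers(raw_image)
    # 2. Physics simulation: step the world so virtual objects react.
    physics_world = step_physics(physics_world, dt=1 / 30)
    # 3. Rendering: camera image as background, virtual objects on top.
    return render(raw_image, physics_world, camera_pose)
```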
33. Physics Simulation
Create virtual mesh over real world
Update at 10 fps – can move real objects
Used by physics engine for collision detection (virtual/real)
Used by OpenSceneGraph for occlusion and shadows
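As a rough illustration of the mesh idea above, a depth image can be turned into a height-field grid that a physics engine treats as static collision geometry for the real world. This pure-Python sketch is an invented simplification, not the system's actual code (which used a full physics engine and OpenSceneGraph).

```python
# Illustrative sketch: build a height-field "virtual mesh over the real
# world" from per-pixel depth samples, then do a coarse collision test
# between a virtual object (sphere proxy) and the real terrain.

def depth_to_heightfield(depth, width, height):
    """Turn a flat list of per-pixel depth samples into a grid of
    (x, y, z) vertices usable as a static collision mesh."""
    return [[(x, y, depth[y * width + x]) for x in range(width)]
            for y in range(height)]

def sphere_hits_terrain(center, radius, grid):
    """Very coarse test: does a sphere proxy for a virtual object
    touch the height field at its (x, y) cell?"""
    x, y, z = center
    terrain_z = grid[int(y)][int(x)][2]
    return z - radius <= terrain_z
```

In the real system this grid would be rebuilt from the Kinect at about 10 fps, so moving a real object moves the collision mesh under the virtual ones.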
36. Motivation
AR MicroMachines and PhobiAR
• Treated the environment as static – no tracking
• Tracked objects in 2D
More realistic interaction requires 3D gesture tracking
37. Motivation
Occlusion Issues
AR MicroMachines only achieved realistic occlusion because the user’s viewpoint matched the Kinect’s
Proper occlusion requires a more complete model of scene objects
39. HITLabNZ’s Gesture Library
Architecture
o Supports PCL, OpenNI, OpenCV, and Kinect SDK.
o Provides access to depth, RGB, XYZRGB.
o Usage: Capturing color image, depth image and concatenated point clouds from a single or multiple cameras
o For example:
Kinect for Xbox 360
Kinect for Windows
Asus Xtion Pro Live
40. HITLabNZ’s Gesture Library
Architecture
o Segment images and point clouds based on color, depth and space.
o Usage: Segmenting images or point clouds using color models, depth, or spatial properties such as location, shape and size.
o For example:
Skin color segmentation
Depth threshold
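The two segmentation examples above can be illustrated with a minimal sketch. Real implementations would use OpenCV or PCL; this pure-Python version, with an invented skin-color rule and thresholds, just shows the idea.

```python
# Illustrative segmentation sketches for the two examples above.
# The skin-color rule and thresholds are assumptions for illustration;
# a production system would train a proper color model.

def depth_threshold(depth_pixels, near, far):
    """Keep only pixels whose depth lies inside [near, far],
    e.g. isolating a hand held above a tabletop."""
    return [1 if near <= d <= far else 0 for d in depth_pixels]

def skin_mask(rgb_pixels):
    """Crude RGB skin-color segmentation: mark pixels whose color
    falls inside a hand-tuned skin range."""
    def is_skin(r, g, b):
        return r > 95 and g > 40 and b > 20 and r > g and r > b
    return [1 if is_skin(r, g, b) else 0 for (r, g, b) in rgb_pixels]
```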
41. HITLabNZ’s Gesture Library
Architecture
o Identify and track objects between frames based on XYZRGB.
o Usage: Identifying current position/orientation of the tracked object in space.
o For example:
Training set of hand poses; colors represent unique regions of the hand.
Raw output (without cleaning) classified on real hand input (depth image).
42. HITLabNZ’s Gesture Library
Architecture
o Hand Recognition/Modeling
Skeleton based (for low resolution approximation)
Model based (for more accurate representation)
o Object Modeling (identification and tracking rigid-body objects)
o Physical Modeling (physical interaction)
Sphere Proxy
Model based
Mesh based
o Usage: For general spatial interaction in AR/VR environment
46. HITLabNZ’s Gesture Library
Architecture
o Static (hand pose recognition)
o Dynamic (meaningful movement recognition)
o Context-based gesture recognition (gestures with context, e.g. pointing)
o Usage: Issuing commands/anticipating user intention and high level interaction.
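The static/dynamic split above can be shown with a toy sketch: a static pose is classified from a single frame, while a dynamic gesture needs a window of frames. The features and thresholds here are invented for illustration, not the library's actual classifiers.

```python
# Toy sketch of static vs. dynamic gesture recognition as described
# above. Features and thresholds are invented for illustration.

def classify_static(finger_count):
    """Static pose recognition: map an extended-finger count from a
    single frame to a hand pose label."""
    return {0: "fist", 1: "point", 5: "open"}.get(finger_count, "unknown")

def classify_dynamic(positions):
    """Dynamic gesture recognition: label a movement as a 'swipe' if
    the hand travels far enough across a window of frames."""
    travel = abs(positions[-1] - positions[0])
    return "swipe" if travel > 0.3 else "hold"
```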
48. Multimodal Interaction
Combined speech input
Gesture and Speech complementary
Speech
- modal commands, quantities
Gesture
- selection, motion, qualities
Previous work found multimodal interfaces intuitive for 2D/3D graphics interaction
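The complementary split above (speech for commands and quantities, gesture for selection and motion) can be sketched as a small fusion step. The field names are hypothetical, not taken from any real system.

```python
# Hypothetical sketch of speech/gesture fusion: speech supplies the
# command verb, the gesture resolves deictic references like "this".

def fuse(speech, gesture):
    """Combine a recognized speech command with a pointing gesture.
    If the utterance was deictic ("copy this"), take the target from
    whatever object the pointing ray hit; otherwise use the object
    named in speech."""
    target = gesture["pointed_object"] if speech.get("deictic") else speech.get("object")
    return {"action": speech["command"], "target": target}
```

For example, saying "move this" while pointing at a chair yields a move action on the chair.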
49. 1. Marker Based Multimodal Interface
Add speech recognition to VOMAR
Paddle + speech commands
51. Commands Recognized
Create Command "Make a blue chair": to create a virtual object and place it on the paddle.
Duplicate Command "Copy this": to duplicate a virtual object and place it on the paddle.
Grab Command "Grab table": to select a virtual object and place it on the paddle.
Place Command "Place here": to place the attached object in the workspace.
Move Command "Move the couch": to attach a virtual object in the workspace to the paddle so that it follows the paddle movement.
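The five commands above could be dispatched by their leading verb, as in this sketch. The phrases come from the slide; the parser itself is a hypothetical illustration, not the system's actual speech grammar.

```python
# Hypothetical verb-based dispatch for the five VOMAR speech commands.
COMMANDS = {
    "make": "create",     # "Make a blue chair"
    "copy": "duplicate",  # "Copy this"
    "grab": "grab",       # "Grab table"
    "place": "place",     # "Place here"
    "move": "move",       # "Move the couch"
}

def parse_command(utterance):
    """Map a recognized utterance to one of the five paddle commands
    by its leading verb; return None for unknown verbs."""
    verb = utterance.lower().split()[0]
    return COMMANDS.get(verb)
```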
62. Results
Average performance time (MMI, speech fastest)
Gesture: 15.44s
Speech: 12.38s
Multimodal: 11.78s
No difference in user errors
User subjective survey
Q1: How natural was it to manipulate the object?
- MMI, speech significantly better
70% preferred MMI, 25% speech only, 5% gesture only
67. Intelligent Interfaces
Most AR systems are "stupid":
Don’t recognize user behaviour
Don’t provide feedback
Don’t adapt to the user
Especially important for training
Scaffolded learning
Moving beyond check-lists of actions
68. Intelligent Interfaces
AR interface + intelligent tutoring system
ASPIRE constraint-based system (from UC)
Constraints
- relevance cond., satisfaction cond., feedback
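In a constraint-based tutor of this kind, each constraint pairs a relevance condition (does this constraint apply to the current solution state?) with a satisfaction condition, plus feedback shown when a relevant constraint is violated. A minimal sketch of that idea, with a made-up solution state rather than ASPIRE's actual representation:

```cpp
#include <functional>
#include <string>
#include <vector>

// Hypothetical task state for an assembly-training scenario.
struct State { bool partSelected; bool correctPart; bool toolAttached; };

// A constraint in the ASPIRE style: relevance, satisfaction, feedback.
struct Constraint {
    std::function<bool(const State&)> relevant;
    std::function<bool(const State&)> satisfied;
    std::string feedback;
};

// Collect feedback for every relevant-but-violated constraint.
std::vector<std::string> checkConstraints(const State& s,
                                          const std::vector<Constraint>& cs) {
    std::vector<std::string> messages;
    for (const auto& c : cs)
        if (c.relevant(s) && !c.satisfied(s)) messages.push_back(c.feedback);
    return messages;
}
```

The key property is that irrelevant constraints are simply ignored, so the tutor can evaluate any partial solution, not just a fixed checklist of steps.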
72. Evaluation Results
16 subjects, with and without ITS
Improved task completion
Improved learning
73. Intelligent Agents
AR characters
Virtual embodiment of system
Multimodal input/output
Examples
AR Lego, Welbo, etc
Mr Virtuoso
- AR character more real, more fun
- On-screen 3D and AR similar in usefulness
75. Conclusions
AR traditionally involves tangible interaction
New technologies support natural interaction
Environment capture
Natural gestures
Multimodal interaction
Opportunities for future research
Mobile, intelligent systems, characters
76. More Information
• Mark Billinghurst
– mark.billinghurst@hitlabnz.org
• Website
– http://www.hitlabnz.org/
Editor's Notes
- To create an interaction volume, the Kinect is positioned above the desired interaction space facing downwards.
- A reference marker is placed in the interaction space to calculate the transform between the Kinect coordinate system and the coordinate system used by the AR viewing camera.
- Users can also wear color markers on their fingers for pre-defined gesture interaction.
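The reference-marker calibration mentioned above amounts to composing rigid transforms: if both the AR camera and the Kinect observe the same marker, then T_cam←kinect = T_cam←marker · (T_kinect←marker)⁻¹. A minimal sketch with 4x4 rigid transforms (the matrix helpers are kept deliberately small; a real system would use a linear-algebra library):

```cpp
#include <array>

using Mat4 = std::array<std::array<float, 4>, 4>;

Mat4 multiply(const Mat4& a, const Mat4& b) {
    Mat4 r{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            for (int k = 0; k < 4; ++k) r[i][j] += a[i][k] * b[k][j];
    return r;
}

// Invert a rigid transform [R|t]: the inverse is [R^T | -R^T t].
Mat4 invertRigid(const Mat4& m) {
    Mat4 r{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j) r[i][j] = m[j][i];
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j) r[i][3] -= m[j][i] * m[j][3];
    r[3][3] = 1.0f;
    return r;
}

Mat4 translation(float x, float y, float z) {
    Mat4 m{};
    for (int i = 0; i < 4; ++i) m[i][i] = 1.0f;
    m[0][3] = x; m[1][3] = y; m[2][3] = z;
    return m;
}

// Both devices see the same marker, so chaining through it gives the
// Kinect-to-camera transform used to align the two coordinate systems.
Mat4 kinectToCamera(const Mat4& camFromMarker, const Mat4& kinectFromMarker) {
    return multiply(camFromMarker, invertRigid(kinectFromMarker));
}
```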
- The OpenSceneGraph framework is used for rendering. The input video image is rendered as the background, with all the virtual objects rendered on top.
- At the top level of the scene graph, the viewing transformation is applied such that all virtual objects are transformed so as to appear attached to the real world.
- The trimesh is rendered as an array of quads, with an alpha value of zero. This allows realistic occlusion effects of the terrain and virtual objects, while not affecting the users’ view of the real environment.
- A custom fragment shader was written to allow rendering of shadows to the invisible terrain.
Appearance-based interaction has been used at the Lab before, both in AR Micromachines and PhobiAR. Flaws in these applications have motivated my work on advanced tracking and modeling. AR Micromachines did not allow for dynamic interaction: a car could be picked up, but because the motion of the hand was not known, friction could not be simulated between the car and the hand. PhobiAR introduced tracking for dynamic interaction, but it really only tracked objects in 2D. I'll show you what I mean: as soon as the hand is flipped, the tracking fails and the illusion of realistic interaction is broken. 3D tracking was required to make the interaction in both of these applications more realistic.
Another issue with typical AR applications is the handling of occlusion. The Kinect allows a model of the environment to be developed, which can help in determining whether a real object is in front of a virtual one. Micromachines had good success by assuming a situation such as that shown on the right, with all objects in the scene in contact with the ground. This was a fair assumption when most of the objects were books etc. However, in PhobiAR the user’s hands were often above the ground, more like the scene on the left. The thing to notice is that these two scenes are indistinguishable from the Kinect’s point of view, but completely different from the observer’s point of view. The main problem is that we don’t know enough about the shape of real-world objects to handle occlusion properly. My work aims to model real-world objects by combining views of the objects across multiple frames, allowing better occlusion.
The gesture library will provide a C++ API for real-time recognition and tracking of hands and rigid-body objects in 3D environments. The library will support usage of single and multiple depth sensing cameras. Collision detection and physics simulation will be integrated for realistic physical interaction. Finally, learning algorithms will be implemented for recognizing hand gestures.
The library will support usage of single and multiple depth sensing cameras. Aim for general consumer hardware.
Interaction between real objects and the virtual balls was achieved by representing objects as collections of spheres. The location of the spheres was determined by the modeling stage while their motion was found during tracking. I used the Bullet physics engine for physics simulation.
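The sphere-proxy idea described above reduces contact queries to sphere-sphere tests. The actual system fed these proxies into the Bullet physics engine; the sketch below only illustrates the representation itself with a hand-rolled overlap check, using made-up sizes:

```cpp
#include <array>
#include <vector>

// A real object (e.g. a hand) is approximated by a collection of spheres
// whose positions come from the modeling stage and whose motion comes
// from tracking.
struct Sphere { std::array<float, 3> centre; float radius; };

// Two spheres overlap when the distance between centres is at most the
// sum of the radii; comparing squared distances avoids a square root.
bool overlaps(const Sphere& a, const Sphere& b) {
    float d2 = 0.0f;
    for (int i = 0; i < 3; ++i) {
        const float d = a.centre[i] - b.centre[i];
        d2 += d * d;
    }
    const float r = a.radius + b.radius;
    return d2 <= r * r;
}

// Does any proxy sphere of the real object touch a virtual ball?
bool proxyCollides(const std::vector<Sphere>& proxy, const Sphere& ball) {
    for (const auto& s : proxy)
        if (overlaps(s, ball)) return true;
    return false;
}
```

In Bullet, each proxy sphere would instead become a collision shape on a kinematic body, so the engine also resolves the contact forces rather than just detecting overlap.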
The AR scene was rendered using OpenSceneGraph. Because the Kinect’s viewpoint was also the user’s viewpoint, realistic occlusion was possible using the Kinect’s depth data. I did not have time to experiment with using the object models to improve occlusion from other viewpoints. Also, the addition of shadows could have significantly improved the realism of the application.