5. Augmented reality (AR) is used to enhance the real
environment by combining real & virtual elements.
ARART, an AR app that brings paintings to life.
6. 1. It has to combine real and virtual elements.
2. Registered in 3-D
(accurate alignment of virtual objects in the real world)
3. Interactive in real time.
Augmented football field.
Main Characteristics of AR (Azuma)
7. • On any device with a display that can use
computer vision to overlay digital information.
• The widespread use of smart mobile devices makes
visual AR the most common form today.
Ikea Augmented Reality catalogue. M&S Augmented Reality app.
Where can AR be used?
8. Augmented Reality can remove (hide) real elements.
AR occluding real elements
Where can AR be used?
10. Why should we augment Reality?
• Meets people’s need to interact with information
without distraction
• Applies readily to many areas:
Manufacturing
Navigation
Tourism
Architecture
Entertainment
Medical…
11. Timeline: 1968 · 1992 · 1997 · 1999 · 2004 · 2004–2010
(milestone devices along the way: camera phone, GSM/WiFi, smartphone)
History of Mobile AR
14. • 1st Mobile AR System
• See-through head-worn display with
integral orientation tracker
• Backpack-mounted computer & hand-held
computer
(1997)
1st Mobile AR System
15. • A mobile, multi-user AR system
• Collaborate in augmented
shared space
(2001)
Multi-User AR
17. • A system for tracking 3D markers
on a mobile phone
• 1st Video see-through AR system
on consumer cell phones
(2004)
Mobile Phone AR
18. • 1st multi-user AR application for PDAs
• video see-through
(2004)
PDA AR
19. ARhrrrr!
• A mobile AR game with high quality content
(2009)
Better Graphics…Better AR!
20. • PTAM system running in real-time on an iPhone
(2009)
Markerless Mobile AR - PTAM
30. Optical See-Through HMD
User sees directly through display
Direct view of the world
Full resolution, no time delay
- Safer
- Lower distortion
- No eye displacement
33. Virtual Retinal Displays
• Projects light directly into eyes
• Smaller size (no intermediate screen present)
• Highly portable
• Less power
Everything we see is reflected light!
34. …but how can we interact with them?
We can see virtual objects registered in space...
36. Natural Gestures
Depth Sensors for
Gesture Capture. Tracking the user's hand(s)
can provide a 6DOF
interaction technique.
37. Thermal Touch
Thermal Touch: Thermography-Enabled Everywhere Touch Interfaces
for Mobile Augmented Reality Applications Kurz, ISMAR 2014
38. Tracking
Recover position and orientation of viewpoint at each frame.
Past:
• Location based (GPS)
• Marker based
• Magnetic, etc.
Currently:
• Image based
• Hybrid tracking
Future:
• Model based
• Environmental
39. An ideal tracking system should have:
• perfect instantaneous 6-DOF measurements of the
sensor pose in any environment and under any
motion;
• robustness to observation from different
viewing angles;
• robustness to changing lighting conditions;
• low cost, speed, and reliability.
Optimal Tracking System
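The 6-DOF pose above (3 rotational plus 3 translational degrees of freedom) is commonly represented as a 4×4 homogeneous transform. A minimal sketch in Python with numpy; all names and values are illustrative, not from any specific AR system:

```python
import numpy as np

def rot_z(theta):
    """Rotation about the z-axis (one of the 3 rotational DOF)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def compose(T_ab, T_bc):
    """Chain two poses, e.g. world->camera followed by camera->object."""
    return T_ab @ T_bc

# Example pose: rotated 90 degrees about z, translated 1 m along x.
T = make_pose(rot_z(np.pi / 2), [1.0, 0.0, 0.0])
p_world = T @ np.array([1.0, 0.0, 0.0, 1.0])  # transform a homogeneous point
```

A tracker's job is to estimate such a transform for the camera at every frame; any error in it shows up directly as misregistration of the virtual content.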
43. • 3D tracking techniques that track natural features.
• They extract features from the surrounding environment
(e.g. corners, edges, textures).
• Markerless AR systems use natural features instead
of fiducial markers in order to perform tracking.
Main techniques used are:
• Template based tracking.
• Edge based tracking.
• Tracking by object detection.
Markerless Tracking
44. - Also called Texture Tracking.
- Looks for features in images.
With enough features it
can be highly accurate.
• Sensitive to fast camera
movements.
• Sensitive to light changes.
• Sensitive to occlusion.
Multi-Texture Tracking AR
Template-based Tracking
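Template-based (texture) tracking searches an image for the location that best matches a stored template. A toy sketch of the idea using a brute-force sum-of-squared-differences search; real systems use image pyramids and normalized correlation, and everything here is illustrative:

```python
import numpy as np

def match_template(image, template):
    """Brute-force template search: return the (row, col) offset where the
    sum of squared differences between image patch and template is smallest."""
    ih, iw = image.shape
    th, tw = template.shape
    best_ssd, best_pos = np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            ssd = np.sum((patch - template) ** 2)
            if ssd < best_ssd:
                best_ssd, best_pos = ssd, (r, c)
    return best_pos

# Synthetic example: plant a distinctive 2x2 patch in a blank image.
tpl = np.array([[9.0, 8.0], [7.0, 6.0]])
img = np.zeros((8, 8))
img[3:5, 5:7] = tpl
```

This also shows why the method is sensitive to light changes and occlusion: both alter the pixel values inside the patch and inflate the difference score.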
45. • It requires depth cameras.
• Sensitive to fast camera
movements.
• Struggles with cluttered backgrounds.
• Tracking suffers from jitter.
3D Textureless Tracking - AR
Edge-based Tracking
• Tracking might be restricted to a limited range of poses.
• Stable tracking under fast motion.
Object Recognition - AR
Object-based Tracking
48. • Estimating camera pose in an unknown scene.
• Tracking and Mapping are 2 separate tasks.
• Produces detailed maps which can be tracked at frame-rate.
Parallel Tracking and Mapping (PTAM - PTAMM)
Parallel Tracking & (Multiple) Mapping
(PTAM – PTAMM)
52. • Touchpad
• Camera – record 720p
video
• Single screen
• $1,500
• Version 2 ~ 2016
• Optical see through
Google Glass
53. • 3D See-through Display
• 3D depth Camera-320x240 +
RGB Camera-1280x720
• Head tracking
• 23-35º FOV
• 9-axis accelerometer
• $667, $3000
Meta Spaceglasses+Meta Pro
• 40º FOV
• Comes with sidekick PC with a
1.5GHz Core i5 processor, 4GB of
RAM, and a 128GB SSD
• $3,000
Good morning
Thank you for being here this early to watch our presentation
My name is Marios, and together with Sebastian, Zhe and Russ we will present some of the main issues regarding Mobile AR.
Most of you will already have seen AR content in movies.
Tony Stark can interact with holograms (virtual objects) to quickly extract information or to help him assess a situation.
AR = a merge between the real and the virtual world.
We AUGMENT the real world using virtual objects and virtual content.
It can help in numerous fields, from medicine to tourism.
We show the demo, and meanwhile I will say this:
- Augmented Reality (AR) content can be accessed by scanning or viewing a trigger image with a mobile device, which creates a subsequent action.
- It can also be triggered through other sensors, e.g. GPS.
- AR can be achieved on any device with a display that can take advantage of computer vision to overlay digital information.
- Additionally, it can be designed to interact through many sensory channels (e.g. auditory, visual, olfactory, and haptic).
Virtual reality (VR) creates immersive, computer-generated environments that replace the real world;
augmented reality is closer to the real world: it combines real-world elements with virtual elements, and the content is overlaid onto real life.
Augmented reality is changing the way we view the world -- or at least the way its users see the world. Picture yourself walking or driving down the street. With augmented-reality displays, which will eventually look much like a normal pair of glasses, informative graphics will appear in your field of view, and audio will coincide with whatever you see. These enhancements will be refreshed continually to reflect the movements of your head. Similar devices and applications already exist, particularly on smartphones.
Over the years, AR has been defined in different ways, but based on a paper by Azuma, we can consider a system AR if:
1 - It is interactive in real time (augmentation happens live, not baked in as in movies).
2 - It combines real and virtual elements (virtual objects are integrated into a 3-D real environment in real time).
3 - It is registered in 3-D (accurate alignment of virtual objects in the real world).
Registered in 3-D -> the graphical line obeys all physical rules of depth as if it existed in the real world.
Registration refers to the accurate alignment of real and virtual objects. Without accurate registration, the illusion that the virtual objects exist in the real environment is severely compromised. Registration is a difficult problem and a topic of continuing research.
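Registration ultimately comes down to projecting virtual 3-D points through the same camera model that produces the real image. A minimal pinhole-projection sketch; the intrinsic parameters are made-up example values:

```python
import numpy as np

def project(point_cam, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Pinhole projection of a 3D point (in camera coordinates, metres)
    to pixel coordinates; fx, fy, cx, cy are example intrinsics."""
    x, y, z = point_cam
    return (fx * x / z + cx, fy * y / z + cy)

# A virtual object 2 m in front of the camera and 0.4 m to its right:
u, v = project(np.array([0.4, 0.0, 2.0]))
```

If the estimated camera pose (and hence `point_cam`) is wrong, `u` and `v` are wrong, and the virtual object visibly drifts off its real-world anchor; that drift is exactly a registration error.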
AR can be achieved on any device with a display, that can take advantage of computer vision to overlay digital information, but AR technologies are not only limited to vision, it can be designed to interact through many sensory channels (e.g. auditory, visual, olfactory, and haptic) (Hughes et al., 2005).
The widespread use of smart mobile devices makes visual augmented reality the most common form today. The mobile platform lets AR developers reach a large audience on an established, AR-capable platform.
Current work has focused on adding virtual objects to a real environment. However, graphic overlays might also be used to remove or hide parts of the real environment from a user.
Today, thanks to the improvement of …, we can display AR content on our smartphones, and we can use free tools such as Unity, Vuforia, and OpenCV to create and display our own AR applications.
Just-in-time information and services.
Interact with computer-supported information without being distracted from the real world.
the first fully functional AR system
mechanical tracker and ultrasonic tracker
Since about the mid-1990s, computing and tracking devices have become sufficiently powerful and small to make MARS (Mobile AR Systems) possible.
see-through HMD, hand-held computer and backpack based computer
presents 3D tour guide information to campus visitors.
a pad and a pen web camera
It allows two users to collaborate in augmented shared space.
The limitation was computing and tracking devices.
Most MARS of this period were very heavy, as they needed a backpack computer and a big head-mounted display.
Detects 3D markers and correctly integrates rendered 3D graphics into the live video stream.
These virtual trains are only visible to players through their PDA's video see-through display.
All processing except tracking runs on the GPU, making the whole application run at high frame rates on a mobile-phone-class device.
PTAM (parallel tracking and mapping) is a fast SLAM approach.
As we stated before, an AR System needs to be able to combine REAL World + Virtual Objects
We need a device which will be able to display the virtual object on top of our view of the real world
We need to find a way to interact with virtual content in order to move it in 3D space.
We need to correctly align the virtual objects and the real world using correct tracking methods.
Currently most devices are smartphones/tablets and HMDs.
In the future we will consider projected AR, virtual retinal displays…
The real environment is filmed by a camera,
then combined with virtual information, and
finally displayed.
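The video see-through pipeline just described (camera frame plus virtual information, combined and then displayed) amounts to per-pixel compositing. A toy alpha-blending sketch, where small arrays stand in for real camera frames and rendered content:

```python
import numpy as np

def composite(camera_frame, overlay, alpha_mask):
    """Blend a rendered virtual overlay onto the live camera frame.
    alpha_mask is 1 where the virtual object covers a pixel, 0 elsewhere."""
    a = alpha_mask[..., None]  # broadcast the mask over colour channels
    return a * overlay + (1.0 - a) * camera_frame

frame = np.full((4, 4, 3), 100.0)   # stand-in for a camera image
virt = np.full((4, 4, 3), 255.0)    # stand-in for rendered virtual content
mask = np.zeros((4, 4))
mask[1:3, 1:3] = 1.0                # virtual object covers the centre
out = composite(frame, virt, mask)
```

Because the real world reaches the eye only via this composited image, occlusion and blending are easy, but the entire view inherits the camera's latency and resolution.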
Mobile phones are a type of Video See through display
Users can observe the real environment, while it is overlaid with virtual information
We use projectors to project the augmented content onto a retro-reflective surface.
The image bounces off the retro-reflective surface back to the wearer's eyes.
--- Low light intensity.
--- Depending on the application a trade-off between battery life, light intensity and projector size must be found.
Mimics natural vision:
uses a low-powered laser to project an image directly onto the retina of the user.
• Measures residual heat from touching surfaces.
• Looks for circular patches with temperature between hand & object.
Add touch interface to everyday objects.
---- Cold hands are a problem.
---- Metals and glass do not show up well.
The biggest single obstacle to building effective Augmented Reality systems is the requirement for accurate, long-range sensors and trackers that report the locations of the user and the surrounding objects in the environment.
No tracker currently provides high accuracy at long range in real time. More work needs to be done to develop sensors and trackers that can meet these stringent requirements. Most trackers require the user to provide a registration point in order to trigger the experience.
Perfect tracking is never achieved since temporal delays and accuracy limits are inherent in all measurements. (It is the main challenge in AR)
Today, image processing and computer vision technologies have progressed to a stage that allows us to infer 3D information about the world directly from images. Because of the success of these technologies, more and more vision-based AR applications are emerging.
-GPS-based AR tracking – it depends heavily on the device: it is not precise, it jitters, and the device sometimes needs time to receive GPS data. The battery runs out quickly, and you cannot detect the height above the ground.
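One common way to damp the GPS jitter just described is a simple moving-average filter over recent fixes. A minimal sketch; the window size and coordinates are arbitrary examples:

```python
def smooth(fixes, window=3):
    """Moving-average filter over noisy (lat, lon) GPS fixes -
    one simple way to damp jitter in location-based tracking."""
    out = []
    for i in range(len(fixes)):
        chunk = fixes[max(0, i - window + 1):i + 1]
        lat = sum(p[0] for p in chunk) / len(chunk)
        lon = sum(p[1] for p in chunk) / len(chunk)
        out.append((lat, lon))
    return out

fixes = [(52.0, 4.0), (52.2, 4.2), (51.8, 3.8)]  # jittery readings
smoothed = smooth(fixes)
```

The trade-off is latency: averaging lags behind real movement, which is one reason GPS alone is too coarse and too slow for precise AR registration.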
-Ultrasonic trackers suffer from noise and are difficult to make accurate at long ranges because of variations in the ambient temperature.
-Some mechanical trackers are accurate enough, although they constrain the user to a limited working volume.
-Magnetic trackers are vulnerable to distortion by metal in the environment, which exists in many desired AR application environments.
-Optical tracking presents some advantages compared to its counterparts, such as higher precision and lower sensitivity to interference.
-In markerless AR, any part of the real environment may be used as a marker, since the system exploits natural features present in the real scene to perform tracking. Markerless AR has received more attention from researchers in recent years, and presents important challenges still to be overcome.
-Natural feature tracking: tracking from features of the surrounding environment (corners, edges, blobs).
Generally more difficult than marker tracking.
-Another advantage is the possibility of extracting characteristic information from the surroundings that may later be used by the AR system for other purposes.
-It requires no markers, pre-made maps, known templates, or inertial sensors.
-Tracking and mapping are split into two separate tasks processed in parallel threads on a dual-core computer: one thread deals with the task of robustly tracking erratic hand-held motion, while the other produces a 3D map of point features from previously observed video frames.
-The result is a system that produces detailed maps with thousands of landmarks which can be tracked at frame-rate, with an accuracy and robustness rivalling that of state-of-the-art model-based systems.
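The parallel split described above can be sketched as two threads sharing a map under a lock. This is a toy illustration of the structure only: the frames, the keyframe-selection rule, and the "landmarks" are all stand-ins, not real PTAM code:

```python
import threading
import queue

# Shared map protected by a lock; keyframes flow tracker -> mapper.
shared_map = []
map_lock = threading.Lock()
keyframes = queue.Queue()

def tracker(frames):
    """Fast per-frame thread: would estimate the camera pose against the
    current map, and hands selected keyframes to the mapping thread."""
    for f in frames:
        with map_lock:
            _ = len(shared_map)        # pose estimation would happen here
        if f % 2 == 0:                 # pretend every 2nd frame is a keyframe
            keyframes.put(f)
    keyframes.put(None)                # end-of-stream sentinel

def mapper():
    """Slower background thread: extends the map from queued keyframes."""
    while True:
        kf = keyframes.get()
        if kf is None:
            break
        with map_lock:
            shared_map.append(("landmark", kf))

t1 = threading.Thread(target=tracker, args=(range(6),))
t2 = threading.Thread(target=mapper)
t1.start(); t2.start()
t1.join(); t2.join()
```

The design point is that per-frame tracking never waits on expensive map building: the mapper can take as long as it needs per keyframe while tracking stays at frame rate.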
Optical see through – still see real world, safety, low distortion
Projection based – downside – required special reflective material to show the AR.
Started on Kickstarter.
Video see-through – true occlusion, flexibility in composition, matchable time delays, more registration strategies, easier to do wide FOV.
Safety standard hard hat with AR display
It looks like there is only one AR screen, so no stereoscopic 3D AR.