COSC 426 Lect. 8: AR Research Directions

A lecture on research directions in Augmented Reality as part of the COSC 426 class on AR. Taught by Mark Billinghurst from the HIT Lab NZ at the University of Canterbury.

  1. Lecture 8: Research Directions. Mark Billinghurst, HIT Lab NZ, University of Canterbury
  2. Looking to the Future
  3. The Future Is with Us. It takes at least 20 years for new technologies to go from the lab to the lounge. “The technologies that will significantly affect our lives over the next 10 years have been around for a decade. The future is with us. The trick is learning how to spot it. The commercialization of research, in other words, is far more about prospecting than alchemy.” Bill Buxton, Oct 11th 2004
  4. Research Directions (a layered stack): experiences (usability), applications (interaction), tools (authoring), components (tracking, display). Sony CSL © 2004
  5. Research Directions. Components: markerless tracking, hybrid tracking, displays, input devices. Tools: authoring tools, user-generated content tools. Applications: interaction techniques/metaphors. Experiences: user evaluation, novel AR/MR experiences.
  6. HMD Design
  7. Occlusion with See-through HMDs. The problem: occluding real objects with virtual ones, and occluding virtual objects with real ones. (Real scene vs. current see-through HMD.)
  8. ELMO (Kiyokawa 2001): an occlusive see-through HMD using a masking LCD and real-time range finding.
  9. ELMO Demo
  10. ELMO Design: virtual images from an LCD, depth sensing, an LCD mask, and an optical combiner in front of the real world. The LCD mask blocks the real world behind virtual content; depth sensing lets nearer real objects occlude the virtual images (see the sketch below).
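     A minimal sketch of the per-pixel decision ELMO's mask stage has to make, assuming a real-world depth map from the range finder and a depth buffer for the rendered virtual scene; the function and array names are illustrative, not from the ELMO system:

        import numpy as np

        def lcd_mask(real_depth, virtual_depth):
            # real_depth:    HxW depth map of the real scene (from range finding)
            # virtual_depth: HxW depth buffer of the rendered virtual scene,
            #                np.inf where no virtual content is drawn
            # True  -> opaque LCD pixel: block the real world, show the virtual image
            # False -> transparent pixel: let the real world through the combiner
            return virtual_depth < real_depth

        # Example: a virtual object at 1.0 m in front of a wall at 2.0 m is shown,
        # but masked out where a real hand at 0.5 m passes in front of it.
        real = np.array([[2.0, 0.5]])
        virt = np.array([[1.0, 1.0]])
        print(lcd_mask(real, virt))   # [[ True False]]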
  11. ELMO Results
  12. Tools
  13. BuildAR (http://www.hitlabnz.org/wiki/BuildAR): a stand-alone application with a visual interface for viewing AR models; enables non-programmers to build AR scenes.
  14. Mobile BuildAR. The ideal authoring tool: develop on the PC, deploy on the handheld. BuildAR on the PC exports the AR scene as XML, which an AR player runs on the mobile phone (Edgelib, stbES, Symbian/WM, Python).
  15. Desktop PC authoring tool (desktop PC, mobile phone).
  16. Applications
  17. Future Directions: Interaction Techniques. Input techniques: 3D vs. 2D input; pen/buttons/gestures. Collaboration techniques: simultaneous access to AR content. User studies…
  18. Flexible Displays: flexible lens surface, bimanual interaction, digital paper analogy. (Red Planet, 2000)
  19. Sony CSL © 2004
  20. Sony CSL © 2004
  21. Tangible User Interfaces (TUIs): GUMMI bendable display prototype. Reproduced by permission of Sony CSL.
  22. Sony CSL © 2004
  23. Sony CSL © 2004
  24. LucidTouch. Microsoft Research & Mitsubishi Electric Research Labs. Wigdor, D., Forlines, C., Baudisch, P., Barnwell, J., Shen, C. LucidTouch: A See-Through Mobile Device. In Proceedings of UIST 2007, Newport, Rhode Island, October 7-10, 2007, pp. 269–278.
  25. Auditory Modalities: auditory icons, earcons, speech synthesis/recognition. Nomadic Radio (Sawhney) combines spatialized audio, auditory cues, and speech synthesis/recognition.
  26. Gestural Interfaces: (1) micro-gestures (unistroke, SmartPad); (2) device-based gestures (tilt-based examples); (3) embodied interaction (EyeToy).
  27. Haptic Modalities. Haptic interfaces: simple uses in mobiles? (vibration instead of a ringtone). Sony's TouchEngine: physiological experiments show you can perceive two stimuli 5 ms apart, and displacements as small as 0.2 microns. (Diagram: n-layer actuator, 4 μm and 28 μm layers.)
  28. Haptic Input: AR Haptic Workbench (CSIRO 2003, Adcock et al.)
  29. AR Haptic Interface: Phantom, ARToolKit, Magellan
  30. Multimodal Input. Combining speech and gesture builds on the strengths of each: speech for mode selection and group selection; gesture for direct manipulation. Key problem: command disambiguation ("Move that chair" - which chair?). Use statistical methods for disambiguation: speech and gesture recognition each provide multiple possibilities, so look for the most probable combination. SenseShapes detect the object of interest (Olwal 2003). (A sketch of such fusion follows below.)
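     A minimal sketch of statistical disambiguation across modalities, using the "Move that chair" example. The naive-Bayes-style fusion and all object names are illustrative assumptions, not the SenseShapes algorithm:

        def disambiguate(speech_scores, gesture_scores):
            # speech_scores:  {object_id: P(object | speech)}
            # gesture_scores: {object_id: P(object | gesture)}, e.g. from a
            #                 pointing-cone volume query over the AR scene
            candidates = set(speech_scores) | set(gesture_scores)
            def joint(obj):
                # Assume the modalities are independent and fuse by product;
                # a small floor keeps unseen hypotheses from zeroing out.
                return speech_scores.get(obj, 0.01) * gesture_scores.get(obj, 0.01)
            return max(candidates, key=joint)

        # "Move that chair": speech matches both chairs equally well,
        # but the pointing gesture favours chair_2.
        print(disambiguate({"chair_1": 0.5, "chair_2": 0.5},
                           {"chair_1": 0.2, "chair_2": 0.7}))   # -> chair_2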
  31. Experiences
  32. Experiences: crossing boundaries, ubiquitous VR/AR, massive AR, AR + social networking, usability.
  33. Crossing Boundaries (Jun Rekimoto, Sony CSL)
  34. Invisible Interfaces (Jun Rekimoto, Sony CSL)
  35. Milgram's Reality-Virtuality (RV) Continuum: Real Environment, Augmented Reality (AR), Augmented Virtuality (AV), Virtual Environment; everything between the two ends is Mixed Reality.
  36. The MagicBook: spans Reality, Augmented Reality (AR), Augmented Virtuality (AV), and Virtuality.
  37. Invisible Interfaces (Jun Rekimoto, Sony CSL)
  38. Example: Visualizing Sensor Networks (Rauhala et al. 2007, Linköping). A network of humidity sensors with ZigBee wireless communication; mobile AR is used to visualize the humidity readings.
  39. Example: Sensor Input for AR Interaction. A UbiComp sensor (light, temperature, motion, sound) with an RF connection and an AR software plug-in, so that sensor input drives AR applications.
  40. uPart particle and USB bridge (http://particle.teco.edu), idle: 16 hours.
  41. AR response to changes in light level (a sketch of this kind of sensor-driven plug-in follows below).
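     A minimal sketch of a sensor-driven AR plug-in like the one in slides 39-41, assuming a polled light reading; read_light_level is a hypothetical stand-in, not the uPart/Particle API:

        import random
        import time

        def read_light_level():
            # Hypothetical stand-in for a reading delivered over the RF
            # link via the uPart USB bridge; returns a normalised 0..1 value.
            return random.uniform(0.0, 1.0)

        def update_ar_scene(light):
            # Map the sensed light level onto a property of the AR content,
            # e.g. the brightness of a virtual object drawn over a marker.
            brightness = max(0.1, light)   # keep the object faintly visible
            print(f"virtual object brightness -> {brightness:.2f}")

        for _ in range(5):                 # poll the sensor once per second
            update_ar_scene(read_light_level())
            time.sleep(1.0)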
  42. Invisible Interfaces (Jun Rekimoto, Sony CSL)
  43. UbiVR - CAMAR: CAMAR Controller, CAMAR Viewer, CAMAR Companion (GIST, Korea)
  44. ubiHome @ GIST (diagram): media services, light service, MR window, ubiTrack (where/when), Tag-it, ubiKey, PDA, couch sensor, and door sensor, each contributing who/what/when/how context. ©ubiHome
  45. CAMAR - GIST (CAMAR: Context-Aware Mobile Augmented Reality)
  46. UCAM Architecture (diagram): wear-UCAM (wearSensor and wearServices such as IR receiver and BioSensor, with a UserProfileManager) and ubi-UCAM (Integrator, Manager, Interpreter, ServiceProvider) exchange conditional user contexts with services such as ubiTrack, MRWindow, ubiTV, and the media and light services; context and network interfaces run over BAN/PAN (BT) and TCP/IP (Discovery, Control, Event) on the operating system, alongside vr-UCAM.
  47. Two axes (from Joe Newman): Weiser's terminal-to-ubiquitous axis against Milgram's reality-to-virtual-reality axis, locating desktop AR, mobile AR, UbiComp, Ubi AR, Ubi VR, and VR.
  48. Future Directions: Massive Multiuser. Handheld AR for the first time allows extremely high numbers of AR users. Requires: new types of applications/games; new infrastructure (server/client/peer-to-peer); content distribution…
  49. (Chart: single user to massive multi-user, plotted from terminal reality to ubiquitous VR.)
  50. Massive multiuser counts. 2D applications: MSN - 29 million; Skype - 10 million; Facebook - up to 70 million. 3D/VR applications: Second Life - >50K; stereo projection - <500. Augmented reality: Shared Space (1999) - 4; Invisible Train (2004) - 8.
  51. BASIC VIEW
  52. PERSONAL VIEW
  53. Augmented Reality 2.0 Infrastructure
  54. Leveraging Web 2.0. Content retrieval using HTTP; XML-encoded meta information; KML placemarks + extensions. Queries based on location (from GPS, image recognition) or on situation (barcode markers); queries also deliver tracking feature databases. Everybody can set up an AR 2.0 server. Syndication: community servers for end-user content, tagging; the AR client subscribes to an arbitrary number of feeds. (A sketch of a location query follows below.)
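     A minimal sketch of the kind of location-based KML query such a client might issue. The server URL scheme and its lat/lon/radius parameters are assumptions; only the KML 2.2 namespace and element names are standard:

        import urllib.request
        import urllib.parse
        import xml.etree.ElementTree as ET

        KML_NS = "{http://www.opengis.net/kml/2.2}"

        def fetch_placemarks(server, lat, lon, radius_m=500):
            # Ask an (assumed) AR 2.0 content server for KML placemarks near
            # a GPS position; return (name, lon, lat) tuples for the AR view.
            query = urllib.parse.urlencode({"lat": lat, "lon": lon, "radius": radius_m})
            with urllib.request.urlopen(f"{server}/placemarks?{query}") as resp:
                root = ET.fromstring(resp.read())
            results = []
            for pm in root.iter(f"{KML_NS}Placemark"):
                name = pm.findtext(f"{KML_NS}name", default="unnamed")
                coords = pm.findtext(f"{KML_NS}Point/{KML_NS}coordinates", default="")
                if coords:
                    lon_s, lat_s, *_ = coords.strip().split(",")
                    results.append((name, float(lon_s), float(lat_s)))
            return results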
  55. Content. Content creation and delivery: content creation pipeline; delivering previously unknown content; streaming of data (objects, multimedia) and applications. Distribution: how do users learn about all that content? How do they access it?
  56. Twitter 360 (http://www.twitter-360.com): AR to geo-locate tweets around you. Better than Google Maps? (A sketch of the underlying placement maths follows below.)
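     A minimal sketch of placing a geo-located tweet in an AR view: bearing and distance from the user's GPS fix to the tweet's location, using standard great-circle formulas (nothing here is taken from the Twitter 360 app itself):

        import math

        def bearing_and_distance(lat1, lon1, lat2, lon2):
            # Initial bearing (degrees clockwise from north) and distance
            # (metres) from (lat1, lon1) to (lat2, lon2).
            R = 6_371_000.0  # mean Earth radius in metres
            p1, p2 = math.radians(lat1), math.radians(lat2)
            dlon = math.radians(lon2 - lon1)
            # Haversine distance
            a = (math.sin((p2 - p1) / 2) ** 2
                 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
            dist = 2 * R * math.asin(math.sqrt(a))
            # Initial bearing, used to rotate the tweet icon into the camera view
            y = math.sin(dlon) * math.cos(p2)
            x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
            brg = (math.degrees(math.atan2(y, x)) + 360) % 360
            return brg, dist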
  57. Scaling Up: AR on a City Scale. Using the mobile phone as a ubiquitous sensor. MIT Senseable City Lab: http://senseable.mit.edu/
  58. WikiCity Rome (Senseable City Lab, MIT)
  59. More Information. Mark Billinghurst - mark.billinghurst@hitlabnz.org. Websites: http://www.hitlabnz.org/, http://artoolkit.sourceforge.net/, http://www.osgart.org/, http://www.hitlabnz.org/wiki/buildAR/
