Can You See What I See?
Talk by Mark Billinghurst about Collaborative Augmented Reality at CMU campus on May 6th, 2013

Transcript

  • 1. Can You See What I See? Mark Billinghurst (mark.billinghurst@hitlabnz.org), The HIT Lab NZ, University of Canterbury. May 3rd, 2013.
  • 2. Augmented Reality. Key Features: Combines Real and Virtual Images; Interactive in Real-Time; Content Registered in 3D. Azuma, R., A Survey of Augmented Reality, Presence, Vol. 6, No. 4, August 1997, pp. 355-385.
  • 3. Augmented Reality for Collaboration: Remote Conferencing; Face to Face Collaboration.
  • 4. Key Research Focus: Can Augmented Reality be used to enhance face to face and remote collaboration? Reasons: Provide enhanced spatial cues; Anchor communication back in the real world; Features not available in normal collaboration.
  • 5. Communication Seams. Technology introduces artificial seams in the communication (e.g. separate real and virtual space): Task Space vs. Communication Space.
  • 6. Making the Star Wars Vision Real. Combining Real and Virtual Images: Display Technology. Interacting in Real-Time: Interaction Metaphors. Content Registered in 3D: Tracking Techniques.
  • 7. AR Tracking (1999). ARToolKit: marker-based AR tracking; over 600,000 downloads, multiple languages. Kato, H., & Billinghurst, M. (1999). Marker tracking and HMD calibration for a video-based augmented reality conferencing system. In Augmented Reality, 1999. (IWAR '99) Proceedings. 2nd IEEE and ACM International Workshop on (pp. 85-94).
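ARToolKit itself is a C library, and its full pipeline (corner detection, homography, pose estimation) is beyond a slide note. But one core idea of marker-based tracking, matching a sampled bit pattern against a dictionary of known markers in all four rotations, can be sketched in a few lines. The 3x3 patterns below are hypothetical; ARToolKit actually uses template correlation rather than this grid scheme.

```python
def rotate(grid):
    """Rotate a square bit grid 90 degrees clockwise."""
    return [list(row) for row in zip(*grid[::-1])]

def decode_marker(grid, dictionary):
    """Match a sampled bit grid against known marker patterns.

    Tries all four rotations, so the marker is identified regardless of
    its orientation in the image. Returns (marker_id, rotations_applied)
    or None if no pattern matches.
    """
    for rotation in range(4):
        for marker_id, pattern in dictionary.items():
            if grid == pattern:
                return marker_id, rotation
        grid = rotate(grid)
    return None

# Hypothetical 3x3 marker dictionary (real systems use larger grids,
# often with error-correcting codes).
MARKERS = {
    0: [[1, 0, 0], [0, 1, 0], [0, 0, 1]],
    1: [[1, 1, 0], [0, 1, 0], [0, 1, 1]],
}

sampled = [[0, 0, 1], [0, 1, 0], [1, 0, 0]]  # marker 0, rotated
print(decode_marker(sampled, MARKERS))  # → (0, 1)
```

Recovering the rotation matters as much as the ID: it tells the tracker which marker corner is which, a prerequisite for computing the camera pose.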
  • 8. AR Interaction (2000). Tangible AR Metaphor: TUI (Ishii) for input; AR for display. Overcomes TUI limitations: merges task and display space; provides separate views. Design physical objects for AR interaction. Kato, H., Billinghurst, M., Poupyrev, I., Imamoto, K., & Tachibana, K. (2000). Virtual object manipulation on a table-top AR environment. In Augmented Reality, 2000. (ISAR 2000). Proceedings. IEEE and ACM International Symposium on (pp. 111-119).
  • 9. Face to Face Collaboration
  • 10. Communication Cues. A wide variety of communication cues are used. Audio: Speech, Paralinguistics, Paraverbals, Prosodics, Intonation. Visual: Gaze, Gesture, Face Expression, Body Position. Environmental: Object Manipulation, Writing/Drawing, Spatial Relationship, Object Presence.
  • 11. Shared Space. Face to Face interaction, Tangible AR metaphor; ~3,000 users (Siggraph 1999); Easy collaboration with strangers; Users acted the same as if handling real objects. Billinghurst, M., Poupyrev, I., Kato, H., & May, R. (2000). Mixing realities in shared space: An augmented reality interface for collaborative computing. In Multimedia and Expo, 2000. ICME 2000. 2000 IEEE International Conference on (Vol. 3, pp. 1641-1644).
  • 12. Communication Patterns. Will people use the same speech/gesture patterns? Conditions: Face to Face, FtF AR, Projected.
  • 13. Communication Patterns. Users felt AR was very different from FtF, BUT speech and gesture behavior was the same; users found tangible interaction very easy. Measures: % Deictic Commands; Ease of Interaction (1-7, very easy). Billinghurst, M., Belcher, D., Gupta, A., & Kiyokawa, K. (2003). Communication behaviors in co-located collaborative AR interfaces. International Journal of Human-Computer Interaction, 16(3), 395-423.
  • 14. Mobile Collaborative AR. AR Tennis: Shared AR content; Two-user game; Audio + haptic feedback; Bluetooth networking. Henrysson, A., Billinghurst, M., & Ollila, M. (2005, October). Face to face collaborative AR on mobile phones. In Mixed and Augmented Reality, 2005. Proceedings. Fourth IEEE and ACM International Symposium on (pp. 80-89). IEEE.
  • 15. Using AR for Communication Cues: Virtual Viewpoint Visualization. AR Pad: Handheld AR device; AR shows viewpoints; Users collaborate more easily. Mogilev, D., Kiyokawa, K., Billinghurst, M., & Pair, J. (2002, April). AR Pad: An interface for face-to-face AR collaboration. In CHI '02 extended abstracts on Human factors in computing systems (pp. 654-655).
  • 16. AR for New FtF Experiences. MagicBook: Transitional AR interface (RW-AR-VR); Supports both ego- and exo-centric collaboration. Billinghurst, M., Kato, H., & Poupyrev, I. (2001). The MagicBook: a transitional AR interface. Computers & Graphics, 25(5), 745-753.
  • 17. Lessons Learned. Collaboration is a perceptual task: AR reduces perceptual cues, which impacts collaboration; the Tangible AR metaphor enhances ease of interaction. Users felt that AR collaboration was different from Face to Face, but they exhibited the same speech and gesture as with real content. "AR's biggest limit was lack of peripheral vision. The interaction was natural, it was just difficult to see." "Working Solo Together." Thus we need to design AR interfaces that don't reduce perceptual cues, while keeping ease of interaction.
  • 18. Remote Collaboration
  • 19. AR Conferencing. Virtual video of remote collaborator; Moves conferencing into the real world; MR users felt the remote user was more present than in audio or video conferencing. Billinghurst, M., & Kato, H. (2000). Out and about: real world teleconferencing. BT Technology Journal, 18(1), 80-82.
  • 20. Multi-View AR Conferencing. Billinghurst, M., Cheok, A., Prince, S., & Kato, H. (2002). Real world teleconferencing. Computer Graphics and Applications, IEEE, 22(6), 11-13.
  • 21. A Wearable AR Conferencing Space. Concept: mobile video conferencing; spatial audio/visual cues; body-stabilized data. Implementation: see-through HMD; head tracking; static images, spatial audio. Billinghurst, M., Bowskill, J., Jessop, M., & Morphett, J. (1998, October). A wearable spatial conferencing space. In Wearable Computers, 1998. Digest of Papers. Second International Symposium on (pp. 76-83). IEEE.
  • 22. User Evaluation
  • 23. WACL: Remote Expert Collaboration. Wearable Camera/Laser Pointer: Independent pointer control; Remote panorama view.
  • 24. WACL: Remote Expert Collaboration. Remote Expert View: Panorama viewing, annotation, image capture. Kurata, T., Sakata, N., Kourogi, M., Kuzuoka, H., & Billinghurst, M. (2004, October). Remote collaboration using a shoulder-worn active camera/laser. In Wearable Computers, 2004. ISWC 2004. Eighth International Symposium on (Vol. 1, pp. 62-69).
  • 25. Lessons Learned. AR can provide cues that increase the sense of Presence: spatial audio and visual cues; providing good audio is essential. AR can enhance remote task-space collaboration: annotation directly on the real world; but good situational awareness is needed.
  • 26. Current Work
  • 27. Current Work. Natural Interaction: Speech, Gesture Input. Real World Capture: Remote scene sharing. CityViewAR: Lightweight asynchronous collaboration. Handheld AR: Annotation-based collaboration.
  • 28. IronMan2
  • 29. Natural Hand Interaction. Using bare hands to interact with AR content: MS Kinect depth sensing; Real-time hand tracking; Physics-based simulation model. Piumsomboon, T., Clark, A., & Billinghurst, M. (2011, December). Physically-based interaction for tabletop augmented reality using a depth-sensing camera for environment mapping. In Proceedings of the 26th International Conference on Image and Vision Computing New Zealand.
  • 30. Multimodal Interaction. Combined speech and gesture input: Free-hand gesture tracking; Semantic fusion engine (speech + gesture input history).
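The talk does not give implementation details of the semantic fusion engine. As a hedged sketch of the general idea only (the names and the 1.5 s window below are assumptions, not the system's actual parameters), a deictic speech command such as "make that green" can be resolved by pairing it with the nearest gesture in the input history within a short time window:

```python
from dataclasses import dataclass

@dataclass
class Gesture:
    target: str       # object the user pointed at
    timestamp: float  # seconds

# Pairing tolerance between speech and gesture (assumed value).
FUSION_WINDOW = 1.5

def fuse(speech_command, speech_time, gesture_history):
    """Resolve a deictic speech command by pairing it with the gesture
    closest in time, provided it falls inside the fusion window.
    Returns (command, target) or None if no gesture is close enough."""
    candidates = [g for g in gesture_history
                  if abs(g.timestamp - speech_time) <= FUSION_WINDOW]
    if not candidates:
        return None
    nearest = min(candidates, key=lambda g: abs(g.timestamp - speech_time))
    return (speech_command, nearest.target)

history = [Gesture("blue_cube", 10.2), Gesture("red_sphere", 12.9)]
print(fuse("make that green", 13.0, history))  # pairs with red_sphere
```

A real fusion engine would also weight recognizer confidence and handle commands with no deictic reference; this sketch shows only the time-window pairing step.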
  • 31. User Evaluation. Task: change object shape, colour and position. Results: MMI significantly faster (11.8 s) than gesture alone (12.4 s); 70% of users preferred MMI (vs. 25% speech only). Billinghurst, M., & Lee, M. (2012). Multimodal Interfaces for Augmented Reality. In Expanding the Frontiers of Visual Analytics and Visualization (pp. 449-465). Springer London.
  • 32. Real World Capture. Hands-free AR: Portable scene capture (color + depth); Projector/Kinect combo, remote-controlled pan/tilt; Remote expert annotation interface.
  • 33. Remote Expert View
  • 34. CityViewAR. Using AR to visualize Christchurch city buildings: 3D models of buildings, 2D images, text, panoramas; AR view, Map view, List view. Lee, G. A., Dunser, A., Kim, S., & Billinghurst, M. (2012, November). CityViewAR: A mobile outdoor AR application for city visualization. In Mixed and Augmented Reality (ISMAR-AMH), 2012 IEEE International Symposium on (pp. 57-64).
  • 35. Client/Server Architecture. Android application; Web application (Java and PHP server); Database server (Postgres); Web interface to add models.
  • 36. Web-based Outdoor AR Server. Web interface: showing POIs as icons on Google Map. PHP-based REST API: XML-based scene data retrieval API; scene creation and modification API; Android client-side REST API interface.
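The talk mentions an XML-based scene data retrieval API but does not show its schema. Assuming a simple hypothetical format (the element and attribute names below are illustrative, not CityViewAR's actual schema), the client side of such an API could parse a scene response into POI records like this:

```python
import xml.etree.ElementTree as ET

# Hypothetical scene XML, standing in for a REST API response.
SCENE_XML = """
<scene>
  <poi id="1" name="Cathedral" lat="-43.5309" lon="172.6365" model="cathedral.obj"/>
  <poi id="2" name="Art Gallery" lat="-43.5290" lon="172.6310" model="gallery.obj"/>
</scene>
"""

def parse_scene(xml_text):
    """Parse scene XML into a list of POI dicts for the AR/map/list views."""
    root = ET.fromstring(xml_text)
    return [
        {
            "id": int(poi.get("id")),
            "name": poi.get("name"),
            "lat": float(poi.get("lat")),
            "lon": float(poi.get("lon")),
            "model": poi.get("model"),
        }
        for poi in root.iter("poi")
    ]

pois = parse_scene(SCENE_XML)
print([p["name"] for p in pois])  # → ['Cathedral', 'Art Gallery']
```

Keeping the wire format declarative like this lets the same server feed all three client presentations (AR view, map view, list view) from one parse.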
  • 37. Handheld Collaborative AR. Use a handheld tablet to connect to a Remote Expert: low-cost consumer device, lightweight collaboration. Different communication cues: shared pointers, drawing annotation; streamed video, still images.
  • 38. What's Next?
  • 39. Future Research. Ego-Vision collaboration: Shared POV collaboration. AR + Human Computation: Crowd-sourced expertise. Scaling up: City/Country-scale augmentation.
  • 40. Ego-Vision Collaboration. Google Glass: camera + processing + display + connectivity.
  • 41. Ego-Vision Research. System: How do you capture the user's environment? How do you provide good quality of service? Interface: What visual and audio cues provide the best experience? How do you interact with the remote user? Evaluation: How do you measure the quality of collaboration?
  • 42. AR + Human Computation. Human Computation: real people solving problems difficult for computers; web-based, non-real-time; little work on AR + HC so far. AR attributes: shared point of view; real-world overlay; location sensing. "What does this say?"
  • 43. Human Computation Architecture. Add an AR front end to a typical HC platform.
  • 44. AR + HC Research Questions. System: What architecture provides the best performance? What data needs to be shared? Interface: What cues are needed by the human computers? What benefits does AR provide compared to web systems? Evaluation: How can the system be evaluated?
  • 45. Scaling Up. Seeing the actions of millions of users in the world; Augmentation at the city/country level.
  • 46. AR + Smart Sensors + Social Networks. Track population at city scale (mobile networks); Match population data to external sensor data (medical, environmental, etc.); Mine data to improve social services.
  • 47. Orange Data for Development. Orange made available 2.5 billion phone records: 5 months of calls from Ivory Coast; > 80 sample projects using the data; e.g. monitoring human mobility for disease modeling.
  • 48. Research Questions. System: How can you capture the data reliably? How can you aggregate and correlate the information? Interface: What data provides the most value? How can you visualize the information? Evaluation: How do you measure the accuracy of the model?
  • 49. Conclusions
  • 50. Conclusions. Augmented Reality can enhance face to face and remote collaboration: spatial cues, seamless communication. Current research opportunities in natural interaction, environment capture, and mobile AR: gesture, multimodal interaction, depth sensing. Future opportunities in large-scale deployment: human computing, AR + sensors + social networks.
  • 51. More Information. Mark Billinghurst: mark.billinghurst@hitlabnz.org. Website: http://www.hitlabnz.org/
