Can You See What I See?
Mark Billinghurst
mark.email@example.com
The HIT Lab NZ, University of Canterbury
May 3rd 2013
Augmented Reality
• Key Features
– Combines Real and Virtual Images
– Interactive in Real-Time
– Content Registered in 3D
Azuma, R. (1997). A Survey of Augmented Reality. Presence, 6(4), 355-385.
Augmented Reality for Collaboration
• Remote Conferencing
• Face to Face Collaboration
Key Research Focus
Can Augmented Reality be used to enhance face to face and remote collaboration?
• Reasons
– Provides enhanced spatial cues
– Anchors communication back in the real world
– Offers features not available in normal collaboration
Communication Seams
• Technology introduces artificial seams into the communication (e.g. separating real and virtual spaces)
– Task Space
– Communication Space
Making the Star Wars Vision Real
• Combining Real and Virtual Images
– Display Technology
• Interacting in Real-Time
– Interaction Metaphors
• Content Registered in 3D
– Tracking Techniques
AR Tracking (1999)
• ARToolKit - marker-based AR tracking
– Over 600,000 downloads, multiple languages
Kato, H., & Billinghurst, M. (1999). Marker tracking and HMD calibration for a video-based augmented reality conferencing system. In Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR '99) (pp. 85-94).
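Marker tracking of this kind detects a square fiducial in the camera image and then identifies its interior pattern regardless of how the marker is rotated. As an illustration only: ARToolKit itself matches the interior against stored templates at four rotations, but the same rotation-invariance idea can be sketched with a binary bit grid (closer to later ID-based systems such as ARToolKitPlus):

```python
# Illustrative sketch: rotation-invariant decoding of a square binary
# marker interior. Not ARToolKit's actual template-correlation method.

def rotate(grid):
    """Rotate a square bit grid 90 degrees clockwise."""
    return [list(row) for row in zip(*grid[::-1])]

def decode_marker(grid):
    """Return (marker_id, rotation): the smallest ID over the four
    possible orientations, so the same marker decodes to the same ID
    no matter how it is rotated in the image."""
    best = None
    g = grid
    for rot in range(4):
        bits = [b for row in g for b in row]
        marker_id = int("".join(map(str, bits)), 2)
        if best is None or marker_id < best[0]:
            best = (marker_id, rot)
        g = rotate(g)
    return best

pattern = [[0, 1], [1, 1]]  # toy 2x2 interior pattern
print(decode_marker(pattern))  # -> (7, 0)
```

Feeding any of the four rotations of the same pattern into `decode_marker` yields the same marker ID, which is what lets the tracker identify a marker before it knows the marker's orientation.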
AR Interaction (2000)
• Tangible AR Metaphor
– TUI (Ishii) for input, AR for display
• Overcomes TUI limitations
– Merges task and display space
– Provides separate views
• Design physical objects for AR interaction
Kato, H., Billinghurst, M., Poupyrev, I., Imamoto, K., & Tachibana, K. (2000). Virtual object manipulation on a table-top AR environment. In Proceedings of the IEEE and ACM International Symposium on Augmented Reality (ISAR 2000) (pp. 111-119).
Communication Cues
A wide variety of communication cues are used:
• Audio: Speech, Paralinguistics, Paraverbals, Prosodics, Intonation
• Visual: Gaze, Gesture, Facial Expression, Body Position
• Environmental: Object Manipulation, Writing/Drawing, Spatial Relationships, Object Presence
Shared Space
• Face to Face interaction, Tangible AR metaphor
• ~3,000 users (SIGGRAPH 1999)
• Easy collaboration with strangers
• Users acted the same as if handling real objects
Billinghurst, M., Poupyrev, I., Kato, H., & May, R. (2000). Mixing realities in shared space: An augmented reality interface for collaborative computing. In Proceedings of the 2000 IEEE International Conference on Multimedia and Expo (ICME 2000) (Vol. 3, pp. 1641-1644).
Communication Patterns
• Will people use the same speech/gesture patterns?
• Conditions: Face to Face, FtF AR, Projected
Communication Patterns
• Users felt AR was very different from FtF
– BUT speech and gesture behavior was the same
• Users found tangible interaction very easy
(Measures: % deictic commands; ease of interaction, 1-7 = very easy)
Billinghurst, M., Belcher, D., Gupta, A., & Kiyokawa, K. (2003). Communication behaviors in co-located collaborative AR interfaces. International Journal of Human-Computer Interaction, 16(3), 395-423.
Mobile Collaborative AR
• AR Tennis
– Shared AR content
– Two-user game
– Audio + haptic feedback
– Bluetooth networking
Henrysson, A., Billinghurst, M., & Ollila, M. (2005). Face to face collaborative AR on mobile phones. In Proceedings of the Fourth IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR 2005) (pp. 80-89). IEEE.
Using AR for Communication Cues
• Virtual Viewpoint Visualization
• AR Pad
– Handheld AR device
– AR shows viewpoints
– Users collaborate more easily
Mogilev, D., Kiyokawa, K., Billinghurst, M., & Pair, J. (2002). AR Pad: An interface for face-to-face AR collaboration. In CHI '02 Extended Abstracts on Human Factors in Computing Systems (pp. 654-655).
AR for New FtF Experiences
• MagicBook
– Transitional AR interface (RW-AR-VR)
– Supports both ego- and exo-centric collaboration
Billinghurst, M., Kato, H., & Poupyrev, I. (2001). The MagicBook: a transitional AR interface. Computers & Graphics, 25(5), 745-753.
Lessons Learned
• Collaboration is a perceptual task
– AR reduces perceptual cues -> impacts collaboration
• Tangible AR metaphor enhances ease of interaction
• Users felt AR collaboration was different from Face to Face
– But users exhibited the same speech and gesture as with real content
“AR's biggest limit was lack of peripheral vision. The interaction was natural, it was just difficult to see”
“Working Solo Together”
Thus we need to design AR interfaces that don't reduce perceptual cues, while keeping ease of interaction.
AR Conferencing
• Virtual video of remote collaborator
• Moves conferencing into the real world
• MR users felt the remote user was more present than in audio or video conferencing
Billinghurst, M., & Kato, H. (2000). Out and about - real world teleconferencing. BT Technology Journal, 18(1), 80-82.
Multi-View AR Conferencing
Billinghurst, M., Cheok, A., Prince, S., & Kato, H. (2002). Real world teleconferencing. IEEE Computer Graphics and Applications, 22(6), 11-13.
A Wearable AR Conferencing Space
• Concept
– Mobile video conferencing
– Spatial audio/visual cues
– Body-stabilized data
• Implementation
– See-through HMD
– Head tracking
– Static images, spatial audio
Billinghurst, M., Bowskill, J., Jessop, M., & Morphett, J. (1998). A wearable spatial conferencing space. In Digest of Papers, Second International Symposium on Wearable Computers (pp. 76-83). IEEE.
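Body-stabilized spatial audio means each remote participant's voice stays fixed relative to the wearer's body, so turning the head shifts the sound the opposite way. A minimal sketch of that idea, not taken from the paper's implementation: given a head-yaw reading and a source direction in body coordinates, compute simple constant-power stereo gains (the function and parameter names are hypothetical).

```python
import math

def stereo_gains(source_azimuth_deg, head_yaw_deg):
    """Constant-power pan for a body-stabilized audio source.
    source_azimuth_deg: direction of the virtual participant in body
    coordinates (0 = straight ahead, +90 = to the right).
    head_yaw_deg: current head rotation from the head tracker."""
    rel = source_azimuth_deg - head_yaw_deg
    # Map relative azimuth [-90, 90] to a pan angle [0, 90] degrees.
    pan = (max(-90.0, min(90.0, rel)) + 90.0) / 2.0
    left = math.cos(math.radians(pan))
    right = math.sin(math.radians(pan))
    return left, right

# Source ahead, head ahead: equal gains in both ears.
print(stereo_gains(0, 0))
# Head turned 90 degrees right: the source is now fully to the left.
print(stereo_gains(0, 90))
```

Because `rel` subtracts the head yaw, the sound source stays anchored in body space rather than following the head, which is the cue that lets wearers localize remote collaborators around them.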
WACL: Remote Expert Collaboration
• Remote Expert View
– Panorama viewing, annotation, image capture
Kurata, T., Sakata, N., Kourogi, M., Kuzuoka, H., & Billinghurst, M. (2004). Remote collaboration using a shoulder-worn active camera/laser. In Proceedings of the Eighth International Symposium on Wearable Computers (ISWC 2004) (Vol. 1, pp. 62-69).
Lessons Learned
• AR can provide cues that increase the sense of Presence
– Spatial audio and visual cues
– Providing good audio is essential
• AR can enhance remote task space collaboration
– Annotation directly on the real world
– But: need good situational awareness
Natural Hand Interaction
• Using bare hands to interact with AR content
– MS Kinect depth sensing
– Real-time hand tracking
– Physics-based simulation model
Piumsomboon, T., Clark, A., & Billinghurst, M. (2011). Physically-based interaction for tabletop augmented reality using a depth-sensing camera for environment mapping. In Proceedings of the 26th International Conference on Image and Vision Computing New Zealand.
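A common first step when using a depth sensor over a tabletop is separating hand pixels from the table surface before any tracking happens. The sketch below illustrates that step only, and is not the paper's pipeline; the table depth and noise margin are made-up values.

```python
# Illustrative sketch: segmenting hand pixels from a depth image by
# keeping points clearly above a known tabletop depth.

TABLE_DEPTH_MM = 1000   # assumed sensor-to-tabletop distance
HAND_MARGIN_MM = 30     # ignore sensor noise near the table surface

def segment_hand(depth_image):
    """Return a binary mask: 1 where a pixel is closer to the camera
    than the table by more than the noise margin, else 0.
    A depth of 0 means the sensor returned no reading."""
    return [[1 if 0 < d < TABLE_DEPTH_MM - HAND_MARGIN_MM else 0
             for d in row]
            for row in depth_image]

depth = [
    [1000, 1000,  995],   # table surface (within the noise margin)
    [ 900,  880, 1000],   # a hand hovering above the table
    [   0, 1000, 1000],   # 0 = invalid reading from the sensor
]
print(segment_hand(depth))  # -> [[0, 0, 0], [1, 1, 0], [0, 0, 0]]
```

In a real system the mask would feed a hand tracker and, as on this slide, a physics simulation so virtual objects can react to the segmented hand surface.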
CityViewAR
• Using AR to visualize Christchurch city buildings
– 3D models of buildings, 2D images, text, panoramas
– AR View, Map view, List view
Lee, G. A., Dunser, A., Kim, S., & Billinghurst, M. (2012). CityViewAR: A mobile outdoor AR application for city visualization. In Proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR-AMH 2012) (pp. 57-64).
Client/Server Architecture
• Android application
• Web application
– Java and PHP server
• Database server
– Postgres
• Web Interface
– Add models
Web based Outdoor AR Server
• Web interface
– Showing POIs as icons on Google Map
• PHP based REST API
– XML based scene data retrieval API
– Scene creation and modification API
– Android client side REST API interface
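On the client side, the scene retrieval API returns XML that the Android app turns into POIs to render. The actual schema is not shown on these slides, so the sketch below assumes a hypothetical XML layout (element and attribute names are invented) just to illustrate the retrieval-and-parse step:

```python
# Hedged sketch: parsing a hypothetical XML scene response from the
# REST API. The schema here is invented for illustration.
import xml.etree.ElementTree as ET

SAMPLE_RESPONSE = """
<scene>
  <poi id="1" name="Christchurch Cathedral">
    <location lat="-43.5309" lon="172.6365"/>
    <model url="http://example.com/models/cathedral.obj"/>
  </poi>
  <poi id="2" name="Arts Centre">
    <location lat="-43.5310" lon="172.6270"/>
  </poi>
</scene>
"""

def parse_scene(xml_text):
    """Turn an XML scene document into a list of POI dicts that a
    mobile client could hand to its AR, Map, or List view."""
    pois = []
    for poi in ET.fromstring(xml_text).findall("poi"):
        loc = poi.find("location")
        pois.append({
            "id": int(poi.get("id")),
            "name": poi.get("name"),
            "lat": float(loc.get("lat")),
            "lon": float(loc.get("lon")),
        })
    return pois

print(parse_scene(SAMPLE_RESPONSE))
```

In the deployed system the XML would come over HTTP from the PHP REST endpoint rather than from an inline string.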
Handheld Collaborative AR
• Use handheld tablet to connect to Remote Expert
– Low cost, consumer device, lightweight collaboration
• Different communication cues
– Shared pointers, drawing annotation
– Streamed video, still images
Future Research
• Ego-Vision collaboration
– Shared POV collaboration
• AR + Human Computation
– Crowd-sourced expertise
• Scaling up
– City/Country scale augmentation
Ego-Vision Collaboration
• Google Glass
– Camera + processing + display + connectivity
Ego-Vision Research
• System
– How do you capture the user's environment?
– How do you provide good quality of service?
• Interface
– What visual and audio cues provide the best experience?
– How do you interact with the remote user?
• Evaluation
– How do you measure the quality of collaboration?
AR + Human Computation
• Human Computation
– Real people solving problems difficult for computers
– Web-based, non real time
– Little work on AR + HC
• AR attributes
– Shared point of view
– Real world overlay
– Location sensing
“What does this say?”
Human Computation Architecture
• Add an AR front end to a typical HC platform
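The "AR front end on an HC platform" idea can be sketched as a task queue: the AR client posts a snapshot plus a question ("What does this say?"), human workers claim tasks and return answers, and the answer is overlaid back in the user's view. Everything below is a toy illustration; the class and method names are invented, not a real platform's API.

```python
# Toy sketch of an HC platform with an AR client as the task source.
import queue

class HumanComputationQueue:
    def __init__(self):
        self._tasks = queue.Queue()
        self._answers = {}

    def submit(self, task_id, image_ref, question):
        """Called by the AR client: enqueue a task for human workers."""
        self._tasks.put({"id": task_id, "image": image_ref,
                         "question": question})

    def claim(self):
        """Called by a worker: take the next open task."""
        return self._tasks.get()

    def answer(self, task_id, text):
        """Called by a worker: record the answer for the AR overlay."""
        self._answers[task_id] = text

    def result(self, task_id):
        """Polled by the AR client to display the answer in place."""
        return self._answers.get(task_id)

hc = HumanComputationQueue()
hc.submit(1, "frame_0042.jpg", "What does this say?")
task = hc.claim()
hc.answer(task["id"], "No parking 8am-6pm")
print(hc.result(1))  # -> No parking 8am-6pm
```

The AR-specific part is only the ends of the pipeline: the shared point of view supplies the task image, and the real-world overlay displays the crowd's answer, while the queue in the middle is a standard HC pattern.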
AR + HC Research Questions
• System
– What architecture provides the best performance?
– What data needs to be shared?
• Interface
– What cues are needed by the human computers?
– What benefits does AR provide compared to web systems?
• Evaluation
– How can the system be evaluated?
Scaling Up
• Seeing the actions of millions of users in the world
• Augmentation at city/country level
AR + Smart Sensors + Social Networks
• Track population at city scale (mobile networks)
• Match population data to external sensor data
– Medical, environmental, etc.
• Mine data to improve social services
Orange Data for Development
• Orange made 2.5 billion phone records available
– 5 months of calls from Ivory Coast
• > 80 sample projects using the data
– e.g. monitoring human mobility for disease modeling
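Mobility studies on anonymised call records typically reduce the raw events to movements between cell-tower sites, from which an origin-destination picture of a population can be built. A minimal sketch of that aggregation step, on invented toy data rather than the actual D4D records:

```python
# Hedged sketch: counting per-caller transitions between cell towers,
# the basic aggregation behind call-record mobility modeling.
from collections import Counter

# (caller_id, tower_id) call events in time order - toy data, not D4D.
calls = [
    ("u1", "towerA"), ("u1", "towerB"), ("u1", "towerB"),
    ("u2", "towerA"), ("u2", "towerC"),
]

def tower_transitions(events):
    """Count moves between distinct towers for each caller, pooling
    the counts into one origin-destination Counter."""
    last_tower = {}
    moves = Counter()
    for caller, tower in events:
        prev = last_tower.get(caller)
        if prev is not None and prev != tower:
            moves[(prev, tower)] += 1
        last_tower[caller] = tower
    return moves

print(tower_transitions(calls))
```

Repeated calls from the same tower are ignored, so the output counts only genuine movements; at Ivory Coast scale the same matrix, joined with health or environmental data, is what mobility and disease models consume.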
Research Questions
• System
– How can you capture the data reliably?
– How can you aggregate and correlate the information?
• Interface
– What data provides the most value?
– How can you visualize the information?
• Evaluation
– How do you measure the accuracy of the model?
Conclusions
• Augmented Reality can enhance face to face and remote collaboration
– Spatial cues, seamless communication
• Current research opportunities in natural interaction, environment capture, mobile AR
– Gesture, multimodal interaction, depth sensing
• Future opportunities in large scale deployment
– Human computation, AR + sensors + social networks
More Information
• Mark Billinghurst
– firstname.lastname@example.org
• Website
– http://www.hitlabnz.org/