A Mobile Indoor Navigation System Interface Adapted to Vision-Based Localization

Vision-based approaches for mobile indoor localization do not rely on dedicated infrastructure and are therefore scalable and inexpensive. The particular requirements of a navigation user interface for a vision-based system, however, have not been investigated so far.
Such mobile interfaces should adapt to localization accuracy, which strongly depends on distinctive reference images, and to other factors such as the phone’s pose. If necessary, the system should motivate the user to point the smartphone at distinctive regions to improve localization quality.
We present a combined interface of Virtual Reality (VR) and Augmented Reality (AR) elements with indicators that help to communicate and ensure localization accuracy. In an evaluation with 81 participants, we found that AR was preferred when localization was reliable, whereas with VR, navigation instructions were perceived as more accurate in the presence of localization and orientation errors. The additional indicators showed potential for making users choose distinctive reference images for reliable localization.


Transcript

  1. A Mobile Indoor Navigation System Interface Adapted to Vision-Based Localization
     Andreas Möller (1), Matthias Kranz (2), Robert Huitl (1), Stefan Diewald (1), Luis Roalter (1)
     (1) Technische Universität München, Germany
     (2) Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Luleå, Sweden
     MUM 2012, Ulm, Germany
  2. Outline
     - Motivation for Vision-Based Navigation
     - Challenges
     - Proposed User Interface Concepts
     - Evaluation
     - Discussion and Conclusion
  3. Background and Motivation
     - Location information is still the most important contextual information
     - Indoor localization is a hot topic and useful in many scenarios:
       - Airports
       - Hospitals
       - Conference venues
       - Large environments
     - Various indoor localization methods are possible:
       - WLAN/cell-based localization
       - Sensor-based localization
       - Beacon-based localization
       - Vision-based localization
  4. Why Vision-Based Localization?
     [Figure: live features]
     - Compare the query image to a reference data set
     - Use existing features in the environment
     - No additional augmentation needed
     - Modern devices are equipped with a camera and a powerful processor
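As an illustration of the matching step described on slide 4 — not the authors' actual pipeline — the following Python sketch compares a query image against a set of reference images using ORB features. The helper names, the choice of ORB, and the ratio-test threshold are assumptions for the sake of the example.

    # Illustrative sketch only: one simple way to realize "compare query image
    # to reference data set" with local features. Not the system from the paper.
    import cv2

    def count_good_matches(query_path, reference_path, ratio=0.75):
        """Count ratio-test matches between a query image and one reference image."""
        orb = cv2.ORB_create(nfeatures=1000)
        query = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
        reference = cv2.imread(reference_path, cv2.IMREAD_GRAYSCALE)
        _, des_q = orb.detectAndCompute(query, None)
        _, des_r = orb.detectAndCompute(reference, None)
        if des_q is None or des_r is None:
            return 0
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
        pairs = matcher.knnMatch(des_q, des_r, k=2)
        return sum(1 for p in pairs if len(p) == 2 and p[0].distance < ratio * p[1].distance)

    def localize(query_path, reference_paths):
        """Return the reference image with the most good matches and its match count."""
        scores = {ref: count_good_matches(query_path, ref) for ref in reference_paths}
        best = max(scores, key=scores.get)
        return best, scores[best]

A low match count across all reference images would indicate a non-distinctive query image, which is exactly the situation the interface adaptations on the following slides are meant to handle.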
  5. Challenges of Vision-Based Localization
     - Query images need to be distinctive
       - Repeating structures throughout the environment (e.g. corridors)
       - Few visual features (e.g. uniform walls)
     - Orientation of the user's smartphone
       - Good candidates for query images (windows, adverts, posters) are typically found at eye height
       - In a typical pose, the user holds the device so that it points at the floor
     - The user interface needs to address these challenges
  6. User Interface Concept
     Our contribution: addressing these problems from a UI perspective.
     - No. 1 challenge: localization inaccuracy
       - Occurs when there is an insufficient number of features in the query image
     - Solution A: adaptation of the user interface
       - Augmented Reality (AR) with a live view of the environment
       - Virtual Reality (VR) with panorama images taken in advance (comparable to Google Street View)
     - Solution B: corrective actions
       - The user is asked to point the camera at interesting regions
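To make Solutions A and B concrete, here is a minimal decision sketch — purely illustrative, with assumed threshold values and function names that are not taken from the paper — of how an interface could switch between AR and VR and trigger a corrective prompt:

    # Illustrative sketch only: the thresholds and names below are assumptions.

    MIN_MATCHES_FOR_AR = 50   # assumed: matched features needed for a reliable pose estimate
    MIN_PITCH_DEG = 60.0      # assumed: phone pitch (0 = pointing at the floor) needed for AR

    def choose_view(num_matched_features, pitch_deg):
        """Pick the view mode and an optional corrective hint for the user."""
        if num_matched_features >= MIN_MATCHES_FOR_AR and pitch_deg >= MIN_PITCH_DEG:
            return "AR", None                  # Solution A: live camera view
        hint = None
        if num_matched_features < MIN_MATCHES_FOR_AR:
            # Solution B: motivate the user to capture a more distinctive query image
            hint = "Point the camera at a distinctive region, e.g. a poster or sign."
        return "VR", hint                      # Solution A: panorama-based fallback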
  7. Augmented Reality (AR)
     - Displays the live camera image
     - Navigation instructions are superimposed on the image
     - Smartphone is held at eye height
     - Direct match between the information and the real environment
  8. Virtual Reality (VR)
     - Uses preloaded panorama images
     - Navigation instructions are drawn into the panorama
     - Smartphone can be pointed at the ground
     - The user matches the environment and the virtual view
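As a small illustration of drawing an instruction into a panorama — assuming equirectangular panoramas and a known reference heading, neither of which the slides state — a compass heading can be mapped to a horizontal pixel position like this:

    # Illustrative sketch, assuming equirectangular panoramas; parameter names
    # and the reference-heading convention are assumptions.

    def heading_to_column(heading_deg, panorama_width_px, left_edge_heading_deg=0.0):
        """Map a compass heading to the pixel column where an instruction is drawn."""
        relative = (heading_deg - left_edge_heading_deg) % 360.0
        return int(relative / 360.0 * panorama_width_px)

    # Example: a turn instruction at heading 90 degrees in a 4096 px wide panorama
    # whose left edge faces heading 0 would be drawn around column 1024.
    print(heading_to_column(90.0, 4096))  # -> 1024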
  9. Pointing Towards Discriminative Areas
  10. Highlighting Objects of Interest
  11. Research Questions
      - RQ1: Which concept (AR or VR) is preferable in terms of perceived accuracy?
      - RQ2: Which concept (AR or VR) is preferred by users?
      - RQ3: Which visualizations are appropriate to acquire sufficient visual features?
      - RQ4: Can object highlighting be improved with a soft border visualization?
  12. Methodology
      - Online survey
      - Videos and images of a mockup system
      - 81 participants (18-59 years) recruited via Mobileworks
  13. RQ1: Perceived Accuracy of Virtual Reality and Augmented Reality Views
      - Simulated error conditions: no error, orientation error, location error, location + orientation error
      - Rated statements (-3: strongly disagree, +3: strongly agree):
        1. The system seemed to know well my location.
        2. The system seemed to know well my orientation.
        3. I perceived the navigation instructions as correct.
      - Summary table, mean response (standard deviation):

        View                 No Error    Orientation Error   Location Error   Loc.+Ori. Error
        Augmented Reality    2.3 (1.0)   -0.2 (2.0)          0.4 (1.9)        -0.5 (1.9)
        Virtual Reality      1.8 (1.4)   1.4 (1.6)           1.4 (1.7)        0.9 (1.8)
  14. RQ2: User Preferences
      - In total, AR was more popular with users than VR
      [Chart: preference ratings, -3: strongly disagree to +3: strongly agree]
  15. RQ3: Feature Indicator
      - Clarity of the indicator's meaning rated for four visualizations a)-d)
      [Figure: feature indicator visualizations a)-d); ratings from -3: strongly disagree to +3: strongly agree]
  16. RQ4: Highlighting Methods
      - Highlighting of objects of interest
        - Potentially feature-rich
        - Interaction points for location-based services
      - Two types: frame and soft border
      - Both raise attention equally, but the frame distracts more
      (Ratings from -3: strongly disagree to +3: strongly agree)
  17. Discussion
      - Accuracy perception through the interface
        - VR is beneficial when localization accuracy is low
        - AR is preferred when the estimate is reliable
      - Feature indicators and object highlights potentially contribute to good reference images
      - Situational use
        - VR when the phone is held at a 45° angle
          - Fewer features visible
          - Less accuracy required
        - AR when holding the phone up
          - Point at visual features
          - Highlight objects of interaction
  18. Future and Ongoing Work
      - Evaluating the interfaces in the real world
      - Modeling the system's and the user's state
        - Localization accuracy
        - Location/environment
        - User context
        - User mental model
        - …
  19. Thank you for your attention! Questions?
      andreas.moeller@tum.de
      www.vmi.ei.tum.de/team/andreas-moeller.html
      This research project has been supported by the space agency of the German Aerospace Center with funds from the Federal Ministry of Economics and Technology on the basis of a resolution of the German Bundestag under the reference 50NA1107.
  20. Paper References
      - Please find the full paper at: http://dx.doi.org/10.1145/2406367.2406372
      - Please cite this work as follows:
        Andreas Möller, Matthias Kranz, Robert Huitl, Stefan Diewald, and Luis Roalter. 2012. A mobile indoor navigation system interface adapted to vision-based localization. In Proceedings of the 11th International Conference on Mobile and Ubiquitous Multimedia (MUM '12). ACM, New York, NY, USA, Article 4, 10 pages. DOI=10.1145/2406367.2406372 http://doi.acm.org/10.1145/2406367.2406372
  21. Please use the following BibTeX entry:

      @inproceedings{Moller:2012:MIN:2406367.2406372,
        author    = {M\"{o}ller, Andreas and Kranz, Matthias and Huitl, Robert and Diewald, Stefan and Roalter, Luis},
        title     = {A mobile indoor navigation system interface adapted to vision-based localization},
        booktitle = {Proceedings of the 11th International Conference on Mobile and Ubiquitous Multimedia},
        series    = {MUM '12},
        year      = {2012},
        isbn      = {978-1-4503-1815-0},
        location  = {Ulm, Germany},
        pages     = {4:1--4:10},
        articleno = {4},
        numpages  = {10},
        url       = {http://doi.acm.org/10.1145/2406367.2406372},
        doi       = {10.1145/2406367.2406372},
        acmid     = {2406372},
        publisher = {ACM},
        address   = {New York, NY, USA},
        keywords  = {augmented reality, indoor navigation, user interface, virtual reality, vision-based localization},
      }
