
User Interfaces and User Centered Design Techniques for Augmented Reality and Virtual Reality


We chose to explore virtual and augmented reality (VR and AR) due to its recent emergence into the mainstream areas of gaming, mobile applications and various other systems. We felt it important to distinguish between VR and AR in both areas of interaction design and user interface evaluation and creation techniques. As it is a topic of great passion for us we wanted to instill the possibilities that this medium has to offer for interaction designers and UI developers.



  1. 1. UIs and User Centred Design Techniques for AR + VR
  2. 2. What’s AR and VR?! Augmented Reality and Virtual Reality. AR: technology that layers digital content over our everyday world. VR: technology that transports us to a different world.
  3. 3. But They Must be Years Away?!
  4. 4. How do we design applications for AR and VR?
  5. 5. IxD for System vs IxD for Framework Designers are also confronted with the obstacle of lacking a methodology or framework to design the system itself and quickly iterate through prototypes. Steps are being taken to address these concerns: a few applications and frameworks, both commercial and open source, are helping.
  6. 6. Evaluating VR UIs: Goal Formation, Navigate the World, Locate Objects, Position for Interaction, Decide on Action, Manipulate Objects, Recognise Feedback, Interpret Feedback, Decide on New Action
  7. 7. VR-Related Issues with Evaluation Objects can obscure others and may break the interaction cycle. Navigation-driven and environment-driven VR systems call for different modes of design; expert users see the modes blend together. Feedback should be multisensory. Users can’t see things off screen or behind a wall, so environmental cues are key.
  8. 8. Evaluation Techniques Walkthrough each phase step by step Ask necessary questions along the way ◇ Aim to uncover breaks in affordances ◇ Questions guided to create generic design principles (GDPs) Collect design issues and virtual environment features Prioritise them for development
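The walkthrough technique above can be pictured as a small data-driven checklist. This is purely an illustrative sketch: the phase names paraphrase the interaction cycle slide, while the severity scores and the issue log are assumptions, not part of the original technique description.

```python
# Hypothetical sketch of a phase-by-phase VR walkthrough evaluation.
# Phase names paraphrase the interaction cycle slide; the severity
# scale and prioritisation scheme are invented for illustration.

PHASES = [
    "Goal formation", "Navigate the world", "Locate objects",
    "Position for interaction", "Decide on action",
    "Manipulate objects", "Recognise feedback",
    "Interpret feedback", "Decide on new action",
]

def walkthrough(observations):
    """Collect design issues per phase and sort them by severity.

    `observations` maps a phase name to a list of
    (issue_description, severity) pairs noted by the evaluator.
    """
    issues = []
    for phase in PHASES:
        for description, severity in observations.get(phase, []):
            issues.append((severity, phase, description))
    # Highest severity first -> development priority order.
    return sorted(issues, reverse=True)

notes = {
    "Locate objects": [("target hidden behind wall", 3)],
    "Recognise feedback": [("no haptic cue on grab", 2)],
}
for severity, phase, desc in walkthrough(notes):
    print(severity, phase, desc)
```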
  9. 9. UCD Tips for AR Install on a familiar device. Choose a design scenario. Run in an appropriate setting. Build for two hands if on mobile. Choose the right audience. Challenge users with mental flow, not with physical strain.
  10. 10. Issues Overall Information linking design problems with design solutions in VR and AR is often weakly exhibited. Even some of Norman’s interaction evaluation techniques can break down when dealing with navigation in 3D space. A tailored methodology is needed for dealing with emerging issues in hardware and interaction.
  11. 11. Virtual Reality Uses, Current Tech and Design
  12. 12. Uses of VR Education: With the leaps in technology, virtual reality can be used to transport people to other planets, tourist destinations and the many jungles and oceans on earth. Video Games: Virtual reality allows players to be transported to other worlds and puts them in the middle of the action! Medical: Virtual reality can allow surgeons to move throughout the body and diagnose problems that patients have. Virtual reality is also being used for therapy for PTSD veterans and phantom limb syndrome. (source: openmedical.org)
  13. 13.
  14. 14. VR Technology available today
  15. 15. Designing for VR
  16. 16. User Interface design in VR When designing interfaces in a virtual reality, there are some considerations which must be taken into account. ● Is there a motion controller used in conjunction with the VR system? ● What is the nature of the experience? ● Who is the interface being designed for? ● What are the perceptual limitations of the user?
  17. 17. Perceptual Limitations Perceptual limitations occur in the areas of the field of view where users find interaction difficult, such as the peripheral region.
  18. 18. Perceptual Limitations Perceptual limitations occur in the areas of the field of view where users find interaction difficult, such as the peripheral region. 0.5 m: region where convergence can occur.
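One way to act on these limitations is a simple placement check at design time. The sketch below is assumption-laden: it reads the slide’s 0.5 m label as a minimum comfortable depth for eye convergence, and the 30° half-angle used as the peripheral cutoff is an invented placeholder value, not a figure from the deck.

```python
import math

# Sketch of a placement check for VR UI elements.
# The 0.5 m convergence minimum follows the slide's label; the
# ~30 degree comfortable half-angle is an assumed value.
MIN_DEPTH_M = 0.5
MAX_ANGLE_DEG = 30.0

def is_comfortable(x, y, z):
    """Element position in head-relative metres, z pointing forward."""
    depth = math.sqrt(x * x + y * y + z * z)
    if depth < MIN_DEPTH_M:
        return False  # too close: eyes cannot converge comfortably
    angle = math.degrees(math.atan2(math.hypot(x, y), z))
    return angle <= MAX_ANGLE_DEG  # beyond this: peripheral, hard to interact

print(is_comfortable(0.0, 0.0, 1.5))  # centred, 1.5 m away
print(is_comfortable(0.0, 0.0, 0.2))  # too close to the eyes
print(is_comfortable(2.0, 0.0, 0.5))  # far off to the side
```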
  19. 19. How do we design UIs with these limitations in mind?
  20. 20. How do we design UIs with these limitations in mind?
  21. 21. Is this a good UI design or bad UI design in VR?
  22. 22. Augmented Reality Uses, Current Tech and Design
  23. 23. Uses of AR Notification: AR headsets can notify you of social media, texts or email as you go about your daily life. Video Games: AR allows players to have their world transformed in front of their very own eyes! Navigation: AR allows people to find their destination in a non-obtrusive way.
  24. 24. AR Technology available today
  25. 25. Designing for AR
  26. 26. User Interface design in AR When designing interfaces in augmented reality, the considerations taken into account are similar to VR. But there is one thing of the utmost importance! Obscurity / Opacity.
  27. 27. User Interface design in AR Continue 10m
  28. 28. User Interface design in AR
  29. 29. User Interface design in AR
  30. 30. Case Studies 1. Design and Evaluation of Menu Systems for Immersive Virtual Environments - Bowman & Wingrave 2001 2. Experimental Evaluation of User Interfaces for Visual Indoor Navigation - Möller et al. 2014
  31. 31. Key Interaction Tasks in VR: Navigation, Selection, Manipulation, System Control
  32. 32. TULIP Menu vs Floating Menu vs Pen and Tablet
  33. 33. Pinch Gloves
  34. 34. Hardware for TULIP Menu Pinch Gloves consist of flexible cloth gloves augmented with conductive cloth sewn into the tips of each of the fingers. When two or more pieces of conductive cloth come into contact with one another, a signal is sent back to the host computer indicating which fingers are being “pinched”. A Virtual Research V8 head-mounted display (HMD) is worn, and the head and both hands are tracked using a Polhemus Fastrak tracking system.
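The pinch signalling described above can be pictured as a small lookup from contact sets to actions. Only the idea that the gloves report which fingertips are touching comes from the slide; the fingertip ids and the pair-to-action table below are hypothetical.

```python
# Sketch of decoding Pinch Glove contact events. The real gloves
# report which conductive fingertips touch; this fingertip naming
# and the pair->action table are assumptions for illustration.

ACTIONS = {
    frozenset({"L_thumb", "L_index"}): "select_menu_1",
    frozenset({"L_thumb", "L_middle"}): "select_menu_2",
    frozenset({"R_thumb", "R_index"}): "confirm",
}

def decode(contacts):
    """`contacts` is the set of fingertip ids currently touching."""
    return ACTIONS.get(frozenset(contacts), "no_action")

print(decode({"L_thumb", "L_index"}))  # a recognised pinch
print(decode({"L_thumb"}))             # a single finger is not a pinch
```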
  35. 35. Evolution of TULIP Scrolling Menu Three Up Menu
  36. 36. Scrolling Menu
  37. 37. Three Up Menu
  38. 38. Pilot Study Evaluating these two menu designs, users had to change a virtual object to match a target object. 3 parameters could be controlled: the object’s shape, color, and texture. Each of these corresponded to a top-level menu. There were 3 shapes to choose from, 8 colors, and 6 textures – these corresponded to second-level menu items. Test: 4 users, “Think Aloud” protocol, informal results.
  39. 39. Pilot Results Neither design satisfied the desired requirements. Users preferred the Scrolling Menu BUT realised tasks could be completed with fewer steps using the Three Up Menu. The Three Up Menu hides options if there are more than three. Scrolling Menus prompt users to incorrectly attempt to push with their palms. There was a lack of feedback when items were selected, and fatigue from hands being raised.
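A back-of-envelope model illustrates why the Three Up Menu can need fewer steps. Both step formulas below are assumptions made for illustration (one pinch per scroll in the Scrolling Menu, one pinch to page past each group of three in the Three Up Menu, plus a final selection pinch in each); they are not the paper’s measured counts.

```python
# Illustrative step-count comparison under assumed interaction
# models, not the study's actual measurements.

def steps_scrolling(index):
    # Assume one pinch per scroll to reach item `index` (0-based),
    # plus a final pinch to select it.
    return index + 1

def steps_three_up(index):
    # Assume three items visible at once, one pinch to page past
    # each full group of three, plus a final pinch to select.
    return index // 3 + 1

for i in (0, 5, 7):  # e.g. reaching one of the pilot task's 8 colours
    print(i, steps_scrolling(i), steps_three_up(i))
```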
  40. 40. RE-DESIGN: TULIP (Three Up, Labels In Palm)
  41. 41. Summative Evaluation Compare the ease of use, ease of learning, efficiency and comfort of all 3 menus.
  42. 42. Summative Evaluation 26 users participated in a repeated object-matching task. They completed a questionnaire containing demographic information and information about their experience with computers and VEs. The same equipment was used again, with an added stylus for the floating menus and pen and tablet menus. 30 trials of each menu; no help provided after the initial briefing.
  43. 43. Floating Menus Pen and Tablet Menus
  44. 44. Results It appears that the gloves were the hardest to learn initially, but performance was at reasonable levels for all three types within five trials. The reason times for the pen and tablet menu are initially poor is that users were not told they needed to look at the tablet in their hand.
  45. 45. Comfort Levels The main drawback of the pen and tablet system is the discomfort it causes users, which might be alleviated by adding an ergonomic handle.
  46. 46. Reflection On Study Combining the efficiency, comfort, and preference information, it appears that both the pen and tablet menu and the TULIP menus performed well in the evaluation. Fifteen users expressed a preference for the TULIP interface, while nine preferred the pen and tablet, and only two preferred the floating menus. This evaluation reiterated some important heuristics from the traditional human-computer interaction literature: menu systems for VEs need to have good feedback, affordances, and constraints, and items and their actions should be visible.
  47. 47. Evaluation of User Interfaces for Visual Indoor Navigation
  48. 48. Andreas Möller et al. implemented a novel UI for visual localization, consisting of Virtual Reality (VR) and Augmented Reality (AR) views that actively communicate and ensure localization accuracy, which is beneficial in large buildings when navigating your way around. The concept consists of a panorama-based view as a complement to Augmented Reality and proposes different visualizations for motivating users to record “good” query images, which are important for localization accuracy.
  49. 49. Comparing AR vs VR The users would hold the phone up as seen in the figure and look at the phone in order to see the augmentation, i.e. items superimposed onto their real-life surroundings. The virtual reality mode displays pre-recorded images of the environment (downloaded from a server) that are arranged into a 360° panorama on the mobile device.
  50. 50. Enhancing Visual Localization Visual localization can be very dependent on how the device is being held. Four indicator types were proposed: Text Hint, Blur, Colour Scale and Spirit Level.
  51. 51. Initial Testing A questionnaire-based survey with mockup videos was used for early testing; users did not actually travel through a building. Initial user evaluations showed an inconsistency: the majority of people stated that the VR mode helped them orient themselves even when the system’s location estimate was incorrect, yet the subjects still claimed to prefer the AR mode. Users preferred the Text Hint and Spirit Level as means to prompt the user to provide a better-quality image.
  52. 52. Prototype A prototype was built using Android 2.3, with both the AR and VR modes. The Wizard of Oz technique was used: an experimenter would play a role behind the scenes. The navigation interface on the subject’s device was implemented with OpenGL ES 2.0. For the automatic trigger, they used the FAST feature detector from the OpenCV framework for Android to detect the number of features in the camera’s live image.
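The automatic trigger can be sketched as a threshold on the per-frame feature count. In the prototype that count came from OpenCV’s FAST detector; in this sketch a plain integer stands in for it, and the 100-feature threshold borrows the 100-to-150 figure reported in the later results slide. The function name and the `armed` flag are assumptions for illustration.

```python
# Sketch of the automatic-trigger idea: fire relocalization once the
# live camera frame contains enough detected features. In the real
# prototype the count came from OpenCV's FAST detector; here it is
# just an integer argument. The threshold follows the slide deck's
# "100 to 150 features" figure for reliable localization.

FEATURE_THRESHOLD = 100

def should_trigger(feature_count, armed=True):
    """Fire only when armed and the frame looks feature-rich enough."""
    return armed and feature_count >= FEATURE_THRESHOLD

print(should_trigger(42))                # blank wall: keep the indicator up
print(should_trigger(120))               # feature-rich frame: relocalize
print(should_trigger(120, armed=False))  # trigger disabled
```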
  53. 53. Experiments The goal of the experiments was to verify results from the earlier mock-ups. Three experiments took place to test: 1. Efficiency, perception and convenience of AR and VR under different accuracy conditions. 2. Effectiveness of UI elements specific to vision-based localization. 3. Convenience and distraction of object highlighting. 12 participants took part, most of whom were students; none were involved in the project.
  54. 54. Hardware Subjects used a Samsung Galaxy S II (4.3-inch screen, 8-megapixel camera). The WOz app ran on a Samsung Nexus S (4-inch screen). Both devices had a screen resolution of 480×800 pixels.
  55. 55. UIs Implemented (subject)
  56. 56. UIs Implemented (WoZ)
  57. 57. Experiment 1 - Navigation using AR and VR Subjects performed a navigation task in a university building on a 220-meter path, using both the AR and the VR mode. Each user traversed the path eight times and was asked to rely only on the given instructions. Navigation instructions were fed into the subject’s phone by the experimenter (Wizard of Oz), who sent the appropriate panoramas in VR mode (and directional arrows in AR mode) to the subject’s phone using the WOz interface. Users were encouraged to think aloud.
  58. 58. Results Subjects were on average 25 seconds faster to reach their destination with VR (2:39 minutes on average for the 220 m path) than with AR (3:04 minutes on average). The experiment also showed that AR was worse with regard to orientation errors. Subjects found carrying the phone more convenient in VR; the required upright position for carrying the phone in AR was physically constraining.
  59. 59. Experiment 2 - Vision-Specific UI Elements Subjects performed a navigation task on the path shown in the figure, but in the opposite direction to Experiment 1, so that the path was not already too familiar. Three times during the walk, a relocalization procedure was simulated: the experimenter triggered a spirit level visualization to appear on the subjects’ device. This indicator told subjects to collect enough features for relocalization.
  60. 60. Results Reliable localization requires 100 to 150 features in the image. While the indicator was visible, the average number of detected features per frame rose from 42 to 101. The experiment also showed that subjects preferred the lower carrying position of VR mode compared to the upright pose of AR mode.
  61. 61. Experiment 3 - Object Highlighting Methods There were two ways to highlight objects: Frame and Soft Highlight. The algorithm was optimized to detect square, feature-rich objects against a uniform background; this applies to, e.g., a poster on a wall. Subjects pointed at the posters using both highlighting visualizations. Feedback was afterwards collected by a questionnaire.
  62. 62. Results On a Likert scale from -3 to +3, subjects indicated that Frame drew more attention to the poster (M = 3) than Soft Highlight (M = 1). The semi-transparency of Soft Highlight complicated readability of text on the poster. Regarding distraction, the visible contours of the Frame visualization were perceived as unstable.
  63. 63. Reflection on Experiments VR mode turned out to be advantageous in several ways, contrary to the initial feedback, in which the AR UI had appeared far more appealing in theory. The AR UI does have its strengths though: it can help to improve feature collection via the feature indicator (i.e. the spirit level prompt), raising the probability of reliable relocalization. Subjects reported that the frequent updates of the panorama images in VR mode were partly irritating, especially when not permanently looking at the screen. Further studies are required in the field to strengthen these UI concepts, particularly with regard to AR and more reliable localization.
  64. 64. Soooo what’s the activity? Considering the revised User Centred Design Model, we’re going to design some augmented reality applications. ● Navigation ● Notification System ● Photo and Video ● Messaging System
  65. 65. thanks! ANY QUESTIONS?
  66. 66. References Alger, M., 2015. Visual Designs for Virtual Reality. Alger, M., n.d. VR Interface Design Manifesto. Alger, M., n.d. VR Interface Design Pre-Visualisation Methods. Bowman, D., Wingrave, C. (2001) "Design and Evaluation of Menu Systems for Immersive Virtual Environments", in Proceedings of the Virtual Reality 2001 Conference (VR '01), IEEE Computer Society. Broll, W., Shafer, L., Hollerer, T., Bowman, D., 2001. Interface with angels: the future of VR and AR interfaces. IEEE Computer Graphics and Applications 21, 14–17. doi:10.1109/38.963455 De Freitas, S., Rebolledo-Mendez, G., Liarokapis, F., Magoulas, G., Poulovassilis, A., 2010. Learning as immersive experiences: Using the four-dimensional framework for designing and evaluating immersive learning experiences in a virtual world. British Journal of Educational Technology 41, 69–85. Denis, J.-M., 2015. From product design to virtual reality. Google Design. DesigningInteractions_8.pdf, n.d. Faaborg, A., n.d. Designing for virtual reality and the impact on education. TEDx Talks. Hsu, C., Shiau, H., 2013. The Visual Web User Interface Design in Augmented Reality Technology. Editorial Preface 4.
  67. 67. References Jang, D., Kim, J.-S., Li, K.-J., Joo, C.-H., 2011. Overlapping and synchronizing two worlds. Proceedings of the 19th ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems (GIS '11), 493–496. Kunz, A., Wegener, K., 2013. Towards natural user interfaces in VR/AR for design and manufacturing. Eidgenössische Technische Hochschule Zürich, Institute of Machine Tools and Manufacturing. Leap Motion VR Best Practices Guidelines.pdf, n.d. rabedik, n.d. AR Screen Hackathon Project. Möller, A., Kranz, M., Diewald, S., Roalter, L., Huitl, R., Stockinger, T., Koelle, M., Lindemann, P. (2014) "Experimental evaluation of user interfaces for visual indoor navigation", in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '14), ACM: New York, NY, USA. Samsung Developer Connection, 2014. VR Design: Transitioning from a 2D to 3D Design Paradigm. Sutcliffe, A., 2003. Multimedia and Virtual Reality: Designing Multisensory User Interfaces. Psychology Press.
  68. 68. References Investigating the Balance between Virtuality and Reality in Mobile Mixed Reality UI Design – User Perception of an Augmented City. Proceedings of the 8th Nordic Conference on Human-Computer Interaction: Fun, Fast, Foundational (NordiCHI '14), 137–146. Wesolowski, M., n.d. Designing Next-Gen Virtual Reality Gaming Experiences. Sutcliffe, A., & Deol Kaur, K., 2000. "Evaluating the usability of virtual reality user interfaces", Behaviour & Information Technology, 19, 6, pp. 415–426, Academic Search Complete, EBSCOhost, viewed 12 February 2016. User Testing As A Design Driver: Looksery Created A Product For Users, Not Designers | UX Magazine, 2016 [online], available: [accessed 20 Feb 2016].