10 lessons - Mobile Usability Testing



If you’re new to mobile usability testing, fear not. It is not as hard as you might think, but there are some key differences from testing a traditional website in a lab that you need to be aware of.

These slides are from a presentation by Tania Lang at the Designing for Mobility Conference in Melbourne on March 1, 2013. The talk outlined 10 things about mobile testing tools, technologies and methods that we have learnt from testing everything from a quit-smoking app with a woman breastfeeding her 5 week old on her sofa, to a mobile car insurance website in a lab.

This talk is for anyone who wants to conduct usability testing on their mobile site or app and will focus on methods, tools and technologies. It draws on experiences and lessons learnt from testing mobile sites and apps over the last year including a public transport journey planner, a mobile insurance website and a mobile app to help pregnant women quit smoking.

  • TransLink mobile journey planner – conducted two rounds of usability testing in a lab, covering wireframes and a functioning pre-release site. NRMA mobile site – conducted usability testing in a lab. Quit for You Quit for Two app – conducted field testing in users’ homes, plus some sessions in our test observation room. The testing was done on behalf of DoHA and BCM, who designed and built the app.
  • Display Recorder – originally available at the Cydia store for jailbroken phones. A rip-off was available for a short time at the iTunes store; it records screen activity when accessing a website, but has since been removed from the iTunes store. http://www.appolicious.com/articles/12383-display-recorder-makes-brief-appearance-in-itunes-app-store-or-did-it
  • e.g. we had one iPad user who was trying very hard to use a slider on a travel booking website and couldn’t. They tried several times, but the slider wouldn’t work. A screen recording wouldn’t have picked this up, as nothing was happening on screen.
  • Can record native apps, which UX Recorder can’t. Captures tap gestures; they are apparently working on capturing swipes. Much easier to use than UX Recorder, and cheaper as well.
  • Balance is also important – the sled shouldn’t be too top-heavy.
  • The camera shouldn’t block the view of the device or annoy users. It should be as unobtrusive as possible.
  • We needed to be able to move the camera forwards and backwards, and up and down, to get it into position.
  • We did a scan of common devices and found a large range of different sizes. Any sled system would have to accommodate this.
  • We wanted users to be able to easily switch from portrait to landscape view. We also wanted to be able to easily switch to a different user’s device without sticking anything to their phone.
  • The whole sled set-up needs to be stable. We like to fix the focus on the camera, as the autofocus keeps focussing in and out on users’ fingers.
  • None of the sleds proposed really addressed our requirements. Key issues included: Perspex and cameras too heavy; required or encouraged placement on a table; couldn’t accommodate landscape orientation; arm not adjustable; didn’t easily accommodate different user devices (e.g. required velcro to stick the device down).
  • Used off-the-shelf iPhone and iPad cases for the platform, with our existing lab webcam attached by cable tie.
  • Decided a single one-size-fits-all sled was not feasible; the 2nd round prototypes accommodated different smartphones and different tablets.
  • Please contact tania@peakusability.com.au if you are interested in getting a free working prototype to use and provide feedback on.
  • Transitioning from portrait to landscape is hard. If the user does it, you have to fiddle with either the camera itself or the camera software. Cameras need to be flexible enough to accommodate the shift in perspective, and ideally able to automagically flip camera orientation. Expensive!
  • Fixed arms and rigid camera mounts resulted in bending metal to get the right spot – a lot of faffing about that you could not do in a session. A camera with an adjustable mount was a little more flexible, but still required changing when the user adjusted the orientation of the device, and often obscured the user’s view of the device.
  • iPad models get heavy. Users tended to rest them on the desk and lean over them when testing, rather than holding them up and sitting back, or putting them on their knee or lap – the set-up was not well balanced. Even when testing the NRMA mobile site, users were encouraged to sit back in the chair, but nearly all put their phone on the table.
  • Angled arms were more comfortable for users (although you may need a ‘lefty’ and a ‘righty’ for different users), but the Morae PiP gets in the way of a clear view of the action.
  • I conducted testing of the Quit for You Quit for Two app, which helps pregnant women quit smoking. We ended up recruiting mothers with babies who had recently quit smoking or were attempting to quit. Given the potentially sensitive nature of the topic (women who had tried to quit smoking when pregnant), as well as the fact that we were testing with new mothers with babies, I needed to conduct many of the test sessions in the field in people’s homes. As a mother myself with young children, it was easy to build rapport with these women and for them to open up to me as I attempted to make the testing as casual as possible. One of the test sessions was even run on a participant’s sofa whilst she was breastfeeding her 5 month old. She had the baby in one arm and her smartphone, using the app, in her other hand. Obviously it was not appropriate to record these sessions in any way, and the intrusion of any additional technology would have potentially affected the results.
  • Context is always important for mobile design, but when it comes to testing, it is more important for some types of sites or apps than others – e.g. the TransLink Journey Planner. You need to consider where users are likely to use your app. For instance, many Quit for You Quit for Two users said they would probably use the app when alone at home, when their resolve is limited, they are away from the public eye, and their phone is on hand. Chart source: http://www.smartinsights.com/marketplace-analysis/customer-analysis/new-free-worldwide-digital-media-statistics-reports-starting-with-uk-us-and-europe/
  • In a lab environment, we may have a stable wireless connection and not be moving about, but this doesn’t happen in real life. People use their phones on trains and buses, and whilst waiting in a queue. Signals drop out. Activity is often in short bursts, e.g. every ad break. So how do you test for this? NRMA was interested in understanding users’ expectations of what would happen in the event they lost their signal, or if they left their session and returned to it 30 minutes later. Halfway through their task of getting a quote, we stopped users and said, “Imagine you were doing this on a train and the signal dropped out when you went through a tunnel.” We then asked them what they would expect to happen, then showed them a screenshot and asked what they would do. So even though we could not actually test on a train, we tried to simulate that experience. The next point doesn’t relate to mobile specifically, but is another general point about simulating context. For the Quit for You Quit for Two mobile app, we found it difficult to recruit actual pregnant women who were smoking, but we managed to recruit several mothers who had had babies in the last 12 months and were smoking when they got pregnant. To help build rapport and encourage users to be as open and honest as possible without feeling judged, I created a fictitious persona of a pregnant smoker and asked the participants to assume the role of this persona for the whole session. This enabled them to pretend to be someone else and open up about what that persona would do – though of course they were basing all their comments on their own experience.
  • When we conducted testing for NRMA, we did it in an office with a boardroom table and a computer. Participants put their own smartphone on our sled and were encouraged to sit back in their chair and feel comfortable. Even though the sled was very light, nearly all participants ended up putting their phone on the table for the duration of the test session, which is probably not how they would use their phone in real life. When we conducted testing of the Quit for You Quit for Two mobile app, some participants preferred to come into our test lab. To create a casual environment that was not made threatening by cameras, desks, computers etc., we tried to recreate the ‘home’ environment: instead of using our usual test room, we ran the sessions on the sofa in our test observation room and didn’t use any technology to record the session. Even though this wasn’t the users’ true context and environment, we simulated their home environment, and the test outcomes were consistent with the sessions we ran in participants’ homes.
  • Pros and cons of recording mobile usability test sessions. Pros of using recording technology (e.g. sled, webcam and usability software such as Morae): recording the testing of the TransLink wireframes and mobile site allowed us to go back and view test footage we may have missed. Most importantly, it allowed developers and the project team to observe test sessions and see first-hand how users were interacting with their mobile site and what they were saying. Video snippets are always good for including in presentations to management and stakeholders as well. Cons: it may affect natural user behaviour (e.g. switching to landscape view), and it can be a bit intrusive. Pros of not using any recording technology: we definitely achieved a better outcome for the Quit for You Quit for Two testing as a result of not recording – as previously mentioned, participants felt more at ease and were very open and honest, and technology and recording devices did not get in the way of testing. The app had a limited number of well-designed, clean screens, so it was easy to observe how users interacted with each one. Cons: it is more difficult to communicate findings and convince clients. For Quit for You Quit for Two, the client and development vendor had to rely entirely on our written findings, as they couldn’t observe any test sessions.
  • NRMA mobile testing – Galaxy and iPhone users loved the date picker, but one user with an HTC phone really struggled with it for inputs such as the year. The year values in the date picker were very slow on the HTC phone, and when he scrolled quickly, the whole page scrolled instead. TransLink – during prototype testing, we used users’ own mobile carriers’ network connections, and some screens and predictive search results were very slow to load. Users sometimes tapped on a wrong result as the predictive search was updating slowly. This affected the user experience and user satisfaction, and it was good to understand this and inform the client prior to launch. We would not have picked up how this made users feel if we had just used our own office wireless connection.
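The network-speed lesson above can also be roughed out numerically before a session: a back-of-envelope model of transfer time plus per-request latency shows why the same page feels fine on office wi-fi but painful on a carrier's 3G connection. This is an illustrative sketch only – the bandwidth and latency figures below are assumptions for the sake of the example, not measurements from our testing.

```python
# Back-of-envelope page-load estimate: transfer time plus a latency
# term per request. All network figures are illustrative assumptions.
NETWORKS = {
    "GSM (2G)": {"kbps": 50,     "rtt_ms": 600},
    "3G":       {"kbps": 1_000,  "rtt_ms": 200},
    "4G":       {"kbps": 10_000, "rtt_ms": 60},
}

def estimated_load_seconds(page_kb: float, requests: int, network: str) -> float:
    """Crude estimate: seconds to move the bytes, plus one round trip
    per request (real browsers pipeline requests, so this is pessimistic)."""
    n = NETWORKS[network]
    transfer = (page_kb * 8) / n["kbps"]            # KB -> kilobits / bandwidth
    round_trips = requests * (n["rtt_ms"] / 1000)   # latency cost per request
    return transfer + round_trips

if __name__ == "__main__":
    for net in NETWORKS:
        secs = estimated_load_seconds(page_kb=500, requests=20, network=net)
        print(f"{net}: ~{secs:.1f}s for a 500 KB page with 20 requests")
```

Even a crude model like this makes the point from the TransLink sessions concrete: a page that loads in under two seconds on a fast connection can take the better part of a minute on a weak carrier signal, which is exactly the kind of frustration you only see when testing on users’ own networks.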

    1. 10 things we have learnt from conducting mobile usability testing. Tania Lang, Principal at Peak Usability, @peakusability. Designing for Mobility Conference, Melbourne, Australia – 1 March 2013
    2. Apps and mobile sites tested: Quit For You Quit For Two app (Oct 2012); NRMA Mobile site – Get a quote (Nov 2012); TransLink Journey Planner wireframes (Dec 2011)
    3. The problem: how to test paper prototypes of screens that require scrolling
    4. Our solution
    5. Screen recorders – Display Recorder. Available at the Cydia store; previously available on iTunes
    6. Screen recorders – can’t record apps and/or users’ gestures
    7. UX Recorder – app overcomes some of these issues: records taps and swipes; doesn’t record multiple taps (e.g. a user trying to tap something but failing); only records mobile web – can’t record apps; a bit clunky to use; have to buy credits. Captures tap & swipe gestures, audio and face
    8. Magitest – new app has potential. Captures tap gestures, audio and PiP. Records apps
    9. Sled systems – our requirements: must be light
    10. Sled systems – our requirements: should not slow down users or get in the way
    11. Sled systems – our requirements: adjustable camera position
    12. Sled systems – our requirements: accommodate different sized devices (phones: H 115–129 × W 58–68 mm; tablets: H 230–271 × W 157–186 mm)
    13. Sled systems – our requirements: easy to switch – orientation and device
    14. Sled systems – our requirements: but still be stable and not move during testing
    15. Sled systems – our process: 3 prototypes. Search for mobile testing sled technology → prototype 1 sled system created and tested → prototypes 2a–c and prototype 3 created and tested
    16. Sled systems we investigated
    17. Sled systems – our prototype 1: iPhone sled with HD Logitech C910 webcam; iPad sled with HD Logitech C910 webcam
    18. Sled systems – prototypes 2a–c. Smartphone sleds: centred arm; angled left; angled right; lip at bottom. Tablet sleds: angled left; off-centre arm, no lip; off-centre arm with lip
    19. Sled systems – our prototype 3: centred adjustable arm; lip at bottom; lighter camera – better balance
    20. Sled systems – our prototype 4??? Looking for some usability lab rats to test
    21. Key lessons from testing: switching orientation is tricky
    22. Key lessons from testing: an adjustable arm or camera mount is essential
    23. Key lessons from testing: users will put the device on a table if available (source: Flickr, Robert Scoble)
    24. Key lessons from testing: angled arms are more comfortable for users
    25. Cameras – HD Logitech C910 webcam / HD Logitech C615 webcam. Pros: high resolution recording; plug into Morae test software; camera mount adjustable. Cons: large and heavy – affected balance; C910 hard to connect to a sled
    26. Cameras – Microsoft Lifecam. Pros: still good resolution (720p); plugs into Morae test software; easy to connect to sled – bolts on. Cons: we found the camera software hard to use; camera mount rigid – very hard to angle; still large and a bit heavy
    27. Cameras – Ipevo document camera. Pros: very high resolution; small and lightweight; plugs into Morae test software. Cons: frame rate slow and a bit jerky; difficult to connect to sled
    28. Cameras – positioning: position a 2nd camera on the desk angled at the user’s face
    29. Sometimes it is better not to record at all… e.g. testing an app while the participant is breastfeeding
    30. Context is not always critical – more important for some sites/apps than others
    31. Simulating context (source: Flickr, Gabriel White)
    32. Simulating users’ environment
    33. To record or not to record
    34. Using users’ technology will discover more issues: users’ own mobile carriers; users’ own speed (GSM, 3G, 4G); users’ own devices
    35. About me and Peak Usability
  • My name is Tania Lang. I am the founder of Peak Usability, which I started in 2003.
  • Peak Usability is a user experience and usability consultancy based in Brisbane, Australia.
  • We help our clients achieve their business goals by creating highly usable and user-centric websites, intranets and web applications.
  • State-of-the-art usability testing facility in Brisbane city, including mobile and tablet testing.
  • We have extensive usability research, design and testing experience.
  • Some of our clients:
    36. Thanks. If you are interested in being a lab rat and trying out our next mobile testing sled prototype, please contact: Tania Lang, tania@peakusability.com.au, Twitter: @peakusability. Sign up for our newsletter at: www.peakusability.com.au